Sample records for risk calculation software

  1. [Prenatal risk calculation: comparison between Fast Screen pre I plus software and ViewPoint software. Evaluation of the risk calculation algorithms].

    PubMed

    Morin, Jean-François; Botton, Eléonore; Jacquemard, François; Richard-Gireme, Anouk

    2013-01-01

    The Fetal Medicine Foundation (FMF) has developed a new algorithm called Prenatal Risk Calculation (PRC) to evaluate Down syndrome screening based on free hCGβ, PAPP-A and nuchal translucency. The peculiarity of this algorithm is that it uses the degree of extremeness (DoE) instead of the multiple of the median (MoM). Biologists measuring maternal serum markers on Kryptor™ analyzers (Thermo Fisher Scientific) use the Fast Screen pre I plus software, which integrates the PRC algorithm, for prenatal risk calculation. Our study evaluates the data of 2,092 patient files, of which 19 show a fetal abnormality. These files were first evaluated with the ViewPoint software, which is based on MoM. The relationship between DoE and MoM was analyzed and the different calculated risks compared. The study shows that Fast Screen pre I plus software gives the same risk results as ViewPoint software, but yields significantly fewer false positive results.
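
    A minimal sketch of the conventional MoM-based arithmetic behind such screening software may help readers compare the two approaches: a marker value is first normalized to a multiple of the gestational-age-specific median, and a likelihood ratio from Gaussian log10-MoM distributions then updates the maternal age-related prior risk. All distribution parameters below are illustrative placeholders, and the PRC algorithm's degree-of-extremeness (DoE) transform is a different normalization not reproduced here.

    ```python
    import math

    def mom(measured_value, gestational_age_median):
        """Multiple of the median: measured marker / GA-specific median."""
        return measured_value / gestational_age_median

    def gaussian_pdf(x, mu, sigma):
        return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

    def posterior_risk(prior_risk, log10_mom, mu_aff, sd_aff, mu_unaff, sd_unaff):
        """Update an a priori (maternal-age) risk with a marker likelihood ratio."""
        lr = gaussian_pdf(log10_mom, mu_aff, sd_aff) / gaussian_pdf(log10_mom, mu_unaff, sd_unaff)
        odds = prior_risk / (1.0 - prior_risk) * lr
        return odds / (1.0 + odds)

    # Example: free beta-hCG at twice the median (log10 MoM ~ 0.30); the
    # distribution parameters are hypothetical, not published population values.
    m = mom(40.0, 20.0)
    risk = posterior_risk(1 / 250, math.log10(m),
                          mu_aff=0.30, sd_aff=0.28,
                          mu_unaff=0.0, sd_unaff=0.25)
    print(f"MoM = {m:.2f}, posterior risk = 1 in {1 / risk:.0f}")
    ```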

  2. Incorporating cost-benefit analyses into software assurance planning

    NASA Technical Reports Server (NTRS)

    Feather, M. S.; Sigal, B.; Cornford, S. L.; Hutchinson, P.

    2001-01-01

    The objective is to use cost-benefit analyses to identify, for a given project, optimal sets of software assurance activities. Towards this end we have incorporated cost-benefit calculations into a risk management framework.
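
    The optimization described can be illustrated with a simple benefit/cost heuristic: the sketch below greedily selects assurance activities under a budget. The activity names and numbers are hypothetical, and the paper's actual risk management framework is richer than a greedy knapsack.

    ```python
    def select_activities(activities, budget):
        """Greedy benefit/cost selection of assurance activities under a budget.

        `activities` is a list of (name, cost, expected_risk_reduction) tuples.
        A simple heuristic stand-in for the optimization the paper describes.
        """
        ranked = sorted(activities, key=lambda a: a[2] / a[1], reverse=True)
        chosen, spent = [], 0.0
        for name, cost, benefit in ranked:
            if spent + cost <= budget:
                chosen.append(name)
                spent += cost
        return chosen, spent

    activities = [  # hypothetical: cost in staff-days, benefit in expected loss averted
        ("code review", 10, 40.0),
        ("unit-test expansion", 15, 45.0),
        ("formal inspection of requirements", 8, 30.0),
        ("fault-injection testing", 20, 35.0),
    ]
    print(select_activities(activities, budget=30))
    ```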

  3. Identification of Patient Safety Risks Associated with Electronic Health Records: A Software Quality Perspective.

    PubMed

    Virginio, Luiz A; Ricarte, Ivan Luiz Marques

    2015-01-01

    Although Electronic Health Records (EHR) can offer benefits to the health care process, there is a growing body of evidence that these systems can also incur risks to patient safety when developed or used improperly. This work is a literature review to identify these risks from a software quality perspective. Therefore, the risks were classified based on the ISO/IEC 25010 software quality model. The risks identified were related mainly to the characteristics of "functional suitability" (i.e., software bugs) and "usability" (i.e., interface prone to user error). This work elucidates the fact that EHR quality problems can adversely affect patient safety, resulting in errors such as incorrect patient identification, incorrect calculation of medication dosages, and lack of access to patient data. Therefore, the risks presented here provide the basis for developers and EHR regulating bodies to pay attention to the quality aspects of these systems that can result in patient harm.

  4. Implementation and Simulation Results using Autonomous Aerobraking Development Software

    NASA Technical Reports Server (NTRS)

    Maddock, Robert W.; Dwyer Cianciolo, Alicia M.; Bowes, Angela; Prince, Jill L. H.; Powell, Richard W.

    2011-01-01

    An Autonomous Aerobraking software system is currently under development with support from the NASA Engineering and Safety Center (NESC) that would move typically ground-based operations functions to onboard an aerobraking spacecraft, reducing mission risk and mission cost. The suite of software that will enable autonomous aerobraking is the Autonomous Aerobraking Development Software (AADS) and consists of an ephemeris model, onboard atmosphere estimator, temperature and loads prediction, and a maneuver calculation. The software calculates the maneuver time, magnitude and direction commands to maintain the spacecraft periapsis parameters within design structural load and/or thermal constraints. The AADS is currently tested in simulations at Mars, with plans to also evaluate feasibility and performance at Venus and Titan.

  5. Micrometeoroid and Orbital Debris (MMOD) Shield Ballistic Limit Analysis Program

    NASA Technical Reports Server (NTRS)

    Ryan, Shannon

    2013-01-01

    This software implements penetration limit equations for common micrometeoroid and orbital debris (MMOD) shield configurations, windows, and thermal protection systems. Allowable MMOD risk is formulated in terms of the probability of no penetration (PNP) of the spacecraft pressure hull. For calculating the risk, spacecraft geometry models, mission profiles, debris environment models, and penetration limit equations for installed shielding configurations are required. Risk assessment software such as NASA's BUMPERII is used to calculate mission PNP; however, such codes are unsuitable for use in shield design and preliminary analysis studies. The software defines a single equation for the design and performance evaluation of common MMOD shielding configurations, windows, and thermal protection systems, along with a description of their validity range and guidelines for their application. Recommendations are based on preliminary reviews of fundamental assumptions and on accuracy in predicting experimental impact test results. The software is programmed in Visual Basic for Applications for installation as a simple add-in for Microsoft Excel. The user is directed to a graphical user interface (GUI) that accepts user inputs and provides solutions directly in Microsoft Excel workbooks.
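
    Under the Poisson assumption that is standard in MMOD risk work, the mission-level figure follows directly from the expected number of penetrations, with the ballistic limit equations supplying each surface's critical particle size. The sketch below shows only that final combination step; the area and flux values are placeholders, not a real debris environment model.

    ```python
    import math

    def pnp(surfaces, mission_years):
        """Probability of no penetration under the standard Poisson assumption.

        Each surface supplies its exposed area (m^2) and the flux (impacts per
        m^2-year) of particles exceeding its ballistic-limit critical diameter.
        """
        expected_penetrations = sum(area * flux * mission_years
                                    for area, flux in surfaces)
        return math.exp(-expected_penetrations)

    surfaces = [(12.0, 1.0e-6), (8.0, 4.0e-7)]  # hypothetical (area, critical flux) pairs
    print(f"PNP over 5 years: {pnp(surfaces, 5.0):.6f}")
    ```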

  6. A statistical model of operational impacts on the framework of the bridge crane

    NASA Astrophysics Data System (ADS)

    Antsev, V. Yu; Tolokonnikov, A. S.; Gorynin, A. D.; Reutov, A. A.

    2017-02-01

    The technical regulations of the Customs Union demand a risk analysis of bridge crane operation at the design stage. A statistical model has been developed for performing randomized risk calculations, allowing possible operational influences on the bridge crane metal structure to be modeled in their various combinations. The statistical model is implemented in a software product that automates calculation of the risk of bridge crane failure.
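
    A sketch of the kind of randomized calculation described, assuming normal and lognormal load terms purely for illustration: sample random combinations of operational load effects and count how often the combined effect exceeds the structure's capacity.

    ```python
    import random

    def crane_failure_probability(trials=100_000, capacity=190.0):
        """Monte Carlo estimate of exceedance probability for combined loads.

        Load models and parameters are illustrative only; a real model would
        follow the applicable crane design code.
        """
        failures = 0
        for _ in range(trials):
            hoist = random.gauss(120.0, 15.0)          # lifted-load stress effect
            skew = random.gauss(30.0, 10.0)            # skewing/travel effect
            dynamic = random.lognormvariate(2.0, 0.5)  # dynamic amplification term
            if hoist + skew + dynamic > capacity:
                failures += 1
        return failures / trials

    print(f"Estimated failure probability: {crane_failure_probability():.4f}")
    ```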

  7. Evaluation of Automated Fracture Risk Assessment Based on the Canadian Association of Radiologists and Osteoporosis Canada Assessment Tool.

    PubMed

    Allin, Sonya; Bleakney, Robert; Zhang, Julie; Munce, Sarah; Cheung, Angela M; Jaglal, Susan

    2016-01-01

    Fracture risk assessments are not always clearly communicated on bone mineral density (BMD) reports; evidence suggests that structured reporting (SR) tools may improve report clarity. The aim of this study is to compare fracture risk assessments automatically assigned by SR software in accordance with Canadian Association of Radiologists and Osteoporosis Canada (CAROC) recommendations to assessments from experts on narrative BMD reports. Charts for 500 adult patients who recently received a BMD exam were sampled from across the University of Toronto's Joint Department of Medical Imaging. BMD measures and clinical details were manually abstracted from charts and were used to create structured reports with assessments generated by a software implementation of the CAROC recommendations. CAROC calculations were statistically compared to experts' original assessments using percentage agreement (PA) and Krippendorff's alpha. Canadian FRAX calculations were also compared to experts', where possible. A total of 25 (5.0%) reported assessments did not conform to categorizations recommended by Canadian guidelines. Across the remainder, the Krippendorff's alpha relating software-assigned assessments to physicians' was high at 0.918; PA was 94.3%. Lower agreement was associated with reports for patients with documented modifying factors (alpha = 0.860, PA = 90.2%). Similar patterns of agreement related expert assessments to FRAX calculations, although the statistics of agreement were lower. Categories of disagreement were defined by (1) gray areas in current guidelines, (2) margins of assessment categorizations, (3) dictation/transcription errors, (4) patients on low doses of steroids, and (5) ambiguous documentation of modifying factors. Results suggest that SR software can produce fracture risk assessments that agree with experts on most routine, adult BMD exams. Results also highlight situations where experts tend to diverge from guidelines and illustrate the potential for SR software to (1) reduce variability in, (2) ameliorate errors in, and (3) improve the clarity of routine adult BMD exam reports.

  8. NASGRO(registered trademark): Fracture Mechanics and Fatigue Crack Growth Analysis Software

    NASA Technical Reports Server (NTRS)

    Forman, Royce; Shivakumar, V.; Mettu, Sambi; Beek, Joachim; Williams, Leonard; Yeh, Feng; McClung, Craig; Cardinal, Joe

    2004-01-01

    This viewgraph presentation describes NASGRO, which is a fracture mechanics and fatigue crack growth analysis software package that is used to reduce risk of fracture in Space Shuttles. The contents include: 1) Consequences of Fracture; 2) NASA Fracture Control Requirements; 3) NASGRO Reduces Risk; 4) NASGRO Use Inside NASA; 5) NASGRO Components: Crack Growth Module; 6) NASGRO Components:Material Property Module; 7) Typical NASGRO analysis: Crack growth or component life calculation; and 8) NASGRO Sample Application: Orbiter feedline flowliner crack analysis.

  9. Software risk management through independent verification and validation

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Zhou, Tong C.; Wood, Ralph

    1995-01-01

    Software project managers need tools to estimate and track project goals in a continuous fashion before, during, and after development of a system. In addition, they need an ability to compare the current project status with past project profiles to validate management intuition, identify problems, and then direct appropriate resources to the sources of problems. This paper describes a measurement-based approach to calculating the risk inherent in meeting project goals that leverages past project metrics and existing estimation and tracking models. We introduce the IV&V Goal/Questions/Metrics model, explain its use in the software development life cycle, and describe our attempts to validate the model through the reverse engineering of existing projects.

  10. Isobio software: biological dose distribution and biological dose volume histogram from physical dose conversion using linear-quadratic-linear model.

    PubMed

    Jaikuna, Tanwiwat; Khadsiri, Phatchareewan; Chawapun, Nisa; Saekho, Suwit; Tharavichitkul, Ekkasit

    2017-02-01

    To develop an in-house software program that is able to calculate and generate the biological dose distribution and biological dose volume histogram by physical dose conversion using the linear-quadratic-linear (LQL) model. The Isobio software was developed using MATLAB version 2014b to calculate and generate the biological dose distribution and biological dose volume histograms. The physical dose from each voxel in treatment planning was extracted through the Computational Environment for Radiotherapy Research (CERR), and the accuracy was verified by the differentiation between the dose volume histogram from CERR and the treatment planning system. An equivalent dose in 2 Gy fractions (EQD2) was calculated using the biological effective dose (BED) based on the LQL model. The software calculation and the manual calculation were compared for EQD2 verification with paired t-test statistical analysis using IBM SPSS Statistics version 22 (64-bit). Two- and three-dimensional biological dose distributions and biological dose volume histograms were displayed correctly by the Isobio software. Different physical doses were found between CERR and the treatment planning system (TPS) in Oncentra, with differences of 3.33% in the high-risk clinical target volume (HR-CTV) as determined by D90%, 0.56% in the bladder and 1.74% in the rectum as determined by D2cc, and less than 1% in Pinnacle. The difference in EQD2 between the software calculation and the manual calculation was not significant, with 0.00% difference at p-values of 0.820, 0.095, and 0.593 for external beam radiation therapy (EBRT) and 0.240, 0.320, and 0.849 for brachytherapy (BT) in the HR-CTV, bladder, and rectum, respectively. The Isobio software is a feasible tool for generating the biological dose distribution and biological dose volume histogram for treatment plan evaluation in both EBRT and BT.
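
    For reference, the standard linear-quadratic EQD2 conversion that underlies such tools is shown below; the LQL model used by Isobio additionally modifies the response above a threshold dose per fraction, which this sketch omits.

    ```python
    def eqd2(total_dose_gy, dose_per_fraction_gy, alpha_beta_gy):
        """Equivalent dose in 2 Gy fractions from the standard LQ model:
        EQD2 = D * (d + a/b) / (2 + a/b)."""
        return total_dose_gy * ((dose_per_fraction_gy + alpha_beta_gy)
                                / (2.0 + alpha_beta_gy))

    # Example: 28 Gy in 4 brachytherapy fractions (7 Gy/fx), tumor a/b = 10 Gy
    print(f"EQD2 = {eqd2(28.0, 7.0, 10.0):.1f} Gy")  # -> 39.7 Gy
    ```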

  11. Quantitative assessment of landslide risk in design practice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romanov, A.M.; Darevskii, V.E.

    1995-03-01

    Developments of the State Institute for River Transport Protection, which are directed toward practical implementation of an engineering method recommended by regulatory documents for calculation of landslide phenomena, are cited; the potential of operating computer software is demonstrated. Results of calculations are compared with test data, and also with problems solved in the new developments.

  12. RESRAD for Radiological Risk Assessment. Comparison with EPA CERCLA Tools - PRG and DCC Calculators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, C.; Cheng, J.-J.; Kamboj, S.

    The purpose of this report is two-fold. First, the risk assessment methodology for both RESRAD and the EPA’s tools is reviewed. This includes a review of the EPA’s justification for using a dose-to-risk conversion factor to reduce the dose-based protective ARAR from 15 to 12 mrem/yr. Second, the models and parameters used in RESRAD and the EPA PRG and DCC Calculators are compared in detail, and the results are summarized and discussed. Although there are suites of software tools in the RESRAD family of codes and the EPA Calculators, the scope of this report is limited to the RESRAD (onsite) code for soil contamination and the EPA’s PRG and DCC Calculators, also for soil contamination.

  13. RMP*Comp

    EPA Pesticide Factsheets

    You can use this free software program to complete the Off-site Consequence Analyses (both worst case scenarios and alternative scenarios) required under the Risk Management Program rule, so that you don't have to do calculations by hand.

  14. Global review of open access risk assessment software packages valid for global or continental scale analysis

    NASA Astrophysics Data System (ADS)

    Daniell, James; Simpson, Alanna; Gunasekara, Rashmin; Baca, Abigail; Schaefer, Andreas; Ishizawa, Oscar; Murnane, Rick; Tijssen, Annegien; Deparday, Vivien; Forni, Marc; Himmelfarb, Anne; Leder, Jan

    2015-04-01

    Over the past few decades, a plethora of open access software packages for the calculation of earthquake, volcanic, tsunami, storm surge, wind and flood risk have been produced globally. As part of the World Bank GFDRR Review released at the Understanding Risk 2014 Conference, over 80 such open access risk assessment software packages were examined. Commercial software was not considered in the evaluation. A preliminary analysis was used to determine whether the 80 models were currently supported and if they were open access. This process was used to select a subset of 31 models that include 8 earthquake models, 4 cyclone models, 11 flood models, and 8 storm surge/tsunami models for more detailed analysis. By using multi-criteria decision analysis (MCDA) and simple descriptions of the software uses, the review allows users to select a few relevant software packages for their own testing and development. The detailed analysis evaluated the models on the basis of over 100 criteria and provides a synopsis of available open access natural hazard risk modelling tools. In addition, volcano software packages have since been added, bringing the compendium of risk software tools to more than 100. There has been a huge increase in the quality and availability of open access/source software over the past few years. For example, private entities such as Deltares now have an open source policy regarding some flood models (NGHS). In addition, leaders in developing risk models in the public sector, such as Geoscience Australia (EQRM, TCRM, TsuDAT, AnuGA) or CAPRA (ERN-Flood, Hurricane, CRISIS2007 etc.), are launching and/or helping many other initiatives. As we achieve greater interoperability between modelling tools, we will also achieve a future wherein different open source and open access modelling tools will be increasingly connected and adapted towards unified multi-risk model platforms and highly customised solutions. Many software tools could be improved by enabling user-defined exposure and vulnerability; without this function, many tools can only be used regionally and not at global or continental scale. It is becoming increasingly easy to use multiple packages for a single region and/or hazard to characterize the uncertainty in the risk, or to use them as checks for the sensitivities in the analysis. There is potential for valuable synergy between existing software: a number of open source software packages could be combined to generate a multi-risk model with multiple views of a hazard. This extensive review has attempted to provide a platform for dialogue between all open source and open access software packages and to inspire collaboration between developers, given the great work done by all open access and open source developers.

  15. Evaluation of Quantra Hologic Volumetric Computerized Breast Density Software in Comparison With Manual Interpretation in a Diverse Population

    PubMed Central

    Richard-Davis, Gloria; Whittemore, Brianna; Disher, Anthony; Rice, Valerie Montgomery; Lenin, Rathinasamy B; Dollins, Camille; Siegel, Eric R; Eswaran, Hari

    2018-01-01

    Objective: Increased mammographic breast density is a well-established risk factor for breast cancer development, regardless of age or ethnic background. The current gold standard for categorizing breast density consists of a radiologist estimation of percent density according to the American College of Radiology (ACR) Breast Imaging Reporting and Data System (BI-RADS) criteria. This study compares paired qualitative interpretations of breast density on digital mammograms with quantitative measurement of density using Hologic’s Food and Drug Administration–approved R2 Quantra volumetric breast density assessment tool. Our goal was to find the best cutoff value of Quantra-calculated breast density for stratifying patients accurately into high-risk and low-risk breast density categories. Methods: Screening digital mammograms from 385 subjects, aged 18 to 64 years, were evaluated. These mammograms were interpreted by a radiologist using the ACR’s BI-RADS density method, and had quantitative density measured using the R2 Quantra breast density assessment tool. The appropriate cutoff for breast density–based risk stratification using Quantra software was calculated using manually determined BI-RADS scores as a gold standard, in which scores of D3/D4 denoted high-risk densities and D1/D2 denoted low-risk densities. Results: The best cutoff value for risk stratification using Quantra-calculated breast density was found to be 14.0%, yielding a sensitivity of 65%, specificity of 77%, and positive and negative predictive values of 75% and 69%, respectively. Under bootstrap analysis, the best cutoff value had a mean ± SD of 13.70% ± 0.89%. Conclusions: Our study is the first to publish on a North American population that assesses the accuracy of the R2 Quantra system at breast density stratification. Quantitative breast density measures will improve accuracy and reliability of density determination, assisting future researchers to accurately calculate breast cancer risks associated with density increase. PMID:29511356
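
    The cutoff search and bootstrap reported here can be sketched as follows. The paper does not state its optimality criterion, so Youden's J (sensitivity + specificity - 1) is assumed; scores are Quantra percent-density values and labels are the dichotomized BI-RADS categories (D3/D4 = 1, D1/D2 = 0). All example data are hypothetical.

    ```python
    import random

    def sens_spec(scores, labels, cutoff):
        """Sensitivity/specificity of the rule `score >= cutoff`;
        labels: 1 = high-risk density (D3/D4), 0 = low-risk (D1/D2)."""
        tp = sum(1 for s, y in zip(scores, labels) if y == 1 and s >= cutoff)
        fn = sum(1 for s, y in zip(scores, labels) if y == 1 and s < cutoff)
        tn = sum(1 for s, y in zip(scores, labels) if y == 0 and s < cutoff)
        fp = sum(1 for s, y in zip(scores, labels) if y == 0 and s >= cutoff)
        return tp / (tp + fn), tn / (tn + fp)

    def best_cutoff(scores, labels, candidates):
        """Cutoff maximizing Youden's J = sensitivity + specificity - 1."""
        return max(candidates, key=lambda c: sum(sens_spec(scores, labels, c)))

    def bootstrap_cutoff(scores, labels, candidates, reps=1000):
        """Bootstrap mean +/- SD of the best cutoff (as reported in the paper)."""
        n, picks = len(scores), []
        for _ in range(reps):
            while True:  # resample until both classes are present
                idx = [random.randrange(n) for _ in range(n)]
                if 0 < sum(labels[i] for i in idx) < n:
                    break
            picks.append(best_cutoff([scores[i] for i in idx],
                                     [labels[i] for i in idx], candidates))
        mean = sum(picks) / reps
        sd = (sum((p - mean) ** 2 for p in picks) / (reps - 1)) ** 0.5
        return mean, sd

    random.seed(0)
    scores = [8.0, 11.5, 13.2, 14.1, 15.0, 18.3, 22.7, 9.9]  # hypothetical % density
    labels = [0, 0, 0, 1, 0, 1, 1, 0]
    mean, sd = bootstrap_cutoff(scores, labels, sorted(set(scores)), reps=200)
    print(f"bootstrap best cutoff: {mean:.1f} +/- {sd:.1f}")
    ```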

  17. Estimation of Apple Intake for the Exposure Assessment of Residual Chemicals Using Korea National Health and Nutrition Examination Survey Database

    PubMed Central

    2016-01-01

    The aims of this study were to develop strategies and algorithms for calculating food commodity intake suitable for the exposure assessment of residual chemicals, using the food intake database of the Korea National Health and Nutrition Examination Survey (KNHANES). In this study, apples and their processed food products were chosen as a model food for accurate calculation of food commodity intakes through the recently developed Korea food commodity intake calculation (KFCIC) software. The average daily intakes of total apples in Korea Health Statistics were 29.60 g in 2008, 32.40 g in 2009, 34.30 g in 2010, 28.10 g in 2011, and 24.60 g in 2012. The average daily intake of apples calculated by the KFCIC software was 2.65 g higher than that reported in Korea Health Statistics. The food intake data in Korea Health Statistics may reflect the intake of apples from mixed and processed foods less completely than the KFCIC software does. These results can affect the outcome of risk assessment for residual chemicals in foods. Therefore, the accurate estimation of the average daily intake of food commodities is very important, and more data on food intakes and recipes have to be applied to improve the quality of the data. Nevertheless, this study can contribute to the predictive estimation of exposure to possible residual chemicals and subsequent analysis of their potential risks. PMID:27152299

  18. Clinician time used for decision making: a best case workflow study using cardiovascular risk assessments and Ask Mayo Expert algorithmic care process models.

    PubMed

    North, Frederick; Fox, Samuel; Chaudhry, Rajeev

    2016-07-20

    Risk calculation is increasingly used in lipid management, congestive heart failure, and atrial fibrillation. The risk scores are then used for decisions about statin use, anticoagulation, and implantable defibrillator use. Calculating risks for patients and making decisions based on these risks is often done at the point of care and is an additional time burden for clinicians that can be decreased by automating the tasks and using clinical decision-making support. Using Morae Recorder software, we timed 30 healthcare providers tasked with calculating the overall risk of cardiovascular events, sudden death in heart failure, and thrombotic event risk in atrial fibrillation. Risk calculators used were the American College of Cardiology Atherosclerotic Cardiovascular Disease risk calculator (AHA-ASCVD risk), the Seattle Heart Failure Model (SHFM risk), and CHA2DS2-VASc. We also timed the 30 providers using Ask Mayo Expert care process models for lipid management, heart failure management, and atrial fibrillation management based on the calculated risk scores. We used the Mayo Clinic primary care panel to estimate the time for calculating the risk of an entire panel. Mean provider times to complete the CHA2DS2-VASc, AHA-ASCVD risk, and SHFM were 36, 45, and 171 s, respectively. For decision making about atrial fibrillation, lipids, and heart failure, the mean times (including risk calculations) were 85, 110, and 347 s, respectively. Even under best case circumstances, providers take a significant amount of time to complete risk assessments. For a complete panel of patients this can lead to hours of time required to make decisions about prescribing statins, use of anticoagulation, and medications for heart failure. Informatics solutions are needed to capture data in the medical record and serve up automatically calculated risk assessments to physicians and other providers at the point of care.
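
    Of the three calculators timed, CHA2DS2-VASc is simple enough to show in full; the sketch below implements the standard published weights (this is the well-known scoring rule, not code from the study).

    ```python
    def cha2ds2_vasc(age, female, chf, hypertension, diabetes,
                     stroke_tia, vascular_disease):
        """CHA2DS2-VASc stroke-risk score for atrial fibrillation.

        Standard weights: CHF 1, hypertension 1, age >= 75: 2 (65-74: 1),
        diabetes 1, prior stroke/TIA 2, vascular disease 1, female sex 1.
        """
        score = 2 if age >= 75 else (1 if age >= 65 else 0)
        score += chf + hypertension + diabetes + vascular_disease
        score += 2 * stroke_tia
        score += 1 if female else 0
        return score

    # 72-year-old woman with hypertension and diabetes -> 1 + 1 + 1 + 1 = 4
    print(cha2ds2_vasc(72, female=True, chf=0, hypertension=1,
                       diabetes=1, stroke_tia=0, vascular_disease=0))
    ```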

  19. Educational software for simulating risk of HIV infection

    NASA Astrophysics Data System (ADS)

    Rothberg, Madeleine A.; Sandberg, Sonja; Awerbuch, Tamara E.

    1994-03-01

    The AIDS epidemic is still growing rapidly and the disease is thought to be uniformly fatal. With no vaccine or cure in sight, education during high school years is a critical component in the prevention of AIDS. We propose the use of computer software with which high school students can explore via simulation their own risk of acquiring an HIV infection given certain sexual behaviors. This particular software is intended to help students understand the three factors that determine their risk of HIV infection (number of sexual acts, probability that their partners are infected, and riskiness of the specific sexual activities they choose). Users can explicitly calculate their own chances of becoming infected based on decisions they make. Use of the program is expected to personalize the risk of HIV infection and thus increase users' concern and awareness. Behavioral change may not result from increased knowledge alone. Therefore the effectiveness of this program in changing attitudes toward risky sexual behaviors would be enhanced when the simulation is embedded in an appropriate curriculum. A description of the program and an example of its use are presented.
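
    The three factors named in the abstract combine naturally in an independent per-act (Bernoulli) risk model, presumably the kind of calculation the software performs; a minimal sketch with hypothetical inputs:

    ```python
    def infection_risk(num_acts, p_partner_infected, p_transmission_per_act):
        """Cumulative infection risk under an independent per-act model:
        P = 1 - (1 - pi * beta) ** n, the standard Bernoulli-process form."""
        per_act = p_partner_infected * p_transmission_per_act
        return 1.0 - (1.0 - per_act) ** num_acts

    # Hypothetical inputs: 100 acts, 1% partner prevalence, 0.1% per-act transmission
    print(f"{infection_risk(100, 0.01, 0.001):.5f}")  # ~0.00100 (0.1%)
    ```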

  20. Design of Stripping Columns Applied to Drinking Water to Minimize Carcinogenic Risk from Trihalomethanes (THMs)

    PubMed Central

    Canosa, Joel

    2018-01-01

    The aim of this study is the application of a software tool to the design of stripping columns to calculate the removal of trihalomethanes (THMs) from drinking water. The tool also allows calculating the rough capital cost of the column and the decrease in carcinogenic risk indices associated with the elimination of THMs and, thus, the investment required to save a human life. The design of stripping columns includes the determination, among other factors, of the height of a transfer unit (HOG), the number of transfer units (NOG), and the cross-section (S) of the columns based on the study of pressure drop. These results have been compared with THM stripping literature values, showing that the simulation is sufficiently conservative. Three case studies were chosen to apply the developed software. The first case study was representative of small-scale application to a community in Córdoba (Spain), where chloroform is predominant and has a low concentration. The second case study was of an intermediate scale in a region in Venezuela, and the third case study was representative of large-scale treatment of water in the Barcelona metropolitan region (Spain). Results showed that case studies with larger scale and higher initial risk offer the best capital investment to decrease the risk. PMID:29562670
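
    The HOG/NOG design arithmetic can be sketched with the standard transfer-unit method for countercurrent packed-tower air stripping (clean inlet air assumed); this is textbook sizing math, not the paper's actual tool, and the example numbers are hypothetical.

    ```python
    import math

    def packed_tower_height(c_in, c_out, henry_dimensionless, air_to_water, htu_m):
        """Packed-tower air stripper sizing via the transfer-unit method.

        R   = H * (Qa/Qw)                                  (stripping factor)
        NTU = R/(R-1) * ln(((c_in/c_out)*(R-1) + 1) / R)   (clean inlet air)
        Z   = HTU * NTU
        The abstract's HOG/NOG correspond to HTU/NTU here; HTU would come from
        a mass-transfer correlation or vendor data, taken as an input below.
        """
        r = henry_dimensionless * air_to_water
        if r <= 1.0:
            raise ValueError("stripping factor must exceed 1 for this form")
        ntu = r / (r - 1.0) * math.log(((c_in / c_out) * (r - 1.0) + 1.0) / r)
        return ntu, htu_m * ntu

    # Example: 90% chloroform removal, H ~ 0.15, air:water 30:1, HTU ~ 0.6 m
    ntu, z = packed_tower_height(100.0, 10.0, 0.15, 30.0, 0.6)
    print(f"NTU = {ntu:.2f}, column height = {z:.2f} m")
    ```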

  1. APPLICATION OF THE US DECISION SUPPORT TOOL FOR MATERIALS AND WASTE MANAGEMENT

    EPA Science Inventory

    EPA's National Risk Management Research Laboratory has led the development of a municipal solid waste decision support tool (MSW-DST). The computer software can be used to calculate life-cycle environmental tradeoffs and full costs of different waste management plans or recycling...

  2. Architecture for Integrated Medical Model Dynamic Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Jaworske, D. A.; Myers, J. G.; Goodenow, D.; Young, M.; Arellano, J. D.

    2016-01-01

    Probabilistic Risk Assessment (PRA) is a modeling tool used to predict potential outcomes of a complex system based on a statistical understanding of many initiating events. Utilizing a Monte Carlo method, thousands of instances of the model are considered and outcomes are collected. PRA is considered static, utilizing probabilities alone to calculate outcomes. Dynamic Probabilistic Risk Assessment (dPRA) is an advanced concept where modeling predicts the outcomes of a complex system based not only on the probabilities of many initiating events, but also on a progression of dependencies brought about by progressing down a time line. Events are placed in a single time line, adding each event to a queue, as managed by a planner. Progression down the time line is guided by rules, as managed by a scheduler. The recently developed Integrated Medical Model (IMM) summarizes astronaut health as governed by the probabilities of medical events and mitigation strategies. Managing the software architecture process provides a systematic means of creating, documenting, and communicating a software design early in the development process. The software architecture process begins with establishing requirements and the design is then derived from the requirements.
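
    A toy sketch of the planner/scheduler idea: events sit on a single time line in a priority queue, each is sampled against its probability, and dependent events fire only if their prerequisite has occurred. Event names and numbers are invented; the IMM architecture is far richer than this.

    ```python
    import heapq
    import random

    def run_timeline(events, horizon):
        """Walk a single time line of queued events (planner) in time order
        (scheduler), sampling each occurrence; an event with a prerequisite
        fires only if that prerequisite has already occurred."""
        queue = [(t, name) for name, (t, _, _) in events.items()]
        heapq.heapify(queue)
        occurred = set()
        while queue:
            t, name = heapq.heappop(queue)
            if t > horizon:
                break
            _, prob, prereq = events[name]
            if prereq is not None and prereq not in occurred:
                continue  # dependency rule blocks this event
            if random.random() < prob:
                occurred.add(name)
        return occurred

    random.seed(7)
    events = {  # hypothetical chain: name -> (day, probability, prerequisite)
        "back_injury":    (10.0, 0.05, None),
        "medication_out": (40.0, 0.30, "back_injury"),
        "lost_duty_time": (41.0, 0.50, "medication_out"),
    }
    print(run_timeline(events, horizon=180.0))
    ```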

  3. A Web-Based System for Bayesian Benchmark Dose Estimation.

    PubMed

    Shao, Kan; Shapiro, Andrew J

    2018-01-11

    Benchmark dose (BMD) modeling is an important step in human health risk assessment and is used as the default approach to identify the point of departure for risk assessment. A probabilistic framework for dose-response assessment has been proposed and advocated by various institutions and organizations; therefore, a reliable tool is needed to provide distributional estimates for BMD and other important quantities in dose-response assessment. We developed an online system for Bayesian BMD (BBMD) estimation and compared results from this software with the U.S. Environmental Protection Agency's (EPA's) Benchmark Dose Software (BMDS). The system is built on a Bayesian framework featuring the application of Markov chain Monte Carlo (MCMC) sampling for model parameter estimation and BMD calculation, which makes the BBMD system fundamentally different from the currently prevailing BMD software packages. In addition to estimating the traditional BMDs for dichotomous and continuous data, the developed system is also capable of computing model-averaged BMD estimates. A total of 518 dichotomous and 108 continuous data sets extracted from the U.S. EPA's Integrated Risk Information System (IRIS) database (and similar databases) were used as testing data to compare the estimates from the BBMD and BMDS programs. The results suggest that the BBMD system may outperform the BMDS program in a number of aspects, including fewer failed BMD and BMDL calculations and estimates. The BBMD system is a useful alternative tool for estimating BMD, with additional functionalities for BMD analysis based on the most recent research. Most importantly, the BBMD has the potential to incorporate prior information to make dose-response modeling more reliable and can provide distributional estimates for important quantities in dose-response assessment, which greatly facilitates the current trend toward probabilistic risk assessment. https://doi.org/10.1289/EHP1289.
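
    The MCMC machinery can be illustrated with a toy Metropolis sampler for a quantal-linear model, p(d) = g + (1 - g)(1 - exp(-b d)), for which the extra-risk BMD has the closed form -ln(1 - BMR)/b. Priors, proposal widths, and the data set below are all illustrative; BBMD itself supports many models and model averaging.

    ```python
    import math
    import random

    def log_likelihood(g, b, data):
        """Binomial log-likelihood of the quantal-linear model
        p(d) = g + (1 - g) * (1 - exp(-b * d))."""
        ll = 0.0
        for dose, n, affected in data:
            p = g + (1 - g) * (1 - math.exp(-b * dose))
            p = min(max(p, 1e-12), 1 - 1e-12)
            ll += affected * math.log(p) + (n - affected) * math.log(1 - p)
        return ll

    def bmd_posterior(data, bmr=0.10, iters=20000, seed=1):
        """Toy Metropolis sampler for (g, b) under flat priors; returns sorted
        posterior draws of the extra-risk BMD, -ln(1 - BMR) / b."""
        random.seed(seed)
        g, b = 0.05, 0.01
        ll = log_likelihood(g, b, data)
        draws = []
        for _ in range(iters):
            g2 = min(max(g + random.gauss(0.0, 0.02), 1e-6), 0.999)
            b2 = max(b + random.gauss(0.0, 0.002), 1e-9)
            ll2 = log_likelihood(g2, b2, data)
            if random.random() < math.exp(min(0.0, ll2 - ll)):
                g, b, ll = g2, b2, ll2
            draws.append(-math.log(1.0 - bmr) / b)
        return sorted(draws[iters // 2:])  # discard burn-in

    data = [(0, 50, 2), (10, 50, 5), (50, 50, 14), (100, 50, 29)]  # dose, n, affected
    draws = bmd_posterior(data)
    print(f"BMD10 median: {draws[len(draws) // 2]:.1f},",
          f"BMDL10 (5th pct): {draws[len(draws) // 20]:.1f}")
    ```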

  4. Usefulness of the novel risk estimation software, Heart Risk View, for the prediction of cardiac events in patients with normal myocardial perfusion SPECT.

    PubMed

    Sakatani, Tomohiko; Shimoo, Satoshi; Takamatsu, Kazuaki; Kyodo, Atsushi; Tsuji, Yumika; Mera, Kayoko; Koide, Masahiro; Isodono, Koji; Tsubakimoto, Yoshinori; Matsuo, Akiko; Inoue, Keiji; Fujita, Hiroshi

    2016-12-01

    Myocardial perfusion single-photon emission computed tomography (SPECT) can predict cardiac events in patients with coronary artery disease with high accuracy; however, pseudo-negative cases sometimes occur. Heart Risk View, which is based on the prospective cohort study J-ACCESS, is software for evaluating cardiac event probability. We examined whether Heart Risk View was useful for evaluating cardiac risk in patients with normal myocardial perfusion SPECT (MPS). We studied 3461 consecutive patients who underwent MPS for the detection of myocardial ischemia; those with normal MPS were enrolled in this study (n = 698). We calculated the cardiac event probability with Heart Risk View and followed up for 3.8 ± 2.4 years. Cardiac events were defined as cardiac death, non-fatal myocardial infarction, and heart failure requiring hospitalization. During the follow-up period, 21 patients (3.0%) had cardiac events. The event probability calculated by Heart Risk View was higher in the event group (5.5 ± 2.6 vs. 2.9 ± 2.6%, p < 0.001). According to the receiver-operating characteristic curve, the cut-off point of the event probability for predicting cardiac events was 3.4% (sensitivity 0.76, specificity 0.72, and AUC 0.85). Kaplan-Meier curves revealed a higher event rate in the high-event-probability group by the log-rank test (p < 0.001). Although myocardial perfusion SPECT is useful for the prediction of cardiac events, risk estimation by Heart Risk View adds prognostic information, especially in patients with normal MPS.

  5. An Updated Comprehensive Risk Analysis for Radioisotopes Identified of High Risk to National Security in the Event of a Radiological Dispersion Device Scenario

    NASA Astrophysics Data System (ADS)

    Robinson, Alexandra R.

    An updated global survey of radioisotope production and distribution was completed and subjected to a revised "down-selection methodology" to determine those radioisotopes that should be classified as potential national security risks based on availability and key physical characteristics that could be exploited in a hypothetical radiological dispersion device. The potential at-risk radioisotopes were then used in a modeling software suite known as Turbo FRMAC, developed by Sandia National Laboratories, to characterize plausible contamination maps known as Protective Action Guideline Zone Maps. This software was also used to calculate the whole body dose equivalent for exposed individuals based on various dispersion parameters and scenarios. Derived Response Levels were then determined for each radioisotope using: 1) target doses to members of the public provided by the U.S. EPA, and 2) occupational dose limits provided by the U.S. Nuclear Regulatory Commission. The limiting Derived Response Level for each radioisotope was also determined.

  6. Wallops Ship Surveillance System

    NASA Technical Reports Server (NTRS)

    Smith, Donna C.

    2011-01-01

    Approved as a Wallops control center backup system, the Wallops Ship Surveillance Software is a day-of-launch risk analysis tool for spaceport activities. The system calculates impact probabilities and displays ship locations relative to boundary lines. It enables rapid analysis of possible flight paths to preclude the need to cancel launches and allow execution of launches in a timely manner. Its design is based on low-cost, large-customer-base elements including personal computers, the Windows operating system, C/C++ object-oriented software, and network interfaces. In conformance with the NASA software safety standard, the system is designed to ensure that it does not falsely report a safe-for-launch condition. To improve on the current ship surveillance method, the system is designed to prevent delay of launch under a safe-for-launch condition. A single workstation is designated the controller of the official ship information and the official risk analysis. Copies of this information are shared with other networked workstations. The program design is divided into five subsystem areas: 1. Communication Link -- threads that control the networking of workstations; 2. Contact List -- a thread that controls a list of protected item (ocean vessel) information; 3. Hazard List -- threads that control a list of hazardous item (debris) information and associated risk calculation information; 4. Display -- threads that control operator inputs and screen display outputs; and 5. Archive -- a thread that controls archive file read and write access. Currently, most of the hazard list thread and parts of other threads are being reused as part of a new ship surveillance system, under the SureTrak project.

  7. Cumulative radiation exposure and associated cancer risk estimates for scoliosis patients: Impact of repetitive full spine radiography.

    PubMed

    Law, Martin; Ma, Wang-Kei; Lau, Damian; Chan, Eva; Yip, Lawrance; Lam, Wendy

    2016-03-01

    To quantitatively evaluate the cumulative effective dose and associated cancer risk for scoliotic patients undergoing repetitive full spine radiography during their diagnosis and follow-up periods. Organ absorbed doses for scoliotic patients undergoing full spine radiography at different ages were simulated with the PCXMC software. Gender-specific effective dose was then calculated with the ICRP-103 approach. Values of lifetime attributable cancer risk for patients exposed at different ages were calculated for both genders and for Asian and Western populations. Mathematical fits for effective dose and for lifetime attributable cancer risk, as functions of age at exposure, were analytically obtained to quantitatively estimate the patient's cumulative effective dose and cancer risk. The cumulative effective dose of full spine radiography with posteroanterior and lateral projections for patients exposed annually between the ages of 5 and 30 years using a digital radiography system was calculated as 15 mSv. The corresponding cumulative lifetime attributable cancer risk for the Asian and Western populations was calculated as 0.08-0.17%. Female scoliotic patients would be at a statistically significantly higher cumulative cancer risk than male patients under the same full spine radiography protocol. We demonstrate the use of computer simulation and analytic formulas to quantitatively obtain the cumulative effective dose and cancer risk at any age of exposure, both of which are valuable information for medical personnel and for patients' parents concerned about radiation safety in repetitive full spine radiography.

  8. The development of a 3D risk analysis method.

    PubMed

    I, Yet-Pole; Cheng, Te-Lung

    2008-05-01

    Much attention has been paid to quantitative risk analysis (QRA) research in recent years due to the increasingly severe disasters that have happened in the process industries. Owing to the computational complexity involved, very few software packages, such as SAFETI, can really make the risk presentation meet practical requirements. However, the traditional risk presentation method, like the individual risk contour in SAFETI, is mainly based on the consequence analysis results of dispersion modeling, which usually assumes that the vapor cloud disperses over a constant ground roughness on a flat terrain with no obstructions and no concentration fluctuations, which is quite different from the real situation of a chemical process plant. All these models usually over-predict the hazardous regions in order to maintain their conservativeness, which also increases the uncertainty of the simulation results. On the other hand, a more rigorous model such as a computational fluid dynamics (CFD) model can resolve the previous limitations; however, it cannot by itself resolve the complexity of the risk calculations. In this research, a conceptual three-dimensional (3D) risk calculation method is proposed, combining the results of a series of CFD simulations with post-processing procedures to obtain 3D individual risk iso-surfaces. It is believed that such a technique will not only be applicable to risk analysis at ground level, but may also be extended to aerial, submarine, or space risk analyses in the near future.

  9. Development and Application of Collaborative Optimization Software for Plate-fin Heat Exchanger

    NASA Astrophysics Data System (ADS)

    Chunzhen, Qiao; Ze, Zhang; Jiangfeng, Guo; Jian, Zhang

    2017-12-01

    This paper introduces the design ideas behind, and application examples of, calculation software for plate-fin heat exchangers. Because of the large amount of computation involved in designing and optimizing heat exchangers, we used Visual Basic 6.0 as the development platform to build a basic calculation program that reduces the computational burden. The design case is a plate-fin heat exchanger specified for boiler tail flue gas. The software is based on the traditional design method for plate-fin heat exchangers. Using the software for the design and calculation of plate-fin heat exchangers effectively reduces the amount of computation while giving results comparable to traditional methods, and therefore has practical value.

  10. Validation of an online risk calculator for the prediction of anastomotic leak after colon cancer surgery and preliminary exploration of artificial intelligence-based analytics.

    PubMed

    Sammour, T; Cohen, L; Karunatillake, A I; Lewis, M; Lawrence, M J; Hunter, A; Moore, J W; Thomas, M L

    2017-11-01

    Recently published data support the use of a web-based risk calculator (www.anastomoticleak.com) for the prediction of anastomotic leak after colectomy. The aim of this study was to externally validate this calculator on a larger dataset. Consecutive adult patients undergoing elective or emergency colectomy for colon cancer at a single institution over a 9-year period were identified using the Binational Colorectal Cancer Audit database. Patients with a rectosigmoid cancer, an R2 resection, or a diverting ostomy were excluded. The primary outcome was anastomotic leak within 90 days as defined by previously published criteria. The area under the receiver operating characteristic curve (AUROC) was derived and compared with that of the American College of Surgeons National Surgical Quality Improvement Program® (ACS NSQIP) calculator and the colon leakage score (CLS) calculator for left colectomy. Commercially available artificial intelligence-based analytics software was used to further interrogate the prediction algorithm. A total of 626 patients were identified. Four hundred and fifty-six patients met the inclusion criteria, and 402 had complete data available for all the calculator variables (126 had a left colectomy). Laparoscopic surgery was performed in 39.6% and emergency surgery in 14.7%. The anastomotic leak rate was 7.2%, with 31.0% requiring reoperation. The anastomoticleak.com calculator was significantly predictive of leak and performed better than the ACS NSQIP calculator (AUROC 0.73 vs 0.58) and the CLS calculator (AUROC 0.96 vs 0.80) for left colectomy. Artificial intelligence-based predictive analysis supported these findings and identified an improved prediction model. The anastomotic leak risk calculator is significantly predictive of anastomotic leak after colon cancer resection. Wider investigation of artificial intelligence-based analytics for risk prediction is warranted.

  11. MO-F-CAMPUS-I-01: A System for Automatically Calculating Organ and Effective Dose for Fluoroscopically-Guided Procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiong, Z; Vijayan, S; Rana, V

    2015-06-15

    Purpose: A system was developed that automatically calculates the organ and effective dose for individual fluoroscopically-guided procedures using a log of the clinical exposure parameters. Methods: We have previously developed a dose tracking system (DTS) to provide a real-time color-coded 3D mapping of skin dose. This software produces a log file of all geometry and exposure parameters for every x-ray pulse during a procedure. The data in the log files is input into PCXMC, a Monte Carlo program that calculates organ and effective dose for projections and exposure parameters set by the user. We developed a MATLAB program to read data from the log files produced by the DTS and to automatically generate the definition files in the format used by PCXMC. The processing is done at the end of a procedure after all exposures are completed. Since there are thousands of exposure pulses with various parameters for fluoroscopy, DA and DSA and at various projections, the data for exposures with similar parameters is grouped prior to entry into PCXMC to reduce the number of Monte Carlo calculations that need to be performed. Results: The software developed automatically transfers data from the DTS log file to PCXMC and runs the program for each grouping of exposure pulses. When the doses from all exposure events are calculated, the doses for each organ and the effective doses are summed to obtain procedure totals. For a complicated interventional procedure, the calculations can be completed on a PC without manual intervention in less than 30 minutes depending on the level of data grouping. Conclusion: This system allows organ dose to be calculated for individual procedures for every patient without tedious calculations or data entry so that estimates of stochastic risk can be obtained in addition to the deterministic risk estimate provided by the DTS. Partial support from NIH grant R01EB002873 and Toshiba Medical Systems Corp.
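
    The grouping step described in the Methods can be sketched as simple binning of pulse parameters, so that one Monte Carlo run per bin, weighted by accumulated exposure, replaces thousands of per-pulse runs. The bin widths and log-entry format below are assumptions for illustration, not the actual DTS/PCXMC interface.

    ```python
    from collections import Counter

    def group_pulses(log_entries, angle_bin_deg=5.0, kvp_bin=5.0):
        """Bin x-ray pulses with similar geometry/technique so one Monte Carlo
        run can stand in for the whole group, weighted by summed exposure.
        Each entry: (gantry_angle_deg, kvp, mas)."""
        groups = Counter()
        for angle, kvp, mas in log_entries:
            key = (round(angle / angle_bin_deg) * angle_bin_deg,
                   round(kvp / kvp_bin) * kvp_bin)
            groups[key] += mas  # accumulate exposure weight per group
        return groups

    pulses = [(0.2, 79, 1.1), (1.1, 80, 1.0), (30.4, 90, 2.5), (29.8, 91, 2.4)]
    for (angle, kvp), total_mas in group_pulses(pulses).items():
        print(f"angle~{angle:g} deg, kVp~{kvp:g}: total {total_mas:.1f} mAs")
    ```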

  12. Software for X-Ray Images Calculation of Hydrogen Compression Device in Megabar Pressure Range

    NASA Astrophysics Data System (ADS)

    Egorov, Nikolay; Bykov, Alexander; Pavlov, Valery

    2007-06-01

    Software for simulating x-ray images is described. The software is part of an x-ray method used to investigate the equation of state of hydrogen in the megabar pressure range. A graphical interface clearly and simply allows users to input the data for the x-ray image calculation (properties of the studied device, parameters of the x-ray radiation source, parameters of the x-ray radiation recorder, and the experiment geometry), to represent the calculation results, and to efficiently transmit them to other software for processing. The calculation time is minimized, which makes it possible to perform calculations interactively. The software is written in the MATLAB system.

  13. A Comprehensive Software and Database Management System for Glomerular Filtration Rate Estimation by Radionuclide Plasma Sampling and Serum Creatinine Methods.

    PubMed

    Jha, Ashish Kumar

    2015-01-01

    Glomerular filtration rate (GFR) estimation by the plasma sampling method is considered the gold standard. However, this method is not widely used because of the complex technique and cumbersome calculations, coupled with the lack of availability of user-friendly software. The routinely used serum creatinine method (SrCrM) of GFR estimation also requires the use of online calculators, which cannot be used without internet access. We have developed user-friendly software, "GFR estimation software", which gives the options to estimate GFR by the plasma sampling method as well as SrCrM. We used Microsoft Windows® as the operating system, Visual Basic 6.0 as the front end, and Microsoft Access® as the database tool to develop this software. We used Russell's formula for GFR calculation by the plasma sampling method. GFR calculations using serum creatinine have been implemented using the MDRD, Cockcroft-Gault, Schwartz, and Counahan-Barratt methods. The developed software performs the mathematical calculations correctly and is user-friendly. The software also enables storage and easy retrieval of the raw data, patient information, and calculated GFR for further processing and comparison. This is user-friendly software to calculate the GFR by various plasma sampling methods and blood parameters. The software is also a good system for storing raw and processed data for future analysis.
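
    Of the serum creatinine methods listed, Cockcroft-Gault is compact enough to show; this is the standard published formula, not code from the described software.

    ```python
    def cockcroft_gault(age_years, weight_kg, serum_creatinine_mg_dl, female):
        """Cockcroft-Gault creatinine clearance (mL/min):
        CrCl = (140 - age) * weight / (72 * SCr), multiplied by 0.85 if female."""
        crcl = (140 - age_years) * weight_kg / (72.0 * serum_creatinine_mg_dl)
        return crcl * 0.85 if female else crcl

    print(f"{cockcroft_gault(60, 70, 1.0, female=False):.0f} mL/min")  # ~78
    ```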

  14. Analysis of wheel rim - Material and manufacturing aspects

    NASA Astrophysics Data System (ADS)

    Misra, Sheelam; Singh, Abhiraaj; James, Eldhose

    2018-05-01

    The tire in an automobile is supported by the rim of the wheel, whose shape and dimensions must be adjusted to accommodate a specified tire. In this study, a car wheel rim belonging to the disc wheel category is considered. Design is an important industrial operation used to define and specify the quality of the product, and design and modelling reduce the risk of damage involved in the manufacturing process. The design of this wheel rim is performed in modelling software. After designing the model, it is imported for analysis. The analysis software is used to calculate the different types of forces, stresses, torques, and pressures acting upon the rim of the wheel, reducing the time a human would spend on the calculations. The analysis considers two different materials, structural steel and aluminium; both are analyzed and their performance noted.

  15. 78 FR 54365 - Uniform Fine Assessment Version 4.0 Software; Calculating Amounts of Civil Penalties for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-03

    ... Version 4.0 Software; Calculating Amounts of Civil Penalties for Violations of Regulations AGENCY: Federal... Agency has begun using the Uniform Fine Assessment (UFA) Version 4.0 software to calculate the amounts of... penalties for violations of the FMCSRs and HMRs and since the mid-1990s FMCSA has used its UFA software to...

  16. Introduction to Flight Test Engineering (Introduction aux techniques des essais en vol)

    DTIC Science & Technology

    2005-07-01

    [Fragmentary full-text excerpt] ...or aircraft parameters; calculations in the frequency domain (Fast Fourier Transform); data analysis with dedicated software for: signal...density, Fast Fourier Transform, transfer function analysis, frequency response analysis, etc.; presentation: color/black & white, display screen...envelope by operating the airplane at increasing ranges - representing increasing risk - of engine operation, airspeeds both fast and slow, altitude

  17. Product Engineering Class in the Software Safety Risk Taxonomy for Building Safety-Critical Systems

    NASA Technical Reports Server (NTRS)

    Hill, Janice; Victor, Daniel

    2008-01-01

    When software safety requirements are imposed on legacy safety-critical systems, retrospective safety cases need to be formulated as part of recertifying the systems for further use, and risks must be documented and managed to give confidence for reusing the systems. The SEI Software Development Risk Taxonomy [4] focuses on general software development issues. It does not, however, cover all the safety risks. The Software Safety Risk Taxonomy [8] was developed to provide a construct for eliciting and categorizing software safety risks in a straightforward manner. In this paper, we present extended work on the taxonomy for safety that incorporates the additional issues inherent in the development and maintenance of safety-critical systems with software. An instrument called a Software Safety Risk Taxonomy Based Questionnaire (TBQ) is generated containing questions addressing each safety attribute in the Software Safety Risk Taxonomy. Software safety risks are surfaced using the new TBQ and then analyzed. In this paper we give the definitions for the specialized Product Engineering Class within the Software Safety Risk Taxonomy. At the end of the paper, we present the tool known as the 'Legacy Systems Risk Database Tool' that is used to collect and analyze the data required to show traceability to a particular safety standard.

  18. Automating the evaluation of flood damages: methodology and potential gains

    NASA Astrophysics Data System (ADS)

    Eleutério, Julian; Martinez, Edgar Daniel

    2010-05-01

    The evaluation of flood damage potential consists of three main steps: assessing and processing data, combining data, and calculating potential damages. The first step consists of modelling hazard and assessing vulnerability. In general, this step of the evaluation demands more time and investment than the others. The second step consists of combining spatial data on hazard with spatial data on vulnerability. A Geographic Information System (GIS) is a fundamental tool in the realization of this step, since GIS software allows the simultaneous analysis of spatial and matrix data. The third step consists of calculating potential damages by means of damage functions or contingent analysis. All steps demand time and expertise; however, the last two steps must be repeated several times when comparing different management scenarios. In addition, uncertainty analysis and sensitivity tests are made during the second and third steps of the evaluation. The feasibility of these steps could be relevant in the choice of the extent of the evaluation: low feasibility could lead to choosing not to evaluate uncertainty or to limit the number of scenario comparisons. Several computer models have been developed over time in order to evaluate flood risk, and GIS software is largely used to realise flood risk analyses. The software is used to combine and process different types of data, and to visualise the risk and the evaluation results. The main advantages of using a GIS in these analyses are: the possibility of "easily" realising the analyses several times, in order to compare different scenarios and study uncertainty; the generation of datasets which could be used at any time in the future to support territorial decision making; and the possibility of adding information over time to update the dataset and make other analyses. However, these analyses require personnel specialisation and time. The use of GIS software to evaluate flood risk requires personnel with a double professional specialisation: the professional should be proficient in GIS software and in flood damage analysis (which is already a multidisciplinary field). Great effort is necessary in order to correctly evaluate flood damages, and updating and improving the evaluation over time become a difficult task. The automation of this process should bring great advances in flood management studies over time, especially for public utilities. This study has two specific objectives: (1) to show the entire process of automating the second and third steps of flood damage evaluations; and (2) to analyse the resulting potential gains in terms of the time and expertise needed in the analysis. A programming language is used within GIS software in order to automate the combination of hazard and vulnerability data and the calculation of potential damages. We discuss the overall process of flood damage evaluation. The main result of this study is a computational tool which allows significant operational gains in flood loss analyses. We quantify these gains by means of a hypothetical example. The tool significantly reduces the time of analysis and the need for expertise. An indirect gain is that sensitivity and cost-benefit analyses can be more easily realised.

  19. Reducing Risk in DoD Software-Intensive Systems Development

    DTIC Science & Technology

    2016-03-01

    [Fragmentary full-text excerpt] ...intensive systems development risk. This research addresses the use of the Technical Readiness Assessment (TRA) using the nine-level software Technology... The software TRLs are ineffective in reducing technical risk for the software component development. Without the software TRLs, there is no...effective method to perform software TRA or reduce the technical development risk. The software component will behave as a new, untried technology in nearly

  20. Multiphase flow calculation software

    DOEpatents

    Fincke, James R.

    2003-04-15

    Multiphase flow calculation software and computer-readable media carrying computer executable instructions for calculating liquid and gas phase mass flow rates of high void fraction multiphase flows. The multiphase flow calculation software employs various given, or experimentally determined, parameters in conjunction with a plurality of pressure differentials of a multiphase flow, preferably supplied by a differential pressure flowmeter or the like, to determine liquid and gas phase mass flow rates of the high void fraction multiphase flows. Embodiments of the multiphase flow calculation software are suitable for use in a variety of applications, including real-time management and control of an object system.

  1. Standardised survey method for identifying catchment risks to water quality.

    PubMed

    Baker, D L; Ferguson, C M; Chier, P; Warnecke, M; Watkinson, A

    2016-06-01

    This paper describes the development and application of a systematic methodology to identify and quantify risks in drinking water and recreational catchments. The methodology assesses microbial and chemical contaminants from both diffuse and point sources within a catchment using Escherichia coli, protozoan pathogens and chemicals (including fuel and pesticides) as index contaminants. Hazard source information is gathered by a defined sanitary survey process involving the use of a software tool which groups hazards into six types: sewage infrastructure, on-site sewage systems, industrial, stormwater, agriculture and recreational sites. The survey estimates the likelihood of the site affecting catchment water quality, and the potential consequences, enabling the calculation of risk for individual sites. These risks are integrated to calculate a cumulative risk for each sub-catchment and the whole catchment. The cumulative risk process accounts for the proportion of potential input sources surveyed and for the transfer of contaminants from upstream to downstream sub-catchments. The output risk matrices show the relative risk sources for each of the index contaminants, highlighting those with the greatest impact on water quality at a sub-catchment and catchment level. Verification of the sanitary survey assessments and prioritisation is achieved by comparison with water quality data and microbial source tracking.
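
    A minimal sketch of the described risk roll-up, assuming a likelihood x consequence site score, scaling by the fraction of sources surveyed, and a fixed upstream-to-downstream transfer factor; all scales and values below are invented, not the survey tool's actual scheme.

        from functools import lru_cache

        sites = {
            "upper": [(4, 5), (2, 3)],   # (likelihood 1-5, consequence 1-5), assumed scales
            "lower": [(3, 4)],
        }
        surveyed_fraction = {"upper": 0.8, "lower": 0.5}  # share of sources surveyed
        downstream_of = {"lower": "upper"}                # sub-catchment topology
        TRANSFER = 0.5  # assumed upstream-to-downstream attenuation factor

        @lru_cache(maxsize=None)
        def cumulative_risk(sub):
            # Local risk, inflated for unsurveyed sources, plus attenuated
            # risk inherited from the upstream sub-catchment (if any).
            local = sum(l * c for l, c in sites[sub]) / surveyed_fraction[sub]
            upstream = downstream_of.get(sub)
            return local + (TRANSFER * cumulative_risk(upstream) if upstream else 0.0)

        for sub in sites:
            print(sub, round(cumulative_risk(sub), 1))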

  2. Developing a smartphone software package for predicting atmospheric pollutant concentrations at mobile locations.

    PubMed

    Larkin, Andrew; Williams, David E; Kile, Molly L; Baird, William M

    2015-06-01

    There is considerable evidence that exposure to air pollution is harmful to health. In the U.S., ambient air quality is monitored by Federal and State agencies for regulatory purposes. There are limited options, however, for people to access these data in real time, which hinders an individual's ability to manage their own risks. This paper describes a new software package that models environmental concentrations of fine particulate matter (PM2.5), coarse particulate matter (PM10), and ozone for the state of Oregon and calculates personal health risks at the smartphone's current location. Predicted air pollution risk levels can be displayed on mobile devices as interactive maps and graphs color-coded to coincide with EPA air quality index (AQI) categories. Users can set air quality warning levels via color-coded bars and are notified whenever predicted levels within 10 km exceed those warning levels. We validated the software using data from participants as well as from simulations, which showed that the application was capable of identifying spatial and temporal air quality trends. This unique application provides a potential low-cost technology for reducing personal exposure to air pollution, which can improve quality of life, particularly for people with health conditions, such as asthma, that make them more susceptible to these hazards.
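
    The color-coded categories mentioned above follow the EPA AQI, which maps a concentration onto an index value by piecewise-linear interpolation between breakpoints. The sketch below uses the 2012-era PM2.5 breakpoints for illustration (EPA has since revised them); it is not code from the described application.

        # EPA AQI interpolation: I = (I_hi - I_lo)/(C_hi - C_lo) * (C - C_lo) + I_lo
        PM25_BREAKPOINTS = [
            (0.0, 12.0, 0, 50),       # Good
            (12.1, 35.4, 51, 100),    # Moderate
            (35.5, 55.4, 101, 150),   # Unhealthy for Sensitive Groups
            (55.5, 150.4, 151, 200),  # Unhealthy
        ]

        def pm25_aqi(conc):
            """Map a 24-h PM2.5 concentration (ug/m^3) to an AQI value."""
            for c_lo, c_hi, i_lo, i_hi in PM25_BREAKPOINTS:
                if c_lo <= conc <= c_hi:
                    return round((i_hi - i_lo) / (c_hi - c_lo) * (conc - c_lo) + i_lo)
            raise ValueError("concentration outside tabulated breakpoints")

        print(pm25_aqi(35.0))  # -> 99, i.e. upper end of the Moderate category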

  3. Mandibular bone structure, bone mineral density, and clinical variables as fracture predictors: a 15-year follow-up of female patients in a dental clinic.

    PubMed

    Jonasson, Grethe; Billhult, Annika

    2013-09-01

    To compare three mandibular trabeculation evaluation methods, clinical variables, and osteoporosis as fracture predictors in women. One hundred and thirty-six female dental patients (35-94 years) answered a questionnaire in 1996 and 2011. Using intra-oral radiographs from 1996, five methods were compared as fracture predictors: (1) mandibular bone structure evaluated with a visual radiographic index, (2) bone texture, (3) size and number of intertrabecular spaces calculated with the Jaw-X software, (4) fracture probability calculated with a fracture risk assessment tool (FRAX), and (5) osteoporosis diagnosis based on dual-energy X-ray absorptiometry. Differences were assessed with the Mann-Whitney test, and relative risks were calculated. Previous fracture, glucocorticoid medication, and bone texture were significant indicators of future and total (previous plus future) fracture. Osteoporosis diagnosis, sparse trabeculation, Jaw-X, and FRAX were significant predictors of total but not future fracture. Clinical and oral bone variables may identify individuals at greatest risk of fracture. Copyright © 2013 Elsevier Inc. All rights reserved.

  4. A Case Study of Measuring Process Risk for Early Insights into Software Safety

    NASA Technical Reports Server (NTRS)

    Layman, Lucas; Basili, Victor; Zelkowitz, Marvin V.; Fisher, Karen L.

    2011-01-01

    In this case study, we examine software safety risk in three flight hardware systems in NASA's Constellation spaceflight program. We applied our Technical and Process Risk Measurement (TPRM) methodology to the Constellation hazard analysis process to quantify the technical and process risks involving software safety in the early design phase of these projects. We analyzed 154 hazard reports and collected metrics to measure the prevalence of software in hazards and the specificity of descriptions of software causes of hazardous conditions. We found that software could cause, or was involved in preventing, 49-70% of the 154 hazardous conditions. We also found that 12-17% of the 2,013 hazard causes involved software, and that 23-29% of all causes had a software control. The application of the TPRM methodology also identified process risks in the application of the hazard analysis process itself that may lead to software safety risk.

  5. Risk assessment for consumer exposure to toluene diisocyanate (TDI) derived from polyurethane flexible foam.

    PubMed

    Arnold, Scott M; Collins, Michael A; Graham, Cynthia; Jolly, Athena T; Parod, Ralph J; Poole, Alan; Schupp, Thomas; Shiotsuka, Ronald N; Woolhiser, Michael R

    2012-12-01

    Polyurethanes (PU) are polymers made from diisocyanates and polyols for a variety of consumer products. It has been suggested that PU foam may contain trace amounts of residual toluene diisocyanate (TDI) monomers and present a health risk. To address this concern, the exposure scenario and health risks posed by sleeping on a PU foam mattress were evaluated. Toxicity benchmarks for key non-cancer endpoints (i.e., irritation, sensitization, respiratory tract effects) were determined by dividing points of departure by uncertainty factors. The cancer benchmark was derived using the USEPA Benchmark Dose Software. Results of previous migration and emission data of TDI from PU foam were combined with conservative exposure factors to calculate upper-bound dermal and inhalation exposures to TDI as well as a lifetime average daily dose to TDI from dermal exposure. For each non-cancer endpoint, the toxicity benchmark was divided by the calculated exposure to determine the margin of safety (MOS), which ranged from 200 (respiratory tract) to 3×10^6 (irritation). Although available data indicate TDI is not carcinogenic, a theoretical excess cancer risk (1×10^-7) was calculated. We conclude from this assessment that sleeping on a PU foam mattress does not pose TDI-related health risks to consumers. Copyright © 2012 Elsevier Inc. All rights reserved.
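
    The assessment arithmetic reduces to two one-line formulas: MOS = toxicity benchmark / exposure, and excess cancer risk = lifetime average daily dose x slope factor. The sketch below illustrates that arithmetic with placeholder numbers, not the study's actual values.

        # All numbers below are invented placeholders for illustration only.
        benchmark = {"respiratory tract": 2.0, "irritation": 3.0e4}  # ug/kg-day, assumed
        exposure = 0.01  # ug/kg-day, assumed upper-bound TDI exposure
        for endpoint, bm in benchmark.items():
            # Margin of safety: benchmark divided by calculated exposure.
            print(endpoint, "MOS =", f"{bm / exposure:,.0f}")

        ladd = 1e-6         # lifetime average daily dose, ug/kg-day, assumed
        slope_factor = 0.1  # cancer slope factor, (ug/kg-day)^-1, assumed
        print("excess cancer risk =", f"{ladd * slope_factor:.0e}")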

  6. A Model for Assessing the Liability of Seemingly Correct Software

    NASA Technical Reports Server (NTRS)

    Voas, Jeffrey M.; Voas, Larry K.; Miller, Keith W.

    1991-01-01

    Current research on software reliability does not lend itself to quantitatively assessing the risk posed by a piece of life-critical software. Black-box software reliability models are too general and make too many assumptions to be applied confidently to assessing the risk of life-critical software. We present a model for assessing the risk caused by a piece of software; this model combines software testing results and Hamlet's probable correctness model. We show how this model can assess software risk for those who insure against a loss that can occur if life-critical software fails.

  7. Reliability and availability analysis of a 10 kW@20 K helium refrigerator

    NASA Astrophysics Data System (ADS)

    Li, J.; Xiong, L. Y.; Liu, L. Q.; Wang, H. R.; Wang, B. M.

    2017-02-01

    A 10 kW@20 K helium refrigerator has been established at the Technical Institute of Physics and Chemistry, Chinese Academy of Sciences. To evaluate and improve this refrigerator's reliability and availability, a reliability and availability analysis is performed. A functional analysis is carried out according to the mission profile of the refrigerator. Failure data for the refrigerator components are collected, and failure rate distributions are fitted with the software Weibull++ V10.0. A Failure Modes, Effects & Criticality Analysis (FMECA) is performed, and the critical components with higher risks are identified. The software BlockSim V9.0 is used to calculate the reliability and availability of the refrigerator. The results indicate that the compressors, turbine and vacuum pump are the critical components and key units of this refrigerator. Mitigation actions with respect to design, testing, maintenance and operation are proposed to reduce the major and medium risks.

  8. Software development for teleroentgenogram analysis

    NASA Astrophysics Data System (ADS)

    Goshkoderov, A. A.; Khlebnikov, N. A.; Obabkov, I. N.; Serkov, K. V.; Gajniyarov, I. M.; Aliev, A. A.

    2017-09-01

    A framework for the analysis and calculation of teleroentgenograms was developed. Software development was carried out in the Department of Children's Dentistry and Orthodontics at Ural State Medical University. The software calculates teleroentgenograms by an original method developed in this department, and it also allows users to design their own calculation methods. It is planned to apply machine-learning technology (neural networks) in the software, which will make calculating teleroentgenograms easier because the methodological points will be placed automatically.

  9. Utilization of MAX and FAX human phantoms for space radiation exposure calculations using HZETRN

    NASA Astrophysics Data System (ADS)

    Qualls, Garry; Slaba, Tony; Clowdsley, Martha; Blattnig, Steve; Walker, Steven; Simonsen, Lisa

    To estimate astronaut health risk due to space radiation, one must have the ability to calculate, for known radiation environments external to the body, particle spectra, LET spectra, dose, dose equivalent, or gray equivalent that are averaged over specific organs or tissue types. This may be accomplished using radiation transport software and computational human body tissue models. Historically, NASA scientists have used the HZETRN software to calculate radiation transport through both vehicle shielding materials and body tissue. The Computerized Anatomical Man (CAM) and the Computerized Anatomical Female (CAF) body models, combined with the CAMERA software, have been used for body tissue self-shielding calculations. The CAM and CAF, which were developed in 1973 and 1992, respectively, model the 50th percentile U.S. Air Force male and female and are constructed using individual quadric surfaces that combine to form thousands of solid regions representing specific tissues and structures within the body. In order to transport an external radiation environment to a point within one of the body models using HZETRN, a directional distribution of the tissues surrounding that point is needed. The CAMERA software is used to "ray trace" the CAM and CAF models, providing the thickness of each tissue type traversed along each of a large number of rays originating at a dose point. More recently, R. Kramer of the Departamento de Energia Nuclear, Universidade Federal de Pernambuco in Brazil, and his co-workers developed the Male Adult voXel (MAX) model and the Female Adult voXel (FAX) model. These voxel-based body models were developed using segmented Computed Tomography (CT) scans of adult cadavers, and the quantities and distributions of various body tissues have been adjusted to match those specified in the International Commission on Radiological Protection (ICRP) reference adult male and female. A new set of tools has been developed to facilitate space radiation exposure calculations using HZETRN and the MAX and FAX models. A new ray tracer was developed for these body models, as was a methodology for evaluating organ-averaged quantities. Both tools are described in this paper and utilized in sample calculations.
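
    A toy version of the ray-tracing step can clarify what such body-model tools compute: the path length traversed in each tissue type along a ray leaving a dose point. The sketch below marches in fixed steps through a synthetic voxel phantom; production ray tracers use exact voxel traversal (e.g., Siddon-style algorithms), and the phantom, step size and tissue labels here are invented.

        import numpy as np
        from collections import defaultdict

        def trace_ray(phantom, voxel_mm, origin, direction, step_mm=0.5):
            """Accumulate path length (mm) per tissue label along one ray."""
            direction = np.asarray(direction, float)
            direction /= np.linalg.norm(direction)
            thickness = defaultdict(float)
            pos = np.asarray(origin, float)
            while True:
                idx = tuple((pos // voxel_mm).astype(int))
                if any(i < 0 or i >= n for i, n in zip(idx, phantom.shape)):
                    break  # the ray has left the phantom
                thickness[phantom[idx]] += step_mm
                pos = pos + step_mm * direction
            return dict(thickness)

        phantom = np.zeros((40, 40, 40), dtype=int)  # 0 = soft tissue, toy labels
        phantom[15:25, 15:25, 15:25] = 1             # 1 = bone, toy insert
        print(trace_ray(phantom, voxel_mm=2.0, origin=(40, 40, 40),
                        direction=(1, 0, 0)))        # ~{1: 10.0, 0: 30.0} mm

    Repeating this over many directions from the dose point yields the directional tissue-thickness distribution that the transport code consumes.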

  10. Software Risk Identification for Interplanetary Probes

    NASA Technical Reports Server (NTRS)

    Dougherty, Robert J.; Papadopoulos, Periklis E.

    2005-01-01

    The need for a systematic and effective software risk identification methodology is critical for interplanetary probes that are using increasingly complex and critical software. Several probe failures are examined that suggest more attention and resources need to be dedicated to identifying software risks. The direct causes of these failures can often be traced to systemic problems in all phases of the software engineering process. These failures have led to the development of a practical methodology to identify risks for interplanetary probes. The proposed methodology is based upon the tailoring of the Software Engineering Institute's (SEI) method of taxonomy-based risk identification. The use of this methodology will ensure a more consistent and complete identification of software risks in these probes.

  11. Adaptation, Commissioning, and Evaluation of a 3D Treatment Planning System for High-Resolution Small-Animal Irradiation

    PubMed Central

    Jeong, Jeho; Chen, Qing; Febo, Robert; Yang, Jie; Pham, Hai; Xiong, Jian-Ping; Zanzonico, Pat B.; Deasy, Joseph O.; Humm, John L.; Mageras, Gig S.

    2016-01-01

    Although spatially precise systems are now available for small-animal irradiations, there are currently limited software tools available for treatment planning for such irradiations. We report on the adaptation, commissioning, and evaluation of a 3-dimensional treatment planning system for use with a small-animal irradiation system. The 225-kV X-ray beam of the X-RAD 225Cx microirradiator (Precision X-Ray) was commissioned using both ion-chamber and radiochromic film for 10 different collimators ranging in field size from 1 mm in diameter to 40 × 40 mm2. A clinical 3-dimensional treatment planning system (Metropolis) developed at our institution was adapted to small-animal irradiation by making it compatible with the dimensions of mice and rats, modeling the microirradiator beam orientations and collimators, and incorporating the measured beam data for dose calculation. Dose calculations in Metropolis were verified by comparison with measurements in phantoms. Treatment plans for irradiation of a tumor-bearing mouse were generated with both the Metropolis and the vendor-supplied software. The calculated beam-on times and the plan evaluation tools were compared. The dose rate at the central axis ranges from 74 to 365 cGy/min depending on the collimator size. Doses calculated with Metropolis agreed with phantom measurements within 3% for all collimators. The beam-on times calculated by Metropolis and the vendor-supplied software agreed within 1% at the isocenter. The modified 3-dimensional treatment planning system provides better visualization of the relationship between the X-ray beams and the small-animal anatomy as well as more complete dosimetric information on target tissues and organs at risk. It thereby enhances the potential of image-guided microirradiator systems for evaluation of dose–response relationships and for preclinical experimentation generally. PMID:25948321

  12. Effect of Education of Principles of Drug Prescription and Calculation through Lecture and Designed Multimedia Software on Nursing Students' Learning Outcomes.

    PubMed

    Valizadeh, Sousan; Feizalahzadeh, Hossein; Avari, Mina; Virani, Faza

    2016-07-01

    Medication errors are risk factors for patients' health and may have irrecoverable effects. These errors include medication miscalculations by nurses and nursing students. This study aimed to design a multimedia application for teaching drug calculations and to compare its effectiveness with the lecture method. The study selected 82 nursing students of Tabriz University of Medical Sciences in their second and third semesters in 2015. Before training, the students were pre-tested with a researcher-made multiple-choice questionnaire on their knowledge of drug administration principles and their ability to carry out medicinal calculations; they were then divided, through a random block design based on the mean grade of previous semesters and the pre-test score, into an intervention group (education with the designed software) and a control group (lecturing). The knowledge and ability post-test was performed with the same questions after 4 weeks of training, and the data were analyzed in IBM SPSS 20 using independent-samples t-tests, paired-samples t-tests, and ANCOVA; Kolmogorov-Smirnov tests (p>0.05) confirmed that knowledge of drug prescription principles and medicinal calculation ability were normally distributed. Drug calculation ability increased significantly after training in both the control and experimental groups (p<0.05). However, no significant difference emerged between the two groups in medicinal calculation ability after training (p>0.05). Neither training method had a significant effect on participants' knowledge of medicinal principles (p>0.05), although the knowledge score in the control group increased non-significantly. The use of educational software thus had no significant effect on nursing students' drug knowledge or medicinal calculation ability. However, an e-learning program can reduce the lecture time and the cost of repeated topics, such as medication, suggesting that it can be an effective component of nurse education programs.

  13. Software And Systems Engineering Risk Management

    DTIC Science & Technology

    2010-04-01

    …RSKM … 2004 COSO Enterprise RSKM Framework … 2006 ISO/IEC 16085 Risk Management Process … 2008 ISO/IEC 12207 Software Lifecycle Processes … 2009 ISO/IEC … Software and Systems Engineering Risk Management. John Walz, VP Technical and Conferences Activities, IEEE Computer Society; Vice-Chair Planning, Software & Systems Engineering Standards Committee, IEEE Computer Society; US TAG to ISO TMB Risk Management Working Group, Systems and Software…

  14. Micrometeoroid and Orbital Debris Threat Assessment: Mars Sample Return Earth Entry Vehicle

    NASA Technical Reports Server (NTRS)

    Christiansen, Eric L.; Hyde, James L.; Bjorkman, Michael D.; Hoffman, Kevin D.; Lear, Dana M.; Prior, Thomas G.

    2011-01-01

    This report provides results of a Micrometeoroid and Orbital Debris (MMOD) risk assessment of the Mars Sample Return Earth Entry Vehicle (MSR EEV). The assessment was performed using standard risk assessment methodology illustrated in Figure 1-1. Central to the process is the Bumper risk assessment code (Figure 1-2), which calculates the critical penetration risk based on geometry, shielding configurations and flight parameters. The assessment process begins by building a finite element model (FEM) of the spacecraft, which defines the size and shape of the spacecraft as well as the locations of the various shielding configurations. This model is built using the NX I-deas software package from Siemens PLM Software. The FEM is constructed using triangular and quadrilateral elements that define the outer shell of the spacecraft. Bumper-II uses the model file to determine the geometry of the spacecraft for the analysis. The next step of the process is to identify the ballistic limit characteristics for the various shield types. These ballistic limits define the critical size particle that will penetrate a shield at a given impact angle and impact velocity. When the finite element model is built, each individual element is assigned a property identifier (PID) to act as an index for its shielding properties. Using the ballistic limit equations (BLEs) built into the Bumper-II code, the shield characteristics are defined for each and every PID in the model. The final stage of the analysis is to determine the probability of no penetration (PNP) on the spacecraft. This is done using the micrometeoroid and orbital debris environment definitions that are built into the Bumper-II code. These engineering models take into account orbit inclination, altitude, attitude and analysis date in order to predict an impacting particle flux on the spacecraft. Using the geometry and shielding characteristics previously defined for the spacecraft and combining that information with the environment model calculations, the Bumper-II code calculates a probability of no penetration for the spacecraft.
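
    The final step of such an assessment is compactly expressible: once the environment models and ballistic limit equations yield an expected number N of penetrating impacts (penetrating flux x exposed area x exposure time per surface), Poisson statistics give PNP = exp(-N). The sketch below is this roll-up with made-up fluxes and areas; it is not the Bumper-II code.

        import math

        # (penetrating flux per m^2-year, exposed area in m^2) per surface;
        # both columns are invented placeholders for illustration.
        surfaces = [
            (2.0e-6, 8.0),
            (5.0e-7, 15.0),
        ]
        mission_years = 1.5

        # Expected number of penetrations, summed over all shielded surfaces.
        n_expected = sum(flux * area * mission_years for flux, area in surfaces)
        pnp = math.exp(-n_expected)  # Poisson probability of zero penetrations
        print(f"expected penetrations N = {n_expected:.2e}, PNP = {pnp:.6f}")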

  15. Automated Routines for Calculating Whole-Stream Metabolism: Theoretical Background and User's Guide

    USGS Publications Warehouse

    Bales, Jerad D.; Nardi, Mark R.

    2007-01-01

    In order to standardize methods and facilitate rapid calculation and archival of stream-metabolism variables, the Stream Metabolism Program was developed to calculate gross primary production, net ecosystem production, respiration, and selected other variables from continuous measurements of dissolved-oxygen concentration, water temperature, and other user-supplied information. Methods for calculating metabolism from continuous measurements of dissolved-oxygen concentration and water temperature are fairly well known, but a standard set of procedures and computation software for all aspects of the calculations were not available previously. The Stream Metabolism Program addresses this deficiency with a stand-alone executable computer program written in Visual Basic .NET®, which runs in the Microsoft Windows® environment. All equations and assumptions used in the development of the software are documented in this report. Detailed guidance on application of the software is presented, along with a summary of the data required to use the software. Data from either a single station or paired (upstream, downstream) stations can be used with the software to calculate metabolism variables.
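
    The core mass balance behind such single-station calculations is dDO/dt = GPP - ER + K(DOsat - DO). A minimal sketch, assuming a known reaeration coefficient and synthetic hourly data; the program itself handles units, saturation and paired stations far more carefully.

        import numpy as np

        def nep_series(do, do_sat, k_per_hr, dt_hr):
            """Net ecosystem production (GPP - ER) per step: the observed DO
            change rate minus the reaeration flux K*(DOsat - DO)."""
            ddo_dt = np.diff(do) / dt_hr                # mg/L per hour
            reaeration = k_per_hr * (do_sat - do[:-1])  # mg/L per hour
            return ddo_dt - reaeration

        do = np.array([8.0, 8.4, 9.1, 9.6, 9.2, 8.6, 8.2])  # hourly DO, mg/L (synthetic)
        nep = nep_series(do, do_sat=9.0, k_per_hr=0.1, dt_hr=1.0)
        print(np.round(nep, 2))
        # Summing NEP over night gives -ER; extrapolating ER across the day
        # and adding it back to daytime NEP yields GPP.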

  16. The polyGeVero® software for fast and easy computation of 3D radiotherapy dosimetry data

    NASA Astrophysics Data System (ADS)

    Kozicki, Marek; Maras, Piotr

    2015-01-01

    The polyGeVero® software package was developed for calculations on 3D dosimetry data, such as polymer gel dosimetry data. It comprises four workspaces designed for: i) calculating calibrations, ii) storing calibrations in a database, iii) calculating 3D dose distribution cubes, and iv) comparing two datasets, e.g. one measured with a 3D dosimeter against one calculated with a treatment planning system. To accomplish these calculations, the software is equipped with a number of tools, such as a brachytherapy isotopes database, brachytherapy dose versus distance calculation based on the line approximation approach, automatic spatial alignment of two 3D dose cubes for comparison purposes, the 3D gamma index, the 3D gamma angle, 3D dose difference, Pearson's coefficient, histogram calculations, isodose superimposition for two datasets, and profile calculations in any desired direction. This communication briefly presents the main functions of the software and reports on the speed of the calculations performed by polyGeVero®.
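
    Of the comparison tools listed, the gamma index is the most involved. A brute-force 1D version (the package computes the 3D analogue) is sketched below with the customary 3%/3 mm criteria and synthetic profiles; it only illustrates the metric's definition, not polyGeVero®'s implementation.

        import numpy as np

        def gamma_1d(x, measured, reference, dose_tol=0.03, dist_tol_mm=3.0):
            """For each measured point, the minimum over reference points of
            sqrt((dx/dist_tol)^2 + (dD/(dose_tol*Dmax))^2) (global gamma)."""
            d_max = reference.max()
            gammas = []
            for xi, mi in zip(x, measured):
                g = np.sqrt(((x - xi) / dist_tol_mm) ** 2 +
                            ((reference - mi) / (dose_tol * d_max)) ** 2)
                gammas.append(g.min())
            return np.array(gammas)

        x = np.arange(0, 50, 1.0)                        # positions, mm
        reference = np.exp(-((x - 25) / 10) ** 2)        # synthetic dose profile
        measured = 1.02 * np.exp(-((x - 26) / 10) ** 2)  # shifted, rescaled copy
        g = gamma_1d(x, measured, reference)
        print(f"gamma pass rate (gamma <= 1): {np.mean(g <= 1):.1%}")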

  17. Adopting Open Source Software to Address Software Risks during the Scientific Data Life Cycle

    NASA Astrophysics Data System (ADS)

    Vinay, S.; Downs, R. R.

    2012-12-01

    Software enables the creation, management, storage, distribution, discovery, and use of scientific data throughout the data life cycle. However, the capabilities offered by software also present risks for the stewardship of scientific data, since future access to digital data is dependent on the use of software. From operating systems to applications for analyzing data, the dependence of data on software presents challenges for the stewardship of scientific data. Adopting open source software provides opportunities to address some of the proprietary risks of data dependence on software. For example, in some cases, open source software can be deployed to avoid licensing restrictions on using, modifying, and transferring proprietary software. The availability of the source code of open source software also enables the inclusion of modifications, which may be contributed by community members who are addressing similar issues. Likewise, an active community that maintains open source software can be a valuable source of help, providing an opportunity to collaborate on common issues facing adopters. As part of the effort to meet the challenges of software dependence for scientific data stewardship, risks from software dependence that arise at various stages of the data life cycle have been identified. Identifying these risks should enable the development of plans for mitigating software dependencies, where applicable, using open source software, and should improve understanding of software dependency risks for scientific data and how they can be reduced during the data life cycle.

  18. The use of copula functions for modeling the risk of investment in shares traded on the Warsaw Stock Exchange

    NASA Astrophysics Data System (ADS)

    Domino, Krzysztof; Błachowicz, Tomasz

    2014-11-01

    In our work, copula functions and the Hurst exponent calculated using local Detrended Fluctuation Analysis (DFA) were used to investigate the risk of investment in shares traded on the Warsaw Stock Exchange. The combination of copula functions and the Hurst exponent calculated using local DFA is a new approach. For the copula function analysis, bivariate variables were used, composed of the share prices of PEKAO bank (a big bank with high capitalization) paired with those of other banks (PKOBP, BZ WBK, MBANK and HANDLOWY, in decreasing capitalization order) and of companies from other sectors (KGHM, mining industry; PKNORLEN, petrol industry; ASSECO, software industry). Hurst exponents were calculated for daily share prices and used to predict large drops in those prices. The Hurst exponent appeared to be a valuable indicator in the copula selection procedure, since low values pointed to heavy-tailed copulas, e.g. the Clayton copula.
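
    A compact DFA sketch may help readers unfamiliar with the indicator: the exponent is the slope of log F(n) versus log n, where F(n) is the RMS fluctuation of the integrated, per-window linearly detrended series. The window sizes and data below are arbitrary; this is not the authors' local-DFA code.

        import numpy as np

        def dfa_exponent(series, window_sizes=(8, 16, 32, 64)):
            """Order-1 DFA scaling exponent of a 1D series."""
            profile = np.cumsum(series - np.mean(series))  # integrated series
            log_n, log_f = [], []
            for n in window_sizes:
                f2 = []
                for k in range(len(profile) // n):
                    seg = profile[k * n:(k + 1) * n]
                    t = np.arange(n)
                    trend = np.polyval(np.polyfit(t, seg, 1), t)  # per-window fit
                    f2.append(np.mean((seg - trend) ** 2))
                log_n.append(np.log(n))
                log_f.append(0.5 * np.log(np.mean(f2)))  # log of RMS fluctuation
            return np.polyfit(log_n, log_f, 1)[0]        # slope = exponent

        rng = np.random.default_rng(1)
        print(round(dfa_exponent(rng.normal(size=4096)), 2))  # ~0.5 for white noise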

  19. TU-H-CAMPUS-IeP1-03: Comparison of Monte Carlo Simulation and Conversion Factor Based Method On Estimation of Effective Dose in Pediatric Patients Undergoing Interventional Cardiac Procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leung, K; Wong, M; Ng, Y

    Purpose: Interventional cardiac procedures utilize frequent fluoroscopy and cineangiography, which impose considerable radiation risk on patients, especially pediatric patients. Accurate calculation of effective dose is important in order to estimate cancer risk over the rest of their lifetime. This study evaluates the difference between effective doses calculated by Monte Carlo simulation and those estimated by locally derived conversion factors (CF-local) and by commonly quoted conversion factors from Karambatsakidou et al (CF-K). Methods: Effective doses (E) of 12 pediatric patients, aged 2.5–19 years, who had undergone interventional cardiac procedures, were calculated using the PCXMC-2.0 software. The tube spectrum, irradiation geometry, exposure parameters and dose-area product (DAP) of each projection were included in the software calculation. Effective doses for each patient were also estimated by two methods: 1) CF-local: a conversion factor derived locally by generalizing the results of the 12 patients, which multiplied by the DAP of each patient gives E-local; 2) CF-K: a factor selected from the above-mentioned literature, which multiplied by the DAP of each patient gives E-K. Results: The means of E, E-local and E-K were 16.01 mSv, 16.80 mSv and 22.25 mSv, respectively. A deviation of −29.35% to +34.85% between E and E-local, and a greater deviation of −28.96% to +60.86% between E and E-K, were observed. E-K overestimated the effective dose for patients aged 7.5–19. Conclusion: Effective dose obtained by conversion factors is a simple and quick way to estimate the radiation risk of pediatric patients. This study showed that estimation by CF-local may bear an error of 35% when compared with Monte Carlo calculation. Using conversion factors derived in other studies may result in an even greater error, of up to 60%, due to factors that are not catered for in the estimation, including patient size, projection angles, exposure parameters, tube filtration, etc. Users must be aware of these potential inaccuracies when the simple conversion method is employed.
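
    The conversion-factor shortcut being benchmarked is simply E = CF x DAP; the sketch below computes it alongside a per-patient deviation from a Monte Carlo value. All numbers are invented to illustrate the comparison, not data from the study.

        cf_local = 0.80  # mSv per Gy*cm^2, assumed locally derived factor
        patients = [     # (DAP in Gy*cm^2, Monte Carlo E in mSv), invented
            (18.0, 15.1),
            (25.0, 21.6),
            (12.0, 8.3),
        ]
        for dap, e_mc in patients:
            e_cf = cf_local * dap                 # conversion-factor estimate
            dev = 100 * (e_cf - e_mc) / e_mc      # deviation from Monte Carlo
            print(f"E_MC={e_mc:5.1f}  E_CF={e_cf:5.1f}  deviation={dev:+.1f}%")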

  20. A Framework for Calculating Indirect Costs and Earned Value for IT Infrastructure Modernization Programs

    DTIC Science & Technology

    2005-05-01

    …Standish Group 1995a; 1995b). In general, the risk of failure for large software projects is significantly greater than for small projects (Humphrey … learning, geographical dispersion, and team experience. Various weighting schemes can be developed and applied to these parameters for various … [the remainder of this excerpt is residue from a systems diagram naming FADTOOL, DbCAS/WebCAS, COPS, MDMS, PADDS, EDA and CAPS and their funding, obligation, commitment and contract data flows]

  1. Sighten Final Technical Report DEEE0006690 Deploying an integrated and comprehensive solar financing software platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Leary, Conlan

    Over the project, Sighten built a comprehensive software-as-a-service (SaaS) platform to automate and streamline the residential solar financing workflow. Before the project period, significant time and money were spent by companies on front-end tools related to system design and proposal creation, but comparatively few resources were available to support the many back-end calculations and data management processes that underpin third-party financing. Without a tool like Sighten, the solar financing process involved passing information from the homeowner prospect into separate tools for system design and financing, and then later into reporting tools including Microsoft Excel, CRM software, in-house software, outside software, and offline, manual processes. Passing data between tools and attempting to connect disparate systems results in inefficiency and inaccuracy for the industry. Sighten was built to consolidate all financial and solar-related calculations in a single software platform. It significantly improves upon the accuracy of these calculations and exposes sophisticated new analysis tools, resulting in a rigorous, efficient and cost-effective toolset for scaling residential solar. Widely deploying a platform like Sighten's significantly and immediately impacts the residential solar space in several important ways: 1) standardizing and improving the quality of all quantitative calculations involved in the residential financing process, most notably project finance, system production and reporting calculations; 2) representing a true step change in terms of reporting and analysis capabilities by maintaining more accurate data and exposing sophisticated tools around simulation, tranching, and financial reporting, among others, to all stakeholders in the space; 3) allowing a broader group of developers/installers/finance companies to access the capital markets by providing an out-of-the-box toolset that handles the execution of running investor capital through a rooftop solar financing program. Standardizing and improving all calculations, improving data quality, and exposing new analysis tools previously unavailable affects investment in the residential space in several important ways: 1) lowering the cost of capital for existing capital providers by mitigating uncertainty and de-risking the solar asset class; 2) attracting new, lower-cost investors to the solar asset class as reporting and data quality come to resemble the standards of more mature asset classes; 3) increasing the prevalence of liquidity options for investors through back leverage, securitization, or secondary sale by providing the tools necessary for lenders, ratings agencies, etc. to properly understand a portfolio of residential solar assets. During the project period, Sighten successfully built and scaled a commercially ready tool for the residential solar market. The software solution built by Sighten has been deployed with the key target customer segments identified in the award deliverables: solar installers, solar developers/channel managers, and solar financiers, including lenders. Each of these segments greatly benefits from the availability of the Sighten toolset.

  2. A computationally efficient software application for calculating vibration from underground railways

    NASA Astrophysics Data System (ADS)

    Hussein, M. F. M.; Hunt, H. E. M.

    2009-08-01

    The PiP model is a software application with a user-friendly interface for calculating vibration from underground railways. This paper reports on the software, with a focus on its latest version and plans for future development. The software calculates the power spectral density of vibration due to a moving train on floating-slab track, with track irregularity described by typical spectra for tracks in good, average and bad condition. The latest version accounts for a tunnel embedded in a half-space by employing a toolbox developed at K.U. Leuven which calculates Green's functions for a multi-layered half-space.

  3. Airborne antenna pattern calculations

    NASA Technical Reports Server (NTRS)

    Knerr, T. J.; Mielke, R. R.

    1981-01-01

    Progress on the development of modeling software, the testing of the software against calculated data from program VPAP and measured patterns, and the calculation of roll plane patterns for general aviation aircraft is reported. The major objectives are the continued development of computer software for aircraft modeling and the use of this software and program OSUVOL to calculate principal plane and volumetric radiation patterns. The determination of proper placement of antennas on aircraft to meet the requirements of the Microwave Landing System is discussed. An overview of the work performed is given, together with an example of a roll plane model for the Piper PA-31T Cheyenne aircraft and the resulting calculated roll plane radiation pattern.

  4. Development of a software package for solid-angle calculations using the Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Zhang, Jie; Chen, Xiulian; Zhang, Changsheng; Li, Gang; Xu, Jiayun; Sun, Guangai

    2014-02-01

    Solid-angle calculations play an important role in the absolute calibration of radioactivity measurement systems and in the determination of the activity of radioactive sources, and they are often complicated. In the present paper, a software package is developed to provide a convenient tool for solid-angle calculations in nuclear physics. The proposed software calculates solid angles using the Monte Carlo method, into which a new type of variance-reduction technique has been integrated. The package, developed under the Microsoft Foundation Classes (MFC) environment in Microsoft Visual C++, has a graphical user interface in which the visualization function is implemented with OpenGL. One advantage of the proposed software package is that it can calculate, without any difficulty, the solid angle subtended at a point, circular or cylindrical source by a detector with different geometric shapes (e.g., cylinder, square prism, regular triangular prism or regular hexagonal prism). The results obtained from the proposed software package were compared with those obtained in previous studies and with Geant4 calculations; the comparison shows that the proposed package produces accurate solid-angle values with a greater computation speed than Geant4.
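
    Plain hit-or-miss Monte Carlo already conveys the idea (the paper's variance-reduction technique is not reproduced here): sample isotropic directions from the source point and count those intersecting the detector. The sketch below treats a coaxial disk, for which a closed form exists as a check; the geometry values are arbitrary.

        import math
        import numpy as np

        def solid_angle_disk_mc(radius, distance, n=1_000_000, seed=0):
            """Solid angle of a coaxial disk (radius, at z=distance) seen from
            the origin, by hit-or-miss sampling of isotropic directions."""
            rng = np.random.default_rng(seed)
            v = rng.normal(size=(n, 3))
            v /= np.linalg.norm(v, axis=1, keepdims=True)  # isotropic unit vectors
            forward = v[:, 2] > 0                          # toward the detector plane
            scale = distance / v[forward, 2]               # extend ray to z=distance
            x, y = v[forward, 0] * scale, v[forward, 1] * scale
            hits = np.count_nonzero(x**2 + y**2 <= radius**2)
            return 4 * math.pi * hits / n

        R, H = 2.0, 5.0
        exact = 2 * math.pi * (1 - H / math.sqrt(H**2 + R**2))  # closed form, on-axis disk
        print(f"MC: {solid_angle_disk_mc(R, H):.4f}  exact: {exact:.4f}")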

  5. Developing a smartphone software package for predicting atmospheric pollutant concentrations at mobile locations

    PubMed Central

    Larkin, Andrew; Williams, David E.; Kile, Molly L.; Baird, William M.

    2014-01-01

    Background There is considerable evidence that exposure to air pollution is harmful to health. In the U.S., ambient air quality is monitored by Federal and State agencies for regulatory purposes. There are limited options, however, for people to access these data in real time, which hinders an individual's ability to manage their own risks. This paper describes a new software package that models environmental concentrations of fine particulate matter (PM2.5), coarse particulate matter (PM10), and ozone for the state of Oregon and calculates personal health risks at the smartphone's current location. Predicted air pollution risk levels can be displayed on mobile devices as interactive maps and graphs color-coded to coincide with EPA air quality index (AQI) categories. Users can set air quality warning levels via color-coded bars and are notified whenever predicted levels within 10 km exceed those warning levels. We validated the software using data from participants as well as from simulations, which showed that the application was capable of identifying spatial and temporal air quality trends. This unique application provides a potential low-cost technology for reducing personal exposure to air pollution, which can improve quality of life particularly for people with health conditions, such as asthma, that make them more susceptible to these hazards. PMID:26146409

  6. Improved in silico prediction of carcinogenic potency (TD50) and the risk specific dose (RSD) adjusted Threshold of Toxicological Concern (TTC) for genotoxic chemicals and pharmaceutical impurities.

    PubMed

    Contrera, Joseph F

    2011-02-01

    The Threshold of Toxicological Concern (TTC) is a level of exposure to a genotoxic impurity that is considered to represent a negligible risk to humans. The TTC was derived from rodent carcinogenicity TD50 values, which are a measure of carcinogenic potency. The TTC currently sets a default limit of 1.5 μg/day in food contact substances and pharmaceuticals for all genotoxic impurities without carcinogenicity data. Bercu et al. (2010) used the QSAR-predicted TD50 to calculate a risk specific dose (RSD), which is a carcinogenic-potency-adjusted TTC for genotoxic impurities. This promising approach has been limited by the software used, a combination of MC4PC (www.multicase.com) and a Lilly Inc. in-house software (VISDOM) that is not available to the public. In this report, the TD50 and RSD were predicted using commercially available software, SciQSAR (formerly MDL-QSAR, www.scimatics.com), employing the same TD50 training data set and external validation test set used by Bercu et al. (2010). The results demonstrate the general applicability of QSAR-predicted TD50 values for determining the RSDs of genotoxic impurities and the improved performance of SciQSAR in predicting TD50 values. Copyright © 2010 Elsevier Inc. All rights reserved.

  7. The Role and Quality of Software Safety in the NASA Constellation Program

    NASA Technical Reports Server (NTRS)

    Layman, Lucas; Basili, Victor R.; Zelkowitz, Marvin V.

    2010-01-01

    In this study, we examine software safety risk in the early design phase of the NASA Constellation spaceflight program. Obtaining an accurate, program-wide picture of software safety risk is difficult across multiple, independently developing systems. We leverage one source of safety information, hazard analysis, to provide NASA quality assurance managers with information regarding the ongoing state of software safety across the program. The goal of this research is two-fold: 1) to quantify the relative importance of software with respect to system safety; and 2) to quantify the level of risk presented by software in the hazard analysis. We examined 154 hazard reports created during the preliminary design phase of three major flight hardware systems within the Constellation program. To quantify the importance of software, we collected metrics based on the number of software-related causes and controls of hazardous conditions. To quantify the level of risk presented by software, we created a metric scheme to measure the specificity of these software causes. We found that 49-70% of hazardous conditions in the three systems could be caused by software, or software was involved in the prevention of the hazardous condition. We also found that 12-17% of the 2,013 hazard causes involved software, and that 23-29% of all causes had a software control. Furthermore, 10-12% of all controls were software-based. There is potential for inaccuracy in these counts, however, as software causes are not consistently scoped and the presence of software in a cause or control is not always clear. The application of our software specificity metrics also identified risks in the hazard reporting process; in particular, we found a number of traceability risks in the hazard reports that may impede verification of software and system safety.

  8. Impacts of software and its engineering on the carbon footprint of ICT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kern, Eva, E-mail: e.kern@umwelt-campus.de; Dick, Markus, E-mail: sustainablesoftwareblog@gmail.com; Naumann, Stefan, E-mail: s.naumann@umwelt-campus.de

    2015-04-15

    The energy consumption of information and communication technology (ICT) is still increasing. Even though several solutions regarding the hardware side of Green IT exist, the software contribution to Green IT is not well investigated. The carbon footprint is one way to rate the environmental impacts of ICT. In order to get an impression of the induced CO2 emissions of software, we will present a calculation method for the carbon footprint of a software product over its life cycle. We also offer an approach on how to integrate some aspects of carbon footprint calculation into software development processes and discuss impacts and tools regarding this calculation method. We thus show the relevance of energy measurements and the attention to impacts on the carbon footprint by software within Green Software Engineering.
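
    In the spirit of the proposed method, a life-cycle footprint reduces to summing per-phase energy use times an emission factor. The sketch below is a toy roll-up; the phases, energy figures and grid factor are placeholders, not the authors' method or data.

        # All figures are invented placeholders for illustration only.
        EMISSION_FACTOR = 0.4  # kg CO2 per kWh, assumed grid average
        phase_energy_kwh = {
            "development (CI, test runs)": 1200.0,
            "distribution (downloads)": 150.0,
            "usage (1000 users x 1 yr)": 9000.0,
            "end of life (data removal)": 20.0,
        }
        total = 0.0
        for phase, kwh in phase_energy_kwh.items():
            co2 = kwh * EMISSION_FACTOR  # per-phase footprint contribution
            total += co2
            print(f"{phase:30s} {co2:8.1f} kg CO2")
        print(f"{'total over life cycle':30s} {total:8.1f} kg CO2")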

  9. Effects of self-graphing and goal setting on the math fact fluency of students with disabilities.

    PubMed

    Figarola, Patricia M; Gunter, Philip L; Reffel, Julia M; Worth, Susan R; Hummel, John; Gerber, Brian L

    2008-01-01

    We evaluated the impact of goal setting and students' participation in graphing their own performance data on the rate of math fact calculations. Participants were 3 students with mild disabilities in the first and second grades; 2 of the 3 students were also identified with Attention-Deficit/Hyperactivity Disorder (ADHD). They were taught to use Microsoft Excel® software to graph their rate of correct calculations when completing timed, independent practice sheets consisting of single-digit mathematics problems. Two students' rates of correct calculations nearly always met or exceeded the aim line established for their correct calculations. Additional interventions were required for the third student. Results are discussed in terms of implications and future directions for increasing the use of evaluation components in classrooms for students at risk for behavior disorders and academic failure.

  10. Determination of the UV solar risk in Argentina with high-resolution maps calculated using TOMS ozone climatology

    NASA Astrophysics Data System (ADS)

    Piacentini, Rubén D.; Cede, Alexander; Luccini, Eduardo; Stengel, Fernando

    2004-01-01

    The connection between ultraviolet (UV) radiation and various skin diseases is well known. In this work, we present the computer program "UVARG", developed in order to prevent the risk of sunburn for persons exposed to solar UV radiation in Argentina, a country that extends from low (tropical) to high southern-hemisphere latitudes. The software calculates the so-called "erythemal irradiance", i.e., the spectral irradiance weighted by the McKinlay and Diffey action spectrum for erythema and integrated over wavelength. The erythemal irradiance depends mainly on the following geophysical parameters: solar elevation, total ozone column, surface altitude, surface albedo, total aerosol optical depth and Sun-Earth distance. Minor corrections are due to variability in the vertical ozone, aerosol, pressure, humidity and temperature profiles and in the extraterrestrial spectral solar UV irradiance. A key parameter in the software is a total ozone column climatology incorporating monthly averages, standard deviations and trends for the particular geographical situation of Argentina, obtained from TOMS/NASA satellite data from 1978 to 2000. Different skin types are considered in order to determine the sunburn risk at any time of the day and any day of the year, with and without sunscreen protection. We present examples of the software for three different regions: the high-altitude tropical Puna of Atacama desert in the North-West, Tierra del Fuego in the South when the ozone hole passes over, and low summertime ozone conditions over Buenos Aires, the largest populated city in the country. In particular, we analyzed the maximum exposure time for persons with different skin types on representative days of the year (southern-hemisphere equinoxes and solstices). This work was made possible by the collaboration between the Argentine Skin Cancer Foundation, the Institute of Physics Rosario (CONICET-National University of Rosario, Argentina) and the Institute of Medical Physics, University of Innsbruck, Austria. Through the teamwork of physicians and physicists, a scientifically reliable and easy-to-handle tool was developed to predict the risk of solar exposure in Argentina. It can be used by dermatologists as well as health authorities and educators in order to prevent health problems induced by solar UV radiation.
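
    The central quantity is an integral: the spectral irradiance weighted by the erythema action spectrum and integrated over wavelength (the UV index is this value times 40). The sketch below uses the piecewise CIE/McKinlay-Diffey weighting and a crude stand-in spectrum; UVARG itself derives the spectrum from the geophysical parameters listed above.

        import numpy as np

        def erythema_weight(lam_nm):
            """Piecewise CIE (McKinlay-Diffey) erythema action spectrum."""
            lam = np.asarray(lam_nm, float)
            return np.where(lam <= 298, 1.0,
                   np.where(lam <= 328, 10 ** (0.094 * (298 - lam)),
                                         10 ** (0.015 * (140 - lam))))

        lam = np.arange(290, 401)           # wavelength grid, nm
        spectral_irr = 1e-3 * (lam - 289)   # W m^-2 nm^-1, toy stand-in spectrum
        f = spectral_irr * erythema_weight(lam)
        # Trapezoidal integration over wavelength.
        ery = float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(lam)))
        print(f"erythemal irradiance ~ {ery:.3f} W/m^2, UV index ~ {40 * ery:.1f}")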

  11. Enhancing the Characterization of Epistemic Uncertainties in PM2.5 Risk Analyses.

    PubMed

    Smith, Anne E; Gans, Will

    2015-03-01

    The Environmental Benefits Mapping and Analysis Program (BenMAP) is a software tool developed by the U.S. Environmental Protection Agency (EPA) that is widely used inside and outside of EPA to produce quantitative estimates of public health risks from fine particulate matter (PM2.5). This article discusses the purpose and appropriate role of a risk analysis tool to support risk management deliberations, and evaluates the functions of BenMAP in this context. It highlights the importance in quantitative risk analyses of characterization of epistemic uncertainty, or outright lack of knowledge, about the true risk relationships being quantified. This article describes and quantitatively illustrates sensitivities of PM2.5 risk estimates to several key forms of epistemic uncertainty that pervade those calculations: the risk coefficient, shape of the risk function, and the relative toxicity of individual PM2.5 constituents. It also summarizes findings from a review of U.S.-based epidemiological evidence regarding the PM2.5 risk coefficient for mortality from long-term exposure. That review shows that the set of risk coefficients embedded in BenMAP substantially understates the range in the literature. We conclude that BenMAP would more usefully fulfill its role as a risk analysis support tool if its functions were extended to better enable and prompt its users to characterize the epistemic uncertainties in their risk calculations. This requires expanded automatic sensitivity analysis functions and more recognition of the full range of uncertainty in risk coefficients. © 2014 Society for Risk Analysis.
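
    The sensitivity the authors highlight is easy to reproduce with the standard log-linear health impact function used for PM2.5 mortality, avoided deaths = y0*(1 - exp(-beta*dC))*Pop: the estimate scales almost linearly with the epistemically uncertain risk coefficient beta. The population, baseline rate and beta range below are illustrative only, not values from the article.

        import math

        y0 = 0.008        # baseline annual mortality rate, assumed
        pop = 1_000_000   # exposed adult population, assumed
        delta_c = 2.0     # ug/m^3 reduction in PM2.5, assumed
        for label, beta in [("low", 0.002), ("central", 0.006), ("high", 0.015)]:
            avoided = y0 * (1 - math.exp(-beta * delta_c)) * pop
            print(f"{label:7s} beta={beta}: ~{avoided:.0f} avoided deaths/yr")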

  12. Chicago Sanitary and Ship Canal (CSSC) Marine Safety Risk Assessment

    DTIC Science & Technology

    2013-12-01

    …calculation of the rate of loss events and the associated consequences. Further, the selected tool supports a clear understanding of the drivers of failures … spreadsheet software. The event tree has a series of events stated in a success mode, or simply as the occurrence of a phenomenological condition. The … of a "transit" (when applicable). As subsequent events occur, there is a branch point, one branch representing success and the …

  13. 45 CFR 153.350 - Risk adjustment data validation standards.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... implementation of any risk adjustment software and ensure proper validation of a statistically valid sample of... respect to implementation of risk adjustment software or as a result of data validation conducted pursuant... implementation of risk adjustment software or data validation. ...

  14. A Windows application for computing standardized mortality ratios and standardized incidence ratios in cohort studies based on calculation of exact person-years at risk.

    PubMed

    Geiss, Karla; Meyer, Martin

    2013-09-01

    Standardized mortality ratios and standardized incidence ratios are widely used in cohort studies to compare mortality or incidence in a study population with that in the general population on an age- and calendar-time-specific basis, but their computation is not included in standard statistical software packages. Here we present a user-friendly Microsoft Windows program for computing standardized mortality ratios and standardized incidence ratios based on the calculation of exact person-years at risk stratified by sex, age and calendar time. The program offers flexible import of different file formats for input data and easy handling of general-population reference rate tables, such as mortality or incidence tables exported from cancer registry databases. The application of the program is illustrated with two examples using empirical data from the Bavarian Cancer Registry. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
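
    The underlying computation is brief: SMR = observed / expected, with the expected count summed over sex-age-period strata as person-years times the general-population rate. A minimal sketch with invented strata follows; the program's real work lies in the exact person-years bookkeeping.

        # (person-years at risk in the cohort, reference rate per person-year)
        # per sex-age-calendar-time stratum; values are invented.
        strata = [
            (1520.3, 0.0021),
            (980.7,  0.0054),
            (410.2,  0.0123),
        ]
        observed = 18  # observed deaths (or incident cases) in the cohort, assumed
        expected = sum(py * rate for py, rate in strata)
        smr = observed / expected
        print(f"expected = {expected:.2f}, SMR = {smr:.2f}")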

  15. Software and package applicating for network meta-analysis: A usage-based comparative study.

    PubMed

    Xu, Chang; Niu, Yuming; Wu, Junyi; Gu, Huiyun; Zhang, Chao

    2017-12-21

    To compare and analyze the characteristics and functions of software applications for network meta-analysis (NMA). PubMed, EMbase, The Cochrane Library, the official websites of Bayesian inference Using Gibbs Sampling (BUGS), Stata and R, and Google were searched to collect software and packages for performing NMA; software and packages published up to March 2016 were included. After collecting the software, packages, and their user guides, we used the software and packages to calculate a typical example. All characteristics, functions, and computed results were compared and analyzed. Ten software tools were included, comprising both programming and non-programming software, developed mainly on the basis of Bayesian or frequentist theory. Most of the tools are easy to operate and master and offer exact calculation or excellent graphing; however, no single tool performed accurate calculations with superior graphing, which could only be achieved by combining two or more tools. This study suggests that users should choose the appropriate software according to their programming background, operational habits, and financial resources; a combination of BUGS and R (or Stata) can then be considered for performing the NMA. © 2017 Chinese Cochrane Center, West China Hospital of Sichuan University and John Wiley & Sons Australia, Ltd.

  16. Performance and blood monitoring in sports: the artificial intelligence evoking target testing in antidoping (AR.I.E.T.T.A.) project.

    PubMed

    Manfredini, A F; Malagoni, A M; Litmanen, H; Zhukovskaja, L; Jeannier, P; Dal Follo, D; Felisatti, M; Besseberg, A; Geistlinger, M; Bayer, P; Carrabre, J E

    2011-03-01

    Substances and methods used to increase oxygen blood transport and physical performance can be detected in the blood, but the screening of athletes to be tested remains a critical issue for the International Federations. This project, AR.I.E.T.T.A., aimed to develop software capable of analysing athletes' hematological and performance profiles to detect abnormal patterns. One hundred and eighty athletes belonging to the International Biathlon Union gave written informed consent to have their hematological data, previously collected according to anti-doping rules, used to develop the AR.I.E.T.T.A. software. The software was developed with the following sections: 1) log-in; 2) data entry, where data are loaded, stored and grouped; 3) analysis, where data are analysed, validated scores are calculated, and parameters are simultaneously displayed as statistics, tables and graphs, and as individual or subpopulation profiles; 4) screening, where an immediate evaluation of the risk score of the present sample and/or the athlete under study is obtained. The sample risk score, or AR.I.E.T.T.A. score, is calculated by a simple computational system combining different parameters (absolute values and intra-individual variations) considered concurrently. The AR.I.E.T.T.A. score is the sum of the deviation units derived from each parameter, where each deviation unit measures the shift of the present value from the reference values in numbers of standard deviations. AR.I.E.T.T.A. enables a quick evaluation of blood results, assisting surveillance programs and allowing the International Federations to perform timely target-testing controls on athletes. Future studies aiming to validate the AR.I.E.T.T.A. score and improve its diagnostic accuracy will improve the system.
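
    A sketch of the described scoring idea, assuming per-parameter deviation units of the form |value - reference mean| / reference SD summed into a sample score; the parameters, reference values and flag threshold below are invented for illustration, not the validated AR.I.E.T.T.A. scores.

        # Reference mean and SD per monitored parameter (invented values).
        REFERENCE = {
            "hemoglobin g/dL": (15.0, 1.0),
            "reticulocytes %": (1.0, 0.3),
            "OFF-score":       (90.0, 10.0),
        }

        def deviation_score(sample):
            """Sum of deviation units: SDs between each value and its reference."""
            return sum(abs(sample[k] - mu) / sd for k, (mu, sd) in REFERENCE.items())

        athlete = {"hemoglobin g/dL": 17.2, "reticulocytes %": 0.4, "OFF-score": 118.0}
        score = deviation_score(athlete)
        FLAG_THRESHOLD = 5.0  # assumed cut-off for target testing
        print(f"score = {score:.1f}",
              "-> target testing" if score > FLAG_THRESHOLD else "-> normal")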

  17. Software Reviews Since Acquisition Reform - The Artifact Perspective

    DTIC Science & Technology

    2004-01-01

    …Risk Management: OLD vs. NEW … (Slide 13, Acquisition of Software Intensive Systems, 2004, Peter Hantos) … Single, basic software paradigm; single processor; low … software risk mitigation related trade-offs must be done together … Integral software engineering activities; process maturity and quality frameworks; quality …

  18. Incidence and Residual Risk of HIV, HBV and HCV Infections Among Blood Donors in Tehran.

    PubMed

    Saber, Hamid Reza; Tabatabaee, Seyed Morteza; Abasian, Ali; Jamali, Mostafa; SalekMoghadam, Ebadollah; Hajibeigi, Bashir; Alavian, Seyed Moayed; Mirrezaie, Seyed Mohammad

    2017-09-01

    Estimation of residual risk is essential to monitor and improve blood safety. Our epidemiologic knowledge of the Iranian donor population regarding transfusion-transmitted viral infections (TTIs) is confined to a few studies based on prevalence rates, and there are no reports on the residual risk of TTIs in Iran. In the present survey, a software database of donor records of the Tehran Blood Transfusion Center (TBTC) was used to estimate the incidence and residual risk of hepatitis B virus (HBV), hepatitis C virus (HCV) and human immunodeficiency virus (HIV) infections by applying the incidence rate/window period (IR-WP) model. A total of 1,207,155 repeat donations was included in the analysis, representing a mean of 8.4 donations per donor over 6 years. The incidence amongst repeat donors was estimated by dividing the number of confirmed seroconverting donors by the total number of person-years at risk, and the residual risk was calculated using the incidence/window period model. Incidence rates and residual risks for HBV, HCV and HIV infections were calculated for the total study period (2005-2010) and for two consecutive sub-periods (2005-2007 and 2008-2010). According to the IR-WP model, the overall residual risk for HIV and HCV in the total study period was 0.4 and 12.5 per million units, respectively, and 4.57 per 100,000 donations for HBV. The incidence and residual risk of TTIs calculated for TBTC's blood supply were thus low and comparable with developed countries for HIV infection but high for HCV and HBV infections. Blood safety may therefore be better managed by applying other techniques, such as nucleic acid amplification tests.
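
    The IR-WP model itself is one multiplication: residual risk is approximately the incidence among repeat donors times the infectious window period (the period when an infected donation still tests negative). The sketch below shows the calculation with assumed person-years, seroconversion counts and illustrative serology window lengths; these are not the TBTC figures.

        person_years = 850_000  # total person-years at risk among repeat donors, assumed
        window_days = {"HIV": 22, "HCV": 70, "HBV": 59}       # illustrative windows
        seroconversions = {"HIV": 3, "HCV": 40, "HBV": 150}   # assumed counts

        for virus, window in window_days.items():
            incidence = seroconversions[virus] / person_years  # per person-year
            risk = incidence * (window / 365.0)                # per donation
            print(f"{virus}: residual risk ~ {risk * 1e6:.2f} per million donations")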

  19. Hydrocarbons pipeline transportation risk assessment

    NASA Astrophysics Data System (ADS)

    Zanin, A. V.; Milke, A. A.; Kvasov, I. N.

    2018-04-01

    The paper addresses risk assessment for pipeline transportation in Arctic conditions. Pipeline quality characteristics in this environment were assessed. To achieve the stated objective, a mathematical model of the pipelines was designed and visualized using the software product SOLIDWORKS. The results obtained with the mathematical model made it possible to define optimal pipeline characteristics for designing on the Arctic sea bottom. In the course of the research, the risks of avalanche pipe collapse were examined, internal longitudinal and circular loads acting on the pipeline were analyzed, and the hydrodynamic force of the water impact was taken into consideration. The calculations can contribute to the further development of pipeline transport under the harsh climate conditions of the Arctic shelf territory of the Russian Federation.

  20. Building Energy Simulation Test for Existing Homes (BESTEST-EX) (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Judkoff, R.; Neymark, J.; Polly, B.

    2011-12-01

    This presentation discusses the goals of NREL Analysis Accuracy R&D; BESTEST-EX goals; what BESTEST-EX is; how it works; 'Building Physics' cases; 'Building Physics' reference results; 'utility bill calibration' cases; limitations and potential future work. Goals of NREL Analysis Accuracy R&D are: (1) Provide industry with the tools and technical information needed to improve the accuracy and consistency of analysis methods; (2) Reduce the risks associated with purchasing, financing, and selling energy efficiency upgrades; and (3) Enhance software and input collection methods considering impacts on accuracy, cost, and time of energy assessments. BESTEST-EX goals are: (1) Test software predictions of retrofit energy savings in existing homes; (2) Ensure building physics calculations and utility bill calibration procedures perform up to a minimum standard; and (3) Quantify the impact of uncertainties in input audit data and occupant behavior. BESTEST-EX is a repeatable procedure that tests how well audit software predictions compare to the current state of the art in building energy simulation. There is no direct truth standard; however, the reference software has been subjected to validation testing, including comparisons with empirical data.

  1. Software reliability through fault-avoidance and fault-tolerance

    NASA Technical Reports Server (NTRS)

    Vouk, Mladen A.; Mcallister, David F.

    1993-01-01

    Strategies and tools for the testing, risk assessment and risk control of dependable software-based systems were developed. Part of this project consists of studies to enable the transfer of technology to industry, for example the risk management techniques for safety-conscious systems. Theoretical investigations of the Boolean and Relational Operator (BRO) testing strategy were conducted for condition-based testing. The Basic Graph Generation and Analysis tool (BGG) was extended to fully incorporate several variants of the BRO metric. Single- and multi-phase risk, coverage and time-based models are being developed to provide additional theoretical and empirical basis for estimation of the reliability and availability of large, highly dependable software. A model for software process and risk management was developed. The use of cause-effect graphing for software specification and validation was investigated. Lastly, advanced software fault-tolerance models were studied to provide alternatives and improvements in situations where simple software fault-tolerance strategies break down.

  2. Software Safety Risk in Legacy Safety-Critical Computer Systems

    NASA Technical Reports Server (NTRS)

    Hill, Janice; Baggs, Rhoda

    2007-01-01

    Safety-critical computer systems must be engineered to meet system and software safety requirements. For legacy safety-critical computer systems, software safety requirements may not have been formally specified during development. When process-oriented software safety requirements are levied on a legacy system after the fact, where software development artifacts don't exist or are incomplete, the question becomes 'how can this be done?' The risks associated with only meeting certain software safety requirements in a legacy safety-critical computer system must be addressed should such systems be selected as candidates for reuse. This paper proposes a method for formally ascertaining a software safety risk assessment that provides measurements of software safety for legacy systems, which may or may not have the suite of software engineering documentation that is now normally required. It relies upon the NASA Software Safety Standard, risk assessment methods based upon the Taxonomy-Based Questionnaire, and the application of reverse engineering CASE tools to produce original design documents for legacy systems.

  3. Continuous Risk Management: An Overview

    NASA Technical Reports Server (NTRS)

    Rosenberg, Linda; Hammer, Theodore F.

    1999-01-01

    Software risk management is important because it helps avoid disasters, rework, and overkill, but more importantly because it stimulates win-win situations. The objectives of software risk management are to identify, address, and eliminate software risk items before they become threats to success or major sources of rework. In general, good project managers are also good managers of risk. It makes good business sense for all software development projects to incorporate risk management as part of project management. The Software Assurance Technology Center (SATC) at NASA GSFC has been tasked with the responsibility for developing and teaching a systems level course for risk management that provides information on how to implement risk management. The course was developed in conjunction with the Software Engineering Institute at Carnegie Mellon University, then tailored to the NASA systems community. This is an introductory tutorial to continuous risk management based on this course. The rationale for continuous risk management and how it is incorporated into project management are discussed. The risk management structure of six functions is discussed in sufficient depth for managers to understand what is involved in risk management and how it is implemented. These functions include: (1) Identify the risks in a specific format; (2) Analyze the risk probability, impact/severity, and timeframe; (3) Plan the approach; (4) Track the risk through data compilation and analysis; (5) Control and monitor the risk; (6) Communicate and document the process and decisions.
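
    As a rough illustration of the six-function structure, the sketch below models a risk item carrying the attributes the tutorial names (probability, impact/severity, timeframe) through an identify-analyze-plan-track cycle. The field names and the exposure ranking rule are assumptions for illustration, not the SATC course's format.

    # Sketch of a risk-register entry covering the six CRM functions'
    # data needs; field names and the exposure ranking are illustrative.
    from dataclasses import dataclass, field

    @dataclass
    class Risk:
        statement: str            # (1) identified risk, in a fixed format
        probability: float        # (2) analysis: likelihood, 0..1
        severity: int             # (2) analysis: impact, 1 (low) .. 5 (high)
        timeframe: str            # (2) analysis: e.g. "near", "mid", "far"
        plan: str = "watch"       # (3) planned approach
        history: list = field(default_factory=list)  # (4) tracking data

        def exposure(self) -> float:
            """Simple ranking used to prioritize control actions (5)."""
            return self.probability * self.severity

    r = Risk("Vendor library may miss delivery date", 0.3, 4, "near")
    r.history.append("2024-02-01: vendor slipped beta by two weeks")
    print(f"{r.statement}: exposure={r.exposure():.1f}")  # (6) communicate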

  4. Effects of Self-Graphing and Goal Setting on the Math Fact Fluency of Students with Disabilities

    PubMed Central

    Figarola, Patricia M; Gunter, Philip L; Reffel, Julia M; Worth, Susan R; Hummel, John; Gerber, Brian L

    2008-01-01

    We evaluated the impact of goal setting and students' participation in graphing their own performance data on the rate of math fact calculations. Participants were 3 students with mild disabilities in the first and second grades; 2 of the 3 students were also identified with Attention-Deficit/Hyperactivity Disorder (ADHD). They were taught to use Microsoft Excel® software to graph their rate of correct calculations when completing timed, independent practice sheets consisting of single-digit mathematics problems. Two students' rates of correct calculations nearly always met or exceeded the aim line established for their correct calculations. Additional interventions were required for the third student. Results are discussed in terms of implications and future directions for increasing the use of evaluation components in classrooms for students at risk for behavior disorders and academic failure. PMID:22477686

  5. A program code generator for multiphysics biological simulation using markup languages.

    PubMed

    Amano, Akira; Kawabata, Masanari; Yamashita, Yoshiharu; Rusty Punzalan, Florencio; Shimayoshi, Takao; Kuwabara, Hiroaki; Kunieda, Yoshitoshi

    2012-01-01

    To cope with the complexity of biological function simulation models, model representation with description languages is becoming popular. However, the simulation software itself becomes complex in these environments, and it is therefore difficult to modify the simulation conditions, the target computation resources, or the calculation methods. Complex biological function simulation software comprises 1) model equations, 2) boundary conditions and 3) calculation schemes. A description model file helps with the first point and partly with the second, but the third is difficult to handle because various calculation schemes are required for simulation models constructed from two or more elementary models. We introduce a simulation software generation system that uses a description-language-based specification of the coupling calculation scheme together with the cell model description file. With this software, we can easily generate biological simulation code with a variety of coupling calculation schemes. To show the efficiency of our system, an example of a coupling calculation scheme with three elementary models is shown.
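
    The paper's generator is driven by markup descriptions; as a loose illustration of the idea, the sketch below emits a Python time-stepping loop from a declarative description of two coupled elementary models. The dictionary schema and template are invented for illustration and are unrelated to the actual markup languages used by the authors.

    # Toy code generator: turn a declarative coupling description into
    # runnable simulation source. Schema and template are illustrative only.
    scheme = {
        "dt": 0.01,
        "steps": 3,
        "models": [
            {"name": "membrane", "update": "v += dt * (-0.1 * v + i_ion)"},
            {"name": "current",  "update": "i_ion = 0.5 - 0.2 * v"},
        ],
    }

    def generate(desc: dict) -> str:
        body = "\n".join(f"    {m['update']}" for m in desc["models"])
        return (f"dt = {desc['dt']}\nv, i_ion = -1.0, 0.0\n"
                f"for step in range({desc['steps']}):\n{body}\n"
                f"    print(step, round(v, 4))\n")

    source = generate(scheme)
    exec(source)  # run the generated simulation loop directly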

  6. Health Disparities Calculator (HD*Calc) - SEER Software

    Cancer.gov

    Statistical software that generates summary measures to evaluate and monitor health disparities. Users can import SEER data or other population-based health data to calculate 11 disparity measurements.

  7. A Software Safety Risk Taxonomy for Use in Retrospective Safety Cases

    NASA Technical Reports Server (NTRS)

    Hill, Janice L.

    2007-01-01

    Safety standards contain technical and process-oriented safety requirements. The best time to include these requirements is early in the development lifecycle of the system. When software safety requirements are levied on a legacy system after the fact, a retrospective safety case will need to be constructed for the software in the system. This can be a difficult task because there may be few to no artifacts available to show compliance with the software safety requirements. The risks associated with not meeting safety requirements in a legacy safety-critical computer system must be addressed to give confidence for reuse. This paper introduces a proposal for a software safety risk taxonomy for legacy safety-critical computer systems, created by specializing the Software Engineering Institute's 'Software Development Risk Taxonomy' with safety elements and attributes.

  8. Security Risks: Management and Mitigation in the Software Life Cycle

    NASA Technical Reports Server (NTRS)

    Gilliam, David P.

    2004-01-01

    A formal approach to managing and mitigating security risks in the software life cycle is requisite to developing software that has a higher degree of assurance that it is free of security defects which pose risk to the computing environment and the organization. Due to its criticality, security should be integrated as a formal approach in the software life cycle. Both a software security checklist and assessment tools should be incorporated into this life cycle process and integrated with a security risk assessment and mitigation tool. The current research at JPL addresses these areas through the development of a Software Security Assessment Instrument (SSAI) and integrating it with a Defect Detection and Prevention (DDP) risk management tool.

  9. Software-Related Recalls of Health Information Technology and Other Medical Devices: Implications for FDA Regulation of Digital Health.

    PubMed

    Ronquillo, Jay G; Zuckerman, Diana M

    2017-09-01

    Policy Points: Medical software has become an increasingly critical component of health care, yet the regulation of these devices is inconsistent and controversial. No studies of medical devices and software assess the impact on patient safety of the FDA's current regulatory safeguards and new legislative changes to those standards. Our analysis quantifies the impact of software problems in regulated medical devices and indicates that current regulations are necessary but not sufficient for ensuring patient safety by identifying and eliminating dangerous defects in software currently on the market. New legislative changes will further deregulate health IT, reducing safeguards that facilitate the reporting and timely recall of flawed medical software that could harm patients. Medical software has become an increasingly critical component of health care, yet the regulatory landscape for digital health is inconsistent and controversial. To understand which policies might best protect patients, we examined the impact of the US Food and Drug Administration's (FDA's) regulatory safeguards on software-related technologies in recent years and the implications for newly passed legislative changes in regulatory policy. Using FDA databases, we identified all medical devices that were recalled from 2011 through 2015 primarily because of software defects. We counted all software-related recalls for each FDA risk category and evaluated each high-risk and moderate-risk recall of electronic medical records to determine the manufacturer, device classification, submission type, number of units, and product details. A total of 627 software devices (1.4 million units) were subject to recalls, with 12 of these devices (190,596 units) subject to the highest-risk recalls. Eleven of the devices recalled as high risk had entered the market through the FDA review process that does not require evidence of safety or effectiveness, and one device was completely exempt from regulatory review. The largest high-risk recall categories were anesthesiology and general hospital, with one each in cardiovascular and neurology. Five electronic medical record systems (9,347 units) were recalled for software defects classified as posing a moderate risk to patient safety. Software problems in medical devices are not rare and have the potential to negatively influence medical care. Premarket regulation has not captured all the software issues that could harm patients, evidenced by the potentially large number of patients exposed to software products later subject to high-risk and moderate-risk recalls. Provisions of the 21st Century Cures Act that became law in late 2016 will reduce safeguards further. Absent stronger regulations and implementation to create robust risk assessment and adverse event reporting, physicians and their patients are likely to be at risk from medical errors caused by software-related problems in medical devices. © 2017 Milbank Memorial Fund.

  10. The OpenCalphad thermodynamic software interface.

    PubMed

    Sundman, Bo; Kattner, Ursula R; Sigli, Christophe; Stratmann, Matthias; Le Tellier, Romain; Palumbo, Mauro; Fries, Suzana G

    2016-12-01

    Thermodynamic data are needed for all kinds of simulations of materials processes. Thermodynamics determines the set of stable phases and also provides chemical potentials, compositions and driving forces for nucleation of new phases and phase transformations. Software to simulate materials properties needs accurate and consistent thermodynamic data to predict metastable states that occur during phase transformations. Due to long calculation times thermodynamic data are frequently pre-calculated into "lookup tables" to speed up calculations. This creates additional uncertainties as data must be interpolated or extrapolated and conditions may differ from those assumed for creating the lookup table. Speed and accuracy requires that thermodynamic software is fully parallelized and the Open-Calphad (OC) software is the first thermodynamic software supporting this feature. This paper gives a brief introduction to computational thermodynamics and introduces the basic features of the OC software and presents four different application examples to demonstrate its versatility.

  12. Proposed software system for atomic-structure calculation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fischer, C.F.

    1981-07-01

    Atomic structure calculations are understood well enough that, at a routine level, an atomic structure software package can be developed. At the Atomic Physics Conference in Riga in 1978, L.V. Chernysheva and M.Y. Amusia of Leningrad University presented a paper on software for atomic calculations. Their system, called ATOM, is based on the Hartree-Fock approximation, and correlation is included within the framework of RPAE. Energy level calculations, transition probabilities, photo-ionization cross-sections and electron scattering cross-sections are some of the physical properties that can be evaluated by their system. The MCHF method, together with CI techniques and the Breit-Pauli approximation, also provides a sound theoretical basis for atomic structure calculations.

  13. [Development of ophthalmologic software for handheld devices].

    PubMed

    Grottone, Gustavo Teixeira; Pisa, Ivan Torres; Grottone, João Carlos; Debs, Fernando; Schor, Paulo

    2006-01-01

    The formulas for calculation of intraocular lenses (IOLs) have evolved since the first theoretical formulas by Fyodorov. Among the second-generation formulas, the SRK-I formula has a simple calculation involving only the anteroposterior (axial) length, the IOL constant and the average keratometry. As the formulas evolved, their complexity increased, making the reconfiguration of parameters in special situations impracticable. The production and development of software for this purpose can therefore help surgeons recalculate those values when needed. Our aim was to conceive, develop and test Brazilian software for calculation of IOL dioptric power on handheld computers. For the development and programming of the IOL calculation software, we used the PocketC program (OrbWorks Concentrated Software, USA). We compared the results collected from a gold-standard device (Ultrascan/Alcon Labs) with a simulation of 100 fictitious patients, using the same IOL parameters. The results were grouped as ULTRASCAN data and SOFTWARE data. Using the SRK/T formula, the parameters ranged over keratometry between 35 and 55 D, axial length between 20 and 28 mm, and IOL constants of 118.7, 118.3 and 115.8. A Wilcoxon test showed that the groups do not differ (p=0.314). The Ultrascan sample varied between 11.82 and 27.97; the variation in the tested program's sample was practically identical (11.83-27.98). The average of the Ultrascan group was 20.93, and the software group had a similar average. The standard deviation of the samples was also similar (4.53). The precision of the IOL software for handheld devices was similar to that of the standard device using the SRK/T formula. The software worked properly and was stable, without bugs, in the tested versions of the operating system.
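
    For reference, the simple second-generation calculation the abstract attributes to SRK-I is a linear regression of IOL power on axial length and keratometry; a sketch of the classic SRK regression form follows (the SRK/T formula actually validated in the study is considerably more involved).

    # Classic SRK regression formula (second generation):
    #   P = A - 2.5 * L - 0.9 * K
    # P: IOL power for emmetropia (D), A: lens-specific constant,
    # L: axial length (mm), K: average keratometry (D).

    def srk_iol_power(a_constant: float, axial_length_mm: float,
                      mean_keratometry_d: float) -> float:
        return a_constant - 2.5 * axial_length_mm - 0.9 * mean_keratometry_d

    # Example: A=118.7, L=23.5 mm, K=44 D  ->  20.35 D
    print(srk_iol_power(118.7, 23.5, 44.0))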

  14. Software-Based Visual Loan Calculator For Banking Industry

    NASA Astrophysics Data System (ADS)

    Isizoh, A. N.; Anazia, A. E.; Okide, S. O. 3; Onyeyili, T. I.; Okwaraoka, C. A. P.

    2012-03-01

    A loan calculator for the banking industry is very necessary in the modern banking system, which employs many design techniques for security reasons. This paper presents the software-based design and implementation of a visual loan calculator for the banking industry using Visual Basic .NET (VB.NET). The fundamental approach is to develop a Graphical User Interface (GUI) using VB.NET tools, and then to develop a working program which calculates the interest on any loan obtained. The VB.NET program was written and implemented, and the software proved satisfactory.
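
    The abstract doesn't give the interest formula the program uses; as a generic illustration, the snippet below computes simple interest and the standard amortized monthly payment, two calculations such a loan calculator would plausibly expose.

    # Generic loan math (illustrative; the paper's exact formulas are not given).

    def simple_interest(principal: float, annual_rate: float, years: float) -> float:
        """I = P * r * t"""
        return principal * annual_rate * years

    def monthly_payment(principal: float, annual_rate: float, months: int) -> float:
        """Standard amortization: M = P*r*(1+r)^n / ((1+r)^n - 1), r monthly."""
        r = annual_rate / 12.0
        if r == 0:
            return principal / months
        growth = (1 + r) ** months
        return principal * r * growth / (growth - 1)

    print(round(simple_interest(10_000, 0.08, 2), 2))   # 1600.0
    print(round(monthly_payment(10_000, 0.08, 24), 2))  # ~452.27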

  15. Lower Total Cost of Ownership of ONE-NET by Using Thin-Client Desktop Deployment and Virtualization-Based Server Technology

    DTIC Science & Technology

    2010-09-01

    A cost per seat (CPS) model developed by Naval Network Warfare Command (NNWC) was used to calculate the major cost components (labor, hardware, software, and transport), while a VMware tool was used to calculate power and cooling costs for both solutions. In addition, VMware provided a cost estimate for the upfront hardware and software licensing costs.

  16. The Use of Pro/Engineer CAD Software and Fishbowl Tool Kit in Ray-tracing Analysis

    NASA Technical Reports Server (NTRS)

    Nounu, Hatem N.; Kim, Myung-Hee Y.; Ponomarev, Artem L.; Cucinotta, Francis A.

    2009-01-01

    This document is designed as a manual for a user who wants to operate Pro/ENGINEER (ProE) Wildfire 3.0 with the NASA Space Radiation Program's (SRP) custom-designed toolkit, called 'Fishbowl', for the ray tracing of complex spacecraft geometries given by a ProE CAD model. The analysis of spacecraft geometry through ray tracing is a vital part in the calculation of health risks from space radiation. Space radiation poses severe risks of cancer, degenerative diseases and acute radiation sickness during long-term exploration missions, and shielding optimization is an important component in the application of radiation risk models. Ray tracing is a technique in which 3-dimensional (3D) vehicle geometry can be represented as the input for the space radiation transport code and subsequent risk calculations. In ray tracing, a certain number of rays (on the order of 1000) are used to calculate the equivalent thickness, say of aluminum, of the spacecraft geometry seen at a point of interest called the dose point. The rays originate at the dose point and terminate at a homogeneously distributed set of points lying on a sphere that circumscribes the spacecraft and that has its center at the dose point. The distance a ray traverses in each material is converted to aluminum or other user-selected equivalent thickness. Then all equivalent thicknesses are summed up for each ray. Since each ray points to a direction, the aluminum equivalent of each ray represents the shielding that the geometry provides to the dose point from that particular direction. This manual will first list for the user the contact information for help in installing ProE and Fishbowl, in addition to notes on platform support and system requirements. Second, the document will show the user how to use the software to ray trace a Pro/E-designed 3D assembly and will serve later as a reference for troubleshooting. The user is assumed to have previous knowledge of ProE and CAD modeling.
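
    As a schematic of the equivalent-thickness bookkeeping described above, the sketch below converts per-material path lengths along one ray into an aluminum-equivalent thickness by areal density and sums them; the density values and the areal-density conversion rule are common conventions stated here as assumptions, not details taken from the Fishbowl toolkit.

    # One ray's aluminum-equivalent thickness: scale each traversed segment
    # by the ratio of its density to aluminum's, i.e. match areal density
    # (g/cm^2). Densities and the conversion rule are illustrative assumptions.
    AL_DENSITY = 2.70  # g/cm^3
    DENSITY = {"aluminum": 2.70, "polyethylene": 0.94, "water": 1.00}

    def aluminum_equivalent_cm(segments: list[tuple[str, float]]) -> float:
        """segments: (material, path length in cm) pairs along a single ray."""
        return sum(length * DENSITY[mat] / AL_DENSITY for mat, length in segments)

    # A ray crossing 0.3 cm of aluminum hull and 5 cm of polyethylene shielding:
    ray = [("aluminum", 0.3), ("polyethylene", 5.0)]
    print(round(aluminum_equivalent_cm(ray), 3))  # 2.041 cm Al-equivalent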

  17. Development of automatic visceral fat volume calculation software for CT volume data.

    PubMed

    Nemoto, Mitsutaka; Yeernuer, Tusufuhan; Masutani, Yoshitaka; Nomura, Yukihiro; Hanaoka, Shouhei; Miki, Soichiro; Yoshikawa, Takeharu; Hayashi, Naoto; Ohtomo, Kuni

    2014-01-01

    To develop automatic visceral fat volume calculation software for computed tomography (CT) volume data and to evaluate its feasibility. A total of 24 sets of whole-body CT volume data and anthropometric measurements were obtained, with three sets for each of four BMI categories (under 20, 20 to 25, 25 to 30, and over 30) in both sexes. True visceral fat volumes were defined on the basis of manual segmentation of the whole-body CT volume data by an experienced radiologist. Software to automatically calculate visceral fat volumes was developed using a region segmentation technique based on morphological analysis with a CT value threshold. Automatically calculated visceral fat volumes were evaluated in terms of the correlation coefficient with the true volumes and the error relative to the true volume. Automatic visceral fat volume calculation results were obtained successfully for all 24 data sets, and the average calculation time was 252.7 seconds per case. The correlation coefficients between the true and automatically calculated visceral fat volumes were over 0.999. The newly developed software is feasible for calculating visceral fat volumes in a reasonable time and was shown to have high accuracy.
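
    A bare-bones version of threshold-based fat volumetry: count voxels whose Hounsfield units fall in a fat range and multiply by voxel volume. The HU window shown (-190 to -30) is a commonly used fat range given here as an assumption; the paper's own thresholds and morphological steps (including the visceral/subcutaneous separation) are not reproduced.

    # Minimal HU-threshold fat volumetry sketch (illustrative thresholds).
    # Note: this counts all fat-range voxels; the actual software adds
    # morphological analysis to isolate the visceral compartment.
    import numpy as np

    def fat_volume_ml(ct: np.ndarray, voxel_mm3: float,
                      hu_range=(-190, -30)) -> float:
        """ct: 3D array of Hounsfield units; returns fat volume in mL."""
        lo, hi = hu_range
        fat_voxels = np.count_nonzero((ct >= lo) & (ct <= hi))
        return fat_voxels * voxel_mm3 / 1000.0  # mm^3 -> mL

    rng = np.random.default_rng(0)
    ct = rng.integers(-300, 200, size=(40, 64, 64))   # toy "CT volume"
    print(round(fat_volume_ml(ct, voxel_mm3=0.7 * 0.7 * 5.0), 1))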

  18. Model based verification of the Secure Socket Layer (SSL) Protocol for NASA systems

    NASA Technical Reports Server (NTRS)

    Powell, John D.; Gilliam, David

    2004-01-01

    The National Aeronautics and Space Administration (NASA) has tens of thousands of networked computer systems and applications. Software security vulnerabilities present risks such as lost or corrupted data, information theft, and unavailability of critical systems. These risks represent potentially enormous costs to NASA. The NASA Code Q research initiative 'Reducing Software Security Risk (RSSR) Through an Integrated Approach' offers formal verification of information technology (IT), through the creation of a Software Security Assessment Instrument (SSAI), to address software security risks.

  19. Evaluation of RayXpert® for shielding design of medical facilities

    NASA Astrophysics Data System (ADS)

    Derreumaux, Sylvie; Vecchiola, Sophie; Geoffray, Thomas; Etard, Cécile

    2017-09-01

    In a context of growing demands for expert evaluation concerning medical, industrial and research facilities, the French Institute for Radiation Protection and Nuclear Safety (IRSN) considered it necessary to acquire new software for efficient dimensioning calculations. The selected software is RayXpert®. Before using this software in routine practice, exposure and transmission calculations for some basic configurations were validated. The validation was performed by calculating gamma dose constants and tenth-value layers (TVL) for usual shielding materials and for the radioisotopes most used in therapy (Ir-192, Co-60 and I-131). Calculated values were compared with results obtained using MCNPX as a reference code and with published values. The impact of different calculation parameters, such as the source emission rays considered for the calculation and the use of biasing techniques, was evaluated.
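
    Tenth-value layers turn shielding design into simple logarithms: each TVL of material cuts the dose rate by a factor of ten. A small sketch using that relation follows; the TVL number in the example is a placeholder, not a value from the IRSN validation.

    import math

    # Broad-beam attenuation with tenth-value layers (TVL):
    #   transmission T = 10 ** (-x / TVL)
    #   required thickness x = TVL * log10(1 / T)

    def transmission(thickness_cm: float, tvl_cm: float) -> float:
        return 10.0 ** (-thickness_cm / tvl_cm)

    def thickness_for(t_target: float, tvl_cm: float) -> float:
        return tvl_cm * math.log10(1.0 / t_target)

    # Illustrative only: with a 4.1 cm TVL, attenuating to 1/1000th
    # of the unshielded dose rate takes three TVLs = 12.3 cm.
    print(round(thickness_for(1e-3, 4.1), 2))       # 12.3
    print(round(transmission(12.3, 4.1), 6))        # 0.001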

  20. Risk-Informed Safety Assurance and Probabilistic Assessment of Mission-Critical Software-Intensive Systems

    NASA Technical Reports Server (NTRS)

    Guarro, Sergio B.

    2010-01-01

    This report validates and documents the detailed features and practical application of the framework for software intensive digital systems risk assessment and risk-informed safety assurance presented in the NASA PRA Procedures Guide for Managers and Practitioners. This framework, called herein the "Context-based Software Risk Model" (CSRM), enables the assessment of the contribution of software and software-intensive digital systems to overall system risk, in a manner which is entirely compatible and integrated with the format of a "standard" Probabilistic Risk Assessment (PRA), as currently documented and applied for NASA missions and applications. The CSRM also provides a risk-informed path and criteria for conducting organized and systematic digital system and software testing so that, within this risk-informed paradigm, the achievement of a quantitatively defined level of safety and mission success assurance may be targeted and demonstrated. The framework is based on the concept of context-dependent software risk scenarios and on the modeling of such scenarios via the use of traditional PRA techniques - i.e., event trees and fault trees - in combination with more advanced modeling devices such as the Dynamic Flowgraph Methodology (DFM) or other dynamic logic-modeling representations. The scenarios can be synthesized and quantified in a conditional logic and probabilistic formulation. The application of the CSRM method documented in this report refers to the MiniAERCam system designed and developed by the NASA Johnson Space Center.

  1. The integration of the risk management process with the lifecycle of medical device software.

    PubMed

    Pecoraro, F; Luzi, D

    2014-01-01

    The application of software in the Medical Device (MD) domain has become central to the improvement of diagnoses and treatments. The new European regulations that specifically address software as an important component of MDs require complex procedures to make software compliant with safety requirements, thereby introducing new challenges in the qualification and classification of MD software as well as in the performance of risk management activities. From this perspective, the aim of this paper is to propose an integrated framework that combines the activities to be carried out by the manufacturer to develop safe software within the development lifecycle, based on the regulatory requirements reported in US and European regulations as well as in the relevant standards and guidelines. A comparative analysis was carried out to identify the main issues related to the application of the current new regulations. In addition, standards and guidelines recently released to harmonise procedures for the validation of MD software were used to define the risk management activities to be carried out by the manufacturer during the software development process. This paper highlights the main issues related to the qualification and classification of MD software, providing an analysis of the different regulations applied in Europe and the US. A model that integrates the risk management process with the software development lifecycle is also proposed. It is based on regulatory requirements and considers software risk analysis as a central input to be managed by the manufacturer from the initial stages of software design onwards, in order to prevent MD failures. Relevant changes in the process of MD development have been introduced with the recognition of software as an important component of MDs, as stated in regulations and standards. This implies highly iterative processes that integrate risk management into the framework of software development. It also makes it necessary to involve both medical and software engineering competences to safeguard patient and user safety.

  2. Assessing risk based on uncertain avalanche activity patterns

    NASA Astrophysics Data System (ADS)

    Zeidler, Antonia; Fromm, Reinhard

    2015-04-01

    Avalanches may affect critical infrastructure and may cause great economic losses. The planning horizon of infrastructures, e.g. hydropower generation facilities, reaches well into the future. Based on the results of previous studies on the effect of changing meteorological parameters (precipitation, temperature) on avalanche activity, we assume that there will be a change in the risk pattern in the future. Decision makers need to understand what the future might bring in order to formulate their mitigation strategies, and we therefore explore commercial risk software to calculate risk for the coming years that might support decision processes. The software, @risk, is known to many larger companies, so we explore its capability to include avalanche risk simulations in order to guarantee comparability across different risks. In a first step, we develop a model for a hydropower generation facility that reflects the problem of changing avalanche activity patterns in the future, by selecting relevant input parameters and assigning likely probability distributions. The uncertain input variables include the probability of avalanches affecting an object, the vulnerability of an object, the expected cost of repairing the object and the expected cost due to interruption. The crux is to find the distribution that best represents the input variables under changing meteorological conditions. Our focus is on including the uncertain probability of avalanches based on the analysis of past avalanche data and expert knowledge. In order to explore different likely outcomes, we base the analysis on three climate scenarios (likely, worst case, baseline). For some variables it is possible to fit a distribution to historical data, whereas in cases where the past dataset is insufficient or unavailable the software allows selection from over 30 distribution types. The Monte Carlo simulation draws from the probability distributions of the uncertain variables, using all valid combinations of input values to simulate all possible outcomes. In our case the output is the expected risk (Euro/year) for each object considered (e.g. a water intake) and for the entire hydropower generation system. The output is again a distribution that is interpreted by the decision makers, as the final strategy depends on the needs and requirements of the end user, which may be driven by personal preferences. In this presentation we will show how we used uncertain information on future avalanche activity in commercial risk software, thereby bringing the knowledge of natural hazard experts to decision makers.
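
    The workflow the abstract sketches - assign distributions to uncertain inputs, run a Monte Carlo simulation, read off the distribution of annual loss - is shown below in plain Python rather than @risk; all distributions and parameter values are invented placeholders.

    # Monte Carlo sketch of expected annual avalanche risk for one object
    # (Euro/year). Distributions and parameters are illustrative placeholders.
    import numpy as np

    rng = np.random.default_rng(42)
    N = 100_000

    # Uncertain inputs: annual hit probability, vulnerability (damage degree),
    # repair cost, and interruption cost per event.
    p_hit = rng.beta(2, 48, N)                        # mean ~0.04 per year
    vulnerability = rng.triangular(0.1, 0.3, 0.9, N)  # fraction of value lost
    repair_cost = rng.lognormal(mean=12.0, sigma=0.5, size=N)  # Euro
    interruption = rng.uniform(10_000, 80_000, N)     # Euro per event

    hit = rng.random(N) < p_hit
    annual_loss = np.where(hit, vulnerability * repair_cost + interruption, 0.0)

    print(f"expected risk: {annual_loss.mean():,.0f} Euro/year")
    print(f"95th percentile: {np.quantile(annual_loss, 0.95):,.0f} Euro/year")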

  3. Validation of software for calculating the likelihood ratio for parentage and kinship.

    PubMed

    Drábek, J

    2009-03-01

    Although the likelihood ratio is a well-known statistical technique, commercial off-the-shelf (COTS) software products for its calculation are not sufficiently validated to suit the general requirements for the competence of testing and calibration laboratories (EN/ISO/IEC 17025:2005 norm) per se. The software in question can be considered critical, as it directly weighs the forensic evidence allowing judges to decide on guilt or innocence or to identify persons or kin (e.g., in mass fatalities). For these reasons, accredited laboratories shall validate likelihood ratio software in accordance with the above norm. To validate software for calculating the likelihood ratio in parentage/kinship scenarios, I assessed available vendors, chose two programs (Paternity Index and familias) for testing, and finally validated them using tests derived from the available guidelines for the fields of forensics, biomedicine, and software engineering. MS Excel calculations using known likelihood ratio formulas, or peer-reviewed results of difficult paternity cases, were used as a reference. Using seven test cases, it was found that both programs satisfied the requirements for basic paternity cases. However, only a combination of the two software programs fulfills the criteria needed for our purpose across the whole spectrum of functions under validation, with the exception of providing algebraic formulas in cases of mutation and/or silent alleles.
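
    Validation of this kind of software typically reduces to recomputing per-locus likelihood ratios and their product by hand. The sketch below does that for a generic multi-locus case; the per-locus numbers are arbitrary examples, not the test cases used in the study.

    import math

    # Combined likelihood ratio across independent loci is the product of
    # the per-locus ratios: LR = prod(P(genotypes | H1) / P(genotypes | H2)).

    def combined_lr(per_locus_lrs: list[float]) -> float:
        return math.prod(per_locus_lrs)

    def posterior_probability(lr: float, prior: float = 0.5) -> float:
        """Posterior odds = LR * prior odds (Bayes); returned as probability."""
        odds = lr * prior / (1.0 - prior)
        return odds / (1.0 + odds)

    lrs = [2.4, 1.8, 5.1, 0.9, 3.3]   # arbitrary per-locus paternity indices
    lr = combined_lr(lrs)
    print(round(lr, 2))                         # 65.44
    print(round(posterior_probability(lr), 4))  # 0.985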

  4. Predicting death from kala-azar: construction, development, and validation of a score set and accompanying software.

    PubMed

    Costa, Dorcas Lamounier; Rocha, Regina Lunardi; Chaves, Eldo de Brito Ferreira; Batista, Vivianny Gonçalves de Vasconcelos; Costa, Henrique Lamounier; Costa, Carlos Henrique Nery

    2016-01-01

    Early identification of patients at higher risk of progressing to severe disease and death is crucial for implementing therapeutic and preventive measures; this could reduce the morbidity and mortality from kala-azar. We describe a score set composed of four scales, in addition to software for quick assessment of the probability of death from kala-azar at the point of care. Data from 883 patients diagnosed between September 2005 and August 2008 were used to derive the score set, and data from 1,031 patients diagnosed between September 2008 and November 2013 were used to validate the models. Stepwise logistic regression analyses were used to derive the optimal multivariate prediction models. Model performance was assessed by its discriminatory accuracy. A computational specialist system (Kala-Cal®) was developed to speed up the calculation of the probability of death based on clinical scores. The clinical prediction score showed high discrimination (area under the curve [AUC] 0.90) for distinguishing death from survival for children ≤2 years old. Performance improved after adding laboratory variables (AUC 0.93). The clinical score showed equivalent discrimination (AUC 0.89) for older children and adults, which also improved after including laboratory data (AUC 0.92). The score set also showed high, although lower, discrimination when applied to the validation cohort. This score set and the Kala-Cal® software may help identify individuals with the greatest probability of death. The associated software may speed up the calculation of the probability of death based on clinical scores and assist physicians in decision-making.
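
    Since the models are stepwise logistic regressions, the software's core step is converting a weighted clinical score into a probability via the logistic function. The sketch below shows that conversion; the coefficients and predictor names are invented for illustration and are not the published Kala-Cal® model.

    import math

    # A fitted logistic model turns a weighted clinical score into a
    # probability of death: p = 1 / (1 + exp(-(b0 + sum(b_i * x_i)))).
    # Coefficients and predictors below are invented for illustration.

    COEFFS = {"intercept": -4.0, "jaundice": 1.2, "bleeding": 1.6,
              "age_under_2": 0.9}

    def probability_of_death(findings: dict) -> float:
        z = COEFFS["intercept"] + sum(
            COEFFS[name] for name, present in findings.items() if present)
        return 1.0 / (1.0 + math.exp(-z))

    patient = {"jaundice": True, "bleeding": True, "age_under_2": False}
    print(round(probability_of_death(patient), 3))  # 0.231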

  5. Liver Volumetry Plug and Play: Do It Yourself with ImageJ

    PubMed Central

    Dello, Simon A. W. G.; van Dam, Ronald M.; Slangen, Jules J. G.; van de Poll, Marcel C. G.; Bemelmans, Marc H. A.; Greve, Jan Willem W. M.; Beets-Tan, Regina G. H.; Wigmore, Stephen J.

    2007-01-01

    Background A small remnant liver volume is an important risk factor for posthepatectomy liver failure and can be predicted accurately by computed tomography (CT) volumetry using radiologic image analysis software. Unfortunately, this software is expensive and usually requires support by a radiologist. ImageJ is a freely downloadable image analysis software package developed by the National Institute of Health (NIH) and brings liver volumetry to the surgeon’s desktop. We aimed to assess the accuracy of ImageJ for hepatic CT volumetry. Methods ImageJ was downloaded from http://www.rsb.info.nih.gov/ij/. Preoperative CT scans of 15 patients who underwent liver resection for colorectal cancer liver metastases were retrospectively analyzed. Scans were opened in ImageJ; and the liver, all metastases, and the intended parenchymal transection line were manually outlined on each slice. The area of each selected region, metastasis, resection specimen, and remnant liver was multiplied by the slice thickness to calculate volume. Volumes of virtual liver resection specimens measured with ImageJ were compared with specimen weights and calculated volumes obtained during pathology examination after resection. Results There was an excellent correlation between the volumes calculated with ImageJ and the actual measured weights of the resection specimens (r² = 0.98, p < 0.0001). The weight/volume ratio amounted to 0.88 ± 0.04 (standard error) and was in agreement with our earlier findings using CT-linked radiologic software. Conclusion ImageJ can be used for accurate hepatic CT volumetry on a personal computer. This application brings CT volumetry to the surgeon’s desktop at no expense and is particularly useful in cases of tertiary referred patients, who already have a proper CT scan on CD-ROM from the referring institution. Most likely the discrepancy between volume and weight results from exsanguination of the liver after resection. PMID:17726630
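
    The volumetry itself is just slice-wise outlined areas times slice thickness; a compact sketch follows (the region areas are made-up inputs, and the 0.88 weight/volume ratio reported above is applied to illustrate the weight prediction).

    # Slice-based CT volumetry: volume = sum(outlined area * slice thickness).

    def volume_ml(areas_mm2: list[float], slice_thickness_mm: float) -> float:
        return sum(areas_mm2) * slice_thickness_mm / 1000.0  # mm^3 -> mL

    # Made-up outlined areas (mm^2) of a resection specimen on 5 mm slices:
    areas = [1200.0, 2400.0, 3100.0, 2800.0, 1500.0]
    vol = volume_ml(areas, 5.0)
    print(round(vol, 1))              # 55.0 mL
    print(round(vol * 0.88, 1))       # ~48.4 g predicted specimen weight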

  6. A planning system for transapical aortic valve implantation

    NASA Astrophysics Data System (ADS)

    Gessat, Michael; Merk, Denis R.; Falk, Volkmar; Walther, Thomas; Jacobs, Stefan; Nöttling, Alois; Burgert, Oliver

    2009-02-01

    Stenosis of the aortic valve is a common cardiac disease. It is usually corrected surgically by replacing the valve with a mechanical or biological prosthesis. Transapical aortic valve implantation is an experimental minimally invasive surgical technique that is applied to patients with high operative risk to avoid pulmonary arrest. A stented biological prosthesis is mounted on a catheter. Through small incisions in the fifth intercostal space and the apex of the heart, the catheter is positioned under fluoroscopy in the aortic root. The stent is expanded and unfolds the valve, which is thereby implanted into the aortic root. Exact targeting is crucial, since major complications can arise from a misplaced valve. Planning software for perioperative use is presented that allows for selection of the best-fitting implant and calculation of the safe target area for that implant. The software uses contrast-enhanced perioperative DynaCT images acquired under rapid pacing. In a semiautomatic process, a surface segmentation of the aortic root is created. User-selected anatomical landmarks are used to calculate the geometric constraints for the size and position of the implant. The software is integrated into a PACS network based on DICOM communication to query and receive the images and implant templates from a PACS server. The planning results can be exported to the same server, and from there can be retrieved by an intraoperative catheter guidance device.

  7. SU-F-T-22: Clinical Implications When Using TG-186 (ACE) Heterogeneity Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Likhacheva, A; Grade, E; Sadeghi, A

    Purpose: The purpose of this study is to compare dosimetric calculations using the traditional TG-43 formalism and the Oncentra Brachy Advanced Collapsed cone Engine (ACE) TG-186 calculation algorithm in a clinical setting. Methods: We analyzed the dosimetry of four patients treated with accelerated partial breast irradiation using a multi-channel intracavitary device (SAVI). All patients were treated to 34 Gy in 10 fractions using a high-dose-rate 192Ir source. The plans were designed and treated using the TG-43 model. ACE was used to assess the effect of heterogeneity correction on various dosimetric parameters. Mass density was estimated using Hounsfield units. Results: Compared to the TG-43 formalism, ACE estimated lower doses to targets and organs at risk. The mean difference was 19.8% (range 15.3-24.1%) for PTV-eval V200, 12.0% (range 9.7-17.7%) for PTV-eval V150, 4.3% (range 3.3-6.5%) for PTV-eval D95, 3.3% (range 1.4-5.4%) for PTV-eval D90, 5.4% (range 2.9-9.9%) for maximum rib dose, and 5.7% (range 2.4-7.4%) for maximum skin dose. There was no correlation between the magnitude of the difference and the PTV-eval volume, air volume, or tissue-applicator conformance. Conclusion: Based on our preliminary study, the TG-43 algorithm appears to overestimate the dose to targets and organs at risk when compared to the ACE TG-186 software. We hypothesize that air adjacent to the SAVI struts contributes to a lack of scatter, thereby producing a significant difference in dose calculation when using ACE. We believe that the ACE calculation provides a more realistic isodose distribution than TG-43. We plan to further investigate the impact of heterogeneity correction on brachytherapy planning for a wide variety of clinical scenarios, including skin, cervix/uterus, prostate, and lung.

  8. CrossTalk. The Journal of Defense Software Engineering. Volume 23, Number 6, Nov/Dec 2010

    DTIC Science & Technology

    2010-11-01

    The Risk-Driven Model of architectural design guides developers to apply effort to their software architecture commensurate with the risks faced by the project. A key feature of the Risk-Driven Model is the promotion of risk to prominence. It is possible to apply the Risk-Driven Model to essentially any software development process: many low-risk projects succeed without any planned architecture work, while many high-risk projects would fail without it. The Risk-Driven Model walks a middle path.

  9. Using software security analysis to verify the secure socket layer (SSL) protocol

    NASA Technical Reports Server (NTRS)

    Powell, John D.

    2004-01-01

    The National Aeronautics and Space Administration (NASA) has tens of thousands of networked computer systems and applications. Software security vulnerabilities present risks such as lost or corrupted data, information theft, and unavailability of critical systems. These risks represent potentially enormous costs to NASA. The NASA Code Q research initiative 'Reducing Software Security Risk (RSSR) Through an Integrated Approach' offers, among its capabilities, formal verification of software security properties through the use of model based verification (MBV) to address software security risks. [1,2,3,4,5,6] MBV is a formal approach to software assurance that combines analysis of software, via abstract models, with technology, such as model checkers, that provides automation of the mechanical portions of the analysis process. This paper will discuss: the need for formal analysis to assure software systems with respect to software security and why testing alone cannot provide it; the means by which MBV with a Flexible Modeling Framework (FMF) accomplishes the necessary analysis task; and an example of FMF-style MBV in the verification of properties of the Secure Socket Layer (SSL) communication protocol as a demonstration.

  10. Multi-hazard risk analysis for management strategies

    NASA Astrophysics Data System (ADS)

    Kappes, M.; Keiler, M.; Bell, R.; Glade, T.

    2009-04-01

    Risk management very often operates in a reactive way, responding to an event, instead of proactively starting with risk analysis and building up the whole process of risk evaluation, prevention, event management and regeneration. Since damage and losses from natural hazards rise continuously, more and more studies, concepts (e.g. in Switzerland or South Tyrol-Bolzano) and software packages (e.g. ARMAGEDOM, HAZUS or RiskScape) are developed to guide, standardize and facilitate risk analysis. But these approaches focus on different aspects and are mostly closely adapted to the situation (legislation, organization of the administration, specific processes, etc.) of the specific country or region. In this study we propose the development of a flexible methodology for multi-hazard risk analysis, identifying the stakeholders and their needs, the processes and their characteristics, and the modeling approaches, as well as the incoherencies that occur when combining all these different aspects. Based on this concept, a flexible software package will be established, consisting of ArcGIS as the central base complemented by various modules for hazard modeling, vulnerability assessment and risk calculation. Not all modules will be developed from scratch; some will be taken from the current state of the art and connected to or integrated into ArcGIS. For this purpose two study sites, Valtellina in Italy and Barcelonnette in France, were chosen, and the hazard types debris flow, rockfall, landslide, avalanche and flood are planned to be included in the tool for a regional multi-hazard risk analysis. Since the central idea of this tool is its flexibility, this will only be a first step; in the future, further processes and scales can be included and the instrument thus adapted to any study site.

  11. Dose Estimating Application Software Modification: Additional Function of a Size-Specific Effective Dose Calculator and Auto Exposure Control.

    PubMed

    Kobayashi, Masanao; Asada, Yasuki; Matsubara, Kosuke; Suzuki, Shouichi; Matsunaga, Yuta; Haba, Tomonobu; Kawaguchi, Ai; Daioku, Tomihiko; Toyama, Hiroshi; Kato, Ryoichi

    2017-05-01

    Adequate dose management during computed tomography is important. In the present study, the dosimetric application software ImPACT was extended with a functional calculator for the size-specific dose estimate (SSDE) and with scan settings for the auto exposure control (AEC) technique. This study aimed to assess the practicality and accuracy of the modified ImPACT software for dose estimation. We compared the conversion factors identified by the software with the values reported by the American Association of Physicists in Medicine Task Group 204 and noted similar results. Moreover, doses were calculated with the AEC technique and with a fixed tube current of 200 mA for the chest-pelvis region. The modified ImPACT software could estimate each organ dose based on the modulated tube current. The ability to perform beneficial modifications indicates the flexibility of the ImPACT software, which can be further modified for estimation of other doses. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
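
    The SSDE defined by AAPM TG-204 is the scanner-reported CTDIvol multiplied by a size-dependent conversion factor looked up from the patient's effective diameter; the sketch below shows that structure with a small interpolated factor table whose values are coarse illustrative stand-ins, not the published TG-204 tabulation.

    # SSDE sketch per the AAPM TG-204 structure:
    #   effective diameter = sqrt(AP * LAT); SSDE = f(Deff) * CTDIvol.
    # The factor table below is an illustrative stand-in for the published
    # TG-204 tabulation (32 cm phantom), not the real values.
    import math

    DIAMETER_CM = [16, 20, 24, 28, 32, 36, 40]
    FACTOR =      [1.9, 1.6, 1.4, 1.2, 1.0, 0.9, 0.8]  # illustrative only

    def conversion_factor(d_eff: float) -> float:
        """Linear interpolation in the illustrative factor table."""
        if d_eff <= DIAMETER_CM[0]:
            return FACTOR[0]
        for (d0, f0), (d1, f1) in zip(zip(DIAMETER_CM, FACTOR),
                                      zip(DIAMETER_CM[1:], FACTOR[1:])):
            if d_eff <= d1:
                return f0 + (f1 - f0) * (d_eff - d0) / (d1 - d0)
        return FACTOR[-1]

    def ssde(ap_cm: float, lat_cm: float, ctdi_vol_mgy: float) -> float:
        d_eff = math.sqrt(ap_cm * lat_cm)
        return conversion_factor(d_eff) * ctdi_vol_mgy

    print(round(ssde(ap_cm=22.0, lat_cm=30.0, ctdi_vol_mgy=10.0), 1))
    # ~13.2 mGy with these toy factors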

  12. Familial aggregation of age-related macular degeneration in the Utah population.

    PubMed

    Luo, Ling; Harmon, Jennifer; Yang, Xian; Chen, Haoyu; Patel, Shrena; Mineau, Geraldine; Yang, Zhenglin; Constantine, Ryan; Buehler, Jeanette; Kaminoh, Yuuki; Ma, Xiang; Wong, Tien Y; Zhang, Maonian; Zhang, Kang

    2008-02-01

    We examined familial aggregation and risk of age-related macular degeneration in the Utah population using a population-based case-control study. Over one million unique patient records were searched within the University of Utah Health Sciences Center and the Utah Population Database (UPDB), identifying 4764 patients with AMD. Specialized kinship analysis software was used to test for familial aggregation of disease, estimate the magnitude of familial risks, and identify families at high risk for disease. The population-attributable risk (PAR) for AMD was calculated to be 0.34. Recurrence risks in relatives indicate increased relative risks in siblings (2.95), first cousins (1.29), second cousins (1.13), and parents (5.66) of affected cases. There were 16 extended large families with AMD identified for potential use in genetic studies. Each family had five or more living affected members. The familial aggregation of AMD shown in this study exemplifies the merit of the UPDB and supports recent research demonstrating significant genetic contribution to disease development and progression.

  13. SU-F-T-111: Investigation of the Attila Deterministic Solver as a Supplement to Monte Carlo for Calculating Out-Of-Field Radiotherapy Dose

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mille, M; Lee, C; Failla, G

    Purpose: To use the Attila deterministic solver as a supplement to Monte Carlo for calculating out-of-field organ dose in support of epidemiological studies looking at the risks of second cancers. Supplemental dosimetry tools are needed to speed up dose calculations for studies involving large-scale patient cohorts. Methods: Attila is a multi-group discrete ordinates code which can solve the 3D photon-electron coupled linear Boltzmann radiation transport equation on a finite-element mesh. Dose is computed by multiplying the calculated particle flux in each mesh element by a medium-specific energy deposition cross-section. The out-of-field dosimetry capability of Attila is investigated by comparing average organ dose to that calculated by Monte Carlo simulation. The test scenario consists of a 6 MV external beam treatment of a female patient with a tumor in the left breast. The patient is simulated by a whole-body adult reference female computational phantom. Monte Carlo simulations were performed using MCNP6 and XVMC. Attila can export a tetrahedral mesh for MCNP6, allowing for a direct comparison between the two codes. The Attila and Monte Carlo methods were also compared in terms of calculation speed and complexity of simulation setup. A key prerequisite for this work was the modeling of a Varian Clinac 2100 linear accelerator. Results: The solid mesh of the torso part of the adult female phantom for the Attila calculation was prepared using the CAD software SpaceClaim. Preliminary calculations suggest that Attila is a user-friendly software package which shows great promise for our intended application. Computational performance is related to the number of tetrahedral elements included in the Attila calculation. Conclusion: Attila is being explored as a supplement to the conventional Monte Carlo radiation transport approach for performing retrospective patient dosimetry. The goal is for the dosimetry to be sufficiently accurate for use in retrospective epidemiological investigations.

  14. Performance of automated and manual coding systems for occupational data: a case study of historical records.

    PubMed

    Patel, Mehul D; Rose, Kathryn M; Owens, Cindy R; Bang, Heejung; Kaufman, Jay S

    2012-03-01

    Occupational data are a common source of workplace exposure and socioeconomic information in epidemiologic research. We compared the performance of two occupation coding methods, automated software and a manual coder, using occupation and industry titles from U.S. historical records. We collected parental occupational data from 1920s-1940s birth certificates, Census records, and city directories on 3,135 deceased individuals in the Atherosclerosis Risk in Communities (ARIC) study. Unique occupation-industry narratives were assigned codes by a manual coder and by the Standardized Occupation and Industry Coding software program. We calculated agreement between the coding methods for classification into major Census occupational groups. The automated coding software assigned codes to 71% of occupations and 76% of industries. Of the subset coded by software, 73% of occupation codes and 69% of industry codes matched between automated and manual coding. For major occupational groups, agreement improved to 89% (kappa = 0.86). Automated occupational coding is a cost-efficient alternative to manual coding; however, some manual coding is still required to handle incomplete information. We found substantial variability between coders in the assignment of occupations, although it was not as large for major groups.
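
    The agreement statistics quoted above are plain percent agreement and Cohen's kappa; a compact sketch of both calculations on a toy confusion table follows (the counts are made up).

    import numpy as np

    # Percent agreement and Cohen's kappa from a confusion matrix whose
    # rows are manual codes and columns automated codes (toy counts).

    def agreement_and_kappa(confusion: np.ndarray) -> tuple[float, float]:
        n = confusion.sum()
        p_observed = np.trace(confusion) / n
        # Chance agreement: sum over categories of row share * column share.
        p_expected = (confusion.sum(axis=1) / n) @ (confusion.sum(axis=0) / n)
        kappa = (p_observed - p_expected) / (1.0 - p_expected)
        return p_observed, kappa

    toy = np.array([[80, 5, 3],
                    [6, 60, 4],
                    [2, 5, 35]])
    po, k = agreement_and_kappa(toy)
    print(f"agreement={po:.2f}, kappa={k:.2f}")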

  15. Sediment transport primer: estimating bed-material transport in gravel-bed rivers

    Treesearch

    Peter Wilcock; John Pitlick; Yantao Cui

    2009-01-01

    This primer accompanies the release of BAGS, software developed to calculate sediment transport rate in gravel-bed rivers. BAGS and other programs facilitate calculation and can reduce some errors, but cannot ensure that calculations are accurate or relevant. This primer was written to help the software user define relevant and tractable problems, select appropriate...

  16. Fracture risk assessment: improved evaluation of vertebral integrity among metastatic cancer patients to aid in surgical decision-making

    NASA Astrophysics Data System (ADS)

    Augustine, Kurt E.; Camp, Jon J.; Holmes, David R.; Huddleston, Paul M.; Lu, Lichun; Yaszemski, Michael J.; Robb, Richard A.

    2012-03-01

    Failure of the spine's structural integrity from metastatic disease can lead to both pain and neurologic deficit. Fractures that require treatment occur in over 30% of bony metastases. Our objective is to use computed tomography (CT) in conjunction with analytic techniques that have been previously developed to predict fracture risk in cancer patients with metastatic disease to the spine. Current clinical practice for cancer patients with spine metastasis often requires an empirical decision regarding spinal reconstructive surgery. Early image-based software systems used for CT analysis are time-consuming and poorly suited for clinical application. The Biomedical Image Resource (BIR) at Mayo Clinic, Rochester has developed an image analysis computer program that calculates, from CT scans, the residual load-bearing capacity in a vertebra with metastatic cancer. The Spine Cancer Assessment (SCA) program is built on a platform designed for clinical practice, with a workflow format that allows for rapid selection of patient CT exams, followed by guided image analysis tasks, resulting in a fracture risk report. The analysis features allow the surgeon to quickly isolate a single vertebra and obtain an immediate pre-surgical multiple parallel section composite beam fracture risk analysis based on algorithms developed at Mayo Clinic. The analysis software is undergoing clinical validation studies. We expect this approach will facilitate patient management and the use of reliable guidelines for selecting among various treatment options based on fracture risk.

  17. Nonlinear analysis of the heartbeats in public patient ECGs using an automated PD2i algorithm for risk stratification of arrhythmic death

    PubMed Central

    Skinner, James E; Anchin, Jerry M; Weiss, Daniel N

    2008-01-01

    Heart rate variability (HRV) reflects both cardiac autonomic function and risk of arrhythmic death (AD). Reduced indices of HRV based on linear stochastic models are independent risk factors for AD in post-myocardial infarct cohorts. Indices based on nonlinear deterministic models have a significantly higher sensitivity and specificity for predicting AD in retrospective data. A need exists for nonlinear analytic software easily used by a medical technician. In the current study, an automated nonlinear algorithm, the time-dependent point correlation dimension (PD2i), was evaluated. The electrocardiogram (ECG) data were provided through a National Institutes of Health-sponsored internet archive (PhysioBank) and consisted of all 22 malignant arrhythmia ECG files (VF/VT) and 22 randomly selected arrhythmia files as the controls. The results were blindly calculated by automated software (Vicor 2.0, Vicor Technologies, Inc., Boca Raton, FL) and showed all analyzable VF/VT files had PD2i < 1.4 and all analyzable controls had PD2i > 1.4. Five VF/VT and six controls were excluded because surrogate testing showed the RR-intervals to contain noise, possibly resulting from the low digitization rate of the ECGs. The sensitivity was 100%, specificity 85%, relative risk > 100; p < 0.01, power > 90%. Thus, automated heartbeat analysis by the time-dependent nonlinear PD2i algorithm can accurately stratify risk of AD in public data made available for competitive testing of algorithms. PMID:18728829
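    PD2i itself is a proprietary, time-dependent algorithm; as a rough illustration of the underlying idea only, here is a minimal Grassberger-Procaccia-style correlation-integral sketch on a synthetic RR-interval series (not the Vicor implementation):

    ```python
    import numpy as np

    def correlation_dimension(x, m=3, tau=1, r_fracs=(0.1, 0.2, 0.4)):
        """Estimate a correlation dimension from a scalar series.

        Embeds x in m dimensions with delay tau, computes the correlation
        integral C(r) at several radii, and fits the slope of log C vs log r.
        """
        n = len(x) - (m - 1) * tau
        emb = np.column_stack([x[i * tau:i * tau + n] for i in range(m)])
        d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
        d = d[np.triu_indices(n, k=1)]            # pairwise distances
        radii = np.array(r_fracs) * d.max()
        c = np.array([(d < r).mean() for r in radii])
        slope, _ = np.polyfit(np.log(radii), np.log(c), 1)
        return slope

    rng = np.random.default_rng(0)
    rr = 0.8 + 0.05 * np.sin(np.arange(500) / 5) + 0.01 * rng.standard_normal(500)
    print(correlation_dimension(rr))   # low values suggest deterministic dynamics
    ```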

  18. Using the benchmark dose (BMD) methodology to determine an appropriate reduction of certain ingredients in food products.

    PubMed

    Bi, Jian

    2010-01-01

    As the desire to promote health increases, reductions of certain ingredients in food products, for example sodium, sugar, and fat, are widely requested. However, such reductions are not risk-free in sensory and marketing terms. Over-reduction may change the taste and flavor of a product and lead to a decrease in consumers' overall liking or purchase intent. This article uses the benchmark dose (BMD) methodology to determine an appropriate reduction. Calculations of the BMD and of the one-sided lower confidence limit of the BMD (BMDL) are illustrated. The article also discusses how to calculate the BMD and BMDL for overdispersed binary data in replicated testing, based on a corrected beta-binomial model. The USEPA Benchmark Dose Software (BMDS) was used and S-Plus programs were developed. The method discussed in the article is originally intended to determine an appropriate reduction of certain ingredients, for example sodium, sugar, and fat, in food products, considering both the health rationale and the sensory or marketing risk.
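    A minimal sketch of the benchmark-dose idea for binary response data, assuming a two-parameter logistic dose-response model and a 10% extra-risk benchmark; this is illustrative only, not the BMDS implementation, and the panel counts are invented:

    ```python
    import numpy as np
    from scipy.optimize import minimize, brentq

    # Hypothetical replicated test data: dose = reduction level,
    # k = "noticed a difference / disliked" counts out of n panelists
    dose = np.array([0.0, 0.1, 0.2, 0.3, 0.4])
    k = np.array([3, 5, 9, 14, 21])
    n = np.array([50, 50, 50, 50, 50])

    def nll(theta):
        """Negative binomial log-likelihood for a logistic dose-response."""
        a, b = theta
        p = 1.0 / (1.0 + np.exp(-(a + b * dose)))
        p = np.clip(p, 1e-9, 1 - 1e-9)
        return -np.sum(k * np.log(p) + (n - k) * np.log(1 - p))

    a, b = minimize(nll, x0=[-2.0, 5.0]).x

    def extra_risk(d, bmr=0.10):
        """Extra risk relative to background, minus the target BMR."""
        p0 = 1.0 / (1.0 + np.exp(-a))
        pd = 1.0 / (1.0 + np.exp(-(a + b * d)))
        return (pd - p0) / (1.0 - p0) - bmr

    bmd = brentq(extra_risk, 1e-6, dose.max())   # dose giving 10% extra risk
    print(round(bmd, 3))
    ```

    The BMDL would additionally require a profile-likelihood or bootstrap lower bound, which is omitted here.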

  19. Implementation of density functional theory method on object-oriented programming (C++) to calculate energy band structure using the projector augmented wave (PAW)

    NASA Astrophysics Data System (ADS)

    Alfianto, E.; Rusydi, F.; Aisyah, N. D.; Fadilla, R. N.; Dipojono, H. K.; Martoprawiro, M. A.

    2017-05-01

    This study implemented the DFT method in the C++ programming language with object-oriented programming rules (expressive software). The use of expressive software yields a simple programming structure that closely resembles the mathematical formulation, which makes it easier for the scientific community to develop the software further. We validated the software by calculating the energy band structures of silicon, carbon, and germanium in the FCC structure using the Projector Augmented Wave (PAW) method and comparing the results with Quantum Espresso calculations. This study shows that the accuracy of the software is 85% compared to Quantum Espresso.
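    Band-structure codes of this kind ultimately diagonalize a Hamiltonian in a plane-wave basis; as a toy illustration of the principle only (a 1D "central equation" model, far simpler than a PAW calculation), consider:

    ```python
    import numpy as np

    # Toy 1D band structure: diagonalize H in a plane-wave basis for a
    # periodic potential V(x) = 2*V0*cos(2*pi*x/a) (the "central equation").
    a = 1.0                      # lattice constant
    V0 = 0.5                     # Fourier component of the potential
    nG = 11                      # number of plane waves (odd)
    G = 2 * np.pi / a * (np.arange(nG) - nG // 2)

    def bands(k, nbands=4):
        """Eigenvalues of the plane-wave Hamiltonian at crystal momentum k."""
        H = np.diag(0.5 * (k + G) ** 2)          # kinetic term (hbar = m = 1)
        for i in range(nG - 1):                  # V couples G and G +/- 2*pi/a
            H[i, i + 1] = H[i + 1, i] = V0
        return np.linalg.eigvalsh(H)[:nbands]

    ks = np.linspace(-np.pi / a, np.pi / a, 51)  # first Brillouin zone
    E = np.array([bands(k) for k in ks])
    print(E[25, :2])    # band energies at k = 0
    print(E[0, :2])     # a gap of ~2*V0 opens at the zone boundary
    ```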

  20. FluxPyt: a Python-based free and open-source software for 13C-metabolic flux analyses.

    PubMed

    Desai, Trunil S; Srivastava, Shireesh

    2018-01-01

    13C-Metabolic flux analysis (MFA) is a powerful approach to estimate intracellular reaction rates which could be used in strain analysis and design. Processing and analysis of labeling data for the calculation of fluxes and associated statistics is an essential part of MFA. However, various software packages currently available for data analysis employ proprietary platforms and thus limit accessibility. We developed FluxPyt, a Python-based, truly open-source software package for conducting stationary 13C-MFA data analysis. The software is based on the efficient elementary metabolite unit framework. The standard deviations in the calculated fluxes are estimated using Monte-Carlo analysis. FluxPyt also automatically creates flux maps based on a template for visualization of the MFA results. The flux distributions calculated by FluxPyt for two separate models, a small tricarboxylic acid cycle model and a larger Corynebacterium glutamicum model, were found to be in good agreement with those calculated by previously published software. FluxPyt was tested in Microsoft™ Windows 7 and 10, as well as in Linux Mint 18.2. The availability of a free and open 13C-MFA software package that works in various operating systems will enable more researchers to perform 13C-MFA and to further modify and develop the package.
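    The Monte-Carlo error estimate mentioned above can be sketched generically: refit the model to many noise-perturbed copies of the measurements and take the spread of the refitted parameters. The linear "flux" model below is illustrative only, not FluxPyt's elementary metabolite unit machinery:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical linear measurement model: labeling measurements y = A @ fluxes
    A = np.array([[1.0, 0.0], [1.0, 1.0], [0.0, 2.0]])
    true_flux = np.array([2.0, 0.5])
    sigma = 0.02
    y = A @ true_flux + sigma * rng.standard_normal(3)

    def fit(y_obs):
        """Least-squares flux estimate from one set of measurements."""
        return np.linalg.lstsq(A, y_obs, rcond=None)[0]

    flux_hat = fit(y)

    # Monte-Carlo error propagation: perturb measurements, refit, take std
    samples = np.array([fit(y + sigma * rng.standard_normal(3))
                        for _ in range(2000)])
    print(flux_hat, samples.std(axis=0))   # fluxes and their estimated sd
    ```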

  2. Interactive Software For Astrodynamical Calculations

    NASA Technical Reports Server (NTRS)

    Schlaifer, Ronald S.; Skinner, David L.; Roberts, Phillip H.

    1995-01-01

    The QUICK computer program provides the user with the facilities of a sophisticated desk calculator: it performs scalar, vector, and matrix arithmetic; propagates conic-section orbits; determines planetary and satellite coordinates; and performs other related astrodynamic calculations within a FORTRAN-like software environment. QUICK is an interpreter, so there is no need to use a compiler or linker to run QUICK code. Outputs can be plotted in a variety of formats on a variety of terminals. Written in RATFOR.

  3. A software quality model and metrics for risk assessment

    NASA Technical Reports Server (NTRS)

    Hyatt, L.; Rosenberg, L.

    1996-01-01

    A software quality model and its associated attributes are defined and used as the basis for a discussion of risk. Specific quality goals and attributes are selected based on their importance to a software development project and their ability to be quantified. Risks that can be determined from the model's metrics are identified. A core set of metrics relating to the software development process and its products is defined. Measurements for each metric and their usability and applicability are discussed.

  4. Enigma Version 12

    NASA Technical Reports Server (NTRS)

    Shores, David; Goza, Sharon P.; McKeegan, Cheyenne; Easley, Rick; Way, Janet; Everett, Shonn; Guerra, Mark; Kraesig, Ray; Leu, William

    2013-01-01

    Enigma Version 12 software combines model building, animation, and engineering visualization into one concise software package. Enigma employs a versatile user interface to allow average users access to even the most complex pieces of the application. Using Enigma eliminates the need to buy and learn several software packages to create an engineering visualization. Models can be created and/or modified within Enigma down to the polygon level. Textures and materials can be applied for additional realism. Within Enigma, these models can be combined to create systems of models that have a hierarchical relationship to one another, such as a robotic arm. These systems can then be animated within the program or controlled by an external application programming interface (API). In addition, Enigma provides the ability to use plug-ins, which allow the user to create custom code for a specific application and access the Enigma model and system data while still using the Enigma drawing functionality. CAD files can be imported into Enigma and combined to create systems of computer graphics models that can be manipulated with constraints. An API is available so that an engineer can write a simulation and drive the computer graphics models with no knowledge of computer graphics. An animation editor allows an engineer to set up sequences of animations generated by simulations or by conceptual trajectories and to record these to high-quality media for presentation. Commercially, because it is so generic, Enigma can be used for almost any project that requires engineering visualization, model building, or animation. Models in Enigma can be exported to many other formats for use in other applications as well. Educationally, Enigma is being used to allow university students to visualize robotic algorithms in a simulation mode before using them with actual hardware.
    
    Planetary Protection Bioburden Analysis Program (NASA's Jet Propulsion Laboratory, Pasadena, California): This Microsoft Access program performs statistical analysis of the colony counts from assays performed on the Mars Science Laboratory (MSL) spacecraft to determine the bioburden density, 3-sigma biodensity, and the total bioburdens required for the MSL prelaunch reports. It also contains numerous tools that report the data in various ways to simplify the required reports. The program performs all the calculations directly in MS Access; prior to this development, the data were exported to large Excel files that had to be cut and pasted to produce the desired results. The program contains a main menu and a number of submenus. Analyses can be performed using either all the assays or only the accountable assays that will be used in the final analysis. The first menu offers three options: calculate using (1) the old MER (Mars Exploration Rover) statistics, (2) the MSL statistics for all the assays, or (3) …
    
    Micrometeoroid and Orbital Debris (MMOD) Shield Ballistic Limit Analysis Program (Lyndon B. Johnson Space Center, Houston, Texas): This software implements penetration limit equations for common micrometeoroid and orbital debris (MMOD) shield configurations, windows, and thermal protection systems. Allowable MMOD risk is formulated in terms of the probability of no penetration (PNP) of the spacecraft pressure hull. Calculating the risk requires spacecraft geometry models, mission profiles, debris environment models, and penetration limit equations for the installed shielding configurations. Risk assessment software such as NASA's BUMPERII is used to calculate mission PNP; however, it is unsuitable for shield design and preliminary analysis studies. This software defines a single equation for the design and performance evaluation of common MMOD shielding configurations, windows, and thermal protection systems, along with a description of their validity range and guidelines for their application. Recommendations are based on preliminary reviews of fundamental assumptions and on accuracy in predicting experimental impact test results. The software is programmed in Visual Basic for Applications for installation as a simple add-in for Microsoft Excel. The user is directed to a graphical user interface (GUI) that requires user inputs and provides solutions directly in Microsoft Excel workbooks. This work was done by Shannon Ryan of the USRA Lunar and Planetary Institute for Johnson Space Center.
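    The PNP formulation lends itself to a compact Poisson calculation: if the environment model gives an expected number N of impacts exceeding the shield's ballistic limit over the mission, then PNP = exp(-N). A minimal sketch with entirely hypothetical flux numbers:

    ```python
    import math

    def pnp(flux_per_m2_yr, area_m2, years):
        """Probability of no penetration, assuming penetrating impacts
        arrive as a Poisson process with the given mean rate."""
        expected_penetrations = flux_per_m2_yr * area_m2 * years
        return math.exp(-expected_penetrations)

    # Hypothetical: flux of particles exceeding the shield's ballistic limit
    print(pnp(flux_per_m2_yr=1e-5, area_m2=120.0, years=10))   # ~0.988
    ```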

  5. Continuous Risk Management Course. Revised

    NASA Technical Reports Server (NTRS)

    Hammer, Theodore F.

    1999-01-01

    This document includes a course plan for Continuous Risk Management taught by the Software Assurance Technology Center along with the Continuous Risk Management Guidebook of the Software Engineering Institute of Carnegie Mellon University and a description of Continuous Risk Management at NASA.

  6. Design and implementation of software for automated quality control and data analysis for a complex LC/MS/MS assay for urine opiates and metabolites.

    PubMed

    Dickerson, Jane A; Schmeling, Michael; Hoofnagle, Andrew N; Hoffman, Noah G

    2013-01-16

    Mass spectrometry provides a powerful platform for performing quantitative, multiplexed assays in the clinical laboratory, but at the cost of increased complexity of analysis and quality assurance calculations compared to other methodologies. Here we describe the design and implementation of a software application that performs quality control calculations for a complex, multiplexed, mass spectrometric analysis of opioids and opioid metabolites. The development and implementation of this application improved our data analysis and quality assurance processes in several ways. First, use of the software significantly improved the procedural consistency for performing quality control calculations. Second, it reduced the amount of time technologists spent preparing and reviewing the data, saving on average over four hours per run, and in some cases improving turnaround time by a day. Third, it provides a mechanism for coupling procedural and software changes with the results of each analysis. We describe several key details of the implementation including the use of version control software and automated unit tests. These generally useful software engineering principles should be considered for any software development project in the clinical lab. Copyright © 2012 Elsevier B.V. All rights reserved.
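    The paper's emphasis on automated unit tests can be illustrated with a small, generic example: a QC function (here a hypothetical ion-ratio acceptance check, not the authors' actual rules) paired with a pytest-style test that locks its behavior down:

    ```python
    def ion_ratio_ok(quant_area: float, qual_area: float,
                     expected_ratio: float, tol: float = 0.30) -> bool:
        """Hypothetical QC rule: the qualifier/quantifier ion ratio must fall
        within +/- tol (relative) of the ratio established from calibrators."""
        if quant_area <= 0:
            return False
        ratio = qual_area / quant_area
        return abs(ratio - expected_ratio) <= tol * expected_ratio

    # A unit test coupling the rule to the code base (run with pytest)
    def test_ion_ratio_ok():
        assert ion_ratio_ok(1000.0, 520.0, expected_ratio=0.5)
        assert not ion_ratio_ok(1000.0, 900.0, expected_ratio=0.5)
        assert not ion_ratio_ok(0.0, 100.0, expected_ratio=0.5)   # guard case
    ```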

  7. Initial experience of ArcCHECK and 3DVH software for RapidArc treatment plan verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Infusino, Erminia; Mameli, Alessandra, E-mail: e.infusino@unicampus.it; Conti, Roberto

    2014-10-01

    The purpose of this study was to perform delivery quality assurance with the ArcCHECK and 3DVH system (Sun Nuclear, FL) and to evaluate the suitability of this system for volumetric-modulated arc therapy (VMAT; RapidArc [RA]) verification. This software calculates the delivered dose distributions in patients by perturbing the calculated dose using errors detected in fluence or planar dose measurements. The system was tested for the correlation between the gamma passing rate (%GP) and the composite dose predicted by the 3DVH software. A total of 28 patients with prostate cancer who were treated with RA were analyzed. RA treatments were delivered to a diode array phantom (ArcCHECK), which was used to create a planned dose perturbation (PDP) file. The 3DVH analysis used the dose differences derived from comparing the measured dose with the treatment planning system (TPS)-calculated doses to perturb the initial TPS-calculated dose. The 3DVH then overlays the resultant dose on the patient's structures using the resultant "PDP" beams. Measured dose distributions were compared with the calculated ones using the gamma index (GI) method, applying global (Van Dyk) normalization and 3%/3 mm acceptance criteria. Paired-differences tests were used to estimate the statistical significance of the differences between the composite dose calculated using 3DVH and %GP, and statistical correlation was analyzed by means of logistic regression. Dose-volume histogram (DVH) analysis for patient plans revealed small differences between treatment plan calculations and 3DVH results for organs at risk (OAR), whereas the planning target volume (PTV) dose of the measured plan was systematically higher than that predicted by the TPS. The t-test results between the planned and the estimated DVH values showed that the mean values were not comparable (p < 0.05). The quality assurance (QA) gamma analysis at 3%/3 mm showed only weak-to-moderate correlations in all cases (Pearson r: 0.12 to 0.74). Moreover, clinically relevant differences increased with increasing QA passing rate, indicating that some of the largest dose differences occurred in cases with high QA passing rates, which may be called "false negatives." The clinical importance of any disagreement between the measured and the calculated dose is often difficult to interpret; however, beam errors (either in delivery or in TPS calculation) can affect the effectiveness of the patient dose. Further research is needed to determine the role of a PDP-type algorithm in accurately estimating patient dose effects.

  8. [A quantitative risk assessment model of salmonella on carcass in poultry slaughterhouse].

    PubMed

    Zhang, Yu; Chen, Yuzhen; Hu, Chunguang; Zhang, Huaning; Bi, Zhenwang; Bi, Zhenqiang

    2015-05-01

    To construct a quantitative risk assessment model of Salmonella on carcasses in a poultry slaughterhouse and to find effective interventions to reduce Salmonella contamination, we constructed a modular process risk model (MPRM) from evisceration to chilling in an Excel sheet, using the process parameters in poultry and the Salmonella concentration surveillance data of Jinan in 2012. The MPRM was simulated with @RISK software. The concentration of Salmonella on carcasses after chilling calculated by the model was 1.96 MPN/g. The sensitivity analysis indicated that the correlation coefficients for the concentration of Salmonella after defeathering and in the chilling pool were 0.84 and 0.34, respectively; these were the primary factors determining the concentration of Salmonella on carcasses after chilling. The study provides a quantitative assessment model structure for Salmonella on carcasses in poultry slaughterhouses. Risk managers could control the contamination of Salmonella on carcasses after chilling by reducing the concentration of Salmonella after defeathering and in the chilling pool.
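    A modular process risk model of this kind chains the concentration through process stages and propagates uncertainty by Monte Carlo sampling; a minimal sketch with made-up stage parameters (not the Jinan data):

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    N = 100_000                                   # Monte Carlo iterations

    # Hypothetical stage model: log10 concentration changes per process step
    c0 = rng.normal(1.0, 0.5, N)                  # after defeathering, log10 MPN/g
    evisceration = rng.normal(0.2, 0.1, N)        # cross-contamination adds
    washing = rng.normal(-0.5, 0.2, N)            # washing removes
    chilling = rng.normal(-0.4, 0.3, N)           # chilling-pool step removes

    final_log10 = c0 + evisceration + washing + chilling
    print("mean MPN/g after chilling:", (10.0 ** final_log10).mean().round(2))

    # Sensitivity: correlation of each input with the output, as in the study
    for name, x in [("after defeathering", c0), ("chilling pool", chilling)]:
        r = np.corrcoef(x, final_log10)[0, 1]
        print(name, "correlation:", r.round(2))
    ```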

  9. Gas flow calculation method of a ramjet engine

    NASA Astrophysics Data System (ADS)

    Kostyushin, Kirill; Kagenov, Anuar; Eremin, Ivan; Zhiltsov, Konstantin; Shuvarikov, Vladimir

    2017-11-01

    In the present study, a calculation methodology for the gas dynamics equations in a ramjet engine is presented. The algorithm is based on Godunov's scheme. For the realization of the calculation algorithm, a data storage system is offered that does not depend on mesh topology and allows the use of computational meshes with an arbitrary number of cell faces. An algorithm for building a block-structured grid is given. The calculation algorithm is implemented in the software package "FlashFlow". The software package is verified on calculations of simple air intake configurations and scramjet models.
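    Godunov's scheme advances cell averages using fluxes obtained from Riemann problems at cell interfaces; a minimal sketch for the 1D inviscid Burgers equation (a scalar stand-in for the full gas-dynamics system solved by FlashFlow):

    ```python
    import numpy as np

    def godunov_burgers(u, dx, dt, steps):
        """First-order Godunov scheme for u_t + (u^2/2)_x = 0."""
        f = lambda v: 0.5 * v * v
        for _ in range(steps):
            ul, ur = u[:-1], u[1:]                 # states left/right of interfaces
            # Exact Godunov flux for Burgers, by case on the wave pattern
            flux = np.where(ul > ur,
                            np.maximum(f(ul), f(ur)),            # shock
                            np.where((ul < 0) & (ur > 0), 0.0,   # transonic fan
                                     np.minimum(f(ul), f(ur))))  # rarefaction
            u[1:-1] -= dt / dx * (flux[1:] - flux[:-1])
        return u

    x = np.linspace(0, 1, 201)
    u = np.where(x < 0.5, 1.0, 0.0)    # Riemann initial data: a shock forms
    u = godunov_burgers(u, dx=x[1] - x[0], dt=0.002, steps=100)
    print(u[::40].round(3))            # the shock has moved right at speed ~0.5
    ```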

  10. Risk reduction using DDP (Defect Detection and Prevention): Software support and software applications

    NASA Technical Reports Server (NTRS)

    Feather, M. S.

    2001-01-01

    Risk assessment and mitigation are the focus of the Defect Detection and Prevention (DDP) process, which has been applied to spacecraft technology assessments and planning, for both hardware and software. DDP's major elements and their relevance to core requirements engineering concerns are summarized. The accompanying research demonstration illustrates DDP's tool support and further customizations for application to software.

  11. Emission rate modeling and risk assessment at an automobile plant from painting operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumar, A.; Shrivastava, A.; Kulkarni, A.

    Pollution from painting operations at automobile plants has been addressed in the Clean Air Act Amendments (1990). The estimation of pollutant emissions from automobile painting operations has mostly been done by approximate procedures rather than by actual calculations. The purpose of this study was to develop a methodology for calculating the emissions of pollutants from painting operations in an automobile plant. Five scenarios involving an automobile painting operation located in Columbus (Ohio) were studied for pollutant emissions and the concomitant risk. In the risk study, a sensitivity analysis was done using Crystal Ball® on the parameters involved in the risk; this software uses the Monte Carlo principle. The most sensitive factor in the risk analysis was the ground-level concentration of the pollutants. All scenarios studied met the safety goal (a risk value of 1 × 10⁻⁶) with different confidence levels. The highest level of confidence in meeting the safety goal was displayed by Scenario 1 (Alpha Industries). The results from the scenarios suggest that risk is associated with the quantity of released toxic pollutants. The sensitivity analysis of the various parameters shows that the average spray rate of paint is the most important parameter in the estimation of pollutants from the painting operations. The entire study is a complete module that can be used by environmental pollution control agencies for the estimation of pollution levels and associated risk. The study can be further extended to other operations in the automobile industry or to different industries.

  12. Role of Volatility in the Development of JP-8 Surrogates for Diesel Engine Application

    DTIC Science & Technology

    2014-01-01

    distillation curves of the surrogate fuels were calculated using the Aspen HYSYS [41] software package, and the Peng-Robinson model was chosen to... distillation curves for the surrogate fuels developed in this investigation, the accuracy of Aspen HYSYS software predictions was compared with... and SF3. The distillation curves calculated using Aspen HYSYS software for the five surrogate fuels of Table 1 are shown in Figure 7, along with the

  13. Software risk estimation and management techniques at JPL

    NASA Technical Reports Server (NTRS)

    Hihn, J.; Lum, K.

    2002-01-01

    In this talk we discuss how uncertainty has been incorporated into the JPL software model through probabilistic-based estimates, how risk is addressed, and how cost risk is currently being explored via a variety of approaches, from traditional risk lists, to detailed WBS-based risk estimates, to the Defect Detection and Prevention (DDP) tool.

  14. An open source software for fast grid-based data-mining in spatial epidemiology (FGBASE).

    PubMed

    Baker, David M; Valleron, Alain-Jacques

    2014-10-30

    Examining whether disease cases are clustered in space is an important part of epidemiological research. Another important part of spatial epidemiology is testing whether patients suffering from a disease are more, or less, exposed to environmental factors of interest than adequately defined controls. Both approaches involve determining the number of cases and controls (or population at risk) in specific zones. For cluster searches, this often must be done for millions of different zones, and doing it by calculating distances can lead to very lengthy computations. In this work we discuss the computational advantages of geographical grid-based methods and introduce an open-source software package (FGBASE) which we have created for this purpose. Geographical grids based on the Lambert Azimuthal Equal Area projection are well suited for spatial epidemiology because they preserve area: each cell of the grid has the same area. We describe how data are projected onto such a grid, as well as grid-based algorithms for spatial epidemiological data-mining. The software program (FGBASE) that we have developed implements these grid-based methods. The grid-based algorithms perform extremely fast, particularly for cluster searches. When applied to a cohort of French Type 1 Diabetes (T1D) patients, as an example, the grid-based algorithms detected potential clusters in a few seconds on a modern laptop. This compares very favorably with an equivalent cluster search using distance calculations instead of a grid, which took over 4 hours on the same computer. In the case study we discovered 4 potential clusters of T1D cases near the cities of Le Havre, Dunkerque, Toulouse and Nantes. One example of environmental analysis with our software was to study whether a significant association could be found with proximity to vineyards with heavy pesticide use; none was found. In both examples, the software facilitates the rapid testing of hypotheses. Grid-based algorithms for mining spatial epidemiological data provide advantages in terms of computational complexity, thus improving the speed of computations. We believe that these methods and this software tool (FGBASE) will lower the computational barriers to entry for those performing epidemiological research.
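    The grid idea can be sketched in a few lines with pyproj, using EPSG:3035 (a European Lambert Azimuthal Equal Area grid; FGBASE's own projection parameters may differ). Counting cases per equal-area cell is O(n), versus O(n²) for pairwise distance computations:

    ```python
    from collections import Counter
    from pyproj import Transformer

    # WGS84 lon/lat -> Lambert Azimuthal Equal Area (equal-area cells)
    to_laea = Transformer.from_crs("EPSG:4326", "EPSG:3035", always_xy=True)
    CELL = 10_000.0   # 10 km x 10 km grid cells

    def cell_of(lon, lat):
        """Grid cell index (ix, iy) containing a point."""
        x, y = to_laea.transform(lon, lat)
        return int(x // CELL), int(y // CELL)

    # Hypothetical case locations (lon, lat)
    cases = [(2.35, 48.85), (2.36, 48.86), (-0.57, 44.84)]
    counts = Counter(cell_of(lon, lat) for lon, lat in cases)
    print(counts.most_common(1))   # densest cell -> cluster candidate
    ```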

  15. Validation of DNA-based identification software by computation of pedigree likelihood ratios.

    PubMed

    Slooten, K

    2011-08-01

    Disaster victim identification (DVI) can be aided by DNA evidence, by comparing the DNA profiles of unidentified individuals with those of surviving relatives. The DNA evidence is used optimally when such a comparison is done by calculating the appropriate likelihood ratios. Though conceptually simple, the calculations can be quite involved, especially with large pedigrees, precise mutation models, etc. In this article we describe a series of test cases designed to check whether software designed to calculate such likelihood ratios computes them correctly. The cases include both simple and more complicated pedigrees, including inbred ones. We show how to calculate the likelihood ratio numerically and algebraically, including a general mutation model and the possibility of allelic dropout. In Appendix A we show how to derive such algebraic expressions mathematically. We have set up these cases to validate new software, called Bonaparte, which performs pedigree likelihood ratio calculations in a DVI context. Bonaparte has been developed by SNN Nijmegen (The Netherlands) for the Netherlands Forensic Institute (NFI). It is available free of charge for non-commercial purposes (see www.dnadvi.nl for details). Commercial licenses can also be obtained. The software uses Bayesian networks and the junction tree algorithm to perform its calculations. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
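    The flavor of such a likelihood-ratio calculation can be shown with the textbook single-locus paternity index, assuming Hardy-Weinberg equilibrium and no mutations or dropout; this is a drastic simplification, not Bonaparte's Bayesian-network computation:

    ```python
    from itertools import product

    def genotype_prob(g, freq):
        """Hardy-Weinberg probability of an unordered genotype."""
        a, b = g
        return freq[a] ** 2 if a == b else 2 * freq[a] * freq[b]

    def p_child_given_parent(child, parent, freq):
        """P(child genotype | one parent's genotype), other parent random."""
        total = 0.0
        for transmitted, other in product(parent, freq):
            if sorted((transmitted, other)) == sorted(child):
                total += 0.5 * freq[other]    # each parental allele: prob 1/2
        return total

    def paternity_lr(child, alleged_parent, freq):
        """LR = P(child | alleged parent) / P(child | unrelated person)."""
        return p_child_given_parent(child, alleged_parent, freq) / \
               genotype_prob(child, freq)

    freq = {"A": 0.1, "B": 0.3, "C": 0.6}     # hypothetical allele frequencies
    print(paternity_lr(("A", "B"), ("A", "C"), freq))   # 1/(4*0.1) = 2.5
    ```

    Multi-locus likelihood ratios multiply across independent loci; mutation models and dropout replace the exact Mendelian terms with transition probabilities.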

  16. The NUKDOS software for treatment planning in molecular radiotherapy.

    PubMed

    Kletting, Peter; Schimmel, Sebastian; Hänscheid, Heribert; Luster, Markus; Fernández, Maria; Nosske, Dietmar; Lassmann, Michael; Glatting, Gerhard

    2015-09-01

    The aim of this work was the development of a software tool for treatment planning prior to molecular radiotherapy, which comprises all functionality needed to objectively determine the activity to administer and the pertaining absorbed doses (including the corresponding error) based on a series of gamma camera images and one SPECT/CT or probe data set. NUKDOS was developed in MATLAB. The workflow is based on the MIRD formalism. For the determination of the tissue or organ pharmacokinetics, gamma camera images as well as probe, urine, serum and blood activity data can be processed. To estimate the time-integrated activity coefficients (TIAC), sums of exponentials are fitted to the time-activity data and integrated analytically. To obtain the TIAC on the voxel level, the voxel activity distribution from the quantitative 3D SPECT/CT (or PET/CT) is used for scaling and weighting the TIAC derived from the 2D organ data. The voxel S-values are automatically calculated based on the voxel size of the image and the therapeutic nuclide ((90)Y, (131)I or (177)Lu). The absorbed dose coefficients are computed by convolution of the voxel TIAC and the voxel S-values. The activity to administer and the pertaining absorbed doses are determined by entering the absorbed dose for the organ at risk. The overall error of the calculated absorbed doses is determined by Gaussian error propagation. NUKDOS was tested on the operating systems Windows® 7 (64 bit) and 8 (64 bit). The results of each working step were compared to commercially available (SAAMII, OLINDA/EXM) and in-house (UlmDOS) software. The application of the software is demonstrated using examples from peptide receptor radionuclide therapy (PRRT) and from radioiodine therapy of benign thyroid diseases. For the PRRT example, the calculated activity to administer differed by 4% between NUKDOS and the final result of using UlmDOS, SAAMII and OLINDA/EXM sequentially. The absorbed doses for the spleen and tumour differed by 7% and 8%, respectively. The results from the radioiodine therapy example and the example given in the latest corresponding SOP were identical. The implemented, objective methods facilitate accurate and reproducible results. The software is freely available. Copyright © 2015. Published by Elsevier GmbH.
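    The TIAC step can be sketched generically: fit an exponential model to time-activity data and integrate it analytically (a single exponential here, with invented data points; NUKDOS supports general sums of exponentials):

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Hypothetical gamma-camera time-activity data for one organ
    t = np.array([1.0, 4.0, 24.0, 48.0, 96.0])     # hours post-injection
    a = np.array([0.35, 0.33, 0.21, 0.12, 0.04])   # fraction of injected activity

    def model(t, a0, lam):
        """Mono-exponential time-activity curve."""
        return a0 * np.exp(-lam * t)

    (a0, lam), _ = curve_fit(model, t, a, p0=(0.4, 0.02))

    # Time-integrated activity coefficient: the integral of a0*exp(-lam*t)
    # from 0 to infinity has the closed form a0/lam (in hours)
    tiac = a0 / lam
    print(round(tiac, 1), "h")
    ```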

  17. QuBiLS-MIDAS: a parallel free-software for molecular descriptors computation based on multilinear algebraic maps.

    PubMed

    García-Jacas, César R; Marrero-Ponce, Yovani; Acevedo-Martínez, Liesner; Barigye, Stephen J; Valdés-Martiní, José R; Contreras-Torres, Ernesto

    2014-07-05

    The present report introduces the QuBiLS-MIDAS software, belonging to the ToMoCoMD-CARDD suite, for the calculation of three-dimensional molecular descriptors (MDs) based on two-linear (bilinear), three-linear, and four-linear (multilinear or N-linear) algebraic forms. It is thus unique software that computes these tensor-based indices. These descriptors establish relations among two, three, and four atoms by using several (dis-)similarity metrics or multimetrics, matrix transformations, cutoffs, local calculations and aggregation operators. The theoretical background of these N-linear indices is also presented. The QuBiLS-MIDAS software was developed in the Java programming language and employs the Chemistry Development Kit library for the manipulation of chemical structures and the calculation of atomic properties. The software comprises a user-friendly desktop interface and an Abstract Programming Interface library. The former was created to simplify the configuration of the different options of the MDs, whereas the library was designed to allow its easy integration into other software for chemoinformatics applications. The program provides functionality for data cleaning tasks and for batch processing of the molecular indices. In addition, it offers parallel calculation of the MDs through the use of all available processors in current computers. Complexity studies of the main algorithms demonstrate that they were implemented efficiently relative to a trivial implementation, and performance tests reveal that the software behaves well as the number of processors increases. The QuBiLS-MIDAS software therefore constitutes a useful application for the computation of molecular indices based on N-linear algebraic maps, and it can be used freely to perform chemoinformatics studies. Copyright © 2014 Wiley Periodicals, Inc.
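    The bilinear form at the heart of such indices is simply xᵀMy over atomic property vectors; a toy sketch with hypothetical property values and a plain topological-distance matrix standing in for QuBiLS-MIDAS's many matrix transforms:

    ```python
    import numpy as np

    # Toy 4-atom molecule: topological (graph) distance matrix
    M = np.array([[0, 1, 2, 3],
                  [1, 0, 1, 2],
                  [2, 1, 0, 1],
                  [3, 2, 1, 0]], dtype=float)

    # Hypothetical atomic property vectors
    x = np.array([2.55, 3.04, 2.55, 3.44])   # e.g., electronegativities
    y = np.array([12.0, 14.0, 12.0, 16.0])   # e.g., atomic masses

    # Bilinear (two-linear) molecular descriptor: x^T M y
    descriptor = x @ M @ y
    print(descriptor)
    ```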

  18. Addressing software security risk mitigations in the life cycle

    NASA Technical Reports Server (NTRS)

    Gilliam, David; Powell, John; Haugh, Eric; Bishop, Matt

    2003-01-01

    The NASA Office of Safety and Mission Assurance (OSMA) has funded the Jet Propulsion Laboratory (JPL) with a Center Initiative, 'Reducing Software Security Risk through an Integrated Approach' (RSSR), to address this need. The Initiative is a formal approach to addressing software security in the life cycle through the instantiation of a Software Security Assessment Instrument (SSAI) for the development and maintenance life cycles.

  19. Analysis and calculation of macrosegregation in a casting ingot. MPS solidification model. Volume 2: Software documentation

    NASA Technical Reports Server (NTRS)

    Maples, A. L.

    1980-01-01

    The software developed for the solidification model is presented. A link between the calculations and the FORTRAN code is provided, primarily in the form of global flow diagrams and data structures. A complete listing of the solidification code is given.

  20. Improvements to NASA's Debris Assessment Software

    NASA Technical Reports Server (NTRS)

    Opiela, J.; Johnson, Nicholas L.

    2007-01-01

    NASA's Debris Assessment Software (DAS) has been substantially revised and expanded. DAS is designed to assist NASA programs in performing orbital debris assessments, as described in NASA's Guidelines and Assessment Procedures for Limiting Orbital Debris. The extensive upgrade of DAS was undertaken to reflect changes in the debris mitigation guidelines, to incorporate recommendations from DAS users, and to take advantage of recent software capabilities for greater user utility. DAS 2.0 includes an updated environment model and enhanced orbital propagators and reentry-survivability models. The ORDEM96 debris environment model has been replaced by ORDEM2000 in DAS 2.0, which is also designed to accept anticipated revisions to the environment definition. Numerous upgrades have also been applied to the assessment of human casualty potential due to reentering debris. Routines derived from the Object Reentry Survival Analysis Tool, Version 6 (ORSAT 6), determine which objects are assessed to survive reentry, and the resulting risk of human casualty is calculated directly based upon the orbital inclination and a future world population database. When evaluating reentry risks, the user may enter up to 200 unique hardware components for each launched object, in up to four nested levels. This last feature allows the software to more accurately model components that are exposed below the initial breakup altitude. The new DAS 2.0 provides an updated set of tools for users to assess their mission's compliance with the NASA Safety Standard and does so with a clear and easy-to-understand interface. The new native Microsoft Windows graphical user interface (GUI) is a vast improvement over the previous DOS-based interface. In the new version, functions are more clearly laid out, and the GUI includes the standard Windows-style Help functions. The underlying routines within the DAS code are also improved.
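    The human-casualty calculation mentioned above can be sketched as a simple expectation: total debris casualty area times the population density under the ground track (inclination- and year-dependent in DAS; fixed, hypothetical values here, with a common convention of inflating each fragment by a human cross-section radius):

    ```python
    import math

    def casualty_expectation(fragment_areas_m2, pop_density_per_km2):
        """Expected number of casualties from surviving reentry debris.

        Each fragment's area is inflated by a nominal human cross-section
        radius, then multiplied by the average population density under
        the orbit's ground track (held constant here for illustration).
        """
        human_radius = 0.3   # m, nominal human cross-section radius
        total = 0.0
        for a in fragment_areas_m2:
            r = math.sqrt(a / math.pi)
            casualty_area = math.pi * (r + human_radius) ** 2
            total += casualty_area * pop_density_per_km2 / 1e6   # density per m^2
        return total

    # Hypothetical surviving fragments and an average population density
    print(casualty_expectation([0.5, 0.2, 0.1], pop_density_per_km2=15.0))
    ```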

  2. Hemodynamics model of fluid-solid interaction in internal carotid artery aneurysms.

    PubMed

    Bai-Nan, Xu; Fu-Yu, Wang; Lei, Liu; Xiao-Jun, Zhang; Hai-Yue, Ju

    2011-01-01

    The objective of this study is to present a relatively simple method to reconstruct cerebral aneurysms as 3D numerical grids. The method accurately duplicates the geometry to provide computer simulations of the blood flow. Initial images were obtained using CT angiography and 3D digital subtraction angiography in DICOM format. The images were processed using MIMICS software, and the 3D fluid model (blood flow) and 3D solid model (wall) were generated. The output was exported to the ANSYS Workbench software to generate the volumetric mesh for further hemodynamic study. The fluid model was defined and simulated in CFX software while the solid model was calculated in ANSYS software. The force data calculated first in CFX were transferred to ANSYS; after receiving the force data, ANSYS calculated the total mesh displacement data, which were then transferred back to CFX. The data exchange was processed in the Workbench software, and the results of the simulation could be visualized in CFX-Post. Two examples of grid reconstruction and blood flow simulation for patients with internal carotid artery aneurysms are presented; the wall shear stress, wall total pressure, and von Mises stress could be visualized. This method seems to be relatively simple and suitable for direct use by neurosurgeons or neuroradiologists, and may be a practical tool for planning treatment and follow-up of patients after neurosurgical or endovascular interventions with 3D angiography.

  3. Software analysis handbook: Software complexity analysis and software reliability estimation and prediction

    NASA Technical Reports Server (NTRS)

    Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron

    1994-01-01

    This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.

  4. ORAM-SENTINEL{trademark} demonstration at Fitzpatrick. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, L.K.; Anderson, V.M.; Mohammadi, K.

    1998-06-01

    New York Power Authority, in cooperation with EPRI, installed the ORAM-SENTINEL{trademark} software at James A. Fitzpatrick (JAF) Nuclear Power Plant. This software incorporates models of safety systems and support systems that are used for defense-in-depth in the plant during outage and on-line periods. A secondary goal was to include some pre-analyzed risk results to validate the methodology for quantitative assessment of the plant risks during proposed on-line maintenance. During the past year, New York Power Authority personnel have become familiar with the formal computerized Safety Assessment process associated with on-line and outage maintenance. The report describes techniques and lessons learned during development of the ORAM-SENTINEL model at JAF. It overviews the systems important to the Safety Function Assessment Process and provides details on development of the Plant Transient Assessment process using the station emergency operating procedures. The assessment results are displayed by color (green, yellow, orange, red) to show decreasing safety conditions. The report describes use of the JAF Probabilistic Safety Assessment within the ORAM-SENTINEL code to calculate an instantaneous core damage frequency and the criteria by which this frequency is translated to a color indicator.

  5. Software Acquisition Risk Management Key Process Area (KPA) - A Guidebook Version 1.0.

    DTIC Science & Technology

    1997-08-01

    Budget - Software Project Management Practices and Techniques. McGraw-Hill International (UK) Limited, 1992. Boehm, Barry. Software Engineering Economics. Englewood Cliffs, N.J.: Prentice-Hall, Inc., 1981. Boehm, Barry. IEEE Tutorial on Software Risk Management. New York: IEEE, 1989.

  6. "Do-it-yourself" software program calculates boiler efficiency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1984-03-01

    An easy-to-use software package is described which runs on the IBM Personal Computer. The package calculates boiler efficiency, an important parameter of operating costs and equipment well-being. The program stores inputs and calculated results for 20 sets of boiler operating data, called cases. Cases can be displayed and modified on the CRT screen through multiple display pages or copied to a printer. All intermediate calculations are performed by the package; they include steam enthalpy; water enthalpy; air humidity; gas, oil, coal, and wood heat capacity; and radiation losses.
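    The core input-output efficiency calculation such a package performs is simple; a minimal sketch with hypothetical enthalpy and flow values standing in for the package's steam-table routines:

    ```python
    def boiler_efficiency(steam_flow_kg_h, h_steam_kj_kg, h_feedwater_kj_kg,
                          fuel_flow_kg_h, fuel_hhv_kj_kg):
        """Input-output method: useful heat to steam / heat input from fuel."""
        heat_out = steam_flow_kg_h * (h_steam_kj_kg - h_feedwater_kj_kg)
        heat_in = fuel_flow_kg_h * fuel_hhv_kj_kg
        return heat_out / heat_in

    # Hypothetical operating case: 10 t/h of steam
    eff = boiler_efficiency(steam_flow_kg_h=10_000, h_steam_kj_kg=2_778,
                            h_feedwater_kj_kg=420, fuel_flow_kg_h=620,
                            fuel_hhv_kj_kg=50_000)
    print(f"{eff:.1%}")   # ~76%
    ```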

  7. Software - Naval Oceanography Portal

    Science.gov Websites

    USNO Earth Orientation software page (Home › USNO › Earth Orientation › Software), listing auxiliary and supporting software, including an Earth Orientation Matrix Calculator.

  8. Comparison of Numerical Analyses with a Static Load Test of a Continuous Flight Auger Pile

    NASA Astrophysics Data System (ADS)

    Hoľko, Michal; Stacho, Jakub

    2014-12-01

    The article deals with numerical analyses of a Continuous Flight Auger (CFA) pile. The analyses include a comparison of calculated and measured load-settlement curves as well as a comparison of the load distribution over the pile's length. The numerical analyses were executed using two types of software, Ansys and Plaxis, both based on FEM calculations. The two programs differ in the way they create numerical models, model the interface between the pile and the soil, and in the constitutive material models they use. The analyses were prepared in the form of a parametric study in which the method of modelling the interface and the material models of the soil are compared and analysed. Our analyses show that both types of software permit the modelling of pile foundations. The Plaxis software offers advanced material models as well as the modelling of the effects of groundwater or overconsolidation. The load-settlement curve calculated using Plaxis matches the results of a static load test with more than 95% accuracy. In comparison, the load-settlement curve calculated using Ansys provides only an approximate estimate, but the software allows the common modelling of large structural systems together with a foundation system.

  9. Evaluation of 3D Gamma index calculation implemented in two commercial dosimetry systems

    NASA Astrophysics Data System (ADS)

    Xing, Aitang; Arumugam, Sankar; Deshpande, Shrikant; George, Armia; Vial, Philip; Holloway, Lois; Goozee, Gary

    2015-01-01

    The 3D Gamma index is one of the metrics that have been widely used for routine clinical patient-specific quality assurance for IMRT, Tomotherapy and VMAT. The algorithms for calculating the 3D Gamma index using global and local methods, as implemented in two software tools, PTW-VeriSoft® (part of the OCTAVIUS 4D dosimeter system) and 3DVH™ from Sun Nuclear, were assessed. The Gamma index calculated by the two systems was compared with a manual calculation for one data set. The Gamma pass rates calculated by the two systems were compared using 3%/3mm, 2%/2mm, 3%/2mm and 2%/3mm criteria for two additional data sets. The Gamma indexes calculated by the two systems were accurate, but the Gamma pass rates calculated by the two software tools for the same data set with the same dose threshold differed, owing to different interpolation of the raw dose data and different implementations of the Gamma index calculation and other modules in the two tools. The mean difference was -1.3% ± 3.38 (1 SD), with a maximum difference of 11.7%.
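    The gamma index combines a dose-difference and a distance-to-agreement criterion; a brute-force 1D global-gamma sketch at 3%/3 mm, on synthetic profiles (real tools work on 2D/3D grids with interpolation, which is exactly where the implementations above diverge):

    ```python
    import numpy as np

    def gamma_1d(x_ref, d_ref, x_eval, d_eval, dd=0.03, dta=3.0):
        """Global 1D gamma: dd is the dose criterion (fraction of the max
        reference dose), dta the distance criterion in mm."""
        dmax = d_ref.max()
        gammas = np.empty_like(d_ref)
        for i, (xr, dr) in enumerate(zip(x_ref, d_ref)):
            dist2 = ((x_eval - xr) / dta) ** 2
            dose2 = ((d_eval - dr) / (dd * dmax)) ** 2
            gammas[i] = np.sqrt((dist2 + dose2).min())
        return gammas

    x = np.linspace(0, 100, 201)                    # positions in mm
    ref = 100 * np.exp(-((x - 50) / 20) ** 2)       # reference profile
    evl = 100.5 * np.exp(-((x - 50.8) / 20) ** 2)   # slightly shifted/scaled
    g = gamma_1d(x, ref, x, evl)
    print(f"pass rate: {(g <= 1).mean():.1%}")      # fraction with gamma <= 1
    ```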

  10. Global assessment of human losses due to earthquakes

    USGS Publications Warehouse

    Silva, Vitor; Jaiswal, Kishor; Weatherill, Graeme; Crowley, Helen

    2014-01-01

    Current studies have demonstrated a sharp increase in human losses due to earthquakes. These alarming levels of casualties suggest the need for large-scale investment in seismic risk mitigation, which, in turn, requires an adequate understanding of the extent of the losses, and location of the most affected regions. Recent developments in global and uniform datasets such as instrumental and historical earthquake catalogues, population spatial distribution and country-based vulnerability functions, have opened an unprecedented possibility for a reliable assessment of earthquake consequences at a global scale. In this study, a uniform probabilistic seismic hazard assessment (PSHA) model was employed to derive a set of global seismic hazard curves, using the open-source software OpenQuake for seismic hazard and risk analysis. These results were combined with a collection of empirical fatality vulnerability functions and a population dataset to calculate average annual human losses at the country level. The results from this study highlight the regions/countries in the world with a higher seismic risk, and thus where risk reduction measures should be prioritized.
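    The final step, combining a hazard curve with a fatality vulnerability function to obtain average annual human losses, can be sketched as a discrete sum; all numbers below are hypothetical, not OpenQuake output:

    ```python
    import numpy as np

    # Hypothetical seismic hazard curve: annual rate of exceeding each PGA level
    pga = np.array([0.1, 0.2, 0.3, 0.4, 0.5])              # g
    annual_exceed_rate = np.array([0.05, 0.01, 3e-3, 1e-3, 3e-4])

    # Hypothetical fatality vulnerability: deaths per exposed person at each PGA
    fatality_rate = np.array([1e-6, 1e-5, 1e-4, 5e-4, 2e-3])
    population = 1_000_000

    # Occurrence rate per PGA bin = difference of successive exceedance rates
    occ_rate = -np.diff(np.append(annual_exceed_rate, 0.0))

    aal = (occ_rate * fatality_rate * population).sum()
    print(f"average annual human losses: {aal:.1f}")
    ```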

  11. [A case-control study on the risk factors of esophageal cancer in Linzhou].

    PubMed

    Lu, J; Lian, S; Sun, X; Zhang, Z; Dai, D; Li, B; Cheng, L; Wei, J; Duan, W

    2000-12-01

    To explore the characteristics of prevalence and the factors influencing the genesis of esophageal cancer, a population-based 1:1 matched case-control study was conducted in Linzhou, with a total of 352 pairs of cases and controls matched on sex, age and neighborhood. Data were analysed with SAS software to calculate odds ratios and evaluate relative risks. It was found that lower socio-economic status, environmental pollution around the residential areas, indoor lampblack (cooking fumes), lower body mass index (BMI), higher pickled food intake, cigarette smoking, alcohol drinking, severe mental trauma and depression were risk factors for esophageal cancer. Subjects with a history of upper digestive tract operations, dysplasia of the esophagus, or a family history of carcinoma also had markedly increased risks of developing esophageal cancer. Esophageal cancer appeared to result from a combination of genetic and environmental factors, calling for medical surveillance and comprehensive prevention.
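    For 1:1 matched case-control data, the odds ratio comes from the discordant pairs (the conditional, McNemar-type estimate OR = b/c); a small sketch with hypothetical pair counts, not the Linzhou data:

    ```python
    # 1:1 matched case-control analysis: only discordant pairs are informative.
    # b = pairs where the case was exposed and the control was not
    # c = pairs where the control was exposed and the case was not
    def matched_or(b: int, c: int) -> float:
        """Conditional (McNemar) odds ratio estimate for 1:1 matched pairs."""
        return b / c

    # Hypothetical exposure counts among 352 pairs (e.g., pickled food intake)
    b, c = 96, 40
    print(matched_or(b, c))   # OR = 2.4
    ```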

  12. Space Station: NASA's software development approach increases safety and cost risks. Report to the Chairman, Committee on Science, Space, and Technology, House of Representatives

    NASA Astrophysics Data System (ADS)

    1992-06-01

    The House Committee on Science, Space, and Technology asked NASA to study software development issues for the space station, and in particular how well NASA has implemented key software engineering practices for the station. Specifically, the objectives were to determine: (1) whether independent verification and validation techniques are being used to ensure that critical software meets specified requirements and functions; (2) whether NASA has incorporated software risk management techniques into the program; (3) whether standards are in place that will prescribe a disciplined, uniform approach to software development; and (4) whether software support tools will help, as intended, to maximize efficiency in developing and maintaining the software. To meet these objectives, the work proceeded by: (1) reviewing and analyzing software development objectives and strategies contained in NASA conference publications; (2) reviewing and analyzing NASA, other government, and industry guidelines for establishing good software development practices; (3) reviewing and analyzing technical proposals and contracts; (4) reviewing and analyzing software management plans, risk management plans, and program requirements; (5) reviewing and analyzing reports prepared by NASA and contractor officials that identified key issues and challenges facing the program; (6) obtaining expert opinions on what constitutes appropriate independent V&V and software risk management activities; (7) interviewing program officials at NASA headquarters in Washington, DC; at the Space Station Program Office in Reston, Virginia; and at the three work package centers: Johnson in Houston, Texas; Marshall in Huntsville, Alabama; and Lewis in Cleveland, Ohio; and (8) interviewing contractor officials doing work for NASA at Johnson and Marshall. The audit work was performed between April 1991 and May 1992, in accordance with generally accepted government auditing standards.

  13. NASA's Approach to Software Assurance

    NASA Technical Reports Server (NTRS)

    Wetherholt, Martha

    2015-01-01

    NASA defines software assurance as: the planned and systematic set of activities that ensure conformance of software life cycle processes and products to requirements, standards, and procedures via quality, safety, reliability, and independent verification and validation. NASA's implementation of this approach to the quality, safety, reliability, security and verification and validation of software is brought together in one discipline, software assurance. Organizationally, NASA has software assurance at each NASA center, a Software Assurance Manager at NASA Headquarters, a Software Assurance Technical Fellow (currently the same person as the SA Manager), and an Independent Verification and Validation Organization with its own facility. As an umbrella risk mitigation strategy for safety and mission success assurance of NASA's software, software assurance covers a wide area and is well structured to address the dynamic changes in how software is developed, used, and managed, as well as its increasingly complex functionality. Being flexible, risk based, and prepared for challenges in software at NASA is essential, especially as much of our software is unique for each mission.

  14. Allergenic weed pollen forecast under the mathematical modeling method implementation in ukraine.

    PubMed

    Motruk, Irina I; Antomonov, Michael Yu; Rodinkova, Victoria V; Aleksandrova, Olena E; Yermishev, Oleh V

    2018-01-01

    Introduction: Allergies are the most common cause of chronic disease in developed countries and represent an important medical, social and economic issue whose relevance is growing both in these countries and in Ukraine. The best known of this group of allergens are ragweed (Ambrosia) pollen and grass (Poaceae) pollen, which are ubiquitous in subtropical and temperate climates. The aim: The objective of our study was to develop mathematical models that indicate the probability of pollen circulation and can thus simplify the forecasting of symptom risk and improve the prophylaxis of pollinosis. Materials and methods: The research was conducted at the research center of National Pirogov Memorial Medical University, Vinnytsia, in the years 2012-2014. A volumetric sampler of the Hirst type was used for air sampling. Observation was conducted from the first of April to the thirty-first of October. For the initial preparation of the tables and intermediate calculations, the Excel software package was used. The STATISTICA 10.0 software was applied to calculate the average coefficient values and their statistical characteristics (beta values, errors of the mean values, Student's t-test, significance, and the percentage contribution of each factor to the variation of the function). Results: A statistically significant correlation between pollen concentrations of herbaceous plants and individual meteorological factors was found; classification functions were designed by which the probability of the presence or absence of Artemisia pollen in the atmosphere can be calculated; the risk of an increase in Artemisia pollen concentration rises when a critical temperature of 18°C, relative humidity of 67% and atmospheric pressure of 980 hPa are exceeded. Conclusions: The results of the research can be used to predict the emission of potentially hazardous concentrations of weed pollen grains in the atmosphere of the central region of Ukraine using the weather forecast.
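    A minimal sketch of such a classification model: a logistic function of temperature, humidity, and pressure predicting the probability of pollen presence. The coefficients below are invented for illustration, not the study's fitted values:

    ```python
    import math

    def pollen_presence_prob(temp_c, rel_hum, pressure_hpa):
        """Hypothetical logistic classification function for Artemisia pollen.
        Coefficients are illustrative, not the study's fitted values."""
        z = (-40.0
             + 0.9 * temp_c          # warmer days favor pollen release
             + 0.15 * rel_hum        # kept linear here for simplicity
             + 0.012 * pressure_hpa)
        return 1.0 / (1.0 + math.exp(-z))

    # Near the thresholds reported in the study (18 C, 67%, ~980 hPa)
    print(round(pollen_presence_prob(18.0, 67.0, 980.0), 2))
    print(round(pollen_presence_prob(25.0, 70.0, 985.0), 2))
    ```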

  15. Applications of Earth Observations for Fisheries Management: An analysis of socioeconomic benefits

    NASA Astrophysics Data System (ADS)

    Friedl, L.; Kiefer, D. A.; Turner, W.

    2013-12-01

    This paper will discuss the socioeconomic impacts of a project applying Earth observations and models to support management and conservation of tuna and other marine resources in the eastern Pacific Ocean. A project team created a software package that produces statistical analyses and dynamic maps of habitat for pelagic ocean biota. The tool integrates sea surface temperature and chlorophyll imagery from MODIS, ocean circulation models, and other data products. The project worked with the Inter-American Tropical Tuna Commission, which issues fishery management information, such as stock assessments, for the eastern Pacific region. The Commission uses the tool and broader habitat information to produce better estimates of stock and thus improve their ability to identify species that could be at risk of overfishing. The socioeconomic analysis quantified the relative value that Earth observations contributed to accurate stock size assessments through improvements in calculating population size. The analysis team calculated the first-order economic costs of a fishery collapse (or shutdown), and they calculated the benefits of improved estimates that reduce the uncertainty of stock size and thus reduce the risk of fishery collapse. The team estimated that the project reduced the probability of collapse of different fisheries, and the analysis generated net present values of risk mitigation. USC led the project with sponsorship from the NASA Earth Science Division's Applied Sciences Program, which conducted the socioeconomic impact analysis. The paper will discuss the project and focus primarily on the analytic methods, impact metrics, and the results of the socioeconomic benefits analysis.

  16. Design of a Software for Calculating Isoelectric Point of a Polypeptide According to Their Net Charge Using the Graphical Programming Language LabVIEW

    ERIC Educational Resources Information Center

    Tovar, Glomen

    2018-01-01

    Software to calculate the net charge and predict the isoelectric point (pI) of a polypeptide is developed in this work using the graphical programming language LabVIEW. Through this instrument, the net charges of the ionizable residues of the protein chains are calculated at different pH values and tabulated, the pI is predicted, and an Excel…
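
    The underlying calculation is a Henderson-Hasselbalch sum of fractional charges over the ionizable groups, with the pI found where the net charge crosses zero. A minimal sketch, using one common textbook pKa set (real tools differ in their pKa tables):

    ```python
    PKA_POS = {"Nterm": 9.0, "K": 10.5, "R": 12.5, "H": 6.0}           # +1 when protonated
    PKA_NEG = {"Cterm": 2.0, "D": 3.9, "E": 4.1, "C": 8.3, "Y": 10.1}  # -1 when deprotonated

    def net_charge(seq, ph):
        charge = 1 / (1 + 10 ** (ph - PKA_POS["Nterm"]))
        charge -= 1 / (1 + 10 ** (PKA_NEG["Cterm"] - ph))
        for aa in seq:
            if aa in PKA_POS:
                charge += 1 / (1 + 10 ** (ph - PKA_POS[aa]))
            elif aa in PKA_NEG:
                charge -= 1 / (1 + 10 ** (PKA_NEG[aa] - ph))
        return charge

    def isoelectric_point(seq, lo=0.0, hi=14.0, tol=1e-4):
        # Bisection works because net charge decreases monotonically with pH.
        while hi - lo > tol:
            mid = (lo + hi) / 2
            lo, hi = (mid, hi) if net_charge(seq, mid) > 0 else (lo, mid)
        return round((lo + hi) / 2, 2)

    print(isoelectric_point("ACDKEH"))
    ```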

  17. [Prospective performance evaluation of first trimester screenings in Germany for risk calculation through http://www.firsttrimester.net].

    PubMed

    Kleinsorge, F; Smetanay, K; Rom, J; Hörmansdörfer, C; Hörmannsdörfer, C; Scharf, A; Schmidt, P

    2010-12-01

    In 2008, 2,351 first trimester screenings were calculated by a newly developed internet database ( http://www.firsttrimester.net ) to evaluate the risk for the presence of Down's syndrome. All data were evaluated both by the conventional first trimester screening according to Nicolaides (FTS), based on the previous JOY software, and by the advanced first trimester screening (AFS). After feedback of the karyotype was received, the rates of correct positives, correct negatives, false positives and false negatives, as well as the sensitivity and specificity, were calculated and compared. Overall, 255 cases that had been analysed by both methods were investigated. These included 2 cases of Down's syndrome and one case of trisomy 18. Both the FTS and the AFS had a sensitivity of 100%. The specificity was 88.5% for the FTS and 93.0% for the AFS. As already shown in former studies, the higher specificity of the AFS is the result of a reduction of the false positive rate (from 28 to 17 cases). As a consequence, the AFS, with a detection rate of 100%, decreases the rate of further invasive diagnostics in pregnant women by yielding 39% fewer positive-tested women. © Georg Thieme Verlag KG Stuttgart · New York.
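
    The sensitivity and specificity comparison is simple contingency arithmetic. In the sketch below the counts are inferred from the abstract (3 affected among the 255 jointly analysed cases, all detected; 28 vs 17 false positives); the resulting specificities land near, but not exactly at, the reported 88.5% and 93.0% because the abstract does not give the full contingency table.

    ```python
    def sens_spec(tp, fn, fp, tn):
        return tp / (tp + fn), tn / (tn + fp)

    affected, total = 3, 255              # 2 trisomy 21 + 1 trisomy 18
    unaffected = total - affected
    for label, fp in (("FTS", 28), ("AFS", 17)):
        se, sp = sens_spec(affected, 0, fp, unaffected - fp)
        print(f"{label}: sensitivity {se:.0%}, specificity {sp:.1%}")
    ```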

  18. The SCEC Broadband Platform: Open-Source Software for Strong Ground Motion Simulation and Validation

    NASA Astrophysics Data System (ADS)

    Goulet, C.; Silva, F.; Maechling, P. J.; Callaghan, S.; Jordan, T. H.

    2015-12-01

    The Southern California Earthquake Center (SCEC) Broadband Platform (BBP) is a carefully integrated collection of open-source scientific software programs that can simulate broadband (0-100 Hz) ground motions for earthquakes at regional scales. The BBP scientific software modules implement kinematic rupture generation, low- and high-frequency seismogram synthesis using wave propagation through 1D layered velocity structures, seismogram ground-motion amplitude calculations, and goodness-of-fit measurements. These modules are integrated into a software system that provides user-defined, repeatable calculation of ground-motion seismograms, using multiple alternative ground-motion simulation methods, together with software utilities that can generate plots, charts, and maps. The BBP has been developed over the last five years in a collaborative scientific, engineering, and software development project involving geoscientists, earthquake engineers, graduate students, and SCEC scientific software developers. The BBP can run earthquake rupture and wave propagation modeling software to simulate ground motions for well-observed historical earthquakes and to quantify how well the simulated broadband seismograms match the observed seismograms. The BBP can also run simulations for hypothetical earthquakes. In this case, users input an earthquake location and magnitude description, a list of station locations, and a 1D velocity model for the region of interest, and the BBP software then calculates ground motions for the specified stations. The SCEC BBP software released in 2015 can be compiled and run on recent Linux systems with GNU compilers. It includes 5 simulation methods, 7 simulation regions covering California, Japan, and Eastern North America, the ability to compare simulation results against GMPEs, updated ground motion simulation methods, and a simplified command line user interface.

  19. Development and prospective evaluation of an automated software system for quality control of quantitative 99mTc-MAG3 renal studies.

    PubMed

    Folks, Russell D; Garcia, Ernest V; Taylor, Andrew T

    2007-03-01

    Quantitative nuclear renography has numerous potential sources of error. We previously reported the initial development of a computer software module for comprehensively addressing the issue of quality control (QC) in the analysis of radionuclide renal images. The objective of this study was to prospectively test the QC software. The QC software works in conjunction with standard quantitative renal image analysis using a renal quantification program. The software saves a text file that summarizes QC findings as possible errors in user-entered values, calculated values that may be unreliable because of the patient's clinical condition, and problems relating to acquisition or processing. To test the QC software, a technologist not involved in software development processed 83 consecutive nontransplant clinical studies. The QC findings of the software were then tabulated. QC events were defined as technical (study descriptors that were out of range or were entered and then changed, unusually sized or positioned regions of interest, or missing frames in the dynamic image set) or clinical (calculated functional values judged to be erroneous or unreliable). Technical QC events were identified in 36 (43%) of 83 studies. Clinical QC events were identified in 37 (45%) of 83 studies. Specific QC events included starting the camera after the bolus had reached the kidney, dose infiltration, oversubtraction of background activity, and missing frames in the dynamic image set. QC software has been developed to automatically verify user input, monitor calculation of renal functional parameters, summarize QC findings, and flag potentially unreliable values for the nuclear medicine physician. Incorporation of automated QC features into commercial or local renal software can reduce errors and improve technologist performance and should improve the efficiency and accuracy of image interpretation.
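
    A minimal sketch of the two kinds of checks described, with invented field names and limits (not the module's actual rules): range verification of user-entered values yields technical QC events, and implausible calculated values yield clinical QC events.

    ```python
    PLAUSIBLE = {"height_cm": (100, 220), "weight_kg": (20, 250), "dose_MBq": (20, 400)}

    def qc_check(entered, calculated):
        findings = []
        for field, value in entered.items():
            lo, hi = PLAUSIBLE[field]
            if not lo <= value <= hi:
                findings.append(f"technical: {field}={value} outside [{lo}, {hi}]")
        if calculated["relative_uptake_pct"] < 5:
            findings.append("clinical: relative uptake <5% may be unreliable")
        return findings

    print(qc_check({"height_cm": 17, "weight_kg": 80, "dose_MBq": 150},
                   {"relative_uptake_pct": 3}))   # height likely a typo for 170
    ```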

  20. Evaluating Predictive Models of Software Quality

    NASA Astrophysics Data System (ADS)

    Ciaschini, V.; Canaparo, M.; Ronchieri, E.; Salomoni, D.

    2014-06-01

    Applications from the High Energy Physics scientific community are constantly growing and are implemented by a large number of developers. This implies a strong churn on the code and an associated risk of faults, which is unavoidable as long as the software undergoes active evolution. However, the necessities of production systems run counter to this: stability and predictability are of paramount importance, and a short turn-around time for the defect discovery-correction-deployment cycle is required. A way to reconcile these opposing foci is to use a software quality model to obtain an approximation of the risk before releasing a program, so as to deliver only software with a risk lower than an agreed threshold. In this article we evaluate two predictive quality models to identify the operational risk and the quality of some software products. We applied these models to the development history of several EMI packages with the intent of discovering the risk factor of each product and comparing it with its real history. We attempted to determine whether the models reasonably map reality for the applications under evaluation, and we conclude by suggesting directions for further studies.

  1. Structure simulation with calculated NMR parameters - integrating COSMOS into the CCPN framework.

    PubMed

    Schneider, Olaf; Fogh, Rasmus H; Sternberg, Ulrich; Klenin, Konstantin; Kondov, Ivan

    2012-01-01

    The Collaborative Computing Project for NMR (CCPN) has built a software framework consisting of the CCPN data model (with APIs) for NMR-related data, the CcpNmr Analysis program and additional tools such as CcpNmr FormatConverter. The open architecture allows for the integration of external software to extend the abilities of the CCPN framework with additional calculation methods. Recently, we have carried out the first steps toward integrating our software, Computer Simulation of Molecular Structures (COSMOS), into the CCPN framework. The COSMOS-NMR force field unites quantum chemical routines for the calculation of molecular properties with a molecular mechanics force field yielding the relative molecular energies. COSMOS-NMR allows NMR parameters to be introduced as constraints into molecular mechanics calculations. The resulting infrastructure will be made available to the NMR community. As a first application we have tested the evaluation of calculated protein structures using COSMOS-derived 13C Cα and Cβ chemical shifts. In this paper we give an overview of the methodology and a roadmap for future developments and applications.

  2. The COOLER Code: A Novel Analytical Approach to Calculate Subcellular Energy Deposition by Internal Electron Emitters.

    PubMed

    Siragusa, Mattia; Baiocco, Giorgio; Fredericia, Pil M; Friedland, Werner; Groesser, Torsten; Ottolenghi, Andrea; Jensen, Mikael

    2017-08-01

    COmputation Of Local Electron Release (COOLER) is a software program designed for dosimetry assessment at the cellular/subcellular scale for a given distribution of administered low-energy electron-emitting radionuclides in cellular compartments, which remains a critical step in risk/benefit analysis for advancements in internal radiotherapy. The software is intended to overcome the main limitations of the medical internal radiation dose (MIRD) formalism for calculations of cellular S-values (i.e., dose to a target region in the cell per decay in a given source region), namely, the use of the continuous slowing down approximation (CSDA) and the assumption of a spherical cell geometry. To this aim, we developed an analytical approach, entrusted to a MATLAB-based program, using as input simulated data for electron spatial energy deposition directly derived from full Monte Carlo track structure calculations with PARTRAC. Results from PARTRAC calculations on electron range, stopping power and residual energy versus traveled distance curves are presented and, where useful for implementation in COOLER, analytical fit functions are given. Example configurations for cells in different culture conditions (V79 cells in suspension or adherent culture) with realistic geometrical parameters are implemented in the tool. Finally, cellular S-value predictions by the newly developed code are presented for different cellular geometries and activity distributions (uniform activity in the nucleus, in the entire cell or on the cell surface), validated against full Monte Carlo calculations with PARTRAC, and compared to MIRD standards, as well as to results based on different track structure calculations (Geant4-DNA). The largest discrepancies between COOLER and MIRD predictions were generally found for electrons between 25 and 30 keV, where the magnitude of disagreement in S-values can vary from 50 to 100%, depending on the activity distribution. In calculations for activity distributed on the cell surface, MIRD predictions failed the most. The proposed method is suitable for Auger-cascade electrons, but can be extended to any energy of interest and to beta spectra; as an example, the ³H case is also discussed. COOLER is intended to be accessible to everyone (preclinical and clinical researchers included), and may provide important information for the selection of radionuclides, the interpretation of radiobiological or preclinical results, and the general establishment of doses in any scenario, e.g., with cultured cells in the laboratory or in therapeutic or diagnostic applications. The software will be made available for download from the DTU-Nutech website: http://www.nutech.dtu.dk/.

  3. Development of a software based automatic exposure control system for use in image guided radiation therapy

    NASA Astrophysics Data System (ADS)

    Morton, Daniel R.

    Modern image guided radiation therapy involves the use of an isocentrically mounted imaging system to take radiographs of a patient's position before the start of each treatment. Image guidance helps to minimize errors associated with patient setup, but the radiation dose received by patients from imaging must be managed to ensure no additional risks. The Varian On-Board Imager (OBI) (Varian Medical Systems, Inc., Palo Alto, CA) does not have an automatic exposure control system and therefore requires exposure factors to be selected manually. Without patient-specific exposure factors, images may become saturated and require multiple unnecessary exposures. A software-based automatic exposure control system has been developed to predict optimal, patient-specific exposure factors. The OBI system was modelled in terms of the x-ray tube output and detector response in order to calculate the level of detector saturation for any exposure situation. Digitally reconstructed radiographs are produced via ray-tracing through the patients' volumetric datasets that are acquired for treatment planning. The ray-trace determines the attenuation of the patient and the subsequent x-ray spectra incident on the imaging detector. The resulting spectra are used in the detector response model to determine the exposure levels required to minimize detector saturation. Images calculated for various phantoms showed good agreement with the images acquired on the OBI. Overall, regions of detector saturation were accurately predicted, and the detector response for non-saturated regions in images of an anthropomorphic phantom was calculated to be generally within 5 to 10% of the measured values. Calculations performed on patient data gave results similar to those for the phantom images, with the calculated images able to determine detector saturation in close agreement with images acquired during treatment. Overall, it was shown that the system model and calculation method could potentially be used to predict patients' exposure factors before their treatment begins, thus preventing the need for multiple exposures.
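
    The prediction chain can be illustrated with a deliberately simplified, hypothetical stand-in for the full model: exponential attenuation through a thickness map in place of the spectral ray-trace, and a linear detector with a 14-bit full well in place of the OBI calibration.

    ```python
    import numpy as np

    MU_EFF = 0.02        # assumed effective attenuation coefficient, 1/mm
    GAIN = 10_000        # assumed detector counts per mAs at zero thickness
    FULL_WELL = 16_383   # 14-bit detector saturates at this count

    def detector_image(thickness_mm, mas):
        return GAIN * mas * np.exp(-MU_EFF * thickness_mm)

    def choose_exposure(thickness_mm, candidates=(4, 2, 1, 0.5)):
        # Candidates ordered high to low: return the largest setting whose
        # brightest (thinnest-path) pixel stays below saturation.
        for mas in candidates:
            if detector_image(thickness_mm, mas).max() < FULL_WELL:
                return mas
        return min(candidates)

    thickness = np.array([[5.0, 80.0], [150.0, 220.0]])  # thin edges saturate first
    print(choose_exposure(thickness), "mAs")
    ```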

  4. Improving Software Engineering on NASA Projects

    NASA Technical Reports Server (NTRS)

    Crumbley, Tim; Kelly, John C.

    2010-01-01

    Software Engineering Initiative: reduces the risk of software failure and increases mission safety; provides more predictable software cost estimates and delivery schedules; makes NASA a smarter buyer of contracted-out software; finds and removes more defects earlier; reduces duplication of effort between projects; and increases the ability to meet the challenges of evolving software technology.

  5. Software Assurance Competency Model

    DTIC Science & Technology

    2013-03-01

    COTS) software, and software as a service (SaaS). L2: Define and analyze risks in the acquisition of contracted software, COTS software, and SaaS. ... [2010a]: Application of technologies and processes to achieve a required level of confidence that software systems and services function in the...

  6. Specification and Verification of Medical Monitoring System Using Petri-nets.

    PubMed

    Majma, Negar; Babamir, Seyed Morteza

    2014-07-01

    To monitor patient behavior, data are collected from the patient's body by a medical monitoring device, and embedded software calculates the output. Incorrect calculations may endanger the patient's life if the software fails to meet the patient's requirements. Accordingly, the veracity of the software behavior is a matter of concern in medicine; moreover, the data collected from the patient's body are fuzzy. Some methods have already dealt with monitoring medical monitoring devices; however, model-based monitoring of the fuzzy computations of such devices has been addressed less. The present paper presents the synthesis of a fuzzy Petri-net (FPN) model to verify the behavior of a sample medical monitoring device, a continuous insulin (INS) infusion device, because the Petri-net (PN) is one of the formal and visual methods for verifying software behavior. The device is worn by diabetic patients, and the software calculates the INS dose and makes a decision for injection. The input and output of the infusion INS software are not crisp in the real world; therefore, we represent them as fuzzy variables. Afterwards, we use an FPN instead of a crisp PN to model the fuzzy variables. The paper follows three steps to synthesize an FPN for verification of the infusion INS device: (1) definition of fuzzy variables, (2) definition of fuzzy rules and (3) design of the FPN model to verify the software behavior.
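
    A toy illustration of step (1), fuzzifying a crisp reading into linguistic terms and firing weighted rules; the membership functions, rules and doses are invented, and the paper's actual verification is performed on a fuzzy Petri-net model rather than by direct rule evaluation like this.

    ```python
    def tri(x, a, b, c):
        """Triangular membership function on [a, c] peaking at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def insulin_dose(glucose_mg_dl):
        low = tri(glucose_mg_dl, 40, 70, 100)
        normal = tri(glucose_mg_dl, 80, 110, 140)
        high = tri(glucose_mg_dl, 120, 200, 280)
        # Each rule's firing strength weights its crisp output dose (units).
        doses, weights = [0.0, 1.0, 4.0], [low, normal, high]
        return sum(d * w for d, w in zip(doses, weights)) / (sum(weights) or 1.0)

    print(f"{insulin_dose(160):.2f} units")
    ```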

  7. Clinical risk analysis with failure mode and effect analysis (FMEA) model in a dialysis unit.

    PubMed

    Bonfant, Giovanna; Belfanti, Pietro; Paternoster, Giuseppe; Gabrielli, Danila; Gaiter, Alberto M; Manes, Massimo; Molino, Andrea; Pellu, Valentina; Ponzetti, Clemente; Farina, Massimo; Nebiolo, Pier E

    2010-01-01

    The aim of clinical risk management is to improve the quality of care provided by health care organizations and to assure patients' safety. Failure mode and effect analysis (FMEA) is a tool employed for clinical risk reduction. We applied FMEA to chronic hemodialysis outpatients. The FMEA steps were: (i) Process study: we recorded phases and activities. (ii) Hazard analysis: we listed activity-related failure modes and their effects, described control measures, assigned severity, occurrence and detection scores for each failure mode, and calculated the risk priority numbers (RPNs) by multiplying the 3 scores; the total RPN is calculated by adding the RPNs of the single failure modes. (iii) Planning: we prioritized RPNs on a priority matrix taking into account the 3 scores, analyzed the causes of the failure modes, made recommendations and planned new control measures. (iv) Monitoring: after failure mode elimination or reduction, we compared the resulting RPN with the previous one. Our failure modes with the highest RPNs came from communication and organization problems. Two tools have been created to improve information flow: "dialysis agenda" software and nursing datasheets. We scheduled nephrological examinations, and we changed both the medical and the nursing organization. The total RPN value decreased from 892 to 815 (8.6%) after reorganization. Employing FMEA, we worked on a few critical activities and reduced patients' clinical risk. A priority matrix also takes into account the weight of the control measures: we believe this evaluation is quick, because of simple priority selection, and that it decreases action times.
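
    The RPN arithmetic in step (ii) is easily sketched; the failure modes and scores below are invented examples, not the unit's data.

    ```python
    failure_modes = {                           # name: (severity, occurrence, detection)
        "prescription transcribed incorrectly": (8, 3, 4),
        "incomplete nursing handover":          (5, 5, 3),
        "missed nephrological examination":     (6, 4, 2),
    }

    rpns = {name: s * o * d for name, (s, o, d) in failure_modes.items()}
    for name, rpn in sorted(rpns.items(), key=lambda kv: -kv[1]):
        print(f"RPN {rpn:3d}  {name}")
    print("total RPN:", sum(rpns.values()))
    ```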

  8. Semantic Metrics for Analysis of Software

    NASA Technical Reports Server (NTRS)

    Etzkorn, Letha H.; Cox, Glenn W.; Farrington, Phil; Utley, Dawn R.; Ghalston, Sampson; Stein, Cara

    2005-01-01

    A recently conceived suite of object-oriented software metrics focuses on semantic aspects of software, in contradistinction to traditional software metrics, which focus on syntactic aspects of software. Semantic metrics represent a more human-oriented view of software than do syntactic metrics. The semantic metrics of a given computer program are calculated by use of the output of a knowledge-based analysis of the program, and are substantially more representative of software quality and more readily comprehensible from a human perspective than are the syntactic metrics.

  9. Computerized assessment of placental calcification post-ultrasound: a novel software tool.

    PubMed

    Moran, M; Higgins, M; Zombori, G; Ryan, J; McAuliffe, F M

    2013-05-01

    Placental calcification is associated with an increased risk of perinatal morbidity and mortality. The subjectivity of current ultrasound methods of assessment of placental calcification indicates that a more objective method is required. The aim of this study was to correlate the percentage of calcification defined by the clinician using a new software tool for calculating the extent of placental calcification with traditional ultrasound methods and with pregnancy outcome. Ninety placental images were individually assessed. An upper threshold was defined, based on high intensity, to quantify calcification within the placenta. Output metrics were then produced including the overall percentage of calcification with respect to the total number of pixels within the region of interest. The results were correlated with traditional ultrasound methods of assessment of placental calcification and with pregnancy outcome. The results demonstrate a significant correlation between placental calcification, as defined using the software, and traditional methods of Grannum grading of placental calcification. Whilst correlation with perinatal outcome and cord pH was not significant as a result of small numbers, patients with placental calcification assessed using the computerized software at the upper quartile had higher rates of poor perinatal outcome when compared with those at the lower quartile (8/22 (36%) vs 3/23 (13%); P = 0.069). These results suggest that this computerized software tool has the potential to become an alternative method of assessing placental calcification. Copyright © 2012 ISUOG. Published by John Wiley & Sons Ltd.
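
    The output metric reduces to counting pixels above an upper intensity threshold within the region of interest. A minimal sketch with an arbitrary threshold and a synthetic ROI:

    ```python
    import numpy as np

    def calcification_percent(roi_pixels, threshold=200):
        roi = np.asarray(roi_pixels)
        return 100.0 * np.count_nonzero(roi > threshold) / roi.size

    roi = np.random.default_rng(0).integers(0, 256, size=(128, 128))
    print(f"{calcification_percent(roi):.1f}% of ROI pixels above threshold")
    ```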

  10. Bioinactivation: Software for modelling dynamic microbial inactivation.

    PubMed

    Garre, Alberto; Fernández, Pablo S; Lindqvist, Roland; Egea, Jose A

    2017-03-01

    This contribution presents the bioinactivation software, which implements functions for the modelling of isothermal and non-isothermal microbial inactivation. This software offers features such as user-friendliness, modelling of dynamic conditions, possibility to choose the fitting algorithm and generation of prediction intervals. The software is offered in two different formats: Bioinactivation core and Bioinactivation SE. Bioinactivation core is a package for the R programming language, which includes features for the generation of predictions and for the fitting of models to inactivation experiments using non-linear regression or a Markov Chain Monte Carlo algorithm (MCMC). The calculations are based on inactivation models common in academia and industry (Bigelow, Peleg, Mafart and Geeraerd). Bioinactivation SE supplies a user-friendly interface to selected functions of Bioinactivation core, namely the model fitting of non-isothermal experiments and the generation of prediction intervals. The capabilities of bioinactivation are presented in this paper through a case study, modelling the non-isothermal inactivation of Bacillus sporothermodurans. This study has provided a full characterization of the response of the bacteria to dynamic temperature conditions, including confidence intervals for the model parameters and a prediction interval of the survivor curve. We conclude that the MCMC algorithm produces a better characterization of the biological uncertainty and variability than non-linear regression. The bioinactivation software can be relevant to the food and pharmaceutical industry, as well as to regulatory agencies, as part of a (quantitative) microbial risk assessment. Copyright © 2017 Elsevier Ltd. All rights reserved.
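
    For orientation, a minimal sketch of the isothermal Bigelow (log-linear) model, the simplest of the models the package implements; bioinactivation itself also fits dynamic, non-isothermal data and produces MCMC-based intervals, which this toy least-squares fit does not.

    ```python
    import numpy as np

    t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])     # minutes (invented data)
    logN = np.array([6.0, 5.1, 3.9, 3.1, 2.0])  # log10 survivors (invented)

    # Bigelow model: log10 N(t) = log10 N0 - t / D
    slope, logN0 = np.polyfit(t, logN, 1)
    print(f"log10 N0 = {logN0:.2f}, D-value = {-1.0 / slope:.2f} min")
    ```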

  11. Risk-Based Object Oriented Testing

    NASA Technical Reports Server (NTRS)

    Rosenberg, Linda H.; Stapko, Ruth; Gallo, Albert

    2000-01-01

    Software testing is a well-defined phase of the software development life cycle. Functional ("black box") testing and structural ("white box") testing are two methods of test case design commonly used by software developers. A lesser known testing method is risk-based testing, which takes into account the probability of failure of a portion of code as determined by its complexity. For object oriented programs, a methodology is proposed for identification of risk-prone classes. Risk-based testing is a highly effective testing technique that can be used to find and fix the most important problems as quickly as possible.

  12. Process air quality data

    NASA Technical Reports Server (NTRS)

    Butler, C. M.; Hogge, J. E.

    1978-01-01

    Air quality sampling was conducted. Data for air quality parameters, recorded on written forms, punched cards or magnetic tape, are available for 1972 through 1975. Computer software was developed to (1) calculate several daily statistical measures of location, (2) plot time histories of the data or the calculated daily statistics, (3) calculate simple correlation coefficients, and (4) plot scatter diagrams. Computer software was developed for processing air quality data to include time series analysis and goodness of fit tests. Computer software was developed to (1) calculate a larger number of daily statistical measures of location, and a number of daily, monthly and yearly measures of location, dispersion, skewness and kurtosis, (2) decompose the extended time series model and (3) perform some goodness of fit tests. The computer program is described, documented and illustrated by examples. Recommendations are made for continuation of the development of research on processing air quality data.

  13. HSTRESS: A computer program to calculate the height of a hydraulic fracture in a multi-layered stress medium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warpinski, N.R.

    A computer code for calculating hydraulic fracture height and width in a stressed-layer medium has been modified for easy use on a personal computer. HSTRESS allows for up to 51 layers having different thicknesses, stresses and fracture toughnesses. The code can calculate fracture height versus pressure or pressure versus fracture height, depending on the design model in which the data will be used. At any pressure/height, a width profile is calculated and an equivalent width factor and flow resistance factor are determined. This program is written in FORTRAN. Graphics use PLOT88 software by Plotworks, Inc., but the graphics software must be obtained by the user because of licensing restrictions. A version without graphics can also be run. This code is available through the National Energy Software Center (NESC), operated by Argonne National Laboratory. 14 refs., 21 figs.

  14. Parametric Design and Mechanical Analysis of Beams based on SINOVATION

    NASA Astrophysics Data System (ADS)

    Xu, Z. G.; Shen, W. D.; Yang, D. Y.; Liu, W. M.

    2017-07-01

    In engineering practice, engineers need to carry out complicated calculations when the loads on a beam are complex. These analysis and calculation processes take a lot of time, and the results can be unreliable. Therefore, VS2005 and ADK were used to develop software for beam design based on the 3D CAD software SINOVATION, using the C++ programming language. The software can perform the mechanical analysis and parameterized design of various types of beams and output the design report in HTML format. The efficiency and reliability of beam design are improved.
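
    As a flavour of the formulas such software automates, a worked check for a simply supported beam under a uniformly distributed load, with arbitrary example inputs:

    ```python
    w = 10e3      # distributed load, N/m
    L = 6.0       # span, m
    E = 210e9     # Young's modulus (steel), Pa
    I = 8.356e-5  # second moment of area (approx. IPE 300 section), m^4

    M_max = w * L**2 / 8                      # maximum bending moment, N*m
    delta_max = 5 * w * L**4 / (384 * E * I)  # midspan deflection, m

    print(f"M_max = {M_max / 1e3:.1f} kN*m, deflection = {delta_max * 1e3:.1f} mm")
    ```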

  15. A new Web-based medical tool for assessment and prevention of comprehensive cardiovascular risk

    PubMed Central

    Franchi, Daniele; Cini, Davide; Iervasi, Giorgio

    2011-01-01

    Background: Multifactorial cardiovascular disease is the leading cause of death; besides the well-known cardiovascular risk factors, several emerging factors, such as mental stress, diet type and physical inactivity, have been associated with cardiovascular disease. To date, preventive strategies are based on the concept of absolute risk calculated by different algorithms and scoring systems. However, in general practice the collection of patient data represents a critical issue. Design: A new multipurpose computer-based program has been developed in order to: 1) easily calculate and compare the absolute cardiovascular risk by the Framingham, Procam, and Progetto Cuore algorithms; 2) provide a web-based computerized tool for the prospective collection of structured data; and 3) support the doctor in the decision-making process for patients at risk according to recent international guidelines. Methods: During a medical consultation the doctor uses a common computer connected via the Internet to a medical server where all the patient's data and software reside. The program evaluates absolute and relative cardiovascular risk factors, personalized patient goals, and multiparametric trends, monitors critical parameter values, and generates an automated medical report. Results: In a pilot study on 294 patients (47% males; mean age 60 ± 12 years [±SD]), the global time to collect data at the first consultation was 13 ± 11 minutes, which declined to 8 ± 7 minutes at the subsequent consultation. In 48.2% of cases the program revealed 2 or more primary risk factor parameters outside guideline indications and gave specific clinical suggestions to return the altered parameters to target values. Conclusion: The web-based system proposed here may represent a feasible and flexible tool for the clinical management of patients at risk of cardiovascular disease and for epidemiological research. PMID:21445280
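
    The algorithms the tool compares share the same Cox-type functional form, risk = 1 - S0 ** exp(sum_i beta_i * (x_i - xbar_i)). A sketch with placeholder baseline survival, coefficients and means, not the published Framingham, Procam or Progetto Cuore values:

    ```python
    import math

    S0 = 0.95                               # assumed 10-year baseline survival
    BETA = {"age": 0.05, "sbp": 0.015, "chol": 0.006, "smoker": 0.6}
    XBAR = {"age": 55, "sbp": 130, "chol": 200, "smoker": 0.2}

    def ten_year_risk(patient):
        lp = sum(BETA[k] * (patient[k] - XBAR[k]) for k in BETA)
        return 1 - S0 ** math.exp(lp)

    patient = {"age": 61, "sbp": 145, "chol": 230, "smoker": 1}
    print(f"10-year risk: {ten_year_risk(patient):.1%}")
    ```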

  16. A GPU Simulation Tool for Training and Optimisation in 2D Digital X-Ray Imaging.

    PubMed

    Gallio, Elena; Rampado, Osvaldo; Gianaria, Elena; Bianchi, Silvio Diego; Ropolo, Roberto

    2015-01-01

    Conventional radiology is performed by means of digital detectors, with various types of technology and different performance in terms of efficiency and image quality. Following the arrival of a new digital detector in a radiology department, all the staff involved should adapt the procedure parameters to the properties of the detector in order to achieve an optimal result in terms of correct diagnostic information and minimum radiation risk for the patient. The aim of this study was to develop and validate software capable of simulating a digital X-ray imaging system, using graphics processing unit computing. All radiological image components were implemented in this application: an X-ray tube with primary beam, a virtual patient, noise, scatter radiation, a grid and a digital detector. Three different digital detectors (two digital radiography systems and a computed radiography system) were implemented. In order to validate the software, we carried out a quantitative comparison of simulated images of geometrical and anthropomorphic phantoms with acquired images. In terms of average pixel values, the maximum differences were below 15%, while the noise values were in agreement within a maximum difference of 20%. The relative trends of contrast-to-noise ratio versus beam energy and intensity were well simulated. Total calculation times were below 3 seconds for clinical images with actual pixel dimensions of less than 0.2 mm. The application proved to be efficient and realistic. Short calculation times and the accuracy of the results obtained make this software a useful tool for training operators and for dose optimisation studies.

  17. Stochastic Modeling of Radioactive Material Releases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrus, Jason; Pope, Chad

    2015-09-01

    Nonreactor nuclear facilities operated under the approval authority of the U.S. Department of Energy use unmitigated hazard evaluations to determine if potential radiological doses associated with design basis events challenge or exceed dose evaluation guidelines. Unmitigated design basis events that sufficiently challenge dose evaluation guidelines, or exceed the guidelines for members of the public or workers, merit selection of safety structures, systems, or components or other controls to prevent or mitigate the hazard. Idaho State University, in collaboration with Idaho National Laboratory, has developed a portable and simple-to-use software application called SODA (Stochastic Objective Decision-Aide) that stochastically calculates the radiation dose associated with hypothetical radiological material release scenarios. Rather than producing a point estimate of the dose, SODA produces a dose distribution result to allow a deeper understanding of the dose potential. SODA allows users to select the distribution type and parameter values for all of the input variables used to perform the dose calculation. SODA then randomly samples each distribution input variable and calculates the overall resulting dose distribution. In cases where an input variable distribution is unknown, a traditional single point value can be used. SODA was developed using the MATLAB coding framework. The software application has a graphical user interface. SODA can be installed on both Windows and Mac computers and does not require MATLAB to function. SODA provides improved risk understanding, leading to better informed decision making associated with establishing nuclear facility material-at-risk limits and safety structure, system, or component selection. It is important to note that SODA does not replace or compete with codes such as MACCS or RSAC; rather, it is viewed as an easy-to-use supplemental tool to help improve risk understanding and support better informed decisions. The work was funded through a grant from the DOE Nuclear Safety Research and Development Program.
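
    The stochastic approach is easy to sketch: sample each input of a simple release-to-dose chain from a user-chosen distribution and report a dose distribution instead of a point estimate. The dose model and every parameter below are illustrative assumptions, not SODA's actual models.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    N = 100_000

    mar = rng.uniform(0.8, 1.2, N)        # material at risk, arbitrary activity units
    release_frac = rng.beta(2, 8, N)      # fraction of material released
    chi_q = rng.lognormal(-11.0, 0.7, N)  # atmospheric dispersion factor, s/m^3
    dose_coeff = 5.0e4                    # assumed conversion to rem (placeholder)

    dose = mar * release_frac * chi_q * dose_coeff
    print(f"median {np.median(dose):.3f} rem; "
          f"95th percentile {np.percentile(dose, 95):.3f} rem")
    ```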

  18. Impact of clinical input variable uncertainties on ten-year atherosclerotic cardiovascular disease risk using new pooled cohort equations.

    PubMed

    Gupta, Himanshu; Schiros, Chun G; Sharifov, Oleg F; Jain, Apurva; Denney, Thomas S

    2016-08-31

    The recently released American College of Cardiology/American Heart Association (ACC/AHA) guideline recommends the Pooled Cohort equations for evaluating the atherosclerotic cardiovascular risk of individuals. The impact of clinical input variable uncertainties on estimates of ten-year cardiovascular risk based on the ACC/AHA guidelines is not known. Using the publicly available National Health and Nutrition Examination Survey dataset (2005-2010), we computed maximum and minimum ten-year cardiovascular risks by assuming clinically relevant variations/uncertainties in the age input (0-1 year) and ±10% variation in total cholesterol, high-density lipoprotein cholesterol, and systolic blood pressure, and by assuming a uniform distribution of the variance of each variable. We analyzed changes in risk category compared to the actual inputs at the 5% and 7.5% risk limits, as these limits define the thresholds for consideration of drug therapy in the new guidelines. The new Pooled Cohort equations for risk estimation were implemented in a custom software package. Based on our input variances, changes in risk category were possible in up to 24% of the population cohort at both the 5% and 7.5% risk boundary limits. This trend was consistently noted across all subgroups except African American males, where most of the cohort had ≥7.5% baseline risk regardless of the variation in the variables. Uncertainties in the input variables can alter risk categorization. The impact of these variances on the ten-year risk needs to be incorporated into the patient/clinician discussion and clinical decision making. Incorporating good clinical practices for the measurement of critical clinical variables and robust standardization of laboratory parameters to more stringent reference standards is extremely important for successful implementation of the new guidelines. Furthermore, the ability to customize the risk calculator inputs to better represent unique clinical circumstances specific to individual needs would be highly desirable in future versions of the risk calculator.
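
    The perturbation scan itself is simple to sketch. Below, `pooled_cohort_risk` is a placeholder monotone function standing in for the actual ACC/AHA equations, which are not reproduced here; the scan enumerates the corner cases of the stated input uncertainties and checks whether the 5% or 7.5% thresholds are crossed.

    ```python
    from itertools import product

    def pooled_cohort_risk(age, total_chol, hdl, sbp):
        # Placeholder, NOT the published Pooled Cohort equations.
        return 0.001 * age + 0.0001 * total_chol - 0.0004 * hdl + 0.0002 * sbp

    def risk_bounds(age, tc, hdl, sbp):
        risks = [pooled_cohort_risk(a, t, h, s)
                 for a, t, h, s in product((age, age + 1),        # 0-1 year on age
                                           (0.9 * tc, 1.1 * tc),  # +/-10%
                                           (0.9 * hdl, 1.1 * hdl),
                                           (0.9 * sbp, 1.1 * sbp))]
        return min(risks), max(risks)

    lo, hi = risk_bounds(age=40, tc=180, hdl=70, sbp=120)
    crossed = any(lo < cut <= hi for cut in (0.05, 0.075))
    print(f"risk range {lo:.1%}-{hi:.1%}; threshold crossed: {crossed}")
    ```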

  19. Designing a Software for Flood Risk Assessment Based on Multi Criteria Decision Analysis and Information Diffusion Methods

    NASA Astrophysics Data System (ADS)

    Musaoglu, N.; Saral, A.; Seker, D. Z.

    2012-12-01

    Flooding is one of the major natural disasters, not only in Turkey but all over the world, and it causes serious damage and harm. It is estimated that of the total economic loss caused by all kinds of disasters, 40% was due to floods. In July 1995, the Ayamama Creek in Istanbul was flooded; the insurance sector received around 1,200 claims notices during that period, and insurance companies had to pay a total of $40 million for claims. In 2009, the same creek flooded again, killing 31 people over two days, and insurance firms paid around €150 million in claims. To solve these kinds of problems, modern tools such as GIS and remote sensing should be utilized. In this study, software was designed for flood risk analysis with the Analytic Hierarchy Process (AHP) and Information Diffusion (InfoDif) methods. In the developed software, five evaluation criteria were taken into account: slope, aspect, elevation, geology and land use, all extracted from satellite sensor data. The Digital Elevation Model (DEM) of the Ayamama River Basin was acquired from a SPOT 5 satellite image with 2.5 meter spatial resolution. Slope and aspect values of the study basin were extracted from this DEM. The land use of the Ayamama Creek was obtained by performing object-oriented nearest-neighbour classification with image segmentation on a SPOT 5 image dated 2010. All produced data were used as input for the Multi Criteria Decision Analysis (MCDA) part of the software. The criteria and each of their sub-criteria were weighted, and flood vulnerability was determined with MCDA-AHP. In addition, daily flood data were collected from the Florya meteorological station for the years 1975 to 2009, and the daily flood peak discharge was calculated with the Soil Conservation Service Curve Number (SCS-CN) method and used as input for the InfoDif part of the software. The obtained results were verified using ground-truth data, and it was clearly seen that the developed software (TRA), which uses two different methods for flood risk analysis, can be more effective than conventional techniques for different decision problems and can produce more reliable results in a short time.
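
    The AHP weighting step reduces to the principal eigenvector of a pairwise comparison matrix; a minimal sketch with invented judgments (not the study's):

    ```python
    import numpy as np

    criteria = ["slope", "aspect", "elevation", "geology", "land use"]
    # A[i, j]: importance of criterion i relative to j (Saaty's 1-9 scale).
    A = np.array([
        [1,    3,   2,   4,   1],
        [1/3,  1,   1/2, 2,   1/3],
        [1/2,  2,   1,   3,   1/2],
        [1/4,  1/2, 1/3, 1,   1/4],
        [1,    3,   2,   4,   1],
    ])

    eigvals, eigvecs = np.linalg.eig(A)
    v = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
    weights = v / v.sum()
    for name, w in zip(criteria, weights):
        print(f"{name:10s} {w:.3f}")
    ```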

  20. Cardiovascular risk assessment according to the Framingham score and abdominal obesity in individuals seen by a clinical school of nutrition.

    PubMed

    Oliveira, Alane Cabral Menezes de; Ferreira, Raphaela Costa; Santos, Arianne Albuquerque

    2016-04-01

    To analyze the relation of abdominal obesity to cardiovascular risk in individuals seen by a clinical school of nutrition, classifying them based on the Framingham score. Cross-sectional study, conducted at the nutrition clinic of a private college in the city of Maceió, Alagoas. We included randomly selected adult and elderly individuals with abdominal obesity, of both sexes, treated from August to December of 2009, with no history of cardiomyopathy or cardiovascular events. To determine cardiovascular risk, the Framingham score was calculated. All analyses were performed with SPSS software version 20.0, with p<0.05 considered significant. We studied 54 subjects, 83% female, with a mean age of 48 years (range 31 to 73 years). Only a very weak correlation was observed between waist circumference measurements and cardiovascular risk in the subjects studied (r=0.065, p=0.048), with no meaningful relationship between these parameters. Abdominal fat distribution was weakly related to cardiovascular risk in patients seen by a clinical school of nutrition.

  1. Project Scheduling Based on Risk of Gas Transmission Pipe

    NASA Astrophysics Data System (ADS)

    Silvianita; Nurbaity, A.; Mulyadi, Y.; Suntoyo; Chamelia, D. M.

    2018-03-01

    The planning of a project has a time limit within which the project must be completed, before or exactly at a predetermined time. Thus, in project planning it is necessary to have scheduling management, which is useful for completing a project with maximum results while considering the constraints that exist. Scheduling management is undertaken to deal with uncertainties and the negative impacts of time and cost in project completion. This paper examines scheduling management in the Gresik-Semarang gas transmission pipeline project to find out which scheduling plan is most effective given its risk value. Scheduling management in this paper is assisted by Microsoft Project software, used to find the critical path in the existing project scheduling data. The critical path is the longest path through the schedule and therefore determines the fastest possible completion time. A critical path with a completion time of 152 days was found. Furthermore, risk was calculated using the House of Risk (HOR) method, and it was found that the critical path accounts for 40.98 percent of all causes of the risk events that may be experienced.
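
    A minimal sketch of the critical-path computation that Microsoft Project performs: a forward pass for earliest start times, a backward pass for latest start times, and the zero-slack activities forming the critical path. The task network and durations are invented, not the Gresik-Semarang schedule.

    ```python
    tasks = {                      # name: (duration_days, predecessors)
        "site prep":   (20, []),
        "pipe supply": (35, []),
        "trenching":   (50, ["site prep"]),
        "stringing":   (40, ["pipe supply", "trenching"]),
        "welding":     (45, ["stringing"]),
        "testing":     (15, ["welding"]),
    }

    early = {}
    for name in tasks:             # insertion order is topological here
        dur, preds = tasks[name]
        early[name] = max((early[p] + tasks[p][0] for p in preds), default=0)
    project_end = max(early[n] + tasks[n][0] for n in tasks)

    late = {}
    for name in reversed(list(tasks)):
        succs = [s for s in tasks if name in tasks[s][1]]
        late[name] = min((late[s] for s in succs), default=project_end) - tasks[name][0]

    critical = [n for n in tasks if early[n] == late[n]]
    print(f"{project_end} days; critical path: {' -> '.join(critical)}")
    ```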

  2. The Global Earthquake Model and Disaster Risk Reduction

    NASA Astrophysics Data System (ADS)

    Smolka, A. J.

    2015-12-01

    Advanced, reliable and transparent tools and data to assess earthquake risk are inaccessible to most, especially in less developed regions of the world while few, if any, globally accepted standards currently allow a meaningful comparison of risk between places. The Global Earthquake Model (GEM) is a collaborative effort that aims to provide models, datasets and state-of-the-art tools for transparent assessment of earthquake hazard and risk. As part of this goal, GEM and its global network of collaborators have developed the OpenQuake engine (an open-source software for hazard and risk calculations), the OpenQuake platform (a web-based portal making GEM's resources and datasets freely available to all potential users), and a suite of tools to support modelers and other experts in the development of hazard, exposure and vulnerability models. These resources are being used extensively across the world in hazard and risk assessment, from individual practitioners to local and national institutions, and in regional projects to inform disaster risk reduction. Practical examples for how GEM is bridging the gap between science and disaster risk reduction are: - Several countries including Switzerland, Turkey, Italy, Ecuador, Papua-New Guinea and Taiwan (with more to follow) are computing national seismic hazard using the OpenQuake-engine. In some cases these results are used for the definition of actions in building codes. - Technical support, tools and data for the development of hazard, exposure, vulnerability and risk models for regional projects in South America and Sub-Saharan Africa. - Going beyond physical risk, GEM's scorecard approach evaluates local resilience by bringing together neighborhood/community leaders and the risk reduction community as a basis for designing risk reduction programs at various levels of geography. Actual case studies are Lalitpur in the Kathmandu Valley in Nepal and Quito/Ecuador. In agreement with GEM's collaborative approach, all projects are undertaken with strong involvement of local scientific and risk reduction communities. Open-source software and careful documentation of the methodologies create full transparency of the modelling process, so that results can be reproduced any time by third parties.

  3. Dill: an algorithm and a symbolic software package for doing classical supersymmetry calculations

    NASA Astrophysics Data System (ADS)

    Lučić, Vladan

    1995-11-01

    An algorithm is presented that formalizes the different steps in a classical supersymmetric (SUSY) calculation. Based on this algorithm, Dill, a symbolic software package that can perform the calculations, was developed in the Mathematica programming language. While the algorithm is quite general, the package was created for the 4-D, N = 1 model. Nevertheless, with little modification, the package could be used for other SUSY models. The package has been tested and some of the results are presented.

  4. A dual mode breath sampler for the collection of the end-tidal and dead space fractions.

    PubMed

    Salvo, P; Ferrari, C; Persia, R; Ghimenti, S; Lomonaco, T; Bellagambi, F; Di Francesco, F

    2015-06-01

    This work presents a breath sampler prototype that automatically collects end-tidal (single and multiple breaths) or dead-space (multiple breaths) air fractions. This result is achieved by real-time measurement of the CO2 partial pressure and airflow during the expiratory and inspiratory phases. Suitable algorithms, used to control a solenoid valve, guarantee that a Nalophan(®) bag is filled with the selected breath fraction even if the subject under test hyperventilates. The breath sampler has a low pressure drop (<0.5 kPa) and uses inert or disposable components to avoid bacteriological risk for the patients and contamination of the breath samples. A fully customisable software interface allows real-time control of the hardware and software status. The performance of the breath sampler was evaluated by comparing (a) the CO2 partial pressure calculated during sampling with the CO2 pressure measured off-line within the Nalophan(®) bag; and (b) the concentrations of four selected volatile organic compounds in the dead-space, end-tidal and mixed breath fractions. Results showed negligible deviations between the calculated and off-line CO2 pressure values, and the distributions of the selected compounds among the dead-space, end-tidal and mixed breath fractions were in agreement with their chemical-physical properties. Copyright © 2015. Published by Elsevier Ltd.
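
    The valve-control logic amounts to a predicate over the two real-time signals; the threshold and sign convention below are assumptions, not the prototype's calibrated values.

    ```python
    def valve_open(co2_kpa, flow_l_min, co2_plateau=4.0):
        expiratory = flow_l_min > 0          # convention: positive flow = expiration
        end_tidal = co2_kpa >= co2_plateau   # CO2 plateau marks alveolar air
        return expiratory and end_tidal      # True -> solenoid open, bag filling

    # Early expiration, mid expiration, alveolar plateau, then inspiration:
    samples = [(0.5, 20.0), (3.0, 15.0), (4.8, 10.0), (5.1, 5.0), (0.3, -18.0)]
    print([valve_open(c, f) for c, f in samples])
    ```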

  5. Expecting the Unexpected: Radiation Hardened Software

    NASA Technical Reports Server (NTRS)

    Penix, John; Mehlitz, Peter C.

    2005-01-01

    Radiation-induced Single Event Effects (SEEs) are a serious problem for spacecraft flight software, potentially leading to a complete loss of mission. Conventional risk mitigation has focused on hardware, leading to slow, expensive and outdated on-board computing devices, increased power consumption and increased launch mass. Our approach is to look at SEEs from a software perspective, and to explicitly design flight software so that it can detect and correct the majority of SEEs. Radiation-hardened flight software will reduce the significant residual risk for critical missions and flight phases, and enable greater use of inexpensive and fast COTS hardware.
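
    One classic software technique for this kind of SEE detection and correction is triple modular redundancy with majority voting over redundant copies of critical state. The sketch below illustrates the general idea only and is not NASA's actual design; a flight implementation would also scrub memory periodically and protect control flow.

    ```python
    def tmr_read(copies):
        """Majority-vote three redundant copies; repairs any single upset."""
        a, b, c = copies
        voted = a if a == b or a == c else b   # if a disagrees with both, b == c
        copies[:] = [voted, voted, voted]      # rewrite to scrub the bad copy
        return voted

    state = [42, 42, 42]           # three copies of a critical value
    state[1] ^= 0x10               # simulate a single-event upset bit flip
    print(tmr_read(state), state)  # -> 42 [42, 42, 42]
    ```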

  6. High-fidelity modeling and impact footprint prediction for vehicle breakup analysis

    NASA Astrophysics Data System (ADS)

    Ling, Lisa

    For decades, vehicle breakup analysis has been performed for space missions that used nuclear heater or power units, in order to assess aerospace nuclear safety for potential launch failures leading to inadvertent atmospheric reentry. Such pre-launch risk analysis is imperative for assessing possible environmental impacts, obtaining launch approval, and planning launch contingencies. In order to accurately perform a vehicle breakup analysis, the analysis tool should include a trajectory propagation algorithm coupled with thermal and structural analyses and influences. Since such a software tool was not available commercially or in the public domain, a basic analysis tool was developed by Dr. Angus McRonald prior to this study. This legacy software consisted of low-fidelity modeling and had the capability to predict vehicle breakup, but did not predict the surface impact point of the nuclear component. Thus the main thrust of this study was to develop and verify additional dynamics modeling and capabilities for the analysis tool, with the objectives of (1) providing the capability to predict the impact point and footprint, (2) increasing the fidelity of the vehicle breakup prediction, and (3) reducing the effort and time required to complete an analysis. The new functions developed for predicting the impact point and footprint included 3-degrees-of-freedom trajectory propagation, the generation of non-arbitrary entry conditions, sensitivity analysis, and the calculation of the impact footprint. The functions to increase the fidelity of the vehicle breakup prediction included a panel code to calculate the hypersonic aerodynamic coefficients for an arbitrarily shaped body and the modeling of local winds. The function to reduce the effort and time required to complete an analysis included the calculation of node failure criteria. The derivation and development of these new functions are presented in this dissertation, and examples are given to demonstrate the new capabilities and the improvements made, with comparisons between the results obtained from the upgraded analysis tool and the legacy software wherever applicable.

  7. [Risk assessment of thrombotic events in patients with schizophrenia and schizoaffective disorder in the acute state: the 'fibrinodynamics' technology].

    PubMed

    Brusov, O S; Matveev, I A; Kirillov, P S; Faktor, M I; Karpova, N S; Vasilyeva, E F; Katasonov, A B; Zozulya, S A; Klushnik, T P

    To assess the risk of thrombotic events in patients with schizophrenia and schizoaffective disorder in the acute state based on the 'fibrinodynamics' technology. A group of 76 women participated in the study: 38 with paranoid schizophrenia (F20.0) and 18 with schizoaffective disorder (F25.1) in the acute stage, and 20 healthy controls. The technology includes the study of coagulation and fibrinolysis, the authors' Karmin software, and the calculation of the peak time and hemostasis potential of spontaneous clots. The growth and lysis of fibrin clots were studied in plasma purified from platelets. All preanalytic procedures were conducted within 30 minutes after blood sampling. Blood serum was studied separately using the neuroimmunological test. The dynamics of the brightness profiles of the clots were determined, and a number of parameters (the peak time and hemostasis potential of spontaneous clots) were calculated using the Karmin software. In patients with schizophrenia, the dynamic brightness profile of the clots has two peaks: the first peak is formed as a result of the growth and lysis of the clot initiated by the activator, and the second peak is due to the growth and lysis of spontaneous clots in the volume of the measuring cuvette far from the activator. In healthy donors, the second peak is absent under the experimental conditions. In the group of schizophrenic patients, a strong negative correlation is observed between the peak time of the second peak and the activity of leukocyte elastase (Spearman R = -0.75, p<0.0001), i.e. the greater the activity of elastase, the earlier the maximum of the second peak is formed, and vice versa. In the control group, there is no such correlation. Evaluation of the hemostasis potential of spontaneous clots showed that in 42% of schizophrenic patients this parameter is shifted above the norm, which indicates an increased risk of thrombosis of the small brain arteries in these patients. The developed 'fibrinodynamics' technology has good potential for introduction into personalized medicine to identify increased risks of thrombosis of small cerebral vessels, which leads to the development of cognitive disorders in patients with acute schizophrenia, and to monitor the normalization of hemostasis with antiplatelet or anticoagulant drugs.

  8. Comparison of Oncotype DX® Recurrence Score® with other risk assessment tools including the Nottingham Prognostic Index in the identification of patients with low-risk invasive breast cancer.

    PubMed

    Cotter, Maura Bríd; Dakin, Alex; Maguire, Aoife; Walshe, Janice M; Kennedy, M John; Dunne, Barbara; Riain, Ciarán Ó; Quinn, Cecily M

    2017-09-01

    Oncotype DX® is a gene expression assay that quantifies the risk of distant recurrence in patients with hormone receptor positive early breast cancer; it has been publicly funded in Ireland since 2011. The aim of this study was to correlate Oncotype DX® risk groupings with traditional histopathological parameters and the results of other risk assessment tools, including Recurrence Score-Pathology-Clinical (RSPC), the Adjuvant Risk Index (Adj RI), the Nottingham Prognostic Index (NPI) and the Adjuvant! Online 10-year score (AO). Patients were retrospectively identified from the histopathology databases of two Irish hospitals, and patient and tumour characteristics were collated. Associations between categorical variables were evaluated with Pearson's chi-square test. Correlations were calculated using Spearman's correlation coefficient and concordance using Lin's concordance correlation coefficient. Statistical analysis was performed using SPSS software, version 22.0. In our 300-patient cohort, Oncotype DX® classified 59.7% (n = 179) as low, 30% (n = 90) as intermediate, and 10.3% (n = 31) as high risk. Overall concordance between the RS and RSPC, Adj RI, NPI, and AO was 67.3% (n = 202), 56.3% (n = 169), 59% (n = 177), and 36.3% (n = 109), respectively. All risk assessment tools classified the majority of patients as low risk apart from the AO 10-year score, with RSPC classifying the highest number of patients as low risk. This study demonstrates that there is good correlation between the RS and the scores obtained using alternative risk tools. Concordance with the NPI is strong, particularly in the low-risk group. The NPI, calculated from traditional clinicopathological characteristics, is a reliable alternative to Oncotype DX® in the identification of low-risk patients who may avoid adjuvant chemotherapy.

  9. The Evolution of Software Publication in Astronomy

    NASA Astrophysics Data System (ADS)

    Cantiello, Matteo

    2018-01-01

    Software is a fundamental component of the scientific research process. As astronomical discoveries increasingly rely on complex numerical calculations and the analysis of big data sets, publishing and documenting software is a fundamental step in ensuring transparency and reproducibility of results. I will briefly discuss the recent history of software publication and highlight the challenges and opportunities ahead.

  10. Malignant transformation of oral lichen planus and oral lichenoid lesions: A meta-analysis of 20095 patient data.

    PubMed

    Aghbari, Sana Maher Hasan; Abushouk, Abdelrahman Ibrahim; Attia, Attia; Elmaraezy, Ahmed; Menshawy, Amr; Ahmed, Mohamed Shehata; Elsaadany, Basma Abdelaleem; Ahmed, Eman Magdy

    2017-05-01

    For over a century, a heated debate has existed over the possibility of malignant transformation of oral lichen planus (OLP). We performed this meta-analysis to evaluate the malignant potential of OLP and oral lichenoid lesions (OLL) and to investigate the possible risk factors for OLP malignant transformation into oral squamous cell carcinoma (OSCC). We searched Medline, Scopus, and Web of Knowledge for relevant observational studies. Data on OLP malignant transformation were calculated as a pooled proportion (PP), using the DerSimonian-Laird method. We performed subgroup analyses by OLP diagnostic criteria, site, and clinical type, using Open Meta[Analyst] software. Data on possible risk factors for malignant transformation were pooled as odds ratios (ORs), using Comprehensive Meta-Analysis software. Pooling data for OLP malignant transformation from 57 studies (19,676 patients) resulted in an overall PP of 1.1% [95% CI: 0.9%, 1.4%], while pooling data from 14 recent studies that used the World Health Organization-2003 diagnostic criteria resulted in an overall PP of 0.9% [95% CI: 0.5%, 1.3%]. The risk of malignant transformation was higher (PP=2.5%, 95% CI [1%, 4%]) in OLL patients (419 patients). A significant increase in malignant transformation risk was noted among smokers (OR=2, 95% CI [1.25, 3.22]), alcoholics (OR=3.52, 95% CI [1.54, 8.03]), and HCV-infected patients (OR=5, 95% CI [1.56, 16.07]), compared to patients without these risk factors. A small subset of OLP patients (1.1%) develop OSCC; therefore, regular follow-up for these patients is recommended. A higher incidence of malignant transformation was found among smokers, alcoholics, and HCV-infected patients; however, these associations should be further investigated. Copyright © 2017 Elsevier Ltd. All rights reserved.
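
    The DerSimonian-Laird pooled proportion named above can be sketched as follows, using the normal approximation on raw proportions for brevity and invented study counts; Open Meta[Analyst] applies more careful variance handling.

    ```python
    import numpy as np

    events = np.array([3, 5, 1, 8, 2])        # transformations per study (invented)
    n = np.array([250, 400, 120, 900, 300])   # patients per study (invented)

    p = events / n
    var = p * (1 - p) / n
    w = 1 / var

    p_fixed = np.sum(w * p) / np.sum(w)
    Q = np.sum(w * (p - p_fixed) ** 2)
    tau2 = max(0.0, (Q - (len(p) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

    w_star = 1 / (var + tau2)                 # random-effects weights
    pooled = np.sum(w_star * p) / np.sum(w_star)
    se = np.sqrt(1 / np.sum(w_star))
    print(f"pooled proportion {pooled:.2%} "
          f"(95% CI {pooled - 1.96 * se:.2%} to {pooled + 1.96 * se:.2%})")
    ```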

  11. Overdose problem associated with treatment planning software for high energy photons in response of Panama's accident.

    PubMed

    Attalla, Ehab M; Lotayef, Mohamed M; Khalil, Ehab M; El-Hosiny, Hesham A; Nazmy, Mohamed S

    2007-06-01

    The purpose of this study was to quantify dose distribution errors by comparing actual dose measurements with the values calculated by the software. To evaluate the outcome of the radiation overexposure related to Panama's accident, and to ensure that treatment planning systems (TPS) are operated in accordance with an appropriate quality assurance programme, we studied central-axis and peripheral depth dose data using complex fields shaped with blocks to quantify dose distribution errors. Multidata TPS software versions 2.35 and 2.40 and Helax TPS software version 5.1B were assessed. The calculated data of the treatment planning systems were verified by comparison with actual dose measurements for open and blocked high-energy photon fields (Co-60, 6 MV and 18 MV photons). Close agreement between calculated and measured results was obtained for the 2-D (Multidata) and 3-D (TMS Helax) treatment planning systems: within 1 to 2% for open fields and 0.5 to 2.5% for peripheral blocked fields. Discrepancies between calculated and measured data ranged from 13 to 36% along the central axis of complex blocked fields when the normalisation point was selected at Dmax; when the normalisation point was selected near or under the blocks, the variation between the calculated and the measured data reached up to a 500% difference. The present results emphasize the importance of the proper selection of the normalisation point in the radiation field, as this facilitates detection of aberrant dose distributions (overexposure or underexposure).

  12. 1986 Petroleum Software Directory. [800 mini, micro and mainframe computer software packages]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1985-01-01

    Pennwell's 1986 Petroleum Software Directory is a complete listing of software created specifically for the petroleum industry. Details are provided on over 800 mini, micro and mainframe computer software packages from more than 250 different companies. An accountant can locate programs to automate bookkeeping functions in large oil and gas production firms. A pipeline engineer will find programs designed to calculate line flow and wellbore pressure drop.
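
    As an illustration of what such line-flow programs compute, here is a hedged Python sketch of a Darcy-Weisbach pressure drop with the Swamee-Jain friction-factor approximation; all pipe and fluid values are invented for the example:

    ```python
    import math

    def pressure_drop(q_m3s, d_m, length_m, rho, mu, roughness_m):
        """Darcy-Weisbach pressure drop (Pa) for single-phase turbulent pipe flow."""
        area = math.pi * d_m**2 / 4
        v = q_m3s / area                          # mean velocity, m/s
        re = rho * v * d_m / mu                   # Reynolds number
        if re < 4000:
            raise ValueError("Swamee-Jain correlation assumes turbulent flow")
        # Swamee-Jain explicit approximation to the Colebrook friction factor
        f = 0.25 / math.log10(roughness_m / (3.7 * d_m) + 5.74 / re**0.9) ** 2
        return f * (length_m / d_m) * rho * v**2 / 2

    # Hypothetical crude-oil line: 0.05 m^3/s through 10 km of 0.3 m steel pipe
    dp = pressure_drop(q_m3s=0.05, d_m=0.3, length_m=10_000,
                       rho=850.0, mu=5e-3, roughness_m=4.5e-5)
    print(f"pressure drop = {dp/1e5:.2f} bar")
    ```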

  13. A new software for dimensional measurements in 3D endodontic root canal instrumentation.

    PubMed

    Sinibaldi, Raffaele; Pecci, Raffaella; Somma, Francesco; Della Penna, Stefania; Bedini, Rossella

    2012-01-01

    The main issue to be faced in obtaining size estimates of 3D modifications of the dental canal after endodontic treatment is the co-registration of the image stacks obtained through micro computed tomography (micro-CT) scans before and after treatment. Here, quantitative analysis of micro-CT images has been performed by means of new dedicated software targeted at the analysis of the root canal after endodontic instrumentation. This software analytically calculates the best superposition between the pre- and post-treatment structures using the inertia tensor of the tooth. This strategy avoids minimization procedures, which can be user-dependent and time-consuming. Once the co-registration has been achieved, dimensional measurements are performed by joint evaluation of quantitative parameters over the two superimposed stacks of micro-CT images. The software automatically calculates the changes in volume, surface, and 3D symmetry axes occurring after instrumentation. The calculation is based on direct comparison of the canal and the canal branches selected by the user on the pre-treatment image stack.
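
    The inertia-tensor idea can be sketched in a few lines of Python/NumPy (an illustration of the principle under the assumption of binary voxel masks, not the authors' implementation):

    ```python
    import numpy as np

    def principal_axes(mask):
        """Centroid and principal axes (inertia eigenvectors) of a binary voxel mask."""
        coords = np.argwhere(mask).astype(float)   # (N, 3) voxel coordinates
        centroid = coords.mean(axis=0)
        centered = coords - centroid
        # Second-moment (covariance) matrix; its eigenvectors are the symmetry axes
        cov = centered.T @ centered / len(centered)
        eigvals, eigvecs = np.linalg.eigh(cov)     # ascending eigenvalues
        return centroid, eigvecs

    def register(mask_pre, mask_post):
        """Rigid rotation aligning the post-treatment axes onto the pre-treatment axes."""
        c_pre, ax_pre = principal_axes(mask_pre)
        c_post, ax_post = principal_axes(mask_post)
        rotation = ax_pre @ ax_post.T              # maps post axes onto pre axes
        return c_pre, c_post, rotation
    ```

    Because eigenvectors are defined only up to sign, a practical tool must disambiguate axis directions (e.g., from the canal orientation); the point of the analytic approach is that no user-dependent iterative minimization is needed.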

  14. The SCEC Broadband Platform: Open-Source Software for Strong Ground Motion Simulation and Validation

    NASA Astrophysics Data System (ADS)

    Silva, F.; Goulet, C. A.; Maechling, P. J.; Callaghan, S.; Jordan, T. H.

    2016-12-01

    The Southern California Earthquake Center (SCEC) Broadband Platform (BBP) is a carefully integrated collection of open-source scientific software programs that can simulate broadband (0-100 Hz) ground motions for earthquakes at regional scales. The BBP can run earthquake rupture and wave propagation modeling software to simulate ground motions for well-observed historical earthquakes and to quantify how well the simulated broadband seismograms match the observed seismograms. The BBP can also run simulations for hypothetical earthquakes. In this case, users input an earthquake location and magnitude description, a list of station locations, and a 1D velocity model for the region of interest, and the BBP software then calculates ground motions for the specified stations. The BBP scientific software modules implement kinematic rupture generation, low- and high-frequency seismogram synthesis using wave propagation through 1D layered velocity structures, several ground motion intensity measure calculations, and various ground motion goodness-of-fit tools. These modules are integrated into a software system that provides user-defined, repeatable calculation of ground-motion seismograms using multiple alternative ground motion simulation methods, along with software utilities to generate tables, plots, and maps. The BBP has been developed over the last five years in a collaborative project involving geoscientists, earthquake engineers, graduate students, and SCEC scientific software developers. The SCEC BBP software released in 2016 can be compiled and run on recent Linux and Mac OS X systems with GNU compilers. It includes five simulation methods, seven simulation regions covering California, Japan, and Eastern North America, and the ability to compare simulation results against empirical ground motion models (GMPEs). The latest version includes updated ground motion simulation methods, a suite of new validation metrics, and a simplified command-line user interface.

  15. SU-E-T-348: Verification MU Calculation for Conformal Radiotherapy with Multileaf Collimator Using Report AAPM TG 114

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adrada, A; Tello, Z; Medina, L

    Purpose: The purpose of this work was to develop and validate an open-source independent MU dose calculation software for 3D conformal radiotherapy with high- and low-resolution multileaf collimators, according to the report of AAPM TG 114. Methods: Treatment plans were created using the Iplan v4.5 BrainLAB TPS for 6 MV photon beams produced by Primus and Novalis linear accelerators equipped with an Optifocus MLC and an HDMLC, respectively. The TPS dose calculation algorithms were pencil beam and Monte Carlo. 1082 treatment plans were selected for the study. The algorithm was written on the free and open-source Code::Blocks C++ platform. Treatment plans were imported by the software in RTP format. The equivalent square field size is obtained from the positions of the leaves; the effective calculation depth can be introduced from the TPS's dosimetry report or calculated automatically from the SSD. The inverse square law is calculated from the 3D coordinates of the isocenter and the normalization point of the treatment plan. The dosimetric parameters TPR, Sc, Sp and WF are linearly interpolated. Results: 1082 plans from both machines were analyzed. The average difference between the TPS and the independent calculation was −0.43% ± 2.42% [−7.90%, 7.50%]; specifically, −0.85% ± 2.53% for the Primus and 0.00% ± 2.23% for the Novalis. In 94.8% of cases the difference was less than or equal to 5%, and in 98.9% it was less than or equal to 6%. Conclusion: The developed software is appropriate for use in MU verification. This software can be obtained upon request.
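
    The calculation chain described (equivalent square, interpolated TPR, Sc, Sp, WF, and the inverse square law) is the standard TG-114-style check and can be sketched as follows; the beam data here are hypothetical, whereas a clinical tool would interpolate full measured TPR/Sc/Sp tables:

    ```python
    import numpy as np

    def equivalent_square(x_cm, y_cm):
        """Sterling equivalent square of a rectangular field: 2ab / (a + b)."""
        return 2 * x_cm * y_cm / (x_cm + y_cm)

    def verify_mu(dose_cgy, depth_cm, field_x, field_y, spd_cm=100.0, sad_cm=100.0,
                  wedge_factor=1.0, k_cgy_per_mu=1.0,
                  tpr_depths=(5, 10, 15, 20), tpr_vals=(0.92, 0.80, 0.69, 0.59)):
        """Independent MU check for an isocentric (SAD) photon beam, TG-114 style."""
        eq_sq = equivalent_square(field_x, field_y)
        tpr = np.interp(depth_cm, tpr_depths, tpr_vals)  # linear interpolation in depth
        sc = 0.98 + 0.002 * eq_sq          # toy collimator scatter factor (unity at 10x10)
        sp = 0.97 + 0.003 * eq_sq          # toy phantom scatter factor (unity at 10x10)
        isl = (sad_cm / spd_cm) ** 2       # inverse square law to the calculation point
        return dose_cgy / (k_cgy_per_mu * tpr * sc * sp * wedge_factor * isl)

    print(f"{verify_mu(dose_cgy=200.0, depth_cm=7.5, field_x=10.0, field_y=15.0):.1f} MU")
    ```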

  16. Practical Issues in Implementing Software Reliability Measurement

    NASA Technical Reports Server (NTRS)

    Nikora, Allen P.; Schneidewind, Norman F.; Everett, William W.; Munson, John C.; Vouk, Mladen A.; Musa, John D.

    1999-01-01

    Many ways of estimating software systems' reliability, or reliability-related quantities, have been developed over the past several years. Of particular interest are methods that can be used to estimate a software system's fault content prior to test, or to discriminate between components that are fault-prone and those that are not. The results of these methods can be used to: 1) More accurately focus scarce fault identification resources on those portions of a software system most in need of it. 2) Estimate and forecast the risk of exposure to residual faults in a software system during operation, and develop risk and safety criteria to guide the release of a software system to fielded use. 3) Estimate the efficiency of test suites in detecting residual faults. 4) Estimate the stability of the software maintenance process.

  17. Nonstationary Extreme Value Analysis in a Changing Climate: A Software Package

    NASA Astrophysics Data System (ADS)

    Cheng, L.; AghaKouchak, A.; Gilleland, E.

    2013-12-01

    Numerous studies show that climatic extremes increased substantially in the second half of the 20th century. For this reason, analysis of extremes under a nonstationary assumption has received a great deal of attention. This paper presents a software package developed for estimating return levels, return periods, and risks of climatic extremes in a changing climate. This MATLAB software package offers tools for analysis of climate extremes under both stationary and nonstationary assumptions. The Nonstationary Extreme Value Analysis package (hereafter, NEVA) provides an efficient and generalized framework for analyzing extremes using Bayesian inference. NEVA estimates the extreme value parameters using a Differential Evolution Markov Chain (DE-MC), which combines the genetic algorithm Differential Evolution (DE) for global optimization over the real parameter space with the Markov Chain Monte Carlo (MCMC) approach, and which offers simplicity, speed of calculation, and better convergence than conventional MCMC. NEVA also provides confidence intervals and uncertainty bounds for the estimated return levels based on the sampled parameters. NEVA integrates extreme value design concepts, data analysis tools, optimization, and visualization, explicitly designed to facilitate the analysis of extremes in the geosciences. The generalized input and output files of this software package make it attractive to users across different fields. Both stationary and nonstationary components of the package are validated for a number of case studies using empirical return levels. The results show that NEVA reliably describes extremes and their return levels.
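
    The stationary core of such an analysis, fitting a GEV to annual maxima and reading off return levels, can be sketched with SciPy (synthetic data; NEVA replaces the maximum-likelihood fit with Bayesian DE-MC sampling, which is what yields its uncertainty bounds):

    ```python
    from scipy import stats

    # Simulated annual-maximum series (synthetic; stands in for observed extremes)
    annual_maxima = stats.genextreme.rvs(c=-0.1, loc=50, scale=10,
                                         size=60, random_state=42)

    # Fit a stationary GEV by maximum likelihood
    shape, loc, scale = stats.genextreme.fit(annual_maxima)

    # Return level z_T: the value exceeded on average once every T years
    for T in (10, 50, 100):
        z = stats.genextreme.ppf(1 - 1 / T, shape, loc=loc, scale=scale)
        print(f"{T:>3d}-year return level: {z:.1f}")
    ```

    In the nonstationary case the location (and possibly scale) parameter becomes a function of time or another covariate, so the return level itself varies from year to year.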

  18. A method for simulating the entire leaking process and calculating the liquid leakage volume of a damaged pressurized pipeline.

    PubMed

    He, Guoxi; Liang, Yongtu; Li, Yansong; Wu, Mengyu; Sun, Liying; Xie, Cheng; Li, Feng

    2017-06-15

    The accidental leakage of long-distance pressurized oil pipelines is a major source of risk, capable of causing extensive damage to human health and the environment. However, the complexity of the leaking process, with its complex boundary conditions, makes the leakage volume difficult to calculate. In this study, the leaking process is divided into four stages based on the strength of the transient pressure, and three models are established to calculate the leaking flowrate and volume. First, a negative pressure wave propagation and attenuation model is applied to calculate the sizes of orifices. Second, a transient oil leaking model, consisting of continuity, momentum conservation, energy conservation, and orifice flow equations, is built to calculate the leakage volume. Third, a steady-state oil leaking model is employed to calculate the leakage after valves and pumps shut down. Moreover, the sensitive factors that affect the orifice leak coefficient and the leakage volume are analyzed to determine the most influential ones. To validate the numerical simulation, two types of leakage tests with different hole sizes were conducted on Sinopec product pipelines, and additional validation was carried out with commercial software to supplement the limited experimental data. The leaking process under different leaking conditions is thus described and analyzed. Copyright © 2017 Elsevier B.V. All rights reserved.
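
    The orifice-flow relation at the heart of the leak models is a one-liner; a hedged sketch with invented pipeline values (the paper couples this rate to the transient continuity, momentum, and energy equations):

    ```python
    import math

    def orifice_leak_rate(cd, hole_diam_m, p_inside_pa, p_outside_pa, rho):
        """Leak mass flow (kg/s) through a sharp-edged orifice: m = Cd*A*sqrt(2*rho*dP)."""
        area = math.pi * hole_diam_m**2 / 4
        dp = max(p_inside_pa - p_outside_pa, 0.0)
        return cd * area * math.sqrt(2 * rho * dp)

    # Hypothetical 20 mm hole in a product pipeline at 2 MPa gauge, diesel-like fluid
    m_dot = orifice_leak_rate(cd=0.62, hole_diam_m=0.02,
                              p_inside_pa=2.0e6, p_outside_pa=0.0, rho=840.0)
    print(f"leak rate = {m_dot:.1f} kg/s")
    ```

    Integrating this rate over the four stages, with pressure and density updated by the transient model, gives the leakage volume.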

  19. Leak detection by mass balance effective for Norman Wells line

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liou, J.C.P.

    Mass-balance calculations for leak detection have been shown to be as effective as a leading software system, in a comparison based on a major Canadian crude-oil pipeline. The calculations and NovaCorp's Leakstop software each detected leaks of approximately 4% or greater on Interprovincial Pipe Line (IPL) Inc.'s Norman Wells pipeline. Insufficient data exist to assess the performance of the two methods for leaks smaller than 4%. Pipeline leak detection using such software-based systems is common. Their effectiveness is measured by how small a leak can be detected and how quickly. The algorithms used and the measurement uncertainties determine leak detectability.
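
    The mass-balance principle itself is simple enough to sketch directly (made-up flowmeter readings; real systems must also model line pack and meter uncertainty, which is what limits detectability to roughly 4% here):

    ```python
    import numpy as np

    def detect_leak(inflow, outflow, nominal_flow, threshold_frac=0.04, window=5):
        """Flag a leak when the windowed mean imbalance exceeds a fraction of nominal flow."""
        imbalance = np.asarray(inflow) - np.asarray(outflow)
        kernel = np.ones(window) / window
        smoothed = np.convolve(imbalance, kernel, mode="valid")  # running mean
        return smoothed > threshold_frac * nominal_flow

    inflow  = [1000, 1001,  999, 1000, 1002, 1001, 1000, 1001]   # m^3/h, hypothetical
    outflow = [ 999, 1000, 1000,  958,  960,  959,  961,  960]   # leak begins at sample 4
    print(detect_leak(inflow, outflow, nominal_flow=1000))
    ```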

  20. EVALUATION OF VADOSE ZONE AND SOURCE MODELS FOR MULTI-MEDIA, MULTI-PATHWAY, MULTI-RECEPTOR RISK ASSESSMENT USING LARGE SOIL COLUMN EXPERIMENT DATA

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) is developing a comprehensive environmental exposure and risk analysis software system for agency-wide application using the methodology of a Multi-media, Multi-pathway, Multi-receptor Risk Assessment (3MRA) model. This software sys...

  1. Framework for Risk Analysis in Multimedia Environmental Systems: Modeling Individual Steps of a Risk Assessment Process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shah, Anuj; Castleton, Karl J.; Hoopes, Bonnie L.

    2004-06-01

    The study of the release and effects of chemicals in the environment and their associated risks to humans is central to public and private decision making. FRAMES 1.X (Framework for Risk Analysis in Multimedia Environmental Systems) is a systems modeling software platform, developed by Pacific Northwest National Laboratory (PNNL), that helps scientists study the release and effects of chemicals on a source-to-outcome basis and create environmental models for similar risk assessment and management problems. The unique aspect of FRAMES is its ability to dynamically introduce software modules representing individual components of a risk assessment (e.g., source release of contaminants, fate and transport in various environmental media, exposure, etc.) within a software framework, manipulate their attributes, and run simulations to obtain results. This paper outlines the fundamental constituents of FRAMES 2.X, an enhanced version of FRAMES 1.X that greatly improves the ability of module developers to "plug" their self-developed software modules into the system. The basic design, the underlying principles, and a discussion of the guidelines for module developers are presented.

  2. DenInv3D: a geophysical software for three-dimensional density inversion of gravity field data

    NASA Astrophysics Data System (ADS)

    Tian, Yu; Ke, Xiaoping; Wang, Yong

    2018-04-01

    This paper presents three-dimensional density inversion software called DenInv3D that operates on gravity and gravity gradient data. The software performs inversion model construction, kernel function calculation, and inversion calculations using an improved preconditioned conjugate gradient (PCG) algorithm. In the PCG algorithm, because of the uncertainty of empirical parameters such as the Lagrange multiplier, we use the inflection point of the L-curve as the regularisation parameter. The software can construct unequally spaced grids and perform inversions on them, which makes it possible to change the resolution of the inversion results at different depths. Through inversion of airborne gradiometry data from the Australian Kauring test site, we discovered that anomalous blocks of different sizes are present within the study area in addition to the central anomalies. The DenInv3D software can be downloaded from http://159.226.162.30.

  3. Synthesis, molecular properties, toxicity and biological evaluation of some new substituted imidazolidine derivatives in search of potent anti-inflammatory agents

    PubMed Central

    Husain, Asif; Ahmad, Aftab; Khan, Shah Alam; Asif, Mohd; Bhutani, Rubina; Al-Abbasi, Fahad A.

    2015-01-01

    The aim of this study was to design and synthesize pharmaceutical agents containing the imidazolidine heterocyclic ring in the hope of developing potent, safe and orally active anti-inflammatory agents. A number of substituted imidazolidine derivatives (3a–k) were synthesized starting from ethylene diamine and aromatic aldehydes. The imidazolidine derivatives (3a–k) were investigated for their anticipated anti-inflammatory and analgesic activity in Wistar albino rats and Swiss albino mice, respectively. Bioactivity scores and molecular and pharmacokinetic properties of the imidazolidine derivatives were calculated by the online software programs Molinspiration and Osiris Property Explorer. The results of biological testing indicated that among the synthesized compounds only three imidazolidine derivatives, namely 4-[1,3-Bis(2,6-dichlorobenzyl)-2-imidazolidinyl]phenyl-diethylamine (3g), 4-[1,3-Bis(3-hydroxy-4-methoxybenzyl)-2-imidazolidinyl]phenyl-diethylamine (3i) and 4-(1,3-Bis(4-methoxybenzyl)-4-methylimidazolidin-2-yl)-phenyl-diethylamine (3j), possess promising anti-inflammatory and analgesic actions. Additionally, these derivatives displayed a superior GI safety profile (low severity index) with respect to the positive control, indomethacin. All synthesized compounds showed promising bioactivity scores for drug targets in Molinspiration, and almost all were predicted to have very low toxicity risk by the Osiris online software. Compound (3i) emerged as a potential candidate for further research, as it obeyed Lipinski's rule of five for drug-likeness, exhibited promising biological activity in vivo, and showed no risk of toxicity in computer-aided screening. PMID:26903774
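
    The drug-likeness screen mentioned, Lipinski's rule of five, is straightforward to reproduce with the open-source RDKit toolkit as a stand-in for the Molinspiration/Osiris services used in the study (the SMILES string below is aspirin, purely for illustration):

    ```python
    from rdkit import Chem
    from rdkit.Chem import Crippen, Descriptors, Lipinski

    def rule_of_five(smiles):
        """Count Lipinski rule-of-five violations for a molecule given as SMILES."""
        mol = Chem.MolFromSmiles(smiles)
        violations = [
            Descriptors.MolWt(mol) > 500,        # molecular weight
            Crippen.MolLogP(mol) > 5,            # calculated logP
            Lipinski.NumHDonors(mol) > 5,        # H-bond donors
            Lipinski.NumHAcceptors(mol) > 10,    # H-bond acceptors
        ]
        return sum(violations)

    print(rule_of_five("CC(=O)Oc1ccccc1C(=O)O"))  # aspirin: 0 violations
    ```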

  4. User’s guide for MapMark4—An R package for the probability calculations in three-part mineral resource assessments

    USGS Publications Warehouse

    Ellefsen, Karl J.

    2017-06-27

    MapMark4 is a software package that implements the probability calculations in three-part mineral resource assessments. Functions within the software package are written in the R statistical programming language. These functions, their documentation, and a copy of this user’s guide are bundled together in R’s unit of shareable code, which is called a “package.” This user’s guide includes step-by-step instructions showing how the functions are used to carry out the probability calculations. The calculations are demonstrated using test data, which are included in the package.

  5. 3D echocardiographic analysis of aortic annulus for transcatheter aortic valve replacement using novel aortic valve quantification software: Comparison with computed tomography.

    PubMed

    Mediratta, Anuj; Addetia, Karima; Medvedofsky, Diego; Schneider, Robert J; Kruse, Eric; Shah, Atman P; Nathan, Sandeep; Paul, Jonathan D; Blair, John E; Ota, Takeyoshi; Balkhy, Husam H; Patel, Amit R; Mor-Avi, Victor; Lang, Roberto M

    2017-05-01

    With the increasing use of transcatheter aortic valve replacement (TAVR) in patients with aortic stenosis (AS), computed tomography (CT) remains the standard for annulus sizing. However, 3D transesophageal echocardiography (TEE) has been an alternative in patients with contraindications to CT. We sought to (1) test the feasibility, accuracy, and reproducibility of prototype 3DTEE analysis software (Philips) for aortic annular measurements and (2) compare the new approach to existing echocardiographic techniques. We prospectively studied 52 patients who underwent gated contrast CT, procedural 3DTEE, and TAVR. 3DTEE images were analyzed using novel semi-automated software designed for 3D measurements of the aortic root, which uses multiplanar reconstruction, similar to CT analysis. Aortic annulus measurements included area, perimeter, and diameters derived from them. The results were compared to CT-derived values. Additionally, existing 3D echocardiographic measurements (3D planimetry and mitral valve analysis software adapted for the aortic valve) were compared to the CT reference values. 3DTEE image quality was sufficient in 90% of patients for aortic annulus measurements using the new software, which were in good agreement with CT (r-values: .89-.91) with small (<4%), nonsignificant inter-modality biases. Repeated measurements showed <10% measurement variability. The new 3D analysis was more accurate and reproducible than the existing echocardiographic techniques. Novel semi-automated 3DTEE analysis software can accurately measure the aortic annulus in patients with severe AS undergoing TAVR, in better agreement with CT than the existing methodology. Accordingly, intra-procedural TEE could potentially replace CT in patients for whom CT carries significant risk. © 2017, Wiley Periodicals, Inc.

  6. Iowa Bridge Backwater Software : users manual IHRB TR-564, version 2.0, June 2010.

    DOT National Transportation Integrated Search

    2010-06-01

    This manual describes how to use the Iowa Bridge Backwater software. It also documents the methods and equations used for the calculations. The main body describes how to use the software and the appendices cover technical aspects. : The Bridge Backw...

  7. Redesign of Transjakarta Bus Driver's Cabin

    NASA Astrophysics Data System (ADS)

    Mardi Safitri, Dian; Azmi, Nora; Singh, Gurbinder; Astuti, Pudji

    2016-02-01

    Ergonomic risk at workstations of the seated-work-control type is one of the problems faced by Transjakarta bus drivers. The "Trisakti" bus type currently used by Transjakarta in corridor 9, serving the Pinang Ranti - Pluit route, has drawn many complaints from drivers. From Nordic Body Map questionnaires given to 30 drivers, it was found that drivers feel pain in the neck, arms, hips, and buttocks, allegedly because of the seat position and because the buttons/panels lie at a considerable reach distance (1 meter) from the driver. In addition, preliminary results of a Workstation Checklist questionnaire identified complaints about the uncomfortable cushion, the driver's seat backrest, and the position of the air conditioner directly above the driver's head. To reduce the ergonomic risk level, research was conducted to redesign the cabin using a generic product design approach. Driver posture risk before the redesign was analyzed using Rapid Upper Limb Assessment (RULA), Rapid Entire Body Assessment (REBA), and the Quick Exposure Checklist (QEC), while body moments were calculated using Mannequin Pro V10.2 software. The generic product design then proceeded through the stages of needs-metrics matrix, house of quality, anthropometric data collection, concept classification tree, concept screening, concept scoring, and two-dimensional product design and manufacture. Posture risk after the redesign was re-analyzed using RULA, REBA, and body moment calculations, and the design was visualized using 3DMax software. Before the cabin design improvements, the RULA score was 6, the REBA score was 9, the QEC result was 57.38%, and the moments of force on the back and right hip were 247.3 lbf·in and 72.9 lbf·in, respectively. For the proposed cabin design, the RULA score was 3, the REBA score was 4, and the moments of force on the back and right hip were 90.3 lbf·in and 70.6 lbf·in. This indicates that the improved cabin design can reduce ergonomic risk, with lower scores for several parts of the body.

  8. Nonlinear Simulation of the Tooth Enamel Spectrum for EPR Dosimetry

    NASA Astrophysics Data System (ADS)

    Kirillov, V. A.; Dubovsky, S. V.

    2016-07-01

    Software was developed in which initial EPR spectra of tooth enamel are deconvoluted by nonlinear simulation: line shapes and signal amplitudes in the model initial spectrum are calculated, the regression coefficient is evaluated, and the individual spectra are summed. Software validation demonstrated that doses calculated with it agree excellently with the applied radiation doses and with doses reconstructed by the additive dose method.

  9. Passive PE Sampling in Support of In Situ Remediation of Contaminated Sediments - Passive Sampler PRC Calculation Software User’s Guide

    DTIC Science & Technology

    2014-07-01

    This user's guide describes (a) how the PRC Correction Calculator (PRC-Calc) uses the model of Fernandez et al. (2009), (b) how well its performance compares against experimental data, (c) how the user may prepare their computer with software to use the PRC calculator, and (d) how to use PRC-Calc to process PRC data. Parameters can be adjusted by entering a different value and pressing Enter, and a PRC-Calc session can be saved for future use with these new values using the Save Session button.

  10. Stata Modules for Calculating Novel Predictive Performance Indices for Logistic Models.

    PubMed

    Barkhordari, Mahnaz; Padyab, Mojgan; Hadaegh, Farzad; Azizi, Fereidoun; Bozorgmanesh, Mohammadreza

    2016-01-01

    Prediction is a fundamental part of the prevention of cardiovascular diseases (CVD). The development of prediction algorithms based on multivariate regression models began several decades ago, and parallel with predictive model development, biomarker research has emerged on an impressively large scale. The key question is how best to assess and quantify the improvement in risk prediction offered by new biomarkers or, more basically, how to assess the performance of a risk prediction model. Discrimination, calibration, and added predictive value have recently been suggested for comparing the predictive performance of models with and without novel biomarkers. A lack of user-friendly statistical software has restricted the implementation of these novel model assessment methods when examining novel biomarkers. We therefore developed user-friendly software that can be used by researchers with few programming skills. We have written a Stata command, addpred, that helps researchers obtain cut-point-free and cut-point-based net reclassification improvement (NRI) indices and relative and absolute integrated discrimination improvement (IDI) indices for logistic regression analyses. We applied the command to real data on women participating in the Tehran Lipid and Glucose Study (TLGS) to examine whether information on a family history of premature CVD, waist circumference, and fasting plasma glucose can improve the predictive performance of the Framingham "general CVD risk" algorithm. The Stata package provided herein can encourage the use of novel methods for examining the predictive capacity of the ever-emerging plethora of novel biomarkers.
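
    For readers outside Stata, the absolute IDI reduces to a difference of discrimination slopes and can be sketched in a few lines (toy predictions, not TLGS data):

    ```python
    import numpy as np

    def absolute_idi(y, p_old, p_new):
        """Absolute IDI: gain in discrimination slope of the new model over the old.

        Discrimination slope = mean predicted risk in events minus mean in non-events.
        """
        y, p_old, p_new = map(np.asarray, (y, p_old, p_new))
        slope_old = p_old[y == 1].mean() - p_old[y == 0].mean()
        slope_new = p_new[y == 1].mean() - p_new[y == 0].mean()
        return slope_new - slope_old

    # Toy data: the new model separates events from non-events slightly better
    y     = np.array([1, 1, 1, 0, 0, 0, 0])
    p_old = np.array([0.60, 0.55, 0.40, 0.35, 0.30, 0.25, 0.20])
    p_new = np.array([0.70, 0.60, 0.45, 0.30, 0.25, 0.20, 0.15])
    print(f"absolute IDI = {absolute_idi(y, p_old, p_new):.3f}")
    ```

    The relative IDI is the same quantity expressed as a ratio, slope_new / slope_old - 1.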

  11. Toward Baseline Software Anomalies in NASA Missions

    NASA Technical Reports Server (NTRS)

    Layman, Lucas; Zelkowitz, Marvin; Basili, Victor; Nikora, Allen P.

    2012-01-01

    In this fast abstract, we provide preliminary findings from an analysis of 14,500 spacecraft anomalies from unmanned NASA missions. We provide baselines for the distributions of software vs. non-software anomalies in spaceflight systems, the risk ratings of software anomalies, and the corrective actions associated with software anomalies.

  12. [Development and practice evaluation of blood acid-base imbalance analysis software].

    PubMed

    Chen, Bo; Huang, Haiying; Zhou, Qiang; Peng, Shan; Jia, Hongyu; Ji, Tianxing

    2014-11-01

    To develop computer software for blood gas and acid-base imbalance analysis that systematically, rapidly, accurately, and automatically determines the type of acid-base imbalance, and to evaluate its clinical application. Using the VBA programming language, a computer-aided diagnostic software for the judgment of acid-base balance was developed. The clinical data of 220 patients admitted to the Second Affiliated Hospital of Guangzhou Medical University were retrospectively analyzed. Arterial blood gas values [pH, HCO3-, arterial partial pressure of carbon dioxide (PaCO2)] and electrolyte data (Na+ and Cl-) were collected and entered into the software for acid-base imbalance judgment. At the same time, the type of acid-base imbalance was determined manually using the Henderson-Hasselbalch (H-H) compensation formulas. The consistency of the judgments from the software and from manual calculation was evaluated, and the judgment times of the two methods were compared. The clinical diagnoses of acid-base imbalance for the 220 patients were: normal in 65 cases, simple type in 90 cases, mixed type in 41 cases, and triplex type in 24 cases. Compared with manual calculation, the accuracy of the software judgments was 100% for the normal and triplex types, 98.9% for the simple type, and 78.0% for the mixed type, with a total accuracy of 95.5%. The kappa value between software and manual judgment was 0.935 (P=0.000), demonstrating very good consistency. The time needed by the software to determine acid-base imbalances was significantly shorter than manual judgment (seconds: 18.14 ± 3.80 vs. 43.79 ± 23.86, t=7.466, P=0.000). Software judgment can thus replace manual judgment: it is rapid, accurate and convenient, can improve the work efficiency and quality of clinicians, and has great potential for clinical adoption.
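
    The core of such software is the Henderson-Hasselbalch relation plus expected-compensation rules. A much-simplified sketch (teaching-level rules only; the software described encodes many more compensation bands than this):

    ```python
    import math

    def hh_ph(hco3_mmol_l, paco2_mmhg):
        """Henderson-Hasselbalch: pH = 6.1 + log10(HCO3- / (0.03 * PaCO2))."""
        return 6.1 + math.log10(hco3_mmol_l / (0.03 * paco2_mmhg))

    def classify(ph, hco3, paco2):
        """Very simplified primary-disorder classification with one compensation check."""
        if ph < 7.35 and hco3 < 22:
            # Winter's formula: expected PaCO2 for metabolic acidosis
            expected_paco2 = 1.5 * hco3 + 8
            compensated = abs(paco2 - expected_paco2) <= 2
            return "metabolic acidosis" + ("" if compensated else " + respiratory disorder")
        if ph < 7.35 and paco2 > 45:
            return "respiratory acidosis"
        if ph > 7.45 and hco3 > 26:
            return "metabolic alkalosis"
        if ph > 7.45 and paco2 < 35:
            return "respiratory alkalosis"
        return "normal or mixed disorder"

    print(f"pH = {hh_ph(24, 40):.2f}")          # ~7.40 for normal values
    print(classify(ph=7.28, hco3=14, paco2=29))  # compensated metabolic acidosis
    ```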

  13. Radiation breakage of DNA: a model based on random-walk chromatin structure

    NASA Technical Reports Server (NTRS)

    Ponomarev, A. L.; Sachs, R. K.

    2001-01-01

    Monte Carlo computer software, called DNAbreak, has recently been developed to analyze observed non-random clustering of DNA double strand breaks in chromatin after exposure to densely ionizing radiation. The software models coarse-grained configurations of chromatin and radiation tracks, small-scale details being suppressed in order to obtain statistical results for larger scales, up to the size of a whole chromosome. We here give an analytic counterpart of the numerical model, useful for benchmarks, for elucidating the numerical results, for analyzing the assumptions of a more general but less mechanistic "randomly-located-clusters" formalism, and, potentially, for speeding up the calculations. The equations characterize multi-track DNA fragment-size distributions in terms of one-track action; an important step in extrapolating high-dose laboratory results to the much lower doses of main interest in environmental or occupational risk estimation. The approach can utilize the experimental information on DNA fragment-size distributions to draw inferences about large-scale chromatin geometry during cell-cycle interphase.

  14. EPA's Benchmark Dose Modeling Software

    EPA Science Inventory

    The EPA developed the Benchmark Dose Software (BMDS) as a tool to help Agency risk assessors apply benchmark dose (BMD) methods to EPA's human health risk assessment (HHRA) documents. The application of BMD methods overcomes many well-known limitations ...

  15. Bill Calculator V1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2002-08-19

    Utility tariffs vary significantly from utility to utility. Each utility has its own rates and sets of rules by which bills are calculated. The Bill Calculator reconstructs the tariff based on these rules, stored in data tables, and accesses the appropriate charges for a given energy consumption and demand. The software reconstructs the tariff logic from the rules stored in data tables, and charges are tallied as the logic is reconstructed. It is essentially an accounting program. The main limitation is the time needed to search for each tariff element, which is currently an O(N) search. Also, since the Bill Calculator first stores all tariffs in an array and then reads the array to reconstruct a specific tariff, the memory limitations of a particular system limit the number of tariffs that can be handled. This tool allows a user to calculate a bill from any sampled utility without prior knowledge of the tariff logic or structure. The peculiarities of the tariff logic are stored in data tables and managed by the Bill Calculator software. This version of the software is implemented as a VB module that operates within Microsoft Excel, with input data tables stored in Excel worksheets; the Bill Calculator functions can be accessed through Excel as user-defined worksheet functions. Bill Calculator can calculate approximately 50,000 bills in less than 30 minutes.
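
    The table-driven idea can be sketched compactly: a tariff is a set of rule rows, and the bill is computed by walking them. A hedged sketch of a tiered-energy tariff with a demand charge (all rates invented):

    ```python
    def calculate_bill(tariff, energy_kwh, demand_kw):
        """Compute a bill from table-stored tariff rules (tiered energy + demand)."""
        total = tariff["customer_charge"]
        remaining = energy_kwh
        for block_kwh, rate in tariff["energy_blocks"]:       # (block size, $/kWh)
            used = remaining if block_kwh is None else min(remaining, block_kwh)
            total += used * rate
            remaining -= used
            if remaining <= 0:
                break
        total += demand_kw * tariff["demand_rate"]            # $/kW
        return total

    tariff = {
        "customer_charge": 15.00,
        "energy_blocks": [(500, 0.12), (1500, 0.10), (None, 0.08)],  # None = tail block
        "demand_rate": 9.50,
    }
    print(f"${calculate_bill(tariff, energy_kwh=2600, demand_kw=40):.2f}")
    ```

    Keying tariffs by identifier in a dictionary rather than scanning an array would address the O(N) lookup limitation noted above.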

  16. Software engineering processes for Class D missions

    NASA Astrophysics Data System (ADS)

    Killough, Ronnie; Rose, Debi

    2013-09-01

    Software engineering processes are often seen as anathemas; thoughts of CMMI key process areas and NPR 7150.2A compliance matrices can motivate a software developer to consider other career fields. However, with adequate definition, common-sense application, and an appropriate level of built-in flexibility, software engineering processes provide a critical framework in which to conduct a successful software development project. One problem is that current models seem to be built around an underlying assumption of "bigness," and assume that all elements of the process are applicable to all software projects regardless of size and tolerance for risk. This is best illustrated in NASA's NPR 7150.2A in which, aside from some special provisions for manned missions, the software processes are to be applied based solely on the criticality of the software to the mission, completely agnostic of the mission class itself. That is, the processes applicable to a Class A mission (high priority, very low risk tolerance, very high national significance) are precisely the same as those applicable to a Class D mission (low priority, high risk tolerance, low national significance). This paper will propose changes to NPR 7150.2A, taking mission class into consideration, and discuss how some of these changes are being piloted for a current Class D mission—the Cyclone Global Navigation Satellite System (CYGNSS).

  17. Impact of the new nuclear decay data of ICRP publication 107 on inhalation dose coefficients for workers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Manabe, K.; Endo, Akira; Eckerman, Keith F

    2010-03-01

    The impact of a revision of nuclear decay data on dose coefficients was studied using data newly published in ICRP Publication 107 (ICRP 107) and existing data from ICRP Publication 38 (ICRP 38). Committed effective dose coefficients for occupational inhalation of radionuclides were calculated using the two sets of decay data with the dose and risk calculation software DCAL for 90 elements, 774 nuclides and 1572 cases. The dose coefficients based on ICRP 107 increased by over 10% compared with those based on ICRP 38 in 98 cases, and decreased by over 10% in 54 cases. It was found that the differences in dose coefficients mainly originated from changes in the radiation energy emitted per nuclear transformation. In addition, revisions of the half-lives, radiation types and decay modes also resulted in changes in the dose coefficients.

  18. High-school software development project helps increasing students' awareness of geo-hydrological hazards and their risks

    NASA Astrophysics Data System (ADS)

    Marchesini, Ivan; Rossi, Mauro; Balducci, Vinicio; Salvati, Paola; Guzzetti, Fausto; Bianchini, Andrea; Grzeleswki, Emanuell; Canonico, Andrea; Coccia, Rita; Fiorucci, Gianni Mario; Gobbi, Francesca; Ciuchetti, Monica

    2015-04-01

    In Italy, inundations and landslides are widespread phenomena that impact the population and cause significant economic damage to private and public properties. The perception of the risk posed by these natural geo-hydrological hazards varies geographically and in time, and this variation has negative consequences for risk management and limits the adoption of effective risk reduction strategies. We maintain that targeted education can foster the understanding of geo-hydrological hazards, improving their perception and the awareness of the associated risk. A collaboration between a research center experienced in geo-hydrological hazards and risks (CNR IRPI, Perugia) and a high school (ITIS Alessandro Volta, Perugia) resulted in the design and execution of a project aimed at improving the perception of geo-hydrological risks among high school students and teachers through software development. In the two-year project, students, high school teachers, and research scientists jointly developed software broadly related to landslide and flood hazards. User requirements and system specifications were chosen to facilitate the distribution and use of the software among students and their peers, allowing a wider distribution of the project results. We discuss two software prototypes developed by the high school students: an augmented reality application for improved dissemination of information on landslides and floods with human consequences in Italy, and a crowd science application that allows students (and others, including their families and friends) to collect information on landslide and flood occurrence using modern mobile devices. This information can prove important, e.g., for the validation of landslide forecasting models.

  19. DNA Commission of the International Society for Forensic Genetics: Recommendations on the validation of software programs performing biostatistical calculations for forensic genetics applications.

    PubMed

    Coble, M D; Buckleton, J; Butler, J M; Egeland, T; Fimmers, R; Gill, P; Gusmão, L; Guttman, B; Krawczak, M; Morling, N; Parson, W; Pinto, N; Schneider, P M; Sherry, S T; Willuweit, S; Prinz, M

    2016-11-01

    The use of biostatistical software programs to assist in data interpretation and calculate likelihood ratios is essential to forensic geneticists and part of the daily casework flow for both kinship and DNA identification laboratories. Previous recommendations issued by the DNA Commission of the International Society for Forensic Genetics (ISFG) covered the application of biostatistical evaluations for STR typing results in identification and kinship cases, and this is now being expanded to provide best practices regarding validation and verification of the software required for these calculations. With larger multiplexes, more complex mixtures, and increasing requests for extended family testing, laboratories are relying more than ever on specific software solutions, and sufficient validation, training and extensive documentation are of utmost importance. Here, we present recommendations for the minimum requirements to validate biostatistical software to be used in forensic genetics. We distinguish between developmental validation and the responsibilities of the software developer or provider, and the internal validation studies to be performed by the end user. Recommendations for the software provider address, for example, the documentation of the underlying models used by the software, validation data expectations, version control, implementation and training support, as well as continuity and user notifications. For the internal validations, the recommendations include creating a validation plan, requirements for the range of samples to be tested, standard operating procedure development, and internal laboratory training and education. To ensure that all laboratories have access to a wide range of samples for validation and training purposes, the ISFG DNA Commission encourages collaborative studies and public repositories of STR typing results. Published by Elsevier Ireland Ltd.

  20. In vivo dose perturbation effects of metallic dental alloys during head and neck irradiation with intensity modulated radiation therapy.

    PubMed

    Fuller, Clifton D; Diaz, Irma; Cavanaugh, Sean X; Eng, Tony Y

    2004-07-01

    A patient with base-of-tongue squamous cell carcinoma, who had significant CT-artifact-inducing, non-removable metallic alloy dental restorations in both the mandible and maxilla, was identified. Simultaneously with IMRT treatment, thermoluminescent dosimeters (TLDs) were placed in the oral cavity. After a series of three treatments, the data from the TLDs and the software calculations were analyzed. Analysis of mean in vivo TLD dosimetry reveals differences from the software-predicted dose calculations that fall within acceptable dose variation limits. IMRT dose calculation software is a relatively accurate predictor of dose attenuation and augmentation due to dental alloys within the treatment volume, as measured by intra-oral thermoluminescent dosimetry. IMRT represents a safe and effective methodology for treating patients with head and neck cancer who have non-removable metallic dental work.

  1. Computer Aided Design of Ni-Based Single Crystal Superalloy for Industrial Gas Turbine Blades

    NASA Astrophysics Data System (ADS)

    Wei, Xianping; Gong, Xiufang; Yang, Gongxian; Wang, Haiwei; Li, Haisong; Chen, Xueda; Gao, Zhenhuan; Xu, Yongfeng; Yang, Ming

    The influence of molybdenum, tungsten and cobalt on the stress-rupture properties of the single crystal superalloy PWA1483 has been investigated using simulated calculations in the JMatPro software, which has been widely used to develop single crystal superalloys, and the effect of alloying elements on the stability of the strengthening phase has been revealed using the Thermo-Calc software. The property calculations showed that increasing alloy content could facilitate the precipitation of TCP phases and increase the lattice misfit between the γ and γ' phases; the effect of molybdenum and tantalum was the strongest and that of cobalt the weakest. The chemical composition was then optimized, and the selected compositions showed excellent microstructure stability and stress-rupture properties, as confirmed by the d-electrons concept and software calculation.

  2. The numerical-statistical approach for hazard prediction of landslides and its application in Ukraine

    NASA Astrophysics Data System (ADS)

    Trofimchuk, O.; Kaliukh, Iu.

    2012-04-01

    More than 90% of the territory of Ukraine has complex ground conditions. Unpredictable changes of the natural geological and man-made factors governing ground conditions may lead to dangerous deformation processes resulting in accidents and disasters. Among these, landslides rank first in Ukraine by the amount of damage inflicted, and second only to earthquakes in the world. In total, about 23,000 landslides have been identified in the territory of Ukraine. The standard deterministic procedure for assessing slope stability, especially when reference engineering-geological data are lacking, in many cases yields estimated stability coefficients that differ from the real ones. Applying a probabilistic approach makes it possible to take into account the variable properties of soils and to determine the danger and risk of landslide displacement. The choice of landslide protection measures is directly connected with risk: expensive but reliable, or cheaper but with a greater probability of accidents. The risk determines the consequences, economic, social or otherwise, of a potential landslide displacement on the slope, both during construction of a retaining structure and during its further maintenance. The quintessence of risk determination is the study and extrapolation of past events for each specific occurrence. Expected outcomes and probable damages resulting from a calculated and accepted risk can be determined only with a certain level of uncertainty; improving the accuracy of the numerical and analytical estimates used in calculating the risk magnitude therefore reduces this uncertainty. Calculations of the Chernivtsi shear landslides (Ukraine) were made using Plaxis software, with due account of displacement risk, for the typical distribution diagram of the landslide-prone slope. The calculations showed that seismic events of intensity up to 6 points can significantly impair the characteristics of the soil along the sliding surface and affect the slope stability and the landslide pressure on supporting structures. A further increase in site seismicity (up to 7-8 points) results in a substantial decrease of the stability coefficient: the slope, which at first was stable, passes into a limiting equilibrium state, and then becomes unstable. Based on the calculation results, it is possible to follow step by step the process of stress redistribution in the landslide slope as its seismicity increases, which eventually causes the slope motion (unloading of accumulated stresses). Landslide risk management measures are aimed at ensuring and maintaining an acceptable, or in some cases allowable, risk level. In addition to calculations of soil mass stability, structural parameters, drawings and estimates, it is also necessary to take measures to compensate for uncertainties and prevent unforeseen situations due to incomplete (unreliable) surveys, etc. Depending on the initial situation and requirements, landslide risk management must solve three problems: prevention of negative consequences; reduction of danger and risk; and rectification of consequences together with prevention of new dangers.
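
    The deterministic kernel that a probabilistic approach wraps in sampled soil parameters is a factor-of-safety calculation. A hedged sketch for the classical infinite-slope model with a pseudo-static seismic coefficient (illustrative parameters, not the Chernivtsi case, which was analyzed with Plaxis finite elements):

    ```python
    import numpy as np

    def infinite_slope_fs(c_kpa, phi_deg, gamma_kn_m3, z_m, beta_deg, kh=0.0):
        """Pseudo-static factor of safety for an infinite slope (dry, no seepage)."""
        beta, phi = np.radians(beta_deg), np.radians(phi_deg)
        sigma_n = gamma_kn_m3 * z_m * (np.cos(beta)**2 - kh * np.sin(beta) * np.cos(beta))
        tau_drive = gamma_kn_m3 * z_m * (np.sin(beta) * np.cos(beta) + kh * np.cos(beta)**2)
        return (c_kpa + sigma_n * np.tan(phi)) / tau_drive

    # Probabilistic wrapper: sample uncertain soil strength, report failure probability
    rng = np.random.default_rng(1)
    c   = rng.normal(12.0, 3.0, 100_000)      # cohesion, kPa
    phi = rng.normal(22.0, 2.5, 100_000)      # friction angle, degrees

    for kh in (0.0, 0.1):                     # static vs. a modest seismic coefficient
        fs = infinite_slope_fs(c, phi, gamma_kn_m3=19.0, z_m=4.0, beta_deg=20.0, kh=kh)
        print(f"kh={kh}: mean FS = {fs.mean():.2f}, P(FS < 1) = {(fs < 1).mean():.3f}")
    ```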

  3. Extensions to decision curve analysis, a novel method for evaluating diagnostic tests, prediction models and molecular markers

    PubMed Central

    Vickers, Andrew J; Cronin, Angel M; Elkin, Elena B; Gonen, Mithat

    2008-01-01

    Background Decision curve analysis is a novel method for evaluating diagnostic tests, prediction models and molecular markers. It combines the mathematical simplicity of accuracy measures, such as sensitivity and specificity, with the clinical applicability of decision analytic approaches. Most critically, decision curve analysis can be applied directly to a data set, and does not require the sort of external data on costs, benefits and preferences typically required by traditional decision analytic techniques. Methods In this paper we present several extensions to decision curve analysis, including correction for overfit, confidence intervals, application to censored data (including competing risk) and calculation of decision curves directly from predicted probabilities. All of these extensions are based on straightforward methods that have previously been described in the literature for application to analogous statistical techniques. Results Simulation studies showed that repeated 10-fold cross-validation provided the best method for correcting a decision curve for overfit. The method for applying decision curves to censored data had little bias and coverage was excellent; for competing risk, decision curves were appropriately affected by the incidence of the competing risk and the association between the competing risk and the predictor of interest. Calculation of decision curves directly from predicted probabilities led to a smoothing of the decision curve. Conclusion Decision curve analysis can be easily extended to many of the applications common to performance measures for prediction models. Software to implement decision curve analysis is provided. PMID:19036144
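
    Net benefit, the quantity plotted in a decision curve, is one line of arithmetic per threshold; a sketch with synthetic predictions (the authors provide their own software for the full method, including the extensions above):

    ```python
    import numpy as np

    def net_benefit(y, p, threshold):
        """Net benefit at a probability threshold: TP/n - FP/n * pt/(1 - pt)."""
        y, p = np.asarray(y), np.asarray(p)
        n = len(y)
        treat = p >= threshold
        tp = np.sum(treat & (y == 1))
        fp = np.sum(treat & (y == 0))
        return tp / n - fp / n * threshold / (1 - threshold)

    rng = np.random.default_rng(0)
    y = rng.integers(0, 2, 500)                               # synthetic outcomes
    p = np.clip(y * 0.25 + rng.uniform(0, 0.75, 500), 0, 1)   # informative predictions

    for pt in (0.1, 0.2, 0.4):
        print(f"pt={pt}: model NB={net_benefit(y, p, pt):.3f}, "
              f"treat-all NB={np.mean(y) - (1 - np.mean(y)) * pt / (1 - pt):.3f}")
    ```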

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tuazon, B; Narayanasamy, G; Kirby, N

    Purpose: The purpose of this study was to evaluate and compare the accuracy of the dose calculation algorithms in the second check software programs Radcalc, Diamond, IMSure, and MUcheck against the Pinnacle3 treatment planning system (TPS). Methods: Baseline accuracy of each second check software was established by comparison against Pinnacle3 TPS data using open square fields of 5×5, 10×10, 20×20, 30×30 and 40×40 cm in a SAD setup. 18 previously treated patients' files, comprising 146 step-and-shoot intensity modulated radiotherapy (IMRT) beams and 60 SmartArcs, were exported from the Pinnacle3 TPS to each of the four second check programs. The monitor units (MU) calculated in each program were compared with the TPS values, and the differences were expressed as percentages. Box plots, Pearson correlation, and Bland-Altman analysis were used for comparison of the results. Results: The baseline accuracy was established to within 0.6%, −1.4%, −0.2%, and −1.0% for Diamond, IMSure, MUcheck, and Radcalc, respectively. In the clinical data, the dose differences (mean ± 1 standard deviation) were 0.7% ± 0.1%, −0.3% ± 0.1%, −1.5% ± 0.1%, and 0.4% ± 0.0% for Diamond, IMSure, MUcheck, and Radcalc, respectively. Conclusion: The implementation of the Clarkson algorithm for dose calculation can vary considerably between the programs in question. The currently used second check software, Radcalc, showed the best agreement on average, the smallest variance, and the smallest percent range relative to the Pinnacle3 TPS values. IMSure was closest to the TPS data in average percent difference, but had a significantly larger variance and percent range.

  6. Investigation of the accuracy of MV radiation isocentre calculations in the Elekta cone-beam CT software XVI.

    PubMed

    Riis, Hans L; Moltke, Lars N; Zimmermann, Sune J; Ebert, Martin A; Rowshanfarzad, Pejman

    2016-06-07

    Accurate determination of the megavoltage (MV) radiation isocentre of a linear accelerator (linac) is an important task in radiotherapy. The localization of the MV radiation isocentre is crucial for correct calibration of the in-room lasers and of the cone-beam CT scanner used for patient positioning prior to treatment. Linac manufacturers offer tools for MV radiation isocentre localization, but users have no access to documentation of the underlying method and calculation algorithm used in the commercial software. The idea of this work was to evaluate the accuracy of the software tool for MV radiation isocentre calculation delivered by Elekta, using independent software. The image acquisition was based on the scheme designed by the manufacturer: eight MV images of a ball-bearing (BB) phantom attached to the treatment couch were acquired in each series, recorded at the cardinal gantry angles using the electronic portal imaging device (EPID). Eight Elekta linacs with three different types of multileaf collimators (MLCs) were included in the test. The influence of MLC orientation, x-ray energy, and phantom modifications was examined. The acquired images were analysed using the Elekta x-ray volume imaging (XVI) software and in-house developed (IHD) MATLAB code, and the results from the two software tools were compared. A discrepancy in the longitudinal direction of the isocentre localization was found, averaging 0.23 mm with a maximum of 0.75 mm. Neither the MLC orientation nor the phantom asymmetry in the longitudinal direction appears to cause the discrepancy, and the main cause of the differences could not be clearly identified. However, it is our opinion that the commercial software delivered by the linac manufacturer should be improved to achieve better stability and more precise results in the MV radiation isocentre calculations.

  7. Micrometeoroid and Orbital Debris Risk Assessment With Bumper 3

    NASA Technical Reports Server (NTRS)

    Hyde, J.; Bjorkman, M.; Christiansen, E.; Lear, D.

    2017-01-01

    The Bumper 3 computer code is the primary tool used by NASA for micrometeoroid and orbital debris (MMOD) risk analysis. Bumper 3 (and its predecessors) have been used to analyze a variety of manned and unmanned spacecraft. The code uses NASA's latest micrometeoroid (MEM-R2) and orbital debris (ORDEM 3.0) environment definition models and is updated frequently with ballistic limit equations that describe the hypervelocity impact performance of spacecraft materials. The Bumper 3 program uses these inputs along with a finite element representation of spacecraft geometry to provide a deterministic calculation of the expected number of failures. The Bumper 3 software is configuration controlled by the NASA/JSC Hypervelocity Impact Technology (HVIT) Group. This paper will demonstrate MMOD risk assessment techniques with Bumper 3 used by NASA's HVIT Group. The Permanent Multipurpose Module (PMM) was added to the International Space Station in 2011. A Bumper 3 MMOD risk assessment of this module will show techniques used to create the input model and assign the property IDs. The methodology used to optimize the MMOD shielding for minimum mass while still meeting structural penetration requirements will also be demonstrated.
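
    The final step, turning a flux-area-time expectation into a risk number, follows Poisson statistics; a hedged sketch with invented flux and area values:

    ```python
    import math

    def mmod_risk(flux_per_m2_yr, area_m2, years):
        """Expected penetrations N = flux * area * time; P(no penetration) = exp(-N)."""
        n = flux_per_m2_yr * area_m2 * years
        return n, math.exp(-n)

    # Hypothetical: failure flux of 1e-6 penetrations/m^2/yr over a 300 m^2 module, 10 yr
    n, pnp = mmod_risk(flux_per_m2_yr=1e-6, area_m2=300.0, years=10.0)
    print(f"expected failures N = {n:.4f}, P(no penetration) = {pnp:.4f}")
    ```

    In Bumper 3 the expected number of failures is accumulated per finite element, with the environment-model fluxes filtered through the ballistic limit equations, before conversion to a probability.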

  8. Cumulative Aggregate Risk Evaluation Software

    EPA Science Inventory

    CARES is a state-of-the-art software program designed to conduct complex exposure and risk assessments for pesticides, such as the assessments required under the 1996 Food Quality Protection Act (FQPA). CARES was originally developed under the auspices of CropLife America (CLA),...

  9. Calculation Software

    NASA Technical Reports Server (NTRS)

    1994-01-01

    MathSoft Plus 5.0 is a calculation software package for electrical engineers and computer scientists who need advanced math functionality. It incorporates SmartMath, an expert system that determines a strategy for solving difficult mathematical problems. SmartMath was the result of the integration into Mathcad of CLIPS, a NASA-developed shell for creating expert systems. By using CLIPS, MathSoft, Inc. was able to save the time and money involved in writing the original program.

  10. Electron tunneling in proteins program.

    PubMed

    Hagras, Muhammad A; Stuchebrukhov, Alexei A

    2016-06-05

    We developed a unique integrated software package (called the Electron Tunneling in Proteins Program, or ETP) which provides an environment with capabilities such as tunneling current calculation, semi-empirical quantum mechanical calculation, and molecular modeling simulation for the calculation and analysis of electron transfer reactions in proteins. The ETP program is developed as a cross-platform client-server application in which all calculations are conducted on the server side, while the client terminal displays the resulting outputs in the different supported representations. The ETP program is integrated with a set of well-known computational software packages including Gaussian, BALLVIEW, Dowser, pKip, and APBS. In addition, the ETP program supports various visualization methods for the tunneling calculation results that assist in a more comprehensive understanding of the tunneling process. © 2016 Wiley Periodicals, Inc.

  11. Comparison of Methodologies Using Estimated or Measured Values of Total Corneal Astigmatism for Toric Intraocular Lens Power Calculation.

    PubMed

    Ferreira, Tiago B; Ribeiro, Paulo; Ribeiro, Filomena J; O'Neill, João G

    2017-12-01

    To compare the prediction error in the calculation of toric intraocular lenses (IOLs) associated with methods that estimate the power of the posterior corneal surface (i.e., the Barrett toric calculator and the Abulafia-Koch formula) with that of methods that use real measurements obtained with Scheimpflug imaging: software that uses vectorial calculation (Panacea toric calculator: http://www.panaceaiolandtoriccalculator.com) and ray tracing software (PhacoOptics, Aarhus Nord, Denmark). In 107 eyes of 107 patients undergoing cataract surgery with toric IOL implantation (Acrysof IQ Toric; Alcon Laboratories, Inc., Fort Worth, TX), the residual astigmatism predicted by each calculation method was compared with the manifest refractive astigmatism. The prediction error in residual astigmatism was calculated using vector analysis. All calculation methods resulted in overcorrection of with-the-rule astigmatism and undercorrection of against-the-rule astigmatism. Both estimation methods resulted in lower mean and centroid astigmatic prediction errors, and a larger number of eyes within 0.50 diopters (D) of absolute prediction error, than the methods using real measurements (P < .001). The centroid prediction error (CPE) was 0.07 D at 172° for the Barrett toric calculator and 0.13 D at 174° for the Abulafia-Koch formula (combined with the Holladay calculator). For the methods using real posterior corneal surface measurements, the CPE was 0.25 D at 173° for the Panacea calculator and 0.29 D at 171° for the ray tracing software. The Barrett toric calculator and the Abulafia-Koch formula yielded the lowest astigmatic prediction errors. Directly evaluating total corneal power for toric IOL calculation was not superior to estimating it. [J Refract Surg. 2017;33(12):794-800.]. Copyright 2017, SLACK Incorporated.
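
    The vector analysis used here works in double-angle space, where cylinder magnitude/axis pairs add like ordinary vectors; a sketch with hypothetical refraction values:

    ```python
    import numpy as np

    def to_double_angle(magnitude_d, axis_deg):
        """Map a cylinder (magnitude, axis) to double-angle Cartesian components."""
        theta = np.radians(2 * np.asarray(axis_deg, dtype=float))
        return magnitude_d * np.cos(theta), magnitude_d * np.sin(theta)

    def centroid_error(pred_mag, pred_axis, actual_mag, actual_axis):
        """Centroid prediction error: mean vector difference, back to (magnitude, axis)."""
        px, py = to_double_angle(np.asarray(pred_mag), pred_axis)
        ax, ay = to_double_angle(np.asarray(actual_mag), actual_axis)
        ex, ey = np.mean(px - ax), np.mean(py - ay)
        magnitude = np.hypot(ex, ey)
        axis = (np.degrees(np.arctan2(ey, ex)) / 2) % 180
        return magnitude, axis

    # Hypothetical predicted vs. manifest residual astigmatism for three eyes
    mag, axis = centroid_error(pred_mag=[0.50, 0.75, 0.25], pred_axis=[5, 170, 90],
                               actual_mag=[0.60, 0.60, 0.30], actual_axis=[10, 175, 85])
    print(f"centroid error: {mag:.2f} D at {axis:.0f} deg")
    ```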

  12. The optimization problems of CP operation

    NASA Astrophysics Data System (ADS)

    Kler, A. M.; Stepanova, E. L.; Maximov, A. S.

    2017-11-01

    The problem of enhancing the energy and economic efficiency of CP is an urgent one, and one of the main methods for solving it is optimization of CP operation. To solve the optimization problems of CP operation, Energy Systems Institute, SB of RAS, has developed software based on the techniques and software tools of mathematical modeling and optimization of heat and power installations. Detailed mathematical models of new equipment have been developed in this work; they describe the processes that occur in the installations with sufficient accuracy. The developed models include steam turbine models (based on the checking calculation) which take account of all steam turbine compartments and the regeneration system, and which also allow calculations with regenerative heaters disconnected. The software for mathematical modeling of equipment and optimization of CP operation implements the technique for optimizing CP operating conditions as software tools and integrates them in a common user interface. The optimization of CP operation often generates the need to determine the minimum and maximum possible total useful electricity capacity of the plant at set heat loads of consumers, i.e., the interval over which the CP capacity may vary (see the sketch below). The software has been applied to optimize the operating conditions of the Novo-Irkutskaya CP of JSC “Irkutskenergo”. The efficiency of operating-condition optimization and the possibility of determining the CP energy characteristics needed for optimization of power system operation are shown.
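
    The min/max capacity question at the end of the abstract can be posed, in toy form, as a linear program. The sketch below uses two fictitious units with invented coefficients and is far simpler than the Institute's detailed turbine models.

```python
from scipy.optimize import linprog

# Decision variables x = [e1, h1, e2, h2]: electric and heat output (MW) of two
# fictitious cogeneration units. All coefficients are invented for illustration.
H_demand = 300.0                               # fixed total heat load, MW(th)
A_ub = [[1, 0.4, 0, 0], [0, 0, 1, 0.4]]        # e_i + 0.4*h_i <= 260 per unit
b_ub = [260.0, 260.0]
A_eq = [[0, 1, 0, 1]]                          # h1 + h2 must meet the heat load
b_eq = [H_demand]
bounds = [(40, 250), (0, 200), (40, 250), (0, 200)]

for sign, label in [(-1.0, "maximum"), (1.0, "minimum")]:
    # linprog minimizes c @ x, so sign = -1 maximizes total electric output.
    res = linprog(c=[sign, 0, sign, 0], A_ub=A_ub, b_ub=b_ub,
                  A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
    print(f"{label} total electric output: {res.x[0] + res.x[2]:.1f} MW")
```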

  13. Software for predictive microbiology and risk assessment: a description and comparison of tools presented at the ICPMF8 Software Fair.

    PubMed

    Tenenhaus-Aziza, Fanny; Ellouze, Mariem

    2015-02-01

    The 8th International Conference on Predictive Modelling in Food was held in Paris, France in September 2013. One of the major topics of this conference was the transfer of knowledge and tools between academics and stakeholders of the food sector. During the conference, a "Software Fair" was held to provide information and demonstrations of predictive microbiology and risk assessment software. This article presents an overall description of the 16 software tools demonstrated at the session and provides a comparison based on several criteria such as the modeling approach, the different modules available (e.g. databases, predictors, fitting tools, risk assessment tools), the studied environmental factors (temperature, pH, aw, etc.), the type of media (broth or food) and the number and type of the provided micro-organisms (pathogens and spoilers). The present study is a guide to help users select the software tools which are most suitable to their specific needs, before they test and explore the tool(s) in more depth. Copyright © 2014 Elsevier Ltd. All rights reserved.
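
    As a flavor of what the predictor modules of such tools compute, here is a minimal sketch of the classic Ratkowsky square-root model for growth rate versus temperature; the coefficients are illustrative and are not taken from any of the 16 tools.

```python
def ratkowsky_growth_rate(T, b=0.023, T_min=2.5):
    """Square-root (Ratkowsky) model: sqrt(mu) = b * (T - T_min) for T > T_min.

    T      : temperature in deg C
    b      : regression coefficient (illustrative value)
    T_min  : notional minimum growth temperature (illustrative value)
    Returns the specific growth rate mu (1/h); zero at or below T_min.
    """
    if T <= T_min:
        return 0.0
    return (b * (T - T_min)) ** 2

for temp in (4, 10, 25, 37):
    print(f"{temp:>2} degC -> mu = {ratkowsky_growth_rate(temp):.4f} 1/h")
```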

  14. A Dynamic Human Health Risk Assessment System

    PubMed Central

    Prasad, Umesh; Singh, Gurmit; Pant, A. B.

    2012-01-01

    An online human health risk assessment system (OHHRAS) has been designed and developed in the form of a prototype database-driven system and made available for the population of India through a website – www.healthriskindia.in. OHHRAS provide the three utilities, that is, health survey, health status, and bio-calculators. The first utility health survey is functional on the basis of database being developed dynamically and gives the desired output to the user on the basis of input criteria entered into the system; the second utility health status is providing the output on the basis of dynamic questionnaire and ticked (selected) answers and generates the health status reports based on multiple matches set as per advise of medical experts and the third utility bio-calculators are very useful for the scientists/researchers as online statistical analysis tool that gives more accuracy and save the time of user. The whole system and database-driven website has been designed and developed by using the software (mainly are PHP, My-SQL, Deamweaver, C++ etc.) and made available publically through a database-driven website (www.healthriskindia.in), which are very useful for researchers, academia, students, and general masses of all sectors. PMID:22778520

  15. Discrete Element Modeling (DEM) of Triboelectrically Charged Particles: Revised Experiments

    NASA Technical Reports Server (NTRS)

    Hogue, Michael D.; Calle, Carlos I.; Curry, D. R.; Weitzman, P. S.

    2008-01-01

    In a previous work, the addition of basic screened Coulombic electrostatic forces to an existing commercial discrete element modeling (DEM) software package was reported. Triboelectric experiments were performed to charge glass spheres rolling on inclined planes of various materials. Charge generation constants and the Q/m ratios for the test materials were calculated from the experimental data and compared to the simulation output of the DEM software. In this paper, we discuss new values of the charge generation constants calculated from improved experimental procedures and data. Planned work to incorporate dielectrophoretic forces, van der Waals forces, and advanced mechanical forces into the software is also discussed.
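
    The screened Coulombic interaction mentioned above is commonly modeled as a Yukawa-type potential. The sketch below computes the pairwise force between two charged spheres under that assumption; all values are illustrative and this is not the commercial DEM package's code.

```python
import numpy as np

EPS0 = 8.8541878128e-12  # vacuum permittivity, F/m

def screened_coulomb_force(q1, q2, r_vec, screening_length):
    """Force on particle 1 from particle 2 for a screened (Yukawa) potential.

    U(r) = q1*q2/(4*pi*eps0*r) * exp(-r/lambda)
    F(r) = q1*q2/(4*pi*eps0*r^2) * exp(-r/lambda) * (1 + r/lambda),
    directed along r_vec (particle 1 minus particle 2), repulsive for like charges.
    """
    r = np.linalg.norm(r_vec)
    mag = (q1 * q2 / (4.0 * np.pi * EPS0 * r**2)
           * np.exp(-r / screening_length) * (1.0 + r / screening_length))
    return mag * (r_vec / r)

# Two 1-pC charges 5 mm apart with a 10 mm screening length (illustrative).
f = screened_coulomb_force(1e-12, 1e-12, np.array([5e-3, 0.0, 0.0]), 10e-3)
print("force vector (N):", f)
```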

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahmed, A.; Chadwick, T.; Makhlouf, M.

    This paper deals with the effects of various solidification variables such as cooling rate, temperature gradient, solidification rate, etc. on the microstructure and shrinkage defects in aluminum alloy (A356) castings. The effects are first predicted using commercial solidification modeling software and then verified experimentally. For this work, the authors consider a rectangular bar cast in a sand mold. Simulation is performed using SIMULOR, a finite-volume-based casting simulation program. Microstructural variables such as dendrite arm spacing (DAS) and defects (percentage porosity) are calculated from the temperature fields, cooling rate, solidification time, etc. predicted by the computer software. The same variables are then determined experimentally in the foundry: the test piece is cast in a resin (sodium silicate) bonded sand mold, and the DAS and porosity are measured using scanning electron microscopy and image analysis. The predictions from the software are compared with the experimental results, which are presented and critically analyzed to determine the quality of the predictions. The usefulness of commercial solidification modeling software as a tool for the foundry is also discussed.
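
    The link between cooling rate and dendrite arm spacing used in such comparisons is commonly expressed as a power law; a minimal sketch with illustrative coefficients for an Al-Si alloy (not values fitted in this paper):

```python
def dendrite_arm_spacing(cooling_rate, k=39.4, n=0.317):
    """Secondary DAS (micrometers) from cooling rate R (K/s): DAS = k * R**(-n).

    k and n are illustrative fit constants for an Al-Si alloy.
    """
    return k * cooling_rate ** (-n)

for rate in (0.1, 1.0, 10.0):
    print(f"cooling rate {rate:5.1f} K/s -> DAS ~ {dendrite_arm_spacing(rate):.1f} um")
```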

  17. The SCEC Broadband Platform: A Collaborative Open-Source Software Package for Strong Ground Motion Simulation and Validation

    NASA Astrophysics Data System (ADS)

    Silva, F.; Maechling, P. J.; Goulet, C. A.; Somerville, P.; Jordan, T. H.

    2014-12-01

    The Southern California Earthquake Center (SCEC) Broadband Platform is a collaborative software development project involving geoscientists, earthquake engineers, graduate students, and the SCEC Community Modeling Environment. The SCEC Broadband Platform (BBP) is open-source scientific software that can generate broadband (0-100 Hz) ground motions for earthquakes, integrating complex scientific modules that implement rupture generation, low- and high-frequency seismogram synthesis, non-linear site effects calculation, and visualization into a software system that supports easy on-demand computation of seismograms. The Broadband Platform operates in two primary modes: validation simulations and scenario simulations. In validation mode, the Platform runs earthquake rupture and wave propagation modeling software to calculate seismograms for a well-observed historical earthquake. Then, the BBP calculates a number of goodness of fit measurements that quantify how well the model-based broadband seismograms match the observed seismograms for a certain event. Based on these results, the Platform can be used to tune and validate different numerical modeling techniques. In scenario mode, the Broadband Platform can run simulations for hypothetical (scenario) earthquakes. In this mode, users input an earthquake description, a list of station names and locations, and a 1D velocity model for their region of interest, and the Broadband Platform software then calculates ground motions for the specified stations. Working in close collaboration with scientists and research engineers, the SCEC software development group continues to add new capabilities to the Broadband Platform and to release new versions as open-source scientific software distributions that can be compiled and run on many Linux computer systems. Our latest release includes 5 simulation methods, 7 simulation regions covering California, Japan, and Eastern North America, the ability to compare simulation results against GMPEs, and several new data products, such as map and distance-based goodness of fit plots. As the number and complexity of scenarios simulated using the Broadband Platform increases, we have added batching utilities to substantially improve support for running large-scale simulations on computing clusters.

  18. Integrating Free Computer Software in Chemistry and Biochemistry Instruction: An International Collaboration

    ERIC Educational Resources Information Center

    Cedeno, David L.; Jones, Marjorie A.; Friesen, Jon A.; Wirtz, Mark W.; Rios, Luz Amalia; Ocampo, Gonzalo Taborda

    2010-01-01

    At the Universidad de Caldas, Manizales, Colombia, we used their new computer facilities to introduce chemistry graduate students to biochemical database mining and quantum chemistry calculations using freeware. These hands-on workshops allowed the students a strong introduction to easily accessible software and how to use this software to begin…

  19. Software for Probabilistic Risk Reduction

    NASA Technical Reports Server (NTRS)

    Hensley, Scott; Michel, Thierry; Madsen, Soren; Chapin, Elaine; Rodriguez, Ernesto

    2004-01-01

    A computer program implements a methodology, denoted probabilistic risk reduction, that is intended to aid in planning the development of complex software and/or hardware systems. This methodology integrates two complementary prior methodologies: (1) that of probabilistic risk assessment and (2) a risk-based planning methodology, implemented in a prior computer program known as Defect Detection and Prevention (DDP), in which multiple requirements and the beneficial effects of risk-mitigation actions are taken into account. The present methodology and the software are able to accommodate both process knowledge (notably of the efficacy of development practices) and product knowledge (notably of the logical structure of a system, the development of which one seeks to plan). Estimates of the costs and benefits of a planned development can be derived. Functional and non-functional aspects of software can be taken into account, and trades made among them. It becomes possible to optimize the planning process in the sense that it becomes possible to select the best suite of process steps and design choices to maximize the expectation of success while remaining within budget.
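
    The cost-benefit core of this kind of risk-based planning can be illustrated with a toy expected-loss calculation. The sketch below assumes its own data layout and invented numbers; it is not the DDP tool's actual model.

```python
# Toy probabilistic-risk-reduction bookkeeping: each risk has a likelihood and
# an impact; each mitigation has a cost and reduces specific risks by some
# effectiveness factor. All names and numbers are illustrative.
risks = {"sensor dropout": (0.30, 100.0), "memory leak": (0.20, 60.0)}
mitigations = {
    "code review":   {"cost": 10.0, "reduces": {"memory leak": 0.7}},
    "HW redundancy": {"cost": 25.0, "reduces": {"sensor dropout": 0.8}},
}

def expected_loss(selected):
    """Sum of residual likelihood * impact over all risks."""
    loss = 0.0
    for name, (likelihood, impact) in risks.items():
        remaining = likelihood
        for m in selected:
            remaining *= 1.0 - mitigations[m]["reduces"].get(name, 0.0)
        loss += remaining * impact
    return loss

baseline = expected_loss([])
for choice in ([], ["code review"], ["code review", "HW redundancy"]):
    cost = sum(mitigations[m]["cost"] for m in choice)
    print(f"{choice}: expected loss {expected_loss(choice):.1f}, "
          f"benefit {baseline - expected_loss(choice):.1f}, cost {cost:.1f}")
```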

  20. Assessing ergonomic risks of software: Development of the SEAT.

    PubMed

    Peres, S Camille; Mehta, Ranjana K; Ritchey, Paul

    2017-03-01

    Software utilizing interaction designs that require extensive dragging or clicking of icons may increase users' risks for upper extremity cumulative trauma disorders. The purpose of this research is to develop a Self-report Ergonomic Assessment Tool (SEAT) for assessing the risks of software interaction designs and facilitate mitigation of those risks. A 28-item self-report measure was developed by combining and modifying items from existing industrial ergonomic tools. Data were collected from 166 participants after they completed four different tasks that varied by method of input (touch or keyboard and mouse) and type of task (selecting or typing). Principal component analysis found distinct factors associated with stress (i.e., demands) and strain (i.e., response). Repeated measures analyses of variance showed that participants could discriminate the different strain induced by the input methods and tasks. However, participants' ability to discriminate between the stressors associated with that strain was mixed. Further validation of the SEAT is necessary, but these results indicate that the SEAT may be a viable method of assessing ergonomic risks presented by software design. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Development of efficiency module of organization of Arctic sea cargo transportation with application of neural network technologies

    NASA Astrophysics Data System (ADS)

    Sobolevskaya, E. Yu; Glushkov, S. V.; Levchenko, N. G.; Orlov, A. P.

    2018-05-01

    An analysis of software intended for organizing and managing the processes of sea cargo transportation has been carried out. The shortcomings of existing information resources for organizing work in the Arctic and Subarctic regions of the Far East are presented: the lack of decision support systems and the lack of factor analysis for calculating delivery time and cost. The architecture of a module for calculating the effectiveness of the organization of sea cargo transportation has been developed. The simulation process, which is based on a neural network, has been considered, and the main classification factors with their weighting coefficients have been identified. The architecture of the neural network has been developed to calculate the efficiency of the organization of sea cargo transportation in Arctic conditions, and the architecture of an intellectual system for organizing sea cargo transportation has been developed that takes into account the difficult navigation conditions in the Arctic. Its implementation will provide the shipping company's management with predictive analytics; support decision-making; calculate the most efficient delivery route; provide an on-demand online transportation forecast; and minimize shipping cost, delays in transit, and risks to cargo safety.

  2. Factors associated with perception of risk of contracting HIV among secondary school female learners in Mbonge subdivision of rural Cameroon.

    PubMed

    Tarkang, Elvis Enowbeyang

    2014-01-01

    Since learners in secondary schools fall within the age group hardest hit by HIV/AIDS, it is obvious that these learners might be at high risk of contracting HIV/AIDS. However, little has been explored on the perception of risk of contracting HIV among secondary school learners in Cameroon. This study aimed at examining the perception of risk of contracting HIV among secondary school learners in Mbonge subdivision of rural Cameroon, using the Health Belief Model (HBM) as framework. A quantitative, correlational design was adopted, using a self-administered questionnaire to collect data from 210 female learners selected through a disproportional, stratified, simple random sampling technique from three participating senior secondary schools. Statistics were calculated using the SPSS version 20 software program. Only 39.4% of the respondents perceived themselves to be at high risk of contracting HIV, though the majority (54.0%) were sexually active. Multinomial logistic regression analyses show that sexual risk behaviours (p=0.000) and the Integrated Value Mapping (IVM) of the perception components of the HBM are the most significant factors associated with perception of risk of contracting HIV at the level p<0.05. The findings of this study can play an instrumental role in the development of effective preventive and interventional messages for adolescents in Cameroon.

  3. Real-time 3D radiation risk assessment supporting simulation of work in nuclear environments.

    PubMed

    Szőke, I; Louka, M N; Bryntesen, T R; Bratteli, J; Edvardsen, S T; RøEitrheim, K K; Bodor, K

    2014-06-01

    This paper describes the latest developments at the Institute for Energy Technology (IFE) in Norway, in the field of real-time 3D (three-dimensional) radiation risk assessment for the support of work simulation in nuclear environments. 3D computer simulation can greatly facilitate efficient work planning, briefing, and training of workers. It can also support communication within and between work teams, and with advisors, regulators, the media and public, at all the stages of a nuclear installation's lifecycle. Furthermore, it is also a beneficial tool for reviewing current work practices in order to identify possible gaps in procedures, as well as to support the updating of international recommendations, dissemination of experience, and education of the current and future generation of workers. IFE has been involved in research and development into the application of 3D computer simulation and virtual reality (VR) technology to support work in radiological environments in the nuclear sector since the mid 1990s. During this process, two significant software tools have been developed, the VRdose system and the Halden Planner, and a number of publications have been produced to contribute to improving the safety culture in the nuclear industry. This paper describes the radiation risk assessment techniques applied in earlier versions of the VRdose system and the Halden Planner, for visualising radiation fields and calculating dose, and presents new developments towards implementing a flexible and up-to-date dosimetric package in these 3D software tools, based on new developments in the field of radiation protection. The latest versions of these 3D tools are capable of more accurate risk estimation, permit more flexibility via a range of user choices, and are applicable to a wider range of irradiation situations than their predecessors.

  4. Prediction of rodent carcinogenic potential of naturally occurring chemicals in the human diet using high-throughput QSAR predictive modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valerio, Luis G.; Arvidson, Kirk B.; Chanderbhan, Ronald F.

    2007-07-01

    Consistent with the U.S. Food and Drug Administration (FDA) Critical Path Initiative, predictive toxicology software programs employing quantitative structure-activity relationship (QSAR) models are currently under evaluation for regulatory risk assessment and scientific decision support for highly sensitive endpoints such as carcinogenicity, mutagenicity and reproductive toxicity. At the FDA's Center for Food Safety and Applied Nutrition's Office of Food Additive Safety and the Center for Drug Evaluation and Research's Informatics and Computational Safety Analysis Staff (ICSAS), the use of computational SAR tools for both qualitative and quantitative risk assessment applications is being developed and evaluated. One tool of current interest is MDL-QSAR predictive discriminant analysis modeling of rodent carcinogenicity, which has been previously evaluated for pharmaceutical applications by the FDA ICSAS. The study described in this paper aims to evaluate the utility of this software to estimate the carcinogenic potential of small, organic, naturally occurring chemicals found in the human diet. In addition, a group of 19 known synthetic dietary constituents that were positive in rodent carcinogenicity studies served as a control group. In the test group of naturally occurring chemicals, 101 were found to be suitable for predictive modeling using this software's discriminant analysis modeling approach. Predictions performed on these compounds were compared to published experimental evidence of each compound's carcinogenic potential. Experimental evidence included relevant toxicological studies such as rodent cancer bioassays, rodent anti-carcinogenicity studies, genotoxicity studies, and the presence of chemical structural alerts. Statistical indices of predictive performance were calculated to assess the utility of the predictive modeling method. Results revealed good predictive performance using this software's rodent carcinogenicity module, whose database of over 1200 chemicals comprises primarily pharmaceutical, industrial and some natural products developed under an FDA-MDL cooperative research and development agreement (CRADA). The predictive performance for this group of dietary natural products and the control group was 97% sensitivity and 80% concordance. Specificity was marginal at 53%. This study finds that the in silico QSAR analysis employing this software's rodent carcinogenicity database is capable of identifying the rodent carcinogenic potential of naturally occurring organic molecules found in the human diet with a high degree of sensitivity. It is the first study to demonstrate successful QSAR predictive modeling of naturally occurring carcinogens found in the human diet using an external validation test. Further test validation of this software and expansion of the training data set for dietary chemicals will help to support the future use of such QSAR methods for screening and prioritizing the risk of dietary chemicals when actual animal data are inadequate, equivocal, or absent.
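
    The performance indices quoted above follow from a standard 2x2 confusion matrix; a minimal sketch with hypothetical counts:

```python
def predictive_performance(tp, fn, tn, fp):
    """Standard binary-classification indices used in QSAR validation."""
    sensitivity = tp / (tp + fn)                   # fraction of carcinogens flagged
    specificity = tn / (tn + fp)                   # fraction of non-carcinogens cleared
    concordance = (tp + tn) / (tp + fn + tn + fp)  # overall agreement
    return sensitivity, specificity, concordance

# Hypothetical counts for an external validation set.
sens, spec, conc = predictive_performance(tp=68, fn=2, tn=16, fp=14)
print(f"sensitivity {sens:.0%}, specificity {spec:.0%}, concordance {conc:.0%}")
```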

  5. TU-FG-201-07: Development of SRS Conical Collimator Collision Prediction Software for Radiation Treatment Safety

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gutti, V; Morrow, A; Kim, S

    Purpose: Stereotactic radiosurgery (SRS) treatments using conical collimators can potentially result in gantry collision with the treatment table due to limited collision-clear space. In-house software was developed to help the SRS treatment planner mitigate potential collisions of SRS conical collimators (Varian Medical Systems, Palo Alto, CA) with the treatment table; it was designed to remove treatment re-planning secondary to unexpected collisions. Methods: A BrainLAB SRS ICT Frameless Extension used for SRS treatments in our clinic was mathematically modelled using surface points registered in the 3D coordinate space of the couch extension. The surface points are transformed based on the treatment isocenter point, and potential collisions are determined in 3D space for couch and gantry angle combinations; the distance between the SRS conical collimators and the LINAC isocenter is known. The collision detection model was programmed in MATLAB (MathWorks, Natick, MA) to display graphical plots of the calculations, and the plotted data are used to avoid the gantry and couch angle combinations that would likely result in a collision. We have utilized the cone collision tool for 23 SRS cone treatment plans (8 retrospective and 15 prospective, for 10 patients). Results: Twenty-one plans agreed with the software tool's collision predictions. However, in two plans a collision was observed with a 0.5 cm margin when the software predicted no collision; therefore, additional margins were added to the clearance criteria in the program to achieve a lower risk of actual collisions. Conclusion: Our in-house collision check software successfully avoided SRS cone re-planning in 91.3% of cases by reducing cone collisions with the treatment table. Future developments will include a CT image data set based collision prediction model as well as a beam angle optimization tool to avoid normal critical tissues and previously treated lesions.
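
    Geometrically, such a check reduces to transforming the modelled couch-surface points for each couch angle and testing clearance against the region swept by the gantry head. The sketch below uses a deliberately simplified clearance model with illustrative dimensions; it is not the clinical tool.

```python
import numpy as np

def rotate_couch(points, couch_deg):
    """Rotate Nx3 couch-surface points about the vertical axis through isocenter."""
    a = np.deg2rad(couch_deg)
    R = np.array([[np.cos(a), -np.sin(a), 0.0],
                  [np.sin(a),  np.cos(a), 0.0],
                  [0.0,        0.0,       1.0]])
    return points @ R.T

def collision_possible(points, couch_deg, r_clear=35.5, slab_halfwidth=20.0, margin=0.5):
    """A point can be struck by the rotating gantry head if it lies inside the
    gantry rotation slab (small |y|, y = gantry rotation axis) and farther from
    that axis than the cone clearance radius. Dimensions in cm; all illustrative."""
    p = rotate_couch(points, couch_deg)
    radial = np.hypot(p[:, 0], p[:, 2])        # distance from the gantry rotation axis
    in_slab = np.abs(p[:, 1]) <= slab_halfwidth
    return bool(np.any(in_slab & (radial >= r_clear - margin)))

# Illustrative couch-extension surface points (cm, isocenter at origin).
couch = np.array([[35.0, 10.0, -12.0], [40.0, -8.0, -12.0], [55.0, 0.0, -12.0]])
for angle in (0, 45, 90):
    print(f"couch {angle:3d} deg -> possible collision: {collision_possible(couch, angle)}")
```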

  6. Software IV and V Research Priorities and Applied Program Accomplishments Within NASA

    NASA Technical Reports Server (NTRS)

    Blazy, Louis J.

    2000-01-01

    The mission of this research is to be world-class creators and facilitators of innovative, intelligent, high-performance, reliable information technologies that enable NASA missions to (1) increase software safety and quality through error avoidance and early detection and resolution of errors, by utilizing and applying empirically based software engineering best practices; (2) ensure customer software risks are identified and that requirements are met or exceeded; (3) research, develop, apply, verify, and publish software technologies for competitive advantage and the advancement of science; and (4) facilitate the transfer of science and engineering data, methods, and practices to NASA, educational institutions, state agencies, and commercial organizations. The goals are to become a national Center of Excellence (COE) in software and system independent verification and validation, and to become an international leading force in the field of software engineering for improving the safety, quality, reliability, and cost performance of software systems. This project addresses the following problems: ensuring the safety of NASA missions, ensuring requirements are met, minimizing programmatic and technological risks of software development and operations, improving software quality, reducing costs and time to delivery, and improving the science of software engineering.

  7. Is radiography justified for the evaluation of patients presenting with cervical spine trauma?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Theocharopoulos, Nicholas; Chatzakis, Georgios; Damilakis, John

    2009-10-15

    Conventional radiography has been for decades the standard method of evaluation for cervical spine trauma patients. However, currently available helical multidetector CT scanners allow multiplanar reconstruction of images, leading to increased diagnostic accuracy. The purpose of this study was to determine the relative benefit/risk ratio between cervical spine CT and cervical spine radiography and between cervical spine CT and cervical spine radiography, followed by CT as an adjunct for positive findings. A decision analysis model for the determination of the optimum imaging technique was developed. The sensitivity and specificity of CT and radiography were obtained by dedicated meta-analysis. Lifetime attributable risk of mortal cancer from CT and radiography was calculated using updated organ-specific risk coefficients and organ-absorbed doses. Patient organ doses from radiography were calculated using Monte Carlo techniques, simulated exposures performed on an anthropomorphic phantom, and thermoluminescence dosimetry. A prospective patient study was performed regarding helical CT scans of the cervical spine. Patient doses were calculated based on the dose-length-product values and a Monte Carlo-based CT dosimetry software program. Three groups of patient risk for cervical spine fracture were incorporated in the decision model on the basis of hypothetical trauma mechanism and clinical findings. Radiation effects were assessed separately for males and females for four age groups (20, 40, 60, and 80 yr old). Effective dose from radiography amounts to 0.050 mSv and from a typical CT scan to 3.8 mSv. The use of CT in a hypothetical cohort of 10^6 patients prevents approximately 130 incidents of paralysis in the low risk group (a priori fracture probability of 0.5%), 500 in the moderate risk group (a priori fracture probability of 2%), and 5100 in the high risk group (a priori fracture probability of 20%). The expense of this CT-based prevention is 15-32 additional radiogenic lethal cancer incidents. According to the decision model calculations, the use of CT is more favorable over the use of radiography alone or radiography with CT by a factor of 13, for low risk 20 yr old patients, to a factor of 23, for high risk patients younger than 80 yr old. The radiography/CT imaging strategy slightly outperforms plain radiography for high and moderate risk patients. Regardless of the patient age, sex, and fracture risk, the higher diagnostic accuracy obtained by the CT examination counterbalances the increase in dose compared to plain radiography or radiography followed by CT only for positive radiographs and renders CT utilization justified and the radiographic screening redundant.
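
    At its core, the decision model weighs paralysis incidents prevented against radiogenic cancers induced. A minimal tally using the cohort figures quoted above follows; the paper's age-, sex-, and outcome-severity weighting is not reproduced, so these raw ratios differ from the published factors of 13-23.

```python
# Benefit/risk tally per 10^6 patients, using the cohort figures quoted in the
# abstract. Treating a prevented paralysis and an induced lethal cancer as
# equally weighted outcomes is a simplification for illustration.
prevented = {"low risk": 130, "moderate risk": 500, "high risk": 5100}
induced_cancers = 32   # upper end of the quoted 15-32 range

for group, n in prevented.items():
    print(f"{group:>13}: {n} paralyses prevented vs {induced_cancers} "
          f"cancers induced -> raw ratio ~ {n / induced_cancers:.0f}")
```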

  8. Integrated analyses in plastics forming

    NASA Astrophysics Data System (ADS)

    Bo, Wang

    This thesis describes progress made in the analysis, simulation, and testing of plastics forming, which can be applied to injection and compression mould design. Three stages of plastics forming have been investigated: filling analysis, cooling analysis, and ejection analysis. The filling stage has been analysed and calculated using MOLDFLOW and FILLCALC V software, and a comparison of high-speed compression moulding and injection moulding has been made. The cooling stage has been analysed using MOLDFLOW software and a finite difference computer program; the latter can be used as a sample program to assess the feasibility of cooling different materials to required target temperatures under controlled cooling conditions. Thermal imaging has also been introduced to determine the actual process temperatures; it can be used as a powerful tool to analyse mould surface temperatures and to verify the mathematical model. A buckling problem in the ejection stage has been modelled and calculated with PATRAN/ABAQUS finite element analysis software and tested. These calculations and analyses are applied to a particular case but can serve as an example for general analysis and calculation in the ejection stage of plastics forming.
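
    The finite-difference cooling calculation mentioned above can be sketched in a few lines; below is a minimal 1D explicit scheme with illustrative material constants, not the thesis program itself.

```python
import numpy as np

# Explicit 1D finite-difference cooling of a polymer slab between mould walls.
alpha = 1.0e-7        # thermal diffusivity of the melt, m^2/s (illustrative)
L, n = 4e-3, 41       # slab thickness (m) and node count
dx = L / (n - 1)
dt = 0.4 * dx**2 / alpha          # satisfies the explicit stability limit (<= 0.5)

T = np.full(n, 230.0)             # initial melt temperature, deg C
T_wall, T_target = 50.0, 90.0     # coolant-controlled wall and ejection target

t = 0.0
while T.max() > T_target:
    T[0] = T[-1] = T_wall                        # fixed-temperature mould walls
    T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    t += dt
print(f"time to cool centreline below {T_target} degC: {t:.1f} s")
```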

  9. WaveAR: A software tool for calculating parameters for water waves with incident and reflected components

    NASA Astrophysics Data System (ADS)

    Landry, Blake J.; Hancock, Matthew J.; Mei, Chiang C.; García, Marcelo H.

    2012-09-01

    The ability to determine wave heights and phases along a spatial domain is vital to understanding a wide range of littoral processes. The software tool presented here employs established Stokes wave theory and sampling methods to calculate parameters for the incident and reflected components of a field of weakly nonlinear waves, monochromatic at first order in wave slope and propagating in one horizontal dimension. The software calculates wave parameters over an entire wave tank and accounts for reflection, weak nonlinearity, and a free second harmonic. Currently, no publicly available program has such functionality. The included MATLAB®-based open source code has also been compiled for Windows®, Mac® and Linux® operating systems. An additional companion program, VirtualWave, is included to generate virtual wave fields for WaveAR. Together, the programs serve as ideal analysis and teaching tools for laboratory water wave systems.
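
    For a purely monochromatic field, the reflection coefficient follows directly from the standing-wave envelope along the tank. The sketch below demonstrates this classic decomposition on synthetic data; it is far simpler than WaveAR's weakly nonlinear Stokes-theory fit.

```python
import numpy as np

# Synthetic partially standing wave: incident amplitude a_i, reflected a_r.
a_i, a_r, k = 0.05, 0.015, 2.0 * np.pi / 1.5   # m, m, rad/m (illustrative)
x = np.linspace(0.0, 6.0, 2000)
# Time-maximum surface elevation at each x for eta = a_i*cos(kx - wt) + a_r*cos(kx + wt):
envelope = np.sqrt(a_i**2 + a_r**2 + 2.0 * a_i * a_r * np.cos(2.0 * k * x))

# Antinodes see a_i + a_r and nodes a_i - a_r, so the envelope extrema give:
A_max, A_min = envelope.max(), envelope.min()
a_i_est = (A_max + A_min) / 2.0
a_r_est = (A_max - A_min) / 2.0
print(f"estimated incident amplitude   : {a_i_est:.4f} m")
print(f"estimated reflection coefficient: {a_r_est / a_i_est:.3f}")
```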

  10. BIM cost analysis of transport infrastructure projects

    NASA Astrophysics Data System (ADS)

    Volkov, Andrey; Chelyshkov, Pavel; Grossman, Y.; Khromenkova, A.

    2017-10-01

    The article describes a method for analyzing the energy costs of transport infrastructure objects using BIM software. The paper considers several options for the orientation of a building using the SketchUp and IES VE software programs; these options allow the best direction of the building facades to be chosen. Particular attention is given to the distribution of the temperature field in a cross-section of the wall, according to a calculation made in the ELCUT software. Issues related to the calculation of solar radiation penetration into a building and the selection of translucent structures are also considered. The article presents data on the building codes relating to the transport sector on the basis of which the calculations were made. The authors emphasize that BIM programs should be implemented and used in order to optimize the thermal behavior of a building and increase its energy efficiency using climatic data.
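
    A minimal steady-state version of such a wall temperature calculation uses series thermal resistances; the layer data below are illustrative, and the ELCUT model referenced above solves the full field problem.

```python
# Steady-state 1D temperature profile through a layered wall.
# Each layer: (name, thickness in m, thermal conductivity in W/(m K)).
layers = [("plaster", 0.02, 0.70), ("brick", 0.25, 0.56), ("insulation", 0.10, 0.04)]
T_inside, T_outside = 20.0, -25.0   # deg C
h_in, h_out = 8.7, 23.0             # surface heat transfer coefficients, W/(m^2 K)

R_total = 1.0 / h_in + sum(d / k for _, d, k in layers) + 1.0 / h_out
q = (T_inside - T_outside) / R_total          # heat flux, W/m^2

T = T_inside - q / h_in                       # inner surface temperature
print(f"heat flux: {q:.1f} W/m^2, inner surface: {T:.1f} degC")
for name, d, k in layers:
    T -= q * d / k                            # temperature drop across each layer
    print(f"after {name:>10}: {T:6.1f} degC")
```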

  11. ASR4: A computer code for fitting and processing 4-gage anelastic strain recovery data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warpinski, N.R.

    A computer code for analyzing four-gage Anelastic Strain Recovery (ASR) data has been modified for use on a personal computer. This code fits the viscoelastic model of Warpinski and Teufel to measured ASR data, calculates the stress orientation directly, and computes stress magnitudes if sufficient input data are available. The code also calculates the stress orientation using strain-rosette equations, and it calculates stress magnitudes using Blanton's approach, assuming sufficient input data are available. The program is written in FORTRAN, compiled with Ryan-McFarland Version 2.4. Graphics use PLOT88 software by Plotworks, Inc., but the graphics software must be obtained by the user because of licensing restrictions. A version without graphics can also be run. This code is available through the National Energy Software Center (NESC), operated by Argonne National Laboratory. 5 refs., 3 figs.
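
    The strain-rosette step has a closed form for a rectangular (0°/45°/90°) rosette; a minimal sketch with hypothetical gage readings (ASR4's viscoelastic fitting is not reproduced here):

```python
import math

def rectangular_rosette(e_a, e_b, e_c):
    """Principal strains and orientation from a 0/45/90-degree strain rosette.

    e_a, e_b, e_c: strains along 0, 45, and 90 degrees (microstrain is fine).
    Returns (e_1, e_2, theta_p_deg); theta is measured from gage A toward e_1.
    """
    mean = (e_a + e_c) / 2.0
    # Mohr's-circle radius: shear term gamma_xy = 2*e_b - e_a - e_c.
    radius = math.hypot((e_a - e_c) / 2.0, e_b - mean)
    theta = 0.5 * math.degrees(math.atan2(2.0 * e_b - e_a - e_c, e_a - e_c))
    return mean + radius, mean - radius, theta

# Hypothetical gage readings in microstrain.
e1, e2, theta = rectangular_rosette(e_a=250.0, e_b=180.0, e_c=-100.0)
print(f"principal strains: {e1:.0f}, {e2:.0f} microstrain at {theta:.1f} deg")
```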

  12. Quantitative Microbial Risk Assessment Tutorial: Installation of Software for Watershed Modeling in Support of QMRA

    EPA Science Inventory

    This tutorial provides instructions for accessing, retrieving, and downloading the following software to install on a host computer in support of Quantitative Microbial Risk Assessment (QMRA) modeling:• SDMProjectBuilder (which includes the Microbial Source Module as part...

  13. Massively parallelized Monte Carlo software to calculate the light propagation in arbitrarily shaped 3D turbid media

    NASA Astrophysics Data System (ADS)

    Zoller, Christian; Hohmann, Ansgar; Ertl, Thomas; Kienle, Alwin

    2017-07-01

    The Monte Carlo method is often referred to as the gold standard for calculating light propagation in turbid media [1]. Especially for complex-shaped geometries, where no analytical solutions are available, the Monte Carlo method becomes very important [1, 2]. In this work, Monte Carlo software is presented for simulating light propagation in complex-shaped geometries. To reduce simulation time, the code is based on OpenCL, so that graphics cards as well as other computing devices can be used. Within the software, an illumination concept is presented that makes it easy to realize all kinds of light sources, such as spatial frequency domain (SFD) illumination, optical fibers, or Gaussian beam profiles. Moreover, different objects that are not connected to each other can be considered simultaneously, without any additional preprocessing. This Monte Carlo software can be used for many applications; in this work, the transmission spectrum of a tooth and the color reconstruction of a virtual object are shown, using results from the Monte Carlo software.
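
    The core random walk behind such photon-transport codes fits in a few lines. Below is a minimal single-threaded sketch of an isotropically scattering, absorbing slab with illustrative coefficients; the GPU parallelization, geometry handling, and source models described above are omitted.

```python
import numpy as np

rng = np.random.default_rng(1)
mu_a, mu_s = 0.1, 2.0           # absorption / scattering coefficients, 1/mm
mu_t = mu_a + mu_s
thickness = 2.0                 # slab thickness, mm

def transmittance(n_photons=20_000):
    hits = 0
    for _ in range(n_photons):
        z, cos_z = 0.0, 1.0                      # start at surface, heading inward
        while True:
            z += cos_z * (-np.log(rng.random()) / mu_t)   # sample a free path
            if z >= thickness:                   # crossed the far boundary
                hits += 1
                break
            if z < 0.0 or rng.random() < mu_a / mu_t:     # back-escape or absorption
                break
            cos_z = 2.0 * rng.random() - 1.0     # isotropic scatter: new z-cosine
    return hits / n_photons

print(f"diffuse transmittance ~ {transmittance():.3f}")
```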

  14. Developing smartphone apps for behavioural studies: The AlcoRisk app case study.

    PubMed

    Smith, Anthony; de Salas, Kristy; Lewis, Ian; Schüz, Benjamin

    2017-08-01

    Smartphone apps have emerged as valuable research tools to sample human behaviours at their time of occurrence within natural environments. Human behaviour sampling methods, such as Ecological Momentary Assessment (EMA), aim to facilitate research that is situated in ecologically valid real-world environments rather than laboratory environments. Researchers have trialled a range of EMA smartphone apps to sample human behaviours such as dieting, physical activity and smoking. Software development processes for EMA smartphone apps, however, are not widely documented, and little guidance is provided for integrating the complex multidisciplinary behavioural and technical fields involved. In this paper, the AlcoRisk app for studying alcohol consumption and risk-taking tendencies is presented alongside a software development process that integrates these multidisciplinary fields. The software development process consists of three stages: requirements analysis, feature and interface design, and app implementation. Results from a preliminary feasibility study support the efficacy of the AlcoRisk app's software development process. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. RiskChanges Spatial Decision Support system for the analysis of changing multi-hazard risk

    NASA Astrophysics Data System (ADS)

    van Westen, Cees; Zhang, Kaixi; Bakker, Wim; Andrejchenko, Vera; Berlin, Julian; Olyazadeh, Roya; Cristal, Irina

    2015-04-01

    Within the framework of the EU FP7 Marie Curie project CHANGES and the EU FP7 Copernicus project INCREO, a spatial decision support system (SDSS) was developed with the aim of analysing the effect of risk reduction planning alternatives on reducing risk now and in the future, and of supporting decision makers in selecting the best alternatives. Central to the SDSS are the stakeholders. The envisaged users of the system are organizations involved in planning risk reduction measures that have staff capable of visualizing and analyzing spatial data at a municipal scale. The SDSS should be able to function in different countries with different legal frameworks and with organizations with different mandates: civil protection organizations with the mandate to design disaster response plans, expert organizations with the mandate to design structural risk reduction measures (e.g. dams, dikes, check-dams), and planning organizations with the mandate to make land development plans. The SDSS can be used in different ways: analyzing the current level of risk, analyzing the best alternatives for risk reduction, evaluating the consequences of possible future scenarios on risk levels, and evaluating how different risk reduction alternatives will lead to risk reduction under different future scenarios. The SDSS is developed using open source software and follows open standards, for code as well as for data formats and service interfaces; the architecture is modular, with the various parts loosely coupled, extensible, standards-based for interoperability, flexible, and web-based. The SDSS is composed of a number of integrated components. The risk assessment component carries out spatial risk analysis with different degrees of complexity, ranging from simple exposure (overlay of hazard and asset maps) to quantitative analysis (using different hazard types, temporal scenarios, and vulnerability curves) resulting in risk curves; a minimal sketch of this reduction to a risk curve is given below. The platform does not include a component to calculate hazard maps, so existing hazard maps are used as input data for the risk component. The second component is a risk reduction planning component, which forms the core of the platform; it includes the definition of risk reduction alternatives (related to disaster response planning, risk reduction measures, and spatial planning), links back to the risk assessment module to calculate the new level of risk if a measure is implemented, and provides a cost-benefit (or cost-effectiveness / spatial multi-criteria evaluation) component to compare the alternatives and decide on the optimal one. The third component is a temporal scenario component, which allows future scenarios to be defined in terms of climate change, land use change, and population change, and the time periods for which these scenarios will be made; this component does not generate the scenarios but uses input maps for their effect on the hazard and asset maps. The last component is a communication and visualization component, which can compare scenarios and alternatives not only in the form of maps but also in other forms (risk curves, tables, graphs).
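
    The quantitative end of such a risk assessment component reduces scenario losses and exceedance probabilities to a risk curve and its summary statistic, the average annual loss; a minimal sketch with illustrative numbers:

```python
import numpy as np

# (return period in years, estimated loss in millions) per hazard scenario;
# values are illustrative, not output of the actual SDSS.
scenarios = [(10, 0.5), (50, 4.0), (100, 9.0), (500, 25.0)]
prob = np.array([1.0 / rp for rp, _ in scenarios])   # annual exceedance probability
loss = np.array([l for _, l in scenarios])

order = np.argsort(prob)                             # sort by increasing probability
p, q = prob[order], loss[order]
# Average annual loss = area under the risk curve (loss vs exceedance probability),
# here by trapezoidal integration.
aal = float(np.sum((q[1:] + q[:-1]) / 2.0 * np.diff(p)))
print(f"average annual loss ~ {aal:.3f} million")
```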

  16. Assessment of nursing care using indicators generated by software.

    PubMed

    Lima, Ana Paula Souza; Chianca, Tânia Couto Machado; Tannure, Meire Chucre

    2015-01-01

    To analyze the efficacy of the Nursing Process in an Intensive Care Unit using indicators generated by software. A cross-sectional study using data collected for four months. RNs and students daily registered patients, took history (at admission), performed physical assessments, established nursing diagnoses and nursing plans/prescriptions, and assessed care delivered to 17 patients using software. Indicators concerning the incidence and prevalence of nursing diagnoses, rate of effectiveness, risk diagnoses, and rate of effective prevention of complications were computed. The Risk for imbalanced body temperature was the most frequent diagnosis (23.53%), while the least frequent was Risk for constipation (0%). The Risk for impaired skin integrity was prevalent in 100% of the patients, while Risk for acute confusion was the least prevalent (11.76%). Risk for constipation and Risk for impaired skin integrity obtained a rate of risk diagnostic effectiveness of 100%. The rate of effective prevention of acute confusion and falls was 100%. The efficacy of the Nursing Process using indicators was analyzed, because these indicators reveal how nurses have identified patients' risks and conditions, and planned care in a systematized manner.

  17. Design and development of a nutritional assessment application for smartphones and tablets with Android OS.

    PubMed

    Carnero Gregorio, Miguel; Blanco Ramos, Montserrat; Obeso Carillo, Gerardo Andrés; García Fontán, Eva; Álvarez González, Miguel Ángel; Cañizares Carretero, Miguel Ángel

    2014-10-03

    To design and develop a nutritional application for smartphones and tablets with the Android operating system, for use with in- and outpatients who need a nutritional assessment, and to check the validity of the results of such software. The application was compiled for version 2.1 of the Android operating system from Google. A cohort of 30 patients was included to evaluate the reliability of the application. The calculations were performed by staff of the Nutrition Unit of the Complexo Hospitalario Universitario de Vigo, manually and through the e-Nutrimet software on a smartphone and a tablet. Concordance was absolute between the results of the different methods obtained using e-Nutrimet on a smartphone and a tablet (Fleiss index κ = 1). The same level of concordance was obtained by comparing handmade and e-Nutrimet results. The degree of correlation is good, and use of the tool could be extended to all healthcare staff who want to determine whether or not a patient has malnutrition. The nutritional assessment software e-Nutrimet does not replace healthcare staff in any case, but could be an important aid in assessing patients who may be at risk of malnutrition, saving evaluation time. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.

  18. Development and application of a complex numerical model and software for the computation of dose conversion factors for radon progenies.

    PubMed

    Farkas, Árpád; Balásházy, Imre

    2015-04-01

    A more exact determination of the dose conversion factors associated with radon progeny inhalation has become possible due to advancements in epidemiological health risk estimates in recent years. The enhancement of computational power and the development of numerical techniques allow computing dose conversion factors with increasing reliability. The objective of this study was to develop an integrated model and software, based on a self-developed airway deposition code, the authors' own bronchial dosimetry model, and the computational methods accepted by the International Commission on Radiological Protection (ICRP), to calculate dose conversion coefficients for different exposure conditions. The model was tested by applying it to exposure and breathing conditions characteristic of mines and homes. The dose conversion factors were 8 and 16 mSv WLM(-1) for homes and mines when applying a stochastic deposition model combined with the ICRP dosimetry model (named the PM-A model), and 9 and 17 mSv WLM(-1) when applying the same deposition model combined with the authors' bronchial dosimetry model and the ICRP bronchiolar and alveolar-interstitial dosimetry model (called the PM-B model). User-friendly software for the computation of dose conversion factors has also been developed. The software allows one to compute conversion factors for a large range of exposure and breathing parameters and to perform sensitivity analyses. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  19. Cumulative effective dose and cancer risk for pediatric population in repetitive full spine follow-up imaging: How micro dose is the EOS microdose protocol?

    PubMed

    Law, Martin; Ma, Wang-Kei; Lau, Damian; Cheung, Kenneth; Ip, Janice; Yip, Lawrance; Lam, Wendy

    2018-04-01

    To evaluate and to obtain an analytic formulation for the calculation of the effective dose and associated cancer risk using the EOS microdose protocol for scoliotic pediatric patients undergoing full spine imaging at different ages of exposure; to demonstrate that the microdose protocol delivers a lower radiation dose, and hence further reduces induced cancer risk, compared with the EOS low dose protocol; and to obtain the cumulative effective dose and cancer risk for scoliotic pediatric patients of both genders in the US and Hong Kong populations using the microdose protocol. Organ absorbed doses of full spine exposed scoliotic pediatric patients have been simulated with the EOS microdose protocol imaging parameters input to the Monte Carlo software PCXMC. Gender- and age-specific effective dose has been calculated from the simulated organ absorbed doses using the ICRP-103 approach. The associated radiation-induced cancer risk, expressed as lifetime attributable risk (LAR), has been estimated according to the method introduced in the Biological Effects of Ionizing Radiation VII report. Values of LAR have been estimated for scoliotic patients exposed repetitively during their follow-up period at different ages for the US and Hong Kong populations. The effective doses of full spine imaging with simultaneous posteroanterior and lateral projection for patients exposed at ages between 5 and 18 years using the EOS microdose protocol have been calculated within the range of 2.54-14.75 μSv. The corresponding LAR for the US and Hong Kong populations ranged between 0.04 × 10^-6 and 0.84 × 10^-6. Cumulative effective dose and cancer risk during the follow-up period can be estimated using these results and provide information to patients and their parents. With the use of computer simulation and analytic formulation, we obtained the cumulative effective dose and cancer risk at any age of exposure for pediatric patients of the US and Hong Kong populations undergoing repetitive microdose protocol full spine imaging. Girls would be at a statistically significantly higher cumulative cancer risk than boys undergoing the same microdose full spine imaging protocol and the same follow-up schedule. Copyright © 2018 Elsevier B.V. All rights reserved.
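
    Cumulative dose and risk over a follow-up schedule are sums of per-examination values. The sketch below uses per-scan values chosen inside the ranges quoted above, with a purely illustrative schedule; the paper's age-dependent analytic formulation is not reproduced.

```python
# Cumulative effective dose and lifetime attributable risk (LAR) over a
# hypothetical follow-up schedule of EOS microdose full-spine examinations.
dose_per_scan_usv = 8.0      # chosen inside the quoted 2.54-14.75 uSv range
lar_per_scan = 0.4e-6        # chosen inside the quoted 0.04e-6 - 0.84e-6 range
scans_per_year = 2           # illustrative follow-up schedule
years_of_follow_up = 8

total_dose = dose_per_scan_usv * scans_per_year * years_of_follow_up
total_lar = lar_per_scan * scans_per_year * years_of_follow_up
print(f"cumulative effective dose ~ {total_dose:.0f} uSv "
      f"({total_dose / 1000:.3f} mSv), cumulative LAR ~ {total_lar:.1e}")
```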

  20. A Statistical Testing Approach for Quantifying Software Reliability; Application to an Example System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chu, Tsong-Lun; Varuttamaseni, Athi; Baek, Joo-Seok

    The U.S. Nuclear Regulatory Commission (NRC) encourages the use of probabilistic risk assessment (PRA) technology in all regulatory matters, to the extent supported by the state-of-the-art in PRA methods and data. Although much has been accomplished in the area of risk-informed regulation, risk assessment for digital systems has not been fully developed. The NRC established a plan for research on digital systems to identify and develop methods, analytical tools, and regulatory guidance for (1) including models of digital systems in the PRAs of nuclear power plants (NPPs), and (2) incorporating digital systems in the NRC's risk-informed licensing and oversight activities. Under NRC's sponsorship, Brookhaven National Laboratory (BNL) explored approaches for addressing the failures of digital instrumentation and control (I&C) systems in the current NPP PRA framework. Specific areas investigated included PRA modeling of digital hardware, development of a philosophical basis for defining software failure, and identification of desirable attributes of quantitative software reliability methods. Based on the earlier research, statistical testing is considered a promising method for quantifying software reliability. This paper describes a statistical software testing approach for quantifying software reliability and applies it to the loop-operating control system (LOCS) of an experimental loop of the Advanced Test Reactor (ATR) at Idaho National Laboratory (INL).
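
    Demand-based statistical testing quantifies reliability from failure-free runs; a standard sketch of the underlying binomial bound (not BNL's full method, which also addresses test-case selection from an operational profile):

```python
import math

def tests_needed(p_target, confidence=0.95):
    """Consecutive failure-free demands needed to claim, with the given
    confidence, that the per-demand failure probability is below p_target:
    (1 - p_target)**n <= 1 - confidence."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p_target))

def upper_bound(n_failure_free, confidence=0.95):
    """Upper confidence bound on the per-demand failure probability after n
    failure-free demands: p_U = 1 - (1 - confidence)**(1/n)."""
    return 1.0 - (1.0 - confidence) ** (1.0 / n_failure_free)

print(tests_needed(1e-4))            # ~29956 demands for p < 1e-4 at 95% confidence
print(f"{upper_bound(3000):.2e}")    # bound achievable with 3000 clean runs
```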

  1. Requirement Metrics for Risk Identification

    NASA Technical Reports Server (NTRS)

    Hammer, Theodore; Huffman, Lenore; Wilson, William; Rosenberg, Linda; Hyatt, Lawrence

    1996-01-01

    The Software Assurance Technology Center (SATC) is part of the Office of Mission Assurance of the Goddard Space Flight Center (GSFC). The SATC's mission is to assist National Aeronautics and Space Administration (NASA) projects to improve the quality of software which they acquire or develop. The SATC's efforts are currently focused on the development and use of metric methodologies and tools that identify and assess risks associated with software performance and scheduled delivery. This starts at the requirements phase, where the SATC, in conjunction with software projects at GSFC and other NASA centers, is working to identify tools and metric methodologies to assist project managers in identifying and mitigating risks. This paper discusses requirement metrics currently being used at NASA in a collaborative effort between the SATC and the Quality Assurance Office at GSFC to utilize the information available through the application of requirements management tools.

  2. GlassForm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2011-09-16

    GlassForm is a software tool for generating preliminary waste glass formulas for a given waste stream. The software is useful because it reduces the number of verification melts required to develop a suitable additive composition. The software includes property models that calculate glass properties of interest from the chemical composition of the waste glass. The software includes property models for glass viscosity, electrical conductivity, glass transition temperature, and leach resistance as measured by the 7-day product consistency test (PCT).
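
    Glass property models of this kind are often first-order mixture models, with the property (or its logarithm) linear in component mass fractions; a minimal sketch with invented coefficients, not GlassForm's fitted values:

```python
import math

# First-order mixture model: ln(viscosity at melt temperature) modeled as a
# linear blend of component contributions. Coefficients are invented for
# illustration only.
coeffs_ln_visc = {"SiO2": 7.0, "B2O3": 2.0, "Na2O": -9.0, "Al2O3": 5.5, "waste": -1.0}

def predicted_viscosity_pa_s(mass_fractions):
    ln_eta = sum(coeffs_ln_visc[ox] * x for ox, x in mass_fractions.items())
    return math.exp(ln_eta)

glass = {"SiO2": 0.50, "B2O3": 0.10, "Na2O": 0.12, "Al2O3": 0.08, "waste": 0.20}
assert abs(sum(glass.values()) - 1.0) < 1e-9   # fractions must sum to one
print(f"predicted melt viscosity ~ {predicted_viscosity_pa_s(glass):.1f} Pa*s")
```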

  3. Quantitative Microbial Risk Assessment Tutorial Installation of Software for Watershed Modeling in Support of QMRA - Updated 2017

    EPA Science Inventory

    This tutorial provides instructions for accessing, retrieving, and downloading the following software to install on a host computer in support of Quantitative Microbial Risk Assessment (QMRA) modeling: • QMRA Installation • SDMProjectBuilder (which includes the Microbial ...

  4. SU-E-J-264: Comparison of Two Commercially Available Software Platforms for Deformable Image Registration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tuohy, R; Stathakis, S; Mavroidis, P

    2014-06-01

    Purpose: To evaluate and compare the deformable image registration algorithms available in Velocity (Velocity Medical Solutions, Atlanta, GA) and RayStation (RaySearch Americas, Inc., Garden City, NY). Methods: Cone beam CTs (CBCTs) for each fraction were collected for ten consecutive patients. The CBCTs, along with the simulation CT, were exported to the Velocity and RayStation software. Each CBCT was registered to the simulation CT using deformable image registration, and the resulting deformation vector matrix was generated. Each registration was visually inspected by a physicist and the prescribing physician. The volumes of the critical organs were calculated for each deformed CT and used for comparison. Results: The resulting deformable registrations revealed differences between the two algorithms. These differences were realized when the organs at risk were contoured on each deformed CBCT. Differences on the order of 10 ± 30% in volume were observed for the bladder, 17 ± 21% for the rectum, and 16 ± 10% for the sigmoid. The prostate and PTV volume differences were on the order of 3 ± 5%. The volumetric differences observed had a corresponding impact on the DVHs of all organs at risk; differences of 8-10% in the mean dose were observed for all organs above. Conclusion: Deformable registration is a powerful tool that aids in the definition of critical structures and is often used for the evaluation of daily dose delivered to the patient. It should be noted that extended QA should be performed before clinical implementation of the software, and users should be aware of the advantages and limitations of the methods.

  5. Software Reviews.

    ERIC Educational Resources Information Center

    Science and Children, 1989

    1989-01-01

    Reviews of seven software packages are presented including "The Environment I: Habitats and EcoSystems; II Cycles and Interactions"; "Super Sign Maker"; "The Great Knowledge Race: Substance Abuse"; "Exploring Science: Temperature"; "Fast Food Calculator and RD Aide"; "The Human Body:…

  6. Technology Tips: A Potpourri.

    ERIC Educational Resources Information Center

    Cuoco, Albert A.; And Others, Eds.

    1994-01-01

    Contains tips from readers about using technology in the classroom, including notebook computers, classroom sets of calculators, geometry software, LOGO software, publisher discounts, curriculum materials in CD-ROM, and volunteer help in computers and computer networking for schools. (MKR)

  7. Web servers and services for electrostatics calculations with APBS and PDB2PQR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Unni, Samir; Huang, Yong; Hanson, Robert M.

    APBS and PDB2PQR are widely utilized free software packages for biomolecular electrostatics calculations. Using the Opal toolkit, we have developed a web services framework for these software packages that enables the use of APBS and PDB2PQR by users who do not have local access to the necessary amount of computational capability. This not only increases accessibility of the software to a wider range of scientists, educators, and students but also increases the availability of electrostatics calculations on portable computing platforms. Users can access this new functionality in two ways. First, an Opal-enabled version of APBS is provided in current distributions, available freely on the web. Second, we have extended the PDB2PQR web server to provide an interface for the setup, execution, and visualization of electrostatic potentials as calculated by APBS. This web interface also uses the Opal framework, which ensures the scalability needed to support the large APBS user community. Both of these resources are available from the APBS/PDB2PQR website: http://www.poissonboltzmann.org/.

  8. Web servers and services for electrostatics calculations with APBS and PDB2PQR

    PubMed Central

    Unni, Samir; Huang, Yong; Hanson, Robert; Tobias, Malcolm; Krishnan, Sriram; Li, Wilfred W.; Nielsen, Jens E.; Baker, Nathan A.

    2011-01-01

    APBS and PDB2PQR are widely utilized free software packages for biomolecular electrostatics calculations. Using the Opal toolkit, we have developed a Web services framework for these software packages that enables the use of APBS and PDB2PQR by users who do not have local access to the necessary amount of computational capability. This not only increases accessibility of the software to a wider range of scientists, educators, and students but also increases the availability of electrostatics calculations on portable computing platforms. Users can access this new functionality in two ways. First, an Opal-enabled version of APBS is provided in current distributions, available freely on the web. Second, we have extended the PDB2PQR web server to provide an interface for the setup, execution, and visualization of electrostatic potentials as calculated by APBS. This web interface also uses the Opal framework, which ensures the scalability needed to support the large APBS user community. Both of these resources are available from the APBS/PDB2PQR website: http://www.poissonboltzmann.org/. PMID:21425296

  9. Rapid Analysis of Mass Distribution of Radiation Shielding

    NASA Technical Reports Server (NTRS)

    Zapp, Edward

    2007-01-01

    Radiation Shielding Evaluation Toolset (RADSET) is a computer program that rapidly calculates the spatial distribution of mass of an arbitrary structure for use in ray-tracing analysis of the radiation-shielding properties of the structure. RADSET was written to be used in conjunction with unmodified commercial computer-aided design (CAD) software that provides access to data on the structure and generates selected three-dimensional-appearing views of the structure. RADSET obtains raw geometric, material, and mass data on the structure from the CAD software. From these data, RADSET calculates the distribution(s) of the masses of specific materials about any user-specified point(s). The results of these mass-distribution calculations are imported back into the CAD computing environment, wherein the radiation-shielding calculations are performed.
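
    As a rough illustration of the kind of calculation RADSET performs, the sketch below accumulates areal density (g/cm^2) along a single ray through a voxelized density grid; the grid, densities, step size, and function name are invented stand-ins, not RADSET code.

    ```python
    import numpy as np

    def areal_density(density, voxel_cm, origin, direction, step_cm=0.1):
        """Accumulate mass per unit area (g/cm^2) along one ray through a voxel grid.

        density  -- 3D array of material densities (g/cm^3)
        voxel_cm -- edge length of a cubic voxel (cm)
        origin   -- ray start point (cm), e.g. a user-specified dose point
        """
        pos = np.asarray(origin, dtype=float)
        d = np.asarray(direction, dtype=float)
        d = d / np.linalg.norm(d)
        total = 0.0
        while True:
            idx = tuple((pos / voxel_cm).astype(int))
            if any(i < 0 or i >= n for i, n in zip(idx, density.shape)):
                return total                  # ray has left the structure
            total += density[idx] * step_cm   # g/cm^3 * cm = g/cm^2
            pos += d * step_cm

    rho = np.full((100, 100, 100), 2.7)       # toy aluminium block, 1 cm voxels
    print(areal_density(rho, 1.0, origin=(50.5, 50.5, 50.5), direction=(1, 0, 0)))
    ```

    In practice, this accumulation would be repeated over many sampled directions about the user-specified point to build the mass distribution used by the shielding analysis.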

  9. [The planning of resource support of secondary medical care in hospital].

    PubMed

    Kungurov, N V; Zil'berberg, N V

    2010-01-01

    The Ural Institute of dermatovenerology and immunopathology developed and implemented software for the personalized total recording of medical services and pharmaceuticals. The Institute also developed related software components: a catalog of medical services; a module for calculating the financial costs of implementing full standards of secondary medical care for chronic dermatopathy; a reference book of standards of direct specific costs for laboratory and physiotherapy services; and reference books of pharmaceuticals, testing systems, and consumables. The unified information system of management recording is an effective technique for substantiating the costs of implementing standards of medical care, including high-tech care, taking into account the results of total calculation of provided medical services.

  10. WannierTools: An open-source software package for novel topological materials

    NASA Astrophysics Data System (ADS)

    Wu, QuanSheng; Zhang, ShengNan; Song, Hai-Feng; Troyer, Matthias; Soluyanov, Alexey A.

    2018-03-01

    We present an open-source software package, WannierTools, a tool for the investigation of novel topological materials. The code works in the tight-binding framework, which can be generated by another software package, Wannier90 (Mostofi et al., 2008). It can help to classify the topological phase of a given material by calculating the Wilson loop, and can obtain the surface-state spectrum that is detected in angle-resolved photoemission spectroscopy (ARPES) and scanning tunneling microscopy (STM) experiments. It also identifies the positions of Weyl/Dirac points and nodal-line structures, and calculates the Berry phase around a closed momentum loop and the Berry curvature in a part of the Brillouin zone (BZ).
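
    The Berry phase around a closed momentum loop is commonly computed as a gauge-invariant product of eigenvector overlaps along a discretized loop. The sketch below applies that standard discretization to an invented two-band tight-binding model; it illustrates the technique only and is not WannierTools code.

    ```python
    import numpy as np

    SX = np.array([[0, 1], [1, 0]], dtype=complex)
    SY = np.array([[0, -1j], [1j, 0]])
    SZ = np.array([[1, 0], [0, -1]], dtype=complex)

    def h(kx, ky):
        """Hypothetical two-band tight-binding Hamiltonian (Qi-Wu-Zhang-like)."""
        return (np.sin(kx) * SX + np.sin(ky) * SY
                + (1.0 - np.cos(kx) - np.cos(ky)) * SZ)

    def berry_phase(loop):
        """Discretized Berry phase of the lower band around a closed k-loop:
        -Im ln of the product of overlaps between neighbouring eigenvectors."""
        states = []
        for kx, ky in loop:
            _, vecs = np.linalg.eigh(h(kx, ky))   # eigenvalues in ascending order
            states.append(vecs[:, 0])             # lower band
        prod = 1.0 + 0j
        for i in range(len(states)):
            prod *= np.vdot(states[i], states[(i + 1) % len(states)])
        return -np.angle(prod)

    t = np.linspace(0, 2 * np.pi, 400, endpoint=False)
    loop = np.column_stack([np.pi / 2 + 0.4 * np.cos(t), 0.4 * np.sin(t)])
    print(berry_phase(loop))
    ```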

  11. Turbo FRMAC 2016 v. 7.3.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Madrid, Gregory J.; Whitener, Dustin Heath; Folz, Wesley

    2017-05-27

    The Turbo FRMAC (TF) software program is the software implementation of the science and methodologies utilized in the Federal Radiological Monitoring and Assessment Center (FRMAC). The software automates the calculations described in Volume 1 of "The Federal Manual for Assessing Environmental Data during a Radiological Emergency" (2015 version). In the event of an intentional or accidental release of radioactive material, the software is used to guide and govern the response of the Federal, State, Local, and Tribal governments. The manual, upon which the software is based, is unclassified and freely available on the Internet.

  12. Turbo FRMAC 2018

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fulton, John; Gallagher, Linda; Gonzales, Alejandro

    The Turbo FRMAC (TF) software program is the software implementation of the science and methodologies utilized in the Federal Radiological Monitoring and Assessment Center (FRMAC). The software automates the calculations described in Volume 1 of "The Federal Manual for Assessing Environmental Data during a Radiological Emergency" (2015 version). In the event of an intentional or accidental release of radioactive material, the software is used to guide and govern the response of the Federal, State, Local, and Tribal governments. The manual, upon which the software is based, is unclassified and freely available on the Internet.

  13. Turbo FRMAC 2016 Version 7.1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fulton, John; Gallagher, Linda K.; Madrid, Gregory J.

    2016-08-01

    The Turbo FRMAC (TF) software program is the software implementation of the science and methodologies utilized in the Federal Radiological Monitoring and Assessment Center (FRMAC). The software automates the calculations described in Volume 1 of "The Federal Manual for Assessing Environmental Data during a Radiological Emergency" (2015 version). In the event of an intentional or accidental release of radioactive material, the software is used to guide and govern the response of the Federal, State, Local, and Tribal governments. The manual, upon which the software is based, is unclassified and freely available on the Internet.

  14. Turbo FRMAC 2016 v. 7.2.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Madrid, Gregory J.; Whitener, Dustin Heath; Folz, Wesley

    2017-02-27

    The Turbo FRMAC (TF) software program is the software implementation of the science and methodologies utilized in the Federal Radiological Monitoring and Assessment Center (FRMAC). The software automates the calculations described in Volume 1 of "The Federal Manual for Assessing Environmental Data during a Radiological Emergency" (2015 version). In the event of an intentional or accidental release of radioactive material, the software is used to guide and govern the response of the Federal, State, Local, and Tribal governments. The manual, upon which the software is based, is unclassified and freely available on the Internet.

  15. Software reliability through fault-avoidance and fault-tolerance

    NASA Technical Reports Server (NTRS)

    Vouk, Mladen A.; Mcallister, David F.

    1992-01-01

    Accomplishments in the following research areas are summarized: structure based testing, reliability growth, and design testability with risk evaluation; reliability growth models and software risk management; and evaluation of consensus voting, consensus recovery block, and acceptance voting. Four papers generated during the reporting period are included as appendices.

  16. Multi-institutional Validation Study of Commercially Available Deformable Image Registration Software for Thoracic Images.

    PubMed

    Kadoya, Noriyuki; Nakajima, Yujiro; Saito, Masahide; Miyabe, Yuki; Kurooka, Masahiko; Kito, Satoshi; Fujita, Yukio; Sasaki, Motoharu; Arai, Kazuhiro; Tani, Kensuke; Yagi, Masashi; Wakita, Akihisa; Tohyama, Naoki; Jingu, Keiichi

    2016-10-01

    To assess the accuracy of the commercially available deformable image registration (DIR) software for thoracic images at multiple institutions. Thoracic 4-dimensional (4D) CT images of 10 patients with esophageal or lung cancer were used. Datasets for these patients were provided by DIR-lab (dir-lab.com) and included a coordinate list of anatomic landmarks (300 bronchial bifurcations) that had been manually identified. Deformable image registration was performed between the peak-inhale and -exhale images. Deformable image registration error was determined by calculating the difference at each landmark point between the displacement calculated by DIR software and that calculated by the landmark. Eleven institutions participated in this study: 4 used RayStation (RaySearch Laboratories, Stockholm, Sweden), 5 used MIM Software (Cleveland, OH), and 3 used Velocity (Varian Medical Systems, Palo Alto, CA). The ranges of the average absolute registration errors over all cases were as follows: 0.48 to 1.51 mm (right-left), 0.53 to 2.86 mm (anterior-posterior), 0.85 to 4.46 mm (superior-inferior), and 1.26 to 6.20 mm (3-dimensional). For each DIR software package, the average 3-dimensional registration error (range) was as follows: RayStation, 3.28 mm (1.26-3.91 mm); MIM Software, 3.29 mm (2.17-3.61 mm); and Velocity, 5.01 mm (4.02-6.20 mm). These results demonstrate that there was moderate variation among institutions, although the DIR software was the same. We evaluated the commercially available DIR software using thoracic 4D-CT images from multiple centers. Our results demonstrated that DIR accuracy differed among institutions because it was dependent on both the DIR software and procedure. Our results could be helpful for establishing prospective clinical trials and for the widespread use of DIR software. In addition, for clinical care, we should try to find the optimal DIR procedure using thoracic 4D-CT data.
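
    The registration-error metric described above is a pointwise difference between the DIR-computed displacement and the landmark-derived displacement. A minimal sketch, with invented landmark values:

    ```python
    import numpy as np

    # Hypothetical arrays: one row per landmark (N x 3), displacements in mm.
    dvf_displacement = np.array([[1.0, -0.5, 3.2], [0.2, 0.1, 4.8]])       # from the DIR software
    landmark_displacement = np.array([[1.4, -0.9, 4.0], [0.5, 0.3, 5.5]])  # manual ground truth

    error = dvf_displacement - landmark_displacement
    abs_axis_error = np.abs(error).mean(axis=0)   # mean |error| per axis (RL, AP, SI)
    error_3d = np.linalg.norm(error, axis=1)      # per-landmark 3D error
    print(abs_axis_error, error_3d.mean())
    ```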

  17. An overview to CERSSO's self evaluation of the cost-benefit on the investment in occupational safety and health in the textile factories: "a step by step methodology".

    PubMed

    Amador-Rodezno, Rafael

    2005-01-01

    The Pan American Health Organization (PAHO) and CERSSO collaborated to develop a new Tool Kit (TK), which became available in May 2002. PAHO already had a TK in place, and CERSSO requested that one be developed for their needs. CERSSO wanted to enable managers and line workers in garment factories to self-diagnose plant and workstation hazards and to estimate the costs and benefits of investing in occupational safety and health (OSH) as a way to improve productivity and competitiveness. For consistency, the collaborating organizations agreed to construct the TK according to PAHO's methodology. The instrument was developed to be comprehensive enough that any user can collect the data easily. It integrates epidemiologic, risk assessment, clinical, engineering, and accountability issues, organized to include step-by-step training in: (a) performing risk assessments in the workplaces (risk factors); (b) making cause-effect relationships; (c) improving decision making on OSH interventions; (d) calculating direct and indirect costs and savings; and (e) calculating the overall cost-benefit of OSH interventions. Since July 2002, about 2,400 employees and officials from 736 garment factories, Ministries of Labor, Health, Social Security Institutes, and Technical Training Institutions of Central America and the Dominican Republic have used this instrument. Systematically, they have calculated a positive return on the investment (3 to 33 times). Employers are now aware of the financial rewards of investing in OSH. The TK is available in Spanish, Korean, and English. In July 2003, a software version in Spanish and English was developed, which requires less time to execute and offers better reliability; 180 persons in the region have been trained to use it.

  18. Radiation dose and cancer risk estimates in helical CT for pulmonary tuberculosis infections

    NASA Astrophysics Data System (ADS)

    Adeleye, Bamise; Chetty, Naven

    2017-12-01

    The preference for computed tomography (CT) for the clinical assessment of pulmonary tuberculosis (PTB) infections has increased concern about the potential risk of cancer in exposed patients. In this study, we investigated the correlation between cancer risk and radiation doses from different CT scanners, assuming an equivalent scan protocol. Radiation doses from three 16-slice units were estimated using the CT-Expo dosimetry software version 2.4 and a standard CT scan protocol for patients with suspected PTB infections. The lifetime risk of cancer for each scanner was determined using the methodology outlined in the BEIR VII report. Organ doses were significantly different (P < 0.05) between the scanners. The calculated effective dose for scanner H2 is 34% and 37% higher than for scanners H3 and H1, respectively. A high and statistically significant correlation was observed between dose and estimated lifetime cancer risk for both male (r2 = 0.943, P < 0.05) and female patients (r2 = 0.989, P < 0.05). The risk variation between the scanners was slightly higher than 2% for all ages but was much smaller at specific ages for male and female patients (0.2% and 0.7%, respectively). These variations indicate that scanner-specific protocol optimization is imperative.
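
    BEIR VII lifetime-risk estimation is age- and sex-specific and considerably more involved than can be shown here; the toy sketch below only illustrates the simpler ICRP-style bookkeeping of weighting organ doses into an effective dose and scaling by a nominal risk coefficient. The organ-dose values are invented.

    ```python
    # Schematic only: BEIR VII lifetime attributable risk uses age- and sex-specific
    # models; this shows the generic weighted-sum bookkeeping such tools automate.
    organ_dose_mgy = {"lung": 5.1, "breast": 4.3, "stomach": 2.0}    # hypothetical values
    tissue_weight = {"lung": 0.12, "breast": 0.12, "stomach": 0.12}  # ICRP 103 weights

    effective_dose_msv = sum(organ_dose_mgy[t] * tissue_weight[t] for t in organ_dose_mgy)
    # ICRP nominal risk coefficient ~5.5e-2 per Sv for a whole population (illustrative).
    lifetime_risk = effective_dose_msv / 1000.0 * 5.5e-2
    print(f"{effective_dose_msv:.2f} mSv, risk ~ {lifetime_risk:.2e}")
    ```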

  1. Software Systems for Prediction and Immediate Assessment of Emergency Situations on Municipalities Territories

    NASA Astrophysics Data System (ADS)

    Poluyan, L. V.; Syutkina, E. V.; Guryev, E. S.

    2017-11-01

    A comparative analysis of the key features of the TOXI+Risk and ALOHA software systems is presented. The authors compared the domestic (TOXI+Risk) and foreign (ALOHA) software systems, which allow quantitative assessment of impact areas (pressure, thermal, toxic) for hypothetical emergencies at potentially hazardous facilities of the oil, gas, chemical, petrochemical, and oil-processing industries. The two systems use different mathematical models to estimate the release rate of a chemically hazardous substance from a storage tank and its evaporation. A comparison of the impact areas computed by both software systems on verification examples shows good agreement between the two products. The analysis results showed that the ALOHA software can be actively used for forecasting and immediate assessment of emergency situations and for assessing damage resulting from emergencies on the territories of municipalities.
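
    Impact-area estimates of the kind these tools produce for toxic releases derive from atmospheric dispersion models (ALOHA also includes heavy-gas models). The sketch below uses only the textbook Gaussian plume for a ground-level release, with invented dispersion coefficients and threshold, to show how a downwind impact distance can be extracted:

    ```python
    import numpy as np

    def ground_conc(q, u, y, sigma_y, sigma_z):
        """Textbook Gaussian plume for a ground-level release and receptor (g/m^3).
        q: release rate (g/s), u: wind speed (m/s), y: crosswind offset (m)."""
        return q / (np.pi * sigma_y * sigma_z * u) * np.exp(-y**2 / (2 * sigma_y**2))

    x = np.linspace(100, 5000, 50)     # downwind distance (m)
    sigma_y = 0.08 * x**0.9            # invented power-law dispersion coefficients
    sigma_z = 0.06 * x**0.8            # for a single stability class
    c = ground_conc(q=500.0, u=3.0, y=0.0, sigma_y=sigma_y, sigma_z=sigma_z)

    threshold = 1e-3                   # hypothetical level of concern (g/m^3)
    mask = c > threshold
    print("toxic impact zone reaches ~%.0f m downwind" % x[mask].max()
          if mask.any() else "concentration below threshold everywhere")
    ```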

  2. Software for Optimizing Quality Assurance of Other Software

    NASA Technical Reports Server (NTRS)

    Feather, Martin; Cornford, Steven; Menzies, Tim

    2004-01-01

    Software assurance is the planned and systematic set of activities that ensures that software processes and products conform to requirements, standards, and procedures. Examples of such activities are the following: code inspections, unit tests, design reviews, performance analyses, construction of traceability matrices, etc. In practice, software development projects have only limited resources (e.g., schedule, budget, and availability of personnel) to cover the entire development effort, of which assurance is but a part. Projects must therefore select judiciously from among the possible assurance activities. At its heart, this can be viewed as an optimization problem; namely, to determine the allocation of limited resources (time, money, and personnel) to minimize risk or, alternatively, to minimize the resources needed to reduce risk to an acceptable level. The end result of the work reported here is a means to optimize quality-assurance processes used in developing software.
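
    As a sketch of this optimization framing (not the actual tool, which uses a richer risk model and search method), a greedy selection of assurance activities under a resource budget might look like the following, with all names, costs, and risk-reduction values invented:

    ```python
    activities = [                # (name, cost in person-days, expected risk reduction)
        ("code inspections",     10, 0.25),
        ("unit tests",            8, 0.20),
        ("design reviews",        5, 0.10),
        ("traceability matrix",   3, 0.04),
    ]
    budget = 15

    chosen, remaining = [], budget
    # Greedy: take the highest risk reduction per unit cost first. (A real planner
    # could use integer programming or heuristic search over the same objective.)
    for name, cost, reduction in sorted(activities, key=lambda a: a[2] / a[1], reverse=True):
        if cost <= remaining:
            chosen.append(name)
            remaining -= cost
    print(chosen, "unused budget:", remaining)
    ```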

  3. Nonlinear Combustion Instability Prediction

    NASA Technical Reports Server (NTRS)

    Flandro, Gary

    2010-01-01

    The liquid rocket engine stability prediction software (LCI) predicts combustion stability of systems using LOX-LH2 propellants. Both longitudinal and transverse mode stability characteristics are calculated. This software has the unique feature of being able to predict system limit amplitude.

  4. Cybersecurity, massive data processing, community interaction, and other developments at WWW-based computational X-ray Server

    NASA Astrophysics Data System (ADS)

    Stepanov, Sergey

    2013-03-01

    X-Ray Server (x-server.gmca.aps.anl.gov) is a WWW-based computational server for modeling of X-ray diffraction, reflection, and scattering data. The modeling software operates directly on the server and can be accessed remotely either from web browsers or from user software. In the latter case the server can be deployed as a software library or a data-fitting engine. As the server recently surpassed the milestones of 15 years online and 1.5 million calculations, it has accumulated a number of technical solutions that are discussed in this paper. The developed approaches to detecting physical model limits and user calculation failures, solutions to spam and firewall problems, ways to involve the community in replenishing databases, and methods to teach users automated access to the server programs may be helpful for X-ray researchers interested in using the server or sharing their own software online.

  5. Data systems and computer science: Software Engineering Program

    NASA Technical Reports Server (NTRS)

    Zygielbaum, Arthur I.

    1991-01-01

    An external review of the Integrated Technology Plan for the Civil Space Program is presented. This review is specifically concerned with the Software Engineering Program. The goals of the Software Engineering Program are as follows: (1) improve NASA's ability to manage development, operation, and maintenance of complex software systems; (2) decrease NASA's cost and risk in engineering complex software systems; and (3) provide technology to assure safety and reliability of software in mission critical applications.

  6. NSR&D Program Fiscal Year 2015 Funded Research Stochastic Modeling of Radioactive Material Releases Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrus, Jason P.; Pope, Chad; Toston, Mary

    2016-12-01

    Nonreactor nuclear facilities operating under the approval authority of the U.S. Department of Energy use unmitigated hazard evaluations to determine if potential radiological doses associated with design basis events challenge or exceed dose evaluation guidelines. Unmitigated design basis events that sufficiently challenge dose evaluation guidelines, or exceed the guidelines for members of the public or workers, merit selection of safety structures, systems, or components or other controls to prevent or mitigate the hazard. Idaho State University, in collaboration with Idaho National Laboratory, has developed a portable and simple-to-use software application called SODA (Stochastic Objective Decision-Aide) that stochastically calculates the radiation dose distribution associated with hypothetical radiological material release scenarios. Rather than producing a point estimate of the dose, SODA produces a dose distribution result to allow a deeper understanding of the dose potential. SODA allows users to select the distribution type and parameter values for all of the input variables used to perform the dose calculation. Users can also specify custom distributions through a user-defined distribution option. SODA then randomly samples each distribution input variable and calculates the overall resulting dose distribution. In cases where an input variable distribution is unknown, a traditional single point value can be used. SODA, developed using the MATLAB coding framework, has a graphical user interface and can be installed on both Windows and Mac computers. SODA is a standalone software application and does not require MATLAB to function. SODA provides improved risk understanding, leading to better informed decision making associated with establishing nuclear facility material-at-risk limits and safety structure, system, or component selection. It is important to note that SODA does not replace or compete with codes such as MACCS or RSAC; rather it is viewed as an easy-to-use supplemental tool to help improve risk understanding and support better informed decisions. The SODA development project was funded through a grant from the DOE Nuclear Safety Research and Development Program.
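
    A minimal sketch of this kind of stochastic dose calculation, assuming the standard DOE five-factor source-term formula (ST = MAR x DR x ARF x RF x LPF) and invented input distributions; SODA's actual inputs and models may differ:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000
    # All distribution choices and parameter values below are hypothetical.
    mar = rng.triangular(50, 100, 150, n)        # material at risk (g)
    dr = rng.uniform(0.1, 0.5, n)                # damage ratio
    arf = rng.lognormal(np.log(1e-3), 0.5, n)    # airborne release fraction
    rf = rng.uniform(0.2, 1.0, n)                # respirable fraction
    lpf = 1.0                                    # leak path factor (point value)
    chi_q = rng.lognormal(np.log(1e-4), 0.7, n)  # atmospheric dispersion (s/m^3)
    br = 3.3e-4                                  # breathing rate (m^3/s)
    dcf = 1.0e-2                                 # dose conversion factor (Sv/g, illustrative)

    # Each Monte Carlo sample yields one dose; the collection is the distribution.
    dose = mar * dr * arf * rf * lpf * chi_q * br * dcf
    print(np.percentile(dose, [5, 50, 95]))      # Sv
    ```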

  7. Investigating the Acquisition of Software Systems that Rely on Open Architecture and Open Source Software

    DTIC Science & Technology

    2010-03-01

    associated with certain software systems [Breaux and Anton 2008]. With this basis to build on, it is now possible to analyze the alignment of... Kazman, R. (2003). Software Architecture in Practice, 2nd Edition, Addison-Wesley Professional, New York. Breaux, T.D. and Anton, A.I. (2008)... calculus for license rights and obligations in license and context models. Using them, we calculate rights and obligations for specific systems, identify

  8. An Architecture, System Engineering, and Acquisition Approach for Space System Software Resiliency

    NASA Astrophysics Data System (ADS)

    Phillips, Dewanne Marie

    Software intensive space systems can harbor defects and vulnerabilities that may enable external adversaries or malicious insiders to disrupt or disable system functions, risking mission compromise or loss. Mitigating this risk demands a sustained focus on the security and resiliency of the system architecture, including software, hardware, and other components. Robust software engineering practices contribute to the foundation of a resilient system so that the system "can take a hit to a critical component and recover in a known, bounded, and generally acceptable period of time". Software resiliency must be a priority and addressed early in life cycle development to contribute to a secure and dependable space system. Those who develop, implement, and operate software intensive space systems must determine the factors and systems engineering practices to address when investing in software resiliency. This dissertation offers methodical approaches for improving space system resiliency through software architecture design, systems engineering, and increased software security, thereby reducing the risk of latent software defects and vulnerabilities. By providing greater attention to the early life cycle phases of development, we can alter the engineering process to help detect, eliminate, and avoid vulnerabilities before space systems are delivered. To achieve this objective, this dissertation identifies knowledge, techniques, and tools that engineers and managers can utilize to help them recognize how vulnerabilities are produced and discovered so that they can learn to circumvent them in future efforts. We conducted a systematic review of existing architectural practices, standards, security and coding practices, and the various threats, defects, and vulnerabilities that impact space systems, drawing on hundreds of relevant publications and interviews of subject matter experts. We expanded on the system-level body of knowledge for resiliency and identified a new software architecture framework and acquisition methodology to improve the resiliency of space systems from a software perspective, with an emphasis on the early phases of the systems engineering life cycle. This methodology involves seven steps: 1) Define technical resiliency requirements, 1a) Identify standards/policy for software resiliency, 2) Develop a request for proposal (RFP)/statement of work (SOW) for resilient space systems software, 3) Define software resiliency goals for space systems, 4) Establish software resiliency quality attributes, 5) Perform architectural tradeoffs and identify risks, 6) Conduct architecture assessments as part of the procurement process, and 7) Ascertain space system software architecture resiliency metrics. Data illustrate that software vulnerabilities can lead to opportunities for malicious cyber activities, which could degrade the space mission capability for the user community. Reducing the number of vulnerabilities by improving architecture and software system engineering practices can contribute to making space systems more resilient. Since cyber-attacks are enabled by shortfalls in software, robust software engineering practices and an architectural design are foundational to resiliency, which is a quality that allows the system to "take a hit to a critical component and recover in a known, bounded, and generally acceptable period of time".
To achieve software resiliency for space systems, acquirers and suppliers must identify relevant factors and systems engineering practices to apply across the lifecycle, in software requirements analysis, architecture development, design, implementation, verification and validation, and maintenance phases.

  9. Assessing the Potential Impact of the 2015-2016 El Niño on the California Rim Fire Burn Scar Through Debris Flow Hazard Mapping

    NASA Astrophysics Data System (ADS)

    Larcom, S.; Grigsby, S.; Ustin, S.

    2015-12-01

    Wildfires are a perennial issue for California, and the current record-breaking drought is exacerbating the potential problems for the state. Fires leave behind burn scars characterized by diminished vegetative cover and abundant bare soil, and these areas are especially susceptible to storm events that pose an elevated risk of debris flows and sediment-rich sheet wash. This study focused on the 2013 Rim Fire, which devastated significant portions of Stanislaus National Forest and Yosemite National Park, and utilized readily available NASA JPL SRTM elevation data and AVIRIS spectral imaging data to construct a debris flow hazard map that assesses mass wasting risk for the Rim Fire burn scar. The study relied entirely on remotely sensed data, which were processed in software programs such as ENVI, GRASS GIS, ArcMap, and Google Earth. Parameters taken into consideration when constructing this map include hill slope (greater than 30 percent rise), burn severity (assessed by calculating NDVI), and erodibility of the soil (assessed by comparing the spectral reflectance of AVIRIS images with the reference spectrum of illite). By percent of total burn area, 6% was classified as low risk, 55% as medium risk, and 39% as high risk. In addition, this study assessed the importance of the 2015-2016 El Niño, which is projected to be one of the strongest on record, by studying historic rainfall records and storm events of past El Niños. Hydrological and infrastructural problems that could be caused by short-term convective or long-term synoptic storms and subsequent debris flows were explored as well.
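
    The NDVI-based burn-severity step mentioned above reduces to a simple band ratio. A minimal sketch with invented reflectance values and thresholds:

    ```python
    import numpy as np

    def ndvi(nir, red):
        """Normalized Difference Vegetation Index from NIR and red reflectance."""
        return (nir - red) / (nir + red + 1e-9)   # small epsilon avoids divide-by-zero

    # Hypothetical band arrays for the same pixels before and after the fire.
    pre_nir, pre_red = np.array([0.45, 0.50]), np.array([0.08, 0.07])
    post_nir, post_red = np.array([0.20, 0.42]), np.array([0.12, 0.08])

    dndvi = ndvi(pre_nir, pre_red) - ndvi(post_nir, post_red)  # large drop = severe burn
    severity = np.digitize(dndvi, [0.1, 0.3])                  # 0=low, 1=medium, 2=high (toy bins)
    print(dndvi, severity)
    ```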

  10. Processing Raman Spectra of High-Pressure Hydrogen Flames

    NASA Technical Reports Server (NTRS)

    Nguyen, Quang-Viet; Kojima, Jun

    2006-01-01

    The Raman Code automates the analysis of laser-Raman-spectroscopy data for diagnosis of combustion at high pressure. On the basis of the theory of molecular spectroscopy, the software calculates the rovibrational and pure rotational Raman spectra of H2, O2, N2, and H2O in hydrogen/air flames at given temperatures and pressures. Given a set of Raman spectral data from measurements on a given flame and results from the aforementioned calculations, the software calculates the thermodynamic temperature and number densities of the aforementioned species. The software accounts for collisional spectral-line-broadening effects at pressures up to 60 bar (6 MPa). The line-broadening effects increase with pressure and thereby complicate the analysis. The software also corrects for spectral interference ("cross-talk") among the various chemical species. In the absence of such correction, the cross-talk is a significant source of error in temperatures and number densities. This is the first known comprehensive computer code that, when used in conjunction with a spectral calibration database, can process Raman-scattering spectral data from high-pressure hydrogen/air flames to obtain temperatures accurate to within 10 K and chemical-species number densities accurate to within 2 percent.

  11. Development of digital reconstructed radiography software at new treatment facility for carbon-ion beam scanning of National Institute of Radiological Sciences.

    PubMed

    Mori, Shinichiro; Inaniwa, Taku; Kumagai, Motoki; Kuwae, Tsunekazu; Matsuzaki, Yuka; Furukawa, Takuji; Shirai, Toshiyuki; Noda, Koji

    2012-06-01

    To increase the accuracy of carbon-ion beam scanning therapy, we have developed a graphical user interface-based digitally reconstructed radiograph (DRR) software system for use in routine clinical practice at our center. The DRR software is used in particular scenarios in the new treatment facility to achieve the same level of geometrical accuracy at treatment as at the imaging session. The DRR calculation is implemented simply as the summation of CT image voxel values along the X-ray projection ray. Since we implemented graphics processing unit-based computation, the DRR images are calculated with a speed sufficient for clinical practice requirements. Because high-spatial-resolution flat panel detector (FPD) images must be registered to the reference DRR images during patient setup in any scenario, the DRR images also need a spatial resolution close to that of the FPD images. To overcome the limitation imposed by the CT voxel size on spatial resolution, we applied image processing to improve the calculated DRR spatial resolution. The DRR software introduced here enabled patient positioning with sufficient accuracy for the implementation of carbon-ion beam scanning therapy at our center.
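
    The ray-summation idea described above can be sketched as follows; the nearest-neighbour sampling, array sizes, and geometry are illustrative stand-ins for the interpolated, GPU-based implementation the paper describes:

    ```python
    import numpy as np

    def drr_pixel(ct, src, pix, n_samples=512):
        """Sum CT voxel values along the ray from the X-ray source to one
        detector pixel (nearest-neighbour sampling for brevity)."""
        ts = np.linspace(0.0, 1.0, n_samples)
        pts = src[None, :] + ts[:, None] * (pix - src)[None, :]  # points along the ray
        idx = np.rint(pts).astype(int)
        ok = np.all((idx >= 0) & (idx < np.array(ct.shape)), axis=1)  # inside volume
        return ct[idx[ok, 0], idx[ok, 1], idx[ok, 2]].sum()

    ct = np.random.default_rng(0).integers(-1000, 2000, (64, 64, 64)).astype(float)
    src = np.array([-200.0, 32.0, 32.0])   # source behind the volume (voxel coords)
    pix = np.array([263.0, 32.0, 32.0])    # one detector pixel on the far side
    print(drr_pixel(ct, src, pix))
    ```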

  12. RAMPART (TM): Risk Assessment Method-Property Analysis and Ranking Tool v.4.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carson, Susan D.; Hunter, Regina L.; Link, Madison D.

    RAMPART™, Risk Assessment Method-Property Analysis and Ranking Tool, is a new type of computer software package for the assessment of risk to buildings. RAMPART™ has been developed by Sandia National Laboratories (SNL) for the U.S. General Services Administration (GSA). RAMPART™ has been designed and developed to be a risk-based decision support tool that requires no risk analysis expertise on the part of the user. The RAMPART™ user interface elicits information from the user about the building. The RAMPART™ expert system is a set of rules that embodies GSA corporate knowledge and SNL's risk assessment experience. The RAMPART™ database contains both data entered by the user during a building analysis session and large sets of natural hazard and crime data. RAMPART™ algorithms use these data to assess the risk associated with a given building in the face of certain hazards. Risks arising from five natural hazards (earthquake, hurricane, winter storm, tornado and flood); crime (inside and outside the building); fire and terrorism are calculated. These hazards may cause losses of various kinds. RAMPART™ considers death, injury, loss of mission, loss of property, loss of contents, loss of building use, and first-responder loss. The results of each analysis are presented graphically on the screen and in a written report.

  13. The key role of psychosocial risk on therapeutic outcome in obese children and adolescents. Results from a longitudinal multicenter study.

    PubMed

    Röbl, Markus; de Souza, Martin; Schiel, Ralf; Gellhaus, Ines; Zwiauer, Karl; Holl, Reinhard W; Wiegand, Susanna

    2013-01-01

    Childhood obesity is high on the global public health agenda. Although risk factors are well known, the influence of social risk on the therapeutic outcome of lifestyle intervention is poorly examined. This study aims to investigate the influence of migration background, low education, and parental unemployment. 62,147 patients participated in multidimensional lifestyle intervention programs in 179 pediatric obesity centers. Data were collected using standardized software for longitudinal multicenter documentation. 12,305 (19.8%) attended care for 6-24 months, undergoing an intensive therapy period and subsequent follow-ups for up to 3 years. A cumulative social risk score was calculated based on different risk indicators. Migration background, low education, and parental unemployment significantly influenced the outcome of lifestyle intervention. The observed BMI-SDS reduction was significantly higher in the subgroup with low social risk factors (Δ BMI-SDS -0.19) compared to those presenting moderate (Δ BMI-SDS -0.14) and high social risk (Δ BMI-SDS -0.11). Our data underline the effect of children's social setting on the outcome of multidimensional lifestyle intervention. The presence of a high social risk burden is a negative predictor for successful weight loss. Specific therapeutic programs need to be developed for disadvantaged children and adolescents.

  14. A Formal Application of Safety and Risk Assessment in Software Systems

    DTIC Science & Technology

    2004-09-01

    characteristics of Software Engineering, Development, and Safety... against a comparison of planned and actual schedules, costs, and characteristics. Software Safety is focused on the reduction of unsafe incidents... they merely carry out the role for which they were anatomically designed. Software is characteristically like an anatomical cell as it merely

  15. GEDAE-LaB: A Free Software to Calculate the Energy System Contributions during Exercise

    PubMed Central

    Bertuzzi, Rômulo; Melegati, Jorge; Bueno, Salomão; Ghiarone, Thaysa; Pasqua, Leonardo A.; Gáspari, Arthur Fernandes; Lima-Silva, Adriano E.; Goldman, Alfredo

    2016-01-01

    Purpose: The aim of the current study is to describe the functionality of free software developed for energy system contributions and energy expenditure calculation during exercise, namely GEDAE-LaB. Methods: Eleven participants performed the following tests: 1) a maximal cycling incremental test to measure the ventilatory threshold and maximal oxygen uptake (V̇O2max); 2) a constant-workload cycling test in the moderate domain (90% ventilatory threshold); 3) a constant-workload cycling test in the severe domain (110% V̇O2max). Oxygen uptake and plasma lactate were measured during the tests. The contributions of the aerobic (AMET), anaerobic lactic (LAMET), and anaerobic alactic (ALMET) systems were calculated based on the oxygen uptake during exercise, the oxygen energy equivalents provided by lactate accumulation, and the fast component of excess post-exercise oxygen consumption, respectively. In order to assess the intra-investigator variation, four different investigators performed the analyses independently using GEDAE-LaB. A direct comparison with commercial software was also provided. Results: All subjects completed 10 min of exercise in the moderate domain, while the time to exhaustion in the severe domain was 144 ± 65 s. The AMET, LAMET, and ALMET contributions in the moderate domain were about 93, 2, and 5%, respectively. The AMET, LAMET, and ALMET contributions in the severe domain were about 66, 21, and 13%, respectively. No statistical differences were found between the energy system contributions and energy expenditure obtained by GEDAE-LaB and commercial software for both moderate and severe domains (P > 0.05). The ICC revealed that these estimates were highly reliable among the four investigators for both moderate and severe domains (all ICC ≥ 0.94). Conclusion: These findings suggest that GEDAE-LaB is a free software easily comprehended by users minimally familiarized with the procedures adopted for calculating the energetic profile using oxygen uptake and lactate accumulation during exercise. By providing availability of the software and its source code we hope to facilitate future related research. PMID:26727499
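
    A rough sketch of the oxygen-equivalent accounting the paper describes, using the commonly cited equivalent of ~3 mL O2 per kg body mass per mmol/L of accumulated lactate; all inputs and the function itself are illustrative, not GEDAE-LaB source code:

    ```python
    import numpy as np

    def energy_contributions(t, vo2, vo2_rest, delta_lactate, body_mass, epoc_fast_l):
        """Percentage split between aerobic, anaerobic lactic, and anaerobic
        alactic systems, all expressed in litres of O2 equivalents.

        t: time points (min), vo2: O2 uptake during exercise (L/min),
        delta_lactate: net lactate accumulation (mmol/L),
        epoc_fast_l: fast component of post-exercise O2 consumption (L)."""
        aerobic_l = np.trapz(vo2 - vo2_rest, t)            # net O2 used during exercise
        lactic_l = delta_lactate * 3.0 * body_mass / 1000  # O2-lactate equivalent
        alactic_l = epoc_fast_l                            # fast EPOC component
        total = aerobic_l + lactic_l + alactic_l
        return {k: 100 * v / total for k, v in
                {"AMET": aerobic_l, "LAMET": lactic_l, "ALMET": alactic_l}.items()}

    t = np.linspace(0, 2.4, 25)             # ~144 s severe-domain bout
    vo2 = np.full_like(t, 3.5)              # toy plateau at 3.5 L/min
    print(energy_contributions(t, vo2, 0.4, 8.0, 70.0, 2.0))
    ```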

  16. Effects of photographic distance on tree crown attributes calculated using UrbanCrowns image analysis software

    Treesearch

    Mason F. Patterson; P. Eric Wiseman; Matthew F. Winn; Sang-mook Lee; Philip A. Araman

    2011-01-01

    UrbanCrowns is a software program developed by the USDA Forest Service that computes crown attributes using a side-view digital photograph and a few basic field measurements. From an operational standpoint, it is not known how well the software performs under varying photographic conditions for trees of diverse size, which could impact measurement reproducibility and...

  17. Improving Building Energy Simulation Programs Through Diagnostic Testing (Fact Sheet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2012-02-01

    New test procedure evaluates quality and accuracy of energy analysis tools for the residential building retrofit market. Reducing the energy use of existing homes in the United States offers significant energy-saving opportunities, which can be identified through building simulation software tools that calculate optimal packages of efficiency measures. To improve the accuracy of energy analysis for residential buildings, the National Renewable Energy Laboratory's (NREL) Buildings Research team developed the Building Energy Simulation Test for Existing Homes (BESTEST-EX), a method for diagnosing and correcting errors in building energy audit software and calibration procedures. BESTEST-EX consists of building physics and utility bill calibration test cases, which software developers can use to compare their tools' simulation findings to reference results generated with state-of-the-art simulation tools. Overall, the BESTEST-EX methodology: (1) tests software predictions of retrofit energy savings in existing homes; (2) ensures building physics calculations and utility bill calibration procedures perform to a minimum standard; and (3) quantifies impacts of uncertainties in input audit data and occupant behavior. BESTEST-EX is helping software developers identify and correct bugs in their software, as well as develop and test utility bill calibration procedures.

  18. The cost of work-related physical assaults in Minnesota.

    PubMed Central

    McGovern, P; Kochevar, L; Lohman, W; Zaidman, B; Gerberich, S G; Nyman, J; Findorff-Dennis, M

    2000-01-01

    OBJECTIVE: To describe the long-term productivity costs of occupational assaults. DATA SOURCES/STUDY SETTING: All incidents of physical assaults that resulted in indemnity payments, identified from the Minnesota Department of Labor and Industry (DLI) Workers' Compensation system in 1992. Medical expenditures were obtained from insurers, and data on lost wages, legal fees, and permanency ratings were collected from DLI records. Insurance administrative expenses were estimated. Lost fringe benefits and household production losses were imputed. STUDY DESIGN: The human capital approach was used to describe the long-term costs of occupational assaults. Economic software was used to apply a modified version of Rice, MacKenzie, and Associates' (1989) model for estimating the present value of past losses from 1992 through 1995 for all cases, and the future losses for cases open in 1996. PRINCIPAL FINDINGS: The total costs for 344 nonfatal work-related assaults were estimated at $5,885,448 (1996 dollars). Calculation of injury incidence and average costs per case and per employee identified populations with an elevated risk of assault. An analysis by industry revealed an elevated risk for workers employed in justice and safety (incidence: 198/100,000; $19,251 per case; $38 per employee), social service (incidence: 127/100,000; $24,210 per case; $31 per employee), and health care (incidence: 76/100,000; $13,197 per case; $10 per employee). CONCLUSIONS: Identified subgroups warrant attention for risk factor identification and prevention efforts. Cost estimates can serve as the basis for business calculations on the potential value of risk management interventions. PMID:10966089
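
    The present-value bookkeeping in such a human-capital cost model can be sketched as below; the cash flows and the 4% discount rate are invented for illustration:

    ```python
    # Costs incurred before the base year are compounded forward to base-year
    # (1996) dollars; costs after it are discounted back.
    losses = {1992: 1_200_000, 1993: 900_000, 1994: 700_000, 1995: 500_000}
    future = {1997: 300_000, 1998: 250_000}
    base_year, r = 1996, 0.04

    pv_past = sum(c * (1 + r) ** (base_year - t) for t, c in losses.items())
    pv_future = sum(c / (1 + r) ** (t - base_year) for t, c in future.items())
    print(round(pv_past + pv_future))
    ```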

  1. VAVUQ, Python and Matlab freeware for Verification and Validation, Uncertainty Quantification

    NASA Astrophysics Data System (ADS)

    Courtney, J. E.; Zamani, K.; Bombardelli, F. A.; Fleenor, W. E.

    2015-12-01

    A package of scripts is presented for automated Verification and Validation (V&V) and Uncertainty Quantification (UQ) for engineering codes that approximate partial differential equations (PDEs). The code post-processes model results to produce V&V and UQ information, which can be used to assess model performance. Automated information on code performance allows a systematic methodology for assessing the quality of model approximations. The software implements common and accepted code verification schemes: it uses the Method of Manufactured Solutions (MMS), the Method of Exact Solution (MES), cross-code verification, and Richardson Extrapolation (RE) for solution (calculation) verification. It also includes common statistical measures that can be used for model skill assessment. Complete RE can be conducted for complex geometries by implementing high-order non-oscillating numerical interpolation schemes within the software. Model approximation uncertainty is quantified by calculating lower and upper bounds of numerical error from the RE results. The software is also able to calculate the Grid Convergence Index (GCI), and to handle adaptive meshes and models that implement mixed-order schemes. Four examples are provided to demonstrate the use of the software: code verification of a mixed-order compact difference heat transport solver; solution verification of a 2D shallow-water-wave solver for tidal flow modeling in estuaries; model validation of a two-phase flow computation in a hydraulic jump compared to experimental data; and numerical uncertainty quantification for 3D CFD modeling of the flow patterns in a Gust erosion chamber.
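
    Richardson extrapolation and the GCI mentioned above follow standard formulas: the observed order p from three grid levels, an extrapolated solution estimate, and a safety-factored relative error bound. A minimal sketch with invented solution values:

    ```python
    from math import log

    def observed_order(f1, f2, f3, r):
        """Observed convergence order from three grids (f1 finest), ratio r."""
        return log((f3 - f2) / (f2 - f1)) / log(r)

    def gci_fine(f1, f2, r, p, fs=1.25):
        """Grid Convergence Index on the fine grid (Roache), a relative error bound."""
        return fs * abs((f1 - f2) / f1) / (r**p - 1)

    f1, f2, f3, r = 0.9713, 0.9700, 0.9661, 2.0   # hypothetical solution functionals
    p = observed_order(f1, f2, f3, r)
    f_exact = f1 + (f1 - f2) / (r**p - 1)         # Richardson-extrapolated estimate
    print(p, f_exact, gci_fine(f1, f2, r, p))
    ```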

  2. [Aconitum in treatment of rheumatoid arthritis: benefit-risk assessment].

    PubMed

    Zhang, Xiao-Meng; Jin, Yong-Nan; Zhang, Bing; Li, Ning

    2018-01-01

    Rheumatoid arthritis (RA) is characterized by a long disease course and difficulty of treatment. Conventional therapy may easily induce adverse drug reactions or events (ADR/ADE) due to long-term medication, so special attention should be given to the treatment benefit and medication risk of RA patients. Aconitum, a kind of toxic traditional Chinese herb, is an important complementary therapy for RA, with some controversy in clinical application. Addressing the practical problem of combined use of traditional Chinese medicines (TCM) and Western medicines (WM), this study conducted a quantitative assessment of the benefits and risks of Aconitum used with or without WM, carried out with a multi-criteria decision analysis (MCDA) model. RevMan 5.2 software was used to separately analyze the results of every index of 21 randomized clinical trials (RCTs) of Aconitum used alone in the treatment of RA, and 49 RCTs of Aconitum combined with WM. The merged results indicated that, compared with conventional WM therapy, both the exclusive use and the combined use of Aconitum could improve efficacy and decrease the incidence of ADR/ADEs. Based on the benefit-risk assessment decision tree of RA treatment, Hiview 3 software and Crystal Ball Monte Carlo simulation were used to calculate the benefit value, risk value, and benefit-risk value of Aconitum used alone and combined with WM. The results showed that the combination therapy had significantly better benefits than Aconitum alone (difference value = 15, 95% CI [9.72, 20.25]), but the risk of combined use was also higher (difference value = 23, 95% CI [15.57, 30.55]). In comprehensive consideration of benefit and risk, the total benefit-risk value of using Aconitum alone was 58, while that of the combination therapy was 55, and the probability of the former being superior to the latter was 81.07%. The study showed that Aconitum is an important therapy to supplement RA treatment. In clinical application, the patient's acceptance of benefit and risk needs to be considered; if patients cannot bear the risk, the combined use of Aconitum and WM is not recommended.
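
    The benefit-risk scoring rests on a weighted multi-criteria aggregation; the study itself used Hiview 3 with Monte Carlo uncertainty propagation. A toy weighted-sum sketch with invented criteria, weights, and scores:

    ```python
    # Each option gets a weighted benefit score and a weighted risk score; the
    # benefit-risk value here is simply their difference (illustrative only).
    def weighted(scores, weights):
        return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

    benefit_w, risk_w = [0.6, 0.4], [0.7, 0.3]   # hypothetical criterion weights
    options = {
        "Aconitum alone": (weighted([70, 65], benefit_w), weighted([20, 25], risk_w)),
        "Aconitum + WM":  (weighted([85, 80], benefit_w), weighted([45, 40], risk_w)),
    }
    for name, (benefit, risk) in options.items():
        print(name, "benefit-risk value:", round(benefit - risk, 1))
    ```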

  3. APBSmem: A Graphical Interface for Electrostatic Calculations at the Membrane

    PubMed Central

    Callenberg, Keith M.; Choudhary, Om P.; de Forest, Gabriel L.; Gohara, David W.; Baker, Nathan A.; Grabe, Michael

    2010-01-01

    Electrostatic forces are one of the primary determinants of molecular interactions. They help guide the folding of proteins, increase the binding of one protein to another and facilitate protein-DNA and protein-ligand binding. A popular method for computing the electrostatic properties of biological systems is to numerically solve the Poisson-Boltzmann (PB) equation, and there are several easy-to-use software packages available that solve the PB equation for soluble proteins. Here we present a freely available program, called APBSmem, for carrying out these calculations in the presence of a membrane. The Adaptive Poisson-Boltzmann Solver (APBS) is used as a back-end for solving the PB equation, and a Java-based graphical user interface (GUI) coordinates a set of routines that introduce the influence of the membrane, determine its placement relative to the protein, and set the membrane potential. The software Jmol is embedded in the GUI to visualize the protein inserted in the membrane before the calculation and the electrostatic potential after completing the computation. We expect that the ease with which the GUI allows one to carry out these calculations will make this software a useful resource for experimenters and computational researchers alike. Three examples of membrane protein electrostatic calculations are carried out to illustrate how to use APBSmem and to highlight the different quantities of interest that can be calculated. PMID:20949122

  4. BYMUR software: a free and open source tool for quantifying and visualizing multi-risk analyses

    NASA Astrophysics Data System (ADS)

    Tonini, Roberto; Selva, Jacopo

    2013-04-01

    The BYMUR software aims to provide an easy-to-use open source tool for computing multi-risk and for managing, visualizing, and comparing all the inputs (e.g., hazard, fragilities, and exposure) as well as the corresponding results (e.g., risk curves, risk indexes). For all inputs, a complete management of inter-model epistemic uncertainty is considered. The BYMUR software will be one of the final products provided by the homonymous ByMuR project (http://bymur.bo.ingv.it/) funded by the Italian Ministry of Education, Universities and Research (MIUR), which is focused on (i) providing a quantitative and objective general method for a comprehensive long-term multi-risk analysis in a given area, accounting for inter-model epistemic uncertainty through Bayesian methodologies, and (ii) applying the methodology to seismic, volcanic, and tsunami risks in Naples (Italy). More specifically, the BYMUR software will be able to separately account for the probabilistic hazard assessment of different kinds of hazardous phenomena, the relative (time-dependent/independent) vulnerabilities and exposure data, and their possible (predefined) interactions: the software will analyze these inputs and use them to estimate both the single- and multi-risk associated with a specific target area. In addition, it will be possible to connect the software to further tools (e.g., a full hazard analysis), allowing a dynamic I/O of results. The use of the Python programming language guarantees that the final software will be open source and platform independent. Moreover, thanks to the integration of some of the most popular and feature-rich Python scientific modules (NumPy, Matplotlib, SciPy) with the wxPython graphical user toolkit, the final tool will be equipped with a comprehensive Graphical User Interface (GUI) able to control and visualize (in the form of tables, maps and/or plots) any stage of the multi-risk analysis. The additional features of importing/exporting data in MySQL databases and/or standard XML formats (for instance, the global standards defined in the frame of the GEM project for seismic hazard and risk) will ensure interoperability with other FOSS software and tools and, at the same time, keep the tool readily available to the geo-scientific community. An already available example of connection is represented by the BET_VH(**) tool, whose probabilistic volcanic hazard outputs will be used as input for BYMUR. Finally, the prototype version of BYMUR will be used for the case study of the municipality of Naples, considering three different natural hazards (volcanic eruptions, earthquakes, and tsunamis) and assessing the consequent long-term risk evaluation. (**) BET_VH (Bayesian Event Tree for Volcanic Hazard) is a probabilistic tool for long-term volcanic hazard assessment, recently re-designed and adjusted to run on the Vhub cyber-infrastructure, a free web-based collaborative tool in volcanology research (see http://vhub.org/resources/betvh).
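
    A single-hazard building block of such a multi-risk computation is the convolution of a hazard curve with a fragility curve and exposure into an expected annual loss; the sketch below uses invented numbers and is not BYMUR code:

    ```python
    import numpy as np

    im = np.array([0.1, 0.2, 0.3, 0.4, 0.5])                 # intensity levels (e.g. PGA, g)
    rate_exceed = np.array([1e-1, 3e-2, 1e-2, 3e-3, 1e-3])   # annual exceedance rates (hazard curve)
    p_damage = np.array([0.02, 0.10, 0.30, 0.60, 0.85])      # fragility: P(damage | IM)
    loss_given_damage = 1.0e6                                # exposure (currency units)

    rate_bin = -np.diff(np.append(rate_exceed, 0.0))         # annual rate of being *in* each IM bin
    eal = (rate_bin * p_damage * loss_given_damage).sum()    # expected annual loss, one hazard
    print(f"EAL ~ {eal:,.0f} per year")
    ```

    A multi-risk index of the kind described would then aggregate such per-hazard losses, together with any predefined hazard interactions and the epistemic uncertainty on each input curve.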

  6. Process-based quality management for clinical implementation of adaptive radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Noel, Camille E.; Santanam, Lakshmi; Parikh, Parag J.

    Purpose: Intensity-modulated adaptive radiotherapy (ART) has been the focus of considerable research and developmental work due to its potential therapeutic benefits. However, in light of its unique quality assurance (QA) challenges, no one has described a robust framework for its clinical implementation. In fact, recent position papers by ASTRO and AAPM have firmly endorsed pretreatment patient-specific IMRT QA, which limits the feasibility of online ART. The authors aim to address these obstacles by applying failure mode and effects analysis (FMEA) to identify high-priority errors and appropriate risk-mitigation strategies for clinical implementation of intensity-modulated ART. Methods: An experienced team of two clinical medical physicists, one clinical engineer, and one radiation oncologist was assembled to perform a standard FMEA for intensity-modulated ART. A set of 216 potential radiotherapy failures composed by the forthcoming AAPM task group 100 (TG-100) was used as the basis. Of the 216 failures, 127 were identified as most relevant to an ART scheme. Using the associated TG-100 FMEA values as a baseline, the team considered how the likeliness of occurrence (O), outcome severity (S), and likeliness of failure being undetected (D) would change for ART. New risk priority numbers (RPN) were calculated. Failures characterized by RPN ≥ 200 were identified as potentially critical. Results: FMEA revealed that ART RPN increased for 38% (n = 48/127) of potential failures, with 75% (n = 36/48) attributed to failures in the segmentation and treatment planning processes. Forty-three of 127 failures were identified as potentially critical. Risk-mitigation strategies include implementing a suite of quality control and decision support software, specialty QA software/hardware tools, and an increase in specially trained personnel. Conclusions: Results of the FMEA-based risk assessment demonstrate that intensity-modulated ART introduces different (but not necessarily more) risks than standard IMRT and may be safely implemented with the proper mitigations.
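
    The RPN arithmetic underlying such an analysis is simple enough to show directly. In the minimal Python sketch below, the failure modes and their O, S and D scores are invented for illustration and are not taken from TG-100:

        # Each entry: (description, occurrence O, severity S, detectability D),
        # all scored 1-10; the failure modes are illustrative, not TG-100 items.
        failure_modes = [
            ("Wrong structure set propagated to adapted plan", 4, 9, 6),
            ("Dose recalculation uses stale CT density table", 3, 8, 7),
            ("Plan second check skipped under time pressure", 5, 7, 5),
        ]

        for name, o, s, d in failure_modes:
            rpn = o * s * d                    # risk priority number
            flag = "CRITICAL" if rpn >= 200 else "ok"
            print(f"{rpn:4d}  {flag:8s}  {name}")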

  7. SU-E-T-421: Failure Mode and Effects Analysis (FMEA) of Xoft Electronic Brachytherapy for the Treatment of Superficial Skin Cancers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoisak, J; Manger, R; Dragojevic, I

    Purpose: To perform a failure mode and effects analysis (FMEA) of the process for treating superficial skin cancers with the Xoft Axxent electronic brachytherapy (eBx) system, given the recent introduction of expanded quality control (QC) initiatives at our institution. Methods: A process map was developed listing all steps in superficial treatments with Xoft eBx, from the initial patient consult to the completion of the treatment course. The process map guided the FMEA to identify the failure modes for each step in the treatment workflow and assign Risk Priority Numbers (RPN), calculated as the product of the failure mode’s probability of occurrence (O), severity (S) and lack of detectability (D). FMEA was done with and without the inclusion of recent QC initiatives such as increased staffing, physics oversight, standardized source calibration, treatment planning and documentation. The failure modes with the highest RPNs were identified and contrasted before and after introduction of the QC initiatives. Results: Based on the FMEA, the failure modes with the highest RPN were related to source calibration, treatment planning, and patient setup/treatment delivery (Fig. 1). The introduction of additional physics oversight, standardized planning and safety initiatives such as checklists and time-outs reduced the RPNs of these failure modes. High-risk failure modes that could be mitigated with improved hardware and software interlocks were identified. Conclusion: The FMEA analysis identified the steps in the treatment process presenting the highest risk. The introduction of enhanced QC initiatives mitigated the risk of some of these failure modes by decreasing their probability of occurrence and increasing their detectability. This analysis demonstrates the importance of well-designed QC policies, procedures and oversight in a Xoft eBx programme for treatment of superficial skin cancers. Unresolved high risk failure modes highlight the need for non-procedural quality initiatives such as improved planning software and more robust hardware interlock systems.

  8. Process-based quality management for clinical implementation of adaptive radiotherapy

    PubMed Central

    Noel, Camille E.; Santanam, Lakshmi; Parikh, Parag J.; Mutic, Sasa

    2014-01-01

    Purpose: Intensity-modulated adaptive radiotherapy (ART) has been the focus of considerable research and developmental work due to its potential therapeutic benefits. However, in light of its unique quality assurance (QA) challenges, no one has described a robust framework for its clinical implementation. In fact, recent position papers by ASTRO and AAPM have firmly endorsed pretreatment patient-specific IMRT QA, which limits the feasibility of online ART. The authors aim to address these obstacles by applying failure mode and effects analysis (FMEA) to identify high-priority errors and appropriate risk-mitigation strategies for clinical implementation of intensity-modulated ART. Methods: An experienced team of two clinical medical physicists, one clinical engineer, and one radiation oncologist was assembled to perform a standard FMEA for intensity-modulated ART. A set of 216 potential radiotherapy failures composed by the forthcoming AAPM task group 100 (TG-100) was used as the basis. Of the 216 failures, 127 were identified as most relevant to an ART scheme. Using the associated TG-100 FMEA values as a baseline, the team considered how the likeliness of occurrence (O), outcome severity (S), and likeliness of failure being undetected (D) would change for ART. New risk priority numbers (RPN) were calculated. Failures characterized by RPN ≥ 200 were identified as potentially critical. Results: FMEA revealed that ART RPN increased for 38% (n = 48/127) of potential failures, with 75% (n = 36/48) attributed to failures in the segmentation and treatment planning processes. Forty-three of 127 failures were identified as potentially critical. Risk-mitigation strategies include implementing a suite of quality control and decision support software, specialty QA software/hardware tools, and an increase in specially trained personnel. Conclusions: Results of the FMEA-based risk assessment demonstrate that intensity-modulated ART introduces different (but not necessarily more) risks than standard IMRT and may be safely implemented with the proper mitigations. PMID:25086527

  9. Process-based quality management for clinical implementation of adaptive radiotherapy.

    PubMed

    Noel, Camille E; Santanam, Lakshmi; Parikh, Parag J; Mutic, Sasa

    2014-08-01

    Intensity-modulated adaptive radiotherapy (ART) has been the focus of considerable research and developmental work due to its potential therapeutic benefits. However, in light of its unique quality assurance (QA) challenges, no one has described a robust framework for its clinical implementation. In fact, recent position papers by ASTRO and AAPM have firmly endorsed pretreatment patient-specific IMRT QA, which limits the feasibility of online ART. The authors aim to address these obstacles by applying failure mode and effects analysis (FMEA) to identify high-priority errors and appropriate risk-mitigation strategies for clinical implementation of intensity-modulated ART. An experienced team of two clinical medical physicists, one clinical engineer, and one radiation oncologist was assembled to perform a standard FMEA for intensity-modulated ART. A set of 216 potential radiotherapy failures composed by the forthcoming AAPM task group 100 (TG-100) was used as the basis. Of the 216 failures, 127 were identified as most relevant to an ART scheme. Using the associated TG-100 FMEA values as a baseline, the team considered how the likeliness of occurrence (O), outcome severity (S), and likeliness of failure being undetected (D) would change for ART. New risk priority numbers (RPN) were calculated. Failures characterized by RPN ≥ 200 were identified as potentially critical. FMEA revealed that ART RPN increased for 38% (n = 48/127) of potential failures, with 75% (n = 36/48) attributed to failures in the segmentation and treatment planning processes. Forty-three of 127 failures were identified as potentially critical. Risk-mitigation strategies include implementing a suite of quality control and decision support software, specialty QA software/hardware tools, and an increase in specially trained personnel. Results of the FMEA-based risk assessment demonstrate that intensity-modulated ART introduces different (but not necessarily more) risks than standard IMRT and may be safely implemented with the proper mitigations.

  10. Compensating the aberrations of actual optical systems by means of a nonaxisymmetric retouching of the surface.

    NASA Astrophysics Data System (ADS)

    Gan, M. A.; Ustinov, S. I.; Starkov, A. A.

    1993-08-01

    A theory, methods, and software are developed for the automated calculation of the retouching profile in order to compensate axisymmetric and nonaxisymmetric aberrations that are caused by errors in the fabrication of high-resolution optical systems. The retouching profile is calculated on the basis of interferograms recorded within the field of view of the objective. The software makes it possible to estimate the effectiveness of the retouching on the basis of optophysical image-quality criteria.

  11. Power and sample size for multivariate logistic modeling of unmatched case-control studies.

    PubMed

    Gail, Mitchell H; Haneuse, Sebastien

    2017-01-01

    Sample size calculations are needed to design and assess the feasibility of case-control studies. Although such calculations are readily available for simple case-control designs and univariate analyses, there is limited theory and software for multivariate unconditional logistic analysis of case-control data. Here we outline the theory needed to detect scalar exposure effects or scalar interactions while controlling for other covariates in logistic regression. Both analytical and simulation methods are presented, together with links to the corresponding software.
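
    A simulation approach of the kind the authors mention can be sketched in a few lines of Python. The fragment below (illustrative settings and data-generating model, not the authors' code) estimates the power to detect a scalar exposure effect while adjusting for one covariate:

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)

        def power(n_cases=300, n_controls=300, log_or=0.25, n_sim=500, alpha=0.05):
            hits = 0
            n = n_cases + n_controls
            for _ in range(n_sim):
                y = np.r_[np.ones(n_cases), np.zeros(n_controls)]
                z = rng.normal(size=n)                    # covariate to adjust for
                # Exposure shifted by case status (log OR) and mildly by z.
                x = rng.normal(loc=0.2 * z + log_or * y, size=n)
                X = sm.add_constant(np.column_stack([x, z]))
                fit = sm.Logit(y, X).fit(disp=0)
                if fit.pvalues[1] < alpha:                # Wald test on exposure
                    hits += 1
            return hits / n_sim

        print(f"Estimated power: {power():.2f}")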

  12. Panthere V2: Multipurpose Simulation Software for 3D Dose Rate Calculations

    NASA Astrophysics Data System (ADS)

    Penessot, Gaël; Bavoil, Éléonore; Wertz, Laurent; Malouch, Fadhel; Visonneau, Thierry; Dubost, Julien

    2017-09-01

    PANTHERE is multipurpose radiation protection software developed by EDF to calculate gamma dose rates in complex 3D environments. PANTHERE plays a key role in the EDF ALARA process, making it possible to predict dose rates and to organize and optimize operations in high-radiation environments. PANTHERE is also used for nuclear waste characterization, transport of nuclear materials, etc. It is used in most of the EDF engineering units, as well as by their design service providers and industrial partners.

  13. A code inspection process for security reviews

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garzoglio, Gabriele; /Fermilab

    2009-05-01

    In recent years, it has become more and more evident that software threat communities are taking an increasing interest in Grid infrastructures. To mitigate the security risk associated with the increased numbers of attacks, the Grid software development community needs to scale up effort to reduce software vulnerabilities. This can be achieved by introducing security review processes as a standard project management practice. The Grid Facilities Department of the Fermilab Computing Division has developed a code inspection process, tailored to reviewing security properties of software. The goal of the process is to identify technical risks associated with an application and their impact. This is achieved by focusing on the business needs of the application (what it does and protects), on understanding threats and exploit communities (what an exploiter gains), and on uncovering potential vulnerabilities (what defects can be exploited). The desired outcome of the process is an improvement of the quality of the software artifact and an enhanced understanding of possible mitigation strategies for residual risks. This paper describes the inspection process and lessons learned on applying it to Grid middleware.

  14. A code inspection process for security reviews

    NASA Astrophysics Data System (ADS)

    Garzoglio, Gabriele

    2010-04-01

    In recent years, it has become more and more evident that software threat communities are taking an increasing interest in Grid infrastructures. To mitigate the security risk associated with the increased numbers of attacks, the Grid software development community needs to scale up effort to reduce software vulnerabilities. This can be achieved by introducing security review processes as a standard project management practice. The Grid Facilities Department of the Fermilab Computing Division has developed a code inspection process, tailored to reviewing security properties of software. The goal of the process is to identify technical risks associated with an application and their impact. This is achieved by focusing on the business needs of the application (what it does and protects), on understanding threats and exploit communities (what an exploiter gains), and on uncovering potential vulnerabilities (what defects can be exploited). The desired outcome of the process is an improvement of the quality of the software artifact and an enhanced understanding of possible mitigation strategies for residual risks. This paper describes the inspection process and lessons learned on applying it to Grid middleware.

  15. The impact of organizational structure on flight software cost risk

    NASA Technical Reports Server (NTRS)

    Hihn, Jairus; Lum, Karen; Monson, Erik

    2004-01-01

    This paper summarizes the final results of a follow-up study that updated the estimated software effort growth for the projects that were still under development, and includes an evaluation of organizational roles versus observed cost risk for the missions in the original study, expanding the data set to thirteen missions.

  16. Securing PCs and Data in Libraries and Schools: A Handbook with Menuing, Anti-Virus, and Other Protective Software.

    ERIC Educational Resources Information Center

    Benson, Allen C.

    This handbook is designed to help readers identify and eliminate security risks, with sound recommendations and library-tested security software. Chapter 1 "Managing Your Facilities and Assessing Your Risks" addresses fundamental management responsibilities including planning for a secure system, organizing computer-related information, assessing…

  17. Vulnerability in Determining the Cost of Information System Project to Avoid Loses

    NASA Astrophysics Data System (ADS)

    Haryono, Kholid; Ikhsani, Zulfa Amalia

    2018-03-01

    Context: This study discusses the prioritization of the cost variables involved in software development projects. Objectives: To present existing costing models and the variables they involve, and to show how practitioners assess and prioritize each variable; for each variable, the risk incurred if it is ignored was also confirmed. Method: Two approaches were used. First, a systematic literature review was conducted to find the models and variables used to decide the cost of software development. Second, judgments about the importance and risk of each variable were confirmed with software developers. Result: About 54 variables were obtained from the 10 models discussed. The variables were categorized into 15 groups based on similarity of meaning, each group becoming a variable. Confirmation with practitioners on the levels of importance and risk showed that two variables are considered very important and high-risk if ignored: duration and effort. Conclusion: The relationship between the variable ratings from the literature study and the practitioners' confirmation helps software business actors consider project cost variables.

  18. A novel tool for user-friendly estimation of natural, diagnostic and professional radiation risk: Radio-Risk software.

    PubMed

    Carpeggiani, Clara; Paterni, Marco; Caramella, Davide; Vano, Eliseo; Semelka, Richard C; Picano, Eugenio

    2012-11-01

    Awareness of radiological risk is low among doctors and patients. An educational/decision tool that considers each patient's cumulative lifetime radiation exposure would facilitate provider-patient communication. The purpose of this work was to develop user-friendly software for simple estimation and communication of radiological risk to patients and doctors, as a part of the SUIT-Heart (Stop Useless Imaging Testing in Heart disease) Project of the Tuscany Region. We developed a novel software program (PC platform, Windows OS, fully downloadable at http://suit-heart.ifc.cnr.it) considering reference dose estimates from the American Heart Association Radiological Imaging 2009 guidelines and the UK Royal College of Radiology 2007 guidelines. Age- and gender-weighted cancer risks were derived from the Biological Effects of Ionising Radiation (BEIR) VII Committee report, 2006. With simple input functions (demographics, age, gender) the user selects, from a predetermined menu, variables relating to natural (e.g., airplane flights and geo-tracked background exposure), professional (e.g., cath lab workers) and medical (e.g., CT, cardiac scintigraphy, coronary stenting) sources. The program provides a simple numeric (cumulative effective dose in millisievert, mSv, and equivalent number of chest X-rays) and graphic (cumulative temporal trends of exposure, cancer cases out of 100 exposed persons) display. This simple software program allows straightforward estimation of cumulative dose (in multiples of chest X-rays) and risk (as extra percentage lifetime cancer risk), with simple numbers quantifying lifetime extra cancer risk. Pictorial display of radiation risk may be valuable for increasing radiological awareness in cardiologists. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  19. Software for Statistical Analysis of Weibull Distributions with Application to Gear Fatigue Data: User Manual with Verification

    NASA Technical Reports Server (NTRS)

    Krantz, Timothy L.

    2002-01-01

    The Weibull distribution has been widely adopted for the statistical description and inference of fatigue data. This document provides user instructions, examples, and verification for software to analyze gear fatigue test data. The software was developed presuming the data are adequately modeled using a two-parameter Weibull distribution. The calculations are based on likelihood methods, and the approach taken is valid for data that include type I censoring. The software was verified by reproducing results published by others.
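
    The likelihood setup described here can be sketched compactly. The Python fragment below fits a two-parameter Weibull model to synthetic, type I censored fatigue lives; it is an illustration of the method, not the NASA software:

        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(0)
        beta_true, eta_true, t_cens = 2.0, 1.0e7, 1.5e7   # shape, scale, test cutoff
        t = eta_true * rng.weibull(beta_true, size=40)    # synthetic fatigue lives
        failed = t < t_cens                               # suspensions run to cutoff
        t = np.minimum(t, t_cens)

        def negloglik(p):
            beta, eta = np.exp(p)              # log-parameterized for positivity
            z = (t / eta) ** beta
            # Failures contribute log pdf; censored units log survival = -z.
            ll = np.sum(np.log(beta / eta) + (beta - 1) * np.log(t[failed] / eta)
                        - z[failed])
            ll -= np.sum(z[~failed])
            return -ll

        res = minimize(negloglik, x0=np.log([1.0, t.mean()]), method="Nelder-Mead")
        beta_hat, eta_hat = np.exp(res.x)
        print(f"shape = {beta_hat:.2f}, scale = {eta_hat:.3e}")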

  20. Software for Statistical Analysis of Weibull Distributions with Application to Gear Fatigue Data: User Manual with Verification

    NASA Technical Reports Server (NTRS)

    Krantz, Timothy L.

    2002-01-01

    The Weibull distribution has been widely adopted for the statistical description and inference of fatigue data. This document provides user instructions, examples, and verification for software to analyze gear fatigue test data. The software was developed presuming the data are adequately modeled using a two-parameter Weibull distribution. The calculations are based on likelihood methods, and the approach taken is valid for data that include type I censoring. The software was verified by reproducing results published by others.

  1. PIV Data Validation Software Package

    NASA Technical Reports Server (NTRS)

    Blackshire, James L.

    1997-01-01

    A PIV data validation and post-processing software package was developed to provide semi-automated data validation and data reduction capabilities for Particle Image Velocimetry data sets. The software provides three primary capabilities including (1) removal of spurious vector data, (2) filtering, smoothing, and interpolating of PIV data, and (3) calculations of out-of-plane vorticity, ensemble statistics, and turbulence statistics information. The software runs on an IBM PC/AT host computer working either under Microsoft Windows 3.1 or Windows 95 operating systems.
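
    The out-of-plane vorticity calculation, for instance, reduces to a finite-difference curl on the regular PIV grid. A minimal Python sketch using a synthetic solid-body-rotation field (not the package's code):

        import numpy as np

        dx = dy = 1.0e-3                           # grid spacing in metres
        x, y = np.meshgrid(np.arange(32) * dx, np.arange(32) * dy)
        u, v = -y, x                               # solid-body rotation test field

        dvdx = np.gradient(v, dx, axis=1)          # dv/dx along columns
        dudy = np.gradient(u, dy, axis=0)          # du/dy along rows
        omega_z = dvdx - dudy                      # should equal 2.0 everywhere

        print(omega_z[16, 16])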

  2. TMT approach to observatory software development process

    NASA Astrophysics Data System (ADS)

    Buur, Hanne; Subramaniam, Annapurni; Gillies, Kim; Dumas, Christophe; Bhatia, Ravinder

    2016-07-01

    The purpose of the Observatory Software System (OSW) is to integrate all software and hardware components of the Thirty Meter Telescope (TMT) to enable observations and data capture; thus it is a complex software system that is defined by four principal software subsystems: Common Software (CSW), Executive Software (ESW), Data Management System (DMS) and Science Operations Support System (SOSS), all of which have interdependencies with the observatory control systems and data acquisition systems. Therefore, the software development process and plan must consider dependencies to other subsystems, manage architecture, interfaces and design, manage software scope and complexity, and standardize and optimize use of resources and tools. Additionally, the TMT Observatory Software will largely be developed in India through TMT's workshare relationship with the India TMT Coordination Centre (ITCC) and use of Indian software industry vendors, which adds complexity and challenges to the software development process, communication and coordination of activities and priorities as well as measuring performance and managing quality and risk. The software project management challenge for the TMT OSW is thus a multi-faceted technical, managerial, communications and interpersonal relations challenge. The approach TMT is using to manage this multifaceted challenge is a combination of establishing an effective geographically distributed software team (Integrated Product Team) with strong project management and technical leadership provided by the TMT Project Office (PO) and the ITCC partner to manage plans, process, performance, risk and quality, and to facilitate effective communications; establishing an effective cross-functional software management team composed of stakeholders, OSW leadership and ITCC leadership to manage dependencies and software release plans, technical complexities and change to approved interfaces, architecture, design and tool set, and to facilitate effective communications; adopting an agile-based software development process across the observatory to enable frequent software releases to help mitigate subsystem interdependencies; defining concise scope and work packages for each of the OSW subsystems to facilitate effective outsourcing of software deliverables to the ITCC partner, and to enable performance monitoring and risk management. At this stage, the architecture and high-level design of the software system has been established and reviewed. During construction each subsystem will have a final design phase with reviews, followed by implementation and testing. The results of the TMT approach to the Observatory Software development process will only be preliminary at the time of the submittal of this paper, but it is anticipated that the early results will be a favorable indication of progress.

  3. Distance education course on spatial multi-hazard risk assessment, using Open Source software

    NASA Astrophysics Data System (ADS)

    van Westen, C. J.; Frigerio, S.

    2009-04-01

    As part of the capacity building activities of the United Nations University - ITC School on Disaster Geo-Information Management (UNU-ITC DGIM), the International Institute for Geoinformation Science and Earth Observation (ITC) has developed a distance education course on the application of Geographic Information Systems for multi-hazard risk assessment. This course is designed for academic staff, as well as for professionals working in (non-)governmental organizations where knowledge of disaster risk management is essential. The course guides the participants through the entire process of risk assessment, on the basis of a case study of a city exposed to multiple hazards in a developing country. The course consists of eight modules, each with a guide book explaining the theoretical background and guiding the participants through spatial data requirements for risk assessment, hazard assessment procedures, generation of elements-at-risk databases, vulnerability assessment, qualitative and quantitative risk assessment methods, risk evaluation and risk reduction. Linked to the theory is a large set of exercises, with exercise descriptions, answer sheets, demos and GIS data. The exercises deal with four different types of hazards: earthquakes, flooding, technological hazards, and landslides. One important consideration in designing the course was that people from developing countries should not be prevented from using it by the financial burden of software acquisition. Therefore the aim was to use Open Source software as a basis. The GIS exercises are written for the ILWIS software. All exercises have also been integrated into a WebGIS, using the Open Source software CartoWeb (released under the GNU License), which is modular and customizable thanks to its object-oriented architecture and hierarchical structure (used to manage and organize every package of information for every step required in risk assessment). Switches for every component of the risk assessment course have been defined, and through various menus the user can set the options for every exercise. For every layer of information, tools for querying, printing, searching and surface analysis are implemented, allowing users to compare maps at different scales and to perform on-line interpretation.

  4. Mathemagical Computing: Order of Operations and New Software.

    ERIC Educational Resources Information Center

    Ecker, Michael W.

    1989-01-01

    Describes mathematical problems which occur when using the computer as a calculator. Considers errors in BASIC calculation and the order of mathematical operations. Identifies errors in spreadsheet and calculator programs. Comments on sorting programs and provides a source for Mathemagical Black Holes. (MVL)

  5. Risk-Significant Adverse Condition Awareness Strengthens Assurance of Fault Management Systems

    NASA Technical Reports Server (NTRS)

    Fitz, Rhonda

    2017-01-01

    As spaceflight systems increase in complexity, Fault Management (FM) systems are ranked high in risk-based assessment of software criticality, emphasizing the importance of establishing highly competent domain expertise to provide assurance. Adverse conditions (ACs) and specific vulnerabilities encountered by safety- and mission-critical software systems have been identified through efforts to reduce the risk posture of software-intensive NASA missions. Acknowledgement of potential off-nominal conditions and analysis to determine software system resiliency are important aspects of hazard analysis and FM. A key component of assuring FM is an assessment of how well software addresses susceptibility to failure through consideration of ACs. Focus on significant risk predicted through experienced analysis conducted at the NASA Independent Verification & Validation (IV&V) Program enables the scoping of effective assurance strategies with regard to overall asset protection of complex spaceflight as well as ground systems. Research efforts sponsored by NASA's Office of Safety and Mission Assurance (OSMA) defined terminology, categorized data fields, and designed a baseline repository that centralizes and compiles a comprehensive listing of ACs and correlated data relevant across many NASA missions. This prototype tool helps projects improve analysis by tracking ACs and allowing queries based on project, mission type, domain/component, causal fault, and other key characteristics. Vulnerability in off-nominal situations, architectural design weaknesses, and unexpected or undesirable system behaviors in reaction to faults are curtailed with the awareness of ACs and risk-significant scenarios modeled for analysts through this database. Integration within the Enterprise Architecture at NASA IV&V enables interfacing with other tools and datasets, technical support, and accessibility across the Agency. This paper discusses the development of an improved workflow process utilizing this database for adaptive, risk-informed FM assurance that critical software systems will safely and securely protect against faults and respond to ACs in order to achieve successful missions.

  6. Risk-Significant Adverse Condition Awareness Strengthens Assurance of Fault Management Systems

    NASA Technical Reports Server (NTRS)

    Fitz, Rhonda

    2017-01-01

    As spaceflight systems increase in complexity, Fault Management (FM) systems are ranked high in risk-based assessment of software criticality, emphasizing the importance of establishing highly competent domain expertise to provide assurance. Adverse conditions (ACs) and specific vulnerabilities encountered by safety- and mission-critical software systems have been identified through efforts to reduce the risk posture of software-intensive NASA missions. Acknowledgement of potential off-nominal conditions and analysis to determine software system resiliency are important aspects of hazard analysis and FM. A key component of assuring FM is an assessment of how well software addresses susceptibility to failure through consideration of ACs. Focus on significant risk predicted through experienced analysis conducted at the NASA Independent Verification & Validation (IV&V) Program enables the scoping of effective assurance strategies with regard to overall asset protection of complex spaceflight as well as ground systems. Research efforts sponsored by NASA's Office of Safety and Mission Assurance defined terminology, categorized data fields, and designed a baseline repository that centralizes and compiles a comprehensive listing of ACs and correlated data relevant across many NASA missions. This prototype tool helps projects improve analysis by tracking ACs and allowing queries based on project, mission type, domain/component, causal fault, and other key characteristics. Vulnerability in off-nominal situations, architectural design weaknesses, and unexpected or undesirable system behaviors in reaction to faults are curtailed with the awareness of ACs and risk-significant scenarios modeled for analysts through this database. Integration within the Enterprise Architecture at NASA IV&V enables interfacing with other tools and datasets, technical support, and accessibility across the Agency. This paper discusses the development of an improved workflow process utilizing this database for adaptive, risk-informed FM assurance that critical software systems will safely and securely protect against faults and respond to ACs in order to achieve successful missions.

  7. System Risk Balancing Profiles: Software Component

    NASA Technical Reports Server (NTRS)

    Kelly, John C.; Sigal, Burton C.; Gindorf, Tom

    2000-01-01

    The Software QA / V&V guide will be reviewed and updated based on feedback from NASA organizations and others with a vested interest in this area. Hardware, EEE Parts, Reliability, and Systems Safety are a sample of the future guides that will be developed. Cost Estimates, Lessons Learned, Probability of Failure and PACTS (Prevention, Avoidance, Control or Test) are needed to provide a more complete risk management strategy. This approach to risk management is designed to help balance the resources and program content for risk reduction for NASA's changing environment.

  8. [Does fertility treatment increase the risk of breast cancer? Current knowledge and meta-analysis].

    PubMed

    Gabriele, V; Benabu, J-C; Ohl, J; Youssef, C Akladios; Mathelin, C

    2017-05-01

    The objective of this review was to assess the level of breast cancer risk in women exposed to ovulation-inducing therapy (OIT). The 25 selected studies were extracted from the PUBMED database from January 2000 to March 2016 with the following keywords: "fertility agents", "infertility treatments", "clomiphene citrate", "buserelin", "ovarian stimulation", "assisted reproductive technology" and "breast cancer". Our meta-analysis was performed using the Review Manager software (Cochrane Collaboration, 2014). The results were calculated by type of OIT, as well as globally. The analysis of these published epidemiological studies suggests that exposure to OIT is not a breast cancer risk factor, although the results are contradictory. Two studies showed a significantly increased risk of breast cancer in a population of infertile women, while two others found a significant decrease of this risk. The twenty others did not show any impact of OIT on this risk. Our meta-analysis of 20 selected studies did not identify a significant association between exposure to OIT and breast cancer risk (relative risk = 0.96, 95% CI: 0.81-1.14 for cohort studies; odds ratio = 0.94, 95% CI: 0.81-1.10 for case-control studies). Exposure to OIT is not an identified risk factor for breast cancer, and these women should be given a reassuring message about the possible risk of OIT-related breast cancer. Exposure to OIT is therefore not an indication for increased breast surveillance. Copyright © 2017 Elsevier Masson SAS. All rights reserved.
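
    The pooling step of such a meta-analysis is a standard inverse-variance average on the log scale. The Python sketch below shows the fixed-effect version with invented study values, not the data analyzed in this review:

        import numpy as np

        rr = np.array([0.90, 1.10, 0.85, 1.05])   # per-study relative risks
        se = np.array([0.10, 0.15, 0.20, 0.12])   # standard errors of log(RR)

        w = 1.0 / se**2                            # inverse-variance weights
        pooled = np.sum(w * np.log(rr)) / np.sum(w)
        se_pooled = np.sqrt(1.0 / np.sum(w))
        lo, hi = np.exp(pooled - 1.96 * se_pooled), np.exp(pooled + 1.96 * se_pooled)
        print(f"pooled RR = {np.exp(pooled):.2f} (95% CI {lo:.2f}-{hi:.2f})")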

  9. Economic Evaluation of the Information Security Levels Achieved by Electric Energy Providers in North Arctic Region

    NASA Astrophysics Data System (ADS)

    Sushko, O. P.; Kaznin, A. A.; Babkin, A. V.; Bogdanov, D. A.

    2017-10-01

    The study we are conducting involves the analysis of the information security levels achieved by energy providers operating in the North Arctic Region. We look into whether the energy providers' current information security levels meet reliability standards and determine what further actions may be needed to upgrade information security in the context of the digital transformation that the world community is undergoing. When developing information security systems for electric energy providers, or selecting protection means for them, we are guided by the fact that the assets to be protected are process technologies. While information security risk can be assessed using different methods, evaluating the economic damage from these risks is a difficult task. The most probable and most harmful risks identified when evaluating the electric energy providers' information security will be used as variables. To carry out the evaluation, it is necessary to calculate the costs of eliminating the identified risks. The final stage of the study will involve the development of an operation algorithm for the business information protection security system of a North Arctic Region energy provider: a set of information security services, and security software and hardware.

  10. Genetic structure and conservation of Mountain Lions in the South-Brazilian Atlantic Rain Forest.

    PubMed

    Castilho, Camila S; Marins-Sá, Luiz G; Benedet, Rodrigo C; Freitas, Thales R O

    2012-01-01

    The Brazilian Atlantic Rain Forest, one of the most endangered ecosystems worldwide, is also among the most important biodiversity hotspots. Through intensive logging, the initial area has been reduced to around 12% of its original size. In this study we investigated the genetic variability and structure of the mountain lion, Puma concolor. Using 18 microsatellite loci, we analyzed evidence of allele dropout, null alleles and stuttering, and calculated the number of alleles/locus, PIC, observed and expected heterozygosity, linkage disequilibrium, Hardy-Weinberg equilibrium, F(IS), effective population size and genetic structure (MICROCHECKER, CERVUS, GENEPOP, FSTAT, ARLEQUIN, ONESAMP, LDNe, PCAGEN, GENECLASS software); we also determined whether there was evidence of a bottleneck (HYBRIDLAB, BOTTLENECK software) that might influence the future viability of the population in south Brazil. In total, 106 alleles were identified, with the number of alleles/locus ranging from 2 to 11. Mean observed heterozygosity, mean number of alleles and polymorphism information content were 0.609, 5.89, and 0.6255, respectively. This population presented evidence of a recent bottleneck and loss of genetic variation. Persistent regional poaching further increases the risk of extinction.
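
    Expected heterozygosity and PIC, two of the statistics listed above, follow from the allele frequencies alone. A short Python illustration using the textbook formulas (Botstein et al., 1980); the frequencies are invented:

        import numpy as np

        p = np.array([0.40, 0.30, 0.20, 0.10])     # allele frequencies at one locus

        he = 1.0 - np.sum(p**2)                    # expected heterozygosity
        pic = he - sum(2 * (p[i] * p[j])**2        # polymorphism information content
                       for i in range(len(p)) for j in range(i + 1, len(p)))
        print(f"He = {he:.3f}, PIC = {pic:.3f}")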

  11. Distinct role of the Fas rs1800682 and FasL rs763110 polymorphisms in determining the risk of breast cancer among Han Chinese females.

    PubMed

    Wang, Meng; Wang, Zheng; Wang, Xi-Jing; Jin, Tian-Bo; Dai, Zhi-Ming; Kang, Hua-Feng; Guan, Hai-Tao; Ma, Xiao-Bin; Liu, Xing-Han; Dai, Zhi-Jun

    2016-01-01

    In recent years, studies have demonstrated that polymorphisms in the promoters of Fas and FasL are significantly associated with breast cancer risk. However, the results of these studies have been inconsistent. This case-control study was performed to explore the associations between the Fas rs1800682 and FasL rs763110 polymorphisms and breast cancer. A hospital-based case-control study of 560 Han Chinese females with breast cancer (583 controls) was conducted. The MassARRAY system was used to search for a possible association between the disease risk and the two single nucleotide polymorphisms, Fas rs1800682 and FasL rs763110. Statistical analyses were performed using SNPStats software to conduct Pearson's chi-square tests in five different genetic models. Odds ratios (ORs) and 95% confidence intervals (CIs) were calculated after adjustment for age and body mass index. PHASE v2.1 software was used to reconstruct all common haplotypes. A statistically significant association was found between Fas rs1800682 and increased breast cancer risk (AG vs AA: OR =1.37, 95% CI =1.06-1.78; AA+AG vs GG: OR =1.32, 95% CI =1.04-1.66), and it was also found that the FasL rs763110 polymorphism may decrease the risk. Stratified analyses demonstrated that the rs763110 polymorphism was associated with lower breast cancer risk among postmenopausal females (heterozygote model: OR =0.69, 95% CI =0.49-0.97; dominant model: OR =0.70, 95% CI =0.51-0.96). The T allele of rs763110 was also associated with a decreased risk of lymph node metastasis (allele model: OR =0.75, 95% CI =0.57-0.97) and an increased risk of the breast cancer being human epidermal growth factor receptor 2 positive (allele model: OR =1.37, 95% CI =1.03-1.18). Moreover, haplotype analysis showed that Ars1800682Trs763110 was associated with a statistically significantly lower risk of breast cancer (OR =0.70, 95% CI =0.53-0.91). These data suggest that the presence of Fas rs1800682 is an important risk factor for breast cancer, whereas FasL rs763110 may exert a protective effect against the onset of breast cancer.
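
    For reference, an odds ratio of the kind reported here, with its Wald 95% confidence interval, is computed from a 2x2 genotype-by-status table as follows (the counts are invented for illustration):

        import math

        a, b = 210, 180      # cases / controls carrying the risk genotype
        c, d = 350, 403      # cases / controls without it

        odds_ratio = (a * d) / (b * c)
        se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)    # SE of log(OR)
        lo = math.exp(math.log(odds_ratio) - 1.96 * se)
        hi = math.exp(math.log(odds_ratio) + 1.96 * se)
        print(f"OR = {odds_ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")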

  12. Caries risk assessment in schoolchildren - a form based on Cariogram® software

    PubMed Central

    CABRAL, Renata Nunes; HILGERT, Leandro Augusto; FABER, Jorge; LEAL, Soraya Coelho

    2014-01-01

    Identifying caries risk factors is an important measure which contributes to a better understanding of the cariogenic profile of the patient. The Cariogram® software provides this analysis, and protocols simplifying the method have been suggested. Objectives: The aim of this study was to determine whether a newly developed Caries Risk Assessment (CRA) form based on the Cariogram® software could classify schoolchildren according to their caries risk, and to evaluate relationships between caries risk and the variables in the form. Material and Methods: 150 schoolchildren aged 5 to 7 years were included in this survey. Caries prevalence was obtained according to the International Caries Detection and Assessment System (ICDAS) II. Information for filling in the form based on the Cariogram® was collected clinically and from questionnaires sent to parents. Linear regression and a forward stepwise multiple regression model were applied to correlate the variables included in the form with the caries risk. Results: Caries prevalence in the primary dentition was 98.6% when enamel and dentine carious lesions were included, and 77.3% when only dentine lesions were considered. Eighty-six percent of the children were classified as at moderate caries risk. The forward stepwise multiple regression model was significant (R2=0.904; p<0.00001), showing that the most significant factors influencing caries risk were caries experience, oral hygiene, frequency of food consumption, sugar consumption and fluoride sources. Conclusion: The use of the form based on the Cariogram® software enabled classification of the schoolchildren into low, moderate and high caries risk groups. Caries experience, oral hygiene, frequency of food consumption, sugar consumption and fluoride sources were the variables shown to be highly correlated with caries risk. PMID:25466473

  13. A parameterization of nuclear track profiles in CR-39 detector

    NASA Astrophysics Data System (ADS)

    Azooz, A. A.; Al-Nia'emi, S. H.; Al-Jubbori, M. A.

    2012-11-01

    In this work, the empirical parameterization describing the alpha particles’ track depth in CR-39 detectors is extended to describe longitudinal track profiles against etching time for protons and alpha particles. MATLAB-based software is developed for this purpose. The software calculates and plots the depth, diameter, range, residual range, saturation time, and etch rate versus etching time. The software predictions are compared with other experimental data and with results of calculations using the original software, TRACK_TEST, developed for alpha track calculations. The software related to this work is freely downloadable and performs calculations for protons in addition to alpha particles. Program summary: Program title: CR39. Catalog identifier: AENA_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AENA_v1_0.html. Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland. Licensing provisions: Copyright (c) 2011, Aasim Azooz; redistribution and use in source and binary forms, with or without modification, are permitted under a standard BSD-style license, and the software is provided “as is” without express or implied warranty. No. of lines in distributed program, including test data, etc.: 15598. No. of bytes in distributed program, including test data, etc.: 3933244. Distribution format: tar.gz. Programming language: MATLAB. Computer: any desktop or laptop. Operating system: Windows 1998 or above (with MATLAB R13 or above installed). RAM: 512 Megabytes or higher. Classification: 17.5. Nature of problem: A new semispherical parameterization of charged-particle tracks in CR-39 SSNTDs was carried out in a previous paper. This parameterization is developed here into MATLAB-based software to calculate the track length and track profile for any proton or alpha-particle energy or etching time. This software is intended to compete with the TRACK_TEST [1] and TRACK_VISION [2] software currently in use by those working in the field of SSNTDs. Solution method: By fitting experimental track lengths of protons and alpha particles for various energies and etching times to a new semispherical formula with four free fitting parameters, the best set of energy-independent parameters was found. These parameters are built into the software, which solves the set of equations to calculate the track depth; the track etching rate as a function of both time and residual range for particles at normal and oblique incidence; the longitudinal track profile at both normal and oblique incidence; and the three-dimensional track profile at normal incidence. Running time: 1-8 s on a Pentium 4 2 GHz CPU with 3 GB of RAM, depending on the etching time value. References: [1] ADWT_v1_0 TRACK_TEST: Computer program TRACK_TEST for calculating parameters and plotting profiles for etch pits in nuclear track materials. D. Nikezic, K.N. Yu, Comput. Phys. Commun. 174 (2006) 160. [2] AEAF_v1_0 TRACK_VISION: Computer program TRACK_VISION for simulating optical appearance of etched tracks in CR-39 nuclear track detectors. D. Nikezic, K.N. Yu, Comput. Phys. Commun. 178 (2008) 591.

  14. A Novel Method for Mining SaaS Software Tag via Community Detection in Software Services Network

    NASA Astrophysics Data System (ADS)

    Qin, Li; Li, Bing; Pan, Wei-Feng; Peng, Tao

    The number of online software services based on the SaaS paradigm is increasing. However, users usually find it hard to get exactly the software services they need. At present, tags are widely used to annotate specific software services and to facilitate searching for them. Currently these tags are arbitrary and ambiguous, since most of them are generated manually by service developers. This paper proposes a method for mining tags from the help documents of software services. By extracting terms from the help documents and calculating the similarity between the terms, we construct a software similarity network where nodes represent software services, edges denote the similarity relationship between software services, and the weights of the edges are the similarity degrees. A hierarchical clustering algorithm is used for community detection in this software similarity network. At the final stage, tags are mined for each of the communities and stored as an ontology.
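
    A minimal Python sketch of this pipeline follows, using invented toy documents and an arbitrary cut threshold; the paper's actual term-extraction and clustering details are not reproduced:

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from scipy.spatial.distance import squareform

        docs = ["invoice billing payment", "billing payment tax",
                "photo image resize", "image crop resize"]
        vocab = sorted({w for d in docs for w in d.split()})
        tf = np.array([[d.split().count(w) for w in vocab] for d in docs], float)
        tf /= np.linalg.norm(tf, axis=1, keepdims=True)

        sim = tf @ tf.T                          # cosine similarity = edge weights
        dist = 1.0 - sim                         # turn similarity into distance
        np.fill_diagonal(dist, 0.0)
        Z = linkage(squareform(dist, checks=False), method="average")
        communities = fcluster(Z, t=0.5, criterion="distance")
        print(communities)                       # e.g. [1 1 2 2]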

  15. Factors associated with perception of risk of contracting HIV among secondary school female learners in Mbonge subdivision of rural Cameroon

    PubMed Central

    Tarkang, Elvis Enowbeyang

    2014-01-01

    Introduction: Since learners in secondary schools fall within the age group hardest hit by HIV/AIDS, these learners are likely to be at high risk of contracting HIV. However, little has been explored on the perception of the risk of contracting HIV among secondary school learners in Cameroon. This study aimed at examining the perception of the risk of contracting HIV among secondary school female learners in Mbonge subdivision of rural Cameroon, using the Health Belief Model (HBM) as framework. Methods: A quantitative, correlational design was adopted, using a self-administered questionnaire to collect data from 210 female learners selected through a disproportional, stratified, simple random sampling technique from three participating senior secondary schools. Statistics were calculated using the SPSS version 20 software program. Results: Only 39.4% of the respondents perceived themselves to be at high risk of contracting HIV, although the majority (54.0%) were sexually active. Multinomial logistic regression analyses showed that sexual risk behaviours (p=0.000) and the Integrated Value Mapping (IVM) of the perception components of the HBM were the most significant factors associated with the perception of risk of contracting HIV at the level p<0.05. Conclusion: The findings of this study can play an instrumental role in the development of effective preventive and interventional messages for adolescents in Cameroon. PMID:25309659

  16. Risk Profile of Hepatitis E Virus from Pigs or Pork in Canada.

    PubMed

    Wilhelm, B; Fazil, A; Rajić, A; Houde, A; McEwen, S A

    2017-12-01

    The role and importance of pigs and pork as sources of zoonotic hepatitis E virus (HEV) has been debated in Canada and abroad for over 20 years. To further investigate this question, we compiled data to populate a risk profile for HEV in pigs or pork in Canada. We organized the risk profile (RP) using the headings prescribed for a foodborne microbial risk assessment and used research synthesis methods and inputs wherever possible in populating the fields of this RP. A scoping review of potential public health risks of HEV, and two Canadian field surveys sampling finisher pigs, and retail pork chops and pork livers, provided inputs to inform this RP. We calculated summary estimates of prevalence using the Comprehensive Meta-analysis 3 software, employing the method of moments. Overall, we found the incidence of sporadic locally acquired hepatitis E in Canada, compiled from peer-reviewed literature or from diagnosis at the National Microbiology Laboratory to be low relative to other non-endemic countries. In contrast, we found the prevalence of detection of HEV RNA in pigs and retail pork livers, to be comparable to that reported in the USA and Europe. We drafted risk categories (high/medium/low) for acquiring clinical hepatitis E from exposure to pigs or pork in Canada and hypothesize that the proportion of the Canadian population at high risk from either exposure is relatively small. © 2016 Crown copyright.

  17. Using Problem-Based Learning Software with At-Risk Students: A Case Study

    ERIC Educational Resources Information Center

    Samsonov, Pavel; Pedersen, Susan; Hill, Christine L.

    2006-01-01

    In an extension of research examining student-centered pedagogy, the present case study examined how at-risk students used Alien Rescue, a problem-based learning (PBL) software program for middle school science. Twenty-nine participants were observed and interviewed over the twelve class days in which they were engaged in Alien Rescue. Students'…

  18. User's Manual for the National Water-Quality Assessment Program Invertebrate Data Analysis System (IDAS) Software: Version 3

    USGS Publications Warehouse

    Cuffney, Thomas F.

    2003-01-01

    The Invertebrate Data Analysis System (IDAS) software provides an accurate, consistent, and efficient mechanism for analyzing invertebrate data collected as part of the National Water-Quality Assessment Program and stored in the Biological Transactional Database (Bio-TDB). The IDAS software is a stand-alone program for personal computers that run Microsoft (MS) Windows®. It allows users to read data downloaded from Bio-TDB and stored either as MS Excel® or MS Access® files. The program consists of five modules. The Edit Data module allows the user to subset, combine, delete, and summarize community data. The Data Preparation module allows the user to select the type(s) of sample(s) to process, calculate densities, delete taxa based on laboratory processing notes, combine lifestages or keep them separate, select a lowest taxonomic level for analysis, delete rare taxa, and resolve taxonomic ambiguities. The Calculate Community Metrics module allows the user to calculate over 130 community metrics, including metrics based on organism tolerances and functional feeding groups. The Calculate Diversities and Similarities module allows the user to calculate nine diversity and eight similarity indices. The Data Export module allows the user to export data to other software packages and produce tables of community data that can be imported into spreadsheet and word-processing programs. Though the IDAS program was developed to process invertebrate data downloaded from USGS databases, it will work with other data sets that are converted to the USGS (Bio-TDB) format. Consequently, the data manipulation, analysis, and export procedures provided by the IDAS program can be used by anyone involved in using benthic macroinvertebrates in applied or basic research.
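
    Two of the index families IDAS reports have simple textbook forms. The Python sketch below computes Shannon diversity for one sample and Jaccard similarity between two samples; the taxon counts are invented:

        import numpy as np

        counts_a = np.array([12, 5, 3, 0, 1])    # individuals per taxon, site A
        counts_b = np.array([10, 0, 4, 2, 0])    # individuals per taxon, site B

        p = counts_a[counts_a > 0] / counts_a.sum()
        shannon = -np.sum(p * np.log(p))          # Shannon diversity H' for site A

        pres_a, pres_b = counts_a > 0, counts_b > 0
        shared = np.sum(pres_a & pres_b)          # taxa present at both sites
        jaccard = shared / np.sum(pres_a | pres_b)
        print(f"H' = {shannon:.3f}, Jaccard = {jaccard:.2f}")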

  19. Recent and planned developments in the CARI program.

    DOT National Transportation Integrated Search

    2013-04-01

    CARI-6 is the sixth major release of galactic cosmic radiation (GCR) dose calculation software developed by the U.S. Federal Aviation Administration (FAA). The software is of benefit to the FAA and the public as a tool used by scientists investigatin...

  20. Software technology insertion: A study of success factors

    NASA Technical Reports Server (NTRS)

    Lydon, Tom

    1990-01-01

    Managing software development in large organizations has become increasingly difficult due to increasing technical complexity, stricter government standards, a shortage of experienced software engineers, competitive pressure for improved productivity and quality, the need to co-develop hardware and software together, and the rapid changes in both hardware and software technology. The 'software factory' approach to software development minimizes risks while maximizing productivity and quality through standardization, automation, and training. However, in practice, this approach is relatively inflexible when adopting new software technologies. The methods that a large multi-project software engineering organization can use to increase the likelihood of successful software technology insertion (STI), especially in a standardized engineering environment, are described.

  1. Development of a phantom to test fully automated breast density software - A work in progress.

    PubMed

    Waade, G G; Hofvind, S; Thompson, J D; Highnam, R; Hogg, P

    2017-02-01

    Mammographic density (MD) is an independent risk factor for breast cancer and may have a future role in stratified screening. Automated software can estimate MD, but the relationship between breast thickness reduction and MD is not fully understood. Our aim is to develop a deformable breast phantom to assess automated density software and the impact of breast thickness reduction on MD. Several different configurations of polyvinyl alcohol (PVAL) phantoms were created. Three methods were used to estimate their density. Raw mammographic image data were processed using Volpara to estimate volumetric breast density (VBD%); Hounsfield units (HU) were measured on CT images; and physical density (g/cm³) was calculated as mass divided by volume. Phantom volume versus contact area and phantom volume versus phantom thickness were compared to values for real breasts. Volpara recognized all deformable phantoms as female breasts. However, reducing the phantom thickness caused a change in phantom density, and the phantoms were not able to tolerate the level of compression and thickness reduction experienced by female breasts during mammography. Our results are promising, as all phantoms yielded valid data for automated breast density measurement. Further work should be conducted on PVAL and other materials to produce deformable phantoms that mimic female breast structure and density and can be compressed to the same level as female breasts. We are the first group to have produced deformable phantoms that are recognized as breasts by the Volpara software. Copyright © 2016 The College of Radiographers. All rights reserved.

  2. Caries Risk Assessment of 12-13-year-old Government and Private School Going Children of Mysore City Using Cariogram: A Comparative Study.

    PubMed

    Naik, Sandhya P; Moyin, Shabna; Patel, Bhakti; Warad, Lata Prabhu; Punathil, Sameer; Sudeep, C B

    2018-01-01

    The aim of this study was to assess the caries risk of 12-13-year-old government and private school going children of Mysore city using the Cariogram. A cross-sectional examination was carried out on a total of 104 government and private schoolchildren aged 12-13 years. Ten factors from the Cariogram software (D. Bratthall, computer software, Malmo, Sweden) were completed from the study participants' records. The percentage of "chances of avoiding new lesions" (caries risk) among government and private school participants was obtained from the Cariogram, and the participants were classified into five risk groups. Statistical analysis was performed using the Statistical Package for the Social Sciences (version 17.0, SPSS Inc., Chicago IL, USA). Findings revealed a slight difference in caries risk between government and private schoolchildren: according to the Cariogram, government schoolchildren showed a 48% risk of caries development and a 52% chance of avoiding dental caries, whereas private schoolchildren showed a 51% risk of caries development and a 49% chance of avoiding dental caries. The decayed, missing, and filled teeth component and the mutans streptococci and Lactobacillus counts were slightly higher in private schoolchildren than in government schoolchildren. The private schoolchildren had less favorable values than government schoolchildren for most of the caries-related factors. The Cariogram can be a modest and reliable tool for caries prediction, aiding in the identification of different risk groups in a community so that appropriate preventive strategies can be provided to prevent new carious lesion formation.

  3. Risk analysis of technological hazards: Simulation of scenarios and application of a local vulnerability index.

    PubMed

    Sanchez, E Y; Represa, S; Mellado, D; Balbi, K B; Acquesta, A D; Colman Lerner, J E; Porta, A A

    2018-06-15

    The potential impact of a technological accident can be assessed by risk estimation, so that the latent or potential condition can be recognized and mitigated. In this work we propose a methodology to estimate the risk of technological hazards, focused on two components. The first is the processing of meteorological databases to define the most probable and most conservative scenario of study; the second is the application of a local social vulnerability index to classify the population. In this case study, the risk was estimated for a hypothetical release of liquefied ammonia at a meat-packing plant in the city of La Plata, Argentina. The method consists of integrating the toxic threat zone simulated with the ALOHA software and the layer of sociodemographic classification of the affected population. The results show the areas associated with the highest risks of exposure to ammonia, which are worth addressing for disaster prevention in the region. Advantageously, this systemic approach is methodologically flexible, as it can be applied in various scenarios based on the available information on both the exposed population and the local meteorology. Furthermore, this methodology streamlines the processing of the input data and the associated calculations. Copyright © 2018 Elsevier B.V. All rights reserved.
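
    The core overlay step can be sketched as follows, assuming a polygonal threat footprint and point-located population blocks; the geometry and vulnerability classes are invented for illustration, and shapely stands in for the study's GIS tooling.

    ```python
    # Illustrative sketch (hypothetical data): intersecting a simulated toxic
    # threat zone with population blocks carrying a social vulnerability class.
    from shapely.geometry import Polygon, Point

    threat_zone = Polygon([(0, 0), (4, 0), (4, 3), (0, 3)])  # e.g. a dispersion footprint

    # (location, vulnerability class) pairs for population blocks -- made up
    blocks = [(Point(1, 1), "high"), (Point(2, 2), "low"), (Point(9, 9), "high")]

    exposed = [(pt, vul) for pt, vul in blocks if pt.within(threat_zone)]
    for pt, vul in exposed:
        print(f"block at ({pt.x}, {pt.y}) exposed, vulnerability: {vul}")
    ```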

  4. Optimization Testbed Cometboards Extended into Stochastic Domain

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Pai, Shantaram S.; Coroneos, Rula M.

    2010-01-01

    COMparative Evaluation Testbed of Optimization and Analysis Routines for the Design of Structures (CometBoards) is a multidisciplinary design optimization software package. It was originally developed for deterministic calculation and has now been extended into the stochastic domain for structural design problems. For deterministic problems, CometBoards is introduced through its subproblem solution strategy as well as the approximation concept in optimization. In the stochastic domain, a design is formulated as a function of the risk, or reliability. The optimum solution, including the weight of a structure, is also obtained as a function of reliability. Weight versus reliability traces out an inverted-S-shaped graph. The center of the graph corresponds to 50 percent probability of success, or one failure in two samples. A heavy design with weight approaching infinity could be produced for a near-zero rate of failure, corresponding to a reliability of unity. Weight can be reduced to a small value for the most failure-prone design, with a compromised reliability approaching zero. The stochastic design optimization (SDO) capability for an industrial problem was obtained by combining three codes: the MSC/Nastran code was the deterministic analysis tool; the fast probabilistic integrator (the FPI module of the NESSUS software) was the probabilistic calculator; and CometBoards became the optimizer. The SDO capability requires a finite element structural model, a material model, a load model, and a design model. The stochastic optimization concept is illustrated with an academic example and a real-life airframe component made of metallic and composite materials.
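
    A toy model, not the CometBoards formulation, shows why the weight-reliability curve takes this shape: if the required safety margin scales with the standard-normal quantile of the target reliability, weight grows without bound as reliability approaches one, shrinks at the other extreme, and equals the nominal weight at the 50 percent point.

    ```python
    # Toy illustration (assumed model, not CometBoards): weight as a function
    # of target reliability via the standard-normal quantile (reliability index).
    from scipy.stats import norm

    def weight_for_reliability(r, nominal_weight=100.0, cov=0.1):
        """Hypothetical weight needed to achieve reliability r (0 < r < 1)."""
        beta = norm.ppf(r)                  # reliability index
        return nominal_weight * (1.0 + cov * beta)

    for r in (0.01, 0.50, 0.99, 0.9999):
        print(f"reliability {r}: weight {weight_for_reliability(r):.2f}")
    ```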

  5. ToxPredictor: a Toxicity Estimation Software Tool

    EPA Science Inventory

    The Computational Toxicology Team within the National Risk Management Research Laboratory has developed a software tool that will allow the user to estimate the toxicity for a variety of endpoints (such as acute aquatic toxicity). The software tool is coded in Java and can be ac...

  6. Editorial: Challenges and solutions in GW calculations for complex systems

    NASA Astrophysics Data System (ADS)

    Giustino, F.; Umari, P.; Rubio, A.

    2012-09-01

    We report key advances in the area of GW calculations, review the available software implementations and define standardization criteria to render the comparison between GW calculations from different codes meaningful, and identify future major challenges in the area of quasiparticle calculations. This Topical Issue should be a reference point for further developments in the field.

  7. An Investigation into whether Student Use of Graphics Calculators Matches Their Teacher's Expectations

    ERIC Educational Resources Information Center

    Graham, E.; Headlam, C.; Sharp, J.; Watson, B.

    2008-01-01

    This research examines students' use of graphics calculators and investigates the extent to which the students' use meets their teacher's aim when using graphics calculators in the classroom. The teacher's use of her graphics calculator was analysed over a week using Key Record software. The teacher was questioned about her aims and expectations…

  8. Product-oriented Software Certification Process for Software Synthesis

    NASA Technical Reports Server (NTRS)

    Nelson, Stacy; Fischer, Bernd; Denney, Ewen; Schumann, Johann; Richardson, Julian; Oh, Phil

    2004-01-01

    The purpose of this document is to propose a product-oriented software certification process to facilitate use of software synthesis and formal methods. Why is such a process needed? Currently, software is tested until deemed bug-free rather than proving that certain software properties exist. This approach has worked well in most cases, but unfortunately, deaths still occur due to software failure. Using formal methods (techniques from logic and discrete mathematics like set theory, automata theory and formal logic as opposed to continuous mathematics like calculus) and software synthesis, it is possible to reduce this risk by proving certain software properties. Additionally, software synthesis makes it possible to automate some phases of the traditional software development life cycle resulting in a more streamlined and accurate development process.

  9. Simulating flow around scaled model of a hypersonic vehicle in wind tunnel

    NASA Astrophysics Data System (ADS)

    Markova, T. V.; Aksenov, A. A.; Zhluktov, S. V.; Savitsky, D. V.; Gavrilov, A. D.; Son, E. E.; Prokhorov, A. N.

    2016-11-01

    A prospective hypersonic HEXAFLY aircraft is considered in this paper. In order to obtain the aerodynamic characteristics of a new design of the aircraft, experiments with a scaled model were carried out in a wind tunnel under different conditions. The runs were performed at different angles of attack, with and without hydrogen combustion in the scaled propulsion engine. However, the measured physical quantities do not provide all the information about the flowfield. Numerical simulation can complement the experimental data as well as reduce the number of wind tunnel experiments. Besides that, reliable CFD software can be used to calculate the aerodynamic characteristics of any possible design of the full-scale aircraft under different operating conditions. The reliability of the numerical predictions must be confirmed in a verification study of the software. The present work is aimed at numerical investigation of the flowfield around and inside the scaled model of the HEXAFLY-CIAM module under wind tunnel conditions. A cold run (without combustion) was selected for this study. The calculations were performed in the FlowVision CFD software, and the flow characteristics were compared against the available experimental data. The verification study confirms the capability of the FlowVision CFD software to calculate the flows discussed.

  10. Summary of paper: Area navigation implementation for a microcomputer-based Loran-C receiver

    NASA Technical Reports Server (NTRS)

    Oguri, Fujiko

    1987-01-01

    The development of an area navigation program and the implementation of this software on a microcomputer-based Loran-C receiver to provide high-quality, practical area navigation information for general aviation are described. This software provides range and bearing angle to a selected waypoint, cross-track error, course deviation indication (CDI), ground speed, and estimated time of arrival at the waypoint. The range/bearing calculation, using an elliptical Earth model, provides very good accuracy; the error does not exceed 0.012 nm in range or 0.09 degree in bearing for ranges up to 530 nm. Alpha-beta filtering is applied to reduce the random noise in the Loran-C raw data and in the ground speed calculation. Owing to the alpha-beta filtering, the ground speed calculation is stable for constant-speed or low-acceleration flight. The execution time of this software is approximately 0.2 second. Flight testing was done with a prototype Loran-C front-end receiver, with the Loran-C area navigation software demonstrating the ability to provide navigation for the pilot to any point in the Loran-C coverage area in true area navigation fashion, without the line-of-sight and range restrictions typical of VOR area navigation.
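
    A minimal alpha-beta filter sketch, assuming fixed gains and a uniform sample period; the gain values are illustrative, not those of the receiver software.

    ```python
    # Minimal alpha-beta filter sketch with assumed, fixed gains.
    def alpha_beta_filter(measurements, dt=1.0, alpha=0.85, beta=0.005):
        """Smooth noisy position measurements; returns (positions, velocities)."""
        x, v = measurements[0], 0.0    # initial state
        xs, vs = [], []
        for z in measurements[1:]:
            x_pred = x + v * dt        # predict
            r = z - x_pred             # residual (innovation)
            x = x_pred + alpha * r     # correct position
            v = v + (beta / dt) * r    # correct velocity (ground-speed analogue)
            xs.append(x)
            vs.append(v)
        return xs, vs

    positions, speeds = alpha_beta_filter([0.0, 1.1, 1.9, 3.2, 4.0, 5.1])
    print(speeds[-1])  # filtered speed estimate
    ```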

  11. Turbo FRMAC 2011

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fulton, John; Gallagher, Linda K.; Whitener, Dustin

    The Turbo FRMAC (TF) software automates the calculations described in volumes 1-3 of "The Federal Manual for Assessing Environmental Data During a Radiological Emergency" (2010 version), automating the process of assessing radiological data during a Federal radiological emergency. The manual upon which the software is based is unclassified and freely available on the Internet. TF takes values generated by field samples or computer dispersion models and assesses the data in a way that is meaningful to a decision maker at a radiological emergency: do radiation values exceed city, state, or federal limits; should the crops be destroyed or can they be utilized; do residents need to be evacuated or sheltered in place, or should another action be taken? The software also uses formulas developed by the EPA, FDA, and other federal agencies to generate field-observable values specific to the radiological event that can be used to determine where regulatory limit values are exceeded. In addition to these calculations, TF calculates values that indicate how long an emergency worker can work in the contaminated area during a radiological emergency, the dose received from drinking contaminated water or milk, the dose from eating contaminated food, and the dose expected downwind or upwind of a given field sample, along with a significant number of other similar radiological health values.
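
    One such stay-time calculation reduces to a dose limit divided by a dose rate; the sketch below uses placeholder values, not FRMAC guidance figures.

    ```python
    # Hedged sketch: allowable stay time = dose limit / measured dose rate.
    # The limit and rate are placeholders, not FRMAC guidance values.
    def stay_time_hours(dose_limit_msv: float, dose_rate_msv_per_h: float) -> float:
        if dose_rate_msv_per_h <= 0:
            return float("inf")  # no measurable dose rate -> no time limit
        return dose_limit_msv / dose_rate_msv_per_h

    print(stay_time_hours(dose_limit_msv=50.0, dose_rate_msv_per_h=2.5))  # 20.0 h
    ```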

  12. An Integrated Fuel Depletion Calculator for Fuel Cycle Options Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schneider, Erich; Scopatz, Anthony

    2016-04-25

    Bright-lite is reactor modeling software developed at the University of Texas at Austin to expand upon the work done with the Bright [1] reactor modeling software. Originally, Bright-lite was designed to function as standalone reactor modeling software. However, this aim was refocused to couple Bright-lite with the Cyclus fuel cycle simulator [2], making it a module for the fuel cycle simulator.

  13. CheMentor Software System by H. A. Peoples

    NASA Astrophysics Data System (ADS)

    Reid, Brian P.

    1997-09-01

    CheMentor Software System H. A. Peoples. Computerized Learning Enhancements: http://www.ecis.com/~clehap; email: clehap@ecis.com; 1996 - 1997. CheMentor is a series of software packages for introductory-level chemistry, which includes Practice Items (I), Stoichiometry (I), Calculating Chemical Formulae, and the CheMentor Toolkit. The first three packages provide practice problems for students and various types of help to solve them; the Toolkit includes "calculators" for determining chemical quantities as well as the Practice Items (I) set of problems. The set of software packages is designed so that each individual product acts as a module of a common CheMentor program. As the name CheMentor implies, the software is designed as a "mentor" for students learning introductory chemistry concepts and problems. The typical use of the software would be by individual students (or perhaps small groups) as an adjunct to lectures. CheMentor is a HyperCard application and the modules are HyperCard stacks. The requirements to run the packages include a Macintosh computer with at least 1 MB of RAM, a hard drive with several MB of available space depending upon the packages selected (10 MB were required for all the packages reviewed here), and the Mac operating system 6.0.5 or later.

  14. Cardiovascular Disease Risk Score: Results from the Filipino-American Women Cardiovascular Study.

    PubMed

    Ancheta, Irma B; Battie, Cynthia A; Volgman, Annabelle S; Ancheta, Christine V; Palaniappan, Latha

    2017-02-01

    Although cardiovascular disease (CVD) is a leading cause of morbidity and mortality among Filipino-Americans, conventional CVD risk calculators may not be accurate for this population. CVD risk scores of a group of Filipino-American women (FAW) were measured using the major risk calculators, and the sensitivity of the various calculators to obesity was determined. This is a cross-sectional descriptive study that enrolled 40-65-year-old FAW (n = 236) during a community-based health screening study. Ten-year CVD risk was calculated using the Framingham Risk Score (FRS), Reynolds Risk Score (RRS), and Atherosclerotic Cardiovascular Disease (ASCVD) calculators. Thirty-year FRS and lifetime ASCVD risks were also determined. Levels of predicted CVD risk varied as a function of the calculator. The 10-year ASCVD calculator classified 12% of participants with ≥10% risk, but the 10-year FRS and RRS calculators classified all participants with ≤10% risk. The 30-year "hard" lipid and BMI FRS calculators classified 32 and 43% of participants, respectively, with high (≥20%) risk, while 95% of participants were classified with ≥20% risk by the lifetime ASCVD calculator. The percentage of participants with elevated CVD risk increased as a function of waist circumference for most risk score calculators. Differences in risk score as a function of the calculator indicate the need for outcome studies in this population. Increased waist circumference was associated with increased CVD risk scores, underscoring the need for obesity control as a primary prevention of CVD in FAW.
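
    The threshold classifications reported above amount to computing the share of a cohort at or above a cut-off; here is a sketch on synthetic scores (the thresholds follow the abstract, the scores do not).

    ```python
    # Illustrative sketch (synthetic risk scores): the same cohort is classified
    # differently depending on the calculator and threshold used.
    def percent_at_or_above(scores, threshold):
        return 100.0 * sum(s >= threshold for s in scores) / len(scores)

    ten_year_ascvd = [4, 6, 11, 9, 13, 7, 5, 8]        # hypothetical % risks
    thirty_year_frs = [15, 25, 18, 32, 22, 12, 28, 19]

    print(percent_at_or_above(ten_year_ascvd, 10))   # share with >=10% risk
    print(percent_at_or_above(thirty_year_frs, 20))  # share with >=20% risk
    ```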

  15. Verification and Validation in a Rapid Software Development Process

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Easterbrook, Steve M.

    1997-01-01

    The high cost of software production is driving development organizations to adopt more automated design and analysis methods such as rapid prototyping, computer-aided software engineering (CASE) tools, and high-level code generators. Even developers of safety-critical software systems have adopted many of these new methods while striving to achieve high levels of quality and reliability. While these new methods may enhance productivity and quality in many cases, we examine some of the risks involved in the use of new methods in safety-critical contexts. We examine a case study involving the use of a CASE tool that automatically generates code from high-level system designs. We show that while high-level testing of the system structure is highly desirable, significant risks exist in the automatically generated code and in re-validating releases of the generated code after subsequent design changes. We identify these risks and suggest process improvements that retain the advantages of rapid, automated development methods within the quality and reliability contexts of safety-critical projects.

  16. Software for occupational health and safety risk analysis based on a fuzzy model.

    PubMed

    Stefanovic, Miladin; Tadic, Danijela; Djapan, Marko; Macuzic, Ivan

    2012-01-01

    Risk and safety management are very important issues in healthcare systems, which are complex systems with many entities, hazards and uncertainties. In such an environment, it is very hard to introduce a system for evaluating and simulating significant hazards. In this paper, we analyze different types of hazards in healthcare systems and introduce a new fuzzy model for evaluating and ranking hazards. Finally, we present a software solution, based on the suggested fuzzy model, for evaluating and monitoring risk.

  17. Approaches in highly parameterized inversion: TSPROC, a general time-series processor to assist in model calibration and result summarization

    USGS Publications Warehouse

    Westenbroek, Stephen M.; Doherty, John; Walker, John F.; Kelson, Victor A.; Hunt, Randall J.; Cera, Timothy B.

    2012-01-01

    The TSPROC (Time Series PROCessor) computer software uses a simple scripting language to process and analyze time series. It was developed primarily to assist in the calibration of environmental models. The software is designed to perform calculations on time-series data commonly associated with surface-water models, including calculation of flow volumes, transformation by means of basic arithmetic operations, and generation of seasonal and annual statistics and hydrologic indices. TSPROC can also be used to generate some of the key input files required to perform parameter optimization by means of the PEST (Parameter ESTimation) computer software. Through the use of TSPROC, the objective function for use in the model-calibration process can be focused on specific components of a hydrograph.
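
    The kind of reduction a TSPROC script expresses can be sketched with pandas on synthetic data; the flow series and the volume conversion are assumptions for illustration, not TSPROC syntax.

    ```python
    # Sketch of TSPROC-style time-series reductions using pandas instead of
    # TSPROC's scripting language: seasonal mean flows and annual flow volumes
    # from a synthetic daily streamflow series.
    import numpy as np
    import pandas as pd

    days = pd.date_range("2010-01-01", "2011-12-31", freq="D")
    flow = pd.Series(10 + 5 * np.sin(2 * np.pi * days.dayofyear / 365.25), index=days)

    seasonal_mean = flow.groupby(flow.index.quarter).mean()       # seasonal statistic
    annual_volume = flow.groupby(flow.index.year).sum() * 86400   # m^3, if flow is m^3/s

    print(seasonal_mean)
    print(annual_volume)
    ```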

  18. STEFFY-software to calculate nuclide-specific total counting efficiency in well-type γ-ray detectors.

    PubMed

    Pommé, S

    2012-09-01

    A software package is presented to calculate the total counting efficiency for the decay of radionuclides in a well-type γ-ray detector. It is specifically applied to primary standardisation of activity by means of 4πγ-counting with a NaI(Tl) well-type scintillation detector. As an alternative to Monte Carlo simulations, the software combines good accuracy with superior speed and ease-of-use. It is also well suited to investigate uncertainties associated with the 4πγ-counting method for a variety of radionuclides and detector dimensions. In this paper, the underlying analytical models for the radioactive decay and subsequent counting efficiency of the emitted radiation in the detector are summarised. Copyright © 2012 Elsevier Ltd. All rights reserved.
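
    As a simplified stand-in for the analytical models summarized in the paper (not STEFFY's actual formulation): if each photon emitted in a decay is detected independently with probability ε_i, the total counting efficiency is 1 − Π(1 − ε_i).

    ```python
    # Simplified stand-in (independence assumption, not STEFFY's model):
    # total counting efficiency for a decay emitting several photons.
    from math import prod

    def total_counting_efficiency(photon_efficiencies):
        return 1.0 - prod(1.0 - e for e in photon_efficiencies)

    # Hypothetical per-photon efficiencies in a NaI(Tl) well detector
    print(total_counting_efficiency([0.95, 0.90]))  # -> 0.995
    ```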

  19. Obtaining Valid Safety Data for Software Safety Measurement and Process Improvement

    NASA Technical Reports Server (NTRS)

    Basili, Victor r.; Zelkowitz, Marvin V.; Layman, Lucas; Dangle, Kathleen; Diep, Madeline

    2010-01-01

    We report on a preliminary case study to examine software safety risk in the early design phase of the NASA Constellation spaceflight program. Our goal is to provide NASA quality assurance managers with information regarding the ongoing state of software safety across the program. We examined 154 hazard reports created during the preliminary design phase of three major flight hardware systems within the Constellation program. Our purpose was two-fold: 1) to quantify the relative importance of software with respect to system safety; and 2) to identify potential risks due to incorrect application of the safety process, deficiencies in the safety process, or the lack of a defined process. One early outcome of this work was to show that there are structural deficiencies in collecting valid safety data that make software safety different from hardware safety. In our conclusions we present some of these deficiencies.

  20. Software for Analyzing Laminar-to-Turbulent Flow Transitions

    NASA Technical Reports Server (NTRS)

    Chang, Chau-Lyan

    2004-01-01

    Software assurance is the planned and systematic set of activities that ensures that software processes and products conform to requirements, standards, and procedures. Examples of such activities include code inspections, unit tests, design reviews, performance analyses, and construction of traceability matrices. In practice, software development projects have only limited resources (e.g., schedule, budget, and availability of personnel) to cover the entire development effort, of which assurance is but a part. Projects must therefore select judiciously from among the possible assurance activities. At its heart, this can be viewed as an optimization problem: namely, to determine the allocation of limited resources (time, money, and personnel) to minimize risk or, alternatively, to minimize the resources needed to reduce risk to an acceptable level. The end result of the work reported here is a means to optimize the quality-assurance processes used in developing software, achieved by combining two prior programs in an innovative manner.
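
    Under an assumed additive risk-reduction model, the selection problem can be sketched as a greedy budgeted choice; the activity names, costs, and benefits below are invented for illustration, not taken from the work described.

    ```python
    # Sketch of the optimization framing (assumed additive model): greedily fund
    # the assurance activities with the best risk reduction per unit cost.
    def plan_assurance(activities, budget):
        chosen, spent = [], 0.0
        for name, cost, risk_reduction in sorted(
                activities, key=lambda a: a[2] / a[1], reverse=True):
            if spent + cost <= budget:
                chosen.append(name)
                spent += cost
        return chosen

    activities = [("code inspection", 3.0, 9.0),
                  ("unit tests", 5.0, 10.0),
                  ("design review", 2.0, 5.0),
                  ("traceability matrix", 4.0, 4.0)]
    print(plan_assurance(activities, budget=8.0))
    ```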

  1. Software Products - Naval Oceanography Portal

    Science.gov Websites

    Standards Of Fundamental Astronomy (SOFA) Libraries: available as Fortran, C, or Python source code. Current version: 3.1.

  2. Preliminary results of 3D dose calculations with MCNP-4B code from a SPECT image.

    PubMed

    Rodríguez Gual, M; Lima, F F; Sospedra Alfonso, R; González González, J; Calderón Marín, C

    2004-01-01

    Interface software was developed to generate the input file for the Monte Carlo MCNP-4B code from a medical image in Interfile format version 3.3. The software was tested using a spherical phantom of tomography slices with a known cumulated activity distribution in Interfile format, generated with the IMAGAMMA medical image processing system. The 3D dose calculation obtained with the Monte Carlo MCNP-4B code was compared with the voxel S factor method. The results show a relative error between the two methods of less than 1%.

  3. Develop Direct Geo-referencing System Based on Open Source Software and Hardware Platform

    NASA Astrophysics Data System (ADS)

    Liu, H. S.; Liao, H. M.

    2015-08-01

    Direct geo-referencing systems use remote sensing technology to quickly capture images, GPS tracks, and camera positions. These data allow the construction of large volumes of images with geographic coordinates, so that users can take measurements directly on the images. For positioning to be calculated properly, all sensor signals must be synchronized. Traditional aerial photography uses a Position and Orientation System (POS) to integrate images, coordinates and camera positions; however, such systems are very expensive, and users cannot use the results immediately because the position information is not embedded in the images. For reasons of economy and efficiency, this study aims to develop a direct geo-referencing system based on an open-source software and hardware platform. After using an Arduino microcontroller board to integrate the signals, positioning is calculated with the open-source OpenCV library. Finally, the open-source panorama browser Panini and all other components are integrated with the open-source GIS software Quantum GIS, so that a complete data processing system can be constructed.

  4. 76 FR 28819 - NUREG/CR-XXXX, Development of Quantitative Software Reliability Models for Digital Protection...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-18

    ... NUCLEAR REGULATORY COMMISSION [NRC-2011-0109] NUREG/CR-XXXX, Development of Quantitative Software..., ``Development of Quantitative Software Reliability Models for Digital Protection Systems of Nuclear Power Plants... of Risk Analysis, Office of Nuclear Regulatory Research, U.S. Nuclear Regulatory Commission...

  5. The Consumer Juggernaut: Web-Based and Mobile Applications as Innovation Pioneer

    NASA Astrophysics Data System (ADS)

    Messerschmitt, David G.

    As happened previously in electronics, software targeted at consumers is increasingly the focus of investment and innovation. Some of the areas where it is leading are animated interfaces, treating users as a community, audio and video information, software as a service, agile software development, and the integration of business models with software design. As a risk-taking and experimental market, and as a source of ideas, consumer software can benefit other areas of applications software. The influence of consumer software can be magnified by research into the internal organizations and processes of the innovative firms at its foundation.

  6. The pyPHaz software, an interactive tool to analyze and visualize results from probabilistic hazard assessments

    NASA Astrophysics Data System (ADS)

    Tonini, Roberto; Selva, Jacopo; Costa, Antonio; Sandri, Laura

    2014-05-01

    Probabilistic Hazard Assessment (PHA) is becoming an essential tool for risk mitigation policies, since it quantifies the hazard due to hazardous phenomena and, unlike the deterministic approach, accounts for both aleatory and epistemic uncertainties. On the other hand, one of the main disadvantages of PHA methods is that their results are not easy to understand and interpret by people who are not specialists in probabilistic tools. For scientists, this raises the issue of providing tools that can be easily used and understood by decision makers (i.e., risk managers or local authorities). The work presented here addresses the problem of simplifying the transfer between scientific knowledge and land protection policies by providing an interface between scientists, who produce PHA results, and decision makers, who use PHA results for risk analyses. In this framework we present pyPHaz, an open tool developed and designed to visualize and analyze PHA results due to one or more phenomena affecting a specific area of interest. The software has been fully developed with the free and open-source Python programming language and several Python-based libraries and modules. The pyPHaz tool can visualize the Hazard Curves (HC) calculated in a selected target area, together with different levels of uncertainty (mean and percentiles), on maps that can be interactively created and modified by the user through a dedicated Graphical User Interface (GUI). Moreover, the tool can be used to compare the results of different PHA models and to merge them into ensemble models. The pyPHaz software stores and accesses all data through a MySQL database and can read as input the XML-based standard file formats defined in the frame of GEM (Global Earthquake Model). This format model is easy to extend to other kinds of hazard, as shown in the example applications of the pyPHaz potentialities, which focus on a Probabilistic Volcanic Hazard Assessment (PVHA) for tephra dispersal and fallout applied to the municipality of Naples.
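
    The mean-and-percentiles summary that such a tool displays can be sketched on a synthetic ensemble of hazard curves; the curve shapes and percentile choices below are illustrative assumptions, not pyPHaz internals.

    ```python
    # Sketch on synthetic data: mean and percentile curves from an ensemble of
    # hazard curves (exceedance probability versus intensity).
    import numpy as np

    intensities = np.linspace(0.0, 1.0, 11)           # hazard intensity measure
    rng = np.random.default_rng(0)
    ensemble = np.exp(-intensities * rng.uniform(2, 6, size=(100, 1)))  # 100 model curves

    mean_curve = ensemble.mean(axis=0)
    p16, p84 = np.percentile(ensemble, [16, 84], axis=0)  # epistemic spread

    print(mean_curve[5], p16[5], p84[5])  # summary at one intensity level
    ```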

  7. Autonomy Software: V&V Challenges and Characteristics

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Visser, Willem

    2006-01-01

    The successful operation of unmanned air vehicles requires software with a high degree of autonomy: only if high-level functions can be carried out without human control and intervention can complex missions in a changing and potentially unknown environment be carried out successfully. Autonomy software is highly mission- and safety-critical: failures caused by flaws in the software can not only jeopardize the mission but also endanger human life (e.g., the crash of a UAV in a densely populated area). Due to its large size, high complexity, and use of specialized algorithms (planners, constraint solvers, etc.), autonomy software poses specific challenges for its verification, validation, and certification. We have carried out a survey among researchers and scientists at NASA to study these issues. In this paper, we present the major results of this study, discussing the broad spectrum of notions and characteristics of autonomy software and its challenges for design and development. A main focus of this survey was to evaluate verification and validation (V&V) issues and challenges compared to the development of "traditional" safety-critical software. We discuss important issues in V&V of autonomous software and advanced V&V tools which can help to mitigate software risks. Results of this survey will help to identify and understand safety concerns in autonomy software and will lead to improved strategies for mitigation of these risks.

  8. An integrated multi-scale risk analysis procedure for pluvial flooding

    NASA Astrophysics Data System (ADS)

    Tader, Andreas; Mergili, Martin; Jäger, Stefan; Glade, Thomas; Neuhold, Clemens; Stiefelmeyer, Heinz

    2016-04-01

    Mitigation of or adaptation to the negative impacts of natural processes on society requires a better understanding of the spatio-temporal distribution not only of the processes themselves, but also of the elements at risk. Information on their values, exposures and vulnerabilities towards the expected impact magnitudes/intensities of the relevant processes is needed. GIS-supported methods are particularly useful for integrated spatio-temporal analyses of natural processes and their potential consequences. Pluvial floods are of particular concern in many parts of Austria. The overall aim of the present study is to calculate the hazards emanating from pluvial floods, to determine the exposure of given elements at risk, to determine their vulnerabilities towards given pluvial flood hazards, and to analyze potential consequences in terms of monetary losses. The whole approach builds on data available at a national scale. We introduce an integrated, multi-scale risk analysis procedure for pluvial flooding. Focusing on the risk to buildings, we first exemplify this procedure with a well-documented event in the city of Graz (Austria), in order to highlight the associated potentials and limitations. Second, we attempt to predict the possible consequences of pluvial flooding triggered by rainfall events with recurrence intervals of 30, 100 and 300 years. (i) We compute spatially distributed inundation depths using the software FloodArea; infiltration capacity and surface roughness are estimated from the land cover units given by the official cadastre, and various assumptions are tested with regard to the inflow to the urban sewer system. (ii) Based on the inundation depths and the official building register, we employ a set of rules and functions to deduce the exposure, vulnerability and risk for each building; a risk indicator for each building, expressed as the expected damage associated with a given event, is derived by combining the building value and its vulnerability. (iii) The object-based hazards, exposures, vulnerabilities and risks can be scaled to any spatial unit desired. For this purpose we have developed an automated workflow built on the Python programming language in combination with ArcGIS and the R statistical software. This enables us to easily adapt the resulting risk indication maps to different zoom levels, to build statistics for various types of units, to react flexibly to the needs of the end users, and to account for the availability of reference data for validation. In the present study, we scale the results to the level of postal code zones. The evaluation of the results is based on loss reports from an insurance company and on photographs and videos obtained from various sources. We show that the suggested workflow has the potential to reproduce the documented damages at the level of postal code zones. However, the results are very sensitive to the input parameters and model assumptions, and a robust back-calculation even of well-documented events remains a major challenge. Ultimately, we aim at integrating the presented procedure into a workflow for generating risk indication maps for pluvial flooding throughout the entire territory of Austria.
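
    A hedged sketch of step (ii) for a single building, using a generic stage-damage curve rather than the rules and functions of the study itself:

    ```python
    # Hedged sketch: one building's risk indicator from inundation depth and
    # building value. The vulnerability function is a generic assumption.
    def vulnerability(depth_m: float) -> float:
        """Damage fraction in [0, 1] as a simple function of water depth."""
        if depth_m <= 0:
            return 0.0
        return min(1.0, 0.3 * depth_m)  # assumed: +30% damage per metre, capped

    def building_risk(depth_m: float, building_value_eur: float) -> float:
        """Expected damage for a given event = value x vulnerability."""
        return building_value_eur * vulnerability(depth_m)

    print(building_risk(depth_m=0.8, building_value_eur=250_000))  # -> 60000.0
    ```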

  9. Polymorphisms in interleukins 17A and 17F genes and periodontitis: results from a meta-analysis.

    PubMed

    da Silva, Felipe Rodolfo Pereira; Pessoa, Larissa Dos Santos; Vasconcelos, Any Carolina Cardoso Guimarães; de Aquino Lima, Weberson; Alves, Even Herlany Pereira; Vasconcelos, Daniel Fernando Pereira

    2017-12-01

    Polymorphisms in inflammatory genes such as those for interleukins 17A and 17F are associated with the risk of development of periodontitis, although the results remain contradictory. Hence, the aim of this study was to perform a meta-analysis focusing on two polymorphisms (rs2275913 and rs763780) in the interleukin 17A and 17F genes, respectively, in both chronic (CP) and aggressive periodontitis (AgP). A literature review was performed in several databases for studies published before 25 September 2016. The meta-analysis was carried out with the Review Manager statistical software (version 5.2), with odds ratio (OR) calculation and funnel plots (P < 0.05) for heterogeneity, and with the Comprehensive Meta-Analysis software (version 3.3.070) for the assessment of publication bias. Seven articles with 1540 participants composed the results, in which the mutant allele in the rs2275913 polymorphism did not present a significant association with the risk of CP or AgP (OR 1.56, 95% CI 0.77-3.15, P = 0.21; OR 1.12, 95% CI 0.05-23.44, P = 0.94, respectively), nor was the mutant allele in rs763780 associated with the risk of CP (OR 1.19, 95% CI 0.80-1.76, P = 0.39) or AgP (OR 1.07, 95% CI 0.63-1.84, P = 0.79). No publication bias was observed by Egger's or Begg's tests in any allelic evaluation. This meta-analysis showed a non-significant association between the polymorphisms rs2275913 and rs763780 in the interleukin 17A and 17F genes and chronic and aggressive periodontitis in the allelic evaluation.
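
    As a generic illustration of the pooling behind such results (fixed-effect, inverse-variance weighting on the log-OR scale), here is a sketch; the per-study ORs and confidence intervals are invented, not the study data.

    ```python
    # Generic fixed-effect meta-analysis sketch: inverse-variance weighting of
    # log odds ratios, with standard errors back-calculated from 95% CIs.
    import math

    def pooled_or(study_or_ci):
        """study_or_ci: list of (OR, lower 95% CI, upper 95% CI) tuples."""
        num = den = 0.0
        for or_, lo, hi in study_or_ci:
            log_or = math.log(or_)
            se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE from CI width
            w = 1.0 / se**2                                   # inverse-variance weight
            num += w * log_or
            den += w
        return math.exp(num / den)

    print(pooled_or([(1.4, 0.9, 2.2), (1.1, 0.7, 1.7), (1.8, 1.0, 3.2)]))
    ```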

  10. The burden of respiratory syncytial virus (RSV) associated acute lower respiratory infections in children with Down syndrome: A systematic review and meta-analysis.

    PubMed

    Chan, Markus; Park, John J; Shi, Ting; Martinón-Torres, Federico; Bont, Louis; Nair, Harish

    2017-12-01

    Acute lower respiratory tract infections (ALRIs) caused by respiratory syncytial virus (RSV) are a leading cause of hospitalization in infants. Numerous risk factors have been identified in the aetiology of severe RSV-associated ALRI necessitating hospitalisation, including prematurity and congenital heart disease. Down syndrome (DS), a common genetic disorder associated with congenital and dysmorphic features, has recently been identified as an independent risk factor for RSV-associated ALRI requiring hospitalisation; however, the disease burden of RSV-associated ALRI in this population has not yet been established. Similarly, the impact of DS as an independent risk factor has not yet been quantified. We therefore aimed to estimate the incidence of admissions in children with DS and, by comparing this with unaffected children, to quantify the risk of DS independent of other risk factors. A systematic review of the literature published between 1995 and March 1, 2017 was performed to quantify the incidence of hospitalisation due to RSV-associated ALRI in children with DS. Meta-analyses were performed on the extracted data using STATA statistical software, and hospitalisation rates for children with and without DS under the age of 2 were calculated. Five articles were ultimately deemed eligible for analysis, which was limited to children under the age of 2 years. We calculated the hospitalisation rate for children with DS in this age group to be 117.6 per 1000 child-years (95% CI 67.4-205.2), versus a rate of 15.2 per 1000 child-years (95% CI 8.3-27.6) in unaffected children. This indicates that DS contributes a 6.8-fold (95% CI 5.5-8.4) increase in the relative risk of hospitalisation for RSV-associated ALRI. Though limited by the small number of articles, this review found sufficient evidence to conclude that DS is a significant independent risk factor for the development of severe RSV-associated ALRI requiring hospitalisation. Further studies are needed to define the impact of DS in conjunction with other comorbidities on the risk of severe RSV infection. Determining the benefits of immunoprophylaxis or future vaccines against RSV in this at-risk population is warranted.

  11. Risk factors for re-bleeding of aneurysmal subarachnoid hemorrhage: meta-analysis of observational studies.

    PubMed

    Alfotih, Gobran Taha Ahmed; Li, FangCheng; Xu, XinKe; Zhang, ShangYi

    2014-01-01

    The mortality of re-bleeding following aneurysmal subarachnoid hemorrhage is high, and surviving patients often have poor clinical condition and worse outcomes than patients with a single bleed. In this study, we performed an updated systematic review and meta-analysis to determine the most common risk factors for re-bleeding in this patient population, with the goal of providing neurologists, neurosurgeons, and neuro-interventionalists with a simple and fast method to evaluate the re-bleeding risk for aneurysmal subarachnoid hemorrhage. We conducted a thorough meta-analysis of the risk factors associated with re-bleeding or re-rupture of intracranial aneurysms in cases published between 2000 and 2013. The pooled mean difference was calculated for the continuous variable (age), and the pooled odds ratio (OR) was calculated for categorical factors. If heterogeneity was significant (p < 0.05), a random-effects model was applied; otherwise, a fixed-effects model was used. Testing for pooled effects and statistical significance for each potential risk factor was performed using the Review Manager software. Our literature search identified 174 articles. Of these, only seven retrospective studies met the inclusion criteria. These seven studies comprised 2470 patients, 283 of whom had aneurysmal re-bleeding, giving a weighted average re-bleeding rate of 11.3% (95% confidence interval [CI]: 10.1-12.6). In this population, sex (OR 1.46; 95% CI: 1.11-1.92), high systolic blood pressure [SBP] (OR 2.52; 95% CI: 1.40-4.53), aneurysm size (OR 3.00; 95% CI: 2.06-4.37), clinical condition (Hunt & Hess) (OR 4.94; 95% CI: 2.29-10.68), and Fisher grade (OR 2.29; 95% CI: 1.45-3.61) were statistically significant risk factors for re-bleeding. Sex, high SBP, high Fisher grade, aneurysm size larger than 10 mm, and poor clinical condition were independent risk factors for aneurysmal re-bleeding. The importance of early aneurysm intervention and careful consideration of patient risk factors should be emphasized to eliminate the risk of re-bleeding and poor outcome. Copyright © 2014 Polish Neurological Society. Published by Elsevier Urban & Partner Sp. z o.o. All rights reserved.

  12. [Morbidity rate of obesity in children in Ukraine. Overweight as a noncontagious disease risk factor].

    PubMed

    Заболотна, Ірина Е

    The upsurge in the prevalence of obesity and overweight, which in the majority of cases traces back to childhood, is a risk factor for the most common noncontagious diseases in adults. The aim was to analyze the prevalence of obesity in children in Ukraine and to conduct a pilot study of the medical condition of overweight children. Official state statistics on the prevalence of obesity in children, together with screening data on anthropometric characteristics, arterial tension levels, reduced physical performance and the medical condition of children (boys: 50, girls: 90, average age: 15.1 ± 0.1 years), were used in the research. Calculations were performed with the Statistica v. 6.0 software. Over the past few decades, the morbidity rate of obesity in children in Ukraine has greatly increased, especially in the 15-17 year age class. Underdiagnosis of obesity in children is a consequence of the inadequacy of the existing system of preventive care and monitoring of disease risk factors. Children with a body mass index (BMI) above normal have a 5.2-fold risk of reduced physical performance (odds ratio, OR = 5.2, 95% CI: 1.7-10.6). Such children have a higher risk of developing diseases of the respiratory system (OR = 8.1; 95% CI: 3.9-13.6) and allergic dermatitis (OR = 7.7; 95% CI: 3.7-12.9). The odds ratio for arterial hypertension in such children is 3.46 ± 0.3 (95% CI: 2.0-5.9). According to prediction calculations, the trend in the prevalence of obesity in children in Ukraine is unfavorable. The introduction of measures aimed at finding children with obesity, their registration, and the monitoring of patients' health with due regard to disease risk factors at the primary care level would improve the prevention of obesity and of the progression of alimentary diseases.

  13. [Non-muscle-invasive bladder cancer: Information transfer from the clinic to the doctor's office : Results of a questionnaire study and presentation of a software solution].

    PubMed

    Lebentrau, S; May, M; Weckermann, D; Speck, T; Wick, A-K; Mathew, M; Schostak, M

    2017-02-01

    The adjuvant treatment of non-muscle-invasive bladder cancer (NMIBC) is based on the individual risk profile (RP) and its sufficient transfer from the clinic to the doctor's office. The objectives of our study were to verify the importance and degree of transfer of the RP and of the recommendation for risk-adapted adjuvant treatment (RAAT) in patients with NMIBC, and to develop appropriate tools for this purpose if necessary. An email-based survey distributed to urologists in Brandenburg, Berlin, Bavaria and Lower Saxony explored these questions. In addition, a tool for risk stratification and information transfer for patients with NMIBC was developed and validated. Of a total of 134 questionnaires analyzed, 55 were from clinic urologists (CUs) and 79 from office-based urologists (AUs). Although 9 out of 10 urologists considered the RP important, only 29% of CUs and 24% of AUs (p = 0.553) confirmed that the RP was always mentioned in medical reports. The recommendation for RAAT was confirmed by 62% of CUs and 20% of AUs (p < 0.001), and a recommendation for RAAT in the medical report was requested by 86% of AUs. The risk calculator presented here (to our knowledge the first to integrate the 2004 WHO grading) delivers an RP for all mathematically possible constellations, in accordance with guideline recommendations. Urologists in the clinic and in the doctor's office both attach considerable importance to the determination and transfer of the RP and the recommendation for RAAT. There was evidence to suggest that CUs overestimate the quality of medical reports. The risk calculator provides an easy and cost-neutral option to improve risk stratification and information transfer from the clinic to the doctor's office.

  14. Nasa-wide Standard Administrative Systems

    NASA Technical Reports Server (NTRS)

    Schneck, P.

    1984-01-01

    Factors to be considered in developing agency-wide standard administrative systems for NASA include uniformity of hardware and software; centralization vs. decentralization; risk exposure; and models for software development.

  15. Enhancements and Extensions of Formal Models for Risk Assessment in Software Projects

    DTIC Science & Technology

    2002-09-01


  16. [Risk factors associated to diffuse gastric cancer and intestinal histological patterns in an adult population from Western Mexico].

    PubMed

    Delgado-Figueroa, Netzahualpilli; Casas-Junco, Paloma; Torres-Jasso, Juan Heriberto; Bustos-Carpinteyro, Andrea Rebeca; Santiago-Luna, Ernesto; Marín-Contreras, María Eugenia; Sánchez-López, Josefina Yoaly

    Gastric cancer (GC) is the third leading cause of cancer death worldwide and is divided histologically into diffuse gastric cancer (DGC) and intestinal gastric cancer (IGC). Multiple risk factors have been associated with GC in different populations. The objective was to analyze the risk factors associated with DGC and IGC in a population from the western region of Mexico. The DGC (n = 27) and IGC (n = 26) cases, each matched by age and sex with a control group, were analyzed. Diet and lifestyle data were obtained by questionnaire. Statistical analysis was performed with SPSS v18 software. The association with risk was calculated as odds ratios (OR); a value of p < 0.05 was considered significant. In the DGC group, the factors with significant OR values were: consumption of pork, OR 3.4 (1.11-10.4; p = 0.032); smoking, OR 4.7 (1.5-15.0; p = 0.007); green vegetables, OR 0.16 (0.03-0.83; p = 0.029); and fruit, OR 0.28 (0.08-0.88; p = 0.029). In the IGC group, the consumption of canned sardines was a significant risk factor, OR 4.07 (1.25-13.24; p = 0.019). This work is the first to analyze the risk factors associated with GC in a population from western Mexico.

  17. Study on Collision of Ship Side Structure by Simplified Plastic Analysis Method

    NASA Astrophysics Data System (ADS)

    Sun, C. J.; Zhou, J. H.; Wu, W.

    2017-10-01

    During its lifetime, a ship may encounter collision or grounding and sustain permanent damage from these types of accidents. Crashworthiness assessment has been based on two main kinds of methods: simplified plastic analysis and numerical simulation. A simplified plastic analysis method is presented in this paper, and numerical simulations using the non-linear finite-element software LS-DYNA are conducted to validate the method. The results show that the simplified plastic analysis is in good agreement with the finite element simulation, which reveals that the simplified plastic analysis method can quickly and accurately estimate the crashworthiness of the side structure during the collision process and can be used as a reliable risk assessment method.

  18. Design of a secure remote management module for a software-operated medical device.

    PubMed

    Burnik, Urban; Dobravec, Štefan; Meža, Marko

    2017-12-09

    Software-based medical devices need to be maintained throughout their entire life cycle, and the efficiency of after-sales maintenance can be improved by managing medical systems remotely. This paper presents how to design remote access function extensions so as to prevent the risks imposed by uncontrolled remote access. A thorough analysis of the standards and legislative requirements regarding the safe operation and risk management of medical devices is presented. Based on the formal requirements, a multi-layer machine design solution is proposed that eliminates remote connectivity risks by strictly separating regular device functionality from the remote management service, deploying encrypted communication links, and using digital signatures to prevent mishandling of software images. The proposed system may also be used as an efficient version-update mechanism for existing medical device designs.

  19. Spatial analysis and health risk assessment of heavy metals concentration in drinking water resources.

    PubMed

    Fallahzadeh, Reza Ali; Ghaneian, Mohammad Taghi; Miri, Mohammad; Dashti, Mohamad Mehdi

    2017-11-01

    The heavy metals present in drinking water can be considered a threat to human health; the carcinogenic risk of such metals has been demonstrated in several studies. The present study investigated the concentrations of the heavy metals As, Cd, Cr, Cu, Fe, Hg, Mn, Ni, Pb, and Zn in 39 water supply wells and 5 water reservoirs within the cities of Ardakan, Meibod, Abarkouh, Bafgh, and Bahabad. The spatial distribution of the concentrations was mapped with the ArcGIS software. Non-carcinogenic hazard and lifetime cancer risk simulations were conducted for lead and nickel using the Monte Carlo technique, and a sensitivity analysis was carried out to identify the parameters with the greatest influence on the risk assessment. The results indicated that the concentrations of all metals in the 39 wells (except iron in 3 cases) met the levels specified in the EPA, World Health Organization, and Pollution Control Department standards. Based on the spatial distribution results for all studied regions, the highest concentrations were found for iron and zinc, respectively. Calculated HQ values for non-carcinogenic hazard indicated a reasonable risk. Average lifetime cancer risks for lead in Ardakan and for nickel in Meibod and Bahabad were 1.09 × 10⁻³, 1.67 × 10⁻¹, and 2 × 10⁻¹, respectively, demonstrating high carcinogenic risk compared to similar standards and studies. The sensitivity analysis suggests a high impact of concentration and body weight (BW) on carcinogenic risk.
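
    The Monte Carlo step can be sketched with the standard EPA-style hazard quotient formula, HQ = (C × IR × EF × ED)/(BW × AT × RfD); all parameter distributions and the reference dose below are illustrative assumptions, not the study's values.

    ```python
    # Hedged Monte Carlo sketch: non-carcinogenic hazard quotient for a metal
    # in drinking water. All distributions and RfD are illustrative only.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000
    C = rng.lognormal(mean=np.log(0.01), sigma=0.5, size=n)  # mg/L in water
    IR = rng.normal(2.0, 0.3, size=n)                        # L/day intake
    BW = rng.normal(70.0, 10.0, size=n)                      # kg body weight
    EF, ED, AT = 350.0, 30.0, 30.0 * 365.0                   # days/yr, yr, days
    RfD = 0.0035                                             # mg/kg-day (assumed)

    HQ = (C * IR * EF * ED) / (BW * AT * RfD)
    print(f"mean HQ = {HQ.mean():.3f}, 95th percentile = {np.percentile(HQ, 95):.3f}")
    ```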

  20. Teaching Science and Mathematics Subjects Using the Excel Spreadsheet Package

    ERIC Educational Resources Information Center

    Ibrahim, Dogan

    2009-01-01

    The teaching of scientific subjects usually requires laboratories where students can put the theory they have learned into practice. Traditionally, electronic programmable calculators, dedicated software, or expensive software simulation packages such as MATLAB have been used to simulate scientific experiments. Recently, spreadsheet programs have…

  1. Does Your Graphing Software Real-ly Work?

    ERIC Educational Resources Information Center

    Marchand, R. J.; McDevitt, T. J.; Bosse, Michael J.; Nandakumar, N. R.

    2007-01-01

    Many popular mathematical software products including Maple, Mathematica, Derive, Mathcad, Matlab, and some of the TI calculators produce incorrect graphs because they use complex arithmetic instead of "real" arithmetic. This article expounds on this issue, provides possible remedies for instructors to share with their students, and demonstrates…
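
    A short demonstration of the pitfall in Python terms: exponentiation yields the principal complex cube root of a negative number, whereas graphing y = x^(1/3) over the reals requires the real root.

    ```python
    # Principal (complex) versus real cube roots of a negative number.
    import math
    import numpy as np

    print((-8) ** (1 / 3))    # principal root: approx (1.0 + 1.732j)
    print(np.cbrt(-8.0))      # real root: -2.0

    # Real-valued remedy without numpy, using the sign-preserving identity
    # cbrt(x) = sign(x) * |x|**(1/3):
    real_cbrt = lambda x: math.copysign(abs(x) ** (1 / 3), x)
    print(real_cbrt(-8.0))    # -2.0
    ```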

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mouton, S.; Ledoux, Y.; Teissandier, D.

    A key challenge for the future is to drastically reduce the human impact on the environment. In the aeronautic field, this challenge translates into optimizing the design of the aircraft to decrease its global mass, a reduction that leads to the optimization of every constituent part of the plane. This operation is even more delicate when the material used is a composite. In this case, it is necessary to find a compromise between the strength, the mass and the manufacturing cost of the component. Because of these different kinds of design constraints, it is necessary to assist the engineer with a decision support system to determine feasible solutions. In this paper, an approach is proposed based on the coupling of the different key characteristics of the design process and on consideration of the failure risk of the component. The originality of this work is that the manufacturing deviations due to the RTM process are integrated into the simulation of the assembly process. Two kinds of deviations are identified: volume impregnation (injection phase of the RTM process) and geometrical deviations (curing and cooling phases). The quantification of these deviations and the related failure risk calculation are based on finite element simulations (Pam RTM® and Samcef® software). The use of a genetic algorithm allows estimation of the impact of the design choices and their consequences on the failure risk of the component. The main focus of the paper is the optimization of tool design. In the framework of decision support systems, the failure risk calculation is used to compare possible industrialization alternatives. It is proposed to apply this method to a particular part of the airplane structure: a spar unit made of carbon fiber/epoxy composite.

  3. Environmental and socio-economic risk modelling for Chagas disease in Bolivia.

    PubMed

    Mischler, Paula; Kearney, Michael; McCarroll, Jennifer C; Scholte, Ronaldo G C; Vounatsou, Penelope; Malone, John B

    2012-09-01

    Accurately defining disease distributions and calculating disease risk are important steps in the control and prevention of diseases. Geographical information systems (GIS) and remote sensing technologies, together with the maximum entropy (Maxent) ecological niche modelling software, were used to create predictive risk maps for Chagas disease in Bolivia. Prevalence rates were calculated from 2007 to 2009 household infection survey data for Bolivia, while environmental data were compiled from the Worldclim database and MODIS satellite imagery. Socio-economic data were obtained from the Bolivian National Institute of Statistics. Disease models identified altitudes of 500-3,500 m above mean sea level (MSL), low annual precipitation (45-250 mm), and a higher diurnal temperature range (10-19 °C; peak 16 °C) as compatible with the biological requirements of the insect vectors. Socio-economic analyses demonstrated the importance of improved housing materials and water source. Adobe wall materials in the home and having to fetch drinking water from rivers or wells without a pump were found to be highly related to the distribution of the disease by the receiver operating characteristic (ROC) area under the curve (AUC) (0.69, 0.67 and 0.62, respectively), while areas with hardwood floors demonstrated a direct negative relationship (-0.71). This study demonstrates that Maxent modelling can be used in disease prevalence and incidence studies to provide governmental agencies with an easily learned, understandable method to classify areas as at high, moderate or low risk for the disease. This information may be used in resource planning, targeting and implementation. However, access to high-resolution, sub-municipality socio-economic data (e.g. census tracts) would facilitate elucidation of the relative influence of poverty-related factors on regional disease dynamics.

  4. Three-dimensional modeling and animation of two carpal bones: a technique.

    PubMed

    Green, Jason K; Werner, Frederick W; Wang, Haoyu; Weiner, Marsha M; Sacks, Jonathan M; Short, Walter H

    2004-05-01

    The objectives of this study were to (a) create 3D reconstructions of two carpal bones from single CT data sets and animate these bones with experimental in vitro motion data collected during dynamic loading of the wrist joint, (b) develop a technique to calculate the minimum interbone distance between the two carpal bones, and (c) validate the interbone distance calculation process. This method utilized commercial software to create the animations and an in-house program interfacing with three-dimensional CAD software to calculate the minimum distance between the irregular geometries of the bones. This interbone minimum distance provides quantitative information regarding the motion of the bones studied and may help to understand and quantify the effects of ligamentous injury.
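
    The minimum-distance objective can be sketched on point-sampled bone surfaces with a k-d tree; this mirrors the goal of the in-house program, not its actual CAD-based routine, and the point clouds below are synthetic.

    ```python
    # Sketch: minimum interbone distance between two point-sampled surfaces,
    # using a k-d tree for fast nearest-neighbour queries (synthetic points).
    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(1)
    bone_a = rng.uniform(0, 10, size=(500, 3))    # surface points, bone A
    bone_b = rng.uniform(12, 22, size=(500, 3))   # surface points, bone B

    tree = cKDTree(bone_b)
    dists, _ = tree.query(bone_a)    # nearest B-point for every A-point
    print(f"minimum interbone distance: {dists.min():.3f}")
    ```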

  5. An Alternative Lunar Ephemeris Model for On-Board Flight Software Use

    NASA Technical Reports Server (NTRS)

    Simpson, David G.

    1998-01-01

    In calculating the position vector of the Moon in on-board flight software, one often begins by using a series expansion to calculate the ecliptic latitude and longitude of the Moon, referred to the mean ecliptic and equinox of date. One then performs a reduction for precession, followed by a rotation of the position vector from the ecliptic plane to the equator, and a transformation from spherical to Cartesian coordinates before finally arriving at the desired result: equatorial J2000 Cartesian components of the lunar position vector. An alternative method is developed here in which the equatorial J2000 Cartesian components of the lunar position vector are calculated directly by a series expansion, saving valuable on-board computer resources.
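
    The rotation from the ecliptic to the equator that the alternative method avoids can be sketched as follows, using the J2000 mean obliquity; the lunar latitude, longitude, and distance inputs are placeholders, not a real series evaluation.

    ```python
    # Sketch of the conventional coordinate steps: spherical -> Cartesian in the
    # ecliptic frame, then a rotation about the x-axis by the obliquity.
    import numpy as np

    EPS_J2000 = np.radians(23.439291111)  # mean obliquity of the ecliptic, J2000

    def ecliptic_to_equatorial(lon_deg, lat_deg, r_km):
        lon, lat = np.radians(lon_deg), np.radians(lat_deg)
        v = r_km * np.array([np.cos(lat) * np.cos(lon),
                             np.cos(lat) * np.sin(lon),
                             np.sin(lat)])
        rx = np.array([[1, 0, 0],
                       [0, np.cos(EPS_J2000), -np.sin(EPS_J2000)],
                       [0, np.sin(EPS_J2000),  np.cos(EPS_J2000)]])
        return rx @ v

    print(ecliptic_to_equatorial(120.0, 4.5, 385000.0))  # equatorial x, y, z (km)
    ```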

  6. The Mayak Worker Dosimetry System (MWDS-2013): Implementation of the Dose Calculations.

    PubMed

    Zhdanov, А; Vostrotin, V; Efimov, А; Birchall, A; Puncher, M

    2016-07-15

    The calculation of internal doses for the Mayak Worker Dosimetry System (MWDS-2013) required extensive computational resources due to the complexity and sheer number of calculations involved. The required output consisted of a set of 1000 hyper-realizations, each consisting of a set (one for each worker) of probability distributions of organ doses. This report describes the hardware components and computational approaches required to make the calculation tractable. Together with the software, this system is referred to here as the 'PANDORA system'; it is based on a commercial SQL Server database and a series of six workstations. A complete run of the entire Mayak worker cohort entailed a huge number of calculations in PANDORA, and owing to the relatively slow speed of writing the data into the SQL Server, each run took about 47 days. Quality control was monitored by comparing doses calculated in PANDORA with those from a specially modified version of the commercial software 'IMBA Professional Plus'. Suggestions are also made for increasing calculation and storage efficiency in future dosimetry calculations using PANDORA. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  7. GPAW - massively parallel electronic structure calculations with Python-based software.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Enkovaara, J.; Romero, N.; Shende, S.

    2011-01-01

    Electronic structure calculations are a widely used tool in materials science and a large consumer of supercomputing resources. Traditionally, the software packages for these kinds of simulations have been implemented in compiled languages, where Fortran in its different versions has been the most popular choice. While dynamic, interpreted languages, such as Python, can increase the efficiency of the programmer, they cannot compete directly with the raw performance of compiled languages. However, by using an interpreted language together with a compiled language, it is possible to have most of the productivity-enhancing features together with good numerical performance. We have used this approach in implementing the electronic structure simulation software GPAW using the combination of the Python and C programming languages. While the chosen approach works well in standard workstations and Unix environments, massively parallel supercomputing systems can present some challenges in porting, debugging and profiling the software. In this paper we describe some details of the implementation and discuss the advantages and challenges of the combined Python/C approach. We show that despite the challenges it is possible to obtain good numerical performance and good parallel scalability with Python-based software.
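
    A minimal illustration of the interpreted-plus-compiled pattern, with Python's ctypes calling a C math-library routine as a stand-in for a numerical kernel; GPAW itself uses purpose-built C extensions rather than ctypes, so this is only the general idea.

```python
# Python provides the productivity layer; a compiled C routine does the
# floating-point work. libm's exp() stands in for a kernel (POSIX systems).
import ctypes
import ctypes.util

import numpy as np

libm = ctypes.CDLL(ctypes.util.find_library("m"))
libm.exp.restype = ctypes.c_double
libm.exp.argtypes = [ctypes.c_double]

def compiled_exp_sum(values):
    """Drive the compiled kernel from Python for each element."""
    return sum(libm.exp(float(v)) for v in values)

print(compiled_exp_sum(np.linspace(0.0, 1.0, 5)))
```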

  8. Element Load Data Processor (ELDAP) Users Manual

    NASA Technical Reports Server (NTRS)

    Ramsey, John K., Jr.; Ramsey, John K., Sr.

    2015-01-01

    Often, the shear and tensile forces and moments are extracted from finite element analyses to be used in off-line calculations for evaluating the integrity of structural connections involving bolts, rivets, and welds. Usually the maximum forces and moments are desired for use in the calculations. In situations where there are numerous structural connections of interest and numerous load cases, finding the true maximum force and/or moment combinations among all fasteners, welds, and load cases becomes difficult. The Element Load Data Processor (ELDAP) software described herein makes this effort manageable. This software eliminates the possibility of overlooking the worst-case forces and moments, which could result in erroneous positive margins of safety, and of selecting inconsistent combinations of forces and moments, which could result in false negative margins of safety. In addition to forces and moments, any scalar quantity output in a PATRAN report file may be evaluated with this software. This software was originally written to fill an urgent need during the structural analysis of the Ares I-X Interstage segment. As such, it was coded in a straightforward manner, with no effort made to optimize or minimize code or to develop a graphical user interface.
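
    A minimal sketch of the worst-case search such a tool automates, assuming element loads have been flattened into a CSV table; the column names, the resultant-force metric and the file format are hypothetical, not ELDAP's actual PATRAN report handling.

```python
# Scan element force results across all load cases and keep the governing
# combination per connection. Columns and data are invented for illustration.
import csv
from collections import defaultdict

def worst_case(report_csv):
    worst = defaultdict(lambda: (float("-inf"), None))
    with open(report_csv, newline="") as f:
        for row in csv.DictReader(f):
            # Governing metric here: resultant of shear and tension; a real
            # check would use the applicable interaction equation instead.
            metric = (float(row["shear"])**2 + float(row["tension"])**2) ** 0.5
            key = row["fastener_id"]
            if metric > worst[key][0]:
                worst[key] = (metric, row["load_case"])
    return dict(worst)

# e.g. worst_case("element_loads.csv") -> {"B101": (5417.3, "LC-042"), ...}
```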

  9. Research on Occupational Safety, Health Management and Risk Control Technology in Coal Mines.

    PubMed

    Zhou, Lu-Jie; Cao, Qing-Gui; Yu, Kai; Wang, Lin-Lin; Wang, Hai-Bin

    2018-04-26

    This paper studies the occupational safety and health management methods as well as the risk control technology associated with the coal mining industry, including daily management of occupational safety and health, identification and assessment of risks, and early warning and dynamic monitoring of risks. A browser/server (B/S) software package (Geting Coal Mine, Jining, Shandong, China), the Coal Mine Occupational Safety and Health Management and Risk Control System, was developed to attain these objectives, namely promoting coal mine occupational safety and health management based on early warning and dynamic monitoring of risks. Furthermore, the practical effectiveness of, and the associated pattern for, applying this software package to coal mining is analyzed. The study indicates that the presently developed occupational safety and health management and risk control technology and the associated software can support occupational safety and health management efforts in coal mines in a standardized and effective manner. It can also control accident risks scientifically and effectively; its effective implementation can further improve the coal mine occupational safety and health management mechanism and further enhance risk management approaches. Moreover, its implementation indicates that the occupational safety and health management and risk control technology has been established on a benign cycle involving dynamic feedback and scientific development, which can provide reliable assurance for the safe operation of coal mines.

  10. Research on Occupational Safety, Health Management and Risk Control Technology in Coal Mines

    PubMed Central

    Zhou, Lu-jie; Cao, Qing-gui; Yu, Kai; Wang, Lin-lin; Wang, Hai-bin

    2018-01-01

    This paper studies the occupational safety and health management methods as well as the risk control technology associated with the coal mining industry, including daily management of occupational safety and health, identification and assessment of risks, and early warning and dynamic monitoring of risks. A browser/server (B/S) software package (Geting Coal Mine, Jining, Shandong, China), the Coal Mine Occupational Safety and Health Management and Risk Control System, was developed to attain these objectives, namely promoting coal mine occupational safety and health management based on early warning and dynamic monitoring of risks. Furthermore, the practical effectiveness of, and the associated pattern for, applying this software package to coal mining is analyzed. The study indicates that the presently developed occupational safety and health management and risk control technology and the associated software can support occupational safety and health management efforts in coal mines in a standardized and effective manner. It can also control accident risks scientifically and effectively; its effective implementation can further improve the coal mine occupational safety and health management mechanism and further enhance risk management approaches. Moreover, its implementation indicates that the occupational safety and health management and risk control technology has been established on a benign cycle involving dynamic feedback and scientific development, which can provide reliable assurance for the safe operation of coal mines. PMID:29701715

  11. Reconstruction of Internal Doses for the Alpha-Risk Case-Control Study of Lung Cancer and Leukaemia Among European Nuclear Workers.

    PubMed

    Bingham, Derek; Bérard, Philippe; Birchall, Alan; Bull, Richard; Cardis, Elisabeth; Challeton-de Vathaire, Cécile; Grellier, James; Hurtgen, Christian; Puncher, Matthew; Riddell, Anthony; Thierry-Chef, Isabelle

    2017-05-01

    The Alpha-Risk study required the reconstruction of doses to the lung and red bone marrow for lung cancer and leukaemia cases and their matched controls from cohorts of nuclear workers in the UK, France and Belgium. The dosimetrists and epidemiologists agreed on requirements regarding the bioassay data, the biokinetic and dosimetric models, the dose assessment software to be used and the doses to be reported. The best values to use for uncertainties on the monitoring data, the setting of exposure regimes and the characteristics of the exposure material, including lung solubility, were the responsibility of the dosimetrist responsible for each cohort. Among 1721 subjects, the median absorbed dose to the lung from alpha radiation was 2.1 mGy, with a maximum dose of 316 mGy. The lung doses calculated reflect the higher levels of exposure seen among workers in the early years of the nuclear industry compared with today. © Crown copyright 2016.

  12. Teamwork tools and activities within the hazard component of the Global Earthquake Model

    NASA Astrophysics Data System (ADS)

    Pagani, M.; Weatherill, G.; Monelli, D.; Danciu, L.

    2013-05-01

    The Global Earthquake Model (GEM) is a public-private partnership aimed at supporting and fostering a global community of scientists and engineers working in the fields of seismic hazard and risk assessment. In the hazard sector, in particular, GEM recognizes the importance of local ownership and leadership in the creation of seismic hazard models. For this reason, over the last few years, GEM has been promoting different activities in the context of seismic hazard analysis, ranging, for example, from regional projects targeted at the creation of updated seismic hazard studies to the development of a new open-source seismic hazard and risk calculation software called the OpenQuake-engine (http://globalquakemodel.org). In this communication we provide a tour of the various activities completed, such as the new ISC-GEM Global Instrumental Catalogue, and of currently ongoing initiatives, such as the creation of a suite of tools for building PSHA input models. Discussion, comments and criticism from colleagues in the audience will be highly appreciated.

  13. Utilization of GIS/GPS-Based Information Technology in Commercial Crop Decision Making in California, Washington, Oregon, Idaho, and Arizona

    PubMed Central

    Thomas, C. S.; Skinner, P. W.; Fox, A. D.; Greer, C. A.; Gubler, W. D.

    2002-01-01

    Ground-based weather, plant-stage measurements, and remote imagery were geo-referenced in geographic information system (GIS) software using an integrated approach to determine insect and disease risk and crop cultural requirements. Weather forecasts and disease weather forecasts for agricultural areas were constructed with elevation, weather, and satellite data. Models for 6 insect pests and 12 diseases of various crops were calculated and presented daily in geo-referenced maps for agricultural areas in northern California and Washington. Grape harvest dates and yields also were predicted with high accuracy. The data generated from the GIS/global positioning system (GPS) analyses were used to make management decisions over a large number of acres in California, Washington, Oregon, Idaho, and Arizona. Information was distributed daily over the Internet as regional weather, insect, and disease risk maps as industry-sponsored or subscription-based products. Use of GIS/GPS technology for semi-automated data analysis is discussed. PMID:19265934

  14. Risk Based Inspection Methodology and Software Applied to Atmospheric Storage Tanks

    NASA Astrophysics Data System (ADS)

    Topalis, P.; Korneliussen, G.; Hermanrud, J.; Steo, Y.

    2012-05-01

    A new risk-based inspection (RBI) methodology and software is presented in this paper. The objective of this work is to allow management of the inspections of atmospheric storage tanks in the most efficient way, while, at the same time, accident risks are minimized. The software has been built on the new risk framework architecture, a generic platform facilitating efficient and integrated development of software applications using risk models. The framework includes a library of risk models and the user interface is automatically produced on the basis of editable schemas. This risk-framework-based RBI tool has been applied in the context of RBI for above-ground atmospheric storage tanks (AST) but it has been designed with the objective of being generic enough to allow extension to the process plants in general. This RBI methodology is an evolution of an approach and mathematical models developed for Det Norske Veritas (DNV) and the American Petroleum Institute (API). The methodology assesses damage mechanism potential, degradation rates, probability of failure (PoF), consequence of failure (CoF) in terms of environmental damage and financial loss, risk and inspection intervals and techniques. The scope includes assessment of the tank floor for soil-side external corrosion and product-side internal corrosion and the tank shell courses for atmospheric corrosion and internal thinning. It also includes preliminary assessment for brittle fracture and cracking. The data are structured according to an asset hierarchy including Plant, Production Unit, Process Unit, Tag, Part and Inspection levels and the data are inherited / defaulted seamlessly from a higher hierarchy level to a lower level. The user interface includes synchronized hierarchy tree browsing, dynamic editor and grid-view editing and active reports with drill-in capability.

  15. Caries Risk Assessment of 12–13-year-old Government and Private School Going Children of Mysore City Using Cariogram: A Comparative Study

    PubMed Central

    Naik, Sandhya P.; Moyin, Shabna; Patel, Bhakti; Warad, Lata Prabhu; Punathil, Sameer; Sudeep, C. B.

    2018-01-01

    Aim: The aim of this study is to assess the caries risk of 12–13-year-old government and private school going children of Mysore city using the Cariogram. Materials and Methods: A cross-sectional examination was carried out on a total of 104 government and private schoolchildren aged 12–13 years. Ten factors from the Cariogram software (D. Bratthall, computer software, Malmo, Sweden) were completed from study participants' records. The percentage of "chances of avoiding new lesions" (caries risk) among government and private school study participants was obtained from the Cariogram, and the participants were classified into five risk groups. Statistical analysis was performed using the Statistical Package for the Social Sciences (version 17.0, SPSS Inc., Chicago, IL, USA). Results: Findings revealed a slight difference in caries risk between government and private schoolchildren: according to the Cariogram, government schoolchildren showed a 48% risk of caries development and a 52% chance of avoiding dental caries, while private schoolchildren showed a 51% risk of caries development and a 49% chance of avoiding dental caries. The decayed, missing, and filled teeth component and the mutans streptococci and Lactobacillus counts were slightly higher in private schoolchildren compared with government schoolchildren. Conclusion: The private schoolchildren had less favorable values than government schoolchildren for most of the caries-related factors. The Cariogram can be a modest and reliable tool for caries prediction, thus aiding in identifying different risk groups in a community so that appropriate preventive strategies can be provided to prevent new carious lesion formation. PMID:29780742

  16. Proposing an Evidence-Based Strategy for Software Requirements Engineering.

    PubMed

    Lindoerfer, Doris; Mansmann, Ulrich

    2016-01-01

    This paper discusses an evidence-based approach to software requirements engineering. The approach is called evidence-based because it uses publications on the specific problem as a surrogate for stakeholder interests in order to formulate risks and testing experiences. This complements the idea that agile software development models are more relevant, in which requirements and solutions evolve through collaboration between self-organizing cross-functional teams. The strategy is exemplified and applied to the development of a software requirements list used to develop software systems for patient registries.

  17. Continuous Risk Management: A NASA Program Initiative

    NASA Technical Reports Server (NTRS)

    Hammer, Theodore F.; Rosenberg, Linda

    1999-01-01

    NPG 7120.5A, "NASA Program and Project Management Processes and Requirements" enacted in April, 1998, requires that "The program or project manager shall apply risk management principles..." The Software Assurance Technology Center (SATC) at NASA GSFC has been tasked with the responsibility for developing and teaching a systems level course for risk management that provides information on how to comply with this edict. The course was developed in conjunction with the Software Engineering Institute at Carnegie Mellon University, then tailored to the NASA systems community. This presentation will briefly discuss the six functions for risk management: (1) Identify the risks in a specific format; (2) Analyze the risk probability, impact/severity, and timeframe; (3) Plan the approach; (4) Track the risk through data compilation and analysis; (5) Control and monitor the risk; (6) Communicate and document the process and decisions.

  18. Big Software for SmallSats: Adapting cFS to CubeSat Missions

    NASA Technical Reports Server (NTRS)

    Cudmore, Alan P.; Crum, Gary Alex; Sheikh, Salman; Marshall, James

    2015-01-01

    Expanding capabilities and mission objectives for SmallSats and CubeSats is driving the need for reliable, reusable, and robust flight software. While missions are becoming more complicated and the scientific goals more ambitious, the level of acceptable risk has decreased. Design challenges are further compounded by budget and schedule constraints that have not kept pace. NASA's Core Flight Software System (cFS) is an open source solution which enables teams to build flagship satellite level flight software within a CubeSat schedule and budget. NASA originally developed cFS to reduce mission and schedule risk for flagship satellite missions by increasing code reuse and reliability. The Lunar Reconnaissance Orbiter, which launched in 2009, was the first of a growing list of Class B rated missions to use cFS.

  19. [Comparison among various software for LMS growth curve fitting methods].

    PubMed

    Han, Lin; Wu, Wenhong; Wei, Qiuxia

    2015-03-01

    To explore methods for realizing growth curve fitting with the LMS method (L, skewness; M, median; S, coefficient of variation) in different software packages, and to identify the most suitable growth curve statistical method for grass-roots child and adolescent health workers. Regular physical examination data on head circumference for normal infants aged 3, 6, 9 and 12 months in Baotou City were analyzed. The statistical packages SAS, R, STATA and SPSS were used to fit the LMS growth curves, and the results were evaluated with respect to convenience of use, learning effort, user interface, forms of result display, software updates and maintenance, and so on. The growth curve fits produced the same calculated outcome in every package, and each statistical package had its own advantages and disadvantages. With all evaluation aspects taken into consideration, R excelled the others in LMS growth curve fitting. R therefore has the advantage over the other packages for grass-roots child and adolescent health workers.
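
    For reference, once the L (skewness), M (median) and S (coefficient of variation) curves are fitted, a measurement x converts to a z-score via the Box-Cox form z = ((x/M)^L - 1)/(L·S), with z = ln(x/M)/S in the limit L = 0. A minimal sketch, with placeholder reference values rather than real growth standards:

```python
# LMS z-score computation underlying these growth curve packages.
import math

def lms_z(x, L, M, S):
    """Box-Cox LMS z-score: ((x/M)**L - 1)/(L*S), or log(x/M)/S when L == 0."""
    if abs(L) < 1e-12:
        return math.log(x / M) / S
    return ((x / M) ** L - 1.0) / (L * S)

# Hypothetical head-circumference reference at 6 months: L=1.0, M=43.3 cm, S=0.031
print(round(lms_z(44.5, 1.0, 43.3, 0.031), 2))
```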

  20. Making sense of cancer risk calculators on the web.

    PubMed

    Levy, Andrea Gurmankin; Sonnad, Seema S; Kurichi, Jibby E; Sherman, Melani; Armstrong, Katrina

    2008-03-01

    Cancer risk calculators on the internet have the potential to provide users with valuable information about their individual cancer risk. However, the lack of oversight of these sites raises concerns about low-quality and inconsistent information. These concerns led us to evaluate internet cancer risk calculators. After a systematic search to find all cancer risk calculators on the internet, we reviewed the content of each site for the information that users should seek in order to evaluate the quality of a website. We then examined the consistency of the breast cancer risk calculators by having 27 women complete 10 of the breast cancer risk calculators for themselves. We also completed the breast cancer risk calculators for a hypothetical high-risk and a hypothetical low-risk woman, and compared the output to Surveillance, Epidemiology, and End Results (SEER) estimates for the average same-age and same-race woman. Nineteen sites were found, 13 of which calculate breast cancer risk. Most sites do not provide the information users need to evaluate the legitimacy of a website. The breast cancer calculator sites vary in the risk factors they assess, in how they operationalize each risk factor, and in the risk estimate they provide for the same individual. Internet cancer risk calculators have the potential to provide a public health benefit by educating individuals about their risks and potentially encouraging preventive health behaviors. However, our evaluation revealed several problems that call into question the accuracy of the information provided. This may lead users of these sites to make inappropriate medical decisions on the basis of misinformation.

  1. Major Software Vendor Puts Students on Many Campuses at Risk of Identity Theft

    ERIC Educational Resources Information Center

    Foster, Andrea

    2008-01-01

    At least 18 colleges are scrambling to inform tens of thousands of students that they are at risk of having their identities stolen after SunGard, a leading software vendor, reported that a laptop owned by one of its consultants was stolen. The extent of the problem is still unknown, though many of the campuses that have been identified are in…

  2. Scalable collaborative risk management technology for complex critical systems

    NASA Technical Reports Server (NTRS)

    Campbell, Scott; Torgerson, Leigh; Burleigh, Scott; Feather, Martin S.; Kiper, James D.

    2004-01-01

    We describe here our project and plans to develop methods, software tools, and infrastructure tools to address challenges relating to geographically distributed software development. Specifically, this work is creating an infrastructure that supports applications working over distributed geographical and organizational domains and is using this infrastructure to develop a tool that supports project development using risk management and analysis techniques where the participants are not collocated.

  3. Software system design for the non-null digital Moiré interferometer

    NASA Astrophysics Data System (ADS)

    Chen, Meng; Hao, Qun; Hu, Yao; Wang, Shaopu; Li, Tengfei; Li, Lin

    2016-11-01

    Aspheric optical components are an indispensable part of modern optical systems. With the development of fabrication techniques for aspheric optical elements, high-precision figure error testing of aspheric surfaces has become an urgent issue. We proposed a digital Moiré interferometer technique (DMIT) based on the partial compensation principle for aspheric and freeform surface measurement. Unlike a traditional interferometer, a DMIT system consists of a real and a virtual interferometer. The virtual interferometer is simulated with Zemax software to perform phase-shifting and alignment, and the results are obtained by a series of calculations with the real interferogram and the virtual interferograms generated by computer. DMIT requires a specific, reliable software system to ensure normal operation. Image acquisition and data processing are two important parts of this system, and realizing the connection between the real and virtual interferometers is a further challenge. In this paper, we present a software system design for DMIT with a friendly user interface and robust data processing features, enabling us to acquire the figure error of the measured asphere. We chose Visual C++ as the software development platform and control the virtual interferometer by hybrid programming with Zemax. After image acquisition and data transmission, the system calls image processing algorithms written in Matlab to calculate the figure error of the measured asphere. We tested the software system experimentally: in the experiment, we measured an aspheric surface and proved the feasibility of the software system.

  4. Semi-automatic computerized approach to radiological quantification in rheumatoid arthritis

    NASA Astrophysics Data System (ADS)

    Steiner, Wolfgang; Schoeffmann, Sylvia; Prommegger, Andrea; Boegl, Karl; Klinger, Thomas; Peloschek, Philipp; Kainberger, Franz

    2004-04-01

    Rheumatoid arthritis (RA) is a common systemic disease predominantly involving the joints. Precise diagnosis and follow-up therapy require objective quantification. For this purpose, radiological analyses using standardized scoring systems are considered the most appropriate method. The aim of our study is to develop semi-automatic image analysis software especially applicable to the scoring of joints in rheumatic disorders. The X-Ray RheumaCoach software provides various scoring systems (the Larsen score and the Ratingen-Rau score) that can be applied by the scorer. In addition to the qualitative assessment of joints performed by the radiologist, a semi-automatic image analysis for joint detection and measurement of bone diameters and swollen tissue supports the image assessment process. More than 3000 radiographs of hands and feet from more than 200 RA patients were collected, analyzed, and statistically evaluated. Radiographs were quantified using the conventional paper-based Larsen score and the X-Ray RheumaCoach software. The use of the software shortened the scoring time by about 25 percent and reduced the rate of erroneous scorings in all our studies. Compared to paper-based scoring methods, the X-Ray RheumaCoach software offers several advantages: (i) structured data analysis and input that minimizes variance by standardization, (ii) faster and more precise calculation of sum scores and indices, (iii) permanent data storage and fast access to the software's database, (iv) the possibility of cross-calculation to other scores, (v) semi-automatic assessment of images, and (vi) reliable documentation of results in the form of graphical printouts.

  5. Exciting Normal Distribution

    ERIC Educational Resources Information Center

    Fuchs, Karl Josef; Simonovits, Reinhard; Thaller, Bernd

    2008-01-01

    This paper describes a high school project where the mathematics teaching and learning software M@th Desktop (MD) based on the Computer Algebra System Mathematica was used for symbolical and numerical calculations and for visualisation. The mathematics teaching and learning software M@th Desktop 2.0 (MD) contains the modules Basics including tools…

  6. A Simple Interactive Software Package for Plotting, Animating, and Calculating

    ERIC Educational Resources Information Center

    Engelhardt, Larry

    2012-01-01

    We introduce a new open source (free) software package that provides a simple, highly interactive interface for carrying out certain mathematical tasks that are commonly encountered in physics. These tasks include plotting and animating functions, solving systems of coupled algebraic equations, and basic calculus (differentiating and integrating…

  7. Software GOLUCA: Knowledge Representation in Mental Calculation

    ERIC Educational Resources Information Center

    Casas-Garcia, Luis M.; Luengo-Gonzalez, Ricardo; Godinho-Lopes, Vitor

    2011-01-01

    We present a new software package, called Goluca (Godinho, Luengo, and Casas, 2007), based on the technique of Pathfinder Associative Networks (Schvaneveldt, 1989), which produces graphical representations of the cognitive structure of individuals in a given field of knowledge. In this case, we studied the strategies used by teachers and its relationship…

  8. The Design of Lessons Using Mathematics Analysis Software to Support Multiple Representations in Secondary School Mathematics

    ERIC Educational Resources Information Center

    Pierce, Robyn; Stacey, Kaye; Wander, Roger; Ball, Lynda

    2011-01-01

    Current technologies incorporating sophisticated mathematical analysis software (calculation, graphing, dynamic geometry, tables, and more) provide easy access to multiple representations of mathematical problems. Realising the affordances of such technology for students' learning requires carefully designed lessons. This paper reports on design…

  9. 75 FR 80677 - The Low-Income Definition

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-23

    ... original regulatory text so it is consistent with the geo-coding software the agency uses to make the low... Union Act (Act) authorizes the NCUA Board (Board) to define ``low-income members'' so that credit unions... process of implementing geo- coding software to make the calculation automatically for credit unions...

  10. Understanding Computation of Impulse Response in Microwave Software Tools

    ERIC Educational Resources Information Center

    Potrebic, Milka M.; Tosic, Dejan V.; Pejovic, Predrag V.

    2010-01-01

    In modern microwave engineering curricula, the introduction of the many new topics in microwave industrial development, or of software tools for design and simulation, sometimes results in students having an inadequate understanding of the fundamental theory. The terminology for and the explanation of algorithms for calculating impulse response in…

  11. Stressing and Ignoring--The Influence of Computer Software Environments.

    ERIC Educational Resources Information Center

    Pope, Sue

    2003-01-01

    Discusses drawing a Pythagoras diagram in the context of how computer software influences mathematical understanding. Requires different understandings of what the diagram involves in order to be successfully completed in different environments. Suggests that while LOGO is often expected to be easier, a graphic calculator can be less demanding.…

  12. SigrafW: An easy-to-use program for fitting enzyme kinetic data.

    PubMed

    Leone, Francisco Assis; Baranauskas, José Augusto; Furriel, Rosa Prazeres Melo; Borin, Ivana Aparecida

    2005-11-01

    SigrafW is Windows-compatible software, developed using Microsoft® Visual Basic, that uses the simplified Hill equation for fitting kinetic data from allosteric and Michaelian enzymes. SigrafW uses a modified Fibonacci search to calculate maximal velocity (V), the Hill coefficient (n), and the apparent enzyme-substrate dissociation constant (K). The estimation of V, K, and the sum of the squares of residuals is performed using Wilkinson nonlinear regression at any Hill coefficient (n). In contrast to many currently available kinetic analysis programs, SigrafW shows several advantages for the determination of kinetic parameters from both hyperbolic and nonhyperbolic saturation curves: no initial estimates of the kinetic parameters are required, a measure of the goodness of fit is provided for each calculation performed, the nonlinear regression used for the calculations eliminates the statistical bias inherent in linear transformations, and the software can be used for enzyme kinetic simulations for either educational or research purposes. Persons interested in receiving a free copy of the software should contact Dr. F. A. Leone. Copyright © 2005 International Union of Biochemistry and Molecular Biology, Inc.
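
    A minimal sketch of fitting the simplified Hill equation v = V·S^n/(K + S^n) to kinetic data; scipy's nonlinear least squares is used here in place of SigrafW's Fibonacci-search/Wilkinson-regression scheme, and the data points are invented.

```python
# Hill equation fit: v = V * S**n / (K + S**n).
import numpy as np
from scipy.optimize import curve_fit

def hill(S, V, K, n):
    return V * S**n / (K + S**n)

# Hypothetical substrate concentrations (mM) and initial velocities.
S = np.array([0.5, 1, 2, 4, 8, 16, 32])
v = np.array([0.9, 1.7, 3.1, 5.0, 6.9, 8.1, 8.7])

(V, K, n), _ = curve_fit(hill, S, v, p0=(v.max(), np.median(S), 1.0))
print(f"V={V:.2f}, K={K:.2f}, n={n:.2f}")
```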

  13. Orbiter subsystem hardware/software interaction analysis. Volume 8: Forward reaction control system

    NASA Technical Reports Server (NTRS)

    Becker, D. D.

    1980-01-01

    The results of the orbiter hardware/software interaction analysis for the aft reaction control system are presented. The interaction between hardware failure modes and software is examined in order to identify associated issues and risks. All orbiter subsystems and interfacing program elements which interact with the orbiter computer flight software are analyzed. The failure modes identified in the subsystem/element failure mode and effects analysis are discussed.

  14. The SCEC Broadband Platform: A Collaborative Open-Source Software Package for Strong Ground Motion Simulation and Validation

    NASA Astrophysics Data System (ADS)

    Silva, F.; Maechling, P. J.; Goulet, C.; Somerville, P.; Jordan, T. H.

    2013-12-01

    The Southern California Earthquake Center (SCEC) Broadband Platform is a collaborative software development project involving SCEC researchers, graduate students, and the SCEC Community Modeling Environment. The SCEC Broadband Platform is open-source scientific software that can generate broadband (0-100Hz) ground motions for earthquakes, integrating complex scientific modules that implement rupture generation, low and high-frequency seismogram synthesis, non-linear site effects calculation, and visualization into a software system that supports easy on-demand computation of seismograms. The Broadband Platform operates in two primary modes: validation simulations and scenario simulations. In validation mode, the Broadband Platform runs earthquake rupture and wave propagation modeling software to calculate seismograms of a historical earthquake for which observed strong ground motion data is available. Also in validation mode, the Broadband Platform calculates a number of goodness of fit measurements that quantify how well the model-based broadband seismograms match the observed seismograms for a certain event. Based on these results, the Platform can be used to tune and validate different numerical modeling techniques. During the past year, we have modified the software to enable the addition of a large number of historical events, and we are now adding validation simulation inputs and observational data for 23 historical events covering the Eastern and Western United States, Japan, Taiwan, Turkey, and Italy. In scenario mode, the Broadband Platform can run simulations for hypothetical (scenario) earthquakes. In this mode, users input an earthquake description, a list of station names and locations, and a 1D velocity model for their region of interest, and the Broadband Platform software then calculates ground motions for the specified stations. By establishing an interface between scientific modules with a common set of input and output files, the Broadband Platform facilitates the addition of new scientific methods, which are written by earth scientists in a number of languages such as C, C++, Fortran, and Python. The Broadband Platform's modular design also supports the reuse of existing software modules as building blocks to create new scientific methods. Additionally, the Platform implements a wrapper around each scientific module, converting input and output files to and from the specific formats required (or produced) by individual scientific codes. Working in close collaboration with scientists and research engineers, the SCEC software development group continues to add new capabilities to the Broadband Platform and to release new versions as open-source scientific software distributions that can be compiled and run on many Linux computer systems. Our latest release includes the addition of 3 new simulation methods and several new data products, such as map and distance-based goodness of fit plots. Finally, as the number and complexity of scenarios simulated using the Broadband Platform increase, we have added batching utilities to substantially improve support for running large-scale simulations on computing clusters.

  15. Assessment and quantification of patient set-up errors in nasopharyngeal cancer patients and their biological and dosimetric impact in terms of generalized equivalent uniform dose (gEUD), tumour control probability (TCP) and normal tissue complication probability (NTCP).

    PubMed

    Boughalia, A; Marcie, S; Fellah, M; Chami, S; Mekki, F

    2015-06-01

    The aim of this study is to assess and quantify patients' set-up errors using an electronic portal imaging device and to evaluate their dosimetric and biological impact in terms of generalized equivalent uniform dose (gEUD) on predictive models, namely the tumour control probability (TCP) and the normal tissue complication probability (NTCP). 20 patients treated for nasopharyngeal cancer were enrolled in the radiotherapy-oncology department of HCA. Systematic and random errors were quantified. The dosimetric and biological impact of these set-up errors on target volume and organ-at-risk (OAR) coverage was assessed by calculation of dose-volume histograms, gEUD, TCP and NTCP. For this purpose, in-house software was developed and used. The standard deviations (1 SD) of the systematic and random set-up errors were calculated for the lateral and subclavicular fields, giving the following results: Σ = 0.63 ± (0.42) mm and σ = 3.75 ± (0.79) mm, respectively. Thus a planning organ-at-risk volume (PRV) margin of 3 mm was defined around the OARs, and a 5-mm margin was used around the clinical target volume. The gEUD, TCP and NTCP calculations obtained with and without set-up errors showed increased values for the tumour, where ΔgEUD (tumour) = 1.94% Gy (p = 0.00721) and ΔTCP = 2.03%. The toxicity of the OARs was quantified using gEUD and NTCP. The values of ΔgEUD (OARs) vary from 0.78% to 5.95% for the brainstem and the optic chiasm, respectively; the corresponding ΔNTCP varies from 0.15% to 0.53%. The quantification of set-up errors has a dosimetric and biological impact on the tumour and on the OARs. The in-house software based on the gEUD, TCP and NTCP biological models was successfully used in this study and can also be used to optimize the treatment plans established for our patients. The gEUD, TCP and NTCP may be more suitable tools for assessing treatment plans before treating patients.
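
    A minimal sketch of the gEUD computation underlying such an analysis, gEUD = (Σ v_i·D_i^a)^(1/a) over a differential DVH; the DVH bins and the a parameters below are placeholders, not this study's data.

```python
# gEUD from a differential DVH: v_i are fractional volumes at dose D_i.
import numpy as np

def gEUD(doses_gy, frac_volumes, a):
    v = np.asarray(frac_volumes) / np.sum(frac_volumes)
    return float(np.sum(v * np.asarray(doses_gy) ** a) ** (1.0 / a))

dose_bins = [60.0, 65.0, 68.0, 70.0]       # Gy
volumes = [0.10, 0.20, 0.40, 0.30]         # fractional volume per bin

print(round(gEUD(dose_bins, volumes, a=-10), 2))   # a < 0: tumour-like behaviour
print(round(gEUD(dose_bins, volumes, a=+8), 2))    # a > 0: serial-OAR-like behaviour
```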

  16. Developing a Treatment Planning Software Based on TG-43U1 Formalism for Cs-137 LDR Brachytherapy.

    PubMed

    Sina, Sedigheh; Faghihi, Reza; Soleimani Meigooni, Ali; Siavashpour, Zahra; Mosleh-Shirazi, Mohammad Amin

    2013-08-01

    The old treatment planning systems (TPSs) used for intracavitary brachytherapy with the Cs-137 Selectron source utilize traditional dose calculation methods, considering each source as a point source. Using such methods introduces significant errors in dose estimation. Since 1995, TG-43 has been the main dose calculation formalism used in TPSs. The purpose of this study is to design and establish treatment planning software for the Cs-137 Selectron brachytherapy source, based on the TG-43U1 formalism and accounting for the effects of the applicator and dummy spacers. The two software packages used for treatment planning of Cs-137 sources in Iran (STPS and PLATO) are based on the old formalisms; the purpose of this work is therefore to establish and develop a TPS for the Selectron source based on the TG-43 formalism. In this planning system, the dosimetry parameters of each pellet at different positions inside the applicators were obtained with the MCNP4c code. The dose distribution around every combination of active and inactive pellets was then obtained by summing the doses. The accuracy of this algorithm was checked by comparing its results for particular combinations of active and inactive pellets with MC simulations. Finally, the uncertainty of the old dose calculation formalism was investigated by comparing the results of the STPS and PLATO packages with those obtained by the new algorithm. For a typical arrangement of 10 active pellets in the applicator, the percentage difference between doses obtained by the new algorithm at 1 cm from the tip of the applicator and those obtained by the old formalisms is about 30%, while the difference between the results of MCNP and the new algorithm is less than 5%. According to the results, the old dosimetry formalisms overestimate the dose, especially towards the applicator's tip, while the TG-43U1-based software performs the calculations more accurately.
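
    A minimal sketch of the TG-43 one-dimensional (point-source) dose-rate formalism on which such a TPS builds, Ḋ(r) = S_K·Λ·(r0/r)²·g(r)·φan(r); the tabulated radial dose function and anisotropy values below are placeholders, not consensus Cs-137 data.

```python
# TG-43 1D (point-source) dose rate with placeholder lookup tables.
import numpy as np

r0 = 1.0           # reference distance, cm
S_K = 10.0         # air-kerma strength, U
Lambda = 1.1       # dose-rate constant, cGy/(h*U)

r_tab   = np.array([0.5, 1.0, 2.0, 5.0, 10.0])        # cm
g_tab   = np.array([1.01, 1.00, 0.98, 0.90, 0.75])    # radial dose function g(r)
phi_tab = np.array([0.97, 0.96, 0.95, 0.94, 0.93])    # anisotropy factor phi_an(r)

def dose_rate(r_cm):
    g = np.interp(r_cm, r_tab, g_tab)
    phi = np.interp(r_cm, r_tab, phi_tab)
    return S_K * Lambda * (r0 / r_cm) ** 2 * g * phi   # cGy/h

print(f"{dose_rate(2.0):.3f} cGy/h at 2 cm")
```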

  17. Calculation of the bending of electromechanical aircraft element made of the carbon fiber

    NASA Astrophysics Data System (ADS)

    Danilova-Volkovskaya, Galina; Chepurnenko, Anton; Begak, Aleksandr; Savchenko, Andrey

    2017-10-01

    We consider a method for calculating an orthotropic plate of variable thickness. The solution is performed numerically by the finite element method. The calculation is made for the springs of a hang glider made of carbon fiber. A comparison of the results with those of the SOFiSTiK software package is given.

  18. Space Telecommunications Radio System (STRS) Architecture Standard. Release 1.02.1

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.; Kacpura, Thomas J.; Handler, Louis M.; Hall, C. Steve; Mortensen, Dale J.; Johnson, Sandra K.; Briones, Janette C.; Nappier, Jennifer M.; Downey, Joseph A.; Lux, James P.

    2012-01-01

    This document contains the NASA architecture standard for software defined radios used in space- and ground-based platforms to enable commonality among radio developments to enhance capability and services while reducing mission and programmatic risk. Transceivers (or transponders) with functionality primarily defined in software (e.g., firmware) have the ability to change their functional behavior through software alone. This radio architecture standard offers value by employing common waveform software interfaces, method of instantiation, operation, and testing among different compliant hardware and software products. These common interfaces within the architecture abstract application software from the underlying hardware to enable technology insertion independently at either the software or hardware layer.

  19. 77 FR 17219 - Patient Protection and Affordable Care Act; Standards Related to Reinsurance, Risk Corridors and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-23

    ... parts of the risk adjustment process--the risk adjustment model, the calculation of plan average... risk adjustment process. The risk adjustment model calculates individual risk scores. The calculation...'' to mean all data that are used in a risk adjustment model, the calculation of plan average actuarial...

  20. Transfer of Learning: The Effects of Different Instruction Methods on Software Application Learning

    ERIC Educational Resources Information Center

    Larson, Mark E.

    2010-01-01

    Human Resource Departments (HRD), especially instructors, are challenged to keep pace with rapidly changing computer software applications and technology. The problem under investigation revealed after instruction of a software application if a particular method of instruction was a predictor of transfer of learning, when other risk factors were…

  1. Annotated Bibliography of Computer Software for Teaching Early Reading and Spelling. Project RIMES 2000.

    ERIC Educational Resources Information Center

    Rhein, Deborah; Alibrandi, Mary; Lyons, Mary; Sammons, Janice; Doyle, Luther

    This bibliography, developed by Project RIMES (Reading Instructional Methods of Efficacy with Students) lists 80 software packages for teaching early reading and spelling to students at risk for reading and spelling failure. The software packages are presented alphabetically by title. Entries usually include a grade level indicator, a brief…

  2. Automating Risk Analysis of Software Design Models

    PubMed Central

    Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P.

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688

  3. Automating risk analysis of software design models.

    PubMed

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  4. RiskScape: a new tool for comparing risk from natural hazards (Invited)

    NASA Astrophysics Data System (ADS)

    Stirling, M. W.; King, A.

    2010-12-01

    RiskScape is a New Zealand joint venture between GNS Science and NIWA, and represents a comprehensive and easy-to-use tool for multi-hazard risk and impact analysis. It has basic GIS functionality, with import/export functions for use with GIS software. Five natural hazards have been implemented in RiskScape to date: flood (river), earthquake, volcano (ash), tsunami and wind storm. The software converts hazard exposure information into the likely impacts for a region, for example, damage and replacement costs, casualties, economic losses, disruption, and the number of people affected. It can therefore be used to assist with risk management, land-use planning, building codes and design, risk identification, prioritization of risk reduction and mitigation, determination of "best use" risk-reduction investment, evacuation and contingency planning, awareness raising, public information, realistic scenarios for exercises, and hazard event response. Three geographically disparate pilot regions, each exposed to a different mix of natural hazards, have been used to develop and trial RiskScape in New Zealand. Future (phase II) development of RiskScape will include the following hazards: landslides (both rainfall- and earthquake-triggered), storm surges, pyroclastic flows and lahars, and climate change effects. While RiskScape developments have thus far focused on scenario-based risk, future developments will advance the software into providing probabilistic-based solutions.

  5. POTAMOS mass spectrometry calculator: computer aided mass spectrometry to the post-translational modifications of proteins. A focus on histones.

    PubMed

    Vlachopanos, A; Soupsana, E; Politou, A S; Papamokos, G V

    2014-12-01

    Mass spectrometry is a widely used technique for protein identification, and it has also become the method of choice for detecting and characterizing the post-translational modifications (PTMs) of proteins. Many software tools have been developed to deal with this complication. In this paper we introduce a new, free and user-friendly online software tool, named the POTAMOS Mass Spectrometry Calculator, which was developed in the open-source application framework Ruby on Rails. It can provide calculated mass spectrometry data in a time-saving manner, independently of instrumentation. In this web application we have focused on a well-known protein family, the histones, whose PTMs are believed to play a crucial role in gene regulation, as suggested by the so-called "histone code" hypothesis. The PTMs implemented in this software are methylations of arginines and lysines, acetylations of lysines, and phosphorylations of serines and threonines. The application is able to calculate the kind, the number and the combinations of possible PTMs corresponding to a given peptide sequence and a given mass, along with the full set of unique primary structures produced by the possible distributions along the amino acid sequence. It can also calculate the masses and charges of a fragmented histone variant carrying predefined modifications already implemented. Additional functionality is provided by the calculation of the masses of fragments produced upon protein cleavage by the proteolytic enzymes most widely used in proteomics studies. Copyright © 2014 Elsevier Ltd. All rights reserved.
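
    A minimal sketch of the mass bookkeeping such a calculator performs: summing monoisotopic residue masses plus water, then adding the mass shifts of the chosen PTMs. The residue and PTM masses are standard monoisotopic values; the example sequence and modification set are a toy, not POTAMOS output.

```python
# Peptide monoisotopic mass plus PTM mass shifts (all values in Da).
RESIDUE = {"G": 57.02146, "A": 71.03711, "S": 87.03203, "T": 101.04768,
           "K": 128.09496, "R": 156.10111, "V": 99.06841, "L": 113.08406}
WATER = 18.01056
PTM = {"me": 14.01565, "ac": 42.01057, "ph": 79.96633}  # methyl/acetyl/phospho

def peptide_mass(sequence, mods=()):
    """Monoisotopic [M] of a peptide with PTM shifts, e.g. mods=('me', 'ac')."""
    return sum(RESIDUE[aa] for aa in sequence) + WATER + sum(PTM[m] for m in mods)

# Toy histone-tail-like sequence with one acetylation:
print(round(peptide_mass("KSTGGKA", mods=("ac",)), 4))
```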

  6. Medical device software: defining key terms.

    PubMed

    Pashkov, Vitalii; Gutorova, Nataliya; Harkusha, Andrii

    One of the areas of significant growth in medical devices has been the role of software, as an integral component of a medical device, as a standalone device and, more recently, as applications on mobile devices. The risk related to a malfunction of standalone software used within healthcare is in itself not a criterion for its qualification or non-qualification as a medical device. It is therefore necessary to clarify the criteria for the qualification of standalone software as a medical device. Materials and methods: Ukrainian, European Union and United States legislation, guidelines developed by the European Commission and the Food and Drug Administration, recommendations of an international voluntary group, and scientific works. This article is based on dialectical, comparative, analytic, synthetic and comprehensive research methods. Results: the legal regulation of software used for medical purposes in Ukraine is limited to a single definition. In the European Union and the United States, special guidelines have been developed and applied that help developers, manufacturers and end users to differentiate types of software according to the medical-purpose criterion. Software is becoming more and more incorporated into medical devices; developers and manufacturers may not initially have appreciated the potential risks to patients and users, and such a situation could have dangerous results for patients or users. It is necessary to develop and adopt legislation that defines the criteria for the qualification of medical device software and the application of classification criteria to such software, provides illustrative examples, and gives step-by-step recommendations for qualifying software as a medical device.

  7. Automation system for neutron activation analysis at the reactor IBR-2, Frank Laboratory of Neutron Physics, Joint Institute for Nuclear Research, Dubna, Russia.

    PubMed

    Pavlov, Sergey S; Dmitriev, Andrey Yu; Frontasyeva, Marina V

    The present status of the development of software packages and equipment designed for the automation of NAA at the IBR-2 reactor of FLNP, JINR, Dubna, Russian Federation, is described. The NAA database, the construction of sample changers, and software for the automation of spectrum measurement and the calculation of concentrations are presented. Automation of QC procedures is integrated into the software developed. Details of the design are shown.

  8. MO-D-213-07: RadShield: Semi- Automated Calculation of Air Kerma Rate and Barrier Thickness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeLorenzo, M; Wu, D; Rutel, I

    2015-06-15

    Purpose: To develop the first Java-based semi-automated calculation program intended to aid professional radiation shielding design. Air-kerma rate and barrier thickness calculations are performed by implementing the NCRP Report 147 formalism in a graphical user interface (GUI). The ultimate aim of this newly created software package is to reduce errors and improve radiographic and fluoroscopic room designs relative to manual approaches. Methods: Floor plans are first imported as images into the RadShield software program. These plans serve as templates for drawing barriers, occupied regions and x-ray tube locations. Sub-GUIs allow the specification, for regions and equipment, of occupancy factors, design goals, numbers of patients, primary beam directions, source-to-patient distances and workload distributions. Once the user enters the above parameters, the program automatically calculates the air-kerma rate at sampled points beyond all barriers. For each sample point, a corresponding minimum barrier thickness is calculated to meet the design goal. RadShield allows control over preshielding, sample point locations and material types. Results: A functional GUI package was developed and tested. Examination of sample walls and source distributions yields a maximum percent difference of less than 0.1% between hand-calculated air-kerma rates and RadShield. Conclusion: The initial results demonstrated that RadShield calculates air-kerma rates and required barrier thicknesses with reliable accuracy and can be used to make radiation shielding design more efficient and accurate. This newly developed approach differs from conventional calculation methods in that it finds air-kerma rates and thickness requirements for many points outside the barriers, stores the information and selects the largest value needed to comply with NCRP Report 147 design goals. Floor plans, parameters, designs and reports can be saved and accessed later for modification and recalculation. We have confirmed that this software accurately calculates air-kerma rates and required barrier thicknesses for diagnostic radiography and fluoroscopy rooms.
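
    A minimal sketch of the two steps such a tool automates at each sample point, assuming the Archer transmission model B(x) = [(1 + β/α)·e^(αγx) − β/α]^(−1/γ) used in NCRP Report 147, inverted for thickness as x = 1/(αγ)·ln[(B^(−γ) + β/α)/(1 + β/α)]; the α, β, γ fit values and workload numbers below are placeholders, not the published coefficients.

```python
# NCRP 147-style barrier sizing: required transmission B, then thickness x
# from the inverted Archer model. All parameter values are placeholders.
import math

def required_transmission(P_mGy_wk, d_m, K1_mGy_per_patient, N_patients_wk, T):
    """B = P*d^2 / (K1 * N * T) for an unshielded kerma K1 per patient at 1 m."""
    return P_mGy_wk * d_m**2 / (K1_mGy_per_patient * N_patients_wk * T)

def barrier_thickness(B, a, b, g):
    """Invert the Archer transmission model for thickness x (mm)."""
    return (1.0 / (a * g)) * math.log((B**-g + b / a) / (1.0 + b / a))

B = required_transmission(P_mGy_wk=0.02, d_m=3.0, K1_mGy_per_patient=0.05,
                          N_patients_wk=120, T=1.0)
x_mm = barrier_thickness(B, a=2.346, b=15.9, g=0.573)  # placeholder lead-like fit
print(f"B = {B:.4f}, required thickness ~ {x_mm:.2f} mm")
```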

  9. Alternatives for jet engine control

    NASA Technical Reports Server (NTRS)

    Sain, M. K.; Yurkovich, S.; Hill, J. P.; Kingler, T. A.

    1983-01-01

    The following are discussed: the development of tensor-type models for a digital simulation of the quiet, clean, safe engine (QCSE) gas turbine engine; the extension, to nonlinear multivariable control system design, of the concepts of total synthesis which trace their roots back to certain early investigations under this grant; the role of series descriptions as they relate to questions of scheduling in the control of gas turbine engines; the development of computer-aided design software for tensor modeling calculations; further enhancement of the software for linear total synthesis mentioned above; and the calculation of the first known examples using tensors for nonlinear feedback control.

  10. A seismic analysis for masonry constructions: The different schematization methods of masonry walls

    NASA Astrophysics Data System (ADS)

    Olivito, Renato. S.; Codispoti, Rosamaria; Scuro, Carmelo

    2017-11-01

    The seismic analysis of masonry structures is usually performed with structural calculation software based on the equivalent frame method or the macro-element method. In these approaches, masonry walls are divided into vertical elements (piers) and horizontal elements (so-called spandrel elements), interconnected by rigid nodes. The aim of this work is to make a critical comparison between different schematization methods for masonry walls, underlining the structural importance of the spandrel elements. To implement the methods, two different structural calculation programs were used and an existing masonry building was examined.

  11. Numerical modeling of interaction of the aircraft engine with concrete protective structures

    NASA Astrophysics Data System (ADS)

    Radchenko, P. A.; Batuev, S. P.; Radchenko, A. V.; Plevkov, V. S.

    2018-01-01

    The paper presents numerical modeling results for the interaction of a Boeing 747 aircraft engine with the protective shell of a nuclear power station. The protective shell is represented as a reinforced concrete structure with a complex reinforcement scheme. The engine is simulated by a cylindrical projectile made of titanium alloy. The impact velocity is 180 m/s. The simulation is three-dimensional and is solved by the finite element method using the authors' own software package EFES. Fracture and fragmentation of materials are considered in the calculations. The software has been assessed for use in the calculation of multiple-contact problems.

  12. A new software for prediction of femoral neck fractures.

    PubMed

    Testi, Debora; Cappello, Angelo; Sgallari, Fiorella; Rumpf, Martin; Viceconti, Marco

    2004-08-01

    Femoral neck fractures are an important clinical, social and economic problem. Although many attempts have been made to improve the accuracy of fracture-risk prediction, retrospective studies have demonstrated that the standard clinical protocol achieves an accuracy of only about 65%. A new procedure was previously developed that includes in the prediction not only bone mineral density but also geometric and femoral strength information, achieving an accuracy of about 80% in a retrospective study. The aim of the present work was to re-engineer these research procedures and develop real-time software for the prediction of femoral fracture risk. The result is efficient, repeatable and easy-to-use software for the evaluation of femoral neck fracture risk that can be inserted into daily clinical practice, providing a useful tool for the improvement of fracture prediction.

  13. Requirements UML Tool (RUT) Expanded for Extreme Programming (CI02)

    NASA Technical Reports Server (NTRS)

    McCoy, James R.

    2003-01-01

    A procedure for capturing and managing system requirements that incorporates XP user stories. Because costs associated with identifying problems in requirements increase dramatically over the lifecycle of a project, a method for identifying sources of software risk in user stories is urgently needed. This initiative aims to determine a set of guidelines for user stories that will result in high-quality requirements. To further this initiative, a tool is needed to analyze user stories that can assess the quality of individual user stories, detect sources of software risk, produce software metrics, and identify areas in user stories that can be improved.

  14. Composable Framework Support for Software-FMEA Through Model Execution

    NASA Astrophysics Data System (ADS)

    Kocsis, Imre; Pataricza, Andras; Brancati, Francesco; Rossi, Francesco

    2016-08-01

    Performing Failure Modes and Effect Analysis (FMEA) during software architecture design is becoming a basic requirement in an increasing number of domains; however, due to the lack of standardized early design phase model execution, classic SW-FMEA approaches carry significant risks and are human effort-intensive even in processes that use Model-Driven Engineering. Recently, modelling languages with standardized executable semantics have emerged. Building on earlier results, this paper describes framework support for generating executable error propagation models from such models during software architecture design. The approach carries the promise of increased precision, decreased risk and more automated execution for SW-FMEA during dependability-critical system development.

  15. Influence of Smartphones and Software on Acoustic Voice Measures

    PubMed Central

    Grillo, Elizabeth U.; Brosious, Jenna N.; Sorrell, Staci L.; Anand, Supraja

    2016-01-01

    This study assessed the within-subject variability of voice measures captured using different recording devices (i.e., smartphones and head mounted microphone) and software programs (i.e., Analysis of Dysphonia in Speech and Voice (ADSV), Multi-dimensional Voice Program (MDVP), and Praat). Correlations between the software programs that calculated the voice measures were also analyzed. Results demonstrated no significant within-subject variability across devices and software and that some of the measures were highly correlated across software programs. The study suggests that certain smartphones may be appropriate to record daily voice measures representing the effects of vocal loading within individuals. In addition, even though different algorithms are used to compute voice measures across software programs, some of the programs and measures share a similar relationship. PMID:28775797

  16. Combining Architecture-Centric Engineering with the Team Software Process

    DTIC Science & Technology

    2010-12-01

    colleagues from Quarksoft and CIMAT have recently reported on their experiences in “Introducing Software Architecture Development Methods into a TSP...Postmortem Lessons, new goals, new requirements, new risk, etc. Business and technical goals Estimates, plans, process, commitment Work products...architecture to mitigate the risks uncovered by the ATAM. At the end of the iteration, version 1.0 of the architecture is available. Implement a second

  17. Current modeling practice may lead to falsely high benchmark dose estimates.

    PubMed

    Ringblom, Joakim; Johanson, Gunnar; Öberg, Mattias

    2014-07-01

    Benchmark dose (BMD) modeling is increasingly used as the preferred approach to define the point of departure for health risk assessment of chemicals. As data are inherently variable, there is always a risk of selecting a model that defines a lower confidence bound of the BMD (BMDL) that, contrary to expectation, exceeds the true BMD. The aim of this study was to investigate how often and under what circumstances such anomalies occur under current modeling practice. Continuous data were generated from a realistic dose-effect curve by Monte Carlo simulations using four dose groups and a set of five different dose placement scenarios, group sizes between 5 and 50 animals and coefficients of variation of 5-15%. The BMD calculations were conducted using nested exponential models, as most BMD software uses nested approaches. "Non-protective" BMDLs (higher than the true BMD) were frequently observed, in some scenarios reaching 80%. The phenomenon was mainly related to the selection of the non-sigmoidal exponential model (Effect = a·e^(b·dose)). In conclusion, non-sigmoidal models should be used with caution as they may underestimate the risk, illustrating that awareness of the model selection process and sound identification of the point of departure is vital for health risk assessment. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
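
    A minimal sketch of one iteration of the kind of Monte Carlo experiment described above, under stated assumptions: data are simulated from a sigmoidal "true" curve, the non-sigmoidal model Effect = a·e^(b·dose) is fitted, and a BMD for a 5% change in mean response is derived. The curve parameters, group sizes and benchmark response are illustrative, not the study's.

      import numpy as np
      from scipy.optimize import curve_fit

      rng = np.random.default_rng(1)
      doses = np.array([0.0, 0.5, 1.0, 2.0])       # four dose groups
      n_per_group, cv = 10, 0.10                   # group size, coefficient of variation

      def true_curve(d):                           # sigmoidal ground truth (assumed)
          return 10.0 * (1.0 + 0.5 * d**3 / (1.0 + d**3))

      def exp_model(d, a, b):                      # non-sigmoidal fitted model
          return a * np.exp(b * d)

      d_obs = np.repeat(doses, n_per_group)
      y_obs = true_curve(d_obs) * (1 + cv * rng.standard_normal(d_obs.size))

      (a, b), _ = curve_fit(exp_model, d_obs, y_obs, p0=(10.0, 0.1))

      bmr = 0.05                                   # 5% change in mean response
      bmd = np.log(1.0 + bmr) / b                  # from a*e^(b*BMD) = a*(1 + BMR)
      print(f"fitted a={a:.2f}, b={b:.3f}, BMD={bmd:.3f}")

    Repeating this many times and comparing the fitted BMDL distribution against the true BMD is what reveals the "non-protective" cases the study counts.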

  18. Combining PubMed knowledge and EHR data to develop a weighted bayesian network for pancreatic cancer prediction.

    PubMed

    Zhao, Di; Weng, Chunhua

    2011-10-01

    In this paper, we propose a novel method that combines PubMed knowledge and Electronic Health Records to develop a weighted Bayesian Network Inference (BNI) model for pancreatic cancer prediction. We selected 20 common risk factors associated with pancreatic cancer and used PubMed knowledge to weigh the risk factors. A keyword-based algorithm was developed to extract and classify PubMed abstracts into three categories that represented positive, negative, or neutral associations between each risk factor and pancreatic cancer. Then we designed a weighted BNI model by adding the normalized weights into a conventional BNI model. We used this model to extract the EHR values for patients with or without pancreatic cancer, which then enabled us to calculate the prior probabilities for the 20 risk factors in the BNI. The software iDiagnosis was designed to use this weighted BNI model for predicting pancreatic cancer. In an evaluation using a case-control dataset, the weighted BNI model significantly outperformed the conventional BNI and two other classifiers (k-Nearest Neighbor and Support Vector Machine). We conclude that the weighted BNI using PubMed knowledge and EHR data shows remarkable accuracy improvement over existing representative methods for pancreatic cancer prediction. Copyright © 2011 Elsevier Inc. All rights reserved.
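
    One plausible reading of "adding the normalized weights into a conventional BNI model" is to scale each risk factor's log-likelihood contribution by its normalized weight, as in the hedged naive-Bayes-style sketch below. All probabilities, weights and factor names are hypothetical, and the authors' exact network structure may differ.

      import math

      # Hypothetical conditional probabilities P(factor present | class)
      p_given_cancer  = {"smoking": 0.40, "diabetes": 0.30, "pancreatitis": 0.20}
      p_given_healthy = {"smoking": 0.25, "diabetes": 0.10, "pancreatitis": 0.02}
      weights         = {"smoking": 0.50, "diabetes": 0.30, "pancreatitis": 0.20}
      prior_cancer = 0.01                          # assumed prior, not from the paper

      def weighted_log_likelihood(probs, observed):
          # each factor's log-likelihood contribution is scaled by its weight
          return sum(weights[f] * math.log(probs[f] if present else 1 - probs[f])
                     for f, present in observed.items())

      observed = {"smoking": True, "diabetes": True, "pancreatitis": False}
      log_pc = math.log(prior_cancer) + weighted_log_likelihood(p_given_cancer, observed)
      log_ph = math.log(1 - prior_cancer) + weighted_log_likelihood(p_given_healthy, observed)
      posterior = 1.0 / (1.0 + math.exp(log_ph - log_pc))
      print(f"P(pancreatic cancer | evidence) = {posterior:.4f}")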

  19. Combining PubMed Knowledge and EHR Data to Develop a Weighted Bayesian Network for Pancreatic Cancer Prediction

    PubMed Central

    Zhao, Di; Weng, Chunhua

    2011-01-01

    In this paper, we propose a novel method that combines PubMed knowledge and Electronic Health Records to develop a weighted Bayesian Network Inference (BNI) model for pancreatic cancer prediction. We selected 20 common risk factors associated with pancreatic cancer and used PubMed knowledge to weigh the risk factors. A keyword-based algorithm was developed to extract and classify PubMed abstracts into three categories that represented positive, negative, or neutral associations between each risk factor and pancreatic cancer. Then we designed a weighted BNI model by adding the normalized weights into a conventional BNI model. We used this model to extract the EHR values for patients with or without pancreatic cancer, which then enabled us to calculate the prior probabilities for the 20 risk factors in the BNI. The software iDiagnosis was designed to use this weighted BNI model for predicting pancreatic cancer. In an evaluation using a case-control dataset, the weighted BNI model significantly outperformed the conventional BNI and two other classifiers (k-Nearest Neighbor and Support Vector Machine). We conclude that the weighted BNI using PubMed knowledge and EHR data shows remarkable accuracy improvement over existing representative methods for pancreatic cancer prediction. PMID:21642013

  20. Mother and child characteristics at birth and early age leukemia: a case-cohort population-based study.

    PubMed

    Reis, Rejane de Souza; Silva, Neimar de Paula; Santos, Marceli de Oliveira; Oliveira, Julio Fernando Pinto; Thuler, Luiz Claudio Santos; de Camargo, Beatriz; Pombo-de-Oliveira, Maria S

    The population-based cancer registries (PBCR) and the Information System on Live Births in Brazil (Sistema de Informações sobre Nascidos Vivos [SINASC]) contain information that enables testing for risk factors associated with leukemia at an early age. The aim of this study was to identify maternal and birth characteristics associated with early-age acute leukemia (EAL) in Brazil. A case-cohort study was performed using secondary dataset information from the PBCR and SINASC. The risk-association variables were grouped into (i) characteristics of the child at birth and (ii) characteristics of maternal exposure during pregnancy. The case-control ratio was 1:4. Linkage was performed using R software; odds ratios (OR) and 95% confidence intervals (CI) were calculated by logistic regression models. EAL was associated with maternal occupational exposure to chemicals (agricultural, chemical, and petrochemical industry; adjOR: 2.18, 95% CI: 1.16-4.10) and with birth defects (adjOR: 3.62, 95% CI: 1.19-11.00). The results of this study, with the identification of EAL risk factors in a population-based case-cohort study, strengthen the knowledge and improve databases, contributing to investigations on risk factors associated with childhood leukemia worldwide. Copyright © 2017 Sociedade Brasileira de Pediatria. Published by Elsevier Editora Ltda. All rights reserved.

  1. [Calculating Pearson residual in logistic regressions: a comparison between SPSS and SAS].

    PubMed

    Xu, Hao; Zhang, Tao; Li, Xiao-song; Liu, Yuan-yuan

    2015-01-01

    To compare the results of Pearson residual calculations in logistic regression models using SPSS and SAS. We reviewed Pearson residual calculation methods and used two sets of data to test logistic models constructed in SPSS and SAS. One model contained a small number of covariates relative to the number of observations; the other contained a number of covariates similar to the number of observations. The two software packages produced similar Pearson residual estimates when the model contained a number of covariates similar to the number of observations, but the results differed when the number of observations was much greater than the number of covariates. The two software packages can thus produce different Pearson residuals, especially when the model contains a small number of covariates. Further studies are warranted.
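
    For reference, the Pearson residual itself is simple to compute, as the sketch below shows. A plausible source of the cross-package differences reported here is whether observations are first aggregated into covariate patterns before the fitted probabilities are formed; the arrays below are invented examples.

      import numpy as np

      y = np.array([1, 0, 1, 1, 0], dtype=float)   # observed binary outcomes
      p = np.array([0.8, 0.3, 0.6, 0.9, 0.2])      # fitted probabilities from the model

      # Pearson residual: r_i = (y_i - p_i) / sqrt(p_i * (1 - p_i))
      pearson_residuals = (y - p) / np.sqrt(p * (1.0 - p))
      print(pearson_residuals)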

  2. Method of experimental and calculation determination of dissipative properties of carbon

    NASA Astrophysics Data System (ADS)

    Kazakova, Olga I.; Smolin, Igor Yu.; Bezmozgiy, Iosif M.

    2017-12-01

    This paper describes the process of determining the relations between the damping ratio and the stress/strain levels in a material. For this purpose, a combined experimental and computational approach was applied. The experimental research was performed on plane composite specimens. The tests were accompanied by finite element modeling using the ANSYS software. Optimization was used as a tool for setting the FEM properties and for finding the above-mentioned relations. The difference between the calculated and experimental results was taken as the objective function of this optimization. The optimization cycle was implemented using the pSeven DATADVANCE software platform. The developed approach makes it possible to determine the relations between the damping ratio and the stress/strain levels in the material, which can be used for computer modeling of the structural response under dynamic loading.

  3. Nested Cohort - R software package

    Cancer.gov

    NestedCohort is an R software package for fitting Kaplan-Meier and Cox Models to estimate standardized survival and attributable risks for studies where covariates of interest are observed on only a sample of the cohort.

  4. Digital Methodologies of Education Governance: Pearson plc and the Remediation of Methods

    ERIC Educational Resources Information Center

    Williamson, Ben

    2016-01-01

    This article analyses the rise of software systems in education governance, focusing on digital methods in the collection, calculation and circulation of educational data. It examines how software-mediated methods intervene in the ways educational institutions and actors are seen, known and acted upon through an analysis of the methodological…

  5. KinChem: A Computational Resource for Teaching and Learning Chemical Kinetics

    ERIC Educational Resources Information Center

    da Silva, José Nunes, Jr.; Sousa Lima, Mary Anne; Silva Sousa, Eduardo Henrique; Oliveira Alexandre, Francisco Serra; Melo Leite, Antonio José, Jr.

    2014-01-01

    This paper presents a piece of educational software covering a comprehensive number of topics of chemical kinetics, which is available free of charge in Portuguese and English. The software was developed to support chemistry educators and students in the teaching-learning process of chemical kinetics by using animations, calculations, and…

  6. An intelligent maximum permissible exposure meter for safety assessments of laser radiation

    NASA Astrophysics Data System (ADS)

    Corder, D. A.; Evans, D. R.; Tyrer, J. R.

    1996-09-01

    There is frequently a need to make laser power or energy density measurements when determining whether radiation from a laser system exceeds the Maximum Permissible Exposure (MPE) as defined in BS EN 60825. This can be achieved using standard commercially available laser power or energy measurement equipment, but some of these have shortcomings when used in this application. Calculations must be performed by the user to compare the measured value to the MPE. The measurement and calculation procedure appears complex to the nonexpert who may be performing the assessment. A novel approach is described which uses purpose designed hardware and software to simplify the process. The hardware is optimized for measuring the relatively low powers associated with MPEs. The software runs on a Psion Series 3a palmtop computer. This reduces the cost and size of the system yet allows graphical and numerical presentation of data. Data output to other software running on PCs is also possible, enabling the instrument to be used as part of a quality system. Throughout the measurement process the opportunity for user error has been minimized by the hardware and software design.

  7. Development and validation of risk prediction algorithms to estimate future risk of common cancers in men and women: prospective cohort study

    PubMed Central

    Hippisley-Cox, Julia; Coupland, Carol

    2015-01-01

    Objective To derive and validate a set of clinical risk prediction algorithms to estimate the 10-year risk of 11 common cancers. Design Prospective open cohort study using routinely collected data from 753 QResearch general practices in England. We used 565 practices to develop the scores and 188 for validation. Subjects 4.96 million patients aged 25–84 years in the derivation cohort; 1.64 million in the validation cohort. Patients were free of the relevant cancer at baseline. Methods Cox proportional hazards models in the derivation cohort to derive 10-year risk algorithms. Risk factors considered included age, ethnicity, deprivation, body mass index, smoking, alcohol, previous cancer diagnoses, family history of cancer, relevant comorbidities and medication. Measures of calibration and discrimination in the validation cohort. Outcomes Incident cases of blood, breast, bowel, gastro-oesophageal, lung, oral, ovarian, pancreas, prostate, renal tract and uterine cancers. Cancers were recorded on any one of four linked data sources (general practitioner (GP), mortality, hospital or cancer records). Results We identified 228 241 incident cases during follow-up of the 11 types of cancer. Of these 25 444 were blood; 41 315 breast; 32 626 bowel, 12 808 gastro-oesophageal; 32 187 lung; 4811 oral; 6635 ovarian; 7119 pancreatic; 35 256 prostate; 23 091 renal tract; 6949 uterine cancers. The lung cancer algorithm had the best performance with an R2 of 64.2%; D statistic of 2.74; receiver operating characteristic curve statistic of 0.91 in women. The sensitivity for the top 10% of women at highest risk of lung cancer was 67%. Performance of the algorithms in men was very similar to that for women. Conclusions We have developed and validated prediction models to quantify the absolute risk of 11 common cancers. They can be used to identify patients at high risk of cancers for prevention or further assessment. The algorithms could be integrated into clinical computer systems and used to identify high-risk patients. Web calculator: There is a simple web calculator to implement the Qcancer 10-year risk algorithm together with the open source software for download (available at http://qcancer.org/10yr/). PMID:25783428
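
    A hedged sketch of how such an algorithm turns a fitted Cox model into the absolute 10-year risk a web calculator reports: risk = 1 - S0(10)^exp(lp), where S0 is the baseline 10-year survival and lp is the patient's linear predictor. All coefficients and the baseline survival below are invented, not QCancer values.

      import math

      # Hypothetical Cox coefficients (log hazard ratios) and baseline survival
      coef = {"age_per_10yr": 0.45, "smoker": 0.60, "family_history": 0.35}
      s0_10yr = 0.995                              # assumed baseline 10-year survival

      def ten_year_risk(age, smoker, family_history):
          lp = (coef["age_per_10yr"] * (age - 45) / 10.0
                + coef["smoker"] * smoker
                + coef["family_history"] * family_history)
          return 1.0 - s0_10yr ** math.exp(lp)

      print(f"10-year risk: {ten_year_risk(age=65, smoker=1, family_history=0):.2%}")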

  8. SPREADSHEET BASED SCALING CALCULATIONS AND MEMBRANE PERFORMANCE

    EPA Science Inventory

    Many membrane element manufacturers provide a computer program to aid buyers in the use of their elements. However, to date there are few examples of fully integrated public domain software available for calculating reverse osmosis and nanofiltration system performance. The Total...

  9. Watershed Health Assessment Tools Investigating Fisheries

    EPA Science Inventory

    WHATIF is software that integrates a number of calculators, tools, and models for assessing the health of watersheds and streams with an emphasis on fish communities. The tool set consists of hydrologic and stream geometry calculators, a fish assemblage predictor, a fish habitat ...

  10. Techniques for Developing an Acquisition Strategy by Profiling Software Risks

    DTIC Science & Technology

    2006-08-01

    Drivers... Figure 8: BMW 745Li Software... The BMW 745Li, shown in Figure 8, is a good illustration of the increasing software control of hardware systems in automobiles. Among the many features...roll stabilization, dynamic brake control, coded drive-away protection, an adaptive automatic transmission, and iDrive systems. This list can be

  11. Automated Estimation Of Software-Development Costs

    NASA Technical Reports Server (NTRS)

    Roush, George B.; Reini, William

    1993-01-01

    COSTMODL is an automated software-development estimation tool. Yields significant reduction in risk of cost overruns and failed projects. Accepts description of software product to be developed and computes estimates of effort required to produce it, calendar schedule required, and distribution of effort and staffing as function of defined set of development life-cycle phases. Written for IBM PC®-compatible computers.

  12. Predicting Vulnerability Risks Using Software Characteristics

    ERIC Educational Resources Information Center

    Roumani, Yaman

    2012-01-01

    Software vulnerabilities have been regarded as one of the key reasons for computer security breaches that have resulted in billions of dollars in losses per year (Telang and Wattal 2005). With the growth of the software industry and the Internet, the number of vulnerability attacks and the ease with which an attack can be made have increased. From…

  13. Meningiomas: Objective assessment of proliferative indices by immunohistochemistry and automated counting method.

    PubMed

    Chavali, Pooja; Uppin, Megha S; Uppin, Shantveer G; Challa, Sundaram

    2017-01-01

    The most reliable histological correlate of recurrence risk in meningiomas is increased mitotic activity. The proliferative index from Ki-67 immunostaining is a helpful adjunct to manual counting. However, both show considerable inter-observer variability. A new immunohistochemical method for counting mitotic figures, using an antibody against the phosphohistone H3 (PHH3) protein, was recently introduced. Similarly, computer-based automated counting for the Ki-67 labelling index (LI) is available. To study the use of these new techniques in the objective assessment of proliferation indices in meningiomas. This was a retrospective study of intracranial meningiomas diagnosed during the year 2013. The hematoxylin and eosin (H and E) sections and immunohistochemistry (IHC) with Ki-67 were reviewed by two pathologists. Photomicrographs of the representative areas were subjected to Ki-67 analysis by the ImmunoRatio (IR) software. Mean Ki-67 LI, both manual and by IR, was calculated. IHC with PHH3 was performed. PHH3-positive nuclei were counted and mean values calculated. Data analysis was done using SPSS software. A total of 64 intracranial meningiomas were diagnosed. Evaluation on H and E, PHH3, and Ki-67 LI (both manual and IR) was done in 32 cases (22 grade I and 10 grade II meningiomas). A statistically significant correlation was seen between the mitotic count in each grade and PHH3 values, and also between the grade of the tumor and the values of Ki-67 and PHH3. Both techniques used in the study had advantages over, as well as correlated well with, the existing techniques and hence can be applied to routine use.

  14. The MiAge Calculator: a DNA methylation-based mitotic age calculator of human tissue types.

    PubMed

    Youn, Ahrim; Wang, Shuang

    2018-01-01

    Cell division is important in human aging and cancer. The estimation of the number of cell divisions (mitotic age) of a given tissue type in individuals is of great interest as it allows not only the study of biological aging (using a new molecular aging target) but also the stratification of prospective cancer risk. Here, we introduce the MiAge Calculator, a mitotic age calculator based on a novel statistical framework, the MiAge model. MiAge is designed to quantitatively estimate the mitotic age (total number of lifetime cell divisions) of a tissue using the stochastic replication errors accumulated in the epigenetic inheritance process during cell divisions. With the MiAge model, the MiAge Calculator was built using the training data of DNA methylation measures of 4,020 tumor and adjacent normal tissue samples from eight TCGA cancer types and was tested using the testing data of DNA methylation measures of 2,221 tumor and adjacent normal tissue samples of five other TCGA cancer types. We showed that within each of the thirteen cancer types studied, the estimated mitotic age is universally accelerated in tumor tissues compared to adjacent normal tissues. Across the thirteen cancer types, we showed that worse cancer survival is associated with more accelerated mitotic age in tumor tissues. Importantly, we demonstrated the utility of mitotic age by showing that the integration of mitotic age and clinical information leads to improved survival prediction in six out of the thirteen cancer types studied. The MiAge Calculator is available at http://www.columbia.edu/~sw2206/softwares.htm.

  15. Meta-analysis of association between mobile phone use and glioma risk.

    PubMed

    Wang, Yabo; Guo, Xiaqing

    2016-12-01

    The purpose of this study was to evaluate the association between mobile phone use and glioma risk by pooling the published data. By searching the Medline, EMBASE, and CNKI databases with a systematic search strategy, we screened the openly published case-control and cohort studies on mobile phone use and glioma risk. The pooled odds of mobile phone use in glioma patients versus healthy controls were calculated by the meta-analysis method. The statistical analysis was done with Stata 12.0 software (http://www.stata.com). After searching the Medline, EMBASE, and CNKI databases, we ultimately included 11 studies ranging from 2001 to 2008. For the ≥1 year group, the data were pooled by a random-effects model. The combined data showed that there was no association between mobile phone use and glioma, odds ratio (OR) = 1.08 (95% confidence interval [CI]: 0.91-1.25, P > 0.05). However, a significant association was found between mobile phone use for more than 5 years and glioma risk, OR = 1.35 (95% CI: 1.09-1.62, P < 0.05). The publication bias of this study was evaluated by funnel plot and linear regression test. The funnel plot and linear regression test (t = 0.25, P = 0.81) did not indicate any publication bias. Long-term mobile phone use may increase the risk of developing glioma according to this meta-analysis.
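
    The random-effects pooling step referred to above is commonly the DerSimonian-Laird estimator; the sketch below pools two log odds ratios that way. The input ORs and confidence intervals are made up for illustration, not values from the included studies.

      import numpy as np

      or_est = np.array([1.10, 1.40])              # study odds ratios (hypothetical)
      ci_low = np.array([0.85, 1.05])
      ci_high = np.array([1.42, 1.87])

      y = np.log(or_est)                           # effect sizes on the log scale
      se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
      w_fixed = 1.0 / se**2

      # Between-study variance tau^2 (DerSimonian-Laird moment estimator)
      y_fixed = np.sum(w_fixed * y) / np.sum(w_fixed)
      Q = np.sum(w_fixed * (y - y_fixed) ** 2)
      df = len(y) - 1
      c = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
      tau2 = max(0.0, (Q - df) / c)

      w_rand = 1.0 / (se**2 + tau2)                # random-effects weights
      y_pool = np.sum(w_rand * y) / np.sum(w_rand)
      se_pool = np.sqrt(1.0 / np.sum(w_rand))
      print(f"pooled OR = {np.exp(y_pool):.2f} "
            f"(95% CI {np.exp(y_pool - 1.96*se_pool):.2f}-{np.exp(y_pool + 1.96*se_pool):.2f})")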

  16. Acute myocardial infarction and COPD attributed to ambient SO2 in Iran.

    PubMed

    Khaniabadi, Yusef Omidi; Daryanoosh, Seyed Mohammad; Hopke, Philip K; Ferrante, Margherita; De Marco, Alessandra; Sicard, Pierre; Oliveri Conti, Gea; Goudarzi, Gholamreza; Basiri, Hassan; Mohammadi, Mohammad Javad; Keishams, Fariba

    2017-07-01

    Acute myocardial infarction (MI) and chronic obstructive pulmonary disease (COPD) are important diseases worldwide. Inhalation is the major route of short-term exposure to airborne sulfur dioxide (SO2), which negatively affects human health. The objective of this study was to estimate the health effects of short-term exposure to SO2 in Khorramabad, Iran, using the AirQ software developed by the World Health Organization (WHO). Daily mean SO2 concentrations were used as estimates of human short-term exposure and allow calculation of the attributable excess relative risk of an acute MI and of hospital admissions due to COPD (HACOPD). The annual mean SO2 concentration in Khorramabad was 51.33 µg/m³. Based on the relative risk (RR) and baseline incidence (BI) approach of the WHO, increased risks of 2.7% (95% CI: 1.1-4.2%) for acute MI and 2.0% (95% CI: 0-4.6%) for HACOPD, respectively, were attributed to a 10 µg/m³ SO2 increase. Since the geographic, demographic, and climatic characteristics differ from those of the areas in which the risk relationships were developed and were not evaluated here, further investigations will be needed to fully quantify other health impacts of SO2. A decreased risk of MIs and COPD attributable to SO2 could be achieved if mitigation strategies and measures are implemented to reduce the exposure. Copyright © 2017. Published by Elsevier Inc.
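
    A minimal sketch of the relative-risk/baseline-incidence bookkeeping behind AirQ-style estimates, assuming a linear excess relative risk per 10 µg/m³. The baseline incidence and population below are hypothetical; only the RR of 1.027 and the mean concentration echo the abstract.

      def excess_cases(rr_per_10ug, conc_ug_m3, ref_conc_ug_m3,
                       baseline_incidence_per_1e5, population):
          """Attributable cases for a linear excess relative risk per 10 ug/m3."""
          rr = 1.0 + (rr_per_10ug - 1.0) * (conc_ug_m3 - ref_conc_ug_m3) / 10.0
          attributable_proportion = (rr - 1.0) / rr
          expected = baseline_incidence_per_1e5 / 1e5 * population
          return attributable_proportion * expected

      cases = excess_cases(rr_per_10ug=1.027, conc_ug_m3=51.33, ref_conc_ug_m3=10.0,
                           baseline_incidence_per_1e5=132.0, population=350_000)
      print(f"estimated attributable MI cases per year: {cases:.1f}")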

  17. Environmental pollution and deaths due to stroke in a city with low levels of air pollution: ecological time series study.

    PubMed

    Amancio, Camila Trolez; Nascimento, Luiz Fernando

    2014-12-01

    Little has been published about the increased risk of stroke after exposure to air pollutants, particularly in Brazil. The mechanisms through which air pollution can influence the occurrence of vascular events such as stroke are still poorly understood. The aim of this study was to estimate the association between exposure to some air pollutants and the risk of death due to stroke. Ecological time series study with data from São José dos Campos, Brazil. Data on deaths due to stroke among individuals of all ages living in São José dos Campos, and on particulate matter, sulfur dioxide and ozone, were used. Statistical analysis was performed using a generalized additive model of Poisson regression with the Statistica software, in unipollutant and multipollutant models. The percentage increase in risk per interquartile-range increase in pollutant concentration was calculated. There were 1,032 deaths due to stroke, ranging from 0 to 5 per day. Exposure to particulate matter was statistically significant in the unipollutant model, and both particulate matter and sulfur dioxide were significant in the multipollutant model. The increases in risk were 10% and 7% for particulate matter and sulfur dioxide, respectively. It was possible to identify exposure to air pollutants as a risk factor for death due to stroke, even in a city with low levels of air pollution.
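
    The reported percentage increases are conventionally derived from the Poisson regression coefficient and the pollutant's interquartile range, as sketched below; the coefficient and IQR are invented numbers, not the study's.

      import math

      beta_pm10 = 0.0038     # hypothetical log relative risk per 1 ug/m3 of PM10
      iqr_pm10 = 25.0        # hypothetical interquartile range, ug/m3

      # percent increase in risk per IQR increase: (e^(beta * IQR) - 1) * 100
      pct_increase = (math.exp(beta_pm10 * iqr_pm10) - 1.0) * 100.0
      print(f"risk increase per IQR of PM10: {pct_increase:.1f}%")   # ~10%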

  18. Foot clearance and variability in mono- and multifocal intraocular lens users during stair navigation.

    PubMed

    Renz, Erik; Hackney, Madeleine; Hall, Courtney

    2016-01-01

    Intraocular lenses (IOLs) provide distance and near refraction and are becoming the standard for cataract surgery. Multifocal glasses increase the variability of toe clearance in older adults navigating stairs and increase fall risk; however, little is known about the biomechanics of stair navigation in individuals with multifocal IOLs. This study compared clearance while ascending and descending stairs in individuals with monofocal versus multifocal IOLs. Eight participants with multifocal IOLs (4 men, 4 women; mean age = 66.5 yr, standard deviation [SD] = 6.26) and fifteen male participants with monofocal IOLs (mean age = 69.9 yr, SD = 6.9) underwent vision and mobility testing. Motion analysis recorded kinematics, and custom software calculated clearances in three-dimensional space. No significant differences were found between groups in minimum clearance or variability. Clearance differed for ascending versus descending stairs: the first step onto the stair had the greatest toe clearance during ascent, whereas the final step to the floor had the greatest heel clearance during descent. This preliminary study indicates that multifocal IOLs have biomechanical characteristics similar to monofocal IOLs. Given that step characteristics are related to fall risk, we can tentatively speculate that multifocal IOLs may carry no additional fall risk.

  19. Finite element based damage assessment of composite tidal turbine blades

    NASA Astrophysics Data System (ADS)

    Fagan, Edward M.; Leen, Sean B.; Kennedy, Ciaran R.; Goggins, Jamie

    2015-07-01

    With significant interest growing in the ocean renewables sector, horizontal axis tidal current turbines are in a position to dominate the marketplace. The test devices that have been placed in operation so far have suffered from premature failures, caused by difficulties with structural strength prediction. The goal of this work is to develop methods of predicting the damage level in tidal turbines under their maximum operating tidal velocity. The analysis was conducted using the finite element software package Abaqus; shell models of three representative tidal turbine blades are produced. Different construction methods will affect the damage level in the blade and for this study models were developed with varying hydrofoil profiles. In order to determine the risk of failure, a user material subroutine (UMAT) was created. The UMAT uses the failure criteria designed by Alfred Puck to calculate the risk of fibre and inter-fibre failure in the blades. The results show that degradation of the stiffness is predicted for the operating conditions, having an effect on the overall tip deflection. The failure criteria applied via the UMAT form a useful tool for analysis of high risk regions within the blade designs investigated.

  20. The Validation by Measurement Theory of Proposed Object-Oriented Software Metrics

    NASA Technical Reports Server (NTRS)

    Neal, Ralph D.

    1996-01-01

    Moving software development into the engineering arena requires controllability, and to control a process, it must be measurable. Measuring the process does no good if the product is not also measured, i.e., being the best at producing an inferior product does not define a quality process. Also, not every number extracted from software development is a valid measurement. A valid measurement only results when we are able to verify that the number is representative of the attribute that we wish to measure. Many proposed software metrics are used by practitioners without these metrics ever having been validated, leading to costly but often useless calculations. Several researchers have bemoaned the lack of scientific precision in much of the published software measurement work and have called for validation of software metrics by measurement theory. This dissertation applies measurement theory to validate fifty proposed object-oriented software metrics.

  1. Integrating open-source software applications to build molecular dynamics systems.

    PubMed

    Allen, Bruce M; Predecki, Paul K; Kumosa, Maciej

    2014-04-05

    Three open-source applications, NanoEngineer-1, packmol, and msi2lmp, are integrated using an open-source file format to quickly create molecular dynamics (MD) cells for simulation. The three software applications collectively make up the open-source software (OSS) suite known as MD Studio (MDS). The software is validated through software engineering practices and is verified through simulation of the diglycidyl ether of bisphenol-A and isophorone diamine (DGEBA/IPD) system. Multiple simulations are run using the MDS software to create MD cells, and the data generated are used to calculate the density, bulk modulus, and glass transition temperature of the DGEBA/IPD system. Simulation results compare well with published experimental and numerical results. The MDS software prototype confirms that OSS applications can be analyzed against real-world research requirements and integrated to create a new capability. Copyright © 2014 Wiley Periodicals, Inc.

  2. Genetic structure and conservation of Mountain Lions in the South-Brazilian Atlantic Rain Forest

    PubMed Central

    Castilho, Camila S.; Marins-Sá, Luiz G.; Benedet, Rodrigo C.; Freitas, Thales R.O.

    2012-01-01

    The Brazilian Atlantic Rain Forest, one of the most endangered ecosystems worldwide, is also among the most important biodiversity hotspots. Through intensive logging, the initial area has been reduced to around 12% of its original size. In this study we investigated the genetic variability and structure of the mountain lion, Puma concolor. Using 18 microsatellite loci, we analyzed evidence of allele dropout, null alleles and stuttering, and calculated the number of alleles per locus, PIC, observed and expected heterozygosity, linkage disequilibrium, Hardy-Weinberg equilibrium, FIS, effective population size and genetic structure (MICROCHECKER, CERVUS, GENEPOP, FSTAT, ARLEQUIN, ONESAMP, LDNe, PCAGEN, GENECLASS software). We also determined whether there was evidence of a bottleneck (HYBRIDLAB, BOTTLENECK software) that might influence the future viability of the population in south Brazil. 106 alleles were identified, with the number of alleles per locus ranging from 2 to 11. Mean observed heterozygosity, mean number of alleles and polymorphism information content were 0.609, 5.89, and 0.6255, respectively. This population presented evidence of a recent bottleneck and loss of genetic variation. Persistent regional poaching constitutes an increasing extinction risk. PMID:22481876

  3. Predicting Software Suitability Using a Bayesian Belief Network

    NASA Technical Reports Server (NTRS)

    Beaver, Justin M.; Schiavone, Guy A.; Berrios, Joseph S.

    2005-01-01

    The ability to reliably predict the end quality of software under development presents a significant advantage for a development team. It provides an opportunity to address high risk components earlier in the development life cycle, when their impact is minimized. This research proposes a model that captures the evolution of the quality of a software product, and provides reliable forecasts of the end quality of the software being developed in terms of product suitability. Development team skill, software process maturity, and software problem complexity are hypothesized as driving factors of software product quality. The cause-effect relationships between these factors and the elements of software suitability are modeled using Bayesian Belief Networks, a machine learning method. This research presents a Bayesian Network for software quality, and the techniques used to quantify the factors that influence and represent software quality. The developed model is found to be effective in predicting the end product quality of small-scale software development efforts.

  4. An object oriented implementation of the Yeadon human inertia model

    PubMed Central

    Dembia, Christopher; Moore, Jason K.; Hubbard, Mont

    2015-01-01

    We present an open source software implementation of a popular mathematical method developed by M.R. Yeadon for calculating the body and segment inertia parameters of a human body. The software is written in a high level open source language and provides three interfaces for manipulating the data and the model: a Python API, a command-line user interface, and a graphical user interface. Thus the software can fit into various data processing pipelines and requires only simple geometrical measures as input. PMID:25717365

  5. An object oriented implementation of the Yeadon human inertia model.

    PubMed

    Dembia, Christopher; Moore, Jason K; Hubbard, Mont

    2014-01-01

    We present an open source software implementation of a popular mathematical method developed by M.R. Yeadon for calculating the body and segment inertia parameters of a human body. The software is written in a high level open source language and provides three interfaces for manipulating the data and the model: a Python API, a command-line user interface, and a graphical user interface. Thus the software can fit into various data processing pipelines and requires only simple geometrical measures as input.

  6. The computer-aided parallel external fixator for complex lower limb deformity correction.

    PubMed

    Wei, Mengting; Chen, Jianwen; Guo, Yue; Sun, Hao

    2017-12-01

    Since the parameters of a parallel external fixator are difficult to measure and calculate in real applications, this study developed computer software that helps the doctor measure parameters using digital technology and generates an electronic prescription for deformity correction. Following Paley's deformity measurement method, we provided digital measurement techniques. In addition, we proposed a deformity-correction algorithm to calculate the elongations of the six struts and developed electronic prescription software. At the same time, a three-dimensional simulation of the parallel external fixator and deformed fragment was made using virtual reality modeling language technology. From 2013 to 2015, fifteen patients with complex lower limb deformity were treated with parallel external fixators and the self-developed computer software. All of the cases had unilateral limb deformity. The deformities were caused by old osteomyelitis in nine cases and traumatic sequelae in six cases. A doctor measured the relevant angulation, displacement and rotation on postoperative radiographs using the digital measurement techniques. The measurement data were input into the electronic prescription software to calculate the daily adjustment elongations of the struts. Daily strut adjustments were conducted according to the calculated data. The frame was removed when the expected results were achieved. Patients lived independently during the adjustment. The mean follow-up was 15 months (range 10-22 months). The duration of frame fixation from the time of application to the time of removal averaged 8.4 months (range 2.5-13.1 months). All patients were satisfied with the corrected limb alignment. No cases of wound infections or complications occurred. Using the computer-aided parallel external fixator for the correction of lower limb deformities can achieve satisfactory outcomes. The correction process is simplified, precise and digitized, which will greatly improve treatment in clinical applications.
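
    The strut-elongation computation is essentially the inverse kinematics of a hexapod: each strut length is the distance between its transformed upper joint and its fixed lower joint. The sketch below illustrates this under assumed ring geometry and joint coordinates; it is not the authors' algorithm.

      import numpy as np

      def rot_xyz(rx, ry, rz):
          """Rotation matrix from corrective angles (radians), applied Z*Y*X."""
          cx, sx = np.cos(rx), np.sin(rx)
          cy, sy = np.cos(ry), np.sin(ry)
          cz, sz = np.cos(rz), np.sin(rz)
          Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
          Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
          Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
          return Rz @ Ry @ Rx

      # Hypothetical joint positions (mm) on the fixed ring and the moving ring
      theta = np.deg2rad([0, 60, 120, 180, 240, 300])
      base = np.stack([80*np.cos(theta), 80*np.sin(theta), np.zeros(6)], axis=1)
      plat = np.stack([70*np.cos(theta + 0.2), 70*np.sin(theta + 0.2),
                       np.full(6, 120.0)], axis=1)

      def strut_lengths(rx, ry, rz, tx, ty, tz):
          R, t = rot_xyz(rx, ry, rz), np.array([tx, ty, tz])
          return np.linalg.norm((plat @ R.T + t) - base, axis=1)

      # Elongations needed for a 5-degree angular correction plus 3 mm translation
      delta = strut_lengths(np.deg2rad(5), 0, 0, 3.0, 0, 0) - strut_lengths(0, 0, 0, 0, 0, 0)
      print(np.round(delta, 2))

    Dividing such total elongations over the planned correction period yields the daily adjustment prescription described in the abstract.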

  7. User Guide for GoldSim Model to Calculate PA/CA Doses and Limits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, F.

    2016-10-31

    A model to calculate doses for solid waste disposal at the Savannah River Site (SRS) and corresponding disposal limits has been developed using the GoldSim commercial software. The model implements the dose calculations documented in SRNL-STI-2015-00056, Rev. 0 “Dose Calculation Methodology and Data for Solid Waste Performance Assessment (PA) and Composite Analysis (CA) at the Savannah River Site”.

  8. Validation of a Custom-made Software for DQE Assessment in Mammography Digital Detectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ayala-Dominguez, L.; Perez-Ponce, H.; Brandan, M. E.

    2010-12-07

    This work presents the validation of custom-made software, designed and developed in Matlab, intended for routine evaluation of the detective quantum efficiency (DQE) according to the algorithms described in the IEC 62220-1-2 standard. The DQE, normalized noise power spectrum (NNPS) and pre-sampling modulation transfer function (MTF) were calculated from RAW images from a GE Senographe DS (FineView disabled) and a Siemens Novation system. The calculated MTF is in close agreement with results obtained with alternative codes: MTF_tool (Maidment), an ImageJ plug-in (Perez-Ponce) and MIQuaELa (Ayala). Overall agreement better than ≈90% was found in the MTF; the largest differences were observed at frequencies close to the Nyquist limit. For the measurement of NNPS and DQE, agreement is similar to that obtained in the MTF. These results suggest that the developed software can be used with confidence for image quality assessment.
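
    For orientation, the final combination step in such a DQE evaluation is DQE(f) = MTF²(f) / (q · NNPS(f)), with q the squared signal-to-noise ratio per unit area of the incident quanta. The sketch below uses placeholder curves in place of measured MTF and NNPS; the numbers are illustrative, not from the validated systems.

      import numpy as np

      f = np.linspace(0.25, 5.0, 20)           # spatial frequency, 1/mm
      mtf = np.exp(-0.4 * f)                   # placeholder presampling MTF
      nnps = 2.0e-6 * (1.0 + 0.1 * f)          # placeholder NNPS, mm^2
      q = 5.0e5                                # assumed incident quanta per mm^2

      dqe = mtf**2 / (q * nnps)                # DQE(f) = MTF^2 / (q * NNPS)
      print(f"DQE at the lowest frequency: {dqe[0]:.2f}")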

  9. The SRS-Viewer: A Software Tool for Displaying and Evaluation of Pyroshock Data

    NASA Astrophysics Data System (ADS)

    Eberl, Stefan

    2014-06-01

    For the evaluation of the success of a pyroshock, the time domain and the corresponding Shock-Response-Spectra (SRS) have to be considered. The SRS-Viewer is an IABG-developed software tool [1] that reads data in Universal File format (*.unv) and either displays or plots, for each accelerometer, the time domain, the corresponding SRS and the specified Reference-SRS with tolerances in the background. The software calculates the "Average (AVG)", "Maximum (MAX)" and "Minimum (MIN)" SRS of any selection of accelerometers. A statistical analysis calculates the percentage of measured SRS above the specified Reference-SRS level and the percentage within the tolerance bands for comparison with the specified success criteria. Overlay plots of single accelerometers from different test runs make it possible to monitor the repeatability of the shock input and the integrity of the specimen. Furthermore, the difference between the shock on a mass dummy and on the real test unit can be examined.
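
    A minimal sketch of the maximax SRS computation such a tool performs per accelerometer, using the Smallwood ramp-invariant digital filter for the absolute acceleration response of each single-degree-of-freedom oscillator. The input pulse and Q value here are invented for illustration.

      import numpy as np
      from scipy.signal import lfilter

      def srs_maximax(accel, dt, freqs_hz, Q=10.0):
          """Peak absolute-acceleration response at each natural frequency."""
          zeta = 1.0 / (2.0 * Q)
          out = []
          for fn in freqs_hz:
              omega = 2 * np.pi * fn
              omegad = omega * np.sqrt(1 - zeta**2)
              E = np.exp(-zeta * omega * dt)
              K = omegad * dt
              C, S = E * np.cos(K), E * np.sin(K)
              Sp = S / K
              b = [1.0 - Sp, 2.0 * (Sp - C), E**2 - Sp]   # Smallwood coefficients
              a = [1.0, -2.0 * C, E**2]
              out.append(np.max(np.abs(lfilter(b, a, accel))))
          return np.array(out)

      # Example: half-sine shock pulse, 1000 g peak over 0.5 ms (hypothetical)
      dt = 1e-5
      t = np.arange(0, 0.02, dt)
      accel = np.where(t < 5e-4, 1000.0 * np.sin(np.pi * t / 5e-4), 0.0)
      freqs = np.logspace(np.log10(100), np.log10(10000), 30)
      print(srs_maximax(accel, dt, freqs).round(1))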

  10. HIGH-RATE FORMABILITY OF HIGH-STRENGTH ALUMINUM ALLOYS: A STUDY ON OBJECTIVITY OF MEASURED STRAIN AND STRAIN RATE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Upadhyay, Piyush; Rohatgi, Aashish; Stephens, Elizabeth V.

    2015-02-18

    Al alloy AA7075 sheets were deformed at room temperature at strain rates exceeding 1000/s using the electrohydraulic forming (EHF) technique. A method that combines high-speed imaging and the digital image correlation technique, developed at Pacific Northwest National Laboratory, is used to investigate the high strain rate deformation behavior of AA7075. For strain-rate-sensitive materials, the ability to accurately model their high-rate deformation behavior depends on the ability to accurately quantify the strain rate that the material is subjected to. This work investigates the objectivity of software-calculated strain and strain rate by varying different parameters within commonly used, commercially available digital image correlation software. Except very close to the time of crack opening, the calculated strain and strain rates are very consistent and independent of the adjustable parameters of the software.

  11. Interactive software tool to comprehend the calculation of optimal sequence alignments with dynamic programming.

    PubMed

    Ibarra, Ignacio L; Melo, Francisco

    2010-07-01

    Dynamic programming (DP) is a general optimization strategy that is successfully used across various disciplines of science. In bioinformatics, it is widely applied in calculating the optimal alignment between pairs of protein or DNA sequences. These alignments form the basis of new, verifiable biological hypotheses. Despite its importance, there are no interactive tools available for training and education on understanding the DP algorithm. Here, we introduce an interactive computer application with a graphical interface for the purpose of educating students about DP. The program displays the DP scoring matrix and the resulting optimal alignment(s), while allowing the user to modify key parameters such as the values in the similarity matrix, the sequence alignment algorithm version and the gap opening/extension penalties. We hope that this software will be useful to teachers and students of bioinformatics courses, as well as researchers who implement the DP algorithm for diverse applications. The software is freely available at: http://melolab.org/sat. The software is written in the Java computer language, thus it runs on all major platforms and operating systems including Windows, Mac OS X and LINUX. All inquiries or comments about this software should be directed to Francisco Melo at fmelo@bio.puc.cl.
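
    The computation the tool visualizes is the classic Needleman-Wunsch recurrence; a compact version with a linear gap penalty (the tool additionally supports separate gap opening/extension penalties) and arbitrary example scores is sketched below.

      def needleman_wunsch(a, b, match=1, mismatch=-1, gap=-2):
          """Global alignment score matrix plus one optimal alignment via traceback."""
          n, m = len(a), len(b)
          F = [[0] * (m + 1) for _ in range(n + 1)]
          for i in range(1, n + 1): F[i][0] = i * gap
          for j in range(1, m + 1): F[0][j] = j * gap
          for i in range(1, n + 1):
              for j in range(1, m + 1):
                  s = match if a[i-1] == b[j-1] else mismatch
                  F[i][j] = max(F[i-1][j-1] + s, F[i-1][j] + gap, F[i][j-1] + gap)
          # traceback for one optimal alignment
          out_a, out_b, i, j = [], [], n, m
          while i > 0 or j > 0:
              s = match if i > 0 and j > 0 and a[i-1] == b[j-1] else mismatch
              if i > 0 and j > 0 and F[i][j] == F[i-1][j-1] + s:
                  out_a.append(a[i-1]); out_b.append(b[j-1]); i, j = i - 1, j - 1
              elif i > 0 and F[i][j] == F[i-1][j] + gap:
                  out_a.append(a[i-1]); out_b.append("-"); i -= 1
              else:
                  out_a.append("-"); out_b.append(b[j-1]); j -= 1
          return F[n][m], "".join(reversed(out_a)), "".join(reversed(out_b))

      print(needleman_wunsch("GATTACA", "GCATGCU"))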

  12. Software and hardware complex for research and management of the separation process

    NASA Astrophysics Data System (ADS)

    Borisov, A. P.

    2018-01-01

    The article is devoted to the development of a program for studying the operation of an asynchronous electric drive with vector-algorithmic switching of the windings, as well as the development of a hardware-software complex for monitoring parameters and controlling the rotation speed of an asynchronous electric drive in order to investigate the operation of a cyclone. To study the operation of the asynchronous electric drive, a method was used in which the average value of the flux linkage is found, and a method was developed for the vector-algorithmic calculation of the power and electromagnetic torque of an asynchronous electric drive fed from a single-phase network with vector-algorithmic commutation, together with software for calculating the parameters. The software part of the complex makes it possible to regulate the motor speed by vector-algorithmic switching of the transistors or, using pulse-width modulation (PWM), to set any engine speed. Sensors at the inlet and outlet of the cyclone are also connected to the hardware-software complex. The developed cyclone with the embedded complex achieves high product separation efficiency at various inlet speeds. Maximum cyclone efficiency is achieved at an inlet air speed of 18 m/s, which requires running the asynchronous electric drive at a frequency of 45 Hz.

  13. [Sem: a suitable statistical software adaptated for research in oncology].

    PubMed

    Kwiatkowski, F; Girard, M; Hacene, K; Berlie, J

    2000-10-01

    Many software packages have been adapted for medical use, but they rarely provide convenient support for both data management and statistics. A recent cooperative effort resulted in new software, Sem (Statistics Epidemiology Medicine), which allows both data management for trials and statistical treatment of the data. Very convenient, it can be used by non-professionals in statistics (biologists, doctors, researchers, data managers), since usually (except with multivariate models) the software performs the most adequate test by itself, after which complementary tests can be requested if needed. The Sem database manager (DBM) is not compatible with the usual DBMs: this constitutes a first protection against loss of privacy. Other shields (passwords, encryption...) strengthen data security, all the more necessary today since Sem can be run on computer networks. The data organization enables multiplicity: forms can be duplicated per patient. Dates are treated in a special but transparent manner (sorting, date and delay calculations...). Sem communicates with common desktop software, often with a simple copy/paste. So statistics can easily be performed on data stored in external calculation sheets, and slides can be made by pasting graphs with a single mouse click (survival curves...). Already used at over fifty sites in different hospitals for daily work, this product, combining data management and statistics, appears to be a convenient and innovative solution.

  14. Volumetric Analysis of Alveolar Bone Defect Using Three-Dimensional-Printed Models Versus Computer-Aided Engineering.

    PubMed

    Du, Fengzhou; Li, Binghang; Yin, Ningbei; Cao, Yilin; Wang, Yongqian

    2017-03-01

    Knowing the volume of a graft is essential in repairing alveolar bone defects. This study investigates two advanced preoperative volume measurement methods: three-dimensional (3D) printing and computer-aided engineering (CAE). Ten unilateral alveolar cleft patients were enrolled in this study. Their computed tomographic data were sent to 3D printing and CAE software. A simulated graft was used on the 3D-printed model, and the graft volume was measured by water displacement. The volume calculated by the CAE software used a mirror-reversal technique. The authors compared the actual volumes of the simulated grafts with the CAE software-derived volumes. The average volume of the simulated bone grafts from the 3D-printed models was 1.52 mL, higher than the mean volume of 1.47 mL calculated by the CAE software. The difference between the two volumes ranged from -0.18 to 0.42 mL. The paired Student t test showed no statistically significant difference between the volumes derived from the two methods. This study demonstrated that the mirror-reversal technique using CAE software is as accurate as the simulated operation on 3D-printed models in unilateral alveolar cleft patients. These findings further validate the use of 3D printing and the CAE technique in alveolar defect repair.

  15. An Assessment of Software Safety as Applied to the Department of Defense Software Development Process

    DTIC Science & Technology

    1992-12-01

    provide program managers some level of confidence that their software will operate at an acceptable level of risk. A number of structured safety...safety within the constraints of operational effectiveness, schedule, and cost through timely application of system safety management and engineering...Master of Science in Software Systems Management Peter W. Colan, B.S.E. Robert W. Prouhet, B.S. Captain, USAF Captain, USAF December 1992 Approved for

  16. Software for determining the direction of movement, shear and normal stresses of a fault under a determined stress state

    NASA Astrophysics Data System (ADS)

    Álvarez del Castillo, Alejandra; Alaniz-Álvarez, Susana Alicia; Nieto-Samaniego, Angel Francisco; Xu, Shunshan; Ochoa-González, Gil Humberto; Velasquillo-Martínez, Luis Germán

    2017-07-01

    In the oil, gas and geothermal industry, the extraction or injection of fluids induces changes in the stress field of the reservoir; if the in-situ stress state of a fault plane is sufficiently disturbed, the fault may slip and trigger fluid leakage, or the reservoir might fracture and become damaged. The goal of the SSLIPO 1.0 software is to obtain data that can reduce the risk of affecting the stability of wellbores. The input data are the magnitudes of the three principal stresses and their orientation in geographic coordinates. The output data are the slip direction of a fracture in geographic coordinates, and its normal (σn) and shear (τ) stresses resolved on a single or multiple fracture planes. With this information, it is possible to calculate the slip tendency (τ/σn) and the propensity to open a fracture, which is inversely proportional to σn. This software can analyze any compressional stress system, even non-Andersonian. An example is given from an oilfield in southern Mexico, in a region that contains fractures formed in three deformation events. In the example, SSLIPO 1.0 was used to determine in which deformation event the oil migrated. SSLIPO 1.0 is an open-code application developed in MATLAB. The URLs to obtain the source code and to download SSLIPO 1.0 are: http://www.geociencias.unam.mx/alaniz/main_code.txt, http://www.geociencias.unam.mx/alaniz/SSLIPO_pkg.exe.
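
    The core resolution step such a tool performs can be sketched in a few lines: given the principal stresses and a fault-plane normal expressed in a common frame, the traction vector yields the normal stress, shear stress and slip tendency. The magnitudes and plane orientation below are arbitrary examples, and the geographic-coordinate transformations SSLIPO performs are omitted.

      import numpy as np

      sigma = np.diag([60.0, 40.0, 25.0])      # principal stresses, MPa (assumed)
      n = np.array([0.5, 0.5, np.sqrt(0.5)])   # fault-plane normal (assumed)
      n /= np.linalg.norm(n)                   # ensure unit length

      t = sigma @ n                            # traction vector on the plane
      sigma_n = float(t @ n)                   # normal stress
      tau = float(np.sqrt(t @ t - sigma_n**2)) # shear stress magnitude
      print(f"sigma_n = {sigma_n:.1f} MPa, tau = {tau:.1f} MPa, "
            f"slip tendency tau/sigma_n = {tau / sigma_n:.2f}")

    The slip direction follows from the same quantities as the normalized tangential component of the traction, t - sigma_n * n.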

  17. Locally Downscaled and Spatially Customizable Climate Data for Historical and Future Periods for North America

    PubMed Central

    Wang, Tongli; Hamann, Andreas; Spittlehouse, Dave; Carroll, Carlos

    2016-01-01

    Large volumes of gridded climate data have become available in recent years including interpolated historical data from weather stations and future predictions from general circulation models. These datasets, however, are at various spatial resolutions that need to be converted to scales meaningful for applications such as climate change risk and impact assessments or sample-based ecological research. Extracting climate data for specific locations from large datasets is not a trivial task and typically requires advanced GIS and data management skills. In this study, we developed a software package, ClimateNA, that facilitates this task and provides a user-friendly interface suitable for resource managers and decision makers as well as scientists. The software locally downscales historical and future monthly climate data layers into scale-free point estimates of climate values for the entire North American continent. The software also calculates a large number of biologically relevant climate variables that are usually derived from daily weather data. ClimateNA covers 1) 104 years of historical data (1901–2014) in monthly, annual, decadal and 30-year time steps; 2) three paleoclimatic periods (Last Glacial Maximum, Mid Holocene and Last Millennium); 3) three future periods (2020s, 2050s and 2080s); and 4) annual time-series of model projections for 2011–2100. Multiple general circulation models (GCMs) were included for both paleo and future periods, and two representative concentration pathways (RCP4.5 and 8.5) were chosen for future climate data. PMID:27275583

  18. Locally Downscaled and Spatially Customizable Climate Data for Historical and Future Periods for North America.

    PubMed

    Wang, Tongli; Hamann, Andreas; Spittlehouse, Dave; Carroll, Carlos

    2016-01-01

    Large volumes of gridded climate data have become available in recent years including interpolated historical data from weather stations and future predictions from general circulation models. These datasets, however, are at various spatial resolutions that need to be converted to scales meaningful for applications such as climate change risk and impact assessments or sample-based ecological research. Extracting climate data for specific locations from large datasets is not a trivial task and typically requires advanced GIS and data management skills. In this study, we developed a software package, ClimateNA, that facilitates this task and provides a user-friendly interface suitable for resource managers and decision makers as well as scientists. The software locally downscales historical and future monthly climate data layers into scale-free point estimates of climate values for the entire North American continent. The software also calculates a large number of biologically relevant climate variables that are usually derived from daily weather data. ClimateNA covers 1) 104 years of historical data (1901-2014) in monthly, annual, decadal and 30-year time steps; 2) three paleoclimatic periods (Last Glacial Maximum, Mid Holocene and Last Millennium); 3) three future periods (2020s, 2050s and 2080s); and 4) annual time-series of model projections for 2011-2100. Multiple general circulation models (GCMs) were included for both paleo and future periods, and two representative concentration pathways (RCP4.5 and 8.5) were chosen for future climate data.

  19. The Preliminary Results of GMSTech: A Software Development for Microseismic Characterization

    NASA Astrophysics Data System (ADS)

    Rohaman, Maman; Suhendi, Cahli; Verdhora Ry, Rexha; Sugiartono Prabowo, Billy; Widiyantoro, Sri; Nugraha, Andri Dian; Yudistira, Tedi; Mujihardi, Bambang

    2017-04-01

    The processing of microseismic data requires reliable software for imaging the subsurface conditions associated with microseismicity. Currently available software is generally specific to a single processing module, with different modules developed by different developers. Software with integrated processing modules offers better value because users can work more easily and quickly. We developed GMSTech (Ganesha Microseismic Technology), a stand-alone, C#-based software package consisting of several modules for processing microseismic data. Its function is to solve non-linear inverse problems and image the subsurface. The ILNumerics library is used to reduce computation time and provide good visualization. In this preliminary result, we present three developed modules: (1) hypocenter determination, (2) moment magnitude calculation, and (3) 3-D seismic tomography. In the first module, we provide four methods for locating microseismic events that can be chosen by the user independently: the simulated annealing method, the guided grid-search method, Geiger's method, and joint hypocenter determination (JHD). The second module calculates moment magnitude using the Brune method and estimates the released energy of the event. Finally, we provide a module for 3-D seismic tomography that images velocity structures based on delay-time tomography. We demonstrated the software using both synthetic data and real data from a geothermal field in Indonesia. The results of all modules are reliable, as assessed statistically by RMS error. We will continue testing the software on additional datasets and developing further processing modules.
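
    As a sketch of the grid-search style of event location named in the first module, the toy Python locator below scans a coarse 3-D grid for the source position that minimizes the RMS P-arrival residual under a homogeneous velocity model. The station layout, velocity, and event are invented for illustration; GMSTech itself is written in C# and supports several additional methods.

        # A toy grid-search hypocenter locator assuming a homogeneous P velocity.
        # Stations, velocity, and the event are invented for illustration.
        import itertools
        import numpy as np

        def grid_search_hypocenter(stations, t_obs, v_p, grid):
            """Return (x, y, z, t0) minimizing the RMS P travel-time residual."""
            best, best_rms = None, np.inf
            for x, y, z in itertools.product(*grid):
                # Predicted travel times from the trial source to each station.
                d = np.sqrt(((stations - [x, y, z]) ** 2).sum(axis=1))
                t_pred = d / v_p
                t0 = (t_obs - t_pred).mean()          # best-fit origin time
                rms = np.sqrt(((t_obs - (t0 + t_pred)) ** 2).mean())
                if rms < best_rms:
                    best, best_rms = (x, y, z, t0), rms
            return best, best_rms

        # Four surface stations (km) and synthetic picks for a source at (2, 3, 4).
        stations = np.array([[0, 0, 0], [10, 0, 0], [0, 10, 0], [10, 10, 0]], float)
        true_src, v_p = np.array([2.0, 3.0, 4.0]), 5.5                   # km, km/s
        t_obs = np.linalg.norm(stations - true_src, axis=1) / v_p + 1.0  # t0 = 1 s
        axis = np.arange(0.0, 10.5, 0.5)
        solution, rms = grid_search_hypocenter(stations, t_obs, v_p, (axis, axis, axis))
        print("hypocenter (x, y, z, t0):", solution, "RMS:", round(rms, 6))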

  20. A semi-automated volumetric software for segmentation and perfusion parameter quantification of brain tumors using 320-row multidetector computed tomography: a validation study.

    PubMed

    Chae, Soo Young; Suh, Sangil; Ryoo, Inseon; Park, Arim; Noh, Kyoung Jin; Shim, Hackjoon; Seol, Hae Young

    2017-05-01

    We developed semi-automated volumetric software, NPerfusion, to segment brain tumors and quantify perfusion parameters on whole-brain CT perfusion (WBCTP) images. The purpose of this study was to assess the feasibility of the software and to validate its performance against manual segmentation. Twenty-nine patients with pathologically proven brain tumors who underwent preoperative WBCTP between August 2012 and February 2015 were included. Three perfusion parameters, arterial flow (AF), equivalent blood volume (EBV), and Patlak flow (PF, a measure of capillary permeability), of brain tumors were generated by commercial software and then quantified volumetrically by NPerfusion, which also semi-automatically segmented tumor boundaries. The quantification was validated by comparison with that of manual segmentation in terms of the concordance correlation coefficient and Bland-Altman analysis. With NPerfusion, we successfully performed segmentation and quantified whole volumetric perfusion parameters of all 29 brain tumors, which showed perfusion trends consistent with previous studies. The validation of the perfusion parameter quantification exhibited almost perfect agreement with manual segmentation, with Lin concordance correlation coefficients (ρc) for AF, EBV, and PF of 0.9988, 0.9994, and 0.9976, respectively. On Bland-Altman analysis, most differences between this software and manual segmentation on the commercial software were within the limits of agreement. NPerfusion successfully performs segmentation of brain tumors and calculates their perfusion parameters. We validated this semi-automated segmentation software by comparing it with manual segmentation. NPerfusion can be used to calculate volumetric perfusion parameters of brain tumors from WBCTP.

  1. Route Planning Software for Lunar Polar Missions

    NASA Astrophysics Data System (ADS)

    Cunningham, C.; Jones, H.; Amato, J.; Holst, I.; Otten, N.; Kitchell, F.; Whittaker, W.; Horchler, A.

    2016-11-01

    Rover mission planning on the lunar poles is challenging due to the long, time-varying shadows. This abstract presents software for efficiently planning traverses while balancing competing demands of science goals, rover energy constraints, and risk.

  2. An open source software tool to assign the material properties of bone for ABAQUS finite element simulations.

    PubMed

    Pegg, Elise C; Gill, Harinderjit S

    2016-09-06

    A new software tool to assign the material properties of bone to an ABAQUS finite element mesh was created and compared with Bonemat, a similar tool originally designed to work with Ansys finite element models. Our software tool (py_bonemat_abaqus) was written in Python, which is the chosen scripting language for ABAQUS. The purpose of this study was to compare the software packages in terms of the material assignment calculation and processing speed. Three element types were compared (linear hexahedral (C3D8), linear tetrahedral (C3D4) and quadratic tetrahedral elements (C3D10)), both individually and as part of a mesh. Comparisons were made using a CT scan of a hemi-pelvis as a test case. A small difference, of -0.05 kPa on average, was found between Bonemat version 3.1 (the current version) and our Python package. Errors were found in the previous release of Bonemat (version 3.0 downloaded from www.biomedtown.org) during calculation of the quadratic tetrahedron Jacobian, and conversion of the apparent density to modulus when integrating over the Young's modulus field. These issues caused up to 2 GPa of error in the modulus assignment. For these reasons, we recommend users upgrade to the most recent release of Bonemat. Processing speeds were assessed for the three different element types. Our Python package took significantly longer (110 s on average) to perform the calculations compared with the Bonemat software (10 s). Nevertheless, the workflow advantages of the package and added functionality makes 'py_bonemat_abaqus' a useful tool for ABAQUS users. Copyright © 2016 Elsevier Ltd. All rights reserved.
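
    For orientation, the fragment below sketches the per-element calculation that tools of this kind perform: a linear CT calibration from Hounsfield units to apparent density, a power-law conversion from density to Young's modulus, and averaging of the modulus (not the density) over integration points, since the power law is non-linear. The calibration and power-law coefficients are illustrative placeholders, not the values used by Bonemat or py_bonemat_abaqus.

        # A minimal sketch of the HU -> density -> modulus mapping performed per
        # element. Coefficients below are illustrative placeholders only.
        import numpy as np

        def hu_to_density(hu, slope=0.0008, intercept=0.0):
            """Linear CT calibration: apparent density (g/cm^3) from Hounsfield units."""
            return slope * hu + intercept

        def density_to_modulus(rho, a=6850.0, b=1.49):
            """Power-law density-modulus relationship, E in MPa (Morgan-style form)."""
            return a * np.power(np.clip(rho, 0.0, None), b)

        # Average the modulus over the element's integration points rather than
        # converting an element-averaged density: the ordering matters because
        # the power law is non-linear (the integration issue noted above).
        hu_at_integration_points = np.array([900.0, 1100.0, 1250.0, 1400.0])
        E_points = density_to_modulus(hu_to_density(hu_at_integration_points))
        print("element modulus (MPa):", E_points.mean())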

  3. "SABER": A new software tool for radiotherapy treatment plan evaluation.

    PubMed

    Zhao, Bo; Joiner, Michael C; Orton, Colin G; Burmeister, Jay

    2010-11-01

    Both spatial and biological information are necessary in order to perform true optimization of a treatment plan and for predicting clinical outcome. The goal of this work is to develop an enhanced treatment plan evaluation tool which incorporates biological parameters and retains spatial dose information. A software system is developed which provides biological plan evaluation with a novel combination of features. It incorporates hyper-radiosensitivity using the induced-repair model and applies the new concept of dose convolution filter (DCF) to simulate dose wash-out effects due to cell migration, bystander effect, and/or tissue motion during treatment. Further, the concept of spatial DVH (sDVH) is introduced to evaluate and potentially optimize the spatial dose distribution in the target volume. Finally, generalized equivalent uniform dose is derived from both the physical dose distribution (gEUD) and the distribution of equivalent dose in 2 Gy fractions (gEUD2) and the software provides three separate models for calculation of tumor control probability (TCP), normal tissue complication probability (NTCP), and probability of uncomplicated tumor control (P+). TCP, NTCP, and P+ are provided as a function of prescribed dose and multivariable TCP, NTCP, and P+ plots are provided to illustrate the dependence on individual parameters used to calculate these quantities. Ten plans from two clinical treatment sites are selected to test the three calculation models provided by this software. By retaining both spatial and biological information about the dose distribution, the software is able to distinguish features of radiotherapy treatment plans not discernible using commercial systems. Plans that have similar DVHs may have different spatial and biological characteristics and the application of novel tools such as sDVH and DCF within the software may substantially change the apparent plan quality or predicted plan metrics such as TCP and NTCP. For the cases examined, both the calculation method and the application of DCF can change the ranking order of competing plans. The voxel-by-voxel TCP model makes it feasible to incorporate spatial variations of clonogen densities (n), radiosensitivities (SF2), and fractionation sensitivities (alpha/beta) as those data become available. The new software incorporates both spatial and biological information into the treatment planning process. The application of multiple methods for the incorporation of biological and spatial information has demonstrated that the order of application of biological models can change the order of plan ranking. Thus, the results of plan evaluation and optimization are dependent not only on the models used but also on the order in which they are applied. This software can help the planner choose more biologically optimal treatment plans and potentially predict treatment outcome more accurately.
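
    As a pointer to the kind of quantity the abstract describes, the short Python sketch below evaluates the standard Niemierko-form gEUD from a differential DVH, together with the LQ-model conversion to equivalent dose in 2 Gy fractions used to form gEUD2. The sample DVH, fraction count, and parameter values are illustrative, not SABER's.

        # gEUD in its standard Niemierko form, plus the LQ-model EQD2 conversion.
        # Sample DVH and parameters are illustrative only.
        import numpy as np

        def geud(doses, volumes, a):
            """gEUD = (sum_i v_i * d_i^a)^(1/a), with v_i fractional volumes."""
            v = np.asarray(volumes, float)
            v = v / v.sum()                       # normalize to fractional volume
            return (v * np.asarray(doses, float) ** a).sum() ** (1.0 / a)

        def eqd2(dose, n_fractions, alpha_beta):
            """Equivalent dose in 2 Gy fractions (LQ model), used for gEUD2."""
            d = dose / n_fractions                # dose per fraction
            return dose * (d + alpha_beta) / (2.0 + alpha_beta)

        doses = np.array([60.0, 62.0, 58.0, 61.0])    # Gy, per dose bin
        vols = [0.25, 0.25, 0.25, 0.25]               # fractional volumes
        print("gEUD  :", geud(doses, vols, a=-10))                  # tumour-like a < 0
        print("gEUD2 :", geud(eqd2(doses, 30, 10.0), vols, a=-10))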

  4. Security Risks of Cloud Computing and Its Emergence as 5th Utility Service

    NASA Astrophysics Data System (ADS)

    Ahmad, Mushtaq

    Cloud computing is being projected by major cloud service providers such as IBM, Google, Yahoo, and Amazon as the fifth utility, in which clients gain access to processing for applications and software projects that require very high processing speed and huge data capacity: compute-intensive scientific and engineering research problems, as well as e-business and data-content network applications. These services are provided to different types of clients under DASM (Direct Access Service Management), based on virtualization of hardware and software and very high bandwidth Internet (Web 2.0) communication. The paper reviews these developments in cloud computing and the hardware/software configuration of the cloud paradigm. It then examines the vital aspects of the security risks identified by IT industry experts and cloud clients, and highlights cloud providers' responses to those risks.

  5. Testing the Quick Seismic Event Locator and Magnitude Calculator (SSL_Calc) by Marsite Project Data Base

    NASA Astrophysics Data System (ADS)

    Tunc, Suleyman; Tunc, Berna; Caka, Deniz; Baris, Serif

    2016-04-01

    Quickly locating seismic events and calculating their size is one of the most important and challenging issues, especially in real-time seismology. In this study, we developed a Matlab application, called SSL_Calc, to locate seismic events and calculate their magnitudes (local magnitude and empirical moment magnitude) using a single station. This newly developed software has been tested on all stations of the MARsite project "New Directions in Seismic Hazard Assessment through Focused Earth Observation in the Marmara Supersite-MARsite". The SSL_Calc algorithm is suitable for both velocity and acceleration sensors. Data have to be in GCF (Güralp Compressed Format). Online or offline data can be selected in the SCREAM software (from Güralp Systems Limited) and transferred to SSL_Calc. To locate an event, P- and S-wave picks have to be marked manually in the SSL_Calc window. During magnitude calculation, the instrument response is removed and the record is converted to true displacement in millimeters. The displacement data are then converted to Wood-Anderson seismometer output using the parameters Z = [0; 0], P = [-6.28+4.71j; -6.28-4.71j], A0 = 2080. For local magnitude calculation, the maximum displacement amplitude (A) and distance (dist) are used in formula (1) for distances up to 200 km and formula (2) beyond 200 km: ML = log10(A) - (-1.118 - 0.0647*dist + 0.00071*dist^2 - 3.39e-6*dist^3 + 5.71e-9*dist^4) (1); ML = log10(A) + (2.1173 + 0.0082*dist - 5.9628e-6*dist^2) (2). Following the local magnitude calculation, the program calculates two empirical moment magnitudes using formula (3) from Akkar et al. (2010) and formula (4) from Ulusay et al. (2004): Mw = 0.953*ML + 0.422 (3); Mw = 0.7768*ML + 1.5921 (4). SSL_Calc is user friendly, easy to implement, and offers individual users a practical solution for event location and ML and Mw calculation.
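
    The magnitude relations quoted above transcribe directly into code. The Python sketch below implements formulas (1)-(4) as given in the abstract; A is the peak Wood-Anderson displacement amplitude (mm), dist the distance (km), and the example values are invented.

        # Magnitude formulas (1)-(4) from the abstract, transcribed into Python.
        import math

        def local_magnitude(A_mm, dist_km):
            """ML from formula (1) (dist <= 200 km) or formula (2) (dist > 200 km)."""
            if dist_km <= 200.0:
                corr = (-1.118 - 0.0647 * dist_km + 0.00071 * dist_km**2
                        - 3.39e-6 * dist_km**3 + 5.71e-9 * dist_km**4)
                return math.log10(A_mm) - corr
            return math.log10(A_mm) + (2.1173 + 0.0082 * dist_km
                                       - 5.9628e-6 * dist_km**2)

        def moment_magnitude_akkar(ml):
            """Empirical Mw, formula (3), Akkar et al. (2010)."""
            return 0.953 * ml + 0.422

        def moment_magnitude_ulusay(ml):
            """Empirical Mw, formula (4), Ulusay et al. (2004)."""
            return 0.7768 * ml + 1.5921

        ml = local_magnitude(A_mm=1.2, dist_km=85.0)     # example values
        print(f"ML = {ml:.2f}, Mw(Akkar) = {moment_magnitude_akkar(ml):.2f}, "
              f"Mw(Ulusay) = {moment_magnitude_ulusay(ml):.2f}")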

  6. The new meaning of quality in the information age.

    PubMed

    Prahalad, C K; Krishnan, M S

    1999-01-01

    Software applications are now a mission-critical source of competitive advantage for most companies. They are also a source of great risk, as the Y2K bug has made clear. Yet many line managers still haven't confronted software issues--partly because they aren't sure how best to define the quality of the applications in their IT infrastructures. Some companies such as Wal-Mart and the Gap have successfully integrated the software in their networks, but most have accumulated an unwieldy number of incompatible applications--all designed to perform the same tasks. The authors provide a framework for measuring the performance of software in a company's IT portfolio. Quality traditionally has been measured according to a product's ability to meet certain specifications; other views of quality have emerged that measure a product's adaptability to customers' needs and a product's ability to encourage innovation. To judge software quality properly, argue the authors, managers must measure applications against all three approaches. Understanding the domain of a software application is an important part of that process. The domain is the body of knowledge about a user's needs and expectations for a product. Software domains change frequently based on how a consumer chooses to use, for example, Microsoft Word or a spreadsheet application. The domain can also be influenced by general changes in technology, such as the development of a new software platform. Thus, applications can't be judged only according to whether they conform to specifications. The authors discuss how to identify domain characteristics and software risks and suggest ways to reduce the variability of software domains.

  7. Stability Analysis of a mortar cover ejected at various Mach numbers and angles of attack

    NASA Astrophysics Data System (ADS)

    Schwab, Jane; Carnasciali, Maria-Isabel; Andrejczyk, Joe; Kandis, Mike

    2011-11-01

    This study utilized CFD software to predict the aerodynamic coefficients of a wedge-shaped mortar cover ejected from a spacecraft upon deployment of its Parachute Recovery System (PRS). Concern over recontact or collision between the mortar cover and spacecraft served as the impetus for this study, in which drag and moment coefficients were determined at Mach numbers from 0.3 to 1.6 and at angle-of-attack increments of 30 degrees. These CFD predictions were then used as inputs to a two-dimensional, multi-body, three-DoF trajectory model to calculate the relative motion of the mortar cover and spacecraft. Based upon those simulations, the study concluded there is minimal to zero risk of collision with either the spacecraft or the PRS. Sponsored by Pioneer Aerospace.

  8. Design and evaluation of a fault-tolerant multiprocessor using hardware recovery blocks

    NASA Technical Reports Server (NTRS)

    Lee, Y. H.; Shin, K. G.

    1982-01-01

    A fault-tolerant multiprocessor with a rollback recovery mechanism is discussed. The rollback mechanism is based on the hardware recovery block which is a hardware equivalent to the software recovery block. The hardware recovery block is constructed by consecutive state-save operations and several state-save units in every processor and memory module. When a fault is detected, the multiprocessor reconfigures itself to replace the faulty component and then the process originally assigned to the faulty component retreats to one of the previously saved states in order to resume fault-free execution. A mathematical model is proposed to calculate both the coverage of multi-step rollback recovery and the risk of restart. A performance evaluation in terms of task execution time is also presented.

  9. Method of predicting the mean lung dose based on a patient's anatomy and dose-volume histograms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zawadzka, Anna, E-mail: a.zawadzka@zfm.coi.pl; Nesteruk, Marta; Department of Radiation Oncology, University Hospital Zurich and University of Zurich, Zurich

    The aim of this study was to propose a method to predict the minimum achievable mean lung dose (MLD) and corresponding dosimetric parameters for organs at risk (OARs) based on individual patient anatomy. For each patient, the dose for 36 equidistant individual multileaf-collimator-shaped fields was calculated in the treatment planning system (TPS). Based on these dose matrices, the MLD for each patient was predicted by the in-house DosePredictor software, which implements the solution of a system of linear equations. The software's predictions were validated against 3D conformal radiotherapy (3D-CRT) and volumetric modulated arc therapy (VMAT) plans previously prepared for 16 patients with stage III non-small-cell lung cancer (NSCLC). For each patient, dosimetric parameters derived from the plans were compared with the results calculated by DosePredictor. The MLD, the maximum dose to the spinal cord (Dmax,cord) and the mean esophageal dose (MED) were analyzed. There was a strong correlation between the MLD calculated by DosePredictor and that obtained in treatment plans, regardless of the technique used; the correlation coefficient was 0.96 for both 3D-CRT and VMAT. In a similar manner, MED correlations of 0.98 and 0.96 were obtained for 3D-CRT and VMAT plans, respectively. The maximum dose to the spinal cord was not predicted as well; the correlation coefficient was 0.30 and 0.61 for 3D-CRT and VMAT, respectively. The presented method allows the minimum MLD and corresponding dosimetric parameters for OARs to be predicted without the necessity of plan preparation. The method can serve as a guide during the treatment planning process, for example as initial constraints in VMAT optimization, and allows the probability of lung pneumonitis to be predicted.
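
    The abstract does not spell out the linear system DosePredictor solves. As one plausible toy formulation only, the sketch below chooses non-negative weights for 36 fields that deliver a prescribed target dose while minimizing the mean lung dose, both of which are linear in the weights; the linear-programming framing and all dose coefficients are assumptions for illustration, not the authors' method.

        # A toy linear-programming stand-in for a minimum-MLD prediction over
        # 36 fields. All coefficients are synthetic; this is not DosePredictor.
        import numpy as np
        from scipy.optimize import linprog

        rng = np.random.default_rng(0)
        n_fields = 36
        lung_dose_per_weight = rng.uniform(0.1, 1.0, n_fields)    # MLD contribution
        target_dose_per_weight = rng.uniform(0.8, 1.2, n_fields)  # PTV mean dose

        prescription = 60.0  # Gy to the target
        res = linprog(c=lung_dose_per_weight,                 # minimize MLD
                      A_eq=target_dose_per_weight[None, :],   # deliver prescription
                      b_eq=[prescription],
                      bounds=[(0, None)] * n_fields)
        print("minimum achievable MLD (toy):", res.fun)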

  10. Economic Analysis of Complex Nuclear Fuel Cycles with NE-COST

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ganda, Francesco; Dixon, Brent; Hoffman, Edward

    The purpose of this work is to present a new methodology, and associated computational tools, developed within the U.S. Department of Energy (U.S. DOE) Fuel Cycle Option Campaign to quantify the economic performance of complex nuclear fuel cycles. The levelized electricity cost at the busbar is generally chosen to quantify and compare the economic performance of different baseload generating technologies, including nuclear: it is the cost of electricity that renders the risk-adjusted discounted net present value of the investment cash flow equal to zero. The work presented here focuses on the calculation of the levelized cost of electricity of fuel cycles at mass-balance equilibrium, termed the LCAE (Levelized Cost of Electricity at Equilibrium). To alleviate the computational issues associated with calculating the LCAE for complex fuel cycles, a novel approach has been developed, called the "island approach" because of its logical structure: a generic complex fuel cycle is subdivided into subsets of fuel cycle facilities, called islands, each containing one and only one type of reactor or blanket and an arbitrary number of fuel cycle facilities. A nuclear economics software tool, NE-COST, written in the commercial programming software MATLAB®, has been developed to calculate the LCAE of complex fuel cycles with the "island" computational approach. NE-COST can also handle uncertainty: the input parameters (both unit costs and fuel cycle characteristics) can have uncertainty distributions associated with them, and the output can be computed in terms of probability density functions of the LCAE. In this paper, NE-COST is used to quantify, as examples, the economic performance of (1) current once-through Light Water Reactor (LWR) systems; (2) continuous plutonium recycling in Fast Reactors (FRs) with driver and blanket; and (3) recycling of plutonium bred in FRs into LWRs. For each fuel cycle, the contributions of the main cost components to the total LCAE are identified.
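
    The levelized-cost definition quoted above (the electricity price that zeroes the discounted net present value) is equivalent to discounted costs divided by discounted generation. The minimal Python sketch below shows that calculation; the cash flows and discount rate are illustrative, not NE-COST inputs.

        # Levelized cost of electricity: the price that zeroes the discounted NPV,
        # i.e. discounted costs over discounted generation. Numbers are invented.
        import numpy as np

        def levelized_cost(costs, energy, rate):
            """LCOE = sum_t C_t/(1+r)^t / sum_t E_t/(1+r)^t  ($/MWh)."""
            t = np.arange(len(costs))
            disc = (1.0 + rate) ** -t
            return (np.asarray(costs) * disc).sum() / (np.asarray(energy) * disc).sum()

        years = 40
        costs = np.full(years, 120e6)
        costs[0] += 4e9                    # overnight capital in year 0
        energy = np.full(years, 8.0e6)     # MWh per year
        energy[0] = 0.0                    # no generation while building
        print(f"levelized cost: {levelized_cost(costs, energy, rate=0.07):.1f} $/MWh")

    Because each input can be drawn from an uncertainty distribution, repeating this calculation over sampled inputs yields the probability density function of the levelized cost described in the abstract.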

  11. Software-defined Quantum Networking Ecosystem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humble, Travis S.; Sadlier, Ronald

    The software enables a user to perform modeling and simulation of software-defined quantum networks. The software addresses the problem of how to synchronize transmission of quantum and classical signals through multi-node networks and to demonstrate quantum information protocols such as quantum teleportation. The software approaches this problem by generating a graphical model of the underlying network and attributing properties to each node and link in the graph. The graphical model is then simulated using a combination of discrete-event simulators to calculate the expected state of each node and link in the graph at a future time. A user interacts with the software by providing an initial network model and instantiating methods for the nodes to transmit information with each other. This includes writing application scripts in Python that make use of the software library interfaces. A user then initiates the application scripts, which invokes the software simulation. The user then uses the built-in diagnostic tools to query the state of the simulation and to collect statistics on synchronization.
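
    To make the discrete-event simulation style concrete, the generic Python sketch below propagates a signal across a small node graph using a time-ordered event queue with per-link delays, the basic mechanism for tracking when quantum and classical signals reach each node. The graph, delays, and routing are invented; this is not the software's actual interface.

        # A generic discrete-event simulation sketch: a time-ordered event queue
        # over a graph with per-link propagation delays. Illustrative only.
        import heapq

        links = {("A", "B"): 5e-6, ("B", "C"): 8e-6}    # propagation delay, seconds

        def delay(u, v):
            return links.get((u, v)) or links[(v, u)]

        route = {"A": "B", "B": "C"}    # static next-hop routing for the demo
        events = [(0.0, "A", "qubit + classical header")]   # (time, node, payload)

        while events:
            t, node, payload = heapq.heappop(events)
            print(f"t={t * 1e6:7.2f} us  {node} holds {payload!r}")
            nxt = route.get(node)
            if nxt:  # schedule arrival at the next hop after the link delay
                heapq.heappush(events, (t + delay(node, nxt), nxt, payload))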

  12. Software Transition Project Retrospectives and the Application of SEL Effort Estimation Model and Boehm's COCOMO to Complex Software Transition Projects

    NASA Technical Reports Server (NTRS)

    McNeill, Justin

    1995-01-01

    The Multimission Image Processing Subsystem (MIPS) at the Jet Propulsion Laboratory (JPL) has managed transitions of application software sets from one operating system and hardware platform to multiple operating systems and hardware platforms. As a part of these transitions, cost estimates were generated from the personal experience of in-house developers and managers to calculate the total effort required for such projects. Productivity measures have been collected for two such transitions, one very large and the other relatively small in terms of source lines of code. These estimates used a cost estimation model similar to the Software Engineering Laboratory (SEL) Effort Estimation Model. Experience transitioning software within JPL MIPS has uncovered a high incidence of interface complexity. Interfaces, both internal and external to individual software applications, have contributed to software transition project complexity, and thus to scheduling difficulties and larger-than-anticipated design work on software to be ported.
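
    For reference, basic COCOMO, the Boehm model named in the title, estimates effort as a power law of size in KSLOC. The sketch below uses Boehm's published basic-model coefficients; the 50 KSLOC example size is arbitrary, and the paper's own estimates used a richer, SEL-style model.

        # Basic COCOMO: effort (person-months) = a * KLOC^b, with Boehm's
        # published coefficients per project mode. Example size is arbitrary.
        COCOMO_MODES = {
            "organic":       (2.4, 1.05),
            "semi-detached": (3.0, 1.12),
            "embedded":      (3.6, 1.20),
        }

        def cocomo_effort(kloc, mode="semi-detached"):
            a, b = COCOMO_MODES[mode]
            return a * kloc ** b

        for mode in COCOMO_MODES:
            print(f"{mode:14s}: {cocomo_effort(50.0, mode):6.1f} person-months")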

  13. Incorporating Risk

    NASA Technical Reports Server (NTRS)

    Hihn, Jairus

    2011-01-01

    This presentation is about cost risk identification and estimation which is only a part of risk management. The purpose of this step is to identify common software risks, to assess their impact on the cost estimate, and to make revisions to the estimate based on these impacts.

  14. FoilSim: Basic Aerodynamics Software Created

    NASA Technical Reports Server (NTRS)

    Peterson, Ruth A.

    1999-01-01

    FoilSim is interactive software that simulates the airflow around various shapes of airfoils. The graphical user interface, which looks more like a video game than a learning tool, captures and holds students' interest. The software is a product of the NASA Lewis Research Center's Learning Technologies Project, an educational outreach initiative within the High Performance Computing and Communications Program (HPCCP). The airfoil view panel is a simulated view of a wing being tested in a wind tunnel. As students create new wing shapes by moving slider controls that change parameters, the software calculates the lift. FoilSim also displays plots of pressure or airspeed above and below the airfoil surface.
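
    The core calculation the students see can be illustrated with the standard lift equation, L = Cl * q * S with dynamic pressure q = rho * V^2 / 2. The thin-airfoil lift-coefficient model and the numbers below are a simplified stand-in, not FoilSim's internal model.

        # The standard lift equation with a thin-airfoil lift coefficient.
        # Simplified stand-in for illustration; not FoilSim's internals.
        import math

        def lift_newtons(v_mps, wing_area_m2, alpha_deg, rho=1.225):
            cl = 2.0 * math.pi * math.radians(alpha_deg)   # thin-airfoil theory
            q = 0.5 * rho * v_mps ** 2                     # dynamic pressure
            return cl * q * wing_area_m2

        print(f"lift: {lift_newtons(v_mps=50.0, wing_area_m2=16.0, alpha_deg=4.0):,.0f} N")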

  15. Requirements for a multifunctional code architecture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tiihonen, O.; Juslin, K.

    1997-07-01

    The present paper studies a set of requirements for a multifunctional simulation software architecture in the light of experiences gained in developing and using the APROS simulation environment. The huge steps taken in the development of computer hardware and software during the last ten years are changing the status of traditional nuclear safety analysis software. The affordable computing power on the safety analyst's table by far exceeds the possibilities offered ten years ago. At the same time, the features of everyday office software tend to set standards for the way input data and calculational results are managed.

  16. New technologies for supporting real-time on-board software development

    NASA Astrophysics Data System (ADS)

    Kerridge, D.

    1995-03-01

    The next generation of on-board data management systems will be significantly more complex than current designs, and will be required to perform more complex and demanding tasks in software. Improved hardware technology, in the form of the MA31750 radiation-hard processor, is one key component in addressing the needs of future embedded systems. However, to complement these hardware advances, improved support for the design and implementation of real-time data management software is now needed. This will help to control the cost and risk associated with developing data management software as it becomes an increasingly significant element within embedded systems. One particular problem with developing embedded software is managing the non-functional requirements in a systematic way. This paper identifies how Logica has exploited recent developments in hard real-time theory to address this problem through the use of new hard real-time analysis and design methods which can be supported by specialized tools. The first stage in transferring this technology from the research domain to industrial application has already been completed. The MA31750 Hard Real-Time Embedded Software Support Environment (HESSE) is a loosely integrated set of hardware and software tools which directly supports the process of hard real-time analysis for software targeting the MA31750 processor. With further development, HESSE promises to provide embedded system developers with software tools which can reduce the risks associated with developing complex hard real-time software. Supported in this way by more sophisticated software methods and tools, it is foreseen that MA31750-based embedded systems can meet the processing needs of the next generation of on-board data management systems.

  17. Open Architecture Standard for NASA's Software-Defined Space Telecommunications Radio Systems

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.; Johnson, Sandra K.; Kacpura, Thomas J.; Hall, Charles S.; Smith, Carl R.; Liebetreu, John

    2008-01-01

    NASA is developing an architecture standard for software-defined radios used in space- and ground-based platforms to enable commonality among radio developments to enhance capability and services while reducing mission and programmatic risk. Transceivers (or transponders) with functionality primarily defined in software (e.g., firmware) have the ability to change their functional behavior through software alone. This radio architecture standard offers value by employing common waveform software interfaces, method of instantiation, operation, and testing among different compliant hardware and software products. These common interfaces within the architecture abstract application software from the underlying hardware to enable technology insertion independently at either the software or hardware layer. This paper presents the initial Space Telecommunications Radio System (STRS) Architecture for NASA missions to provide the desired software abstraction and flexibility while minimizing the resources necessary to support the architecture.

  18. Design of a software for calculating isoelectric point of a polypeptide according to their net charge using the graphical programming language LabVIEW.

    PubMed

    Tovar, Glomen

    2018-01-01

    Software to calculate the net charge and predict the isoelectric point (pI) of a polypeptide is developed in this work using the graphical programming language LabVIEW. With this instrument, the net charges of the ionizable residues of the polypeptide chains of proteins are calculated at different pH values and tabulated, the pI is predicted, and an Excel (.xls) file is generated. In this work, the experimental pI values of different proteins are compared with the graphically calculated pI values, achieving a correlation coefficient (R) of 0.934746, which represents good reliability for p < 0.01. The generated program can thus serve as an instrument applicable in the laboratory, facilitating the calculation for graduate students and junior researchers. © 2017 The International Union of Biochemistry and Molecular Biology, 46(1):39-46, 2018.
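
    As a sketch of the underlying calculation (in Python rather than LabVIEW), the fragment below sums Henderson-Hasselbalch charge terms for the ionizable groups at a given pH and bisects for the pH of zero net charge. The pKa table is one common set of textbook values; published scales differ, so the resulting pI values are illustrative.

        # Net charge of a polypeptide from Henderson-Hasselbalch terms, and the
        # pI as the zero-charge pH found by bisection. pKa values are one common
        # textbook set; published scales vary.
        PKA_POS = {"Nterm": 9.0, "K": 10.5, "R": 12.5, "H": 6.0}
        PKA_NEG = {"Cterm": 2.0, "D": 3.9, "E": 4.1, "C": 8.3, "Y": 10.1}

        def net_charge(seq, ph):
            charge = 1.0 / (1.0 + 10 ** (ph - PKA_POS["Nterm"]))    # N-terminus
            charge -= 1.0 / (1.0 + 10 ** (PKA_NEG["Cterm"] - ph))   # C-terminus
            for aa in seq:
                if aa in PKA_POS:
                    charge += 1.0 / (1.0 + 10 ** (ph - PKA_POS[aa]))
                elif aa in PKA_NEG:
                    charge -= 1.0 / (1.0 + 10 ** (PKA_NEG[aa] - ph))
            return charge

        def isoelectric_point(seq, lo=0.0, hi=14.0, tol=1e-4):
            # Bisection works because the net charge decreases monotonically with pH.
            while hi - lo > tol:
                mid = 0.5 * (lo + hi)
                lo, hi = (mid, hi) if net_charge(seq, mid) > 0 else (lo, mid)
            return 0.5 * (lo + hi)

        print(f"pI of 'ACDKHE': {isoelectric_point('ACDKHE'):.2f}")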

  19. The environmental control and life support system advanced automation project

    NASA Technical Reports Server (NTRS)

    Dewberry, Brandon S.

    1991-01-01

    The objective of the ECLSS Advanced Automation project includes reduction of the risk associated with integrating new, beneficial software techniques. Demonstrations of this software to baseline engineering and test personnel will show the benefits of these techniques. The advanced software will be integrated into ground testing and ground support facilities, familiarizing key personnel with its use.

  20. Patient-specific IMRT verification using independent fluence-based dose calculation software: experimental benchmarking and initial clinical experience.

    PubMed

    Georg, Dietmar; Stock, Markus; Kroupa, Bernhard; Olofsson, Jörgen; Nyholm, Tufve; Ahnesjö, Anders; Karlsson, Mikael

    2007-08-21

    Experimental methods are commonly used for patient-specific intensity-modulated radiotherapy (IMRT) verification. The purpose of this study was to investigate the accuracy and performance of independent dose calculation software (denoted 'MUV', for monitor unit verification) for patient-specific quality assurance (QA). A total of 52 patients receiving step-and-shoot IMRT were considered. IMRT plans were recalculated by the treatment planning systems (TPS) in a dedicated QA phantom, in which experimental 1D and 2D verification (0.3 cm³ ionization chamber; films) was performed. Additionally, an independent dose calculation was performed. The fluence-based algorithm of MUV accounts for collimator transmission, rounded leaf ends, the tongue-and-groove effect, backscatter to the monitor chamber and scatter from the flattening filter. The dose calculation utilizes a pencil beam model based on a beam quality index. DICOM RT files from patient plans, exported from the TPS, were used directly as patient-specific input data in MUV. For composite IMRT plans, average deviations in the high dose region between ionization chamber measurements and point dose calculations performed with the TPS and MUV were 1.6 ± 1.2% and 0.5 ± 1.1% (1 S.D.), respectively. The dose deviations between MUV and TPS depended slightly on the distance from the isocentre position. For individual intensity-modulated beams (367 in total), an average deviation of 1.1 ± 2.9% was determined between calculations performed with the TPS and with MUV, with maximum deviations up to 14%. However, absolute dose deviations were mostly less than 3 cGy. Based on the current results, we aim to apply a confidence limit of 3% (with respect to the prescribed dose) or 6 cGy for routine IMRT verification. For off-axis points at distances larger than 5 cm and for low dose regions, we consider 5% dose deviation or 10 cGy acceptable. The time needed for an independent calculation compares very favourably with the net time for an experimental approach. The physical effects modelled in the dose calculation software MUV allow accurate dose calculations at individual verification points. Independent calculations may be used to replace experimental dose verification once the IMRT programme is mature.
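
    As a small illustration of the acceptance test described above, the Python check below flags a verification point as passing when the deviation between the independent calculation and the TPS is within the stated confidence limit (3% of the prescribed dose or 6 cGy; 5% or 10 cGy for off-axis or low-dose points). Treating the percentage and absolute limits as alternatives, and the function names themselves, are assumptions for illustration.

        # A toy pass/fail check modeled on the confidence limits quoted above.
        # Reading the two tolerances as alternatives (pass if either is met) is
        # an assumption for illustration.
        def verification_passes(d_indep, d_tps, d_prescribed, off_axis_or_low_dose=False):
            pct_limit, abs_limit_cgy = (5.0, 10.0) if off_axis_or_low_dose else (3.0, 6.0)
            dev_cgy = abs(d_indep - d_tps) * 100.0                 # Gy -> cGy
            dev_pct = abs(d_indep - d_tps) / d_prescribed * 100.0  # % of prescription
            return dev_pct <= pct_limit or dev_cgy <= abs_limit_cgy

        # Example: 2.00 Gy prescribed, TPS says 2.03 Gy, MUV says 1.99 Gy.
        print(verification_passes(d_indep=1.99, d_tps=2.03, d_prescribed=2.00))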
