Sample records for ASCI software quality

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peck, T; Sparkman, D; Storch, N

    ''The LLNL Site-Specific Advanced Simulation and Computing (ASCI) Software Quality Engineering Recommended Practices V1.1'' document describes a set of recommended software quality engineering (SQE) practices for ASCI code projects at Lawrence Livermore National Laboratory (LLNL). In this context, SQE is defined as the process of building quality into software products by applying the appropriate guiding principles and management practices. Continual code improvement and ongoing process improvement are expected benefits. Certain practices are recommended, although projects may select the specific activities they wish to improve, and the appropriate time lines for such actions. Additionally, projects can rely on the guidance of this document when generating ASCI Verification and Validation (V&V) deliverables. ASCI program managers will gather information about their software engineering practices and improvement. This information can be shared to leverage the best SQE practices among development organizations, and it will further be used to ensure the currency and vitality of the recommended practices. This Overview is intended to provide basic information from the ''LLNL Site-Specific ASCI Software Quality Engineering Recommended Practices V1.1'' document to the LLNL ASCI software management and development staff, along with steps for using that document. For definitions of terminology and acronyms, refer to its Glossary and Acronyms sections.

  2. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1: ASC software quality engineering practices, Version 2.0.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR 1.3.2 and 1.3.6 and to a Department of Energy document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines. This document also identifies ASC management and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  3. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1: ASC software quality engineering practices, Version 1.0.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Minana, Molly A.; Sturtevant, Judith E.; Heaphy, Robert

    2005-01-01

    The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in DOE/AL Quality Criteria (QC-1) as conformance to customer requirements and expectations. This quality plan defines the ASC program software quality practices and provides mappings of these practices to the SNL Corporate Process Requirements (CPR 1.3.2 and CPR 1.3.6) and the Department of Energy (DOE) document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines (GP&G). This quality plan identifies ASC management and software project teams' responsibilities for cost-effective software engineering quality practices. The SNL ASC Software Quality Plan establishes the signatories' commitment to improving software products by applying cost-effective software engineering quality practices. This document explains the project teams' opportunities for tailoring and implementing the practices; enumerates the practices that compose the development of SNL ASC's software products; and includes a sample assessment checklist that was developed based upon the practices in this document.

  4. Delivering Insight: The History of the Accelerated Strategic Computing Initiative

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larzelere II, A R

    2007-01-03

    The history of the Accelerated Strategic Computing Initiative (ASCI) tells of the development of computational simulation into a third fundamental piece of the scientific method, on a par with theory and experiment. ASCI did not invent the idea, nor was it alone in bringing it to fruition. But ASCI provided the wherewithal - hardware, software, environment, funding, and, most of all, the urgency - that made it happen. On October 1, 2005, the Initiative completed its tenth year of funding. The advances made by ASCI over its first decade are truly incredible. Lawrence Livermore, Los Alamos, and Sandia National Laboratories, along with leadership provided by the Department of Energy's Defense Programs Headquarters, fundamentally changed computational simulation and how it is used to enable scientific insight. To do this, astounding advances were made in simulation applications, computing platforms, and user environments. ASCI dramatically changed existing - and forged new - relationships, both among the Laboratories and with outside partners. By its tenth anniversary, despite daunting challenges, ASCI had accomplished all of the major goals set at its beginning. The history of ASCI is about the vision, leadership, endurance, and partnerships that made these advances possible.

  5. Inhibition of Breast Cancer-Induced Bone Pain, Metastasis, and Osteolysis in Nude Mice by LOVAZA and DHA Fatty Acids

    DTIC Science & Technology

    2012-10-01

    ASIC3, TGAGAGCCACCAGCTTACCT/ACATGTCCTCAAGGGAGTGG (30 cycles); mouse TRPV1, GTGACCCTCTTGGTGGAGAA/CTTCAGTGTGGGGTGGAGTT (30 cycles), mouse GAPDH ... densitometry assisted by the image analysis software MetaMorph Image (xx). Sizes are as follows: ASIC1a 506bp, ASIC1b 563bp, ASIC3 245bp, TRPV1

  6. Urinary 3-hydroxypropyl mercapturic acid (3-HPMA) concentrations in dogs with acute spinal cord injury due to intervertebral disc herniation.

    PubMed

    Sangster, A M; Zheng, L; Bentley, R T; Shi, R; Packer, R A

    2017-01-01

    The aim of this study was to investigate urinary 3-hydroxypropyl mercapturic acid (3-HPMA), a metabolite of acrolein, as a novel biomarker in acute spinal cord injury (ASCI) due to intervertebral disc herniation in dogs. Urine from 10 client-owned dogs with ASCI collected at presentation and 10 control dogs was analyzed for 3-HPMA. The median urinary 3-HPMA concentration in ASCI dogs was significantly higher than in control dogs, but was not correlated with the severity of ASCI. The median urinary 3-HPMA concentration in intact dogs was higher than in neutered dogs. Higher urinary 3-HPMA concentrations in dogs after ASCI support a role for acrolein, a cytotoxic by-product of lipid peroxidation, in canine ASCI. Urinary 3-HPMA could be used as a biomarker in future clinical trials to measure the effect of therapeutic intervention of reducing acrolein after ASCI. Copyright © 2016. Published by Elsevier Ltd.

  7. Preferential Ascus Discharge during Cross Maturation in SORDARIA BREVICOLLIS

    PubMed Central

    MacDonald, D. J.; Bond, D. J.

    1974-01-01

    Crosses involving spore color mutants of Sordaria brevicollis all showed a decline in the frequency of second division asymmetric asci (2:2:2:2's) as the cross matured. This decline was due to the preferential maturation and/or discharge of these asci. The proportion of spindle overlap and recombinational asci within the group did not change as shown by ascus dissection. The preferential discharge was also found to occur in two-point crosses where the asci did not contain wild-type spores. PMID:4822469

  8. Preferential ascus discharge during cross maturation in Sordaria brevicollis.

    PubMed

    MacDonald, D J; Bond, D J

    1974-02-01

    Crosses involving spore color mutants of Sordaria brevicollis all showed a decline in the frequency of second division asymmetric asci (2:2:2:2's) as the cross matured. This decline was due to the preferential maturation and/or discharge of these asci. The proportion of spindle overlap and recombinational asci within the group did not change as shown by ascus dissection. The preferential discharge was also found to occur in two-point crosses where the asci did not contain wild-type spores.

  9. Refinement of a Chemistry Attitude Measure for College Students

    ERIC Educational Resources Information Center

    Xu, Xiaoying; Lewis, Jennifer E.

    2011-01-01

    This work presents the evaluation and refinement of a chemistry attitude measure, Attitude toward the Subject of Chemistry Inventory (ASCI), for college students. The original 20-item and revised 8-item versions of ASCI (V1 and V2) were administered to different samples. The evaluation for ASCI had two main foci: reliability and validity. This…

  10. A history of the American Society for Clinical Investigation

    PubMed Central

    Howell, Joel D.

    2009-01-01

    One hundred years ago, in 1909, the American Society for Clinical Investigation (ASCI) held its first annual meeting. The founding members based this new society on a revolutionary approach to research that emphasized newer physiological methods. In 1924 the ASCI started a new journal, the Journal of Clinical Investigation. The ASCI has also held an annual meeting almost every year. The society has long debated who could be a member, with discussions about whether members must be physicians, what sorts of research they could do, and the role of women within the society. The ASCI has also grappled with what else the society should do, especially whether it ought to take a stand on policy issues. ASCI history has reflected changing social, political, and economic contexts, including several wars, concerns about the ethics of biomedical research, massive increases in federal research funding, and an increasingly large and specialized medical environment. PMID:19348041

  11. The kinetics of autophagy in the lung following acute spinal cord injury in rats.

    PubMed

    Chu, Ruiliang; Wang, Jiuling; Bi, Yang; Nan, Guoxin

    2018-05-01

    Lung injury is a major cause of respiratory complications following an acute spinal cord injury (ASCI), and these complications are associated with a high mortality rate. Autophagy has been shown to be involved in a variety of lung diseases; however, whether autophagy is activated in the lung following ASCI remains unknown. The objective of this experimental animal study was to investigate the induction and kinetics of autophagy in the lung after ASCI. One hundred and forty-four rats (N=144) were divided into two groups: (1) a sham group (n=72) and (2) an injury group (n=72). Allen's method was used to induce an injury at the level of the 10th thoracic vertebra. Rats were sacrificed at 6, 12, 24, 48, and 72 hours, 1 week, and 2 weeks after surgery. Lung pathology and apoptosis were assessed to determine the level of damage in the lung, and LC3, RAB7, P62, and Beclin 1 were used to detect the induction of autophagy. The study was funded by the Natural Science Foundation of China (NSFC, 81272172) and the National Key Specialty Construction of Clinical Projects of China (#2013-544); the funders had no capacity to influence the scholarly conduct of the research, interpretation of results, or dissemination of study outcomes. In the injury group, pathologic changes (i.e., pulmonary congestion, hemorrhage, inflammatory exudation, and alveolar collapse) occurred within the lung tissue within 72 hours after ASCI. Apoptosis of the lung cells gradually increased and peaked 72 hours after ASCI. Within 24 hours of ASCI, LC3 expression decreased, then recovered and gradually increased from 24 hours to 72 hours. As RAB7 decreased, P62 increased, and the ratio of RAB7/LC3 significantly decreased. After ASCI, autophagy in the injured lung underwent dynamic changes: early autophagosome formation decreased and late autophagosomes accumulated; thus, autophagy is in a state of inhibition. Copyright © 2018 Elsevier Inc. All rights reserved.

  12. ASCI visualization tool evaluation, Version 2.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kegelmeyer, P.

    1997-04-01

    The charter of the ASCI Visualization Common Tools subgroup was to investigate and evaluate 3D scientific visualization tools. As part of that effort, a Tri-Lab evaluation effort was launched in February of 1996. The first step was to agree on a thoroughly documented list of 32 features against which all tool candidates would be evaluated. These evaluation criteria were both gleaned from a user survey and determined from informed extrapolation into the future, particularly as concerns the 3D nature and extremely large size of ASCI data sets. The second step was to winnow a field of 41 candidate tools down to 11. The selection principle was to be as inclusive as practical, retaining every tool that seemed to hold any promise of fulfilling all of ASCI's visualization needs. These 11 tools were then closely investigated by volunteer evaluators distributed across LANL, LLNL, and SNL. This report contains the results of those evaluations, as well as a discussion of the evaluation philosophy and criteria.

  13. The ASCI Network for SC '99: A Step on the Path to a 100 Gigabit Per Second Supercomputing Network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pratt, Thomas J.; Tarman, Thomas D.; Martinez, Luis M.

    2000-07-24

    This document highlights the DISCOM² Distance and Distributed Computing and Communication team's activities at the 1999 Supercomputing conference in Portland, Oregon. This conference is sponsored by the IEEE and ACM. Sandia, Lawrence Livermore, and Los Alamos National Laboratories have participated in this conference for eleven years. For the last four years the three laboratories have come together at the conference under the DOE's ASCI (Accelerated Strategic Computing Initiative) rubric. Communication support for the ASCI exhibit is provided by the ASCI DISCOM² project. The DISCOM² communication team uses this forum to demonstrate and focus communication and networking developments within the community. At SC 99, DISCOM built a prototype of the next-generation ASCI network, demonstrated remote clustering techniques, demonstrated the capabilities of emerging terabit router products, demonstrated the latest technologies for delivering visualization data to scientific users, and demonstrated the latest in encryption methods, including IP VPN technologies and ATM encryption research. The authors also coordinated the other production networking activities within the booth and between their demonstration partners on the exhibit floor. This paper documents those accomplishments, discusses the details of their implementation, and describes how these demonstrations support Sandia's overall strategies in ASCI networking.

  14. Effects of aquaporin 4 and inward rectifier potassium channel 4.1 on medullospinal edema after methylprednisolone treatment to suppress acute spinal cord injury in rats.

    PubMed

    Li, Ye; Hu, Haifeng; Liu, Jingchen; Zhu, Qingsan; Gu, Rui

    2018-02-01

    To investigate the effects of aquaporin 4 (AQP4) and inward rectifier potassium channel 4.1 (Kir4.1) on medullospinal edema after treatment with methylprednisolone (MP) to suppress acute spinal cord injury (ASCI) in rats. Sprague Dawley rats were randomly divided into control, sham, ASCI, and MP-treated ASCI groups. After the induction of ASCI, we injected 30 mg/kg MP via the tail vein at various time points. The Tarlov scoring method was applied to evaluate neurological symptoms, and the wet-dry weights method was applied to measure the water content of the spinal cord. The motor function score of the ASCI group was significantly lower than that of the sham group, and the spinal water content was significantly increased. In addition, the levels of AQP4 and Kir4.1 were significantly increased, as was their degree of coexpression. Compared with the ASCI group, the MP group showed a significantly increased motor function score and significantly reduced water content; in addition, the expression and coexpression of AQP4 and Kir4.1 were significantly reduced. Methylprednisolone thus inhibited medullospinal edema in rats with acute spinal cord injury, possibly by reducing the coexpression of AQP4 and Kir4.1 in medullospinal tissues.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sierra Thermal/Fluid Team

    The SIERRA Low Mach Module: Fuego along with the SIERRA Participating Media Radiation Module: Syrinx, henceforth referred to as Fuego and Syrinx, respectively, are the key elements of the ASCI fire environment simulation project. The fire environment simulation project is directed at characterizing both open large-scale pool fires and building enclosure fires. Fuego represents the turbulent, buoyantly-driven incompressible flow, heat transfer, mass transfer, combustion, soot, and absorption coefficient model portion of the simulation software. Syrinx represents the participating-media thermal radiation mechanics. This project is an integral part of the SIERRA multi-mechanics software development project. Fuego depends heavily upon the core architecture developments provided by SIERRA for massively parallel computing, solution adaptivity, and mechanics coupling on unstructured grids.

  16. Zhenbao pill protects against acute spinal cord injury via miR-146a-5p regulating the expression of GPR17

    PubMed Central

    Lv, Bokang; Huan, Yanqiang; Liu, Bin; Li, Yutang; Jia, Lizhou; Qu, Chenhui; Wang, Dongsheng; Yu, Hai

    2018-01-01

    The aim of the present study was to observe the effect of zhenbao pill on the motor function of acute spinal cord injury (ASCI) rats and the molecular mechanisms involving miR-146a-5p and G-protein-coupled receptor 17 (GPR17). An ASCI rat model was established by the modified Allen method, and the rats were divided into three groups. SH-SY5Y cells were cultured overnight under hypoxia and transfected with miR-146a-5p mimic or miR-146a-5p inhibitor. The hind limb motor function of the rats was evaluated by the Basso, Beattie, Bresnahan (BBB) scoring system. Quantitative real-time PCR (qRT-PCR) and Western blot were used to detect the expression of miR-146a-5p, GPR17, inducible nitric oxide synthase (iNOS), interleukin 1β (IL-1β), and tumor necrosis factor α (TNF-α). Neuronal apoptosis was measured using flow cytometry. A luciferase reporter assay was performed to determine the regulation of GPR17 by miR-146a-5p. Zhenbao pill enhanced hind limb motor function and attenuated the inflammatory response caused by ASCI. Moreover, zhenbao pill increased the level of miR-146a-5p and decreased GPR17 expression in vivo and in vitro. Bioinformatics software predicted a binding site for miR-146a-5p in the GPR17 3′-UTR, and the luciferase reporter assay showed that miR-146a-5p negatively regulates GPR17 expression. Knockdown of miR-146a-5p reversed the effect of zhenbao pill on the hypoxia-induced up-regulation of GPR17, reversed its inhibitory effect on hypoxia-induced apoptosis, and abolished the recovery of hind limb motor function in ASCI rats. Zhenbao pill therefore inhibits neuronal apoptosis by regulating miR-146a-5p/GPR17 expression, thereby promoting the recovery of spinal cord function. PMID:29187582

  17. Corridor One: An Integrated Distance Visualization Environment for SSI+ASCI Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Christopher R.; Hansen, Charles D.

    2001-10-29

    The goal of Corridor One: An Integrated Distance Visualization Environment for ASCI and SSI Applications was to combine the forces of six leading-edge laboratories working in the areas of visualization, distributed computing, and high-performance networking (Argonne National Laboratory, Lawrence Berkeley National Laboratory, Los Alamos National Laboratory, University of Illinois, University of Utah, and Princeton University) to develop and deploy the most advanced integrated distance visualization environment for large-scale scientific visualization and demonstrate it on applications relevant to the DOE SSI and ASCI programs. The Corridor One team brought world-class expertise in parallel rendering, deep image-based rendering, immersive environment technology, large-format multi-projector wall-based displays, volume and surface visualization algorithms, collaboration tools and streaming media technology, network protocols for image transmission, high-performance networking, quality-of-service technology, and distributed computing middleware. Our strategy was to build on the very successful teams that produced the I-WAY, ''Computational Grids,'' and CAVE technology and to add these to the teams that have developed the fastest parallel visualization systems and the most widely used networking infrastructure for multicast and distributed media. Unfortunately, just as we were getting going on the Corridor One project, DOE cut the program after the first year. As such, our final report consists of our progress during year one of the grant.

  18. Use of virion DNA as a cloning vector for the construction of mutant and recombinant herpesviruses.

    PubMed

    Duboise, S M; Guo, J; Desrosiers, R C; Jung, J U

    1996-10-15

    We have developed improved procedures for the isolation of deletion mutant, point mutant, and recombinant herpesvirus saimiri. These procedures take advantage of the absence of NotI and AscI restriction enzyme sites within the viral genome and use reporter genes for the identification of recombinant viruses. Genes for secreted engineered alkaline phosphatase and green fluorescent protein were placed under simian virus 40 early promoter control and flanked by NotI and AscI restriction sites. When permissive cells were cotransfected with herpesvirus saimiri virion DNA and one of the engineered reporter genes cloned within herpesvirus saimiri sequences, recombinant viruses were readily identified and purified on the basis of expression of the reporter gene. Digestion of recombinant virion DNA with NotI or AscI was used to delete the reporter gene from the recombinant herpesvirus saimiri. Replacement of the reporter gene can be achieved by NotI or AscI digestion of virion DNA and ligation with a terminally matched fragment or, alternatively, by homologous recombination in cotransfected cells. Any gene can, in theory, be cloned directly into the virion DNA when flanked by the appropriate NotI or AscI sites. These procedures should be widely applicable in their general form to most or all herpesviruses that replicate permissively in cultured cells.
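
    As an aside, the engineering precondition above is easy to check computationally: the NotI (GCGGCCGC) and AscI (GGCGCGCC) recognition sequences must be absent from the viral genome before they can serve as unique cloning sites. The sketch below is illustrative only (the toy fragment is invented, and real work would use Biopython's Restriction package); since both recognition sites are palindromic, scanning one strand suffices.

```python
# Minimal sketch: scan a DNA sequence for NotI and AscI recognition sites.
# The toy fragment below is hypothetical; this is not code from the paper.

SITES = {
    "NotI": "GCGGCCGC",   # 8-bp palindromic recognition sequence
    "AscI": "GGCGCGCC",   # 8-bp palindromic recognition sequence
}

def find_sites(seq: str, motif: str) -> list:
    """Return 0-based start positions of every occurrence of motif in seq."""
    seq = seq.upper()
    hits, start = [], seq.find(motif)
    while start != -1:
        hits.append(start)
        start = seq.find(motif, start + 1)
    return hits

if __name__ == "__main__":
    toy_genome = "ATGGCGCGCCTTAGCGGCCGCAA"  # invented fragment for illustration
    for name, motif in SITES.items():
        positions = find_sites(toy_genome, motif)
        status = "absent (usable as a unique site)" if not positions else f"found at {positions}"
        print(f"{name} ({motif}): {status}")
```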

  19. SIERRA Low Mach Module: Fuego User Manual Version 4.46.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sierra Thermal/Fluid Team

    2017-09-01

    The SIERRA Low Mach Module: Fuego along with the SIERRA Participating Media Radiation Module: Syrinx, henceforth referred to as Fuego and Syrinx, respectively, are the key elements of the ASCI fire environment simulation project. The fire environment simulation project is directed at characterizing both open large-scale pool fires and building enclosure fires. Fuego represents the turbulent, buoyantly-driven incompressible flow, heat transfer, mass transfer, combustion, soot, and absorption coefficient model portion of the simulation software. Syrinx represents the participating-media thermal radiation mechanics. This project is an integral part of the SIERRA multi-mechanics software development project. Fuego depends heavily upon the core architecture developments provided by SIERRA for massively parallel computing, solution adaptivity, and mechanics coupling on unstructured grids.

  20. SIERRA Low Mach Module: Fuego Theory Manual Version 4.44

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sierra Thermal/Fluid Team

    2017-04-01

    The SIERRA Low Mach Module: Fuego along with the SIERRA Participating Media Radiation Module: Syrinx, henceforth referred to as Fuego and Syrinx, respectively, are the key elements of the ASCI fire environment simulation project. The fire environment simulation project is directed at characterizing both open large-scale pool fires and building enclosure fires. Fuego represents the turbulent, buoyantly-driven incompressible flow, heat transfer, mass transfer, combustion, soot, and absorption coefficient model portion of the simulation software. Syrinx represents the participating-media thermal radiation mechanics. This project is an integral part of the SIERRA multi-mechanics software development project. Fuego depends heavily upon the core architecture developments provided by SIERRA for massively parallel computing, solution adaptivity, and mechanics coupling on unstructured grids.

  1. SIERRA Low Mach Module: Fuego Theory Manual Version 4.46.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sierra Thermal/Fluid Team

    The SIERRA Low Mach Module: Fuego along with the SIERRA Participating Media Radiation Module: Syrinx, henceforth referred to as Fuego and Syrinx, respectively, are the key elements of the ASCI fire environment simulation project. The fire environment simulation project is directed at characterizing both open large-scale pool fires and building enclosure fires. Fuego represents the turbulent, buoyantly-driven incompressible flow, heat transfer, mass transfer, combustion, soot, and absorption coefficient model portion of the simulation software. Syrinx represents the participating-media thermal radiation mechanics. This project is an integral part of the SIERRA multi-mechanics software development project. Fuego depends heavily upon the core architecture developments provided by SIERRA for massively parallel computing, solution adaptivity, and mechanics coupling on unstructured grids.

  2. Cooperative high-performance storage in the accelerated strategic computing initiative

    NASA Technical Reports Server (NTRS)

    Gary, Mark; Howard, Barry; Louis, Steve; Minuzzo, Kim; Seager, Mark

    1996-01-01

    The use and acceptance of new high-performance, parallel computing platforms will be impeded by the absence of an infrastructure capable of supporting orders-of-magnitude improvement in hierarchical storage and high-speed I/O (Input/Output). The distribution of these high-performance platforms and supporting infrastructures across a wide-area network further compounds this problem. We describe an architectural design and phased implementation plan for a distributed, Cooperative Storage Environment (CSE) to achieve the necessary performance, user transparency, site autonomy, communication, and security features needed to support the Accelerated Strategic Computing Initiative (ASCI). ASCI is a Department of Energy (DOE) program attempting to apply terascale platforms and Problem-Solving Environments (PSEs) toward real-world computational modeling and simulation problems. The ASCI mission must be carried out through a unified, multilaboratory effort, and will require highly secure, efficient access to vast amounts of data. The CSE provides a logically simple, geographically distributed, storage infrastructure of semi-autonomous cooperating sites to meet the strategic ASCI PSE goal of high-performance data storage and access at the user desktop.

  3. The incidence of rugby-related catastrophic injuries (including cardiac events) in South Africa from 2008 to 2011: a cohort study

    PubMed Central

    Brown, James Craig; Lambert, Mike I; Verhagen, Evert; Readhead, Clint; van Mechelen, Willem; Viljoen, Wayne

    2013-01-01

    Objectives: To establish an accurate and comprehensive injury incidence registry of all rugby union-related catastrophic events in South Africa between 2008 and 2011. An additional aim was to investigate correlates associated with these injuries. Design: Prospective. Setting: The South African amateur and professional rugby-playing population. Participants: An estimated 529 483 Junior and 121 663 Senior rugby union (‘rugby’) players (population at risk). Outcome measures: Annual average incidences of rugby-related catastrophic injuries by type (cardiac events, traumatic brain and acute spinal cord injuries (ASCIs)) and outcome (full recoveries to fatalities). Playing level (junior and senior levels), position and event (phase of play) were also assessed. Results: The average annual incidence of ASCIs and Traumatic Brain Injuries combined was 2.00 per 100 000 players (95% CI 0.91 to 3.08) from 2008 to 2011. The incidence of ASCIs with permanent outcomes was significantly higher at the Senior level (4.52 per 100 000 players, 95% CI 0.74 to 8.30) than the Junior level (0.24 per 100 000 players, 95% CI 0 to 0.65) during this period. The hooker position was associated with 46% (n=12 of 26) of all permanent ASCI outcomes, the majority of which (83%) occurred during the scrum phase of play. Conclusions: The incidence of rugby-related catastrophic injuries in South Africa between 2008 and 2011 is comparable to that of other countries and to most other collision sports. The higher incidence rate of permanent ASCIs at the Senior level could be related to the different law variations or characteristics (eg, less regular training) compared with the Junior level. The hooker and scrum were associated with high proportions of permanent ASCIs. The BokSmart injury prevention programme should focus efforts on these areas (Senior level, hooker and scrum) and use this study as a reference point for the evaluation of the effectiveness of the programme. PMID:23447464

  4. The incidence of rugby-related catastrophic injuries (including cardiac events) in South Africa from 2008 to 2011: a cohort study.

    PubMed

    Brown, James Craig; Lambert, Mike I; Verhagen, Evert; Readhead, Clint; van Mechelen, Willem; Viljoen, Wayne

    2013-01-01

    To establish an accurate and comprehensive injury incidence registry of all rugby union-related catastrophic events in South Africa between 2008 and 2011. An additional aim was to investigate correlates associated with these injuries. Prospective. The South African amateur and professional rugby-playing population. An estimated 529 483 Junior and 121 663 Senior rugby union ('rugby') players (population at risk). Annual average incidences of rugby-related catastrophic injuries by type (cardiac events, traumatic brain and acute spinal cord injuries (ASCIs)) and outcome (full recoveries-fatalities). Playing level (junior and senior levels), position and event (phase of play) were also assessed. The average annual incidence of ASCIs and Traumatic Brain Injuries combined was 2.00 per 100 000 players (95% CI 0.91 to 3.08) from 2008 to 2011. The incidence of ASCIs with permanent outcomes was significantly higher at the Senior level (4.52 per 100 000 players, 95% CI 0.74 to 8.30) than the Junior level (0.24 per 100 000 players, 95% CI 0 to 0.65) during this period. The hooker position was associated with 46% (n=12 of 26) of all permanent ASCI outcomes, the majority of which (83%) occurred during the scrum phase of play. The incidence of rugby-related catastrophic injuries in South Africa between 2008 and 2011 is comparable to that of other countries and to most other collision sports. The higher incidence rate of permanent ASCIs at the Senior level could be related to the different law variations or characteristics (eg, less regular training) compared with the Junior level. The hooker and scrum were associated with high proportions of permanent ASCIs. The BokSmart injury prevention programme should focus efforts on these areas (Senior level, hooker and scrum) and use this study as a reference point for the evaluation of the effectiveness of the programme.
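
    The headline figures in these two records reduce to simple rate arithmetic: cases divided by player-years, scaled to 100 000. A hedged sketch of that calculation follows; the case count below is invented (chosen only to reproduce the abstract's point estimate), and since the paper's interval method is not stated here, a simple Poisson normal approximation is assumed.

```python
# Hypothetical sketch of incidence-rate arithmetic in the style of
# "2.00 per 100 000 players (95% CI 0.91 to 3.08)". The case count is
# invented, and the CI method (Poisson normal approximation) is an assumption.
import math

def annual_incidence(cases, players, years, per=100_000):
    """Average annual incidence per `per` players, with an approximate 95% CI."""
    person_years = players * years
    rate = cases / person_years * per
    se = math.sqrt(cases) / person_years * per  # Poisson standard error on the count
    return rate, rate - 1.96 * se, rate + 1.96 * se

players = 529_483 + 121_663  # junior + senior populations from the abstract
rate, lo, hi = annual_incidence(cases=52, players=players, years=4)  # 52 is invented
print(f"{rate:.2f} per 100 000 players (95% CI {lo:.2f} to {hi:.2f})")
```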

  5. Adaptation of the Attitude toward the Subject of Chemistry Inventory (ASCI) into Turkish

    ERIC Educational Resources Information Center

    Sen, Senol; Yilmaz, Ayhan; Temel, Senar

    2016-01-01

    Developing an attitude that is influential on individuals' behaviour and related to academic achievement is a concept whose development science educators consider important. This research aims to adapt the 8-item Attitude toward the Subject of Chemistry Inventory (ASCI)--which was developed by Bauer (2008) and revised by Xu and Lewis (2011)--into…

  6. A Novel Gene, ROA, Is Required for Normal Morphogenesis and Discharge of Ascospores in Gibberella zeae

    PubMed Central

    Min, Kyunghun; Lee, Jungkwan; Kim, Jin-Cheol; Kim, Sang Gyu; Kim, Young Ho; Vogel, Steven; Trail, Frances; Lee, Yin-Won

    2010-01-01

    Head blight, caused by Gibberella zeae, is a significant disease among cereal crops, including wheat, barley, and rice, due to contamination of grain with mycotoxins. G. zeae is spread by ascospores forcibly discharged from sexual fruiting bodies forming on crop residues. In this study, we characterized a novel gene, ROA, which is required for normal sexual development. Deletion of ROA (Δroa) resulted in an abnormal size and shape of asci and ascospores but did not affect vegetative growth. The Δroa mutation triggered round ascospores and insufficient cell division after spore delimitation. The asci of the Δroa strain discharged fewer ascospores from the perithecia but achieved a greater dispersal distance than those of the wild-type strain. Turgor pressure within the asci was calculated through the analysis of osmolytes in the epiplasmic fluid. Deletion of the ROA gene appeared to increase turgor pressure in the mutant asci. The higher turgor pressure of the Δroa mutant asci and the mutant spore shape contributed to the longer distance dispersal. When the Δroa mutant was outcrossed with a Δmat1-2 mutant, a strain that contains a green fluorescence protein (GFP) marker in place of the MAT1-2 gene, unusual phenotypic segregation occurred. The ratio of GFP to non-GFP segregation was 1:1; however, all eight spores had the same shape. Taken together, the results of this study suggest that ROA plays multiple roles in maintaining the proper morphology and discharge of ascospores in G. zeae. PMID:20802018
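
    The turgor estimate mentioned in this abstract rests on the van 't Hoff relation Π = cRT, which converts the total osmolyte concentration of the epiplasmic fluid into an osmotic (turgor) pressure. A worked example under invented numbers follows; the concentration below is hypothetical, not the paper's measurement.

```python
# Hypothetical worked example of the van 't Hoff osmotic-pressure relation
# pi = c * R * T, used to estimate ascus turgor from epiplasmic osmolytes.
R = 8.314     # gas constant, J/(mol*K)
T = 298.0     # temperature, K (25 C)
c = 2000.0    # total osmolyte concentration, mol/m^3 (2 osmol/L, invented)

pressure_pa = c * R * T
print(f"estimated turgor: {pressure_pa / 1e6:.2f} MPa")  # ~4.96 MPa for these inputs
```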

  7. Antigen Binding and Site-Directed Labeling of Biosilica-Immobilized Fusion Proteins Expressed in Diatoms.

    PubMed

    Ford, Nicole R; Hecht, Karen A; Hu, DeHong; Orr, Galya; Xiong, Yijia; Squier, Thomas C; Rorrer, Gregory L; Roesijadi, Guritno

    2016-03-18

    The diatom Thalassiosira pseudonana was genetically modified to express biosilica-targeted fusion proteins comprising either enhanced green fluorescent protein (EGFP) or single chain antibodies engineered with a tetracysteine tagging sequence. Of interest were the site-specific binding of (1) the fluorescent biarsenical probe AsCy3 and AsCy3e to the tetracysteine tagged fusion proteins and (2) high and low molecular mass antigens, the Bacillus anthracis surface layer protein EA1 or small molecule explosive trinitrotoluene (TNT), to biosilica-immobilized single chain antibodies. Analysis of biarsenical probe binding using fluorescence and structured illumination microscopy indicated differential colocalization with EGFP in nascent and mature biosilica, supporting the use of either EGFP or bound AsCy3 and AsCy3e in studying biosilica maturation. Large increases in the lifetime of a fluorescent analogue of TNT upon binding single chain antibodies provided a robust signal capable of discriminating binding to immobilized antibodies in the transformed frustule from nonspecific binding to the biosilica matrix. In conclusion, our results demonstrate an ability to engineer diatoms to create antibody-functionalized mesoporous silica able to selectively bind chemical and biological agents for the development of sensing platforms.

  8. Downregulation of miR-199b promotes the acute spinal cord injury through IKKβ-NF-κB signaling pathway activating microglial cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Heng-Jun; Wang, Li-Qing; Xu, Qing-Sheng

    Inflammatory response played an important role in the progression of spinal cord injury (SCI). Several miRNAs are associated with the pathology of SCI; however, the molecular mechanism by which miRNAs participate in the inflammatory response in acute SCI (ASCI) was poorly understood. Sprague-Dawley (SD) rats were divided into 2 groups: control group (n=6) and acute SCI (ASCI) group (n=6). The expression of miR-199b and the IκB kinase β-nuclear factor-kappa B (IKKβ-NF-κB) signaling pathway were evaluated by quantitative reverse transcription-PCR (qRT-PCR) in rats with ASCI and in primary microglia activated by lipopolysaccharide (LPS). We found that downregulation of miR-199b and activation of IKKβ/NF-κB were observed in rats after ASCI and in activated microglia. Using a luciferase reporter assay, miR-199b was shown to negatively regulate IKKβ by targeting its 3′-untranslated region (UTR). Overexpression of miR-199b reversed the up-regulation of IKKβ, p-p65, tumor necrosis factor-α (TNF-α) and interleukin-1β (IL-1β) in LPS-treated BV2 cells, as assessed by western blotting. In addition, BMS-345541 reversed the up-regulation of TNF-α and IL-1β induced by the miR-199b inhibitor. In SCI rats, overexpression of miR-199b attenuated ASCI and decreased the expression of the IKKβ-NF-κB signaling pathway, TNF-α, and IL-1β. These results indicate that miR-199b attenuates ASCI at least partly through the IKKβ-NF-κB signaling pathway and by affecting the function of microglia. Our findings suggest that miR-199b may be employed as a therapeutic for spinal cord injury. Highlights:
    • Downregulation of miR-199b and activation of IKKβ/NF-κB were observed in rats after SCI.
    • miR-199b negatively regulated IKKβ by targeting its 3′-UTR.
    • miR-199b overexpression reversed the increases in IKKβ, p-p65, TNF-α and IL-1β in LPS-treated BV2 cells.
    • BMS-345541 reversed the up-regulation of TNF-α and IL-1β induced by the miR-199b inhibitor.
    • Overexpression of miR-199b attenuated ASCI and decreased the expression of IKKβ-NF-κB in rats.

  9. Inhibition of Breast Cancer-Induced Bone Pain, Metastasis and Osteolysis in Nude Mice by LOVAZA and DHA Fatty Acids

    DTIC Science & Technology

    2011-10-01

    GCGGTTGTCCC/CATGGTAACAGCATTGCAGGTGC (30 cycles); mouse ASIC3, TGAGAGCCACCAGCTTACCT/ACATGTCCTCAAGGGAGTGG (30 cycles); mouse TRPV1 ... follows: ASIC1a 506bp, ASIC1b 563bp, ASIC3 245bp, TRPV1 324bp, and GAPDH 233bp, as seen in Figure 8A. Reference: Malin S, et al. (2007

  10. Spinal cord injuries in Australian footballers 1997-2002.

    PubMed

    Carmody, David J; Taylor, Thomas K F; Parker, David A; Coolican, Myles R J; Cumming, Robert G

    2005-06-06

    To review acute spinal cord injuries (ASCIs) in all Australian codes of football (rugby union [RU], rugby league [RL], Australian Rules football [ARF] and soccer) for 1997-2002 and to compare data with those of a 1986-1996 survey. Retrospective review of hospital records, and structured interviews with injured players. Patients admitted to any of the six Australian spinal cord injury units with a documented football-related ASCI over the period 1997-2002. Average annual incidence of ASCIs per 100,000 players in the different codes, final Frankel grading of injuries, and wheelchair status. Fifty-two footballers (45 adult men and seven schoolboys) suffered ASCIs between 1997 and 2002. The average annual incidence of ASCIs per 100,000 players was 3.2 for RU, 1.5 for RL, 0.5 for ARF and 0.2 for soccer. While there has been little change in incidence since the 1986-1996 survey, there has been a trend towards less severe injuries in RU and RL, but not in ARF. There have been no scrum injuries in RL since 1996, when the scrum stopped being contested. Seven injuries occurred in RU scrums, six at the moment of engagement of the opposing teams. The incidence of 2-on-1 and "gang" tackles (involving multiple tacklers) in RL is disturbing. Overall, 39% of injured players became permanently wheelchair-dependent. There continues to be good reason to revise the laws of scrum engagement in RU. The laws relating to multiple tacklers in RL should be examined. The insurance cover for injured players is grossly inadequate. The longstanding need for a registry of spinal cord injuries for all football codes regrettably remains unmet.

  11. Modal characterization of the ASCIE segmented optics testbed: New algorithms and experimental results

    NASA Technical Reports Server (NTRS)

    Carrier, Alain C.; Aubrun, Jean-Noel

    1993-01-01

    New frequency response measurement procedures, on-line modal tuning techniques, and off-line modal identification algorithms are developed and applied to the modal identification of the Advanced Structures/Controls Integrated Experiment (ASCIE), a generic segmented optics telescope test-bed representative of future complex space structures. The frequency response measurement procedure uses all the actuators simultaneously to excite the structure and all the sensors to measure the structural response so that all the transfer functions are measured simultaneously. Structural responses to sinusoidal excitations are measured and analyzed to calculate spectral responses. The spectral responses in turn are analyzed as the spectral data become available and, which is new, the results are used to maintain high quality measurements. Data acquisition, processing, and checking procedures are fully automated. As the acquisition of the frequency response progresses, an on-line algorithm keeps track of the actuator force distribution that maximizes the structural response to automatically tune to a structural mode when approaching a resonant frequency. This tuning is insensitive to delays, ill-conditioning, and nonproportional damping. Experimental results show that it is useful for modal surveys even in high modal density regions. For thorough modeling, a constructive procedure is proposed to identify the dynamics of a complex system from its frequency response with the minimization of a least-squares cost function as a desirable objective. This procedure relies on off-line modal separation algorithms to extract modal information and on least-squares parameter subset optimization to combine the modal results and globally fit the modal parameters to the measured data. The modal separation algorithms resolved a modal density of 5 modes/Hz in the ASCIE experiment. They promise to be useful in many challenging applications.
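
    To make the least-squares step concrete, the sketch below fits the natural frequency and damping of a single mode to synthetic frequency-response data. This is a minimal stand-in for the global parameter fit described above, not the ASCIE code; the 12 Hz mode, damping ratio, and noise level are all invented.

```python
# Minimal sketch: least-squares fit of one mode's parameters (natural
# frequency, damping, amplitude) to frequency-response data. All numbers
# are invented; this is not the ASCIE identification code.
import numpy as np
from scipy.optimize import least_squares

def sdof_frf(w, wn, zeta, a):
    """Single-mode frequency response: a / (wn^2 - w^2 + 2j*zeta*wn*w)."""
    return a / (wn**2 - w**2 + 2j * zeta * wn * w)

# Synthetic "measurement": a 12 Hz mode with 1% damping, plus noise.
w = 2 * np.pi * np.linspace(10.0, 14.0, 200)
rng = np.random.default_rng(0)
noise = 1e-5 * (rng.standard_normal(w.size) + 1j * rng.standard_normal(w.size))
h_meas = sdof_frf(w, wn=2 * np.pi * 12.0, zeta=0.01, a=1.0) + noise

def residuals(p):
    wn, zeta, a = p
    r = sdof_frf(w, wn, zeta, a) - h_meas
    return np.concatenate([r.real, r.imag])  # stack real/imag for a real-valued LSQ

fit = least_squares(residuals, x0=[2 * np.pi * 11.5, 0.05, 0.5])
wn, zeta, a = fit.x
print(f"identified: fn = {wn / (2 * np.pi):.3f} Hz, zeta = {zeta:.4f}")
```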

  12. An Insertional Translocation in Neurospora That Generates Duplications Heterozygous for Mating Type

    PubMed Central

    Perkins, David D.

    1972-01-01

    In strain T(I→II)39311 a long interstitial segment is transposed from IL to IIR, where it is inserted in reversed order with respect to the centromere. In crosses of T x T essentially all asci have eight viable, black spores, and all progeny are phenotypically normal. When T(I→II)39311 is crossed by Normal sequence (N), the expected duplication class is viable while the corresponding deficiency is lethal; 44% of the asci have 8 Black (viable) spores and 0 White (inviable) spores, 41% have 4 Black: 4 White, and 10% have 6 Black: 2 White. These are the ascus types expected from normal centromere disjunction without crossing over (8B:0W and 4B:4W equally probable), and with crossing over between centromere and break point (6B:2W). On germination, 8B:0W asci give rise to only parental types—4 T and 4 N; 4B:4W asci usually give four duplication (Dup) progeny; and 6B:2W asci usually give 2 T, 2 N, 2 Dup. Thus one third of all viable, black ascospores contain duplications.—Recessive markers in the donor chromosome which contributes the translocated segment can be mapped by duplication coverage. Ratios of 2 Dominant: 1 Recessive vs. 1 Dominant: 2 Recessive distinguish location in or outside the transposed segment. Eleven loci including mating type have been shown to lie within the segment, and markers at four loci have been transferred into the segment by meiotic recombination. The frequency of marker transfer indicates that the inserted segment usually pairs with its homologue. Ascus types that would result from single exchanges within the insertion are infrequent, as expected if asci containing dicentric bridges usually do not survive.—Duplication ascospores germinate to produce distinctive inhibited colonies. Later these "escape" to grow like wild type, and genes that were initially heterozygous in the duplication segregate when escape occurs. As with duplications from pericentric inversion In(IL→IR)H4250 (Newmeyer and Taylor 1967), the initial inhibition is attributed to mating-type heterozygosity, and escape to a somatic event that makes mating type homo- or hemizygous.—Twenty additional duplication-generating Neurospora rearrangements are listed and described briefly in an Appendix. PMID:17248574

  13. Science and Technology Review June 2000

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    de Pruneda, J.H.

    2000-06-01

    This issue contains the following articles: (1) ''Accelerating on the ASCI Challenge''. (2) ''New Day Dawns in Supercomputing'': When the ASCI White supercomputer comes online this summer, DOE's Stockpile Stewardship Program will make another significant advance toward helping to ensure the safety, reliability, and performance of the nation's nuclear weapons. (3) ''Uncovering the Secrets of Actinides'': Researchers are obtaining fundamental information about the actinides, a group of elements with a key role in nuclear weapons and fuels. (4) ''A Predictable Structure for Aerogels''. (5) ''Tibet--Where Continents Collide''.

  14. Transreplication and crossing over in Sordaria fimicola.

    PubMed

    Kitani, Y; Olive, L S; El-Ani, A S

    1961-09-08

    A study of the segregation of markers closely linked to the gray ascospore color locus in Sordaria fimicola reveals that there is a high incidence of crossing over very near the locus when it transreplicates, which is much more pronounced in 5:3 than in 6:2 asci. Also, a single 7:1 and several aberrant 4:4 asci are described. At a different spore color locus, transreplication yields only 6:2 ratios, while other spore color loci fail to transreplicate altogether.

  15. Changes in the Lipid Composition and Fine Structure of Saccharomyces cerevisiae During Ascus Formation

    PubMed Central

    Illingworth, R. F.; Rose, A. H.; Beckett, A.

    1973-01-01

    Eighty to ninety percent of vegetative cells of Saccharomyces cerevisiae DCL 740 incubated in KCl-acetate medium form asci, the majority of which are four-spored. Ascospores are visible in asci after about 24 hr, and spore formation is complete after about 48 hr. The dry weight of the cells increases by about 75% during 48 hr of incubation, while the lipid content of the cells increases by a factor of four. The increase in lipid content is attributed mainly to an increased synthesis of sterol esters and triacylglycerols and to a lesser extent of phospholipids. The phospholipid and sterol compositions do not change appreciably, but there is a marked increase in the proportion of unsaturated fatty acid residues in ascan lipids. Uniformly labeled 14C-acetate is incorporated mainly into sterol esters and triacylglycerols and phospholipids. Pulse-labeling by adding acetate-U-14C to sporulating cultures and harvesting after a further 6 hr of incubation reveal two main periods of acetate incorporation, namely between 0 and 18 hr, and between 24 and 30 hr. Electron micrographs of thin sections through developing asci show that the principal changes in fine structure occur between 18 and 24 hr and include the appearance of numerous electron-transparent vesicles which become aligned around the meiotic nucleus, and the laying down of extensive endoplasmic reticulum membranes. Changes in fine structure are discussed in relation to the alterations in lipid content and composition of asci. PMID:4569408

  16. Expression of Meiotic Drive Elements Spore Killer-2 and Spore Killer-3 in Asci of Neurospora Tetrasperma

    PubMed Central

    Raju, N. B.; Perkins, D. D.

    1991-01-01

    It was shown previously that when a chromosomal Spore killer factor is heterozygous in Neurospora species with eight-spored asci, the four sensitive ascospores in each ascus die and the four survivors are all killers. Sk-2(K) and Sk-3(K) are nonrecombining haplotypes that segregate with the centromere of linkage group III. No killing occurs when either one of these killers is homozygous, but each is sensitive to killing by the other in crosses of Sk-2(K) X Sk-3(K). In the present study, Sk-2(K) and Sk-3(K) were transferred by recurrent backcrosses from the eight-spored species Neurospora crassa into Neurospora tetrasperma, a pseudohomothallic species which normally makes asci with four large spores, each heterokaryotic for mating type and for any other centromere-linked genes that are heterozygous in the cross. The action of Sk-2(K) and Sk-3(K) in N. tetrasperma is that predicted from their behavior in eight-spored species. A sensitive nucleus is protected from killing if it is enclosed in the same ascospore with a killer nucleus. Crosses of Sk-2(K) X Sk-2(S), Sk-3(K) X Sk-3(S), and Sk-2(K) X Sk-3(K) all produce four-spored asci that are wild type in appearance, with the ascospores heterokaryotic and viable. The Eight-spore gene E, which shows variable penetrance, was used to obtain N. tetrasperma asci in which two to eight spores are small and homokaryotic. When killer and sensitive alleles are segregating in the presence of E, only those ascospores that contain a killer allele survive. Half of the small ascospores are killed. In crosses of Sk-2(K) X Sk-3(K) (with E heterozygous), effectively all small ascospores are killed. The ability of N. tetrasperma to carry killer elements in cryptic condition suggests a possible role for Spore killers in the origin of pseudohomothallism, with adoption of the four-spored mode restoring ascospore viability of crosses in which killing would otherwise occur. PMID:1834522

  17. Characterization of Listeria monocytogenes from an ice cream plant by serotyping and pulsed-field gel electrophoresis.

    PubMed

    Miettinen, M K; Björkroth, K J; Korkeala, H J

    1999-02-18

    One dominating strain of serotype 1/2b was found when serotyping and pulsed-field gel electrophoresis (PFGE) patterns were used for the characterization of 41 Listeria monocytogenes isolates originating from an ice cream plant. Samples were taken from the production environment, equipment and ice cream during the years 1990-1997. Serotyping divided the isolates into two serovars, 1/2b and 4b. Three rare-cutting enzymes (ApaI, AscI and SmaI) were used in the creation of PFGE patterns. AscI resulted in the best restriction enzyme digestion patterns (REDPs) for visual comparison. Eight different AscI REDPs were obtained, whereas ApaI produced six and SmaI seven banding patterns. When one-band differences are taken into account, 12 different PFGE types were distinguished based on information obtained with all three enzymes. The dominant PFGE type was found to have persisted in the ice cream plant for seven years. Improved and precisely targeted cleaning and disinfection practices, combined with structural changes that made the packaging machine easier to clean, resulted in the eradication of L. monocytogenes from this plant.
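
    The REDP comparison in this study was visual, but the same idea is often expressed numerically as a band-matching similarity. The sketch below is illustrative only: it scores two AscI fragment-size patterns with a Dice coefficient, and the fragment sizes are invented.

```python
# Illustrative sketch: Dice similarity between two restriction digestion
# patterns (REDPs), each given as a list of fragment sizes. Two bands "match"
# if their sizes agree within a fractional tolerance. Sizes are invented.

def dice_similarity(bands_a, bands_b, tol=0.02):
    """Dice coefficient: 2 * matches / (len(a) + len(b))."""
    unmatched_b = list(bands_b)
    matches = 0
    for a in bands_a:
        for b in unmatched_b:
            if abs(a - b) <= tol * max(a, b):
                matches += 1
                unmatched_b.remove(b)  # each band can match at most once
                break
    return 2 * matches / (len(bands_a) + len(bands_b))

redp_1 = [485, 310, 210, 145, 90]  # kb, hypothetical AscI pattern
redp_2 = [485, 305, 210, 120, 90]  # kb, hypothetical AscI pattern
print(f"Dice similarity: {dice_similarity(redp_1, redp_2):.2f}")  # 0.80 here
```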

  18. IP-Based Video Modem Extender Requirements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pierson, L G; Boorman, T M; Howe, R E

    2003-12-16

    Visualization is one of the keys to understanding large complex data sets such as those generated by the large computing resources purchased and developed by the Advanced Simulation and Computing program (aka ASCI). In order to be convenient to researchers, visualization data must be distributed to offices and large complex visualization theaters. Currently, local distribution of the visual data is accomplished by distance-limited modems and RGB switches that simply do not scale to hundreds of users across local, metropolitan, and WAN distances without incurring large costs in fiber plant installation and maintenance. Wide-area application over the DOE Complex is infeasible using these limited-distance RGB extenders. On the other hand, Internet Protocols (IP) over Ethernet is a scalable, well-proven technology that can distribute large volumes of data over these distances. Visual data has been distributed at lower resolutions over IP in industrial applications. This document describes requirements of the ASCI program in visual signal distribution for the purpose of identifying industrial partners willing to develop products to meet ASCI's needs.

  19. Advanced Simulation and Computing: A Summary Report to the Director's Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCoy, M G; Peck, T

    2003-06-01

    It has now been three years since the Advanced Simulation and Computing Program (ASCI), as managed by the Defense and Nuclear Technologies (DNT) Directorate, was reviewed by this Director's Review Committee (DRC). Since that time, there has been considerable progress in all components of the ASCI Program, and these developments will be highlighted in this document and in the presentations planned for June 9 and 10, 2003. There have also been some name changes. Today, the Program is called ''Advanced Simulation and Computing.'' Although it retains the familiar acronym ASCI, the initiative nature of the effort has given way to sustained services as an integral part of the Stockpile Stewardship Program (SSP). All computing efforts at LLNL and the other two Defense Program (DP) laboratories are funded and managed under ASCI. This includes the so-called legacy codes, which remain essential tools in stockpile stewardship. The contract between the Department of Energy (DOE) and the University of California (UC) specifies an independent appraisal of Directorate technical work and programmatic management; that appraisal is the work of this DNT Review Committee. Beginning this year, the Laboratory is implementing a new review system. This process was negotiated between UC, the National Nuclear Security Administration (NNSA), and the Laboratory Directors. Central to this approach are eight performance objectives that focus on key programmatic and administrative goals. Associated with each of these objectives are a number of performance measures to more clearly characterize the attainment of the objectives. Each performance measure has a lead directorate and one or more contributing directorates, an evaluation plan, and identified documentation to be included in the ''Assessment File''.

  20. Most UV-Induced Reciprocal Translocations in SORDARIA MACROSPORA Occur in or near Centromere Regions

    PubMed Central

    Leblon, G.; Zickler, D.; Lebilcot, S.

    1986-01-01

    In fungi, translocations can be identified and classified by the patterns of ascospore abortion in asci from crosses of rearrangement x normal sequence. Previous studies of UV-induced rearrangements in Sordaria macrospora revealed that a major class (called type III) appeared to be reciprocal translocations that were anomalous in producing an unexpected class of asci with four aborted ascospores in bbbbaaaa linear sequence (b = black; a = abortive). The present study shows that the anomalous type III rearrangements are, in fact, reciprocal translocations having both breakpoints within or adjacent to centromeres and that bbbbaaaa asci result from 3:1 disjunction from the translocation quadrivalent.—Electron microscopic observations of synaptonemal complexes enable centromeres to be visualized. Lengths of synaptonemal complex lateral elements in translocation quadrivalents accurately reflect chromosome arm lengths, enabling breakpoints to be located reliably in centromere regions. All genetic data are consistent with the behavior expected of translocations with breakpoints at centromeres.—Two-thirds of the UV-induced reciprocal translocations are of this type. Certain centromere regions are involved preferentially. Among 73 type-III translocations, there were but 13 of the 21 possible chromosome combinations and 20 of the 42 possible combinations of chromosome arms. PMID:17246312

  1. Most UV-Induced Reciprocal Translocations in SORDARIA MACROSPORA Occur in or near Centromere Regions.

    PubMed

    Leblon, G; Zickler, D; Lebilcot, S

    1986-02-01

    In fungi, translocations can be identified and classified by the patterns of ascospore abortion in asci from crosses of rearrangement x normal sequence. Previous studies of UV-induced rearrangements in Sordaria macrospora revealed that a major class (called type III) appeared to be reciprocal translocations that were anomalous in producing an unexpected class of asci with four aborted ascospores in bbbbaaaa linear sequence (b = black; a = abortive). The present study shows that the anomalous type III rearrangements are, in fact, reciprocal translocations having both breakpoints within or adjacent to centromeres and that bbbbaaaa asci result from 3:1 disjunction from the translocation quadrivalent.—Electron microscopic observations of synaptonemal complexes enable centromeres to be visualized. Lengths of synaptonemal complex lateral elements in translocation quadrivalents accurately reflect chromosome arm lengths, enabling breakpoints to be located reliably in centromere regions. All genetic data are consistent with the behavior expected of translocations with breakpoints at centromeres.—Two-thirds of the UV-induced reciprocal translocations are of this type. Certain centromere regions are involved preferentially. Among 73 type-III translocations, there were but 13 of the 21 possible chromosome combinations and 20 of the 42 possible combinations of chromosome arms.

  2. Modeling and Simulation of Explosively Driven Electromechanical Devices

    NASA Astrophysics Data System (ADS)

    Demmie, Paul N.

    2002-07-01

    Components that store electrical energy in ferroelectric materials and produce currents when their permittivity is explosively reduced are used in a variety of applications. The modeling and simulation of such devices is a challenging problem, since one has to represent the coupled physics of detonation, shock propagation, and electromagnetic field generation. High-fidelity modeling and simulation of complicated electromechanical devices was not feasible prior to having the Accelerated Strategic Computing Initiative (ASCI) computers and the ASCI-developed codes at Sandia National Laboratories (SNL). The EMMA computer code is used to model such devices and simulate their operation. In this paper, I discuss the capabilities of the EMMA code for the modeling and simulation of one such electromechanical device, a slim-loop ferroelectric (SFE) firing set.

  3. Attitude to the subject of chemistry in undergraduate nursing students at Fiji National University and Federation University, Australia.

    PubMed

    Brown, Stephen; Wakeling, Lara; Peck, Blake; Naiker, Mani; Hill, Dolores; Naidu, Keshni

    2015-01-01

    Attitude to the subject of chemistry was quantified in first-year undergraduate nursing students at two geographically distinct universities. A purpose-designed diagnostic instrument (ASCI) was given to students at Federation University, Australia (n = 114), and at Fiji National University, Fiji (n = 160). Affective and cognitive sub-scales within ASCI showed reasonable internal consistency: Cronbach's alpha was 0.786 and 0.630 for the cognitive sub-scale, and 0.787 and 0.788 for the affective sub-scale, for Federation University and Fiji National University students, respectively. The mean (SD) score for the cognitive sub-scale was 10.5 (5.6) and 15.2 (4.1) for students at Federation University and Fiji National University, respectively (P < 0.001, t-test). The mean (SD) score for the affective sub-scale was 13.1 (5.1) and 20.7 (4.3) for students at Federation University and Fiji National University, respectively (P < 0.001, t-test). An exploratory factor analysis (n = 274) confirmed a two-factor solution consistent with the affective and cognitive sub-scales, each with good internal consistency. Quantifying attitude to chemistry in undergraduate nursing students using ASCI may have utility in assessing the impact of novel teaching strategies used in the education of nursing students in areas of bioscience and chemistry. However, geographically distinct populations of undergraduate nurses may show very different attitudes to chemistry.
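
    The sub-scale reliabilities quoted above are Cronbach's alpha coefficients. As a point of reference, a minimal sketch of the computation in Python, run on simulated (entirely hypothetical) Likert-style responses rather than the study's data:

        import numpy as np

        def cronbach_alpha(items):
            """Cronbach's alpha for an (n_respondents x k_items) score matrix."""
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1)      # per-item variances
            total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
            return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

        # Simulated 5-point responses: a latent tendency plus item-level noise
        rng = np.random.default_rng(0)
        base = rng.integers(1, 6, size=(114, 1))
        items = np.clip(base + rng.integers(-1, 2, size=(114, 4)), 1, 5)
        print(f"alpha = {cronbach_alpha(items):.3f}")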

  4. [Measuring the impact of practices of social appropriation of science and technology: a proposal for a set of indicators].

    PubMed

    Daza-Caicedo, Sandra; Maldonado, Oscar; Arboleda-Castrillón, Tania; Falla, Sigrid; Moreno, Pablo; Tafur-Sequera, Mayali; Papagayo, Diana

    2017-01-01

    We propose a set of qualitative indicators for monitoring practices of social appropriation of science and technology. The design of this set is based on the Maloka case, but it can be of use to multiple actors involved in the social appropriation of science and technology (referred to by its Spanish acronym, ASCyT). The introduction discusses the concept of ASCyT. The first section provides a review of the literature about measuring activities that link science and society. The second section explains why it is important to develop this type of measurement. The third section lays out the methodology used in designing the indicators. The fourth section explains the set of indicators and the fifth reflects on that process.

  5. 2017 Multimodality Appropriate Use Criteria for Noninvasive Cardiac Imaging: Expert Consensus of the Asian Society of Cardiovascular Imaging.

    PubMed

    Beck, Kyongmin Sarah; Kim, Jeong A; Choe, Yeon Hyeon; Hian, Sim Kui; Hoe, John; Hong, Yoo Jin; Kim, Sung Mok; Kim, Tae Hoon; Kim, Young Jin; Kim, Yun Hyeon; Kuribayashi, Sachio; Lee, Jongmin; Leong, Lilian; Lim, Tae-Hwan; Lu, Bin; Park, Jae Hyung; Sakuma, Hajime; Yang, Dong Hyun; Yaw, Tan Swee; Wan, Yung-Liang; Zhang, Zhaoqi; Zhao, Shihua; Yong, Hwan Seok

    2017-01-01

    In 2010, the Asian Society of Cardiovascular Imaging (ASCI) provided recommendations for cardiac CT and MRI, and this document reflects an update of the 2010 ASCI appropriate use criteria (AUC). In 2016, the ASCI formed a new working group for revision of AUC for noninvasive cardiac imaging. A major change that we made in this document is the rating of various noninvasive tests (exercise electrocardiogram, echocardiography, positron emission tomography, single-photon emission computed tomography, radionuclide imaging, cardiac magnetic resonance, and cardiac computed tomography/angiography), compared side by side for their applications in various clinical scenarios. Ninety-five clinical scenarios were developed from eight selected pre-existing guidelines and classified into four sections as follows: 1) detection of coronary artery disease, symptomatic or asymptomatic; 2) cardiac evaluation in various clinical scenarios; 3) use of imaging modality according to prior testing; and 4) evaluation of cardiac structure and function. The clinical scenarios were scored by a separate rating committee on a scale of 1-9 to designate appropriate use, uncertain use, or inappropriate use according to a modified Delphi method. Overall, the AUC ratings for CT were higher than those of previous guidelines. These new AUC provide guidance for clinicians choosing among available testing modalities for various cardiac diseases and are also unique, given that most previous AUC for noninvasive imaging include only one imaging technique. As cardiac imaging is multimodal in nature, we believe that these AUC will be more useful for clinical decision making.
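
    The 1-9 scale and the appropriate/uncertain/inappropriate categories follow the modified Delphi convention the abstract cites. A minimal sketch of the commonly used cut-points, assuming median panel ratings and omitting disagreement handling (the ASCI panel's exact procedure may differ):

        def categorize(median_score):
            """Map a panel's median 1-9 rating to an appropriateness category
            using conventional cut-points (assumed, not taken from the paper)."""
            if median_score >= 7:
                return "appropriate"
            if median_score >= 4:
                return "uncertain"
            return "inappropriate"

        for score in (8.0, 5.5, 2.0):
            print(score, categorize(score))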

  6. Ascus dysgenesis in hybrid crosses of Neurospora and Sordaria (Sordariaceae).

    PubMed

    Kasbekar, Durgadas P

    2017-07-01

    When two lineages derived from a common ancestor become reproductively isolated (e.g. Neurospora crassa and N. tetrasperma), genes that have undergone mutation and adaptive evolution in one lineage can potentially become dysfunctional when transferred into the other, since other genes have undergone mutation and evolution in the second lineage, and the derived alleles were never 'tested' together before hybrid formation. Bateson (1909), Dobzhansky (1936), and Muller (1942) recognized that incompatibility between the derived alleles could potentially make the hybrid lethal, sterile, or display some other detriment. Alternatively, the detrimental effects seen in crosses with the hybrids may result from the silencing of ascus-development genes by meiotic silencing by unpaired DNA (MSUD). Aberrant transcripts from genes improperly paired in meiosis are processed into single-stranded MSUD-associated small interfering RNA (masiRNA), which is used to degrade complementary mRNA. Recently, backcrosses of N. crassa / N. tetrasperma hybrid translocation strains with wild-type N. tetrasperma were found to elicit novel ascus dysgenesis phenotypes. One was a transmission ratio distortion that apparently disfavoured the homokaryotic ascospores formed following alternate segregation. Another was the production of heterokaryotic ascospores in eight-spored asci. Lewis (1969) also had reported sighting rare eight-spored asci with heterokaryotic ascospores in interspecific crosses in Sordaria, a related genus. Ordinarily, in both Neurospora and Sordaria, the ascospores are partitioned at the eight-nucleus stage, and ascospores in eight-spored asci are initially uninucleate. Evidently, in hybrid crosses of the family Sordariaceae, ascospore partitioning can be delayed until after one or more mitoses following the postmeiotic mitosis.

  7. 2017 Multimodality Appropriate Use Criteria for Noninvasive Cardiac Imaging: Expert Consensus of the Asian Society of Cardiovascular Imaging

    PubMed Central

    Beck, Kyongmin Sarah; Kim, Jeong A; Choe, Yeon Hyeon; Hian, Sim Kui; Hoe, John; Hong, Yoo Jin; Kim, Sung Mok; Kim, Tae Hoon; Kim, Young Jin; Kim, Yun Hyeon; Kuribayashi, Sachio; Lee, Jongmin; Leong, Lilian; Lim, Tae-Hwan; Lu, Bin; Park, Jae Hyung; Sakuma, Hajime; Yang, Dong Hyun; Yaw, Tan Swee; Wan, Yung-Liang; Zhang, Zhaoqi; Zhao, Shihua

    2017-01-01

    In 2010, the Asian Society of Cardiovascular Imaging (ASCI) provided recommendations for cardiac CT and MRI, and this document reflects an update of the 2010 ASCI appropriate use criteria (AUC). In 2016, the ASCI formed a new working group for revision of AUC for noninvasive cardiac imaging. A major change that we made in this document is the rating of various noninvasive tests (exercise electrocardiogram, echocardiography, positron emission tomography, single-photon emission computed tomography, radionuclide imaging, cardiac magnetic resonance, and cardiac computed tomography/angiography), compared side by side for their applications in various clinical scenarios. Ninety-five clinical scenarios were developed from eight selected pre-existing guidelines and classified into four sections as follows: 1) detection of coronary artery disease, symptomatic or asymptomatic; 2) cardiac evaluation in various clinical scenarios; 3) use of imaging modality according to prior testing; and 4) evaluation of cardiac structure and function. The clinical scenarios were scored by a separate rating committee on a scale of 1–9 to designate appropriate use, uncertain use, or inappropriate use according to a modified Delphi method. Overall, the AUC ratings for CT were higher than those of previous guidelines. These new AUC provide guidance for clinicians choosing among available testing modalities for various cardiac diseases and are also unique, given that most previous AUC for noninvasive imaging include only one imaging technique. As cardiac imaging is multimodal in nature, we believe that these AUC will be more useful for clinical decision making. PMID:29089819

  8. The ASCI Network for SC 2000: Gigabyte Per Second Networking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    PRATT, THOMAS J.; NAEGLE, JOHN H.; MARTINEZ JR., LUIS G.

    2001-11-01

    This document highlights the DISCOM Distance Computing and Communication team's activities at the SC 2000 supercomputing conference in Dallas, Texas. This conference is sponsored by the IEEE and ACM. Sandia's participation in the conference has now spanned a decade; for the last five years Sandia National Laboratories, Los Alamos National Laboratory, and Lawrence Livermore National Laboratory have come together at the conference under the DOE's ASCI (Accelerated Strategic Computing Initiative) Program rubric to demonstrate ASCI's emerging capabilities in computational science and the combined expertise in high-performance computer science and communication networking developed within the program. At SC 2000, DISCOM2 used this forum to demonstrate an infrastructure including a pre-standard implementation of 10 Gigabit Ethernet, the first gigabyte-per-second IP network data transfer application, and VPN technology that enabled a remote Distributed Resource Management tools demonstration. Additionally, a national OC48 POS network was constructed to support applications running between the show floor and home facilities. This network created the opportunity to test PSE's Parallel File Transfer Protocol (PFTP) across a network with speed and distances similar to the then-proposed DISCOM WAN. SCinet at SC 2000 showcased wireless networking, and the networking team had the opportunity to explore this emerging technology while on the booth. We also supported the production networking needs of the convention exhibit floor. This paper documents those accomplishments, discusses the details of their implementation, and describes how these demonstrations support DISCOM's overall strategies in high-performance computing networking.

  9. Wickerhamiella van der Walt (1973)

    USDA-ARS?s Scientific Manuscript database

    This chapter describes the ascomycetous yeast genus Wickerhamiella, which has five described species and has been defined from multigene deoxyribonucleic acid (DNA) sequence analysis. The species reproduce by multilateral budding but do not form hyphae or pseudohyphae. Asci typically form a single a...

  10. Multiphysics Application Coupling Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campbell, Michael T.

    2013-12-02

    This particular consortium implementation of the software integration infrastructure will, in large part, refactor portions of the Rocstar multiphysics infrastructure. Development of this infrastructure originated at the University of Illinois DOE ASCI Center for Simulation of Advanced Rockets (CSAR) to support the center's massively parallel multiphysics simulation application, Rocstar, and has continued at IllinoisRocstar, a small company formed near the end of the University-based program. IllinoisRocstar is now licensing these new developments as free, open-source software, in the hope of improving its own and others' access to infrastructure that can be readily utilized in developing coupled or composite software systems, with particular attention to more rapid production and utilization of multiphysics applications in the HPC environment. There are two major pieces to the consortium implementation, the Application Component Toolkit (ACT) and the Multiphysics Application Coupling Toolkit (MPACT). The current development focus is the ACT, which is (will be) the substrate for MPACT. The ACT itself is built up from the components described in the technical approach. In particular, the ACT has the following major components: 1. The Component Object Manager (COM): the COM package provides encapsulation of user applications and their data; COM also provides the inter-component function call mechanism. 2. The System Integration Manager (SIM): the SIM package provides constructs and mechanisms for orchestrating composite systems of multiply integrated pieces.
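
    As an illustration of the encapsulation-plus-dispatch idea attributed to COM above, a minimal, hypothetical Python sketch; the class and method names are invented for illustration and are not the actual ACT API:

        class ComponentManager:
            """Toy stand-in for a component object manager."""
            def __init__(self):
                self._components = {}

            def register(self, name, component):
                # Encapsulate a user application (and its data) behind a name.
                self._components[name] = component

            def call(self, name, function, *args, **kwargs):
                # Inter-component function-call mechanism: dispatch by name.
                return getattr(self._components[name], function)(*args, **kwargs)

        class FluidSolver:
            def advance(self, dt):
                return f"fluid state advanced by {dt}"

        mgr = ComponentManager()
        mgr.register("fluid", FluidSolver())
        print(mgr.call("fluid", "advance", 0.01))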

  11. Numerical Technology for Large-Scale Computational Electromagnetics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharpe, R; Champagne, N; White, D

    The key bottleneck of implicit computational electromagnetics tools for large complex geometries is the solution of the resulting linear system of equations. The goal of this effort was to research and develop critical numerical technology that alleviates this bottleneck for large-scale computational electromagnetics (CEM). The mathematical operators and numerical formulations used in this arena of CEM yield linear equations that are complex valued, unstructured, and indefinite. Also, simultaneously applying multiple mathematical modeling formulations to different portions of a complex problem (hybrid formulations) results in a mixed-structure linear system, further increasing the computational difficulty. Typically, these hybrid linear systems are solved using a direct solution method, which was acceptable for Cray-class machines but does not scale adequately for ASCI-class machines. Additionally, LLNL's previously existing linear solvers were not well suited to the linear systems created by hybrid implicit CEM codes. Hence, a new approach was required to make effective use of ASCI-class computing platforms and to enable next-generation design capabilities. Multiple approaches were investigated, including the latest sparse-direct methods developed by our ASCI collaborators. In addition, approaches that combine domain decomposition (or matrix partitioning) with general-purpose iterative methods and special-purpose preconditioners were investigated. Special-purpose preconditioners that take advantage of the structure of the matrix were adapted and developed based on intimate knowledge of the matrix properties. Finally, new operator formulations were developed that radically improve the conditioning of the resulting linear systems, thus greatly reducing solution time. The goal was to enable the solution of CEM problems 10 to 100 times larger than the previous capability.
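
    As a small illustration of the "general-purpose iterative method plus special-purpose preconditioner" combination described above, a sketch using SciPy's GMRES with an incomplete-LU preconditioner on a toy complex-valued system; the matrix is a stand-in and not representative of real hybrid CEM operators:

        import numpy as np
        import scipy.sparse as sp
        import scipy.sparse.linalg as spla

        # Small complex-valued sparse test system (toy stand-in).
        n = 200
        rng = np.random.default_rng(1)
        A = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n)).astype(complex)
        A = (A + 1j * sp.diags(rng.standard_normal(n))).tocsc()
        b = rng.standard_normal(n) + 1j * rng.standard_normal(n)

        # Incomplete-LU factorization used as a preconditioner for GMRES.
        ilu = spla.spilu(A)
        M = spla.LinearOperator(A.shape, matvec=ilu.solve, dtype=complex)

        x, info = spla.gmres(A, b, M=M)
        print("converged" if info == 0 else f"gmres info = {info}")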

  12. PARAPHYSOIDS, PSEUDOPARAPHYSES, AND APICAL PARAPHYSES,

    DTIC Science & Technology

    The distinct, vertical, paraphysis-like hyphae developing in the centrum of loculoascomycetes prior to the formation of asci are properly termed...structure is of doubtful usefulness. The downward-growing palisade of hyphae with free tips in the centrum of hypocreaceous fungi should be distinguished as apical paraphyses. (Author)

  13. Spherical harmonic results for the 3D Kobayashi Benchmark suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, P N; Chang, B; Hanebutte, U R

    1999-03-02

    Spherical harmonic solutions are presented for the Kobayashi benchmark suite. The results were obtained with Ardra, a scalable, parallel neutron transport code developed at Lawrence Livermore National Laboratory (LLNL). The calculations were performed on the IBM ASCI Blue-Pacific computer at LLNL.

  14. Further evidence that aberrant segregation and crossing over in Sordaria brevicollis may be discrete, though associated, events.

    PubMed

    Theivendirarajah, K; Whitehouse, H L

    1983-01-01

    Crosses were made between buff spore colour mutants in Sordaria brevicollis in the presence of flanking markers. Recombinant asci with one or more wild-type spores were isolated and the spores germinated and scored for buff and flanking marker genotype. The buff genotype was determined by back-crossing to each parent and looking for recombinants. It was found that the majority of the recombinant asci had aberrant segregation at one or the other mutant site but not both. It was inferred that in the recombinants hybrid DNA rarely extended to both sites. When the aberrant segregation was associated with crossing over, the crossovers were situated at either end of the gene rather than between the allelic sites where the hybrid DNA was believed to terminate. Thus, some of the crossovers were separated from the site of the aberrant segregation by a site apparently not involved in hybrid DNA, and none was in the position predicted by the Meselson-Radding model, that is, where the hybrid DNA terminates.

  15. The Manifestation of Chromosome Rearrangements in Unordered Asci of Neurospora

    PubMed Central

    Perkins, David D.

    1974-01-01

    Rapid, effective techniques have been developed for detecting and characterizing chromosome aberrations in Neurospora by visual inspection of ascospores and asci. Rearrangements that are detectable by the presence of deficient, nonblack ascospores in test crosses make up 5 to 10% of survivors after UV doses giving 10-55% survival. Over 135 rearrangements have been diagnosed by classifying unordered asci according to numbers of defective spores. (These include 15 originally identified or analyzed by other workers.) About 100 reciprocal translocations (RT's) have been confirmed and mapped genetically, involving all combinations of the seven chromosomes. Thirty-three other rearrangements generate viable nontandem duplications in meiosis. These consist of insertional translocations (IT's) (15 confirmed), and of rearrangements that involve a chromosome tip (10 translocations and 3 pericentric inversions). No inversion has been found that does not include the centromere. A reciprocal translocation was found within one population in nature. When pairs of RT's that involve the same two chromosome arms were intercrossed, viable duplications were produced if the breakpoints overlapped in such a way that pairing resembled that of insertional translocations (27 combinations).—The rapid analytical technique depends on the following. Deficiency ascospores are usually nonblack (W: "white") and inviable, while nondeficient ascospores, even those that include duplications, are black (B) and viable. Thus RT's typically produce 50% black spores, and IT's 75% black. Asci are shot spontaneously from ripe perithecia, and can be collected in large numbers as groups of eight ascospores representing unordered tetrads, which fall into five classes: 8B:0W, 6B:2W, 4B:4W, 2B:6W, 0B:8W. In isosequential crosses, 90-95% of tetrads are 8:0. When a rearrangement is heterozygous, the frequencies of tetrad classes are diagnostic of the type of rearrangement, and provide information also on the positions of breakpoints. With RT's, 8:0 (alternate centromere segregation) = 0:8 (adjacent-1), 4:4's require interstitial crossing over in a centromere-breakpoint interval, and no 6:2's or 2:6's are expected. With IT's, duplications are viable, 8:0 = 4:4, 6:2's are from interstitial crossing over, and 0:8's or 2:6's are rare. Tetrads from RT's that involve a chromosome tip resemble those from IT's, as do tetrads from intercrosses between partially overlapping RT's that involve identical chromosome arms.—Because viable duplications and other aneuploid derivatives regularly occur among the offspring of rearrangements such as insertional translocations, care must be taken in selecting stocks, and original strains should be kept for reference. PMID:4416353
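
    The diagnostic rules in this abstract amount to a simple classifier over tetrad-class frequencies. A hedged sketch, with illustrative thresholds that are not Perkins's:

        def diagnose(tetrads):
            """tetrads: counts keyed by black:white class, e.g. {'8:0': 40, ...}."""
            total = sum(tetrads.values())
            f = {k: v / total for k, v in tetrads.items()}
            if f["8:0"] > 0.9:
                return "isosequential cross (no rearrangement): ~90-95% 8:0"
            if f["6:2"] + f["2:6"] < 0.02 and abs(f["8:0"] - f["0:8"]) < 0.1:
                return "reciprocal translocation: 8:0 ~= 0:8, no 6:2 or 2:6"
            if f["0:8"] + f["2:6"] < 0.02 and abs(f["8:0"] - f["4:4"]) < 0.1:
                return "insertional translocation: 8:0 ~= 4:4, rare 0:8 or 2:6"
            return "not diagnostic; examine breakpoints directly"

        print(diagnose({"8:0": 40, "6:2": 1, "4:4": 8, "2:6": 0, "0:8": 38}))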

  16. Proceedings from the conference on high speed computing: High speed computing and national security

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hirons, K.P.; Vigil, M.; Carlson, R.

    1997-07-01

    This meeting covered the following topics: technologies/national needs/policies: past, present and future; information warfare; crisis management/massive data systems; risk assessment/vulnerabilities; Internet law/privacy and rights of society; challenges to effective ASCI programmatic use of 100 TFLOPs systems; and new computing technologies.

  17. Australasian sequestrate fungi 17: the genus Hydnoplicata (Ascomycota, Pezizaceae) resurrected.

    Treesearch

    James M. Trappe; Andrew W. Claridge

    2006-01-01

    The genus Hydnoplicata and its type species, H. whitei, were described by Gilkey in 1954. Having discovered that it has amyloid asci and other characters that relate it to the genus Peziza, Trappe later proposed the new combination, Peziza whitei, even though the species is consistently...

  18. Australasian sequestrate fungi 17: The genus Hydnoplicata (Ascomycota, Pezizaceae) resurrected

    Treesearch

    James M. Trappe; Andrew W. Claridge

    2006-01-01

    The genus Hydnoplicata and its type species, H. whitei, were described by Gilkey in 1954. Having discovered that it has amyloid asci and other characters that relate it to the genus Peziza, Trappe later proposed the new combination, Peziza whitei, even though the species is consistently...

  19. Complex Multi-Chamber Airbag Performance Simulation Final Report CRADA No. TSB-961-94

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kay, Gregory; Kithil, Philip

    The purpose of this small business CRADA was to evaluate the performance of new airbag concepts developed by Advanced Safety Concepts, Inc. (ASCI). These new airbag concepts, if successful, could offer major savings to society in terms of fewer injuries, less lost time, and fewer lives lost.

  20. Quantifying Attitude to Chemistry in Students at the University of the South Pacific

    ERIC Educational Resources Information Center

    Brown, S. J.; Sharma, B. N.; Wakeling, L.; Naiker, M.; Chandra, S.; Gopalan, R. D.; Bilimoria, V. B.

    2014-01-01

    The attitude towards the study of chemistry for new entrant chemistry students from a multi-national, regional, tertiary educational institution in the South Pacific was investigated using a purpose-designed diagnostic instrument. The Attitude toward the Study of Chemistry Inventory (ASCI) was used to quantify attitude in a cohort of first year…

  1. Attitude to the Study of Chemistry and Its Relationship with Achievement in an Introductory Undergraduate Course

    ERIC Educational Resources Information Center

    Brown, Stephen J.; White, Sue; Sharma, Bibhya; Wakeling, Lara; Naiker, Mani; Chandra, Shaneel; Gopalan, Romila; Bilimoria, Veena

    2015-01-01

    A positive attitude to a subject may be congruent with higher achievement; however, limited evidence supports this for students in undergraduate chemistry--this may result from difficulties in quantifying attitude. Therefore, in this study, the Attitude to the Study of Chemistry Inventory (ASCI)--a validated instrument to quantify attitude, was…

  2. Using SPEEDES to simulate the blue gene interconnect network

    NASA Technical Reports Server (NTRS)

    Springer, P.; Upchurch, E.

    2003-01-01

    JPL and the Center for Advanced Computer Architecture (CACR) are conducting application and simulation analyses of BG/L in order to establish a range of effectiveness for the Blue Gene/L MPP architecture in performing important classes of computations and to determine the design sensitivity of the global interconnect network in support of real-world ASCI application execution.

  3. Purple L1 Milestone Review Panel TotalView Debugger Functionality and Performance for ASC Purple

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wolfe, M

    2006-12-12

    ASC code teams require a robust software debugging tool to help developers quickly find bugs in their codes and get their codes running. Development debugging commonly runs up to 512 processes. Production jobs run up to full ASC Purple scale, and at times require introspection while running. Developers want a debugger that runs on all their development and production platforms and that works with all compilers and runtimes used with ASC codes. The TotalView Multiprocess Debugger made by Etnus was specified for ASC Purple to address this needed capability. The ASC Purple environment builds on the environment seen by TotalView on ASCI White. The debugger must now operate with the Power5 CPU, Federation switch, AIX 5.3 operating system including large pages, IBM compilers 7 and 9, POE 4.2 parallel environment, and RS/6000 SLURM resource manager. Users require robust, basic debugger functionality with acceptable performance at development debugging scale. A TotalView installation must be provided at the beginning of the early user access period that meets these requirements. A functional enhancement, fast conditional data watchpoints, and a scalability enhancement, capability up to 8192 processes, are to be demonstrated.

  4. Diagnosing Changes in Attitude in First-Year College Chemistry Students with a Shortened Version of Bauer's Semantic Differential

    ERIC Educational Resources Information Center

    Brandriet, Alexandra R.; Xu, Xiaoying; Bretz, Stacey Lowery; Lewis, Jennifer E.

    2011-01-01

    In this quantitative study, a shortened version of the Attitude toward the Subject of Chemistry Inventory (ASCI) created by Bauer (2008) was used to identify the chemistry attitudes of two populations of general chemistry students at two universities. The ASCIv2 contained just two factors from the original instrument. These factors measured…

  5. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan : ASC software quality engineering practices Version 3.0.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turgeon, Jennifer L.; Minana, Molly A.; Hackney, Patricia

    2009-01-01

    The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in the US Department of Energy/National Nuclear Security Administration (DOE/NNSA) Quality Criteria, Revision 10 (QC-1) as 'conformance to customer requirements and expectations'. This quality plan defines the SNL ASC Program software quality engineering (SQE) practices and provides a mapping of these practices to SNL Corporate Process Requirement (CPR) 001.3.6, 'Corporate Software Engineering Excellence'. This plan also identifies ASC management's and the software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals. This SNL ASC Software Quality Plan establishes the signatories' commitments to improving software products by applying cost-effective SQE practices. This plan enumerates the SQE practices that comprise the development of SNL ASC's software products and explains the project teams' opportunities for tailoring and implementing the practices.

  6. Application Specific Chemical Information Microprocessor (ASCI mu P)

    DTIC Science & Technology

    1999-09-30

    Soft lithography created channels in polydimethylsiloxane polymer. [Figure 1C: optical micrograph of 100 um line widths produced using soft lithography.] Progress has also been made... We have also collaborated with Dr. Jose Almirall at Florida International University and have accomplished the HPLC method development of explosives detection... analytical materials. We have established the base for LIF electrophoretic chip analysis and similarly for the electrochemical detection. We have learned the...

  7. Modernization of software quality assurance

    NASA Technical Reports Server (NTRS)

    Bhaumik, Gokul

    1988-01-01

    Customer satisfaction depends not only on functional performance; it also depends on the quality characteristics of the software products. An examination of this quality aspect of software products provides a clear, well-defined framework for quality assurance functions, which improve the life-cycle activities of software development. Software developers must be aware of the following aspects, which have been expressed by many quality experts: quality cannot be added on; the level of quality built into a program is a function of the quality attributes employed during the development process; and finally, quality must be managed. These concepts have guided our development of the following definition for a Software Quality Assurance function: Software Quality Assurance is a formal, planned approach of actions designed to evaluate the degree of an identifiable set of quality attributes present in all software systems and their products. This paper explains how this definition was developed and how it is used.

  8. Software archeology: a case study in software quality assurance and design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Macdonald, John M; Lloyd, Jane A; Turner, Cameron J

    2009-01-01

    Ideally, quality is designed into software, just as quality is designed into hardware. However, when dealing with legacy systems, demonstrating that the software meets required quality standards may be difficult to achieve. As the need to demonstrate the quality of existing software was recognized at Los Alamos National Laboratory (LANL), an effort was initiated to uncover and demonstrate that legacy software met the required quality standards. This effort led to the development of a reverse engineering approach referred to as software archaeology. This paper documents the software archaeology approaches used at LANL to document legacy software systems. A case study for the Robotic Integrated Packaging System (RIPS) software is included.

  9. Predicting Software Suitability Using a Bayesian Belief Network

    NASA Technical Reports Server (NTRS)

    Beaver, Justin M.; Schiavone, Guy A.; Berrios, Joseph S.

    2005-01-01

    The ability to reliably predict the end quality of software under development presents a significant advantage for a development team. It provides an opportunity to address high risk components earlier in the development life cycle, when their impact is minimized. This research proposes a model that captures the evolution of the quality of a software product, and provides reliable forecasts of the end quality of the software being developed in terms of product suitability. Development team skill, software process maturity, and software problem complexity are hypothesized as driving factors of software product quality. The cause-effect relationships between these factors and the elements of software suitability are modeled using Bayesian Belief Networks, a machine learning method. This research presents a Bayesian Network for software quality, and the techniques used to quantify the factors that influence and represent software quality. The developed model is found to be effective in predicting the end product quality of small-scale software development efforts.
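
    A minimal, self-contained sketch of the inference idea behind such a network: two hypothesized drivers (team skill, process maturity) feed a product-suitability node, and observing the outcome updates belief about a driver. All probabilities here are invented; the paper's actual network and factors are richer:

        import itertools

        p_skill = {"high": 0.4, "low": 0.6}
        p_maturity = {"high": 0.5, "low": 0.5}
        # P(suitability = good | skill, maturity), hypothetical values
        p_good = {("high", "high"): 0.9, ("high", "low"): 0.7,
                  ("low", "high"): 0.6, ("low", "low"): 0.3}

        def posterior_skill_given_good():
            """P(skill | suitability = good) by exhaustive enumeration."""
            joint = {(s, m): p_skill[s] * p_maturity[m] * p_good[(s, m)]
                     for s, m in itertools.product(p_skill, p_maturity)}
            z = sum(joint.values())
            return {s: sum(v for (si, _), v in joint.items() if si == s) / z
                    for s in p_skill}

        # Observing a good product raises the belief that skill was high.
        print(posterior_skill_given_good())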

  10. Evaluation of the fish passage effectiveness of the Bonneville I prototype surface collector using three-dimensional ultrasonic fish tracking - Final Report

    USGS Publications Warehouse

    Faber, D.M; Weiland, M.A.; Moursund, R.A.; Carlson, T.J.; Adams, N.; Rondorf, D.

    2001-01-01

    This report describes tests conducted at Bonneville Dam on the Columbia River in the spring of 2000. The studies used three-dimensional (3D) acoustic telemetry and computational fluid dynamics (CFD) hydraulic modeling techniques to evaluate the response of outmigrating juvenile steelhead (Oncorhynchus mykiss) and yearling chinook (O. tshawytscha) to the Prototype Surface Collector (PSC) installed at Powerhouse I of Bonneville Dam in 1998 to test the concept of using a deep-slot surface bypass collector to divert downstream migrating salmon from turbines. The study was conducted by Pacific Northwest National Laboratory (PNNL), the Waterways Experiment Station of the U.S. Army Corps of Engineers (COE), Asci Corporation, and the U.S. Geological Survey (USGS), and was sponsored by COE's Portland District. The goal of the study was to observe the three-dimensional behavior of tagged fish (fish bearing ultrasonic micro-transmitters) within 100 meters (m) of the surface flow bypass structure to test hypotheses about the response of migrants to flow stimuli generated by the presence of the surface flow bypass prototype and its operation. Research was done in parallel with radio telemetry studies conducted by USGS and hydroacoustic studies conducted by WES & Asci to evaluate the prototype surface collector.

  11. Software Quality Perceptions of Stakeholders Involved in the Software Development Process

    ERIC Educational Resources Information Center

    Padmanabhan, Priya

    2013-01-01

    Software quality is one of the primary determinants of project management success. Stakeholders involved in software development widely agree that quality is important (Barney and Wohlin 2009). However, they may differ on what constitutes software quality, and which of its attributes are more important than others. Although, software quality…

  12. Analysis, Thematic Maps and Data Mining from Point Cloud to Ontology for Software Development

    NASA Astrophysics Data System (ADS)

    Nespeca, R.; De Luca, L.

    2016-06-01

    The primary purpose of the survey for the restoration of Cultural Heritage is the interpretation of the state of building preservation. For this, the advantages of the remote sensing systems that generate dense point clouds (range-based or image-based) are not limited to the acquired data alone. The paper shows that it is possible to extract very useful diagnostic information using spatial annotation, with algorithms already implemented in open-source software. Generally, the drawing of degradation maps is the result of manual work, and so depends on the subjectivity of the operator. This paper describes a method of extraction and visualization of information obtained by mathematical procedures that are quantitative, repeatable, and verifiable. The case study is a part of the east facade of the Eglise collégiale Saint-Maurice, also called Notre Dame des Grâces, in Caromb, in southern France. The work was conducted on the matrix of information contained in the ASCII-format point cloud. The first result is the extraction of new geometric descriptors. We first create digital maps with the calculated quantities. Subsequently, we move to semi-quantitative analyses that transform the new data into useful information. We have written algorithms for accurate selection, for segmentation of the point cloud, and for automatic calculation of the real surface area and volume. Furthermore, we have created graphs of the spatial distribution of the descriptors. This work shows that by working during data processing we can transform the point cloud into an enriched database whose use, management, and mining are easy, fast, and effective for everyone involved in the restoration process.
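
    A minimal sketch of the kind of processing described: load an ASCII x-y-z point cloud, derive a geometric descriptor per point, and segment by threshold. The file name, the synthetic stand-in data, and the thresholds are all hypothetical:

        import numpy as np

        # Real use: points = np.loadtxt("facade_east.xyz")  # rows of x y z
        # Synthetic stand-in so the sketch runs as-is:
        rng = np.random.default_rng(7)
        points = rng.uniform([0.0, 0.0, 0.0], [10.0, 0.5, 8.0], size=(10_000, 3))

        height = points[:, 2] - points[:, 2].min()       # derived descriptor
        band = points[(height > 2.0) & (height < 3.5)]   # threshold segmentation
        print(f"{len(band)} of {len(points)} points selected")

        # Crude area proxy for the selected band: its x-y bounding-box footprint
        dx, dy = np.ptp(band[:, 0]), np.ptp(band[:, 1])
        print(f"bounding-box footprint ~ {dx * dy:.2f} m^2")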

  13. Software cost/resource modeling: Software quality tradeoff measurement

    NASA Technical Reports Server (NTRS)

    Lawler, R. W.

    1980-01-01

    A conceptual framework for treating software quality from a total system perspective is developed. Examples are given to show how system quality objectives may be allocated to hardware and software; to illustrate trades among quality factors, both hardware and software, to achieve system performance objectives; and to illustrate the impact of certain design choices on software functionality.

  14. Software Acquisition: Evolution, Total Quality Management, and Applications to the Army Tactical Missile System

    DTIC Science & Technology

    1992-06-01

    presents the concept of software Total Quality Management (TQM), which focuses on the entire process of software acquisition, as a partial solution to...software TQM can be applied to software acquisition. Software Development, Software Acquisition, Total Quality Management (TQM), Army Tactical Missile

  15. The Effects of Development Team Skill on Software Product Quality

    NASA Technical Reports Server (NTRS)

    Beaver, Justin M.; Schiavone, Guy A.

    2006-01-01

    This paper provides an analysis of the effect of the skill/experience of the software development team on the quality of the final software product. A method for the assessment of software development team skill and experience is proposed, derived from a workforce management tool currently in use by the National Aeronautics and Space Administration. Using data from 26 small-scale software development projects, the team skill measures are correlated to 5 software product quality metrics from the ISO/IEC 9126 Software Engineering Product Quality standard. In the analysis of the results, development team skill is found to be a significant factor in the adequacy of the design and implementation. In addition, the results imply that inexperienced software developers are tasked with responsibilities ill-suited to their skill level, and thus have a significant adverse effect on the quality of the software product. Keywords: software quality, development skill, software metrics
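
    A minimal sketch of the correlation step in such a study design, on invented data: relating a per-project team-skill score to one quality indicator (defect counts here stand in for the ISO/IEC 9126 metrics):

        import numpy as np
        from scipy.stats import pearsonr

        skill = np.array([3.2, 4.1, 2.5, 4.8, 3.9, 2.2, 4.4, 3.0])  # per project
        defects = np.array([14, 9, 21, 5, 8, 25, 7, 16])            # per project

        r, p = pearsonr(skill, defects)
        print(f"r = {r:.2f}, p = {p:.3f}")  # negative r: more skill, fewer defects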

  16. A Framework of the Use of Information in Software Testing

    ERIC Educational Resources Information Center

    Kaveh, Payman

    2010-01-01

    With the increasing role that software systems play in our daily lives, software quality has become extremely important. Software quality is impacted by the efficiency of the software testing process. There are a growing number of software testing methodologies, models, and initiatives to satisfy the need to improve software quality. The main…

  17. Matrix Metalloproteinases as a Therapeutic Target to Improve Neurologic Recovery After Spinal Cord Injury

    DTIC Science & Technology

    2015-12-01

    aberrant remodeling of the bladder wall, which could contribute to increased weight of this structure and reduced voiding. We next evaluated the...moderate levels of SCI show both neurological and urological recovery. Task 4. Analysis of lesion epicenter and serotonergic fiber tracts caudal to a...SCI in mice. 4a. Perfuse animals with fixative, remove the cords, and stain with Eriochrome cyanine or immunostain for serotonergic fiber tracts

  18. User Metrics in NASA Earth Science Data Systems

    NASA Technical Reports Server (NTRS)

    Lynnes, Chris

    2018-01-01

    This presentation describes the collection and use of user metrics in NASA's Earth Science data systems. A variety of collection methods is discussed, with particular emphasis given to the American Customer Satisfaction Index (ASCI). User sentiment on potential use of cloud computing is presented, with generally positive responses. The presentation also discusses various forms of automatically collected metrics, including an example of the relative usage of different functions within the Giovanni analysis system.

  19. Software Quality Assurance Metrics

    NASA Technical Reports Server (NTRS)

    McRae, Kalindra A.

    2004-01-01

    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards, and procedures. In software development, software quality means meeting requirements and a degree of excellence and refinement of a project or product. Software quality is a set of attributes of a software product by which its quality is described and evaluated. The set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software metrics help us understand the technical process that is used to develop a product. The process is measured to improve it, and the product is measured to increase quality throughout the life cycle of software. Software metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce the product, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of the software development can be measured. If software metrics are implemented in software development, they can save time and money and allow the organization to identify the causes of defects that have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA metrics that have been used in other projects but are not currently being used by the SA team, and report them to the Software Assurance team to see if any can be implemented in their software assurance life cycle process.

  20. Training, Quality Assurance Factors, and Tools Investigation: a Work Report and Suggestions on Software Quality Assurance

    NASA Technical Reports Server (NTRS)

    Lee, Pen-Nan

    1991-01-01

    Previously, several research tasks were conducted, some observations were obtained, and several possible suggestions were contemplated involving software quality assurance engineering at NASA Johnson. These research tasks are briefly described. A brief discussion is also given on the role of software quality assurance in software engineering, along with some observations and suggestions. A brief discussion of a training program for software quality assurance engineers is provided. A list of assurance factors as well as quality factors is also included. Finally, a process model which can be used for searching and collecting software quality assurance tools is presented.

  1. SWiFT Software Quality Assurance Plan.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berg, Jonathan Charles

    This document describes the software development practice areas and processes which contribute to the ability of SWiFT software developers to provide quality software. These processes are designed to satisfy the requirements set forth by the Sandia Software Quality Assurance Program (SSQAP). Approvals for the SWiFT Software Quality Assurance Plan (SAND2016-0765): Dave Minster (6121), Department Manager; Jonathan White (6121), SWiFT Site Lead; Jonathan Berg (6121), SWiFT Controls Engineer. Change history: Issue A, 2016/01/27, originated by Jon Berg (06121), initial release of the SWiFT Software Quality Assurance Plan.

  2. Pragmatic quality metrics for evolutionary software development models

    NASA Technical Reports Server (NTRS)

    Royce, Walker

    1990-01-01

    Due to the large number of product, project, and people parameters which impact large custom software development efforts, measurement of software product quality is a complex undertaking. Furthermore, the absolute perspective from which quality is measured (customer satisfaction) is intangible. While we probably can't say what the absolute quality of a software product is, we can determine the relative quality, the adequacy of this quality with respect to pragmatic considerations, and identify good and bad trends during development. While no two software engineers will ever agree on an optimum definition of software quality, they will agree that the most important perspective of software quality is its ease of change. We can call this flexibility, adaptability, or some other vague term, but the critical characteristic of software is that it is soft. The easier the product is to modify, the easier it is to achieve any other software quality perspective. This paper presents objective quality metrics derived from consistent lifecycle perspectives of rework which, when used in concert with an evolutionary development approach, can provide useful insight to produce better quality per unit cost/schedule or to achieve adequate quality more efficiently. The usefulness of these metrics is evaluated by applying them to a large, real world, Ada project.
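
    One concrete metric in the spirit described above is the fraction of effort spent on rework per build, which should trend downward on a healthy evolutionary project. A sketch with invented numbers:

        builds = ["B1", "B2", "B3", "B4", "B5"]
        total_hours = [400, 520, 610, 580, 600]
        rework_hours = [180, 200, 170, 120, 90]

        for b, t, r in zip(builds, total_hours, rework_hours):
            print(f"{b}: rework ratio = {r / t:.2f}")  # falling trend = improving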

  3. Building quality into medical product software design.

    PubMed

    Mallory, S R

    1993-01-01

    The software engineering and quality assurance disciplines are a requisite to the design of safe and effective software-based medical devices. It is in the areas of software methodology and process that the most beneficial application of these disciplines to software development can be made. Software is a product of complex operations and methodologies and is not amenable to the traditional electromechanical quality assurance processes. Software quality must be built in by the developers, with the software verification and validation engineers acting as the independent instruments for ensuring compliance with performance objectives and with development and maintenance standards. The implementation of a software quality assurance program is a complex process involving management support, organizational changes, and new skill sets, but the benefits are profound. Its rewards provide safe, reliable, cost-effective, maintainable, and manageable software, which may significantly speed the regulatory review process and therefore potentially shorten the overall time to market. The use of a trial project can greatly facilitate the learning process associated with the first-time application of a software quality assurance program.

  4. A measurement system for large, complex software programs

    NASA Technical Reports Server (NTRS)

    Rone, Kyle Y.; Olson, Kitty M.; Davis, Nathan E.

    1994-01-01

    This paper describes measurement systems required to forecast, measure, and control activities for large, complex software development and support programs. Initial software cost and quality analysis provides the foundation for meaningful management decisions as a project evolves. In modeling the cost and quality of software systems, the relationship between the functionality, quality, cost, and schedule of the product must be considered. This explicit relationship is dictated by the criticality of the software being developed. This balance between cost and quality is a viable software engineering trade-off throughout the life cycle. Therefore, the ability to accurately estimate the cost and quality of software systems is essential to providing reliable software on time and within budget. Software cost models relate the product error rate to the percent of the project labor that is required for independent verification and validation. The criticality of the software determines which cost model is used to estimate the labor required to develop the software. Software quality models yield an expected error discovery rate based on the software size, criticality, software development environment, and the level of competence of the project and developers with respect to the processes being employed.
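
    An illustrative sketch of the relationships described: expected errors scale with size and criticality, and criticality selects the independent verification and validation (IV&V) labor fraction. All coefficients are invented, not taken from the paper:

        def expected_errors(ksloc, criticality):
            """Hypothetical error-discovery model: errors per KSLOC by criticality."""
            base_rate = {"low": 5.0, "medium": 8.0, "high": 12.0}
            return base_rate[criticality] * ksloc

        def ivv_labor_fraction(criticality):
            """Hypothetical IV&V labor share chosen from software criticality."""
            return {"low": 0.05, "medium": 0.10, "high": 0.20}[criticality]

        for c in ("low", "medium", "high"):
            print(c, expected_errors(150.0, c), f"IV&V = {ivv_labor_fraction(c):.0%}")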

  5. Software quality in 1997

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, C.

    1997-11-01

    For many years, software quality assurance lagged behind hardware quality assurance in terms of methods, metrics, and successful results. New approaches such as Quality Function Deployment (QFD), the ISO 9000-9004 standards, the SEI maturity levels, and Total Quality Management (TQM) are starting to attract wide attention, and in some cases to bring software quality levels up to parity with manufacturing quality levels. Since software is on the critical path for many engineered products, and for internal business systems as well, the new approaches are starting to affect global competition and attract widespread international interest. It can be hypothesized that success in mastering software quality will be a key strategy for dominating global software markets in the 21st century.

  6. Software component quality evaluation

    NASA Technical Reports Server (NTRS)

    Clough, A. J.

    1991-01-01

    The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.

  7. Quality measures and assurance for AI (Artificial Intelligence) software

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1988-01-01

    This report is concerned with the application of software quality and evaluation measures to AI software and, more broadly, with the question of quality assurance for AI software. Considered are not only the metrics that attempt to measure some aspect of software quality, but also the methodologies and techniques (such as systematic testing) that attempt to improve some dimension of quality without necessarily quantifying the extent of the improvement. The report is divided into three parts. Part 1 reviews existing software quality measures, i.e., those that have been developed for, and applied to, conventional software. Part 2 considers the characteristics of AI software, the applicability and potential utility of measures and techniques identified in the first part, and reviews the few methods developed specifically for AI software. Part 3 presents an assessment and recommendations for the further exploration of this important area.

  8. Software Quality Evaluation Models Applicable in Health Information and Communications Technologies. A Review of the Literature.

    PubMed

    Villamor Ordozgoiti, Alberto; Delgado Hito, Pilar; Guix Comellas, Eva María; Fernandez Sanchez, Carlos Manuel; Garcia Hernandez, Milagros; Lluch Canut, Teresa

    2016-01-01

    The use of Information and Communications Technologies (ICTs) in healthcare has increased the need to consider quality criteria through standardised processes. The aim of this study was to analyse the software quality evaluation models applicable to healthcare from the perspective of ICT purchasers. Through a systematic literature review with the keywords software, product, quality, evaluation, and health, we selected and analysed 20 original research papers published from 2005 to 2016 in health science and technology databases. The results showed four main topics: non-ISO models, software quality evaluation models based on ISO/IEC standards, studies analysing software quality evaluation models, and studies analysing ISO standards for software quality evaluation. The models provide cost-efficiency criteria for specific software and improve use outcomes. The ISO/IEC 25000 standard is shown to be the most suitable for evaluating the quality of ICTs for healthcare use from the perspective of institutional acquisition.

  9. Culture shock: Improving software quality

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    de Jong, K.; Trauth, S.L.

    1988-01-01

    The concept of software quality can represent a significant shock to an individual who has been developing software for many years and who believes he or she has been doing a high quality job. The very idea that software includes lines of code and associated documentation is foreign and difficult to grasp, at best. Implementation of a software quality program hinges on the concept that software is a product whose quality needs improving. When this idea is introduced into a technical community that is largely ''self-taught'' and has been producing ''good'' software for some time, a fundamental understanding of the concepts associated with software is often weak. Software developers can react as if to say, ''What are you talking about? What do you mean, I'm not doing a good job? I haven't gotten any complaints about my code yet!'' Coupling such surprise and resentment with the shock that software really is a product and software quality concepts do exist can fuel the volatility of these emotions. In this paper, we demonstrate that the concept of software quality can indeed pose a culture shock to developers. We also show that a ''typical'' quality assurance approach, that of imposing a standard and providing inspectors and auditors to assure its adherence, contributes to this shock and detracts from the very goal the approach should achieve. We offer an alternative, adopted through experience, to implement a software quality program: cooperative assistance. We show how cooperation, education, consultation and friendly assistance can overcome this culture shock. 3 refs.

  10. Terascale spectral element algorithms and implementations.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fischer, P. F.; Tufo, H. M.

    1999-08-17

    We describe the development and implementation of an efficient spectral element code for multimillion gridpoint simulations of incompressible flows in general two- and three-dimensional domains. We review basic and recently developed algorithmic underpinnings that have resulted in good parallel and vector performance on a broad range of architectures, including the terascale computing systems now coming online at the DOE labs. Sustained performance of 219 GFLOPS has been recently achieved on 2048 nodes of the Intel ASCI-Red machine at Sandia.

  11. Software quality for 1997 - what works and what doesn't?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, C.

    1997-11-01

    This presentation provides a view of software quality for 1997 - what works and what doesn't. For many years, software quality assurance lagged behind hardware quality assurance in terms of methods, metrics, and successful results. New approaches such as Quality Function Deployment (QFD), the ISO 9000-9004 standards, the SEI maturity levels, and Total Quality Management (TQM) are starting to attract wide attention, and in some cases to bring software quality levels up to parity with manufacturing quality levels.

  12. Software metrics: Software quality metrics for distributed systems. [reliability engineering

    NASA Technical Reports Server (NTRS)

    Post, J. V.

    1981-01-01

    Software quality metrics were extended to cover distributed computer systems. Emphasis is placed on studying embedded computer systems and on viewing them within a system life cycle. The hierarchy of quality factors, criteria, and metrics was maintained. New software quality factors were added, including survivability, expandability, and evolvability.
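
    The factor-criteria-metric hierarchy referred to above lends itself to a simple data structure. The sketch below is a hypothetical Python illustration: the three added factors come from the abstract, while the criteria and metric names under them are invented for demonstration.

      # Sketch of a quality factor -> criteria -> metrics hierarchy with
      # the distributed-systems factors named in the abstract. Criteria
      # and metric names are illustrative assumptions only.
      quality_model = {
          "reliability": {
              "fault tolerance": ["recovery time after node loss (s)"],
              "accuracy": ["results within tolerance (%)"],
          },
          "survivability": {    # added for distributed systems
              "redundancy": ["replicated critical components (%)"],
          },
          "expandability": {    # added for distributed systems
              "modularity": ["inter-module coupling score"],
          },
          "evolvability": {     # added for distributed systems
              "self-descriptiveness": ["commented source lines (%)"],
          },
      }

      for factor, criteria in quality_model.items():
          for criterion, metrics in criteria.items():
              print(f"{factor} -> {criterion} -> {', '.join(metrics)}")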

  13. An empirical evaluation of software quality assurance practices and challenges in a developing country: a comparison of Nigeria and Turkey.

    PubMed

    Sowunmi, Olaperi Yeside; Misra, Sanjay; Fernandez-Sanz, Luis; Crawford, Broderick; Soto, Ricardo

    2016-01-01

    The importance of quality assurance in the software development process cannot be overemphasized because its adoption results in high reliability and easy maintenance of the software system and other software products. Software quality assurance includes different activities such as quality control, quality management, quality standards, quality planning, process standardization and improvement, amongst others. The aim of this work is to further investigate the software quality assurance practices of practitioners in Nigeria. While our previous work covered areas on quality planning, adherence to standardized processes and the inherent challenges, this work has been extended to include quality control, software process improvement and international quality standard organization membership. It also makes a comparison based on a similar study carried out in Turkey. The goal is to generate more robust findings that can properly support decision making by the software community. The qualitative research approach, specifically the use of questionnaire research instruments, was applied to acquire data from software practitioners. In addition to the previous results, it was observed that quality assurance practices are quite neglected, and this can be the cause of low patronage. Moreover, software practitioners are neither aware of international standards organizations nor of the required process improvement techniques; as such, their claimed standards are not aligned with those of accredited bodies and are limited to their local experience and knowledge, which makes them questionable. The comparison with Turkey also yielded similar findings, making the results typical of developing countries. The research instrument used was tested for internal consistency using Cronbach's alpha, and it proved reliable. For the software industry in developing countries to grow strong and be a viable source of external revenue, software assurance practices have to be taken seriously because their effect is evident in the final product. Moreover, quality frameworks and tools which require minimum time and cost are highly needed in these countries.
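
    Cronbach's alpha, the internal-consistency statistic mentioned at the end of the abstract, is a short computation. The sketch below is a generic illustration with invented response data, not the authors' instrument or dataset.

      # Cronbach's alpha for a questionnaire: rows are respondents,
      # columns are items. alpha = k/(k-1) * (1 - sum(item variances)
      # / variance of respondents' total scores). Data are invented.
      import numpy as np

      def cronbach_alpha(scores: np.ndarray) -> float:
          k = scores.shape[1]                      # number of items
          item_vars = scores.var(axis=0, ddof=1)   # per-item variances
          total_var = scores.sum(axis=1).var(ddof=1)
          return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

      responses = np.array([[4, 5, 4], [3, 4, 3], [5, 5, 4], [2, 3, 2]])
      # Values above ~0.7 are conventionally taken as acceptable reliability.
      print(f"alpha = {cronbach_alpha(responses):.2f}")   # ~0.97 for this toy data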

  14. The Influence of Software Complexity on the Maintenance Effort: Case Study on Software Developed within Educational Process

    ERIC Educational Resources Information Center

    Radulescu, Iulian Ionut

    2006-01-01

    Software complexity is the most important software quality attribute and a very useful instrument in the study of software quality. It is one of the factors that affect most of the software quality characteristics, including maintainability. It is very important to quantify this influence and identify the means to keep it under control; by using…

  15. Quality Market: Design and Field Study of Prediction Market for Software Quality Control

    ERIC Educational Resources Information Center

    Krishnamurthy, Janaki

    2010-01-01

    Given the increasing competition in the software industry and the critical consequences of software errors, it has become important for companies to achieve high levels of software quality. While cost reduction and timeliness of projects continue to be important measures, software companies are placing increasing attention on identifying the user…

  16. Four simple recommendations to encourage best practices in research software

    PubMed Central

    Jiménez, Rafael C.; Kuzak, Mateusz; Alhamdoosh, Monther; Barker, Michelle; Batut, Bérénice; Borg, Mikael; Capella-Gutierrez, Salvador; Chue Hong, Neil; Cook, Martin; Corpas, Manuel; Flannery, Madison; Garcia, Leyla; Gelpí, Josep Ll.; Gladman, Simon; Goble, Carole; González Ferreiro, Montserrat; Gonzalez-Beltran, Alejandra; Griffin, Philippa C.; Grüning, Björn; Hagberg, Jonas; Holub, Petr; Hooft, Rob; Ison, Jon; Katz, Daniel S.; Leskošek, Brane; López Gómez, Federico; Oliveira, Luis J.; Mellor, David; Mosbergen, Rowland; Mulder, Nicola; Perez-Riverol, Yasset; Pergl, Robert; Pichler, Horst; Pope, Bernard; Sanz, Ferran; Schneider, Maria V.; Stodden, Victoria; Suchecki, Radosław; Svobodová Vařeková, Radka; Talvik, Harry-Anton; Todorov, Ilian; Treloar, Andrew; Tyagi, Sonika; van Gompel, Maarten; Vaughan, Daniel; Via, Allegra; Wang, Xiaochuan; Watson-Haigh, Nathan S.; Crouch, Steve

    2017-01-01

    Scientific research relies on computer software, yet software is not always developed following practices that ensure its quality and sustainability. This manuscript does not aim to propose new software development best practices, but rather to provide simple recommendations that encourage the adoption of existing best practices. Software development best practices promote better quality software, and better quality software improves the reproducibility and reusability of research. These recommendations are designed around Open Source values, and provide practical suggestions that contribute to making research software and its source code more discoverable, reusable and transparent. This manuscript is aimed at developers, but also at organisations, projects, journals and funders that can increase the quality and sustainability of research software by encouraging the adoption of these recommendations. PMID:28751965

  17. Four simple recommendations to encourage best practices in research software.

    PubMed

    Jiménez, Rafael C; Kuzak, Mateusz; Alhamdoosh, Monther; Barker, Michelle; Batut, Bérénice; Borg, Mikael; Capella-Gutierrez, Salvador; Chue Hong, Neil; Cook, Martin; Corpas, Manuel; Flannery, Madison; Garcia, Leyla; Gelpí, Josep Ll; Gladman, Simon; Goble, Carole; González Ferreiro, Montserrat; Gonzalez-Beltran, Alejandra; Griffin, Philippa C; Grüning, Björn; Hagberg, Jonas; Holub, Petr; Hooft, Rob; Ison, Jon; Katz, Daniel S; Leskošek, Brane; López Gómez, Federico; Oliveira, Luis J; Mellor, David; Mosbergen, Rowland; Mulder, Nicola; Perez-Riverol, Yasset; Pergl, Robert; Pichler, Horst; Pope, Bernard; Sanz, Ferran; Schneider, Maria V; Stodden, Victoria; Suchecki, Radosław; Svobodová Vařeková, Radka; Talvik, Harry-Anton; Todorov, Ilian; Treloar, Andrew; Tyagi, Sonika; van Gompel, Maarten; Vaughan, Daniel; Via, Allegra; Wang, Xiaochuan; Watson-Haigh, Nathan S; Crouch, Steve

    2017-01-01

    Scientific research relies on computer software, yet software is not always developed following practices that ensure its quality and sustainability. This manuscript does not aim to propose new software development best practices, but rather to provide simple recommendations that encourage the adoption of existing best practices. Software development best practices promote better quality software, and better quality software improves the reproducibility and reusability of research. These recommendations are designed around Open Source values, and provide practical suggestions that contribute to making research software and its source code more discoverable, reusable and transparent. This manuscript is aimed at developers, but also at organisations, projects, journals and funders that can increase the quality and sustainability of research software by encouraging the adoption of these recommendations.

  18. CrossTalk: The Journal of Defense Software Engineering. Volume 24, Number 6. November/December 2011

    DTIC Science & Technology

    2011-11-01

    Software Development.” Software Quality Professional Journal, American Society for Quality (ASQ), (March 2010) 4-14. 3. Nair, Gopalakrishnan T.R...Inspection Performance Metric”. Software Quality Professional Journal, American Society for Quality (ASQ), Volume 13, Issue 2, (March 2011) 14-26...the discovery process and are marketed by companies such as Black Duck Software, OpenLogic, Palamida, and Protecode, among others.7 A number of open

  19. Using software metrics and software reliability models to attain acceptable quality software for flight and ground support software for avionic systems

    NASA Technical Reports Server (NTRS)

    Lawrence, Stella

    1992-01-01

    This paper is concerned with methods of measuring and developing quality software. Reliable flight and ground support software is a highly important factor in the successful operation of the space shuttle program. Reliability is probably the most important of the characteristics inherent in the concept of 'software quality'. It is the probability of failure-free operation of a computer program for a specified time and environment.
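
    The definition in the last sentence has a standard worked form. As a minimal sketch only, assuming the simplest constant-failure-rate model rather than the specific models used for shuttle software, a failure rate can be estimated from test data and turned into a probability of failure-free operation:

      # Simplest reliability model consistent with the definition above:
      # R(t) = exp(-lambda * t), with the constant failure rate lambda
      # estimated from observed failures over observed operating time.
      # Numbers are invented; real avionics models are more elaborate.
      import math

      failures = 4              # failures seen during testing (assumed)
      exposure_hours = 2000.0   # total operating hours observed (assumed)
      rate = failures / exposure_hours   # lambda, failures per hour

      def reliability(t_hours: float) -> float:
          """Probability of failure-free operation for t_hours."""
          return math.exp(-rate * t_hours)

      print(f"R(100 h) = {reliability(100):.3f}")   # ~0.819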

  20. Operational excellence (six sigma) philosophy: Application to software quality assurance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lackner, M.

    1997-11-01

    This report contains viewgraphs on the operational excellence philosophy of Six Sigma applied to software quality assurance. The report outlines the following: goal of Six Sigma; Six Sigma tools; manufacturing vs. administrative processes; software quality assurance document inspections; mapping the software quality assurance requirements document; failure mode effects analysis for the requirements document; measuring the right response variables; and questions.

  1. Architectural requirements for the Red Storm computing system.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Camp, William J.; Tomkins, James Lee

    This report is based on the Statement of Work (SOW) describing the various requirements for delivering a new supercomputer system to Sandia National Laboratories (Sandia) as part of the Department of Energy's (DOE) Accelerated Strategic Computing Initiative (ASCI) program. This system is named Red Storm and will be a distributed memory, massively parallel processor (MPP) machine built primarily out of commodity parts. The requirements presented here distill extensive architectural and design experience accumulated over a decade and a half of research, development and production operation of similar machines at Sandia. Red Storm will have an unusually high bandwidth, low latency interconnect, specially designed hardware and software reliability features, a lightweight kernel compute node operating system and the ability to rapidly switch major sections of the machine between classified and unclassified computing environments. Particular attention has been paid to architectural balance in the design of Red Storm, and it is therefore expected to achieve an atypically high fraction of its peak speed of 41 TeraOPS on real scientific computing applications. In addition, Red Storm is designed to be upgradeable to many times this initial peak capability while still retaining appropriate balance in key design dimensions. Installation of the Red Storm computer system at Sandia's New Mexico site is planned for 2004, and it is expected that the system will be operated for a minimum of five years following installation.

  2. 78 FR 16474 - Extension of the Period for Comments on the Enhancement of Quality of Software-Related Patents

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-15

    ...] Extension of the Period for Comments on the Enhancement of Quality of Software-Related Patents AGENCY... announcing the formation of a partnership with the software community to enhance the quality of software-related patents (Software Partnership), and a request for comments on the preparation of patent...

  3. Investigation of Surface Phenomena in Shocked Tin in Converging Geometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rousculp, Christopher L.; Oro, David Michael; Griego, Jeffrey Randall

    2016-04-14

    There is great interest in the Richtmyer-Meshkov instability (RMI) as a source of ejecta from metal shells. Previous experiments have explored wavelength-amplitude (kA) variation, but only over a small range of drive pressures and in planar geometry. Simulations, both molecular dynamics and hydro, have likewise explored RMI in planar geometry. The ejecta source model from RMI is an area of active algorithm and code development in the ASCI-IC Lagrangian Applications Project. PHELIX offers a precise, reproducible, variable driver for hydro and material physics diagnosis with proton radiography.

  4. Endomyces tetrasperma, sp. n

    PubMed Central

    Macy, J. M.; Miller, M. W.

    1971-01-01

    A new fungal species has been described and placed in the genus Endomyces. Endomyces tetrasperma forms a true septate, multinucleate mycelium which breaks up into arthrospores. Ascus formation occurs after isogamous copulation, either between sexual protuberances which develop at the ends of arthrospores or between two cells (adjacent mycelial cells or arthrospores). The asci, which dehisce at maturity, release two to four smooth, ovoid, thick-walled spores, each containing two oil droplets. The proposed life cycle is based on morphological and cytological observations. PMID:5541538

  5. The State of Software for Evolutionary Biology.

    PubMed

    Darriba, Diego; Flouri, Tomáš; Stamatakis, Alexandros

    2018-05-01

    With Next Generation Sequencing data being routinely used, evolutionary biology is transforming into a computational science. Thus, researchers have to rely on a growing number of increasingly complex software tools. All widely used core tools in the field have grown considerably, in terms of the number of features as well as lines of code, and consequently also with respect to software complexity. A topic that has received little attention is the software engineering quality of widely used core analysis tools. Software developers appear to rarely assess the quality of their code, and this can have negative consequences for end-users. To this end, we assessed the code quality of 16 highly cited and compute-intensive tools mainly written in C/C++ (e.g., MrBayes, MAFFT, SweepFinder, etc.) and JAVA (BEAST) from the broader area of evolutionary biology that are being routinely used in current data analysis pipelines. Because the software engineering quality of the tools we analyzed is rather unsatisfying, we provide a list of best practices for improving the quality of existing tools and list techniques that can be deployed for developing reliable, high quality scientific software from scratch. Finally, we also discuss journal as well as science policy and, more importantly, funding issues that need to be addressed for improving software engineering quality as well as ensuring support for developing new and maintaining existing software. Our intention is to raise the awareness of the community regarding software engineering quality issues and to emphasize the substantial lack of funding for scientific software development.

  6. APPLICATION OF SOFTWARE QUALITY ASSURANCE CONCEPTS AND PROCEDURES TO ENVIORNMENTAL RESEARCH INVOLVING SOFTWARE DEVELOPMENT

    EPA Science Inventory

    As EPA’s environmental research expands into new areas that involve the development of software, quality assurance concepts and procedures that were originally developed for environmental data collection may not be appropriate. Fortunately, software quality assurance is a ...

  7. Quality Attributes for Mission Flight Software: A Reference for Architects

    NASA Technical Reports Server (NTRS)

    Wilmot, Jonathan; Fesq, Lorraine; Dvorak, Dan

    2016-01-01

    In the international standards for architecture descriptions in systems and software engineering (ISO/IEC/IEEE 42010), "concern" is a primary concept that often manifests itself in relation to the quality attributes or "ilities" that a system is expected to exhibit - qualities such as reliability, security and modifiability. One of the main uses of an architecture description is to serve as a basis for analyzing how well the architecture achieves its quality attributes, and that requires architects to be as precise as possible about what they mean in claiming, for example, that an architecture supports "modifiability." This paper describes a table, generated by NASA's Software Architecture Review Board, which lists fourteen key quality attributes, identifies different important aspects of each quality attribute and considers each aspect in terms of requirements, rationale, evidence, and tactics to achieve the aspect. This quality attribute table is intended to serve as a guide to software architects, software developers, and software architecture reviewers in the domain of mission-critical real-time embedded systems, such as space mission flight software.

  8. A research review of quality assessment for software

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Measures were recommended to assess the quality of software submitted to the AdaNet program. The quality factors that are important to software reuse are explored and methods of evaluating those factors are discussed. Quality factors important to software reuse are: correctness, reliability, verifiability, understandability, modifiability, and certifiability. Certifiability is included because the documentation of many factors about a software component, such as its efficiency, portability, and development history, constitutes a class of factors important to some users, not important at all to others, and impossible for AdaNet to distinguish between a priori. The quality factors may be assessed in different ways. There are a few quantitative measures which have been shown to indicate software quality. However, it is believed that there exist many factors that indicate quality but have not been empirically validated due to their subjective nature. These subjective factors are characterized by the way in which they support the software engineering principles of abstraction, information hiding, modularity, localization, confirmability, uniformity, and completeness.

  9. A Structure for Creating Quality Software.

    ERIC Educational Resources Information Center

    Christensen, Larry C.; Bodey, Michael R.

    1990-01-01

    Addresses the issue of assuring quality software for use in computer-aided instruction and presents a structure by which developers can create quality courseware. Differences between courseware and computer-aided instruction software are discussed, methods for testing software are described, and human factors issues as well as instructional design…

  10. Development of innovative computer software to facilitate the setup and computation of water quality index.

    PubMed

    Nabizadeh, Ramin; Valadi Amin, Maryam; Alimohammadi, Mahmood; Naddafi, Kazem; Mahvi, Amir Hossein; Yousefzadeh, Samira

    2013-04-26

    Developing a water quality index, which converts a water quality dataset into a single number, is the most important task of most water quality monitoring programmes. Because the setup of a water quality index depends on different local constraints, it is not feasible to introduce a single definitive index to reveal the water quality level. In this study, an innovative software application, the Iranian Water Quality Index Software (IWQIS), is presented in order to facilitate calculation of a water quality index based on dynamic weight factors, which help users compute the index in cases where some parameters are missing from the datasets. A dataset containing 735 samples of drinking water quality from different parts of the country was used to show the performance of this software using different criteria parameters. The software proved to be an efficient tool to facilitate the setup of water quality indices based on flexible use of variables and water quality databases.
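
    The abstract's central idea, dynamic weight factors that keep the index computable when some parameters are missing, can be sketched briefly. The parameter names, weights, and sub-index values below are invented; the abstract does not give IWQIS's actual formula.

      # Weighted water quality index whose weights are renormalized over
      # the parameters actually measured, so a missing parameter does not
      # block the calculation. Names, weights, and values are invented.
      WEIGHTS = {"pH": 0.20, "turbidity": 0.30, "nitrate": 0.25, "coliform": 0.25}

      def wqi(sub_indices):
          """Weighted mean of 0-100 sub-indices over available parameters."""
          present = {p: w for p, w in WEIGHTS.items() if p in sub_indices}
          total = sum(present.values())   # the "dynamic" renormalization
          return sum(sub_indices[p] * w / total for p, w in present.items())

      print(wqi({"pH": 90, "turbidity": 70, "nitrate": 80, "coliform": 60}))  # 74.0
      print(wqi({"pH": 90, "turbidity": 70}))   # 78.0 despite two missing values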

  11. The State of Software for Evolutionary Biology

    PubMed Central

    Darriba, Diego; Flouri, Tomáš; Stamatakis, Alexandros

    2018-01-01

    With Next Generation Sequencing data being routinely used, evolutionary biology is transforming into a computational science. Thus, researchers have to rely on a growing number of increasingly complex software tools. All widely used core tools in the field have grown considerably, in terms of the number of features as well as lines of code, and consequently also with respect to software complexity. A topic that has received little attention is the software engineering quality of widely used core analysis tools. Software developers appear to rarely assess the quality of their code, and this can have negative consequences for end-users. To this end, we assessed the code quality of 16 highly cited and compute-intensive tools mainly written in C/C++ (e.g., MrBayes, MAFFT, SweepFinder, etc.) and JAVA (BEAST) from the broader area of evolutionary biology that are being routinely used in current data analysis pipelines. Because the software engineering quality of the tools we analyzed is rather unsatisfying, we provide a list of best practices for improving the quality of existing tools and list techniques that can be deployed for developing reliable, high quality scientific software from scratch. Finally, we also discuss journal as well as science policy and, more importantly, funding issues that need to be addressed for improving software engineering quality as well as ensuring support for developing new and maintaining existing software. Our intention is to raise the awareness of the community regarding software engineering quality issues and to emphasize the substantial lack of funding for scientific software development. PMID:29385525

  12. Engineering Quality Software: 10 Recommendations for Improved Software Quality Management

    DTIC Science & Technology

    2010-04-27

    lack of user involvement • Inadequate Software Process Management & Control By Contractors • No “Team” of Vendors and users; little SME participation...1990 Quality Perspectives • Process Quality (CMMI) • Product Quality (ISO/IEC 2500x) – Internal Quality Attributes – External Quality Attributes... CMMI/ISO 9000 Assessments – Capture organizational knowledge • Identify best practices, lessons learned. Know where you are, and where you need to be.

  13. Getting started on metrics - Jet Propulsion Laboratory productivity and quality

    NASA Technical Reports Server (NTRS)

    Bush, M. W.

    1990-01-01

    A review is presented to describe the effort and difficulties of reconstructing fifteen years of JPL software history. In 1987 the collection and analysis of project data were started with the objective of creating laboratory-wide measures of quality and productivity for software development. As a result of this two-year Software Product Assurance metrics study, a rough measurement foundation for software productivity and software quality, and an order-of-magnitude quantitative baseline for software systems and subsystems are now available.

  14. Software Reviews Since Acquisition Reform - The Artifact Perspective

    DTIC Science & Technology

    2004-01-01

    Slide excerpts from “Acquisition of Software Intensive Systems” (2004, Peter Hantos): risk management, old vs. new; single, basic software paradigm; single processor; low...software risk mitigation related trade-offs must be done together; integral software engineering activities; process maturity and quality frameworks; quality

  15. The experience factory: Can it make you a 5? or what is its relationship to other quality and improvement concepts?

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.

    1992-01-01

    The concepts of quality improvements have permeated many businesses. It is clear that the nineties will be the quality era for software, and there is a growing need to develop or adapt quality improvement approaches to the software business. Thus we must understand software as an artifact and software as a business. Since the business we are dealing with is software, we must understand the nature of software and software development. The software discipline is evolutionary and experimental; it is a laboratory science. Software is development, not production. The technologies of the discipline are human based. There is a lack of models that allow us to reason about the process and the product. All software is not the same; process is a variable, goals are variable, etc. Packaged, reusable experiences require additional resources in the form of organization, processes, people, etc. There have been a variety of organizational frameworks proposed to improve quality for various businesses. The ones discussed in this presentation include: Plan-Do-Check-Act, a quality improvement process based upon a feedback cycle for optimizing a single process model/production line; the Experience Factory/Quality Improvement Paradigm, continuous improvement through the experimentation, packaging, and reuse of experiences based upon a business's needs; Total Quality Management, a management approach to long term success through customer satisfaction based on the participation of all members of an organization; the SEI capability maturity model, a staged process improvement based upon assessment with regard to a set of key process areas until level 5 is reached, which represents continuous process improvement; and Lean (software) Development, a principle supporting the concentration of production on 'value added' activities and the elimination or reduction of 'not value added' activities.

  16. Establishing Qualitative Software Metrics in Department of the Navy Programs

    DTIC Science & Technology

    2015-10-29

    dedicated to providing the highest quality software to its users. In doing so, there is a need for a formalized set of software quality metrics. The goal...of this paper is to establish the validity of those necessary quality metrics. In our approach we collected the data of over a dozen programs...provide the necessary variable data for our formulas and tested the formulas for validity. Keywords: metrics; software; quality.

  17. Integrating automated support for a software management cycle into the TAME system

    NASA Technical Reports Server (NTRS)

    Sunazuka, Toshihiko; Basili, Victor R.

    1989-01-01

    Software managers are interested in the quantitative management of software quality, cost and progress. An integrated software management methodology, which can be applied throughout the software life cycle for any number of purposes, is required. The TAME (Tailoring A Measurement Environment) methodology is based on the improvement paradigm and the goal/question/metric (GQM) paradigm. This methodology helps generate a software engineering process and measurement environment based on the project characteristics. The SQMAR (software quality measurement and assurance technology) is a software quality metric system and methodology applied to the development processes. It is based on the feed-forward control principle: quality target setting is carried out before the plan-do-check-action activities are performed. These methodologies are integrated to realize goal oriented measurement, process control and visual management. A metric setting procedure based on the GQM paradigm, a management system called the software management cycle (SMC), and its application to a case study based on NASA/SEL data are discussed. The expected effects of SMC are quality improvement, managerial cost reduction, accumulation and reuse of experience, and a highly visual management reporting system.
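
    The goal/question/metric (GQM) derivation that TAME and SQMAR build on is itself a small data structure: each goal is refined into questions, and each question into the metrics that answer it. The concrete goal, questions, and metrics below are invented examples of the paradigm, not the NASA/SEL set used in the case study.

      # Sketch of a GQM tree: goal -> questions -> metrics. All entries
      # are invented illustrations of the paradigm.
      from dataclasses import dataclass, field

      @dataclass
      class Question:
          text: str
          metrics: list = field(default_factory=list)

      @dataclass
      class Goal:
          statement: str
          questions: list = field(default_factory=list)

      goal = Goal(
          "Improve defect density of releases, from the manager's viewpoint",
          questions=[
              Question("What is the current defect density?",
                       ["defects per KLOC at release"]),
              Question("Is it improving release over release?",
                       ["defects-per-KLOC trend", "inspection coverage (%)"]),
          ],
      )
      for q in goal.questions:
          print(q.text, "->", ", ".join(q.metrics))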

  18. Development of innovative computer software to facilitate the setup and computation of water quality index

    PubMed Central

    2013-01-01

    Background: Developing a water quality index, which converts a water quality dataset into a single number, is the most important task of most water quality monitoring programmes. Because the setup of a water quality index depends on different local constraints, it is not feasible to introduce a single definitive index to reveal the water quality level. Findings: In this study, an innovative software application, the Iranian Water Quality Index Software (IWQIS), is presented in order to facilitate calculation of a water quality index based on dynamic weight factors, which help users compute the index in cases where some parameters are missing from the datasets. Conclusion: A dataset containing 735 samples of drinking water quality from different parts of the country was used to show the performance of this software using different criteria parameters. The software proved to be an efficient tool to facilitate the setup of water quality indices based on flexible use of variables and water quality databases. PMID:24499556

  19. Master Pump Shutdown MPS Software Quality Assurance Plan (SQAP)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BEVINS, R.R.

    2000-09-20

    The Master Pump Shutdown (MPS) Software Quality Assurance Plan (SQAP) describes the tools and strategy used in the development of the MPS software. The document also describes the methodology for controlling and managing changes to the software.

  20. Software Quality and Copyright: Issues in Computer-Assisted Instruction.

    ERIC Educational Resources Information Center

    Helm, Virginia

    The two interconnected problems of educational quality and piracy are described and analyzed in this book, which begins with an investigation of the accusations regarding the alleged dismal quality of educational software. The reality behind accusations of rampant piracy and the effect of piracy on the quality of educational software is examined…

  1. Software Quality Assurance Audits Guidebooks

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The growth in cost and importance of software to NASA has caused NASA to address the improvement of software development across the agency. One of the products of this program is a series of guidebooks that define a NASA concept of the assurance processes that are used in software development. The Software Assurance Guidebook, NASA-GB-A201, issued in September, 1989, provides an overall picture of the NASA concepts and practices in software assurance. Second level guidebooks focus on specific activities that fall within the software assurance discipline, and provide more detailed information for the manager and/or practitioner. This is the second level Software Quality Assurance Audits Guidebook that describes software quality assurance audits in a way that is compatible with practices at NASA Centers.

  2. The NCC project: A quality management perspective

    NASA Technical Reports Server (NTRS)

    Lee, Raymond H.

    1993-01-01

    The Network Control Center (NCC) Project introduced the concept of total quality management (TQM) in mid-1990. The CSC project team established a program which focused on continuous process improvement in software development methodology and consistent deliveries of high quality software products for the NCC. The vision of the TQM program was to produce error free software. Specific goals were established to allow continuing assessment of the progress toward meeting the overall quality objectives. The total quality environment, now a part of the NCC Project culture, has become the foundation for continuous process improvement and has resulted in the consistent delivery of quality software products over the last three years.

  3. The impact of software quality characteristics on healthcare outcome: a literature review.

    PubMed

    Aghazadeh, Sakineh; Pirnejad, Habibollah; Moradkhani, Alireza; Aliev, Alvosat

    2014-01-01

    The aim of this study was to discover the effect of software quality characteristics on healthcare quality and efficiency indicators. Through a systematic literature review, we selected and analyzed 37 original research papers to investigate the impact of the software indicators (coming from the standard ISO 9126 quality characteristics and sub-characteristics) on some important healthcare outcome indicators, and finally ranked these software indicators. The results showed that the software characteristics usability, reliability and efficiency were mostly favored in the studies, indicating their importance. On the other hand, user satisfaction, quality of patient care, clinical workflow efficiency, providers' communication and information exchange, patient satisfaction and care costs were among the healthcare outcome indicators frequently evaluated in relation to the mentioned software characteristics. Logistic regression was the most common assessment methodology, and Confirmatory Factor Analysis and Structural Equation Modeling were performed to test the structural model's fit. The software characteristics were considered to impact the healthcare outcome indicators through other intermediate factors (variables).

  4. Quality Assurance in Software Development: An Exploratory Investigation in Software Project Failures and Business Performance

    ERIC Educational Resources Information Center

    Ichu, Emmanuel A.

    2010-01-01

    Software quality is perhaps one of the most sought-after attributes in product development; however, this goal is often unattained. Problem factors in software development, and how these have affected the maintainability of the delivered software systems, require a thorough investigation. It was, therefore, very important to understand software…

  5. Software Process Improvement through the Removal of Project-Level Knowledge Flow Obstacles: The Perceptions of Software Engineers

    ERIC Educational Resources Information Center

    Mitchell, Susan Marie

    2012-01-01

    Uncontrollable costs, schedule overruns, and poor end product quality continue to plague the software engineering field. Innovations formulated with the expectation to minimize or eliminate cost, schedule, and quality problems have generally fallen into one of three categories: programming paradigms, software tools, and software process…

  6. Improvement of Computer Software Quality through Software Automated Tools.

    DTIC Science & Technology

    1986-08-31

    requirement for increased emphasis on software quality assurance has led to the creation of various methods of verification and validation. Experience...result was a vast array of methods, systems, languages and automated tools to assist in the process. Given that the primary role of quality assurance is...Unfortunately, there is no single method, tool or technique that can ensure accurate, reliable and cost-effective software. Therefore, government and industry

  7. A Framework for Evaluating the Software Product Quality of Pregnancy Monitoring Mobile Personal Health Records.

    PubMed

    Idri, Ali; Bachiri, Mariam; Fernández-Alemán, José Luis

    2016-03-01

    Stakeholders' needs and expectations are identified by means of software quality requirements, which have an impact on software product quality. In this paper, we present a set of requirements for mobile personal health records (mPHRs) for pregnancy monitoring, which have been extracted from literature and existing mobile apps on the market. We also use the ISO/IEC 25030 standard to suggest the requirements that should be considered during the quality evaluation of these mPHRs. We then go on to design a checklist in which we contrast the mPHRs for pregnancy monitoring requirements with software product quality characteristics and sub-characteristics in order to calculate the impact of these requirements on software product quality, using the ISO/IEC 25010 software product quality standard. The results obtained show that the requirements related to the user's actions and the app's features have the most impact on the external sub-characteristics of the software product quality model. The only sub-characteristic affected by all the requirements is Appropriateness of Functional suitability. The characteristic Operability is affected by 95% of the requirements while the lowest degrees of impact were identified for Compatibility (15%) and Transferability (6%). Lastly, the degrees of the impact of the mPHRs for pregnancy monitoring requirements are discussed in order to provide appropriate recommendations for the developers and stakeholders of mPHRs for pregnancy monitoring.
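
    Percentages like those reported (Operability 95%, Compatibility 15%, Transferability 6%) suggest a simple tabulation: for each quality characteristic, the share of requirements that affect it. A hedged sketch with an invented requirements-to-characteristics mapping; the paper's checklist is the authoritative source.

      # Per-characteristic impact as the share of requirements touching
      # each ISO/IEC 25010 characteristic. The mapping below is invented.
      from collections import Counter

      req_to_chars = {
          "record fetal movements":  {"Functional suitability", "Operability"},
          "remind of appointments":  {"Functional suitability", "Operability",
                                      "Reliability"},
          "export data to clinic":   {"Functional suitability", "Compatibility",
                                      "Security"},
      }

      counts = Counter(c for chars in req_to_chars.values() for c in chars)
      n = len(req_to_chars)
      for characteristic, k in counts.most_common():
          print(f"{characteristic}: {100 * k / n:.0f}% of requirements")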

  8. Software quality: Process or people

    NASA Technical Reports Server (NTRS)

    Palmer, Regina; Labaugh, Modenna

    1993-01-01

    This paper will present data related to software development processes and personnel involvement from the perspective of software quality assurance. We examine eight years of data collected from six projects. Data collected varied by project but usually included defect and fault density with limited use of code metrics, schedule adherence, and budget growth information. The data are a blend of AFSCP 800-14 and suggested productivity measures in Software Metrics: A Practitioner's Guide to Improved Product Development. A software quality assurance database tool, SQUID, was used to store and tabulate the data.

  9. Cost Estimation of Software Development and the Implications for the Program Manager

    DTIC Science & Technology

    1992-06-01

    Software Lifecycle Model (SLIM), the Jensen System-4 model, the Software Productivity, Quality, and Reliability Estimator (SPQR/20), the Constructive...function models in current use are the Software Productivity, Quality, and Reliability Estimator (SPQR/20) and the Software Architecture Sizing and...Estimator (SPQR/20) was developed by T. Capers Jones of Software Productivity Research, Inc., in 1985. The model is intended to estimate the outcome

  10. Retrieval practice in the form of online homework improved information retention more when spaced 5 days rather than 1 day after class in two physiology courses.

    PubMed

    Cadaret, Caitlin N; Yates, Dustin T

    2018-06-01

    Studies have shown that practicing temporally spaced retrieval of previously learned information via formal assessments increases student retention of the information. Our objective was to determine the impact of online homework administered as a first retrieval practice 1 or 5 days after introduction of physiology topics on long-term information retention. Students in two undergraduate courses, Anatomy and Physiology (ASCI 240) and Animal Physiological Systems (ASCI 340), were presented with information on a specific physiological system during each weekly laboratory and then completed an online homework assignment either 1 or 5 days later. Information retention was assessed via an in-class quiz the following week and by a comprehensive final exam at semester's end (4-13 wk later). Performance on homework assignments was generally similar between groups for both courses. Information retention at 1 wk did not differ due to timing of homework in either course. In both courses, however, students who received homework 5 days after class performed better on final exam questions relevant to that week's topic compared with their day 1 counterparts. These findings indicate that the longer period between introducing physiology information in class and assigning the first retrieval practice was more beneficial to long-term information retention than the shorter period, despite seemingly equivalent benefits in the shorter term. Since information is typically forgotten over time, we speculate that the longer interval necessitates greater retrieval effort in much the same way as built-in desirable difficulties, thus allowing for stronger conceptual connections and deeper comprehension.

  11. Pentium Pro inside. 1; A treecode at 430 Gigaflops on ASCI Red

    NASA Technical Reports Server (NTRS)

    Warren, M. S.; Becker, D. J.; Sterling, T.; Salmon, J. K.; Goda, M. P.

    1997-01-01

    As an entry for the 1997 Gordon Bell performance prize, we present results from two methods of solving the gravitational N-body problem on the Intel Teraflops system at Sandia National Laboratory (ASCI Red). The first method, an O(N²) algorithm, obtained 635 Gigaflops for a 1 million particle problem on 6800 Pentium Pro processors. The second solution method, a tree-code which scales as O(N log N), sustained 170 Gigaflops over a continuous 9.4 hour period on 4096 processors, integrating the motion of 322 million mutually interacting particles in a cosmology simulation, while saving over 100 Gigabytes of raw data. Additionally, the tree-code sustained 430 Gigaflops on 6800 processors for the first 5 time-steps of that simulation. This tree-code solution is approximately 10⁵ times more efficient than the O(N²) algorithm for this problem. As an entry for the 1997 Gordon Bell price/performance prize, we present two calculations from the disciplines of astrophysics and fluid dynamics. The simulations were performed on two 16-processor Pentium Pro Beowulf-class computers (Loki and Hyglac) constructed entirely from commodity personal computer technology, at a cost of roughly $50k each in September 1996. The price of an equivalent system in August 1997 was less than $30k. At Los Alamos, Loki performed a gravitational tree-code N-body simulation of galaxy formation using 9.75 million particles, which sustained an average of 879 Mflops over a ten day period, and produced roughly 10 Gbytes of raw data.
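
    The gap between the two methods is visible in the structure of the direct algorithm: a doubly nested loop over all particle pairs, which a tree-code avoids by approximating distant groups of particles with multipole moments of tree cells. Below is a minimal direct-summation sketch (softened Newtonian gravity with G = 1 and toy data), not the paper's code.

      # Direct O(N^2) gravitational accelerations: every particle sums
      # contributions from all others. A tree-code replaces the inner
      # sum with a tree traversal, reducing the cost to O(N log N).
      import numpy as np

      def direct_accelerations(pos, mass, eps=1e-2):
          acc = np.zeros_like(pos)
          for i in range(len(pos)):
              d = pos - pos[i]                       # vectors to all others
              r2 = (d ** 2).sum(axis=1) + eps ** 2   # softened distances^2
              r2[i] = np.inf                         # exclude self-force
              acc[i] = (mass[:, None] * d / r2[:, None] ** 1.5).sum(axis=0)
          return acc

      rng = np.random.default_rng(0)
      pos = rng.standard_normal((1000, 3))
      mass = np.full(1000, 1.0 / 1000)
      print(direct_accelerations(pos, mass)[0])   # acceleration of particle 0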

  12. Evaluating software development characteristics: Assessment of software measures in the Software Engineering Laboratory. [reliability engineering

    NASA Technical Reports Server (NTRS)

    Basili, V. R.

    1981-01-01

    Work on metrics is discussed. Factors that affect software quality are reviewed. Metrics is discussed in terms of criteria achievements, reliability, and fault tolerance. Subjective and objective metrics are distinguished. Product/process and cost/quality metrics are characterized and discussed.

  13. Automated daily quality control analysis for mammography in a multi-unit imaging center.

    PubMed

    Sundell, Veli-Matti; Mäkelä, Teemu; Meaney, Alexander; Kaasalainen, Touko; Savolainen, Sauli

    2018-01-01

    Background: The high requirements for mammography image quality necessitate a systematic quality assurance process. Digital imaging allows automation of the image quality analysis, which can potentially improve repeatability and objectivity compared to a visual evaluation made by the users. Purpose: To develop an automatic image quality analysis software for daily mammography quality control in a multi-unit imaging center. Material and Methods: An automated image quality analysis software using the discrete wavelet transform and multiresolution analysis was developed for the American College of Radiology accreditation phantom. The software was validated by analyzing 60 randomly selected phantom images from six mammography systems and 20 phantom images with different dose levels from one mammography system. The results were compared to a visual analysis made by four reviewers. Additionally, long-term image quality trends of a full-field digital mammography system and a computed radiography mammography system were investigated. Results: The automated software produced feature detection levels comparable to visual analysis. The agreement was good in the case of fibers, while the software detected somewhat more microcalcifications and characteristic masses. Long-term follow-up via a quality assurance web portal demonstrated the feasibility of using the software for monitoring the performance of mammography systems in a multi-unit imaging center. Conclusion: Automated image quality analysis enables monitoring the performance of digital mammography systems in an efficient, centralized manner.
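
    As a hedged sketch of the general technique named above, not the authors' software, a 2-D discrete wavelet transform separates a phantom image into sub-bands by spatial scale, so small features such as microcalcifications concentrate in the fine-detail levels; per-band statistics can then be compared against limits from reference images. PyWavelets is assumed for the decomposition, and the energy statistic is an invented stand-in.

      # Multiresolution decomposition of a phantom image with a 2-D DWT.
      # coeffs[0] is the coarse approximation; coeffs[1:] are (horizontal,
      # vertical, diagonal) detail bands ordered coarse to fine.
      import numpy as np
      import pywt

      image = np.random.rand(256, 256)    # stand-in for a phantom image

      coeffs = pywt.wavedec2(image, wavelet="db4", level=3)

      for lvl, (cH, cV, cD) in enumerate(coeffs[1:], start=1):
          energy = sum(float((c ** 2).mean()) for c in (cH, cV, cD))
          print(f"detail level {lvl}: mean energy {energy:.5f}")
      # A daily QC score could compare such per-band statistics against
      # control limits derived from reference phantom images.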

  14. Improving the quality of EHR recording in primary care: a data quality feedback tool.

    PubMed

    van der Bij, Sjoukje; Khan, Nasra; Ten Veen, Petra; de Bakker, Dinny H; Verheij, Robert A

    2017-01-01

    Electronic health record (EHR) data are used to exchange information among health care providers. For this purpose, the quality of the data is essential. We developed a data quality feedback tool that evaluates differences in EHR data quality among practices and software packages as part of a larger intervention. The tool was applied in 92 practices in the Netherlands using different software packages. Practices received data quality feedback in 2010 and 2012. We observed large differences in the quality of recording. For example, the percentage of episodes of care that had a meaningful diagnostic code ranged from 30% to 100%. Differences were highly related to the software package. A year after the first measurement, the quality of recording had improved significantly and differences decreased, with 67% of the physicians indicating that they had actively changed their recording habits based on the results of the first measurement. About 80% found the feedback helpful in pinpointing recording problems. One of the software vendors made changes in functionality as a result of the feedback. Our EHR data quality feedback tool is capable of highlighting differences among practices and software packages. As such, it also stimulates improvements. As substantial variability in recording is related to the software package, our study strengthens the evidence that data quality can be improved substantially by standardizing the functionalities of EHR software packages.
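
    The headline indicator here, the percentage of episodes of care with a meaningful diagnostic code per practice, is a one-pass tabulation. A sketch with invented records; the tool's real rules and coding system are not specified in the abstract.

      # Share of episodes of care with a meaningful diagnostic code,
      # grouped by practice. Records and the "meaningful" rule are
      # invented stand-ins for the feedback tool's actual logic.
      from collections import defaultdict

      episodes = [   # (practice id, diagnostic code or None)
          ("A", "D75"), ("A", None), ("A", "K86"),
          ("B", "R05"), ("B", "R05"), ("B", None), ("B", None),
      ]

      def is_meaningful(code):
          return code is not None   # stand-in rule

      totals, coded = defaultdict(int), defaultdict(int)
      for practice, code in episodes:
          totals[practice] += 1
          coded[practice] += int(is_meaningful(code))

      for practice in sorted(totals):
          pct = 100 * coded[practice] / totals[practice]
          print(f"practice {practice}: {pct:.0f}% of episodes coded")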

  15. Evaluating Software Assurance Knowledge and Competency of Acquisition Professionals

    DTIC Science & Technology

    2014-10-01

    of ISO 12207-2008, both internationally and in the United States [7]. That standard documents a comprehensive set of activities and supporting...As the threat of cyberattacks grows, organizations must ensure that their procurement agents acquire high quality, secure software. ISO 12207 and the Software Assurance Competency...

  16. SOA: A Quality Attribute Perspective

    DTIC Science & Technology

    2011-06-23

    Webinar slides (Carnegie Mellon University, June 2011). Agenda: Service-Oriented Architecture and Software Architecture: Review; Service-Orientation and Quality Attributes; Summary and Future Challenges.

  17. Development and Application of New Quality Model for Software Projects

    PubMed Central

    Karnavel, K.; Dillibabu, R.

    2014-01-01

    The IT industry tries to employ a number of models to identify the defects in the construction of software projects. In this paper, we present COQUALMO and its limitations and aim to increase the quality without increasing the cost and time. The computation time, cost, and effort to predict the residual defects are very high; this was overcome by developing an appropriate new quality model named the software testing defect corrective model (STDCM). The STDCM was used to estimate the number of remaining residual defects in the software product; a few assumptions and the detailed steps of the STDCM are highlighted. The application of the STDCM is explored in software projects. The implementation of the model is validated using statistical inference, which shows there is a significant improvement in the quality of the software projects. PMID:25478594

  18. Development and application of new quality model for software projects.

    PubMed

    Karnavel, K; Dillibabu, R

    2014-01-01

    The IT industry tries to employ a number of models to identify the defects in the construction of software projects. In this paper, we present COQUALMO and its limitations and aim to increase the quality without increasing the cost and time. The computation time, cost, and effort to predict the residual defects are very high; this was overcome by developing an appropriate new quality model named the software testing defect corrective model (STDCM). The STDCM was used to estimate the number of remaining residual defects in the software product; a few assumptions and the detailed steps of the STDCM are highlighted. The application of the STDCM is explored in software projects. The implementation of the model is validated using statistical inference, which shows there is a significant improvement in the quality of the software projects.

  19. Improving Software Quality and Management Through Use of Service Level Agreements

    DTIC Science & Technology

    2005-03-01

    many who believe that the quality of the development process is the best predictor of software product quality (Fenton). Repeatable software processes...reduced errors per KLOC for small projects (Fenton), and the quality management metric (QMM) (Machniak, Osmundson). There are also numerous IEEE 14...attention to cosmetic user interface issues and any problems that may arise with the prototype (Sawyer). The validation process is also another check

  20. Guidelines for software inspections

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Quality control inspections are software problem-finding procedures which provide defect removal as well as improvements in software functionality, maintenance, quality, and development and testing methodology. The many side benefits include education, documentation, training, and scheduling.

  1. Copyright and the Assurance of Quality Courseware.

    ERIC Educational Resources Information Center

    Helm, Virginia M.

    Issues related to the illegal copying or piracy of educational software in the schools and its potential effect on quality software availability are discussed. Copyright violation is examined as a reason some software producers may be abandoning the school software market. An explanation of what the copyright allows and prohibits in terms of…

  2. A survey of Canadian medical physicists: software quality assurance of in-house software.

    PubMed

    Salomons, Greg J; Kelly, Diane

    2015-01-05

    This paper reports on a survey of medical physicists who write and use in-house written software as part of their professional work. The goal of the survey was to assess the extent of in-house software usage and the desire or need for related software quality guidelines. The survey contained eight multiple-choice questions, a ranking question, and seven free text questions. The survey was sent to medical physicists associated with cancer centers across Canada. The respondents to the survey expressed interest in having guidelines to help them in their software-related work, but also demonstrated extensive skills in the area of testing, safety, and communication. These existing skills form a basis for medical physicists to establish a set of software quality guidelines.

  3. Impact of Requirements Quality on Project Success or Failure

    NASA Astrophysics Data System (ADS)

    Tamai, Tetsuo; Kamata, Mayumi Itakura

    We are interested in the relationship between the quality of the requirements specifications for software projects and the subsequent outcome of the projects. To examine this relationship, we investigated 32 projects started and completed between 2003 and 2005 by the software development division of a large company in Tokyo. The company has collected reliable data on requirements specification quality, as evaluated by software quality assurance teams, and overall project performance data relating to cost and time overruns. The data for requirements specification quality were first converted into a multiple-dimensional space, with each dimension corresponding to an item of the recommended structure for software requirements specifications (SRS) defined in IEEE Std. 830-1998. We applied various statistical analysis methods to the SRS quality data and project outcomes.

  4. Validation of a Quality Management Metric

    DTIC Science & Technology

    2000-09-01

    A quality management metric (QMM) was used to measure the performance of ten software managers on Department of Defense (DoD) software development programs. Informal verification and validation of the metric compared the QMM score to an overall program success score for the entire program and yielded a positive correlation. The results of applying the QMM can be used to characterize the quality of software management and can serve as a template to improve software management performance. Future work includes further refining the QMM, applying the QMM scores to provide feedback

  5. A software quality model and metrics for risk assessment

    NASA Technical Reports Server (NTRS)

    Hyatt, L.; Rosenberg, L.

    1996-01-01

    A software quality model and its associated attributes are defined and used as the basis for a discussion on risk. Specific quality goals and attributes are selected based on their importance to a software development project and their ability to be quantified. Risks that can be determined by the model's metrics are identified. A core set of metrics relating to the software development process and its products is defined. Measurements for each metric and their usability and applicability are discussed.

  6. Scytalidium parasiticum sp. nov., a New Species Parasitizing on Ganoderma boninense Isolated from Oil Palm in Peninsular Malaysia.

    PubMed

    Goh, Yit Kheng; Goh, Teik Khiang; Marzuki, Nurul Fadhilah; Tung, Hun Jiat; Goh, You Keng; Goh, Kah Joo

    2015-06-01

    A mycoparasite, Scytalidium parasiticum sp. nov., isolated from the basidiomata of Ganoderma boninense causing basal stem rot of oil palm in Johor, Malaysia, is described and illustrated. It is distinct from other Scytalidium species in having smaller asci and ascospores (teleomorphic stage), longer arthroconidia (anamorphic stage), hyaline to yellowish chlamydospores, and producing a fluorescent pigment. The phylogenetic position of S. parasiticum was determined by sequence analyses of the internal transcribed spacers and the small-subunit ribosomal RNA gene regions. A key to identify Scytalidium species with teleomorphic stage is provided.

  7. Scytalidium parasiticum sp. nov., a New Species Parasitizing on Ganoderma boninense Isolated from Oil Palm in Peninsular Malaysia

    PubMed Central

    Goh, Teik Khiang; Marzuki, Nurul Fadhilah; Tung, Hun Jiat; Goh, You Keng; Goh, Kah Joo

    2015-01-01

    A mycoparasite, Scytalidium parasiticum sp. nov., isolated from the basidiomata of Ganoderma boninense causing basal stem rot of oil palm in Johor, Malaysia, is described and illustrated. It is distinct from other Scytalidium species in having smaller asci and ascospores (teleomorphic stage), longer arthroconidia (anamorphic stage), hyaline to yellowish chlamydospores, and producing a fluorescent pigment. The phylogenetic position of S. parasiticum was determined by sequence analyses of the internal transcribed spacers and the small-subunit ribosomal RNA gene regions. A key to identify Scytalidium species with teleomorphic stage is provided. PMID:26190917

  8. The trans-3-enoic acids of Aster alpinus and Arctium minus seed oils.

    PubMed

    Morris, L J; Marshall, M O; Hammond, E W

    1968-01-01

    The trans-3-enoic acids of Aster alpinus (dwarf aster, rock aster) and Arctium minus (burdock) seed oils have been isolated and characterized. Arctium seed oil contains trans-3,cis-9,cis-12-octadecatrienoic acid (9.9%), and Aster oil contains trans-3-hexadecenoic (7.1%), trans-3-octadecenoic (1.9%), trans-3,cis-9-octadecadienoic (3.0%), and trans-3,cis-9,cis-12-octadecatrienoic (13.7%) acids. Aster oil also has an epoxy acid as a minor constituent (ca. 2.0%), which has been identified as cis-9,10-epoxy-cis-12-octadecenoic acid.

  9. Guidance and Control Software Project Data - Volume 4: Configuration Management and Quality Assurance Documents

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes configuration management and quality assurance documents from the GCS project. Volume 4 contains six appendices: A. Software Accomplishment Summary for the Guidance and Control Software Project; B. Software Configuration Index for the Guidance and Control Software Project; C. Configuration Management Records for the Guidance and Control Software Project; D. Software Quality Assurance Records for the Guidance and Control Software Project; E. Problem Report for the Pluto Implementation of the Guidance and Control Software Project; and F. Support Documentation Change Reports for the Guidance and Control Software Project.

  10. Identifying strengths and weaknesses of Quality Management Unit University of Sumatera Utara software using SCAMPI C

    NASA Astrophysics Data System (ADS)

    Gunawan, D.; Amalia, A.; Rahmat, R. F.; Muchtar, M. A.; Siregar, I.

    2018-02-01

    Identification of software maturity level is a technique to determine the quality of software. By identifying the software maturity level, the weaknesses of the software can be observed, and the resulting recommendations can serve as a reference for future software maintenance and development. This paper discusses software Capability Level (CL) with a case study on the Quality Management Unit (Unit Manajemen Mutu) of the University of Sumatera Utara (UMM-USU). The research utilized the Standard CMMI Appraisal Method for Process Improvement class C (SCAMPI C) model with continuous representation, which focuses on activities for developing quality products and services. The observation covers three process areas: Project Planning (PP), Project Monitoring and Control (PMC), and Requirements Management (REQM). Measurement of the capability level of the UMM-USU software shows that the observed process areas fall between CL1 and CL2: Project Planning is the only process area that reaches capability level 2, while PMC and REQM remain at CL1, the performed level. This research reveals several weaknesses of the existing UMM-USU software and therefore proposes recommendations for UMM-USU to improve the capability level of the observed process areas.
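
    For illustration only, a minimal rollup of SCAMPI-style ratings into capability levels might look like the sketch below. The ratings are invented, and a real SCAMPI C appraisal rates individual specific and generic practices rather than whole groups.

    ```python
    # Hypothetical capability-level rollup: a process area reaches CL2 only
    # if its specific practices and the level-2 generic practices are all
    # satisfied; otherwise it stays at CL1 (performed) or CL0 (incomplete).
    ratings = {
        "PP":   {"specific": True, "generic_l2": True},
        "PMC":  {"specific": True, "generic_l2": False},
        "REQM": {"specific": True, "generic_l2": False},
    }

    def capability_level(r):
        if not r["specific"]:
            return 0                          # incomplete
        return 2 if r["generic_l2"] else 1    # managed vs. performed

    for area, r in ratings.items():
        print(f"{area}: CL{capability_level(r)}")
    ```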

  11. Design and implementation of software for automated quality control and data analysis for a complex LC/MS/MS assay for urine opiates and metabolites.

    PubMed

    Dickerson, Jane A; Schmeling, Michael; Hoofnagle, Andrew N; Hoffman, Noah G

    2013-01-16

    Mass spectrometry provides a powerful platform for performing quantitative, multiplexed assays in the clinical laboratory, but at the cost of increased complexity of analysis and quality assurance calculations compared to other methodologies. Here we describe the design and implementation of a software application that performs quality control calculations for a complex, multiplexed, mass spectrometric analysis of opioids and opioid metabolites. The development and implementation of this application improved our data analysis and quality assurance processes in several ways. First, use of the software significantly improved the procedural consistency for performing quality control calculations. Second, it reduced the amount of time technologists spent preparing and reviewing the data, saving on average over four hours per run, and in some cases improving turnaround time by a day. Third, it provides a mechanism for coupling procedural and software changes with the results of each analysis. We describe several key details of the implementation including the use of version control software and automated unit tests. These generally useful software engineering principles should be considered for any software development project in the clinical lab. Copyright © 2012 Elsevier B.V. All rights reserved.
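
    The authors' application is not included in this record, but its pairing of QC calculations with automated unit tests can be illustrated generically. The 2-SD acceptance rule and the numbers below are assumptions for the sketch, not the laboratory's actual criteria.

    ```python
    # Minimal sketch: a QC acceptance check plus the unit tests that pin
    # down its behavior, echoing the paper's use of automated unit testing.
    import unittest

    def qc_within_limits(measured, target_mean, target_sd, n_sd=2.0):
        """True if a QC result falls within n_sd standard deviations of target."""
        return abs(measured - target_mean) <= n_sd * target_sd

    class QcTests(unittest.TestCase):
        def test_in_range(self):
            self.assertTrue(qc_within_limits(102.0, 100.0, 2.0))

        def test_out_of_range(self):
            self.assertFalse(qc_within_limits(109.0, 100.0, 2.0))

    if __name__ == "__main__":
        unittest.main()
    ```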

  12. Software technology insertion: A study of success factors

    NASA Technical Reports Server (NTRS)

    Lydon, Tom

    1990-01-01

    Managing software development in large organizations has become increasingly difficult due to increasing technical complexity, stricter government standards, a shortage of experienced software engineers, competitive pressure for improved productivity and quality, the need to co-develop hardware and software together, and the rapid changes in both hardware and software technology. The 'software factory' approach to software development minimizes risks while maximizing productivity and quality through standardization, automation, and training. However, in practice, this approach is relatively inflexible when adopting new software technologies. The methods that a large multi-project software engineering organization can use to increase the likelihood of successful software technology insertion (STI), especially in a standardized engineering environment, are described.

  13. Requirements model for an e-Health awareness portal

    NASA Astrophysics Data System (ADS)

    Hussain, Azham; Mkpojiogu, Emmanuel O. C.; Nawi, Mohd Nasrun M.

    2016-08-01

    Requirements engineering is at the heart and foundation of the software engineering process. Poor quality requirements inevitably lead to poor quality software solutions, and poor requirements modeling is tantamount to designing a poor quality product. Quality-assured requirements development therefore goes hand in hand with usable products in giving a software product the quality it demands. In that light, the requirements for an e-Ebola Awareness Portal were modeled with careful attention to these software engineering concerns. The requirements for the e-Health Awareness Portal are modeled as a contribution to the fight against Ebola and help fulfill the United Nations' Millennium Development Goal No. 6. In this study, requirements were modeled using the UML 2.0 modeling technique.

  14. Software quality assurance plan for GCS

    NASA Technical Reports Server (NTRS)

    Duncan, Stephen E.; Bailey, Elizabeth K.

    1990-01-01

    The software quality assurance (SQA) function for the Guidance and Control Software (GCS) project which is part of a software error studies research program is described. The SQA plan outlines all of the procedures, controls, and audits to be carried out by the SQA organization to ensure adherence to the policies, procedures, and standards for the GCS project.

  15. Defect measurement and analysis of JPL ground software: a case study

    NASA Technical Reports Server (NTRS)

    Powell, John D.; Spagnuolo, John N., Jr.

    2004-01-01

    Ground software systems at JPL must meet high assurance standards while remaining on schedule due to relatively immovable launch dates for spacecraft that will be controlled by such systems. Toward this end, the Software Quality Improvement (SQI) project's Measurement and Benchmarking (M&B) team is collecting and analyzing defect data of JPL ground system software projects to build software defect prediction models. The aim of these models is to improve predictability with regard to software quality activities. Predictive models will quantitatively define typical trends for JPL ground systems as well as Critical Discriminators (CDs) to provide explanations for atypical deviations from the norm at JPL. CDs are software characteristics that can be estimated or foreseen early in a software project's planning. Thus, these CDs will assist in planning for the degree to which software quality activities for a project are likely to deviate from the norm for JPL ground systems, based on past experience across the lab.

  16. Development and case study of a science-based software platform to support policy making on air quality.

    PubMed

    Zhu, Yun; Lao, Yanwen; Jang, Carey; Lin, Chen-Jen; Xing, Jia; Wang, Shuxiao; Fu, Joshua S; Deng, Shuang; Xie, Junping; Long, Shicheng

    2015-01-01

    This article describes the development and implementations of a novel software platform that supports real-time, science-based policy making on air quality through a user-friendly interface. The software, RSM-VAT, uses a response surface modeling (RSM) methodology and serves as a visualization and analysis tool (VAT) for three-dimensional air quality data obtained by atmospheric models. The software features a number of powerful and intuitive data visualization functions for illustrating the complex nonlinear relationship between emission reductions and air quality benefits. A case study of the contiguous U.S. demonstrates that the enhanced RSM-VAT is capable of reproducing air quality model results with Normalized Mean Bias <2% and of assisting in air quality policy making in near real time. Copyright © 2014. Published by Elsevier B.V.
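
    RSM-VAT itself is not shown in this record; the sketch below only illustrates the response-surface idea of replacing expensive model runs with a cheap fitted surrogate that can be queried interactively. The emission-reduction and ozone values are fabricated for the example.

    ```python
    # Fit a polynomial surrogate to a handful of (hypothetical) air quality
    # model runs, then answer a policy question without re-running the model.
    import numpy as np

    # Emission-reduction fraction for one source category vs. modeled ozone (ppb).
    reduction = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
    ozone     = np.array([78.0, 74.5, 71.8, 70.0, 69.1])

    surrogate = np.polynomial.Polynomial.fit(reduction, ozone, deg=2)

    print(f"predicted ozone at 60% reduction: {surrogate(0.6):.1f} ppb")
    ```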

  17. The new meaning of quality in the information age.

    PubMed

    Prahalad, C K; Krishnan, M S

    1999-01-01

    Software applications are now a mission-critical source of competitive advantage for most companies. They are also a source of great risk, as the Y2K bug has made clear. Yet many line managers still haven't confronted software issues--partly because they aren't sure how best to define the quality of the applications in their IT infrastructures. Some companies such as Wal-Mart and the Gap have successfully integrated the software in their networks, but most have accumulated an unwieldy number of incompatible applications--all designed to perform the same tasks. The authors provide a framework for measuring the performance of software in a company's IT portfolio. Quality traditionally has been measured according to a product's ability to meet certain specifications; other views of quality have emerged that measure a product's adaptability to customers' needs and a product's ability to encourage innovation. To judge software quality properly, argue the authors, managers must measure applications against all three approaches. Understanding the domain of a software application is an important part of that process. The domain is the body of knowledge about a user's needs and expectations for a product. Software domains change frequently based on how a consumer chooses to use, for example, Microsoft Word or a spreadsheet application. The domain can also be influenced by general changes in technology, such as the development of a new software platform. Thus, applications can't be judged only according to whether they conform to specifications. The authors discuss how to identify domain characteristics and software risks and suggest ways to reduce the variability of software domains.

  18. Use of a quality improvement tool, the prioritization matrix, to identify and prioritize triage software algorithm enhancement.

    PubMed

    North, Frederick; Varkey, Prathiba; Caraballo, Pedro; Vsetecka, Darlene; Bartel, Greg

    2007-10-11

    Complex decision support software can require significant effort in maintenance and enhancement. A quality improvement tool, the prioritization matrix, was successfully used to guide software enhancement of algorithms in a symptom assessment call center.
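
    As a hedged illustration of how a prioritization matrix ranks candidate enhancements, the sketch below weights invented criteria and scores hypothetical triage-algorithm changes; the actual matrix and criteria are described in the paper, not here.

    ```python
    # Prioritization matrix sketch: candidates scored 1-5 per criterion,
    # ranked by weighted sum. All criteria, weights, and scores are invented.
    criteria = {"patient_safety": 0.5, "call_volume": 0.3, "effort": 0.2}

    candidates = {
        "chest_pain_algorithm": {"patient_safety": 5, "call_volume": 4, "effort": 2},
        "med_refill_algorithm": {"patient_safety": 2, "call_volume": 5, "effort": 4},
        "rash_algorithm":       {"patient_safety": 3, "call_volume": 2, "effort": 5},
    }

    scores = {
        name: sum(criteria[c] * s for c, s in ratings.items())
        for name, ratings in candidates.items()
    }
    for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
        print(f"{score:.1f}  {name}")
    ```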

  19. Measuring the impact of computer resource quality on the software development process and product

    NASA Technical Reports Server (NTRS)

    Mcgarry, Frank; Valett, Jon; Hall, Dana

    1985-01-01

    The availability and quality of computer resources during the software development process were speculated to have a measurable, significant impact on the efficiency of the development process and the quality of the resulting product. Environment components such as the types of tools, machine responsiveness, and quantity of direct access storage may play a major role in the effort to produce the product and in its subsequent quality as measured by factors such as reliability and ease of maintenance. During the past six years, the NASA Goddard Space Flight Center has conducted experiments with software projects in an attempt to better understand the impact of software development methodologies, environments, and general technologies on the software process and product. Data were extracted and examined from nearly 50 software development projects, all related to support of satellite flight dynamics ground-based computations. The relationship between computer resources and the software development process and product, as exemplified by the subject NASA data, was examined. Based upon the results, a number of computer resource-related implications are provided.

  20. Terascale Visualization: Multi-resolution Aspirin for Big-Data Headaches

    NASA Astrophysics Data System (ADS)

    Duchaineau, Mark

    2001-06-01

    Recent experience on the Accelerated Strategic Computing Initiative (ASCI) computers shows that computational physicists are successfully producing a prodigious collection of numbers on several thousand processors. But with this wealth of numbers comes an unprecedented difficulty in processing and moving them to provide useful insight and analysis. In this talk, a few simulations are highlighted where recent advancements in multiple-resolution mathematical representations and algorithms have provided some hope of seeing most of the physics of interest while keeping within the practical limits of the post-simulation storage and interactive data-exploration resources. A whole host of visualization research activities was spawned by the 1999 Gordon Bell Prize-winning computation of a shock-tube experiment showing Richtmyer-Meshkov turbulent instabilities. This includes efforts for the entire data pipeline from running simulation to interactive display: wavelet compression of field data, multi-resolution volume rendering and slice planes, out-of-core extraction and simplification of mixing-interface surfaces, shrink-wrapping to semi-regularize the surfaces, semi-structured surface wavelet compression, and view-dependent display-mesh optimization. More recently on the 12 TeraOps ASCI platform, initial results from a 5120-processor, billion-atom molecular dynamics simulation showed that 30-to-1 reductions in storage size can be achieved with no human-observable errors for the analysis required in simulations of supersonic crack propagation. This made it possible to store the 25 trillion bytes worth of simulation numbers in the available storage, which was under 1 trillion bytes. While multi-resolution methods and related systems are still in their infancy, for the largest-scale simulations there is often no other choice should the science require detailed exploration of the results.
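
    As a toy analogue of the wavelet-compression step described above (not the ASCI pipeline itself), the following sketch applies one level of a Haar transform to a 1-D field and discards sub-threshold detail coefficients before storage; reconstruction error is bounded by the threshold.

    ```python
    # One level of a Haar wavelet transform with detail thresholding.
    import numpy as np

    field = np.sin(np.linspace(0, 8 * np.pi, 1024)) + 0.01 * np.random.randn(1024)

    pairs    = field.reshape(-1, 2)
    averages = pairs.mean(axis=1)                  # coarse approximation
    details  = (pairs[:, 0] - pairs[:, 1]) / 2.0   # fine detail

    threshold = 0.02
    kept = np.abs(details) > threshold             # store only significant details
    print(f"detail coefficients kept: {kept.sum()} of {details.size}")

    # Reconstruction: x0 = a + d, x1 = a - d; dropped details contribute
    # at most `threshold` of pointwise error.
    details_sparse = np.where(kept, details, 0.0)
    recon = np.empty_like(field)
    recon[0::2] = averages + details_sparse
    recon[1::2] = averages - details_sparse
    print(f"max reconstruction error: {np.abs(recon - field).max():.4f}")
    ```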

  1. Metschnikowia drakensbergensis sp. nov. and Metschnikowia caudata sp. nov., endemic yeasts associated with Protea flowers in South Africa.

    PubMed

    de Vega, Clara; Guzmán, Beatriz; Steenhuisen, Sandy-Lynn; Johnson, Steven D; Herrera, Carlos M; Lachance, Marc-André

    2014-11-01

    In a taxonomic study of yeasts recovered from nectar of flowers and associated insects in South Africa, 11 strains were found to represent two novel species. Morphological and physiological characteristics and sequence analyses of the large-subunit rRNA gene D1/D2 region, as well as the actin, RNA polymerase II and elongation factor 2 genes, showed that the two novel species belonged to the genus Metschnikowia. Metschnikowia drakensbergensis sp. nov. (type strain EBD-CdVSA09-2(T) =CBS 13649(T) =NRRL Y-63721(T); MycoBank no. MB809688; allotype EBD-CdVSA10-2(A) =CBS 13650(A) =NRRL Y-63720(A)) was recovered from nectar of Protea roupelliae and the beetle Heterochelus sp. This species belongs to the large-spored Metschnikowia clade and is closely related to Metschnikowia proteae, with which mating reactions and single-spored asci were observed. Metschnikowia caudata sp. nov. (type strain EBD-CdVSA08-1(T) =CBS 13651(T) =NRRL Y-63722(T); MycoBank no. MB809689; allotype EBD-CdVSA57-2(A) =CBS 13729(A) =NRRL Y-63723(A)) was isolated from nectar of Protea dracomontana, P. roupelliae and P. subvestita and a honeybee, and is a sister species to Candida hainanensis and Metschnikowia lopburiensis. Analyses of the four sequences demonstrated the existence of three separate phylotypes. Intraspecies matings led to the production of mature asci of unprecedented morphology, with a long, flexuous tail. A single ascospore was produced in all compatible crosses, regardless of sequence phylotype. The two species appear to be endemic to South Africa. The ecology and habitat specificity of these novel species are discussed in terms of host plant and insect host species. © 2014 IUMS.

  2. Gene Expression of Pneumocystis murina after Treatment with Anidulafungin Results in Strong Signals for Sexual Reproduction, Cell Wall Integrity, and Cell Cycle Arrest, Indicating a Requirement for Ascus Formation for Proliferation.

    PubMed

    Cushion, Melanie T; Ashbaugh, Alan; Hendrix, Keeley; Linke, Michael J; Tisdale, Nikeya; Sayson, Steven G; Porollo, Aleksey

    2018-05-01

    The echinocandins are a class of antifungal agents that target β-1,3-d-glucan (BG) biosynthesis. In the ascigerous Pneumocystis species, treatment with these drugs depletes the ascus life cycle stage, which contains BG, but large numbers of forms which do not express BG remain in the infected lungs. In the present study, the gene expression profiles of Pneumocystis murina were compared between infected, untreated mice and mice treated with anidulafungin for 2 weeks to understand the metabolism of the persisting forms. Almost 80 genes were significantly up- or downregulated. Like other fungi exposed to echinocandins, genes associated with sexual replication, cell wall integrity, cell cycle arrest, and stress comprised the strongest upregulated signals in P. murina from the treated mice. The upregulation of the P. murina β-1,3-d-glucan endohydrolase and endo-1,3-glucanase was notable and may explain the disappearance of the existing asci in the lungs of treated mice since both enzymes can degrade BG. The biochemical measurement of BG in the lungs of treated mice and fluorescence microscopy with an anti-BG antibody supported the loss of BG. Downregulated signals included genes involved in cell replication, genome stability, and ribosomal biogenesis and function and the Pneumocystis-specific genes encoding the major surface glycoproteins (Msg). These studies suggest that P. murina attempted to undergo sexual replication in response to a stressed environment and was halted in any type of proliferative cycle, likely due to a lack of BG. Asci appear to be a required part of the life cycle stage of Pneumocystis, and BG may be needed to facilitate progression through the life cycle via sexual replication. Copyright © 2018 Cushion et al.

  3. The Intergradation, Genetic Interchangeability and Interpretation of Gene Conversion Spectrum Types

    PubMed Central

    Lamb, Bernard C.; Ghikas, Aglaia

    1979-01-01

    In the Pasadena strains of Ascobolus immersus, the gene conversion properties of 29 induced (nine UV, nine NG, and 11 ICR-170) and nine spontaneous white-ascospore mutations have been studied. Each mutant was crossed to three types of derived wild-type strains; single mutants often gave very different conversion results in the three types of crosses, with any or all of the following changes: in the percentage with post-meiotic segregation among aberrant-ratio asci; in the percentage with conversion to wild type among aberrant-ratio asci; and in total conversion frequency. These results are compared with those of Leblon (1972 a, b) from Ascobolus immersus and Yu-Sun, Wickramaratne and Whitehouse (1977) from Sordaria brevicollis. It is shown that conversion spectrum types are not necessarily distinct, but can completely intergrade, on the criteria of both post-meiotic segregation frequency and direction of correction. Genetic differences between strains in the present work resulted in much interchangeability of spectrum types for the same mutation in different crosses, e.g., from type C in one cross to type B/D in another cross, although the mutation is presumably of the same molecular type (addition or deletion frame shift, or base substitution) in each cross. These changes of conversion properties for a given mutation in different crosses mean that previous interpretations of spectrum types in terms of specific conversion properties for various molecular types of mutation are inapplicable, or inadequate on their own, to explain the present data. Other factors, such as heterozygous cryptic mutations or conversion control genes, are probably involved. Because of asymmetric hybrid DNA formation, correction properties may differ from observed conversion properties. PMID:17248926

  4. Software process assessments

    NASA Technical Reports Server (NTRS)

    Miller, Sharon E.; Tucker, George T.; Verducci, Anthony J., Jr.

    1992-01-01

    Software process assessments (SPAs) are part of an ongoing program of continuous quality improvement in AT&T. Software development organizations have found their use very beneficial in identifying the issues facing an organization and the actions required to increase both its quality and its productivity.

  5. Software IV and V Research Priorities and Applied Program Accomplishments Within NASA

    NASA Technical Reports Server (NTRS)

    Blazy, Louis J.

    2000-01-01

    The mission of this research is to be world-class creators and facilitators of innovative, intelligent, high performance, reliable information technologies that enable NASA missions to (1) increase software safety and quality through error avoidance, early detection and resolution of errors, by utilizing and applying empirically based software engineering best practices; (2) ensure customer software risks are identified and/or that requirements are met and/or exceeded; (3) research, develop, apply, verify, and publish software technologies for competitive advantage and the advancement of science; and (4) facilitate the transfer of science and engineering data, methods, and practices to NASA, educational institutions, state agencies, and commercial organizations. The goals are to become a national Center Of Excellence (COE) in software and system independent verification and validation, and to become an international leading force in the field of software engineering for improving the safety, quality, reliability, and cost performance of software systems. This project addresses the following problems: ensure safety of NASA missions, ensure requirements are met, minimize programmatic and technological risks of software development and operations, improve software quality, reduce costs and time to delivery, and improve the science of software engineering.

  6. xSDK Foundations: Toward an Extreme-scale Scientific Software Development Kit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heroux, Michael A.; Bartlett, Roscoe; Demeshko, Irina

    Here, extreme-scale computational science increasingly demands multiscale and multiphysics formulations. Combining software developed by independent groups is imperative: no single team has resources for all predictive science and decision support capabilities. Scientific libraries provide high-quality, reusable software components for constructing applications with improved robustness and portability. However, without coordination, many libraries cannot be easily composed. Namespace collisions, inconsistent arguments, lack of third-party software versioning, and additional difficulties make composition costly. The Extreme-scale Scientific Software Development Kit (xSDK) defines community policies to improve code quality and compatibility across independently developed packages (hypre, PETSc, SuperLU, Trilinos, and Alquimia) and provides a foundation for addressing broader issues in software interoperability, performance portability, and sustainability. The xSDK provides turnkey installation of member software and seamless combination of aggregate capabilities, and it marks first steps toward extreme-scale scientific software ecosystems from which future applications can be composed rapidly with assured quality and scalability.

  7. xSDK Foundations: Toward an Extreme-scale Scientific Software Development Kit

    DOE PAGES

    Heroux, Michael A.; Bartlett, Roscoe; Demeshko, Irina; ...

    2017-03-01

    Here, extreme-scale computational science increasingly demands multiscale and multiphysics formulations. Combining software developed by independent groups is imperative: no single team has resources for all predictive science and decision support capabilities. Scientific libraries provide high-quality, reusable software components for constructing applications with improved robustness and portability. However, without coordination, many libraries cannot be easily composed. Namespace collisions, inconsistent arguments, lack of third-party software versioning, and additional difficulties make composition costly. The Extreme-scale Scientific Software Development Kit (xSDK) defines community policies to improve code quality and compatibility across independently developed packages (hypre, PETSc, SuperLU, Trilinos, and Alquimia) and provides a foundation for addressing broader issues in software interoperability, performance portability, and sustainability. The xSDK provides turnkey installation of member software and seamless combination of aggregate capabilities, and it marks first steps toward extreme-scale scientific software ecosystems from which future applications can be composed rapidly with assured quality and scalability.

  8. 15 CFR 995.25 - Quality management system.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... management system are those defined in this part. The quality management system must ensure that the... type approved conversion software is maintained by a third party, CEVAD shall ensure that no changes made to the conversion software render the type approval of the conversion software invalid, and shall...

  9. 15 CFR 995.25 - Quality management system.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... management system are those defined in this part. The quality management system must ensure that the... type approved conversion software is maintained by a third party, CEVAD shall ensure that no changes made to the conversion software render the type approval of the conversion software invalid, and shall...

  10. 15 CFR 995.25 - Quality management system.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... management system are those defined in this part. The quality management system must ensure that the... type approved conversion software is maintained by a third party, CEVAD shall ensure that no changes made to the conversion software render the type approval of the conversion software invalid, and shall...

  11. 15 CFR 995.25 - Quality management system.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... management system are those defined in this part. The quality management system must ensure that the... type approved conversion software is maintained by a third party, CEVAD shall ensure that no changes made to the conversion software render the type approval of the conversion software invalid, and shall...

  12. Software Design Improvements. Part 2; Software Quality and the Design and Inspection Process

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom

    1997-01-01

    The application of assurance engineering techniques improves the duration of failure-free performance of software. The totality of features and characteristics of a software product is what determines its ability to satisfy customer needs. Software in safety-critical systems is very important to NASA. We follow the System Safety Working Group's definition of system safety software: 'The optimization of system safety in the design, development, use and maintenance of software and its integration with safety-critical systems in an operational environment.' 'If it is not safe, say so' has become our motto. This paper covers methods that NASA has used to make software design improvements by focusing on software quality and the design and inspection process.

  13. Establishing Quantitative Software Metrics in Department of the Navy Programs

    DTIC Science & Technology

    2016-04-01

    [Excerpt of front matter; a "Quality to Metrics Dependency Matrix" and a "Quality characteristics to metrics dependency matrix" are listed.] In accomplishing this goal, a need exists for a formalized set of software quality metrics. This document establishes the validity of those necessary metrics.

  14. Towards Accurate Application Characterization for Exascale (APEX)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hammond, Simon David

    Sandia National Laboratories has been engaged in hardware and software codesign activities for a number of years; indeed, it might be argued that prototyping of clusters as far back as the CPLANT machines, and many large capability resources including ASCI Red and RedStorm, were examples of codesigned solutions. As the research supporting our codesign activities has moved closer to investigating on-node runtime behavior, a natural hunger has grown for detailed analysis of both hardware and algorithm performance from the perspective of low-level operations. The Application Characterization for Exascale (APEX) LDRD was a project conceived to address some of these concerns. Primarily, the research was intended to focus on generating accurate and reproducible low-level performance metrics using tools that could scale to production-class code bases. Alongside this research was an advocacy and analysis role associated with evaluating tools for production use, working with leading industry vendors to develop and refine solutions required by our code teams, and directly engaging with production code developers to form a context for the application analysis and a bridge to the research community within Sandia. On each of these accounts significant progress has been made, particularly, as this report will cover, in the low-level analysis of operations for important classes of algorithms. This report summarizes the development of a collection of tools under the APEX research program and leaves to other SAND and L2 milestone reports the description of codesign progress with Sandia's production users/developers.

  15. [Software for illustrating a cost-quality balance carried out by clinical laboratory practice].

    PubMed

    Nishibori, Masahiro; Asayama, Hitoshi; Kimura, Satoshi; Takagi, Yasushi; Hagihara, Michio; Fujiwara, Mutsunori; Yoneyama, Akiko; Watanabe, Takashi

    2010-09-01

    We have no proper reference indicating the quality of clinical laboratory practice, one which would clearly illustrate that better medical tests require greater expense. The Japanese Society of Laboratory Medicine, concerned about the recent difficult medical economy, issued a committee report proposing a guideline for evaluating good laboratory practice. According to the guideline, we developed software that illustrates the cost-quality balance achieved by clinical laboratory practice. We encountered a number of controversial problems, for example, how to measure and weight each quality-related factor, how to calculate the costs of a laboratory test, and how to take the characteristics of a clinical laboratory into account. Consequently, we finished only prototype software within the given period and budget. In this paper, the software implementation of the guideline and the above-mentioned problems are summarized. To stimulate these discussions, the operative software will be put on the Society's homepage for trial.

  16. A survey of Canadian medical physicists: software quality assurance of in‐house software

    PubMed Central

    Kelly, Diane

    2015-01-01

    This paper reports on a survey of medical physicists who write and use in‐house written software as part of their professional work. The goal of the survey was to assess the extent of in‐house software usage and the desire or need for related software quality guidelines. The survey contained eight multiple‐choice questions, a ranking question, and seven free text questions. The survey was sent to medical physicists associated with cancer centers across Canada. The respondents to the survey expressed interest in having guidelines to help them in their software‐related work, but also demonstrated extensive skills in the area of testing, safety, and communication. These existing skills form a basis for medical physicists to establish a set of software quality guidelines. PACS number: 87.55.Qr PMID:25679168

  17. A Guide to Software Evaluation.

    ERIC Educational Resources Information Center

    Leonard, Rex; LeCroy, Barbara

    Arguing that software evaluation is crucial to the quality of courseware available in a school, this paper begins by discussing reasons why microcomputers are making such a tremendous impact on education, and notes that, although the quality of software has improved over the years, the challenge for teachers to integrate computing into the…

  18. 76 FR 54800 - International Business Machines (IBM), Software Group Business Unit, Quality Assurance Group, San...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-02

    ... Machines (IBM), Software Group Business Unit, Quality Assurance Group, San Jose, California; Notice of... workers of International Business Machines (IBM), Software Group Business Unit, Optim Data Studio Tools QA... February 2, 2011 (76 FR 5832). The subject worker group supplies acceptance testing services, design...

  19. The Assistant for Specifying the Quality Software (ASQS) Operational Concept Document. Volume 1

    DTIC Science & Technology

    1990-09-01

    [Excerpt; parts of the scanned text are garbled.] Describes an Assistant in which the manager supplies system-specific characteristics and needs and the Assistant fills in the software quality concepts and methods. Members of the Computer Resources Working Group (CRWG) are aided in performing a software quality engineering study, which Figure 3.4-1 outlines. A remaining fragment weighs a need to recover from faults as more likely than a need to provide alternative functions or interfaces, with further remarks on Autonomy and Modularity.

  20. Secure software practices among Malaysian software practitioners: An exploratory study

    NASA Astrophysics Data System (ADS)

    Mohamed, Shafinah Farvin Packeer; Baharom, Fauziah; Deraman, Aziz; Yahya, Jamaiah; Mohd, Haslina

    2016-08-01

    Secure software practices are increasingly gaining importance among software practitioners and researchers due to the rise of computer crimes in the software industry, and have become one of the determinant factors for producing high quality software. Even though their importance has been revealed, their current practice in the software industry is still scarce, particularly in Malaysia. Thus, an exploratory study was conducted among software practitioners in Malaysia to study their experiences and practices in real-world projects. This paper discusses the findings from the study, which involved 93 software practitioners. A structured questionnaire was utilized for data collection, whilst statistical methods such as frequency, mean, and cross tabulation were used for data analysis. Outcomes from this study reveal that software practitioners are becoming increasingly aware of the importance of secure software practices but lack appropriate implementation, which could affect the quality of the software produced.

  1. Absence of Interference in Association with Gene Conversion in SORDARIA FIMICOLA, and Presence of Interference in Association with Ordinary Recombination

    PubMed Central

    Kitani, Y.

    1978-01-01

    From the analysis of large samples of gene conversion asci in the g locus of Sordaria fimicola, it was found that neither the conversion event itself nor conversion-associated recombination of flanking markers cause either chiasma or chromatid interference with crossing over in a neighboring interval. The presence of more than one kind of crossover event, one causing interference the other not, is considered. The existence of two kinds of gene loci, one of single-cistron composition and the other of multiple-cistron composition, is discussed in relation to reciprocal recombination within a locus. PMID:17176535

  2. Absence of interference in association with gene conversion in Sordaria fimicola, and presence of interference in association with ordinary recombination.

    PubMed

    Kitani, Y

    1978-07-01

    From the analysis of large samples of gene conversion asci in the g locus of Sordaria fimicola, it was found that neither the conversion event itself nor conversion-associated recombination of flanking markers cause either chiasma or chromatid interference with crossing over in a neighboring interval. The presence of more than one kind of crossover event, one causing interference the other not, is considered. The existence of two kinds of gene loci, one of single-cistron composition and the other of multiple-cistron composition, is discussed in relation to reciprocal recombination within a locus.

  3. Handbook for Forecasters in the Mediterranean. Part 2. Regional Forecasting Aids for the Mediterranean Basin.

    DTIC Science & Technology

    1980-12-01

    [Garbled scan of Table V-4, forecasting rules for stations and anchorages: Sigonella (Rules 39-42), Valletta (Rules 43-45), Taranto (Rules 46-48), Argostolion (Rule 49), with a station list including Lampedusa, Lecce, Luqa/Valletta, Messina, Palascia, Pantelleria, Sigonella, S. Maria di Leuca, Cozzo Spadaro, and El Oued-Guemar. No further content is recoverable.]

  4. Identification of Patient Safety Risks Associated with Electronic Health Records: A Software Quality Perspective.

    PubMed

    Virginio, Luiz A; Ricarte, Ivan Luiz Marques

    2015-01-01

    Although Electronic Health Records (EHR) can offer benefits to the health care process, there is a growing body of evidence that these systems can also incur risks to patient safety when developed or used improperly. This work is a literature review to identify these risks from a software quality perspective. Therefore, the risks were classified based on the ISO/IEC 25010 software quality model. The risks identified were related mainly to the characteristics of "functional suitability" (i.e., software bugs) and "usability" (i.e., interface prone to user error). This work elucidates the fact that EHR quality problems can adversely affect patient safety, resulting in errors such as incorrect patient identification, incorrect calculation of medication dosages, and lack of access to patient data. Therefore, the risks presented here provide the basis for developers and EHR regulating bodies to pay attention to the quality aspects of these systems that can result in patient harm.

  5. Revealing the ISO/IEC 9126-1 Clique Tree for COTS Software Evaluation

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry

    2007-01-01

    Previous research has shown that acyclic dependency models, if they exist, can be extracted from software quality standards and that these models can be used to assess software safety and product quality. In the case of commercial off-the-shelf (COTS) software, the extracted dependency model can be used in a probabilistic Bayesian network context for COTS software evaluation. Furthermore, while experts typically employ Bayesian networks to encode domain knowledge, secondary structures (clique trees) from Bayesian network graphs can be used to determine the probabilistic distribution of any software variable (attribute) using any clique that contains that variable. Secondary structures, therefore, provide insight into the fundamental nature of graphical networks. This paper will apply secondary structure calculations to reveal the clique tree of the acyclic dependency model extracted from the ISO/IEC 9126-1 software quality standard. Suggestions will be provided to describe how the clique tree may be exploited to aid efficient transformation of an evaluation model.
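
    A small worked example may help. The sketch below builds a clique tree from an invented attribute DAG (not the actual ISO/IEC 9126-1 dependency model) by moralizing, enumerating maximal cliques, and joining them with a maximum-weight spanning tree over shared variables; it assumes the moral graph is already chordal, which holds for this toy graph.

    ```python
    # Clique-tree construction on a toy quality-attribute DAG (invented).
    import itertools
    import networkx as nx

    dag = nx.DiGraph([
        ("functionality", "quality"), ("reliability", "quality"),
        ("usability", "quality"), ("maturity", "reliability"),
    ])

    moral = nx.moral_graph(dag)  # connect co-parents, drop edge direction

    # Maximal cliques of a chordal moral graph, joined by a maximum-weight
    # spanning tree on separator size, form a clique (junction) tree.
    cliques = [frozenset(c) for c in nx.find_cliques(moral)]
    cg = nx.Graph()
    cg.add_nodes_from(cliques)
    for a, b in itertools.combinations(cliques, 2):
        if a & b:
            cg.add_edge(a, b, weight=len(a & b))
    tree = nx.maximum_spanning_tree(cg)

    for a, b in tree.edges:
        print(sorted(a), "--", sorted(a & b), "--", sorted(b))
    ```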

  6. A method for tailoring the information content of a software process model

    NASA Technical Reports Server (NTRS)

    Perkins, Sharon; Arend, Mark B.

    1990-01-01

    The framework is defined for a general method for selecting a necessary and sufficient subset of a general software life cycle's information products to support a new software development process. Procedures for characterizing problem domains in general and mapping them to a tailored set of life cycle processes and products are presented. An overview of the method is shown using the following steps: (1) During the problem concept definition phase, perform standardized interviews and dialogs between developer and user, and between user and customer; (2) Generate a quality needs profile of the software to be developed, based on information gathered in step 1; (3) Translate the quality needs profile into a profile of quality criteria that must be met by the software to satisfy the quality needs; (4) Map the quality criteria to a set of accepted processes and products for achieving each criterion; (5) Select the information products which match or support the accepted processes and products of step 4; and (6) Select the design methodology which produces the information products selected in step 5.

  7. A method for tailoring the information content of a software process model

    NASA Technical Reports Server (NTRS)

    Perkins, Sharon; Arend, Mark B.

    1990-01-01

    The framework is defined for a general method for selecting a necessary and sufficient subset of a general software life cycle's information products to support a new software development process. Procedures for characterizing problem domains in general and mapping them to a tailored set of life cycle processes and products are presented. An overview of the method is shown using the following steps: (1) During the problem concept definition phase, perform standardized interviews and dialogs between developer and user, and between user and customer; (2) Generate a quality needs profile of the software to be developed, based on information gathered in step 1; (3) Translate the quality needs profile into a profile of quality criteria that must be met by the software to satisfy the quality needs; (4) Map the quality criteria to a set of accepted processes and products for achieving each criterion; (5) Select the information products which match or support the accepted processes and products of step 4; and (6) Select the design methodology which produces the information products selected in step 5.
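
    Steps 2 through 5 of the method amount to chained lookups from quality needs to criteria to accepted products. The sketch below renders that mapping with invented table contents; the actual profiles and mappings are defined by the method's procedures, not here.

    ```python
    # Chained lookup tables: quality needs -> criteria -> information products.
    # All entries are hypothetical placeholders.
    needs_to_criteria = {
        "high_reliability": ["fault_tolerance", "testability"],
        "long_maintenance_life": ["modularity", "documentation"],
    }
    criteria_to_products = {
        "fault_tolerance": ["FMEA report", "stress test results"],
        "testability": ["unit test plan"],
        "modularity": ["design standard", "module interface spec"],
        "documentation": ["maintenance manual"],
    }

    def tailor(profile):
        """Select the information products supporting a quality-needs profile."""
        products = set()
        for need in profile:
            for criterion in needs_to_criteria.get(need, []):
                products.update(criteria_to_products.get(criterion, []))
        return sorted(products)

    print(tailor(["high_reliability", "long_maintenance_life"]))
    ```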

  8. Development of the quality assessment model of EHR software in family medicine practices: research based on user satisfaction.

    PubMed

    Kralj, Damir; Kern, Josipa; Tonkovic, Stanko; Koncar, Miroslav

    2015-09-09

    Family medicine practices (FMPs) form the basis of the Croatian health care system. Use of electronic health record (EHR) software is mandatory and plays an important role in running these practices, but important functional features remain uneven and are largely left to the will of the software developers. The objective of this study was to develop a novel and comprehensive model for functional evaluation of EHR software in FMPs, based on current world standards, models and projects, as well as on actual user satisfaction and requirements. Based on previous theoretical and experimental research in this area, we created an initial framework model consisting of six basic categories as the basis for an online survey questionnaire. Family doctors assessed perceived software quality using a five-point Likert-type scale. Using exploratory factor analysis and appropriate statistical methods on the collected data, the final optimal structure of the novel model was formed. Special attention was focused on the validity and quality of the novel model. The online survey collected a total of 384 cases. The obtained results indicate both the quality of the assessed software and the quality in use of the novel model. The strong ergonomic orientation of the novel measurement model was particularly emphasised. The resulting model is validated in multiple ways, comprehensive, and universal: it could be used to assess the user-perceived quality of almost all forms of ambulatory EHR software and is therefore useful to all stakeholders in this area of health care informatisation.
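
    The exploratory-factor-analysis step can be sketched generically. The item count, factor count, and randomly generated responses below are placeholders, not the study's instrument or data.

    ```python
    # Exploratory factor analysis over Likert-scale survey responses (toy data).
    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(0)
    # 384 respondents x 8 Likert items (1-5), mimicking a survey response matrix.
    responses = rng.integers(1, 6, size=(384, 8)).astype(float)

    fa = FactorAnalysis(n_components=3, random_state=0)
    fa.fit(responses)
    loadings = fa.components_.T  # items x factors; inspect to group items
    print(np.round(loadings, 2))
    ```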

  9. NASA's Approach to Software Assurance

    NASA Technical Reports Server (NTRS)

    Wetherholt, Martha

    2015-01-01

    NASA defines software assurance as: the planned and systematic set of activities that ensure conformance of software life cycle processes and products to requirements, standards, and procedures via quality, safety, reliability, and independent verification and validation. NASA's implementation of this approach to the quality, safety, reliability, security and verification and validation of software is brought together in one discipline, software assurance. Organizationally, NASA has software assurance at each NASA center, a Software Assurance Manager at NASA Headquarters, a Software Assurance Technical Fellow (currently the same person as the SA Manager), and an Independent Verification and Validation Organization with its own facility. An umbrella risk mitigation strategy for safety and mission success assurance of NASA's software, software assurance covers a wide area and is better structured to address the dynamic changes in how software is developed, used, and managed, as well as its increasingly complex functionality. Being flexible, risk based, and prepared for challenges in software at NASA is essential, especially as much of our software is unique for each mission.

  10. Measuring health care process quality with software quality measures.

    PubMed

    Yildiz, Ozkan; Demirörs, Onur

    2012-01-01

    Existing quality models focus on specific diseases, clinics, or clinical areas. Although they contain structure, process, or output type measures, there is no model that measures the quality of health care processes comprehensively. In addition, because overall process quality is not measured, hospitals cannot compare the quality of their processes internally or externally. To solve the above problems, a new model was developed from software quality measures. We adopted the ISO/IEC 9126 software quality standard for health care processes, then added JCIAS (Joint Commission International Accreditation Standards for Hospitals) measurable elements to the model scope to unify functional requirements. Measurement results for the assessment (diagnosing) process are provided in this paper. After the application, it was concluded that the model determines weak and strong aspects of the processes, gives a more detailed picture of process quality, and provides quantifiable information that hospitals can use to compare their processes with those of multiple organizations.

  11. Modernized build and test infrastructure for control software at ESO: highly flexible building, testing, and automatic quality practices for telescope control software

    NASA Astrophysics Data System (ADS)

    Pellegrin, F.; Jeram, B.; Haucke, J.; Feyrin, S.

    2016-07-01

    The paper describes the introduction of a new automated build and test infrastructure, based on the open-source software Jenkins, into the ESO Very Large Telescope control software to replace the preexisting in-house solution. A brief introduction to software quality practices is given, together with a description of the previous solution, its limitations, and new upcoming requirements. The modifications required to adapt the new system are described, along with how they were applied to the current software and the results obtained. An overview of how the new system may be used in future projects is also presented.

  12. Phylogeny of Penicillium and the segregation of Trichocomaceae into three families

    PubMed Central

    Houbraken, J.; Samson, R.A.

    2011-01-01

    Species of Trichocomaceae occur commonly and are important to both industry and medicine. They are associated with food spoilage and mycotoxin production and can occur in the indoor environment, causing health hazards by the formation of β-glucans, mycotoxins and surface proteins. Some species are opportunistic pathogens, while others are exploited in biotechnology for the production of enzymes, antibiotics and other products. Penicillium belongs phylogenetically to Trichocomaceae and more than 250 species are currently accepted in this genus. In this study, we investigated the relationship of Penicillium to other genera of Trichocomaceae and studied in detail the phylogeny of the genus itself. In order to study these relationships, partial RPB1, RPB2 (RNA polymerase II genes), Tsr1 (putative ribosome biogenesis protein) and Cct8 (putative chaperonin complex component TCP-1) gene sequences were obtained. The Trichocomaceae are divided in three separate families: Aspergillaceae, Thermoascaceae and Trichocomaceae. The Aspergillaceae are characterised by the formation of flask-shaped or cylindrical phialides, asci produced inside cleistothecia or surrounded by Hülle cells and mainly ascospores with a furrow or slit, while the Trichocomaceae are defined by the formation of lanceolate phialides, asci borne within a tuft or layer of loose hyphae and ascospores lacking a slit. Thermoascus and Paecilomyces, both members of Thermoascaceae, also form ascospores lacking a furrow or slit, but are differentiated from Trichocomaceae by the production of asci from croziers and their thermotolerant or thermophilic nature. Phylogenetic analysis shows that Penicillium is polyphyletic. The genus is re-defined and a monophyletic genus for both anamorphs and teleomorphs is created (Penicillium sensu stricto). The genera Thysanophora, Eupenicillium, Chromocleista, Hemicarpenteles and Torulomyces belong in Penicillium s. str. and new combinations for the species belonging to these genera are proposed. Analysis of Penicillium below genus rank revealed the presence of 25 clades. A new classification system including both anamorph and teleomorph species is proposed and these 25 clades are treated here as sections. An overview of species belonging to each section is presented. Taxonomic novelties: New sections, all in Penicillium: sect. Sclerotiora Houbraken & Samson, sect. Charlesia Houbraken & Samson, sect. Thysanophora Houbraken & Samson, sect. Ochrosalmonea Houbraken & Samson, sect. Cinnamopurpurea Houbraken & Samson, sect. Fracta Houbraken & Samson, sect. Stolkia Houbraken & Samson, sect. Gracilenta Houbraken & Samson, sect. Citrina Houbraken & Samson, sect. Turbata Houbraken & Samson, sect. Paradoxa Houbraken & Samson, sect. Canescentia Houbraken & Samson. New combinations: Penicillium asymmetricum (Subramanian & Sudha) Houbraken & Samson, P. bovifimosum (Tuthill & Frisvad) Houbraken & Samson, P. glaucoalbidum (Desmazières) Houbraken & Samson, P. laeve (K. Ando & Manoch) Houbraken & Samson, P. longisporum (Kendrick) Houbraken & Samson, P. malachiteum (Yaguchi & Udagawa) Houbraken & Samson, P. ovatum (K. Ando & Nawawi) Houbraken & Samson, P. parviverrucosum (K. Ando & Pitt) Houbraken & Samson, P. saturniforme (Wang & Zhuang) Houbraken & Samson, P. taiwanense (Matsushima) Houbraken & Samson. New names: Penicillium coniferophilum Houbraken & Samson, P. hennebertii Houbraken & Samson, P. melanostipe Houbraken & Samson, P. porphyreum Houbraken & Samson. PMID:22308045

  13. Improving the quality of care of patients with rheumatic disease using patient-centric electronic redesign software.

    PubMed

    Newman, Eric D; Lerch, Virginia; Billet, Jon; Berger, Andrea; Kirchner, H Lester

    2015-04-01

    Electronic health records (EHRs) are not optimized for chronic disease management. To improve the quality of care for patients with rheumatic disease, we developed electronic data capture, aggregation, display, and documentation software. The software integrated and reassembled information from the patient (via a touchscreen questionnaire), nurse, physician, and EHR into a series of actionable views. Core functions included trends over time, rheumatology-related demographics, and documentation for patient and provider. Quality measures collected included patient-reported outcomes, disease activity, and function. The software was tested and implemented in 3 rheumatology departments, and integrated into routine care delivery. Post-implementation evaluation measured adoption, efficiency, productivity, and patient perception. Over 2 years, 6,725 patients completed 19,786 touchscreen questionnaires. The software was adopted for use by 86% of patients and rheumatologists. Chart review and documentation time trended downward, and productivity increased by 26%. Patient satisfaction, activation, and adherence remained unchanged, although pre-implementation values were high. A strong correlation was seen between use of the software and disease control (weighted Pearson's correlation coefficient 0.5927, P = 0.0095), and a relative increase in patients with low disease activity of 3% per quarter was noted. We describe innovative software that aggregates, stores, and displays information vital to improving the quality of care for patients with chronic rheumatic disease. The software was well-adopted by patients and providers. Post-implementation, significant improvements in quality of care, efficiency of care, and productivity were demonstrated. Copyright © 2015 by the American College of Rheumatology.
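
    The weighted Pearson correlation quoted above is a standard statistic; the sketch below shows one common weighted form of it on fabricated data, since the study's per-patient weighting scheme is not detailed in this record.

    ```python
    # Weighted Pearson correlation on invented software-use vs. disease-control data.
    import numpy as np

    def weighted_pearson(x, y, w):
        """Pearson correlation with observation weights w."""
        w = w / w.sum()
        mx, my = np.sum(w * x), np.sum(w * y)
        cov = np.sum(w * (x - mx) * (y - my))
        sx = np.sqrt(np.sum(w * (x - mx) ** 2))
        sy = np.sqrt(np.sum(w * (y - my) ** 2))
        return cov / (sx * sy)

    software_use = np.array([0.2, 0.5, 0.7, 0.9, 1.0])   # fraction of visits using the tool
    disease_ctrl = np.array([0.4, 0.5, 0.6, 0.8, 0.85])  # fraction with low disease activity
    visits       = np.array([40, 55, 60, 80, 90], dtype=float)  # example weights

    print(f"r_w = {weighted_pearson(software_use, disease_ctrl, visits):.4f}")
    ```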

  14. Software Quality Assurance and Controls Standard

    DTIC Science & Technology

    2010-04-27

    Software Quality Assurance and Controls Standard. Sue Carroll, Principal Software Quality Analyst, SAS; John Walz, VP Technology and... [Scanned excerpt; report-documentation form boilerplate omitted.] Topics include: What is in a Software Life Cycle (SLC) process? What is in a SQA process? Where are SQA controls? What is the SQA standards history? What is changing in SQA?

  15. Simple solution to the medical instrumentation software problem

    NASA Astrophysics Data System (ADS)

    Leif, Robert C.; Leif, Suzanne B.; Leif, Stephanie H.; Bingue, E.

    1995-04-01

    Medical devices now include a substantial software component, which is both difficult and expensive to produce and maintain. Medical software must be developed according to 'Good Manufacturing Practices' (GMP). Good Manufacturing Practices, as specified by the FDA and ISO, require the definition of and compliance with a software process that ensures quality products by specifying a detailed method of software construction. The software process should be based on accepted standards. US Department of Defense software standards and technology can both facilitate the development and improve the quality of medical systems. We describe the advantages of employing Mil-Std-498, Software Development and Documentation, and the Ada programming language. Ada provides the very broad range of functionalities, from embedded real-time to management information systems, required by many medical devices. It also includes advanced facilities for object-oriented programming and software engineering.

  16. Software assurance standard

    NASA Technical Reports Server (NTRS)

    1992-01-01

    This standard specifies the software assurance program for the provider of software. It also delineates the assurance activities for the provider and the assurance data that are to be furnished by the provider to the acquirer. In any software development effort, the provider is the entity or individual that actually designs, develops, and implements the software product, while the acquirer is the entity or individual who specifies the requirements and accepts the resulting products. This standard specifies at a high level an overall software assurance program for software developed for and by NASA. Assurance includes the disciplines of quality assurance, quality engineering, verification and validation, nonconformance reporting and corrective action, safety assurance, and security assurance. The application of these disciplines during a software development life cycle is called software assurance. Subsequent lower-level standards will specify the specific processes within these disciplines.

  17. Projecting manpower to attain quality

    NASA Technical Reports Server (NTRS)

    Rone, K. Y.

    1983-01-01

    In these days of soaring software costs it becomes increasingly important to properly manage a software development project. One element of the management task is the projection and tracking of manpower required to perform the task. In addition, since the total cost of the task is directly related to the initial quality built into the software, it becomes a necessity to project the development manpower in a way to attain that quality. An approach to projecting and tracking manpower with quality in mind is described. The resulting model is useful as a projection tool but must be validated in order to be used as an on-going software cost engineering tool. A procedure is developed to facilitate the tracking of model projections and actual data to allow the model to be tuned. Finally, since the model must be used in an environment of overlapping development activities on a progression of software elements in development and maintenance, a manpower allocation model is developed for use in a steady-state development/maintenance environment.

  18. Analysis of quality raw data of second generation sequencers with Quality Assessment Software.

    PubMed

    Ramos, Rommel Tj; Carneiro, Adriana R; Baumbach, Jan; Azevedo, Vasco; Schneider, Maria Pc; Silva, Artur

    2011-04-18

    Second generation technologies have advantages over Sanger sequencing; however, they have resulted in new challenges for the genome construction process, especially because of the small size of the reads, despite the high degree of coverage. Independent of the program chosen for the construction process, DNA sequences are superimposed, based on identity, to extend the reads, generating contigs; mismatches indicate a lack of homology and are not included. This process improves our confidence in the sequences that are generated. We developed Quality Assessment Software, with which one can review graphs showing the distribution of quality values from the sequencing reads. This software allows us to adopt more stringent quality standards for sequence data, based on quality-graph analysis and estimated coverage after applying the quality filter, providing acceptable sequence coverage for genome construction from short reads. Quality filtering is a fundamental step in the process of constructing genomes, as it reduces the frequency of incorrect alignments that are caused by measuring errors, which can occur during the construction process due to the size of the reads, provoking misassemblies. Application of quality filters to sequence data, using the software Quality Assessment, along with graphing analyses, provided greater precision in the definition of cutoff parameters, which increased the accuracy of genome construction.
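
    As an illustration of the kind of quality filtering the abstract describes, here is a minimal sketch of a per-read mean-phred filter with a coverage estimate; the FASTQ parsing, threshold, and genome size are assumptions, not details of the Quality Assessment Software itself.

```python
# Minimal sketch: filter reads by mean phred score and estimate the
# sequence coverage that survives the filter. Threshold and genome
# size below are illustrative assumptions.
def mean_phred(quality_line, offset=33):
    scores = [ord(c) - offset for c in quality_line]
    return sum(scores) / len(scores)

def filter_fastq(path, threshold=20):
    kept, total_bases = 0, 0
    with open(path) as fq:
        while True:
            header = fq.readline()
            if not header:
                break
            seq = fq.readline().strip()
            fq.readline()              # '+' separator line
            qual = fq.readline().strip()
            if mean_phred(qual) >= threshold:
                kept += 1
                total_bases += len(seq)
    return kept, total_bases

GENOME_SIZE = 5_000_000  # assumed genome length in bp
kept, bases = filter_fastq("reads.fastq", threshold=20)
print(f"{kept} reads kept, estimated coverage {bases / GENOME_SIZE:.1f}x")
```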

  19. A quality-refinement process for medical imaging applications.

    PubMed

    Neuhaus, J; Maleike, D; Nolden, M; Kenngott, H-G; Meinzer, H-P; Wolf, I

    2009-01-01

    To introduce and evaluate a process for refinement of software quality that is suitable for research groups. In order to avoid constraining researchers too much, the quality improvement process has to be designed carefully. The scope of this paper is to present and evaluate a process to advance quality aspects of existing research prototypes in order to make them ready for initial clinical studies. The proposed process is tailored for research environments and is therefore more lightweight than traditional quality management processes. It focuses on quality criteria that are important at the given stage of the software life cycle, and the usage of tools that automate aspects of the process is emphasized. To evaluate the additional effort that comes along with the process, it was applied, as an example, to eight prototypical software modules for medical image processing. The introduced process has been applied to improve the quality of all prototypes so that they could be successfully used in clinical studies. The quality refinement yielded an average of 13 person-days of additional effort per project. Overall, 107 bugs were found and resolved by applying the process. Careful selection of quality criteria and the usage of automated process tools lead to a lightweight quality refinement process suitable for scientific research groups that can be applied to ensure a successful transfer of technical software prototypes into clinical research workflows.

  20. SIMULATION TOOL KIT FOR INDOOR AIR QUALITY AND INHALATION EXPOSURE (IAQX) VERSION 1.0 USER'S GUIDE

    EPA Science Inventory

    The User's Guide describes a Microsoft Windows-based indoor air quality (IAQ) simulation software package designed Simulation Tool Kit for Indoor Air Quality and Inhalation Exposure, or IAQX for short. This software complements and supplements existing IAQ simulation programs and...

  1. Digital radiography: optimization of image quality and dose using multi-frequency software.

    PubMed

    Precht, H; Gerke, O; Rosendahl, K; Tingberg, A; Waaler, D

    2012-09-01

    New developments in the processing of digital radiographs (DR), including multi-frequency processing (MFP), allow optimization of image quality and radiation dose. This is particularly promising in children, as they are believed to be more sensitive to ionizing radiation than adults. To examine whether the use of MFP software reduces the radiation dose without compromising quality at DR of the femur in 5-year-old-equivalent anthropomorphic and technical phantoms. A total of 110 images of an anthropomorphic phantom were acquired on a DR system (Canon DR with CXDI-50 C detector and MLT[S] software) and analyzed by three pediatric radiologists using Visual Grading Analysis. In addition, 3,500 images taken of a technical contrast-detail phantom (CDRAD 2.0) provided an objective image-quality assessment. Optimal image quality was maintained at a dose reduction of 61% with MLT(S)-optimized images. Even for images of diagnostic quality, MLT(S) provided a dose reduction of 88% as compared to the reference image. Software impact on image quality was found to be significant for dose (mAs), dynamic range dark region and frequency band. By optimizing image processing parameters, a significant dose reduction is possible without significant loss of image quality.

  2. Unisys' experience in software quality and productivity management of an existing system

    NASA Technical Reports Server (NTRS)

    Munson, John B.

    1988-01-01

    A summary of Quality Improvement techniques, implementation, and results in the maintenance, management, and modification of large software systems for the Space Shuttle Program's ground-based systems is provided.

  3. Quality and standardization of telecommunication switching system software

    NASA Astrophysics Data System (ADS)

    Ranko, K.; Hivensaio, J.; Myllykangas, A.

    1981-12-01

    The purpose of this paper is to illustrate quality and standardization of switching system software from the authors' point of view, with the aim of developing standardization in the user environment.

  4. Organizational management practices for achieving software process improvement

    NASA Technical Reports Server (NTRS)

    Kandt, Ronald Kirk

    2004-01-01

    The crisis in developing software has been known for over thirty years. Problems that existed in developing software in the early days of computing still exist today. These problems include the delivery of low-quality products, actual development costs that exceed expected development costs, and actual development time that exceeds expected development time. Several solutions have been offered to overcome our inability to deliver high-quality software on time and within budget. One of these solutions involves software process improvement. However, such efforts often fail because of organizational management issues. This paper discusses business practices that organizations should follow to improve their chances of initiating and sustaining successful software process improvement efforts.

  5. A Strategy for Improved System Assurance

    DTIC Science & Technology

    2007-06-20

    Quality (measurements, life cycle, safety, security and others). Referenced standards include: ISO/IEC 12207, Software Life Cycle Processes; ISO 9001, Quality Management Systems; ISO/IEC 14598, Software Product Evaluation; ISO/IEC 90003, Guidelines for the Application of ISO 9001:2000 to Computer Software; IEEE 12207, Industry Implementation of International Standard ISO/IEC 12207; and IEEE 1220, Standard for Application and Management of the Systems Engineering Process.

  6. Proceedings of Tenth Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Papers are presented on the following topics: measurement of software technology, recent studies of the Software Engineering Lab, software management tools, expert systems, error seeding as a program validation technique, software quality assurance, software engineering environments (including knowledge-based environments), the Distributed Computing Design System, and various Ada experiments.

  7. Proceedings of the Seventeenth Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Proceedings of the Seventeenth Annual Software Engineering Workshop are presented. The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/Goddard Space Flight Center and created to investigate the effectiveness of software engineering technologies when applied to the development of applications software. Topics covered include: the Software Engineering Laboratory; process measurement; software reuse; software quality; lessons learned; and "Is Ada dying?"

  8. Usability Prediction & Ranking of SDLC Models Using Fuzzy Hierarchical Usability Model

    NASA Astrophysics Data System (ADS)

    Gupta, Deepak; Ahlawat, Anil K.; Sagar, Kalpna

    2017-06-01

    Evaluation of software quality is an important aspect of controlling and managing software. Through such evaluation, improvements in the software process can be made. Software quality is significantly dependent on software usability. Many researchers have proposed a number of usability models. Each model considers a set of usability factors but does not cover all usability aspects. Practical implementation of these models is still missing, as there is a lack of a precise definition of usability. Also, it is very difficult to integrate these models into current software engineering practices. In order to overcome these challenges, this paper aims to define the term `usability' using the proposed hierarchical usability model with its detailed taxonomy. The taxonomy considers generic evaluation criteria for identifying the quality components, which brings together factors, attributes and characteristics defined in various HCI and software models. For the first time, the usability model is also implemented to predict more accurate usability values. The proposed system is named the fuzzy hierarchical usability model and can be easily integrated into current software engineering practices. In order to validate the work, a dataset of six software development life cycle models is created and employed. These models are ranked according to their predicted usability values. This research also focuses on a detailed comparison of the proposed model with existing usability models.
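
    To make the idea of a hierarchical, fuzzy aggregation of usability factors concrete, here is a minimal sketch; the factor names, weights, and triangular membership function are illustrative assumptions, not the paper's actual model.

```python
# Minimal sketch of hierarchical fuzzy aggregation of usability factors.
# The factor tree, weights, and membership function are assumptions.
def triangular(x, a, b, c):
    """Triangular fuzzy membership on [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Leaf scores in [0, 1] for an SDLC model under evaluation (hypothetical).
leaves  = {"learnability": 0.7, "efficiency": 0.6, "satisfaction": 0.8}
weights = {"learnability": 0.4, "efficiency": 0.3, "satisfaction": 0.3}

crisp = sum(weights[f] * leaves[f] for f in leaves)  # weighted aggregation
high = triangular(crisp, 0.5, 1.0, 1.5)  # degree of membership in "high usability"
print(f"usability score {crisp:.2f}, degree 'high' = {high:.2f}")
```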

  9. Spectrum analysis on quality requirements consideration in software design documents.

    PubMed

    Kaiya, Haruhiko; Umemura, Masahiro; Ogata, Shinpei; Kaijiri, Kenji

    2013-12-01

    Software quality requirements defined in the requirements analysis stage should be implemented in the final products, such as source codes and system deployment. To guarantee this meta-requirement, quality requirements should be considered in the intermediate stages, such as the design stage or the architectural definition stage. We propose a novel method for checking whether quality requirements are considered in the design stage. In this method, a technique called "spectrum analysis for quality requirements" is applied not only to requirements specifications but also to design documents. The technique enables us to derive the spectrum of a document, and quality requirements considerations in the document are numerically represented in the spectrum. We can thus objectively identify whether the considerations of quality requirements in a requirements document are adapted to its design document. To validate the method, we applied it to commercial software systems with the help of a supporting tool, and we confirmed that the method worked well.
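
    A rough way to picture a "quality requirements spectrum" is as a normalized keyword-frequency vector over quality characteristics, computed for both the requirements document and the design document and then compared. The sketch below illustrates that idea only; the keyword groups and file names are invented, not the paper's technique in detail.

```python
import re
from collections import Counter

# Illustrative keyword groups for a few quality characteristics.
KEYWORDS = {
    "performance": {"fast", "latency", "throughput", "response"},
    "security":    {"encrypt", "authenticate", "authorize", "audit"},
    "reliability": {"failover", "recover", "retry", "availability"},
}

def spectrum(text):
    """Normalized frequency of quality-characteristic keywords."""
    words = Counter(re.findall(r"[a-z]+", text.lower()))
    raw = {q: sum(words[k] for k in ks) for q, ks in KEYWORDS.items()}
    total = sum(raw.values()) or 1
    return {q: raw[q] / total for q in raw}

req_spec = open("requirements.txt").read()   # hypothetical input files
design   = open("design.txt").read()
print("requirements:", spectrum(req_spec))
print("design:      ", spectrum(design))    # compare the two spectra
```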

  10. Application of newly developed Fluoro-QC software for image quality evaluation in cardiac X-ray systems.

    PubMed

    Oliveira, M; Lopez, G; Geambastiani, P; Ubeda, C

    2018-05-01

    A quality assurance (QA) program is a valuable tool for the continuous production of optimal-quality images. The aim of this paper is to assess newly developed automatic computer software for image quality (IQ) evaluation in fluoroscopy X-ray systems. Test object images were acquired using one fluoroscopy system, Siemens Axiom Artis model (Siemens AG, Medical Solutions, Erlangen, Germany). The software was developed as an ImageJ plugin. Two image quality parameters were assessed: high-contrast spatial resolution (HCSR) and signal-to-noise ratio (SNR). The times required for the manual and automatic image quality assessment procedures were compared. The paired t-test was used to assess the data. p values of less than 0.05 were considered significant. The Fluoro-QC software generated faster IQ evaluation results (mean = 0.31 ± 0.08 min) than the manual procedure (mean = 4.68 ± 0.09 min). The mean difference between techniques was 4.36 min. Discrepancies were identified in the region of interest (ROI) areas drawn manually, with evidence of user dependence. The new software presented the results of two tests (HCSR = 3.06, SNR = 5.17) and also collected information from the DICOM header. Significant differences were not identified between manual and automatic measures of SNR (p value = 0.22) and HCSR (p value = 0.46). The Fluoro-QC software is a feasible, fast and free-to-use method for evaluating image quality parameters on fluoroscopy systems. Copyright © 2017 The College of Radiographers. Published by Elsevier Ltd. All rights reserved.
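
    A minimal sketch of the SNR half of such a measurement, assuming a uniform ROI in a test-object image loaded as a NumPy array; the ROI coordinates and synthetic image are hypothetical, and the real plugin is ImageJ-based rather than this Python code.

```python
import numpy as np

def roi_snr(image, row, col, size):
    """SNR of a square ROI: mean pixel value over its standard deviation."""
    roi = image[row:row + size, col:col + size]
    return roi.mean() / roi.std()

# Hypothetical test: synthetic noisy flat-field image with a known ROI.
rng = np.random.default_rng(0)
img = rng.normal(loc=100.0, scale=20.0, size=(512, 512))
print(f"SNR = {roi_snr(img, 200, 200, 64):.2f}")  # expect roughly 100/20 = 5
```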

  11. SQA of finite element method (FEM) codes used for analyses of pit storage/transport packages

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Russel, E.

    1997-11-01

    This report contains viewgraphs on the software quality assurance of finite element method codes used for analyses of pit storage and transport projects. This methodology utilizes the ISO 9000-3: Guideline for application of 9001 to the development, supply, and maintenance of software, for establishing well-defined software engineering processes to consistently maintain high quality management approaches.

  12. QRev—Software for computation and quality assurance of acoustic doppler current profiler moving-boat streamflow measurements—Technical manual for version 2.8

    USGS Publications Warehouse

    Mueller, David S.

    2016-06-21

    The software program QRev applies common and consistent computational algorithms combined with automated filtering and quality assessment of the data to improve the quality and efficiency of streamflow measurements and to help ensure that U.S. Geological Survey streamflow measurements are consistent, accurate, and independent of the manufacturer of the instrument used to make the measurement. Software from different manufacturers uses different algorithms for various aspects of the data processing and discharge computation. The algorithms used by QRev to filter data, interpolate data, and compute discharge are documented and compared to the algorithms used in the manufacturers' software. QRev applies consistent algorithms and creates a data structure that is independent of the data source. QRev saves an extensible markup language (XML) file that can be imported into databases or electronic field notes software. This report is the technical manual for version 2.8 of QRev.
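
    To illustrate the kind of manufacturer-independent XML a tool like this can emit, here is a minimal sketch using the Python standard library; the element names and values are hypothetical, not QRev's actual schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical element names and values; not QRev's actual schema.
m = ET.Element("StreamflowMeasurement", site="03293000")
ET.SubElement(m, "Discharge", units="m3/s").text = "42.7"
ET.SubElement(m, "Uncertainty", units="percent").text = "3.2"
ET.SubElement(m, "Instrument").text = "ADCP"
ET.ElementTree(m).write("measurement.xml", xml_declaration=True,
                        encoding="utf-8")
```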

  13. [Quality assurance of the renal applications software].

    PubMed

    del Real Núñez, R; Contreras Puertas, P I; Moreno Ortega, E; Mena Bares, L M; Maza Muret, F R; Latre Romero, J M

    2007-01-01

    The need for quality assurance of all technical aspects of nuclear medicine studies is widely recognised. However, little attention has been paid to the quality assurance of the applications software. Our work reported here aims at verifying the analysis software for processing of renal nuclear medicine studies (renograms). The software tools were used to build a synthetic dynamic model of renal system. The model consists of two phases: perfusion and function. The organs of interest (kidneys, bladder and aortic artery) were simple geometric forms. The uptake of the renal structures was described by mathematic functions. Curves corresponding to normal or pathological conditions were simulated for kidneys, bladder and aortic artery by appropriate selection of parameters. There was no difference between the parameters of the mathematic curves and the quantitative data produced by the renal analysis program. Our test procedure is simple to apply, reliable, reproducible and rapid to verify the renal applications software.
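
    As a sketch of the synthetic-model idea, the time-activity curve of a kidney can be simulated with a simple analytic uptake-washout function; the bi-exponential form and parameters below are illustrative assumptions, not the authors' actual equations.

```python
import numpy as np

def renogram_curve(t, a=1.0, k_up=0.5, k_out=0.1):
    """Illustrative uptake-washout curve: rises, peaks, then washes out."""
    return a * (np.exp(-k_out * t) - np.exp(-k_up * t))

t = np.arange(0, 20, 0.25)                   # minutes
normal = renogram_curve(t)                   # normal kidney
obstructed = renogram_curve(t, k_out=0.01)   # impaired washout
peak_time = t[np.argmax(normal)]
print(f"normal kidney curve peaks at {peak_time:.2f} min")
```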

  14. Early experiences building a software quality prediction model

    NASA Technical Reports Server (NTRS)

    Agresti, W. W.; Evanco, W. M.; Smith, M. C.

    1990-01-01

    Early experiences building a software quality prediction model are discussed. The overall research objective is to establish a capability to project a software system's quality from an analysis of its design. The technical approach is to build multivariate models for estimating reliability and maintainability. Data from 21 Ada subsystems were analyzed to test hypotheses about various design structures leading to failure-prone or unmaintainable systems. Current design variables highlight the interconnectivity and visibility of compilation units. Other model variables provide for the effects of reusability and software changes. Reported results are preliminary because additional project data is being obtained and new hypotheses are being developed and tested. Current multivariate regression models are encouraging, explaining 60 to 80 percent of the variation in error density of the subsystems.
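
    A minimal sketch of the modeling step, assuming design metrics per subsystem as predictors and error density as the response, fit by ordinary least squares with NumPy; the metric names and values are hypothetical, not the study's data.

```python
import numpy as np

# Hypothetical design metrics per subsystem: [imports, visibility, changes]
X = np.array([[12, 4, 30], [25, 9, 55], [7, 2, 12],
              [18, 6, 40], [30, 11, 70], [10, 3, 20]], dtype=float)
y = np.array([1.8, 4.1, 0.9, 2.7, 5.0, 1.3])   # errors per KLOC

X1 = np.column_stack([np.ones(len(X)), X])     # add intercept column
beta, *_ = np.linalg.lstsq(X1, y, rcond=None)  # OLS fit

pred = X1 @ beta                               # explained variation (R^2)
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"coefficients: {beta.round(3)}, R^2 = {r2:.2f}")
```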

  15. Software Quality Control at Belle II

    NASA Astrophysics Data System (ADS)

    Ritter, M.; Kuhr, T.; Hauth, T.; Gebard, T.; Kristof, M.; Pulvermacher, C.; Belle II Software Group

    2017-10-01

    Over the last seven years the software stack of the next-generation B factory experiment Belle II has grown to over one million lines of C++ and Python code, counting only the part included in offline software releases. There are several thousand commits to the central repository by about 100 individual developers per year. Keeping a coherent software stack of sufficiently high quality that it can be sustained and used efficiently for data acquisition, simulation, reconstruction, and analysis over the lifetime of the Belle II experiment is a challenge. A set of tools is employed to monitor the quality of the software and provide fast feedback to the developers. They are integrated in a machinery that is controlled by a buildbot master and automates the quality checks. The tools include different compilers, cppcheck, the clang static analyzer, valgrind memcheck, doxygen, a geometry overlap checker, a check for missing or extra library links, unit tests, steering file level tests, a sophisticated high-level validation suite, and an issue tracker. The technological development infrastructure is complemented by organizational means to coordinate the development.
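
    A minimal sketch of wiring a couple of checkers of the kind named above into one automated gate, assuming cppcheck is installed and C++ sources live under src/; this illustrates the pattern only and is not the Belle II buildbot machinery itself.

```python
import subprocess
import sys

# Run a subset of quality checks and fail the build on any finding.
# Assumes cppcheck is installed and sources/tests live in src/ and tests/.
CHECKS = [
    ["cppcheck", "--enable=warning", "--error-exitcode=1", "src/"],
    [sys.executable, "-m", "pytest", "tests/", "-q"],  # unit tests
]

for cmd in CHECKS:
    print("running:", " ".join(cmd))
    result = subprocess.run(cmd)
    if result.returncode != 0:
        sys.exit(f"quality gate failed: {' '.join(cmd)}")
print("all quality checks passed")
```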

  16. Evaluating Predictive Models of Software Quality

    NASA Astrophysics Data System (ADS)

    Ciaschini, V.; Canaparo, M.; Ronchieri, E.; Salomoni, D.

    2014-06-01

    Applications from the High Energy Physics scientific community are constantly growing and are implemented by a large number of developers. This implies strong churn on the code and an associated risk of faults, which is unavoidable as long as the software undergoes active evolution. However, the necessities of production systems run counter to this. Stability and predictability are of paramount importance; in addition, a short turn-around time for the defect discovery-correction-deployment cycle is required. A way to reconcile these opposite foci is to use a software quality model to obtain an approximation of the risk before releasing a program, so as to only deliver software with a risk lower than an agreed threshold. In this article we evaluated two quality predictive models to identify the operational risk and the quality of some software products. We applied these models to the development history of several EMI packages with the intent of discovering the risk factor of each product and comparing it with its real history. We attempted to determine whether the models reasonably map reality for the applications under evaluation, and we concluded by suggesting directions for further studies.

  17. Software for improving the quality of project management, a case study: international manufacture of electrical equipment

    NASA Astrophysics Data System (ADS)

    Preradović, D. M.; Mićić, Lj S.; Barz, C.

    2017-05-01

    Production conditions in today's world require software support at every stage of production and development of new products, for quality assurance and compliance with ISO standards. In addition to ISO standards as the usual metrics of quality, companies today focus on other optional standards, such as CMMI (Capability Maturity Model Integration), or prescribe their own standards. However, while intensive progress is being made in PM (project management), there is still a significant number of projects, at the global level, that are failures: they have failed to achieve their goals within budget or timeframe. This paper focuses on checking the role of software tools through the rate of success in projects implemented in the case of an international manufacturer of electrical equipment. The results of this research show the level of contribution of the project management software used to manage and develop new products to improving PM processes and PM functions, and how the selection of software tools affects the quality of PM processes and successfully completed projects.

  18. Assessing and Managing Quality of Information Assurance

    DTIC Science & Technology

    2010-11-01

    such as firewalls, antivirus scanning tools and mechanisms for user authentication and authorization. Advanced mission-critical systems often...imply increased risk to DoD information systems. The Process and Organizational Maturity (POM) class focuses on the maturity of the software and...include architectural quality. Common Weakness Enumeration (CWE) is a recent example that highlights the connection between software quality and

  19. Software Engineering Guidebook

    NASA Technical Reports Server (NTRS)

    Connell, John; Wenneson, Greg

    1993-01-01

    The Software Engineering Guidebook describes SEPG (Software Engineering Process Group) supported processes and techniques for engineering quality software in NASA environments. Three process models are supported: structured, object-oriented, and evolutionary rapid-prototyping. The guidebook covers software life-cycles, engineering, assurance, and configuration management. The guidebook is written for managers and engineers who manage, develop, enhance, and/or maintain software under the Computer Software Services Contract.

  20. Software Quality Metrics: A Software Management Monitoring Method for Air Force Logistics Command in Its Software Quality Assurance Program for the Quantitative Assessment of the System Development Life Cycle under Configuration Management.

    DTIC Science & Technology

    1982-03-01

    pilot systems. Magnitude of the mutant error is classified as: o Program does not compute. o Program computes but does not run test data. o Program... Contents include: Test and Integration; The Mapping of SQM to the SDLC; ADS Development. ...and funds. While the test phase concludes the normal development cycle, one should realize that with software the development continues in the

  1. Perspectives of high power ultrasound in food preservation

    NASA Astrophysics Data System (ADS)

    Evelyn; Silva, F. V. M.

    2018-04-01

    High-power ultrasound can be used to alter physicochemical properties and improve the quality of foods during processing due to a number of mechanical, chemical, and biochemical effects arising from acoustic cavitation. Cavitation creates pressure waves that inactivate microbes and de-agglomerate bacterial clusters or release ascospores from fungal asci. Inactivation of bacterial and heat-resistant fungal spores is a great challenge in food preservation due to their ability to survive conventional food processing, causing food-borne diseases or spoilage. In this work, a showcase of the application of high-power ultrasound combined with heat, or thermosonication (TS), to inactivate bacterial spores, i.e. Bacillus cereus spores in beef slurry, and fungal spores, i.e. Neosartorya fischeri ascospores in apple juice, is presented and compared with thermal processing. Faster inactivation was achieved at higher TS (24 kHz, 0.33 W/g or W/mL) temperatures. Around 2 log inactivation was obtained for B. cereus spores after 1 min (70 °C) and for N. fischeri ascospores after 30 min (75 °C). Thermal treatments caused <1 log inactivation in B. cereus after 2 min (70 °C) and no inactivation of N. fischeri ascospores after 30 min (80 °C). In conclusion, temperature plays a significant role in TS spore inactivation, and TS was more effective than thermal treatment alone. The mould spores were more resistant than the bacterial spores.

  2. IEEE Std 730 Software Quality Assurance: Supporting CMMI-DEV v1.3, Product and Process Quality Assurance

    DTIC Science & Technology

    2011-05-27

    Frameworks: CMMI-DEV; IEEE/ISO/IEC 15288/12207 Quality Assurance (©2011 Walz). IEEE life cycle processes and artifacts: systems life cycle processes... TAG to ISO TC 176 Quality Management. Presenter background: quality (ASQ, work experience); software (three books, consulting, work experience); systems (telecom and DoD)... CMMI-DEV and IEEE 730 SQA need to align. The P730 IEEE standards working group has expanded the scope of the SQA process standard to align with IS 12207.

  3. Quality and security - They work together

    NASA Technical Reports Server (NTRS)

    Carr, Richard; Tynan, Marie; Davis, Russell

    1991-01-01

    This paper describes the importance of considering computer security as part of software quality assurance practice. The intended audience is primarily those professionals involved in the design, development, and quality assurance of software. Many issues are raised which point to the need ultimately for integration of quality assurance and computer security disciplines. To address some of the issues raised, the NASA Automated Information Security program is presented as a model which may be used for improving interactions between the quality assurance and computer security community of professionals.

  4. Ontology Based Quality Evaluation for Spatial Data

    NASA Astrophysics Data System (ADS)

    Yılmaz, C.; Cömert, Ç.

    2015-08-01

    Many institutions will be providing data to the National Spatial Data Infrastructure (NSDI). The current technical background of the NSDI is based on syntactic web services; it is expected that this will be replaced by semantic web services. The quality of the data provided is important for the decision-making process and the accuracy of transactions, and therefore the data quality needs to be tested. This topic has been neglected in Turkey. Data quality control for the NSDI may be done by private or public "data accreditation" institutions, and a methodology is required for data quality evaluation. There are studies of data quality, including ISO standards, academic studies, and software to evaluate spatial data quality. The ISO 19157 standard defines the data quality elements. Proprietary software such as 1Spatial's 1Validate and ESRI's Data Reviewer offer quality evaluation based on their own classifications of rules. Commonly, rule-based approaches are used for geospatial data quality checks. In this study, we look for the technical components to devise and implement a rule-based approach with ontologies, using free and open source software, in a semantic web context. The semantic web uses ontologies to deliver well-defined web resources and make them accessible to end-users and processes. We have created an ontology conforming to the geospatial data and defined some sample rules to show how to test data with respect to data quality elements, including attribute, topo-semantic and geometrical consistency, using free and open source software. To test data against the rules, sample GeoSPARQL queries are created and associated with specifications.
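
    As a sketch of a rule-based attribute-consistency check over an ontology, here is how one could run a SPARQL query with the rdflib library; the ontology terms, the rule, and the data file are made up for illustration and are not the authors' ontology or their GeoSPARQL queries.

```python
from rdflib import Graph

# Hypothetical rule: every Road feature must have a positive laneCount.
g = Graph()
g.parse("roads.ttl", format="turtle")   # assumed data file

RULE = """
PREFIX ex: <http://example.org/geo#>
SELECT ?road ?lanes WHERE {
    ?road a ex:Road ;
          ex:laneCount ?lanes .
    FILTER (?lanes <= 0)
}
"""

violations = list(g.query(RULE))
for road, lanes in violations:
    print(f"attribute consistency violation: {road} laneCount={lanes}")
print(f"{len(violations)} violation(s) found")
```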

  5. Software Past, Present, and Future: Views from Government, Industry and Academia

    NASA Technical Reports Server (NTRS)

    Holcomb, Lee; Page, Jerry; Evangelist, Michael

    2000-01-01

    Views from the NASA CIO, presented at the NASA Software Engineering Workshop, on software development past, present, and future. The topics include: 1) Software Past; 2) Software Present; 3) NASA's Largest Software Challenges; 4) 8330 Software Projects in Industry (Standish Group's 1994 Report); 5) Software Future; 6) Capability Maturity Model (CMM): Software Engineering Institute (SEI) levels; 7) System Engineering Quality Also Part of the Problem; 8) University Environment Trends Will Increase the Problem in Software Engineering; and 9) NASA Software Engineering Goals.

  6. Digitized hand-wrist radiographs: comparison of subjective and software-derived image quality at various compression ratios.

    PubMed

    McCord, Layne K; Scarfe, William C; Naylor, Rachel H; Scheetz, James P; Silveira, Anibal; Gillespie, Kevin R

    2007-05-01

    The objectives of this study were to compare the effect of JPEG 2000 compression of hand-wrist radiographs on observers' qualitative assessment of image quality and to compare this with a software-derived quantitative image quality index. Fifteen hand-wrist radiographs were digitized and saved as TIFF and JPEG 2000 images at 4 levels of compression (20:1, 40:1, 60:1, and 80:1). The images, including rereads, were viewed by 13 orthodontic residents who rated image quality on a scale of 1 to 5. A quantitative analysis was also performed by using readily available software based on the human visual system (Image Quality Measure Computer Program, version 6.2, Mitre, Bedford, Mass). ANOVA was used to determine the optimal compression level (P < or = .05). When we compared subjective indexes, JPEG compression greater than 60:1 significantly reduced image quality. When we used quantitative indexes, the JPEG 2000 images had lower quality at all compression ratios compared with the original TIFF images. There was excellent correlation (R2 > 0.92) between the qualitative and quantitative indexes. Image Quality Measure indexes are more sensitive than subjective image quality assessments in quantifying image degradation with compression. There is potential for this software-based quantitative method in determining the optimal compression ratio for any image without the use of subjective raters.

  7. Proceedings of the Eighth Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The four major topics of discussion included: the NASA Software Engineering Laboratory, software testing, human factors in software engineering and software quality assessment. As in the past years, there were 12 position papers presented (3 for each topic) followed by questions and very heavy participation by the general audience.

  8. Understanding Acceptance of Software Metrics--A Developer Perspective

    ERIC Educational Resources Information Center

    Umarji, Medha

    2009-01-01

    Software metrics are measures of software products and processes. Metrics are widely used by software organizations to help manage projects, improve product quality and increase efficiency of the software development process. However, metrics programs tend to have a high failure rate in organizations, and developer pushback is one of the sources…

  9. 78 FR 25482 - Notice of Revised Determination on Reconsideration

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-01

    ... PSCI-Progressive Software Computing, Quality Testing Services, Inc., Railroad Construction Co. of South Jersey, Inc., ..., LP, ... Anderson Construction Services, Baker Petrolite, BakerCorp, Bell-Fast Fire Protection Inc., Bolttech Inc., ...

  10. Automated Reuse of Scientific Subroutine Libraries through Deductive Synthesis

    NASA Technical Reports Server (NTRS)

    Lowry, Michael R.; Pressburger, Thomas; VanBaalen, Jeffrey; Roach, Steven

    1997-01-01

    Systematic software construction offers the potential of elevating software engineering from an art form to an engineering discipline. The desired result is more predictable software development leading to better quality and more maintainable software. However, the overhead costs associated with the formalisms, mathematics, and methods of systematic software construction have largely precluded their adoption in real-world software development. In fact, many mainstream software development organizations, such as Microsoft, still maintain a predominantly oral culture for software development projects, which is far removed from a formalism-based culture for software development. An exception is the limited domain of safety-critical software, where the high assurance inherent in systematic software construction justifies the additional cost. We believe that systematic software construction will only be adopted by mainstream software development organizations when the overhead costs have been greatly reduced. Two approaches to cost mitigation are reuse (amortizing costs over many applications) and automation. For the last four years, NASA Ames has funded the Amphion project, whose objective is to automate software reuse through techniques from systematic software construction. In particular, deductive program synthesis (i.e., program extraction from proofs) is used to derive a composition of software components (e.g., subroutines) that correctly implements a specification. The construction of reuse libraries of software components is the standard software engineering solution for improving software development productivity and quality.
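
    A toy sketch of the composition idea: given components annotated with input and output types, search for a chain that transforms a specification's input type into its output type. This is a drastic simplification of deductive synthesis, and the component names are made up for illustration.

```python
from collections import deque

# Hypothetical component library: (name, input type, output type).
LIBRARY = [
    ("parse_ephemeris", "file", "ephemeris"),
    ("to_state_vector", "ephemeris", "state"),
    ("state_to_ground_track", "state", "track"),
]

def synthesize(src_type, dst_type):
    """Breadth-first search for a component chain src_type -> dst_type."""
    queue = deque([(src_type, [])])
    seen = {src_type}
    while queue:
        t, chain = queue.popleft()
        if t == dst_type:
            return chain
        for name, t_in, t_out in LIBRARY:
            if t_in == t and t_out not in seen:
                seen.add(t_out)
                queue.append((t_out, chain + [name]))
    return None

# -> ['parse_ephemeris', 'to_state_vector', 'state_to_ground_track']
print(synthesize("file", "track"))
```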

  11. Data quality can make or break a research infrastructure

    NASA Astrophysics Data System (ADS)

    Pastorello, G.; Gunter, D.; Chu, H.; Christianson, D. S.; Trotta, C.; Canfora, E.; Faybishenko, B.; Cheah, Y. W.; Beekwilder, N.; Chan, S.; Dengel, S.; Keenan, T. F.; O'Brien, F.; Elbashandy, A.; Poindexter, C.; Humphrey, M.; Papale, D.; Agarwal, D.

    2017-12-01

    Research infrastructures (RIs) commonly support observational data provided by multiple, independent sources. Uniformity in the data distributed by such RIs is important in most applications, e.g., in comparative studies using data from two or more sources. Achieving uniformity in terms of data quality is challenging, especially considering that many data issues are unpredictable and cannot be detected until a first occurrence of the issue. With that, many data quality control activities within RIs require a manual, human-in-the-loop element, making quality control an expensive activity. Our motivating example is the FLUXNET2015 dataset - a collection of ecosystem-level carbon, water, and energy fluxes between land and atmosphere from over 200 sites around the world, some sites with over 20 years of data. About 90% of the human effort to create the dataset was spent in data quality related activities. Based on this experience, we have been working on solutions to increase the automation of data quality control procedures. Since it is nearly impossible to fully automate all quality related checks, we have been drawing from the experience with techniques used in software development, which shares a few common constraints. In both managing scientific data and writing software, human time is a precious resource; code bases, like science datasets, can be large, complex, and full of errors; both scientific and software endeavors can be pursued by individuals, but collaborative teams can accomplish a lot more. The lucrative and fast-paced nature of the software industry fueled the creation of methods and tools to increase automation and productivity within these constraints. Issue tracking systems, methods for translating problems into automated tests, and powerful version control tools are a few examples. Terrestrial and aquatic ecosystems research relies heavily on many types of observational data. As volumes of data collection increase, ensuring data quality is becoming an unwieldy challenge for RIs. Business-as-usual approaches to data quality do not work with larger data volumes. We believe RIs can benefit greatly from adapting and imitating this body of theory and practice from software quality into data quality, enabling systematic and reproducible safeguards against errors and mistakes in datasets as much as in software.
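
    One concrete borrowing from software practice is expressing recurring data checks as automated tests. A minimal pytest-style sketch follows; the column names, file name, and physical bounds are invented for illustration and are not FLUXNET2015 specifics.

```python
# Minimal sketch: data quality rules written as automated tests,
# runnable with pytest. Column names and bounds are assumptions.
import pandas as pd

def load_site_data(path="site_fluxes.csv"):
    return pd.read_csv(path, parse_dates=["timestamp"])

def test_no_duplicate_timestamps():
    df = load_site_data()
    assert not df["timestamp"].duplicated().any()

def test_air_temperature_in_physical_range():
    df = load_site_data()
    assert df["air_temp_c"].between(-60, 60).all()

def test_timestamps_monotonic():
    df = load_site_data()
    assert df["timestamp"].is_monotonic_increasing
```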

  12. An Investigation of Techniques for Detecting Data Anomalies in Earned Value Management Data

    DTIC Science & Technology

    2011-12-01

    Data quality tools surveyed include: Harte-Hanks Trillium Software System; IBM InfoSphere Foundation Tools; Informatica Data Explorer, Analyst, Developer, and Administrator; Pitney Bowes Business Insight Spectrum; SAP BusinessObjects Data Quality Management; DataFlux... documenting quality monitoring efforts and tracking data quality improvements. Informatica: http://www.informatica.com/products_services/Pages/index.aspx

  13. Software Architecture Evolution

    ERIC Educational Resources Information Center

    Barnes, Jeffrey M.

    2013-01-01

    Many software systems eventually undergo changes to their basic architectural structure. Such changes may be prompted by new feature requests, new quality attribute requirements, changing technology, or other reasons. Whatever the causes, architecture evolution is commonplace in real-world software projects. Today's software architects, however,…

  14. Speciation in the large-spored Metschnikowia clade and establishment of a new species, Metschnikowia borealis comb. nov.

    PubMed

    Marinoni, Gaëlle; Lachance, Marc-André

    2004-03-01

    The reproductive boundaries among species in the large-spored Metschnikowia clade were studied by prototrophic recombinant selection, electrophoretic karyotyping, mitochondrial DNA restriction analysis, and DNA sequence analysis. Inviable ascospores arose from crosses between the two varieties of Metschnikowia continentalis, indicating that they should be recognized as separate species. Prototrophic recombinants were recovered from crosses between auxotrophic mutants of Metschnikowia borealis, M. continentalis, Metschnikowia lochheadii, Metschnikowia sp. UWO(PS)00-154.1, and Candida ipomoeae, showing that some genetic exchange is possible in spite of the sterility of the asci formed in interspecific crosses. Metschnikowia hawaiiensis, although capable of ascus formation when its h(-) mating type is crossed with the h(+) mating type of the other species, did not give rise to recombinants. In the other species, some recombinants acquired the ability to form asci directly from single cells. These often contained the chromosomes of both parents, suggesting formation of allodiploid hybrids. Other recombinants behaved as haploids and were similar to one parent except for having inherited the selectable wild-type allele from the other parent. In most, but not all cases, inheritance of the mitochondrial genome was uniparental and correlated with the inheritance of the nuclear chromosome complement. In some cases, what appeared to be a recombinant mitochondrial genome was observed. Phylogenies derived from the sequences of various DNA regions were not congruent, indicating that hybridization may have taken place in nature as the large-spored species diverged from their common ancestor. Further evidence that C. ipomoeae arose from a natural recombination event was obtained, but a pair of Metschnikowia species that might represent derived forms of the parents could not be identified conclusively. C. ipomoeae and most of its closely related Metschnikowia species contained a group-II intron in the mitochondrial small-subunit ribosomal gene. The intron was absent in M. borealis, M. hawaiiensis, and other species in the genus Metschnikowia.

  15. UrQt: an efficient software for the Unsupervised Quality trimming of NGS data.

    PubMed

    Modolo, Laurent; Lerat, Emmanuelle

    2015-04-29

    Quality control is a necessary step of any Next Generation Sequencing analysis. Although customary, this step still requires manual interventions to empirically choose tuning parameters according to various quality statistics. Moreover, current quality control procedures that provide a "good quality" data set are not optimal and discard many informative nucleotides. To address these drawbacks, we present a new quality control method, implemented in the UrQt software, for Unsupervised Quality trimming of Next Generation Sequencing reads. Our trimming procedure relies on a well-defined probabilistic framework to detect the best segmentation between two segments of unreliable nucleotides framing a segment of informative nucleotides. Our software only requires one user-friendly parameter to define the minimal quality threshold (phred score) for a nucleotide to be considered informative, which is independent of both the experiment and the quality of the data. This procedure is implemented in C++ as efficient, parallelized software with a low memory footprint. We tested the performance of UrQt against the best-known trimming programs on seven RNA and DNA sequencing experiments and demonstrated its optimality in the resulting tradeoff between the number of trimmed nucleotides and the quality objective. By finding the best segmentation to delimit a segment of good-quality nucleotides, UrQt greatly increases the number of reads and of nucleotides that can be retained for a given quality objective. UrQt source files, binary executables for different operating systems and documentation are freely available (under the GPLv3) at the following address: https://lbbe.univ-lyon1.fr/-UrQt-.html .
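
    The segmentation idea can be approximated with a classic running-sum trim: score each position as (q - threshold) and keep the contiguous window with the maximal sum. The sketch below is that simple stand-in, not UrQt's actual probabilistic estimator.

```python
# Minimal sketch: keep the contiguous segment maximizing sum(q - threshold),
# a simple stand-in for UrQt's probabilistic segmentation.
def trim_read(seq, quals, threshold=20):
    best_sum = best_start = best_end = 0
    run_sum, run_start = 0, 0
    for i, q in enumerate(quals):
        run_sum += q - threshold
        if run_sum <= 0:
            run_sum, run_start = 0, i + 1   # restart the window
        elif run_sum > best_sum:
            best_sum, best_start, best_end = run_sum, run_start, i + 1
    return seq[best_start:best_end]

read  = "ACGTACGTACGT"
quals = [5, 8, 30, 32, 35, 34, 31, 30, 9, 6, 4, 3]
print(trim_read(read, quals))   # -> 'GTACGT' (the high-quality core)
```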

  16. The Implications of Using Integrated Software Support Environment for Design of Guidance and Control Systems Software

    DTIC Science & Technology

    1990-02-01

    Inspections are performed before each formal review of each software life cycle phase. Required software audits are performed. The software is acceptable... Audits: software audits are performed by SQA consistent with the general audit rules, and an audit report is prepared. Software Quality Inspection (SQI)... DSD Software Development Method. Definition of acronyms: MACH, Methode d'Analyse et de Conception Hierarchisee.

  17. An experience of qualified preventive screening: shiraz smart screening software.

    PubMed

    Islami Parkoohi, Parisa; Zare, Hashem; Abdollahifard, Gholamreza

    2015-01-01

    Computerized preventive screening software is a cost-effective intervention tool to address non-communicable chronic diseases. Shiraz Smart Screening Software (SSSS) was developed as an innovative tool for qualified screening. It allows simultaneous smart screening of several high-burden chronic diseases and supports reminder notification functionality. The extent to which SSSS affects screening quality is also described. Following software development, preventive screening and annual health examinations of 261 school staff (Medical School of Shiraz, Iran) were carried out in a software-assisted manner. To evaluate the quality of the software-assisted screening, we used a quasi-experimental study design and determined coverage, irregular attendance and inappropriateness proportions in relation to the manual and software-assisted screening, as well as the corresponding number of requested tests. With the manual screening method, 27% of employees were covered (with 94% irregular attendance), while with software-assisted screening the coverage proportion was 79% (attendance status will become clear after the specified time). The frequency of inappropriate screening test requests, before the software implementation, was 41.37% for fasting plasma glucose, 41.37% for lipid profile, 0.84% for occult blood, 0.19% for flexible sigmoidoscopy/colonoscopy, 35.29% for Pap smear, 19.20% for mammography and 11.2% for prostate-specific antigen. All of the above were corrected by the software application. In total, 366 manual screening and 334 software-assisted screening tests were requested. SSSS is an innovative tool to improve the quality of preventive screening plans in terms of increased screening coverage and reductions in inappropriateness and in the total number of requested tests.

  18. Effective Software Engineering Leadership for Development Programs

    ERIC Educational Resources Information Center

    Cagle West, Marsha

    2010-01-01

    Software is a critical component of systems ranging from simple consumer appliances to complex health, nuclear, and flight control systems. The development of quality, reliable, and effective software solutions requires the incorporation of effective software engineering processes and leadership. Processes, approaches, and methodologies for…

  19. Quality assurance software inspections at NASA Ames: Metrics for feedback and modification

    NASA Technical Reports Server (NTRS)

    Wenneson, G.

    1985-01-01

    Software inspections--a set of formal technical review procedures held at selected key points during software development in order to find defects in software documents--are described in terms of history, participants, tools, procedures, statistics, and database analysis.

  20. GenePRIMP: A software quality control tool

    ScienceCinema

    Amrita Pati

    2017-12-09

    Amrita Pati of the DOE Joint Genome Institute's Genome Biology group describes the software tool GenePRIMP and how it fits into the quality control pipeline for microbial genomics. Further details regarding GenePRIMP appear in a paper published online May 2, 2010 in Nature Methods.

  1. State of the art metrics for aspect oriented programming

    NASA Astrophysics Data System (ADS)

    Ghareb, Mazen Ismaeel; Allen, Gary

    2018-04-01

    The quality evaluation of software, e.g., defect measurement, gains significance as the use of software applications grows. Metric measurements are considered the primary indicator for defect prediction and software maintenance in various empirical studies of software products. However, there is no agreement on which metrics are compelling quality indicators for novel development approaches such as Aspect Oriented Programming (AOP). AOP intends to enhance programming quality by providing new and novel constructs for the development of systems, for example pointcuts, advice and inter-type declarations. Hence, it is not evident whether quality indicators for AOP can be derived from direct extensions of traditional OO measurements. On the other hand, investigations of AOP do regularly depend on established coupling measurements. Notwithstanding the recent adoption of AOP in empirical studies, coupling measurements have been adopted as useful markers of fault proneness in this context. In this paper we investigate the state-of-the-art metrics for the measurement of Aspect Oriented systems development.

  2. A proposed classification scheme for Ada-based software products

    NASA Technical Reports Server (NTRS)

    Cernosek, Gary J.

    1986-01-01

    As the requirements for producing software in the Ada language become a reality for projects such as the Space Station, a great amount of Ada-based program code will begin to emerge. Recognizing that Ada programs may vary in quality, what is needed is a classification scheme that describes the quality of a software product whose source code exists in Ada form. A 5-level classification scheme is proposed that attempts to decompose the potentially broad spectrum of quality which Ada programs may possess. The number of classes and their corresponding names are not as important as the mere fact that there needs to be some set of criteria from which to evaluate programs existing in Ada. Exact criteria for each class are not presented, nor are any detailed suggestions of how to effectively implement this quality assessment. The idea of Ada-based software classification is introduced and a set of requirements on which to base further research and development is suggested.

  3. Web Implementation of Quality Assurance (QA) for X-ray Units in Balkanic Medical Institutions.

    PubMed

    Urošević, Vlade; Ristić, Olga; Milošević, Danijela; Košutić, Duško

    2015-08-01

    Diagnostic radiology is the major contributor to the total dose of the population from all artificial sources. In order to reduce radiation exposure and optimize diagnostic x-ray image quality, it is necessary to increase the quality and efficiency of quality assurance (QA) and audit programs. This work presents a web application providing completely new QA solutions for x-ray modalities and facilities. The software gives complete online information (using European standards) with which the corresponding institutions and individuals can evaluate and control a facility's Radiation Safety and QA program. The software enables storage of all data in one place and sharing the same information (data), regardless of whether the measured data is used by an individual user or by an authorized institution. The software overcomes the distance and time separation of institutions and individuals who take part in QA. Upgrading the software will enable assessment of the medical exposure level to ionizing radiation.

  4. Third-Party Software's Trust Quagmire.

    PubMed

    Voas, J; Hurlburt, G

    2015-12-01

    Current software development has trended toward the idea of integrating independent software sub-functions to create more complete software systems. Software sub-functions are often not homegrown; instead they are developed by unknown 3rd-party organizations and reside in software marketplaces owned or controlled by others. Such software sub-functions raise plausible concerns about quality, origins, functionality, security, and interoperability, to name a few. This article surveys key technical difficulties in confidently building systems from acquired software sub-functions by calling out the principal software supply chain actors.

  5. Software metrics: The key to quality software on the NCC project

    NASA Technical Reports Server (NTRS)

    Burns, Patricia J.

    1993-01-01

    Network Control Center (NCC) Project metrics are captured during the implementation and testing phases of the NCCDS software development lifecycle. The metrics data collection and reporting function has interfaces with all elements of the NCC project. Close collaboration with all project elements has resulted in the development of a defined and repeatable set of metrics processes. The resulting data are used to plan and monitor release activities on a weekly basis. The use of graphical outputs facilitates the interpretation of progress and status. The successful application of metrics throughout the NCC project has been instrumental in the delivery of quality software. The use of metrics on the NCC Project supports the needs of the technical and managerial staff. This paper describes the project, the functions supported by metrics, the data that are collected and reported, how the data are used, and the improvements in the quality of deliverable software since the metrics processes and products have been in use.

  6. Software reliability models for fault-tolerant avionics computers and related topics

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1987-01-01

    Software reliability research is briefly described. General research topics are reliability growth models, quality of software reliability prediction, the complete monotonicity property of reliability growth, conceptual modelling of software failure behavior, assurance of ultrahigh reliability, and analysis techniques for fault-tolerant systems.
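
    As one concrete example of the kind of reliability growth model studied in this literature, the Goel-Okumoto model gives the expected cumulative number of failures by time t; the choice of model here is illustrative, since the abstract does not name a specific one.

```latex
% Goel-Okumoto NHPP reliability growth model (illustrative example).
% m(t): expected cumulative failures by time t
% a: expected total number of faults; b: per-fault detection rate
\[
  m(t) = a\left(1 - e^{-bt}\right), \qquad
  \lambda(t) = \frac{dm}{dt} = a\,b\,e^{-bt}
\]
```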

  7. Quantitative evaluation of software packages for single-molecule localization microscopy.

    PubMed

    Sage, Daniel; Kirshner, Hagai; Pengo, Thomas; Stuurman, Nico; Min, Junhong; Manley, Suliana; Unser, Michael

    2015-08-01

    The quality of super-resolution images obtained by single-molecule localization microscopy (SMLM) depends largely on the software used to detect and accurately localize point sources. In this work, we focus on the computational aspects of super-resolution microscopy and present a comprehensive evaluation of localization software packages. Our philosophy is to evaluate each package as a whole, thus maintaining the integrity of the software. We prepared synthetic data that represent three-dimensional structures modeled after biological components, taking excitation parameters, noise sources, point-spread functions and pixelation into account. We then asked developers to run their software on our data; most responded favorably, allowing us to present a broad picture of the methods available. We evaluated their results using quantitative and user-interpretable criteria: detection rate, accuracy, quality of image reconstruction, resolution, software usability and computational resources. These metrics reflect the various tradeoffs of SMLM software packages and help users to choose the software that fits their needs.
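
    A minimal sketch of one of the named criteria, detection rate, computed by greedy nearest-neighbor matching of localizations to ground truth within a tolerance radius; the tolerance value and coordinates are illustrative, not the benchmark's exact protocol.

```python
import numpy as np

def detection_rate(truth, found, tol=50.0):
    """Fraction of ground-truth emitters matched by a localization
    within `tol` nanometers (greedy one-to-one matching)."""
    available = list(range(len(found)))
    matched = 0
    for t in truth:
        if not available:
            break
        d = np.linalg.norm(found[available] - t, axis=1)
        j = int(np.argmin(d))
        if d[j] <= tol:
            matched += 1
            available.pop(j)
    return matched / len(truth)

# Hypothetical emitter positions (nm): two matches, one miss.
truth = np.array([[100.0, 100.0], [300.0, 250.0], [500.0, 480.0]])
found = np.array([[104.0, 98.0], [305.0, 260.0], [900.0, 900.0]])
print(f"detection rate = {detection_rate(truth, found):.2f}")  # 0.67
```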

  8. The software product assurance metrics study: JPL's software systems quality and productivity

    NASA Technical Reports Server (NTRS)

    Bush, Marilyn W.

    1989-01-01

    The findings are reported of the Jet Propulsion Laboratory (JPL)/Software Product Assurance (SPA) Metrics Study, conducted as part of a larger JPL effort to improve software quality and productivity. Until recently, no comprehensive data had been assembled on how JPL manages and develops software-intensive systems. The first objective was to collect data on software development from as many projects and for as many years as possible. Results from five projects are discussed. These results reflect 15 years of JPL software development, representing over 100 data points (systems and subsystems), over a third of a billion dollars, over four million lines of code and 28,000 person months. Analysis of this data provides a benchmark for gauging the effectiveness of past, present and future software development work. In addition, the study is meant to encourage projects to record existing metrics data and to gather future data. The SPA long term goal is to integrate the collection of historical data and ongoing project data with future project estimations.

  9. [The development and evaluation of software to verify diagnostic accuracy].

    PubMed

    Jensen, Rodrigo; de Moraes Lopes, Maria Helena Baena; Silveira, Paulo Sérgio Panse; Ortega, Neli Regina Siqueira

    2012-02-01

This article describes the development and evaluation of software that verifies the accuracy of diagnoses made by nursing students. The software is based on a model that applies fuzzy logic concepts; it was implemented in PERL with a MySQL database for Internet accessibility, using the NANDA-I 2007-2008 classification system. The software was evaluated in terms of its technical quality and usability through specific instruments. The activity proposed in the software involves four stages in which students establish the relationship values between nursing diagnoses, defining characteristics/risk factors and clinical cases. The relationship values determined by students are compared to those of specialists, generating performance scores for the students. In the evaluation, the software demonstrated satisfactory outcomes regarding technical quality and, according to the students, supported their learning; it may become an educational tool for teaching the process of nursing diagnosis.
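
    A minimal sketch of the scoring idea described above, comparing a student's relationship values against specialist values. The distance measure, the 0-100 scale, and the diagnosis/characteristic labels are illustrative assumptions, not the published fuzzy model.

    ```python
    # Score a student's relationship values against specialist values.
    def performance_score(student, expert):
        """Both arguments: dict mapping (diagnosis, characteristic) -> value in [0, 1]."""
        keys = expert.keys()
        # mean absolute deviation from the specialists, mapped to a 0-100 score
        mad = sum(abs(student.get(k, 0.0) - expert[k]) for k in keys) / len(keys)
        return 100.0 * (1.0 - mad)

    expert = {("fluid volume deficit", "dry mucous membranes"): 0.9,
              ("fluid volume deficit", "fatigue"): 0.4}
    student = {("fluid volume deficit", "dry mucous membranes"): 0.7,
               ("fluid volume deficit", "fatigue"): 0.5}
    print(performance_score(student, expert))  # 85.0
    ```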

  10. A Quantitative Analysis of Open Source Software's Acceptability as Production-Quality Code

    ERIC Educational Resources Information Center

    Fischer, Michael

    2011-01-01

    The difficulty in writing defect-free software has been long acknowledged both by academia and industry. A constant battle occurs as developers seek to craft software that works within aggressive business schedules and deadlines. Many tools and techniques are used in attempt to manage these software projects. Software metrics are a tool that has…

  11. A Heuristic for Improving Legacy Software Quality during Maintenance: An Empirical Case Study

    ERIC Educational Resources Information Center

    Sale, Michael John

    2017-01-01

    Many organizations depend on the functionality of mission-critical legacy software and the continued maintenance of this software is vital. Legacy software is defined here as software that contains no testing suite, is often foreign to the developer performing the maintenance, lacks meaningful documentation, and over time, has become difficult to…

  12. Practical Methods for Estimating Software Systems Fault Content and Location

    NASA Technical Reports Server (NTRS)

    Nikora, A.; Schneidewind, N.; Munson, J.

    1999-01-01

    Over the past several years, we have developed techniques to discriminate between fault-prone software modules and those that are not, to estimate a software system's residual fault content, to identify those portions of a software system having the highest estimated number of faults, and to estimate the effects of requirements changes on software quality.

  13. Status of Simulations for the Cyclotron Laboratory at the Institute for Nuclear Research and Nuclear Energy

    NASA Astrophysics Data System (ADS)

    Asova, G.; Goutev, N.; Tonev, D.; Artinyan, A.

    2018-05-01

The Institute for Nuclear Research and Nuclear Energy is preparing to operate a high-power cyclotron for the production of radioisotopes for nuclear medicine and for research in radiochemistry, radiobiology, nuclear physics and solid state physics. The cyclotron is a TR24 produced by ASCI, Canada, capable of delivering proton beams in the energy range of 15 to 24 MeV with currents as high as 400 µA; multiple extraction lines can be fed. The primary goal of the project is the production of PET and SPECT isotopes such as 18F, 67,68Ga, 99mTc, etc. This contribution reports the status of the project. Design considerations for the cyclotron vault are discussed for some of the target radioisotopes.

  14. Saturnispora bothae sp. nov., isolated from rotting wood.

    PubMed

    Morais, Camila G; Lara, Carla A; Borelli, Beatriz M; Cadete, Raquel M; Moreira, Juliana D; Lachance, Marc-André; Rosa, Carlos A

    2016-10-01

Two strains representing a novel species of the genus Saturnispora were isolated from rotting wood samples collected in an Atlantic Rainforest site in Brazil. Analyses of the sequences of the D1/D2 domains of the rRNA gene showed that this novel species belongs to a subclade in the Saturnispora clade formed by Saturnispora sanitii, Saturnispora sekii, Saturnispora silvae and Saturnispora suwanaritii. The novel species differed in D1/D2 sequences by 60 or more nucleotide substitutions from these species. The strains produced asci with one to four hemispherical ascospores. The novel species Saturnispora bothae sp. nov. is proposed to accommodate these isolates. The type strain is UFMG-CM-Y292T (=CBS 13484T). The MycoBank number is MB 817127.

  15. Earth Observing System (EOS)/Advanced Microwave Sounding Unit-A (AMSU-A) software assurance plan

    NASA Technical Reports Server (NTRS)

    Schwantje, Robert; Smith, Claude

    1994-01-01

This document defines the responsibilities of Software Quality Assurance (SQA) for the development of the flight software installed in EOS/AMSU-A instruments, and of the ground support software used in the test and integration of the EOS/AMSU-A instruments.

  16. Process air quality data

    NASA Technical Reports Server (NTRS)

    Butler, C. M.; Hogge, J. E.

    1978-01-01

    Air quality sampling was conducted. Data for air quality parameters, recorded on written forms, punched cards or magnetic tape, are available for 1972 through 1975. Computer software was developed to (1) calculate several daily statistical measures of location, (2) plot time histories of data or the calculated daily statistics, (3) calculate simple correlation coefficients, and (4) plot scatter diagrams. Computer software was developed for processing air quality data to include time series analysis and goodness of fit tests. Computer software was developed to (1) calculate a larger number of daily statistical measures of location, and a number of daily monthly and yearly measures of location, dispersion, skewness and kurtosis, (2) decompose the extended time series model and (3) perform some goodness of fit tests. The computer program is described, documented and illustrated by examples. Recommendations are made for continuation of the development of research on processing air quality data.
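
    As a concrete illustration of the kind of processing described (daily statistical measures of location and dispersion, simple correlations, time histories), here is a small sketch using present-day tooling rather than the original 1970s programs; the pollutant names and synthetic values are invented.

    ```python
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(0)
    idx = pd.date_range("1973-01-01", periods=96, freq="h")   # hourly samples
    df = pd.DataFrame({"ozone": rng.normal(40, 8, len(idx)),
                       "no2": rng.normal(25, 5, len(idx))}, index=idx)

    # daily measures of location, dispersion and skewness
    daily = df.groupby(df.index.date).agg(["mean", "median", "std", "skew"])
    print(daily)

    # simple correlation coefficient between two parameters
    print("r =", df["ozone"].corr(df["no2"]))
    # df.resample("D").mean().plot() would give the daily time-history plot
    ```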

  17. Constraint-Driven Software Design: An Escape from the Waterfall Model.

    ERIC Educational Resources Information Center

    de Hoog, Robert; And Others

    1994-01-01

    Presents the principles of a development methodology for software design based on a nonlinear, product-driven approach that integrates quality aspects. Two examples are given to show that the flexibility needed for building high quality systems leads to integrated development environments in which methodology, product, and tools are closely…

  18. Quality Assurance Testing of Version 1.3 of U.S. EPA Benchmark Dose Software (Presentation)

    EPA Science Inventory

EPA benchmark dose software (BMDS) is used to evaluate chemical dose-response data in support of Agency risk assessments, and must therefore be dependable. Quality assurance testing methods developed for BMDS were designed to assess model dependability with respect to curve-fitt...

  19. Four Pillars for Improving the Quality of Safety-Critical Software-Reliant Systems

    DTIC Science & Technology

    2013-04-01

Studies of safety-critical software-reliant systems developed using the current build-then-test practices show that requirements and architecture ... design defects make up approximately 70% of all defects, many of them system-level and related to operational quality attributes, and 80% of these defects are...

  20. Evaluation of features to support safety and quality in general practice clinical software

    PubMed Central

    2011-01-01

    Background Electronic prescribing is now the norm in many countries. We wished to find out if clinical software systems used by general practitioners in Australia include features (functional capabilities and other characteristics) that facilitate improved patient safety and care, with a focus on quality use of medicines. Methods Seven clinical software systems used in general practice were evaluated. Fifty software features that were previously rated as likely to have a high impact on safety and/or quality of care in general practice were tested and are reported here. Results The range of results for the implementation of 50 features across the 7 clinical software systems was as follows: 17-31 features (34-62%) were fully implemented, 9-13 (18-26%) partially implemented, and 9-20 (18-40%) not implemented. Key findings included: Access to evidence based drug and therapeutic information was limited. Decision support for prescribing was available but varied markedly between systems. During prescribing there was potential for medicine mis-selection in some systems, and linking a medicine with its indication was optional. The definition of 'current medicines' versus 'past medicines' was not always clear. There were limited resources for patients, and some medicines lists for patients were suboptimal. Results were provided to the software vendors, who were keen to improve their systems. Conclusions The clinical systems tested lack some of the features expected to support patient safety and quality of care. Standards and certification for clinical software would ensure that safety features are present and that there is a minimum level of clinical functionality that clinicians could expect to find in any system.

  1. Designing Educational Software for Tomorrow.

    ERIC Educational Resources Information Center

    Harvey, Wayne

    Designed to address the management and use of computer software in education and training, this paper explores both good and poor software design, calling for improvements in the quality of educational software by attending to design considerations that are based on general principles of learning rather than specific educational objectives. This…

  2. On Quality and Measures in Software Engineering

    ERIC Educational Resources Information Center

    Bucur, Ion I.

    2006-01-01

    Complexity measures are mainly used to estimate vital information about reliability and maintainability of software systems from regular analysis of the source code. Such measures also provide constant feedback during a software project to assist the control of the development procedure. There exist several models to classify a software product's…

  3. ClassCompass: A Software Design Mentoring System

    ERIC Educational Resources Information Center

    Coelho, Wesley; Murphy, Gail

    2007-01-01

    Becoming a quality software developer requires practice under the guidance of an expert mentor. Unfortunately, in most academic environments, there are not enough experts to provide any significant design mentoring for software engineering students. To address this problem, we present a collaborative software design tool intended to maximize an…

  4. Software Assurance: Five Essential Considerations for Acquisition Officials

    DTIC Science & Technology

    2007-05-01

• Address security concerns in the software development life cycle (SDLC)? • Are there formal software quality... What threat modeling process, if any, is used when designing the software? What analysis, design, and construction tools are used by your software design... commercial off-the-shelf (COTS), government off-the-shelf (GOTS), open-source, embedded, and legacy software. Attackers exploit unintentional vulnerabilities or...

  5. Continuous integration and quality control for scientific software

    NASA Astrophysics Data System (ADS)

    Neidhardt, A.; Ettl, M.; Brisken, W.; Dassing, R.

    2013-08-01

Modern software has to be stable, portable, fast and reliable. This is becoming more and more important for scientific software as well. But it requires a sophisticated way to inspect, check and evaluate the quality of source code with a suitable, automated infrastructure. A centralized server with a software repository and a version control system is one essential part, to manage the code base and to control the different development versions. While each project can be compiled separately, the whole code base can also be compiled with one central “Makefile”. This is used to create automated, nightly builds. Additionally, all sources are inspected automatically with static code analysis and inspection tools, which check well-known error situations, memory and resource leaks, performance issues, or style issues. In combination with an automatic documentation generator it is possible to create the developer documentation directly from the code and the inline comments. All reports and generated information are presented as HTML pages on a Web server. Because this environment increased the stability and quality of the software of the Geodetic Observatory Wettzell tremendously, it is now also available for scientific communities. One regular customer is already the developer group of the DiFX software correlator project.

  6. Factors in Software Quality. Volume I. Concepts and Definitions of Software Quality

    DTIC Science & Technology

    1977-11-01

FLEXIBILITY, COMPLEXITY, EXPANDABILITY, PRECISION, DOCUMENTATION, TOLERANCE, REPAIRABILITY, COMPATIBILITY, SERVICEABILITY ... applications. Several standard documents are required by DOD/AF regulations. The following references were used to compile the range of documents... documents are specified by the AF regulations or SPO-local regulations listed above. Each of the document types for a long life/high cost software...

  7. Space Shuttle Program Primary Avionics Software System (PASS) Success Legacy - Quality and Reliability Date

    NASA Technical Reports Server (NTRS)

    Orr, James K.; Peltier, Daryl

    2010-01-01

This slide presentation reviews the avionics software system on board the space shuttle, with particular emphasis on quality and reliability. The Primary Avionics Software System (PASS) provides automatic and fly-by-wire control of critical shuttle systems and executes in redundant computers. Charts show the number of space shuttle flights versus time, PASS's development history, and other indicators of the reliability of the system's development. The reliability of the system is also compared to predicted reliability.

  8. A tool to include gamma analysis software into a quality assurance program.

    PubMed

    Agnew, Christina E; McGarry, Conor K

    2016-03-01

To provide a tool to enable gamma analysis software algorithms to be included in a quality assurance (QA) program. Four image sets were created, comprising two geometric images to independently test the distance to agreement (DTA) and dose difference (DD) elements of the gamma algorithm, a clinical step-and-shoot IMRT field and a clinical VMAT arc. The images were analysed using global and local gamma analysis with 2 in-house and 8 commercially available software packages encompassing 15 software versions. The effect of image resolution on gamma pass rates was also investigated. All but one software package accurately calculated the gamma passing rate for the geometric images. Variation in global gamma passing rates of 1% at 3%/3mm and over 2% at 1%/1mm was measured between software packages and software versions with analysis of appropriately sampled images. This study provides a suite of test images and the gamma pass rates achieved for a selection of commercially available software. This image suite will enable validation of gamma analysis software within a QA program and provide a frame of reference by which to compare results reported in the literature from various manufacturers and software versions. Copyright © 2015. Published by Elsevier Ireland Ltd.
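
    The DTA/DD combination these test images probe is the standard gamma index: for each reference point, take the minimum over evaluated points of the combined distance and dose-difference penalty, and pass if that minimum is at most 1. Below is a brute-force sketch of a global gamma pass-rate calculation, assuming square pixels and a synthetic dose pair; it illustrates the algorithm under test, not any of the evaluated packages.

    ```python
    import numpy as np

    def gamma_pass_rate(ref, ev, spacing=1.0, dd=0.03, dta=3.0):
        """Global gamma: dd as a fraction of max(ref), dta in mm, spacing in mm/pixel."""
        ys, xs = np.indices(ref.shape)
        coords = np.stack([ys.ravel(), xs.ravel()], axis=1) * spacing
        ev_flat = ev.ravel()
        dd_abs = dd * ref.max()                        # global dose normalization
        gammas = np.empty(ref.size)
        for i, (p, d_ref) in enumerate(zip(coords, ref.ravel())):
            dist2 = np.sum((coords - p) ** 2, axis=1)  # squared distances (mm^2)
            dose2 = (ev_flat - d_ref) ** 2             # squared dose differences
            gammas[i] = np.sqrt(np.min(dist2 / dta**2 + dose2 / dd_abs**2))
        return float(np.mean(gammas <= 1.0))

    ref = np.fromfunction(
        lambda y, x: 100 * np.exp(-((x - 16)**2 + (y - 16)**2) / 60.0), (32, 32))
    print(gamma_pass_rate(ref, ref * 1.02))  # a uniform +2% error passes 3%/3 mm
    ```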

  9. IMPROVING (SOFTWARE) PATENT QUALITY THROUGH THE ADMINISTRATIVE PROCESS

    PubMed Central

    Rai, Arti K.

    2014-01-01

The available evidence indicates that patent quality, particularly in the area of software, needs improvement. This Article argues that even an agency as institutionally constrained as the U.S. Patent and Trademark Office (“PTO”) could implement a portfolio of pragmatic, cost-effective quality improvement strategies. The argument in favor of these strategies draws upon not only legal theory and doctrine but also new data from a PTO software examination unit with relatively strict practices. Strategies that revolve around Section 112 of the patent statute could usefully be deployed at the initial examination stage. Other strategies could be deployed within the new post-issuance procedures available to the agency under the America Invents Act. Notably, although the strategies the Article discusses have the virtue of being neutral as to technology, they are likely to have a very significant practical impact in the area of software. PMID:25221346

  10. QuickEval: a web application for psychometric scaling experiments

    NASA Astrophysics Data System (ADS)

    Van Ngo, Khai; Storvik, Jehans J.; Dokkeberg, Christopher A.; Farup, Ivar; Pedersen, Marius

    2015-01-01

QuickEval is a web application for carrying out psychometric scaling experiments. It offers the possibility of running controlled experiments in a laboratory, or large-scale experiments over the web for people all over the world. It is a one-of-a-kind web application that fills a need in the image quality field, and it is, to the best of our knowledge, the first software to support the three most common scaling methods: paired comparison, rank order, and category judgement. A hoped-for side effect of this newly created software is that it will lower the threshold for performing psychometric experiments, improve the quality of the experiments being carried out, make it easier to reproduce experiments, and increase research on image quality both in academia and industry. The web application is available at www.colourlab.no/quickeval.
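
    For context, the paired-comparison data such an application collects are classically reduced to an interval scale with Thurstone's Case V: convert win proportions to z-scores via the inverse normal CDF and average. A minimal sketch, with an invented vote matrix (the clipping bound is also an assumption to avoid infinite z-scores):

    ```python
    from statistics import NormalDist
    import numpy as np

    # wins[i][j]: how often stimulus i was preferred over stimulus j (made up)
    wins = np.array([[0, 12, 15],
                     [8,  0, 11],
                     [5,  9,  0]], dtype=float)
    n = wins + wins.T                               # total comparisons per pair
    p = np.where(n > 0, wins / np.where(n > 0, n, 1), 0.5)
    z = np.vectorize(NormalDist().inv_cdf)(np.clip(p, 0.01, 0.99))
    np.fill_diagonal(z, 0.0)
    print(z.mean(axis=1))                           # scale value per stimulus
    ```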

  11. IMPROVING (SOFTWARE) PATENT QUALITY THROUGH THE ADMINISTRATIVE PROCESS.

    PubMed

    Rai, Arti K

    2013-11-24

The available evidence indicates that patent quality, particularly in the area of software, needs improvement. This Article argues that even an agency as institutionally constrained as the U.S. Patent and Trademark Office ("PTO") could implement a portfolio of pragmatic, cost-effective quality improvement strategies. The argument in favor of these strategies draws upon not only legal theory and doctrine but also new data from a PTO software examination unit with relatively strict practices. Strategies that revolve around Section 112 of the patent statute could usefully be deployed at the initial examination stage. Other strategies could be deployed within the new post-issuance procedures available to the agency under the America Invents Act. Notably, although the strategies the Article discusses have the virtue of being neutral as to technology, they are likely to have a very significant practical impact in the area of software.

  12. Standardized development of computer software. Part 2: Standards

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1978-01-01

    This monograph contains standards for software development and engineering. The book sets forth rules for design, specification, coding, testing, documentation, and quality assurance audits of software; it also contains detailed outlines for the documentation to be produced.

  13. [Use of Adobe Photoshop software in medical criminology].

    PubMed

    Nikitin, S A; Demidov, I V

    2000-01-01

This article describes a method for the comparative analysis of various objects in practical medical criminology and for producing high-quality photographs with Adobe Photoshop software. The software options needed for expert evaluations are enumerated.

  14. Harnessing ISO/IEC 12207 to Examine the Extent of SPI Activity in an Organisation

    NASA Astrophysics Data System (ADS)

    Clarke, Paul; O'Connor, Rory

    The quality of the software development process directly affects the quality of the software product. To be successful, software development organisations must respond to changes in technology and business circumstances, and therefore software process improvement (SPI) is required. SPI activity relates to any modification that is performed to the software process in order to improve an aspect of the process. Although multiple process assessments could be employed to examine SPI activity, they present an inefficient tool for such an examination. This paper presents an overview of a new survey-based resource that utilises the process reference model in ISO/IEC 12207 in order to expressly and directly determine the level of SPI activity in a software development organisation. This survey instrument can be used by practitioners, auditors and researchers who are interested in determining the extent of SPI activity in an organisation.

  15. Software for Optimizing Quality Assurance of Other Software

    NASA Technical Reports Server (NTRS)

    Feather, Martin; Cornford, Steven; Menzies, Tim

    2004-01-01

    Software assurance is the planned and systematic set of activities that ensures that software processes and products conform to requirements, standards, and procedures. Examples of such activities are the following: code inspections, unit tests, design reviews, performance analyses, construction of traceability matrices, etc. In practice, software development projects have only limited resources (e.g., schedule, budget, and availability of personnel) to cover the entire development effort, of which assurance is but a part. Projects must therefore select judiciously from among the possible assurance activities. At its heart, this can be viewed as an optimization problem; namely, to determine the allocation of limited resources (time, money, and personnel) to minimize risk or, alternatively, to minimize the resources needed to reduce risk to an acceptable level. The end result of the work reported here is a means to optimize quality-assurance processes used in developing software.
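
    Read as an optimization problem, the allocation question above can be prototyped in a few lines. The greedy benefit-per-cost heuristic below is a stand-in for the actual optimizer, and the activities, costs, and risk-reduction numbers are invented for illustration.

    ```python
    # Pick assurance activities to maximize risk reduction under a fixed budget.
    def plan_assurance(activities, budget):
        """activities: list of (name, cost, risk_reduction). Greedy knapsack."""
        chosen, spent = [], 0.0
        for name, cost, gain in sorted(activities,
                                       key=lambda a: a[2] / a[1], reverse=True):
            if spent + cost <= budget:     # take the best value-per-cost that fits
                chosen.append(name)
                spent += cost
        return chosen, spent

    activities = [("code inspection", 40, 30), ("unit tests", 60, 35),
                  ("design review", 25, 20), ("traceability matrix", 30, 10)]
    print(plan_assurance(activities, budget=100))
    # (['design review', 'code inspection', 'traceability matrix'], 95.0)
    ```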

  16. Parameter-based estimation of CT dose index and image quality using an in-house android™-based software

    NASA Astrophysics Data System (ADS)

    Mubarok, S.; Lubis, L. E.; Pawiro, S. A.

    2016-03-01

Compromise between radiation dose and image quality is essential in the use of CT imaging. The CT dose index (CTDI) is currently the primary dosimetric formalism in CT, while low- and high-contrast resolution are aspects indicating image quality. This study aimed to estimate CTDIvol and image quality measures over a range of exposure parameter variations. CTDI measurements were performed using a PMMA (polymethyl methacrylate) phantom of 16 cm diameter, while the image quality test was conducted using a Catphan® 600 phantom. CTDI measurements were carried out according to the IAEA TRS 457 protocol using axial scan mode, under varied tube voltage, collimation or slice thickness, and tube current. The image quality test was conducted under the same exposure parameters as the CTDI measurements. An Android™-based application was also a result of this study; it estimates CTDIvol with a maximum difference from the measured CTDIvol of 8.97%. Image quality can also be estimated through the CNR parameter, with a maximum difference from the measured CNR of 21.65%.
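
    The dosimetric arithmetic behind such an estimator is compact: the weighted CTDI100 combines the center and peripheral phantom measurements, and CTDIvol divides by pitch. A sketch with placeholder measured values (the app's parameter-based fitting is not reproduced here):

    ```python
    def ctdi_w(center, periphery):
        """Weighted CTDI100 (mGy) from center and mean peripheral measurements."""
        return center / 3.0 + 2.0 * periphery / 3.0

    def ctdi_vol(center, periphery, pitch):
        """CTDIvol = CTDIw / pitch for helical scans (pitch = 1 for axial)."""
        return ctdi_w(center, periphery) / pitch

    print(ctdi_vol(center=18.0, periphery=24.0, pitch=1.0))  # 22.0 mGy
    ```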

  17. Flexible and Low-Cost Measurements for Space Software Development- The Measurements Exploration Framework

    NASA Astrophysics Data System (ADS)

    Marculescu, Bogdan; Feldt, Robert; Torkar, Richard; Green, Lars-Goran; Liljegren, Thomas; Hult, Erika

    2011-08-01

Verification and validation is an important part of software development and accounts for a significant share of the costs of such a project. For developers of life- or mission-critical systems, such as software being developed for space applications, a balance must be reached between ensuring the quality of the system by extensive and rigorous testing and reducing costs so that the company can compete. Ensuring the quality of any system starts with a quality development process. To evaluate both the software development process and the product itself, measurements are needed. A balance must then be struck between ensuring the best possible quality of both process and product on the one hand, and reducing the cost of performing measurements on the other. A number of measurements have already been defined and are being used. For some of these, data collection can be automated as well, further lowering the costs of implementing them. In practice, however, there may be situations where existing measurements are unsuitable for a variety of reasons. This paper describes a framework for creating low-cost, flexible measurements in areas where initial information is scarce. The framework, called the Measurements Exploration Framework, is aimed in particular at the space software development industry and was developed in such an environment.

  18. Semantic Metrics for Analysis of Software

    NASA Technical Reports Server (NTRS)

    Etzkorn, Letha H.; Cox, Glenn W.; Farrington, Phil; Utley, Dawn R.; Ghalston, Sampson; Stein, Cara

    2005-01-01

A recently conceived suite of object-oriented software metrics focuses on semantic aspects of software, in contradistinction to traditional software metrics, which focus on syntactic aspects. Semantic metrics represent a more human-oriented view of software than do syntactic metrics. The semantic metrics of a given computer program are calculated from the output of a knowledge-based analysis of the program, and are substantially more representative of software quality and more readily comprehensible from a human perspective than are syntactic metrics.

  19. 21 CFR 1271.160 - Establishment and maintenance of a quality program.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... perform for management review a quality audit, as defined in § 1271.3(gg), of activities related to core CGTP requirements. (d) Computers. You must validate the performance of computer software for the intended use, and the performance of any changes to that software for the intended use, if you rely upon...

  20. 21 CFR 1271.160 - Establishment and maintenance of a quality program.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... perform for management review a quality audit, as defined in § 1271.3(gg), of activities related to core CGTP requirements. (d) Computers. You must validate the performance of computer software for the intended use, and the performance of any changes to that software for the intended use, if you rely upon...

  1. 21 CFR 1271.160 - Establishment and maintenance of a quality program.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... perform for management review a quality audit, as defined in § 1271.3(gg), of activities related to core CGTP requirements. (d) Computers. You must validate the performance of computer software for the intended use, and the performance of any changes to that software for the intended use, if you rely upon...

  2. 21 CFR 1271.160 - Establishment and maintenance of a quality program.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... perform for management review a quality audit, as defined in § 1271.3(gg), of activities related to core CGTP requirements. (d) Computers. You must validate the performance of computer software for the intended use, and the performance of any changes to that software for the intended use, if you rely upon...

  3. Framework for Small-Scale Experiments in Software Engineering: Guidance and Control Software Project: Software Engineering Case Study

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J.

    1998-01-01

    Software is becoming increasingly significant in today's critical avionics systems. To achieve safe, reliable software, government regulatory agencies such as the Federal Aviation Administration (FAA) and the Department of Defense mandate the use of certain software development methods. However, little scientific evidence exists to show a correlation between software development methods and product quality. Given this lack of evidence, a series of experiments has been conducted to understand why and how software fails. The Guidance and Control Software (GCS) project is the latest in this series. The GCS project is a case study of the Requirements and Technical Concepts for Aviation RTCA/DO-178B guidelines, Software Considerations in Airborne Systems and Equipment Certification. All civil transport airframe and equipment vendors are expected to comply with these guidelines in building systems to be certified by the FAA for use in commercial aircraft. For the case study, two implementations of a guidance and control application were developed to comply with the DO-178B guidelines for Level A (critical) software. The development included the requirements, design, coding, verification, configuration management, and quality assurance processes. This paper discusses the details of the GCS project and presents the results of the case study.

  4. A Bibliography of the Personal Software Process (PSP) and the Team Software Process (TSP)

    DTIC Science & Technology

    2009-10-01

"Postmortem." Proceedings of the TSP Symposium (September 2007). http://www.sei.cmu.edu/tspsymposium/ Rickets, Chris; Lindeman, Robert; & Hodgins, Brad... Rickets, Chris A. "A TSP Software Maintenance Life Cycle." CrossTalk (March 2005). Rozanc, I. & Mahnic, V. "Teaching Software Quality with Emphasis on PSP...

  5. Effects of the Meetings-Flow Approach on Quality Teamwork in the Training of Software Capstone Projects

    ERIC Educational Resources Information Center

    Chen, Chung-Yang; Hong, Ya-Chun; Chen, Pei-Chi

    2014-01-01

    Software development relies heavily on teamwork; determining how to streamline this collaborative development is an essential training subject in computer and software engineering education. A team process known as the meetings-flow (MF) approach has recently been introduced in software capstone projects in engineering programs at various…

  6. Automatic Rotational Sky Quality Meter (R-SQM) Design and Software for Astronomical Observatories

    NASA Astrophysics Data System (ADS)

    Dogan, E.; Ozbaldan, E. E.; Shameoni, Niaei M.; Yesilyaprak, C.

    2016-12-01

We present a new design of the Sky Quality Meter (SQM) device: an automatic rotational sky quality meter (R-SQM) developed by the DAG (Eastern Anatolia Observatory) technical team. An R-SQM is required for determining long-term changes in the sky quality of an astronomical observatory; it consists of four SQM devices mounted at different angles on a rotating shaft so as to scan the whole sky. The system is controlled by a Raspberry Pi control card, a stepper motor with its driver, and dedicated software.

  7. A software sensor model based on hybrid fuzzy neural network for rapid estimation water quality in Guangzhou section of Pearl River, China.

    PubMed

    Zhou, Chunshan; Zhang, Chao; Tian, Di; Wang, Ke; Huang, Mingzhi; Liu, Yanbiao

    2018-01-02

In order to manage water resources, a software sensor model was designed to estimate water quality using a hybrid fuzzy neural network (FNN) in the Guangzhou section of the Pearl River, China. The software sensor system is composed of a data storage module, a fuzzy decision-making module, a neural network module and a fuzzy reasoning generator module. Fuzzy subtractive clustering was employed to capture the character of the model and to optimize the network architecture for enhanced performance. The results indicate that, on the basis of available online measured variables, the software sensor model can accurately predict water quality according to the relationship between chemical oxygen demand (COD) and dissolved oxygen (DO), pH and NH4+-N. Owing to its ability to recognize time-series patterns and non-linear characteristics, the FNN-based software sensor is clearly superior to the traditional neural network model; its R (correlation coefficient), MAPE (mean absolute percentage error) and RMSE (root mean square error) are 0.8931, 10.9051 and 0.4634, respectively.
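
    The three reported fit statistics are standard and easy to reproduce. A sketch with invented observed/predicted COD values:

    ```python
    import numpy as np

    def fit_stats(obs, pred):
        obs, pred = np.asarray(obs, float), np.asarray(pred, float)
        r = np.corrcoef(obs, pred)[0, 1]                    # correlation coefficient
        mape = 100.0 * np.mean(np.abs((obs - pred) / obs))  # mean abs. % error
        rmse = np.sqrt(np.mean((obs - pred) ** 2))          # root mean square error
        return r, mape, rmse

    obs = [12.1, 15.4, 9.8, 20.3, 17.6]
    pred = [11.5, 16.0, 10.9, 19.2, 18.1]
    print(fit_stats(obs, pred))
    ```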

  8. Software Formal Inspections Standard

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This Software Formal Inspections Standard (hereinafter referred to as Standard) is applicable to NASA software. This Standard defines the requirements that shall be fulfilled by the software formal inspections process whenever this process is specified for NASA software. The objective of this Standard is to define the requirements for a process that inspects software products to detect and eliminate defects as early as possible in the software life cycle. The process also provides for the collection and analysis of inspection data to improve the inspection process as well as the quality of the software.

  9. Guidance and Control Software,

    DTIC Science & Technology

    1980-05-01

commitments of function, cost, and schedule. The phrase "software engineering" was intended to contrast with the phrase "computer science"; the latter aims... the software problems of cost, delivery schedule, and quality were gradually being recognized at the highest management levels. Thus, in a project... schedule dates. Although the analysis of software problems indicated that the entire software development process (figure 1) needed new methods, only...

  10. Development and Evaluation of a Computer-Based Program for Assessing Quality of Family Medicine Teams Based on Accreditation Standards

    PubMed Central

    Valjevac, Salih; Ridjanovic, Zoran; Masic, Izet

    2009-01-01

CONFLICT OF INTEREST: NONE DECLARED. Introduction: The Agency for Healthcare Quality and Accreditation in the Federation of Bosnia and Herzegovina (AKAZ) is the authorized body in the field of healthcare quality and safety improvement and accreditation of healthcare institutions. Besides accreditation standards for hospitals and primary health care centers, AKAZ has also developed accreditation standards for family medicine teams. Methods: Software development was primarily based on the Accreditation Standards for Family Medicine Teams. Seven chapters/topics (1. Physical factors; 2. Equipment; 3. Organization and management; 4. Health promotion and illness prevention; 5. Clinical services; 6. Patient survey; and 7. Patient's rights and obligations) contain 35 standards describing the expected level of a family medicine team's quality. Based on the structure of the accreditation standards and the needs of different potential users, it was concluded that the software backbone should be a database containing all accreditation standards, self-assessment and external assessment details. This article presents the development of standardized software for the self- and external evaluation of quality of service in family medicine, as well as plans for the future development of this software package. Conclusion: Electronic data gathering and storing enhances the management, access and overall use of information. During this project we concluded that software for self-assessment and external assessment is ideal for distributing accreditation standards, for their review by family medicine team members, and for their self-assessment and external assessment. PMID:24109157

  11. SAGA: A project to automate the management of software production systems

    NASA Technical Reports Server (NTRS)

    Campbell, Roy H.; Beckman-Davies, C. S.; Benzinger, L.; Beshers, G.; Laliberte, D.; Render, H.; Sum, R.; Smith, W.; Terwilliger, R.

    1986-01-01

Research into software development is required to reduce its production cost and to improve its quality. Modern software systems, such as the embedded software required for NASA's space station initiative, stretch current software engineering techniques. The requirement to build large, reliable, and maintainable software systems increases with time. Much theoretical and practical research is in progress to improve software engineering techniques. One such technique is to build a software system or environment which directly supports the software engineering process. This is the aim of the SAGA project, which comprises the research necessary to design and build a software development environment that automates the software engineering process. Progress under SAGA is described.

  12. Guidance and Control Software Project Data - Volume 1: Planning Documents

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the planning documents from the GCS project. Volume 1 contains five appendices: A. Plan for Software Aspects of Certification for the Guidance and Control Software Project; B. Software Development Standards for the Guidance and Control Software Project; C. Software Verification Plan for the Guidance and Control Software Project; D. Software Configuration Management Plan for the Guidance and Control Software Project; and E. Software Quality Assurance Activities.

  13. Mining dynamic noteworthy functions in software execution sequences.

    PubMed

    Zhang, Bing; Huang, Guoyan; Wang, Yuqian; He, Haitao; Ren, Jiadong

    2017-01-01

As the quality of crucial entities can directly affect that of the software, their identification and protection become an important premise for effective software development, management, maintenance and testing, which in turn contribute to improving software quality and its attack-defending ability. Most analyses and evaluations of important entities, such as code-based static structure analysis, disregard the actual execution of the software. In this paper, from the perspective of the software execution process, we propose an approach to mine dynamic noteworthy functions (DNFM) in software execution sequences. First, through software decompiling and the tracking of stack changes, execution traces composed of series of function addresses are acquired. These traces are modeled as execution sequences and then simplified so as to obtain simplified sequences (SFS), followed by the extraction of patterns from the SFS through a pattern extraction (PE) algorithm. After that, the evaluating indicators inner-importance and inter-importance are designed to measure the noteworthiness of functions in the DNFM algorithm. Finally, the functions are sorted by their noteworthiness. The experimental results were compared and contrasted with two traditional complex-network-based node mining methods, namely PageRank and DegreeRank. The results show that the DNFM method can mine noteworthy functions in software effectively and precisely.
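
    A minimal sketch of the pipeline's shape (trace simplification, pattern extraction, function ranking) follows. The frequency-based score is a plain stand-in for the paper's inner-/inter-importance metrics, which it does not reproduce, and the traces are invented.

    ```python
    from collections import Counter
    from itertools import groupby

    traces = [["main", "init", "load", "parse", "load", "parse", "save"],
              ["main", "init", "parse", "save"]]

    # 1) simplify: collapse immediate repeats inside each trace
    simplified = [[f for f, _ in groupby(t)] for t in traces]

    # 2) pattern extraction: count length-2 call patterns across all traces
    patterns = Counter(p for t in simplified for p in zip(t, t[1:]))

    # 3) rank functions by participation in frequent patterns
    score = Counter()
    for (a, b), count in patterns.items():
        score[a] += count
        score[b] += count
    print(score.most_common())   # most noteworthy functions first
    ```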

  14. Overview of the TriBITS Lifecycle Model: Lean/Agile Software Lifecycle Model for Research-based Computational Science and Engineering Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartlett, Roscoe A; Heroux, Dr. Michael A; Willenbring, James

    2012-01-01

Software lifecycles are becoming an increasingly important issue for computational science & engineering (CSE) software. The process by which a piece of CSE software begins life as a set of research requirements and then matures into a trusted high-quality capability is both commonplace and extremely challenging. Although an implicit lifecycle is obviously being used in any effort, the challenges of this process--respecting the competing needs of research vs. production--cannot be overstated. Here we describe a proposal for a well-defined software lifecycle process based on modern Lean/Agile software engineering principles. What we propose is appropriate for many CSE software projects that are initially heavily focused on research but also are expected to eventually produce usable high-quality capabilities. The model is related to TriBITS, a build, integration and testing system, which serves as a strong foundation for this lifecycle model, and aspects of this lifecycle model are ingrained in the TriBITS system. Indeed this lifecycle process, if followed, will enable large-scale sustainable integration of many complex CSE software efforts across several institutions.

  15. A review of the FDA draft guidance document for software validation: guidance for industry.

    PubMed

    Keatley, K L

    1999-01-01

A Draft Guidance Document (Version 1.1) was issued by the United States Food and Drug Administration (FDA) to address the software validation requirement of the Quality System Regulation, 21 CFR Part 820, effective June 1, 1997. The guidance document outlines validation considerations that the FDA regards as applicable to both medical device software and software used to "design, develop or manufacture" medical devices. The Draft Guidance is available at the FDA web site http://www.fda.gov/cdrh/comps/swareval.html. Presented here is a review of the main features of the FDA document for the Quality System Regulation (QSR), and some guidance for its implementation in industry.

  16. Rules of thumb to increase the software quality through testing

    NASA Astrophysics Data System (ADS)

    Buttu, M.; Bartolini, M.; Migoni, C.; Orlati, A.; Poppi, S.; Righini, S.

    2016-07-01

Software maintenance typically requires 40-80% of the overall project costs, and this considerable variability mostly depends on the software's internal quality: the more the software is designed and implemented to constantly welcome new changes, the lower the maintenance costs will be. Internal quality is typically enforced through testing, which in turn also affects development and maintenance costs. This is the reason why testing methodologies have become a major concern for any company that builds - or is involved in building - software. Although there is no testing approach that suits all contexts, we infer some general guidelines learned during the development of the Italian Single-dish COntrol System (DISCOS), a project aimed at producing the control software for the three INAF radio telescopes (the Medicina and Noto dishes, and the newly-built SRT). These guidelines concern both the development and the maintenance phases, and their ultimate goal is to maximize the DISCOS software quality through a Behavior-Driven Development (BDD) workflow beside a continuous delivery pipeline. We consider different topics and patterns; they involve the proper apportioning of tests (from end-to-end to low-level tests), the choice between hardware simulators and mockers, why and how to apply TDD and dependency injection to increase test coverage, the emerging technologies available for test isolation, bug fixing, how to protect the system from changes in external resources (firmware updates, hardware substitution, etc.) and, eventually, how to accomplish BDD starting from functional tests and going through integration and unit tests. We discuss the pros and cons of each solution and point out the motivations for our choices, either as general rules or narrowed to the context of the DISCOS project.
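
    Two of the rules of thumb named here, dependency injection and test doubles, are easy to illustrate: inject the hardware driver rather than hard-wiring it, so a test can substitute a mock for the real device. The class and method names below are invented, not DISCOS code.

    ```python
    from unittest import mock

    class Pointing:
        def __init__(self, driver):          # dependency injected, not hard-wired
            self.driver = driver

        def slew_to(self, az, el):
            if not (0 <= el <= 90):
                raise ValueError("elevation out of range")
            self.driver.move(az, el)

    def test_slew_commands_the_driver():
        driver = mock.Mock()                 # test double stands in for hardware
        Pointing(driver).slew_to(120.0, 45.0)
        driver.move.assert_called_once_with(120.0, 45.0)

    test_slew_commands_the_driver()
    print("ok")
    ```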

  17. The Role and Quality of Software Safety in the NASA Constellation Program

    NASA Technical Reports Server (NTRS)

    Layman, Lucas; Basili, Victor R.; Zelkowitz, Marvin V.

    2010-01-01

In this study, we examine software safety risk in the early design phase of the NASA Constellation spaceflight program. Obtaining an accurate, program-wide picture of software safety risk is difficult across multiple, independently-developing systems. We leverage one source of safety information, hazard analysis, to provide NASA quality assurance managers with information regarding the ongoing state of software safety across the program. The goal of this research is two-fold: 1) to quantify the relative importance of software with respect to system safety; and 2) to quantify the level of risk presented by software in the hazard analysis. We examined 154 hazard reports created during the preliminary design phase of three major flight hardware systems within the Constellation program. To quantify the importance of software, we collected metrics based on the number of software-related causes and controls of hazardous conditions. To quantify the level of risk presented by software, we created a metric scheme to measure the specificity of these software causes. We found that from 49-70% of hazardous conditions in the three systems could be caused by software or software was involved in the prevention of the hazardous condition. We also found that 12-17% of the 2,013 hazard causes involved software, and that 23-29% of all causes had a software control. Furthermore, 10-12% of all controls were software-based. There is potential for inaccuracy in these counts, however, as software causes are not consistently scoped, and the presence of software in a cause or control is not always clear. The application of our software specificity metrics also identified risks in the hazard reporting process. In particular, we found that a number of traceability risks in the hazard reports may impede verification of software and system safety.

  18. The six critical attributes of the next generation of quality management software systems.

    PubMed

    Clark, Kathleen

    2011-07-01

    Driven by both the need to meet regulatory requirements and a genuine desire to drive improved quality, quality management systems encompassing standard operating procedure, corrective and preventative actions and related processes have existed for many years, both in paper and electronic form. The impact of quality management systems on 'actual' quality, however, is often reported as far less than desired. A quality management software system that moves beyond formal forms-driven processes to include a true closed loop design, manage disparate processes across the enterprise, provide support for collaborative processes and deliver insight into the overall state of control has the potential to close the gap between simply accomplishing regulatory compliance and delivering measurable improvements in quality and efficiency.

  19. Enhancements to the EPANET-RTX (Real-Time Analytics) ...

    EPA Pesticide Factsheets

Technical brief and software. The U.S. Environmental Protection Agency (EPA) developed EPANET-RTX as a collection of object-oriented software libraries comprising the core data access, data transformation, and data synthesis (real-time analytics) components of a real-time hydraulic and water quality modeling system. While EPANET-RTX uses the hydraulic and water quality solvers of EPANET, the object libraries are a self-contained set of building blocks for software developers. “Real-time EPANET” promises to change the way water utilities, commercial vendors, engineers, and the water community think about modeling.

  20. Evolving the ECSS Standards and their Use: Experience Based on Industrial Case Studies

    NASA Astrophysics Data System (ADS)

    Feldt, R.; Ahmad, E.; Raza, B.; Hult, E.; Nordebäck, T.

    2009-05-01

    This paper introduces two case studies conducted at two Swedish companies developing software for the space industry. The overall goal of the project is to evaluate if current use of ECSS is cost efficient and if there are ways to make the process leaner while maintaining quality. The case studies reported on here focused on how the ECSS standard was used by the companies and how that affected software development processes and software quality. This paper describes the results and recommendations based on identified challenges.

1. Harmonize Pipeline and Archiving System: PESSTO@IA2 Use Case

    NASA Astrophysics Data System (ADS)

    Smareglia, R.; Knapic, C.; Molinaro, M.; Young, D.; Valenti, S.

    2013-10-01

The Italian Astronomical Archives Center (IA2) is a research infrastructure project that aims at coordinating different national and international initiatives to improve the quality of astrophysical data services. IA2 is now also involved in the PESSTO (Public ESO Spectroscopic Survey of Transient Objects) collaboration, developing a complete archiving system to store calibrated post-processed data (including sensitive intermediate products), a user interface to access private data, and Virtual Observatory (VO) compliant web services to access public fast-reduction data via VO tools. The archive system shall rely on the PESSTO Marshall to provide file data and the associated metadata output by the PESSTO data-reduction pipeline. To harmonize the object repository, data handling and archiving system, new tools are under development. These systems must interact closely without increasing the complexity of any single task, in order to improve the performance of the whole system, and must have robust logic so that all operations are performed in coordination with the other PESSTO tools. MySQL replication technology and triggers are used to synchronize new data in an efficient, fault-tolerant manner. A general-purpose library is under development to manage data from raw observations to final calibrated ones, open to the overriding of different sources, formats, management fields, and storage and publication policies. Configurations for all the systems are stored in a dedicated schema (no configuration files), but can be easily updated by a planned Archiving System Configuration Interface (ASCI).

  2. Application of QC_DR software for acceptance testing and routine quality control of direct digital radiography systems: initial experiences using the Italian Association of Physicist in Medicine quality control protocol.

    PubMed

    Nitrosi, Andrea; Bertolini, Marco; Borasi, Giovanni; Botti, Andrea; Barani, Adriana; Rivetti, Stefano; Pierotti, Luisa

    2009-12-01

    Ideally, medical x-ray imaging systems should be designed to deliver maximum image quality at an acceptable radiation risk to the patient. Quality assurance procedures are employed to ensure that these standards are maintained. A quality control protocol for direct digital radiography (DDR) systems is described and discussed. Software to automatically process and analyze the required images was developed. In this paper, the initial results obtained on equipment of different DDR manufacturers were reported. The protocol was developed to highlight even small discrepancies in standard operating performance.

  3. Data-Driven Decision Making as a Tool to Improve Software Development Productivity

    ERIC Educational Resources Information Center

    Brown, Mary Erin

    2013-01-01

The worldwide software project failure rate, based on a survey of information technology software managers' views of user satisfaction, product quality, and staff productivity, is estimated to be between 24% and 36%, and software project success has not kept pace with advances in hardware. The problem addressed by this study was the limited…

  4. 2005 8th Annual Systems Engineering Conference. Volume 4, Thursday

    DTIC Science & Technology

    2005-10-27

requirements, allocation, and utilization statistics... Operations Decisions, Acquisition Decisions, Resource Management — Integrated Requirements/Allocation... Quality Improvement Consultants, Inc., "Automated Software Testing Increases Test Quality and Coverage Resulting in Improved Software Reliability"... Steven Ligon, SAIC; The Return of Discipline, Ms. Jacqueline Townsend, Air Force Materiel Command; Track 4 - Net Centric Operations: Testing Net-Centric...

  5. Design Aids for Real-Time Systems (DARTS)

    NASA Technical Reports Server (NTRS)

    Szulewski, P. A.

    1982-01-01

    Design-Aids for Real-Time Systems (DARTS) is a tool that assists in defining embedded computer systems through tree structured graphics, military standard documentation support, and various analyses including automated Software Science parameter counting and metrics calculation. These analyses provide both static and dynamic design quality feedback which can potentially aid in producing efficient, high quality software systems.

  6. Relating Communications Mode Choice and Teamwork Quality: Conversational versus Textual Communication in IT System and Software Development Teams

    ERIC Educational Resources Information Center

    Smith, James Robert

    2012-01-01

    This cross-sectional study explored how IT system and software development team members communicated in the workplace and whether teams that used more verbal communication (and less text-based communication) experienced higher levels of collaboration as measured using the Teamwork Quality (TWQ) scale. Although computer-mediated communication tools…

  7. Transforming High School Classrooms with Free/Open Source Software: "It's Time for an Open Source Software Revolution"

    ERIC Educational Resources Information Center

    Pfaffman, Jay

    2008-01-01

    Free/Open Source Software (FOSS) applications meet many of the software needs of high school science classrooms. In spite of the availability and quality of FOSS tools, they remain unknown to many teachers and utilized by fewer still. In a world where most software has restrictions on copying and use, FOSS is an anomaly, free to use and to…

  8. Automating Software Design Metrics.

    DTIC Science & Technology

    1984-02-01

High quality software is of interest to both the software engineering community and its users. As... contributions of many other software engineering efforts, most notably [MCC 77] and [Boe 83b], which have defined and refined a framework for quantifying... Software metrics can be useful within the context of an integrated software engineering environment. The purpose of this...

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strohmaier, Erich; Meuer, Hans W.; Dongarra, Jack

20th Edition of TOP500 List of World's Fastest Supercomputers Released. MANNHEIM, Germany; KNOXVILLE, Tenn.; & BERKELEY, Calif. In what has become a much-anticipated event in the world of high-performance computing, the 20th edition of the TOP500 list of the world's fastest supercomputers was released today (November 15, 2002). The Earth Simulator supercomputer, installed earlier this year at the Earth Simulator Center in Yokohama, Japan, retains the number one position with its Linpack benchmark performance of 35.86 Tflop/s (trillions of calculations per second). The No. 2 and No. 3 positions are held by two new, identical ASCI Q systems at Los Alamos National Laboratory (7.73 Tflop/s each). These systems are built by Hewlett-Packard and based on the AlphaServer SC computer system.

  10. Software for aerospace education: A bibliography, 2nd edition

    NASA Technical Reports Server (NTRS)

    Vogt, Gregory L.; Roth, Susan Kies; Phelps, Malcom V.

    1990-01-01

    This is the second aerospace education software bibliography to be published by the NASA Educational Technology Branch in Washington, DC. Unlike many software bibliographies, this bibliography does not evaluate and grade software according to its quality and value to the classroom, nor does it make any endorsements or warrant scientific accuracy. Rather, it describes software, its subject, approach, and technical details. This bibliography is intended as a convenience to educators. The specific software included represents replies to more than 300 queries to software producers for aerospace education programs.

  11. A systematic literature review of open source software quality assessment models.

    PubMed

    Adewumi, Adewole; Misra, Sanjay; Omoregbe, Nicholas; Crawford, Broderick; Soto, Ricardo

    2016-01-01

    Many open source software (OSS) quality assessment models have been proposed in the literature. However, there is little or no adoption of these models in practice. To guide the formulation of newer models that practitioners will accept, the existing models need to be clearly differentiated by their specific properties. The aim of this study is therefore to perform a systematic literature review of the existing OSS quality assessment models, classifying them with respect to their quality characteristics, the methodology they use for assessment, and their domain of application. Searches in IEEE Xplore, ACM, Science Direct, Springer and Google Search were performed to retrieve all relevant primary studies. Journal and conference papers published between 2003 and 2015 were considered, since the first known OSS quality model emerged in 2003. A total of 19 OSS quality assessment model papers were selected; to select them, we developed assessment criteria for evaluating the quality of the existing studies. The models are classified into five categories based on the quality characteristics they possess, namely: single-attribute, rounded-category, community-only-attribute, non-community-attribute, and non-quality-in-use models. Our study shows that software selection based on hierarchical structures is the most popular selection method in the existing OSS quality assessment models. Furthermore, a plurality (47%) of the existing models do not specify any domain of application. In conclusion, our study should be a valuable contribution to the community, helping quality assessment model developers formulate newer models and helping practitioners (software evaluators) select suitable OSS from among alternatives.

  12. Taking advantage of ground data systems attributes to achieve quality results in testing software

    NASA Technical Reports Server (NTRS)

    Sigman, Clayton B.; Koslosky, John T.; Hageman, Barbara H.

    1994-01-01

    During the software development life cycle process, basic testing starts with the development team. At the end of the development process, an acceptance test is performed for the user to ensure that the deliverable is acceptable. Ideally, the delivery is an operational product with zero defects. However, the goal of zero defects is normally not achieved but is successful to various degrees. With the emphasis on building low cost ground support systems while maintaining a quality product, a key element in the test process is simulator capability. This paper reviews the Transportable Payload Operations Control Center (TPOCC) Advanced Spacecraft Simulator (TASS) test tool that is used in the acceptance test process for unmanned satellite operations control centers. The TASS is designed to support the development, test and operational environments of the Goddard Space Flight Center (GSFC) operations control centers. The TASS uses the same basic architecture as the operations control center. This architecture is characterized by its use of distributed processing, industry standards, commercial off-the-shelf (COTS) hardware and software components, and reusable software. The TASS uses much of the same TPOCC architecture and reusable software that the operations control center developer uses. The TASS also makes use of reusable simulator software in the mission specific versions of the TASS. Very little new software needs to be developed, mainly mission specific telemetry communication and command processing software. By taking advantage of the ground data system attributes, successful software reuse for operational systems provides the opportunity to extend the reuse concept into the test area. Consistency in test approach is a major step in achieving quality results.

  13. ICESat (GLAS) Science Processing Software Document Series. Volume 1; Science Software Management Plan; 3.0

    NASA Technical Reports Server (NTRS)

    Hancock, David W., III

    1999-01-01

    This document provides the Software Management Plan for the GLAS Standard Data Software (SDS) supporting the GLAS instrument of the EOS ICESat Spacecraft. The SDS encompasses the ICESat Science Investigator-led Processing System (I-SIPS) Software and the Instrument Support Terminal (IST) Software. For the I-SIPS Software, the SDS will produce Level 0, Level 1, and Level 2 data products as well as the associated product quality assessments and descriptive information. For the IST Software, the SDS will accommodate the GLAS instrument support areas of engineering status, command, performance assessment, and instrument health status.

  14. A Prototype for the Support of Integrated Software Process Development and Improvement

    NASA Astrophysics Data System (ADS)

    Porrawatpreyakorn, Nalinpat; Quirchmayr, Gerald; Chutimaskul, Wichian

    An efficient software development process is one of the key success factors for quality software. Both the appropriate establishment and the continuous improvement of integrated project management and of the software development process can deliver this efficiency. This paper therefore proposes a software process maintenance framework consisting of two core components: an integrated PMBOK-Scrum model describing how to establish a comprehensive set of project management and software engineering processes, and a software development maturity model advocating software process improvement. In addition, a prototype tool to support the framework is introduced.

  15. Multiple breath washout analysis in infants: quality assessment and recommendations for improvement.

    PubMed

    Anagnostopoulou, Pinelopi; Egger, Barbara; Lurà, Marco; Usemann, Jakob; Schmidt, Anne; Gorlanova, Olga; Korten, Insa; Roos, Markus; Frey, Urs; Latzin, Philipp

    2016-03-01

    Infant multiple breath washout (MBW) testing serves as a primary outcome in clinical studies. However, it is still unknown whether current software algorithms allow between-centre comparisons. In this study of healthy infants, we quantified MBW measurement errors and tried to improve data quality by simply changing software settings. We analyzed best-quality MBW measurements performed with an ultrasonic flowmeter in 24 infants from two centres in Switzerland using the current software settings. To challenge the robustness of these settings, we also used alternative analysis approaches. Using the current analysis software, the coefficient of variation (CV) for functional residual capacity (FRC) differed significantly between centres (mean ± SD (%): 9.8 ± 5.6 and 5.8 ± 2.9, respectively, p = 0.039). In addition, FRC values calculated during the washout differed by between -25% and +30% from those of the washin of the same tracing. Results were mainly influenced by analysis settings and temperature recordings. Changing a few algorithms resulted in significantly more robust analyses. Non-systematic inter-centre differences can be reduced by using correctly recorded environmental data and simple changes in the software algorithms. These changes greatly improve the quality of infant MBW outcomes and can be applied when multicentre trials are conducted.
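
    For reference, the coefficient of variation used above is simply the standard deviation divided by the mean. A minimal sketch of the two quantities the abstract reports (CV of repeated FRC measurements, and the washout-versus-washin deviation within one tracing); all FRC values below are hypothetical:

        import statistics

        def coefficient_of_variation(values):
            """CV (%) = standard deviation / mean * 100, used here to compare
            FRC repeatability between centres."""
            return statistics.stdev(values) / statistics.mean(values) * 100

        def washout_vs_washin(frc_washout, frc_washin):
            """Percent deviation of washout-derived FRC from washin-derived
            FRC of the same tracing (reported range: -25% to +30%)."""
            return (frc_washout - frc_washin) / frc_washin * 100

        frc_runs_ml = [182.0, 195.0, 176.0]   # hypothetical triplicate (ml)
        print(f"CV = {coefficient_of_variation(frc_runs_ml):.1f}%")
        print(f"washout vs washin: {washout_vs_washin(210.0, 190.0):+.1f}%")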

  16. A unified software framework for deriving, visualizing, and exploring abstraction networks for ontologies

    PubMed Central

    Ochs, Christopher; Geller, James; Perl, Yehoshua; Musen, Mark A.

    2016-01-01

    Software tools play a critical role in the development and maintenance of biomedical ontologies. One important task that is difficult without software tools is ontology quality assurance. In previous work, we have introduced different kinds of abstraction networks to provide a theoretical foundation for ontology quality assurance tools. Abstraction networks summarize the structure and content of ontologies. One kind of abstraction network that we have used repeatedly to support ontology quality assurance is the partial-area taxonomy. It summarizes structurally and semantically similar concepts within an ontology. However, the use of partial-area taxonomies was ad hoc and not generalizable. In this paper, we describe the Ontology Abstraction Framework (OAF), a unified framework and software system for deriving, visualizing, and exploring partial-area taxonomy abstraction networks. The OAF includes support for various ontology representations (e.g., OWL and SNOMED CT's relational format). A Protégé plugin for deriving “live partial-area taxonomies” is demonstrated. PMID:27345947

  17. Verification and Validation in a Rapid Software Development Process

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Easterbrook, Steve M.

    1997-01-01

    The high cost of software production is driving development organizations to adopt more automated design and analysis methods such as rapid prototyping, computer-aided software engineering (CASE) tools, and high-level code generators. Even developers of safety-critical software systems have adopted many of these new methods while striving to achieve high levels of quality and reliability. While these new methods may enhance productivity and quality in many cases, we examine some of the risks involved in their use in safety-critical contexts. We examine a case study involving the use of a CASE tool that automatically generates code from high-level system designs. We show that while high-level testing of the system structure is highly desirable, significant risks exist in the automatically generated code and in re-validating releases of the generated code after subsequent design changes. We identify these risks and suggest process improvements that retain the advantages of rapid, automated development methods within the quality and reliability contexts of safety-critical projects.

  18. A unified software framework for deriving, visualizing, and exploring abstraction networks for ontologies.

    PubMed

    Ochs, Christopher; Geller, James; Perl, Yehoshua; Musen, Mark A

    2016-08-01

    Software tools play a critical role in the development and maintenance of biomedical ontologies. One important task that is difficult without software tools is ontology quality assurance. In previous work, we have introduced different kinds of abstraction networks to provide a theoretical foundation for ontology quality assurance tools. Abstraction networks summarize the structure and content of ontologies. One kind of abstraction network that we have used repeatedly to support ontology quality assurance is the partial-area taxonomy. It summarizes structurally and semantically similar concepts within an ontology. However, the use of partial-area taxonomies was ad hoc and not generalizable. In this paper, we describe the Ontology Abstraction Framework (OAF), a unified framework and software system for deriving, visualizing, and exploring partial-area taxonomy abstraction networks. The OAF includes support for various ontology representations (e.g., OWL and SNOMED CT's relational format). A Protégé plugin for deriving "live partial-area taxonomies" is demonstrated. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. IEEE Computer Society/Software Engineering Institute Software Process Achievement (SPA) Award 2009

    DTIC Science & Technology

    2011-03-01

    … capabilities to our GDM. We also introduced software as a service (SaaS) as part of our technology solutions and have further enhanced our ability to … PROSPER = Infosys production support methodology; Q&P = quality and productivity; R&D = research and development; SaaS = software as a service … Software Development Life Cycle (SDLC); Scientific Estimation Coverage by Service Line.

  20. Open Source Molecular Modeling

    PubMed Central

    Pirhadi, Somayeh; Sunseri, Jocelyn; Koes, David Ryan

    2016-01-01

    The success of molecular modeling and computational chemistry efforts is, by definition, dependent on quality software applications. Open source software development provides many advantages to users of modeling applications, not the least of which is that the software is free and completely extendable. In this review we categorize, enumerate, and describe available open source software packages for molecular modeling and computational chemistry. PMID:27631126

  1. The Future of Statistical Software. Proceedings of a Forum--Panel on Guidelines for Statistical Software (Washington, D.C., February 22, 1991).

    ERIC Educational Resources Information Center

    National Academy of Sciences - National Research Council, Washington, DC.

    The Panel on Guidelines for Statistical Software was organized in 1990 to document, assess, and prioritize problem areas regarding quality and reliability of statistical software; present prototype guidelines in high priority areas; and make recommendations for further research and discussion. This document provides the following papers presented…

  2. Bridging the Research-to-Practice Gap in Education: A Software-Mediated Approach for Improving Classroom Instruction

    ERIC Educational Resources Information Center

    Weston, Mark E.; Bain, Alan

    2015-01-01

    This study reports findings from a matched-comparison, repeated-measure for intact groups design of the mediating effect of a suite of software on the quality of classroom instruction provided to students by teachers. The quality of instruction provided by teachers in the treatment and control groups was documented via observations that were…

  3. Software Quality Assurance and Verification for the MPACT Library Generation Process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Yuxuan; Williams, Mark L.; Wiarda, Dorothea

    This report fulfills the requirements for the Consortium for the Advanced Simulation of Light-Water Reactors (CASL) milestone L2:RTM.P14.02, “SQA and Verification for MPACT Library Generation,” by documenting the current status of the software quality, verification, and acceptance testing of nuclear data libraries for MPACT. It provides a brief overview of the library generation process, from general-purpose evaluated nuclear data files (ENDF/B) to a problem-dependent cross section library for modeling of light-water reactors (LWRs). The software quality assurance (SQA) programs associated with each of the software packages used to generate the nuclear data libraries are discussed; specific tests within the SCALE/AMPX and VERA/XSTools repositories are described. The methods and associated tests to verify the quality of the library during the generation process are described in detail. The library generation process has been largely automated to (1) ensure that it can be run without user intervention and (2) ensure that the library can be reproduced. Finally, the acceptance testing process that will be performed by representatives from the Radiation Transport Methods (RTM) Focus Area prior to the production library’s release is described in detail.

  4. Assessment of three different software systems in the evaluation of dynamic MRI of the breast.

    PubMed

    Kurz, K D; Steinhaus, D; Klar, V; Cohnen, M; Wittsack, H J; Saleh, A; Mödder, U; Blondin, D

    2009-02-01

    The aim was to compare the diagnostic performance and handling of dynamic contrast-enhanced MRI of the breast with two commercial software solutions ("CADstream" and "3TP") and one self-developed software system ("Mammatool"). Identical data sets of dynamic breast MRI from 21 patients were evaluated retrospectively with all three software systems. The exams were classified according to the BI-RADS classification. The number of lesions in the parametric mapping was compared to histology or follow-up of more than 2 years. In addition, 25 quality criteria were judged by 3 independent investigators with a score from 0 to 5. Statistical analysis was performed to document the quality ranking of the different software systems. There were 9 invasive carcinomas, one pure DCIS, one papilloma, one radial scar, three histologically proven changes due to mastopathy, one adenosis and two fibroadenomas. Additionally, two patients with enhancing parenchyma followed with MRI for more than 3 years, and one scar after breast-conserving therapy, were included. All malignant lesions were classified as BI-RADS 4 or 5 using all software systems and showed significant enhancement in the parametric mapping. "CADstream" showed the best score on subjective quality criteria. "3TP" showed the lowest number of false-positive results. "Mammatool" produced the lowest number of benign tissues indicated with parametric overlay. All three software programs tested were adequate for sensitive and efficient assessment of dynamic MRI of the breast. Improvements in specificity may be achievable.

  5. Reuse Metrics for Object Oriented Software

    NASA Technical Reports Server (NTRS)

    Bieman, James M.

    1998-01-01

    One way to increase the quality of software products and the productivity of software development is to reuse existing software components when building new software systems. In order to monitor improvements in reuse, the level of reuse must be measured. In this NASA supported project we (1) derived a suite of metrics which quantify reuse attributes for object oriented, object based, and procedural software, (2) designed prototype tools to take these measurements in Ada, C++, Java, and C software, (3) evaluated the reuse in available software, (4) analyzed the relationship between coupling, cohesion, inheritance, and reuse, (5) collected object oriented software systems for our empirical analyses, and (6) developed quantitative criteria and methods for restructuring software to improve reusability.
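
    One simple, illustrative way to quantify a reuse attribute is the fraction of delivered code contributed by reused components; this is only a sketch with hypothetical components, not the metric suite derived in the project:

        from dataclasses import dataclass

        @dataclass
        class Component:
            name: str
            loc: int        # lines of code
            reused: bool    # taken from an existing system or library?

        def reuse_level(components):
            """Fraction of total lines of code contributed by reused
            components; measuring this over time makes improvements in
            reuse visible."""
            total = sum(c.loc for c in components)
            reused = sum(c.loc for c in components if c.reused)
            return reused / total if total else 0.0

        system = [Component("parser", 1200, True),
                  Component("scheduler", 800, False),
                  Component("logging", 300, True)]
        print(f"reuse level: {reuse_level(system):.0%}")   # -> 65%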

  6. Mining dynamic noteworthy functions in software execution sequences

    PubMed Central

    Huang, Guoyan; Wang, Yuqian; He, Haitao; Ren, Jiadong

    2017-01-01

    As the quality of crucial entities can directly affect that of the software, their identification and protection are an important premise for effective software development, management, maintenance and testing, and thus contribute to improving software quality and its attack-defending ability. Most analysis and evaluation of important entities, such as code-based static structure analysis, is detached from the actual running of the software. In this paper, from the perspective of the software execution process, we propose an approach to mine dynamic noteworthy functions (DNFM) in software execution sequences. First, by decompiling the software and tracking stack changes, execution traces composed of a series of function addresses are acquired. These traces are modeled as execution sequences and then simplified to obtain simplified sequences (SFS), from which patterns are extracted by a pattern extraction (PE) algorithm. After that, two evaluation indicators, inner-importance and inter-importance, are designed to measure the noteworthiness of functions in the DNFM algorithm. Finally, the functions are sorted by their noteworthiness. The experimental results were compared with those of two traditional complex-network-based node mining methods, PageRank and DegreeRank. The results show that the DNFM method can mine noteworthy functions in software effectively and precisely. PMID:28278276
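
    A rough, illustrative analogue of the pipeline described above (trace simplification, pattern extraction, importance scoring). The scoring is a stand-in, since the abstract does not give the exact inner-importance and inter-importance formulas:

        from collections import Counter
        from itertools import groupby

        def simplify(trace):
            """Collapse consecutive repeats of the same function address (a
            crude stand-in for the paper's sequence simplification step)."""
            return [fn for fn, _ in groupby(trace)]

        def frequent_patterns(trace, n=2):
            """Frequent n-grams as a minimal pattern-extraction stand-in."""
            grams = Counter(tuple(trace[i:i + n])
                            for i in range(len(trace) - n + 1))
            return {g: c for g, c in grams.items() if c > 1}

        def noteworthiness(traces, n=2):
            """Score each function by how often it occurs inside frequent
            patterns (inner) and how many patterns contain it (inter)."""
            inner, inter = Counter(), Counter()
            for trace in traces:
                for gram, count in frequent_patterns(simplify(trace), n).items():
                    for fn in set(gram):
                        inner[fn] += count
                        inter[fn] += 1
            return sorted(((inner[f] * inter[f], f) for f in inner),
                          reverse=True)

        traces = [["main", "load", "parse", "load", "parse", "save"],
                  ["main", "load", "parse", "parse", "save"]]
        print(noteworthiness(traces))   # 'load' and 'parse' rank highest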

  7. Organizational Stresses and Practices Impeding Quality Software Development in Government Procurements

    ERIC Educational Resources Information Center

    Holcomb, Glenda S.

    2010-01-01

    This qualitative, phenomenological doctoral dissertation research study explored software project team members' perceptions of changing organizational cultures based on management decisions made at project deviation points. The research study provided a view into challenged or failing government software projects through the lived experiences…

  8. Investigating the application of AOP methodology in development of Financial Accounting Software using Eclipse-AJDT Environment

    NASA Astrophysics Data System (ADS)

    Sharma, Amita; Sarangdevot, S. S.

    2010-11-01

    Aspect-Oriented Programming (AOP) methodology has been investigated in the development of real-world business application software, namely Financial Accounting Software. The Eclipse-AJDT environment has been used as open-source enhanced IDE support for programming in the AOP language AspectJ. Crosscutting concerns have been identified and modularized as aspects. This reduces the complexity of the design considerably by eliminating code scattering and tangling. Improvements in modularity, quality and performance are achieved. The study concludes that AOP methodology in the Eclipse-AJDT environment offers powerful support for the modular design and implementation of real-world quality business software.
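
    The study's aspects are written in AspectJ; as a rough cross-language analogy, a Python decorator can likewise pull a crosscutting concern out of the business logic. The audit-logging concern and the accounting function below are hypothetical:

        import functools
        import logging

        logging.basicConfig(level=logging.INFO)

        def audited(fn):
            """Crosscutting audit-logging concern kept out of the business
            logic; a loose analogy to an AspectJ 'around' advice."""
            @functools.wraps(fn)
            def wrapper(*args, **kwargs):
                logging.info("AUDIT enter %s args=%s", fn.__name__, args)
                result = fn(*args, **kwargs)
                logging.info("AUDIT exit %s -> %s", fn.__name__, result)
                return result
            return wrapper

        @audited
        def post_journal_entry(account, amount):
            # Core accounting logic only; no logging code tangled in here.
            return f"posted {amount} to {account}"

        post_journal_entry("1010-Cash", 250.00)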

  9. Measuring Software Product Quality: The ISO 25000 Series and CMMI

    DTIC Science & Technology

    2004-06-14

    … “performance objectives” covers objectives and requirements for product quality, service quality, and process performance. Process performance objectives … such that product quality, service quality, and process performance attributes are measurable and controlled throughout the project (internal and …

  10. Software safety

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy

    1987-01-01

    Software safety and its relationship to other qualities are discussed. It is shown that standard reliability and fault tolerance techniques will not solve the safety problem for the present. A new attitude is required: looking at what you do NOT want software to do along with what you want it to do, and assuming things will go wrong. New procedures and changes to the entire software development process are necessary: special software safety analysis techniques are needed, and design techniques, especially eliminating complexity, can be very helpful.

  11. Software engineering project management - A state-of-the-art report

    NASA Technical Reports Server (NTRS)

    Thayer, R. H.; Lehman, J. H.

    1977-01-01

    The management of software engineering projects in the aerospace industry was investigated. The survey assessed such features as contract type, specification preparation techniques, software documentation required by customers, planning and cost-estimating, quality control, the use of advanced program practices, software tools and test procedures, the education levels of project managers, programmers and analysts, work assignment, automatic software monitoring capabilities, design and coding reviews, production times, success rates, and organizational structure of the projects.

  12. The Legacy of Space Shuttle Flight Software

    NASA Technical Reports Server (NTRS)

    Hickey, Christopher J.; Loveall, James B.; Orr, James K.; Klausman, Andrew L.

    2011-01-01

    The initial goals of the Space Shuttle Program required that the avionics and software systems blaze new trails in advancing avionics system technology. Many of the requirements placed on avionics and software were accomplished for the first time on this program. Examples include comprehensive digital fly-by-wire technology, use of a digital databus for flight critical functions, fail operational/fail safe requirements, complex automated redundancy management, and the use of a high-order software language for flight software development. In order to meet the operational and safety goals of the program, the Space Shuttle software had to be extremely high quality, reliable, robust, reconfigurable and maintainable. To achieve this, the software development team evolved a software process focused on continuous process improvement and defect elimination that consistently produced highly predictable and top quality results, providing software managers the confidence needed to sign each Certificate of Flight Readiness (COFR). This process, which has been appraised at Capability Maturity Model (CMM)/Capability Maturity Model Integration (CMMI) Level 5, has resulted in one of the lowest software defect rates in the industry. This paper will present an overview of the evolution of the Primary Avionics Software System (PASS) project and processes over thirty years, an argument for strong statistical control of software processes with examples, an overview of the success story for identifying and driving out errors before flight, a case study of the few significant software issues and how they were either identified before flight or slipped through the process onto a flight vehicle, and identification of the valuable lessons learned over the life of the project.

  13. Assurance Evaluation for OSS Adoption in a Telco Context

    NASA Astrophysics Data System (ADS)

    Ardagna, Claudio A.; Banzi, Massimo; Damiani, Ernesto; El Ioini, Nabil; Frati, Fulvio

    Software Assurance (SwA) is a complex concept that involves different stages of a software development process and may be defined differently depending on its focus, such as software quality, security, or dependability. In Computer Science, the term assurance refers to all activities necessary to provide enough confidence that a software product will satisfy its users' functional and non-functional requirements.

  14. Developing Confidence Limits For Reliability Of Software

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J.

    1991-01-01

    Technique developed for estimating reliability of software by use of Moranda geometric de-eutrophication model. Pivotal method enables straightforward construction of exact bounds with associated degree of statistical confidence about reliability of software. Confidence limits thus derived provide precise means of assessing quality of software. Limits take into account number of bugs found while testing and effects of sampling variation associated with random order of discovering bugs.
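
    For context, the Moranda geometric de-eutrophication model assumes the i-th interfailure time is exponentially distributed with rate D * k**(i - 1), so the failure rate drops geometrically as bugs are removed. Below is a sketch of simulating the model and fitting D and k by profile maximum likelihood; the report's pivotal construction of exact confidence bounds is not reproduced here:

        import math
        import random

        def simulate_interfailure_times(D, k, n, seed=1):
            """i-th interfailure time ~ Exponential(rate = D * k**(i-1));
            k < 1 gives reliability growth."""
            rng = random.Random(seed)
            return [rng.expovariate(D * k**i) for i in range(n)]

        def fit_geometric_model(times):
            """For fixed k the MLE of D is n / sum(k**(i-1) * t_i); the
            profile log-likelihood is then maximized over a grid of k."""
            n = len(times)
            best = None
            for k in (x / 1000 for x in range(1, 1001)):
                s = sum(k**i * t for i, t in enumerate(times))
                D = n / s
                loglik = n * math.log(D) + math.log(k) * sum(range(n)) - n
                if best is None or loglik > best[0]:
                    best = (loglik, D, k)
            return best[1], best[2]

        times = simulate_interfailure_times(D=0.5, k=0.9, n=50)
        D_hat, k_hat = fit_geometric_model(times)
        print(f"D = {D_hat:.2f}, k = {k_hat:.2f}")  # should land near 0.5, 0.9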

  15. Socio-Cultural Challenges in Global Software Engineering Education

    ERIC Educational Resources Information Center

    Hoda, Rashina; Babar, Muhammad Ali; Shastri, Yogeshwar; Yaqoob, Humaa

    2017-01-01

    Global software engineering education (GSEE) is aimed at providing software engineering (SE) students with knowledge, skills, and understanding of working in globally distributed arrangements so they can be prepared for the global SE (GSE) paradigm. It is important to understand the challenges involved in GSEE for improving the quality and…

  16. Multidisciplinary and Active/Collaborative Approaches in Teaching Requirements Engineering

    ERIC Educational Resources Information Center

    Rosca, Daniela

    2005-01-01

    The requirements engineering course is a core component of the curriculum for the Master's in Software Engineering programme, at Monmouth University (MU). It covers the process, methods and tools specific to this area, together with the corresponding software quality issues. The need to produce software engineers with strong teamwork and…

  17. Future of Software Engineering Standards

    NASA Technical Reports Server (NTRS)

    Poon, Peter T.

    1997-01-01

    In the new millennium, software engineering standards are expected to continue to influence the process of producing software-intensive systems which are cost-effective and of high quality. These systems may range from ground and flight systems used for planetary exploration to educational support systems used in schools, as well as consumer-oriented systems.

  18. Quality control in urodynamics and the role of software support in the QC procedure.

    PubMed

    Hogan, S; Jarvis, P; Gammie, A; Abrams, P

    2011-11-01

    This article aims to identify quality control (QC) best practice, to review published QC audits in order to identify how closely good practice is followed, and to carry out a market survey of the software features that support QC offered by urodynamics machines available in the UK. All UK distributors of urodynamic systems were contacted and asked to provide information on the software features relating to data quality of the products they supply. The results of the market survey show that the features offered by manufacturers differ greatly. Automated features, which can be turned off in most cases, include: cough recognition, detrusor contraction detection, and high pressure alerts. There are currently no systems that assess data quality based on published guidelines. A literature review of current QC guidelines for urodynamics was carried out; QC audits were included in the literature review to see how closely guidelines were being followed. This review highlights the fact that basic QC is not being carried out effectively by urodynamicists. Based on the software features currently available and the results of the literature review there is both the need and capacity for a greater degree of automation in relation to urodynamic data quality and accuracy assessment. Some progress has been made in this area and certain manufacturers have already developed automated cough detection. Copyright © 2011 Wiley Periodicals, Inc.

  19. Proteomics Quality Control: Quality Control Software for MaxQuant Results.

    PubMed

    Bielow, Chris; Mastrobuoni, Guido; Kempa, Stefan

    2016-03-04

    Mass spectrometry-based proteomics coupled to liquid chromatography has matured into an automated, high-throughput technology, producing data on the scale of multiple gigabytes per instrument per day. Consequently, automated quality control (QC) and quality analysis (QA) capable of detecting measurement bias, verifying consistency, and avoiding propagation of error are paramount for instrument operators and scientists in charge of downstream analysis. We have developed an R-based QC pipeline called Proteomics Quality Control (PTXQC) for bottom-up LC-MS data generated by the MaxQuant software pipeline. PTXQC creates a QC report containing a comprehensive and powerful set of QC metrics, augmented with automated scoring functions. The automated scores are collated to create an overview heatmap at the beginning of the report, giving valuable guidance also to nonspecialists. Our software supports a wide range of experimental designs, including stable isotope labeling by amino acids in cell culture (SILAC), tandem mass tags (TMT), and label-free data. Furthermore, we introduce new metrics to score MaxQuant's Match-between-runs (MBR) functionality, by which peptide identifications can be transferred across Raw files based on accurate retention time and m/z. Last but not least, PTXQC is easy to install and use and represents the first QC software capable of processing MaxQuant result tables. PTXQC is freely available at https://github.com/cbielow/PTXQC.
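
    PTXQC itself is an R package; purely as an illustration of its scoring idea (mapping heterogeneous raw metrics onto [0, 1] so they can be collated into an overview heatmap), here is a small Python sketch with hypothetical metrics and tolerances, not PTXQC's actual code:

        def linear_score(value, worst, best):
            """Map a raw metric value onto [0, 1], where 1 is ideal; works
            whether lower or higher raw values are better."""
            s = (value - worst) / (best - worst)
            return max(0.0, min(1.0, s))

        raw_files = {
            # hypothetical per-file metrics: (mass error in ppm, ID rate %)
            "run_01.raw": (1.2, 38.0),
            "run_02.raw": (4.8, 21.0),
        }
        for name, (ppm, id_rate) in raw_files.items():
            row = [linear_score(ppm, worst=5.0, best=0.0),    # low ppm good
                   linear_score(id_rate, worst=10.0, best=50.0)]
            print(name, [f"{s:.2f}" for s in row])   # one heatmap row each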

  20. Application of a newly developed software program for image quality assessment in cone-beam computed tomography.

    PubMed

    de Oliveira, Marcus Vinicius Linhares; Santos, António Carvalho; Paulo, Graciano; Campos, Paulo Sergio Flores; Santos, Joana

    2017-06-01

    The purpose of this study was to apply a newly developed free software program, at low cost and with minimal time, to evaluate the quality of dental and maxillofacial cone-beam computed tomography (CBCT) images. A polymethyl methacrylate (PMMA) phantom, CQP-IFBA, was scanned in 3 CBCT units with 7 protocols. A macro program was developed, using the free software ImageJ, to automatically evaluate the image quality parameters. The image quality evaluation was based on 8 parameters: uniformity, the signal-to-noise ratio (SNR), noise, the contrast-to-noise ratio (CNR), spatial resolution, the artifact index, geometric accuracy, and low-contrast resolution. The image uniformity and noise depended on the protocol that was applied. Regarding the CNR, high-density structures were more sensitive to the effect of scanning parameters. There were no significant differences between SNR and CNR in centered and peripheral objects. The geometric accuracy assessment showed that all the distance measurements were lower than the real values. Low-contrast resolution was influenced by the scanning parameters, and the 1-mm rod present in the phantom was not depicted in any of the 3 CBCT units. Smaller voxel sizes presented higher spatial resolution. There were no significant differences among the protocols regarding artifact presence. This software package provided a fast, low-cost, and feasible method for the evaluation of image quality parameters in CBCT.
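
    As an illustration of two of the eight parameters, here is a sketch using common conventions for SNR and CNR; the study's exact ImageJ-macro definitions may differ, and the ROI data below are synthetic:

        import numpy as np

        def snr(roi):
            """Signal-to-noise ratio of a uniform region: mean / SD (one
            common convention)."""
            return float(np.mean(roi)) / float(np.std(roi))

        def cnr(roi_object, roi_background):
            """Contrast-to-noise ratio between an insert and the PMMA
            background: |mean difference| / background SD."""
            contrast = abs(float(np.mean(roi_object)) -
                           float(np.mean(roi_background)))
            return contrast / float(np.std(roi_background))

        rng = np.random.default_rng(0)
        background = rng.normal(100.0, 8.0, size=(20, 20))  # synthetic ROI
        insert = rng.normal(160.0, 8.0, size=(20, 20))
        print(f"SNR = {snr(background):.1f}, "
              f"CNR = {cnr(insert, background):.1f}")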

  1. The research and practice of spacecraft software engineering

    NASA Astrophysics Data System (ADS)

    Chen, Chengxin; Wang, Jinghua; Xu, Xiaoguang

    2017-06-01

    In order to ensure the safety and reliability of spacecraft software products, engineering management must be applied. The paper first reviews the problems of unsystematic planning, unclear classification management and discontinuous improvement mechanisms in domestic and foreign spacecraft software engineering management. It then proposes a solution for software engineering management based on a system-integrated approach from the perspective of the spacecraft system. Finally, an application to a spacecraft is given as an example. The research can provide a reference for carrying out spacecraft software engineering management and improving software product quality.

  2. Towards a New Paradigm of Software Development: an Ambassador Driven Process in Distributed Software Companies

    NASA Astrophysics Data System (ADS)

    Kumlander, Deniss

    The globalization of companies' operations and the competition between software vendors demand improved quality of delivered software at decreased overall cost. At the same time, globalization introduces many problems into the software development process, as it produces distributed organizations that break the co-location rule of modern software development methodologies. Here we propose a reformulation of the ambassador position, increasing its productivity, in order to bridge communication and workflow gaps by managing the entire communication process rather than concentrating purely on the communication result.

  3. Estimation of water quality parameters of inland and coastal waters with the use of a toolkit for processing of remote sensing data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dekker, A.G.; Hoogenboom, H.J.; Rijkeboer, M.

    1997-06-01

    Deriving thematic maps of water quality parameters from a remote sensing image requires a number of processing steps, such as calibration, atmospheric correction, air/water interface correction, and application of water quality algorithms. A prototype software environment has recently been developed that enables the user to perform and control these processing steps. The main parts of this environment are: (i) access to the MODTRAN 3 radiative transfer code for removing atmospheric and air-water interface influences, (ii) a tool for analyzing algorithms for estimating water quality, and (iii) a spectral database containing apparent and inherent optical properties and associated water quality parameters. The use of the software is illustrated by applying the implemented algorithms for estimating chlorophyll to data from a spectral library of Dutch inland waters with CHL ranging from 1 to 500 µg/l. The algorithms currently implemented in the Toolkit software are recommended for optically simple waters; for optically complex waters, the development of more advanced retrieval methods is required.

  4. Development of a calibrated software reliability model for flight and supporting ground software for avionic systems

    NASA Technical Reports Server (NTRS)

    Lawrence, Stella

    1991-01-01

    The object of this project was to develop and calibrate quantitative models for predicting the quality of software. Reliable flight and supporting ground software is a highly important factor in the successful operation of the space shuttle program. The models used in the present study were drawn from SMERFS (Statistical Modeling and Estimation of Reliability Functions for Software), which contains ten models. For a first run, modeling the cumulative number of failures versus execution time gave fairly good results for our data. Plots of cumulative software failures versus calendar weeks were made and the model results were compared with the historical data on the same graph. If a model agrees with actual historical behavior for a set of data, then there is confidence in future predictions for those data. Considering the quality of the data, the models have given some significant results, even at this early stage. With better care in data collection, data analysis, recording of the fixing of failures and CPU execution times, the models should prove extremely helpful in making predictions regarding the future pattern of failures, including an estimate of the number of errors remaining in the software and the additional testing time required for the software quality to reach acceptable levels. It appears that there is no one 'best' model for all cases. It is for this reason that the aim of this project was to test several models. One of the recommendations resulting from this study is that great care must be taken in the collection of data. When using a model, the data should satisfy the model assumptions.
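
    As an illustration of the kind of fit described (the ten SMERFS models are not reproduced here), the sketch below fits one classic reliability-growth curve, the Goel-Okumoto model m(t) = a * (1 - exp(-b * t)), to hypothetical weekly cumulative failure counts; the parameter a then estimates the total defect count, so a minus the failures observed so far estimates the errors remaining:

        import numpy as np
        from scipy.optimize import curve_fit

        def goel_okumoto(t, a, b):
            """Mean cumulative failures m(t) = a * (1 - exp(-b t));
            'a' is the expected total number of defects."""
            return a * (1.0 - np.exp(-b * t))

        weeks = np.arange(1, 13, dtype=float)           # calendar weeks
        cum_failures = np.array([5, 9, 14, 17, 21, 23,  # hypothetical data
                                 26, 27, 29, 30, 31, 31], dtype=float)

        (a, b), _ = curve_fit(goel_okumoto, weeks, cum_failures,
                              p0=(40.0, 0.1))
        print(f"estimated total defects a = {a:.1f}, rate b = {b:.3f}")
        print(f"estimated errors remaining = {a - cum_failures[-1]:.1f}")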

  5. A methodology for model-based development and automated verification of software for aerospace systems

    NASA Astrophysics Data System (ADS)

    Martin, L.; Schatalov, M.; Hagner, M.; Goltz, U.; Maibaum, O.

    Today's software for aerospace systems is typically very complex. This is due to the increasing number of features as well as the high demands for safety, reliability, and quality. This complexity also leads to significantly higher software development costs. To handle the software complexity, a structured development process is necessary. Additionally, compliance with relevant standards for quality assurance is a mandatory concern. To assure high software quality, techniques for verification are necessary. Besides traditional techniques like testing, automated verification techniques like model checking are becoming more popular. The latter examine the whole state space and, consequently, result in full test coverage. Nevertheless, despite the obvious advantages, this technique is rarely used for the development of aerospace systems. In this paper, we propose a tool-supported methodology for the development and formal verification of safety-critical software in the aerospace domain. The methodology relies on the V-Model and defines a comprehensive workflow for model-based software development as well as automated verification in compliance with the European standard series ECSS-E-ST-40C. Furthermore, our methodology supports the generation and deployment of code. For tool support we use SCADE Suite (Esterel Technologies), an integrated design environment that covers all the requirements of our methodology. The SCADE Suite is well established in the avionics and defense, rail transportation, energy and heavy equipment industries. For evaluation purposes, we apply our approach to an up-to-date case study of the TET-1 satellite bus. In particular, the attitude and orbit control software is considered. The behavioral models for the subsystem are developed, formally verified, and optimized.

  6. MoniQA: a general approach to monitor quality assurance

    NASA Astrophysics Data System (ADS)

    Jacobs, J.; Deprez, T.; Marchal, G.; Bosmans, H.

    2006-03-01

    MoniQA ("Monitor Quality Assurance") is a new, non-commercial, independent quality assurance software application developed by our medical physics team. It is a complete Java(TM)-based modular environment for the evaluation of radiological viewing devices, and it thus fits into the global quality assurance network of our (filmless) radiology department. The purpose of the software tool is to guide the medical physicist through an acceptance protocol and the radiologist through a constancy check protocol by presenting the necessary test patterns and automating data collection. Data are then sent to a central management system for further analysis. At the moment more than 55 patterns have been implemented, which can be grouped into schemes to implement protocols (i.e. AAPM TG18, DIN and EUREF). Some test patterns are dynamically created and 'drawn' on the viewing device with random parameters, as is the case in a recently proposed new pattern for constancy testing. The software is installed on 35 diagnostic stations (70 monitors) in a filmless radiology department. Learning time was very limited. A constancy check (with the new pattern that assesses luminance decrease, resolution problems and geometric distortion) takes only 2 minutes and 28 seconds per monitor. The modular approach of the software allows the evaluation of new or emerging test patterns. We report on the software and its usability: the practicality of the constancy check tests in our hospital, and results from acceptance tests of viewing stations for digital mammography.

  7. A proven approach for more effective software development and maintenance

    NASA Technical Reports Server (NTRS)

    Pajerski, Rose; Hall, Dana; Sinclair, Craig

    1994-01-01

    Modern space flight mission operations and associated ground data systems are increasingly dependent upon reliable, quality software. Critical functions such as command load preparation, health and status monitoring, communications link scheduling and conflict resolution, and transparent gateway protocol conversion are routinely performed by software. Given budget constraints and the ever-increasing capabilities of processor technology, the next generation of control centers and data systems will be even more dependent upon software across all aspects of performance. A key challenge now is to implement improved engineering, management, and assurance processes for the development and maintenance of that software; processes that cost less, yield higher quality products, and self-correct for continual, evolutionary improvement. The NASA Goddard Space Flight Center has a unique experience base that can be readily tapped to help solve the software challenge. Over the past eighteen years, the Software Engineering Laboratory within the Code 500 Flight Dynamics Division has evolved a software development and maintenance methodology that accommodates the unique characteristics of an organization while optimizing and continually improving the organization's software capabilities. This methodology relies upon measurement, analysis, and feedback, analogous to that of control-loop systems. It is an approach with a time-tested track record proven through repeated applications across a broad range of operational software development and maintenance projects. This paper describes the software improvement methodology employed by the Software Engineering Laboratory, and how it has been exploited within the Flight Dynamics Division, GSFC Code 500. Examples of specific improvements in the software itself and in its processes are presented to illustrate the effectiveness of the methodology. Finally, initial findings from applying this methodology across the mission operations and ground data systems software domains throughout Code 500 are given.

  8. Retinal Image Simulation of Subjective Refraction Techniques.

    PubMed

    Perches, Sara; Collados, M Victoria; Ares, Jorge

    2016-01-01

    Refraction techniques make it possible to determine the most appropriate sphero-cylindrical lens prescription to achieve the best possible visual quality. Among these techniques, subjective refraction (i.e., refraction guided by the patient's responses) is the most commonly used approach. In this context, this paper's main goal is to present simulation software that implements in a virtual manner various subjective-refraction techniques--including the Jackson Cross-Cylinder (JCC) test--all relying on the observation of computer-generated retinal images. This software has also been used to evaluate visual quality when the JCC test is performed on multifocal-contact-lens wearers. The results reveal the software's usefulness for simulating the retinal image quality that a particular visual compensation provides. Moreover, it can help to gain a deeper insight into, and to improve, existing refraction techniques, and it can be used for simulated training.

  9. Proceedings of the Annual Ada Software Engineering Education and Training Symposium (3rd) Held in Denver, Colorado on June 14-16, 1988

    DTIC Science & Technology

    1988-06-01

    … Based Software Engineering Project Course … Software Engineering Concepts: The Importance of Object-Based … quality assurance, and independent system testing. The Chief Programmer is responsible for all software development activities, including prototyping … during the Requirements Analysis phase, the Preliminary Design, the Detailed Design, Coding and Unit Testing, CSC Integration and Testing, and informal …

  10. Methodical Design of Software Architecture Using an Architecture Design Assistant (ArchE)

    DTIC Science & Technology

    2005-04-01

    Methodical Design of Software Architecture Using an Architecture Design Assistant (ArchE). Felix Bachmann and Mark Klein. … important for architecture design: quality requirements and constraints are most important. Here's some evidence: if the only concern is …

  11. Educational Software--New Guidelines for Development.

    ERIC Educational Resources Information Center

    Gold, Patricia Cohen

    1984-01-01

    Discusses standards developed by the Educational Computer Service of the National Education Association that incorporate technical, educational, and documentation components to guide authors in the development of quality educational software. (Author/MBR)

  12. Product assurance policies and procedures for flight dynamics software development

    NASA Technical Reports Server (NTRS)

    Perry, Sandra; Jordan, Leon; Decker, William; Page, Gerald; Mcgarry, Frank E.; Valett, Jon

    1987-01-01

    The product assurance policies and procedures necessary to support flight dynamics software development projects for Goddard Space Flight Center are presented. The quality assurance and configuration management methods and tools for each phase of the software development life cycles are described, from requirements analysis through acceptance testing; maintenance and operation are not addressed.

  13. Integrating Model-Based Verification into Software Design Education

    ERIC Educational Resources Information Center

    Yilmaz, Levent; Wang, Shuo

    2005-01-01

    Proper design analysis is indispensable to assure quality and reduce emergent costs due to faulty software. Teaching proper design verification skills early during pedagogical development is crucial, as such analysis is the only tractable way of resolving software problems early when they are easy to fix. The premise of the presented strategy is…

  14. Criteria for the Assessment of Foreign Language Instructional Software and Web Sites.

    ERIC Educational Resources Information Center

    Rifkin, Benjamin

    2003-01-01

    Presents standards for assessing language-learning software and Web sites in three different contexts: (1) teachers considering whether and how to integrate computer-mediated materials into their instruction; (2) specialists writing reviews of software or Web sites for professional journals; and (3) college administrators evaluating the quality of…

  15. Identification of features of electronic prescribing systems to support quality and safety in primary care using a modified Delphi process.

    PubMed

    Sweidan, Michelle; Williamson, Margaret; Reeve, James F; Harvey, Ken; O'Neill, Jennifer A; Schattner, Peter; Snowdon, Teri

    2010-04-15

    Electronic prescribing is increasingly being used in primary care and in hospitals. Studies on the effects of e-prescribing systems have found evidence for both benefit and harm. The aim of this study was to identify features of e-prescribing software systems that support patient safety and quality of care and that are useful to the clinician and the patient, with a focus on improving the quality use of medicines. Software features were identified by a literature review, key informants and an expert group. A modified Delphi process was used with a 12-member multidisciplinary expert group to reach consensus on the expected impact of the features in four domains: patient safety, quality of care, usefulness to the clinician and usefulness to the patient. The setting was electronic prescribing in general practice in Australia. A list of 114 software features was developed. Most of the features relate to the recording and use of patient data, the medication selection process, prescribing decision support, monitoring drug therapy and clinical reports. The expert group rated 78 of the features (68%) as likely to have a high positive impact in at least one domain, 36 features (32%) as medium impact, and none as low or negative impact. Twenty-seven features were rated as high positive impact across 3 or 4 domains, including patient safety and quality of care. Ten features were considered "aspirational" because of a lack of agreed standards and/or suitable knowledge bases. This study defines features of e-prescribing software systems that are expected to support safety and quality, especially in relation to prescribing and use of medicines in general practice. The features could be used to develop software standards, and could be adapted if necessary for use in other settings and countries.

  16. QCScreen: a software tool for data quality control in LC-HRMS based metabolomics.

    PubMed

    Simader, Alexandra Maria; Kluger, Bernhard; Neumann, Nora Katharina Nicole; Bueschl, Christoph; Lemmens, Marc; Lirk, Gerald; Krska, Rudolf; Schuhmacher, Rainer

    2015-10-24

    Metabolomics experiments often comprise large numbers of biological samples resulting in huge amounts of data. This data needs to be inspected for plausibility before data evaluation to detect putative sources of error e.g. retention time or mass accuracy shifts. Especially in liquid chromatography-high resolution mass spectrometry (LC-HRMS) based metabolomics research, proper quality control checks (e.g. for precision, signal drifts or offsets) are crucial prerequisites to achieve reliable and comparable results within and across experimental measurement sequences. Software tools can support this process. The software tool QCScreen was developed to offer a quick and easy data quality check of LC-HRMS derived data. It allows a flexible investigation and comparison of basic quality-related parameters within user-defined target features and the possibility to automatically evaluate multiple sample types within or across different measurement sequences in a short time. It offers a user-friendly interface that allows an easy selection of processing steps and parameter settings. The generated results include a coloured overview plot of data quality across all analysed samples and targets and, in addition, detailed illustrations of the stability and precision of the chromatographic separation, the mass accuracy and the detector sensitivity. The use of QCScreen is demonstrated with experimental data from metabolomics experiments using selected standard compounds in pure solvent. The application of the software identified problematic features, samples and analytical parameters and suggested which data files or compounds required closer manual inspection. QCScreen is an open source software tool which provides a useful basis for assessing the suitability of LC-HRMS data prior to time consuming, detailed data processing and subsequent statistical analysis. It accepts the generic mzXML format and thus can be used with many different LC-HRMS platforms to process both multiple quality control sample types as well as experimental samples in one or more measurement sequences.
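
    A minimal illustration of the kind of per-feature check such a tool automates (retention-time stability and mass accuracy against tolerances); the target compound, tolerances, and QC values below are hypothetical, not QCScreen's code:

        # Hypothetical target list and tolerances for a QC standard.
        TARGETS = {"reserpine": {"mz": 609.2812, "rt_min": 7.4}}
        PPM_TOL, RT_TOL_MIN = 5.0, 0.2

        def check_feature(name, measured_mz, measured_rt):
            """Flag a target feature whose mass error (ppm) or retention-time
            shift exceeds its tolerance."""
            ref = TARGETS[name]
            ppm_error = (measured_mz - ref["mz"]) / ref["mz"] * 1e6
            rt_shift = measured_rt - ref["rt_min"]
            ok = abs(ppm_error) <= PPM_TOL and abs(rt_shift) <= RT_TOL_MIN
            return ok, ppm_error, rt_shift

        for run, (mz, rt) in {"QC_01": (609.2818, 7.45),
                              "QC_02": (609.2870, 7.90)}.items():
            ok, ppm, drift = check_feature("reserpine", mz, rt)
            print(f"{run}: {'PASS' if ok else 'FLAG'} "
                  f"(mass error {ppm:+.1f} ppm, RT shift {drift:+.2f} min)")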

  17. Identification of features of electronic prescribing systems to support quality and safety in primary care using a modified Delphi process

    PubMed Central

    2010-01-01

    Background: Electronic prescribing is increasingly being used in primary care and in hospitals. Studies on the effects of e-prescribing systems have found evidence for both benefit and harm. The aim of this study was to identify features of e-prescribing software systems that support patient safety and quality of care and that are useful to the clinician and the patient, with a focus on improving the quality use of medicines. Methods: Software features were identified by a literature review, key informants and an expert group. A modified Delphi process was used with a 12-member multidisciplinary expert group to reach consensus on the expected impact of the features in four domains: patient safety, quality of care, usefulness to the clinician and usefulness to the patient. The setting was electronic prescribing in general practice in Australia. Results: A list of 114 software features was developed. Most of the features relate to the recording and use of patient data, the medication selection process, prescribing decision support, monitoring drug therapy and clinical reports. The expert group rated 78 of the features (68%) as likely to have a high positive impact in at least one domain, 36 features (32%) as medium impact, and none as low or negative impact. Twenty-seven features were rated as high positive impact across 3 or 4 domains, including patient safety and quality of care. Ten features were considered "aspirational" because of a lack of agreed standards and/or suitable knowledge bases. Conclusions: This study defines features of e-prescribing software systems that are expected to support safety and quality, especially in relation to prescribing and use of medicines in general practice. The features could be used to develop software standards, and could be adapted if necessary for use in other settings and countries. PMID:20398294

  18. A proposed defect tracking model for classifying the inserted defect reports to enhance software quality control.

    PubMed

    Sultan, Torky; Khedr, Ayman E; Sayed, Mostafa

    2013-01-01

    Defect tracking systems play an important role in software development organizations, as they can store historical information about defects. There has been much research on defect tracking models and systems to enhance their tracking capabilities and adapt them to new technologies. Furthermore, different studies have classified bugs in a step-by-step method so as to give a clear picture and an applicable method for detecting such bugs. This paper presents a new proposed defect tracking model for classifying inserted defect reports in a step-by-step method to further enhance software quality control.

  19. Requirements UML Tool (RUT) Expanded for Extreme Programming (CI02)

    NASA Technical Reports Server (NTRS)

    McCoy, James R.

    2003-01-01

    A procedure for capturing and managing system requirements that incorporates XP user stories. Because costs associated with identifying problems in requirements increase dramatically over the lifecycle of a project, a method for identifying sources of software risk in user stories is urgently needed. This initiative aims to determine a set of guidelines for user stories that will result in high-quality requirements. To further this initiative, a tool is needed to analyze user stories that can assess the quality of individual user stories, detect sources of software risk, produce software metrics, and identify areas in user stories that can be improved.

  20. CrossTalk: The Journal of Defense Software Engineering. Volume 19, Number 6

    DTIC Science & Technology

    2006-06-01

    … improvement methods. The total volume of projects studied now exceeds 12,000. Software Productivity Research, LLC. … While performing quality consulting, Olson has helped organizations measurably improve quality and productivity, save millions of dollars in costs of … This article draws parallels between the outrageous events on the Jerry Springer Show and problems faced by process improvement programs, by Paul …

  1. OPEN-SOURCE SOFTWARE IN DENTISTRY: A SYSTEMATIC REVIEW.

    PubMed

    Chruściel-Nogalska, Małgorzata; Smektała, Tomasz; Tutak, Marcin; Sporniak-Tutak, Katarzyna; Olszewski, Raphael

    2017-01-01

    Technological development and the need for electronic health records management resulted in the need for a computer with dedicated, commercial software in daily dental practice. The alternative to commercial software may be open-source solutions. Therefore, this study reviewed the current literature on the availability and use of open-source software (OSS) in dentistry. A comprehensive database search was performed on February 1, 2017. Only articles published in peer-reviewed journals with a focus on the use or description of OSS were retrieved. The level of evidence, according to the Oxford EBM Centre Levels of Evidence Scale, was classified for all studies. Experimental studies underwent additional quality reporting assessment. The screening and evaluation process resulted in twenty-one studies from 1,940 articles found, with 10 of them being experimental studies. None of the articles provided level 1 evidence, and only one study was considered high quality following quality assessment. Twenty-six different OSS programs were described in the included studies, of which ten were used for image visualization, five for healthcare records management, four for educational processes, one for remote consultation and simulation, and six for general purposes. Our analysis revealed that the dental literature on OSS consists of scarce, incomplete, and methodologically low-quality information.

  2. Onward to Petaflops Computing

    NASA Technical Reports Server (NTRS)

    Bailey, David H.; Chancellor, Marisa K. (Technical Monitor)

    1997-01-01

    With programs such as the US High Performance Computing and Communications Program (HPCCP), the attention of scientists and engineers worldwide has been focused on the potential of very high performance scientific computing, namely systems that are hundreds or thousands of times more powerful than those typically available in desktop systems at any given point in time. Extending the frontiers of computing in this manner has resulted in remarkable advances, both in computing technology itself and in the various scientific and engineering disciplines that utilize these systems. Within a month or two, a sustained rate of 1 Tflop/s (also written 1 teraflops, or 10(exp 12) floating-point operations per second) is likely to be achieved by the 'ASCI Red' system at Sandia National Laboratories in New Mexico. With this objective in sight, it is reasonable to ask what lies ahead for high-end computing.

  3. Test and evaluation procedures for Sandia's Teraflops Operating System (TOS) on Janus.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnette, Daniel Wayne

    This report describes the test and evaluation methods by which the Teraflops Operating System, or TOS, that resides on Sandia's massively parallel computer Janus is verified for production release. Also discussed are methods used to build TOS before testing and evaluating, miscellaneous utility scripts, a sample test plan, and a proposed post-test method for quickly examining the large number of test results. The purpose of the report is threefold: (1) to provide a guide to T&E procedures, (2) to aid and guide others who will run T&E procedures on the new ASCI Red Storm machine, and (3) to document some of the history of evaluation and testing of TOS. This report is not intended to serve as an exhaustive manual for testers to conduct T&E procedures.

  4. First record of Talaromyces udagawae in soil related to decomposing human remains in Argentina.

    PubMed

    Tranchida, María C; Centeno, Néstor D; Stenglein, Sebastián A; Cabello, Marta N

    2016-01-01

    The morphologic features of Talaromyces udagawae Stolk and Samson are here described and illustrated. This teleomorphic Ascomycota fungus was isolated from soil obtained in Buenos Aires province (Argentina) from beneath a human cadaver in an advanced state of decomposition. After washing and serial dilution of the soil, along with moist-chamber techniques for fungal cultivation, T. udagawae formed very restricted colonies of bright yellow color on different growth media, with 8-ascospored asci. The ascospores were ellipsoidal and ornamented. The anamorphic state was not observed. Molecular-genetic techniques identified the species. The present record is the first of the species in Argentina, pointing to its value as a tool for identifying soils where cadaver decomposition has occurred. Copyright © 2015 Asociación Argentina de Microbiología. Published by Elsevier España, S.L.U. All rights reserved.

  5. The influence of software filtering in digital mammography image quality

    NASA Astrophysics Data System (ADS)

    Michail, C.; Spyropoulou, V.; Kalyvas, N.; Valais, I.; Dimitropoulos, N.; Fountos, G.; Kandarakis, I.; Panayiotakis, G.

    2009-05-01

    Breast cancer is one of the most frequently diagnosed cancers among women. Several techniques have been developed to help in the early detection of breast cancer, such as conventional and digital x-ray mammography, positron and single-photon emission mammography, etc. A key advantage of digital mammography is that images can be manipulated as simple computer image files. Thus non-dedicated, commercially available image manipulation software can be employed to process and store the images. The image processing tools of the Photoshop (CS 2) software usually incorporate digital filters which may be used to reduce image noise, enhance contrast and increase spatial resolution. However, improving one image quality parameter may result in degradation of another. The aim of this work was to investigate the influence of three sharpening filters, named hereafter sharpen, sharpen more and sharpen edges, on image resolution and noise. Image resolution was assessed by means of the Modulation Transfer Function (MTF). In conclusion, it was found that the correct use of commercial non-dedicated software on digital mammograms may improve some aspects of image quality.
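    For illustration, the following is a generic 3x3 sharpening convolution of the kind evaluated in the study; the kernel is a textbook identity-plus-Laplacian example, not Photoshop's proprietary "sharpen" filter, and the image is a synthetic stand-in.

```python
# A generic sharpening convolution: edges are boosted (raising apparent
# resolution) at the cost of amplifying noise, the trade-off the study probes.
import numpy as np
from scipy.ndimage import convolve

sharpen = np.array([[ 0, -1,  0],
                    [-1,  5, -1],
                    [ 0, -1,  0]], dtype=float)  # identity + Laplacian edge boost

def apply_sharpen(image):
    return convolve(image.astype(float), sharpen, mode="nearest")

img = np.random.default_rng(0).normal(100, 5, (64, 64))  # stand-in for a mammogram ROI
print(img.std(), apply_sharpen(img).std())  # noise std rises after sharpening
```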

  6. [Quality assurance of a virtual simulation software: application to IMAgo and SIMAgo (ISOgray)].

    PubMed

    Isambert, A; Beaudré, A; Ferreira, I; Lefkopoulos, D

    2007-06-01

    Virtual simulation is often used to prepare three-dimensional conformal radiation therapy treatments. As the quality of the treatment depends heavily on this step, it is mandatory to perform extensive controls on this software before clinical use. The tests presented in this work were carried out on the treatment planning system ISOgray (DOSIsoft), including the delineation module IMAgo and the virtual simulation module SIMAgo. Based on our experience, the most relevant controls from international protocols were selected. These tests mainly focused on measuring and delineation tools and virtual simulation functionalities, and were performed with three phantoms: the Quasar Multi-Purpose Body Phantom, the Quasar MLC Beam Geometry Phantom (Modus Medical Devices Inc.) and a phantom developed at Hospital Tenon. No major issues were identified while performing the tests. These controls emphasized the necessity for the user to consider with a critical eye the results displayed by virtual simulation software. The contrast of visualisation, the slice thickness, and the calculation and display mode of 3D structures used by the software are all sources of uncertainty. A virtual simulation software quality assurance procedure has been written and applied to a set of CT images. Similar tests have to be performed periodically, and at a minimum at each major version change.

  7. Semantic Entity-Component State Management Techniques to Enhance Software Quality for Multimodal VR-Systems.

    PubMed

    Fischbach, Martin; Wiebusch, Dennis; Latoschik, Marc Erich

    2017-04-01

    Modularity, modifiability, reusability, and API usability are important software qualities that determine the maintainability of software architectures. Virtual, Augmented, and Mixed Reality (VR, AR, MR) systems, modern computer games, as well as interactive human-robot systems often include various dedicated input, output, and processing subsystems. These subsystems collectively maintain a real-time simulation of a coherent application state. The resulting interdependencies between individual state representations, mutual state access, overall synchronization, and flow of control imply conceptually close coupling, whereas software quality calls for decoupling to develop maintainable solutions. This article presents five semantics-based software techniques that address this contradiction: semantic grounding, code from semantics, grounded actions, semantic queries, and decoupling by semantics. These techniques are applied to extend the well-established entity-component-system (ECS) pattern to overcome some of this pattern's deficits with respect to the implied state access. A walk-through of central implementation aspects of a multimodal (speech and gesture) VR interface is used to highlight the techniques' benefits. This use case is chosen as a prototypical example of the complex architectures with multiple interacting subsystems found in many VR, AR and MR systems. Finally, implementation hints are given, lessons learned regarding maintainability are pointed out, and performance implications are discussed.
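    To make the pattern under discussion concrete, here is a minimal ECS sketch in Python (the article's systems are C++/VR-scale, and all names below are illustrative): entities are bare ids, components are plain data, and systems own all behavior, which is the decoupling the semantic techniques build on.

```python
# Minimal entity-component-system sketch: no entity class couples state to
# behavior; systems query components by type and operate on matching entities.
from dataclasses import dataclass

@dataclass
class Position:
    x: float
    y: float

@dataclass
class Velocity:
    dx: float
    dy: float

class World:
    def __init__(self):
        self.next_id = 0
        self.components = {}  # component type -> {entity id -> instance}

    def create(self, *comps):
        eid = self.next_id
        self.next_id += 1
        for c in comps:
            self.components.setdefault(type(c), {})[eid] = c
        return eid

    def join(self, *types):
        # Entities holding every requested component type.
        ids = set.intersection(*(set(self.components[t]) for t in types))
        return [(eid, *(self.components[t][eid] for t in types)) for eid in ids]

def movement_system(world, dt):
    for _, pos, vel in world.join(Position, Velocity):
        pos.x += vel.dx * dt
        pos.y += vel.dy * dt

w = World()
w.create(Position(0.0, 0.0), Velocity(1.0, 2.0))
movement_system(w, 0.5)
```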

  8. A Reference Model for Software and System Inspections. White Paper

    NASA Technical Reports Server (NTRS)

    He, Lulu; Shull, Forrest

    2009-01-01

    Software Quality Assurance (SQA) is an important component of the software development process. SQA processes provide assurance that the software products and processes in the project life cycle conform to their specified requirements by planning, enacting, and performing a set of activities that provide adequate confidence that quality is being built into the software. Typical techniques include: (1) testing, (2) simulation, (3) model checking, (4) symbolic execution, (5) management reviews, (6) technical reviews, (7) inspections, (8) walk-throughs, (9) audits, (10) analysis (complexity analysis, control flow analysis, algorithmic analysis), and (11) formal methods. Our work over the last few years has resulted in substantial knowledge about SQA techniques, especially in the areas of technical reviews and inspections. But can we apply the same QA techniques to the system development process? If yes, what kind of tailoring do we need before applying them in the system engineering context? If not, what types of QA techniques are actually used at the system level? And is there any room for improvement? After a brief examination of the system engineering literature (especially focused on NASA and DoD guidance) we found that: (1) the system and software development processes interact with each other at different phases throughout the development life cycle; (2) reviews are emphasized in both system and software development, and for some reviews (e.g. SRR, PDR, CDR) there are both system versions and software versions; (3) analysis techniques are emphasized (e.g. Fault Tree Analysis, Preliminary Hazard Analysis) and some details are given about how to apply them; and (4) reviews are expected to use the outputs of the analysis techniques; in other words, these particular analyses are usually conducted in preparation for (before) reviews. The goal of our work is to explore the interaction between the Quality Assurance (QA) techniques at the system level and the software level.

  9. Preliminary description of the area navigation software for a microcomputer-based Loran-C receiver

    NASA Technical Reports Server (NTRS)

    Oguri, F.

    1983-01-01

    The development of new software, and its implementation on a microcomputer (MOS 6502) to provide high-quality navigation information, is described. This software development provides Area/Route Navigation (RNAV) information from Time Differences (TDs) in raw form, using both an elliptical Earth model and a spherical model. The software is prepared for the microcomputer-based Loran-C receiver. To compute navigation information, a MOS 6502 microcomputer and a mathematical chip (AM 9511A) were combined with the Loran-C receiver. Final data reveal that this software does indeed provide accurate information with reasonable execution times.
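    As a flavor of the navigation output such a receiver computes, here is a hedged sketch of great-circle range and bearing on the spherical Earth model mentioned above; the Loran-C TD-to-position step and the elliptical model are omitted, and the coordinates are hypothetical.

```python
# Great-circle range (haversine) and initial bearing on a spherical Earth,
# the kind of RNAV output computed from a present-position fix to a waypoint.
from math import radians, degrees, sin, cos, asin, atan2, sqrt

R_EARTH_NM = 3440.065  # mean Earth radius in nautical miles

def range_and_bearing(lat1, lon1, lat2, lon2):
    p1, p2 = radians(lat1), radians(lat2)
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(p1) * cos(p2) * sin(dlon / 2) ** 2
    dist = 2 * R_EARTH_NM * asin(sqrt(a))               # haversine distance
    brg = degrees(atan2(sin(dlon) * cos(p2),
                        cos(p1) * sin(p2) - sin(p1) * cos(p2) * cos(dlon))) % 360
    return dist, brg

print(range_and_bearing(39.32, -82.10, 40.00, -83.00))  # hypothetical fix -> waypoint
```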

  10. Performing Verification and Validation in Reuse-Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward A.

    1999-01-01

    The implementation of reuse-based software engineering not only introduces new activities to the software development process, such as domain analysis and domain modeling, but also impacts other aspects of software engineering, including Configuration Management, Testing, Quality Control, and Verification and Validation (V&V). Activities in each of these areas must be adapted to address the entire domain or product line rather than a specific application system. This paper discusses changes and enhancements to the V&V process in order to adapt V&V to reuse-based software engineering.

  11. Modeling defect trends for iterative development

    NASA Technical Reports Server (NTRS)

    Powell, J. D.; Spanguolo, J. N.

    2003-01-01

    The Employment of Defects (EoD) approach to measuring and analyzing defects seeks to identify and capture trends and phenomena that are critical to managing software quality in the iterative software development lifecycle at JPL.

  12. Requirement Metrics for Risk Identification

    NASA Technical Reports Server (NTRS)

    Hammer, Theodore; Huffman, Lenore; Wilson, William; Rosenberg, Linda; Hyatt, Lawrence

    1996-01-01

    The Software Assurance Technology Center (SATC) is part of the Office of Mission Assurance of the Goddard Space Flight Center (GSFC). The SATC's mission is to assist National Aeronautics and Space Administration (NASA) projects in improving the quality of software that they acquire or develop. The SATC's efforts are currently focused on the development and use of metric methodologies and tools that identify and assess risks associated with software performance and scheduled delivery. This starts at the requirements phase, where the SATC, in conjunction with software projects at GSFC and other NASA centers, is working to identify tools and metric methodologies to assist project managers in identifying and mitigating risks. This paper discusses requirement metrics currently being used at NASA in a collaborative effort between the SATC and the Quality Assurance Office at GSFC to utilize the information available through the application of requirements management tools.

  13. [The primary research and development of software oversampling mapping system for electrocardiogram].

    PubMed

    Zhou, Yu; Ren, Jie

    2011-04-01

    We put forward a new concept of a software oversampling mapping system for electrocardiogram (ECG) research to assist study of the ECG inverse problem and to improve the generality of the mapping system and the quality of mapped signals. We then developed a conceptual system based on a traditional ECG detecting circuit, LabVIEW, and a DAQ card produced by National Instruments, and at the same time incorporated the newly developed oversampling method into the system. The results indicated that the system could map ECG signals accurately and that the quality of the signals was good. The improvement of the hardware and enhancement of the software made the system suitable for mapping in different situations. The primary development of the software oversampling mapping system was thus successful, and further research and development can make the system a powerful tool for studying the ECG inverse problem.

  14. Database Performance Monitoring for the Photovoltaic Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klise, Katherine A.

    The Database Performance Monitoring (DPM) software (copyright in process) is being developed at Sandia National Laboratories to perform quality control analysis on time series data. The software loads time-indexed databases (currently csv format), performs a series of quality control tests defined by the user, and creates reports which include summary statistics, tables, and graphics. DPM can be set up to run on an automated schedule defined by the user. For example, the software can be run once per day to analyze data collected on the previous day. HTML-formatted reports can be sent via email or hosted on a website. To compare performance of several databases, summary statistics and graphics can be gathered in a dashboard view which links to detailed reporting information for each database. The software can be customized for specific applications.
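    A minimal sketch of the style of quality-control test described, assuming pandas and a time-indexed CSV; the column name, thresholds, and gap test below are hypothetical, not DPM's actual configuration.

```python
# Load a time-indexed CSV, run simple user-defined QC checks, and return a
# summary suitable for a daily report: range violations and timestamp gaps.
import pandas as pd

def run_qc(csv_path, column, lower, upper, max_gap="15min"):
    df = pd.read_csv(csv_path, index_col=0, parse_dates=True)
    out_of_range = df[(df[column] < lower) | (df[column] > upper)]
    gaps = df.index.to_series().diff() > pd.Timedelta(max_gap)  # missing-data test
    return {
        "n_records": len(df),
        "n_out_of_range": len(out_of_range),
        "n_timestamp_gaps": int(gaps.sum()),
    }

# e.g. run_qc("pv_system_A.csv", "dc_power_kw", lower=0.0, upper=500.0)
```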

  15. Report of AAPM Task Group 162: Software for planar image quality metrology.

    PubMed

    Samei, Ehsan; Ikejimba, Lynda C; Harrawood, Brian P; Rong, John; Cunningham, Ian A; Flynn, Michael J

    2018-02-01

    The AAPM Task Group 162 aimed to provide a standardized approach for the assessment of image quality in planar imaging systems. This report offers a description of the approach as well as the details of the resultant software bundle to measure detective quantum efficiency (DQE) as well as its basis components and derivatives. The methodology and the associated software include the characterization of the noise power spectrum (NPS) from planar images acquired under specific acquisition conditions, the modulation transfer function (MTF) using an edge test object, the DQE, and the effective DQE (eDQE). First, a methodological framework is provided to highlight the theoretical basis of the work. Then, a step-by-step guide is included to assist in proper execution of each component of the code. Lastly, an evaluation of the method is included to validate its accuracy against model-based and experimental data. The code was built under the Macintosh OS X operating system. The software package contains all the source code to permit an experienced user to build the suite on a Linux or other *nix type system. The package further includes manuals, sample images, and scripts to demonstrate use of the software for new users. The results of the code are in close alignment with theoretical expectations and published results of experimental data. The methodology and the software package offered in AAPM TG162 can be used as a baseline for characterization of the inherent image quality attributes of planar imaging systems. © 2017 American Association of Physicists in Medicine.
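    A condensed sketch of the edge-based MTF chain the report standardizes follows: differentiate the edge-spread function (ESF) to obtain the line-spread function (LSF), then take the normalized Fourier transform. Real implementations use a slanted edge and oversampling; this 1-D version with a synthetic edge only shows the chain.

```python
# ESF -> LSF -> |FFT| -> normalized MTF, with a window to suppress noise tails.
import numpy as np

def mtf_from_edge(esf, pixel_pitch_mm):
    lsf = np.gradient(esf)                         # differentiate the edge profile
    lsf = lsf * np.hanning(lsf.size)               # taper to reduce noise leakage
    mtf = np.abs(np.fft.rfft(lsf))
    mtf /= mtf[0]                                  # normalize to 1 at zero frequency
    freqs = np.fft.rfftfreq(lsf.size, d=pixel_pitch_mm)  # cycles/mm
    return freqs, mtf

esf = 1.0 / (1.0 + np.exp(-np.linspace(-6, 6, 128)))  # synthetic sigmoid edge
freqs, mtf = mtf_from_edge(esf, pixel_pitch_mm=0.1)
```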

  16. Software Metrics

    DTIC Science & Technology

    1988-12-01

    The software development scene is often characterized by schedule and cost estimates that are grossly inaccurate... SEI... Models covered include the SPQR Model (Jones) and COPMO (Thebaut). T. Capers Jones has developed a software cost estimation model called the Software Productivity, Quality, and Reliability (SPQR) model. The basic approach is similar to that of Boehm's... The time T (in seconds) is simply derived from the effort E by dividing by the Stroud number S: T = E/S. The value
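    A worked sketch of the T = E/S relation in the excerpt, assuming E is Halstead's effort measure (volume V = N log2(n), difficulty D = (n1/2)(N2/n2), E = DV) and the Stroud number S is the conventional 18; the operator/operand counts below are hypothetical.

```python
# Halstead-style time estimate: effort E divided by the Stroud number S.
from math import log2

def halstead_time(n1, n2, N1, N2, stroud=18):
    """n1/n2: distinct operators/operands; N1/N2: total occurrences."""
    n, N = n1 + n2, N1 + N2
    volume = N * log2(n)                # V = N * log2(n)
    difficulty = (n1 / 2) * (N2 / n2)   # D = (n1/2) * (N2/n2)
    effort = difficulty * volume        # E = D * V
    return effort / stroud              # T = E / S, in seconds

print(halstead_time(n1=12, n2=7, N1=27, N2=15))  # hypothetical counts
```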

  17. Improvement of Computer Software Quality through Software Automated Tools.

    DTIC Science & Technology

    1986-08-30

    information that are returned from the tools to the human user, and the forms in which these outputs are presented. STAGE OF DEVELOPMENT: What... AUTOMATED SOFTWARE TOOL MONITORING SYSTEM (Appendix 2), INTRODUCTION: This document and the Automated Software Tool Monitoring Program (Appendix 1) are... Output features provide links from the tool to both the human user and the target machine (where applicable). They describe the types

  18. Adopting software quality measures for healthcare processes.

    PubMed

    Yildiz, Ozkan; Demirörs, Onur

    2009-01-01

    In this study, we investigated the adoptability of software quality measures for healthcare process measurement. Quality measures of ISO/IEC 9126 are redefined from a process perspective to build a generic healthcare process quality measurement model. A case study research method is used, and the model is applied to a public hospital's Entry to Care process. After the application, weak and strong aspects of the process can be easily observed. Access auditability, fault removal, completeness of documentation, and machine utilization are weak aspects, and these aspects are candidates for process improvement. On the other hand, functional completeness, fault ratio, input validity checking, response time, and throughput time are the strong aspects of the process.

  19. A conceptual persistent healthcare quality improvement process for software development management.

    PubMed

    Lin, Jen-Chiun; Su, Mei-Ju; Cheng, Po-Hsun; Weng, Yung-Chien; Chen, Sao-Jie; Lai, Jin-Shin; Lai, Feipei

    2007-01-01

    This paper illustrates a sustained conceptual service quality improvement process for the management of software development within a healthcare enterprise. Our proposed process is revised from Niland's healthcare quality information system (HQIS). This process includes functions to survey satisfaction with system functions, describe operational bylaws on-line, and provide on-demand training. To achieve these goals, we integrated five information systems at National Taiwan University Hospital, including the healthcare information system, the health quality information system, the requirement management system, the executive information system, and the digital learning system, to form a full Deming cycle. A preliminary user satisfaction survey showed that our outpatient information system scored an average of 71.31 in 2006.

  20. Preface to QoIS 2009

    NASA Astrophysics Data System (ADS)

    Comyn-Wattiau, Isabelle; Thalheim, Bernhard

    Quality assurance is a growing research domain within the Information Systems (IS) and Conceptual Modeling (CM) disciplines. Ongoing research on quality in IS and CM is highly diverse and encompasses theoretical aspects, including quality definition and quality models, and practical/empirical aspects, such as the development of methods, approaches and tools for quality measurement and improvement. Current research on quality also includes quality characteristics definitions, validation instruments, methodological and development approaches to quality assurance during software and information systems development, quality monitors, quality assurance during information systems development processes and practices, quality assurance both for data and (meta)schemata, quality support for information systems data import and export, quality of query answering, and cost/benefit analysis of quality assurance processes. Quality assurance also depends on the application area and the specific requirements in applications such as the health sector, logistics, the public sector, the financial sector, manufacturing, services, e-commerce, and software. Furthermore, quality assurance must also be supported for data aggregation, ETL processes, web content management and other multi-layered applications. Quality assurance typically requires resources and therefore entails, besides its benefits, a computational and economic trade-off. It is thus based on compromising between the value of quality data and the cost of quality assurance.

  1. Clinical evaluation of reducing acquisition time on single-photon emission computed tomography image quality using proprietary resolution recovery software.

    PubMed

    Aldridge, Matthew D; Waddington, Wendy W; Dickson, John C; Prakash, Vineet; Ell, Peter J; Bomanji, Jamshed B

    2013-11-01

    A three-dimensional model-based resolution recovery (RR) reconstruction algorithm that compensates for collimator-detector response, resulting in improved reconstructed spatial resolution and signal-to-noise ratio of single-photon emission computed tomography (SPECT) images, was tested. The software is said to retain image quality even with reduced acquisition time. Clinically, any improvement in patient throughput without loss of quality is to be welcomed. Furthermore, future restrictions in radiotracer supplies may add value to this type of data analysis. The aims of this study were to assess the improvement in image quality achieved with the software and to evaluate the potential of performing reduced-time acquisitions for bone and parathyroid SPECT applications. Data acquisition was performed using the local standard SPECT/CT protocols for 99mTc-hydroxymethylene diphosphonate bone and 99mTc-methoxyisobutylisonitrile parathyroid SPECT imaging. The principal modification was the acquisition of an eight-frame gated data set using an ECG simulator with a fixed signal as the trigger. This partitioned the data such that the effect of reduced acquisition time could be assessed without imposing additional scanning time on the patient. The sets of summed data were then independently reconstructed using the RR software to permit a blinded assessment of the effect of acquired counts upon reconstructed image quality, as judged by three experienced observers. Data sets reconstructed with the RR software were compared with the local standard processing protocols, filtered back-projection and ordered-subset expectation-maximization. Thirty SPECT studies were assessed (20 bone and 10 parathyroid). The images reconstructed with the RR algorithm showed improved image quality for both full-time and half-time acquisitions over current local processing protocols (P<0.05). The RR algorithm improved image quality compared with local processing protocols and has been introduced into routine clinical use. SPECT acquisitions are now acquired in half the time previously required. The method of binning the data can be applied to any other camera system to evaluate reduced acquisition times for similar processes. The potential for dose reduction is also inherent in this approach.
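    A sketch of the frame-binning trick reported above: with a fixed-signal "gate", each of the eight bins holds roughly one eighth of the counts, so summing subsets of bins emulates reduced-time acquisitions without rescanning the patient. Array shapes and count levels below are hypothetical.

```python
# Emulate reduced acquisition times by summing subsets of gated frames.
import numpy as np

rng = np.random.default_rng(1)
gated = rng.poisson(5.0, size=(8, 64, 128, 128))  # 8 bins x angles x detector

full_time = gated.sum(axis=0)          # all 8 bins  = 100% acquisition
half_time = gated[:4].sum(axis=0)      # 4 of 8 bins = 50% acquisition
quarter_time = gated[:2].sum(axis=0)   # 2 of 8 bins = 25% acquisition

# Each summed set is then reconstructed independently (e.g. OSEM or the
# resolution-recovery algorithm) and scored by blinded observers.
```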

  2. Are Future Teachers Methodically Trained to Distinguish Good from Bad Educational Software?

    ERIC Educational Resources Information Center

    Pjanic, Karmelita; Hamzabegovic, Jasna

    2016-01-01

    In the era of information technology and the general digitization of society, an influx of every kind of software is evident. However laudable the existence and development of educational software, taking into account its role, its quality, and whether it achieves the desired goal is very important. In addition to programming experts it is…

  3. Using Academia-Industry Partnerships to Enhance Software Verification & Validation Education via Active Learning Tools

    ERIC Educational Resources Information Center

    Acharya, Sushil; Manohar, Priyadarshan; Wu, Peter; Schilling, Walter

    2017-01-01

    Imparting real world experiences in a software verification and validation (SV&V) course is often a challenge due to the lack of effective active learning tools. This pedagogical requirement is important because graduates are expected to develop software that meets rigorous quality standards in functional and application domains. Realizing the…

  4. A Guideline of Using Case Method in Software Engineering Courses

    ERIC Educational Resources Information Center

    Zainal, Dzulaiha Aryanee Putri; Razali, Rozilawati; Shukur, Zarina

    2014-01-01

    Software Engineering (SE) education has been reported to fall short in producing high quality software engineers. In seeking alternative solutions, Case Method (CM) is regarded as having potential to solve the issue. CM is a teaching and learning (T&L) method that has been found to be effective in Social Science education. In principle,…

  5. Applying a Framework to Evaluate Assignment Marking Software: A Case Study on Lightwork

    ERIC Educational Resources Information Center

    Heinrich, Eva; Milne, John

    2012-01-01

    This article presents the findings of a qualitative evaluation on the effect of a specialised software tool on the efficiency and quality of assignment marking. The software, Lightwork, combines with the Moodle learning management system and provides support through marking rubrics and marker allocations. To enable the evaluation a framework has…

  6. 78 FR 292 - Request for Comments and Notice of Roundtable Events for Partnership for Enhancement of Quality...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-03

    ... define the boundaries of the patent property right. Software by its nature is operation-based and is... elements of software are often defined using functional language. While it is permissible to use functional... Software-Related Patents AGENCY: United States Patent and Trademark Office, Commerce. ACTION: Request for...

  7. Common Misconceptions About Service-Oriented Architecture

    DTIC Science & Technology

    2007-11-01

    addition, the architect(s) must make decisions on how services are implemented. Service implementations may involve developing new software, wrapping a... legacy software system, incorporating services provided by third parties, or a combination of these options. Information about the quality attrib... temperature. However, there

  8. Supporting Early Math--Rationales and Requirements for High Quality Software

    ERIC Educational Resources Information Center

    Haake, Magnus; Husain, Layla; Gulz, Agneta

    2015-01-01

    There is substantial evidence that preschooler's performance in early math is highly correlated to math performance throughout school as well as academic skills in general. One way to help children attain early math skills is by using targeted educational software and the paper discusses potential gains of using such software to support early math…

  9. Agile Software Development in Defense Acquisition: A Mission Assurance Perspective

    DTIC Science & Technology

    2012-03-23

    based information retrieval system, we might say that this program works like a hive of bees, going out for pollen and bringing it back to the hive... developers (Six Sigma is registered in the U.S. Patent and Trademark Office by Motorola)... Major Areas in a Typical Software... requirements - capturing and evaluating quality metrics, identifying common problem areas... Despite its positive impact on quality, pair programming

  10. Functional description of the ISIS system

    NASA Technical Reports Server (NTRS)

    Berman, W. J.

    1979-01-01

    Development of software for avionic and aerospace applications (flight software) is influenced by a unique combination of factors: (1) the length of the life cycle of each project; (2) the necessity for cooperation between the aerospace industry and NASA; (3) the need for flight software that is highly reliable; (4) the increasing complexity and size of flight software; and (5) the high quality of the programmers and the tightening of project budgets. The Interactive Software Invocation System (ISIS) described here is designed to overcome the problems created by this combination of factors.

  11. Assuring Software Cost Estimates: Is it an Oxymoron?

    NASA Technical Reports Server (NTRS)

    Hihn, Jarius; Tregre, Grant

    2013-01-01

    The software industry repeatedly observes cost growth of well over 100% even after decades of cost estimation research and well-known best practices, so "What's the problem?" In this paper we provide an overview of the current state of software cost estimation best practice. We then explore whether applying some of the methods used in software assurance might improve the quality of software cost estimates. This paper focuses especially on issues associated with model calibration, estimate review, and the development and documentation of estimates as part of an integrated plan.

  12. Quality in End User Documentation.

    ERIC Educational Resources Information Center

    Morrison, Ronald

    1994-01-01

    Discusses quality in end-user documentation for computer applications and explains four approaches to improving quality in end-user documents. Highlights include online help, usability testing, technical writing elements, statistical approaches, and concepts relating to software quality that are also applicable to user manuals. (LRW)

  13. SIMPATIQCO: a server-based software suite which facilitates monitoring the time course of LC-MS performance metrics on Orbitrap instruments.

    PubMed

    Pichler, Peter; Mazanek, Michael; Dusberger, Frederico; Weilnböck, Lisa; Huber, Christian G; Stingl, Christoph; Luider, Theo M; Straube, Werner L; Köcher, Thomas; Mechtler, Karl

    2012-11-02

    While the performance of liquid chromatography (LC) and mass spectrometry (MS) instrumentation continues to increase, applications such as analyses of complete or near-complete proteomes and quantitative studies require constant and optimal system performance. For this reason, research laboratories and core facilities alike are recommended to implement quality control (QC) measures as part of their routine workflows. Many laboratories perform sporadic quality control checks. However, successive and systematic longitudinal monitoring of system performance would be facilitated by dedicated automatic or semiautomatic software solutions that aid an effortless analysis and display of QC metrics over time. We present the software package SIMPATIQCO (SIMPle AuTomatIc Quality COntrol) designed for evaluation of data from LTQ Orbitrap, Q-Exactive, LTQ FT, and LTQ instruments. A centralized SIMPATIQCO server can process QC data from multiple instruments. The software calculates QC metrics supervising every step of data acquisition from LC and electrospray to MS. For each QC metric the software learns the range indicating adequate system performance from the uploaded data using robust statistics. Results are stored in a database and can be displayed in a comfortable manner from any computer in the laboratory via a web browser. QC data can be monitored for individual LC runs as well as plotted over time. SIMPATIQCO thus assists the longitudinal monitoring of important QC metrics such as peptide elution times, peak widths, intensities, total ion current (TIC) as well as sensitivity, and overall LC-MS system performance; in this way the software also helps identify potential problems. The SIMPATIQCO software package is available free of charge.
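    The robust learned acceptance range described above can be illustrated with a median/MAD band; the +/-3 scaled-MAD convention here is an assumption for illustration, not necessarily SIMPATIQCO's exact rule, and the peak-width values are fabricated.

```python
# Learn an acceptance band for a QC metric from historical runs using the
# median and the median absolute deviation (MAD), which outliers barely move.
import numpy as np

def robust_range(history, k=3.0):
    med = np.median(history)
    mad = np.median(np.abs(history - med))
    half_width = k * 1.4826 * mad  # 1.4826 * MAD estimates sigma for normal data
    return med - half_width, med + half_width

peak_widths = np.array([14.8, 15.1, 15.0, 14.7, 15.3, 15.0, 22.9, 14.9])  # seconds
lo, hi = robust_range(peak_widths)
print([x for x in peak_widths if not lo <= x <= hi])  # flags the 22.9 s run
```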

  14. SIMPATIQCO: A Server-Based Software Suite Which Facilitates Monitoring the Time Course of LC–MS Performance Metrics on Orbitrap Instruments

    PubMed Central

    2012-01-01

    While the performance of liquid chromatography (LC) and mass spectrometry (MS) instrumentation continues to increase, applications such as analyses of complete or near-complete proteomes and quantitative studies require constant and optimal system performance. For this reason, research laboratories and core facilities alike are recommended to implement quality control (QC) measures as part of their routine workflows. Many laboratories perform sporadic quality control checks. However, successive and systematic longitudinal monitoring of system performance would be facilitated by dedicated automatic or semiautomatic software solutions that aid an effortless analysis and display of QC metrics over time. We present the software package SIMPATIQCO (SIMPle AuTomatIc Quality COntrol) designed for evaluation of data from LTQ Orbitrap, Q-Exactive, LTQ FT, and LTQ instruments. A centralized SIMPATIQCO server can process QC data from multiple instruments. The software calculates QC metrics supervising every step of data acquisition from LC and electrospray to MS. For each QC metric the software learns the range indicating adequate system performance from the uploaded data using robust statistics. Results are stored in a database and can be displayed in a comfortable manner from any computer in the laboratory via a web browser. QC data can be monitored for individual LC runs as well as plotted over time. SIMPATIQCO thus assists the longitudinal monitoring of important QC metrics such as peptide elution times, peak widths, intensities, total ion current (TIC) as well as sensitivity, and overall LC–MS system performance; in this way the software also helps identify potential problems. The SIMPATIQCO software package is available free of charge. PMID:23088386

  15. Software Design Improvements. Part 1; Software Benefits and Limitations

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom

    1997-01-01

    Computer hardware and associated software have been used for many years to process accounting information, to analyze test data and to perform engineering analysis. Now computers and software also control everything from automobiles to washing machines, and the number and type of applications are growing at an exponential rate. The size of individual programs has shown similar growth. Furthermore, software and hardware are used to monitor and/or control potentially dangerous products and safety-critical systems. These uses include everything from airplanes and braking systems to medical devices and nuclear plants. The questions are: how can this hardware and software be made more reliable, and how can software quality be improved? What methodology needs to be provided on large and small software products to improve the design, and how can software be verified?

  16. Maintaining Quality and Confidence in Open-Source, Evolving Software: Lessons Learned with PFLOTRAN

    NASA Astrophysics Data System (ADS)

    Frederick, J. M.; Hammond, G. E.

    2017-12-01

    Software evolution in an open-source framework poses a major challenge to a geoscientific simulator, but when properly managed, the pay-off can be enormous for both the developers and the community at large. Developers must juggle implementing new scientific process models, adopting increasingly efficient numerical methods and programming paradigms, and changing funding sources (or a total lack of funding), while also ensuring that legacy code remains functional and reported bugs are fixed in a timely manner. With robust software engineering and a plan for long-term maintenance, a simulator can evolve over time, incorporating and leveraging many advances in the computational and domain sciences. In this positive light, what practices in software engineering and code maintenance can be employed within open-source development to maximize the positive aspects of software evolution and community contributions while minimizing the negative side effects? This presentation discusses steps taken in the development of PFLOTRAN (www.pflotran.org), an open-source, massively parallel subsurface simulator for multiphase, multicomponent, and multiscale reactive flow and transport processes in porous media. As PFLOTRAN's user base and development team continue to grow, it has become increasingly important to implement strategies which ensure sustainable software development while maintaining software quality and community confidence. In this presentation, we share our experiences and "lessons learned" within the context of our open-source development framework and community engagement efforts. Topics discussed include how we have leveraged both standard software engineering principles, such as coding standards, version control, and automated testing (see the sketch below), as well as the unique advantages of object-oriented design in process model coupling, to ensure software quality and confidence. We will also be prepared to discuss the major challenges faced by most open-source software teams, such as on-boarding new developers or one-time contributions, dealing with competitors or lookie-loos, and other downsides of complete transparency, as well as our approach to community engagement, including a user group email list, hosting short courses and workshops for new users, and maintaining a website. SAND2017-8174A
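    A generic regression-test sketch in the spirit of the automated testing mentioned in the abstract; it is not PFLOTRAN's actual test harness, and the driver and baseline values are stand-ins.

```python
# A "gold file" regression test: a fresh run must match a stored baseline
# within tolerances that absorb compiler/platform noise. Runnable with pytest.
import numpy as np

def run_simulation(input_file):
    """Stand-in for invoking the simulator; replace with a real driver."""
    return np.array([0.0, 0.12, 0.47, 0.83, 0.97, 1.0])

def test_breakthrough_matches_gold():
    gold = np.array([0.0, 0.12, 0.47, 0.83, 0.97, 1.0])  # stored baseline values
    new = run_simulation("tracer.in")
    np.testing.assert_allclose(new, gold, rtol=1e-6, atol=1e-12)
```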

  17. Assessment and modeling of the groundwater hydrogeochemical quality parameters via geostatistical approaches

    NASA Astrophysics Data System (ADS)

    Karami, Shawgar; Madani, Hassan; Katibeh, Homayoon; Fatehi Marj, Ahmad

    2018-03-01

    Geostatistical methods are among the advanced techniques used for interpolation of groundwater quality data. The results obtained from geostatistics are useful for decision makers in adopting suitable remedial measures to protect the quality of groundwater sources. Data used in this study were collected from 78 wells in the Varamin plain aquifer, located southeast of Tehran, Iran, in 2013. The ordinary kriging method was used in this study to evaluate groundwater quality parameters. Seven main quality parameters (total dissolved solids (TDS), sodium adsorption ratio (SAR), electrical conductivity (EC), sodium (Na+), total hardness (TH), chloride (Cl-) and sulfate (SO4 2-)) were analyzed and interpreted by statistical and geostatistical methods. After data normalization by the Nscore method in WinGslib software, variography, a geostatistical tool for characterizing spatial correlation, was carried out, and experimental variograms were plotted in GS+ software. Then, the best theoretical model was fitted to each variogram based on the minimum RSS. Cross-validation was used to determine the accuracy of the estimated data. Eventually, estimation maps of groundwater quality were prepared in WinGslib software, and an estimation variance map and an estimation error map were presented to evaluate the quality of the estimation at each estimated point. Results showed that the kriging method is more accurate than traditional interpolation methods.
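    A minimal ordinary-kriging sketch using the open-source pykrige package as a stand-in for the GS+/WinGslib workflow of the paper; the spherical variogram model mirrors the kind of theoretical model fitted there, and the well coordinates and TDS values are fabricated placeholders.

```python
# Ordinary kriging of a groundwater quality parameter onto a regular grid,
# returning both estimates and kriging variance (cf. estimation-error maps).
import numpy as np
from pykrige.ok import OrdinaryKriging

x = np.array([51.60, 51.72, 51.65, 51.80, 51.55])     # well longitudes
y = np.array([35.30, 35.35, 35.42, 35.28, 35.38])     # well latitudes
tds = np.array([820.0, 990.0, 760.0, 1105.0, 880.0])  # TDS, mg/L (fabricated)

ok = OrdinaryKriging(x, y, tds, variogram_model="spherical")
gridx = np.linspace(51.5, 51.85, 40)
gridy = np.linspace(35.25, 35.45, 40)
z_est, ss = ok.execute("grid", gridx, gridy)  # estimates + estimation variance
```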

  18. Free and open source software for the manipulation of digital images.

    PubMed

    Solomon, Robert W

    2009-06-01

    Free and open-source software is a type of software that is nearly as powerful as commercial software but is freely downloadable. This software can do almost everything that the expensive programs can. GIMP (the GNU Image Manipulation Program) is a free program comparable to Photoshop, and versions are available for Windows, Macintosh, and Linux platforms. This article briefly describes how GIMP can be installed and used to manipulate radiology images. It is no longer necessary to budget large amounts of money for high-quality software to achieve the goals of image processing and document creation, because free and open-source software is available for the user to download at will.

  19. Statistical modeling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1992-01-01

    This working paper discusses the statistical simulation part of a controlled software development experiment being conducted under the direction of the System Validation Methods Branch, Information Systems Division, NASA Langley Research Center. The experiment uses guidance and control software (GCS) aboard a fictitious planetary landing spacecraft: real-time control software operating on a transient mission. Software execution is simulated to study the statistical aspects of reliability and other failure characteristics of the software during development, testing, and random usage. Quantification of software reliability is a major goal. Various reliability concepts are discussed. Experiments are described for performing simulations and collecting appropriate simulated software performance and failure data. This data is then used to make statistical inferences about the quality of the software development and verification processes as well as inferences about the reliability of software versions and reliability growth under random testing and debugging.
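    A sketch of the kind of reliability simulation discussed, under the classic Jelinski-Moranda assumption (a modeling choice made here for illustration; the GCS experiment is more elaborate) that the failure rate is proportional to the number of faults remaining, so inter-failure times lengthen as debugging proceeds, i.e. reliability grows.

```python
# Simulate failure times under a Jelinski-Moranda style model: each fault
# contributes rate phi, and a found fault is removed perfectly on detection.
import numpy as np

def simulate_failure_times(n_faults=30, phi=0.05, seed=0):
    rng = np.random.default_rng(seed)
    times, t = [], 0.0
    for remaining in range(n_faults, 0, -1):
        t += rng.exponential(1.0 / (phi * remaining))  # rate = phi * faults left
        times.append(t)
    return np.array(times)

t = simulate_failure_times()
print(np.diff(t)[:3], np.diff(t)[-3:])  # early gaps short, late gaps long
```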

  20. The Assignment of Scale to Object-Oriented Software Measures

    NASA Technical Reports Server (NTRS)

    Neal, Ralph D.; Weistroffer, H. Roland; Coppins, Richard J.

    1997-01-01

    In order to improve productivity (and quality), measurement of specific aspects of software has become imperative. As object-oriented programming languages have become more widely used, metrics designed specifically for object-oriented software are required. Recently a large number of new metrics for object-oriented software have appeared in the literature. Unfortunately, many of these proposed metrics have not been validated to measure what they purport to measure. In this paper fifty (50) of these metrics are analyzed.
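    As a toy illustration of the kind of metric under scrutiny, the sketch below computes depth of inheritance tree (DIT) and a local method count (a crude stand-in for weighted methods per class) for Python classes; whether such counts measure what they purport to is exactly the validation question the paper raises.

```python
# Two simple object-oriented measures computed by introspection.
import inspect

def dit(cls):
    """Depth of inheritance tree: longest path from cls up to object."""
    return max((dit(b) for b in cls.__bases__), default=-1) + 1

def method_count(cls):
    """Number of methods defined locally in the class (inherited ones excluded)."""
    return len([m for m, _ in inspect.getmembers(cls, inspect.isfunction)
                if m in cls.__dict__])

class Sensor:
    def read(self): ...

class CalibratedSensor(Sensor):
    def read(self): ...
    def calibrate(self): ...

print(dit(CalibratedSensor), method_count(CalibratedSensor))  # 2 2
```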

  1. Collaborative Software Development Approach Used to Deliver the New Shuttle Telemetry Ground Station

    NASA Technical Reports Server (NTRS)

    Kirby, Randy L.; Mann, David; Prenger, Stephen G.; Craig, Wayne; Greenwood, Andrew; Morsics, Jonathan; Fricker, Charles H.; Quach, Son; Lechese, Paul

    2003-01-01

    United Space Alliance (USA) developed and used a new software development method to meet the technical, schedule, and budget challenges faced during the development and delivery of the new Shuttle Telemetry Ground Station at Kennedy Space Center. This method, called Collaborative Software Development, enabled KSC to effectively leverage industrial software and build additional capabilities to meet shuttle system and operational requirements. Application of this method resulted in reduced time to market, reduced development cost, improved product quality, and improved programmer competence, while developing technologies of benefit to a small company in California (AP Labs Inc.). Many modifications were made to the baseline software product (VMEwindow), which improved its quality and functionality. In addition, six new software capabilities were developed, which are the subject of this article and add useful functionality to the VMEwindow environment. These new software programs are written in C, some running under VxWorks, and are used in conjunction with other ground station software packages, such as VMEwindow, Matlab, Dataviews, and PVWave. The Space Shuttle Telemetry Ground Station receives frequency-modulated (FM) and pulse-code-modulated (PCM) signals from the shuttle and support equipment. The hardware architecture includes Sun workstations connected to multiple PCM- and FM-processing VersaModule Eurocard (VME) chassis. A reflective memory network transports raw data from PCM Processors (PCMPs) to the programmable digital-to-analog (D/A) converters, strip chart recorders, and analysis and controller workstations.

  2. A Proposed Defect Tracking Model for Classifying the Inserted Defect Reports to Enhance Software Quality Control

    PubMed Central

    Khedr, Ayman E.; Sayed, Mostafa

    2013-01-01

    Defect tracking systems play an important role in software development organizations, as they can store historical information about defects. There has been much research on defect tracking models and systems to enhance their capabilities, make tracking more specific, and adapt them to new technology. Furthermore, different studies have classified bugs in a step-by-step method to obtain a clear, applicable approach to detecting such bugs. This paper shows a new proposed defect tracking model for classifying inserted defect reports in a step-by-step method to further enhance software quality. PMID:24039334

  3. Do Acute Myocardial Infarction and Heart Failure Readmissions Flagged as Potentially Preventable by the 3M Potentially Preventable Readmissions Software Have More Process-of-Care Problems?

    PubMed

    Borzecki, Ann M; Chen, Qi; Mull, Hillary J; Shwartz, Michael; Bhatt, Deepak L; Hanchate, Amresh; Rosen, Amy K

    2016-09-01

    The 3M Potentially Preventable Readmissions (3M-PPR) software matches clinically related index admission and readmission diagnoses that may signify in-hospital or postdischarge quality problems. To assess whether the PPR algorithm identifies preventable readmissions, we compared processes of care between PPR software-flagged and nonflagged cases. Using 2006 to 2010 national VA administrative data, we identified acute myocardial infarction and heart failure discharges associated with 30-day all-cause readmissions, then flagged cases (PPR-Yes/PPR-No) using the 3M-PPR software. To assess care quality, we abstracted medical records of 100 readmissions per condition using tools containing explicit processes organized into admission work-up, in-hospital evaluation/treatment, discharge readiness, and the postdischarge period. We derived quality scores, scaled to a maximum of 25 per section (maximum total score=100), and compared cases on total and section-specific mean scores. For acute myocardial infarction, 77 of 100 cases were flagged as PPR-Yes. Section quality scores were highest for in-hospital evaluation/treatment (20.5±2.8) and lowest for postdischarge care (6.8±9.1). Total and section-related mean scores did not differ by PPR status; respective PPR-Yes versus PPR-No total scores were 61.6±11.1 and 60.4±9.4; P=0.98. For heart failure, 86 of 100 cases were flagged as PPR-Yes. Section scores were highest for discharge readiness (18.8±2.4) and lowest for postdischarge care (7.3±8.1). As with acute myocardial infarction, total and section-related mean scores did not differ by PPR status; PPR-Yes versus PPR-No total scores were 61.2±10.8 and 63.4±7.0, respectively; P=0.47. Among VA acute myocardial infarction and heart failure readmissions, the 3M-PPR software does not distinguish differences in case-level quality of care. Whether 3M-PPR software better identifies preventable readmissions by using other methods to capture poorly documented processes or by performing different comparisons requires further study. © 2016 American Heart Association, Inc.

  4. Software Writing Skills for Your Research - Lessons Learned from Workshops in the Geosciences

    NASA Astrophysics Data System (ADS)

    Hammitzsch, Martin

    2016-04-01

    Findings presented in scientific papers are based on data and software. Once in a while they come along with data, but rarely with software. However, the software used to obtain findings plays a crucial role in scientific work. Nevertheless, software is rarely seen as publishable. Researchers may thus be unable to reproduce the findings without the software, which conflicts with the principle of reproducibility in science. For both the writing of publishable software and the reproducibility issue, the quality of software is of utmost importance. For many programming scientists the treatment of source code, e.g. code design, version control, documentation, and testing, is seen as additional work not covered by the primary research task. This includes the adoption of processes following the software development life cycle. However, the adoption of software engineering rules and best practices has to be recognized and accepted as part of the scientific performance. Most scientists have little incentive to improve code and do not publish code, because software engineering habits are rarely practised by researchers or students. Software engineering skills are not passed on to followers as paper-writing skills are. Thus it is often felt that the software or code produced is not publishable. The quality of software and its source code has a decisive influence on the quality of research results obtained and their traceability. Establishing best practices from software engineering to serve scientific needs is therefore crucial for the success of scientific software. Even though scientists use existing software and code, e.g. from open-source software repositories, only a few contribute their code back to the repositories. Writing and opening code for Open Science means that subsequent users are able to run the code, e.g. through the provision of sufficient documentation, sample data sets, tests, and comments, which in turn can be proven by adequate and qualified reviews. This assumes that scientists learn to write and release code and software as they learn to write and publish papers. With this in mind, software could be valued and assessed as a contribution to science. But this requires the relevant skills, which can then be passed to colleagues and followers. Therefore, the GFZ German Research Centre for Geosciences ran three workshops in 2015 to address the passing of software writing skills to young scientists, the next generation of researchers in the Earth, planetary and space sciences. Experiences in running these workshops and the lessons learned are summarized in this presentation. The workshops received support and funding from Software Carpentry, a volunteer organization whose goal is to make scientists more productive, and their work more reliable, by teaching them basic computing skills, and from FOSTER (Facilitate Open Science Training for European Research), a two-year, EU-funded (FP7) project whose goal is to produce a Europe-wide training programme that will help to incorporate Open Access approaches into existing research methodologies and to integrate Open Science principles and practice into the current research workflow by targeting young researchers and other stakeholders.

  5. Improving software quality - The use of formal inspections at the Jet Propulsion Laboratory

    NASA Technical Reports Server (NTRS)

    Bush, Marilyn

    1990-01-01

    The introduction of software formal inspections (Fagan inspections) at JPL for finding and fixing defects early in the software development life cycle is reviewed. It is estimated that, by the year 2000, the software share of some development efforts will rise to as much as 80 percent of the total. Software problems are especially important at NASA, as critical flight software must be error-free. It is shown that formal inspections are particularly effective at finding and removing defects having to do with clarity, correctness, consistency, and completeness. A very significant discovery was that code audits were not as effective at finding defects as code inspections.

  6. The Fifth Annual NASA/Contractors Conference on Quality and Productivity. Quality: A Commitment to the Future

    NASA Technical Reports Server (NTRS)

    1989-01-01

    This report summarizes the 5th NASA/Contractors Conference on Quality and Productivity, whose theme was 'Quality: A Commitment to the Future'. The summary highlights the key points: commitment to quality, strategic and long-range planning, risk management, teaming, quality measurement, creating a quality environment, contract incentives, and software quality and reliability.

  7. Alberta Education's Clearinghouse: Functions and Findings.

    ERIC Educational Resources Information Center

    Wighton, David

    1984-01-01

    Discusses functions of the Alberta (Canada) Computer Technology Project's courseware clearinghouse, reviews findings on instructional software quality, identifies software development trends, and discusses need for support systems to facilitate the incorporation of computer assisted instruction in Canadian schools. (MBR)

  8. A Discussion of the Software Quality Assurance Role

    NASA Technical Reports Server (NTRS)

    Kandt, Ronald Kirk

    2010-01-01

    The basic idea underlying this paper is that the conventional understanding of the role of a Software Quality Assurance (SQA) engineer is unduly limited. This is because few have asked who the customers of a SQA engineer are. Once you do this, you can better define what tasks a SQA engineer should perform, as well as identify the knowledge and skills that such a person should have. The consequence of doing this is that a SQA engineer can provide greater value to his or her customers. It is the position of this paper that a SQA engineer providing significant value to his or her customers must not only assume the role of an auditor, but also that of a software and systems engineer. This is because software engineers and their managers particularly value contributions that directly impact products and their development. These ideas are summarized as lessons learned, based on my experience at Jet Propulsion Laboratory (JPL).

  9. On the release of cppxfel for processing X-ray free-electron laser images.

    PubMed

    Ginn, Helen Mary; Evans, Gwyndaf; Sauter, Nicholas K; Stuart, David Ian

    2016-06-01

    As serial femtosecond crystallography expands towards a variety of delivery methods, including chip-based methods, and smaller collected data sets, the requirement to optimize the data analysis to produce maximum structure quality is becoming increasingly pressing. Here cppxfel, a software package primarily written in C++ that showcases several data analysis techniques, is released. This software package presently indexes images using DIALS (diffraction integration for advanced light sources) and performs an initial orientation matrix refinement, followed by post-refinement of individual images against a reference data set. Cppxfel is released with the hope that the unique and useful elements of this package can be repurposed for existing software packages. However, as released, it produces high-quality crystal structures and is therefore likely also to be useful to experienced users of X-ray free-electron laser (XFEL) software who wish to maximize the information extracted from a limited number of XFEL images.

  10. On the release of cppxfel for processing X-ray free-electron laser images

    DOE PAGES

    Ginn, Helen Mary; Evans, Gwyndaf; Sauter, Nicholas K.; ...

    2016-05-11

    As serial femtosecond crystallography expands towards a variety of delivery methods, including chip-based methods, and smaller collected data sets, the requirement to optimize the data analysis to produce maximum structure quality is becoming increasingly pressing. Here cppxfel, a software package primarily written in C++, which showcases several data analysis techniques, is released. This software package presently indexes images using DIALS (diffraction integration for advanced light sources) and performs an initial orientation matrix refinement, followed by post-refinement of individual images against a reference data set. Cppxfel is released with the hope that the unique and useful elements of this package can be repurposed for existing software packages. However, as released, it produces high-quality crystal structures and is therefore likely to be also useful to experienced users of X-ray free-electron laser (XFEL) software who wish to maximize the information extracted from a limited number of XFEL images.

  11. The business process management software for successful quality management and organization: A case study from the University of Split School of Medicine.

    PubMed

    Sapunar, Damir; Grković, Ivica; Lukšić, Davor; Marušić, Matko

    2016-05-01

    Our aim was to describe a comprehensive model of internal quality management (QM) at a medical school founded on a business process analysis (BPA) software tool. The BPA software tool was used as the core element for describing all working processes in our medical school, and the resulting system served as a comprehensive model of internal QM. The quality management system at the University of Split School of Medicine included the documentation and analysis of all business processes within the School. The analysis revealed 80 weak points related to one or several business processes. A precise analysis of medical school business processes allows identification of unfinished, unclear, and inadequate points in these processes and, subsequently, the respective improvements, an increase in the QM level, and ultimately a rationalization of the institution's work. Our approach offers a potential reference model for the development of a common QM framework allowing continuous quality control, i.e. adjustment and adaptation to the contemporary educational needs of medical students. Copyright © 2016 by Academy of Sciences and Arts of Bosnia and Herzegovina.

  12. MapEdit: solution to continuous raster map creation

    NASA Astrophysics Data System (ADS)

    Rančić, Dejan; Djordjević-Kajan, Slobodanka

    2003-03-01

    The paper describes MapEdit, MS Windows™ software for georeferencing and rectification of scanned paper maps. The software produces continuous raster maps which can be used as backgrounds in geographical information systems. The process of continuous raster map creation using MapEdit's "mosaicking" function is also described, as are the georeferencing and rectification algorithms used in MapEdit. Our approach to georeferencing and rectification, using four control points and two linear transformations for each scanned map part together with the nearest neighbor resampling method, represents a low-cost, high-speed solution that produces continuous raster maps of satisfactory quality for many purposes (±1 pixel). A quality assessment of several continuous raster maps at different scales, created using our software and methodology, has been undertaken and the results are presented in the paper. For quality control of the produced raster maps we referred to three widely adopted standards: the US Standard for Digital Cartographic Data, the National Standard for Spatial Data Accuracy, and the US National Map Accuracy Standard. The results obtained during the quality assessment process are given in the paper and show that our maps meet all three standards.
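
    The per-sheet rectification idea above (control points defining a linear pixel-to-ground mapping, followed by nearest-neighbor lookup) can be sketched compactly. The Python fragment below is an illustration under simplifying assumptions, a single least-squares affine fit rather than MapEdit's two linear transformations per map part, and is not the authors' implementation.

```python
import numpy as np

def fit_affine(pixel_pts, ground_pts):
    """Least-squares affine transform from control points.

    pixel_pts, ground_pts: (N, 2) arrays of corresponding coordinates
    (N >= 3; MapEdit uses four points per scanned map part).
    Returns a 2x3 matrix A with ground ~= A @ [x, y, 1].
    """
    px = np.asarray(pixel_pts, dtype=float)
    gd = np.asarray(ground_pts, dtype=float)
    design = np.hstack([px, np.ones((len(px), 1))])       # (N, 3)
    coeffs, *_ = np.linalg.lstsq(design, gd, rcond=None)  # (3, 2)
    return coeffs.T                                       # (2, 3)

def rectify_nearest(image, affine, out_shape):
    """Fill an output grid (assumed here to be in ground units, one unit
    per pixel) by nearest-neighbor lookup into the source scan."""
    inv = np.linalg.inv(np.vstack([affine, [0.0, 0.0, 1.0]]))  # ground -> pixel
    h, w = out_shape
    ys, xs = np.mgrid[0:h, 0:w]
    coords = np.vstack([xs.ravel(), ys.ravel(), np.ones(h * w)])
    src = inv @ coords                                    # source pixel positions
    sx = np.clip(np.rint(src[0]).astype(int), 0, image.shape[1] - 1)
    sy = np.clip(np.rint(src[1]).astype(int), 0, image.shape[0] - 1)
    return image[sy, sx].reshape(h, w)
```

    Nearest-neighbor lookup copies original pixel values unchanged, which is why the method is fast and why residual errors stay on the order of ±1 pixel.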

  13. Section 3: Quality and Value-Based Requirements

    NASA Astrophysics Data System (ADS)

    Mylopoulos, John

    Traditionally, research and practice in software engineering have focused their attention on specific software qualities, such as functionality and performance. According to this perspective, a system is deemed to be of good quality if it delivers all required functionality (“fitness-for-purpose”) and its performance is above required thresholds. Increasingly, primarily in research but also in practice, other qualities are attracting attention. To facilitate evolution, maintainability and adaptability are gaining popularity. Usability, universal accessibility, innovativeness, and enjoyability are being studied as novel types of non-functional requirements that we do not know how to define, let alone accommodate, but which we realize are critical under some contingencies. The growing importance of the business context in the design of software-intensive systems has also thrust economic value, legal compliance, and potential social and ethical implications into the forefront of requirements topics. A focus on the broader user environment and experience, as well as the organizational and societal implications of system use, has thus become more central to the requirements discourse. This section includes three contributions to this broad and increasingly important topic.

  14. Development of Software Sensors for Determining Total Phosphorus and Total Nitrogen in Waters

    PubMed Central

    Lee, Eunhyoung; Han, Sanghoon; Kim, Hyunook

    2013-01-01

    Total nitrogen (TN) and total phosphorus (TP) concentrations are important parameters for assessing the quality of water bodies and are used as criteria to regulate the water quality of the effluent from a wastewater treatment plant (WWTP) in Korea. Therefore, continuous monitoring of TN and TP using in situ instruments is conducted nationwide in Korea. However, most in situ instruments on the market are expensive and require a time-consuming sample pretreatment step, which hinders the widespread use of in situ TN and TP monitoring. In this study, therefore, software sensors based on multiple regression of a few easily measurable in situ water quality parameters were applied to estimate the TN and TP concentrations in a stream, a lake, combined sewer overflows (CSOs), and WWTP effluent. In general, the developed software sensors predicted TN and TP concentrations of the WWTP effluent and CSOs reasonably well. However, they showed relatively lower predictability for TN and TP concentrations of stream and lake waters, possibly because the water quality of streams and lakes is more variable than that of WWTP effluent or CSOs. PMID:23307350
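
    The software-sensor idea above amounts to a multiple regression from inexpensive in situ measurements to the regulated analytes. A minimal sketch in Python follows; the predictor names (turbidity, conductivity, temperature) and all numbers are hypothetical stand-ins, since the abstract does not list the exact parameters used.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical training data: rows are samples, columns are easily
# measured in situ parameters (turbidity, conductivity, temperature).
X_train = np.array([
    [12.0, 350.0, 18.2],
    [30.5, 410.0, 21.0],
    [ 8.1, 300.0, 15.4],
    [22.7, 390.0, 19.8],
])
tp_train = np.array([0.15, 0.42, 0.09, 0.31])  # lab-measured TP (mg/L)

# Fit the "software sensor": one multiple regression per target analyte.
tp_sensor = LinearRegression().fit(X_train, tp_train)

# Estimate TP from a new in situ reading, with no wet-chemistry pretreatment.
x_new = np.array([[18.3, 370.0, 20.1]])
print(f"estimated TP: {tp_sensor.predict(x_new)[0]:.3f} mg/L")
```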

  15. 3D Vectorial Time Domain Computational Integrated Photonics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kallman, J S; Bond, T C; Koning, J M

    2007-02-16

    The design of integrated photonic structures poses considerable challenges. 3D time-domain design tools are fundamental in enabling technologies such as all-optical logic, photonic bandgap sensors, THz imaging, and fast radiation diagnostics. Such technologies are essential to LLNL and WFO sponsors for a broad range of applications: encryption for communications and surveillance sensors (NSA, NAI and IDIV/PAT); high-density optical interconnects for high-performance computing (ASCI); high-bandwidth instrumentation for NIF diagnostics; micro-sensor development for weapon miniaturization within the Stockpile Stewardship and DNT programs; and applications within HSO for CBNP detection devices. While a number of photonics simulation tools exist on the market, they primarily model devices of interest to the communications industry. We saw the need to extend our previous software to match the Laboratory's unique emerging needs. These include modeling novel material effects (such as those of radiation-induced carrier concentrations on refractive index) and device configurations (RadTracker bulk optics with radiation-induced details, Optical Logic edge-emitting lasers with lateral optical inputs). In addition, we foresaw significant advantages to expanding our own internal simulation codes: parallel supercomputing could be incorporated from the start, and the simulation source code would be accessible for modification and extension. This work addressed Engineering's Simulation Technology Focus Area, specifically photonics. Problems addressed from the Engineering roadmap of the time included modeling the Auston switch (an important THz source/receiver), modeling Vertical Cavity Surface Emitting Lasers (VCSELs, which had been envisioned as part of fast radiation sensors), and multi-scale modeling of optical systems (for a variety of applications). We proposed to develop novel techniques to numerically solve the 3D multi-scale propagation problem both for the microchip laser logic devices and for devices characterized by electromagnetic (EM) propagation in nonlinear materials with time-varying parameters. The deliverables for this project were extended versions of the laser logic device code Quench2D and the EM propagation code EMsolve, with new modules containing the novel solutions, incorporated by taking advantage of the existing software interface and structured computational modules. Our approach was multi-faceted, since no single methodology can always satisfy the tradeoff between model runtime and accuracy requirements. We divided the problems to be solved into two main categories: those that required full-wave methods and those that could be modeled using approximate methods. Full-wave techniques are useful in situations where Maxwell's equations are not separable (or the problem is small in space and time), while approximate techniques can treat many of the remaining cases.
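
    As a toy illustration of the full-wave, time-domain approach contrasted above with approximate methods, the following one-dimensional FDTD loop (standard Yee updates in normalized units, Courant number 1) shows the basic structure such codes iterate. The LLNL codes named above (Quench2D, EMsolve) are far more general and are not reproduced here.

```python
import numpy as np

# Minimal 1D FDTD (Yee) scheme: E and H staggered in space and time,
# normalized units so the update coefficients are 1 in vacuum.
nx, nt = 400, 1000
ez = np.zeros(nx)        # electric field on the main grid
hy = np.zeros(nx - 1)    # magnetic field on the staggered grid
eps = np.ones(nx)        # relative permittivity profile
eps[250:300] = 4.0       # a dielectric slab, e.g. a waveguide section

for t in range(nt):
    hy += np.diff(ez)                          # update H from the curl of E
    ez[1:-1] += np.diff(hy) / eps[1:-1]        # update E from the curl of H
    ez[50] += np.exp(-((t - 60) / 20.0) ** 2)  # soft Gaussian source

print("peak |Ez| after propagation:", np.abs(ez).max())
```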

  16. Gaining Control and Predictability of Software-Intensive Systems Development and Sustainment

    DTIC Science & Technology

    2015-02-04

    implementation of the baselines, audits, and technical reviews within an overarching systems engineering process (SEP; Defense Acquisition University...warfighters' needs. This management and metrics effort supplements and supports the system's technical development through the baselines, audits and...other areas that could be researched and added into the nine-tier model. Areas including software metrics, quality assurance, software-oriented

  17. The Alignment of Software Testing Skills of IS Students with Industry Practices--A South African Perspective

    ERIC Educational Resources Information Center

    Scott, Elsje; Zadirov, Alexander; Feinberg, Sean; Jayakody, Ruwanga

    2004-01-01

    Software testing is a crucial component in the development of good quality systems in industry. For this reason it was considered important to investigate the extent to which the Information Systems (IS) syllabus at the University of Cape Town (UCT) was aligned with accepted software testing practices in South Africa. For students to be effective…

  18. The Missing Link: The Use of Link Words and Phrases as a Link to Manuscript Quality

    ERIC Educational Resources Information Center

    Onwuegbuzie, Anthony J.

    2016-01-01

    In this article, I provide a typology of transition words/phrases. This typology comprises 12 dimensions of link words/phrases that capture 277 link words/phrases. Using QDA Miner, WordStat, and SPSS--a computer-assisted mixed methods data analysis software, content analysis software, and statistical software, respectively--I analyzed 74…

  19. User Documentation for Multiple Software Releases

    NASA Technical Reports Server (NTRS)

    Humphrey, R.

    1982-01-01

    In the proposed solution to the problems of frequent software releases and updates, documentation would be divided into smaller packages, each of which contains data relating to only one of several software components. Changes would not affect the entire document. The concept would improve dissemination of information regarding changes and would improve the quality of data supporting packages. It would help to ensure both timeliness and more thorough scrutiny of changes.

  20. Technology Infusion of CodeSonar into the Space Network Ground Segment (RII07)

    NASA Technical Reports Server (NTRS)

    Benson, Markland

    2008-01-01

    The NASA Software Assurance Research Program (in part) performs studies as to the feasibility of technologies for improving the safety, quality, reliability, cost, and performance of NASA software. This study considers the application of commercial automated source code analysis tools to mission critical ground software that is in the operations and sustainment portion of the product lifecycle.

  1. Software Issues at the User Interface

    DTIC Science & Technology

    1991-05-01

    successful integration of parallel computers into mainstream scientific computing. Clearly a compiler is the most important software tool available to a...Computer Science University of Colorado Boulder, CO 80309 ABSTRACT We review software issues that are critical to the successful integration of parallel...The development of an optimizing compiler of this quality, addressing communication instructions as well as computational instructions, is a major

  2. Testing in Service-Oriented Environments

    DTIC Science & Technology

    2010-03-01

    software releases (versions, service packs, vulnerability patches) for one common ESB during the 13-month period from January 1, 2008 through...impact on quality of service: Unlike traditional software components, a single instance of a web service can be used by multiple consumers. Since the...distributed, with heterogeneous hardware and software (SOA infrastructure, services, operating systems, and databases). Because of cost and security, it

  3. The cleanroom case study in the Software Engineering Laboratory: Project description and early analysis

    NASA Technical Reports Server (NTRS)

    Green, Scott; Kouchakdjian, Ara; Basili, Victor; Weidow, David

    1990-01-01

    This case study analyzes the application of the cleanroom software development methodology to the development of production software at the NASA/Goddard Space Flight Center. The cleanroom methodology emphasizes human discipline in program verification to produce reliable software products that are right the first time. Preliminary analysis of the cleanroom case study shows that the method can be applied successfully in the FDD environment and may increase staff productivity and product quality. Compared to typical Software Engineering Laboratory (SEL) activities, there is evidence of lower failure rates, a more complete and consistent set of inline code documentation, a different distribution of phase effort activity, and a different growth profile in terms of lines of code developed. The major goals of the study were to: (1) assess the process used in the SEL cleanroom model with respect to team structure, team activities, and effort distribution; (2) analyze the products of the SEL cleanroom model and determine the impact on measures of interest, including reliability, productivity, overall life-cycle cost, and software quality; and (3) analyze the residual products in the application of the SEL cleanroom model, such as fault distribution, error characteristics, system growth, and computer usage.

  4. TriBITS lifecycle model. Version 1.0, a lean/agile software lifecycle model for research-based computational science and engineering and applied mathematical software.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Willenbring, James M.; Bartlett, Roscoe Ainsworth; Heroux, Michael Allen

    2012-01-01

    Software lifecycles are becoming an increasingly important issue for computational science and engineering (CSE) software. The process by which a piece of CSE software begins life as a set of research requirements and then matures into a trusted high-quality capability is both commonplace and extremely challenging. Although an implicit lifecycle is obviously being used in any effort, the challenges of this process - respecting the competing needs of research vs. production - cannot be overstated. Here we describe a proposal for a well-defined software lifecycle process based on modern Lean/Agile software engineering principles. What we propose is appropriate for many CSE software projects that are initially heavily focused on research but also are expected to eventually produce usable high-quality capabilities. The model is related to TriBITS, a build, integration and testing system, which serves as a strong foundation for this lifecycle model, and aspects of this lifecycle model are ingrained in the TriBITS system. Here, we advocate three to four phases or maturity levels that address the appropriate handling of many issues associated with the transition from research to production software. The goals of this lifecycle model are to better communicate maturity levels with customers and to help to identify and promote Software Engineering (SE) practices that will help to improve productivity and produce better software. An important collection of software in this domain is Trilinos, which is used as the motivation and the initial target for this lifecycle model. However, many other related and similar CSE (and non-CSE) software projects can also make good use of this lifecycle model, especially those that use the TriBITS system. Indeed this lifecycle process, if followed, will enable large-scale sustainable integration of many complex CSE software efforts across several institutions.

  5. [Physical Activity in the Context of Workplace Health Promotion: A Systematic Review on the Effectiveness of Software-Based in Contrast to Personal-Based Interventions].

    PubMed

    Rudolph, Sabrina; Göring, Arne; Padrok, Dennis

    2018-01-03

    Sports and physical activity interventions are attracting considerable attention in the context of workplace health promotion. Owing to increasing digitalization, software-based interventions that promote physical activity are gaining acceptance in practice. Empirical evidence concerning the effectiveness of software-based interventions in the context of workplace health promotion is still limited. This paper examines whether software-based interventions are more effective than personal-based interventions in increasing the level of physical activity. A systematic review according to the specifications of the Cochrane Collaboration was conducted. Inclusion criteria and should-have criteria were defined, and the quality score of each study was calculated from the should-have criteria. The software-based and personal-based interventions are presented in two tables with the categories author, year, country, sample group, aim of the intervention, methods, outcome, and study quality. A total of 25 studies were included in the evaluation (12 personal-based and 13 software-based interventions). The quality scores of the studies are heterogeneous, ranging from 3 to 9 points. Five personal-based and five software-based studies reported an increase in physical activity. Other positive health effects were reported in the studies, for example a reduction in blood pressure or body mass index. A few studies did not show any improvement in health-related parameters. This paper demonstrates that positive effects can be achieved with both intervention types. Software-based interventions show advantages due to the use of new technologies: desktop or mobile applications facilitate organization, communication, and data acquisition with fewer resources. A schooled trainer, on the other hand, is able to react to the specific and varying needs of the employees. This aspect should be considered very significant. © Georg Thieme Verlag KG Stuttgart · New York.

  6. Managing quality and compliance.

    PubMed

    McNeil, Alice; Koppel, Carl

    2015-01-01

    Critical care nurses assume vital roles in maintaining patient care quality. There are distinct facets to the process including standard setting, regulatory compliance, and completion of reports associated with these endeavors. Typically, multiple niche software applications are required and user interfaces are varied and complex. Although there are distinct quality indicators that must be tracked as well as a list of serious or sentinel events that must be documented and reported, nurses may not know the precise steps to ensure that information is properly documented and actually reaches the proper authorities for further investigation and follow-up actions. Technology advances have permitted the evolution of a singular software platform, capable of monitoring quality indicators and managing all facets of reporting associated with regulatory compliance.

  7. Evaluating the feasibility of using online software to collect patient information in a chiropractic practice-based research network.

    PubMed

    Kania-Richmond, Ania; Weeks, Laura; Scholten, Jeffrey; Reney, Mikaël

    2016-03-01

    Practice-based research networks (PBRNs) are increasingly used as a tool for evidence-based practice. We developed and tested the feasibility of using software to enable online collection of patient data within a chiropractic PBRN to support clinical decision making and research in participating clinics. The objective was to assess the feasibility of using online software to collect quality patient information. The study consisted of two phases: 1) assessment of the quality of information provided, using a standardized form; and 2) exploration of patients' perspectives and experiences regarding online information provision through semi-structured interviews. Data analysis was descriptive. Forty-five new patients were recruited. Thirty-six completed online forms, which were submitted by an appropriate person 100% of the time, with an error rate of less than 1%, and submitted in a timely manner 83% of the time. Twenty-one participants were interviewed. Overall, online forms were preferred given their perceived security, ease of use, and support for providing more accurate information. Use of online software is feasible, provides high-quality information, and is preferred by most participants. A pen-and-paper format should be available for patients with this preference and in case of technical difficulties.

  8. [Development of integrated support software for clinical nutrition].

    PubMed

    Siquier Homar, Pedro; Pinteño Blanco, Manel; Calleja Hernández, Miguel Ángel; Fernández Cortés, Francisco; Martínez Sotelo, Jesús

    2015-09-01

    To develop an integrated computer software application for specialized nutritional support, integrated in the electronic clinical record, which automatically detects, at an early stage, patients who are undernourished or at risk of developing undernourishment, determining points of opportunity for improvement and evaluating the results. The quality standards published by the Nutrition Work Group of the Spanish Society of Hospital Pharmacy (SEFH) and the recommendations by the Pharmacy Group of the Spanish Society of Parenteral and Enteral Nutrition (SENPE) were taken into account. According to these quality standards, nutritional support has to include the following healthcare stages or sub-processes: nutritional screening, nutritional assessment, plan for nutritional care, prescription, preparation, and administration. The software allows a specific nutritional assessment to be conducted, in an automated way, for patients at nutritional risk, implementing, if necessary, a nutritional treatment plan, conducting follow-up and traceability of outcomes derived from the implementation of improvement actions, and quantifying to what extent our practice approaches the established standard. The software allows specialized nutritional support to be standardized from a multidisciplinary point of view, introducing the concept of quality control per process, and placing the patient as the main customer. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.

  9. The maturing of the quality improvement paradigm in the SEL

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.

    1993-01-01

    The Software Engineering Laboratory (SEL) uses a paradigm for improving the software process and product, called the quality improvement paradigm. This paradigm has evolved over the past 18 years, along with our software development processes and products. Since 1976, when we first began the SEL, we have learned a great deal about improving the software process and product, making a great many mistakes along the way. The quality improvement paradigm, as it is currently defined, can be broken up into six steps: characterize the current project and its environment with respect to the appropriate models and metrics; set the quantifiable goals for successful project performance and improvement; choose the appropriate process model and supporting methods and tools for this project; execute the processes, construct the products, and collect, validate, and analyze the data to provide real-time feedback for corrective action; analyze the data to evaluate the current practices, determine problems, record findings, and make recommendations for future project improvements; and package the experience gained in the form of updated and refined models and other forms of structured knowledge gained from this and prior projects, and save it in an experience base to be reused on future projects.

  10. HeatmapGenerator: high performance RNAseq and microarray visualization software suite to examine differential gene expression levels using an R and C++ hybrid computational pipeline.

    PubMed

    Khomtchouk, Bohdan B; Van Booven, Derek J; Wahlestedt, Claes

    2014-01-01

    The graphical visualization of gene expression data using heatmaps has become an integral component of modern-day medical research. Heatmaps are used extensively to plot quantitative differences in gene expression levels, such as those measured with RNAseq and microarray experiments, to provide qualitative large-scale views of the transcriptomic landscape. Creating high-quality heatmaps is a computationally intensive task, often requiring considerable programming experience, particularly for customizing features to the specific dataset at hand. Software to create publication-quality heatmaps is developed with the R programming language, the C++ programming language, and the OpenGL application programming interface (API) to produce industry-grade high-performance graphics. We create a graphical user interface (GUI) software package called HeatmapGenerator for Windows OS and Mac OS X as an intuitive, user-friendly alternative for researchers with minimal prior coding experience, allowing them to create publication-quality heatmaps using R graphics without sacrificing their desired level of customization. The simplicity of HeatmapGenerator is that it only requires the user to upload a preformatted input file and download the publicly available R software language, among a few other operating-system-specific requirements. Advanced features such as color, text labels, scaling, legend construction, and even database storage can be easily customized with no prior programming knowledge. We provide an intuitive and user-friendly software package, HeatmapGenerator, to create high-quality, customizable heatmaps generated using the high-resolution color graphics capabilities of R. The software is available for Microsoft Windows and Apple Mac OS X. HeatmapGenerator is released under the GNU General Public License and publicly available at: http://sourceforge.net/projects/heatmapgenerator/. The Mac OS X direct download is available at: http://sourceforge.net/projects/heatmapgenerator/files/HeatmapGenerator_MAC_OSX.tar.gz/download. The Windows OS direct download is available at: http://sourceforge.net/projects/heatmapgenerator/files/HeatmapGenerator_WINDOWS.zip/download.
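
    Since HeatmapGenerator drives R graphics through a GUI, the sketch below is only an analogous illustration of the underlying operation, rendering a genes-by-samples matrix as a color-mapped image, written in Python/matplotlib with made-up data.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical expression matrix: rows = genes, columns = samples.
rng = np.random.default_rng(0)
data = rng.normal(size=(20, 6))
genes = [f"gene{i}" for i in range(20)]
samples = [f"s{j}" for j in range(6)]

fig, ax = plt.subplots(figsize=(4, 6))
im = ax.imshow(data, cmap="RdBu_r", aspect="auto")  # diverging map for up/down regulation
ax.set_xticks(range(len(samples)))
ax.set_xticklabels(samples)
ax.set_yticks(range(len(genes)))
ax.set_yticklabels(genes, fontsize=6)
fig.colorbar(im, ax=ax, label="expression (arbitrary units)")
fig.savefig("heatmap.png", dpi=300)  # high-resolution raster output
```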

  11. Software Tech News. The DoD Source for Software Technology Information. Software Testing Series: Part 2, Volume 3, Number 3, 1999

    DTIC Science & Technology

    1999-01-01

    published in December of 1998. In addition, Mr. Drake is the author of a theme article entitled: "Measuring Software Quality: A Case Study...and services may run on different platforms in differing combinations, • Partial application failure (e.g., a client running, service down) is...result in a combined utility function that is some aggregation of the underlying utility functions. The benefit a client receives from a service

  12. A Brief Study of Software Engineering Professional Continuing Education in DoD Acquisition

    DTIC Science & Technology

    2010-04-01

    Lifecycle Processes (IEEE 12207) (810) 37% 61% 2% Guide to the Software Engineering Body of Knowledge (SWEBOK) (804) 67% 31% 2% Software Engineering-Software Measurement Process (ISO/IEC 15939) (797) 55% 44% 2% Capability Maturity Model Integration (806) 17% 81% 2% Six Sigma Process Improvement (804) 7% 91% 1% ISO 9000 Quality Management Systems (803) 10% 89% 1% 28 Conclusions Significant problem areas: Requirements Management Very

  13. Agile Development Methods for Space Operations

    NASA Technical Reports Server (NTRS)

    Trimble, Jay; Webster, Chris

    2012-01-01

    Mainstream industry software development practice has moved from a traditional waterfall process to agile iterative development, which allows fast response to customer inputs and produces higher-quality software at lower cost. How can we, the space ops community, adopt state-of-the-art software development practice, achieve greater productivity at lower cost, and maintain safe and effective space flight operations? At NASA Ames, we are developing Mission Control Technologies software in collaboration with Johnson Space Center (JSC) and, more recently, the Jet Propulsion Laboratory (JPL).

  14. Design requirements for SRB production control system. Volume 3: Package evaluation, modification and hardware

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The software package evaluation was designed to analyze commercially available, field-proven production control or manufacturing resource planning management technology and software packages. The analysis was conducted by comparing SRB production control software requirements and the conceptual system design to software package capabilities. The methodology of evaluation and the findings at each stage of evaluation are described. Topics covered include: vendor listing; the request for information (RFI) document; RFI response rate and quality; the RFI evaluation process; and capabilities versus requirements.

  15. Open source molecular modeling.

    PubMed

    Pirhadi, Somayeh; Sunseri, Jocelyn; Koes, David Ryan

    2016-09-01

    The success of molecular modeling and computational chemistry efforts is, by definition, dependent on quality software applications. Open source software development provides many advantages to users of modeling applications, not the least of which is that the software is free and completely extendable. In this review we categorize, enumerate, and describe available open source software packages for molecular modeling and computational chemistry. An updated online version of this catalog can be found at https://opensourcemolecularmodeling.github.io. Copyright © 2016 The Author(s). Published by Elsevier Inc. All rights reserved.

  16. Retina Image Screening and Analysis Software Version 2.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tobin, Jr., Kenneth W.; Karnowski, Thomas P.; Aykac, Deniz

    2009-04-01

    The software allows physicians or researchers to ground-truth images of retinas, identifying key physiological features and lesions that are indicative of disease. The software includes methods to automatically detect these physiological features and lesions. It contains code to measure the quality of images received from a telemedicine network; to create and populate a database for a telemedicine network; and to review and report the diagnosis of a set of images; it also contains components to transmit images from a Zeiss camera to the network through SFTP.

  17. Software engineering and Ada in design

    NASA Technical Reports Server (NTRS)

    Oneill, Don

    1986-01-01

    Modern software engineering promises significant reductions in software costs and improvements in software quality. The Ada language is the focus for these software methodology and tool improvements. The IBM FSD approach, including the software engineering practices that guide the systematic design and development of software products and the management of the software process, is examined. The revised Ada design language adaptation is revealed. This four-level design methodology is detailed, including the purpose of each level, the management strategy that integrates the software design activity with the program milestones, and the technical strategy that maps the Ada constructs to each level of design. A complete description of each design level is provided along with specific design language recording guidelines for each level. Finally, some testimony is offered on education, tools, architecture, and metrics resulting from project use of the four-level Ada design language adaptation.

  18. Publishing Platform for Scientific Software - Lessons Learned

    NASA Astrophysics Data System (ADS)

    Hammitzsch, Martin; Fritzsch, Bernadette; Reusser, Dominik; Brembs, Björn; Deinzer, Gernot; Loewe, Peter; Fenner, Martin; van Edig, Xenia; Bertelmann, Roland; Pampel, Heinz; Klump, Jens; Wächter, Joachim

    2015-04-01

    Scientific software has become an indispensable commodity for the production, processing and analysis of empirical data, but also for the modelling and simulation of complex processes. Software has a significant influence on the quality of research results. To strengthen the recognition of the academic performance of scientific software development, to increase its visibility and to promote the reproducibility of research results, concepts for the publication of scientific software have to be developed, tested, evaluated, and then transferred into operations. For this, the publication and citability of scientific software have to fulfil scientific criteria by means of defined processes and the use of persistent identifiers, similar to data publications. The SciForge project is addressing these challenges. Based on interviews, a blueprint for a scientific software publishing platform and a systematic implementation plan have been designed. In addition, the potential of journals, software repositories and persistent identifiers has been evaluated to improve the publication and dissemination of reusable software solutions. It is important that procedures for publishing software, as well as methods and tools for software engineering, are reflected in the architecture of the platform, in order to improve the quality of the software and the results of research. In addition, it is necessary to work continuously on improving the specific conditions that promote the adoption and sustainable utilization of scientific software publications. Among others, this would include institutional policies for the development and publication of scientific software, but also policies for establishing the necessary competencies and skills of scientists and IT personnel. To implement the concepts developed in SciForge, a combined bottom-up / top-down approach will be pursued in parallel in different scientific domains, e.g. in earth sciences, climate research and the life sciences. Based on the developed blueprints, a scientific software publishing platform will be iteratively implemented, tested, and evaluated, and thus developed continuously on the basis of the experience and results gained. The platform services will be extended one by one according to the requirements of the communities. In this way the platform for the publication of scientific software can be improved and stabilized incrementally as a tool with software, science, publishing, and user-oriented features.

  19. Configuration management and software measurement in the Ground Systems Development Environment (GSDE)

    NASA Technical Reports Server (NTRS)

    Church, Victor E.; Long, D.; Hartenstein, Ray; Perez-Davila, Alfredo

    1992-01-01

    A set of functional requirements for software configuration management (CM) and metrics reporting for Space Station Freedom ground systems software are described. This report is one of a series from a study of the interfaces among the Ground Systems Development Environment (GSDE), the development systems for the Space Station Training Facility (SSTF) and the Space Station Control Center (SSCC), and the target systems for SSCC and SSTF. The focus is on the CM of the software following delivery to NASA and on the software metrics that relate to the quality and maintainability of the delivered software. The CM and metrics requirements address specific problems that occur in large-scale software development. Mechanisms to assist in the continuing improvement of mission operations software development are described.

  20. Computerized stratified random site-selection approaches for design of a ground-water-quality sampling network

    USGS Publications Warehouse

    Scott, J.C.

    1990-01-01

    Computer software was written to randomly select sites for a ground-water-quality sampling network. The software uses digital cartographic techniques and subroutines from a proprietary geographic information system. The report presents the approaches, computer software, and sample applications. It is often desirable to collect ground-water-quality samples from various areas in a study region that have different values of a spatial characteristic, such as land-use or hydrogeologic setting. A stratified network can be used for testing hypotheses about relations between spatial characteristics and water quality, or for calculating statistical descriptions of water-quality data that account for variations that correspond to the spatial characteristic. In the software described, a study region is subdivided into areal subsets that have a common spatial characteristic to stratify the population into several categories from which sampling sites are selected. Different numbers of sites may be selected from each category of areal subsets. A population of potential sampling sites may be defined by either specifying a fixed population of existing sites, or by preparing an equally spaced population of potential sites. In either case, each site is identified with a single category, depending on the value of the spatial characteristic of the areal subset in which the site is located. Sites are selected from one category at a time. One of two approaches may be used to select sites. Sites may be selected randomly, or the areal subsets in the category can be grouped into cells and sites selected randomly from each cell.
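
    The selection procedure described above, partition candidate sites by the value of a spatial characteristic and then draw sites at random within each category, reduces to a few lines. A sketch in Python follows, assuming each candidate site already carries its category label (the report's GIS-based generation of equally spaced candidate sites is omitted).

```python
import random
from collections import defaultdict

def stratified_site_selection(sites, n_per_category, seed=42):
    """Randomly select sites within each category of a spatial characteristic.

    sites: iterable of (site_id, category) pairs, e.g. land-use class
           or hydrogeologic setting.
    n_per_category: dict mapping category -> number of sites to select.
    """
    rng = random.Random(seed)
    by_category = defaultdict(list)
    for site_id, category in sites:
        by_category[category].append(site_id)
    selected = {}
    for category, n in n_per_category.items():
        pool = by_category[category]
        selected[category] = rng.sample(pool, min(n, len(pool)))
    return selected

# Example: hypothetical wells labeled by hydrogeologic setting.
wells = [(f"well{i}", "alluvium" if i % 3 else "bedrock") for i in range(30)]
print(stratified_site_selection(wells, {"alluvium": 5, "bedrock": 3}))
```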

  1. A survey of quality assurance practices in biomedical open source software projects.

    PubMed

    Koru, Günes; El Emam, Khaled; Neisa, Angelica; Umarji, Medha

    2007-05-07

    Open source (OS) software is continuously gaining recognition and use in the biomedical domain, for example, in health informatics and bioinformatics. Given the mission critical nature of applications in this domain and their potential impact on patient safety, it is important to understand to what degree and how effectively biomedical OS developers perform standard quality assurance (QA) activities such as peer reviews and testing. This would allow the users of biomedical OS software to better understand the quality risks, if any, and the developers to identify process improvement opportunities to produce higher quality software. A survey of developers working on biomedical OS projects was conducted to examine the QA activities that are performed. We took a descriptive approach to summarize the implementation of QA activities and then examined some of the factors that may be related to the implementation of such practices. Our descriptive results show that 63% (95% CI, 54-72) of projects did not include peer reviews in their development process, while 82% (95% CI, 75-89) did include testing. Approximately 74% (95% CI, 67-81) of developers did not have a background in computing, 80% (95% CI, 74-87) were paid for their contributions to the project, and 52% (95% CI, 43-60) had PhDs. A multivariate logistic regression model to predict the implementation of peer reviews was not significant (likelihood ratio test = 16.86, 9 df, P = .051) and neither was a model to predict the implementation of testing (likelihood ratio test = 3.34, 9 df, P = .95). Less attention is paid to peer review than testing. However, the former is a complementary, and necessary, QA practice rather than an alternative. Therefore, one can argue that there are quality risks, at least at this point in time, in transitioning biomedical OS software into any critical settings that may have operational, financial, or safety implications. Developers of biomedical OS applications should invest more effort in implementing systemic peer review practices throughout the development and maintenance processes.
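
    The interval estimates quoted above are standard binomial proportion confidence intervals. As a sketch, the Python fragment below reproduces the form of the reported figures with a Wald interval; the counts are hypothetical (back-solved to land near 63% with a 54-72 interval), since the abstract does not state the exact denominator.

```python
import math

def wald_ci(successes, n, z=1.96):
    """Wald 95% confidence interval for a binomial proportion."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half), min(1.0, p + half)

# Hypothetical counts: 69 of 110 projects without peer review (~63%);
# the survey's actual n is not given in the abstract.
p, lo, hi = wald_ci(69, 110)
print(f"{p:.0%} (95% CI, {lo:.0%}-{hi:.0%})")
```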

  2. A Quality System Database

    NASA Technical Reports Server (NTRS)

    Snell, William H.; Turner, Anne M.; Gifford, Luther; Stites, William

    2010-01-01

    A quality system database (QSD), and software to administer the database, were developed to support recording of administrative nonconformance activities that involve requirements for documentation of corrective and/or preventive actions, which can include ISO 9000 internal quality audits and customer complaints.

  3. TOP500 Supercomputers for June 2003

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strohmaier, Erich; Meuer, Hans W.; Dongarra, Jack

    2003-06-23

    21st Edition of TOP500 List of World's Fastest Supercomputers Released. MANNHEIM, Germany; KNOXVILLE, Tenn.; BERKELEY, Calif. In what has become a much-anticipated event in the world of high-performance computing, the 21st edition of the TOP500 list of the world's fastest supercomputers was released today (June 23, 2003). The Earth Simulator supercomputer built by NEC and installed last year at the Earth Simulator Center in Yokohama, Japan, with its Linpack benchmark performance of 35.86 Tflop/s (teraflops, or trillions of calculations per second), retains the number one position. The number 2 position is held by the re-measured ASCI Q system at Los Alamos National Laboratory. With 13.88 Tflop/s, it is the second system ever to exceed the 10 Tflop/s mark. ASCI Q was built by Hewlett-Packard and is based on the AlphaServer SC computer system.

  4. Freshwater ascomycetes: Alascospora evergladensis, a new genus and species from the Florida Everglades.

    PubMed

    Raja, Huzefa A; Violi, Helen A; Shearer, Carol A

    2010-01-01

    Alascospora evergladensis, a freshwater ascomycete collected from submerged dead petioles of Nymphaea odorata during a survey of aquatic fungi along a phosphorus gradient in the Florida Everglades, is described and illustrated as a new genus and species in the Pleosporales (Pleosporomycetidae, Dothideomycetes). The new fungus is unique among genera in the Pleosporales based on a combination of morphological characters that include light brown, translucent, membranous, ostiolate ascomata with dark, amorphous material irregularly deposited on the peridium, especially around the ostiole; globose, fissitunicate, thick-walled asci; septate pseudoparaphyses; and 1-septate ascospores that are hyaline when young, and surrounded by a hyaline gelatinous sheath that is wing-shaped in outline on each side of the ascospore. The sheath is distinctive in that it first expands in water and is translucent, then condenses and darkens around older ascospores, giving them a dark brown, verruculose appearance.

  5. Multi-locus phylogeny of Pleosporales: a taxonomic, ecological and evolutionary re-evaluation

    PubMed Central

    Zhang, Y.; Schoch, C.L.; Fournier, J.; Crous, P.W.; de Gruyter, J.; Woudenberg, J.H.C.; Hirayama, K.; Tanaka, K.; Pointing, S.B.; Spatafora, J.W.; Hyde, K.D.

    2009-01-01

    Five loci, nucSSU, nucLSU rDNA, TEF1, RPB1 and RPB2, are used for analysing 129 pleosporalean taxa representing 59 genera and 15 families in the current classification of Pleosporales. The suborder Pleosporineae is emended to include four families, viz. Didymellaceae, Leptosphaeriaceae, Phaeosphaeriaceae and Pleosporaceae. In addition, two new families are introduced, i.e. Amniculicolaceae and Lentitheciaceae. Pleomassariaceae is treated as a synonym of Melanommataceae, and new circumscriptions of Lophiostomataceae s. str., Massarinaceae and Lophiotrema are proposed. Familial positions of Entodesmium and Setomelanomma in Phaeosphaeriaceae, Neophaeosphaeria in Leptosphaeriaceae, Leptosphaerulina, Macroventuria and Platychora in Didymellaceae, Pleomassaria in Melanommataceae and Bimuria, Didymocrea, Karstenula and Paraphaeosphaeria in Montagnulaceae are clarified. Both ecological and morphological characters show varying degrees of phylogenetic significance. Pleosporales is most likely derived from a saprobic ancestor with fissitunicate asci containing conspicuous ocular chambers and apical rings. Nutritional shifts in Pleosporales likely occurred from saprotrophic to hemibiotrophic or biotrophic. PMID:20169024

  6. Pichia insulana sp. nov., a novel cactophilic yeast from the Caribbean

    PubMed Central

    Ganter, Philip F.; Cardinali, Gianluigi; Boundy-Mills, Kyria

    2010-01-01

    A novel species of ascomycetous yeast, Pichia insulana sp. nov., is described from necrotic tissue of columnar cacti on Caribbean islands. P. insulana is closely related to and phenotypically very similar to Pichia cactophila and Pichia pseudocactophila. There are few distinctions between these taxa besides spore type, host preference and locality. Sporogenous strains of P. insulana that produce asci with four hat-shaped spores have been found only on Curaçao, whereas there was no evidence of sporogenous P. cactophila from that island. In addition, sequences of the D1/D2 fragment of the large-subunit rDNA from 12 Curaçao strains showed consistent differences from the sequences of the type strains of P. cactophila and P. pseudocactophila. The type strain of P. insulana is TSU00-106.5T (=CBS 11169T =UCD-FST 09-160T). PMID:19661524

  7. Case of Contamination by Listeria Monocytogenes in Mozzarella Cheese

    PubMed Central

    Tolli, Rita; Bossù, Teresa; Rodas, Eda Maria Flores; Di Giamberardino, Fabiola; Di Sirio, Alessandro; Vita, Silvia; De Angelis, Veronica; Bilei, Stefano; Sonnessa, Michele; Gattuso, Antonietta; Lanni, Luigi

    2014-01-01

    Following the detection of Listeria monocytogenes in a mozzarella cheese sampled at a dairy plant in the Lazio Region, further investigations were conducted both by the competent authority and by the food business operator (as part of the dairy factory's HACCP control). In total, 90 dairy products, 7 brine samples, and 64 environmental samples were tested. The prevalence of Listeria monocytogenes was 24.4% in mozzarella cheese and 9.4% in environmental samples, while the brines were all negative. Forty-seven strains of L. monocytogenes were isolated, all belonging to serotype 4b/4e. For 12 of these, the macrorestriction profile was determined by means of pulsed-field gel electrophoresis. The profiles obtained with the AscI enzyme showed 100% similarity, while those obtained with ApaI showed 96.78% similarity. These characteristics of the isolated strains, together with the production process of mozzarella cheese, support the hypothesis of an environmental contamination. PMID:27800317
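
    Macrorestriction (PFGE) profiles are conventionally compared with a band-based Dice coefficient. A minimal sketch follows, assuming band positions have already been extracted and normalized; the study's actual band-matching software and tolerance settings are not stated, so every number here is illustrative.

```python
def dice_similarity(bands_a, bands_b, tol=0.01):
    """Dice band-matching coefficient between two PFGE profiles.

    bands_a, bands_b: lists of band positions (e.g. fragment sizes on a
    normalized scale); two bands match if within `tol` (relative) of
    each other, and each band may be matched at most once.
    """
    matched, used = 0, set()
    for a in bands_a:
        for j, b in enumerate(bands_b):
            if j not in used and abs(a - b) <= tol * max(a, b):
                matched += 1
                used.add(j)
                break
    return 2 * matched / (len(bands_a) + len(bands_b))

# Hypothetical AscI profiles on a normalized size scale.
profile1 = [20.1, 48.9, 97.0, 145.5, 291.0]
profile2 = [20.3, 48.9, 97.2, 146.0, 310.0]
print(f"similarity: {dice_similarity(profile1, profile2):.2%}")
```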

  8. LATIS3D: The Goal Standard for Laser-Tissue-Interaction Modeling

    NASA Astrophysics Data System (ADS)

    London, R. A.; Makarewicz, A. M.; Kim, B. M.; Gentile, N. A.; Yang, T. Y. B.

    2000-03-01

    The goal of this LDRD project has been to create LATIS3D, the world's premier computer program for laser-tissue interaction modeling. The development was based on recent experience with the 2D LATIS code and the ASCI code KULL. With LATIS3D, important applications in laser medical therapy were researched, including dynamical calculations of tissue emulsification and ablation, photothermal therapy, and photon transport for photodynamic therapy. This project also enhanced LLNL's core competency in laser-matter interactions and high-energy-density physics by pushing simulation codes into new parameter regimes and by attracting external expertise. This will benefit both existing LLNL programs such as ICF and SBSS and emerging programs in medical technology and other laser applications. The purpose of this project was to develop and apply a computer program for laser-tissue interaction modeling to aid in the development of new instruments and procedures in laser medicine.

  9. Ultrastructure of developing ascospores in Sordaria brevicollis.

    PubMed

    Hackett, C J; Chen, K C

    1976-05-01

    The ultrastructure of ascospore wall formation in the pyrenomycete Sordaria brevicollis was studied in developing asci at progressive time intervals. From early spore delimitation through the final stage of maturation, the wall of the ascospore differentiated into four composite layers: the periascosporium, the delineation ascosporium, the subascosporium, and the endoascosporium. While ascospores were at the hyaline stage of development, they possessed only the periascosporium and delineation ascosporium as their wall components. At about 7 to 8 days from the initiation of the cross, the spores developed a yellow color, and this coloration was always associated with the elaboration of the subascosporium just internal to the ascosporium. As the spores continued to darken progressively in color, the subascosporium was seen to increase in complexity, electron density, and thickness. Soon after the formation of the subascosporium, the endoascosporium began to develop de novo and was, therefore, the last wall layer formed as the spore approached maturity.

  10. Meiotic aneuploidy: its origins and induction following chemical treatment in Sordaria brevicollis.

    PubMed

    Bond, D J; McMillan, L

    1979-08-01

    A system suitable for the detection of meiotic aneuploidy is described in which various different origins of the aneuploidy can be distinguished. Aneuploid meiotic products are detected as black disomic spores held in asci containing all the products of a single meiosis. Aneuploidy may result from nondisjunction or from a meiosis in which an extra replica of one of the chromosomes has been generated in some other way, e.g., extra replication. By using this system it has been shown that pFPA treatment increases aneuploidy, primarily through an effect on nondisjunction. Preliminary results with trifluralin have indicated that this compound, too, may increase aneuploidy. There is a good possibility that the system can be further developed to permit more rapid screening using a random plating method; this will allow a more efficient two-part analysis of the effects of compounds under test.

  11. The temporal response of recombination events to gamma radiation of meiotic cells in Sordaria brevicollis.

    PubMed

    Lewis, L A

    1982-01-01

    The temporal frequencies of different stages of prophase I were determined cytologically in Sordaria brevicollis (Olive and Fantini) as the basis for ascertaining the degree of synchrony in meiosis in this ascomycete. Croziers, karyogamy-zygotene and pachytene asci were shown to be in significant majorities at three distinct periods of the meiotic cycle. The response of recombination frequency to ionizing radiation was examined for the entire meiotic cycle. Three radiosensitive periods were determined. This response, which correlated temporally with each of the three peaks in ascal frequency, is interpreted as showing that the meiotic cycle of this organism is divided into periods of recombination commitment (radiation-reduced frequencies) during the pre-meiotic S phase and recombination consummation (radiation-induced frequencies) during zygotene and pachytene. The results are discussed in the context of the time at which recombination is consummated in eukaryotes such as yeast and Drosophila.

  12. Tactical Approaches for Making a Successful Satellite Passive Microwave ESDR

    NASA Astrophysics Data System (ADS)

    Hardman, M.; Brodzik, M. J.; Gotberg, J.; Long, D. G.; Paget, A. C.

    2014-12-01

    Our NASA MEaSUREs project is producing a new, enhanced resolution gridded Earth System Data Record for the entire satellite passive microwave (SMMR, SSM/I-SSMIS and AMSR-E) time series. Our project goals are twofold: to produce a well-documented, consistently processed, high-quality historical record at higher spatial resolutions than have previously been available, and to transition the production software to the NSIDC DAAC for ongoing processing after our project completion. In support of these goals, our distributed team at BYU and NSIDC faces project coordination challenges to produce a high-quality data set that our user community will accept as a replacement for the currently available historical versions of these data. We work closely with our DAAC liaison on format specifications, data and metadata plans, and project progress. In order for the user community to understand and support our project, we have solicited a team of Early Adopters who are reviewing and evaluating a prototype version of the data. Early Adopter feedback will be critical input to our final data content and format decisions. For algorithm transparency and accountability, we have released an Algorithm Theoretical Basis Document (ATBD) and detailed supporting technical documentation, with rationale for all algorithm implementation decisions. For distributed team management, we are using collaborative tools for software revision control and issue tracking. For reliably transitioning a research-quality image reconstruction software system to production-quality software suitable for use at the DAAC, we have adopted continuous integration methods for running automated regression testing. Our presentation will summarize both advantages and challenges of each of these tactics in ensuring production of a successful ESDR and an enduring production software system.

  13. General guidelines for biomedical software development

    PubMed Central

    Silva, Luis Bastiao; Jimenez, Rafael C.; Blomberg, Niklas; Luis Oliveira, José

    2017-01-01

    Most bioinformatics tools available today were not written by professional software developers, but by people who wanted to solve their own problems using computational solutions, spending the minimum time and effort possible, since these were just the means to an end. Consequently, a vast number of software applications are currently available, hindering the task of identifying the utility and quality of each. At the same time, this situation has hindered regular adoption of these tools in clinical practice. Typically, they are not sufficiently developed to be used by most clinical researchers and practitioners. To address these issues, it is necessary to re-think how biomedical applications are built and to adopt new strategies that ensure quality, efficiency, robustness, correctness and reusability of software components. We also need to engage end-users during the development process to ensure that applications fit their needs. In this review, we present a set of guidelines to support biomedical software development, with an explanation of how they can be implemented and what kind of open-source tools can be used for each specific topic. PMID:28443186

  14. Software Quality Measurement for Distributed Systems. Volume 3. Distributed Computing Systems: Impact on Software Quality.

    DTIC Science & Technology

    1983-07-01

    Distributed Computing Systems: impact on software quality...topics include "C3I Application", "Space Systems Network", "Need for Distributed Database Management", and "Adaptive Routing". This is discussed in the last para...data reduction, buffering, encryption, and error detection and correction functions. Examples of such data streams include imagery data, video

  15. Server-based enterprise collaboration software improves safety and quality in high-volume PET/CT practice.

    PubMed

    McDonald, James E; Kessler, Marcus M; Hightower, Jeremy L; Henry, Susan D; Deloney, Linda A

    2013-12-01

    With increasing volumes of complex imaging cases and rising economic pressure on physician staffing, timely reporting will become progressively challenging. Current and planned iterations of PACS and electronic medical record systems do not offer workflow management tools to coordinate delivery of imaging interpretations with the needs of the patient and ordering physician. The adoption of a server-based enterprise collaboration software system by our Division of Nuclear Medicine has significantly improved our efficiency and quality of service.

  16. School Microware Reviews. Evaluations of Educational Software for Apple, PET, TRS-80, with Index to Evaluations in Other Publications.

    ERIC Educational Resources Information Center

    School Microware Reviews, 1981

    1981-01-01

    This document describes the operation and quality of pre-college instructional software sold for use on microcomputers. It also assists in locating other sources of similar information about instructional software. This edition is limited to programs for the Apple II, Commodore PET, and Radio Shack TRS-80 Model I. Fifty reviews of software…

  17. Software Engineering Education Directory

    DTIC Science & Technology

    1988-01-01

    Dana Hausman and Suzanne Woolf were crucial to the successful completion of this edition of the directory. Their teamwork, energy, and dedication...for this directory began in the summer of 1986 with a questionnaire mailed to schools selected from Peterson’s Graduate Programs in Engineering and...Christopher, and Siegel, Stan Software Cost Estimation and Life-Cycle Control by Putnam, Lawrence H. Software Quality Assurance: A Practical Approach by

  18. Effect of metal artifact reduction software on image quality of C-arm cone-beam computed tomography during intracranial aneurysm treatment.

    PubMed

    Enomoto, Yukiko; Yamauchi, Keita; Asano, Takahiko; Otani, Katharina; Iwama, Toru

    2018-01-01

    Background and purpose C-arm cone-beam computed tomography (CBCT) has the drawback that image quality is degraded by artifacts caused by implanted metal objects. We evaluated whether metal artifact reduction (MAR) prototype software can improve the subjective image quality of CBCT images of patients with intracranial aneurysms treated with coils or clips. Materials and methods Forty-four patients with intracranial aneurysms implanted with coils (40 patients) or clips (four patients) underwent one CBCT scan from which uncorrected and MAR-corrected CBCT image datasets were reconstructed. Three blinded readers evaluated the image quality of the image sets using a four-point scale (1: Excellent, 2: Good, 3: Poor, 4: Bad). The median scores of the three readers for uncorrected and MAR-corrected images were compared with the paired Wilcoxon signed-rank test, and inter-reader agreement of change scores was assessed by weighted kappa statistics. The readers also recorded new clinical findings, such as intracranial hemorrhage, air, or surrounding anatomical structures on MAR-corrected images. Results The image quality of MAR-corrected CBCT images was significantly improved compared with the uncorrected CBCT images (p < 0.001). Additional clinical findings were seen on CBCT images of 70.4% of patients after MAR correction. Conclusion MAR software improved the image quality of CBCT images degraded by metal artifacts.
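
    For readers unfamiliar with the two statistics named above, the sketch below shows how a paired Wilcoxon signed-rank test and a weighted kappa can be computed with SciPy and scikit-learn; the score vectors are invented for illustration and are not the study's data.

        from scipy.stats import wilcoxon
        from sklearn.metrics import cohen_kappa_score

        # Median reader scores per patient (1=Excellent ... 4=Bad), invented.
        uncorrected   = [3, 4, 3, 2, 4, 3, 3, 4]
        mar_corrected = [2, 2, 1, 1, 3, 2, 2, 2]

        # Paired Wilcoxon signed-rank test on per-patient score differences.
        stat, p = wilcoxon(uncorrected, mar_corrected)
        print(f"Wilcoxon statistic={stat}, p={p:.4f}")

        # Weighted kappa for agreement between two readers' change scores.
        reader_a = [1, 1, 2, 1, 1, 1, 1, 2]
        reader_b = [1, 2, 2, 1, 1, 1, 2, 2]
        print(f"weighted kappa={cohen_kappa_score(reader_a, reader_b, weights='linear'):.2f}")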

  19. Cumulative Aggregate Risk Evaluation Software

    EPA Science Inventory

    CARES is a state-of-the-art software program designed to conduct complex exposure and risk assessments for pesticides, such as the assessments required under the 1996 Food Quality Protection Act (FQPA). CARES was originally developed under the auspices of CropLife America (CLA),...

  20. Performing Quantitative Imaging Acquisition, Analysis and Visualization Using the Best of Open Source and Commercial Software Solutions.

    PubMed

    Shenoy, Shailesh M

    2016-07-01

    A challenge in any imaging laboratory, especially one that uses modern techniques, is to achieve a sustainable and productive balance between using open source and commercial software to perform quantitative image acquisition, analysis and visualization. In addition to considering the expense of software licensing, one must consider factors such as the quality and usefulness of the software's support, training and documentation. Also, one must consider the reproducibility with which multiple people generate results using the same software to perform the same analysis, how one may distribute their methods to the community using the software, and the potential for achieving automation to improve productivity.

  1. Software development environments: Status and trends

    NASA Technical Reports Server (NTRS)

    Duffel, Larry E.

    1988-01-01

    Currently software engineers are the essential integrating factors tying several components together. The components consist of process, methods, computers, tools, support environments, and software engineers. Today the engineers empower the tools, rather than the tools empowering the engineers. Some of the issues in software engineering are quality, managing the software engineering process, and productivity. A strategy to address these issues is to promote the evolution of software engineering from an ad hoc, labor-intensive activity to a managed, technology-supported discipline. This strategy may be implemented by putting the process under management control, adopting appropriate methods, inserting the technology that provides automated support for the process and methods, collecting automated tools into an integrated environment, and educating the personnel.

  2. Effect of Software Version on the Accuracy of an Intraoral Scanning Device.

    PubMed

    Haddadi, Yasser; Bahrami, Golnosh; Isidor, Flemming

    2018-04-06

    To investigate the impact of software version on the accuracy of an intraoral scanning device. A master tooth was scanned with a high-precision optical scanner and then 10 times with a CEREC Omnicam scanner with software versions 4.4.0 and 4.4.4. Discrepancies were measured using quality control software. Mean deviation for 4.4.0 was 36.2 ± 35 μm and for 4.4.4 was 20.7 ± 14.2 μm (P ≤ .001). Software version has a significant impact on the accuracy of an intraoral scanner. It is important that researchers also publish the software version of scanners when publishing their findings.
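
    The abstract reports the comparison only as a P value and does not name the test used, so the following is merely a hedged illustration of one reasonable way to compare two sets of scan deviations; the micrometre values are invented.

        from scipy.stats import mannwhitneyu

        # Deviations from the reference scan (micrometres), invented values.
        dev_v440 = [36.0, 41.2, 28.9, 70.1, 22.4, 35.8, 30.2, 44.6, 25.3, 27.5]
        dev_v444 = [20.1, 22.8, 18.9, 24.5, 19.7, 21.3, 17.8, 23.9, 20.6, 17.4]

        # Nonparametric two-sample comparison; smaller deviations mean better accuracy.
        stat, p = mannwhitneyu(dev_v440, dev_v444, alternative="two-sided")
        print(f"U={stat}, p={p:.4f}")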

  3. Developing a smartphone software package for predicting atmospheric pollutant concentrations at mobile locations.

    PubMed

    Larkin, Andrew; Williams, David E; Kile, Molly L; Baird, William M

    2015-06-01

    There is considerable evidence that exposure to air pollution is harmful to health. In the U.S., ambient air quality is monitored by Federal and State agencies for regulatory purposes. There are limited options, however, for people to access this data in real time, which hinders an individual's ability to manage their own risks. This paper describes a new software package that models environmental concentrations of fine particulate matter (PM2.5), coarse particulate matter (PM10), and ozone for the state of Oregon and calculates personal health risks at the smartphone's current location. Predicted air pollution risk levels can be displayed on mobile devices as interactive maps and graphs color-coded to coincide with EPA air quality index (AQI) categories. Users have the option of setting air quality warning levels via color-coded bars and are notified whenever warning levels are exceeded by predicted levels within 10 km. We validated the software using data from participants as well as from simulations, which showed that the application was capable of identifying spatial and temporal air quality trends. This unique application provides a potential low-cost technology for reducing personal exposure to air pollution, which can improve quality of life, particularly for people with health conditions, such as asthma, that make them more susceptible to these hazards.
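
    The EPA AQI mapping mentioned above is a piecewise-linear interpolation between published breakpoints, I = (I_hi - I_lo) / (C_hi - C_lo) * (C - C_lo) + I_lo. The sketch below applies it to PM2.5; the breakpoint table is illustrative and should be checked against the current EPA tables rather than treated as authoritative.

        PM25_BREAKPOINTS = [  # (C_lo, C_hi, I_lo, I_hi)
            (0.0, 12.0, 0, 50),        # Good
            (12.1, 35.4, 51, 100),     # Moderate
            (35.5, 55.4, 101, 150),    # Unhealthy for sensitive groups
            (55.5, 150.4, 151, 200),   # Unhealthy
            (150.5, 250.4, 201, 300),  # Very unhealthy
            (250.5, 500.4, 301, 500),  # Hazardous
        ]

        def pm25_aqi(conc_ug_m3: float) -> int:
            """Linear interpolation within the matching breakpoint interval."""
            for c_lo, c_hi, i_lo, i_hi in PM25_BREAKPOINTS:
                if c_lo <= conc_ug_m3 <= c_hi:
                    return round((i_hi - i_lo) / (c_hi - c_lo) * (conc_ug_m3 - c_lo) + i_lo)
            raise ValueError("concentration outside breakpoint table")

        print(pm25_aqi(40.0))  # ~112, an orange (sensitive groups) AQI category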

  4. Increasing quality and managing complexity in neuroinformatics software development with continuous integration.

    PubMed

    Zaytsev, Yury V; Morrison, Abigail

    2012-01-01

    High quality neuroscience research requires accurate, reliable and well maintained neuroinformatics applications. As software projects become larger, offering more functionality and developing a denser web of interdependence between their component parts, we need more sophisticated methods to manage their complexity. If complexity is allowed to get out of hand, either the quality of the software or the speed of development suffer, and in many cases both. To address this issue, here we develop a scalable, low-cost and open source solution for continuous integration (CI), a technique which ensures the quality of changes to the code base during the development procedure, rather than relying on a pre-release integration phase. We demonstrate that a CI-based workflow, due to rapid feedback about code integration problems and tracking of code health measures, enabled substantial increases in productivity for a major neuroinformatics project and additional benefits for three further projects. Beyond the scope of the current study, we identify multiple areas in which CI can be employed to further increase the quality of neuroinformatics projects by improving development practices and incorporating appropriate development tools. Finally, we discuss what measures can be taken to lower the barrier for developers of neuroinformatics applications to adopt this useful technique.
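
    As a hedged sketch of the kind of job a CI server runs on every commit (the paper's actual infrastructure is more elaborate), the script below gates a build on the test suite and on one simple code-health measure; the src/ layout and the choice of pyflakes are assumptions for illustration.

        import subprocess
        import sys

        def main() -> int:
            # 1. Run the unit/regression test suite; any failure fails the build.
            tests = subprocess.run([sys.executable, "-m", "pytest", "-q"])
            if tests.returncode != 0:
                return tests.returncode
            # 2. Track a simple code-health measure: static-analysis findings.
            lint = subprocess.run([sys.executable, "-m", "pyflakes", "src/"])
            return lint.returncode

        if __name__ == "__main__":
            sys.exit(main())

    Because the job runs on every push, integration problems surface within minutes of the offending change, which is the rapid-feedback property the authors credit for the productivity gains.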

  5. Increasing quality and managing complexity in neuroinformatics software development with continuous integration

    PubMed Central

    Zaytsev, Yury V.; Morrison, Abigail

    2013-01-01

    High quality neuroscience research requires accurate, reliable and well maintained neuroinformatics applications. As software projects become larger, offering more functionality and developing a denser web of interdependence between their component parts, we need more sophisticated methods to manage their complexity. If complexity is allowed to get out of hand, either the quality of the software or the speed of development suffer, and in many cases both. To address this issue, here we develop a scalable, low-cost and open source solution for continuous integration (CI), a technique which ensures the quality of changes to the code base during the development procedure, rather than relying on a pre-release integration phase. We demonstrate that a CI-based workflow, due to rapid feedback about code integration problems and tracking of code health measures, enabled substantial increases in productivity for a major neuroinformatics project and additional benefits for three further projects. Beyond the scope of the current study, we identify multiple areas in which CI can be employed to further increase the quality of neuroinformatics projects by improving development practices and incorporating appropriate development tools. Finally, we discuss what measures can be taken to lower the barrier for developers of neuroinformatics applications to adopt this useful technique. PMID:23316158

  6. Evaluation of Aesthetic and Quality-of-Life Results after Immediate Breast Reconstruction with Definitive Form-Stable Anatomical Implants.

    PubMed

    Kuroda, Flavia; Urban, Cicero; Zucca-Matthes, Gustavo; de Oliveira, Vilmar Marques; Arana, Gabriel Hubner; Iera, Marco; Rietjens, Mario; Santos, Gabriela; Spagnol, Caroline; de Lima, Rubens Silveira

    2016-02-01

    Although there are many reports on different techniques in breast reconstruction, there are few data regarding immediate breast reconstruction with definitive form-stable anatomical implants in terms of aesthetics and quality-of-life outcomes. Ninety-four patients underwent mastectomy with immediate breast reconstruction using anatomical implants and contralateral symmetrization. Aesthetic results were evaluated by three different methods: the patient's self-report, the assessment of four independent specialists (two breast surgeons and two plastic surgeons from different institutions), and the BCCT.core software. Quality of life was evaluated by means of the BREAST-Q instrument. Average age ± SD was 52.1 ± 11.6 years. Most patients had medium-sized breasts and T1 tumors. Patients rated their aesthetic results more favorably than did the software and the specialists. There was no significant difference between the software's and the specialists' evaluations. Multifactorial analysis showed that age older than 70 years and radiotherapy were significant risk factors for poor aesthetic outcomes after immediate breast reconstruction with implants. Considering quality of life, most of the patients were satisfied with their outcome and their psychosocial and sexual well-being. Immediate breast reconstruction with implants and contralateral symmetrization had a positive impact on quality of life and showed satisfactory outcomes when evaluated by subjective and objective methods.

  7. Development and prospective evaluation of an automated software system for quality control of quantitative 99mTc-MAG3 renal studies.

    PubMed

    Folks, Russell D; Garcia, Ernest V; Taylor, Andrew T

    2007-03-01

    Quantitative nuclear renography has numerous potential sources of error. We previously reported the initial development of a computer software module for comprehensively addressing the issue of quality control (QC) in the analysis of radionuclide renal images. The objective of this study was to prospectively test the QC software. The QC software works in conjunction with standard quantitative renal image analysis using a renal quantification program. The software saves a text file that summarizes QC findings as possible errors in user-entered values, calculated values that may be unreliable because of the patient's clinical condition, and problems relating to acquisition or processing. To test the QC software, a technologist not involved in software development processed 83 consecutive nontransplant clinical studies. The QC findings of the software were then tabulated. QC events were defined as technical (study descriptors that were out of range or were entered and then changed, unusually sized or positioned regions of interest, or missing frames in the dynamic image set) or clinical (calculated functional values judged to be erroneous or unreliable). Technical QC events were identified in 36 (43%) of 83 studies. Clinical QC events were identified in 37 (45%) of 83 studies. Specific QC events included starting the camera after the bolus had reached the kidney, dose infiltration, oversubtraction of background activity, and missing frames in the dynamic image set. QC software has been developed to automatically verify user input, monitor calculation of renal functional parameters, summarize QC findings, and flag potentially unreliable values for the nuclear medicine physician. Incorporation of automated QC features into commercial or local renal software can reduce errors and improve technologist performance and should improve the efficiency and accuracy of image interpretation.
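
    A hedged sketch of the flavour of automated checking described above: the field names, plausibility limits, and report format below are hypothetical stand-ins, not the authors' implementation.

        def qc_check_renal_study(study: dict) -> list[str]:
            """Return human-readable QC findings for one processed study."""
            findings = []
            # Verify that user-entered values are physiologically plausible.
            if not 20 <= study.get("height_cm", 0) <= 250:
                findings.append("height out of range - possible data-entry error")
            # Flag acquisition problems.
            if study.get("missing_frames", 0) > 0:
                findings.append(f"{study['missing_frames']} missing dynamic frames")
            # Flag calculated values that may be unreliable.
            if study.get("background_subtraction_pct", 0) > 50:
                findings.append("oversubtraction of background activity suspected")
            return findings

        # Example: an implausible height plus two acquisition problems.
        print(qc_check_renal_study({"height_cm": 17, "missing_frames": 2,
                                    "background_subtraction_pct": 60}))

    In the reported system, findings of this kind are written to a text summary and flagged for the nuclear medicine physician before image interpretation.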

  8. Cost Effective Development of Usable Systems: Gaps between HCI and Software Architecture Design

    NASA Astrophysics Data System (ADS)

    Folmer, Eelke; Bosch, Jan

    A software product with poor usability is likely to fail in a highly competitive market; therefore software developing organizations are paying more and more attention to ensuring the usability of their software. Practice, however, shows that product quality (which includes usability, among other attributes) is not as high as it could be. Studies of software projects (Pressman, 2001) reveal that organizations spend a relatively large amount of money and effort on fixing usability problems during late-stage development. Some of these problems could have been detected and fixed much earlier. This avoidable rework leads to high costs, and because different tradeoffs have to be made during development (for example, between cost and quality), it also leads to systems with less than optimal usability. This problem has been around for a couple of decades, especially after software engineering (SE) and human computer interaction (HCI) became disciplines in their own right. While both disciplines developed, several gaps appeared which are now receiving increased attention in the research literature. Major gaps of understanding have been identified, both between suggested practice and how software is actually developed in industry, and between the best practices of each of the fields (Carroll et al., 1994; Bass et al., 2001; Folmer and Bosch, 2002). In addition, the fields differ in terminology, concepts, education, and methods.

  9. QRev—Software for computation and quality assurance of acoustic doppler current profiler moving-boat streamflow measurements—User’s manual for version 2.8

    USGS Publications Warehouse

    Mueller, David S.

    2016-05-12

    The software program QRev computes discharge from moving-boat acoustic Doppler current profiler (ADCP) measurements using data collected with any of the Teledyne RD Instruments or SonTek bottom-tracking ADCPs. The computation of discharge is independent of the manufacturer of the ADCP because QRev applies consistent algorithms independent of the data source. In addition, QRev automates filtering and quality checking of the collected data and provides feedback to the user on potential quality issues with the measurement. Various statistics and characteristics of the measurement, in addition to a simple uncertainty assessment, are provided to the user to assist them in properly rating the measurement. QRev saves an extensible markup language (XML) file that can be imported into databases or electronic field notes software. The user interacts with QRev through a tablet-friendly graphical user interface. This report is the manual for version 2.8 of QRev.
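
    As a sketch of the XML hand-off described above, the snippet below writes a small measurement summary with Python's standard library; the element names and values are hypothetical, since QRev's actual schema is defined in its documentation.

        import xml.etree.ElementTree as ET

        meas = ET.Element("Measurement", station="00000000")  # placeholder station ID
        ET.SubElement(meas, "Discharge", units="m3/s").text = "12.34"
        ET.SubElement(meas, "Uncertainty", units="percent").text = "3.2"
        ET.SubElement(meas, "Rating").text = "Good"
        # Databases or electronic field-notes software can ingest this file downstream.
        ET.ElementTree(meas).write("qrev_summary.xml", encoding="utf-8",
                                   xml_declaration=True)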

  10. Friction and lubrication modelling in sheet metal forming: Influence of lubrication amount, tool roughness and sheet coating on product quality

    NASA Astrophysics Data System (ADS)

    Hol, J.; Wiebenga, J. H.; Carleer, B.

    2017-09-01

    In the stamping of automotive parts, friction and lubrication play a key role in achieving high-quality products. In the development process of new automotive parts, it is therefore crucial to accurately account for these effects in sheet metal forming simulations. This paper presents a selection of results considering friction and lubrication modelling in sheet metal forming simulations of a front fender product. For varying lubrication conditions, the front fender can show either wrinkling or fractures. The front fender is modelled using different lubrication amounts, tool roughnesses and sheet coatings to show the strong influence of friction on both part quality and overall production stability. For this purpose, the TriboForm software is used in combination with the AutoForm software. The results demonstrate that the TriboForm software enables the simulation of friction behaviour for varying lubrication conditions, resulting in a generally applicable approach for friction characterization under industrial sheet metal forming process conditions.

  11. Development of a New VLBI Data Analysis Software

    NASA Technical Reports Server (NTRS)

    Bolotin, Sergei; Gipson, John M.; MacMillan, Daniel S.

    2010-01-01

    We present an overview of new VLBI data analysis software under development at NASA GSFC. The new software will replace CALC/SOLVE and many related utility programs. It will have the capabilities of the current system as well as incorporate new models and data analysis techniques. In this paper we give a conceptual overview of the new software and formulate its main goals. The software should be flexible and modular to implement models and estimation techniques that currently exist or will appear in the future. On the other hand, it should be reliable and of production quality for processing standard VLBI sessions. Also, it needs to be capable of processing observations from a fully deployed network of VLBI2010 stations in a reasonable time. We describe the software development process and outline the software architecture.

  12. Graphical simulation for aerospace manufacturing

    NASA Technical Reports Server (NTRS)

    Babai, Majid; Bien, Christopher

    1994-01-01

    Simulation software has become a key technological enabler for integrating flexible manufacturing systems and streamlining the overall aerospace manufacturing process. In particular, robot simulation and offline programming software is being credited with reducing downtime and labor costs, while boosting quality and significantly increasing productivity.

  13. Implementing Software Safety in the NASA Environment

    NASA Technical Reports Server (NTRS)

    Wetherholt, Martha S.; Radley, Charles F.

    1994-01-01

    Until recently, NASA did not consider allowing computers total control of flight systems. Human operators, via hardware, have constituted the ultimate safety control. In an attempt to reduce costs, NASA has come to rely more and more heavily on computers and software to control space missions. (For example, software is now planned to control most of the operational functions of the International Space Station.) Thus the need for systematic software safety programs has become crucial for mission success. Concurrent engineering principles dictate that safety should be designed into software up front, not tested into the software after the fact. 'Cost of Quality' studies have statistics and metrics to prove the value of building quality and safety into the development cycle. Unfortunately, most software engineers are not familiar with designing for safety, and most safety engineers are not software experts. Software written to specifications which have not been safety analyzed is a major source of computer-related accidents. Safer software is achieved step by step throughout the system and software life cycle. It is a process that includes requirements definition, hazard analyses, formal software inspections, safety analyses, testing, and maintenance. The greatest emphasis is placed on clearly and completely defining system and software requirements, including safety and reliability requirements. Unfortunately, development and review of requirements are the weakest link in the process. While some of the more academic methods, e.g. mathematical models, may help bring about safer software, this paper proposes the use of currently approved software methodologies, and sound software and assurance practices, to show how, to a large degree, safety can be designed into software from the start. NASA's approach today is to first conduct a preliminary system hazard analysis (PHA) during the concept and planning phase of a project. This determines the overall hazard potential of the system to be built. Shortly thereafter, as the system requirements are being defined, the second iteration of hazard analyses takes place, the systems hazard analysis (SHA). During the systems requirements phase, decisions are made as to which functions of the system will be the responsibility of software. This is the most critical time to affect the safety of the software. From this point, software safety analyses as well as software engineering practices are the main focus for assuring safe software. While many of the steps proposed in this paper seem like just sound engineering practices, they are the best technical and most cost-effective means to assure safe software within a safe system.

  14. INDOOR AIR QUALITY AND INHALATION EXPOSURE - SIMULATION TOOL KIT

    EPA Science Inventory

    A Microsoft Windows-based indoor air quality (IAQ) simulation software package is presented. Named Simulation Tool Kit for Indoor Air Quality and Inhalation Exposure, or IAQX for short, this package complements and supplements existing IAQ simulation programs and is desi...

  15. HOW GOOD ARE MY DATA? INFORMATION QUALITY ASSESSMENT METHODOLOGY

    EPA Science Inventory


    Quality assurance techniques used in software development and hardware maintenance/reliability help to ensure that data in a computerized information management system are maintained well. However, information workers may not know the quality of the data resident in their inf...

  16. FSO and quality of service software prediction

    NASA Astrophysics Data System (ADS)

    Bouchet, O.; Marquis, T.; Chabane, M.; Alnaboulsi, M.; Sizun, H.

    2005-08-01

    Free-space optical (FSO) communication links constitute an alternative to radio relay links and optical cables for meeting the growing demand for high-speed telecommunications (abundance of unregulated bandwidth, rapid installation, availability of low-cost optical components offering a high data rate, etc.). Making them operational requires a good knowledge of the atmospheric effects that can degrade propagation and the availability of the link, and thus the quality of service (QoS). Better control of these phenomena will allow for the evaluation of system performance and thus assist with improving reliability. The first aim of this paper is to compare the behavior of an FSO link located in the south of France (Toulouse; parameters: around 270 meters (0.2 mile) long, 34 Mbps data rate, 850 nm wavelength, PDH frame) with airport meteorological data. The second aim is to assess in-house FSO quality of service prediction software by comparing simulations with the optical link data and the weather data. The analysis uses the in-house QoS prediction software ("FSO Prediction") developed by France Telecom Research & Development, which integrates new fog-fading equations (cf. Kim et al.) and includes multiple effects (geometrical attenuation, atmospheric fading, rain, snow, scintillation and refraction attenuation due to atmospheric turbulence, and optical mispointing attenuation). The FSO link field trial, intended to enable the demonstration and evaluation of these different effects, is described; preliminary results of the field trial, from December 2004 to May 2005, are then presented.
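
    Fog is typically the dominant term in such link budgets. The sketch below implements the published Kim visibility model for specific fog attenuation; treat it as an illustration of the kind of equation such prediction software integrates, not as the internals of "FSO Prediction" itself.

        def kim_q(visibility_km: float) -> float:
            """Particle-size exponent q of the Kim fog model."""
            v = visibility_km
            if v > 50:  return 1.6
            if v > 6:   return 1.3
            if v > 1:   return 0.16 * v + 0.34
            if v > 0.5: return v - 0.5
            return 0.0

        def fog_attenuation_db_per_km(visibility_km: float, wavelength_nm: float) -> float:
            """Specific attenuation in dB/km from the Kim/Kruse formulation."""
            beta = (3.91 / visibility_km) * (wavelength_nm / 550.0) ** (-kim_q(visibility_km))
            return 4.343 * beta  # convert the 1/km extinction coefficient to dB/km

        # 850 nm (the trial's wavelength) in fog with 1 km visibility:
        print(f"{fog_attenuation_db_per_km(1.0, 850.0):.1f} dB/km")  # roughly 13.7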

  17. Leaf vein length per unit area is not intrinsically dependent on image magnification: avoiding measurement artifacts for accuracy and precision.

    PubMed

    Sack, Lawren; Caringella, Marissa; Scoffoni, Christine; Mason, Chase; Rawls, Michael; Markesteijn, Lars; Poorter, Lourens

    2014-10-01

    Leaf vein length per unit leaf area (VLA; also known as vein density) is an important determinant of water and sugar transport, photosynthetic function, and biomechanical support. A range of software methods are in use to visualize and measure vein systems in cleared leaf images; typically, users locate veins by digital tracing, but recent articles introduced software by which users can locate veins using thresholding (i.e. based on the contrast of veins in the image). Based on the use of this method, a recent study argued against the existence of a fixed VLA value for a given leaf, proposing instead that VLA increases with the magnification of the image due to intrinsic properties of the vein system, and recommended that future measurements use a common, low image magnification. We tested these claims with new measurements using the software LEAFGUI in comparison with digital tracing using ImageJ software. We found that the apparent increase of VLA with magnification was an artifact of (1) using low-quality and low-magnification images and (2) errors in the algorithms of LEAFGUI. Given the use of images of sufficient magnification and quality, and analysis with error-free software, VLA can be measured precisely and accurately. These findings point to important principles for improving the quantity and quality of information gathered from leaf vein systems. © 2014 American Society of Plant Biologists. All Rights Reserved.
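
    To make the quantity concrete, here is a minimal sketch of computing VLA from a binarized vein image with scikit-image; it assumes the image is fully occupied by lamina and uses a crude pixel-count length estimate, so it is a simplification of both workflows compared in the paper.

        import numpy as np
        from skimage.morphology import skeletonize

        def vein_length_per_area(vein_mask: np.ndarray, mm_per_pixel: float) -> float:
            """Approximate VLA (mm of vein per mm^2 of leaf) from a boolean vein mask."""
            skeleton = skeletonize(vein_mask)               # one-pixel-wide centrelines
            vein_length_mm = skeleton.sum() * mm_per_pixel  # ignores diagonal steps
            leaf_area_mm2 = vein_mask.size * mm_per_pixel ** 2
            return vein_length_mm / leaf_area_mm2

    With a correct mm_per_pixel calibration, the returned value should not depend on the magnification at which the image was taken, which is precisely the paper's point.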

  18. Institute for scientific computing research; fiscal year 1999 annual report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keyes, D

    2000-03-28

    Large-scale scientific computation, and all of the disciplines that support it and help to validate it, have been placed at the focus of Lawrence Livermore National Laboratory by the Accelerated Strategic Computing Initiative (ASCI). The Laboratory operates the computer with the highest peak performance in the world and has undertaken some of the largest and most compute-intensive simulations ever performed. Computers at the architectural extremes, however, are notoriously difficult to use efficiently. Even such successes as the Laboratory's two Bell Prizes awarded in November 1999 only emphasize the need for much better ways of interacting with the results of large-scale simulations. Advances in scientific computing research have, therefore, never been more vital to the core missions of the Laboratory than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, the Laboratory must engage researchers at many academic centers of excellence. In FY 1999, the Institute for Scientific Computing Research (ISCR) has expanded the Laboratory's bridge to the academic community in the form of collaborative subcontracts, visiting faculty, student internships, a workshop, and a very active seminar series. ISCR research participants are integrated almost seamlessly with the Laboratory's Center for Applied Scientific Computing (CASC), which, in turn, addresses computational challenges arising throughout the Laboratory. Administratively, the ISCR flourishes under the Laboratory's University Relations Program (URP). Together with the other four Institutes of the URP, it must navigate a course that allows the Laboratory to benefit from academic exchanges while preserving national security. Although FY 1999 brought more than its share of challenges to the operation of an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and well worth the continued effort. A change of administration for the ISCR occurred during FY 1999. Acting Director John Fitzgerald retired from LLNL in August after 35 years of service, including the last two at the helm of the ISCR. David Keyes, who has been a regular visitor in conjunction with ASCI scalable algorithms research since October 1997, overlapped with John for three months and serves half-time as the new Acting Director.

  19. A Framework for Analyzing and Testing the Performance of Software Services

    NASA Astrophysics Data System (ADS)

    Bertolino, Antonia; de Angelis, Guglielmo; di Marco, Antinisca; Inverardi, Paola; Sabetta, Antonino; Tivoli, Massimo

    Networks "Beyond the 3rd Generation" (B3G) are characterized by mobile and resource-limited devices that communicate through different kinds of network interfaces. Software services deployed in such networks shall adapt themselves according to possible execution contexts and requirement changes. At the same time, software services have to be competitive in terms of the Quality of Service (QoS) provided, or perceived by the end user.

  20. A User's Guide to the Meta-Analysis of Research Studies. Meta-Stat: Software To Aid in the Meta-Analysis of Research Findings.

    ERIC Educational Resources Information Center

    Rudner, Lawrence M.; Glass Gene V.; Evartt, David L.; Emery, Patrick J.

    This manual and the accompanying software are intended to provide a step-by-step guide to conducting a meta-analytic study along with references for further reading and free high-quality software, "Meta-Stat." "Meta-Stat" is a comprehensive package designed to help in the meta-analysis of research studies in the social and behavioral sciences.…

  1. An Experimental Investigation of Computer Program Development Approaches and Computer Programming Metrics.

    DTIC Science & Technology

    1979-12-01

    team programming in reducing software development costs relative to ad hoc approaches and improving software product quality relative to...are interpreted as demonstrating the advantages of disciplined team programming in reducing software development costs relative to ad hoc approaches...is due partially to the cost and impracticality of a valid experimental setup within a production environment. Thus the question remains, are

  2. Survivability as a Tool for Evaluating Open Source Software

    DTIC Science & Technology

    2015-06-01

    the thesis limited the program development, so it is only able to process project issues (bugs or feature requests), which is an important metric for...Ideally, these insights may provide an analytic framework to generate guidance for decision makers that may support the inclusion of OSS to more...refine their efforts to build quality software and to strengthen their software development communities. 1.4 Research Questions This thesis addresses

  3. Automated verification of flight software. User's manual

    NASA Technical Reports Server (NTRS)

    Saib, S. H.

    1982-01-01

    AVFS (Automated Verification of Flight Software), a collection of tools for analyzing source programs written in FORTRAN and AED, is documented. The quality and the reliability of flight software are improved by: (1) indented listings of source programs, (2) static analysis to detect inconsistencies in the use of variables and parameters, (3) automated documentation, (4) instrumentation of source code, (5) retesting guidance, (6) analysis of assertions, (7) symbolic execution, (8) generation of verification conditions, and (9) simplification of verification conditions. Use of AVFS in the verification of flight software is described.

  4. Applying CASE Tools for On-Board Software Development

    NASA Astrophysics Data System (ADS)

    Brammer, U.; Hönle, A.

    For many space projects the software development is facing great pressure with respect to quality, costs and schedule. One way to cope with these challenges is the application of CASE tools for automatic generation of code and documentation. This paper describes two CASE tools: Rhapsody (I-Logix), featuring UML, and ISG (BSSE), which provides modeling of finite state machines. Both tools have been used at Kayser-Threde in different space projects for the development of on-board software. The tools are discussed with regard to the full software development cycle.

  5. Advanced fingerprint verification software

    NASA Astrophysics Data System (ADS)

    Baradarani, A.; Taylor, J. R. B.; Severin, F.; Maev, R. Gr.

    2016-05-01

    We have developed a fingerprint software package that can be used in a wide range of applications, from law enforcement to public and private security systems, and to personal devices such as laptops, vehicles, and door locks. The software and processing units are a unique implementation of new and sophisticated algorithms that compete with the current best systems in the world. Development of the software package has been in line with the third generation of our ultrasonic fingerprinting machine. Solid and robust performance is achieved in the presence of misplaced and low-quality fingerprints.

  6. Testing and validation of computerized decision support systems.

    PubMed

    Sailors, R M; East, T D; Wallace, C J; Carlson, D A; Franklin, M A; Heermann, L K; Kinder, A T; Bradshaw, R L; Randolph, A G; Morris, A H

    1996-01-01

    Systematic, thorough testing of decision support systems (DSSs) prior to release to general users is a critical aspect of high-quality software design. Omission of this step may lead to the dangerous, and potentially fatal, condition of relying on a system with outputs of uncertain quality. Thorough testing requires a great deal of effort and is a difficult job because the tools necessary to facilitate testing are not well developed. Testing is a job ill-suited to humans because it requires tireless attention to a large number of details. For these reasons, the majority of DSSs available are probably not well tested prior to release. We have successfully implemented a software design and testing plan which has helped us meet our goal of continuously improving the quality of our DSS software prior to release. While requiring a large amount of effort, we feel that documenting and standardizing our testing methods are important steps toward meeting recognized national and international quality standards. Our testing methodology includes both functional and structural testing and requires input from all levels of development. Our system does not focus solely on meeting design requirements but also addresses the robustness of the system and the completeness of testing.
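
    To illustrate the distinction drawn above (with a toy rule invented for this sketch, not taken from the authors' system): a functional test is derived from the specified behaviour, while a structural test is derived from the code's own branch structure.

        def recommend_action(spo2: float) -> str:  # hypothetical DSS rule
            if spo2 < 88.0:
                return "increase FiO2"
            return "maintain FiO2"

        def test_functional_requirement():
            # From the specification: desaturation must trigger an increase.
            assert recommend_action(85.0) == "increase FiO2"

        def test_structural_branch_coverage():
            # Exercise the remaining branch so every path runs at least once.
            assert recommend_action(95.0) == "maintain FiO2"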

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mundy, D; Tryggestad, E; Beltran, C

    Purpose: To develop daily and monthly quality assurance (QA) programs in support of a new spot-scanning proton treatment facility using a combination of commercial and custom equipment and software. Emphasis was placed on efficiency and evaluation of key quality parameters. Methods: The daily QA program was developed to test output, spot size and position, proton beam energy, and image guidance using the Sun Nuclear Corporation rf-DQA™3 device and Atlas QA software. The program utilizes standard Atlas linear accelerator tests repurposed for proton measurements and a custom jig for indexing the device to the treatment couch. The monthly QA program was designed to test mechanical performance, image quality, radiation quality, isocenter coincidence, and safety features. Many of these tests are similar to linear accelerator QA counterparts, but many require customized test design and equipment. Coincidence of imaging, laser marker, mechanical, and radiation isocenters, for instance, is verified using a custom film-based device devised and manufactured at our facility. Proton spot size and position as a function of energy are verified using a custom spot pattern incident on film and analysis software developed in-house. More details concerning the equipment and software developed for monthly QA are included in the supporting document. Thresholds for daily and monthly tests were established via perturbation analysis, early experience, and/or proton system specifications and associated acceptance test results. Results: The periodic QA program described here has been in effect for approximately 9 months and has proven efficient and sensitive to sub-clinical variations in treatment delivery characteristics. Conclusion: Tools and professional guidelines for periodic proton system QA are not as well developed as their photon and electron counterparts. The program described here efficiently evaluates key quality parameters and, while specific to the needs of our facility, could be readily adapted to other proton centers.

  8. Teacher-Pedagogy Approach for Sustainable Proficiency

    ERIC Educational Resources Information Center

    Nath, Baiju K.; Balan, Meera

    2010-01-01

    Quality concerns of an institution shall be explained in terms of hardware and software. The hardware comprises buildings and other infrastructural facilities, and the software involves teachers, students and administrative staff. Various agencies such as National Council for Educational Research & Training (NCERT), National Council for Teacher…

  9. Automated Theorem Proving in High-Quality Software Design

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Swanson, Keith (Technical Monitor)

    2001-01-01

    The amount and complexity of software developed during the last few years has increased tremendously. In particular, programs are being used more and more in embedded systems (from car brakes to plant control). Many of these applications are safety-relevant, i.e. a malfunction of hardware or software can cause severe damage or loss. Tremendous risks are typically present in the area of aviation, (nuclear) power plants or (chemical) plant control. Here, even small problems can lead to thousands of casualties and huge financial losses. Large financial risks also exist when computer systems are used in the area of telecommunication (telephone, electronic commerce) or space exploration. Computer applications in this area are not only subject to safety considerations, but security issues are important as well. All these systems must be designed and developed to guarantee high quality with respect to safety and security. Even in an industrial setting which is (or at least should be) aware of the high requirements in Software Engineering, many incidents occur. For example, the Warsaw Airbus crash was caused by an incomplete requirements specification. Uncontrolled reuse of an Ariane 4 software module was the reason for the Ariane 5 disaster. Some recent incidents in the telecommunication area, like the illegal "cloning" of smart cards of D2 GSM mobile phones, or the extraction of (secret) passwords from German T-online users, show that serious flaws can also happen in this area. Due to the inherent complexity of computer systems, most authors claim that only a rigorous application of formal methods in all stages of the software life cycle can ensure high quality of the software and lead to truly safe and secure systems. In this paper, we will examine to what extent automated theorem proving can contribute to a more widespread application of formal methods and their tools, and what automated theorem provers (ATPs) must provide in order to be useful.

  10. Developing Software to “Track and Catch” Missed Follow-up of Abnormal Test Results in a Complex Sociotechnical Environment

    PubMed Central

    Smith, M.; Murphy, D.; Laxmisan, A.; Sittig, D.; Reis, B.; Esquivel, A.; Singh, H.

    2013-01-01

    Summary Background Abnormal test results do not always receive timely follow-up, even when providers are notified through electronic health record (EHR)-based alerts. High workload, alert fatigue, and other demands on attention disrupt a provider’s prospective memory for tasks required to initiate follow-up. Thus, EHR-based tracking and reminding functionalities are needed to improve follow-up. Objectives The purpose of this study was to develop a decision-support software prototype enabling individual and system-wide tracking of abnormal test result alerts lacking follow-up, and to conduct formative evaluations, including usability testing. Methods We developed a working prototype software system, the Alert Watch And Response Engine (AWARE), to detect abnormal test result alerts lacking documented follow-up, and to present context-specific reminders to providers. Development and testing took place within the VA’s EHR and focused on four cancer-related abnormal test results. Design concepts emphasized mitigating the effects of high workload and alert fatigue while being minimally intrusive. We conducted a multifaceted formative evaluation of the software, addressing fit within the larger socio-technical system. Evaluations included usability testing with the prototype and interview questions about organizational and workflow factors. Participants included 23 physicians, 9 clinical information technology specialists, and 8 quality/safety managers. Results Evaluation results indicated that our software prototype fit within the technical environment and clinical workflow, and physicians were able to use it successfully. Quality/safety managers reported that the tool would be useful in future quality assurance activities to detect patients who lack documented follow-up. Additionally, we successfully installed the software on the local facility’s “test” EHR system, thus demonstrating technical compatibility. Conclusion To address the factors involved in missed test results, we developed a software prototype to account for technical, usability, organizational, and workflow needs. Our evaluation has shown the feasibility of the prototype as a means of facilitating better follow-up for cancer-related abnormal test results. PMID:24155789
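
    A hedged sketch of the core tracking rule: surface abnormal-result alerts that still have no documented follow-up after a grace period. The record schema and the 14-day window are hypothetical illustrations, not the AWARE implementation.

        from datetime import datetime, timedelta

        def alerts_lacking_followup(alerts: list[dict], now: datetime,
                                    window_days: int = 14) -> list[dict]:
            """Return alerts that should trigger a context-specific reminder."""
            cutoff = now - timedelta(days=window_days)
            return [a for a in alerts
                    if a["abnormal"]                         # abnormal test result
                    and a["followup_documented_at"] is None  # nothing documented yet
                    and a["transmitted_at"] < cutoff]        # grace period elapsed

        overdue = alerts_lacking_followup(
            [{"abnormal": True, "followup_documented_at": None,
              "transmitted_at": datetime(2013, 1, 1)}],
            now=datetime(2013, 2, 1))
        print(len(overdue))  # 1 -> candidate for an AWARE-style reminder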

  11. High-throughput image analysis of tumor spheroids: a user-friendly software application to measure the size of spheroids automatically and accurately.

    PubMed

    Chen, Wenjin; Wong, Chung; Vosburgh, Evan; Levine, Arnold J; Foran, David J; Xu, Eugenia Y

    2014-07-08

    The increasing number of applications of three-dimensional (3D) tumor spheroids as an in vitro model for drug discovery requires their adaptation to large-scale screening formats in every step of a drug screen, including large-scale image analysis. Currently there is no ready-to-use and free image analysis software to meet this large-scale format. Most existing methods involve manually drawing the length and width of the imaged 3D spheroids, which is a tedious and time-consuming process. This study presents a high-throughput image analysis software application - SpheroidSizer, which measures the major and minor axial length of the imaged 3D tumor spheroids automatically and accurately; calculates the volume of each individual 3D tumor spheroid; then outputs the results in two different forms in spreadsheets for easy manipulation in the subsequent data analysis. The main advantage of this software is its powerful image analysis application that is adapted for large numbers of images. It provides high-throughput computation and a quality-control workflow. The estimated time to process 1,000 images is about 15 min on a minimally configured laptop, or around 1 min on a multi-core performance workstation. The graphical user interface (GUI) is also designed for easy quality control, and users can manually override the computer results. The key method used in this software is adapted from the active contour algorithm, also known as Snakes, which is especially suitable for images with the uneven illumination and noisy background that often plague automated image processing in high-throughput screens. The complementary "Manual Initialize" and "Hand Draw" tools give SpheroidSizer the flexibility to deal with various types of spheroids and diverse-quality images. This high-throughput image analysis software remarkably reduces labor and speeds up the analysis process. Implementing this software is beneficial for 3D tumor spheroids to become a routine in vitro model for drug screens in industry and academia.
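
    A hedged sketch of the measurement step only (the package itself segments with an active contour; here a pre-computed binary mask stands in so the axis and volume arithmetic is visible), using scikit-image region properties:

        import numpy as np
        from skimage.measure import label, regionprops

        def spheroid_size(mask: np.ndarray, um_per_pixel: float) -> dict:
            """Major/minor axes and volume from a binary spheroid mask."""
            region = max(regionprops(label(mask)), key=lambda r: r.area)
            major = region.major_axis_length * um_per_pixel
            minor = region.minor_axis_length * um_per_pixel
            # One common convention: ellipsoid of rotation about the major axis,
            # V = (pi/6) * major * minor^2 (an assumption, not necessarily the
            # paper's exact formula).
            volume = (np.pi / 6.0) * major * minor ** 2
            return {"major_um": major, "minor_um": minor, "volume_um3": volume}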

  12. Developing software to "track and catch" missed follow-up of abnormal test results in a complex sociotechnical environment.

    PubMed

    Smith, M; Murphy, D; Laxmisan, A; Sittig, D; Reis, B; Esquivel, A; Singh, H

    2013-01-01

    Abnormal test results do not always receive timely follow-up, even when providers are notified through electronic health record (EHR)-based alerts. High workload, alert fatigue, and other demands on attention disrupt a provider's prospective memory for tasks required to initiate follow-up. Thus, EHR-based tracking and reminding functionalities are needed to improve follow-up. The purpose of this study was to develop a decision-support software prototype enabling individual and system-wide tracking of abnormal test result alerts lacking follow-up, and to conduct formative evaluations, including usability testing. We developed a working prototype software system, the Alert Watch And Response Engine (AWARE), to detect abnormal test result alerts lacking documented follow-up, and to present context-specific reminders to providers. Development and testing took place within the VA's EHR and focused on four cancer-related abnormal test results. Design concepts emphasized mitigating the effects of high workload and alert fatigue while being minimally intrusive. We conducted a multifaceted formative evaluation of the software, addressing fit within the larger socio-technical system. Evaluations included usability testing with the prototype and interview questions about organizational and workflow factors. Participants included 23 physicians, 9 clinical information technology specialists, and 8 quality/safety managers. Evaluation results indicated that our software prototype fit within the technical environment and clinical workflow, and physicians were able to use it successfully. Quality/safety managers reported that the tool would be useful in future quality assurance activities to detect patients who lack documented follow-up. Additionally, we successfully installed the software on the local facility's "test" EHR system, thus demonstrating technical compatibility. To address the factors involved in missed test results, we developed a software prototype to account for technical, usability, organizational, and workflow needs. Our evaluation has shown the feasibility of the prototype as a means of facilitating better follow-up for cancer-related abnormal test results.

  13. Evaluating the feasibility of using online software to collect patient information in a chiropractic practice-based research network

    PubMed Central

    Kania-Richmond, Ania; Weeks, Laura; Scholten, Jeffrey; Reney, Mikaël

    2016-01-01

    Background: Practice based research networks (PBRNs) are increasingly used as a tool for evidence based practice. We developed and tested the feasibility of using software to enable online collection of patient data within a chiropractic PBRN to support clinical decision making and research in participating clinics. Purpose: To assess the feasibility of using online software to collect quality patient information. Methods: The study consisted of two phases: 1) Assessment of the quality of information provided, using a standardized form; and 2) Exploration of patients’ perspectives and experiences regarding online information provision through semi-structured interviews. Data analysis was descriptive. Results: Forty-five new patients were recruited. Thirty-six completed online forms, which were submitted by an appropriate person 100% of the time, with an error rate of less than 1%, and submitted in a timely manner 83% of the time. Twenty-one participants were interviewed. Overall, online forms were preferred given perceived security, ease of use, and enabling provision of more accurate information. Conclusions: Use of online software is feasible, provides high quality information, and is preferred by most participants. A pen-and-paper format should be available for patients with this preference and in case of technical difficulties. PMID:27069272

  14. COMPUTERIZED NEEDS-ORIENTED QUALITY MEASUREMENT EVALUATION SYSTEM (CONQUEST)

    EPA Science Inventory

    CONQUEST is an easy-to-use quality improvement software tool that uses a common structure and language to help users identity, understand, compare, evaluate, and select among 1,200 clinical performance measures that can be used to assess and improve quality of care. CONQUEST's in...

  15. Interventions to Improve the Quality of Outpatient Specialty Referral Requests: A Systematic Review.

    PubMed

    Hendrickson, Chase D; Lacourciere, Stacy L; Zanetti, Cole A; Donaldson, Patrick C; Larson, Robin J

    2016-09-01

    Requests for outpatient specialty consultations occur frequently but often are of poor quality because of incompleteness. The authors searched bibliographic databases, trial registries, and references during October 2014 for studies evaluating interventions to improve the quality of outpatient specialty referral requests compared to usual practice. Two reviewers independently extracted data and assessed quality. Findings were qualitatively summarized for completeness of information relayed in a referral request within naturally emerging intervention categories. Of 3495 articles screened, 11 were eligible. All 3 studies evaluating software-based interventions found statistically significant improvements. Among 4 studies evaluating template/pro forma interventions, completeness was uniformly improved but with variable or unreported statistical significance. Of 4 studies evaluating educational interventions, 2 favored the intervention and 2 found no difference. One study evaluating referral management was negative. Current evidence for improving referral request quality is strongest for software-based interventions and templates, although methodological quality varied and findings may be setting specific. © The Author(s) 2015.

  16. Time-lapse systems for embryo incubation and assessment in assisted reproduction.

    PubMed

    Armstrong, Sarah; Bhide, Priya; Jordan, Vanessa; Pacey, Allan; Farquhar, Cindy

    2018-05-25

    Embryo incubation and assessment is a vital step in assisted reproductive technology (ART). Traditionally, embryo assessment has been achieved by removing embryos from a conventional incubator daily for quality assessment by an embryologist, under a light microscope. Over recent years time-lapse systems have been developed which can take digital images of embryos at frequent time intervals. This allows embryologists, with or without the assistance of embryo selection software, to assess the quality of the embryos without physically removing them from the incubator. The potential advantages of a time-lapse system (TLS) include the ability to maintain a stable culture environment, therefore limiting the exposure of embryos to changes in gas composition, temperature and movement. A TLS has the potential advantage of improving embryo selection for ART treatment by utilising additional information gained through continuously monitoring embryo development. Use of a TLS often adds significant extra cost onto an in vitro fertilisation (IVF) cycle. The objective was to determine the effect of a TLS compared to conventional embryo incubation and assessment on clinical outcomes in couples undergoing ART. We used standard methodology recommended by Cochrane. We searched the Cochrane Gynaecology and Fertility (CGF) Group trials register, CENTRAL, MEDLINE, Embase, CINAHL and two trials registers on 2 August 2017. We included randomised controlled trials (RCTs) in the following comparisons: comparing a TLS, with or without embryo selection software, versus conventional incubation with morphological assessment; and TLS with embryo selection software versus TLS without embryo selection software among couples undergoing ART. We used standard methodological procedures recommended by Cochrane. The primary review outcomes were live birth, miscarriage and stillbirth. Secondary outcomes were clinical pregnancy and cumulative clinical pregnancy. We reported quality of the evidence for important outcomes using GRADE methodology. We made the following comparisons: (1) TLS with conventional morphological assessment of still TLS images versus conventional incubation and assessment; (2) TLS utilising embryo selection software versus TLS with conventional morphological assessment of still TLS images; (3) TLS utilising embryo selection software versus conventional incubation and assessment.
    MAIN RESULTS: We included eight RCTs (N = 2303 women). The quality of the evidence ranged from very low to moderate. The main limitations were imprecision and risk of bias associated with lack of blinding of participants and researchers, and indirectness secondary to significant heterogeneity between interventions in some studies. There were no data on cumulative clinical pregnancy.
    TLS with conventional morphological assessment of still TLS images versus conventional incubation and assessment: There is no evidence of a difference between the interventions in live birth rates (odds ratio (OR) 0.73, 95% CI 0.47 to 1.13, 2 RCTs, N = 440, I² = 11%, moderate-quality evidence), and there may also be no difference in miscarriage rates (OR 2.25, 95% CI 0.84 to 6.02, 2 RCTs, N = 440, I² = 44%, low-quality evidence). The evidence suggests that if the live birth rate associated with conventional incubation and assessment is 33%, the rate with use of TLS with conventional morphological assessment of still TLS images is between 19% and 36%; and that if the miscarriage rate with conventional incubation is 3%, the rate associated with conventional morphological assessment of still TLS images would be between 3% and 18%. There is no evidence of a difference between the interventions in the stillbirth rate (OR 1.00, 95% CI 0.13 to 7.49, 1 RCT, N = 76, low-quality evidence) or in clinical pregnancy rates (OR 0.88, 95% CI 0.58 to 1.33, 3 RCTs, N = 489, I² = 0%, moderate-quality evidence).
    TLS utilising embryo selection software versus TLS with conventional morphological assessment of still TLS images: No data were available on live birth or stillbirth. We are uncertain whether TLS utilising embryo selection software influences miscarriage rates (OR 1.39, 95% CI 0.64 to 3.01, 2 RCTs, N = 463, I² = 0%, very low-quality evidence), and there may be no difference in clinical pregnancy rates (OR 0.97, 95% CI 0.67 to 1.42, 2 RCTs, N = 463, I² = 0%, low-quality evidence). The evidence suggests that if the miscarriage rate associated with assessment of still TLS images is 5%, the rate with embryo selection software would be between 3% and 14%.
    TLS utilising embryo selection software versus conventional incubation and assessment: There is no evidence that TLS utilising embryo selection software improves live birth rates relative to conventional incubation (OR 1.21, 95% CI 0.96 to 1.54, 2 RCTs, N = 1017, I² = 0%, very low-quality evidence). We are uncertain whether TLS influences miscarriage rates (OR 0.73, 95% CI 0.49 to 1.08, 3 RCTs, N = 1351, I² = 0%, very low-quality evidence). The evidence suggests that if the live birth rate associated with conventional incubation is 38%, the rate with use of TLS would be between 36% and 58%, and that if the miscarriage rate with conventional incubation is 9%, the rate associated with TLS would be between 4% and 10%. No data on stillbirths were available. It was uncertain whether the intervention influenced clinical pregnancy rates (OR 1.17, 95% CI 0.94 to 1.45, 3 RCTs, N = 1351, I² = 42%, very low-quality evidence).
    There is insufficient evidence of differences in live birth, miscarriage, stillbirth or clinical pregnancy to choose between TLS, with or without embryo selection software, and conventional incubation. As the studies were at high risk of bias for randomisation and allocation concealment, the results should be interpreted with extreme caution.
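
    The "evidence suggests" ranges above are simple arithmetic on the control-group rate and the confidence limits of the odds ratio. A worked version of that conversion:

        def rate_from_or(control_rate: float, odds_ratio: float) -> float:
            """Implied intervention-group rate given a control rate and an OR."""
            odds = odds_ratio * control_rate / (1.0 - control_rate)
            return odds / (1.0 + odds)

        # Live birth, first comparison: baseline 33%, OR 95% CI 0.47 to 1.13.
        low = rate_from_or(0.33, 0.47)
        high = rate_from_or(0.33, 1.13)
        print(f"{low:.0%} to {high:.0%}")  # 19% to 36%, matching the review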

  17. TMT approach to observatory software development process

    NASA Astrophysics Data System (ADS)

    Buur, Hanne; Subramaniam, Annapurni; Gillies, Kim; Dumas, Christophe; Bhatia, Ravinder

    2016-07-01

    The purpose of the Observatory Software System (OSW) is to integrate all software and hardware components of the Thirty Meter Telescope (TMT) to enable observations and data capture. It is a complex software system defined by four principal software subsystems: Common Software (CSW), Executive Software (ESW), Data Management System (DMS) and Science Operations Support System (SOSS), all of which have interdependencies with the observatory control systems and data acquisition systems. The software development process and plan must therefore account for dependencies on other subsystems; manage architecture, interfaces and design; manage software scope and complexity; and standardize and optimize the use of resources and tools. Additionally, the TMT Observatory Software will largely be developed in India through TMT's workshare relationship with the India TMT Coordination Centre (ITCC) and the use of Indian software industry vendors, which adds complexity and challenges to the software development process, to the communication and coordination of activities and priorities, and to measuring performance and managing quality and risk. The software project management challenge for the TMT OSW is thus a multifaceted technical, managerial, communications and interpersonal-relations challenge. TMT is managing this challenge through a combination of measures: establishing an effective, geographically distributed software team (Integrated Product Team) with strong project management and technical leadership provided by the TMT Project Office (PO) and the ITCC partner, to manage plans, process, performance, risk and quality and to facilitate effective communications; establishing an effective cross-functional software management team composed of stakeholders, OSW leadership and ITCC leadership, to manage dependencies, software release plans, technical complexities and changes to approved interfaces, architecture, design and tool set; adopting an agile-based software development process across the observatory to enable frequent software releases and help mitigate subsystem interdependencies; and defining concise scope and work packages for each of the OSW subsystems to facilitate effective outsourcing of software deliverables to the ITCC partner and to enable performance monitoring and risk management. At this stage, the architecture and high-level design of the software system have been established and reviewed. During construction, each subsystem will have a final design phase with reviews, followed by implementation and testing. The results of the TMT approach to the observatory software development process will only be preliminary at the time this paper is submitted, but the early results are anticipated to be a favorable indication of progress.
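
    As an illustration of the subsystem interdependencies the development plan must manage, the hypothetical Python sketch below models the four OSW subsystems as a dependency graph and derives one valid integration order. The specific dependencies shown are assumptions made for illustration, not taken from the TMT design.

        # Requires Python 3.9+ for graphlib.
        from graphlib import TopologicalSorter

        # Assumed (illustrative) dependencies: each subsystem is mapped to the
        # subsystems it builds on.
        osw_dependencies = {
            "CSW": set(),            # Common Software: foundation layer
            "ESW": {"CSW"},          # Executive Software builds on CSW
            "DMS": {"CSW"},          # Data Management System builds on CSW
            "SOSS": {"CSW", "DMS"},  # Science Operations Support uses DMS products
        }

        # A topological order yields one integration/release sequence that
        # respects the declared interdependencies.
        print(list(TopologicalSorter(osw_dependencies).static_order()))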

  18. Software Dependability and Safety Evaluations ESA's Initiative

    NASA Astrophysics Data System (ADS)

    Hernek, M.

    ESA has allocated funds for an initiative to evaluate dependability and safety methods for software. The objectives of this initiative are to validate safety and dependability techniques for software more extensively, and to provide valuable results that improve software quality, thus promoting the application of dependability and safety methods and techniques. ESA space systems are developed according to defined product assurance (PA) requirement specifications. These requirements may be implemented through various design concepts, e.g. redundancy and diversity, varying from project to project. Analysis methods (FMECA, FTA, HA, etc.) are frequently used during requirements analysis and design activities to assure the correct implementation of system PA requirements. The criticality level of failures, functions and systems is determined, and in doing so the critical sub-systems are identified, on which dependability and safety techniques are to be applied during development. Proper performance of the software development requires a technical specification for the products at the beginning of the life cycle. Such a technical specification comprises both functional and non-functional requirements; the non-functional requirements address characteristics of the product such as quality, dependability, safety and maintainability. Software in space systems is increasingly used in critical functions, and the trend towards more frequent use of COTS and reusable components poses new difficulties in assuring reliable and safe systems; because of this, software dependability and safety must be carefully analysed. ESA has identified and documented techniques, methods and procedures to ensure that software dependability and safety requirements are specified and taken into account during the design and development of a software system, and to verify and validate that the implemented software systems comply with these requirements [R1].
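
    The criticality-driven selection step described above can be pictured with a small sketch: analysis results (e.g. from an FMECA or FTA) assign each software function a criticality category, and only functions above a threshold are flagged for the additional dependability and safety techniques. The categories, function names and threshold below are illustrative, not taken from an ESA standard.

        from dataclasses import dataclass

        @dataclass
        class SoftwareFunction:
            name: str
            criticality: str  # "A" (most severe) to "D" (least), illustrative scale

        # Functions whose failure consequences were categorised during analysis.
        functions = [
            SoftwareFunction("thruster_control", "A"),
            SoftwareFunction("telemetry_formatting", "C"),
            SoftwareFunction("housekeeping_log", "D"),
        ]

        # Only the critical subset receives the extra dependability/safety
        # techniques during development.
        CRITICAL_CATEGORIES = {"A", "B"}
        critical = [f.name for f in functions if f.criticality in CRITICAL_CATEGORIES]
        print("Apply dependability/safety techniques to:", critical)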

  19. Fostering successful scientific software communities

    NASA Astrophysics Data System (ADS)

    Bangerth, W.; Heister, T.; Hwang, L.; Kellogg, L. H.

    2016-12-01

    Developing sustainable open source software packages for the sciences appears at first to be primarily a technical challenge: how can one create stable and robust algorithms, appropriate software designs, sufficient documentation, quality assurance strategies such as continuous integration and test suites, or backward compatibility approaches that yield high-quality software usable not only by the authors, but also by the broader community of scientists? However, our experience from almost two decades of leading the development of the deal.II software library (http://www.dealii.org, a widely used finite element package) and the ASPECT code (http://aspect.dealii.org, used to simulate convection in the Earth's mantle) has taught us that the technical aspects are not the most difficult ones in scientific open source software. Rather, it is the social challenge of building and maintaining a community of users and developers interested in answering questions on user forums, contributing code, and jointly finding solutions to common technical and non-technical challenges. These problems are posed in an environment where project leaders typically have no resources to reward the majority of contributors, where very few people are specifically paid for the work they do on the project, and where contributors turn over frequently as project members rotate into and out of jobs. In particular, much software work is done by graduate students who may become fluent in a software package only a year or two before they leave academia. We will discuss strategies we have found do and do not work in maintaining and growing the communities around the scientific software projects we lead. Specifically, we will discuss the management style necessary to keep contributors engaged, ways to give credit where credit is due, and ways of structuring documentation to decrease reliance on forums and thereby allow user communities to grow without straining those who answer questions.

  20. Simulation of textile manufacturing processes for planning, scheduling, and quality control purposes

    NASA Astrophysics Data System (ADS)

    Cropper, A. E.; Wang, Z.

    1995-08-01

    Simulation, as a management information tool, has been applied to engineering manufacture and assembly operations. The application of the same principles to textile manufacturing (fiber to fabric) is discussed. The particular problems and solutions encountered in applying the simulation software package to yarn production processes are discussed, with an indication of how the software achieves the production schedule. The system appears to have application in planning, scheduling, and quality assurance, the latter being a result of the traceability made possible through a process involving the mixing and splitting of material.
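
    The traceability claim can be illustrated with a minimal sketch (an assumed design, not the package described above): if every material lot records its parent lots, any yarn or fabric lot can be traced back through mixing and splitting to its original fiber batches for quality-assurance purposes.

        from dataclasses import dataclass, field

        @dataclass
        class Lot:
            lot_id: str
            parents: list["Lot"] = field(default_factory=list)

            def trace_origins(self) -> set[str]:
                """Return the ids of all ancestor lots (fiber batches, blends, ...)."""
                origins: set[str] = set()
                for parent in self.parents:
                    origins |= {parent.lot_id} | parent.trace_origins()
                return origins

        # Two fiber lots are mixed into a blend, which is split into two yarn lots.
        fiber_a, fiber_b = Lot("FIBER-A"), Lot("FIBER-B")
        blend = Lot("BLEND-1", parents=[fiber_a, fiber_b])
        yarn_1 = Lot("YARN-1", parents=[blend])
        print(sorted(yarn_1.trace_origins()))  # ['BLEND-1', 'FIBER-A', 'FIBER-B']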
