Sample records for standard numerical software

  1. Data Processing System (DPS) software with experimental design, statistical analysis and data mining developed for use in entomological research.

    PubMed

    Tang, Qi-Yi; Zhang, Chuan-Xi

    2013-04-01

    A comprehensive but simple-to-use software package called DPS (Data Processing System) has been developed to execute a range of standard numerical analyses and operations used in experimental design, statistics and data mining. This program runs on standard Windows computers. Many of the functions are specific to entomological and other biological research and are not found in standard statistical software. This paper presents applications of DPS to experimental design, statistical analysis and data mining in entomology. © 2012 The Authors Insect Science © 2012 Institute of Zoology, Chinese Academy of Sciences.

  2. Strategies for the Creation, Design and Implementation of Effective Interactive Computer-Aided Learning Software in Numerate Business Subjects--The Byzantium Experience.

    ERIC Educational Resources Information Center

    Wilkinson-Riddle, G. J.; Patel, Ashok

    1998-01-01

    Discusses courseware development, including intelligent tutoring systems, under the Teaching and Learning Technology Programme and the Byzantium project that was designed to define computer-aided learning performance standards suitable for numerate business subjects; examine reasons to use computer-aided learning; and improve access to educational…

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lehe, Remi

    Many simulation codes produce data in the form of a set of field values or a set of particle positions; one such example is that of particle-in-cell codes, which produce data on the electromagnetic fields that they simulate. However, each code uses its own format and layout for the output data. This makes it difficult to compare the results of different simulation codes, or to have a common visualization tool for these results. A standardized layout for fields and particles has recently been developed: the openPMD format (www.openpmd.org). This format is open-source and specifies a standard way in which field data and particle data should be written. The openPMD format is already implemented in the particle-in-cell code Warp (developed at LBL) and in PIConGPU (developed at HZDR, Germany). In this context, the proposed software (openPMD-viewer) is a Python package that provides access to, and visualization of, any data formatted according to the openPMD standard. This package contains two main components: a Python API, which reads and extracts the data from an openPMD file so that it can be worked with in the Python environment (e.g., plotted or reprocessed with user-defined Python functions), and a graphical interface, which works within the IPython notebook and allows the user to quickly visualize the data and browse through a set of openPMD files. The proposed software will typically be used when analyzing the results of numerical simulations, to quickly extract scientific meaning from a set of numerical data.
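
    As a hedged illustration of how such data might be read, the sketch below uses the openPMD-viewer Python API; the package import name, the ./diags/hdf5/ directory, and the 'electrons' species are assumptions for this example, not details given in the record.

      # Minimal sketch of reading openPMD data with openPMD-viewer (assumed
      # current package layout; directory and species names are illustrative).
      from openpmd_viewer import OpenPMDTimeSeries

      ts = OpenPMDTimeSeries('./diags/hdf5/')

      # Field data: the longitudinal electric field at the first iteration,
      # returned as a NumPy array plus axis metadata.
      Ez, info = ts.get_field(field='E', coord='z', iteration=ts.iterations[0])

      # Particle data: positions of a species named 'electrons'.
      x, z = ts.get_particle(var_list=['x', 'z'], species='electrons',
                             iteration=ts.iterations[0])
      print(Ez.shape, info.imshow_extent, x.size)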

  4. Molecular radiotherapy: the NUKFIT software for calculating the time-integrated activity coefficient.

    PubMed

    Kletting, P; Schimmel, S; Kestler, H A; Hänscheid, H; Luster, M; Fernández, M; Bröer, J H; Nosske, D; Lassmann, M; Glatting, G

    2013-10-01

    Calculation of the time-integrated activity coefficient (residence time) is a crucial step in dosimetry for molecular radiotherapy. However, available software is deficient in that it is either not tailored for use in molecular radiotherapy and/or does not include all required estimation methods. The aim of this work was therefore the development and programming of an algorithm that allows an objective and reproducible determination of the time-integrated activity coefficient and its standard error. The algorithm includes the selection of a set of fitting functions from predefined sums of exponentials and the choice of an error model for the data at hand. To estimate the values of the adjustable parameters, an objective function, depending on the data, the parameters of the error model, the fitting function and (if required and available) Bayesian information, is minimized. To increase reproducibility and user-friendliness, the starting values are determined automatically using a combination of curve stripping and random search. Visual inspection, the coefficient of determination, the standard error of the fitted parameters, and the correlation matrix are provided to evaluate the quality of the fit. The functions most supported by the data are determined using the corrected Akaike information criterion. The time-integrated activity coefficient is estimated by analytically integrating the fitted functions. Its standard error is determined assuming Gaussian error propagation. The software was implemented in MATLAB. To validate the proper implementation of the objective function and the fit functions, the results of NUKFIT and SAAM numerical, a commercially available software tool, were compared. The automatic search for starting values was successfully tested for reproducibility. The quality criteria applied in conjunction with the Akaike information criterion allowed the selection of suitable functions. Function fit parameters and their standard errors estimated using SAAM numerical and NUKFIT showed differences of <1%. The differences for the time-integrated activity coefficients were also <1% (standard error between 0.4% and 3%). In general, the application of the software is user-friendly and the results are mathematically correct and reproducible. An application of NUKFIT is presented for three different clinical examples. The software tool, with its underlying methodology, can be employed to objectively and reproducibly estimate the time-integrated activity coefficient and its standard error for most time-activity data in molecular radiotherapy.
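
    NUKFIT itself is a MATLAB tool whose interface is not given in this record; as a hedged, generic sketch of the underlying idea, the Python fragment below fits a sum of exponentials to illustrative time-activity data, integrates it analytically, and propagates the fit covariance to a standard error.

      # Generic sketch (not NUKFIT): fit A(t) = a1*exp(-l1*t) + a2*exp(-l2*t),
      # then use the analytic integral over [0, inf): TIAC = a1/l1 + a2/l2.
      import numpy as np
      from scipy.optimize import curve_fit

      t = np.array([1.0, 4.0, 24.0, 48.0, 96.0])     # h (illustrative data)
      A = np.array([0.30, 0.28, 0.20, 0.12, 0.05])   # fraction of injected activity

      def biexp(t, a1, l1, a2, l2):
          return a1 * np.exp(-l1 * t) + a2 * np.exp(-l2 * t)

      popt, pcov = curve_fit(biexp, t, A, p0=[0.2, 0.01, 0.1, 0.1])
      a1, l1, a2, l2 = popt
      tiac = a1 / l1 + a2 / l2                       # time-integrated activity coeff.

      # Gaussian error propagation through the analytic integral.
      grad = np.array([1/l1, -a1/l1**2, 1/l2, -a2/l2**2])
      tiac_se = np.sqrt(grad @ pcov @ grad)
      print(f"TIAC = {tiac:.2f} +/- {tiac_se:.2f} h")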

  5. influx_s: increasing numerical stability and precision for metabolic flux analysis in isotope labelling experiments.

    PubMed

    Sokol, Serguei; Millard, Pierre; Portais, Jean-Charles

    2012-03-01

    The problem of stationary metabolic flux analysis based on isotope labelling experiments first appeared in the early 1950s and was essentially solved in the early 2000s. Several algorithms and software packages are available for this problem. However, the generic stochastic algorithms (simulated annealing or evolutionary algorithms) currently used in this software require a lot of time to achieve acceptable precision. For deterministic algorithms, a common drawback is the lack of convergence stability for ill-conditioned systems or when started from a random point. In this article, we present a new deterministic algorithm with significantly increased numerical stability and accuracy of flux estimation compared with commonly used algorithms. It requires relatively little CPU time (from several seconds to several minutes on a standard PC architecture) to estimate fluxes in the central carbon metabolism network of Escherichia coli. The software package influx_s implementing this algorithm is distributed under an open-source licence at http://metasys.insa-toulouse.fr/software/influx/. Supplementary data are available at Bioinformatics online.

  6. A general framework for parametric survival analysis.

    PubMed

    Crowther, Michael J; Lambert, Paul C

    2014-12-30

    Parametric survival models are increasingly used as an alternative to the Cox model in biomedical research. Through direct modelling of the baseline hazard function, we can gain greater understanding of the risk profile of patients over time, obtaining absolute measures of risk. Commonly used parametric survival models, such as the Weibull, make restrictive assumptions about the baseline hazard function, such as monotonicity, which are often violated in clinical datasets. In this article, we extend the general framework of parametric survival models proposed by Crowther and Lambert (Journal of Statistical Software 53:12, 2013) to incorporate relative survival, and robust and cluster robust standard errors. We describe the general framework through three applications to clinical datasets, in particular illustrating the use of restricted cubic splines, modelled on the log hazard scale, to provide a highly flexible survival modelling framework. Through the use of restricted cubic splines, we can derive the cumulative hazard function analytically beyond the boundary knots, resulting in a combined analytic/numerical approach, which substantially improves the estimation process compared with using numerical integration alone. User-friendly Stata software is provided, which significantly extends the parametric survival models available in standard software. Copyright © 2014 John Wiley & Sons, Ltd.
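
    The benefit of the analytic tail can be sketched in a few lines: a restricted cubic spline in log time is linear beyond its boundary knots, so the hazard there is a power law with a closed-form integral, and quadrature is only needed inside the knots. The Python fragment below illustrates this with made-up coefficients; it is not the authors' Stata implementation.

      # Combined analytic/numerical cumulative hazard (illustrative numbers).
      # Beyond the upper boundary knot kU, log h(t) = g0 + g1*log(t), i.e.
      # h(t) = exp(g0) * t**g1, which integrates in closed form.
      import numpy as np
      from scipy.integrate import quad

      kU = 5.0             # upper boundary knot (illustrative)
      g0, g1 = -2.0, 0.5   # linear log-hazard coefficients beyond kU (illustrative)

      def hazard(s):
          # Inside the knots this would be the full spline; the same power
          # law is reused here to keep the sketch short.
          return np.exp(g0) * s ** g1

      def cum_hazard(t):
          H, _ = quad(hazard, 0.0, min(t, kU))   # numerical part, inside knots
          if t > kU:                             # analytic power-law tail
              H += np.exp(g0) / (g1 + 1.0) * (t ** (g1 + 1.0) - kU ** (g1 + 1.0))
          return H

      t = 20.0
      print("H(t) =", cum_hazard(t), " S(t) =", np.exp(-cum_hazard(t)))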

  7. Comparison of software tools for kinetic evaluation of chemical degradation data.

    PubMed

    Ranke, Johannes; Wöltjen, Janina; Meinecke, Stefan

    2018-01-01

    For evaluating the fate of xenobiotics in the environment, a variety of degradation or environmental metabolism experiments are routinely conducted. The data generated in such experiments are evaluated by optimizing the parameters of kinetic models so that the model simulation fits the data. No comparison of the main software tools currently in use has been published to date. This article presents a comparison of numerical results as well as an overall, somewhat subjective comparison based on a scoring system using a set of criteria. The scoring was performed separately for two types of use. Uses of type I are routine evaluations involving standard kinetic models and up to three metabolites in a single compartment. Evaluations involving non-standard model components, more than three metabolites or more than a single compartment belong to use type II. For use type I, usability is most important, while the flexibility of the model definition is most important for use type II. Test datasets were assembled that can be used to compare the numerical results for different software tools. These datasets can also be used to ensure that no unintended or erroneous behaviour is introduced in newer versions. In the comparison of numerical results, good agreement between the parameter estimates was observed for datasets with up to three metabolites. For the now unmaintained reference software DegKinManager/ModelMaker, and for OpenModel, which is still under development, user options were identified that should be attended to in order to obtain results that are as reliable as possible. Based on the scoring system mentioned above, the software tools gmkin, KinGUII and CAKE received the best scores for use type I. Of the 15 software packages compared with respect to use type II, gmkin and KinGUII again came first, followed by the script-based tool mkin, which is the technical basis for gmkin, and by OpenModel. Based on the evaluation using the system of criteria mentioned above and the comparison of numerical results for the suite of test datasets, the software tools gmkin, KinGUII and CAKE are recommended for use type I, and gmkin and KinGUII for use type II. For users who prefer to work with scripts instead of graphical user interfaces, mkin is recommended. For future software evaluations, it is recommended to include in the scoring scheme a measure of the total time that a typical user needs for a kinetic evaluation. It is the hope of the authors that the publication of test data, source code and overall rankings fosters the evolution of useful and reliable software in the field.
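
    The tools compared here are R packages or GUI applications; as a language-neutral illustration of the simplest "use type I" evaluation, the Python sketch below fits a single first-order (SFO) degradation model to made-up data. It does not stand in for any of the compared packages.

      # Illustrative single first-order (SFO) fit: C(t) = C0*exp(-k*t),
      # DT50 = ln(2)/k. Data are invented for the example.
      import numpy as np
      from scipy.optimize import curve_fit

      days = np.array([0.0, 3.0, 7.0, 14.0, 28.0, 56.0])
      conc = np.array([100.0, 84.0, 68.0, 47.0, 22.0, 5.1])  # % of applied

      def sfo(t, c0, k):
          return c0 * np.exp(-k * t)

      (c0, k), pcov = curve_fit(sfo, days, conc, p0=[100.0, 0.05])
      print(f"C0 = {c0:.1f} %, k = {k:.4f} 1/day, DT50 = {np.log(2)/k:.1f} days")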

  8. An Artificial Neural Networks Method for Solving Partial Differential Equations

    NASA Astrophysics Data System (ADS)

    Alharbi, Abir

    2010-09-01

    While many analytical and numerical techniques already exist for solving PDEs, this paper introduces an approach using artificial neural networks. The approach consists of a technique developed by combining a standard numerical method, finite differences, with the Hopfield neural network. The method is denoted Hopfield-finite-difference (HFD). The architecture of the nets, the energy function, the updating equations, and the algorithms are developed for the method. The HFD method has been used successfully to approximate the solution of classical PDEs, such as the wave, heat, Poisson and diffusion equations, and of a system of PDEs. The software MATLAB is used to obtain the results in both tabular and graphical form. The results are similar in accuracy to those obtained by standard numerical methods. In terms of speed, the parallel nature of the Hopfield net method makes it easier to implement on fast parallel computers, while some numerical methods need extra effort for parallelization.
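
    For reference, the conventional half of the comparison looks like the sketch below: an explicit finite-difference scheme for the 1-D heat equation, checked against the exact solution. The Hopfield-net component is not reproduced; all parameters are illustrative.

      # Standard explicit finite-difference solution of u_t = alpha * u_xx
      # on [0, L] with u(0) = u(L) = 0 (illustrative parameters).
      import numpy as np

      alpha, L, T = 1.0, 1.0, 0.1
      nx, nt = 51, 2000
      dx, dt = L / (nx - 1), T / nt
      r = alpha * dt / dx**2
      assert r <= 0.5, "explicit scheme requires r <= 0.5 for stability"

      x = np.linspace(0.0, L, nx)
      u = np.sin(np.pi * x)                 # initial condition

      for _ in range(nt):
          u[1:-1] += r * (u[2:] - 2.0 * u[1:-1] + u[:-2])

      exact = np.exp(-np.pi**2 * alpha * T) * np.sin(np.pi * x)
      print("max error:", np.max(np.abs(u - exact)))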

  9. GPAW - massively parallel electronic structure calculations with Python-based software.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Enkovaara, J.; Romero, N.; Shende, S.

    2011-01-01

    Electronic structure calculations are a widely used tool in materials science and a large consumer of supercomputing resources. Traditionally, the software packages for this kind of simulation have been implemented in compiled languages, where Fortran in its different versions has been the most popular choice. While dynamic, interpreted languages, such as Python, can increase the efficiency of the programmer, they cannot compete directly with the raw performance of compiled languages. However, by using an interpreted language together with a compiled language, it is possible to have most of the productivity-enhancing features together with good numerical performance. We have used this approach in implementing the electronic structure simulation software GPAW using a combination of the Python and C programming languages. While the chosen approach works well in standard workstations and Unix environments, massively parallel supercomputing systems can present challenges in porting, debugging and profiling the software. In this paper we describe some details of the implementation and discuss the advantages and challenges of the combined Python/C approach. We show that despite the challenges it is possible to obtain good numerical performance and good parallel scalability with Python-based software.
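
    In practice GPAW is driven from Python through the ASE package, with the numerics done in C underneath; the short sketch below shows a typical workstation-scale calculation (the molecule, mode, and functional are illustrative choices, not taken from the paper).

      # Minimal GPAW usage sketch (assumes GPAW and ASE are installed).
      from ase.build import molecule
      from gpaw import GPAW

      atoms = molecule('H2O')
      atoms.center(vacuum=3.0)               # place the molecule in a box

      atoms.calc = GPAW(mode='fd', xc='PBE', txt='h2o.txt')
      energy = atoms.get_potential_energy()  # the C code does the heavy lifting
      print('Total energy: %.3f eV' % energy)

    The same kind of script can be launched through MPI on parallel systems, which is part of what makes the Python/C split attractive on supercomputers.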

  10. Applying Use Cases to Describe the Role of Standards in e-Health Information Systems

    NASA Astrophysics Data System (ADS)

    Chávez, Emma; Finnie, Gavin; Krishnan, Padmanabhan

    Individual health records (IHRs) contain a person's lifetime records of their key health history and care within a health system (National E-Health Transition Authority, Retrieved Jan 12, 2009 from http://www.nehta.gov.au/coordinated-care/whats-in-iehr, 2004). This information can be processed and stored in different ways. The record should be available electronically to authorized health care providers and the individual anywhere, anytime, to support high-quality care. Many organizations provide a diversity of solutions for e-health and its services. Standards play an important role in enabling these organizations to support information interchange and improve the efficiency of health care delivery. However, there are numerous standards to choose from, and not all of them are accessible to the software developer. This chapter proposes a framework to describe the e-health standards that can be used by software engineers to implement e-health information systems.

  11. Steady-State Cycle Deck Launcher Developed for Numerical Propulsion System Simulation

    NASA Technical Reports Server (NTRS)

    VanDrei, Donald E.

    1997-01-01

    One of the objectives of NASA's High Performance Computing and Communications Program's (HPCCP) Numerical Propulsion System Simulation (NPSS) is to reduce the time and cost of generating aerothermal numerical representations of engines, called customer decks. These customer decks, which are delivered to airframe companies by various U.S. engine companies, numerically characterize an engine's performance as defined by the particular U.S. airframe manufacturer. Until recently, all numerical models were provided with a Fortran-compatible interface in compliance with the Society of Automotive Engineers (SAE) document AS681F, and data communication was performed via a standard, labeled common structure in compliance with AS681F. Recently, the SAE committee began to develop a new standard: AS681G. AS681G addresses multiple language requirements for customer decks along with alternative data communication techniques. Along with the SAE committee, the NPSS Steady-State Cycle Deck project team developed a standard Application Program Interface (API) supported by a graphical user interface. This work will result in Aerospace Recommended Practice 4868 (ARP4868). The Steady-State Cycle Deck work was validated against the Energy Efficient Engine customer deck, which is publicly available. The Energy Efficient Engine wrapper was used not only to validate ARP4868 but also to demonstrate how to wrap an existing customer deck. The graphical user interface for the Steady-State Cycle Deck facilitates the use of the new standard and makes it easier to design and analyze a customer deck. This software was developed following I. Jacobson's Object-Oriented Design methodology and is implemented in C++. The AS681G standard will establish a common generic interface for U.S. engine companies and airframe manufacturers. This will lead to more accurate cycle models, quicker model generation, and faster validation leading to specifications. The standard will facilitate cooperative work between industry and NASA. The NPSS Steady-State Cycle Deck team released a batch version of the Steady-State Cycle Deck in March 1996. Version 1.1 was released in June 1996. During fiscal 1997, NPSS accepted enhancements and modifications to the Steady-State Cycle Deck launcher. Consistent with NPSS' commercialization plan, these modifications will be done by a third party that can provide long-term software support.

  12. The International Atomic Energy Agency software package for the analysis of scintigraphic renal dynamic studies: a tool for the clinician, teacher, and researcher.

    PubMed

    Zaknun, John J; Rajabi, Hossein; Piepsz, Amy; Roca, Isabel; Dondi, Maurizio

    2011-01-01

    Under the auspices of the International Atomic Energy Agency, a new-generation, platform-independent, x86-compatible software package was developed for the analysis of scintigraphic renal dynamic imaging studies. It provides nuclear medicine professionals with cost-free access to the most recent developments in the field. The software package is a step toward harmonization and standardization. Embedded functionalities render it a suitable tool for education, for research, and for obtaining distant expert opinions. Another objective of this effort is to introduce clinically useful parameters of drainage, including normalized residual activity and outflow efficiency. Furthermore, it provides an effective teaching tool for young professionals who are being introduced to dynamic kidney studies, through selected teaching case studies. The software facilitates a better understanding by letting the user vary different settings hands-on and observe their effect on the numerical results. An effort was made to introduce quality assurance instruments at the various levels of the program's execution, including visual inspection, automatic detection and correction of patient motion, automatic placement of regions of interest around the kidneys and cortical regions, and placement of a reproducible background region on both the primary dynamic and the post-micturition studies. The user can calculate the differential renal function through two independent methods, the integral or the Rutland-Patlak approach. Standardized digital reports, storage and retrieval of regions of interest, and built-in database operations allow the generation and tracing of full image reports and numerical outputs. The software package is undergoing quality assurance procedures to verify its accuracy and inter-user reproducibility, with the final aim of launching the program for use by professionals and teaching institutions worldwide. Copyright © 2011 Elsevier Inc. All rights reserved.
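
    As a hedged illustration of the simpler of the two methods, the sketch below computes the differential renal function by the integral method from background-subtracted renogram curves; the time window and count curves are invented for the example and do not come from the package.

      # Integral-method differential renal function (DRF), illustrative only.
      import numpy as np

      t = np.arange(0.0, 300.0, 10.0)              # s, frame mid-times
      left  = 900.0 * (1.0 - np.exp(-t / 80.0))    # background-subtracted counts
      right = 600.0 * (1.0 - np.exp(-t / 80.0))

      win = (t >= 60.0) & (t <= 120.0)             # early uptake window (assumed)
      L_int = np.trapz(left[win], t[win])
      R_int = np.trapz(right[win], t[win])
      print(f"left DRF = {100.0 * L_int / (L_int + R_int):.1f} %")  # 60% here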

  13. Computer investigations of the turbulent flow around a NACA2415 airfoil wind turbine

    NASA Astrophysics Data System (ADS)

    Driss, Zied; Chelbi, Tarek; Abid, Mohamed Salah

    2015-12-01

    In this work, computer investigations are carried out to study the flow field developing around a NACA2415 airfoil wind turbine. The Navier-Stokes equations in conjunction with the standard k-ɛ turbulence model are considered. These equations are solved numerically to determine the local characteristics of the flow. The models tested are implemented in the software "SolidWorks Flow Simulation", which uses a finite volume scheme. The numerical results are validated against experiments conducted in an open wind tunnel. This will help improve aerodynamic efficiency in the design of packaged installations of the NACA2415 airfoil type wind turbine.

  14. Cultural and Technological Issues and Solutions for Geodynamics Software Citation

    NASA Astrophysics Data System (ADS)

    Heien, E. M.; Hwang, L.; Fish, A. E.; Smith, M.; Dumit, J.; Kellogg, L. H.

    2014-12-01

    Computational software and custom-written codes play a key role in scientific research and teaching, providing tools to perform data analysis and forward modeling through numerical computation. However, development of these codes is often hampered by the fact that there is no well-defined way for the authors to receive credit or professional recognition for their work through the standard methods of scientific publication and subsequent citation of the work. This in turn may discourage researchers from publishing their codes or making them easier for other scientists to use. We investigate the issues involved in citing software in a scientific context, and introduce features that should be components of a citation infrastructure, particularly oriented towards the codes and scientific culture in the area of geodynamics research. The codes used in geodynamics are primarily specialized numerical modeling codes for continuum mechanics problems; they may be developed by individual researchers, teams of researchers, geophysicists in collaboration with computational scientists and applied mathematicians, or by coordinated community efforts such as the Computational Infrastructure for Geodynamics. Some but not all geodynamics codes are open-source. These characteristics are common to many areas of geophysical software development and use. We provide background on the problem of software citation and discuss some of the barriers preventing adoption of such citations, including social/cultural barriers, insufficient technological support infrastructure, and an overall lack of agreement about what a software citation should consist of. We suggest solutions in an initial effort to create a system to support citation of software and promotion of scientific software development.

  15. Numerical implementation of the S-matrix algorithm for modeling of relief diffraction gratings

    NASA Astrophysics Data System (ADS)

    Yaremchuk, Iryna; Tamulevičius, Tomas; Fitio, Volodymyr; Gražulevičiūte, Ieva; Bobitski, Yaroslav; Tamulevičius, Sigitas

    2013-11-01

    A new numerical implementation is developed to calculate the diffraction efficiency of relief diffraction gratings. In the new formulation, vectors containing the expansion coefficients of electric and magnetic fields on boundaries of the grating layer are expressed by additional constants. An S-matrix algorithm has been systematically described in detail and adapted to a simple matrix form. This implementation is suitable for the study of optical characteristics of periodic structures by using modern object-oriented programming languages and different standard mathematical software. The modeling program has been developed on the basis of this numerical implementation and tested by comparison with other commercially available programs and experimental data. Numerical examples are given to show the usefulness of the new implementation.
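
    The central operation of any S-matrix algorithm is cascading the scattering matrices of adjacent layers; a hedged NumPy sketch of that step (the Redheffer star product, in the block convention common in RCWA work, which may differ from the paper's notation) is given below.

      # Redheffer star product: combine the scattering matrices of two
      # stacked layers, each given as blocks S11, S12, S21, S22 (NxN arrays).
      import numpy as np

      def star(SA, SB):
          I = np.eye(SA['S11'].shape[0])
          FA = np.linalg.inv(I - SB['S11'] @ SA['S22'])
          FB = np.linalg.inv(I - SA['S22'] @ SB['S11'])
          return {
              'S11': SA['S11'] + SA['S12'] @ FA @ SB['S11'] @ SA['S21'],
              'S12': SA['S12'] @ FA @ SB['S12'],
              'S21': SB['S21'] @ FB @ SA['S21'],
              'S22': SB['S22'] + SB['S21'] @ FB @ SA['S22'] @ SB['S12'],
          }

      # Sanity check: an empty layer (no reflection, full transmission) is
      # the identity element of the product.
      n = 2
      empty = {'S11': np.zeros((n, n)), 'S12': np.eye(n),
               'S21': np.eye(n), 'S22': np.zeros((n, n))}
      assert np.allclose(star(empty, empty)['S12'], np.eye(n))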

  16. TOUGH2_MP: A parallel version of TOUGH2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Keni; Wu, Yu-Shu; Ding, Chris

    2003-04-09

    TOUGH2_MP is a massively parallel version of TOUGH2. It was developed for running on distributed-memory parallel computers to simulate large problems that cannot be solved by the standard, single-CPU TOUGH2 code. The new code implements an efficient massively parallel scheme while preserving the full capability and flexibility of the original TOUGH2 code. The new software uses the METIS software package for grid partitioning and the AZTEC software package for linear-equation solving. The standard message-passing interface is adopted for communication among processors. The numerical performance of the current version of the code has been tested on CRAY-T3E and IBM RS/6000 SP platforms. In addition, the parallel code has been successfully applied to real field problems of multi-million-cell simulations for three-dimensional multiphase and multicomponent fluid and heat flow, as well as solute transport. In this paper, we review the development of TOUGH2_MP and discuss its basic features, modules, and their applications.

  17. LLIMAS: Revolutionizing integrated modeling and analysis at MIT Lincoln Laboratory

    NASA Astrophysics Data System (ADS)

    Doyle, Keith B.; Stoeckel, Gerhard P.; Rey, Justin J.; Bury, Mark E.

    2017-08-01

    MIT Lincoln Laboratory's Integrated Modeling and Analysis Software (LLIMAS) enables the development of novel engineering solutions for advanced prototype systems through unique insights into engineering performance and interdisciplinary behavior to meet challenging size, weight, power, environmental, and performance requirements. LLIMAS is a multidisciplinary design optimization tool that wraps numerical optimization algorithms around an integrated framework of structural, thermal, optical, stray light, and computational fluid dynamics analysis capabilities. LLIMAS software is highly extensible and has developed organically across a variety of technologies including laser communications, directed energy, photometric detectors, chemical sensing, laser radar, and imaging systems. The custom software architecture leverages the capabilities of existing industry standard commercial software and supports the incorporation of internally developed tools. Recent advances in LLIMAS's Structural-Thermal-Optical Performance (STOP), aeromechanical, and aero-optical capabilities as applied to Lincoln prototypes are presented.

  18. Hybrid Numerical-Analytical Scheme for Calculating Elastic Wave Diffraction in Locally Inhomogeneous Waveguides

    NASA Astrophysics Data System (ADS)

    Glushkov, E. V.; Glushkova, N. V.; Evdokimov, A. A.

    2018-01-01

    Numerical simulation of traveling wave excitation, propagation, and diffraction in structures with local inhomogeneities (obstacles) is computationally expensive due to the need for mesh-based approximation of extended domains with rigorous account of the radiation conditions at infinity. Therefore, hybrid numerical-analytic approaches are being developed based on the conjugation of a numerical solution in a local vicinity of the obstacle and/or source with an explicit analytic representation in the remaining semi-infinite external domain. However, standard finite-element software generally does not provide such a coupling with the external field, especially in the case of multimode expansion. This work proposes a hybrid computational scheme that realizes such a conjugation using standard software. The latter is used to construct a set of numerical solutions that serve as the basis for the sought solution in the local internal domain. The unknown expansion coefficients on this basis and on the normal modes in the semi-infinite external domain are then determined from the conditions of displacement and stress continuity at the boundary between the two domains. We describe the implementation of this approach in the scalar and vector cases. To evaluate the reliability of the results and the efficiency of the algorithm, we compare it with a semianalytic solution to the problem of traveling wave diffraction by a horizontal obstacle, as well as with a finite-element solution obtained for a limited domain artificially restricted using absorbing boundaries. As an example, we consider the incidence of a fundamental antisymmetric Lamb wave onto surface and partially submerged elastic obstacles. It is noted that the proposed hybrid scheme can also be used to determine the eigenfrequencies and eigenforms of resonance scattering, as well as the characteristics of traveling waves in embedded waveguides.

  19. Numerical comparison of grid pattern diffraction effects through measurement and modeling with OptiScan software

    NASA Astrophysics Data System (ADS)

    Murray, Ian B.; Densmore, Victor; Bora, Vaibhav; Pieratt, Matthew W.; Hibbard, Douglas L.; Milster, Tom D.

    2011-06-01

    Coatings of various metalized patterns are used for heating and electromagnetic interference (EMI) shielding applications. Previous work has focused on macro differences between different types of grids, and has shown good correlation between measurements and analyses of grid diffraction. To advance this work, we have utilized the University of Arizona's OptiScan software, which has been optimized for this application by using the Babinet Principle. When operating on an appropriate computer system, this algorithm produces results hundreds of times faster than standard Fourier-based methods, and allows realistic cases to be modeled for the first time. By using previously published derivations by Exotic Electro-Optics, we compare diffraction performance of repeating and randomized grid patterns with equivalent sheet resistance using numerical performance metrics. Grid patterns of each type are printed on optical substrates and measured energy is compared against modeled energy.
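
    The standard Fourier-based baseline mentioned above can be sketched in a few lines: the Fraunhofer far field of the transmission mask is its 2-D FFT, and Babinet's principle relates the opaque grid to its complementary aperture. The grid dimensions below are illustrative, not those of the published patterns.

      # Fourier-based estimate of grid diffraction (the slow baseline, not
      # the OptiScan implementation). Illustrative grid geometry.
      import numpy as np

      n, period, linewidth = 1024, 32, 4          # samples and pixels
      yy, xx = np.mgrid[0:n, 0:n]
      opaque = ((xx % period) < linewidth) | ((yy % period) < linewidth)
      aperture = np.where(opaque, 0.0, 1.0)       # transmission mask

      far_field = np.fft.fftshift(np.fft.fft2(aperture))
      intensity = np.abs(far_field) ** 2

      # Simple numerical metric: fraction of energy diffracted out of the
      # zeroth (DC) order.
      dc = intensity[n // 2, n // 2]
      print("diffracted fraction:", (intensity.sum() - dc) / intensity.sum())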

  20. Toward uniform implementation of parametric map Digital Imaging and Communication in Medicine standard in multisite quantitative diffusion imaging studies.

    PubMed

    Malyarenko, Dariya; Fedorov, Andriy; Bell, Laura; Prah, Melissa; Hectors, Stefanie; Arlinghaus, Lori; Muzi, Mark; Solaiyappan, Meiyappan; Jacobs, Michael; Fung, Maggie; Shukla-Dave, Amita; McManus, Kevin; Boss, Michael; Taouli, Bachir; Yankeelov, Thomas E; Quarles, Christopher Chad; Schmainda, Kathleen; Chenevert, Thomas L; Newitt, David C

    2018-01-01

    This paper reports on results of a multisite collaborative project launched by the MRI subgroup of Quantitative Imaging Network to assess current capability and provide future guidelines for generating a standard parametric diffusion map Digital Imaging and Communication in Medicine (DICOM) in clinical trials that utilize quantitative diffusion-weighted imaging (DWI). Participating sites used a multivendor DWI DICOM dataset of a single phantom to generate parametric maps (PMs) of the apparent diffusion coefficient (ADC) based on two models. The results were evaluated for numerical consistency among models and true phantom ADC values, as well as for consistency of metadata with attributes required by the DICOM standards. This analysis identified missing metadata descriptive of the sources for detected numerical discrepancies among ADC models. Instead of the DICOM PM object, all sites stored ADC maps as DICOM MR objects, generally lacking designated attributes and coded terms for quantitative DWI modeling. Source-image reference, model parameters, ADC units and scale, deemed important for numerical consistency, were either missing or stored using nonstandard conventions. Guided by the identified limitations, the DICOM PM standard has been amended to include coded terms for the relevant diffusion models. Open-source software has been developed to support conversion of site-specific formats into the standard representation.
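
    For context, the quantity being mapped is simple; what the study audits is the metadata around it. A hedged sketch of the mono-exponential ADC computation from two b-values is shown below with synthetic arrays (in practice the signals and attributes would be read from the DICOM series, e.g. with pydicom).

      # Mono-exponential model: S(b) = S0 * exp(-b * ADC), so with two
      # b-values ADC = ln(S_low / S_high) / (b_high - b_low). Synthetic data.
      import numpy as np

      b_low, b_high = 0.0, 900.0                       # s/mm^2
      S_low = np.array([[1200.0, 1100.0], [1000.0, 950.0]])
      S_high = S_low * np.exp(-b_high * 1.1e-3)        # true ADC = 1.1e-3 mm^2/s

      adc = np.log(S_low / S_high) / (b_high - b_low)  # mm^2/s
      print(adc)

      # ADC maps are commonly stored as scaled integers (e.g. in 1e-6 mm^2/s);
      # the units and scale attributes are exactly the metadata whose absence
      # the study reports.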

  1. Interoperability of Neuroscience Modeling Software

    PubMed Central

    Cannon, Robert C.; Gewaltig, Marc-Oliver; Gleeson, Padraig; Bhalla, Upinder S.; Cornelis, Hugo; Hines, Michael L.; Howell, Fredrick W.; Muller, Eilif; Stiles, Joel R.; Wils, Stefan; De Schutter, Erik

    2009-01-01

    Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to the duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “Neuro-IT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19-20 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. PMID:17873374

  2. ScanImage: flexible software for operating laser scanning microscopes.

    PubMed

    Pologruto, Thomas A; Sabatini, Bernardo L; Svoboda, Karel

    2003-05-17

    Laser scanning microscopy is a powerful tool for analyzing the structure and function of biological specimens. Although numerous commercial laser scanning microscopes exist, some of the more interesting and challenging applications demand custom design. A major impediment to custom design is the difficulty of building custom data acquisition hardware and writing the complex software required to run the laser scanning microscope. We describe a simple, software-based approach to operating a laser scanning microscope without the need for custom data acquisition hardware. Data acquisition and control of laser scanning are achieved through standard data acquisition boards. The entire burden of signal integration and image processing is placed on the CPU of the computer. We quantitate the effectiveness of our data acquisition and signal conditioning algorithm under a variety of conditions. We implement our approach in an open source software package (ScanImage) and describe its functionality. We present ScanImage, software to run a flexible laser scanning microscope that allows easy custom design.

  3. Multimodal visualization interface for data management, self-learning and data presentation.

    PubMed

    Van Sint Jan, S; Demondion, X; Clapworthy, G; Louryan, S; Rooze, M; Cotten, A; Viceconti, M

    2006-10-01

    Multimodal visualization software, called the Data Manager (DM), has been developed to increase interdisciplinary communication around the topic of visualization and modeling of various aspects of the human anatomy. Numerous tools used in radiology are integrated in the interface, which runs on standard personal computers. The available tools, combined with hierarchical data management and custom layouts, allow medical imaging data to be analyzed using advanced features outside radiological premises (for example, for patient review, conference presentation or tutorial preparation). The system is free and based on an open-source software development architecture, so updates of the system for custom applications are possible.

  4. Computer output microfilm (FR80) systems software documentation, volume 2

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The system consists of a series of programs which convert digital data from magnetic tapes into alphanumeric characters, graphic plots, and imagery that is recorded on the face of a cathode ray tube. A special camera photographs the face of the tube on microfilm for subsequent display on a film reader. The documents applicable to this system are delineated. The functional relationship between the system software, the standard insert routines, and the applications programs is described; all the applications programs are described in detail. Instructions for locating those documents are presented, along with test preparation sheets for all baseline and/or program modification acceptance tests.

  5. NASA space station software standards issues

    NASA Technical Reports Server (NTRS)

    Tice, G. D., Jr.

    1985-01-01

    The selection and application of software standards present the NASA Space Station Program with the opportunity to serve as a pacesetter for the United States software industry in the area of software standards. The strengths and weaknesses of each of the NASA-defined software standards issues are summarized and discussed. Several significant standards issues are offered for NASA consideration. A challenge is presented for the NASA Space Station Program to serve as a pacesetter for the U.S. software industry through: (1) management commitment to software standards; (2) overall program participation in software standards; and (3) employment of the best available technology to support software standards.

  6. Using CAD software to simulate PV energy yield - The case of product integrated photovoltaic operated under indoor solar irradiation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reich, N.H.; van Sark, W.G.J.H.M.; Turkenburg, W.C.

    2010-08-15

    In this paper, we show that photovoltaic (PV) energy yields can be simulated using standard rendering and ray-tracing features of Computer Aided Design (CAD) software. To this end, three-dimensional (3-D) sceneries are ray-traced in CAD. The PV power output is then modeled by translating irradiance intensity data of rendered images back into numerical data. To ensure accurate results, the solar irradiation data used as input is compared to numerical data obtained from rendered images, showing excellent agreement. As expected, ray-tracing precision in the CAD software also proves to be very high. To demonstrate PV energy yield simulations using this innovative concept, solar radiation time course data of a few days was modeled in 3-D to simulate distributions of irradiance incident on flat, single- and double-bend shapes and on a PV-powered computer mouse located on a window sill. Comparisons of measured to simulated PV output of the mouse show that simulation accuracies can be very high in practice as well. Theoretically, this concept has great potential, as it can be adapted to suit a wide range of solar energy applications, such as sun-tracking and concentrator systems, Building Integrated PV (BIPV) or Product Integrated PV (PIPV). However, graphical user interfaces of 'CAD-PV' software tools are not yet available.

  7. Numerical simulation of flood barriers

    NASA Astrophysics Data System (ADS)

    Srb, Pavel; Petrů, Michal; Kulhavý, Petr

    This paper deals with the testing and numerical simulation of flood barriers. The Czech Republic has been hit by several very devastating floods in past years. These floods caused several dozen casualties, and property damage reached billions of euros. The development of flood control measures is very important, especially for reducing the number of casualties and the amount of property damage. The aim of flood control measures is the detention of water outside populated areas and the drainage of water from populated areas as soon as possible. For a new flood barrier design it is very important to know its behaviour in case of a real flood. During the development of the barrier, several standardized tests have to be carried out. Based on the results of these tests, a numerical simulation was built using Abaqus software and several analyses were carried out. Based on these numerical simulations it will be possible to predict the behaviour of barriers and thus improve their design.

  8. U.S. Army Armament Research, Development and Engineering Center Grain Evaluation Software to Numerically Predict Linear Burn Regression for Solid Propellant Grain Geometries

    DTIC Science & Technology

    2017-10-01

    Report documenting the U.S. Army Armament Research, Development and Engineering Center grain evaluation software, which numerically predicts linear burn regression for solid propellant grain geometries. The views expressed are those of the author(s) and should not be construed as an official Department of the Army position, policy, or decision, unless so designated by other documentation.

  9. Software Formal Inspections Standard

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This Software Formal Inspections Standard (hereinafter referred to as Standard) is applicable to NASA software. This Standard defines the requirements that shall be fulfilled by the software formal inspections process whenever this process is specified for NASA software. The objective of this Standard is to define the requirements for a process that inspects software products to detect and eliminate defects as early as possible in the software life cycle. The process also provides for the collection and analysis of inspection data to improve the inspection process as well as the quality of the software.

  10. SUBOPT: A CAD program for suboptimal linear regulators

    NASA Technical Reports Server (NTRS)

    Fleming, P. J.

    1985-01-01

    An interactive software package which provides design solutions for both standard linear quadratic regulator (LQR) and suboptimal linear regulator problems is described. Intended for time-invariant continuous systems, the package is easily modified to include sampled-data systems. LQR designs are obtained by established techniques while the large class of suboptimal problems containing controller and/or performance index options is solved using a robust gradient minimization technique. Numerical examples demonstrate features of the package and recent developments are described.
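
    The standard LQR computation at the core of such a package is a few lines in modern software; the hedged sketch below uses SciPy's Riccati solver on a double-integrator plant (SUBOPT's own routines and its suboptimal options are not reproduced).

      # Textbook continuous-time LQR: minimize the integral of x'Qx + u'Ru.
      import numpy as np
      from scipy.linalg import solve_continuous_are

      A = np.array([[0.0, 1.0],
                    [0.0, 0.0]])       # double integrator
      B = np.array([[0.0],
                    [1.0]])
      Q = np.eye(2)                    # state weighting
      R = np.array([[1.0]])            # control weighting

      P = solve_continuous_are(A, B, Q, R)
      K = np.linalg.solve(R, B.T @ P)  # optimal gain, u = -K x
      print("gain:", K, "closed-loop poles:", np.linalg.eigvals(A - B @ K))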

  11. GlobiPack v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartlett, Roscoe

    2010-03-31

    GlobiPack contains a small collection of optimization globalization algorithms. These algorithms are used by optimization and various nonlinear equation solver algorithms, serving as the line-search procedure in Newton and quasi-Newton optimization and nonlinear equation solver methods. They are standard published 1-D line search algorithms such as those described in Nocedal and Wright, Numerical Optimization, 2nd edition, 2006. One set of algorithms was copied and refactored from the existing open-source Trilinos package MOOCHO, where the line search code is used to globalize SQP methods. This software is generic to any mathematical optimization problem where smooth derivatives exist. There is no specific connection or mention whatsoever of any specific application; more general mathematical software cannot be found.
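
    As a hedged illustration of the kind of algorithm collected here, the sketch below implements backtracking with the Armijo sufficient-decrease condition, one of the standard 1-D line searches in Nocedal and Wright; it is a Python stand-in, not the Trilinos C++ code.

      # Backtracking line search with the Armijo condition: accept alpha
      # once f(x + alpha*p) <= f(x) + c * alpha * grad(x).p.
      import numpy as np

      def armijo_backtrack(f, grad, x, p, alpha=1.0, c=1e-4, rho=0.5, max_iter=50):
          fx, gp = f(x), grad(x) @ p
          for _ in range(max_iter):
              if f(x + alpha * p) <= fx + c * alpha * gp:
                  break
              alpha *= rho
          return alpha

      # Usage: one steepest-descent step on the Rosenbrock function.
      f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
      grad = lambda x: np.array([-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
                                 200*(x[1] - x[0]**2)])
      x0 = np.array([-1.2, 1.0])
      print("step length:", armijo_backtrack(f, grad, x0, -grad(x0)))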

  12. Toolpack mathematical software development environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osterweil, L.

    1982-07-21

    The purpose of this research project was to produce a well-integrated set of tools for the support of numerical computation. The project entailed the specification, design and implementation of both a diversity of tools and an innovative tool integration mechanism. This large configuration of tightly integrated tools comprises an environment for numerical software development, and has been named Toolpack/IST (Integrated System of Tools). Following the creation of this environment in prototype form, the environment software was readied for widespread distribution by transitioning it to a development organization for systematization, documentation and distribution. It is expected that the public release of Toolpack/IST will begin imminently and will provide a basis for evaluation of the innovative software approaches taken, as well as a uniform set of development tools for the numerical software community.

  13. Object oriented development of engineering software using CLIPS

    NASA Technical Reports Server (NTRS)

    Yoon, C. John

    1991-01-01

    Engineering applications involve numeric complexity and the manipulation of large amounts of data. Traditionally, numeric computation has been the main concern in developing engineering software. As engineering application software became larger and more complex, the management of resources such as data, rather than numeric complexity, became the major software design problem. Object-oriented design and implementation methodologies can improve the reliability, flexibility, and maintainability of the resulting software; however, some tasks are better solved with the traditional procedural paradigm. The C Language Integrated Production System (CLIPS), with its deffunction and defgeneric constructs, supports the procedural paradigm. The natural blending of the object-oriented and procedural paradigms has been cited as the reason for the popularity of the C++ language. The CLIPS Object Oriented Language (COOL) has object-oriented features that are more versatile than those of C++. A software design methodology based on the object-oriented and procedural approaches, appropriate for engineering software and to be implemented in CLIPS, is outlined. A method for sensor placement for Space Station Freedom is being implemented in COOL as a sample problem.

  14. RELAP-7 Software Verification and Validation Plan: Requirements Traceability Matrix (RTM) Part 1 – Physics and numerical methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choi, Yong Joon; Yoo, Jun Soo; Smith, Curtis Lee

    2015-09-01

    This INL plan comprehensively describes the Requirements Traceability Matrix (RTM) for the main physics and numerical methods of RELAP-7. The plan also describes the testing-based software verification and validation (SV&V) process: a set of specially designed software models used to test RELAP-7.

  15. Advancing MODFLOW Applying the Derived Vector Space Method

    NASA Astrophysics Data System (ADS)

    Herrera, G. S.; Herrera, I.; Lemus-García, M.; Hernandez-Garcia, G. D.

    2015-12-01

    The most effective domain decomposition methods (DDM) are non-overlapping DDMs. Recently a new approach, the DVS framework, based on an innovative discretization method that uses a non-overlapping system of nodes (the derived nodes), was introduced and developed by I. Herrera et al. [1, 2]. Using the DVS approach, a group of four algorithms, referred to as the 'DVS algorithms', which fulfill the DDM paradigm (i.e., the solution of global problems is obtained by the resolution of local problems exclusively), has been derived. Such procedures are applicable to any boundary-value problem, or system of such equations, for which a standard discretization method is available, and software with a high degree of parallelization can then be constructed. In a parallel talk at this AGU Fall Meeting, Ismael Herrera will introduce the general DVS methodology. The application of the DVS algorithms has been demonstrated in the solution of several boundary-value problems of interest in geophysics. Numerical examples for a single equation, for the cases of symmetric, non-symmetric and indefinite problems, were demonstrated previously [1, 2]. For these problems the DVS algorithms exhibited significantly improved numerical performance with respect to standard versions of DDM algorithms. In view of these results, our research group is applying the DVS method to a widely used simulator for the first time; here we present the advances in the application of this method to the parallelization of MODFLOW. Efficiency results for a group of tests will be presented. References: [1] I. Herrera, L.M. de la Cruz and A. Rosas-Medina. Non-overlapping discretization methods for partial differential equations, Numer Meth Part D E (2013). [2] I. Herrera and I. Contreras. "An Innovative Tool for Effectively Applying Highly Parallelized Software to Problems of Elasticity". Geofísica Internacional, 2015 (in press).

  16. ACQ4: an open-source software platform for data acquisition and analysis in neurophysiology research.

    PubMed

    Campagnola, Luke; Kratz, Megan B; Manis, Paul B

    2014-01-01

    The complexity of modern neurophysiology experiments requires specialized software to coordinate multiple acquisition devices and analyze the collected data. We have developed ACQ4, an open-source software platform for performing data acquisition and analysis in experimental neurophysiology. This software integrates the tasks of acquiring, managing, and analyzing experimental data. ACQ4 has been used primarily for standard patch-clamp electrophysiology, laser scanning photostimulation, multiphoton microscopy, intrinsic imaging, and calcium imaging. The system is highly modular, which facilitates the addition of new devices and functionality. The modules included with ACQ4 provide for rapid construction of acquisition protocols, live video display, and customizable analysis tools. Position-aware data collection allows automated construction of image mosaics and registration of images with 3-dimensional anatomical atlases. ACQ4 uses free and open-source tools including Python, NumPy/SciPy for numerical computation, PyQt for the user interface, and PyQtGraph for scientific graphics. Supported hardware includes cameras, patch clamp amplifiers, scanning mirrors, lasers, shutters, Pockels cells, motorized stages, and more. ACQ4 is available for download at http://www.acq4.org.

  17. An intelligent maximum permissible exposure meter for safety assessments of laser radiation

    NASA Astrophysics Data System (ADS)

    Corder, D. A.; Evans, D. R.; Tyrer, J. R.

    1996-09-01

    There is frequently a need to make laser power or energy density measurements when determining whether radiation from a laser system exceeds the Maximum Permissible Exposure (MPE) as defined in BS EN 60825. This can be achieved using standard commercially available laser power or energy measurement equipment, but some of these have shortcomings when used in this application. Calculations must be performed by the user to compare the measured value to the MPE. The measurement and calculation procedure appears complex to the nonexpert who may be performing the assessment. A novel approach is described which uses purpose designed hardware and software to simplify the process. The hardware is optimized for measuring the relatively low powers associated with MPEs. The software runs on a Psion Series 3a palmtop computer. This reduces the cost and size of the system yet allows graphical and numerical presentation of data. Data output to other software running on PCs is also possible, enabling the instrument to be used as part of a quality system. Throughout the measurement process the opportunity for user error has been minimized by the hardware and software design.

  18. Maintaining Quality and Confidence in Open-Source, Evolving Software: Lessons Learned with PFLOTRAN

    NASA Astrophysics Data System (ADS)

    Frederick, J. M.; Hammond, G. E.

    2017-12-01

    Software evolution in an open-source framework poses a major challenge to a geoscientific simulator, but when properly managed, the pay-off can be enormous for both the developers and the community at large. Developers must juggle implementing new scientific process models, adopting increasingly efficient numerical methods and programming paradigms, and changing funding sources (or a total lack of funding), while also ensuring that legacy code remains functional and reported bugs are fixed in a timely manner. With robust software engineering and a plan for long-term maintenance, a simulator can evolve over time, incorporating and leveraging many advances in the computational and domain sciences. In this positive light, what practices in software engineering and code maintenance can be employed within open-source development to maximize the positive aspects of software evolution and community contributions while minimizing the negative side effects? This presentation discusses steps taken in the development of PFLOTRAN (www.pflotran.org), an open-source, massively parallel subsurface simulator for multiphase, multicomponent, and multiscale reactive flow and transport processes in porous media. As PFLOTRAN's user base and development team continue to grow, it has become increasingly important to implement strategies which ensure sustainable software development while maintaining software quality and community confidence. In this presentation, we will share our experiences and "lessons learned" within the context of our open-source development framework and community engagement efforts. Topics discussed will include how we have leveraged both standard software engineering principles, such as coding standards, version control, and automated testing, as well as the unique advantages of object-oriented design in process model coupling, to ensure software quality and confidence. We will also be prepared to discuss the major challenges faced by most open-source software teams, such as on-boarding new developers or one-time contributions, dealing with competitors or lookie-loos, and other downsides of complete transparency, as well as our approach to community engagement, including a user group email list, hosting short courses and workshops for new users, and maintaining a website. SAND2017-8174A

  19. Web-Based Real-Time Emergency Monitoring

    NASA Technical Reports Server (NTRS)

    Harvey, Craig A.; Lawhead, Joel

    2007-01-01

    The Web-based Real-Time Asset Monitoring (RAM) module for emergency operations and facility management enables emergency personnel in federal agencies and local and state governments to monitor and analyze data in the event of a natural disaster or other crisis that threatens a large number of people and property. The software can manage many disparate sources of data within a facility, city, or county. It was developed on industry-standard geospatial software and is compliant with open GIS standards. RAM View can function as a standalone system or as an integrated plugin module to Emergency Operations Center (EOC) software suites such as REACT (Real-time Emergency Action Coordination Tool), thus ensuring the widest possible distribution among potential users. RAM has the ability to monitor various data sources, including streaming data. Many disparate systems are included in the initial suite of supported hardware, such as mobile GPS units, ambient measurements of temperature, moisture, and chemical agents, flow meters, air quality, asset location, and meteorological conditions. RAM View displays real-time data streams such as gauge heights from U.S. Geological Survey gauging stations, flood crests from the National Weather Service, and meteorological data from numerous sources. Data points are clearly visible on the map interface, and attributes as specified in the user requirements can be viewed and queried.

  20. Feasibility of video codec algorithms for software-only playback

    NASA Astrophysics Data System (ADS)

    Rodriguez, Arturo A.; Morse, Ken

    1994-05-01

    Software-only video codecs can provide good playback performance in desktop computers with a 486 or 68040 CPU running at 33 MHz without special hardware assistance. Typically, playback of compressed video can be categorized into three tasks: the actual decoding of the video stream, color conversion, and the transfer of decoded video data from system RAM to video RAM. By current standards, good playback performance is the decoding and display of video streams of 320 by 240 (or larger) compressed frames at 15 (or more) frames per second. Software-only video codecs have evolved by modifying and tailoring existing compression methodologies to suit video playback on desktop computers. In this paper we examine the characteristics used to evaluate software-only video codec algorithms, namely: image fidelity (i.e., image quality), bandwidth (i.e., compression), ease of decoding (i.e., playback performance), memory consumption, compression-to-decompression asymmetry, scalability, and delay. We discuss the tradeoffs among these variables and the compromises that can be made to achieve low numerical complexity for software-only playback. Frame-differencing approaches are described, since software-only video codecs typically employ them to enhance playback performance. To complement other papers that appear in this session of the Proceedings, we review methods derived from binary pattern image coding, since these methods are amenable to software-only playback. In particular, we introduce a novel approach called pixel distribution image coding.
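
    A hedged skeleton of the frame-differencing idea follows: only blocks that changed beyond a threshold are re-encoded, and unchanged blocks are copied from the previous frame at playback, which is what keeps the CPU cost low. Block size and threshold are illustrative.

      # Generic frame-differencing skeleton (not any specific codec).
      import numpy as np

      BLOCK, THRESH = 8, 6.0

      def changed_blocks(prev, cur):
          """Yield (row, col, block) for blocks whose mean absolute
          difference from the previous frame exceeds THRESH."""
          h, w = cur.shape
          for r in range(0, h, BLOCK):
              for c in range(0, w, BLOCK):
                  a = prev[r:r+BLOCK, c:c+BLOCK].astype(np.float32)
                  b = cur[r:r+BLOCK, c:c+BLOCK].astype(np.float32)
                  if np.abs(b - a).mean() > THRESH:
                      yield r, c, cur[r:r+BLOCK, c:c+BLOCK]

      rng = np.random.default_rng(0)
      prev = rng.integers(0, 256, (240, 320), dtype=np.uint8)
      cur = prev.copy()
      cur[100:140, 60:120] = 255        # a moving object touches a few blocks
      print(sum(1 for _ in changed_blocks(prev, cur)), "blocks re-encoded")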

  1. Clustered Numerical Data Analysis Using Markov Lie Monoid Based Networks

    NASA Astrophysics Data System (ADS)

    Johnson, Joseph

    2016-03-01

    We have designed and built an optimal numerical standardization algorithm that links numerical values with their associated units, error level, and defining metadata, thus supporting automated data exchange and new levels of artificial intelligence (AI). The software manages all dimensional and error analysis and computational tracing. Tables of entities versus properties of these generalized numbers (called "metanumbers") support a transformation of each table into a network among the entities and another network among their properties, where the network connection matrix is based upon a proximity metric between the two items. We previously proved that every network is isomorphic to the Lie algebra that generates continuous Markov transformations. We have also shown that the eigenvectors of these Markov matrices provide an agnostic clustering of the underlying patterns. We will present this methodology and show how our new work on conversion of scientific numerical data through this process can reveal underlying information clusters ordered by the eigenvalues. We will also show how the linking of clusters from different tables can be used to form a "supernet" of all numerical information, supporting new initiatives in AI.
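
    The pipeline sketched in this abstract (turn a proximity matrix into a Markov generator, then cluster entities by its eigenvectors) might look as follows; the proximity matrix is a toy illustration, not the authors' metanumber data.

    ```python
    import numpy as np

    # Illustrative proximity (connection) matrix among five entities:
    # two internally well-connected blocks, weakly linked to each other.
    A = np.array([
        [0.0, 0.9, 0.8, 0.1, 0.0],
        [0.9, 0.0, 0.7, 0.0, 0.1],
        [0.8, 0.7, 0.0, 0.1, 0.1],
        [0.1, 0.0, 0.1, 0.0, 0.9],
        [0.0, 0.1, 0.1, 0.9, 0.0],
    ])

    # Markov generator: off-diagonal rates from A, columns summing to zero.
    L = A - np.diag(A.sum(axis=0))

    # For this symmetric generator the top eigenvalue is 0; the eigenvector
    # of the second-largest eigenvalue splits the entities into clusters.
    vals, vecs = np.linalg.eigh(L)   # eigenvalues in ascending order
    split = vecs[:, -2]
    print(split > 0)                 # same Boolean within {0,1,2} and within {3,4}
    ```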

  2. TESSIM: a simulator for the Athena-X-IFU

    NASA Astrophysics Data System (ADS)

    Wilms, J.; Smith, S. J.; Peille, P.; Ceballos, M. T.; Cobo, B.; Dauser, T.; Brand, T.; den Hartog, R. H.; Bandler, S. R.; de Plaa, J.; den Herder, J.-W. A.

    2016-07-01

    We present the design of tessim, a simulator for the physics of transition edge sensors developed in the framework of the Athena end-to-end simulation effort. Designed to represent the general behavior of transition edge sensors and to provide input for engineering and science studies for Athena, tessim implements a numerical solution of the linearized equations describing these devices. The simulation includes a model for the relevant noise sources and several implementations of possible trigger algorithms. Input and output of the software are standard FITS files which can be visualized and processed using standard X-ray astronomical tool packages. Tessim is freely available as part of the SIXTE package (http://www.sternwarte.uni-erlangen.de/research/sixte/).
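
    Because tessim writes standard FITS files, its output can be inspected with any FITS-aware toolkit. A minimal sketch using astropy, with a hypothetical file name:

    ```python
    from astropy.io import fits

    # Open a (hypothetical) tessim output file and list its extensions.
    with fits.open("tessim_output.fits") as hdul:
        hdul.info()
        records = hdul[1].data          # first binary-table extension
        print(records.columns.names)    # column names, e.g. time and signal
    ```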

  3. TESSIM: A Simulator for the Athena-X-IFU

    NASA Technical Reports Server (NTRS)

    Wilms, J.; Smith, S. J.; Peille, P.; Ceballos, M. T.; Cobo, B.; Dauser, T.; Brand, T.; Den Hartog, R. H.; Bandler, S. R.; De Plaa, J.; et al.

    2016-01-01

    We present the design of tessim, a simulator for the physics of transition edge sensors developed in the framework of the Athena end-to-end simulation effort. Designed to represent the general behavior of transition edge sensors and to provide input for engineering and science studies for Athena, tessim implements a numerical solution of the linearized equations describing these devices. The simulation includes a model for the relevant noise sources and several implementations of possible trigger algorithms. Input and output of the software are standard FITS files which can be visualized and processed using standard X-ray astronomical tool packages. Tessim is freely available as part of the SIXTE package (http://www.sternwarte.uni-erlangen.de/research/sixte/).

  4. Ground Support Software for Spaceborne Instrumentation

    NASA Technical Reports Server (NTRS)

    Anicich, Vincent; Thorpe, Rob; Fletcher, Greg; Waite, Hunter; Xu, Hykua; Walter, Erin; Frick, Kristie; Farris, Greg; Gell, Dave; Furman, Judy; et al.

    2004-01-01

    ION is a system of ground support software for the ion and neutral mass spectrometer (INMS) instrument aboard the Cassini spacecraft. By incorporating commercial off-the-shelf database, Web server, and Java application components, ION offers considerably more ground-support-service capability than was available previously. A member of the team that operates the INMS or a scientist who uses the data collected by the INMS can gain access to most of the services provided by ION via a standard point-and-click hyperlink interface generated by almost any Web-browser program running in almost any operating system on almost any computer. Data are stored in one central location in a relational database in a non-proprietary format, are accessible in many combinations and formats, and can be combined with data from other instruments and spacecraft. The use of the Java programming language as a system-interface language offers numerous capabilities for object-oriented programming and for making the database accessible to participants using a variety of computer hardware and software.

  5. Planning the Unplanned Experiment: Assessing the Efficacy of Standards for Safety Critical Software

    NASA Technical Reports Server (NTRS)

    Graydon, Patrick J.; Holloway, C. Michael

    2015-01-01

    We need well-founded means of determining whether software is fit for use in safety-critical applications. While software in industries such as aviation has an excellent safety record, the fact that software flaws have contributed to deaths illustrates the need for justifiably high confidence in software. It is often argued that software is fit for safety-critical use because it conforms to a standard for software in safety-critical systems. But little is known about whether such standards 'work.' Reliance upon a standard without knowing whether it works is an experiment; without collecting data to assess the standard, this experiment is unplanned. This paper reports on a workshop intended to explore how standards could practicably be assessed. Planning the Unplanned Experiment: Assessing the Efficacy of Standards for Safety Critical Software (AESSCS) was held on 13 May 2014 in conjunction with the European Dependable Computing Conference (EDCC). We summarize and elaborate on the workshop's discussion of the topic, including both the presented positions and the dialogue that ensued.

  6. New method of processing heat treatment experiments with numerical simulation support

    NASA Astrophysics Data System (ADS)

    Kik, T.; Moravec, J.; Novakova, I.

    2017-08-01

    In this work, the benefits of combining modern software for numerical simulation of welding processes with laboratory research are described. A new method of processing heat treatment experiments is proposed that yields the input data needed for numerical simulations of the heat treatment of large parts. Using experiments on small test samples, it is now possible to simulate cooling conditions comparable to the cooling of larger parts. Results from this method of testing make the boundary conditions used for the real cooling process more accurate, and can also be used to improve software databases and to optimize computational models. The aim is to refine the computation of temperature fields for large hardened parts, based on a new method for determining the temperature dependence of the heat transfer coefficient into the hardening medium for a particular material, a defined maximum thickness of the processed part, and given cooling conditions. The paper also presents a comparison of standard and modified (according to the newly suggested methodology) heat transfer coefficient data and their influence on the simulation results, showing how even small changes affect the distributions of temperature, metallurgical phases, hardness, and stress. The experiment also yields not only input data and data enabling optimization of the computational model but, at the same time, verification data. The greatest advantage of the described method is its independence of the type of cooling medium used.
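
    The core idea of recovering a heat transfer coefficient from a measured cooling curve on a small sample can be illustrated with a lumped-capacitance fit. All geometry and material values below are invented for the sketch; the actual method determines a temperature-dependent coefficient.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Lumped-capacitance cooling of a small test sample:
    #   T(t) = T_med + (T0 - T_med) * exp(-h*A/(m*c) * t)
    A_s, m, c = 1.2e-3, 0.05, 460.0     # surface area [m^2], mass [kg], c_p [J/(kg K)]
    T0, T_med = 850.0, 25.0             # initial and quench-medium temperature [C]

    def cooling(t, h):
        return T_med + (T0 - T_med) * np.exp(-h * A_s / (m * c) * t)

    # Synthetic "measured" cooling curve with noise, true h = 600 W/(m^2 K).
    t = np.linspace(0.0, 60.0, 120)
    T_meas = cooling(t, 600.0) + np.random.default_rng(1).normal(0.0, 2.0, t.size)

    h_fit, _ = curve_fit(cooling, t, T_meas, p0=[100.0])
    print(f"fitted heat transfer coefficient: {h_fit[0]:.0f} W/(m^2 K)")
    ```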

  7. RCHILD - an R-package for flexible use of the landscape evolution model CHILD

    NASA Astrophysics Data System (ADS)

    Dietze, Michael

    2014-05-01

    Landscape evolution models provide powerful approaches to numerically assess earth surface processes: to quantify rates of landscape change, infer sediment transfer rates, estimate sediment budgets, investigate the consequences of changes in external drivers on a geomorphic system, provide spatio-temporal interpolations between known landscape states, or test conceptual hypotheses. CHILD (Channel-Hillslope Integrated Landscape Development model) is one of the most widely used models of landscape change, particularly in the context of interacting tectonic and geomorphologic processes. Running CHILD from the command line and working with the model output can be a rather awkward task (static model control via a text input file, only numeric output in text files). The package RCHILD is a collection of functions for the free statistical software R that help use CHILD in a flexible, dynamic, and user-friendly way. The comprised functions allow creating maps, real-time scenes, animations, and further thematic plots from model output. The model input files can be modified dynamically and, hence, (feedback-related) changes in external factors can be implemented iteratively. Output files can be written to common formats that can be readily imported into standard GIS software. This contribution presents the basic functionality of the model CHILD as visualised and modified by the package. A rough overview of the available functions is given. Application examples help to illustrate the great potential of numerical modelling of geomorphologic processes.

  8. NASA Software Documentation Standard

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as "Standard") is designed to support the documentation of all software developed for NASA; its goal is to provide a framework and model for recording the essential information needed throughout the development life cycle and maintenance of a software system. The NASA Software Documentation Standard can be applied to the documentation of all NASA software. The Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. The basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  9. NASA software documentation standard software engineering program

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as Standard) can be applied to the documentation of all NASA software. This Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. This basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  10. The inverse Numerical Computer Program FLUX-BOT for estimating Vertical Water Fluxes from Temperature Time-Series.

    NASA Astrophysics Data System (ADS)

    Trauth, N.; Schmidt, C.; Munz, M.

    2016-12-01

    Heat as a natural tracer to quantify water fluxes between groundwater and surface water has evolved into a standard hydrological method. Typically, time series of temperatures in the surface water and in the sediment are observed and subsequently evaluated using a vertical 1D representation of heat transport by advection and dispersion. Several analytical solutions, as well as their implementations in user-friendly software, exist for estimating water fluxes from the observed temperatures. Analytical solutions can be easily implemented, but assumptions about the boundary conditions have to be made a priori, e.g. a sinusoidal upper temperature boundary. Numerical models offer more flexibility and can handle temperature data characterized by irregular variations, such as storm-event-induced temperature changes, which cannot readily be incorporated in analytical solutions. This also reduces the effort of data preprocessing, such as extracting the diurnal temperature variation. We developed software to estimate water FLUXes Based On Temperatures: FLUX-BOT. FLUX-BOT is a numerical code written in MATLAB that calculates vertical water fluxes in saturated sediments, based on the inversion of measured temperature time series observed at multiple depths. It applies a cell-centered Crank-Nicolson implicit finite difference scheme to solve the one-dimensional heat advection-conduction equation. Besides its core inverse numerical routines, FLUX-BOT includes functions for visualizing the results and for performing uncertainty analysis. We provide applications of FLUX-BOT to generic as well as to measured temperature data to demonstrate its performance.
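
    The forward problem that FLUX-BOT inverts, one-dimensional heat advection-conduction, can be sketched with a Crank-Nicolson step as follows. Parameter values are illustrative; the real code estimates fluxes by inverting such a model against measured temperatures.

    ```python
    import numpy as np

    nz, dz, dt = 51, 0.01, 10.0      # grid points, spacing [m], time step [s]
    kappa, v = 1e-6, 1e-6            # thermal diffusivity, thermal front velocity [m/s]

    # Spatial operator L T = kappa*T'' - v*T' (central differences, interior nodes).
    L = np.zeros((nz, nz))
    for i in range(1, nz - 1):
        L[i, i - 1] = kappa / dz**2 + v / (2 * dz)
        L[i, i]     = -2 * kappa / dz**2
        L[i, i + 1] = kappa / dz**2 - v / (2 * dz)

    I = np.eye(nz)
    A = I - 0.5 * dt * L             # Crank-Nicolson: (I - dt/2 L) T_new = (I + dt/2 L) T_old
    B = I + 0.5 * dt * L

    T = np.full(nz, 10.0)            # initial sediment temperature [C]
    for step in range(360):          # one hour of simulated time
        T[0] = 10.0 + 5.0 * np.sin(2 * np.pi * step * dt / 86400.0)  # diurnal surface signal
        rhs = B @ T
        rhs[0], rhs[-1] = T[0], T[-1]   # Dirichlet boundaries kept fixed in the solve
        T = np.linalg.solve(A, rhs)
    print(T[:5].round(3))
    ```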

  11. Modeling and simulation of different and representative engineering problems using Network Simulation Method

    PubMed Central

    2018-01-01

    Mathematical models simulating different and representative engineering problems (atomic dry friction, moving-front problems, and elastic and solid mechanics) are presented in the form of sets of non-linear, coupled or uncoupled differential equations. For different values of the parameters that influence the solution, each problem is numerically solved by the network method, which provides all the variables of the problem. Although the models are extremely sensitive to these parameters, no assumptions are made as regards linearization of the variables. The design of the models, which are run on standard electrical circuit simulation software, is explained in detail. The network model results are compared with common numerical methods or experimental data published in the scientific literature to show the reliability of the models. PMID:29518121

  12. Further studies on stability analysis of nonlinear Roesser-type two-dimensional systems

    NASA Astrophysics Data System (ADS)

    Dai, Xiao-Lin

    2014-04-01

    This paper is concerned with further relaxations of the stability analysis of nonlinear Roesser-type two-dimensional (2D) systems in the Takagi-Sugeno fuzzy form. To achieve this goal, a novel slack matrix variable technique, which is homogeneous polynomially parameter-dependent on the normalized fuzzy weighting functions with arbitrary degree, is developed, and the algebraic properties of the normalized fuzzy weighting functions are collected into a set of augmented matrices. Consequently, more information about the normalized fuzzy weighting functions is involved and the relaxation quality of the stability analysis is significantly improved. Moreover, the obtained result is formulated in the form of linear matrix inequalities, which can be easily solved via standard numerical software. Finally, a numerical example is provided to demonstrate the effectiveness of the proposed result.
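
    Checking a stability certificate of this kind with standard numerical software amounts to an LMI feasibility problem. Below is a minimal discrete-time Lyapunov LMI in Python with cvxpy; the system matrix is an arbitrary example, not the paper's 2D fuzzy setting.

    ```python
    import cvxpy as cp
    import numpy as np

    A = np.array([[0.6, 0.2],
                  [0.1, 0.7]])       # stable example system, spectral radius < 1
    n = A.shape[0]

    P = cp.Variable((n, n), symmetric=True)
    eps = 1e-6
    constraints = [P >> eps * np.eye(n),
                   A.T @ P @ A - P << -eps * np.eye(n)]   # Lyapunov decrease

    problem = cp.Problem(cp.Minimize(0), constraints)
    problem.solve(solver=cp.SCS)
    print(problem.status)            # "optimal" certifies feasibility, hence stability
    ```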

  13. Modeling and simulation of different and representative engineering problems using Network Simulation Method.

    PubMed

    Sánchez-Pérez, J F; Marín, F; Morales, J L; Cánovas, M; Alhama, F

    2018-01-01

    Mathematical models simulating different and representative engineering problems (atomic dry friction, moving-front problems, and elastic and solid mechanics) are presented in the form of sets of non-linear, coupled or uncoupled differential equations. For different values of the parameters that influence the solution, each problem is numerically solved by the network method, which provides all the variables of the problem. Although the models are extremely sensitive to these parameters, no assumptions are made as regards linearization of the variables. The design of the models, which are run on standard electrical circuit simulation software, is explained in detail. The network model results are compared with common numerical methods or experimental data published in the scientific literature to show the reliability of the models.

  14. An easy and effective approach to manage radiologic portable document format (PDF) files using iTunes.

    PubMed

    Qian, Li Jun; Zhou, Mi; Xu, Jian Rong

    2008-07-01

    The objective of this article is to explain an easy and effective approach for managing radiologic files in portable document format (PDF) using iTunes. PDF files are widely used as a standard file format for electronic publications as well as for medical online documents. Unfortunately, there is a lack of powerful software to manage numerous PDF documents. In this article, we explain how to use the hidden function of iTunes (Apple Computer) to manage PDF documents as easily as managing music files.

  15. Edge directed image interpolation with Bamberger pyramids

    NASA Astrophysics Data System (ADS)

    Rosiles, Jose Gerardo

    2005-08-01

    Image interpolation is a standard feature in digital image editing software, digital camera systems, and printers. Classical methods for resizing produce blurred images of unacceptable quality. Bamberger pyramids and filter banks have been successfully used for texture and image analysis; they provide excellent multiresolution and directional selectivity. In this paper we present an edge-directed image interpolation algorithm which takes advantage of simultaneous spatial-directional edge localization at the subband level. The proposed algorithm outperforms classical schemes such as bilinear and bicubic interpolation from both the visual and numerical points of view.

  16. SpectraFox: A free open-source data management and analysis tool for scanning probe microscopy and spectroscopy

    NASA Astrophysics Data System (ADS)

    Ruby, Michael

    In recent decades, scanning probe microscopy and spectroscopy have become well-established tools in nanotechnology and surface science. This opened the market for many commercial manufacturers, each with different hardware and software standards. While the wide variety of available hardware is an advantage, the diversity of software may complicate data exchange between scientists and data analysis for groups working with hardware developed by different manufacturers. Not only does the file format differ between manufacturers, but the data also often require further numerical treatment before publication. SpectraFox is an open-source and independent tool which manages, processes, and evaluates scanning probe spectroscopy and microscopy data. It aims at simplifying documentation in parallel to measurement, and it provides solid evaluation tools for large amounts of data.

  17. ATHENA, ARTEMIS, HEPHAESTUS: data analysis for X-ray absorption spectroscopy using IFEFFIT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ravel, B.; Newville, M.

    2010-07-20

    A software package for the analysis of X-ray absorption spectroscopy (XAS) data is presented. This package is based on the IFEFFIT library of numerical and XAS algorithms and is written in the Perl programming language using the Perl/Tk graphics toolkit. The programs described here are: (i) ATHENA, a program for XAS data processing, (ii) ARTEMIS, a program for EXAFS data analysis using theoretical standards from FEFF, and (iii) HEPHAESTUS, a collection of beamline utilities based on tables of atomic absorption data. These programs enable high-quality data analysis that is accessible to novices while still powerful enough to meet the demands of an expert practitioner. The programs run on all major computer platforms and are freely available under the terms of a free software license.

  18. ACQ4: an open-source software platform for data acquisition and analysis in neurophysiology research

    PubMed Central

    Campagnola, Luke; Kratz, Megan B.; Manis, Paul B.

    2014-01-01

    The complexity of modern neurophysiology experiments requires specialized software to coordinate multiple acquisition devices and analyze the collected data. We have developed ACQ4, an open-source software platform for performing data acquisition and analysis in experimental neurophysiology. This software integrates the tasks of acquiring, managing, and analyzing experimental data. ACQ4 has been used primarily for standard patch-clamp electrophysiology, laser scanning photostimulation, multiphoton microscopy, intrinsic imaging, and calcium imaging. The system is highly modular, which facilitates the addition of new devices and functionality. The modules included with ACQ4 provide for rapid construction of acquisition protocols, live video display, and customizable analysis tools. Position-aware data collection allows automated construction of image mosaics and registration of images with 3-dimensional anatomical atlases. ACQ4 uses free and open-source tools including Python, NumPy/SciPy for numerical computation, PyQt for the user interface, and PyQtGraph for scientific graphics. Supported hardware includes cameras, patch clamp amplifiers, scanning mirrors, lasers, shutters, Pockels cells, motorized stages, and more. ACQ4 is available for download at http://www.acq4.org. PMID:24523692

  19. Software assurance standard

    NASA Technical Reports Server (NTRS)

    1992-01-01

    This standard specifies the software assurance program for the provider of software. It also delineates the assurance activities for the provider and the assurance data that are to be furnished by the provider to the acquirer. In any software development effort, the provider is the entity or individual that actually designs, develops, and implements the software product, while the acquirer is the entity or individual who specifies the requirements and accepts the resulting products. This standard specifies at a high level an overall software assurance program for software developed for and by NASA. Assurance includes the disciplines of quality assurance, quality engineering, verification and validation, nonconformance reporting and corrective action, safety assurance, and security assurance. The application of these disciplines during a software development life cycle is called software assurance. Subsequent lower-level standards will specify the specific processes within these disciplines.

  20. Software technology insertion: A study of success factors

    NASA Technical Reports Server (NTRS)

    Lydon, Tom

    1990-01-01

    Managing software development in large organizations has become increasingly difficult due to increasing technical complexity, stricter government standards, a shortage of experienced software engineers, competitive pressure for improved productivity and quality, the need to co-develop hardware and software together, and the rapid changes in both hardware and software technology. The 'software factory' approach to software development minimizes risks while maximizing productivity and quality through standardization, automation, and training. However, in practice, this approach is relatively inflexible when adopting new software technologies. The methods that a large multi-project software engineering organization can use to increase the likelihood of successful software technology insertion (STI), especially in a standardized engineering environment, are described.

  1. Development and prospective evaluation of an automated software system for quality control of quantitative 99mTc-MAG3 renal studies.

    PubMed

    Folks, Russell D; Garcia, Ernest V; Taylor, Andrew T

    2007-03-01

    Quantitative nuclear renography has numerous potential sources of error. We previously reported the initial development of a computer software module for comprehensively addressing the issue of quality control (QC) in the analysis of radionuclide renal images. The objective of this study was to prospectively test the QC software. The QC software works in conjunction with standard quantitative renal image analysis using a renal quantification program. The software saves a text file that summarizes QC findings as possible errors in user-entered values, calculated values that may be unreliable because of the patient's clinical condition, and problems relating to acquisition or processing. To test the QC software, a technologist not involved in software development processed 83 consecutive nontransplant clinical studies. The QC findings of the software were then tabulated. QC events were defined as technical (study descriptors that were out of range or were entered and then changed, unusually sized or positioned regions of interest, or missing frames in the dynamic image set) or clinical (calculated functional values judged to be erroneous or unreliable). Technical QC events were identified in 36 (43%) of 83 studies. Clinical QC events were identified in 37 (45%) of 83 studies. Specific QC events included starting the camera after the bolus had reached the kidney, dose infiltration, oversubtraction of background activity, and missing frames in the dynamic image set. QC software has been developed to automatically verify user input, monitor calculation of renal functional parameters, summarize QC findings, and flag potentially unreliable values for the nuclear medicine physician. Incorporation of automated QC features into commercial or local renal software can reduce errors and improve technologist performance and should improve the efficiency and accuracy of image interpretation.
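
    The validation pattern described (check user-entered values against expected ranges and append findings to a text summary) reduces to something like the following sketch. All field names and limits are hypothetical, not those of the actual renal QC module.

    ```python
    # Hypothetical QC ranges for user-entered study descriptors.
    QC_RANGES = {
        "injected_dose_MBq": (100.0, 400.0),
        "patient_height_cm": (50.0, 220.0),
        "frame_count": (120, 180),
    }

    def run_qc(study):
        """Return a list of QC findings for one renal study."""
        findings = []
        for field, (lo, hi) in QC_RANGES.items():
            value = study.get(field)
            if value is None:
                findings.append(f"{field}: missing value")
            elif not lo <= value <= hi:
                findings.append(f"{field}: {value} outside expected range [{lo}, {hi}]")
        return findings

    study = {"injected_dose_MBq": 950.0, "patient_height_cm": 172.0, "frame_count": 96}
    for line in run_qc(study):
        print(line)   # findings would be appended to the QC summary text file
    ```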

  2. Software Piracy, Ethics, and the Academician.

    ERIC Educational Resources Information Center

    Bassler, Richard A.

    The numerous software programs available for easy, low-cost copying raise ethical questions. The problem can be examined from the viewpoints of software users, teachers, authors, vendors, and distributors. Software users might hesitate to purchase or use software which prevents the making of back-up copies for program protection. Teachers in…

  3. A nonlinear dynamic finite element approach for simulating muscular hydrostats.

    PubMed

    Vavourakis, V; Kazakidi, A; Tsakiris, D P; Ekaterinaris, J A

    2014-01-01

    An implicit nonlinear finite element model for simulating biological muscle mechanics is developed. The numerical method is suitable for dynamic simulations of three-dimensional, nonlinear, nearly incompressible, hyperelastic materials that undergo large deformations. These features characterise biological muscles, which consist of fibres and connective tissues. It can be assumed that the stress distribution inside the muscles is the superposition of stresses along the fibres and the connective tissues. The mechanical behaviour of the surrounding tissues is determined by adopting a Mooney-Rivlin constitutive model, while the mechanical description of fibres is considered to be the sum of active and passive stresses. Due to the nonlinear nature of the problem, evaluation of the Jacobian matrix is carried out in order to subsequently utilise the standard Newton-Raphson iterative procedure and to carry out time integration with an implicit scheme. The proposed methodology is implemented into our in-house, open source, finite element software, which is validated by comparing numerical results with experimental measurements and other numerical results. Finally, the numerical procedure is utilised to simulate primitive octopus arm manoeuvres, such as bending and reaching.
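
    The solution strategy summarized above (assemble a residual, evaluate its Jacobian, iterate) reduces to a generic Newton-Raphson kernel like the sketch below, where a toy two-unknown system stands in for the assembled finite element equations.

    ```python
    import numpy as np

    def newton_raphson(residual, jacobian, u0, tol=1e-10, max_iter=25):
        """Solve R(u) = 0 for the nodal unknowns u of a nonlinear system."""
        u = np.asarray(u0, dtype=float).copy()
        for _ in range(max_iter):
            r = residual(u)
            if np.linalg.norm(r) < tol:
                return u
            du = np.linalg.solve(jacobian(u), -r)   # linearised correction
            u += du
        raise RuntimeError("Newton-Raphson did not converge")

    # Tiny two-unknown demonstration problem with solution u = (1, 0).
    R = lambda u: np.array([u[0]**3 + u[1] - 1.0, u[1]**3 - u[0] + 1.0])
    J = lambda u: np.array([[3 * u[0]**2, 1.0], [-1.0, 3 * u[1]**2]])
    print(newton_raphson(R, J, [0.5, 0.5]))
    ```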

  4. Modeling the Zeeman effect in high altitude SSMIS channels for numerical weather prediction profiles: comparing a fast model and a line-by-line model

    NASA Astrophysics Data System (ADS)

    Larsson, R.; Milz, M.; Rayer, P.; Saunders, R.; Bell, W.; Booton, A.; Buehler, S. A.; Eriksson, P.; John, V.

    2015-10-01

    We present a comparison of a reference and a fast radiative transfer model using numerical weather prediction profiles for the Zeeman-affected high-altitude Special Sensor Microwave Imager/Sounder channels 19-22. We find that the models agree well for channels 21 and 22 compared to the channels' system noise temperatures (1.9 and 1.3 K, respectively) and the expected profile errors at the affected altitudes (estimated to be around 5 K). For channel 22 there is a 0.5 K average difference between the models, with a standard deviation of 0.24 K for the full set of atmospheric profiles. For the same channel, there is a 1.2 K average difference between the fast model and the sensor measurement, with a 1.4 K standard deviation. For channel 21 there is a 0.9 K average difference between the models, with a standard deviation of 0.56 K. For the same channel, there is a 1.3 K average difference between the fast model and the sensor measurement, with a 2.4 K standard deviation. We consider the relatively small model differences a validation of the fast Zeeman effect scheme for these channels. Both channels 19 and 20 have smaller average differences between the models (below 0.2 K) and smaller standard deviations (below 0.4 K) when both models use a two-dimensional magnetic field profile. However, when the reference model is switched to using a full three-dimensional magnetic field profile, the standard deviation to the fast model increases to almost 2 K due to viewing geometry dependencies causing up to ±7 K differences near the equator. The average differences between the two models remain small despite changing magnetic field configurations. We are unable to compare channels 19 and 20 to sensor measurements due to the limited altitude range of the numerical weather prediction profiles. We recommend that numerical weather prediction software using the fast model take the available fast Zeeman scheme into account for data assimilation of the affected sensor channels to better constrain the upper atmospheric temperatures.

  5. Modeling the Zeeman effect in high-altitude SSMIS channels for numerical weather prediction profiles: comparing a fast model and a line-by-line model

    NASA Astrophysics Data System (ADS)

    Larsson, Richard; Milz, Mathias; Rayer, Peter; Saunders, Roger; Bell, William; Booton, Anna; Buehler, Stefan A.; Eriksson, Patrick; John, Viju O.

    2016-03-01

    We present a comparison of a reference and a fast radiative transfer model using numerical weather prediction profiles for the Zeeman-affected high-altitude Special Sensor Microwave Imager/Sounder channels 19-22. We find that the models agree well for channels 21 and 22 compared to the channels' system noise temperatures (1.9 and 1.3 K, respectively) and the expected profile errors at the affected altitudes (estimated to be around 5 K). For channel 22 there is a 0.5 K average difference between the models, with a standard deviation of 0.24 K for the full set of atmospheric profiles. Concerning the same channel, there is 1.2 K on average between the fast model and the sensor measurement, with 1.4 K standard deviation. For channel 21 there is a 0.9 K average difference between the models, with a standard deviation of 0.56 K. Regarding the same channel, there is 1.3 K on average between the fast model and the sensor measurement, with 2.4 K standard deviation. We consider the relatively small model differences as a validation of the fast Zeeman effect scheme for these channels. Both channels 19 and 20 have smaller average differences between the models (at below 0.2 K) and smaller standard deviations (at below 0.4 K) when both models use a two-dimensional magnetic field profile. However, when the reference model is switched to using a full three-dimensional magnetic field profile, the standard deviation to the fast model is increased to almost 2 K due to viewing geometry dependencies, causing up to ±7 K differences near the equator. The average differences between the two models remain small despite changing magnetic field configurations. We are unable to compare channels 19 and 20 to sensor measurements due to the limited altitude range of the numerical weather prediction profiles. We recommend that numerical weather prediction software using the fast model take the available fast Zeeman scheme into account for data assimilation of the affected sensor channels to better constrain the upper atmospheric temperatures.

  6. DEVELOPMENT OF A PORTABLE SOFTWARE LANGUAGE FOR PHYSIOLOGICALLY-BASED PHARMACOKINETIC (PBPK) MODELS

    EPA Science Inventory

    The PBPK modeling community has had a long-standing problem with modeling software compatibility. The numerous software packages used for PBPK models are, at best, minimally compatible. This creates problems ranging from model obsolescence due to software support discontinuation...

  7. Standardized development of computer software. Part 2: Standards

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1978-01-01

    This monograph contains standards for software development and engineering. The book sets forth rules for design, specification, coding, testing, documentation, and quality assurance audits of software; it also contains detailed outlines for the documentation to be produced.

  8. Meshless collocation methods for the numerical solution of elliptic boundary value problems and the rotational shallow water equations on the sphere

    NASA Astrophysics Data System (ADS)

    Blakely, Christopher D.

    This dissertation has three main goals: (1) to explore the anatomy of the meshless collocation approximation methods that have recently gained attention in the numerical analysis community; (2) to demonstrate numerically why the meshless collocation method should become an attractive alternative to standard finite-element methods, owing to the simplicity of its implementation and its high-order convergence properties; and (3) to propose a meshless collocation method for large-scale computational geophysical fluid dynamics models. We provide numerical verification and validation of the meshless collocation scheme applied to the rotational shallow-water equations on the sphere and demonstrate computationally that the proposed model can compete with existing high-performance methods for approximating the shallow-water equations, such as the SEAM (spectral-element atmospheric model) developed at NCAR. A detailed analysis of the parallel implementation of the model is given, along with parallel algorithmic routines for high-performance simulation of the model. We analyze the programming and computational aspects of the model using Fortran 90 and the Message Passing Interface (MPI) library, along with software and hardware specifications and performance tests. Details of many aspects of the implementation regarding performance, optimization, and stabilization are given. In order to verify the mathematical correctness of the algorithms presented and to validate the performance of the meshless collocation shallow-water model, the thesis concludes with numerical experiments on standardized test cases for the shallow-water equations on the sphere using the proposed method.
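
    The flavour of meshless collocation studied in the thesis can be shown on a one-dimensional Poisson problem: expand the solution in radial basis functions centred at scattered nodes, collocate the PDE at interior nodes, and impose the boundary conditions at the ends. A sketch with a multiquadric basis and a hand-picked shape parameter:

    ```python
    import numpy as np

    # Solve u'' = f on (0,1), u(0) = u(1) = 0, exact solution u = sin(pi x).
    rng = np.random.default_rng(3)
    x = np.sort(np.concatenate(([0.0, 1.0], rng.uniform(0, 1, 18))))
    c = 0.2                                          # shape parameter

    phi   = lambda r: np.sqrt(r**2 + c**2)           # multiquadric basis
    d2phi = lambda r: c**2 / (r**2 + c**2) ** 1.5    # its second derivative

    R = x[:, None] - x[None, :]                      # pairwise offsets
    M = d2phi(R)                                     # PDE rows (interior nodes)
    M[0, :], M[-1, :] = phi(R[0, :]), phi(R[-1, :])  # boundary rows

    f = -np.pi**2 * np.sin(np.pi * x)                # right-hand side
    f[0] = f[-1] = 0.0                               # boundary values

    coeffs = np.linalg.solve(M, f)
    u = phi(R) @ coeffs
    print(np.max(np.abs(u - np.sin(np.pi * x))))     # small collocation error
    ```

    No mesh connectivity is ever built: the nodes are scattered, which is what makes the approach attractive on the sphere.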

  9. Three Years of Global Positioning System Experience on International Space Station

    NASA Technical Reports Server (NTRS)

    Gomez, Susan

    2005-01-01

    The International Space Station global positioning system (GPS) receiver was activated in April 2002. Since that time, numerous software anomalies surfaced that had to be worked around. Some of the software problems required waivers, such as the time function, while others required extensive operator intervention, such as numerous power cycles. Eventually, enough anomalies surfaced that the three pieces of code included in the GPS unit were re-written and the GPS units were upgraded. The technical aspects of the problems are discussed, as well as the underlying causes that led to the delivery of a product that has had numerous problems. The technical aspects of the problems included physical phenomena that were not well understood, such as the effect that the ionosphere would have on the GPS measurements. The underlying causes were traced to inappropriate use of legacy software, changing requirements, inadequate software processes, unrealistic schedules, incorrect contract type, and unclear ownership responsibilities.

  10. Computer-assisted qualitative data analysis software.

    PubMed

    Cope, Diane G

    2014-05-01

    Advances in technology have provided new approaches for data collection methods and analysis for researchers. Data collection is no longer limited to paper-and-pencil format, and numerous methods are now available through Internet and electronic resources. With these techniques, researchers are not burdened with entering data manually and data analysis is facilitated by software programs. Quantitative research is supported by the use of computer software and provides ease in the management of large data sets and rapid analysis of numeric statistical methods. New technologies are emerging to support qualitative research with the availability of computer-assisted qualitative data analysis software (CAQDAS). CAQDAS will be presented with a discussion of advantages, limitations, controversial issues, and recommendations for this type of software use.

  11. Validation of Calculations in a Digital Thermometer Firmware

    NASA Astrophysics Data System (ADS)

    Batagelj, V.; Miklavec, A.; Bojkovski, J.

    2014-04-01

    State-of-the-art digital thermometers are remarkable measurement instruments, measuring outputs from resistance thermometers and/or thermocouples. Not only can they readily achieve measuring accuracies in the parts-per-million range, but they also incorporate sophisticated algorithms for converting the measured resistance or voltage to temperature. These algorithms often include high-order polynomials, exponentials, and logarithms, and must be evaluated using both standard coefficients and individual calibration coefficients. The numerical accuracy of these calculations and the associated uncertainty component must be much better than the accuracy of the raw measurement in order to be negligible in the total measurement uncertainty. In order for the end-user to gain confidence in these calculations, as well as to conform to the formal requirements of ISO/IEC 17025 and other standards, a way of validating the numerical procedures performed in the firmware of the instrument is required. A software architecture which allows simple validation of internal measuring instrument calculations is suggested: the digital thermometer should expose all its internal calculation functions to the communication interface, so the end-user can compare the results of the internal calculations with reference results. The method can be regarded as a variation of black-box software validation. Validation results on a thermometer prototype with this validation ability show that the calculation error of basic arithmetic operations is within the expected rounding error. For conversion functions, the calculation error is at least ten times smaller than the thermometer's effective resolution for the particular probe type.
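
    The proposed black-box validation amounts to feeding the same raw readings to the instrument's exposed calculation functions and to an independent reference implementation, then comparing the results. The sketch below uses the standard Callendar-Van Dusen Pt100 conversion as the reference; the instrument-side call is a placeholder.

    ```python
    import math

    # Reference implementation: Callendar-Van Dusen conversion (T >= 0 C).
    R0, A, B = 100.0, 3.9083e-3, -5.775e-7    # standard Pt100 coefficients

    def reference_r_to_t(resistance_ohm):
        """Invert R = R0*(1 + A*T + B*T^2) for T >= 0 C."""
        return (-A + math.sqrt(A**2 - 4 * B * (1 - resistance_ohm / R0))) / (2 * B)

    def instrument_r_to_t(resistance_ohm):
        """Placeholder for the value returned over the instrument's interface."""
        return reference_r_to_t(resistance_ohm) + 1e-5   # pretend firmware result

    # Validate over the working range; tolerance far below sensor resolution.
    for r in (100.0, 119.40, 138.51, 175.86):
        t_ref, t_dut = reference_r_to_t(r), instrument_r_to_t(r)
        assert abs(t_ref - t_dut) < 1e-3, f"calculation mismatch at {r} ohm"
    print("firmware conversion validated")
    ```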

  12. [Development of a software standardizing optical density with operation settings related to several limitations].

    PubMed

    Tu, Xiao-Ming; Zhang, Zuo-Heng; Wan, Cheng; Zheng, Yu; Xu, Jin-Mei; Zhang, Yuan-Yuan; Luo, Jian-Ping; Wu, Hai-Wei

    2012-12-01

    To develop software for standardizing optical density that normalizes the procedures and results of standardization, in order to effectively solve several problems generated during the standardization of indirect ELISA results. The software was designed based on the I-STOD method, with operation settings to solve the problems that one might encounter during standardization. MATLAB GUI was used as the development tool. The software was tested with the results of the detection of sera of persons from schistosomiasis japonica endemic areas. I-STOD V1.0 (Windows XP/Win 7, 0.5 GB) was successfully developed to standardize optical density. A series of serum samples from schistosomiasis japonica endemic areas was used to examine the operational effects of the I-STOD V1.0 software. The results indicated that the software successfully overcame several problems, including the reliability of the standard curve, the applicable scope of samples, and the determination of dilution for samples outside that scope, so that I-STOD was performed more conveniently and the results of standardization were more consistent. I-STOD V1.0 is professional software based on the I-STOD method. It can be easily operated and can effectively standardize the test results of indirect ELISA.
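
    While I-STOD itself is specific to the paper, the heart of any optical density standardization is a fitted standard curve and its inversion. A generic four-parameter logistic sketch, with invented calibrator values:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def four_pl(x, a, b, c, d):
        """Four-parameter logistic: OD as a function of analyte level."""
        return d + (a - d) / (1.0 + (x / c) ** b)

    # Calibrator levels and their measured optical densities (illustrative).
    level = np.array([1, 2, 4, 8, 16, 32, 64], dtype=float)
    od    = np.array([0.08, 0.15, 0.30, 0.62, 1.10, 1.58, 1.85])

    params, _ = curve_fit(four_pl, level, od, p0=[0.05, 2.0, 10.0, 2.0], maxfev=10000)

    def od_to_level(y, a, b, c, d):
        """Invert the fitted curve to standardize a sample's OD."""
        return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

    print(od_to_level(0.62, *params))   # recovers a level near 8
    ```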

  13. Comparison of Numerical Analyses with a Static Load Test of a Continuous Flight Auger Pile

    NASA Astrophysics Data System (ADS)

    Hoľko, Michal; Stacho, Jakub

    2014-12-01

    The article deals with numerical analyses of a Continuous Flight Auger (CFA) pile. The analyses include a comparison of calculated and measured load-settlement curves as well as a comparison of the load distribution over the pile's length. The numerical analyses were executed using two types of software, Ansys and Plaxis, both based on FEM calculations. The two programs differ in the way they create numerical models, model the interface between the pile and the soil, and in the constitutive material models they use. The analyses were prepared in the form of a parametric study in which the method of modelling the interface and the material models of the soil are compared and analysed. Our analyses show that both programs permit the modelling of pile foundations. Plaxis offers advanced material models as well as modelling of the effects of groundwater and overconsolidation. The load-settlement curve calculated using Plaxis matches the results of a static load test with more than 95% accuracy. In comparison, the load-settlement curve calculated using Ansys provides only an approximate estimate, but that software allows large structural systems to be modelled together with the foundation system.

  14. JPL Space Telecommunications Radio System Operating Environment

    NASA Technical Reports Server (NTRS)

    Lux, James P.; Lang, Minh; Peters, Kenneth J.; Taylor, Gregory H.; Duncan, Courtney B.; Orozco, David S.; Stern, Ryan A.; Ahten, Earl R.; Girard, Mike

    2013-01-01

    A flight-qualified implementation of a Software Defined Radio (SDR) Operating Environment for the JPL-SDR built for the CoNNeCT Project has been developed. It is compliant with the NASA Space Telecommunications Radio System (STRS) Architecture Standard and provides the software infrastructure for STRS-compliant waveform applications. This software provides a standards-compliant, abstracted view of the JPL-SDR hardware platform. It uses industry-standard POSIX interfaces for most functions, as well as exposing the STRS API (Application Programming Interface) required by the standard. This software includes a standardized interface for IP components instantiated within a Xilinx FPGA (Field Programmable Gate Array). The software provides a standardized, abstracted interface to platform resources such as data converters, the file system, etc., which can be used by STRS standards-conformant waveform applications. It provides a generic SDR operating environment with a much smaller resource footprint than similar products such as SCA (Software Communications Architecture) compliant implementations or the DoD Joint Tactical Radio System (JTRS).

  15. Developing Teaching Material Software Assisted for Numerical Methods

    NASA Astrophysics Data System (ADS)

    Handayani, A. D.; Herman, T.; Fatimah, S.

    2017-09-01

    The NCTM vision highlights two imperatives for school mathematics: knowing the mathematics of the 21st century, and continuing to improve mathematics education to answer the challenges of a changing world. One of the competencies associated with the great challenges of the 21st century is the use of aids and tools (including IT), such as knowing the various tools available for mathematical activity. One significant challenge in mathematics learning is how to teach students abstract concepts. Here, technology in the form of mathematics learning software can be used more widely to ground abstract concepts in mathematics. In mathematics learning, the use of mathematical software can make high-level mathematical activity more accessible to students. Technology can strengthen student learning by delivering numerical, graphic, and symbolic content without the time spent manually calculating complex computational problems. The purpose of this research is to design and develop software-assisted teaching materials for numerical methods. The process of developing the teaching materials starts from the defining step; the learning materials are designed based on information obtained from the preliminary analysis of learners, materials, and supporting tasks; then comes the design step; and the last step is development. The developed software-assisted teaching material for numerical methods is valid in content, and the validators' assessment of the material is good, so it can be used with little revision.

  16. Development of a Multi-Channel, High Frequency QRS Electrocardiograph

    NASA Technical Reports Server (NTRS)

    DePalma, Jude L.

    2003-01-01

    With the advent of the ISS era and the potential requirement for increased cardiovascular monitoring of crewmembers during extended EVAs, NASA flight surgeons would stand to benefit from an evolving technology that allows for a more rapid diagnosis of myocardial ischemia compared to standard electrocardiography. Similarly, during the astronaut selection process, NASA flight surgeons and other physicians would also stand to benefit from a completely noninvasive technology that, either at rest or during maximal exercise tests, is more sensitive than standard ECG in identifying the presence of ischemia. Perhaps most importantly, practicing cardiologists and emergency medicine physicians could greatly benefit from such a device, as it could augment (or even replace) standard electrocardiography in settings where the rapid diagnosis of myocardial ischemia (or the lack thereof) is required for proper clinical decision-making. A multi-channel, high-frequency QRS electrocardiograph is currently under development in the Life Sciences Research Laboratories at JSC. Specifically, the project consists of writing software code, some of which contains specially designed digital filters, to be incorporated into an existing commercial software program that is already designed to collect, plot, and analyze conventional 12-lead ECG signals on a desktop, portable, or palm PC. The software will derive the high-frequency QRS signals, which will be analyzed (in numerous ways) and plotted alongside the conventional ECG signals, giving the PC-viewing clinician advanced diagnostic information that has never been available previously in all 12 ECG leads simultaneously. After the hardware and software for the advanced digital ECG monitor have been fully integrated, plans are to use the monitor to begin clinical studies both on healthy subjects and on patients with known coronary artery disease in both the outpatient and hospital settings. The ultimate goal is to get the technology out into the clinical world, where it has the potential to save lives.
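
    High-frequency QRS analysis conventionally isolates an upper frequency band of the ECG, often around 150 to 250 Hz. A sketch of such a digital filter with SciPy follows; the sampling rate, band edges, and filter order are assumptions, not the project's actual design.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt

    fs = 1000.0                           # sampling rate [Hz], assumed
    # 4th-order Butterworth bandpass for the high-frequency QRS band.
    b, a = butter(4, [150.0, 250.0], btype="bandpass", fs=fs)

    t = np.arange(0, 2.0, 1.0 / fs)
    ecg = np.sin(2 * np.pi * 10 * t) + 0.05 * np.sin(2 * np.pi * 200 * t)

    hf_qrs = filtfilt(b, a, ecg)          # zero-phase filtering keeps QRS timing
    print(np.abs(hf_qrs).max())           # ~0.05: only the 200 Hz content remains
    ```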

  17. The Elements of an Effective Software Development Plan - Software Development Process Guidebook

    DTIC Science & Technology

    2011-11-11

    Standards and practices required for all XMPL software development. This SDP implements the <corporate> Standard Software Process (SSP), as tailored... Developing and integrating reusable software products • Approach to managing COTS/Reuse software implementation • COTS/Reuse software selection... final selection and submit to change board for approval. MAINTENANCE: Monitor current products for obsolescence or end of support; track new

  18. Auto-Generated Semantic Processing Services

    NASA Technical Reports Server (NTRS)

    Davis, Rodney; Hupf, Greg

    2009-01-01

    Auto-Generated Semantic Processing (AGSP) Services is a suite of software tools for automated generation of other computer programs, denoted cross-platform semantic adapters, that support interoperability of computer-based communication systems that utilize a variety of both new and legacy communication software running in a variety of operating-system/computer-hardware combinations. AGSP has numerous potential uses in military, space-exploration, and other government applications as well as in commercial telecommunications. The cross-platform semantic adapters take advantage of common features of computer-based communication systems to enforce semantics, messaging protocols, and standards of processing of streams of binary data to ensure integrity of data and consistency of meaning among interoperating systems. The auto-generation aspect of AGSP Services reduces development time and effort by emphasizing specification and minimizing implementation: In effect, the design, building, and debugging of software for effecting conversions among complex communication protocols, custom device mappings, and unique data-manipulation algorithms is replaced with metadata specifications that map to an abstract platform-independent communications model. AGSP Services is modular and has been shown to be easily integrable into new and legacy NASA flight and ground communication systems.

  19. Numerical Analyses of Subsoil-structure Interaction in Original Non-commercial Software based on FEM

    NASA Astrophysics Data System (ADS)

    Cajka, R.; Vaskova, J.; Vasek, J.

    2018-04-01

    For decades, attention has been paid to the interaction of foundation structures with subsoil and to the development of interaction models. Because analytical solutions of subsoil-structure interaction can be deduced only for simple load shapes, they are increasingly being replaced by numerical solutions (e.g., FEM, the finite element method). Numerical analysis provides greater possibilities for taking into account the real factors involved in subsoil-structure interaction and was also used in this article; it makes it possible to design foundation structures more efficiently while remaining reliable and secure. Several software packages can currently deal with the interaction of foundations and subsoil. It has been demonstrated that the non-commercial software MKPINTER (created by Cajka) provides results close to actually measured values. In MKPINTER, stress-strain analysis of the elastic half-space is performed by means of Gauss numerical integration and the Jacobian of the transformation. Input data for the numerical analysis were obtained from an experimental loading test of a concrete slab. The loading was performed using unique experimental equipment constructed at the Faculty of Civil Engineering, VŠB-TU Ostrava. The purpose of this paper is to compare the resulting deformation of the slab with values observed during the experimental loading test.
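
    The kind of computation MKPINTER performs, stress in an elastic half-space under a loaded area evaluated by Gauss quadrature, can be sketched with the classical Boussinesq point-load kernel. Footing dimensions and contact pressure are illustrative.

    ```python
    import numpy as np

    def sigma_z_under_rectangle(q, a, b, z, n=8):
        """Vertical stress at depth z under the centre of an a x b rectangle
        carrying uniform pressure q, by Gauss-Legendre quadrature of the
        Boussinesq point-load kernel sigma_z = 3*P*z^3 / (2*pi*R^5)."""
        xi, wi = np.polynomial.legendre.leggauss(n)
        # Map Gauss points from [-1, 1] to the rectangle's half-extents.
        x, wx = 0.5 * a * xi, 0.5 * a * wi
        y, wy = 0.5 * b * xi, 0.5 * b * wi
        X, Y = np.meshgrid(x, y)
        W = np.outer(wy, wx)
        R5 = (X**2 + Y**2 + z**2) ** 2.5
        kernel = 3.0 * z**3 / (2.0 * np.pi * R5)   # Boussinesq influence
        return q * np.sum(W * kernel)

    # 2 m x 2 m footing, 100 kPa contact pressure, stress 1 m below the centre.
    print(sigma_z_under_rectangle(100.0, 2.0, 2.0, 1.0))   # ~70 kPa
    ```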

  20. Shear Resistance between Concrete-Concrete Surfaces

    NASA Astrophysics Data System (ADS)

    Kovačovic, Marek

    2013-12-01

    The application of precast beams and cast-in-situ structural members cast at different times has been typical of bridges and buildings for many years. A load-bearing frame consists of a set of prestressed precast beams supported by columns and diaphragms, joined with an additionally cast slab deck. This article focuses on the theoretical and experimental analysis of the shear resistance at such an interface. The first part of the paper deals with the state-of-the-art knowledge of the composite behaviour of concrete-concrete structures and a comparison of the numerical methods introduced in the relevant standards. In the experimental part, a set of specimens with different interface treatments was tested until failure in order to predict the composite behaviour of coupled beams. The experimental results were compared to a numerical analysis performed by means of FEM-based nonlinear software.

  1. Distributed parameter statics of magnetic catheters.

    PubMed

    Tunay, Ilker

    2011-01-01

    We discuss how to use special Cosserat rod theory for deriving distributed-parameter static equilibrium equations of magnetic catheters. These medical devices are used for minimally invasive diagnostic and therapeutic procedures and can be operated remotely or controlled by automated algorithms. The magnetic material can be lumped in rigid segments or distributed in flexible segments. The position vector of the cross-section centroid and a quaternion representation of an orthonormal triad are selected as degrees of freedom. The strain energy for transversely isotropic, hyperelastic rods is augmented with the mechanical potential energy of the magnetic field and a penalty term to enforce the quaternion unity constraint. The numerical solution is found by 1D finite elements. Material properties of polymer tubes in extension, bending, and twist are determined by mechanical and magnetic experiments. Comparisons with commercial FEM software indicate that the computational effort with the proposed method is at least one order of magnitude less than standard 3D FEM.
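
    The quaternion-unity penalty mentioned in the abstract adds a term of the form k(|q|^2 - 1)^2 to the energy functional. A minimal sketch of that term and its gradient, with an illustrative penalty weight:

    ```python
    import numpy as np

    def unity_penalty(q, k=1.0e4):
        """Penalty energy enforcing |q| = 1 and its gradient w.r.t. q."""
        violation = q @ q - 1.0
        energy = k * violation**2
        gradient = 4.0 * k * violation * q   # d/dq of k*(q.q - 1)^2
        return energy, gradient

    q = np.array([0.9, 0.1, 0.2, 0.4])       # slightly non-unit quaternion
    print(unity_penalty(q))
    ```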

  2. An expert system for prediction of chemical toxicity

    USGS Publications Warehouse

    Hickey, James P.; Aldridge, Andrew J.; Passino-Reader, Dora R.; Frank, Anthony M.

    1992-01-01

    The National Fisheries Research Center-Great Lakes has developed an interactive computer program that uses the structure of an organic molecule to predict its acute toxicity to four aquatic species. The expert system software, written in the muLISP language, identifies the skeletal structures and substituent groups of an organic molecule from a user-supplied standard chemical notation known as a SMILES string, and then generates values for four solvatochromic parameters. Multiple regression equations relate these parameters to the toxicities (expressed as log10 LC50s and log10 EC50s, along with 95% confidence intervals) for the four species. The system is demonstrated by predicting the toxicity of anilide-type pesticides to the fathead minnow (Pimephales promelas). This software is designed for use on an IBM-compatible personal computer by personnel with minimal toxicology background for rapid estimation of chemical toxicity. The system has numerous applications, with much potential for use in the pharmaceutical industry.
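
    The prediction step is a multiple linear regression on the four solvatochromic parameters (a linear solvation energy relationship). The sketch below uses invented coefficients; the published regressions differ by species and endpoint.

    ```python
    import numpy as np

    # Hypothetical LSER regression for one species:
    #   log10(LC50) = c0 + c1*V + c2*pi_star + c3*beta + c4*alpha
    coef = np.array([0.6, -3.2, -0.8, 1.9, -0.5])     # illustrative values only

    def predict_log_lc50(V, pi_star, beta, alpha):
        """Predict acute toxicity from the four solvatochromic parameters."""
        x = np.array([1.0, V, pi_star, beta, alpha])  # leading 1 for the intercept
        return float(coef @ x)

    # Solvatochromic descriptors for a hypothetical anilide-type pesticide.
    print(predict_log_lc50(V=0.82, pi_star=0.73, beta=0.50, alpha=0.26))
    ```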

  3. Space Telecommunications Radio System (STRS) Architecture Standard. Release 1.02.1

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.; Kacpura, Thomas J.; Handler, Louis M.; Hall, C. Steve; Mortensen, Dale J.; Johnson, Sandra K.; Briones, Janette C.; Nappier, Jennifer M.; Downey, Joseph A.; Lux, James P.

    2012-01-01

    This document contains the NASA architecture standard for software defined radios used in space- and ground-based platforms to enable commonality among radio developments to enhance capability and services while reducing mission and programmatic risk. Transceivers (or transponders) with functionality primarily defined in software (e.g., firmware) have the ability to change their functional behavior through software alone. This radio architecture standard offers value by employing common waveform software interfaces, method of instantiation, operation, and testing among different compliant hardware and software products. These common interfaces within the architecture abstract application software from the underlying hardware to enable technology insertion independently at either the software or hardware layer.

  4. Software Development Standard Processes (SDSP)

    NASA Technical Reports Server (NTRS)

    Lavin, Milton L.; Wang, James J.; Morillo, Ronald; Mayer, John T.; Jamshidian, Barzia; Shimizu, Kenneth J.; Wilkinson, Belinda M.; Hihn, Jairus M.; Borgen, Rosana B.; Meyer, Kenneth N.; hide

    2011-01-01

    A JPL-created set of standard processes is to be used throughout the lifecycle of software development. These SDSPs cover a range of activities, from management and engineering activities to assurance and support activities. These processes must be applied to software tasks per a prescribed set of procedures. JPL's Software Quality Improvement Project is currently working at the behest of the JPL Software Process Owner to ensure that all applicable software tasks follow these procedures. The SDSPs are captured as a set of 22 standards in JPL's software process domain. They were developed in-house at JPL by a number of Subject Matter Experts (SMEs) residing primarily within the Engineering and Science Directorate, but also from the Business Operations Directorate and the Safety and Mission Success Directorate. These practices include not only currently performed best practices, but also JPL-desired future practices in key thrust areas like software architecting and software reuse analysis. Additionally, these SDSPs conform to many standards and requirements to which JPL projects are beholden.

  5. Software engineering standards and practices

    NASA Technical Reports Server (NTRS)

    Durachka, R. W.

    1981-01-01

    Guidelines are presented for the preparation of a software development plan. The various phases of a software development project are discussed throughout its life cycle including a general description of the software engineering standards and practices to be followed during each phase.

  6. Demographic-Based Perceptions of Adequacy of Software Security's Presence within Individual Phases of the Software Development Life Cycle

    ERIC Educational Resources Information Center

    Kramer, Aleksey

    2013-01-01

    The topic of software security has become paramount in information technology (IT) related scholarly research. Researchers have addressed numerous software security topics touching on all phases of the Software Development Life Cycle (SDLC): requirements gathering phase, design phase, development phase, testing phase, and maintenance phase.…

  7. SNPmplexViewer--toward a cost-effective traceability system

    PubMed Central

    2011-01-01

    Background Beef traceability has become mandatory in many regions of the world and is typically achieved through the use of unique numerical codes on ear tags and animal passports. DNA-based traceability uses the animal's own DNA code to identify it and the products derived from it. Using SNaPshot, a primer-extension-based method, a multiplex of 25 SNPs in a single reaction has been used to reduce the expense of genotyping a panel of SNPs useful for identity control. Findings To further decrease SNaPshot's cost, we introduced the Perl script SNPmplexViewer, which facilitates the analysis of trace files for reactions performed without the use of fluorescent size standards. SNPmplexViewer automatically aligns reference and target trace electropherograms, run with and without fluorescent size standards, respectively. SNPmplexViewer produces a modified target trace file containing a normalised trace in which the reference size standards are embedded. SNPmplexViewer also outputs aligned images of the two electropherograms together with a difference profile. Conclusions Modified trace files generated by SNPmplexViewer enable genotyping of SNaPshot reactions performed without fluorescent size standards, using common fragment-sizing software packages. SNPmplexViewer's normalised output may also improve the genotyping software's performance. Thus, SNPmplexViewer is a general free tool enabling the reduction of SNaPshot's cost as well as the fast viewing and comparing of trace electropherograms for fragment analysis. SNPmplexViewer is available at http://cowry.agri.huji.ac.il/cgi-bin/SNPmplexViewer.cgi. PMID:21600063
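
    A minimal sketch, in Python, of aligning a target trace to a reference by cross-correlation; the synthetic signals and the alignment itself are illustrative and do not reproduce SNPmplexViewer's own algorithm.

      import numpy as np

      def align_traces(reference, target):
          # Estimate the lag that best aligns target to reference via
          # cross-correlation, then return the shifted target.
          xcorr = np.correlate(reference - reference.mean(),
                               target - target.mean(), mode="full")
          shift = int(np.argmax(xcorr)) - (len(target) - 1)
          return np.roll(target, shift), shift

      ref = np.exp(-0.5 * ((np.arange(200) - 80) / 5.0) ** 2)   # synthetic peak
      tgt = np.roll(ref, 12)                                    # delayed copy
      aligned, shift = align_traces(ref, tgt)
      print("estimated shift:", shift)   # about -12, correcting the delay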

  8. Preliminary clinical evaluation of automated analysis of the sublingual microcirculation in the assessment of patients with septic shock: Comparison of automated versus semi-automated software.

    PubMed

    Sharawy, Nivin; Mukhtar, Ahmed; Islam, Sufia; Mahrous, Reham; Mohamed, Hassan; Ali, Mohamed; Hakeem, Amr A; Hossny, Osama; Refaa, Amera; Saka, Ahmed; Cerny, Vladimir; Whynot, Sara; George, Ronald B; Lehmann, Christian

    2017-01-01

    The outcome of patients in septic shock has been shown to be related to changes within the microcirculation. Modern imaging technologies are available to generate high-resolution video recordings of the microcirculation in humans. However, evaluation of the microcirculation is not yet implemented in the routine clinical monitoring of critically ill patients. This is mainly due to the large amount of time and user interaction required by the current video analysis software. The aim of this study was to validate a newly developed automated method (CCTools®) for microcirculatory analysis of sublingual capillary perfusion in septic patients in comparison to standard semi-automated software (AVA3®). 204 videos from 47 patients were recorded using incident dark field (IDF) imaging. Total vessel density (TVD), proportion of perfused vessels (PPV), perfused vessel density (PVD), microvascular flow index (MFI) and heterogeneity index (HI) were measured using AVA3® and CCTools®. Significant differences between the numeric results obtained by the two software packages were observed, though the values for TVD, PVD and MFI were statistically related. The automated software succeeds in showing septic-shock-induced microcirculation alterations in near real time. However, we found wide limits of agreement between AVA3® and CCTools® values due to several technical factors that should be considered in future studies.

  9. Proceedings of the Workshop on software tools for distributed intelligent control systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herget, C.J.

    1990-09-01

    The Workshop on Software Tools for Distributed Intelligent Control Systems was organized by Lawrence Livermore National Laboratory for the United States Army Headquarters Training and Doctrine Command and the Defense Advanced Research Projects Agency. The goals of the workshop were to identify the current state of the art in tools which support control systems engineering design and implementation, identify research issues associated with writing software tools which would provide a design environment to assist engineers in multidisciplinary control design and implementation, formulate a potential investment strategy to resolve the research issues and develop public domain code which can form the core of more powerful engineering design tools, and recommend test cases to focus the software development process and test associated performance metrics. Recognizing that the development of software tools for distributed intelligent control systems will require a multidisciplinary effort, experts in systems engineering, control systems engineering, and computer science were invited to participate in the workshop. In particular, experts who could address the following topics were selected: operating systems, engineering data representation and manipulation, emerging standards for manufacturing data, mathematical foundations, coupling of symbolic and numerical computation, user interface, system identification, system representation at different levels of abstraction, system specification, system design, verification and validation, automatic code generation, and integration of modular, reusable code.

  10. Software Defined Radio Standard Architecture and its Application to NASA Space Missions

    NASA Technical Reports Server (NTRS)

    Andro, Monty; Reinhart, Richard C.

    2006-01-01

    A software defined radio (SDR) architecture used in space-based platforms proposes to standardize certain aspects of radio development such as interface definitions, functional control and execution, and application software and firmware development. NASA has chartered a team to develop an open software defined radio hardware and software architecture to support NASA missions and determine the viability of an Agency-wide Standard. A draft concept of the proposed standard has been released and discussed among organizations in the SDR community. Appropriate leveraging of the JTRS SCA, OMG's SWRadio Architecture and other aspects are considered. A standard radio architecture offers potential value by employing common waveform software instantiation, operation, testing and software maintenance. While software defined radios offer greater flexibility, they also pose challenges to radio development for the space environment in terms of size, mass and power consumption and available technology. An SDR architecture for space must recognize and address the constraints of space flight hardware and systems, along with flight heritage and culture. NASA is actively participating in the development of technology and standards related to software defined radios. As NASA considers a standard radio architecture for space communications, input and coordination from government agencies, industry, academia, and standards bodies is key to a successful architecture. The unique aspects of space require thorough investigation of relevant terrestrial technologies properly adapted to space. The talk will describe NASA's current effort to investigate SDR applications to space missions and a brief overview of a candidate architecture under consideration for space-based platforms.

  11. AstroBus On-Board Software

    NASA Astrophysics Data System (ADS)

    Biscarros, D.; Cantenot, C.; Séronie-Vivien, J.; Schmidt, G.

    AstroBus on-board software is customisable software for ERC32-based avionics implementing standard ESA Packet Utilization Standard (PUS) functions. Its architecture, based on generic design templates and relying on a library providing standard PUS TC, TM and event services, enhances its reusability on various programs. Finally, the AstroBus on-board software development and validation environment is based on last-generation tools providing an optimised customisation environment.

  12. 48 CFR 227.7203-5 - Government rights.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Software and Computer Software Documentation 227.7203-5 Government rights. The standard license rights in computer software that a licensor grants to the Government are unlimited rights, government purpose rights, or restricted rights. The standard license in computer software documentation conveys unlimited...

  13. 48 CFR 227.7203-5 - Government rights.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Software and Computer Software Documentation 227.7203-5 Government rights. The standard license rights in computer software that a licensor grants to the Government are unlimited rights, government purpose rights, or restricted rights. The standard license in computer software documentation conveys unlimited...

  14. 48 CFR 227.7203-5 - Government rights.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Software and Computer Software Documentation 227.7203-5 Government rights. The standard license rights in computer software that a licensor grants to the Government are unlimited rights, government purpose rights, or restricted rights. The standard license in computer software documentation conveys unlimited...

  15. 48 CFR 227.7203-5 - Government rights.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Software and Computer Software Documentation 227.7203-5 Government rights. The standard license rights in computer software that a licensor grants to the Government are unlimited rights, government purpose rights, or restricted rights. The standard license in computer software documentation conveys unlimited...

  16. 48 CFR 227.7203-5 - Government rights.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Software and Computer Software Documentation 227.7203-5 Government rights. The standard license rights in computer software that a licensor grants to the Government are unlimited rights, government purpose rights, or restricted rights. The standard license in computer software documentation conveys unlimited...

  17. A temperature and pressure controlled calibration system for pressure sensors

    NASA Technical Reports Server (NTRS)

    Chapman, John J.; Kahng, Seun K.

    1989-01-01

    A data acquisition and experiment control system capable of simulating temperatures from -184 to +220 C and pressures either absolute or differential from 0 to 344.74 kPa is developed to characterize silicon pressure sensor response to temperature and pressure. System software is described that includes sensor data acquisition, algorithms for numerically derived thermal offset and sensitivity correction, and operation of the environmental chamber and pressure standard. This system is shown to be capable of computer interfaced cryogenic testing to within 1 C and 34.47 Pa of single channel or multiplexed arrays of silicon pressure sensors.
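
    A hedged Python sketch of such a thermal correction: offset and sensitivity are fitted as low-order polynomials in temperature and then applied to a raw reading. All coefficients and data below are invented for illustration, not taken from the system described.

      import numpy as np

      temps = np.array([-150.0, -75.0, 0.0, 75.0, 150.0, 220.0])   # deg C
      offsets = np.array([0.8, 0.5, 0.0, -0.4, -0.9, -1.3])        # kPa
      sens = np.array([1.02, 1.01, 1.00, 0.99, 0.985, 0.98])       # gain

      # Numerically derived correction curves (quadratic fits).
      offset_poly = np.polyfit(temps, offsets, deg=2)
      sens_poly = np.polyfit(temps, sens, deg=2)

      def corrected_pressure(raw_kpa, temp_c):
          # Remove the temperature-dependent offset, then rescale by the
          # temperature-dependent sensitivity.
          gain = np.polyval(sens_poly, temp_c)
          offset = np.polyval(offset_poly, temp_c)
          return (raw_kpa - offset) / gain

      print(corrected_pressure(101.2, -120.0))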

  18. NASA's SDR Standard: Space Telecommunications Radio System

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.; Johnson, Sandra K.

    2007-01-01

    A software defined radio (SDR) architecture used in space-based platforms proposes to standardize certain aspects of radio development such as interface definitions, functional control and execution, and application software and firmware development. NASA has chartered a team to develop an open software defined radio hardware and software architecture to support NASA missions and determine the viability of an Agency-wide Standard. A draft concept of the proposed standard has been released and discussed among organizations in the SDR community. Appropriate leveraging of the JTRS SCA, OMG's SWRadio Architecture and other aspects are considered. A standard radio architecture offers potential value by employing common waveform software instantiation, operation, testing and software maintenance. While software defined radios offer greater flexibility, they also pose challenges to radio development for the space environment in terms of size, mass and power consumption and available technology. An SDR architecture for space must recognize and address the constraints of space flight hardware and systems, along with flight heritage and culture. NASA is actively participating in the development of technology and standards related to software defined radios. As NASA considers a standard radio architecture for space communications, input and coordination from government agencies, industry, academia, and standards bodies is key to a successful architecture. The unique aspects of space require thorough investigation of relevant terrestrial technologies properly adapted to space. The talk will describe NASA's current effort to investigate SDR applications to space missions and a brief overview of a candidate architecture under consideration for space-based platforms.

  19. Improvements to NASA's Debris Assessment Software

    NASA Technical Reports Server (NTRS)

    Opiela, J.; Johnson, Nicholas L.

    2007-01-01

    NASA's Debris Assessment Software (DAS) has been substantially revised and expanded. DAS is designed to assist NASA programs in performing orbital debris assessments, as described in NASA's Guidelines and Assessment Procedures for Limiting Orbital Debris. The extensive upgrade of DAS was undertaken to reflect changes in the debris mitigation guidelines, to incorporate recommendations from DAS users, and to take advantage of recent software capabilities for greater user utility. DAS 2.0 includes an updated environment model and enhanced orbital propagators and reentry-survivability models. The ORDEM96 debris environment model has been replaced by ORDEM2000 in DAS 2.0, which is also designed to accept anticipated revisions to the environment definition. Numerous upgrades have also been applied to the assessment of human casualty potential due to reentering debris. Routines derived from the Object Reentry Survival Analysis Tool, Version 6 (ORSAT 6), determine which objects are assessed to survive reentry, and the resulting risk of human casualty is calculated directly based upon the orbital inclination and a future world population database. When evaluating reentry risks, the user may enter up to 200 unique hardware components for each launched object, in up to four nested levels. This last feature allows the software to more accurately model components that are exposed below the initial breakup altitude. The new DAS 2.0 provides an updated set of tools for users to assess their mission's compliance with the NASA Safety Standard and does so with a clear and easy-to-understand interface. The new native Microsoft Windows graphical user interface (GUI) is a vast improvement over the previous DOS-based interface. In the new version, functions are more clearly laid out, and the GUI includes the standard Windows-style Help functions. The underlying routines within the DAS code are also improved.

  20. Diversification and Challenges of Software Engineering Standards

    NASA Technical Reports Server (NTRS)

    Poon, Peter T.

    1994-01-01

    The author poses certain questions in this paper: 'In the future, should there be just one software engineering standards set? If so, how can we work towards that goal? What are the challenges of internationalizing standards?' Based on the author's personal view, the statement of his position is as follows: 'There should NOT be just one set of software engineering standards in the future. At the same time, there should NOT be a proliferation of standards, and the number of sets of standards should be kept to a minimum. It is important to understand the diversification of the areas which are spanned by the software engineering standards.' The author goes on to describe the diversification of processes, the diversification in the national and international character of standards organizations, the diversification of the professional organizations producing standards, the diversification of the types of businesses and industries, and the challenges of internationalizing standards.

  1. An Object Model for a Rocket Engine Numerical Simulator

    NASA Technical Reports Server (NTRS)

    Mitra, D.; Bhalla, P. N.; Pratap, V.; Reddy, P.

    1998-01-01

    Rocket Engine Numerical Simulator (RENS) is a package of software which numerically simulates the behavior of a rocket engine. The parameters of the components of an engine are the inputs to these programs; given these parameters, the programs output the behaviors of those components. These behavioral values are then used to guide the design of, or to diagnose, a model of a rocket engine "built" by a composition of these programs simulating different components of the engine system. In order to use this software package effectively one needs a flexible model of a rocket engine into which the programs simulating different components can be plugged. Our project is to develop an object-based model of such an engine system. We are following an iterative and incremental approach in developing the model, as is the standard practice in object-oriented design and analysis of software. This process involves three stages: object modeling to represent the components and sub-components of a rocket engine, dynamic modeling to capture the temporal and behavioral aspects of the system, and functional modeling to represent the transformational aspects. This article reports on the first phase of our activity under a grant (RENS) from the NASA Lewis Research Center. We have utilized Rumbaugh's object modeling technique and the UML notation for this purpose. The classes of a rocket engine propulsion system are developed and some of them are presented in this report. The next step, developing a dynamic model for RENS, is also touched upon here. In this paper we will also discuss the advantages of using object-based modeling for developing this type of integrated simulator over other tools like an expert system shell or a procedural language, e.g., FORTRAN. Attempts have been made in the past to use such techniques.
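
    A minimal sketch, in Python rather than the project's OMT/UML notation, of how engine components might be composed into a pluggable model; the class names and behaviours are illustrative, not the RENS model's actual classes.

      class Component:
          # Base class for engine components; subclasses supply simulate().
          def __init__(self, name):
              self.name = name

          def simulate(self, state):
              raise NotImplementedError

      class Turbopump(Component):
          def simulate(self, state):
              # Placeholder behaviour: feed pressure rises by a fixed ratio.
              return {"pressure": state["pressure"] * 30.0}

      class CombustionChamber(Component):
          def simulate(self, state):
              # Placeholder behaviour: chamber pressure follows feed pressure.
              return {"chamber_pressure": 0.9 * state["pressure"]}

      class Engine:
          # A rocket engine as a composition of component simulators.
          def __init__(self, components):
              self.components = components

          def simulate(self, inputs):
              state = dict(inputs)
              for comp in self.components:
                  state.update(comp.simulate(state))
              return state

      engine = Engine([Turbopump("pump"), CombustionChamber("chamber")])
      print(engine.simulate({"pressure": 0.3}))   # MPa, illustrative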

  2. Algorithms and Libraries

    NASA Technical Reports Server (NTRS)

    Dongarra, Jack

    1998-01-01

    This exploratory study initiated our inquiry into algorithms and applications that would benefit from a latency-tolerant approach to algorithm building, including the construction of new algorithms where appropriate. In a multithreaded execution, when a processor reaches a point where remote memory access is necessary, the request is sent out on the network and a context switch occurs to a new thread of computation. This effectively masks a long and unpredictable latency due to remote loads, thereby providing tolerance to remote access latency. We began to develop standards to profile various algorithm and application parameters, such as the degree of parallelism, granularity, precision, instruction set mix, interprocessor communication, latency, etc. These tools will continue to develop and evolve as the Information Power Grid environment matures. To provide a richer context for this research, the project also focused on issues of fault tolerance and computation migration of numerical algorithms and software. During the initial phase we tried to increase our understanding of the bottlenecks in single-processor performance. Our work began by developing an approach for the automatic generation and optimization of numerical software for processors with deep memory hierarchies and pipelined functional units. Based on the results we achieved in this study we are planning to study other architectures of interest, including development of cost models, and developing code generators appropriate to these architectures.
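
    As a loose analogy to the latency masking described above, a Python sketch in which all remote requests are issued up front and computation proceeds on whichever block arrives first; remote_load and its latency are simulated, not a real remote-memory API.

      import concurrent.futures
      import time

      def remote_load(block_id):
          # Stand-in for a remote memory access with long latency.
          time.sleep(0.1)
          return [float(i) for i in range(1000)]

      def process(block):
          return sum(x * x for x in block)

      # Issue all remote requests up front; computation then proceeds on
      # whichever block arrives first, masking the individual latencies.
      with concurrent.futures.ThreadPoolExecutor(max_workers=8) as pool:
          futures = [pool.submit(remote_load, b) for b in range(8)]
          total = sum(process(f.result())
                      for f in concurrent.futures.as_completed(futures))
      print(total)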

  3. Implementationof a modular software system for multiphysical processes in porous media

    NASA Astrophysics Data System (ADS)

    Naumov, Dmitri; Watanabe, Norihiro; Bilke, Lars; Fischer, Thomas; Lehmann, Christoph; Rink, Karsten; Walther, Marc; Wang, Wenqing; Kolditz, Olaf

    2016-04-01

    Subsurface georeservoirs are a candidate technology for the large-scale energy storage required as part of the transition to renewable energy sources. The increased use of the subsurface results in competing interests and possible impacts on protected entities. To optimize and plan the use of the subsurface in large-scale scenario analyses, powerful numerical frameworks are required that aid process understanding and can capture the coupled thermal (T), hydraulic (H), mechanical (M), and chemical (C) processes with high computational efficiency. Because of the multitude of different couplings between the basic T, H, M, and C processes and the necessity to implement new numerical schemes, the development focus has moved to the software's modularity. The decreased coupling between the components results in two major advantages: easier addition of specialized processes and improvement of the code's testability and therefore its quality. The idea of modularization is implemented on several levels, in addition to the library-based separation of the previous code version, by using generalized algorithms available in the Standard Template Library and the Boost library, relying on efficient implementations of linear algebra solvers, using concepts when designing new types, and localizing frequently accessed data structures. This procedure shows certain benefits for a flexible high-performance framework applied to the analysis of multipurpose georeservoirs.

  4. Simplifying the Reuse and Interoperability of Geoscience Data Sets and Models with Semantic Metadata that is Human-Readable and Machine-actionable

    NASA Astrophysics Data System (ADS)

    Peckham, S. D.

    2017-12-01

    Standardized, deep descriptions of digital resources (e.g. data sets, computational models, software tools and publications) make it possible to develop user-friendly software systems that assist scientists with the discovery and appropriate use of these resources. Semantic metadata makes it possible for machines to take actions on behalf of humans, such as automatically identifying the resources needed to solve a given problem, retrieving them and then automatically connecting them (despite their heterogeneity) into a functioning workflow. Standardized model metadata also helps model users to understand the important details that underpin computational models and to compare the capabilities of different models. These details include simplifying assumptions on the physics, governing equations and the numerical methods used to solve them, discretization of space (the grid) and time (the time-stepping scheme), state variables (input or output), and model configuration parameters. This kind of metadata provides a "deep description" of a computational model that goes well beyond other types of metadata (e.g. author, purpose, scientific domain, programming language, digital rights, provenance, execution) and captures the science that underpins a model. A carefully constructed, unambiguous and rules-based schema to address this problem, called the Geoscience Standard Names ontology, will be presented that utilizes Semantic Web best practices and technologies. It has also been designed to work across science domains and to be readable by both humans and machines.
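
    A toy Python sketch of the kind of machine-actionable model description and compatibility check this enables; the variable names below are illustrative stand-ins, not actual Geoscience Standard Names terms.

      # A hypothetical machine-readable model description.
      model_metadata = {
          "model_name": "channel_flow_1d",
          "governing_equations": ["conservation_of_mass", "momentum_balance"],
          "assumptions": ["hydrostatic_pressure", "incompressible_flow"],
          "input_variables": ["channel_bottom__slope",
                              "channel_water__initial_depth"],
          "output_variables": ["channel_water_flow__speed"],
          "grid": {"type": "uniform_rectilinear", "spacing_m": 10.0},
          "time_stepping": {"scheme": "explicit_euler", "dt_s": 1.0},
      }

      def can_couple(producer, consumer):
          # A deliberately simple compatibility check: every input the
          # consumer needs must appear among the producer's outputs.
          return set(consumer["input_variables"]) <= set(producer["output_variables"])

      downstream = {"input_variables": ["channel_water_flow__speed"]}
      print(can_couple(model_metadata, downstream))   # True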

  5. Space Station Software Recommendations

    NASA Technical Reports Server (NTRS)

    Voigt, S. (Editor)

    1985-01-01

    Four panels of invited experts and NASA representatives focused on the following topics: software management, software development environment, languages, and software standards. Each panel deliberated in private, held two open sessions with audience participation, and developed recommendations for the NASA Space Station Program. The major thrusts of the recommendations were as follows: (1) The software management plan should establish policies, responsibilities, and decision points for software acquisition; (2) NASA should furnish a uniform modular software support environment and require its use for all space station software acquired (or developed); (3) The language Ada should be selected for space station software, and NASA should begin to address issues related to the effective use of Ada; and (4) The space station software standards should be selected (based upon existing standards where possible), and an organization should be identified to promulgate and enforce them. These and related recommendations are described in detail in the conference proceedings.

  6. Planning the Unplanned Experiment: Towards Assessing the Efficacy of Standards for Safety-Critical Software

    NASA Technical Reports Server (NTRS)

    Graydon, Patrick J.; Holloway, C. M.

    2015-01-01

    Safe use of software in safety-critical applications requires well-founded means of determining whether software is fit for such use. While software in industries such as aviation has a good safety record, little is known about whether standards for software in safety-critical applications 'work' (or even what that means). It is often (implicitly) argued that software is fit for safety-critical use because it conforms to an appropriate standard. Without knowing whether a standard works, such reliance is an experiment; without carefully collecting assessment data, that experiment is unplanned. To help plan the experiment, we organized a workshop to develop practical ideas for assessing software safety standards. In this paper, we relate and elaborate on the workshop discussion, which revealed subtle but important study design considerations and practical barriers to collecting appropriate historical data and recruiting appropriate experimental subjects. We discuss assessing standards as written and as applied, several candidate definitions for what it means for a standard to 'work,' and key assessment strategies and study techniques and the pros and cons of each. Finally, we conclude with thoughts about the kinds of research that will be required and how academia, industry, and regulators might collaborate to overcome the noted barriers.

  7. Open Architecture Standard for NASA's Software-Defined Space Telecommunications Radio Systems

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.; Johnson, Sandra K.; Kacpura, Thomas J.; Hall, Charles S.; Smith, Carl R.; Liebetreu, John

    2008-01-01

    NASA is developing an architecture standard for software-defined radios used in space- and ground-based platforms to enable commonality among radio developments to enhance capability and services while reducing mission and programmatic risk. Transceivers (or transponders) with functionality primarily defined in software (e.g., firmware) have the ability to change their functional behavior through software alone. This radio architecture standard offers value by employing common waveform software interfaces, method of instantiation, operation, and testing among different compliant hardware and software products. These common interfaces within the architecture abstract application software from the underlying hardware to enable technology insertion independently at either the software or hardware layer. This paper presents the initial Space Telecommunications Radio System (STRS) Architecture for NASA missions to provide the desired software abstraction and flexibility while minimizing the resources necessary to support the architecture.

  8. Towards Test Driven Development for Computational Science with pFUnit

    NASA Technical Reports Server (NTRS)

    Rilee, Michael L.; Clune, Thomas L.

    2014-01-01

    Developers working in Computational Science & Engineering (CSE)/High Performance Computing (HPC) must contend with constant change due to advances in computing technology and science. Test Driven Development (TDD) is a methodology that mitigates software development risks due to change at the cost of adding comprehensive and continuous testing to the development process. Testing frameworks tailored for CSE/HPC, like pFUnit, can lower the barriers to such testing, yet CSE software faces unique constraints foreign to the broader software engineering community. Effective testing of numerical software requires a comprehensive suite of oracles, i.e., use cases with known answers, as well as robust estimates for the unavoidable numerical errors associated with implementation in finite-precision arithmetic. At first glance these concerns often seem exceedingly challenging or even insurmountable for real-world scientific applications. However, we argue that this common perception is incorrect and driven by (1) a conflation between model validation and software verification and (2) the general tendency in the scientific community to develop relatively coarse-grained, large procedures that compound numerous algorithmic steps. We believe TDD can be applied routinely to numerical software if developers pursue fine-grained implementations that permit testing, neatly side-stepping concerns about needing nontrivial oracles as well as the accumulation of errors. We present an example of a successful, complex legacy CSE/HPC code whose development process shares some aspects with TDD, which we contrast with current and potential capabilities. A mix of our proposed methodology and framework support should enable everyday use of TDD by CSE-expert developers.
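
    pFUnit itself is a Fortran framework; the sketch below illustrates the same fine-grained-testing idea in Python, testing a small numerical unit against a known oracle with a tolerance consistent with the method's truncation error.

      import math
      import unittest

      def trapezoid(f, a, b, n):
          # Composite trapezoidal rule: a small, fine-grained unit that
          # is easy to test against an analytic oracle.
          h = (b - a) / n
          total = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
          return h * total

      class TestTrapezoid(unittest.TestCase):
          def test_against_known_integral(self):
              # Oracle: the integral of sin on [0, pi] is exactly 2; the
              # tolerance reflects the rule's O(h^2) truncation error.
              approx = trapezoid(math.sin, 0.0, math.pi, 1000)
              self.assertAlmostEqual(approx, 2.0, delta=1e-5)

      if __name__ == "__main__":
          unittest.main()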

  9. Behind Linus's Law: Investigating Peer Review Processes in Open Source

    ERIC Educational Resources Information Center

    Wang, Jing

    2013-01-01

    Open source software has revolutionized the way people develop software, organize collaborative work, and innovate. The numerous open source software systems that have been created and adopted over the past decade are influential and vital in all aspects of work and daily life. The understanding of open source software development can enhance its…

  10. Program Helps Standardize Documentation Of Software

    NASA Technical Reports Server (NTRS)

    Howe, G.

    1994-01-01

    Intelligent Documentation Management System (IDMS) is a computer program developed to assist project managers in implementing the information system documentation standard known as NASA-STD-2100-91, NASA STD COS-10300, of NASA's Software Management and Assurance Program. The standard consists of data-item descriptions, or templates, each of which governs a particular component of software documentation. IDMS helps the program manager tailor the documentation standard to a project. Written in the C language.

  11. Towards a metadata scheme for the description of materials - the description of microstructures

    NASA Astrophysics Data System (ADS)

    Schmitz, Georg J.; Böttger, Bernd; Apel, Markus; Eiken, Janin; Laschet, Gottfried; Altenfeld, Ralph; Berger, Ralf; Boussinot, Guillaume; Viardin, Alexandre

    2016-01-01

    The property of any material is essentially determined by its microstructure. Numerical models are increasingly the focus of modern engineering as helpful tools for tailoring and optimization of custom-designed microstructures by suitable processing and alloy design. A huge variety of software tools is available to predict various microstructural aspects for different materials. In the general frame of an integrated computational materials engineering (ICME) approach, these microstructure models provide the link between models operating at the atomistic or electronic scales, and models operating on the macroscopic scale of the component and its processing. In view of an improved interoperability of all these different tools it is highly desirable to establish a standardized nomenclature and methodology for the exchange of microstructure data. The scope of this article is to provide a comprehensive system of metadata descriptors for the description of a 3D microstructure. The presented descriptors are limited to a mere geometric description of a static microstructure and have to be complemented by further descriptors, e.g. for properties, numerical representations, kinetic data, and others in the future. Further attributes to each descriptor, e.g. on data origin, data uncertainty, and data validity range are being defined in ongoing work. The proposed descriptors are intended to be independent of any specific numerical representation. The descriptors defined in this article may serve as a first basis for standardization and will simplify the data exchange between different numerical models, as well as promote the integration of experimental data into numerical models of microstructures. An HDF5 template data file for a simple, three phase Al-Cu microstructure being based on the defined descriptors complements this article.

  12. Towards a metadata scheme for the description of materials - the description of microstructures.

    PubMed

    Schmitz, Georg J; Böttger, Bernd; Apel, Markus; Eiken, Janin; Laschet, Gottfried; Altenfeld, Ralph; Berger, Ralf; Boussinot, Guillaume; Viardin, Alexandre

    2016-01-01

    The property of any material is essentially determined by its microstructure. Numerical models are increasingly the focus of modern engineering as helpful tools for tailoring and optimization of custom-designed microstructures by suitable processing and alloy design. A huge variety of software tools is available to predict various microstructural aspects for different materials. In the general frame of an integrated computational materials engineering (ICME) approach, these microstructure models provide the link between models operating at the atomistic or electronic scales, and models operating on the macroscopic scale of the component and its processing. In view of an improved interoperability of all these different tools it is highly desirable to establish a standardized nomenclature and methodology for the exchange of microstructure data. The scope of this article is to provide a comprehensive system of metadata descriptors for the description of a 3D microstructure. The presented descriptors are limited to a mere geometric description of a static microstructure and have to be complemented by further descriptors, e.g. for properties, numerical representations, kinetic data, and others in the future. Further attributes to each descriptor, e.g. on data origin, data uncertainty, and data validity range are being defined in ongoing work. The proposed descriptors are intended to be independent of any specific numerical representation. The descriptors defined in this article may serve as a first basis for standardization and will simplify the data exchange between different numerical models, as well as promote the integration of experimental data into numerical models of microstructures. An HDF5 template data file for a simple, three phase Al-Cu microstructure being based on the defined descriptors complements this article.
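
    A short h5py sketch of storing a voxelized three-phase microstructure in HDF5, in the spirit of the template file the article mentions; the group, dataset and attribute names are illustrative, not the published schema.

      import h5py
      import numpy as np

      with h5py.File("microstructure_template.h5", "w") as f:
          micro = f.create_group("microstructure")
          micro.attrs["material_system"] = "Al-Cu"
          micro.attrs["number_of_phases"] = 3

          # A 64^3 voxel grid of integer phase identifiers (0, 1, 2).
          phase_ids = np.random.randint(0, 3, size=(64, 64, 64), dtype=np.int8)
          dset = micro.create_dataset("phase_id", data=phase_ids,
                                      compression="gzip")
          dset.attrs["voxel_spacing_um"] = [0.5, 0.5, 0.5]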

  13. Numerical evaluation of moiré pattern in touch sensor module with electrode mesh structure in oblique view

    NASA Astrophysics Data System (ADS)

    Pournoury, M.; Zamiri, A.; Kim, T. Y.; Yurlov, V.; Oh, K.

    2016-03-01

    Capacitive touch sensor screens with metal materials have recently become qualified as a substitute for ITO; however, several obstacles still have to be solved. One of the most important issues is the moiré phenomenon. The visibility problem of the metal mesh in a touch sensor module (TSM) is numerically considered in this paper. Based on the human-eye contrast sensitivity function (CSF), the moiré pattern of the TSM electrode mesh structure is simulated with MATLAB software for an 8-inch screen display in oblique view. The standard deviation of the moiré generated by the superposition of the electrode mesh and the screen image is calculated to find the optimal parameters which provide the minimum moiré visibility. To create the screen pixel array and mesh electrode, a rectangular function is used. The filtered image, in the frequency domain, is obtained by multiplying the Fourier transform of the finite mesh pattern (the product of the screen pixels and mesh electrode) with the calculated CSF function for three different observer distances (L=200, 300 and 400 mm). It is observed that the discrepancy between analytical and numerical results is less than 0.6% for a 400 mm viewer distance. Moreover, in the case of oblique view, because the thickness of the finite film between the mesh electrodes and the screen is considered, different points of minimum standard deviation of the moiré pattern are predicted compared to normal view.
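
    A 1D Python sketch of the visibility metric: superpose two periodic patterns, weight the spectrum with a CSF, and use the standard deviation of the filtered image as a visibility proxy. The CSF form and all parameters here are crude placeholders, not the paper's calibrated model.

      import numpy as np

      def moire_visibility(screen_period, mesh_period, size=512):
          # Superposition of two periodic patterns (screen pixels x mesh).
          x = np.arange(size)
          pattern = (np.sign(np.sin(2 * np.pi * x / screen_period)) *
                     np.sign(np.sin(2 * np.pi * x / mesh_period)))
          # CSF-weighted spectrum; this band-pass shape is a placeholder.
          spectrum = np.fft.rfft(pattern)
          freq = np.fft.rfftfreq(size)
          csf = freq * np.exp(-freq / 0.05)
          filtered = np.fft.irfft(spectrum * csf, n=size)
          return filtered.std()

      # Scan mesh periods and pick the one with minimum moire visibility.
      periods = range(4, 40)
      best = min(periods, key=lambda p: moire_visibility(8.0, p))
      print("least-visible mesh period (samples):", best)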

  14. Modular Software for Spacecraft Navigation Using the Global Positioning System (GPS)

    NASA Technical Reports Server (NTRS)

    Truong, S. H.; Hartman, K. R.; Weidow, D. A.; Berry, D. L.; Oza, D. H.; Long, A. C.; Joyce, E.; Steger, W. L.

    1996-01-01

    The Goddard Space Flight Center Flight Dynamics and Mission Operations Divisions have jointly investigated the feasibility of engineering modular Global Positioning System (GPS) navigation software to support both real-time flight and ground postprocessing configurations. The goals of this effort are to define standard GPS data interfaces and to engineer standard, reusable navigation software components that can be used to build a broad range of GPS navigation support applications. The paper discusses the GPS modular software (GMOD) system and operations concepts, major requirements, candidate software architecture, feasibility assessment and recommended software interface standards. In addition, ongoing efforts to broaden the scope of the initial study and to develop modular software to support autonomous navigation using GPS are addressed.

  15. Space Telecommunications Radio Architecture (STRS)

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.

    2006-01-01

    A software defined radio (SDR) architecture used in space-based platforms proposes to standardize certain aspects of radio development such as interface definitions, functional control and execution, and application software and firmware development. NASA has chartered a team to develop an open software defined radio hardware and software architecture to support NASA missions and determine the viability of an Agency-wide Standard. A draft concept of the proposed standard has been released and discussed among organizations in the SDR community. Appropriate leveraging of the JTRS SCA, OMG's SWRadio Architecture and other aspects are considered. A standard radio architecture offers potential value by employing common waveform software instantiation, operation, testing and software maintenance. While software defined radios offer greater flexibility, they also pose challenges to radio development for the space environment in terms of size, mass and power consumption and available technology. An SDR architecture for space must recognize and address the constraints of space flight hardware and systems, along with flight heritage and culture. NASA is actively participating in the development of technology and standards related to software defined radios. As NASA considers a standard radio architecture for space communications, input and coordination from government agencies, industry, academia, and standards bodies is key to a successful architecture. The unique aspects of space require thorough investigation of relevant terrestrial technologies properly adapted to space. The talk will describe NASA's current effort to investigate SDR applications to space missions and a brief overview of a candidate architecture under consideration for space-based platforms.

  16. Space Telecommunications Radio Architecture (STRS): Technical Overview

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.

    2006-01-01

    A software defined radio (SDR) architecture used in space-based platforms proposes to standardize certain aspects of radio development such as interface definitions, functional control and execution, and application software and firmware development. NASA has chartered a team to develop an open software defined radio hardware and software architecture to support NASA missions and determine the viability of an Agency-wide Standard. A draft concept of the proposed standard has been released and discussed among organizations in the SDR community. Appropriate leveraging of the JTRS SCA, OMG's SWRadio Architecture and other aspects are considered. A standard radio architecture offers potential value by employing common waveform software instantiation, operation, testing and software maintenance. While software defined radios offer greater flexibility, they also pose challenges to radio development for the space environment in terms of size, mass and power consumption and available technology. An SDR architecture for space must recognize and address the constraints of space flight hardware and systems, along with flight heritage and culture. NASA is actively participating in the development of technology and standards related to software defined radios. As NASA considers a standard radio architecture for space communications, input and coordination from government agencies, industry, academia, and standards bodies is key to a successful architecture. The unique aspects of space require thorough investigation of relevant terrestrial technologies properly adapted to space. The talk will describe NASA's current effort to investigate SDR applications to space missions and a brief overview of a candidate architecture under consideration for space-based platforms.

  17. Side-branch resonators modelling with Green's function methods

    NASA Astrophysics Data System (ADS)

    Perrey-Debain, E.; Maréchal, R.; Ville, J. M.

    2014-09-01

    This paper deals with strategies for computing efficiently the propagation of sound waves in ducts containing passive components. In many cases of practical interest, these components are acoustic cavities which are connected to the duct. Though standard Finite Element software could be used for the numerical prediction of sound transmission through such a system, the method is known to be extremely demanding, both in terms of data preparation and computation, especially in the mid-frequency range. To alleviate this, a numerical technique that exploits the benefit of the FEM and the BEM approach has been devised. First, a set of eigenmodes is computed in the cavity to produce a numerical impedance matrix connecting the pressure and the acoustic velocity on the duct wall interface. Then an integral representation for the acoustic pressure in the main duct is used. By choosing an appropriate Green's function for the duct, the integration procedure is limited to the duct-cavity interface only. This allows an accurate computation of the scattering matrix of such an acoustic system with a numerical complexity that grows very mildly with the frequency. Typical applications involving Helmholtz and Herschel-Quincke resonators are presented.

  18. Experimental and numerical analysis of convergent nozzles

    NASA Astrophysics Data System (ADS)

    Srinivas, G.; Rakham, Bhupal

    2017-05-01

    In this paper the main focus is on the convergent nozzle, for which both experimental and numerical calculations were carried out with the support of the standardized literature. In recent years the performance of both air-breathing and non-air-breathing engines has increased significantly. The nozzle is one of the components that plays a vital role in enhancing the performance of both engine types; selecting the type of nozzle depends on the vehicle speed requirement and the aerodynamic behaviour, which are of utmost importance in the field of propulsion. The experimental analysis of the convergent nozzle flow was done using a scaled apparatus, and a similar setup was modelled in the ANSYS software for the flow analysis across the convergent nozzle. Consistent calculations and analyses, based on a survey of the public literature, were done to validate the experimental and numerical simulation results for the convergent nozzle. Using these two approaches, experimental and numerical simulation, the best-fit results are brought together to meet the design requirements. A comparison was also made to establish the reliability of the design criteria for the convergent nozzle, which can be entrenched in the field of propulsion applications.
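
    The textbook isentropic relations below illustrate the kind of hand calculation used to cross-check such experimental and CFD results; the inlet conditions in the example are arbitrary, not the paper's test conditions.

      import math

      def convergent_nozzle_exit(p0, t0, p_back, gamma=1.4, r_gas=287.0):
          # Isentropic exit conditions for a convergent nozzle. The flow
          # chokes (M = 1) once the back pressure drops below the
          # critical pressure.
          p_crit = p0 * (2.0 / (gamma + 1.0)) ** (gamma / (gamma - 1.0))
          if p_back <= p_crit:                       # choked flow
              mach = 1.0
          else:                                      # subsonic exit
              mach = math.sqrt((2.0 / (gamma - 1.0)) *
                               ((p0 / p_back) ** ((gamma - 1.0) / gamma) - 1.0))
          t_exit = t0 / (1.0 + 0.5 * (gamma - 1.0) * mach**2)
          v_exit = mach * math.sqrt(gamma * r_gas * t_exit)
          return mach, v_exit

      # Arbitrary example: 3 bar, 300 K inlet exhausting to 1 bar (choked).
      print(convergent_nozzle_exit(p0=3.0e5, t0=300.0, p_back=1.0e5))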

  19. Modeling and Evaluation of Geophysical Methods for Monitoring and Tracking CO2 Migration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daniels, Jeff

    2012-11-30

    Geological sequestration has been proposed as a viable option for mitigating the vast amount of CO2 being released into the atmosphere daily. Test sites for CO2 injection have been appearing across the world to ascertain the feasibility of capturing and sequestering carbon dioxide. A major concern with full scale implementation is monitoring and verifying the permanence of injected CO2. Geophysical methods, an exploration industry standard, are non-invasive imaging techniques that can be implemented to address that concern. Geophysical methods, seismic and electromagnetic, play a crucial role in monitoring the subsurface pre- and post-injection. Seismic techniques have been the most popular but electromagnetic methods are gaining interest. The primary goal of this project was to develop a new geophysical tool, a software program called GphyzCO2, to investigate the implementation of geophysical monitoring for detecting injected CO2 at test sites. The GphyzCO2 software consists of interconnected programs that encompass well logging, seismic, and electromagnetic methods. The software enables users to design and execute 3D surface-to-surface (conventional surface seismic) and borehole-to-borehole (cross-hole seismic and electromagnetic methods) numerical modeling surveys. The generalized flow of the program begins with building a complex 3D subsurface geological model, assigning properties to the model that mimic a potential CO2 injection site, numerically forward modeling a geophysical survey, and analyzing the results. A test site located in Warren County, Ohio was selected for the full implementation of GphyzCO2. Specific interest was placed on a potential reservoir target, the Mount Simon Sandstone, and the cap rock, the Eau Claire Formation. Analysis of the test site included well log data, physical property measurements (porosity), core sample resistivity measurements, calculating electrical permittivity values, seismic data collection, and seismic interpretation. The data was input into GphyzCO2 to demonstrate a full implementation of the software capabilities. Part of the implementation investigated the limits of using geophysical methods to monitor CO2 injection sites. The results show that cross-hole EM numerical surveys are limited to under 100 meter borehole separation. Those results were utilized in executing numerical EM surveys that contain hypothetical CO2 injections. The outcome of the forward modeling shows that EM methods can detect the presence of CO2.

  20. Standard practices for the implementation of computer software

    NASA Technical Reports Server (NTRS)

    Irvine, A. P. (Editor)

    1978-01-01

    A standard approach to the development of computer programs is provided that covers the life cycle of software development from the planning and requirements phase through the software acceptance testing phase. All documents necessary to provide the required visibility into the software life cycle process are discussed in detail.

  1. 78 FR 47014 - Configuration Management Plans for Digital Computer Software Used in Safety Systems of Nuclear...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-02

    ... Software Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear Regulatory Commission. ACTION... Computer Software Used in Safety Systems of Nuclear Power Plants.'' This RG endorses, with clarifications... Electrical and Electronic Engineers (IEEE) Standard 828-2005, ``IEEE Standard for Software Configuration...

  2. Standardization: Hardware and Software Standardization Can Reduce Costs and Save Time

    ERIC Educational Resources Information Center

    Brooks-Young, Susan

    2005-01-01

    Sadly, technical support doesn't come cheap. One money-saving strategy that's gained popularity among school technicians is equipment and software standardization. When it works, standardization can be very effective. However, standardization has its drawbacks. This article discusses the advantages and disadvantages of standardization.

  3. Numerical Simulation of Creep Characteristic for Composite Rock Mass with Weak Interlayer

    NASA Astrophysics Data System (ADS)

    Li, Jian-guang; Zhang, Zuo-liang; Zhang, Yu-biao; Shi, Xiu-wen; Wei, Jian

    2017-06-01

    Composite rock mass with weak interlayers widely exists in engineering, and it is essential to research its creep behavior, which can cause stability problems in rock engineering and production accidents. However, because sampling is difficult and samples are lost or damaged during delivery and machining, enough natural layered composite rock mass samples often cannot be obtained, so indirect test methods have been widely used. In this paper, we used the ANSYS software (a general finite element package produced by ANSYS, Inc.) to carry out numerical simulations based on uniaxial compression creep experiments on artificial composite rock mass with a weak interlayer, after fitting the experimental data. The results show that the laws obtained by the numerical simulations and the experiments are consistent. This confirms that numerical simulation of the creep characteristics of rock mass with the ANSYS software is feasible, and the method can also be extended to other underground engineering problems involving weak interlayers.
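
    A hedged sketch of the data-fitting step that precedes such a simulation: a generic creep law (instantaneous, delayed-elastic and steady viscous terms) fitted to synthetic stand-in data with scipy; neither the model form nor the parameters are the paper's.

      import numpy as np
      from scipy.optimize import curve_fit

      def creep_strain(t, e0, a, tau, b):
          # Generic creep curve: instantaneous strain + delayed-elastic
          # (Kelvin-type) term + steady viscous term.
          return e0 + a * (1.0 - np.exp(-t / tau)) + b * t

      # Synthetic "measured" creep curve standing in for laboratory data.
      t = np.linspace(0.0, 100.0, 50)
      strain = creep_strain(t, 1e-3, 5e-4, 20.0, 2e-6)
      strain += np.random.normal(0.0, 1e-5, t.size)

      params, _ = curve_fit(creep_strain, t, strain,
                            p0=[1e-3, 1e-4, 10.0, 1e-6])
      print("fitted parameters:", params)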

  4. Fusing Symbolic and Numerical Diagnostic Computations

    NASA Technical Reports Server (NTRS)

    James, Mark

    2007-01-01

    X-2000 Anomaly Detection Language denotes a developmental computing language, and the software that establishes and utilizes the language, for fusing two diagnostic computer programs (one implementing a numerical analysis method, the other a symbolic analysis method) into a unified event-based decision-analysis software system for real-time detection of events (e.g., failures) in a spacecraft, aircraft, or other complex engineering system. The numerical analysis method is performed by beacon-based exception analysis for multi-missions (BEAM), which has been discussed in several previous NASA Tech Briefs articles. The symbolic analysis method is, more specifically, an artificial-intelligence method of the knowledge-based, inference-engine type, and its implementation is exemplified by the Spacecraft Health Inference Engine (SHINE) software. The goal in developing the capability to fuse numerical and symbolic diagnostic components is to increase the depth of analysis beyond that previously attainable, thereby increasing the degree of confidence in the computed results. In practical terms, the sought improvement is to enable detection of all or most events, with no or few false alarms.
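
    A toy Python fusion of a numerical detector with a symbolic rule layer, to illustrate the idea only: the z-score test, the rules and the thresholds are invented and do not reflect BEAM or SHINE internals.

      import statistics

      def numeric_anomaly(history, value, z_threshold=3.0):
          # Numerical layer reduced to a toy z-score test on telemetry.
          mu = statistics.mean(history)
          sigma = statistics.stdev(history) or 1e-9
          return abs(value - mu) / sigma > z_threshold

      # Symbolic layer reduced to toy if-then rules over the fused state.
      RULES = [
          (lambda s: s["numeric_anomaly"] and s["mode"] == "thrusting",
           "possible thruster fault"),
          (lambda s: s["numeric_anomaly"] and s["mode"] == "idle",
           "sensor drift suspected"),
      ]

      def diagnose(history, value, mode):
          state = {"numeric_anomaly": numeric_anomaly(history, value),
                   "mode": mode}
          return [msg for cond, msg in RULES if cond(state)]

      print(diagnose([1.0, 1.1, 0.9, 1.0, 1.05], 2.4, "thrusting"))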

  5. New Passivity Criteria for Fuzzy Bam Neural Networks with Markovian Jumping Parameters and Time-Varying Delays

    NASA Astrophysics Data System (ADS)

    Vadivel, P.; Sakthivel, R.; Mathiyalagan, K.; Thangaraj, P.

    2013-02-01

    This paper addresses the problem of passivity analysis for a class of fuzzy bidirectional associative memory (BAM) neural networks with Markovian jumping parameters and time-varying delays. A set of sufficient conditions for the passivity of the considered fuzzy BAM neural network model is derived in terms of linear matrix inequalities by using the delay fractioning technique together with the Lyapunov function approach. In addition, uncertainties are inevitable in neural networks because of the existence of modeling errors and external disturbances. This result is therefore extended to robust passivity criteria for uncertain fuzzy BAM neural networks with time-varying delays and uncertainties. These criteria are expressed in the form of linear matrix inequalities (LMIs), which can be efficiently solved via standard numerical software. Two numerical examples are provided to demonstrate the effectiveness of the obtained results.
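
    To illustrate what "solved via standard numerical software" means in practice, a Python/cvxpy sketch of a much simpler LMI (a Lyapunov stability condition, not the paper's passivity criteria): find P > 0 with A^T P + P A < 0 for an arbitrary stable example matrix A.

      import cvxpy as cp
      import numpy as np

      A = np.array([[-2.0, 1.0],
                    [0.5, -1.5]])
      n = A.shape[0]
      P = cp.Variable((n, n), symmetric=True)
      eps = 1e-6
      # Strict inequalities are handled by a small margin eps.
      constraints = [P >> eps * np.eye(n),
                     A.T @ P + P @ A << -eps * np.eye(n)]
      cp.Problem(cp.Minimize(0), constraints).solve(solver=cp.SCS)
      print("feasible P:")
      print(P.value)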

  6. Contracting for Computer Software in Standardized Computer Languages

    PubMed Central

    Brannigan, Vincent M.; Dayhoff, Ruth E.

    1982-01-01

    The interaction between standardized computer languages and contracts for programs which use these languages is important to the buyer or seller of software. The rationale for standardization, the problems in standardizing computer languages, and the difficulties of determining whether the product conforms to the standard are issues which must be understood. The contract law processes of delivery, acceptance testing, acceptance, rejection, and revocation of acceptance are applicable to the contracting process for standard language software. Appropriate contract language is suggested for requiring strict compliance with a standard, and an overview of remedies is given for failure to comply.

  7. Numerical Analysis of Deflections of Multi-Layered Beams

    NASA Astrophysics Data System (ADS)

    Biliński, Tadeusz; Socha, Tomasz

    2015-03-01

    The paper concerns the rheological bending problem of wooden beams reinforced with embedded composite bars. A theoretical model of the behaviour of a multi-layered beam is presented. The component materials of this beam are described with equations for the linear viscoelastic five-parameter rheological model. Two numerical analysis methods for the long-term response of wood structures are presented. The first method has been developed with SCILAB software. The second one has been developed with the finite element calculation software ABAQUS and user subroutine UMAT. Laboratory investigations were conducted on sample beams of natural dimensions in order to validate the proposed theoretical model and verify numerical simulations. Good agreement between experimental measurements and numerical results is observed.

  8. Temperature distribution of thick thermoset composites

    NASA Astrophysics Data System (ADS)

    Guo, Zhan-Sheng; Du, Shanyi; Zhang, Boming

    2004-05-01

    The development of the temperature distribution in thick polymeric-matrix laminates during an autoclave vacuum bag process was measured and compared with numerically calculated results. The finite element formulation of the transient heat transfer problem was carried out for polymeric matrix composite materials from the heat transfer differential equations, including the internal heat generation produced by exothermic chemical reactions. Software based on a general finite element package was developed for numerical simulation of the entire composite process. From the experimental and numerical results, it was found that the measured temperature profiles were in good agreement with the numerical ones, and that conventional cure cycles recommended by prepreg manufacturers for thin laminates should be modified to prevent temperature overshoot.
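
    As a rough illustration of the kind of computation involved (a one-dimensional explicit finite-difference sketch with an n-th-order Arrhenius cure model; all parameter values are illustrative, not the prepreg kinetics used in the paper):

        import numpy as np

        # rho*c*dT/dt = k*d2T/dx2 + rho*H*dalpha/dt,  dalpha/dt = A*exp(-Ea/RT)*(1-alpha)^n
        L, nx, dt = 0.05, 51, 0.05                  # 5 cm thick laminate, grid points, time step (s)
        rho, c, k, H = 1500.0, 1200.0, 0.5, 4.0e5   # density, heat capacity, conductivity, heat of reaction
        A_pre, Ea, R, n = 1.0e5, 7.0e4, 8.314, 1.5  # illustrative cure kinetics
        dx = L / (nx - 1)
        T = np.full(nx, 300.0)                      # temperature (K)
        alpha = np.zeros(nx)                        # degree of cure
        for step in range(int(3600 / dt)):          # one hour of the cure cycle
            T[0] = T[-1] = 300.0 + 0.05 * min(step * dt, 1800.0)  # 3 K/min ramp, then hold
            rate = A_pre * np.exp(-Ea / (R * T)) * (1.0 - alpha) ** n
            lap = np.zeros(nx)
            lap[1:-1] = (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dx**2
            T = T + dt * (k * lap + rho * H * rate) / (rho * c)
            alpha = np.minimum(alpha + dt * rate, 1.0)
        print(round(T[nx // 2] - 273.15, 1), "degC at mid-thickness after 1 h")

    Overshoot appears in such a model when the exothermic source term outpaces conduction through the thickness, which is why thick laminates need modified cure cycles.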

  9. The Evolution of Software Publication in Astronomy

    NASA Astrophysics Data System (ADS)

    Cantiello, Matteo

    2018-01-01

    Software is a fundamental component of the scientific research process. As astronomical discoveries increasingly rely on complex numerical calculations and the analysis of big data sets, publishing and documenting software is a fundamental step in ensuring transparency and reproducibility of results. I will briefly discuss the recent history of software publication and highlight the challenges and opportunities ahead.

  10. Code White: A Signed Code Protection Mechanism for Smartphones

    DTIC Science & Technology

    2010-09-01

    analogous to computer security is the use of antivirus (AV) software. AV software is a brute force approach to security. The software ... these users, numerous malicious programs have also surfaced. And while smartphones have desktop-like capabilities to execute software, they do not ... [report table-of-contents fragment: 2.3.1 Antivirus and Mobile Phones; 2.3.2 ...]

  11. NASA Software Safety Standard

    NASA Technical Reports Server (NTRS)

    Rosenberg, Linda

    1997-01-01

    If software is a critical element in a safety critical system, it is imperative to implement a systematic approach to software safety as an integral part of the overall system safety program. The NASA-STD-8719.13A, "NASA Software Safety Standard", describes the activities necessary to ensure that safety is designed into software that is acquired or developed by NASA, and that safety is maintained throughout the software life cycle. A PDF version is available on the WWW from Lewis. A Guidebook that will assist in the implementation of the requirements in the Safety Standard is under development at the Lewis Research Center (LeRC). After completion, it will also be available on the WWW from Lewis.

  12. Numerical modelling techniques of soft soil improvement via stone columns: A brief review

    NASA Astrophysics Data System (ADS)

    Zukri, Azhani; Nazir, Ramli

    2018-04-01

    There are a number of numerical studies of stone column systems in the literature. Most of the studies found involved two-dimensional analysis of stone column behaviour, while only a few used three-dimensional analysis. The software most commonly utilised in those studies was Plaxis 2D and 3D. Other software packages used for numerical analysis are DIANA, EXAMINE, ZSoil, ABAQUS, ANSYS, NISA, GEOSTUDIO, CRISP, TOCHNOG, CESAR, GEOFEM (2D & 3D), FLAC, and FLAC3D. This paper reviews the methodological approaches to modelling stone columns numerically, in both two-dimensional and three-dimensional analyses. The numerical techniques and the suitable constitutive models used in the studies are also discussed, and the validation methods conducted to verify the numerical analyses are presented. This review also serves as a guide for junior engineers to the applicable procedures and considerations when constructing and running a two- or three-dimensional numerical analysis, while citing numerous relevant references.

  13. NASA's Software Safety Standard

    NASA Technical Reports Server (NTRS)

    Ramsay, Christopher M.

    2007-01-01

    NASA relies more and more on software to control, monitor, and verify its safety critical systems, facilities and operations. Since the 1960's there has hardly been a spacecraft launched that does not have a computer on board providing command and control services. There have been recent incidents where software played a role in high-profile mission failures and hazardous incidents. For example, the Mars Climate Orbiter, Mars Polar Lander, DART (Demonstration of Autonomous Rendezvous Technology), and MER (Mars Exploration Rover) Spirit anomalies were all caused by, or contributed to by, software. The Mission Control Centers for the Shuttle, ISS, and unmanned programs are highly dependent on software for data displays, analysis, and mission planning. Despite this growing dependence on software control and monitoring, there has been little to no consistent application of software safety practices and methodology to NASA's projects with safety critical software. Meanwhile, academia and private industry have been stepping forward with procedures and standards for safety critical systems and software, for example Dr. Nancy Leveson's book Safeware: System Safety and Computers. The NASA Software Safety Standard, originally published in 1997, was widely ignored due to its complexity and poor organization; it also focused on concepts rather than definite procedural requirements organized around a software project lifecycle. Led by the NASA Headquarters Office of Safety and Mission Assurance, the NASA Software Safety Standard has recently undergone a significant update. The new standard provides the procedures and guidelines for evaluating a project for safety criticality and then lays out the minimum project lifecycle requirements to assure the software is created, operated, and maintained in the safest possible manner. The update clearly delineates the minimum set of software safety requirements for a project without detailing the implementation of those requirements, which gives projects leeway to meet the requirements in the forms that best suit a particular project's needs and safety risk. In other words, it tells the project what to do, not how to do it. The update also incorporates advances in the state of the practice of software safety from academia and private industry, and addresses some of the more common issues now facing software developers in the NASA environment, such as the use of Commercial Off-the-Shelf (COTS), Modified OTS (MOTS), Government OTS (GOTS), and reused software. A team from across NASA developed the update, and it has had NASA-wide internal reviews by the software engineering, quality, safety, and project management communities, as well as expert external review. This presentation and paper will discuss the new NASA Software Safety Standard, its organization, and key features. It will start with a brief discussion of some NASA mission failures and incidents that had software as one of their root causes, and will then give a brief overview of the NASA Software Safety Process, including the key personnel responsibilities and functions that must be performed for safety-critical software.

  14. NASA's Software Safety Standard

    NASA Technical Reports Server (NTRS)

    Ramsay, Christopher M.

    2005-01-01

    NASA (National Aeronautics and Space Administration) relies more and more on software to control, monitor, and verify its safety critical systems, facilities and operations. Since the 1960's there has hardly been a spacecraft (manned or unmanned) launched that did not have a computer on board that provided vital command and control services. Despite this growing dependence on software control and monitoring, there has been no consistent application of software safety practices and methodology to NASA's projects with safety critical software. Led by the NASA Headquarters Office of Safety and Mission Assurance, the NASA Software Safety Standard (NASA-STD-8719.13B) has recently undergone a significant update in an attempt to provide that consistency. This paper will discuss the key features of the new NASA Software Safety Standard. It will start with a brief history of the use and development of software in safety critical applications at NASA. It will then give a brief overview of the NASA Software Working Group and the approach it took to revise the software engineering process across the Agency.

  15. Design and Implementation of Hybrid CORDIC Algorithm Based on Phase Rotation Estimation for NCO

    PubMed Central

    Zhang, Chaozhu; Han, Jinan; Li, Ke

    2014-01-01

    The numerically controlled oscillator has wide application in radar, digital receivers, and software radio systems. This paper first introduces the traditional CORDIC algorithm. Then, in order to improve computing speed and save resources, it proposes a hybrid CORDIC algorithm based on phase rotation estimation, applied in a numerically controlled oscillator (NCO). By estimating the direction of part of the phase rotations, the algorithm reduces the number of phase rotations and add-subtract units, thereby decreasing delay. Furthermore, the paper simulates and implements the numerically controlled oscillator using the Quartus II and ModelSim software. Finally, simulation results indicate that an improvement over the traditional CORDIC algorithm is achieved in terms of ease of computation, resource utilization, and computing speed/delay, while maintaining precision. It is suitable for high-speed, high-precision digital modulation and demodulation. PMID:25110750
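
    For reference, the traditional rotation-mode CORDIC iteration that the hybrid algorithm builds on can be sketched in a few lines of Python (this is the textbook algorithm, not the phase-rotation-estimation variant proposed in the paper):

        import math

        def cordic_sin_cos(angle, iterations=24):
            """Return (cos(angle), sin(angle)) via rotation-mode CORDIC.
            Converges for |angle| <= sum(atan(2**-i)) ~ 1.7433 rad."""
            atans = [math.atan(2.0 ** -i) for i in range(iterations)]
            gain = 1.0
            for i in range(iterations):
                gain /= math.sqrt(1.0 + 2.0 ** (-2 * i))   # accumulated CORDIC gain
            x, y, z = 1.0, 0.0, angle
            for i in range(iterations):
                d = 1.0 if z >= 0.0 else -1.0              # rotation direction from residual phase
                x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
                z -= d * atans[i]
            return x * gain, y * gain

        print(cordic_sin_cos(0.5))  # ~ (0.8776, 0.4794)

    The shift-and-add structure of these iterations is what makes CORDIC attractive for hardware NCOs; the paper's contribution is to estimate the direction of part of the rotations in advance.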

  16. Software Formal Inspections Guidebook

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The Software Formal Inspections Guidebook is designed to support the inspection process of software developed by and for NASA. This document provides information on how to implement a recommended and proven method for conducting formal inspections of NASA software. This Guidebook is a companion document to NASA Standard 2202-93, Software Formal Inspections Standard, approved April 1993, which provides the rules, procedures, and specific requirements for conducting software formal inspections. Application of the Formal Inspections Standard is optional to NASA program or project management. In cases where program or project management decide to use the formal inspections method, this Guidebook provides additional information on how to establish and implement the process. The goal of the formal inspections process as documented in the above-mentioned Standard and this Guidebook is to provide a framework and model for an inspection process that will enable the detection and elimination of defects as early as possible in the software life cycle. An ancillary aspect of the formal inspection process incorporates the collection and analysis of inspection data to effect continual improvement in the inspection process and the quality of the software subjected to the process.

  17. Certification Processes for Safety-Critical and Mission-Critical Aerospace Software

    NASA Technical Reports Server (NTRS)

    Nelson, Stacy

    2003-01-01

    This document is a quick reference guide with an overview of the processes required to certify safety-critical and mission-critical flight software at selected NASA centers and the FAA. Researchers and software developers can use this guide to jumpstart their understanding of how to get new or enhanced software onboard an aircraft or spacecraft. The introduction contains aerospace industry definitions of safety and safety-critical software, as well as the current rationale for certification of safety-critical software. The Standards for Safety-Critical Aerospace Software section lists and describes current standards including NASA standards and RTCA DO-178B. The Mission-Critical versus Safety-Critical software section explains the difference between two important classes of software: safety-critical software involving the potential for loss of life due to software failure and mission-critical software involving the potential for aborting a mission due to software failure. The DO-178B Safety-critical Certification Requirements section describes special processes and methods required to obtain a safety-critical certification for aerospace software flying on vehicles under auspices of the FAA. The final two sections give an overview of the certification process used at Dryden Flight Research Center and the approval process at the Jet Propulsion Lab (JPL).

  18. rpe v5: an emulator for reduced floating-point precision in large numerical simulations

    NASA Astrophysics Data System (ADS)

    Dawson, Andrew; Düben, Peter D.

    2017-06-01

    This paper describes the rpe (reduced-precision emulator) library which has the capability to emulate the use of arbitrary reduced floating-point precision within large numerical models written in Fortran. The rpe software allows model developers to test how reduced floating-point precision affects the result of their simulations without having to make extensive code changes or port the model onto specialized hardware. The software can be used to identify parts of a program that are problematic for numerical precision and to guide changes to the program to allow a stronger reduction in precision. The development of rpe was motivated by the strong demand for more computing power. If numerical precision can be reduced for an application under consideration while still achieving results of acceptable quality, computational cost can be reduced, since a reduction in numerical precision may allow an increase in performance or a reduction in power consumption. For simulations with weather and climate models, savings due to a reduction in precision could be reinvested to allow model simulations at higher spatial resolution or complexity, or to increase the number of ensemble members to improve predictions. rpe was developed with a particular focus on the community of weather and climate modelling, but the software could be used with numerical simulations from other domains.
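
    The core idea of such an emulator can be illustrated in a few lines (a simplified sketch of rounding values to a reduced number of significand bits; the actual rpe library operates on Fortran variables inside the model, not through this interface):

        import numpy as np

        def reduce_precision(x, sbits):
            """Round float64 values to sbits significand bits."""
            x = np.asarray(x, dtype=np.float64)
            m, e = np.frexp(x)                      # x = m * 2**e, with 0.5 <= |m| < 1
            m = np.round(m * 2.0**sbits) / 2.0**sbits
            return np.ldexp(m, e)

        print(np.pi, reduce_precision(np.pi, 10))   # pi kept to ~10 significand bits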

  19. An empirical evaluation of software quality assurance practices and challenges in a developing country: a comparison of Nigeria and Turkey.

    PubMed

    Sowunmi, Olaperi Yeside; Misra, Sanjay; Fernandez-Sanz, Luis; Crawford, Broderick; Soto, Ricardo

    2016-01-01

    The importance of quality assurance in the software development process cannot be overemphasized, because its adoption results in high reliability and easy maintenance of the software system and other software products. Software quality assurance includes different activities such as quality control, quality management, quality standards, quality planning, process standardization and improvement, amongst others. The aim of this work is to further investigate the software quality assurance practices of practitioners in Nigeria. While our previous work covered quality planning, adherence to standardized processes and the inherent challenges, this work has been extended to include quality control, software process improvement and membership of international quality standards organizations. It also makes a comparison based on a similar study carried out in Turkey. The goal is to generate more robust findings that can properly support decision making by the software community. The qualitative research approach, specifically the use of questionnaire research instruments, was applied to acquire data from software practitioners. In addition to the previous results, it was observed that quality assurance practices are quite neglected, and this can be a cause of low patronage. Moreover, software practitioners are neither aware of international standards organizations nor of the required process improvement techniques; as such, their claimed standards are not aligned with those of accredited bodies and are limited to their local experience and knowledge, which makes them questionable. The comparison with Turkey also yielded similar findings, making the results typical of developing countries. The research instrument used was tested for internal consistency using Cronbach's alpha, and it proved reliable. For the software industry in developing countries to grow strong and be a viable source of external revenue, software quality assurance practices have to be taken seriously, because their effect is evident in the final product. Moreover, quality frameworks and tools which require minimum time and cost are highly needed in these countries.

  20. Algorithms and software for U-Pb geochronology by LA-ICPMS

    NASA Astrophysics Data System (ADS)

    McLean, Noah M.; Bowring, James F.; Gehrels, George

    2016-07-01

    The past 15 years have produced numerous innovations in geochronology, including experimental methods, instrumentation, and software that are revolutionizing the acquisition and application of geochronological data. For example, exciting advances are being driven by Laser-Ablation ICP Mass Spectrometry (LA-ICPMS), which allows for rapid determination of U-Th-Pb ages with 10s of micrometer-scale spatial resolution. This method has become the most commonly applied tool for dating zircons, constraining a host of geological problems. The LA-ICPMS community is now faced with archiving these data with associated analytical results and, more importantly, ensuring that data meet the highest standards for precision and accuracy and that interlaboratory biases are minimized. However, there is little consensus with regard to analytical strategies and data reduction protocols for LA-ICPMS geochronology. The result is systematic interlaboratory bias and both underestimation and overestimation of uncertainties on calculated dates that, in turn, decrease the value of data in repositories such as EarthChem, which archives data and analytical results from participating laboratories. We present free open-source software that implements new algorithms for evaluating and resolving many of these discrepancies. This solution is the result of a collaborative effort to extend the U-Pb_Redux software for the ID-TIMS community to the LA-ICPMS community. Now named ET_Redux, our new software automates the analytical and scientific workflows of data acquisition, statistical filtering, data analysis and interpretation, publication, community-based archiving, and the compilation and comparison of data from different laboratories to support collaborative science.
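
    Underneath the statistical machinery, the basic Pb/U age equation that such software evaluates is simple. A minimal sketch (the decay constant is the standard Jaffey et al. 1971 value; the isotope ratio is illustrative):

        import math

        LAMBDA_238U = 1.55125e-10   # decay constant of 238U, per year (Jaffey et al., 1971)

        def pb206_u238_age(ratio):
            """Date (years) from a radiogenic 206Pb/238U ratio: t = ln(1 + ratio) / lambda."""
            return math.log(1.0 + ratio) / LAMBDA_238U

        print(pb206_u238_age(0.0166) / 1.0e6)   # ~ 106 Ma

    The hard part, and the focus of ET_Redux, is propagating measurement and calibration uncertainties through this equation consistently across laboratories.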

  1. Computational Methods for Identification, Optimization and Control of PDE Systems

    DTIC Science & Technology

    2010-04-30

    focused on the development of numerical methods and software specifically for the purpose of solving control, design, and optimization problems where...that provide the foundations of simulation software must play an important role in any research of this type, the demands placed on numerical methods...y sus Aplicaciones, Ciudad de Cordoba - Argentina, October 2007. 3. Inverse Problems in Deployable Space Structures, Fourth Conference on Inverse

  2. Surveying the Numeric Databanks.

    ERIC Educational Resources Information Center

    O'Leary, Mick

    1987-01-01

    Describes six leading numeric databank services and compares them with bibliographic databases in terms of customers' needs, search software, pricing arrangements, and the role of the search specialist. A listing of the locations of the numeric databanks discussed is provided. (CLB)

  3. Supporting the Use of CERT (registered trademark) Secure Coding Standards in DoD Acquisitions

    DTIC Science & Technology

    2012-07-01

    Capability Maturity Model Integration (CMMI®) [Davis 2009]. Team Software Process, TSP, and Capability Maturity Model Integration are service...STP Software Test Plan; TEP Test and Evaluation Plan; TSP Team Software Process; V&V verification and validation. CMU/SEI-2012-TN-016...Supporting the Use of CERT® Secure Coding Standards in DoD Acquisitions. Tim Morrow (Software Engineering Institute), Robert Seacord (Software

  4. A software controllable modular RF signal generator with multichannel transmission capabilities.

    PubMed

    Shaw, Z; Feilner, W; Esser, B; Dickens, J C; Neuber, A A

    2017-09-01

    A software controllable system which generates and transmits user defined RF signals is discussed. The system is implemented with multiple, modular transmitting channels that allow the user to easily replace parts such as amplifiers or antennas. Each channel is comprised of a data pattern generator (DPG), a digital to analog converter (DAC), a power amplifier, and a transmitting antenna. All channels are controlled through a host PC and synchronized through a master clock signal provided to each DAC by an external clock source. Signals to be transmitted are generated through the DPG control software on the PC or can be created by the user in a numerical computing environment. Three experiments are discussed using a two- and four-channel antenna array incorporating Chebyshev tapered TEM horn antennas. Transmitting distinct sets of nonperiodic bipolar impulses through each of the antennas in the array enabled synthesizing a sinusoidal signal of specific frequency in free space. Opposite to the standard phased array approach, each antenna radiates a distinctly different signal rather than the same signal simply phase shifted. The presented approach may be employed as a physical layer of encryption dependent on the position of the receiving antenna.

  5. Software archeology: a case study in software quality assurance and design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Macdonald, John M; Lloyd, Jane A; Turner, Cameron J

    2009-01-01

    Ideally, quality is designed into software, just as quality is designed into hardware. However, when dealing with legacy systems, demonstrating that the software meets required quality standards may be difficult to achieve. As the need to demonstrate the quality of existing software was recognized at Los Alamos National Laboratory (LANL), an effort was initiated to uncover and demonstrate that legacy software met the required quality standards. This effort led to the development of a reverse engineering approach referred to as software archaeology. This paper documents the software archaeology approaches used at LANL to document legacy software systems. A case study for the Robotic Integrated Packaging System (RIPS) software is included.

  6. A Probabilistic Software System Attribute Acceptance Paradigm for COTS Software Evaluation

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry

    2005-01-01

    Standard software requirement formats are written from top-down perspectives only, that is, from an ideal notion of a client's needs. Despite the exactness of the standard format, software and system errors in designed systems have abounded. Bad and inadequate requirements have resulted in cost overruns, schedule slips and lost profitability. Commercial off-the-shelf (COTS) software components are even more troublesome than designed systems because they are often provided "as is" and subsequently delivered with unsubstantiated validation of described capabilities. For COTS software, there needs to be a way to express the client's software needs in a consistent and formal manner using software system attributes derived from software quality standards. Additionally, the format needs to be amenable to software evaluation processes that integrate observable evidence garnered from historical data. This paper presents a paradigm that effectively bridges the gap between what a client desires (top-down) and what has been demonstrated (bottom-up) for COTS software evaluation. The paradigm addresses the specification of needs before the software evaluation is performed and can be used to increase the shared understanding between clients and software evaluators about what is required and what is technically possible.

  7. Numerical Analysis on Tensile Properties of Grout-filled Splice Sleeve Rebars under ISO 834 Standard Fire

    NASA Astrophysics Data System (ADS)

    Liu, Yong Jun; Li, Chao; Zhou, When Jun

    2018-06-01

    This paper presents numerical simulation results on the tensile properties of reinforcing bars spliced by grout-filled coupling sleeves under fire conditions, to identify the effect of load ratio on the fire resistance time of spliced reinforcing bars; these results provide a useful basis for predicting the structural behaviour of pre-cast reinforced concrete buildings in fires. The spliced rebar system investigated in this paper consists of two equal-diameter steel reinforcing bars of 25 mm diameter and a straight coupling sleeve with 50 mm outer and 45 mm inner diameters; the thickness of grout between the steel bars and the sleeve is 20 mm. Firstly, the temperature distributions in steel bars connected by grout-filled coupling sleeves exposed to the ISO 834 standard fire were calculated using the finite element analysis software ANSYS. Secondly, the stress changes in the heated bars under different constant tensile loads were calculated step by step until the rebar system failed due to fire. Thus, the fire resistance time of rebars spliced by grout-filled coupling sleeves under different axial tensile loads can be determined and, further, the relationship between fire resistance time and axial tensile load ratio obtained. Finally, the curve of fire resistance time versus axial tensile load ratio for grout-filled splice sleeve rebars exposed to the ISO 834 standard fire is presented.
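
    The ISO 834 standard fire itself is defined by a simple gas-temperature curve, which is what drives the thermal boundary condition in such simulations. A minimal sketch (the 20 degC ambient start is the standard assumption):

        import math

        def iso834_temperature(t_min, t0=20.0):
            """ISO 834 standard fire gas temperature (degC) after t_min minutes."""
            return t0 + 345.0 * math.log10(8.0 * t_min + 1.0)

        for t in (30, 60, 90, 120):
            print(t, "min:", round(iso834_temperature(t)), "degC")   # 842, 945, 1006, 1049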

  8. 75 FR 15440 - Guidance for Industry on Standards for Securing the Drug Supply Chain-Standardized Numerical...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-29

    ...] Guidance for Industry on Standards for Securing the Drug Supply Chain--Standardized Numerical... industry entitled ``Standards for Securing the Drug Supply Chain-Standardized Numerical Identification for... the Drug Supply Chain-Standardized Numerical Identification for Prescription Drug Packages.'' In the...

  9. An Open Simulation System Model for Scientific Applications

    NASA Technical Reports Server (NTRS)

    Williams, Anthony D.

    1995-01-01

    A model for a generic and open environment for running multi-code or multi-application simulations - called the Open Simulation System Model (OSSM) - is proposed and defined. This model attempts to meet the requirements of complex systems like the Numerical Propulsion Simulator System (NPSS). OSSM places no restrictions on the types of applications that can be integrated at any state of its evolution. This includes applications of different disciplines, fidelities, etc. An implementation strategy is proposed that starts with a basic prototype, and evolves over time to accommodate an increasing number of applications. Potential (standard) software is also identified which may aid in the design and implementation of the system.

  10. Department of the Navy Justification of Estimates for Fiscal Year 1987 Submitted to Congress February 1986. Operation & Maintenance, Navy. Book 3. Budget Activity 3: Intelligence & Communications Budget Activity 8: Training, Medical & OGPA Budget Activity 9: Administration & Assoc Acts. Budget Activity 10: Support to Other Nations

    DTIC Science & Technology

    1986-02-01

    ionospheric sensing device, provide more reliable communications, especially when HF propagation is uncertain, by determining which frequencies...the Fleet Numerical Oceanography Center computer systems. GDEM generates a sound velocity profile from the surface to the sea floor at every 1/2 degree of lat/long for the Northern Hemisphere oceans. GDEM is a Navy standard data base for all acoustic models. 18) Software Improvement Plan (SIP)...

  11. Fleet Numerical Oceanography Center Software Development Standards: An Implementation of DoD-STD-2167A

    DTIC Science & Technology

    1989-09-01

    STD-2167A, by William T. Livings, September 1989. Thesis Advisor: Barry A. Frew. Approved for public release; distribution is unlimited. ... easily changed or corrected when errors are found; and programs that are delivered for use months or even years too late. (Pressman, 1988, pp. 1-2)

  12. A radial transmission line material measurement apparatus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warne, L.K.; Moyer, R.D.; Koontz, T.E.

    1993-05-01

    A radial transmission line material measurement sample apparatus (sample holder, offset short standards, measurement software, and instrumentation) is described which has been proposed, analyzed, designed, constructed, and tested. The purpose of the apparatus is to obtain accurate surface impedance measurements of lossy, possibly anisotropic, samples at low and intermediate frequencies (vhf and low uhf). The samples typically take the form of sections of the material coatings on conducting objects. Such measurements thus provide the key input data for predictive numerical scattering codes. Prediction of the sample surface impedance from the coaxial input impedance measurement is carried out by two techniques. The first is an analytical model for the coaxial-to-radial transmission line junction. The second is an empirical determination of the bilinear transformation model of the junction by the measurement of three full standards. The standards take the form of three offset shorts (and an additional lossy Salisbury load), which have also been constructed. The accuracy achievable with the device appears to be near one percent.
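
    The three-standard bilinear calibration mentioned above is generic: measuring three known reflection standards determines the three parameters of the junction model, after which any measurement can be corrected. A sketch of the algebra (synthetic illustrative values, not the authors' data or exact procedure):

        import numpy as np

        def fit_bilinear(gamma_true, gamma_meas):
            """Fit m = (a*g + b) / (c*g + 1) from three known/measured pairs."""
            g = np.asarray(gamma_true, dtype=complex)
            m = np.asarray(gamma_meas, dtype=complex)
            A = np.column_stack([g, np.ones(3), -g * m])   # rows encode a*g + b - c*g*m = m
            a, b, c = np.linalg.solve(A, m)
            return a, b, c

        def correct(m, a, b, c):
            """Invert the fitted transformation: g = (m - b) / (a - c*m)."""
            return (m - b) / (a - c * m)

        a0, b0, c0 = 0.9 + 0.1j, 0.05 - 0.02j, 0.02 + 0.01j   # synthetic junction parameters
        standards = np.array([-1.0, -1.0j, 1.0j])             # e.g., three offset shorts
        measured = (a0 * standards + b0) / (c0 * standards + 1.0)
        print(np.allclose(correct(measured, *fit_bilinear(standards, measured)),
                          standards))   # True: the standards are recovered exactly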

  13. GCS programmer's manual

    NASA Technical Reports Server (NTRS)

    Lowman, Douglas S.; Withers, B. Edward; Shagnea, Anita M.; Dent, Leslie A.; Hayhurst, Kelly J.

    1990-01-01

    A variety of instructions to be used in the development of implementations of software for the Guidance and Control Software (GCS) project is described. This document fulfills the Radio Technical Commission for Aeronautics RTCA/DO-178A guidelines, 'Software Considerations in Airborne Systems and Equipment Certification' requirements for document No. 4, which specifies the information necessary for understanding and programming the host computer, and document No. 12, which specifies the software design and implementation standards that are applicable to the software development and testing process. Information on the following subjects is contained: activity recording, communication protocol, coding standards, change management, error handling, design standards, problem reporting, module testing logs, documentation formats, accuracy requirements, and programmer responsibilities.

  14. A microkernel design for component-based parallel numerical software systems.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balay, S.

    1999-01-13

    What is the minimal software infrastructure and what type of conventions are needed to simplify development of sophisticated parallel numerical application codes using a variety of software components that are not necessarily available as source code? We propose an opaque object-based model where the objects are dynamically loadable from the file system or network. The microkernel required to manage such a system needs to include, at most: (1) a few basic services, namely a mechanism for loading objects at run time via dynamic link libraries, and consistent schemes for error handling and memory management; and (2) selected methods that all objects share, to deal with object life (destruction, reference counting, relationships), and object observation (viewing, profiling, tracing). We are experimenting with these ideas in the context of extensible numerical software within the ALICE (Advanced Large-scale Integrated Computational Environment) project, where we are building the microkernel to manage the interoperability among various tools for large-scale scientific simulations. This paper presents some preliminary observations and conclusions from our work with microkernel design.
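
    The run-time loading mechanism at the heart of such a microkernel is the ordinary dynamic-link facility of the operating system. A minimal sketch on a Unix-like system, using Python's ctypes in place of the ALICE microkernel itself (find_library may return None on some platforms):

        import ctypes
        import ctypes.util

        # Locate and load a shared library at run time (here, the C math library),
        # the same mechanism a component microkernel would use for plug-in objects.
        libm = ctypes.CDLL(ctypes.util.find_library("m"))
        libm.cos.restype = ctypes.c_double
        libm.cos.argtypes = [ctypes.c_double]
        print(libm.cos(0.0))   # 1.0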

  15. Quality and standardization of telecommunication switching system software

    NASA Astrophysics Data System (ADS)

    Ranko, K.; Hivensaio, J.; Myllykangas, A.

    1981-12-01

    The purpose of this paper is to illustrate the quality and standardization of switching system software from the authors' point of view, with the aim of developing standardization in the user environment.

  16. Numerical investigation of flow on NACA4412 aerofoil with different aspect ratios

    NASA Astrophysics Data System (ADS)

    Demir, Hacımurat; Özden, Mustafa; Genç, Mustafa Serdar; Çağdaş, Mücahit

    2016-03-01

    In this study, the flow over a NACA4412 aerofoil was investigated both numerically and experimentally at different Reynolds numbers. The experiments were carried out in a low-speed wind tunnel at various angles of attack and two Reynolds numbers (25000 and 50000). The aerofoil was manufactured using a 3D printer with two aspect ratios (AR = 1 and AR = 3). Smoke-wire and oil flow visualization methods were used to visualize the surface flow patterns. The NACA4412 aerofoil was designed using SOLIDWORKS. The structural grid of the numerical model was constructed with the ANSYS ICEM CFD meshing software, and ANSYS FLUENT™ was used to perform the numerical calculations. The numerical results were compared with the experimental results. Bubble formation was shown in the CFD streamlines and the smoke-wire experiments at z/c = 0.4. Furthermore, the bubble shrank at z/c = 0.2 owing to the effects of tip vortices in both the numerical and experimental studies. Consequently, good agreement was observed between the numerical and experimental results.

  17. ANOPP programming and documentation standards document

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Standards defining the requirements for preparing software for the Aircraft Noise Prediction Program (ANOPP) were given. It is the intent of these standards to provide definition, design, coding, and documentation criteria for the achievement of a unity among ANOPP products. These standards apply to all of ANOPP's standard software system. The standards encompass philosophy as well as techniques and conventions.

  18. Electromagnetic Field Effects in Semiconductor Crystal Growth

    NASA Technical Reports Server (NTRS)

    Dulikravich, George S.

    1996-01-01

    This proposed two-year research project was to involve development of an analytical model, a numerical algorithm for its integration, and a software for the analysis of a solidification process under the influence of electric and magnetic fields in microgravity. Due to the complexity of the analytical model that was developed and its boundary conditions, only a preliminary version of the numerical algorithm was developed while the development of the software package was not completed.

  19. Numerical simulation of mechanical properties tests of tungsten mud waste geopolymer

    NASA Astrophysics Data System (ADS)

    Paszek, Natalia; Krystek, Małgorzata

    2018-03-01

    Geopolymers are expected to become an environmentally friendly alternative to concrete. The low CO2 emission during the production process and the possibility of ecological management of industrial wastes are mentioned as the main advantages of geopolymers. The main drawback, which hampers the application of geopolymers as a building material, is the lack of a theoretical material model. This problem is currently being addressed by a group of scientists from the Silesian University of Technology, and a series of laboratory tests is being carried out within the European research project REMINE. The paper introduces numerical analyses of tungsten mud waste geopolymer samples, performed in the Atena software on the basis of the laboratory tests. Numerical models of bent and compressed samples of different shapes are presented in the paper. The results obtained in Atena were compared with results obtained in the Abaqus and Mafem3D software.

  20. Geneious Basic: An integrated and extendable desktop software platform for the organization and analysis of sequence data

    PubMed Central

    Kearse, Matthew; Moir, Richard; Wilson, Amy; Stones-Havas, Steven; Cheung, Matthew; Sturrock, Shane; Buxton, Simon; Cooper, Alex; Markowitz, Sidney; Duran, Chris; Thierer, Tobias; Ashton, Bruce; Meintjes, Peter; Drummond, Alexei

    2012-01-01

    Summary: The two main functions of bioinformatics are the organization and analysis of biological data using computational resources. Geneious Basic has been designed to be an easy-to-use and flexible desktop software application framework for the organization and analysis of biological data, with a focus on molecular sequences and related data types. It integrates numerous industry-standard discovery analysis tools, with interactive visualizations to generate publication-ready images. One key contribution to researchers in the life sciences is the Geneious public application programming interface (API) that affords the ability to leverage the existing framework of the Geneious Basic software platform for virtually unlimited extension and customization. The result is an increase in the speed and quality of development of computation tools for the life sciences, due to the functionality and graphical user interface available to the developer through the public API. Geneious Basic represents an ideal platform for the bioinformatics community to leverage existing components and to integrate their own specific requirements for the discovery, analysis and visualization of biological data. Availability and implementation: Binaries and public API freely available for download at http://www.geneious.com/basic, implemented in Java and supported on Linux, Apple OSX and MS Windows. The software is also available from the Bio-Linux package repository at http://nebc.nerc.ac.uk/news/geneiousonbl. Contact: peter@biomatters.com PMID:22543367

  1. Geneious Basic: an integrated and extendable desktop software platform for the organization and analysis of sequence data.

    PubMed

    Kearse, Matthew; Moir, Richard; Wilson, Amy; Stones-Havas, Steven; Cheung, Matthew; Sturrock, Shane; Buxton, Simon; Cooper, Alex; Markowitz, Sidney; Duran, Chris; Thierer, Tobias; Ashton, Bruce; Meintjes, Peter; Drummond, Alexei

    2012-06-15

    The two main functions of bioinformatics are the organization and analysis of biological data using computational resources. Geneious Basic has been designed to be an easy-to-use and flexible desktop software application framework for the organization and analysis of biological data, with a focus on molecular sequences and related data types. It integrates numerous industry-standard discovery analysis tools, with interactive visualizations to generate publication-ready images. One key contribution to researchers in the life sciences is the Geneious public application programming interface (API) that affords the ability to leverage the existing framework of the Geneious Basic software platform for virtually unlimited extension and customization. The result is an increase in the speed and quality of development of computation tools for the life sciences, due to the functionality and graphical user interface available to the developer through the public API. Geneious Basic represents an ideal platform for the bioinformatics community to leverage existing components and to integrate their own specific requirements for the discovery, analysis and visualization of biological data. Binaries and public API freely available for download at http://www.geneious.com/basic, implemented in Java and supported on Linux, Apple OSX and MS Windows. The software is also available from the Bio-Linux package repository at http://nebc.nerc.ac.uk/news/geneiousonbl.

  2. Simulation of one-sided heating of boiler unit membrane-type water walls

    NASA Astrophysics Data System (ADS)

    Kurepin, M. P.; Serbinovskiy, M. Yu.

    2017-03-01

    This study describes the results of simulation of the temperature field and the stress-strain state of membrane-type gastight water walls of boiler units using the finite element method. The methods of analytical and standard calculation of one-sided heating of fin-tube water walls by a radiative heat flux are analyzed. Methods and software for input data calculation in the finite-element simulation, including the thermoelastic moments in welded panels that result from their one-sided heating, are proposed. The method and software modules are used for water wall simulation in ANSYS. The results of finite-element simulation of the temperature field, stress field, deformations and displacement of the membrane-type panel of the boiler furnace water wall are presented, together with the results of calculating the panel tube temperature, stresses and deformations using the known methods. The numerical simulations are compared with known experimental results on the heating of membrane-type water walls and their bending by given moments. It is demonstrated that the numerical results agree closely with the experimental data: the relative temperature difference does not exceed 1%. The relative difference between the experimental fin mutual turning angle caused by one-sided heating by a radiative heat flux and the finite-element results does not exceed 8.5% for non-displaced fins and 7% for fins with displacement; the corresponding difference between the theoretical results and the finite-element simulation does not exceed 3% and 7.1%, respectively. The proposed method and software modules for simulation of the temperature field and stress-strain state of the water walls are thereby verified, and the feasibility of their application in practical design is proven.

  3. Space Station Software Issues

    NASA Technical Reports Server (NTRS)

    Voigt, S. (Editor); Beskenis, S. (Editor)

    1985-01-01

    Issues in the development of software for the Space Station are discussed. Software acquisition and management, software development environment, standards, information system support for software developers, and a future software advisory board are addressed.

  4. Space and Missile Systems Center Standard: Software Development

    DTIC Science & Technology

    2015-01-16

    maintenance, or any other activity or combination of activities resulting in products. Within this standard, requirements to “develop,” “define...integration, reuse, reengineering, maintenance, or any other activity that results in products). The term “developer” encompasses all software team...activities that result in software products. Software development includes new development, modification, reuse, reengineering, maintenance, and any other

  5. SimVascular: An Open Source Pipeline for Cardiovascular Simulation.

    PubMed

    Updegrove, Adam; Wilson, Nathan M; Merkow, Jameson; Lan, Hongzhi; Marsden, Alison L; Shadden, Shawn C

    2017-03-01

    Patient-specific cardiovascular simulation has become a paradigm in cardiovascular research and is emerging as a powerful tool in basic, translational and clinical research. In this paper we discuss the recent development of a fully open-source SimVascular software package, which provides a complete pipeline from medical image data segmentation to patient-specific blood flow simulation and analysis. This package serves as a research tool for cardiovascular modeling and simulation, and has contributed to numerous advances in personalized medicine, surgical planning and medical device design. The SimVascular software has recently been refactored and expanded to enhance functionality, usability, efficiency and accuracy of image-based patient-specific modeling tools. Moreover, SimVascular previously required several licensed components that hindered new user adoption and code management and our recent developments have replaced these commercial components to create a fully open source pipeline. These developments foster advances in cardiovascular modeling research, increased collaboration, standardization of methods, and a growing developer community.

  6. Software Certification - Coding, Code, and Coders

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  7. Application of troposphere model from NWP and GNSS data into real-time precise positioning

    NASA Astrophysics Data System (ADS)

    Wilgan, Karina; Hadas, Tomasz; Kazmierski, Kamil; Rohm, Witold; Bosy, Jaroslaw

    2016-04-01

    Empirical tropospheric delay models are usually functions of meteorological parameters (temperature, pressure and humidity). The application of standard atmosphere parameters or global models, such as the GPT (global pressure/temperature) model or the UNB3 (University of New Brunswick, version 3) model, may not be sufficient, especially for positioning in non-standard weather conditions. A possible solution is to use regional troposphere models based on real-time or near-real-time measurements. We implement a regional troposphere model into the PPP (Precise Point Positioning) software GNSS-WARP (Wroclaw Algorithms for Real-time Positioning) developed at Wroclaw University of Environmental and Life Sciences. The software is capable of processing static and kinematic multi-GNSS data in real-time and post-processing mode and takes advantage of final IGS (International GNSS Service) products as well as IGS RTS (Real-Time Service) products. A shortcoming of the PPP technique is the time required for the solution to converge. One of the reasons is the high correlation among the estimated parameters: troposphere delay, receiver clock offset and receiver height. To efficiently decorrelate these parameters, a significant change in satellite geometry is required. An alternative solution is to introduce an external high-quality regional troposphere delay model to constrain the troposphere estimates. The proposed model consists of zenith total delays (ZTD) and mapping functions calculated from meteorological parameters from the Numerical Weather Prediction model WRF (Weather Research and Forecasting) and from ZTDs from ground-based GNSS stations, using the least-squares collocation software COMEDIE (Collocation of Meteorological Data for Interpretation and Estimation of Tropospheric Pathdelays) developed at ETH Zurich.
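
    For orientation, the hydrostatic part of the zenith delay that such models start from is commonly computed with the Saastamoinen formula from surface pressure alone (a standard textbook expression, not the WRF/collocation model described here):

        import math

        def saastamoinen_zhd(pressure_hpa, lat_rad, height_m):
            """Zenith hydrostatic delay (m) from surface pressure (Saastamoinen model)."""
            return 0.0022768 * pressure_hpa / (
                1.0 - 0.00266 * math.cos(2.0 * lat_rad) - 0.00000028 * height_m)

        print(saastamoinen_zhd(1013.25, math.radians(51.1), 120.0))   # ~ 2.31 m

    The wet component is far more variable, which is why NWP- and GNSS-derived ZTDs are needed to constrain it.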

  8. Software Safety Risk in Legacy Safety-Critical Computer Systems

    NASA Technical Reports Server (NTRS)

    Hill, Janice L.; Baggs, Rhoda

    2007-01-01

    Safety standards contain technical and process-oriented safety requirements. Technical requirements cover the "must work" and "must not work" functions in the system, while process-oriented requirements address software engineering and safety management processes. Some standards address the system perspective and some cover just the software in the system; NASA-STD-8719.13B, the Software Safety Standard, is the current standard of interest, and NASA programs and projects derive their own sets of safety requirements from it. A safety case is a documented demonstration that a system complies with the specified safety requirements: evidence is gathered on the integrity of the system and put forward as an argued case [Gardener (ed.)]. Problems occur when trying to meet safety standards, and thus make retrospective safety cases, in legacy safety-critical computer systems.

  9. The IEEE Software Engineering Standards Process

    PubMed Central

    Buckley, Fletcher J.

    1984-01-01

    Software Engineering has emerged as a field in recent years, and those involved increasingly recognize the need for standards. As a result, members of the Institute of Electrical and Electronics Engineers (IEEE) formed a subcommittee to develop these standards. This paper discusses the ongoing standards development, and associated efforts.

  10. Software Quality Evaluation Models Applicable in Health Information and Communications Technologies. A Review of the Literature.

    PubMed

    Villamor Ordozgoiti, Alberto; Delgado Hito, Pilar; Guix Comellas, Eva María; Fernandez Sanchez, Carlos Manuel; Garcia Hernandez, Milagros; Lluch Canut, Teresa

    2016-01-01

    The use of Information and Communications Technologies in healthcare has increased the need to consider quality criteria through standardised processes. The aim of this study was to analyse the software quality evaluation models applicable to healthcare from the perspective of ICT purchasers. Through a systematic literature review with the keywords software, product, quality, evaluation and health, we selected and analysed 20 original research papers published from 2005-2016 in health science and technology databases. The results showed four main topics: non-ISO models, software quality evaluation models based on ISO/IEC standards, studies analysing software quality evaluation models, and studies analysing ISO standards for software quality evaluation. The models provide cost-efficiency criteria for specific software and improve use outcomes. The ISO/IEC 25000 standard is shown to be the most suitable for evaluating the quality of ICTs for healthcare use from the perspective of institutional acquisition.

  11. Standard Populations (Millions) for Age-Adjustment - SEER Population Datasets

    Cancer.gov

    Download files containing standard population data for use in statistical software. The files contain the same data distributed with SEER*Stat software. You can also view the standard populations, either 19 age groups or single ages.

  12. Enhancing requirements engineering for patient registry software systems with evidence-based components.

    PubMed

    Lindoerfer, Doris; Mansmann, Ulrich

    2017-07-01

    Patient registries are instrumental for medical research. Often their structures are complex and their implementations use composite software systems to meet the wide spectrum of challenges. Commercial and open-source systems are available for registry implementation, but many research groups develop their own systems. Methodological approaches in the selection of software as well as the construction of proprietary systems are needed. We propose an evidence-based checklist, summarizing essential items for patient registry software systems (CIPROS), to accelerate the requirements engineering process. Requirements engineering activities for software systems follow traditional software requirements elicitation methods, general software requirements specification (SRS) templates, and standards. We performed a multistep procedure to develop a specific evidence-based CIPROS checklist: (1) A systematic literature review to build a comprehensive collection of technical concepts, (2) a qualitative content analysis to define a catalogue of relevant criteria, and (3) a checklist to construct a minimal appraisal standard. CIPROS is based on 64 publications and covers twelve sections with a total of 72 items. CIPROS also defines software requirements. Comparing CIPROS with traditional software requirements elicitation methods, SRS templates and standards show a broad consensus but differences in issues regarding registry-specific aspects. Using an evidence-based approach to requirements engineering for registry software adds aspects to the traditional methods and accelerates the software engineering process for registry software. The method we used to construct CIPROS serves as a potential template for creating evidence-based checklists in other fields. The CIPROS list supports developers in assessing requirements for existing systems and formulating requirements for their own systems, while strengthening the reporting of patient registry software system descriptions. It may be a first step to create standards for patient registry software system assessments. Copyright © 2017 Elsevier Inc. All rights reserved.

  13. CAD/CAE Integration Enhanced by New CAD Services Standard

    NASA Technical Reports Server (NTRS)

    Claus, Russell W.

    2002-01-01

    A Government-industry team led by the NASA Glenn Research Center has developed a computer interface standard for accessing data from computer-aided design (CAD) systems. The Object Management Group, an international computer standards organization, has adopted this CAD services standard. The new standard allows software (e.g., computer-aided engineering (CAE) and computer-aided manufacturing software) to access multiple CAD systems through one programming interface. The interface is built on top of a distributed computing system called the Common Object Request Broker Architecture (CORBA). CORBA allows the CAD services software to operate in a distributed, heterogeneous computing environment.

  14. The MINERVA Software Development Process

    NASA Technical Reports Server (NTRS)

    Narkawicz, Anthony; Munoz, Cesar A.; Dutle, Aaron M.

    2017-01-01

    This paper presents a software development process for safety-critical software components of cyber-physical systems. The process is called MINERVA, which stands for Mirrored Implementation Numerically Evaluated against Rigorously Verified Algorithms. The process relies on formal methods for rigorously validating code against its requirements. The software development process uses: (1) a formal specification language for describing the algorithms and their functional requirements, (2) an interactive theorem prover for formally verifying the correctness of the algorithms, (3) test cases that stress the code, and (4) numerical evaluation on these test cases of both the algorithm specifications and their implementations in code. The MINERVA process is illustrated in this paper with an application to geo-containment algorithms for unmanned aircraft systems. These algorithms ensure that the position of an aircraft never leaves a predetermined polygon region and provide recovery maneuvers when the region is inadvertently exited.

  15. From Print to Pixels: Practitioners' Reflections on the Use of Qualitative Data Analysis Software.

    ERIC Educational Resources Information Center

    Gilbert, Linda S.

    This paper studied how individual qualitative researchers perceive that their research procedures and perspectives have been influenced by the adoption of computer-assisted qualitative data analysis software. The study focused on NUD*IST software (Non-numerical Unstructured Data Indexing, Searching, and Theorizing). The seven participants ranged from new…

  16. HPC Software Stack Testing Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garvey, Cormac

    The HPC Software stack testing framework (hpcswtest) is used in the INL Scientific Computing Department to test the basic sanity and integrity of the HPC software stack (compilers, MPI, numerical libraries, and applications) and to quickly discover hard failures; as a by-product, it also indirectly checks the HPC infrastructure (network, PBS, and licensing servers).

  17. Methods, Software and Tools for Three Numerical Applications. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    E. R. Jessup

    2000-03-01

    This is a report of the results of the authors' work supported by DOE contract DE-FG03-97ER25325. They proposed to study three numerical problems: (1) the extension of the PMESC parallel programming library; (2) the development of algorithms and software for certain generalized eigenvalue and singular value decomposition (SVD) problems; and (3) the application of techniques of linear algebra to an information retrieval technique known as latent semantic indexing (LSI).

  18. Software database creation for investment property measurement according to international standards

    NASA Astrophysics Data System (ADS)

    Ponomareva, S. V.; Merzliakova, N. A.

    2018-05-01

    The article deals with investment property measurement and accounting problems at the international, national, and enterprise levels. The need to create software for investment property measurement according to International Accounting Standards is substantiated, and the necessary software functions and processes are described.

  19. Evidence and diagnostic reporting in the IHE context.

    PubMed

    Loef, Cor; Truyen, Roel

    2005-05-01

    Capturing clinical observations and findings during the diagnostic imaging process is increasingly becoming a critical step in diagnostic reporting. Standards developers, notably HL7 and DICOM, are making significant progress toward standards that enable exchanging clinical observations and findings among the various information systems of the healthcare enterprise. DICOM, like the HL7 Clinical Document Architecture (CDA), uses templates and constrained, coded vocabulary (SNOMED, LOINC, etc.). Such a representation facilitates automated software recognition of findings and observations, intrapatient comparison, correlation to norms, and outcomes research. The scope of DICOM Structured Reporting (SR) includes many findings that products routinely create in digital form (measurements, computed estimates, etc.). In the Integrating the Healthcare Enterprise (IHE) framework, two Integration Profiles are defined for clinical data capture and diagnostic reporting: Evidence Document, and Simple Image and Numeric Report. This report describes these two DICOM SR-based integration profiles in the diagnostic reporting process.

  20. Standards for Environmental Measurement Using GIS: Toward a Protocol for Protocols.

    PubMed

    Forsyth, Ann; Schmitz, Kathryn H; Oakes, Michael; Zimmerman, Jason; Koepp, Joel

    2006-02-01

    Interdisciplinary research regarding how the built environment influences physical activity has recently increased. Many research projects conducted jointly by public health and environmental design professionals are using geographic information systems (GIS) to objectively measure the built environment. Numerous methodological issues remain, however, and environmental measurements have not been well documented with accepted, common definitions of valid, reliable variables. This paper proposes how to create and document standardized definitions for measures of environmental variables using GIS with the ultimate goal of developing reliable, valid measures. Inherent problems with software and data that hamper environmental measurement can be offset by protocols combining clear conceptual bases with detailed measurement instructions. Examples demonstrate how protocols can more clearly translate concepts into specific measurement. This paper provides a model for developing protocols to allow high quality comparative research on relationships between the environment and physical activity and other outcomes of public health interest.

  1. Development, testing, and numerical modeling of a foam sandwich biocomposite

    NASA Astrophysics Data System (ADS)

    Chachra, Ricky

    This study develops a novel sandwich composite material using plant-based materials for potential use in nonstructural building applications. The face sheets comprise woven hemp fabric and a sap-based epoxy, while the core comprises castor-oil-based foam with waste rice hulls as reinforcement. Mechanical properties of the individual materials are tested in uniaxial compression and tension for the foam and hemp, respectively. The sandwich composite is tested in three-point bending. Flexural results are compared to a finite element model developed in the commercial software Abaqus, and the validated model is then used to investigate alternate sandwich geometries. Sandwich model responses are compared to existing standards for nonstructural building panels, showing that the novel material is roughly half the strength of equally thick drywall. When space limitations are not an issue, a double-thickness sandwich biocomposite is found to be a structurally acceptable replacement for standard gypsum drywall.

  2. Value-Based Requirements Traceability: Lessons Learned

    NASA Astrophysics Data System (ADS)

    Egyed, Alexander; Grünbacher, Paul; Heindl, Matthias; Biffl, Stefan

    Traceability from requirements to code is mandated by numerous software development standards. These standards, however, are not explicit about the appropriate level of quality of trace links. From a technical perspective, trace quality should meet the needs of the intended trace utilizations. Unfortunately, long-term trace utilizations are typically unknown at the time of trace acquisition which represents a dilemma for many companies. This chapter suggests ways to balance the cost and benefits of requirements traceability. We present data from three case studies demonstrating that trace acquisition requires broad coverage but can tolerate imprecision. With this trade-off our lessons learned suggest a traceability strategy that (1) provides trace links more quickly, (2) refines trace links according to user-defined value considerations, and (3) supports the later refinement of trace links in case the initial value consideration has changed over time. The scope of our work considers the entire life cycle of traceability instead of just the creation of trace links.

  3. Survey of Software Assurance Techniques for Highly Reliable Systems

    NASA Technical Reports Server (NTRS)

    Nelson, Stacy

    2004-01-01

    This document provides a survey of software assurance techniques for highly reliable systems including a discussion of relevant safety standards for various industries in the United States and Europe, as well as examples of methods used during software development projects. It contains one section for each industry surveyed: Aerospace, Defense, Nuclear Power, Medical Devices and Transportation. Each section provides an overview of applicable standards and examples of a mission or software development project, software assurance techniques used and reliability achieved.

  4. Level 1 Processing of MODIS Direct Broadcast Data From Terra

    NASA Technical Reports Server (NTRS)

    Lynnes, Christopher; Smith, Peter; Shotland, Larry; El-Ghazawi, Tarek; Zhu, Ming

    2000-01-01

    In February 2000, an effort was begun to adapt the Moderate Resolution Imaging Spectroradiometer (MODIS) Level 1 production software to process direct broadcast data. Three Level 1 algorithms have been adapted and packaged for release: Level 1A converts raw (level 0) data into Hierarchical Data Format (HDF), unpacking packets into scans; Geolocation computes geographic information for the data points in the Level 1A; and the Level 1B computes geolocated, calibrated radiances from the Level 1A and Geolocation products. One useful aspect of adapting the production software is the ability to incorporate enhancements contributed by the MODIS Science Team. We have therefore tried to limit changes to the software. However, in order to process the data immediately on receipt, we have taken advantage of a branch in the geolocation software that reads orbit and altitude information from the packets themselves, rather than external ancillary files used in standard production. We have also verified that the algorithms can be run with smaller time increments (2.5 minutes) than the five-minute increments used in production. To make the code easier to build and run, we have simplified directories and build scripts. Also, dependencies on a commercial numerics library have been replaced by public domain software. A version of the adapted code has been released for Silicon Graphics machines running IRIX. Perhaps owing to its origin in production, the software is rather CPU-intensive. Consequently, a port to Linux is underway, followed by a version to run on PC clusters, with an eventual goal of running in near-real-time (i.e., process a ten-minute pass in ten minutes).

  5. DIATOM (Data Initialization and Modification) Library Version 7.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crawford, David A.; Schmitt, Robert G.; Hensinger, David M.

    DIATOM is a library that provides numerical simulation software with a computational geometry front end that can be used to build up complex problem geometries from collections of simpler shapes. The library provides a parser which allows for application-independent geometry descriptions to be embedded in simulation software input decks. Descriptions take the form of collections of primitive shapes and/or CAD input files and material properties that can be used to describe complex spatial and temporal distributions of numerical quantities (often called “database variables” or “fields”) to help define starting conditions for numerical simulations. The capability is designed to be general purpose, robust and computationally efficient. By using a combination of computational geometry and recursive divide-and-conquer approximation techniques, a wide range of primitive shapes are supported to arbitrary degrees of fidelity, controllable through user input and limited only by machine resources. Through the use of call-back functions, numerical simulation software can request the value of a field at any time or location in the problem domain. Typically, this is used only for defining initial conditions, but the capability is not limited to just that use. The most recent version of DIATOM provides the ability to import the solution field from one numerical solution as input for another.

  6. Scilab software as an alternative low-cost computing in solving the linear equations problem

    NASA Astrophysics Data System (ADS)

    Agus, Fahrul; Haviluddin

    2017-02-01

    Numerical computation packages are widely used in both teaching and research. These packages may be proprietary (licensed) or open-source. One reason to use such a package is the complexity of the mathematical functions involved (e.g., linear problems), together with the growing number of variables in linear and non-linear functions. The aim of this paper was to reflect on key aspects of method, didactics, and creative praxis in the teaching of linear equations in higher education; if implemented, this could contribute to better learning in mathematics (i.e., solving simultaneous linear equations), which is essential for future engineers. The focus of this study was to introduce the numerical computation package Scilab as an alternative low-cost computing platform. In this paper, Scilab was used to implement activities related to the mathematical models. In the experiment, four numerical methods were implemented: Gaussian elimination, Gauss-Jordan, matrix inversion, and lower-upper (LU) decomposition. The results of this study show that routines for these numerical methods were created and explored using Scilab procedures, and that these routines can serve as teaching material for a course.
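
    Although the paper works in Scilab, the same solution strategies have direct equivalents in other free numerical environments. A minimal sketch under that assumption follows, showing two of the four methods (matrix inversion and LU decomposition) in Python with NumPy/SciPy; the matrix and right-hand side are arbitrary examples, not data from the paper.

    ```python
    # A minimal sketch (not the paper's Scilab code): solving A x = b with two of
    # the four methods the abstract mentions, using NumPy/SciPy equivalents.
    import numpy as np
    from scipy.linalg import lu_factor, lu_solve

    A = np.array([[4.0, -2.0, 1.0],
                  [3.0,  6.0, -4.0],
                  [2.0,  1.0,  8.0]])
    b = np.array([1.0, 2.0, 3.0])

    # Method 1: explicit matrix inverse (simple to teach, numerically wasteful)
    x_inv = np.linalg.inv(A) @ b

    # Method 2: LU decomposition (the approach preferred in practice)
    lu, piv = lu_factor(A)
    x_lu = lu_solve((lu, piv), b)

    assert np.allclose(x_inv, x_lu)
    print(x_lu)
    ```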

  7. Numerical systems on a minicomputer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Jr., Roy Leonard

    1973-02-01

    This thesis defines the concept of a numerical system for a minicomputer and provides a description of the software and computer system configuration necessary to implement such a system. A procedure for creating a numerical system from a FORTRAN program is developed and an example is presented.

  8. 25 CFR 543.7 - What are the minimum internal control standards for bingo?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... software upgrades, data storage media replacement, etc.). The information recorded must be used when...., draw objects and back-up draw objects); and (ii) Random number generator software. (Additional information technology security standards can be found in § 543.16 of this part.) (2) The game software...

  9. 25 CFR 543.7 - What are the minimum internal control standards for bingo?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... software upgrades, data storage media replacement, etc.). The information recorded must be used when...., draw objects and back-up draw objects); and (ii) Random number generator software. (Additional information technology security standards can be found in § 543.16 of this part.) (2) The game software...

  10. Future of Software Engineering Standards

    NASA Technical Reports Server (NTRS)

    Poon, Peter T.

    1997-01-01

    In the new millennium, software engineering standards are expected to continue to influence the process of producing software-intensive systems which are cost-effective and of high quality. These systems may range from ground and flight systems used for planetary exploration to educational support systems used in schools as well as consumer-oriented systems.

  11. Evolution of Secondary Software Businesses: Understanding Industry Dynamics

    NASA Astrophysics Data System (ADS)

    Tyrväinen, Pasi; Warsta, Juhani; Seppänen, Veikko

    The primary software industry originates from IBM's decision to unbundle software-related computer system development activities to external partners. This kind of outsourcing of an enterprise's internal software development activity is a common means to start a new software business serving a vertical software market; it combines knowledge of the vertical market process with competence in software development. In this research, we present and analyze the key figures of the Finnish secondary software industry in order to quantify its interaction with the primary software industry during the period 2000-2003. On the basis of the empirical data, we present a model for the evolution of a secondary software business which makes the industry dynamics explicit. It represents the shift from internal software developed for competitive advantage to development of products supporting standard business processes on top of standardized technologies. We also discuss the implications for software business strategies in each phase.

  12. Free and open-source software application for the evaluation of coronary computed tomography angiography images.

    PubMed

    Hadlich, Marcelo Souza; Oliveira, Gláucia Maria Moraes; Feijóo, Raúl A; Azevedo, Clerio F; Tura, Bernardo Rangel; Ziemer, Paulo Gustavo Portela; Blanco, Pablo Javier; Pina, Gustavo; Meira, Márcio; Souza e Silva, Nelson Albuquerque de

    2012-10-01

    Images used in medicine were standardized in 1993 through the DICOM (Digital Imaging and Communications in Medicine) standard. Many examinations use this standard, and it is increasingly necessary to design software applications capable of handling this type of image; however, these software applications are not usually free and open-source, and this fact hinders their adaptation to the most diverse interests. The objective was to develop and validate a free and open-source software application capable of handling DICOM coronary computed tomography angiography images. We developed and tested the ImageLab software in the evaluation of 100 tests randomly selected from a database. We carried out 600 tests divided between two observers using ImageLab and another software application sold with Philips Brilliance computed tomography scanners in the evaluation of coronary lesions and plaques around the left main coronary artery (LMCA) and the anterior descending artery (ADA). To evaluate intraobserver, interobserver, and intersoftware agreement, we used simple agreement and kappa statistics. The agreements observed between software applications were generally classified as substantial or almost perfect in most comparisons. The ImageLab software agreed with the Philips software in the evaluation of coronary computed tomography angiography tests, especially in patients without lesions, with lesions < 50% in the LMCA and < 70% in the ADA. The agreement for lesions > 70% in the ADA was lower, but this is also observed when the anatomical reference standard is used.
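
    For readers unfamiliar with the agreement statistics cited, a tiny sketch of computing Cohen's kappa for two observers follows; the gradings below are hypothetical and are not data from the study.

    ```python
    # A small sketch (hypothetical data): measuring interobserver agreement with
    # Cohen's kappa, the statistic the abstract reports.
    from sklearn.metrics import cohen_kappa_score

    # Hypothetical lesion gradings from two observers (0 = none, 1 = <50%, 2 = >=50%)
    obs1 = [0, 1, 2, 0, 1, 1, 2, 0, 0, 1]
    obs2 = [0, 1, 2, 0, 2, 1, 2, 0, 1, 1]

    kappa = cohen_kappa_score(obs1, obs2)
    print(f"kappa = {kappa:.2f}")  # values above 0.61 are often read as 'substantial'
    ```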

  13. Safe Software for Space Applications: Building on the DO-178 Experience

    NASA Astrophysics Data System (ADS)

    Dorsey, Cheryl A.; Dorsey, Timothy A.

    2013-09-01

    DO-178, Software Considerations in Airborne Systems and Equipment Certification, is the well-known international standard dealing with the assurance of software used in airborne systems [1,2]. Insights into the DO-178 experiences, strengths and weaknesses can benefit the international space community. As DO-178 is an excellent standard for safe software development when used appropriately, this paper provides lessons learned and suggestions for using it effectively.

  14. FAILSAFE Health Management for Embedded Systems

    NASA Technical Reports Server (NTRS)

    Horvath, Gregory A.; Wagner, David A.; Wen, Hui Ying; Barry, Matthew

    2010-01-01

    The FAILSAFE project is developing concepts and prototype implementations for software health management in mission-critical, real-time embedded systems. The project unites features of the industry-standard ARINC 653 Avionics Application Software Standard Interface and JPL's Mission Data System (MDS) technology. The ARINC 653 standard establishes requirements for the services provided by partitioned, real-time operating systems. The MDS technology provides a state analysis method, canonical architecture, and software framework that facilitates the design and implementation of software-intensive complex systems. The MDS technology has been used to provide the health management function for an ARINC 653 application implementation. In particular, the focus is on showing how this combination enables reasoning about, and recovering from, application software problems.

  15. The microcomputer scientific software series 1: the numerical information manipulation system.

    Treesearch

    Harold M. Rauscher

    1983-01-01

    The Numerical Information Manipulation System extends the versatility provided by word processing systems for textual data manipulation to mathematical or statistical data in numeric matrix form. Numeric data, stored and processed in the matrix form, may be manipulated in a wide variety of ways. The system allows operations on single elements, entire rows, or columns...

  16. Searching for Physics Beyond the Standard Model: Strongly-Coupled Field Theories at the Intensity and Energy Frontiers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brower, Richard C.

    This proposal is to develop the software and algorithmic infrastructure needed for the numerical study of quantum chromodynamics (QCD), and of theories that have been proposed to describe physics beyond the Standard Model (BSM) of high energy physics, on current and future computers. This infrastructure will enable users (1) to improve the accuracy of QCD calculations to the point where they no longer limit what can be learned from high-precision experiments that seek to test the Standard Model, and (2) to determine the predictions of BSM theories in order to understand which of them are consistent with the data that will soon be available from the LHC. Work will include the extension and optimizations of community codes for the next generation of leadership class computers, the IBM Blue Gene/Q and the Cray XE/XK, and for the dedicated hardware funded for our field by the Department of Energy. Members of our collaboration at Brookhaven National Laboratory and Columbia University worked on the design of the Blue Gene/Q, and have begun to develop software for it. Under this grant we will build upon their experience to produce high-efficiency production codes for this machine. Cray XE/XK computers with many thousands of GPU accelerators will soon be available, and the dedicated commodity clusters we obtain with DOE funding include growing numbers of GPUs. We will work with our partners in NVIDIA's Emerging Technology group to scale our existing software to thousands of GPUs, and to produce highly efficient production codes for these machines. Work under this grant will also include the development of new algorithms for the effective use of heterogeneous computers, and their integration into our codes. It will include improvements of Krylov solvers and the development of new multigrid methods in collaboration with members of the FASTMath SciDAC Institute, using their HYPRE framework, as well as work on improved symplectic integrators.

  17. Implicitly restarted Arnoldi/Lanczos methods for large scale eigenvalue calculations

    NASA Technical Reports Server (NTRS)

    Sorensen, Danny C.

    1996-01-01

    Eigenvalues and eigenfunctions of linear operators are important to many areas of applied mathematics. The ability to approximate these quantities numerically is becoming increasingly important in a wide variety of applications. This increasing demand has fueled interest in the development of new methods and software for the numerical solution of large-scale algebraic eigenvalue problems. In turn, the existence of these new methods and software, along with the dramatically increased computational capabilities now available, has enabled the solution of problems that would not even have been posed five or ten years ago. Until very recently, software for large-scale nonsymmetric problems was virtually non-existent. Fortunately, the situation is improving rapidly. The purpose of this article is to provide an overview of the numerical solution of large-scale algebraic eigenvalue problems. The focus will be on a class of methods called Krylov subspace projection methods. The well-known Lanczos method is the premier member of this class. The Arnoldi method generalizes the Lanczos method to the nonsymmetric case. A recently developed variant of the Arnoldi/Lanczos scheme called the Implicitly Restarted Arnoldi Method is presented here in some depth. This method is highlighted because of its suitability as a basis for software development.
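
    The Implicitly Restarted Arnoldi Method described here underlies the ARPACK library, which SciPy exposes as scipy.sparse.linalg.eigs. A brief usage sketch follows; the sparse test matrix is made up for illustration.

    ```python
    # A brief sketch: ARPACK, used via SciPy below, implements the Implicitly
    # Restarted Arnoldi Method the article describes (Lanczos in the symmetric case).
    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import eigs

    n = 1000
    # A large sparse nonsymmetric matrix (random example for illustration)
    A = sp.random(n, n, density=1e-3, format='csr', random_state=0) + sp.eye(n)

    # Compute the 6 eigenvalues of largest magnitude without ever forming dense A
    vals, vecs = eigs(A, k=6, which='LM')
    print(np.sort_complex(vals))
    ```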

  18. Numerical modeling tools for chemical vapor deposition

    NASA Technical Reports Server (NTRS)

    Jasinski, Thomas J.; Childs, Edward P.

    1992-01-01

    Development of general numerical simulation tools for chemical vapor deposition (CVD) was the objective of this study. Physical models of important CVD phenomena were developed and implemented into the commercial computational fluid dynamics software FLUENT. The resulting software can address general geometries as well as the most important phenomena occurring within CVD reactors: fluid flow patterns, temperature and chemical species distribution, gas phase and surface deposition. The available physical models are documented, and examples of CVD simulation capabilities are provided.

  19. SAO mission support software and data standards, version 1.0

    NASA Technical Reports Server (NTRS)

    Hsieh, P.

    1993-01-01

    This document defines the software developed by the SAO AXAF Mission Support (MS) Program and defines standards for the software development process and control of data products generated by the software. The SAO MS is tasked to develop and use software to perform a variety of functions in support of the AXAF mission. Software is developed by software engineers and scientists, and commercial off-the-shelf (COTS) software is used either directly or customized through the use of scripts to implement analysis procedures. Software controls real-time laboratory instruments, performs data archiving, displays data, and generates model predictions. Much software is used in the analysis of data to generate data products that are required by the AXAF project, for example, on-orbit mirror performance predictions or detailed characterization of the mirror reflection performance as a function of energy.

  20. Nasa-wide Standard Administrative Systems

    NASA Technical Reports Server (NTRS)

    Schneck, P.

    1984-01-01

    Factors to be considered in developing agency-wide standard administrative systems for NASA include uniformity of hardware and software; centralization vs. decentralization; risk exposure; and models for software development.

  1. Some Methods of Applied Numerical Analysis to 3D Facial Reconstruction Software

    NASA Astrophysics Data System (ADS)

    Roşu, Şerban; Ianeş, Emilia; Roşu, Doina

    2010-09-01

    This paper deals with the collective work performed by medical doctors from the University of Medicine and Pharmacy Timisoara and engineers from the Politechnical Institute Timisoara in the effort to create the first Romanian 3D reconstruction software based on CT or MRI scans and to test the created software in clinical practice.

  2. Non-Classroom Use of "Presentation Software" in Accelerated Classes: Student Use and Perceptions of Value

    ERIC Educational Resources Information Center

    Davies, Thomas; Korte, Leon; Cornelsen, Erin

    2016-01-01

    Numerous articles found in education literature discuss the advantages and disadvantages of using "presentation" software to deliver critical course content to students. Frequently the perceived value of the use of software such as PowerPoint is dependent upon how it is used, for instance, the extent to which bells and whistles are…

  3. Guidance and Control Software Project Data - Volume 1: Planning Documents

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the planning documents from the GCS project. Volume 1 contains five appendices: A. Plan for Software Aspects of Certification for the Guidance and Control Software Project; B. Software Development Standards for the Guidance and Control Software Project; C. Software Verification Plan for the Guidance and Control Software Project; D. Software Configuration Management Plan for the Guidance and Control Software Project; and E. Software Quality Assurance Activities.

  4. Three Years of Global Positioning System Experience on International Space Station

    NASA Technical Reports Server (NTRS)

    Gomez, Susan

    2006-01-01

    The International Space Station global positioning system (GPS) receiver was activated in April 2002. Since that time, numerous software anomalies surfaced that had to be worked around. Some of the software problems required waivers, such as the time function, while others required extensive operator intervention, such as numerous power cycles. Eventually enough anomalies surfaced that the three pieces of code included in the GPS unit have been re-written and the GPS units upgraded. The technical aspects of the problems are discussed, as well as the underlying causes that led to the delivery of a product that has had so many problems. The technical aspects of the problems included physical phenomena that were not well understood, such as the effect that the ionosphere would have on the GPS measurements. The underlying causes were traced to inappropriate use of legacy software, changing requirements, inadequate software processes, unrealistic schedules, incorrect contract type, and unclear ownership responsibilities.

  5. Software architecture standard for simulation virtual machine, version 2.0

    NASA Technical Reports Server (NTRS)

    Sturtevant, Robert; Wessale, William

    1994-01-01

    The Simulation Virtual Machine (SVM) is an Ada architecture which eases the effort involved in real-time software maintenance and sustaining engineering. The Software Architecture Standard defines the infrastructure from which all the simulation models are built. SVM was developed for and used in the Space Station Verification and Training Facility.

  6. 78 FR 22432 - Airworthiness Directives; Airbus Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-16

    ... electrical rudder], through Airbus Service Bulletin (SB) A330-27-3176, --software standard P12A/M21A on FCPC.... Since we issued that AD, we have determined that new software standards for the flight control primary.... This proposed AD would require that operators modify or replace all three FCPCs with new software...

  7. Transient loads analysis for space flight applications

    NASA Technical Reports Server (NTRS)

    Thampi, S. K.; Vidyasagar, N. S.; Ganesan, N.

    1992-01-01

    A significant part of the flight readiness verification process involves transient analysis of the coupled Shuttle-payload system to determine the low frequency transient loads. This paper describes a methodology for transient loads analysis and its implementation for the Spacelab Life Sciences Mission. The analysis is carried out using two major software tools: NASTRAN and an external FORTRAN code called EZTRAN. This approach is adopted to overcome some of the limitations of NASTRAN's standard transient analysis capabilities. The method uses Data Recovery Matrices (DRM) to improve computational efficiency. The mode acceleration method is fully implemented in the DRM formulation to recover accurate displacements, stresses, and forces. The advantages of the method are demonstrated through a numerical example.

  8. Close to real life. [solving for transonic flow about lifting airfoils using supercomputers

    NASA Technical Reports Server (NTRS)

    Peterson, Victor L.; Bailey, F. Ron

    1988-01-01

    NASA's Numerical Aerodynamic Simulation (NAS) facility for CFD modeling of highly complex aerodynamic flows employs as its basic hardware two Cray-2s, an ETA-10 Model Q, an Amdahl 5880 mainframe computer that furnishes both support processing and access to 300 Gbytes of disk storage, several minicomputers and superminicomputers, and a Thinking Machines 16,000-device 'Connection Machine' processor. NAS, which was the first supercomputer facility to standardize operating-system and communication software on all processors, has done important Space Shuttle aerodynamics simulations and will be critical to the configurational refinement of the National Aerospace Plane and its integrated powerplant, which will involve complex, high-temperature reactive gasdynamic computations.

  9. DICOM image quantification secondary capture (DICOM IQSC) integrated with numeric results, regions, and curves: implementation and applications in nuclear medicine

    NASA Astrophysics Data System (ADS)

    Cao, Xinhua; Xu, Xiaoyin; Voss, Stephan

    2017-03-01

    In this paper, we describe an enhanced DICOM Secondary Capture (SC) that integrates Image Quantification (IQ) results, Regions of Interest (ROIs), and Time Activity Curves (TACs) with screen shots by embedding extra medical imaging information into a standard DICOM header. A software toolkit, DICOM IQSC, has been developed to implement the SC-centered information integration of quantitative analysis for the routine practice of nuclear medicine. Preliminary experiments show that the DICOM IQSC method is simple and easy to implement, seamlessly integrating post-processing workstations with PACS for archiving and retrieving IQ information. Additional DICOM IQSC applications in routine nuclear medicine and clinical research are also discussed.
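
    A rough sketch of the general idea follows, embedding quantification results into a Secondary Capture header as private tags using pydicom. The tag layout, file names, and stored values are assumptions for illustration, not the authors' toolkit.

    ```python
    # A rough sketch (assumed tag layout, not the authors' toolkit): embedding
    # quantification results into a DICOM Secondary Capture header with pydicom.
    import pydicom
    from pydicom.uid import generate_uid

    ds = pydicom.dcmread("screenshot_sc.dcm")  # a hypothetical Secondary Capture file

    # Reserve a private block and store numeric results as private tags
    block = ds.private_block(0x0011, "IQSC example", create=True)
    block.add_new(0x01, "LO", "ROI1 mean=4.2 SUV")  # hypothetical ROI statistic
    block.add_new(0x02, "LO", "TAC=[1.0,2.3,3.1]")  # hypothetical time-activity curve

    ds.SOPInstanceUID = generate_uid()  # new instance UID, since content changed
    ds.save_as("screenshot_sc_iq.dcm")
    ```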

  10. GODIVA2: interactive visualization of environmental data on the Web.

    PubMed

    Blower, J D; Haines, K; Santokhee, A; Liu, C L

    2009-03-13

    GODIVA2 is a dynamic website that provides visual access to several terabytes of physically distributed, four-dimensional environmental data. It allows users to explore large datasets interactively without the need to install new software or download and understand complex data. Through the use of open international standards, GODIVA2 maintains a high level of interoperability with third-party systems, allowing diverse datasets to be mutually compared. Scientists can use the system to search for features in large datasets and to diagnose the output from numerical simulations and data processing algorithms. Data providers around Europe have adopted GODIVA2 as an INSPIRE-compliant dynamic quick-view system for providing visual access to their data.

  11. ICESat (GLAS) Science Processing Software Document Series. Volume 2; Science Data Management Plan; 4.0

    NASA Technical Reports Server (NTRS)

    Jester, Peggy L.; Hancock, David W., III

    1999-01-01

    This document provides the Data Management Plan for the GLAS Standard Data Software (SDS) supporting the GLAS instrument of the EOS ICESat Spacecraft. The SDS encompasses the ICESat Science Investigator-led Processing System (I-SIPS) Software and the Instrument Support Facility (ISF) Software. This Plan addresses the identification, authority, and description of the interface nodes associated with the GLAS Standard Data Products and the GLAS Ancillary Data.

  12. ICESat (GLAS) Science Processing Software Document Series. Volume 3; GLAS Science Software Requirements Document; Ver 2.1

    NASA Technical Reports Server (NTRS)

    Jester, Peggy L.; Lee, Jeffrey; Zukor, Dorothy J. (Technical Monitor)

    2001-01-01

    This document addresses the software requirements of the Geoscience Laser Altimeter System (GLAS) Standard Data Software (SDS) supporting the GLAS instrument on the EOS ICESat Spacecraft. This Software Requirements Document represents the initial collection of the technical engineering information for the GLAS SDS. This information is detailed within the second of four main volumes of the Standard documentation, the Product Specification volume. This document is a "roll-out" from the governing volume outline containing the Concept and Requirements sections.

  13. Traceability of Software Safety Requirements in Legacy Safety Critical Systems

    NASA Technical Reports Server (NTRS)

    Hill, Janice L.

    2007-01-01

    How can traceability of software safety requirements be created for legacy safety critical systems? Requirements in safety standards are imposed most times during contract negotiations. On the other hand, there are instances where safety standards are levied on legacy safety critical systems, some of which may be considered for reuse for new applications. Safety standards often specify that software development documentation include process-oriented and technical safety requirements, and also require that system and software safety analyses are performed supporting technical safety requirements implementation. So what can be done if the requisite documents for establishing and maintaining safety requirements traceability are not available?

  14. Benchmarking a Visual-Basic based multi-component one-dimensional reactive transport modeling tool

    NASA Astrophysics Data System (ADS)

    Torlapati, Jagadish; Prabhakar Clement, T.

    2013-01-01

    We present the details of a comprehensive numerical modeling tool, RT1D, which can be used for simulating biochemical and geochemical reactive transport problems. The code can be run within the standard Microsoft EXCEL Visual Basic platform, and it does not require any additional software tools. The code can be easily adapted by others for simulating different types of laboratory-scale reactive transport experiments. We illustrate the capabilities of the tool by solving five benchmark problems with varying levels of reaction complexity. These literature-derived benchmarks are used to highlight the versatility of the code for solving a variety of practical reactive transport problems. The benchmarks are described in detail to provide a comprehensive database, which can be used by model developers to test other numerical codes. The VBA code presented in the study is a practical tool that can be used by laboratory researchers for analyzing both batch and column datasets within an EXCEL platform.
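
    As a flavor of the class of problems such a tool solves, here is a compact sketch (in Python rather than the paper's VBA, and not the RT1D code) of an explicit finite-difference solution of one-dimensional advection-dispersion with first-order decay; all parameter values are arbitrary.

    ```python
    # A compact sketch (not the RT1D/VBA code): explicit finite differences for
    # 1-D advection-dispersion with first-order decay, the kind of problem the
    # tool solves. All parameters are illustrative.
    import numpy as np

    L, nx = 1.0, 101           # column length [m], grid points
    v, D, k = 0.5, 1e-3, 0.1   # velocity [m/d], dispersion [m^2/d], decay rate [1/d]
    dx = L / (nx - 1)
    dt = 0.4 * min(dx / v, dx**2 / (2 * D))  # stability-limited time step

    c = np.zeros(nx)
    c[0] = 1.0  # constant-concentration inlet boundary
    for _ in range(int(2.0 / dt)):  # simulate 2 days
        adv = -v * (c[1:-1] - c[:-2]) / dx                  # upwind advection
        disp = D * (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2   # central dispersion
        c[1:-1] += dt * (adv + disp - k * c[1:-1])
        c[-1] = c[-2]  # zero-gradient outlet
    print(c[::10].round(3))
    ```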

  15. Occupational exposure of personnel operating military radio equipment: measurements and simulation.

    PubMed

    Paljanos, Annamaria; Miclaus, Simona; Munteanu, Calin

    2015-09-01

    Technical literature provides numerous studies concerning radiofrequency exposure measurements for various radio communication devices, but there are few studies related to the exposure of personnel operating military radio equipment. In order to evaluate exposure and identify cases when safety requirements are not entirely met, both measurements and simulations are needed for accurate results. Moreover, given the technical characteristics of the radio devices used in the military, personnel mainly operate in the near-field region, so both measurements and simulations become more complex. Measurements were made in situ using a broadband personal exposimeter equipped with two isotropic probes for both the electric and magnetic components of the field. The experiment was designed for three different operating frequencies of the same radio equipment, while simulations were made in the FEKO software using hybrid numerical methods to solve complex electromagnetic field problems. The paper discusses the comparative results of the measurements and simulations, and compares them with the reference levels specified in military and civilian radiofrequency exposure standards.

  16. Automatic Parallelization of Numerical Python Applications using the Global Arrays Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daily, Jeffrey A.; Lewis, Robert R.

    2011-11-30

    Global Arrays is a software system from Pacific Northwest National Laboratory that enables an efficient, portable, and parallel shared-memory programming interface to manipulate distributed dense arrays. The NumPy module is the de facto standard for numerical calculation in the Python programming language, a language whose use is growing rapidly in the scientific and engineering communities. NumPy provides a powerful N-dimensional array class as well as other scientific computing capabilities. However, like the majority of the core Python modules, NumPy is inherently serial. Using a combination of Global Arrays and NumPy, we have reimplemented NumPy as a distributed drop-in replacement called Global Arrays in NumPy (GAiN). Serial NumPy applications can become parallel, scalable GAiN applications with only minor source code changes. Scalability studies of several different GAiN applications will be presented showing the utility of developing serial NumPy codes which can later run on more capable clusters or supercomputers.

  17. An Implicit Algorithm for the Numerical Simulation of Shape-Memory Alloys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Becker, R; Stolken, J; Jannetti, C

    Shape-memory alloys (SMA) have the potential to be used in a variety of interesting applications due to their unique properties of pseudoelasticity and the shape-memory effect. However, in order to design SMA devices efficiently, a physics-based constitutive model is required to accurately simulate the behavior of shape-memory alloys. The scope of this work is to extend the numerical capabilities of the SMA constitutive model developed by Jannetti et al. (2003) to handle large-scale polycrystalline simulations. The constitutive model is implemented within the finite-element software ABAQUS/Standard using a user-defined material subroutine, or UMAT. To improve the efficiency of the numerical simulations, so that polycrystalline specimens of shape-memory alloys can be modeled, a fully implicit algorithm has been implemented to integrate the constitutive equations. Using an implicit integration scheme increases the efficiency of the UMAT over the previously implemented explicit integration method by a factor of more than 100 for single crystal simulations.

  18. A parametric simulation of solar chimney power plant

    NASA Astrophysics Data System (ADS)

    Beng Hooi, Lim; Kannan Thangavelu, Saravana

    2018-01-01

    Strong solar radiation, a continuous supply of sunlight, and environmental friendliness have made the solar chimney power plant highly feasible to build in Malaysia. A solar chimney power plant produces an upward buoyancy force through the greenhouse effect. Numerical simulation was performed on the model of a solar chimney power plant using the ANSYS Fluent software, applying the standard k-epsilon turbulence model and the discrete ordinates (DO) radiation model to solve the relevant equations. A parametric study was carried out to evaluate the performance of the solar chimney power plant, focusing on the temperature rise in the collector, the air velocity at the chimney base, and the pressure drop inside the chimney, based on the computed temperature, velocity, and static pressure distributions. The model's reliability is demonstrated by comparison with the experimental data of the Manzanares (Spain) prototype. Based on the numerical results, power capacity and efficiency were analysed theoretically. Results indicate that stronger solar radiation and a larger prototype improve the performance of a solar chimney power plant.

  19. Stability analysis of nonlinear Roesser-type two-dimensional systems via a homogenous polynomial technique

    NASA Astrophysics Data System (ADS)

    Zhang, Tie-Yan; Zhao, Yan; Xie, Xiang-Peng

    2012-12-01

    This paper is concerned with the problem of stability analysis of nonlinear Roesser-type two-dimensional (2D) systems. Firstly, the fuzzy modeling method for the usual one-dimensional (1D) systems is extended to the 2D case so that the underlying nonlinear 2D system can be represented by the 2D Takagi-Sugeno (TS) fuzzy model, which is convenient for implementing the stability analysis. Secondly, a new kind of fuzzy Lyapunov function, which is homogeneously polynomially parameter-dependent on the fuzzy membership functions, is developed to derive less conservative stability conditions for the TS Roesser-type 2D system. In the process of stability analysis, the obtained stability conditions approach exactness in the sense of convergence by applying some novel relaxed techniques. Moreover, the obtained result is formulated in the form of linear matrix inequalities, which can be easily solved via standard numerical software. Finally, a numerical example is also given to demonstrate the effectiveness of the proposed approach.
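
    To illustrate the last point, LMI feasibility problems of this kind can indeed be handed to off-the-shelf solvers. A generic sketch follows (a simple 1D continuous-time Lyapunov inequality in CVXPY, not the paper's 2D Roesser conditions); the test matrix is invented.

    ```python
    # A generic illustration (a 1-D Lyapunov LMI, not the paper's 2-D conditions):
    # stability conditions cast as linear matrix inequalities can be checked with
    # standard numerical software such as CVXPY.
    import cvxpy as cp
    import numpy as np

    A = np.array([[0.0,  1.0],
                  [-2.0, -3.0]])  # a stable test matrix (illustrative)

    P = cp.Variable((2, 2), symmetric=True)
    eps = 1e-6
    constraints = [P >> eps * np.eye(2),
                   A.T @ P + P @ A << -eps * np.eye(2)]  # Lyapunov inequality

    prob = cp.Problem(cp.Minimize(0), constraints)  # pure feasibility problem
    prob.solve(solver=cp.SCS)
    print(prob.status)  # 'optimal' => a Lyapunov certificate P exists
    ```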

  20. Algorithm-Based Fault Tolerance for Numerical Subroutines

    NASA Technical Reports Server (NTRS)

    Tumon, Michael; Granat, Robert; Lou, John

    2007-01-01

    A software library implements a new methodology of detecting faults in numerical subroutines, thus enabling application programs that contain the subroutines to recover transparently from single-event upsets. The software library in question is fault-detecting middleware that is wrapped around the numerical subroutines. Conventional serial versions (based on LAPACK and FFTW) and a parallel version (based on ScaLAPACK) exist. The source code of the application program that contains the numerical subroutines is not modified, and the middleware is transparent to the user. The methodology used is a type of algorithm-based fault tolerance (ABFT). In ABFT, a checksum is computed before a computation and compared with the checksum of the computational result; an error is declared if the difference between the checksums exceeds some threshold. Novel normalization methods are used in the checksum comparison to ensure correct fault detections independent of algorithm inputs. In tests of this software reported in the peer-reviewed literature, this library was shown to enable detection of 99.9 percent of significant faults while generating no false alarms.
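
    A toy sketch of the ABFT idea for matrix multiplication follows, with a checksum identity checked after the operation; the normalization and threshold here are simple illustrative stand-ins for the library's novel methods.

    ```python
    # A toy sketch of algorithm-based fault tolerance (ABFT) for matrix multiply:
    # checksums computed before the operation are compared against checksums of
    # the result, as the abstract describes (threshold is illustrative).
    import numpy as np

    def checked_matmul(A, B, tol=1e-8):
        C = A @ B
        # Column-checksum identity: ones^T (A B) == (ones^T A) B.
        # The right-hand side uses a different computational path, so a fault
        # corrupting C is very likely to break the equality.
        expected = (np.ones(A.shape[0]) @ A) @ B
        actual = np.ones(C.shape[0]) @ C
        # Normalize so the test is independent of input magnitude
        err = np.linalg.norm(actual - expected) / max(1.0, np.linalg.norm(expected))
        if err > tol:
            raise RuntimeError("ABFT checksum mismatch: possible single-event upset")
        return C

    A, B = np.random.rand(50, 40), np.random.rand(40, 30)
    C = checked_matmul(A, B)
    ```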

  1. Information system life-cycle and documentation standards, volume 1

    NASA Technical Reports Server (NTRS)

    Callender, E. David; Steinbacher, Jody

    1989-01-01

    The Software Management and Assurance Program (SMAP) Information System Life-Cycle and Documentation Standards Document describes the Version 4 standard information system life-cycle in terms of processes, products, and reviews. The description of the products includes detailed documentation standards. The standards in this document set can be applied to the life-cycle, i.e., to each phase in the system's development, and to the documentation of all NASA information systems. This provides consistency across the agency as well as visibility into the completeness of the information recorded. An information system is software-intensive, but consists of any combination of software, hardware, and operational procedures required to process, store, or transmit data. This document defines a standard life-cycle model and content for associated documentation.

  2. Low cycle fatigue numerical estimation of a high pressure turbine disc for the AL-31F jet engine

    NASA Astrophysics Data System (ADS)

    Spodniak, Miroslav; Klimko, Marek; Hocko, Marián; Žitek, Pavel

    This article describes an approximate numerical approach to estimating the low cycle fatigue of a high pressure turbine disc for the AL-31F turbofan jet engine. The numerical estimation is based on the finite element method carried out in the SolidWorks software. The low cycle fatigue assessment of the high pressure turbine disc was carried out on the basis of the dimensional, shape, and material characteristics available for the particular high pressure engine turbine. The method described here enables relatively fast, economically feasible estimation of the low cycle fatigue of the assessed high pressure turbine disc using commercially available software. The accuracy of the numerical low cycle fatigue estimate depends on the accuracy of the required input data for the particular investigated object.

  3. VAVUQ, Python and Matlab freeware for Verification and Validation, Uncertainty Quantification

    NASA Astrophysics Data System (ADS)

    Courtney, J. E.; Zamani, K.; Bombardelli, F. A.; Fleenor, W. E.

    2015-12-01

    A package of scripts is presented for automated Verification and Validation (V&V) and Uncertainty Quantification (UQ) for engineering codes that approximate Partial Differential Equations (PDEs). The code post-processes model results to produce V&V and UQ information. This information can be used to assess model performance. Automated information on code performance can allow for a systematic methodology to assess the quality of model approximations. The software implements common and accepted code verification schemes. The software uses the Method of Manufactured Solutions (MMS), the Method of Exact Solution (MES), Cross-Code Verification, and Richardson Extrapolation (RE) for solution (calculation) verification. It also includes common statistical measures that can be used for model skill assessment. Complete RE can be conducted for complex geometries by implementing high-order non-oscillating numerical interpolation schemes within the software. Model approximation uncertainty is quantified by calculating lower and upper bounds of numerical error from the RE results. The software is also able to calculate the Grid Convergence Index (GCI), and to handle adaptive meshes and models that implement mixed order schemes. Four examples are provided to demonstrate the use of the software for code and solution verification, model validation and uncertainty quantification. The software is used for code verification of a mixed-order compact difference heat transport solver; the solution verification of a 2D shallow-water-wave solver for tidal flow modeling in estuaries; the model validation of a two-phase flow computation in a hydraulic jump compared to experimental data; and numerical uncertainty quantification for 3D CFD modeling of the flow patterns in a Gust erosion chamber.
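
    A short sketch of the grid-refinement arithmetic such a package automates follows, using the widely cited Roache procedure for observed order of accuracy, GCI, and Richardson extrapolation; the three solution values are invented for the example.

    ```python
    # A short sketch of grid-refinement analysis of the kind VAVUQ automates:
    # observed order of accuracy, GCI, and a Richardson-extrapolated value from
    # three systematically refined grids (data are made up; Fs is the customary
    # safety factor for three-grid studies).
    import math

    f1, f2, f3 = 0.9713, 0.9704, 0.9668  # fine, medium, coarse solutions (hypothetical)
    r = 2.0                               # grid refinement ratio
    Fs = 1.25                             # safety factor

    p = math.log((f3 - f2) / (f2 - f1)) / math.log(r)  # observed order of accuracy
    e21 = abs((f2 - f1) / f1)                          # relative error, fine pair
    gci_fine = Fs * e21 / (r**p - 1.0)                 # GCI on the fine grid

    f_exact = f1 + (f1 - f2) / (r**p - 1.0)            # Richardson-extrapolated value
    print(f"p = {p:.2f}, GCI = {100 * gci_fine:.3f}%, f_RE = {f_exact:.5f}")
    ```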

  4. Analytical validation of an explicit finite element model of a rolling element bearing with a localised line spall

    NASA Astrophysics Data System (ADS)

    Singh, Sarabjeet; Howard, Carl Q.; Hansen, Colin H.; Köpke, Uwe G.

    2018-03-01

    In this paper, numerically modelled vibration response of a rolling element bearing with a localised outer raceway line spall is presented. The results were obtained from a finite element (FE) model of the defective bearing solved using an explicit dynamics FE software package, LS-DYNA. Time domain vibration signals of the bearing obtained directly from the FE modelling were processed further to estimate time-frequency and frequency domain results, such as spectrogram and power spectrum, using standard signal processing techniques pertinent to the vibration-based monitoring of rolling element bearings. A logical approach to analyses of the numerically modelled results was developed with an aim to presenting the analytical validation of the modelled results. While the time and frequency domain analyses of the results show that the FE model generates accurate bearing kinematics and defect frequencies, the time-frequency analysis highlights the simulation of distinct low- and high-frequency characteristic vibration signals associated with the unloading and reloading of the rolling elements as they move in and out of the defect, respectively. Favourable agreement of the numerical and analytical results demonstrates the validation of the results from the explicit FE modelling of the bearing.
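
    The signal-processing steps named above (spectrogram and power spectrum) can be reproduced with standard tools. An illustrative fragment on a synthetic impulse-train signal follows; it is not the paper's FE output, and all signal parameters are invented.

    ```python
    # An illustrative fragment (synthetic signal, not the paper's FE output): the
    # standard processing steps the authors mention, a power spectrum and a
    # spectrogram, applied to a vibration-like time series with scipy.signal.
    import numpy as np
    from scipy import signal

    fs = 20000.0                      # sampling rate [Hz]
    t = np.arange(0, 1.0, 1 / fs)
    # Synthetic "defect" signal: a ~100 Hz impulse train exciting a 3 kHz resonance
    impulses = (np.sin(2 * np.pi * 100 * t) > 0.999).astype(float)
    ringdown = np.exp(-t[:200] * 2000) * np.sin(2 * np.pi * 3000 * t[:200])
    x = np.convolve(impulses, ringdown)[:t.size] + 0.01 * np.random.randn(t.size)

    f_psd, Pxx = signal.welch(x, fs, nperseg=4096)             # power spectrum
    f_sg, t_sg, Sxx = signal.spectrogram(x, fs, nperseg=1024)  # time-frequency map
    print(f_psd[np.argmax(Pxx)], Sxx.shape)  # energy should peak near the resonance
    ```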

  5. Aeroacoustic Simulations of a Nose Landing Gear Using FUN3D on Pointwise Unstructured Grids

    NASA Technical Reports Server (NTRS)

    Vatsa, Veer N.; Khorrami, Mehdi R.; Rhoads, John; Lockard, David P.

    2015-01-01

    Numerical simulations have been performed for a partially-dressed, cavity-closed (PDCC) nose landing gear configuration that was tested in the University of Florida's open-jet acoustic facility known as the UFAFF. The unstructured-grid flow solver FUN3D is used to compute the unsteady flow field for this configuration. Mixed-element grids generated using the Pointwise grid generation software are used for these simulations. Particular care is taken to ensure quality cells and proper resolution in critical areas of interest in an effort to minimize errors introduced by numerical artifacts. A hybrid Reynolds-averaged Navier-Stokes/large eddy simulation (RANS/LES) turbulence model is used for these simulations. Solutions are also presented for a wall function model coupled to the standard turbulence model. Time-averaged and instantaneous solutions obtained on these Pointwise grids are compared with the measured data and previous numerical solutions. The resulting CFD solutions are used as input to a Ffowcs Williams-Hawkings noise propagation code to compute the farfield noise levels in the flyover and sideline directions. The computed noise levels compare well with previous CFD solutions and experimental data.

  6. Feasibility of using the Massively Parallel Processor for large eddy simulations and other Computational Fluid Dynamics applications

    NASA Technical Reports Server (NTRS)

    Bruno, John

    1984-01-01

    The results of an investigation into the feasibility of using the MPP for direct and large eddy simulations of the Navier-Stokes equations are presented. A major part of this study was devoted to the implementation of two of the standard numerical algorithms for CFD. These implementations were not run on the Massively Parallel Processor (MPP) since the machine delivered to NASA Goddard does not have sufficient capacity. Instead, a detailed implementation plan was designed and from these were derived estimates of the time and space requirements of the algorithms on a suitably configured MPP. In addition, other issues related to the practical implementation of these algorithms on an MPP-like architecture were considered; namely, adaptive grid generation, zonal boundary conditions, the table lookup problem, and the software interface. Performance estimates show that the architectural components of the MPP, the Staging Memory and the Array Unit, appear to be well suited to the numerical algorithms of CFD. This combined with the prospect of building a faster and larger MPP-like machine holds the promise of achieving sustained gigaflop rates that are required for the numerical simulations in CFD.

  7. Summary of research in applied mathematics, numerical analysis, and computer sciences

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The major categories of current ICASE research programs addressed include: numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; control and parameter identification problems, with emphasis on effective numerical methods; computational problems in engineering and physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and computer systems and software, especially vector and parallel computers.

  8. LEGOS: Object-based software components for mission-critical systems. Final report, June 1, 1995--December 31, 1997

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-08-01

    An estimated 85% of the installed base of software is a custom application with a production quantity of one. In practice, almost 100% of military software systems are custom software. Paradoxically, the marginal costs of producing additional units are near zero. So why hasn't the software market, a market with high design costs and low production costs, evolved like other similar custom widget industries, such as automobiles and hardware chips? The military software industry seems immune to market pressures that have motivated a multilevel supply chain structure in other widget industries: design cost recovery, quality improvement through specialization, and rapid assembly from purchased components. The primary goal of the ComponentWare Consortium (CWC) technology plan was to overcome barriers to building and deploying mission-critical information systems by using verified, reusable software components (ComponentWare). The adoption of the ComponentWare infrastructure is predicated upon a critical mass of the leading platform vendors' inevitable adoption of emerging, object-based, distributed computing frameworks, initially CORBA and COM/OLE. The long-range goal of this work is to build and deploy military systems from verified reusable architectures. The promise of component-based applications is to enable developers to snap together new applications by mixing and matching prefabricated software components. A key result of this effort is the concept of reusable software architectures. A second important contribution is the notion that a software architecture is something that can be captured in a formal language and reused across multiple applications. The formalization and reuse of software architectures provide major cost and schedule improvements. The Unified Modeling Language (UML) is fast becoming the industry standard for object-oriented analysis and design notation for object-based systems. However, the lack of a standard real-time distributed object operating system, the lack of a standard Computer-Aided Software Environment (CASE) tool notation, and the lack of a standard CASE tool repository have limited the realization of component software. The approach to fulfilling this need is the software component factory innovation. The factory approach takes advantage of emerging standards such as UML, CORBA, Java, and the Internet. The key technical innovation of the software component factory is the ability to assemble and test new system configurations as well as assemble new tools on demand from existing tools and architecture design repositories.

  9. Development and Evaluation of a Computer-Based Program for Assessing Quality of Family Medicine Teams Based on Accreditation Standards

    PubMed Central

    Valjevac, Salih; Ridjanovic, Zoran; Masic, Izet

    2009-01-01

    Introduction The Agency for Healthcare Quality and Accreditation in the Federation of Bosnia and Herzegovina (AKAZ) is the body authorized to improve healthcare quality and safety and to accredit healthcare institutions. Besides accreditation standards for hospitals and primary health care centers, AKAZ has also developed accreditation standards for family medicine teams. Methods Software development was primarily based on the Accreditation Standards for Family Medicine Teams. Seven chapters/topics (1. Physical factors; 2. Equipment; 3. Organization and management; 4. Health promotion and illness prevention; 5. Clinical services; 6. Patient survey; and 7. Patient's rights and obligations) contain 35 standards describing the expected level of quality of a family medicine team. Given the structure of the accreditation standards and the needs of different potential users, it was concluded that the software backbone should be a database containing all accreditation standards together with self-assessment and external assessment details. In this article we present the development of standardized software for self and external evaluation of quality of service in family medicine, as well as plans for the future development of this software package. Conclusion Electronic data gathering and storage enhance the management, access and overall use of information. During this project we concluded that software for self-assessment and external assessment is ideal for distributing accreditation standards, allowing their overview by family medicine team members, their self-assessment and their external assessment. PMID:24109157
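
    Purely as an illustration of the database-backbone idea described above, the sketch below sets up a minimal standards-plus-assessments schema; all table and column names are invented here, not taken from the AKAZ software.

    ```python
    # Hypothetical sketch of a standards/assessments database backbone;
    # the schema is invented for illustration, not AKAZ's actual design.
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
    CREATE TABLE standard (
        id      INTEGER PRIMARY KEY,
        chapter TEXT NOT NULL,      -- e.g. 'Clinical services'
        text    TEXT NOT NULL
    );
    CREATE TABLE assessment (
        id          INTEGER PRIMARY KEY,
        standard_id INTEGER NOT NULL REFERENCES standard(id),
        kind        TEXT CHECK (kind IN ('self', 'external')),
        score       INTEGER,
        comment     TEXT
    );
    """)
    con.execute("INSERT INTO standard (chapter, text) VALUES (?, ?)",
                ("Equipment", "Essential diagnostic equipment is available."))
    con.execute("INSERT INTO assessment (standard_id, kind, score) VALUES (1, 'self', 4)")
    for row in con.execute("""SELECT s.chapter, a.kind, a.score
                              FROM assessment a JOIN standard s ON s.id = a.standard_id"""):
        print(row)
    ```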

  10. Simple solution to the medical instrumentation software problem

    NASA Astrophysics Data System (ADS)

    Leif, Robert C.; Leif, Suzanne B.; Leif, Stephanie H.; Bingue, E.

    1995-04-01

    Medical devices now include a substantial software component, which is both difficult and expensive to produce and maintain. Medical software must be developed according to 'Good Manufacturing Practices' (GMP). Good Manufacturing Practices as specified by the FDA and ISO require the definition of, and compliance with, a software process that ensures quality products by specifying a detailed method of software construction. The software process should be based on accepted standards. US Department of Defense software standards and technology can both facilitate development and improve the quality of medical systems. We describe the advantages of employing Mil-Std-498, Software Development and Documentation, and the Ada programming language. Ada provides the very broad range of functionalities, from embedded real-time systems to management information systems, required by many medical devices. It also includes advanced facilities for object-oriented programming and software engineering.

  11. An open trial assessment of "The Number Race", an adaptive computer game for remediation of dyscalculia

    PubMed Central

    Wilson, Anna J; Revkin, Susannah K; Cohen, David; Cohen, Laurent; Dehaene, Stanislas

    2006-01-01

    Background In a companion article [1], we described the development and evaluation of software designed to remediate dyscalculia. This software is based on the hypothesis that dyscalculia is due to a "core deficit" in number sense or in its access via symbolic information. Here we review the evidence for this hypothesis, and present results from an initial open-trial test of the software in a sample of nine 7–9 year old children with mathematical difficulties. Methods Children completed adaptive training on numerical comparison for half an hour a day, four days a week, over a period of five weeks. They were tested before and after intervention on their performance in core numerical tasks: counting, transcoding, base-10 comprehension, enumeration, addition, subtraction, and symbolic and non-symbolic numerical comparison. Results Children showed specific increases in performance on core number sense tasks. Speed of subitizing and numerical comparison increased by several hundred msec. Subtraction accuracy increased by an average of 23%. Performance on addition and base-10 comprehension tasks did not improve over the period of the study. Conclusion Initial open-trial testing showed promising results, and suggested that the software was successful in increasing number sense over the short period of the study. However, these results need to be followed up with larger, controlled studies. The issues of transfer to higher-level tasks, and of the best developmental time window for intervention, also need to be addressed. PMID:16734906

  12. Preparation guide for class B software specification documents

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1979-01-01

    General conceptual requirements and specific application rules and procedures are provided for the production of software specification documents in conformance with deep space network software standards and class B standards. Class B documentation is identified as the appropriate level applicable to implementation, sustaining engineering, and operational uses by qualified personnel. Special characteristics of class B documents are defined.

  13. 35 Ways to Take a "Byte" out of Software Costs. Fund Raising Ideas from COMPress Customers.

    ERIC Educational Resources Information Center

    COMPress, Wentworth, NH.

    Based on a survey sponsored by COMPress Quarterly of various schools to determine the extent of the problem of lack of funds for purchasing computer software and how schools have coped with the problem, this booklet describes numerous ways to raise funds for software purchases. Nearly 1,000 questionnaires were returned and this booklet was…

  14. Software Engineering Support of the Third Round of Scientific Grand Challenge Investigations: Earth System Modeling Software Framework Survey

    NASA Technical Reports Server (NTRS)

    Talbot, Bryan; Zhou, Shu-Jia; Higgins, Glenn; Zukor, Dorothy (Technical Monitor)

    2002-01-01

    One of the most significant challenges in large-scale climate modeling, as well as in high-performance computing in other scientific fields, is that of effectively integrating many software models from multiple contributors. A software framework facilitates the integration task, both in the development and runtime stages of the simulation. Effective software frameworks reduce the programming burden for the investigators, freeing them to focus more on the science and less on the parallel communication implementation, while maintaining high performance across numerous supercomputer and workstation architectures. This document surveys numerous software frameworks for potential use in Earth science modeling. Several frameworks are evaluated in depth, including Parallel Object-Oriented Methods and Applications (POOMA), Cactus (from the relativistic physics community), Overture, Goddard Earth Modeling System (GEMS), the National Center for Atmospheric Research Flux Coupler, and UCLA/UCB Distributed Data Broker (DDB). Frameworks evaluated in less detail include ROOT, Parallel Application Workspace (PAWS), and Advanced Large-Scale Integrated Computational Environment (ALICE). A host of other frameworks and related tools are referenced in this context. The frameworks are evaluated individually and also compared with each other.

  15. Preliminary design of the HARMONI science software

    NASA Astrophysics Data System (ADS)

    Piqueras, Laure; Jarno, Aurelien; Pécontal-Rousset, Arlette; Loupias, Magali; Richard, Johan; Schwartz, Noah; Fusco, Thierry; Sauvage, Jean-François; Neichel, Benoît; Correia, Carlos M.

    2016-08-01

    This paper introduces the science software of HARMONI. The Instrument Numerical Model simulates the instrument from the optical point of view and provides synthetic exposures simulating detector readouts from data-cubes containing astrophysical scenes. The Data Reduction Software converts raw-data frames into a fully calibrated, scientifically usable data cube. We present the functionalities and the preliminary design of this software, describe some of the methods and algorithms used and highlight the challenges that we will have to face.

  16. The Software Maturity Matrix: A Software Performance Metric

    DTIC Science & Technology

    2003-01-28

    ...are for managing: use them! Unused measurements have the same value as last night's unused hotel room or an empty airline seat. Be prepared to... standard measurements are implicit; organization standard verification is implicit; organization standard SMM training can be the basis of an...

  17. 77 FR 50726 - Software Requirement Specifications for Digital Computer Software and Complex Electronics Used in...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-22

    ... Computer Software and Complex Electronics Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear...-1209, ``Software Requirement Specifications for Digital Computer Software and Complex Electronics used... Electronics Engineers (ANSI/IEEE) Standard 830-1998, ``IEEE Recommended Practice for Software Requirements...

  18. Documentation of operational protocol for the use of MAMA software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwartz, Daniel S.

    2016-01-21

    Image analysis of Scanning Electron Microscope (SEM) micrographs is a complex process that can vary significantly between analysts. The factors causing the variation are numerous, and the purpose of Task 2b is to develop and test a set of protocols designed to minimize variation in image analysis between different analysts and laboratories, specifically using the MAMA software package, Version 2.1. The protocols were designed to be "minimally invasive", so that expert SEM operators will not be overly constrained in the way they analyze particle samples. The protocols will be tested using a round-robin approach in which results from expert SEM users at Los Alamos National Laboratory, Lawrence Livermore National Laboratory, Pacific Northwest National Laboratory, Savannah River National Laboratory, and the National Institute of Standards and Technology (NIST) will be compared. The variation of the results will be used to quantify uncertainty in the particle image analysis process. The round-robin exercise will proceed with 3 levels of rigor, each with its own set of protocols, as described below in Tasks 2b.1, 2b.2, and 2b.3. The uncertainty will be developed using NIST standard reference material SRM 1984 "Thermal Spray Powder – Particle Size Distribution, Tungsten Carbide/Cobalt (Acicular)" [Reference 1]. Full details are available in the Certificate of Analysis, posted on the NIST website (http://www.nist.gov/srm/).

  19. BARTTest: Community-Standard Atmospheric Radiative-Transfer and Retrieval Tests

    NASA Astrophysics Data System (ADS)

    Harrington, Joseph; Himes, Michael D.; Cubillos, Patricio E.; Blecic, Jasmina; Challener, Ryan C.

    2018-01-01

    Atmospheric radiative transfer (RT) codes are used both to predict planetary and brown-dwarf spectra and in retrieval algorithms to infer atmospheric chemistry, clouds, and thermal structure from observations. Observational plans, theoretical models, and scientific results depend on the correctness of these calculations. Yet, the calculations are complex and the codes implementing them are often written without modern software-verification techniques. The community needs a suite of test calculations with analytically, numerically, or at least community-verified results. We therefore present the Bayesian Atmospheric Radiative Transfer Test Suite, or BARTTest. BARTTest has four categories of tests: analytically verified RT tests of simple atmospheres (single line in single layer, line blends, saturation, isothermal, multiple line-list combination, etc.), community-verified RT tests of complex atmospheres, synthetic retrieval tests on simulated data with known answers, and community-verified real-data retrieval tests. BARTTest is open-source software intended for community use and further development. It is available at https://github.com/ExOSPORTS/BARTTest. We propose this test suite as a standard for verifying atmospheric RT and retrieval codes, analogous to the Held-Suarez test for general circulation models. This work was supported by NASA Planetary Atmospheres grant NX12AI69G, NASA Astrophysics Data Analysis Program grant NNX13AF38G, and NASA Exoplanets Research Program grant NNX17AB62G.
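
    As a toy illustration of the analytically verified RT test category (this is not code from BARTTest itself), the sketch below checks a first-order numerical integration of absorption through a single homogeneous layer against the exact Beer-Lambert transmission exp(-tau); the optical depth value is made up.

    ```python
    # Sketch of an analytically verified single-layer RT check: a first-order
    # (Euler-style) integration of dI/dz = -(tau/z_top) I must converge to the
    # exact Beer-Lambert transmission exp(-tau). Values are illustrative.
    import numpy as np

    def euler_transmission(tau, n_steps):
        # n_steps first-order attenuation steps through a uniform slab
        return (1.0 - tau / n_steps) ** n_steps

    tau = 0.72                                  # made-up optical depth
    approx = euler_transmission(tau, 200_000)
    assert np.isclose(approx, np.exp(-tau), rtol=1e-5)
    print("numerical:", approx, "analytic:", np.exp(-tau))
    ```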

  20. Oak Regeneration: A Knowledge Synthesis

    Treesearch

    H. Michael Rauscher; David L. Loftis; Charles E. McGee; Christopher V. Worth

    1997-01-01

    This scientific literature is presented as hypertext software. To view it, you must download and install the hypertext software. Abstract: The scientific literature concerning oak regeneration problems is lengthy, complex, paradoxical, and often perplexing. Despite a large scientific literature and numerous conference...

  1. Understanding the Perception of Very Small Software Companies towards the Adoption of Process Standards

    NASA Astrophysics Data System (ADS)

    Basri, Shuib; O'Connor, Rory V.

    This paper is concerned with understanding the issues that affect the adoption of software process standards by Very Small Entities (VSEs), their needs from process standards, and their willingness to engage with the new ISO/IEC 29110 standard in particular. In order to achieve this goal, a series of industry data collection studies was undertaken with a collection of VSEs. A twin-track approach of qualitative data collection (interviews and focus groups) and quantitative data collection (questionnaire) was undertaken. Data analysis was completed separately, and the final results were merged using the coding mechanisms of grounded theory. This paper serves as a roadmap both for researchers wishing to understand the issues of process standards adoption by very small companies and for the software process standards community.

  2. The MATH--Open Source Application for Easier Learning of Numerical Mathematics

    ERIC Educational Resources Information Center

    Glaser-Opitz, Henrich; Budajová, Kristina

    2016-01-01

    The article introduces a software application (MATH) supporting the education of Applied Mathematics, with a focus on Numerical Mathematics. MATH is an easy-to-use tool supporting calculations with various numerical methods, with a graphical user interface and an integrated plotting tool for graphical representation, written in Qt with extensive use of Qwt…

  3. Supercomputer modeling of flow past hypersonic flight vehicles

    NASA Astrophysics Data System (ADS)

    Ermakov, M. K.; Kryukov, I. A.

    2017-02-01

    A software platform for MPI-based parallel solution of the Navier-Stokes (Euler) equations for a viscous, heat-conducting, compressible perfect gas on 3-D unstructured meshes is developed. The discretization and solution of the Navier-Stokes equations are based on S. K. Godunov's generalized method with second-order approximation in space and time. The developed software platform makes it possible to efficiently simulate flows past hypersonic flight vehicles at Mach numbers of 6 and higher, on numerical meshes with up to 1 billion cells, using up to 128 processors.

  4. Numerical simulation study on the distribution law of smoke flow velocity in horizontal tunnel fire

    NASA Astrophysics Data System (ADS)

    Liu, Yejiao; Tian, Zhichao; Xue, Junhua; Wang, Wencai

    2018-02-01

    According to fluid similarity theory, a simulation experiment system for mining tunnel fires is established. The experimental model roadway is meshed with the GAMBIT software. By setting the boundary and initial conditions of smoke flow during a fire in the FLUENT software, and using the RNG k-ε two-equation turbulence model, the energy equation and the SIMPLE algorithm, a steady-state numerical simulation of smoke flow velocity in a mining tunnel is performed to obtain the distribution law of smoke flow velocity in the tunnel during a fire.

  5. First experiences with the implementation of the European standard EN 62304 on medical device software for the quality assurance of a radiotherapy unit

    PubMed Central

    2014-01-01

    Background According to the latest amendment of the Medical Device Directive standalone software qualifies as a medical device when intended by the manufacturer to be used for medical purposes. In this context, the EN 62304 standard is applicable which defines the life-cycle requirements for the development and maintenance of medical device software. A pilot project was launched to acquire skills in implementing this standard in a hospital-based environment (in-house manufacture). Methods The EN 62304 standard outlines minimum requirements for each stage of the software life-cycle, defines the activities and tasks to be performed and scales documentation and testing according to its criticality. The required processes were established for the pre-existent decision-support software FlashDumpComparator (FDC) used during the quality assurance of treatment-relevant beam parameters. As the EN 62304 standard implicates compliance with the EN ISO 14971 standard on the application of risk management to medical devices, a risk analysis was carried out to identify potential hazards and reduce the associated risks to acceptable levels. Results The EN 62304 standard is difficult to implement without proper tools, thus open-source software was selected and integrated into a dedicated development platform. The control measures yielded by the risk analysis were independently implemented and verified, and a script-based test automation was retrofitted to reduce the associated test effort. After all documents facilitating the traceability of the specified requirements to the corresponding tests and of the control measures to the proof of execution were generated, the FDC was released as an accessory to the HIT facility. Conclusions The implementation of the EN 62304 standard was time-consuming, and a learning curve had to be overcome during the first iterations of the associated processes, but many process descriptions and all software tools can be re-utilized in follow-up projects. It has been demonstrated that a standards-compliant development of small and medium-sized medical software can be carried out by a small team with limited resources in a clinical setting. This is of particular relevance as the upcoming revision of the Medical Device Directive is expected to harmonize and tighten the current legal requirements for all European in-house manufacturers. PMID:24655818

  6. First experiences with the implementation of the European standard EN 62304 on medical device software for the quality assurance of a radiotherapy unit.

    PubMed

    Höss, Angelika; Lampe, Christian; Panse, Ralf; Ackermann, Benjamin; Naumann, Jakob; Jäkel, Oliver

    2014-03-21

    According to the latest amendment of the Medical Device Directive standalone software qualifies as a medical device when intended by the manufacturer to be used for medical purposes. In this context, the EN 62304 standard is applicable which defines the life-cycle requirements for the development and maintenance of medical device software. A pilot project was launched to acquire skills in implementing this standard in a hospital-based environment (in-house manufacture). The EN 62304 standard outlines minimum requirements for each stage of the software life-cycle, defines the activities and tasks to be performed and scales documentation and testing according to its criticality. The required processes were established for the pre-existent decision-support software FlashDumpComparator (FDC) used during the quality assurance of treatment-relevant beam parameters. As the EN 62304 standard implicates compliance with the EN ISO 14971 standard on the application of risk management to medical devices, a risk analysis was carried out to identify potential hazards and reduce the associated risks to acceptable levels. The EN 62304 standard is difficult to implement without proper tools, thus open-source software was selected and integrated into a dedicated development platform. The control measures yielded by the risk analysis were independently implemented and verified, and a script-based test automation was retrofitted to reduce the associated test effort. After all documents facilitating the traceability of the specified requirements to the corresponding tests and of the control measures to the proof of execution were generated, the FDC was released as an accessory to the HIT facility. The implementation of the EN 62304 standard was time-consuming, and a learning curve had to be overcome during the first iterations of the associated processes, but many process descriptions and all software tools can be re-utilized in follow-up projects. It has been demonstrated that a standards-compliant development of small and medium-sized medical software can be carried out by a small team with limited resources in a clinical setting. This is of particular relevance as the upcoming revision of the Medical Device Directive is expected to harmonize and tighten the current legal requirements for all European in-house manufacturers.

  7. A Strategy for Improved System Assurance

    DTIC Science & Technology

    2007-06-20

    Quality measurements across the life cycle (safety, security and others). Standards referenced: ISO/IEC 12207, Software Life Cycle Processes; ISO 9001, Quality Management Systems; ISO/IEC 14598, Software Product Evaluation; ISO/IEC 90003, Guidelines for the Application of ISO 9001:2000 to Computer Software; IEEE 12207, Industry Implementation of International Standard ISO/IEC 12207; IEEE 1220, Standard for Application and Management of the Systems Engineering Process.

  8. Simulating flow around scaled model of a hypersonic vehicle in wind tunnel

    NASA Astrophysics Data System (ADS)

    Markova, T. V.; Aksenov, A. A.; Zhluktov, S. V.; Savitsky, D. V.; Gavrilov, A. D.; Son, E. E.; Prokhorov, A. N.

    2016-11-01

    A prospective hypersonic HEXAFLY aircraft is considered in this paper. In order to obtain the aerodynamic characteristics of a new construction design of the aircraft, experiments with a scaled model have been carried out in a wind tunnel under different conditions. The runs were performed at different angles of attack, with and without hydrogen combustion in the scaled propulsion engine. However, the measured physical quantities do not provide all the information about the flowfield. Numerical simulation can complement the experimental data as well as reduce the number of wind tunnel experiments. Besides that, reliable CFD software can be used to calculate the aerodynamic characteristics of any possible design of the full-scale aircraft under different operating conditions. The reliability of the numerical predictions must be confirmed by a verification study of the software. The present work is aimed at numerical investigation of the flowfield around and inside the scaled model of the HEXAFLY-CIAM module under wind tunnel conditions. A cold run (without combustion) was selected for this study. The calculations are performed in the FlowVision CFD software. The flow characteristics are compared against the available experimental data. The verification study carried out confirms the capability of the FlowVision CFD software to calculate the flows discussed.

  9. Test Driven Development of a Parameterized Ice Sheet Component

    NASA Astrophysics Data System (ADS)

    Clune, T.

    2011-12-01

    Test driven development (TDD) is a software development methodology that offers many advantages over traditional approaches, including reduced development and maintenance costs, improved reliability, and superior design quality. Although TDD is widely accepted in many software communities, its suitability for scientific software is largely undemonstrated and warrants a degree of skepticism. Indeed, numerical algorithms pose several challenges to unit testing in general, and TDD in particular. Among these challenges are the need for simple, non-redundant closed-form expressions to compare against the results obtained from the implementation, as well as realistic error estimates. The necessity for serial and parallel performance raises additional concerns for many scientific applications. In previous work I demonstrated that TDD performed well for the development of a relatively simple numerical model that simulates the growth of snowflakes, but the results were anecdotal and of limited relevance to the far more complex software components typical of climate models. This investigation has now been extended by successfully applying TDD to the implementation of a substantial portion of a new parameterized ice sheet component within a full climate model. After a brief introduction to TDD, I will present techniques that address some of the obstacles encountered with numerical algorithms. I will conclude with some quantitative and qualitative comparisons against climate components developed in a more traditional manner.
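
    A minimal sketch of the TDD pattern the abstract describes: a numerical routine is tested against a simple closed-form expression, with a tolerance tied to the method's order of accuracy. The trapezoid-rule example is ours, not from the cited ice sheet work.

    ```python
    # TDD-style unit test for a numerical routine: compare against a
    # closed-form result with an error budget from the method's O(h^2) accuracy.
    import numpy as np

    def trapezoid(f, a, b, n):
        x = np.linspace(a, b, n + 1)
        y = f(x)
        return (b - a) / n * (y[0] / 2 + y[1:-1].sum() + y[-1] / 2)

    def test_trapezoid_matches_closed_form():
        # Closed form: the integral of sin on [0, pi] is exactly 2.
        n = 1000
        approx = trapezoid(np.sin, 0.0, np.pi, n)
        h = np.pi / n
        assert abs(approx - 2.0) < 10 * h**2   # conservative O(h^2) budget

    test_trapezoid_matches_closed_form()
    print("test passed")
    ```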

  10. Estimation and enhancement of real-time software reliability through mutation analysis

    NASA Technical Reports Server (NTRS)

    Geist, Robert; Offutt, A. J.; Harris, Frederick C., Jr.

    1992-01-01

    A simulation-based technique for obtaining numerical estimates of the reliability of N-version, real-time software is presented. An extended stochastic Petri net is employed to represent the synchronization structure of N versions of the software, where dependencies among versions are modeled through correlated sampling of module execution times. Test results utilizing specifications for NASA's planetary lander control software indicate that mutation-based testing could hold greater potential for enhancing reliability than the desirable but perhaps unachievable goal of independence among N versions.
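
    The following sketch illustrates only the correlated-sampling ingredient mentioned above, not the extended stochastic Petri net itself: execution times for three software versions are drawn from a correlated lognormal model, and the probability that a majority meets a deadline is estimated. All numbers are invented for illustration.

    ```python
    # Correlated sampling of module execution times for N software versions;
    # parameters (deadline, correlation, spread) are illustrative only.
    import numpy as np

    rng = np.random.default_rng(0)
    n_versions, n_runs, deadline = 3, 100_000, 1.3
    mean, sigma, corr = 0.0, 0.25, 0.8          # log-scale parameters

    # Covariance with common pairwise correlation 'corr'
    cov = sigma**2 * (corr * np.ones((n_versions, n_versions))
                      + (1 - corr) * np.eye(n_versions))
    log_times = rng.multivariate_normal(np.full(n_versions, mean), cov, size=n_runs)
    times = np.exp(log_times)                   # correlated execution times

    majority_ok = (times <= deadline).sum(axis=1) >= 2
    print(f"P(majority meets deadline) ~ {majority_ok.mean():.4f}")
    ```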

  11. AN EVALUATION OF FIVE COMMERCIAL IMMUNOASSAY DATA ANALYSIS SOFTWARE SYSTEMS

    EPA Science Inventory

    An evaluation of five commercial software systems used for immunoassay data analysis revealed numerous deficiencies. Often, the utility of statistical output was compromised by poor documentation. Several data sets were run through each system using a four-parameter calibration f...

  12. Multiscale Software Tool for Controls Prototyping in Supersonic Combustors

    DTIC Science & Technology

    2004-04-01

    and design software (GEMA, NPSS, LES combustion). We are partners with major propulsion system developers (GE, Rolls Royce, Aerojet), and a... participant in the NASA/GRC Numerical Propulsion System Simulation (NPSS) program. The principal investigator is the primary developer (Pindera, 2001) of a

  13. Logic Model Checking of Unintended Acceleration Claims in Toyota Vehicles

    NASA Technical Reports Server (NTRS)

    Gamble, Ed

    2012-01-01

    Part of the US Department of Transportation investigation of Toyota sudden unintended acceleration (SUA) involved analysis of the throttle control software. The JPL Laboratory for Reliable Software applied several techniques, including static analysis and logic model checking, to the software. A handful of logic models were built, and some weaknesses were identified; however, no cause for SUA was found. The full NASA report includes numerous other analyses.

  14. An Empirical Verification of a-priori Learning Models on Mailing Archives in the Context of Online Learning Activities of Participants in Free\\Libre Open Source Software (FLOSS) Communities

    ERIC Educational Resources Information Center

    Mukala, Patrick; Cerone, Antonio; Turini, Franco

    2017-01-01

    Free\\Libre Open Source Software (FLOSS) environments are increasingly dubbed as learning environments where practical software engineering skills can be acquired. Numerous studies have extensively investigated how knowledge is acquired in these environments through a collaborative learning model that define a learning process. Such a learning…

  15. Definition of the Flexible Image Transport System (FITS), Version 3.0

    NASA Technical Reports Server (NTRS)

    Pence, W. D.; Chiapetti, L.; Page, C. G.; Shaw, R. A.; Stobie, E.

    2010-01-01

    The Flexible Image Transport System (FITS) has been used by astronomers for over 30 years as a data interchange and archiving format; FITS files are now handled by a wide range of astronomical software packages. Since the FITS format definition document (the "standard") was last printed in this journal in 2001, several new features have been developed and standardized, notably support for 64-bit integers in images and tables, variable-length arrays in tables, and new world coordinate system conventions which provide a mapping from an element in a data array to a physical coordinate on the sky or within a spectrum. The FITS Working Group of the International Astronomical Union has therefore produced this new Version 3.0 of the FITS standard, which is provided here in its entirety. In addition to describing the new features in FITS, numerous editorial changes were made to the previous version to clarify and reorganize many of the sections. Also included are some appendices which are not formally part of the standard. The FITS standard is likely to undergo further evolution, in which case the latest version may be found on the FITS Support Office Web site at http://fits.gsfc.nasa.gov/, which also provides many links to FITS-related resources.
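
    For readers unfamiliar with the format, a minimal example of reading a FITS file with one of the many packages that implement the standard (astropy); 'example.fits' is a placeholder path.

    ```python
    # Minimal FITS read with astropy; 'example.fits' is a placeholder filename.
    from astropy.io import fits

    with fits.open("example.fits") as hdul:
        hdul.info()                       # list the header/data units (HDUs)
        header = hdul[0].header           # primary header keywords
        data = hdul[0].data               # image data as a numpy array, or None
        print(header.get("BITPIX"), None if data is None else data.shape)
    ```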

  16. Programming biological models in Python using PySB.

    PubMed

    Lopez, Carlos F; Muhlich, Jeremy L; Bachman, John A; Sorger, Peter K

    2013-01-01

    Mathematical equations are fundamental to modeling biological networks, but as networks get large and revisions frequent, it becomes difficult to manage equations directly or to combine previously developed models. Multiple simultaneous efforts to create graphical standards, rule-based languages, and integrated software workbenches aim to simplify biological modeling but none fully meets the need for transparent, extensible, and reusable models. In this paper we describe PySB, an approach in which models are not only created using programs, they are programs. PySB draws on programmatic modeling concepts from little b and ProMot, the rule-based languages BioNetGen and Kappa and the growing library of Python numerical tools. Central to PySB is a library of macros encoding familiar biochemical actions such as binding, catalysis, and polymerization, making it possible to use a high-level, action-oriented vocabulary to construct detailed models. As Python programs, PySB models leverage tools and practices from the open-source software community, substantially advancing our ability to distribute and manage the work of testing biochemical hypotheses. We illustrate these ideas using new and previously published models of apoptosis.
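
    A minimal sketch in the programmatic style the paper describes: a reversible ligand-receptor binding model written as a PySB program and simulated with ODEs. The model itself is invented for illustration, and API details may vary between PySB versions.

    ```python
    # Sketch of a rule-based PySB model: reversible L + R <-> L:R binding.
    # The species and rate values are invented for illustration.
    from pysb import Model, Monomer, Parameter, Initial, Rule, Observable
    from pysb.simulator import ScipyOdeSimulator
    import numpy as np

    Model()                                   # components below self-register
    Monomer("L", ["s"])                       # ligand with one binding site
    Monomer("R", ["s"])                       # receptor with one binding site
    Parameter("kf", 1e-3)
    Parameter("kr", 1e-1)
    Parameter("L0", 100)
    Parameter("R0", 200)
    Initial(L(s=None), L0)
    Initial(R(s=None), R0)
    Rule("binding", L(s=None) + R(s=None) | L(s=1) % R(s=1), kf, kr)
    Observable("LR", L(s=1) % R(s=1))

    t = np.linspace(0, 100, 101)
    result = ScipyOdeSimulator(model, tspan=t).run()
    print("bound complex at t=100:", result.observables["LR"][-1])
    ```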

  17. Programming biological models in Python using PySB

    PubMed Central

    Lopez, Carlos F; Muhlich, Jeremy L; Bachman, John A; Sorger, Peter K

    2013-01-01

    Mathematical equations are fundamental to modeling biological networks, but as networks get large and revisions frequent, it becomes difficult to manage equations directly or to combine previously developed models. Multiple simultaneous efforts to create graphical standards, rule-based languages, and integrated software workbenches aim to simplify biological modeling but none fully meets the need for transparent, extensible, and reusable models. In this paper we describe PySB, an approach in which models are not only created using programs, they are programs. PySB draws on programmatic modeling concepts from little b and ProMot, the rule-based languages BioNetGen and Kappa and the growing library of Python numerical tools. Central to PySB is a library of macros encoding familiar biochemical actions such as binding, catalysis, and polymerization, making it possible to use a high-level, action-oriented vocabulary to construct detailed models. As Python programs, PySB models leverage tools and practices from the open-source software community, substantially advancing our ability to distribute and manage the work of testing biochemical hypotheses. We illustrate these ideas using new and previously published models of apoptosis. PMID:23423320

  18. Spinoff 2013

    NASA Technical Reports Server (NTRS)

    2014-01-01

    Topics covered include: Innovative Software Tools Measure Behavioral Alertness; Miniaturized, Portable Sensors Monitor Metabolic Health; Patient Simulators Train Emergency Caregivers; Solar Refrigerators Store Life-Saving Vaccines; Monitors Enable Medication Management in Patients' Homes; Handheld Diagnostic Device Delivers Quick Medical Readings; Experiments Result in Safer, Spin-Resistant Aircraft; Interfaces Visualize Data for Airline Safety, Efficiency; Data Mining Tools Make Flights Safer, More Efficient; NASA Standards Inform Comfortable Car Seats; Heat Shield Paves the Way for Commercial Space; Air Systems Provide Life Support to Miners; Coatings Preserve Metal, Stone, Tile, and Concrete; Robots Spur Software That Lends a Hand; Cloud-Based Data Sharing Connects Emergency Managers; Catalytic Converters Maintain Air Quality in Mines; NASA-Enhanced Water Bottles Filter Water on the Go; Brainwave Monitoring Software Improves Distracted Minds; Thermal Materials Protect Priceless, Personal Keepsakes; Home Air Purifiers Eradicate Harmful Pathogens; Thermal Materials Drive Professional Apparel Line; Radiant Barriers Save Energy in Buildings; Open Source Initiative Powers Real-Time Data Streams; Shuttle Engine Designs Revolutionize Solar Power; Procedure-Authoring Tool Improves Safety on Oil Rigs; Satellite Data Aid Monitoring of Nation's Forests; Mars Technologies Spawn Durable Wind Turbines; Programs Visualize Earth and Space for Interactive Education; Processor Units Reduce Satellite Construction Costs; Software Accelerates Computing Time for Complex Math; Simulation Tools Prevent Signal Interference on Spacecraft; Software Simplifies the Sharing of Numerical Models; Virtual Machine Language Controls Remote Devices; Micro-Accelerometers Monitor Equipment Health; Reactors Save Energy, Costs for Hydrogen Production; Cameras Monitor Spacecraft Integrity to Prevent Failures; Testing Devices Garner Data on Insulation Performance; Smart Sensors Gather Information for Machine Diagnostics; Oxygen Sensors Monitor Bioreactors and Ensure Health and Safety; Vision Algorithms Catch Defects in Screen Displays; and Deformable Mirrors Capture Exoplanet Data, Reflect Lasers.

  19. Vertical bone measurements from cone beam computed tomography images using different software packages.

    PubMed

    Vasconcelos, Taruska Ventorini; Neves, Frederico Sampaio; Moraes, Lívia Almeida Bueno; Freitas, Deborah Queiroz

    2015-01-01

    This article aimed to compare the accuracy of the linear measurement tools of different commercial software packages. Eight fully edentulous dry mandibles were selected for this study. Incisor, canine, premolar, first molar and second molar regions were selected. Cone beam computed tomography (CBCT) images were obtained with i-CAT Next Generation. Linear bone measurements were performed by one observer on the cross-sectional images using three different software packages: XoranCat®, OnDemand3D® and KDIS3D®, all able to assess DICOM images. In addition, 25% of the sample was reevaluated for the purpose of reproducibility. The mandibles were sectioned to obtain the gold standard for each region. Intraclass correlation coefficients (ICC) were calculated to examine the agreement between the two periods of evaluation; one-way analysis of variance with the post-hoc Dunnett test was used to compare each of the software-derived measurements with the gold standard. The ICC values were excellent for all software packages. The smallest differences between the software-derived measurements and the gold standard were obtained with OnDemand3D and KDIS3D (-0.11 and -0.14 mm, respectively), and the largest with XoranCAT (+0.25 mm). However, there was no statistically significant difference between the measurements obtained with the different software packages and the gold standard (p > 0.05). In conclusion, linear bone measurements were not influenced by the software package used to reconstruct the image from CBCT DICOM data.
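
    A hedged sketch of the kind of comparison described, with each software package's measurements tested against the gold standard using Dunnett's procedure (available as scipy.stats.dunnett in SciPy 1.11 and later); the data below are synthetic, loosely echoing the reported mean differences.

    ```python
    # Synthetic many-to-one comparison against a gold standard with Dunnett's
    # test; requires scipy >= 1.11. All measurement values are made up.
    import numpy as np
    from scipy.stats import dunnett

    rng = np.random.default_rng(1)
    gold = rng.normal(10.0, 0.5, size=40)                # physical measurements (mm)
    xorancat = gold + rng.normal(0.25, 0.3, size=40)     # software-derived values
    ondemand = gold + rng.normal(-0.11, 0.3, size=40)
    kdis = gold + rng.normal(-0.14, 0.3, size=40)

    res = dunnett(xorancat, ondemand, kdis, control=gold)
    print(res.pvalue)    # one p-value per software package vs. the gold standard
    ```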

  20. Stochastic coalescence in finite systems: an algorithm for the numerical solution of the multivariate master equation.

    NASA Astrophysics Data System (ADS)

    Alfonso, Lester; Zamora, Jose; Cruz, Pedro

    2015-04-01

    The stochastic approach to coagulation considers the coalescence process occurring in a system of a finite number of particles enclosed in a finite volume. Within this approach, the full description of the system can be obtained from the solution of the multivariate master equation, which models the evolution of the probability distribution of the state vector for the number of particles of a given mass. Unfortunately, due to its complexity, only limited results have been obtained for certain types of kernels and monodisperse initial conditions. In this work, a novel numerical algorithm for the solution of the multivariate master equation for stochastic coalescence that works for any type of kernel and initial conditions is introduced. The performance of the method was checked by comparing the numerically calculated particle mass spectrum with analytical solutions obtained for the constant and sum kernels, with excellent correspondence between the analytical and numerical solutions. To increase the speedup of the algorithm, software parallelization techniques based on the OpenMP standard were used, along with an implementation taking advantage of new accelerator technologies. Simulation results show an important speedup of the parallelized algorithms. This study was funded by a grant from Consejo Nacional de Ciencia y Tecnologia de Mexico SEP-CONACYT CB-131879. The authors also thank LUFAC® Computacion SA de CV for CPU time and all the support provided.
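
    For intuition only (this is not the authors' master-equation algorithm), the sketch below runs a Gillespie-style simulation of finite-system coalescence with a constant kernel, the simplest case for which analytic checks exist; the kernel value and particle counts are arbitrary.

    ```python
    # Gillespie simulation of stochastic coalescence with a constant kernel K:
    # any pair of particles merges at rate K, so the total rate is K*n*(n-1)/2.
    # Parameter values are arbitrary illustrations.
    import numpy as np

    rng = np.random.default_rng(2)
    K = 1e-3                                    # constant coalescence kernel

    def gillespie_coalescence(n0=100, t_end=50.0):
        n, t = n0, 0.0
        while n > 1:
            rate = K * n * (n - 1) / 2          # total pair-coalescence rate
            t += rng.exponential(1.0 / rate)    # waiting time to next merger
            if t > t_end:
                break
            n -= 1                              # one merger per event
        return n

    samples = [gillespie_coalescence() for _ in range(2000)]
    print("mean particle number at t_end:", np.mean(samples))
    ```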

  1. A study of software standards used in the avionics industry

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J.

    1994-01-01

    Within the past decade, software has become an increasingly common element in computing systems. In particular, the role of software used in the aerospace industry, especially in life- or safety-critical applications, is rapidly expanding. This intensifies the need to use effective techniques for achieving and verifying the reliability of avionics software. Although certain software development processes and techniques are mandated by government regulating agencies, no one methodology has been shown to consistently produce reliable software. The knowledge base for designing reliable software simply has not reached the maturity of its hardware counterpart. In an effort to increase our understanding of software, the Langley Research Center conducted a series of experiments over 15 years with the goal of understanding why and how software fails. As part of this program, the effectiveness of current industry standards for the development of avionics is being investigated. This study involves the generation of a controlled environment to conduct scientific experiments on software processes.

  2. Announcing a Community Effort to Create an Information Model for Research Software Archives

    NASA Astrophysics Data System (ADS)

    Million, C.; Brazier, A.; King, T.; Hayes, A.

    2018-04-01

    An effort has started to create recommendations and standards for the archiving of planetary science research software. The primary goal is to define an information model that is consistent with OAIS standards.

  3. 78 FR 47011 - Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-02

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Software Unit Testing for Digital Computer Software... revised regulatory guide (RG), revision 1 of RG 1.171, ``Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants.'' This RG endorses American National Standards...

  4. Second generation heliostat development for solar central receiver systems. Volume 4, appendices F-J: Control software test results manufacturing pile installation pile coatings

    NASA Astrophysics Data System (ADS)

    1981-03-01

    Support documentation for a second generation heliostat project is presented. Flowcharts of control software are included. Numerical and graphic test results are provided. Project management information is also provided.

  5. Methods to ensure the standardization of FORTRAN software. [PFORT, DAVE, POLISH, and BRNANL, for analysis and editing of codes, in FORTRAN for PDP-10 and IBM 360 and 370

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gaffney, P.W.; Wooten, J.W.

    1980-05-01

    Four software tools, PFORT, DAVE, POLISH, and BRNANL, which may be used to ensure the standardization of FORTRAN software, are introduced. First, FORTRAN computer programs are loosely classified into three groups. Then reasons are given why the programs in two of these groups should adhere to a portable subset of the American National Standard (ANS) FORTRAN 1966. Next, the software tools PFORT, DAVE, POLISH, and BRNANL are briefly described, and examples of the output from PFORT, DAVE, and POLISH are given. Finally, the dissemination of information pertaining to the tools, together with their availability, is outlined. 11 figures.

  6. Parallel Fortran-MPI software for numerical inversion of the Laplace transform and its application to oscillatory water levels in groundwater environments

    USGS Publications Warehouse

    Zhan, X.

    2005-01-01

    A parallel Fortran-MPI (Message Passing Interface) software package for numerical inversion of the Laplace transform, based on a Fourier series method, is developed to meet the need of solving computationally intensive problems involving the response of oscillatory water levels to hydraulic tests in a groundwater environment. The software is a parallel version of ACM (Association for Computing Machinery) Transactions on Mathematical Software (TOMS) Algorithm 796. Running 38 test examples indicated that implementation of MPI techniques with a distributed-memory architecture speeds up the processing and improves efficiency. Applications to oscillatory water levels in a well during aquifer tests are presented to illustrate how this package can be applied to solve complicated environmental problems involving differential and integral equations. The package is free and is easy to use for people with little or no previous experience in using MPI but who wish to get off to a quick start in parallel computing. ?? 2004 Elsevier Ltd. All rights reserved.
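
    A serial sketch of a Fourier-series Laplace inversion of the general kind underlying such packages (this is not the TOMS Algorithm 796 code, and it omits the series acceleration a production implementation would use), checked against the known transform pair F(s) = 1/(s+1), f(t) = exp(-t).

    ```python
    # Basic Fourier-series Laplace inversion (trapezoidal discretization of the
    # Bromwich integral); accuracy is modest without acceleration techniques.
    import numpy as np

    def invert_laplace(F, t, T=20.0, gamma=0.5, n_terms=2000):
        # f(t) ~ (e^{gamma t}/T) [ F(gamma)/2
        #        + sum_k Re F(s_k) cos(k pi t/T) - Im F(s_k) sin(k pi t/T) ],
        # with s_k = gamma + i k pi / T, valid for 0 < t < ~T.
        k = np.arange(1, n_terms + 1)
        s = gamma + 1j * k * np.pi / T
        Fk = F(s)
        series = (F(gamma).real / 2
                  + np.sum(Fk.real * np.cos(k * np.pi * t / T)
                           - Fk.imag * np.sin(k * np.pi * t / T)))
        return np.exp(gamma * t) / T * series

    F = lambda s: 1.0 / (s + 1.0)        # Laplace transform of exp(-t)
    for t in (0.5, 1.0, 2.0):
        print(t, invert_laplace(F, t), np.exp(-t))
    ```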

  7. Practical Applications of Digital Pathology.

    PubMed

    Saeed-Vafa, Daryoush; Magliocco, Anthony M

    2015-04-01

    Virtual microscopy and advances in machine learning have paved the way for the ever-expanding field of digital pathology. Multiple image-based computing environments capable of performing automated quantitative and morphological analyses are the foundation on which digital pathology is built. The numerous applications of digital pathology in the clinical setting are explored, along with the digital software environments themselves and the different analytical modalities specific to digital pathology. Prospective studies, case-control analyses, meta-analyses, and detailed descriptions of software environments pertaining to digital pathology and its use in the clinical setting were explored. Many different software environments have advanced platforms capable of improving digital pathology and potentially influencing clinical decisions. The potential of digital pathology is vast, particularly with the introduction of the numerous software environments available for use. With all the digital pathology tools available, as well as those in development, the field will continue to advance, particularly in the era of personalized medicine, providing health care professionals with more precise prognostic information and helping them guide treatment decisions.

  8. Software for improving the quality of project management, a case study: international manufacture of electrical equipment

    NASA Astrophysics Data System (ADS)

    Preradović, D. M.; Mićić, Lj S.; Barz, C.

    2017-05-01

    Production conditions in today's world require software support at every stage of production and development of new products, to assure quality and comply with ISO standards. In addition to ISO standards as the usual metrics of quality, companies today focus on other optional standards, such as CMMI (Capability Maturity Model Integration), or prescribe their own standards. However, while intensive progress is being made in project management (PM), a significant number of projects at the global level still fail, having missed their goals, budgets or timeframes. This paper focuses on examining the role of software tools, through the rate of project success, in the case of an international manufacturer of electrical equipment. The results of this research show how much the project management software used to manage and develop new products contributes to improving PM processes and PM functions, and how the selection of software tools affects the quality of PM processes and the share of successfully completed projects.

  9. Code Verification Capabilities and Assessments in Support of ASC V&V Level 2 Milestone #6035

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doebling, Scott William; Budzien, Joanne Louise; Ferguson, Jim Michael

    This document provides a summary of the code verification activities supporting the FY17 Level 2 V&V milestone entitled "Deliver a Capability for V&V Assessments of Code Implementations of Physics Models and Numerical Algorithms in Support of Future Predictive Capability Framework Pegposts." The physics validation activities supporting this milestone are documented separately. The objectives of this portion of the milestone are: 1) develop software tools to support code verification analysis; 2) document standard definitions of code verification test problems; and 3) perform code verification assessments (focusing on error behavior of algorithms). This report and a set of additional standalone documents serve as the compilation of results demonstrating accomplishment of these objectives.

  10. OpenCL Implementation of NeuroIsing

    NASA Astrophysics Data System (ADS)

    Zapart, C. A.

    Recent advances in graphics card hardware, combined with the introduction of the OpenCL standard, promise to accelerate numerical simulations across diverse scientific disciplines. One such field benefiting from new hardware/software paradigms is econophysics. The paper describes an OpenCL implementation of a selected econophysics model: NeuroIsing, which has been designed to execute in parallel on a vendor-independent graphics card. Originally introduced in [C. A. Zapart, "Econophysics in Financial Time Series Prediction", PhD thesis, Graduate University for Advanced Studies, Japan (2009)], it was first implemented on a CELL processor running inside a SONY PS3 games console. The NeuroIsing framework can be applied to predicting and trading foreign exchange as well as stock market index futures.

  11. Libration Orbit Mission Design: Applications of Numerical & Dynamical Methods

    NASA Technical Reports Server (NTRS)

    Bauer, Frank (Technical Monitor); Folta, David; Beckman, Mark

    2002-01-01

    Sun-Earth libration point orbits serve as excellent locations for scientific investigations. These orbits are often selected to minimize environmental disturbances and maximize observing efficiency. Trajectory design in support of libration orbits is ever more challenging as more complex missions are envisioned in the next decade. Trajectory design software must be further enabled to incorporate better understanding of the libration orbit solution space and thus improve the efficiency and expand the capabilities of current approaches. The Goddard Space Flight Center (GSFC) is currently supporting multiple libration missions. This end-to-end support consists of mission operations, trajectory design, and control. It also includes algorithm and software development. The recently launched Microwave Anisotropy Probe (MAP) and upcoming James Webb Space Telescope (JWST) and Constellation-X missions are examples of the use of improved numerical methods for attaining constrained orbital parameters and controlling their dynamical evolution at the collinear libration points. This paper presents a history of libration point missions, a brief description of the numerical and dynamical design techniques including software used, and a sample of future GSFC mission designs.

  12. Starworld: Preparing Accountants for the Future: A Case-Based Approach to Teach International Financial Reporting Standards Using ERP Software

    ERIC Educational Resources Information Center

    Ragan, Joseph M.; Savino, Christopher J.; Parashac, Paul; Hosler, Jonathan C.

    2010-01-01

    International Financial Reporting Standards now constitute an important part of educating young professional accountants. This paper looks at a case based process to teach International Financial Reporting Standards using integrated Enterprise Resource Planning software. The case contained within the paper can be used within a variety of courses…

  13. Software handlers for process interfaces

    NASA Technical Reports Server (NTRS)

    Bercaw, R. W.

    1976-01-01

    The principles involved in the development of software handlers for custom interfacing problems are discussed. Handlers for the CAMAC standard are examined in detail. The types of transactions that must be supported have been established by standards groups, eliminating conflicting requirements arising out of different design philosophies and applications. Implementation of the standard handlers has been facilitated by standardization of hardware. The necessary local processors can be placed in the handler when it is written or at run time by means of input/output directives, or they can be built into a high-performance input/output processor. The full benefits of these process interfaces will only be realized when software requirements are incorporated uniformly into the hardware.

  14. SIM_EXPLORE: Software for Directed Exploration of Complex Systems

    NASA Technical Reports Server (NTRS)

    Burl, Michael; Wang, Esther; Enke, Brian; Merline, William J.

    2013-01-01

    Physics-based numerical simulation codes are widely used in science and engineering to model complex systems that would be infeasible to study otherwise. While such codes may provide the highest- fidelity representation of system behavior, they are often so slow to run that insight into the system is limited. Trying to understand the effects of inputs on outputs by conducting an exhaustive grid-based sweep over the input parameter space is simply too time-consuming. An alternative approach called "directed exploration" has been developed to harvest information from numerical simulators more efficiently. The basic idea is to employ active learning and supervised machine learning to choose cleverly at each step which simulation trials to run next based on the results of previous trials. SIM_EXPLORE is a new computer program that uses directed exploration to explore efficiently complex systems represented by numerical simulations. The software sequentially identifies and runs simulation trials that it believes will be most informative given the results of previous trials. The results of new trials are incorporated into the software's model of the system behavior. The updated model is then used to pick the next round of new trials. This process, implemented as a closed-loop system wrapped around existing simulation code, provides a means to improve the speed and efficiency with which a set of simulations can yield scientifically useful results. The software focuses on the case in which the feedback from the simulation trials is binary-valued, i.e., the learner is only informed of the success or failure of the simulation trial to produce a desired output. The software offers a number of choices for the supervised learning algorithm (the method used to model the system behavior given the results so far) and a number of choices for the active learning strategy (the method used to choose which new simulation trials to run given the current behavior model). The software also makes use of the LEGION distributed computing framework to leverage the power of a set of compute nodes. The approach has been demonstrated on a planetary science application in which numerical simulations are used to study the formation of asteroid families.
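
    A hedged sketch of a directed-exploration loop with binary feedback, in the spirit of the approach described (this is not the SIM_EXPLORE code): a classifier models trial success or failure, and the next simulation run is the candidate point the model is least certain about. The toy "simulator" here is just a unit-disk membership test.

    ```python
    # Directed exploration with uncertainty sampling over a pool of candidate
    # simulation inputs; the simulator and all parameters are toy stand-ins.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(3)

    def simulate(x):                              # stand-in for an expensive run
        return int(x[0]**2 + x[1]**2 < 1.0)      # "success" region: unit disk

    candidates = rng.uniform(-2, 2, size=(500, 2))
    candidates[0], candidates[1] = (0.0, 0.0), (1.9, 1.9)  # seed both outcomes
    idx = [0, 1] + list(rng.choice(np.arange(2, len(candidates)), 8, replace=False))
    labels = {int(i): simulate(candidates[i]) for i in idx}  # seed trials

    for _ in range(40):
        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        clf.fit(candidates[list(labels)], list(labels.values()))
        pool = [i for i in range(len(candidates)) if i not in labels]
        proba = clf.predict_proba(candidates[pool])[:, 1]
        nxt = pool[int(np.argmin(np.abs(proba - 0.5)))]   # most uncertain point
        labels[nxt] = simulate(candidates[nxt])           # run the "simulation"

    print(f"ran {len(labels)} trials, {sum(labels.values())} successes")
    ```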

  15. Improving Software Quality and Management Through Use of Service Level Agreements

    DTIC Science & Technology

    2005-03-01

    many who believe that the quality of the development process is the best predictor of software product quality (Fenton). Repeatable software processes... reduced errors per KLOC for small projects (Fenton), and the quality management metric (QMM) (Machniak, Osmundson). There are also numerous IEEE 14... attention to cosmetic user interface issues and any problems that may arise with the prototype (Sawyer). The validation process is also another check

  16. Evaluation and Validation (E&V) Team Public Report. Volume 5

    DTIC Science & Technology

    1990-10-31

    aspects, software engineering practices, etc. The E&V requirements which are developed will be used to guide the E&V technical effort. The currently... interoperability of Ada software engineering environment tools and data. The scope of the CAIS-A includes the functionality affecting transportability that is... requirement that they be CAIS-conforming tools or data. That is, for example, numerous CIVC data exist on special-purpose software currently available

  17. NASA/NBS (National Aeronautics and Space Administration/National Bureau of Standards) standard reference model for telerobot control system architecture (NASREM)

    NASA Technical Reports Server (NTRS)

    Albus, James S.; Mccain, Harry G.; Lumia, Ronald

    1989-01-01

    The document describes the NASA Standard Reference Model (NASREM) architecture for the Space Station telerobot control system. It defines the functional requirements and high-level specifications of the control system for the NASA Space Station, and serves as the document for the functional specification, and as a guideline for the development of the control system architecture, of the IOC Flight Telerobot Servicer. The NASREM telerobot control system architecture defines a set of standard modules and interfaces which facilitate software design, development, validation, and test, and make possible the integration of telerobotics software from a wide variety of sources. Standard interfaces also provide the software hooks necessary to incrementally upgrade future Flight Telerobot Systems as new capabilities develop in computer science, robotics, and autonomous system control.

  18. NHPP-Based Software Reliability Models Using Equilibrium Distribution

    NASA Astrophysics Data System (ADS)

    Xiao, Xiao; Okamura, Hiroyuki; Dohi, Tadashi

    Non-homogeneous Poisson processes (NHPPs) have gained much popularity in actual software testing phases to estimate the software reliability, the number of remaining faults in software and the software release timing. In this paper, we propose a new modeling approach for the NHPP-based software reliability models (SRMs) to describe the stochastic behavior of software fault-detection processes. The fundamental idea is to apply the equilibrium distribution to the fault-detection time distribution in NHPP-based modeling. We also develop efficient parameter estimation procedures for the proposed NHPP-based SRMs. Through numerical experiments, it can be concluded that the proposed NHPP-based SRMs outperform the existing ones in many data sets from the perspective of goodness-of-fit and prediction performance.
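
    For context, a sketch of ordinary NHPP-based reliability modeling with the classic Goel-Okumoto mean value function m(t) = a(1 - e^{-bt}), fitted to made-up cumulative fault counts; the equilibrium-distribution models proposed in the paper refine this basic setup.

    ```python
    # Fit the Goel-Okumoto NHPP mean value function to cumulative fault counts.
    # The weekly fault data are invented for illustration.
    import numpy as np
    from scipy.optimize import curve_fit

    def mean_value(t, a, b):
        # a: expected total number of faults; b: fault detection rate
        return a * (1.0 - np.exp(-b * t))

    weeks = np.arange(1, 13)
    cum_faults = np.array([12, 21, 28, 34, 39, 42, 45, 47, 48, 49, 50, 50])

    (a, b), _ = curve_fit(mean_value, weeks, cum_faults, p0=(60.0, 0.2))
    remaining = a - cum_faults[-1]
    print(f"expected total faults a={a:.1f}, detection rate b={b:.3f}, "
          f"estimated remaining faults={remaining:.1f}")
    ```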

  19. Gis-Based Smart Cartography Using 3d Modeling

    NASA Astrophysics Data System (ADS)

    Malinverni, E. S.; Tassetti, A. N.

    2013-08-01

    3D city models have evolved into important tools for urban decision processes and information systems, especially in planning, simulation, analysis, documentation and heritage management. On the other hand, the numerical cartography currently in use is often unsuitable for GIS because it is not geometrically and topologically correctly structured. The aim of this research is to structure and organize numerical cartography in 3D for GIS use and to turn it into standardized CityGML features. The work is framed around a first phase of methodological analysis aimed at identifying which existing standards (such as ISO and OGC rules) can be used to improve the quality requirements of a cartographic structure. Subsequently, starting from these technical specifications, the translation into formal content was investigated using a proprietary interchange software (SketchUp) to support guideline implementations for generating a 3D GIS structured in GML3. A test three-dimensional numerical cartography (scale 1:500, generated from range data captured by a 3D laser scanner) was therefore prepared, checked for quality against the above standards, and edited where necessary. CAD files and shapefiles are converted into a final 3D model (Google SketchUp model) and then exported into a 3D city model (CityGML LoD1/LoD2). The 3D GIS structure has been managed in a GIS environment to run further spatial analyses and energy performance estimates that are not achievable in a 2D environment. In particular, geometrical building parameters (footprint, volume, etc.) are computed, and building envelope thermal characteristics are derived from them. Lastly, a simulation dealing with asbestos and home renovation charges is carried out to show how the built 3D city model can support municipal managers with risk diagnosis of the present situation and the development of strategies for sustainable redevelopment.
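    A toy illustration of the building metrics mentioned above, assuming an LoD1-style block model (a footprint polygon extruded to a uniform height); the polygon and height are hypothetical:

```python
# Footprint area via the shoelace formula, and LoD1 block-model volume.
from typing import List, Tuple

def footprint_area(poly: List[Tuple[float, float]]) -> float:
    """Planar polygon area from ordered vertices (shoelace formula)."""
    s = 0.0
    for (x1, y1), (x2, y2) in zip(poly, poly[1:] + poly[:1]):
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

def lod1_volume(poly: List[Tuple[float, float]], height: float) -> float:
    """Volume of an LoD1 block model: footprint area times building height."""
    return footprint_area(poly) * height

building = [(0.0, 0.0), (12.0, 0.0), (12.0, 8.0), (0.0, 8.0)]  # metres
print(footprint_area(building), lod1_volume(building, 6.5))    # 96.0  624.0
```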

  20. 78 FR 63135 - Airworthiness Directives; The Boeing Company Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-23

    ... system. The NPRM also proposed to require replacing the operational program software of certain... condition would not be adequately addressed by the proposed action. The manufacturer has issued new service... software of certain indicating/recording systems. The NPRM was prompted by numerous operator reports of...

  1. Numerical simulation of nonequilibrium flows by using the state-to-state approach in commercial software

    NASA Astrophysics Data System (ADS)

    Kunova, O. V.; Shoev, G. V.; Kudryavtsev, A. N.

    2017-01-01

    Nonequilibrium flows of a two-component oxygen mixture O2/O behind a shock wave are studied with due allowance for state-to-state vibrational and chemical kinetics. The system of gas-dynamic equations is supplemented with kinetic equations including contributions of VT (TV) exchange and dissociation processes. A method for the numerical solution of this system with the ANSYS Fluent commercial software package is proposed, used in combination with the authors' code that takes the nonequilibrium kinetics into account. The computed results are compared with parameters obtained by solving the problem in the shock-fitting formulation. The vibrational temperature is compared with experimental data. The numerical tool proposed in the present paper is applied to study the flow around a cylinder.
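    A drastically simplified sketch of post-shock vibrational nonequilibrium, assuming a single Landau-Teller relaxation equation in place of the paper's full state-to-state system; the temperature, relaxation time and energy scaling below are illustrative only:

```python
# Landau-Teller relaxation of vibrational energy toward equilibrium.
import numpy as np
from scipy.integrate import solve_ivp

THETA_V = 2256.0  # characteristic vibrational temperature of O2, K

def e_vib(T):
    """Harmonic-oscillator vibrational energy (in temperature units)."""
    return THETA_V / (np.exp(THETA_V / T) - 1.0)

def rhs(t, y, T_trans=8000.0, tau=1e-6):
    # de_v/dt = (e_eq(T) - e_v) / tau   (Landau-Teller form)
    return (e_vib(T_trans) - y) / tau

sol = solve_ivp(rhs, (0.0, 1e-5), [e_vib(300.0)], max_step=1e-7)
print(sol.y[0, -1], e_vib(8000.0))  # relaxes toward the equilibrium value
```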

  2. [Numerical simulation and operation optimization of biological filter].

    PubMed

    Zou, Zong-Sen; Shi, Han-Chang; Chen, Xiang-Qiang; Xie, Xiao-Qing

    2014-12-01

    BioWin software and two sensitivity analysis methods were used to simulate the Denitrification Biological Filter (DNBF) + Biological Aerated Filter (BAF) process in the Yuandang Wastewater Treatment Plant. Based on the BioWin model of the DNBF + BAF process, operating data from September 2013 were used for sensitivity analysis and model calibration, and operating data from October 2013 were used for model validation. The results indicated that the calibrated model could accurately simulate the practical DNBF + BAF process, and that the most sensitive parameters were those related to biofilm, OHOs and aeration. After calibration and validation, the model was used for process optimization by simulating operating results under different conditions. The results showed that the best operating condition for discharge standard B was: reflux ratio = 50%, methanol addition ceased, influent C/N = 4.43; while the best operating condition for discharge standard A was: reflux ratio = 50%, influent COD = 155 mg x L(-1) after methanol addition, influent C/N = 5.10.
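    BioWin itself is closed commercial software, but one-at-a-time (OAT) sensitivity screening of the kind described is generic; a hedged sketch with a stand-in model function and invented parameter values:

```python
# One-at-a-time sensitivity: perturb each parameter by +10% and compare.
def model(params):
    # Hypothetical effluent response; NOT the BioWin model.
    return 10.0 / params["reflux_ratio"] + 2.0 * params["c_to_n"]

base = {"reflux_ratio": 0.5, "c_to_n": 4.43}
y0 = model(base)
for name in base:
    perturbed = dict(base)
    perturbed[name] *= 1.10
    s = (model(perturbed) - y0) / (0.10 * y0)  # normalized sensitivity
    print(f"{name}: relative sensitivity {s:+.2f}")
```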

  3. The Effective Use of System and Software Architecture Standards for Software Technology Readiness Assessments

    DTIC Science & Technology

    2011-05-01

    IEC 42010 Technology Viewpoint • Case Study – Multimedia Conferencing System – Technology Specification • Risks of Software TRL Determination...fully support the required threshold functionality. • Relevant Environment for Space* – A satellite from launch to standard operation in space is...Analytical and experimental critical function and/or characteristic proof of concept, TRL 4, TRL 3; Technology concept and/or application

  4. The role of open-source software in innovation and standardization in radiology.

    PubMed

    Erickson, Bradley J; Langer, Steve; Nagy, Paul

    2005-11-01

    The use of open-source software (OSS), in which developers release the source code to applications they have developed, is popular in the software industry. This is done to allow others to modify and improve software (which may or may not be shared back to the community) and to allow others to learn from the software. Radiology was an early participant in this model, supporting OSS that implemented the ACR-National Electrical Manufacturers Association (now Digital Imaging and Communications in Medicine) standard for medical image communications. In radiology and in other fields, OSS has promoted innovation and the adoption of standards. Popular OSS is of high quality because access to source code allows many people to identify and resolve errors. Open-source software is analogous to the peer-review scientific process: one must be able to see and reproduce results to understand and promote what is shared. The authors emphasize that support for OSS need not threaten vendors; most vendors embrace and benefit from standards. Open-source development does not replace vendors but more clearly defines their roles, typically focusing on areas in which proprietary differentiators benefit customers and on professional services such as implementation planning and service. Continued support for OSS is essential for the success of our field.

  5. Scilab software package for the study of dynamical systems

    NASA Astrophysics Data System (ADS)

    Bordeianu, C. C.; Beşliu, C.; Jipa, Al.; Felea, D.; Grossu, I. V.

    2008-05-01

    This work presents a new software package for the study of chaotic flows and maps. The codes were written using Scilab, a software package for numerical computations providing a powerful open computing environment for engineering and scientific applications. It was found that Scilab provides various functions for ordinary differential equation solving, Fast Fourier Transform, autocorrelation, and excellent 2D and 3D graphical capabilities. The chaotic behaviors of the nonlinear dynamics systems were analyzed using phase-space maps, autocorrelation functions, power spectra, Lyapunov exponents and Kolmogorov-Sinai entropy. Various well known examples are implemented, with the capability of the users inserting their own ODE.
    Program summary
    Program title: Chaos
    Catalogue identifier: AEAP_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEAP_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 885
    No. of bytes in distributed program, including test data, etc.: 5925
    Distribution format: tar.gz
    Programming language: Scilab 3.1.1
    Computer: PC-compatible running Scilab on MS Windows or Linux
    Operating system: Windows XP, Linux
    RAM: below 100 Megabytes
    Classification: 6.2
    Nature of problem: Any physical model containing linear or nonlinear ordinary differential equations (ODE).
    Solution method: Numerical solving of ordinary differential equations. The chaotic behavior of the nonlinear dynamical system is analyzed using Poincaré sections, phase-space maps, autocorrelation functions, power spectra, Lyapunov exponents and Kolmogorov-Sinai entropies.
    Restrictions: The package routines are normally able to handle ODE systems of high orders (up to order twelve and possibly higher), depending on the nature of the problem.
    Running time: 10 to 20 seconds for problems that do not involve Lyapunov exponents calculation; 60 to 1000 seconds for problems that involve high order ODEs and Lyapunov exponents calculation.
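    As an independent illustration of one analysis listed above (not code from the package), a minimal Benettin-style two-trajectory estimate of the largest Lyapunov exponent, here for the Lorenz system:

```python
# Benettin method: evolve two nearby trajectories, repeatedly renormalize
# their separation, and average the logarithmic growth rate.
import numpy as np
from scipy.integrate import solve_ivp

def lorenz(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

d0, dt, n = 1e-8, 0.5, 400
a = np.array([1.0, 1.0, 20.0])
b = a + np.array([d0, 0.0, 0.0])
log_growth = 0.0
for _ in range(n):
    a = solve_ivp(lorenz, (0, dt), a, rtol=1e-9, atol=1e-12).y[:, -1]
    b = solve_ivp(lorenz, (0, dt), b, rtol=1e-9, atol=1e-12).y[:, -1]
    d = np.linalg.norm(b - a)
    log_growth += np.log(d / d0)
    b = a + (b - a) * (d0 / d)          # renormalize the separation
print("largest Lyapunov exponent ~", log_growth / (n * dt))  # ~0.9 expected
```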

  6. CALIPSO Data Read Software

    Atmospheric Science Data Center

    2018-06-14

    CALIPSO Data Read Software: callable routines in Interactive Data Language (IDL) provide basic read access to CALIPSO science data files. Release 4.30 (PDF). Standard data sets: LIDAR L1: CAL_LID_L1-Standard-V4-10; LIDAR L2: ...

  7. Numerical study of rice husk and coal co-combustion characteristics in a circulating fluidized bed

    NASA Astrophysics Data System (ADS)

    Wang, Zuomin; Li, Jiuru

    2018-02-01

    This paper discusses the feasibility of co-combusting coal and rice husk. Using ICEM software, a two-dimensional model of the riser of a circulating fluidized bed test rig was established. Using Fluent software, numerical simulations were carried out for the combustion reaction of different proportions of rice husk mixed with coal. The results show that, as the rice husk fraction increases, both the combustion temperature and the amount of nitrogen oxides decrease, and the effect gradually weakens. In this simulation, a rice husk fraction of about 30% is a reasonable proportion.

  8. Numerical modeling of interaction of the aircraft engine with concrete protective structures

    NASA Astrophysics Data System (ADS)

    Radchenko, P. A.; Batuev, S. P.; Radchenko, A. V.; Plevkov, V. S.

    2018-01-01

    The paper presents numerical modeling results on the interaction of a Boeing 747 aircraft engine with the protective shell of a nuclear power station. The protective shell is modeled as a reinforced concrete structure with a complex reinforcement scheme. The engine is simulated by a cylindrical projectile made of titanium alloy. The impact velocity is 180 m/s. The three-dimensional simulation is solved by the finite element method using the authors' own software package EFES. Fracture and fragmentation of materials are considered in the calculations. The software has been assessed for use in calculations of multiple-contact problems.

  9. Software Design Document SAF Workstation. Volume 1, Sections 1.0 - 2.4. 3.4.86

    DTIC Science & Technology

    1991-06-01

    Software Design Document for the SAF Workstation CSCI (CSCI 6), Volume 1 of 2, Sections 1.0 - 2.4.3.4.86, June 1991. Approved for public release; distribution unlimited.

  10. Element Load Data Processor (ELDAP) Users Manual

    NASA Technical Reports Server (NTRS)

    Ramsey, John K., Jr.; Ramsey, John K., Sr.

    2015-01-01

    Often, the shear and tensile forces and moments are extracted from finite element analyses to be used in off-line calculations for evaluating the integrity of structural connections involving bolts, rivets, and welds. Usually the maximum forces and moments are desired for use in the calculations. In situations where there are numerous structural connections of interest for numerous load cases, the effort in finding the true maximum force and/or moment combinations among all fasteners and welds and load cases becomes difficult. The Element Load Data Processor (ELDAP) software described herein makes this effort manageable. This software eliminates the possibility of overlooking the worst-case forces and moments that could result in erroneous positive margins of safety and/or selecting inconsistent combinations of forces and moments resulting in false negative margins of safety. In addition to forces and moments, any scalar quantity output in a PATRAN report file may be evaluated with this software. This software was originally written to fill an urgent need during the structural analysis of the Ares I-X Interstage segment. As such, this software was coded in a straightforward manner with no effort made to optimize or minimize code or to develop a graphical user interface.
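    The worst-case bookkeeping described above can be illustrated on a small, hypothetical table of element loads (the real tool parses PATRAN report files; the CSV layout and column names below are invented for the sketch):

```python
# For each element, keep the load case that produced the peak moment, so
# the reported shear/tension/moment combination stays physically consistent.
import pandas as pd

df = pd.read_csv("element_loads.csv")  # columns: element, load_case,
                                       #          shear, tension, moment
idx = df.groupby("element")["moment"].idxmax()
worst = df.loc[idx, ["element", "load_case", "shear", "tension", "moment"]]
print(worst.to_string(index=False))
```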

  11. Numerical research of the optimal control problem in the semi-Markov inventory model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorshenin, Andrey K.; Belousov, Vasily V.; Shnourkoff, Peter V.

    2015-03-10

    This paper is devoted to the numerical simulation of a stochastic inventory management system using a controlled semi-Markov process. Results obtained with special software for studying the system and finding the optimal control are presented.

  12. ESML for Earth Science Data Sets and Analysis

    NASA Technical Reports Server (NTRS)

    Graves, Sara; Ramachandran, Rahul

    2003-01-01

    The primary objective of this research project was to transition ESML from design to application. The resulting schema and prototype software will foster community acceptance for the Define once, use anywhere concept central to ESML. Supporting goals include: 1) Refinement of the ESML schema and software libraries in cooperation with the user community; 2) Application of the ESML schema and software to a variety of Earth science data sets and analysis tools; 3) Development of supporting prototype software for enhanced ease of use; 4) Cooperation with standards bodies in order to assure ESML is aligned with related metadata standards as appropriate; and 5) Widespread publication of the ESML approach, schema, and software.

  13. Earth Science Markup Language: Transitioning From Design to Application

    NASA Technical Reports Server (NTRS)

    Moe, Karen; Graves, Sara; Ramachandran, Rahul

    2002-01-01

    The primary objective of the proposed Earth Science Markup Language (ESML) research is to transition from design to application. The resulting schema and prototype software will foster community acceptance for the "define once, use anywhere" concept central to ESML. Supporting goals include: 1. Refinement of the ESML schema and software libraries in cooperation with the user community. 2. Application of the ESML schema and software libraries to a variety of Earth science data sets and analysis tools. 3. Development of supporting prototype software for enhanced ease of use. 4. Cooperation with standards bodies in order to assure ESML is aligned with related metadata standards as appropriate. 5. Widespread publication of the ESML approach, schema, and software.

  14. The Future of Software Engineering for High Performance Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pope, G

    DOE ASCR requested that from May through mid-July 2015 a study group identify issues and recommend solutions from a software engineering perspective for transitioning into the next generation of High Performance Computing. The approach used was to ask some of the DOE complex experts who will be responsible for doing this work to contribute to the study group. The technique used was to solicit elevator speeches: a short and concise write up done as if the author was a speaker with only a few minutes to convince a decision maker of their top issues. Pages 2-18 contain the original texts of the contributed elevator speeches and end notes identifying the 20 contributors. The study group also ranked the importance of each topic, and those scores are displayed with each topic heading. A perfect score (and highest priority) is three, two is medium priority, and one is lowest priority. The highest scoring topic areas were software engineering and testing resources; the lowest scoring area was compliance to DOE standards. The following two paragraphs are an elevator speech summarizing the contributed elevator speeches. Each sentence or phrase in the summary is hyperlinked to its source via a numeral embedded in the text. A risk one liner has also been added to each topic to allow future risk tracking and mitigation.

  15. [Development of analysis software package for the two kinds of Japanese fluoro-d-glucose-positron emission tomography guideline].

    PubMed

    Matsumoto, Keiichi; Endo, Keigo

    2013-06-01

    Two kinds of Japanese guidelines for the data acquisition protocol of oncology fluoro-D-glucose-positron emission tomography (FDG-PET)/computed tomography (CT) scans were created by the joint task force of the Japanese Society of Nuclear Medicine Technology (JSNMT) and the Japanese Society of Nuclear Medicine (JSNM), and published in Kakuigaku-Gijutsu 27(5): 425-456, 2007 and 29(2): 195-235, 2009. These guidelines aim to standardize PET image quality among facilities and different PET/CT scanner models. The objective of this study was to develop a personal computer-based performance measurement and image quality processor for the two kinds of Japanese guidelines for oncology (18)F-FDG PET/CT scans. We call this software package the "PET quality control tool" (PETquact). Microsoft Corporation's Windows(™) is used as the operating system for PETquact, which requires 1070×720 image resolution and includes 12 different applications. The accuracy of numerous PETquact applications was examined. For example, in the sensitivity application, the system sensitivity measurement results were equivalent when comparing two PET sinograms obtained from PETquact and from the report. PETquact is suited for analysis under the two kinds of Japanese guidelines, and it performs well in performance measurement and image quality analysis. PETquact can be used at any facility if the software package is installed on a laptop computer.

  16. Computer-intensive simulation of solid-state NMR experiments using SIMPSON.

    PubMed

    Tošner, Zdeněk; Andersen, Rasmus; Stevensson, Baltzar; Edén, Mattias; Nielsen, Niels Chr; Vosegaard, Thomas

    2014-09-01

    Conducting large-scale solid-state NMR simulations requires fast computer software, potentially in combination with efficient computational resources, to complete within a reasonable time frame. Such simulations may involve large spin systems, multiple-parameter fitting of experimental spectra, or multiple-pulse experiment design using parameter scans, non-linear optimization, or optimal control procedures. To efficiently accommodate such simulations, we here present an improved version of the widely distributed open-source SIMPSON NMR simulation software package adapted to contemporary high performance hardware setups. The software is optimized for fast performance on standard stand-alone computers, multi-core processors, and large clusters of identical nodes. We describe the novel features for fast computation including internal matrix manipulations, propagator setups and acquisition strategies. For efficient calculation of powder averages, we implemented the interpolation method of Alderman, Solum, and Grant, as well as the recently introduced fast Wigner transform interpolation technique. The potential of the optimal control toolbox is greatly enhanced by higher precision gradients in combination with the efficient optimization algorithm known as limited-memory Broyden-Fletcher-Goldfarb-Shanno. In addition, advanced parallelization can be used in all types of calculations, providing significant time reductions. SIMPSON thus reflects current knowledge in the field of numerical simulations of solid-state NMR experiments. The efficiency and novel features are demonstrated on representative simulations. Copyright © 2014 Elsevier Inc. All rights reserved.

  17. A computerized system to measure and predict air quality for emission control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crooks, G.; Ciccone, A.; Frattolillo, P.

    1997-12-31

    A Supplementary Emission Control (SEC) system has been developed on behalf of the Association Industrielle de l'Est de Montreal (AIEM). The objective of the SEC is to avoid exceedences of the Montreal Urban Community (MUC) 24 hour ambient Air Quality Standard (AQS) for sulphur dioxide in the industrial East Montreal area. The SEC system is comprised of: 3 continuous SO2 monitoring stations with data loggers and remote communications; a meteorological tower with data logger and modem for acquiring local meteorology; communications with Environment Canada to download meteorological forecast data; a polling PC for data retrieval; and Windows NT based software running on the AIEM computer server. The SEC software utilizes relational databases to store and maintain measured SO2 concentration data, emission data, as well as observed and forecast meteorological data. The SEC system automatically executes a numerical dispersion model to forecast SO2 concentrations up to six hours in the future. Based on measured SO2 concentrations at the monitoring stations and the six hour forecast concentrations, the system determines if local sources should reduce their emission levels to avoid potential exceedences of the AQS. The SEC system also includes a Graphical User Interface (GUI) for user access to the system. The SEC system and software are described, and the accuracy of the system at forecasting SO2 concentrations is examined.
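    The decision rule described above reduces to a simple threshold check; a hedged sketch (the threshold value and margin are illustrative, not the AIEM system's actual logic):

```python
# Flag a reduction request when the measured 24-h average or any of the
# six forecast hours approaches the ambient standard.
AQS_24H = 0.020  # hypothetical 24-h SO2 standard, ppm
MARGIN = 0.9     # act before the standard itself is reached

def needs_reduction(measured_24h_avg, forecast_6h):
    return (measured_24h_avg > MARGIN * AQS_24H
            or max(forecast_6h) > MARGIN * AQS_24H)

print(needs_reduction(0.012, [0.015, 0.019, 0.021, 0.018, 0.016, 0.014]))
```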

  18. Exciting Normal Distribution

    ERIC Educational Resources Information Center

    Fuchs, Karl Josef; Simonovits, Reinhard; Thaller, Bernd

    2008-01-01

    This paper describes a high school project where the mathematics teaching and learning software M@th Desktop (MD) based on the Computer Algebra System Mathematica was used for symbolical and numerical calculations and for visualisation. The mathematics teaching and learning software M@th Desktop 2.0 (MD) contains the modules Basics including tools…

  19. Analyzing Population Genetics Data: A Comparison of the Software

    USDA-ARS?s Scientific Manuscript database

    Choosing a software program for analyzing population genetic data can be a challenge without prior knowledge of the methods used by each program. There are numerous web sites listing programs by type of data analyzed, type of analyses performed, or other criteria. Even with programs categorized in ...

  20. Geometry + Technology = Proof

    ERIC Educational Resources Information Center

    Lyublinskaya, Irina; Funsch, Dan

    2012-01-01

    Several interactive geometry software packages are available today to secondary school teachers. An example is The Geometer's Sketchpad[R] (GSP), also known as Dynamic Geometry[R] software, developed by Key Curriculum Press. This numeric based technology has been widely adopted in the last twenty years, and a vast amount of creativity has been…

  1. GridKit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peles, Slaven

    2016-11-06

    GridKit is a software development kit for interfacing power systems and power grid application software with high performance computing (HPC) libraries developed at national laboratories and in academia. It is also intended as an interoperability layer between different numerical libraries. GridKit is not a standalone application, but comes with a suite of test examples illustrating possible usage.

  2. Integrating model behavior, optimization, and sensitivity/uncertainty analysis: overview and application of the MOUSE software toolbox

    USDA-ARS?s Scientific Manuscript database

    This paper provides an overview of the Model Optimization, Uncertainty, and SEnsitivity Analysis (MOUSE) software application, an open-source, Java-based toolbox of visual and numerical analysis components for the evaluation of environmental models. MOUSE is based on the OPTAS model calibration syst...

  3. Standards guide for space and earth sciences computer software

    NASA Technical Reports Server (NTRS)

    Mason, G.; Chapman, R.; Klinglesmith, D.; Linnekin, J.; Putney, W.; Shaffer, F.; Dapice, R.

    1972-01-01

    Guidelines for the preparation of systems analysis and programming work statements are presented. The data is geared toward the efficient administration of available monetary and equipment resources. Language standards and the application of good management techniques to software development are emphasized.

  4. INTERFACING SAS TO ORACLE IN THE UNIX ENVIRONMENT

    EPA Science Inventory

    SAS is an EPA standard data and statistical analysis software package, while ORACLE is EPA's standard data base management system software package. ORACLE has the advantage over SAS in data retrieval and storage capabilities but has limited data and statistical analysis capability....

  5. Equivalent circuit modeling of a piezo-patch energy harvester on a thin plate with AC-DC conversion

    NASA Astrophysics Data System (ADS)

    Bayik, B.; Aghakhani, A.; Basdogan, I.; Erturk, A.

    2016-05-01

    As an alternative to beam-like structures, piezoelectric patch-based energy harvesters attached to thin plates can be readily integrated to plate-like structures in automotive, marine, and aerospace applications, in order to directly exploit structural vibration modes of the host system without mass loading and volumetric occupancy of cantilever attachments. In this paper, a multi-mode equivalent circuit model of a piezo-patch energy harvester integrated to a thin plate is developed and coupled with a standard AC-DC conversion circuit. Equivalent circuit parameters are obtained in two different ways: (1) from the modal analysis solution of a distributed-parameter analytical model and (2) from the finite-element numerical model of the harvester by accounting for two-way coupling. After the analytical modeling effort, multi-mode equivalent circuit representation of the harvester is obtained via electronic circuit simulation software SPICE. Using the SPICE software, electromechanical response of the piezoelectric energy harvester connected to linear and nonlinear circuit elements are computed. Simulation results are validated for the standard AC-AC and AC-DC configurations. For the AC input-AC output problem, voltage frequency response functions are calculated for various resistive loads, and they show excellent agreement with modal analysis-based analytical closed-form solution and with the finite-element model. For the standard ideal AC input-DC output case, a full-wave rectifier and a smoothing capacitor are added to the harvester circuit for conversion of the AC voltage to a stable DC voltage, which is also validated against an existing solution by treating the single-mode plate dynamics as a single-degree-of-freedom system.
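    The single-mode core of such a model can be written down compactly; a hedged sketch (lumped parameters and values are illustrative, and this is not the paper's multi-mode SPICE circuit):

```python
# Voltage FRF of a single-mode piezoelectric harvester with a resistive
# load: Zm*X - theta*V = F and Ye*V + j*w*theta*X = 0; eliminating X
# gives V/F = -j*w*theta / (Zm*Ye + j*w*theta**2).
import numpy as np

m, c, k = 0.01, 0.05, 4.0e4         # kg, N.s/m, N/m   (hypothetical)
theta, Cp, R = 1e-3, 50e-9, 1e5     # N/V, F, ohm      (hypothetical)

w = 2 * np.pi * np.linspace(100.0, 600.0, 2000)   # rad/s sweep
Zm = -w**2 * m + 1j * w * c + k                   # mechanical impedance term
Ye = 1j * w * Cp + 1.0 / R                        # electrical admittance
V_per_F = -1j * w * theta / (Zm * Ye + 1j * w * theta**2)
print(abs(V_per_F).max())                         # peak of the voltage FRF
```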

  6. Making the Business Case for Software Assurance

    DTIC Science & Technology

    2009-04-01

    and Capability dEtermination (SPICE), ISO/IEC 15504, 1998. [ISO 2007] International Organization for Standardization. "ISO/IEC 27001 & 27002 ...Implementing the Process Areas 6.2.7 Differences Between the CMMI and Software CMM Process Areas 6.3 The CMMI Appraisal Process 6.4 Adapting ISO 15504 to...Secure Software Assurance 6.4.1 Assessment and the Secure Life Cycle 6.4.2 ISO 15504 Capability Levels 6.5 Adapting the ISO/IEC 21287 Standard Approach to

  7. Final Report of the NASA Office of Safety and Mission Assurance Agile Benchmarking Team

    NASA Technical Reports Server (NTRS)

    Wetherholt, Martha

    2016-01-01

    To ensure that the NASA Safety and Mission Assurance (SMA) community remains in a position to perform reliable Software Assurance (SA) on NASAs critical software (SW) systems with the software industry rapidly transitioning from waterfall to Agile processes, Terry Wilcutt, Chief, Safety and Mission Assurance, Office of Safety and Mission Assurance (OSMA) established the Agile Benchmarking Team (ABT). The Team's tasks were: 1. Research background literature on current Agile processes, 2. Perform benchmark activities with other organizations that are involved in software Agile processes to determine best practices, 3. Collect information on Agile-developed systems to enable improvements to the current NASA standards and processes to enhance their ability to perform reliable software assurance on NASA Agile-developed systems, 4. Suggest additional guidance and recommendations for updates to those standards and processes, as needed. The ABT's findings and recommendations for software management, engineering and software assurance are addressed herein.

  8. Data storage technology: Hardware and software, Appendix B

    NASA Technical Reports Server (NTRS)

    Sable, J. D.

    1972-01-01

    This project involves the development of more economical ways of integrating and interfacing new storage devices and data processing programs into a computer system. It involves developing interface standards and a software/hardware architecture which will make it possible to develop machine-independent devices and programs. These will interface with the machine-dependent operating systems of particular computers. The development project will not develop the software which would ordinarily be the responsibility of the manufacturer to supply, but will develop the standards with which that software is expected to conform in providing an interface with the user or storage system.

  9. The integration of the risk management process with the lifecycle of medical device software.

    PubMed

    Pecoraro, F; Luzi, D

    2014-01-01

    The application of software in the Medical Device (MD) domain has become central to the improvement of diagnoses and treatments. The new European regulations that specifically address software as an important component of MD, require complex procedures to make software compliant with safety requirements, introducing thereby new challenges in the qualification and classification of MD software as well as in the performance of risk management activities. Under this perspective, the aim of this paper is to propose an integrated framework that combines the activities to be carried out by the manufacturer to develop safe software within the development lifecycle based on the regulatory requirements reported in US and European regulations as well as in the relevant standards and guidelines. A comparative analysis was carried out to identify the main issues related to the application of the current new regulations. In addition, standards and guidelines recently released to harmonise procedures for the validation of MD software have been used to define the risk management activities to be carried out by the manufacturer during the software development process. This paper highlights the main issues related to the qualification and classification of MD software, providing an analysis of the different regulations applied in Europe and the US. A model that integrates the risk management process within the software development lifecycle has been proposed too. It is based on regulatory requirements and considers software risk analysis as a central input to be managed by the manufacturer already at the initial stages of the software design, in order to prevent MD failures. Relevant changes in the process of MD development have been introduced with the recognition of software being an important component of MDs as stated in regulations and standards. This implies the performance of highly iterative processes that have to integrate the risk management in the framework of software development. It also makes it necessary to involve both medical and software engineering competences to safeguard patient and user safety.

  10. Toward Software Both Seen and Heard.

    ERIC Educational Resources Information Center

    Lazzaro, Joseph J.

    1996-01-01

    Visually impaired users are hampered by current PC software written for graphical user interfaces. Screen readers that vocalize displayed text require standardization that remains missing in the programming industry; the readers cannot interpret many cues in the Windows environment. More programming standards and adaptive technology for computers…

  11. Making tomorrow's mistakes today: Evolutionary prototyping for risk reduction and shorter development time

    NASA Astrophysics Data System (ADS)

    Friedman, Gary; Schwuttke, Ursula M.; Burliegh, Scott; Chow, Sanguan; Parlier, Randy; Lee, Lorrine; Castro, Henry; Gersbach, Jim

    1993-03-01

    In the early days of JPL's solar system exploration, each spacecraft mission required its own dedicated data system with all software applications written in the mainframe's native assembly language. Although these early telemetry processing systems were a triumph of engineering in their day, since that time the computer industry has advanced to the point where it is now advantageous to replace these systems with more modern technology. The Space Flight Operations Center (SFOC) Prototype group was established in 1985 as a workstation and software laboratory. The charter of the lab was to determine if it was possible to construct a multimission telemetry processing system using commercial, off-the-shelf computers that communicated via networks. The staff of the lab mirrored that of a typical skunk works operation -- a small, multi-disciplinary team with a great deal of autonomy that could get complex tasks done quickly. In an effort to determine which approaches would be useful, the prototype group experimented with all types of operating systems, inter-process communication mechanisms, network protocols, packet size parameters. Out of that pioneering work came the confidence that a multi-mission telemetry processing system could be built using high-level languages running in a heterogeneous, networked workstation environment. Experience revealed that the operating systems on all nodes should be similar (i.e., all VMS or all PC-DOS or all UNIX), and that a unique Data Transport Subsystem tool needed to be built to address the incompatibilities of network standards, byte ordering, and socket buffering. The advantages of building a telemetry processing system based on emerging industry standards were numerous: by employing these standards, we would no longer be locked into a single vendor. When new technology came to market which offered ten times the performance at one eighth the cost, it would be possible to attach the new machine to the network, re-compile the application code, and run. In addition, we would no longer be plagued with lack of manufacturer support when we encountered obscure bugs. And maybe, hopefully, the eternal elusive goal of software portability across different vendors' platforms would finally be available. Some highlights of our prototyping efforts are described.

  12. Making tomorrow's mistakes today: Evolutionary prototyping for risk reduction and shorter development time

    NASA Technical Reports Server (NTRS)

    Friedman, Gary; Schwuttke, Ursula M.; Burliegh, Scott; Chow, Sanguan; Parlier, Randy; Lee, Lorrine; Castro, Henry; Gersbach, Jim

    1993-01-01

    In the early days of JPL's solar system exploration, each spacecraft mission required its own dedicated data system with all software applications written in the mainframe's native assembly language. Although these early telemetry processing systems were a triumph of engineering in their day, since that time the computer industry has advanced to the point where it is now advantageous to replace these systems with more modern technology. The Space Flight Operations Center (SFOC) Prototype group was established in 1985 as a workstation and software laboratory. The charter of the lab was to determine if it was possible to construct a multimission telemetry processing system using commercial, off-the-shelf computers that communicated via networks. The staff of the lab mirrored that of a typical skunk works operation -- a small, multi-disciplinary team with a great deal of autonomy that could get complex tasks done quickly. In an effort to determine which approaches would be useful, the prototype group experimented with all types of operating systems, inter-process communication mechanisms, network protocols, packet size parameters. Out of that pioneering work came the confidence that a multi-mission telemetry processing system could be built using high-level languages running in a heterogeneous, networked workstation environment. Experience revealed that the operating systems on all nodes should be similar (i.e., all VMS or all PC-DOS or all UNIX), and that a unique Data Transport Subsystem tool needed to be built to address the incompatibilities of network standards, byte ordering, and socket buffering. The advantages of building a telemetry processing system based on emerging industry standards were numerous: by employing these standards, we would no longer be locked into a single vendor. When new technology came to market which offered ten times the performance at one eighth the cost, it would be possible to attach the new machine to the network, re-compile the application code, and run. In addition, we would no longer be plagued with lack of manufacturer support when we encountered obscure bugs. And maybe, hopefully, the eternal elusive goal of software portability across different vendors' platforms would finally be available. Some highlights of our prototyping efforts are described.

  13. Numerical prediction on the dispersion of pollutant particles

    NASA Astrophysics Data System (ADS)

    Osman, Kahar; Ali, Zairi; Ubaidullah, S.; Zahid, M. N.

    2012-06-01

    The increasing concern over air pollution has led people around the world to find more efficient ways to control the problem. Air dispersion modeling has proven to be one of the alternatives that provide an economical way to control the growing threat of air pollution. The objective of this research is to develop a practical numerical algorithm to predict the dispersion of pollutant particles around a specific emission source. The source selected was a rubber wood manufacturing plant. The Gaussian plume model was used as the air dispersion model because of its simplicity and generic applicability. Results of this study show that ground-level concentrations of the pollutant particles reached approximately 90 μg/m3, as cross-checked against other software. This value surpasses the limit of 50 μg/m3 stipulated by the National Ambient Air Quality Standard (NAAQS) and the Recommended Malaysian Guidelines (RMG) set by the Environment Department of Malaysia. The results also show higher pollutant particle concentrations during dry seasons than during rainy seasons. In general, the developed algorithm is proven able to predict particle distribution around an emission source with acceptable accuracy.
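    For reference, the textbook ground-level Gaussian plume concentration (with ground reflection) can be sketched as follows; the dispersion-coefficient power laws here are invented placeholders, not the paper's calibration:

```python
# C(x, y, 0) = Q/(pi*u*sy*sz) * exp(-y^2/(2 sy^2)) * exp(-H^2/(2 sz^2))
import numpy as np

def ground_conc(Q, u, x, y, H, a=0.08, p=0.9, b=0.06, q=0.85):
    """Ground-level concentration for a continuous point source at height H."""
    sy, sz = a * x**p, b * x**q     # hypothetical sigma-y, sigma-z curves
    return (Q / (np.pi * u * sy * sz)
            * np.exp(-y**2 / (2 * sy**2))
            * np.exp(-H**2 / (2 * sz**2)))

# e.g. a 5 g/s source in a 3 m/s wind, receptor 500 m downwind on the axis
print(ground_conc(Q=5.0, u=3.0, x=500.0, y=0.0, H=20.0) * 1e6, "ug/m3")
```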

  14. The numerical simulation based on CFD of hydraulic turbine pump

    NASA Astrophysics Data System (ADS)

    Duan, X. H.; Kong, F. Y.; Liu, Y. Y.; Zhao, R. J.; Hu, Q. L.

    2016-05-01

    Since a hydraulic turbine pump combines self-adjustment and mutual compensation between its turbine and pump, a CFD-based numerical analysis of its internal flow, mainly the pressure and velocity fields in the hydraulic turbine and the pump, is far-reaching. The three-dimensional models of the hydraulic turbine pump are made with Pro/Engineer software; the internal flow fields in the hydraulic turbine and the pump are simulated numerically with ANSYS CFX software. Based on the results of the numerical simulation at the design condition, the pressure and velocity fields in the hydraulic turbine and the pump are analyzed respectively. The findings show that the static pressure decreases steadily, with an obvious pressure gradient, in the flow area of the hydraulic turbine, while the static pressure increases gradually in the pump. The flow traces are regular in the suction chamber and flume, with no spiral traces. However, there are irregular traces in the turbine runner channels, in contrast to those in the flow area of the impeller. Most traces in the flow area of the draft tube are spiral.

  15. NOTE: Development of modified voxel phantoms for the numerical dosimetric reconstruction of radiological accidents involving external sources: implementation in SESAME tool

    NASA Astrophysics Data System (ADS)

    Courageot, Estelle; Sayah, Rima; Huet, Christelle

    2010-05-01

    Estimating the dose distribution in a victim's body is a relevant indicator in assessing biological damage from exposure in the event of a radiological accident caused by an external source. When the dose distribution is evaluated with a numerical anthropomorphic model, the posture and morphology of the victim have to be reproduced as realistically as possible. Several years ago, IRSN developed a specific software application, called the simulation of external source accident with medical images (SESAME), for the dosimetric reconstruction of radiological accidents by numerical simulation. This tool combines voxel geometry and the MCNP(X) Monte Carlo computer code for radiation-material interaction. This note presents a new functionality in this software that enables the modelling of a victim's posture and morphology based on non-uniform rational B-spline (NURBS) surfaces. The procedure for constructing the modified voxel phantoms is described, along with a numerical validation of this new functionality using a voxel phantom of the RANDO tissue-equivalent physical model.

  16. Development of modified voxel phantoms for the numerical dosimetric reconstruction of radiological accidents involving external sources: implementation in SESAME tool.

    PubMed

    Courageot, Estelle; Sayah, Rima; Huet, Christelle

    2010-05-07

    Estimating the dose distribution in a victim's body is a relevant indicator in assessing biological damage from exposure in the event of a radiological accident caused by an external source. When the dose distribution is evaluated with a numerical anthropomorphic model, the posture and morphology of the victim have to be reproduced as realistically as possible. Several years ago, IRSN developed a specific software application, called the simulation of external source accident with medical images (SESAME), for the dosimetric reconstruction of radiological accidents by numerical simulation. This tool combines voxel geometry and the MCNP(X) Monte Carlo computer code for radiation-material interaction. This note presents a new functionality in this software that enables the modelling of a victim's posture and morphology based on non-uniform rational B-spline (NURBS) surfaces. The procedure for constructing the modified voxel phantoms is described, along with a numerical validation of this new functionality using a voxel phantom of the RANDO tissue-equivalent physical model.

  17. Numerical analysis of eccentric orifice plate using ANSYS Fluent software

    NASA Astrophysics Data System (ADS)

    Zahariea, D.

    2016-11-01

    In this paper the eccentric orifice plate is qualitatively analysed in comparison with the classical concentric orifice plate from the point of view of the sedimentation tendency of solid particles in the fluid whose flow rate is measured. For this purpose, the numerical streamline patterns are compared for both orifice plates. The numerical analysis has been performed using ANSYS Fluent software. The methodology of the CFD analysis is presented: creating the 3D solid model, fluid domain extraction, meshing, boundary conditions, turbulence model, solving algorithm, convergence criterion, results and validation. Analysing the numerical streamlines, two circumferential regions of separated flow, upstream and downstream of the orifice plate, can be clearly observed for the concentric orifice plate. The bottom parts of these regions are where the solid particles could settle. On the other hand, for the eccentric orifice plate, the streamline pattern suggests that no sedimentation will occur, because there are no separated flows at the bottom area of the pipe.

  18. Python Open source Waveform ExtractoR (POWER): an open source, Python package to monitor and post-process numerical relativity simulations

    NASA Astrophysics Data System (ADS)

    Johnson, Daniel; Huerta, E. A.; Haas, Roland

    2018-01-01

    Numerical simulations of Einstein’s field equations provide unique insights into the physics of compact objects moving at relativistic speeds, and which are driven by strong gravitational interactions. Numerical relativity has played a key role to firmly establish gravitational wave astrophysics as a new field of research, and it is now paving the way to establish whether gravitational wave radiation emitted from compact binary mergers is accompanied by electromagnetic and astro-particle counterparts. As numerical relativity continues to blend in with routine gravitational wave data analyses to validate the discovery of gravitational wave events, it is essential to develop open source tools to streamline these studies. Motivated by our own experience as users and developers of the open source, community software, the Einstein Toolkit, we present an open source, Python package that is ideally suited to monitor and post-process the data products of numerical relativity simulations, and compute the gravitational wave strain at future null infinity in high performance environments. We showcase the application of this new package to post-process a large numerical relativity catalog and extract higher-order waveform modes from numerical relativity simulations of eccentric binary black hole mergers and neutron star mergers. This new software fills a critical void in the arsenal of tools provided by the Einstein Toolkit consortium to the numerical relativity community.

  19. Dynamic load synthesis for shock numerical simulation in space structure design

    NASA Astrophysics Data System (ADS)

    Monti, Riccardo; Gasbarri, Paolo

    2017-08-01

    Pyroshock loads are, from a mechanical point of view, the most stressing environments that space equipment experiences during its operating life. In general, the mechanical designer considers the pyroshock analysis a very demanding constraint. Unfortunately, due to the non-linear behaviour of the structure under such loads, only experimental tests can demonstrate whether it is able to withstand these dynamic loads. Taking all the previous considerations into account, some preliminary information about the correctness of the design can be obtained by performing "ad hoc" numerical simulations, for example via commercial finite element software (i.e. MSC Nastran). Usually these numerical tools approach the shock solution in two ways: 1) a direct mode, by using a time-dependent enforcement and by evaluating the time response and space response as well as the internal forces; 2) a modal-basis approach, by considering a frequency-dependent load and evaluating the internal forces in the frequency domain. The main aim of this paper is to develop a numerical tool to synthesize the time-dependent enforcement, based on deterministic and/or genetic algorithm optimisers. In particular, starting from a specified spectrum in terms of SRS (Shock Response Spectrum), a time-dependent discrete function, typically an acceleration profile, is obtained to force the equipment by simulating the shock event. The synthesis time and the interface with standard numerical codes are two of the main topics dealt with in the paper. In addition, a congruity and consistency methodology is presented to ensure that the identified time-dependent loads fully match the specified spectrum.
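    The SRS side of such a check is standard; a hedged sketch that evaluates the SRS of a candidate time history by driving a base-excited SDOF at each natural frequency and recording the peak absolute response (Q = 10, i.e. damping ratio 0.05; the trial pulse is invented):

```python
import numpy as np
from scipy.signal import lti, lsim

def srs(accel, t, freqs, zeta=0.05):
    """Peak absolute-acceleration response of an SDOF at each frequency."""
    peaks = []
    for fn in freqs:
        wn = 2 * np.pi * fn
        # absolute-acceleration transmissibility of a base-excited SDOF
        sdof = lti([2 * zeta * wn, wn**2], [1, 2 * zeta * wn, wn**2])
        _, resp, _ = lsim(sdof, accel, t)
        peaks.append(np.max(np.abs(resp)))
    return np.array(peaks)

t = np.linspace(0.0, 0.02, 4000)
accel = 500.0 * np.exp(-t / 0.004) * np.sin(2 * np.pi * 2000.0 * t)
freqs = np.logspace(2, 4, 30)          # 100 Hz to 10 kHz
print(srs(accel, t, freqs))            # compare against the specified SRS
```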

  20. USL/DBMS NASA/PC R and D project system testing standards

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Kavi, Srinu; Moreau, Dennis R.; Yan, Lin

    1984-01-01

    A set of system testing standards to be used in the development of all C software within the NASA/PC Research and Development Project is established. Testing will be considered in two phases: the program testing phase and the system testing phase. The objective of these standards is to provide guidelines for the planning and conduct of program and software system testing.

  1. Product specification documentation standard and Data Item Descriptions (DID). Volume of the information system life-cycle and documentation standards, volume 3

    NASA Technical Reports Server (NTRS)

    Callender, E. David; Steinbacher, Jody

    1989-01-01

    This is the third of five volumes on Information System Life-Cycle and Documentation Standards which present a well organized, easily used standard for providing technical information needed for developing information systems, components, and related processes. This volume states the Software Management and Assurance Program documentation standard for a product specification document and for data item descriptions. The framework can be applied to any NASA information system, software, hardware, operational procedures components, and related processes.

  2. Application of industry-standard guidelines for the validation of avionics software

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J.; Shagnea, Anita M.

    1990-01-01

    The application of industry standards to the development of avionics software is discussed, focusing on verification and validation activities. It is pointed out that the procedures that guide the avionics software development and testing process are under increased scrutiny. The DO-178A guidelines, Software Considerations in Airborne Systems and Equipment Certification, are used by the FAA for certifying avionics software. To investigate the effectiveness of the DO-178A guidelines for improving the quality of avionics software, guidance and control software (GCS) is being developed according to the DO-178A development method. It is noted that, due to the extent of the data collection and configuration management procedures, any phase in the life cycle of a GCS implementation can be reconstructed. Hence, a fundamental development and testing platform has been established that is suitable for investigating the adequacy of various software development processes. In particular, the overall effectiveness and efficiency of the development method recommended by the DO-178A guidelines are being closely examined.

  3. Principles underlying the design of "The Number Race", an adaptive computer game for remediation of dyscalculia.

    PubMed

    Wilson, Anna J; Dehaene, Stanislas; Pinel, Philippe; Revkin, Susannah K; Cohen, Laurent; Cohen, David

    2006-05-30

    Adaptive game software has been successful in remediation of dyslexia. Here we describe the cognitive and algorithmic principles underlying the development of similar software for dyscalculia. Our software is based on current understanding of the cerebral representation of number and the hypotheses that dyscalculia is due to a "core deficit" in number sense or in the link between number sense and symbolic number representations. "The Number Race" software trains children on an entertaining numerical comparison task, by presenting problems adapted to the performance level of the individual child. We report full mathematical specifications of the algorithm used, which relies on an internal model of the child's knowledge in a multidimensional "learning space" consisting of three difficulty dimensions: numerical distance, response deadline, and conceptual complexity (from non-symbolic numerosity processing to increasingly complex symbolic operations). The performance of the software was evaluated both by mathematical simulations and by five weeks of use by nine children with mathematical learning difficulties. The results indicate that the software adapts well to varying levels of initial knowledge and learning speeds. Feedback from children, parents and teachers was positive. A companion article describes the evolution of number sense and arithmetic scores before and after training. The software, open-source and freely available online, is designed for learning disabled children aged 5-8, and may also be useful for general instruction of normal preschool children. The learning algorithm reported is highly general, and may be applied in other domains.
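    A toy sketch of the adaptive idea, using a weighted up-down (staircase) rule per difficulty dimension rather than the paper's published multidimensional learning-space algorithm; all numbers are illustrative:

```python
# Weighted up-down rule: with these step sizes the difficulty settles where
# the child's success rate equals the target (Kaernbach-style staircase).
difficulty = {"distance": 0.5, "deadline": 0.5, "complexity": 0.5}

def update(dim, succeeded, step=0.02, target=0.75):
    delta = step if succeeded else -step * target / (1.0 - target)
    difficulty[dim] = min(1.0, max(0.0, difficulty[dim] + delta))

update("distance", succeeded=True)    # problem solved: make it harder
update("deadline", succeeded=False)   # problem failed: back off faster
print(difficulty)
```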

  4. Advanced Data Format (ADF) Software Library and Users Guide

    NASA Technical Reports Server (NTRS)

    Smith, Matthew; Smith, Charles A. (Technical Monitor)

    1998-01-01

    The "CFD General Notation System" (CGNS) consists of a collection of conventions, and conforming software, for the storage and retrieval of Computational Fluid Dynamics (CFD) data. It facilitates the exchange of data between sites and applications, and helps stabilize the archiving of aerodynamic data. This effort was initiated in order to streamline the procedures in exchanging data and software between NASA and its customers, but the goal is to develop CGNS into a National Standard for the exchange of aerodynamic data. The CGNS development team is comprised of members from Boeing Commercial. Airplane Group, NASA-Ames, NASA-Langley, NASA-Lewis, McDonnell-Douglas Corporation (now Boeing-St. Louis), Air Force-Wright Lab., and ICEM-CFD Engineering. The elements of CGNS address all activities associated with the storage of data on external media and its movement to and from application programs. These elements include: 1) The Advanced Data Format (ADF) Database manager, consisting of both a file format specification and its 1/0 software, which handles the actual reading and writing of data from and to external storage media; 2) The Standard Interface Data Structures (SIDS), which specify the intellectual content of CFD data and the conventions governing naming and terminology; 3) The SIDS-to-ADF File Mapping conventions, which specify the exact location where the CFD data defined by the SIDS is to be stored within the ADF file(s); and 4) The CGNS Mid-level Library, which provides CFD-knowledgeable routines suitable for direct installation into application codes. The ADF is a generic database manager with minimal intrinsic capability. It was written for the purpose of storing large numerical datasets in an efficient, platform independent manner. To be effective, it must be used in conjunction with external agreements on how the data will be organized within the ADF database such defined by the SIDS. There are currently 34 user callable functions that comprise the ADF Core library and are described in the Users Guide. The library is written in C, but each function has a FORTRAN counterpart.

  5. Common Data Models and Efficient Reproducible Workflows for Distributed Ocean Model Skill Assessment

    NASA Astrophysics Data System (ADS)

    Signell, R. P.; Snowden, D. P.; Howlett, E.; Fernandes, F. A.

    2014-12-01

    Model skill assessment requires discovery, access, analysis, and visualization of information from both sensors and models, and traditionally has been possible only for a few experts. The US Integrated Ocean Observing System (US-IOOS) consists of 17 Federal Agencies and 11 Regional Associations that produce data from various sensors and numerical models; exactly the information required for model skill assessment. US-IOOS is seeking to develop documented skill assessment workflows that are standardized, efficient, and reproducible so that a much wider community can participate in the use and assessment of model results. Standardization requires common data models for observational and model data. US-IOOS relies on the CF Conventions for observations and structured grid data, and on the UGRID Conventions for unstructured (e.g. triangular) grid data. This allows applications to obtain only the data they require in a uniform and parsimonious way using web services: OPeNDAP for model output and OGC Sensor Observation Service (SOS) for observed data. Reproducibility is enabled with IPython Notebooks shared on GitHub (http://github.com/ioos). These capture the entire skill assessment workflow, including user input, search, access, analysis, and visualization, ensuring that workflows are self-documenting and reproducible by anyone, using free software. Python packages required to run the workflows (pyugrid, pyoos, and the British Met Office Iris package) are available on GitHub and on Binstar.org so that users can run scenarios using the free Anaconda Python distribution. Hosted services such as Wakari enable anyone to reproduce these workflows for free, without installing any software locally, using just their web browser. We are also experimenting with Wakari Enterprise, which allows multi-user access from a web browser to an IPython Server running where large quantities of model output reside, increasing efficiency. The open development and distribution of these workflows, and the software on which they depend, is an educational resource for those new to the field and a center of focus where practitioners can contribute new software and ideas.
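    The access pattern described above can be sketched in a few lines; the OPeNDAP endpoint and variable name below are placeholders, not real US-IOOS URLs:

```python
# Open a remote dataset over OPeNDAP; subsetting the variable slice means
# only that slab is transferred, not the whole file.
from netCDF4 import Dataset

url = "http://example.org/thredds/dodsC/model_output.nc"  # hypothetical
ds = Dataset(url)
sst = ds.variables["sea_water_temperature"][0, 0, :, :]   # one time/level
print(sst.shape)
```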

  6. Engineering Documentation and Data Control

    NASA Technical Reports Server (NTRS)

    Matteson, Michael J.; Bramley, Craig; Ciaruffoli, Veronica

    2001-01-01

    Mississippi Space Services (MSS), the facility services contractor for NASA's John C. Stennis Space Center (SSC), is utilizing technology to improve engineering documentation and data control. Two identified improvement areas, labor-intensive documentation research and outdated drafting standards, were targeted as top priority. MSS selected AutoManager(R) WorkFlow from Cyco software to manage engineering documentation. The software is currently installed on over 150 desktops. The outdated SSC drafting standard was written for pre-CADD drafting methods, in other words, board drafting. Implementation of COTS software solutions to manage engineering documentation and update the drafting standard resulted in significant increases in productivity by reducing the time spent searching for documents.

  7. Design and Analysis of an Axisymmetric Phased Array Fed Gregorian Reflector System for Limited Scanning

    DTIC Science & Technology

    2016-01-22

    Numerical electromagnetic simulations based on the multilevel fast multipole method (MLFMM), conducted with the FEKO software (www.feko.info), were used to analyze and optimize the antenna.

  8. ASSIP Study of Real-Time Safety-Critical Embedded Software-Intensive System Engineering Practices

    DTIC Science & Technology

    2008-02-01

    The report surveys real-time safety-critical embedded software engineering practices, covering assessment, product engineering, and tooling processes, and the applicable process standards: IEC/ISO 12207 (software life cycle processes), ISO/IEC 15026 (system and software integrity levels), generic safety standards such as SAE ARP 4754 (certification considerations), and frameworks in revision including ISO 9001, ISO 9004, the ISO 15288/ISO 12207 harmonization, RTCA DO-178B, and UK MOD Standard 00-56/3, together with supporting methods and tools.

  9. Real time computer data system for the 40 x 80 ft wind tunnel facility at Ames Research Center

    NASA Technical Reports Server (NTRS)

    Cambra, J. M.; Tolari, G. P.

    1974-01-01

    The wind tunnel real-time computer system is a distributed data gathering system that features a master computer subsystem, a high speed data gathering subsystem, a quick look dynamic analysis and vibration control subsystem, an analog recording back-up subsystem, a pulse code modulation (PCM) on-board subsystem, a communications subsystem, and a transducer excitation and calibration subsystem. The subsystems are married to the master computer through an executive software system and standard hardware and FORTRAN software interfaces. The executive software system has four basic software routines. These are the playback, setup, record, and monitor routines. The standard hardware interfaces along with the software interfaces provide the system with the capability of adapting to new environments.

  10. Simulated Analysis of Linear Reversible Enzyme Inhibition with SCILAB

    ERIC Educational Resources Information Center

    Antuch, Manuel; Ramos, Yaquelin; Álvarez, Rubén

    2014-01-01

    SCILAB is a less well-known program than MATLAB for numeric simulations and has the advantage of being free software. A challenging software-based activity to analyze the most common linear reversible inhibition types with SCILAB is described. Students establish typical values for the concentration of enzyme, substrate, and inhibitor to simulate…

  11. SandiaMCR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-01-05

    SandiaMCR was developed to identify pure components and their concentrations from spectral data. The software efficiently implements multivariate curve resolution by alternating least squares (MCR-ALS), principal component analysis (PCA), and singular value decomposition (SVD). Version 3.37 also includes the PARAFAC-ALS and Tucker-1 algorithms (for trilinear analysis). The alternating least squares methods can be used to determine the composition with no, or incomplete, prior information on the constituents and their concentrations. The software allows the specification of numerous preprocessing, initialization, data selection, and compression options for the efficient processing of large data sets. These options include the definition of equality and non-negativity constraints to realistically restrict the solution set, various normalization or weighting options based on the statistics of the data, several initialization choices, and data compression. The software has been designed to provide a practicing spectroscopist the tools required to routinely analyze data in a reasonable time and without requiring expert intervention.
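
    A generic illustration of the SVD step that underlies curve-resolution methods of this kind; this sketch uses NumPy on synthetic mixture spectra and is not the SandiaMCR interface:

      import numpy as np

      rng = np.random.default_rng(0)
      # Synthetic data: 50 mixture spectra over 200 channels, 3 pure components.
      C = rng.random((50, 3))                   # component concentrations
      S = rng.random((3, 200))                  # pure-component spectra
      D = C @ S + 0.01 * rng.standard_normal((50, 200))

      U, s, Vt = np.linalg.svd(D, full_matrices=False)
      print(np.round(s[:6], 2))                 # first three values dominate
      rank3 = (U[:, :3] * s[:3]) @ Vt[:3]       # rank-3 reconstruction
      print("residual:", np.linalg.norm(D - rank3))

    The sharp drop after the third singular value is how the number of constituents is estimated before an alternating-least-squares refinement with non-negativity or equality constraints.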

  12. CaveMan Enterprise version 1.0 Software Validation and Verification.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, David

    The U.S. Department of Energy Strategic Petroleum Reserve stores crude oil in caverns solution-mined in salt domes along the Gulf Coast of Louisiana and Texas. The CaveMan software program has been used since the late 1990s as one tool to analyze pressure measurements monitored at each cavern. The purpose of this monitoring is to catch potential cavern integrity issues as soon as possible. The CaveMan software was written in Microsoft Visual Basic, and embedded in a Microsoft Excel workbook; this method of running the CaveMan software is no longer sustainable. As such, a new version called CaveMan Enterprise has been developed. CaveMan Enterprise version 1.0 does not have any changes to the CaveMan numerical models. CaveMan Enterprise represents, instead, a change from desktop-managed workbooks to an enterprise framework, moving data management into coordinated databases and porting the numerical modeling codes into the Python programming language. This document provides a report of the code validation and verification testing.

  13. Analytically exploiting noise correlations inside the feedback loop to improve locked-oscillator performance.

    PubMed

    Sastrawan, J; Jones, C; Akhalwaya, I; Uys, H; Biercuk, M J

    2016-08-01

    We introduce concepts from optimal estimation to the stabilization of precision frequency standards limited by noisy local oscillators. We develop a theoretical framework casting various measures for frequency standard variance in terms of frequency-domain transfer functions, capturing the effects of feedback stabilization via a time series of Ramsey measurements. Using this framework, we introduce an optimized hybrid predictive feedforward measurement protocol that employs results from multiple past measurements and transfer-function-based calculations of measurement covariance to improve the accuracy of corrections within the feedback loop. In the presence of common non-Markovian noise processes these measurements will be correlated in a calculable manner, providing a means to capture the stochastic evolution of the local oscillator frequency during the measurement cycle. We present analytic calculations and numerical simulations of oscillator performance under competing feedback schemes and demonstrate benefits in both correction accuracy and long-term oscillator stability using hybrid feedforward. Simulations verify that in the presence of uncompensated dead time and noise with significant spectral weight near the inverse cycle time, predictive feedforward outperforms traditional feedback, providing a path towards developing a class of stabilization software routines for frequency standards limited by noisy local oscillators.
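
    A toy numerical comparison in the spirit of the abstract: a local oscillator whose frequency drifts and random-walks between Ramsey measurements is corrected either by plain integrating feedback or by a two-point linear predictor, a crude stand-in for the hybrid feedforward protocol. All parameter values are illustrative assumptions, not values from the paper, and measurements are taken as noiseless for simplicity:

      import numpy as np

      rng = np.random.default_rng(1)
      steps, drift, sigma = 2000, 2.0, 1.0
      # LO frequency: deterministic drift plus a random walk between cycles.
      freq = drift * np.arange(steps) + np.cumsum(sigma * rng.standard_normal(steps))

      def rms_error(predictive):
          corr, errs, hist = 0.0, [], []
          for k in range(steps):
              e = freq[k] - corr                  # error seen this cycle
              errs.append(e)
              hist.append(freq[k])
              if predictive and k >= 1:
                  corr = 2 * hist[-1] - hist[-2]  # linear extrapolation
              else:
                  corr += e                       # integrating feedback
          return float(np.sqrt(np.mean(np.square(errs))))

      print("feedback   :", rms_error(False))  # ~sqrt(drift**2 + sigma**2)
      print("feedforward:", rms_error(True))   # ~sqrt(2)*sigma, drift removed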

  14. Federal Communications Commission (FCC) Transponder Loading Data Conversion Software. User's guide and software maintenance manual, version 1.2

    NASA Technical Reports Server (NTRS)

    Mallasch, Paul G.

    1993-01-01

    This volume contains the complete software system documentation for the Federal Communications Commission (FCC) Transponder Loading Data Conversion Software (FIX-FCC). This software was written to facilitate the formatting and conversion of FCC Transponder Occupancy (Loading) Data before it is loaded into the NASA Geosynchronous Satellite Orbital Statistics Database System (GSOSTATS). The information that FCC supplies NASA is in report form and must be converted into a form readable by the database management software used in the GSOSTATS application. Both the User's Guide and Software Maintenance Manual are contained in this document. This volume of documentation passed an independent quality assurance review and certification by the Product Assurance and Security Office of the Planning Research Corporation (PRC). The manuals were reviewed for format, content, and readability. The Software Management and Assurance Program (SMAP) life cycle and documentation standards were used in the development of this document. Accordingly, these standards were used in the review. Refer to the System/Software Test/Product Assurance Report for the Geosynchronous Satellite Orbital Statistics Database System (GSOSTATS) for additional information.

  15. 43 CFR 3809.202 - Under what conditions will BLM defer to State regulation of operations?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... standards on a provision-by-provision basis to determine— (i) Whether non-numerical State standards are functionally equivalent to BLM counterparts; and (ii) Whether numerical State standards are the same as corresponding numerical BLM standards, except that State review and approval time frames do not have to be the...

  16. 43 CFR 3809.202 - Under what conditions will BLM defer to State regulation of operations?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... standards on a provision-by-provision basis to determine— (i) Whether non-numerical State standards are functionally equivalent to BLM counterparts; and (ii) Whether numerical State standards are the same as corresponding numerical BLM standards, except that State review and approval time frames do not have to be the...

  17. 43 CFR 3809.202 - Under what conditions will BLM defer to State regulation of operations?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... standards on a provision-by-provision basis to determine— (i) Whether non-numerical State standards are functionally equivalent to BLM counterparts; and (ii) Whether numerical State standards are the same as corresponding numerical BLM standards, except that State review and approval time frames do not have to be the...

  18. 43 CFR 3809.202 - Under what conditions will BLM defer to State regulation of operations?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... standards on a provision-by-provision basis to determine— (i) Whether non-numerical State standards are functionally equivalent to BLM counterparts; and (ii) Whether numerical State standards are the same as corresponding numerical BLM standards, except that State review and approval time frames do not have to be the...

  19. User systems guidelines for software projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abrahamson, L.

    1986-04-01

    This manual presents guidelines for software standards which were developed so that software project-development teams and management involved in approving the software could have a generalized view of all phases in the software production procedure and the steps involved in completing each phase. Guidelines are presented for six phases of software development: project definition, building a user interface, designing software, writing code, testing code, and preparing software documentation. The discussions for each phase include examples illustrating the recommended guidelines. 45 refs. (DWL)

  20. Simulation on friction taper plug welding of AA6063-20Gr metal matrix composite

    NASA Astrophysics Data System (ADS)

    Hynes, N. Rajesh Jesudoss; Nithin, Abeyram M.

    2016-05-01

    Friction taper plug welding, a variant of friction welding, is useful for welding similar and dissimilar materials. It can be used for joining composites to metals in sophisticated aerospace applications. In the present work, numerical simulation of the friction taper plug welding process is carried out using finite element based software. Graphite-reinforced AA6063 is modelled using the software ANSYS 15.0 and the temperature distribution is predicted. The effect of friction time on the temperature distribution is numerically investigated. When the friction time is increased to 30 seconds, the tapered part of the plug gets detached and fills the hole in the AA6063 plate perfectly.

  1. Research on axisymmetric aspheric surface numerical design and manufacturing technology

    NASA Astrophysics Data System (ADS)

    Wang, Zhen-zhong; Guo, Yin-biao; Lin, Zheng

    2006-02-01

    The key technology for aspheric machining is generating an exact machining path and machining aspheric lenses with high accuracy and efficiency, even as traditional manual manufacturing has developed into present-day numerical control (NC) machining. This paper presents a mathematical model relating a virtual cone to the aspheric surface equations, and discusses techniques for uniform wear of the grinding wheel and for error compensation in aspheric machining. Finally, a software system for high-precision aspheric surface manufacturing is designed and realized on the basis of the above. This software system can work out the grinding wheel path from input parameters and generate machining NC programs for aspheric surfaces.

  2. Measurement of the optical fiber numeric aperture exposed to thermal and radiation aging

    NASA Astrophysics Data System (ADS)

    Vanderka, Ales; Bednarek, Lukas; Hajek, Lukas; Latal, Jan; Poboril, Radek; Zavodny, Petr; Vasinek, Vladimir

    2016-12-01

    This paper deals with the aging of optical fibers under the influence of temperature and radiation. Changes in the structure of the optical fiber that affect the propagation of light, in this case the numerical aperture, are analyzed. A multimode OM1 fiber with a 62.5 μm core diameter and 125 μm cladding diameter in a 2.8 mm secondary coating was used for the experimental measurements. Aging of the optical fiber was induced by dry heat and radiation. For this purpose a temperature chamber with a stable temperature of 105 °C was used, in which the cables were kept for two months. The cables were then irradiated with gamma radiation from 60Co in doses of 1.5 kGy and then 60 kGy. These conditions simulated a 50-year aging process of the optical cables. An automatic angular-scanning device with a LabVIEW software interface was built in accordance with European Standard EN 60793-1-43:2015. The numerical aperture was tested at a wavelength of 850 nm with an output power of 1 mW. The scanning angle was set to 50° with a step of 0.25°. The numerical aperture was calculated from the angle at which the measured power falls to 1/e² of its maximum. The measurement of each sample was performed 10 hours after thermal and radiation aging. The samples were subsequently retested six months after the last irradiation. In conclusion, the results of the experiment were analyzed and compared.
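
    A sketch of the numerical-aperture calculation described above: find the half-angle at which the scanned far-field power falls to 1/e² of its maximum and take NA = sin(θ). Synthetic Gaussian data stand in for the 50°, 0.25°-step scan of the experiment:

      import numpy as np

      angles = np.arange(-25.0, 25.25, 0.25)      # scan angles in degrees
      power = np.exp(-2 * (angles / 12.0) ** 2)   # synthetic far-field profile

      threshold = power.max() / np.e ** 2
      above = angles[power >= threshold]
      theta = 0.5 * (above[-1] - above[0])        # half-width at the 1/e^2 level
      print(f"theta = {theta:.2f} deg, NA = {np.sin(np.radians(theta)):.3f}")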

  3. One Burn, One Standard

    DTIC Science & Technology

    2014-09-01

    Letter to the editor. Authors: M. Giretzlehner, PhD, Johannes Kepler University Linz, RISC Software GmbH, Research Department Medical Informatics, Hagenberg, Austria, and Herbert L. Haller, MD, Trauma Hospital Linz. Address correspondence to M. Giretzlehner, PhD, RISC Software GmbH, Research Department Medical Informatics, Softwarepark 35, 4232 Hagenberg, Austria.

  4. Research in applied mathematics, numerical analysis, and computer science

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  5. Development of an electromechanical principle for wet and dry milling

    NASA Astrophysics Data System (ADS)

    Halbedel, Bernd; Kazak, Oleg

    2018-05-01

    The paper presents a novel electromechanical principle for wet and dry milling of different materials, in which the milling beads are moved by a time- and spatially varying magnetic field. A possibility to optimize the milling process in such a milling machine by simulating the vector gradient distribution of the electromagnetic field in the process chamber is presented. The mathematical model and simulation methods, based on standard software packages, are worked out. The results of numerical simulations and experimental measurements of the electromagnetic field in the working chamber of a developed and manufactured laboratory plant correlate well with each other. Using the obtained operating parameters, dry milling experiments with crushed cement clinker and wet milling experiments with organic agents in the laboratory plant are performed and the results are discussed.

  6. PHAST: Protein-like heteropolymer analysis by statistical thermodynamics

    NASA Astrophysics Data System (ADS)

    Frigori, Rafael B.

    2017-06-01

    PHAST is a software package written in standard Fortran, with MPI and CUDA extensions, able to efficiently perform parallel multicanonical Monte Carlo simulations of single or multiple heteropolymeric chains, as coarse-grained models for proteins. The outcome data can be straightforwardly analyzed within its microcanonical Statistical Thermodynamics module, which allows for computing the entropy, caloric curve, specific heat and free energies. As a case study, we investigate the aggregation of heteropolymers bioinspired by Aβ25-33 fragments and their cross-seeding with IAPP20-29 isoforms. Excellent parallel scaling is observed, even under numerically difficult first-order-like phase transitions, which are properly described by the built-in, fully reconfigurable force fields. The package is also free and open source, which should motivate users to readily adapt it to specific purposes.

  7. Towards a standardized method to assess straylight in earth observing optical instruments

    NASA Astrophysics Data System (ADS)

    Caron, J.; Taccola, M.; Bézy, J.-L.

    2017-09-01

    Straylight is a spurious effect that can seriously degrade the radiometric accuracy achieved by Earth observing optical instruments, as a result of the high contrast in the observed Earth radiance scenes and spectra. It is considered critical for several ESA missions such as Sentinel-5, FLEX and potential successors to CarbonSat. Although it is traditionally evaluated by Monte-Carlo simulations performed with commercial software packages (e.g. ASAP, Zemax, LightTools), semi-analytical approximate methods [1,2] have drawn some interest in recent years due to their faster computing time and the greater insight they provide into straylight mechanisms. They cannot replace numerical simulations, but may be more advantageous in contexts where many iterations are needed, for instance during the early phases of an instrument design.

  8. Proceedings, Conference on the Computing Environment for Mathematical Software

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Recent advances in software and hardware technology which make it economical to create computing environments appropriate for specialized applications are addressed. Topics included software tools, FORTRAN standards activity, and features of languages, operating systems, and hardware that are important for the development, testing, and maintenance of mathematical software.

  9. RELAP-7 Software Verification and Validation Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Curtis L.; Choi, Yong-Joon; Zou, Ling

    This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process: a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on the INL's modern scientific software development framework, MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5's capability and extends the analysis capability for all reactor system simulation scenarios.

  10. Software Quality Assurance and Controls Standard

    DTIC Science & Technology

    2010-04-27

    Briefing on the Software Quality Assurance and Controls Standard by Sue Carroll, Principal Software Quality Analyst, SAS, and John Walz, VP Technology. Topics include: what is in a Software Life Cycle (SLC) process; what is in a SQA process; where the SQA controls are; the history of the SQA standards; and what is changing in SQA.

  11. Software And Systems Engineering Risk Management

    DTIC Science & Technology

    2010-04-01

    Presented by John Walz, VP Technical and Conferences Activities, IEEE Computer Society; Vice-Chair Planning, Software & Systems Engineering Standards Committee, IEEE Computer Society; US TAG to ISO TMB Risk Management Working Group. The briefing traces risk management standards for systems and software, including the COSO Enterprise Risk Management Framework (2004), ISO/IEC 16085 Risk Management Process (2006), and ISO/IEC 12207 Software Life Cycle Processes (2008).

  12. Evaluating Software Assurance Knowledge and Competency of Acquisition Professionals

    DTIC Science & Technology

    2014-10-01

    As the threat of cyberattacks grows, organizations must ensure that their procurement agents acquire high-quality, secure software. The report examines ISO 12207-2008, adopted both internationally and in the United States [7], which documents a comprehensive set of software life cycle activities and supporting tasks, together with the Software Assurance Competency Model, as bases for evaluating the software assurance knowledge and competency of acquisition professionals.

  13. Transfer of numeric ASCII data files between Apple and IBM personal computers.

    PubMed

    Allan, R W; Bermejo, R; Houben, D

    1986-01-01

    Listings for programs designed to transfer numeric ASCII data files between Apple and IBM personal computers are provided with accompanying descriptions of how the software operates. Details of the hardware used are also given. The programs may be easily adapted for transferring data between other microcomputers.

  14. Computer Facilitated Mathematical Methods in Chemical Engineering--Similarity Solution

    ERIC Educational Resources Information Center

    Subramanian, Venkat R.

    2006-01-01

    High-performance computers coupled with highly efficient numerical schemes and user-friendly software packages have helped instructors to teach numerical solutions and analysis of various nonlinear models more efficiently in the classroom. One of the main objectives of a model is to provide insight about the system of interest. Analytical…

  15. CINDA-3G: Improved Numerical Differencing Analyzer Program for Third-Generation Computers

    NASA Technical Reports Server (NTRS)

    Gaski, J. D.; Lewis, D. R.; Thompson, L. R.

    1970-01-01

    The goal of this work was to develop a new and versatile program to supplement or replace the original Chrysler Improved Numerical Differencing Analyzer (CINDA) thermal analyzer program in order to take advantage of the improved systems software and machine speeds of the third-generation computers.

  16. Open-source framework for documentation of scientific software written on MATLAB-compatible programming languages

    NASA Astrophysics Data System (ADS)

    Konnik, Mikhail V.; Welsh, James

    2012-09-01

    Numerical simulators for adaptive optics systems have become an essential tool for the research and development of future advanced astronomical instruments. However, the growing software code of a numerical simulator makes it difficult to continue to support the code itself. The problem of adequately documenting astronomical software for adaptive optics simulators can complicate development, since the documentation must contain up-to-date schemes and mathematical descriptions implemented in the software code. Although most modern programming environments like MATLAB or Octave have built-in documentation abilities, they are often insufficient for describing a typical adaptive optics simulator code. This paper describes a general cross-platform framework for the documentation of scientific software using open-source tools such as LaTeX, Mercurial, Doxygen, and Perl. Using a Perl script that translates MATLAB M-file comments into C-like comments, one can use Doxygen to generate and update the documentation for the scientific source code. The documentation generated by this framework contains the current code description with mathematical formulas, images, and bibliographical references. A detailed description of the framework components is presented, as well as guidelines for the framework deployment. Examples of the code documentation for the scripts and functions of a MATLAB-based adaptive optics simulator are provided.
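
    The comment-translation step can be sketched in a few lines; the paper uses Perl, so the following Python analogue is only illustrative. It rewrites the "%" comments of a MATLAB M-file as C-style "///" comments that Doxygen can parse (a real translator must also handle function signatures):

      import sys

      def translate(m_source: str) -> str:
          out = []
          for line in m_source.splitlines():
              stripped = line.lstrip()
              if stripped.startswith("%"):
                  indent = line[: len(line) - len(stripped)]
                  out.append(indent + "///" + stripped[1:])  # % ... -> /// ...
              else:
                  out.append(line)
          return "\n".join(out)

      if __name__ == "__main__":
          print(translate(open(sys.argv[1]).read()))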

  17. Autonomous Real Time Requirements Tracing

    NASA Technical Reports Server (NTRS)

    Plattsmier, George I.; Stetson, Howard K.

    2014-01-01

    One of the more challenging aspects of software development is the ability to verify and validate the functional software requirements dictated by the Software Requirements Specification (SRS) and the Software Detail Design (SDD). Ensuring the software has achieved the intended requirements is the responsibility of the Software Quality team and the Software Test team. The utilization of Timeliner-TLX™ Auto-Procedures for relocating ground operations positions to ISS automated on-board operations has begun the transition that would be required for manned deep space missions with minimal crew requirements. This transition also moves the auto-procedures from the procedure realm into the flight software arena, and as such the operational requirements and testing will be more structured and rigorous. The auto-procedures would be required to meet NASA software standards as specified in the Software Safety Standard (NASA-STD-8719), the Software Engineering Requirements (NPR 7150), the Software Assurance Standard (NASA-STD-8739) and also the Human Rating Requirements (NPR-8705). The Autonomous Fluid Transfer System (AFTS) test-bed utilizes the Timeliner-TLX™ language for development of autonomous command and control software. The Timeliner-TLX™ system has the unique feature of providing the current line of the statement in execution during real-time execution of the software. This internal reporting of the executing line number unlocks the capability of monitoring the execution autonomously by use of a companion Timeliner-TLX™ sequence, as the line number reporting is embedded inside the Timeliner-TLX™ execution engine. This negates I/O processing of this type of data, as the line number status of executing sequences is built in as a function reference. This paper will outline the design and capabilities of the AFTS Autonomous Requirements Tracker, which traces and logs SRS requirements as they are being met during real-time execution of the targeted system. It is envisioned that real-time requirements tracing will greatly assist the movement of auto-procedures to flight software, enhancing the software assurance of auto-procedures and also their acceptance as reliable commanders.

  18. Numerical simulation of a flow past a triangular sail-type blade of a wind generator using the ANSYS FLUENT software package

    NASA Astrophysics Data System (ADS)

    Kusaiynov, K.; Tanasheva, N. K.; Min'kov, L. L.; Nusupbekov, B. R.; Stepanova, Yu. O.; Rozhkova, A. V.

    2016-02-01

    An air flow past a single triangular sail-type blade of a wind turbine is analyzed by numerical simulation for low velocities of the incoming flow. The results of the numerical simulation indicate a monotonic increase in the drag force and the lift force as functions of the incoming flow velocity; empirical dependences for these quantities are obtained.

  19. Influence of standardization on the precision (reproducibility) of dental cast analysis with virtual 3-dimensional models.

    PubMed

    Hayashi, Kazuo; Chung, Onejune; Park, Seojung; Lee, Seung-Pyo; Sachdeva, Rohit C L; Mizoguchi, Itaru

    2015-03-01

    Virtual 3-dimensional (3D) models obtained by scanning of physical casts have become an alternative to conventional dental cast analysis in orthodontic treatment. If the precision (reproducibility) of virtual 3D model analysis can be further improved, digital orthodontics could be even more widely accepted. The purpose of this study was to clarify the influence of "standardization" of the target points for dental cast analysis using virtual 3D models. Physical plaster models were also measured to obtain additional information. Five sets of dental casts were used. The dental casts were scanned with R700 (3Shape, Copenhagen, Denmark) and REXCAN DS2 3D (Solutionix, Seoul, Korea) scanners. In this study, 3 systems and software packages were used: SureSmile (OraMetrix, Richardson, Tex), Rapidform (Inus, Seoul, Korea), and I-DEAS (SDRC, Milford, Conn). Without standardization, the maximum differences were observed between the SureSmile software and the Rapidform software (0.39 mm ± 0.07). With standardization, the maximum differences were observed between the SureSmile software and measurements with a digital caliper (0.099 mm ± 0.01), and this difference was significantly greater (P <0.05) than the 2 other mean difference values. Furthermore, the results of this study showed that the mean differences "WITH" standardization were significantly lower than those "WITHOUT" standardization for all systems, software packages, or methods. The results showed that elimination of the influence of usability or habituation is important for improving the reproducibility of dental cast analysis. Copyright © 2015 American Association of Orthodontists. Published by Elsevier Inc. All rights reserved.

  1. Expert system verification and validation study. Delivery 3A and 3B: Trip summaries

    NASA Technical Reports Server (NTRS)

    French, Scott

    1991-01-01

    Key results are documented from attending the 4th workshop on verification, validation, and testing. The most interesting part of the workshop was when representatives from the U.S., Japan, and Europe presented surveys of VV&T within their respective regions. Another interesting part focused on current efforts to define industry standards for artificial intelligence and how that might affect approaches to VV&T of expert systems. The next part of the workshop focused on VV&T methods of applying mathematical techniques to verification of rule bases and techniques for capturing information relating to the process of developing software. The final part focused on software tools. A summary is also presented of the EPRI conference on 'Methodologies, Tools, and Standards for Cost Effective Reliable Software Verification and Validation.' The conference was divided into discussion sessions on the following issues: development process, automated tools, software reliability, methods, standards, and cost/benefit considerations.

  2. The Synthetic Biology Open Language (SBOL) provides a community standard for communicating designs in synthetic biology.

    PubMed

    Galdzicki, Michal; Clancy, Kevin P; Oberortner, Ernst; Pocock, Matthew; Quinn, Jacqueline Y; Rodriguez, Cesar A; Roehner, Nicholas; Wilson, Mandy L; Adam, Laura; Anderson, J Christopher; Bartley, Bryan A; Beal, Jacob; Chandran, Deepak; Chen, Joanna; Densmore, Douglas; Endy, Drew; Grünberg, Raik; Hallinan, Jennifer; Hillson, Nathan J; Johnson, Jeffrey D; Kuchinsky, Allan; Lux, Matthew; Misirli, Goksel; Peccoud, Jean; Plahar, Hector A; Sirin, Evren; Stan, Guy-Bart; Villalobos, Alan; Wipat, Anil; Gennari, John H; Myers, Chris J; Sauro, Herbert M

    2014-06-01

    The re-use of previously validated designs is critical to the evolution of synthetic biology from a research discipline to an engineering practice. Here we describe the Synthetic Biology Open Language (SBOL), a proposed data standard for exchanging designs within the synthetic biology community. SBOL represents synthetic biology designs in a community-driven, formalized format for exchange between software tools, research groups and commercial service providers. The SBOL Developers Group has implemented SBOL as an XML/RDF serialization and provides software libraries and specification documentation to help developers implement SBOL in their own software. We describe early successes, including a demonstration of the utility of SBOL for information exchange between several different software tools and repositories from both academic and industrial partners. As a community-driven standard, SBOL will be updated as synthetic biology evolves to provide specific capabilities for different aspects of the synthetic biology workflow.

  3. Non-standard analysis and embedded software

    NASA Technical Reports Server (NTRS)

    Platek, Richard

    1995-01-01

    One model for computing in the future is ubiquitous, embedded computational devices analogous to embedded electrical motors. Many of these computers will control physical objects and processes. Such hidden computerized environments introduce new safety and correctness concerns whose treatment goes beyond present Formal Methods. In particular, one has to begin to speak about Real Space software in analogy with Real Time software. By this we mean computerized systems which have to meet requirements expressed in the real geometry of space. How to translate such requirements into ordinary software specifications and how to carry out proofs is a major challenge. In this talk we propose a research program based on the use of non-standard analysis. Much detail remains to be carried out. The purpose of the talk is to inform the Formal Methods community that Non-Standard Analysis provides a possible avenue of attack which we believe will be fruitful.

  4. Patient Safety—Incorporating Drawing Software into Root Cause Analysis Software

    PubMed Central

    Williams, Linda; Grayson, Diana; Gosbee, John

    2001-01-01

    Drawing software from Lassalle Technologies (France), designed for Visual Basic, is the tool we used to standardize the creation, storage, and retrieval of flow diagrams containing information about adverse events and close calls.

  5. Patient Safety—Incorporating Drawing Software into Root Cause Analysis Software

    PubMed Central

    Williams, Linda; Grayson, Diana; Gosbee, John

    2002-01-01

    Drawing software from Lassalle Technologies (France), designed for Visual Basic, is the tool we used to standardize the creation, storage, and retrieval of flow diagrams containing information about adverse events and close calls.

  6. Standardized development of computer software. Part 1: Methods

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1976-01-01

    This work is a two-volume set on standards for modern software engineering methodology. This volume presents a tutorial and practical guide to the efficient development of reliable computer software, a unified and coordinated discipline for design, coding, testing, documentation, and project organization and management. The aim of the monograph is to provide formal disciplines for increasing the probability of securing software that is characterized by high degrees of initial correctness, readability, and maintainability, and to promote practices which aid in the consistent and orderly development of a total software system within schedule and budgetary constraints. These disciplines are set forth as a set of rules to be applied during software development to drastically reduce the time traditionally spent in debugging, to increase documentation quality, to foster understandability among those who must come in contact with it, and to facilitate operations and alterations of the program as requirements on the program environment change.

  7. Health software: a new CEI Guide for software management in medical environment.

    PubMed

    Giacomozzi, Claudia; Martelli, Francesco

    2016-01-01

    The increasing spread of software components in the healthcare context makes explanatory guides relevant and necessary to interpret laws and standards, and to support the safe management of software products in healthcare. In 2012 a working group was established for the above purposes at the Italian Electrotechnical Committee (CEI), composed of experts from the Italian National Institute of Health (ISS), representatives of industry, and representatives of the healthcare organizations. As a first outcome of the group's activity, Guide CEI 62-237 was published in February 2015. The Guide incorporates an innovative approach based on the proper contextualization of software products, whether medical devices or not, to the specific healthcare scenario, and addresses the risk management of IT systems. The Guide provides operators and manufacturers with interpretative support and many detailed examples to facilitate the proper contextualization and management of health software, in compliance with related European and international regulations and standards.

  8. Implementation of AAPG exchange format

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keiser, K.; Guerrero, I.

    1989-03-01

    The American Association of Petroleum Geologists (AAPG) has proposed a format for exchanging geologic and other petroleum data. The AAPG Computer Applications Committee approved the proposal at the March 1988 AAPG annual meeting in Houston, Texas. By adopting this format, data input into application software and data exchange between software packages are greatly simplified. Benefits to both users and suppliers of software are substantial. The AAPG exchange format supports a flexible, generic data structure. This flexibility allows application software to use the standard format for storing internal control data. In some cases, extensions to the standard format, such as separation of header and data files and use of data delimiters, permit the use of AAPG format translator programs on data that were defined and generated before the emergence of the exchange format. Translation software, programmed in C, has been written and contributes to successful implementation of the AAPG exchange format in application software.

  9. Technical Support Document for Version 3.6.1 of the COMcheck Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartlett, Rosemarie; Connell, Linda M.; Gowri, Krishnan

    2009-09-29

    This technical support document (TSD) is designed to explain the technical basis for the COMcheck software as originally developed based on the ANSI/ASHRAE/IES Standard 90.1-1989 (Standard 90.1-1989). Documentation for other national model codes and standards and specific state energy codes supported in COMcheck has been added to this report as appendices. These appendices are intended to provide technical documentation for features specific to the supported codes and for any changes made for state-specific codes that differ from the standard features that support compliance with the national model codes and standards.

  10. An Object-Oriented Approach to Writing Computational Electromagnetics Codes

    NASA Technical Reports Server (NTRS)

    Zimmerman, Martin; Mallasch, Paul G.

    1996-01-01

    Presently, most computer software development in the Computational Electromagnetics (CEM) community employs the structured programming paradigm, particularly using the Fortran language. Other segments of the software community began switching to an Object-Oriented Programming (OOP) paradigm in recent years to help ease design and development of highly complex codes. This paper examines design of a time-domain numerical analysis CEM code using the OOP paradigm, comparing OOP code and structured programming code in terms of software maintenance, portability, flexibility, and speed.

  11. A software simulation study of a (255,223) Reed-Solomon encoder-decoder

    NASA Technical Reports Server (NTRS)

    Pollara, F.

    1985-01-01

    A set of software programs which simulates a (255,223) Reed-Solomon encoder/decoder pair is described. The transform decoder algorithm uses a modified Euclid algorithm, and closely follows the pipeline architecture proposed for the hardware decoder. Uncorrectable error patterns are detected by a simple test, and the inverse transform is computed by a finite field FFT. Numerical examples of the decoder operation are given for some test codewords, with and without errors. The use of the software package is briefly described.

  12. Theoretical and experimental analysis of the impacts of removable storage media and antivirus software on viral spread

    NASA Astrophysics Data System (ADS)

    Gan, Chenquan; Yang, Xiaofan

    2015-05-01

    In this paper, a new computer virus propagation model, which incorporates the effects of removable storage media and antivirus software, is proposed and analyzed. The model possesses a unique equilibrium that is globally stable regardless of the system parameters. Numerical simulations not only verify this result, but also illustrate the influences of removable storage media and antivirus software on viral spread. On this basis, some applicable measures for suppressing virus prevalence are suggested.
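
    The abstract does not give the model equations, so the following SciPy sketch integrates a generic SIS-style compartment model with a constant-contact term standing in for removable media (mu) and a cure rate standing in for antivirus software (gamma); it illustrates only the kind of computation involved, not the paper's actual model:

      import numpy as np
      from scipy.integrate import odeint

      beta, gamma, mu = 0.4, 0.2, 0.01   # infection, cure, media rates (assumed)

      def rhs(y, t):
          s, i = y
          new = beta * s * i + mu * s    # network spread + removable media
          cured = gamma * i              # antivirus cleaning
          return [-new + cured, new - cured]

      t = np.linspace(0.0, 200.0, 400)
      sol = odeint(rhs, [0.99, 0.01], t)
      print("equilibrium infected fraction ~", round(float(sol[-1, 1]), 3))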

  13. Bridges Dynamic Parameters Identification Based On Experimental and Numerical Method Comparison in Regard with Traffic Seismicity

    NASA Astrophysics Data System (ADS)

    Krkošková, Katarína; Papán, Daniel; Papánová, Zuzana

    2017-10-01

    Technical seismicity negatively affects the environment, buildings, and structures. Technical seismicity means seismic shaking of unnatural origin, a random process caused by force impulses. In the Slovak Republic the influence of vibration on buildings is evaluated per Eurocode 8, but the Slovak Technical Standard STN 73 0036 also covers technical seismicity. This standard classes bridges into the group of structures that are significant in light of technical seismicity - the group "U". Because the standard treats this issue only briefly, case-study analysis by FEM simulation and comparison with measurements is necessary. In this article, dynamic parameters determined by experimental measurement and by numerical methods are compared for two real bridges. The first bridge (D201-00) is a scaffold bridge on road I/11 leading to the city of Čadca and is situated in the city of Žilina; it is an eleven-span concrete road bridge spanning a railway. The second bridge (M5973 Brodno) is situated in a part of Žilina on road I/11; it is a concrete three-span road bridge built as a box girder. The computational part includes 3D models of both bridges. The first bridge (D201-00) was modelled in the IDA Nexis software as a slab-wall model; the model outputs are the natural frequencies and natural vibration modes. The second bridge (M5973 Brodno) was modelled in the VisualFEA software; a force impulse corresponding to the technical seismicity was applied to this model, and the model outputs are vibration displacements, velocities, and accelerations. The aim of the experiments was to measure the vibration acceleration time records of the bridges, which required systematic placement of accelerometers. For the first bridge (D201-00) the vibration acceleration record during an under-bridge train crossing is of interest, and for the second bridge (M5973 Brodno) the record during the force impulse deduced under the bridge is of interest. The analysis was done in the Sigview software. For the first bridge (D201-00), the analysis outputs were power spectral density values and their associated frequencies; these frequencies were compared with the natural frequencies from the computational model, whereby the influence of technical seismicity on the bridge natural frequencies was determined. For the second bridge (M5973 Brodno), the recorded vibration velocity time history displayed in Sigview was compared with the vibration velocity time history from the computational model, and the results were found to agree.

  14. [The planning of resource support of secondary medical care in hospital].

    PubMed

    Kungurov, N V; Zil'berberg, N V

    2010-01-01

    The Ural Institute of Dermatovenerology and Immunopathology developed and implemented software for the personalized, comprehensive recording of medical services and pharmaceuticals. The Institute also presents software such as a listing of medical services, a software module for calculating the financial costs of implementing full standards of secondary medical care in cases of chronic dermatopathy, a reference book of standards of direct specific costs for laboratory and physiotherapy services, and a reference book of pharmaceuticals, testing systems, and consumables. The unified information system of management recording is a sound basis for substantiating the costs of implementing standards of medical care, including high-tech care, taking into account the results of comprehensive calculation of the medical services provided.

  15. Rework of the ERA software system: ERA-8

    NASA Astrophysics Data System (ADS)

    Pavlov, D.; Skripnichenko, V.

    2015-08-01

    The software system that has been powering many products of the IAA for decades has undergone a major rework. ERA has capabilities for: processing tables of observations of different kinds, fitting parameters to observations, and integrating equations of motion of the Solar system bodies. ERA comprises a domain-specific language called SLON, tailored for astronomical tasks. SLON provides a convenient syntax for reductions of observations, choosing which IAU standards to use, and applying rules for filtering observations or selecting parameters for fitting. ERA also includes a table editor and a graph plotter. ERA-8 has a number of improvements over previous versions, such as: integration of the Solar system and TT-TDB with an arbitrary number of asteroids; the option to use different ephemerides (including DE and INPOP); and an integrator with 80-bit floating point. The code of ERA-8 has been completely rewritten from Pascal to C (for numerical computations) and Racket (for running SLON programs and managing data). ERA-8 is portable across major operating systems. The format of tables in ERA-8 is based on SQLite. The SPICE format has been chosen as the main format for ephemerides in ERA-8.

  16. Influence of the power law index on the fiber breakage during injection molding by numerical simulations

    NASA Astrophysics Data System (ADS)

    Desplentere, Frederik; Six, Wim; Bonte, Hilde; Debrabandere, Eric

    2013-04-01

    In predictive engineering for polymer processes, the proper prediction of material microstructure from known processing conditions and constituent material properties is a critical step toward properly predicting the bulk properties of the finished composite. Operating within the context of long-fiber thermoplastics (LFT, length > 15 mm), this investigation concentrates on the influence of the power law index on the final fiber length distribution within the injection molded part. To realize this, the Autodesk Simulation Moldflow Insight Scandium 2013 software has been used; a fiber breakage algorithm is available from this release onward. Using virtual material data with realistic viscosity levels allows the influence of the power law index on fiber breakage to be separated from the other material and process parameters. Applying standard settings for the fiber breakage parameters results in an obvious influence on the fiber length distribution through the thickness of the part and also as a function of position in the part. Finally, the influence of the shear rate constant within the fiber breakage model has been investigated, illustrating the possibility of fitting the virtual fiber length distribution to experimentally available data.

  17. Kinetic modeling and fitting software for interconnected reaction schemes: VisKin.

    PubMed

    Zhang, Xuan; Andrews, Jared N; Pedersen, Steen E

    2007-02-15

    Reaction kinetics for complex, highly interconnected kinetic schemes are modeled using analytical solutions to a system of ordinary differential equations. The algorithm employs standard linear algebra methods that are implemented using MATLAB functions in a Visual Basic interface. A graphical user interface for simple entry of reaction schemes facilitates comparison of a variety of reaction schemes. To ensure microscopic balance, graph theory algorithms are used to determine violations of thermodynamic cycle constraints. Analytical solutions based on linear differential equations result in fast comparisons of first order kinetic rates and amplitudes as a function of changing ligand concentrations. For analysis of higher order kinetics, we also implemented a solution using numerical integration. To determine rate constants from experimental data, fitting algorithms that adjust rate constants to fit the model to imported data were implemented using the Levenberg-Marquardt algorithm or using Broyden-Fletcher-Goldfarb-Shanno methods. We have included the ability to carry out global fitting of data sets obtained at varying ligand concentrations. These tools are combined in a single package, which we have dubbed VisKin, to guide and analyze kinetic experiments. The software is available online for use on PCs.
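
    A minimal sketch of the linear-algebra approach the abstract describes: for a first-order scheme A <-> B -> C the concentration vector evolves as c(t) = exp(K t) c(0), where the matrix K collects the rate constants. The rate values below are arbitrary illustrations, not defaults from VisKin:

      import numpy as np
      from scipy.linalg import expm

      k1, k_1, k2 = 2.0, 0.5, 1.0          # A->B, B->A, B->C (assumed values)
      K = np.array([[-k1,  k_1,        0.0],
                    [ k1, -(k_1 + k2), 0.0],
                    [0.0,  k2,         0.0]])  # columns sum to zero: mass conserved

      c0 = np.array([1.0, 0.0, 0.0])       # start with pure A
      for t in (0.0, 0.5, 2.0, 10.0):
          print(t, np.round(expm(K * t) @ c0, 4))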

  18. GENERAL EARTHQUAKE-OBSERVATION SYSTEM (GEOS).

    USGS Publications Warehouse

    Borcherdt, R.D.; Fletcher, Joe B.; Jensen, E.G.; Maxwell, G.L.; VanSchaack, J.R.; Warrick, R.E.; Cranswick, E.; Johnston, M.J.S.; McClearn, R.

    1985-01-01

    Microprocessor technology has permitted the development of a General Earthquake-Observation System (GEOS) useful for most seismic applications. Central-processing-unit control via robust software of system functions that are isolated on hardware modules permits field adaptability of the system to a wide variety of active and passive seismic experiments and straightforward modification for incorporation of improvements in technology. Various laboratory tests and numerous deployments of a set of the systems in the field have confirmed design goals, including: wide linear dynamic range (16 bit/96 dB); broad bandwidth (36 hr to 600 Hz; greater than 36 hr available); selectable sensor-type (accelerometer, seismometer, dilatometer); selectable channels (1 to 6); selectable record mode (continuous, preset, trigger); large data capacity (1. 4 to 60 Mbytes); selectable time standard (WWVB, master, manual); automatic self-calibration; simple field operation; full capability to adapt system in the field to a wide variety of experiments; low power; portability; and modest costs. System design goals for a microcomputer-controlled system with modular software and hardware components as implemented on the GEOS are presented. The systems have been deployed for 15 experiments, including: studies of near-source strong motion; high-frequency microearthquakes; crustal structure; down-hole wave propagation; teleseismicity; and earth-tidal strains.

  19. A new paradigm on battery powered embedded system design based on User-Experience-Oriented method

    NASA Astrophysics Data System (ADS)

    Wang, Zhuoran; Wu, Yue

    2014-03-01

    Battery-sustainable time has recently been an active research topic in the development of battery-powered embedded products such as tablets and smart phones; it is determined by the battery capacity and the power consumption. Despite numerous efforts to improve battery capacity in the field of materials engineering, power consumption also plays an important role, and is easier to ameliorate in delivering a desirable user experience, especially considering the moderate advancement of batteries over recent decades. In this study, a new top-down modelling method, the User-Experience-Oriented Battery Powered Embedded System Design Paradigm, is proposed to estimate the target average power consumption, to guide the hardware and software design, and eventually to approach the theoretical lowest power consumption at which the application can still provide full functionality. Starting from the 10-hour sustainable-time standard, the average working current is defined from the battery design capacity and set as a target. An implementation is then illustrated from the hardware perspective, summarized as Auto-Gating power management, and from the software perspective, which introduces a new algorithm, SleepVote, to guide system task design and scheduling.
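
    The target-setting arithmetic amounts to dividing the design capacity by the sustainable-time standard; the capacity figure below is an assumed example:

      capacity_mah = 3000        # assumed battery design capacity in mAh
      target_hours = 10          # the 10-hour sustainable-time standard
      avg_current_ma = capacity_mah / target_hours
      print(f"target average working current: {avg_current_ma:.0f} mA")  # 300 mA

    Hardware (Auto-Gating) and software (SleepVote) design choices are then judged by whether the measured average current stays within this budget.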

  20. ADOPT: A tool for automatic detection of tectonic plates at the surface of convection models

    NASA Astrophysics Data System (ADS)

    Mallard, C.; Jacquet, B.; Coltice, N.

    2017-08-01

    Mantle convection models with plate-like behavior produce surface structures comparable to Earth's plate boundaries. However, analyzing those structures is a difficult task, since convection models produce, as on Earth, diffuse deformation and elusive plate boundaries. Therefore we present here and share a quantitative tool to identify plate boundaries and produce plate polygon layouts from the results of numerical convection models: Automatic Detection Of Plate Tectonics (ADOPT). This digital tool operates within the free open-source visualization software ParaView. It is based on image segmentation techniques for object detection. The fundamental algorithm used in ADOPT is the watershed transform: we transform the output of convection models into a topographic map, the crest lines being the regions of deformation (plate boundaries) and the catchment basins being the plate interiors. We propose two generic protocols (the field and the distance methods) that we test against an independent visual detection of plate polygons. We show that ADOPT is effective in identifying smaller plates and in closing plate polygons in areas where boundaries are diffuse or elusive. ADOPT allows the export of plate polygons in the standard OGR-GMT format for visualization, modification, and analysis in generic software such as GMT or GPlates.
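
    The watershed idea at the core of ADOPT can be sketched with scikit-image: treat a deformation field as topography, so that crest lines (plate boundaries) separate catchment basins (plate interiors). This stands in for ADOPT's ParaView implementation and uses a synthetic field rather than convection-model output:

```python
import numpy as np
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

# Synthetic "deformation" field: high values play the role of plate boundaries.
x, y = np.meshgrid(np.linspace(0.0, 2.0 * np.pi, 200),
                   np.linspace(0.0, 2.0 * np.pi, 200))
deformation = np.abs(np.sin(2 * x)) + np.abs(np.sin(2 * y))

# Seed one marker in each low-deformation region (a plate interior).
seeds = peak_local_max(-deformation, min_distance=20)   # (row, col) coords
markers = np.zeros(deformation.shape, dtype=int)
markers[tuple(seeds.T)] = np.arange(1, len(seeds) + 1)

# The watershed floods outward from the markers; each basin is a "plate".
plates = watershed(deformation, markers)
print(len(np.unique(plates)), "plates detected")
```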

  1. NetCDF-CF: Supporting Earth System Science with Data Access, Analysis, and Visualization

    NASA Astrophysics Data System (ADS)

    Davis, E.; Zender, C. S.; Arctur, D. K.; O'Brien, K.; Jelenak, A.; Santek, D.; Dixon, M. J.; Whiteaker, T. L.; Yang, K.

    2017-12-01

    NetCDF-CF is a community-developed convention for storing and describing earth system science data in the netCDF binary data format. It is an OGC-recognized standard, and numerous existing FOSS (Free and Open Source Software) and commercial software tools can explore, analyze, and visualize data that is stored and described as netCDF-CF data. To better support a larger segment of the earth system science community, a number of efforts are underway to extend the netCDF-CF convention with the goal of increasing the types of data that can be represented as netCDF-CF data. This presentation will provide an overview and update of work to extend the existing netCDF-CF convention. It will detail the types of earth system science data currently supported by netCDF-CF and the types of data targeted for support by current netCDF-CF convention development efforts. It will also describe some of the tools that support the use of netCDF-CF compliant datasets, the types of data they support, and efforts to extend them to handle the new data types that netCDF-CF will support.
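
    In practice, a CF-compliant file carries the metadata that generic tools rely on (units, standard_name, coordinate variables) and can be opened with any netCDF-aware library. A small sketch using the netCDF4 Python package; the file name and variable names are hypothetical:

```python
from netCDF4 import Dataset

# Hypothetical CF-compliant file holding a sea surface temperature field.
with Dataset("sst_example.nc") as nc:
    sst = nc.variables["sst"]
    # CF attributes let generic software interpret the data without guesswork.
    print(sst.getncattr("standard_name"))   # e.g. "sea_surface_temperature"
    print(sst.getncattr("units"))           # e.g. "K"
    lat = nc.variables["lat"][:]            # CF coordinate variables
    lon = nc.variables["lon"][:]
    field = sst[:]                          # masked array honoring _FillValue
```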

  2. Hierarchical programming for data storage and visualization

    USGS Publications Warehouse

    Donovan, John M.; Smith, Peter E.; ,

    2001-01-01

    Graphics software is an essential tool for interpreting, analyzing, and presenting data from multidimensional hydrodynamic models used in estuarine and coastal ocean studies. The post-processing of time-varying three-dimensional model output presents unique requirements for data visualization because of the large volume of data that can be generated and the multitude of time scales that must be examined. Such data can relate to estuarine or coastal ocean environments and come from numerical models or field instruments. One useful software tool for the display, editing, visualization, and printing of graphical data is the Gr application, written by the first author for use in the U.S. Geological Survey's San Francisco Bay Program. The Gr application has been made available to the public via the Internet since the year 2000. The Gr application is written in the Java (Sun Microsystems, Nov. 29, 2001) programming language and uses the Extensible Markup Language standard for hierarchical data storage. Gr presents a hierarchy of objects to the user that can be edited using a common interface. Java's object-oriented capabilities allow Gr to treat data, graphics, and tools equally and to save them all to a single XML file.
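
    The hierarchical save model described above, with data, graphics, and tools serialized into a single XML tree, can be sketched in a few lines with Python's standard library (Gr itself is written in Java; the element names here are illustrative):

```python
import xml.etree.ElementTree as ET

# Build a hierarchy of objects: a session containing data and a plot.
session = ET.Element("session", name="bay_model_run")
data = ET.SubElement(session, "timeseries", station="S1")
ET.SubElement(data, "sample", t="0.0", value="1.2")
ET.SubElement(data, "sample", t="1.0", value="1.4")
ET.SubElement(session, "plot", type="xy", source="S1")

# Everything, data, graphics configuration, and tools, in one XML file.
ET.ElementTree(session).write("session.xml")
```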

  3. MITHRA 1.0: A full-wave simulation tool for free electron lasers

    NASA Astrophysics Data System (ADS)

    Fallahi, Arya; Yahaghi, Alireza; Kärtner, Franz X.

    2018-07-01

    Free Electron Lasers (FELs) are a solution for providing intense, coherent and bright radiation in the hard X-ray regime. Due to the low wall-plug efficiency of FEL facilities, it is crucial, and additionally very useful, to develop complete and accurate simulation tools for better optimization of the FEL interaction. The highly sophisticated dynamics involved in the FEL process have been the main obstacle hindering the development of general simulation tools for this problem. We present a numerical algorithm based on the finite difference time domain/particle in cell (FDTD/PIC) method in a Lorentz-boosted coordinate system, which is able to perform a full-wave simulation of the FEL process. The developed software offers a suitable tool for the analysis of FEL interactions without any of the usual approximations. A coordinate transformation to the bunch rest frame brings the very different length scales of the bunch size, the optical wavelength and the undulator period to values of the same order. Consequently, FDTD/PIC simulations in conjunction with efficient parallelization techniques make full-wave simulation feasible with available computational resources. Several examples of free electron lasers are analyzed using the developed software; the results are benchmarked against standard FEL codes and discussed in detail.
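
    The benefit of the rest-frame transformation can be checked with simple arithmetic: the undulator period Lorentz-contracts by the bunch factor gamma, while the resonant radiation wavelength is stretched by roughly the same factor, so the disparate scales meet in the middle. A sketch with hypothetical beam and undulator parameters (order-of-magnitude only):

```python
gamma = 1000.0        # hypothetical bunch Lorentz factor
lambda_u = 3.0e-2     # lab-frame undulator period: 3 cm
K = 1.0               # undulator deflection parameter

# Lab-frame resonant FEL wavelength (planar undulator, on axis).
lambda_r = lambda_u / (2.0 * gamma**2) * (1.0 + K**2 / 2.0)   # ~2.3e-8 m

# In the bunch rest frame the undulator period Lorentz-contracts ...
lambda_u_rest = lambda_u / gamma                               # 3.0e-5 m
# ... while the radiation wavelength is boost-expanded (~2*gamma),
# so both scales land within a factor of a few of each other and
# a single FDTD grid can resolve them simultaneously.
lambda_r_rest = 2.0 * gamma * lambda_r                         # ~4.5e-5 m
print(lambda_u_rest, lambda_r_rest)
```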

  4. Revealing the ISO/IEC 9126-1 Clique Tree for COTS Software Evaluation

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry

    2007-01-01

    Previous research has shown that acyclic dependency models, if they exist, can be extracted from software quality standards and that these models can be used to assess software safety and product quality. In the case of commercial off-the-shelf (COTS) software, the extracted dependency model can be used in a probabilistic Bayesian network context for COTS software evaluation. Furthermore, while experts typically employ Bayesian networks to encode domain knowledge, secondary structures (clique trees) from Bayesian network graphs can be used to determine the probabilistic distribution of any software variable (attribute) using any clique that contains that variable. Secondary structures, therefore, provide insight into the fundamental nature of graphical networks. This paper will apply secondary structure calculations to reveal the clique tree of the acyclic dependency model extracted from the ISO/IEC 9126-1 software quality standard. Suggestions will be provided to describe how the clique tree may be exploited to aid efficient transformation of an evaluation model.
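
    The secondary-structure construction applied in the paper (moralize the directed dependency model, triangulate, collect the maximal cliques, and join them by a maximum-weight spanning tree over shared variables) can be sketched with NetworkX on a toy graph; the real input is the ISO/IEC 9126-1 model, which is not reproduced here:

```python
import networkx as nx

# Toy acyclic dependency model (edges point parent -> child).
bn = nx.DiGraph([("functionality", "quality"),
                 ("reliability", "quality"),
                 ("usability", "quality"),
                 ("quality", "in_use")])

moral = nx.moral_graph(bn)                        # marry parents, drop arrows
chordal, _ = nx.complete_to_chordal_graph(moral)  # triangulate
cliques = list(nx.chordal_graph_cliques(chordal))

# Clique tree: nodes are cliques, edge weights are separator sizes.
ct = nx.Graph()
ct.add_nodes_from(range(len(cliques)))
for i in range(len(cliques)):
    for j in range(i + 1, len(cliques)):
        sep = len(cliques[i] & cliques[j])
        if sep:
            ct.add_edge(i, j, weight=sep)
clique_tree = nx.maximum_spanning_tree(ct)
print([set(c) for c in cliques])
```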

  5. Effects of Medical Device Regulations on the Development of Stand-Alone Medical Software: A Pilot Study.

    PubMed

    Blagec, Kathrin; Jungwirth, David; Haluza, Daniela; Samwald, Matthias

    2018-01-01

    Medical device regulations, which aim to ensure safety standards, apply not only to hardware devices but also to standalone medical software, e.g. mobile apps. The objective of this study was to explore the effects of these regulations on the development and distribution of medical standalone software. We invited a convenience sample of 130 domain experts to participate in an online survey about the impact of current regulations on the development and distribution of medical standalone software; 21 respondents completed the questionnaire. Participants reported slight positive effects on usability, reliability, and data security of their products, whereas the ability to modify already deployed software and customization by end users were negatively impacted. The additional time and costs needed to go through the regulatory process were perceived as the greatest obstacles in developing and distributing medical software. Further research is needed to compare positive effects on software quality with negative impacts on market access and innovation. Strategies for avoiding over-regulation while still ensuring safety standards need to be devised.

  6. Optics simulations: a Python workshop

    NASA Astrophysics Data System (ADS)

    Ghalila, H.; Ammar, A.; Varadharajan, S.; Majdi, Y.; Zghal, M.; Lahmar, S.; Lakshminarayanan, V.

    2017-08-01

    Numerical simulations allow teachers and students to indirectly perform sophisticated experiments that would not otherwise be realizable due to cost and other constraints. During the past few decades there has been an explosion in the development of numerical tools, concurrent with the rise of open-source environments such as Python. This availability of open-source software offers an incredible opportunity for advancing teaching methodologies as well as research. More specifically, it is possible to correlate theoretical knowledge with experimental measurements using "virtual" experiments. We have been working on the development of numerical simulation tools using the Python package ecosystem, concentrating on geometric and physical optics simulations. The advantage of doing hands-on numerical experiments is that the student learner becomes an active participant in the pedagogical/learning process rather than playing a passive role, as in the traditional lecture format. Even in laboratory classes, because of constraints of space, lack of equipment, and often large numbers of students, many students play a passive role, since they work in groups of three or more. Furthermore, these new tools help students get a handle on numerical methods and simulations, and impart a "feel" for the physics under investigation.
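
    A flavor of the kind of "virtual experiment" such a workshop can build: paraxial ray tracing through a thin lens with ray-transfer (ABCD) matrices requires only NumPy, and students can vary the focal length or object distance interactively. The parameters below are illustrative:

```python
import numpy as np

def free_space(d):
    """Paraxial propagation over a distance d (ABCD matrix)."""
    return np.array([[1.0, d], [0.0, 1.0]])

def thin_lens(f):
    """Thin lens of focal length f (ABCD matrix)."""
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

# Object 30 cm in front of a 10 cm lens; 1/s_o + 1/s_i = 1/f gives s_i = 15 cm.
f, s_o, s_i = 0.10, 0.30, 0.15
system = free_space(s_i) @ thin_lens(f) @ free_space(s_o)

# Rays leaving one object point at different angles all arrive at the
# same image height (magnification -s_i/s_o = -0.5): a sharp image.
for angle in (0.0, 0.02, -0.05):
    y_out, _ = system @ np.array([0.005, angle])
    print(round(y_out, 6))   # -0.0025 each time
```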

  7. Experimental software engineering: Seventeen years of lessons in the SEL

    NASA Technical Reports Server (NTRS)

    Mcgarry, Frank E.

    1992-01-01

    Seven key principles developed by the Software Engineering Laboratory (SEL) at the Goddard Space Flight Center (GSFC) of the National Aeronautics and Space Administration (NASA) are described. For the past 17 years, the SEL has been experimentally analyzing the development of production software as varying techniques and methodologies are applied in this one environment. The SEL has collected, archived, and studied detailed measures from more than 100 flight dynamics projects, thereby gaining significant insight into the effectiveness of numerous software techniques, as well as extensive experience in the overall effectiveness of 'Experimental Software Engineering'. This experience has helped formulate follow-on studies in the SEL, and it has helped other software organizations better understand just what can be accomplished and what cannot be accomplished through experimentation.

  8. Earth-moon system: Dynamics and parameter estimation; numerical considerations and program documentation

    NASA Technical Reports Server (NTRS)

    Breedlove, W. J., Jr.

    1976-01-01

    Major activities included coding and verifying equations of motion for the earth-moon system. Some attention was also given to numerical integration methods and parameter estimation methods. Existing analytical theories such as Brown's lunar theory, Eckhardt's theory for lunar rotation, and Newcomb's theory for the rotation of the earth were coded and verified. These theories serve as checks for the numerical integration. Laser ranging data for the period January 1969 - December 1975 were collected and stored on tape. The main goal of this research is the development of software to enable physical parameters of the earth-moon system to be estimated making use of data available from the Lunar Laser Ranging Experiment and the Very Long Baseline Interferometry experiment of project Apollo. A more specific goal is to develop software for the estimation of certain physical parameters of the moon, such as inertia ratios and the third- and fourth-harmonic gravity coefficients.

  9. Molded underfill (MUF) encapsulation for flip-chip package: A numerical investigation

    NASA Astrophysics Data System (ADS)

    Azmi, M. A.; Abdullah, M. K.; Abdullah, M. Z.; Ariff, Z. M.; Saad, Abdullah Aziz; Hamid, M. F.; Ismail, M. A.

    2017-07-01

    This paper presents a numerical simulation of epoxy molding compound (EMC) filling in multi-flip-chip packages during the encapsulation process. Both an empty package and a group of flip-chip packages were considered in the mold cavity in order to study the flow profile of the EMC. SOLIDWORKS software was used for three-dimensional modeling, and the model was imported into the fluid analysis software ANSYS FLUENT. The volume of fluid (VOF) technique was used to capture the flow-front profiles, and a Power Law model was applied for the rheology. The numerical results are compared with previous experimental work and show good agreement, validating the model. The predicted flow front was observed and analyzed at different filling times. The possibility of void formation in the package is captured visually, and the number of flip chips is shown to be one factor contributing to void formation.
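
    The Power Law rheology mentioned above treats the EMC melt as shear-thinning, with viscosity eta = K * (shear rate)^(n-1). A minimal sketch; the coefficients are illustrative, not the EMC's measured values:

```python
def power_law_viscosity(shear_rate, K=100.0, n=0.7):
    """Power-law (Ostwald-de Waele) viscosity in Pa.s.
    K: consistency index; n: flow behaviour index (n < 1 means shear-thinning)."""
    return K * shear_rate ** (n - 1.0)

for gamma_dot in (1.0, 10.0, 100.0):
    # Viscosity falls as the melt shears faster.
    print(gamma_dot, power_law_viscosity(gamma_dot))
```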

  10. Information Interface Related Standards, Guidelines, and Recommended Practices.

    DTIC Science & Technology

    1985-07-01

    Application Workshop, IEEE, October 1984 (11) Software Portability and Standards, by Ingemar Dahlstrand, Ellis Horwood Ltd., 1984 (12) World of EDP...March 29, 1985, p. 42-48 * 9. "IBM’s Topview Plays to Poor Reviews of Early Users," Computer World , March 4, 1985, p. 5 10. "Lack of Software Standards...Information Symbols ISOiTR 7239-1984 - Development and Principles for Application of Public Information Symbols ISO/TR 8545 -1984 - Technical Drawings

  11. Software component quality evaluation

    NASA Technical Reports Server (NTRS)

    Clough, A. J.

    1991-01-01

    The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.

  12. Validation by numerical simulation of the behaviour of protective structures of machinery cabins subjected to standardized shocks

    NASA Astrophysics Data System (ADS)

    Dumitrache, P.; Goanţă, A. M.

    2017-08-01

    The ability of a cab to protect the operator against the shock loading that arises when the machine rolls over or when the cab is struck by falling objects is one of the most important performance criteria that machines and mobile equipment must satisfy. The experimental method provides the most accurate information on the behaviour of protective structures, but generates high costs due to the experimental installations and the structures that may be destroyed during the experiments. In these circumstances, numerical simulation of the actual problem (a mechanical shock applied to a strength structure) is a perfectly viable alternative, given that current hardware and software performance provides the support necessary to obtain results with an acceptable level of accuracy. In this context, the paper proposes using FEA platforms for virtual testing of the actual strength structures of the cabins, using finite element models based on 3D models generated in CAD environments. In addition to the economic advantage mentioned above, and although results obtained by simulation using the finite element method are affected by a number of simplifying assumptions, adequate modelling of the phenomenon can successfully support the design of structures that meet the safety performance criteria imposed by current standards. The first section of the paper presents the general context of the safety performance requirements imposed by current standards on cab strength structures. The following section is dedicated to the peculiarities of finite element modelling in problems that require simulation of the behaviour of structures subjected to shock loading. The final section is dedicated to a case study and to future objectives.

  13. UNIX as an environment for producing numerical software

    NASA Technical Reports Server (NTRS)

    Schryer, N. L.

    1978-01-01

    The UNIX operating system supports a number of software tools: a mathematical equation-setting language, a phototypesetting language, a FORTRAN preprocessor language, a text editor, and a command interpreter. The design, implementation, documentation, and maintenance of a portable FORTRAN test of the floating-point arithmetic unit of a computer is used to illustrate these tools at work.

  14. How To Use the SilverPlatter Software To Search the ERIC CD ROM.

    ERIC Educational Resources Information Center

    Merrill, Paul F.

    This manual provides detailed instructions for using SilverPlatter software to search the ERIC CD ROM (Compact Disk Read Only Memory), a large bibliographic database relating to education which contains reference information on numerous journal articles from over 750 journals cited in the "Current Index to Journals in Education" (CIJE),…

  15. Software Simplifies the Sharing of Numerical Models

    NASA Technical Reports Server (NTRS)

    2014-01-01

    To ease the sharing of climate models with university students, Goddard Space Flight Center awarded SBIR funding to Reston, Virginia-based Parabon Computation Inc., a company that specializes in cloud computing. The firm developed a software program capable of running climate models over the Internet, and also created an online environment for people to collaborate on developing such models.

  16. The SCEC Unified Community Velocity Model (UCVM) Software Framework for Distributing and Querying Seismic Velocity Models

    NASA Astrophysics Data System (ADS)

    Maechling, P. J.; Taborda, R.; Callaghan, S.; Shaw, J. H.; Plesch, A.; Olsen, K. B.; Jordan, T. H.; Goulet, C. A.

    2017-12-01

    Crustal seismic velocity models and datasets play a key role in regional three-dimensional numerical earthquake ground-motion simulation, full waveform tomography, modern physics-based probabilistic earthquake hazard analysis, as well as in other related fields including geophysics, seismology, and earthquake engineering. The standard material properties provided by a seismic velocity model are P- and S-wave velocities and density for any arbitrary point within the geographic volume for which the model is defined. Many seismic velocity models and datasets are constructed by synthesizing information from multiple sources and the resulting models are delivered to users in multiple file formats, such as text files, binary files, HDF-5 files, structured and unstructured grids, and through computer applications that allow for interactive querying of material properties. The Southern California Earthquake Center (SCEC) has developed the Unified Community Velocity Model (UCVM) software framework to facilitate the registration and distribution of existing and future seismic velocity models to the SCEC community. The UCVM software framework is designed to provide a standard query interface to multiple, alternative velocity models, even if the underlying velocity models are defined in different formats or use different geographic projections. The UCVM framework provides a comprehensive set of open-source tools for querying seismic velocity model properties, combining regional 3D models and 1D background models, visualizing 3D models, and generating computational models in the form of regular grids or unstructured meshes that can be used as inputs for ground-motion simulations. The UCVM framework helps researchers compare seismic velocity models and build equivalent simulation meshes from alternative velocity models. These capabilities enable researchers to evaluate the impact of alternative velocity models in ground-motion simulations and seismic hazard analysis applications. In this poster, we summarize the key components of the UCVM framework and describe the impact it has had in various computational geoscientific applications.

  17. 75 FR 25165 - Defense Federal Acquisition Regulation Supplement; Cost and Software Data Reporting System

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-07

    ... 215, 234, 242, and 252 Defense Federal Acquisition Regulation Supplement; Cost and Software Data... Regulation Supplement (DFARS) to set forth DoD Cost and Software Data Reporting system requirements for major... set forth the DoD requirement for offerors to: Describe the standard Cost and Software Data Reporting...

  18. Tips on Creating Complex Geometry Using Solid Modeling Software

    ERIC Educational Resources Information Center

    Gow, George

    2008-01-01

    Three-dimensional computer-aided drafting (CAD) software, sometimes referred to as "solid modeling" software, is easy to learn, fun to use, and becoming the standard in industry. However, many users have difficulty creating complex geometry with the solid modeling software. And the problem is not entirely a student problem. Even some teachers and…

  19. Analysis of Software Development Methodologies to Build Safety Software Applications for the SATEX-II: A Mexican Experimental Satellite

    NASA Astrophysics Data System (ADS)

    Aguilar Cisneros, Jorge; Vargas Martinez, Hector; Pedroza Melendez, Alejandro; Alonso Arevalo, Miguel

    2013-09-01

    Mexico is a country where experience in building software for satellite applications is just beginning. This is a delicate situation because in the near future we will need to develop software for the SATEX-II (Mexican Experimental Satellite). SATEX-II is a project of SOMECyTA (the Mexican Society of Aerospace Science and Technology). We have experience applying software development methodologies such as TSP (Team Software Process) and SCRUM in other areas. We analyzed these methodologies and concluded that they can be applied to develop the software for the SATEX-II, supported by the ESA PSS-05-0 standard, in particular ESA PSS-05-11. Our analysis focused on the main characteristics of each methodology and on how these methodologies could be used together with the ESA PSS-05-0 standards. Our outcomes may, in general, be used by teams who need to build small satellites, but in particular they will be used when we build the on-board software applications for the SATEX-II.

  20. Principles underlying the design of "The Number Race", an adaptive computer game for remediation of dyscalculia

    PubMed Central

    Wilson, Anna J; Dehaene, Stanislas; Pinel, Philippe; Revkin, Susannah K; Cohen, Laurent; Cohen, David

    2006-01-01

    Background: Adaptive game software has been successful in remediation of dyslexia. Here we describe the cognitive and algorithmic principles underlying the development of similar software for dyscalculia. Our software is based on current understanding of the cerebral representation of number and the hypotheses that dyscalculia is due to a "core deficit" in number sense or in the link between number sense and symbolic number representations. Methods: "The Number Race" software trains children on an entertaining numerical comparison task, by presenting problems adapted to the performance level of the individual child. We report full mathematical specifications of the algorithm used, which relies on an internal model of the child's knowledge in a multidimensional "learning space" consisting of three difficulty dimensions: numerical distance, response deadline, and conceptual complexity (from non-symbolic numerosity processing to increasingly complex symbolic operations). Results: The performance of the software was evaluated both by mathematical simulations and by five weeks of use by nine children with mathematical learning difficulties. The results indicate that the software adapts well to varying levels of initial knowledge and learning speeds. Feedback from children, parents and teachers was positive. A companion article [1] describes the evolution of number sense and arithmetic scores before and after training. Conclusion: The software, open-source and freely available online, is designed for learning disabled children aged 5–8, and may also be useful for general instruction of normal preschool children. The learning algorithm reported is highly general, and may be applied in other domains. PMID:16734905
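
    The adaptive principle, presenting problems near the edge of the child's current ability and updating an internal model from each response, can be illustrated with a one-dimensional staircase on the numerical-distance dimension. The published algorithm tracks a full multidimensional learning space; this is a deliberate simplification:

```python
import random

distance = 8          # current numerical distance between the two numbers
streak = []           # consecutive correct answers since the last change

def next_problem():
    """Pick two numbers to compare, separated by the current distance."""
    a = random.randint(1, 9)
    return a, a + distance

def update(correct):
    """2-down-1-up staircase: harder after two correct answers, easier after a miss."""
    global distance
    streak.append(correct)
    if not correct:
        distance = min(distance + 1, 9)   # easier: larger numerical distance
        streak.clear()
    elif len(streak) >= 2:
        distance = max(distance - 1, 1)   # harder: smaller numerical distance
        streak.clear()
```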

  1. OHD/HL - Staff

    Science.gov Websites

    Hydrology Laboratory branches: the Hydrologic Software Engineering Branch (HSEB) and the Hydrologic Science and Modeling Branch (HSMB). The site provides general information, publications, documentation, software standards and guidelines, contact information, and a staff directory (Chief, Hydrology Laboratory; Chief, Hydrologic Software Engineering).

  2. Standardizing Activation Analysis: New Software for Photon Activation Analysis

    NASA Astrophysics Data System (ADS)

    Sun, Z. J.; Wells, D.; Segebade, C.; Green, J.

    2011-06-01

    Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. For the purpose of saving time and manpower and minimizing error, a computer program was designed, built and implemented using SQL, Access 2007 and ASP.NET technology to automate this process. Based on the peak information of the spectrum and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software can also be operated in browser/server mode, which makes it usable anywhere the Internet is accessible. By switching the underlying nuclide library and the related formulas, the new software can easily be expanded to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementation of this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimal input from the user, the software has proven to be fast, user-friendly and reliable.

  3. A numerical differentiation library exploiting parallel architectures

    NASA Astrophysics Data System (ADS)

    Voglis, C.; Hadjidoukas, P. E.; Lagaris, I. E.; Papageorgiou, D. G.

    2009-08-01

    We present a software library for numerically estimating first and second order partial derivatives of a function by finite differencing. Various truncation schemes are offered resulting in corresponding formulas that are accurate to order O(h), O(h^2), and O(h^4), h being the differencing step. The derivatives are calculated via forward, backward and central differences. Care has been taken that only feasible points are used in the case where bound constraints are imposed on the variables. The Hessian may be approximated either from function or from gradient values. There are three versions of the software: a sequential version, an OpenMP version for shared memory architectures and an MPI version for distributed systems (clusters). The parallel versions exploit the multiprocessing capability offered by computer clusters, as well as modern multi-core systems, and, due to the independent character of the derivative computation, the speedup scales almost linearly with the number of available processors/cores. Program summary: Program title: NDL (Numerical Differentiation Library) Catalogue identifier: AEDG_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEDG_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 73 030 No. of bytes in distributed program, including test data, etc.: 630 876 Distribution format: tar.gz Programming language: ANSI FORTRAN-77, ANSI C, MPI, OPENMP Computer: Distributed systems (clusters), shared memory systems Operating system: Linux, Solaris Has the code been vectorised or parallelized?: Yes RAM: The library uses O(N) internal storage, N being the dimension of the problem Classification: 4.9, 4.14, 6.5 Nature of problem: The numerical estimation of derivatives at several accuracy levels is a common requirement in many computational tasks, such as optimization, solution of nonlinear systems, etc. The parallel implementation that exploits systems with multiple CPUs is very important for large scale and computationally expensive problems. Solution method: Finite differencing is used with a carefully chosen step that minimizes the sum of the truncation and round-off errors. The parallel versions employ both OpenMP and MPI libraries. Restrictions: The library uses only double precision arithmetic. Unusual features: The software takes into account bound constraints, in the sense that only feasible points are used to evaluate the derivatives, and given the level of the desired accuracy, the proper formula is automatically employed. Running time: Running time depends on the function's complexity. The test run took 15 ms for the serial distribution, 0.6 s for the OpenMP and 4.2 s for the MPI parallel distribution on 2 processors.
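
    The core finite-difference formulas such a library provides are easy to state: forward differencing is O(h) accurate and central differencing O(h^2), with the step chosen to balance truncation against round-off error. A NumPy sketch of a central-difference gradient; the step-selection rule is the textbook one, not necessarily NDL's exact choice:

```python
import numpy as np

def gradient_central(f, x):
    """Central-difference gradient, accurate to O(h^2) per component."""
    x = np.asarray(x, dtype=float)
    eps = np.finfo(float).eps
    g = np.empty_like(x)
    for i in range(x.size):
        # Balance truncation error (~h^2) against round-off (~eps/h).
        h = eps ** (1.0 / 3.0) * max(1.0, abs(x[i]))
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

rosen = lambda x: 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2
print(gradient_central(rosen, [1.0, 1.0]))   # analytic gradient is (0, 0)
```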

  4. Generating standardized image data for testing and calibrating quantification of volumes, surfaces, lengths, and object counts in fibrous and porous materials using X-ray microtomography.

    PubMed

    Jiřík, Miroslav; Bartoš, Martin; Tomášek, Petr; Malečková, Anna; Kural, Tomáš; Horáková, Jana; Lukáš, David; Suchý, Tomáš; Kochová, Petra; Hubálek Kalbáčová, Marie; Králíčková, Milena; Tonar, Zbyněk

    2018-06-01

    Quantification of the structure and composition of biomaterials using micro-CT requires image segmentation due to the low contrast and overlapping radioopacity of biological materials. The amount of bias introduced by segmentation procedures is generally unknown. We aim to develop software that generates three-dimensional models of fibrous and porous structures with known volumes, surfaces, lengths, and object counts in fibrous materials and to provide a software tool that calibrates quantitative micro-CT assessments. Virtual image stacks were generated using the newly developed software TeIGen, enabling the simulation of micro-CT scans of unconnected tubes, connected tubes, and porosities. A realistic noise generator was incorporated. Forty image stacks were evaluated using micro-CT, and the error between the true known and estimated data was quantified. Starting with geometric primitives, the error of the numerical estimation of surfaces and volumes was eliminated, thereby enabling the quantification of volumes and surfaces of colliding objects. Analysis of the sensitivity of the thresholding upon parameters of generated testing image sets revealed the effects of decreasing resolution and increasing noise on the accuracy of the micro-CT quantification. The size of the error increased with decreasing resolution when the voxel size exceeded 1/10 of the typical object size, which simulated the effect of the smallest details that could still be reliably quantified. Open-source software for calibrating quantitative micro-CT assessments by producing and saving virtually generated image data sets with known morphometric data was made freely available to researchers involved in morphometry of three-dimensional fibrillar and porous structures in micro-CT scans. © 2018 Wiley Periodicals, Inc.

  5. Three-dimensional representations of salt-dome margins at four active strategic petroleum reserve sites.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rautman, Christopher Arthur; Stein, Joshua S.

    2003-01-01

    Existing paper-based site characterization models of salt domes at the four active U.S. Strategic Petroleum Reserve sites have been converted to digital format and visualized using modern computer software. The four sites are the Bayou Choctaw dome in Iberville Parish, Louisiana; the Big Hill dome in Jefferson County, Texas; the Bryan Mound dome in Brazoria County, Texas; and the West Hackberry dome in Cameron Parish, Louisiana. A new modeling algorithm has been developed to overcome limitations of many standard geological modeling software packages in order to deal with structurally overhanging salt margins that are typical of many salt domes. This algorithm, and the implementing computer program, make use of the existing interpretive modeling conducted manually using professional geological judgement and presented in two dimensions in the original site characterization reports as structure contour maps on the top of salt. The algorithm makes use of concepts of finite-element meshes of general engineering usage. Although the specific implementation of the algorithm described in this report and the resulting output files are tailored to the modeling and visualization software used to construct the figures contained herein, the algorithm itself is generic and other implementations and output formats are possible. The graphical visualizations of the salt domes at the four Strategic Petroleum Reserve sites are believed to be major improvements over the previously available two-dimensional representations of the domes via conventional geologic drawings (cross sections and contour maps). Additionally, the numerical mesh files produced by this modeling activity are available for import into and display by other software routines. The mesh data are not explicitly tabulated in this report; however an electronic version in simple ASCII format is included on a PC-based compact disk.

  6. AN ADA LINEAR ALGEBRA PACKAGE MODELED AFTER HAL/S

    NASA Technical Reports Server (NTRS)

    Klumpp, A. R.

    1994-01-01

    This package extends the Ada programming language to include linear algebra capabilities similar to those of the HAL/S programming language. The package is designed for avionics applications such as Space Station flight software. In addition to the HAL/S built-in functions, the package incorporates the quaternion functions used in the Shuttle and Galileo projects, and routines from LINPACK that solve systems of equations involving general square matrices. Language conventions in this package follow those of HAL/S to the maximum extent practical and minimize the effort required for writing new avionics software and translating existing software into Ada. Valid numeric types in this package include scalar, vector, matrix, and quaternion declarations. (Quaternions are four-component vectors used in representing motion between two coordinate frames.) Single precision and double precision floating point arithmetic is available in addition to the standard double precision integer manipulation. Infix operators are used instead of function calls to define dot products, cross products, quaternion products, and mixed scalar-vector, scalar-matrix, and vector-matrix products. The package contains two generic programs: one for floating point, and one for integer. The actual component type is passed as a formal parameter to the generic linear algebra package. The procedures for solving systems of linear equations defined by general matrices include GEFA, GECO, GESL, and GIDI. The HAL/S functions include ABVAL, UNIT, TRACE, DET, INVERSE, TRANSPOSE, GET, PUT, FETCH, PLACE, and IDENTITY. This package is written in Ada (Version 1.2) for batch execution and is machine independent. The linear algebra software depends on nothing outside the Ada language except for a call to a square root function for floating point scalars (such as SQRT in the DEC VAX MATHLIB library). This program was developed in 1989, and is a copyrighted work with all copyright vested in NASA.

  7. Collaborative Data Publication Utilizing the Open Data Repository's (ODR) Data Publisher

    NASA Technical Reports Server (NTRS)

    Stone, N.; Lafuente, B.; Bristow, T.; Keller, R. M.; Downs, R. T.; Blake, D.; Fonda, M.; Dateo, C.; Pires, A.

    2017-01-01

    Introduction: For small communities in diverse fields such as astrobiology, publishing and sharing data can be a difficult challenge. While large, homogeneous fields often have repositories and existing data standards, small groups of independent researchers have few options for publishing standards and data that can be utilized within their community. In conjunction with teams at NASA Ames and the University of Arizona, the Open Data Repository's (ODR) Data Publisher has been conducting ongoing pilots to assess the needs of diverse research groups and to develop software to allow them to publish and share their data collaboratively. Objectives: The ODR's Data Publisher aims to provide an easy-to-use and implement software tool that will allow researchers to create and publish database templates and related data. The end product will facilitate both human-readable interfaces (web-based with embedded images, files, and charts) and machine-readable interfaces utilizing semantic standards. Characteristics: The Data Publisher software runs on the standard LAMP (Linux, Apache, MySQL, PHP) stack to provide the widest server base available. The software is based on Symfony (www.symfony.com) which provides a robust framework for creating extensible, object-oriented software in PHP. The software interface consists of a template designer where individual or master database templates can be created. A master database template can be shared by many researchers to provide a common metadata standard that will set a compatibility standard for all derivative databases. Individual researchers can then extend their instance of the template with custom fields, file storage, or visualizations that may be unique to their studies. This allows groups to create compatible databases for data discovery and sharing purposes while still providing the flexibility needed to meet the needs of scientists in rapidly evolving areas of research. Research: As part of this effort, a number of ongoing pilot and test projects are currently in progress. The Astrobiology Habitable Environments Database Working Group is developing a shared database standard using the ODR's Data Publisher and has a number of example databases where astrobiology data are shared. Soon these databases will be integrated via the template-based standard. Work with this group helps determine what data researchers in these diverse fields need to share and archive. Additionally, this pilot helps determine what standards are viable for sharing these types of data, from internally developed standards to existing open standards such as the Dublin Core (http://dublincore.org) and Darwin Core (http://rs.tdwg.org) metadata standards. Further studies are ongoing with the University of Arizona Department of Geosciences where a number of mineralogy databases are being constructed within the ODR Data Publisher system. Conclusions: Through the ongoing pilots and discussions with individual researchers and small research teams, a definition of the tools desired by these groups is coming into focus. As the software development moves forward, the goal is to meet the publication and collaboration needs of these scientists in an unobtrusive and functional way.

  8. The measurements of water flow rates in the straight microchannel based on the scanning micro-PIV technique

    NASA Astrophysics Data System (ADS)

    Wang, H. L.; Han, W.; Xu, M.

    2011-12-01

    Measurement of the water flow rate in microchannels has been one of the hottest topics in applications of microfluidics and in medical, biological and chemical analyses. In this study, the scanning microscale particle image velocimetry (scanning micro-PIV) technique is used to measure water flow rates in a straight microchannel of 200 μm width and 60 μm depth under standard flow rates ranging from 2.481 μL/min to 8.269 μL/min. The essence of this measurement technique is to obtain the three-dimensional velocity distribution on cross sections of the microchannel by measuring velocities of the different fluid layers along the out-of-plane direction, so that the water flow rate can be evaluated from the discrete surface integral of the velocities on the cross section. At the same time, the three-dimensional velocity fields in the measured microchannel are simulated numerically using the FLUENT software in order to verify the accuracy of the measured velocities. The results show that the experimental flow rates agree well with the standard flow rates input by the syringe pump, and that the numerical and experimental results are fundamentally consistent. This study indicates that evaluating the micro-flow rate from three-dimensional velocities obtained by the scanning micro-PIV technique is a promising method for micro-flow rate research.
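
    Once the scanning technique has produced a velocity value at each point of a grid across the channel cross section, the flow rate is the discrete surface integral of those velocities. A sketch with a synthetic velocity profile on the 200 μm x 60 μm section standing in for measured micro-PIV data:

```python
import numpy as np

# Grid over the 200 um x 60 um cross section (SI units).
w, d = 200e-6, 60e-6
y = np.linspace(0.0, w, 41)          # across the width
z = np.linspace(0.0, d, 13)          # across the depth
Y, Z = np.meshgrid(y, z)

# Synthetic streamwise velocity (m/s), zero at the walls, ~4 mm/s peak.
u = 4e-3 * 16 * (Y / w) * (1 - Y / w) * (Z / d) * (1 - Z / d)

# Discrete surface integral: Q = sum over the grid of u * dA.
dA = (y[1] - y[0]) * (z[1] - z[0])
Q = u.sum() * dA                      # m^3/s
print(Q * 1e9 * 60, "uL/min")         # ~1.3 uL/min, same order as the experiments
```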

  9. Semiannual report, 1 April - 30 September 1991

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The major categories of the current Institute for Computer Applications in Science and Engineering (ICASE) research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification problems, with emphasis on effective numerical methods; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software for parallel computers. Research in these areas is discussed.

  10. The Application of Software Safety to the Constellation Program Launch Control System

    NASA Technical Reports Server (NTRS)

    Kania, James; Hill, Janice

    2011-01-01

    The application of software safety practices on the LCS project resulted in the successful implementation of the NASA Software Safety Standard NASA-STD-8719.13B and CxP software safety requirements. The GOP-GEN-GSW-011 Hazard Report was the first report developed at KSC to identify software hazard causes and their controls. This approach can be applied to similar large software-intensive systems where loss of control can lead to a hazard.

  11. Integrating open-source software applications to build molecular dynamics systems.

    PubMed

    Allen, Bruce M; Predecki, Paul K; Kumosa, Maciej

    2014-04-05

    Three open-source applications, NanoEngineer-1, packmol, and msi2lmp, are integrated using an open-source file format to quickly create molecular dynamics (MD) cells for simulation. The three software applications collectively make up the open-source software (OSS) suite known as MD Studio (MDS). The software is validated through software engineering practices and is verified through simulation of the diglycidyl ether of bisphenol-A and isophorone diamine (DGEBA/IPD) system. Multiple simulations are run using the MDS software to create MD cells, and the data generated are used to calculate density, bulk modulus, and glass transition temperature of the DGEBA/IPD system. Simulation results compare well with published experimental and numerical results. The MDS software prototype confirms that OSS applications can be analyzed against real-world research requirements and integrated to create a new capability. Copyright © 2014 Wiley Periodicals, Inc.

  12. Using SWAT to enhance watershed-based plans to meet numeric water quality standards

    USDA-ARS?s Scientific Manuscript database

    The number of states that have adopted numeric nutrient water-quality standards has increased to 23, up from ten in 1998. One state with both stream and reservoir phosphorus (P) numeric water-quality standards is Oklahoma. There were two primary objectives of this research: (1) determine if Oklaho...

  13. 45 CFR 153.350 - Risk adjustment data validation standards.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... implementation of any risk adjustment software and ensure proper validation of a statistically valid sample of... respect to implementation of risk adjustment software or as a result of data validation conducted pursuant... implementation of risk adjustment software or data validation. ...

  14. Software production methodology tested project

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1976-01-01

    The history and results of a 3 1/2-year study in software development methodology are reported. The findings of this study have become the basis for DSN software development guidelines and standard practices. The article discusses accomplishments, discoveries, problems, recommendations and future directions.

  15. Roadmap for cardiovascular circulation model

    PubMed Central

    Bradley, Christopher P.; Suresh, Vinod; Mithraratne, Kumar; Muller, Alexandre; Ho, Harvey; Ladd, David; Hellevik, Leif R.; Omholt, Stig W.; Chase, J. Geoffrey; Müller, Lucas O.; Watanabe, Sansuke M.; Blanco, Pablo J.; de Bono, Bernard; Hunter, Peter J.

    2016-01-01

    Computational models of many aspects of the mammalian cardiovascular circulation have been developed. Indeed, along with orthopaedics, this area of physiology is one that has attracted much interest from engineers, presumably because the equations governing blood flow in the vascular system are well understood and can be solved with well‐established numerical techniques. Unfortunately, there have been only a few attempts to create a comprehensive public domain resource for cardiovascular researchers. In this paper we propose a roadmap for developing an open source cardiovascular circulation model. The model should be registered to the musculo‐skeletal system. The computational infrastructure for the cardiovascular model should provide for near real‐time computation of blood flow and pressure in all parts of the body. The model should deal with vascular beds in all tissues, and the computational infrastructure for the model should provide links into CellML models of cell function and tissue function. In this work we review the literature associated with 1D blood flow modelling in the cardiovascular system, discuss model encoding standards, software and a model repository. We then describe the coordinate systems used to define the vascular geometry, derive the equations and discuss the implementation of these coupled equations in the open source computational software OpenCMISS. Finally, some preliminary results are presented and plans outlined for the next steps in the development of the model, the computational software and the graphical user interface for accessing the model. PMID:27506597
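
    As a point of reference for the equations the roadmap derives: 1D blood flow in a compliant vessel is conventionally written as a coupled system for cross-sectional area A(x,t) and volumetric flow Q(x,t). A sketch of the standard form (the exact closure and friction model used in OpenCMISS may differ):

```latex
% Mass and momentum balance for 1D blood flow in a compliant vessel.
\frac{\partial A}{\partial t} + \frac{\partial Q}{\partial x} = 0,
\qquad
\frac{\partial Q}{\partial t}
  + \frac{\partial}{\partial x}\!\left(\alpha\,\frac{Q^{2}}{A}\right)
  + \frac{A}{\rho}\,\frac{\partial p}{\partial x}
  = -K_{R}\,\frac{Q}{A}.
% alpha: momentum-flux correction factor; rho: blood density;
% K_R: viscous friction coefficient (8*pi*nu for a parabolic velocity profile).
% A tube law, e.g. p = p_ext + beta*(sqrt(A) - sqrt(A_0)), closes the system.
```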

  16. Roadmap for cardiovascular circulation model.

    PubMed

    Safaei, Soroush; Bradley, Christopher P; Suresh, Vinod; Mithraratne, Kumar; Muller, Alexandre; Ho, Harvey; Ladd, David; Hellevik, Leif R; Omholt, Stig W; Chase, J Geoffrey; Müller, Lucas O; Watanabe, Sansuke M; Blanco, Pablo J; de Bono, Bernard; Hunter, Peter J

    2016-12-01

    Computational models of many aspects of the mammalian cardiovascular circulation have been developed. Indeed, along with orthopaedics, this area of physiology is one that has attracted much interest from engineers, presumably because the equations governing blood flow in the vascular system are well understood and can be solved with well-established numerical techniques. Unfortunately, there have been only a few attempts to create a comprehensive public domain resource for cardiovascular researchers. In this paper we propose a roadmap for developing an open source cardiovascular circulation model. The model should be registered to the musculo-skeletal system. The computational infrastructure for the cardiovascular model should provide for near real-time computation of blood flow and pressure in all parts of the body. The model should deal with vascular beds in all tissues, and the computational infrastructure for the model should provide links into CellML models of cell function and tissue function. In this work we review the literature associated with 1D blood flow modelling in the cardiovascular system, discuss model encoding standards, software and a model repository. We then describe the coordinate systems used to define the vascular geometry, derive the equations and discuss the implementation of these coupled equations in the open source computational software OpenCMISS. Finally, some preliminary results are presented and plans outlined for the next steps in the development of the model, the computational software and the graphical user interface for accessing the model. © 2016 The Authors. The Journal of Physiology © 2016 The Physiological Society.

  17. Digital diagnosis of medical images

    NASA Astrophysics Data System (ADS)

    Heinonen, Tomi; Kuismin, Raimo; Jormalainen, Raimo; Dastidar, Prasun; Frey, Harry; Eskola, Hannu

    2001-08-01

    The popularity of digital imaging devices and PACS installations has increased during recent years. Still, images are analyzed and diagnosed using conventional techniques. Our research group began to study the requirements for digital image diagnostic methods to be applied together with PACS systems. The research was focused on various image analysis procedures (e.g., segmentation, volumetry, 3D visualization, image fusion, anatomic atlas, etc.) that could be useful in medical diagnosis. We have developed Image Analysis software (www.medimag.net) that enables several image-processing applications in medical diagnosis, such as volumetry, multimodal visualization, and 3D visualization. We have also developed a commercial scalable image archive system (ActaServer, supports DICOM) based on component technology (www.acta.fi), and several telemedicine applications. All the software and systems operate in the Windows NT environment and are in clinical use in several hospitals. The analysis software has been applied in clinical work and utilized in numerous patient cases (500 patients). This method has been used in the diagnosis, therapy and follow-up of various diseases of the central nervous system (CNS), respiratory system (RS) and human reproductive system (HRS). In many of these diseases, e.g. Systemic Lupus Erythematosus (CNS), nasal airway diseases (RS) and ovarian tumors (HRS), these methods have been used for the first time in clinical work. According to our results, digital diagnosis improves diagnostic capabilities, and together with PACS installations it will become a standard tool during the next decade by enabling more accurate diagnosis and patient follow-up.

  18. FacetModeller: Software for manual creation, manipulation and analysis of 3D surface-based models

    NASA Astrophysics Data System (ADS)

    Lelièvre, Peter G.; Carter-McAuslan, Angela E.; Dunham, Michael W.; Jones, Drew J.; Nalepa, Mariella; Squires, Chelsea L.; Tycholiz, Cassandra J.; Vallée, Marc A.; Farquharson, Colin G.

    2018-01-01

    The creation of 3D models is commonplace in many disciplines. Models are often built from a collection of tessellated surfaces. To apply numerical methods to such models it is often necessary to generate a mesh of space-filling elements that conforms to the model surfaces. While there are meshing algorithms that can do so, they place restrictive requirements on the surface-based models that are rarely met by existing 3D model building software. Hence, we have developed a Java application named FacetModeller, designed for efficient manual creation, modification and analysis of 3D surface-based models destined for use in numerical modelling.

  19. Applications of magnetic resonance image segmentation in neurology

    NASA Astrophysics Data System (ADS)

    Heinonen, Tomi; Lahtinen, Antti J.; Dastidar, Prasun; Ryymin, Pertti; Laarne, Paeivi; Malmivuo, Jaakko; Laasonen, Erkki; Frey, Harry; Eskola, Hannu

    1999-05-01

    After the introduction of digital imaging devices in medicine, computerized tissue recognition and classification have become important in research and clinical applications. Segmented data can be applied in numerous research fields, including volumetric analysis of particular tissues and structures, construction of anatomical models, 3D visualization, and multimodal visualization, hence making segmentation essential in modern image analysis. In this research project, several PC-based software tools were developed in order to segment medical images, to visualize raw and segmented images in 3D, and to produce EEG brain maps in which MR images and EEG signals are integrated. The software package was tested and validated in numerous clinical research projects in a hospital environment.

  20. Open-source meteor detection software for low-cost single-board computers

    NASA Astrophysics Data System (ADS)

    Vida, D.; Zubović, D.; Šegon, D.; Gural, P.; Cupec, R.

    2016-01-01

    This work aims to overcome the current price threshold of meteor stations which can sometimes deter meteor enthusiasts from owning one. In recent years small card-sized computers became widely available and are used for numerous applications. To utilize such computers for meteor work, software which can run on them is needed. In this paper we present a detailed description of newly-developed open-source software for fireball and meteor detection optimized for running on low-cost single board computers. Furthermore, an update on the development of automated open-source software which will handle video capture, fireball and meteor detection, astrometry and photometry is given.

  1. Scilab and Maxima Environment: Towards Free Software in Numerical Analysis

    ERIC Educational Resources Information Center

    Mora, Angel; Galan, Jose Luis; Aguilera, Gabriel; Fernandez, Alvaro; Merida, Enrique; Rodriguez, Pedro

    2010-01-01

    In this work we will present the ScilabUMA environment we have developed as an alternative to Matlab. This environment connects Scilab (for numerical analysis) and Maxima (for symbolic computations). Furthermore, the developed interface is, in our opinion at least, as powerful as the interface of Matlab. (Contains 3 figures.)

  2. Numerical simulation of the casting process of titanium tooth crowns and bridges.

    PubMed

    Wu, M; Augthun, M; Wagner, I; Sahm, P R; Spiekermann, H

    2001-06-01

    The objectives of this paper were to simulate the casting process of titanium tooth crowns and bridges and to predict and control porosity defects. A casting simulation software package, MAGMASOFT, was used. The geometry of the crowns, with fine details of the occlusal surface, was digitized by means of a laser measuring technique, then converted and read into the simulation software. Both mold filling and solidification were simulated; the shrinkage porosity was predicted by a "feeding criterion", and the gas pore sensitivity was studied based on the mold filling and solidification simulations. Two types of dental prostheses (a single-crown casting and a three-unit bridge) with various sprue designs were numerically "poured", and only one optimal design for each prosthesis was recommended for a real casting trial. With the numerically optimized design, real titanium dental prostheses (five replicas of each) were made on a centrifugal casting machine. All the castings underwent radiographic examination, and no porosity was detected in the cast prostheses. This indicates that numerical simulation is an efficient tool for dental casting design and porosity control. Copyright 2001 Kluwer Academic Publishers

  3. Application of the inverse analysis for determining the material properties of the woven fabrics for macroscopic approach

    NASA Astrophysics Data System (ADS)

    Oleksik, Mihaela; Oleksik, Valentin

    2013-05-01

    The current paper presents a fast method for determining the material characteristics of composite materials used in airbag manufacturing. The inverse analysis method was used to determine the material data needed for other complex numerical simulations at the macroscopic level. Tensile tests were carried out on the composite material along two directions - the direction of the weft and the direction of the warp - and numerical simulations were then performed using the LS-DYNA software. A second stage consisted of numerical simulation through the finite element method and experimental testing for the Bias test. The material characteristics of the composite fabric were then obtained by applying a multicriterial analysis using the LS-OPT software, minimizing the mismatch between the force-displacement curves obtained numerically and experimentally for both directions (weft and warp), as well as the mismatch between the strain-extension curves for two points in the Bias test.

  4. Evaluation of the efficiency and reliability of software generated by code generators

    NASA Technical Reports Server (NTRS)

    Schreur, Barbara

    1994-01-01

    There are numerous studies which show that CASE Tools greatly facilitate software development. As a result of these advantages, an increasing amount of software development is done with CASE Tools. As more software engineers become proficient with these tools, their experience and feedback lead to further development of the tools themselves. What has not been widely studied, however, is the reliability and efficiency of the actual code produced by CASE Tools. This investigation addressed those questions. Three segments of code generated by MATRIXx, one of many commercially available CASE Tools, were chosen for analysis: ETOFLIGHT, a portion of the Earth to Orbit Flight software, and ECLSS and PFMC, modules for Environmental Control and Life Support System and Pump Fan Motor Control, respectively.

  5. Using a web-based survey tool to undertake a Delphi study: application for nurse education research.

    PubMed

    Gill, Fenella J; Leslie, Gavin D; Grech, Carol; Latour, Jos M

    2013-11-01

    The Internet is increasingly being used as a data collection medium to access research participants. This paper reports on the experience and value of using web-survey software to conduct an eDelphi study to develop Australian critical care course graduate practice standards. The eDelphi technique used involved the iterative process of administering three rounds of surveys to a national expert panel. The survey was developed online using SurveyMonkey. Panel members responded to statements using one rating scale for round one and two scales for rounds two and three. Text boxes for panel comments were provided. For each round, SurveyMonkey's email tool was used to distribute an individualized email invitation containing the survey web link. The distribution of panel responses, individual responses and a summary of comments were emailed to panel members. Stacked bar charts representing the distribution of responses were generated using the SurveyMonkey software. Panel response rates remained greater than 85% over all rounds. An online survey provided numerous advantages over traditional survey approaches, including high-quality data collection, ease and speed of survey administration, direct communication with the panel, and rapid collation of feedback, allowing data collection to be undertaken in 12 weeks. Only minor challenges were experienced using the technology. Ethical issues, specific to using the Internet to conduct research and external hosting of web-based software, lacked formal guidance. High response rates and an increased level of data quality were achieved in this study using web-survey software, and the process was efficient and user-friendly. However, when considering online survey software, it is important to match the research design with the computer capabilities of participants and recognize that ethical review guidelines and processes have not yet kept pace with online research practices. Copyright © 2013 Elsevier Ltd. All rights reserved.

  6. Evolving the ECSS Standards and their Use: Experience Based on Industrial Case Studies

    NASA Astrophysics Data System (ADS)

    Feldt, R.; Ahmad, E.; Raza, B.; Hult, E.; Nordebäck, T.

    2009-05-01

    This paper introduces two case studies conducted at two Swedish companies developing software for the space industry. The overall goal of the project is to evaluate if current use of ECSS is cost efficient and if there are ways to make the process leaner while maintaining quality. The case studies reported on here focused on how the ECSS standard was used by the companies and how that affected software development processes and software quality. This paper describes the results and recommendations based on identified challenges.

  7. Software quality assurance plan for GCS

    NASA Technical Reports Server (NTRS)

    Duncan, Stephen E.; Bailey, Elizabeth K.

    1990-01-01

    The software quality assurance (SQA) function for the Guidance and Control Software (GCS) project which is part of a software error studies research program is described. The SQA plan outlines all of the procedures, controls, and audits to be carried out by the SQA organization to ensure adherence to the policies, procedures, and standards for the GCS project.

  8. [Safety and health in workers employed in industry. Data from Industrial Accidents Compensation Board (INAIL) and National Social Security Institute (INPS), Veneto Region, 1994-2002].

    PubMed

    Mastrangelo, G; Carassai, Patrizia; Carletti, Claudia; Cattani, F; De Zorzi, Lia; Di Loreto, G; Dini, M; Mattioni, G; Mundo, Antonietta; Noceta, R; Ortolani, G; Piccioni, M; Sartori, Angela; Sereno, Antonella; Priolo, G; Scoizzato, L; Marangi, G; Marchiori, L

    2008-01-01

    A decreasing time trend for occupational injuries and sickness absence would be expected as an effect of the new legislation (D.Lgs. 626/94 and successive laws) on prevention in occupational settings. Conversely, a reduction in INPS disability would reflect a health improvement due to non-occupational causes. The aim of the study was to investigate the efficacy of the new legislation among employees in industry (where the law was mainly applied), via the time trend of three standardized rates in the Veneto Region. The numerator for the rate of occupational accidents (cases occurring in industry workers in the Veneto Region, broken down by sex, age and calendar year) was supplied by INAIL. The denominator for the above rate, as well as numerators and denominators for disability and sickness absence, were supplied by INPS. Data were available from 1994 to 2002 for accidents and disability, and from 1997 to 2002 for sickness absence. In every year from 1994 to 2002, the rates were standardized for age and sex with the direct method, using an internal "standard" population. The time trend of year-specific standardized rates was analyzed with the Joinpoint regression software. Among industrial workers in the Veneto Region, occupational accidents increased by 0.4% yearly, while disability decreased by 2.56% from 1994 to 2002. Sickness absence increased up to 1999, then decreased. This epidemiological pattern is difficult to explain. The increase in accidents could be due to the increase of non-European Union workers and/or to the fact that accidents on the way to or from work were recognized as occupational accidents by INAIL starting from 2000. Both these phenomena could have contributed to increasing a rate that was otherwise diminishing. On the other hand, this same situation could be due to insufficient efficacy of the legislation (D.Lgs. 626/94 and successive laws) in preventing occupational accidents and diseases.

  9. Weather forecasting with open source software

    NASA Astrophysics Data System (ADS)

    Rautenhaus, Marc; Dörnbrack, Andreas

    2013-04-01

    To forecast the weather situation during aircraft-based atmospheric field campaigns, we employ a tool chain of existing and self-developed open source software tools and open standards. Of particular value are the Python programming language with its extension libraries NumPy, SciPy, PyQt4, Matplotlib and the basemap toolkit, the NetCDF standard with the Climate and Forecast (CF) Metadata conventions, and the Open Geospatial Consortium Web Map Service standard. These open source libraries and open standards helped to implement the "Mission Support System", a Web Map Service based tool to support weather forecasting and flight planning during field campaigns. The tool has been implemented in Python and has also been released as open source (Rautenhaus et al., Geosci. Model Dev., 5, 55-71, 2012). In this presentation we discuss the usage of free and open source software for weather forecasting in the context of research flight planning, and highlight how the field campaign work benefits from using open source tools and open standards.
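
    For readers unfamiliar with this tool chain, the sketch below shows the kind of building block it rests on: opening a CF-compliant NetCDF forecast file with the netCDF4 Python library and plotting one field with Matplotlib. The file name, variable names and dimension layout are hypothetical, not taken from the Mission Support System itself.

        # Hypothetical example: read a CF-convention NetCDF forecast field and plot it.
        # "forecast.nc" and the variable/coordinate names are assumptions, not the
        # actual Mission Support System data layout.
        import netCDF4
        import matplotlib.pyplot as plt

        ds = netCDF4.Dataset("forecast.nc")
        lon = ds.variables["longitude"][:]
        lat = ds.variables["latitude"][:]
        temp = ds.variables["air_temperature"][0, :, :]   # first forecast time step

        plt.contourf(lon, lat, temp, levels=20)
        plt.colorbar(label=ds.variables["air_temperature"].units)  # units from CF metadata
        plt.title("Forecast field at t+0")
        plt.show()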

  10. Core Logistics Capability Policy Applied to USAF Combat Aircraft Avionics Software: A Systems Engineering Analysis

    DTIC Science & Technology

    2010-06-01

    cannot make a distinction between software maintenance and development” (Sharma, 2004). ISO/IEC 12207 Software Lifecycle Processes offers a guide to... synopsis of ISO/IEC 12207, Raghu Singh of the Federal Aviation Administration states “Whenever a software product needs modifications, the development... Corporation. Singh, R. (1998). International Standard ISO/IEC 12207 Software Life Cycle Processes. Washington: Federal Aviation Administration. The Joint

  11. On the mechanics of cerebral aneurysms: experimental research and numerical simulation

    NASA Astrophysics Data System (ADS)

    Parshin, D. V.; Kuianova, I. O.; Yunoshev, A. S.; Ovsyannikov, K. S.; Dubovoy, A. V.

    2017-10-01

    This research extends existing experimental data for CA tissues [1, 2] and presents the preliminary results of numerical calculations. Experiments were performed to measure aneurysm wall stiffness, and the data obtained were analyzed. To reconstruct the geometry of the CAs, DICOM images of real patients with aneurysms and ITK Snap [3] were used. In addition, numerical calculations were performed in ANSYS (commercial software licensed to the Lavrentyev Institute of Hydrodynamics). The results of these numerical calculations show a high level of agreement with experimental data from the previous literature.

  12. Numerical simulation of the hydrodynamic instabilities of Richtmyer-Meshkov and Rayleigh-Taylor

    NASA Astrophysics Data System (ADS)

    Fortova, S. V.; Shepelev, V. V.; Troshkin, O. V.; Kozlov, S. A.

    2017-09-01

    The paper presents the results of numerical simulation of the development of the Richtmyer-Meshkov and Rayleigh-Taylor hydrodynamic instabilities encountered in experiments [1-3]. For the numerical solution, the TPS (Turbulence Problem Solver) software package was used, which implements a generalized approach to constructing computer programs for a wide range of hydrodynamics problems described by systems of equations of hyperbolic type. The numerical methods used are the large-particle method and a second-order ENO scheme with a Roe solver for the approximate solution of the Riemann problem.

  13. An assessment of space shuttle flight software development processes

    NASA Technical Reports Server (NTRS)

    1993-01-01

    In early 1991, the National Aeronautics and Space Administration's (NASA's) Office of Space Flight commissioned the Aeronautics and Space Engineering Board (ASEB) of the National Research Council (NRC) to investigate the adequacy of the current process by which NASA develops and verifies changes and updates to the Space Shuttle flight software. The Committee for Review of Oversight Mechanisms for Space Shuttle Flight Software Processes was convened in Jan. 1992 to accomplish the following tasks: (1) review the entire flight software development process from the initial requirements definition phase to final implementation, including object code build and final machine loading; (2) review and critique NASA's independent verification and validation process and mechanisms, including NASA's established software development and testing standards; (3) determine the acceptability and adequacy of the complete flight software development process, including the embedded validation and verification processes through comparison with (1) generally accepted industry practices, and (2) generally accepted Department of Defense and/or other government practices (comparing NASA's program with organizations and projects having similar volumes of software development, software maturity, complexity, criticality, lines of code, and national standards); (4) consider whether independent verification and validation should continue. An overview of the study, independent verification and validation of critical software, and the Space Shuttle flight software development process are addressed. Findings and recommendations are presented.

  14. A NASA family of minicomputer systems, Appendix A

    NASA Technical Reports Server (NTRS)

    Deregt, M. P.; Dulfer, J. E.

    1972-01-01

    This investigation was undertaken to establish sufficient specifications, or standards, for minicomputer hardware and software to provide NASA with realizable economies in quantity purchases, interchangeability of minicomputers, software, storage and peripherals, and a uniformly high quality. The standards will define minicomputer system component types, each specialized to its intended NASA application, in as many levels of capacity as required.

  15. 25 CFR 547.5 - What are the rules of interpretation and of general application for this part?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... INTERIOR HUMAN SERVICES MINIMUM TECHNICAL STANDARDS FOR GAMING EQUIPMENT USED WITH THE PLAY OF CLASS II... standards apply. Gaming equipment and software used with Class II gaming systems shall meet all applicable... of winning less than 1 in 50,000,000. (d) Approved equipment and software only. All gaming equipment...

  16. Multiple piezo-patch energy harvesters integrated to a thin plate with AC-DC conversion: analytical modeling and numerical validation

    NASA Astrophysics Data System (ADS)

    Aghakhani, Amirreza; Basdogan, Ipek; Erturk, Alper

    2016-04-01

    Plate-like components are widely used in numerous automotive, marine, and aerospace applications where they can be employed as host structures for vibration-based energy harvesting. Piezoelectric patch harvesters can be easily attached to these structures to convert the vibrational energy to electrical energy. Power output investigations of these harvesters require accurate models for energy harvesting performance evaluation and optimization. Equivalent circuit modeling of cantilever-based vibration energy harvesters for estimation of the electrical response has been proposed in recent years. However, equivalent circuit formulation and analytical modeling of multiple piezo-patch energy harvesters integrated to thin plates, including nonlinear circuits, has not been studied. In this study, an equivalent circuit model of multiple parallel piezoelectric patch harvesters together with a resistive load is built in the electronic circuit simulation software SPICE, and voltage frequency response functions (FRFs) are validated using the analytical distributed-parameter model. An analytical formulation of the piezoelectric patches in parallel configuration for the DC voltage output is derived while the patches are connected to a standard AC-DC circuit. The analytic model is based on the equivalent load impedance approach for the piezoelectric capacitance and AC-DC circuit elements. The analytic results are validated numerically via SPICE simulations. Finally, DC power outputs of the harvesters are computed and compared with the peak power amplitudes in the AC output case.

  17. Resolution of singularities for multi-loop integrals

    NASA Astrophysics Data System (ADS)

    Bogner, Christian; Weinzierl, Stefan

    2008-04-01

    We report on a program for the numerical evaluation of divergent multi-loop integrals. The program is based on iterated sector decomposition. We improve the original algorithm of Binoth and Heinrich such that the program is guaranteed to terminate. The program can be used to compute numerically the Laurent expansion of divergent multi-loop integrals regulated by dimensional regularisation. The symbolic and the numerical steps of the algorithm are combined into one program. Program summary: Program title: sector_decomposition; Catalogue identifier: AEAG_v1_0; Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEAG_v1_0.html; Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland; Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html; No. of lines in distributed program, including test data, etc.: 47 506; No. of bytes in distributed program, including test data, etc.: 328 485; Distribution format: tar.gz; Programming language: C++; Computer: all; Operating system: Unix; RAM: depending on the complexity of the problem; Classification: 4.4; External routines: GiNaC (available from http://www.ginac.de), GNU Scientific Library (available from http://www.gnu.org/software/gsl); Nature of problem: computation of divergent multi-loop integrals; Solution method: sector decomposition; Restrictions: only limited by the available memory and CPU time; Running time: depending on the complexity of the problem.

  18. Preserving & Serving 150 years of KNMI data

    NASA Astrophysics Data System (ADS)

    van de Vegte, J.; Som de Cerff, W. J.

    2012-04-01

    The Royal Netherlands Meteorological Institute (KNMI) has over 150 years of knowledge and gathered information related to weather, climate and seismology. A huge part of this information comes from numerical models, in-situ sensor networks and remote sensing satellites. This digital collection is mostly available only internally, and consists of non-searchable, non-standardized file formats, lacking documentation and references to scientific publications. With the Dutch-funded KNMI Data Centre (KDC) project we aim to tackle all these issues. In the project, a user-driven development approach with Scrum was chosen to get maximum user involvement within a relatively short development timeframe. Building on open standards and proven open-source technology (including in-house developed software like ADAGUC WMS and Portal) should result in a solid first release in 2012-Q3. This presentation will focus on the aspects of developing the KDC related to its technical challenges, the human factor, and the development strategy for arriving at a future-proof centre that will preserve our data and make it usable for the community.

  19. Study on cold forming of special fasteners using finite element method

    NASA Astrophysics Data System (ADS)

    Hsia, Shao-Yi; Chou, Yu-Tuan; Yang, Chun-Chieh

    2013-12-01

    Cold forming plays an important role in the field of fasteners, with applications extending to the automotive industry, construction, aerospace and 3C products. This study used the Deform-3D analysis software to investigate the effect of preforms for standard hex nuts. The effective stress, effective strain, velocity field and other information could be obtained from the numerical simulation, and the outcome was verified against the physical phenomena and experiments. Furthermore, the analytical process can also be used to explore the forming technology of special-shaped nuts. By comparison with standard hex nuts at the different stages, optimized cold forming parameters could be extracted from the simulation and adopted to improve manufacturing performance for special-shaped nuts. The results can help multi-pass processing factories to establish a cold forming capability in the development of new products. Consequently, the capability for self-design and self-manufacture of special-shaped fasteners in Taiwan could be increased considerably, enhancing the international competitiveness of domestic industries.

  20. On Software Compatibility.

    ERIC Educational Resources Information Center

    Ershov, Andrei P.

    The problem of software compatibility hampers the development of computer applications. One solution lies in standardization of languages, terms, peripherals, operating systems and computer characteristics. (AB)

  1. Improvement of CSCW Software Implementation in NPD: The CAM Mechanism for a Better Adoption by Users

    ERIC Educational Resources Information Center

    Restrepo, Tomas; Arbelaez, Natalia; Millet, Dominique; Gidel, Thierry

    2010-01-01

    Cooperation between disseminated actors is a key factor in improving new product development (NPD) performance. In recent years, numerous CSCW software applications have been introduced in industry to support NPD, with a low success rate. This is partly due to the limited insight into the organisational and human factors influencing user…

  2. Application of Elements of Numerical Methods in the Analysis of Journal Bearings in AC Induction Motors: An Industry Case Study

    ERIC Educational Resources Information Center

    Ahrens, Fred; Mistry, Rajendra

    2005-01-01

    In product engineering there often arise design analysis problems for which a commercial software package is either unavailable or cost-prohibitive. Further, these calculations often require successive iterations that can be time-intensive when performed by hand, so the development of a software application is indicated. This case relates to the…

  3. Moving Target Techniques: Leveraging Uncertainty for Cyber Defense

    DTIC Science & Technology

    2015-08-24

    vulnerability (a flaw or bug that an attacker can exploit to penetrate or disrupt a system) to successfully compromise systems. Defenders, however... device drivers, numerous software applications, and hardware components. Within the cyberspace, this imbalance between a simple, one-bug attack... parsing code itself could have security-relevant software bugs. Dynamic Network: Techniques in the dynamic network domain change the properties

  4. Computational Solutions for Today’s Navy: New Methods are Being Employed to Meet the Navy’s Changing Software-Development Environment

    DTIC Science & Technology

    2008-03-01

    software-development environment. Frank W. Bentrem, Ph.D., John T. Sample, Ph.D., and Michael M. Harris. The Naval Research Laboratory (NRL) is the... sonars (Through-the-Sensor technology), supercomputer-generated numerical models, and historical/climatological databases. It uses a variety of

  5. A Methodology for Cybercraft Requirement Definition and Initial System Design

    DTIC Science & Technology

    2008-06-01

    the software development concepts of the SDLC, requirements, use cases and domain modeling. It... collectively as Software Development Life Cycle (SDLC) models. While there are numerous models that fit under the SDLC definition, all are based on... developed that provided expanded understanding of the domain, it is necessary to either update an existing domain model or create another domain

  6. Markov Chains For Testing Redundant Software

    NASA Technical Reports Server (NTRS)

    White, Allan L.; Sjogren, Jon A.

    1990-01-01

    Preliminary design developed for validation experiment that addresses problems unique to assuring extremely high quality of multiple-version programs in process-control software. Approach takes into account inertia of controlled system in sense it takes more than one failure of control program to cause controlled system to fail. Verification procedure consists of two steps: experimentation (numerical simulation) and computation, with Markov model for each step.
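
    The abstract's central idea, that system inertia means isolated control-program failures do not bring the controlled system down, can be illustrated with a toy Monte Carlo version of such a Markov model. The per-cycle failure probability p and the consecutive-failure threshold k below are invented for illustration and are not taken from the experiment.

        # Toy illustration (not the validation experiment itself): the system fails
        # only after k consecutive control-program failures; a good cycle resets it.
        import random

        def system_fails(p=0.02, k=3, cycles=5_000, rng=None):
            rng = rng or random.Random()
            consecutive = 0
            for _ in range(cycles):
                if rng.random() < p:        # control program fails this cycle
                    consecutive += 1
                    if consecutive == k:    # inertia exhausted: system failure
                        return True
                else:
                    consecutive = 0         # a successful cycle resets the count
            return False

        trials = 1_000
        fails = sum(system_fails(rng=random.Random(i)) for i in range(trials))
        print(f"estimated system failure probability: {fails / trials:.3f}")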

  7. Objective and Item Banking Computer Software and Its Use in Comprehensive Achievement Monitoring.

    ERIC Educational Resources Information Center

    Schriber, Peter E.; Gorth, William P.

    The current emphasis on objectives and test item banks for constructing more effective tests is being augmented by increasingly sophisticated computer software. Items can be catalogued in numerous ways for retrieval. The items as well as instructional objectives can be stored and test forms can be selected and printed by the computer. It is also…

  8. Comet Gas and Dust Dynamics Modeling

    NASA Technical Reports Server (NTRS)

    Von Allmen, Paul A.; Lee, Seungwon

    2010-01-01

    This software models the gas and dust dynamics of comet coma (the head region of a comet) in order to support the Microwave Instrument for Rosetta Orbiter (MIRO) project. MIRO will study the evolution of the comet 67P/Churyumov-Gerasimenko's coma system. The instrument will measure surface temperature, gas-production rates and relative abundances, and velocity and excitation temperatures of each species along with their spatial temporal variability. This software will use these measurements to improve the understanding of coma dynamics. The modeling tool solves the equation of motion of a dust particle, the energy balance equation of the dust particle, the continuity equation for the dust and gas flow, and the dust and gas mixture energy equation. By solving these equations numerically, the software calculates the temperature and velocity of gas and dust as a function of time for a given initial gas and dust production rate, and a dust characteristic parameter that measures the ability of a dust particle to adjust its velocity to the local gas velocity. The software is written in a modular manner, thereby allowing the addition of more dynamics equations as needed. All of the numerical algorithms are added in-house and no third-party libraries are used.
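
    As a rough illustration of the kind of equation of motion described, the sketch below integrates a dust grain's velocity as it relaxes toward the local gas velocity over a coupling time tau, using SciPy rather than the in-house numerics the abstract mentions. All values are hypothetical.

        # Hedged sketch, not the MIRO support code: dv/dt = (v_gas - v) / tau,
        # a dust grain adjusting its velocity to the local gas flow.
        import numpy as np
        from scipy.integrate import solve_ivp

        def dust_velocity(t, v, v_gas, tau):
            return (v_gas - v) / tau        # drag coupling toward the gas velocity

        v_gas, tau = 500.0, 120.0           # m/s and s, hypothetical values
        sol = solve_ivp(dust_velocity, (0.0, 600.0), [0.0],
                        args=(v_gas, tau), dense_output=True)

        for t in np.linspace(0.0, 600.0, 7):
            print(f"t = {t:6.1f} s   v_dust = {sol.sol(t)[0]:7.2f} m/s")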

  9. Reuse and Interoperability of Avionics for Space Systems

    NASA Technical Reports Server (NTRS)

    Hodson, Robert F.

    2007-01-01

    The space environment presents unique challenges for avionics. Launch survivability, thermal management, radiation protection, and other factors are important for successful space designs. Many existing avionics designs use custom hardware and software to meet the requirements of space systems. Although some space vendors have moved more towards a standard product-line approach to avionics, the space industry still lacks common standards and practices for avionics development. This lack of commonality manifests itself in limited reuse and a lack of interoperability. To address NASA's need for interoperable avionics that facilitate reuse, several hardware and software approaches are discussed. Experiences with existing space boards and the application of terrestrial standards are outlined. Enhancements and extensions to these standards are considered. A modular stack-based approach to space avionics is presented. Software and reconfigurable logic cores are considered for extending interoperability and reuse. Finally, some of the issues associated with the design of reusable interoperable avionics are discussed.

  10. Efficacy of a Newly Designed Cephalometric Analysis Software for McNamara Analysis in Comparison with Dolphin Software.

    PubMed

    Nouri, Mahtab; Hamidiaval, Shadi; Akbarzadeh Baghban, Alireza; Basafa, Mohammad; Fahim, Mohammad

    2015-01-01

    Cephalometric norms of McNamara analysis have been studied in various populations due to their optimal efficiency. Dolphin cephalometric software greatly enhances the conduction of this analysis for orthodontic measurements. However, Dolphin is very expensive and cannot be afforded by many clinicians in developing countries. A suitable alternative software program in Farsi/English will greatly help Farsi-speaking clinicians. The present study aimed to develop an affordable Iranian cephalometric analysis software program and compare it with Dolphin, the standard software available on the market for cephalometric analysis. In this diagnostic, descriptive study, 150 lateral cephalograms of normal occlusion individuals were selected in Mashhad and Qazvin, two major cities of Iran mainly populated with Fars ethnicity, the main Iranian ethnic group. After tracing the cephalograms, the McNamara analysis standards were measured both with Dolphin and the new software. The cephalometric software was designed using Microsoft Visual C++ in Windows XP. Measurements made with the new software were compared with those of Dolphin software on both series of cephalograms. The validity and reliability were tested using the intra-class correlation coefficient. Calculations showed a very high correlation between the results of the Iranian cephalometric analysis software and Dolphin. This confirms the validity and optimal efficacy of the newly designed software (ICC 0.570-1.0). According to our results, the newly designed software has acceptable validity and reliability and can be used for orthodontic diagnosis, treatment planning and assessment of treatment outcome.

  11. Making interdisciplinary solid Earth modeling and analysis tools accessible in a diverse undergraduate and graduate classroom

    NASA Astrophysics Data System (ADS)

    Becker, T. W.

    2011-12-01

    I present results from ongoing, NSF-CAREER funded educational and research efforts that center around making numerical tools in seismology and geodynamics more accessible to a broader audience. The goal is not only to train students in quantitative, interdisciplinary research, but also to make methods more easily accessible to practitioners across disciplines. I describe the two main efforts that were funded, the Solid Earth Research and Teaching Environment (SEATREE, geosys.usc.edu/projects/seatree/), and a new Numerical Methods class. SEATREE is a modular and user-friendly software framework to facilitate using solid Earth research tools in the undergraduate and graduate classroom and for interdisciplinary, scientific collaboration. We use only open-source software, and most programming is done in the Python computer language. We strive to make use of modern software design and development concepts while remaining compatible with traditional scientific coding and existing, legacy software. Our goals are to provide a fully contained, yet transparent package that lets users operate in an easy, graphically supported "black box" mode, while also allowing to look under the hood, for example to conduct numerous forward models to explore parameter space. SEATREE currently has several implemented modules, including on global mantle flow, 2D phase velocity tomography, and 2D mantle convection and was used at the University of Southern California, Los Angeles, and at a 2010 CIDER summer school tutorial. SEATREE was developed in collaboration with engineering and computer science undergraduate students, some of which have gone on to work in Earth Science projects. In the long run, we envision SEATREE to contribute to new ways of sharing scientific research, and making (numerical) experiments truly reproducible again. The other project is a set of lecture notes and Matlab exercises on Numerical Methods in solid Earth, focusing on finite difference and element methods. The class has been taught several times at USC to a broad audience of Earth science students with very diverse levels of exposure to math and physics. It is our goal to bring everyone up to speed and empower students, and we have seen structural geology students with very little exposure to math go on to construct their own numerical models of pTt-paths in a core-complex setting. This exemplifies the goal of teaching students to both be able to put together simple numerical models from scratch, and, perhaps more importantly, to truly understand the basic concepts, capabilities, and pitfalls of the more powerful community codes that are being increasingly used. SEATREE and the Numerical Methods class material are freely available at geodynamics.usc.edu/~becker.
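
    To give a flavor of the from-scratch models such a class aims at, the following is a minimal explicit finite-difference solver for 1-D diffusion, the standard first exercise of this kind; it is a generic illustration, not material taken from the course itself, and all parameters are invented.

        # Generic classroom-style example: explicit finite differences for
        # du/dt = kappa * d2u/dx2 with fixed ends; parameters are illustrative.
        import numpy as np

        nx, kappa, dx = 101, 1.0e-6, 1.0
        dt = 0.4 * dx**2 / kappa            # respects the explicit stability limit of 0.5
        u = np.zeros(nx)
        u[nx // 2] = 1.0                    # initial pulse in the middle

        for _ in range(500):
            # second-order central difference for the Laplacian, fixed endpoints
            u[1:-1] += kappa * dt / dx**2 * (u[2:] - 2.0 * u[1:-1] + u[:-2])

        print("peak value after diffusion:", round(float(u.max()), 4))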

  12. The Infeasibility of Experimental Quantification of Life-Critical Software Reliability

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Finelli, George B.

    1991-01-01

    This paper affirms that quantification of life-critical software reliability is infeasible using statistical methods, whether applied to standard software or fault-tolerant software. The key assumption of software fault tolerance, that separately programmed versions fail independently, is shown to be problematic. This assumption cannot be justified by experimentation in the ultra-reliability region, and subjective arguments in its favor are not sufficiently strong to justify it as an axiom. Also, the implications of the recent multi-version software experiments support this affirmation.

  13. [Development of ophthalmologic software for handheld devices].

    PubMed

    Grottone, Gustavo Teixeira; Pisa, Ivan Torres; Grottone, João Carlos; Debs, Fernando; Schor, Paulo

    2006-01-01

    The formulas for calculation of intraocular lenses (IOLs) have evolved since the first theoretical formulas by Fyodorov. Among the second-generation formulas, the SRK-I formula is simple to compute, involving only the anteroposterior (axial) length, an IOL constant and the average keratometry. As those formulas evolved, their complexity increased, making the reconfiguration of parameters in special situations impracticable. Software developed for this purpose can therefore help surgeons recalculate those values when needed. The aims of this work were to design, develop and test a Brazilian software program for calculation of IOL dioptric power on handheld computers. The software was developed and programmed with PocketC (OrbWorks Concentrated Software, USA). We compared results collected from a gold-standard device (Ultrascan/Alcon Labs) with simulations of 100 fictitious patients, using the same IOL parameters. The results were grouped as ULTRASCAN data and SOFTWARE data. Using the SRK/T formula, the parameters ranged over a keratometry between 35 and 55 D, an axial length between 20 and 28 mm, and IOL constants of 118.7, 118.3 and 115.8. The Wilcoxon test showed that the groups do not differ (p=0.314). The Ultrascan sample varied between 11.82 and 27.97; in the tested program sample the variation was practically identical (11.83-27.98). The average of the Ultrascan group was 20.93, and the software group had a similar average. The standard deviation of the samples was also similar (4.53). The precision of the IOL software for handheld devices was similar to that of the standard device using the SRK/T formula. The software worked properly and ran stably, without bugs, on the tested models of the operating system.
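
    For concreteness, the second-generation SRK-I regression the abstract describes is commonly written as P = A - 2.5 L - 0.9 K, with A the IOL constant, L the axial (anteroposterior) length in mm and K the average keratometry in diopters. The snippet below implements that published regression, not the more involved SRK/T calculation the tested software uses, and the example values are illustrative only.

        # SRK-I regression formula (second generation), shown for illustration;
        # the software compared above implements the more complex SRK/T formula.
        def srk_iol_power(a_constant: float, axial_length_mm: float, mean_k_d: float) -> float:
            return a_constant - 2.5 * axial_length_mm - 0.9 * mean_k_d

        # Illustrative inputs: A = 118.3, L = 23.5 mm, K = 44.0 D
        print(f"{srk_iol_power(118.3, 23.5, 44.0):.2f} D")   # -> 19.95 D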

  14. Feedback loops and temporal misalignment in component-based hydrologic modeling

    NASA Astrophysics Data System (ADS)

    Elag, Mostafa M.; Goodall, Jonathan L.; Castronova, Anthony M.

    2011-12-01

    In component-based modeling, a complex system is represented as a series of loosely integrated components with defined interfaces and data exchanges that allow the components to be coupled together through shared boundary conditions. Although the component-based paradigm is commonly used in software engineering, it has only recently been applied for modeling hydrologic and earth systems. As a result, research is needed to test and verify the applicability of the approach for modeling hydrologic systems. The objective of this work was therefore to investigate two aspects of using component-based software architecture for hydrologic modeling: (1) simulation of feedback loops between components that share a boundary condition and (2) data transfers between temporally misaligned model components. We investigated these topics using a simple case study where diffusion of mass is modeled across a water-sediment interface. We simulated the multimedia system using two model components, one for the water and one for the sediment, coupled using the Open Modeling Interface (OpenMI) standard. The results were compared with a more conventional numerical approach for solving the system where the domain is represented by a single multidimensional array. Results showed that the component-based approach was able to produce the same results obtained with the more conventional numerical approach. When the two components were temporally misaligned, we explored the use of different interpolation schemes to minimize mass balance error within the coupled system. The outcome of this work provides evidence that component-based modeling can be used to simulate complicated feedback loops between systems and guidance as to how different interpolation schemes minimize mass balance error introduced when components are temporally misaligned.
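
    The coupling pattern studied here can be sketched in a few lines: two components advance with different time steps, and each reads the other's boundary value through interpolation in time. The sketch below is schematic Python, not OpenMI, and the relaxation dynamics and step sizes are invented.

        # Schematic two-component coupling with temporally misaligned steps;
        # linear interpolation bridges the misalignment (the last value is held
        # when a query falls beyond the known history).
        import numpy as np

        class Component:
            def __init__(self, value, rate, dt):
                self.t, self.value, self.rate, self.dt = 0.0, value, rate, dt
                self.history = [(0.0, value)]

            def step(self, boundary):
                # relax toward the boundary value supplied by the other component
                self.value += self.rate * (boundary - self.value) * self.dt
                self.t += self.dt
                self.history.append((self.t, self.value))

            def value_at(self, t):
                times, values = zip(*self.history)
                return float(np.interp(t, times, values))

        water = Component(value=1.0, rate=0.5, dt=0.1)      # fast component
        sediment = Component(value=0.0, rate=0.5, dt=0.3)   # slow component

        while water.t < 3.0:
            water.step(sediment.value_at(water.t))
            if water.t >= sediment.t + sediment.dt:
                sediment.step(water.value_at(sediment.t))

        print(f"water: {water.value:.3f}   sediment: {sediment.value:.3f}")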

  15. Creative computing with Landlab: an open-source toolkit for building, coupling, and exploring two-dimensional numerical models of Earth-surface dynamics

    NASA Astrophysics Data System (ADS)

    Hobley, Daniel E. J.; Adams, Jordan M.; Nudurupati, Sai Siddhartha; Hutton, Eric W. H.; Gasparini, Nicole M.; Istanbulluoglu, Erkan; Tucker, Gregory E.

    2017-01-01

    The ability to model surface processes and to couple them to both subsurface and atmospheric regimes has proven invaluable to research in the Earth and planetary sciences. However, creating a new model typically demands a very large investment of time, and modifying an existing model to address a new problem typically means the new work is constrained to its detriment by model adaptations for a different problem. Landlab is an open-source software framework explicitly designed to accelerate the development of new process models by providing (1) a set of tools and existing grid structures - including both regular and irregular grids - to make it faster and easier to develop new process components, or numerical implementations of physical processes; (2) a suite of stable, modular, and interoperable process components that can be combined to create an integrated model; and (3) a set of tools for data input, output, manipulation, and visualization. A set of example models built with these components is also provided. Landlab's structure makes it ideal not only for fully developed modelling applications but also for model prototyping and classroom use. Because of its modular nature, it can also act as a platform for model intercomparison and epistemic uncertainty and sensitivity analyses. Landlab exposes a standardized model interoperability interface, and is able to couple to third-party models and software. Landlab also offers tools to allow the creation of cellular automata, and allows native coupling of such models to more traditional continuous differential equation-based modules. We illustrate the principles of component coupling in Landlab using a model of landform evolution, a cellular ecohydrologic model, and a flood-wave routing model.
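
    A minimal Landlab session following the grid-plus-components pattern described above might look like the sketch below. It follows Landlab's basic documented usage, but component names and signatures should be checked against the current release before relying on them.

        # Minimal sketch of Landlab's grid-plus-components pattern (verify against
        # the current API): hillslope diffusion on a small raster grid.
        from landlab import RasterModelGrid
        from landlab.components import LinearDiffuser

        grid = RasterModelGrid((25, 40), xy_spacing=10.0)        # regular grid, 10 m spacing
        z = grid.add_zeros("topographic__elevation", at="node")  # standard field name
        z += grid.x_of_node * 0.01                               # gentle initial slope

        diffuser = LinearDiffuser(grid, linear_diffusivity=0.01)
        for _ in range(100):
            diffuser.run_one_step(1000.0)                        # advance in 1000-year steps

        print("mean elevation:", float(z.mean()))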

  16. ICESat (GLAS) Science Processing Software Document Series. Volume 1; Science Software Management Plan; 3.0

    NASA Technical Reports Server (NTRS)

    Hancock, David W., III

    1999-01-01

    This document provides the Software Management Plan for the GLAS Standard Data Software (SDS) supporting the GLAS instrument of the EOS ICESat Spacecraft. The SDS encompasses the ICESat Science Investigator-led Processing System (I-SIPS) Software and the Instrument Support Terminal (IST) Software. For the I-SIPS Software, the SDS will produce Level 0, Level 1, and Level 2 data products as well as the associated product quality assessments and descriptive information. For the IST Software, the SDS will accommodate the GLAS instrument support areas of engineering status, command, performance assessment, and instrument health status.

  17. PlanetPack: A radial-velocity time-series analysis tool facilitating exoplanets detection, characterization, and dynamical simulations

    NASA Astrophysics Data System (ADS)

    Baluev, Roman V.

    2013-08-01

    We present PlanetPack, a new software tool that we developed to facilitate and standardize the advanced analysis of radial velocity (RV) data for the goals of exoplanet detection, characterization, and basic dynamical N-body simulations. PlanetPack is a command-line interpreter that can run either in an interactive mode or in a batch mode of automatic script interpretation. Its major abilities include: (i) advanced RV curve fitting with the proper maximum-likelihood treatment of unknown RV jitter; (ii) user-friendly multi-Keplerian as well as Newtonian N-body RV fits; (iii) use of more efficient maximum-likelihood periodograms that involve the full multi-planet fitting (sometimes called “residual” or “recursive” periodograms); (iv) easily calculable parametric 2D likelihood function level contours, reflecting the asymptotic confidence regions; (v) user-friendly fitting under some useful functional constraints; (vi) basic tasks of short- and long-term planetary dynamical simulation using a fast Everhart-type integrator based on Gauss-Legendre spacings; (vii) fitting the data with red noise (auto-correlated errors); (viii) various analytical and numerical methods for determining statistical significance. It is planned that further functionality may be added to PlanetPack in the future. During the development of this software, considerable effort was made to improve the computational speed, especially for CPU-demanding tasks. PlanetPack was written in pure C++ (the 1998/2003 standard), and is expected to be compilable and usable on a wide range of platforms.

  18. Public domain optical character recognition

    NASA Astrophysics Data System (ADS)

    Garris, Michael D.; Blue, James L.; Candela, Gerald T.; Dimmick, Darrin L.; Geist, Jon C.; Grother, Patrick J.; Janet, Stanley A.; Wilson, Charles L.

    1995-03-01

    A public domain document processing system has been developed by the National Institute of Standards and Technology (NIST). The system is a standard reference form-based handprint recognition system for evaluating optical character recognition (OCR), and it is intended to provide a baseline of performance on an open application. The system's source code, training data, performance assessment tools, and type of forms processed are all publicly available. The system recognizes the handprint entered on handwriting sample forms like the ones distributed with NIST Special Database 1. From these forms, the system reads hand-printed numeric fields, upper and lowercase alphabetic fields, and unconstrained text paragraphs comprised of words from a limited-size dictionary. The modular design of the system makes it useful for component evaluation and comparison, training and testing set validation, and multiple system voting schemes. The system contains a number of significant contributions to OCR technology, including an optimized probabilistic neural network (PNN) classifier that operates a factor of 20 times faster than traditional software implementations of the algorithm. The source code for the recognition system is written in C and is organized into 11 libraries. In all, there are approximately 19,000 lines of code supporting more than 550 subroutines. Source code is provided for form registration, form removal, field isolation, field segmentation, character normalization, feature extraction, character classification, and dictionary-based postprocessing. The recognition system has been successfully compiled and tested on a host of UNIX workstations. This paper gives an overview of the recognition system's software architecture, including descriptions of the various system components along with timing and accuracy statistics.
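
    The classification rule behind the PNN mentioned above is simple enough to state in a few lines: each class receives a Parzen-window (Gaussian kernel) density estimate from its training prototypes, and the class with the highest estimated density wins. The NumPy sketch below shows that rule generically; it is not the optimized NIST implementation, and the smoothing parameter and data are invented.

        # Generic probabilistic neural network (PNN) decision rule, for illustration;
        # not the optimized NIST classifier. sigma is the Parzen-window width.
        import numpy as np

        def pnn_classify(x, train_X, train_y, sigma=0.5):
            scores = {}
            for label in np.unique(train_y):
                d2 = np.sum((train_X[train_y == label] - x) ** 2, axis=1)
                scores[label] = np.mean(np.exp(-d2 / (2.0 * sigma**2)))
            return max(scores, key=scores.get)   # most probable class wins

        X = np.array([[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [0.9, 1.2]])
        y = np.array([0, 0, 1, 1])
        print(pnn_classify(np.array([0.1, 0.0]), X, y))   # -> 0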

  19. Guidance and Control Software Project Data - Volume 4: Configuration Management and Quality Assurance Documents

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes configuration management and quality assurance documents from the GCS project. Volume 4 contains six appendices: A. Software Accomplishment Summary for the Guidance and Control Software Project; B. Software Configuration Index for the Guidance and Control Software Project; C. Configuration Management Records for the Guidance and Control Software Project; D. Software Quality Assurance Records for the Guidance and Control Software Project; E. Problem Report for the Pluto Implementation of the Guidance and Control Software Project; and F. Support Documentation Change Reports for the Guidance and Control Software Project.

  20. System Software Framework for System of Systems Avionics

    NASA Technical Reports Server (NTRS)

    Ferguson, Roscoe C.; Peterson, Benjamin L; Thompson, Hiram C.

    2005-01-01

    Project Constellation implements NASA's vision for space exploration to expand human presence in our solar system. The engineering focus of this project is developing a system-of-systems architecture. This architecture allows for the incremental development of the overall program. Systems can be built and connected in a "Lego style" manner to generate configurations supporting various mission objectives. The development of the avionics or control systems of such a massive project will result in concurrent engineering. Also, each system will have software and the need to communicate with other (possibly heterogeneous) systems. Fortunately, this design problem has already been solved during the creation and evolution of systems such as the Internet and the Department of Defense's successful effort to standardize distributed simulation (now IEEE 1516). The solution relies on the use of a standard layered software framework and a communication protocol. A standard framework and communication protocol are suggested for the development and maintenance of Project Constellation systems. The ARINC 653 standard is a great start for such a common software framework. This paper proposes a common system software framework that uses the Real Time Publish/Subscribe protocol for framework-to-framework communication to extend ARINC 653. It is highly recommended that such a framework be established before development, as this is important for the success of concurrent engineering. The framework provides an infrastructure for general system services and is designed for flexibility to support a spiral development effort.

  1. What makes computational open source software libraries successful?

    NASA Astrophysics Data System (ADS)

    Bangerth, Wolfgang; Heister, Timo

    2013-01-01

    Software is the backbone of scientific computing. Yet, while we regularly publish detailed accounts about the results of scientific software, and while there is a general sense of which numerical methods work well, our community is largely unaware of best practices in writing the large-scale, open source scientific software upon which our discipline rests. This is particularly apparent in the commonly held view that writing successful software packages is largely the result of simply ‘being a good programmer’ when in fact there are many other factors involved, for example the social skill of community building. In this paper, we consider what we have found to be the necessary ingredients for successful scientific software projects and, in particular, for software libraries upon which the vast majority of scientific codes are built today. In particular, we discuss the roles of code, documentation, communities, project management and licenses. We also briefly comment on the impact on academic careers of engaging in software projects.

  2. NASA's TReK Project: A Case Study in Using the Spiral Model of Software Development

    NASA Technical Reports Server (NTRS)

    Hendrix, T. Dean; Schneider, Michelle P.

    1998-01-01

    Software development projects face numerous challenges that threaten their successful completion. Whether it is not enough money, too little time, or a case of "requirements creep" that has turned into a full sprint, projects must meet these challenges or face possible disastrous consequences. A robust, yet flexible process model can provide a mechanism through which software development teams can meet these challenges head on and win. This article describes how the spiral model has been successfully tailored to a specific project and relates some notable results to date.

  3. IRACproc: IRAC Post-BCD Processing

    NASA Astrophysics Data System (ADS)

    Schuster, Mike; Marengo, Massimo; Patten, Brian

    2012-09-01

    IRACproc is a software suite that facilitates the co-addition of dithered or mapped Spitzer/IRAC data to make them ready for further analysis, with application to a wide variety of IRAC observing programs. The software runs within PDL, a numeric extension for Perl available from pdl.perl.org, and as stand-alone Perl scripts. In acting as a wrapper for the Spitzer Science Center's MOPEX software, IRACproc improves the rejection of cosmic rays and other transients in the co-added data. In addition, IRACproc performs (optional) point spread function (PSF) fitting, subtraction, and masking of saturated stars.

  4. Substructured multibody molecular dynamics.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grest, Gary Stephen; Stevens, Mark Jackson; Plimpton, Steven James

    2006-11-01

    We have enhanced our parallel molecular dynamics (MD) simulation software LAMMPS (Large-scale Atomic/Molecular Massively Parallel Simulator, lammps.sandia.gov) to include many new features for accelerated simulation including articulated rigid body dynamics via coupling to the Rensselaer Polytechnic Institute code POEMS (Parallelizable Open-source Efficient Multibody Software). We use new features of the LAMMPS software package to investigate rhodopsin photoisomerization, and water model surface tension and capillary waves at the vapor-liquid interface. Finally, we motivate the recipes of MD for practitioners and researchers in numerical analysis and computational mechanics.

  5. The new information technologies and psychiatry.

    PubMed

    Fauman, M A

    1989-09-01

    The author reviews the history and technology of the microcomputer and discusses the various classes of software that are presently available. Three major categories of software are described: numeric data processing, text processing, and communications. The application of this software to psychiatric education and practice is briefly discussed. A short curriculum on computers for psychiatric residents is outlined, and a brief bibliography of the recent relevant literature on computer applications to medicine and psychiatry is presented. Predictions are made about the future direction of computer technology and its application to psychiatry.

  6. Comparison of effects of different screw materials in the triangle fixation of femoral neck fractures.

    PubMed

    Gok, Kadir; Inal, Sermet; Gok, Arif; Gulbandilar, Eyyup

    2017-05-01

    In this study, the biomechanical behaviors of three different screw materials (stainless steel, titanium and cobalt-chromium) were analyzed for triangle fixation of a femoral neck fracture under axial loading, to determine which material performs best. A point cloud was obtained by scanning a human femoral model with a three-dimensional (3D) scanner, and this point cloud was converted to a 3D femoral model with the Geomagic Studio software. The femoral neck fracture was modeled in SolidWorks for the triangle configuration only, and computer-aided numerical analyses of the three materials were carried out with the Ansys Workbench finite element analysis (FEA) software. The loading, boundary conditions and material properties were prepared for FEA, and von Mises stress values at the upper and lower proximity of the femur and on the screws were calculated. At the end of the numerical analyses, the most advantageous screw material was found to be titanium, because it creates the minimum stress at the upper and lower proximity of the fracture line.

  7. Analyzing Dynamics of Cooperating Spacecraft

    NASA Technical Reports Server (NTRS)

    Hughes, Stephen P.; Folta, David C.; Conway, Darrel J.

    2004-01-01

    A software library has been developed to enable high-fidelity computational simulation of the dynamics of multiple spacecraft distributed over a region of outer space and acting with a common purpose. All of the modeling capabilities afforded by this software are available independently in other, separate software systems, but have not previously been brought together in a single system. A user can choose among several dynamical models, many high-fidelity environment models, and several numerical-integration schemes. The user can select whether to use models that assume weak coupling between spacecraft, or strong coupling in the case of feedback control or tethering of spacecraft to each other. For weak coupling, spacecraft orbits are propagated independently and are synchronized in time by controlling the step size of the integration. For strong coupling, the orbits are integrated simultaneously. Among the integration schemes that the user can choose are Runge-Kutta Verner, Prince-Dormand, Adams-Bashforth-Moulton, and Bulirsch-Stoer. Comparisons of performance are included for both the weak- and strong-coupling dynamical models for all of the numerical integrators.
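
    The weak-coupling strategy, independent propagation synchronized at shared epochs, can be sketched with standard tools. The example below uses SciPy's solve_ivp for two-body orbits of two spacecraft and evaluates both at common output times; it is a schematic stand-in for the library described, with invented initial states.

        # Hedged sketch of weak coupling: each spacecraft is propagated
        # independently under two-body gravity, synchronized at shared epochs.
        # Units are km and s; initial states are invented.
        import numpy as np
        from scipy.integrate import solve_ivp

        MU = 398600.4418   # Earth's gravitational parameter, km^3/s^2

        def two_body(t, s):
            r, v = s[:3], s[3:]
            return np.concatenate([v, -MU * r / np.linalg.norm(r) ** 3])

        epochs = np.linspace(0.0, 5400.0, 10)         # shared synchronization times
        leader = [7000.0, 0.0, 0.0, 0.0, 7.546, 0.0]
        follower = [7010.0, 0.0, 0.0, 0.0, 7.541, 0.0]

        sols = [solve_ivp(two_body, (0.0, 5400.0), s0, t_eval=epochs, rtol=1e-9)
                for s0 in (leader, follower)]
        sep = np.linalg.norm(sols[0].y[:3] - sols[1].y[:3], axis=0)
        print(np.round(sep, 2))                       # separation at each shared epoch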

  8. Numerical Arc Segmentation Algorithm for a Radio Conference-NASARC (version 4.0) technical manual

    NASA Technical Reports Server (NTRS)

    Whyte, Wayne A., Jr.; Heyward, Ann O.; Ponchak, Denise S.; Spence, Rodney L.; Zuzek, John E.

    1988-01-01

    The information contained in the NASARC (Version 4.0) Technical Manual and NASARC (Version 4.0) User's Manual relates to the Numerical Arc Segmentation Algorithm for a Radio Conference (NASARC) software development through November 1, 1988. The Technical Manual describes the NASARC concept and the algorithms used to implement the concept. The User's Manual provides information on computer system considerations, installation instructions, description of input files, and program operation instructions. Significant revisions were incorporated in the Version 4.0 software over prior versions. These revisions have further enhanced the modeling capabilities of the NASARC procedure and provide improved arrangements of predetermined arcs within the geostationary orbits. Array dimensions within the software were structured to fit within the currently available 12 megabyte memory capacity of the International Frequency Registration Board (IFRB) computer facility. A piecewise approach to predetermined arc generation in NASARC (Version 4.0) allows worldwide planning problem scenarios to be accommodated within computer run time and memory constraints with enhanced likelihood and ease of solution.

  9. Comparison of particle tracking algorithms in commercial CFD packages: sedimentation and diffusion.

    PubMed

    Robinson, Risa J; Snyder, Pam; Oldham, Michael J

    2007-05-01

    Computational fluid dynamics (CFD) modeling software has enabled microdosimetry patterns of inhaled toxins and toxicants to be predicted and visualized, and is being used in inhalation toxicology and risk assessment. These predicted microdosimetry patterns in airway structures are derived from predicted airflow patterns within these airways and the particle tracking algorithms used in CFD software packages. Although these commercial CFD codes have been tested for accuracy under various conditions, they have not been well tested for respiratory flows in general, nor has the accuracy of their particle tracking algorithms been well studied. In this study, three software packages, Fluent Discrete Phase Model (DPM), Fluent Fine Particle Model (FPM), and ANSYS CFX, were evaluated. Sedimentation and diffusion were each isolated in a straight-tube geometry and tested for accuracy. A range of flow rates corresponding to adult low activity (minute ventilation = 10 L/min) and to heavy exertion (minute ventilation = 60 L/min) was tested by varying the range of dimensionless diffusion and sedimentation parameters found using the Weibel symmetric 23-generation lung morphology. Numerical results for fully developed parabolic and uniform (slip) profiles were compared, respectively, to the Pich (1972) and Yu (1977) analytical sedimentation solutions. The Schum and Yeh (1980) equations for sedimentation were also compared. Numerical results for diffusional deposition were compared to the analytical solutions of Ingham (1975) for parabolic and uniform profiles. Significant differences were found among the various CFD software packages and between numerical and analytical solutions. Therefore, it is prudent to validate CFD predictions against analytical solutions in idealized geometry before tackling the complex geometries of the respiratory tract.
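
    To make the idea of isolating sedimentation concrete, the sketch below performs the simplest possible Lagrangian version of such a test: particles released across a 2-D channel (standing in for the tube) are advected by a parabolic profile frozen at their release height while settling at a constant velocity, and the deposited fraction is counted. It is a deliberately crude illustration with invented numbers, not a reproduction of any of the packages or analytical solutions compared in the study.

        # Crude sedimentation test in a straight channel: a particle deposits if it
        # settles to the floor before being advected out. The axial velocity is
        # frozen at the release height (a simplifying assumption).
        import numpy as np

        L, H = 0.1, 0.004          # channel length and height, m (hypothetical)
        u_mean, v_s = 0.5, 1e-3    # mean axial speed and settling velocity, m/s
        n = 10_000

        y0 = np.random.default_rng(0).uniform(0.0, H, n)   # release heights
        u = 6.0 * u_mean * (y0 / H) * (1.0 - y0 / H)       # parabolic axial profile
        t_exit = L / u                                     # residence time in the tube
        deposited = v_s * t_exit >= y0                     # reaches the floor first?
        print(f"sedimentation deposition fraction: {deposited.mean():.3f}")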

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amerio, S.; Behari, S.; Boyd, J.

    The Fermilab Tevatron collider's data-taking run ended in September 2011, yielding a dataset with rich scientific potential. The CDF and D0 experiments each have approximately 9 PB of collider and simulated data stored on tape. A large computing infrastructure consisting of tape storage, disk cache, and distributed grid computing for physics analysis with the Tevatron data is present at Fermilab. The Fermilab Run II data preservation project intends to keep this analysis capability sustained through the year 2020 and beyond. To achieve this goal, we have implemented a system that utilizes virtualization, automated validation, and migration to new standards in both software and data storage technology and leverages resources available from currently-running experiments at Fermilab. Lastly, these efforts have also provided useful lessons in ensuring long-term data access for numerous experiments, and enable high-quality scientific output for years to come.

  11. Multiphysics of bone remodeling: A 2D mesoscale activation simulation.

    PubMed

    Spingarn, C; Wagner, D; Rémond, Y; George, D

    2017-01-01

    In this work, we present an evolving trabecular model for bone remodeling based on a boundary-detection algorithm that accounts for both biology and applied mechanical forces, which are known to be an important factor in bone evolution. A finite element (FE) numerical model using the Abaqus/Standard® software was used with a UMAT subroutine to solve the governing coupled mechanical-biological non-linear differential equations of the bone evolution model. The simulations present cell activation on a simplified trabecular configuration with a trabecular thickness of 200 µm. For this activation process, the results confirm that the trabeculae are mainly oriented along the directions of the principal mechanical stresses and the principal applied mechanical loads. The trabecular surface activation is clearly identified and can help in understanding bone cell activation in more complex geometries and load conditions.
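
    As a rough indication of how such coupled remodeling equations are marched in time, the sketch below uses the classic strain-energy-based rule of Weinans and Huiskes as a generic stand-in for the paper's mechano-biological model (the constants, bounds, and loading are assumptions):

      import numpy as np

      # Minimal sketch of a strain-energy-driven remodeling rule (the classic
      # Weinans/Huiskes form, used here as a generic stand-in for the paper's
      # coupled model; B, k, and the loading are assumptions).
      B, k = 1.0, 0.004               # rate constant, reference stimulus (J/g)
      rho_min, rho_max = 0.01, 1.74   # apparent density bounds, g/cm^3

      def remodel(rho, strain_energy_density, dt):
          """One explicit Euler step of d(rho)/dt = B * (U/rho - k)."""
          stimulus = strain_energy_density / rho - k
          return np.clip(rho + dt * B * stimulus, rho_min, rho_max)

      rho = np.full(100, 0.8)                                  # element densities
      U = np.random.default_rng(1).uniform(0.001, 0.01, 100)   # per-element SED
      for _ in range(50):     # in a real loop, U is re-solved by FE each step
          rho = remodel(rho, U, dt=1.0)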

  12. Realization of Vilnius UPXYZVS photometric system for AltaU42 CCD camera at the MAO NAS of Ukraine

    NASA Astrophysics Data System (ADS)

    Vid'Machenko, A. P.; Andruk, V. M.; Samoylov, V. S.; Delets, O. S.; Nevodovsky, P. V.; Ivashchenko, Yu. M.; Kovalchuk, G. U.

    2005-06-01

    The paper describes the two-inch glass filters of the Vilnius UPXYZVS photometric system, made at the Main Astronomical Observatory of the NAS of Ukraine for the AltaU42 CCD camera with a format of 2048×2048 pixels. Response curves of the instrumental system are shown. Estimates of the limiting stellar magnitude for each filter band, in comparison with the visual V band, are obtained. New software for automating the processing of CCD frames has been developed in the LINUX/MIDAS/ROMAFOT program shell. It is planned to carry out observations with the aim of creating a catalogue of primary UPXYZVS CCD standards in selected fields of the sky for some radio sources, globular and open clusters, etc. Numerical estimates of astrometric and photometric accuracy are obtained.
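
    For context, instrumental magnitudes and their photometric errors follow from standard CCD relations; the sketch below illustrates them with assumed values for the zero point, gain, read noise, and counts (none of these are taken from the system described above):

      import numpy as np

      # Illustrative instrumental-magnitude estimate from CCD aperture
      # photometry (generic formulae; zero point, gain, and counts are
      # assumptions, not values from the Vilnius system).
      def instrumental_magnitude(counts, exposure_s, zero_point=25.0):
          """m = ZP - 2.5 log10(counts / exposure)."""
          return zero_point - 2.5 * np.log10(counts / exposure_s)

      def magnitude_error(counts, sky_counts, npix, gain=1.5, read_noise=10.0):
          """1-sigma error from Poisson + sky + read noise (CCD equation)."""
          signal_e = counts * gain
          noise_e = np.sqrt(signal_e + npix * (sky_counts * gain + read_noise**2))
          return 1.0857 * noise_e / signal_e   # 2.5/ln(10) = 1.0857

      print(instrumental_magnitude(50_000, 60.0))        # ~17.7 mag
      print(magnitude_error(50_000, 100.0, npix=50))     # ~0.004 mag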

  13. Data preservation at the Fermilab Tevatron

    NASA Astrophysics Data System (ADS)

    Amerio, S.; Behari, S.; Boyd, J.; Brochmann, M.; Culbertson, R.; Diesburg, M.; Freeman, J.; Garren, L.; Greenlee, H.; Herner, K.; Illingworth, R.; Jayatilaka, B.; Jonckheere, A.; Li, Q.; Naymola, S.; Oleynik, G.; Sakumoto, W.; Varnes, E.; Vellidis, C.; Watts, G.; White, S.

    2017-04-01

    The Fermilab Tevatron collider's data-taking run ended in September 2011, yielding a dataset with rich scientific potential. The CDF and D0 experiments each have approximately 9 PB of collider and simulated data stored on tape. A large computing infrastructure consisting of tape storage, disk cache, and distributed grid computing for physics analysis with the Tevatron data is present at Fermilab. The Fermilab Run II data preservation project intends to keep this analysis capability sustained through the year 2020 and beyond. To achieve this goal, we have implemented a system that utilizes virtualization, automated validation, and migration to new standards in both software and data storage technology and leverages resources available from currently-running experiments at Fermilab. These efforts have also provided useful lessons in ensuring long-term data access for numerous experiments, and enable high-quality scientific output for years to come.

  14. Combined Uncertainty and A-Posteriori Error Bound Estimates for General CFD Calculations: Theory and Software Implementation

    NASA Technical Reports Server (NTRS)

    Barth, Timothy J.

    2014-01-01

    This workshop presentation discusses the design and implementation of numerical methods for the quantification of statistical uncertainty, including a-posteriori error bounds, for output quantities computed using CFD methods. Hydrodynamic realizations often contain numerical error arising from finite-dimensional approximation (e.g., numerical methods using grids, basis functions, or particles) and statistical uncertainty arising from incomplete information and/or statistical characterization of model parameters and random fields. The first task at hand is to derive formal error bounds for statistics given realizations containing finite-dimensional numerical error [1]. The error in computed output statistics contains contributions from both realization error and the error resulting from the calculation of statistics integrals using a numerical method. A second task is to devise computable a-posteriori error bounds by numerically approximating all terms arising in the error bound estimates. For the same reason that CFD calculations including error bounds but omitting uncertainty modeling are of limited value, CFD calculations including uncertainty modeling but omitting error bounds are also of limited value. To gain maximum value from CFD calculations, a general software package for uncertainty quantification with quantified error bounds has been developed at NASA. The package provides implementations for a suite of numerical methods used in uncertainty quantification: dense tensorization basis methods [3] and a subscale recovery variant [1] for non-smooth data; sparse tensorization methods [2] utilizing node-nested hierarchies; and sampling methods [4] for high-dimensional random-variable spaces.
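
    The basic decomposition, a sampling (statistical) error bar plus a deterministic numerical-error bound per realization, can be illustrated on a toy problem. In the sketch below, the "CFD realization" is a trapezoid-rule integral of a random field and the a-posteriori bound is that rule's O(h^2) estimate; both are stand-ins, not the NASA package:

      import numpy as np

      rng = np.random.default_rng(0)

      def realization(xi, n_cells=64):
          """Toy 'CFD' output: trapezoid-rule integral of a random field."""
          x = np.linspace(0.0, 1.0, n_cells + 1)
          return np.trapz(np.sin(np.pi * x * (1.0 + 0.1 * xi)), x)

      def realization_error_bound(n_cells=64):
          """A-posteriori-style bound: O(h^2) for the trapezoid rule
          (constant absorbed; an assumption for illustration)."""
          return 1.0 / n_cells**2

      n = 2000
      samples = np.array([realization(rng.standard_normal()) for _ in range(n)])
      mean = samples.mean()
      stat_err = 1.96 * samples.std(ddof=1) / np.sqrt(n)   # 95% CLT half-width
      num_err = realization_error_bound()                  # deterministic part
      print(f"E[q] = {mean:.4f} +/- {stat_err:.4f} (sampling) "
            f"+/- {num_err:.4f} (discretization bound)")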

  15. 75 FR 70128 - 2011 Changes for Domestic Mailing Services

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-17

    ...LOT, RDI, and Five-Digit ZIP. The Postal Service certifies software meeting its standards until the... Delivery Point Validation (DPV) service in conjunction with CASS-Certified address matching software... interface between address-matching software and the LACSLink database service. 1.21.2 Interface...

  16. Space Communication and Navigation SDR Testbed, Overview and Opportunity for Experiments

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.

    2013-01-01

    NASA has developed an experimental flight payload (referred to as the Space Communication and Navigation (SCAN) Test Bed) to investigate software defined radio (SDR) communications, networking, and navigation technologies operationally in the space environment. The payload consists of three software defined radios, each compliant with NASA's Space Telecommunications Radio System (STRS) Architecture, a common software interface description standard for software defined radios. The software defined radios are new technology developments by NASA and industry partners; the payload was launched in 2012. The payload is externally mounted to the International Space Station truss to conduct experiments representative of future mission capability. Experiment operations include in-flight reconfiguration of the SDR waveform functions and payload networking software. The flight system will communicate with NASA's orbiting satellite relay network, the Tracking and Data Relay Satellite System, at both S-band and Ka-band, and with any compatible Earth-based S-band ground station. The system is available for experiments by industry, academia, and other government agencies to participate in SDR technology assessments and standards advancements.

  17. Stable Kalman filters for processing clock measurement data

    NASA Technical Reports Server (NTRS)

    Clements, P. A.; Gibbs, B. P.; Vandergraft, J. S.

    1989-01-01

    Kalman filters have been used for some time to process clock measurement data. Due to instabilities in the standard Kalman filter algorithms, the results have been unreliable and difficult to obtain. During the past several years, stable forms of the Kalman filter have been developed, implemented, and used in many diverse applications. These algorithms, while algebraically equivalent to the standard Kalman filter, exhibit excellent numerical properties. Two of these stable algorithms, the Upper triangular-Diagonal (UD) filter and the Square Root Information Filter (SRIF), have been implemented to replace the standard Kalman filter used to process data from the Deep Space Network (DSN) hydrogen maser clocks. The data are time offsets between the clocks in the DSN, the timescale at the National Institute of Standards and Technology (NIST), and two geographically intermediate clocks. The measurements are made by using the GPS navigation satellites in mutual view between clocks. The filter programs allow the user to easily modify the clock models, the GPS satellite dependent biases, and the random noise levels in order to compare different modeling assumptions. The results of this study show the usefulness of such software for processing clock data. The UD filter is indeed a stable, efficient, and flexible method for obtaining optimal estimates of clock offsets, offset rates, and drift rates. A brief overview of the UD filter is also given.
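
    One of the stable forms mentioned above, the UD filter, carries the covariance as P = U D U^T and never forms P explicitly. The following sketch implements Bierman's scalar measurement update for a two-state clock model (the model, noise values, and measurement are illustrative assumptions, not the DSN implementation):

      import numpy as np

      # Sketch of Bierman's UD-factorized measurement update, a stable-filter
      # building block; covariance is carried as P = U D U^T with U unit
      # upper triangular and D the diagonal stored as a vector d.
      def ud_measurement_update(x, U, d, H, R, z):
          """Scalar measurement z = H x + v, v ~ N(0, R); updates x, U, d."""
          n = len(x)
          f = U.T @ H          # f = U^T H^T for a row-vector H
          v = d * f            # v_j = d_j f_j
          alpha = R + f[0] * v[0]
          d[0] *= R / alpha
          b = np.zeros(n)
          b[0] = v[0]
          for j in range(1, n):
              beta = alpha
              alpha += f[j] * v[j]
              lam = -f[j] / beta
              d[j] *= beta / alpha
              for i in range(j):
                  u_old = U[i, j]
                  U[i, j] = u_old + b[i] * lam
                  b[i] += u_old * v[j]
              b[j] = v[j]
          gain = b / alpha                 # Kalman gain
          x = x + gain * (z - H @ x)
          return x, U, d

      # Example: two-state clock model (offset, rate), one offset measurement.
      x = np.array([0.0, 0.0])
      U = np.eye(2); d = np.array([1.0, 1.0])
      x, U, d = ud_measurement_update(x, U, d, H=np.array([1.0, 0.0]),
                                      R=1e-4, z=3e-3)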

  18. Definition of the Flexible Image Transport System (FITS), version 3.0

    NASA Astrophysics Data System (ADS)

    Pence, W. D.; Chiappetti, L.; Page, C. G.; Shaw, R. A.; Stobie, E.

    2010-12-01

    The Flexible Image Transport System (FITS) has been used by astronomers for over 30 years as a data interchange and archiving format; FITS files are now handled by a wide range of astronomical software packages. Since the FITS format definition document (the “standard”) was last printed in this journal in 2001, several new features have been developed and standardized, notably support for 64-bit integers in images and tables, variable-length arrays in tables, and new world coordinate system conventions which provide a mapping from an element in a data array to a physical coordinate on the sky or within a spectrum. The FITS Working Group of the International Astronomical Union has therefore produced this new version 3.0 of the FITS standard, which is provided here in its entirety. In addition to describing the new features in FITS, numerous editorial changes were made to the previous version to clarify and reorganize many of the sections. Also included are some appendices which are not formally part of the standard. The FITS standard is likely to undergo further evolution, in which case the latest version may be found on the FITS Support Office Web site at http://fits.gsfc.nasa.gov/, which also provides many links to FITS-related resources.
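
    Because the format is standardized, any compliant library can read such files; for example, with the astropy package in Python (the file name is a placeholder):

      from astropy.io import fits

      # Open a FITS file and inspect its structure.
      with fits.open("example.fits") as hdul:
          hdul.info()                   # list HDUs: primary image, extensions
          header = hdul[0].header       # keyword/value pairs, incl. WCS cards
          data = hdul[0].data           # image as a NumPy array (or None)
          print(header.get("BITPIX"))   # e.g. 64 for 64-bit integer images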

  19. Parallelization of Rocket Engine System Software (Press)

    NASA Technical Reports Server (NTRS)

    Cezzar, Ruknet

    1996-01-01

    The main goal is to assess parallelization requirements for the Rocket Engine Numeric Simulator (RENS) project, which, aside from gathering information on liquid-propelled rocket engines and setting forth requirements, involves a large FORTRAN-based package at NASA Lewis Research Center and TDK software developed by SUBR/UWF. The ultimate aim is to develop, test, integrate, and suitably deploy a family of software packages on various aspects and facets of rocket engines using liquid propellants. At present, all project efforts by the funding agency, NASA Lewis Research Center, and the HBCU participants are disseminated over the Internet using World Wide Web home pages. Considering the obvious expense of actual field trials, the benefits of software simulators are potentially enormous. When realized, these benefits will be analogous to those provided by numerous CAD/CAM packages and flight-training simulators. According to the overall task assignments, Hampton University's role is to collect all available software, place it in a common format, assess and evaluate it, define interfaces, and provide integration. Most importantly, HU's mission is to ensure that real-time performance is achieved. This involves source code translation, porting, and distribution. The porting will be done in two phases: first, all software will be placed on the Cray X-MP platform using FORTRAN; then, after testing and evaluation on the Cray X-MP, the code will be translated to C++ and ported to the parallel nCUBE platform. At present, we are evaluating another option: distributed processing over local area networks using Sun NFS, Ethernet, and TCP/IP. Considering the heterogeneous nature of the present software (e.g., it first started as an expert system using LISP machines and now involves FORTRAN code), the effort is expected to be quite challenging.

  20. A free software for pore-scale modelling: solving Stokes equation for velocity fields and permeability values in 3D pore geometries

    NASA Astrophysics Data System (ADS)

    Gerke, Kirill; Vasilyev, Roman; Khirevich, Siarhei; Karsanina, Marina; Collins, Daniel; Korost, Dmitry; Mallants, Dirk

    2015-04-01

    In this contribution we introduce novel free software that solves the Stokes equation to obtain velocity fields for low-Reynolds-number flows within externally generated 3D pore geometries. Provided with velocity fields, one can calculate permeability for known pressure-gradient boundary conditions via Darcy's equation. Finite-difference schemes of 2nd and 4th order of accuracy are used together with an artificial compressibility method to iteratively converge to a steady-state solution of the Stokes equation. This numerical approach is much faster and less computationally demanding than the majority of open-source or commercial software packages employing other algorithms (finite elements/volumes, lattice Boltzmann, etc.). The software consists of two parts: (1) a pre- and post-processing graphical interface, and (2) a solver. The latter is efficiently parallelized to use any number of available cores (the speedup on 16 threads was up to 10-12, depending on hardware). Due to parallelization and memory optimization, our software can be used to obtain solutions for 300×300×300-voxel geometries on modern desktop PCs. The software was successfully verified by testing it against lattice Boltzmann simulations and analytical solutions. To illustrate the software's applicability to numerous problems in Earth sciences, a number of case studies have been developed: (1) identifying the representative elementary volume for permeability determination within a sandstone sample, (2) deriving permeability/hydraulic conductivity values for rock and soil samples and comparing those with experimentally obtained values, and (3) revealing the influence of the amount of fine-textured material such as clay on the filtration properties of sandy soil. This work was partially supported by RSF grant 14-17-00658 (pore-scale modelling) and RFBR grants 13-04-00409-a and 13-05-01176-a.
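
    The artificial compressibility idea, adding pseudo-time derivatives so that momentum and continuity can be marched together until the divergence vanishes, can be sketched compactly. The fragment below solves Stokes flow in a 2D lid-driven cavity on a collocated grid (grid size, viscosity, and the compressibility parameter beta are illustrative choices; this is not the published solver):

      import numpy as np

      # Minimal 2D artificial-compressibility iteration for Stokes flow in a
      # lid-driven cavity. Rows index y, columns index x.
      n = 41                      # grid points per side, unit square
      h = 1.0 / (n - 1)
      nu = 1.0                    # kinematic viscosity
      beta = 1.0                  # artificial compressibility parameter (c^2)
      dt = 0.2 * h * h / nu       # pseudo-time step (explicit stability limit)

      u = np.zeros((n, n)); v = np.zeros((n, n)); p = np.zeros((n, n))
      u[-1, :] = 1.0              # moving lid at the top wall

      def lap(f):                 # 5-point Laplacian on interior nodes
          return (f[2:, 1:-1] + f[:-2, 1:-1] + f[1:-1, 2:] + f[1:-1, :-2]
                  - 4.0 * f[1:-1, 1:-1]) / (h * h)

      for it in range(20000):     # march in pseudo-time until divergence-free
          dpdx = (p[1:-1, 2:] - p[1:-1, :-2]) / (2 * h)
          dpdy = (p[2:, 1:-1] - p[:-2, 1:-1]) / (2 * h)
          u[1:-1, 1:-1] += dt * (nu * lap(u) - dpdx)
          v[1:-1, 1:-1] += dt * (nu * lap(v) - dpdy)
          dudx = (u[1:-1, 2:] - u[1:-1, :-2]) / (2 * h)
          dvdy = (v[2:, 1:-1] - v[:-2, 1:-1]) / (2 * h)
          div = dudx + dvdy
          p[1:-1, 1:-1] -= dt * beta * div            # dp/dtau = -beta div(u)
          p[1:-1, 0] = p[1:-1, 1]; p[1:-1, -1] = p[1:-1, -2]  # Neumann walls
          p[0, 1:-1] = p[1, 1:-1]; p[-1, 1:-1] = p[-2, 1:-1]
          if np.abs(div).max() < 1e-6:
              break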

  1. Using the ACR/NEMA standard with TCP/IP and Ethernet

    NASA Astrophysics Data System (ADS)

    Chimiak, William J.; Williams, Rodney C.

    1991-07-01

    There is a need for a consolidated picture archival and communications system (PACS) in hospitals. At the Bowman Gray School of Medicine of Wake Forest University (BGSM), the authors are enhancing the ACR/NEMA Version 2 protocol using UNIX sockets and TCP/IP to greatly improve connectivity. Initially, nuclear medicine studies from gamma cameras are to be sent to the PACS. The ACR/NEMA Version 2 protocol provides the functionality of the upper three layers of the open system interconnection (OSI) model in this implementation. The images, imaging equipment information, and patient information are then sent in ACR/NEMA format to a software socket. From there they are handed to the TCP/IP protocol, which provides the transport and network service. TCP/IP, in turn, uses the services of IEEE 802.3 (Ethernet) to complete the connectivity. The advantage of this implementation is threefold: (1) only one I/O port is consumed by numerous nuclear medicine cameras, instead of a physical port for each camera; (2) standard protocols are used, which maximizes interoperability with ACR/NEMA-compliant PACSs; (3) the use of sockets allows a migration path to the transport and networking services of OSI's TP4 and connectionless network service, as well as to the high-performance protocol being considered by the American National Standards Institute (ANSI) and the International Organization for Standardization (ISO) -- the Xpress Transfer Protocol (XTP). The use of sockets also gives access to ANSI's Fiber Distributed Data Interface (FDDI) as well as other high-speed network standards.
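
    The hand-off to a software socket can be illustrated with a minimal framing scheme: because TCP delivers a byte stream, the sender prefixes each message with its length so the receiver can recover message boundaries (host, port, and the 4-byte prefix are assumptions; this is not the BGSM implementation):

      import socket
      import struct

      # Illustrative length-prefixed message exchange over a TCP socket.
      def send_message(host: str, port: int, payload: bytes) -> None:
          with socket.create_connection((host, port)) as sock:
              sock.sendall(struct.pack("!I", len(payload)) + payload)

      def recv_exactly(conn: socket.socket, n: int) -> bytes:
          buf = b""
          while len(buf) < n:
              chunk = conn.recv(n - len(buf))
              if not chunk:
                  raise ConnectionError("peer closed connection mid-message")
              buf += chunk
          return buf

      def recv_message(conn: socket.socket) -> bytes:
          (length,) = struct.unpack("!I", recv_exactly(conn, 4))
          return recv_exactly(conn, length)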

  2. Guidance and Control Software Project Data - Volume 3: Verification Documents

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the verification documents from the GCS project. Volume 3 contains four appendices: A. Software Verification Cases and Procedures for the Guidance and Control Software Project; B. Software Verification Results for the Pluto Implementation of the Guidance and Control Software; C. Review Records for the Pluto Implementation of the Guidance and Control Software; and D. Test Results Logs for the Pluto Implementation of the Guidance and Control Software.

  3. Numerical investigation of the effect of number of blades on centrifugal pump performance

    NASA Astrophysics Data System (ADS)

    Kocaaslan, O.; Ozgoren, M.; Babayigit, O.; Aksoy, M. H.

    2017-07-01

    In this study, the flow structure in a centrifugal pump was numerically investigated for impeller blade numbers between 5 and 9. The pump used in the study is a single-stage, horizontal-shaft centrifugal pump. The original pump impeller was designed with 7 blades for the parameters flow rate Q = 100 m³/h, head Hm = 180 kPa, and rotational speed n = 1480 rpm. First, models of impellers with blade numbers between 5 and 9 and the volute section of the centrifugal pump were drawn separately using Solidworks software. Then, grid structures were generated on the flow volume of the pump. Finally, the flow analyses were performed and the flow characteristics under different operating conditions were determined numerically. In the numerical analyses, the k-ε turbulence model and standard wall functions were used to solve the turbulent flow. Balance holes and surface roughness, which adversely affect the hydraulic efficiency of pumps, were also considered. The results show that the hydraulic torque and head increase with the number of impeller blades. For the impellers with 5 and 9 blades at the design flow rate of 100 m³/h (Q/Qd = 1), the hydraulic torque and head were found to be 49/59.1 N·m and 153.1/184.4 kPa, respectively. Subsequently, the hydraulic efficiency of each pump was calculated. The highest hydraulic efficiency at the design flow rate was calculated as 54.16%, for the impeller with 8 blades.
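
    The efficiency calculation is the ratio of hydraulic power to shaft power. The sketch below applies the standard formula to the 5-blade figures quoted above, treating head as a pressure rise (the result lands in the same mid-50s-percent range as the reported efficiencies; exact agreement is not expected from rounded inputs):

      import math

      # Hydraulic efficiency from flow rate, pressure rise, torque, and speed
      # (standard formula; inputs taken from the 5-blade figures above).
      Q = 100.0 / 3600.0      # flow rate, m^3/s (100 m^3/h)
      dp = 153.1e3            # pressure rise, Pa (5-blade impeller, Q/Qd = 1)
      torque = 49.0           # hydraulic torque, N*m
      n_rpm = 1480.0

      omega = 2.0 * math.pi * n_rpm / 60.0   # shaft speed, rad/s
      p_hydraulic = Q * dp                   # useful (fluid) power, W
      p_shaft = torque * omega               # shaft power, W
      eta = p_hydraulic / p_shaft
      print(f"hydraulic efficiency ~ {eta:.1%}")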

  4. SEL's Software Process-Improvement Program

    NASA Technical Reports Server (NTRS)

    Basili, Victor; Zelkowitz, Marvin; McGarry, Frank; Page, Jerry; Waligora, Sharon; Pajerski, Rose

    1995-01-01

    The goals and operations of the Software Engineering Laboratory (SEL) are reviewed. For nearly 20 years the SEL has worked to understand, assess, and improve software and the development process within the production environment of the Flight Dynamics Division (FDD) of NASA's Goddard Space Flight Center. The SEL was established in 1976 with the goals of reducing: (1) the defect rate of delivered software, (2) the cost of software to support flight projects, and (3) the average time to produce mission-support software. Results from studies of over 125 FDD projects have guided the standards, management practices, technologies, and training within the division, and have yielded a 75 percent reduction in defects, a 50 percent reduction in cost, and a 25 percent reduction in development time. Over time the goals of the SEL have been clarified and are now stated as: (1) understand baseline processes and product characteristics, (2) assess improvements that have been incorporated into the development projects, and (3) package and infuse improvements into the standard SEL process. The SEL improvement goal is to demonstrate continual improvement of the software process by carrying out analysis, measurement, and feedback to projects within the FDD environment. The SEL supports understanding by studying several aspects of the process, including effort distribution and error-detection rates. The SEL then assesses and refines the processes. Once the assessment and refinement of a process are completed, the SEL packages the process by capturing it in standards, tools, and training.

  5. 25 CFR 547.8 - What are the minimum technical software standards applicable to Class II gaming systems?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... OF CLASS II GAMES § 547.8 What are the minimum technical software standards applicable to Class II... of Class II games. (a) Player interface displays. (1) If not otherwise provided to the player, the player interface shall display the following: (i) The purchase or wager amount; (ii) Game results; and...

  6. 25 CFR 547.12 - What are the minimum technical standards for downloading on a Class II gaming system?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... OF CLASS II GAMES § 547.12 What are the minimum technical standards for downloading on a Class II... software, files, data, and prize schedules. (2) Downloads of software, games, prize schedules, or other... performed in a manner that will not affect game play. (5) Downloads shall not affect the integrity of...

  7. 25 CFR 547.12 - What are the minimum technical standards for downloading on a Class II gaming system?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... OF CLASS II GAMES § 547.12 What are the minimum technical standards for downloading on a Class II... software, files, data, and prize schedules. (2) Downloads of software, games, prize schedules, or other... performed in a manner that will not affect game play. (5) Downloads shall not affect the integrity of...

  8. 25 CFR 547.8 - What are the minimum technical software standards applicable to Class II gaming systems?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... OF CLASS II GAMES § 547.8 What are the minimum technical software standards applicable to Class II... of Class II games. (a) Player interface displays. (1) If not otherwise provided to the player, the player interface shall display the following: (i) The purchase or wager amount; (ii) Game results; and...

  9. 25 CFR 547.8 - What are the minimum technical software standards applicable to Class II gaming systems?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... OF CLASS II GAMES § 547.8 What are the minimum technical software standards applicable to Class II... of Class II games. (a) Player interface displays. (1) If not otherwise provided to the player, the player interface shall display the following: (i) The purchase or wager amount; (ii) Game results; and...

  10. 25 CFR 547.12 - What are the minimum technical standards for downloading on a Class II gaming system?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... OF CLASS II GAMES § 547.12 What are the minimum technical standards for downloading on a Class II... software, files, data, and prize schedules. (2) Downloads of software, games, prize schedules, or other... performed in a manner that will not affect game play. (5) Downloads shall not affect the integrity of...

  11. [Development of integrated support software for clinical nutrition].

    PubMed

    Siquier Homar, Pedro; Pinteño Blanco, Manel; Calleja Hernández, Miguel Ángel; Fernández Cortés, Francisco; Martínez Sotelo, Jesús

    2015-09-01

    To develop an integrated computer software application for specialized nutritional support, integrated into the electronic clinical record, which automatically detects, at an early stage, patients who are undernourished or at risk of developing undernourishment, determining points of opportunity for improvement and evaluating the results. The quality standards published by the Nutrition Work Group of the Spanish Society of Hospital Pharmacy (SEFH) and the recommendations by the Pharmacy Group of the Spanish Society of Parenteral and Enteral Nutrition (SENPE) have been taken into account. According to these quality standards, nutritional support has to include the following healthcare stages or sub-processes: nutritional screening, nutritional assessment, plan for nutritional care, prescription, preparation, and administration. This software makes it possible to conduct, in an automated way, a specific nutritional assessment for patients at nutritional risk, implementing a nutritional treatment plan if necessary, conducting follow-up and ensuring traceability of outcomes derived from the implementation of improvement actions, and quantifying to what extent our practice approaches the established standard. This software makes it possible to standardize specialized nutritional support from a multidisciplinary point of view, introducing the concept of quality control per process, and placing the patient as the main customer. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.
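
    As a flavor of what the automated screening sub-process might look like, the sketch below flags patients by simple rules (the fields and thresholds are hypothetical; production systems use validated scores such as NRS-2002):

      from dataclasses import dataclass

      # Illustrative automated nutritional-screening rule (hypothetical
      # fields and thresholds, not the published standards).
      @dataclass
      class Patient:
          bmi: float                 # kg/m^2
          weight_loss_3m_pct: float  # % of body weight lost in 3 months
          reduced_intake: bool       # reduced dietary intake in the last week

      def at_nutritional_risk(p: Patient) -> bool:
          return (p.bmi < 18.5
                  or p.weight_loss_3m_pct >= 5.0
                  or p.reduced_intake)

      # Patients flagged here would proceed to the full nutritional
      # assessment and, if confirmed, to a care plan, prescription,
      # preparation, and administration, each stage recorded for traceability.
      flagged = [p for p in [Patient(17.9, 2.0, False), Patient(24.0, 6.5, True)]
                 if at_nutritional_risk(p)]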

  12. The Human Physiome: how standards, software and innovative service infrastructures are providing the building blocks to make it achievable

    PubMed Central

    2016-01-01

    Reconstructing and understanding the Human Physiome virtually is a complex mathematical problem, and a highly demanding computational challenge. Mathematical models spanning from the molecular level through to whole populations of individuals must be integrated, then personalized. This requires interoperability with multiple disparate and geographically separated data sources, and myriad computational software tools. Extracting and producing knowledge from such sources, even when the databases and software are readily available, is a challenging task. Despite the difficulties, researchers must frequently perform these tasks so that available knowledge can be continually integrated into the common framework required to realize the Human Physiome. Software and infrastructures that support the communities that generate these, together with their underlying standards to format, describe and interlink the corresponding data and computer models, are pivotal to the Human Physiome being realized. They provide the foundations for integrating, exchanging and re-using data and models efficiently, and correctly, while also supporting the dissemination of growing knowledge in these forms. In this paper, we explore the standards, software tooling, repositories and infrastructures that support this work, and detail what makes them vital to realizing the Human Physiome. PMID:27051515

  13. The Human Physiome: how standards, software and innovative service infrastructures are providing the building blocks to make it achievable.

    PubMed

    Nickerson, David; Atalag, Koray; de Bono, Bernard; Geiger, Jörg; Goble, Carole; Hollmann, Susanne; Lonien, Joachim; Müller, Wolfgang; Regierer, Babette; Stanford, Natalie J; Golebiewski, Martin; Hunter, Peter

    2016-04-06

    Reconstructing and understanding the Human Physiome virtually is a complex mathematical problem, and a highly demanding computational challenge. Mathematical models spanning from the molecular level through to whole populations of individuals must be integrated, then personalized. This requires interoperability with multiple disparate and geographically separated data sources, and myriad computational software tools. Extracting and producing knowledge from such sources, even when the databases and software are readily available, is a challenging task. Despite the difficulties, researchers must frequently perform these tasks so that available knowledge can be continually integrated into the common framework required to realize the Human Physiome. Software and infrastructures that support the communities that generate these, together with their underlying standards to format, describe and interlink the corresponding data and computer models, are pivotal to the Human Physiome being realized. They provide the foundations for integrating, exchanging and re-using data and models efficiently, and correctly, while also supporting the dissemination of growing knowledge in these forms. In this paper, we explore the standards, software tooling, repositories and infrastructures that support this work, and detail what makes them vital to realizing the Human Physiome.

  14. Fatigue and fracture assessment of cracks in steel elements using acoustic emission

    NASA Astrophysics Data System (ADS)

    Nemati, Navid; Metrovich, Brian; Nanni, Antonio

    2011-04-01

    Single edge notches provide a very well defined load and fatigue crack size and shape environment for estimation of the stress intensity factor K, which is not found in welded elements. ASTM SE(T) specimens do not appear to provide ideal boundary conditions for proper recording of the acoustic wave propagation and crack growth behavior observed in steel bridges, but they do provide standard fatigue crack growth rate data. A modified version of the SE(T) specimen has been examined to provide small-scale specimens with improved acoustic emission (AE) characteristics while still maintaining the accuracy of fatigue crack growth rate (da/dN) versus stress intensity factor range (ΔK) data. The specimens are intended to represent a steel beam flange subjected to pure tension, with a surface crack growing transverse to a uniform stress field. Fatigue tests are conducted at a low R ratio. Analytical and numerical studies of the stress intensity factor are developed for single edge notch test specimens consistent with the experimental program. The ABAQUS finite element software is utilized for stress analysis of the crack tips. Analytical, experimental, and numerical results were compared to assess the ability of AE to capture a growing crack.
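
    Fatigue crack growth data of the da/dN-versus-ΔK kind are typically used by integrating a growth law to estimate life. The sketch below integrates the Paris law for an edge crack under a uniform stress range (the constants C and m, the geometry factor Y, and the loading are generic assumptions, not values from this study):

      import math

      # Illustrative fatigue-life estimate by integrating the Paris law
      # da/dN = C * (dK)^m with dK = Y * dS * sqrt(pi * a).
      C, m = 6.9e-12, 3.0        # Paris constants (SI: m/cycle, MPa*sqrt(m))
      Y = 1.12                   # edge-crack geometry factor (taken constant)
      dS = 80.0                  # applied stress range, MPa
      a0, af = 1e-3, 20e-3       # initial / final crack depth, m

      a, n_cycles, da = a0, 0.0, 1e-5
      while a < af:              # simple explicit integration in crack length
          dK = Y * dS * math.sqrt(math.pi * a)
          n_cycles += da / (C * dK**m)
          a += da
      print(f"predicted life ~ {n_cycles:.2e} cycles")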

  15. Integrated optomechanical analysis and testing software development at MIT Lincoln Laboratory

    NASA Astrophysics Data System (ADS)

    Stoeckel, Gerhard P.; Doyle, Keith B.

    2013-09-01

    Advanced analytical software capabilities are being developed to advance the design of prototypical hardware in the Engineering Division at MIT Lincoln Laboratory. The current effort is focused on the integration of analysis tools tailored to the work flow, organizational structure, and current technology demands. These tools are being designed to provide superior insight into the interdisciplinary behavior of optical systems and enable rapid assessment and execution of design trades to optimize the design of optomechanical systems. The custom software architecture is designed to exploit and enhance the functionality of existing industry standard commercial software, provide a framework for centralizing internally developed tools, and deliver greater efficiency, productivity, and accuracy through standardization, automation, and integration. Specific efforts have included the development of a feature-rich software package for Structural-Thermal-Optical Performance (STOP) modeling, advanced Line Of Sight (LOS) jitter simulations, and improved integration of dynamic testing and structural modeling.

  16. An XML-based method for astronomy software designing

    NASA Astrophysics Data System (ADS)

    Liao, Mingxue; Aili, Yusupu; Zhang, Jin

    An XML-based method for standardizing software design is introduced, analyzed, and successfully applied to renovating the hardware and software of the digital clock at Urumqi Astronomical Station. The basic strategy for extracting time information from the new FT206 digital clock in the antenna-control program is introduced. With FT206, the need to compute how many centuries have passed since a given day using complicated formulas is eliminated, and it is no longer necessary to set the correct UT time on the antenna-control computer, because the year, month, and day are all derived from the Julian day stored in FT206 rather than from the computer clock. With an XML-based method and standard for software design, existing design methods are unified, communication and collaboration between developers are facilitated, and an Internet-based mode of software development becomes possible. The development trend of the XML-based design method is predicted.
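
    The "no century formulas needed" point rests on converting a stored Julian day number directly to a calendar date. A standard way to do this is the integer-arithmetic algorithm of Fliegel and Van Flandern (1968), shown here as an illustration rather than the station's actual code:

      def jd_to_calendar(jd: int) -> tuple[int, int, int]:
          """Convert an integer Julian day number (noon-based) to a Gregorian
          (year, month, day) using the Fliegel & Van Flandern (1968)
          integer-arithmetic algorithm."""
          l = jd + 68569
          n = 4 * l // 146097
          l = l - (146097 * n + 3) // 4
          i = 4000 * (l + 1) // 1461001
          l = l - 1461 * i // 4 + 31
          j = 80 * l // 2447
          day = l - 2447 * j // 80
          l = j // 11
          month = j + 2 - 12 * l
          year = 100 * (n - 49) + i + l
          return year, month, day

      assert jd_to_calendar(2451545) == (2000, 1, 1)   # J2000.0 reference epoch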

  17. Stepwise Distributed Open Innovation Contests for Software Development: Acceleration of Genome-Wide Association Analysis

    PubMed Central

    Hill, Andrew; Loh, Po-Ru; Bharadwaj, Ragu B.; Pons, Pascal; Shang, Jingbo; Guinan, Eva; Lakhani, Karim; Kilty, Iain

    2017-01-01

    Background: The association of differing genotypes with disease-related phenotypic traits offers great potential to both help identify new therapeutic targets and support stratification of patients who would gain the greatest benefit from specific drug classes. Development of low-cost genotyping and sequencing has made collecting large-scale genotyping data routine in population and therapeutic intervention studies. In addition, a range of new technologies is being used to capture numerous new and complex phenotypic descriptors. As a result, genotype and phenotype datasets have grown exponentially. Genome-wide association studies associate genotypes and phenotypes using methods such as logistic regression. As existing tools for association analysis limit the efficiency by which value can be extracted from increasing volumes of data, there is a pressing need for new software tools that can accelerate association analyses on large genotype-phenotype datasets. Results: Using open innovation (OI) and contest-based crowdsourcing, the logistic regression analysis in a leading, community-standard genetics software package (PLINK 1.07) was substantially accelerated. OI allowed us to do this in <6 months by providing rapid access to highly skilled programmers with specialized, difficult-to-find skill sets. Through a crowd-based contest a combination of computational, numeric, and algorithmic approaches was identified that accelerated the logistic regression in PLINK 1.07 by 18- to 45-fold. Combining contest-derived logistic regression code with coarse-grained parallelization, multithreading, and associated changes to data initialization code further developed through distributed innovation, we achieved an end-to-end speedup of 591-fold for a data set size of 6678 subjects by 645 863 variants, compared to PLINK 1.07's logistic regression. This represents a reduction in run time from 4.8 hours to 29 seconds. Accelerated logistic regression code developed in this project has been incorporated into the PLINK2 project. Conclusions: Using iterative competition-based OI, we have developed a new, faster implementation of logistic regression for genome-wide association studies analysis. We present lessons learned and recommendations on running a successful OI process for bioinformatics. PMID:28327993
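
    The computation being accelerated is an ordinary per-variant logistic regression. A plain (unaccelerated) version of the scan is only a few lines, as the sketch below shows with synthetic genotypes and phenotypes (this illustrates the analysis PLINK's --logistic performs, not PLINK's own code):

      import numpy as np
      import statsmodels.api as sm

      # Illustrative per-variant association scan with logistic regression
      # on synthetic data.
      rng = np.random.default_rng(0)
      n_subjects, n_variants = 1000, 50
      genotypes = rng.integers(0, 3, size=(n_subjects, n_variants))  # 0/1/2
      phenotype = rng.integers(0, 2, size=n_subjects)                # case/control

      results = []
      for snp in range(n_variants):
          X = sm.add_constant(genotypes[:, snp].astype(float))
          fit = sm.Logit(phenotype, X).fit(disp=0)   # the hot loop the
          results.append((snp, fit.params[1], fit.pvalues[1]))  # contest sped up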

  18. Stepwise Distributed Open Innovation Contests for Software Development: Acceleration of Genome-Wide Association Analysis.

    PubMed

    Hill, Andrew; Loh, Po-Ru; Bharadwaj, Ragu B; Pons, Pascal; Shang, Jingbo; Guinan, Eva; Lakhani, Karim; Kilty, Iain; Jelinsky, Scott A

    2017-05-01

    The association of differing genotypes with disease-related phenotypic traits offers great potential to both help identify new therapeutic targets and support stratification of patients who would gain the greatest benefit from specific drug classes. Development of low-cost genotyping and sequencing has made collecting large-scale genotyping data routine in population and therapeutic intervention studies. In addition, a range of new technologies is being used to capture numerous new and complex phenotypic descriptors. As a result, genotype and phenotype datasets have grown exponentially. Genome-wide association studies associate genotypes and phenotypes using methods such as logistic regression. As existing tools for association analysis limit the efficiency by which value can be extracted from increasing volumes of data, there is a pressing need for new software tools that can accelerate association analyses on large genotype-phenotype datasets. Using open innovation (OI) and contest-based crowdsourcing, the logistic regression analysis in a leading, community-standard genetics software package (PLINK 1.07) was substantially accelerated. OI allowed us to do this in <6 months by providing rapid access to highly skilled programmers with specialized, difficult-to-find skill sets. Through a crowd-based contest a combination of computational, numeric, and algorithmic approaches was identified that accelerated the logistic regression in PLINK 1.07 by 18- to 45-fold. Combining contest-derived logistic regression code with coarse-grained parallelization, multithreading, and associated changes to data initialization code further developed through distributed innovation, we achieved an end-to-end speedup of 591-fold for a data set size of 6678 subjects by 645 863 variants, compared to PLINK 1.07's logistic regression. This represents a reduction in run time from 4.8 hours to 29 seconds. Accelerated logistic regression code developed in this project has been incorporated into the PLINK2 project. Using iterative competition-based OI, we have developed a new, faster implementation of logistic regression for genome-wide association studies analysis. We present lessons learned and recommendations on running a successful OI process for bioinformatics. © The Author 2017. Published by Oxford University Press.

  19. Overview of Hazard Assessment and Emergency Planning Software of Use to RN First Responders

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Waller, E; Millage, K; Blakely, W F

    2008-08-26

    There are numerous software tools available for field deployment, reach-back, training, and planning use in the event of a radiological or nuclear (RN) terrorist event. Specialized software tools used by CBRNe responders can increase information available and the speed and accuracy of the response, thereby ensuring that radiation doses to responders, receivers, and the general public are kept as low as reasonably achievable. Software designed to provide health care providers with assistance in selecting appropriate countermeasures or therapeutic interventions in a timely fashion can improve the potential for positive patient outcome. This paper reviews various software applications of relevance to radiological and nuclear (RN) events that are currently in use by first responders, emergency planners, medical receivers, and criminal investigators.

  20. Open Source Software and the Intellectual Commons.

    ERIC Educational Resources Information Center

    Dorman, David

    2002-01-01

    Discusses the Open Source Software method of software development and its relationship to control over information content. Topics include digital library resources; reference services; preservation; the legal and economic status of information; technical standards; access to digital data; control of information use; and copyright and patent laws.…
