Sample records for software analysis part

  1. Orbiter subsystem hardware/software interaction analysis. Volume 8: AFT reaction control system, part 2

    NASA Technical Reports Server (NTRS)

    Becker, D. D.

    1980-01-01

    The orbiter subsystems and interfacing program elements which interact with the orbiter computer flight software are analyzed. The failure modes identified in the subsystem/element failure mode and effects analysis are examined. Potential interaction with the software is examined through an evaluation of the software requirements. The analysis is restricted to flight software requirements and excludes utility/checkout software. The results of the hardware/software interaction analysis for the aft reaction control system are presented.

  2. Experience report: Using formal methods for requirements analysis of critical spacecraft software

    NASA Technical Reports Server (NTRS)

    Lutz, Robyn R.; Ampo, Yoko

    1994-01-01

    Formal specification and analysis of requirements continues to gain support as a method for producing more reliable software. However, the introduction of formal methods to a large software project is difficult, due in part to the unfamiliarity of the specification languages and the lack of graphics. This paper reports results of an investigation into the effectiveness of formal methods as an aid to the requirements analysis of critical, system-level fault-protection software on a spacecraft currently under development. Our experience indicates that formal specification and analysis can enhance the accuracy of the requirements and add assurance prior to design development in this domain. The work described here is part of a larger, NASA-funded research project whose purpose is to use formal-methods techniques to improve the quality of software in space applications. The demonstration project described here is part of the effort to evaluate experimentally the effectiveness of supplementing traditional engineering approaches to requirements specification with the more rigorous specification and analysis available with formal methods.

  3. Conservative Allowables Determined by a Tsai-Hill Equivalent Criterion for Design of Satellite Composite Parts

    NASA Astrophysics Data System (ADS)

    Pommatau, Gilles

    2014-06-01

    The present paper deals with the industrial application, via software developed by Thales Alenia Space, of a new failure criterion named the "Tsai-Hill equivalent criterion" for composite structural parts of satellites. The first part of the paper briefly describes the main hypotheses and the failure-analysis capabilities of the software. The second part recalls the quadratic and conservative nature of the new failure criterion, already presented in a previous paper at an ESA conference. The third part presents the statistical calculation capabilities of the software, and the associated sensitivity analysis, via results obtained on different composites. Then a methodology, proposed to customers and agencies, is presented with its limitations and advantages. It is concluded that this methodology is an efficient industrial way to perform mechanical analysis on quasi-isotropic composite parts.
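
    For background (standard composites theory, not reproduced from the paper, whose "equivalent" variant the abstract does not detail): the classical Tsai-Hill criterion for a unidirectional ply under plane stress predicts failure when

        \left(\frac{\sigma_1}{X}\right)^2 - \frac{\sigma_1\,\sigma_2}{X^2} + \left(\frac{\sigma_2}{Y}\right)^2 + \left(\frac{\tau_{12}}{S}\right)^2 \ge 1,

    where X, Y and S are the longitudinal, transverse and in-plane shear strengths of the ply.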

  4. The VLBI Data Analysis Software νSolve: Development Progress and Plans for the Future

    NASA Astrophysics Data System (ADS)

    Bolotin, S.; Baver, K.; Gipson, J.; Gordon, D.; MacMillan, D.

    2014-12-01

    The program νSolve is a part of the CALC/SOLVE VLBI data analysis system. It is a replacement for interactive SOLVE, the part of CALC/SOLVE that is used for preliminary data analysis of new VLBI sessions. νSolve is completely new software. It is written in C++ and has a modern graphical user interface. In this article we present the capabilities of the software, its current status, and our plans for future development.

  5. Development problem analysis of correlation leak detector’s software

    NASA Astrophysics Data System (ADS)

    Faerman, V. A.; Avramchuk, V. S.; Marukyan, V. M.

    2018-05-01

    In this article, the practical application and structure of correlation leak detector software are studied and the task of its design is analyzed. In the first part of the paper, the expediency of developing correlation leak detection facilities for improving the operating efficiency of public utilities is shown. The functional structure of correlation leak detectors is analyzed and the tasks of their software are defined. In the second part of the paper, several development steps of the software package – requirements gathering, program structure definition and software concept creation – are examined in the context of experience with a hardware-software prototype of a correlation leak detector.

  6. The software application and classification algorithms for welds radiograms analysis

    NASA Astrophysics Data System (ADS)

    Sikora, R.; Chady, T.; Baniukiewicz, P.; Grzywacz, B.; Lopato, P.; Misztal, L.; Napierała, L.; Piekarczyk, B.; Pietrusewicz, T.; Psuj, G.

    2013-01-01

    The paper presents a software implementation of an Intelligent System for Radiogram Analysis (ISAR). The system is intended to support radiologists in weld quality inspection. The image processing part of the software, with its graphical user interface, and the weld classification part are described together with selected classification results. Classification was based on several algorithms: an artificial neural network, k-means clustering, a simplified k-means, and rough set theory.
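
    As background on one of the algorithms named above (a minimal sketch of simplified k-means, not the ISAR code; the two-dimensional "defect feature" data are invented):

        import numpy as np

        def kmeans(X, k, iters=50, seed=0):
            """Simplified (Lloyd-style) k-means: alternate assignment and mean update."""
            rng = np.random.default_rng(seed)
            centers = X[rng.choice(len(X), size=k, replace=False)]  # seed centers from the data
            for _ in range(iters):
                # assign each sample to its nearest center, then move centers to cluster means
                labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(axis=2), axis=1)
                centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                                    else centers[j] for j in range(k)])
            return labels, centers

        # toy 2-D features (e.g., indication area and elongation), purely illustrative
        rng = np.random.default_rng(1)
        X = np.vstack([rng.normal(mu, 0.3, size=(30, 2)) for mu in (0.0, 2.0, 4.0)])
        labels, centers = kmeans(X, k=3)
        print(np.round(centers, 2))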

  7. Application of software technology to automatic test data analysis

    NASA Technical Reports Server (NTRS)

    Stagner, J. R.

    1991-01-01

    The verification process for a major software subsystem was partially automated as part of a feasibility demonstration. The methods employed are generally useful and applicable to other types of subsystems. The effort resulted in substantial savings in test engineer analysis time and offers a method for inclusion of automatic verification as a part of regression testing.

  8. NASA software specification and evaluation system design, part 2

    NASA Technical Reports Server (NTRS)

    1976-01-01

    A survey and analysis of the existing methods, tools and techniques employed in the development of software are presented, along with recommendations for the construction of reliable software. Functional designs for the software specification language and the database verifier are presented.

  9. Accounting for Uncertainties in Strengths of SiC MEMS Parts

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel; Evans, Laura; Beheim, Glen; Trapp, Mark; Jadaan, Osama; Sharpe, William N., Jr.

    2007-01-01

    A methodology has been devised for accounting for uncertainties in the strengths of silicon carbide structural components of microelectromechanical systems (MEMS). The methodology enables prediction of the probabilistic strengths of complexly shaped MEMS parts using data from tests of simple specimens. This methodology is intended to serve as a part of a rational basis for designing SiC MEMS, supplementing methodologies that have been borrowed from the art of designing macroscopic brittle material structures. The need for this or a similar methodology arises as a consequence of the fundamental nature of MEMS and the brittle silicon-based materials from which they are typically fabricated. When tested to fracture, MEMS and structural components thereof show wide part-to-part scatter in strength. The methodology involves the use of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) software in conjunction with the ANSYS Probabilistic Design System (PDS) software to simulate or predict the strength responses of brittle material components while simultaneously accounting for the effects of variability of geometrical features on the strength responses. As such, the methodology involves the use of an extended version of the ANSYS/CARES/PDS software system described in Probabilistic Prediction of Lifetimes of Ceramic Parts (LEW-17682-1/4-1), Software Tech Briefs supplement to NASA Tech Briefs, Vol. 30, No. 9 (September 2006), page 10. The ANSYS PDS software enables the ANSYS finite-element-analysis program to account for uncertainty in the design and analysis process. The ANSYS PDS software accounts for uncertainty in material properties, dimensions, and loading by assigning probabilistic distributions to user-specified model parameters and performing simulations using various sampling techniques.
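
    To illustrate the sampling idea described above (a toy sketch, not the CARES/Life or ANSYS PDS implementation; the Weibull parameters, dimensions and load model are all invented), part strengths can be drawn from a Weibull distribution while a geometric parameter varies normally:

        import numpy as np

        rng = np.random.default_rng(1)
        m, sigma0 = 8.0, 450.0                    # assumed Weibull modulus and characteristic strength (MPa)
        n = 100_000                               # Monte-Carlo part realizations

        strengths = sigma0 * rng.weibull(m, n)    # part-to-part strength scatter
        thickness = rng.normal(2.0, 0.05, n)      # geometric variability (micrometres)
        stress = 300.0 * (2.0 / thickness)        # toy load model: thinner part sees higher stress (MPa)

        print("predicted failure probability:", (stress > strengths).mean())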

  10. Information Extraction for System-Software Safety Analysis: Calendar Year 2008 Year-End Report

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.

    2009-01-01

    This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.

  11. Information Extraction for System-Software Safety Analysis: Calendar Year 2007 Year-End Report

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.

    2008-01-01

    This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis on the models to identify possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations; 4) perform discrete-time-based simulation on the models to investigate scenarios where these paths may play a role in failures and mishaps; and 5) identify resulting candidate scenarios for software integration testing. This paper describes new challenges in a NASA abort system case, and enhancements made to develop the integrated tool set.

  12. Architecture of the software for LAMOST fiber positioning subsystem

    NASA Astrophysics Data System (ADS)

    Peng, Xiaobo; Xing, Xiaozheng; Hu, Hongzhuan; Zhai, Chao; Li, Weimin

    2004-09-01

    The architecture of the software which controls the LAMOST fiber positioning subsystem is described. The software is composed of two parts: a main control program running on a computer and a unit controller program stored in the ROM of an MCS51 single-chip microcomputer. The functions of the software include: client/server model establishment, observation planning, collision handling, data transmission, pulse generation, CCD control, image capture and processing, and data analysis. Particular attention is paid to the ways in which different parts of the software communicate. Software techniques for multithreading, socket programming, Microsoft Windows message response, and serial communication are also discussed.
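
    To illustrate the client/server and socket techniques mentioned above (a generic Python sketch, not the LAMOST protocol; the host, port and command format are invented):

        import socket, threading, time

        HOST, PORT = "127.0.0.1", 5000       # placeholder address

        def server():
            with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
                s.bind((HOST, PORT))
                s.listen()
                conn, _ = s.accept()
                with conn:                   # echo each positioning command back as an ack
                    while data := conn.recv(1024):
                        conn.sendall(b"ACK " + data)

        threading.Thread(target=server, daemon=True).start()
        time.sleep(0.2)                      # crude wait until the server is listening

        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as c:
            c.connect((HOST, PORT))
            c.sendall(b"MOVE fiber=42 x=1.0 y=2.0")   # invented command format
            print(c.recv(1024).decode())              # ACK MOVE fiber=42 x=1.0 y=2.0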

  13. NASA software specification and evaluation system design, part 1

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The research to develop methods for reducing the effort expended in software development and verification is reported. The development of a formal software requirements methodology, a formal specifications language, a programming language, a language preprocessor, and code analysis tools are discussed.

  14. Expert system verification and validation study: ES V/V Workshop

    NASA Technical Reports Server (NTRS)

    French, Scott; Hamilton, David

    1992-01-01

    The primary purpose of this document is to build a foundation for applying principles of verification and validation (V&V) to expert systems. To achieve this, some background on V&V as applied to conventionally implemented software is required. Part one discusses the background of V&V from the perspective of (1) what V&V of software is and (2) V&V's role in developing software. Part one also overviews some common analysis techniques that are applied when performing V&V of software. All of this material is presented on the assumption that the reader has little or no background in V&V or in developing procedural software. The primary purpose of part two is to explain the major techniques that have been developed for V&V of expert systems.

  15. Evaluation of automated decisionmaking methodologies and development of an integrated robotic system simulation, volume 2, part 1. Appendix A: Software documentation

    NASA Technical Reports Server (NTRS)

    Lowrie, J. W.; Fermelia, A. J.; Haley, D. C.; Gremban, K. D.; Vanbaalen, J.; Walsh, R. W.

    1982-01-01

    Documentation of the preliminary software developed as a framework for a generalized integrated robotic system simulation is presented. The program structure is composed of three major functions controlled by a program executive. The three major functions are: system definition, analysis tools, and post processing. The system definition function handles user input of system parameters and definition of the manipulator configuration. The analysis tools function handles the computational requirements of the program. The post processing function allows for more detailed study of the results of analysis tool function executions. Also documented is the manipulator joint model software to be used as the basis of the manipulator simulation which will be part of the analysis tools capability.

  16. Numerical Analysis of Coolant Flow and Heat Transfer in ITER Diagnostic First Wall

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khodak, A.; Loesser, G.; Zhai, Y.

    2015-07-24

    We performed numerical simulations of the ITER Diagnostic First Wall (DFW) using ANSYS Workbench. During operation the DFW will include a solid main body as well as liquid coolant. Thermal and hydraulic analysis of the DFW was therefore performed using a conjugate heat transfer approach, in which heat transfer was resolved in both the solid and liquid parts, while fluid dynamics analysis was performed only in the liquid part. This approach includes the interface between the solid and liquid parts of the system. The analysis was performed using ANSYS CFX software, which allows solution of the heat transfer equations in the solid and liquid parts and of the flow equations in the liquid part. Coolant flow in the DFW was assumed turbulent and was resolved using the Reynolds-averaged Navier-Stokes equations with the Shear Stress Transport turbulence model. Meshing was performed using the CFX method available within ANSYS. The data cloud for thermal loading, consisting of volumetric heating and surface heating, was imported into CFX. The volumetric heating source was generated using Attila software, and the surface heating was obtained from a radiation heat transfer analysis. Our results allowed us to identify areas of excessive heating. Proposals for cooling channel relocation were made, and additional suggestions were made to improve the hydraulic performance of the cooling system.

  17. Automation for System Safety Analysis

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Fleming, Land; Throop, David; Thronesbery, Carroll; Flores, Joshua; Bennett, Ted; Wennberg, Paul

    2009-01-01

    This presentation describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.

  18. Evaluation of a Game to Teach Requirements Collection and Analysis in Software Engineering at Tertiary Education Level

    ERIC Educational Resources Information Center

    Hainey, Thomas; Connolly, Thomas M.; Stansfield, Mark; Boyle, Elizabeth A.

    2011-01-01

    A highly important part of software engineering education is requirements collection and analysis which is one of the initial stages of the Database Application Lifecycle and arguably the most important stage of the Software Development Lifecycle. No other conceptual work is as difficult to rectify at a later stage or as damaging to the overall…

  19. Logic Model Checking of Unintended Acceleration Claims in Toyota Vehicles

    NASA Technical Reports Server (NTRS)

    Gamble, Ed

    2012-01-01

    Part of the US Department of Transportation investigation of Toyota sudden unintended acceleration (SUA) involved analysis of the throttle control software. The JPL Laboratory for Reliable Software applied several techniques, including static analysis and logic model checking, to the software. A handful of logic models were built. Some weaknesses were identified; however, no cause for SUA was found. The full NASA report includes numerous other analyses.

  20. The Trial Software version for DEMETER power spectrum files visualization and mapping

    NASA Astrophysics Data System (ADS)

    Lozbin, Anatoliy; Inchin, Alexander; Shpadi, Maxim

    2010-05-01

    In the frame of the creation of Kazakhstan's Scientific Space System for earthquake precursor research, the hardware and software of the DEMETER satellite were investigated. The DEMETER data processing software is based on the SWAN package under the IDL Virtual Machine and offers many features, but it lacks an important tool for spectrogram analysis: space-time visualization of power spectrum files from the electromagnetic instruments ICE and IMSC. To fill this gap we have developed software which is offered for use. DeSS (DEMETER Spectrogram Software) is software for visualization, analysis and mapping of power spectrum data from the electromagnetic instruments ICE and IMSC. The software's primary goal is to give the researcher a friendly tool for the analysis of electromagnetic data from the DEMETER satellite for earthquake precursor and other ionospheric event research. The input data for the DeSS software are power spectrum files: power spectrum of one component of the electric field in the VLF range (APID 1132); power spectrum of one component of the electric field in the HF range (APID 1134); power spectrum of one component of the magnetic field in the VLF range (APID 1137). The main features and operations of the software are: various time and frequency filtration; visualization of the time dependence of signal intensity at a fixed frequency; spectral density visualization for a fixed frequency range; spectrogram autosizing and smoothing; the information at each point of the spectrogram (time, frequency and intensity); the spectrum information in a separate window consisting of 4 blocks; and data mapping with a 6-range scale. On the map we can browse the following information: the satellite orbit; the conjugate point at the satellite altitude; the north conjugate point at an altitude of 110 km; and the south conjugate point at an altitude of 110 km. This is only a trial software version to help researchers, and we are always ready to collaborate with scientists on software improvement. References: 1. D. Lagoutte, J.Y. Brochot, D. de Carvalho, L. Madrias and M. Parrot. DEMETER Microsatellite. Scientific Mission Center. Data product description. DMT-SP-9-CM-6054-LPC. 2. D. Lagoutte, J.Y. Brochot, P. Latremoliere. SWAN - Software for Waveform Analysis. LPCE/NI/003.E - Part 1 (User's guide), Part 2 (Analysis tools), Part 3 (User's project interface).

  1. NASA Workshop on Computational Structural Mechanics 1987, part 3

    NASA Technical Reports Server (NTRS)

    Sykes, Nancy P. (Editor)

    1989-01-01

    Computational Structural Mechanics (CSM) topics are explored. Algorithms and software for nonlinear structural dynamics, concurrent algorithms for transient finite element analysis, computational methods and software systems for dynamics and control of large space structures, and the use of multi-grid for structural analysis are discussed.

  2. Evaluating Games-Based Learning

    ERIC Educational Resources Information Center

    Hainey, Thomas; Connolly, Thomas

    2010-01-01

    A highly important part of software engineering education is requirements collection and analysis, one of the initial stages of the Software Development Lifecycle. No other conceptual work is as difficult to rectify at a later stage or as damaging to the overall system if performed incorrectly. As software engineering is a field with a reputation…

  3. Analysis of direct punch velocity in professional defence

    NASA Astrophysics Data System (ADS)

    Lapkova, Dora; Adamek, Milan

    2016-06-01

    This paper is focused on analysis of a direct punch. Nowadays, professional defence is a basic part of effective protection of people and property. There are many striking techniques, and the goal of this research was to analyze the direct punch. The analysis is aimed at measuring the velocity with the help of a high-speed camera (Olympus i-Speed 2) and then finding the dependence of this velocity on input parameters. For data analysis two pieces of software were used: i-Speed Control Software and MINITAB. 111 participants took part in this experiment. The results are presented in this paper, especially the dependence of mean velocity on time and the difference in velocity between genders.

  4. IDA Cost Research Symposium Held 25 May 1995.

    DTIC Science & Technology

    1995-08-01

    Excel Spreadsheet Publications: MCR Report TR-9507/01. Category: II.B. Keywords: Government, Estimating, Missiles, Analysis, Production, Data... originally developed by Martin Marietta as part of the SASET software estimating model. To be implemented as part of the SoftEST Software Estimating Tool... following documents to report the results of its work. Reports: Reports are the most authoritative and most carefully considered products IDA

  5. Using Combined SFTA and SFMECA Techniques for Space Critical Software

    NASA Astrophysics Data System (ADS)

    Nicodemos, F. G.; Lahoz, C. H. N.; Abdala, M. A. D.; Saotome, O.

    2012-01-01

    This work addresses the combined Software Fault Tree Analysis (SFTA) and Software Failure Modes, Effects and Criticality Analysis (SFMECA) techniques applied to space-critical software of satellite launch vehicles. The combined approach is under research as part of the Verification and Validation (V&V) efforts to increase software dependability, and for future application in other projects under development at Instituto de Aeronáutica e Espaço (IAE). The applicability of this approach was evaluated on a system software specification and applied to a case study based on the Brazilian Satellite Launcher (VLS). The main goal is to identify possible failure causes and obtain compensating provisions that lead to the inclusion of new functional and non-functional system software requirements.

  6. Structural Analysis Using NX Nastran 9.0

    NASA Technical Reports Server (NTRS)

    Rolewicz, Benjamin M.

    2014-01-01

    NX Nastran is a powerful Finite Element Analysis (FEA) software package used to solve linear and non-linear models for structural and thermal systems. The software, which consists of both a solver and a user interface, breaks down analysis into four files, each of which is important to the end results of the analysis. The software offers capabilities for a variety of types of analysis, and also contains a respectable modeling program. Over the course of ten weeks, I was trained to effectively implement NX Nastran in structural analysis and refinement for parts of two missions at NASA's Kennedy Space Center, the Restore mission and the Orion mission.

  7. Coordination and organization of security software process for power information application environment

    NASA Astrophysics Data System (ADS)

    Wang, Qiang

    2017-09-01

    As an important part of software engineering, the software process decides the success or failure of a software product. The design and development features of the security software process are discussed, as are the necessity and present significance of using such a process. In coordination with the functional software, the process for security software and its testing are discussed in depth. The process includes requirements analysis, design, coding, debugging and testing, submission, and maintenance. For each phase, the paper proposes subprocesses to support software security. As an example, the paper applies the above process to a power information platform.

  8. Technology Infusion of CodeSonar into the Space Network Ground Segment (RII07)

    NASA Technical Reports Server (NTRS)

    Benson, Markland

    2008-01-01

    The NASA Software Assurance Research Program (in part) performs studies as to the feasibility of technologies for improving the safety, quality, reliability, cost, and performance of NASA software. This study considers the application of commercial automated source code analysis tools to mission critical ground software that is in the operations and sustainment portion of the product lifecycle.

  9. SEP thrust subsystem performance sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Atkins, K. L.; Sauer, C. G., Jr.; Kerrisk, D. J.

    1973-01-01

    This is a two-part report on solar electric propulsion (SEP) performance sensitivity analysis. The first part describes the preliminary analysis of the SEP thrust system performance for an Encke rendezvous mission. A detailed description of thrust subsystem hardware tolerances on mission performance is included together with nominal spacecraft parameters based on these tolerances. The second part describes the method of analysis and graphical techniques used in generating the data for Part 1. Included is a description of both the trajectory program used and the additional software developed for this analysis. Part 2 also includes a comprehensive description of the use of the graphical techniques employed in this performance analysis.

  10. Inclusion of LCCA in Alaska flexible pavement design software manual.

    DOT National Transportation Integrated Search

    2012-10-01

    Life cycle cost analysis is a key part of selecting materials and techniques that optimize the service life of a pavement in terms of cost and performance. While the Alaska Flexible Pavement Design software has been in use since 2004, there is no ...

  11. Logic Model Checking of Unintended Acceleration Claims in the 2005 Toyota Camry Electronic Throttle Control System

    NASA Technical Reports Server (NTRS)

    Gamble, Ed; Holzmann, Gerard

    2011-01-01

    Part of the US DOT investigation of Toyota SUA involved analysis of the throttle control software. JPL LaRS applied several techniques, including static analysis and logic model checking, to the software. A handful of logic models were built. Some weaknesses were identified; however, no cause for SUA was found. The full NASA report includes numerous other analyses.

  12. STEM_CELL: a software tool for electron microscopy: part 2--analysis of crystalline materials.

    PubMed

    Grillo, Vincenzo; Rossi, Francesca

    2013-02-01

    A new graphical software package (STEM_CELL) for analysis of HRTEM and STEM-HAADF images is introduced in detail. The advantage of the software, beyond its graphical interface, is that it combines different analysis algorithms and simulation (described in an associated article) to produce novel analysis methodologies. Different implementations of and improvements to state-of-the-art approaches are reported for image analysis, filtering, normalization, and background subtraction. In particular, two important methodological results are highlighted: (i) the definition of a procedure for atomic-scale quantitative analysis of HAADF images, and (ii) the extension of geometric phase analysis to large regions, potentially up to 1 μm, through the use of undersampled images with aliasing effects. Copyright © 2012 Elsevier B.V. All rights reserved.

  13. Engine Structures Modeling Software System (ESMOSS)

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The Engine Structures Modeling Software System (ESMOSS) is a specialized software system for the construction of geometric descriptive and discrete analytical models of engine parts, components, and substructures which can be transferred to finite element analysis programs such as NASTRAN. The NASA Lewis Engine Structures Program is concerned with the development of technology for the rational structural design and analysis of advanced gas turbine engines, with emphasis on advanced structural analysis, structural dynamics, structural aspects of aeroelasticity, and life prediction. Fundamental and common to all of these developments is the need for geometric and analytical model descriptions at various engine assembly levels, which are generated using ESMOSS.

  14. Architecture for interoperable software in biology.

    PubMed

    Bare, James Christopher; Baliga, Nitin S

    2014-07-01

    Understanding biological complexity demands a combination of high-throughput data and interdisciplinary skills. One way to bring to bear the necessary combination of data types and expertise is by encapsulating domain knowledge in software and composing that software to create a customized data analysis environment. To this end, simple flexible strategies are needed for interconnecting heterogeneous software tools and enabling data exchange between them. Drawing on our own work and that of others, we present several strategies for interoperability and their consequences, in particular a set of simple data structures (list, matrix, network, table and tuple) that have proven sufficient to achieve a high degree of interoperability. We provide a few guidelines for the development of future software that will function as part of an interoperable community of software tools for biological data analysis and visualization. © The Author 2012. Published by Oxford University Press.

  15. Automating Embedded Analysis Capabilities and Managing Software Complexity in Multiphysics Simulation, Part II: Application to Partial Differential Equations

    DOE PAGES

    Pawlowski, Roger P.; Phipps, Eric T.; Salinger, Andrew G.; ...

    2012-01-01

    A template-based generic programming approach was presented in Part I of this series of papers [Sci. Program. 20 (2012), 197–219] that separates the development effort of programming a physical model from that of computing additional quantities, such as derivatives, needed for embedded analysis algorithms. In this paper, we describe the implementation details for using the template-based generic programming approach for simulation and analysis of partial differential equations (PDEs). We detail several of the hurdles that we have encountered, and some of the software infrastructure developed to overcome them. We end with a demonstration where we present shape optimization and uncertainty quantification results for a 3D PDE application.

  16. A Java application for tissue section image analysis.

    PubMed

    Kamalov, R; Guillaud, M; Haskins, D; Harrison, A; Kemp, R; Chiu, D; Follen, M; MacAulay, C

    2005-02-01

    The medical industry has taken advantage of Java and Java technologies over the past few years, in large part due to the language's platform-independence and object-oriented structure. As such, Java provides powerful and effective tools for developing tissue section analysis software. The background and execution of this development are discussed in this publication. Object-oriented structure allows for the creation of "Slide", "Unit", and "Cell" objects to simulate the corresponding real-world objects. Different functions may then be created to perform various tasks on these objects, thus facilitating the development of the software package as a whole. At the current time, substantial parts of the initially planned functionality have been implemented. Getafics 1.0 is fully operational and currently supports a variety of research projects; however, there are certain features of the software that currently introduce unnecessary complexity and inefficiency. In the future, we hope to include features that obviate these problems.

  17. Software Design Improvements. Part 1; Software Benefits and Limitations

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom

    1997-01-01

    Computer hardware and associated software have been used for many years to process accounting information, to analyze test data and to perform engineering analysis. Now computers and software also control everything from automobiles to washing machines, and the number and type of applications are growing at an exponential rate. The size of individual programs has shown similar growth. Furthermore, software and hardware are used to monitor and/or control potentially dangerous products and safety-critical systems. These uses include everything from airplanes and braking systems to medical devices and nuclear plants. The questions are: how can this hardware and software be made more reliable, how can software quality be improved, what methodology needs to be applied to large and small software products to improve their design, and how can software be verified?

  18. Technology Transition Pull: A Case Study of Rate Monotonic Analysis (Part 2).

    DTIC Science & Technology

    1995-04-01

    met in software-intensive real-time systems. RMA allows engineers to understand and predict the timing behavior of real-time software to a degree not previously possible. The Rate Monotonic Analysis for Real-Time Systems (RMARTS) Project at the SEI has demonstrated how to design, implement, troubleshoot, and maintain real-time systems using RMA. From 1987-1992, the project worked to develop the technology and encourage its widespread
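
    For background (standard RMA theory, not taken from the report above): the Liu and Layland utilization bound gives a sufficient schedulability test for rate monotonic scheduling; a minimal sketch, with an invented task set:

        def rm_schedulable(tasks):
            """Liu & Layland sufficient test: U = sum(C/T) <= n * (2**(1/n) - 1)."""
            n = len(tasks)
            utilization = sum(c / t for c, t in tasks)
            return utilization <= n * (2 ** (1.0 / n) - 1)

        # (worst-case execution time, period) pairs for three periodic tasks
        print(rm_schedulable([(1, 4), (1, 5), (2, 10)]))   # U = 0.65 <= 0.7798 -> True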

  19. Automotive Design

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Analytical Design Service Corporation, Ann Arbor, MI, used NASTRAN (a NASA Structural Analysis program that analyzes a design and predicts how parts will perform) in tests of transmissions, engine cooling systems, internal engine parts, and body components. They also use it to design future automobiles. Analytical software can save millions by allowing computer simulated analysis of performance even before prototypes are built.

  20. Software testing

    NASA Astrophysics Data System (ADS)

    Price-Whelan, Adrian M.

    2016-01-01

    Now more than ever, scientific results are dependent on sophisticated software and analysis. Why should we trust code written by others? How do you ensure your own code produces sensible results? How do you make sure it continues to do so as you update, modify, and add functionality? Software testing is an integral part of code validation and writing tests should be a requirement for any software project. I will talk about Python-based tools that make managing and running tests much easier and explore some statistics for projects hosted on GitHub that contain tests.
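
    As an illustration of the kind of Python-based testing described (a minimal sketch; pytest is one such tool, and the file and function names are invented), running `pytest test_stats.py` discovers and executes every `test_*` function below:

        # test_stats.py
        def mean(xs):
            """Arithmetic mean of a non-empty sequence."""
            return sum(xs) / len(xs)

        def test_mean():
            assert mean([1.0, 2.0, 3.0]) == 2.0

        def test_mean_single_value():
            assert mean([7.0]) == 7.0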

  1. Libre Software in Spanish Public Administrations

    NASA Astrophysics Data System (ADS)

    Ortega, Felipe; Lafuente, Isabel; Gato, Jose; González-Barahona, Jesús M.

    Libre software started to be used in Public Administrations in Spain during the 1990s, in some isolated but interesting experiences. During the early 2000s, and especially in some regional governments, libre software started to be considered an integral part of IT-related policies. In 2007, it was evident that many experiences related to libre software were running in Public Administrations with different levels of success. However, no study had looked into the details of these experiences, and no comprehensive analysis had been performed to better understand the different factors that affect them.

  2. Pandora Operation and Analysis Software

    NASA Technical Reports Server (NTRS)

    Herman, Jay; Cede, Alexander; Abuhassan, Nader

    2012-01-01

    Pandora Operation and Analysis Software controls the Pandora Sun- and sky-pointing optical head and built-in filter wheels (neutral density, UV bandpass, polarization filters, and opaque). The software also controls the attached spectrometer exposure time and thermoelectric cooler to maintain the spectrometer temperature to within 1 °C. All functions are available through a GUI so as to be easily accessible by the user. The data are automatically stored on a miniature computer (netbook) for automatic download to a designated server at user-defined intervals (once per day, once per week, etc.), or to a USB external device. An additional software component reduces the raw data (spectrometer counts) to preliminary scientific products for quick-view purposes. The Pandora systems are built from off-the-shelf commercial parts and from mechanical parts machined using electronic machine shop drawings. The Pandora spectrometer system is designed to look at the Sun (tracking to within 0.1°), or to look at the sky at any zenith or azimuth angle, to gather information about the amount of trace gases or aerosols that are present.

  3. FluxPyt: a Python-based free and open-source software for 13C-metabolic flux analyses.

    PubMed

    Desai, Trunil S; Srivastava, Shireesh

    2018-01-01

    13C-Metabolic flux analysis (MFA) is a powerful approach to estimate intracellular reaction rates which could be used in strain analysis and design. Processing and analysis of labeling data for calculation of fluxes and associated statistics is an essential part of MFA. However, various software currently available for data analysis employ proprietary platforms and thus limit accessibility. We developed FluxPyt, a Python-based truly open-source software package for conducting stationary 13C-MFA data analysis. The software is based on the efficient elementary metabolite unit framework. The standard deviations in the calculated fluxes are estimated using the Monte-Carlo analysis. FluxPyt also automatically creates flux maps based on a template for visualization of the MFA results. The flux distributions calculated by FluxPyt for two separate models: a small tricarboxylic acid cycle model and a larger Corynebacterium glutamicum model, were found to be in good agreement with those calculated by a previously published software. FluxPyt was tested in Microsoft™ Windows 7 and 10, as well as in Linux Mint 18.2. The availability of a free and open 13C-MFA software that works in various operating systems will enable more researchers to perform 13C-MFA and to further modify and develop the package.

  4. FluxPyt: a Python-based free and open-source software for 13C-metabolic flux analyses

    PubMed Central

    Desai, Trunil S.

    2018-01-01

    13C-Metabolic flux analysis (MFA) is a powerful approach to estimate intracellular reaction rates which could be used in strain analysis and design. Processing and analysis of labeling data for calculation of fluxes and associated statistics is an essential part of MFA. However, various software currently available for data analysis employ proprietary platforms and thus limit accessibility. We developed FluxPyt, a Python-based truly open-source software package for conducting stationary 13C-MFA data analysis. The software is based on the efficient elementary metabolite unit framework. The standard deviations in the calculated fluxes are estimated using the Monte-Carlo analysis. FluxPyt also automatically creates flux maps based on a template for visualization of the MFA results. The flux distributions calculated by FluxPyt for two separate models: a small tricarboxylic acid cycle model and a larger Corynebacterium glutamicum model, were found to be in good agreement with those calculated by a previously published software. FluxPyt was tested in Microsoft™ Windows 7 and 10, as well as in Linux Mint 18.2. The availability of a free and open 13C-MFA software that works in various operating systems will enable more researchers to perform 13C-MFA and to further modify and develop the package. PMID:29736347
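
    To illustrate the Monte-Carlo error estimation mentioned in both abstracts (a toy sketch, not FluxPyt's code; the linear model, noise level and "flux" are invented stand-ins for the flux-fitting problem):

        import numpy as np

        rng = np.random.default_rng(0)
        v_true = 2.5                          # "true" flux in a toy linear model y = v * x
        x = np.linspace(0.1, 1.0, 10)         # stand-in for labeling measurements
        sigma = 0.05                          # assumed measurement noise

        estimates = []
        for _ in range(1000):                 # re-fit the model to many perturbed data sets
            y_noisy = v_true * x + rng.normal(0.0, sigma, x.shape)
            v_hat, *_ = np.linalg.lstsq(x[:, None], y_noisy, rcond=None)
            estimates.append(v_hat[0])

        print("flux =", np.mean(estimates), "+/-", np.std(estimates))   # Monte-Carlo std dev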

  5. Analysis Software

    NASA Technical Reports Server (NTRS)

    1994-01-01

    General Purpose Boundary Element Solution Technology (GPBEST) software employs the boundary element method of mechanical engineering analysis, as opposed to finite element. It is, according to one of its developers, 10 times faster in data preparation and more accurate than other methods. Its use results in less expensive products because the time between design and manufacturing is shortened. A commercial derivative of a NASA-developed computer code, it is marketed by Best Corporation to solve problems in stress analysis, heat transfer, fluid analysis and yielding and cracking of solids. Other applications include designing tractor and auto parts, household appliances and acoustic analysis.

  6. SMARTSware for SMARTS users to facilitate data reduction and data analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2005-01-01

    The software package SMARTSware is made by one of the instrument scientists on the engineering neutron diffractometer SMARTS at the Lujan Center, a national user facility at the Los Alamos Neutron Science Center (LANSCE). The purpose of the software is to facilitate the analysis of powder diffraction data recorded at the Lujan Center, and hence the target audience is users performing experiments at one of the powder diffractometers (SMARTS, HIPPO, HIPD and NPDF) at the Lujan Center. The beam time at the Lujan Center is allocated by peer review of internally and externally submitted proposals, and therefore many of the users who are granted beam time are from the international science community. Generally, the users are only at the Lujan Center for a short period of time while they are performing the experiments, and often they leave with several data sets that have not been analyzed. The distribution of the SMARTSware software package will minimize their efforts when analyzing the data once they are back at their institution. Description of software: There are two main parts of the software: a part used to generate instrument parameter files from a set of calibration runs (Smartslparm, SmartsBin, SmartsFitDif and SmartsFitspec), and a part that facilitates the batch refinement of multiple diffraction patterns (SmartsRunRep, SmartsABC, SmartsSPF and SmartsExtract). The former part may be only peripheral to most users, but is a critical part of the instrument scientists' efforts in calibrating their instruments. The latter part is highly applicable to the users, as they often need to analyze or re-analyze large sets of data. The programs within the SMARTSware package rely heavily on GSAS for the Rietveld and single peak refinements of diffraction data. GSAS (General Structure Analysis System) is publicly available software also originating from LANL. Subroutines and libraries from the NeXus project (a worldwide effort to standardize diffraction data formats) and the National Center for Supercomputing Applications (NCSA) at the University of Illinois (the Hierarchical Data Format Software Library and Utilities) are used in the programs. All these subroutines and libraries are publicly available through the GNU Public License and/or as freeware. The package also contains sample input and output text files and a manual (LA-UR 04-6581). The executables and sample files will be available for download at http://public.lanl.gov/clausen/SMARTSware.html and ftp://lansce.lanl.gov/clausen/SMARTSware/SMARTSware.zip, but the source codes will only be made available by written request to clausen@lanl.gov.

  7. Sample registration software for process automation in the Neutron Activation Analysis (NAA) Facility in Malaysia nuclear agency

    NASA Astrophysics Data System (ADS)

    Rahman, Nur Aira Abd; Yussup, Nolida; Salim, Nazaratul Ashifa Bt. Abdullah; Ibrahim, Maslina Bt. Mohd; Mokhtar, Mukhlis B.; Soh@Shaari, Syirrazie Bin Che; Azman, Azraf B.; Ismail, Nadiah Binti

    2015-04-01

    Neutron Activation Analysis (NAA) has been established in Nuclear Malaysia since the 1980s. Most of the procedures established were done manually, including sample registration. The samples were recorded manually in a logbook and given ID numbers. Then all samples, standards, SRMs and blanks were recorded on the irradiation vial and on several forms prior to irradiation. These manual procedures carried out by the NAA laboratory personnel were time-consuming and inefficient. Sample registration software was developed as part of the IAEA/CRP project on 'Development of Process Automation in the Neutron Activation Analysis (NAA) Facility in Malaysia Nuclear Agency (RC17399)'. The objective of the project is to create PC-based data entry software for the sample preparation stage. This is an effective method to replace the redundant manual data entries that need to be completed by laboratory personnel. The software developed will automatically generate a sample code for each sample in a batch, create printable registration forms for administration purposes, and store selected parameters that will be passed to the sample analysis program. The software is developed using National Instruments LabVIEW 8.6.

  8. BrightStat.com: free statistics online.

    PubMed

    Stricker, Daniel

    2008-10-01

    Powerful software for statistical analysis is expensive. Here I present BrightStat, statistical software running on the Internet which is free of charge. BrightStat's goals and its main capabilities and functionalities are outlined. Three different sample runs, a Friedman test, a chi-square test, and a step-wise multiple regression, are presented. The results obtained by BrightStat are compared with results computed by SPSS, one of the global leaders in providing statistical software, and VassarStats, a collection of scripts for data analysis running on the Internet. Elementary statistics is an inherent part of academic education, and BrightStat is an alternative to commercial products.
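
    For comparison, the first two tests named above can be run in a few lines with open-source tooling (a sketch using SciPy, unrelated to BrightStat; the numbers are invented):

        from scipy import stats

        # Friedman test on three related samples (invented repeated measurements)
        stat, p = stats.friedmanchisquare([7, 9, 8, 6], [5, 6, 7, 5], [4, 5, 3, 4])
        print("Friedman:", stat, p)

        # chi-square goodness-of-fit of observed counts against a uniform expectation
        stat, p = stats.chisquare(f_obs=[18, 22, 20, 40], f_exp=[25, 25, 25, 25])
        print("chi-square:", stat, p)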

  9. WISE: Automated support for software project management and measurement. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Ramakrishnan, Sudhakar

    1995-01-01

    One important aspect of software development and IV&V is measurement. Unless a software development effort is measured in some way, it is difficult to judge the effectiveness of current efforts and predict future performance. Collection of metrics and adherence to a process are difficult tasks in a software project. Change activity is a powerful indicator of project status. Automated systems that can handle change requests, issues, and other process documents provide an excellent platform for tracking the status of the project. A World Wide Web based architecture is developed for (a) making metrics collection an implicit part of the software process, (b) providing metric analysis dynamically, (c) supporting automated tools that can complement current practices of in-process improvement, and (d) overcoming geographical barriers. An operational system (WISE) instantiates this architecture, allowing for the improvement of the software process in a realistic environment. The tool tracks issues in the software development process, provides informal communication between users with different roles, supports to-do lists (TDL), and helps in software process improvement. WISE minimizes the time devoted to metrics collection and analysis, and captures software change data. Automated tools like WISE focus on understanding and managing the software process. The goal is improvement through measurement.

  10. Generalized implementation of software safety policies

    NASA Technical Reports Server (NTRS)

    Knight, John C.; Wika, Kevin G.

    1994-01-01

    As part of a research program in the engineering of software for safety-critical systems, we are performing two case studies. The first case study, which is well underway, is a safety-critical medical application. The second, which is just starting, is a digital control system for a nuclear research reactor. Our goal is to use these case studies to permit us to obtain a better understanding of the issues facing developers of safety-critical systems, and to provide a vehicle for the assessment of research ideas. The case studies are not based on the analysis of existing software development by others. Instead, we are attempting to create software for new and novel systems in a process that ultimately will involve all phases of the software lifecycle. In this abstract, we summarize our results to date in a small part of this project, namely the determination and classification of policies related to software safety that must be enforced to ensure safe operation. We hypothesize that this classification will permit a general approach to the implementation of a policy enforcement mechanism.

  11. BAM/DASS: Data Analysis Software for Sub-Microarcsecond Astrometry Device

    NASA Astrophysics Data System (ADS)

    Gardiol, D.; Bonino, D.; Lattanzi, M. G.; Riva, A.; Russo, F.

    2010-12-01

    The INAF - Osservatorio Astronomico di Torino is part of the Data Processing and Analysis Consortium (DPAC) for Gaia, a cornerstone mission of the European Space Agency. Gaia will perform global astrometry by means of two telescopes looking at the sky along two different lines of sight oriented at a fixed angle, also called the basic angle. Knowledge of the basic angle fluctuations at the sub-microarcsecond level over periods on the order of a minute is crucial to reach the mission goals. A specific device, the Basic Angle Monitoring, will be dedicated to this purpose. We present here the software system we are developing to analyze the BAM data and recover the basic angle variations. This tool is integrated into the whole DPAC data analysis software.

  12. Man-machine Integration Design and Analysis System (MIDAS) Task Loading Model (TLM) experimental and software detailed design report

    NASA Technical Reports Server (NTRS)

    Staveland, Lowell

    1994-01-01

    This is the experimental and software detailed design report for the prototype task loading model (TLM) developed as part of the man-machine integration design and analysis system (MIDAS), as implemented and tested in phase 6 of the Army-NASA Aircrew/Aircraft Integration (A3I) Program. The A3I program is an exploratory development effort to advance the capabilities and use of computational representations of human performance and behavior in the design, synthesis, and analysis of manned systems. The MIDAS TLM computationally models the demands designs impose on operators to aid engineers in the conceptual design of aircraft crewstations. This report describes the TLM and the results of a series of experiments which were run during this phase to test its capabilities as a predictive task demand modeling tool. Specifically, it includes discussions of: the inputs and outputs of the TLM, the theories underlying it, the results of the test experiments, the use of the TLM as both a stand-alone tool and part of a complete human operator simulation, and a brief introduction to the TLM software design.

  13. Automated Software Development Workstation (ASDW)

    NASA Technical Reports Server (NTRS)

    Fridge, Ernie

    1990-01-01

    Software development is a serious bottleneck in the construction of complex automated systems. An increase of the reuse of software designs and components has been viewed as a way to relieve this bottleneck. One approach to achieving software reusability is through the development and use of software parts composition systems. A software parts composition system is a software development environment comprised of a parts description language for modeling parts and their interfaces, a catalog of existing parts, a composition editor that aids a user in the specification of a new application from existing parts, and a code generator that takes a specification and generates an implementation of a new application in a target language. The Automated Software Development Workstation (ASDW) is an expert system shell that provides the capabilities required to develop and manipulate these software parts composition systems. The ASDW is now in Beta testing at the Johnson Space Center. Future work centers on responding to user feedback for capability and usability enhancement, expanding the scope of the software lifecycle that is covered, and in providing solutions to handling very large libraries of reusable components.

  14. Road embankment and slope stabilization.

    DOT National Transportation Integrated Search

    2010-07-31

    This report and the accompanying software are part of efforts to improve the characterization and analysis of pile-stabilized slopes using one or two rows of driven piles. A combination of the limit equilibrium analysis and strain wedge (SW) model...

  15. Wildlife software: procedures for publication of computer software

    USGS Publications Warehouse

    Samuel, M.D.

    1990-01-01

    Computers and computer software have become an integral part of the practice of wildlife science. Computers now play an important role in teaching, research, and management applications. Because of the specialized nature of wildlife problems, specific computer software is usually required to address a given problem (e.g., home range analysis). This type of software is not usually available from commercial vendors and therefore must be developed by those wildlife professionals with particular skill in computer programming. Current journal publication practices generally prevent a detailed description of computer software associated with new techniques. In addition, peer review of journal articles does not usually include a review of associated computer software. Thus, many wildlife professionals are usually unaware of computer software that would meet their needs or of major improvements in software they commonly use. Indeed most users of wildlife software learn of new programs or important changes only by word of mouth.

  16. Practical research on the teaching of Optical Design

    NASA Astrophysics Data System (ADS)

    Fan, Changjiang; Ren, Zhijun; Ying, Chaofu; Peng, Baojin

    2017-08-01

    Optical design, together with applied optics, forms a complete system from basic theory to application theory, and it plays a very important role in professional education. In order to improve senior undergraduates' understanding of optical design, this course is divided into three parts: theoretical knowledge, software design and product processing. Through learning theoretical knowledge, students can master aberration theory and the design principles of typical optical systems. By using ZEMAX (an imaging design software), TRACEPRO (a lighting optical design software), and SOLIDWORKS or PROE (mechanical design software), students can establish a complete model of an optical system. Students can use the carving machine located in the lab, or at cooperative units, to process the model. Through the above three parts, students learn necessary practical knowledge and improve their learning and analysis abilities; they also get enough practice to develop their creative abilities, so that they can gradually change from learners of scientific theory into optics engineers.

  17. Stability analysis using SDSA tool

    NASA Astrophysics Data System (ADS)

    Goetzendorf-Grabowski, Tomasz; Mieszalski, Dawid; Marcinkiewicz, Ewa

    2011-11-01

    The SDSA (Simulation and Dynamic Stability Analysis) application is presented as a tool for analysing the dynamic characteristics of an aircraft as early as the conceptual design stage. SDSA is part of the CEASIOM (Computerized Environment for Aircraft Synthesis and Integrated Optimization Methods) software environment, which was developed within the SimSAC (Simulating Aircraft Stability And Control Characteristics for Use in Conceptual Design) project, funded by the European Commission 6th Framework Program. SDSA can also be used as stand-alone software and integrated with other design and optimisation systems using software wrappers. This paper focuses on the main functionalities of SDSA and presents both computational and free-flight experimental results to compare and validate the presented software. Two aircraft are considered, the EADS Ranger 2000 and the Warsaw University designed PW-6 glider. For the two cases considered here, the SDSA software is shown to be an excellent tool for predicting the dynamic characteristics of an aircraft.

  18. Recent Advances in Multidisciplinary Analysis and Optimization, part 3

    NASA Technical Reports Server (NTRS)

    Barthelemy, Jean-Francois M. (Editor)

    1989-01-01

    This three-part document contains a collection of technical papers presented at the Second NASA/Air Force Symposium on Recent Advances in Multidisciplinary Analysis and Optimization, held September 28-30, 1988 in Hampton, Virginia. The topics covered include: aircraft design, aeroelastic tailoring, control of aeroelastic structures, dynamics and control of flexible structures, structural design, design of large engineering systems, application of artificial intelligence, shape optimization, software development and implementation, and sensitivity analysis.

  19. Recent Advances in Multidisciplinary Analysis and Optimization, part 2

    NASA Technical Reports Server (NTRS)

    Barthelemy, Jean-Francois M. (Editor)

    1989-01-01

    This three-part document contains a collection of technical papers presented at the Second NASA/Air Force Symposium on Recent Advances in Multidisciplinary Analysis and Optimization, held September 28-30, 1988 in Hampton, Virginia. The topics covered include: helicopter design, aeroelastic tailoring, control of aeroelastic structures, dynamics and control of flexible structures, structural design, design of large engineering systems, application of artificial intelligence, shape optimization, software development and implementation, and sensitivity analysis.

  20. Recent Advances in Multidisciplinary Analysis and Optimization, part 1

    NASA Technical Reports Server (NTRS)

    Barthelemy, Jean-Francois M. (Editor)

    1989-01-01

    This three-part document contains a collection of technical papers presented at the Second NASA/Air Force Symposium on Recent Advances in Multidisciplinary Analysis and Optimization, held September 28-30, 1988 in Hampton, Virginia. The topics covered include: helicopter design, aeroelastic tailoring, control of aeroelastic structures, dynamics and control of flexible structures, structural design, design of large engineering systems, application of artificial intelligence, shape optimization, software development and implementation, and sensitivity analysis.

  1. SIMA: Python software for analysis of dynamic fluorescence imaging data.

    PubMed

    Kaifosh, Patrick; Zaremba, Jeffrey D; Danielson, Nathan B; Losonczy, Attila

    2014-01-01

    Fluorescence imaging is a powerful method for monitoring dynamic signals in the nervous system. However, analysis of dynamic fluorescence imaging data remains burdensome, in part due to the shortage of available software tools. To address this need, we have developed SIMA, an open source Python package that facilitates common analysis tasks related to fluorescence imaging. Functionality of this package includes correction of motion artifacts occurring during in vivo imaging with laser-scanning microscopy, segmentation of imaged fields into regions of interest (ROIs), and extraction of signals from the segmented ROIs. We have also developed a graphical user interface (GUI) for manual editing of the automatically segmented ROIs and automated registration of ROIs across multiple imaging datasets. This software has been designed with flexibility in mind to allow for future extension with different analysis methods and potential integration with other packages. Software, documentation, and source code for the SIMA package and ROI Buddy GUI are freely available at http://www.losonczylab.org/sima/.
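
    As a rough illustration of the kind of processing such a package automates, the sketch below performs rigid motion correction by phase correlation and extracts a mean ROI signal with NumPy; it is a minimal stand-in written for this summary, not SIMA's actual API.

        import numpy as np

        def register_frame(frame, reference):
            """Estimate a rigid (y, x) shift by phase correlation and undo it."""
            cross = np.fft.ifft2(np.fft.fft2(reference) * np.conj(np.fft.fft2(frame)))
            dy, dx = np.unravel_index(np.argmax(np.abs(cross)), cross.shape)
            # Interpret shifts beyond half the frame as negative displacements.
            dy = dy - frame.shape[0] if dy > frame.shape[0] // 2 else dy
            dx = dx - frame.shape[1] if dx > frame.shape[1] // 2 else dx
            return np.roll(frame, (dy, dx), axis=(0, 1))

        def extract_signal(movie, roi_mask):
            """Average fluorescence inside one ROI for each corrected frame."""
            reference = movie.mean(axis=0)
            corrected = np.stack([register_frame(f, reference) for f in movie])
            return corrected[:, roi_mask].mean(axis=1)

        movie = np.random.rand(100, 64, 64)      # frames x rows x columns
        roi = np.zeros((64, 64), dtype=bool)
        roi[20:30, 20:30] = True                 # a hand-drawn square ROI
        trace = extract_signal(movie, roi)       # one value per frame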

  2. 22 CFR 121.8 - End-items, components, accessories, attachments, parts, firmware, software, and systems.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ..., parts, firmware, software, and systems. 121.8 Section 121.8 Foreign Relations DEPARTMENT OF STATE...-items, components, accessories, attachments, parts, firmware, software, and systems. (a) An end-item is.... Firmware includes but is not limited to circuits into which software has been programmed. (f) Software...

  3. 22 CFR 121.8 - End-items, components, accessories, attachments, parts, firmware, software and systems.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ..., parts, firmware, software and systems. 121.8 Section 121.8 Foreign Relations DEPARTMENT OF STATE...-items, components, accessories, attachments, parts, firmware, software and systems. (a) An end-item is.... Firmware includes but is not limited to circuits into which software has been programmed. (f) Software...

  4. 22 CFR 121.8 - End-items, components, accessories, attachments, parts, firmware, software and systems.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ..., parts, firmware, software and systems. 121.8 Section 121.8 Foreign Relations DEPARTMENT OF STATE...-items, components, accessories, attachments, parts, firmware, software and systems. (a) An end-item is.... Firmware includes but is not limited to circuits into which software has been programmed. (f) Software...

  5. 22 CFR 121.8 - End-items, components, accessories, attachments, parts, firmware, software and systems.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ..., parts, firmware, software and systems. 121.8 Section 121.8 Foreign Relations DEPARTMENT OF STATE...-items, components, accessories, attachments, parts, firmware, software and systems. (a) An end-item is.... Firmware includes but is not limited to circuits into which software has been programmed. (f) Software...

  6. 22 CFR 121.8 - End-items, components, accessories, attachments, parts, firmware, software and systems.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ..., parts, firmware, software and systems. 121.8 Section 121.8 Foreign Relations DEPARTMENT OF STATE...-items, components, accessories, attachments, parts, firmware, software and systems. (a) An end-item is.... Firmware includes but is not limited to circuits into which software has been programmed. (f) Software...

  7. Advanced software development workstation project: Engineering scripting language. Graphical editor

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Software development is widely considered to be a bottleneck in the development of complex systems, both in the initial development and in the maintenance of deployed systems. The cost of software development and maintenance can also be very high. One approach to reducing costs and relieving this bottleneck is increasing the reuse of software designs and software components. A method for achieving such reuse is a software parts composition system. Such a system consists of a language for modeling software parts and their interfaces, a catalog of existing parts, an editor for combining parts, and a code generator that takes a specification and generates code for that application in the target language. The Advanced Software Development Workstation is an expert system shell intended to provide the capabilities of a software parts composition system.

  8. Harnessing the power of emerging petascale platforms

    NASA Astrophysics Data System (ADS)

    Mellor-Crummey, John

    2007-07-01

    As part of the US Department of Energy's Scientific Discovery through Advanced Computing (SciDAC-2) program, science teams are tackling problems that require computational simulation and modeling at the petascale. A grand challenge for computer science is to develop software technology that makes it easier to harness the power of these systems to aid scientific discovery. As part of its activities, the SciDAC-2 Center for Scalable Application Development Software (CScADS) is building open source software tools to support efficient scientific computing on the emerging leadership-class platforms. In this paper, we describe two tools for performance analysis and tuning that are being developed as part of CScADS: a tool for analyzing scalability and performance, and a tool for optimizing loop nests for better node performance. We motivate these tools by showing how they apply to S3D, a turbulent combustion code under development at Sandia National Laboratories. For S3D, our node performance analysis tool helped uncover several performance bottlenecks. Using our loop nest optimization tool, we transformed S3D's most costly loop nest to reduce execution time by a factor of 2.94 for a processor working on a 50^3 domain.

  9. Object-oriented design of medical imaging software.

    PubMed

    Ligier, Y; Ratib, O; Logean, M; Girard, C; Perrier, R; Scherrer, J R

    1994-01-01

    A special software package for interactive display and manipulation of medical images was developed at the University Hospital of Geneva, as part of a hospital wide Picture Archiving and Communication System (PACS). This software package, called Osiris, was especially designed to be easily usable and adaptable to the needs of noncomputer-oriented physicians. The Osiris software has been developed to allow the visualization of medical images obtained from any imaging modality. It provides generic manipulation tools, processing tools, and analysis tools more specific to clinical applications. This software, based on an object-oriented paradigm, is portable and extensible. Osiris is available on two different operating systems: the Unix X-11/OSF-Motif based workstations, and the Macintosh family.

  10. Reusable software parts and the semi-abstract data type

    NASA Technical Reports Server (NTRS)

    Cohen, Sanford G.

    1986-01-01

    The development of reusable software parts has been an area of intense discussion within the software community for many years. An approach is described for developing reusable parts for the applications of missile guidance, navigation, and control which meet the following criteria: (1) reusable; (2) tailorable; (3) efficient; (4) simple to use; and (5) protected against misuse. Validating the feasibility of developing reusable parts which possess these characteristics is the basis of the Common Ada Missile Packages Program (CAMP). Under CAMP, over 200 reusable software parts were developed, including parts for navigation, Kalman filtering, signal processing, and autopilots. Six different methods are presented for designing reusable software parts.
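
    As an illustration of what a tailorable, reusable part might look like in a modern language (the CAMP parts themselves were written in Ada), here is a minimal scalar Kalman filter whose noise parameters are the tailoring points; the design is a hypothetical sketch, not taken from CAMP.

        class ScalarKalmanFilter:
            """A reusable estimation 'part': tailor it via the noise parameters."""

            def __init__(self, process_var, measurement_var,
                         initial_state=0.0, initial_var=1.0):
                self.q = process_var        # process noise variance (tailorable)
                self.r = measurement_var    # measurement noise variance (tailorable)
                self.x = initial_state      # state estimate
                self.p = initial_var        # estimate variance

            def update(self, measurement):
                # Predict: the state is modeled as constant, so only variance grows.
                self.p += self.q
                # Correct: blend prediction and measurement by the Kalman gain.
                k = self.p / (self.p + self.r)
                self.x += k * (measurement - self.x)
                self.p *= (1.0 - k)
                return self.x

        # Tailoring for a noisy altitude sensor (illustrative values).
        kf = ScalarKalmanFilter(process_var=1e-4, measurement_var=0.25)
        for z in [1.2, 0.9, 1.1, 1.0]:
            estimate = kf.update(z)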

  11. Quality measures and assurance for AI (Artificial Intelligence) software

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1988-01-01

    This report is concerned with the application of software quality and evaluation measures to AI software and, more broadly, with the question of quality assurance for AI software. Considered are not only the metrics that attempt to measure some aspect of software quality, but also the methodologies and techniques (such as systematic testing) that attempt to improve some dimension of quality, without necessarily quantifying the extent of the improvement. The report is divided into three parts. Part 1 reviews existing software quality measures, i.e., those that have been developed for, and applied to, conventional software. Part 2 considers the characteristics of AI software, the applicability and potential utility of measures and techniques identified in the first part, and reviews those few methods developed specifically for AI software. Part 3 presents an assessment and recommendations for the further exploration of this important area.

  12. Design, implementation and practice of JBEI-ICE: an open source biological part registry platform and tools.

    PubMed

    Ham, Timothy S; Dmytriv, Zinovii; Plahar, Hector; Chen, Joanna; Hillson, Nathan J; Keasling, Jay D

    2012-10-01

    The Joint BioEnergy Institute Inventory of Composable Elements (JBEI-ICE) is an open source registry platform for managing information about biological parts. It is capable of recording information about 'legacy' parts, such as plasmids, microbial host strains and Arabidopsis seeds, as well as DNA parts in various assembly standards. ICE is built on the idea of a web of registries and thus provides strong support for distributed interconnected use. The information deposited in an ICE installation instance is accessible both via a web browser and through the web application programming interfaces, which allow automated access to parts via third-party programs. JBEI-ICE includes several useful web browser-based graphical applications for sequence annotation, manipulation and analysis that are also open source. As with open source software, users are encouraged to install, use and customize JBEI-ICE and its components for their particular purposes. Through its web application programming interface, ICE provides well-developed parts storage functionality for other synthetic biology software projects. A public instance is available at public-registry.jbei.org, where users can try out features, upload parts or simply use it for their projects. The ICE software suite is available via Google Code, a hosting site for community-driven open source projects.
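
    The abstract notes that parts are accessible to third-party programs through web APIs; a request along the following lines illustrates the idea, though the endpoint path, header name, and part identifier shown here are assumptions for illustration rather than ICE's documented interface.

        import requests

        BASE_URL = "https://public-registry.jbei.org"   # public instance named above

        def fetch_part(part_id, token):
            """Retrieve one part record as JSON (hypothetical endpoint and fields)."""
            response = requests.get(
                f"{BASE_URL}/rest/parts/{part_id}",      # assumed path, illustrative
                headers={"X-ICE-Authentication-SessionId": token},  # assumed header
                timeout=10,
            )
            response.raise_for_status()
            return response.json()

        # record = fetch_part("JBx_000001", token="...")  # hypothetical part ID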

  13. Sample registration software for process automation in the Neutron Activation Analysis (NAA) Facility in Malaysia nuclear agency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rahman, Nur Aira Abd, E-mail: nur-aira@nuclearmalaysia.gov.my; Yussup, Nolida; Ibrahim, Maslina Bt. Mohd

    Neutron Activation Analysis (NAA) has been established in Nuclear Malaysia since the 1980s. Most of the established procedures, including sample registration, were carried out manually. The samples were recorded manually in a logbook and given an ID number. All samples, standards, SRMs, and blanks were then recorded on the irradiation vial and on several forms prior to irradiation. These manual procedures carried out by the NAA laboratory personnel were time consuming and inefficient. Sample registration software was developed as part of the IAEA/CRP project on 'Development of Process Automation in the Neutron Activation Analysis (NAA) Facility in Malaysia Nuclear Agency (RC17399)'. The objective of the project is to create PC-based data entry software for the sample preparation stage. This is an effective way to replace the redundant manual data entries that had to be completed by laboratory personnel. The software automatically generates a sample code for each sample in a batch, creates printable registration forms for administrative purposes, and stores selected parameters that are passed to the sample analysis program. The software is developed using National Instruments LabVIEW 8.6.

  14. A taxonomy and discussion of software attack technologies

    NASA Astrophysics Data System (ADS)

    Banks, Sheila B.; Stytz, Martin R.

    2005-03-01

    Software is a complex thing. It is not an engineering artifact that springs forth from a design by simply following software coding rules; creativity and the human element are at the heart of the process. Software development is part science, part art, and part craft. Design, architecture, and coding are equally important activities, and in each of these activities errors may be introduced that lead to security vulnerabilities. Therefore, inevitably, errors enter into the code. Some of these errors are discovered during testing; however, some are not. The best way to find security errors, whether they are introduced as part of the architecture development effort or the coding effort, is to automate the security testing process to the maximum extent possible and add this class of tools to those already available for the compilation process, testing, test analysis, and software distribution. Recent technological advances, improvements in computer-generated forces (CGFs), and results in research in information assurance and software protection indicate that we can build a semi-intelligent software security testing tool. However, before we can undertake the security testing automation effort, we must understand the scope of the required testing, the security failures that need to be uncovered during testing, and the characteristics of the failures. Therefore, we undertook the research reported in the paper, which is the development of a taxonomy and a discussion of software attacks generated from the point of view of the security tester, with the goal of using the taxonomy to guide the development of the knowledge base for the automated security testing tool. The representation for attacks and threat cases yielded by this research captures the strategies, tactics, and other considerations that come into play during the planning and execution of attacks upon application software. The paper is organized as follows. Section one contains an introduction to our research and a discussion of the motivation for our work. Section two presents our taxonomy of software attacks and a discussion of the strategies employed and general weaknesses exploited for each attack. Section three contains a summary and suggestions for further research.

  15. Modeling of short fiber reinforced injection moulded composite

    NASA Astrophysics Data System (ADS)

    Kulkarni, A.; Aswini, N.; Dandekar, C. R.; Makhe, S.

    2012-09-01

    A micromechanics-based finite element model (FEM) is developed to facilitate the design of a new production-quality fiber reinforced plastic injection molded part. The composite part under study is composed of a polyetheretherketone (PEEK) matrix reinforced with a 30% volume fraction of short carbon fibers. The constitutive material models are obtained by using micromechanics-based homogenization theories. The analysis is carried out by successfully coupling two commercial codes, Moldflow and ANSYS. The Moldflow software is used to predict the fiber orientation by considering the flow kinetics and molding parameters. Material models are input into the commercial software ANSYS according to the predicted fiber orientation, and the structural analysis is carried out. Thus, in the present approach, a coupling between the two commercial codes Moldflow and ANSYS has been established to enable the analysis of short fiber reinforced injection moulded composite parts. The load-deflection curve is obtained for three constitutive material models, namely isotropy, transverse isotropy, and orthotropy. Average values of the predicted quantities are compared to experimental results, obtaining a good correlation. In this manner, the coupled Moldflow-ANSYS model successfully predicts the load-deflection curve of a composite injection molded part.
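
    For orientation, the classical Voigt/Reuss rules of mixtures give first-order estimates of the homogenized stiffness of such a composite; this is a textbook simplification rather than the paper's homogenization model, and the property values below are illustrative assumptions.

        def rule_of_mixtures(E_fiber, E_matrix, v_fiber):
            """Voigt (longitudinal) and Reuss (transverse) stiffness estimates."""
            v_matrix = 1.0 - v_fiber
            E_long = v_fiber * E_fiber + v_matrix * E_matrix            # Voigt bound
            E_trans = 1.0 / (v_fiber / E_fiber + v_matrix / E_matrix)   # Reuss bound
            return E_long, E_trans

        # Illustrative values: carbon fiber ~230 GPa, PEEK matrix ~3.6 GPa, Vf = 0.30.
        E_L, E_T = rule_of_mixtures(E_fiber=230.0, E_matrix=3.6, v_fiber=0.30)
        print(f"E_longitudinal = {E_L:.1f} GPa, E_transverse = {E_T:.1f} GPa")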

  16. Image analysis library software development

    NASA Technical Reports Server (NTRS)

    Guseman, L. F., Jr.; Bryant, J.

    1977-01-01

    The Image Analysis Library consists of a collection of general purpose mathematical/statistical routines and special purpose data analysis/pattern recognition routines basic to the development of image analysis techniques for support of current and future Earth Resources Programs. Work was done to provide a collection of computer routines and associated documentation which form a part of the Image Analysis Library.

  17. Phonetics and Technology in the Classroom: A Practical Approach to Using Speech Analysis Software in Second-Language Pronunciation Instruction

    ERIC Educational Resources Information Center

    Olsen, Daniel J.

    2014-01-01

    While speech analysis technology has become an integral part of phonetic research, and to some degree is used in language instruction at the most advanced levels, it appears to be mostly absent from the beginning levels of language instruction. In part, the lack of incorporation into the language classroom can be attributed to both the lack of…

  18. [Design and Realization of Personalized Corneal Analysis Software Based on Corneal Topography System].

    PubMed

    Huang, Xueping; Xie, Zhonghao; Cen, Qin; Zheng, Suilian

    2016-08-01

    As the most important refractive part of the optical system of the eye, the cornea possesses characteristics that are important parameters in clinical ophthalmic surgery. During the measurement of the cornea in our study, we acquired the corneal data from an Orbscan Ⅱ corneal topographer in real time using the Hook technology under Windows, and then imported the data into the corneal analysis software. We further analyzed and calculated the data to obtain individual Q-values for all 360 corneal semi-meridians. The corneal analysis software used Visual C++ 6.0 as the development environment, used OpenGL graphics technology to draw a three-dimensional individual corneal morphological map and the distribution curve of the Q-values, and achieved real-time corneal data query. It can be concluded that the analysis further extends the function of the corneal topography system and provides a solid foundation for the further study of automatic screening for corneal diseases.

  19. Generalized Support Software: Domain Analysis and Implementation

    NASA Technical Reports Server (NTRS)

    Stark, Mike; Seidewitz, Ed

    1995-01-01

    For the past five years, the Flight Dynamics Division (FDD) at NASA's Goddard Space Flight Center has been carrying out a detailed domain analysis effort and is now beginning to implement Generalized Support Software (GSS) based on this analysis. GSS is part of the larger Flight Dynamics Distributed System (FDDS) and is designed to run under the FDDS User Interface / Executive (UIX). The FDD is transitioning from a mainframe-based environment to systems running on engineering workstations. The GSS will be a library of highly reusable components that may be configured within the standard FDDS architecture to quickly produce low-cost satellite ground support systems. The estimate for the first release is that this library will contain approximately 200,000 lines of code. The main driver for developing generalized software is development cost and schedule improvement. The goal is to ultimately have at least 80 percent of all software required for a spacecraft mission (within the domain supported by the GSS) configured from the generalized components.

  20. Spreadsheets for Analyzing and Optimizing Space Missions

    NASA Technical Reports Server (NTRS)

    Some, Raphael R.; Agrawal, Anil K.; Czikmantory, Akos J.; Weisbin, Charles R.; Hua, Hook; Neff, Jon M.; Cowdin, Mark A.; Lewis, Brian S.; Iroz, Juana; Ross, Rick

    2009-01-01

    XCALIBR (XML Capability Analysis LIBRary) is a set of Extensible Markup Language (XML) database and spreadsheet-based analysis software tools designed to assist in technology-return-on-investment analysis and optimization of technology portfolios pertaining to outer-space missions. XCALIBR is also being examined for use in planning, tracking, and documentation of projects. An XCALIBR database contains information on mission requirements and technological capabilities, which are related by use of an XML taxonomy. XCALIBR incorporates a standardized interface for exporting data and analysis templates to an Excel spreadsheet. Unique features of XCALIBR include the following: It is inherently hierarchical by virtue of its XML basis. The XML taxonomy codifies a comprehensive data structure and data dictionary that includes performance metrics for spacecraft, sensors, and spacecraft systems other than sensors. The taxonomy contains >700 nodes representing all levels, from system through subsystem to individual parts. All entries are searchable and machine readable. There is an intuitive Web-based user interface. The software automatically matches technologies to mission requirements. The software automatically generates, and makes the required entries in, an Excel return-on-investment analysis software tool. The results of an analysis are presented in both tabular and graphical displays.
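
    The matching of technologies to mission requirements via an XML taxonomy can be sketched as follows; the element names and the scoring rule are invented for illustration and are not XCALIBR's actual schema.

        import xml.etree.ElementTree as ET

        TAXONOMY = ET.fromstring("""
        <taxonomy>
          <technology name="ir-sensor">
            <capability metric="resolution_m" value="0.5"/>
          </technology>
          <technology name="uv-sensor">
            <capability metric="resolution_m" value="2.0"/>
          </technology>
        </taxonomy>
        """)

        def match(metric, required):
            """Return technologies whose capability meets the requirement."""
            hits = []
            for tech in TAXONOMY.iter("technology"):
                for cap in tech.iter("capability"):
                    if cap.get("metric") == metric and float(cap.get("value")) <= required:
                        hits.append(tech.get("name"))
            return hits

        print(match("resolution_m", required=1.0))   # -> ['ir-sensor']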

  1. Software reliability through fault-avoidance and fault-tolerance

    NASA Technical Reports Server (NTRS)

    Vouk, Mladen A.; Mcallister, David F.

    1993-01-01

    Strategies and tools for the testing, risk assessment and risk control of dependable software-based systems were developed. Part of this project consists of studies to enable the transfer of technology to industry, for example the risk management techniques for safety-conscious systems. Theoretical investigations of the Boolean and Relational Operator (BRO) testing strategy were conducted for condition-based testing. The Basic Graph Generation and Analysis tool (BGG) was extended to fully incorporate several variants of the BRO metric. Single- and multi-phase risk, coverage and time-based models are being developed to provide additional theoretical and empirical basis for estimation of the reliability and availability of large, highly dependable software. A model for software process and risk management was developed. The use of cause-effect graphing for software specification and validation was investigated. Lastly, advanced software fault-tolerance models were studied to provide alternatives and improvements in situations where simple software fault-tolerance strategies break down.

  2. Statistics of software vulnerability detection in certification testing

    NASA Astrophysics Data System (ADS)

    Barabanov, A. V.; Markov, A. S.; Tsirlov, V. L.

    2018-05-01

    The paper discusses practical aspects of introducing software vulnerability detection methods into the day-to-day activities of an accredited testing laboratory. It presents the results of trialling these methods on open source software and on software submitted as a test object for certification testing under information security requirements, including software for communication networks. Results are given showing the distribution of identified vulnerabilities by type of attack, country of origin, programming language used in development, vulnerability detection method, etc. The experience of foreign information security certification systems with detecting vulnerabilities in certified software is analyzed. The main conclusion of the study is the need to adopt secure software development practices in the development life cycle processes. Conclusions and recommendations for testing laboratories on implementing the vulnerability analysis methods are laid down.

  3. A survey of tools for the analysis of quantitative PCR (qPCR) data.

    PubMed

    Pabinger, Stephan; Rödiger, Stefan; Kriegner, Albert; Vierlinger, Klemens; Weinhäusel, Andreas

    2014-09-01

    Real-time quantitative polymerase-chain-reaction (qPCR) is a standard technique in most laboratories used for various applications in basic research. Analysis of qPCR data is a crucial part of the entire experiment, which has led to the development of a plethora of methods. The released tools either cover specific parts of the workflow or provide complete analysis solutions. Here, we surveyed 27 open-access software packages and tools for the analysis of qPCR data. The survey includes 8 Microsoft Windows, 5 web-based, 9 R-based and 5 tools from other platforms. Reviewed packages and tools support the analysis of different qPCR applications, such as RNA quantification, DNA methylation, genotyping, identification of copy number variations, and digital PCR. We report an overview of the functionality, features and specific requirements of the individual software tools, such as data exchange formats, availability of a graphical user interface, included procedures for graphical data presentation, and offered statistical methods. In addition, we provide an overview about quantification strategies, and report various applications of qPCR. Our comprehensive survey showed that most tools use their own file format and only a fraction of the currently existing tools support the standardized data exchange format RDML. To allow a more streamlined and comparable analysis of qPCR data, more vendors and tools need to adapt the standardized format to encourage the exchange of data between instrument software, analysis tools, and researchers.

  4. Modeling and analysis of visual digital impact model for a Chinese human thorax.

    PubMed

    Zhu, Jin; Wang, Kai-Ming; Li, Shu; Liu, Hai-Yan; Jing, Xiao; Li, Xiao-Fang; Liu, Yi-He

    2017-01-01

    To establish a three-dimensional finite element model of the human chest for engineering research on individual protection. Computed tomography (CT) scanning data were used for three-dimensional reconstruction with the medical image reconstruction software Mimics. The finite element method (FEM) preprocessing software ANSYS ICEM CFD was used for cell mesh generation, and the relevant material behavior parameters of all of the model's parts were specified. The finite element model was constructed with the FEM software, and the model availability was verified based on previous cadaver experimental data. A finite element model approximating the anatomical structure of the human chest was established, and the model's simulation results conformed to the results of the cadaver experiment overall. Segment data of the human body and specialized software can be utilized for FEM model reconstruction to satisfy the need for numerical analysis of shocks to the human chest in engineering research on body mechanics.

  5. Methods for cost estimation in software project management

    NASA Astrophysics Data System (ADS)

    Briciu, C. V.; Filip, I.; Indries, I. I.

    2016-02-01

    The speed with which the processes used in the software development field have changed makes the task of forecasting the overall costs for a software project very difficult. Many researchers have considered this task unachievable, but there is a group of scientists for whom this task can be solved using known mathematical methods (e.g., multiple linear regression) and newer techniques such as genetic programming and neural networks. The paper presents a solution for building cost estimation models for software project management using genetic algorithms, starting from the PROMISE datasets related to the COCOMO 81 model. In the first part of the paper, a summary of the major achievements in the research area of finding a model for estimating overall project costs is presented, together with a description of the existing software development process models. In the last part, a basic proposal for a mathematical model based on genetic programming is made, including a description of the chosen fitness function and chromosome representation. The perspective of the described model is linked with the current reality of software development, taking the software product life cycle as a basis together with the current challenges and innovations in the software development area. Based on the authors' experience and the analysis of the existing models and product life cycle, it was concluded that estimation models should be adapted to new technologies and emerging systems, and that they depend largely on the chosen software development method.
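
    For reference, the basic COCOMO 81 model that the PROMISE datasets relate to estimates effort as a power law of code size; a minimal sketch using the published basic-model coefficients:

        # Basic COCOMO 81: effort (person-months) = a * KLOC^b,
        # schedule (months) = c * effort^d, with published coefficients per mode.
        COEFFICIENTS = {
            "organic":      (2.4, 1.05, 2.5, 0.38),
            "semidetached": (3.0, 1.12, 2.5, 0.35),
            "embedded":     (3.6, 1.20, 2.5, 0.32),
        }

        def basic_cocomo(kloc, mode="organic"):
            a, b, c, d = COEFFICIENTS[mode]
            effort = a * kloc ** b          # person-months
            schedule = c * effort ** d      # calendar months
            return effort, schedule

        effort, months = basic_cocomo(32.0, mode="semidetached")
        print(f"{effort:.1f} person-months over {months:.1f} months")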

  6. A coverage and slicing dependencies analysis for seeking software security defects.

    PubMed

    He, Hui; Zhang, Dongyan; Liu, Min; Zhang, Weizhe; Gao, Dongmin

    2014-01-01

    Software security defects have a serious impact on software quality and reliability. It is a major hidden danger for the operation of a system if the software has security flaws. As the scale of the software increases, its vulnerabilities become much more difficult to find, and once these vulnerabilities are exploited, they may lead to great loss. In this situation, the concept of Software Assurance has been put forward by experts, and automated fault localization techniques are part of Software Assurance research. Current automated fault localization methods include coverage-based fault localization (CBFL) and program slicing; both methods have their own localization advantages and defects. In this paper, we put forward a new method, named the Reverse Data Dependence Analysis Model, which integrates the two methods by analyzing the program structure. On this basis, we propose a new automated fault localization method that not only remains fully automated but also reduces the basic localization unit to a single statement, which makes localization more accurate. Through several experiments, we show that our method is more effective, and we analyze the effectiveness of the existing methods on different faults.
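
    CBFL, mentioned above, ranks program entities by how strongly their coverage correlates with failing tests; the well-known Tarantula formula is one concrete instance, shown here for illustration (it is not the paper's own model).

        def tarantula(coverage, outcomes):
            """Rank statements by suspiciousness.

            coverage: dict mapping statement id -> set of test ids covering it
            outcomes: dict mapping test id -> True if the test passed
            """
            passed_total = sum(1 for ok in outcomes.values() if ok)
            failed_total = len(outcomes) - passed_total
            scores = {}
            for stmt, tests in coverage.items():
                p = sum(1 for t in tests if outcomes[t])
                f = len(tests) - p
                pass_ratio = p / passed_total if passed_total else 0.0
                fail_ratio = f / failed_total if failed_total else 0.0
                denom = pass_ratio + fail_ratio
                scores[stmt] = fail_ratio / denom if denom else 0.0
            return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

        cov = {"s1": {1, 2, 3}, "s2": {3}, "s3": {1, 3}}
        out = {1: True, 2: True, 3: False}       # test 3 fails
        print(tarantula(cov, out))                # s2 is most suspicious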

  7. Basic Techniques in Environmental Simulation.

    DTIC Science & Technology

    1982-07-01

    ...the developer is liable for all necessary changes in the model or its supporting computer software. After the 90-day warranty expires, the user... processing unit, that part of a computer which accomplishes arithmetic and logical operations. DCFLOS: Dynamic cloud-free line-of-sight, a simulation... Software Development... Operational Environment, Interfaces, and Constraints... Effectiveness Evaluation, Value Analysis, and...

  8. Models for Threat Assessment in Networks

    DTIC Science & Technology

    2006-09-01

    ...Software International and Command AntiVirus. [Online]. Available: http://www.commandsoftware.com/virus/newlove.html [38] C. Ng and P. Ferrie (2000)... False positive trends across all population sizes for r=0.7 and m=0.1... False negative trends across all population... benefits analysis is often performed to determine the list of mitigation procedures. Traditionally, risk assessment has been done in part with software...

  9. The Sizing and Optimization Language (SOL): A computer language to improve the user/optimizer interface

    NASA Technical Reports Server (NTRS)

    Lucas, S. H.; Scotti, S. J.

    1989-01-01

    The nonlinear mathematical programming method (formal optimization) has had many applications in engineering design. A figure illustrates the use of optimization techniques in the design process. The design process begins with the design problem, such as the classic example of the two-bar truss designed for minimum weight as seen in the leftmost part of the figure. If formal optimization is to be applied, the design problem must be recast in the form of an optimization problem consisting of an objective function, design variables, and constraint function relations. The middle part of the figure shows the two-bar truss design posed as an optimization problem. The total truss weight is the objective function, the tube diameter and truss height are design variables, with stress and Euler buckling considered as constraint function relations. Lastly, the designer develops or obtains analysis software containing a mathematical model of the object being optimized, and then interfaces the analysis routine with existing optimization software such as CONMIN, ADS, or NPSOL. This final stage of software development can be both tedious and error-prone. The Sizing and Optimization Language (SOL), a special-purpose computer language whose goal is to make the software implementation phase of optimum design easier and less error-prone, is presented.
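
    To make the optimization-problem form concrete, here is a sketch of the two-bar truss posed for a numerical optimizer; the geometry, loads, and material constants are assumed textbook values rather than those of the paper, and SciPy stands in for optimizers such as CONMIN, ADS, or NPSOL.

        import numpy as np
        from scipy.optimize import minimize

        # Assumed constants: load P, half-span B, wall thickness t,
        # modulus E, density rho, allowable stress s_max (consistent SI units).
        P, B, t = 150e3, 0.75, 0.003
        E, rho, s_max = 210e9, 7850.0, 250e6

        def weight(x):
            d, H = x                               # tube diameter, truss height
            L = np.hypot(B, H)                     # member length
            return 2.0 * rho * np.pi * d * t * L   # two thin-walled tubes

        def stress(x):
            d, H = x
            L = np.hypot(B, H)
            return P * L / (2.0 * np.pi * d * t * H)

        def buckling_margin(x):
            d, H = x
            L = np.hypot(B, H)
            crit = np.pi**2 * E * (d**2 + t**2) / (8.0 * L**2)  # Euler buckling
            return crit - stress(x)

        result = minimize(
            weight, x0=[0.08, 0.8], method="SLSQP",
            bounds=[(0.01, 0.5), (0.1, 3.0)],
            constraints=[{"type": "ineq", "fun": lambda x: s_max - stress(x)},
                         {"type": "ineq", "fun": buckling_margin}],
        )
        print(result.x, weight(result.x))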

  10. SAMPA: A free software tool for skin and membrane permeation data analysis.

    PubMed

    Bezrouk, Aleš; Fiala, Zdeněk; Kotingová, Lenka; Krulichová, Iva Selke; Kopečná, Monika; Vávrová, Kateřina

    2017-10-01

    Skin and membrane permeation experiments comprise an important step in the development of a transdermal or topical formulation or toxicological risk assessment. The standard method for analyzing these data relies on the linear part of a permeation profile. However, it is difficult to objectively determine when the profile becomes linear, or the experiment duration may be insufficient to reach a maximum or steady state. Here, we present a software tool for Skin And Membrane Permeation data Analysis, SAMPA, that is easy to use and overcomes several of these difficulties. The SAMPA method and software have been validated on in vitro and in vivo permeation data on human, pig and rat skin and model stratum corneum lipid membranes using compounds that range from highly lipophilic polycyclic aromatic hydrocarbons to highly hydrophilic antiviral drug, with and without two permeation enhancers. The SAMPA performance was compared with the standard method using a linear part of the permeation profile and a complex mathematical model. SAMPA is a user-friendly, open-source software tool for analyzing the data obtained from skin and membrane permeation experiments. It runs on a Microsoft Windows platform and is freely available as a Supporting file to this article. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Finite element analysis of container ship's cargo hold using ANSYS and POSEIDON software

    NASA Astrophysics Data System (ADS)

    Tanny, Tania Tamiz; Akter, Naznin; Amin, Osman Md.

    2017-12-01

    Nowadays, ship structural analysis has become an integral part of preliminary ship design, providing further support for the development and detail design of ship structures. Structural analyses of container ships' cargo holds are carried out to balance their safety and capacity, as those ships are exposed to a high risk of structural damage during a voyage. Two different design methodologies have been considered for the structural analysis of a container ship's cargo hold: one is a rule-based methodology, and the other is a more conventional software-based analysis. The rule-based analysis is done with DNV-GL's POSEIDON software, and the conventional package-based analysis is done with the ANSYS structural module. Both methods have been applied to analyze some of the mechanical properties of the model, such as total deformation, stress-strain distribution, von Mises stress, fatigue, etc., following different design bases and approaches, to provide some guidance for further improvements in ship structural design.

  12. EBEX: A Balloon-Borne Telescope for Measuring Cosmic Microwave Background Polarization

    NASA Astrophysics Data System (ADS)

    Chapman, Daniel

    2015-05-01

    EBEX is a long-duration balloon-borne (LDB) telescope designed to probe polarization signals in the cosmic microwave background (CMB). It is designed to measure or place an upper limit on the inflationary B-mode signal, a signal predicted by inflationary theories to be imprinted on the CMB by gravitational waves, to detect the effects of gravitational lensing on the polarization of the CMB, and to characterize polarized Galactic foreground emission. The payload consists of a pointed gondola that houses the optics, polarimetry, detectors and detector readout systems, as well as the pointing sensors, control motors, telemetry sytems, and data acquisition and flight control computers. Polarimetry is achieved with a rotating half-wave plate and wire grid polarizer. The detectors are sensitive to frequency bands centered on 150, 250, and 410 GHz. EBEX was flown in 2009 from New Mexico as a full system test, and then flown again in December 2012 / January 2013 over Antarctica in a long-duration flight to collect scientific data. In the instrumentation part of this thesis we discuss the pointing sensors and attitude determination algorithms. We also describe the real-time map making software, "QuickLook", that was custom-designed for EBEX. We devote special attention to the design and construction of the primary pointing sensors, the star cameras, and their custom-designed flight software package, "STARS" (the Star Tracking Attitude Reconstruction Software). In the analysis part of this thesis we describe the current status of the post-flight analysis procedure. We discuss the data structures used in analysis and the pipeline stages related to attitude determination and map making. We also discuss a custom-designed software framework called "LEAP" (the LDB EBEX Analysis Pipeline) that supports most of the analysis pipeline stages.

  13. Dental application of novel finite element analysis software for three-dimensional finite element modeling of a dentulous mandible from its computed tomography images.

    PubMed

    Nakamura, Keiko; Tajima, Kiyoshi; Chen, Ker-Kong; Nagamatsu, Yuki; Kakigawa, Hiroshi; Masumi, Shin-ich

    2013-12-01

    This study focused on the application of novel finite-element analysis software for constructing a finite-element model from the computed tomography data of a human dentulous mandible. The finite-element model is necessary for evaluating the mechanical response of the alveolar part of the mandible, resulting from occlusal force applied to the teeth during biting. Commercially available patient-specific general computed tomography-based finite-element analysis software was solely applied to the finite-element analysis for the extraction of computed tomography data. The mandibular bone with teeth was extracted from the original images. Both the enamel and the dentin were extracted after image processing, and the periodontal ligament was created from the segmented dentin. The constructed finite-element model was reasonably accurate using a total of 234,644 nodes and 1,268,784 tetrahedral and 40,665 shell elements. The elastic moduli of the heterogeneous mandibular bone were determined from the bone density data of the computed tomography images. The results suggested that the software applied in this study is both useful and powerful for creating a more accurate three-dimensional finite-element model of a dentulous mandible from the computed tomography data without the need for any other software.

  14. Optimisation of process parameters on thin shell part using response surface methodology (RSM) and genetic algorithm (GA)

    NASA Astrophysics Data System (ADS)

    Faiz, J. M.; Shayfull, Z.; Nasir, S. M.; Fathullah, M.; Hazwan, M. H. M.

    2017-09-01

    This study conducts a simulation-based optimisation of injection moulding process parameters using Autodesk Moldflow Insight (AMI) software. Four process parameters, namely melt temperature, mould temperature, packing pressure, and cooling time, are applied in order to analyse the warpage value of the part. The part selected for study is made of Polypropylene (PP). The combinations of the process parameters are analysed using Analysis of Variance (ANOVA), and the optimised values are obtained using Response Surface Methodology (RSM). RSM as well as a Genetic Algorithm (GA) are applied in the Design Expert software in order to minimise the warpage value. The outcome of this study shows that the warpage value is improved by using RSM and GA.
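
    The RSM-plus-GA combination can be sketched generically: fit a quadratic response surface to sampled (parameter, warpage) data, then let a simple GA search the fitted surface for a minimum. Everything below (the synthetic data, parameter ranges, and GA settings) is illustrative, not the study's actual setup.

        import numpy as np

        rng = np.random.default_rng(0)

        def quadratic_features(x):
            """[1, x1, x2, x1^2, x2^2, x1*x2] for a two-parameter surface."""
            x1, x2 = x[..., 0], x[..., 1]
            return np.stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2],
                            axis=-1)

        # Fit the response surface to sampled runs (here: synthetic warpage data).
        samples = rng.uniform([200.0, 60.0], [240.0, 110.0], size=(30, 2))
        warpage = 0.5 + 0.001*(samples[:, 0]-220)**2 + 0.0004*(samples[:, 1]-85)**2
        coeffs, *_ = np.linalg.lstsq(quadratic_features(samples), warpage, rcond=None)

        def predicted_warpage(pop):
            return quadratic_features(pop) @ coeffs

        # Minimal GA: truncation selection, blend crossover, Gaussian mutation.
        pop = rng.uniform([200.0, 60.0], [240.0, 110.0], size=(40, 2))
        for _ in range(100):
            fitness = predicted_warpage(pop)
            parents = pop[np.argsort(fitness)[:20]]                 # best half
            pairs = parents[rng.integers(0, 20, size=(40, 2))]
            alpha = rng.random((40, 1))
            pop = alpha * pairs[:, 0] + (1 - alpha) * pairs[:, 1]   # crossover
            pop += rng.normal(0.0, 0.5, pop.shape)                  # mutation
            pop = np.clip(pop, [200.0, 60.0], [240.0, 110.0])

        best = pop[np.argmin(predicted_warpage(pop))]
        print("optimised melt temperature and packing pressure:", best)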

  15. A Software Designed For STP Data Plot and Analysis Based on Object-oriented Methodology

    NASA Astrophysics Data System (ADS)

    Lina, L.; Murata, K.

    2006-12-01

    In the present study, we design a system named STARS (Solar-Terrestrial data Analysis and Reference System). STARS provides a research environment in which researchers can refer to and analyse a variety of data with a single piece of software. The software design is based on the OMT (Object Modeling Technique). The OMT is one of the object-oriented techniques, and it has advantages in maintainability, reuse, and the long-term development of a system. At the Center for Information Technology, Ehime University, we have already started implementing STARS following our design. The latest version, STARS5, was released in 2006. Any user can download the system from our WWW site (http://www.infonet.cite.ehime-u.ac.jp/STARS). The present paper is mainly devoted to the design of a data analysis software system. Throughout the design, we took care to keep it flexible and applicable when other developers design software for similar purposes; if our model were tailored only to our own purpose, it would be useless to other developers. In designing the domain object model, we carefully removed the parts that depend on system resources, e.g., hardware and software, and put those dependent parts into the application object model. In the present design, therefore, the domain object model and the utility object model are independent of computer resources. This helps other developers construct their own systems based on the present design: they simply modify their own application object models according to their system resources. This division of the design into resource-dependent and resource-independent parts across three object models is one of the advantages of the OMT. If the design of the software is done completely with the OMT, implementation is rather simple and almost automatic: developers simply map their designs onto programs. If one creates another STARS in a different programming language such as Java, the programmer can simply follow the present design as long as the language is object-oriented. Researchers may want to add their own data to STARS; in this case, they simply add their own data class to the domain object model, because any satellite data has properties, such as time or date, that are inherited from the upper class. In this way, their effort is less than with older methodologies. In the OMT, the description format of the system is rather strictly standardized, so when new developers join the STARS project, they only have to understand each model to obtain an overview of STARS; they can then follow the designs and documents to implement the system. The OMT thus makes it easy for a newcomer to join a project that is already running.

  16. Teaching the Structure of Immunoglobulins by Molecular Visualization and SDS-PAGE Analysis

    ERIC Educational Resources Information Center

    Rižner, Tea Lanišnik

    2014-01-01

    This laboratory class combines molecular visualization and laboratory experimentation to teach the structure of the immunoglobulins (Ig). In the first part of the class, the three-dimensional structures of the human IgG and IgM molecules available through the RCSB PDB database are visualized using freely available software. In the second part, IgG…

  17. The software product assurance metrics study: JPL's software systems quality and productivity

    NASA Technical Reports Server (NTRS)

    Bush, Marilyn W.

    1989-01-01

    The findings are reported of the Jet Propulsion Laboratory (JPL)/Software Product Assurance (SPA) Metrics Study, conducted as part of a larger JPL effort to improve software quality and productivity. Until recently, no comprehensive data had been assembled on how JPL manages and develops software-intensive systems. The first objective was to collect data on software development from as many projects and for as many years as possible. Results from five projects are discussed. These results reflect 15 years of JPL software development, representing over 100 data points (systems and subsystems), over a third of a billion dollars, over four million lines of code and 28,000 person months. Analysis of this data provides a benchmark for gauging the effectiveness of past, present and future software development work. In addition, the study is meant to encourage projects to record existing metrics data and to gather future data. The SPA long term goal is to integrate the collection of historical data and ongoing project data with future project estimations.

  18. Process membership in asynchronous environments

    NASA Technical Reports Server (NTRS)

    Ricciardi, Aleta M.; Birman, Kenneth P.

    1993-01-01

    The development of reliable distributed software is simplified by the ability to assume a fail-stop failure model. The emulation of such a model in an asynchronous distributed environment is discussed. The solution proposed, called Strong-GMP, can be supported through a highly efficient protocol, and was implemented as part of a distributed systems software project at Cornell University. The precise definition of the problem, the protocol, correctness proofs, and an analysis of costs are addressed.

  19. Towards a Methodology for Identifying Program Constraints During Requirements Analysis

    NASA Technical Reports Server (NTRS)

    Romo, Lilly; Gates, Ann Q.; Della-Piana, Connie Kubo

    1997-01-01

    Requirements analysis is the activity that involves determining the needs of the customer, identifying the services that the software system should provide, and understanding the constraints on the solution. The result of this activity is a natural language document, typically referred to as the requirements definition document. Some of the problems that exist in defining requirements in large-scale software projects include synthesizing knowledge from various domain experts and communicating this information across multiple levels of personnel. One approach that addresses part of this problem is called context monitoring and involves identifying the properties of and relationships between objects that the system will manipulate. This paper examines several software development methodologies, discusses the support that each provides for eliciting such information from experts and specifying the information, and suggests refinements to these methodologies.

  20. The integration of automated knowledge acquisition with computer-aided software engineering for space shuttle expert systems

    NASA Technical Reports Server (NTRS)

    Modesitt, Kenneth L.

    1990-01-01

    A prediction was made that the terms expert systems and knowledge acquisition would begin to disappear over the next several years. This is not because they are falling into disuse; it is rather that practitioners are realizing that they are valuable adjuncts to software engineering, in terms of problem domains addressed, user acceptance, and in development methodologies. A specific problem was discussed, that of constructing an automated test analysis system for the Space Shuttle Main Engine. In this domain, knowledge acquisition was part of requirements systems analysis, and was performed with the aid of a powerful inductive ESBT in conjunction with a computer aided software engineering (CASE) tool. The original prediction is not a very risky one -- it has already been accomplished.

  1. SACA: Software Assisted Call Analysis--an interactive tool supporting content exploration, online guidance and quality improvement of counseling dialogues.

    PubMed

    Trinkaus, Hans L; Gaisser, Andrea E

    2010-09-01

    Nearly 30,000 individual inquiries are answered annually by the telephone cancer information service (CIS, KID) of the German Cancer Research Center (DKFZ). The aim was to develop a tool for evaluating these calls, and to support the complete counseling process interactively. A novel software tool is introduced, based on a structure similar to a music score. Treating the interaction as a "duet", guided by the CIS counselor, the essential contents of the dialogue are extracted automatically. For this, "trained speech recognition" is applied to the (known) counselor's part, and "keyword spotting" is used on the (unknown) client's part to pick out specific items from the "word streams". The outcomes fill an abstract score representing the dialogue. Pilot tests performed on a prototype of SACA (Software Assisted Call Analysis) resulted in a basic proof of concept: Demographic data as well as information regarding the situation of the caller could be identified. The study encourages following up on the vision of an integrated SACA tool for supporting calls online and performing statistics on its knowledge database offline. Further research perspectives are to check SACA's potential in comparison with established interaction analysis systems like RIAS. Copyright (c) 2010 Elsevier Ireland Ltd. All rights reserved.
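
    A minimal illustration of the keyword-spotting half of this "duet" approach: scan the caller's word stream for items from a topic dictionary and fill slots in an abstract record. The categories and keywords below are invented for illustration.

        KEYWORDS = {
            "diagnosis": {"breast", "lung", "prostate", "leukemia"},
            "topic": {"chemotherapy", "radiation", "nutrition", "trial"},
            "relation": {"myself", "mother", "father", "friend"},
        }

        def spot_keywords(word_stream):
            """Fill an abstract 'score' of the call from the client's words."""
            record = {category: set() for category in KEYWORDS}
            for word in word_stream:
                token = word.lower().strip(".,!?")
                for category, vocabulary in KEYWORDS.items():
                    if token in vocabulary:
                        record[category].add(token)
            return record

        transcript = "My mother just started chemotherapy for breast cancer".split()
        print(spot_keywords(transcript))
        # {'diagnosis': {'breast'}, 'topic': {'chemotherapy'}, 'relation': {'mother'}}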

  2. Modeling and flow analysis of pure nylon polymer for injection molding process

    NASA Astrophysics Data System (ADS)

    Nuruzzaman, D. M.; Kusaseh, N.; Basri, S.; Oumer, A. N.; Hamedon, Z.

    2016-02-01

    In the production of complex plastic parts, injection molding is one of the most popular industrial processes. This paper addresses the modeling and analysis of the flow process of nylon (polyamide) polymer for the injection molding process. To determine the best molding conditions, a series of simulations were carried out using Autodesk Moldflow Insight software, and the processing parameters were adjusted. This commercial mold-filling software simulates the cavity filling pattern along with the temperature and pressure distributions in the mold cavity. In the modeling, different flow parameters during the plastic's flow inside the mold cavity, such as fill time, pressure, temperature, shear rate, and warpage at different locations in the cavity, are analyzed. Overall, Moldflow is able to perform a relatively sophisticated analysis of the flow process of pure nylon. The prediction of the filling of a mold cavity is thus very important, and it is useful before a nylon plastic part is manufactured.

  3. 3D Fiber Orientation Simulation for Plastic Injection Molding

    NASA Astrophysics Data System (ADS)

    Lin, Baojiu; Jin, Xiaoshi; Zheng, Rong; Costa, Franco S.; Fan, Zhiliang

    2004-06-01

    Glass fiber reinforced polymer is widely used in products made using injection molding processing. The distribution of fiber orientation inside plastic parts has direct effects on the quality of molded parts. Using computer simulation to predict fiber orientation distribution is one of the most efficient ways to assist engineers in warpage analysis and in finding a good design solution to produce high-quality plastic parts. Fiber orientation simulation software based on 2-1/2D (midplane/dual-domain mesh) techniques has been used in industry for a decade. However, the 2-1/2D technique is based on the planar Hele-Shaw approximation and is not suitable when the geometry has complex three-dimensional features which cannot be well approximated by 2D shells. Recently, full 3D simulation software for fiber orientation has been developed and integrated into the Moldflow Plastics Insight 3D simulation software. The theory for this new 3D fiber orientation calculation module is described in this paper. Several examples are also presented to show the benefit of using 3D fiber orientation simulation.

  4. Report on Automated Semantic Analysis of Scientific and Engineering Codes

    NASA Technical Reports Server (NTRS)

    Stewart. Maark E. M.; Follen, Greg (Technical Monitor)

    2001-01-01

    The loss of the Mars Climate Orbiter due to a software error reveals what insiders know: software development is difficult and risky because, in part, current practices do not readily handle the complex details of software. Yet, for scientific software development, the MCO mishap represents the tip of the iceberg; few errors are so public, and many errors are avoided with a combination of expertise, care, and testing during development and modification. Further, this effort consumes valuable time and resources even when hardware costs and execution time continually decrease. Software development could use better tools! This lack of tools has motivated the semantic analysis work explained in this report. However, this work has a distinguishing emphasis; the tool focuses on automated recognition of the fundamental mathematical and physical meaning of scientific code. Further, its comprehension is measured by quantitatively evaluating overall recognition with practical codes. This emphasis is necessary if software errors, like the MCO error, are to be quickly and inexpensively avoided in the future. This report evaluates the progress made with this problem. It presents recommendations, describes the approach, the tool's status, the challenges, related research, and a development strategy.

  5. Space tug economic analysis study. Volume 2: Tug concepts analysis. Part 1: Overall approach and data generation

    NASA Technical Reports Server (NTRS)

    1972-01-01

    An economic analysis of space tug operations is presented. The subjects discussed are: (1) data base for orbit injection stages, (2) data base for reusable space tug, (3) performance equations, (4) data integration and interpretation, (5) tug performance and mission model accommodation, (6) total program cost, (7) payload analysis, (8) computer software, and (9) comparison of tug concepts.

  6. Warpage analysis on thin shell part using glowworm swarm optimisation (GSO)

    NASA Astrophysics Data System (ADS)

    Zulhasif, Z.; Shayfull, Z.; Nasir, S. M.; Fathullah, M.; Hazwan, M. H. M.

    2017-09-01

    In this study, the Autodesk Moldflow Insight (AMI) software was used to analyse the plastic injection moulding process by associating the input parameters with the output parameters. Acrylonitrile Butadiene Styrene (ABS) was used as the moulded material to produce the plastic part. MATLAB software was then used to find the best parameter settings. The variables selected in this study were melt temperature, packing pressure, coolant temperature and cooling time.

  7. Wireless, battery-operated data acquisition system for mobile spectrometry applications and (potentially) for the Internet of things

    NASA Astrophysics Data System (ADS)

    Fitzgerald, Ryan; Karanassios, Vassili

    2017-05-01

    There are many applications requiring chemical analysis in the field and analytical results in (near) real-time, for example when accidental spills occur. In other cases, collecting samples in the field followed by analysis in a lab increases costs and introduces time delays. In such cases, "bringing part of the lab to the sample" would be ideal. Toward this ideal (and to further reduce size and weight), we developed a relatively inexpensive, battery-operated, wireless data acquisition hardware system around an Arduino Nano micro-controller and a 16-bit ADC (Analog-to-Digital Converter) with a maximum sampling rate of 860 samples/s. The hardware communicates the acquired data using low-power Bluetooth. Software for data acquisition and data display was written in Python. Potential ways of making the hardware-software approach described here a part of the Internet of Things (IoT) are presented.
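
    A minimal host-side reader for this kind of link might look like the sketch below. It assumes, beyond what the abstract states, that the Bluetooth connection is exposed as a serial port and that each sample arrives as a signed 16-bit little-endian integer; the port name and reference voltage are hypothetical placeholders.

    ```python
    # Host-side sketch: read raw 16-bit samples over a Bluetooth serial link.
    import struct

    import serial  # pyserial

    PORT = "/dev/rfcomm0"  # hypothetical Bluetooth serial port
    VREF = 4.096           # hypothetical ADC full-scale voltage

    def read_samples(n=860):
        """Read up to n samples and convert raw counts to volts."""
        with serial.Serial(PORT, baudrate=115200, timeout=2) as link:
            raw = link.read(2 * n)  # two bytes per sample
            n_read = len(raw) // 2
            counts = struct.unpack("<%dh" % n_read, raw[: 2 * n_read])
            return [c * VREF / 32768.0 for c in counts]

    if __name__ == "__main__":
        volts = read_samples()
        print(len(volts), "samples; first five:", volts[:5])
    ```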

  8. Analytical Design Package (ADP2): A computer aided engineering tool for aircraft transparency design

    NASA Technical Reports Server (NTRS)

    Wuerer, J. E.; Gran, M.; Held, T. W.

    1994-01-01

    The Analytical Design Package (ADP2) is being developed as a part of the Air Force Frameless Transparency Program (FTP). ADP2 is an integrated design tool consisting of existing analysis codes and Computer Aided Engineering (CAE) software. The objective of the ADP2 is to develop and confirm an integrated design methodology for frameless transparencies, related aircraft interfaces, and their corresponding tooling. The application of this methodology will generate high confidence for achieving a qualified part prior to mold fabrication. ADP2 is a customized integration of analysis codes, CAE software, and material databases. The primary CAE integration tool for the ADP2 is P3/PATRAN, a commercial-off-the-shelf (COTS) software tool. The open architecture of P3/PATRAN allows customized installations with different applications modules for specific site requirements. Integration of material databases allows the engineer to select a material, and those material properties are automatically called into the relevant analysis code. The ADP2 materials database will be composed of four independent schemas: CAE Design, Processing, Testing, and Logistics Support. The design of ADP2 places major emphasis on the seamless integration of CAE and analysis modules with a single intuitive graphical interface. This tool is being designed to serve and be used by an entire project team, i.e., analysts, designers, materials experts, and managers. The final version of the software will be delivered to the Air Force in Jan. 1994. The Analytical Design Package (ADP2) will then be ready for transfer to industry. The package will be capable of a wide range of design and manufacturing applications.

  9. Self-conscious robotic system design process--from analysis to implementation.

    PubMed

    Chella, Antonio; Cossentino, Massimo; Seidita, Valeria

    2011-01-01

    Developing robotic systems endowed with self-conscious capabilities means realizing complex sub-systems that need ad-hoc software engineering techniques for their modelling, analysis and implementation. In this chapter, the whole process (from analysis to implementation) for modelling the development of self-conscious robotic systems is presented, and the newly created design process, PASSIC, which supports each part of it, is fully illustrated.

  10. Design and Analysis of Turbomachinery for Space Applications

    NASA Technical Reports Server (NTRS)

    Dorney, D.; Garcia, Roberto (Technical Monitor)

    2002-01-01

    This presentation provides an overview of CORSAIR, a three-dimensional computational fluid dynamics software code for the analysis of turbomachinery components available from NASA, and discusses its potential use in the design of these parts. Topics covered include: time-dependent equations of motion, grid topology, turbulence models, boundary conditions, parallel simulations, and miscellaneous capabilities.

  11. Nuclear Fuel Depletion Analysis Using Matlab Software

    NASA Astrophysics Data System (ADS)

    Faghihi, F.; Nematollahi, M. R.

    Coupled first-order initial value problems (IVPs) are frequently encountered in many parts of engineering and the sciences. In this article, we present a code comprising three computer programs, used in conjunction with the Matlab software, to solve and plot the solutions of first-order coupled stiff or non-stiff IVPs. Some engineering and scientific problems related to IVPs are given, and fuel depletion (production of the 239Pu isotope) in a Pressurized Water Reactor (PWR) is computed with the present code.
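
    As a generic illustration of this class of problem (not the authors' Matlab code), the sketch below solves a two-member decay/transmutation chain whose widely separated rate constants make the system stiff; the rate constants themselves are assumed values.

    ```python
    # Two-member chain dN1/dt = -l1*N1, dN2/dt = l1*N1 - l2*N2 as a stiff IVP.
    from scipy.integrate import solve_ivp

    l1, l2 = 1.0e-2, 1.0e-6  # assumed rate constants (1/s); far apart -> stiff

    def chain(t, N):
        N1, N2 = N
        return [-l1 * N1, l1 * N1 - l2 * N2]

    sol = solve_ivp(chain, (0.0, 1.0e6), [1.0, 0.0], method="BDF")  # BDF suits stiff systems
    print("N1, N2 at t_end:", sol.y[:, -1])
    ```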

  12. Demonstration of a Safety Analysis on a Complex System

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy; Alfaro, Liliana; Alvarado, Christine; Brown, Molly; Hunt, Earl B.; Jaffe, Matt; Joslyn, Susan; Pinnell, Denise; Reese, Jon; Samarziya, Jeffrey

    1997-01-01

    For the past 17 years, Professor Leveson and her graduate students have been developing a theoretical foundation for safety in complex systems and building a methodology upon that foundation. The methodology includes special management structures and procedures, system hazard analyses, software hazard analysis, requirements modeling and analysis for completeness and safety, special software design techniques including the design of human-machine interaction, verification, operational feedback, and change analysis. The Safeware methodology is based on system safety techniques that are extended to deal with software and human error. Automation is used to enhance our ability to cope with complex systems. Identification, classification, and evaluation of hazards are done using modeling and analysis. To be effective, the models and analysis tools must consider the hardware, software, and human components in these systems. They also need to include a variety of analysis techniques and orthogonal approaches: there exists no single safety analysis or evaluation technique that can handle all aspects of complex systems. Applying only one or two may make us feel satisfied, but will produce limited results. We report here on a demonstration, performed as part of a contract with NASA Langley Research Center, of the Safeware methodology on the Center-TRACON Automation System (CTAS) portion of the air traffic control (ATC) system and procedures currently employed at the Dallas/Fort Worth (DFW) TRACON (Terminal Radar Approach CONtrol). CTAS is an automated system to assist controllers in handling arrival traffic in the DFW area. Safety is a system property, not a component property, so our safety analysis considers the entire system and not simply the automated components. Because safety analysis of a complex system is an interdisciplinary effort, our team included system engineers, software engineers, human factors experts, and cognitive psychologists.

  13. The STARLINK software collection

    NASA Astrophysics Data System (ADS)

    Penny, A. J.; Wallace, P. T.; Sherman, J. C.; Terret, D. L.

    1993-12-01

    A demonstration will be given of some recent Starlink software. STARLINK is: a network of computers used by UK astronomers; a collection of programs for the calibration and analysis of astronomical data; a team of people giving hardware, software and administrative support. The Starlink Project has been in operation since 1980 to provide UK astronomers with interactive image processing and data reduction facilities. There are now Starlink computer systems at 25 UK locations, serving about 1500 registered users. The Starlink software collection now has about 25 major packages covering a wide range of astronomical data reduction and analysis techniques, as well as many smaller programs and utilities. At the core of most of the packages is a common `software environment', which provides many of the functions which applications need and offers standardized methods of structuring and accessing data. The software environment simplifies programming and support, and makes it easy to use different packages for different stages of the data reduction. Users see a consistent style, and can mix applications without hitting problems of differing data formats. The Project group coordinates the writing and distribution of this software collection, which is Unix based. Outside the UK, Starlink is used at a large number of places, which range from installations at major UK telescopes, which are Starlink-compatible and managed like Starlink sites, to individuals who run only small parts of the Starlink software collection.

  14. Probabilistic Prediction of Lifetimes of Ceramic Parts

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Gyekenyesi, John P.; Jadaan, Osama M.; Palfi, Tamas; Powers, Lynn; Reh, Stefan; Baker, Eric H.

    2006-01-01

    ANSYS/CARES/PDS is a software system that combines the ANSYS Probabilistic Design System (PDS) software with a modified version of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) Version 6.0 software. [A prior version of CARES/Life was reported in Program for Evaluation of Reliability of Ceramic Parts (LEW-16018), NASA Tech Briefs, Vol. 20, No. 3 (March 1996), page 28.] CARES/Life models effects of stochastic strength, slow crack growth, and stress distribution on the overall reliability of a ceramic component. The essence of the enhancement in CARES/Life 6.0 is the capability to predict the probability of failure using results from transient finite-element analysis. ANSYS PDS models the effects of uncertainty in material properties, dimensions, and loading on the stress distribution and deformation. ANSYS/CARES/PDS accounts for the effects of probabilistic strength, probabilistic loads, probabilistic material properties, and probabilistic tolerances on the lifetime and reliability of the component. Even failure probability becomes a stochastic quantity that can be tracked as a response variable. ANSYS/CARES/PDS enables tracking of all stochastic quantities in the design space, thereby enabling more precise probabilistic prediction of lifetimes of ceramic components.

  15. Analysis of data throughput in communication between PLCs and HMI/SCADA systems

    NASA Astrophysics Data System (ADS)

    Mikolajek, Martin; Koziorek, Jiri

    2016-09-01

    This paper is focused on the analysis of data throughput in communication between PLCs and HMI/SCADA systems. The first part of the paper discusses basic problems of communication between PLC and HMI systems. The next part covers specific types of PLC-HMI communication requests; for these cases, the paper examines response times and data throughput. A subsequent section of this article contains a practical part with various data exchanges between a Siemens PLC and an HMI. The communication options described in this article focus on using an OPC server for visualization software, a custom HMI system, and our own application created using .NET technology. The last part of this article presents some communication solutions.
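
    One common PLC-to-visualization path mentioned above is an OPC server. The sketch below reads a single tag over OPC UA with the python-opcua package; the endpoint URL and node identifier are hypothetical placeholders, and the paper does not state which OPC variant or client library it used.

    ```python
    # Read one PLC tag through an OPC UA server (python-opcua package).
    from opcua import Client

    client = Client("opc.tcp://192.168.0.10:4840")  # hypothetical endpoint
    try:
        client.connect()
        tag = client.get_node("ns=2;s=Line1.Temperature")  # hypothetical node id
        print("value:", tag.get_value())
    finally:
        client.disconnect()
    ```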

  16. Space Station communications and tracking systems modeling and RF link simulation

    NASA Technical Reports Server (NTRS)

    Tsang, Chit-Sang; Chie, Chak M.; Lindsey, William C.

    1986-01-01

    In this final report, the effort spent on Space Station Communications and Tracking System Modeling and RF Link Simulation is described in detail. The effort is mainly divided into three parts: frequency division multiple access (FDMA) system simulation modeling and software implementation; a study on design and evaluation of a functional computerized RF link simulation/analysis system for Space Station; and a study on design and evaluation of simulation system architecture. This report documents the results of these studies. In addition, a separate User's Manual on Space Communications Simulation System (SCSS) (Version 1) documents the software developed for the Space Station FDMA communications system simulation. The final report, SCSS user's manual, and the software located in the NASA JSC system analysis division's VAX 750 computer together serve as the deliverables from LinCom for this project effort.

  17. A software platform for the analysis of dermatology images

    NASA Astrophysics Data System (ADS)

    Vlassi, Maria; Mavraganis, Vlasios; Asvestas, Panteleimon

    2017-11-01

    The purpose of this paper is to present a software platform developed in the Python programming environment that can be used for the processing and analysis of dermatology images. The platform provides the capability of reading a file that contains a dermatology image, and supports image formats such as Windows bitmaps, JPEG, JPEG2000, portable network graphics, and TIFF. Furthermore, it provides suitable tools for selecting, either manually or automatically, a region of interest (ROI) on the image. The automated selection of a ROI includes filtering, for smoothing the image, and thresholding. The proposed software platform has a friendly and clear graphical user interface and could be a useful second-opinion tool for a dermatologist. Furthermore, it could be used to classify images from other anatomical parts, such as breast or lung, after proper re-training of the classification algorithms.
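
    The automated ROI step (smoothing, then thresholding) can be sketched in a few lines. The example below uses OpenCV's Gaussian blur and Otsu threshold as stand-ins for the platform's own filters, which the paper does not name; the input filename is hypothetical.

    ```python
    # Smooth then threshold to get a candidate ROI mask (illustrative pipeline).
    import cv2

    img = cv2.imread("lesion.jpg", cv2.IMREAD_GRAYSCALE)  # hypothetical input
    smooth = cv2.GaussianBlur(img, (5, 5), 0)             # suppress noise first
    _, mask = cv2.threshold(smooth, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    cv2.imwrite("roi_mask.png", mask)
    ```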

  18. 15 CFR Supplement No. 3 to Part 774 - Statements of Understanding

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...” that “incorporate” commodities or software on the Commerce Control List (Supplement No. 1 to part 774... medical research). (2) Commodities or software are considered “incorporated” if the commodity or software... medical equipment; and exported or reexported with the medical equipment. (3) Except for such software...

  19. PIPER: Performance Insight for Programmers and Exascale Runtimes: Guiding the Development of the Exascale Software Stack

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mellor-Crummey, John

    The PIPER project set out to develop methodologies and software for measurement, analysis, attribution, and presentation of performance data for extreme-scale systems. The goals of the project were to support analysis of massive multi-scale parallelism, heterogeneous architectures, and multi-faceted performance concerns, and to support both post-mortem performance analysis, to identify program features that contribute to problematic performance, and on-line performance analysis, to drive adaptation. This final report summarizes the research and development activity at Rice University as part of the PIPER project. Producing a complete suite of performance tools for exascale platforms during the course of this project was impossible since both hardware and software for exascale systems are still a moving target. For that reason, the project focused broadly on the development of new techniques for measurement and analysis of performance on modern parallel architectures; enhancements to HPCToolkit’s software infrastructure to support our research goals or use on sophisticated applications; engaging developers of multithreaded runtimes to explore how support for tools should be integrated into their designs; engaging operating system developers with feature requests for enhanced monitoring support; engaging vendors with requests that they add hardware measurement capabilities and software interfaces needed by tools as they design new components of HPC platforms, including processors, accelerators and networks; and finally, collaborations with partners interested in using HPCToolkit to analyze and tune scalable parallel applications.

  20. Automated water analyser computer supported system (AWACSS) Part I: Project objectives, basic technology, immunoassay development, software design and networking.

    PubMed

    Tschmelak, Jens; Proll, Guenther; Riedt, Johannes; Kaiser, Joachim; Kraemmer, Peter; Bárzaga, Luis; Wilkinson, James S; Hua, Ping; Hole, J Patrick; Nudd, Richard; Jackson, Michael; Abuknesha, Ram; Barceló, Damià; Rodriguez-Mozaz, Sara; de Alda, Maria J López; Sacher, Frank; Stien, Jan; Slobodník, Jaroslav; Oswald, Peter; Kozmenko, Helena; Korenková, Eva; Tóthová, Lívia; Krascsenits, Zoltan; Gauglitz, Guenter

    2005-02-15

    A novel analytical system, AWACSS (automated water analyser computer-supported system), based on immunochemical technology has been developed that can measure several organic pollutants at the low nanogram-per-litre level in a single few-minutes analysis without any prior sample pre-concentration or pre-treatment steps. Having in mind the actual needs of water-sector managers related to the implementation of the Drinking Water Directive (DWD) (98/83/EC, 1998) and the Water Framework Directive (WFD) (2000/60/EC, 2000), drinking, ground, surface, and waste waters were the major media used for the evaluation of the system performance. The instrument was equipped with remote control and surveillance facilities. The system's software allows for internet-based networking between the measurement and control stations, global management, trend analysis, and early-warning applications. The experience of water laboratories was utilised in the design of the instrument's hardware and software in order to make the system rugged and user-friendly. Several market surveys were conducted during the project to assess the applicability of the final system. A web-based AWACSS database was created for automated evaluation and storage of the obtained data in a format compatible with major databases of environmental organic pollutants in Europe. This first of two articles gives the reader an overview of the aims and scope of the AWACSS project as well as details about the basic technology, immunoassays, software, and networking developed and utilised within the research project. The second article reports on the system performance, first real-sample measurements, and an international collaborative trial (inter-laboratory tests) comparing the biosensor with conventional analytical methods.

  1. Impact of Domain Analysis on Reuse Methods

    DTIC Science & Technology

    1989-11-06

    return on the investment. The potential negative effects a "bad" domain analysis has on developing systems in the domain also increases the risks of a...importance of domain analysis as part of a software reuse program. A particular goal is to assist in avoiding the potential negative effects of ad hoc or...are specification objects discovered by performing object-oriented analysis. Object-based analysis approaches thus serve to capture a model of reality

  2. Errata: Response Analysis and Error Diagnosis Tools.

    ERIC Educational Resources Information Center

    Hart, Robert S.

    This guide to ERRATA, a set of HyperCard-based tools for response analysis and error diagnosis in language testing, is intended as a user manual and general reference and designed to be used with the software (not included here). It has three parts. The first is a brief survey of computational techniques available for dealing with student test…

  3. MULTIMEDIA ENVIRONMENTAL DISTRIBUTION OF TOXICS (MEND-TOX): PART II, SOFTWARE IMPLEMENTATION AND CASE STUDIES

    EPA Science Inventory

    An integrated hybrid spatial-compartmental simulator is presented for analyzing the dynamic distribution of chemicals in the multimedia environment. Information obtained from such analysis, which includes temporal chemical concentration profiles in various media, mass distribu...

  4. A SIMULINK environment for flight dynamics and control analysis: Application to the DHC-2 Beaver. Part 1: Implementation of a model library in SIMULINK. Part 2: Nonlinear analysis of the Beaver autopilot

    NASA Technical Reports Server (NTRS)

    Rauw, Marc O.

    1993-01-01

    The design of advanced Automatic Aircraft Control Systems (AACS's) can be improved upon considerably if the designer can access all models and tools required for control system design and analysis through a graphical user interface, from within one software environment. This MSc thesis presents the first step in the development of such an environment, which is currently being done at the Section for Stability and Control of Delft University of Technology, Faculty of Aerospace Engineering. The environment is implemented within the commercially available software package MATLAB/SIMULINK. The report consists of two parts. Part 1 gives a detailed description of the AACS design environment. The heart of this environment is formed by the SIMULINK implementation of a nonlinear aircraft model in block-diagram format. The model has been worked out for the old laboratory aircraft of the Faculty, the DeHavilland DHC-2 'Beaver', but due to its modular structure, it can easily be adapted for other aircraft. Part 1 also describes MATLAB programs which can be applied for finding steady-state trimmed-flight conditions and for linearization of the aircraft model, and it shows how the built-in simulation routines of SIMULINK have been used for open-loop analysis of the aircraft dynamics. Apart from the implementation of the models and tools, a thorough treatment of the theoretical backgrounds is presented. Part 2 of this report presents a part of an autopilot design process for the 'Beaver' aircraft, which clearly demonstrates the power and flexibility of the AACS design environment from part 1. Evaluations of all longitudinal and lateral control laws by means of nonlinear simulations are treated in detail. A floppy disk containing all relevant MATLAB programs and SIMULINK models is provided as a supplement.

  5. User's guide to image processing applications of the NOAA satellite HRPT/AVHRR data. Part 1: Introduction to the satellite system and its applications. Part 2: Processing and analysis of AVHRR imagery

    NASA Technical Reports Server (NTRS)

    Huh, Oscar Karl; Leibowitz, Scott G.; Dirosa, Donald; Hill, John M.

    1986-01-01

    The use of NOAA Advanced Very High Resolution Radiometer/High Resolution Picture Transmission (AVHRR/HRPT) imagery for earth resource applications is presented for the applications scientist working within the various Earth science, resource, and agricultural disciplines. A guide to processing NOAA AVHRR data using the hardware and software systems integrated for this NASA project is provided. The processing steps from raw data on computer compatible tapes (1B data format) through usable qualitative and quantitative products for applications are given. The manual is divided into two parts. The first section describes the NOAA satellite system, its sensors, and the theoretical basis for using these data for environmental applications. Part 2 is a hands-on description of how to use a specific image processing system, the International Imaging Systems, Inc. (I2S) Model 75 Array Processor and S575 software, to process these data.

  6. Warpage analysis on thin shell part using response surface methodology (RSM)

    NASA Astrophysics Data System (ADS)

    Zulhasif, Z.; Shayfull, Z.; Nasir, S. M.; Fathullah, M.; Hazwan, M. H. M.

    2017-09-01

    This study optimises the moulding parameters to reduce warpage defects, using Autodesk Moldflow Insight (AMI) 2012 software. The product is injection moulded using Acrylonitrile-Butadiene-Styrene (ABS) material. The analysis varies the processing parameters of melt temperature, mould temperature, packing pressure and packing time. Design of Experiments (DOE) has been integrated to obtain a polynomial model using Response Surface Methodology (RSM), and the Glowworm Swarm Optimisation (GSO) method is used to predict the best combination of parameters to minimise warpage defects in order to produce high-quality parts.
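
    The RSM step amounts to fitting a low-order polynomial to the DOE runs before an optimiser searches it. The sketch below fits a quadratic response surface in two coded factors by least squares; the settings and warpage values are hypothetical stand-ins for the paper's DOE table, and the GSO search itself is not reproduced.

    ```python
    # Fit warpage ~ quadratic polynomial in two coded factors (illustrative RSM).
    import numpy as np

    # hypothetical coded DOE settings (x1 = melt temperature, x2 = packing pressure)
    X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1], [0, 0], [0, 1], [1, 0]])
    y = np.array([0.42, 0.35, 0.38, 0.30, 0.33, 0.34, 0.31])  # hypothetical warpage (mm)

    x1, x2 = X[:, 0].astype(float), X[:, 1].astype(float)
    A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)  # least-squares RSM coefficients
    print("quadratic model coefficients:", coef)
    ```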

  7. Multidisciplinary High-Fidelity Analysis and Optimization of Aerospace Vehicles. Part 2; Preliminary Results

    NASA Technical Reports Server (NTRS)

    Walsh, J. L.; Weston, R. P.; Samareh, J. A.; Mason, B. H.; Green, L. L.; Biedron, R. T.

    2000-01-01

    An objective of the High Performance Computing and Communication Program at the NASA Langley Research Center is to demonstrate multidisciplinary shape and sizing optimization of a complete aerospace vehicle configuration by using high-fidelity finite-element structural analysis and computational fluid dynamics aerodynamic analysis in a distributed, heterogeneous computing environment that includes high performance parallel computing. A software system has been designed and implemented to integrate a set of existing discipline analysis codes, some of them computationally intensive, into a distributed computational environment for the design of a high-speed civil transport configuration. The paper describes both the preliminary results from implementing and validating the multidisciplinary analysis and the results from an aerodynamic optimization. The discipline codes are integrated by using the Java programming language and a Common Object Request Broker Architecture compliant software product. A companion paper describes the formulation of the multidisciplinary analysis and optimization system.

  8. Data Curation and Visualization for MuSIASEM Analysis of the Nexus

    NASA Astrophysics Data System (ADS)

    Renner, Ansel

    2017-04-01

    A novel software-based approach to relational analysis applying recent theoretical advancements of the Multi-Scale Integrated Analysis of Societal and Ecosystem Metabolism (MuSIASEM) accounting framework is presented. This research explores and explains underutilized ways in which software can assist complex system analysis across the stages of data collection, exploration, analysis and dissemination, in a transparent and collaborative manner. This work is being conducted as part of, and in support of, the four-year European Commission H2020 project Moving Towards Adaptive Governance in Complexity: Informing Nexus Security (MAGIC). In MAGIC, theoretical advancements to MuSIASEM propose a powerful new approach to spatial-temporal WEFC relational analysis in accordance with a structural-functional scaling mechanism appropriate for biophysically relevant complex system analyses. The software is designed primarily in JavaScript, using the Angular2 model-view-controller framework and the Data-Driven Documents (D3) library. These design choices clarify and modularize data flow, simplify research practitioners' work, allow for and assist stakeholder involvement, and advance collaboration at all stages. Data requirements and scalable, robust yet light-weight data structuring will first be explained. Following that, algorithms to process this data will be explored. Data interfaces and data visualization approaches will lastly be presented and described.

  9. Gaia DR1 documentation Chapter 6: Variability

    NASA Astrophysics Data System (ADS)

    Eyer, L.; Rimoldini, L.; Guy, L.; Holl, B.; Clementini, G.; Cuypers, J.; Mowlavi, N.; Lecoeur-Taïbi, I.; De Ridder, J.; Charnas, J.; Nienartowicz, K.

    2017-12-01

    This chapter describes the photometric variability processing of the Gaia DR1 data. Coordination Unit 7 is responsible for the variability analysis of over a billion celestial sources, in particular for the definition, design, development, validation and provision of a software package for the data processing of photometrically variable objects. Data Processing Centre Geneva (DPCG) responsibilities cover all issues related to the computational part of the CU7 analysis. These span: hardware provisioning, including selection, deployment and optimisation of suitable hardware; choosing and developing software architecture; defining data and scientific workflows; as well as operational activities such as configuration management, data import, time series reconstruction, storage and processing handling, visualisation and data export. CU7/DPCG is also responsible for interaction with other DPCs and CUs, software and programming training for the CU7 members, scientific software quality control, and management of the software and data lifecycle. Details about the specific data treatment steps of the Gaia DR1 data products are found in Eyer et al. (2017) and are not repeated here. The variability content of the Gaia DR1 focusses on a subsample of Cepheids and RR Lyrae stars around the South ecliptic pole, showcasing the performance of the Gaia photometry with respect to variable objects.

  10. GiA Roots: software for the high throughput analysis of plant root system architecture.

    PubMed

    Galkovskyi, Taras; Mileyko, Yuriy; Bucksch, Alexander; Moore, Brad; Symonova, Olga; Price, Charles A; Topp, Christopher N; Iyer-Pascuzzi, Anjali S; Zurek, Paul R; Fang, Suqin; Harer, John; Benfey, Philip N; Weitz, Joshua S

    2012-07-26

    Characterizing root system architecture (RSA) is essential to understanding the development and function of vascular plants. Identifying RSA-associated genes also represents an underexplored opportunity for crop improvement. Software tools are needed to accelerate the pace at which quantitative traits of RSA are estimated from images of root networks. We have developed GiA Roots (General Image Analysis of Roots), a semi-automated software tool designed specifically for the high-throughput analysis of root system images. GiA Roots includes user-assisted algorithms to distinguish root from background and a fully automated pipeline that extracts dozens of root system phenotypes. Quantitative information on each phenotype, along with intermediate steps for full reproducibility, is returned to the end-user for downstream analysis. GiA Roots has a GUI front end and a command-line interface for interweaving the software into large-scale workflows. GiA Roots can also be extended to estimate novel phenotypes specified by the end-user. We demonstrate the use of GiA Roots on a set of 2393 images of rice roots representing 12 genotypes from the species Oryza sativa. We validate trait measurements against prior analyses of this image set that demonstrated that RSA traits are likely heritable and associated with genotypic differences. Moreover, we demonstrate that GiA Roots is extensible and an end-user can add functionality so that GiA Roots can estimate novel RSA traits. In summary, we show that the software can function as an efficient tool as part of a workflow to move from large numbers of root images to downstream analysis.
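
    As an illustration of the kind of pipeline GiA Roots automates (not its actual code), the sketch below thresholds a root image, skeletonizes the network with scikit-image, and uses the skeleton pixel count as a crude total-root-length proxy; the filename and the roots-darker-than-background assumption are hypothetical.

    ```python
    # Threshold, skeletonize, and measure a root network (illustrative only).
    import numpy as np
    from skimage import filters, io
    from skimage.morphology import skeletonize

    img = io.imread("root_scan.png", as_gray=True)  # hypothetical input image
    mask = img < filters.threshold_otsu(img)        # assume roots darker than background
    skeleton = skeletonize(mask)
    print("root-length proxy (skeleton pixels):", int(np.count_nonzero(skeleton)))
    ```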

  11. Java Source Code Analysis for API Migration to Embedded Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Winter, Victor; McCoy, James A.; Guerrero, Jonathan

    Embedded systems form an integral part of our technological infrastructure and oftentimes play a complex and critical role within larger systems. From the perspective of reliability, security, and safety, strong arguments can be made favoring the use of Java over C in such systems. In part, this argument is based on the assumption that suitable subsets of Java’s APIs and extension libraries are available to embedded software developers. In practice, a number of Java-based embedded processors do not support the full features of the JVM. For such processors, source code migration is a mechanism by which key abstractions offered by APIs and extension libraries can be made available to embedded software developers. The analysis required for Java source code-level library migration is based on the ability to correctly resolve element references to their corresponding element declarations. A key challenge in this setting is how to perform analysis for incomplete source-code bases (e.g., subsets of libraries) from which types and packages have been omitted. This article formalizes an approach that can be used to extend code bases targeted for migration in such a manner that the threats associated with the analysis of incomplete code bases are eliminated.

  12. Global review of open access risk assessment software packages valid for global or continental scale analysis

    NASA Astrophysics Data System (ADS)

    Daniell, James; Simpson, Alanna; Gunasekara, Rashmin; Baca, Abigail; Schaefer, Andreas; Ishizawa, Oscar; Murnane, Rick; Tijssen, Annegien; Deparday, Vivien; Forni, Marc; Himmelfarb, Anne; Leder, Jan

    2015-04-01

    Over the past few decades, a plethora of open access software packages for the calculation of earthquake, volcanic, tsunami, storm surge, wind and flood risk have been produced globally. As part of the World Bank GFDRR Review released at the Understanding Risk 2014 Conference, over 80 such open access risk assessment software packages were examined. Commercial software was not considered in the evaluation. A preliminary analysis was used to determine whether the 80 models were currently supported and if they were open access. This process was used to select a subset of 31 models, comprising 8 earthquake models, 4 cyclone models, 11 flood models, and 8 storm surge/tsunami models, for more detailed analysis. By using multi-criteria decision analysis (MCDA) and simple descriptions of the software uses, the review allows users to select a few relevant software packages for their own testing and development. The detailed analysis evaluated the models on the basis of over 100 criteria and provides a synopsis of available open access natural hazard risk modelling tools. In addition, volcano software packages have since been added, making the compendium of risk software tools in excess of 100. There has been a huge increase in the quality and availability of open access/source software over the past few years. For example, private entities such as Deltares now have an open source policy regarding some flood models (NGHS). In addition, leaders in developing risk models in the public sector, such as Geoscience Australia (EQRM, TCRM, TsuDAT, AnuGA) or CAPRA (ERN-Flood, Hurricane, CRISIS2007 etc.), are launching and/or helping many other initiatives. As we achieve greater interoperability between modelling tools, we will also achieve a future wherein different open source and open access modelling tools will be increasingly connected and adapted towards unified multi-risk model platforms and highly customised solutions. It was seen that many software tools could be improved by enabling user-defined exposure and vulnerability; without this function, many tools can only be used regionally and not at global or continental scale. It is becoming increasingly easy to use multiple packages for a single region and/or hazard to characterize the uncertainty in the risk, or to use them as checks for the sensitivities in the analysis. There is potential for valuable synergy between existing software: a number of open source software packages could be combined to generate a multi-risk model with multiple views of a hazard. This extensive review has simply attempted to provide a platform for dialogue between all open source and open access software packages and to hopefully inspire collaboration between developers, given the great work done by all open access and open source developers.

  13. Evaluation of the Trajectory Operations Applications Software Task (TOAST). Volume 2: Interview transcripts

    NASA Technical Reports Server (NTRS)

    Perkins, Sharon; Martin, Andrea; Bavinger, Bill

    1990-01-01

    The Trajectory Operations Applications Software Task (TOAST) is a software development project whose purpose is to provide trajectory operations pre-mission and real-time support for the Space Shuttle. The purpose of the evaluation was to assess TOAST as an application manager: to examine current and planned capabilities, compare those capabilities to commercially available off-the-shelf (COTS) software, and analyze the requirements of the MCC and the Flight Analysis Design System (FADS) for TOAST implementation. As a major part of the data gathering for the evaluation, interviews were conducted with NASA and contractor personnel. Real-time and flight design users, orbit navigation users, the TOAST developers, and management were interviewed. Code reviews and demonstrations were also held. Each of these interviews was videotaped and transcribed as appropriate. Transcripts were edited and are presented chronologically.

  14. Logic flowgraph methodology - A tool for modeling embedded systems

    NASA Technical Reports Server (NTRS)

    Muthukumar, C. T.; Guarro, S. B.; Apostolakis, G. E.

    1991-01-01

    The logic flowgraph methodology (LFM), a method for modeling hardware in terms of its process parameters, has been extended to form an analytical tool for the analysis of integrated (hardware/software) embedded systems. In the software part of a given embedded system model, timing and the control flow among different software components are modeled by augmenting LFM with modified Petri net structures. The objective of using such an augmented LFM model is to uncover possible errors and the potential for unanticipated software/hardware interactions. This is done by backtracking through the augmented LFM model according to established procedures which allow the semiautomated construction of fault trees for any chosen state of the embedded system (top event). These fault trees, in turn, produce the possible combinations of lower-level states (events) that may lead to the top event.

  15. Hard Choices for Individual Situations.

    ERIC Educational Resources Information Center

    Landon, Bruce

    This paper focuses on faculty use of a decision-making process for complex situations. The analysis part of the process describes and compares course management software focusing on: technical specifications, instructional design values,tools and features, ease of use, and standards compliance. The extensive comparisons provide faculty with…

  16. A new parallel-vector finite element analysis software on distributed-memory computers

    NASA Technical Reports Server (NTRS)

    Qin, Jiangning; Nguyen, Duc T.

    1993-01-01

    A new parallel-vector finite element analysis software package MPFEA (Massively Parallel-vector Finite Element Analysis) is developed for large-scale structural analysis on massively parallel computers with distributed memory. MPFEA is designed for parallel generation and assembly of the global finite element stiffness matrices as well as parallel solution of the simultaneous linear equations, since these are often the major time-consuming parts of a finite element analysis. A block-skyline storage scheme, along with vector-unrolling techniques, is used to enhance the vector performance. Communications among processors are carried out concurrently with arithmetic operations to reduce the total execution time. Numerical results on the Intel iPSC/860 computers (such as the Intel Gamma with 128 processors and the Intel Touchstone Delta with 512 processors) are presented, including an aircraft structure and some very large truss structures, to demonstrate the efficiency and accuracy of MPFEA.
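
    The assembly stage that MPFEA parallelizes has a simple serial skeleton: each element contributes a small dense matrix to the global sparse matrix, and it is this element loop that gets distributed across processors. The sketch below shows the structure for a 1D bar using SciPy triplets; it is a generic illustration, not MPFEA's block-skyline storage scheme.

    ```python
    # Serial skeleton of global stiffness assembly for a 1D bar (illustrative).
    import numpy as np
    from scipy.sparse import coo_matrix

    n_elem, k = 4, 1.0                             # assumed element count and stiffness
    ke = k * np.array([[1.0, -1.0], [-1.0, 1.0]])  # 2x2 bar element matrix

    rows, cols, vals = [], [], []
    for e in range(n_elem):  # iterations are independent, hence parallelizable
        dofs = (e, e + 1)
        for i in range(2):
            for j in range(2):
                rows.append(dofs[i])
                cols.append(dofs[j])
                vals.append(ke[i, j])

    K = coo_matrix((vals, (rows, cols)), shape=(n_elem + 1, n_elem + 1)).tocsr()
    print(K.toarray())
    ```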

  17. Multidisciplinary High-Fidelity Analysis and Optimization of Aerospace Vehicles. Part 1; Formulation

    NASA Technical Reports Server (NTRS)

    Walsh, J. L.; Townsend, J. C.; Salas, A. O.; Samareh, J. A.; Mukhopadhyay, V.; Barthelemy, J.-F.

    2000-01-01

    An objective of the High Performance Computing and Communication Program at the NASA Langley Research Center is to demonstrate multidisciplinary shape and sizing optimization of a complete aerospace vehicle configuration by using high-fidelity, finite element structural analysis and computational fluid dynamics aerodynamic analysis in a distributed, heterogeneous computing environment that includes high performance parallel computing. A software system has been designed and implemented to integrate a set of existing discipline analysis codes, some of them computationally intensive, into a distributed computational environment for the design of a high-speed civil transport configuration. The paper describes the engineering aspects of formulating the optimization by integrating these analysis codes and associated interface codes into the system. The discipline codes are integrated by using the Java programming language and a Common Object Request Broker Architecture (CORBA) compliant software product. A companion paper presents currently available results.

  18. GPFrontend and GPGraphics: graphical analysis tools for genetic association studies.

    PubMed

    Uebe, Steffen; Pasutto, Francesca; Krumbiegel, Mandy; Schanze, Denny; Ekici, Arif B; Reis, André

    2010-09-21

    Most software packages for whole genome association studies are non-graphical, purely text-based programs originally designed to run on UNIX-like operating systems. Graphical output is often not intended, or is supposed to be performed with other command-line tools such as gnuplot. Using the Microsoft .NET 2.0 platform and Visual Studio 2005, we have created a graphical software package to analyze data from microarray whole genome association studies, both for a DNA-pooling based approach and for regular single-sample data. Part of this package was made to integrate with GenePool 0.8.2, a previously existing software suite for GNU/Linux systems, which we have modified to run in a Microsoft Windows environment. Further modifications cause it to generate some additional data, which enables GenePool to interact with the .NET parts created by us. The programs we developed are GPFrontend, a graphical user interface and frontend used to run GenePool and create metadata files for it, and GPGraphics, a program to further analyze and graphically evaluate output of different WGA analysis programs, among them GenePool. Our programs enable regular MS Windows users without much experience in bioinformatics to easily visualize whole genome data from a variety of sources.

  19. Uncertainty in the use of MAMA software to measure particle morphological parameters from SEM images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwartz, Daniel S.; Tandon, Lav

    The MAMA software package developed at LANL is designed to make morphological measurements on a wide variety of digital images of objects. At LANL, we have focused on using MAMA to measure scanning electron microscope (SEM) images of particles, as this is a critical part of our forensic analysis of interdicted radiologic materials. In order to successfully use MAMA to make such measurements, we must understand the level of uncertainty involved in the process, so that we can rigorously support our quantitative conclusions.

  20. 15 CFR Supplement No. 6 to Part 742 - Technical Questionnaire for Encryption Items

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... software, provide the following information: (1) Description of all the symmetric and asymmetric encryption... third-party hardware or software encryption components (if any). Identify the manufacturers of the hardware or software components, including specific part numbers and version information as needed to...

  1. 15 CFR Supplement No. 6 to Part 742 - Technical Questionnaire for Encryption Items

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... software, provide the following information: (1) Description of all the symmetric and asymmetric encryption... third-party hardware or software encryption components (if any). Identify the manufacturers of the hardware or software components, including specific part numbers and version information as needed to...

  2. 15 CFR Supplement No. 6 to Part 742 - Technical Questionnaire for Encryption Items

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... software, provide the following information: (1) Description of all the symmetric and asymmetric encryption... third-party hardware or software encryption components (if any). Identify the manufacturers of the hardware or software components, including specific part numbers and version information as needed to...

  3. Validation of a free software for unsupervised assessment of abdominal fat in MRI.

    PubMed

    Maddalo, Michele; Zorza, Ivan; Zubani, Stefano; Nocivelli, Giorgio; Calandra, Giulio; Soldini, Pierantonio; Mascaro, Lorella; Maroldi, Roberto

    2017-05-01

    To demonstrate the accuracy of an unsupervised (fully automated) software for fat segmentation in magnetic resonance imaging. The proposed software is a freeware solution developed in ImageJ that enables the quantification of metabolically different adipose tissues in large cohort studies. The lumbar part of the abdomen (19 cm in the craniocaudal direction, centered on L3) of eleven healthy volunteers (age range: 21-46 years, BMI range: 21.7-31.6 kg/m2) was examined in a breath hold on expiration with a GE T1 Dixon sequence. Single-slice and volumetric data were considered for each subject. The results of the visceral and subcutaneous adipose tissue assessments obtained by the unsupervised software were compared to supervised segmentations of reference. The associated statistical analysis included Pearson correlations, Bland-Altman plots and volumetric differences (VD%). Values calculated by the unsupervised software correlated significantly with the corresponding supervised segmentations of reference for both subcutaneous adipose tissue (SAT) (R=0.9996, p<0.001) and visceral adipose tissue (VAT) (R=0.995, p<0.001). Bland-Altman plots showed the absence of systematic errors and a limited spread of the differences. In the single-slice analysis, VD% were (1.6±2.9)% for SAT and (4.9±6.9)% for VAT. In the volumetric analysis, VD% were (1.3±0.9)% for SAT and (2.9±2.7)% for VAT. The developed software is capable of segmenting the metabolically different adipose tissues with a high degree of accuracy. This free add-on software for ImageJ can easily achieve widespread use and enable large-scale population studies of adipose tissue and its related diseases. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
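
    The agreement statistics reported here are straightforward to reproduce for any paired set of automated and reference measurements. The sketch below computes the Pearson correlation and the Bland-Altman bias with 95% limits of agreement; the paired volumes are hypothetical placeholders, not the study's data.

    ```python
    # Pearson r and Bland-Altman bias/limits for paired measurements (illustrative).
    import numpy as np

    auto = np.array([101.0, 250.0, 180.0, 320.0, 95.0])  # unsupervised volumes (cm^3), hypothetical
    ref = np.array([100.0, 245.0, 184.0, 318.0, 97.0])   # supervised reference (cm^3), hypothetical

    r = np.corrcoef(auto, ref)[0, 1]
    diff = auto - ref
    bias, sd = diff.mean(), diff.std(ddof=1)
    print(f"Pearson r = {r:.4f}")
    print(f"bias = {bias:.2f}, limits of agreement = {bias - 1.96*sd:.2f} .. {bias + 1.96*sd:.2f}")
    ```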

  4. Image Classification Workflow Using Machine Learning Methods

    NASA Astrophysics Data System (ADS)

    Christoffersen, M. S.; Roser, M.; Valadez-Vergara, R.; Fernández-Vega, J. A.; Pierce, S. A.; Arora, R.

    2016-12-01

    Recent increases in the availability and quality of remote sensing datasets have fueled an increasing number of scientifically significant discoveries based on land use classification and land use change analysis. However, much of the software made to work with remote sensing data products, specifically multispectral images, is commercial and often prohibitively expensive. The free-to-use solutions that are currently available come bundled as small parts of much larger programs that are very susceptible to bugs and difficult to install and configure. What is needed is a compact, easy-to-use set of tools to perform land use analysis on multispectral images. To address this need, we have developed software using the Python programming language with the sole function of land use classification and land use change analysis. We chose Python to develop our software because it is relatively readable, has a large body of relevant third-party libraries such as GDAL and Spectral Python, and is free to install and use on Windows, Linux, and Macintosh operating systems. In order to test our classification software, we performed a K-means unsupervised classification, a Gaussian Maximum Likelihood supervised classification, and a Mahalanobis Distance based supervised classification. The images used for testing were three Landsat rasters of Austin, Texas, with a spatial resolution of 60 meters for the years 1984 and 1999, and 30 meters for the year 2015. The testing dataset was easily downloaded using the Earth Explorer application produced by the USGS. The software should be able to perform classification based on any set of multispectral rasters with little to no modification. Our software makes the ease of land use classification offered by commercial packages available without an expensive license.
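
    As a sketch of the unsupervised branch of such a workflow, the example below clusters multispectral pixels with scikit-learn's K-means and reshapes the labels back into a class map. The random array stands in for a real raster, which would normally be loaded with a library such as GDAL; the band count and class count are assumptions.

    ```python
    # K-means land-use classification of a multispectral raster (illustrative).
    import numpy as np
    from sklearn.cluster import KMeans

    bands = np.random.rand(6, 512, 512)           # stand-in for a 6-band raster
    pixels = bands.reshape(bands.shape[0], -1).T  # (n_pixels, n_bands)

    labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(pixels)
    classmap = labels.reshape(bands.shape[1:])    # back to image dimensions
    print("pixels per class:", np.bincount(labels))
    ```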

  5. Math Machines: Using Actuators in Physics Classes

    ERIC Educational Resources Information Center

    Thomas, Frederick J.; Chaney, Robert A.; Gruesbeck, Marta

    2018-01-01

    Probeware (sensors combined with data-analysis software) is a well-established part of physics education. In engineering and technology, sensors are frequently paired with actuators--motors, heaters, buzzers, valves, color displays, medical dosing systems, and other devices that are activated by electrical signals to produce intentional physical…

  6. CCDs in the Mechanics Lab--A Competitive Alternative? (Part I).

    ERIC Educational Resources Information Center

    Pinto, Fabrizio

    1995-01-01

    Reports on the implementation of a relatively low-cost, versatile, and intuitive system to teach basic mechanics based on the use of a Charge-Coupled Device (CCD) camera and inexpensive image-processing and analysis software. Discusses strengths and limitations of CCD imaging technologies. (JRH)

  7. Software Method for Computed Tomography Cylinder Data Unwrapping, Re-slicing, and Analysis

    NASA Technical Reports Server (NTRS)

    Roth, Don J.

    2013-01-01

    A software method has been developed that is applicable for analyzing cylindrical and partially cylindrical objects inspected using computed tomography (CT). This method involves unwrapping and re-slicing data so that the CT data from the cylindrical object can be viewed as a series of 2D sheets (or flattened "onion skins") in addition to a series of top-view slices and a 3D volume rendering. The advantages of viewing the data in this fashion are as follows: (1) the use of standard and specialized image processing and analysis methods is facilitated by having 2D array data versus a volume rendering; (2) accurate lateral dimensional analysis of flaws is possible in the unwrapped sheets versus volume rendering; (3) flaws in the part jump out at the inspector with the proper contrast expansion settings in the unwrapped sheets; and (4) it is much easier for the inspector to locate flaws in the unwrapped sheets versus top-view slices for very thin cylinders. The method is fully automated and requires no input from the user except the proper voxel dimension from the CT experiment and the wall thickness of the part. The software is available in 32-bit and 64-bit versions, and can be used with binary data (8- and 16-bit) and BMP-type CT image sets. The software has memory (RAM) and hard-drive based modes. The advantage of the (64-bit) RAM-based mode is speed (it is very practical for users of 64-bit Windows operating systems and computers having 16 GB or more RAM). The advantage of the hard-drive based analysis is that one can work with essentially unlimited-sized data sets. Separate windows are spawned for the unwrapped/re-sliced data view and any image processing interactive capability. Individual unwrapped images and unwrapped image series can be saved in common image formats. More information is available at http://www.grc.nasa.gov/WWW/OptInstr/NDE_CT_CylinderUnwrapper.html.
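
    The unwrapping itself is a polar resampling. The sketch below flattens the annulus of one CT slice onto a (radius, angle) grid so that each radius becomes one row of an "onion skin"; the axis position, wall radii, and random test data are assumptions, and this is not the NASA tool's implementation.

    ```python
    # Unwrap the annulus of one CT slice onto a (radius, angle) grid (illustrative).
    import numpy as np
    from scipy.ndimage import map_coordinates

    slice2d = np.random.rand(512, 512)  # stand-in for one CT slice
    cx, cy = 255.5, 255.5               # assumed cylinder axis position (pixels)
    radii = np.arange(180.0, 220.0)     # assumed wall: inner to outer radius
    thetas = np.linspace(0.0, 2.0 * np.pi, 1440, endpoint=False)

    R, T = np.meshgrid(radii, thetas, indexing="ij")
    coords = np.vstack([(cy + R * np.sin(T)).ravel(),   # row indices
                        (cx + R * np.cos(T)).ravel()])  # column indices
    unwrapped = map_coordinates(slice2d, coords, order=1).reshape(R.shape)
    print("unwrapped sheet shape (radius x angle):", unwrapped.shape)
    ```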

  8. Grinding Method and Error Analysis of Eccentric Shaft Parts

    NASA Astrophysics Data System (ADS)

    Wang, Zhiming; Han, Qiushi; Li, Qiguang; Peng, Baoying; Li, Weihua

    2017-12-01

    Eccentric shaft parts are widely used in RV reducers and various other mechanical transmissions, and precision grinding technology for such parts is now in demand. In this paper, a model of the X-C linkage relation for eccentric shaft grinding is studied. By the inversion method, the contour curve of the wheel envelope is deduced, with the distance from the center of the eccentric circle held constant. Simulation software for eccentric shaft grinding is developed and the correctness of the model is verified. The influence of the X-axis feed error, the C-axis feed error and the wheel radius error on the grinding process is analyzed, and a corresponding error calculation model is proposed. The simulation analysis provides the basis for contour error compensation.
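
    The linkage geometry can be made concrete with a small worked example. Assuming the usual plunge-grinding setup (journal center at eccentricity e from the C axis, wheel center constrained to the X axis, wheel kept tangent to the journal), the X position follows from the constant center distance r + R; all numeric values below are hypothetical.

    ```python
    # X-C linkage for grinding an eccentric journal (worked geometric sketch).
    import numpy as np

    e, r, R = 2.0, 10.0, 150.0  # eccentricity, journal radius, wheel radius (mm), assumed

    def wheel_position(C):
        # The journal center sits at (e cos C, e sin C); the wheel center on the
        # X axis must stay at distance r + R from it:
        #   X = e cos C + sqrt((r + R)^2 - e^2 sin^2 C)
        return e * np.cos(C) + np.sqrt((r + R) ** 2 - (e * np.sin(C)) ** 2)

    C = np.linspace(0.0, 2.0 * np.pi, 8, endpoint=False)
    print(np.round(wheel_position(C), 3))
    ```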

  9. FROG: Time Series Analysis for the Web Service Era

    NASA Astrophysics Data System (ADS)

    Allan, A.

    2005-12-01

    The FROG application is part of the next-generation Starlink{http://www.starlink.ac.uk} software work (Draper et al. 2005) and is released under the GNU Public License{http://www.gnu.org/copyleft/gpl.html} (GPL). Written in Java, it has been designed for the Web and Grid Service era as an extensible, pluggable tool for time series analysis and display. With an integrated SOAP server, the package's functionality is exposed to the user for use in their own code, and can be used remotely over the Grid as part of the Virtual Observatory (VO).

  10. MSAViewer: interactive JavaScript visualization of multiple sequence alignments.

    PubMed

    Yachdav, Guy; Wilzbach, Sebastian; Rauscher, Benedikt; Sheridan, Robert; Sillitoe, Ian; Procter, James; Lewis, Suzanna E; Rost, Burkhard; Goldberg, Tatyana

    2016-11-15

    The MSAViewer is a quick and easy visualization and analysis JavaScript component for Multiple Sequence Alignment data of any size. Core features include interactive navigation through the alignment, application of popular color schemes, sorting, selecting and filtering. The MSAViewer is 'web ready': written entirely in JavaScript, compatible with modern web browsers and does not require any specialized software. The MSAViewer is part of the BioJS collection of components. The MSAViewer is released as open source software under the Boost Software License 1.0. Documentation, source code and the viewer are available at http://msa.biojs.net/. Supplementary information: Supplementary data are available at Bioinformatics online. Contact: msa@bio.sh. © The Author 2016. Published by Oxford University Press.

  11. MSAViewer: interactive JavaScript visualization of multiple sequence alignments

    PubMed Central

    Yachdav, Guy; Wilzbach, Sebastian; Rauscher, Benedikt; Sheridan, Robert; Sillitoe, Ian; Procter, James; Lewis, Suzanna E.; Rost, Burkhard; Goldberg, Tatyana

    2016-01-01

    Summary: The MSAViewer is a quick and easy visualization and analysis JavaScript component for Multiple Sequence Alignment data of any size. Core features include interactive navigation through the alignment, application of popular color schemes, sorting, selecting and filtering. The MSAViewer is ‘web ready’: written entirely in JavaScript, compatible with modern web browsers and does not require any specialized software. The MSAViewer is part of the BioJS collection of components. Availability and Implementation: The MSAViewer is released as open source software under the Boost Software License 1.0. Documentation, source code and the viewer are available at http://msa.biojs.net/. Supplementary information: Supplementary data are available at Bioinformatics online. Contact: msa@bio.sh PMID:27412096

  12. Cost-Sensitive Radial Basis Function Neural Network Classifier for Software Defect Prediction

    PubMed Central

    Venkatesan, R.

    2016-01-01

    Effective prediction of defect-prone software modules will enable software developers to achieve efficient allocation of resources and to concentrate on quality assurance activities. The software development life cycle basically includes design, analysis, implementation, testing, and release phases. Software testing is a critical task in the software development process, wherein the aim is to save time and budget by detecting defects as early as possible and to deliver a product without defects to the customers. This testing phase should be carefully operated in an effective manner to release a defect-free (bug-free) software product to the customers. In order to improve the software testing process, fault prediction methods identify the software parts that are most likely to be defect-prone. This paper proposes a prediction approach based on a conventional radial basis function neural network (RBFNN) and the novel adaptive dimensional biogeography based optimization (ADBBO) model. The developed ADBBO-based RBFNN model is tested with five publicly available datasets from the NASA data program repository. The computed results prove the effectiveness of the proposed ADBBO-RBFNN classifier approach with respect to the considered metrics in comparison with that of the early predictors available in the literature for the same datasets. PMID:27738649
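
    A minimal RBF-network classifier in the spirit of this approach is sketched below: K-means supplies the RBF centers (standing in for the ADBBO tuning, which is not reproduced here) and a least-squares linear readout maps RBF activations to defect labels. The data are synthetic stand-ins for the NASA datasets.

    ```python
    # RBF network: K-means centers + Gaussian features + linear readout (illustrative).
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))                    # stand-in for module metrics
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)  # stand-in defect labels

    centers = KMeans(n_clusters=10, n_init=10, random_state=0).fit(X).cluster_centers_
    sigma = 1.0  # assumed common RBF width (the paper optimises such parameters)

    def rbf_features(X):
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-d2 / (2.0 * sigma**2))

    H = rbf_features(X)
    w, *_ = np.linalg.lstsq(H, y, rcond=None)  # linear output layer
    pred = (H @ w > 0.5).astype(float)
    print("training accuracy:", (pred == y).mean())
    ```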

  13. Cost-Sensitive Radial Basis Function Neural Network Classifier for Software Defect Prediction.

    PubMed

    Kumudha, P; Venkatesan, R

    Effective prediction of defect-prone software modules enables software developers to allocate resources efficiently and to concentrate on quality assurance activities. The software development life cycle basically includes design, analysis, implementation, testing, and release phases. Software testing is a critical task in the software development process: detecting defects early saves time and budget, so this testing phase should be operated carefully and effectively in order to release a defect-free (bug-free) software product to the customers. In order to improve the software testing process, fault prediction methods identify the software parts that are more likely to be defect-prone. This paper proposes a prediction approach based on a conventional radial basis function neural network (RBFNN) and the novel adaptive dimensional biogeography based optimization (ADBBO) model. The developed ADBBO-based RBFNN model is tested with five publicly available datasets from the NASA data program repository. The computed results prove the effectiveness of the proposed ADBBO-RBFNN classifier approach with respect to the considered metrics in comparison with the earlier predictors available in the literature for the same datasets.
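
    The conventional RBFNN that the approach builds on is compact enough to sketch directly: hidden-layer centers, Gaussian activations, and a least-squares output layer. The Python sketch below is illustrative only, with k-means standing in for the paper's ADBBO tuning of centers and widths.

```python
# Minimal conventional RBFNN classifier: k-means picks the hidden-layer
# centers, Gaussian activations feed a least-squares output layer.
# Illustrative sketch only; the paper tunes centers/widths with ADBBO.
import numpy as np
from sklearn.cluster import KMeans

class RBFNN:
    def __init__(self, n_centers=10, width=1.0):
        self.n_centers, self.width = n_centers, width

    def _hidden(self, X):
        # Gaussian activation for every (sample, center) pair
        d2 = ((X[:, None, :] - self.centers[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * self.width ** 2))

    def fit(self, X, y):
        self.centers = KMeans(self.n_centers, n_init=10).fit(X).cluster_centers_
        T = np.eye(int(y.max()) + 1)[y]          # one-hot targets
        self.W, *_ = np.linalg.lstsq(self._hidden(X), T, rcond=None)
        return self

    def predict(self, X):
        return (self._hidden(X) @ self.W).argmax(axis=1)
```

    Trained on a defect dataset, predict() returns 0/1 defect labels; a cost-sensitive variant would reweight the targets to penalise missed defects more heavily.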

  14. HEP Software Foundation Community White Paper Working Group - Data Analysis and Interpretation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bauerdick, Lothar

    At the heart of experimental high energy physics (HEP) is the development of facilities and instrumentation that provide sensitivity to new phenomena. Our understanding of nature at its most fundamental level is advanced through the analysis and interpretation of data from sophisticated detectors in HEP experiments. The goal of data analysis systems is to realize the maximum possible scientific potential of the data within the constraints of computing and human resources in the least time. To achieve this goal, future analysis systems should empower physicists to access the data with a high level of interactivity, reproducibility and throughput capability. As part of the HEP Software Foundation Community White Paper process, a working group on Data Analysis and Interpretation was formed to assess the challenges and opportunities in HEP data analysis and develop a roadmap for activities in this area over the next decade. In this report, the key findings and recommendations of the Data Analysis and Interpretation Working Group are presented.

  15. Earth Science Informatics Community Requirements for Improving Sustainable Science Software Practices: User Perspectives and Implications for Organizational Action

    NASA Astrophysics Data System (ADS)

    Downs, R. R.; Lenhardt, W. C.; Robinson, E.

    2014-12-01

    Science software is integral to the scientific process and must be developed and managed in a sustainable manner to ensure future access to scientific data and related resources. Organizations that are part of the scientific enterprise, as well as members of the scientific community who work within these entities, can contribute to the sustainability of science software and to practices that improve scientific community capabilities for science software sustainability. As science becomes increasingly digital and, therefore, dependent on software, improving community practices for sustainable science software will contribute to the sustainability of science. Members of the Earth science informatics community, including scientific data producers and distributors, end-user scientists, system and application developers, and data center managers, use science software regularly and face the challenges and the opportunities that science software presents for the sustainability of science. To gain insight on the practices needed for the sustainability of science software from the science software experiences of the Earth science informatics community, an interdisciplinary group of 300 community members was asked to engage in simultaneous roundtable discussions and report on their answers to questions about the requirements for improving scientific software sustainability. This paper presents an analysis of the issues reported and the conclusions offered by the participants. These results provide perspectives for science software sustainability practices and have implications for actions that organizations and their leadership can initiate to improve the sustainability of science software.

  16. Bayesian inference for psychology. Part II: Example applications with JASP.

    PubMed

    Wagenmakers, Eric-Jan; Love, Jonathon; Marsman, Maarten; Jamil, Tahira; Ly, Alexander; Verhagen, Josine; Selker, Ravi; Gronau, Quentin F; Dropmann, Damian; Boutin, Bruno; Meerhoff, Frans; Knight, Patrick; Raj, Akash; van Kesteren, Erik-Jan; van Doorn, Johnny; Šmíra, Martin; Epskamp, Sacha; Etz, Alexander; Matzke, Dora; de Jong, Tim; van den Bergh, Don; Sarafoglou, Alexandra; Steingroever, Helen; Derks, Koen; Rouder, Jeffrey N; Morey, Richard D

    2018-02-01

    Bayesian hypothesis testing presents an attractive alternative to p value hypothesis testing. Part I of this series outlined several advantages of Bayesian hypothesis testing, including the ability to quantify evidence and the ability to monitor and update this evidence as data come in, without the need to know the intention with which the data were collected. Despite these and other practical advantages, Bayesian hypothesis tests are still reported relatively rarely. An important impediment to the widespread adoption of Bayesian tests is arguably the lack of user-friendly software for the run-of-the-mill statistical problems that confront psychologists for the analysis of almost every experiment: the t-test, ANOVA, correlation, regression, and contingency tables. In Part II of this series we introduce JASP ( http://www.jasp-stats.org ), an open-source, cross-platform, user-friendly graphical software package that allows users to carry out Bayesian hypothesis tests for standard statistical problems. JASP is based in part on the Bayesian analyses implemented in Morey and Rouder's BayesFactor package for R. Armed with JASP, the practical advantages of Bayesian hypothesis testing are only a mouse click away.
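
    The t-test Bayes factors JASP reports follow the Jeffreys-Zellner-Siow (JZS) formulation of Rouder and colleagues, which reduces to a one-dimensional integral. A minimal Python sketch of the one-sample case is given below, assuming a unit-scale Cauchy prior on effect size for simplicity; it is not JASP's or BayesFactor's code, and JASP's default prior scale differs.

```python
# One-sample JZS Bayes factor (Rouder et al., 2009) with a unit-scale
# Cauchy prior on effect size; a sketch of the quantity JASP reports,
# not JASP's implementation.
import numpy as np
from scipy import integrate

def jzs_bf10(t, n):
    v = n - 1                                     # degrees of freedom
    def integrand(g):                             # g ~ InverseGamma(1/2, 1/2)
        return ((1 + n * g) ** -0.5
                * (1 + t ** 2 / ((1 + n * g) * v)) ** (-(v + 1) / 2)
                * (2 * np.pi) ** -0.5 * g ** -1.5 * np.exp(-1 / (2 * g)))
    m1, _ = integrate.quad(integrand, 0, np.inf)  # marginal under H1
    m0 = (1 + t ** 2 / v) ** (-(v + 1) / 2)       # likelihood under H0
    return m1 / m0

print(jzs_bf10(t=2.5, n=30))                      # BF10 > 1 favours H1
```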

  17. VOIP for Telerehabilitation: A Risk Analysis for Privacy, Security and HIPAA Compliance: Part II

    PubMed Central

    Watzlaf, Valerie J.M.; Moeini, Sohrab; Matusow, Laura; Firouzan, Patti

    2011-01-01

    In a previous publication the authors developed a privacy and security checklist to evaluate Voice over Internet Protocol (VoIP) videoconferencing software used between patients and therapists to provide telerehabilitation (TR) therapy. In this paper, the previously developed privacy and security checklist is used to perform a risk analysis of the top ten VoIP videoconferencing software products, to determine whether their policies provide answers to the privacy and security checklist. Sixty percent of the companies claimed that they do not listen in on video-therapy calls unless maintenance is needed. Only 50% of the companies assessed use some form of encryption, and some did not specify what type of encryption was used. Seventy percent of the companies assessed did not specify any form of auditing on their servers. Statistically significant differences across company websites were found for sharing information outside of the country (p=0.010), encryption (p=0.006), and security evaluation (p=0.005). Healthcare providers considering use of VoIP software for TR services may consider using this privacy and security checklist before deciding to incorporate a VoIP software system for TR. Other videoconferencing software that is specific for TR, with strong encryption, good access controls, and hardware that meets privacy and security standards, should be considered for use with TR. PMID:25945177

  18. VOIP for Telerehabilitation: A Risk Analysis for Privacy, Security and HIPAA Compliance: Part II.

    PubMed

    Watzlaf, Valerie J M; Moeini, Sohrab; Matusow, Laura; Firouzan, Patti

    2011-01-01

    In a previous publication the authors developed a privacy and security checklist to evaluate Voice over Internet Protocol (VoIP) videoconferencing software used between patients and therapists to provide telerehabilitation (TR) therapy. In this paper, the previously developed privacy and security checklist is used to perform a risk analysis of the top ten VoIP videoconferencing software products, to determine whether their policies provide answers to the privacy and security checklist. Sixty percent of the companies claimed that they do not listen in on video-therapy calls unless maintenance is needed. Only 50% of the companies assessed use some form of encryption, and some did not specify what type of encryption was used. Seventy percent of the companies assessed did not specify any form of auditing on their servers. Statistically significant differences across company websites were found for sharing information outside of the country (p=0.010), encryption (p=0.006), and security evaluation (p=0.005). Healthcare providers considering use of VoIP software for TR services may consider using this privacy and security checklist before deciding to incorporate a VoIP software system for TR. Other videoconferencing software that is specific for TR, with strong encryption, good access controls, and hardware that meets privacy and security standards, should be considered for use with TR.

  19. Personnel Training--Secondary Vocational Agriculture Teacher Education.

    ERIC Educational Resources Information Center

    Brown, Herman D.; And Others

    This document consists of three parts. The first part is a report on a project conducted to develop computer software needed by vocational agriculture teachers in Texas. The report details the process used to assess and develop software, and provides guidelines that can be used by others in evaluating computer software for vocational agriculture…

  20. Optimisation of process parameters on thin shell part using response surface methodology (RSM)

    NASA Astrophysics Data System (ADS)

    Faiz, J. M.; Shayfull, Z.; Nasir, S. M.; Fathullah, M.; Rashidi, M. M.

    2017-09-01

    This study focuses on the optimisation of process parameters by simulation using Autodesk Moldflow Insight (AMI) software. The process parameters are taken as the input in order to analyse the warpage value, which is the output of this study. The significant parameters considered are melt temperature, mould temperature, packing pressure, and cooling time. A plastic part made of Polypropylene (PP) was selected as the study part. Optimisation of the process parameters is performed in Design Expert software with the aim of minimising the obtained warpage value. Response Surface Methodology (RSM) is applied in this study together with Analysis of Variance (ANOVA) in order to investigate the interactions between the parameters that are significant to the warpage value. The optimised warpage value can thus be obtained from the model designed using RSM, owing to its minimal error value. The study shows that the warpage value is improved by using RSM.
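
    The RSM workflow described here (fit a second-order polynomial to designed runs, then minimise the predicted response) condenses to a few lines. In the sketch below the design points and warpage values are hypothetical placeholders rather than the study's Moldflow results, and only two of the four parameters are shown.

```python
# Fit a second-order response surface for warpage over two coded
# parameters and minimise the prediction. Design points and warpage
# values are hypothetical placeholders, not the study's AMI results.
import numpy as np
from scipy.optimize import minimize

X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1], [-1, 0],
              [1, 0], [0, -1], [0, 1], [0, 0]], float)   # coded settings
y = np.array([0.52, 0.44, 0.47, 0.41, 0.50, 0.43, 0.48, 0.45, 0.39])

def design(x):
    x1, x2 = x[..., 0], x[..., 1]
    return np.stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2], -1)

beta, *_ = np.linalg.lstsq(design(X), y, rcond=None)     # quadratic model
res = minimize(lambda x: float(design(np.asarray(x)) @ beta),
               x0=[0.0, 0.0], bounds=[(-1, 1), (-1, 1)])
print("optimal coded settings:", res.x, "predicted warpage:", res.fun)
```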

  1. Development and use of mathematical models and software frameworks for integrated analysis of agricultural systems and associated water use impacts

    USGS Publications Warehouse

    Fowler, K. R.; Jenkins, E.W.; Parno, M.; Chrispell, J.C.; Colón, A. I.; Hanson, Randall T.

    2016-01-01

    The development of appropriate water management strategies requires, in part, a methodology for quantifying and evaluating the impact of water policy decisions on regional stakeholders. In this work, we describe the framework we are developing to enhance the body of resources available to policy makers, farmers, and other community members in their efforts to understand, quantify, and assess the often competing objectives water consumers have with respect to usage. The foundation for the framework is the construction of a simulation-based optimization software tool using two existing software packages. In particular, we couple a robust optimization software suite (DAKOTA) with the USGS MF-OWHM water management simulation tool to provide a flexible software environment that will enable the evaluation of one or multiple (possibly competing) user-defined (or stakeholder) objectives. We introduce the individual software components and outline the communication strategy we defined for the coupled development. We present numerical results for case studies related to crop portfolio management with several defined objectives. The objectives are not optimally satisfied for any single user class, demonstrating the capability of the software tool to aid in the evaluation of a variety of competing interests.
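
    The coupling pattern is easy to see in miniature: an optimiser repeatedly drives a black-box simulation through a plain function interface. In the hedged sketch below, a toy two-crop profit model stands in for an MF-OWHM run and SciPy's differential evolution stands in for DAKOTA; all names and numbers are illustrative.

```python
# Simulation-based optimisation in miniature: the optimiser repeatedly
# calls a black-box "simulation" through a plain function interface,
# the same pattern used to couple DAKOTA to MF-OWHM. The toy two-crop
# model below merely stands in for a groundwater simulation run.
import numpy as np
from scipy.optimize import differential_evolution

def simulate(acreage):
    """Stand-in for one model run: returns (profit, water use)."""
    profit = 120 * acreage[0] + 90 * acreage[1] - 0.4 * acreage.sum() ** 2
    water = 3.2 * acreage[0] + 1.8 * acreage[1]
    return profit, water

def objective(x):
    profit, water = simulate(np.asarray(x))
    penalty = max(0.0, water - 400.0) * 50.0     # soft cap on water use
    return -profit + penalty                     # maximise penalised profit

result = differential_evolution(objective, bounds=[(0, 120), (0, 120)], seed=1)
print("acreages:", result.x, "profit:", -result.fun)
```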

  2. Dynamic Numerical Analysis of Steel Footbridge

    NASA Astrophysics Data System (ADS)

    Major, Maciej; Minda, Izabela; Major, Izabela

    2017-06-01

    The study presents a numerical analysis of an arched footbridge designed in two variants, made of steel and aluminium. The first part presents the criteria for evaluating the comfort of use of footbridges. The study examined an arched footbridge with a span of 24 m in the axis and a width of 1.4 m. The arch geometry was formed as a segment of a circle with radius r = 20 m, cut off by a chord with length equal to the calculation length of the girders. The model of the analysed footbridge was subjected to the dynamic effects of wind and of pedestrian traffic with variable flexibility. The analyses used Robot Structural Analysis software.

  3. The jmzQuantML programming interface and validator for the mzQuantML data standard.

    PubMed

    Qi, Da; Krishna, Ritesh; Jones, Andrew R

    2014-03-01

    The mzQuantML standard from the HUPO Proteomics Standards Initiative has recently been released, capturing quantitative data about peptides and proteins, following analysis of MS data. We present a Java application programming interface (API) for mzQuantML called jmzQuantML. The API provides robust bridges between Java classes and elements in mzQuantML files and allows random access to any part of the file. The API provides read and write capabilities, and is designed to be embedded in other software packages, enabling mzQuantML support to be added to proteomics software tools (http://code.google.com/p/jmzquantml/). The mzQuantML standard is designed around a multilevel validation system to ensure that files are structurally and semantically correct for different proteomics quantitative techniques. In this article, we also describe a Java software tool (http://code.google.com/p/mzquantml-validator/) for validating mzQuantML files, which is a formal part of the data standard. © 2014 The Authors. Proteomics published by Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Bringing Legacy Visualization Software to Modern Computing Devices via Application Streaming

    NASA Astrophysics Data System (ADS)

    Fisher, Ward

    2014-05-01

    Planning software compatibility across forthcoming generations of computing platforms is a problem commonly encountered in software engineering and development. While this problem can affect any class of software, data analysis and visualization programs are particularly vulnerable. This is due in part to their inherent dependency on specialized hardware and computing environments. A number of strategies and tools have been designed to aid software engineers with this task. While generally embraced by developers at 'traditional' software companies, these methodologies are often dismissed by the scientific software community as unwieldy, inefficient and unnecessary. As a result, many important and storied scientific software packages can struggle to adapt to a new computing environment; for example, one in which much work is carried out on sub-laptop devices (such as tablets and smartphones). Rewriting these packages for a new platform often requires significant investment in terms of development time and developer expertise. In many cases, porting older software to modern devices is neither practical nor possible. As a result, replacement software must be developed from scratch, wasting resources better spent on other projects. Enabled largely by the rapid rise and adoption of cloud computing platforms, 'Application Streaming' technologies allow legacy visualization and analysis software to be operated wholly from a client device (be it laptop, tablet or smartphone) while retaining full functionality and interactivity. It mitigates much of the developer effort required by other more traditional methods while simultaneously reducing the time it takes to bring the software to a new platform. This work will provide an overview of Application Streaming and how it compares against other technologies which allow scientific visualization software to be executed from a remote computer. We will discuss the functionality and limitations of existing application streaming frameworks and how a developer might prepare their software for application streaming. We will also examine the secondary benefits realized by moving legacy software to the cloud. Finally, we will examine the process by which a legacy Java application, the Integrated Data Viewer (IDV), is to be adapted for tablet computing via Application Streaming.

  5. Fingerprint analysis of polysaccharides from different Ganoderma by HPLC combined with chemometrics methods.

    PubMed

    Sun, Xiaomei; Wang, Haohao; Han, Xiaofeng; Chen, Shangwei; Zhu, Song; Dai, Jun

    2014-12-19

    A fingerprint analysis method has been developed for the characterization and discrimination of polysaccharides from different Ganoderma by high performance liquid chromatography (HPLC) coupled with chemometric methods. The polysaccharides were extracted under ultrasonic-assisted conditions and then partly hydrolyzed with trifluoroacetic acid. Monosaccharides and oligosaccharides in the hydrolyzates were subjected to pre-column derivatization with 1-phenyl-3-methyl-5-pyrazolone and HPLC analysis, generating unique fingerprint information related to the chemical composition and structure of the polysaccharides. The peak data were imported into professional software in order to obtain standard fingerprint profiles and evaluate the similarity of different samples. Meanwhile, the data were further processed by hierarchical cluster analysis and principal component analysis. Polysaccharides from different parts or species of Ganoderma, or polysaccharides from the same parts of Ganoderma but from different geographical regions or different strains, could be differentiated clearly. This fingerprint analysis method can be applied to the identification and quality control of different Ganoderma and their products. Copyright © 2014 Elsevier Ltd. All rights reserved.
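
    The chemometric post-processing named here, hierarchical cluster analysis and principal component analysis over a samples-by-peaks matrix, follows a standard recipe. A minimal Python sketch, with random numbers standing in for the HPLC peak areas:

```python
# PCA scores and Ward hierarchical clustering on a samples-by-peaks
# matrix of fingerprint peak areas; random numbers stand in for the
# HPLC data, so the grouping here is illustrative only.
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
peaks = rng.normal(size=(12, 20))        # 12 samples x 20 common peaks
peaks -= peaks.mean(axis=0)              # centre each peak column

scores = PCA(n_components=2).fit_transform(peaks)    # PCA score coordinates
groups = fcluster(linkage(peaks, method="ward"), t=3, criterion="maxclust")
print(scores[:3], groups)
```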

  6. Integrated learning in practical machine element design course: a case study of V-pulley design

    NASA Astrophysics Data System (ADS)

    Tantrabandit, Manop

    2014-06-01

    To achieve effective integrated learning in a Machine Element Design course, it is important to bridge the basic knowledge and skills of element design. The multiple-core learning pathway consists of two main parts. The first part involves teaching documents whose contents are the V-groove formulae, the standard for V-grooved pulleys, and the parallel key dimension formulae. The second part relates to the subjects that the students have studied prior to participating in this integrated learning course, namely Material Selection, Manufacturing Process, Applied Engineering Drawing, and CAD (Computer Aided Design) animation software. Moreover, intensive cooperation between the lecturer and the students is another key factor in the success of integrated learning. Last but not least, the students need to share their knowledge within their group and among the other groups, aiming to gain knowledge of and skills in 1) the application of CAD software to build up manufacture part drawings, 2) assembly drawing, 3) simulation to verify the strength of the loaded pulley by the method of Finite Element Analysis (FEA), 4) the software to create an animation of mounting and dismounting a pulley on a shaft, and 5) an instruction manual. As the end product of this integrated learning, using the knowledge and skills 1 to 5 above, the participating students can create an assembly derived from manufacture part drawings and a video presentation with bilingual (English-Thai) audio description of a V-pulley with datum diameter of 250 mm, 4 grooves, and groove type SPA.

  7. Software analysis handbook: Software complexity analysis and software reliability estimation and prediction

    NASA Technical Reports Server (NTRS)

    Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron

    1994-01-01

    This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.
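
    Reliability estimation of this kind typically fits a reliability growth model to observed failure data. As one common example (not necessarily either of the two JSC tools), a Goel-Okumoto mean-value function can be fitted to hypothetical cumulative failure counts:

```python
# Goel-Okumoto reliability growth sketch: fit the mean-value function
# m(t) = a * (1 - exp(-b t)) to cumulative failure counts and report
# the current failure intensity a*b*exp(-b t). Counts are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

t = np.arange(1.0, 11.0)                                  # test weeks
failures = np.array([12, 21, 28, 33, 37, 40, 42, 44, 45, 46], float)

def mean_value(t, a, b):
    return a * (1 - np.exp(-b * t))

(a, b), _ = curve_fit(mean_value, t, failures, p0=[50.0, 0.3])
intensity = a * b * np.exp(-b * t[-1])
print(f"expected total faults a={a:.1f}, current intensity={intensity:.2f}/week")
```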

  8. Providing Access to CD-ROM Databases in a Campus Setting. Part II: Networking CD-ROMs via a LAN.

    ERIC Educational Resources Information Center

    Koren, Judy

    1992-01-01

    The second part of a report on CD-ROM networking in libraries describes LAN (local area network) technology; networking software and towers; gateway software for connecting to campuswide networks; Macintosh LANs; and network licenses. Several product and software reviews are included, and a sidebar lists vendor addresses. (NRP)

  9. 15 CFR Supplement No. 2 to Part 774 - General Technology and Software Notes

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 15 Commerce and Foreign Trade 2 (2010-01-01): General Technology and Software Notes... REGULATIONS, THE COMMERCE CONTROL LIST, Pt. 774, Supp. 2. Supplement No. 2 to Part 774—General Technology and Software Notes. 1. General Technology Note. The export of “technology” that is “required” for the...

  10. Design of Student Information Management Database Application System for Office and Departmental Target Responsibility System

    NASA Astrophysics Data System (ADS)

    Zhou, Hui

    Implementing an office and departmental target responsibility system is an inevitable outcome of higher education reform, and statistical processing of student information is an important part of student performance review under such a system. On the basis of an analysis of student evaluation, a student information management database application system is designed in this paper using relational database management system software. To implement the student information management function, the functional requirements, overall structure, data sheets and fields, data sheet associations, and software code are designed in detail.
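
    The kind of design described, tables, fields, and associations in a relational DBMS queried for performance statistics, can be illustrated with a minimal sketch; the schema and data below are hypothetical, not the paper's:

```python
# Minimal relational sketch: a student table and a per-term evaluation
# table joined for review statistics. Schema and data are illustrative.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE student (id INTEGER PRIMARY KEY, name TEXT, department TEXT);
CREATE TABLE evaluation (
    student_id INTEGER REFERENCES student(id),
    term TEXT, score REAL);
""")
con.executemany("INSERT INTO student VALUES (?, ?, ?)",
                [(1, "Li", "Physics"), (2, "Wang", "Physics")])
con.executemany("INSERT INTO evaluation VALUES (?, ?, ?)",
                [(1, "2010-1", 88.0), (2, "2010-1", 92.5)])
for row in con.execute("""SELECT department, AVG(score)
                          FROM student JOIN evaluation ON id = student_id
                          GROUP BY department"""):
    print(row)          # -> ('Physics', 90.25)
```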

  11. Finite element for rotor/stator interactive forces in general engine dynamic simulation. Part 1: Development of bearing damper element

    NASA Technical Reports Server (NTRS)

    Adams, M. L.; Padovan, J.; Fertis, D. G.

    1980-01-01

    A general purpose squeeze-film damper interactive force element was developed, coded into a software package (module), and debugged. This software package was applied to nonlinear dynamic analyses of some simple rotor systems. Results for pressure distributions show that the long bearing (end sealed) is, as expected, a stronger bearing than the short bearing. Results of the nonlinear dynamic analysis, using a four-degree-of-freedom simulation model, showed that the orbit of the rotating shaft grows nonlinearly to fill the bearing clearance as the unbalanced weight increases.

  12. Towards a high performance geometry library for particle-detector simulations

    DOE PAGES

    Apostolakis, J.; Bandieramonte, M.; Bitzes, G.; ...

    2015-05-22

    Thread-parallelization and single-instruction multiple data (SIMD) “vectorisation” of software components in HEP computing has become a necessity to fully benefit from current and future computing hardware. In this context, the Geant-Vector/GPU simulation project aims to re-engineer current software for the simulation of the passage of particles through detectors in order to increase the overall event throughput. As one of the core modules in this area, the geometry library plays a central role and vectorising its algorithms will be one of the cornerstones towards achieving good CPU performance. Here, we report on the progress made in vectorising the shape primitives, as well as in applying new C++ template based optimizations of existing code available in the Geant4, ROOT or USolids geometry libraries. We will focus on a presentation of our software development approach that aims to provide optimized code for all use cases of the library (e.g., single particle and many-particle APIs) and to support different architectures (CPU and GPU) while keeping the code base small, manageable and maintainable. We report on a generic and templated C++ geometry library as a continuation of the AIDA USolids project. As a result, the experience gained with these developments will be beneficial to other parts of the simulation software, such as for the optimization of the physics library, and possibly to other parts of the experiment software stack, such as reconstruction and analysis.

  13. Towards a high performance geometry library for particle-detector simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Apostolakis, J.; Bandieramonte, M.; Bitzes, G.

    Thread-parallelization and single-instruction multiple data (SIMD) “vectorisation” of software components in HEP computing has become a necessity to fully benefit from current and future computing hardware. In this context, the Geant-Vector/GPU simulation project aims to re-engineer current software for the simulation of the passage of particles through detectors in order to increase the overall event throughput. As one of the core modules in this area, the geometry library plays a central role and vectorising its algorithms will be one of the cornerstones towards achieving good CPU performance. Here, we report on the progress made in vectorising the shape primitives, as well as in applying new C++ template based optimizations of existing code available in the Geant4, ROOT or USolids geometry libraries. We will focus on a presentation of our software development approach that aims to provide optimized code for all use cases of the library (e.g., single particle and many-particle APIs) and to support different architectures (CPU and GPU) while keeping the code base small, manageable and maintainable. We report on a generic and templated C++ geometry library as a continuation of the AIDA USolids project. As a result, the experience gained with these developments will be beneficial to other parts of the simulation software, such as for the optimization of the physics library, and possibly to other parts of the experiment software stack, such as reconstruction and analysis.
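
    The single-particle versus many-particle API distinction mentioned above is the heart of the vectorisation effort. A hedged sketch in Python/NumPy (the actual library is templated C++) contrasts the two calling conventions for one shape primitive:

```python
# Single-particle versus many-particle APIs for one shape primitive,
# sketched with NumPy (the real library is templated C++ with SIMD):
# the batched form amortises call overhead and maps onto vector units.
import numpy as np

def distance_to_sphere(p, center, radius):
    """Single-particle API: one point per call."""
    return np.linalg.norm(p - center) - radius

def distance_to_sphere_batch(points, center, radius):
    """Many-particle API: one call over an (N, 3) batch of points."""
    return np.linalg.norm(points - center, axis=1) - radius

pts = np.random.default_rng(1).normal(size=(1_000_000, 3))
d = distance_to_sphere_batch(pts, center=np.zeros(3), radius=1.0)
print(d[:4])
```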

  14. PLUS: open-source toolkit for ultrasound-guided intervention systems.

    PubMed

    Lasso, Andras; Heffter, Tamas; Rankin, Adam; Pinter, Csaba; Ungi, Tamas; Fichtinger, Gabor

    2014-10-01

    A variety of advanced image analysis methods have been under development for ultrasound-guided interventions. Unfortunately, the transition from an image analysis algorithm to clinical feasibility trials as part of an intervention system requires integration of many components, such as imaging and tracking devices, data processing algorithms, and visualization software. The objective of our paper is to provide a freely available open-source software platform, PLUS (Public software Library for Ultrasound), to facilitate rapid prototyping of ultrasound-guided intervention systems for translational clinical research. PLUS provides a variety of methods for interventional tool pose and ultrasound image acquisition from a wide range of tracking and imaging devices, spatial and temporal calibration, volume reconstruction, simulated image generation, and recording and live streaming of the acquired data. This paper introduces PLUS, explains its functionality and architecture, and presents typical uses and performance in ultrasound-guided intervention systems. PLUS fulfills the essential requirements for the development of ultrasound-guided intervention systems and it aspires to become a widely used translational research prototyping platform. PLUS is freely available as open source software under the BSD license and can be downloaded from http://www.plustoolkit.org.

  15. Improving the strength of additively manufactured objects via modified interior structure

    NASA Astrophysics Data System (ADS)

    Al, Can Mert; Yaman, Ulas

    2017-10-01

    Additive manufacturing (AM), in other words 3D printing, is becoming more common because of its crucial advantages, such as geometric complexity and functional interior structures, over traditional manufacturing methods. Fused Filament Fabrication (FFF) 3D printing in particular is frequently used, since desktop variants of these printers suit many different fields and are improving rapidly. Despite the significant advantages of AM, the strength of parts fabricated with AM is still a major problem, especially when plastic materials such as Acrylonitrile butadiene styrene (ABS), Polylactic acid (PLA), Nylon, etc., are utilized. In this study, an alternative method is proposed in which the strength of AM-fabricated parts is improved employing a direct slicing approach. Traditional Computer Aided Manufacturing (CAM) software of 3D printers takes only the geometry as an input, in triangular mesh form (stereolithography, STL file), generated by Computer Aided Design software. This file format includes data only about the outer boundaries of the geometry. The interior of the artifacts is manufactured with homogeneous infill patterns, such as diagonal, honeycomb, linear, etc., according to the paths generated in the CAM software. The method developed within this study provides a way to fabricate parts with heterogeneous infill patterns by utilizing the stress field data obtained from Finite Element Analysis software, such as ABAQUS. According to the performed tensile tests, the strength of the test specimen is improved by about 45% compared to the conventional way of 3D printing.
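
    The core idea, turning an FEA stress field into a spatially varying infill density at slicing time, reduces to a normalisation and a bounded mapping. A minimal sketch with placeholder stress values, not the authors' pipeline:

```python
# Stress-driven infill in miniature: normalise a per-region von Mises
# stress field from an FEA run and map it to bounded infill densities.
# Stress values and bounds are placeholders, not the authors' data.
import numpy as np

stress = np.array([12.0, 48.0, 95.0, 30.0, 70.0])   # MPa per region
min_fill, max_fill = 0.15, 0.85                     # printable density bounds

s = (stress - stress.min()) / (stress.max() - stress.min())
infill = min_fill + s * (max_fill - min_fill)       # dense where stressed
print({f"region {i}": round(f, 2) for i, f in enumerate(infill)})
```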

  16. 77 FR 65550 - Compete, Inc.; Analysis of Proposed Consent Order To Aid Public Comment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-29

    ... Public Reference Room, Room 130-H, 600 Pennsylvania Avenue NW., Washington, DC 20580, either in person or... interested persons. Comments received during this period will become part of the public record. After thirty... proposed order. Compete develops software for tracking consumers as they shop, browse and interact with...

  17. Technological Tools in the Introductory Statistics Classroom: Effects on Student Understanding of Inferential Statistics

    ERIC Educational Resources Information Center

    Meletiou-Mavrotheris, Maria

    2004-01-01

    While technology has become an integral part of introductory statistics courses, the programs typically employed are professional packages designed primarily for data analysis rather than for learning. Findings from several studies suggest that use of such software in the introductory statistics classroom may not be very effective in helping…

  18. 48 CFR 208.7400 - Scope of subpart.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... OF DEFENSE ACQUISITION PLANNING REQUIRED SOURCES OF SUPPLIES AND SERVICES Enterprise Software... commercial software and software maintenance, including software and software maintenance that is acquired— (a) As part of a system or system upgrade, where practicable; (b) Under a service contract; (c) Under...

  19. Advanced Information Processing System (AIPS)-based fault tolerant avionics architecture for launch vehicles

    NASA Technical Reports Server (NTRS)

    Lala, Jaynarayan H.; Harper, Richard E.; Jaskowiak, Kenneth R.; Rosch, Gene; Alger, Linda S.; Schor, Andrei L.

    1990-01-01

    An avionics architecture for the advanced launch system (ALS) that uses validated hardware and software building blocks developed under the advanced information processing system program is presented. The AIPS for ALS architecture defined is preliminary, and reliability requirements can be met by the AIPS hardware and software building blocks that are built using the state-of-the-art technology available in the 1992-93 time frame. The level of detail in the architecture definition reflects the level of detail available in the ALS requirements. As the avionics requirements are refined, the architecture can also be refined and defined in greater detail with the help of analysis and simulation tools. A useful methodology is demonstrated for investigating the impact of the avionics suite on the recurring cost of the ALS. It is shown that allowing the vehicle to launch with selected detected failures can potentially reduce the recurring launch costs. A comparative analysis shows that validated fault-tolerant avionics built out of Class B parts can result in lower life-cycle cost in comparison to simplex avionics built out of Class S parts or other redundant architectures.

  20. Supporting secure programming in web applications through interactive static analysis.

    PubMed

    Zhu, Jun; Xie, Jing; Lipford, Heather Richter; Chu, Bill

    2014-07-01

    Many security incidents are caused by software developers' failure to adhere to secure programming practices. Static analysis tools have been used to detect software vulnerabilities, but their wide usage by developers is limited by the special training required to write rules customized to application-specific logic. Our approach, interactive static analysis, integrates static analysis into the Integrated Development Environment (IDE) and provides in-situ secure programming support to help developers prevent vulnerabilities during code construction. No additional training is required, nor are there any assumptions on the ways programs are built. Our work is motivated in part by the observation that many vulnerabilities are introduced due to failure to practice secure programming by knowledgeable developers. We implemented a prototype interactive static analysis tool as a plug-in for Java in Eclipse. Our technical evaluation of our prototype detected multiple zero-day vulnerabilities in a large open source project. Our evaluations also suggest that false positives may be limited to a very small class of use cases.

  1. Supporting secure programming in web applications through interactive static analysis

    PubMed Central

    Zhu, Jun; Xie, Jing; Lipford, Heather Richter; Chu, Bill

    2013-01-01

    Many security incidents are caused by software developers’ failure to adhere to secure programming practices. Static analysis tools have been used to detect software vulnerabilities, but their wide usage by developers is limited by the special training required to write rules customized to application-specific logic. Our approach, interactive static analysis, integrates static analysis into the Integrated Development Environment (IDE) and provides in-situ secure programming support to help developers prevent vulnerabilities during code construction. No additional training is required, nor are there any assumptions on the ways programs are built. Our work is motivated in part by the observation that many vulnerabilities are introduced due to failure to practice secure programming by knowledgeable developers. We implemented a prototype interactive static analysis tool as a plug-in for Java in Eclipse. Our technical evaluation of our prototype detected multiple zero-day vulnerabilities in a large open source project. Our evaluations also suggest that false positives may be limited to a very small class of use cases. PMID:25685513
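
    A toy version of such a rule conveys the flavour of what runs inside the IDE plug-in. The Python sketch below flags eval() calls on non-literal arguments by walking the abstract syntax tree; the authors' tool instead performs data-flow analysis for Java web applications:

```python
# Toy static-analysis rule in the spirit of the paper: walk a program's
# AST and flag eval() applied to a non-literal argument. The authors'
# tool performs real data-flow analysis for Java inside Eclipse.
import ast

SOURCE = """
user = input()
eval(user)          # should be flagged
eval("1 + 1")       # literal, not flagged
"""

class EvalOfNonLiteral(ast.NodeVisitor):
    def visit_Call(self, node):
        if (isinstance(node.func, ast.Name) and node.func.id == "eval"
                and node.args and not isinstance(node.args[0], ast.Constant)):
            print(f"line {node.lineno}: possible code injection via eval")
        self.generic_visit(node)

EvalOfNonLiteral().visit(ast.parse(SOURCE))
```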

  2. Operation and control software for APNEA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McClelland, J.H.; Storm, B.H. Jr.; Ahearn, J.

    1997-11-01

    The human interface software for the Lockheed Martin Specialty Components (LMSC) Active/Passive Neutron Examination & Analysis System (APNEA) provides a user-friendly operating environment for the movement and analysis of waste drums. It is written in Microsoft Visual C++ on a Windows NT platform. Object-oriented and multitasking techniques are used extensively to maximize the capability of the system. A waste drum is placed on a loading platform with a fork lift and then automatically moved into the APNEA chamber in preparation for analysis. A series of measurements is performed, controlled by menu commands to hardware components attached as peripheral devices, in order to create data files for analysis. The analysis routines use the files to identify the pertinent radioactive characteristics of the drum, including the type, location, and quantity of fissionable material. At the completion of the measurement process, the drum is automatically unloaded and the data are archived in preparation for storage as part of the drum's data signature. 3 figs.

  3. MOPEX: a software package for astronomical image processing and visualization

    NASA Astrophysics Data System (ADS)

    Makovoz, David; Roby, Trey; Khan, Iffat; Booth, Hartley

    2006-06-01

    We present MOPEX - a software package for astronomical image processing and display. The package is a combination of command-line driven image processing software written in C/C++ with a Java-based GUI. The main image processing capabilities include creating mosaic images, image registration, background matching, and point source extraction, as well as a number of minor image processing tasks. The combination of the image processing and display capabilities allows for a much more intuitive and efficient way of performing image processing. The GUI allows the control of the image processing and the display to be closely intertwined. Parameter setting, validation, and specific processing options are entered by the user through a set of intuitive dialog boxes. Visualization feeds back into further processing by providing prompt feedback on the processing results. The GUI also allows for further analysis by accessing and displaying data from existing image and catalog servers using a virtual observatory approach. Even though originally designed for the Spitzer Space Telescope mission, many functionalities are of general usefulness and can be used for working with existing astronomical data and for new missions. The software used in the package has undergone intensive testing and benefited greatly from effective software reuse. The visualization part has been used for observation planning for both the Spitzer and Herschel Space Telescopes as part of the tool Spot. The visualization capabilities of Spot have been enhanced and integrated with the image processing functionality of the command-line driven MOPEX. The image processing software is used in the Spitzer automated pipeline processing, which has been in operation for nearly 3 years. The image processing capabilities have also been tested in off-line processing by numerous astronomers at various institutions around the world. The package is multi-platform and includes automatic update capabilities. The software package has been developed by a small group of software developers and scientists at the Spitzer Science Center. It is available for distribution at the Spitzer Science Center web page.

  4. 15 CFR Supplement No. 1 to Part 734 - Questions and Answers-Technology and Software Subject to the EAR

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 15 Commerce and Foreign Trade 2 (2010-01-01): Questions and Answers-Technology and... Supplement No. 1 to Part 734—Questions and Answers—Technology and Software Subject to the EAR. This Supplement No. 1 contains explanatory questions and answers relating to technology and software that is subject...

  5. An experimental study of factors affecting the selective inhibition of sintering process

    NASA Astrophysics Data System (ADS)

    Asiabanpour, Bahram

    Selective Inhibition of Sintering (SIS) is a new rapid prototyping method that builds parts layer by layer. SIS works by joining powder particles through sintering in the part's body and by inhibiting sintering in selected powder areas. The objective of this research has been to improve the new SIS process, which was invented at USC. The process improvement is based on statistical design of experiments. To conduct the needed experiments, a working machine and related path generator software were required. The machine and its control software were available prior to this research; the path generator algorithms and software had to be created. This program obtains model geometry data from a CAD file and generates an appropriate path file for the printer nozzle. It also generates a simulation file for path file inspection using virtual prototyping. The activities related to the path generator constitute the first part of this research, which has resulted in an efficient path generator. In addition, to reach an acceptable level of accuracy, strength, and surface quality in the fabricated parts, all effective factors in the SIS process had to be identified and controlled. Simultaneous analytical and experimental studies were conducted to recognize the effective factors and to control the SIS process. It was found that polystyrene was the most appropriate polymer powder and saturated potassium iodide the most effective inhibitor among the available candidate materials. In addition, statistical tools were applied to improve the desirable properties of the parts fabricated by the SIS process. An investigation of part strength was conducted using Response Surface Methodology (RSM), and a region of acceptable operating conditions for part strength was found. Then, through analysis of the experimental results, the impact of the factors on the final part surface quality and dimensional accuracy was modeled. After developing a desirability function model, process operating conditions for maximum desirability were identified. Finally, the desirability model was validated.

  6. WGCNA: an R package for weighted correlation network analysis.

    PubMed

    Langfelder, Peter; Horvath, Steve

    2008-12-29

    Correlation networks are increasingly being used in bioinformatics applications. For example, weighted gene co-expression network analysis is a systems biology method for describing the correlation patterns among genes across microarray samples. Weighted correlation network analysis (WGCNA) can be used for finding clusters (modules) of highly correlated genes, for summarizing such clusters using the module eigengene or an intramodular hub gene, for relating modules to one another and to external sample traits (using eigengene network methodology), and for calculating module membership measures. Correlation networks facilitate network based gene screening methods that can be used to identify candidate biomarkers or therapeutic targets. These methods have been successfully applied in various biological contexts, e.g. cancer, mouse genetics, yeast genetics, and analysis of brain imaging data. While parts of the correlation network methodology have been described in separate publications, there is a need to provide a user-friendly, comprehensive, and consistent software implementation and an accompanying tutorial. The WGCNA R software package is a comprehensive collection of R functions for performing various aspects of weighted correlation network analysis. The package includes functions for network construction, module detection, gene selection, calculations of topological properties, data simulation, visualization, and interfacing with external software. Along with the R package we also present R software tutorials. While the methods development was motivated by gene expression data, the underlying data mining approach can be applied to a variety of different settings. The WGCNA package provides R functions for weighted correlation network analysis, e.g. co-expression network analysis of gene expression data. The R package along with its source code and additional material are freely available at http://www.genetics.ucla.edu/labs/horvath/CoexpressionNetwork/Rpackages/WGCNA.

  7. WGCNA: an R package for weighted correlation network analysis

    PubMed Central

    Langfelder, Peter; Horvath, Steve

    2008-01-01

    Background: Correlation networks are increasingly being used in bioinformatics applications. For example, weighted gene co-expression network analysis is a systems biology method for describing the correlation patterns among genes across microarray samples. Weighted correlation network analysis (WGCNA) can be used for finding clusters (modules) of highly correlated genes, for summarizing such clusters using the module eigengene or an intramodular hub gene, for relating modules to one another and to external sample traits (using eigengene network methodology), and for calculating module membership measures. Correlation networks facilitate network based gene screening methods that can be used to identify candidate biomarkers or therapeutic targets. These methods have been successfully applied in various biological contexts, e.g. cancer, mouse genetics, yeast genetics, and analysis of brain imaging data. While parts of the correlation network methodology have been described in separate publications, there is a need to provide a user-friendly, comprehensive, and consistent software implementation and an accompanying tutorial. Results: The WGCNA R software package is a comprehensive collection of R functions for performing various aspects of weighted correlation network analysis. The package includes functions for network construction, module detection, gene selection, calculations of topological properties, data simulation, visualization, and interfacing with external software. Along with the R package we also present R software tutorials. While the methods development was motivated by gene expression data, the underlying data mining approach can be applied to a variety of different settings. Conclusion: The WGCNA package provides R functions for weighted correlation network analysis, e.g. co-expression network analysis of gene expression data. The R package along with its source code and additional material are freely available at http://www.genetics.ucla.edu/labs/horvath/CoexpressionNetwork/Rpackages/WGCNA. PMID:19114008
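
    The core computation, a soft-thresholded correlation adjacency followed by clustering of a dissimilarity, can be sketched outside R as well. The Python rendering below uses random data in place of expression profiles; the package itself is the R implementation described above:

```python
# Core WGCNA-style computation sketched in NumPy (the package itself is
# an R implementation): soft-thresholded correlation adjacency, per-gene
# connectivity, and modules from average-linkage clustering.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(0)
expr = rng.normal(size=(50, 200))            # 50 samples x 200 genes (stand-in)

beta = 6                                     # soft-threshold power
adj = np.abs(np.corrcoef(expr.T)) ** beta    # gene-gene adjacency in [0, 1]
connectivity = adj.sum(axis=0) - 1           # drop each gene's self-adjacency

dissim = 1.0 - adj
np.fill_diagonal(dissim, 0.0)
modules = fcluster(linkage(squareform(dissim, checks=False), method="average"),
                   t=4, criterion="maxclust")
print(connectivity[:5], np.bincount(modules)[1:])
```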

  8. Wallops Ship Surveillance System

    NASA Technical Reports Server (NTRS)

    Smith, Donna C.

    2011-01-01

    Approved as a Wallops control center backup system, the Wallops Ship Surveillance Software is a day-of-launch risk analysis tool for spaceport activities. The system calculates impact probabilities and displays ship locations relative to boundary lines. It enables rapid analysis of possible flight paths to preclude the need to cancel launches and allow execution of launches in a timely manner. Its design is based on low-cost, large-customer-base elements including personal computers, the Windows operating system, C/C++ object-oriented software, and network interfaces. In conformance with the NASA software safety standard, the system is designed to ensure that it does not falsely report a safe-for-launch condition. To improve the current ship surveillance method, the system is designed to prevent delay of launch under a safe-for-launch condition. A single workstation is designated the controller of the official ship information and the official risk analysis. Copies of this information are shared with other networked workstations. The program design is divided into five subsystem areas: 1. Communication Link -- threads that control the networking of workstations; 2. Contact List -- a thread that controls a list of protected item (ocean vessel) information; 3. Hazard List -- threads that control a list of hazardous item (debris) information and associated risk calculation information; 4. Display -- threads that control operator inputs and screen display outputs; and 5. Archive -- a thread that controls archive file read and write access. Currently, most of the hazard list thread and parts of other threads are being reused as part of a new ship surveillance system, under the SureTrak project.
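
    The risk calculation at the center of such a system can be illustrated with a Monte Carlo sketch: sample debris impact points from a dispersion model and count how many fall inside a ship's protection zone. All distributions and numbers below are hypothetical:

```python
# Monte Carlo sketch of the risk computation: sample debris impact points
# from a dispersion model and estimate the chance one lands inside a
# ship's protection circle. Distributions and numbers are hypothetical.
import numpy as np

rng = np.random.default_rng(7)
impacts = rng.normal(loc=[0.0, 0.0], scale=[2.0, 3.0], size=(100_000, 2))  # km
ship, radius = np.array([1.5, -2.0]), 0.5                                  # km

hits = np.linalg.norm(impacts - ship, axis=1) < radius
print(f"P(impact inside ship zone) ~ {hits.mean():.2e}")
```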

  9. Firing Room Remote Application Software Development & Swamp Works Laboratory Robot Software Development

    NASA Technical Reports Server (NTRS)

    Garcia, Janette

    2016-01-01

    The National Aeronautics and Space Administration (NASA) is creating a way to send humans beyond low Earth orbit, and later to Mars. Kennedy Space Center (KSC) is working to make this possible by developing the Spaceport Command and Control System (SCCS), which will allow the launch of the Space Launch System (SLS). This paper focuses on the work performed by the author during the first and second parts of her internship as a remote application software developer. During the first part of her internship, the author worked on the SCCS's software application layer, assisting multiple ground subsystems teams, including Launch Accessories (LACC) and Environmental Control System (ECS), on the design, development, integration, and testing of remote control software applications. During the second part of the internship, the author worked on the development of robot software at the Swamp Works Laboratory, a research and technology development group that focuses on inventing new technology to help future In-Situ Resource Utilization (ISRU) missions.

  10. Waste management facility accident analysis (WASTE ACC) system: software for analysis of waste management alternatives

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kohout, E.F.; Folga, S.; Mueller, C.

    1996-03-01

    This paper describes the Waste Management Facility Accident Analysis (WASTE_ACC) software, which was developed at Argonne National Laboratory (ANL) to support the US Department of Energy's (DOE's) Waste Management (WM) Programmatic Environmental Impact Statement (PEIS). WASTE_ACC is a decision support and database system that is compatible with Microsoft® Windows™. It assesses potential atmospheric releases from accidents at waste management facilities. The software provides the user with an easy-to-use tool to determine the risk-dominant accident sequences for the many possible combinations of process technologies, waste and facility types, and alternative cases described in the WM PEIS. In addition, its structure will allow additional alternative cases and assumptions to be tested as part of the future DOE programmatic decision-making process. The WASTE_ACC system demonstrates one approach to performing a generic, systemwide evaluation of accident risks at waste management facilities. The advantages of WASTE_ACC are threefold. First, the software gets waste volume and radiological profile data that were used to perform other WM PEIS-related analyses directly from the WASTE_MGMT system. Second, the system allows for a consistent analysis across all sites and waste streams, which enables decision makers to understand more fully the trade-offs among various policy options and scenarios. Third, the system is easy to operate; even complex scenario runs are completed within minutes.

  11. Validating New Software for Semiautomated Liver Volumetry--Better than Manual Measurement?

    PubMed

    Noschinski, L E; Maiwald, B; Voigt, P; Wiltberger, G; Kahn, T; Stumpp, P

    2015-09-01

    This prospective study compared a manual program for liver volumetry with semiautomated software. The hypothesis was that the semiautomated software would be faster, more accurate and less dependent on the evaluator's experience. Ten patients undergoing hemihepatectomy were included in this IRB approved study after written informed consent. All patients underwent a preoperative abdominal 3-phase CT scan, which was used for whole liver volumetry and volume prediction for the liver part to be resected. Two different types of software were used: 1) manual method: borders of the liver had to be defined per slice by the user; 2) semiautomated software: automatic identification of liver volume with manual assistance for definition of Couinaud segments. Measurements were done by six observers with different experience levels. Water displacement volumetry immediately after partial liver resection served as the gold standard. The resected part was examined with a CT scan after displacement volumetry. Volumetry of the resected liver scan showed excellent correlation to water displacement volumetry (manual: ρ = 0.997; semiautomated software: ρ = 0.995). The difference between the predicted volume and the real volume was significantly smaller with the semiautomated software than with the manual method (33% vs. 57%, p = 0.002). The semiautomated software was almost four times faster for volumetry of the whole liver (manual: 6:59 ± 3:04 min; semiautomated: 1:47 ± 1:11 min). Both methods for liver volumetry give an estimated liver volume close to the real one. The tested semiautomated software is faster, more accurate in predicting the volume of the resected liver part, gives more reproducible results and is less dependent on the user's experience. Both tested types of software allow exact volumetry of resected liver parts. Preoperative prediction can be performed more accurately with the semiautomated software. The semiautomated software is nearly four times faster than the tested manual program and less dependent on the user's experience. © Georg Thieme Verlag KG Stuttgart · New York.

  12. Designing a Software for Flood Risk Assessment Based on Multi Criteria Desicion Analysis and Information Diffusion Methods

    NASA Astrophysics Data System (ADS)

    Musaoglu, N.; Saral, A.; Seker, D. Z.

    2012-12-01

    Flooding is one of the major natural disasters, not only in Turkey but all over the world, and it causes serious damage and harm. It is estimated that of the total economic loss caused by all kinds of disasters, 40% was due to floods. In July 1995, the Ayamama Creek in Istanbul was flooded; the insurance sector received around 1,200 claims notices during that period, and insurance companies had to pay a total of $40 million for claims. In 2009, the same creek was flooded again, killing 31 people over two days, and insurance firms paid around €150 million for claims. To solve these kinds of problems, modern tools such as GIS and Remote Sensing should be utilized. In this study, software was designed for flood risk analysis with the Analytic Hierarchy Process (AHP) and Information Diffusion (InfoDif) methods. In the developed software, five evaluation criteria were taken into account: slope, aspect, elevation, geology, and land use, all extracted from satellite sensor data. The Digital Elevation Model (DEM) of the Ayamama River Basin was acquired from the SPOT 5 satellite image with 2.5 m spatial resolution. Slope and aspect values of the study basin were extracted from this DEM. The land use of the Ayamama Creek was obtained by performing object-oriented nearest neighbour classification with image segmentation on a SPOT 5 image dated 2010. All produced data were used as input for the Multi Criteria Decision Analysis (MCDA) part of this software. The criteria and their sub-criteria were weighted, and flood vulnerability was determined with MCDA-AHP. Also, daily flood data were collected from the Florya Meteorological Station for the years 1975 to 2009, and the daily flood peak discharge was calculated with the Soil Conservation Service-Curve Number (SCS-CN) method and used as input for the InfoDif part of the software. The obtained results were verified using ground truth data, and it has been clearly seen that the developed (TRA) software, which uses two different methods for flood risk analysis, can be more effective than conventional techniques for different decision problems and produces more reliable results in a short time.
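
    The AHP step relied on here derives criterion weights from a pairwise comparison matrix via its principal eigenvector. A minimal sketch with an illustrative comparison matrix (not the study's values):

```python
# AHP weighting sketch: criterion weights from the principal eigenvector
# of a pairwise comparison matrix, plus a consistency check. Comparison
# values are illustrative, not the study's.
import numpy as np

# slope, aspect, elevation, geology, land use
A = np.array([[1,   3,   2,   5,   4],
              [1/3, 1,   1/2, 2,   1],
              [1/2, 2,   1,   3,   2],
              [1/5, 1/2, 1/3, 1,   1/2],
              [1/4, 1,   1/2, 2,   1]])

vals, vecs = np.linalg.eig(A)
k = vals.real.argmax()                        # principal eigenvalue
w = np.abs(vecs[:, k].real)
w /= w.sum()                                  # normalised criterion weights
ci = (vals.real[k] - len(A)) / (len(A) - 1)   # consistency index
print(np.round(w, 3), f"CI={ci:.3f}, CR=CI/1.12 for n=5")
```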

  13. Energy loss analysis of an integrated space power distribution system

    NASA Technical Reports Server (NTRS)

    Kankam, M. D.; Ribeiro, P. F.

    1992-01-01

    The results of studies related to conceptual topologies of an integrated utility-like space power system are described. The system topologies are comparatively analyzed by considering their transmission energy losses as functions of mainly distribution voltage level and load composition. The analysis is expedited by use of the Distribution System Analysis and Simulation (DSAS) software. This computer program, recently developed by the Electric Power Research Institute (EPRI), uses improved load models to solve the power flow within the system. However, present shortcomings of the software with regard to space applications, and incompletely defined characteristics of a space power system, make the results applicable to only the fundamental trends of energy losses of the topologies studied. Accounting for the effects of the various parameters on system performance, as included here, can constitute part of a planning tool for a space power distribution system.

  14. Commercialization of NESSUS: Status

    NASA Technical Reports Server (NTRS)

    Thacker, Ben H.; Millwater, Harry R.

    1991-01-01

    A plan was initiated in 1988 to commercialize the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) probabilistic structural analysis software. The goal of the ongoing commercialization effort is to begin the transfer of Probabilistic Structural Analysis Method (PSAM) developed technology into industry and to develop additional funding resources in the general area of structural reliability. The commercialization effort is summarized. The SwRI NESSUS Software System is a general purpose probabilistic finite element computer program using state-of-the-art methods for predicting stochastic structural response due to random loads, material properties, part geometry, and boundary conditions. NESSUS can be used to assess structural reliability, to compute probability of failure, to rank the input random variables by importance, and to provide a more cost effective design than traditional methods. The goal is to develop a general probabilistic structural analysis methodology to assist in the certification of critical components in the next generation Space Shuttle Main Engine.
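
    To make the probabilistic idea concrete, the sketch below estimates a probability of failure by plain Monte Carlo sampling of a stress-strength limit state. NESSUS itself employs faster, more advanced probabilistic methods; the distributions and numbers here are hypothetical.

```python
# Monte Carlo estimate of P(failure) for a hypothetical stress-strength
# limit state g = strength - stress (failure when g < 0).
import numpy as np

rng = np.random.default_rng(0)
N = 1_000_000
stress   = rng.lognormal(mean=np.log(300.0), sigma=0.10, size=N)  # MPa
strength = rng.lognormal(mean=np.log(450.0), sigma=0.08, size=N)  # MPa
pf = np.mean(strength - stress < 0.0)
print(f"estimated probability of failure: {pf:.2e}")
```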

  15. Space Trajectory Error Analysis Program (STEAP) for halo orbit missions. Volume 2: Programmer's manual

    NASA Technical Reports Server (NTRS)

    Byrnes, D. V.; Carney, P. C.; Underwood, J. W.; Vogt, E. D.

    1974-01-01

    The six-month effort was responsible for the development, test, conversion, and documentation of computer software for the mission analysis of missions to halo orbits about libration points in the earth-sun system. The software, consisting of two programs called NOMNAL and ERRAN, is part of the Space Trajectories Error Analysis Programs. The program NOMNAL targets a transfer trajectory from earth on a given launch date to a specified halo orbit on a required arrival date. Either impulsive or finite-thrust insertion maneuvers into halo orbit are permitted by the program. The transfer trajectory is consistent with a realistic launch profile input by the user. The second program ERRAN conducts error analyses of the targeted transfer trajectory. Measurements including range, Doppler, star-planet angles, and apparent planet diameter are processed in a Kalman-Schmidt filter to determine the trajectory knowledge uncertainty.
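
    The measurement processing in ERRAN can be illustrated by the standard update step of a linear Kalman filter, sketched below; the Schmidt ("consider") extension and the actual measurement models are omitted, and all matrices here are hypothetical.

```python
# Kalman filter measurement update: correct state x and covariance P with
# measurement z, observation matrix H and measurement noise covariance R.
import numpy as np

def kalman_update(x, P, z, H, R):
    S = H @ P @ H.T + R               # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
    x = x + K @ (z - H @ x)           # corrected state estimate
    P = (np.eye(len(x)) - K @ H) @ P  # corrected covariance
    return x, P

# Example: 2-state (position, velocity) estimate, range-only measurement.
x = np.array([1000.0, 10.0])
P = np.diag([100.0, 4.0])
H = np.array([[1.0, 0.0]])
R = np.array([[25.0]])
x, P = kalman_update(x, P, z=np.array([1012.0]), H=H, R=R)
print(x, np.diag(P))
```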

  16. Design and analysis of a magneto-rheological damper for an all terrain vehicle

    NASA Astrophysics Data System (ADS)

    Krishnan Unni, R.; Tamilarasan, N.

    2018-02-01

    A shock absorber design intended to replace the existing conventional shock absorber with a controllable system using a magneto-rheological damper is introduced for an All Terrain Vehicle (ATV) designed for Baja SAE competitions. The suspension is a vital part of an All Terrain Vehicle, as it endures various surfaces and requires utmost attention during design. COMSOL Multiphysics, a tool for coupled-physics problems, is used for the design and analysis phase of the magneto-rheological damper for the considered application, and the model is optimized by the Taguchi method using DOE software. The magneto-rheological damper is designed to maximize the damping force within the measured geometric constraints of the All Terrain Vehicle.
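
    The Taguchi optimization mentioned above ranks DOE runs by a signal-to-noise ratio; since the objective is to maximize damping force, the larger-the-better form applies. The sketch below is a generic illustration; the run data (and the use of Python) are hypothetical, not taken from the paper.

```python
# Larger-the-better Taguchi S/N ratio: S/N = -10*log10(mean(1/y^2)).
import numpy as np

def sn_larger_is_better(y):
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

# Hypothetical damping forces (N) from replicated DOE runs.
runs = {"run1": [812, 798, 805], "run2": [932, 918, 940], "run3": [655, 672, 660]}
for name, forces in runs.items():
    print(name, round(sn_larger_is_better(forces), 2), "dB")  # higher is better
```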

  17. A portable structural analysis library for reaction networks.

    PubMed

    Bedaso, Yosef; Bergmann, Frank T; Choi, Kiri; Medley, Kyle; Sauro, Herbert M

    2018-07-01

    The topology of a reaction network can have a significant influence on the network's dynamical properties. Such influences can include constraints on network flows and concentration changes or, more insidiously, result in the emergence of feedback loops. These effects are due entirely to mass constraints imposed by the network configuration and are important considerations before any dynamical analysis is made. Most established simulation software tools usually carry out some kind of structural analysis of a network before any attempt is made at dynamic simulation. In this paper, we describe a portable software library, libStructural, that can carry out a variety of popular structural analyses, including conservation analysis, flux dependency analysis and enumeration of elementary modes. The library employs robust algorithms that allow it to be used on large networks with more than two thousand nodes. The library accepts either a raw or fully labeled stoichiometry matrix or models written in SBML format. The software is written in standard C/C++ and comes with extensive on-line documentation and a test suite. The software is available for Windows and Mac OS X, and can be compiled easily on any Linux operating system. A language binding for Python is also available through the pip package manager, making it simple to install on any standard Python distribution. The bulk of the source code is licensed under the open source BSD license, with other parts using either the MIT license or, more simply, the public domain. All source is available on GitHub (https://github.com/sys-bio/Libstructural). Copyright © 2018 Elsevier B.V. All rights reserved.
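
    The conservation analysis named above rests on simple linear algebra: conserved moieties correspond to the left null space of the stoichiometry matrix. The sketch below illustrates that computation directly with sympy; it is not the libStructural API, and the example network is hypothetical.

```python
# Conserved moieties = left null space of the stoichiometry matrix N
# (rows = species, columns = reactions).
import sympy as sp

# Hypothetical network:  r1: E + S -> ES,   r2: ES -> E + P
N = sp.Matrix([
    [ 1, -1],   # ES
    [-1,  1],   # E
    [-1,  0],   # S
    [ 0,  1],   # P
])

# Vectors c with c^T N = 0, i.e. the null space of N^T.
for c in N.T.nullspace():
    print(c.T)  # each vector encodes a conserved sum of species amounts
```

    For this toy network the two conserved totals are the enzyme pool (E + ES) and the substrate pool (S + P + ES).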

  18. Real-Time Processing Library for Open-Source Hardware Biomedical Sensors

    PubMed Central

    Castro-García, Juan A.; Lebrato-Vázquez, Clara

    2018-01-01

    Applications involving data acquisition from sensors need samples at a preset frequency rate, the filtering out of noise and/or analysis of certain frequency components. We propose a novel software architecture based on open-source hardware platforms which allows programmers to create data streams from input channels and easily implement filters and frequency analysis objects. The performance of the different classes, in terms of memory allocated and execution time (number of clock cycles), was analyzed on the low-cost platform Arduino Genuino. In addition, 11 people took part in an experiment in which they had to implement several exercises and complete a usability test. Sampling rates under 250 Hz (typical for many biomedical applications) make it feasible to implement filters, sliding windows and Fourier analysis operating in real time. Participants rated software usability at 70.2 out of 100, and the ease of use when implementing several signal processing applications was rated at just over 4.4 out of 5. Participants showed their intention of using this software because it was perceived as useful and very easy to use. The performance of the library showed that it may be appropriate for implementing small biomedical real-time applications or for human movement monitoring, even on a simple open-source hardware device like the Arduino Genuino. The general perception of this library is that it is easy to use and intuitive. PMID:29596394
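
    As a language-agnostic illustration of the stream-and-filter pattern described above, the sketch below implements a sliding-window (moving-average) filter over an incoming sample stream. The actual library is C++ for Arduino; the class and method names here are invented for illustration.

```python
# Moving-average filter over the last `size` samples of a data stream.
from collections import deque

class SlidingWindowFilter:
    def __init__(self, size):
        self.window = deque(maxlen=size)  # keeps only the newest `size` samples

    def process(self, sample):
        self.window.append(sample)
        return sum(self.window) / len(self.window)

# Samples arriving at a preset rate (e.g. 250 Hz) are filtered one by one.
f = SlidingWindowFilter(size=5)
for s in [512, 530, 498, 505, 900, 510]:  # raw ADC readings; 900 is a spike
    print(f.process(s))
```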

  19. LFQProfiler and RNP(xl): Open-Source Tools for Label-Free Quantification and Protein-RNA Cross-Linking Integrated into Proteome Discoverer.

    PubMed

    Veit, Johannes; Sachsenberg, Timo; Chernev, Aleksandar; Aicheler, Fabian; Urlaub, Henning; Kohlbacher, Oliver

    2016-09-02

    Modern mass spectrometry setups used in today's proteomics studies generate vast amounts of raw data, calling for highly efficient data processing and analysis tools. Software for analyzing these data is either monolithic (easy to use, but sometimes too rigid) or workflow-driven (easy to customize, but sometimes complex). Thermo Proteome Discoverer (PD) is a powerful software for workflow-driven data analysis in proteomics which, in our eyes, achieves a good trade-off between flexibility and usability. Here, we present two open-source plugins for PD providing additional functionality: LFQProfiler for label-free quantification of peptides and proteins, and RNP(xl) for UV-induced peptide-RNA cross-linking data analysis. LFQProfiler interacts with existing PD nodes for peptide identification and validation and takes care of the entire quantitative part of the workflow. We show that it performs at least on par with other state-of-the-art software solutions for label-free quantification in a recently published benchmark ( Ramus, C.; J. Proteomics 2016 , 132 , 51 - 62 ). The second workflow, RNP(xl), represents the first software solution to date for identification of peptide-RNA cross-links including automatic localization of the cross-links at amino acid resolution and localization scoring. It comes with a customized integrated cross-link fragment spectrum viewer for convenient manual inspection and validation of the results.

  20. Real-Time Processing Library for Open-Source Hardware Biomedical Sensors.

    PubMed

    Molina-Cantero, Alberto J; Castro-García, Juan A; Lebrato-Vázquez, Clara; Gómez-González, Isabel M; Merino-Monge, Manuel

    2018-03-29

    Applications involving data acquisition from sensors need samples at a preset frequency rate, the filtering out of noise and/or analysis of certain frequency components. We propose a novel software architecture based on open-source hardware platforms which allows programmers to create data streams from input channels and easily implement filters and frequency analysis objects. The performance of the different classes, in terms of memory allocated and execution time (number of clock cycles), was analyzed on the low-cost platform Arduino Genuino. In addition, 11 people took part in an experiment in which they had to implement several exercises and complete a usability test. Sampling rates under 250 Hz (typical for many biomedical applications) make it feasible to implement filters, sliding windows and Fourier analysis operating in real time. Participants rated software usability at 70.2 out of 100, and the ease of use when implementing several signal processing applications was rated at just over 4.4 out of 5. Participants showed their intention of using this software because it was perceived as useful and very easy to use. The performance of the library showed that it may be appropriate for implementing small biomedical real-time applications or for human movement monitoring, even on a simple open-source hardware device like the Arduino Genuino. The general perception of this library is that it is easy to use and intuitive.

  1. Establishment of sequential software processing for a biomechanical model of mandibular reconstruction with custom-made plate.

    PubMed

    Li, Peng; Tang, Youchao; Li, Jia; Shen, Longduo; Tian, Weidong; Tang, Wei

    2013-09-01

    The aim of this study is to describe the sequential software processing of a computed tomography (CT) dataset for reconstructing a finite element analysis (FEA) mandibular model with a custom-made plate, and to provide a theoretical basis for clinical usage of this reconstruction method. A CT scan was done on one patient who had mandibular continuity defects. This CT dataset in DICOM format was imported into Mimics 10.0, in which a three-dimensional (3-D) model of the facial skeleton was reconstructed and the mandible was segmented out. With Geomagic Studio 11.0, one custom-made plate and nine virtual screws were designed. All parts of the reconstructed mandible were converted into NURBS and saved in IGES format for import into Pro/E 4.0. After Boolean operation and assembly, the model was transferred to ANSYS Workbench 12.0. Finally, after applying the boundary conditions and material properties, an analysis was performed. As a result, a 3-D FEA model was successfully developed using the software described above. The stress-strain distribution precisely indicated the biomechanical performance of the reconstructed mandible under normal occlusion load, without stress-concentrated areas. The von Mises stress in all parts of the model, from the maximum value of 50.9 MPa to the minimum value of 0.1 MPa, was lower than the ultimate tensile strength. In conclusion, the described strategy could speedily and successfully produce a biomechanical model of a reconstructed mandible with a custom-made plate. Using this FEA foundation, the custom-made plate may be improved for an optimal clinical outcome. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
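
    The von Mises stress reported above is a scalar derived from the six components of the 3-D stress state; the sketch below shows the standard formula. The sample values are hypothetical and unrelated to the study's results.

```python
# Von Mises equivalent stress from the six components of a 3-D stress state.
import math

def von_mises(sx, sy, sz, txy, tyz, tzx):
    return math.sqrt(0.5 * ((sx - sy)**2 + (sy - sz)**2 + (sz - sx)**2
                            + 6.0 * (txy**2 + tyz**2 + tzx**2)))

print(von_mises(40.0, 10.0, 5.0, 8.0, 2.0, 1.0))  # stresses in MPa
```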

  2. Software and Algorithms for Biomedical Image Data Processing and Visualization

    NASA Technical Reports Server (NTRS)

    Talukder, Ashit; Lambert, James; Lam, Raymond

    2004-01-01

    New software equipped with novel image processing algorithms and graphical-user-interface (GUI) tools has been designed for automated analysis and processing of large amounts of biomedical image data. The software, called PlaqTrak, has been specifically used for analysis of plaque on the teeth of patients. New algorithms have been developed and implemented to segment teeth of interest from surrounding gum, and a real-time image-based morphing procedure is used to automatically overlay a grid onto each segmented tooth. Pattern recognition methods are used to classify plaque from surrounding gum and enamel, while ignoring glare effects due to the reflection of camera light and ambient light from enamel regions. The PlaqTrak system integrates these components into a single software suite with an easy-to-use GUI (see Figure 1) that allows users to do an end-to-end run of a patient's record, including tooth segmentation of all teeth, grid morphing of each segmented tooth, and plaque classification of each tooth image. The automated and accurate processing of the captured images to segment each tooth [see Figure 2(a)] and then detect plaque on a tooth-by-tooth basis is a critical component of the PlaqTrak system for clinical trials and analysis with minimal human intervention. These features offer distinct advantages over other competing systems that analyze groups of teeth or synthetic teeth. PlaqTrak divides each segmented tooth into eight regions using an advanced graphics morphing procedure [see results on a chipped tooth in Figure 2(b)], and a pattern recognition classifier is then used to locate plaque [red regions in Figure 2(d)] and enamel regions. The morphing allows analysis within regions of teeth, thereby facilitating detailed statistical analysis such as the amount of plaque present on the biting surfaces of teeth. This software system is applicable to a host of biomedical applications, such as cell analysis and life detection, or robotic applications, such as product inspection or assembly of parts in space and industry.

  3. Continuous integration and quality control for scientific software

    NASA Astrophysics Data System (ADS)

    Neidhardt, A.; Ettl, M.; Brisken, W.; Dassing, R.

    2013-08-01

    Modern software has to be stable, portable, fast and reliable. This is becoming more and more important for scientific software as well. But this requires a sophisticated way to inspect, check and evaluate the quality of source code with a suitable, automated infrastructure. A centralized server with a software repository and a version control system is one essential part, to manage the code base and to control the different development versions. While each project can be compiled separately, the whole code base can also be compiled with one central “Makefile”. This is used to create automated, nightly builds. Additionally, all sources are inspected automatically with static code analysis and inspection tools, which check for well-known error situations, memory and resource leaks, performance issues, or style issues. In combination with an automatic documentation generator it is possible to create the developer documentation directly from the code and the inline comments. All reports and generated information are presented as HTML pages on a Web server. Because this environment increased the stability and quality of the software of the Geodetic Observatory Wettzell tremendously, it is now also available for scientific communities. One regular customer is already the developer group of the DiFX software correlator project.

  4. Comprehensive analysis of NMR data using advanced line shape fitting.

    PubMed

    Niklasson, Markus; Otten, Renee; Ahlner, Alexandra; Andresen, Cecilia; Schlagnitweit, Judith; Petzold, Katja; Lundström, Patrik

    2017-10-01

    NMR spectroscopy is uniquely suited for atomic resolution studies of biomolecules such as proteins, nucleic acids and metabolites, since detailed information on structure and dynamics is encoded in the positions and line shapes of peaks in NMR spectra. Unfortunately, accurate determination of these parameters is often complicated and time consuming, in part due to the need for different software at the various analysis steps and for validating the results. Here, we present integrated, cross-platform and open-source software that is significantly more versatile than the typical line shape fitting application. The software is a completely redesigned version of PINT ( https://pint-nmr.github.io/PINT/ ). It features a graphical user interface and includes functionality for peak picking, editing of peak lists and line shape fitting. In addition, the obtained peak intensities can be used directly to extract, for instance, relaxation rates, heteronuclear NOE values and exchange parameters. In contrast to most available software, the entire process from spectral visualization to preparation of publication-ready figures is done solely using PINT, often within minutes, thereby increasing productivity for users of all experience levels. Also unique to the software are the outstanding tools for evaluating the quality of the fitting results and the extensive, but easy-to-use, customization of the fitting protocol and graphical output. In this communication, we describe the features of the new version of PINT and benchmark its performance.
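
    For readers unfamiliar with line shape fitting, the sketch below fits a 1-D Lorentzian to synthetic data with scipy. PINT itself handles multidimensional spectra and far more elaborate fitting protocols; this only illustrates the basic idea, with made-up data.

```python
# Least-squares fit of a Lorentzian line shape to noisy synthetic data.
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(x, amp, x0, fwhm):
    return amp * (0.5 * fwhm)**2 / ((x - x0)**2 + (0.5 * fwhm)**2)

rng = np.random.default_rng(1)
x = np.linspace(-5, 5, 200)
y = lorentzian(x, amp=1.0, x0=0.3, fwhm=1.2) + 0.02 * rng.standard_normal(x.size)

popt, _ = curve_fit(lorentzian, x, y, p0=[1.0, 0.0, 1.0])
amp, x0, fwhm = popt
print(f"amplitude={amp:.3f}, position={x0:.3f}, FWHM={fwhm:.3f}")
```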

  5. MAMA User Guide v2.0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gaschen, Brian Keith; Bloch, Jeffrey Joseph; Porter, Reid

    Morphological signatures of bulk SNM materials have significant promise, but these potential signatures are not fully utilized. This document describes software tools, collectively called the MAMA (Morphological Analysis for Material Attribution) software, that can help provide robust and accurate quantification of morphological features in bulk material microscopy images (optical, SEM). Although many of the specific tools are not unique to MAMA, the software package has been designed specifically for nuclear material morphological analysis, and is at a point where it can be easily adapted (by Los Alamos or by collaborators) in response to new, different, or changing forensics needs. The current release of the MAMA software only includes the image quantification, description, and annotation functionality. Only limited information on a sample, its pedigree, and its chemistry is recorded inside this part of the software. This decision was based on initial feedback and the fact that there are several analytical chemistry databases being developed within the community. Currently MAMA is a standalone program that can export quantification results in a basic text format that can be imported into other programs such as Excel and Access. There is also a basic report generating feature that produces HTML formatted pages of the same information. We will be working with collaborators to provide better integration of MAMA into their particular systems, databases and workflows.

  6. Software Engineering Laboratory (SEL) cleanroom process model

    NASA Technical Reports Server (NTRS)

    Green, Scott; Basili, Victor; Godfrey, Sally; Mcgarry, Frank; Pajerski, Rose; Waligora, Sharon

    1991-01-01

    The Software Engineering Laboratory (SEL) cleanroom process model is described. The term 'cleanroom' originates in the integrated circuit (IC) production process, where ICs are assembled in dust-free 'clean rooms' to prevent the destructive effects of dust. When applying the cleanroom methodology to the development of software systems, the primary focus is on software defect prevention rather than defect removal. The model is based on data and analysis from previous cleanroom efforts within the SEL and is tailored to serve as a guideline in applying the methodology to future production software efforts. The phases that are part of the process model life cycle, from the delivery of requirements to the start of acceptance testing, are described. For each defined phase, a set of specific activities is discussed, and the appropriate data flow is described. Pertinent managerial issues, key similarities and differences between the SEL's cleanroom process model and the standard development approach used on SEL projects, and significant lessons learned from prior cleanroom projects are presented. It is intended that the process model described here will be further tailored as additional SEL cleanroom projects are analyzed.

  7. A methodology for collecting valid software engineering data

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Weiss, David M.

    1983-01-01

    An effective data collection method for evaluating software development methodologies and for studying the software development process is described. The method uses goal-directed data collection to evaluate methodologies with respect to the claims made for them. Such claims are used as a basis for defining the goals of the data collection, establishing a list of questions of interest to be answered by data analysis, defining a set of data categorization schemes, and designing a data collection form. The data to be collected are based on the changes made to the software during development, and are obtained when the changes are made. To ensure accuracy of the data, validation is performed concurrently with software development and data collection. Validation is based on interviews with those people supplying the data. Results from using the methodology show that data validation is a necessary part of change data collection. Without it, as much as 50% of the data may be erroneous. Feasibility of the data collection methodology was demonstrated by applying it to five different projects in two different environments. The application showed that the methodology was both feasible and useful.

  8. Software Quality Control at Belle II

    NASA Astrophysics Data System (ADS)

    Ritter, M.; Kuhr, T.; Hauth, T.; Gebard, T.; Kristof, M.; Pulvermacher, C.; Belle II Software Group

    2017-10-01

    Over the last seven years the software stack of the next generation B factory experiment Belle II has grown to over one million lines of C++ and Python code, counting only the part included in offline software releases. There are several thousand commits to the central repository by about 100 individual developers per year. Keeping a coherent software stack of high quality, such that it can be sustained and used efficiently for data acquisition, simulation, reconstruction, and analysis over the lifetime of the Belle II experiment, is a challenge. A set of tools is employed to monitor the quality of the software and provide fast feedback to the developers. They are integrated into a machinery that is controlled by a buildbot master and automates the quality checks. The tools include different compilers, cppcheck, the clang static analyzer, valgrind memcheck, doxygen, a geometry overlap checker, a check for missing or extra library links, unit tests, steering file level tests, a sophisticated high-level validation suite, and an issue tracker. The technological development infrastructure is complemented by organizational means to coordinate the development.

  9. Software Past, Present, and Future: Views from Government, Industry and Academia

    NASA Technical Reports Server (NTRS)

    Holcomb, Lee; Page, Jerry; Evangelist, Michael

    2000-01-01

    Views from the NASA CIO at the NASA Software Engineering Workshop on software development past, present, and future are presented. The topics include: 1) Software Past; 2) Software Present; 3) NASA's Largest Software Challenges; 4) 8330 Software Projects in Industry (Standish Group's 1994 Report); 5) Software Future; 6) Capability Maturity Model (CMM): Software Engineering Institute (SEI) levels; 7) System Engineering Quality Also Part of the Problem; 8) University Environment Trends Will Increase the Problem in Software Engineering; and 9) NASA Software Engineering Goals.

  10. Rule-Based Design of Plant Expression Vectors Using GenoCAD.

    PubMed

    Coll, Anna; Wilson, Mandy L; Gruden, Kristina; Peccoud, Jean

    2015-01-01

    Plant synthetic biology requires software tools to assist on the design of complex multi-genic expression plasmids. Here a vector design strategy to express genes in plants is formalized and implemented as a grammar in GenoCAD, a Computer-Aided Design software for synthetic biology. It includes a library of plant biological parts organized in structural categories and a set of rules describing how to assemble these parts into large constructs. Rules developed here are organized and divided into three main subsections according to the aim of the final construct: protein localization studies, promoter analysis and protein-protein interaction experiments. The GenoCAD plant grammar guides the user through the design while allowing users to customize vectors according to their needs. Therefore the plant grammar implemented in GenoCAD will help plant biologists take advantage of methods from synthetic biology to design expression vectors supporting their research projects.

  11. Expert system verification and validation study. Delivery 3A and 3B: Trip summaries

    NASA Technical Reports Server (NTRS)

    French, Scott

    1991-01-01

    Key results are documented from attending the 4th workshop on verification, validation, and testing. The most interesting part of the workshop was when representatives from the U.S., Japan, and Europe presented surveys of VV&T within their respective regions. Another interesting part focused on current efforts to define industry standards for artificial intelligence and how that might affect approaches to VV&T of expert systems. The next part of the workshop focused on VV&T methods: applying mathematical techniques to the verification of rule bases, and techniques for capturing information about the process of developing software. The final part focused on software tools. A summary is also presented of the EPRI conference on 'Methodologies, Tools, and Standards for Cost Effective Reliable Software Verification and Validation'. The conference was divided into discussion sessions on the following issues: development process, automated tools, software reliability, methods, standards, and cost/benefit considerations.

  12. Web-based spatial analysis with the ILWIS open source GIS software and satellite images from GEONETCast

    NASA Astrophysics Data System (ADS)

    Lemmens, R.; Maathuis, B.; Mannaerts, C.; Foerster, T.; Schaeffer, B.; Wytzisk, A.

    2009-12-01

    This paper presents easily accessible, integrated web-based analysis of satellite images with plug-in-based open source software. The paper is targeted at both users and developers of geospatial software. Guided by a use case scenario, we describe the ILWIS software and its toolbox to access satellite images through the GEONETCast broadcasting system. The last two decades have shown a major shift from stand-alone software systems to networked ones, often client/server applications using distributed geo-(web-)services. This allows organisations to combine their own data with remotely available data and processing functionality without much effort. Key to this integrated spatial data analysis is low-cost access to data from within user-friendly and flexible software. Web-based open source software solutions are increasingly a powerful option for developing countries. The Integrated Land and Water Information System (ILWIS) is a PC-based GIS and Remote Sensing software package, comprising a complete suite of image processing, spatial analysis and digital mapping, and was developed as commercial software from the early nineties onwards. Recent project efforts have migrated ILWIS into modular, plug-in-based open source software and provide web-service support for OGC-based web mapping and processing. The core objective of the ILWIS Open source project is to provide a maintainable framework for researchers and software developers to implement training components, scientific toolboxes and (web-) services. The latest plug-ins have been developed for multi-criteria decision making, water resources analysis and spatial statistics analysis. The development of this framework has been done since 2007 in the context of 52°North, an open initiative that advances the development of cutting-edge open source geospatial software, using the GPL license. GEONETCast, as part of the emerging Global Earth Observation System of Systems (GEOSS), puts essential environmental data at the fingertips of users around the globe. This user-friendly and low-cost information dissemination provides global information as a basis for decision-making in a number of critical areas, including public health, energy, agriculture, weather, water, climate, natural disasters and ecosystems. GEONETCast makes satellite images available via Digital Video Broadcast (DVB) technology. An OGC WMS interface and plug-ins which convert GEONETCast data streams allow an ILWIS user to integrate various distributed data sources with data stored locally on his machine. Our paper describes a use case in which ILWIS is used with GEONETCast satellite imagery for decision-making processes in Ghana. We also explain how the ILWIS software can be extended with additional functionality by means of plug-ins, and unfold our plans to implement other OGC standards, such as WCS and WPS, in the same context. Especially the latter can be seen as a major step forward in terms of moving well-proven desktop-based processing functionality to the web. This enables the embedding of ILWIS functionality in Spatial Data Infrastructures or even execution in scalable, on-demand cloud computing environments.
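
    As an illustration of the OGC WMS interface mentioned above, the sketch below builds a standard GetMap request; the server URL, layer name and bounding box are hypothetical placeholders, not actual GEONETCast endpoints.

```python
# Building a standard OGC WMS 1.1.1 GetMap request URL.
from urllib.parse import urlencode

params = {
    "SERVICE": "WMS", "VERSION": "1.1.1", "REQUEST": "GetMap",
    "LAYERS": "geonetcast:msg_rgb",   # hypothetical layer name
    "STYLES": "",
    "SRS": "EPSG:4326",
    "BBOX": "-3.5,4.5,1.5,11.5",      # lon/lat box roughly covering Ghana
    "WIDTH": "512", "HEIGHT": "512",
    "FORMAT": "image/png",
}
url = "http://example.org/geoserver/wms?" + urlencode(params)
print(url)  # fetch with urllib.request.urlopen(url) to retrieve the map image
```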

  13. FEBio: finite elements for biomechanics.

    PubMed

    Maas, Steve A; Ellis, Benjamin J; Ateshian, Gerard A; Weiss, Jeffrey A

    2012-01-01

    In the field of computational biomechanics, investigators have primarily used commercial software that is neither geared toward biological applications nor sufficiently flexible to follow the latest developments in the field. This lack of a tailored software environment has hampered research progress, as well as dissemination of models and results. To address these issues, we developed the FEBio software suite (http://mrl.sci.utah.edu/software/febio), a nonlinear implicit finite element (FE) framework, designed specifically for analysis in computational solid biomechanics. This paper provides an overview of the theoretical basis of FEBio and its main features. FEBio offers modeling scenarios, constitutive models, and boundary conditions, which are relevant to numerous applications in biomechanics. The open-source FEBio software is written in C++, with particular attention to scalar and parallel performance on modern computer architectures. Software verification is a large part of the development and maintenance of FEBio, and to demonstrate the general approach, the description and results of several problems from the FEBio Verification Suite are presented and compared to analytical solutions or results from other established and verified FE codes. An additional simulation is described that illustrates the application of FEBio to a research problem in biomechanics. Together with the pre- and postprocessing software PREVIEW and POSTVIEW, FEBio provides a tailored solution for research and development in computational biomechanics.

  14. A high-level 3D visualization API for Java and ImageJ.

    PubMed

    Schmid, Benjamin; Schindelin, Johannes; Cardona, Albert; Longair, Mark; Heisenberg, Martin

    2010-05-21

    Current imaging methods such as Magnetic Resonance Imaging (MRI), Confocal microscopy, Electron Microscopy (EM) or Selective Plane Illumination Microscopy (SPIM) yield three-dimensional (3D) data sets in need of appropriate computational methods for their analysis. The reconstruction, segmentation and registration are best approached from the 3D representation of the data set. Here we present a platform-independent framework based on Java and Java 3D for accelerated rendering of biological images. Our framework is seamlessly integrated into ImageJ, a free image processing package with a vast collection of community-developed biological image analysis tools. Our framework enriches the ImageJ software libraries with methods that greatly reduce the complexity of developing image analysis tools in an interactive 3D visualization environment. In particular, we provide high-level access to volume rendering, volume editing, surface extraction, and image annotation. The ability to rely on a library that removes the low-level details enables concentrating software development efforts on the algorithm implementation parts. Our framework enables biomedical image software development to be built with 3D visualization capabilities with very little effort. We offer the source code and convenient binary packages along with extensive documentation at http://3dviewer.neurofly.de.

  15. Computer-aided modelling and analysis of PV systems: a comparative study.

    PubMed

    Koukouvaos, Charalambos; Kandris, Dionisis; Samarakou, Maria

    2014-01-01

    Modern scientific advances have enabled remarkable efficacy for photovoltaic systems with regard to the exploitation of solar energy, boosting them into a rapidly growing position among the systems developed for the production of renewable energy. However, in many cases the design, analysis, and control of photovoltaic systems are tasks which are quite complex and thus difficult to carry out. In order to cope with these kinds of problems, appropriate software tools have been developed, either as standalone products or as parts of general purpose software platforms used to model and simulate the generation, transmission, and distribution of solar energy. The utilization of this kind of software tool may be extremely helpful for the successful performance evaluation of energy systems with maximum accuracy and minimum cost in time and effort. The work presented in this paper aims, on a first level, at the performance analysis of various configurations of photovoltaic systems through computer-aided modelling. On a second level, it provides a comparative evaluation of the credibility of two of the most advanced graphical programming environments, namely Simulink and LabVIEW, with regard to their application to photovoltaic systems.
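
    The models compared in studies of this kind ultimately rest on the single-diode equation for a PV cell. The sketch below evaluates the ideal form (series and shunt resistances omitted, so the current is explicit); the parameter values are illustrative assumptions, not data from the paper.

```python
# Ideal single-diode PV cell model: I = Iph - I0*(exp(V/(n*Vt)) - 1).
import numpy as np

k, q = 1.380649e-23, 1.602176634e-19   # Boltzmann constant, electron charge

def pv_current(V, Iph=8.0, I0=1e-9, n=1.3, T=298.15):
    Vt = k * T / q                     # thermal voltage (~25.7 mV at 25 C)
    return Iph - I0 * (np.exp(V / (n * Vt)) - 1.0)

V = np.linspace(0.0, 0.78, 40)
I = np.clip(pv_current(V), 0.0, None)  # clip past open-circuit voltage
P = V * I
print(f"max power ~ {P.max():.2f} W at ~ {V[P.argmax()]:.2f} V")
```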

  16. Computer-Aided Modelling and Analysis of PV Systems: A Comparative Study

    PubMed Central

    Koukouvaos, Charalambos

    2014-01-01

    Modern scientific advances have enabled remarkable efficacy for photovoltaic systems with regard to the exploitation of solar energy, boosting them into a rapidly growing position among the systems developed for the production of renewable energy. However, in many cases the design, analysis, and control of photovoltaic systems are tasks which are quite complex and thus difficult to carry out. In order to cope with these kinds of problems, appropriate software tools have been developed, either as standalone products or as parts of general purpose software platforms used to model and simulate the generation, transmission, and distribution of solar energy. The utilization of this kind of software tool may be extremely helpful for the successful performance evaluation of energy systems with maximum accuracy and minimum cost in time and effort. The work presented in this paper aims, on a first level, at the performance analysis of various configurations of photovoltaic systems through computer-aided modelling. On a second level, it provides a comparative evaluation of the credibility of two of the most advanced graphical programming environments, namely Simulink and LabVIEW, with regard to their application to photovoltaic systems. PMID:24772007

  17. 15 CFR 30.37 - Miscellaneous exemptions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... requirements of the licensing Federal agency. (f) Exports of technology and software as defined in 15 CFR 772... required for mass-market software. For purposes of this part, mass-market software is defined as software... of commodities and software intended for use by individual USPPIs or by employees or representatives...

  18. 15 CFR 30.37 - Miscellaneous exemptions.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... requirements of the licensing Federal agency. (f) Exports of technology and software as defined in 15 CFR 772... required for mass-market software. For purposes of this part, mass-market software is defined as software... of commodities and software intended for use by individual USPPIs or by employees or representatives...

  19. 15 CFR 30.37 - Miscellaneous exemptions.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... requirements of the licensing Federal agency. (f) Exports of technology and software as defined in 15 CFR 772... required for mass-market software. For purposes of this part, mass-market software is defined as software... of commodities and software intended for use by individual USPPIs or by employees or representatives...

  20. 15 CFR 30.37 - Miscellaneous exemptions.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... requirements of the licensing Federal agency. (f) Exports of technology and software as defined in 15 CFR 772... required for mass-market software. For purposes of this part, mass-market software is defined as software... of commodities and software intended for use by individual USPPIs or by employees or representatives...

  1. Good analytical practice: statistics and handling data in biomedical science. A primer and directions for authors. Part 1: Introduction. Data within and between one or two sets of individuals.

    PubMed

    Blann, A D; Nation, B R

    2008-01-01

    The biomedical scientist is bombarded on a daily basis by information, almost all of which refers to the health status of an individual or groups of individuals. This review is the first of a two-part article written to explain some of the issues related to the presentation and analysis of data. The first part focuses on types of data and how to present and analyse data from an individual or from one or two groups of persons. The second part will examine data from three or more sets of persons, what methods are available to allow this analysis (i.e., statistical software packages), and will conclude with a statement on appropriate descriptors of data, their analyses, and presentation for authors considering submission of their data to this journal.
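
    As a small, concrete example of the two-group comparisons covered in this first part, the sketch below runs a two-sample t-test on hypothetical measurements; the data are invented for illustration.

```python
# Two-sample t-test comparing measurements from two groups of individuals.
import numpy as np
from scipy import stats

group_a = np.array([5.1, 4.8, 5.6, 5.0, 5.3, 4.9])
group_b = np.array([5.9, 6.1, 5.7, 6.3, 5.8, 6.0])

t, p = stats.ttest_ind(group_a, group_b)
print(f"mean A = {group_a.mean():.2f}, mean B = {group_b.mean():.2f}, "
      f"t = {t:.2f}, p = {p:.4f}")
```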

  2. Profiles of gamma-ray and magnetic data for aerial surveys over parts of the Western United States from longitude 108 to 126 degrees W. and from latitude 34 to 49 degrees N.

    USGS Publications Warehouse

    Duval, Joseph S.

    1995-01-01

    This CD-ROM contains images generated from geophysical data, software for displaying and analyzing the images and software for displaying and examining profile data from aerial surveys flown as part of the National Uranium Resource Evaluation (NURE) Program of the U.S. Department of Energy. The images included are of gamma-ray data (uranium, thorium, and potassium channels), Bouguer gravity data, isostatic residual gravity data, aeromagnetic anomalies, topography, and topography with bathymetry. This publication contains image data for the conterminous United States and profile data for the conterminous United States within the area longitude 108 to 126 degrees W. and latitude 34 to 49 degrees N. The profile data include apparent surface concentrations of potassium, uranium, and thorium, the residual magnetic field, and the height above the ground. The images on this CD-ROM include graytone and color images of each data set, color shaded-relief images of the potential-field and topographic data, and color composite images of the gamma-ray data. The image display and analysis software can register images with geographic and geologic overlays. The profile display software permits the user to view the profiles as well as obtain data listings and export ASCII versions of data for selected flight lines.

  3. Open source EMR software: profiling, insights and hands-on analysis.

    PubMed

    Kiah, M L M; Haiqi, Ahmed; Zaidan, B B; Zaidan, A A

    2014-11-01

    The use of open source software in health informatics is increasingly advocated by authors in the literature. Although there is no clear evidence of the superiority of the current open source applications in the healthcare field, the number of available open source applications online is growing and they are gaining greater prominence. This repertoire of open source options is of a great value for any future-planner interested in adopting an electronic medical/health record system, whether selecting an existent application or building a new one. The following questions arise. How do the available open source options compare to each other with respect to functionality, usability and security? Can an implementer of an open source application find sufficient support both as a user and as a developer, and to what extent? Does the available literature provide adequate answers to such questions? This review attempts to shed some light on these aspects. The objective of this study is to provide more comprehensive guidance from an implementer perspective toward the available alternatives of open source healthcare software, particularly in the field of electronic medical/health records. The design of this study is twofold. In the first part, we profile the published literature on a sample of existent and active open source software in the healthcare area. The purpose of this part is to provide a summary of the available guides and studies relative to the sampled systems, and to identify any gaps in the published literature with respect to our research questions. In the second part, we investigate those alternative systems relative to a set of metrics, by actually installing the software and reporting a hands-on experience of the installation process, usability, as well as other factors. The literature covers many aspects of open source software implementation and utilization in healthcare practice. Roughly, those aspects could be distilled into a basic taxonomy, making the literature landscape more perceivable. Nevertheless, the surveyed articles fall short of fulfilling the targeted objective of providing clear reference to potential implementers. The hands-on study contributed a more detailed comparative guide relative to our set of assessment measures. Overall, no system seems to satisfy an industry-standard measure, particularly in security and interoperability. The systems, as software applications, feel similar from a usability perspective and share a common set of functionality, though they vary considerably in community support and activity. More detailed analysis of popular open source software can benefit the potential implementers of electronic health/medical records systems. The number of examined systems and the measures by which to compare them vary across studies, but still rewarding insights start to emerge. Our work is one step toward that goal. Our overall conclusion is that open source options in the medical field are still far behind the highly acknowledged open source products in other domains, e.g. operating systems market share. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  4. Software Framework for Development of Web-GIS Systems for Analysis of Georeferenced Geophysical Data

    NASA Astrophysics Data System (ADS)

    Okladnikov, I.; Gordov, E. P.; Titov, A. G.

    2011-12-01

    Georeferenced datasets (meteorological databases, modeling and reanalysis results, remote sensing products, etc.) are currently actively used in numerous applications, including modeling, interpretation and forecasting of climatic and ecosystem changes at various spatial and temporal scales. Due to the inherent heterogeneity of environmental datasets, as well as their size, which might constitute up to tens of terabytes for a single dataset, present studies in the area of climate and environmental change require special software support. A dedicated software framework for the rapid development of information-computational systems providing such support, based on Web-GIS technologies, has been created. The software framework consists of 3 basic parts: a computational kernel developed using ITTVIS Interactive Data Language (IDL), a set of PHP controllers run within a specialized web portal, and a JavaScript class library for the development of typical components of a web mapping application graphical user interface (GUI) based on AJAX technology. The computational kernel comprises a number of modules for dataset access, mathematical and statistical data analysis, and visualization of results. The specialized web portal consists of the Apache web server, the OGC-compliant GeoServer software, which is used as a base for presenting cartographic information over the Web, and a set of PHP controllers implementing the web-mapping application logic and governing the computational kernel. The JavaScript library, aimed at graphical user interface development, is based on the GeoExt library, combining the ExtJS framework and OpenLayers software. Based on the software framework, an information-computational system for complex analysis of large georeferenced data archives was developed. Structured environmental datasets available for processing now include two editions of the NCEP/NCAR Reanalysis, the JMA/CRIEPI JRA-25 Reanalysis, the ECMWF ERA-40 Reanalysis, the ECMWF ERA Interim Reanalysis, the MRI/JMA APHRODITE's Water Resources Project Reanalysis, meteorological observational data for the territory of the former USSR for the 20th century, and others. The current version of the system is already involved in the scientific research process. In particular, the system was recently used successfully for analysis of Siberian climate changes and their impact in the region. The software framework presented allows rapid development of Web-GIS systems for geophysical data analysis, thus providing specialists involved in multidisciplinary research projects with reliable and practical instruments for complex analysis of climate and ecosystem changes on global and regional scales. This work is partially supported by RFBR grants #10-07-00547, #11-05-01190, and SB RAS projects 4.31.1.5, 4.31.2.7, 4, 8, 9, 50 and 66.

  5. Distributed and Collaborative Software Analysis

    NASA Astrophysics Data System (ADS)

    Ghezzi, Giacomo; Gall, Harald C.

    Throughout the years software engineers have come up with a myriad of specialized tools and techniques that focus on a certain type of software analysis, such as source code analysis, co-change analysis or bug prediction. However, easy and straightforward synergies between these analyses and tools rarely exist because of their stand-alone nature, their platform dependence, their different input and output formats and the variety of data to analyze. As a consequence, distributed and collaborative software analysis scenarios, and in particular interoperability, are severely limited. We describe a distributed and collaborative software analysis platform that allows for a seamless interoperability of software analysis tools across platform, geographical and organizational boundaries. We realize software analysis tools as services that can be accessed and composed over the Internet. These distributed analysis services shall be widely accessible in our incrementally augmented Software Analysis Broker, where organizations and tool providers can register and share their tools. To allow (semi-) automatic use and composition of these tools, they are classified and mapped into a software analysis taxonomy and adhere to specific meta-models and ontologies for their category of analysis.

  6. Software Tools for In-Situ Documentation of Built Heritage

    NASA Astrophysics Data System (ADS)

    Smars, P.

    2013-07-01

    The paper presents open source software tools developed by the author to facilitate in-situ documentation of architectural and archæological heritage. The design choices are exposed and related to a general issue in conservation and documentation: taking decisions about a valuable object under threat. The question of the level of objectivity is central to the three steps of this process. It is our belief that in-situ documentation has to be favoured in this demanding context, full of potential discoveries. The very powerful surveying techniques in rapid development nowadays enhance our vision but often tend to bring a critical part of the documentation process back to the office. The software presented facilitates direct treatment of the data on the site. Emphasis is given to flexibility, interoperability and simplicity. Key features of the software are listed and illustrated with examples (3D model of Gothic vaults, analysis of the shape of a column, deformation of a wall, direct interaction with AutoCAD).

  7. Computer applications making rapid advances in high throughput microbial proteomics (HTMP).

    PubMed

    Anandkumar, Balakrishna; Haga, Steve W; Wu, Hui-Fen

    2014-02-01

    The last few decades have seen the rise of widely available proteomics tools. From new data acquisition devices, such as MALDI-MS and 2DE, to new database searching software, these new products have paved the way for high throughput microbial proteomics (HTMP). These tools are enabling researchers to gain new insights into microbial metabolism and are opening up new areas of study, such as protein-protein interaction (interactomics) discovery. Computer software is a key part of these emerging fields. This review considers: 1) software tools for identifying the proteome, such as MASCOT or PDQuest, 2) online databases of proteomes, such as SWISS-PROT, Proteome Web, or the Proteomics Facility of the Pathogen Functional Genomics Resource Center, and 3) software tools for applying proteomic data, such as PSI-BLAST or VESPA. These tools allow for research in network biology, protein identification, functional annotation, target identification/validation, protein expression, protein structural analysis, metabolic pathway engineering and drug discovery.

  8. Domain analysis for the reuse of software development experiences

    NASA Technical Reports Server (NTRS)

    Basili, V. R.; Briand, L. C.; Thomas, W. M.

    1994-01-01

    We need to be able to learn from past experiences so we can improve our software processes and products. The Experience Factory is an organizational structure designed to support and encourage the effective reuse of software experiences. This structure consists of two organizations, which separate project development concerns from the organizational concerns of experience packaging and learning. The experience factory provides the processes and support for analyzing, packaging, and improving the organization's stored experience. The project organization is structured to reuse this stored experience in its development efforts. However, a number of questions arise: What past experiences are relevant? Can they all be used (reused) on our current project? How do we take advantage of what has been learned in other parts of the organization? How do we take advantage of experience in the world at large? Can someone else's best practices be used in our organization with confidence? This paper describes approaches to help answer these questions. We propose both quantitative and qualitative approaches for effectively reusing software development experiences.

  9. ICT Integration in Primary and Secondary Education in Andalusia, Spain: Curricular and Organizational Implications

    ERIC Educational Resources Information Center

    Morueta, Ramon Tirado; Igado, Manuel Fandos; Gomez, J. Ignacio Aguaded

    2010-01-01

    This work, part of the Spanish government's National I + D Plan 2004/07, entitled "Observatics: the implementation of free software in ICT centres in Andalusia: an analysis of its effect on the teaching-learning process", aims to describe the most recent impact of online communication technologies on education in Andalusia (Spain),…

  10. 15 CFR 774.1 - Introduction.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Control List (CCL) that includes “items”—i.e., “commodities,” “software,” and “technology”—subject to the.... 1 to this part, and Supplement No. 2 to this part contains the General Technology and Software Notes... the CCL that are defined in part 772 (Definitions of Terms), or for purposes of ECCNs, where a...

  11. Computer Program Re-layers Engineering Drawings

    NASA Technical Reports Server (NTRS)

    Crosby, Dewey C., III

    1990-01-01

    RULCHK computer program aids in structuring layers of information pertaining to part or assembly designed with software described in article "Software for Drawing Design Details Concurrently" (MFS-28444). Checks and optionally updates structure of layers for part. Enables designer to construct model and annotate its documentation without burden of manually layering part to conform to standards at design time.

  12. Principles and tools for collaborative entity-based intelligence analysis.

    PubMed

    Bier, Eric A; Card, Stuart K; Bodnar, John W

    2010-01-01

    Software tools that make it easier for analysts to collaborate as a natural part of their work will lead to better analysis that is informed by more perspectives. We are interested to know if software tools can be designed that support collaboration even as they allow analysts to find documents and organize information (including evidence, schemas, and hypotheses). We have modified the Entity Workspace system, described previously, to test such designs. We have evaluated the resulting design in both a laboratory study and a study where it is situated with an analysis team. In both cases, effects on collaboration appear to be positive. Key aspects of the design include an evidence notebook optimized for organizing entities (rather than text characters), information structures that can be collapsed and expanded, visualization of evidence that emphasizes events and documents (rather than emphasizing the entity graph), and a notification system that finds entities of mutual interest to multiple analysts. Long-term tests suggest that this approach can support both top-down and bottom-up styles of analysis.

  13. Design and Analysis of Tooth Impact Test Rig for Spur Gear

    NASA Astrophysics Data System (ADS)

    Ghazali, Wafiuddin Bin Md; Aziz, Ismail Ali Bin Abdul; Daing Idris, Daing Mohamad Nafiz Bin; Ismail, Nurazima Binti; Sofian, Azizul Helmi Bin

    2016-02-01

    This paper is about the design and analysis of a prototype tooth impact test rig for spur gears. The test rig was fabricated, and analysis was conducted to study its limitations and capabilities. The design of the rig was analysed to ensure that no problems will occur during the test and that reliable data can be obtained. From the results of the analysis, the maximum load that can be applied, the factor of safety of the machine, and the stresses on the test rig parts were determined. This is important in the design consideration of the test rig. The materials used for the fabrication of the test rig were also discussed and analysed. MSC Nastran/Patran was used to analyse the model, which was designed using SolidWorks 2014. Based on the results, limitations were found in the initial design, and the test rig design needs to be improved for the rig to operate properly.

  14. An advanced software suite for the processing and analysis of silicon luminescence images

    NASA Astrophysics Data System (ADS)

    Payne, D. N. R.; Vargas, C.; Hameiri, Z.; Wenham, S. R.; Bagnall, D. M.

    2017-06-01

    Luminescence imaging is a versatile characterisation technique used for a broad range of research and industrial applications, particularly in the field of photovoltaics, where photoluminescence and electroluminescence imaging are routinely carried out for materials analysis and quality control. Luminescence imaging can reveal a wealth of material information, as detailed in extensive literature, yet these techniques are often only used qualitatively instead of being utilised to their full potential. Part of the reason for this is the time and effort required for image processing and analysis in order to convert image data to more meaningful results. In this work, a custom-built, Matlab-based software suite is presented which aims to dramatically simplify luminescence image processing and analysis. The suite includes four individual programs which can be used in isolation or in conjunction to achieve a broad array of functionality, including but not limited to, point spread function determination and deconvolution, automated sample extraction, image alignment and comparison, minority carrier lifetime calibration and iron impurity concentration mapping.
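
    As an illustration of one step such a suite automates, the following numpy-only sketch performs frequency-domain Wiener deconvolution of an image given a measured point spread function; it is a simplified stand-in, not the Matlab implementation described above, and the regularization constant is an assumption:

        # Sketch: frequency-domain Wiener deconvolution of a luminescence image
        # given a measured point spread function (PSF). Illustrative only; the
        # suite described in the paper is Matlab-based and more elaborate.
        import numpy as np

        def wiener_deconvolve(image, psf, snr=100.0):
            """Deconvolve `image` by `psf` with Tikhonov-style regularization."""
            psf_pad = np.zeros_like(image, dtype=float)
            psf_pad[:psf.shape[0], :psf.shape[1]] = psf
            # recentre the PSF so its peak sits at the array origin
            psf_pad = np.roll(psf_pad, (-psf.shape[0] // 2, -psf.shape[1] // 2), axis=(0, 1))
            H = np.fft.fft2(psf_pad)
            G = np.fft.fft2(image)
            W = np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)   # Wiener filter
            return np.real(np.fft.ifft2(W * G))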

  15. Engine structures modeling software system: Computer code. User's manual

    NASA Technical Reports Server (NTRS)

    1992-01-01

    ESMOSS is a specialized software system for the construction of geometric descriptive and discrete analytical models of engine parts, components and substructures which can be transferred to finite element analysis programs such as NASTRAN. The software architecture of ESMOSS is designed in modular form with a central executive module through which the user controls and directs the development of the analytical model. Modules consist of a geometric shape generator, a library of discretization procedures, interfacing modules to join both geometric and discrete models, a deck generator to produce input for NASTRAN and a 'recipe' processor which generates geometric models from parametric definitions. ESMOSS can be executed both in interactive and batch modes. Interactive mode is considered to be the default mode and that mode will be assumed in the discussion in this document unless stated otherwise.

  16. Flow analysis of new type propulsion system for UV’s

    NASA Astrophysics Data System (ADS)

    Eimanis, M.; Auzins, J.

    2017-10-01

    This paper presents an original design of an autonomous underwater vehicle in which thrust force is created by the helicoidal shape of the hull rather than by screw propellers. Propulsion force is created by counter-rotating bow and stern parts. The middle part of the vehicle serves as a cargo compartment containing all control mechanisms and communications. It is made of elastic material and contains a Cardan-joint mechanism, actuated by bending drives, which allows the direction of the vehicle to be changed. A bending drive velocity control algorithm for the automatic control of the vehicle's movement direction is proposed. The dynamics of the AUV are simulated using the multibody simulation software MSC Adams. For the simulation of water resistance forces and torques, surrogate polynomial metamodels are created on the basis of computer experiments with CFD software. For flow interaction with the model geometry, a simplified vehicle model is submerged in a fluid medium using special CFD software, following the same idea used in wind tunnel experiments. The simulation results are compared with measurements of the AUV prototype created at the Institute of Mechanics of Riga Technical University. Experiments with the prototype showed good agreement with simulation results and confirmed the effectiveness and the future potential of the proposed principle.
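
    The surrogate idea can be sketched as an ordinary least-squares fit of a quadratic polynomial to sampled CFD responses; the input variables and the mock response below are illustrative assumptions, not the actual Adams/CFD coupling:

        # Sketch: a quadratic polynomial surrogate (metamodel) fitted by least
        # squares to CFD sample points; variable names are illustrative.
        import numpy as np

        def quad_features(X):
            """Design matrix with 1, x_i, and x_i*x_j terms for each sample row."""
            n, d = X.shape
            cols = [np.ones(n)]
            cols += [X[:, i] for i in range(d)]
            cols += [X[:, i] * X[:, j] for i in range(d) for j in range(i, d)]
            return np.column_stack(cols)

        rng = np.random.default_rng(0)
        X = rng.uniform(-1, 1, size=(50, 2))        # e.g. (velocity, pitch angle)
        y = 3 * X[:, 0]**2 + X[:, 0] * X[:, 1] + 0.1 * rng.normal(size=50)  # mock CFD drag
        coef, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)
        y_hat = quad_features(X) @ coef             # surrogate prediction
        print(float(np.abs(y - y_hat).mean()))      # in-sample surrogate error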

  17. Development and Engineering Design in Support of "Rover Ranch": A K-12 Outreach Software Project

    NASA Technical Reports Server (NTRS)

    Pascali, Raresh

    2003-01-01

    A continuation of the initial development started in the summer of 1999, the body of work performed in support of the 'ROVer Ranch' Project during the present fellowship dealt with the concrete concept implementation and the resolution of related issues. The original work performed last summer focused on the initial examination and articulation of the concept treatment strategy and on audience and market analysis for the learning technologies software. The presented work focused on finalizing the set of parts to be made available for building an AERCam Sprint type robot and on defining, testing and implementing the process necessary to convert the design engineering files to VRML files. Through reverse engineering, an initial set of mission-critical systems was designed for beta testing in schools. The files were created in ProEngineer, exported to VRML 1.0 and converted to VRML 97 (VRML 2.0) for final integration into the software. Attributes for each part were assigned using an in-house developed Java-based program. The final set of attributes for each system, their mutual interaction and the identification of the relevant ones to be tracked still remain to be decided.

  18. Kinematic analysis of the finger exoskeleton using MATLAB/Simulink.

    PubMed

    Nasiłowski, Krzysztof; Awrejcewicz, Jan; Lewandowski, Donat

    2014-01-01

    A paralyzed or not fully functional part of the human body can be supported by a properly designed exoskeleton system with motor capabilities. It can help in the rehabilitation or movement of a disabled/paralyzed limb. Both the suitably selected geometry and the specialized software are studied using the MATLAB environment. A finger exoskeleton was the basis for the MATLAB/Simulink model. Specialized software such as MATLAB/Simulink provides an opportunity to optimize calculations and reach precise results, which helps in the next steps of the design process. The calculations carried out yield information regarding the movement relation between three functionally connected actuators and show the distance and velocity changes during the whole simulation time.
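
    A minimal sketch of the underlying kinematics, assuming a planar three-phalanx chain with cumulative joint angles (the link lengths and angles are invented for illustration; this is not the authors' Simulink model):

        # Sketch: planar forward kinematics of a three-phalanx finger chain,
        # analogous to what a MATLAB/Simulink model would compute.
        import numpy as np

        def fingertip_position(lengths, angles):
            """Cumulative-angle planar kinematics: returns (x, y) of the fingertip."""
            x = y = 0.0
            theta = 0.0
            for L, q in zip(lengths, angles):
                theta += q
                x += L * np.cos(theta)
                y += L * np.sin(theta)
            return x, y

        lengths = [0.045, 0.025, 0.018]             # phalanx lengths [m], illustrative
        angles = np.radians([30.0, 25.0, 15.0])     # joint flexion angles, illustrative
        print(fingertip_position(lengths, angles))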

  19. Mod-5A wind turbine generator program design report. Volume 4: Drawings and specifications, book 4

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The design, development and analysis of the 7.3 MW MOD-5A wind turbine generator are documented. There are four volumes. This volume contains the drawings and specifications that were developed in preparation for building the MOD-5A wind turbine generator. This volume contains 5 books of which this is the fourth, providing drawings 47A380128 through 47A387125. In addition to the parts listing and where-used list, the logic design of the controller software and the code listing of the controller software are provided. Also given are the aerodynamic profile coordinates.

  20. Next generation of decision making software for nanopatterns characterization: application to semiconductor industry

    NASA Astrophysics Data System (ADS)

    Dervilllé, A.; Labrosse, A.; Zimmermann, Y.; Foucher, J.; Gronheid, R.; Boeckx, C.; Singh, A.; Leray, P.; Halder, S.

    2016-03-01

    The dimensional scaling in IC manufacturing strongly drives the demands on CD and defect metrology techniques and their measurement uncertainties. Defect review has become as important as CD metrology, and together they create a new metrology paradigm because they create a completely new need for flexible, robust and scalable metrology software. Current software architectures and metrology algorithms are performant, but they must be pushed to a higher level in order to follow roadmap speed and requirements. For example: manage defects and CD in a one-step algorithm, customize algorithms and output features for each R&D team environment, and provide software updates every day or every week so that R&D teams can easily explore various development strategies. The final goal is to avoid spending hours and days manually tuning algorithms to analyze metrology data and to allow R&D teams to stay focused on their expertise. The benefits are drastic cost reduction, more efficient R&D teams and better process quality. In this paper, we propose a new generation of software platform and development infrastructure which can integrate specific metrology business modules. For example, we show the integration of a chemistry module dedicated to electronics materials such as Directed Self-Assembly features. We show a new generation of image analysis algorithms which are able to manage at the same time defect rates, image classification, CD and roughness measurements with high-throughput performance in order to be compatible with HVM. In a second part, we assess the reliability, the customization of algorithms and the software platform's capability to follow new semiconductor metrology software requirements: flexibility, robustness, high throughput and scalability. Finally, we demonstrate how such an environment has allowed a drastic reduction of data analysis cycle time.

  1. How to choose the right statistical software?-a method increasing the post-purchase satisfaction.

    PubMed

    Cavaliere, Roberto

    2015-12-01

    Nowadays, we live in the "data era", where the use of statistical or data analysis software is inevitable in any research field. This means that the choice of the right software tool or platform is a strategic issue for a research department. Nevertheless, in many cases decision makers do not pay adequate attention to a comprehensive and appropriate evaluation of what the market offers. Indeed, the choice often still depends on a few factors such as the researcher's personal inclination, e.g., which software was used at university or is already known. This is not wrong in principle, but in some cases it is not enough and might lead to a "dead end" situation, typically after months or years of investment in the wrong software. This article, far from being a full and complete guide to statistical software evaluation, aims to illustrate some key points of the decision process and to introduce an extended range of factors which can help in making the right choice, at least in potential. There is not enough literature on this topic, which is often underestimated, both in the traditional literature and even in the so-called "gray literature", even if some documents or short pages can be found online. In any case, there seems to be no common, established standpoint on the process of software evaluation from the final user's perspective. We suggest a multi-factor analysis leading to an evaluation matrix, intended as a flexible and customizable tool aimed at providing a clearer picture of the available software alternatives, not in the abstract but related to the researcher's own context and needs. This method is the result of about twenty years of the author's experience in evaluating and using technical-computing software, and it partially arises from research on such topics carried out as part of a project funded by the European Commission under the Lifelong Learning Programme 2011.
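
    A minimal sketch of such an evaluation matrix, with factors, weights, and scores that are purely illustrative rather than taken from the article:

        # Sketch: a weighted evaluation matrix for choosing statistical software.
        # Factors, weights, and scores are invented for illustration.
        factors = {          # weight (importance to this department), summing to 1
            "licensing cost": 0.25,
            "learning curve": 0.15,
            "statistical coverage": 0.30,
            "support and training": 0.15,
            "integration with existing tools": 0.15,
        }
        scores = {           # 1..5 score per candidate package
            "package A": {"licensing cost": 2, "learning curve": 5,
                          "statistical coverage": 4, "support and training": 4,
                          "integration with existing tools": 3},
            "package B": {"licensing cost": 5, "learning curve": 3,
                          "statistical coverage": 3, "support and training": 2,
                          "integration with existing tools": 4},
        }
        for name, s in scores.items():
            total = sum(factors[f] * s[f] for f in factors)
            print(f"{name}: weighted score {total:.2f}")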

  2. Software and hardware infrastructure for research in electrophysiology

    PubMed Central

    Mouček, Roman; Ježek, Petr; Vařeka, Lukáš; Řondík, Tomáš; Brůha, Petr; Papež, Václav; Mautner, Pavel; Novotný, Jiří; Prokop, Tomáš; Štěbeták, Jan

    2014-01-01

    As in other areas of experimental science, the operation of an electrophysiological laboratory, the design and performance of electrophysiological experiments, the collection, storage and sharing of experimental data and metadata, the analysis and interpretation of these data, and the publication of results are time-consuming activities. If these activities are well organized and supported by a suitable infrastructure, the work efficiency of researchers increases significantly. This article deals with the main concepts, design, and development of software and hardware infrastructure for research in electrophysiology. The described infrastructure has been primarily developed for the needs of the neuroinformatics laboratory at the University of West Bohemia, the Czech Republic. However, from the beginning it has also been designed and developed to be open and applicable in laboratories that do similar research. After introducing the laboratory and the overall architectural concept, the individual parts of the infrastructure are described. The central element of the software infrastructure is a web-based portal that enables community researchers to store, share, download and search data and metadata from electrophysiological experiments. The data model, domain ontology and usage of semantic web languages and technologies are described. The current data publication policy used in the portal is briefly introduced. The registration of the portal within the Neuroscience Information Framework is described. Then the methods used for processing electrophysiological signals are presented. The specific modifications of these methods introduced by laboratory researchers are summarized, and the methods are organized into a laboratory workflow. Other parts of the software infrastructure include mobile and offline solutions for data/metadata storage and a hardware stimulator communicating with an EEG amplifier and recording software. PMID:24639646

  3. Software and hardware infrastructure for research in electrophysiology.

    PubMed

    Mouček, Roman; Ježek, Petr; Vařeka, Lukáš; Rondík, Tomáš; Brůha, Petr; Papež, Václav; Mautner, Pavel; Novotný, Jiří; Prokop, Tomáš; Stěbeták, Jan

    2014-01-01

    As in other areas of experimental science, the operation of an electrophysiological laboratory, the design and performance of electrophysiological experiments, the collection, storage and sharing of experimental data and metadata, the analysis and interpretation of these data, and the publication of results are time-consuming activities. If these activities are well organized and supported by a suitable infrastructure, the work efficiency of researchers increases significantly. This article deals with the main concepts, design, and development of software and hardware infrastructure for research in electrophysiology. The described infrastructure has been primarily developed for the needs of the neuroinformatics laboratory at the University of West Bohemia, the Czech Republic. However, from the beginning it has also been designed and developed to be open and applicable in laboratories that do similar research. After introducing the laboratory and the overall architectural concept, the individual parts of the infrastructure are described. The central element of the software infrastructure is a web-based portal that enables community researchers to store, share, download and search data and metadata from electrophysiological experiments. The data model, domain ontology and usage of semantic web languages and technologies are described. The current data publication policy used in the portal is briefly introduced. The registration of the portal within the Neuroscience Information Framework is described. Then the methods used for processing electrophysiological signals are presented. The specific modifications of these methods introduced by laboratory researchers are summarized, and the methods are organized into a laboratory workflow. Other parts of the software infrastructure include mobile and offline solutions for data/metadata storage and a hardware stimulator communicating with an EEG amplifier and recording software.

  4. An integrated toolbox for processing and analysis of remote sensing data of inland and coastal waters - atmospheric correction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haan, J.F. de; Kokke, J.M.M.; Hoogenboom, H.J.

    1997-06-01

    Deriving thematic maps of water quality parameters from a remote sensing image requires a number of processing steps, such as calibration, atmospheric correction, air-water interface correction, and application of water quality algorithms. A prototype version of an integrated software environment has recently been developed that enables the user to perform and control these processing steps. Major parts of this environment are: (i) access to the MODTRAN 3 radiative transfer code, (ii) a database of water quality algorithms, and (iii) a spectral library of Dutch coastal and inland waters, containing subsurface irradiance reflectance spectra and associated water quality parameters. The atmospheric correction part of this environment is discussed here. It is shown that this part can be used to accurately retrieve spectral signatures of inland water for wavelengths between 450 and 750 nm, provided in situ measurements are used to determine atmospheric model parameters. Assessment of the usefulness of the completely integrated software system in an operational environment requires a revised version that is presently being developed.

  5. Analysis of frame structure of medium and small truck crane

    NASA Astrophysics Data System (ADS)

    Cao, Fuyi; Li, Jinlong; Cui, Mengkai

    2018-03-01

    Truck cranes are an important category of hoisting machinery. The frame, as the main load-bearing component of the truck crane, determines the safety of the crane jib under load and the soundness of the structural design. In this paper, the truck crane frame is a box structure; the three-dimensional model is established in CATIA software and imported into Hyperworks software for finite element analysis. After applying constraints and loads to the finite element model of the frame, a finite element static analysis is carried out, and a static stress test verifies whether the finite element model and the frame structure design are reasonable. Then a free modal analysis of the frame and an analysis of the first eight modal vibration shapes are carried out. The analysis results show that the maximum stress value of the frame is greater than the yield limit of the material and that the low-order modal values are close to the excitation frequency, so the design needs to be improved; this provides a theoretical reference for the structural design of the truck crane frame.

  6. Technology transfer in software engineering

    NASA Technical Reports Server (NTRS)

    Bishop, Peter C.

    1989-01-01

    The University of Houston-Clear Lake is the prime contractor for the AdaNET Research Project under the direction of NASA Johnson Space Center. AdaNET was established to promote the principles of software engineering to the software development industry. AdaNET will contain not only environments and tools, but also concepts, principles, models, standards, guidelines and practices. Initially, AdaNET will serve clients from the U.S. government and private industry who are working in software development. It will seek new clients from those who have not yet adopted the principles and practices of software engineering. Some of the goals of AdaNET are to become known as an objective, authoritative source of new software engineering information and parts, to provide easy access to information and parts, and to keep abreast of innovations in the field.

  7. Machine Tool Software

    NASA Technical Reports Server (NTRS)

    1988-01-01

    A NASA-developed software package has played a part in technical education of students who major in Mechanical Engineering Technology at William Rainey Harper College. Professor Hack has been using (APT) Automatically Programmed Tool Software since 1969 in his CAD/CAM Computer Aided Design and Manufacturing curriculum. Professor Hack teaches the use of APT programming languages for control of metal cutting machines. Machine tool instructions are geometry definitions written in APT Language to constitute a "part program." The part program is processed by the machine tool. CAD/CAM students go from writing a program to cutting steel in the course of a semester.

  8. Incubator Display Software Cost Reduction Toolset Software Requirements Specification

    NASA Technical Reports Server (NTRS)

    Moran, Susanne; Jeffords, Ralph

    2005-01-01

    The Incubator Display Software Requirements Specification was initially developed by Intrinsyx Technologies Corporation (Intrinsyx) under subcontract to Lockheed Martin, Contract Number NAS2-02090, for the National Aeronautics and Space Administration (NASA) Ames Research Center (ARC) Space Station Biological Research Project (SSBRP). The Incubator Display is a User Payload Application (UPA) used to control an Incubator subrack payload for the SSBRP. The Incubator Display functions on-orbit as part of the subrack payload laptop, on the ground as part of the Communication and Data System (CDS) ground control system, and also as part of the crew training environment.

  9. Automated Simulation For Analysis And Design

    NASA Technical Reports Server (NTRS)

    Cantwell, E.; Shenk, Tim; Robinson, Peter; Upadhye, R.

    1992-01-01

    Design Assistant Workstation (DAWN) software is being developed to facilitate simulation of qualitative and quantitative aspects of the behavior of a life-support system in a spacecraft, a chemical-processing plant, the heating and cooling system of a large building, or any of a variety of systems including interacting process streams and processes. It is used to analyze alternative design scenarios or specific designs of such systems. The expert system will automate part of the design analysis: it will reason independently by simulating design scenarios and return to the designer with overall evaluations and recommendations.

  10. 15 CFR Supplement No. 2 to Part 774 - General Technology and Software Notes

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 15 Commerce and Foreign Trade 2 2013-01-01 2013-01-01 false General Technology and Software Notes... Software Notes 1. General Technology Note. The export of “technology” that is “required” for the... necessary” information. 2. General Software Note. License Exception TSU (“mass market” software) is...

  11. 15 CFR Supplement No. 2 to Part 774 - General Technology and Software Notes

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 15 Commerce and Foreign Trade 2 2011-01-01 2011-01-01 false General Technology and Software Notes... Software Notes 1. General Technology Note. The export of “technology” that is “required” for the... necessary” information. 2. General Software Note. License Exception TSU (“mass market” software) is...

  12. 15 CFR Supplement No. 2 to Part 774 - General Technology and Software Notes

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 15 Commerce and Foreign Trade 2 2012-01-01 2012-01-01 false General Technology and Software Notes... Software Notes 1. General Technology Note. The export of “technology” that is “required” for the... necessary” information. 2. General Software Note. License Exception TSU (“mass market” software) is...

  13. ZBIT Bioinformatics Toolbox: A Web-Platform for Systems Biology and Expression Data Analysis

    PubMed Central

    Römer, Michael; Eichner, Johannes; Dräger, Andreas; Wrzodek, Clemens; Wrzodek, Finja; Zell, Andreas

    2016-01-01

    Bioinformatics analysis has become an integral part of research in biology. However, installation and use of scientific software can be difficult and often requires technical expert knowledge. Reasons are dependencies on certain operating systems or required third-party libraries, missing graphical user interfaces and documentation, or nonstandard input and output formats. In order to make bioinformatics software easily accessible to researchers, we here present a web-based platform. The Center for Bioinformatics Tuebingen (ZBIT) Bioinformatics Toolbox provides web-based access to a collection of bioinformatics tools developed for systems biology, protein sequence annotation, and expression data analysis. Currently, the collection encompasses software for conversion and processing of the community standards SBML and BioPAX, transcription factor analysis, and analysis of microarray data from transcriptomics and proteomics studies. All tools are hosted on a customized Galaxy instance and run on a dedicated computation cluster. Users only need a web browser and an active internet connection in order to benefit from this service. The web platform is designed to facilitate the usage of the bioinformatics tools for researchers without an advanced technical background. Users can combine tools for complex analyses or use predefined, customizable workflows. All results are stored persistently and are reproducible. For each tool, we provide documentation, tutorials, and example data to maximize usability. The ZBIT Bioinformatics Toolbox is freely available at https://webservices.cs.uni-tuebingen.de/. PMID:26882475

  14. Instrumentation: Software-Driven Instrumentation: The New Wave.

    ERIC Educational Resources Information Center

    Salit, M. L.; Parsons, M. L.

    1985-01-01

    Software-driven instrumentation makes measurements that demand a computer as an integral part of either control, data acquisition, or data reduction. The structure of such instrumentation, hardware requirements, and software requirements are discussed. Examples of software-driven instrumentation (such as wavelength-modulated continuum source…

  15. High-Performance Mixed Models Based Genome-Wide Association Analysis with omicABEL software

    PubMed Central

    Fabregat-Traver, Diego; Sharapov, Sodbo Zh.; Hayward, Caroline; Rudan, Igor; Campbell, Harry; Aulchenko, Yurii; Bientinesi, Paolo

    2014-01-01

    To raise the power of genome-wide association studies (GWAS) and avoid false-positive results in structured populations, one can rely on mixed model based tests. When large samples are used, and when multiple traits are to be studied in the 'omics' context, this approach becomes computationally challenging. Here we consider the problem of mixed-model based GWAS for an arbitrary number of traits, and demonstrate that for the analysis of single-trait and multiple-trait scenarios different computational algorithms are optimal. We implement these optimal algorithms in a high-performance computing framework that uses state-of-the-art linear algebra kernels, incorporates optimizations, and avoids redundant computations, increasing throughput while reducing memory usage and energy consumption. We show that, compared to existing libraries, our algorithms and software achieve considerable speed-ups. The OmicABEL software described in this manuscript is available under the GNU GPL v. 3 license as part of the GenABEL project for statistical genomics at http://www.genabel.org/packages/OmicABEL. PMID:25717363
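
    As a rough sketch of the underlying computation (not omicABEL's optimized kernels), a single-trait mixed-model scan can be reduced to generalized least squares by whitening with the Cholesky factor of the trait covariance; the variance components are assumed known here, which in practice they are not:

        # Sketch: single-trait mixed-model association scan via GLS whitening.
        # This mirrors the textbook two-step approach, not omicABEL's algorithms;
        # sg2 and se2 (variance components) are assumed known for illustration.
        import numpy as np

        def gwas_gls(y, snps, K, sg2=0.5, se2=0.5):
            """Per-SNP effect estimates under covariance V = sg2*K + se2*I."""
            n = len(y)
            V = sg2 * K + se2 * np.eye(n)
            L = np.linalg.cholesky(V)
            y_w = np.linalg.solve(L, y)             # whiten the response
            betas = []
            for g in snps.T:                        # one SNP column at a time
                X = np.column_stack([np.ones(n), g])
                X_w = np.linalg.solve(L, X)         # whiten the design
                beta, *_ = np.linalg.lstsq(X_w, y_w, rcond=None)
                betas.append(beta[1])
            return np.array(betas)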

  16. High-Performance Mixed Models Based Genome-Wide Association Analysis with omicABEL software.

    PubMed

    Fabregat-Traver, Diego; Sharapov, Sodbo Zh; Hayward, Caroline; Rudan, Igor; Campbell, Harry; Aulchenko, Yurii; Bientinesi, Paolo

    2014-01-01

    To raise the power of genome-wide association studies (GWAS) and avoid false-positive results in structured populations, one can rely on mixed model based tests. When large samples are used, and when multiple traits are to be studied in the 'omics' context, this approach becomes computationally challenging. Here we consider the problem of mixed-model based GWAS for an arbitrary number of traits, and demonstrate that for the analysis of single-trait and multiple-trait scenarios different computational algorithms are optimal. We implement these optimal algorithms in a high-performance computing framework that uses state-of-the-art linear algebra kernels, incorporates optimizations, and avoids redundant computations, increasing throughput while reducing memory usage and energy consumption. We show that, compared to existing libraries, our algorithms and software achieve considerable speed-ups. The OmicABEL software described in this manuscript is available under the GNU GPL v. 3 license as part of the GenABEL project for statistical genomics at http://www.genabel.org/packages/OmicABEL.

  17. Penn State University ground software support for X-ray missions.

    NASA Astrophysics Data System (ADS)

    Townsley, L. K.; Nousek, J. A.; Corbet, R. H. D.

    1995-03-01

    The X-ray group at Penn State is charged with two software development efforts in support of X-ray satellite missions. As part of the ACIS instrument team for AXAF, the authors are developing part of the ground software to support the instrument's calibration. They are also designing a translation program for Ginga data, to change it from the non-standard FRF format, which closely parallels the original telemetry format, to FITS.

  18. Automated system for the on-line monitoring of powder blending processes using near-infrared spectroscopy. Part I. System development and control.

    PubMed

    Hailey, P A; Doherty, P; Tapsell, P; Oliver, T; Aldridge, P K

    1996-03-01

    An automated system for the on-line monitoring of powder blending processes is described. The system employs near-infrared (NIR) spectroscopy using fibre-optics and a graphical user interface (GUI) developed in the LabVIEW environment. The complete supervisory control and data analysis (SCADA) software controls blender and spectrophotometer operation and performs statistical spectral data analysis in real time. A data analysis routine using standard deviation is described to demonstrate an approach to the real-time determination of blend homogeneity.
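
    A minimal sketch of a standard-deviation-based homogeneity criterion of the kind described, assuming spectra arrive as rows of an array; the block size and threshold are illustrative, not the paper's values:

        # Sketch: moving-block standard deviation across consecutive NIR spectra,
        # a common real-time criterion for blend homogeneity.
        import numpy as np

        def moving_block_std(spectra, block=5):
            """Mean over wavelengths of the std over the last `block` spectra."""
            out = []
            for i in range(block, len(spectra) + 1):
                blockspec = spectra[i - block:i]        # (block, n_wavelengths)
                out.append(np.std(blockspec, axis=0).mean())
            return np.array(out)

        # The blend is typically declared homogeneous once the criterion stays
        # below a threshold for several consecutive blocks, e.g.:
        # np.all(moving_block_std(spectra)[-3:] < 1e-3)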

  19. Research study demonstrates computer simulation can predict warpage and assist in its elimination

    NASA Astrophysics Data System (ADS)

    Glozer, G.; Post, S.; Ishii, K.

    1994-10-01

    Programs for predicting warpage in injection molded parts are relatively new. Commercial software for simulating the flow and cooling stages of injection molding has steadily gained acceptance; however, warpage software is not yet as readily accepted. This study focused on gaining an understanding of the predictive capabilities of the warpage software. The following aspects of this study were unique. (1) Quantitative results were found using a statistically designed set of experiments. (2) Comparisons between experimental and simulation results were made with parts produced in a well-instrumented and controlled injection molding machine. (3) The experimental parts were accurately measured on a coordinate measuring machine with a non-contact laser probe. (4) The effect of part geometry on warpage was investigated.

  20. Integrated thermal disturbance analysis of optical system of astronomical telescope

    NASA Astrophysics Data System (ADS)

    Yang, Dehua; Jiang, Zibo; Li, Xinnan

    2008-07-01

    During operation, an astronomical telescope undergoes thermal disturbance, which is especially serious in a solar telescope and may cause degradation of image quality. This drives careful thermal load investigation and measures to assess its effect on final image quality during the design phase. Integrated modeling analysis supports the search for a comprehensive optimal design scheme through software simulation. In this paper, we focus on the finite element analysis (FEA) software ANSYS for thermal disturbance analysis and the optical design software ZEMAX for optical system design. The integrated model based on ANSYS and ZEMAX is first described from an overview point of view. Afterwards, we discuss the establishment of the thermal model. A complete power-series polynomial in the spatial coordinates is introduced to represent the temperature field analytically. We also borrow the linear interpolation technique derived from shape functions in finite element theory to interface the thermal model with the structural model, and further to apply the temperatures to the structural model nodes. Thereby, the thermal loads are transferred with as high fidelity as possible. The data interface and communication between the two software packages are discussed, mainly for mirror surfaces and hence for optical figure representation and transformation. We compare and comment on two different methods, Zernike polynomials and power series expansion, for representing deformed optical surfaces and transferring them to ZEMAX. Additionally, the application of these methods to surfaces with non-circular apertures is discussed. At the end, an optical telescope with a parabolic primary mirror of 900 mm in diameter is analyzed to illustrate the above discussion. A finite element model of the parts of the telescope of most interest is generated in ANSYS with the necessary structural simplification and equivalencing. Thermal analysis is performed, the resulting positions and figures of the optics are retrieved and transferred to ZEMAX, and the final image quality is evaluated under thermal disturbance.
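
    A minimal sketch of the Zernike-fitting step, using only the first four terms on a circular aperture; the basis normalization is an assumption, and the actual ANSYS-to-ZEMAX interface is more elaborate:

        # Sketch: least-squares fit of a deformed mirror surface with the first
        # few Zernike terms (piston, tilts, defocus) before handing the figure
        # to the optics code. Basis and normalization are illustrative.
        import numpy as np

        def zernike_fit(x, y, dz, R):
            """Fit sag deviations dz(x, y) on a circular aperture of radius R."""
            rho, theta = np.hypot(x, y) / R, np.arctan2(y, x)
            basis = np.column_stack([
                np.ones_like(rho),            # Z0: piston
                rho * np.cos(theta),          # Z1: x tilt
                rho * np.sin(theta),          # Z2: y tilt
                2 * rho**2 - 1,               # Z3: defocus
            ])
            coef, *_ = np.linalg.lstsq(basis, dz, rcond=None)
            residual = dz - basis @ coef      # figure error left unfitted
            return coef, residual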

  1. 15 CFR Supplement No. 2 to Part 774 - General Technology and Software Notes

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 15 Commerce and Foreign Trade 2 2014-01-01 2014-01-01 false General Technology and Software Notes... Software Notes 1. General Technology Note. The export of “technology” that is “required” for the... necessary” information. 2. General Software Note. License Exception TSU (mass market software) (see § 740.13...

  2. Assessing Survivability Using Software Fault Injection

    DTIC Science & Technology

    2001-04-01

    UNCLASSIFIED Defense Technical Information Center Compilation Part Notice ADP010875. TITLE: Assessing Survivability Using Software Fault Injection. Jeffrey Voas, Reliable Software Technologies, 21351 Ridgetop Circle, #400, Dulles, VA 20166 (jmvoas@rstcorp.com).

  3. Putting Safety in the Software

    NASA Technical Reports Server (NTRS)

    Wetherholt, Martha S.; Berens, Kalynnda M.; Hardy, Sandra (Technical Monitor)

    2001-01-01

    Software is a vital component of nearly every piece of modern technology. It is not a 'sub-system', able to be separated out from the system as a whole, but a 'co-system' that controls, manipulates, or interacts with the hardware and with the end user. Software has its fingers into all the pieces of the pie. If that 'pie', the system, can lead to injury, death, loss of major equipment, or impact your business bottom line, then software safety becomes vitally important. Learning to think about software from a safety perspective is the focus of this paper. We want you to think of software as part of the safety critical system, a major part. This requires 'system thinking' - being able to grasp the whole picture. Software's contribution to modern technology is both good and potentially bad. Software allows more complex and useful devices to be built. It can also contribute to plane crashes and power outages. We want you to see software in a whole new light, see it as a contributor to system hazards, and also as a possible fix or mitigation to some of those hazards.

  4. Software for calculating vegetation disturbance and recovery by using the equivalent clearcut area model.

    Treesearch

    Alan A. Ager; Caty Clifton

    2005-01-01

    The use of cumulative watershed effects models is mandated as part of interagency consultation over projects that might affect habitat for salmonids federally listed as threatened or endangered. Cumulative effects analysis is also required by a number of national forest plans in the Pacific Northwest Region (Region 6). Cumulative watershed effects in many cases are...

  5. Analysis and optimization of dynamic model of eccentric shaft grinder

    NASA Astrophysics Data System (ADS)

    Gao, Yangjie; Han, Qiushi; Li, Qiguang; Peng, Baoying

    2018-04-01

    The eccentric shaft servo grinder is the core equipment in the process chain for machining eccentric shafts. The establishment of the movement model and the determination of the kinematic relations of the axes in the grinding process directly affect the quality of the grinding process; there are many error factors in grinding, and it is very important to analyze the influence of these factors on workpiece quality. The three-dimensional model of the eccentric shaft grinder is drawn with the Pro/E three-dimensional drawing software, imported into the ANSYS Workbench finite element analysis software, and analyzed; the vibration characteristics and parameters of each component of the bed are then obtained from the modal analysis results. The natural frequencies and mode shapes of the first six modes of the eccentric shaft grinder are obtained by modal analysis, the weak links among the grinder's parts are identified, and a reference improvement method is proposed for the future design of eccentric shaft grinders.

  6. Data Analysis for the LISA Pathfinder Mission

    NASA Technical Reports Server (NTRS)

    Thorpe, James Ira

    2009-01-01

    The LTP (LISA Technology Package) is the core part of the Laser Interferometer Space Antenna (LISA) Pathfinder mission. The main goal of the mission is to study the sources of any disturbances that perturb the motion of the freely-falling test masses from their geodesic trajectories, as well as to test various technologies needed for LISA. The LTP experiment is designed as a sequence of experimental runs in which the performance of the instrument is studied and characterized under different operating conditions. In order to best optimize subsequent experimental runs, each run must be promptly analysed to ensure that the following ones make the best use of the available knowledge of the instrument. In order to do this, all analyses must be designed and tested in advance of the mission and have sufficient built-in flexibility to account for unexpected results or behaviour. To support this activity, a robust and flexible data analysis software package is also required. This poster presents two of the main components that make up the data analysis effort: the data analysis software and the mock-data challenges used to validate analysis procedures and experiment designs.

  7. Lapin Data Interchange Among Database, Analysis and Display Programs Using XML-Based Text Files

    NASA Technical Reports Server (NTRS)

    2004-01-01

    The purpose was to investigate and evaluate the interchange of application-specific data among multiple programs, each carrying out part of the analysis and design task. This has previously been carried out by creating a custom program to read data produced by one application and then write that data to a file whose format is specific to the second application that needs all or part of that data. In this investigation, data of interest is described using the XML markup language, which allows the data to be stored in a text string. Software to transform the output data of a task into an XML string, and software to read an XML string and extract all or a portion of the data needed for another application, are used to link two independent applications together as part of an overall design effort. This approach was initially used with a standard analysis program, Lapin, along with standard applications: a spreadsheet program, a relational database program, and a conventional dialog and display program, to demonstrate the successful sharing of data among independent programs. See Engineering Analysis Using a Web-Based Protocol by J.D. Schoeffler and R.W. Claus, NASA TM-2002-211981, October 2002. Most of the effort beyond that demonstration has been concentrated on the inclusion of more complex display programs. Specifically, a custom-written windowing program organized around dialogs to control the interactions has been combined with an independent CAD program (Open Cascade) that supports sophisticated display of CAD elements such as lines, spline curves, and surfaces, and with turbine-blade data produced by an independent blade design program (UD0300).
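
    A minimal sketch of the XML-string interchange pattern, using Python's standard library; the element and parameter names are invented for illustration and are not those of the Lapin effort:

        # Sketch: serialize one tool's output as an XML string, then let another
        # tool extract only the subset of parameters it needs.
        import xml.etree.ElementTree as ET

        def to_xml_string(blade):
            root = ET.Element("blade")
            for name, value in blade.items():
                ET.SubElement(root, "param", name=name).text = str(value)
            return ET.tostring(root, encoding="unicode")

        def read_params(xml_string, wanted):
            """Extract only the parameters another application needs."""
            root = ET.fromstring(xml_string)
            return {p.get("name"): float(p.text)
                    for p in root.iter("param") if p.get("name") in wanted}

        s = to_xml_string({"chord": 0.12, "span": 0.90, "twist": 14.5})
        print(read_params(s, wanted={"chord", "twist"}))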

  8. Fully automated corneal endothelial morphometry of images captured by clinical specular microscopy

    NASA Astrophysics Data System (ADS)

    Bucht, Curry; Söderberg, Per; Manneberg, Göran

    2010-02-01

    The corneal endothelium serves as the posterior barrier of the cornea. Factors such as the clarity and refractive properties of the cornea are in direct relationship to the quality of the endothelium. The endothelial cell density is considered the most important morphological factor of the corneal endothelium. Pathological conditions and physical trauma may threaten the endothelial cell density to such an extent that the optical properties of the cornea, and thus clear eyesight, are threatened. Diagnosis of the corneal endothelium through morphometry is an important part of several clinical applications. Morphometry of the corneal endothelium is presently carried out by semi-automated analysis of pictures captured by a Clinical Specular Microscope (CSM). Because of the occasional need for operator involvement, this process can be tedious, having a negative impact on sampling size. This study was dedicated to the development and use of fully automated analysis of a very large range of images of the corneal endothelium, captured by CSM, using Fourier analysis. Software was developed in the mathematical programming language Matlab. Pictures of the corneal endothelium, captured by CSM, were read into the analysis software. The software automatically performed digital enhancement of the images, normalizing brightness and contrast. The digitally enhanced images of the corneal endothelium were Fourier transformed using the fast Fourier transform (FFT) and stored as new images. Tools were developed and applied for identification and analysis of relevant characteristics of the Fourier transformed images. The data obtained from each Fourier transformed image was used to calculate the mean cell density of its corresponding corneal endothelium. The calculation was based on well-known diffraction theory. Results in the form of estimated cell density of the corneal endothelium were obtained using the fully automated analysis software on 292 images captured by CSM. The cell density obtained by the fully automated analysis was compared to the cell density obtained from classical, semi-automated analysis, and a relatively high correlation was found.
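
    A sketch of the spectral-ring measurement at the heart of such an analysis, assuming a square image; the conversion from ring radius to cells per unit area depends on the diffraction theory and imaging geometry, which are not reproduced here:

        # Sketch: locate the dominant ring in the radially averaged 2D power
        # spectrum of an endothelium image. Illustrative, not the study's code.
        import numpy as np

        def ring_radius(image):
            """Radius (cycles per image width) of the spectral ring."""
            F = np.fft.fftshift(np.fft.fft2(image - image.mean()))
            P = np.abs(F) ** 2
            ny, nx = P.shape
            yy, xx = np.indices(P.shape)
            r = np.hypot(yy - ny // 2, xx - nx // 2).astype(int)
            radial = np.bincount(r.ravel(), weights=P.ravel()) / np.bincount(r.ravel())
            radial[:2] = 0                    # suppress the DC neighborhood
            return int(np.argmax(radial))

        # For a quasi-hexagonal cell mosaic, cell density scales roughly with
        # (ring_radius / field_width)**2; calibration to the CSM optics is needed.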

  9. Software tool for mining liquid chromatography/multi-stage mass spectrometry data for comprehensive glycerophospholipid profiling.

    PubMed

    Hein, Eva-Maria; Bödeker, Bertram; Nolte, Jürgen; Hayen, Heiko

    2010-07-30

    Electrospray ionization mass spectrometry (ESI-MS) has emerged as an indispensable tool in the field of lipidomics. Despite the growing interest in lipid analysis, there are only a few software tools available for data evaluation, as compared for example to proteomics applications. This makes comprehensive lipid analysis a complex challenge. Thus, a computational tool for harnessing the raw data from liquid chromatography/mass spectrometry (LC/MS) experiments was developed in this study and is available from the authors on request. The Profiler-Merger-Viewer tool is a software package for automatic processing of raw data from data-dependent experiments, measured by high-performance liquid chromatography hyphenated to electrospray ionization hybrid linear ion trap Fourier transform mass spectrometry (FTICR-MS and Orbitrap) in single and multi-stage mode. The software contains three parts: processing of the raw data by Profiler for lipid identification, summarizing of replicate measurements by Merger, and visualization of all relevant data (chromatograms as well as mass spectra) for validation of the results by Viewer. The tool is easily accessible, since it is implemented in Java and uses Microsoft Excel (XLS) as its output format. The motivation was to develop a tool which supports and accelerates the manual data evaluation (identification and relative quantification) significantly but does not make a complete data analysis within a black-box system. The software's mode of operation, usage and options are demonstrated on the basis of a lipid extract of baker's yeast (S. cerevisiae). In this study, we focused on three important representatives of lipids: glycerophospholipids, lyso-glycerophospholipids and free fatty acids.

  10. A Proven Methodology for Developing Secure Software and Applying It to Ground Systems

    NASA Technical Reports Server (NTRS)

    Bailey, Brandon

    2016-01-01

    Part Two expands upon Part One in an attempt to translate the methodology for ground system personnel. The goal is to build upon the methodology presented in Part One by showing examples and details on how to implement the methodology. Section 1: Ground Systems Overview; Section 2: Secure Software Development; Section 3: Defense in Depth for Ground Systems; Section 4: What Now?

  11. Abstract - Cooperative Research and Development Agreement between Applied Spectra, Inc. and National Energy Technology Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McIntyre, Dustin L.; Russo, Richard

    Applied Spectra, as our industrial collaborator, is helping us develop our downhole LIBS sensor. Our part of the collaboration is the design, construction, and validation of the miniaturized fiber-coupled laser, whereas Applied Spectra will provide technical guidance and control/analysis software. This will allow our system, which is traditionally operated by a person, to be automated in both data collection and analysis, significantly increasing its TRL.

  12. Archiving Software Systems: Approaches to Preserve Computational Capabilities

    NASA Astrophysics Data System (ADS)

    King, T. A.

    2014-12-01

    A great deal of effort is made to preserve scientific data, not only because data is knowledge but also because it is often costly to acquire and is sometimes collected under unique circumstances. Another part of the science enterprise is the development of software to process and analyze the data. Developed software is also a large investment and worthy of preservation. However, the long-term preservation of software presents some challenges. Software often requires a specific technology stack to operate, which can include software, operating system and hardware dependencies. One past approach to preserving computational capabilities is to maintain ancient hardware long past its typical viability; on an archive horizon of 100 years, this is not feasible. Another approach is to archive source code. While this can preserve details of the implementation and algorithms, it may not be possible to reproduce the technology stack needed to compile and run the resulting applications. This forward-looking dilemma has a solution. Technology used to create clouds and process big data can also be used to archive and preserve computational capabilities. We explore how basic hardware, virtual machines, containers and appropriate metadata can be used to preserve computational capabilities and to archive functional software systems. In conjunction with data archives, this provides scientists with both the data and the capability to reproduce the processing and analysis used to generate past scientific results.

  13. Distributed Software for Observations in the Near Infrared

    NASA Astrophysics Data System (ADS)

    Gavryusev, V.; Baffa, C.; Giani, E.

    We have developed an integrated system that performs astronomical observations in near-infrared bands, operating the two-dimensional instruments ARNICA (http://helios.arcetri.astro.it:/home/idefix/Mosaic/instr/arnica/arnica.html) and LONGSP (http://helios.arcetri.astro.it:/home/idefix/Mosaic/instr/longsp/longsp.html) at the Italian National Infrared Facility. This software consists of several communicating processes, generally executed across a network as well as on a single computer. The user interface is organized as a widget-based X11 client. The interprocess communication is provided by sockets and uses TCP/IP. The processes that control hardware (the telescope and other instruments) are currently executed on a dedicated PC under DESQview/X, while all other components (the user interface, tools for data analysis, etc.) can also run under UNIX. The hardware-independent part of the software is based on the Athena Widget Set and is compiled with GNU C to provide maximum portability.
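
    The socket-based interprocess communication pattern can be sketched as a minimal TCP command/acknowledge pair (run the server in one process and the client in another); the port and message framing are illustrative assumptions, not the system's actual protocol:

        # Sketch: minimal TCP/IP command exchange between two processes,
        # in the spirit of the socket-based IPC described above.
        import socket

        def serve_once(port=5050):
            """Accept one command from a client process and acknowledge it."""
            with socket.create_server(("127.0.0.1", port)) as srv:
                conn, _ = srv.accept()
                with conn:
                    command = conn.recv(1024).decode()
                    conn.sendall(f"ack: {command}".encode())

        def send_command(command, port=5050):
            """Client side: send a command and return the server's reply."""
            with socket.create_connection(("127.0.0.1", port)) as s:
                s.sendall(command.encode())
                return s.recv(1024).decode()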

  14. Development of multichannel analyzer using sound card ADC for nuclear spectroscopy system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ibrahim, Maslina Mohd; Yussup, Nolida; Lombigit, Lojius

    This paper describes the development of a Multi-Channel Analyzer (MCA) using a sound card analogue-to-digital converter (ADC) for a nuclear spectroscopy system. The system is divided into a hardware module and a software module. The hardware module consists of a 2" by 2" NaI(Tl) detector, a pulse shaping amplifier (PSA) and a built-in ADC chip readily available in any computer's sound system. The software module is divided into two parts: pre-processing of the raw digital input and the development of the MCA software. A band-pass filter and baseline stabilization and correction were implemented for the pre-processing. For the MCA development, the pulse height analysis method was used to process the signal before displaying it using a histogram technique. The development and test results for using the sound card as an MCA are discussed.
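
    A minimal sketch of the pulse height analysis step, assuming a normalized digitized waveform; the threshold and channel count are illustrative, not the paper's values:

        # Sketch: pulse height analysis on a digitized sound-card waveform,
        # producing the channel counts of an MCA spectrum.
        import numpy as np

        def pulse_height_histogram(samples, threshold=0.05, channels=1024):
            """Detect pulses above threshold and histogram their peak heights."""
            above = samples > threshold
            rises = np.flatnonzero(np.diff(above.astype(int)) == 1)   # pulse starts
            falls = np.flatnonzero(np.diff(above.astype(int)) == -1)  # pulse ends
            heights = []
            for start in rises:
                later = falls[falls > start]
                stop = later[0] if later.size else len(samples)
                heights.append(samples[start:stop].max())
            counts, _ = np.histogram(heights, bins=channels, range=(0.0, 1.0))
            return counts    # the MCA spectrum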

  15. Lapin Data Interchange Among Database, Analysis and Display Programs Using XML-Based Text Files

    NASA Technical Reports Server (NTRS)

    2005-01-01

    The purpose of grant NCC3-966 was to investigate and evaluate the interchange of application-specific data among multiple programs, each carrying out part of the analysis and design task. This has previously been carried out by creating a custom program to read data produced by one application and then write that data to a file whose format is specific to the second application that needs all or part of that data. In this investigation, data of interest is described using the XML markup language, which allows the data to be stored in a text string. Software to transform the output data of a task into an XML string, and software to read an XML string and extract all or a portion of the data needed for another application, are used to link two independent applications together as part of an overall design effort. This approach was initially used with a standard analysis program, Lapin, along with standard applications: a spreadsheet program, a relational database program, and a conventional dialog and display program, to demonstrate the successful sharing of data among independent programs. Most of the effort beyond that demonstration has been concentrated on the inclusion of more complex display programs. Specifically, a custom-written windowing program organized around dialogs to control the interactions has been combined with an independent CAD program (Open Cascade) that supports sophisticated display of CAD elements such as lines, spline curves, and surfaces, and with turbine-blade data produced by an independent blade design program (UD0300).

  16. Digital surfaces and hydrogeologic data for the Floridan aquifer system in Florida and in parts of Georgia, Alabama, and South Carolina

    USGS Publications Warehouse

    Bellino, Jason C.

    2011-01-01

    A digital dataset for the Floridan aquifer system in Florida and in parts of Georgia, Alabama, and South Carolina was developed from selected reports published as part of the Regional Aquifer-System Analysis (RASA) Program of the U.S. Geological Survey (USGS) in the 1980s. These reports contain maps and data depicting the extent and elevation of both time-stratigraphic and hydrogeologic units of which the aquifer system is composed, as well as data on hydrology, meteorology, and aquifer properties. The three primary reports used for this dataset compilation were USGS Professional Paper 1403-B (Miller, 1986), Professional Paper 1403-C (Bush and Johnston, 1988), and USGS Open-File Report 88-86 (Miller, 1988). Paper maps from Professional Papers 1403-B and 1403-C were scanned and georeferenced to the North American Datum of 1927 (NAD27) using the Lambert Conformal Conic projection (standard parallels 33 and 45 degrees, central longitude -96 degrees, central latitude 39 degrees). Once georeferenced, tracing of pertinent line features contained in each image (for example, contours and faults) was facilitated by specialized software using algorithms that automated much of the process. Resulting digital line features were then processed using standard geographic information system (GIS) software to remove artifacts from the digitization process and to verify and update attribute tables. The digitization process for polygonal features (for example, outcrop areas and unit extents) was completed by hand using GIS software.

  17. The ESA's Space Trajectory Analysis software suite

    NASA Astrophysics Data System (ADS)

    Ortega, Guillermo

    The European Space Agency (ESA) initiated in 2005 an internal activity to develop an open source software suite involving university science departments and research institutions all over the world. This project is called the "Space Trajectory Analysis", or STA. This article describes the birth of STA and its present configuration. One of STA's aims is to promote the exchange of technical ideas and to raise knowledge and competence in the areas of applied mathematics, space engineering, and informatics at the university level. Conceived as a research and education tool to support the analysis phase of a space mission, STA is able to visualize a wide range of space trajectories, including among others ascent, re-entry, descent and landing trajectories, orbits around planets and moons, interplanetary trajectories, and rendezvous trajectories. The article explains that the STA project is an original idea of the Technical Directorate of ESA. It was born in August 2005 to provide a framework for astrodynamics research at the university level. As research and education software applicable to academia, its development is supported by a number of universities that have joined ESA in leading it. The partnership between ESA and the universities is expressed in the STA Steering Board: together with ESA, each university has a chair on the board, whose tasks are to develop, control, promote, maintain, and expand the software suite. The article describes that STA provides calculations in the fields of spacecraft tracking, attitude analysis, coverage and visibility analysis, orbit determination, position and velocity of solar system bodies, etc. STA implements the concept of a "space scenario" composed of solar system bodies, spacecraft, ground stations, pads, etc. It is able to propagate the orbit of a spacecraft using the included orbital propagators. STA is able to compute communication links between objects of a scenario (coverage, line of sight) and to represent the trajectory computations and the relationships between objects in 2D and 3D formats. Further, the article explains that the STA development is open source and is based on state-of-the-art astrodynamics routines that are grouped into modules. The modules are programmed in the C++ language. The different STA modules are designed, developed, tested and verified by the different universities; software integration and overall validation are performed by ESA. Students are chosen to work on STA modules as part of their Master's or PhD thesis programs. As part of their growing experience, the students learn how to write documentation for a space project using European Cooperation for Space Standardization (ECSS) standards, how to test and verify the software modules they write, and how to interact with ESA and each other in this process. Finally, the article concludes on the benefits of the STA initiative. The STA project forges a strong link among the applied mathematics, space engineering, and informatics disciplines by reinforcing the academic community with requirements and needs coming from the real missions of space agencies and industry.
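
    The simplest propagator such a suite includes can be sketched as two-body Keplerian propagation integrated with a fixed-step RK4; the initial state is an illustrative low Earth orbit, and this is not STA's actual C++ implementation:

        # Sketch: two-body orbit propagation with a fixed-step RK4 integrator.
        # Units are km and s; the state vector is [x, y, z, vx, vy, vz].
        import numpy as np

        MU_EARTH = 398600.4418   # Earth's gravitational parameter, km^3/s^2

        def two_body(state):
            r, v = state[:3], state[3:]
            a = -MU_EARTH * r / np.linalg.norm(r) ** 3
            return np.concatenate([v, a])

        def rk4_step(state, dt):
            k1 = two_body(state)
            k2 = two_body(state + 0.5 * dt * k1)
            k3 = two_body(state + 0.5 * dt * k2)
            k4 = two_body(state + dt * k3)
            return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

        state = np.array([7000.0, 0.0, 0.0, 0.0, 7.546, 0.0])  # ~circular LEO
        for _ in range(5400):        # propagate ~90 minutes at dt = 1 s
            state = rk4_step(state, 1.0)
        print(state[:3])             # position after one revolution, roughly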

  18. What does voice-processing technology support today?

    PubMed Central

    Nakatsu, R; Suzuki, Y

    1995-01-01

    This paper describes the state of the art in applications of voice-processing technologies. In the first part, technologies concerning the implementation of speech recognition and synthesis algorithms are described. Hardware technologies such as microprocessors and DSPs (digital signal processors) are discussed. The software development environment, a key technology in developing application software ranging from DSP software to support software, is also described. In the second part, the state of the art of algorithms from the standpoint of applications is discussed. Several issues concerning the evaluation of speech recognition/synthesis algorithms are covered, as well as issues concerning the robustness of algorithms in adverse conditions. PMID:7479720

  19. CytoSpectre: a tool for spectral analysis of oriented structures on cellular and subcellular levels.

    PubMed

    Kartasalo, Kimmo; Pölönen, Risto-Pekka; Ojala, Marisa; Rasku, Jyrki; Lekkala, Jukka; Aalto-Setälä, Katriina; Kallio, Pasi

    2015-10-26

    Orientation and the degree of isotropy are important in many biological systems, such as the sarcomeres of cardiomyocytes and other fibrillar structures of the cytoskeleton. Image-based analysis of such structures is often limited to qualitative evaluation by human experts, hampering the throughput, repeatability, and reliability of the analyses. Software tools are not readily available for this purpose, and the existing methods typically rely at least partly on manual operation. We developed CytoSpectre, an automated tool based on spectral analysis, allowing the quantification of orientation and also of size distributions of structures in microscopy images. CytoSpectre utilizes the Fourier transform to estimate the power spectrum of an image and, based on the spectrum, computes parameter values describing, among others, the mean orientation, isotropy, and size of target structures. The analysis can be further tuned to focus on targets of particular size at cellular or subcellular scales. The software can be operated via a graphical user interface without any programming expertise. We analyzed the performance of CytoSpectre by extensive simulations using artificial images, by benchmarking against FibrilTool, and by comparisons with manual measurements performed for real images by a panel of human experts. The software was found to be tolerant to noise and blurring and superior to FibrilTool when analyzing realistic targets with degraded image quality. The analysis of real images indicated generally good agreement between computational and manual results, while also revealing notable expert-to-expert variation. Moreover, the experiment showed that CytoSpectre can handle images of different cell types obtained using different microscopy techniques. Finally, we studied the effect of mechanical stretching on cardiomyocytes to demonstrate the software in an actual experiment, and observed changes in cellular orientation in response to stretching. CytoSpectre, a versatile, easy-to-use software tool for spectral analysis of microscopy images, was developed. The tool is compatible with most 2D images and can be used to analyze targets at different scales. We expect the tool to be useful in diverse applications dealing with structures whose orientation and size distributions are of interest. While designed for the biological field, the software could also be useful in non-biological applications.
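    The core computation the abstract describes — estimating a mean orientation from the 2D power spectrum — can be sketched briefly. The following is a minimal illustration in NumPy, not CytoSpectre's own code; the function name and the doubled-angle averaging are choices made here for the sketch.

    ```python
    # Minimal sketch (not CytoSpectre itself): estimate the dominant orientation
    # of image structures from the 2D Fourier power spectrum, using NumPy only.
    import numpy as np

    def dominant_orientation(image):
        """Return the mean orientation (degrees, 0-180) of oriented structures."""
        spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2
        h, w = spectrum.shape
        cy, cx = h // 2, w // 2
        ys, xs = np.mgrid[0:h, 0:w]
        # Angle of each frequency component relative to the spectrum centre.
        angles = np.arctan2(ys - cy, xs - cx)
        spectrum[cy, cx] = 0.0  # ignore the DC term
        # Doubled-angle trick so opposite frequency directions reinforce.
        c = np.sum(spectrum * np.cos(2 * angles))
        s = np.sum(spectrum * np.sin(2 * angles))
        theta = 0.5 * np.arctan2(s, c)
        # Structures lie perpendicular to their dominant frequency direction.
        return np.degrees(theta + np.pi / 2) % 180

    # Example: vertical stripes should yield an orientation near 90 degrees.
    img = np.sin(np.linspace(0, 20 * np.pi, 128))[None, :] * np.ones((128, 1))
    print(round(dominant_orientation(img), 1))
    ```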

  20. What Are HyperCard? (Part 2).

    ERIC Educational Resources Information Center

    Marcus, Stephen

    1989-01-01

    Presents the second article in a two-part series on HyperCard materials (computer software used to build structures that create patterns and connections) designed for English and language arts classes. Suggests assignments for use with early HyperCard software that can be adapted to a variety of nonverbal "stackware." (MM)

  1. Items Supporting the Hanford Internal Dosimetry Program Implementation of the IMBA Computer Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carbaugh, Eugene H.; Bihl, Donald E.

    2008-01-07

    The Hanford Internal Dosimetry Program has adopted the computer code IMBA (Integrated Modules for Bioassay Analysis) as its primary code for bioassay data evaluation and dose assessment using the methodologies of ICRP Publications 60, 66, 67, 68, and 78. The adoption of this code was part of the implementation plan for the June 8, 2007 amendments to 10 CFR 835. This information release includes action items unique to IMBA that were required by PNNL quality assurance standards for the implementation of safety software. Copies of the IMBA software verification test plan and the outline of the briefing given to new users are also included.

  2. NecroQuant: quantitative assessment of radiological necrosis

    NASA Astrophysics Data System (ADS)

    Hwang, Darryl H.; Mohamed, Passant; Varghese, Bino A.; Cen, Steven Y.; Duddalwar, Vinay

    2017-11-01

    Clinicians can now objectively quantify tumor necrosis by Hounsfield units and enhancement characteristics from multiphase contrast-enhanced CT imaging. NecroQuant has been designed to work as part of a radiomics pipeline. The software is a departure from the conventional qualitative assessment of tumor necrosis, as it provides the user (radiologists and researchers) a simple interface to precisely and interactively define and measure necrosis in contrast-enhanced CT images. Although the software is tested here on renal masses, it can be re-configured to assess tumor necrosis across a variety of tumors from different body sites, providing a generalized, open, portable, and extensible quantitative analysis platform that is widely applicable across cancer types to quantify tumor necrosis.
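    As an illustration of the kind of measurement described (not the NecroQuant implementation), necrosis can be quantified as the fraction of voxels in a segmented region whose attenuation falls inside a Hounsfield-unit window; the window limits below are hypothetical.

    ```python
    # Illustrative only: estimate the necrotic fraction of a segmented tumor ROI
    # from CT attenuation, assuming a simple HU window for non-enhancing tissue.
    import numpy as np

    def necrotic_fraction(hu_roi, mask, lo=-20.0, hi=20.0):
        """Fraction of masked voxels whose HU falls in the necrosis window."""
        voxels = hu_roi[mask]
        necrotic = (voxels >= lo) & (voxels <= hi)
        return necrotic.sum() / voxels.size

    # Synthetic example: a 3D ROI where ~30% of voxels read near water density.
    rng = np.random.default_rng(0)
    roi = rng.normal(80, 10, (32, 32, 16))        # enhancing tissue ~80 HU
    roi[:10] = rng.normal(5, 5, (10, 32, 16))     # necrotic core ~5 HU
    mask = np.ones(roi.shape, dtype=bool)
    print(f"necrotic fraction: {necrotic_fraction(roi, mask):.2f}")
    ```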

  3. PYCHEM: a multivariate analysis package for python.

    PubMed

    Jarvis, Roger M; Broadhurst, David; Johnson, Helen; O'Boyle, Noel M; Goodacre, Royston

    2006-10-15

    We have implemented a multivariate statistical analysis toolbox, with an optional standalone graphical user interface (GUI), using the Python scripting language. This is a free and open source project that addresses the need for a multivariate analysis toolbox in Python. Although the functionality provided does not cover the full range of multivariate tools that are available, it has a broad complement of methods that are widely used in the biological sciences. In contrast to tools like MATLAB, PyChem 2.0.0 is easily accessible and free, allows for rapid extension using a range of Python modules and is part of the growing amount of complementary and interoperable scientific software in Python based upon SciPy. One of the attractions of PyChem is that it is an open source project and so there is an opportunity, through collaboration, to increase the scope of the software and to continually evolve a user-friendly platform that has applicability across a wide range of analytical and post-genomic disciplines. http://sourceforge.net/projects/pychem
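    To make the toolbox's scope concrete, here is a minimal principal component analysis in NumPy, the kind of multivariate method such a package provides; this is a generic sketch, not PyChem's own API.

    ```python
    # A minimal PCA sketch (not PyChem's API): covariance-based PCA via the SVD
    # of the mean-centred data matrix.
    import numpy as np

    def pca(X, n_components=2):
        """Project rows of X onto the top principal components."""
        Xc = X - X.mean(axis=0)                    # mean-centre each variable
        # SVD of centred data: rows of Vt are the principal axes (loadings).
        U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
        scores = Xc @ Vt[:n_components].T
        explained = (S**2 / np.sum(S**2))[:n_components]
        return scores, explained

    rng = np.random.default_rng(1)
    X = rng.normal(size=(50, 8))
    X[:, 0] = 3 * X[:, 1] + rng.normal(scale=0.1, size=50)  # correlated pair
    scores, explained = pca(X)
    print("variance explained by PC1, PC2:", np.round(explained, 2))
    ```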

  4. Quantitative imaging assay for NF-κB nuclear translocation in primary human macrophages

    PubMed Central

    Noursadeghi, Mahdad; Tsang, Jhen; Haustein, Thomas; Miller, Robert F.; Chain, Benjamin M.; Katz, David R.

    2008-01-01

    Quantitative measurement of NF-κB nuclear translocation is an important research tool in cellular immunology. Established methodologies have a number of limitations, such as poor sensitivity, high cost or dependence on cell lines. Novel imaging methods to measure nuclear translocation of transcriptionally active components of NF-κB are being used but are also partly limited by the need for specialist imaging equipment or image analysis software. Herein we present a method for quantitative detection of NF-κB rel A nuclear translocation, using immunofluorescence microscopy and the public domain image analysis software ImageJ that can be easily adopted for cellular immunology research without the need for specialist image analysis expertise and at low cost. The method presented here is validated by demonstrating the time course and dose response of NF-κB nuclear translocation in primary human macrophages stimulated with LPS, and by comparison with a commercial NF-κB activation reporter cell line. PMID:18036607
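    The underlying quantity — rel A nuclear translocation — reduces to a ratio of mean intensities over two image masks. The sketch below illustrates that computation in Python under stated assumptions (binary nucleus and cell masks, a single fluorescence channel); the published method performs the equivalent steps in ImageJ.

    ```python
    # Hypothetical sketch of the measurement (the paper uses ImageJ): quantify
    # NF-kB nuclear translocation as the ratio of mean nuclear to mean
    # cytoplasmic fluorescence intensity, given binary compartment masks.
    import numpy as np

    def nuclear_translocation_ratio(rela_channel, nucleus_mask, cell_mask):
        """Mean nuclear / mean cytoplasmic intensity of the rel A channel."""
        cytoplasm_mask = cell_mask & ~nucleus_mask
        nuclear_mean = rela_channel[nucleus_mask].mean()
        cytoplasmic_mean = rela_channel[cytoplasm_mask].mean()
        return nuclear_mean / cytoplasmic_mean

    # Synthetic example: a stimulated cell with a brighter nucleus.
    img = np.full((64, 64), 50.0)                 # cytoplasmic signal
    img[24:40, 24:40] = 120.0                     # nuclear signal
    nucleus = np.zeros((64, 64), bool); nucleus[24:40, 24:40] = True
    cell = np.ones((64, 64), bool)
    print(f"ratio: {nuclear_translocation_ratio(img, nucleus, cell):.1f}")
    ```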

  5. Logistics Modeling for Lunar Exploration Systems

    NASA Technical Reports Server (NTRS)

    Andraschko, Mark R.; Merrill, R. Gabe; Earle, Kevin D.

    2008-01-01

    The extensive logistics required to support extended crewed operations in space make effective modeling of logistics requirements and deployment critical to predicting the behavior of human lunar exploration systems. This paper discusses the software that has been developed as part of the Campaign Manifest Analysis Tool in support of strategic analysis activities under the Constellation Architecture Team - Lunar. The described logistics module enables definition of logistics requirements across multiple surface locations and allows for the transfer of logistics between those locations. A key feature of the module is the loading algorithm that is used to efficiently load logistics by type into carriers and then onto landers. Attention is given to the capabilities and limitations of this loading algorithm, particularly with regard to surface transfers. These capabilities are described within the context of the object-oriented software implementation, with details provided on the applicability of using this approach to model other human exploration scenarios. Some challenges of incorporating probabilistics into this type of logistics analysis model are discussed at a high level.
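    A greatly simplified sketch of such a loading algorithm is given below: items are grouped by type and packed first-fit-decreasing into fixed-capacity carriers, which are then assigned to landers. The capacities, names, and greedy strategy are illustrative assumptions, not details of the Campaign Manifest Analysis Tool.

    ```python
    # Hypothetical sketch of a logistics loading algorithm: pack items by type
    # into fixed-capacity carriers, then assign carriers to landers.
    from collections import defaultdict

    def load_logistics(items, carrier_capacity, lander_capacity):
        """items: list of (type, mass). Returns landers -> carriers -> masses."""
        by_type = defaultdict(list)
        for kind, mass in items:
            by_type[kind].append(mass)
        carriers = []
        for kind, masses in by_type.items():
            current, used = [], 0.0
            for m in sorted(masses, reverse=True):   # first-fit decreasing
                if used + m > carrier_capacity:
                    carriers.append((kind, current))
                    current, used = [], 0.0
                current.append(m)
                used += m
            if current:
                carriers.append((kind, current))
        landers, lander = [], []
        for carrier in carriers:
            if len(lander) == lander_capacity:       # carriers per lander
                landers.append(lander)
                lander = []
            lander.append(carrier)
        if lander:
            landers.append(lander)
        return landers

    cargo = [("food", 120), ("food", 90), ("spares", 200), ("science", 60)]
    print(load_logistics(cargo, carrier_capacity=250, lander_capacity=2))
    ```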

  6. Numerical analysis of eccentric orifice plate using ANSYS Fluent software

    NASA Astrophysics Data System (ADS)

    Zahariea, D.

    2016-11-01

    In this paper, the eccentric orifice plate is qualitatively analysed in comparison with the classical concentric orifice plate from the point of view of the sedimentation tendency of solid particles in the fluid whose flow rate is measured. For this purpose, the numerical streamline patterns are compared for both orifice plates. The numerical analysis has been performed using the ANSYS Fluent software. The methodology of the CFD analysis is presented: creating the 3D solid model, fluid domain extraction, meshing, boundary conditions, turbulence model, solving algorithm, convergence criterion, results, and validation. Analysing the numerical streamlines, two circumferential regions of separated flow can be clearly observed for the concentric orifice plate, upstream and downstream of the plate. The bottom parts of these regions are where the solid particles could settle. For the eccentric orifice plate, on the other hand, the streamline pattern suggests that no sedimentation will occur, because there are no separated flows at the bottom area of the pipe.

  7. Vibration study of a vehicle suspension assembly with the finite element method

    NASA Astrophysics Data System (ADS)

    Cătălin Marinescu, Gabriel; Castravete, Ştefan-Cristian; Dumitru, Nicolae

    2017-10-01

    The main steps of the present work represent a methodology for analysing various vibration effects on the suspension's mechanical parts in a vehicle. A McPherson-type suspension from an existing vehicle was created using CAD software. Using the CAD model as input, a finite element model of the suspension assembly was developed. The Abaqus finite element analysis software was used to pre-process, solve, and post-process the results. Geometric nonlinearities are included in the model, as are severe sources of nonlinearity such as friction and contact. The McPherson spring is modelled as a linear spring. The analysis includes several steps: preload, modal analysis, reduction of the model to 200 generalized coordinates, a deterministic external excitation, and a random excitation that comes from different types of roads. The vibration data used as input for the simulation were previously obtained by experimental means. The mathematical expressions used for the simulation are also presented in the paper.
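    For readers unfamiliar with what the modal analysis step computes, the following toy example (not the paper's Abaqus model) solves the generalized eigenproblem K x = ω² M x for a two-degree-of-freedom spring-mass system using SciPy, yielding natural frequencies and mode shapes.

    ```python
    # Toy modal analysis: natural frequencies and mode shapes of a 2-DOF
    # spring-mass system from the generalized eigenproblem K x = w^2 M x.
    import numpy as np
    from scipy.linalg import eigh

    M = np.diag([2.0, 1.0])                  # masses (kg)
    k1, k2 = 1000.0, 500.0                   # spring stiffnesses (N/m)
    K = np.array([[k1 + k2, -k2],
                  [-k2,      k2]])           # stiffness matrix

    eigvals, modes = eigh(K, M)              # solves K x = lambda M x
    freqs_hz = np.sqrt(eigvals) / (2 * np.pi)
    print("natural frequencies (Hz):", np.round(freqs_hz, 2))
    print("mode shapes (columns):\n", np.round(modes, 3))
    ```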

  8. Standardized development of computer software. Part 2: Standards

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1978-01-01

    This monograph contains standards for software development and engineering. The book sets forth rules for design, specification, coding, testing, documentation, and quality assurance audits of software; it also contains detailed outlines for the documentation to be produced.

  9. An investigation into NVC characteristics of vehicle behaviour using modal analysis

    NASA Astrophysics Data System (ADS)

    Hanouf, Zahir; Faris, Waleed F.; Ahmad, Kartini

    2017-03-01

    NVC characterization of vehicle behavior is an essential part of development targets in the automotive industry. Therefore, understanding the dynamic behavior of each structural part of the vehicle is a major requirement in improving the NVC characteristics of a vehicle. The main focus of this research is to investigate the structural dynamic behavior of a passenger car using a part-by-part modal analysis technique and to apply this method to derive the interior noise sources. In the first part of this work, computational part-by-part modal analysis tests were carried out to identify the dynamic parameters of the passenger car. Finite element models of the different parts of the car were constructed using the VPG 3.2 software, and Ls-Dyna pre- and post-processing was used to identify and analyze the dynamic behavior of the car component panels. These tests successfully produced the natural frequencies and associated mode shapes of panels such as the trunk, hood, roof, and door panels. In the second part of this research, experimental part-by-part modal analysis was performed on the selected car panels to extract the modal parameters, namely frequencies and mode shapes. The study establishes step-by-step procedures for carrying out experimental modal analysis on the car structures using the single-input excitation, multi-output response (SIMO) technique. To ensure the validity of the results obtained by this method, an inverse test was done by fixing the response and moving the excitation, and the results were identical. Finally, a comparison between the results obtained from both analyses showed good agreement in both frequencies and mode shapes. The conclusion drawn from this part of the study is that part-by-part modal analysis can reliably be used to establish the dynamic characteristics of the whole car. Furthermore, the developed method can also be used to show the relationship between the structural vibration of the car panels and the passengers' noise comfort inside the cabin.

  10. A Quantitative Study of a Software Tool that Supports a Part-Complete Solution Method on Learning Outcomes

    ERIC Educational Resources Information Center

    Garner, Stuart

    2009-01-01

    This paper reports on the findings from a quantitative research study into the use of a software tool that was built to support a part-complete solution method (PCSM) for the learning of computer programming. The use of part-complete solutions to programming problems is one of the methods that can be used to reduce the cognitive load that students…

  11. Specificity of software cooperating with an optoelectronic sensor in the pulse oximeter system

    NASA Astrophysics Data System (ADS)

    Cysewska-Sobusiak, Anna; Wiczynski, Grzegorz; Jedwabny, Tomasz

    1995-06-01

    The specificity of a software package composed of two parts that control an optoelectronic sensor in a computer-aided system for noninvasive measurement of arterial blood oxygen saturation, as well as of some parameters of the peripheral pulse waveform, is described. The principles of the transmission variant of pulse oximetry, the only noninvasive measurement method of its kind, are utilized. The software coordinates the cooperation of an IBM PC compatible microcomputer with the sensor and one specialized card. This novel card is a key part of the whole measuring system, whose application fields are extended in comparison with commonly available pulse oximeters. The user-friendly MS Windows graphical environment, which makes the system multitasking and non-preemptive, was used to design the specific part of the programming presented here. With this environment, sophisticated tasks of the software package can be performed without excessive complication.

  12. Implementing Educational Software and Evaluating Its Academic Effectiveness: Part I.

    ERIC Educational Resources Information Center

    Jolicoeur, Karen; Berger, Dale E.

    1988-01-01

    This basic plan for implementing educational software in the classroom incorporates a research design for evaluating its effectiveness. A study of fifth grade classrooms using game and tutorial software for spelling and fractions is used as an example. Topics discussed include software selection, selecting groups of comparable ability, and use of…

  13. 49 CFR Appendix C to Part 236 - Safety Assurance Criteria and Processes

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... system (all its elements including hardware and software) must be designed to assure safe operation with... unsafe errors in the software due to human error in the software specification, design, or coding phases... (hardware or software, or both) are used in combination to ensure safety. If a common mode failure exists...

  14. 49 CFR 238.105 - Train electronic hardware and software safety.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... and software system safety as part of the pre-revenue service testing of the equipment. (d)(1... safely by initiating a full service brake application in the event of a hardware or software failure that... 49 Transportation 4 2010-10-01 2010-10-01 false Train electronic hardware and software safety. 238...

  15. Guidance and Control Software Project Data - Volume 1: Planning Documents

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the planning documents from the GCS project. Volume 1 contains five appendices: A. Plan for Software Aspects of Certification for the Guidance and Control Software Project; B. Software Development Standards for the Guidance and Control Software Project; C. Software Verification Plan for the Guidance and Control Software Project; D. Software Configuration Management Plan for the Guidance and Control Software Project; and E. Software Quality Assurance Activities.

  16. CARES/Life Software for Designing More Reliable Ceramic Parts

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Powers, Lynn M.; Baker, Eric H.

    1997-01-01

    Products made from advanced ceramics show great promise for revolutionizing aerospace and terrestrial propulsion and power generation. However, ceramic components are difficult to design because brittle materials in general have widely varying strength values. The CARES/Life software eases this task by providing a tool to optimize the design and manufacture of brittle material components using probabilistic reliability analysis techniques. Probabilistic component design involves predicting the probability of failure for a thermomechanically loaded component from specimen rupture data. Typically, these experiments are performed using many simple-geometry flexural or tensile test specimens. A static, dynamic, or cyclic load is applied to each specimen until fracture. Statistical strength and SCG (fatigue) parameters are then determined from these data. Using these parameters and the results obtained from a finite element analysis, the time-dependent reliability for a complex component geometry and loading is then predicted. Appropriate design changes are made until an acceptable probability of failure has been reached.
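    The probabilistic idea can be shown in a few lines. The sketch below evaluates the classical two-parameter Weibull failure probability for a uniformly stressed brittle component; the strength parameters are illustrative, and the sketch omits the volume integration and time dependence that CARES/Life performs.

    ```python
    # Schematic of the probabilistic approach (not CARES/Life code):
    # two-parameter Weibull statistics give failure probability from a stress
    # level, a characteristic strength sigma_0, and a Weibull modulus m.
    import numpy as np

    def weibull_failure_probability(stress, sigma_0, m):
        """P_f = 1 - exp(-(stress/sigma_0)^m) for a uniformly stressed part."""
        return 1.0 - np.exp(-(stress / sigma_0) ** m)

    # Illustrative parameters: characteristic strength 400 MPa, modulus 10.
    for s in (200.0, 300.0, 400.0):
        pf = weibull_failure_probability(s, sigma_0=400.0, m=10.0)
        print(f"stress {s:5.0f} MPa -> P_f = {pf:.4f}")
    ```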

  17. Quick Fix for Managing Risks

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Under a Phase II SBIR contract, Kennedy and Lumina Decision Systems, Inc., jointly developed the Schedule and Cost Risk Analysis Modeling (SCRAM) system, based on a version of Lumina's flagship software product, Analytica(R). Acclaimed as "the best single decision-analysis program yet produced" by MacWorld magazine, Analytica is a "visual" tool used in decision-making environments worldwide to build, revise, and present business models, minus the time-consuming difficulty commonly associated with spreadsheets. With Analytica as their platform, Kennedy and Lumina created the SCRAM system in response to NASA's need to identify the importance of major delays in Shuttle ground processing, a critical function in project management and process improvement. As part of the SCRAM development project, Lumina designed a version of Analytica called the Analytica Design Engine (ADE) that can be easily incorporated into larger software systems. ADE was commercialized and utilized in many other developments, including web-based decision support.

  18. Terrestrial reference frame solution with the Vienna VLBI Software VieVS and implication of tropospheric gradient estimation

    NASA Astrophysics Data System (ADS)

    Spicakova, H.; Plank, L.; Nilsson, T.; Böhm, J.; Schuh, H.

    2011-07-01

    The Vienna VLBI Software (VieVS) has been developed at the Institute of Geodesy and Geophysics at TU Vienna since 2008. In this presentation, we present the module Vie_glob, the part of VieVS that allows parameter estimation from multiple VLBI sessions in a so-called global solution. We focus on the determination of the terrestrial reference frame (TRF) using all suitable VLBI sessions since 1984. We compare different analysis options, such as the choice of loading corrections or of the model for the tropospheric delays. The effect on station heights of neglecting atmospheric loading corrections at the observation level is shown. Time series of station positions (using a previously determined TRF as a priori values) are presented and compared to other estimates of site positions from individual IVS (International VLBI Service for Geodesy and Astrometry) Analysis Centers.

  19. Automated Transfer Vehicle (ATV) Critical Safety Software Overview

    NASA Astrophysics Data System (ADS)

    Berthelier, D.

    2002-01-01

    The European Automated Transfer Vehicle is an unmanned transportation system designed to dock to the International Space Station (ISS) and to contribute to the logistic servicing of the ISS. Concisely, ATV control is realized by a nominal flight control function (using computers, software, sensors, and actuators). To cover the extreme situations where this nominal chain cannot ensure a safe trajectory with respect to the ISS, and where unsafe free-drift trajectories can be encountered, a segregated proximity flight safety function is activated. This function relies notably on a segregated computer, the Monitoring and Safing Unit (MSU); in case of detection of a major ATV malfunction, the ATV is then controlled by the MSU software. This software is therefore critical, because an MSU software failure could have catastrophic consequences. This paper provides an overview both of the functions of this software and of its development and validation method, which is specific in view of its criticality. The first part of the paper briefly describes the proximity flight safety chain. The second part deals with the software functions. The MSU software is in charge of monitoring the nominal computers and the ATV corridors, using its own navigation algorithms; if an abnormal situation is detected, it is in charge of controlling the ATV during the Collision Avoidance Manoeuvre (CAM), consisting of an attitude-controlled braking boost followed by a post-CAM manoeuvre: Sun-pointed ATV attitude control for up to 24 hours on a safe trajectory. The principles of the monitoring, navigation, and control algorithms are presented. The third part of the paper describes the development and validation process: functional studies of the algorithms, Ada coding, and unit validation; integration and validation of the algorithms' Ada code on a specific non-real-time MATLAB/SIMULINK simulator; and the global software functional engineering phase, architectural design, unit testing, integration, and validation on the target computer.

  20. Processing Ocean Images to Detect Large Drift Nets

    NASA Technical Reports Server (NTRS)

    Veenstra, Tim

    2009-01-01

    A computer program processes the digitized outputs of a set of downward-looking video cameras aboard an aircraft flying over the ocean. The purpose served by this software is to facilitate the detection of large drift nets that have been lost, abandoned, or jettisoned. The development of this software and of the associated imaging hardware is part of a larger effort to develop means of detecting and removing large drift nets before they cause further environmental damage to the ocean and to shores on which they sometimes impinge. The software is capable of near-real-time processing of as many as three video feeds at a rate of 30 frames per second. After a user sets the parameters of an adjustable algorithm, the software analyzes each video stream, detects any anomaly, issues a command to point a high-resolution camera toward the location of the anomaly, and, once the camera has been so aimed, issues a command to trigger the camera shutter. The resulting high-resolution image is digitized, and the resulting data are automatically uploaded to the operator's computer for analysis.
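    The abstract does not publish the detection algorithm, but a block-based anomaly detector of the general kind described can be sketched as follows; the threshold, block size, and background model are assumptions made for this illustration.

    ```python
    # Hypothetical illustration: flag an anomaly in a video frame when a pixel
    # block deviates strongly from a background estimate, and report its
    # location so a high-resolution camera could be pointed there.
    import numpy as np

    def detect_anomaly(frame, background, threshold=30.0, block=8):
        """Return (row, col) of the most anomalous block, or None."""
        diff = np.abs(frame.astype(float) - background)
        h, w = diff.shape
        # Average the difference over coarse blocks to suppress pixel noise.
        blocks = diff[:h // block * block, :w // block * block]
        blocks = blocks.reshape(h // block, block, w // block, block).mean((1, 3))
        r, c = np.unravel_index(np.argmax(blocks), blocks.shape)
        if blocks[r, c] < threshold:
            return None
        return r * block, c * block

    rng = np.random.default_rng(2)
    bg = np.full((64, 64), 100.0)
    frame = bg + rng.normal(0, 3, (64, 64))
    frame[40:48, 16:24] += 80                 # a bright floating object
    print(detect_anomaly(frame, bg))           # -> (40, 16)
    ```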

  1. Software and mathematical support of Kazakhstani star tracker

    NASA Astrophysics Data System (ADS)

    Akhmedov, D.; Yelubayev, S.; Ten, V.; Bopeyev, T.; Alipbayev, K.; Sukhenko, A.

    2016-10-01

    Currently, specialists in Kazakhstan have been developing a star tracker that is planned for further use on Kazakhstani satellites of various purposes. At the first stage, an experimental model of the star tracker has been developed with the following characteristics: field of view 20°, update frequency 2 Hz, exclusion angle 40°, and accuracy of attitude determination of the optical axis/around the optical axis of 15/50 arcsec. The software and mathematical support are the most high-technology parts of a star tracker. The results of the development of the software and mathematical support for the experimental model of the Kazakhstani star tracker are presented in this article. In particular, the main mathematical models and algorithms used as the basis for the program units for preliminary processing of starry-sky images, star identification, and star tracker attitude determination are described. The results of testing the software and mathematical support with the help of a program simulation complex, using various configurations of defects including image sensor noise, point spread function modeling, and optical system distortion of up to 2%, are presented. Analysis of the testing results has shown that the attitude determination accuracy of the star tracker is within the permissible range.
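    The article does not state which attitude-determination algorithm was chosen; as one classical possibility, the TRIAD method computes a rotation matrix from two star directions observed in the body frame and their catalogue counterparts, as sketched below.

    ```python
    # A sketch of one classical attitude-determination step (an assumption,
    # not the algorithm the article specifies): the TRIAD method builds a
    # rotation matrix from two vector observations.
    import numpy as np

    def triad(b1, b2, r1, r2):
        """Rotation matrix A with b = A r, from two vector observations."""
        def frame(v1, v2):
            t1 = v1 / np.linalg.norm(v1)
            t2 = np.cross(v1, v2); t2 = t2 / np.linalg.norm(t2)
            return np.column_stack((t1, t2, np.cross(t1, t2)))
        return frame(b1, b2) @ frame(r1, r2).T

    # Example: a known 30-degree rotation about z is recovered exactly.
    ang = np.radians(30)
    A_true = np.array([[np.cos(ang), -np.sin(ang), 0],
                       [np.sin(ang),  np.cos(ang), 0],
                       [0,            0,           1]])
    r1, r2 = np.array([1.0, 0, 0]), np.array([0, 1.0, 0])
    A_est = triad(A_true @ r1, A_true @ r2, r1, r2)
    print(np.allclose(A_est, A_true))          # True
    ```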

  2. Verifying Diagnostic Software

    NASA Technical Reports Server (NTRS)

    Lindsey, Tony; Pecheur, Charles

    2004-01-01

    Livingstone PathFinder (LPF) is a simulation-based computer program for verifying autonomous diagnostic software. LPF is designed especially to be applied to NASA's Livingstone computer program, which implements a qualitative-model-based algorithm that diagnoses faults in a complex automated system (e.g., an exploratory robot, spacecraft, or aircraft). LPF forms a software test bed containing a Livingstone diagnosis engine, embedded in a simulated operating environment consisting of a simulator of the system to be diagnosed by Livingstone and a driver program that issues commands and faults according to a nondeterministic scenario provided by the user. LPF runs the test bed through all executions allowed by the scenario, checking for various selectable error conditions after each step. All components of the test bed are instrumented, so that execution can be single-stepped both backward and forward. The architecture of LPF is modular and includes generic interfaces to facilitate substitution of alternative versions of its different parts. Altogether, LPF provides a flexible, extensible framework for simulation-based analysis of diagnostic software; these characteristics also render it amenable to application to diagnostic programs other than Livingstone.

  3. Development and validation of MIX: comprehensive free software for meta-analysis of causal research data.

    PubMed

    Bax, Leon; Yu, Ly-Mee; Ikeda, Noriaki; Tsuruta, Harukazu; Moons, Karel G M

    2006-10-13

    Meta-analysis has become a well-known method for synthesis of quantitative data from previously conducted research in applied health sciences. So far, meta-analysis has been particularly useful in evaluating and comparing therapies and in assessing causes of disease. Consequently, the number of software packages that can perform meta-analysis has increased over the years. Unfortunately, it can take a substantial amount of time to get acquainted with some of these programs and most contain little or no interactive educational material. We set out to create and validate an easy-to-use and comprehensive meta-analysis package that would be simple enough programming-wise to remain available as a free download. We specifically aimed at students and researchers who are new to meta-analysis, with important parts of the development oriented towards creating internal interactive tutoring tools and designing features that would facilitate usage of the software as a companion to existing books on meta-analysis. We took an unconventional approach and created a program that uses Excel as a calculation and programming platform. The main programming language was Visual Basic, as implemented in Visual Basic 6 and Visual Basic for Applications in Excel 2000 and higher. The development took approximately two years and resulted in the 'MIX' program, which can be downloaded from the program's website free of charge. Next, we set out to validate the MIX output with two major software packages as reference standards, namely STATA (metan, metabias, and metatrim) and Comprehensive Meta-Analysis Version 2. Eight meta-analyses that had been published in major journals were used as data sources. All numerical and graphical results from analyses with MIX were identical to their counterparts in STATA and CMA. The MIX program distinguishes itself from most other programs by the extensive graphical output, the click-and-go (Excel) interface, and the educational features. The MIX program is a valid tool for performing meta-analysis and may be particularly useful in educational environments. It can be downloaded free of charge via http://www.mix-for-meta-analysis.info or http://sourceforge.net/projects/meta-analysis.
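    The core calculation of a fixed-effect meta-analysis — inverse-variance pooling of study effects — is compact enough to sketch; MIX itself is built on Excel/VBA, so the Python below is only an illustration of the arithmetic, with hypothetical study data.

    ```python
    # Minimal fixed-effect meta-analysis sketch: pool study effect estimates
    # by inverse-variance weighting (the computation behind forest plots).
    import math

    def fixed_effect_pool(effects, std_errors):
        """Return the pooled effect and its standard error."""
        weights = [1.0 / se**2 for se in std_errors]
        pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
        pooled_se = math.sqrt(1.0 / sum(weights))
        return pooled, pooled_se

    # Three hypothetical studies reporting log odds ratios.
    effects = [0.30, 0.10, 0.25]
    ses = [0.12, 0.20, 0.15]
    pooled, se = fixed_effect_pool(effects, ses)
    print(f"pooled effect {pooled:.3f}, 95% CI half-width {1.96 * se:.3f}")
    ```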

  4. Design and demonstration of automated data analysis algorithms for ultrasonic inspection of complex composite panels with bonds

    NASA Astrophysics Data System (ADS)

    Aldrin, John C.; Forsyth, David S.; Welter, John T.

    2016-02-01

    To address the data review burden and improve the reliability of the ultrasonic inspection of large composite structures, automated data analysis (ADA) algorithms have been developed to make calls on indications that satisfy the detection criteria and minimize false calls. The original design followed standard procedures for analyzing signals for time-of-flight indications and backwall amplitude dropout. However, certain complex panels with varying shape, ply drops and the presence of bonds can complicate this interpretation process. In this paper, enhancements to the automated data analysis algorithms are introduced to address these challenges. To estimate the thickness of the part and presence of bonds without prior information, an algorithm tracks potential backwall or bond-line signals, and evaluates a combination of spatial, amplitude, and time-of-flight metrics to identify bonded sections. Once part boundaries, thickness transitions and bonded regions are identified, feature extraction algorithms are applied to multiple sets of through-thickness and backwall C-scan images, for evaluation of both first layer through thickness and layers under bonds. ADA processing results are presented for a variety of complex test specimens with inserted materials and other test discontinuities. Lastly, enhancements to the ADA software interface are presented, which improve the software usability for final data review by the inspectors and support the certification process.
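    One step described above — tracking a backwall signal and flagging amplitude dropout — can be illustrated on a single A-scan. The sketch below uses simple peak detection with a fixed gate; the gate position, threshold, and sound velocity are assumptions for the example, not the authors' parameters.

    ```python
    # Toy version of one ADA step (assumptions, not the authors' code): locate
    # front-wall and backwall echoes in an A-scan, estimate thickness from
    # their time-of-flight separation, and flag a backwall amplitude dropout.
    import numpy as np

    def analyze_ascan(signal, dt, velocity, gate_start, amp_threshold):
        """Return (thickness_m, backwall_amplitude, dropout_flag)."""
        front = np.argmax(np.abs(signal[:gate_start]))
        back = gate_start + np.argmax(np.abs(signal[gate_start:]))
        tof = (back - front) * dt
        thickness = 0.5 * velocity * tof          # pulse-echo: divide by two
        amplitude = abs(signal[back])
        return thickness, amplitude, amplitude < amp_threshold

    # Synthetic A-scan: front-wall echo at sample 50, backwall echo at 450.
    sig = np.zeros(1000)
    sig[50], sig[450] = 1.0, 0.6
    thk, amp, drop = analyze_ascan(sig, dt=1e-8, velocity=3000.0,
                                   gate_start=200, amp_threshold=0.2)
    print(f"thickness {thk*1e3:.2f} mm, backwall amp {amp:.2f}, dropout={drop}")
    ```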

  5. Four applications of a software data collection and analysis methodology

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Selby, Richard W., Jr.

    1985-01-01

    The evaluation of software technologies suffers because of the lack of quantitative assessment of their effect on software development and modification. A seven-step data collection and analysis methodology couples software technology evaluation with software measurement. Four in-depth applications of the methodology are presented. The four studies represent each of the general categories of analyses on the software product and development process: blocked subject-project studies, replicated project studies, multi-project variation studies, and single project strategies. The four applications are in the areas of, respectively, software testing, cleanroom software development, characteristic software metric sets, and software error analysis.

  6. User's Manual for the National Water-Quality Assessment Program Invertebrate Data Analysis System (IDAS) Software: Version 3

    USGS Publications Warehouse

    Cuffney, Thomas F.

    2003-01-01

    The Invertebrate Data Analysis System (IDAS) software provides an accurate, consistent, and efficient mechanism for analyzing invertebrate data collected as part of the National Water-Quality Assessment Program and stored in the Biological Transactional Database (Bio-TDB). The IDAS software is a stand-alone program for personal computers that run Microsoft (MS) Windows®. It allows users to read data downloaded from Bio-TDB and stored either as MS Excel® or MS Access® files. The program consists of five modules. The Edit Data module allows the user to subset, combine, delete, and summarize community data. The Data Preparation module allows the user to select the type(s) of sample(s) to process, calculate densities, delete taxa based on laboratory processing notes, combine lifestages or keep them separate, select a lowest taxonomic level for analysis, delete rare taxa, and resolve taxonomic ambiguities. The Calculate Community Metrics module allows the user to calculate over 130 community metrics, including metrics based on organism tolerances and functional feeding groups. The Calculate Diversities and Similarities module allows the user to calculate nine diversity and eight similarity indices. The Data Export module allows the user to export data to other software packages and produce tables of community data that can be imported into spreadsheet and word-processing programs. Though the IDAS program was developed to process invertebrate data downloaded from USGS databases, it will work with other data sets that are converted to the USGS (Bio-TDB) format. Consequently, the data manipulation, analysis, and export procedures provided by the IDAS program can be used by anyone involved in using benthic macroinvertebrates in applied or basic research.

  7. Unobtrusive integration of data management with fMRI analysis.

    PubMed

    Poliakov, Andrew V; Hertzenberg, Xenia; Moore, Eider B; Corina, David P; Ojemann, George A; Brinkley, James F

    2007-01-01

    This note describes a software utility, called X-batch, which addresses two pressing issues typically faced by functional magnetic resonance imaging (fMRI) neuroimaging laboratories: (1) analysis automation and (2) data management. The first issue is addressed by providing a simple batch-mode processing tool for the popular SPM software package (http://www.fil.ion.ucl.ac.uk/spm/; Wellcome Department of Imaging Neuroscience, London, UK). The second is addressed by transparently recording metadata describing all aspects of the batch job (e.g., subject demographics, analysis parameters, locations and names of created files, date and time of analysis, and so on). These metadata are recorded as instances of an extended version of the Protégé-based Experiment Lab Book ontology created by the Dartmouth fMRI Data Center. The resulting instantiated ontology provides a detailed record of all fMRI analyses performed, and as such can be part of larger systems for neuroimaging data management, sharing, and visualization. The X-batch system is in use in our own fMRI research, and is available for download at http://X-batch.sourceforge.net/.

  8. Hypersonic Navier Stokes Comparisons to Orbiter Flight Data

    NASA Technical Reports Server (NTRS)

    Campbell, Charles H.; Nompelis, Ioannis; Candler, Graham; Barnhart, Michael; Yoon, Seokkwan

    2009-01-01

    Hypersonic chemical nonequilibrium simulations of low Earth orbit entry flow fields are becoming increasingly commonplace as software and computational capabilities mature. However, development of robust and accurate software to model these environments will always encounter a significant barrier in developing a suite of high-quality calibration cases. The US3D hypersonic nonequilibrium Navier-Stokes analysis capability has been favorably compared to a number of wind tunnel test cases. Extension of the calibration basis for this software to Orbiter flight conditions will provide an incremental increase in confidence. As part of the Orbiter Boundary Layer Transition Flight Experiment and the Hypersonic Thermodynamic Infrared Measurements project, NASA is performing entry flight testing on the Orbiter to provide valuable aerothermodynamic heating data. An increase in interest related to Orbiter entry environments is resulting from this activity. With the advent of these new data, comparisons of the US3D software to the new flight testing data are warranted. This paper will provide information regarding the framework of analyses that will be applied with the US3D analysis tool. In addition, comparisons will be made to entry flight testing data provided by the Orbiter BLT Flight Experiment and HYTHIRM projects. If data from digital scans of the Orbiter windward surface become available, simulations will also be performed to characterize the difference in surface heating between the CAD reference OML and the digitized surface provided by the surface scans.

  9. DigitSeis: A New Digitization Software and its Application to the Harvard-Adam Dziewoński Observatory Collection

    NASA Astrophysics Data System (ADS)

    Bogiatzis, P.; Altoé, I. L.; Karamitrou, A.; Ishii, M.; Ishii, H.

    2015-12-01

    DigitSeis is a new open-source, interactive digitization software written in MATLAB that converts digital, raster images of analog seismograms to readily usable, discretized time series using image processing algorithms. DigitSeis automatically identifies and corrects for various geometrical distortions of seismogram images that are acquired through the original recording, storage, and scanning procedures. With human supervision, the software further identifies and classifies important features such as time marks and notes, corrects time-mark offsets from the main trace, and digitizes the combined trace with an analysis to obtain as accurate timing as possible. Although a large effort has been made to minimize the human input, DigitSeis provides interactive tools for challenging situations such as trace crossings and stains in the paper. The effectiveness of the software is demonstrated with the digitization of seismograms that are over half a century old from the Harvard-Adam Dziewoński observatory that is still in operation as a part of the Global Seismographic Network (station code HRV and network code IU). The spectral analysis of the digitized time series shows no spurious features that may be related to the occurrence of minute and hour marks. They also display signals associated with significant earthquakes, and a comparison of the spectrograms with modern recordings reveals similarities in the background noise.

  10. Study of electrode slice forming of bicycle dynamo hub power connector

    NASA Astrophysics Data System (ADS)

    Chen, Dyi-Cheng; Jao, Chih-Hsuan

    2013-12-01

    Taiwan's bicycle industry has earned an international reputation as a bicycle kingdom, and with global warming driving the worldwide rise of green energy, the development of the electrode slice of the hub dynamo and of its power output connector brings new hope to the bicycle industry. In this study, patents related to power output connectors were gathered, and the collected documents served as the basis for a design that draws the power output from the fewest structural components with a simple connector. The objective of this study is a power output connector with the lowest cost, the strongest structure, and the highest output efficiency. The computer-aided drawing software SolidWorks was used to establish 3D models of the power output connector parts; the overall assembly had to consider part types and assembly concepts, as well as weather resistance, water resistance, corrosion resistance, vibration resistance, and power flow stability. The 3D models were then imported into computer-aided finite element analysis software to simulate the expected manufacturing process of the power output connector parts. A series of simulation analyses, in which the variables were first-stage and second-stage forming, were run to examine the effective stress, effective strain, press speed, and die radial load distribution when forming the electrode slice of the bicycle dynamo hub.

  11. The finite element simulation analysis research of 38CrSi cylindrical power spinning

    NASA Astrophysics Data System (ADS)

    Liang, Wei; Lv, Qiongying; Zhao, Yujuan; Lv, Yunxia

    2018-01-01

    In order to explore the influence of the main cylindrical spinning process parameters on the spinning process, this paper combines a real tube power spinning process with the ABAQUS finite element analysis software to simulate the tube power spinning of 38CrSi steel. Through analysis of the stress and strain during part forming, the influence of the thickness reduction and the feed rate on the forming process is examined, along with the variation of the spinning force; finally, a reasonable combination of the main spinning process parameters is determined.

  12. Treatment delivery software for a new clinical grade ultrasound system for thermoradiotherapy.

    PubMed

    Novák, Petr; Moros, Eduardo G; Straube, William L; Myerson, Robert J

    2005-11-01

    A detailed description of a clinical grade Scanning Ultrasound Reflector Linear Array System (SURLAS) applicator was given in a previous paper [Med. Phys. 32, 230-240 (2005)]. In this paper we concentrate on the design, development, and testing of the personal computer (PC) based treatment delivery software that runs the therapy system. The SURLAS requires coordinated interaction between the therapy applicator and several peripheral devices for its proper and safe operation. One of the most important tasks was coordinating the input power sequences for the elements of two parallel opposed ultrasound arrays (eight 1.5 cm x 2 cm elements per array; arrays 1 and 2 operate at 1.9 and 4.9 MHz, respectively) with the position of a dual-face scanning acoustic reflector. To achieve this, the treatment delivery software can divide the applicator's treatment window into up to 64 sectors (minimum size of 2 cm x 2 cm) and control the power to each sector independently by adjusting the power output levels of the channels of a 16-channel radio-frequency generator. The software coordinates the generator outputs with the position of the reflector as it scans back and forth between the arrays. Individual sector control and dual-frequency operation allow the SURLAS to adjust power deposition in three dimensions to superficial targets coupled to its treatment window. The treatment delivery software also monitors and logs several parameters, such as temperatures acquired using a 16-channel thermocouple thermometry unit. Safety (in particular to patients) was the paramount concern and design criterion. Failure mode and effects analysis (FMEA) was applied to the applicator as well as to the entire therapy system in order to identify safety issues and rank their relative importance. This analysis led to the implementation of several safety mechanisms and a software structure in which each device communicates with the controlling PC independently of the others. In case of a malfunction in any part of the system, or a violation of a user-defined safety criterion based on temperature readings, the software terminates treatment immediately and the user is notified. The software development process, consisting of problem analysis, design, implementation, and testing, is presented in this paper. Once the software was finished and integrated with the hardware, the therapy system was extensively tested. Results demonstrated that the software operates the SURLAS as intended with minimal risk to future patients.
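    The coordination the software performs — selecting per-channel power levels according to the reflector's current position over a sector grid — can be outlined schematically. The sketch below is hypothetical in its sector layout and power values and omits all of the safety logic described in the paper.

    ```python
    # Schematic of the coordination problem (not the clinical SURLAS code):
    # as the reflector scans the treatment window, set each generator channel
    # to the power assigned to the sector currently being exposed.
    import numpy as np

    def channel_powers(sector_power, reflector_pos, n_positions):
        """sector_power: (rows, cols) array; rows map to scan positions,
        cols map to generator channels. Returns per-channel power levels
        for the current reflector position."""
        rows = sector_power.shape[0]
        row = min(int(reflector_pos / n_positions * rows), rows - 1)
        return sector_power[row]

    # 4 x 8 sector grid: more power requested in one corner of the window.
    powers = np.full((4, 8), 2.0)             # watts per sector (illustrative)
    powers[0, :3] = 5.0
    for pos in range(4):                       # one sweep of the reflector
        print(pos, channel_powers(powers, pos, n_positions=4))
    ```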

  13. Efficacy of a Newly Designed Cephalometric Analysis Software for McNamara Analysis in Comparison with Dolphin Software.

    PubMed

    Nouri, Mahtab; Hamidiaval, Shadi; Akbarzadeh Baghban, Alireza; Basafa, Mohammad; Fahim, Mohammad

    2015-01-01

    Cephalometric norms of McNamara analysis have been studied in various populations due to their optimal efficiency. Dolphin cephalometric software greatly facilitates this analysis for orthodontic measurements. However, Dolphin is very expensive and cannot be afforded by many clinicians in developing countries. A suitable alternative software program in Farsi/English would greatly help Farsi-speaking clinicians. The present study aimed to develop an affordable Iranian cephalometric analysis software program and compare it with Dolphin, the standard software available on the market for cephalometric analysis. In this diagnostic, descriptive study, 150 lateral cephalograms of normal-occlusion individuals were selected in Mashhad and Qazvin, two major cities of Iran mainly populated with the Fars ethnicity, the main Iranian ethnic group. After tracing the cephalograms, the McNamara analysis standards were measured both with Dolphin and with the new software. The cephalometric software was designed using Microsoft Visual C++ under Windows XP. Measurements made with the new software were compared with those of Dolphin software on both series of cephalograms. The validity and reliability were tested using the intra-class correlation coefficient. Calculations showed a very high correlation between the results of the Iranian cephalometric analysis software and Dolphin. This confirms the validity and optimal efficacy of the newly designed software (ICC 0.570-1.0). According to our results, the newly designed software has acceptable validity and reliability and can be used for orthodontic diagnosis, treatment planning, and assessment of treatment outcome.

  14. How to choose the right statistical software?—a method increasing the post-purchase satisfaction

    PubMed Central

    2015-01-01

    Nowadays, we live in the “data era,” where the use of statistical or data analysis software is inevitable in any research field. This means that the choice of the right software tool or platform is a strategic issue for a research department. Nevertheless, in many cases decision makers do not pay the right attention to a comprehensive and appropriate evaluation of what the market offers. Indeed, the choice still depends on a few factors such as the researcher's personal inclination, e.g., which software has been used at the university or is already known. This is not wrong in principle, but in some cases it is not enough and might lead to a “dead end” situation, typically after months or years of investments already made in the wrong software. This article, far from being a full and complete guide to statistical software evaluation, aims to illustrate some key points of the decision process and to introduce an extended range of factors that can help in making the right choice, at least potentially. There is not enough literature on this topic, which is underestimated most of the time, both in the traditional literature and in the so-called “gray literature,” even though some documents or short pages can be found online. In any case, there seems to be no common, well-known standpoint on the process of software evaluation from the final-user perspective. We suggest a multi-factor analysis leading to an evaluation matrix, intended as a flexible and customizable tool, aimed at providing a clearer picture of the software alternatives available, not in the abstract but related to the researcher's own context and needs. This method is the result of about twenty years of experience of the author in the field of evaluating and using technical-computing software, and it partially arises from research on these topics conducted as part of a project funded by the European Commission under the Lifelong Learning Programme 2011. PMID:26793368

  15. Software quality assurance plan for GCS

    NASA Technical Reports Server (NTRS)

    Duncan, Stephen E.; Bailey, Elizabeth K.

    1990-01-01

    The software quality assurance (SQA) function for the Guidance and Control Software (GCS) project which is part of a software error studies research program is described. The SQA plan outlines all of the procedures, controls, and audits to be carried out by the SQA organization to ensure adherence to the policies, procedures, and standards for the GCS project.

  16. Emerging Technologies for Software-Reliant Systems

    DTIC Science & Technology

    2011-02-24

    [Briefing-slide excerpts; recoverable topics: loose coupling; global distribution of hardware, software, and people; horizontal integration and convergence; virtualization; globalization as an essential part of software development; and software engineering emphases required by emerging technologies, including defensive programming, security, and auto-adaptation. Carnegie Mellon University webinar, February 2011.]

  17. 34 CFR 464.42 - What limit applies to purchasing computer hardware and software?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... software? 464.42 Section 464.42 Education Regulations of the Offices of the Department of Education... computer hardware and software? Not more than ten percent of funds received under any grant under this part may be used to purchase computer hardware or software. (Authority: 20 U.S.C. 1208aa(f)) ...

  18. 34 CFR 464.42 - What limit applies to purchasing computer hardware and software?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... software? 464.42 Section 464.42 Education Regulations of the Offices of the Department of Education... computer hardware and software? Not more than ten percent of funds received under any grant under this part may be used to purchase computer hardware or software. (Authority: 20 U.S.C. 1208aa(f)) ...

  19. 34 CFR 464.42 - What limit applies to purchasing computer hardware and software?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... software? 464.42 Section 464.42 Education Regulations of the Offices of the Department of Education... computer hardware and software? Not more than ten percent of funds received under any grant under this part may be used to purchase computer hardware or software. (Authority: 20 U.S.C. 1208aa(f)) ...

  20. Shaping Software Engineering Curricula Using Open Source Communities: A Case Study

    ERIC Educational Resources Information Center

    Bowring, James; Burke, Quinn

    2016-01-01

    This paper documents four years of a novel approach to teaching a two-course sequence in software engineering as part of the ABET-accredited computer science curriculum at the College of Charleston. This approach is team-based and centers on learning software engineering in the context of open source software projects. In the first course, teams…

  1. 34 CFR 464.42 - What limit applies to purchasing computer hardware and software?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... software? 464.42 Section 464.42 Education Regulations of the Offices of the Department of Education... computer hardware and software? Not more than ten percent of funds received under any grant under this part may be used to purchase computer hardware or software. (Authority: 20 U.S.C. 1208aa(f)) ...

  2. 34 CFR 464.42 - What limit applies to purchasing computer hardware and software?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... software? 464.42 Section 464.42 Education Regulations of the Offices of the Department of Education... computer hardware and software? Not more than ten percent of funds received under any grant under this part may be used to purchase computer hardware or software. (Authority: 20 U.S.C. 1208aa(f)) ...

  3. Advances in directional borehole radar data analysis and visualization

    USGS Publications Warehouse

    Smith, D.V.G.; Brown, P.J.

    2002-01-01

    The U.S. Geological Survey is developing a directional borehole radar (DBOR) tool for mapping fractures and lithologic changes and for detecting underground utilities and voids. An important part of the development of the DBOR tool is data analysis and visualization, with the aim of making the software's graphical user interface (GUI) intuitive and easy to use. The DBOR software system consists of a suite of signal and image processing routines written in Research Systems' Interactive Data Language (IDL). The software also serves as a front end to many widely accepted Colorado School of Mines Center for Wave Phenomena (CWP) Seismic UNIX (SU) algorithms (Cohen and Stockwell, 2001). Although the SU collection runs natively in a UNIX environment, our system seamlessly emulates a UNIX session within a widely used PC operating system (Microsoft Windows) using GNU tools (Noer, 1998). Examples are presented of laboratory data acquired with the prototype tool in two different experimental settings. The first experiment imaged plastic pipes in a macro-scale sand tank. The second experiment monitored the progress of an invasion front resulting from oil injection. Finally, challenges to further development and planned future work are discussed.

  4. Concept of software interface for BCI systems

    NASA Astrophysics Data System (ADS)

    Svejda, Jaromir; Zak, Roman; Jasek, Roman

    2016-06-01

    Brain Computer Interface (BCI) technology is intended to control an external system by brain activity. One of the main parts of such a system is the software interface, which ensures clear communication between the brain and either the computer or additional devices connected to the computer. This paper is organized as follows. Firstly, current knowledge about the human brain is briefly summarized to point out its complexity. Secondly, a concept of a BCI system is described, which is then used to build an architecture for the proposed software interface. Finally, disadvantages of the sensing technology discovered during the sensing part of our research are mentioned.

  5. Feasibility study of an Integrated Program for Aerospace-vehicle Design (IPAD) system. Volume 5: Design of the IPAD system. Part 2: System design. Part 3: General purpose utilities, phase 1, task 2

    NASA Technical Reports Server (NTRS)

    Garrocq, C. A.; Hurley, M. J.

    1973-01-01

    Viable designs of various elements of the IPAD framework software, the database management system, and the required new languages are presented in relation to the capabilities of operating system software. A thorough evaluation was made of the basic system functions to be provided by each software element, its requirements as defined in the conceptual design, the operating system features affecting its design, and the engineering/design functions it was intended to enhance.

  6. A learning apprentice for software parts composition

    NASA Technical Reports Server (NTRS)

    Allen, Bradley P.; Holtzman, Peter L.

    1987-01-01

    An overview is given of the knowledge acquisition component of the Bauhaus, a prototype computer-aided software engineering (CASE) workstation for the development of domain-specific automatic programming systems (D-SAPS). D-SAPS use domain knowledge to refine a description of an application program into a compilable implementation. The approach to the construction of D-SAPS was to automate the process of refining a description of a program, expressed in an object-oriented domain language, into a configuration of software parts that implement the behavior of the domain objects.

  7. Guidance and Control Software Project Data - Volume 4: Configuration Management and Quality Assurance Documents

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes configuration management and quality assurance documents from the GCS project. Volume 4 contains six appendices: A. Software Accomplishment Summary for the Guidance and Control Software Project; B. Software Configuration Index for the Guidance and Control Software Project; C. Configuration Management Records for the Guidance and Control Software Project; D. Software Quality Assurance Records for the Guidance and Control Software Project; E. Problem Report for the Pluto Implementation of the Guidance and Control Software Project; and F. Support Documentation Change Reports for the Guidance and Control Software Project.

  8. The analysis of the distribution of unitary stresses for the universal plowshare in tiller seeder combos (UPTSC)

    NASA Astrophysics Data System (ADS)

    Chiorescu, D.; Chiorescu, E.; Dodun, O.; Crăciun, V.

    2016-11-01

    The sustainable development of agriculture is an important component of economic and social progress, aiming especially at promoting environmentally friendly systems and technologies. The implementation of sustainable agriculture also requires high-performance farming aggregates such as tiller seeder combos. Their most heavily stressed active working part is the plowshare, which plays an important role in cutting the soil. For this reason, theoretical and experimental research is needed on the wear to which this working part is subjected. This paper analyses the behavior of the universal plowshare, a component part of the UPTSC, using the Finite Element Method (FEM) and the Ansys software. With the help of FEM, we analyzed the material structure of the universal plowshare during the soil cutting process, highlighting the degree of deformation and the stress field in the working part. In the first stage, we identified a representative set of problems concerning the soil cutting process, for which we designed solutions through numerical simulations. In the processing stage, we built a 3D model that fully reproduces the geometric shape of the active element in Cartesian coordinates. To simulate the soil cutting process under realistic conditions, the computations were run for various degrees of refinement of the finite element mesh. In the same stage we introduced the constraints: the fixation of the plowshare support and its direction, as well as the action of soil cohesion and shear strength. Using the Explicit Dynamics module of the Ansys software, which allows the plowshare behavior to be studied, we analyzed, under realistic conditions, the normal and shear stresses as well as the deformation for various soil types and soil states. Based on the stresses found in the FEM analysis of the working part, we determined the wear and suggested safety coefficients for this case.
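    The safety coefficient determination mentioned above lends itself to a short numerical illustration. The sketch below is not the authors' Ansys workflow; it is a minimal Python example, with assumed material properties and stress values, of deriving a safety coefficient from an exported von Mises stress field.

        # Minimal sketch: deriving a safety coefficient from FEM stress output.
        # Yield strength and stress values are illustrative assumptions, not
        # data from the UPTSC study.
        import numpy as np

        yield_strength_mpa = 355.0                      # assumed plowshare steel
        von_mises_mpa = np.array([112.0, 187.5, 240.3,  # assumed element-wise
                                  198.9, 221.4])        # solver output, MPa

        peak_stress = von_mises_mpa.max()
        safety_coefficient = yield_strength_mpa / peak_stress
        print(f"Peak von Mises stress: {peak_stress:.1f} MPa")
        print(f"Safety coefficient:    {safety_coefficient:.2f}")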

  9. Special Report: Part One. New Tools for Professionals.

    ERIC Educational Resources Information Center

    Liskin, Miriam; And Others

    1984-01-01

    This collection of articles includes an examination of word-processing software; project management software; new expert systems that turn microcomputers into logical, well-informed consultants; simulated negotiation software; telephone management systems; and the physical design of an efficient microcomputer work space. (MBR)

  10. 49 CFR 236.903 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... electrical, mechanical, hardware, or software) that is part of a system or subsystem. Configuration..., including the hardware components and software version, is documented and maintained through the life-cycle... or compensates individuals to perform the duties specified in § 236.921 (a). Executive software means...

  11. 49 CFR 236.903 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... electrical, mechanical, hardware, or software) that is part of a system or subsystem. Configuration..., including the hardware components and software version, is documented and maintained through the life-cycle... or compensates individuals to perform the duties specified in § 236.921 (a). Executive software means...

  12. 49 CFR 236.903 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... electrical, mechanical, hardware, or software) that is part of a system or subsystem. Configuration..., including the hardware components and software version, is documented and maintained through the life-cycle... or compensates individuals to perform the duties specified in § 236.921 (a). Executive software means...

  13. 49 CFR 236.903 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... electrical, mechanical, hardware, or software) that is part of a system or subsystem. Configuration..., including the hardware components and software version, is documented and maintained through the life-cycle... or compensates individuals to perform the duties specified in § 236.921 (a). Executive software means...

  14. Fast interactive exploration of 4D MRI flow data

    NASA Astrophysics Data System (ADS)

    Hennemuth, A.; Friman, O.; Schumann, C.; Bock, J.; Drexl, J.; Huellebrand, M.; Markl, M.; Peitgen, H.-O.

    2011-03-01

    1- or 2-directional MRI blood flow mapping sequences are an integral part of standard MR protocols for diagnosis and therapy control in heart diseases. Recent progress in rapid MRI has made it possible to acquire volumetric, 3-directional cine images in reasonable scan time. In addition to flow and velocity measurements relative to arbitrarily oriented image planes, the analysis of 3-dimensional trajectories enables the visualization of flow patterns, local features of flow trajectories, or possible paths into specific regions. The anatomical and functional information allows for advanced hemodynamic analysis in different application areas such as stroke risk assessment, congenital and acquired heart disease, aneurysms, or abdominal collaterals and cranial blood flow. The complexity of 4D MRI flow datasets and of the flow-related image analysis tasks makes the development of fast, comprehensive data exploration software for advanced flow analysis a challenging task. Most existing tools address only individual aspects of the analysis pipeline, such as pre-processing, quantification or visualization, or are difficult for clinicians to use. The goal of the presented work is to provide a software solution that supports the whole image analysis pipeline and enables data exploration with fast, intuitive interaction and visualization methods. The implemented methods facilitate the segmentation and inspection of different vascular systems. Arbitrary 2- or 3-dimensional regions for quantitative analysis and particle tracing can be defined interactively. Synchronized views of animated 3D path lines, 2D velocity or flow overlays, and flow curves offer detailed insight into local hemodynamics. The application of the analysis pipeline is shown for 6 cases from clinical practice, illustrating its usefulness for different clinical questions. Initial user tests show that the software is intuitive to learn and that even inexperienced users achieve good results within reasonable processing times.
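    The interactive particle tracing described above can be illustrated with a small sketch. The following Python code is not the authors' implementation; it is a minimal, self-contained example, on a synthetic field, of integrating a path line through a sampled 3-directional velocity volume with scipy's RegularGridInterpolator.

        # Minimal path-line tracing sketch: Euler integration of a seed point
        # through a 3D, 3-directional velocity field sampled on a regular grid.
        # The synthetic rotational field stands in for real 4D MRI flow data.
        import numpy as np
        from scipy.interpolate import RegularGridInterpolator

        # Build a toy velocity field v(x, y, z) on a 32^3 grid.
        axis = np.linspace(-1.0, 1.0, 32)
        X, Y, Z = np.meshgrid(axis, axis, axis, indexing="ij")
        velocity = np.stack([-Y, X, 0.2 * np.ones_like(Z)], axis=-1)

        interp = RegularGridInterpolator((axis, axis, axis), velocity)

        def trace_path_line(seed, dt=0.02, steps=200):
            """Integrate one path line with forward Euler steps."""
            points = [np.asarray(seed, dtype=float)]
            for _ in range(steps):
                v = interp(points[-1][None, :])[0]   # velocity at current point
                nxt = points[-1] + dt * v
                if np.any(np.abs(nxt) > 1.0):        # stop when leaving the volume
                    break
                points.append(nxt)
            return np.array(points)

        path = trace_path_line(seed=(0.5, 0.0, -0.5))
        print(f"Traced {len(path)} points; end point: {path[-1]}")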

  15. Fully automatic and precise data analysis developed for time-of-flight mass spectrometry.

    PubMed

    Meyer, Stefan; Riedo, Andreas; Neuland, Maike B; Tulej, Marek; Wurz, Peter

    2017-09-01

    Scientific objectives of current and future space missions are focused on the investigation of the origin and evolution of the solar system, with particular emphasis on habitability and signatures of past and present life. For in situ measurements of the chemical composition of solid samples on planetary surfaces, of the neutral atmospheric gas, and of the thermal plasma of planetary atmospheres, mass spectrometers using time-of-flight mass analysers are a widely used technique. However, such investigations imply measurements with good statistics and, thus, a large amount of data to be analysed. Therefore, faster and especially robust automated data analysis with enhanced accuracy is required. In this contribution, an automatic data analysis software, which allows fast and precise quantitative analysis of time-of-flight mass spectrometric data, is presented and discussed in detail. A crucial part of this software is a robust and fast peak finding algorithm with a consecutive numerical integration method allowing precise data analysis. We tested our analysis software with data from different time-of-flight mass spectrometers and different measurement campaigns thereof. The quantitative analysis of isotopes, using automatic data analysis, yields results with an accuracy of isotope ratios up to 100 ppm for a signal-to-noise ratio (SNR) of 10^4. We show that the accuracy of isotope ratios is in fact proportional to SNR^-1. Furthermore, we observe that the accuracy of isotope ratios is inversely proportional to the mass resolution. Additionally, we show that the accuracy of isotope ratios depends on the sample width T_s, scaling as T_s^0.5. Copyright © 2017 John Wiley & Sons, Ltd.
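    The two core steps named in the abstract, peak finding followed by numerical integration, can be sketched briefly. The code below is not the published algorithm; it is a minimal Python stand-in using scipy.signal.find_peaks and trapezoidal integration on a synthetic two-isotope spectrum, with the prominence threshold and integration window as assumptions.

        # Minimal sketch of the two core steps named above: peak finding
        # followed by numerical integration of each peak. Synthetic spectrum;
        # thresholds and window widths are illustrative assumptions.
        import numpy as np
        from scipy.signal import find_peaks

        # Synthetic ToF spectrum with two Gaussian isotope peaks (ratio 3:1).
        t = np.linspace(0.0, 10.0, 5000)
        spectrum = (3.0 * np.exp(-0.5 * ((t - 4.0) / 0.02) ** 2)
                    + 1.0 * np.exp(-0.5 * ((t - 6.0) / 0.02) ** 2)
                    + np.random.default_rng(0).normal(0.0, 0.005, t.size))

        # Step 1: peak detection above a noise-based prominence threshold.
        peaks, _ = find_peaks(spectrum, prominence=0.5)

        # Step 2: numerical integration in a fixed window around each peak.
        window = 100  # samples on each side, an assumed integration width
        areas = [np.trapz(spectrum[p - window:p + window],
                          t[p - window:p + window]) for p in peaks]

        print(f"Peak centres (ToF): {t[peaks]}")
        print(f"Isotope ratio estimate: {areas[0] / areas[1]:.3f}")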

  16. Continuous Risk Management: An Overview

    NASA Technical Reports Server (NTRS)

    Rosenberg, Linda; Hammer, Theodore F.

    1999-01-01

    Software risk management is important because it helps avoid disasters, rework, and overkill, but more importantly because it stimulates win-win situations. The objectives of software risk management are to identify, address, and eliminate software risk items before they become threats to success or major sources of rework. In general, good project managers are also good managers of risk. It makes good business sense for all software development projects to incorporate risk management as part of project management. The Software Assurance Technology Center (SATC) at NASA GSFC has been tasked with the responsibility for developing and teaching a systems level course for risk management that provides information on how to implement risk management. The course was developed in conjunction with the Software Engineering Institute at Carnegie Mellon University, then tailored to the NASA systems community. This is an introductory tutorial to continuous risk management based on this course. The rationale for continuous risk management and how it is incorporated into project management are discussed. The risk management structure of six functions is discussed in sufficient depth for managers to understand what is involved in risk management and how it is implemented. These functions include: (1) Identify the risks in a specific format; (2) Analyze the risk probability, impact/severity, and timeframe; (3) Plan the approach; (4) Track the risk through data compilation and analysis; (5) Control and monitor the risk; (6) Communicate and document the process and decisions.
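    As a rough illustration of the record-keeping these six functions imply, here is a minimal Python sketch of a risk record; the field names and rating scales are assumptions, not the SATC course's actual format.

        # Minimal sketch of a risk record covering the identify/analyze/plan/
        # track attributes named above. Field names and rating scales are
        # assumptions, not the SATC course's actual format.
        from dataclasses import dataclass, field

        @dataclass
        class Risk:
            statement: str              # (1) identify: risk in a specific format
            probability: float          # (2) analyze: likelihood, 0..1
            impact: int                 # (2) analyze: severity, 1 (low) .. 5 (high)
            timeframe_weeks: int        # (2) analyze: when the risk could hit
            mitigation_plan: str = ""   # (3) plan: chosen approach
            status_log: list = field(default_factory=list)  # (4)-(6) track, control,
                                                            # communicate/document

            def exposure(self) -> float:
                """Simple probability-times-impact ranking metric."""
                return self.probability * self.impact

        r = Risk("Flight software schedule slips past CDR", 0.4, 4, 12,
                 mitigation_plan="Add second integration test team")
        r.status_log.append("Week 3: probability re-assessed at 0.4")
        print(f"Exposure: {r.exposure():.1f}")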

  17. Debugging and Performance Analysis Software Tools for Peregrine System |

    Science.gov Websites

    Learn about the debugging and performance analysis software tools available for use with the Peregrine high-performance computing system at NREL, including the Allinea tools.

  18. Development of a software safety process and a case study of its use

    NASA Technical Reports Server (NTRS)

    Knight, John C.

    1993-01-01

    The goal of this research is to continue the development of a comprehensive approach to software safety and to evaluate the approach with a case study. The case study is a major part of the project, and it involves the analysis of a specific safety-critical system from the medical equipment domain. The particular application being used was selected because of the availability of a suitable candidate system. We consider the results to be generally applicable and in no way particularly limited by the domain. The research is concentrating on issues raised by the specification and verification phases of the software lifecycle since they are central to our previously-developed rigorous definitions of software safety. The theoretical research is based on our framework of definitions for software safety. In the area of specification, the main topics being investigated are the development of techniques for building system fault trees that correctly incorporate software issues and the development of rigorous techniques for the preparation of software safety specifications. The research results are documented. Another area of theoretical investigation is the development of verification methods tailored to the characteristics of safety requirements. Verification of the correct implementation of the safety specification is central to the goal of establishing safe software. The empirical component of this research is focusing on a case study in order to provide detailed characterizations of the issues as they appear in practice, and to provide a testbed for the evaluation of various existing and new theoretical results, tools, and techniques. The Magnetic Stereotaxis System is summarized.

  19. Certification of COTS Software in NASA Human Rated Flight Systems

    NASA Technical Reports Server (NTRS)

    Goforth, Andre

    2012-01-01

    Adoption of commercial off-the-shelf (COTS) products in safety critical systems has been seen as a promising acquisition strategy to improve mission affordability, yet it has come with significant barriers and challenges. Attempts to integrate COTS software components into NASA human rated flight systems have been, for the most part, complicated by the verification and validation (V&V) requirements necessary for flight certification per NASA's own standards. For software from COTS sources and, in general, from 3rd party sources, whether commercial, government, modified, or open source, the expectation is that it meets the same certification criteria as in-house software, and that it does so as if it were built in-house. The latter is a critical and hidden issue. This paper examines the longstanding barriers and challenges in the use of 3rd party software in safety critical systems and covers recent efforts to use COTS software in NASA's Multi-Purpose Crew Vehicle (MPCV) project. It identifies some core artifacts without which the use of COTS and 3rd party software is, for all practical purposes, a nonstarter for affordable and timely insertion into flight critical systems. The paper covers the first use in a flight critical system by NASA of COTS software with prior FAA certification heritage, which was shown to meet the RTCA-DO-178B standard, and how this certification may, in some cases, be leveraged to allow the use of analysis in lieu of testing. Finally, the paper proposes the establishment of an open source forum for the development of safety critical 3rd party software.

  20. Genome re-annotation: a wiki solution?

    PubMed Central

    Salzberg, Steven L

    2007-01-01

    The annotation of most genomes becomes outdated over time, owing in part to our ever-improving knowledge of genomes and in part to improvements in bioinformatics software. Unfortunately, annotation is rarely if ever updated and resources to support routine reannotation are scarce. Wiki software, which would allow many scientists to edit each genome's annotation, offers one possible solution. PMID:17274839

  1. Integrated testing and verification system for research flight software

    NASA Technical Reports Server (NTRS)

    Taylor, R. N.

    1979-01-01

    The MUST (Multipurpose User-oriented Software Technology) program is being developed to cut the cost of producing research flight software through a system of software support tools. An integrated verification and testing capability was designed as part of MUST. Documentation, verification and test options are provided with special attention on real-time, multiprocessing issues. The needs of the entire software production cycle were considered, with effective management and reduced lifecycle costs as foremost goals.

  2. IEEE Computer Society/Software Engineering Institute Software Process Achievement (SPA) Award 2009

    DTIC Science & Technology

    2011-03-01

    ... capabilities to our GDM. We also introduced software as a service (SaaS) as part of our technology solutions and have further enhanced our ability to ...

  3. Track train dynamics analysis and test program: Methodology development for the derailment safety analysis of six-axle locomotives

    NASA Technical Reports Server (NTRS)

    Marcotte, P. P.; Mathewson, K. J. R.

    1982-01-01

    The operational safety of six-axle locomotives is analyzed. A locomotive model with corresponding data on suspension characteristics, a method of track defect characterization, and a method of characterizing operational safety are used. A user-oriented software package was developed as part of the methodology and was used to study the effect on operational safety of various locomotive parameters and operational conditions such as speed, tractive effort, and track curvature. The operational safety of three different locomotive designs was investigated.

  4. Program design by a multidisciplinary team. [for structural finite element analysis on STAR-100 computer

    NASA Technical Reports Server (NTRS)

    Voigt, S.

    1975-01-01

    The use of software engineering aids in the design of a structural finite-element analysis computer program for the STAR-100 computer is described. Nested functional diagrams were used to aid communication among design team members, and a standardized specification format was adopted to describe modules designed by various members. This is a report of current work in which use of the functional diagrams provided continuity and helped resolve some of the problems arising in this long-running, part-time project.

  5. Development and validation of MIX: comprehensive free software for meta-analysis of causal research data

    PubMed Central

    Bax, Leon; Yu, Ly-Mee; Ikeda, Noriaki; Tsuruta, Harukazu; Moons, Karel GM

    2006-01-01

    Background Meta-analysis has become a well-known method for synthesis of quantitative data from previously conducted research in applied health sciences. So far, meta-analysis has been particularly useful in evaluating and comparing therapies and in assessing causes of disease. Consequently, the number of software packages that can perform meta-analysis has increased over the years. Unfortunately, it can take a substantial amount of time to get acquainted with some of these programs, and most contain little or no interactive educational material. We set out to create and validate an easy-to-use and comprehensive meta-analysis package that would be simple enough, programming-wise, to remain available as a free download. We specifically aimed at students and researchers who are new to meta-analysis, with important parts of the development oriented towards creating internal interactive tutoring tools and designing features that would facilitate usage of the software as a companion to existing books on meta-analysis. Results We took an unconventional approach and created a program that uses Excel as a calculation and programming platform. The main programming language was Visual Basic, as implemented in Visual Basic 6 and Visual Basic for Applications in Excel 2000 and higher. The development took approximately two years and resulted in the 'MIX' program, which can be downloaded from the program's website free of charge. Next, we set out to validate the MIX output with two major software packages as reference standards, namely STATA (metan, metabias, and metatrim) and Comprehensive Meta-Analysis Version 2. Eight meta-analyses that had been published in major journals were used as data sources. All numerical and graphical results from analyses with MIX were identical to their counterparts in STATA and CMA. The MIX program distinguishes itself from most other programs by its extensive graphical output, its click-and-go (Excel) interface, and its educational features. Conclusion The MIX program is a valid tool for performing meta-analysis and may be particularly useful in educational environments. It can be downloaded free of charge from the program's website. PMID:17038197
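    The core calculation any such package performs can be shown compactly. The sketch below is a minimal fixed-effect (inverse-variance) meta-analysis written in Python rather than MIX's Excel/Visual Basic platform, with made-up effect sizes and standard errors.

        # Minimal fixed-effect (inverse-variance) meta-analysis sketch.
        # Effect sizes and standard errors are made-up inputs.
        import numpy as np

        effects = np.array([0.30, 0.12, 0.25, 0.41])   # assumed per-study effects
        se = np.array([0.10, 0.15, 0.12, 0.20])        # assumed standard errors

        weights = 1.0 / se**2                          # inverse-variance weights
        pooled = np.sum(weights * effects) / np.sum(weights)
        pooled_se = np.sqrt(1.0 / np.sum(weights))

        # Heterogeneity: Cochran's Q and the I^2 statistic.
        q = np.sum(weights * (effects - pooled) ** 2)
        df = len(effects) - 1
        i_squared = max(0.0, (q - df) / q) * 100.0

        print(f"Pooled effect: {pooled:.3f} (SE {pooled_se:.3f})")
        print(f"Q = {q:.2f} on {df} df, I^2 = {i_squared:.1f}%")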

  6. Transient loads analysis for space flight applications

    NASA Technical Reports Server (NTRS)

    Thampi, S. K.; Vidyasagar, N. S.; Ganesan, N.

    1992-01-01

    A significant part of the flight readiness verification process involves transient analysis of the coupled Shuttle-payload system to determine the low frequency transient loads. This paper describes a methodology for transient loads analysis and its implementation for the Spacelab Life Sciences Mission. The analysis is carried out using two major software tools - NASTRAN and an external FORTRAN code called EZTRAN. This approach is adopted to overcome some of the limitations of NASTRAN's standard transient analysis capabilities. The method uses Data Recovery Matrices (DRM) to improve computational efficiency. The mode acceleration method is fully implemented in the DRM formulation to recover accurate displacements, stresses, and forces. The advantages of the method are demonstrated through a numerical example.
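    The mode acceleration method mentioned above has a standard textbook form, reproduced here for orientation; this is the generic expression for mass-normalized modes, not necessarily the exact formulation coded in EZTRAN. With modal coordinates satisfying $\ddot{q}_i + 2\zeta_i\omega_i\dot{q}_i + \omega_i^2 q_i = \phi_i^T F(t)$, the displacement recovery is

        u(t) \approx K^{-1} F(t) - \sum_{i=1}^{m} \frac{\phi_i \left( \ddot{q}_i + 2 \zeta_i \omega_i \dot{q}_i \right)}{\omega_i^2}

    where the $K^{-1}F(t)$ term restores the quasi-static contribution of the truncated high-frequency modes, which is why the method recovers more accurate displacements, stresses, and forces than a plain mode displacement sum.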

  7. Guidance and Control Software Project Data - Volume 3: Verification Documents

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the verification documents from the GCS project. Volume 3 contains four appendices: A. Software Verification Cases and Procedures for the Guidance and Control Software Project; B. Software Verification Results for the Pluto Implementation of the Guidance and Control Software; C. Review Records for the Pluto Implementation of the Guidance and Control Software; and D. Test Results Logs for the Pluto Implementation of the Guidance and Control Software.

  8. 15 CFR 774.2 - [Reserved

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...” that “incorporate” commodities or software on the Commerce Control List (Supplement No. 1 to part 774... the practice of medicine (does not include medical research). (2) Commodities or software are considered “incorporated” if the commodity or software is: Essential to the functioning of the medical...

  9. Software process assessments

    NASA Technical Reports Server (NTRS)

    Miller, Sharon E.; Tucker, George T.; Verducci, Anthony J., Jr.

    1992-01-01

    Software process assessments (SPAs) are part of an ongoing program of continuous quality improvement at AT&T. Software development organizations found their use very beneficial in identifying the issues facing the organization and the actions required to increase both quality and productivity in the organization.

  10. 48 CFR 252.251-7000 - Ordering from Government supply sources.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Enterprise Software Agreements, the Contractor shall follow the terms of the applicable schedule or agreement... Enterprise Software Agreement contractor). (2) The following statement: Any price reductions negotiated as part of an Enterprise Software Agreement issued under a Federal Supply Schedule contract shall control...

  11. The Offline Software Framework of the NA61/SHINE Experiment

    NASA Astrophysics Data System (ADS)

    Sipos, Roland; Laszlo, Andras; Marcinek, Antoni; Paul, Tom; Szuba, Marek; Unger, Michael; Veberic, Darko; Wyszynski, Oskar

    2012-12-01

    NA61/SHINE (SHINE = SPS Heavy Ion and Neutrino Experiment) is an experiment at the CERN SPS using the upgraded NA49 hadron spectrometer. Among its physics goals are precise hadron production measurements for improving calculations of the neutrino beam flux in the T2K neutrino oscillation experiment, as well as for more reliable simulations of cosmic-ray air showers. Moreover, p+p, p+Pb and nucleus+nucleus collisions will be studied extensively to allow for a study of the properties of the onset of deconfinement and a search for the critical point of strongly interacting matter. Currently NA61/SHINE uses the old NA49 software framework for reconstruction, simulation and data analysis. The core of this legacy framework was developed in the early 1990s. It is written in different programming and scripting languages (C, pgi-Fortran, shell) and provides several concurrent data formats for the event data model, which also includes obsolete parts. In this contribution we introduce the new software framework, called Shine, which is written in C++ and designed to comprise three principal parts: a collection of processing modules which can be assembled and sequenced by the user via XML files; an event data model which contains all simulation and reconstruction information, based on STL and ROOT streaming; and a detector description which provides data on the configuration and state of the experiment. To ensure a quick migration to the Shine framework, wrappers were introduced that allow legacy code to run as modules in the new framework, and we present first results on the cross-validation of the two frameworks.

  12. Center for Efficient Exascale Discretizations Software Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kolev, Tzanio; Dobrev, Veselin; Tomov, Vladimir

    The CEED Software Suite is a collection of generally applicable software tools focusing on the following computational motifs: PDE discretizations on unstructured meshes, high-order finite element and spectral element methods, and unstructured adaptive mesh refinement. All of this software is being developed as part of CEED, a co-design Center for Efficient Exascale Discretizations, within DOE's Exascale Computing Project (ECP) program.

  13. 3-d finite element model development for biomechanics: a software demonstration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hollerbach, K.; Hollister, A.M.; Ashby, E.

    1997-03-01

    Finite element analysis is becoming an increasingly important part of biomechanics and orthopedic research, as computational resources become more powerful, and data handling algorithms become more sophisticated. Until recently, tools with sufficient power did not exist or were not accessible to adequately model complicated, three-dimensional, nonlinear biomechanical systems. In the past, finite element analyses in biomechanics have often been limited to two-dimensional approaches, linear analyses, or simulations of single tissue types. Today, we have the resources to model fully three-dimensional, nonlinear, multi-tissue, and even multi-joint systems. The authors will present the process of developing these kinds of finite element models, using human hand and knee examples, and will demonstrate their software tools.

  14. Automation of experimental research of waveguide paths induction soldering

    NASA Astrophysics Data System (ADS)

    Tynchenko, V. S.; Petrenko, V. E.; Kukartsev, V. V.; Tynchenko, V. V.; Antamoshkin, O. A.

    2018-05-01

    The article presents an automated system for experimental studies of the waveguide path induction soldering process. The system is additional software for a complex that automates control of the technological process of induction soldering of thin-walled aluminum alloy waveguide paths, expanding its capabilities. The structure of the software product, the general appearance of the controls, and the potential applications are presented. The utility of the developed application was demonstrated through a series of field experiments. The experimental research system makes it possible to improve the process under consideration by enabling fine-tuning of the control regulators and by keeping statistics of the soldering process in a form convenient for analysis.

  15. Simulation based optimization on automated fibre placement process

    NASA Astrophysics Data System (ADS)

    Lei, Shi

    2018-02-01

    In this paper, a software simulation (Autodesk TruPlan & TruFiber) based method is proposed to optimize the automated fibre placement (AFP) process. Different types of manufacturability analysis are introduced to predict potential defects. Advanced fibre path generation algorithms are compared with respect to geometrically different parts. Major manufacturing data are taken into consideration prior to tool path generation to achieve a high manufacturing success rate.

  16. Perform MetalMapper Classification Treatability Investigations as Part of Remedial Investigation/Feasibility Studies: Live Site Demonstrations: Pueblo Chemical Depot

    DTIC Science & Technology

    2016-03-14

    ... three primary objectives: test and validate the detection and discrimination capabilities of currently available advanced electromagnetic induction (EMI) sensors in dynamic and static data acquisition modes and the associated analysis software. To achieve these objectives, a controlled test was ...

  17. Evaluating Teachers' Support Requests When Just-in-Time Instructional Support is Provided to Introduce a Primary Level Web-Based Reading Program

    ERIC Educational Resources Information Center

    Wood, Eileen; Anderson, Alissa; Piquette-Tomei, Noella; Savage, Robert; Mueller, Julie

    2011-01-01

    Support requests were documented for 10 teachers (4 kindergarten, 4 grade one, and 2 grade one/two teachers) who received just-in-time instructional support over a 2 1/2 month period while implementing a novel reading software program as part of their literacy instruction. In-class observations were made of each instructional session. Analysis of…

  18. Software for integrated manufacturing systems, part 2

    NASA Technical Reports Server (NTRS)

    Volz, R. A.; Naylor, A. W.

    1987-01-01

    Part 1 presented an overview of the unified approach to manufacturing software. The specific characteristics of the approach that allow it to realize the goals of reduced cost, increased reliability and increased flexibility are considered. Why the blending of a components view, distributed languages, generics and formal models is important, why each individual part of this approach is essential, and why each component will typically have each of these parts are examined. An example of a specification for a real material handling system is presented using the approach and compared with the standard interface specification given by the manufacturer. Use of the component in a distributed manufacturing system is then compared with use of the traditional specification with a more traditional approach to designing the system. An overview is also provided of the underlying mechanisms used for implementing distributed manufacturing systems using the unified software/hardware component approach.

  19. Development of a software tool to support chemical and biological terrorism intelligence analysis

    NASA Astrophysics Data System (ADS)

    Hunt, Allen R.; Foreman, William

    1997-01-01

    AKELA has developed a software tool which uses a systems analytic approach to model the critical processes that support the acquisition of biological and chemical weapons by terrorist organizations. This tool has four major components. The first is a procedural expert system which describes the weapon acquisition process. It shows the relationship between the stages a group goes through to acquire and use a weapon and the activities in each stage required to be successful. It applies to both state-sponsored and small-group acquisition. An important part of this expert system is an analysis of the acquisition process which is embodied in a list of observables of weapon acquisition activity. These observables are cues for intelligence collection. The second component is a detailed glossary of technical terms which helps analysts with a non-technical background understand the potential relevance of collected information. The third component is a linking capability which shows where technical terms apply to the parts of the acquisition process. The final component is a simple, intuitive user interface which shows a picture of the entire process at a glance and lets the user move quickly to get more detailed information. This paper explains each of these four components.

  20. CAPRI: Using a Geometric Foundation for Computational Analysis and Design

    NASA Technical Reports Server (NTRS)

    Haimes, Robert

    2002-01-01

    CAPRI (Computational Analysis Programming Interface) is a software development tool intended to make computerized design, simulation and analysis faster and more efficient. The computational steps traditionally taken for most engineering analysis (Computational Fluid Dynamics (CFD), structural analysis, etc.) are: Surface Generation, usually by employing a Computer Aided Design (CAD) system; Grid Generation, preparing the volume for the simulation; Flow Solver, producing the results at the specified operational point; and Post-processing Visualization, interactively attempting to understand the results. It should be noted that the structures problem is more tractable than CFD; there are fewer mesh topologies used and the grids are not as fine (this problem space does not have the length scaling issues of fluids). For CFD, these steps have worked well in the past for simple steady-state simulations, at the expense of much user interaction. The data was transmitted between phases via files. In most cases, the output from a CAD system could go into IGES files. The output from Grid Generators and Solvers does not really have standards, though there are a couple of file formats that can be used for a subset of the gridding data (i.e., PLOT3D and the upcoming CGNS). The user would have to patch up the data or translate from one format to another to move to the next step. Sometimes this could take days. Instead of this serial approach to analysis, CAPRI takes a geometry-centric approach. CAPRI is a software building tool-kit that refers to two ideas: (1) a simplified, object-oriented, hierarchical view of a solid part integrating both geometry and topology definitions, and (2) programming access to this part or assembly and any attached data. The connection to the geometry is made through an Application Programming Interface (API) and not a file system.
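    To make the contrast with the file-based pipeline concrete, here is a purely hypothetical sketch in Python. None of these class or method names are real CAPRI identifiers, and the actual CAPRI API differs; the point is only the call pattern that geometry-centric access implies.

        # Purely hypothetical sketch contrasting file-based handoff with a
        # geometry-centric API in the spirit of CAPRI. None of these names
        # are real CAPRI identifiers.
        class GeometryKernel:
            """Stand-in for a CAD-backed part: geometry, topology, and
            attached data queried live, instead of exchanged via files."""
            def __init__(self, part_name: str):
                self.part = part_name

            def faces(self):
                # A real kernel would enumerate topological faces of the part.
                return ["wing_upper", "wing_lower", "tip"]

            def attribute(self, face: str, key: str):
                # Attached data (e.g., boundary-condition tags) travels with
                # the geometry rather than living in a separate grid file.
                return {"wing_upper": {"bc": "viscous_wall"}}.get(face, {}).get(key)

        # Grid generator and solver query the same live geometry instead of
        # passing IGES/PLOT3D files between pipeline stages.
        kernel = GeometryKernel("wing")
        for face in kernel.faces():
            print(face, kernel.attribute(face, "bc"))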

  1. The Earth System Documentation (ES-DOC) Software Process

    NASA Astrophysics Data System (ADS)

    Greenslade, M. A.; Murphy, S.; Treshansky, A.; DeLuca, C.; Guilyardi, E.; Denvil, S.

    2013-12-01

    Earth System Documentation (ES-DOC) is an international project supplying high-quality tools & services in support of earth system documentation creation, analysis and dissemination. It is nurturing a sustainable standards based documentation eco-system that aims to become an integral part of the next generation of exa-scale dataset archives. ES-DOC leverages open source software, and applies a software development methodology that places end-user narratives at the heart of all it does. ES-DOC has initially focused upon nurturing the Earth System Model (ESM) documentation eco-system and currently supporting the following projects: * Coupled Model Inter-comparison Project Phase 5 (CMIP5); * Dynamical Core Model Inter-comparison Project (DCMIP); * National Climate Predictions and Projections Platforms Quantitative Evaluation of Downscaling Workshop. This talk will demonstrate that ES-DOC implements a relatively mature software development process. Taking a pragmatic Agile process as inspiration, ES-DOC: * Iteratively develops and releases working software; * Captures user requirements via a narrative based approach; * Uses online collaboration tools (e.g. Earth System CoG) to manage progress; * Prototypes applications to validate their feasibility; * Leverages meta-programming techniques where appropriate; * Automates testing whenever sensibly feasible; * Streamlines complex deployments to a single command; * Extensively leverages GitHub and Pivotal Tracker; * Enforces strict separation of the UI from underlying API's; * Conducts code reviews.

  2. EvOligo: A Novel Software to Design and Group Libraries of Oligonucleotides Applicable for Nucleic Acid-Based Experiments.

    PubMed

    Milewski, Marek C; Kamel, Karol; Kurzynska-Kokorniak, Anna; Chmielewski, Marcin K; Figlerowicz, Marek

    2017-10-01

    Experimental methods based on DNA and RNA hybridization, such as multiplex polymerase chain reaction, multiplex ligation-dependent probe amplification, or microarray analysis, require the use of mixtures of multiple oligonucleotides (primers or probes) in a single test tube. To provide an optimal reaction environment, minimal self- and cross-hybridization must be achieved among these oligonucleotides. To address this problem, we developed EvOligo, which is a software package that provides the means to design and group DNA and RNA molecules with defined lengths. EvOligo combines two modules. The first module performs oligonucleotide design, and the second module performs oligonucleotide grouping. The software applies a nearest-neighbor model of nucleic acid interactions coupled with a parallel evolutionary algorithm to construct individual oligonucleotides, and to group the molecules that are characterized by the weakest possible cross-interactions. To provide optimal solutions, the evolutionary algorithm sorts oligonucleotides into sets, preserves preselected parts of the oligonucleotides, and shapes their remaining parts. In addition, the oligonucleotide sets can be designed and grouped based on their melting temperatures. For the user's convenience, EvOligo is provided with a user-friendly graphical interface. EvOligo was used to design individual oligonucleotides, oligonucleotide pairs, and groups of oligonucleotide pairs that are characterized by the following parameters: (1) weaker cross-interactions between the non-complementary oligonucleotides and (2) more uniform ranges of the oligonucleotide pair melting temperatures than other available software products. In addition, in contrast to other grouping algorithms, EvOligo offers time-efficient sorting of paired and unpaired oligonucleotides based on various parameters defined by the user.
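    EvOligo itself is an Excel/Visual Basic application, but the nearest-neighbor melting temperature idea can be illustrated independently. The sketch below uses Biopython's nearest-neighbor Tm routine together with a deliberately simple greedy Tm-window grouping; it illustrates the concepts, not EvOligo's parallel evolutionary algorithm.

        # Rough illustration of two ideas from the abstract: nearest-neighbor
        # melting temperatures and grouping oligos into sets with similar Tm.
        # The greedy 2-degree window grouping below is a simplification, not
        # EvOligo's evolutionary algorithm.
        from Bio.SeqUtils import MeltingTemp as mt

        oligos = ["ATGCGTACGTTAGC", "GGCCATTACGATCG", "TTTACGATGGCATC",
                  "GCGCGCTATAGCGC", "ATATATCGCGATAT"]

        # Nearest-neighbor Tm for each oligo (default Biopython parameters).
        tms = {seq: mt.Tm_NN(seq) for seq in oligos}

        # Greedy grouping: an oligo whose Tm falls within 2 degC of a group's
        # first member joins that group; otherwise a new group is opened.
        groups = []
        for seq in sorted(oligos, key=tms.get):
            for g in groups:
                if abs(tms[g[0]] - tms[seq]) <= 2.0:
                    g.append(seq)
                    break
            else:
                groups.append([seq])

        for i, g in enumerate(groups, 1):
            print(f"Group {i}: " +
                  ", ".join(f"{s} ({tms[s]:.1f} degC)" for s in g))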

  3. Design and Construction of a Field Capable Snapshot Hyperspectral Imaging Spectrometer

    NASA Technical Reports Server (NTRS)

    Arik, Glenda H.

    2005-01-01

    The computed-tomography imaging spectrometer (CTIS) is a device which captures the spatial and spectral content of a rapidly evolving scene in a single image frame. The most recent CTIS design is optically all-reflective and uses as its dispersive device a state-of-the-art reflective computer-generated hologram (CGH). This project focuses on the instrument's transition from laboratory to field. The design will enable the CTIS to withstand a harsh desert environment. The system is modeled in optical design software using a tolerance analysis. The tolerances guide the design of the athermal mount and component parts. The parts are assembled into a working mount shell where the performance of the mounts is tested for thermal integrity. An interferometric analysis of the reflective CGH is also performed.

  4. GIDEP Batching Tool

    NASA Technical Reports Server (NTRS)

    Fong, Danny; Odell,Dorice; Barry, Peter; Abrahamian, Tomik

    2008-01-01

    This software provides internal, automated searching of GIDEP (Government-Industry Data Exchange Program) Alert data imported from the GIDEP government Web site. The batching tool allows the import of a single parts list in tab-delimited text format into the local JPL GIDEP database. Delimiters are removed from every part number, and both the original part numbers with delimiters and the newly generated list without delimiters are compared: the two lists are run against the GIDEP imports, and any matches are output. This feature only works with Netscape 2.0 or greater, or Internet Explorer 4.0 or greater. The user selects the browse button to choose a text file to import; when the submit button is pressed, the script imports alerts from the text file into the local JPL GIDEP database. This batch tool provides complete in-house control over exported material and data, with automated batch matching. The tool can match the parts list against the tables and yields results that aid further research and analysis, giving more control over GIDEP information for metrics and for reports not provided by the government site. The software yields results quickly, and there is enough space to store years of data. The program relates to risk identification and management for projects and GIDEP alert information encompassing flight parts for space exploration.

  5. Data analysis software for the autoradiographic enhancement process. Volumes 1, 2, and 3, and appendix

    NASA Technical Reports Server (NTRS)

    Singh, S. P.

    1979-01-01

    The computer software developed to implement a method for Wiener spectrum analysis of photographic films is presented. This method is used for the quantitative analysis of the autoradiographic enhancement process. The software requirements and design for the autoradiographic enhancement process are given, along with the program listings and the user's manual. A software description and program listings for a modification of the data analysis software are included.
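    The original software dates from 1979, but the underlying Wiener (noise power) spectrum estimate is easy to sketch in modern terms. The Python code below averages the squared FFT magnitude of mean-subtracted scan segments; the synthetic trace and sample spacing are assumptions standing in for film microdensitometer data.

        # Minimal 1-D Wiener (noise power) spectrum sketch: average the squared
        # FFT magnitude of mean-subtracted scan segments. The synthetic trace
        # stands in for microdensitometer scans of film.
        import numpy as np

        rng = np.random.default_rng(1)
        scan = rng.normal(0.0, 0.03, 8192)   # assumed granularity fluctuations
        dx = 0.01                            # assumed sample spacing, mm

        seg_len = 512
        segments = scan[: scan.size // seg_len * seg_len].reshape(-1, seg_len)
        segments = segments - segments.mean(axis=1, keepdims=True)

        # Noise power spectrum, averaged over segments and scaled by dx / N.
        nps = (np.abs(np.fft.rfft(segments, axis=1)) ** 2).mean(axis=0) * dx / seg_len
        freqs = np.fft.rfftfreq(seg_len, d=dx)   # spatial frequency, cycles/mm

        print(f"NPS at {freqs[10]:.1f} cycles/mm: {nps[10]:.3e}")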

  6. 15 CFR 995.25 - Quality management system.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... management system are those defined in this part. The quality management system must ensure that the... type approved conversion software is maintained by a third party, CEVAD shall ensure that no changes made to the conversion software render the type approval of the conversion software invalid, and shall...

  7. 15 CFR 995.25 - Quality management system.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... management system are those defined in this part. The quality management system must ensure that the... type approved conversion software is maintained by a third party, CEVAD shall ensure that no changes made to the conversion software render the type approval of the conversion software invalid, and shall...

  8. 15 CFR 995.25 - Quality management system.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... management system are those defined in this part. The quality management system must ensure that the... type approved conversion software is maintained by a third party, CEVAD shall ensure that no changes made to the conversion software render the type approval of the conversion software invalid, and shall...

  9. 15 CFR 995.25 - Quality management system.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... management system are those defined in this part. The quality management system must ensure that the... type approved conversion software is maintained by a third party, CEVAD shall ensure that no changes made to the conversion software render the type approval of the conversion software invalid, and shall...

  10. 15 CFR Supplement No. 2 to Part 730 - Technical Advisory Committees

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ..., materials, or supplies, including technology, software, and other information, that are subject to export... to a clearly defined grouping of articles, materials, or supplies, including technology, software, or..., including technology, software, and other information, that are subject to export controls because of their...

  11. 15 CFR Supplement No. 2 to Part 730 - Technical Advisory Committees

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ..., materials, or supplies, including technology, software, and other information, that are subject to export... to a clearly defined grouping of articles, materials, or supplies, including technology, software, or..., including technology, software, and other information, that are subject to export controls because of their...

  12. 15 CFR Supplement No. 2 to Part 730 - Technical Advisory Committees

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ..., materials, or supplies, including technology, software, and other information, that are subject to export... to a clearly defined grouping of articles, materials, or supplies, including technology, software, or..., including technology, software, and other information, that are subject to export controls because of their...

  13. A general concept for consistent documentation of computational analyses

    PubMed Central

    Müller, Fabian; Nordström, Karl; Lengauer, Thomas; Schulz, Marcel H.

    2015-01-01

    The ever-growing amount of data in the field of life sciences demands standardized ways of high-throughput computational analysis. This standardization requires a thorough documentation of each step in the computational analysis to enable researchers to understand and reproduce the results. However, due to the heterogeneity in software setups and the high rate of change during tool development, reproducibility is hard to achieve. One reason is that there is no common agreement in the research community on how to document computational studies. In many cases, simple flat files or other unstructured text documents are provided by researchers as documentation, which are often missing software dependencies, versions and sufficient documentation to understand the workflow and parameter settings. As a solution we suggest a simple and modest approach for documenting and verifying computational analysis pipelines. We propose a two-part scheme that defines a computational analysis using a Process and an Analysis metadata document, which jointly describe all necessary details to reproduce the results. In this design we separate the metadata specifying the process from the metadata describing an actual analysis run, thereby reducing the effort of manual documentation to an absolute minimum. Our approach is independent of a specific software environment, results in human readable XML documents that can easily be shared with other researchers and allows an automated validation to ensure consistency of the metadata. Because our approach has been designed with little to no assumptions concerning the workflow of an analysis, we expect it to be applicable in a wide range of computational research fields. Database URL: http://deep.mpi-inf.mpg.de/DAC/cmds/pub/pyvalid.zip PMID:26055099
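    A toy version of the two-part scheme helps make the separation concrete. The sketch below emits a Process document and an Analysis document as XML from Python; the element names are illustrative only, not the published schema.

        # Minimal sketch of the two-part scheme: a Process document describing
        # the pipeline once, and an Analysis document describing one concrete
        # run. XML element names are illustrative, not the published schema.
        import xml.etree.ElementTree as ET

        def process_doc():
            proc = ET.Element("process", name="read_alignment")
            step = ET.SubElement(proc, "step", tool="bwa", version="0.7.17")
            ET.SubElement(step, "parameter", name="threads")  # declared, no value
            return proc

        def analysis_doc():
            run = ET.Element("analysis", process="read_alignment")
            ET.SubElement(run, "input", path="sample1.fastq")
            ET.SubElement(run, "parameter", name="threads", value="8")
            return run

        # The process metadata is written once; each run records only its own
        # inputs and parameter values, keeping manual documentation minimal.
        for doc in (process_doc(), analysis_doc()):
            print(ET.tostring(doc, encoding="unicode"))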

  14. Creating an open environment software infrastructure

    NASA Technical Reports Server (NTRS)

    Jipping, Michael J.

    1992-01-01

    As the development of complex computer hardware accelerates at increasing rates, the ability of software to keep pace is essential. The development of software design tools, however, is falling behind the development of hardware for several reasons, the most prominent of which is the lack of a software infrastructure to provide an integrated environment for all parts of a software system. The research was undertaken to provide a basis for answering this problem by investigating the requirements of open environments.

  15. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1: ASC software quality engineering practices, Version 2.0.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR 1.3.2 and 1.3.6 and to a Department of Energy document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines. This document also identifies the responsibilities of ASC management and software project teams in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  16. Managing EEE part standardisation and procurement

    NASA Astrophysics Data System (ADS)

    Serieys, C.; Bensoussan, A.; Petitmangin, A.; Rigaud, M.; Barbaresco, P.; Lyan, C.

    2002-12-01

    This paper presents development activities in space component selection and procurement dealing with a new database tool implemented at Alcatel Space using the TransForm software configurator developed by Techform S.A. Based on TransForm, Access Ingenierie has developed a software product named OLG@DOS which facilitates part nomenclature analyses for new equipment design and manufacturing, in the form of an ACCESS database implementation. Hi-Rel EEE part type technical, production and quality information is collected and compiled using production databases issued from the production tools implemented for equipment definition, description and production, based on Manufacturing Resource Planning (MRP II Control Open) and Parametric Design Manager (PDM Work Manager). The nomenclature of any new equipment may be analysed through this means for standardisation purposes, cost containment programs and the management of procurement activities, as well as the preparation of component reviews such as Part Approval Document and Declared Part List validation.

  17. 15 CFR Supplement No. 1 to Part 734 - Questions and Answers-Technology and Software Subject to the EAR

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... software and databases, at wholesale and retail. Our products are available by mail order to any member of.... Release of information by instruction in catalog courses and associated teaching laboratories of academic... proprietary business does not qualify as an “academic institution” within the meaning of § 734.9 of this part...

  18. 15 CFR Supplement No. 1 to Part 734 - Questions and Answers-Technology and Software Subject to the EAR

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... software and databases, at wholesale and retail. Our products are available by mail order to any member of.... Release of information by instruction in catalog courses and associated teaching laboratories of academic... proprietary business does not qualify as an “academic institution” within the meaning of § 734.9 of this part...

  19. 15 CFR Supplement No. 1 to Part 734 - Questions and Answers-Technology and Software Subject to the EAR

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... software and databases, at wholesale and retail. Our products are available by mail order to any member of.... Release of information by instruction in catalog courses and associated teaching laboratories of academic... proprietary business does not qualify as an “academic institution” within the meaning of § 734.9 of this part...

  20. 15 CFR Supplement No. 1 to Part 734 - Questions and Answers-Technology and Software Subject to the EAR

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... software and databases, at wholesale and retail. Our products are available by mail order to any member of.... Release of information by instruction in catalog courses and associated teaching laboratories of academic... proprietary business does not qualify as an “academic institution” within the meaning of § 734.9 of this part...

  1. Infusing Reliability Techniques into Software Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shi, Ying

    2015-01-01

    Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  2. Automatic landslide detection from LiDAR DTM derivatives by geographic-object-based image analysis based on open-source software

    NASA Astrophysics Data System (ADS)

    Knevels, Raphael; Leopold, Philip; Petschko, Helene

    2017-04-01

    With high-resolution airborne Light Detection and Ranging (LiDAR) data more commonly available, many studies have been performed to exploit the detailed information on the earth surface that it provides and to analyse its limitations. Specifically in the field of natural hazards, digital terrain models (DTM) have been used to map hazardous processes such as landslides, mainly by visual interpretation of LiDAR DTM derivatives. However, new approaches are striving towards automatic detection of landslides to speed up the process of generating landslide inventories. These studies usually use a combination of optical imagery and terrain data, and are designed in commercial software packages such as ESRI ArcGIS, Definiens eCognition, or MathWorks MATLAB. The objective of this study was to investigate the potential of open-source software for automatic landslide detection based only on high-resolution LiDAR DTM derivatives in a study area within the federal state of Burgenland, Austria. The study area is very prone to landslides, which have been mapped with different methodologies in recent years. The free development environment R was used to integrate open-source geographic information system (GIS) software, such as SAGA (System for Automated Geoscientific Analyses), GRASS (Geographic Resources Analysis Support System), or TauDEM (Terrain Analysis Using Digital Elevation Models). The implemented geographic-object-based image analysis (GEOBIA) consisted of (1) derivation of land surface parameters, such as slope, surface roughness, curvature, or flow direction, (2) finding an optimal scale parameter by use of an objective function, (3) multi-scale segmentation, (4) classification of landslide parts (main scarp, body, flanks) by k-means thresholding, (5) assessment of the classification performance using a pre-existing landslide inventory, and (6) post-processing analysis for further use in landslide inventories. The results of the developed open-source approach demonstrated good success rates in objectively detecting landslides in high-resolution topography data by GEOBIA.
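
    As a minimal illustration of step (4) above, the sketch below (Python; the slope values are synthetic stand-ins, since a real raster would be derived from the LiDAR DTM) classifies a land-surface parameter into parts by a tiny k-means thresholding, the technique named in the abstract:

        import numpy as np

        def kmeans_threshold(values, k=3, iters=50, seed=0):
            # tiny 1-D k-means: split a DTM derivative into k classes
            rng = np.random.default_rng(seed)
            centers = rng.choice(values, size=k, replace=False)
            for _ in range(iters):
                labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
                for j in range(k):
                    if np.any(labels == j):
                        centers[j] = values[labels == j].mean()
            return centers, labels

        # stand-in slope raster in degrees, flattened
        slope = np.abs(np.random.default_rng(1).normal(15.0, 8.0, 10000))
        centers, labels = kmeans_threshold(slope)
        print("class centres (deg):", np.sort(centers).round(1))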

  3. The use of Tecnomatix software to simulate the manufacturing flows in an industrial enterprise producing hydrostatic components

    NASA Astrophysics Data System (ADS)

    Petrila, S.; Brabie, G.; Chirita, B.

    2016-08-01

    The analysis performed on the manufacturing flows within an industrial enterprise producing hydrostatic components was based on a number of factors that influence the smooth running of production, such as: the distance between pieces; the waiting time from one operation to another; the time to complete setups on CNC machines; and tool changing in the case of a large number of operations and high manufacturing complexity [2]. To optimize the manufacturing flow, the Tecnomatix software was used. This software is a complete portfolio of digital manufacturing solutions produced by Siemens. It supports innovation by linking all the production stages of a product, from process design through process simulation and validation to the manufacturing process itself. Among its many capabilities for creating a wide range of simulations, the program offers various demonstrations of the behaviour of manufacturing cycles. It allows the simulation and optimization of production systems and processes in several areas, such as: automotive suppliers; production of industrial equipment; electronics manufacturing; and the design and production of aerospace and defense parts.

  4. Patterns of Interaction and Mathematical Thinking of High School Students in Classroom Environments That Include Use of Java-Based, Curriculum-Embedded Software

    ERIC Educational Resources Information Center

    Fonkert, Karen L.

    2012-01-01

    This study analyzes the nature of student interaction and discourse in an environment that includes the use of Java-based, curriculum-embedded mathematical software. The software "CPMP-Tools" was designed as part of the development of the second edition of the "Core-Plus Mathematics" curriculum. The use of the software on…

  5. An Open Source approach to automated hydrological analysis of ungauged drainage basins in Serbia using R and SAGA

    NASA Astrophysics Data System (ADS)

    Zlatanovic, Nikola; Milovanovic, Irina; Cotric, Jelena

    2014-05-01

    Drainage basins are for the most part ungauged or poorly gauged, not only in Serbia but in most parts of the world, usually due to insufficient funds, but also due to the decommissioning of river gauges in upland catchments to focus on downstream areas which are more populated. Very often, design discharges are needed for these streams or rivers where no streamflow data are available, for various applications. Examples include river training works for flood protection measures or erosion control, design of culverts, water supply facilities, small hydropower plants etc. The estimation of discharges in ungauged basins is most often performed using rainfall-runoff models, whose parameters rely heavily on geomorphometric attributes of the basin (e.g. catchment area, elevation, slopes of channels and hillslopes etc.). The calculation of these, as well as other parameters, is most often done in GIS (Geographic Information System) software environments. This study deals with the application of freely available and open source software and datasets for automating rainfall-runoff analysis of ungauged basins using methodologies currently in use in hydrological practice. The R programming language was used for scripting and automating the hydrological calculations, coupled with SAGA GIS (System for Automated Geoscientific Analyses) for geocomputing functions and terrain analysis. Datasets used in the analyses include the freely available SRTM (Shuttle Radar Topography Mission) terrain data, CORINE (Coordination of Information on the Environment) Land Cover data, as well as soil maps and rainfall data. The choice of free and open source software and datasets makes the project ideal for academic and research purposes and cross-platform projects. The geomorphometric module was tested on more than 100 catchments throughout Serbia and compared to manually calculated values (using topographic maps). The discharge estimation module was tested on 21 catchments where data were available and compared to results obtained by frequency analysis of annual maximum discharge. The geomorphometric module of the calculation system showed excellent results, saving a great deal of time that would otherwise have been spent on manual processing of geospatial data. The type of automated analysis presented in this study will enable much quicker hydrologic analysis on multiple watersheds, providing the platform for further research into the spatial variability of runoff.
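
    As a hedged sketch of the kind of geomorphometric attributes such a module derives (the DEM below is synthetic; a real run would load an SRTM tile), basic slope and area statistics can be computed in a few lines:

        import numpy as np

        rng = np.random.default_rng(0)
        dem = 500.0 + np.cumsum(rng.normal(0.0, 1.0, (100, 100)), axis=0)  # stand-in DEM
        cell = 90.0  # SRTM cell size in metres

        dz_dy, dz_dx = np.gradient(dem, cell)                  # elevation derivatives
        slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

        print("mean elevation (m): ", round(float(dem.mean()), 1))
        print("mean slope (deg):   ", round(float(slope_deg.mean()), 2))
        print("catchment area (km2):", dem.size * cell**2 / 1e6)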

  6. Project SYNERGY: Software Support for Underprepared Students. Software Implementation Report.

    ERIC Educational Resources Information Center

    Anandam, Kamala; And Others

    Miami-Dade Community College's (MDCC's) implementation and assessment of computer software as a part of Project SYNERGY, a multi-institutional project funded by the International Business Machines (IBM) Corporation designed to seek technological solutions for helping students underprepared in reading, writing and mathematics, is described in this…

  7. 25 CFR 543.7 - What are the minimum internal control standards for bingo?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... software upgrades, data storage media replacement, etc.). The information recorded must be used when...., draw objects and back-up draw objects); and (ii) Random number generator software. (Additional information technology security standards can be found in § 543.16 of this part.) (2) The game software...

  8. 25 CFR 543.7 - What are the minimum internal control standards for bingo?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... software upgrades, data storage media replacement, etc.). The information recorded must be used when...., draw objects and back-up draw objects); and (ii) Random number generator software. (Additional information technology security standards can be found in § 543.16 of this part.) (2) The game software...

  9. 31 CFR 560.538 - Authorized transactions necessary and ordinarily incident to publishing.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... written publication in electronic format, the addition of embedded software necessary for reading, browsing, navigating, or searching the written publication; and (ii) Exporting embedded software necessary... that the software is designated as “EAR99” under the Export Administration Regulations, 15 CFR parts...

  10. 31 CFR 538.529 - Authorized transactions necessary and ordinarily incident to publishing.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... written publication in electronic format, the addition of embedded software necessary for reading, browsing, navigating, or searching the written publication; (ii) Exporting embedded software necessary for... software is classified as “EAR 99” under the Export Administration Regulations, 15 CFR parts 730-774 (the...

  11. 31 CFR 560.538 - Authorized transactions necessary and ordinarily incident to publishing.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... written publication in electronic format, the addition of embedded software necessary for reading, browsing, navigating, or searching the written publication; and (ii) Exporting embedded software necessary... that the software is designated as “EAR99” under the Export Administration Regulations, 15 CFR parts...

  12. 31 CFR 538.529 - Authorized transactions necessary and ordinarily incident to publishing.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... written publication in electronic format, the addition of embedded software necessary for reading, browsing, navigating, or searching the written publication; (ii) Exporting embedded software necessary for... software is classified as “EAR 99” under the Export Administration Regulations, 15 CFR parts 730-774 (the...

  13. 31 CFR 538.529 - Authorized transactions necessary and ordinarily incident to publishing.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... written publication in electronic format, the addition of embedded software necessary for reading, browsing, navigating, or searching the written publication; (ii) Exporting embedded software necessary for... software is classified as “EAR 99” under the Export Administration Regulations, 15 CFR parts 730-774 (the...

  14. 31 CFR 538.529 - Authorized transactions necessary and ordinarily incident to publishing.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... written publication in electronic format, the addition of embedded software necessary for reading, browsing, navigating, or searching the written publication; (ii) Exporting embedded software necessary for... software is classified as “EAR 99” under the Export Administration Regulations, 15 CFR parts 730-774 (the...

  15. 77 FR 31758 - Airworthiness Directives; the Boeing Company Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-30

    .... That NPRM proposed to inspect for part numbers of the operational program software of the flight... operational program software (OPS) of the flight control computers (FCC), and doing corrective actions if... previous NPRM (75 FR 57885, September 23, 2010), we have determined that the software installation required...

  16. 15 CFR Supplement No. 6 to Part 742 - Guidelines for Submitting Review Requests for Encryption Items

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... brochures or other documentation or specifications related to the technology, commodity or software... commodity or software, provide the following information: (1) Description of all the symmetric and... is provided by third-party hardware or software encryption components (if any). Identify the...

  17. A practical data processing workflow for multi-OMICS projects.

    PubMed

    Kohl, Michael; Megger, Dominik A; Trippler, Martin; Meckel, Hagen; Ahrens, Maike; Bracht, Thilo; Weber, Frank; Hoffmann, Andreas-Claudius; Baba, Hideo A; Sitek, Barbara; Schlaak, Jörg F; Meyer, Helmut E; Stephan, Christian; Eisenacher, Martin

    2014-01-01

    Multi-OMICS approaches aim at the integration of quantitative data obtained for different biological molecules in order to understand their interrelations and the functioning of larger systems. This paper deals with several data integration and data processing issues that frequently occur within this context. To this end, the data processing workflow within the PROFILE project is presented, a multi-OMICS project that aims at the identification of novel biomarkers and the development of new therapeutic targets for seven important liver diseases. Furthermore, a software tool called CrossPlatformCommander is sketched, which facilitates several steps of the proposed workflow in a semi-automatic manner. Application of the software is presented for the detection of novel biomarkers, their ranking and annotation with existing knowledge, using the example of corresponding Transcriptomics and Proteomics data sets obtained from patients suffering from hepatocellular carcinoma. Additionally, a linear regression analysis of Transcriptomics vs. Proteomics data is presented and its performance assessed. It was shown that for capturing profound relations between Transcriptomics and Proteomics data, a simple linear regression analysis is not sufficient, and implementation and evaluation of alternative statistical approaches are needed. Additionally, the integration of multivariate variable selection and classification approaches is intended for further development of the software. Although this paper focuses only on the combination of data obtained from quantitative Proteomics and Transcriptomics experiments, several approaches and data integration steps are also applicable to other OMICS technologies. Keeping specific restrictions in mind, the suggested workflow (or at least parts of it) may be used as a template for similar projects that make use of different high throughput techniques. This article is part of a Special Issue entitled: Computational Proteomics in the Post-Identification Era. Guest Editors: Martin Eisenacher and Christian Stephan. Copyright © 2013 Elsevier B.V. All rights reserved.
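
    As an illustration of the regression step discussed above (the per-gene values are invented stand-ins, not PROFILE data), a simple Transcriptomics-vs-Proteomics linear fit and its diagnostics take only a few lines, and a low r^2 is exactly the symptom the authors report:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(42)
        transcript = rng.normal(0.0, 1.0, 500)                  # log2 mRNA abundance
        protein = 0.4 * transcript + rng.normal(0.0, 1.0, 500)  # weakly coupled protein level

        fit = stats.linregress(transcript, protein)
        print(f"slope={fit.slope:.2f}  r^2={fit.rvalue**2:.2f}  p={fit.pvalue:.1e}")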

  18. Training the Next Generation in Space Situational Awareness Research

    NASA Astrophysics Data System (ADS)

    Colpo, D.; Reddy, V.; Arora, S.; Tucker, S.; Jeffries, L.; May, D.; Bronson, R.; Hunten, E.

    Traditional academic SSA research has relied on commercial off the shelf (COTS) systems for collecting metric and lightcurve data. COTS systems have several advantages over a custom-built system, including cost, easy integration, technical support and short deployment timescales. We at the University of Arizona took an alternative approach to develop a sensor system for space object characterization. Five engineering students designed and built two 0.6-meter F/4 electro-optical (EO) systems for collecting lightcurve and spectral data. All the design and fabrication work was carried out over the course of two semesters as part of their senior design project, which is mandatory for the completion of their bachelors in engineering degree. The students designed over 200 individual parts using three-dimensional modeling software (SolidWorks), and conducted detailed optical design analysis using raytracing software (ZEMAX), with oversight and advice from the faculty sponsor and Starizona, a local small business in Tucson. The components of the design were verified by test, analysis, inspection, or demonstration, per the process that the University of Arizona requires for each of its design projects. Methods used to complete this project include mechanical FEA, optical testing methods (Foucault Knife Edge Test and Couder Mask Test), tests to verify the function of the thermometers, and a final pointing model test. A surprise outcome of our exercise is that the entire cost of the design and fabrication of these two EO systems was significantly lower than that of a COTS alternative. With careful planning and coordination we were also able to reduce the deployment times to those of a commercial system. Our experience shows that development of hardware and software for SSA research can be accomplished in an academic environment, enabling the training of the next generation with active support from local small businesses.

  19. Development and improvement of the operating diagnostics systems of NPO CKTI works for turbine of thermal and nuclear power plants

    NASA Astrophysics Data System (ADS)

    Kovalev, I. A.; Rakovskii, V. G.; Isakov, N. Yu.; Sandovskii, A. V.

    2016-03-01

    Results of work on the development and improvement of the techniques, algorithms, and hardware-software of continuous operating diagnostics systems for the state of rotating units and parts of turbine equipment are presented. In particular, to ensure full remote service of the monitored turbine equipment using web technologies, a web version of the software of the automated system of vibration-based diagnostics (ASVD VIDAS) was developed. The experience gained in the automated analysis of data obtained by ASVD VIDAS forms the basis of a new algorithm for the early detection of such dangerous defects as rotor deflection, a crack in the rotor, and strong misalignment of supports. The program-technical complex (PTC) for monitoring and measuring the deflection of the medium-pressure rotor, which realizes this algorithm, will alert the power plant staff during a deflection and indicate its value. This will give the opportunity to take timely measures to prevent further extension of the defect. Repeatedly recorded cases of full or partial destruction of the shrouded shelves of rotor blades in the last stages of low-pressure cylinders of steam turbines defined the need to develop a version of the automated system of blade diagnostics (ASBD SKALA) for shrouded stages. The processing, analysis, presentation, and backup of the data characterizing the mechanical state of the blading are carried out by a newly developed controller of the diagnostics system. As a result of the implementation of these works, the set of diagnosed parameters determining the operational security of the rotating elements of the equipment was expanded and new tasks of monitoring the state of units and parts of turbines were solved. All the algorithmic solutions and hardware-software implementations mentioned in the article were tested on test benches and applied at several power plants.

  20. Software Development in the Water Sciences: a view from the divide (Invited)

    NASA Astrophysics Data System (ADS)

    Miles, B.; Band, L. E.

    2013-12-01

    While statistical methods are an important part of many earth scientists' training, these scientists often learn the bulk of their software development skills in an ad hoc, just-in-time manner. Yet to carry out contemporary research, scientists are spending more and more time developing software. Here I present perspectives - as an earth sciences graduate student with professional software engineering experience - on the challenges scientists face in adopting software engineering practices, with an emphasis on the areas of the science software development lifecycle that could benefit most from improved engineering. This work builds on experience gained as part of the NSF-funded Water Science Software Institute (WSSI) conceptualization award (NSF Award # 1216817). Throughout 2013, the WSSI team held a series of software scoping and development sprints with the goals of: (1) adding features to better model green infrastructure within the Regional Hydro-Ecological Simulation System (RHESSys); and (2) infusing test-driven agile software development practices into the processes employed by the RHESSys team. The goal of efforts such as the WSSI is to ensure that investments by current and future scientists in software engineering training will enable transformative science by improving both scientific reproducibility and researcher productivity. Experience with the WSSI indicates: (1) the potential for achieving this goal; and (2) that while scientists are willing to adopt some software engineering practices, transformative science will require continued collaboration between domain scientists and cyberinfrastructure experts for the foreseeable future.

  1. Guidance and Control Software Project Data - Volume 2: Development Documents

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the development documents from the GCS project. Volume 2 contains three appendices: A. Guidance and Control Software Development Specification; B. Design Description for the Pluto Implementation of the Guidance and Control Software; and C. Source Code for the Pluto Implementation of the Guidance and Control Software.

  2. Reliability of Fault Tolerant Control Systems. Part 1

    NASA Technical Reports Server (NTRS)

    Wu, N. Eva

    2001-01-01

    This paper reports Part I of a two-part effort intended to delineate the relationship between reliability and fault tolerant control in a quantitative manner. Reliability analysis of fault-tolerant control systems is performed using Markov models. Reliability properties peculiar to fault-tolerant control systems are emphasized. As a consequence, coverage of failures through redundancy management can be severely limited. It is shown that in the early life of a system composed of highly reliable subsystems, the reliability of the overall system is affine with respect to coverage, and inadequate coverage induces dominant single point failures. The utility of some existing software tools for assessing the reliability of fault tolerant control systems is also discussed. Coverage modeling is attempted in Part II in a way that captures its dependence on the control performance and on the diagnostic resolution.
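
    A compact way to see this affine dependence, under textbook duplex-system Markov assumptions that are illustrative rather than taken from the paper (per-unit failure rate \lambda, coverage c):

        R(t) = e^{-2\lambda t} + 2c\left(e^{-\lambda t} - e^{-2\lambda t}\right)
             \approx 1 - 2(1-c)\lambda t + (2 - 3c)(\lambda t)^2, \qquad \lambda t \ll 1.

    The first-order term 2(1-c)\lambda t is the uncovered single-point-failure contribution, which dominates early life unless the coverage c is very close to one.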

  3. Automating Embedded Analysis Capabilities and Managing Software Complexity in Multiphysics Simulation, Part I: Template-Based Generic Programming

    DOE PAGES

    Pawlowski, Roger P.; Phipps, Eric T.; Salinger, Andrew G.

    2012-01-01

    An approach for incorporating embedded simulation and analysis capabilities in complex simulation codes through template-based generic programming is presented. This approach relies on templating and operator overloading within the C++ language to transform a given calculation into one that can compute a variety of additional quantities that are necessary for many state-of-the-art simulation and analysis algorithms. An approach for incorporating these ideas into complex simulation codes through general graph-based assembly is also presented. These ideas have been implemented within a set of packages in the Trilinos framework and are demonstrated on a simple problem from chemical engineering.
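
    The Trilinos implementation relies on C++ templates; purely as a sketch of the underlying idea in Python (the Dual class and example function are hypothetical, not part of the paper), operator overloading alone can transform a given calculation so that it also computes derivative quantities:

        class Dual:
            # forward-mode AD value: carries f and df/dx through a calculation
            def __init__(self, val, der=0.0):
                self.val, self.der = val, der
            def __add__(self, other):
                other = other if isinstance(other, Dual) else Dual(other)
                return Dual(self.val + other.val, self.der + other.der)
            __radd__ = __add__
            def __mul__(self, other):
                other = other if isinstance(other, Dual) else Dual(other)
                return Dual(self.val * other.val,
                            self.der * other.val + self.val * other.der)
            __rmul__ = __mul__

        def residual(x):
            # any generically written calculation now yields d/dx for free
            return 3 * x * x + 2 * x + 1

        r = residual(Dual(2.0, 1.0))   # seed dx/dx = 1
        print(r.val, r.der)            # 17.0 and 14.0 (= 6x + 2 at x = 2)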

  4. Space Trajectory Error Analysis Program (STEAP) for halo orbit missions. Volume 1: Analytic and user's manual

    NASA Technical Reports Server (NTRS)

    Byrnes, D. V.; Carney, P. C.; Underwood, J. W.; Vogt, E. D.

    1974-01-01

    Development, test, conversion, and documentation of computer software for the mission analysis of missions to halo orbits about libration points in the earth-sun system is reported. The software consisting of two programs called NOMNAL and ERRAN is part of the Space Trajectories Error Analysis Programs (STEAP). The program NOMNAL targets a transfer trajectory from Earth on a given launch date to a specified halo orbit on a required arrival date. Either impulsive or finite thrust insertion maneuvers into halo orbit are permitted by the program. The transfer trajectory is consistent with a realistic launch profile input by the user. The second program ERRAN conducts error analyses of the targeted transfer trajectory. Measurements including range, doppler, star-planet angles, and apparent planet diameter are processed in a Kalman-Schmidt filter to determine the trajectory knowledge uncertainty. Execution errors at injection, midcourse correction and orbit insertion maneuvers are analyzed along with the navigation uncertainty to determine trajectory control uncertainties and fuel-sizing requirements. The program is also capable of generalized covariance analyses.
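
    As a minimal numpy sketch of the measurement-update step at the heart of such a filter (the two-state, range-only example is invented, and the Schmidt 'consider' treatment of unestimated parameters is omitted):

        import numpy as np

        def kalman_update(x, P, z, H, R):
            y = z - H @ x                          # measurement residual
            S = H @ P @ H.T + R                    # residual covariance
            K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
            return x + K @ y, (np.eye(len(x)) - K @ H) @ P

        x = np.array([1.0e6, 3.0e3])               # range (m), range-rate (m/s)
        P = np.diag([1.0e8, 1.0e2])                # knowledge uncertainty
        H = np.array([[1.0, 0.0]])                 # observe range only
        R = np.array([[1.0e4]])                    # measurement noise
        x, P = kalman_update(x, P, np.array([1.0005e6]), H, R)
        print(np.diag(P))                          # uncertainty shrinks after the update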

  5. Weaves as an Interconnection Fabric for ASIM's and Nanosatellites

    NASA Technical Reports Server (NTRS)

    Gorlick, Michael M.

    1995-01-01

    Many of the micromachines under consideration require computer support, indeed, one of the appeals of this technology is the ability to intermix mechanical, optical, analog, and digital devices on the same substrate. The amount of computer power is rarely an issue, the sticking point is the complexity of the software required to make effective use of these devices. Micromachines are the nano-technologist's equivalent of 'golden screws'. In other words, they will be piece parts in larger assemblages. For example, a nano-satellite may be composed of stacked silicon wafers where each wafer contains hundreds to thousands of micromachines, digital controllers, general purpose computers, memories, and high-speed bus interconnects. Comparatively few of these devices will be custom designed, most will be stock parts selected from libraries and catalogs. The novelty will lie in the interconnections. For example, a digital accelerometer may be a component part in an adaptive suspension, a monitoring element embedded in the wrapper of a package, or a portion of the smart skin of a launch vehicle. In each case, this device must inter-operate with other devices and probes for the purposes of command, control, and communication. We propose a software technology called 'weaves' that will permit large collections of micromachines and their attendant computers to freely intercommunicate while preserving modularity, transparency, and flexibility. Weaves are composed of networks of communicating software components. The network, and the components comprising it, may be changed even while the software, and the devices it controls, are executing. This unusual degree of software plasticity permits micromachines to dynamically adapt the software to changing conditions and allows system engineers to rapidly and inexpensively develop special purpose software by assembling stock software components in custom configurations.

  6. Antenna analysis using properties of metamaterials

    NASA Astrophysics Data System (ADS)

    Mitra, Atindra K.; Hu, Colin; Maxwell, Kasandra

    2010-04-01

    As part of the Student Internship Programs at Wright-Patterson Air Force Base, including the AFRL Wright Scholar Program for High School Students and the AFRL STEP Program, sample results from preliminary investigation and analysis of integrated antenna structures are reported. Investigation of these novel integrated antenna geometries can be interpreted as a continuation of systems analysis under the general topic area of potential integrated apertures for future software radar/radio solutions [1] [2]. Specifically, the categories of novel integrated aperture geometries investigated in this paper include slotted-fractal structures on microstrip rectangular patch antenna models in tandem with the analysis of exotic substrate materials comprised of a type of synthesized electromagnetic structure known as metamaterials [8] - [10].

  7. Real-time software failure characterization

    NASA Technical Reports Server (NTRS)

    Dunham, Janet R.; Finelli, George B.

    1990-01-01

    A series of studies aimed at characterizing the fundamentals of the software failure process has been undertaken as part of a NASA project on the modeling of a real-time aerospace vehicle software reliability. An overview of these studies is provided, and the current study, an investigation of the reliability of aerospace vehicle guidance and control software, is examined. The study approach provides for the collection of life-cycle process data, and for the retention and evaluation of interim software life-cycle products.

  8. The SIFT hardware/software systems. Volume 2: Software listings

    NASA Technical Reports Server (NTRS)

    Palumbo, Daniel L.

    1985-01-01

    This document contains software listings of the SIFT operating system and application software. The software is coded for the most part in a variant of the Pascal language, Pascal*. Pascal* is a cross-compiler running on the VAX and Eclipse computers. The output of Pascal* is BDX-390 assembler code. When necessary, modules are written directly in BDX-390 assembler code. The listings in this document supplement the description of the SIFT system found in Volume 1 of this report, A Detailed Description.

  9. Histomorphometric analysis of collagen architecture of auricular keloids in an Asian population.

    PubMed

    Chong, Yosep; Park, Tae Hwan; Seo, Sang won; Chang, Choong Hyun

    2015-03-01

    Keloids are a pathologic condition of the reparative process, presenting as excessive scar formation that involves various cells and cytokines. Many studies focusing on the histologic features of keloids, however, have shown discordant results without consideration of the architectural aspects of the collagen structure. The purpose of this study was to provide a schematic illustration of the collagen architecture of keloids, specifically auricular keloids, and to analyze each part on a histomorphologic and morphometric basis. Thirty-nine surgically excised auricular keloids were retrieved from the files of Kangbuk Samsung Hospital. After exhaustive histomorphologic analysis, 3 distinctive structural parts (keloidal collagen, organizing collagen, and proliferating core collagen) were identified and mapped in every case. Fibroblast cellularity, blood vessel density, degree of inflammatory cell infiltration, and mast cell counts were analyzed in each part of each case using Masson trichrome stain, Van Gieson stain, toluidine blue stain, and immunohistochemical stains for CD31 and smooth muscle actin. Morphometric analysis of these parameters using the ImageJ software was performed on 3 representative images of each part. The 3 parts were histomorphologically distinct in the shape and array of collagen bundles, fibroblast cellularity, blood vessel density, degree of inflammatory cell infiltration, and mast cell infiltration. Morphometric analysis revealed statistically significant differences between the parts in fibroblast cellularity, blood vessel density, degree of inflammatory cell infiltration, and mast cell count. All parameters were exceedingly high in the whorling hypercellular fibrous nodules of the proliferating core collagen, showing simultaneous changes in the other parts. Morphologically and morphometrically, 3 distinctive parts were identified in auricular keloids. Mast cell infiltration, blood vessel density, and fibroblast cellularity increase or decrease simultaneously across these parts. The proliferating core collagen might serve as the proliferating center of keloids and might be a key portion for tumor growth and recurrence.

  10. Investigating the Accuracy of Point Clouds Generated for Rock Surfaces

    NASA Astrophysics Data System (ADS)

    Seker, D. Z.; Incekara, A. H.

    2016-12-01

    Point clouds produced by means of different techniques are widely used to model rocks and to obtain properties of rock surfaces such as roughness, volume and area. These point clouds can be generated by laser scanning and close range photogrammetry. Laser scanning is the most common method: the laser scanner produces a 3D point cloud at regular intervals. In close range photogrammetry, a point cloud can be produced from photographs taken under appropriate conditions, depending on the available hardware and software technology. Much photogrammetric software, open source or not, currently supports the generation of point clouds. The two methods are close to each other in terms of accuracy: sufficient accuracy in the mm to cm range can be obtained with a qualified digital camera or laser scanner. With both methods, field work is completed in less time than with conventional techniques. In close range photogrammetry, any part of a rock surface can be completely represented owing to overlapping oblique photographs. In contrast to the proximity of the data, the two methods are quite different in terms of cost. In this study, it is investigated whether point clouds produced from photographs can be used instead of point clouds produced by a laser scanner. For this purpose, rock surfaces with a complex and irregular shape located on the İstanbul Technical University Ayazaga Campus were selected as the study object. The selected object is a mixture of different rock types and consists of both partly weathered and fresh parts. The study was performed on a part of a 30 m x 10 m rock surface. 2D and 3D analyses were performed for several regions selected from the point clouds of the surface models; the 2D analysis is area-based and the 3D analysis is volume-based. The analyses showed that the point clouds from the two methods are similar and can be used as alternatives to each other. This proves that point clouds produced from photographs, which are both economical and quicker to produce, can be used in several studies instead of point clouds produced by a laser scanner.

  11. Reliability of simulated robustness testing in fast liquid chromatography, using state-of-the-art column technology, instrumentation and modelling software.

    PubMed

    Kormány, Róbert; Fekete, Jenő; Guillarme, Davy; Fekete, Szabolcs

    2014-02-01

    The goal of this study was to evaluate the accuracy of simulated robustness testing using commercial modelling software (DryLab) and state-of-the-art stationary phases. For this purpose, a mixture of amlodipine and its seven related impurities was analyzed on short narrow-bore columns (50×2.1 mm, packed with sub-2 μm particles) providing short analysis times. The performance of the commercial modelling software for robustness testing was systematically compared to experimental measurements and DoE-based predictions. We have demonstrated that the reliability of the predictions was good, since the predicted retention times and resolutions were in good agreement with the experimental ones at the edges of the design space. On average, the retention time relative errors were <1.0%, while the predicted critical resolution errors ranged between 6.9 and 17.2%. Because simulated robustness testing requires significantly less experimental work than DoE-based predictions, we think that robustness could now be investigated at an early stage of method development. Moreover, column interchangeability, which is also an important part of robustness testing, was investigated considering five different C8 and C18 columns packed with sub-2 μm particles. Again, thanks to the modelling software, we proved that the separation was feasible on all columns within the same analysis time (less than 4 min), by proper adjustment of the variables. Copyright © 2013 Elsevier B.V. All rights reserved.

  12. Voltammetric analysis of ordnance materials. Part 2: A portable digital voltammeter for use with a silver wire working electrode

    NASA Astrophysics Data System (ADS)

    Fine, D. A.; Reeve, D. A.; Dickus, R. A.

    1984-12-01

    An inexpensive, portable, digital voltammeter has been designed and built at NWC. The instrument is intended for use with a silver wire working electrode. The voltammeter was built in response to a need on the part of Navy facilities for the monitoring of effluent water from the carbon column cleanup process used to remove propyleneglycoldinitrate from Otto fuel waste water. The instrument may also be used for the monitoring of contaminants such as nitroglycerin, dinitrotoluene, trinitrotoluene and nitroguanidine. This report describes in detail the construction, circuitry, software and operational features of the instrument.

  13. Design and Checking Analysis of Injection Mold for a Plastic Cup

    NASA Astrophysics Data System (ADS)

    Li, Xuebing

    2018-03-01

    A special injection mold was designed for the structural characteristics of a plastic cup part. The mold was simulated with the Moldflow software and verified by calculating the stripping force, the pulling force and the clamping force of the mold, so as to determine the appropriate injection parameters. Actual production has proved the injection mold effective and practical, and it meets the quality requirements in use. This solves several injection molding problems for parts of this kind and can provide a reference for the production of other products in the same industry.

  14. [HPLC fingerprint chromatogram analysis of some Taraxacum in Henan province].

    PubMed

    Li, Xi-Feng; Shi, Hui-Min; Xu, Min; Meng, Lu

    2008-10-01

    To analyze the HPLC fingerprint chromatograms of some Taraxacum species in Henan. Samples of different species, producing areas, harvest seasons and medicinal parts were determined by RP-HPLC, and the chromatograms were evaluated with similarity evaluation software. The components of the different Taraxacum species were the same, so the species could be substituted for one another. The contents of caffeic acid and chlorogenic acid differed greatly between producing areas, and were higher in fertile soil. The flowering and fruiting period in spring was the best gathering period, and the components differed between medicinal parts. The quality of medicinal material within Taraxacum can be better controlled by this method.

  15. Big Software for SmallSats: Adapting CFS to CubeSat Missions

    NASA Technical Reports Server (NTRS)

    Cudmore, Alan P.; Crum, Gary; Sheikh, Salman; Marshall, James

    2015-01-01

    Expanding capabilities and mission objectives for SmallSats and CubeSats is driving the need for reliable, reusable, and robust flight software. While missions are becoming more complicated and the scientific goals more ambitious, the level of acceptable risk has decreased. Design challenges are further compounded by budget and schedule constraints that have not kept pace. NASA's Core Flight Software System (cFS) is an open source solution which enables teams to build flagship satellite level flight software within a CubeSat schedule and budget. NASA originally developed cFS to reduce mission and schedule risk for flagship satellite missions by increasing code reuse and reliability. The Lunar Reconnaissance Orbiter, which launched in 2009, was the first of a growing list of Class B rated missions to use cFS. Large parts of cFS are now open source, which has spurred adoption outside of NASA. This paper reports on the experiences of two teams using cFS for current CubeSat missions. The performance overheads of cFS are quantified, and the reusability of code between missions is discussed. The analysis shows that cFS is well suited to use on CubeSats and demonstrates the portability and modularity of cFS code.

  16. Inertial Upper Stage (IUS) software analysis

    NASA Technical Reports Server (NTRS)

    Grayson, W. L.; Nickel, C. E.; Rose, P. L.; Singh, R. P.

    1979-01-01

    The Inertial Upper Stage (IUS) System, an extension of the Space Transportation System (STS) operating regime to include higher orbits, orbital plane changes, geosynchronous orbits, and interplanetary trajectories, is presented. The IUS software design, the IUS software interfaces with other systems, and the cost effectiveness of software verification are described. Tasks of the IUS discussed include: (1) design analysis; (2) validation requirements analysis; (3) interface analysis; and (4) requirements analysis.

  17. [Design and implementation of online statistical analysis function in information system of air pollution and health impact monitoring].

    PubMed

    Lü, Yiran; Hao, Shuxin; Zhang, Guoqing; Liu, Jie; Liu, Yue; Xu, Dongqun

    2018-01-01

    To implement an online statistical analysis function in the information system for air pollution and health impact monitoring, and to obtain data analysis results in real time. Descriptive statistics, time-series analysis and multivariate regression analysis were implemented online on top of the database software using the SQL language and visual tools. The system generates basic statistical tables and summary tables of the air pollution exposure and health impact data online; generates trend charts of each data part online, with interactive connections to the database; and generates export sheets that can be loaded directly into R, SAS and SPSS. The information system for air pollution and health impact monitoring implements the statistical analysis function online and can provide real-time analysis results to its users.
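
    A toy sqlite3 sketch of the pattern described (SQL aggregates computed online, directly against the monitoring database; the table and values are invented):

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE pm25 (city TEXT, day TEXT, ugm3 REAL)")
        conn.executemany("INSERT INTO pm25 VALUES (?, ?, ?)",
                         [("A", "2018-01-01", 35.0), ("A", "2018-01-02", 80.5),
                          ("B", "2018-01-01", 12.0), ("B", "2018-01-02", 22.5)])

        # descriptive statistics per city, generated straight from the database
        for row in conn.execute(
                "SELECT city, COUNT(*), AVG(ugm3), MIN(ugm3), MAX(ugm3) "
                "FROM pm25 GROUP BY city"):
            print(row)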

  18. An Analysis of SE and MBSE Concepts to Support Defence Capability Acquisition

    DTIC Science & Technology

    2014-09-01


  19. The Healthcare Administrator's Associate: an experiment in distributed healthcare information systems.

    PubMed Central

    Fowler, J.; Martin, G.

    1997-01-01

    The Healthcare Administrator's Associate is a collection of portable tools designed to support analysis of data retrieved via the Internet from diverse distributed healthcare information systems by means of the InfoSleuth system of distributed software agents. Development of these tools is part of an effort to enhance access to diverse and geographically distributed healthcare data in order to improve the basis upon which administrative and clinical decisions are made. PMID:9357686

  20. Software Safety Progress in NASA

    NASA Technical Reports Server (NTRS)

    Radley, Charles F.

    1995-01-01

    NASA has developed guidelines for the development and analysis of safety-critical software. These guidelines have been documented in a Guidebook for Safety Critical Software Development and Analysis. The guidelines represent a practical 'how to' approach that assists software developers and safety analysts with cost-effective methods for software safety. They provide guidance for implementing the recent NASA Software Safety Standard NSS-1740.13, which was released as an 'Interim' version in June 1994 and scheduled for formal adoption in late 1995. This paper is a survey of the methods in general use, which resulted in the NASA guidelines for safety-critical software development and analysis.

  1. GIS Methodic and New Database for Magmatic Rocks. Application for Atlantic Oceanic Magmatism.

    NASA Astrophysics Data System (ADS)

    Asavin, A. M.

    2001-12-01

    There are several geochemical databases available on the INTERNET now. One of the main peculiarities of the information stored in these databases is the geographical coordinates of each sample. As a rule, the software of these databases uses the spatial information only in the user-interface search procedures. On the other side, GIS software (Geographical Information System software), for example the ARC/INFO software used for creating and analyzing special geological, geochemical and geophysical e-maps, is deeply involved with the geographical coordinates of samples. We join the peculiarities of GIS systems and a relational geochemical database through special software. Our geochemical information system was created in the Vernadsky State Geological Museum and the Institute of Geochemistry and Analytical Chemistry, Moscow. We have now tested the system with geochemical data on oceanic rocks from the Atlantic and Pacific oceans, about 10000 chemical analyses. The GIS information content consists of e-map covers of the world globe. Parts of these maps are Atlantic ocean covers: a gravity map (with a 2'' grid), oceanic bottom heat flow, altimetric maps, seismic activity, a tectonic map and a geological map. Combining this information content makes it possible to create new geochemical maps and to combine spatial analysis with numerical geochemical modeling of volcanic processes in an ocean segment. We have tested the information system on thick-client technology. The interface between the GIS system Arc/View and the database resides in a special sequence of multiple SQL queries. The result of the above queries is a simple DBF file with geographical coordinates. This file acts at the instant of creation of geochemical and other special e-maps of the oceanic region. We used a more complex method for the geophysical data: from ARC\View we created a grid cover for the polygon spatial geophysical information.

  2. Stress Induced in Periodontal Ligament under Orthodontic Loading (Part II): A Comparison of Linear Versus Non-Linear Fem Study.

    PubMed

    Hemanth, M; Deoli, Shilpi; Raghuveer, H P; Rani, M S; Hegde, Chatura; Vedavathi, B

    2015-09-01

    Simulation of the periodontal ligament (PDL) using non-linear finite element method (FEM) analysis gives better insight into the biology of tooth movement. The stresses in the PDL were evaluated for intrusion and lingual root torque using non-linear properties. A three-dimensional (3D) FEM model of the maxillary incisors was generated using the Solidworks modeling software, and the stresses in the PDL were evaluated for intrusive and lingual root torque movements by 3D FEM using the ANSYS software. These stresses were compared between linear and non-linear analyses. For intrusive and lingual root torque movements with linear properties, the distribution of stress over the PDL was within the range of optimal stress values proposed by Lee, but exceeded the force system given by Proffit as the optimum for orthodontic tooth movement. When the same force load was applied in the non-linear analysis, the stresses were higher than in the linear analysis and were beyond the optimal stress range proposed by Lee for both intrusive and lingual root torque movements. To obtain the same stress as in the linear analysis, iterations were performed using non-linear properties and the force level was reduced. This shows that the force level required for non-linear analysis is lower than that for linear analysis.

  3. Undergraduate Research Opportunities in OSS

    NASA Astrophysics Data System (ADS)

    Boldyreff, Cornelia; Capiluppi, Andrea; Knowles, Thomas; Munro, James

    Using Open Source Software (OSS) in undergraduate teaching in universities is now commonplace. Students use OSS applications and systems in their courses on programming, operating systems, DBMS, and web development, to name but a few. Studying OSS projects from both a product and a process view also forms part of the software engineering curriculum at various universities. Many students have also taken part in OSS projects as developers.

  4. An Advanced Programming Technique for a Cost-Effective Hardware-Independent Realization of Naval Software Systems. Final Technical Report, Part II.

    ERIC Educational Resources Information Center

    Computer Symbolic, Inc., Washington, DC.

    A pseudo assembly language, PAL, was developed and specified for use as the lowest level in a general, multilevel programing system for the realization of cost-effective, hardware-independent Naval software. The language was developed as part of the system called FIRMS (Fast Iterative Recursive Macro System) and is sufficiently general to allow…

  5. Software reliability models for critical applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pham, H.; Pham, M.

    This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying the existing software reliability models and proposes a state-of-the-art software reliability model that is relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault tolerant software reliability models and their related issues, (2) proposed technique for software reliability enhancement, and (3) general discussion and future research. The development of this proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.

  6. Software reliability models for critical applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pham, H.; Pham, M.

    This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying the existing software reliability models and proposes a state-of-the-art software reliability model that is relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault tolerant software reliability models and their related issues, (2) proposed technique for software reliability enhancement, and (3) general discussion and future research. The development of this proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.

  7. Software Design Improvements. Part 2; Software Quality and the Design and Inspection Process

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom

    1997-01-01

    The application of assurance engineering techniques improves the duration of failure-free performance of software. The totality of features and characteristics of a software product determines its ability to satisfy customer needs. Software in safety-critical systems is very important to NASA. We follow the System Safety Working Group's definition of system safety software: 'The optimization of system safety in the design, development, use and maintenance of software and its integration with safety-critical systems in an operational environment.' 'If it is not safe, say so' has become our motto. This paper covers methods that have been used by NASA to make software design improvements by focusing on software quality and the design and inspection process.

  8. A tutorial for software development in quantitative proteomics using PSI standard formats☆

    PubMed Central

    Gonzalez-Galarza, Faviel F.; Qi, Da; Fan, Jun; Bessant, Conrad; Jones, Andrew R.

    2014-01-01

    The Human Proteome Organisation — Proteomics Standards Initiative (HUPO-PSI) has been working for ten years on the development of standardised formats that facilitate data sharing and public database deposition. In this article, we review three HUPO-PSI data standards — mzML, mzIdentML and mzQuantML, which can be used to design a complete quantitative analysis pipeline in mass spectrometry (MS)-based proteomics. In this tutorial, we briefly describe the content of each data model, sufficient for bioinformaticians to devise proteomics software. We also provide guidance on the use of recently released application programming interfaces (APIs) developed in Java for each of these standards, which makes it straightforward to read and write files of any size. We have produced a set of example Java classes and a basic graphical user interface to demonstrate how to use the most important parts of the PSI standards, available from http://code.google.com/p/psi-standard-formats-tutorial. This article is part of a Special Issue entitled: Computational Proteomics in the Post-Identification Era. Guest Editors: Martin Eisenacher and Christian Stephan. PMID:23584085
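
    The APIs the tutorial documents are Java; purely to illustrate how approachable the XML-based formats are, a few lines of Python with only the standard library can stream the spectra of an mzML file (the file name here is hypothetical):

        import xml.etree.ElementTree as ET

        NS = "{http://psi.hupo.org/ms/mzml}"   # mzML XML namespace

        # iterparse streams the file, so arbitrarily large runs stay memory-friendly
        for _, elem in ET.iterparse("example.mzML"):
            if elem.tag == NS + "spectrum":
                print(elem.get("index"), elem.get("id"),
                      "peaks:", elem.get("defaultArrayLength"))
                elem.clear()               # free the subtree once processed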

  9. Analysis, Simulation and Prediction of Cosmetic Defects on Automotive External Panel

    NASA Astrophysics Data System (ADS)

    Le Port, A.; Thuillier, S.; Borot, C.; Charbonneaux, J.

    2011-08-01

    The first feeling of quality for a vehicle is linked to its perfect appearance. This has a major impact on the reputation of a car manufacturer. Cosmetic defects are thus more and more taken into account in the process design. Qualifying a part as good or bad from the cosmetic point of view is mainly subjective: the part aspect is considered acceptable if no defect is visible on the vehicle by the final customer. Cosmetic defects that appear during sheet metal forming are checked by visual inspection in light inspection rooms, stoning, or with optical or mechanical sensors or feelers. A lack of cosmetic defect prediction before part production leads to the need for corrective actions, production delays and generates additional costs. This paper first explores the objective description of what cosmetic defects are on a stamped part and where they come from. It then investigates the capability of software to predict these defects, and suggests the use of a cosmetic defects analysis tool developed within PAM-STAMP 2G for its qualitative and quantitative prediction.

  10. Ideas for the rapid development of the structural models in mechanical engineering

    NASA Astrophysics Data System (ADS)

    Oanta, E.; Raicu, A.; Panait, C.

    2017-08-01

    Conceiving computer-based instruments is a long-run concern of the authors. Some of their original solutions are: optimal processing of large matrices, interfaces between programming languages, approximation theory using spline functions, and increased accuracy in numerical programming based on extended arbitrary-precision libraries. For the rapid development of models we identified the following directions: atomization, 'librarization', parameterization, automatization and integration. Each of these directions has particular aspects depending on whether we approach mechanical design problems or software development. Atomization means a thorough top-down decomposition analysis which offers insight into the basic features of the phenomenon. The creation of libraries of reusable mechanical parts and libraries of programs (data types, functions) saves time, cost and effort when a new model must be conceived. Parameterization leads to flexible definition of mechanical parts, the values of the parameters being changed either by a dimensioning program or in accordance with other parts belonging to the same assembly; the resulting templates may also be included in libraries, as in the sketch below. Original software applications are useful for generating the model's input data, for feeding the data into commercial CAD/FEA applications, and for the data integration of the various types of studies included in the same project.
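
    A tiny Python sketch of the parameterization idea, where a reusable library part is defined by its parameters and the derived values follow automatically (the part, names and formula are invented for illustration):

        from dataclasses import dataclass

        @dataclass
        class RectPlate:
            # parameterized library part: change one parameter, the rest follows
            length_mm: float
            width_mm: float
            thickness_mm: float
            density_kg_m3: float = 7850.0  # steel by default

            @property
            def mass_kg(self) -> float:
                vol_m3 = self.length_mm * self.width_mm * self.thickness_mm * 1e-9
                return vol_m3 * self.density_kg_m3

        plate = RectPlate(500, 300, 10)
        print(f"{plate.mass_kg:.2f} kg")   # 11.78 kg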

  11. Metabolic Flux Analysis in Isotope Labeling Experiments Using the Adjoint Approach.

    PubMed

    Mottelet, Stephane; Gaullier, Gil; Sadaka, Georges

    2017-01-01

    Comprehension of metabolic pathways is considerably enhanced by metabolic flux analysis in isotope labeling experiments (MFA-ILE). The balance equations are given by hundreds of algebraic (stationary MFA) or ordinary differential equations (nonstationary MFA), and reducing the number of operations is therefore a crucial part of reducing the computation cost. The main bottleneck for deterministic algorithms is the computation of derivatives, particularly for nonstationary MFA. In this article, we explain how the overall identification process may be sped up by using the adjoint approach to compute the gradient of the residual sum of squares. The proposed approach shows significant improvements in terms of complexity and computation time when compared with the usual (direct) approach. Numerical results are obtained for the central metabolic pathways of Escherichia coli and are validated against reference software in the stationary case. The methods and algorithms described in this paper are included in the sysmetab software package distributed under an Open Source license at http://forge.scilab.org/index.php/p/sysmetab/.
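
    The complexity argument can be made concrete on a toy linear model. The sketch below, with random stand-in matrices (not the E. coli network), shows the adjoint gradient of the residual sum of squares agreeing with the direct gradient while needing a single transposed solve instead of one solve per flux parameter:

    ```python
    # Toy stationary balance A x = B v with measurements y ~ C x,
    # residual r = C x - y, RSS = r.r. The direct approach needs one
    # linear solve per flux parameter; the adjoint needs one transposed
    # solve regardless of the number of parameters.
    import numpy as np

    rng = np.random.default_rng(0)
    n, p, m = 50, 20, 10                      # states, fluxes, measurements
    A = rng.normal(size=(n, n)) + n * np.eye(n)
    B = rng.normal(size=(n, p))
    C = rng.normal(size=(m, n))
    v = rng.normal(size=p)
    y = rng.normal(size=m)

    x = np.linalg.solve(A, B @ v)
    r = C @ x - y

    # Direct approach: p solves (the columns of dx/dv).
    grad_direct = 2 * (C @ np.linalg.solve(A, B)).T @ r

    # Adjoint approach: a single solve with A^T.
    lam = np.linalg.solve(A.T, C.T @ r)
    grad_adjoint = 2 * B.T @ lam

    assert np.allclose(grad_direct, grad_adjoint)
    ```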

  12. Autonomous smart sensor network for full-scale structural health monitoring

    NASA Astrophysics Data System (ADS)

    Rice, Jennifer A.; Mechitov, Kirill A.; Spencer, B. F., Jr.; Agha, Gul A.

    2010-04-01

    The demands of aging infrastructure require effective methods for structural monitoring and maintenance. Wireless smart sensor networks offer the ability to enhance structural health monitoring (SHM) practices through the utilization of onboard computation to achieve distributed data management. Such an approach is scalable to the large number of sensor nodes required for high-fidelity modal analysis and damage detection. While smart sensor technology is not new, the number of full-scale SHM applications has been limited. This slow progress is due, in part, to the complex network management issues that arise when moving from a laboratory setting to a full-scale monitoring implementation. This paper presents flexible network management software that enables continuous and autonomous operation of wireless smart sensor networks for full-scale SHM applications. The software components combine sleep/wake cycling for enhanced power management with threshold detection for triggering network-wide tasks, such as synchronized sensing or decentralized modal analysis, during periods of critical structural response.
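
    A minimal sketch of the threshold-detection component described above, with an invented threshold, sampling stand-in, and broadcast stand-in (not the deployed software):

    ```python
    # Sketch of threshold detection inside a sleep/wake duty cycle: a node
    # wakes, takes a short low-power measurement, and only triggers a
    # network-wide synchronized-sensing task when the response exceeds a
    # threshold. Numbers are illustrative only.
    import random, time

    THRESHOLD_G = 0.03          # acceleration threshold, in g (assumed)

    def quick_sample():
        """Stand-in for a brief onboard accelerometer reading."""
        return abs(random.gauss(0.0, 0.02))

    def trigger_network_sensing():
        print("threshold exceeded -> broadcast synchronized-sensing command")

    for _ in range(10):          # sleep/wake duty cycle
        if quick_sample() > THRESHOLD_G:
            trigger_network_sensing()
        time.sleep(0.1)          # 'sleep' phase (shortened for the sketch)
    ```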

  13. Using a commercial CAD system for simultaneous input to theoretical aerodynamic programs and wind-tunnel model construction

    NASA Technical Reports Server (NTRS)

    Enomoto, F.; Keller, P.

    1984-01-01

    The Computer Aided Design (CAD) system's common geometry database was used to generate input for theoretical programs and numerically controlled (NC) tool paths for wind tunnel part fabrication. This eliminates the duplication of work in generating separate geometry databases for each type of analysis. Another advantage is that it reduces the uncertainty due to geometric differences when comparing theoretical aerodynamic data with wind tunnel data. The system was adapted to aerodynamic research by developing programs written in Design Analysis Language (DAL). These programs reduced the amount of time required to construct complex geometries and to generate input for theoretical programs. Certain shortcomings of the Design, Drafting, and Manufacturing (DDM) software limited the effectiveness of these programs and some of the Calma NC software. The complexity of aircraft configurations suggests that more types of surface and curve geometry should be added to the system. Some of these shortcomings may be eliminated as improved versions of DDM are made available.

  14. Structural Design of Ares V Interstage Composite Structure

    NASA Technical Reports Server (NTRS)

    Sleigh, David W.; Sreekantamurthy, Thammaiah; Kosareo, Daniel N.; Martin, Robert A.; Johnson, Theodore F.

    2011-01-01

    Preliminary and detailed design studies were performed to mature composite structural design concepts for the Ares V Interstage structure as a part of NASA's Advanced Composite Technologies Project. Aluminum honeycomb sandwich and hat-stiffened composite panel structural concepts were considered. The structural design and analysis studies were performed using HyperSizer design sizing software and MSC Nastran finite element analysis software. System-level design trade studies were carried out to predict weight and margins of safety for composite honeycomb-core sandwich and composite hat-stiffened skin design concepts. Details of both preliminary and detailed design studies are presented in the paper. For the range of loads and geometry considered in this work, the hat-stiffened designs were found to be approximately 11-16 percent lighter than the sandwich designs. A down-select process was used to choose the most favorable structural concept based on a set of figures of merit, and the honeycomb sandwich design was selected as the best concept based on advantages in manufacturing cost.

  15. Tele-rehabilitation using in-house wearable ankle rehabilitation robot.

    PubMed

    Jamwal, Prashant K; Hussain, Shahid; Mir-Nasiri, Nazim; Ghayesh, Mergen H; Xie, Sheng Q

    2018-01-01

    This article explores the wide-ranging potential of a wearable ankle robot for in-house rehabilitation. The presented robot has been conceptualized following a brief analysis of existing technologies, systems, and solutions for in-house physical ankle rehabilitation. Configuration design analysis and component selection for the ankle robot are discussed as part of the conceptual design. The complexities of human-robot interaction are closely encountered while maneuvering a rehabilitation robot; we present a fuzzy-logic-based controller to perform the required robot-assisted ankle rehabilitation treatment. Designs of visual haptic interfaces are also discussed, which make the treatment engaging and motivate the subject to exert greater effort and regain lost functions more rapidly. The complex nature of web-based communication between the user and remotely located physiotherapy staff is also discussed. A high-level software architecture coupled with the robot ensures user-friendly operation. This software is made up of three important components: a patient database, a graphical user interface (GUI), and a library of virtual-reality exercises developed specifically for ankle rehabilitation.
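
    A minimal sketch of a fuzzy-logic controller of the kind described, assuming invented membership breakpoints and torque levels (the actual rule base is not given in the abstract):

    ```python
    # Sketch of a fuzzy position controller: the ankle-angle tracking error
    # is fuzzified with triangular membership functions, rules map error
    # magnitude to an assistive-torque label, and a weighted average
    # defuzzifies. All breakpoints and torque levels are assumed.

    def tri(x, a, b, c):
        """Triangular membership function."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def fuzzy_torque(error_deg):
        # Rule base: small error -> low torque, large error -> high torque.
        mu = {
            'small':  tri(abs(error_deg), -1.0, 0.0, 5.0),
            'medium': tri(abs(error_deg),  2.0, 7.0, 12.0),
            'large':  tri(abs(error_deg),  9.0, 15.0, 30.0),
        }
        torque_level = {'small': 0.5, 'medium': 2.0, 'large': 4.0}  # N*m, assumed
        num = sum(mu[k] * torque_level[k] for k in mu)
        den = sum(mu.values()) or 1.0
        return num / den

    print(fuzzy_torque(6.0))   # assistive torque for a 6-degree tracking error
    ```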

  16. The Software Problem.

    ERIC Educational Resources Information Center

    Walker, Decker F.

    This paper addresses the reasons that it is difficult to find good educational software and proposes measures for coping with this problem. The fundamental problem is a shortage of educational software that can be used as a major part of the teaching of academic subjects in elementary and secondary schools--a shortage that is both the effect and…

  17. Active Learning through Modeling: Introduction to Software Development in the Business Curriculum

    ERIC Educational Resources Information Center

    Roussev, Boris; Rousseva, Yvonna

    2004-01-01

    Modern software practices call for the active involvement of business people in the software process. Therefore, programming has become an indispensable part of the information systems component of the core curriculum at business schools. In this paper, we present a model-based approach to teaching introduction to programming to general business…

  18. GRO/EGRET data analysis software: An integrated system of custom and commercial software using standard interfaces

    NASA Technical Reports Server (NTRS)

    Laubenthal, N. A.; Bertsch, D.; Lal, N.; Etienne, A.; Mcdonald, L.; Mattox, J.; Sreekumar, P.; Nolan, P.; Fierro, J.

    1992-01-01

    The Energetic Gamma Ray Telescope Experiment (EGRET) on the Compton Gamma Ray Observatory has been in orbit for more than a year and is being used to map the full sky for gamma rays in a wide energy range from 30 to 20,000 MeV. Already these measurements have resulted in a wide range of exciting new information on quasars, pulsars, galactic sources, and diffuse gamma ray emission. The central part of the analysis is done with sky maps that typically cover an 80 x 80 degree section of the sky for an exposure time of several days. Specific software developed for this program generates the counts, exposure, and intensity maps. The analysis is done on a network of UNIX based workstations and takes full advantage of a custom-built user interface called X-dialog. The maps that are generated are stored in the FITS format for a collection of energies. These, along with similar diffuse emission background maps generated from a model calculation, serve as input to a maximum likelihood program that produces maps of likelihood with optional contours that are used to evaluate regions for sources. Likelihood also evaluates the background corrected intensity at each location for each energy interval from which spectra can be generated. Being in a standard FITS format permits all of the maps to be easily accessed by the full complement of tools available in several commercial astronomical analysis systems. In the EGRET case, IDL is used to produce graphics plots in two and three dimensions and to quickly implement any special evaluation that might be desired. Other custom-built software, such as the spectral and pulsar analyses, take advantage of the XView toolkit for display and Postscript output for the color hard copy. This poster paper outlines the data flow and provides examples of the user interfaces and output products. It stresses the advantages that are derived from the integration of the specific instrument-unique software and powerful commercial tools for graphics and statistical evaluation. This approach has several proven advantages including flexibility, a minimum of development effort, ease of use, and portability.

  19. Control Software for Advanced Video Guidance Sensor

    NASA Technical Reports Server (NTRS)

    Howard, Richard T.; Book, Michael L.; Bryan, Thomas C.

    2006-01-01

    Embedded software has been developed specifically for controlling an Advanced Video Guidance Sensor (AVGS). A Video Guidance Sensor is an optoelectronic system that provides guidance for automated docking of two vehicles. Such a system includes pulsed laser diodes and a video camera, the output of which is digitized. From the positions of digitized target images and known geometric relationships, the relative position and orientation of the vehicles are computed. The present software consists of two subprograms running in two processors that are parts of the AVGS. The subprogram in the first processor receives commands from an external source, checks the commands for correctness, performs commanded non-image-data-processing control functions, and sends image data processing parts of commands to the second processor. The subprogram in the second processor processes image data as commanded. Upon power-up, the software performs basic tests of functionality, then effects a transition to a standby mode. When a command is received, the software goes into one of several operational modes (e.g. acquisition or tracking). The software then returns, to the external source, the data appropriate to the command.
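
    The mode logic can be sketched as a small table-driven state machine; the command names and transitions below are assumptions based on the modes named above, not the flight code:

    ```python
    # Sketch of the described mode logic: power-up self-test, standby, then
    # command-driven transitions to operational modes, with the command
    # correctness check the first processor performs. Transition table is
    # invented for illustration.
    VALID = {
        'STANDBY':     {'ACQUIRE': 'ACQUISITION'},
        'ACQUISITION': {'TRACK': 'TRACKING', 'STOP': 'STANDBY'},
        'TRACKING':    {'STOP': 'STANDBY'},
    }

    def self_test():
        return True   # stand-in for the power-up functionality tests

    mode = 'STANDBY' if self_test() else 'FAULT'
    for cmd in ['TRACK', 'ACQUIRE', 'TRACK', 'STOP']:
        if cmd in VALID.get(mode, {}):        # command correctness check
            mode = VALID[mode][cmd]
            print(f"{cmd} accepted -> {mode}")
        else:
            print(f"{cmd} rejected in mode {mode}")
    ```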

  20. Flood Vulnerability Analysis of the part of Karad Region, Satara District, Maharashtra using Remote Sensing and Geographic Information System technique

    NASA Astrophysics Data System (ADS)

    Warghat, Sumedh R.; Das, Sandipan; Doad, Atul; Mali, Sagar; Moon, Vishal S.

    2012-07-01

    Karad City is situated at the confluence of the rivers Krishna and Koyna and is severely flood-prone. Floodwaters enter the city through the roads and disrupt infrastructure throughout the city. Furthermore, owing to negligence by the authorities and unplanned growth, the natural flow of water has been constrained by unnecessary embankments in the river Koyna; the river now flows in a narrow channel that overflows easily even during minor floods. Flood vulnerability analysis has been carried out for the Karad region of Satara District, Maharashtra, using remote sensing (RS) and geographic information system (GIS) techniques. The aim of this study is to identify flood vulnerability zones using GIS and RS, and to demonstrate the application of these techniques in mapping flood-vulnerable areas with ArcMap and Erdas software. The analysis was carried out with the following objectives: identify the flood-prone areas in the Koyna and Krishna river basins; calculate surface runoff; delineate flood-sensitive areas; produce a classified hazard map; evaluate the flood-affected area; and prepare a flood vulnerability map. The study uses GIS and spatial techniques for the analysis and understanding of the flood problem in Karad Tahsil. Flood-affected areas of different magnitudes have been identified and mapped using ArcGIS software. The analysis is useful to local planning authorities for identifying risk areas and making timely decisions. The causative factors for flooding in the watershed taken into account are annual rainfall, size of the watershed, basin slope, drainage density of natural channels, and land use. This determination of flood-vulnerable areas in a part of Karad Tahsil illustrates the different approaches.
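
    A common way to produce such a hazard map is a weighted overlay of the causative-factor rasters; the sketch below uses random stand-in rasters and assumed weights, not the study's data:

    ```python
    # GIS-style weighted overlay for flood vulnerability: each causative
    # factor (rainfall, slope, drainage density, land use) is a reclassified
    # raster on a 0-1 scale, combined with assumed weights. Real work would
    # use rasters exported from ArcGIS/Erdas, not random data.
    import numpy as np

    rng = np.random.default_rng(1)
    shape = (100, 100)
    factors = {                  # normalized factor rasters (0 = safe, 1 = prone)
        'rainfall': rng.random(shape),
        'slope':    1.0 - rng.random(shape),   # flatter -> more flood-prone
        'drainage': rng.random(shape),
        'landuse':  rng.random(shape),
    }
    weights = {'rainfall': 0.35, 'slope': 0.25, 'drainage': 0.25, 'landuse': 0.15}

    vulnerability = sum(w * factors[k] for k, w in weights.items())
    hazard_class = np.digitize(vulnerability, [0.3, 0.5, 0.7])  # low .. very high
    print(np.bincount(hazard_class.ravel(), minlength=4))       # cells per class
    ```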

  1. Experimental Evaluation of Verification and Validation Tools on Martian Rover Software

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume; Giannakopoulou, Dimitra; Goldberg, Allen; Havelund, Klaus; Lowry, Mike; Pasareanu, Corina; Venet, Arnaud; Visser, Willem

    2003-01-01

    To achieve its science objectives in deep space exploration, NASA needs science platform vehicles that autonomously make control decisions in a time frame that excludes intervention from Earth-based controllers. Round-trip light time is one significant factor motivating autonomy capability; another is the need to reduce ground support operations cost. An unsolved problem potentially impeding the adoption of autonomy capability is the verification and validation of such software systems, which exhibit far more behaviors (and hence distinct execution paths in the software) than is typical in current deep-space platforms. Hence the need for a study to benchmark advanced Verification and Validation (V&V) tools on representative autonomy software. The objective of the study was to assess the maturity of different technologies, to provide data indicative of potential synergies between them, and to identify gaps in the technologies with respect to the challenge of autonomy V&V. The study consisted of two parts: first, a set of relatively independent case studies of different tools on the same autonomy code; second, a carefully controlled experiment with human participants on a subset of these technologies. This paper describes the second part of the study. Overall, nearly four hundred hours of data on human use of three different advanced V&V tools were accumulated, with a control group that used conventional testing methods. The experiment simulated four independent V&V teams debugging three successive versions of an executive controller for a Martian rover. Defects were carefully seeded into the three versions based on a profile of defects from CVS logs that occurred in the actual development of the executive controller. The rest of the document is structured as follows. In sections 2 and 3, we respectively describe the tools used in the study and the rover software that was analyzed. In section 4 the methodology for the experiment is described; this includes the code preparation, seeding of defects, participant training and experimental setup. Next we give a qualitative overview of how the experiment went from the point of view of each technology: model checking (section 5), static analysis (section 6), runtime analysis (section 7) and testing (section 8). The final section gives some preliminary quantitative results on how the tools compared.

  2. Using software security analysis to verify the secure socket layer (SSL) protocol

    NASA Technical Reports Server (NTRS)

    Powell, John D.

    2004-01-01

    The National Aeronautics and Space Administration (NASA) has tens of thousands of networked computer systems and applications. Software security vulnerabilities present risks such as lost or corrupted data, information theft, and unavailability of critical systems. These risks represent potentially enormous costs to NASA. The NASA Code Q research initiative "Reducing Software Security Risk (RSSR) Through an Integrated Approach" offers, among its capabilities, formal verification of software security properties through the use of model-based verification (MBV) to address software security risks. [1,2,3,4,5,6] MBV is a formal approach to software assurance that combines analysis of software, via abstract models, with technology, such as model checkers, that provides automation of the mechanical portions of the analysis process. This paper discusses: the need for formal analysis to assure software systems with respect to security properties, and why testing alone cannot provide such assurance; the means by which MBV with a Flexible Modeling Framework (FMF) accomplishes the necessary analysis task; and an example of FMF-style MBV in the verification of properties of the Secure Socket Layer (SSL) communication protocol, as a demonstration.
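
    To illustrate what a model checker automates (exhaustive exploration of an abstract state space, something testing cannot guarantee), here is a toy reachability check over an invented three-flag handshake model, not the FMF model of SSL:

    ```python
    # Toy explicit-state check: enumerate every reachable state of an
    # abstract 'handshake' model and verify a security property in each.
    # State: (client_hello_sent, server_cert_verified, key_established).
    def next_states(s):
        hello, verified, key = s
        succs = []
        if not hello:
            succs.append((True, verified, key))
        if hello and not verified:
            succs.append((hello, True, key))
            succs.append((hello, False, True))   # seeded bug: key before verification
        if verified and not key:
            succs.append((hello, verified, True))
        return succs

    def property_ok(s):
        hello, verified, key = s
        return not key or verified          # key established => cert verified

    frontier, seen = [(False, False, False)], set()
    while frontier:                          # exhaustive reachability search
        s = frontier.pop()
        if s in seen:
            continue
        seen.add(s)
        if not property_ok(s):
            print("property violated in state", s)
        frontier.extend(next_states(s))
    ```

    Exhaustive exploration is what lets the checker find the seeded violation on every run, whereas a test suite only finds it if some test happens to drive the system down that path.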

  3. Development of Automated Image Analysis Software for Suspended Marine Particle Classification

    DTIC Science & Technology

    2003-09-30

    Scott Samson, Center for Ocean Technology. Reporting period: 2003. The objective of this effort is to develop automated image analysis software to reduce the effort and time required for manual identification of plankton images.

  4. A tool to include gamma analysis software into a quality assurance program.

    PubMed

    Agnew, Christina E; McGarry, Conor K

    2016-03-01

    To provide a tool to enable gamma analysis software algorithms to be included in a quality assurance (QA) program. Four image sets were created, comprising two geometric images to independently test the distance-to-agreement (DTA) and dose-difference (DD) elements of the gamma algorithm, a clinical step-and-shoot IMRT field, and a clinical VMAT arc. The images were analysed using global and local gamma analysis with two in-house and eight commercially available software packages, encompassing 15 software versions. The effect of image resolution on gamma pass rates was also investigated. All but one software package accurately calculated the gamma passing rate for the geometric images. Variation in global gamma passing rates of 1% at 3%/3 mm and over 2% at 1%/1 mm was measured between software packages and software versions with analysis of appropriately sampled images. This study provides a suite of test images and the gamma pass rates achieved for a selection of commercially available software. This image suite will enable validation of gamma analysis software within a QA program and provide a frame of reference by which to compare results reported in the literature from various manufacturers and software versions.
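
    For reference, a brute-force 1D version of the global gamma computation these packages implement can be sketched as follows; the grids, profiles and criteria are illustrative only, and clinical tools add interpolation and 2D/3D search:

    ```python
    # Brute-force global gamma in 1D: for each evaluated point, search the
    # reference profile for the minimum combined distance/dose discrepancy.
    import numpy as np

    def gamma_pass_rate(x, ref, meas, dd=0.03, dta=3.0):
        """Global gamma: dose difference normalized to max(ref), DTA in mm."""
        norm = dd * ref.max()
        passed = []
        for xi, mi in zip(x, meas):
            dist = (x - xi) / dta
            dose = (ref - mi) / norm
            gamma = np.sqrt(dist**2 + dose**2).min()
            passed.append(gamma <= 1.0)
        return 100.0 * np.mean(passed)

    x = np.arange(0.0, 100.0, 1.0)                      # 1 mm grid
    ref = np.exp(-0.5 * ((x - 50) / 15) ** 2)           # reference profile
    meas = np.exp(-0.5 * ((x - 51) / 15) ** 2) * 1.01   # 1 mm shift, 1% scale
    print(gamma_pass_rate(x, ref, meas, dd=0.03, dta=3.0))  # %GP at 3%/3 mm
    ```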

  5. Software Tools to Support Research on Airport Departure Planning

    NASA Technical Reports Server (NTRS)

    Carr, Francis; Evans, Antony; Feron, Eric; Clarke, John-Paul

    2003-01-01

    A simple, portable and useful collection of software tools has been developed for the analysis of airport surface traffic. The tools are based on a flexible and robust traffic-flow model, and include calibration, validation and simulation functionality for this model. Several different interfaces have been developed to help promote usage of these tools, including a portable Matlab(TM) implementation of the basic algorithms; a web-based interface which provides online access to automated analyses of airport traffic based on a database of real-world operations data which covers over 250 U.S. airports over a 5-year period; and an interactive simulation-based tool currently in use as part of a college-level educational module. More advanced applications for airport departure traffic include taxi-time prediction and evaluation of "windowing" congestion control.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prather, J. C.; Smith, S. K.; Watson, C. R.

    The National Radiobiology Archives is a comprehensive effort to gather, organize, and catalog original data, representative specimens, and supporting materials related to significant radiobiology studies. This provides researchers with information for analyses which compare or combine results of these and other studies, and with materials for analysis by advanced molecular biology techniques. This Programmer's Guide describes the database access software, NRADEMO, and the subset loading script NRADEMO/MAINT/MAINTAIN, which comprise the National Radiobiology Archives Distributed Access Package. The guide is intended for use by an experienced database management specialist. It contains information about the physical and logical organization of the software and data files. It also contains printouts of all the scripts and associated batch processing files. It is part of a suite of documents published by the National Radiobiology Archives.

  7. Problem-Solving Software

    NASA Technical Reports Server (NTRS)

    1992-01-01

    CBR Express software solves problems by adapting stored solutions to new problems specified by a user. It is applicable to a wide range of situations. The technology was originally developed by Inference Corporation for Johnson Space Center's Advanced Software Development Workstation. The project focused on the reuse of software designs, and Inference used CBR as part of the ACCESS prototype software. The commercial CBR Express is used as a "help desk" for customer support, enabling reuse of existing information when necessary. It has been adopted by several companies, among them American Airlines, which uses it to solve reservation system software problems.
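
    The retrieve-and-adapt cycle behind case-based reasoning can be sketched in a few lines; the case base and similarity measure below are invented for illustration, not CBR Express's actual knowledge representation:

    ```python
    # Sketch of case-based reasoning: find the stored case most similar to
    # the new problem and reuse its solution. A real system would also
    # adapt the solution and store the new case.
    CASES = [
        ({'error_code': 101, 'subsystem': 'booking'}, 'restart booking daemon'),
        ({'error_code': 204, 'subsystem': 'network'}, 'reset gateway link'),
        ({'error_code': 105, 'subsystem': 'booking'}, 'rebuild fare index'),
    ]

    def similarity(case, problem):
        # Exact-match features plus a small numeric penalty (invented metric).
        score = sum(1.0 for k in case if case.get(k) == problem.get(k))
        score -= abs(case.get('error_code', 0) - problem.get('error_code', 0)) / 1000.0
        return score

    def solve(problem):
        case, solution = max(CASES, key=lambda cs: similarity(cs[0], problem))
        return solution

    print(solve({'error_code': 102, 'subsystem': 'booking'}))
    ```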

  8. Automated Ontology Generation Using Spatial Reasoning

    NASA Astrophysics Data System (ADS)

    Coalter, Alton; Leopold, Jennifer L.

    Recently there has been much interest in using ontologies to facilitate knowledge representation, integration, and reasoning. Correspondingly, the extent of the information embodied by an ontology is increasing beyond the conventional is_a and part_of relationships. To address these requirements, a vast amount of digitally available information may need to be considered when building ontologies, prompting a desire for software tools to automate at least part of the process. The main efforts in this direction have involved textual information retrieval and extraction methods. For some domains extension of the basic relationships could be enhanced further by the analysis of 2D and/or 3D images. For this type of media, image processing algorithms are more appropriate than textual analysis methods. Herein we present an algorithm that, given a collection of 3D image files, utilizes Qualitative Spatial Reasoning (QSR) to automate the creation of an ontology for the objects represented by the images, relating the objects in terms of is_a and part_of relationships and also through unambiguous Region Connection Calculus (RCC) relations.
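
    A coarse version of the classification step can be sketched by representing regions as voxel sets and reading off RCC-style relations from their overlap; this simplification (no boundary reasoning) is ours, not the authors' algorithm:

    ```python
    # Deriving coarse RCC-style relations from 3D data: regions are sets of
    # occupied voxels, classified by set overlap. Proper parthood ('PP')
    # corresponds to the ontology's part_of relationship.
    def rcc_relation(a: set, b: set) -> str:
        if not (a & b):
            return 'DC'                 # disconnected
        if a == b:
            return 'EQ'                 # identical regions
        if a < b:
            return 'PP'                 # a is a proper part of b -> part_of
        if b < a:
            return 'PPi'                # b is a proper part of a
        return 'PO'                     # partial overlap

    stem  = {(x, y, z) for x in range(4) for y in range(4) for z in range(10)}
    plant = stem | {(x, y, 10) for x in range(8) for y in range(8)}
    print(rcc_relation(stem, plant))    # 'PP': stem part_of plant
    ```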

  9. PINS Spectrum Identification Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    A.J. Caffrey

    2012-03-01

    The Portable Isotopic Neutron Spectroscopy—PINS, for short—system identifies the chemicals inside munitions and containers without opening them, a decided safety advantage if the fill chemical is a hazardous substance like a chemical warfare agent or an explosive. The PINS Spectrum Identification Guide is intended as a reference for technical professionals responsible for the interpretation of PINS gamma-ray spectra. The guide is divided into two parts. The three chapters that constitute Part I cover the science and technology of PINS. Neutron activation analysis is the focus of Chapter 1. Chapter 2 explores PINS hardware, software, and related operational issues. Gamma-ray spectral analysis basics are introduced in Chapter 3. The six chapters of Part II cover the identification of PINS spectra in detail. Like the PINS decision tree logic, these chapters are organized by chemical element: phosphorus-based chemicals, chlorine-based chemicals, etc. These descriptions of hazardous, toxic, and/or explosive chemicals conclude with a chapter on the identification of the inert chemicals, e.g. sand, used to fill practice munitions.

  10. The Role of Data Analysis Software in Graduate Programs in Education and Post-Graduate Research

    ERIC Educational Resources Information Center

    Harwell, Michael

    2018-01-01

    The importance of data analysis software in graduate programs in education and post-graduate educational research is self-evident. However, the role of this software in facilitating sophisticated statistical practice versus "cookbookery" is unclear. The need to rigorously document the role of data analysis software in students' graduate…

  11. SAM Photovoltaic Model Technical Reference 2016 Update

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilman, Paul; DiOrio, Nicholas A; Freeman, Janine M

    This manual describes the photovoltaic performance model in the System Advisor Model (SAM) software, Version 2016.3.14 Revision 4 (SSC Version 160). It is an update to the 2015 edition of the manual, which describes the photovoltaic model in SAM 2015.1.30 (SSC 41). This new edition includes corrections of errors in the 2015 edition and descriptions of new features introduced in SAM 2016.3.14, including: a 3D shade calculator; a battery storage model; DC power optimizer loss inputs; a snow loss model; a plane-of-array irradiance input option from the weather file; support for sub-hourly simulations; self-shading that works with all four subarrays and uses the same algorithm for fixed arrays and one-axis tracking; a linear self-shading algorithm for thin-film modules; and loss percentages that replace derate factors. The photovoltaic performance model is one of the modules in the SAM Simulation Core (SSC), which is part of both SAM and the SAM SDK. SAM is a user-friendly desktop application for analysis of renewable energy projects. The SAM SDK (Software Development Kit) is for developers writing their own renewable energy analysis software based on SSC. This manual is written for users of both SAM and the SAM SDK who want to learn more about the details of SAM's photovoltaic model.

  12. NASA Tech Briefs, July 2003

    NASA Technical Reports Server (NTRS)

    2003-01-01

    Topics covered include: Real-Time, High-Frequency QRS Electrocardiograph; Software for Improved Extraction of Data From Tape Storage; Radio System for Locating Emergency Workers; Software for Displaying High-Frequency Test Data; Capacitor-Chain Successive-Approximation ADC; Simpler Alternative to an Optimum FQPSK-B Viterbi Receiver; Multilayer Patch Antenna Surrounded by a Metallic Wall; Software To Secure Distributed Propulsion Simulations; Explicit Pore Pressure Material Model in Carbon-Cloth Phenolic; Meshed-Pumpkin Super-Pressure Balloon Design; Corrosion Inhibitors as Penetrant Dyes for Radiography; Transparent Metal-Salt-Filled Polymeric Radiation Shields; Lightweight Energy Absorbers for Blast Containers; Brush-Wheel Samplers for Planetary Exploration; Dry Process for Making Polyimide/ Carbon-and-Boron-Fiber Tape; Relatively Inexpensive Rapid Prototyping of Small Parts; Magnetic Field Would Reduce Electron Backstreaming in Ion Thrusters; Alternative Electrochemical Systems for Ozonation of Water; Interferometer for Measuring Displacement to Within 20 pm; UV-Enhanced IR Raman System for Identifying Biohazards; Prognostics Methodology for Complex Systems; Algorithms for Haptic Rendering of 3D Objects; Modeling and Control of Aerothermoelastic Effects; Processing Digital Imagery to Enhance Perceptions of Realism; Analysis of Designs of Space Laboratories; Shields for Enhanced Protection Against High-Speed Debris; Study of Dislocation-Ordered In(x)Ga(1-x)As/GaAs Quantum Dots; and Tilt-Sensitivity Analysis for Space Telescopes.

  13. Software reuse issues affecting AdaNET

    NASA Technical Reports Server (NTRS)

    Mcbride, John G.

    1989-01-01

    The AdaNet program is reviewing its long-term goals and strategies. A significant concern is whether current AdaNet plans adequately address the major strategic issues of software reuse technology. The major reuse issues of providing AdaNet services that should be addressed as part of future AdaNet development are identified and reviewed. Before significant development proceeds, a plan should be developed to resolve the aforementioned issues. This plan should also specify a detailed approach to develop AdaNet. A three-phased strategy is recommended. The first phase would consist of requirements analysis and produce an AdaNet system requirements specification. It would consider the requirements of AdaNet in terms of mission needs, commercial realities, administrative policies affecting development, and the experience of AdaNet and other projects promoting the transfer of software engineering technology. Specifically, requirements analysis would be performed to better understand the requirements for AdaNet functions. The second phase would provide a detailed design of the system. The AdaNet should be designed with emphasis on the use of existing technology readily available to the AdaNet program. A number of reuse products are available upon which AdaNet could be based. This would significantly reduce the risk and cost of providing an AdaNet system. Once a design was developed, implementation would proceed in the third phase.

  14. Software-assisted stacking of gene modules using GoldenBraid 2.0 DNA-assembly framework.

    PubMed

    Vazquez-Vilar, Marta; Sarrion-Perdigones, Alejandro; Ziarsolo, Peio; Blanca, Jose; Granell, Antonio; Orzaez, Diego

    2015-01-01

    GoldenBraid (GB) is a modular DNA assembly technology for plant multigene engineering based on type IIS restriction enzymes. GB speeds up the assembly of transcriptional units from standard genetic parts and facilitates the stacking of several genes within the same T-DNA in a few days. GB cloning is software-assisted with a set of online tools. The GBDomesticator tool assists in the adaptation of DNA parts to the GB standard. The combination of GB-adapted parts to build new transcriptional units is assisted by the GB TU Assembler tool. Finally, the assembly of multigene modules is simulated by the GB Binary Assembler. All the software tools are available at www.gbcloning.org. Here, we describe in detail the assembly methodology to create a multigene construct with three transcriptional units for polyphenol metabolic engineering in plants.

  15. The Secret Life of Quarks, Final Report for the University of North Carolina at Chapel Hill

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fowler, Robert J.

    This final report summarizes activities and results at the University of North Carolina as part of the SciDAC-2 project The Secret Life of Quarks: National Computational Infrastructure for Lattice Quantum Chromodynamics. The overall objective of the project is to construct the software needed to study quantum chromodynamics (QCD), the theory of the strong interactions of subatomic physics, and similar strongly coupled gauge theories anticipated to be of importance in the LHC era. It built upon the successful efforts of the SciDAC-1 project National Computational Infrastructure for Lattice Gauge Theory, in which a QCD Applications Programming Interface (QCD API) was developed that enables lattice gauge theorists to make effective use of a wide variety of massively parallel computers. In the SciDAC-2 project, optimized versions of the QCD API were created for the IBM BlueGene/L (BG/L) and BlueGene/P (BG/P), the Cray XT3/XT4 and its successors, and clusters based on multi-core processors and Infiniband communications networks. The QCD API is being used to enhance the performance of the major QCD community codes and to create new applications. Software libraries of physics tools have been expanded to contain sharable building blocks for inclusion in application codes, performance analysis and visualization tools, and software for automation of physics workflow. New software tools were designed for managing the large data sets generated in lattice QCD simulations, and for sharing them through the International Lattice Data Grid consortium. As part of the overall project, researchers at UNC were funded through ASCR to work in three general areas. The main thrust has been performance instrumentation and analysis in support of the SciDAC QCD code base as it evolved and as it moved to new computation platforms. In support of the performance activities, performance data was collected in a database for the purpose of broader analysis. Third, because the UNC work was done at RENCI (Renaissance Computing Institute), which has extensive expertise and facilities for scientific data visualization, the team acted in an ongoing consulting and support role in that area.

  16. Usability study of clinical exome analysis software: top lessons learned and recommendations.

    PubMed

    Shyr, Casper; Kushniruk, Andre; Wasserman, Wyeth W

    2014-10-01

    New DNA sequencing technologies have revolutionized the search for genetic disruptions. Targeted sequencing of all protein coding regions of the genome, called exome analysis, is actively used in research-oriented genetics clinics, with the transition to exomes as a standard procedure underway. This transition is challenging; identification of potentially causal mutation(s) amongst ∼10^6 variants requires specialized computation in combination with expert assessment. This study analyzes the usability of user interfaces for clinical exome analysis software. There are two study objectives: (1) to ascertain the key features of successful user interfaces for clinical exome analysis software based on the perspective of expert clinical geneticists, (2) to assess user-system interactions in order to reveal strengths and weaknesses of existing software, inform future design, and accelerate the clinical uptake of exome analysis. Surveys, interviews, and cognitive task analysis were performed for the assessment of two next-generation exome sequence analysis software packages. The subjects included ten clinical geneticists who interacted with the software packages using the "think aloud" method. Subjects' interactions with the software were recorded in their clinical office within an urban research and teaching hospital. All major user interface events (from the user interactions with the packages) were time-stamped and annotated with coding categories to identify usability issues in order to characterize desired features and deficiencies in the user experience. We detected 193 usability issues, the majority of which concern interface layout and navigation, and the resolution of reports. Our study highlights gaps in specific software features typical within exome analysis. The clinicians perform best when the flow of the system is structured into well-defined yet customizable layers for incorporation within the clinical workflow. The results highlight opportunities to dramatically accelerate clinician analysis and interpretation of patient genomic data. We present the first application of usability methods to evaluate software interfaces in the context of exome analysis. Our results highlight how the study of user responses can lead to identification of usability issues and challenges and reveal software reengineering opportunities for improving clinical next-generation sequencing analysis. While the evaluation focused on two distinctive software tools, the results are general and should inform active and future software development for genome analysis software. As large-scale genome analysis becomes increasingly common in healthcare, it is critical that efficient and effective software interfaces are provided to accelerate clinical adoption of the technology. Implications for improved design of such applications are discussed.

  17. What's in a story? A text analysis of burn survivors' web-posted narratives.

    PubMed

    Badger, Karen; Royse, David; Moore, Kelly

    2011-01-01

    Story-telling has been found to be beneficial following trauma, suggesting a potential intervention for burn survivors, who frequently make use of "telling their story" as part of their recovery. This study is the first to examine the word content of burn survivors' Web-posted narratives to explore their perceptions of the event, supportive resources, their post-burn well-being, and re-integration, using a comparison group and text analysis software developed by the widely recognized James Pennebaker. Suggestions are made for using expressive writing or story-telling as a guided psychosocial intervention with burn survivors.

  18. Open Source Paradigm: A Synopsis of The Cathedral and the Bazaar for Health and Social Care.

    PubMed

    Benson, Tim

    2016-07-04

    Open source software (OSS) is becoming more fashionable in health and social care, although the ideas are not new. However, progress has been slower than many had expected. The purpose is to summarise the Free/Libre Open Source Software (FLOSS) paradigm in terms of what it is, how it impacts users and software engineers, and how it can work as a business model in the health and social care sectors. Much of this paper is a synopsis of Eric Raymond's seminal book The Cathedral and the Bazaar, which was the first comprehensive description of the open source ecosystem, set out in three long essays. Direct quotes from the book are used liberally, without reference to specific passages. The first part contrasts open and closed source approaches to software development and support. The second part describes the culture and practices of the open source movement. The third part considers business models. A key benefit of open source is that users can access and collaborate on improving the software if they wish. Closed source code may be regarded as a strategic business risk that may be unacceptable if there is an open source alternative. The sharing culture of the open source movement fits well with that of health and social care.

  19. CALIPSO: an interactive image analysis software package for desktop PACS workstations

    NASA Astrophysics Data System (ADS)

    Ratib, Osman M.; Huang, H. K.

    1990-07-01

    The purpose of this project is to develop a low-cost workstation for quantitative analysis of multimodality images using a Macintosh II personal computer. In the current configuration the Macintosh operates as a stand-alone workstation where images are imported either from a central PACS server through a standard Ethernet network or recorded through a video digitizer board. The CALIPSO software developed contains a large variety of basic image display and manipulation tools. We focused our effort, however, on the design and implementation of quantitative analysis methods that can be applied to images from different imaging modalities. Analysis modules currently implemented include: geometric and densitometric volume and ejection fraction calculation from radionuclide and cine-angiograms; Fourier analysis of cardiac wall motion; vascular stenosis measurement; color-coded parametric display of regional flow distribution from dynamic coronary angiograms; and automatic analysis of myocardial distribution of radiolabelled tracers from tomoscintigraphic images. Several of these analysis tools were selected because they use similar color-coded and parametric display methods to communicate quantitative data extracted from the images. Developments of Picture Archiving and Communication Systems (PACS) in the clinical environment allow physicians and radiologists to assess radiographic images directly through imaging workstations; this convenient access to the images is often limited by the number of workstations available, due in part to their high cost, and there is an increasing need for quantitative analysis of the images.

  20. An open source software for fast grid-based data-mining in spatial epidemiology (FGBASE).

    PubMed

    Baker, David M; Valleron, Alain-Jacques

    2014-10-30

    Examining whether disease cases are clustered in space is an important part of epidemiological research. Another important part of spatial epidemiology is testing whether patients suffering from a disease are more, or less, exposed to environmental factors of interest than adequately defined controls. Both approaches involve determining the number of cases and controls (or population at risk) in specific zones. For cluster searches, this often must be done for millions of different zones. Doing this by calculating distances can lead to very lengthy computations. In this work we discuss the computational advantages of geographical grid-based methods, and introduce an open source software (FGBASE) which we have created for this purpose. Geographical grids based on the Lambert Azimuthal Equal Area projection are well suited for spatial epidemiology because they preserve area: each cell of the grid has the same area. We describe how data is projected onto such a grid, as well as grid-based algorithms for spatial epidemiological data-mining. The software program (FGBASE), that we have developed, implements these grid-based methods. The grid-based algorithms perform extremely fast. This is particularly the case for cluster searches. When applied to a cohort of French Type 1 Diabetes (T1D) patients, as an example, the grid-based algorithms detected potential clusters in a few seconds on a modern laptop. This compares very favorably to an equivalent cluster search using distance calculations instead of a grid, which took over 4 hours on the same computer. In the case study we discovered 4 potential clusters of T1D cases near the cities of Le Havre, Dunkerque, Toulouse and Nantes. One example of environmental analysis with our software was to study whether a significant association could be found between the disease and distance to vineyards with heavy pesticide use; none was found. In both examples, the software facilitates the rapid testing of hypotheses. Grid-based algorithms for mining spatial epidemiological data provide advantages in terms of computational complexity, thus improving the speed of computations. We believe that these methods and this software tool (FGBASE) will lower the computational barriers to entry for those performing epidemiological research.
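
    The computational advantage comes from binning points once and then reusing array arithmetic for every candidate window. A sketch under stated assumptions (made-up projected coordinates, a 10 km cell size, wrap-around window edges) follows:

    ```python
    # Grid idea behind FGBASE-style cluster searches: project cases onto an
    # equal-area grid once, then any window count is array arithmetic
    # instead of per-pair distance computations. FGBASE uses the Lambert
    # Azimuthal Equal Area projection; coordinates here are invented.
    import numpy as np

    rng = np.random.default_rng(2)
    cases = rng.uniform(0, 1000, size=(5000, 2))        # projected x, y in km

    cell = 10.0                                          # 10 km x 10 km cells
    counts, _, _ = np.histogram2d(cases[:, 0], cases[:, 1],
                                  bins=[np.arange(0, 1001, cell)] * 2)

    # Cases within every 3x3 block of cells (a ~30 km square window), for
    # all window positions at once; edges wrap in this toy version.
    window = sum(np.roll(np.roll(counts, i, 0), j, 1)
                 for i in (-1, 0, 1) for j in (-1, 0, 1))
    print(window.max(), "cases in the densest 30 km window")
    ```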

  1. Fault Tree Analysis Application for Safety and Reliability

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores R.

    2003-01-01

    Many commercial software tools exist for fault tree analysis (FTA), an accepted method for mitigating risk in systems. The method embedded in the tools identifies root causes in system components, but when software is identified as a root cause, it does not build trees into the software component. No commercial software tools have been built specifically for development and analysis of software fault trees. Research indicates that the methods of FTA could be applied to software, but the method is not practical without automated tool support. With appropriate automated tool support, software fault tree analysis (SFTA) may be a practical technique for identifying the underlying cause of software faults that may lead to critical system failures. We strive to demonstrate that existing commercial tools for FTA can be adapted for use with SFTA, and that, applied to a safety-critical system, SFTA can be used to identify serious potential problems long before integration and system testing.
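
    The quantitative half of (S)FTA reduces to propagating basic-event probabilities through AND/OR gates; here is a minimal sketch with an invented tree and independence assumed:

    ```python
    # Propagate basic-event failure probabilities up through AND/OR gates,
    # assuming independent events. The example tree is invented, not from
    # the paper.
    def evaluate(node):
        if isinstance(node, float):                  # basic event probability
            return node
        gate, children = node
        probs = [evaluate(c) for c in children]
        if gate == 'AND':                            # all children must fail
            p = 1.0
            for q in probs:
                p *= q
            return p
        # OR gate: at least one child fails
        p = 1.0
        for q in probs:
            p *= (1.0 - q)
        return 1.0 - p

    # Top event: software fault AND (sensor failure OR watchdog failure)
    tree = ('AND', [1e-3, ('OR', [1e-4, 5e-4])])
    print(evaluate(tree))                            # probability of top event
    ```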

  2. RELAP-7 Software Verification and Validation Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Curtis L.; Choi, Yong-Joon; Zou, Ling

    This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on the INL’s modern scientific software development framework – MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5’s capability and extends the analysis capability for all reactor system simulation scenarios.

  3. Closing the loop on improvement: Packaging experience in the Software Engineering Laboratory

    NASA Technical Reports Server (NTRS)

    Waligora, Sharon R.; Landis, Linda C.; Doland, Jerry T.

    1994-01-01

    As part of its award-winning software process improvement program, the Software Engineering Laboratory (SEL) has developed an effective method for packaging organizational best practices based on real project experience into useful handbooks and training courses. This paper shares the SEL's experience over the past 12 years creating and updating software process handbooks and training courses. It provides cost models and guidelines for successful experience packaging derived from SEL experience.

  4. Traffic-Light-Preemption Vehicle-Transponder Software Module

    NASA Technical Reports Server (NTRS)

    Bachelder, Aaron; Foster, Conrad

    2005-01-01

    A prototype wireless data-communication and control system automatically modifies the switching of traffic lights to give priority to emergency vehicles. The system, which was reported in several NASA Tech Briefs articles at earlier stages of development, includes a transponder on each emergency vehicle, a monitoring and control unit (an intersection controller) at each intersection equipped with traffic lights, and a central monitoring subsystem. An essential component of the system is a software module executed by a microcontroller in each transponder. This module integrates and broadcasts data on the position, velocity, acceleration, and emergency status of the vehicle. The position, velocity, and acceleration data are derived partly from the Global Positioning System, partly from dead reckoning, and partly from a diagnostic computer aboard the vehicle. The software module also monitors similar broadcasts from other vehicles and from intersection controllers, informs the driver of which intersections it controls, and generates visible and audible alerts to inform the driver of any other emergency vehicles that are close enough to create a potential hazard. The execution of the software module can be monitored remotely, and the module can be upgraded remotely and, hence, automatically.
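
    A sketch of the kind of status record such a module might integrate and broadcast, together with the proximity test behind the hazard alert; the field names, JSON encoding, and 150 m alert radius are assumptions, not the actual protocol:

    ```python
    # Illustrative status record and proximity check for the driver alert.
    import json, math
    from dataclasses import dataclass, asdict

    @dataclass
    class VehicleStatus:
        vehicle_id: str
        x_m: float          # position (local grid), fused GPS + dead reckoning
        y_m: float
        speed_mps: float
        emergency: bool

    def broadcast(status: VehicleStatus) -> bytes:
        return json.dumps(asdict(status)).encode()   # stand-in for the radio link

    def hazard_alert(me: VehicleStatus, other: VehicleStatus, radius_m=150.0):
        d = math.hypot(me.x_m - other.x_m, me.y_m - other.y_m)
        return other.emergency and d < radius_m

    me    = VehicleStatus('engine-7',  0.0,  0.0, 18.0, True)
    other = VehicleStatus('medic-2', 90.0, 60.0, 22.0, True)
    print(len(broadcast(me)), "byte status packet")
    print(hazard_alert(me, other))   # True: another emergency vehicle is close
    ```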

  5. Proceedings of the 14th Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Several software-related topics are presented. Topics covered include: studies and experiments at the Software Engineering Laboratory at the Goddard Space Flight Center; predicting project success from the Software Project Management Process; software environments; testing in a reuse environment; domain-directed reuse; and classification tree analysis using the Amadeus measurement and empirical analysis system.

  6. Design and validation of Segment--freely available software for cardiovascular image analysis.

    PubMed

    Heiberg, Einar; Sjögren, Jane; Ugander, Martin; Carlsson, Marcus; Engblom, Henrik; Arheden, Håkan

    2010-01-11

    Commercially available software for cardiovascular image analysis often has limited functionality and frequently lacks the careful validation that is required for clinical studies. We have already implemented a cardiovascular image analysis software package and released it as freeware for the research community. However, it was distributed as a stand-alone application and other researchers could not extend it by writing their own custom image analysis algorithms. We believe that the work required to make a clinically applicable prototype can be reduced by making the software extensible, so that researchers can develop their own modules or improvements. Such an initiative might then serve as a bridge between image analysis research and cardiovascular research. The aim of this article is therefore to present the design and validation of a cardiovascular image analysis software package (Segment) and to announce its release in a source code format. Segment can be used for image analysis in magnetic resonance imaging (MRI), computed tomography (CT), single photon emission computed tomography (SPECT) and positron emission tomography (PET). Some of its main features include loading of DICOM images from all major scanner vendors, simultaneous display of multiple image stacks and plane intersections, automated segmentation of the left ventricle, quantification of MRI flow, tools for manual and general object segmentation, quantitative regional wall motion analysis, myocardial viability analysis and image fusion tools. Here we present an overview of the validation results and validation procedures for the functionality of the software. We describe a technique to ensure continued accuracy and validity of the software by implementing and using a test script that tests the functionality of the software and validates the output. The software has been made freely available for research purposes in a source code format on the project home page http://segment.heiberg.se. Segment is a well-validated comprehensive software package for cardiovascular image analysis. It is freely available for research purposes provided that relevant original research publications related to the software are cited.

  7. Ground control station software design for micro aerial vehicles

    NASA Astrophysics Data System (ADS)

    Walendziuk, Wojciech; Oldziej, Daniel; Binczyk, Dawid Przemyslaw; Slowik, Maciej

    2017-08-01

    This article describes the process of designing the hardware and the software of a ground control station used for configuring and operating micro unmanned aerial vehicles (UAVs). All the work was conducted on a quadrocopter model, a commonly accessible commercial construction. This article contains a characterization of the research object, covers the basics of operating micro aerial vehicles (MAVs), and presents the components of the ground control station model. It also describes the communication standards used in building the model of the station. A further part of the work concerns the software of the product, the GIMSO application (Generally Interactive Station for Mobile Objects), which enables the user to manage the operation and the communication and control processes of the UAV. The process of creating the software and the field tests of the station model are also presented in the article.

  8. GSC configuration management plan

    NASA Technical Reports Server (NTRS)

    Withers, B. Edward

    1990-01-01

    The tools and methods used for the configuration management of the artifacts (including software and documentation) associated with the Guidance and Control Software (GCS) project are described. The GCS project is part of a software error studies research program. Three implementations of GCS are being produced in order to study the fundamental characteristics of the software failure process. The Code Management System (CMS) is used to track and retrieve versions of the documentation and software. Application of the CMS for this project is described and the numbering scheme is delineated for the versions of the project artifacts.

  9. A software upgrade method for micro-electronics medical implants.

    PubMed

    Cao, Yang; Hao, Hongwei; Xue, Lin; Li, Luming; Ma, Bozhi

    2006-01-01

    A software upgrade method for microelectronic medical implants is designed to enhance the devices' function or to renew the software if bugs are found, the software requires updating, or some memory units become disabled. The implants need not be surgically replaced if the faults can be corrected through reprogramming, which reduces the patients' pain and effectively improves safety. This paper introduces the software upgrade method using in-application programming (IAP) and emphasizes how to ensure the reliability and stability of the system, especially of the implanted part, while upgrading.
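
    The safety pattern behind IAP upgrades can be sketched as staging plus verification before switching execution; the CRC32 check and two-bank layout below are assumptions, not the implant's actual protocol:

    ```python
    # Stage the new image and verify its checksum before execution switches
    # to it, so a failed transfer never leaves the device without working
    # firmware. Details (CRC32, bank layout) are invented for illustration.
    import zlib

    def receive_image():
        image = b'\x90' * 512                     # stand-in for telemetry transfer
        return image, zlib.crc32(image)

    def upgrade(active_bank: bytes) -> bytes:
        image, expected_crc = receive_image()
        staged = bytes(image)                     # write to the inactive bank
        if zlib.crc32(staged) != expected_crc:
            return active_bank                    # keep running the old firmware
        return staged                             # switch banks only on success

    firmware = b'\x00' * 512
    firmware = upgrade(firmware)
    print(len(firmware), "byte image active")
    ```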

  10. Assuring Software Cost Estimates: Is it an Oxymoron?

    NASA Technical Reports Server (NTRS)

    Hihn, Jarius; Tregre, Grant

    2013-01-01

    The software industry repeatedly observes cost growth of well over 100% even after decades of cost estimation research and well-known best practices, so "What's the problem?" In this paper we provide an overview of the current state of software cost estimation best practice. We then explore whether applying some of the methods used in software assurance might improve the quality of software cost estimates. This paper focuses especially on issues associated with model calibration, estimate review, and the development and documentation of estimates as part of an integrated plan.

  11. User-driven integrated software lives: ``Paleomag'' paleomagnetics analysis on the Macintosh

    NASA Astrophysics Data System (ADS)

    Jones, Craig H.

    2002-12-01

    "PaleoMag," a paleomagnetics analysis package originally developed for the Macintosh operating system in 1988, allows examination of demagnetization of individual samples and analysis of directional data from collections of samples. Prior to recent reinvigorated development of the software for both Macintosh and Windows, it was widely used despite not running properly on machines and operating systems sold after 1995. This somewhat surprising situation demonstrates that there is a continued need for integrated analysis software within the earth sciences, in addition to well-developed scripting and batch-mode software. One distinct advantage of software like PaleoMag is in the ability to combine quality control with analysis within a unique graphical environment. Because such demands are frequent within the earth sciences, means of nurturing the development of similar software should be found.

  12. Statistical modeling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1992-01-01

    This working paper discusses the statistical simulation part of a controlled software development experiment being conducted under the direction of the System Validation Methods Branch, Information Systems Division, NASA Langley Research Center. The experiment uses guidance and control software (GCS) aboard a fictitious planetary landing spacecraft: real-time control software operating on a transient mission. Software execution is simulated to study the statistical aspects of reliability and other failure characteristics of the software during development, testing, and random usage. Quantification of software reliability is a major goal. Various reliability concepts are discussed. Experiments are described for performing simulations and collecting appropriate simulated software performance and failure data. This data is then used to make statistical inferences about the quality of the software development and verification processes as well as inferences about the reliability of software versions and reliability growth under random testing and debugging.
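
    One classical way to simulate such failure data is a Jelinski-Moranda-style model, in which the failure rate is proportional to the faults remaining; the parameters below are invented, and the paper does not commit to this particular model:

    ```python
    # Jelinski-Moranda-style reliability growth: each failure removes one
    # fault, so interfailure times stretch as debugging proceeds. N and phi
    # are assumed parameters.
    import random

    random.seed(3)
    N, phi = 30, 0.02          # initial fault count, per-fault hazard rate

    t = 0.0
    times = []
    for remaining in range(N, 0, -1):
        gap = random.expovariate(phi * remaining)   # time to next failure
        t += gap
        times.append(t)                             # fault found and removed

    print("first 3 failures at", [round(x, 1) for x in times[:3]])
    print("last failure at", round(times[-1], 1))   # reliability growth visible
    ```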

  13. The Effective Use of Professional Software in an Undergraduate Mining Engineering Curriculum

    ERIC Educational Resources Information Center

    Kecojevic, Vladislav; Bise, Christopher; Haight, Joel

    2005-01-01

    The use of professional software is an integral part of a student's education in the mining engineering curriculum at The Pennsylvania State University. Even though mining engineering represents a limited market across U.S. educational institutions, the goal still exists for using this type of software to enrich the learning environment with…

  14. RELAP-7 Software Verification and Validation Plan: Requirements Traceability Matrix (RTM) Part 1 – Physics and numerical methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choi, Yong Joon; Yoo, Jun Soo; Smith, Curtis Lee

    2015-09-01

    This INL plan comprehensively describes the Requirements Traceability Matrix (RTM) for the main physics and numerical methods of RELAP-7. The plan also describes the testing-based software verification and validation (SV&V) process: a set of specially designed software models used to test RELAP-7.

  15. Ethics and Morality in Software Development: A Developer's Perspective

    ERIC Educational Resources Information Center

    Stephenson, James H.

    2010-01-01

    Computers and other digital devices have become ubiquitous in our lives. Almost all aspects of our lives are in part or wholly impacted by computers and the software that runs on them. Unknowingly, we are placing our livelihoods and even our lives in the hands of unknown software developers. Ethical and moral decisions made during software…

  16. Expert system verification and validation guidelines/workshop task. Deliverable no. 1: ES V/V guidelines

    NASA Technical Reports Server (NTRS)

    French, Scott W.

    1991-01-01

    The goals are to show that verifying and validating a software system is a required part of software development and has a direct impact on the software's design and structure. Workshop tasks are given in the areas of statistics, integration/system test, unit and architectural testing, and a traffic controller problem.

  17. 15 CFR 740.10 - Servicing and replacement of parts and equipment (RPL).

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... supporting acts of international terrorism) if the commodity to be repaired is an “aircraft” (as defined in... equipment controlled under ECCN 2A983 or related software controlled under ECCN 2D983. (vi) The conditions... defective or unacceptable U.S.-origin commodities and software. (2) Commodities and software sent to a...

  18. Does the Design of the Express Waybill Influence Customers' Parcel-Seeking?

    NASA Astrophysics Data System (ADS)

    Hu, Pengji; Shi, Juan

    2018-03-01

    Nowadays, college students have become the main group of online shoppers, so locating their own parcels among bulk express deliveries is increasingly difficult. To determine how students can more easily find their names on express waybills and collect their parcels, an experiment was conducted to identify which part of the waybill draws attention to the recipient's name first. Thirty-six college students (excluding freshmen) from three different majors (12 from each major) were shown pictures of four express waybill designs with the consignee information placed in different positions, while their gaze was recorded and analysed with the Dikablis eye-tracking system. The analysis shows that placing the consignee information in regions with a larger number of fixation points and longer fixation durations reinforces its salience. Consequently, it is recommended that the consignee information be placed in those regions for the students' convenience.
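
    The underlying analysis, counting fixations and dwell time per region of the waybill, can be sketched as follows; the area-of-interest (AOI) rectangles and fixation samples are invented for illustration and are not the Dikablis export format.

        # Sketch of an area-of-interest (AOI) summary for comparing waybill
        # layouts: count fixations and total dwell time falling in each region.
        aois = {                     # (x0, y0, x1, y1) regions on the waybill image
            "top":    (0, 0, 400, 100),
            "middle": (0, 100, 400, 200),
            "bottom": (0, 200, 400, 300),
        }
        fixations = [                # (x, y, duration_ms) from the eye tracker
            (120, 40, 310), (200, 60, 250), (180, 150, 120), (90, 260, 90),
        ]

        summary = {name: {"count": 0, "dwell_ms": 0} for name in aois}
        for x, y, dur in fixations:
            for name, (x0, y0, x1, y1) in aois.items():
                if x0 <= x < x1 and y0 <= y < y1:
                    summary[name]["count"] += 1
                    summary[name]["dwell_ms"] += dur

        for name, s in summary.items():   # regions with more/longer fixations are
            print(name, s)                # candidates for the consignee information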

  19. Evidence of a Heterogeneous Tissue Oxygenation: Renal Ischemia/Reperfusion Injury in a Large Animal Model

    DTIC Science & Technology

    2013-03-01

    operation. 2.1.2 Canine model. The canine experiment (n = 1) was performed as a validation of the correlation of visible reflectance imaging measurements...with actual blood oxygenation. The canine laparotomy, as part of an animal protocol approved by the Institutional Animal Care and...All data analysis was performed using algorithms and software written in-house using the programming languages Matlab and IDL/ENVI (ITT Visual

  20. Program Manager: the Journal of the Defense Systems Management College, Volume 14, Number 3, May-June 1985.

    DTIC Science & Technology

    1985-06-01

    Successful...The Defense Challenge: Spare Parts Availability. Dr. Jonathan D. Kaplan, Lieutenant...Developing Human Performance Specifications (Kaplan & Crooks)...hardware software has not been developed at this...requiring that the resulting design be capable of performing at the specified criteria. Although the...design...components: MOS-characteristics map, analysis-characteristics map, and...Dr. Kaplan is a

  1. Study of Optimum Simulation Techniques for the Design and Evaluation of Anti-Jam Communication Systems

    DTIC Science & Technology

    1976-03-01

    pseudo-range and range rate correlations, and GDM software efficiency. Other simplifications include the elimination of all or part of the multipath...signal is available. Then the pdf parameters are trivially available by simple mean, variance and correlation measurements on the quadrature signal...This report investigates the application of CSEL to the LES 8/9 and GPS satellite programs. In addition, a new analysis of the effects of soft and

  2. Photography of the histological and radiological analysis of the ligaments of the distal radioulnar joint.

    PubMed

    Clayton, Gemma

    2013-06-01

    This project was undertaken as part of the PhD research project of Paul Malone, Principal Investigator, Covance plc, Harrogate. Mr Malone approached the photography department for involvement in the study with the aim of settling the current debate on the anatomical and histological features of the distal radioulnar ligaments by capturing the anatomy photographically throughout the process of dissection via a microtome. The author was approached to lead on the photographic protocol as part of her post-graduate certificate training at Staffordshire University. High-resolution digital images of an entire human arm were required, the main area of interest being the distal radioulnar joint of the wrist. Images were to be taken at 40 μm intervals as the specimen was sliced. When microtomy was undertaken through the ligaments, images were made at 20 μm intervals. A method of suspending a camera approximately 1 metre above the specimen was devised, together with the preparation for the capture, processing and storage of images. The resulting images were then to be subjected to further analysis in the form of 3-dimensional reconstruction, using computer modelling techniques and software. The possibility of merging the images with sequences obtained from both CT & MRI using image handling software is also an area of exploration, in collaboration with the University of Manchester's Visualisation Centre.

  3. An ontology based trust verification of software license agreement

    NASA Astrophysics Data System (ADS)

    Lu, Wenhuan; Li, Xiaoqing; Gan, Zengqin; Wei, Jianguo

    2017-08-01

    When we install or download software, a lengthy document stating rights and obligations is displayed, which many people lack the patience to read or understand. This can make users distrust the software. In this paper, we propose an ontology-based trust verification for software license agreements. First, this work proposes an ontology model for the domain of software license agreements. The domain ontology is constructed by the proposed methodology according to copyright laws and 30 software license agreements. The license ontology can act as part of a generalized copyright-law knowledge model, and can also serve as a visualization of software licenses. Based on this proposed ontology, a software-license-oriented text summarization approach is proposed, whose performance shows that it can improve the accuracy of summarizing software licenses. Based on the summarization, the underlying purpose of the software license can be explicitly explored for trust verification.

  4. Theory and praxis of map analysis in CHEF part 2: Nonlinear normal form

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michelotti, Leo; /FERMILAB

    2009-04-01

    This is the second of three memos describing how normal form map analysis is implemented in CHEF. The first [1] explained the manipulations required to assure that initial, linear transformations preserved Poincare invariants, thereby confirming correct normalization of action-angle coordinates. In this one, the transformation will be extended to nonlinear terms. The third, describing how the algorithms were implemented within the software of CHEF's libraries, most likely will never be written. The first section, Section 2, quickly lays out preliminary concepts and relationships. In Section 3, we shall review the perturbation theory - an iterative sequence of transformations that converts a nonlinear mapping into its normal form - and examine the equation which moves calculations from one step to the next. Following that is a section titled 'Interpretation', which identifies connections between the normalized mappings and idealized, integrable, fictitious Hamiltonian models. A final section contains closing comments, some of which may - but probably will not - preview work to be done later. My reasons for writing this memo and its predecessor have already been expressed [1]. To them can be added this: 'black box code' encourages users to proceed with little or no understanding of what it does or how it operates. So far, CHEF has avoided this trap admirably by failing to attract potential users. However, we reached a watershed last year: even I now have difficulty following the software through its maze of operations. Extensions to CHEF's physics functionalities, software upgrades, and even simple maintenance are becoming more difficult than they should. I hope these memos will mark parts of the maze for easier navigation in the future. Despite appearances to the contrary, I tried to include no (or very little) more than the minimum needed to understand what CHEF's nonlinear analysis modules do. As with the first memo, material has been lifted - and modified - from Intermediate Classical Dynamics (ICD) [2], old technical memos, seminar viewgraphs, and lecture notes. Finally, for a reason I do not know but am willing to indulge, equation and comment labels start from where they left off in Part 1.
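
    For readers who want the equation the memo alludes to, the generic order-n normalization step of map perturbation theory is stated below in action-angle (Fourier) form; this is the textbook homological equation, offered as background and not necessarily CHEF's exact formulation.

        % One step of normal-form perturbation theory for a map
        % M = R o exp(:f_n:) + higher order, where R is the linear rotation
        % with tune vector \mu. Textbook form, not necessarily CHEF's.
        \begin{align*}
          \mathcal{M}' &= e^{-:F_n:}\,\mathcal{M}\,e^{:F_n:}, &
          F_{n,m} &= \frac{f_{n,m}}{1 - e^{\,i\,m\cdot\mu}}
          \quad (m\cdot\mu \not\equiv 0 \ \mathrm{mod}\ 2\pi).
        \end{align*}
        % Resonant terms (m . mu = 0 mod 2pi) cannot be removed; they remain in
        % the normal form and define the idealized integrable Hamiltonian model
        % that the memo's 'Interpretation' section refers to.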

  5. Semi-automatic tracking, smoothing and segmentation of hyoid bone motion from videofluoroscopic swallowing study.

    PubMed

    Kim, Won-Seok; Zeng, Pengcheng; Shi, Jian Qing; Lee, Youngjo; Paik, Nam-Jong

    2017-01-01

    Motion analysis of the hyoid bone via videofluoroscopic study has been used in clinical research, but the classical manual tracking method is labor-intensive and time-consuming. Although some automatic tracking methods have been developed, masked points could not be tracked, and smoothing and segmentation, which are necessary for functional motion analysis prior to registration, were not provided by previous software. We developed software to track the hyoid bone motion semi-automatically. It works even when the hyoid bone is masked by the mandible, and it has been validated in dysphagia patients with stroke. In addition, we added functions for semi-automatic smoothing and segmentation. Data from a total of 30 patients were used to develop the software, and data collected from 17 patients were used for validation, of which the trajectories of 8 patients were partly masked. Pearson correlation coefficients between manual and automatic tracking are high and statistically significant (0.942 to 0.991, P-value < 0.0001). Relative errors between automatic and manual tracking in terms of the x-axis, y-axis and 2D range of hyoid bone excursion range from 3.3% to 9.2%. We also developed an automatic method to segment each hyoid bone trajectory into four phases (elevation phase, anterior movement phase, descending phase and returning phase). The semi-automatic hyoid bone tracking from VFSS data by our software is valid compared with the conventional manual tracking method. In addition, the automatic indication for switching from automatic to manual mode in extreme cases, and calibration without attaching a radiopaque object, are convenient and useful for users. Semi-automatic smoothing and segmentation provide further information for functional motion analysis, which benefits further statistical analyses such as functional classification and prognostication for dysphagia. This software therefore provides researchers in the field of dysphagia with a convenient, useful, all-in-one platform for analyzing hyoid bone motion. Further development to track other swallowing-related structures or objects, such as the epiglottis and bolus, and to carry out 2D curve registration may be needed for a more comprehensive functional data analysis of dysphagia with big data.
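
    The validation reported here, correlating semi-automatic against manual trajectories, reduces to a Pearson coefficient and a relative range error; the sketch below computes both on synthetic stand-in trajectories, not real VFSS data.

        # Pearson correlation and relative excursion-range error between a
        # manual and a semi-automatic trajectory (synthetic example values).
        import math

        manual    = [0.0, 1.2, 2.9, 4.1, 3.2, 1.8, 0.4]   # y-positions, manual
        automatic = [0.1, 1.1, 3.0, 4.0, 3.3, 1.7, 0.5]   # same frames, software

        def pearson(a, b):
            n = len(a)
            ma, mb = sum(a) / n, sum(b) / n
            cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
            sa = math.sqrt(sum((x - ma) ** 2 for x in a))
            sb = math.sqrt(sum((y - mb) ** 2 for y in b))
            return cov / (sa * sb)

        range_manual = max(manual) - min(manual)      # excursion range, manual
        range_auto = max(automatic) - min(automatic)  # excursion range, automatic
        rel_err = abs(range_auto - range_manual) / range_manual

        print(f"Pearson r          : {pearson(manual, automatic):.3f}")
        print(f"relative range err : {rel_err:.1%}")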

  6. Semantic Metrics for Analysis of Software

    NASA Technical Reports Server (NTRS)

    Etzkorn, Letha H.; Cox, Glenn W.; Farrington, Phil; Utley, Dawn R.; Ghalston, Sampson; Stein, Cara

    2005-01-01

    A recently conceived suite of object-oriented software metrics focuses on semantic aspects of software, in contradistinction to traditional software metrics, which focus on syntactic aspects. Semantic metrics represent a more human-oriented view of software than do syntactic metrics. The semantic metrics of a given computer program are calculated from the output of a knowledge-based analysis of the program, and are substantially more representative of software quality and more readily comprehensible from a human perspective than are syntactic metrics.

  7. Analysis of the shrinkage at the thick plate part using response surface methodology

    NASA Astrophysics Data System (ADS)

    Hatta, N. M.; Azlan, M. Z.; Shayfull, Z.; Roselina, S.; Nasir, S. M.

    2017-09-01

    Injection moulding is a well-known manufacturing process, especially for producing plastic products. To ensure final product quality, many precautions must be taken, such as the parameter settings at the initial stage of the process. If these parameters are set up wrongly, defects may occur, and one of the best-known defects in the injection moulding process is shrinkage. To overcome this problem, the parameter settings must be optimally adjusted at the precaution stage, and this paper focuses on analysing shrinkage by optimising the parameters for a thick plate part with the help of Response Surface Methodology (RSM) and ANOVA analysis. In previous studies, the parameter that stood out in minimising shrinkage of the moulded part was packing pressure. Therefore, with reference to the previous literature, packing pressure was selected as a parameter setting for this study, together with three other parameters: melt temperature, cooling time and mould temperature. The analysis of the process was obtained from simulation with Autodesk Moldflow Insight (AMI) software, and the material used for the moulded part was Acrylonitrile Butadiene Styrene (ABS). The results show that shrinkage can be minimised, and the significant parameters were found to be packing pressure, mould temperature and melt temperature.
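
    Since packing pressure emerged as the dominant factor, a minimal sketch of the RSM idea follows: fit a quadratic response surface to shrinkage measurements and locate the stationary point. The data and the one-factor reduction are illustrative assumptions; the study itself varied four parameters in AMI simulations.

        # Fit shrinkage ~ b0 + b1*p + b2*p^2 by least squares and locate the
        # packing pressure minimising shrinkage (made-up data, one factor).
        import numpy as np

        pressure  = np.array([60., 70., 80., 90., 100.])   # packing pressure (MPa)
        shrinkage = np.array([1.9, 1.4, 1.1, 1.2, 1.6])    # measured shrinkage (%)

        X = np.column_stack([np.ones_like(pressure), pressure, pressure**2])
        b0, b1, b2 = np.linalg.lstsq(X, shrinkage, rcond=None)[0]

        p_opt = -b1 / (2 * b2)                             # vertex of the parabola
        s_opt = b0 + b1 * p_opt + b2 * p_opt**2
        print(f"estimated optimum packing pressure: {p_opt:.1f} MPa "
              f"(predicted shrinkage {s_opt:.2f} %)")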

  8. Architectural Analysis of Complex Evolving Systems of Systems

    NASA Technical Reports Server (NTRS)

    Lindvall, Mikael; Stratton, William C.; Sibol, Deane E.; Ray, Arnab; Ackemann, Chris; Yonkwa, Lyly; Ganesan, Dharma

    2009-01-01

    The goal of this collaborative project between FC-MD, APL, and GSFC, supported by the NASA IV&V Software Assurance Research Program (SARP), was to develop a tool, Dynamic SAVE (Dyn-SAVE for short), for analyzing architectures of systems of systems. The project team comprised the principal investigator (PI) from FC-MD and four other FC-MD scientists (part time) and several FC-MD students (full time), as well as two APL software architects (part time) and one NASA POC (part time). The PI and FC-MD scientists, together with the APL architects, were responsible for requirements analysis and for applying and evaluating the Dyn-SAVE tool and method. The PI and a group of FC-MD scientists were responsible for improving the method and conducting outreach activities, while another group of FC-MD scientists was responsible for development and improvement of the tool. Oversight and reporting were conducted by the PI and the NASA POC. The project team produced many results, including several prototypes of the Dyn-SAVE tool and method, several case studies documenting how the tool and method were applied to APL's software systems, and several papers published in highly respected conferences and journals. Dyn-SAVE, as developed and enhanced throughout this research period, is a software tool intended for software developers and architects, software integration testers, and anyone who needs to analyze a software system from the point of view of how it communicates with other systems. Using the tool, the user specifies the planned communication behavior of the system, modeled as a sequence diagram. The user then captures and imports the actual communication behavior of the system, which is converted and visualized as a sequence diagram by Dyn-SAVE. After mapping the planned behavior to the actual and specifying parameter and timing constraints, Dyn-SAVE detects and highlights deviations between the planned and the actual behavior. Requirements were specified based on the need to analyze two inter-system communication protocols representative of protocols used in the aerospace industry. The protocols are related to APL's Common Ground System (CGS) as used in the MErcury Surface, Space ENvironment, GEochemistry, and Ranging (MESSENGER) and the Radiation Belt Storm Probes (RBSP) missions. The analyzed communications were implementations of the Telemetry protocol and the CCSDS File Delivery Protocol (CFDP). Based on these requirements, three prototypes of Dyn-SAVE were developed and applied to these protocols, resulting in the detection of several issues. Dyn-SAVE was also applied to several testbeds that had previously been used for experimentation on this project, as well as to other protocols and logs, to test its broader applicability. For example, Dyn-SAVE was used to analyze 1) the communication pattern between a web browser and a web server, 2) the system log of a computer in order to detect off-nominal computer shut-down behavior, and 3) the actual test cases of NASA Goddard's Core Flight System (CFS) and automatically generated test cases in order to determine the overlap between the two sets of test cases. In all cases, Dyn-SAVE assisted in providing insightful conclusions about each of the cases identified above.
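
    The detection step described here, comparing a planned sequence against captured behavior, can be sketched in a few lines; the message names, log format and 2.0 s timing constraint below are invented for illustration and are not Dyn-SAVE's actual formats.

        # Align an actual message log against a planned sequence and flag
        # ordering deviations and timing-constraint violations.
        planned = ["REQUEST", "ACK", "DATA", "DONE"]     # planned sequence diagram
        actual = [                                       # captured (message, t_sec)
            ("REQUEST", 0.0), ("ACK", 0.4), ("DATA", 3.1), ("DONE", 3.2),
        ]
        MAX_GAP = 2.0                                    # assumed timing constraint

        prev_t = None
        for i, (msg, t) in enumerate(actual):
            if i >= len(planned) or msg != planned[i]:
                expected = planned[i] if i < len(planned) else "<end>"
                print(f"deviation at step {i}: expected {expected}, saw {msg}")
            if prev_t is not None and t - prev_t > MAX_GAP:
                print(f"timing violation before {msg}: "
                      f"gap {t - prev_t:.1f}s > {MAX_GAP}s")
            prev_t = t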

  9. Numerical continuation and bifurcation analysis in aircraft design: an industrial perspective.

    PubMed

    Sharma, Sanjiv; Coetzee, Etienne B; Lowenberg, Mark H; Neild, Simon A; Krauskopf, Bernd

    2015-09-28

    Bifurcation analysis is a powerful method for studying the steady-state nonlinear dynamics of systems. Software tools exist for the numerical continuation of steady-state solutions as parameters of the system are varied. These tools make it possible to generate 'maps of solutions' in an efficient way that provide valuable insight into the overall dynamic behaviour of a system and can potentially influence the design process. While this approach has been employed in the military aircraft control community to understand the effectiveness of controllers, the use of bifurcation analysis in the wider aircraft industry is as yet limited. This paper reports progress on how bifurcation analysis can play a role as part of the design process for passenger aircraft. © 2015 The Author(s).
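
    As a minimal illustration of numerical continuation, the sketch below traces steady states of the toy system dx/dt = mu + x - x^3, seeding Newton's method at each parameter step with the previous solution. Near the fold, plain Newton can stall or jump branches, which is why production tools use pseudo-arclength continuation; the system and step size are illustrative assumptions.

        # Natural-parameter continuation of steady states of dx/dt = mu + x - x^3.
        def f(x, mu):  return mu + x - x**3
        def dfdx(x):   return 1 - 3 * x**2

        def newton(x, mu, tol=1e-12):
            for _ in range(50):
                step = f(x, mu) / dfdx(x)
                x -= step
                if abs(step) < tol:
                    return x
            return None                  # no convergence (stalled near a fold)

        x, mu = -1.5, -1.0               # starting solution and parameter
        while mu <= 1.0:
            x_new = newton(x, mu)
            if x_new is None:
                print(f"continuation stalled near mu = {mu:.2f} (fold suspected)")
                break
            x = x_new
            print(f"mu = {mu:+.2f}  x* = {x:+.4f}")
            mu += 0.25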

  10. Mechanical System Analysis/Design Tool (MSAT) Quick Guide

    NASA Technical Reports Server (NTRS)

    Lee, HauHua; Kolb, Mark; Madelone, Jack

    1998-01-01

    MSAT is a unique multi-component, multi-disciplinary tool that organizes design analysis tasks around object-oriented representations of configuration components, analysis programs and modules, and the data transfer links between them. This creative modular architecture enables rapid generation of input streams for trade-off studies of various engine configurations. The data transfer links automatically transport output from one application as relevant input to the next application once the sequence is set up by the user. The computations are managed via constraint propagation, with the constraints supplied by the user as part of any optimization module. The software can be used in the preliminary design stage as well as during the detail design phase of the product development process.
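
    The data-transfer-link idea can be sketched as a tiny dataflow graph in which each module pushes its output to its successors once the user wires the sequence; the module names and computations below are invented for illustration, not MSAT internals.

        # Minimal dataflow sketch: outputs of one analysis module automatically
        # become the inputs of the next, once the links are set up.
        from typing import Callable

        class Module:
            def __init__(self, name: str, compute: Callable[[dict], dict]):
                self.name, self.compute, self.links = name, compute, []

            def connect(self, downstream: "Module"):
                self.links.append(downstream)          # data transfer link

            def run(self, inputs: dict):
                outputs = self.compute(inputs)
                print(f"{self.name}: {inputs} -> {outputs}")
                for m in self.links:                   # propagate automatically
                    m.run(outputs)

        aero   = Module("aero",   lambda d: {"load_kN": d["thrust_kN"] * 0.3})
        stress = Module("stress", lambda d: {"margin": 1.8 - d["load_kN"] / 100})
        aero.connect(stress)
        aero.run({"thrust_kN": 250.0})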

  11. Software Reliability Analysis of NASA Space Flight Software: A Practical Experience

    PubMed Central

    Sukhwani, Harish; Alonso, Javier; Trivedi, Kishor S.; Mcginnis, Issac

    2017-01-01

    In this paper, we present the software reliability analysis of the flight software of a recently launched space mission. For our analysis, we use the defect reports collected during the flight software development. We find that this software was developed in multiple releases, each release spanning all software life-cycle phases. We also find that the software releases were developed and tested for four different hardware platforms, ranging from off-the-shelf or emulation hardware to actual flight hardware. For releases that exhibit reliability growth or decay, we fit Software Reliability Growth Models (SRGM); otherwise we fit a distribution function. We find that most releases exhibit reliability growth, with Log-Logistic (NHPP) and S-Shaped (NHPP) as the best-fit SRGMs. For the releases that experience reliability decay, we investigate the causes. We find that such releases were the first software releases to be tested on a new hardware platform, and hence they encountered major hardware integration issues. Such releases also seem to have been developed under time pressure in order to start testing on the new hardware platform sooner. These releases exhibit poor reliability growth and hence a high predicted failure rate. Other problems include hardware specification changes and delivery delays from vendors. Thus, our analysis provides critical insights and inputs to management for improving the software development process. As NASA has moved towards product line engineering for its flight software development, software for future space missions will be developed in a similar manner, and hence the analysis results for this mission can be considered a baseline for future flight software missions. PMID:29278255
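
    As a sketch of the model-fitting step, the snippet below fits the S-Shaped NHPP mean value function m(t) = a(1 - (1 + bt)e^(-bt)) to cumulative defect counts; the defect data are synthetic, and the single-model fit is a simplification of the paper's comparison across several SRGMs.

        # Fit an S-shaped NHPP software reliability growth model to synthetic
        # cumulative defect counts and estimate the residual defect content.
        import numpy as np
        from scipy.optimize import curve_fit

        def m(t, a, b):
            return a * (1 - (1 + b * t) * np.exp(-b * t))

        weeks   = np.arange(1, 13, dtype=float)
        defects = np.array([1, 3, 7, 13, 20, 27, 33, 38, 41, 43, 44, 45],
                           dtype=float)

        (a_hat, b_hat), _ = curve_fit(m, weeks, defects, p0=(50.0, 0.3))
        print(f"estimated total defects a = {a_hat:.1f}, shape b = {b_hat:.2f}")
        print(f"predicted residual defects: {a_hat - defects[-1]:.1f}")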

  12. Software Reliability Analysis of NASA Space Flight Software: A Practical Experience.

    PubMed

    Sukhwani, Harish; Alonso, Javier; Trivedi, Kishor S; Mcginnis, Issac

    2016-01-01

    In this paper, we present the software reliability analysis of the flight software of a recently launched space mission. For our analysis, we use the defect reports collected during the flight software development. We find that this software was developed in multiple releases, each release spanning all software life-cycle phases. We also find that the software releases were developed and tested for four different hardware platforms, ranging from off-the-shelf or emulation hardware to actual flight hardware. For releases that exhibit reliability growth or decay, we fit Software Reliability Growth Models (SRGM); otherwise we fit a distribution function. We find that most releases exhibit reliability growth, with Log-Logistic (NHPP) and S-Shaped (NHPP) as the best-fit SRGMs. For the releases that experience reliability decay, we investigate the causes. We find that such releases were the first software releases to be tested on a new hardware platform, and hence they encountered major hardware integration issues. Such releases also seem to have been developed under time pressure in order to start testing on the new hardware platform sooner. These releases exhibit poor reliability growth and hence a high predicted failure rate. Other problems include hardware specification changes and delivery delays from vendors. Thus, our analysis provides critical insights and inputs to management for improving the software development process. As NASA has moved towards product line engineering for its flight software development, software for future space missions will be developed in a similar manner, and hence the analysis results for this mission can be considered a baseline for future flight software missions.

  13. The new meaning of quality in the information age.

    PubMed

    Prahalad, C K; Krishnan, M S

    1999-01-01

    Software applications are now a mission-critical source of competitive advantage for most companies. They are also a source of great risk, as the Y2K bug has made clear. Yet many line managers still haven't confronted software issues--partly because they aren't sure how best to define the quality of the applications in their IT infrastructures. Some companies such as Wal-Mart and the Gap have successfully integrated the software in their networks, but most have accumulated an unwieldy number of incompatible applications--all designed to perform the same tasks. The authors provide a framework for measuring the performance of software in a company's IT portfolio. Quality traditionally has been measured according to a product's ability to meet certain specifications; other views of quality have emerged that measure a product's adaptability to customers' needs and a product's ability to encourage innovation. To judge software quality properly, argue the authors, managers must measure applications against all three approaches. Understanding the domain of a software application is an important part of that process. The domain is the body of knowledge about a user's needs and expectations for a product. Software domains change frequently based on how a consumer chooses to use, for example, Microsoft Word or a spreadsheet application. The domain can also be influenced by general changes in technology, such as the development of a new software platform. Thus, applications can't be judged only according to whether they conform to specifications. The authors discuss how to identify domain characteristics and software risks and suggest ways to reduce the variability of software domains.

  14. Learn by Yourself: The Self-Learning Tools for Qualitative Analysis Software Packages

    ERIC Educational Resources Information Center

    Freitas, Fábio; Ribeiro, Jaime; Brandão, Catarina; Reis, Luís Paulo; de Souza, Francislê Neri; Costa, António Pedro

    2017-01-01

    Computer Assisted Qualitative Data Analysis Software (CAQDAS) are tools that help researchers to develop qualitative research projects. These software packages help the users with tasks such as transcription analysis, coding and text interpretation, writing and annotation, content search and analysis, recursive abstraction, grounded theory…

  15. Kubios HRV--heart rate variability analysis software.

    PubMed

    Tarvainen, Mika P; Niskanen, Juha-Pekka; Lipponen, Jukka A; Ranta-Aho, Perttu O; Karjalainen, Pasi A

    2014-01-01

    Kubios HRV is an advanced and easy-to-use software package for heart rate variability (HRV) analysis. The software supports several input data formats for electrocardiogram (ECG) data and beat-to-beat RR interval data. It includes an adaptive QRS detection algorithm and tools for artifact correction, trend removal and analysis sample selection. The software computes all the commonly used time-domain and frequency-domain HRV parameters and several nonlinear parameters. There are several adjustable analysis settings through which the analysis methods can be optimized for different data. The ECG-derived respiratory frequency is also computed, which is important for reliable interpretation of the analysis results. The analysis results can be saved as an ASCII text file (easy to import into MS Excel or SPSS), a Matlab MAT-file, or a PDF report. The software is easy to use through its compact graphical user interface. The software is available free of charge for Windows and Linux operating systems at http://kubios.uef.fi. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
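
    For orientation, the time-domain parameters such a package reports can be computed directly from an RR-interval series; the sketch below derives mean heart rate, SDNN and RMSSD from synthetic data, leaving out the artifact correction, detrending and frequency-domain analysis that Kubios adds.

        # Time-domain HRV measures from a short synthetic RR-interval series.
        import math

        rr = [812, 790, 805, 830, 845, 820, 798, 815, 840, 825]  # RR (ms)

        mean_rr = sum(rr) / len(rr)
        hr = 60000.0 / mean_rr                                   # beats per minute
        sdnn = math.sqrt(sum((x - mean_rr) ** 2 for x in rr) / (len(rr) - 1))
        diffs = [rr[i + 1] - rr[i] for i in range(len(rr) - 1)]
        rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))

        print(f"mean HR : {hr:.1f} bpm")
        print(f"SDNN    : {sdnn:.1f} ms")
        print(f"RMSSD   : {rmssd:.1f} ms")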

  16. SIMPATIQCO: a server-based software suite which facilitates monitoring the time course of LC-MS performance metrics on Orbitrap instruments.

    PubMed

    Pichler, Peter; Mazanek, Michael; Dusberger, Frederico; Weilnböck, Lisa; Huber, Christian G; Stingl, Christoph; Luider, Theo M; Straube, Werner L; Köcher, Thomas; Mechtler, Karl

    2012-11-02

    While the performance of liquid chromatography (LC) and mass spectrometry (MS) instrumentation continues to increase, applications such as analyses of complete or near-complete proteomes and quantitative studies require constant and optimal system performance. For this reason, research laboratories and core facilities alike are recommended to implement quality control (QC) measures as part of their routine workflows. Many laboratories perform sporadic quality control checks. However, successive and systematic longitudinal monitoring of system performance would be facilitated by dedicated automatic or semiautomatic software solutions that aid an effortless analysis and display of QC metrics over time. We present the software package SIMPATIQCO (SIMPle AuTomatIc Quality COntrol) designed for evaluation of data from LTQ Orbitrap, Q-Exactive, LTQ FT, and LTQ instruments. A centralized SIMPATIQCO server can process QC data from multiple instruments. The software calculates QC metrics supervising every step of data acquisition from LC and electrospray to MS. For each QC metric the software learns the range indicating adequate system performance from the uploaded data using robust statistics. Results are stored in a database and can be displayed in a comfortable manner from any computer in the laboratory via a web browser. QC data can be monitored for individual LC runs as well as plotted over time. SIMPATIQCO thus assists the longitudinal monitoring of important QC metrics such as peptide elution times, peak widths, intensities, total ion current (TIC) as well as sensitivity, and overall LC-MS system performance; in this way the software also helps identify potential problems. The SIMPATIQCO software package is available free of charge.
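
    The range-learning step described here can be illustrated with a median/MAD band: derive robust limits from the history of one QC metric and flag new runs that fall outside. The 3-MAD threshold and the example values are assumptions, not SIMPATIQCO's documented rule.

        # Learn an acceptable band for one QC metric with robust statistics
        # (median and MAD), then flag out-of-range LC-MS runs.
        import statistics

        history = [21.0, 20.5, 21.3, 20.8, 21.1, 20.9, 21.4, 20.7]  # peak width (s)

        med = statistics.median(history)
        mad = statistics.median(abs(x - med) for x in history)      # robust spread
        lo, hi = med - 3 * 1.4826 * mad, med + 3 * 1.4826 * mad     # ~3-sigma band

        for run, value in [("run_041", 21.2), ("run_042", 24.9)]:
            status = "ok" if lo <= value <= hi else "OUT OF RANGE"
            print(f"{run}: {value:.1f} s -> {status}")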

  17. SIMPATIQCO: A Server-Based Software Suite Which Facilitates Monitoring the Time Course of LC–MS Performance Metrics on Orbitrap Instruments

    PubMed Central

    2012-01-01

    While the performance of liquid chromatography (LC) and mass spectrometry (MS) instrumentation continues to increase, applications such as analyses of complete or near-complete proteomes and quantitative studies require constant and optimal system performance. For this reason, research laboratories and core facilities alike are recommended to implement quality control (QC) measures as part of their routine workflows. Many laboratories perform sporadic quality control checks. However, successive and systematic longitudinal monitoring of system performance would be facilitated by dedicated automatic or semiautomatic software solutions that aid an effortless analysis and display of QC metrics over time. We present the software package SIMPATIQCO (SIMPle AuTomatIc Quality COntrol) designed for evaluation of data from LTQ Orbitrap, Q-Exactive, LTQ FT, and LTQ instruments. A centralized SIMPATIQCO server can process QC data from multiple instruments. The software calculates QC metrics supervising every step of data acquisition from LC and electrospray to MS. For each QC metric the software learns the range indicating adequate system performance from the uploaded data using robust statistics. Results are stored in a database and can be displayed in a comfortable manner from any computer in the laboratory via a web browser. QC data can be monitored for individual LC runs as well as plotted over time. SIMPATIQCO thus assists the longitudinal monitoring of important QC metrics such as peptide elution times, peak widths, intensities, total ion current (TIC) as well as sensitivity, and overall LC–MS system performance; in this way the software also helps identify potential problems. The SIMPATIQCO software package is available free of charge. PMID:23088386

  18. [Three-dimensional finite element stress distribution and displacement analysis of alveolar ridge retained by conical telescope].

    PubMed

    Lin, Ying-he; Man, Yi; Liang, Xing; Qu, Yi-li; Lu, Xuan

    2004-11-01

    To study the stress distribution and displacement of the edentulous alveolar ridge under a removable partial denture retained by conical telescopic crowns. An idealized three-dimensional finite element model was constructed using an SCT image-reconstruction technique, self-developed programs and ANSYS software, and a static load was applied. The stress and displacement characteristics of the different materials forming the metal part of the conical telescope were compared and analyzed. Generally, the four materials produced almost the same stress and displacement at the site of the edentulous alveolar ridge. From the viewpoint of mechanics, the application of different materials in making the metal part of the conical telescope is feasible.

  19. An online database for plant image analysis software tools.

    PubMed

    Lobet, Guillaume; Draye, Xavier; Périlleux, Claire

    2013-10-09

    Recent years have seen an increase in methods for plant phenotyping using image analyses. These methods require new software solutions for data extraction and treatment. These solutions are instrumental in supporting various research pipelines, ranging from the localisation of cellular compounds to the quantification of tree canopies. However, due to the variety of existing tools and the lack of a central repository, it is challenging for researchers to identify the software best suited to their research. We present an online, manually curated database referencing more than 90 plant image analysis software solutions. The website, plant-image-analysis.org, presents each software tool in a uniform and concise manner, enabling users to identify the available solutions for their experimental needs. The website also enables user feedback, evaluations and new software submissions. The plant-image-analysis.org database provides an overview of existing plant image analysis software. The aim of such a toolbox is to help users find solutions, and to provide developers a way to exchange and communicate about their work.

  20. GWAMA: software for genome-wide association meta-analysis.

    PubMed

    Mägi, Reedik; Morris, Andrew P

    2010-05-28

    Despite the recent success of genome-wide association studies in identifying novel loci contributing effects to complex human traits, such as type 2 diabetes and obesity, much of the genetic component of variation in these phenotypes remains unexplained. One way of improving the power to detect further novel loci is through meta-analysis of studies from the same population, increasing the sample size beyond that of any individual study. Although statistical analysis software packages incorporate routines for meta-analysis, they are ill-equipped to meet the challenges of the scale and complexity of data generated in genome-wide association studies. We have developed flexible, open-source software for the meta-analysis of genome-wide association studies. The software incorporates a variety of error-trapping facilities and provides a range of meta-analysis summary statistics. The software is distributed with scripts that allow simple formatting of files containing the results of each association study and generate graphical summaries of genome-wide meta-analysis results. The GWAMA (Genome-Wide Association Meta-Analysis) software has been developed to perform meta-analysis of summary statistics generated from genome-wide association studies of dichotomous phenotypes or quantitative traits. Software with source files, documentation and example data files are freely available online at http://www.well.ox.ac.uk/GWAMA.
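
    The core computation of such a tool is fixed-effect inverse-variance meta-analysis; the sketch below combines per-study effect sizes and standard errors for a single SNP. The numbers are invented, and real tools like GWAMA add error trapping, allele alignment and random-effects models on top.

        # Fixed-effect inverse-variance meta-analysis of one SNP across cohorts.
        import math

        studies = [        # (beta, standard error) for the same SNP per cohort
            (0.12, 0.05),
            (0.08, 0.04),
            (0.15, 0.07),
        ]

        weights = [1.0 / se**2 for _, se in studies]           # inverse-variance
        beta_meta = (sum(w * b for (b, _), w in zip(studies, weights))
                     / sum(weights))
        se_meta = math.sqrt(1.0 / sum(weights))
        z = beta_meta / se_meta

        print(f"meta beta = {beta_meta:.3f}, SE = {se_meta:.3f}, Z = {z:.2f}")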
