Science.gov

Sample records for focused application software

  1. Knowledge focus via software agents

    NASA Astrophysics Data System (ADS)

    Henager, Donald E.

    2001-09-01

    The essence of military Command and Control (C2) is making knowledge-intensive decisions in a limited amount of time using uncertain, incorrect, or outdated information. It is essential to provide decision-makers with tools that provide: * Management of friendly forces by treating the "friendly resources as a system". * Rapid assessment of the effects of military actions against the "enemy as a system". * Assessment of how an enemy should, can, and could react to friendly military activities. Software agents in the form of mission agents, target agents, maintenance agents, and logistics agents can meet this information challenge. The role of each agent is to know all the details about its assigned mission, target, maintenance, or logistics entity. The Mission Agent would fight for mission resources based on the mission priority and analyze the effect that a proposed mission's results would have on the enemy. The Target Agent (TA) communicates with other targets to determine its role in the system of targets. A system of TAs would be able to inform a planner or analyst of the status of a system of targets, the effect of that status, and the effect of attacks on that system. The system of TAs would also be able to analyze possible enemy reactions to attack by determining ways to minimize the effect of attack, such as rerouting traffic or using deception. The Maintenance Agent would schedule maintenance events and notify the maintenance unit. The Logistics Agent would manage shipment and delivery of supplies to maintain appropriate levels of weapons, fuel, and spare parts. The central idea underlying this use of software agents is knowledge focus. Software agents are created automatically to focus their attention on individual real-world entities (e.g., missions, targets) and view the world from that entity's perspective. The agent autonomously monitors the entity, identifies problems/opportunities, formulates solutions, and informs the decision-maker. The agent must be
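
The one-agent-per-entity pattern described above can be sketched in a few lines. This is a minimal illustration only; the class, method, and entity names are hypothetical, not taken from the paper:

```python
from dataclasses import dataclass, field

@dataclass
class TargetAgent:
    """Hypothetical sketch of a knowledge-focused agent: it tracks one
    real-world entity and raises findings for the decision-maker."""
    target_id: str
    status: str = "operational"
    findings: list = field(default_factory=list)

    def observe(self, new_status: str) -> None:
        # Monitor the entity: detect a change and formulate a finding.
        if new_status != self.status:
            self.findings.append(
                f"target {self.target_id}: {self.status} -> {new_status}"
            )
        self.status = new_status

# A system of agents, one per entity, each viewing the world from its
# entity's perspective.
agents = {tid: TargetAgent(tid) for tid in ("bridge-1", "depot-2")}
agents["bridge-1"].observe("degraded")
print(agents["bridge-1"].findings)
```

The design point is that each agent holds only the knowledge about its own entity; system-level assessments emerge from querying the collection of agents.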

  2. SU-E-J-04: Integration of Interstitial High Intensity Therapeutic Ultrasound Applicators On a Clinical MRI-Guided High Intensity Focused Ultrasound Treatment Planning Software Platform

    SciTech Connect

    Ellens, N; Partanen, A; Ghoshal, G; Burdette, E; Farahani, K

    2015-06-15

    Purpose: Interstitial high intensity therapeutic ultrasound (HITU) applicators can be used to ablate tissue percutaneously, allowing for minimally-invasive treatment without ionizing radiation [1,2]. The purpose of this study was to evaluate the feasibility and usability of combining multielement interstitial HITU applicators with a clinical magnetic resonance imaging (MRI)-guided focused ultrasound software platform. Methods: The Sonalleve software platform (Philips Healthcare, Vantaa, Finland) combines anatomical MRI for target selection and multi-planar MRI thermometry to provide real-time temperature information. The MRI-compatible interstitial US applicators (Acoustic MedSystems, Savoy, IL, USA) had 1–4 cylindrical US elements, each 1 cm long with either 180° or 360° of active surface. Each applicator (4 Fr diameter, enclosed within a 13 Fr flexible catheter) was inserted into a tissue-mimicking agar-silica phantom. Degassed water was circulated around the transducers for cooling and coupling. Based on the location of the applicator, a virtual transducer overlay was added to the software to assist targeting and to allow automatic thermometry slice placement. The phantom was sonicated at 7 MHz for 5 minutes with 6–8 W of acoustic power for each element. MR thermometry data were collected during and after sonication. Results: Preliminary testing indicated that the applicator location could be identified in the planning images and the transducer locations predicted within 1 mm accuracy using the overlay. Ablation zones (thermal dose ≥ 240 CEM43) for 2 active, adjacent US elements ranged from 18 mm × 24 mm (width × length) to 25 mm × 25 mm for the 6 W and 8 W sonications, respectively. Conclusion: The combination of interstitial HITU applicators and this software platform holds promise for novel approaches in minimally-invasive MRI-guided therapy, especially when bony structures or air-filled cavities may preclude extracorporeal HIFU.[1] Diederich et al
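
The 240 CEM43 ablation threshold quoted above is a cumulative-equivalent-minutes thermal dose. The standard Sapareto-Dewey formulation (background knowledge, not part of this abstract) sums R^(43-T) over the temperature history, with R = 0.5 at or above 43 °C and R = 0.25 below:

```python
def cem43(temps_c, dt_min):
    """Cumulative equivalent minutes at 43 deg C (Sapareto-Dewey dose).
    temps_c: sampled temperatures in deg C; dt_min: sample interval in minutes."""
    dose = 0.0
    for t in temps_c:
        r = 0.5 if t >= 43.0 else 0.25
        dose += r ** (43.0 - t) * dt_min
    return dose

# 5 minutes held at 47 deg C: each minute counts as 0.5**(43-47) = 16
# equivalent minutes, giving 80 CEM43 -- below the 240 CEM43 threshold.
print(cem43([47.0] * 5, 1.0))
```

With the same 5-minute hold at 49 °C the dose is 320 CEM43, above the ablation threshold, which is why small temperature differences matter in MR thermometry.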

  3. Cartographic applications software

    USGS Publications Warehouse

    U.S. Geological Survey

    1992-01-01

    The Office of the Assistant Division Chief for Research, National Mapping Division, develops computer software for the solution of geometronic problems in the fields of surveying, geodesy, remote sensing, and photogrammetry. Software that has been developed using public funds is available on request for a nominal charge to recover the cost of duplication.

  4. CDP: application of focus drilling

    NASA Astrophysics Data System (ADS)

    Geisler, S.; Bauer, J.; Haak, U.; Schulz, K.; Old, G.; Matthus, E.

    2009-01-01

    The achievement of the depth of focus required for stable process conditions is one of the biggest challenges in modern optical photolithography. There are several ways of improving the depth of focus. For line/space layers, for instance, the application of RET (Resolution Enhancement Technology) using scattering bars, phase-shift masks or optimized illumination systems has shown good results. For contact and via layers the depth of focus is limited and critical, due to the structure size of the holes, alternating pattern density and wafer topography. A well-known method of improving the depth of focus for contact and via layers is called focus latitude enhancement exposure (FLEX) [1-3]. With FLEX, several focal planes are exposed, each during a separate exposure step. The main drawback is low throughput, as the total processing time rises with each additional exposure. In this paper, we investigate Nikon's CDP (continuous depth of focus expansion procedure) [4]. The CDP option is applicable to modern scanning exposure tools [4-5]. A schematic view of the procedure is shown in Fig. 1. The CDP value or CDP amplitude defines the tilt of the wafer and thus the range of focus in the resist, as the focal plane migrates through the resist during the exposure. The main advantage of CDP, compared to FLEX, is higher throughput, since the focal planes are defined within a single exposure. A non-CDP exposure may result in varying aerial images within the resist thickness, leading to decreased image contrast in out-of-focus planes. As shown in Fig. 1, the averaged aerial images of a CDP exposure yield better image contrast throughout the resist layer and therefore increase the focus window.

  5. Ray-tracing software comparison for linear focusing solar collectors

    NASA Astrophysics Data System (ADS)

    Osório, Tiago; Horta, Pedro; Larcher, Marco; Pujol-Nadal, Ramón; Hertel, Julian; van Rooyen, De Wet; Heimsath, Anna; Schneider, Simon; Benitez, Daniel; Frein, Antoine; Denarie, Alice

    2016-05-01

    Ray-Tracing software tools have been widely used in the optical design of solar concentrating collectors. In spite of the ability of these tools to assess the geometrical and material aspects impacting the optical performance of concentrators, their use in combination with experimental measurements in the framework of collector testing procedures has not been implemented, to date, in any of the current solar collector testing standards. In the latest revision of ISO 9806 an effort was made to include linear focusing concentrating collectors, but some practical and theoretical difficulties emerged. A Ray-Tracing analysis could provide important contributions to overcoming these issues, complementing the experimental results obtained through thermal testing and enabling more thorough testing outputs with lower experimental requirements. In order to evaluate the different available software tools, a comparison study was conducted. Taking the Parabolic Trough Collector and the Linear Fresnel Reflector Collector as representative line-focus concentrator technologies, two exemplary cases with predefined conditions - geometry, sun model and material properties - were simulated with the different software tools. This work was carried out within IEA/SHC Task 49 "Solar Heat Integration in Industrial Processes".

  6. Risk reduction using DDP (Defect Detection and Prevention): Software support and software applications

    NASA Technical Reports Server (NTRS)

    Feather, M. S.

    2001-01-01

    Risk assessment and mitigation is the focus of the Defect Detection and Prevention (DDP) process, which has been applied to spacecraft technology assessments and planning, both hardware and software. DDP's major elements and their relevance to core requirement engineering concerns are summarized. The accompanying research demonstration illustrates DDP's tool support, and further customizations for application to software.

  7. The LBT double prime focus camera control software

    NASA Astrophysics Data System (ADS)

    Di Paola, Andrea; Baruffolo, Andrea; Gallozzi, Stefano; Pedichini, Fernando; Speziali, Roberto

    2004-09-01

    The LBT double prime focus camera (LBC) is composed of twin CCD mosaic imagers. The instrument is designed to match the double-channel structure of the LBT telescope and to exploit its parallel observing mode by optimizing one camera for the blue and the other for the red side of the visible spectrum. As a result, LBC activity will likely consist of simultaneous multi-wavelength observation of specific targets, with both channels working at the same time to acquire and download images at different rates. The LBC Control Software is responsible for coordinating these activities by managing the scientific sensors and all the ancillary devices such as rotators, filter wheels, optical corrector focusing, housekeeping information, tracking, and Active Optics wavefront sensors. This is achieved using four dedicated PCs to control the four CCD controllers and one dual-processor PC to manage all the other aspects, including the instrument operator interface. The general architecture of the LBC Control Software is described, as well as solutions and details about its implementation.

  8. Portable Medical Laboratory Applications Software

    PubMed Central

    Silbert, Jerome A.

    1983-01-01

    Portability implies that a program can be run on a variety of computers with minimal software revision. The advantages of portability are outlined and design considerations for portable laboratory software are discussed. Specific approaches for achieving this goal are presented.

  9. Software Component Technologies and Space Applications

    NASA Technical Reports Server (NTRS)

    Batory, Don

    1995-01-01

    In the near future, software systems will be more reconfigurable than hardware. This will be possible through the advent of software component technologies which have been prototyped in universities and research labs. In this paper, we outline the foundations for those technologies and suggest how they might impact software for space applications.

  10. Software testability and its application to avionic software

    NASA Technical Reports Server (NTRS)

    Voas, Jeffrey M.; Miller, Keith W.; Payne, Jeffery E.

    1993-01-01

    Randomly generated black-box testing is an established yet controversial method of estimating software reliability. Unfortunately, as software applications have required higher reliabilities, practical difficulties with black-box testing have become increasingly problematic. These practical problems are particularly acute in life-critical avionics software, where requirements of 10^-7 failures per hour of system reliability can translate into a probability of failure (POF) of perhaps 10^-9 or less for each individual execution of the software. This paper describes the application of one type of testability analysis called 'sensitivity analysis' to B-737 avionics software; one application of sensitivity analysis is to quantify whether software testing is capable of detecting faults in a particular program and thus whether we can be confident that a tested program is not hiding faults. We do so by finding the testabilities of the individual statements of the program, and then use those statement testabilities to find the testabilities of the functions and modules. For the B-737 system we analyzed, we were able to isolate those functions that are more prone to hide errors during system/reliability testing.

  11. Programming Language Software For Graphics Applications

    NASA Technical Reports Server (NTRS)

    Beckman, Brian C.

    1993-01-01

    New approach reduces repetitive development of features common to different applications. High-level programming language and interactive environment with access to graphical hardware and software created by adding graphical commands and other constructs to standardized, general-purpose programming language, "Scheme". Designed for use in developing other software incorporating interactive computer-graphics capabilities into application programs. Provides alternative to programming entire applications in C or FORTRAN, specifically ameliorating design and implementation of complex control and data structures typifying applications with interactive graphics. Enables experimental programming and rapid development of prototype software, and yields high-level programs serving as executable versions of software-design documentation.

  12. Focused fault injection testing of software implemented fault tolerance mechanisms of Voltan TMR nodes

    NASA Astrophysics Data System (ADS)

    Tao, S.; Ezhilchelvan, P. D.; Shrivastava, S. K.

    1995-03-01

    One way of gaining confidence in the adequacy of the fault tolerance mechanisms of a system is to test the system by injecting faults and observing how the system performs under faulty conditions. This paper presents an application of the focused fault injection method that has been developed for testing software implemented fault tolerance mechanisms of distributed systems. The method exploits the object oriented approach of software implementation to support the injection of specific classes of faults. With the focused fault injection method, the system tester is able to inject specific classes of faults (including malicious ones) such that the fault tolerance mechanisms of a target system can be tested adequately. The method has been applied to test the design and implementation of the voting, clock synchronization, and ordering modules of the Voltan TMR (triple modular redundant) node. The tests performed uncovered three flaws in the system software.
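
As a minimal illustration of the idea (the voter and injection helper below are hypothetical sketches, not Voltan code), a focused injection corrupts one specific replica of a TMR voter and checks whether the mechanism masks it, and what happens outside the assumed single-fault model:

```python
from collections import Counter

def majority_vote(replies):
    """Majority voter for a TMR node: returns the value reported by at
    least two of the three replicas, or None if there is no majority."""
    value, count = Counter(replies).most_common(1)[0]
    return value if count >= 2 else None

def inject_fault(replies, index, bad_value):
    # Focused fault injection: corrupt one specific replica's reply.
    faulty = list(replies)
    faulty[index] = bad_value
    return faulty

correct = [42, 42, 42]
print(majority_vote(inject_fault(correct, 0, 99)))  # single fault is masked
# Two identical malicious replies defeat a majority voter -- the kind of
# behavior focused injection of malicious fault classes is meant to expose:
print(majority_vote(inject_fault(inject_fault(correct, 0, 99), 1, 99)))
```

Targeting a named replica (rather than injecting faults at random) is what makes the injection "focused": each fault class can be driven at the exact mechanism it is supposed to exercise.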

  13. Software applications for flux balance analysis.

    PubMed

    Lakshmanan, Meiyappan; Koh, Geoffrey; Chung, Bevan K S; Lee, Dong-Yup

    2014-01-01

    Flux balance analysis (FBA) is a widely used computational method for characterizing and engineering intrinsic cellular metabolism. The increasing number of its successful applications and growing popularity are possibly attributable to the availability of specific software tools for FBA. Each tool has its unique features and limitations with respect to operational environment, user interface and supported analysis algorithms. Presented herein is an in-depth evaluation of currently available FBA applications, focusing mainly on usability, functionality, graphical representation and interoperability. Overall, most of the applications are able to perform the basic features of model creation and FBA simulation. The COBRA toolbox, OptFlux and FASIMU are versatile enough to support advanced in silico algorithms for identifying environmental and genetic targets for strain design. SurreyFBA, WEbcoli, Acorn, FAME, GEMSiRV and MetaFluxNet are distinct in providing user-friendly interfaces for model handling. In terms of software architecture, FBA-SimVis and OptFlux offer flexible environments, as they enable plug-in/add-on features to aid prospective functional extensions. Notably, an increasing trend towards the implementation of more tailored e-services, such as central model repositories and assistance for collaborative efforts, was observed among the web-based applications, with the help of advanced web technologies. Furthermore, most recent applications such as Model SEED, FAME, MetaFlux and MicrobesFlux have even included several routines to facilitate the reconstruction of genome-scale metabolic models. Finally, a brief discussion on the future directions of FBA applications is provided for the benefit of potential tool developers. PMID:23131418
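
Underneath all of the tools reviewed, FBA is a linear program: maximize an objective flux c·v subject to the steady-state constraint S v = 0 and flux bounds lb ≤ v ≤ ub. For a toy linear pathway the LP degenerates to taking the tightest bound along the chain, which a few lines can show without any solver (the network below is invented for illustration, not from any of the cited tools):

```python
# Toy stoichiometric matrix S (rows = metabolites A, B; columns =
# reactions v1: ->A, v2: A->B, v3: B-> biomass):
S = [
    [1, -1, 0],   # metabolite A balance
    [0, 1, -1],   # metabolite B balance
]
upper = [10.0, 8.0, float("inf")]  # flux upper bounds

# Steady state (S v = 0) forces v1 = v2 = v3 in this chain, so the FBA
# optimum (maximize v3) is the tightest upper bound along the pathway.
v = min(upper)
fluxes = [v, v, v]

# Verify the steady-state constraint holds for the chosen flux vector.
for row in S:
    assert abs(sum(s * f for s, f in zip(row, fluxes))) < 1e-9
print(fluxes)
```

Real genome-scale models have thousands of reactions and require an LP solver; the reviewed tools differ mainly in how they wrap that solver with model handling and analysis algorithms.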

  14. Collaborative Software and Focused Distraction in the Classroom

    ERIC Educational Resources Information Center

    Rhine, Steve; Bailey, Mark

    2011-01-01

    In search of strategies for increasing their pre-service teachers' thoughtful engagement with content and in an effort to model connection between choice of technology and pedagogical goals, the authors utilized collaborative software during class time. Collaborative software allows all students to write simultaneously on a single collective…

  15. Approaching Parallelization of Payload Software Application on ARM Multicore Platforms

    NASA Astrophysics Data System (ADS)

    Bretault, Pierre; Chatonnay, Nicolas; Calmet, Brigitte

    2015-09-01

    This paper is the result of a study carried out by Thales Alenia Space (TAS) in collaboration with the French space agency (CNES). It introduces an approach for parallelizing a payload-oriented software application. The first part of the paper addresses the different issues a software engineer faces when starting software development on a multicore platform. The second part presents, through a concrete case study, the iterative approach we adopted to distribute a payload-oriented software application. The case study consists in parallelizing a full software signal processing chain running on top of an Execution Platform composed of a PikeOS hypervisor and an ARM quad-core platform (Cortex-A9). Finally, the conclusion of the paper focuses on the lessons learned from such a development.

  16. Database Handling Software and Scientific Applications.

    ERIC Educational Resources Information Center

    Gabaldon, Diana J.

    1984-01-01

    Discusses the general characteristics of database management systems and file systems. Also gives a basic framework for evaluating such software and suggests characteristics that should be considered when buying software for specific scientific applications. A list of vendor addresses for popular database management systems is included. (JN)

  17. Firing Room Remote Application Software Development

    NASA Technical Reports Server (NTRS)

    Liu, Kan

    2015-01-01

    The Engineering and Technology Directorate (NE) at National Aeronautics and Space Administration (NASA) Kennedy Space Center (KSC) is designing a new command and control system for the checkout and launch of Space Launch System (SLS) and future rockets. The purposes of the semester long internship as a remote application software developer include the design, development, integration, and verification of the software and hardware in the firing rooms, in particular with the Mobile Launcher (ML) Launch Accessories (LACC) subsystem. In addition, a software test verification procedure document was created to verify and checkout LACC software for Launch Equipment Test Facility (LETF) testing.

  18. Focus: Design and Evaluation of a Software Tool for Collecting Reader Feedback.

    ERIC Educational Resources Information Center

    de Jong, Menno; Lentz, Leo

    2001-01-01

    Describes "Focus," a software tool for collecting reader comments more efficiently. Discusses the design and rationale of the software. Notes that results obtained using Focus were compared to the reader feedback collected under the plus-minus method. Concludes that Focus participants appeared to comment more from a reviewer's and less from a…

  19. Student Use of Applications Software.

    ERIC Educational Resources Information Center

    Lockheed, Marlaine E.; And Others

    This report studies the learning and use of computer applications programs--i.e., word processing programs, database management programs, and spreadsheets--in elementary and secondary school computer literacy courses. This study was designed to identify teachers' pedagogical assumptions and expectations underlying the use of applications software…

  20. AMMOS software: method and application.

    PubMed

    Pencheva, T; Lagorce, D; Pajeva, I; Villoutreix, B O; Miteva, M A

    2012-01-01

    Recent advances in computational sciences enabled extensive use of in silico methods in projects at the interface between chemistry and biology. Among them virtual ligand screening, a modern set of approaches, facilitates hit identification and lead optimization in drug discovery programs. Most of these approaches require the preparation of the libraries containing small organic molecules to be screened or a refinement of the virtual screening results. Here we present an overview of the open source AMMOS software, which is a platform performing an automatic procedure that allows for a structural generation and optimization of drug-like molecules in compound collections, as well as a structural refinement of protein-ligand complexes to assist in silico screening exercises. PMID:22183534

  1. Technological applications of focusing optical elements

    NASA Astrophysics Data System (ADS)

    Abul'khanov, Stanislav R.

    2015-03-01

    The article analyzes a wide range of technologies enabled by the application of focusators of laser radiation. I give a brief review of methods for monitoring substrates and forming a diffraction microrelief, of optical systems and devices for experimental research on focusators, and of laser technologies and units based on them. In particular, I analyze the use of a focusator into a ring for growing single-crystalline fibers in a mini-pedestal device, the use of a focusator into a set of rings in an information-measuring system for three-dimensional control of grid spacers, and other applications of focusators.

  2. Software engineering with application-specific languages

    NASA Technical Reports Server (NTRS)

    Campbell, David J.; Barker, Linda; Mitchell, Deborah; Pollack, Robert H.

    1993-01-01

    Application-Specific Languages (ASL's) are small, special-purpose languages that are targeted to solve a specific class of problems. Using ASL's on software development projects can provide considerable cost savings, reduce risk, and enhance quality and reliability. ASL's provide a platform for reuse within a project or across many projects and enable less-experienced programmers to tap into the expertise of application-area experts. ASL's have been used on several software development projects for the Space Shuttle Program. On these projects, the use of ASL's resulted in considerable cost savings over conventional development techniques. Two of these projects are described.

  3. Workshop and conference on Grand Challenges applications and software technology

    SciTech Connect

    Not Available

    1993-12-31

    On May 4--7, 1993, nine federal agencies sponsored a four-day meeting on Grand Challenge applications and software technology. The objective was to bring High-Performance Computing and Communications (HPCC) Grand Challenge applications research groups supported under the federal HPCC program together with HPCC software technologists to: discuss multidisciplinary computational science research issues and approaches, identify major technology challenges facing users and providers, and refine software technology requirements for Grand Challenge applications research. The first day and a half focused on applications. Presentations were given by speakers from universities, national laboratories, and government agencies actively involved in Grand Challenge research. Five areas of research were covered: environmental and earth sciences; computational physics; computational biology, chemistry, and materials sciences; computational fluid and plasma dynamics; and applications of artificial intelligence. The next day and a half was spent in working groups in which the applications researchers were joined by software technologists. Nine breakout sessions took place: I/O, Data, and File Systems; Parallel Programming Paradigms; Performance Characterization and Evaluation of Massively Parallel Processing Applications; Program Development Tools; Building Multidisciplinary Applications; Algorithms and Libraries I; Algorithms and Libraries II; Graphics and Visualization; and National HPCC Infrastructure.

  4. Firing Room Remote Application Software Development & Swamp Works Laboratory Robot Software Development

    NASA Technical Reports Server (NTRS)

    Garcia, Janette

    2016-01-01

    The National Aeronautics and Space Administration (NASA) is creating a way to send humans beyond low Earth orbit, and later to Mars. Kennedy Space Center (KSC) is working to make this possible by developing a Spaceport Command and Control System (SCCS) which will allow the launch of the Space Launch System (SLS). This paper focuses on the work performed by the author during the first and second parts of her internship as a remote application software developer. During the first part of her internship, the author worked on the SCCS's software application layer by assisting multiple ground subsystems teams, including Launch Accessories (LACC) and Environmental Control System (ECS), on the design, development, integration, and testing of remote control software applications. During the second part of the internship, the author worked on the development of robot software at the Swamp Works Laboratory, a research and technology development group that focuses on inventing new technology to help future In-Situ Resource Utilization (ISRU) missions.

  5. E-learning for Critical Thinking: Using Nominal Focus Group Method to Inform Software Content and Design

    PubMed Central

    Parker, Steve; Mayner, Lidia; Michael Gillham, David

    2015-01-01

    Background: Undergraduate nursing students are often confused by multiple understandings of critical thinking. In response to this situation, the Critiique for critical thinking (CCT) project was implemented to provide consistent structured guidance about critical thinking. Objectives: This paper introduces Critiique software, describes initial validation of the content of this critical thinking tool and explores wider applications of the Critiique software. Materials and Methods: Critiique is flexible, authorable software that guides students step-by-step through critical appraisal of research papers. The spelling of Critiique was deliberate, so as to acquire a unique web domain name and associated logo. The CCT project involved implementation of a modified nominal focus group process with academic staff working together to establish common understandings of critical thinking. Previous work established a consensus about critical thinking in nursing and provided a starting point for the focus groups. The study was conducted at an Australian university campus with the focus group guided by open ended questions. Results: Focus group data established categories of content that academic staff identified as important for teaching critical thinking. This emerging focus group data was then used to inform modification of Critiique software so that students had access to consistent and structured guidance in relation to critical thinking and critical appraisal. Conclusions: The project succeeded in using focus group data from academics to inform software development while at the same time retaining the benefits of broader philosophical dimensions of critical thinking. PMID:26835469

  6. Firing Room Remote Application Software Development

    NASA Technical Reports Server (NTRS)

    Liu, Kan

    2014-01-01

    The Engineering and Technology Directorate (NE) at National Aeronautics and Space Administration (NASA) Kennedy Space Center (KSC) is designing a new command and control system for the checkout and launch of Space Launch System (SLS) and future rockets. The purposes of the semester long internship as a remote application software developer include the design, development, integration, and verification of the software and hardware in the firing rooms, in particular with the Mobile Launcher (ML) Launch Accessories subsystem. In addition, a Conversion Fusion project was created to show specific approved checkout and launch engineering data for public-friendly display purposes.

  7. Earth Resources Laboratory Applications Software (ELAS)

    NASA Technical Reports Server (NTRS)

    Balcerek, T. W.

    1981-01-01

    Implementation of the Earth Resources Laboratory applications software (ELAS) system is described. A Data General Eclipse model S/230 minicomputer is employed for image processing of LANDSAT data. A 16-bit word is used, and the smaller addressability necessitates reducing some of the array sizes. All INTEGER*4 variables were changed to REAL. Interfacing the ELAS software to Data General's FORTRAN-callable runtime routines required rewriting the input/output routines and the subroutines that bring in the various overlays. Overlay relinking is required when a change is made in resident routines. The ELAS system and its user documentation are evaluated.

  8. Managing configuration software of ground software applications with glueware

    NASA Technical Reports Server (NTRS)

    Larsen, B.; Herrera, R.; Sesplaukis, T.; Cheng, L.; Sarrel, M.

    2003-01-01

    This paper reports on a simple, low-cost effort to streamline the configuration of the uplink software tools. Even though the existing ground system consisted of JPL and custom Cassini software rather than COTS, we chose a glueware approach--reintegrating with wrappers and bridges and adding minimal new functionality.

  9. A Software Architecture for High Level Applications

    SciTech Connect

    Shen,G.

    2009-05-04

    A modular software platform for high level applications is under development at the National Synchrotron Light Source II project. This platform is based on a client-server architecture, and the components of high level applications on this platform will be modular and distributed, and therefore reusable. An online model server is indispensable for model-based control. Different accelerator facilities have different requirements for online simulation. To support various accelerator simulators, a set of narrow and general application programming interfaces has been developed based on Tracy-3 and Elegant. This paper describes the system architecture for the modular high level applications, the design of the narrow and general application programming interfaces for an online model server, and a prototype of the online model server.

  10. 36 CFR 1194.21 - Software applications and operating systems.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 36 Parks, Forests, and Public Property 3 2013-07-01 2012-07-01 true Software applications and... Standards § 1194.21 Software applications and operating systems. (a) When software is designed to run on a...) Software shall not use flashing or blinking text, objects, or other elements having a flash or...

  11. 36 CFR 1194.21 - Software applications and operating systems.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 36 Parks, Forests, and Public Property 3 2011-07-01 2011-07-01 false Software applications and... Standards § 1194.21 Software applications and operating systems. (a) When software is designed to run on a...) Software shall not use flashing or blinking text, objects, or other elements having a flash or...

  12. Parallel Algorithms and Software for Nuclear, Energy, and Environmental Applications. Part II: Multiphysics Software

    SciTech Connect

    Derek Gaston; Luanjing Guo; Glen Hansen; Hai Huang; Richard Johnson; Dana Knoll; Chris Newman; Hyeong Kae Park; Robert Podgorney; Michael Tonks; Richard Williamson

    2012-09-01

    This paper is the second part of a two part sequence on multiphysics algorithms and software. The first [1] focused on the algorithms; this part treats the multiphysics software framework and applications based on it. Tight coupling is typically designed into the analysis application at inception, as such an application is strongly tied to a composite nonlinear solver that arrives at the final solution by treating all equations simultaneously. The application must also take care to minimize both time and space error between the physics, particularly if more than one mesh representation is needed in the solution process. This paper presents an application framework that was specifically designed to support tightly coupled multiphysics analysis. The Multiphysics Object Oriented Simulation Environment (MOOSE) is based on the Jacobian-free Newton-Krylov (JFNK) method combined with physics-based preconditioning to provide the underlying mathematical structure for applications. The report concludes with the presentation of a host of nuclear, energy, and environmental applications that demonstrate the efficacy of the approach and the utility of a well-designed multiphysics framework.
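    The Jacobian-free Newton-Krylov approach named in the abstract can be sketched with an off-the-shelf solver that treats all coupled equations simultaneously. The toy two-field residual below is purely illustrative, not MOOSE code.

    ```python
    import numpy as np
    from scipy.optimize import newton_krylov

    def lap(f):
        """Discrete 1-D Laplacian with zero Dirichlet boundaries."""
        return np.concatenate(([f[1] - 2 * f[0]],
                               f[2:] - 2 * f[1:-1] + f[:-2],
                               [f[-2] - 2 * f[-1]]))

    # Monolithic residual for two coupled stand-in fields T and c: solving
    # them together in one nonlinear system is the "tight coupling" the
    # paper describes, and newton_krylov never forms the Jacobian explicitly.
    def residual(u):
        n = u.size // 2
        T, c = u[:n], u[n:]
        r = np.empty_like(u)
        r[:n] = lap(T) + c - T      # "heat"-like residual, coupled to c
        r[n:] = lap(c) - c * T      # "species"-like residual, coupled to T
        return r

    u0 = 0.1 * np.ones(40)          # initial guess for both fields
    sol = newton_krylov(residual, u0, f_tol=1e-10)
    assert np.linalg.norm(residual(sol)) < 1e-8
    ```

    A production framework would add physics-based preconditioning to accelerate the inner Krylov solves; the sketch relies on the solver defaults.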

  13. Distribution and communication in software engineering environments. Application to the HELIOS Software Bus.

    PubMed Central

    Jean, F. C.; Jaulent, M. C.; Coignard, J.; Degoulet, P.

    1991-01-01

    Modularity, distribution and integration are current trends in Software Engineering. To reach these goals HELIOS, a distributed Software Engineering Environment dedicated to the medical field, has been conceived and a prototype implemented. This environment is built from the collaboration of several well-encapsulated Software Components. This paper presents the architecture retained to allow communication between the different components and focuses on the implementation details of the Software Bus, the communication and integration vector of the currently running prototype. PMID:1807652

  14. Improved maintenance productivity through software application

    SciTech Connect

    Rohrig, S.D.

    1987-01-01

    Some twelve years ago, MRC - Mound was one of the first Department of Energy (DOE) weapons complex sites to define and implement an engineering performance standards system to aid in the management of site maintenance operations. Over succeeding years, the scope of the system was broadened to include preventive maintenance and equipment management. Portions of the system were eventually automated through applications programming utilizing Mound's mainframe computer. The Automated Maintenance Management Operations (AMMO) software package will support all the principal functions of the maintenance mission at Mound. In the subsequent pages, a few unique features of the AMMO System will be highlighted. Also to be discussed are some of the productivity philosophies that influenced the AMMO design, as well as the mechanisms within the system that will monitor the success of the system in achieving the desired results. Prior to this discussion, however, an overview of Mound's present maintenance work order execution must be briefly discussed. The AMMO System software was designed around this mode of operation while incorporating some innovative new philosophies.

  15. Software reliability models for critical applications

    SciTech Connect

    Pham, H.; Pham, M.

    1991-12-01

    This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying the existing software reliability models and proposes a state-of-the-art software reliability model that is relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault tolerant software reliability models and their related issues, (2) proposed technique for software reliability enhancement, and (3) general discussion and future research. The development of this proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.

  16. Software reliability models for critical applications

    SciTech Connect

    Pham, H.; Pham, M.

    1991-12-01

    This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying the existing software reliability models and proposes a state-of-the-art software reliability model that is relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault tolerant software reliability models and their related issues, (2) proposed technique for software reliability enhancement, and (3) general discussion and future research. The development of this proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.

  17. Applications software supporting the Spartan Attitude Control System

    NASA Technical Reports Server (NTRS)

    Stone, R. W.

    1986-01-01

    The native software supporting a single mission for the Spartan Attitude Control System can require up to 40,000 lines of code. Most of this must be rewritten for each mission. Control system engineers use an array of Applications Software Packages residing in ground computers to write each mission's flight software. These Applications Packages are written in the 'C' programming language and run under the UNIX Operating System. This paper discusses the Attitude Control Applications Software Packages and describes the purpose and design of each.

  18. Designing Control System Application Software for Change

    NASA Technical Reports Server (NTRS)

    Boulanger, Richard

    2001-01-01

    The Unified Modeling Language (UML) was used to design the Environmental Systems Test Stand (ESTS) control system software. The UML was chosen for its ability to facilitate a clear dialog between software designer and customer, from which requirements are discovered and documented in a manner which transposes directly to program objects. Applying the UML to control system software design has resulted in a baseline set of documents from which change and the effort of that change can be accurately measured. As the Environmental Systems Test Stand evolves, accurate estimates of the time and effort required to change the control system software will be made. Accurate quantification of the cost of software change can be made before implementation, improving schedule and budget accuracy.

  19. Software environment for implementing engineering applications on MIMD computers

    NASA Technical Reports Server (NTRS)

    Lopez, L. A.; Valimohamed, K. A.; Schiff, S.

    1990-01-01

    In this paper the concept for a software environment for developing engineering application systems for multiprocessor hardware (MIMD) is presented. The philosophy employed is to solve the largest problems possible in a reasonable amount of time, rather than solve existing problems faster. In the proposed environment most of the problems concerning parallel computation and handling of large distributed data spaces are hidden from the application program developer, thereby facilitating the development of large-scale software applications. Applications developed under the environment can be executed on a variety of MIMD hardware; it protects the application software from the effects of a rapidly changing MIMD hardware technology.

  20. 2003 SNL ASCI applications software quality engineering assessment report.

    SciTech Connect

    Schofield, Joseph Richard, Jr.; Ellis, Molly A.; Williamson, Charles Michael; Bonano, Lora A.

    2004-02-01

    This document describes the 2003 SNL ASCI Software Quality Engineering (SQE) assessment of twenty ASCI application code teams and the results of that assessment. The purpose of this assessment was to determine code team compliance with the Sandia National Laboratories ASCI Applications Software Quality Engineering Practices, Version 2.0 as part of an overall program assessment.

  1. Classroom Applications of Electronic Spreadsheet Computer Software.

    ERIC Educational Resources Information Center

    Tolbert, Patricia H.; Tolbert, Charles M., II

    1983-01-01

    Details classroom use of SuperCalc, a software accounting package developed originally for small businesses, as a computerized gradebook. A procedure which uses data from the computer gradebook to produce weekly printed reports for parents is also described. (MBR)

  2. Sandia National Laboratories ASCI Applications Software Quality Engineering Practices

    SciTech Connect

    ZEPPER, JOHN D.; ARAGON, KATHRYN MARY; ELLIS, MOLLY A.; BYLE, KATHLEEN A.; EATON, DONNA SUE

    2003-04-01

    This document provides a guide to the deployment of the software verification activities, software engineering practices, and project management principles that guide the development of Accelerated Strategic Computing Initiative (ASCI) applications software at Sandia National Laboratories (Sandia). The goal of this document is to identify practices and activities that will foster the development of reliable and trusted products produced by the ASCI Applications program. Document contents include an explanation of the structure and purpose of the ASCI Quality Management Council, an overview of the software development lifecycle, an outline of the practices and activities that should be followed, and an assessment tool.

  3. Focused force angioplasty: Theory and application

    SciTech Connect

    Solar, Ronald J.; Ischinger, Thomas A

    2003-03-01

    Focused force angioplasty is a technique in which the forces resulting from inflating an angioplasty balloon in a stenosis are concentrated and focused at one or more locations within the stenosis. While the technique has been shown to be useful in resolving resistant stenoses, its real value may be in minimizing the vascular trauma associated with balloon angioplasty and subsequently improving the outcome.

  4. The system software development for prime focus spectrograph on Subaru Telescope

    NASA Astrophysics Data System (ADS)

    Shimono, Atsushi; Tamura, Naoyuki; Sugai, Hajime; Karoji, Hiroshi

    2012-09-01

    The Prime Focus Spectrograph (PFS) is a wide field multi-fiber spectrograph using the prime focus of the Subaru telescope, which is capable of observing up to 2400 astronomical objects simultaneously. The instrument control software will manage the observation procedure, communicating with subsystems such as the fiber positioner "COBRA", the metrology camera system, and the spectrograph and camera systems. Before an exposure starts, the instrument control system needs to access a database where target lists provided by observers are stored in advance, and accurately position fibers onto the astronomical targets requested therein. This fiber positioning will be carried out interacting with the metrology system, which measures the fiber positions. In parallel, the control system can issue a command to point the telescope to the target position and to rotate the instrument rotator. Finally, the telescope pointing and the rotator angle will be checked by imaging bright stars and checking their positions on the auto-guide and acquisition cameras. After the exposure finishes, the data are collected from the detector systems and finalized as FITS files for archiving with the necessary information. The observation preparation software is required, given target lists and an observation sequence, to find optimal fiber allocations while maximizing the number of guide stars. To carry out these operations efficiently, the control system will be integrated seamlessly with a database system that will store information necessary for observation execution, such as fiber configurations. In this article, the conceptual system design of the observation preparation software and the instrument control software will be presented.
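    The closed-loop fiber positioning described above (command the positioners, measure with the metrology camera, iterate) might be sketched as follows. Every function here is a stub standing in for the real COBRA/metrology interfaces; the names and the 90% actuation gain are assumptions for illustration.

    ```python
    # Hedged sketch of iterative fiber convergence: move, measure, correct.
    TRUE_GAIN = 0.9   # pretend each move achieves 90% of the commanded correction

    def measure_fibers(pos):
        return dict(pos)                      # metrology stub: perfect measurement

    def move_fibers(pos, corrections):
        for f, d in corrections.items():
            pos[f] += TRUE_GAIN * d           # positioner stub: imperfect actuation

    def converge(goals, tolerance=1e-3, max_iter=20):
        pos = {f: 0.0 for f in goals}         # fibers start at a home position
        for _ in range(max_iter):
            meas = measure_fibers(pos)
            err = {f: goals[f] - meas[f] for f in goals}
            if max(abs(e) for e in err.values()) < tolerance:
                return pos
            move_fibers(pos, err)             # command the residual as correction
        raise RuntimeError("fiber convergence failed")

    goals = {"cobra_001": 1.25, "cobra_002": -0.40}
    final = converge(goals)
    assert all(abs(final[f] - goals[f]) < 1e-3 for f in goals)
    ```

    Because each iteration removes 90% of the remaining error, the loop converges geometrically, which is the property a real metrology-in-the-loop scheme relies on.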

  5. Powerplant software

    SciTech Connect

    Elliott, T.C.

    1995-07-01

    Powerplants need software to thrive and compete. Covered here are many programs and applications -- an overview of the functions, tasks, and problem-solving that software is used for today. Software or, more accurately, software-driven systems are pervasive. Their presence is felt in every nook and cranny of the powerplant -- from design and construction through operation and maintenance, even dismantling and decommissioning -- embracing whole systems but also focusing on individual pieces of equipment. No one software supplier or two or three dominates -- powerplant software is the purview of scores if not hundreds of suppliers ranging from the largest corporations to individual consultants and application developers.

  6. SHMTools: a general-purpose software tool for SHM applications

    SciTech Connect

    Harvey, Dustin; Farrar, Charles; Taylor, Stuart; Park, Gyuhae; Flynn, Eric B; Kpotufe, Samory; Dondi, Denis; Mollov, Todor; Todd, Michael D; Rosin, Tajana S; Figueiredo, Eloi

    2010-11-30

    This paper describes a new software package for various structural health monitoring (SHM) applications. The software is a set of standardized MATLAB routines covering three main stages of SHM: data acquisition, feature extraction, and feature classification for damage identification. A subset of the software in SHMTools is embeddable, consisting of MATLAB functions that can be cross-compiled into generic 'C' programs to run on target hardware. The software is also designed to accommodate multiple sensing modalities, including piezoelectric active-sensing, which has been widely used in SHM practice. The software package, including standardized datasets, is publicly available for use by the SHM community. The details of this embeddable software will be discussed, along with several example processes that can serve as guidelines for future use of the software.
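    The three SHM stages the abstract lists could be sketched, in spirit only, as a tiny pipeline. This is generic NumPy code with simulated data, not SHMTools, and the threshold classifier is a deliberately simple stand-in.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def acquire(damaged, n=1024):
        """Stage 1 (data acquisition): simulate a sensor record;
        damage shows up as extra response variance."""
        scale = 2.0 if damaged else 1.0
        return rng.normal(0.0, scale, n)

    def extract_features(x):
        """Stage 2 (feature extraction): simple statistical features."""
        return np.array([x.std(), np.abs(x).max()])

    def classify(features, threshold=1.5):
        """Stage 3 (classification): threshold on the std feature,
        with the threshold assumed learned from baseline data."""
        return bool(features[0] > threshold)   # True = damage indicated

    for damaged in (False, True):
        assert classify(extract_features(acquire(damaged))) == damaged
    ```

    Real SHM feature extractors (e.g. autoregressive model coefficients) and classifiers are far richer, but the staged structure is the same.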

  7. Safety Characteristics in System Application Software for Human Rated Exploration

    NASA Technical Reports Server (NTRS)

    Mango, E. J.

    2016-01-01

    NASA and its industry and international partners are embarking on a bold and inspiring development effort to design and build an exploration class space system. The space system is made up of the Orion system, the Space Launch System (SLS) and the Ground Systems Development and Operations (GSDO) system. All are highly coupled together and dependent on each other for the combined safety of the space system. A key area of system safety focus needs to be in the ground and flight application software system (GFAS). In the development, certification and operations of GFAS, there are a series of safety characteristics that define the approach to ensure mission success. This paper will explore and examine the safety characteristics of the GFAS development.

  8. An evaluation of the Interactive Software Invocation System (ISIS) for software development applications. [flight software

    NASA Technical Reports Server (NTRS)

    Noland, M. S.

    1981-01-01

    The Interactive Software Invocation System (ISIS), which allows a user to build, modify, control, and process a total flight software system without direct communications with the host computer, is described. This interactive data management system provides the user with a file manager, text editor, a tool invoker, and an Interactive Programming Language (IPL). The basic file design of ISIS is a five level hierarchical structure. The file manager controls this hierarchical file structure and permits the user to create, to save, to access, and to purge pages of information. The text editor is used to manipulate pages of text to be modified and the tool invoker allows the user to communicate with the host computer through a RUN file created by the user. The IPL is based on PASCAL and contains most of the statements found in a high-level programming language. In order to evaluate the effectiveness of the system as applied to a flight project, the collection of software components required to support the Annular Suspension and Pointing System (ASPS) flight project were integrated using ISIS. The ASPS software system and its integration into ISIS is described.
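    A page-oriented hierarchical store of the kind the ISIS file manager is described as providing (five levels; create, access, purge) might look roughly like this. The class and method names are hypothetical, chosen to mirror the operations listed in the abstract.

    ```python
    # Hedged sketch of a five-level hierarchical page store; not ISIS itself.
    class PageStore:
        MAX_DEPTH = 5

        def __init__(self):
            self._pages = {}                       # path tuple -> page text

        def create(self, path, text=""):
            parts = tuple(path.split("/"))
            if len(parts) > self.MAX_DEPTH:
                raise ValueError("hierarchy is limited to five levels")
            self._pages[parts] = text

        def access(self, path):
            return self._pages[tuple(path.split("/"))]

        def purge(self, path):
            # Purging a node removes it and everything beneath it.
            prefix = tuple(path.split("/"))
            for key in [k for k in self._pages if k[:len(prefix)] == prefix]:
                del self._pages[key]

    store = PageStore()
    store.create("project/flight/src/main/page1", "program text")
    print(store.access("project/flight/src/main/page1"))   # → program text
    ```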

  9. Evaluation of Two Types of Online Help for Application Software.

    ERIC Educational Resources Information Center

    Dutke, Stephan; Reimer, T.

    2000-01-01

    Discusses online help systems in application software design and describes two experiments in which adult computer novices learned to use experimental graphics software by task-based exploration. Topics include operative help; function-oriented help; effects on learning performance; schemata; mental models; and implications for the design of…

  10. Effectiveness of Audio on Screen Captures in Software Application Instruction

    ERIC Educational Resources Information Center

    Veronikas, Susan Walsh; Maushak, Nancy

    2005-01-01

    Presentation of software instruction has been supported by manuals and textbooks consisting of screen captures, but a multimedia approach may increase learning outcomes. This study investigated the effects of modality (text, audio, or dual) on the achievement and attitudes of college students learning a software application through the computer.…

  11. Applications of Computers and Computer Software in Teaching Analytical Chemistry.

    ERIC Educational Resources Information Center

    O'Haver, T. C.

    1991-01-01

    Some commercially available software tools that have potential applications in the analytical chemistry curriculum are surveyed and evaluated. Tools for instruction, analysis and research, and courseware development are described. A list of the software packages, the compatible hardware, and the vendor's address is included. (KR)

  12. SKEW QUADRUPOLE FOCUSING LATTICES AND APPLICATIONS.

    SciTech Connect

    PARKER,B.

    2001-06-18

    In this paper we revisit using skew quadrupole fields in place of traditional normal upright quadrupole fields to make beam focusing structures. We illustrate by example skew lattice decoupling, dispersion suppression and chromatic correction using the neutrino factory Study-II muon storage ring design. Ongoing BNL investigation of flat coil magnet structures that allow building a very compact muon storage ring arc and other flat coil configurations that might bring significant magnet cost reduction to a VLHC motivate our study of skew focusing.

  13. Software development for safety-critical medical applications

    NASA Technical Reports Server (NTRS)

    Knight, John C.

    1992-01-01

    There are many computer-based medical applications in which safety and not reliability is the overriding concern. Reduced, altered, or no functionality of such systems is acceptable as long as no harm is done. A precise, formal definition of what software safety means is essential, however, before any attempt can be made to achieve it. Without this definition, it is not possible to determine whether a specific software entity is safe. A set of definitions pertaining to software safety will be presented and a case study involving an experimental medical device will be described. Some new techniques aimed at improving software safety will also be discussed.

  14. Statistical Software and Artificial Intelligence: A Watershed in Applications Programming.

    ERIC Educational Resources Information Center

    Pickett, John C.

    1984-01-01

    AUTOBJ and AUTOBOX are revolutionary software programs which contain the first application of artificial intelligence to statistical procedures used in analysis of time series data. The artificial intelligence included in the programs and program features are discussed. (JN)

  15. High Performance Computing Software Applications for Space Situational Awareness

    NASA Astrophysics Data System (ADS)

    Giuliano, C.; Schumacher, P.; Matson, C.; Chun, F.; Duncan, B.; Borelli, K.; Desonia, R.; Gusciora, G.; Roe, K.

    The High Performance Computing Software Applications Institute for Space Situational Awareness (HSAI-SSA) has completed its first full year of applications development. The emphasis of our work in this first year was on improving space surveillance sensor models and image enhancement software. These applications are the Space Surveillance Network Analysis Model (SSNAM), the Air Force Space Fence simulation (SimFence), and the physically constrained iterative deconvolution (PCID) image enhancement software tool. Specifically, we have demonstrated order-of-magnitude speed-ups in those codes running on the latest Cray XD-1 Linux supercomputer (Hoku) at the Maui High Performance Computing Center. The software application improvements that HSAI-SSA has made have had a significant impact on the warfighter and have fundamentally changed the role of high performance computing in SSA.

  16. Designing application software in wide area network settings

    NASA Technical Reports Server (NTRS)

    Makpangou, Mesaac; Birman, Ken

    1990-01-01

    Progress in methodologies for developing robust local area network software has not been matched by similar results for wide area settings. The design of application software spanning multiple local area environments is examined. For important classes of applications, simple design techniques are presented that yield fault tolerant wide area programs. An implementation of these techniques as a set of tools for use within the ISIS system is described.

  17. Sandia National Laboratories ASCI Applications Software Quality Engineering Practices

    SciTech Connect

    ZEPPER, JOHN D.; ARAGON, KATHRYN MARY; ELLIS, MOLLY A.; BYLE, KATHLEEN A.; EATON, DONNA SUE

    2002-01-01

    This document provides a guide to the deployment of the software verification activities, software engineering practices, and project management principles that guide the development of Accelerated Strategic Computing Initiative (ASCI) applications software at Sandia National Laboratories (Sandia). The goal of this document is to identify practices and activities that will foster the development of reliable and trusted products produced by the ASCI Applications program. Document contents include an explanation of the structure and purpose of the ASCI Quality Management Council, an overview of the software development lifecycle, an outline of the practices and activities that should be followed, and an assessment tool. These sections map practices and activities at Sandia to the ASCI Software Quality Engineering: Goals, Principles, and Guidelines, a Department of Energy document.

  18. Application of Focused Schlieren to Nozzle Ejector Flowfields

    NASA Technical Reports Server (NTRS)

    Mitchell, L. Kerry; Ponton, Michael K.; Seiner, John M.; Manning, James C.; Jansen, Bernard J.; Lagen, Nicholas T.

    1999-01-01

    The motivation of the testing was to reduce noise generated by eddy Mach wave emission via enhanced mixing in the jet plume. This was to be accomplished through the use of an ejector shroud, which would bring in cooler ambient fluid to mix with the hotter jet flow. In addition, the contour of the mixer, with its chutes and lobes, would accentuate the merging of the outer and inner flows. The objective of the focused schlieren work was to characterize the mixing performance inside of the ejector. Using flow visualization allowed this to be accomplished in a non-intrusive manner.

  19. A computationally efficient software application for calculating vibration from underground railways

    NASA Astrophysics Data System (ADS)

    Hussein, M. F. M.; Hunt, H. E. M.

    2009-08-01

    The PiP model is a software application with a user-friendly interface for calculating vibration from underground railways. This paper reports on the software, with a focus on its latest version and plans for future developments. The software calculates the Power Spectral Density of vibration due to a moving train on floating-slab track, with track irregularity described by typical spectra for tracks in good, average, and bad condition. The latest version accounts for a tunnel embedded in a half-space by employing a toolbox developed at K.U. Leuven which calculates Green's functions for a multi-layered half-space.
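    Since the abstract centres on Power Spectral Density estimates, a generic PSD computation with Welch's method illustrates the quantity involved. The synthetic signal and its parameters are assumptions for illustration; this is unrelated to the PiP model's internals.

    ```python
    import numpy as np
    from scipy.signal import welch

    fs = 1000.0                               # sample rate, Hz (assumed)
    t = np.arange(0, 10, 1 / fs)
    # Synthetic vibration record: a 40 Hz tone buried in broadband noise.
    rng = np.random.default_rng(1)
    signal = np.sin(2 * np.pi * 40 * t) + 0.5 * rng.normal(size=t.size)

    # Welch's method averages periodograms of overlapping segments,
    # trading frequency resolution for a lower-variance PSD estimate.
    freqs, psd = welch(signal, fs=fs, nperseg=2048)
    peak = freqs[np.argmax(psd)]              # dominant vibration frequency
    assert abs(peak - 40.0) < 1.0
    ```

    In a railway context the PSD would be driven by the track-roughness spectrum rather than a synthetic tone, but the estimator is the same.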

  20. Software Applications for Oversampling of Transit Candidates

    NASA Astrophysics Data System (ADS)

    Quentin, C. G.; Barge, P.; Cautain, R.; Meunier, J.-C.; Moutou, C.; Savalle, R.; Surace, C.

    2006-11-01

    In the Exo-field, the standard sampling of 12000 light-curves is 8.5 minutes, but a finite number of them (1000 at maximum for a given field of view) can be oversampled to 32 seconds. This possibility was planned to enhance the scientific impact of the mission, with the aim of obtaining additional information on the shape and timing of the planetary transits. The targets to be oversampled are selected from a ground-based analysis of incompletely corrected (preliminary) N1 data and sorted into a list that can be changed during data acquisition. At the beginning of a run the 1000 oversampling slots are occupied either by stars known to host a planet (from ground-based observations) or by reference stars appropriately chosen within the Hertzsprung-Russell (HR) diagram. We present the software used for early detection in raw light-curves and the way results are stored and sorted to build up the oversampling list that is loaded on board the satellite. This list can be changed once a week during operations while new sets of data are acquired. The whole software and data management system developed to decide which target stars merit oversampling will be called the "Exowarning pipeline".

  1. Application Reuse Library for Software, Requirements, and Guidelines

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Thronesbery, Carroll

    1994-01-01

    Better designs are needed for expert systems and other operations automation software, for more reliable, usable and effective human support. A prototype computer-aided Application Reuse Library shows feasibility of supporting concurrent development and improvement of advanced software by users, analysts, software developers, and human-computer interaction experts. Such a library expedites development of quality software, by providing working, documented examples, which support understanding, modification and reuse of requirements as well as code. It explicitly documents and implicitly embodies design guidelines, standards and conventions. The Application Reuse Library provides application modules with Demo-and-Tester elements. Developers and users can evaluate applicability of a library module and test modifications, by running it interactively. Sub-modules provide application code and displays and controls. The library supports software modification and reuse, by providing alternative versions of application and display functionality. Information about human support and display requirements is provided, so that modifications will conform to guidelines. The library supports entry of new application modules from developers throughout an organization. Example library modules include a timer, some buttons and special fonts, and a real-time data interface program. The library prototype is implemented in the object-oriented G2 environment for developing real-time expert systems.

  2. HSCT4.0 Application: Software Requirements Specification

    NASA Technical Reports Server (NTRS)

    Salas, A. O.; Walsh, J. L.; Mason, B. H.; Weston, R. P.; Townsend, J. C.; Samareh, J. A.; Green, L. L.

    2001-01-01

    The software requirements for the High Performance Computing and Communication Program High Speed Civil Transport application project, referred to as HSCT4.0, are described. The objective of the HSCT4.0 application project is to demonstrate the application of high-performance computing techniques to the problem of multidisciplinary design optimization of a supersonic transport configuration, using high-fidelity analysis simulations. Descriptions of the various functions (and the relationships among them) that make up the multidisciplinary application as well as the constraints on the software design are provided. This document serves to establish an agreement between the suppliers and the customer as to what the HSCT4.0 application should do and provides to the software developers the information necessary to design and implement the system.

  3. Control system software, simulation, and robotic applications

    NASA Technical Reports Server (NTRS)

    Frisch, Harold P.

    1991-01-01

    All essential existing capabilities needed to create a man-machine interaction dynamics and performance (MMIDAP) capability are reviewed. The multibody system dynamics software program Order N DISCOS will be used for machine and musculo-skeletal dynamics modeling. The program JACK will be used for estimating and animating whole body human response to given loading situations and motion constraints. The basic elements of performance (BEP) task decomposition methodologies associated with the Human Performance Institute database will be used for performance assessment. Techniques for resolving the statically indeterminate muscular load sharing problem will be used for a detailed understanding of potential musculotendon or ligamentous fatigue, pain, discomfort, and trauma. The envisioned capability is to be used for mechanical system design, human performance assessment, extrapolation of man/machine interaction test data, biomedical engineering, and soft prototyping within a concurrent engineering (CE) system.

  4. Development and Application of New Quality Model for Software Projects

    PubMed Central

    Karnavel, K.; Dillibabu, R.

    2014-01-01

    The IT industry tries to employ a number of models to identify the defects in the construction of software projects. In this paper, we present COQUALMO and its limitations and aim to increase the quality without increasing the cost and time. The computation time, cost, and effort to predict the residual defects are very high; this was overcome by developing an appropriate new quality model named the software testing defect corrective model (STDCM). The STDCM was used to estimate the number of remaining residual defects in the software product; a few assumptions and the detailed steps of the STDCM are highlighted. The application of the STDCM is explored in software projects. The implementation of the model is validated using statistical inference, which shows there is a significant improvement in the quality of the software projects. PMID:25478594

  5. Earth Resources Laboratory Applications Software (ELAS)

    NASA Technical Reports Server (NTRS)

    Penton, P. G.

    1989-01-01

    ELAS modified to handle broad range of digital images, and now finding widespread application in medical imaging field. Many versions of ELAS condensed into v. 8.0, which is available for DEC VAX, CONCURRENT, and for UNIX environment.

  6. High Intensity Focused Ultrasound Tumor Therapy System and Its Application

    NASA Astrophysics Data System (ADS)

    Sun, Fucheng; He, Ye; Li, Rui

    2007-05-01

    At the end of the last century, a High Intensity Focused Ultrasound (HIFU) tumor therapy system was successfully developed and manufactured in China, which has already been applied to clinical therapy. This article aims to discuss the HIFU therapy system and its application. Detailed research includes the following: power amplifiers for high-power ultrasound, ultrasound transducers with large apertures, accurate 3-D mechanical drives, a software control system (both high-voltage control and low-voltage control), and the B-mode ultrasonic diagnostic equipment used for treatment monitoring. Research on the dosage of ultrasound required for tumour therapy in multiple human cases has made it possible to relate a dosage formula, presented in this paper, to other significant parameters such as the volume of thermal tumor solidification, the acoustic intensity (I), and the ultrasound emission time (t_n). Moreover, the HIFU therapy system can be applied to the clinical treatment of both benign and malignant tumors in the pelvic and abdominal cavity, such as uterine fibroids, liver cancer and pancreatic carcinoma.

  7. A software framework for developing measurement applications under variable requirements

    NASA Astrophysics Data System (ADS)

    Arpaia, Pasquale; Buzio, Marco; Fiscarelli, Lucio; Inglese, Vitaliano

    2012-11-01

    A framework for easily developing software for measurement and test applications under highly and fast-varying requirements is proposed. The framework allows the software quality, in terms of flexibility, usability, and maintainability, to be maximized. Furthermore, the development effort is reduced and finalized, by relieving the test engineer of development details. The framework can be configured to satisfy a large set of measurement applications in a generic field for an industrial test division, a test laboratory, or a research center. As an experimental case study, the design, implementation, and assessment of the framework in a measurement scenario of magnet testing at the European Organization for Nuclear Research are reported.
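
    The abstract does not show the framework's API. As a hedged sketch of the configuration-driven idea it describes, a registry of measurement steps composed from a test configuration might look like this (all class and registry names are hypothetical, not the paper's design):

```python
# Hypothetical sketch: measurement steps registered under configuration keys,
# so a test procedure is assembled from data rather than hard-coded.
REGISTRY = {}

def step(name):
    """Register a measurement step class under a configuration key."""
    def wrap(cls):
        REGISTRY[name] = cls
        return cls
    return wrap

@step("acquire")
class Acquire:
    def run(self, data):
        data["samples"] = [1.0, 2.0, 3.0]   # stand-in for instrument readout
        return data

@step("average")
class Average:
    def run(self, data):
        data["mean"] = sum(data["samples"]) / len(data["samples"])
        return data

def run_procedure(config):
    """Instantiate and chain the steps named in the test configuration."""
    data = {}
    for name in config["steps"]:
        data = REGISTRY[name]().run(data)
    return data

result = run_procedure({"steps": ["acquire", "average"]})
```

When requirements change, only the configuration (and perhaps one new step class) changes, which is the flexibility-under-varying-requirements property the abstract emphasizes.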

  8. Computer Applications in Marketing. An Annotated Bibliography of Computer Software.

    ERIC Educational Resources Information Center

    Burrow, Jim; Schwamman, Faye

    This bibliography contains annotations of 95 items of educational and business software with applications in seven marketing and business functions. The annotations, which appear in alphabetical order by title, provide this information: category (related application), title, date, source and price, equipment, supplementary materials, description…

  9. Simplifying applications software for vision guided robot implementation

    NASA Technical Reports Server (NTRS)

    Duncheon, Charlie

    1994-01-01

    A simple approach to robot applications software is described. The idea is to use commercially available software and hardware wherever possible to minimize system costs, schedules, and risks. The U.S. has been slow in the adoption of robots and flexible automation compared to the flourishing growth of robot implementation in Japan. The U.S. can benefit from this approach because of a more flexible array of vision guided robot technologies.

  10. Ray tracing software application in VIP lamp design

    NASA Astrophysics Data System (ADS)

    Rehn, Henning

    2002-08-01

    In our contribution we demonstrate a wide variety of ray tracing software applications for the design of VIP short-arc discharge video projection lamps. On the basis of simulations we derive design rules for the lamp itself and for its optical environment. Light Tools software acts as a means to understand the collection efficiency of a VIP lamp with an elliptical reflector and as an instrument to prove the conclusions.

  11. The Application of Solution-Focused Work in Employment Counseling

    ERIC Educational Resources Information Center

    Bezanson, Birdie J.

    2004-01-01

    The author explores the applicability of a solution-focused therapy (SFT) model as a comprehensive approach to employment counseling. SFT focuses the client on developing a vision of a preferred future and assumes that the client has the talents and resources that can be accessed in the employment counseling process. The solution-focused counselor…

  12. Solar-terrestrial models and application software

    NASA Technical Reports Server (NTRS)

    Bilitza, Dieter

    1990-01-01

    The empirical models related to solar-terrestrial sciences that are available in the form of computer programs are listed and described. Also included are programs that use one or more of these models for application-specific purposes. The entries are grouped according to the region of the solar-terrestrial environment to which they belong and according to the parameter which they describe. Regions considered include the ionosphere, atmosphere, magnetosphere, planets, interplanetary space, and heliosphere. Information is also provided on the accessibility of the magnetic and solar activity indices needed to specify conditions for the models.

  13. Solar-terrestrial models and application software

    NASA Technical Reports Server (NTRS)

    Bilitza, D.

    1992-01-01

    The empirical models related to solar-terrestrial sciences that are available in the form of computer programs are listed and described. Also included are programs that use one or more of these models for application-specific purposes. The entries are grouped according to the region of the solar-terrestrial environment to which they belong and according to the parameter which they describe. Regions considered include the ionosphere, atmosphere, magnetosphere, planets, interplanetary space, and heliosphere. Information is also provided on the accessibility of the magnetic and solar activity indices needed to specify conditions for the models.

  14. Mission design applications of QUICK. [software for interactive trajectory calculation

    NASA Technical Reports Server (NTRS)

    Skinner, David L.; Bass, Laura E.; Byrnes, Dennis V.; Cheng, Jeannie T.; Fordyce, Jess E.; Knocke, Philip C.; Lyons, Daniel T.; Pojman, Joan L.; Stetson, Douglas S.; Wolf, Aron A.

    1990-01-01

    An overview of an interactive software environment for space mission design termed QUICK is presented. This stand-alone program provides a programmable FORTRAN-like calculator interface to a wide range of both built-in and user defined functions. QUICK has evolved into a general-purpose software environment that can be intrinsically and dynamically customized for a wide range of mission design applications. Specific applications are described for some space programs, e.g., the earth-Venus-Mars mission, the Cassini mission to Saturn, the Mars Observer, the Galileo Project, and the Magellan Spacecraft.
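
    QUICK's programmable calculator interface with built-in and user-defined functions can be sketched in miniature (a Python stand-in; QUICK's actual FORTRAN-like syntax and function set are not shown in the abstract):

```python
import math

# Toy sketch of a programmable-calculator environment in the spirit of QUICK.
# The class name and `define` API are illustrative, not QUICK's interface.
class Calculator:
    def __init__(self):
        # Built-in functions and constants; builtins disabled for safety.
        self.env = {"sin": math.sin, "cos": math.cos, "sqrt": math.sqrt,
                    "pi": math.pi, "__builtins__": {}}

    def define(self, name, expr, params):
        """Add a user-defined function from an expression string."""
        def fn(*args):
            local = dict(zip(params, args))
            return eval(expr, self.env, local)
        self.env[name] = fn

    def eval(self, expr):
        """Evaluate an expression using built-in and user-defined functions."""
        return eval(expr, self.env, {})

calc = Calculator()
calc.define("hyp", "sqrt(a*a + b*b)", ["a", "b"])
# calc.eval("hyp(3, 4)") evaluates to 5.0
```

The point of such an environment, as with QUICK, is that mission designers can extend the calculator dynamically without recompiling the tool.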

  15. Static and Dynamic Verification of Critical Software for Space Applications

    NASA Astrophysics Data System (ADS)

    Moreira, F.; Maia, R.; Costa, D.; Duro, N.; Rodríguez-Dapena, P.; Hjortnaes, K.

    Space technology is no longer used only for highly specialised research activities or for sophisticated manned space missions. Modern society relies more and more on space technology and applications for everyday activities. Worldwide telecommunications, Earth observation, navigation and remote sensing are only a few examples of space applications on which we rely daily. The European-driven global navigation system Galileo and its associated applications, e.g. air traffic management and vessel and car navigation, will significantly expand the already stringent safety requirements for space-based applications. Apart from their usefulness and practical applications, every single piece of onboard software deployed into space represents an enormous investment. With a long operational lifetime, and being extremely difficult to maintain and upgrade, at least compared with "mainstream" software development, the importance of ensuring its correctness before deployment is immense. Verification & Validation techniques and technologies have a key role in ensuring that the onboard software is correct and error free, or at least free from errors that can potentially lead to catastrophic failures. Many RAMS techniques, including both static criticality analysis and dynamic verification techniques, have been used as a means to verify and validate critical software and to ensure its correctness. Traditionally, however, these have been applied in isolation. One of the main reasons is the immaturity of this field as regards its application to the growing software content of space systems. This paper presents an innovative way of combining static and dynamic techniques, exploiting their synergy and complementarity for software fault removal. The proposed methodology is based on the combination of Software FMEA and FTA with fault-injection techniques. The case study herein described is implemented with support from two tools: the SoftCare tool for the SFMEA and SFTA
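
    The dynamic half of such a methodology can be illustrated with a toy fault-injection campaign (a generic sketch: `guarded_divide` and the bit-flip injector are hypothetical examples, not the SoftCare or SFMEA tooling from the paper):

```python
import random

# Generic fault-injection sketch: corrupt an input one bit at a time and
# record whether the software's defensive guard detects the fault.
def guarded_divide(a, b):
    """Unit under test with a defensive check (the 'guard')."""
    if b == 0:
        return ("detected", None)        # fault detected and handled safely
    return ("ok", a / b)

def inject_bit_flip(value, bit):
    """Simulate a single-bit corruption of an integer input."""
    return value ^ (1 << bit)

def campaign(trials=100, seed=7):
    """Run repeated injections and tally detected vs. silent faults."""
    rng = random.Random(seed)
    detected = silent = 0
    for _ in range(trials):
        b = inject_bit_flip(1, rng.randrange(8))  # corrupt the divisor b=1
        status, _ = guarded_divide(10, b)
        if status == "detected":
            detected += 1
        else:
            silent += 1
    return detected, silent
```

In a combined static/dynamic approach, the SFMEA/SFTA results would tell the campaign which failure modes and which inputs are worth injecting, rather than sampling blindly as this sketch does.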

  16. Enhancement of computer system for applications software branch

    NASA Technical Reports Server (NTRS)

    Bykat, Alex

    1987-01-01

    Presented is a compilation of the history of a two-month project concerned with a survey, evaluation, and specification of a new computer system for the Applications Software Branch of the Software and Data Management Division of the Information and Electronic Systems Laboratory of Marshall Space Flight Center, NASA. Information gathering consisted of discussions and surveys of branch activities, evaluation of computer manufacturer literature, and presentations by vendors, and was followed by evaluation of the candidate systems. The evaluation criteria were: the (tentative) architecture selected for the new system, the type of network architecture supported, software tools, and to some extent the price. The information received from the vendors, as well as additional research, led to a detailed design of a suitable system. This design included considerations of hardware and software environments as well as personnel issues such as training. Design of the system culminated in a recommendation for a new computing system for the Branch.

  17. Web Application Software for Ground Operations Planning Database (GOPDb) Management

    NASA Technical Reports Server (NTRS)

    Lanham, Clifton; Kallner, Shawn; Gernand, Jeffrey

    2013-01-01

    A Web application facilitates collaborative development of the ground operations planning document. This will reduce costs and development time for new programs by incorporating the data governance, access control, and revision tracking of the ground operations planning data. Ground Operations Planning requires the creation and maintenance of detailed timelines and documentation. The GOPDb Web application was created using state-of-the-art Web 2.0 technologies, and was deployed as SaaS (Software as a Service), with an emphasis on data governance and security needs. Application access is managed using two-factor authentication, with data write permissions tied to user roles and responsibilities. Multiple instances of the application can be deployed on a Web server to meet the robust needs for multiple, future programs with minimal additional cost. This innovation features high availability and scalability, with no additional software that needs to be bought or installed. For data governance and security (data quality, management, business process management, and risk management for data handling), the software uses NAMS. No local copy/cloning of data is permitted. Data change log/tracking is addressed, as well as collaboration, work flow, and process standardization. The software provides on-line documentation and detailed Web-based help. There are multiple ways that this software can be deployed on a Web server to meet ground operations planning needs for future programs. The software could be used to support commercial crew ground operations planning, as well as commercial payload/satellite ground operations planning. The application source code and database schema are owned by NASA.
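
    The role-based write permissions described above can be sketched minimally (the roles and permission names here are hypothetical examples, not GOPDb's actual configuration):

```python
# Minimal role-based access control sketch in the spirit of the GOPDb
# description: write permissions tied to user roles and responsibilities.
ROLE_PERMISSIONS = {
    "viewer": {"read"},
    "editor": {"read", "write"},
    "admin":  {"read", "write", "approve"},
}

def can(user_roles, action):
    """True if any of the user's roles grants the requested action."""
    return any(action in ROLE_PERMISSIONS.get(r, set()) for r in user_roles)

# A viewer cannot write to the planning data; an editor can.
```

Centralizing the check in one function is what lets a single deployed instance serve multiple programs with different role assignments but identical enforcement logic.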

  18. Radiation-Hardened Software for Space Flight Science Applications

    NASA Astrophysics Data System (ADS)

    Mehlitz, P. C.; Penix, J. J.; Markosian, L. Z.

    2005-12-01

    Hardware faults caused by radiation-induced Single Event Effects (SEEs) are a serious issue in space flight, especially affecting scientific missions in earth orbits crossing the poles or the South Atlantic Anomaly. Traditionally, SEEs are treated as a hardware problem, for example mitigated by radiation-hardened processors and shielding. Rad-hardened processors are expensive, exhibit a decade performance gap compared to COTS technology, have a larger form factor and require more power. Shielding is ineffective for high energy particles and increases launch weight. Hardware approaches cannot dynamically adapt protection levels for different radiation scenarios depending on solar activity and flight phase. Future hardware will exacerbate the problem due to higher chip densities and lower power levels. An alternative approach is to use software to mitigate SEEs. This "Radiation Hardened Software" (RHS) approach has two components: (1) RHS library and application design guidelines To increase robustness, we combine SEE countermeasures in three areas: prevention and detection; recovery; and reconfiguration. Prevention and detection includes an application- and heap-aware memory scanner, and dynamically adapted software Error Correction Codes to handle cache and multi-bit errors. Recovery mechanisms include exception firewalls and transaction-based software design patterns, to minimize data loss. Reconfiguration includes a heap manager to avoid damaged memory areas. (2) Software-based SEE Simulation Probabilistic effects require extensive simulation, with test environments that do not require original flight hardware and can simulate various SEE profiles. We use processor emulation software, interfaced to a debugger, to analyze SEE propagation and optimize RHS mechanisms. The simulator runs unmodified binary flight code, enables injecting randomized transient and permanent memory errors, providing execution traces and precise failure reproduction. 
The goal of RHS is to
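
    The recovery and redundancy ideas listed under (1) can be sketched generically; below is a toy triple-redundancy store with a majority-vote scrub pass (an illustration of the general SEE-mitigation pattern, not NASA's RHS library):

```python
# Software SEE countermeasure sketch: keep each value in triplicate and
# repair corrupted copies by majority vote during a periodic "scrub" pass.
class TripleStore:
    def __init__(self):
        self.copies = [{}, {}, {}]

    def write(self, key, value):
        for c in self.copies:
            c[key] = value

    def scrub(self, key):
        """Majority-vote the three copies and rewrite any corrupted one."""
        vals = [c[key] for c in self.copies]
        majority = max(set(vals), key=vals.count)
        for c in self.copies:
            c[key] = majority
        return majority

store = TripleStore()
store.write("mode", 0b1010)
store.copies[1]["mode"] ^= 0b0100   # simulate a radiation-induced bit flip
# scrub("mode") restores the majority value 0b1010 in all three copies
```

Like the RHS memory scanner, the value of scrubbing is that transient single-copy errors are corrected before a second upset can defeat the vote.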

  19. Application of software technology to automatic test data analysis

    NASA Technical Reports Server (NTRS)

    Stagner, J. R.

    1991-01-01

    The verification process for a major software subsystem was partially automated as part of a feasibility demonstration. The methods employed are generally useful and applicable to other types of subsystems. The effort resulted in substantial savings in test engineer analysis time and offers a method for inclusion of automatic verification as a part of regression testing.

  20. Software Applications Course as an Early Indicator of Academic Performance

    ERIC Educational Resources Information Center

    Benham, Harry C.; Bielinska-Kwapisz, Agnieszka; Brown, F. William

    2013-01-01

    This study's objective is to determine if students who were unable to successfully complete a required sophomore level business software applications course encountered unique academic difficulties in that course, or if their difficulty signaled more general academic achievement problems in business. The study points to the importance of including…

  1. Application of Plagiarism Screening Software in the Chemical Engineering Curriculum

    ERIC Educational Resources Information Center

    Cooper, Matthew E.; Bullard, Lisa G.

    2014-01-01

    Plagiarism is an area of increasing concern for written ChE assignments, such as laboratory and design reports, due to ease of access to text and other materials via the internet. This study examines the application of plagiarism screening software to four courses in a university chemical engineering curriculum. The effectiveness of plagiarism…

  2. QFD Application to a Software - Intensive System Development Project

    NASA Technical Reports Server (NTRS)

    Tran, T. L.

    1996-01-01

    This paper describes the use of Quality Function Deployment (QFD), adapted to requirements engineering for a software-intensive system development project, and synthesizes the lessons learned from the application of QFD to the Network Control System (NCS) pre-project of the Deep Space Network.

  3. CernVM - a virtual software appliance for LHC applications

    NASA Astrophysics Data System (ADS)

    Buncic, P.; Aguado Sanchez, C.; Blomer, J.; Franco, L.; Harutyunian, A.; Mato, P.; Yao, Y.

    2010-04-01

    CernVM is a Virtual Software Appliance capable of running physics applications from the LHC experiments at CERN. It aims to provide a complete and portable environment for developing and running LHC data analysis on any end-user computer (laptop, desktop) as well as on the Grid, independently of Operating System platforms (Linux, Windows, MacOS). The experiment application software and its specific dependencies are built independently from CernVM and delivered to the appliance just in time by means of a CernVM File System (CVMFS) specifically designed for efficient software distribution. The procedures for building, installing and validating software releases remain under the control and responsibility of each user community. We provide a mechanism to publish pre-built and configured experiment software releases to a central distribution point, from where they find their way to the running CernVM instances via a hierarchy of proxy servers or content delivery networks. In this paper, we present the current state of the CernVM project, compare the performance of CVMFS to that of traditional network file systems such as AFS, and discuss possible scenarios that could further improve its performance and scalability.

  4. Reflective Array with Controlled Focusing for Radiotomographic Application

    NASA Astrophysics Data System (ADS)

    Shipilov, S. E.; Eremeev, A. I.; Yakubov, V. P.

    2016-01-01

    The possibility in principle of creating controlled reflectors that form a given field distribution in the focal area is considered. The reflectors change their reflection coefficient under external control. Theoretical modeling of such a controlled focusing device, which focuses the field at a specific point for a given arrangement of the reflectors, is proposed. On the basis of numerical simulation, the application of this approach to the solution of radiotomography problems is considered.

  5. Quantum searching application in search based software engineering

    NASA Astrophysics Data System (ADS)

    Wu, Nan; Song, FangMin; Li, Xiangdong

    2013-05-01

    Search Based Software Engineering (SBSE) is widely used in software engineering for identifying optimal solutions. However, no polynomial-time solution exists among the traditional algorithms used in SBSE, which makes the cost very high. In this paper, we analyze and compare several quantum search algorithms that could be applied to SBSE: the quantum adiabatic evolution search algorithm, fixed-point quantum search (FPQS), quantum walks, and a rapid modified Grover quantum search method. Grover's algorithm is considered the best choice for large-scale unstructured data search; theoretically, it is applicable to any search-space structure and any type of search problem.
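
    The quadratic speedup behind Grover's algorithm can be checked with a small classical simulation of its amplitude dynamics (a toy illustration of the standard algorithm, not one of the paper's modified variants):

```python
import math

# Classical simulation of Grover search over n unstructured candidates,
# illustrating the O(sqrt(n)) iteration count referred to above.
def grover_probability(n, marked, iterations):
    """Probability of measuring the marked index after k Grover iterations."""
    amps = [1.0 / math.sqrt(n)] * n          # uniform superposition
    for _ in range(iterations):
        amps[marked] = -amps[marked]          # oracle: flip the marked phase
        mean = sum(amps) / n
        amps = [2.0 * mean - a for a in amps] # diffusion: invert about mean
    return amps[marked] ** 2

n = 64
k = round(math.pi / 4 * math.sqrt(n))         # ~6 iterations for n = 64
# grover_probability(64, marked=5, iterations=k) is close to 1, versus
# a 1/64 success probability for a single classical random probe.
```

The simulation makes the cost argument concrete: about √n oracle queries suffice where any classical unstructured search needs on the order of n.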

  6. Application of industry-standard guidelines for the validation of avionics software

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J.; Shagnea, Anita M.

    1990-01-01

    The application of industry standards to the development of avionics software is discussed, focusing on verification and validation activities. It is pointed out that the procedures that guide the avionics software development and testing process are under increased scrutiny. The DO-178A guidelines, Software Considerations in Airborne Systems and Equipment Certification, are used by the FAA for certifying avionics software. To investigate the effectiveness of the DO-178A guidelines for improving the quality of avionics software, guidance and control software (GCS) is being developed according to the DO-178A development method. It is noted that, due to the extent of the data collection and configuration management procedures, any phase in the life cycle of a GCS implementation can be reconstructed. Hence, a fundamental development and testing platform has been established that is suitable for investigating the adequacy of various software development processes. In particular, the overall effectiveness and efficiency of the development method recommended by the DO-178A guidelines are being closely examined.

  7. A software framework for developing measurement applications under variable requirements.

    PubMed

    Arpaia, Pasquale; Buzio, Marco; Fiscarelli, Lucio; Inglese, Vitaliano

    2012-11-01

    A framework for easily developing software for measurement and test applications under highly and fast-varying requirements is proposed. The framework allows the software quality, in terms of flexibility, usability, and maintainability, to be maximized. Furthermore, the development effort is reduced and finalized, by relieving the test engineer of development details. The framework can be configured to satisfy a large set of measurement applications in a generic field for an industrial test division, a test laboratory, or a research center. As an experimental case study, the design, implementation, and assessment of the framework in a measurement scenario of magnet testing at the European Organization for Nuclear Research are reported. PMID:23206094

  8. Final Report. Center for Scalable Application Development Software

    SciTech Connect

    Mellor-Crummey, John

    2014-10-26

    The Center for Scalable Application Development Software (CScADS) was established as a partnership between Rice University, Argonne National Laboratory, University of California Berkeley, University of Tennessee – Knoxville, and University of Wisconsin – Madison. CScADS pursued an integrated set of activities with the aim of increasing the productivity of DOE computational scientists by catalyzing the development of systems software, libraries, compilers, and tools for leadership computing platforms. Principal Center activities were workshops to engage the research community in the challenges of leadership computing, research and development of open-source software, and work with computational scientists to help them develop codes for leadership computing platforms. This final report summarizes CScADS activities at Rice University in these areas.

  9. The application of image processing software: Photoshop in environmental design

    NASA Astrophysics Data System (ADS)

    Dong, Baohua; Zhang, Chunmi; Zhuo, Chen

    2011-02-01

    In the process of environmental design and creation, the design sketch holds a very important position in that it not only illuminates the design's idea and concept but also shows the design's visual effects to the client. In the field of environmental design, computer-aided design has made significant improvements. Many types of specialized design software for environmental performance drawings and post-production artistic processing have been implemented. Additionally, with the use of this software, working efficiency has greatly increased and drawings have become more specific and more specialized. By analyzing the application of Photoshop image processing software in environmental design and comparing and contrasting traditional hand drawing with drawing using modern technology, this essay will further explore ways for computer technology to play a bigger role in environmental design.

  10. Applications of adaptive focused acoustics to compound management.

    PubMed

    Nixon, Elizabeth; Holland-Crimmin, Sue; Lupotsky, Brian; Chan, James; Curtis, Jon; Dobbs, Karen; Blaxill, Zoe

    2009-06-01

    Since the introduction of lithotripsy kidney stone therapy, Focused Acoustics and its properties have been thoroughly utilized in medicine and exploration. More recently, Compound Management is exploring its applications and benefits to sample integrity. There are 2 forms of Focused Acoustics: Acoustic Droplet Ejection and Adaptive Focused Acoustics, which work by emitting high-powered acoustic waves through water toward a focused point. This focused power results in noncontact plate-to-plate sample transfer or sample dissolution, respectively. For the purposes of this article, only Adaptive Focused Acoustics will be addressed. Adaptive Focused Acoustics uses high-powered acoustic waves to mix, homogenize, dissolve, and thaw samples. It facilitates transferable samples through noncontact, closed-container, isothermal mixing. Experimental results show significantly reduced mixing times, limited degradation, and ideal use for heat-sensitive compounds. Upon implementation, acoustic dissolution has reduced the number of samples requiring longer mixing times as well as reducing the number impacted by incomplete compound dissolution. It has also helped in increasing the overall sample concentration from 6 to 8 mM to 8 to 10 mM by ensuring complete compound solubilization. The application of Adaptive Focused Acoustics, however, cannot be applied to all Compound Management processes, such as sample thawing and low-volume sample reconstitution. This article will go on to describe the areas where Adaptive Focused Acoustics adds value as well as areas in which it has shown no clear benefit. PMID:19487768

  11. IFDOTMETER: A New Software Application for Automated Immunofluorescence Analysis.

    PubMed

    Rodríguez-Arribas, Mario; Pizarro-Estrella, Elisa; Gómez-Sánchez, Rubén; Yakhine-Diop, S M S; Gragera-Hidalgo, Antonio; Cristo, Alejandro; Bravo-San Pedro, Jose M; González-Polo, Rosa A; Fuentes, José M

    2016-04-01

    Most laboratories interested in autophagy use different imaging software for managing and analyzing heterogeneous parameters in immunofluorescence experiments (e.g., LC3-puncta quantification and determination of the number and size of lysosomes). One solution would be software that works on a user's laptop or workstation that can access all image settings and provide quick and easy-to-use analysis of data. Thus, we have designed and implemented an application called IFDOTMETER, which can run on all major operating systems because it has been programmed using JAVA (Sun Microsystems). Briefly, IFDOTMETER software has been created to quantify a variety of biological hallmarks, including mitochondrial morphology and nuclear condensation. The program interface is intuitive and user-friendly, making it useful for users not familiar with computer handling. By setting previously defined parameters, the software can automatically analyze a large number of images without the supervision of the researcher. Once analysis is complete, the results are stored in a spreadsheet. Using software for high-throughput cell image analysis offers researchers the possibility of performing comprehensive and precise analysis of a high number of images in an automated manner, making this routine task easier. PMID:26303944
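
    IFDOTMETER's implementation is not shown in the abstract. A minimal sketch of the thresholding-plus-connected-components idea that underlies puncta counting in such tools (generic, pure-Python illustration):

```python
# Generic sketch of puncta quantification: threshold the image, then count
# 4-connected bright regions with a flood fill. Not IFDOTMETER's actual code.
def count_puncta(image, threshold):
    """Count 4-connected regions whose intensity exceeds the threshold."""
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if image[r][c] > threshold and not seen[r][c]:
                count += 1
                stack = [(r, c)]           # flood-fill one punctum
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols \
                            and image[y][x] > threshold and not seen[y][x]:
                        seen[y][x] = True
                        stack += [(y + 1, x), (y - 1, x),
                                  (y, x + 1), (y, x - 1)]
    return count

img = [[0, 9, 0, 0],
       [0, 9, 0, 8],
       [0, 0, 0, 8],
       [7, 0, 0, 0]]
# Three separate puncta lie above threshold 5 in this tiny image.
```

Batch analysis of the kind the abstract describes is then just this routine mapped over a folder of images with previously defined parameters.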

  12. An Experiment in Determining Software Reliability Model Applicability

    NASA Technical Reports Server (NTRS)

    Nikora, Allen P.

    1995-01-01

    There have been few reports on the behavior of software reliability models under controlled conditions. That is to say, most of the reported experience with the models is from the testing phase of actual projects, during which researchers have little or no control over the data with which they work. Given that failure data for actual projects can be noisy, distorted, and uncertain, reported procedures for determining model applicability may be incomplete.
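
    As a concrete instance of the model-applicability question, one common model is the Goel-Okumoto NHPP, whose mean cumulative failure count is m(t) = a(1 − e^(−bt)); applicability can be judged by how well the fitted curve tracks observed failures. The sketch below is hedged: the crude grid search stands in for the maximum-likelihood estimation real reliability tools use.

```python
import math

def goel_okumoto(t, a, b):
    """Mean cumulative failures m(t) = a * (1 - exp(-b t))."""
    return a * (1.0 - math.exp(-b * t))

def sse(times, counts, a, b):
    """Sum of squared errors between the model and observed counts."""
    return sum((goel_okumoto(t, a, b) - n) ** 2 for t, n in zip(times, counts))

def grid_fit(times, counts):
    """Crude grid search for (a, b); illustrative only."""
    best = None
    for a in range(10, 201, 5):
        for b100 in range(1, 101):
            b = b100 / 100.0
            err = sse(times, counts, a, b)
            if best is None or err < best[0]:
                best = (err, a, b)
    return best

# Synthetic failure counts generated from a=100, b=0.3 should be recovered.
times = [1, 2, 3, 4, 5, 6]
counts = [round(goel_okumoto(t, 100, 0.3)) for t in times]
err, a_hat, b_hat = grid_fit(times, counts)
```

With clean synthetic data the fit recovers the true parameters; the experiment the abstract motivates is exactly what happens to such fits when the data are noisy, distorted, or uncertain.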

  13. Application of automated topography focus corrections for volume manufacturing

    NASA Astrophysics Data System (ADS)

    Wiltshire, Timothy J.; Liegl, Bernhard R.; Hwang, Emily M.; Lucksinger, Mark R.

    2010-03-01

    This work describes the implementation and performance of AGILE focus corrections for advanced photo lithography in volume production as well as advanced development in IBM's 300mm facility. In particular, a logic hierarchy that manages the air gage sub-system corrections to optimize tool productivity while sampling with sufficient frequency to ensure focus accuracy for stable production processes is described. The information reviewed includes: General AGILE implementation approaches; Sample focus correction contours for critical 45nm, 32nm, and 22nm applications; An outline of the IBM Advanced Process Control (APC) logic and system(s) that manage the focus correction sets; Long term, historical focus correction data for stable 45nm processes as well as development stage 32nm processes; Practical issues encountered and possible enhancements to the methodology.
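
    The APC correction logic itself is not disclosed; as a hedged sketch of the generic run-to-run update such focus-control loops typically use, an EWMA filter blends each lot's measured focus error into the running correction (the weight and error values below are illustrative):

```python
# Generic run-to-run (EWMA) focus-correction sketch, not IBM's APC system.
def ewma_update(correction, measured_error, weight=0.3):
    """Blend the newest measured focus error into the running correction."""
    return correction + weight * measured_error

correction = 0.0
for measured_error in [20.0, 12.0, 9.0, 5.0, 4.0]:   # nm of residual error/lot
    correction = ewma_update(correction, measured_error)
# The applied correction converges toward the systematic focus offset while
# the low weight damps lot-to-lot measurement noise.
```

The weight trades responsiveness against noise rejection, which is the same tension the sampling-frequency logic in the abstract manages: sample often enough for accuracy, rarely enough to preserve tool productivity.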

  14. Neurological applications of transcranial high intensity focused ultrasound.

    PubMed

    Wang, Tony R; Dallapiazza, Rob; Elias, W Jeff

    2015-05-01

    Advances in transcranial MRI-guided focused ultrasound have renewed interest in lesioning procedures in functional neurosurgery with a potential role in the treatment of neurological conditions such as chronic pain, brain tumours, movement disorders and psychiatric diseases. While the use of transcranial MRI-guided focused ultrasound represents a new innovation in neurosurgery, ultrasound has been used in neurosurgery for almost 60 years. This paper reviews the major historical milestones that have led to modern transcranial focused ultrasound and discusses current and evolving applications of ultrasound in the brain. PMID:25703389

  15. Improving ICT Governance by Reorganizing Operation of ICT and Software Applications: The First Step to Outsource

    NASA Astrophysics Data System (ADS)

    Johansson, Björn

    During recent years great attention has been paid to outsourcing as well as to the reverse, insourcing (Dibbern et al., 2004). There has been a strong focus on how the management of software applications and information and communication technology (ICT), expressed as ICT management versus ICT governance, should be carried out (Grembergen, 2004). The maintenance and operation of software applications and ICT consume much of the resources spent on ICT in organizations today (Bearingpoint, 2004), and managers are asked to increase the business benefits of these investments (Weill & Ross, 2004). That is, they are asked to improve the usage of ICT and to develop new business-critical solutions supported by ICT. It also means that investments in ICT and software applications need to be shown to be worthwhile. Basically there are two considerations to take into account with ICT usage: cost reduction and improving business value. How the governance and management of ICT and software applications are organized is important, which means that improving the control of maintenance and operation may be of interest to executives. It can be stated that usage depends on how it is organized. So, if improved ICT governance amounts to having well-organized ICT resources, could this be seen as the first step for organizations striving toward external provision of ICT? This question is dealt with to some degree in this paper.

  16. Open source software engineering for geoscientific modeling applications

    NASA Astrophysics Data System (ADS)

    Bilke, L.; Rink, K.; Fischer, T.; Kolditz, O.

    2012-12-01

    OpenGeoSys (OGS) is a scientific open source project for numerical simulation of thermo-hydro-mechanical-chemical (THMC) processes in porous and fractured media. The OGS software development community is distributed all over the world, and people with different backgrounds contribute code to a complex software system. The following points have to be addressed for successful software development:

    - Platform-independent code
    - A unified build system
    - A version control system
    - A collaborative project web site
    - Continuous builds and testing
    - Binaries and documentation for end users

    OGS should run on a PC as well as on a computing cluster regardless of the operating system. Therefore the code should not include any platform-specific feature or library; instead, open source and platform-independent libraries such as Qt for the graphical user interface or VTK for visualization algorithms are used. A source code management and version control system is a definite requirement for distributed software development. For this purpose Git is used, which enables developers to work on separate versions (branches) of the software and to merge those versions at some point into the official one. The version control system is integrated into an information and collaboration website based on a wiki system. The wiki is used for collecting information such as tutorials, application examples and case studies. Discussions take place on the OGS mailing list. To improve code stability and to verify code correctness, a continuous build and testing system based on the Jenkins Continuous Integration Server has been established. This server is connected to the version control system and, on every code change:

    - Compiles (builds) the code on every supported platform (Linux, Windows, MacOS)
    - Runs a comprehensive test suite of over 120 benchmarks and verifies the results
    - Runs software-development-related metrics on the code (such as compiler warnings and code complexity)
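
    The per-commit gate described above (build, benchmark, metrics, in order) can be sketched as a small driver that runs each stage and fails fast. The stage names and commands below are placeholders for illustration, not OGS's actual build scripts.

```python
import subprocess

# Hypothetical per-commit CI gate modeled on the workflow described above:
# build, run the benchmark suite, then collect code metrics. The stage
# commands are placeholders, not OGS's real build scripts.

def run_stage(name, argv, runner=subprocess.run):
    """Run one pipeline stage; return True on exit code 0."""
    result = runner(argv)
    return result.returncode == 0

def ci_gate(stages, runner=subprocess.run):
    """Run stages in order, stopping at the first failure.

    Returns (passed, report) where report maps stage name -> bool.
    """
    report = {}
    for name, argv in stages:
        ok = run_stage(name, argv, runner)
        report[name] = ok
        if not ok:
            return False, report
    return True, report

STAGES = [
    ("build",      ["make", "all"]),                    # placeholder command
    ("benchmarks", ["ctest", "--output-on-failure"]),   # placeholder command
    ("metrics",    ["cppcheck", "src/"]),               # placeholder command
]
```

    Injecting the `runner` makes the gate testable without executing real build commands, which is also how such a driver would be unit-tested in practice.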

  17. Application of software to development of reactor-safety codes

    SciTech Connect

    Wilburn, N.P.; Niccoli, L.G.

    1980-09-01

    Over the past two and a half decades, the application of new techniques has reduced hardware costs for digital computer systems and increased computational speed by several orders of magnitude. A corresponding cost reduction in business and scientific software development has not occurred. The same situation is seen for software developed to model the thermohydraulic behavior of nuclear systems under hypothetical accident conditions, and it is particularly evident when costs over the total software life cycle are considered. A solution to this dilemma for reactor safety code systems has been demonstrated by applying the software engineering techniques developed over the last few years in the aerospace and business communities. These techniques have recently been applied with a great deal of success in three major projects at the Hanford Engineering Development Laboratory (HEDL): 1) a rewrite of a major safety code (MELT); 2) development of a new code system (CONACS) for describing the response of LMFBR containment to hypothetical accidents; and 3) development of two new modules for reactor safety analysis.

  18. Software architecture for time-constrained machine vision applications

    NASA Astrophysics Data System (ADS)

    Usamentiaga, Rubén; Molleda, Julio; García, Daniel F.; Bulnes, Francisco G.

    2013-01-01

    Real-time image and video processing applications require skilled architects, and recent trends in hardware platforms make the design and implementation of these applications increasingly complex. Many frameworks and libraries have been proposed or commercialized to simplify the design and tuning of real-time image processing applications. However, they tend to lack flexibility, because they are normally oriented toward particular types of applications or impose specific data processing models such as the pipeline. Other issues include large memory footprints, difficulty of reuse, and inefficient execution on multicore processors. We present a novel software architecture for time-constrained machine vision applications that addresses these issues. The architecture is divided into three layers. The platform abstraction layer provides a high-level application programming interface for the rest of the architecture. The messaging layer provides a message-passing interface based on a dynamic publish/subscribe pattern; topic-based filtering, in which messages are published to topics, routes messages from publishers to the subscribers interested in a particular type of message. The application layer provides a repository for reusable application modules designed for machine vision applications. These modules, which include acquisition, visualization, communication, user interface, and data processing, take advantage of the power of well-known libraries such as OpenCV, Intel IPP, or CUDA. Finally, the proposed architecture is applied to a real machine vision application: a jam detector for steel pickling lines.
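
    The topic-based publish/subscribe routing described for the messaging layer can be sketched in a few lines: subscribers register per topic, and published messages are delivered only to matching subscribers. The class and topic names below are illustrative, not the paper's actual API.

```python
from collections import defaultdict

# Minimal sketch of a dynamic publish/subscribe bus with topic-based
# filtering, the routing scheme the messaging layer described above uses.
# Names are illustrative, not the architecture's actual API.

class MessageBus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        """Register a callback for every message published to `topic`."""
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        """Route `message` to all callbacks subscribed to `topic`."""
        for callback in self._subscribers[topic]:
            callback(message)

bus = MessageBus()
frames = []
bus.subscribe("camera/acquired", frames.append)
bus.publish("camera/acquired", {"frame": 1})
bus.publish("stats/updated", {"mean": 0.5})  # no subscriber: silently dropped
```

    The key property is decoupling: the acquisition module publishes frames without knowing which visualization or processing modules, if any, are listening.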

  19. Software Transition Project Retrospectives and the Application of SEL Effort Estimation Model and Boehm's COCOMO to Complex Software Transition Projects

    NASA Technical Reports Server (NTRS)

    McNeill, Justin

    1995-01-01

    The Multimission Image Processing Subsystem (MIPS) at the Jet Propulsion Laboratory (JPL) has managed transitions of application software sets from one operating system and hardware platform to multiple operating systems and hardware platforms. As a part of these transitions, cost estimates were generated from the personal experience of in-house developers and managers to calculate the total effort required for such projects. Productivity measures have been collected for two such transitions, one very large and the other relatively small in terms of source lines of code. These estimates used a cost estimation model similar to the Software Engineering Laboratory (SEL) Effort Estimation Model. Experience in transitioning software within JPL MIPS has uncovered a high incidence of interface complexity. Interfaces, both internal and external to individual software applications, have contributed to software transition project complexity, and thus to scheduling difficulties and larger-than-anticipated design work on software to be ported.
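
    For context on the models named in the title: Boehm's basic COCOMO estimates effort as a power law of code size, effort = a * KLOC^b, with published coefficients per project class. The sketch below uses those standard coefficients as an illustration; it is not the SEL model or the estimates actually used at JPL.

```python
# Basic COCOMO effort estimate (Boehm, 1981): effort in person-months as a
# power law of size in thousands of source lines of code (KLOC). The
# coefficients are the standard published values for the three project
# classes; this is an illustration, not the SEL model used in the paper.

COCOMO_COEFFS = {
    "organic":       (2.4, 1.05),  # small teams, familiar environment
    "semi-detached": (3.0, 1.12),  # intermediate
    "embedded":      (3.6, 1.20),  # tight hardware/operational constraints
}

def cocomo_effort(kloc, mode="organic"):
    """Return estimated effort in person-months for `kloc` thousand lines."""
    a, b = COCOMO_COEFFS[mode]
    return a * kloc ** b
```

    A 10 KLOC organic project, for example, comes out to roughly 27 person-months, while the same size in embedded mode more than doubles the estimate.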

  20. Computational protein design: the Proteus software and selected applications.

    PubMed

    Simonson, Thomas; Gaillard, Thomas; Mignon, David; Schmidt am Busch, Marcel; Lopes, Anne; Amara, Najette; Polydorides, Savvas; Sedano, Audrey; Druart, Karen; Archontis, Georgios

    2013-10-30

    We describe an automated procedure for protein design, implemented in a flexible software package, called Proteus. System setup and calculation of an energy matrix are done with the XPLOR modeling program and its sophisticated command language, supporting several force fields and solvent models. A second program provides algorithms to search sequence space. It allows a decomposition of the system into groups, which can be combined in different ways in the energy function, for both positive and negative design. The whole procedure can be controlled by editing 2-4 scripts. Two applications consider the tyrosyl-tRNA synthetase enzyme and its successful redesign to bind both O-methyl-tyrosine and D-tyrosine. For the latter, we present Monte Carlo simulations where the D-tyrosine concentration is gradually increased, displacing L-tyrosine from the binding pocket and yielding the binding free energy difference, in good agreement with experiment. Complete redesign of the Crk SH3 domain is presented. The top 10000 sequences are all assigned to the correct fold by the SUPERFAMILY library of Hidden Markov Models. Finally, we report the acid/base behavior of the SNase protein. Sidechain protonation is treated as a form of mutation; it is then straightforward to perform constant-pH Monte Carlo simulations, which yield good agreement with experiment. Overall, the software can be used for a wide range of applications, producing not only native-like sequences but also thermodynamic properties with errors that appear comparable to other current software packages. PMID:24037756
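
    The constant-pH and ligand-displacement results above rest on Metropolis Monte Carlo sampling. The generic Metropolis acceptance rule is shown below as a textbook sketch, not Proteus's actual implementation: downhill moves are always accepted, uphill moves with Boltzmann probability.

```python
import math
import random

# Generic Metropolis Monte Carlo acceptance rule, the sampling scheme
# underlying constant-pH simulations like those described above. This is
# a textbook sketch, not Proteus's actual implementation.

def metropolis_accept(delta_e, beta, rng):
    """Accept a proposed move with probability min(1, exp(-beta * dE))."""
    if delta_e <= 0.0:
        return True
    return rng.random() < math.exp(-beta * delta_e)

rng = random.Random(0)
# For dE = 1 and beta = 1, the long-run acceptance rate approaches
# exp(-1) ~ 0.368.
accepted = sum(metropolis_accept(1.0, 1.0, rng) for _ in range(10000))
```

    In a constant-pH run, a "move" would be a change of a sidechain's protonation state, with delta_e including the pH-dependent free energy of (de)protonation.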

  1. Software Failure Propagation Prevention: Application of the Guidelines and Recommendations of Proof of Concept Pilots for Ground and Flight Software

    NASA Astrophysics Data System (ADS)

    Hann, Mark; Rodriguez Dapena, Patricia; Moretti, Davide

    2014-08-01

    The Software Failure Propagation Prevention (SFPP) project aims to establish the conditions and identify the methods, techniques and tools to be used to ensure prevention of failure propagation between software products and components of different criticality categories, both on board and in the ground segment, under an ESA R&D contract [2]. This paper describes the development of the "Guidelines And Recommendations For Prevention Of Software Failure Propagation" [6] and the application of these guidelines in two proof-of-concept pilots. These pilots demonstrate the application of the techniques and methods described in the guidelines to two existing space software applications, one for the ground segment and one for the flight segment. This paper follows on from [1], which described the analysis of software failure propagation and the review of current practices inside and outside the European space domain.

  2. Application and systems software in Ada: Development experiences

    NASA Technical Reports Server (NTRS)

    Kuschill, Jim

    1986-01-01

    In its most basic sense, software development involves describing the tasks to be solved, including the given objects and the operations to be performed on those objects. Unfortunately, the way people describe objects and operations usually bears little resemblance to source code in most contemporary computer languages. There are two ways around this problem. One is to allow users to describe what they want the computer to do in everyday, typically imprecise, English. The PRODOC methodology and software development environment is based on a second, more flexible and possibly even easier-to-use approach. Rather than hiding program structure, PRODOC represents such structure graphically using visual programming techniques. In addition, the program terminology used in PRODOC may be customized to match the way human experts in any given application area naturally describe the relevant data and operations. The PRODOC methodology is described in detail.

  3. Evaluation of the Trajectory Operations Applications Software Task (TOAST)

    NASA Technical Reports Server (NTRS)

    Perkins, Sharon; Martin, Andrea; Bavinger, Bill

    1990-01-01

    The Trajectory Operations Applications Software Task (TOAST) is a software development project under the auspices of the Mission Operations Directorate. Its purpose is to provide trajectory operations pre-mission and real-time support for the Space Shuttle program. As an Application Manager, TOAST provides an isolation layer between the underlying Unix operating system and a series of user programs. It provides two main services: a common interface to operating system functions with semantics appropriate for C or FORTRAN, and a structured input and output package that can be utilized by user application programs. To evaluate TOAST as an Application Manager, the task was to assess its current and planned capabilities, compare those capabilities with functions available in commercial off-the-shelf (COTS) software, and assess the needs of Flight Analysis Design System (FADS) users for TOAST implementation. As a result of the investigation, it was found that the current version of TOAST is well implemented and meets the needs of the real-time users. The plans for migrating TOAST to the X Window System are essentially sound; the Executive will port with minor changes, while the Menu Handler will require a total rewrite. A series of recommendations for future TOAST directions is included.

  4. A Software Development Platform for Wearable Medical Applications.

    PubMed

    Zhang, Ruikai; Lin, Wei

    2015-10-01

    Wearable medical devices have become a leading trend in the healthcare industry. Microcontrollers are computers on a chip with sufficient processing power, and are the preferred embedded computing units in those devices. We have developed a software platform specifically for the design of wearable medical applications with a small code footprint on the microcontrollers. It is supported by the open source real-time operating system FreeRTOS and supplemented with a set of standard APIs for the architecture-specific hardware interfaces on the microcontrollers for data acquisition and wireless communication. We modified the tick counter routine in FreeRTOS to include a real-time soft clock. When combined with the multitasking features of FreeRTOS, the platform offers quick development of wearable applications and easy porting of the application code to different microprocessors. Test results have demonstrated that application software developed using this platform is highly efficient in CPU usage while maintaining a small code footprint to accommodate the limited memory space in microcontrollers. PMID:26276017
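
    The tick-counter soft clock can be illustrated schematically: each RTOS tick increments a millisecond counter from which wall-clock time is derived on demand. The sketch below is in Python for readability (the real implementation is C inside the FreeRTOS tick routine), and the one-millisecond tick period is an assumed configuration.

```python
# Schematic model of a tick-driven soft real-time clock like the one the
# platform adds to the FreeRTOS tick counter routine. Python is used for
# illustration only; the real implementation is C in the RTOS tick hook.

TICK_MS = 1  # one RTOS tick per millisecond (configuration assumption)

class SoftClock:
    def __init__(self):
        self.ms = 0  # milliseconds since boot

    def on_tick(self):
        """Called once per RTOS tick interrupt."""
        self.ms += TICK_MS

    def now(self):
        """Return (hours, minutes, seconds, milliseconds) since boot."""
        total_s, ms = divmod(self.ms, 1000)
        total_m, s = divmod(total_s, 60)
        h, m = divmod(total_m, 60)
        return h, m, s, ms

clock = SoftClock()
for _ in range(65_005):  # simulate 1 min 5.005 s worth of ticks
    clock.on_tick()
```

    Deriving h:m:s lazily in `now()` keeps the per-tick work to a single addition, which matters when the tick handler runs in interrupt context.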

  5. WinTRAX: A raytracing software package for the design of multipole focusing systems

    NASA Astrophysics Data System (ADS)

    Grime, G. W.

    2013-07-01

    The software package TRAX was a simulation tool for modelling the path of charged particles through linear cylindrical multipole fields described by analytical expressions and was a development of the earlier OXRAY program (Grime and Watt, 1983; Grime et al., 1982) [1,2]. In a 2005 comparison of raytracing software packages (Incerti et al., 2005) [3], TRAX/OXRAY was compared with Geant4 and Zgoubi and was found to give close agreement with the more modern codes. TRAX was a text-based program which was only available for operation in a now rare VMS workstation environment, so a new program, WinTRAX, has been developed for the Windows operating system. This implements the same basic computing strategy as TRAX, and key sections of the code are direct translations from FORTRAN to C++, but the Windows environment is exploited to make an intuitive graphical user interface which simplifies and enhances many operations including system definition and storage, optimisation, beam simulation (including with misaligned elements) and aberration coefficient determination. This paper describes the program and presents comparisons with other software and real installations.
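
    As a flavor of the optics such raytracing codes model, in the first-order (thin-lens) approximation a quadrupole focuses in one transverse plane and rays propagate via 2x2 transfer matrices. The sketch below is generic textbook matrix optics, not the integration scheme used by WinTRAX/TRAX.

```python
# First-order (matrix) optics: a thin lens of focal length f followed by a
# drift. Generic textbook arithmetic, not WinTRAX's raytracing algorithm.

def thin_lens(f):
    """2x2 transfer matrix for a thin lens of focal length f."""
    return [[1.0, 0.0], [-1.0 / f, 1.0]]

def drift(length):
    """2x2 transfer matrix for a field-free drift of given length."""
    return [[1.0, length], [0.0, 1.0]]

def propagate(m, ray):
    """Propagate ray = (position, angle) through transfer matrix m."""
    x, xp = ray
    return (m[0][0] * x + m[0][1] * xp,
            m[1][0] * x + m[1][1] * xp)

# A ray parallel to the axis at x = 1 mm crosses the axis exactly one
# focal length downstream of the lens.
f = 0.5  # focal length in metres (arbitrary example value)
ray = (1e-3, 0.0)
ray = propagate(thin_lens(f), ray)
ray = propagate(drift(f), ray)
```

    Full codes like WinTRAX integrate trajectories through the actual analytic multipole fields rather than using thin-lens matrices, which is what allows them to compute aberration coefficients.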

  6. Software tools for developing parallel applications. Part 1: Code development and debugging

    SciTech Connect

    Brown, J.; Geist, A.; Pancake, C.; Rover, D.

    1997-04-01

    Developing an application for parallel computers can be a lengthy and frustrating process, making it a perfect candidate for software tool support. Yet application programmers are often the last to hear about new tools emerging from R and D efforts. This paper provides an overview of two areas of tool support: code development and debugging. Each is discussed in terms of the programmer needs addressed, the extent to which representative current tools meet those needs, and what new levels of tool support are important if parallel computing is to become more widespread.

  7. MathBrowser: Web-Enabled Mathematical Software with Application to the Chemistry Curriculum, v 1.0

    NASA Astrophysics Data System (ADS)

    Goldsmith, Jack G.

    1997-10-01

    MathSoft: Cambridge, MA, 1996; free via ftp from www.mathsoft.com. The movement to provide computer-based applications in chemistry has come to focus on three main areas: software aimed at specific applications (drawing, simulation, data analysis, etc.), multimedia applications designed to assist in the presentation of conceptual information, and packages to be used in conjunction with a particular textbook at a specific point in the chemistry curriculum. The result is a situation where no single software package devoted to problem solving can be used across a large segment of the curriculum. Adoption of World Wide Web (WWW) technology by a manufacturer of mathematical software, however, has produced software that provides an attractive means of providing a problem-solving resource to students in courses from freshman through senior level.

  8. Application of existing design software to problems in neuronal modeling.

    PubMed

    Vranić-Sowers, S; Fleshman, J W

    1994-03-01

    In this communication, we describe the application of the Valid/Analog Design Tools circuit simulation package called PC Workbench to the problem of modeling the electrical behavior of neural tissue. A nerve cell representation as an equivalent electrical circuit using compartmental models is presented. Several types of nonexcitable and excitable membranes are designed, and simulation results for different types of electrical stimuli are compared to the corresponding analytical data. It is shown that the hardware/software platform and the models developed constitute an accurate, flexible, and powerful way to study neural tissue. PMID:8045583
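
    The equivalent-circuit idea is easiest to see in a single passive compartment: a membrane capacitance in parallel with a leak resistance, driven by an injected current. The forward-Euler sketch below uses generic textbook parameter values, not those of the paper's PC Workbench models.

```python
# Single passive membrane compartment: C_m dV/dt = -(V - E_rest)/R_m + I_inj.
# Forward-Euler integration; parameters are generic textbook values, not
# those of the paper's PC Workbench circuit models.

def simulate_compartment(i_inj_na, t_stop_ms, dt_ms=0.01,
                         c_m_nf=1.0, r_m_mohm=10.0, e_rest_mv=-70.0):
    """Return membrane potential (mV) after t_stop_ms of constant current.

    Units are chosen so they cancel cleanly: nA * MOhm = mV, and
    nF * MOhm = ms, so tau = R_m * C_m = 10 ms here.
    """
    v = e_rest_mv
    for _ in range(int(t_stop_ms / dt_ms)):
        dv = (-(v - e_rest_mv) / r_m_mohm + i_inj_na) * dt_ms / c_m_nf
        v += dv
    return v

# Steady state is V = E_rest + I * R_m; 100 ms (10 time constants) of
# simulation settles the response to within a fraction of a microvolt.
v_final = simulate_compartment(i_inj_na=1.0, t_stop_ms=100.0)
```

    Excitable membranes add voltage-dependent conductances to this same circuit, which is exactly what makes a general-purpose circuit simulator a plausible neural modeling tool.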

  9. Supporting SBML as a model exchange format in software applications.

    PubMed

    Keating, Sarah M; Le Novère, Nicolas

    2013-01-01

    This chapter describes the Systems Biology Markup Language (SBML) from its origins. It describes the rationale behind and importance of having a common language when it comes to representing models. This chapter mentions the development of SBML and outlines the structure of an SBML model. It provides a section on libSBML, a useful application programming interface (API) library for reading, writing, manipulating and validating content expressed in the SBML format. Finally the chapter also provides a description of the SBML Toolbox which provides a means of facilitating the import and export of SBML from both MATLAB and Octave ( http://www.gnu.org/software/octave/) environments. PMID:23715987
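
    To give a feel for the model structure the chapter outlines, the fragment below parses a minimal SBML Level 3 document with Python's standard XML library. The toy model is invented, and real applications would use libSBML's API (as the chapter recommends) rather than raw XML handling.

```python
import xml.etree.ElementTree as ET

# Minimal invented SBML Level 3 model: one compartment, two species, one
# reaction. Parsed with the standard library for illustration only;
# production code would use libSBML as the chapter describes.

SBML_DOC = """<sbml xmlns="http://www.sbml.org/sbml/level3/version1/core"
      level="3" version="1">
  <model id="toy_model">
    <listOfCompartments>
      <compartment id="cell" constant="true"/>
    </listOfCompartments>
    <listOfSpecies>
      <species id="S1" compartment="cell" constant="false"/>
      <species id="S2" compartment="cell" constant="false"/>
    </listOfSpecies>
    <listOfReactions>
      <reaction id="conversion" reversible="false"/>
    </listOfReactions>
  </model>
</sbml>"""

NS = {"sbml": "http://www.sbml.org/sbml/level3/version1/core"}
root = ET.fromstring(SBML_DOC)
model = root.find("sbml:model", NS)
species_ids = [s.get("id") for s in model.findall(".//sbml:species", NS)]
```

    libSBML adds what raw parsing cannot: validation against the SBML specification, consistency checking, and a uniform object model across the many language bindings.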

  10. Application of software simulation to DBS transmission design and evaluation

    NASA Astrophysics Data System (ADS)

    White, Lawrence W.; Palmer, Larry C.; Chang, Peter Y.; Shenoy, Ajit

    The paper describes software simulation results in the following three areas related to DBS planning: multipath and overdeviation of the video signal, digital audio transmission, and phase-locked loop demodulation. The results obtained for multipath and overdeviation involved simulation to examine a dual-path phenomenon that can be encountered in DBS applications. Results are also presented on the insertion of high-rate digital data directly into the horizontal blanking interval of the video scan line. The results of simulations with a phase-locked demodulator determined the degree of threshold extension achieved for various test patterns, as compared to a conventional limiter/discriminator.

  11. APPLICATION OF SOFTWARE QUALITY ASSURANCE CONCEPTS AND PROCEDURES TO ENVIORNMENTAL RESEARCH INVOLVING SOFTWARE DEVELOPMENT

    EPA Science Inventory

    As EPA’s environmental research expands into new areas that involve the development of software, quality assurance concepts and procedures that were originally developed for environmental data collection may not be appropriate. Fortunately, software quality assurance is a ...

  12. Optimizing Focusing X-Ray Optics for Planetary Science Applications

    NASA Astrophysics Data System (ADS)

    Melso, Nicole; Romaine, Suzanne; Hong, Jaesub; Cotroneo, Vincenzo

    2015-01-01

    X-Ray observations are a valuable tool for studying the composition, formation and evolution of the numerous X-Ray emitting objects in our Solar System. Although there are plenty of useful applications for in situ X-Ray focusing instrumentation, X-Ray focusing optics have never been feasible for use onboard planetary missions due to their mass and cost. Recent advancements in small-scale X-Ray instrumentation have made focusing X-Ray technology more practical and affordable for use onboard in situ spacecraft. Specifically, the technology of a metal-ceramic hybrid material combined with Electroformed Nickel Replication (ENR) holds great promise for realizing lightweight X-ray optics. We are working to optimize these lightweight focusing X-Ray optics for use in planetary science applications. We have explored multiple configurations and geometries that maximize the telescope's effective area and field of view while meeting practical mass and volume requirements. Each configuration was modeled via analytic calculations and Monte Carlo ray tracing simulations and compared to alternative Micro-pore Optics designs. The improved performance of our approach using hybrid materials has many exciting implications for the future of planetary science, X-Ray instrumentation, and the exploration of X-Ray sources in our Solar System. This work was supported in part by the NSF REU and DoD ASSURE programs under NSF grant no. 1262851 and by the Smithsonian Institution.

  13. Application of time reversal acoustics focusing for nonlinear imaging

    NASA Astrophysics Data System (ADS)

    Sarvazyan, Armen; Sutin, Alexander

    2001-05-01

    Time reversal acoustic (TRA) focusing of ultrasound appears to be an effective tool for nonlinear imaging in industrial and medical applications because of its ability to efficiently concentrate ultrasonic energy (close to the diffraction limit) in heterogeneous media. In this study, we used two TRA systems to focus ultrasonic beams of different frequencies on coinciding focal points, thus causing the generation of ultrasonic waves at combination frequencies. Measurements of the intensity of these combination-frequency waves provide information on the nonlinear parameter of the medium in the focal region. Synchronized steering of the two TRA focused beams enables obtaining 3-D acoustic nonlinearity images of the object. Each of the TRA systems employed an aluminum resonator with piezotransducers glued to its facet. One of the free facets of each resonator was submerged in a water tank and served as a virtual phased array capable of ultrasound focusing and beam steering. To mimic a medium with spatially varying acoustical nonlinearity, a simple model, a microbubble column in water, was used. Microbubbles were generated by electrolysis of water using a needle electrode. An order-of-magnitude increase in the sum-frequency component was observed when the ultrasound beams were focused on the area with bubbles.
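
    The nonlinear signature exploited here, energy appearing at combination frequencies, is easy to reproduce numerically: passing a two-tone signal through a quadratic nonlinearity creates a component at f1 + f2 that is absent in the linear case. The sketch below is generic signal arithmetic with made-up frequencies, not a model of the TRA systems themselves.

```python
import cmath
import math

# Two-tone signal through a weak quadratic nonlinearity: the squared term
# creates sum- and difference-frequency components. We measure the
# amplitude at f1 + f2 with a discrete Fourier projection. Generic
# numerical demo with made-up parameters, not a model of the TRA hardware.

FS = 100_000.0             # sample rate, Hz
N = 10_000                 # number of samples (0.1 s)
F1, F2 = 3_000.0, 4_000.0  # the two beam frequencies (illustrative values)

def tone_pair(n):
    t = n / FS
    return math.sin(2 * math.pi * F1 * t) + math.sin(2 * math.pi * F2 * t)

def amplitude_at(samples, freq):
    """Magnitude of the DFT projection of `samples` onto `freq`."""
    acc = sum(x * cmath.exp(-2j * math.pi * freq * n / FS)
              for n, x in enumerate(samples))
    return abs(acc) * 2.0 / len(samples)

linear = [tone_pair(n) for n in range(N)]
nonlinear = [x + 0.1 * x * x for x in linear]  # weak quadratic distortion

sum_freq_linear = amplitude_at(linear, F1 + F2)
sum_freq_nonlinear = amplitude_at(nonlinear, F1 + F2)
```

    Since 2 sin(a) sin(b) = cos(a-b) - cos(a+b), the 0.1 x^2 term puts an amplitude of about 0.1 at f1 + f2 = 7 kHz, while the linear signal has essentially none, which is why the sum-frequency intensity reports on the medium's nonlinearity.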

  14. Clinical applications of high-intensity focused ultrasound.

    PubMed

    She, W H; Cheung, T T; Jenkins, C R; Irwin, M G

    2016-08-01

    Ultrasound has been developed for therapeutic use in addition to its diagnostic ability. The use of focused ultrasound energy can offer a non-invasive method for tissue ablation, and can therefore be used to treat various solid tumours. High-intensity focused ultrasound is being increasingly used in the treatment of both primary and metastatic tumours as these can be precisely located for ablation. It has been shown to be particularly useful in the treatment of uterine fibroids, and various solid tumours including those of the pancreas and liver. High-intensity focused ultrasound is a valid treatment option for liver tumours in patients with significant medical co-morbidity who are at high risk for surgery or who have relatively poor liver function that may preclude hepatectomy. It has also been used as a form of bridging therapy for patients awaiting cadaveric donor liver transplantation. In this article, we outline the principles of high-intensity focused ultrasound and its clinical applications, including the management protocol development in the treatment of hepatocellular carcinoma in Hong Kong, by performing a search on MEDLINE (OVID), EMBASE, and PubMed. The search of these databases ranged from the date of their establishment until December 2015. The search terms used were: high-intensity focused ultrasound, ultrasound, magnetic resonance imaging, liver tumour, hepatocellular carcinoma, pancreas, renal cell carcinoma, prostate cancer, breast cancer, fibroids, bone tumour, atrial fibrillation, glaucoma, Parkinson's disease, essential tremor, and neuropathic pain. PMID:27380753

  15. Bibliographic Management Software: A Focus Group Study of the Preferences and Practices of Undergraduate Students

    ERIC Educational Resources Information Center

    Salem, Jamie; Fehrmann, Paul

    2013-01-01

    With the growing population of undergraduate students on our campus and an increased focus on their success, librarians at a large midwestern university were interested in the citation management styles of this university cohort. Our university library spends considerable resources each year to maintain and promote access to the robust…

  16. CHSSI Software for Geometrically Complex Unsteady Aerodynamic Applications

    NASA Technical Reports Server (NTRS)

    Chan, William M.; Meakin, Robert L.; Potsdam, Mark A.

    2001-01-01

    A comprehensive package of scalable overset grid CFD software is reviewed. The software facilitates accurate simulation of complete aircraft aerodynamics, including viscous effects, unsteadiness, and relative motion between component parts. The software significantly lowers the manpower and computer costs normally associated with such efforts. The software is discussed in terms of current capabilities and planned future enhancements.

  17. The Application of Software Safety to the Constellation Program Launch Control System

    NASA Technical Reports Server (NTRS)

    Kania, James; Hill, Janice

    2011-01-01

    The application of software safety practices on the LCS project resulted in the successful implementation of the NASA Software Safety Standard NASA-STD-8719.13B and the CxP software safety requirements. The GOP-GEN-GSW-011 Hazard Report was the first report developed at KSC to identify software hazard causes and their controls. This approach can be applied to similar large software-intensive systems where loss of control can lead to a hazard.

  18. CFEL–ASG Software Suite (CASS): usage for free-electron laser experiments with biological focus

    PubMed Central

    Foucar, Lutz

    2016-01-01

    CASS [Foucar et al. (2012). Comput. Phys. Commun. 183, 2207–2213] is a well established software suite for experiments performed at any sort of light source. It is based on a modular design and can easily be adapted for use at free-electron laser (FEL) experiments that have a biological focus. This article will list all the additional functionality and enhancements of CASS for use with FEL experiments that have been introduced since the first publication. The article will also highlight some advanced experiments with biological aspects that have been performed. PMID:27504079

  19. Correctness of Sensor Network Applications by Software Bounded Model Checking

    NASA Astrophysics Data System (ADS)

    Werner, Frank; Faragó, David

    We investigate the application of the software bounded model checking tool CBMC to the domain of wireless sensor networks (WSNs). We automatically generate a software behavior model from a network protocol (ESAWN) implementation in a WSN development and deployment platform (TinyOS), which is used to rigorously verify the protocol. Our work is a proof of concept that automatic verification of programs of practical size (≈ 21 000 LoC) and complexity is possible with CBMC and can be integrated into TinyOS. The developer can automatically check for pointer dereference and array index out-of-bounds errors. She can also check additional, e.g., functional, properties that she provides via assume- and assert-statements. This experience paper shows that our approach is in general feasible, since we managed to verify about half of the properties. We made the verification process scalable in the size of the code by abstraction (e.g., from hardware) and by simplification heuristics. The latter also achieved scalability in data type complexity for the properties that were verifiable. The others require technical advancements for complex data types within CBMC's core.
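
    For intuition only, the assume/assert style of property checked with CBMC can be mimicked by exhaustively exploring all inputs up to a small bound, which is what a bounded model checker does symbolically rather than by brute force. The buggy routine below is invented for illustration and has no connection to the ESAWN code.

```python
# Toy illustration of bounded checking: exhaustively explore all inputs up
# to a bound and test an assume/assert-style property. A bounded model
# checker like CBMC does this symbolically; the routine under test here is
# invented and contains a deliberate off-by-one bug.

def route_hop(buffer, hop):
    """Return the entry for the given hop; buggy: allows hop == len(buffer)."""
    if hop <= len(buffer):        # BUG: should be hop < len(buffer)
        return buffer[hop]
    return None

def check_no_index_error(bound):
    """Check all buffer sizes and hop indices below `bound`."""
    violations = []
    for size in range(bound):
        buffer = list(range(size))
        for hop in range(bound):
            if hop < 0 or hop > len(buffer):   # assume: hop in a sane range
                continue
            try:
                route_hop(buffer, hop)          # assert: no out-of-bounds access
            except IndexError:
                violations.append((size, hop))
    return violations

violations = check_no_index_error(bound=5)
```

    The exhaustive search finds exactly the boundary cases (hop equal to the buffer length), which is the same class of array-index error CBMC reports, but symbolic checking scales to input ranges that brute force cannot touch.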

  20. The DOE Meteorological Coordinating Council Perspective on the Application to Meteorological Software of DOE's Software Quality Assurance Requirements

    SciTech Connect

    Mazzola, Carl; Schalk, Walt; Glantz, Clifford S.

    2008-03-12

    Since 1994, the DOE Meteorological Coordinating Council (DMCC) has been addressing meteorological monitoring and meteorological applications program issues at U.S. Department of Energy/National Nuclear Security Administration (DOE/NNSA) sites and providing solutions to these issues. The fundamental objectives of the DMCC include promoting cost-effective meteorological support and facilitating the use of common meteorological methods, procedures, and standards at all DOE/NNSA sites. In 2005, the DOE established strict software quality assurance (SQA) requirements for safety software, including consequence assessment software used for hazard assessments and safety analyses. These evaluations often utilize meteorological data supplied by DOE/NNSA site-based meteorological programs. However, the DOE has not established SQA guidance for this type of safety-related meteorological software. To address this gap, the DMCC is developing this guidance for the use of both public- and private-sector organizations. The goal for the DMCC is to mimic the SQA requirements for safety software but allow a much greater degree of “grading” in determining exactly what specific activities are needed. The emphasis of the DMCC SQA guidelines is on three key elements: 1) design and implementation documentation, 2) configuration management, and 3) verification and validation testing. These SQA guidelines should provide owners and users of meteorological software with a fair degree of assurance that their software is reliable, documented, and tested without putting an undue burden on meteorological system software developers.

  1. Practical Application of Model Checking in Software Verification

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Skakkebaek, Jens Ulrik

    1999-01-01

    This paper presents our experiences in applying JAVA PATHFINDER (J(sub PF)), a recently developed JAVA-to-SPIN translator, to the finding of synchronization bugs in a Chinese Chess game server application written in JAVA. We give an overview of J(sub PF) and the subset of JAVA that it supports, and describe the abstraction and verification of the game server. Finally, we analyze the results of the effort. We argue that abstraction by under-approximation is necessary to obtain sufficiently small models for verification purposes; that user guidance is crucial for effective abstraction; and that current model checkers do not conveniently support the computational models of software in general and JAVA in particular.

  2. Optimizing Flight Control Software With an Application Platform

    NASA Technical Reports Server (NTRS)

    Smith, Irene Skupniewicz; Shi, Nija; Webster, Christopher

    2012-01-01

    Flight controllers in NASA's mission control centers work day and night to ensure that missions succeed and crews are safe. The IT goals of NASA mission control centers are similar to those of most businesses: to evolve IT infrastructure from basic to dynamic. This paper describes Mission Control Technologies (MCT), an application platform that is powering mission control today and is designed to meet the needs of future NASA control centers. MCT is an extensible platform that provides GUI components and a runtime environment. The platform enables NASA's IT goals through its use of lightweight interfaces and configurable components, which promote standardization and incorporate useful solution patterns. The MCT architecture positions mission control centers to reach the goal of dynamic IT, leading to lower cost of ownership and treating software as a strategic investment.

  3. Process Orchestration With Modular Software Applications On Intelligent Field Devices

    NASA Astrophysics Data System (ADS)

    Orfgen, Marius; Schmitt, Mathias

    2015-07-01

    The method developed by the DFKI-IFS for extending the functionality of intelligent field devices through the use of reloadable software applications (so-called Apps) is to be further augmented with a methodology and communication concept for process orchestration. The concept allows individual Apps from different manufacturers to decentrally share information. This way of communicating forms the basis for the dynamic orchestration of Apps to complete processes, in that it allows the actions of one App (e.g. detecting a component part with a sensor App) to trigger reactions in other Apps (e.g. triggering the processing of that component part). A holistic methodology and its implementation as a configuration tool allows one to model the information flow between Apps, as well as automatically introduce it into physical production hardware via available interfaces provided by the Field Device Middleware. Consequently, configuring industrial facilities is made simpler, resulting in shorter changeover and shutdown times.

  4. Operational excellence (six sigma) philosophy: Application to software quality assurance

    SciTech Connect

    Lackner, M.

    1997-11-01

    This report contains viewgraphs on the operational excellence philosophy of six sigma applied to software quality assurance. The report outlines the following: the goal of six sigma; six sigma tools; manufacturing vs. administrative processes; software quality assurance document inspections; mapping the software quality assurance requirements document; failure mode effects analysis for the requirements document; measuring the right response variables; and questions.

  5. Migrating data from TcSE to DOORS : an evaluation of the T-Plan Integrator software application.

    SciTech Connect

    Post, Debra S.; Manzanares, David A.; Taylor, Jeffrey L.

    2011-02-01

    This report describes our evaluation of the T-Plan Integrator software application as it was used to transfer a real data set from the Teamcenter for Systems Engineering (TcSE) software application to the DOORS software application. The T-Plan Integrator was evaluated to determine if it would meet the needs of Sandia National Laboratories to migrate our existing data sets from TcSE to DOORS. This report presents the struggles of migrating data and focuses on how the Integrator can be used to map a data set and its data architecture from TcSE to DOORS. Finally, this report describes how the bulk of the migration can take place using the Integrator; however, about 20-30% of the data would need to be transferred from TcSE to DOORS manually. This report does not evaluate the transfer of data from DOORS to TcSE.
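    The mapping-driven migration described above can be sketched as follows: records whose fields all map automatically through an attribute map are converted, while the remainder (the 20-30% mentioned) are queued for manual transfer. The field names below are illustrative assumptions, not the actual TcSE or DOORS schemas.

    ```python
    # Hypothetical sketch of a schema-mapped migration: items whose fields all
    # map automatically are transferred; the rest are queued for manual work.

    FIELD_MAP = {  # hypothetical TcSE -> DOORS attribute names
        "Title": "Object Heading",
        "Description": "Object Text",
        "Priority": "Priority",
    }

    def migrate(records):
        migrated, manual = [], []
        for rec in records:
            if all(field in FIELD_MAP for field in rec):
                migrated.append({FIELD_MAP[k]: v for k, v in rec.items()})
            else:
                manual.append(rec)  # unmapped field: needs hand transfer
        return migrated, manual

    records = [
        {"Title": "Req-1", "Description": "Shall do X"},
        {"Title": "Req-2", "CustomField": "legacy value"},  # unmapped field
    ]
    migrated, manual = migrate(records)
    print(len(migrated), len(manual))  # → 1 1
    ```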

  6. Current focus of stem cell application in retinal repair.

    PubMed

    Alonso-Alonso, María L; Srivastava, Girish K

    2015-04-26

    The relevance of retinal diseases, both to society's economy and to the quality of life of the people who suffer from them, has made stem cell therapy an interesting topic for research. Embryonic stem cells (ESCs), induced pluripotent stem cells (iPSCs) and adipose derived mesenchymal stem cells (ADMSCs) are the focus of current endeavors as a source of different retinal cells, such as photoreceptors and retinal pigment epithelial cells. The aim is to apply them for cell replacement as an option for treating retinal diseases which so far are untreatable in their advanced stage. ESCs, despite their great potential for differentiation, carry the dangerous risk of teratoma formation as well as ethical issues, which must be resolved before starting a clinical trial. iPSCs, like ESCs, are able to differentiate into several types of retinal cells. However, the process to obtain them for personalized cell therapy has a high cost in terms of time and money. Researchers are working to resolve this since iPSCs seem to be a realistic option for treating retinal diseases. ADMSCs have the advantage that the procedures to obtain them are easier. Despite advancements in stem cell application, several challenges still need to be overcome before transferring the research results to clinical application. This paper reviews recent research achievements in the application of these three types of stem cells as well as clinical trials currently based on them. PMID:25914770

  7. Application safety enhancement model using self-checking with software enzymes

    NASA Astrophysics Data System (ADS)

    Subramaniam, Chandrasekaran; Ravishankar, Arthi; Gopal, Deepthi; Subramanian, Dhaarini

    2011-12-01

    The objective of the paper is to propose a safety enhancement model for application software that accelerates self-checking strategies similar to enzymatic actions in biology. Safety-critical application software components may have to be assessed periodically or on demand to achieve not only functional correctness but also the specified safety features during execution. The design and deployment of such software modules can be formally verified for possible safety flaws using self-checking capabilities and software enzymatic actions. The self-checks must sense safety holes in the software and activate the built-in software components, called enzymes, to perform safeguard operations in a timely manner, mitigating the safety faults using the proposed enzyme calculus. The various application hazards due to Boolean faults in the functional and behavioral model that lead to software safety issues are considered in this approach.
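    One way to picture the self-check/enzyme pairing is a monitor that probes application state and activates a registered guard routine when a check fails. This is a minimal sketch under assumed names and thresholds; it is not the paper's enzyme calculus.

    ```python
    # Hedged sketch of self-checking with "enzymes": each check probes the
    # state; when a check senses a safety hole, its paired enzyme (guard
    # routine) is activated. All names and thresholds are hypothetical.

    class SafetyMonitor:
        def __init__(self):
            self.checks = []   # list of (check_fn, enzyme_fn) pairs

        def register(self, check, enzyme):
            self.checks.append((check, enzyme))

        def run(self, state):
            actions = []
            for check, enzyme in self.checks:
                if not check(state):               # safety hole sensed
                    actions.append(enzyme(state))  # safeguard operation
            return actions

    monitor = SafetyMonitor()
    monitor.register(
        check=lambda s: s["temperature"] < 100,
        enzyme=lambda s: "shutdown_heater",
    )
    monitor.register(
        check=lambda s: s["pressure"] < 5.0,
        enzyme=lambda s: "open_relief_valve",
    )

    actions = monitor.run({"temperature": 120, "pressure": 3.0})
    print(actions)  # → ['shutdown_heater']
    ```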

  8. Application of Design Patterns in Refactoring Software Design

    NASA Technical Reports Server (NTRS)

    Baggs, Rhoda; Shaykhian, Gholam Ali

    2007-01-01

    Refactoring software design is a method of changing a software design while explicitly preserving its design functionalities. The presented approach utilizes design patterns as the basis for refactoring software design. A comparison of design solutions is made through C++ programming language examples to illustrate this approach. Developing reusable components is discussed, and the paper shows that the construction of such components can diminish the added burden of both refactoring and the use of design patterns.
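    As a minimal illustration of pattern-based refactoring (sketched in Python rather than the paper's C++), a conditional can be refactored into a Strategy-style table of reusable components while the original behavior is explicitly preserved:

    ```python
    # Before: behavior buried in a conditional.
    def area_before(shape, w, h):
        if shape == "rect":
            return w * h
        elif shape == "triangle":
            return w * h / 2

    # After: each behavior is a reusable strategy component (Strategy pattern).
    STRATEGIES = {
        "rect": lambda w, h: w * h,
        "triangle": lambda w, h: w * h / 2,
    }

    def area_after(shape, w, h):
        return STRATEGIES[shape](w, h)

    # The refactoring preserves functionality:
    assert area_before("triangle", 4, 3) == area_after("triangle", 4, 3) == 6.0
    ```

    New shapes are then added by registering a strategy rather than editing the conditional, which is the reuse benefit the paper argues for.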

  9. Cryo-focused-ion-beam applications in structural biology.

    PubMed

    Rigort, Alexander; Plitzko, Jürgen M

    2015-09-01

    The ability to precisely control the preparation of biological samples for investigations by electron cryo-microscopy is becoming increasingly important for ultrastructural imaging in biology. Precision machining instruments such as the focused ion beam microscope (FIB) were originally developed for applications in materials science. However, today we witness a growing use of these tools in the life sciences mainly due to their versatility, since they can be used both as manipulation and as imaging devices, when complemented with a scanning electron microscope (SEM). The advent of cryo-preparation equipment and accessories made it possible to pursue work on frozen-hydrated biological specimens with these dual-beam (FIB/SEM) instruments. In structural biology, the cryo-FIB can be used to site-specifically thin vitrified specimens for transmission electron microscopy (TEM) and tomography. Having control over the specimen thickness is a decisive factor for TEM imaging, as the thickness of the object under scrutiny determines the attainable resolution. Besides its use for TEM preparation, the FIB/SEM microscope can be additionally used to obtain three-dimensional volumetric data from biological specimens. The unique combination of an imaging and precision manipulation tool allows sequentially removing material with the ion beam and imaging the milled block faces by scanning with the electron beam, an approach known as FIB/SEM tomography. This review covers both fields of cryo-FIB applications: specimen preparation for TEM cryo-tomography and volume imaging by cryo-FIB/SEM tomography. PMID:25703192

  10. Current and Perspective Applications of Dense Plasma Focus Devices

    SciTech Connect

    Gribkov, V. A.

    2008-04-07

    Applications of Dense Plasma Focus (DPF) devices are described, both those intended to support the mainstream large-scale nuclear fusion programs (NFP), in fundamental problems of dense magnetized plasma physics as well as in engineering issues, and those elaborated for immediate use in a number of other fields. In the first direction, such problems as self-generated magnetic fields and the implosion stability of plasma shells having a high aspect ratio are important for Inertial Confinement Fusion (ICF) programs (e.g., NIF), whereas problems of the current disruption phenomenon, plasma turbulence, and the mechanisms of generation of fast particles and neutrons in magnetized plasmas are of great interest for the large devices of Magnetic Plasma Confinement (MPC) (e.g., ITER). Among the engineering problems of NFP, it is shown that radiation materials science in particular has in the DPF a very efficient tool for radiation tests of prospective materials and for improvement of their characteristics. In the field of broad current applications, results obtained in radiation materials science, radiobiology, nuclear medicine, express neutron activation analysis (including single-shot interrogation of hidden illegal objects), dynamic non-destructive quality control, X-ray microlithography and micromachining, and micro-radiography are presented. As examples of potential future applications, it is proposed to use the DPF as a powerful high-flux neutron source generating very powerful nanosecond (ns) pulses of neutrons for innovative experiments in nuclear physics, for radiation treatment of malignant tumors, and for neutron tests of materials for first walls, blankets, and NFP device construction (with fluences up to 1 dpa per year), as well as a source of ns pulses of fast electrons, neutrons, and hard X-rays for brachytherapy.

  11. Cloud based, Open Source Software Application for Mitigating Herbicide Drift

    NASA Astrophysics Data System (ADS)

    Saraswat, D.; Scott, B.

    2014-12-01

    The spread of herbicide-resistant weeds has resulted in the need for clearly marked fields. In response to this need, the University of Arkansas Cooperative Extension Service launched a program named Flag the Technology in 2011. This program uses color-coded flags as a visual alert of the herbicide trait technology within a farm field. The flag-based program also serves to help avoid herbicide misapplication and prevent herbicide drift damage between fields with differing crop technologies. This program has been endorsed by the Southern Weed Science Society of America and is attracting interest from across the USA, Canada, and Australia. However, flags risk being misplaced or lost to mischief or to severe windstorms and thunderstorms. This presentation will discuss the design and development of a free, cloud-based application built on open-source technologies, called Flag the Technology Cloud (FTTCloud), that allows agricultural stakeholders to color code their farm fields to indicate herbicide-resistant technologies. The software uses modern web development practices, widely used design technologies, and basic geographic information system (GIS) based interactive interfaces for representing, color-coding, searching, and visualizing fields. The program is also compatible with devices of different sizes: smartphones, tablets, desktops, and laptops.
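    The color-coding idea can be sketched as a simple trait-to-color lookup attached to GeoJSON-like field polygons. The trait names and colors below are hypothetical placeholders, not FTTCloud's actual scheme.

    ```python
    # Minimal sketch of color-coded fields: each field polygon carries a
    # herbicide-trait attribute mapped to a flag color (mapping hypothetical).

    FLAG_COLORS = {            # hypothetical trait -> flag color table
        "glyphosate_tolerant": "white",
        "glufosinate_tolerant": "teal",
        "conventional": "red",
    }

    def field_feature(name, trait, ring):
        """Build a GeoJSON-like feature colored by the field's trait."""
        return {
            "type": "Feature",
            "geometry": {"type": "Polygon", "coordinates": [ring]},
            "properties": {"name": name, "trait": trait,
                           "flag_color": FLAG_COLORS[trait]},
        }

    f = field_feature("North 40", "conventional",
                      [[-92.0, 34.0], [-92.0, 34.01], [-91.99, 34.01],
                       [-91.99, 34.0], [-92.0, 34.0]])
    print(f["properties"]["flag_color"])  # → red
    ```

    A GIS front end would render each polygon in its `flag_color`, which is the digital analogue of planting a physical flag at the field edge.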

  12. Web Services Provide Access to SCEC Scientific Research Application Software

    NASA Astrophysics Data System (ADS)

    Gupta, N.; Gupta, V.; Okaya, D.; Kamb, L.; Maechling, P.

    2003-12-01

    Web services offer scientific communities a new paradigm for sharing research codes and communicating results. While there are formal technical definitions of what constitutes a web service, for a user community such as the Southern California Earthquake Center (SCEC), we may conceptually consider a web service to be functionality provided on-demand by an application which is run on a remote computer located elsewhere on the Internet. The value of a web service is that it can (1) run a scientific code without the user needing to install and learn the intricacies of running the code; (2) provide the technical framework which allows a user's computer to talk to the remote computer which performs the service; (3) provide the computational resources to run the code; and (4) bundle several analysis steps and provide the end results in digital or (post-processed) graphical form. Within an NSF-sponsored ITR project coordinated by SCEC, we are constructing web services using architectural protocols and programming languages (e.g., Java). However, because the SCEC community has a rich pool of scientific research software (written in traditional languages such as C and FORTRAN), we also emphasize making existing scientific codes available by constructing web service frameworks which wrap around and directly run these codes. In doing so we attempt to broaden community usage of these codes. Web service wrapping of a scientific code can be done using a "web servlet" construction or by using a SOAP/WSDL-based framework. This latter approach is widely adopted in IT circles although it is subject to rapid evolution. Our wrapping framework attempts to "honor" the original codes with as little modification as is possible. For versatility we identify three methods of user access: (A) a web-based GUI (written in HTML and/or Java applets); (B) a Linux/OSX/UNIX command line "initiator" utility (shell-scriptable); and (C) direct access from within any Java application (and with the
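    The "wrap, don't modify" pattern described above can be sketched as a thin handler that marshals a structured request into the legacy code's command-line arguments and parses its text output. In this sketch the legacy C/FORTRAN program is stood in for by a plain function, and all names and formats are hypothetical.

    ```python
    # Conceptual sketch of wrapping a legacy scientific code as a service:
    # the wrapper marshals the request, runs the code, and parses the output,
    # leaving the original code untouched. Names are hypothetical.

    def legacy_code(argv):
        # Stand-in for an unmodified executable: reads args, prints text.
        lat, lon = float(argv[0]), float(argv[1])
        return f"RESULT {lat + lon:.2f}\n"

    def service_handler(request):
        # Marshal a structured request into argv, run the code, parse output.
        argv = [str(request["lat"]), str(request["lon"])]
        raw = legacy_code(argv)
        value = float(raw.split()[1])
        return {"status": "ok", "value": value}

    response = service_handler({"lat": 34.0, "lon": -118.25})
    print(response)  # → {'status': 'ok', 'value': -84.25}
    ```

    A servlet or SOAP/WSDL layer of the kind the abstract mentions would sit in front of `service_handler`, exposing the same marshal/run/parse cycle over the network.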

  13. Does Screencast Teaching Software Application Needs Narration for Effective Learning?

    ERIC Educational Resources Information Center

    Mohamad Ali, Ahmad Zamzuri; Samsudin, Khairulanuar; Hassan, Mohamad; Sidek, Salman Firdaus

    2011-01-01

    The aim of this study was to investigate the effects of screencast with narration and without narration in enhancing learning performance. A series of screencast teaching Flash animation software was developed using screen capture software for the purpose of this research. The screencast series were uploaded to specialized channels created in…

  14. Transfer of Learning: The Effects of Different Instruction Methods on Software Application Learning

    ERIC Educational Resources Information Center

    Larson, Mark E.

    2010-01-01

    Human Resource Departments (HRD), especially instructors, are challenged to keep pace with rapidly changing computer software applications and technology. The problem under investigation was whether, after instruction in a software application, a particular method of instruction was a predictor of transfer of learning when other risk factors were…

  15. Editorial: Focus on Atom Optics and its Applications

    NASA Astrophysics Data System (ADS)

    Schmidt-Kaler, F.; Pfau, T.; Schmelcher, P.; Schleich, W.

    2010-06-01

    Atom optics employs the modern techniques of quantum optics and laser cooling to enable applications which often outperform current standard technologies. Atomic matter wave interferometers allow for ultra-precise sensors; metrology and clocks are pushed to an extraordinary accuracy of 17 digits using single atoms. Miniaturization and integration are driven forward for both atomic clocks and atom optical circuits. With the miniaturization of information-storage and -processing devices, the scale of single atoms is approached in solid state devices, where the laws of quantum physics lead to novel, advantageous features and functionalities. An emerging branch of atom optics is the control of single atoms, potentially allowing solid state devices to be built atom by atom, some of which would be applicable in future quantum information processing devices. Selective manipulation of individual atoms also enables trace analysis of extremely rare isotopes. Additionally, sources of neutral atoms with high brightness are being developed and, if combined with photo ionization, even novel focused ion beam sources are within reach. Ultracold chemistry is fertilized by atomic techniques: reactions between ions, atoms, and molecules, trapped or aligned in designed fields and cooled to ultra-low temperatures, can be investigated in a completely state-resolved manner.
Focus on Atom Optics and its Applications Contents Sensitive gravity-gradiometry with atom interferometry: progress towards an improved determination of the gravitational constant F Sorrentino, Y-H Lien, G Rosi, L Cacciapuoti, M Prevedelli and G M Tino A single-atom detector integrated on an atom chip: fabrication, characterization and application D Heine, W Rohringer, D Fischer, M Wilzbach, T Raub, S Loziczky, XiYuan Liu, S Groth, B Hessmo and J Schmiedmayer Interaction of a propagating guided matter wave with a localized potential G L Gattobigio, A

  16. Evaluation of Distribution Analysis Software for DER Applications

    SciTech Connect

    Staunton, RH

    2003-01-23

    The term "distributed energy resources" or DER refers to a variety of compact, mostly self-contained power-generating technologies that can be combined with energy management and storage systems and used to improve the operation of the electricity distribution system, whether or not those technologies are connected to an electricity grid. Implementing DER can be as simple as installing a small electric generator to provide backup power at an electricity consumer's site. Or it can be a more complex system, highly integrated with the electricity grid and consisting of electricity generation, energy storage, and power management systems. DER devices provide opportunities for greater local control of electricity delivery and consumption. They also enable more efficient utilization of waste heat in combined cooling, heating and power (CHP) applications--boosting efficiency and lowering emissions. CHP systems can provide electricity, heat and hot water for industrial processes, space heating and cooling, refrigeration, and humidity control to improve indoor air quality. DER technologies are playing an increasingly important role in the nation's energy portfolio. They can be used to meet base load power, peaking power, backup power, remote power, and power quality needs, as well as cooling and heating needs. DER systems, ranging in size and capacity from a few kilowatts up to 50 MW, can include a number of technologies (e.g., supply-side and demand-side) that can be located at or near the location where the energy is used. Information pertaining to DER technologies, application solutions, successful installations, etc., can be found at the U.S. Department of Energy's DER Internet site [1]. Market forces in the restructured electricity markets are making DER both more common and more active in the distribution systems throughout the US [2]. If DER devices can be made even more competitive with central generation sources, this trend will become unstoppable. In response, energy

  17. Laboratory Connections. Commercial Interfacing Packages: Part II: Software and Applications.

    ERIC Educational Resources Information Center

    Powers, Michael H.

    1989-01-01

    Describes the titration of a weak base with a strong acid and subsequent treatment of experimental data using commercially available software. Provides a BASIC program for determining first and second derivatives of data input. Lists 6 references. (YP)
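    The derivative treatment described (originally a BASIC program) can be sketched with central differences, whose peak in the first derivative locates the titration equivalence point. The code below is a Python sketch with invented data, not the article's listing.

    ```python
    # Central-difference first and second derivatives of titration data.
    # The pH values are invented for illustration.

    def first_derivative(x, y):
        return [(y[i+1] - y[i-1]) / (x[i+1] - x[i-1])
                for i in range(1, len(x) - 1)]

    def second_derivative(x, y):
        h = x[1] - x[0]  # assumes evenly spaced volume increments
        return [(y[i+1] - 2*y[i] + y[i-1]) / h**2
                for i in range(1, len(x) - 1)]

    volume = [0.0, 1.0, 2.0, 3.0, 4.0]     # titrant added, mL
    pH = [3.0, 3.2, 5.0, 8.8, 9.0]         # steep rise near equivalence

    d1 = first_derivative(volume, pH)
    print([round(v, 2) for v in d1])  # → [1.0, 2.8, 2.0]
    ```

    The maximum of `d1` (here at the middle point) marks the equivalence volume; the sign change in the second derivative refines it.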

  18. The software application and classification algorithms for welds radiograms analysis

    NASA Astrophysics Data System (ADS)

    Sikora, R.; Chady, T.; Baniukiewicz, P.; Grzywacz, B.; Lopato, P.; Misztal, L.; Napierała, L.; Piekarczyk, B.; Pietrusewicz, T.; Psuj, G.

    2013-01-01

    The paper presents a software implementation of an Intelligent System for Radiogram Analysis (ISAR). The system supports radiologists in weld quality inspection. The image-processing part of the software, with a graphical user interface, and the weld-classification part are described together with selected classification results. Classification was based on several algorithms: an artificial neural network, k-means clustering, a simplified k-means, and rough set theory.
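    As a sketch of one of the classifiers listed, a simplified one-dimensional k-means can cluster weld-defect features into groups. The feature values and the 1-D simplification are illustrative assumptions, not the ISAR implementation.

    ```python
    # Simplified 1-D k-means: alternate assignment and update steps.
    # Feature values (e.g., indication darkness) are invented.

    def kmeans_1d(values, centers, iterations=10):
        clusters = [[] for _ in centers]
        for _ in range(iterations):
            # Assignment step: each value joins its nearest center's cluster.
            clusters = [[] for _ in centers]
            for v in values:
                idx = min(range(len(centers)),
                          key=lambda i: abs(v - centers[i]))
                clusters[idx].append(v)
            # Update step: move each center to its cluster mean.
            centers = [sum(c) / len(c) if c else centers[i]
                       for i, c in enumerate(clusters)]
        return centers, clusters

    features = [0.1, 0.15, 0.2, 0.8, 0.85, 0.9]   # two defect populations
    centers, clusters = kmeans_1d(features, centers=[0.0, 1.0])
    print([round(c, 3) for c in centers])  # → [0.15, 0.85]
    ```

    The full system classifies multi-dimensional image features, but the alternating assign/update structure is the same.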

  20. Problem-Solving Software

    NASA Technical Reports Server (NTRS)

    1992-01-01

    CBR Express software solves problems by adapting stored solutions to new problems specified by a user. It is applicable to a wide range of situations. The technology was originally developed by Inference Corporation for Johnson Space Center's Advanced Software Development Workstation. The project focused on the reuse of software designs, and Inference used CBR as part of the ACCESS prototype software. The commercial CBR Express is used as a "help desk" for customer support, enabling reuse of existing information when necessary. It has been adopted by several companies, among them American Airlines, which uses it to solve reservation system software problems.
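    The case-based reasoning (CBR) idea behind such help-desk tools can be sketched as nearest-case retrieval: find the stored case most similar to a new problem description and reuse its solution. The cases and the Jaccard similarity measure below are hypothetical, not CBR Express's internals.

    ```python
    # Toy case-based retrieval: each case pairs problem keywords with a
    # stored solution. Cases and similarity measure are hypothetical.

    CASES = [
        ({"reservation", "timeout", "login"}, "restart session broker"),
        ({"printer", "jam"}, "clear tray 2"),
        ({"reservation", "duplicate", "booking"}, "purge cache record"),
    ]

    def retrieve(problem_terms):
        """Return the solution of the case with highest Jaccard similarity."""
        def jaccard(a, b):
            return len(a & b) / len(a | b)
        best = max(CASES, key=lambda case: jaccard(problem_terms, case[0]))
        return best[1]

    print(retrieve({"reservation", "timeout"}))  # → restart session broker
    ```

    A full CBR system also adapts the retrieved solution to the new situation and stores the outcome as a new case, which is how the knowledge base grows with use.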

  1. Software tool for data mining and its applications

    NASA Astrophysics Data System (ADS)

    Yang, Jie; Ye, Chenzhou; Chen, Nianyi

    2002-03-01

    A software tool for data mining is introduced, which integrates pattern recognition (PCA, Fisher, clustering, hyperenvelop, regression), artificial intelligence (knowledge representation, decision trees), statistical learning (rough set, support vector machine), and computational intelligence (neural network, genetic algorithm, fuzzy systems). It consists of nine function models: pattern recognition, decision trees, association rule, fuzzy rule, neural network, genetic algorithm, hyperenvelop, support vector machine, and visualization. The principle and knowledge representation of some function models of data mining are described. The software tool is realized in Visual C++ under Windows 2000. Nonmonotonicity in data mining is dealt with by concept hierarchy and layered mining. The software tool has been satisfactorily applied in predicting regularities of the formation of ternary intermetallic compounds in alloy systems and in diagnosing brain glioma.

  2. Progress in Addressing DNFSB Recommendation 2002-1 Issues: Improving Accident Analysis Software Applications

    SciTech Connect

    VINCENT, ANDREW

    2005-04-25

    Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2002-1 (''Quality Assurance for Safety-Related Software'') identified a number of quality assurance issues on the use of software in Department of Energy (DOE) facilities for analyzing hazards, and designing and operating controls to prevent or mitigate potential accidents. Over the last year, DOE has begun several processes and programs as part of the Implementation Plan commitments, and in particular, has made significant progress in addressing several sets of issues particularly important in the application of software for performing hazard and accident analysis. The work discussed here demonstrates that through these actions, Software Quality Assurance (SQA) guidance and software tools are available that can be used to improve resulting safety analysis. Specifically, five of the primary actions corresponding to the commitments made in the Implementation Plan to Recommendation 2002-1 are identified and discussed in this paper. Included are the web-based DOE SQA Knowledge Portal and the Central Registry, guidance and gap analysis reports, electronic bulletin board and discussion forum, and a DOE safety software guide. These SQA products can benefit DOE safety contractors in the development of hazard and accident analysis by precluding inappropriate software applications and utilizing best practices when incorporating software results to safety basis documentation. The improvement actions discussed here mark a beginning to establishing stronger, standard-compliant programs, practices, and processes in SQA among safety software users, managers, and reviewers throughout the DOE Complex. Additional effort is needed, however, particularly in: (1) processes to add new software applications to the DOE Safety Software Toolbox; (2) improving the effectiveness of software issue communication; and (3) promoting a safety software quality assurance culture.

  3. Toric focusing for radiation force applications using a toric lens coupled to a spherically focused transducer.

    PubMed

    Arnal, Bastien; Nguyen, Thu-Mai; O'Donnell, Matthew

    2014-12-01

    Dynamic elastography using radiation force requires that an ultrasound field be focused during hundreds of microseconds at a pressure of several megapascals. Here, we address the importance of the focal geometry. Although there is usually no control of the elevational focal width in generating a tissue mechanical response, we propose a tunable approach to adapt the focus geometry that can significantly improve radiation force efficiency. Several thin, in-house-made polydimethylsiloxane lenses were designed to modify the focal spot of a spherical transducer. They exhibited low absorption and the focal spot widths were extended up to 8-fold in the elevation direction. Radiation force experiments demonstrated an 8-fold increase in tissue displacements using the same pressure level in a tissue-mimicking phantom with a similar shear wave spectrum, meaning it does not affect elastography resolution. Our results demonstrate that larger tissue responses can be obtained for a given pressure level, or that similar response can be reached at a much lower mechanical index (MI). We envision that this work will impact 3-D elastography using 2-D phased arrays, where such shaping can be achieved electronically with the potential for adaptive optimization. PMID:25474778

  4. Application of software technology to a future spacecraft computer design

    NASA Technical Reports Server (NTRS)

    Labaugh, R. J.

    1980-01-01

    A study was conducted to determine how major improvements in spacecraft computer systems can be obtained from recent advances in hardware and software technology. Investigations into integrated circuit technology indicated that the CMOS/SOS chip set being developed for the Air Force Avionics Laboratory at Wright Patterson had the best potential for improving the performance of spaceborne computer systems. An integral part of the chip set is the bit slice arithmetic and logic unit. The flexibility allowed by microprogramming, combined with the software investigations, led to the specification of a baseline architecture and instruction set.

  5. Experiments in software reliability - Life-critical applications

    NASA Technical Reports Server (NTRS)

    Dunham, J. R.

    1986-01-01

    The paper discusses four reliability data-gathering experiments which were conducted using a small sample of programs for two problems having ultrareliability requirements, n-version programming for fault detection, and repetitive-run modeling for failure- and fault-rate estimation. The experimental results agree with those of Nagel and Skrivan in that the program error rates suggest an approximate log-linear pattern and the individual faults occurred with significantly different error rates. Additional analysis of the experimental data raises new questions concerning the phenomenon of interacting faults. This phenomenon may provide one explanation for software reliability decay. The fourth experiment underscored the difficulty of distinguishing between observations of deficiencies in the design of the algorithm and observations of software faults in real-time process control software. These experiments are part of a program of serial experiments being pursued by the System Validation Methods of NASA Langley Research Center to find a means of credibly performing reliability evaluations of flight control software.
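    The log-linear pattern reported above can be illustrated by fitting log(error rate) against fault index with least squares: a roughly constant negative slope means each successive fault fails at a geometrically smaller rate. The rates below are invented for illustration, not the experiment's data.

    ```python
    # Least-squares fit of log(lambda_i) ≈ a + b*i on hypothetical per-fault
    # error rates; a negative slope b exhibits the log-linear decay pattern.
    import math

    rates = [0.2, 0.05, 0.012, 0.003]        # error rate of fault i, invented
    logs = [math.log(r) for r in rates]
    xs = list(range(len(logs)))

    xbar = sum(xs) / len(xs)
    ybar = sum(logs) / len(logs)
    b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, logs)) / \
        sum((x - xbar) ** 2 for x in xs)
    a = ybar - b * xbar

    print(round(a, 2), round(b, 2))   # b < 0: rates decay log-linearly
    ```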

  6. A focusing reflectarray and its application in microwave virus sanitizer

    NASA Astrophysics Data System (ADS)

    Hung, Wan-Ting; Tung, Jen-Jung; Chen, Shih-Yuan

    2014-10-01

    In this paper, a focusing reflectarray based on the conductor-backed strip dipole unit cell is proposed and designed for use in the microwave virus sanitizer. Unlike traditional far-field antennas that form a planar phase front in a specified far-field direction, the focusing reflectarray is designed to coherently add the fields radiated from the feeding antenna at a predetermined focal point, typically within its radiating near-field region and to ensure adequate power density to inactivate the H3N2 virus sample. Furthermore, the focusing reflectarray has a simple and planar structure compared with conventional focusing antennas. Since the microwave resonant absorption frequency of the H3N2 virus is at about 8 GHz, an 8 × 8 focusing reflectarray is designed for operation at 8 GHz. A prototype antenna is then fabricated and used for H3N2 virus sanitization. It is demonstrated experimentally that the death rate of the H3N2 virus sample is up to 93%, verifying the feasibility of the microwave virus sanitizer as well as the proposed focusing reflectarray.
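    The near-field focusing condition can be sketched numerically: each reflectarray element adds a compensating phase so that the total feed-to-element-to-focus path is equal modulo a wavelength for all elements, making the fields add coherently at the focal point. The 8 GHz frequency comes from the abstract; the feed and focus geometry below is a hypothetical example, not the paper's design.

    ```python
    # Compensating phase for a focusing reflectarray element (geometry
    # hypothetical; only the 8 GHz operating frequency is from the abstract).
    import math

    C = 3e8                      # speed of light, m/s
    F = 8e9                      # operating frequency, Hz
    LAM = C / F                  # wavelength = 37.5 mm
    K = 2 * math.pi / LAM        # free-space wavenumber

    feed = (0.0, 0.0, 0.3)       # feed position, m (assumed)
    focus = (0.0, 0.0, 0.15)     # desired focal point, m (assumed)

    def element_phase(x, y):
        """Compensating phase (radians in [0, 2*pi)) for an element at (x, y, 0)."""
        d_feed = math.dist((x, y, 0.0), feed)
        d_focus = math.dist((x, y, 0.0), focus)
        return (K * (d_feed + d_focus)) % (2 * math.pi)

    # With this compensation the electrical path is equal (mod 2*pi) for every
    # element, so the radiated fields arrive in phase at the focal point.
    p_center = element_phase(0.0, 0.0)
    p_edge = element_phase(0.06, 0.06)
    print(round(math.degrees(p_center), 1), round(math.degrees(p_edge), 1))
    ```

    In the paper each unit cell realizes its required phase through the dimensions of the conductor-backed strip dipole; the formula above only fixes what that phase must be.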

  7. Disperse—a software system for design of selector probes for exon resequencing applications

    PubMed Central

    Stenberg, J.; Zhang, M.; Ji, H.

    2009-01-01

    Summary: Selector probes enable the amplification of many selected regions of the genome in multiplex. Disperse is a software pipeline that automates the procedure of designing selector probes for exon resequencing applications. Availability: Software and documentation are available at http://bioinformatics.org/disperse Contact: genomics_ji@stanford.edu PMID:19158162

  8. Application of software quality assurance to a specific scientific code development task

    SciTech Connect

    Dronkers, J.J.

    1986-03-01

    This paper describes an application of software quality assurance to a specific scientific code development program. The software quality assurance program consists of three major components: administrative control, configuration management, and user documentation. The program attempts to be consistent with existing local traditions of scientific code development while at the same time providing a controlled process of development.

  9. A Cooperative Application to Improve the Educational Software Design Using Re-usable Processes

    NASA Astrophysics Data System (ADS)

    Garcia, I.; Pacheco, C.; Garcia, W.

    In the last few years, Educational Software has developed enormously, but a large part of it has been badly organized and poorly documented. Recent advances in software technology can promote cooperative learning, a teaching strategy in which small teams, each composed of students of different ability levels, use different learning activities to improve their understanding of a subject. How can we design Educational Software if we never learnt how to do it? This paper describes how the Technological University of the Mixtec Region is using a cooperative application to improve the quality of education offered to its students in Educational Software design.

  10. ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (DEC VAX VERSION)

    NASA Technical Reports Server (NTRS)

    Junkin, B. G.

    1994-01-01

    The Science and Technology Laboratory Applications Software (ELAS) was originally designed to analyze and process digital imagery data, specifically remotely-sensed scanner data. This capability includes the processing of Landsat multispectral data; aircraft-acquired scanner data; digitized topographic data; and numerous other ancillary data, such as soil types and rainfall information, that can be stored in digitized form. ELAS has the subsequent capability to geographically reference this data to dozens of standard, as well as user created projections. As an integrated image processing system, ELAS offers the user of remotely-sensed data a wide range of capabilities in the areas of land cover analysis and general purpose image analysis. ELAS is designed for flexible use and operation and includes its own FORTRAN operating subsystem and an expandable set of FORTRAN application modules. Because all of ELAS resides in one "logical" FORTRAN program, data inputs and outputs, directives, and module switching are convenient for the user. There are over 230 modules presently available to aid the user in performing a wide range of land cover analyses and manipulation. The file management modules enable the user to allocate, define, access, and specify usage for all types of files (ELAS files, subfiles, external files etc.). Various other modules convert specific types of satellite, aircraft, and vector-polygon data into files that can be used by other ELAS modules. The user also has many module options which aid in displaying image data, such as magnification/reduction of the display; true color display; and several memory functions. Additional modules allow for the building and manipulation of polygonal areas of the image data. Finally, there are modules which allow the user to select and classify the image data. An important feature of the ELAS subsystem is that its structure allows new applications modules to be easily integrated in the future. 
ELAS has as a standard

  11. ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (SUN VERSION)

    NASA Technical Reports Server (NTRS)

    Walters, D.

    1994-01-01

    The Science and Technology Laboratory Applications Software (ELAS) was originally designed to analyze and process digital imagery data, specifically remotely-sensed scanner data. This capability includes the processing of Landsat multispectral data; aircraft-acquired scanner data; digitized topographic data; and numerous other ancillary data, such as soil types and rainfall information, that can be stored in digitized form. ELAS has the subsequent capability to geographically reference this data to dozens of standard, as well as user created projections. As an integrated image processing system, ELAS offers the user of remotely-sensed data a wide range of capabilities in the areas of land cover analysis and general purpose image analysis. ELAS is designed for flexible use and operation and includes its own FORTRAN operating subsystem and an expandable set of FORTRAN application modules. Because all of ELAS resides in one "logical" FORTRAN program, data inputs and outputs, directives, and module switching are convenient for the user. There are over 230 modules presently available to aid the user in performing a wide range of land cover analyses and manipulation. The file management modules enable the user to allocate, define, access, and specify usage for all types of files (ELAS files, subfiles, external files etc.). Various other modules convert specific types of satellite, aircraft, and vector-polygon data into files that can be used by other ELAS modules. The user also has many module options which aid in displaying image data, such as magnification/reduction of the display; true color display; and several memory functions. Additional modules allow for the building and manipulation of polygonal areas of the image data. Finally, there are modules which allow the user to select and classify the image data. An important feature of the ELAS subsystem is that its structure allows new applications modules to be easily integrated in the future. 
ELAS has as a standard

  12. ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (CONCURRENT VERSION)

    NASA Technical Reports Server (NTRS)

    Pearson, R. W.

    1994-01-01

    The Science and Technology Laboratory Applications Software (ELAS) was originally designed to analyze and process digital imagery data, specifically remotely-sensed scanner data. This capability includes the processing of Landsat multispectral data; aircraft-acquired scanner data; digitized topographic data; and numerous other ancillary data, such as soil types and rainfall information, that can be stored in digitized form. ELAS has the subsequent capability to geographically reference this data to dozens of standard, as well as user created projections. As an integrated image processing system, ELAS offers the user of remotely-sensed data a wide range of capabilities in the areas of land cover analysis and general purpose image analysis. ELAS is designed for flexible use and operation and includes its own FORTRAN operating subsystem and an expandable set of FORTRAN application modules. Because all of ELAS resides in one "logical" FORTRAN program, data inputs and outputs, directives, and module switching are convenient for the user. There are over 230 modules presently available to aid the user in performing a wide range of land cover analyses and manipulation. The file management modules enable the user to allocate, define, access, and specify usage for all types of files (ELAS files, subfiles, external files etc.). Various other modules convert specific types of satellite, aircraft, and vector-polygon data into files that can be used by other ELAS modules. The user also has many module options which aid in displaying image data, such as magnification/reduction of the display; true color display; and several memory functions. Additional modules allow for the building and manipulation of polygonal areas of the image data. Finally, there are modules which allow the user to select and classify the image data. An important feature of the ELAS subsystem is that its structure allows new applications modules to be easily integrated in the future. 
ELAS has as a standard

  13. ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (MASSCOMP VERSION)

    NASA Technical Reports Server (NTRS)

    Walters, D.

    1994-01-01

    The Science and Technology Laboratory Applications Software (ELAS) was originally designed to analyze and process digital imagery data, specifically remotely-sensed scanner data. This capability includes the processing of Landsat multispectral data; aircraft-acquired scanner data; digitized topographic data; and numerous other ancillary data, such as soil types and rainfall information, that can be stored in digitized form. ELAS has the subsequent capability to geographically reference this data to dozens of standard, as well as user created projections. As an integrated image processing system, ELAS offers the user of remotely-sensed data a wide range of capabilities in the areas of land cover analysis and general purpose image analysis. ELAS is designed for flexible use and operation and includes its own FORTRAN operating subsystem and an expandable set of FORTRAN application modules. Because all of ELAS resides in one "logical" FORTRAN program, data inputs and outputs, directives, and module switching are convenient for the user. There are over 230 modules presently available to aid the user in performing a wide range of land cover analyses and manipulation. The file management modules enable the user to allocate, define, access, and specify usage for all types of files (ELAS files, subfiles, external files etc.). Various other modules convert specific types of satellite, aircraft, and vector-polygon data into files that can be used by other ELAS modules. The user also has many module options which aid in displaying image data, such as magnification/reduction of the display; true color display; and several memory functions. Additional modules allow for the building and manipulation of polygonal areas of the image data. Finally, there are modules which allow the user to select and classify the image data. An important feature of the ELAS subsystem is that its structure allows new applications modules to be easily integrated in the future. 
ELAS has as a standard

  14. ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (SILICON GRAPHICS VERSION)

    NASA Technical Reports Server (NTRS)

    Walters, D.

    1994-01-01

    The Science and Technology Laboratory Applications Software (ELAS) was originally designed to analyze and process digital imagery data, specifically remotely-sensed scanner data. This capability includes the processing of Landsat multispectral data; aircraft-acquired scanner data; digitized topographic data; and numerous other ancillary data, such as soil types and rainfall information, that can be stored in digitized form. ELAS has the subsequent capability to geographically reference this data to dozens of standard, as well as user created projections. As an integrated image processing system, ELAS offers the user of remotely-sensed data a wide range of capabilities in the areas of land cover analysis and general purpose image analysis. ELAS is designed for flexible use and operation and includes its own FORTRAN operating subsystem and an expandable set of FORTRAN application modules. Because all of ELAS resides in one "logical" FORTRAN program, data inputs and outputs, directives, and module switching are convenient for the user. There are over 230 modules presently available to aid the user in performing a wide range of land cover analyses and manipulation. The file management modules enable the user to allocate, define, access, and specify usage for all types of files (ELAS files, subfiles, external files etc.). Various other modules convert specific types of satellite, aircraft, and vector-polygon data into files that can be used by other ELAS modules. The user also has many module options which aid in displaying image data, such as magnification/reduction of the display; true color display; and several memory functions. Additional modules allow for the building and manipulation of polygonal areas of the image data. Finally, there are modules which allow the user to select and classify the image data. An important feature of the ELAS subsystem is that its structure allows new applications modules to be easily integrated in the future. 
ELAS has as a standard

  20. A wind tunnel application of large-field focusing schlieren

    NASA Technical Reports Server (NTRS)

    Ponton, Michael K.; Seiner, John M.; Mitchell, L. K.; Manning, James C.; Jansen, Bernard J.; Lagen, Nicholas T.

    1992-01-01

    A large-field focusing schlieren apparatus was installed in the NASA Lewis Research Center 9 by 15 foot wind tunnel in an attempt to determine the density gradient flow field of a free jet issuing from a supersonic nozzle configuration. The nozzle exit geometry was designed to reduce acoustic emissions from the jet by enhancing plume mixing. Thus, the flow exhibited a complex three-dimensional structure which warranted utilizing the sharp focusing capability of this type of schlieren method. Design considerations concerning tunnel limitations, high-speed photography, and video tape recording are presented in the paper.

  1. Generating Safety-Critical PLC Code From a High-Level Application Software Specification

    NASA Technical Reports Server (NTRS)

    2008-01-01

    The benefits of automatic-application code generation are widely accepted within the software engineering community. These benefits include raised abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at Kennedy Space Center recognized the need for PLC code generation while developing the new ground checkout and launch processing system, called the Launch Control System (LCS). Engineers developed a process and a prototype software tool that automatically translates a high-level representation or specification of application software into ladder logic that executes on a PLC. All the computer hardware in the LCS is planned to be commercial off the shelf (COTS), including industrial controllers or PLCs that are connected to the sensors and end items out in the field. Most of the software in LCS is also planned to be COTS, with only small adapter software modules that must be developed in order to interface between the various COTS software products. A domain-specific language (DSL) is a programming language designed to perform tasks and to solve problems in a particular domain, such as ground processing of launch vehicles. The LCS engineers created a DSL for developing test sequences of ground checkout and launch operations of future launch vehicle and spacecraft elements, and they are developing a tabular specification format that uses the DSL keywords and functions familiar to the ground and flight system users. The tabular specification format, or tabular spec, allows most ground and flight system users to document how the application software is intended to function and requires little or no software programming knowledge or experience. A small sample from a prototype tabular spec application is
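
The spec-to-code idea described above can be sketched in a few lines. The mini-spec, the signal names, and the generated output below are all invented for illustration; this is not NASA's actual DSL or generator, and for readability the sketch emits IEC 61131-3-style Structured Text rather than ladder diagrams.

```python
# Hypothetical sketch of tabular-spec-to-PLC-code generation, in the spirit
# of the LCS prototype described above. Every name here is invented.

SPEC = [
    # (condition signal, comparison, threshold, output coil)
    ("TANK_PRESSURE", ">=", "150.0", "VENT_VALVE_OPEN"),
    ("LOX_TEMP",      "<",  "-180.0", "CHILL_PUMP_ON"),
]

def generate_st(spec):
    """Translate tabular rows into Structured Text IF/ELSE statements."""
    lines = []
    for signal, op, threshold, coil in spec:
        lines.append(f"IF {signal} {op} {threshold} THEN")
        lines.append(f"    {coil} := TRUE;")
        lines.append("ELSE")
        lines.append(f"    {coil} := FALSE;")
        lines.append("END_IF;")
    return "\n".join(lines)

print(generate_st(SPEC))
```

Because each row of the spec maps mechanically to one control-flow block, non-programmers can extend the behavior by adding rows, which is the core benefit such a tabular format offers.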

  2. Software Agents Applications Using Real-Time CORBA

    NASA Astrophysics Data System (ADS)

    Fowell, S.; Ward, R.; Nielsen, M.

    This paper describes current projects being performed by SciSys in the area of the use of software agents, built using CORBA middleware, to improve operations within autonomous satellite/ground systems. These concepts have been developed and demonstrated in a series of experiments variously funded by ESA's Technology Flight Opportunity Initiative (TFO) and Leading Edge Technology for SMEs (LET-SME), and the British National Space Centre's (BNSC) National Technology Programme. Some of this earlier work has already been reported in [1]. This paper will address the trends, issues and solutions associated with this software agent architecture concept, together with its implementation using CORBA within an on-board environment, that is to say taking account of its real- time and resource constrained nature.

  3. Application of Lightweight Formal Methods to Software Security

    NASA Technical Reports Server (NTRS)

    Gilliam, David P.; Powell, John D.; Bishop, Matt

    2005-01-01

    Formal specification and verification of security has proven a challenging task. There is no single method that has proven feasible. Instead, an integrated approach which combines several formal techniques can increase the confidence in the verification of software security properties. Such an approach, which specifies security properties in a library that can be reused by two instruments, and the methodologies developed for the National Aeronautics and Space Administration (NASA) at the Jet Propulsion Laboratory (JPL), are described herein. The Flexible Modeling Framework (FMF) is a model-based verification instrument that uses Promela and the SPIN model checker. The Property Based Tester (PBT) uses TASPEC and a Text Execution Monitor (TEM). They are used to reduce vulnerabilities and unwanted exposures in software during the development and maintenance life cycles.

  4. Bringing Legacy Visualization Software to Modern Computing Devices via Application Streaming

    NASA Astrophysics Data System (ADS)

    Fisher, Ward

    2014-05-01

    Planning software compatibility across forthcoming generations of computing platforms is a problem commonly encountered in software engineering and development. While this problem can affect any class of software, data analysis and visualization programs are particularly vulnerable. This is due in part to their inherent dependency on specialized hardware and computing environments. A number of strategies and tools have been designed to aid software engineers with this task. While generally embraced by developers at 'traditional' software companies, these methodologies are often dismissed by the scientific software community as unwieldy, inefficient and unnecessary. As a result, many important and storied scientific software packages can struggle to adapt to a new computing environment; for example, one in which much work is carried out on sub-laptop devices (such as tablets and smartphones). Rewriting these packages for a new platform often requires significant investment in terms of development time and developer expertise. In many cases, porting older software to modern devices is neither practical nor possible. As a result, replacement software must be developed from scratch, wasting resources better spent on other projects. Enabled largely by the rapid rise and adoption of cloud computing platforms, 'Application Streaming' technologies allow legacy visualization and analysis software to be operated wholly from a client device (be it laptop, tablet or smartphone) while retaining full functionality and interactivity. It mitigates much of the developer effort required by other more traditional methods while simultaneously reducing the time it takes to bring the software to a new platform. This work will provide an overview of Application Streaming and how it compares against other technologies which allow scientific visualization software to be executed from a remote computer. We will discuss the functionality and limitations of existing application streaming

  5. Applications of multigrid software in the atmospheric sciences

    NASA Technical Reports Server (NTRS)

    Adams, J.; Garcia, R.; Gross, B.; Hack, J.; Haidvogel, D.; Pizzo, V.

    1992-01-01

    Elliptic partial differential equations from different areas in the atmospheric sciences are efficiently and easily solved utilizing the multigrid software package named MUDPACK. It is demonstrated that the multigrid method is more efficient than other commonly employed techniques, such as Gaussian elimination and fixed-grid relaxation. The efficiency relative to other techniques, both in terms of storage requirement and computational time, increases quickly with grid size.
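
The efficiency claim above can be illustrated with a minimal geometric multigrid solver for the 1-D Poisson problem -u'' = f. This is a textbook sketch, not MUDPACK itself; the grid size, smoothing parameters, and test problem are chosen only for demonstration.

```python
import numpy as np

def relax(u, f, h, sweeps, w=2.0 / 3.0):
    """Weighted-Jacobi smoothing for -u'' = f with zero boundary values."""
    for _ in range(sweeps):
        u[1:-1] = (1 - w) * u[1:-1] + w * 0.5 * (u[:-2] + u[2:] + h * h * f[1:-1])
    return u

def v_cycle(u, f, h):
    """One multigrid V-cycle on a grid with 2**k + 1 points."""
    u = relax(u, f, h, 3)                      # pre-smooth
    if u.size <= 3:
        return u                               # coarsest grid: smoothing suffices
    r = np.zeros_like(u)                       # residual r = f + u''
    r[1:-1] = f[1:-1] + (u[:-2] - 2 * u[1:-1] + u[2:]) / (h * h)
    rc = np.zeros((u.size + 1) // 2)           # restrict by full weighting
    rc[1:-1] = 0.25 * (r[1:-3:2] + 2 * r[2:-2:2] + r[3:-1:2])
    ec = v_cycle(np.zeros_like(rc), rc, 2 * h) # solve coarse error equation
    e = np.zeros_like(u)                       # prolong by linear interpolation
    e[::2] = ec
    e[1::2] = 0.5 * (ec[:-1] + ec[1:])
    u += e                                     # coarse-grid correction
    return relax(u, f, h, 3)                   # post-smooth

# Demo: solve -u'' = sin(pi x) on [0, 1]; the exact solution is sin(pi x)/pi^2.
n = 65
x = np.linspace(0.0, 1.0, n)
h = x[1] - x[0]
f = np.sin(np.pi * x)
u = np.zeros(n)
for _ in range(12):
    u = v_cycle(u, f, h)
err = np.max(np.abs(u - np.sin(np.pi * x) / np.pi ** 2))
```

Each V-cycle reduces the error by a roughly constant factor independent of grid size, which is why multigrid outpaces fixed-grid relaxation (whose convergence stalls as the grid is refined) and direct elimination (whose cost and storage grow much faster).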

  6. Software Quality Evaluation Models Applicable in Health Information and Communications Technologies. A Review of the Literature.

    PubMed

    Villamor Ordozgoiti, Alberto; Delgado Hito, Pilar; Guix Comellas, Eva María; Fernandez Sanchez, Carlos Manuel; Garcia Hernandez, Milagros; Lluch Canut, Teresa

    2016-01-01

    The use of Information and Communications Technologies in healthcare has increased the need to consider quality criteria through standardised processes. The aim of this study was to analyse the software quality evaluation models applicable to healthcare from the perspective of ICT purchasers. Through a systematic literature review with the keywords software, product, quality, evaluation and health, we selected and analysed 20 original research papers published from 2005-2016 in health science and technology databases. The results showed four main topics: non-ISO models, software quality evaluation models based on ISO/IEC standards, studies analysing software quality evaluation models, and studies analysing ISO standards for software quality evaluation. The models provide cost-efficiency criteria for specific software and improve use outcomes. The ISO/IEC 25000 standard is shown to be the most suitable for evaluating the quality of ICTs for healthcare use from the perspective of institutional acquisition. PMID:27350495

  7. The Design and Application of Target-Focused Compound Libraries

    PubMed Central

    Harris, C. John; Hill, Richard D; Sheppard, David W; Slater, Martin J; Stouten, Pieter F.W

    2011-01-01

    Target-focused compound libraries are collections of compounds which are designed to interact with an individual protein target or, frequently, a family of related targets (such as kinases, voltage-gated ion channels, serine/cysteine proteases). They are used for screening against therapeutic targets in order to find hit compounds that might be further developed into drugs. The design of such libraries generally utilizes structural information about the target or family of interest. In the absence of such structural information, a chemogenomic model that incorporates sequence and mutagenesis data to predict the properties of the binding site can be employed. A third option, usually pursued when no structural data are available, utilizes knowledge of the ligands of the target from which focused libraries can be developed via scaffold hopping. Consequently, the methods used for the design of target-focused libraries vary according to the quantity and quality of structural or ligand data that is available for each target family. This article describes examples of each of these design approaches and illustrates them with case studies, which highlight some of the issues and successes observed when screening target-focused libraries. PMID:21521154
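
The ligand-based design route mentioned above can be illustrated with a toy similarity filter. The fingerprints, compound names, and threshold below are invented for illustration; real focused-library design uses cheminformatics toolkits and far richer descriptors than these hand-made bit sets.

```python
# Toy sketch of ligand-based library focusing: rank candidates by Tanimoto
# similarity of (hypothetical) binary fingerprints to known active ligands.

def tanimoto(a, b):
    """Tanimoto similarity of two fingerprints given as sets of 'on' bits."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

def focus_library(candidates, actives, threshold=0.5):
    """Keep candidates whose best similarity to any known active passes the cut."""
    selected = []
    for name, fp in candidates.items():
        best = max(tanimoto(fp, afp) for afp in actives)
        if best >= threshold:
            selected.append(name)
    return selected

actives = [{1, 4, 9, 12}, {2, 4, 7, 12}]
candidates = {
    "cpd_A": {1, 4, 9, 13},   # shares most bits with the first active
    "cpd_B": {3, 5, 8, 10},   # unrelated scaffold, no shared bits
}
print(focus_library(candidates, actives))  # only cpd_A passes the cut
```

The same skeleton accommodates the other design routes the article describes: a structure-based score or a chemogenomic binding-site model can simply replace the similarity function used to rank candidates.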

  8. Generation of Focused Shock Waves in Water for Biomedical Applications

    NASA Astrophysics Data System (ADS)

    Lukeš, Petr; Šunka, Pavel; Hoffer, Petr; Stelmashuk, Vitaliy; Beneš, Jiří; Poučková, Pavla; Zadinová, Marie; Zeman, Jan

    The physical characteristics of focused two-successive (tandem) shock waves (FTSW) in water and their biological effects are presented. FTSW were generated by underwater multichannel electrical discharges in a highly conductive saline solution using two porous ceramic-coated cylindrical electrodes of different diameter and surface area. The primary cylindrical pressure wave generated at each composite electrode was focused by a metallic parabolic reflector to a common focal point to form two strong shock waves with a variable time delay between the waves. The pressure field and interaction between the first and the second shock waves at the focus were investigated using schlieren photography and polyvinylidene fluoride (PVDF) shock gauge sensors. The largest interaction was obtained for a time delay of 8-15 μs between the waves, producing an amplitude of the negative pressure phase of the second shock wave down to -80 MPa and a large number of cavitations at the focus. The biological effects of FTSW were demonstrated in vitro on damage to B16 melanoma cells, in vivo on targeted lesions in the thigh muscles of rabbits and on the growth delay of sarcoma tumors in Lewis rats treated in vivo by FTSW, compared to untreated controls.

  9. Applications of focused ion beam systems in gunshot residue investigation.

    PubMed

    Niewöhner, L; Wenz, H W

    1999-01-01

    Scanning ion microscopy technology has opened a new door to forensic scientists, allowing the GSR investigator to see inside a particle's core. Using a focused ion beam, particles can be cross-sectioned, revealing interior morphology and character that can be utilized for identification of the ammunition manufacturer. PMID:9987878

  10. Software Application for Supporting the Education of Database Systems

    ERIC Educational Resources Information Center

    Vágner, Anikó

    2015-01-01

    The article introduces an application which supports the education of database systems, particularly the teaching of SQL and PL/SQL in Oracle Database Management System environment. The application has two parts, one is the database schema and its content, and the other is a C# application. The schema is to administrate and store the tasks and the…

  11. Application of Domain Knowledge to Software Quality Assurance

    NASA Technical Reports Server (NTRS)

    Wild, Christian W.

    1997-01-01

    This work focused on capturing, using, and evolving a qualitative decision support structure across the life cycle of a project. The particular application of this study was towards business process reengineering and the representation of the business process in a set of Business Rules (BR). In this work, we defined a decision model which captured the qualitative decision deliberation process. It represented arguments both for and against proposed alternatives to a problem. It was felt that the subjective nature of many critical business policy decisions required a qualitative modeling approach similar to that of Lee and Mylopoulos. While previous work was limited almost exclusively to the decision capture phase, which occurs early in the project life cycle, we investigated the use of such a model during the later stages as well. One of our significant developments was the use of the decision model during the operational phase of a project. By operational phase, we mean the phase in which the system or set of policies which were earlier decided are deployed and put into practice. By making the decision model available to operational decision makers, they would have access to the arguments pro and con for a variety of actions and can thus make a more informed decision which balances the often conflicting criteria by which the value of an action is measured. We also developed the concept of a 'monitored decision' in which metrics of performance were identified during the decision making process and used to evaluate the quality of that decision. It is important to monitor those decisions which seem at highest risk of not meeting their stated objectives. Operational decisions are also potentially high risk decisions. Finally, we investigated the use of performance metrics for monitored decisions and audit logs of operational decisions in order to feed an evolutionary phase of the life cycle. During evolution, decisions are revisited, assumptions verified or refuted

  12. Cold atomic beam ion source for focused ion beam applications

    SciTech Connect

    Knuffman, B.; Steele, A. V.; McClelland, J. J.

    2013-07-28

    We report measurements and modeling of an ion source that is based on ionization of a laser-cooled atomic beam. We show a high brightness and a low energy spread, suitable for use in next-generation, high-resolution focused ion beam systems. Our measurements of total ion current as a function of ionization conditions support an analytical model that also predicts the cross-sectional current density and spatial distribution of ions created in the source. The model predicts a peak brightness of 2 × 10⁷ A m⁻² sr⁻¹ eV⁻¹ and an energy spread less than 0.34 eV. The model is also combined with Monte-Carlo simulations of the inter-ion Coulomb forces to show that the source can be operated at several picoamperes with a brightness above 1 × 10⁷ A m⁻² sr⁻¹ eV⁻¹. We estimate that when combined with a conventional ion focusing column, an ion source with these properties could focus a 1 pA beam into a spot smaller than 1 nm. A total current greater than 5 nA was measured in a lower-brightness configuration of the ion source, demonstrating the possibility of a high current mode of operation.

  13. Application of an impedance matching transformer to a plasma focus.

    PubMed

    Bures, B L; James, C; Krishnan, M; Adler, R

    2011-10-01

    A plasma focus was constructed using an impedance matching transformer to improve power transfer between the pulse power and the dynamic plasma load. The system relied on two switches and twelve transformer cores to produce a 100 kA pulse in short circuit on the secondary at 27 kV on the primary with 110 J stored. With the two transformer systems in parallel, the Thevenin equivalent circuit parameters on the secondary side of the driver are: C = 10.9 μF, V₀ = 4.5 kV, L = 17 nH, and R = 5 mΩ. An equivalent direct drive circuit would require a large number of switches in parallel, to achieve the same Thevenin equivalent. The benefits of this approach are replacement of consumable switches with non-consumable transformer cores, reduction of the driver inductance and resistance as viewed by the dynamic load, and reduction of the stored energy to produce a given peak current. The system is designed to operate at 100 Hz, so minimizing the stored energy results in less load on the thermal management system. When operated at 1 Hz, the neutron yield from the transformer matched plasma focus was similar to the neutron yield from a conventional (directly driven) plasma focus at the same peak current. PMID:22047293
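    The quoted Thevenin parameters are enough to sanity-check the stated 110 J of stored energy and the 100 kA short-circuit pulse. A minimal sketch, assuming the standard underdamped series RLC lumped model for such a driver (an assumption of this note, not a calculation from the paper):

```python
import math

# Thevenin-equivalent parameters quoted in the abstract (secondary side)
C = 10.9e-6   # capacitance, F
V0 = 4.5e3    # charge voltage, V
L = 17e-9     # inductance, H
R = 5e-3      # resistance, ohm

E = 0.5 * C * V0**2             # stored energy
Z = math.sqrt(L / C)            # surge impedance of the discharge circuit
alpha = R / (2 * L)             # damping rate of the series RLC
omega = 1.0 / math.sqrt(L * C)  # ring frequency (underdamped: alpha << omega)
t_peak = (math.pi / 2) / omega  # first current peak, about a quarter period
I_peak = (V0 / Z) * math.exp(-alpha * t_peak)

print(f"stored energy ~ {E:.0f} J")
print(f"peak current  ~ {I_peak / 1e3:.0f} kA")
```

    Both numbers land close to the abstract's figures (about 110 J and roughly 100 kA), suggesting the quoted circuit parameters are internally consistent.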

  14. Application of Regulatory Focus Theory to Search Advertising

    PubMed Central

    Mowle, Elyse N.; Georgia, Emily J.; Doss, Brian D.; Updegraff, John A.

    2015-01-01

    Purpose The purpose of this paper is to test the utility of regulatory focus theory principles in a real-world setting; specifically, Internet hosted text advertisements. Effect of compatibility of the ad text with the regulatory focus of the consumer was examined. Design/methodology/approach Advertisements were created using Google AdWords. Data were collected for the number of views and clicks each ad received. Effect of regulatory fit was measured using logistic regression. Findings Logistic regression analyses demonstrated that there was a strong main effect for keyword, such that users were almost six times as likely to click on a promotion advertisement as a prevention advertisement, as well as a main effect for compatibility, such that users were twice as likely to click on an advertisement with content that was consistent with their keyword. Finally, there was a strong interaction of these two variables, such that the effect of consistent advertisements was stronger for promotion searches than for prevention searches. Research limitations/implications The effect of ad compatibility had medium to large effect sizes, suggesting that individuals’ state may have more influence on advertising response than do individuals’ traits (e.g. personality traits). Measurement of regulatory fit was limited by the constraints of Google AdWords. Practical implications The results of this study provide a possible framework for ad creation for Internet advertisers. Originality/value This paper is the first study to demonstrate the utility of regulatory focus theory in online advertising. PMID:26430293
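    The reported effects are odds ratios from a logistic regression on clicks and views. With invented view/click counts (not the study's data), the keyword and compatibility odds ratios described above can be computed directly:

```python
# Hypothetical view/click counts for the four keyword x ad-content cells,
# invented purely to illustrate the odds-ratio arithmetic.
cells = {
    ("promotion", "consistent"):    {"views": 10_000, "clicks": 120},
    ("promotion", "inconsistent"):  {"views": 10_000, "clicks": 60},
    ("prevention", "consistent"):   {"views": 10_000, "clicks": 20},
    ("prevention", "inconsistent"): {"views": 10_000, "clicks": 10},
}

def odds(clicks, views):
    return clicks / (views - clicks)

def cell_odds(keyword, content):
    c = cells[(keyword, content)]
    return odds(c["clicks"], c["views"])

# Main effect of keyword: pooled odds of clicking promotion vs prevention ads
promo_clicks = sum(cells[k]["clicks"] for k in cells if k[0] == "promotion")
prev_clicks = sum(cells[k]["clicks"] for k in cells if k[0] == "prevention")
promo_views = sum(cells[k]["views"] for k in cells if k[0] == "promotion")
prev_views = sum(cells[k]["views"] for k in cells if k[0] == "prevention")
or_keyword = odds(promo_clicks, promo_views) / odds(prev_clicks, prev_views)

# Compatibility effect within each keyword: consistent vs inconsistent content
or_promo = cell_odds("promotion", "consistent") / cell_odds("promotion", "inconsistent")
or_prev = cell_odds("prevention", "consistent") / cell_odds("prevention", "inconsistent")

print(round(or_keyword, 1), round(or_promo, 1), round(or_prev, 1))
```

    With these made-up counts the keyword odds ratio comes out near 6 and the compatibility ratios near 2, mirroring the pattern the study reports; a full analysis would fit the logistic model with the keyword × compatibility interaction rather than compute ratios by hand.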

  15. Tightly Coupled Geodynamic Systems: Software, Implicit Solvers & Applications

    NASA Astrophysics Data System (ADS)

    May, D.; Le Pourhiet, L.; Brown, J.

    2011-12-01

    The generic term "multi-physics" is used to define physical processes which are described by a collection of partial differential equations, or "physics". Numerous processes in geodynamics fall into this category. For example, the evolution of viscous fluid flow and heat transport within the mantle (Stokes flow + energy conservation), the dynamics of melt migration (Stokes flow + Darcy flow + porosity evolution) and landscape evolution (Stokes + diffusion/advection over a surface). The development of software to numerically investigate processes that are described through the composition of different physics components are typically (a) designed for one particular set of physics and are never intended to be extended, or coupled to other processes (b) enforce that certain non-linearity's (or coupling) are explicitly removed from the system for reasons of computational efficiency, or due the lack of a robust non-linear solver (e.g. most models in the mantle convection community). We describe a software infrastructure which enables us to easily introduce new physics with minimal code modifications; tightly couple all physics without introducing splitting errors; exploit modern linear/non-linear solvers and permit the re-use of monolithic preconditioners for individual physics blocks (e.g. saddle point preconditioners for Stokes). Here we present a number of examples to illustrate the flexibility and importance of using this software infra-structure. Using the Stokes system as a prototype, we show results illustrating (i) visco-plastic shear banding experiments, (ii) how coupling Stokes flow with the evolution of the material coordinates can yield temporal stability in the free surface evolution and (iii) the discretisation error associated with decoupling Stokes equation from the heat transport equation in models of mantle convection with various rheologies.

  16. The application of formal software engineering methods to the unattended and remote monitoring software suite at Los Alamos National Laboratory

    SciTech Connect

    Determan, John Clifford; Longo, Joseph F; Michel, Kelly D

    2009-01-01

    The Unattended and Remote Monitoring (UNARM) system is a collection of specialized hardware and software used by the International Atomic Energy Agency (IAEA) to institute nuclear safeguards at many nuclear facilities around the world. The hardware consists of detectors, instruments, and networked computers for acquiring various forms of data, including but not limited to radiation data, global position coordinates, camera images, isotopic data, and operator declarations. The software provides two primary functions: the secure and reliable collection of this data from the instruments and the ability to perform an integrated review and analysis of the disparate data sources. Several years ago the team responsible for maintaining the software portion of the UNARM system began the process of formalizing its operations. These formal operations include a configuration management system, a change control board, an issue tracking system, and extensive formal testing, for both functionality and reliability. Functionality is tested with formal test cases chosen to fully represent the data types and methods of analysis that will be commonly encountered. Reliability is tested with iterative, concurrent testing where up to five analyses are executed simultaneously for thousands of cycles. Iterative concurrent testing helps ensure that there are no resource conflicts or leaks when multiple system components are in use simultaneously. The goal of this work is to provide a high quality, reliable product, commensurate with the criticality of the application. Testing results will be presented that demonstrate that this goal has been achieved and the impact of the introduction of a formal software engineering framework to the UNARM product will be presented.
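    The iterative, concurrent testing described above can be sketched generically: run several analyses at once for many cycles and compare against serial reference results to surface resource conflicts or leaks. The analysis function here is a toy stand-in; the real tests exercise UNARM's data-review components.

```python
from concurrent.futures import ThreadPoolExecutor

def analysis(seed):
    # Stand-in for one data-review analysis; deterministic so that
    # concurrent results can be checked against serial ones.
    return sum((seed * k) % 7 for k in range(1_000))

def concurrent_stress(workers=5, cycles=200):
    """Run `workers` analyses simultaneously for `cycles` iterations and
    verify each concurrent result matches its serial reference."""
    expected = [analysis(s) for s in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for _ in range(cycles):
            results = list(pool.map(analysis, range(workers)))
            if results != expected:
                return False  # a shared-resource conflict corrupted a result
    return True

print(concurrent_stress())  # True: the toy analysis shares no mutable state
```

    In a real harness the interesting failures are nondeterministic, so the cycle count is set in the thousands, as the abstract notes.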

  17. Data-Interpolating Variational Analysis (DIVA) software : recent development and application

    NASA Astrophysics Data System (ADS)

    Watelet, Sylvain; Barth, Alexander; Troupin, Charles; Ouberdous, Mohamed; Beckers, Jean-Marie

    2014-05-01

    The Data-Interpolating Variational Analysis (DIVA) software is a tool designed to reconstruct a continuous field from discrete measurements. This method is based on the numerical implementation of the Variational Inverse Model (VIM), which consists of a minimization of a cost function, allowing the choice of the analysed field that best fits the data sets. The problem is solved efficiently using a finite-element method. This statistical method is particularly suited to deal with irregularly-spaced observations, producing outputs on a regular grid. Initially created to work in a two-dimensional way, the software is now able to handle 3D or even 4D analysis, in order to easily produce ocean climatologies. These analyses can easily be improved by taking advantage of DIVA's ability to take topographic and dynamic constraints into account (coastal relief, prevailing wind impacting the advection, ...). In DIVA, we assume errors on measurements are not correlated, which means we do not consider the effect of correlated observation errors on the analysis and we therefore use a diagonal observation error covariance matrix. However, oceanographic data sets are generally clustered in space and time, thus introducing some correlation between observations. In order to determine the impact of such an approximation and provide strategies to mitigate its effects, we conducted several synthetic experiments with known correlation structure. Overall, the best results were obtained with a variant of the covariance inflation method. Finally, a new application of DIVA to satellite altimetry data will be presented: these data have particular space and time distributions, as they consist of repeated tracks (~10-35 days) of measurements with a distance lower than 10 km between two successive measurements in a given track. The tools designed to determine the analysis parameters were adapted to these specificities. Moreover, different weights were applied to measurements in order to
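    The effect of ignoring observation-error correlation, and of compensating by inflating the diagonal error covariance, can be shown on a toy scalar analysis with equicorrelated clustered observations. This is an illustration of the idea only, not DIVA's implementation; all numbers are assumptions.

```python
import numpy as np

# Scalar state with background variance B, observed by n clustered
# measurements of variance r whose errors share correlation rho.
B, r, rho, n = 1.0, 0.5, 0.8, 4
H = np.ones((n, 1))

def analysis_var(R):
    """Posterior variance  B - B H^T (H B H^T + R)^{-1} H B."""
    S = B * (H @ H.T) + R
    return float(B - B**2 * (H.T @ np.linalg.inv(S) @ H))

R_true = r * ((1 - rho) * np.eye(n) + rho * np.ones((n, n)))  # correlated errors
R_diag = r * np.eye(n)                        # diagonal R, correlation ignored
R_infl = r * (1 + (n - 1) * rho) * np.eye(n)  # inflated diagonal

pa_true = analysis_var(R_true)
pa_naive = analysis_var(R_diag)  # overconfident: clustered obs double-counted
pa_infl = analysis_var(R_infl)   # matches pa_true in this equicorrelated case
print(round(pa_naive, 3), round(pa_true, 3), round(pa_infl, 3))
```

    For identical, equicorrelated observations, inflating the diagonal by 1 + (n − 1)ρ reproduces the correlated-R analysis exactly, which is the intuition behind the covariance inflation variant mentioned above.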

  18. Focusing particle concentrator with application to ultrafine particles

    DOEpatents

    Hering, Susanne; Lewis, Gregory; Spielman, Steven R.

    2013-06-11

    Technology is presented for the high efficiency concentration of fine and ultrafine airborne particles into a small fraction of the sampled airflow by condensational enlargement, aerodynamic focusing and flow separation. A nozzle concentrator structure including an acceleration nozzle with a flow extraction structure may be coupled to a containment vessel. The containment vessel may include a water condensation growth tube to facilitate the concentration of ultrafine particles. The containment vessel may further include a separate carrier flow introduced at the center of the sampled flow, upstream of the acceleration nozzle of the nozzle concentrator to facilitate the separation of particle and vapor constituents.

  19. Application of BSTRAIN software for wind turbine blade testing

    SciTech Connect

    Musial, W D; Clark, M E; Stensland, T

    1996-07-01

    NREL currently operates the largest structural testing facility in the US for testing wind turbine blades. A data acquisition system was developed to measure blade response and monitor test status; it is called BSTRAIN (Blade Structural Test Real-time Acquisition Interface Network). Software objectives were to develop a robust, easy-to-use computer program that could automatically collect data from static and fatigue blade tests without missing any significant events or overloading the computer with excess data. The program currently accepts inputs from up to 32 channels but can be expanded to over 1000 channels. In order to reduce the large amount of data collected during long fatigue tests, options for real-time data processing were developed including peak-valley series collection, peak-valley decimation, block decimation, and continuous recording of all data. Other BSTRAIN features include automated blade stiffness checks, remote terminal access to blade test status, and automated VCR control for continuous test recording. Results from tests conducted with the software revealed areas for improvement including test accuracy, post-processing analysis, and further data reduction.
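    Peak-valley collection, one of the data-reduction options mentioned, keeps only the turning points of the load history and drops direction reversals smaller than a threshold. A minimal sketch of the idea (not BSTRAIN's code):

```python
def peak_valley(samples, threshold=0.0):
    """Reduce a sampled load history to its turning points (peaks and
    valleys), discarding direction reversals smaller than `threshold`."""
    if len(samples) < 3:
        return list(samples)
    out = [samples[0]]
    direction = 0  # +1 rising, -1 falling, 0 unknown
    for x in samples[1:]:
        if direction >= 0 and x < out[-1] - threshold:
            direction = -1   # previous point was a peak; start a valley
            out.append(x)
        elif direction <= 0 and x > out[-1] + threshold:
            direction = 1    # previous point was a valley; start a peak
            out.append(x)
        elif (direction > 0 and x > out[-1]) or (direction < 0 and x < out[-1]):
            out[-1] = x      # still heading the same way; extend the extremum
    return out

print(peak_valley([0, 1, 2, 1.5, 0.5, 2.5, 2.0]))      # [0, 2, 0.5, 2.5, 2.0]
print(peak_valley([0, 2, 1.9, 2.1, 0], threshold=0.5))  # [0, 2.1, 0]
```

    For fatigue testing this matters because cycle-counting methods (e.g. rainflow) only need the reversal sequence, so everything between turning points can be discarded in real time.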

  20. Application of Gaia Analysis Software AGIS to Nano-JASMINE

    NASA Astrophysics Data System (ADS)

    Yamada, Y.; Lammers, U.; Gouda, N.

    2011-07-01

    The core data reduction for the Nano-JASMINE mission is planned to be done with Gaia's Astrometric Global Iterative Solution (AGIS). Nano-JASMINE is an ultra small (35 kg) satellite for astrometry observations in Japan and Gaia is ESA's large (over 1000 kg) next-generation astrometry mission. The accuracy of Nano-JASMINE is about 3 mas, comparable to the Hipparcos mission, Gaia's predecessor some 20 years ago. Performing real scientific observations with such a small satellite is challenging. The collaboration for sharing software started in 2007. In addition to similar design and operating principles of the two missions, this is possible thanks to the encapsulation of all Gaia-specific aspects of AGIS in a Parameter Database. Nano-JASMINE will be the test bench for the Gaia AGIS software. We present this idea in detail and the necessary practical steps to make AGIS work with Nano-JASMINE data. We also show the key mission parameters, goals, and status of the data reduction for Nano-JASMINE.

  1. Application of parallelized software architecture to an autonomous ground vehicle

    NASA Astrophysics Data System (ADS)

    Shakya, Rahul; Wright, Adam; Shin, Young Ho; Momin, Orko; Petkovsek, Steven; Wortman, Paul; Gautam, Prasanna; Norton, Adam

    2011-01-01

    This paper presents improvements made to Q, an autonomous ground vehicle designed to participate in the Intelligent Ground Vehicle Competition (IGVC). For the 2010 IGVC, Q was upgraded with a new parallelized software architecture and a new vision processor. Improvements were made to the power system reducing the number of batteries required for operation from six to one. In previous years, a single state machine was used to execute the bulk of processing activities including sensor interfacing, data processing, path planning, navigation algorithms and motor control. This inefficient approach led to poor software performance and made it difficult to maintain or modify. For IGVC 2010, the team implemented a modular parallel architecture using the National Instruments (NI) LabVIEW programming language. The new architecture divides all the necessary tasks - motor control, navigation, sensor data collection, etc. into well-organized components that execute in parallel, providing considerable flexibility and facilitating efficient use of processing power. Computer vision is used to detect white lines on the ground and determine their location relative to the robot. With the new vision processor and some optimization of the image processing algorithm used last year, two frames can be acquired and processed in 70ms. With all these improvements, Q placed 2nd in the autonomous challenge.

  2. Software Defined Radio Standard Architecture and its Application to NASA Space Missions

    NASA Technical Reports Server (NTRS)

    Andro, Monty; Reinhart, Richard C.

    2006-01-01

    A software defined radio (SDR) architecture used in space-based platforms proposes to standardize certain aspects of radio development such as interface definitions, functional control and execution, and application software and firmware development. NASA has chartered a team to develop an open software defined radio hardware and software architecture to support NASA missions and determine the viability of an Agency-wide Standard. A draft concept of the proposed standard has been released and discussed among organizations in the SDR community. Appropriate leveraging of the JTRS SCA, OMG's SWRadio Architecture and other aspects are considered. A standard radio architecture offers potential value by employing common waveform software instantiation, operation, testing and software maintenance. While software defined radios offer greater flexibility, they also pose challenges to radio development for the space environment in terms of size, mass, power consumption, and available technology. An SDR architecture for space must recognize and address the constraints of space flight hardware and systems, along with flight heritage and culture. NASA is actively participating in the development of technology and standards related to software defined radios. As NASA considers a standard radio architecture for space communications, input and coordination from government agencies, the industry, academia, and standards bodies is key to a successful architecture. The unique aspects of space require thorough investigation of relevant terrestrial technologies properly adapted to space. The talk will describe NASA's current effort to investigate SDR applications to space missions and a brief overview of a candidate architecture under consideration for space based platforms.

  3. Automated Translation of Safety Critical Application Software Specifications into PLC Ladder Logic

    NASA Technical Reports Server (NTRS)

    Leucht, Kurt W.; Semmel, Glenn S.

    2008-01-01

    The numerous benefits of automatic application code generation are widely accepted within the software engineering community. A few of these benefits include raising the abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at the NASA Kennedy Space Center (KSC) recognized the need for PLC code generation while developing their new ground checkout and launch processing system. They developed a process and a prototype software tool that automatically translates a high-level representation or specification of safety critical application software into ladder logic that executes on a PLC. This process and tool are expected to increase the reliability of the PLC code over that which is written manually, and may even lower life-cycle costs and shorten the development schedule of the new control system at KSC. This paper examines the problem domain and discusses the process and software tool that were prototyped by the KSC software engineers.
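    A drastically simplified flavour of such translation: turning a one-line boolean specification into IEC 61131-3 instruction-list mnemonics, a textual sibling of ladder logic. The spec grammar and signal names are invented for illustration; the KSC tool handles far richer safety-critical specifications.

```python
def to_instruction_list(spec_line):
    """Translate a spec such as 'VALVE = PRESS_OK AND NOT ESTOP' into
    IEC 61131-3 instruction-list mnemonics (LD/AND/OR; N suffix = negated)."""
    target, expr = (s.strip() for s in spec_line.split("="))
    program, op, negate = [], "LD", False
    for tok in expr.split():
        if tok == "AND":
            op = "AND"
        elif tok == "OR":
            op = "OR"
        elif tok == "NOT":
            negate = True
        else:  # an input contact name
            program.append(f"{op}N {tok}" if negate else f"{op} {tok}")
            negate = False
    program.append(f"ST {target}")  # store the rung result to the output coil
    return program

for line in to_instruction_list("VALVE = PRESS_OK AND NOT ESTOP"):
    print(line)
```

    Each mnemonic corresponds to one contact or coil on a ladder rung, which is what makes this kind of mechanical translation from a high-level specification attractive for reliability: the generator, not a human, enforces the mapping.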

  4. Foundry Technologies Focused on Environmental and Ecological Applications

    NASA Astrophysics Data System (ADS)

    Roizin, Ya.; Lisiansky, M.; Pikhay, E.

    Solutions allowing fabrication of remote control systems with integrated sensors (motes) were introduced as a part of CMOS foundry production platform and verified on silicon. The integrated features include sensors employing principles previously verified in the development of ultra-low power consuming non-volatile memories (C-Flash, MRAM) and components allowing low-power energy harvesting (low voltage rectifiers, high -voltage solar cells). The developed systems are discussed with emphasis on their environmental and security applications.

  5. An experimental investigation of fault tolerant software structures in an avionics application

    NASA Technical Reports Server (NTRS)

    Caglayan, Alper K.; Eckhardt, Dave E., Jr.

    1989-01-01

    The objective of this experimental investigation is to compare the functional performance and software reliability of competing fault tolerant software structures utilizing software diversity. In this experiment, three versions of the redundancy management software for a skewed sensor array have been developed using three diverse failure detection and isolation algorithms and incorporated into various N-version, recovery block and hybrid software structures. The empirical results show that, for maximum functional performance improvement in the selected application domain, the results of diverse algorithms should be voted before being processed by multiple versions without enforced diversity. Results also suggest that when the reliability gain with an N-version structure is modest, recovery block structures are more feasible since higher reliability can be obtained using an acceptance check with a modest reliability.
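    The two structures compared above have simple generic skeletons: N-version programming votes the redundant versions' outputs, while a recovery block runs alternates until one passes an acceptance test. A sketch of both textbook forms (not the experiment's redundancy-management code):

```python
def majority_vote(results, tol=1e-6):
    """N-version style voter: return the average of a majority-agreeing
    subset of the redundant versions' outputs, or None if no majority."""
    for candidate in results:
        agreeing = [r for r in results if abs(r - candidate) <= tol]
        if 2 * len(agreeing) > len(results):
            return sum(agreeing) / len(agreeing)
    return None

def recovery_block(variants, acceptance_test):
    """Run each alternate in turn and return the first result that passes
    the acceptance check; raise if every alternate fails."""
    for variant in variants:
        result = variant()
        if acceptance_test(result):
            return result
    raise RuntimeError("all variants failed the acceptance test")

print(majority_vote([1.00, 1.00, 3.70]))                              # 1.0
print(recovery_block([lambda: -1.0, lambda: 4.0], lambda r: r >= 0))  # 4.0
```

    The abstract's finding maps onto these skeletons: voting the outputs of diverse algorithms before further processing, and preferring the acceptance-check structure when the N-version reliability gain is modest.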

  6. Office Computer Software: A Comprehensive Review of Software Programs.

    ERIC Educational Resources Information Center

    Secretary, 1992

    1992-01-01

    Describes types of software including system software, application software, spreadsheets, accounting software, graphics packages, desktop publishing software, database, desktop and personal information management software, project and records management software, groupware, and shareware. (JOW)

  7. Conceptions of Software Development by Project Managers: A Study of Managing the Outsourced Development of Software Applications for United States Federal Government Agencies

    ERIC Educational Resources Information Center

    Eisen, Daniel

    2013-01-01

    This study explores how project managers, working for private federal IT contractors, experience and understand managing the development of software applications for U.S. federal government agencies. Very little is known about how they manage their projects in this challenging environment. Software development is a complex task and only grows in…

  8. Software Attribution for Geoscience Applications in the Computational Infrastructure for Geodynamics

    NASA Astrophysics Data System (ADS)

    Hwang, L.; Dumit, J.; Fish, A.; Soito, L.; Kellogg, L. H.; Smith, M.

    2015-12-01

    Scientific software is largely developed by individual scientists and represents a significant intellectual contribution to the field. As the scientific culture and funding agencies move towards an expectation that software be open-source, there is a corresponding need for mechanisms to cite software, both to provide credit and recognition to developers, and to aid in discoverability of software and scientific reproducibility. We assess the geodynamic modeling community's current citation practices by examining more than 300 predominantly self-reported publications from the past 5 years utilizing scientific software available through the Computational Infrastructure for Geodynamics (CIG). Preliminary results indicate that authors cite and attribute software by citing (in rank order) peer-reviewed scientific publications, a user's manual, and/or a paper describing the software code. Attributions may be found directly in the text, in acknowledgements, in figure captions, or in footnotes. What is considered citable varies widely. Citations predominantly lack software version numbers or persistent identifiers to find the software package. Versioning may be implied through reference to a versioned user manual. Authors sometimes report code features used and whether they have modified the code. As an open-source community, CIG requests that researchers contribute their modifications to the repository. However, such modifications may not be contributed back to a repository code branch, decreasing the chances of discoverability and reproducibility. Survey results through CIG's Software Attribution for Geoscience Applications (SAGA) project suggest that lack of knowledge, tools, and workflows to cite codes are barriers to effectively implementing the emerging citation norms.
Generated on-demand attributions on software landing pages and a prototype extensible plug-in to automatically generate attributions in codes are the first steps towards reproducibility.

  9. User Manual for the Data-Series Interface of the Gr Application Software

    USGS Publications Warehouse

    Donovan, John M.

    2009-01-01

    This manual describes the data-series interface for the Gr Application software. Basic tasks such as plotting, editing, manipulating, and printing data series are presented. The properties of the various types of data objects and graphical objects used within the application, and the relationships between them also are presented. Descriptions of compatible data-series file formats are provided.

  10. Methodology of functionality selection for water management software and examples of its application.

    PubMed

    Vasilyev, K N

    2013-01-01

    When developing new software products and adapting existing software, project leaders have to decide which functionalities to keep, adapt or develop. They have to consider that the cost of making errors during the specification phase is extremely high. In this paper a formalised approach is proposed that considers the main criteria for selecting new software functions. The application of this approach minimises the chances of making errors in selecting the functions to apply. Based on the work on software development and support projects in the area of water resources and flood damage evaluation in economic terms at CH2M HILL (the developers of the flood modelling package ISIS), the author has defined seven criteria for selecting functions to be included in a software product. The approach is based on the evaluation of the relative significance of the functions to be included into the software product. Evaluation is achieved by considering each criterion and the weighting coefficients of each criterion in turn and applying the method of normalisation. This paper includes a description of this new approach and examples of its application in the development of new software products in the area of water resources management. PMID:24355845
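    The core of such a selection method is a weighted score over normalised criteria. A minimal sketch with invented function names, criteria, and weights; the paper's seven criteria and exact normalisation are not reproduced here.

```python
def rank_functions(scores, weights):
    """Rank candidate functions by a weighted sum of normalised criterion
    scores; each criterion is scaled so its best-scoring function gets 1.0."""
    criteria = list(weights)
    best = {c: max(s[c] for s in scores.values()) for c in criteria}
    totals = {
        f: sum(weights[c] * s[c] / best[c] for c in criteria)
        for f, s in scores.items()
    }
    return sorted(totals, key=totals.get, reverse=True)

# Invented example: two candidate functions scored on two of the criteria
scores = {
    "flood_damage_grid": {"user_demand": 8, "dev_cost_benefit": 5},
    "report_export":     {"user_demand": 6, "dev_cost_benefit": 9},
}
weights = {"user_demand": 0.7, "dev_cost_benefit": 0.3}
print(rank_functions(scores, weights))  # flood_damage_grid ranks first
```

    Normalising before weighting keeps criteria on different raw scales (say, user votes vs. estimated cost savings) from silently dominating the total.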

  11. IMAGE information monitoring and applied graphics software environment. Volume 4. Applications description

    SciTech Connect

    Hallam, J.W.; Ng, K.B.; Upham, G.L.

    1986-09-01

    The EPRI Information Monitoring and Applied Graphics Environment (IMAGE) system is designed for 'fast proto-typing' of advanced concepts for computer-aided plant operations tools. It is a flexible software system which can be used for rapidly creating, dynamically driving and evaluating advanced operator aid displays. The software is written to be both host computer and graphic device independent. This four volume report includes an Executive Overview of the IMAGE package (Volume I), followed by Software Description (Volume II), User's Guide (Volume III), and Description of Example Applications (Volume IV).

  12. A proposed acceptance process for commercial off-the-shelf (COTS) software in reactor applications

    SciTech Connect

    Preckshot, G.G.; Scott, J.A.

    1996-03-01

    This paper proposes a process for acceptance of commercial off-the-shelf (COTS) software products for use in reactor systems important to safety. An initial set of four criteria establishes COTS software product identification and its safety category. Based on safety category, three sets of additional criteria, graded in rigor, are applied to approve/disapprove the product. These criteria fall roughly into three areas: product assurance, verification of safety function and safety impact, and examination of usage experience of the COTS product in circumstances similar to the proposed application. A report addressing the testing of existing software is included as an appendix.
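The graded acceptance logic the abstract describes, where the safety category determines which criteria sets must pass, could be pictured as a lookup table. The category names and criteria labels here are illustrative assumptions, not the paper's exact taxonomy:

```python
# hypothetical graded rigor: higher safety categories require more criteria sets
CATEGORY_CRITERIA = {
    "high":   {"product_assurance", "safety_function_verification", "usage_experience"},
    "medium": {"product_assurance", "usage_experience"},
    "low":    {"product_assurance"},
}

def accept_cots(safety_category, passed_criteria):
    """Approve the COTS product only if every criteria set required
    for its safety category has been satisfied."""
    return CATEGORY_CRITERIA[safety_category] <= set(passed_criteria)

ok = accept_cots("medium", {"product_assurance", "usage_experience"})
rejected = accept_cots("high", {"product_assurance"})
```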

  13. Analysis of a hardware and software fault tolerant processor for critical applications

    NASA Technical Reports Server (NTRS)

    Dugan, Joanne B.

    1993-01-01

    Computer systems for critical applications must be designed to tolerate software faults as well as hardware faults. A unified approach to tolerating hardware and software faults is characterized by classifying faults in terms of duration (transient or permanent) rather than source (hardware or software). Errors arising from transient faults can be handled through masking or voting, but errors arising from permanent faults require system reconfiguration to bypass the failed component. Most errors which are caused by software faults can be considered transient, in that they are input-dependent. Software faults are triggered by a particular set of inputs. Quantitative dependability analysis of systems which exhibit a unified approach to fault tolerance can be performed by a hierarchical combination of fault tree and Markov models. A methodology for analyzing hardware and software fault tolerant systems is applied to the analysis of a hypothetical system, loosely based on the Fault Tolerant Parallel Processor. The models consider both transient and permanent faults, hardware and software faults, independent and related software faults, automatic recovery, and reconfiguration.
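The hierarchical combination the abstract describes, component-level failure probabilities from a Markov-style exponential model fed into system-level fault-tree gates, can be sketched as follows. The failure rates, mission time, and triplex tree shape are invented for illustration and are not taken from the FTPP analysis:

```python
import math

def unreliability(rate, t):
    """Component unreliability from a constant-rate (exponential) permanent-fault model."""
    return 1.0 - math.exp(-rate * t)

def gate_and(ps):
    """AND gate: the subsystem fails only if all inputs fail (redundant channels)."""
    out = 1.0
    for p in ps:
        out *= p
    return out

def gate_or(ps):
    """OR gate: the subsystem fails if any input fails (series components)."""
    out = 1.0
    for p in ps:
        out *= 1.0 - p
    return 1.0 - out

# hypothetical triplex processor: three redundant channels (AND gate)
# in series with a shared voter (OR gate)
t = 1000.0                         # mission time, hours
channel = unreliability(1e-4, t)   # one processing channel
voter = unreliability(1e-6, t)     # shared voting element
system = gate_or([gate_and([channel] * 3), voter])
```

The redundancy shows up directly: the triplex system's unreliability is far below that of any single channel, while the shared voter remains a series (single-point) contribution.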

  14. Design of Control Server Application Software for Neutral Beam Injection System

    NASA Astrophysics Data System (ADS)

    Shi, Qilin; Hu, Chundong; Sheng, Peng; Song, Shihua

    2012-04-01

For the remote control of a neutral beam injection (NBI) system, a software package, NBIcsw, was developed to run on the control server. It meets the requirements for data transmission and operational control between the NBI measurement and control layer (MCL) and the remote monitoring layer (RML). NBIcsw runs on a Linux system and was developed in client/server (C/S) mode with multithreading technology. Application has shown that the software operates efficiently.

  15. Focusing cross-fire applicator for ultrasonic hyperthermia of tumors.

    PubMed

    Lierke, E G; Hemsel, T

    2006-12-22

An improved concept for ultrasonic hyperthermia of tumors is presented. This concept is based on past experience from a German government-supported project, which ended in 1984. It offers a low-cost alternative to common RF and microwave methods for hyperthermia of tumors with volumes between 1 and 40 ml at treatment times between 30 and 60 min. Our new version of the system considerably improves the temperature suppression in the healthy tissue around the target area and enables adjustment of the beam width to the actual tumor size and of the field geometry to the depth and shape of the tumor. The applicator can be used for moderate hyperthermia with tissue overheating of up to 10 K or for ablation therapy with short high-temperature pulses. Its central area is free for the integration of a commercial ultrasonic diagnostic sector scanner or a Doppler flow sensor in order to support the adjustment of the transducer and to monitor the whole area during the therapy. PMID:16930663

  16. Beehive: A Software Application for Synchronous Collaborative Learning

    ERIC Educational Resources Information Center

    Turani, Aiman; Calvo, Rafael A.

    2006-01-01

    Purpose: The purpose of this paper is to describe Beehive, a new web application framework for designing and supporting synchronous collaborative learning. Design/methodology/approach: Our web engineering approach integrates educational design expertise into a technology for building tools for collaborative learning activities. Beehive simplifies…

  17. Scaling Irregular Applications through Data Aggregation and Software Multithreading

    SciTech Connect

    Morari, Alessandro; Tumeo, Antonino; Chavarría-Miranda, Daniel; Villa, Oreste; Valero, Mateo

    2014-05-30

Bioinformatics, data analytics, semantic databases, and knowledge discovery are emerging high-performance application areas that exploit dynamic, linked data structures such as graphs, unbalanced trees or unstructured grids. These data structures are usually very large, requiring significantly more memory than available on single shared-memory systems. Additionally, these data structures are difficult to partition on distributed-memory systems. They also present poor spatial and temporal locality, thus generating unpredictable memory and network accesses. The Partitioned Global Address Space (PGAS) programming model seems suitable for these applications, because it allows using a shared-memory abstraction across distributed-memory clusters. However, current PGAS languages and libraries are built to target regular remote data accesses and block transfers. Furthermore, they usually rely on the Single Program Multiple Data (SPMD) parallel control model, which is not well suited to the fine-grained, dynamic and unbalanced parallelism of irregular applications. In this paper we present GMT (Global Memory and Threading library), a custom runtime library that enables efficient execution of irregular applications on commodity clusters. GMT integrates a PGAS data substrate with simple fork/join parallelism and provides automatic load balancing on a per-node basis. It implements multi-level aggregation and lightweight multithreading to maximize memory and network bandwidth with fine-grained data accesses and to tolerate long data access latencies. A key innovation in the GMT runtime is its thread specialization (workers, helpers and communication threads) that together realize the overall functionality. We compare our approach with other PGAS models, such as UPC running on GASNet, and with hand-optimized MPI code on a set of typical large-scale irregular applications, demonstrating speedups of an order of magnitude.

  18. The Enterprise Derivative Application: Flexible Software for Optimizing Manufacturing Processes

    SciTech Connect

    Ward, Richard C; Allgood, Glenn O; Knox, John R

    2008-11-01

The Enterprise Derivative Application (EDA) implements the enterprise-derivative analysis for optimization of an industrial process (Allgood and Manges, 2001). It is a tool to help industry planners choose the most productive way of manufacturing their products while minimizing their cost. Developed in MS Access, the application allows users to input initial data ranging from raw material to variable costs and enables the tracking of specific information as material is passed from one process to another. Energy-derivative analysis is based on the calculation of sensitivity parameters. For the specific application to a steel production process these include: the cost-to-product sensitivity, the product-to-energy sensitivity, the energy-to-efficiency sensitivity, and the efficiency-to-cost sensitivity. Using the EDA, the user can display a particular sensitivity for any process, or compare all sensitivities across all processes. Although energy-derivative analysis was originally designed for use by the steel industry, it is flexible enough to be applied to many other industrial processes. Examples of processes where energy-derivative analysis would prove useful are wireless monitoring of processes in the petroleum cracking industry and wireless monitoring of motor failure for determining the optimum time to replace motor parts. One advantage of the MS Access-based application is its flexibility in defining the process flow and establishing the relationships between parent and child processes and the products resulting from a process. Due to the general design of the program, a process can be anything that occurs over time with resulting output (products). So the application can be easily modified for many different industrial and organizational environments. Another advantage is the flexibility of defining sensitivity parameters. Sensitivities can be determined between all possible variables in the process flow as a function of time. Thus the dynamic development of the
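A minimal sketch of the sensitivity-parameter idea, using relative (logarithmic) sensitivities estimated by central finite differences. The process functions below are invented power laws, not data or formulas from the EDA; note that for such a chain of relationships the end-to-end relative sensitivity is the product of the link sensitivities:

```python
def rel_sensitivity(f, x, dx=1e-6):
    """Relative sensitivity (x / f) * df/dx, estimated by central differences."""
    return (x / f(x)) * (f(x + dx) - f(x - dx)) / (2 * dx)

# hypothetical process chain: product volume -> energy use -> cost
energy = lambda product: 3.0 * product ** 1.2   # product-to-energy relationship
cost = lambda e: 0.5 * e ** 0.8                 # energy-to-cost relationship

s_energy = rel_sensitivity(energy, 10.0)                     # exponent 1.2
s_cost = rel_sensitivity(cost, energy(10.0))                 # exponent 0.8
s_chain = rel_sensitivity(lambda p: cost(energy(p)), 10.0)   # 1.2 * 0.8
```

For power-law relationships the relative sensitivity is just the exponent, which makes the chain-rule multiplication easy to verify numerically.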

  19. Generic domain models in software engineering

    NASA Technical Reports Server (NTRS)

    Maiden, Neil

    1992-01-01

    This paper outlines three research directions related to domain-specific software development: (1) reuse of generic models for domain-specific software development; (2) empirical evidence to determine these generic models, namely elicitation of mental knowledge schema possessed by expert software developers; and (3) exploitation of generic domain models to assist modelling of specific applications. It focuses on knowledge acquisition for domain-specific software development, with emphasis on tool support for the most important phases of software development.

  20. Applications of software-defined radio (SDR) technology in hospital environments.

    PubMed

    Chávez-Santiago, Raúl; Mateska, Aleksandra; Chomu, Konstantin; Gavrilovska, Liljana; Balasingham, Ilangko

    2013-01-01

    A software-defined radio (SDR) is a radio communication system where the major part of its functionality is implemented by means of software in a personal computer or embedded system. Such a design paradigm has the major advantage of producing devices that can receive and transmit widely different radio protocols based solely on the software used. This flexibility opens several application opportunities in hospital environments, where a large number of wired and wireless electronic devices must coexist in confined areas like operating rooms and intensive care units. This paper outlines some possible applications in the 2360-2500 MHz frequency band. These applications include the integration of wireless medical devices in a common communication platform for seamless interoperability, and cognitive radio (CR) for body area networks (BANs) and wireless sensor networks (WSNs) for medical environmental surveillance. The description of a proof-of-concept CR prototype is also presented. PMID:24109925

  1. Key Attributes of the SAPHIRE Risk and Reliability Analysis Software for Risk-Informed Probabilistic Applications

    SciTech Connect

    Curtis Smith; James Knudsen; Kellie Kvarfordt; Ted Wood

    2008-08-01

The Idaho National Laboratory is a primary developer of probabilistic risk and reliability analysis (PRRA) tools, dating back over 35 years. Evolving from mainframe-based software, the current state of the practice has led to the creation of the SAPHIRE software. Currently, agencies such as the Nuclear Regulatory Commission, the National Aeronautics and Space Administration, the Department of Energy, and the Department of Defense use version 7 of the SAPHIRE software for many of their risk-informed activities. In order to better understand and appreciate the power of software as part of risk-informed applications, we need to recall that our current analysis methods and solution methods build upon pioneering work done 30 to 40 years ago. We contrast this work with the current capabilities in the SAPHIRE analysis package. As part of this discussion, we provide information on both the typical features and the special analysis capabilities that are available. We also present the applications and results typically found with state-of-the-practice PRRA models. By providing both a high-level and a detailed look at the SAPHIRE software, we give a snapshot in time of the current use of software tools in a risk-informed decision arena.

  2. Engineering of Data Acquiring Mobile Software and Sustainable End-User Applications

    NASA Technical Reports Server (NTRS)

    Smith, Benton T.

    2013-01-01

The criteria by which data acquiring software and its supporting infrastructure are designed should take the following two points into account: the reusability and organization of stored online and remote data and content, and an assessment of whether abandoning a platform-optimized design in favor of a multi-platform solution significantly reduces the performance of an end-user application. Furthermore, in-house applications that control or process instrument-acquired data for end-users should be designed with a communication and control interface such that the application's modules can be reused as plug-in modular components in greater software systems. The above is applied using two loosely related projects: a mobile application, and a website containing live and simulated data. For the intelligent devices mobile application AIDM, the end-user interface has a platform- and data-type-optimized design, while the database and back-end applications store this information in an organized manner and restrict access to that data to authorized end-user application(s). Finally, the content for the website was derived from a database so that the content remains uniform across all applications accessing it. With these projects ongoing, I have concluded from my research that the methods presented are feasible for both projects, and that a multi-platform design only marginally drops the performance of the mobile application.

  3. Using WWW to Improve Software Development and Maintenance: Application of the Light System to Aleph Programs

    NASA Astrophysics Data System (ADS)

    Aimar, A.; Aimar, M.; Khodabandeh, A.; Palazzi, P.; Rousseau, B.; Ruggier, M.; Cattaneo, M.; Comas Illas, P.

Programmers who develop, use, maintain, or modify software are faced with the problem of scanning and understanding large amounts of documents, ranging from source code to requirements, analysis and design diagrams, user and reference manuals, etc. This task is non-trivial and time-consuming because of the number and size of documents, and the many implicit cross-references that they contain. In large distributed development teams, where software and related documents are produced at various sites, the problem can be even more severe. LIGHT, Life cycle Global HyperText, is an attempt to solve the problem using WWW technology. The basic idea is to make all the software documents, including code, available and cross-connected on the WWW. The first application of this concept to go into production is JULIA/LIGHT, a system to convert and publish on the WWW the software documentation of the JULIA reconstruction program of the ALEPH experiment at CERN, the European Organisation for Particle Physics, Geneva.

  4. An infrastructure for the creation of high end scientific and engineering software tools and applications

    SciTech Connect

    Drummond, L.A.; Marques, O.A.; Wilson, G.V.

    2003-04-01

This document has been prepared as a response to the High End Computing Revitalization Task Force (HECRTF) call for white papers. Our main goal is to identify the mechanisms necessary for the design and implementation of an infrastructure to support development of high-end scientific and engineering software tools and applications. This infrastructure will provide a plethora of software services to facilitate the efficient deployment of future HEC technology as well as collaborations among researchers and engineers across disciplines and institutions. In particular, we address the following points: key software technologies that must be advanced to strengthen the foundation for developing new generations of HEC systems, and a software infrastructure for minimizing "time to solution" by users of HEC systems.

  5. VISTA (Vertical Integration of Science, Technology, and Applications) user interface software study

    SciTech Connect

    Chin, G.

    1990-04-01

The Vertical Integration of Science, Technology, and Applications (VISTA) project is an initiative to employ modern information and communications technology for rapid and effective application of basic research results by end users. Developed by the Pacific Northwest Laboratory, VISTA's purpose is to develop and deploy information systems (software or software/hardware products) to broad segments of various markets. Inherent in these products would be mechanisms for accessing PNL-resident information about the problem. A goal of VISTA is to incorporate existing, commercially available user interface technology into the VISTA UIMS. Commercial systems are generally more complete, reliable, and cost-effective than software developed in-house. The objective of this report is to examine the current state of commercial user interface software and discuss the implications of selections thereof. This report begins by describing the functionality of the user interface as it applies to users and application developers. Next, a reference model is presented defining the various operational software layers of a graphical user interface. The main body follows, examining current user interface technology by sampling a number of commercial systems. Both the window system and user interface toolkit markets are surveyed. A summary of the current technology concludes this report. 15 refs., 3 figs., 1 tab.

  6. Internet-based hardware/software co-design framework for embedded 3D graphics applications

    NASA Astrophysics Data System (ADS)

    Yeh, Chi-Tsai; Wang, Chun-Hao; Huang, Ing-Jer; Wong, Weng-Fai

    2011-12-01

    Advances in technology are making it possible to run three-dimensional (3D) graphics applications on embedded and handheld devices. In this article, we propose a hardware/software co-design environment for 3D graphics application development that includes the 3D graphics software, OpenGL ES application programming interface (API), device driver, and 3D graphics hardware simulators. We developed a 3D graphics system-on-a-chip (SoC) accelerator using transaction-level modeling (TLM). This gives software designers early access to the hardware even before it is ready. On the other hand, hardware designers also stand to gain from the more complex test benches made available in the software for verification. A unique aspect of our framework is that it allows hardware and software designers from geographically dispersed areas to cooperate and work on the same framework. Designs can be entered and executed from anywhere in the world without full access to the entire framework, which may include proprietary components. This results in controlled and secure transparency and reproducibility, granting leveled access to users of various roles.

  7. Cloud computing geospatial application for water resources based on free and open source software and open standards - a prototype

    NASA Astrophysics Data System (ADS)

    Delipetrev, Blagoj

    2016-04-01

Presently, most of the existing software is desktop-based, designed to work on a single computer, which represents a major limitation in many ways, starting from limited computer processing, storage power, accessibility, availability, etc. The only feasible solution lies in the web and cloud. This abstract presents research and development of a cloud computing geospatial application for water resources based on free and open source software and open standards, using a hybrid public-private cloud deployment model running on two separate virtual machines (VMs). The first (VM1) runs on Amazon Web Services (AWS) and the second (VM2) runs on a Xen cloud platform. The presented cloud application is developed using free and open source software, open standards and prototype code. The cloud application presents a framework for developing specialized cloud geospatial applications that need only a web browser to be used. This cloud application is the ultimate collaboration geospatial platform because multiple users across the globe with an internet connection and browser can jointly model geospatial objects, enter attribute data and information, execute algorithms, and visualize results. The presented cloud application is available all the time, accessible from everywhere, scalable, works in a distributed computing environment, creates a real-time multi-user collaboration platform, uses interoperable programming language code and components, and is flexible in including additional components. The cloud geospatial application is implemented as a specialized water resources application with three web services for 1) data infrastructure (DI), 2) support for water resources modelling (WRM), 3) user management. The web services are running on two VMs that communicate over the internet, providing services to users. The application was tested on the Zletovica river basin case study with concurrent multiple users. The application is a state

  8. MAPI: a software framework for distributed biomedical applications

    PubMed Central

    2013-01-01

    Background The amount of web-based resources (databases, tools etc.) in biomedicine has increased, but the integrated usage of those resources is complex due to differences in access protocols and data formats. However, distributed data processing is becoming inevitable in several domains, in particular in biomedicine, where researchers face rapidly increasing data sizes. This big data is difficult to process locally because of the large processing, memory and storage capacity required. Results This manuscript describes a framework, called MAPI, which provides a uniform representation of resources available over the Internet, in particular for Web Services. The framework enhances their interoperability and collaborative use by enabling a uniform and remote access. The framework functionality is organized in modules that can be combined and configured in different ways to fulfil concrete development requirements. Conclusions The framework has been tested in the biomedical application domain where it has been a base for developing several clients that are able to integrate different web resources. The MAPI binaries and documentation are freely available at http://www.bitlab-es.com/mapi under the Creative Commons Attribution-No Derivative Works 2.5 Spain License. The MAPI source code is available by request (GPL v3 license). PMID:23311574

  9. A Software Package for Neural Network Applications Development

    NASA Technical Reports Server (NTRS)

    Baran, Robert H.

    1993-01-01

Original Backprop (Version 1.2) is an MS-DOS package of four stand-alone C-language programs that enable users to develop neural network solutions to a variety of practical problems. Original Backprop generates three-layer, feed-forward (series-coupled) networks which map fixed-length input vectors into fixed-length output vectors through an intermediate (hidden) layer of binary threshold units. Version 1.2 can handle up to 200 input vectors at a time, each having up to 128 real-valued components. The first subprogram, TSET, appends a number (up to 16) of classification bits to each input, thus creating a training set of input/output pairs. The second subprogram, BACKPROP, creates a trilayer network to do the prescribed mapping and modifies the weights of its connections incrementally until the training set is learned. The learning algorithm is the 'back-propagating error correction' procedure first described by F. Rosenblatt in 1961. The third subprogram, VIEWNET, lets the trained network be examined, tested, and 'pruned' (by the deletion of unnecessary hidden units). The fourth subprogram, DONET, makes a TSR routine by which the finished product of the neural net design-and-training exercise can be consulted under other MS-DOS applications.
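The three-layer back-propagation training that BACKPROP performs can be sketched as below. This is not the package's actual code: the hidden layer here uses differentiable sigmoid units instead of binary threshold units (thresholds have no gradient to back-propagate), and the four-pattern XOR training set, layer sizes, and learning rate are illustrative assumptions:

```python
import math
import random

random.seed(0)
sigmoid = lambda x: 1.0 / (1.0 + math.exp(-x))

# layer sizes: 2 inputs -> 6 hidden -> 1 output; each row carries a trailing bias weight
n_in, n_hid, n_out = 2, 6, 1
w1 = [[random.uniform(-1, 1) for _ in range(n_in + 1)] for _ in range(n_hid)]
w2 = [[random.uniform(-1, 1) for _ in range(n_hid + 1)] for _ in range(n_out)]

def forward(x):
    h = [sigmoid(sum(w * v for w, v in zip(row, x + [1.0]))) for row in w1]
    o = [sigmoid(sum(w * v for w, v in zip(row, h + [1.0]))) for row in w2]
    return h, o

def train_step(x, t, lr=0.5):
    h, o = forward(x)
    # error correction at the output; the sigmoid derivative is o * (1 - o)
    d_o = [(t[k] - o[k]) * o[k] * (1 - o[k]) for k in range(n_out)]
    # back-propagate the output error through w2 to obtain hidden-layer deltas
    d_h = [h[j] * (1 - h[j]) * sum(d_o[k] * w2[k][j] for k in range(n_out))
           for j in range(n_hid)]
    for k in range(n_out):
        for j in range(n_hid):
            w2[k][j] += lr * d_o[k] * h[j]
        w2[k][n_hid] += lr * d_o[k]          # bias weight
    for j in range(n_hid):
        for i in range(n_in):
            w1[j][i] += lr * d_h[j] * x[i]
        w1[j][n_in] += lr * d_h[j]           # bias weight
    return sum((t[k] - o[k]) ** 2 for k in range(n_out))

# a toy training set of input/output pairs (XOR)
data = [([0.0, 0.0], [0.0]), ([0.0, 1.0], [1.0]),
        ([1.0, 0.0], [1.0]), ([1.0, 1.0], [0.0])]
for _ in range(10000):
    loss = sum(train_step(x, t) for x, t in data)
```

Incrementally repeating `train_step` over the training set until the summed squared error is small is exactly the "modify weights until the training set is learned" loop the abstract describes.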

  10. Evaluation of the Trajectory Operations Applications Software Task (TOAST). Volume 2: Interview transcripts

    NASA Technical Reports Server (NTRS)

    Perkins, Sharon; Martin, Andrea; Bavinger, Bill

    1990-01-01

The Trajectory Operations Applications Software Task (TOAST) is a software development project whose purpose is to provide trajectory operation pre-mission and real-time support for the Space Shuttle. The purpose of the evaluation was to evaluate TOAST as an Application Manager: to assess current and planned capabilities, compare capabilities to commercially available off-the-shelf (COTS) software, and analyze requirements of MCC and the Flight Analysis Design System (FADS) for TOAST implementation. As a major part of the data gathering for the evaluation, interviews were conducted with NASA and contractor personnel. Real-time and flight design users, orbit navigation users, the TOAST developers, and management were interviewed. Code reviews and demonstrations were also held. Each of these interviews was videotaped and transcribed as appropriate. Transcripts were edited and are presented chronologically.

  11. Performance diagnostics software for gas turbines in pipeline and cogeneration applications. Final report, July 1985-September 1989

    SciTech Connect

    Levine, P.

    1989-12-01

    The development experience for the PEGASYS and COGENT software is presented. The PEGASYS software is applicable to two-shaft gas turbines in simple, regenerative and combined cycle systems. The COGENT software is applicable to cogeneration systems. The test results show that the software is able to define the deviations between measured and expected power and thermal efficiency. Further, the software is able to identify the components causing the performance losses. The results show that axial compressor fouling is a major cause of performance losses and that the performance can be recovered by washing. A description of an on-line version of PEGASYS is described.

  12. Development and Applications of the HYDRUS and STANMOD Software Packages, and related codes

    Technology Transfer Automated Retrieval System (TEKTRAN)

Mathematical models have become indispensable tools for studying vadose zone flow and transport processes. In this paper we review the history of development, the main processes involved, and selected applications of HYDRUS and related models and software packages developed collaboratively by sev...

  13. Software reliability: Application of a reliability model to requirements error analysis

    NASA Technical Reports Server (NTRS)

    Logan, J.

    1980-01-01

    The application of a software reliability model having a well defined correspondence of computer program properties to requirements error analysis is described. Requirements error categories which can be related to program structural elements are identified and their effect on program execution considered. The model is applied to a hypothetical B-5 requirement specification for a program module.

  14. Application of thermoluminescence for detection of cascade shower 1: Hardware and software of reader system

    NASA Technical Reports Server (NTRS)

    Akashi, M.; Kawaguchi, S.; Watanabe, Z.; Misaki, A.; Niwa, M.; Okamoto, Y.; Fujinaga, T.; Ichimura, M.; Shibata, T.; Dake, S.

    1985-01-01

A reader system is developed for the detection of cascade showers via luminescence induced by heating a sensitive material (BaSO4:Eu). The reader system is composed of the following six instruments: (1) heater, (2) light guide, (3) image intensifier, (4) CCD camera, (5) image processor, (6) microcomputer. The performance of these apparatuses and the application of software for image analysis are reported.

  15. Novice and Expert Collaboration in Educational Software Development: Evaluating Application Effectiveness

    ERIC Educational Resources Information Center

    Friedman, Rob; Saponara, Adam

    2008-01-01

    In an attempt to hone the role of learners as designers, this study investigates the effectiveness of an instructional software application resulting from a design process founded on the tenets of participatory design, informant design, and contextual inquiry, as well as a set of established design heuristics. Collaboration occurred among learning…

  16. VARK Learning Preferences and Mobile Anatomy Software Application Use in Pre-Clinical Chiropractic Students

    ERIC Educational Resources Information Center

    Meyer, Amanda J.; Stomski, Norman J.; Innes, Stanley I.; Armson, Anthony J.

    2016-01-01

    Ubiquitous smartphone ownership and reduced face-to-face teaching time may lead to students making greater use of mobile technologies in their learning. This is the first study to report on the prevalence of mobile gross anatomy software applications (apps) usage in pre-clinical chiropractic students and to ascertain if a relationship exists…

  17. A transportable executive system for use with remote sensing applications software

    NASA Technical Reports Server (NTRS)

    Van Wie, P.; Fischel, D.; Howell, D.

    1980-01-01

    This report describes a transportable software executive system under development. This executive will facilitate user access to data and programs used in remote sensing applications and will provide the necessary system control, bookkeeping, operating system interface, man-machine communications and other services needed to make the computer an effective and economical tool for the remote sensing investigator.

  18. Technology survey of computer software as applicable to the MIUS project

    NASA Technical Reports Server (NTRS)

    Fulbright, B. E.

    1975-01-01

    Existing computer software, available from either governmental or private sources, applicable to modular integrated utility system program simulation is surveyed. Several programs and subprograms are described to provide a consolidated reference, and a bibliography is included. The report covers the two broad areas of design simulation and system simulation.

  19. Expert Panel: A New Strategy for Creating a Student-Centred Learning Environment for Software Applications

    ERIC Educational Resources Information Center

    Wang, Sy-Chyi

    2011-01-01

    Education reforms from teacher-centred to student-centred courses usually come with the adoption of new teaching strategies. However, following the growing design and development of student-centred teaching and learning innovations in many fields of study, not many efforts have been found in the field of software application teaching. Therefore,…

  20. Application of FLUIDNET-G software to the preliminary thermal and hydraulic design of Hermes spaceplane

    NASA Astrophysics Data System (ADS)

    Chelotti, J. N.; Andre, S.; Maciaszek, T.; Thuaire, A.; Ferro, A.

FLUIDNET-G is a workstation-based, graphics-oriented software package that performs preliminary thermal and hydraulic design of single-phase fluid loops. The main characteristics of this tool are the user-friendliness of its graphics, the workstation concept, and its 'upstream compatibility' towards other analysis tools. An application illustrates FLUIDNET-G's ability to perform trade-off and sensitivity studies.

  1. DEMOS: state-of-the-art application software for design, evaluation, and modeling of optical systems

    NASA Astrophysics Data System (ADS)

    Gan, Michael A.; Zhdanov, Dmitriy D.; Novoselskiy, Vadim V.; Ustinov, Sergey I.; Fedorov, Alexander O.; Potyemin, Igor S.

    1992-04-01

    A new version of the DEMOS program is presented. DEMOS (design, evaluation, and modeling of optical systems) is integrated dialog software for automatic modeling to estimate and design optical systems with conventional and hologram optical elements. The theoretical principles and the current state of the primary possibilities and application principles of the DEMOS program for optical systems design and simulation on computers are discussed.

  2. Common characteristics of open source software development and applicability for drug discovery: a systematic review

    PubMed Central

    2011-01-01

Background Innovation through an open source model has proven to be successful for software development. This success has led many to speculate if open source can be applied to other industries with similar success. We attempt to provide an understanding of open source software development characteristics for researchers, business leaders and government officials who may be interested in utilizing open source innovation in other contexts and with an emphasis on drug discovery. Methods A systematic review was performed by searching relevant, multidisciplinary databases to extract empirical research regarding the common characteristics and barriers of initiating and maintaining an open source software development project. Results Common characteristics of open source software development pertinent to open source drug discovery were extracted. The characteristics were then grouped into the areas of participant attraction, management of volunteers, control mechanisms, legal framework and physical constraints. Lastly, their applicability to drug discovery was examined. Conclusions We believe that the open source model is viable for drug discovery, although it is unlikely that it will exactly follow the form used in software development. Hybrids will likely develop that suit the unique characteristics of drug discovery. We suggest potential motivations for organizations to join an open source drug discovery project. We also examine specific differences between software and medicines, specifically how the need for laboratories and physical goods will impact the model as well as the effect of patents. PMID:21955914

  3. Testing existing software for safety-related applications. Revision 7.1

    SciTech Connect

    Scott, J.A.; Lawrence, J.D.

    1995-12-01

The increasing use of commercial off-the-shelf (COTS) software products in digital safety-critical applications is raising concerns about the safety, reliability, and quality of these products. One of the factors involved in addressing these concerns is product testing. A tester's knowledge of the software product will vary, depending on the information available from the product vendor. In some cases, complete source listings, program structures, and other information from the software development may be available. In other cases, only the complete hardware/software package may exist, with the tester having no knowledge of the internal structure of the software. The type of testing that can be used will depend on the information available to the tester. This report describes six different types of testing, which differ in the information used to create the tests, the results that may be obtained, and the limitations of the test types. An Annex contains background information on types of faults encountered in testing, and a Glossary of pertinent terms is also included. This study is pertinent for safety-related software at reactors.

  4. Constructing a working taxonomy of functional Ada software components for real-time embedded system applications

    NASA Technical Reports Server (NTRS)

    Wallace, Robert

    1986-01-01

A major impediment to a systematic attack on Ada software reusability is the lack of an effective taxonomy for software component functions. The scope of all possible applications of Ada software is considered too great to allow the practical development of a working taxonomy. Instead, for the purposes herein, the scope of Ada software application is limited to device and subsystem control in real-time embedded systems. A functional approach is taken in constructing the taxonomy tree for the identified Ada domain. The use of modular software functions as a starting point fits well with the object-oriented programming philosophy of Ada. Examples of the types of functions represented within the working taxonomy are real-time kernels, interrupt service routines, synchronization and message passing, data conversion, digital filtering and signal conditioning, and device control. The constructed taxonomy is proposed as a framework from which a needs analysis can be performed to reveal voids in current Ada real-time embedded programming efforts for the Space Station.

  5. OIL—Output input language for data connectivity between geoscientific software applications

    NASA Astrophysics Data System (ADS)

    Amin Khan, Khalid; Akhter, Gulraiz; Ahmad, Zulfiqar

    2010-05-01

Geoscientific computing has become so complex that no single software application can perform all the processing steps required to obtain the desired results. Thus, for a given set of analyses, several specialized software applications are required, which must be interconnected for electronic flow of data. In this network of applications, the outputs of one application become the inputs of other applications. Each of these applications usually involves more than one data type and may have its own data formats, making it incompatible with other applications in terms of data connectivity. Consequently, several data format conversion utilities are developed in-house to provide data connectivity between applications. There is practically no end to this problem, as each time a new application is added to the system, a set of new data conversion utilities needs to be developed. This paper presents a flexible data format engine, programmable through a platform-independent, interpreted language named Output Input Language (OIL). Its unique architecture allows input and output formats to be defined independently of each other by two separate programs. Thus, read and write for each format are coded only once, and the data connectivity link between two formats is established by a combination of their read and write programs. This results in fewer programs with no redundancy and maximum reuse, enabling rapid application development and easy maintenance of data connectivity links.
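    The decoupled reader/writer architecture described in this abstract can be sketched as follows. The format names and functions are illustrative only, not OIL's actual syntax: each format's read and write logic is written once, and any conversion link is just the composition of one reader with one writer.

    ```python
    # Minimal sketch (hypothetical formats) of a decoupled format engine:
    # readers parse into a neutral in-memory form, writers serialize it,
    # and a converter is the composition of one reader with one writer.

    def read_csv(text):
        # Parse simple comma-separated records into a list of rows.
        return [line.split(",") for line in text.strip().splitlines()]

    def write_tsv(records):
        # Serialize the neutral row form as tab-separated text.
        return "\n".join("\t".join(row) for row in records)

    def make_converter(reader, writer):
        # A data connectivity link between two formats.
        return lambda text: writer(reader(text))

    csv_to_tsv = make_converter(read_csv, write_tsv)
    print(csv_to_tsv("a,b\nc,d"))
    ```

    With N formats, this design needs only N readers and N writers rather than N×N dedicated conversion utilities, which is the redundancy-avoidance argument the abstract makes.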

  6. Application of holographic interferometric studies of underwater shock-wave focusing to medicine

    NASA Astrophysics Data System (ADS)

    Takayama, Kazuyoshi; Nagoya, H.; Obara, Tetsuro; Kuwahara, M.

    1993-01-01

    Holographic interferometric flow visualization was successfully applied to underwater shock wave focusing and its application to extracorporeal shock wave lithotripsy (ESWL). Real time diffuse holograms revealed the shock wave focusing process in an ellipsoidal reflector made from PMMA and double exposure holographic interferometry also clarified quantitatively the shock focusing process. Disintegration of urinary tract stones and gallbladder stones was observed by high speed photogrammetry. Tissue damage associated with the ESWL treatment is discussed in some detail.

  7. Software Graphics Processing Unit (sGPU) for Deep Space Applications

    NASA Technical Reports Server (NTRS)

    McCabe, Mary; Salazar, George; Steele, Glen

    2015-01-01

A graphics processing capability will be required for deep space missions and must support a range of applications, from safety-critical vehicle health status to telemedicine for crew health. However, preliminary radiation testing of commercial graphics processing cards suggests they cannot operate in the deep space radiation environment. Investigation into a Software Graphics Processing Unit (sGPU) composed of commercial-equivalent radiation-hardened/tolerant single-board computers, field-programmable gate arrays, and safety-critical display software shows promising results. Preliminary performance of approximately 30 frames per second (FPS) has been achieved. Use of multi-core processors may provide a significant increase in performance.

  8. Management software for a universal device communication controller: application to monitoring and computerized infusions.

    PubMed

    Coussaert, E J; Cantraine, F R

    1996-11-01

    We designed a virtual device for a local area network observing, operating and connecting devices to a personal computer. To keep the widest field of application, we proceeded by using abstraction and specification rules of software engineering in the design and implementation of the hardware and software for the Infusion Monitor. We specially built a box of hardware to interface multiple medical instruments with different communication protocols to a PC via a single serial port. We called that box the Universal Device Communication Controller (UDCC). The use of the virtual device driver is illustrated by the Infusion Monitor implemented for the anaesthesia and intensive care workstation. PMID:9080243

  9. Option design pattern for CAE software development and its application to extension of nonlinear functions

    NASA Astrophysics Data System (ADS)

    Kanto, Y.; Kawasumi, T.

    2010-06-01

With the rapid progress of computational mechanics, CAE software such as FEM programs has acquired many functions and become more complicated. Because development never stops, every CAE program should allow for future functionality expansions. It is difficult, however, to forecast what types of expansions will be required by future research. The object-oriented approach appears to be a promising technique for developing complicated and flexible software, and the adoption of design patterns is especially well suited to this purpose. In this paper, a combination of the Decorator pattern and the Visitor pattern, called the Option pattern, is discussed, and its application to an FEM program for structural problems is demonstrated.
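    The Decorator half of the combination described here can be illustrated with a minimal sketch. The class names and the cubic correction below are invented for illustration, not taken from the paper; the point is that an optional nonlinear behaviour wraps a base element without modifying it, so further options can be layered on later.

    ```python
    # Hedged sketch of the Decorator idea behind the "Option pattern":
    # optional behaviour is added by wrapping, not by editing the base class.

    class Element:
        def stiffness(self, u):
            return 2.0 * u          # linear base behaviour

    class NonlinearOption:
        def __init__(self, inner):
            self.inner = inner      # the decorated element
        def stiffness(self, u):
            # Add an optional cubic correction on top of the base result.
            return self.inner.stiffness(u) + 0.1 * u ** 3

    elem = NonlinearOption(Element())
    print(elem.stiffness(2.0))
    ```

    Because `NonlinearOption` exposes the same interface as `Element`, decorated and undecorated elements are interchangeable wherever an element is expected, which is how the pattern keeps future extensions from forcing changes to existing code.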

  10. An interactive end-user software application for a deep-sea photographic database

    NASA Astrophysics Data System (ADS)

    Jaisankar, S.; Sharma, R.

    2005-07-01

A photographic database is created for cataloguing data from underwater deep-tow photographic surveys conducted in the Central Indian Ocean Basin. This includes digitizing, encoding and merging different types of data and information obtained from different sources of the system. Date and time are used as the key reference for merging the data from different sources. Techniques developed to encode information from photographs and to calculate areas are discussed. Interactive JAVA application software is developed to play video clips (dynamic images), display photographs (static images), plot simple graphs, search and retrieve information from the database, and save the retrieved information into separate files. The software is the first of its kind in deep-sea applications, and it also attempts to educate the user about deep-sea photography. The application software is developed by modifying established routines and by creating new routines to save the retrieved information. Information retrieval from the database, with newly developed routines to create graphs and display images, is among the outputs of the application. Finally, further modifications needed for the application are briefly outlined.

  11. An application of machine learning to the organization of institutional software repositories

    NASA Technical Reports Server (NTRS)

    Bailin, Sidney; Henderson, Scott; Truszkowski, Walt

    1993-01-01

    Software reuse has become a major goal in the development of space systems, as a recent NASA-wide workshop on the subject made clear. The Data Systems Technology Division of Goddard Space Flight Center has been working on tools and techniques for promoting reuse, in particular in the development of satellite ground support software. One of these tools is the Experiment in Libraries via Incremental Schemata and Cobweb (ElvisC). ElvisC applies machine learning to the problem of organizing a reusable software component library for efficient and reliable retrieval. In this paper we describe the background factors that have motivated this work, present the design of the system, and evaluate the results of its application.

  12. Applications of Formal Methods to Specification and Safety of Avionics Software

    NASA Technical Reports Server (NTRS)

    Hoover, D. N.; Guaspari, David; Humenn, Polar

    1996-01-01

This report treats several topics in applications of formal methods to avionics software development. Most of these topics concern decision tables, an orderly, easy-to-understand format for formally specifying complex choices among alternative courses of action. The topics relating to decision tables include: generalizations of decision tables that are more concise and support the use of decision tables in a refinement-based formal software development process; a formalism for systems of decision tables with behaviors; an exposition of Parnas tables for users of decision tables; and test coverage criteria for decision tables. We outline features of a revised version of ORA's decision table tool, Tablewise, which will support many of the new ideas described in this report. We also survey formal safety analysis of specifications and software.
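    The condition-to-action mapping that makes decision tables easy to check exhaustively can be sketched as follows. The avionics conditions and actions below are invented for illustration and are not from the report:

    ```python
    # Hedged sketch of a decision table: every combination of condition
    # values maps to exactly one action, so completeness is checkable.

    RULES = {
        # (altitude_low, gear_down): action
        (True,  True):  "continue_approach",
        (True,  False): "warn_gear_up",
        (False, True):  "retract_gear",
        (False, False): "cruise",
    }

    def decide(altitude_low, gear_down):
        return RULES[(altitude_low, gear_down)]

    print(decide(True, False))
    ```

    Because the table enumerates all condition combinations, a reviewer (or a tool like the Tablewise tool mentioned above) can mechanically verify that no case is missing and no two rules conflict, which is the property that makes the format attractive for safety analysis.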

  13. Using Solution-Focused Applications for Transitional Coping of Workplace Survivors

    ERIC Educational Resources Information Center

    Germain, Marie-Line; Palamara, Sherry A.

    2007-01-01

    Solution-focused applications are proposed to assist survivor employees to return to workplace homeostasis after co-workers voluntarily or involuntarily leave the organization. A model for transitional coping is presented as well as a potential case study illustrating the application of the model. Implications for the theory, practice, and…

  14. Next generation of decision making software for nanopatterns characterization: application to semiconductor industry

    NASA Astrophysics Data System (ADS)

    Dervilllé, A.; Labrosse, A.; Zimmermann, Y.; Foucher, J.; Gronheid, R.; Boeckx, C.; Singh, A.; Leray, P.; Halder, S.

    2016-03-01

The dimensional scaling in IC manufacturing strongly drives the demands on CD and defect metrology techniques and their measurement uncertainties. Defect review has become as important as CD metrology, and together they create a new metrology paradigm with a completely new need for flexible, robust and scalable metrology software. Current software architectures and metrology algorithms are performant, but they must be pushed to a higher level in order to follow roadmap speed and requirements: for example, managing defects and CD in a one-step algorithm, customizing algorithms and output features for each R&D team environment, and providing software updates every day or every week so that R&D teams can easily explore various development strategies. The final goal is to avoid spending hours and days manually tuning algorithms to analyze metrology data and to allow R&D teams to stay focused on their expertise. The benefits are drastic cost reduction, more efficient R&D teams and better process quality. In this paper, we propose a new generation of software platform and development infrastructure which can integrate specific metrology business modules. For example, we show the integration of a chemistry module dedicated to electronics materials such as Direct Self Assembly features. We show a new generation of image analysis algorithms which are able to manage at the same time defect rates, image classification, CD and roughness measurements with high-throughput performance in order to be compatible with HVM. In a second part, we assess the reliability, the customization of algorithms and the software platform capabilities against new semiconductor metrology software requirements: flexibility, robustness, high throughput and scalability. Finally, we demonstrate how such an environment has allowed a drastic reduction of data analysis cycle time.

  15. Designing Tracking Software for Image-Guided Surgery Applications: IGSTK Experience

    PubMed Central

    Enquobahrie, Andinet; Gobbi, David; Turek, Matt; Cheng, Patrick; Yaniv, Ziv; Lindseth, Frank; Cleary, Kevin

    2009-01-01

    Objective Many image-guided surgery applications require tracking devices as part of their core functionality. The Image-Guided Surgery Toolkit (IGSTK) was designed and developed to interface tracking devices with software applications incorporating medical images. Methods IGSTK was designed as an open source C++ library that provides the basic components needed for fast prototyping and development of image-guided surgery applications. This library follows a component-based architecture with several components designed for specific sets of image-guided surgery functions. At the core of the toolkit is the tracker component that handles communication between a control computer and navigation device to gather pose measurements of surgical instruments present in the surgical scene. The representations of the tracked instruments are superimposed on anatomical images to provide visual feedback to the clinician during surgical procedures. Results The initial version of the IGSTK toolkit has been released in the public domain and several trackers are supported. The toolkit and related information are available at www.igstk.org. Conclusion With the increased popularity of minimally invasive procedures in health care, several tracking devices have been developed for medical applications. Designing and implementing high-quality and safe software to handle these different types of trackers in a common framework is a challenging task. It requires establishing key software design principles that emphasize abstraction, extensibility, reusability, fault-tolerance, and portability. IGSTK is an open source library that satisfies these needs for the image-guided surgery community. PMID:20037671
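    The abstraction and extensibility principles behind the tracker component can be sketched as follows. This is not the actual IGSTK C++ API, just an illustration of hiding device-specific communication behind a common interface so the application treats every navigation device the same way:

    ```python
    # Hedged sketch (hypothetical classes, not IGSTK's API) of a common
    # tracker interface: each device subclass handles its own protocol,
    # while the application only sees open() and get_pose().

    from abc import ABC, abstractmethod

    class Tracker(ABC):
        @abstractmethod
        def open(self):
            """Establish communication with the device."""

        @abstractmethod
        def get_pose(self, tool_id):
            """Return the position/orientation of a tracked tool."""

    class SimulatedTracker(Tracker):
        # Stand-in device, useful for testing without hardware.
        def open(self):
            self.connected = True
        def get_pose(self, tool_id):
            return {"tool": tool_id, "xyz": (0.0, 0.0, 0.0)}

    t = SimulatedTracker()
    t.open()
    print(t.get_pose("probe"))
    ```

    A simulated tracker like the one above also supports the fault-tolerance goal mentioned in the abstract: application logic can be exercised and tested safely before any real device is connected.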

  16. Predictable component-based software design of real-time MPEG-4 video applications

    NASA Astrophysics Data System (ADS)

    Bondarev, Egor; Pastrnak, Milan; de With, Peter H. N.; Chaudron, Michel R. V.

    2005-07-01

    Component-based software development is very attractive, because it allows a clear decomposition of logical processing blocks into software blocks and it offers wide reuse. The strong real-time requirements of media processing systems should be validated as soon as possible to avoid costly system redesign. This can be achieved by prediction of timing and performance properties. In this paper, we propose a scenario simulation design approach featuring early performance prediction of a component-based software system. We validated this approach through a case study, for which we developed an advanced MPEG-4 coding application. The benefits of the approach are threefold: (a) high accuracy of the predicted performance data; (b) it delivers an efficient real-time software-hardware implementation, because the generic computational costs become known in advance, and (c) improved ease of use because of a high abstraction level of modelling. Experiments showed that the prediction accuracy of the system performance is about 90% or higher, while the prediction accuracy of the time-detailed processor usage (performance) does not get lower than 70%. However, the real-time performance requirements are sometimes not met, e.g. when other applications require intensive memory usage, thereby imposing delays on the retrieval from memory of the decoder data.

  17. Applications of on-product diffraction-based focus metrology in logic high volume manufacturing

    NASA Astrophysics Data System (ADS)

    Noyes, Ben F.; Mokaberi, Babak; Bolton, David; Li, Chen; Palande, Ashwin; Park, Kevin; Noot, Marc; Kea, Marc

    2016-03-01

The integration of on-product diffraction-based focus (DBF) capability into the majority of immersion lithography layers in leading edge logic manufacturing has enabled new applications targeted towards improving cycle time and yield. A CD-based detection method is the process of record (POR) for excursion detection. The drawback of this method is increased cycle time and limited sampling due to CD-SEM metrology capacity constraints. The DBF-based method allows the addition of focus metrology samples to the existing overlay measurements on the integrated metrology (IM) system. The result enables the addition of measured focus to the SPC system, allowing a faster excursion detection method. For focus targeting, the current method involves using a dedicated focus-exposure matrix (FEM) on all scanners, resulting in lengthy analysis times and uncertainty in the best focus. The DBF method allows the measurement to occur on the IM system, on a regular production wafer, and at the same time as the exposure. This results in a cycle time gain as well as a less subjective determination of best focus. A third application aims to use the novel on-product focus metrology data in order to apply per-exposure focus corrections to the scanner. These corrections are particularly effective at the edge of the wafer, where systematic layer-dependent effects can be removed using DBF-based scanner feedback. This paper will discuss the development of a methodology to accomplish each of these applications in a high-volume production environment. The new focus metrology method, sampling schemes, feedback mechanisms and analysis methods lead to improved focus control, as well as earlier detection of failures.

  18. Scatterometry-based dose and focus decorrelation: applications to 28nm contact holes patterning intrafield focus investigations

    NASA Astrophysics Data System (ADS)

    Orlando, B.; Spaziani, N.; Socquet, N.; Bouyssou, R.; Gatefait, M.; Goirand, P. J.

    2013-04-01

We introduced a simple method based on scatterometry measurements performed on a dense contact-hole matrix to investigate intrafield focus deviation on 28nm FDSOI real production wafers at the contact-hole patterning lithography operation. A complex three-dimensional scatterometry model was built with all patterned resist geometrical parameters left as degrees of freedom. Simple linear relationships between the patterned resist geometrical parameters on the one hand, and the applied dose and focus offsets on the other hand, were then used to determine a focus and dose decorrelation model. This model was then used to investigate the effect of the ASML AGILETM scanner option on intrafield focus deviation. A significant 16% intrafield focus standard deviation improvement was found with AGILETM, which validated our method and showed the potential of the AGILETM option for intrafield focus control. This focus investigation method may be used to improve advanced CMOS manufacturing process control.

  19. Towards the Goal of Modular Climate Data Services: An Overview of NCPP Applications and Software

    NASA Astrophysics Data System (ADS)

    Koziol, B. W.; Cinquini, L.; Treshansky, A.; Murphy, S.; DeLuca, C.

    2013-12-01

In August 2013, the National Climate Predictions and Projections Platform (NCPP) organized a workshop focusing on the quantitative evaluation of downscaled climate data products (QED-2013). The QED-2013 workshop focused on real-world application problems drawn from several sectors (e.g. hydrology, ecology, environmental health, agriculture), and required that downscaled data products be dynamically accessed, generated, manipulated, annotated, and evaluated. The cyberinfrastructure elements that were integrated to support the workshop included (1) a wiki-based project hosting environment (Earth System CoG) with an interface to data services provided by an Earth System Grid Federation (ESGF) data node; (2) metadata tools provided by the Earth System Documentation (ES-DOC) collaboration; and (3) a Python-based library, OpenClimateGIS (OCGIS), for subsetting and converting NetCDF-based climate data to GIS and tabular formats. Collectively, this toolset represents a first deployment of a 'ClimateTranslator' that enables users to access, interpret, and apply climate information at local and regional scales. This presentation will provide an overview of the components above, how they were used in the workshop, and a discussion of current and potential integration. The long-term strategy for this software stack is to offer the suite of services described on a customizable, per-project basis. Additional detail on the three components is below. (1) Earth System CoG is a web-based collaboration environment that integrates data discovery and access services with tools for supporting governance and the organization of information. QED-2013 utilized these capabilities to share with workshop participants a suite of downscaled datasets, associated images derived from those datasets, and metadata files describing the downscaling techniques involved. The collaboration side of CoG was used for workshop organization, discussion, and results. (2) The ES-DOC Questionnaire

  20. Software Process Improvement Initiatives Based on Quality Assurance Strategies: A QATAM Pilot Application

    NASA Astrophysics Data System (ADS)

    Winkler, Dietmar; Elberzhager, Frank; Biffl, Stefan; Eschbach, Robert

Quality Assurance (QA) strategies, i.e., bundles of verification and validation approaches embedded within a balanced software process, can support project and quality managers in systematically planning and implementing improvement initiatives. New and modified processes and methods come up frequently that seem promising candidates for improvement. Nevertheless, the impact of processes and methods strongly depends on individual project contexts. A major challenge is how to systematically select and implement "best practices" for product construction, verification, and validation. In this paper we present the Quality Assurance Tradeoff Analysis Method (QATAM), which supports engineers in (a) systematically identifying candidate QA strategies and (b) evaluating QA strategy variants in a given project context. We evaluate feasibility and usefulness in a pilot application in a medium-size software engineering organization. The main results were that QATAM was considered useful for identifying and evaluating various improvement initiatives, applicable for large organizations as well as for small and medium enterprises.

  1. An application of software design and documentation language. [Galileo spacecraft command and data subsystem

    NASA Technical Reports Server (NTRS)

    Callender, E. D.; Clarkson, T. B.; Frasier, C. E.

    1980-01-01

The software design and documentation language (SDDL) is a general-purpose processor supporting a language for the description of any system, structure, concept, or procedure that may be presented from the viewpoint of a collection of hierarchical entities linked together by means of binary connections. The language comprises a set of rules of syntax, primitive construct classes (module, block, and module invocation), and language control directives. The result is a language with a fixed grammar, variable alphabet and punctuation, and an extendable vocabulary. The application of SDDL to the detailed software design of the Command and Data Subsystem for the Galileo spacecraft is discussed. A set of constructs was developed and applied; these constructs are evaluated and examples of their application are considered.

  2. Software Use Control

    SciTech Connect

    Trussell, F.G.

    1994-03-01

    The topic of this technical presentation is Use Control Software. The nuclear weapon software design community is being subjected to many surety forces that are stretching the envelope of their designs. Given that software is a critical part of the use control system design, we must work to limit the errors of the software development process. The objective of this paper is to discuss a methodology that the author, as a member of the Security and Use Control Assessment Department, is working on. This is the first introduction of the proposed methodology. Software that is a part of any use control system, subsystem, device, or component is critical to the operation of that apparatus. The software is expected to meet the criteria of modern software quality. In a use control application, meeting the normal quality standards is short of the expectations in meeting the use control obligations. The NWC community expects the use control features of a nuclear weapon to provide assurance that the weapon is protected from unauthorized nuclear detonation. The methodology that the author is proposing will provide a focused scrutiny to software that is used in the hardware of use control systems, subsystems, devices, and components. The methodology proposes further scrutiny of the structure of the software, memory, variables, storage, and control features.

  3. Solar thermal power systems point-focusing thermal and electric applications projects. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Marriott, A.

    1980-01-01

The activities of the Point-Focusing Thermal and Electric Applications (PFTEA) Project for fiscal year 1979 are summarized. The main thrust of the PFTEA Project, the small community solar thermal power experiment, was completed. Concept definition studies included a small central receiver approach, a point-focusing distributed receiver system with central power generation, and a point-focusing distributed receiver concept with distributed power generation. The first experiment in the Isolated Application Series was initiated. Planning for the third engineering experiment series, which addresses the industrial market sector, was also initiated. In addition to the experiment-related activities, several contracts to industry were let and studies were conducted to explore the market potential for point-focusing distributed receiver (PFDR) systems. System analysis studies were completed that compared PFDR technology to other small power system technology candidates for the utility market sector.

  4. Transcranial MR-Guided Focused Ultrasound: A Review of the Technology and Neuro Applications

    PubMed Central

    Ghanouni, Pejman; Pauly, Kim Butts; Elias, W. Jeff; Henderson, Jaimie; Sheehan, Jason; Monteith, Stephen; Wintermark, Max

    2015-01-01

    MR guided focused ultrasound is a new, minimally invasive method of targeted tissue thermal ablation that may be of use to treat central neuropathic pain, essential tremor, Parkinson tremor, and brain tumors. The system has also been used to temporarily disrupt the blood-brain barrier to allow targeted drug delivery to brain tumors. This article reviews the physical principles of MR guided focused ultrasound and discusses current and potential applications of this exciting technology. PMID:26102394

  5. [Artificial neural network parameters optimization software and its application in the design of sustained release tablets].

    PubMed

    Zhang, Xing-Yi; Chen, Da-Wei; Jin, Jie; Lu, Wei

    2009-10-01

Artificial neural network (ANN) is a multi-objective optimization method that requires mathematical and statistical knowledge, which restricts its application in the pharmaceutical research area. An artificial neural network parameters optimization software (ANNPOS), programmed in the Visual Basic language, was developed to overcome this shortcoming. In the design of a sustained-release formulation, the suitable parameters of the ANN were estimated by the ANNPOS. Then the Matlab 5.0 Neural Network Toolbox was used to determine the optimal formulation. The results showed that the ANNPOS reduces the complexity and difficulty of applying an ANN. PMID:20055142
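    The kind of parameter search such a tool automates can be sketched as a simple grid search. The objective function below is a toy stand-in, since the abstract does not describe ANNPOS's internals; in practice it would train a network and return a validation error:

    ```python
    # Hedged sketch of hyperparameter selection: evaluate each candidate
    # combination of network settings and keep the one with lowest error.

    import itertools

    def evaluate(n_hidden, lr):
        # Placeholder objective standing in for actual ANN training;
        # smaller is better, with a minimum at n_hidden=8, lr=0.1.
        return (n_hidden - 8) ** 2 + (lr - 0.1) ** 2

    # Candidate hidden-layer sizes and learning rates (illustrative values).
    grid = itertools.product([4, 8, 16], [0.01, 0.1, 0.5])
    best = min(grid, key=lambda p: evaluate(*p))
    print(best)
    ```

    Wrapping this loop in a small interface is one plausible way a tool like ANNPOS could spare users the underlying mathematics: they supply candidate ranges and a quality measure, and the search returns the best-performing settings.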

  6. Software Bridge

    NASA Technical Reports Server (NTRS)

    1995-01-01

    I-Bridge is a commercial version of software developed by I-Kinetics under a NASA Small Business Innovation Research (SBIR) contract. The software allows users of Windows applications to gain quick, easy access to databases, programs and files on UNIX servers. Information goes directly onto spreadsheets and other applications; users need not manually locate, transfer and convert data.

  7. GWASS: GRASS web application software system based on the GeoBrain web service

    NASA Astrophysics Data System (ADS)

    Qiu, Fang; Ni, Feng; Chastain, Bryan; Huang, Haiting; Zhao, Peisheng; Han, Weiguo; Di, Liping

    2012-10-01

    GRASS is a well-known geographic information system developed more than 30 years ago. As one of the earliest GIS systems, GRASS has survived mainly as free, open-source desktop GIS software, with users primarily limited to the research community or among programmers who use it to create customized functions. To allow average GIS end users to continue taking advantage of this widely-used software, we developed a GRASS Web Application Software System (GWASS), a distributed, web-based, multi-tiered Geospatial Information System (GIS) built on top of the GeoBrain web service, a project sponsored by NASA using the latest service oriented architecture (SOA). This SOA-enabled system offers an effective and practical alternative to current commercial desktop GIS solutions. With GWASS, all geospatial processing and analyses are conducted by the server, so users are not required to install any software at the client side, which reduces the cost of access for users. The only resource needed to use GWASS is access to the Internet, and anyone who knows how to use a web browser can operate the system. The SOA framework is revitalizing GRASS as a new means to bring powerful geospatial analysis and resources to more users with concurrent access.

  8. Lessons Learned from Application of System and Software Level RAMS Analysis to a Space Control System

    NASA Astrophysics Data System (ADS)

    Silva, N.; Esper, A.

    2012-01-01

    This article presents the results of applying RAMS analysis to a critical space control system, both at system and software levels. The system level RAMS analysis allowed the assignment of criticalities to the high level components, which was further refined by a tailored software level RAMS analysis. The importance of the software level RAMS analysis in the identification of new failure modes and its impact on the system level RAMS analysis is discussed. Recommendations of changes in the software architecture have also been proposed in order to reduce the criticality of the SW components to an acceptable minimum. The dependability analysis was performed in accordance with ECSS-Q-ST-80, which had to be tailored and complemented in some aspects. This tailoring is also detailed in the article, and lessons learned from its application are shared, underlining its importance to space system safety evaluations. The paper presents the applied techniques, the relevant results obtained, the effort required for performing the tasks and the planned strategy for ROI estimation, as well as the soft skills required and acquired during these activities.

  9. In-situ and non-destructive focus determination device for high-precision laser applications

    NASA Astrophysics Data System (ADS)

    Armbruster, Oskar; Naghilou, Aida; Pöhl, Hannes; Kautek, Wolfgang

    2016-09-01

    A non-destructive, in-line, and low-cost focusing device based on an image sensor has been developed and demonstrated. It allows in situ focus determination for a broad variety of laser types (e.g. cw and pulsed lasers). It provides stringent focusing conditions with high numerical apertures. This approach does not require sub-picosecond and/or auxiliary lasers, or high fluences above damage thresholds. Applications of this system include, but are not limited to, the laser illumination of micro-electrodes, pump-probe microscopy on thin films, and laser ablation of small samples without sufficient surface area for focus determination by ablation. An uncertainty in the focus position of an order of magnitude less than the respective Rayleigh length was demonstrated.

  10. A software tool of digital tomosynthesis application for patient positioning in radiotherapy.

    PubMed

    Yan, Hui; Dai, Jian-Rong

    2016-01-01

    Digital Tomosynthesis (DTS) is an image modality that reconstructs tomographic images from two-dimensional kV projections covering a narrow scan angle. Compared with conventional cone-beam CT (CBCT), it requires less time and radiation dose for data acquisition. It is feasible to apply this technique to patient positioning in radiotherapy. To facilitate its clinical application, a software tool was developed and the reconstruction processes were accelerated by a graphics processing unit (GPU). Two reconstruction and two registration processes are required for DTS application, unlike conventional CBCT application, which requires one image reconstruction process and one image registration process. The reconstruction stage consists of productions of two types of DTS. One type of DTS is reconstructed from cone-beam (CB) projections covering a narrow scan angle and is named onboard DTS (ODTS), which represents the real patient position in the treatment room. Another type of DTS is reconstructed from digitally reconstructed radiography (DRR) and is named reference DTS (RDTS), which represents the ideal patient position in the treatment room. Prior to the reconstruction of RDTS, the DRRs are reconstructed from the planning CT using the same acquisition settings as the CB projections. The registration stage consists of two matching processes between ODTS and RDTS. The target shifts in the lateral and longitudinal axes are obtained from the matching between ODTS and RDTS in the coronal view, while the target shifts in the longitudinal and vertical axes are obtained from the matching between ODTS and RDTS in the sagittal view. In this software, both DRR and DTS reconstruction algorithms were implemented in GPU environments for acceleration. A comprehensive evaluation of this software tool was performed, including geometric accuracy, image quality, registration accuracy, and reconstruction efficiency. The average correlation coefficient between DRR/DTS generated by GPU-based algorithm

  11. Development of a web application for water resources based on open source software

    NASA Astrophysics Data System (ADS)

    Delipetrev, Blagoj; Jonoski, Andreja; Solomatine, Dimitri P.

    2014-01-01

    This article presents research and development of a prototype web application for water resources using the latest advancements in Information and Communication Technologies (ICT), open source software and web GIS. The web application has three web services for: (1) managing, presenting and storing of geospatial data, (2) support of water resources modeling and (3) water resources optimization. The web application is developed using several programming languages (PHP, Ajax, JavaScript, Java), libraries (OpenLayers, JQuery) and open source software components (GeoServer, PostgreSQL, PostGIS). The presented web application has several main advantages: it is available all the time, it is accessible from everywhere, it creates a real-time multi-user collaboration platform, the programming language code and components are interoperable and designed to work in a distributed computer environment, it is flexible for adding additional components and services, and it is scalable depending on the workload. The application was successfully tested on a case study with concurrent multi-user access.

  12. Safety Characteristics in System Application of Software for Human Rated Exploration Missions for the 8th IAASS Conference

    NASA Technical Reports Server (NTRS)

    Mango, Edward J.

    2016-01-01

    NASA and its industry and international partners are embarking on a bold and inspiring development effort to design and build an exploration class space system. The space system is made up of the Orion system, the Space Launch System (SLS) and the Ground Systems Development and Operations (GSDO) system. All are highly coupled together and dependent on each other for the combined safety of the space system. A key area of system safety focus needs to be in the ground and flight application software system (GFAS). In the development, certification and operations of GFAS, there are a series of safety characteristics that define the approach to ensure mission success. This paper will explore and examine the safety characteristics of the GFAS development. The GFAS system integrates the flight software packages of the Orion and SLS with the ground systems and launch countdown sequencers through the 'agile' software development process. A unique approach is needed to develop the GFAS project capabilities within this agile process. NASA has defined the software development process through a set of standards. The standards were written during the infancy of the so-called industry 'agile development' movement and must be tailored to adapt to the highly integrated environment of human exploration systems. Safety of the space systems and the eventual crew on board is paramount during the preparation of the exploration flight systems. A series of software safety characteristics have been incorporated into the development and certification efforts to ensure readiness for use and compatibility with the space systems. Three underlying factors in the exploration architecture require the GFAS system to be unique in its approach to ensure safety for the space systems, both the flight as well as the ground systems. The first are the missions themselves, which are exploration in nature, and go far beyond the comfort of low Earth orbit operations.
The second is the current exploration

  13. Demonstration of a software design and statistical analysis methodology with application to patient outcomes data sets

    PubMed Central

    Mayo, Charles; Conners, Steve; Warren, Christopher; Miller, Robert; Court, Laurence; Popple, Richard

    2013-01-01

    Purpose: With the emergence of clinical outcomes databases as tools utilized routinely within institutions comes the need for software tools to support automated statistical analysis of these large data sets and interinstitutional exchange from independent federated databases to support data pooling. In this paper, the authors present a design approach and analysis methodology that addresses both issues. Methods: A software application was constructed to automate analysis of patient outcomes data using a wide range of statistical metrics, by combining use of C#.Net and R code. The accuracy and speed of the code was evaluated using benchmark data sets. Results: The approach provides data needed to evaluate combinations of statistical measurements for their ability to identify patterns of interest in the data. Through application of the tools to a benchmark data set for dose-response threshold and to SBRT lung data sets, an algorithm was developed that uses receiver operator characteristic curves to identify a threshold value and combines use of contingency tables, Fisher exact tests, Welch t-tests, and Kolmogorov-Smirnov tests to filter the large data set to identify values demonstrating dose-response. Kullback-Leibler divergences were used to provide additional confirmation. Conclusions: The work demonstrates the viability of the design approach and the software tool for analysis of large data sets. PMID:24320426
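
The authors' implementation combines C#.Net and R; purely as an illustration of one filter in the chain they describe, a two-sided Fisher exact test for a 2x2 contingency table can be computed directly from the hypergeometric distribution. The table values below are hypothetical.

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]]:
    sum the hypergeometric probabilities of all tables (with the same
    margins) that are no more probable than the observed one."""
    row1, col1, n = a + b, a + c, a + b + c + d
    def prob(x):  # P(top-left cell == x) under fixed margins
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)
    p_obs = prob(a)
    lo, hi = max(0, row1 - (n - col1)), min(row1, col1)
    return sum(prob(x) for x in range(lo, hi + 1) if prob(x) <= p_obs + 1e-12)

# Hypothetical table: outcomes above/below a ROC-derived dose threshold.
p = fisher_exact_2x2(8, 2, 1, 9)   # → ≈ 0.0055
```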

  14. Techniques and tools for measuring energy efficiency of scientific software applications

    NASA Astrophysics Data System (ADS)

    Abdurachmanov, David; Elmer, Peter; Eulisse, Giulio; Knight, Robert; Niemi, Tapio; Nurminen, Jukka K.; Nyback, Filip; Pestana, Gonçalo; Ou, Zhonghong; Khan, Kashif

    2015-05-01

    The scale of scientific High Performance Computing (HPC) and High Throughput Computing (HTC) has increased significantly in recent years, and is becoming sensitive to total energy use and cost. Energy efficiency has thus become an important concern in scientific fields such as High Energy Physics (HEP). There has been a growing interest in utilizing alternate architectures, such as low power ARM processors, to replace traditional Intel x86 architectures. Nevertheless, even though such solutions have been successfully used in mobile applications with low I/O and memory demands, it is unclear if they are suitable and more energy-efficient in the scientific computing environment. Furthermore, there is a lack of tools and experience to derive and compare power consumption between the architectures for various workloads, and eventually to support software optimizations for energy efficiency. To that end, we have performed several physical and software-based measurements of workloads from HEP applications running on ARM and Intel architectures, and compared their power consumption and performance. We leverage several profiling tools (both in hardware and software) to extract different characteristics of the power use. We report the results of these measurements and the experience gained in developing a set of measurement techniques and profiling tools to accurately assess the power consumption for scientific workloads.
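
The specific profiling tools are not named in this record, but the basic reduction they all perform, turning sampled instantaneous power into total energy, can be sketched with trapezoidal integration. The sample values below are synthetic.

```python
def energy_joules(times_s, power_w):
    """Total energy (J) from time-stamped power samples (W), trapezoidal rule."""
    return sum((t1 - t0) * (p0 + p1) / 2.0
               for (t0, p0), (t1, p1) in zip(zip(times_s, power_w),
                                             zip(times_s[1:], power_w[1:])))

# Synthetic trace: a workload drawing a constant 50 W for 10 s → 500 J.
e = energy_joules([0.0, 5.0, 10.0], [50.0, 50.0, 50.0])   # → 500.0
```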

  15. 25 CFR 547.8 - What are the minimum technical software standards applicable to Class II gaming systems?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 25 Indians 2 2014-04-01 2014-04-01 false What are the minimum technical software standards... EQUIPMENT § 547.8 What are the minimum technical software standards applicable to Class II gaming systems... adopted by the tribe or TGRA; (ii) Display player interface identification; and (iii) Display...

  16. An image-processing software package: UU and Fig for optical metrology applications

    NASA Astrophysics Data System (ADS)

    Chen, Lujie

    2013-06-01

    Modern optical metrology applications are largely supported by computational methods, such as phase shifting [1], Fourier transform [2], digital image correlation [3], camera calibration [4], etc., in which image processing is a critical and indispensable component. While it is not too difficult to obtain a wide variety of image-processing programs from the internet, few cater to the relatively specialized area of optical metrology. This paper introduces an image-processing software package: UU (data processing) and Fig (data rendering) that incorporates many useful functions to process optical metrological data. The cross-platform programs UU and Fig are developed based on wxWidgets. At the time of writing, they have been tested on Windows, Linux and Mac OS. The user interface is designed to offer precise control of the underlying processing procedures in a scientific manner. The data input/output mechanism is designed to accommodate diverse file formats and to facilitate interaction with other independent programs. In terms of robustness, although the software was initially developed for personal use, it is comparable in stability and accuracy to most commercial software of a similar nature. In addition to functions for optical metrology, the software package has a rich collection of useful tools in the following areas: real-time image streaming from USB and GigE cameras, computational geometry, computer vision, fitting of data, 3D image processing, vector image processing, precision device control (rotary stage, PZT stage, etc.), point cloud to surface reconstruction, volume rendering, batch processing, etc. The software package is currently used in a number of universities for teaching and research.

  17. Application of the AHP method in modeling the trust and reputation of software agents

    NASA Astrophysics Data System (ADS)

    Zytniewski, Mariusz; Klementa, Marek; Skorupka, Dariusz; Stanek, Stanislaw; Duchaczek, Artur

    2016-06-01

    Given the unique characteristics of cyberspace and, in particular, the number of inherent security threats, communication between software agents becomes a highly complex issue and a major challenge that, on the one hand, needs to be continuously monitored and, on the other, awaits new solutions addressing its vulnerabilities. An approach that has recently come into view mimics mechanisms typical of social systems and is based on trust and reputation that assist agents in deciding which other agents to interact with. The paper offers an enhancement to existing trust and reputation models, involving the application of the AHP method that is widely used for decision support in social systems, notably for risk analysis. To this end, it is proposed to expand the underlying conceptual basis by including such notions as self-trust and social trust, and to apply these to software agents. The discussion is concluded with an account of an experiment aimed at testing the effectiveness of the proposed solution.

  18. The Application of V&V within Reuse-Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward

    1996-01-01

    Verification and Validation (V&V) is performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors as early as possible during the development process. Early discovery is important in order to minimize the cost and other impacts of correcting these errors. In reuse-based software engineering, decisions on the requirements, design and even implementation of domain assets can be made prior to beginning development of a specific system. In order to bring the effectiveness of V&V to bear within reuse-based software engineering, V&V must be incorporated within the domain engineering process.

  19. A Case Study of the Evolving Software Architecture for the FDA Generic Drug Application Process

    PubMed Central

    Canfield, Kip; Ritondo, Michele; Sponaugle, Richard

    1998-01-01

    The primary goal of this project was to develop a software architecture to support the Food and Drug Administration (FDA) generic drug application process by making it more efficient and effective. The secondary goal was to produce a scalable, modular, and flexible architecture that could be generalized to other contexts in interorganizational health care communications. The system described here shows improvements over the old system for the generic drug application process for most of the defined design objectives. The modular, flexible design that produced this new system offers lessons for the general design of distributed health care information systems and points the way to robust application frameworks that will allow practical development and maintenance of a distributed infrastructure. PMID:9760391

  20. Computer software.

    PubMed

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of types, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips. PMID:3536223

  1. A flexible software architecture for scalable real-time image and video processing applications

    NASA Astrophysics Data System (ADS)

    Usamentiaga, Rubén; Molleda, Julio; García, Daniel F.; Bulnes, Francisco G.

    2012-06-01

    Real-time image and video processing applications require skilled architects, and recent trends in the hardware platform make the design and implementation of these applications increasingly complex. Many frameworks and libraries have been proposed or commercialized to simplify the design and tuning of real-time image processing applications. However, they tend to lack flexibility because they are normally oriented towards particular types of applications, or they impose specific data processing models such as the pipeline. Other issues include large memory footprints, difficulty for reuse and inefficient execution on multicore processors. This paper presents a novel software architecture for real-time image and video processing applications which addresses these issues. The architecture is divided into three layers: the platform abstraction layer, the messaging layer, and the application layer. The platform abstraction layer provides a high level application programming interface for the rest of the architecture. The messaging layer provides a message passing interface based on a dynamic publish/subscribe pattern. A topic-based filtering in which messages are published to topics is used to route the messages from the publishers to the subscribers interested in a particular type of messages. The application layer provides a repository for reusable application modules designed for real-time image and video processing applications. These modules, which include acquisition, visualization, communication, user interface and data processing modules, take advantage of the power of other well-known libraries such as OpenCV, Intel IPP, or CUDA. Finally, we present different prototypes and applications to show the possibilities of the proposed architecture.
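
As a minimal sketch of the topic-based publish/subscribe routing described for the messaging layer (not the authors' implementation), subscribers register callbacks per topic and publishers reach only the subscribers interested in that topic. The topic names below are hypothetical.

```python
from collections import defaultdict

class MessageBus:
    """Minimal topic-based publish/subscribe: subscribers register a callback
    for a topic; publishing routes the message to every matching callback."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, message):
        for cb in self._subs[topic]:
            cb(message)

bus = MessageBus()
frames = []
bus.subscribe("frames/raw", frames.append)   # e.g. a visualization module
bus.publish("frames/raw", "frame-0")         # delivered to the subscriber
bus.publish("stats/fps", 30)                 # no subscriber: silently dropped
```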

  2. The development and application of composite complexity models and a relative complexity metric in a software maintenance environment

    NASA Technical Reports Server (NTRS)

    Hops, J. M.; Sherif, J. S.

    1994-01-01

    A great deal of effort is now being devoted to the study, analysis, prediction, and minimization of software maintenance expected cost, long before software is delivered to users or customers. It has been estimated that, on the average, the effort spent on software maintenance is as costly as the effort spent on all other software costs. Software design methods should be the starting point to aid in alleviating the problems of software maintenance complexity and high costs. Two aspects of maintenance deserve attention: (1) protocols for locating and rectifying defects, and for ensuring that no new defects are introduced in the development phase of the software process; and (2) protocols for modification, enhancement, and upgrading. This article focuses primarily on the second aspect, the development of protocols to help increase the quality and reduce the costs associated with modifications, enhancements, and upgrades of existing software. This study developed parsimonious models and a relative complexity metric for complexity measurement of software that were used to rank the modules in the system relative to one another. Some success was achieved in using the models and the relative metric to identify maintenance-prone modules.
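
The article's concrete model coefficients are not reproduced in this record; a generic relative-complexity ranking of the kind described can be sketched by z-scoring raw metrics across modules and combining the z-scores with weights. The module names, metrics, and weights below are hypothetical.

```python
from statistics import mean, pstdev

def relative_complexity(modules, weights):
    """Rank module names from most to least complex: z-score each raw metric
    across all modules, then combine the z-scores with the given weights."""
    stats = {}
    for m in weights:
        vals = [metrics[m] for metrics in modules.values()]
        stats[m] = (mean(vals), pstdev(vals) or 1.0)  # guard zero spread
    score = {name: sum(w * (metrics[m] - stats[m][0]) / stats[m][1]
                       for m, w in weights.items())
             for name, metrics in modules.items()}
    return sorted(score, key=score.get, reverse=True)

# Hypothetical modules with lines-of-code and cyclomatic complexity metrics:
modules = {"parser":  {"loc": 1200, "cyclomatic": 45},
           "logger":  {"loc": 150,  "cyclomatic": 6},
           "planner": {"loc": 800,  "cyclomatic": 60}}
ranking = relative_complexity(modules, {"loc": 0.5, "cyclomatic": 0.5})
# → ['parser', 'planner', 'logger']
```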

  3. The Development and Application of Composite Complexity Models and a Relative Complexity Metric in a Software Maintenance Environment

    NASA Astrophysics Data System (ADS)

    Hops, J. M.; Sherif, J. S.

    1994-01-01

    A great deal of effort is now being devoted to the study, analysis, prediction, and minimization of software maintenance expected cost, long before software is delivered to users or customers. It has been estimated that, on the average, the effort spent on software maintenance is as costly as the effort spent on all other software costs. Software design methods should be the starting point to aid in alleviating the problems of software maintenance complexity and high costs. Two aspects of maintenance deserve attention: (1) protocols for locating and rectifying defects, and for ensuring that no new defects are introduced in the development phase of the software process, and (2) protocols for modification, enhancement, and upgrading. This article focuses primarily on the second aspect, the development of protocols to help increase the quality and reduce the costs associated with modifications, enhancements, and upgrades of existing software. This study developed parsimonious models and a relative complexity metric for complexity measurement of software that were used to rank the modules in the system relative to one another. Some success was achieved in using the models and the relative metric to identify maintenance-prone modules.

  4. Surgical model-view-controller simulation software framework for local and collaborative applications

    PubMed Central

    Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu

    2010-01-01

    Purpose Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. Methods A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. Results The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. Conclusion A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users. PMID:20714933
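
A decoupled simulation driving two processes at different rates from one timeline can be sketched as follows; the 1,000 Hz haptic rate matches the figure quoted in the abstract, while the 60 Hz render rate is an assumed value for illustration.

```python
def run_decoupled(duration_s, haptics_hz=1000, render_hz=60):
    """Drive two loops from one simulated clock: a high-rate haptics step
    every tick and a low-rate render step whenever its deadline is reached.
    Returns (haptic_ticks, render_ticks)."""
    dt = 1.0 / haptics_hz
    haptic_ticks = render_ticks = 0
    next_render, t = 0.0, 0.0
    while t < duration_s - 1e-9:
        haptic_ticks += 1                 # 1 kHz haptics/physics update
        if t >= next_render - 1e-9:       # render deadline reached
            render_ticks += 1
            next_render += 1.0 / render_hz
        t += dt
    return haptic_ticks, render_ticks

h, r = run_decoupled(1.0)   # → (1000, 60)
```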

  5. Second International Workshop on Software Engineering and Code Design in Parallel Meteorological and Oceanographic Applications

    NASA Technical Reports Server (NTRS)

    OKeefe, Matthew (Editor); Kerr, Christopher L. (Editor)

    1998-01-01

    This report contains the abstracts and technical papers from the Second International Workshop on Software Engineering and Code Design in Parallel Meteorological and Oceanographic Applications, held June 15-18, 1998, in Scottsdale, Arizona. The purpose of the workshop is to bring together software developers in meteorology and oceanography to discuss software engineering and code design issues for parallel architectures, including Massively Parallel Processors (MPP's), Parallel Vector Processors (PVP's), Symmetric Multi-Processors (SMP's), Distributed Shared Memory (DSM) multi-processors, and clusters. Issues to be discussed include: (1) code architectures for current parallel models, including basic data structures, storage allocation, variable naming conventions, coding rules and styles, i/o and pre/post-processing of data; (2) designing modular code; (3) load balancing and domain decomposition; (4) techniques that exploit parallelism efficiently yet hide the machine-related details from the programmer; (5) tools for making the programmer more productive; and (6) the proliferation of programming models (F--, OpenMP, MPI, and HPF).
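
Among the workshop topics, domain decomposition is the most mechanical; as an illustrative sketch (not taken from any workshop paper), a one-dimensional block decomposition spreading grid cells across ranks as evenly as possible looks like this:

```python
def decompose(n_cells, n_procs):
    """Split n_cells grid cells over n_procs ranks into contiguous (start,
    stop) ranges; rank sizes differ by at most one cell (balanced load)."""
    base, extra = divmod(n_cells, n_procs)
    bounds, start = [], 0
    for rank in range(n_procs):
        stop = start + base + (1 if rank < extra else 0)
        bounds.append((start, stop))
        start = stop
    return bounds

parts = decompose(10, 4)   # → [(0, 3), (3, 6), (6, 8), (8, 10)]
```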

  6. Design of single phase inverter using microcontroller assisted by data processing applications software

    NASA Astrophysics Data System (ADS)

    Ismail, K.; Muharam, A.; Amin; Widodo Budi, S.

    2015-12-01

    Inverters are widely used for industrial, office, and residential purposes. The inverter supports the development of alternative energy sources such as solar cells, wind turbines and fuel cells by converting DC voltage to AC voltage. Inverters have been built with a variety of hardware and software combinations, such as pure analog circuits and various types of microcontroller as the controller. When using a pure analog circuit, modification is difficult because it changes the entire set of hardware components. In a microcontroller-based inverter design (with software), the calculations that generate the AC modulation are done in the microcontroller. This increases programming complexity and the amount of code downloaded to the microcontroller chip (flash memory capacity in the microcontroller is limited). This paper discusses the design of a single-phase inverter using unipolar modulation of a sine wave and a triangular wave, performed outside the microcontroller using a data processing software application (Microsoft Excel). Results show that programming complexity was reduced and that the sampling resolution strongly influences THD; a sampling resolution of half a degree must be used to obtain the best THD (15.8%).
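
The paper's Excel worksheet is not reproduced here; as a hedged sketch of unipolar sine-triangle modulation, the same comparison can be tabulated in a few lines: each leg switches when the positive or negated sine reference exceeds a triangular carrier. The amplitude, carrier ratio, and one-degree sampling below are assumed values, not the paper's.

```python
import math

def unipolar_spwm(samples_per_cycle=360, carrier_ratio=20, amplitude=0.8):
    """One fundamental cycle of unipolar sine-triangle PWM, sampled per
    degree. Returns leg A minus leg B, which takes the values -1, 0, +1."""
    out = []
    for k in range(samples_per_cycle):
        theta = 2 * math.pi * k / samples_per_cycle
        ref = amplitude * math.sin(theta)
        # Triangular carrier in [-1, 1], carrier_ratio times the fundamental.
        phase = (k * carrier_ratio / samples_per_cycle) % 1.0
        tri = 4 * abs(phase - 0.5) - 1
        a = 1 if ref > tri else 0    # leg A switch state
        b = 1 if -ref > tri else 0   # leg B switch state
        out.append(a - b)
    return out

wave = unipolar_spwm()   # three-level unipolar output
```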

  7. Statistical software applications used in health services research: analysis of published studies in the U.S

    PubMed Central

    2011-01-01

    Background This study aims to identify the statistical software applications most commonly employed for data analysis in health services research (HSR) studies in the U.S. The study also examines the extent to which information describing the specific analytical software utilized is provided in published articles reporting on HSR studies. Methods Data were extracted from a sample of 1,139 articles (including 877 original research articles) published between 2007 and 2009 in three U.S. HSR journals that were considered to be representative of the field based upon a set of selection criteria. Descriptive analyses were conducted to categorize patterns in statistical software usage in those articles. The data were stratified by calendar year to detect trends in software use over time. Results Only 61.0% of original research articles in prominent U.S. HSR journals identified the particular type of statistical software application used for data analysis. Stata and SAS were overwhelmingly the most commonly used software applications employed (in 46.0% and 42.6% of articles, respectively). However, SAS use grew considerably during the study period compared to other applications. Stratification of the data revealed that the type of statistical software used varied considerably by whether authors were from the U.S. or from other countries. Conclusions The findings highlight a need for HSR investigators to identify more consistently the specific analytical software used in their studies. Knowing that information can be important, because different software packages might produce varying results, owing to differences in the software's underlying estimation methods. PMID:21977990

  8. Clinical application of extracorporeal shock wave therapy in orthopedics: focused versus unfocused shock waves.

    PubMed

    Foldager, Casper Bindzus; Kearney, Cathal; Spector, Myron

    2012-10-01

    For the past decade extracorporeal shock wave therapy has been applied to a wide range of musculoskeletal disorders. The many promising results and the introduction of shock wave generators that are less expensive and easier to handle have added to the growing interest. Based on their nature of propagation, shock waves can be divided into two types: focused and unfocused. Although several physical differences between these different types of shock waves have been described, very little is known about the clinical outcome using these different modalities. The aim of the present review is to investigate differences in outcome in select orthopaedic applications using focused and unfocused shock waves. PMID:22920552

  9. DSM Software for Computing Synthetic Seismograms in Transversely Isotropic Spherically Symmetric Media and Its Application

    NASA Astrophysics Data System (ADS)

    Kawai, K.; Takeuchi, N.; Geller, R. J.

    2002-12-01

    The existence of anisotropy has been suggested in many regions of the Earth. Determining the anisotropic seismic velocity structure of the Earth can contribute to our understanding of geodynamics and rheology. Inversion of observed seismic waveforms is a promising approach for determining the Earth's anisotropic structure, but it requires the development of computational algorithms and software for computing synthetic seismograms in anisotropic media. Software for computing seismic waveforms in isotropic media based on the Direct Solution Method (DSM; Geller and Ohminato 1994, GJI) has previously been developed and is being used in data analysis, but DSM software for computing synthetic seismograms in anisotropic media has not yet been developed. In this study, we derive algorithms and develop software for computing synthetics for transversely isotropic spherically symmetric media. Our derivation follows previous work for isotropic media (Takeuchi et al. 1996, GRL; Cummins et al. 1997, GJI). The displacement is represented using spherical harmonics for the lateral dependence and linear spline functions for the vertical dependence of the trial functions. The numerical operators derived using these trial functions are then replaced by optimally accurate operators (Geller and Takeuchi 1995, GJI; Takeuchi and Geller 2002, GJI, submitted). Although the number of elastic constants increases from 2 to 5, the numerical operators are basically identical to those for the isotropic case. Our derivation does not require approximations that treat the anisotropic or laterally heterogeneous structure as an infinitesimal perturbation to the isotropic structure. Only spherically symmetric models are considered in this paper, but our methods can be extended to the 3-D case to permit computation of synthetic seismograms with the same accuracy as for spherically symmetric isotropic models. We present computational examples such as accuracy checks and also some applications to
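The linear-spline trial functions mentioned for the vertical dependence are ordinary piecewise-linear "hat" basis functions. A minimal sketch (the grid points and evaluation radius are made up; the DSM code itself is far more involved):

```python
def hat(r, nodes, k):
    """Value at r of the k-th piecewise-linear (hat) basis function
    on the given vertical grid points."""
    if k > 0 and nodes[k - 1] <= r <= nodes[k]:
        return (r - nodes[k - 1]) / (nodes[k] - nodes[k - 1])
    if k < len(nodes) - 1 and nodes[k] <= r <= nodes[k + 1]:
        return (nodes[k + 1] - r) / (nodes[k + 1] - nodes[k])
    return 0.0

nodes = [0.0, 1.0, 2.0, 3.0]   # illustrative vertical grid
r = 1.4
weights = [hat(r, nodes, k) for k in range(len(nodes))]
```

At any interior point only two basis functions are nonzero and they sum to one (a partition of unity), which is what makes the expansion a piecewise-linear interpolant of the nodal coefficients.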

  10. A Java and XML Application to Support Numerical Model Development within the Geologic Sequestration Software Suite (GS3)

    NASA Astrophysics Data System (ADS)

    Williams, M. D.; Wurstner, S. K.; Thorne, P. D.; Freedman, V. L.; Litofsky, A.; Huda, S. A.; Gurumoorthi, V.

    2010-12-01

    A Java and XML based application is currently under development as part of the Geologic Sequestration Software Suite (GS3) to support the generation of input files for multiple subsurface multifluid flow and transport simulators. The application will aid in the translation of conceptual models to a numerical modeling framework, and will provide the capability of generating multi-scale (spatial and temporal) numerical models in support of a variety of collaborative geologic sequestration studies. User specifications for numerical models include grids, geology, boundary and initial conditions, source terms, material properties, geochemical reactions, and geomechanics. Some of these inputs are defined as part of the conceptual model, while others are specified during the numerical model development process. To preserve the distinction between the conceptual model and its translation to a numerical modeling framework, the application manages data associated with each specification independently. This facilitates 1) building multi-scale numerical models from a common conceptual model, 2) building numerical models from multiple conceptual models, 3) building numerical models and input files for different simulators from a common conceptual model, 4) ease in re-generating numerical models in response to revisions of the conceptual model, and 5) revising the numerical model specification during the development process (e.g., grid modifications and resulting re-assignment of material property values and distributions). A key aspect of the model definition software is the ability to define features in the numerical model by specifying them as geometric objects, eliminating the need for the user to specify node/element numbers that often change when the grid is revised. The GS3 platform provides the capability of tracking provenance and dependencies for data files used in the numerical model definition. Within this framework, metadata is generated to support configuration
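The geometric-object idea in the paragraph above can be sketched in a few lines: a feature is defined by a shape, and properties are assigned by testing grid-cell centers against it, so the assignment survives regridding. The names and values here are illustrative, not the GS3 API.

```python
class Box:
    """An axis-aligned geometric object used to define a model feature."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def contains(self, p):
        return all(l <= c <= h for l, c, h in zip(self.lo, p, self.hi))

# A hypothetical sand channel defined geometrically, with no reference
# to node/element numbers.
channel = Box(lo=(0.0, 0.0, 5.0), hi=(100.0, 100.0, 10.0))

# Assign material properties by testing cell centers; revising the grid
# only changes the list of centers, not the feature definition.
cell_centers = [(50.0, 50.0, 7.5), (50.0, 50.0, 2.0)]
material = ["sand" if channel.contains(c) else "clay" for c in cell_centers]
```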

  11. Software Reviews.

    ERIC Educational Resources Information Center

    Bitter, Gary G., Ed.

    1990-01-01

    Reviews three computer software: (1) "Elastic Lines: The Electronic Geoboard" on elementary geometry; (2) "Wildlife Adventures: Whales" on environmental science; and (3) "What Do You Do with a Broken Calculator?" on computation and problem solving. Summarizes the descriptions, strengths and weaknesses, and applications of each software. (YP)

  12. Application of the Software as a Service Model to the Control of Complex Building Systems

    SciTech Connect

    Stadler, Michael; Donadee, Jon; Marnay, Chris; Lai, Judy; Mendes, Goncalo; Appen, Jan von; Mégel, Oliver; Bhattacharya, Prajesh; DeForest, Nicholas

    2011-03-18

    In an effort to create broad access to its optimization software, Lawrence Berkeley National Laboratory (LBNL), in collaboration with the University of California at Davis (UC Davis) and OSISoft, has recently developed a Software as a Service (SaaS) Model for reducing energy costs, cutting peak power demand, and reducing carbon emissions for multipurpose buildings. UC Davis currently collects and stores energy usage data from buildings on its campus. Researchers at LBNL sought to demonstrate that a SaaS application architecture could be built on top of this data system to optimize the scheduling of electricity and heat delivery in the building. The SaaS interface, known as WebOpt, consists of two major parts: a) the investment & planning module and b) the operations module, which builds on the investment & planning module. The operational scheduling and load shifting optimization models within the operations module use data from load prediction and electrical grid emissions models to create an optimal operating schedule for the next week, reducing peak electricity consumption while maintaining quality of energy services. LBNL's application also provides facility managers with suggested energy infrastructure investments for achieving their energy cost and emission goals based on historical data collected with OSISoft's system. This paper describes these models as well as the SaaS architecture employed by LBNL researchers to provide asset scheduling services to UC Davis. The peak demand, emissions, and cost implications of the asset operation schedule and investments suggested by this optimization model are analyzed.
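A toy sketch of the kind of load-shifting schedule the operations module computes: a flexible load is placed in the cheapest hours, subject to a per-hour limit. Prices, loads, and the limit are invented, and the real WebOpt models are full optimization models, not this greedy heuristic.

```python
prices = [0.10, 0.10, 0.30, 0.30, 0.10]   # $/kWh per hour (invented)
base_load = [4, 4, 8, 8, 4]               # kW, inflexible demand
flexible_kwh = 6.0                        # energy that may run in any hour
cap_per_hour = 3.0                        # assumed per-hour shifting limit

# Greedy placement: fill the cheapest hours first (stable sort keeps
# earlier hours ahead on price ties).
order = sorted(range(len(prices)), key=lambda h: prices[h])
schedule = [0.0] * len(prices)
remaining = flexible_kwh
for h in order:
    take = min(cap_per_hour, remaining)
    schedule[h] = take
    remaining -= take
    if remaining == 0:
        break

total = [b + s for b, s in zip(base_load, schedule)]
cost = sum(p * t for p, t in zip(prices, total))
```

Because the flexible energy lands in the $0.10 hours, none of it adds to the expensive peak hours, which is the peak-shaving behavior described above.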

  13. Application of the Software as a Service Model to the Control of Complex Building Systems

    SciTech Connect

    Stadler, Michael; Donadee, Jonathan; Marnay, Chris; Mendes, Goncalo; Appen, Jan von; Megel, Oliver; Bhattacharya, Prajesh; DeForest, Nicholas; Lai, Judy

    2011-03-17

    In an effort to create broad access to its optimization software, Lawrence Berkeley National Laboratory (LBNL), in collaboration with the University of California at Davis (UC Davis) and OSISoft, has recently developed a Software as a Service (SaaS) Model for reducing energy costs, cutting peak power demand, and reducing carbon emissions for multipurpose buildings. UC Davis currently collects and stores energy usage data from buildings on its campus. Researchers at LBNL sought to demonstrate that a SaaS application architecture could be built on top of this data system to optimize the scheduling of electricity and heat delivery in the building. The SaaS interface, known as WebOpt, consists of two major parts: a) the investment & planning module and b) the operations module, which builds on the investment & planning module. The operational scheduling and load shifting optimization models within the operations module use data from load prediction and electrical grid emissions models to create an optimal operating schedule for the next week, reducing peak electricity consumption while maintaining quality of energy services. LBNL's application also provides facility managers with suggested energy infrastructure investments for achieving their energy cost and emission goals based on historical data collected with OSISoft's system. This paper describes these models as well as the SaaS architecture employed by LBNL researchers to provide asset scheduling services to UC Davis. The peak demand, emissions, and cost implications of the asset operation schedule and investments suggested by this optimization model are analyzed.

  14. Software architecture for a distributed real-time system in Ada, with application to telerobotics

    NASA Technical Reports Server (NTRS)

    Olsen, Douglas R.; Messiora, Steve; Leake, Stephen

    1992-01-01

    The architecture and software design methodology presented here are described in the context of a telerobotic application in Ada, specifically the Engineering Test Bed (ETB), which was developed to support the Flight Telerobotic Servicer (FTS) Program at GSFC. However, the nature of the architecture is such that it is applicable to any multiprocessor distributed real-time system. The ETB architecture, which is a derivation of the NASA/NBS Standard Reference Model (NASREM), defines a hierarchy for representing a telerobot system. Within this hierarchy, a module is a logical entity consisting of the software associated with a set of related hardware components in the robot system. A module comprises submodules, which are cyclically executing processes that each perform a specific set of functions. The submodules in a module can run on separate processors. The submodules in the system communicate via command/status (C/S) interface channels, which are used to send commands down and relay status back up the system hierarchy. Submodules also communicate via setpoint data links, which are used to transfer control data from one submodule to another. A submodule invokes submodule algorithms (SMAs) to perform algorithmic operations. Data that describe or model a physical component of the system are stored as objects in the World Model (WM). The WM is a system-wide distributed database that is accessible to submodules in all modules of the system for creating, reading, and writing objects.
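The command/status pattern described above can be sketched with queues standing in for the C/S channels: commands flow down to a cyclically executing submodule, and status flows back up. Names are illustrative, not the ETB API, and the sketch is in Python rather than Ada.

```python
import queue

class Submodule:
    """A cyclically executing process with command/status (C/S) channels."""
    def __init__(self):
        self.commands = queue.Queue()   # commands sent down the hierarchy
        self.status = queue.Queue()     # status relayed back up

    def cycle(self):
        """One execution cycle: consume a command, report status upward."""
        cmd = self.commands.get_nowait()
        self.status.put(f"done:{cmd}")

arm = Submodule()
arm.commands.put("move_joint_1")   # a parent module issues a command
arm.cycle()                        # the submodule runs one cycle
reply = arm.status.get_nowait()    # the parent reads status back up
```

In the real system each submodule would run this cycle continuously on its own processor, with setpoint data links carrying control data between peers.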

  15. Scalable, High-performance 3D Imaging Software Platform: System Architecture and Application to Virtual Colonoscopy

    PubMed Central

    Yoshida, Hiroyuki; Wu, Yin; Cai, Wenli; Brett, Bevin

    2013-01-01

    One of the key challenges in three-dimensional (3D) medical imaging is enabling the fast turnaround time that is often required for interactive or real-time response. This inevitably requires not only high computational power but also high memory bandwidth, due to the massive amount of data that must be processed. In this work, we have developed a software platform that is designed to support high-performance 3D medical image processing for a wide range of applications using increasingly available and affordable commodity computing systems: multi-core, cluster, and cloud computing systems. To achieve scalable, high-performance computing, our platform (1) employs size-adaptive, distributable block volumes as a core data structure for efficient parallelization of a wide range of 3D image processing algorithms; (2) supports task scheduling for efficient load distribution and balancing; and (3) consists of layered parallel software libraries that allow a wide range of medical applications to share the same functionalities. We evaluated the performance of our platform by applying it to an electronic cleansing system in virtual colonoscopy, with initial experimental results showing a 10-fold performance improvement on an 8-core workstation over the original sequential implementation of the system. PMID:23366803
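The block-volume idea reduces to split, process independently, reassemble. A minimal sketch under that reading, using a flattened 1-D "volume" and a thread pool (the platform's actual blocks are 3-D, size-adaptive, and distributable across machines):

```python
from concurrent.futures import ThreadPoolExecutor

def split(volume, block_size):
    """Split a (flattened) volume into independent blocks."""
    return [volume[i:i + block_size] for i in range(0, len(volume), block_size)]

def process(block):
    """Stand-in for a real per-block image filter."""
    return [2 * v for v in block]

volume = list(range(16))
with ThreadPoolExecutor() as pool:
    blocks = split(volume, 4)
    processed = list(pool.map(process, blocks))   # map preserves block order

# Reassembly yields the same result as filtering the whole volume at once,
# which is what makes pointwise filters embarrassingly parallel over blocks.
result = [v for block in processed for v in block]
```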

  16. High intensity focused ultrasound surgery (HIFU) of the brain: A historical perspective, with modern applications

    PubMed Central

    Jagannathan, Jay; Sanghvi, Narendra K; Crum, Lawrence A; Yen, Chun-Po; Medel, Ricky; Dumont, Aaron S; Sheehan, Jason P; Steiner, Ladislau; Jolesz, Ferenc; Kassell, Neal F

    2014-01-01

    The field of MRI-guided high intensity focused ultrasound surgery (MRgFUS) is rapidly evolving, with many potential applications in neurosurgery. This is the first of three articles on MRgFUS; it focuses on the historical development of the technology and its potential applications to modern neurosurgery. The evolution of MRgFUS has occurred in parallel with modern neurological surgery, and the two seemingly distinct disciplines share many of the same pioneering figures. Early studies on focused ultrasound treatment in the 1940s and 1950s demonstrated the ability to perform precise lesioning in the human brain with a favorable risk-benefit profile. However, the need for a craniotomy, as well as the lack of sophisticated imaging technology, resulted in limited growth of HIFU for neurosurgery. More recently, technological advances have permitted the combination of HIFU with MRI guidance, providing an opportunity to treat a variety of CNS disorders effectively. Although challenges remain, HIFU-mediated neurosurgery may offer the ability to target and treat CNS conditions that were previously extremely difficult to address. The remaining two articles in this series will focus on the physical principles of modern MRgFUS as well as current and future avenues for investigation. PMID:19190451

  17. A complete software application for automatic registration of x-ray mammography and magnetic resonance images

    SciTech Connect

    Solves-Llorens, J. A.; Rupérez, M. J. Monserrat, C.; Lloret, M.

    2014-08-15

    Purpose: This work presents a complete and automatic software application to aid radiologists in breast cancer diagnosis. The application is a fully automated method that performs a complete registration of magnetic resonance (MR) images and x-ray (XR) images in both directions (from MR to XR and from XR to MR) and for both x-ray mammogram views, craniocaudal (CC) and mediolateral oblique (MLO). This new approach allows radiologists to mark points in the MR images and, without any manual intervention, provides their corresponding points in both types of XR mammograms, and vice versa. Methods: The application automatically segments magnetic resonance images and x-ray images using the C-Means method and the Otsu method, respectively. It compresses the magnetic resonance images in both directions, CC and MLO, using a biomechanical model of the breast that distinguishes the specific biomechanical behavior of each of its three tissues (skin, fat, and glandular tissue) separately. It projects both compressions and registers them with the original XR images using affine transformations and nonrigid registration methods. Results: The application has been validated by two expert radiologists. This was carried out through a quantitative validation on 14 data sets in which the Euclidean distance between points marked by the radiologists and the corresponding points obtained by the application was measured. The results showed a mean error of 4.2 ± 1.9 mm for the MRI to CC registration, 4.8 ± 1.3 mm for the MRI to MLO registration, and 4.1 ± 1.3 mm for the CC and MLO to MRI registration. Conclusions: A complete software application that automatically registers XR and MR images of the breast has been implemented. The application permits radiologists to estimate the position of a lesion that is suspected of being a tumor in one imaging modality based on its position in another modality, with a clinically acceptable error. The results show that the
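The quantitative validation reported above boils down to the mean and spread of Euclidean distances between radiologist-marked points and the application's predicted points. A sketch with invented 2-D coordinates (the study's points are in real image space):

```python
import math

# (radiologist-marked point, application-predicted point) in mm; invented
pairs = [
    ((31.0, 52.0), (34.0, 56.0)),   # 5.0 mm apart
    ((10.0, 20.0), (10.0, 23.0)),   # 3.0 mm apart
]

errors = [math.dist(a, b) for a, b in pairs]
mean_err = sum(errors) / len(errors)
# population standard deviation of the errors
sd_err = (sum((e - mean_err) ** 2 for e in errors) / len(errors)) ** 0.5
```

Reported figures like "4.2 ± 1.9 mm" are exactly this `mean_err` ± `sd_err` over the marked point pairs.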

  18. Platinum metallization for MEMS application: focus on coating adhesion for biomedical applications.

    PubMed

    Guarnieri, Vittorio; Biazi, Leonardo; Marchiori, Roberto; Lago, Alexandre

    2014-01-01

    The adherence of platinum thin films on Si/SiO2 wafers was studied using chromium, titanium, or alumina (Cr, Ti, Al2O3) as interlayers. The adhesion of Pt is a fundamental property in several areas, for example in MEMS devices, which operate at high temperatures, as well as in biomedical applications, where the adhesion of a Pt film to the substrate is known to be a major challenge in industrial and healthcare applications and in biomedical devices such as stents.(1-4) We investigated the properties of chromium, titanium, and alumina (Cr, Ti, and Al2O3) used as adhesion layers for platinum (Pt) electrodes. Thin films of chromium, titanium, and alumina were deposited on silicon/silicon dioxide (Si/SiO2) wafers by electron beam. We introduced Al2O3 as a new adhesion layer to test the behavior of the Pt film at higher temperatures using a ceramic adhesion thin film. Electrical behavior was measured at different annealing temperatures to determine the performance of the Cr/Pt, Ti/Pt, and Al2O3/Pt metallic films for gas sensor applications. All of these metal layers showed good adhesion to Si/SiO2 and good Au wire bondability at room temperature, but at temperatures above 400 °C the thin Cr/Pt and Ti/Pt films showed poor adhesion due to atomic inter-diffusion between platinum and the metal adhesion layers.(5) The proposed Al2O3/Pt ceramic-metal layers showed better adherence at the higher temperatures tested. PMID:24743057

  19. Platinum metallization for MEMS application. Focus on coating adhesion for biomedical applications.

    PubMed

    Guarnieri, Vittorio; Biazi, Leonardo; Marchiori, Roberto; Lago, Alexandre

    2014-01-01

    The adherence of platinum thin films on Si/SiO2 wafers was studied using chromium, titanium, or alumina (Cr, Ti, Al2O3) as interlayers. The adhesion of Pt is a fundamental property in several areas, for example in MEMS devices, which operate at high temperatures, as well as in biomedical applications, where the adhesion of a Pt film to the substrate is known to be a major challenge in industrial and healthcare applications and in biomedical devices such as stents. We investigated the properties of chromium, titanium, and alumina (Cr, Ti, and Al2O3) used as adhesion layers for platinum (Pt) electrodes. Thin films of chromium, titanium, and alumina were deposited on silicon/silicon dioxide (Si/SiO2) wafers by electron beam. We introduced Al2O3 as a new adhesion layer to test the behavior of the Pt film at higher temperatures using a ceramic adhesion thin film. Electrical behavior was measured at different annealing temperatures to determine the performance of the Cr/Pt, Ti/Pt, and Al2O3/Pt metallic films for gas sensor applications. All of these metal layers showed good adhesion to Si/SiO2 and good Au wire bondability at room temperature, but at temperatures above 400 °C the thin Cr/Pt and Ti/Pt films showed poor adhesion due to atomic inter-diffusion between platinum and the metal adhesion layers. The proposed Al2O3/Pt ceramic-metal layers showed better adherence at the higher temperatures tested. PMID:25482415

  20. Evaluation of commercial drilling and geological software for deep drilling applications

    NASA Astrophysics Data System (ADS)

    Pierdominici, Simona; Prevedel, Bernhard; Conze, Ronald; Tridec Team

    2013-04-01

    risks and costs. This procedure enables a timely, efficient and accurate data access and exchange among the rig site data acquisition system, office-based software applications and data storage. The loading of real-time data has to be quick and efficient in order to refine the model and learn the lessons for the next drilling operations.

  1. Collective focusing of intense ion beam pulses for high-energy density physics applications

    SciTech Connect

    Dorf, Mikhail A.; Kaganovich, Igor D.; Startsev, Edward A.; Davidson, Ronald C.

    2011-03-15

    The collective focusing concept in which a weak magnetic lens provides strong focusing of an intense ion beam pulse carrying a neutralizing electron background is investigated by making use of advanced particle-in-cell simulations and reduced analytical models. The original analysis by Robertson [Phys. Rev. Lett. 48, 149 (1982)] is extended to the parameter regimes of particular importance for several high-energy density physics applications. The present paper investigates (1) the effects of non-neutral collective focusing in a moderately strong magnetic field; (2) the diamagnetic effects leading to suppression of the applied magnetic field due to the presence of the beam pulse; and (3) the influence of a finite-radius conducting wall surrounding the beam cross-section on beam neutralization. In addition, it is demonstrated that the use of the collective focusing lens can significantly simplify the technical realization of the final focusing of ion beam pulses in the Neutralized Drift Compression Experiment-I (NDCX-I), and the conceptual designs of possible experiments on NDCX-I are investigated by making use of advanced numerical simulations.

  2. Collective Focusing of Intense Ion Beam Pulses for High-energy Density Physics Applications

    SciTech Connect

    Dorf, Mikhail A.; Kaganovich, Igor D.; Startsev, Edward A.; Davidson, Ronald C.

    2011-04-27

    The collective focusing concept in which a weak magnetic lens provides strong focusing of an intense ion beam pulse carrying a neutralizing electron background is investigated by making use of advanced particle-in-cell simulations and reduced analytical models. The original analysis by Robertson [Phys. Rev. Lett. 48, 149 (1982)] is extended to the parameter regimes of particular importance for several high-energy density physics applications. The present paper investigates (1) the effects of non-neutral collective focusing in a moderately strong magnetic field; (2) the diamagnetic effects leading to suppression of the applied magnetic field due to the presence of the beam pulse; and (3) the influence of a finite-radius conducting wall surrounding the beam cross-section on beam neutralization. In addition, it is demonstrated that the use of the collective focusing lens can significantly simplify the technical realization of the final focusing of ion beam pulses in the Neutralized Drift Compression Experiment-I (NDCX-I), and the conceptual designs of possible experiments on NDCX-I are investigated by making use of advanced numerical simulations. 2011 American Institute of Physics

  3. RICIS Software Engineering 90 Symposium: Aerospace Applications and Research Directions Proceedings Appendices

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Papers presented at RICIS Software Engineering Symposium are compiled. The following subject areas are covered: flight critical software; management of real-time Ada; software reuse; megaprogramming software; Ada net; POSIX and Ada integration in the Space Station Freedom Program; and assessment of formal methods for trustworthy computer systems.

  4. Application of software engineering to development of reactor-safety codes

    SciTech Connect

    Wilburn, N P; Niccoli, L G

    1980-11-01

    As a result of the drastically increasing cost of software and the lack of an engineering approach, the technology of Software Engineering is being developed. Software Engineering provides an answer to the increasing cost of developing and maintaining software. It has been applied extensively in the business and aerospace communities and is just now being applied to the development of scientific software and, in particular, to the development of reactor safety codes at HEDL.

  5. Using Focused Regression for Accurate Time-Constrained Scaling of Scientific Applications

    SciTech Connect

    Barnes, B; Garren, J; Lowenthal, D; Reeves, J; de Supinski, B; Schulz, M; Rountree, B

    2010-01-28

    Many large-scale clusters now have hundreds of thousands of processors, and processor counts will exceed one million within a few years. Computational scientists must scale their applications to exploit these new clusters. Time-constrained scaling, which is often used, tries to hold total execution time constant while increasing the problem size along with the processor count. However, complex interactions between parameters, the processor count, and execution time complicate determining the input parameters that achieve this goal. In this paper we develop a novel gray-box, focused regression-based approach that assists the computational scientist with maintaining constant run time on increasing processor counts. Combining application-level information from a small set of training runs, our approach allows prediction of the input parameters that result in similar per-processor execution time at larger scales. Our experimental validation across seven applications showed that median prediction errors are less than 13%.
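The gray-box idea can be illustrated with a deliberately simple one-parameter model (the paper's actual regressions use richer application-level information): fit runtime as a function of problem size n and processor count p from a few training runs, then invert the fit to pick the n that holds runtime constant at a larger p. Runs and the model form are invented for the example.

```python
# Training runs: (problem size n, processor count p, runtime in s).
# These synthetic runs satisfy t = n / p exactly.
runs = [(1000, 8, 125.0), (2000, 8, 250.0), (2000, 16, 125.0), (4000, 16, 250.0)]

# One-parameter model t = c * (n / p), fitted by least squares:
# c = sum(x*t) / sum(x*x) with x = n/p.
xs = [n / p for n, p, _ in runs]
ts = [t for _, _, t in runs]
c = sum(x * t for x, t in zip(xs, ts)) / sum(x * x for x in xs)

# Invert the model: the problem size predicted to keep runtime at 125 s
# when scaling up to 1024 processors.
target_time, big_p = 125.0, 1024
n_scaled = target_time * big_p / c
```

The real difficulty the paper addresses is that actual applications have several interacting input parameters and nonlinear scaling behavior, so the model and its inversion are far less trivial than this sketch.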

  6. Mercury: Reusable software application for Metadata Management, Data Discovery and Access

    NASA Astrophysics Data System (ADS)

    Devarakonda, Ranjeet; Palanisamy, Giri; Green, James; Wilson, Bruce E.

    2009-12-01

    simple, keyword, spatial and temporal searches across these metadata sources. The search user interface software has two API categories: a common core API, which is used by all the Mercury user interfaces for querying the index, and a customized API for project-specific user interfaces. For our work in producing a reusable, portable, robust, feature-rich application, Mercury received a 2008 NASA Earth Science Data Systems Software Reuse Working Group Peer-Recognition Software Reuse Award. The new Mercury system is based on a Service Oriented Architecture and effectively reuses components for various services such as the Thesaurus Service, Gazetteer Web Service, and UDDI Directory Services. The software also provides various search services, including RSS, Geo-RSS, OpenSearch, Web Services, and Portlets; an integrated shopping cart to order datasets from various data centers (ORNL DAAC, NSIDC); and integrated visualization tools. Other features include filtering and dynamic sorting of search results; bookmarkable search results; and the ability to save, retrieve, and modify search criteria.

  7. Total lithography system based on a new application software platform enabling smart scanner management

    NASA Astrophysics Data System (ADS)

    Kono, Hirotaka; Masaki, Kazuo; Matsuyama, Tomoyuki; Wakamoto, Shinji; Park, Seemoon; Sugihara, Taro; Shibazaki, Yuichi

    2015-03-01

    Along with device shrinkage, higher accuracy will continuously be required from photolithography tools in order to enhance on-product yield. To achieve higher yield, advanced photolithography tools must be equipped with sophisticated tuning knobs and with software that is flexible enough to be applied per layer. This means photolithography tools must be capable of handling many types of sub-recipes and parameters simultaneously. To make managing such a large amount of data easy and to set up lithography tools smoothly, we have developed a total lithography system called Litho Turnkey Solution, based on a new software application platform that we call Plug and Play Manager (PPM). PPM has its own graphical user interface, which enables total management of various data: recipes, sub-recipes, tuning parameters, measurement results, and so on. Through PPM, parameter making by intelligent applications such as CDU/overlay tuning tools can easily be implemented. In addition, PPM is linked to metrology tools and the customer's host computer, which enables data-flow automation. Based on measurement data received from the metrology tools, PPM calculates correction parameters and sends them to the scanners automatically. This scheme makes calibration feedback loops possible. It should be noted that all of the abovementioned functions run on the same platform through a user-friendly interface, which leads to smart scanner management and improved usability. In this paper, we demonstrate the latest development status of Nikon's total lithography solution based on PPM, describe the details of each application, and provide supporting data for the accuracy and usability of the system.

  8. Adaptation of Control Center Software to Commercial Real-Time Display Applications

    NASA Technical Reports Server (NTRS)

    Collier, Mark D.

    1994-01-01

    NASA-Marshall Space Flight Center (MSFC) is currently developing an enhanced Huntsville Operations Support Center (HOSC) system designed to support multiple spacecraft missions. The Enhanced HOSC is based upon a distributed computing architecture using graphic workstation hardware and industry standard software including POSIX, X Windows, Motif, TCP/IP, and ANSI C. Southwest Research Institute (SwRI) is currently developing a prototype of the Display Services application for this system. Display Services provides the capability to generate and operate real-time data-driven graphic displays. This prototype is a highly functional application designed to allow system end users to easily generate complex data-driven displays. The prototype is easy to use, flexible, highly functional, and portable. Although this prototype is being developed for NASA-MSFC, the general-purpose real-time display capability can be reused in similar mission and process control environments, including any environment depending heavily upon real-time data acquisition and display. Reuse of the prototype will be a straightforward transition because the prototype is portable, is designed to add new display types easily, has a user interface that is separated from the application code, and is largely independent of the specifics of NASA-MSFC's system. Reuse of this prototype in other environments is an excellent alternative to creating a new custom application or, for environments with a large number of users, to purchasing a COTS package.

  9. Design and Application of the Reconstruction Software for the BaBar Calorimeter

    SciTech Connect

    Strother, Philip David; /Imperial Coll., London

    2006-07-07

    The BaBar high energy physics experiment will be in operation at the PEP-II asymmetric e+e- collider in Spring 1999. The primary purpose of the experiment is the investigation of CP violation in the neutral B meson system. The electromagnetic calorimeter forms a central part of the experiment, and new techniques are employed in data acquisition and reconstruction software to maximize the capability of this device. The use of a matched digital filter in the feature extraction in the front-end electronics is presented. The performance of the filter in the presence of the expected high levels of soft photon background from the machine is evaluated. The high luminosity of the PEP-II machine and the demands on the precision of the calorimeter require reliable software that allows for increased physics capability. BaBar has selected C++ as its primary programming language and object-oriented analysis and design as its coding paradigm. The application of this technology to the reconstruction software for the calorimeter is presented. The design of the systems for clustering, cluster division, track matching, particle identification, and global calibration is discussed, with emphasis on the provisions in the design for increased physics capability as levels of understanding of the detector increase. The CP-violating channel B0 → J/ψ K0S has been studied in the two-lepton, two-π0 final state. The contribution of this channel to the evaluation of the angle sin 2β of the unitarity triangle is compared to that from the charged pion final state. An error of 0.34 on this quantity is expected after 1 year of running at design luminosity.
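The matched digital filter used for feature extraction can be illustrated in miniature: correlating a digitized trace against the known pulse template produces a response that peaks at the pulse's arrival sample, from which amplitude and timing features follow. The waveform and template below are invented and noiseless; the real front end contends with heavy soft-photon background.

```python
# Known pulse shape (the matched-filter template), invented for the sketch.
template = [0.0, 0.5, 1.0, 0.5, 0.0]

# A digitized trace containing one scaled copy of the pulse at sample 7.
signal = [0.0] * 20
arrival = 7
for i, t in enumerate(template):
    signal[arrival + i] = 3.0 * t

# Matched filtering = sliding correlation of the trace with the template.
response = [
    sum(signal[i + j] * template[j] for j in range(len(template)))
    for i in range(len(signal) - len(template) + 1)
]
best = max(range(len(response)), key=response.__getitem__)  # arrival estimate
```

The peak location estimates the pulse time, and the peak value scales with the pulse amplitude; the matched filter is optimal for this in the presence of white noise.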

  10. The Environment for Application Software Integration and Execution (EASIE) version 1.0. Volume 1: Executive overview

    NASA Technical Reports Server (NTRS)

    Rowell, Lawrence F.; Davis, John S.

    1989-01-01

    The Environment for Application Software Integration and Execution (EASIE) provides a methodology and a set of software utility programs to ease the task of coordinating engineering design and analysis codes. EASIE was designed to meet the needs of conceptual design engineers who face the task of integrating many stand-alone engineering analysis programs. Using EASIE, programs are integrated through a relational database management system. Volume 1, Executive Overview, gives an overview of the functions provided by EASIE and describes their use. Three operational design systems based upon the EASIE software are briefly described.
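The integration pattern described above, stand-alone codes exchanging data through a shared relational database rather than bespoke file formats, can be sketched with Python's built-in sqlite3. The table, column, and function names are invented for illustration; EASIE itself used its own database utilities.

```python
import sqlite3

# Shared relational store that integrates otherwise stand-alone codes
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE design (parameter TEXT PRIMARY KEY, value REAL)")

def aero_code(db):
    # "Analysis code" A writes its result into the shared database
    db.execute("INSERT OR REPLACE INTO design VALUES ('lift_coeff', 0.42)")

def sizing_code(db):
    # "Analysis code" B reads A's output as its own input
    (cl,) = db.execute(
        "SELECT value FROM design WHERE parameter = 'lift_coeff'"
    ).fetchone()
    return 2.0 * cl  # placeholder downstream computation

aero_code(db)
result = sizing_code(db)
```

The database acts as the single interface contract: each code only needs to know the schema, not the internals of the other codes.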

  11. Application of MR-guided focused pulsed ultrasound for destroying clots in vitro using thrombolytic drugs

    NASA Astrophysics Data System (ADS)

    Hadjisavvas, V.; Ioannides, K.; Damianou, C.

    2011-09-01

    In this paper an MR-guided focused pulsed ultrasound system for the treatment of stroke using thrombolytic drugs in a model in vitro is presented. A single element spherically focused transducer of 5 cm diameter, focusing at 10 cm and operating at 0.5 MHz or 1 MHz, was used. The transducer was mounted in an MR compatible robot. The artery was modelled using a silicone tube. Tissue was modelled using polyacrylamide gel. Coagulated blood was used to model thrombus. A thermocouple was placed in the thrombus in order to measure the thrombus temperature. The effect of power, beam, and frequency was investigated. The goal was to maintain a temperature increase of less than 1 °C during the application of pulsed ultrasound (called safe temperature). With the application of ultrasound alone there was no notable destruction of the thrombus. With the combination of ultrasound and thrombolytic drugs destruction occurred after 60 min of pulsed exposure (PRF = 1 s, duty factor = 10%, and with the thrombus placed 1 cm deep in the tissue). This simple in vitro model proved very successful for evaluating MRgFUS as a modality for treating stroke. In the future we plan to apply this treatment protocol in live animals and humans.
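A brief arithmetic sketch of the pulsing scheme quoted above, reading "PRF = 1 s" as a pulse repetition period of 1 s (an assumption; the abstract's units are ambiguous):

```python
# Pulsing parameters from the abstract (period interpretation assumed)
period_s = 1.0          # one pulse cycle per second
duty_factor = 0.10      # ultrasound on for 10% of each cycle
exposure_min = 60       # total treatment duration

on_time_per_cycle_s = duty_factor * period_s       # 0.1 s on, 0.9 s off
total_on_time_s = exposure_min * 60 * duty_factor  # total insonation time
```

Under this reading, the thrombus receives only 360 s of actual insonation over the hour, which is consistent with the stated goal of keeping the temperature rise under 1 °C.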

  12. Computer controlled cryo-electron microscopy--TOM² a software package for high-throughput applications.

    PubMed

    Korinek, Andreas; Beck, Florian; Baumeister, Wolfgang; Nickell, Stephan; Plitzko, Jürgen M

    2011-09-01

    Automated data acquisition expedites structural studies by electron microscopy and allows data sets of unprecedented size and consistent quality to be collected. In electron tomography it greatly facilitates the systematic exploration of large cellular landscapes and in single particle analysis it allows the generation of data sets for an exhaustive classification of coexisting molecular states. Here we describe a novel software philosophy and architecture that can be used for a great variety of automated data acquisition scenarios. Based on our original software package TOM, the new TOM(2) package has been designed in an object-oriented way. The whole program can be seen as a collection of self-sufficient modules with defined relationships acting in a concerted manner. It subdivides data acquisition into a set of hierarchical tasks, bonding data structure and the operations to be performed tightly together. To demonstrate its capacity for high-throughput data acquisition it has been used in conjunction with instrumentation combining the latest technological achievements in electron optics, cryogenics and robotics. Its performance is demonstrated with a single particle analysis case study and with a batch tomography application. PMID:21704708
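The hierarchical-task decomposition described above can be sketched as follows. The class and task names are invented for illustration (the TOM packages themselves are MATLAB toolboxes); the point is the pattern of self-sufficient modules that bundle data with the operations performed on it.

```python
class AcquisitionTask:
    """A self-sufficient module bundling its data and its operations."""

    def __init__(self, name, subtasks=()):
        self.name = name
        self.subtasks = list(subtasks)

    def run(self):
        # Delegate to child tasks; a leaf task just reports completion
        results = [t.run() for t in self.subtasks]
        return {self.name: results or "done"}

# A (hypothetical) tomography acquisition decomposed into subtasks
tilt_series = AcquisitionTask("tilt_series", [
    AcquisitionTask("track"),
    AcquisitionTask("focus"),
    AcquisitionTask("expose"),
])
result = tilt_series.run()
```

Because each module exposes the same interface, new acquisition scenarios are assembled by composing tasks rather than rewriting the acquisition loop.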

  13. Design and Development of an Open Source Software Application for the Characterization of Spatially Variable Fields

    NASA Astrophysics Data System (ADS)

    Gunnell, D. K.; Osorio-Murillo, C. A.; Over, M. W.; Frystacky, H.; Ames, D. P.; Rubin, Y.

    2013-12-01

    The characterization of the structural parameters of spatially variable fields (SVFs) is essential to understanding the variability of hydrological processes such as infiltration, evapotranspiration, groundwater contaminant transport, etc. SVFs can be characterized using a Bayesian inverse method called Method of Anchored Distributions (MAD). This method characterizes the structural parameters of SVFs using prior information of structural parameter fields, indirect measurements, and simulation models allowing the transfer of valuable information to a target variable field. An example SVF in hydrology is hydraulic conductivity, which may be characterized by head pressure measurements through a simulation model such as MODFLOW. This poster will present the design and development of a free and open source inverse modeling desktop software application and extension framework called MAD# for the characterization of the structural parameters of SVFs using MAD. The developed software is designed with a flexible architecture to support different simulation models and random field generators and includes geographic information system (GIS) interfaces for representing, analyzing, and understanding SVFs. This framework has also been made compatible with Mono, a cross-platform implementation of C#, for a wider usability.

  14. Numerical simulation of shock wave focusing at fold caustics, with application to sonic boom.

    PubMed

    Marchiano, Régis; Coulouvrat, François; Grenon, Richard

    2003-10-01

    Weak shock wave focusing at fold caustics is described by the mixed type elliptic/hyperbolic nonlinear Tricomi equation. This paper presents a new and original numerical method for solving this equation, using a potential formulation and an "exact" numerical solver for handling nonlinearities. Validation tests demonstrate quantitatively the efficiency of the algorithm, which is able to handle complex waveforms as may come out from "optimized" aircraft designed to minimize sonic booms. It provides a real alternative to the approximate method of the hodograph transform. This motivated the application to evaluate the ground track focusing of sonic boom for an accelerating aircraft, by coupling CFD Euler simulations performed around the mock-up on an adapted mesh grid, atmospheric propagation modeling, and the Tricomi algorithm. The chosen configuration is the European Eurosup mock-up. Convergence of the focused boom at the ground level as a function of the matching distance is investigated to demonstrate the efficiency of the numerical process. As a conclusion, it is indicated how the present work may pave the way towards a study on sonic superboom (focused boom) mitigation. PMID:14587578
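For reference, the classical linear Tricomi equation, of which the equation solved in this work is a nonlinear extension, changes type across the caustic:

```latex
z\,\frac{\partial^2 \phi}{\partial x^2} + \frac{\partial^2 \phi}{\partial z^2} = 0,
\qquad \text{elliptic for } z > 0,\quad \text{hyperbolic for } z < 0
```

The nonlinear term that produces shock behavior in the sonic-boom problem is omitted in this linear form.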

  15. Managing MDO Software Development Projects

    NASA Technical Reports Server (NTRS)

    Townsend, J. C.; Salas, A. O.

    2002-01-01

    Over the past decade, the NASA Langley Research Center developed a series of 'grand challenge' applications demonstrating the use of parallel and distributed computation and multidisciplinary design optimization. All but the last of these applications were focused on the high-speed civil transport vehicle; the final application focused on reusable launch vehicles. Teams of discipline experts developed these multidisciplinary applications by integrating legacy engineering analysis codes. As teams became larger and the application development became more complex with increasing levels of fidelity and numbers of disciplines, the need for applying software engineering practices became evident. This paper briefly introduces the application projects and then describes the approaches taken in project management and software engineering for each project; lessons learned are highlighted.

  16. Application of bacteriorhodopsin films in an adaptive-focusing schlieren system

    NASA Astrophysics Data System (ADS)

    Downie, John D.

    1995-09-01

    The photochromic property of bacteriorhodopsin films is exploited in the application of a focusing schlieren optical system for the visualization of optical phase information. By encoding an image on the film with light of one wavelength and reading out with a different wavelength, the readout beam can effectively see the photographic negative of the original image. The potential advantage of this system over previous focusing schlieren systems is that the updatable nature of the bacteriorhodopsin film allows system adaptation. I discuss two image encoding and readout techniques for the bacteriorhodopsin and use film transmission characteristics to choose the more appropriate method. I demonstrate the system principle with experimental results using argon-ion and He-Cd lasers as the two light sources of different wavelengths, and I discuss current limitations to implementation with a white-light source.

  17. Application of Bacteriorhodopsin Films in an Adaptive-Focusing Schlieren System

    NASA Technical Reports Server (NTRS)

    Downie, John D.

    1995-01-01

    The photochromic property of bacteriorhodopsin films is exploited in the application of a focusing schlieren optical system for the visualization of optical phase information. By encoding an image on the film with light of one wavelength and reading out with a different wavelength, the readout beam can effectively see the photographic negative of the original image. The potential advantage of this system over previous focusing schlieren systems is that the updatable nature of the bacteriorhodopsin film allows system adaptation. I discuss two image encoding and readout techniques for the bacteriorhodopsin and use film transmission characteristics to choose the more appropriate method. I demonstrate the system principle with experimental results using argon-ion and He-Cd lasers as the two light sources of different wavelengths, and I discuss current limitations to implementation with a white-light source.

  18. Clinical Application of High-intensity Focused Ultrasound in Cancer Therapy.

    PubMed

    Hsiao, Yi-Hsuan; Kuo, Shou-Jen; Tsai, Horng-Der; Chou, Ming-Chih; Yeh, Guang-Perng

    2016-01-01

    The treatment of cancer is an important issue in both developing and developed countries. Clinical use of ultrasound in cancer is not only for the diagnosis but also for the treatment. Focused ultrasound surgery (FUS) is a noninvasive technique. By using the combination of high-intensity focused ultrasound (HIFU) and imaging method, FUS has the potential to ablate tumor lesions precisely. The main mechanisms of HIFU ablation involve mechanical and thermal effects. Recent advances in HIFU have increased its popularity. Some promising results were achieved in managing various malignancies, including pancreas, prostate, liver, kidney, breast and bone. Other applications include brain tumor ablation and disruption of the blood-brain barrier. We aim at briefly outlining the clinical utility of FUS as a noninvasive technique for a variety of types of cancer treatment. PMID:26918034

  19. Clinical Application of High-intensity Focused Ultrasound in Cancer Therapy

    PubMed Central

    Hsiao, Yi-Hsuan; Kuo, Shou-Jen; Tsai, Horng-Der; Chou, Ming-Chih; Yeh, Guang-Perng

    2016-01-01

    The treatment of cancer is an important issue in both developing and developed countries. Clinical use of ultrasound in cancer is not only for the diagnosis but also for the treatment. Focused ultrasound surgery (FUS) is a noninvasive technique. By using the combination of high-intensity focused ultrasound (HIFU) and imaging method, FUS has the potential to ablate tumor lesions precisely. The main mechanisms of HIFU ablation involve mechanical and thermal effects. Recent advances in HIFU have increased its popularity. Some promising results were achieved in managing various malignancies, including pancreas, prostate, liver, kidney, breast and bone. Other applications include brain tumor ablation and disruption of the blood-brain barrier. We aim at briefly outlining the clinical utility of FUS as a noninvasive technique for a variety of types of cancer treatment. PMID:26918034

  20. Driving Circuitry for Focused Ultrasound Noninvasive Surgery and Drug Delivery Applications

    PubMed Central

    El-Desouki, Munir M.; Hynynen, Kullervo

    2011-01-01

    Recent works on focused ultrasound (FUS) have shown great promise for cancer therapy. Researchers are continuously trying to improve system performance, which is resulting in an increased complexity that is more apparent when using multi-element phased array systems. This has led to significant efforts to reduce system size and cost by relying on system integration. Although ideas from other fields such as microwave antenna phased arrays can be adopted in FUS, the application requirements differ significantly since the frequency range used in FUS is much lower. In this paper, we review recent efforts to design efficient power monitoring, phase shifting and output driving techniques used specifically for high intensity focused ultrasound (HIFU). PMID:22346589

  1. NASA's Software Safety Standard

    NASA Technical Reports Server (NTRS)

    Ramsay, Christopher M.

    2007-01-01

    NASA relies more and more on software to control, monitor, and verify its safety critical systems, facilities and operations. Since the 1960s there has hardly been a spacecraft launched that does not have a computer on board that will provide command and control services. There have been recent incidents where software has played a role in high-profile mission failures and hazardous incidents. For example, the Mars Climate Orbiter, Mars Polar Lander, the DART (Demonstration of Autonomous Rendezvous Technology), and MER (Mars Exploration Rover) Spirit anomalies were all caused or contributed to by software. The Mission Control Centers for the Shuttle, ISS, and unmanned programs are highly dependent on software for data displays, analysis, and mission planning. Despite this growing dependence on software control and monitoring, there has been little to no consistent application of software safety practices and methodology to NASA's projects with safety critical software. Meanwhile, academia and private industry have been stepping forward with procedures and standards for safety critical systems and software, for example Dr. Nancy Leveson's book Safeware: System Safety and Computers. The NASA Software Safety Standard, originally published in 1997, was widely ignored due to its complexity and poor organization. It also focused on concepts rather than definite procedural requirements organized around a software project lifecycle. Led by NASA Headquarters Office of Safety and Mission Assurance, the NASA Software Safety Standard has recently undergone a significant update. This new standard provides the procedures and guidelines for evaluating a project for safety criticality and then lays out the minimum project lifecycle requirements to assure the software is created, operated, and maintained in the safest possible manner.
This update of the standard clearly delineates the minimum set of software safety requirements for a project without detailing the implementation for those

  2. The Point-Focusing Thermal and Electric Applications Project - A progress report. [small solar power systems applications

    NASA Technical Reports Server (NTRS)

    Marriott, A. T.

    1979-01-01

    The paper discusses the Point-Focusing Thermal and Electric Applications Project which encompasses three primary activities: (1) applications analysis and development, in which potential markets for small power systems (less than 10 MWe) are identified and characterized in order to provide requirements for design and information for activities relating to market development; (2) systems engineering and development, for analyses that will define the most appropriate small power system designs based on specific user requirements; and (3) experiment implementation and test, which deals with the design and placement of engineering experiments in various applications environments in order to test the readiness of the selected technology in an operational setting. Progress to date and/or key results are discussed throughout the text.

  3. ELAS - A geobased information system that is transferable to several computers. [Earth resources Laboratory Applications Software

    NASA Technical Reports Server (NTRS)

    Whitley, S. L.; Pearson, R. W.; Seyfarth, B. R.; Graham, M. H.

    1981-01-01

    In the early years of remote sensing, emphasis was placed on the processing and analysis of data from a single multispectral sensor, such as the Landsat Multispectral Scanner System (MSS). However, in connection with attempts to use the data for resource management, it was realized that many deficiencies existed in single data sets. A need was established to geographically reference the MSS data and to register with it data from disparate sources. Technological transfer activities have required systems concepts that can be easily transferred to computers of different types in other organizations. ELAS (Earth Resources Laboratory Applications Software), a geographically based information system, was developed to meet the considered needs. ELAS accepts data from a variety of sources. It contains programs to geographically reference the data to the Universal Transverse Mercator grid. One of the primary functions of ELAS is to produce a surface cover map.

  4. Software Configuration Management for Safety-Related Applications in Space Systems: Extending the Application of the USAF 8-Step Method

    NASA Astrophysics Data System (ADS)

    Johnson, C. W.

    2010-09-01

    Configuration management ensures that the requirements and constraints, identified in previous stages of development, are preserved throughout the design, implementation and operation of complex systems. Space-related software systems pose particular problems because, for instance, it can be hard to determine what code is actually running on a platform as successive updates are performed over many months of remote operation. It is, therefore, important we learn as much as possible from previous mishaps that have involved configuration management; given that software continues to play a critical role in the safety of many space missions. The following pages extend the US Air Force’s 8-Step Method to identify lessons learned from space related incidents. This approach builds on Boyd’s OODA (Observe, Orient, Decide, and Act) loop and provides a common framework for the analysis of these complex incidents. It is important to stress that the application of an existing general approach to problem solving, rather than the development of a specific approach for configuration management, is intended to reduce training costs and to increase the value added from existing investments in the use of the 8-Step Method. Many specialised software engineering techniques are not used because they cannot easily be applied within the financial limits and deadlines that constrain most space programmes. The closing sections of this paper identify areas for further work; in particular, we stress the importance of links with recent European Space Agency problem solving techniques that support the early-stage development of long duration space missions.

  5. VLSI realization of learning vector quantization with hardware/software co-design for different applications

    NASA Astrophysics Data System (ADS)

    An, Fengwei; Akazawa, Toshinobu; Yamasaki, Shogo; Chen, Lei; Jürgen Mattausch, Hans

    2015-04-01

    This paper reports a VLSI realization of learning vector quantization (LVQ) with high flexibility for different applications. It is based on a hardware/software (HW/SW) co-design concept for on-chip learning and recognition and designed as a SoC in 180 nm CMOS. The time-consuming nearest-Euclidean-distance search in the LVQ algorithm’s competition layer is efficiently implemented as a pipeline with parallel p-word input. Since the neuron number in the competition layer, the weight values, and the input and output numbers are scalable, the requirements of many different applications can be satisfied without hardware changes. Classification of a d-dimensional input vector is completed in n × ⌈d/p⌉ + R clock cycles, where R is the pipeline depth and n is the number of reference feature vectors (FVs). Adjustment of stored reference FVs during learning is done by the embedded 32-bit RISC CPU, because this operation is not time critical. The high flexibility is verified by a human-detection application with different FV dimensionalities.
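A software analogue of the competition layer's nearest-Euclidean-distance search, together with the cycle-count formula quoted above. The reference vectors and the example parameters are invented for illustration; on the chip this search is a pipelined, p-word-parallel circuit.

```python
import numpy as np

def nearest_reference(x, references):
    """Return the index of the reference FV nearest to x (competition layer)."""
    # Squared distances suffice for the argmin; no square root needed
    d2 = np.sum((references - x) ** 2, axis=1)
    return int(np.argmin(d2))

def classification_cycles(n, d, p, R):
    """Clock cycles per classification: n * ceil(d/p) + R (from the abstract)."""
    return n * -(-d // p) + R  # ceil division via the negation trick

refs = np.array([[0.0, 0.0], [5.0, 5.0], [0.0, 4.0]])
winner = nearest_reference(np.array([1.0, 3.5]), refs)  # nearest is [0, 4]
cycles = classification_cycles(n=3, d=2, p=2, R=4)      # 3 * 1 + 4
```

Note how the formula reflects the hardware: each of the n reference vectors needs ⌈d/p⌉ cycles of p-word-parallel accumulation, plus R cycles to drain the pipeline.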

  6. Current and Future Clinical Applications of High-Intensity Focused Ultrasound (HIFU) for Pancreatic Cancer

    PubMed Central

    Jang, Hyun Joo; Lee, Jae-Young; Lee, Don-Haeng; Kim, Won-Hong

    2010-01-01

    High-intensity focused ultrasound (HIFU) is a novel therapeutic modality that permits noninvasive treatment of various benign and malignant solid tumors, including prostatic cancer, uterine fibroids, hepatic tumors, renal tumors, breast cancers, and pancreatic cancers. Several preclinical and clinical studies have investigated the safety and efficacy of HIFU for treating solid tumors, including pancreatic cancer. The results of nonrandomized studies of HIFU therapy in patients with pancreatic cancer have suggested that HIFU treatment can effectively alleviate cancer-related pain without any significant complications. This noninvasive method of delivering ultrasound energy into the body has recently been evolving from a method for purely thermal ablation to harnessing the mechanical effects of HIFU to induce a systemic immune response and to enhance targeted drug delivery. This review provides a brief overview of HIFU, describes current clinical applications of HIFU for pancreatic cancer, and discusses future applications and challenges. PMID:21103296

  7. RICIS Software Engineering 90 Symposium: Aerospace Applications and Research Directions Proceedings

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Papers presented at RICIS Software Engineering Symposium are compiled. The following subject areas are covered: synthesis - integrating product and process; Serpent - a user interface management system; prototyping distributed simulation networks; and software reuse.

  8. Teacher-Designed Software for Interactive Linear Equations: Concepts, Interpretive Skills, Applications & Word-Problem Solving.

    ERIC Educational Resources Information Center

    Lawrence, Virginia

    No longer just a user of commercial software, the 21st-century teacher is a designer of interactive software based on theories of learning. This software, a comprehensive study of straight-line equations, enhances conceptual understanding, sketching, graphic-interpretive and word-problem-solving skills as well as making connections to real-life and…

  9. Clinical and future applications of high intensity focused ultrasound in cancer.

    PubMed

    Al-Bataineh, Osama; Jenne, Jürgen; Huber, Peter

    2012-08-01

    High intensity focused ultrasound (HIFU) or focused ultrasound (FUS) is a promising modality to treat tumors in a completely noninvasive fashion, where online image guidance and therapy control can be achieved by magnetic resonance imaging (MRI) or diagnostic ultrasound (US). In the last 10 years, the feasibility and safety of HIFU have been tested in a growing number of clinical studies on several benign and malignant tumors of the prostate, breast, uterus, liver, kidney, pancreas, bone, and brain. For certain indications this new treatment principle is on the verge of becoming a serious alternative or adjunct to the standard treatment options of surgery, radiotherapy, gene therapy and chemotherapy in oncology. In addition to the now clinically available thermal ablation, in the future, focused ultrasound at much lower intensities may have the potential to become a major instrument to mediate drug and gene delivery for localized cancer treatment. We introduce the technology of MRI-guided and ultrasound-guided HIFU and present a critical overview of the clinical applications and results, along with a discussion of future HIFU developments. PMID:21924838

  10. GEnomes Management Application (GEM.app): a new software tool for large-scale collaborative genome analysis.

    PubMed

    Gonzalez, Michael A; Lebrigio, Rafael F Acosta; Van Booven, Derek; Ulloa, Rick H; Powell, Eric; Speziani, Fiorella; Tekin, Mustafa; Schüle, Rebecca; Züchner, Stephan

    2013-06-01

    Novel genes are now identified at a rapid pace for many Mendelian disorders, and increasingly, for genetically complex phenotypes. However, new challenges have also become evident: (1) effectively managing larger exome and/or genome datasets, especially for smaller labs; (2) direct hands-on analysis and contextual interpretation of variant data in large genomic datasets; and (3) many small and medium-sized clinical and research-based investigative teams around the world are generating data that, if combined and shared, will significantly increase the opportunities for the entire community to identify new genes. To address these challenges, we have developed GEnomes Management Application (GEM.app), a software tool to annotate, manage, visualize, and analyze large genomic datasets (https://genomics.med.miami.edu/). GEM.app currently contains ∼1,600 whole exomes from 50 different phenotypes studied by 40 principal investigators from 15 different countries. The focus of GEM.app is on user-friendly analysis for nonbioinformaticians to make next-generation sequencing data directly accessible. Yet, GEM.app provides powerful and flexible filter options, including single family filtering, across family/phenotype queries, nested filtering, and evaluation of segregation in families. In addition, the system is fast, obtaining results within 4 sec across ∼1,200 exomes. We believe that this system will further enhance identification of genetic causes of human disease. PMID:23463597
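A toy sketch of the kind of family-segregation filter described above. The data layout, field names, and the recessive-model rule are simplifications invented for illustration; they are not GEM.app's actual API or schema.

```python
def segregates_recessive(genotypes, affected):
    """Check a simple recessive segregation model for one family.

    genotypes: {sample_name: alt-allele count (0, 1, or 2)}
    affected:  set of sample names with the phenotype
    A variant "segregates" if every affected member is homozygous
    alternate and no unaffected member is.
    """
    return (all(genotypes[s] == 2 for s in affected)
            and all(genotypes[s] < 2
                    for s in genotypes if s not in affected))

# Hypothetical family: two affected sibs, two carrier parents
family = {"proband": 2, "sibling": 2, "mother": 1, "father": 1}
ok = segregates_recessive(family, affected={"proband", "sibling"})
```

Real pipelines layer such per-family predicates with cross-family queries and frequency filters, which is the nested filtering the abstract describes.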

  11. Development of a Controlled Vocabulary and Software Application to Analyze Fruit Shape Variation in Tomato and Other Plant Species

    PubMed Central

    Brewer, Marin Talbot; Lang, Lixin; Fujimura, Kikuo; Dujmovic, Nancy; Gray, Simon; van der Knaap, Esther

    2006-01-01

    The domestication and improvement of fruit-bearing crops resulted in a large diversity of fruit form. To facilitate consistent terminology pertaining to shape, a controlled vocabulary focusing specifically on fruit shape traits was developed. Mathematical equations were established for the attributes so that objective, quantitative measurements of fruit shape could be conducted. The controlled vocabulary and equations were integrated into a newly developed software application, Tomato Analyzer, which conducts semiautomatic phenotypic measurements. To demonstrate the utility of Tomato Analyzer in the detection of shape variation, fruit from two F2 populations of tomato (Solanum spp.) were analyzed. Principal components analysis was used to identify the traits that best described shape variation within as well as between the two populations. The three principal components were analyzed as traits, and several significant quantitative trait loci (QTL) were identified in both populations. The usefulness and flexibility of the software was further demonstrated by analyzing the distal fruit end angle of fruit at various user-defined settings. Results of the QTL analyses indicated that significance levels of detected QTL were greatly improved by selecting the setting that maximized phenotypic variation in a given population. Tomato Analyzer was also applied to conduct phenotypic analyses of fruit from several other species, demonstrating that many of the algorithms developed for tomato could be readily applied to other plants. The controlled vocabulary, algorithms, and software application presented herein will provide plant scientists with novel tools to consistently, accurately, and efficiently describe two-dimensional fruit shapes. PMID:16684933
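The principal components analysis step described above can be sketched with numpy; the shape-descriptor values here are invented (rows are fruit, columns are quantitative shape attributes).

```python
import numpy as np

# Invented shape descriptors: e.g. aspect ratio, blockiness, perimeter
X = np.array([[1.2, 0.8, 10.0],
              [1.1, 0.9, 11.0],
              [2.0, 0.3, 15.0],
              [2.1, 0.2, 16.0]])

Xc = X - X.mean(axis=0)                  # center each attribute
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T                       # per-fruit PC scores
explained = S**2 / np.sum(S**2)          # variance fraction per component
```

The per-fruit scores on the leading components are then treated as derived traits for QTL mapping, exactly as in the analysis the abstract describes.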

  12. Agile hardware and software systems engineering for critical military space applications

    NASA Astrophysics Data System (ADS)

    Huang, Philip M.; Knuth, Andrew A.; Krueger, Robert O.; Garrison-Darrin, Margaret A.

    2012-06-01

    The Multi Mission Bus Demonstrator (MBD) is a successful demonstration of agile program management and systems engineering in a high-risk technology application where utilizing and implementing new, untraditional development strategies was necessary. MBD produced two fully functioning spacecraft for a military/DOD application in a record-breaking time frame and at dramatically reduced cost. This paper discloses the adaptation and application of concepts developed in agile software engineering to hardware product and system development for critical military applications. This challenging spacecraft did not use existing key technology (heritage hardware) and created a large paradigm shift from traditional spacecraft development. The insertion of new technologies and methods in space hardware has long been a problem due to long build times, the desire to use heritage hardware, and the lack of effective process. The role of momentum in the innovative process can be exploited to tackle ongoing technology disruptions and to allow risk interactions to be mitigated in a disciplined manner. Examples of how these concepts were used during the MBD program are delineated. Maintaining project momentum was essential for assessing the constant nonrecurring technological challenges, which needed to be retired rapidly from the engineering risk list. Development never slowed due to tactical assessment of the hardware with the adoption of the SCRUM technique. We adapted this concept as a representation of mitigation of technical risk while allowing for design freeze later in the program's development cycle. By using Agile Systems Engineering and Management techniques which enabled decisive action, the product development momentum was effectively used to produce two novel space vehicles in a fraction of the time and at dramatically reduced cost.

  13. In Vivo application and localization of transcranial focused ultrasound using dual-mode ultrasound arrays.

    PubMed

    Haritonova, Alyona; Liu, Dalong; Ebbini, Emad S

    2015-12-01

    Focused ultrasound (FUS) has been proposed for a variety of transcranial applications, including neuromodulation, tumor ablation, and blood-brain barrier opening. A flurry of activity in recent years has generated encouraging results demonstrating its feasibility in these and other applications. To date, monitoring of FUS beams has been primarily accomplished using MR guidance, where both MR thermography and elastography have been used. The recent introduction of real-time dual-mode ultrasound array (DMUA) systems offers a new paradigm in transcranial focusing. In this paper, we present first experimental results of ultrasound-guided transcranial FUS (tFUS) application in a rodent brain, both ex vivo and in vivo. DMUA imaging is used for visualization of the treatment region for placement of the focal spot within the brain. This includes the detection and localization of pulsating blood vessels at or near the target point(s). In addition, DMUA imaging is used to monitor and localize the FUS-tissue interactions in real time. In particular, a concave (40 mm radius of curvature), 32-element, 3.5-MHz DMUA prototype was used for imaging and tFUS application in ex vivo and in vivo rat models. The ex vivo experiments were used to evaluate the point spread function of the transcranial DMUA imaging at various points within the brain. In addition, DMUA-based transcranial ultrasound thermography measurements were compared with thermocouple measurements of subtherapeutic tFUS heating in rat brain ex vivo. The ex vivo setting was also used to demonstrate the capability of DMUA to produce localized thermal lesions. The in vivo experiments were designed to demonstrate the ability of the DMUA to apply, monitor, and localize subtherapeutic tFUS patterns that could be beneficial in transient blood-brain barrier opening. The results show that although the DMUA focus is degraded due to the propagation through the skull, it still produces localized heating effects within a sub

  14. In Vivo Application and Localization of Transcranial Focused Ultrasound Using Dual-Mode Ultrasound Arrays

    PubMed Central

    Haritonova, Alyona; Liu, Dalong; Ebbini, Emad S.

    2015-01-01

    Focused ultrasound (FUS) has been proposed for a variety of transcranial applications, including neuromodulation, tumor ablation, and blood-brain barrier opening. A flurry of activity in recent years has generated encouraging results demonstrating its feasibility in these and other applications. To date, monitoring of FUS beams has been primarily accomplished using MR guidance, where both MR thermography and elastography have been used. The recent introduction of real-time dual-mode ultrasound array (DMUA) systems offers a new paradigm in transcranial focusing. In this paper, we present the first experimental results of ultrasound-guided transcranial FUS (tFUS) application in a rodent brain, both ex vivo and in vivo. DMUA imaging is used for visualization of the treatment region for placement of the focal spot within the brain. This includes the detection and localization of pulsating blood vessels at or near the target point(s). In addition, DMUA imaging is used to monitor and localize the FUS-tissue interactions in real time. In particular, a concave (40-mm radius of curvature), 32-element, 3.5-MHz DMUA prototype was used for imaging and tFUS application in ex vivo and in vivo rat models. The ex vivo experiments were used to evaluate the point spread function (psf) of the transcranial DMUA imaging at various points within the brain. In addition, DMUA-based transcranial ultrasound thermography measurements were compared with thermocouple measurements of subtherapeutic tFUS heating in rat brain ex vivo. The ex vivo setting was also used to demonstrate the capability of the DMUA to produce localized thermal lesions. The in vivo experiments were designed to demonstrate the ability of the DMUA to apply, monitor, and localize subtherapeutic tFUS patterns that could be beneficial in transient blood-brain barrier opening. The results show that, while the DMUA focus is degraded due to the propagation through the skull, it still produces localized heating effects within a sub

  15. SEE: improving nurse-patient communications and preventing software piracy in nurse call applications.

    PubMed

    Unluturk, Mehmet S

    2012-06-01

    A nurse call system is an electrically operated system by which patients can call for assistance from a bedside station or from a duty station. An intermittent tone is heard and a corridor lamp located outside the room starts blinking at a slow or a faster rate depending on the call origination. It is essential to alert nurses on time so that they can offer care and comfort without any delay. Many devices are currently available to improve communication between nurses and patients in a nurse call system, such as pagers, RFID (radio frequency identification) badges, and wireless phones. To integrate all these devices into an existing nurse call system and make them communicate with each other, we propose software client applications called bridges in this paper. We also propose a Windows server application called SEE (Supervised Event Executive) that delivers messages among these devices. A single hardware dongle is utilized for authentication and copy protection for SEE. Protecting SEE with the security provided by the dongle alone is a weak defense against hackers. In this paper, we develop several defense patterns against hackers, such as calculating checksums at runtime, making calls to the dongle from multiple places in the code, and handling errors properly by logging them into a database. PMID:21222218
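    The runtime-checksum defense pattern mentioned in the abstract can be sketched as follows. This is an illustrative Python sketch of the general technique only, not the SEE implementation (a Windows/dongle-based system); every name in it is hypothetical.

```python
import zlib

def checksum_of(func):
    """Compute a CRC32 checksum over a function's compiled bytecode."""
    return zlib.crc32(func.__code__.co_code)

def critical_operation(x):
    # Stand-in for a protected routine (e.g., the dongle handshake).
    return x * 2

# Record the expected checksum at build/packaging time.
EXPECTED = checksum_of(critical_operation)

def verify_integrity():
    """Re-check the bytecode at runtime. Calling this from several
    places in the application means a patched binary fails in more
    than one spot, which is the point of the defense pattern."""
    if checksum_of(critical_operation) != EXPECTED:
        raise RuntimeError("code tampering detected")

verify_integrity()  # passes for the unmodified function
```

    In the paper's scheme this check would be combined with scattered dongle calls and error logging to a database, so that defeating any single check is not enough.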

  16. Agile Software Development

    ERIC Educational Resources Information Center

    Biju, Soly Mathew

    2008-01-01

    Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of requirement changes at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…

  17. Data-Interpolating Variational Analysis (DIVA) software : recent development and application

    NASA Astrophysics Data System (ADS)

    Watelet, Sylvain; Beckers, Jean-Marie; Barth, Alexander; Back, Örjan

    2016-04-01

    The Data-Interpolating Variational Analysis (DIVA) software is a tool designed to reconstruct a continuous field from discrete measurements. This method is based on the numerical implementation of the Variational Inverse Model (VIM), which consists of the minimization of a cost function, selecting the analysed field that best fits the data sets. The problem is solved efficiently using a finite-element method. This statistical method is particularly suited to dealing with irregularly-spaced observations, producing outputs on a regular grid. Initially created to work in a two-dimensional way, the software is now able to handle 3D or even 4D analysis, in order to easily produce ocean climatologies. These analyses can easily be improved by taking advantage of DIVA's ability to take topographic and dynamic constraints into account (coastal relief, prevailing winds impacting the advection, etc.). DIVA is an open-source software package which is continuously upgraded and distributed for free through frequent version releases. The development is funded by the EMODnet and SeaDataNet projects and includes many discussions and feedback from the user community. Here, we present two recent major upgrades: the data weighting option and the bottom-based analyses. Since DIVA works with a diagonal observation error covariance matrix, it is assumed that the observation errors are uncorrelated in space and time. In practice, this assumption is not always valid, especially when dealing with, e.g., cruise measurements (same instrument) or with time series at a fixed geographic point (representativity error). The data weighting option decreases the weights of such observations in the analysis. These weights are based on an exponential function of the 3D (x,y,t) distance between observations. A comparison between unweighted and weighted analyses will be shown. It has been a recurrent request from the DIVA users to improve the way the analyses near the ocean bottom
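    The data weighting idea above (down-weighting observations that are close in (x, y, t), since their errors are correlated) can be sketched as follows. This is an illustrative reconstruction, not DIVA's actual code; the exact weighting formula and the length scale are assumptions made for demonstration.

```python
import math

def relative_weights(coords, length_scale):
    """Down-weight observations that cluster in (x, y, t) space.

    coords: list of (x, y, t) tuples; length_scale: distance at which
    two observations are considered largely independent.
    An observation surrounded by n-1 near-duplicates gets a weight
    near 1/n; an isolated observation keeps a weight near 1.
    """
    weights = []
    for xi, yi, ti in coords:
        # Effective number of duplicates: self plus exp-decayed neighbours.
        n_eff = 0.0
        for xj, yj, tj in coords:
            d = math.sqrt((xi - xj) ** 2 + (yi - yj) ** 2 + (ti - tj) ** 2)
            n_eff += math.exp(-d / length_scale)
        weights.append(1.0 / n_eff)
    return weights

# Two nearly coincident cruise measurements and one isolated station:
w = relative_weights([(0, 0, 0), (0.01, 0, 0), (10, 10, 10)], length_scale=1.0)
# The clustered pair shares its weight (about 0.5 each);
# the isolated point keeps a weight of about 1.0.
```

    A diagonal error covariance matrix with these reduced weights then approximates the effect of the correlated errors that DIVA cannot represent off-diagonally.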

  18. Open Source Subtitle Editor Software Study for Section 508 Close Caption Applications

    NASA Technical Reports Server (NTRS)

    Murphy, F. Brandon

    2013-01-01

    This paper focuses on a specific item within the NASA Electronic Information Accessibility Policy: multimedia presentations shall have synchronized captions, thus making information accessible to persons with hearing impairments. Synchronized captions assist persons with hearing or cognitive disabilities in accessing the same information as everyone else. This paper covers the research and implementation of CC (subtitle option) support for video multimedia. The goal of this research is to identify the best available open-source (free) software to meet the synchronized-caption requirement and achieve savings, while meeting the security requirements for Government information integrity and assurance. CC and subtitling are processes that display text within a video to provide additional or interpretive information for those who may need it or those who choose it. Closed captions typically show the transcription of the audio portion of a program (video) as it occurs (either verbatim or in edited form), sometimes including non-speech elements (such as sound effects). The transcript can be provided by a third-party source or can be extracted word for word from the video. This feature can be made available for videos in two forms: Soft-Coded or Hard-Coded. Soft-Coded is the optional version of CC, which the viewer can turn on or off. Most of the time, when using the Soft-Coded option, the transcript is also provided to the viewer alongside the video. This option is subject to compromise, because the transcript is merely a text file that can be changed by anyone who has access to it. With this option, the integrity of the CC is at the mercy of the user. Hard-Coded CC is a more permanent form of CC: a Hard-Coded CC transcript is embedded within the video, without the option of removal.
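    Soft-coded captions of the kind described above are typically shipped as a sidecar subtitle file such as SubRip (.srt), which is plain text and therefore trivially editable, which is exactly the integrity concern the paper raises. A minimal sketch of one SRT cue and a parser for it (illustrative only; the paper does not name a specific format):

```python
import re

# One SubRip (.srt) cue: index, time range, caption text.
SRT_CUE = """1
00:00:01,000 --> 00:00:04,000
[door creaks] Who's there?
"""

CUE_RE = re.compile(
    r"(\d+)\s+(\d{2}:\d{2}:\d{2},\d{3}) --> (\d{2}:\d{2}:\d{2},\d{3})\s+(.+)",
    re.DOTALL,
)

def parse_cue(text):
    """Split an SRT cue into (index, start, end, caption)."""
    m = CUE_RE.match(text)
    index, start, end, caption = m.groups()
    return int(index), start, end, caption.strip()

idx, start, end, caption = parse_cue(SRT_CUE)
# idx == 1, start == "00:00:01,000", and the caption keeps the
# non-speech element "[door creaks]" required for accessibility.
```

    Hard-coding burns this same text into the video frames, trading editability (and the tampering risk) for permanence.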

  19. The Environment for Application Software Integration and Execution (EASIE) version 1.0. Volume 4: System installation and maintenance guide

    NASA Technical Reports Server (NTRS)

    Randall, Donald P.; Jones, Kennie H.; Rowell, Lawrence F.

    1988-01-01

    The Environment for Application Software Integration and Execution (EASIE) provides both a methodology and a set of software utility programs to ease the task of coordinating engineering design and analysis codes. This document provides the information necessary for installing the EASIE software on a host computer system. The target host is a DEC VAX running VMS version 4; host dependencies are noted where appropriate. Relevant directories and individual files are identified, and compile/load/execute sequences are specified. In the case of the data management utilities, database management system (DBMS) specific features are described in an effort to assist the maintenance programmer in converting to a new DBMS. The document also describes a sample EASIE program directory structure to guide the program implementer in establishing his/her application-dependent environment.

  20. Advanced software development workstation. Knowledge base design: Design of knowledge base for flight planning application

    NASA Technical Reports Server (NTRS)

    Izygon, Michel E.

    1992-01-01

    The development process of the knowledge base for the generation of Test Libraries for Mission Operations Computer (MOC) Command Support focused on a series of information-gathering interviews. These knowledge capture sessions are supporting the development of a prototype for evaluating the capabilities of INTUIT on such an application. The prototype includes functions related to POCC (Payload Operations Control Center) processing. It prompts the end-users for input through a series of panels and then generates the Meds associated with the initialization and the update of hazardous command tables for a POCC Processing TLIB.

  1. Eddy Covariance Method for CO2 Emission Measurements: CCS Applications, Principles, Instrumentation and Software

    NASA Astrophysics Data System (ADS)

    Burba, George; Madsen, Rod; Feese, Kristin

    2013-04-01

    The Eddy Covariance method is a micrometeorological technique for direct high-speed measurements of the transport of gases, heat, and momentum between the earth's surface and the atmosphere. Gas fluxes, emission and exchange rates are carefully characterized from single-point in-situ measurements using permanent or mobile towers, or moving platforms such as automobiles, helicopters, airplanes, etc. Since the early 1990s, this technique has been widely used by micrometeorologists across the globe for quantifying CO2 emission rates from various natural, urban and agricultural ecosystems [1,2], including areas of agricultural carbon sequestration. Presently, over 600 eddy covariance stations are in operation in over 120 countries. In the last 3-5 years, advancements in instrumentation and software have reached the point where they can be effectively used outside the area of micrometeorology, and can prove valuable for geological carbon capture and sequestration, landfill emission measurements, high-precision agriculture and other non-micrometeorological industrial and regulatory applications. In the field of geological carbon capture and sequestration, the magnitude of CO2 seepage fluxes depends on a variety of factors. Emerging projects utilize eddy covariance measurements to monitor large areas where CO2 may escape from the subsurface, to detect and quantify CO2 leakage, and to assure the efficiency of CO2 geological storage [3,4,5,6,7,8]. Although Eddy Covariance is one of the most direct and defensible ways to measure and calculate turbulent fluxes, the method is mathematically complex, and requires careful setup, execution and data processing tailored to a specific site and project. With this in mind, step-by-step instructions were created to introduce a novice to the conventional Eddy Covariance technique [9], and to assist in further understanding of the method through more advanced references such as graduate-level textbooks, flux network guidelines, journals
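    At its core, the eddy covariance flux is the covariance between high-frequency fluctuations of vertical wind speed and of the gas density over an averaging interval. A minimal sketch of that central calculation (real processing pipelines add coordinate rotation, despiking, density/WPL corrections, and spectral corrections, none of which are shown here):

```python
def eddy_flux(w, c):
    """Estimate a turbulent flux as the covariance of vertical wind
    speed w (m/s) and gas density c (e.g. mg CO2 per m^3), sampled at
    high frequency over one averaging interval (typically 30 min).

    F = mean(w' * c'), where primes denote deviations from the
    interval means.
    """
    n = len(w)
    w_mean = sum(w) / n
    c_mean = sum(c) / n
    return sum((wi - w_mean) * (ci - c_mean) for wi, ci in zip(w, c)) / n

# Synthetic example: updrafts (w > 0) carry air with higher CO2
# density, so the covariance (an upward flux) is positive.
w = [0.5, -0.5, 0.5, -0.5]
c = [410.0, 400.0, 410.0, 400.0]
flux = eddy_flux(w, c)  # 2.5 mg CO2 m^-2 s^-1 for this toy series
```

    A seepage-monitoring deployment would compute this every averaging interval and look for anomalously large positive CO2 fluxes over the storage area.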

  2. Software Review.

    ERIC Educational Resources Information Center

    McGrath, Diane, Ed.

    1989-01-01

    Reviewed is a computer software package entitled "Audubon Wildlife Adventures: Grizzly Bears" for Apple II and IBM microcomputers. Included are availability, hardware requirements, cost, and a description of the program. The program's murder-mystery flavor is stressed, with a focus on illegal hunting and game management. (CW)

  3. Integrating model behavior, optimization, and sensitivity/uncertainty analysis: overview and application of the MOUSE software toolbox

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This paper provides an overview of the Model Optimization, Uncertainty, and SEnsitivity Analysis (MOUSE) software application, an open-source, Java-based toolbox of visual and numerical analysis components for the evaluation of environmental models. MOUSE is based on the OPTAS model calibration syst...

  4. The Relationship between Teacher Attitudes towards Software Applications and Student Achievement in Fourth and Fifth Grade Classrooms

    ERIC Educational Resources Information Center

    Spencer, Laura K.

    2010-01-01

    The problem: The problem addressed in this study was to examine how teacher attitudes towards software applications affect student achievement in the classroom. Method: A correlational study was conducted, and 50 fourth and fifth grade teachers who taught in the Santee School District were administered a survey assessing their attitudes…

  5. 25 CFR 547.8 - What are the minimum technical software standards applicable to Class II gaming systems?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... OF CLASS II GAMES § 547.8 What are the minimum technical software standards applicable to Class II... of Class II games. (a) Player interface displays. (1) If not otherwise provided to the player, the player interface shall display the following: (i) The purchase or wager amount; (ii) Game results;...

  6. 25 CFR 547.8 - What are the minimum technical software standards applicable to Class II gaming systems?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... OF CLASS II GAMES § 547.8 What are the minimum technical software standards applicable to Class II... of Class II games. (a) Player interface displays. (1) If not otherwise provided to the player, the player interface shall display the following: (i) The purchase or wager amount; (ii) Game results;...

  7. 25 CFR 547.8 - What are the minimum technical software standards applicable to Class II gaming systems?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... OF CLASS II GAMES § 547.8 What are the minimum technical software standards applicable to Class II... of Class II games. (a) Player interface displays. (1) If not otherwise provided to the player, the player interface shall display the following: (i) The purchase or wager amount; (ii) Game results;...

  8. Adopting Open-Source Software Applications in U. S. Higher Education: A Cross-Disciplinary Review of the Literature

    ERIC Educational Resources Information Center

    van Rooij, Shahron Williams

    2009-01-01

    Higher Education institutions in the United States are considering Open Source software applications such as the Moodle and Sakai course management systems and the Kuali financial system to build integrated learning environments that serve both academic and administrative needs. Open Source is presumed to be more flexible and less costly than…

  9. Application of an ultrasonic focusing radiator for acoustic levitation of submillimeter samples

    NASA Technical Reports Server (NTRS)

    Lee, M. C.

    1981-01-01

    An acoustic apparatus has been specifically developed to handle samples of submillimeter size in a gaseous medium. This apparatus consists of an acoustic levitation device, deployment devices for small liquid and solid samples, heat sources for sample heat treatment, acoustic alignment devices, a cooling system and data-acquisition instrumentation. The levitation device includes a spherical aluminum dish of 12 in. diameter and 0.6 in. thickness, 130 PZT transducers attached to the back side of the dish, and a spherical concave reflector situated in the vicinity of the center of curvature of the dish. The three lowest operating frequencies for the focusing-radiator levitation device are 75, 105 and 163 kHz, respectively. In comparison with other levitation apparatus, it possesses a large radiation pressure and a high lateral positional stability. This apparatus can be used most advantageously in the study of droplets and spherical-shell systems, for instance, for fusion target applications.

  10. [The Development and Application of the Orthopaedics Implants Failure Database Software Based on WEB].

    PubMed

    Huang, Jiahua; Zhou, Hai; Zhang, Binbin; Ding, Biao

    2015-09-01

    This article describes the development of a new WEB-based failure database software for orthopaedic implants. The software is based on the B/S mode; ASP dynamic web technology is used as its main development language to achieve data interactivity, and Microsoft Access is used to create the database. These mature technologies make the software easy to extend and upgrade. In this article, the design and development approach of the software, its working process and functions, and its relevant technical features are presented. With this software, many different types of failure events of orthopaedic implants can be stored, and the failure data can be statistically analyzed. At the macroscopic level, the software can be used to evaluate the reliability of orthopaedic implants and operations, and it can ultimately guide doctors in improving the clinical level of treatment. PMID:26904871

  11. Application of an integrated multi-criteria decision making AHP-TOPSIS methodology for ETL software selection.

    PubMed

    Hanine, Mohamed; Boutkhoum, Omar; Tikniouine, Abdessadek; Agouti, Tarik

    2016-01-01

    A wide range of ETL (Extract, Transform and Load) software packages is available, constituting a major investment market. Each ETL tool uses its own techniques for extracting, transforming and loading data into a data warehouse, which makes the task of evaluating ETL software very difficult. However, choosing the right ETL software is critical to the success or failure of any Business Intelligence project. As there are many factors impacting the selection of ETL software, the selection process can be treated as a complex multi-criteria decision making (MCDM) problem. In this study, a decision-making methodology that employs two well-known MCDM techniques, namely the Analytic Hierarchy Process (AHP) and the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS), is designed. In this respect, AHP is used to analyze the structure of the ETL software selection problem and obtain the weights of the selected criteria. Then, the TOPSIS technique is used to calculate the alternatives' ratings. An example is given to illustrate the proposed methodology. Finally, a software prototype demonstrating both methods is implemented. PMID:27006872
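    The second stage of the scheme, the TOPSIS ranking fed by AHP-derived criterion weights, can be sketched as follows. This is a textbook TOPSIS implementation for benefit criteria, not the paper's prototype; the ETL-tool scores and the weights in the example are hypothetical.

```python
import math

def topsis(matrix, weights):
    """Rank alternatives by relative closeness to the ideal solution.

    matrix: rows = alternatives, columns = benefit criteria
    (higher is better); weights: AHP-derived criterion weights.
    Returns one closeness score in [0, 1] per alternative.
    """
    ncols = len(weights)
    # Vector-normalize each column, then apply the criterion weights.
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(ncols)]
    v = [[weights[j] * row[j] / norms[j] for j in range(ncols)] for row in matrix]
    ideal = [max(col) for col in zip(*v)]   # best weighted value per criterion
    anti = [min(col) for col in zip(*v)]    # worst weighted value per criterion
    scores = []
    for row in v:
        d_best = math.sqrt(sum((x - b) ** 2 for x, b in zip(row, ideal)))
        d_worst = math.sqrt(sum((x - a) ** 2 for x, a in zip(row, anti)))
        scores.append(d_worst / (d_best + d_worst))
    return scores

# Hypothetical example: three ETL tools scored on cost-effectiveness,
# performance, and ease of use, with AHP weights 0.5 / 0.3 / 0.2.
scores = topsis([[7, 9, 6], [8, 7, 8], [6, 8, 9]], [0.5, 0.3, 0.2])
best = scores.index(max(scores))
```

    In the full methodology, the weights would come from pairwise-comparison matrices judged consistent under AHP rather than being supplied directly as here.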

  12. The Use of Mobile Health Applications Among Youth and Young Adults Living with HIV: Focus Group Findings.

    PubMed

    Saberi, Parya; Siedle-Khan, Robert; Sheon, Nicolas; Lightfoot, Marguerita

    2016-06-01

    The objective of this study was to conduct focus groups with youth (18-29 years old) living with HIV (YLWH) to better understand preferences for mobile applications in general and to inform the design of a mobile health application aimed at improving retention and engagement in healthcare and adherence to antiretroviral therapy. We conducted four focus groups with YLWH to elicit the names and characteristics of applications that they commonly used, reasons they deleted applications, and the features of an ideal mobile health application. A diverse sample of youth (N = 17; mean age 25 years; 88.2% male; 29.4% African American) participated in the four focus groups. Positive attributes of applications included being informative and simple, allowing for networking, timely updates, little overlap with other applications, unlimited access to entertainment, and ongoing advancement. Participants identified several reasons for deleting applications, including that they encouraged excessive behaviors (e.g., spending money), were for hook-ups only, had too many notifications or restrictions, occupied too much space on the device, or required wireless connectivity or frequent updates. Participants suggested that a useful mobile health application should have the ability to connect to a community of other YLWH, readily access healthcare providers, track personal data and information (such as laboratory data), and obtain health news and education. Privacy was a key factor in a mobile health application for all participants. Researchers can use the information provided by focus group participants in creating mobile health applications for YLWH. PMID:27214751

  13. Dense Plasma Focus: A question in search of answers, a technology in search of applications

    NASA Astrophysics Data System (ADS)

    Auluck, S. K. H.

    2014-08-01

    Diagnostic information accumulated over four decades of research suggests a directionality of toroidal motion for energetic ions responsible for fusion neutron production in the Dense Plasma Focus (DPF) and existence of an axial component of magnetic field even under conditions of azimuthal symmetry. This is at variance with the traditional view of Dense Plasma Focus as a purely irrotational compressive flow. The difficulty in understanding the experimental situation from a theoretical standpoint arises from polarity of the observed solenoidal state: three independent experiments confirm existence of a fixed polarity of the axial magnetic field or related azimuthal current. Since the equations governing plasma dynamics do not have a built-in direction, the fixed polarity must be related with initial conditions: the plasma dynamics must interact with an external physical vector in order to generate a solenoidal state of fixed polarity. Only four such external physical vectors can be identified: the earth's magnetic field, earth's angular momentum, direction of current flow and the direction of the plasma accelerator. How interaction of plasma dynamics with these fields can generate observed solenoidal state is a question still in search of answers; this paper outlines one possible answer. The importance of this question goes beyond scientific curiosity into technological uses of the energetic ions and the high-power-density plasma environment. However, commercial utilization of such technologies faces reliability concerns, which can be met only by first-principles integrated design of globally-optimized industrial-quality DPF hardware. Issues involved in the emergence of the Dense Plasma Focus as a technology platform for commercial applications in the not-too-distant future are discussed.

  14. Recollimation and Radiative Focusing of Relativistic Jets: Applications to Blazars and M87

    NASA Astrophysics Data System (ADS)

    Bromberg, Omer; Levinson, Amir

    2009-07-01

    Recent observations of M87 and some blazars reveal violent activity in small regions located at relatively large distances from the central engine. Motivated by these considerations, we study the hydrodynamic collimation of a relativistic cooling outflow using a semianalytical model developed earlier. We first demonstrate that radiative cooling of the shocked outflow layer can lead to a focusing of the outflow and its reconfinement in a region having a very small cross-sectional radius. Such a configuration can produce rapid variability at large distances from the central engine via reflections of the converging recollimation shock. Possible applications of this model to TeV blazars are discussed. We then apply our model to M87. The low radiative efficiency of the M87 jet renders focusing unlikely. However, the shallow profile of the ambient medium pressure inferred from observations results in extremely good collimation that can explain the reported variability of the X-ray flux emitted from the HST-1 knot.

  15. On integrating modeling software for application to total-system performance assessment

    SciTech Connect

    Lewis, L.C.; Wilson, M.L.

    1994-05-01

    We examine the processes and methods used to facilitate collaboration in software development between two organizations at separate locations -- Lawrence Livermore National Laboratory (LLNL) in California and Sandia National Laboratories (SNL) in New Mexico. Our software development process integrated the efforts of these two laboratories. Software developed at LLNL to model corrosion and failure of waste packages and subsequent releases of radionuclides was incorporated as a source term into SNL's computer models for fluid flow and radionuclide transport through the geosphere.

  16. Evaluation Software in Counseling.

    ERIC Educational Resources Information Center

    Sabella, Russell A.

    Counselors today are presented with a wide range of application software. This article intends to advance the counselor's knowledge and consideration of the various aspects of application software. Included is a discussion of the software applications typically of help to counselors in (a) managing their work (computer managed counseling);…

  17. Caltech/JPL Conference on Image Processing Technology, Data Sources and Software for Commercial and Scientific Applications

    NASA Technical Reports Server (NTRS)

    Redmann, G. H.

    1976-01-01

    Recent advances in image processing and new applications are presented to the user community to stimulate the development and transfer of this technology to industrial and commercial applications. The Proceedings contains 37 papers and abstracts, including many illustrations (some in color) and provides a single reference source for the user community regarding the ordering and obtaining of NASA-developed image-processing software and science data.

  18. Issues and relationships among software standards for nuclear safety applications. Version 2.0

    SciTech Connect

    Scott, J.A.; Preckshot, G.G.; Lawrence, J.D.; Johnson, G.L.

    1996-03-26

    Lawrence Livermore National Laboratory is assisting the Nuclear Regulatory Commission with the development of draft regulatory guides for selected software engineering standards. This report describes the results of the initial task in this work. The selected software standards and a set of related software engineering standards were reviewed, and the resulting preliminary elements of the regulatory positions are identified in this report. The importance of a thorough understanding of the relationships among standards useful for developing safety-related software is emphasized. The relationship of this work to the update of the Standard Review Plan is also discussed.

  19. Focusing on energy and optoelectronic applications: a journey for graphene and graphene oxide at large scale.

    PubMed

    Wan, Xiangjian; Huang, Yi; Chen, Yongsheng

    2012-04-17

    Carbon is the only element that has stable allotropes in the 0th through the 3rd dimension, all of which have many outstanding properties. Graphene is the basic building block of other important carbon allotropes. Studies of graphene became much more active after the Geim group isolated "free" and "perfect" graphene sheets and demonstrated the unprecedented electronic properties of graphene in 2004. So far, no other individual material combines so many important properties, including high mobility, Hall effect, transparency, mechanical strength, and thermal conductivity. In this Account, we briefly review our studies of bulk scale graphene and graphene oxide (GO), including their synthesis and applications focused on energy and optoelectronics. Researchers use many methods to produce graphene materials: bottom-up and top-down methods and scalable methods such as chemical vapor deposition (CVD) and chemical exfoliation. Each fabrication method has both advantages and limitations. CVD could represent the most important production method for electronic applications. The chemical exfoliation method offers the advantages of easy scale up and easy solution processing but also produces graphene oxide (GO), which leads to defects and the introduction of heavy functional groups. However, most of these additional functional groups and defects can be removed by chemical reduction or thermal annealing. Because solution processing is required for many film and device applications, including transparent electrodes for touch screens, light-emitting devices (LED), field-effect transistors (FET), and photovoltaic devices (OPV), flexible electronics, and composite applications, the use of GO is important for the production of graphene. Because graphene has an intrinsic zero band gap, this issue needs to be tackled for its FET applications. The studies for transparent electrode related applications have made great progress, but researchers need to improve sheet resistance while

  20. Neuroinformatics Software Applications Supporting Electronic Data Capture, Management, and Sharing for the Neuroimaging Community.

    PubMed

    Nichols, B Nolan; Pohl, Kilian M

    2015-09-01

    Accelerating insight into the relation between brain and behavior entails conducting small and large-scale research endeavors that lead to reproducible results. Consensus is emerging between funding agencies, publishers, and the research community that data sharing is a fundamental requirement to ensure all such endeavors foster data reuse and fuel reproducible discoveries. Funding agency and publisher mandates to share data are bolstered by a growing number of data sharing efforts that demonstrate how information technologies can enable meaningful data reuse. Neuroinformatics evaluates scientific needs and develops solutions to facilitate the use of data across the cognitive and neurosciences. For example, electronic data capture and management tools designed to facilitate human neurocognitive research can decrease the setup time of studies, improve quality control, and streamline the process of harmonizing, curating, and sharing data across data repositories. In this article we outline the advantages and disadvantages of adopting software applications that support these features by reviewing the tools available and then presenting two contrasting neuroimaging study scenarios in the context of conducting a cross-sectional and a multisite longitudinal study. PMID:26267019

  1. Software Aspects of IEEE Floating-Point Computations for Numerical Applications in High Energy Physics

    SciTech Connect

    2010-05-11

    Floating-point computations are at the heart of much of the computing done in high energy physics. The correctness, speed and accuracy of these computations are of paramount importance. The lack of any of these characteristics can mean the difference between new, exciting physics and an embarrassing correction. This talk will examine practical aspects of IEEE 754-2008 floating-point arithmetic as encountered in HEP applications. After describing the basic features of IEEE floating-point arithmetic, the presentation will cover: common hardware implementations (SSE, x87); techniques for improving the accuracy of summation, multiplication and data interchange; compiler options for gcc and icc affecting floating-point operations; and hazards to be avoided. About the speaker: Jeffrey M. Arnold is a Senior Software Engineer in the Intel Compiler and Languages group at Intel Corporation. He has been part of the Digital->Compaq->Intel compiler organization for nearly 20 years; part of that time, he worked on both low- and high-level math libraries. Prior to that, he was in the VMS Engineering organization at Digital Equipment Corporation. In the late 1980s, Jeff spent 2½ years at CERN as part of the CERN/Digital Joint Project. In 2008, he returned to CERN to spend 10 weeks working with CERN/openlab. Since that time, he has returned to CERN multiple times to teach at openlab workshops and consult with various LHC experiments. Jeff received his Ph.D. in physics from Case Western Reserve University.
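    A standard example of the summation-accuracy techniques this talk covers is Kahan compensated summation, which carries the rounding error lost at each addition in a correction term. This is the textbook algorithm, not material taken from the talk itself:

```python
import math

def kahan_sum(values):
    """Compensated summation: recover the low-order bits lost when
    each addend is folded into the running sum."""
    s = 0.0
    c = 0.0  # running compensation for lost low-order bits
    for x in values:
        y = x - c          # subtract previously lost bits
        t = s + y          # low-order bits of y may be lost here...
        c = (t - s) - y    # ...but are recovered algebraically into c
        s = t
    return s

data = [0.1] * 10000       # 0.1 is not exactly representable in binary
naive = sum(data)          # accumulates rounding error at every step
accurate = math.fsum(data) # correctly rounded reference
compensated = kahan_sum(data)
# The compensated result is far closer to the correctly rounded sum
# than the naive left-to-right accumulation.
```

    Compiler flags of the kind the talk discusses matter here: aggressive optimizations such as gcc's -ffast-math can reassociate the additions and silently defeat the compensation step.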

  2. Software Aspects of IEEE Floating-Point Computations for Numerical Applications in High Energy Physics

    ScienceCinema

    None

    2011-10-06

    Floating-point computations are at the heart of much of the computing done in high energy physics. The correctness, speed and accuracy of these computations are of paramount importance. The lack of any of these characteristics can mean the difference between new, exciting physics and an embarrassing correction. This talk will examine practical aspects of IEEE 754-2008 floating-point arithmetic as encountered in HEP applications. After describing the basic features of IEEE floating-point arithmetic, the presentation will cover: common hardware implementations (SSE, x87); techniques for improving the accuracy of summation, multiplication and data interchange; compiler options for gcc and icc affecting floating-point operations; and hazards to be avoided. About the speaker: Jeffrey M Arnold is a Senior Software Engineer in the Intel Compiler and Languages group at Intel Corporation. He has been part of the Digital->Compaq->Intel compiler organization for nearly 20 years; part of that time, he worked on both low- and high-level math libraries. Prior to that, he was in the VMS Engineering organization at Digital Equipment Corporation. In the late 1980s, Jeff spent 2½ years at CERN as part of the CERN/Digital Joint Project. In 2008, he returned to CERN to spend 10 weeks working with CERN/openlab. Since that time, he has returned to CERN multiple times to teach at openlab workshops and consult with various LHC experiments. Jeff received his Ph.D. in physics from Case Western Reserve University.

  3. The application of TSIM software to ACT design and analysis of flexible aircraft

    NASA Technical Reports Server (NTRS)

    Kaynes, Ian W.

    1989-01-01

    The TSIM software is described. This is a package which uses an interactive FORTRAN-like simulation language for the simulation of nonlinear dynamic systems and offers facilities which include: mixed continuous and discrete time systems, time response calculations, numerical optimization, automatic trimming of nonlinear aircraft systems, and linearization of nonlinear equations for eigenvalues, frequency responses and power spectral response evaluation. Details are given of the application of TSIM to the analysis of aeroelastic systems under the RAE Farnborough extension FLEX-SIM. The aerodynamic and structural data for the equations of motion of a flexible aircraft are prepared by a preprocessor program for incorporation in TSIM simulations. Within the simulation, the flexible aircraft model may then be selected interactively for different flight conditions and modal reduction techniques applied. The use of FLEX-SIM is demonstrated by an example of flutter prediction for a simple aeroelastic model. By utilizing the numerical optimization facility of TSIM, it is possible to undertake identification of required parameters in the TSIM model within the simulation.

  4. A fast auto-focusing technique for the long focal lens TDI CCD camera in remote sensing applications

    NASA Astrophysics Data System (ADS)

    Wang, Dejiang; Ding, Xu; Zhang, Tao; Kuang, Haipeng

    2013-02-01

    The key issue in automatic focus adjustment for a long focal lens TDI CCD camera in remote sensing applications is to reach the optimum focus position as fast as possible. Existing auto-focusing techniques consume too much time because the mechanical focusing parts of the camera move in steps during the searching procedure. In this paper, we demonstrate a fast auto-focusing technique, which employs the internal optical elements and the TDI CCD itself to directly sense deviations in the back focal distance of the lens and restore the imaging system to the best available focus. It is particularly advantageous for determination of the focus because the relative motion between the TDI CCD and the focusing element can proceed without interruption. Moreover, theoretical formulas describing the effect of image motion on the focusing precision and the effective focusing range are also developed. Finally, an experimental setup is constructed to evaluate the performance of the proposed technique. The results of the experiment show a ±5 μm auto-focusing precision over a range of ±500 μm defocus, and the searching procedure can be accomplished within 0.125 s, which leads to a remarkable improvement in the real-time imaging capability of high resolution TDI CCD cameras in remote sensing applications.
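The slow step-wise search that this optical technique replaces can be sketched as a maximization of an image-sharpness metric over candidate mechanical positions. This is the conventional baseline the abstract criticizes, not the paper's method; the positions and the metric below are hypothetical stand-ins:

```python
def step_search_focus(positions, sharpness):
    """Conventional autofocus baseline: step the focusing element through
    candidate positions and keep the one that maximizes an image-sharpness
    metric. Every candidate costs a mechanical move, which is why this
    search is slow compared to directly sensing the back-focal deviation."""
    return max(positions, key=sharpness)

# Toy sharpness curve peaked at the true focal position (0 um defocus);
# a real system would compute a contrast metric from a captured image.
positions = range(-500, 501, 50)   # candidate defocus values, in um
sharpness = lambda z: -abs(z)      # hypothetical stand-in metric
print(step_search_focus(positions, sharpness))  # 0
```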

  5. PREFACE: Focus section on AdS/CFT applications to QCD matter

    NASA Astrophysics Data System (ADS)

    Bass, Steffan A.; Casalderrey-Solana, Jorge

    2012-04-01

    influential results in the field and motivating a growing interest within the heavy ion and finite temperature QCD communities in the gauge/gravity duality. Since this influential work, the gauge/gravity duality has been applied to many different aspects of QCD at extreme conditions. A lot of effort has been made in using the correspondence in its original form to describe the high energy collision and thermalization of strongly coupled matter, as well as its interaction with energetic probes. While these results strictly apply to N = 4 SYM, this large body of work has led to many insights into the equivalent problems in QCD, since in the region of relevance for current experiments the coupling is not small. Complementarily, there has been very intense research in the bottom-up construction of gravitational duals which capture some of the most salient differences between QCD and N = 4 SYM, such as the lack of conformality, in the absence of supersymmetry. These new holographic models have been applied not only to collision dynamics, but also to the description of the QCD phase diagram. The intense research activity we have just described motivated a focus section on the applications of gauge/gravity duality to the dynamics of QCD matter under extreme conditions. The articles in this issue cover most of the aspects where the correspondence is currently having the largest impact within the field and will provide the reader with an overall view of the wide scope that this method has acquired, as well as the technical advances and main results in the field. References [1] Maldacena J 1999 The large-N limit of superconformal field theories and supergravity Int. J. Theor. Phys. 38 1113-33 [2] Policastro G, Son D T and Starinets A O 2001 Shear viscosity of strongly coupled N = 4 supersymmetric Yang-Mills plasma Phys. Rev. Lett. 87 081601

  6. Investigating the application of AOP methodology in development of Financial Accounting Software using Eclipse-AJDT Environment

    NASA Astrophysics Data System (ADS)

    Sharma, Amita; Sarangdevot, S. S.

    2010-11-01

    Aspect-Oriented Programming (AOP) methodology has been investigated in the development of real-world business application software—Financial Accounting Software. The Eclipse-AJDT environment has been used as open source enhanced IDE support for programming in the AOP language AspectJ. Crosscutting concerns have been identified and modularized as aspects. This reduces the complexity of the design considerably due to the elimination of code scattering and tangling. Improvement in modularity, quality and performance is achieved. The study concludes that AOP methodology in the Eclipse-AJDT environment offers powerful support for the modular design and implementation of real-world quality business software.
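As a language-neutral illustration of what "modularizing a crosscutting concern" means, a Python decorator can play the role of AspectJ before/after advice: the audit-logging concern lives in one place instead of being scattered and tangled through every accounting function. This is only an analogy under assumed names (the paper's actual aspects are written in AspectJ, and `post_ledger_entry` is hypothetical):

```python
import functools

def audit(func):
    """A 'logging aspect' as a decorator: before/after advice wrapped
    around the join point, keeping the crosscutting concern in one module."""
    @functools.wraps(func)
    def advice(*args, **kwargs):
        print(f"AUDIT: entering {func.__name__} args={args}")
        result = func(*args, **kwargs)
        print(f"AUDIT: leaving {func.__name__} -> {result}")
        return result
    return advice

@audit
def post_ledger_entry(account, amount):
    # Core business logic stays free of any logging code.
    return {"account": account, "amount": amount}

entry = post_ledger_entry("cash", 250.0)
```

In AspectJ the same effect is achieved declaratively with a pointcut matching all accounting methods, so the business code needs no decoration at all.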

  7. Applications of Logic Coverage Criteria and Logic Mutation to Software Testing

    ERIC Educational Resources Information Center

    Kaminski, Garrett K.

    2011-01-01

    Logic is an important component of software. Thus, software logic testing has enjoyed significant research over a period of decades, with renewed interest in the last several years. One approach to detecting logic faults is to create and execute tests that satisfy logic coverage criteria. Another approach to detecting faults is to perform mutation…

  8. An application generator for rapid prototyping of Ada real-time control software

    NASA Technical Reports Server (NTRS)

    Johnson, Jim; Biglari, Haik; Lehman, Larry

    1990-01-01

    The need to increase engineering productivity and decrease software life cycle costs in real-time system development establishes a motivation for a method of rapid prototyping. The design by iterative rapid prototyping technique is described. A tool which facilitates such a design methodology for the generation of embedded control software is described.

  9. Microfabrication and Test of a Three-Dimensional Polymer Hydro-focusing Unit for Flow Cytometry Applications

    NASA Technical Reports Server (NTRS)

    Yang, Ren; Feeback, Daniel L.; Wang, Wanjun

    2004-01-01

    This paper details a novel three-dimensional (3D) hydro-focusing micro cell sorter for micro flow cytometry applications. The unit was microfabricated by means of SU-8 3D lithography. The 3D microstructure for coaxial sheathing was designed, microfabricated, and tested. Three-dimensional hydro-focusing capability was demonstrated with an experiment to sort labeled tanned sheep erythrocytes (red blood cells). This polymer hydro-focusing microstructure is easily microfabricated and integrated with other polymer microfluidic structures.

  10. Application of Open Source Software by the Lunar Mapping and Modeling Project

    NASA Astrophysics Data System (ADS)

    Ramirez, P.; Goodale, C. E.; Bui, B.; Chang, G.; Kim, R. M.; Law, E.; Malhotra, S.; Rodriguez, L.; Sadaqathullah, S.; Mattmann, C. A.; Crichton, D. J.

    2011-12-01

    The Lunar Mapping and Modeling Project (LMMP), led by the Marshall Space Flight Center (MSFC), is responsible for the development of an information system to support lunar exploration, decision analysis, and release of lunar data to the public. The data available through the lunar portal is predominantly derived from present lunar missions (e.g., the Lunar Reconnaissance Orbiter (LRO)) and from historical missions (e.g., Apollo). This project has created a gold-standard source of data, models, and tools for lunar explorers to exercise and incorporate into their activities. At the Jet Propulsion Laboratory (JPL), we focused on engineering and building the infrastructure to support cataloging, archiving, accessing, and delivery of lunar data. We decided to use a RESTful service-oriented architecture to enable us to abstract from the underlying technology choices and focus on the interfaces to be used internally and externally. This decision allowed us to leverage several open source software components and integrate them by either writing a thin REST service layer or relying on the API they provided; the approach chosen was dependent on the targeted consumer of a given interface. We will discuss our varying experience using open source products, namely Apache OODT, Oracle Berkeley DB XML, Apache Solr, and Oracle OpenSSO (now named OpenAM). Apache OODT, developed at NASA's Jet Propulsion Laboratory and recently migrated over to Apache, provided the means for ingestion and cataloguing of products within the infrastructure. Its usage was based upon team experience with the project and past benefit received on other projects internal and external to JPL. Berkeley DB XML, distributed by Oracle for both commercial and open source use, was the storage technology chosen for our metadata. This decision was in part based on our use of Federal Geographic Data Committee (FGDC) metadata, which is expressed in XML, and the desire to keep it in its native form and exploit other technologies built on

  11. Feature Selection for Evolutionary Commercial-off-the-Shelf Software: Studies Focusing on Time-to-Market, Innovation and Hedonic-Utilitarian Trade-Offs

    ERIC Educational Resources Information Center

    Kakar, Adarsh Kumar

    2013-01-01

    Feature selection is one of the most important decisions made by product managers. This three article study investigates the concepts, tools and techniques for making trade-off decisions of introducing new features in evolving Commercial-Off-The-Shelf (COTS) software products. The first article investigates the efficacy of various feature…

  12. Visual Recognition Software for Binary Classification and its Application to Pollen Identification

    NASA Astrophysics Data System (ADS)

    Punyasena, S. W.; Tcheng, D. K.; Nayak, A.

    2014-12-01

    An underappreciated source of uncertainty in paleoecology is the uncertainty of palynological identifications. The confidence of any given identification is not regularly reported in published results, and so cannot be incorporated into subsequent meta-analyses. Automated identification systems potentially provide a means of objectively measuring the confidence of a given count or single identification, as well as a mechanism for increasing sample sizes and throughput. We developed the software ARLO (Automated Recognition with Layered Optimization) to tackle difficult visual classification problems such as pollen identification. ARLO applies pattern recognition and machine learning to the analysis of pollen images. The features that the system discovers are not the traditional features of pollen morphology. Instead, general-purpose image features, such as pixel lines and grids of different dimensions, size, spacing, and resolution, are used. ARLO adapts to a given problem by searching for the most effective combination of feature representation and learning strategy. We present a two-phase approach which uses our machine learning process to first segment pollen grains from the background and then classify pollen pixels and report species ratios. We conducted two separate experiments that utilized two distinct sets of algorithms and optimization procedures. The first analysis focused on reconstructing black and white spruce pollen ratios, training and testing our classification model at the slide level. This allowed us to directly compare our automated counts and expert counts to slides of known spruce ratios. Our second analysis focused on maximizing classification accuracy at the individual pollen grain level. Instead of predicting ratios of given slides, we predicted the species represented in a given image window. The resulting analysis was more scalable, as we were able to adapt the most efficient parts of the methodology from our first analysis. 
ARLO was able to

  13. Excel2Genie: A Microsoft Excel application to improve the flexibility of the Genie-2000 Spectroscopic software.

    PubMed

    Forgács, Attila; Balkay, László; Trón, Lajos; Raics, Péter

    2014-12-01

    Excel2Genie, a simple and user-friendly Microsoft Excel interface, has been developed for the Genie-2000 Spectroscopic Software of Canberra Industries. This Excel application can directly control a Canberra Multichannel Analyzer (MCA), process the acquired data and visualize them. Combining Genie-2000 with Excel2Genie results in remarkably increased flexibility and the possibility to carry out repetitive data acquisitions, even with changing parameters, and more sophisticated analysis. The developed software package comprises three worksheets covering data acquisition, data analysis and mathematical operations carried out on the measured gamma spectra; these display the parameters and results of each process and at the same time allow control of it. Excel2Genie is freely available to assist gamma spectrum measurements and data evaluation by interested Canberra users. With access to the Visual Basic for Applications (VBA) source code of this application, users are able to modify the developed interface according to their intentions. PMID:25113536

  14. Application of the Golden Software Surfer mapping software for automation of visualisation of meteorological and oceanographic data in IMGW Maritime Branch.

    NASA Astrophysics Data System (ADS)

    Piliczewski, B.

    2003-04-01

    The Golden Software Surfer has been used in the IMGW Maritime Branch for more than ten years. This tool provides ActiveX Automation objects, which allow scripts to control practically every feature of Surfer. These objects can be accessed from any Automation-enabled environment, such as Visual Basic or Excel. Several applications based on Surfer have been developed at IMGW. The first example is an on-line oceanographic service, which presents forecasts of the water temperature, sea level and currents originating from the HIROMB model and is automatically updated every day. Surfer was also utilised in MERMAID, an international project supported by the EC under the 5th Framework Programme. The main aim of this project was to create a prototype of an Internet-based data brokerage system, which would enable users to search, extract, buy and download datasets containing meteorological or oceanographic data. During the project IMGW developed an online application, called Mermaid Viewer, which enables communication with the data broker and automatic visualisation of the downloaded data using Surfer. Both of the above mentioned applications were developed in Visual Basic. Currently, adoption of Surfer for the monitoring service, which provides access to the data collected in monitoring of the Baltic Sea environment, is under consideration.

  15. Distributed execution of recovery blocks - An approach for uniform treatment of hardware and software faults in real-time applications

    NASA Technical Reports Server (NTRS)

    Kim, K. H.; Welch, Howard O.

    1989-01-01

    The concept of distributed execution of recovery blocks is examined as an approach for uniform treatment of hardware and software faults. A useful characteristic of the approach is the relatively small time cost it requires, which makes it suitable for incorporation into real-time computer systems. A specific formulation of the approach aimed at minimizing the recovery time, called the distributed recovery block (DRB) scheme, is presented. The DRB scheme is capable of effecting forward recovery while handling both hardware and software faults in a uniform manner. An approach to incorporating multiprocessing capability into the scheme is also discussed. Two experiments aimed at testing the execution efficiency of the scheme in real-time applications have been conducted on two different multimicrocomputer networks. The results clearly indicate the feasibility of achieving tolerance of hardware and software faults in a broad range of real-time computer systems by use of schemes for distributed execution of recovery blocks.
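The core recovery-block construct (a primary try block, an alternate, and an acceptance test) can be sketched in a few lines. This single-process sketch shows only the software-fault-tolerance core; the DRB scheme additionally shadows the alternate on a second node to cover hardware faults with minimal recovery time:

```python
def recovery_block(primary, alternate, acceptance_test, *args):
    """Run the primary try block; if it raises or its result fails the
    acceptance test, fall forward to the alternate try block."""
    try:
        result = primary(*args)
        if acceptance_test(result):
            return result
    except Exception:
        pass  # a crash of the primary is treated like a failed test
    result = alternate(*args)
    if not acceptance_test(result):
        raise RuntimeError("both try blocks failed the acceptance test")
    return result

# Hypothetical example: a fast-but-buggy sort as the primary routine,
# a trusted library sort as the alternate.
def buggy_sort(xs):
    return xs                     # primary: forgets to sort

def safe_sort(xs):
    return sorted(xs)             # alternate

def is_sorted(xs):                # acceptance test
    return all(a <= b for a, b in zip(xs, xs[1:]))

print(recovery_block(buggy_sort, safe_sort, is_sorted, [3, 1, 2]))  # [1, 2, 3]
```

In the distributed formulation, one node executes the primary while its partner concurrently executes the alternate, so a fail-over costs almost no extra time.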

  16. Characterization and application of simultaneously spatio-temporally focused ultrafast laser pulses

    NASA Astrophysics Data System (ADS)

    Greco, Michael J.

    Chirped pulse amplification of ultrafast laser pulses has become an essential technology in the fields of micromachining, tissue ablation, and microscopy. With specifically tailored pulses of light we have been able to begin investigation into lab-on-a-chip technology, which has the potential of revolutionizing the medical industry. Advances in microscopy have allowed sub-diffraction-limited resolution to become a reality, as well as lensless imaging of single molecules. An intimate knowledge of ultrafast optical pulses, the ability to manipulate an optical spectrum and generate an optical pulse of a specific temporal shape, allows us to continue pushing these fields forward as well as open new ones. This thesis investigates the spatio-temporal construction of pulses which are simultaneously spatio-temporally focused (SSTF) and their current and future applications. By laterally chirping a compressed laser pulse we have confined the peak intensity to a shorter distance along the optical axis than can be achieved by conventional methods. This also brings about interesting changes to the structure of the pulse intensity such as pulse front tilt (PFT), an effect where the pulse energy is delayed across the focal spot at the focal plane by longer durations than the pulse itself. Though these pulses have found utility in microscopy and micromachining, in-situ methods for characterizing them spatially and temporally are not yet widespread. I present here an in-situ characterization technique for both spatial and temporal diagnosis of SSTF pulses. By performing a knife-edge scan and collecting the light in a spectrometer, the relative spectral position as well as the beam size can be deduced. Temporal characterization is done by dispersion scan, where a second harmonic crystal is translated through the beam focus. Combining the unknown phase of the pulse with the known phase (a result of angular dispersion) allows the unknown phase to be extracted from the second harmonic spectra.

  17. Software Configuration Management Guidebook

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The growth in cost and importance of software to NASA has caused NASA to address the improvement of software development across the agency. One of the products of this program is a series of guidebooks that define a NASA concept of the assurance processes which are used in software development. The Software Assurance Guidebook, SMAP-GB-A201, issued in September, 1989, provides an overall picture of the concepts and practices of NASA in software assurance. Lower level guidebooks focus on specific activities that fall within the software assurance discipline, and provide more detailed information for the manager and/or practitioner. This is the Software Configuration Management Guidebook which describes software configuration management in a way that is compatible with practices in industry and at NASA Centers. Software configuration management is a key software development process, and is essential for doing software assurance.

  18. The Program for Climate Model Diagnosis and Intercomparison (PCMDI) Software Development: Applications, Infrastructure, and Middleware/Networks

    SciTech Connect

    Williams, Dean N.

    2011-06-30

    The status of and future plans for the Program for Climate Model Diagnosis and Intercomparison (PCMDI) hinge on software that PCMDI is either currently distributing or plans to distribute to the climate community in the near future. These software products include standard conventions, national and international federated infrastructures, and community analysis and visualization tools. This report also mentions other secondary software not necessarily led by or developed at PCMDI to provide a complete picture of the overarching applications, infrastructures, and middleware/networks. Much of the software described anticipates the use of future technologies envisioned over the next one to 10 years. These technologies, together with the software, will be the catalyst required to address extreme-scale data warehousing, scalability issues, and service-level requirements for a diverse set of well-known projects essential for predicting climate change. These tools, unlike the static analysis tools of the past, will support the co-existence of many users in a productive, shared virtual environment. This advanced technological world driven by extreme-scale computing and the data it generates will increase scientists’ productivity, exploit national and international relationships, and push research to new levels of understanding.

  19. Choice: 36 band feature selection software with applications to multispectral pattern recognition

    NASA Technical Reports Server (NTRS)

    Jones, W. C.

    1973-01-01

    Feature selection software was developed at the Earth Resources Laboratory that is capable of inputting up to 36 channels and selecting channel subsets according to several criteria based on divergence. One of the criteria used is compatible with the table look-up classifier requirements. The software indicates which channel subset best separates (based on average divergence) each class from all other classes. The software employs an exhaustive search technique, and computer time is not prohibitive. A typical task to select the best 4 of 22 channels for 12 classes takes 9 minutes on a Univac 1108 computer.
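The exhaustive channel-subset search described above can be sketched directly. The divergence computation itself is not specified in the abstract, so a made-up per-channel separability score stands in for the average-divergence criterion:

```python
from itertools import combinations

def best_subset(channels, k, score):
    """Exhaustively score every k-channel subset and return the best one,
    mirroring the exhaustive search the abstract describes. `score` is a
    hypothetical stand-in for the average-divergence criterion."""
    return max(combinations(channels, k), key=score)

# Toy stand-in: hypothetical per-channel separability weights.
weights = {1: 0.2, 2: 0.9, 3: 0.1, 4: 0.7, 5: 0.8, 6: 0.3}
subset = best_subset(list(weights), 4, lambda s: sum(weights[c] for c in s))
print(sorted(subset))  # the four highest-weight channels: [2, 4, 5, 6]
```

Selecting the best 4 of 22 channels means scoring C(22, 4) = 7315 subsets, which is why the exhaustive search remained affordable even on a Univac 1108.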

  20. An Application of Intelligent Data Analysis Techniques to a Large Software Engineering Dataset

    NASA Astrophysics Data System (ADS)

    Cain, James; Counsell, Steve; Swift, Stephen; Tucker, Allan

    Within the development of large software systems, there is significant value in being able to predict changes. If we can predict the likely changes that a system will undergo, then we can estimate likely developer effort and allocate resources appropriately. Within object-oriented software development, these changes are often identified as refactorings. Very few studies have explored the prediction of refactorings on a wide scale. Within this paper we aim to do just this, through applying intelligent data analysis techniques to a uniquely large and comprehensive software engineering time series dataset. Our analysis shows extremely promising results, allowing us to predict the occurrence of future large changes.

  1. Modeling a distributed environment for a petroleum reservoir engineering application with software product line

    NASA Astrophysics Data System (ADS)

    de Faria Scheidt, Rafael; Vilain, Patrícia; Dantas, M. A. R.

    2014-10-01

    Petroleum reservoir engineering is a complex and interesting field that requires a large amount of computational facilities to achieve successful results. Usually, software environments for this field are developed without taking into account the possible interactions and extensibility required by reservoir engineers. In this paper, we present a research work characterized by the design and implementation, based on a software product line model, of a real distributed reservoir engineering environment. Experimental results indicate the successful utilization of this approach for the design of a distributed software architecture. In addition, all components of the proposal provided greater visibility of the organization and processes for the reservoir engineers.

  2. VARK learning preferences and mobile anatomy software application use in pre-clinical chiropractic students.

    PubMed

    Meyer, Amanda J; Stomski, Norman J; Innes, Stanley I; Armson, Anthony J

    2016-05-01

    Ubiquitous smartphone ownership and reduced face-to-face teaching time may lead to students making greater use of mobile technologies in their learning. This is the first study to report on the prevalence of mobile gross anatomy software application (app) usage in pre-clinical chiropractic students and to ascertain if a relationship exists between preferred learning styles, as determined by the validated VARK(©) questionnaire, and use of mobile anatomy apps. The majority of the students who completed the VARK questionnaire were multimodal learners with kinesthetic and visual preferences. Sixty-seven percent (73/109) of students owned one or more mobile anatomy apps, which were used by 57 students. Most of these students owned one to five apps and spent less than 30 minutes per week using them. Six of the top eight mobile anatomy apps owned and recommended by the students were developed by 3D4Medical. Visual learning preferences were not associated with time spent using mobile anatomy apps (OR = 0.40, 95% CI 0.12-1.40). Similarly, kinesthetic learning preferences (OR = 1.88, 95% CI 0.18-20.2), quadmodal preferences (OR = 0.71, 95% CI 0.06-9.25), and gender (OR = 1.51, 95% CI 0.48-4.81) did not affect the time students spent using mobile anatomy apps. Learning preferences do not appear to influence students' time spent using mobile anatomy apps. Anat Sci Educ 9: 247-254. © 2015 American Association of Anatomists. PMID:26109371
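The odds ratios reported above come from 2x2 contingency tables. A sketch of how such an estimate and its 95% confidence interval are computed via the usual log-OR normal approximation (the counts below are invented for illustration, not the study's data):

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table
        exposed:   a (outcome) | b (no outcome)
        unexposed: c (outcome) | d (no outcome)
    with a 95% CI from the normal approximation on the log scale."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

print(odds_ratio(10, 20, 15, 60))  # OR = 2.0 with its 95% CI
```

A CI that spans 1.0, as several of those reported above do, is what marks an association as not statistically significant.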

  3. Generation of micro-sized PDMS particles by a flow focusing technique for biomicrofluidics applications.

    PubMed

    Muñoz-Sánchez, B N; Silva, S F; Pinho, D; Vega, E J; Lima, R

    2016-01-01

    Polydimethylsiloxane (PDMS), due to its remarkable properties, is one of the most widely used polymers in many industrial and medical applications. In this work, a flow focusing technique is used to produce PDMS spherical particles with sizes of a few microns. PDMS precursor is injected through a hypodermic needle to form a film/reservoir over the needle's outer surface. This film flows towards the needle tip until a liquid ligament is steadily ejected thanks to the action of a coflowing viscous liquid stream. The outcome is a capillary jet which breaks up into PDMS precursor droplets due to the growth of capillary waves, producing a micrometer-sized emulsion. The PDMS liquid droplets in the solution are thermally cured into solid microparticles. The size distribution of the particles is analyzed before and after curing, showing an acceptable degree of monodispersity. The PDMS liquid droplets suffer shrinkage while curing. These microparticles can be used in very varied technological fields, such as biomedicine, biotechnology, pharmacy, and industrial engineering. PMID:27042245

  4. Nanofocus of tenth of joules and a portable plasma focus of few joules for field applications

    SciTech Connect

    Soto, Leopoldo; Pavez, Cristian; Moreno, Jose; Tarifeno, Ariel; Pedreros, Jose; Altamirano, Luis

    2009-01-21

    A repetitive pinch plasma focus that works with a stored energy of less than 1 J per shot has been developed at the Chilean Nuclear Energy Commission. The main features of this device, the repetitive Nanofocus, are 5 nF capacitance, 5 nH inductance, 5-10 kV charging voltage, 60-250 mJ stored energy, and 5-10 kA peak current per shot. The device has been operated at 20 Hz in hydrogen and deuterium. X-ray radiographs of materials of different thickness were obtained. Neutrons were detected using a system based upon a ³He proportional counter in charge-integrated mode. However, the reproducibility of this miniaturized device is low and several technological issues have to be solved before it can produce neutrons for periods greater than minutes. Further studies of the Nanofocus are being carried out. In addition, a device with a stored energy of a few joules is being explored. A preliminary compact, low-weight (3 kg), portable PF device (25 cm x 5 cm x 5 cm) for field applications has been designed. This device was designed to operate at a few kilovolts (10 kV or less) with a stored energy of 2 J and a repetition rate of 10 Hz without cooling. A neutron flux of the order of 10⁴-10⁵ n/s is expected.

  5. Software Reviews.

    ERIC Educational Resources Information Center

    Computing Teacher, 1985

    1985-01-01

    Reprinted from "The Computing Teacher," this document contains software reviews for 23 computer programs that educators could use in the classroom or for administrative purposes. Each review describes the program by listing the program title, subject, producer, grade level (if applicable), hardware required, cost, and reviewer's name and…

  6. Design Software

    NASA Technical Reports Server (NTRS)

    1991-01-01

    A NASA contractor and Small Business Innovation Research (SBIR) participant has converted its research into commercial software products for auto design, structural analysis and other applications. ViGYAN, Inc., utilizing the aeronautical research principle of computational fluid dynamics, has created - with VGRID3D and VPLOT3D - an easier alternative to conventional structured grids for fluid dynamic calculations.

  7. The international river interface cooperative: Public domain flow and morphodynamics software for education and applications

    NASA Astrophysics Data System (ADS)

    Nelson, Jonathan M.; Shimizu, Yasuyuki; Abe, Takaaki; Asahi, Kazutake; Gamou, Mineyuki; Inoue, Takuya; Iwasaki, Toshiki; Kakinuma, Takaharu; Kawamura, Satomi; Kimura, Ichiro; Kyuka, Tomoko; McDonald, Richard R.; Nabi, Mohamed; Nakatsugawa, Makoto; Simões, Francisco R.; Takebayashi, Hiroshi; Watanabe, Yasunori

    2016-07-01

    This paper describes a new, public-domain interface for modeling flow, sediment transport and morphodynamics in rivers and other geophysical flows. The interface is named after the International River Interface Cooperative (iRIC), the group that constructed the interface and many of the current solvers included in iRIC. The interface is entirely free to any user and currently houses thirteen models ranging from simple one-dimensional models through three-dimensional large-eddy simulation models. Solvers are only loosely coupled to the interface so it is straightforward to modify existing solvers or to introduce other solvers into the system. Six of the most widely-used solvers are described in detail including example calculations to serve as an aid for users choosing what approach might be most appropriate for their own applications. The example calculations range from practical computations of bed evolution in natural rivers to highly detailed predictions of the development of small-scale bedforms on an initially flat bed. The remaining solvers are also briefly described. Although the focus of most solvers is coupled flow and morphodynamics, several of the solvers are also specifically aimed at providing flood inundation predictions over large spatial domains. Potential users can download the application, solvers, manuals, and educational materials including detailed tutorials at www.i-ric.org. The iRIC development group encourages scientists and engineers to use the tool and to consider adding their own methods to the iRIC suite of tools.

  8. The international river interface cooperative: Public domain flow and morphodynamics software for education and applications

    USGS Publications Warehouse

    Nelson, Jonathan M.; Shimizu, Yasuyuki; Abe, Takaaki; Asahi, Kazutake; Gamou, Mineyuki; Inoue, Takuya; Iwasaki, Toshiki; Kakinuma, Takaharu; Kawamura, Satomi; Kimura, Ichiro; Kyuka, Tomoko; McDonald, Richard R.; Nabi, Mohamed; Nakatsugawa, Makoto; Simoes, Francisco J.; Takebayashi, Hiroshi; Watanabe, Yasunori

    2016-01-01

    This paper describes a new, public-domain interface for modeling flow, sediment transport and morphodynamics in rivers and other geophysical flows. The interface is named after the International River Interface Cooperative (iRIC), the group that constructed the interface and many of the current solvers included in iRIC. The interface is entirely free to any user and currently houses thirteen models ranging from simple one-dimensional models through three-dimensional large-eddy simulation models. Solvers are only loosely coupled to the interface so it is straightforward to modify existing solvers or to introduce other solvers into the system. Six of the most widely-used solvers are described in detail including example calculations to serve as an aid for users choosing what approach might be most appropriate for their own applications. The example calculations range from practical computations of bed evolution in natural rivers to highly detailed predictions of the development of small-scale bedforms on an initially flat bed. The remaining solvers are also briefly described. Although the focus of most solvers is coupled flow and morphodynamics, several of the solvers are also specifically aimed at providing flood inundation predictions over large spatial domains. Potential users can download the application, solvers, manuals, and educational materials including detailed tutorials at www.i-ric.org. The iRIC development group encourages scientists and engineers to use the tool and to consider adding their own methods to the iRIC suite of tools.

  9. Application of Calibrated Peer Review (CPR) Writing Assignments to Enhance Experiments with an Environmental Chemistry Focus

    ERIC Educational Resources Information Center

    Margerum, Lawrence D.; Gulsrud, Maren; Manlapez, Ronald; Rebong, Rachelle; Love, Austin

    2007-01-01

    The browser-based software program, Calibrated Peer Review (CPR) developed by the Molecular Science Project enables instructors to create structured writing assignments in which students learn by writing and reading for content. Though the CPR project covers only one experiment in general chemistry, it might provide lab instructors with a method…

  10. Software Design Improvements. Part 2; Software Quality and the Design and Inspection Process

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom

    1997-01-01

    The application of assurance engineering techniques improves the duration of failure-free performance of software. The totality of features and characteristics of a software product determines its ability to satisfy customer needs. Software in safety-critical systems is very important to NASA. We follow the System Safety Working Group's definition of system safety software: 'The optimization of system safety in the design, development, use and maintenance of software and its integration with safety-critical systems in an operational environment.' 'If it is not safe, say so' has become our motto. This paper reviews methods that NASA has used to make software design improvements by focusing on software quality and the design and inspection process.

  11. Educational Software Acquisition for Microcomputers.

    ERIC Educational Resources Information Center

    Erikson, Warren; Turban, Efraim

    1985-01-01

    Examination of issues involved in acquiring appropriate microcomputer software for higher education focuses on the following points: developing your own software; finding commercially available software; using published evaluations; pre-purchase testing; customizing and adapting commercial software; post-purchase testing; and software use. A…

  12. Software component quality evaluation

    NASA Technical Reports Server (NTRS)

    Clough, A. J.

    1991-01-01

    The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.

  13. Further Evolution of Composite Doubler Aircraft Repairs Through a Focus on Niche Applications

    SciTech Connect

    ROACH,DENNIS P.

    2000-07-15

    The number of commercial airframes exceeding twenty years of service continues to grow. A typical aircraft can experience over 2,000 fatigue cycles (cabin pressurizations) and even greater flight hours in a single year. An unavoidable by-product of aircraft use is that crack and corrosion flaws develop throughout the aircraft's skin and substructure elements. Economic barriers to the purchase of new aircraft have created an aging aircraft fleet and placed even greater demands on efficient and safe repair methods. The use of bonded composite doublers offers airframe manufacturers and aircraft maintenance facilities a cost-effective method to safely extend the lives of their aircraft. Instead of riveting multiple steel or aluminum plates to facilitate an aircraft repair, it is now possible to bond a single Boron-Epoxy composite doubler to the damaged structure. The FAA's Airworthiness Assurance Center at Sandia National Labs (AANC) is conducting a program with Boeing and Federal Express to validate and introduce composite doubler repair technology to the US commercial aircraft industry. This project focuses on repair of DC-10 structure and builds on the foundation of the successful L-1011 door corner repair that was completed by the AANC, Lockheed-Martin, and Delta Air Lines. The L-1011 composite doubler repair was installed in 1997 and has not developed any flaws in over three years of service. As a follow-on effort, this DC-10 repair program investigated design, analysis, performance (durability, flaw containment, reliability), installation, and nondestructive inspection issues. Current activities are demonstrating regular use of composite doubler repairs on commercial aircraft. The primary goal of this program is to move the technology into niche applications and to streamline the design-to-installation process. Using the data accumulated to date, the team has designed, analyzed, and developed inspection techniques for an array of composite doubler repairs.

  14. Focused-electron-beam-induced processing (FEBIP) for emerging applications in carbon nanoelectronics

    NASA Astrophysics Data System (ADS)

    Fedorov, Andrei G.; Kim, Songkil; Henry, Mathias; Kulkarni, Dhaval; Tsukruk, Vladimir V.

    2014-12-01

    Focused-electron-beam-induced processing (FEBIP), a resist-free additive nanomanufacturing technique, is an actively researched method for "direct-write" processing of a wide range of structural and functional nanomaterials, with a high degree of spatial and time-domain control. This article attempts to critically assess the FEBIP capabilities and unique value proposition in the context of processing of electronics materials, with a particular emphasis on emerging carbon (i.e., based on graphene and carbon nanotubes) devices and interconnect structures. One of the major hurdles in advancing carbon-based electronic materials and device fabrication is the disjoint nature of the various processing steps involved in making a functional device from the precursor graphene/CNT materials. Not only does this multi-step sequence severely limit the throughput and increase the cost, but it also dramatically reduces the processing reproducibility and negatively impacts quality because of possible between-step contamination, especially for impurity-susceptible materials such as graphene. FEBIP provides a unique opportunity to address many challenges of carbon nanoelectronics, especially when it is employed as part of an integrated processing environment based on multiple "beams" of energetic particles, including electrons, photons, and molecules. This avenue is promising from the applications' perspective, as such a multi-functional (electron/photon/molecule beam) tool enables one to define shapes (patterning), form structures (deposition/etching), and modify properties (cleaning/doping/annealing) with locally resolved control at the nanoscale using the same tool, without ever changing the processing environment. It thus will have a direct positive impact on enhancing functionality, improving quality and reducing fabrication costs for electronic devices based on both conventional CMOS and emerging carbon (CNT/graphene) materials.

  15. Software Formal Inspections Standard

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This Software Formal Inspections Standard (hereinafter referred to as Standard) is applicable to NASA software. This Standard defines the requirements that shall be fulfilled by the software formal inspections process whenever this process is specified for NASA software. The objective of this Standard is to define the requirements for a process that inspects software products to detect and eliminate defects as early as possible in the software life cycle. The process also provides for the collection and analysis of inspection data to improve the inspection process as well as the quality of the software.

  16. TOUGH2 software qualification

    SciTech Connect

    Pruess, K.; Simmons, A.; Wu, Y.S.; Moridis, G.

    1996-02-01

    TOUGH2 is a numerical simulation code for multi-dimensional coupled fluid and heat flow of multiphase, multicomponent fluid mixtures in porous and fractured media. It belongs to the MULKOM ("MULti-KOMponent") family of codes and is a more general version of the TOUGH simulator. The MULKOM family of codes was originally developed with a focus on geothermal reservoir simulation. They are suited to modeling systems which contain different fluid mixtures, with applications to flow problems arising in the context of high-level nuclear waste isolation, oil and gas recovery and storage, and groundwater resource protection. TOUGH2 is essentially a subset of MULKOM, consisting of a selection of the better tested and documented MULKOM program modules. The purpose of this package of reports is to provide all software baseline documents necessary for the software qualification of TOUGH2.

  17. An application of three-dimensional modeling in the cutting machine of intersecting line software

    NASA Astrophysics Data System (ADS)

    Lu, Jixiang

    2011-11-01

    This paper presents a software platform for an intersecting-line cutting machine. The platform consists of three parts: an interface for parameter input and modification, a three-dimensional display of the main tube and branch tube, and cutting simulation with G-code output. Intersection data are obtained with an intersection algorithm, and a three-dimensional model and dynamic simulation are built from the intersecting-line cutting data. By changing the parameters and the assembly sequence of the main tube and branch tube, the user can view the modified two-dimensional and three-dimensional graphics and the corresponding G-code output file. This method has been applied in practical intersecting-line cutting machine software.
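    For a branch tube meeting the main tube at a right angle, the intersection curve can be sampled directly from the cylinder equations; the sketch below is a minimal illustration of this kind of computation (the parametrization and the naive G-code emission are assumptions for illustration, not the paper's actual algorithm):

    ```python
    import math

    def intersection_curve(R: float, r: float, n: int = 8):
        """Sample the intersection curve where a branch tube of radius r
        meets, at a right angle, a main tube of radius R (R >= r).
        Parametrize the branch surface by angle t; the main-tube surface
        y^2 + z^2 = R^2 gives the cut height z along the branch axis."""
        pts = []
        for i in range(n):
            t = 2 * math.pi * i / n
            x = r * math.cos(t)          # around the branch circumference
            y = r * math.sin(t)
            z = math.sqrt(R**2 - y**2)   # height of the cut on the branch
            pts.append((x, y, z))
        return pts

    for x, y, z in intersection_curve(R=50.0, r=20.0, n=4):
        print(f"G01 X{x:.3f} Y{y:.3f} Z{z:.3f}")  # naive G-code emission
    ```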

  18. Open source hardware and software platform for robotics and artificial intelligence applications

    NASA Astrophysics Data System (ADS)

    Liang, S. Ng; Tan, K. O.; Lai Clement, T. H.; Ng, S. K.; Mohammed, A. H. Ali; Mailah, Musa; Azhar Yussof, Wan; Hamedon, Zamzuri; Yussof, Zulkifli

    2016-02-01

    Recent developments in open source hardware and software platforms (Android, Arduino, Linux, OpenCV etc.) have enabled rapid development of previously expensive and sophisticated systems within a lower budget and with flatter learning curves for developers. Using these platforms, we designed and developed a Java-based 3D robotic simulation system, with a graph database, which is integrated in online and offline modes with an Android-Arduino based rubbish-picking remote control car. The combination of the open source hardware and software systems created a flexible and expandable platform for further developments in the future, both in the software and hardware areas, in particular in combination with a graph database for artificial intelligence, as well as more sophisticated hardware, such as legged or humanoid robots.

  19. Microcomputer technology applications: Charger and regulator software for a breadboard programmable power processor

    NASA Technical Reports Server (NTRS)

    Green, D. M.

    1978-01-01

    Two software programs are described: one implements a voltage regulation function, and the other implements a charger function with peak-power tracking of its input. The software, written in modular fashion, is intended as a vehicle for further experimentation with the P-3 system. A control teleprinter allows an operator to make parameter modifications to the control algorithm during experiments. The programs require 3K of ROM and 2K of RAM each. User manuals for each system are included, as well as a third program for simple I/O control.

  20. Coupling Photon Monte Carlo Simulation and CAD Software. Application to X-ray Nondestructive Evaluation

    NASA Astrophysics Data System (ADS)

    Tabary, J.; Glière, A.

    A Monte Carlo radiation transport simulation program, EGS Nova, and a Computer-Aided Design package, BRL-CAD, have been coupled within the framework of Sindbad, a Nondestructive Evaluation (NDE) simulation system. In its current state, the program is very valuable in an NDE laboratory context, as it helps simulate the images due to the uncollided and scattered photon fluxes in a single NDE software environment, without having to switch to a separate Monte Carlo code parameter set. Numerical validations show good agreement with EGS4-computed and published data. As the program's major drawback is its execution time, computational efficiency improvements are foreseen.
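    The uncollided flux that such a system images follows the Beer-Lambert attenuation law; a minimal Monte Carlo sketch of this idea (illustrative only, unrelated to the actual EGS Nova/Sindbad code) samples exponential free paths and counts photons that cross a slab without interacting:

    ```python
    import math
    import random

    def uncollided_fraction(mu: float, thickness: float, n: int = 100_000,
                            seed: int = 42) -> float:
        """Monte Carlo estimate of the fraction of photons crossing a slab
        without interacting: sample free path lengths from an exponential
        distribution with attenuation coefficient mu (1/cm)."""
        rng = random.Random(seed)
        survived = sum(1 for _ in range(n)
                       if -math.log(1.0 - rng.random()) / mu > thickness)
        return survived / n

    mu, d = 0.2, 5.0  # e.g. ~0.2 /cm attenuation, 5 cm slab
    est = uncollided_fraction(mu, d)
    print(est, math.exp(-mu * d))  # estimate vs. analytic exp(-mu*d) ≈ 0.368
    ```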

  1. An application of the IMC software to controller design for the JPL LSCL Experiment Facility

    NASA Technical Reports Server (NTRS)

    Zhu, Guoming; Skelton, Robert E.

    1993-01-01

    A software package which Integrates Model reduction and Controller design (The IMC software) is applied to design controllers for the JPL Large Spacecraft Control Laboratory Experiment Facility. Modal Cost Analysis is used for the model reduction, and various Output Covariance Constraints are guaranteed by the controller design. The main motivation is to find the controller with the 'best' performance with respect to output variances. Indeed it is shown that by iterating on the reduced order design model, the controller designed does have better performance than that obtained with the first model reduction.

  2. Application of Artificial Intelligence technology to the analysis and synthesis of reliable software systems

    NASA Technical Reports Server (NTRS)

    Wild, Christian; Eckhardt, Dave

    1987-01-01

    The development of a methodology for the production of highly reliable software is one of the greatest challenges facing the computer industry. Meeting this challenge will undoubtedly involve the integration of many technologies. This paper describes the use of Artificial Intelligence technologies in the automated analysis of the formal algebraic specifications of abstract data types. These technologies include symbolic execution of specifications using techniques of automated deduction and machine learning through the use of examples. On-going research into the role of knowledge representation and problem solving in the process of developing software is also discussed.

  3. Software Applications to Access Earth Science Data: Building an ECHO Client

    NASA Astrophysics Data System (ADS)

    Cohen, A.; Cechini, M.; Pilone, D.

    2010-12-01

    Historically, developing an ECHO (NASA’s Earth Observing System (EOS) ClearingHOuse) client required interaction with its SOAP API. SOAP, as a framework for web service communication has numerous advantages for Enterprise applications and Java/C# type programming languages. However, as interest has grown for quick development cycles and more intriguing “mashups,” ECHO has seen the SOAP API lose its appeal. In order to address these changing needs, ECHO has introduced two new interfaces facilitating simple access to its metadata holdings. The first interface is built upon the OpenSearch format and ESIP Federated Search framework. The second interface is built upon the Representational State Transfer (REST) architecture. Using the REST and OpenSearch APIs to access ECHO makes development with modern languages much more feasible and simpler. Client developers can leverage the simple interaction with ECHO to focus more of their time on the advanced functionality they are presenting to users. To demonstrate the simplicity of developing with the REST API, participants will be led through a hands-on experience where they will develop an ECHO client that performs the following actions: + Login + Provider discovery + Provider based dataset discovery + Dataset, Temporal, and Spatial constraint based Granule discovery + Online Data Access
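    Much of the appeal of a REST/OpenSearch interface is that queries reduce to URL construction; the sketch below builds a dataset-discovery query in that style (the base URL and parameter names are hypothetical placeholders, not the documented ECHO endpoints):

    ```python
    from urllib.parse import urlencode

    # Hypothetical base URL and parameter names -- illustrative only.
    BASE = "https://api.example.gov/search"

    def dataset_query(keyword: str, bbox=None, temporal=None) -> str:
        """Build an OpenSearch-style dataset discovery URL."""
        params = {"keyword": keyword, "page_size": 10}
        if bbox:                      # (west, south, east, north)
            params["bounding_box"] = ",".join(map(str, bbox))
        if temporal:                  # (start_iso, end_iso)
            params["temporal"] = ",".join(temporal)
        return f"{BASE}/datasets?{urlencode(params)}"

    url = dataset_query("sea surface temperature",
                        bbox=(-180, -90, 180, 90),
                        temporal=("2010-01-01T00:00:00Z",
                                  "2010-12-31T23:59:59Z"))
    print(url)
    ```

    A granule-discovery call would follow the same pattern with dataset and spatial/temporal constraints added as further query parameters.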

  4. Experimental demonstration of elastic optical networks based on enhanced software defined networking (eSDN) for data center application.

    PubMed

    Zhang, Jie; Yang, Hui; Zhao, Yongli; Ji, Yuefeng; Li, Hui; Lin, Yi; Li, Gang; Han, Jianrui; Lee, Young; Ma, Teng

    2013-11-01

    Due to the high burstiness and high-bandwidth characteristics of the applications, data center interconnection by elastic optical networks has attracted much attention from network operators and service providers. Many data center applications require lower delay and higher availability with end-to-end guaranteed quality of service. In this paper, we propose and implement a novel elastic optical network based on enhanced software defined networking (eSDN) architecture for data center application, by introducing a transport-aware cross stratum optimization (TA-CSO) strategy. eSDN can enable cross stratum optimization of application and elastic optical network stratum resources and provide elastic physical layer parameter adjustment, e.g., modulation format and bandwidth. We have designed and verified experimentally software defined path provisioning on our testbed with four real OpenFlow-enabled elastic optical nodes for data center application. The overall feasibility and efficiency of the proposed architecture is also experimentally demonstrated and compared with individual CSO and physical layer adjustment strategies in terms of path setup/release/adjustment latency, blocking probability and resource occupation rate. PMID:24216922

  5. School Counseling and Solution-Focused Site Supervision: A Theoretical Application and Case Example

    ERIC Educational Resources Information Center

    Cigrand, Dawnette L.; Wood, Susannah M.

    2011-01-01

    The solution-focused counseling theory provides a useful framework that can be applied to supervision of counselors-in-training. Solution-focused supervision is especially useful for school counseling site supervisors who may not have much time for supervision, who may not have had much training in clinical supervision, or who may have had…

  6. Empowering Adolescent Survivors of Sexual Abuse: Application of a Solution-Focused Ericksonian Counseling Group

    ERIC Educational Resources Information Center

    Kress, Victoria E.; Hoffman, Rachel M.

    2008-01-01

    This article describes a solution-focused and Ericksonian group counseling model that can be used with adolescent girls who have been sexually abused. An overview of the components of this approach is provided. A postintervention focus group provided additional results and ideas for the future development of the group counseling model.

  7. Microfabrication and Test of a Three-Dimensional Polymer Hydro-focusing Unit for Flow Cytometry Applications

    NASA Technical Reports Server (NTRS)

    Yang, Ren; Feeback, Daniel L.; Wang, Wan-Jun

    2005-01-01

    This paper details a novel three-dimensional (3D) hydro-focusing micro cell sorter for micro flow cytometry applications. The unit was microfabricated by means of SU-8 3D lithography. The 3D microstructure for coaxial sheathing was designed, microfabricated, and tested. Three-dimensional hydrofocusing capability was demonstrated with an experiment to sort labeled tanned sheep erythrocytes (red blood cells). This polymer hydro-focusing microstructure is easily microfabricated and integrated with other polymer microfluidic structures. Keywords: SU-8, three-dimensional hydro-focusing, microfluidic, microchannel, cytometer

  8. caBIG™ Compatibility Review System: Software to Support the Evaluation of Applications Using Defined Interoperability Criteria

    PubMed Central

    Freimuth, Robert R.; Schauer, Michael W.; Lodha, Preeti; Govindrao, Poornima; Nagarajan, Rakesh; Chute, Christopher G.

    2008-01-01

    The caBIG™ Compatibility Review System (CRS) is a web-based application to support compatibility reviews, which certify that software applications that pass the review meet a specific set of criteria that allow them to interoperate. The CRS contains workflows that support both semantic and syntactic reviews, which are performed by the caBIG Vocabularies and Common Data Elements (VCDE) and Architecture workspaces, respectively. The CRS increases the efficiency of compatibility reviews by reducing administrative overhead and it improves uniformity by ensuring that each review is conducted according to a standard process. The CRS provides metrics that allow the review team to evaluate the level of data element reuse in an application, a first step towards quantifying the extent of harmonization between applications. Finally, functionality is being added that will provide automated validation of checklist criteria, which will further simplify the review process. PMID:18999296

  9. Early Algebra with Graphics Software as a Type II Application of Technology

    ERIC Educational Resources Information Center

    Abramovich, Sergei

    2006-01-01

    This paper describes the use of Kid Pix-graphics software for creative activities of young children--in the context of early algebra as determined by the mathematics core curriculum of New York state. It shows how grade-two appropriate pedagogy makes it possible to bring about a qualitative change in the learning process of those commonly…

  10. Study of application of space telescope science operations software for SIRTF use

    NASA Technical Reports Server (NTRS)

    Dignam, F.; Stetson, E.; Allendoerfer, W.

    1985-01-01

    The design and development of the Space Telescope Science Operations Ground System (ST SOGS) was evaluated to compile a history of lessons learned that would benefit NASA's Space Infrared Telescope Facility (SIRTF). Forty-nine specific recommendations resulted and were categorized as follows: (1) requirements: a discussion of the content, timeliness and proper allocation of the system and segment requirements and the resulting impact on SOGS development; (2) science instruments: a consideration of the impact of the Science Instrument design and data streams on SOGS software; and (3) contract phasing: an analysis of the impact of beginning the various ST program segments at different times. Approximately half of the software design and source code might be useable for SIRTF. Transportability of this software requires, at minimum, a compatible DEC VAX-based architecture and VMS operating system, system support software similar to that developed for SOGS, and continued evolution of the SIRTF operations concept and requirements such that they remain compatible with ST SOGS operation.

  11. Application of fuzzy logic in intelligent software agents for IP selection

    NASA Astrophysics Data System (ADS)

    Liu, Jian; Shragowitz, Eugene B.

    1999-11-01

    IPs (Intellectual Properties) are becoming increasingly essential in today's electronic system design. One of the important issues in design reuse is IP selection, i.e., finding an existing solution that best matches the user's expectations. This paper describes an Internet-based intelligent software system (software agent) that helps the user pick out the optimal designs among those marketed by IP vendors. The Software Agent for IP Selection (SAFIPS) conducts dialogues with both IP users and IP vendors, narrowing the choices by evaluating general characteristics first, followed by matching at the behavioral, RTL, logic, and physical levels. The SAFIPS system conducts reasoning based on fuzzy logic rules derived in the process of the software agent's dialogues with IP users and vendors. In addition to the dialogue system and fuzzy logic inference system, SAFIPS includes an HDL simulator and a fuzzy logic evaluator that are used to measure how well the user's behavioral model matches the IP vendor's model.
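    The core of such fuzzy-rule matching is combining membership degrees across criteria; the sketch below is a minimal illustration (the membership functions, criteria, and weights are assumptions for illustration, not SAFIPS internals):

    ```python
    def triangular(x: float, a: float, b: float, c: float) -> float:
        """Triangular fuzzy membership function peaking at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def match_score(ip: dict, weights: dict) -> float:
        """Weighted fuzzy degree of match between an IP's characteristics
        and the user's requirements (illustrative membership functions)."""
        # 'fast enough': clock in MHz, ideal around 200
        speed = triangular(ip["clock_mhz"], 50, 200, 400)
        # 'cheap enough': cost in $k, ideal around 10
        cost = triangular(ip["cost_k"], 0, 10, 50)
        total = weights["speed"] * speed + weights["cost"] * cost
        return total / sum(weights.values())

    score = match_score({"clock_mhz": 150, "cost_k": 20},
                        {"speed": 2, "cost": 1})
    print(round(score, 3))  # → 0.694
    ```

    Candidate IPs would then be ranked by this score, with the highest-scoring designs presented to the user.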

  12. Application of FLUIDNET-G software to the preliminary thermal and hydraulic design of Hermes spaceplane

    NASA Astrophysics Data System (ADS)

    Chelotti, Jean-Noel; Andre, Serge; Maciaszek, Thierry; Thuaire, Alain; Ferro, Alain

    1992-07-01

    FLUIDNET-G, a workstation-run graphics-software package, performs thermal and hydraulic preliminary design tasks for single-phase fluid-loop systems. Both tradeoff and sensitivity studies may be conducted, as is presently illustrated for the case of the Hermes radiator system. Extensive examples are given of the FLUIDNET-G system's graphics for the design problems discussed.

  13. Contemporary Applications of Computer Technology: Development of Meaningful Software in Special Education/Rehabilitation.

    ERIC Educational Resources Information Center

    Mills, Russell

    Four elements of clinical programming must be considered during development in order for a software program to be truly useful in rehabilitation: presentation of a useful task; treatment parameters selectable by clinicians; data collection/analysis; and authoring capability. These criteria govern the development of all Brain-Link Software…

  14. Two-dimensional spatiotemporal focusing of femtosecond pulses and its applications in microscopy

    NASA Astrophysics Data System (ADS)

    Song, Qiyuan; Nakamura, Aoi; Hirosawa, Kenichi; Isobe, Keisuke; Midorikawa, Katsumi; Kannari, Fumihiko

    2015-08-01

    We demonstrate and theoretically analyze the two-dimensional spatiotemporal focusing of femtosecond pulses by utilizing a two-dimensional spectral disperser. Compared with spatiotemporal focusing with a diffraction grating, it can achieve widefield illumination with better sectioning ability for a multiphoton excitation process. By utilizing paraxial approximation, our analytical method improves the axial confinement ability and identifies that the free spectral range (FSR) of the two-dimensional spectral disperser affects the out-of-focus multiphoton excitation intensity due to the temporal self-imaging effect. Based on our numerical simulation, a FSR of 50 GHz is necessary to reduce the out-of-focus two-photon excitation by 2 orders of magnitude compared with that in a grating-based spatiotemporal focusing scheme for a 90-fs excitation laser pulse. We build a two-dimensional spatiotemporal focusing microscope using a virtually imaged phased array and achieve an axial resolution of 1.3 μm, which outperforms the resolution of conventional spatiotemporal focusing using a grating by a factor of 1.7, and demonstrate better image contrast inside a tissue-like phantom.

  15. Two-dimensional spatiotemporal focusing of femtosecond pulses and its applications in microscopy

    SciTech Connect

    Song, Qiyuan; Nakamura, Aoi; Hirosawa, Kenichi; Kannari, Fumihiko; Isobe, Keisuke; Midorikawa, Katsumi

    2015-08-15

    We demonstrate and theoretically analyze the two-dimensional spatiotemporal focusing of femtosecond pulses by utilizing a two-dimensional spectral disperser. Compared with spatiotemporal focusing with a diffraction grating, it can achieve widefield illumination with better sectioning ability for a multiphoton excitation process. By utilizing paraxial approximation, our analytical method improves the axial confinement ability and identifies that the free spectral range (FSR) of the two-dimensional spectral disperser affects the out-of-focus multiphoton excitation intensity due to the temporal self-imaging effect. Based on our numerical simulation, a FSR of 50 GHz is necessary to reduce the out-of-focus two-photon excitation by 2 orders of magnitude compared with that in a grating-based spatiotemporal focusing scheme for a 90-fs excitation laser pulse. We build a two-dimensional spatiotemporal focusing microscope using a virtually imaged phased array and achieve an axial resolution of 1.3 μm, which outperforms the resolution of conventional spatiotemporal focusing using a grating by a factor of 1.7, and demonstrate better image contrast inside a tissue-like phantom.

  16. A Component Approach to Collaborative Scientific Software Development: Tools and Techniques Utilized by the Quantum Chemistry Science Application Partnership

    DOE PAGES Beta

    Kenny, Joseph P.; Janssen, Curtis L.; Gordon, Mark S.; Sosonkina, Masha; Windus, Theresa L.

    2008-01-01

    Cutting-edge scientific computing software is complex, increasingly involving the coupling of multiple packages to combine advanced algorithms or simulations at multiple physical scales. Component-based software engineering (CBSE) has been advanced as a technique for managing this complexity, and complex component applications have been created in the quantum chemistry domain, as well as several other simulation areas, using the component model advocated by the Common Component Architecture (CCA) Forum. While programming models do indeed enable sound software engineering practices, the selection of programming model is just one building block in a comprehensive approach to large-scale collaborative development which must also address interface and data standardization, and language and package interoperability. We provide an overview of the development approach utilized within the Quantum Chemistry Science Application Partnership, identifying design challenges, describing the techniques which we have adopted to address these challenges and highlighting the advantages which the CCA approach offers for collaborative development.
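    The provides/uses port pattern at the heart of the CCA component model can be sketched in a few lines (the names and interfaces here are illustrative, not the actual CCA API):

    ```python
    # Minimal sketch of the provides/uses port pattern central to
    # component-based software engineering (CBSE) as advocated by CCA.
    from abc import ABC, abstractmethod

    class EnergyPort(ABC):
        """A 'provides' port: an interface a component exposes."""
        @abstractmethod
        def energy(self, geometry: list) -> float: ...

    class HartreeFockComponent(EnergyPort):
        """One interchangeable implementation of the port."""
        def energy(self, geometry: list) -> float:
            return -1.0 * len(geometry)  # placeholder computation

    class Driver:
        """A component that 'uses' an EnergyPort without knowing which
        concrete package implements it -- the key to interoperability."""
        def __init__(self, port: EnergyPort):
            self.port = port

        def run(self, geometry: list) -> float:
            return self.port.energy(geometry)

    print(Driver(HartreeFockComponent()).run([(0, 0, 0), (0, 0, 0.74)]))  # → -2.0
    ```

    Swapping in a different `EnergyPort` implementation requires no change to `Driver`, which is the interoperability benefit the component model provides.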

  17. The Study on Neuro-IE Management Software in Manufacturing Enterprises: The Application of Video Analysis Technology

    NASA Astrophysics Data System (ADS)

    Bian, Jun; Fu, Huijian; Shang, Qian; Zhou, Xiangyang; Ma, Qingguo

    This paper analyzes the outstanding problems in current industrial production by reviewing the three stages of industrial engineering development. Based on investigations and interviews in enterprises, we propose the new idea of applying "computer video analysis technology" to new industrial engineering management software, and add a "loose coefficient" for each working station to this software in order to arrange scientific and humanistic production. Meanwhile, we suggest utilizing biofeedback technology to promote further research on "the rules of workers' physiological, psychological and emotional changes in production". This new kind of combination will push forward industrial engineering theories and benefit enterprises in progressing towards flexible social production; thus it will be of great theoretical innovation value, social significance and application value.

  18. Design of the on-board application software for the instrument control unit of Euclid-NISP

    NASA Astrophysics Data System (ADS)

    Ligori, Sebastiano; Corcione, Leonardo; Capobianco, Vito; Valenziano, Luca

    2014-08-01

    In this paper we describe the main requirements driving the development of the application software of the ICU of NISP, the Near-Infrared Spectro-Photometer of the Euclid mission. This software will be based on a real-time operating system and will interface with all the subunits of NISP, as well as with the CMDU of the spacecraft for telecommand and housekeeping management. We briefly detail the services (following the PUS standard) that will be made available, and also possible commonalities in approach with the ASW of the VIS CDPU, which could make the development effort more efficient; this approach could also ease maintenance of the SW during the mission. The development plan of the ASW and the next milestones foreseen are described, together with the architectural design approach and the development environment we are setting up.

  19. MisTec - A software application for supporting space exploration scenario options and technology development analysis and planning

    NASA Technical Reports Server (NTRS)

    Horsham, Gary A. P.

    1992-01-01

    The structure and composition of a new, emerging software application, which models and analyzes space exploration scenario options for feasibility based on technology development projections, is presented. The software application consists of four main components: a scenario generator for designing and inputting scenario options and constraints; a processor which performs algorithmic coupling and options analyses of mission activity requirements and technology capabilities; a results display which graphically and textually shows coupling and options analysis results; and a data/knowledge base which contains information on a variety of mission activities and (power and propulsion) technology system capabilities. The general long-range study process used by NASA to support recent studies is briefly introduced to provide the primary basis of comparison for discussing the potential advantages to be gained from developing and applying this kind of application. A hypothetical example of a scenario option is presented to illustrate what the application is, how it works, and when it might be applied.

  20. MisTec: A software application for supporting space exploration scenario options and technology development analysis and planning

    NASA Technical Reports Server (NTRS)

    Horsham, Gary A. P.

    1991-01-01

    The structure and composition of a new, emerging software application, which models and analyzes space exploration scenario options for feasibility based on technology development projections, is presented. The software application consists of four main components: a scenario generator for designing and inputting scenario options and constraints; a processor which performs algorithmic coupling and options analyses of mission activity requirements and technology capabilities; a results display which graphically and textually shows coupling and options analysis results; and a data/knowledge base which contains information on a variety of mission activities and (power and propulsion) technology system capabilities. The general long-range study process used by NASA to support recent studies is briefly introduced to provide the primary basis of comparison for discussing the potential advantages to be gained from developing and applying this kind of application. A hypothetical example of a scenario option is presented to illustrate what the application is, how it works, and when it might be applied.

  1. The Environment for Application Software Integration and Execution (EASIE), version 1.0. Volume 3: Program execution guide

    NASA Technical Reports Server (NTRS)

    Schwing, James L.; Criste, Russell E.

    1988-01-01

    The Environment for Application Software Integration and Execution, EASIE, provides a methodology and a set of software utility programs to ease the task of coordinating engineering design and analysis codes. EASIE was designed to meet the needs of conceptual design engineers who face the task of integrating the results of many stand-alone engineering analysis programs. EASIE provides access to these programs via a quick, uniform, user-friendly interface. In addition, EASIE provides utilities which aid in the execution of the following tasks: selection of application programs, modification and review of program data, automatic definition and coordination of data files during program execution, and logging of the steps executed throughout a design study. Volume 3, the Program Execution Guide, describes the executive capabilities provided by EASIE and defines the command language and menus available under Version 1.0. EASIE provides users with two basic modes of operation. One is the Application-Derived Executive (ADE), which provides users with sufficient guidance to quickly review data, select menu action items, and execute application programs. The second is the Complete Control Executive (CCE), which provides a full executive interface allowing users in-depth control of the design process.

  2. Space Station Software Recommendations

    NASA Technical Reports Server (NTRS)

    Voigt, S. (Editor)

    1985-01-01

    Four panels of invited experts and NASA representatives focused on the following topics: software management, software development environment, languages, and software standards. Each panel deliberated in private, held two open sessions with audience participation, and developed recommendations for the NASA Space Station Program. The major thrusts of the recommendations were as follows: (1) The software management plan should establish policies, responsibilities, and decision points for software acquisition; (2) NASA should furnish a uniform modular software support environment and require its use for all space station software acquired (or developed); (3) The language Ada should be selected for space station software, and NASA should begin to address issues related to the effective use of Ada; and (4) The space station software standards should be selected (based upon existing standards where possible), and an organization should be identified to promulgate and enforce them. These and related recommendations are described in detail in the conference proceedings.

  3. Computer Software.

    ERIC Educational Resources Information Center

    Kay, Alan

    1984-01-01

    Discusses the nature and development of computer software. Programing, programing languages, types of software (including dynamic spreadsheets), and software of the future are among the topics considered. (JN)

  4. Energy-based adaptive focusing of waves: application to noninvasive aberration correction of ultrasonic wavefields

    PubMed Central

    Herbert, Eric; Pernot, Mathieu; Montaldo, Gabriel; Fink, Mathias; Tanter, Mickael

    2009-01-01

    An aberration correction method based on the maximization of the wave intensity at the focus of an emitting array is presented. The potential of this new adaptive focusing technique is investigated for ultrasonic focusing in biological tissues. The acoustic intensity is maximized noninvasively through the direct measurement or indirect estimation of the beam energy at the focus for a series of spatially coded emissions. For ultrasonic waves, the acoustic energy at the desired focus can be indirectly estimated from the local displacements induced in tissues by the ultrasonic radiation force of the beam. Based on the measurement of these displacements, this method allows the precise estimation of the phase and amplitude aberrations and consequently the correction of aberrations along the beam travel path. The proof of concept is first performed experimentally using a large therapeutic array with strong electronic phase aberrations (up to 2π). Displacements induced by the ultrasonic radiation force at the desired focus are indirectly estimated using the time shift of backscattered echoes recorded on the array. The phase estimation is deduced accurately using a direct inversion algorithm which reduces the standard deviation of the phase distribution from σ = 1.89 before correction to σ = 0.53 following correction. The corrected beam focusing quality is verified using a needle hydrophone. The peak intensity obtained through the aberrator is found to be −7.69 dB relative to the reference intensity obtained without any aberration. Using the phase correction, a sharp focus is restored through the aberrator with a relative peak intensity of −0.89 dB. The technique is tested experimentally using a linear transmit/receive array through a real aberrating layer. The array is used to automatically correct its beam quality, as it both generates the radiation force with coded excitations and indirectly estimates the acoustic intensity at the focus with speckle tracking. This
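    The direct-inversion step can be sketched as follows. This toy model assumes the complex field at the focus can be estimated for each coded emission (in the paper this comes from radiation-force displacements tracked in the backscattered echoes); the Hadamard code and the eight-element array are illustrative, not the therapeutic array used in the experiments:

```python
import cmath
import random

def hadamard(n):
    """Sylvester construction of an n x n Hadamard matrix; n must be a power of two."""
    h = [[1]]
    while len(h) < n:
        h = [row + row for row in h] + [row + [-x for x in row] for row in h]
    return h

def estimate_phases(measured, codes):
    """Invert E = H e using Hadamard orthogonality (H^T H = n I) to recover phases."""
    n = len(codes)
    return [cmath.phase(sum(codes[k][i] * measured[k] for k in range(n)) / n)
            for i in range(n)]

random.seed(1)
n = 8
true_phases = [random.uniform(-cmath.pi, cmath.pi) for _ in range(n)]
h = hadamard(n)
# Each coded emission k yields a focal field E_k = sum_i c_ki * exp(j phi_i);
# here we synthesize it directly instead of measuring displacements.
measured = [sum(h[k][i] * cmath.exp(1j * true_phases[i]) for i in range(n))
            for k in range(n)]
est = estimate_phases(measured, h)
print(max(abs(a - b) for a, b in zip(est, true_phases)))  # ~0 (machine precision)
```

With noiseless synthetic fields the aberration phases are recovered exactly; the correction is then just the conjugate of the estimated phase applied to each element.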

  5. Development and applications of a software tool for diarthrodial joint analysis.

    PubMed

    Martelli, Sandra; Lopomo, Nicola; Greggio, Samuele; Ferretti, Emil; Visani, Andrea

    2006-07-01

    This paper describes a new software environment for advanced analysis of diarthrodial joints. The new tool provides a number of elaboration functions to investigate joint kinematics, bone anatomy, and ligament and tendon properties. In particular, the shapes and the contact points of the articulating surfaces can be displayed and analysed through 2D user-defined sections and fittings (lines or conics). Ligament behaviour can be evaluated during joint movement through the computation of elongations, orientations, and fiber strain. Motion trajectories can also be analysed through the calculation of helical axes, instantaneous rotations, and displacements in specific user-chosen coordinate reference frames. The software has a user-friendly graphical interface to display four-dimensional data (time-space data) obtained from medical images, navigation systems, spatial linkages or digitizers, and can also generate printable reports and multiple graphs as well as ASCII files that can be imported into spreadsheet programs such as Microsoft Excel. PMID:16777259

  6. Parallel Domain Decomposition Formulation and Software for Large-Scale Sparse Symmetrical/Unsymmetrical Aeroacoustic Applications

    NASA Technical Reports Server (NTRS)

    Nguyen, D. T.; Watson, Willie R. (Technical Monitor)

    2005-01-01

    The overall objectives of this research are to formulate and validate efficient parallel algorithms, and to efficiently design and implement computer software for solving large-scale acoustic problems arising from the unified frameworks of finite element procedures. The adopted parallel Finite Element (FE) Domain Decomposition (DD) procedures should take full advantage of the multiple processing capabilities offered by most modern high performance computing platforms for efficient parallel computation. To achieve this objective, the formulation needs to integrate efficient sparse (and dense) assembly techniques, hybrid (or mixed) direct and iterative equation solvers, proper preconditioning strategies, unrolling strategies, and effective interprocessor communication schemes. Finally, the numerical performance of the developed parallel finite element procedures will be evaluated by solving a series of structural and acoustic (symmetrical and unsymmetrical) problems on different computing platforms. Comparisons with existing "commercial" and/or "public domain" software are also included, whenever possible.

  7. Application of optical design software in the analysis of "unknown" optical systems

    NASA Astrophysics Data System (ADS)

    Roudnicky, Dunja S.

    1998-08-01

    Optical design software is useful not only for designing new optical systems, but also for analyzing `unknown' systems. After measuring the radii of curvature, focal lengths and axial thicknesses of the elements, we use the SIGMA 2100 optical design software (Kidger Optics). We determine which optical glass best fits the measured focal length of each element, and we also obtain aberration curves for the elements and for the whole system. In this way we analyze the elements of an eyepiece that is part of a compound panoramic sight. Since we then have the full specifications of this eyepiece, it is possible to optimize glasses and radii to more convenient values without risking a change in the performance of the whole optical system. This method makes it possible to repair and adapt `unknown' optical systems with a high yield.
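    The glass-identification step can be sketched with the thick-lens lensmaker's equation. The mini glass catalog below (d-line refractive indices) and the element geometry are illustrative, not SIGMA 2100's actual database or workflow:

```python
# Thick-lens lensmaker's equation:
#   1/f = (n - 1) * [1/R1 - 1/R2 + (n - 1) d / (n R1 R2)]
def focal_length(n, r1, r2, d):
    return 1.0 / ((n - 1) * (1 / r1 - 1 / r2 + (n - 1) * d / (n * r1 * r2)))

# Illustrative catalog of d-line refractive indices
CATALOG = {"N-BK7": 1.5168, "F2": 1.6200, "SF11": 1.7847}

def best_glass(f_measured, r1, r2, d):
    """Pick the catalog glass whose predicted focal length is closest to the measurement."""
    return min(CATALOG, key=lambda g: abs(focal_length(CATALOG[g], r1, r2, d) - f_measured))

# Example: biconvex element, R1 = 50 mm, R2 = -50 mm, 4 mm thick;
# a measured focal length near 49 mm is consistent with BK7.
print(best_glass(49.0, 50.0, -50.0, 4.0))  # N-BK7
```

A real analysis would compare against a full glass catalog and also check dispersion (Abbe number), since several glasses can produce similar focal lengths.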

  8. The software engineering laboratory: An approach to measuring software technology

    NASA Technical Reports Server (NTRS)

    Mcgarry, F.

    1980-01-01

    The investigations of the software evaluation laboratory into the software development process at NASA/Goddard are described. A data collection process for acquiring detailed histories of software development projects is outlined. The application of different sets of software methodologies to specific applications projects is summarized. The effect of the development methodology on productivity is discussed.

  9. TreeRipper web application: towards a fully automated optical tree recognition software

    PubMed Central

    2011-01-01

    Background Relationships between species, genes and genomes have been printed as trees for over a century. Whilst this may have been the best format for exchanging and sharing phylogenetic hypotheses during the 20th century, the worldwide web now provides faster and automated ways of transferring and sharing phylogenetic knowledge. However, novel software is needed to defrost these published phylogenies for the 21st century. Results TreeRipper is a simple website for the fully automated recognition of multifurcating phylogenetic trees (http://linnaeus.zoology.gla.ac.uk/~jhughes/treeripper/). The program accepts a range of input image formats (PNG, JPG/JPEG or GIF). The underlying command-line C++ program follows a number of cleaning steps to detect lines, remove node labels, patch up broken lines and corners, and detect line edges. The edge contour is then determined to detect the branch lengths, tip label positions and the topology of the tree. Optical Character Recognition (OCR) is used to convert the tip labels into text with the freely available tesseract-ocr software. 32% of images meeting the prerequisites for TreeRipper were successfully recognised; the largest tree had 115 leaves. Conclusions Despite the diversity of ways phylogenies have been illustrated, which makes the design of fully automated tree recognition software difficult, TreeRipper is a step towards automating the digitization of past phylogenies. We also provide a dataset of 100 tree images and associated tree files for training and/or benchmarking future software. TreeRipper is an open source project licensed under the GNU General Public Licence v3. PMID:21599881

  10. Perspectives in Medical Applications of Monte Carlo Simulation Software for Clinical Practice in Radiotherapy Treatments

    NASA Astrophysics Data System (ADS)

    Boschini, Matteo; Giani, Simone; Ivanchenko, Vladimir; Rancoita, Pier-Giorgio

    2006-04-01

    We discuss the physics requirements to accurately model radiation dosimetry in the human body as performed for oncological radiotherapy treatment. Recent advancements in computing hardware and software simulation technology allow precise dose calculation in real-life imaging output, with speed suitable for clinical needs. An experimental programme, based on physics published literature, is proposed to demonstrate the actual possibility to improve the precision of radiotherapy treatment planning.

  11. Dual reflector antenna design software - Application to offset-fed shaped elliptical aperture systems

    NASA Astrophysics Data System (ADS)

    Kossiavas, Isabelle

    1992-04-01

    To facilitate the design of dual reflector antennas, the interactive, graphic CA2R software package handles centrally or offset-fed structures with quadric or shaped reflectors. Surface shaping, based on geometrical optics, improves the antenna's efficiency and the sidelobe level. Existing techniques are applied to an offset-fed antenna with an elliptical projected aperture. An original numerical method to minimize crosspolar components is also presented.

  12. Distributing LHC application software and conditions databases using the CernVM file system

    NASA Astrophysics Data System (ADS)

    Blomer, Jakob; Aguado-Sánchez, Carlos; Buncic, Predrag; Harutyunyan, Artem

    2011-12-01

    The CernVM File System (CernVM-FS) is a read-only file system designed to deliver high energy physics (HEP) experiment analysis software onto virtual machines and Grid worker nodes in a fast, scalable, and reliable way. CernVM-FS decouples the underlying operating system from the experiment-defined software stack. Files and file metadata are aggressively cached and downloaded on demand. By designing the file system specifically for the use case of HEP software repositories and experiment conditions data, several typically hard problems for (distributed) file systems can be solved in an elegant way. For the distribution of files, we use a standard HTTP transport, which allows exploitation of a variety of web caches, including commercial content delivery networks. We ensure data authenticity and integrity over possibly untrusted caches and connections while keeping all distributed data cacheable. On the small scale, we developed an experimental extension that allows multiple CernVM-FS instances in a computing cluster to discover each other and to share their file caches.
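    The core trick that keeps data cacheable yet verifiable over untrusted caches is content addressing: objects are named by their hash, so any client can check integrity after fetching. A toy in-memory sketch of that read-through pattern (not the CernVM-FS implementation; a real client fetches over HTTP and verifies against a signed catalog):

```python
import hashlib

# Toy "repository": a content-addressed store standing in for an HTTP server
REMOTE = {}

def publish(data: bytes) -> str:
    digest = hashlib.sha1(data).hexdigest()
    REMOTE[digest] = data
    return digest

class ReadThroughCache:
    """Fetch objects by content hash, verify integrity, and cache locally."""
    def __init__(self):
        self.local = {}
        self.remote_fetches = 0

    def get(self, digest: str) -> bytes:
        if digest not in self.local:
            data = REMOTE[digest]  # stand-in for an HTTP GET (cacheable by any proxy)
            self.remote_fetches += 1
            # Content addressing makes data verifiable even via untrusted caches
            if hashlib.sha1(data).hexdigest() != digest:
                raise IOError("corrupt object")
            self.local[digest] = data
        return self.local[digest]

h = publish(b"experiment software stack v1")
cache = ReadThroughCache()
cache.get(h)
cache.get(h)
print(cache.remote_fetches)  # 1: the second read is served from the local cache
```

Because object names never change, cached copies never go stale; updates are published as new objects referenced from a new catalog revision.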

  13. Interfacing US Census map files with statistical graphics software: Application and use in epidemiology

    SciTech Connect

    Rizzardi, M.; Mohr, M.S.; Merrill, D.W.; Selvin, S. California Univ., Berkeley, CA . School of Public Health)

    1993-03-01

    In 1990, the United States Bureau of the Census released detailed geographic map files known as TIGER/Line (Topologically Integrated Geographic Encoding and Referencing). The TIGER files, accessible through purchase or Federal repository libraries, contain 24 billion characters of data describing various geographic features including coastlines, hydrography, transportation networks, political boundaries, etc. covering the entire United States. Many of these physical features are of potential interest in epidemiological case studies. Unfortunately, the TIGER database only provides raw alphanumeric data; no utility software, graphical or otherwise, is included. Recently, the S statistical software package has been extended to include a map display function. The map function augments S's high-level approach toward statistical analysis and graphical display of data. Coupling this statistical software with the map database developed for US Census data collection will facilitate epidemiological research. We discuss the technical background necessary to utilize the TIGER database for mapping with S. Two types of S maps, segment-based and polygon-based, are discussed along with methods to construct them from TIGER data. Polygon-based maps are useful for displaying regional statistical data; e.g., disease rates or incidence at the census tract level. Segment-based maps are easier to assemble and appropriate if the data are not regionalized. Census tract data of AIDS incidence in San Francisco (CA) and lung cancer case locations relative to petrochemical refinery sites in Contra Costa County (CA) are used to illustrate the methods and potential uses of interfacing the TIGER database with S.
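    The distinction between segment-based and polygon-based maps comes down to whether boundary line segments are chained into closed rings before plotting. A minimal sketch of that chaining step (toy coordinates, not the TIGER record layout):

```python
def chain_segments(segments):
    """Chain undirected segments sharing endpoints into one closed ring.

    Each segment is a (start, end) pair of (x, y) tuples, in arbitrary order
    and orientation. Returns the ring as an ordered point list with the first
    point repeated at the end, ready for polygon shading.
    """
    segs = list(segments)
    ring = list(segs.pop(0))
    while segs:
        for i, (a, b) in enumerate(segs):
            if a == ring[-1]:
                ring.append(b)
                segs.pop(i)
                break
            if b == ring[-1]:
                ring.append(a)
                segs.pop(i)
                break
        else:
            raise ValueError("segments do not form a single chain")
    return ring

# Four boundary segments of a square census tract, shuffled and mixed-orientation
square = [((0, 0), (1, 0)), ((1, 1), (0, 1)), ((1, 0), (1, 1)), ((0, 1), (0, 0))]
ring = chain_segments(square)
print(ring[0] == ring[-1])  # True: closed polygon
```

Segment-based maps skip this step and simply draw each segment, which is why they are easier to assemble but cannot be shaded by region.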

  14. The Application of Synthetically Focused Imaging Techniques for High Resolution Guided Wave Pipe Inspection

    NASA Astrophysics Data System (ADS)

    Davies, J.; Cawley, P.

    2007-03-01

    Synthetically focused guided wave imaging techniques have previously been employed for plate-like structures. Much work has been done using algorithms such as the synthetic aperture focusing technique (SAFT), the total focusing method (TFM) and the common source method (CSM), using both linear and circular arrays. The resolutions of these algorithms for the plate case are well known. We have attempted to use these algorithms for imaging defects in pipes using an array of piezoelectric shear transducers clamped around the pipe circumference. We show that the SAFT and TFM methods both suffer from coherent noise in the image caused by circumferentially propagating wave modes. It is shown that the common source method (CSM) does not suffer from this problem, though the resolution obtained is poorer. Results from the different imaging algorithms are presented for an 8 inch pipe using 50 kHz excitation, both from finite element simulations and laboratory experiments.
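    The delay-and-sum idea behind TFM can be sketched for a synthetic point scatterer. The geometry and units below are toy values, and the model ignores dispersion and the circumferential modes discussed above; it only shows the core algorithm of summing every transmit/receive trace at each pixel's time of flight:

```python
import math

C = 1.0                                    # wave speed (arbitrary units)
ELEMENTS = [(x, 0.0) for x in (0.0, 10.0, 20.0, 30.0)]
SCATTERER = (15.0, 20.0)

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def pulse(t, sigma=0.5):
    return math.exp(-(t / sigma) ** 2)     # Gaussian echo envelope

# Synthetic full-matrix capture: one echo trace per transmit/receive pair,
# written in closed form here instead of sampled data.
def trace(i, j, t):
    tof = (dist(ELEMENTS[i], SCATTERER) + dist(ELEMENTS[j], SCATTERER)) / C
    return pulse(t - tof)

def tfm_pixel(p):
    """TFM: sum every tx/rx trace at that pair's time of flight to pixel p."""
    total = 0.0
    for i in range(len(ELEMENTS)):
        for j in range(len(ELEMENTS)):
            tof = (dist(ELEMENTS[i], p) + dist(ELEMENTS[j], p)) / C
            total += trace(i, j, tof)
    return total

# Image a coarse grid and locate the brightest pixel
grid = [(x, z) for x in range(0, 31) for z in range(10, 31)]
best = max(grid, key=tfm_pixel)
print(best)  # (15, 20): the point scatterer is recovered
```

SAFT and CSM differ mainly in which transmit/receive combinations enter the sum; the coherent-noise problem arises when unmodeled wave paths also satisfy a pixel's delay law.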

  15. Applications of variable focus liquid lenses for curvature wave-front sensors in astronomy

    NASA Astrophysics Data System (ADS)

    Fuentes-Fernández, J.; Cuevas, S.; Alvarez-Nuñez, L. C.; Watson, A. M.

    2014-08-01

    Curvature wavefront sensors obtain the wave-front aberrations from two defocused intensity images at each side of the pupil plane. Typically, when high modulation speeds are required, as is the case with Adaptive Optics, this defocusing is done with a fast vibrating membrane mirror. We propose an alternative defocusing mechanism based on an electrowetting variable focus liquid lens. The use of such lenses may perform the required focus modulation without the need for extra moving parts, reducing the overall size of the system.

  16. Focused ion beam post-processing of optical fiber Fabry-Perot cavities for sensing applications.

    PubMed

    André, Ricardo M; Pevec, Simon; Becker, Martin; Dellith, Jan; Rothhardt, Manfred; Marques, Manuel B; Donlagic, Denis; Bartelt, Hartmut; Frazão, Orlando

    2014-06-01

    Focused ion beam technology is combined with chemical etching of specifically designed fibers to create Fabry-Perot interferometers. Hydrofluoric acid is used to etch special fibers and create microwires with diameters of 15 μm. These microwires are then milled with a focused ion beam to create two different structures: an indented Fabry-Perot structure and a cantilever Fabry-Perot structure that are characterized in terms of temperature. The cantilever structure is also sensitive to vibrations and is capable of measuring frequencies in the range 1 Hz - 40 kHz. PMID:24921506

  17. On Software Compatibility.

    ERIC Educational Resources Information Center

    Ershov, Andrei P.

    The problem of software compatibility hampers the development of computer applications. One solution lies in the standardization of languages, terms, peripherals, operating systems, and computer characteristics. (AB)

  18. ASR Application in Climate Change Adaptation: The Need, Issues and Research Focus

    EPA Science Inventory

    This presentation will focus on four key points: (a) Aquifer storage and recovery: a long-held practice offering a potential tool for climate change adaptation, (b) The drivers: 1) hydrological perturbations related to climate change, 2) water imbalance in both Q and V between wat...

  19. Solution of the lossy nonlinear Tricomi equation with application to sonic boom focusing

    NASA Astrophysics Data System (ADS)

    Salamone, Joseph A., III

    Sonic boom focusing theory has been augmented with new terms that account for mean flow effects in the direction of propagation and also for atmospheric absorption/dispersion due to molecular relaxation of oxygen and nitrogen. The newly derived model equation was numerically implemented in a computer code. The computer code was numerically validated using a spectral solution for nonlinear propagation of a sinusoid through a lossy homogeneous medium. An additional numerical check was performed to verify the linear diffraction component of the code calculations. The computer code was experimentally validated using measured sonic boom focusing data from the NASA-sponsored Superboom Caustic Analysis and Measurement Program (SCAMP) flight test. The computer code was in good agreement with both the numerical and experimental validation. The newly developed code was applied to examine the focusing of a NASA low-boom demonstration vehicle concept. The resulting pressure field was calculated for several supersonic climb profiles. The shaping efforts designed into the signatures were still somewhat evident despite the effects of sonic boom focusing.

  20. Global Focused Identification of Germplasm Strategy (FIGS) application on Trifolium repens L.

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Trifolium repens L. is a legume species extensively used in grass pastures. Traits such as level of cyanogenic glucosides and flower production are important in breeding productive and nutritious varieties. The Focused Identification of Germplasm Strategy (FIGS) is an approach used to screen large g...

  1. Means for the focusing and acceleration of parallel beams of charged particles. [Patent application

    DOEpatents

    Maschke, A.W.

    1980-09-23

    Apparatus for focusing beams of charged particles comprising planar arrays of electrostatic quadrupoles. The array may be assembled from a single component which comprises a support plate containing uniform rows of poles. Each pole is separated by a hole through the plate designed to pass a beam. Two such plates may be positioned with their poles intermeshed to form a plurality of quadrupoles.

  2. Ecologically-focused Calibration of Hydrological Models for Environmental Flow Applications

    NASA Astrophysics Data System (ADS)

    Adams, S. K.; Bledsoe, B. P.

    2015-12-01

    Hydrologic alteration resulting from watershed urbanization is a common cause of aquatic ecosystem degradation. Developing environmental flow criteria for urbanizing watersheds requires quantitative flow-ecology relationships that describe biological responses to streamflow alteration. Ideally, gaged flow data are used to develop flow-ecology relationships; however, biological monitoring sites are frequently ungaged. For these ungaged locations, hydrologic models must be used to predict streamflow characteristics through calibration and testing at gaged sites, followed by extrapolation to ungaged sites. Physically-based modeling of rainfall-runoff response has frequently utilized "best overall fit" calibration criteria, such as the Nash-Sutcliffe Efficiency (NSE), that do not necessarily focus on specific aspects of the flow regime relevant to biota of interest. This study investigates the utility of employing flow characteristics known a priori to influence regional biological endpoints as "ecologically-focused" calibration criteria compared to traditional, "best overall fit" criteria. For this study, 19 continuous HEC-HMS 4.0 models were created in coastal southern California and calibrated to hourly USGS streamflow gages with nearby biological monitoring sites using one "best overall fit" and three "ecologically-focused" criteria: NSE, Richards-Baker Flashiness Index (RBI), percent of time when the flow is < 1 cfs (%<1), and a Combined Calibration (RBI and %<1). Calibrated models were compared using calibration accuracy, environmental flow metric reproducibility, and the strength of flow-ecology relationships. Results indicate that "ecologically-focused" criteria can be calibrated with high accuracy and may provide stronger flow-ecology relationships than "best overall fit" criteria, especially when multiple "ecologically-focused" criteria are used in concert, despite an inability to accurately reproduce additional types of ecological flow metrics to which the
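    The calibration criteria named above can be sketched directly from their standard definitions (assuming an evenly spaced discharge series; the data and 1 cfs threshold here are illustrative):

```python
def nse(obs, sim):
    """Nash-Sutcliffe Efficiency: 1 is a perfect fit, < 0 is worse than the mean."""
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

def rbi(q):
    """Richards-Baker Flashiness Index: hydrograph path length over total flow."""
    return sum(abs(q[i] - q[i - 1]) for i in range(1, len(q))) / sum(q)

def pct_below(q, threshold=1.0):
    """Fraction of time steps with flow below the threshold (e.g. 1 cfs)."""
    return sum(1 for v in q if v < threshold) / len(q)

obs = [0.2, 0.5, 8.0, 3.0, 1.2, 0.6, 0.3, 0.2]   # flashy, urban-style hydrograph
sim = [0.3, 0.6, 7.0, 3.5, 1.0, 0.7, 0.4, 0.2]
print(round(nse(obs, sim), 3))
print(round(rbi(obs), 3))
print(pct_below(obs))  # 0.625: five of eight steps below 1 cfs
```

An "ecologically-focused" calibration would tune model parameters to match RBI and %<1 at the gage rather than maximizing NSE alone.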

  3. The Life Cycle Application of Intelligent Software Modeling for the First Materials Science Research Rack

    NASA Technical Reports Server (NTRS)

    Rice, Amanda; Parris, Frank; Nerren, Philip

    2000-01-01

    Marshall Space Flight Center (MSFC) has been funding development of intelligent software models to benefit payload ground operations for nearly a decade. Experience gained from simulator development and real-time monitoring and control is being applied to engineering design, testing, and operation of the First Materials Science Research Rack (MSRR-1). MSRR-1 is the first rack in a suite of three racks comprising the Materials Science Research Facility (MSRF) which will operate on the International Space Station (ISS). The MSRF will accommodate advanced microgravity investigations in areas such as the solidification of metals and alloys, thermo-physical properties of polymers, crystal growth studies of semiconductor materials, and research in ceramics and glasses. The MSRR-1 is a joint venture between NASA and the European Space Agency (ESA) to study the behavior of different materials during high temperature processing in a low gravity environment. The planned MSRR-1 mission duration is five (5) years on-orbit and the total design life is ten (10) years. The MSRR-1 launch is scheduled on the third Utilization Flight (UF-3) to ISS, currently February 2003. The objective of MSRR-1 is to provide an early capability on the ISS to conduct material science, materials technology, and space product research investigations in microgravity. It will provide a modular, multi-user facility for microgravity research in materials crystal growth and solidification. An intelligent software model of MSRR-1 is under development and will serve multiple purposes to support the engineering analysis, testing, training, and operational phases of the MSRR-1 life cycle development. The G2 real-time expert system software environment developed by Gensym Corporation was selected as the intelligent system shell for this development work based on past experience gained and the effectiveness of the programming environment.
    Our approach of multi-use of the simulation model and

  4. Computerized accounting for the dental office. Using horizontal applications general ledger software.

    PubMed

    Garsson, B

    1988-01-01

    Remember that computer software is designed for accrual accounting, whereas your business operates and reports income on a cash basis. The rules of tax law stipulate that professional practices may use the cash method of accounting, but if accrual accounting is ever used to report taxable income the government may not permit a switch back to cash accounting. Therefore, always consider the computer as a bookkeeper, not a substitute for a qualified accountant. (Your accountant will have readily accessible payroll and general ledger data available for analysis and tax reports, thanks to the magic of computer processing.) Accounts Payable reports are interfaced with the general ledger and are of interest for transaction detail, open invoice and cash flow analysis, and for a record of payments by vendor. Payroll reports, including check register and withholding detail are provided and interfaced with the general ledger. The use of accounting software expands the use of in-office computers to areas beyond professional billing and insurance form generation. It simplifies payroll recordkeeping; maintains payables details; integrates payables, receivables, and payroll with general ledger files; provides instantaneous information on all aspects of the business office; and creates a continuous "audit-trail" following the entering of data. The availability of packaged accounting software allows the professional business office an array of choices. The person(s) responsible for bookkeeping and accounting should choose carefully, ensuring that any system is easy to use, has been thoroughly tested, and provides at least as much control over office records as has been outlined in this article. PMID:3422198
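    The cash-versus-accrual distinction the author warns about can be sketched with a hypothetical two-entry ledger (the dates and amounts are illustrative, not from any real accounting package):

```python
from datetime import date

# Each transaction: (invoice date, payment date, amount)
TRANSACTIONS = [
    (date(1988, 12, 20), date(1989, 1, 10), 300.0),   # billed in Dec, paid in Jan
    (date(1988, 11, 5), date(1988, 11, 30), 150.0),   # billed and paid in Nov
]

def income(transactions, year, basis):
    """Accrual basis recognizes income at invoicing; cash basis at payment."""
    key = {"accrual": 0, "cash": 1}[basis]
    return sum(amt for *dates, amt in transactions if dates[key].year == year)

print(income(TRANSACTIONS, 1988, "accrual"))  # 450.0: both invoices dated 1988
print(income(TRANSACTIONS, 1988, "cash"))     # 150.0: only one payment received in 1988
```

The same ledger thus reports different taxable income under the two methods, which is why the software's accrual-style reports must be reinterpreted by an accountant for a cash-basis practice.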

  5. Interfacing US Census map files with statistical graphics software: Application and use in epidemiology. Revision 1

    SciTech Connect

    Rizzardi, M.; Mohr, M.S.; Merrill, D.W.; Selvin, S.

    1993-03-01

    In 1990, the United States Bureau of the Census released detailed geographic map files known as TIGER/Line (Topologically Integrated Geographic Encoding and Referencing). The TIGER files, accessible through purchase or Federal repository libraries, contain 24 billion characters of data describing various geographic features including coastlines, hydrography, transportation networks, political boundaries, etc. covering the entire United States. Many of these physical features are of potential interest in epidemiological case studies. Unfortunately, the TIGER database only provides raw alphanumeric data; no utility software, graphical or otherwise, is included. Recently, the S statistical software package has been extended to include a map display function. The map function augments S's high-level approach toward statistical analysis and graphical display of data. Coupling this statistical software with the map database developed for US Census data collection will facilitate epidemiological research. We discuss the technical background necessary to utilize the TIGER database for mapping with S. Two types of S maps, segment-based and polygon-based, are discussed along with methods to construct them from TIGER data. Polygon-based maps are useful for displaying regional statistical data; e.g., disease rates or incidence at the census tract level. Segment-based maps are easier to assemble and appropriate if the data are not regionalized. Census tract data of AIDS incidence in San Francisco (CA) and lung cancer case locations relative to petrochemical refinery sites in Contra Costa County (CA) are used to illustrate the methods and potential uses of interfacing the TIGER database with S.
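
The polygon-assembly step the abstract mentions can be illustrated with a small sketch that chains unordered boundary segments into a closed ring (the coordinates are invented examples; this is not TIGER/Line parsing code):

```python
# Chain unordered (start, end) boundary segments into a closed
# polygon ring, the kind of assembly needed to build polygon-based
# maps from segment records.
def chain_segments(segments):
    """Order (start, end) segments into a closed ring of vertices."""
    segs = list(segments)
    ring = list(segs.pop(0))              # seed the chain with one segment
    while segs:
        for i, (a, b) in enumerate(segs):
            if a == ring[-1]:             # segment continues the chain
                ring.append(b)
                segs.pop(i)
                break
            if b == ring[-1]:             # segment is stored reversed
                ring.append(a)
                segs.pop(i)
                break
        else:
            raise ValueError("segments do not form a single chain")
    return ring

square = [((0, 0), (1, 0)), ((1, 1), (0, 1)), ((1, 0), (1, 1)), ((0, 1), (0, 0))]
print(chain_segments(square))
# [(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]
```

Segment-based maps skip this chaining entirely, which is why the abstract calls them easier to assemble.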

  6. Agent Building Software

    NASA Technical Reports Server (NTRS)

    2000-01-01

    AgentBuilder is a software component developed under an SBIR contract between Reticular Systems, Inc., and Goddard Space Flight Center. AgentBuilder allows software developers without experience in intelligent agent technologies to easily build software applications using intelligent agents. Agents are components of software that will perform tasks automatically, with no intervention or command from a user. AgentBuilder reduces the time and cost of developing agent systems and provides a simple mechanism for implementing high-performance agent systems.

  7. Experimentation in software engineering

    NASA Technical Reports Server (NTRS)

    Basili, V. R.; Selby, R. W.; Hutchens, D. H.

    1986-01-01

    Experimentation in software engineering supports the advancement of the field through an iterative learning process. In this paper, a framework for analyzing most of the experimental work performed in software engineering over the past several years is presented. A variety of experiments in the framework are described and their contributions to the software engineering discipline are discussed. Some useful recommendations for the application of the experimental process in software engineering are included.

  8. EBT2 dosimetry of x-rays produced by the electron beam from a Plasma Focus for medical applications

    NASA Astrophysics Data System (ADS)

    Ceccolini, E.; Rocchi, F.; Mostacci, D.; Sumini, M.; Tartari, A.; Mariotti, F.

    2012-09-01

    The electron beam emitted from the back of Plasma Focus devices is being studied as a radiation source for intraoperative radiation therapy applications. A Plasma Focus device is being developed to this aim, to be utilized as an x-ray source. The electron beam is driven to impinge on a 50 μm brass foil, where conversion x-rays are generated. Measurements with gafchromic film are performed to analyse the attenuation of the x-ray beam and to predict the dose delivered to the cell cultures in the radiobiological experiments to follow.
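
The attenuation analysis rests on Beer-Lambert exponential attenuation. A minimal sketch, using an illustrative attenuation coefficient rather than a value from the paper:

```python
import math

def transmitted_fraction(mu_cm, thickness_cm):
    """Beer-Lambert attenuation: I/I0 = exp(-mu * x)."""
    return math.exp(-mu_cm * thickness_cm)

# Hypothetical linear attenuation coefficient for a low-energy x-ray
# beam in water-like material (illustrative value only).
mu = 0.5   # cm^-1
for x in (0.5, 1.0, 2.0):
    print(f"{x:.1f} cm: I/I0 = {transmitted_fraction(mu, x):.3f}")
```

Fitting measured film doses at several depths to this curve yields the effective attenuation coefficient of the beam.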

  9. EBT2 dosimetry of x-rays produced by the electron beam from a Plasma Focus for medical applications

    SciTech Connect

    Ceccolini, E.; Mostacci, D.; Sumini, M.; Rocchi, F.; Tartari, A.; Mariotti, F.

    2012-09-01

    The electron beam emitted from the back of Plasma Focus devices is being studied as a radiation source for intraoperative radiation therapy applications. A Plasma Focus device is being developed to this aim, to be utilized as an x-ray source. The electron beam is driven to impinge on a 50 μm brass foil, where conversion x-rays are generated. Measurements with gafchromic film are performed to analyse the attenuation of the x-ray beam and to predict the dose delivered to the cell cultures in the radiobiological experiments to follow.

  10. Microfabrication and Test of a Three-Dimensional Polymer Hydro-Focusing Unit for Flow Cytometry Applications

    NASA Technical Reports Server (NTRS)

    Yang, Ren; Feedback, Daniel L.; Wang, Wanjun

    2004-01-01

    This paper details a novel three-dimensional (3D) hydro-focusing micro cell sorter for micro flow cytometry applications. The unit was micro-fabricated by means of SU-8 3D lithography. The 3D microstructure for coaxial sheathing was designed, micro-fabricated, and tested. Three-dimensional hydrofocusing capability was demonstrated with an experiment to sort labeled tanned sheep erythrocytes (red blood cells). This polymer hydro-focusing microstructure is easily micro-fabricated and integrated with other polymer microfluidic structures.

  11. An FPGA hardware/software co-design towards evolvable spiking neural networks for robotics application.

    PubMed

    Johnston, S P; Prasad, G; Maguire, L; McGinnity, T M

    2010-12-01

    This paper presents an approach that permits the effective hardware realization of a novel Evolvable Spiking Neural Network (ESNN) paradigm on Field Programmable Gate Arrays (FPGAs). The ESNN possesses a hybrid learning algorithm that consists of a Spike Timing Dependent Plasticity (STDP) mechanism fused with a Genetic Algorithm (GA). The design and implementation direction utilizes the latest advancements in FPGA technology to provide a partitioned hardware/software co-design solution. The approach achieves the maximum FPGA flexibility obtainable for the ESNN paradigm. The algorithm was applied as an embedded intelligent system robotic controller to solve an autonomous navigation and obstacle avoidance problem. PMID:21117269
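
The hybrid learning algorithm pairs an STDP mechanism with a GA. A minimal sketch of both ingredients, with all parameter values (A+, A-, tau, mutation rate) chosen for illustration rather than taken from the paper:

```python
import math
import random

def stdp_dw(dt_ms, a_plus=0.1, a_minus=0.12, tau_ms=20.0):
    """Weight change for a post-minus-pre spike time difference."""
    if dt_ms > 0:                                   # pre fired before post
        return a_plus * math.exp(-dt_ms / tau_ms)   # potentiate
    return -a_minus * math.exp(dt_ms / tau_ms)      # depress

def mutate(weights, rate=0.1, scale=0.05, rng=None):
    """GA-style mutation: add Gaussian jitter to a fraction of weights."""
    rng = rng or random.Random(0)
    return [w + rng.gauss(0, scale) if rng.random() < rate else w
            for w in weights]

print(round(stdp_dw(5.0), 4), round(stdp_dw(-5.0), 4))
print(mutate([0.5] * 5))
```

In the hardware/software partition described above, the fast per-spike STDP update is the natural candidate for FPGA fabric, while the slower generational GA step can run in software.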

  12. Applications of the Lithium Focused Ion Beam: Nanoscale Electrochemistry and Microdisk Mode Imaging

    NASA Astrophysics Data System (ADS)

    McGehee, William; Takeuchi, Saya; Michels, Thomas; Oleshko, Vladimir; Aksyuk, Vladimir; Soles, Christopher; McClelland, Jabez; Center for Nanoscale Science; Technology at NIST Collaboration; Materials Measurement Laboratory at NIST Collaboration

    2016-05-01

    The NIST-developed lithium Focused-Ion-Beam (LiFIB) system creates a low-energy, picoampere-scale ion beam from a photoionized gas of laser-cooled atoms. The ion beam can be focused to a <30 nm spot and scanned across a sample. This enables imaging through collection of ion-induced secondary electrons (similar to SEM) as well as the ability to selectively deposit lithium ions into nanoscale volumes in a material. We exploit this second ability of the LiFIB to selectively "titrate" lithium ions as a means of probing the optical modes in microdisk resonators as well as for exploring nanoscale, Li-ion electrochemistry in battery-relevant materials. We present an overview of both measurements, including imaging of the optical mode in a silicon microdisk and a comparison of FIB and electrochemical lithiation of tin.

  13. Scaffold-Focused Virtual Screening: Prospective Application to the Discovery of TTK Inhibitors

    PubMed Central

    2013-01-01

    We describe and apply a scaffold-focused virtual screen based upon scaffold trees to the mitotic kinase TTK (MPS1). Using level 1 of the scaffold tree, we perform both 2D and 3D similarity searches between a query scaffold and a level 1 scaffold library derived from a 2 million compound library; 98 compounds from 27 unique top-ranked level 1 scaffolds are selected for biochemical screening. We show that this scaffold-focused virtual screen prospectively identifies eight confirmed active compounds that are structurally differentiated from the query compound. In comparison, 100 compounds were selected for biochemical screening using a virtual screen based upon whole molecule similarity resulting in 12 confirmed active compounds that are structurally similar to the query compound. We elucidated the binding mode for four of the eight confirmed scaffold hops to TTK by determining their protein–ligand crystal structures; each represents a ligand-efficient scaffold for inhibitor design. PMID:23672464
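
The 2D similarity ranking step can be sketched as Tanimoto similarity over fingerprint bit sets (the fingerprints below are invented toy examples, not real molecular fingerprints):

```python
# Rank library scaffolds against a query by Tanimoto similarity,
# with fingerprints encoded as Python sets of "on" bit positions.
def tanimoto(fp_a, fp_b):
    """|A & B| / |A | B| for two fingerprint bit sets."""
    if not fp_a and not fp_b:
        return 0.0
    return len(fp_a & fp_b) / len(fp_a | fp_b)

query = {1, 4, 9, 17, 23}
library = {
    "scaffold_A": {1, 4, 9, 17, 23, 40},   # close analogue
    "scaffold_B": {2, 4, 9, 31},           # partial overlap
    "scaffold_C": {50, 61, 72},            # unrelated
}
ranked = sorted(library, key=lambda s: tanimoto(query, library[s]), reverse=True)
print(ranked)   # scaffold_A ranks first
```

Running the comparison at the scaffold level rather than on whole molecules is what lets the screen surface structurally differentiated hits ("scaffold hops") instead of near-duplicates of the query.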

  14. A Practical Software Architecture for Virtual Universities

    ERIC Educational Resources Information Center

    Xiang, Peifeng; Shi, Yuanchun; Qin, Weijun

    2006-01-01

    This article introduces a practical software architecture called CUBES, which focuses on system integration and evolvement for online virtual universities. The key of CUBES is a supporting platform that helps to integrate and evolve heterogeneous educational applications developed by different organizations. Both standardized educational…

  15. Software for Fermat's principle and lenses

    NASA Astrophysics Data System (ADS)

    Mihas, Pavlos

    2012-05-01

    Fermat's principle is considered as a unifying concept. It is usually presented erroneously as a 'least time principle'. In this paper we present some software that shows cases of maxima and minima and the application of Fermat's principle to the problem of focusing in lenses.

  16. Software for Fermat's Principle and Lenses

    ERIC Educational Resources Information Center

    Mihas, Pavlos

    2012-01-01

    Fermat's principle is considered as a unifying concept. It is usually presented erroneously as a "least time principle". In this paper we present some software that shows cases of maxima and minima and the application of Fermat's principle to the problem of focusing in lenses. (Contains 12 figures.)

  17. Focused tandem shock waves in water and their potential application in cancer treatment

    NASA Astrophysics Data System (ADS)

    Lukes, P.; Sunka, P.; Hoffer, P.; Stelmashuk, V.; Pouckova, P.; Zadinova, M.; Zeman, J.; Dibdiak, L.; Kolarova, H.; Tomankova, K.; Binder, S.; Benes, J.

    2014-01-01

    The generator of two focused successive (tandem) shock waves (FTSW) in water, produced by underwater multichannel electrical discharges at two composite electrodes with a time delay between the first and second shock waves of 10 s, was developed. It produces, at the focus, a strong shock wave with a peak positive pressure of up to 80 MPa, followed by a tensile wave with a peak negative pressure of up to MPa, thus generating at the focus a large amount of cavitation. Biological effects of FTSW were demonstrated in vitro on hemolysis of erythrocytes and cell viability of human acute lymphoblastic leukemia cells, as well as on tumor growth delay in ex vivo and in vivo experiments performed with B16 melanoma, T-lymphoma, and R5-28 sarcoma cell lines. It was demonstrated in vivo that FTSW can enhance antitumor effects of chemotherapeutic drugs, such as cisplatin, most likely due to increased permeability of the membrane of cancer cells induced by FTSW. Synergistic cytotoxicity of FTSW with the sonosensitive porphyrin-based drug Photosan on tumor growth was observed, possibly due to the cavitation-induced sonodynamic effect of FTSW.

  18. Conical octopole ion guide: Design, focusing, and its application to the deposition of low energetic clusters

    SciTech Connect

    Roettgen, Martin A.; Judai, Ken; Antonietti, Jean-Marie; Heiz, Ueli; Rauschenbach, Stephan; Kern, Klaus

    2006-01-15

    A design of a radio-frequency (rf) octopole ion guide with truncated conical rods arranged in a conical geometry is presented. The performance is tested in a cluster deposition apparatus used for the soft-landing of size-selected clusters on well-characterized substrates used as a model system in heterogeneous catalysis in ultrahigh vacuum. This device allows us to focus 500 pA of a mass-selected Ni₂₀⁺ cluster ion beam from 9 mm down to a spot size of 2 mm in diameter. The transmittance is 70% ± 5% at an rf voltage of 420 Vpp applied over an amateur radio transceiver with an interposed homemade amplifier-transformer circuit. An increase of the cluster density by a factor of 15 has been achieved. Three ion trajectories are simulated by using SIMION6, which are relevant for this focusing device: transmitted, reflected, and absorbed. The observed effects in the simulations can be successfully explained by the adiabatic approximation. The focusing behavior of the conical octopole lens is demonstrated by experiment and simulations to be a very useful technique for increasing molecule or cluster densities on a substrate and thus reducing deposition time.

  19. Shuttle mission simulator software conceptual design

    NASA Technical Reports Server (NTRS)

    Burke, J. F.

    1973-01-01

    Software conceptual designs (SCD) are presented for meeting the simulator requirements for the shuttle missions. The major areas of the SCD discussed include: malfunction insertion, flight software, applications software, systems software, and computer complex.

  20. An overview of the quantum wavelet transform, focused on earth science applications

    NASA Astrophysics Data System (ADS)

    Shehab, O.; LeMoigne, J.; Lomonaco, S.; Halem, M.

    2015-12-01

    Registering the images from the MODIS system and the OCO-2 satellite is currently being done by classical image registration techniques. One such technique is wavelet transformation. Besides image registration, wavelet transformation is also used in other areas of earth science, for example, processing and compressing signal variation. In this talk, we investigate the applicability of a few quantum wavelet transformation algorithms to perform image registration on the MODIS and OCO-2 data. Most of the known quantum wavelet transformation algorithms are data agnostic. We investigate their applicability in transforming Flexible Representation for Quantum Images. Similarly, we also investigate the applicability of the algorithms in signal variation analysis. We also investigate the transformation of the models into pseudo-boolean functions to implement them on commercially available quantum annealing computers, such as the D-Wave computer located at NASA Ames.
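
The classical counterpart of these quantum transforms is the Haar wavelet step of pairwise averaging and differencing; a minimal one-level sketch on a toy 1-D signal:

```python
import math

def haar_step(signal):
    """One Haar level: return (approximation, detail) coefficients."""
    s = 1 / math.sqrt(2)
    pairs = list(zip(signal[0::2], signal[1::2]))
    approx = [(a + b) * s for a, b in pairs]   # smoothed, half resolution
    detail = [(a - b) * s for a, b in pairs]   # local variation
    return approx, detail

approx, detail = haar_step([4.0, 6.0, 10.0, 12.0])
print(approx)
print(detail)
```

Repeating the step on the approximation coefficients gives the multi-resolution pyramid that wavelet-based image registration operates on.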

  1. Application of automated mass spectrometry deconvolution and identification software for pesticide analysis in surface waters.

    PubMed

    Furtula, Vesna; Derksen, George; Colodey, Alan

    2006-01-01

    A new approach to surface water analysis has been investigated in order to enhance the detection of different organic contaminants in Nathan Creek, British Columbia. Water samples from Nathan Creek were prepared by liquid/liquid extraction using dichloromethane (DCM) as an extraction solvent and analyzed by a gas chromatography-mass spectrometry method in scan mode (GC-MS scan). To increase sensitivity for pesticide detection, acquired scan data were further analyzed by Automated Mass Spectrometry Deconvolution and Identification Software (AMDIS) incorporated into the Agilent Deconvolution Reporting Software (DRS), which also includes mass spectral libraries for 567 pesticides. Extracts were reanalyzed by gas chromatography-mass spectrometry single ion monitoring (GC-MS-SIM) to confirm and quantitate detected pesticides. The pesticides atrazine, dimethoate, diazinon, metalaxyl, myclobutanil, napropamide, oxadiazon, propazine and simazine were detected at three sampling sites on the mainstream of the Nathan Creek. Results of the study are further discussed in terms of detectability and identification level for each pesticide found. The proposed approach of monitoring pesticides in surface waters enables their detection and identification at trace levels. PMID:17090491
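
The library-matching step that follows deconvolution in AMDIS-style workflows is commonly a normalized dot product between an extracted spectrum and a library spectrum. A minimal sketch (the spectra below are invented, not real pesticide spectra):

```python
import math

def dot_score(spec_a, spec_b):
    """Cosine similarity (0..1) over the union of m/z values."""
    mzs = set(spec_a) | set(spec_b)
    num = sum(spec_a.get(m, 0.0) * spec_b.get(m, 0.0) for m in mzs)
    na = math.sqrt(sum(v * v for v in spec_a.values()))
    nb = math.sqrt(sum(v * v for v in spec_b.values()))
    return num / (na * nb) if na and nb else 0.0

# Spectra as {m/z: intensity} dictionaries (illustrative values).
extracted = {200: 999, 173: 420, 158: 310, 93: 150}
library   = {200: 1000, 173: 400, 158: 300, 93: 160, 66: 40}
print(round(dot_score(extracted, library), 3))   # near 1 for a good match
```

Deconvolution matters because it separates co-eluting components before this comparison; scoring a mixed spectrum against the library would depress the match for every component.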

  2. Software-Defined Ultra-wideband Radio Communications: A New RF Technology for Emergency Response Applications

    SciTech Connect

    Nekoogar, F; Dowla, F

    2009-10-19

    Reliable wireless communication links for local-area (short-range) and regional (long-range) reach capabilities are crucial for emergency response to disasters. Lack of a dependable communication system can result in disruptions in the situational awareness between the local responders in the field and the emergency command and control centers. To date, all wireless communications systems such as cell phones and walkie-talkies use narrowband radio frequency (RF) signaling for data communication. However, the hostile radio propagation environment caused by collapsed structures and rubble in various disaster sites results in significant degradation and attenuation of narrowband RF signals, which leads to frequent communication breakdowns. To address the challenges of reliable radio communication in disaster fields, we propose an approach to use ultra-wideband (UWB) or wideband RF waveforms for implementation on Software Defined Radio (SDR) platforms. Ultra-wideband communications has been proven by many research groups to be effective in addressing many of the limitations faced by conventional narrowband radio technologies. In addition, LLNL's radio and wireless team has shown significant success in field deployment of various UWB communications systems for harsh environments based on LLNL's patented UWB modulation and equalization techniques. Furthermore, using a software-defined radio platform for UWB communications offers a great deal of flexibility in operational parameters and helps the radio system to dynamically adapt itself to its environment for optimal performance.

  3. Final Technical Report - Center for Technology for Advanced Scientific Component Software (TASCS)

    SciTech Connect

    Sussman, Alan

    2014-10-21

    This is a final technical report for the University of Maryland work in the SciDAC Center for Technology for Advanced Scientific Component Software (TASCS). The Maryland work focused on software tools for coupling parallel software components built using the Common Component Architecture (CCA) APIs. Those tools are based on the Maryland InterComm software framework that has been used in multiple computational science applications to build large-scale simulations of complex physical systems that employ multiple separately developed codes.

  4. Application of Ion Mobility Spectrometry (IMS) in forensic chemistry and toxicology with focus on biological matrices

    NASA Technical Reports Server (NTRS)

    Bernhard, Werner; Keller, Thomas; Regenscheit, Priska

    1995-01-01

    The IMS (Ion Mobility Spectrometry) instrument 'Ionscan' takes advantage of the fact that trace quantities of illicit drugs are adsorbed on dust particles on clothes, in cars and on other items of evidence. The dust particles are collected on a membrane filter by a special attachment on a vacuum cleaner. The sample is then directly inserted into the spectrometer and can be analyzed immediately. We show casework applications of a forensic chemistry and toxicology laboratory. One new application of IMS in forensic chemistry is the detection of psilocybin in dried mushrooms without any further sample preparation.

  5. The Environment for Application Software Integration and Execution (EASIE), version 1.0. Volume 2: Program integration guide

    NASA Technical Reports Server (NTRS)

    Jones, Kennie H.; Randall, Donald P.; Stallcup, Scott S.; Rowell, Lawrence F.

    1988-01-01

    The Environment for Application Software Integration and Execution, EASIE, provides a methodology and a set of software utility programs to ease the task of coordinating engineering design and analysis codes. EASIE was designed to meet the needs of conceptual design engineers that face the task of integrating many stand-alone engineering analysis programs. Using EASIE, programs are integrated through a relational data base management system. In volume 2, the use of a SYSTEM LIBRARY PROCESSOR is used to construct a DATA DICTIONARY describing all relations defined in the data base, and a TEMPLATE LIBRARY. A TEMPLATE is a description of all subsets of relations (including conditional selection criteria and sorting specifications) to be accessed as input or output for a given application. Together, these form the SYSTEM LIBRARY which is used to automatically produce the data base schema, FORTRAN subroutines to retrieve/store data from/to the data base, and instructions to a generic REVIEWER program providing review/modification of data for a given template. Automation of these functions eliminates much of the tedious, error prone work required by the usual approach to data base integration.
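
The TEMPLATE-to-retrieval-code idea can be sketched as generating a query from a template description. Relation and field names below are invented, and EASIE generated FORTRAN access routines rather than SQL, so this is only an analogy for the automation it describes:

```python
# Turn a template (subset of relations, selection criteria, sorting)
# into retrieval code automatically, in the spirit of EASIE's
# SYSTEM LIBRARY PROCESSOR.
def template_to_sql(template):
    """Generate a SELECT statement from a template dictionary."""
    sql = "SELECT {} FROM {}".format(
        ", ".join(template["fields"]), template["relation"])
    if template.get("where"):
        sql += " WHERE " + template["where"]
    if template.get("order_by"):
        sql += " ORDER BY " + template["order_by"]
    return sql

# Hypothetical template for one analysis code's input data.
wing_loads = {
    "relation": "WING_PANEL",
    "fields": ["panel_id", "load_case", "stress"],
    "where": "stress > 1.0e4",
    "order_by": "stress DESC",
}
print(template_to_sql(wing_loads))
```

Generating the access code from a single data dictionary is what removes the "tedious, error prone work" the abstract mentions: every integrated program reads and writes through the same generated layer.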

  6. A theory for graph-based language specification, analysis, and mapping with application to the development of parallel software

    SciTech Connect

    Bailor, P.D.

    1989-01-01

    This investigation develops a generalized formal language theory model and associated theorems for the specification, analysis, and mapping of graphs and graph-based languages. The developed model is defined as a graph generative system, and the model is analyzed from a set theoretic, formal language, algebraic, and abstract automata perspective. As a result of the analysis, numerous theorems pertaining to the properties of the model, graphs, and graph-based languages are derived. Additionally, the graph generative system model serves as the basis for applying graph-based languages to areas such as the specification and design of software and visual programming. The specific application area emphasized is the use of graph-based languages as user friendly interfaces for wide-spectrum languages that include structures for representing parallelism. The goal of this approach is to provide an effective, efficient, and formal method for the specification, design, and rapid prototyping of parallel software. To demonstrate the utility of the theory and the feasibility of the application, two models of parallel computation are chosen. The widely used Petri net model of parallel computation is formalized as a graph-based language. The Petri net syntax is formally mapped into the corresponding syntax of a Communicating Sequential Processes (CSP) model of parallel computation where CSP is used as the formalism for extended wide-spectrum languages. Finally, the Petri net to CSP mapping is analyzed from a behavioral perspective to demonstrate that the CSP specification formally behaves in a manner equivalent to the Petri net model.
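
The Petri net formalization can be illustrated with a minimal token-game interpreter: a transition is enabled when every input place holds a token, and firing it consumes inputs and produces outputs. The fork/join net below is an invented example, not one from the thesis:

```python
def enabled(marking, transition):
    """A transition fires only when all input places hold a token."""
    inputs, _ = transition
    return all(marking.get(p, 0) >= 1 for p in inputs)

def fire(marking, transition):
    """Consume one token per input place, produce one per output place."""
    inputs, outputs = transition
    m = dict(marking)
    for p in inputs:
        m[p] -= 1
    for p in outputs:
        m[p] = m.get(p, 0) + 1
    return m

fork = (["start"], ["left", "right"])   # one token becomes two (parallelism)
join = (["left", "right"], ["done"])    # two concurrent branches rejoin

m = {"start": 1}
m = fire(m, fork)
assert not enabled(m, fork) and enabled(m, join)
m = fire(m, join)
print(m)
```

The fork/join pattern is exactly the concurrency structure that the thesis maps onto CSP's parallel composition of communicating processes.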

  7. Theory for graph-based language specification, analysis, and mapping with application to the development of parallel software. Doctoral thesis

    SciTech Connect

    Bailor, P.D.

    1989-09-01

    A generalized formal language theory model and associated theorems were developed for the specification, analysis, and mapping of graphs and graph-based languages. The developed model, defined as a graph-generative system, is analyzed from a set theoretic, formal language, algebraic, and abstract automata perspective. As a result of the analysis, numerous theorems pertaining to the properties of the model, graphs, and graph-based languages are derived. The graph generative system model also serves as the basis for applying graph-based languages to areas such as the specification and design of software and visual programming. The specific application area emphasized is the use of graph-based languages as user-friendly interfaces for wide-spectrum languages that include structures for representing parallelism. The goal of this approach is to provide an effective, efficient, and formal method for the specification, design, and rapid prototyping of parallel software. To demonstrate the theory's utility and the feasibility of the application, two models of parallel computation are chosen. The widely used Petri net model of parallel computation is formalized as a graph-based language. The Petri net syntax is formally mapped into the corresponding syntax of a Communicating Sequential Processes (CSP) model of parallel computation where CSP is used as the formalism for extended wide-spectrum languages. Finally, the Petri net to CSP mapping is analyzed to demonstrate that the CSP specification formally behaves in a manner equivalent to the Petri net model.

  8. Software design specification. Part 2: Orbital Flight Test (OFT) detailed design specification. Volume 3: Applications. Book 2: System management

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The functions performed by the systems management (SM) application software are described along with the design employed to accomplish these functions. The operational sequences (OPS) control segments and the cyclic processes they control are defined. The SM specialist function control (SPEC) segments and the display controlled 'on-demand' processes that are invoked by either an OPS or SPEC control segment as a direct result of an item entry to a display are included. Each processing element in the SM application is described including an input/output table and a structured control flow diagram. The flow through the module and other information pertinent to that process and its interfaces to other processes are included.

  9. Ion focusing

    SciTech Connect

    Cooks, Robert Graham; Baird, Zane; Peng, Wen-Ping

    2015-11-10

    The invention generally relates to apparatuses for focusing ions at or above ambient pressure and methods of use thereof. In certain embodiments, the invention provides an apparatus for focusing ions that includes an electrode having a cavity, at least one inlet within the electrode configured to operatively couple with an ionization source, such that discharge generated by the ionization source is injected into the cavity of the electrode, and an outlet. The cavity in the electrode is shaped such that upon application of voltage to the electrode, ions within the cavity are focused and directed to the outlet, which is positioned such that a proximal end of the outlet receives the focused ions and a distal end of the outlet is open to ambient pressure.

  10. A Creative Application of Solution-Focused Counseling: An Integration with Children's Literature and Visual Arts

    ERIC Educational Resources Information Center

    Leggett, Elsa Soto

    2009-01-01

    In order to facilitate young clients' verbal and nonverbal expression of thoughts, feelings, and behaviors, it is necessary to consider therapeutic approaches designed especially for children and adolescents that combine the use of talking and playing (Orton, 1997). This article shares an overview of a creative application of solution-focused…

  11. High intensity focused ultrasound as a tool for tissue engineering: Application to cartilage.

    PubMed

    Nover, Adam B; Hou, Gary Y; Han, Yang; Wang, Shutao; O'Connell, Grace D; Ateshian, Gerard A; Konofagou, Elisa E; Hung, Clark T

    2016-02-01

    This article promotes the use of High Intensity Focused Ultrasound (HIFU) as a tool for affecting the local properties of tissue engineered constructs in vitro. HIFU is a low cost, non-invasive technique used for eliciting focal thermal elevations at variable depths within tissues. HIFU can be used to denature proteins within constructs, leading to decreased permeability and potentially increased local stiffness. Adverse cell viability effects remain restricted to the affected area. The methods described in this article are explored through the scope of articular cartilage tissue engineering and the fabrication of osteochondral constructs, but may be applied to the engineering of a variety of different tissues. PMID:26724968

  12. Computer-controlled High Resolution Arbitrary Waveform Generator (HRAWG) for Focusing Beamforming Applications

    NASA Astrophysics Data System (ADS)

    Assef, Amauri Amorin; Maia, Joaquim Miguel; Costa, Eduardo Tavares

    In advanced ultrasound imaging systems, expensive high-end integrated analog front-ends have been traditionally used to support generation of arbitrary transmit waveforms, in addition to transmit focusing and apodization control. In this paper, we present a cost-effective computer-controlled reconfigurable high-resolution arbitrary waveform generator (HRAWG) that has been designed for ultrasound research, development and teaching at the Federal University of Technology (UTFPR), Brazil. The 8-channel transmit beamformer is fully controlled by a host computer, in which a Matlab GUI with the Field II simulation program allows easy and accurate control over transmission parameters such as waveform, amplitude apodization, and timing.
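
Transmit focusing amounts to delaying each element so that all wavefronts arrive at the focal point together. A minimal sketch for an 8-element linear array, with illustrative pitch, focal depth, and sound speed (not the UTFPR system's parameters):

```python
import math

def focus_delays(n_elem=8, pitch_m=0.3e-3, focus_m=30e-3, c=1540.0):
    """Per-element transmit delays (s), shifted so the earliest is zero."""
    center = (n_elem - 1) / 2.0
    # Distance from each element to the on-axis focal point.
    dists = [math.hypot((i - center) * pitch_m, focus_m) for i in range(n_elem)]
    t_max = max(dists) / c
    # Outer elements (longest path) fire first; center elements wait.
    return [t_max - d / c for d in dists]

delays = focus_delays()
print([round(t * 1e9) for t in delays])   # delays in nanoseconds
```

These delays, together with an apodization weight per channel, are exactly the transmission parameters the Matlab GUI computes and downloads to the 8-channel hardware.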

  13. Characterization of pulsed (plasma focus) neutron source with image plate and application to neutron radiography

    SciTech Connect

    Andola, Sanjay; Niranjan, Ram; Rout, R. K.; Kaushik, T. C.; Gupta, S. C.; Shaikh, A. M.

    2013-02-05

    A plasma focus device of the Mather type developed in-house has been used for the first time for neutron radiography of different objects. The device gives (1.2 ± 0.3) × 10⁹ neutrons per pulse produced by the D-D fusion reaction with a pulse width of 50 ± 5 ns. The method involves exposing the sample to be radiographed to thermalized D-D neutrons and recording the image on Fuji-film BAS-ND image plates. The thermal neutron component of the moderated beam was estimated using two image plates: a conventional IP for X-rays and gamma rays, and an IP doped with Gd for detecting neutrons.

  14. Point Focusing Thermal and Electric Applications Project. Volume 2: Workshop proceedings

    NASA Technical Reports Server (NTRS)

    Landis, K. E. (Editor)

    1979-01-01

    Point focus distributed receiver solar thermal technology for the production of electric power and of industrial process heat is discussed. Thermal power systems are described. Emphasis is on the development of cost effective systems which will accelerate the commercialization and industrialization of plants, using parabolic dish collectors. The characteristics of PFDR systems and the cost targets for major subsystems hardware are identified. Markets for this technology and their size are identified, and expected levelized bus bar energy costs as a function of yearly production level are presented. The present status of the technology development effort is discussed.

  15. The use of Flex as a viable toolkit for astronomy software applications

    NASA Astrophysics Data System (ADS)

    Gillies, Kim; Conti, Alberto; Rogers, Anthony

    2010-07-01

    The challenges facing the developers of user interfaces for astronomy applications have never been greater. Astronomers and engineers often use well-designed commercial and web applications outside their work environment and have come to expect a similar user experience with applications developed for their work tasks. The connectivity provided by the Internet and the ability to work from anywhere can improve user productivity, but it is a challenge to provide the kind of interactivity and responsiveness needed for astronomical applications to web based projects. It is fair to say that browser-based applications have not been adequate for many kinds of workhorse astronomy applications. The Flex/Actionscript framework from Adobe has been used successfully at the Space Telescope Science Institute in a variety of situations that were not possible with other technologies. In this paper, the Flex framework and technology is briefly introduced followed by a discussion of its advantages and disadvantages and how it addresses user expectations. Three astronomy applications will be presented demonstrating the technology capabilities with useful performance data. Flex/Actionscript is not well known within the astronomy development community, and our goal is to demonstrate that it can be the right choice for many astronomy applications.

  16. Application of mixsep software package: Performance verification of male-mixed DNA analysis.

    PubMed

    Hu, Na; Cong, Bin; Gao, Tao; Chen, Yu; Shen, Junyi; Li, Shujin; Ma, Chunling

    2015-08-01

An experimental model of male-mixed DNA (n=297) was constructed according to the mixed DNA construction principle. This comprised the use of the Applied Biosystems (ABI) 7500 quantitative polymerase chain reaction system, with scientific validation of mixture proportion (Mx; root-mean-square error ≤ 0.02). Statistical analysis was performed on locus separation accuracy using mixsep, a DNA mixture separation R-package, and the analytical performance of mixsep was assessed by examining the data distribution pattern of different mixed gradients, short tandem repeat (STR) loci and mixed DNA types. The results showed that locus separation accuracy had a negative linear correlation with the mixed gradient (R = -0.7121). With increasing mixed gradient imbalance, locus separation accuracy first increased and then decreased, with the highest value detected at a gradient of 1:3 (≥ 90%). The mixed gradient, which is the theoretical Mx, was one of the primary factors that influenced the success of mixed DNA analysis. Among the 16 STR loci detected by Identifiler®, the separation accuracy was relatively high (>88%) for loci D5S818, D8S1179 and FGA, whereas the median separation accuracy value was lowest for the D7S820 locus. STR loci with relatively large numbers of allelic drop-out (ADO; >15) were all located in the yellow and red channels, including loci D18S51, D19S433, FGA, TPOX and vWA. These five loci featured low allele peak heights, which was consistent with the low sensitivity of the ABI 3130xl Genetic Analyzer to yellow and red fluorescence. The locus separation accuracy of the mixsep package was substantially different with and without the inclusion of ADO loci; inclusion of ADO significantly reduced the analytical performance of the mixsep package, which was consistent with the lack of an ADO functional module in this software. The present study demonstrated that the mixsep software had a number of advantages and was recommended for analysis of mixed DNA. This
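A computation of the kind behind the reported negative linear correlation between mixture gradient and separation accuracy is a simple least-squares fit. The data points below are hypothetical, chosen only to mimic the trend described in the abstract; this is not the mixsep package.

```python
# Illustrative sketch (not the mixsep package): fitting a linear trend
# between mixture-gradient imbalance and locus separation accuracy.
# The data points are hypothetical, mimicking the reported negative trend.
import statistics

gradients = [1, 3, 5, 7, 9]                 # minor:major ratio 1:1 ... 1:9
accuracy = [0.88, 0.92, 0.85, 0.78, 0.70]   # fraction of loci separated

mx, my = statistics.mean(gradients), statistics.mean(accuracy)
sxx = sum((x - mx) ** 2 for x in gradients)
syy = sum((y - my) ** 2 for y in accuracy)
sxy = sum((x - mx) * (y - my) for x, y in zip(gradients, accuracy))
slope = sxy / sxx                           # negative slope: accuracy falls
r = sxy / (sxx ** 0.5 * syy ** 0.5)         # Pearson correlation coefficient
print(f"slope={slope:.4f}, r={r:.4f}")
```

With these numbers the fit yields a clearly negative correlation, the same qualitative behaviour the study reports for its real data.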

  18. Potentiality of a table top plasma focus as X-ray source: Radiographic applications

    NASA Astrophysics Data System (ADS)

    Pavez, Cristian; Zambra, Marcelo; Veloso, Felipe; Moreno, José; Soto, Leopoldo

    2014-05-01

Radiographic images of various biological and inorganic samples are obtained using the X-ray radiation from a small plasma focus of at least 350 J (880 nF, 40 nH, 28 kV, T/4 ~ 300 ns). The experimental device is operated with hydrogen as the filling gas in a discharge region of roughly 70 cm³. The X-ray radiation is monitored shot by shot by means of a scintillator-photomultiplier system located outside the vacuum chamber, 2.3 m from the emission region. Angular distribution measurements of the accumulated X-ray dose are carried out using TLD-100 dosimeters distributed radially around the center of the emission region. Two dosimeter arrays are placed at two different radial positions outside the discharge chamber; in each array, the TLD dosimeters are uniformly distributed with respect to the symmetry axis of the plasma column. An estimate of the X-ray energy spectrum obtained by means of the filter technique is presented. The potential of this table-top plasma focus is discussed in terms of its size, the quality of the radiographs, the effective equivalent energy and the dosimetric characteristics.
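The filter technique mentioned for estimating effective X-ray energy works by inverting the measured transmission through a foil of known attenuation. A rough sketch follows, assuming a hypothetical aluminium attenuation table and foil thickness; the paper's actual filters and coefficients are not given here.

```python
# Sketch of the filter technique: infer an effective photon energy from
# the transmission through an aluminium foil.  The attenuation table and
# thickness below are hypothetical illustrations, not the paper's data.
import math

# (energy keV, mass attenuation coefficient cm^2/g) for Al, descending mu
mu_table = [(5.0, 193.0), (8.0, 50.3), (10.0, 26.2), (15.0, 7.96)]

def effective_energy(transmission, thickness_cm, density=2.70):
    """Invert T = exp(-mu * rho * t), then interpolate E from the table."""
    mu_meas = -math.log(transmission) / (thickness_cm * density)
    for (e1, m1), (e2, m2) in zip(mu_table, mu_table[1:]):
        if m2 <= mu_meas <= m1:
            # linear interpolation in log(mu) between table neighbours
            f = (math.log(m1) - math.log(mu_meas)) / (math.log(m1) - math.log(m2))
            return e1 + f * (e2 - e1)
    raise ValueError("measured attenuation outside table range")

# example: transmission through a 100 um foil for an intermediate energy
e_eff = effective_energy(math.exp(-36.0 * 0.01 * 2.70), 0.01)
```

In practice several filters of different materials or thicknesses are combined to bracket the effective energy more tightly.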

  19. Experimental demonstration of time-reversed reverberation focusing in a rough waveguide. Application to target detection

    NASA Astrophysics Data System (ADS)

    Sabra, Karim G.; Roux, Philippe; Song, Hee-Chun; Hodgkiss, William; Kuperman, William A.; Akal, Tuncay; Stevenson, Mark R.

    2005-09-01

For most shallow-water waveguides, the backscattered energy measured in a monostatic configuration is dominated by ocean bottom reverberation. A selected time-gated portion of the measured reverberation signals is used to provide a transfer function between a time-reversing array and a corresponding range interval on the bottom. Ultrasonic and at-sea experiments demonstrate the focusing capabilities of a time-reversing array along the rough bottom interface using these reverberation signals only. The iterative time-reversal technique provides robust focusing along the ocean bottom with little signal-processing effort and little a priori information about the environment, and it enhances the detection and localization of proud or buried targets in complex shallow-water environments. A passive implementation of the iterative time-reversal processing is used to construct reflectivity maps, similar to a sonar map but with enhanced contrast for the strongest reflectors (or scatterers) at the water-bottom interface. Ultrasonic and at-sea experiments show that targets lying on the seafloor up to 400 wavelengths from the time-reversing array were detected over the bottom reverberation.
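The core of the time-reversal idea can be shown in a few lines: an echo recorded through a multipath channel, time-reversed and re-emitted through the same channel, recombines coherently at a single instant. This toy 1D sketch uses hypothetical path delays and is not the at-sea processing described above.

```python
# Toy 1D time-reversal demo.  A multipath channel is a set of delayed,
# attenuated copies; delays and amplitudes here are hypothetical.
h = {30: 1.0, 47: 0.6, 66: 0.3}          # path delay (samples) -> gain

def convolve_delays(sig, chan, n):
    """Propagate sig through the delay channel, output length n."""
    out = [0.0] * n
    for t, a in enumerate(sig):
        if a:
            for d, g in chan.items():
                if t + d < n:
                    out[t + d] += a * g
    return out

N = 256
probe = [0.0] * N
probe[0] = 1.0
echo = convolve_delays(probe, h, N)      # reverberation recorded at array
tr = echo[::-1]                          # time reversal
refocus = convolve_delays(tr, h, 2 * N)  # re-emit through the same channel
peak = max(range(len(refocus)), key=lambda t: abs(refocus[t]))
# at the peak instant every path pair adds coherently (sum of gain^2);
# all cross-path arrivals land elsewhere and stay smaller
```

The peak lands at sample N-1 with amplitude 1.0² + 0.6² + 0.3² = 1.45, larger than any single-path cross term, which is precisely why the technique needs so little knowledge of the channel.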

  20. Applications of artificial intelligence to space station and automated software techniques: High level robot command language

    NASA Technical Reports Server (NTRS)

    Mckee, James W.

    1989-01-01

The objective is to develop a system that will allow a person not necessarily skilled in the art of programming robots to quickly and naturally create the necessary data and commands to enable a robot to perform a desired task. The system will use a menu-driven graphical user interface. This interface will allow the user to input data to select objects to be moved. There will be an embedded expert system to process the knowledge about objects and the robot to determine how they are to be moved. There will be automatic path planning to avoid obstacles in the workspace and to create a near-optimum path. The system will contain the software to generate the required robot instructions.
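The abstract does not name the planning algorithm; A* search over an occupancy grid is one standard way to realize obstacle-avoiding, near-optimum paths, sketched here purely for illustration.

```python
# Illustrative A* grid planner (an assumed, standard technique; the
# report does not specify its algorithm).  '#' cells are obstacles.
import heapq

def astar(grid, start, goal):
    rows, cols = len(grid), len(grid[0])
    h = lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1])  # Manhattan
    open_set = [(h(start), 0, start, None)]
    came, gbest = {}, {start: 0}
    while open_set:
        _, g, cell, parent = heapq.heappop(open_set)
        if cell in came:
            continue                       # already expanded with better g
        came[cell] = parent
        if cell == goal:                   # rebuild path by walking parents
            path = []
            while cell:
                path.append(cell)
                cell = came[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] != '#':
                ng = g + 1
                if ng < gbest.get((nr, nc), float('inf')):
                    gbest[(nr, nc)] = ng
                    heapq.heappush(open_set,
                                   (ng + h((nr, nc)), ng, (nr, nc), cell))
    return None

workspace = ["....",
             ".##.",
             ".#..",
             "...."]
path = astar(workspace, (0, 0), (3, 3))
```

Because the Manhattan heuristic never overestimates, the returned path is optimal on the grid, which is the "near optimum" property the system description calls for.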

  1. Module-based Hybrid Uncertainty Quantification for Multi-physics Applications: Theory and Software

    SciTech Connect

    Tong, Charles; Chen, Xiao; Iaccarino, Gianluca; Mittal, Akshay

    2013-10-08

In this project we proposed to develop an innovative uncertainty quantification (UQ) methodology that captures the best of the two competing approaches in UQ, namely, the intrusive and non-intrusive approaches. The idea is to develop the mathematics and the associated computational framework and algorithms to facilitate the use of intrusive or non-intrusive UQ methods in different modules of a multi-physics, multi-module simulation model, in a way that physics code developers for different modules are shielded (as much as possible) from the chores of accounting for the uncertainties introduced by the other modules. As a result of our research and development, we have produced a number of publications, conference presentations, and a software product.
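The non-intrusive half of such a hybrid scheme treats each physics module as a black box and propagates uncertainty by sampling, so module developers never touch solver internals. A minimal Monte Carlo sketch with hypothetical modules (not the project's codes):

```python
# Non-intrusive Monte Carlo UQ sketch.  Both "physics modules" below are
# hypothetical black boxes; only their inputs and outputs are touched.
import random
import statistics

def module_a(k):              # upstream module with uncertain parameter k
    return 1.0 / (1.0 + k)

def module_b(t):              # downstream module consuming module_a output
    return 3.0 * t ** 2

random.seed(0)
samples = []
for _ in range(20000):
    k = random.gauss(1.0, 0.1)           # uncertain input, N(1, 0.1)
    samples.append(module_b(module_a(k)))  # propagate through the chain

mean = statistics.fmean(samples)
std = statistics.stdev(samples)
```

An intrusive method would instead rewrite `module_a` to carry, say, polynomial chaos coefficients through its internals; the project's point is letting each module pick whichever treatment suits it.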

  2. Application of radiation processing in asia and the pacific region: Focus on malaysia

    NASA Astrophysics Data System (ADS)

    Mohd Dahlan, Khairul Zaman HJ.

    1995-09-01

Applications of radiation processing in Malaysia and other developing countries in Asia and the Pacific region are increasing as the countries move toward industrialisation. At present, there are more than 85 gamma facilities and 334 electron accelerators in Asia and the Pacific region, mainly in Japan, the Rep. of Korea and China. The main applications of interest to the region are radiation sterilisation of medical products; radiation crosslinking of wire and cable, heat-shrinkable film and tube, and foam; radiation curing of surface coatings, printing inks and adhesives; radiation vulcanisation of natural rubber latex; radiation processing of agro-industrial waste; radiation treatment of sewage sludge and municipal waste; food irradiation; tissue grafts and radiation synthesis of bioactive materials.

  3. Biomedical Applications of Functionalized Hollow Mesoporous Silica Nanoparticles: Focusing on Molecular Imaging

    PubMed Central

    Shi, Sixiang; Chen, Feng; Cai, Weibo

    2013-01-01

    Hollow mesoporous silica nanoparticles (HMSNs), with a large cavity inside each original mesoporous silica nanoparticle (MSN), have recently gained increasing interest due to their tremendous potential for cancer imaging and therapy. The last several years have witnessed a rapid development in engineering of functionalized HMSNs (i.e. f-HMSNs) with various types of inorganic functional nanocrystals integrated into the system for imaging and therapeutic applications. In this review article, we summarize the recent progress in the design and biological applications of f-HMSNs, with a special emphasis on molecular imaging. Commonly used synthetic strategies for the generation of high quality HMSNs will be discussed in detail, followed by a systematic review of engineered f-HMSNs for optical, positron emission tomography, magnetic resonance, and ultrasound imaging in preclinical studies. Lastly, we also discuss the challenges and future research directions regarding the use of f-HMSNs for cancer imaging and therapy. PMID:24279491

  4. The Life Cycle Application of Intelligent Software Modeling for the First Materials Science Research Rack

    NASA Technical Reports Server (NTRS)

    Rice, Amanda; Parris, Frank; Nerren, Philip

    2000-01-01

Marshall Space Flight Center (MSFC) has been funding development of intelligent software models to benefit payload ground operations for nearly a decade. Experience gained from simulator development and real-time monitoring and control is being applied to engineering design, testing, and operation of the First Materials Science Research Rack (MSRR-1). MSRR-1 is the first rack in a suite of three racks comprising the Materials Science Research Facility (MSRF), which will operate on the International Space Station (ISS). The MSRF will accommodate advanced microgravity investigations in areas such as solidification of metals and alloys, thermo-physical properties of polymers, crystal growth studies of semiconductor materials, and research in ceramics and glasses. The MSRR-1 is a joint venture between NASA and the European Space Agency (ESA) to study the behavior of different materials during high-temperature processing in a low-gravity environment. The planned MSRR-1 mission duration is five (5) years on-orbit and the total design life is ten (10) years. The MSRR-1 launch is scheduled on the third Utilization Flight (UF-3) to ISS, currently planned for February 2003. The objective of MSRR-1 is to provide an early capability on the ISS to conduct material science, materials technology, and space product research investigations in microgravity. It will provide a modular, multi-user facility for microgravity research in materials crystal growth and solidification. An intelligent software model of MSRR-1 is under development and will serve multiple purposes to support the engineering analysis, testing, training, and operational phases of the MSRR-1 life cycle development. The G2 real-time expert system software environment developed by Gensym Corporation was selected as the intelligent system shell for this development work based on past experience gained and the effectiveness of the programming environment. Our approach of multiple uses of the simulation model and

  5. Project management in the development of scientific software

    NASA Astrophysics Data System (ADS)

    Platz, Jochen

    1986-08-01

    This contribution is a rough outline of a comprehensive project management model for the development of software for scientific applications. The model was tested in the unique environment of the Siemens AG Corporate Research and Technology Division. Its focal points are the structuring of project content - the so-called phase organization, the project organization and the planning model used, and its particular applicability to innovative projects. The outline focuses largely on actual project management aspects rather than associated software engineering measures.

  6. Effects of Varying Interactive Strategies Provided by Computer-Based Tutorials for a Software Application Program.

    ERIC Educational Resources Information Center

    Tiemann, Philip W.; Markle, Susan M.

    1990-01-01

    Discussion of interaction in computer-based tutorials (CBT) focuses on a study that compared the performance of adult learners from training with three CBTs that varied the level of interactivity. The degrees of learner control, system control, and domain control are discussed, and the Lotus spreadsheet tutorials used are described. (24…

  7. Application of value-focused thinking on the environmental selection of wall structures.

    PubMed

    Hassan, O A B

    2004-02-01

The decision of selecting building structures with respect to environmental demands is an issue commonly addressed in environmental management. In this paper, the importance of applying the decision analysis technique of value-focused thinking to the environmental selection of wall structures is investigated. In this context, a qualitative value model is developed in which the external and internal environmental factors are considered. The model is applied to a case study in which a decision must be made among three categories of exterior wall structures: wood, masonry and concrete. It is found that the wall structure made of wood is the most compatible option with respect to the external and internal environmental requirements of building structures. PMID:15160743
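Value-focused thinking is often operationalized as an additive value model: each alternative is scored against weighted environmental objectives. A minimal sketch with hypothetical weights and scores (not the paper's data) might look like:

```python
# Additive value model sketch for the wall-selection decision.
# Objectives, weights and scores are hypothetical illustrations only.
weights = {"embodied_energy": 0.40, "indoor_climate": 0.35, "recyclability": 0.25}
scores = {   # 0..1, higher is environmentally better
    "wood":     {"embodied_energy": 0.9, "indoor_climate": 0.8, "recyclability": 0.7},
    "masonry":  {"embodied_energy": 0.5, "indoor_climate": 0.7, "recyclability": 0.6},
    "concrete": {"embodied_energy": 0.3, "indoor_climate": 0.6, "recyclability": 0.5},
}

# overall value = weighted sum over objectives
value = {alt: sum(weights[c] * s[c] for c in weights) for alt, s in scores.items()}
best = max(value, key=value.get)
```

With these illustrative numbers the weighted sum also favours wood, matching the paper's qualitative conclusion; the real model's weights would come from elicited stakeholder values rather than being fixed by hand.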

  8. Iodine enhanced focused-ion-beam etching of silicon for photonic applications

    SciTech Connect

    Schrauwen, Jonathan; Thourhout, Dries van; Baets, Roel

    2007-11-15

Focused-ion-beam etching of silicon enables fast and versatile fabrication of micro- and nanophotonic devices. However, large optical losses due to crystal damage and ion implantation make the devices impractical when the optical mode is confined near the etched region. These losses are shown to be reduced by the local implantation and etching of silicon waveguides with iodine gas enhancement, followed by baking at 300 °C. The excess optical loss in the silicon waveguides drops from 3500 to 1700 dB/cm when iodine gas is used, and is further reduced to 200 dB/cm after baking at 300 °C. We present elemental and chemical surface analyses supporting the conclusion that this is caused by the desorption of iodine from the silicon surface. Finally we present a model to extract the absorption coefficient from the measurements.
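To put the reported losses in perspective, excess loss in dB/cm converts directly to a transmitted power fraction over a given device length; the 50 µm length below is chosen only for illustration.

```python
# Convert the abstract's dB/cm excess-loss figures into transmitted
# power fractions over an illustrative 50 um etched section.
def transmission(loss_db_per_cm, length_um):
    loss_db = loss_db_per_cm * length_um * 1e-4   # um -> cm
    return 10 ** (-loss_db / 10)                  # dB -> power fraction

for loss in (3500, 1700, 200):                    # dB/cm, from the abstract
    print(f"{loss:5d} dB/cm -> {transmission(loss, 50):.1%} transmitted")
```

Over 50 µm, 3500 dB/cm leaves under 2% of the light, while 200 dB/cm after baking transmits roughly 80%, which is why the bake step makes the devices practical.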

  9. Photomechanical ablation of biological tissue induced by focused femtosecond laser and its application for acupuncture

    NASA Astrophysics Data System (ADS)

    Hosokawa, Yoichiroh; Ohta, Mika; Ito, Akihiko; Takaoka, Yutaka

    2013-03-01

Photomechanical laser ablation due to focused femtosecond laser irradiation was induced on the hind legs of living mice, and its clinical influence on muscle cell proliferation was investigated via histological examination and reverse transcriptase-polymerase chain reaction (RT-PCR) analysis of the expression of the gene encoding myostatin, a growth repressor in muscle satellite cells. The histological examination suggested that tissue damage due to the femtosecond laser irradiation was localized to the epidermis and dermis and was hardly induced in the muscle tissue below. On the other hand, expression of the myostatin gene in muscle tissue after laser irradiation was suppressed. Because myostatin represses growth in muscle satellite cells, its suppression facilitates the proliferation of muscle cells. On the basis of these results, we recognize the potential of the femtosecond laser as a tool for noncontact, high-throughput acupuncture in the treatment of muscle disease.

  10. Focused ion beam induced synthesis of antimony nanowires for gas sensor applications

    NASA Astrophysics Data System (ADS)

    Schoendorfer, Christoph; Hetzel, Martin; Pongratz, Peter; Lugstein, Alois; Bertagnolli, Emmerich

    2012-11-01

    In this paper the formation of antimony (Sb) nanowires (NWs) by a focused Ga ion beam approach and their gas sensing capability is reported. The NWs with uniform diameters of only 25 nm and lengths up to several microns are synthesized at predefined positions at room temperature in an ion beam induced self-assembling process. Then individual Sb-NWs are deposited on insulating substrates and provided with gold electrodes. Subsequently sensing characteristics of individual Sb-NWs are investigated at room temperature for H2O, CO, H2, He, O2 and ethanol over a wide concentration range. The Sb-NWs exhibit selective sensing properties for ethanol and H2O with exceptional sensitivities of more than 17 000 and 60 000, respectively.

  11. The application of sparse arrays in high frequency transcranial focused ultrasound therapy: A simulation study

    SciTech Connect

    Pajek, Daniel Hynynen, Kullervo

    2013-12-15

Purpose: Transcranial focused ultrasound is an emerging therapeutic modality that can be used to perform noninvasive neurosurgical procedures. The current clinical transcranial phased array operates at 650 kHz; however, the development of a higher frequency array would enable more precision while reducing the risk of standing waves. The smaller wavelength and the skull's increased distortion at this frequency are problematic: it would require an order of magnitude more elements to create such an array. Random sparse arrays enable steering of a therapeutic array with fewer elements. However, the tradeoffs inherent in the use of sparsity in a transcranial phased array have not been systematically investigated, and so the objective of this simulation study is to investigate the effect of sparsity on transcranial arrays at a frequency of 1.5 MHz, which provides small focal spots for precise exposure control. Methods: Transcranial sonication simulations were conducted using a multilayer Rayleigh-Sommerfeld propagation model. Element size and element population were varied and the phased array's ability to steer was assessed. Results: The focal pressures decreased proportionally as elements were removed. However, off-focus hotspots were generated if a high degree of steering was attempted with very sparse arrays. A phased array consisting of 1588 elements 3 mm in size, a 10% population, was appropriate for steering up to 4 cm in all directions. However, a higher element population would be required if near-skull sonication is desired. Conclusions: This study demonstrated that the development of a sparse, hemispherical array at 1.5 MHz could enable more precision in therapies that utilize lower intensity sonications.
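The finding that focal pressure drops in proportion to the number of active elements follows from summing phased point-source contributions. The toy sketch below uses free-field point sources as a crude stand-in for the paper's multilayer Rayleigh-Sommerfeld model; the geometry and numbers are illustrative only.

```python
# Toy sparse hemispherical array: sum phased point-source contributions
# at the geometric focus, then thin the array to 10% of its elements.
# Free-field point sources are an illustrative stand-in for the paper's
# multilayer Rayleigh-Sommerfeld model; no skull is modeled here.
import cmath
import math
import random

f = 1.5e6                                # drive frequency, Hz
c = 1500.0                               # sound speed in water, m/s
k = 2 * math.pi * f / c                  # wavenumber

random.seed(1)
R = 0.15                                 # hemisphere radius, m
elements = []
while len(elements) < 1000:              # random points on a hemisphere
    x, y, z = (random.uniform(-1, 1) for _ in range(3))
    n = math.sqrt(x * x + y * y + z * z)
    if 0 < n <= 1 and z > 0:
        elements.append((R * x / n, R * y / n, R * z / n))

focus = (0.0, 0.0, 0.0)

def pressure(active, point):
    """|sum of spherical-wave contributions|, elements phased to focus."""
    tot = 0j
    for e in active:
        r_f = math.dist(e, focus)        # phase delay aimed at the focus
        r_p = math.dist(e, point)
        tot += cmath.exp(1j * k * (r_p - r_f)) / r_p
    return abs(tot)

full = pressure(elements, focus)
sparse = pressure(elements[:100], focus)  # random 10% population
# on-focus pressure scales with element count: full/sparse is ~10
```

Off-focus grating-lobe behaviour, the other half of the paper's result, would appear if the same sum were evaluated at steered points away from the geometric focus.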

  12. MARVIN: a medical research application framework based on open source software.

    PubMed

    Rudolph, Tobias; Puls, Marc; Anderegg, Christoph; Ebert, Lars; Broehan, Martina; Rudin, Adrian; Kowal, Jens

    2008-08-01

    This paper describes the open source framework MARVIN for rapid application development in the field of biomedical and clinical research. MARVIN applications consist of modules that can be plugged together in order to provide the functionality required for a specific experimental scenario. Application modules work on a common patient database that is used to store and organize medical data as well as derived data. MARVIN provides a flexible input/output system with support for many file formats including DICOM, various 2D image formats and surface mesh data. Furthermore, it implements an advanced visualization system and interfaces to a wide range of 3D tracking hardware. Since it uses only highly portable libraries, MARVIN applications run on Unix/Linux, Mac OS X and Microsoft Windows. PMID:18541330

  13. Treatment planning of a skin-sparing conical breast brachytherapy applicator using conventional brachytherapy software

    SciTech Connect

    Yang Yun; Melhus, Christopher S.; Sioshansi, Shirin; Rivard, Mark J.

    2011-03-15

Purpose: AccuBoost is a noninvasive image-guided technique for the delivery of partial breast irradiation to the tumor bed and currently serves as an alternative to the conventional electron beam boost. To irradiate the target volume while providing dose sparing to the skin, the round applicator design was augmented through the addition of an internally truncated conical shield and the reduction of the source-to-skin distance. Methods: Brachytherapy dose distributions for two types of conical applicators were simulated and estimated using Monte Carlo (MC) methods for radiation transport and a conventional treatment planning system (TPS). MC-derived and TPS-generated dose volume histograms (DVHs) and dose distribution data were compared for both the conical and round applicators for benchmarking purposes. Results: Agreement using the gamma-index test was ≥99.95% for distance-to-agreement and dose accuracy criteria of 2 mm and 2%, respectively. After observing good agreement, TPS DVHs and dose distributions for the conical and round applicators were obtained and compared. Brachytherapy dose distributions generated using Pinnacle³ for ten CT data sets showed that the parallel-opposed beams of the conical applicators provided similar PTV coverage to the round applicators and reduced the maximum dose to skin, chest wall, and lung by up to 27%, 42%, and 43%, respectively. Conclusions: Brachytherapy dose distributions for the conical applicators have been generated using MC methods and entered into the Pinnacle³ TPS via the Tufts technique. Treatment planning metrics for the conical AccuBoost applicators were significantly improved in comparison to those for the conventional electron beam breast boost.
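The gamma-index test used for benchmarking combines a dose-difference criterion with a distance-to-agreement criterion; a reference point passes if some nearby evaluated dose lies within the combined tolerance ellipse. A 1D sketch with hypothetical profiles (2%/2 mm, as in the abstract):

```python
# 1D gamma-index sketch (2 mm DTA, 2% dose difference, global dmax
# normalization).  The dose profiles below are hypothetical.
import math

def gamma_pass_rate(ref, evl, dx_mm, dta_mm=2.0, dd_frac=0.02):
    """Fraction of reference points with gamma <= 1 against evl."""
    dmax = max(ref)
    passed = 0
    for i, d in enumerate(ref):
        g = min(
            math.sqrt(((i - j) * dx_mm / dta_mm) ** 2
                      + ((e - d) / (dd_frac * dmax)) ** 2)
            for j, e in enumerate(evl)
        )
        passed += g <= 1.0
    return passed / len(ref)

# hypothetical Gaussian dose profile on a 1 mm grid, plus a 1% scaled copy
ref = [100 * math.exp(-((i - 25) / 10.0) ** 2) for i in range(51)]
evl = [d * 1.01 for d in ref]
rate = gamma_pass_rate(ref, evl, dx_mm=1.0)
```

A uniform 1% dose difference passes everywhere under a 2% criterion, so the pass rate here is 100%; the paper's ≥99.95% agreement means the MC and TPS distributions differ by less than the combined 2%/2 mm tolerance almost everywhere.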

  14. Application of symbolic and algebraic manipulation software in solving applied mechanics problems

    NASA Technical Reports Server (NTRS)

    Tsai, Wen-Lang; Kikuchi, Noboru

    1993-01-01

As its name implies, symbolic and algebraic manipulation is an operational tool which not only can retain symbols throughout computations but also can express results in terms of symbols. This report starts with a history of symbolic and algebraic manipulators and a review of the literature. With the help of selected examples, the capabilities of symbolic and algebraic manipulators are demonstrated. Applications to problems of applied mechanics are then presented: the application of automatic formulation to applied mechanics problems, application to a materially nonlinear problem (rigid-plastic ring compression) by the finite element method (FEM), and application to plate problems by FEM. The advantages and difficulties, contributions, education, and perspectives of symbolic and algebraic manipulation are discussed. It is well known that there exist some fundamental difficulties in symbolic and algebraic manipulation, such as internal swelling and mathematical limitations. A remedy for these difficulties is proposed, and the three applications mentioned are solved successfully. For example, the closed-form solution of the stiffness matrix of the four-node isoparametric quadrilateral element for the 2-D elasticity problem was not available before; due to the work presented, its automatic construction becomes feasible. In addition, a newly found advantage of symbolic and algebraic manipulation is believed to be crucial in improving the efficiency of program execution in the future. This will substantially shorten the response time of a system, which is very significant for certain systems, such as missile and high-speed aircraft systems, in which time plays an important role.
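The central idea, carrying symbols rather than numbers through a computation, can be shown with a toy expression-tree differentiator; real work of the kind described relies on full computer algebra systems, so this is purely illustrative.

```python
# Toy symbolic differentiator: expressions are nested tuples
# ('+'|'*', a, b), a variable name, or a number.  Illustrative only;
# a real CAS would also simplify and handle far more operators.
def diff(expr, var):
    if expr == var:
        return 1
    if not isinstance(expr, tuple):
        return 0                                    # constant
    op, a, b = expr
    if op == '+':                                   # sum rule
        return ('+', diff(a, var), diff(b, var))
    if op == '*':                                   # product rule
        return ('+', ('*', diff(a, var), b), ('*', a, diff(b, var)))
    raise ValueError(f"unknown operator {op!r}")

def evaluate(expr, env):
    if isinstance(expr, tuple):
        op, a, b = expr
        a, b = evaluate(a, env), evaluate(b, env)
        return a + b if op == '+' else a * b
    return env.get(expr, expr)

# d/dx (x*x + 3*x): the result is itself a symbolic expression tree,
# which evaluates to 2x + 3 at any numeric x
d = diff(('+', ('*', 'x', 'x'), ('*', 3, 'x')), 'x')
```

The unsimplified derivative tree also hints at the "internal swelling" difficulty the report mentions: without simplification, symbolic results grow rapidly with each operation.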

  15. Software assurance standard

    NASA Technical Reports Server (NTRS)

    1992-01-01

    This standard specifies the software assurance program for the provider of software. It also delineates the assurance activities for the provider and the assurance data that are to be furnished by the provider to the acquirer. In any software development effort, the provider is the entity or individual that actually designs, develops, and implements the software product, while the acquirer is the entity or individual who specifies the requirements and accepts the resulting products. This standard specifies at a high level an overall software assurance program for software developed for and by NASA. Assurance includes the disciplines of quality assurance, quality engineering, verification and validation, nonconformance reporting and corrective action, safety assurance, and security assurance. The application of these disciplines during a software development life cycle is called software assurance. Subsequent lower-level standards will specify the specific processes within these disciplines.

  16. Exploring the focus and experiences of smartphone applications for addiction recovery.

    PubMed

    Savic, Michael; Best, David; Rodda, Simone; Lubman, Dan I

    2013-01-01

    Addiction recovery Smartphone applications (apps) (n = 87) identified on the Google Play store in 2012 were coded, along with app user reviews, to explore functions, foci, and user experiences. Content analysis revealed that apps typically provided information on recovery, as well as content to enhance motivation, promote social support and tools to monitor progress. App users commented that the apps helped to inform them, keep them focussed, inspire them, and connect them with other people and groups. Because few addiction recovery apps appear to have been formally evaluated, further research is needed to ascertain their effectiveness as stand-alone or adjunctive interventions. PMID:24074196

  17. Analysis Software

    NASA Technical Reports Server (NTRS)

    1994-01-01

    General Purpose Boundary Element Solution Technology (GPBEST) software employs the boundary element method of mechanical engineering analysis, as opposed to finite element. It is, according to one of its developers, 10 times faster in data preparation and more accurate than other methods. Its use results in less expensive products because the time between design and manufacturing is shortened. A commercial derivative of a NASA-developed computer code, it is marketed by Best Corporation to solve problems in stress analysis, heat transfer, fluid analysis and yielding and cracking of solids. Other applications include designing tractor and auto parts, household appliances and acoustic analysis.

  18. Seminar Software

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The Society for Computer Simulation International is a professional technical society that distributes information on methodology techniques and uses of computer simulation. The society uses NETS, a NASA-developed program, to assist seminar participants in learning to use neural networks for computer simulation. NETS is a software system modeled after the human brain; it is designed to help scientists exploring artificial intelligence to solve pattern matching problems. Examples from NETS are presented to seminar participants, who can then manipulate, alter or enhance them for their own applications.

  19. Simulation Software

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Various NASA Small Business Innovation Research grants from Marshall Space Flight Center, Langley Research Center and Ames Research Center were used to develop the 'kernel' of COMCO's modeling and simulation software, the PHLEX finite element code. NASA needed it to model designs of flight vehicles; one of many customized commercial applications is UNISIM, a PHLEX-based code for analyzing underground flows in oil reservoirs for Texaco, Inc. COMCO's products simulate a computational mechanics problem, estimate the solution's error and produce the optimal hp-adapted mesh for the accuracy the user chooses. The system is also used as a research or training tool in universities and in mechanical design in industrial corporations.

  20. Software Surrogate

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In 1994, Blackboard Technology received a NASA Phase I SBIR award entitled "A Blackboard-Based Framework for Mixed-Initiative, Crewed- Space-System Applications." This research continued in Phase II at JSC, where a generic architecture was developed in which a software surrogate serves as the operator's representative in the fast-paced realm of nearly autonomous, intelligent systems. This SBIR research effort addressed the need to support human-operator monitoring and intervention with intelligent systems such as those being developed for NASA's crewed space program.

  1. The Application of New Software Technology to the Architecture of the National Cycle Program

    NASA Technical Reports Server (NTRS)

    Schoeffler, James D.

    1997-01-01

As part of the Numerical Propulsion System Simulation (NPSS) effort of NASA Lewis in conjunction with the United States aeropropulsion industry, a new system simulation framework, the National Cycle Program (NCP), capable of combining existing empirical engine models with new detailed component-based computational models, is being developed. The software architecture of the NCP program involves a generalized object-oriented framework and a base set of engine component models, along with supporting tool kits, which will support engine simulation in a distributed environment. As the models are extended to contain two and three dimensions, the computing load increases rapidly, and it is intended that this load be distributed across multiple workstations executing concurrently in order to get acceptably fast results. The research carried out was directed toward performance analysis of the distributed object system; more specifically, the performance of the actor-based distributed object design I created earlier was of particular interest. To this end, the research was directed toward the design and implementation of suitable performance-analysis techniques and software to demonstrate those techniques. There were three specific results, which are reported in two separate reports submitted as NASA Technical Memoranda. The results are: (1) Design, implementation, and testing of a performance analysis program for a set of active objects (actor-based objects) which allowed the individual actors to be assigned to arbitrary processes on an arbitrary set of machines. (2) The global-balance-equation approach has the fundamental limitation that the number of equations increases exponentially with the number of actors; unlike many approximate approaches to this problem, the nearest-neighbor approach allows checking of the solution and an estimate of the error. The technique was demonstrated in a prototype analysis program as part of this research.
The results of the program were

  2. Application of FE software Elmer to the modeling of crustal-scale processes

    NASA Astrophysics Data System (ADS)

    Maierová, Petra; Guy, Alexandra; Lexa, Ondrej; Cadek, Ondrej

    2010-05-01

We extended Elmer (the open-source finite element software for multiphysical problems, http://www.csc.fi/english/pages/elmer) with user-written procedures for the two-dimensional modeling of crustal-scale processes. The standard version of Elmer is an appropriate tool for modeling thermomechanical convection with non-linear viscous rheology. In geophysics, it might be suitable for some types of mantle convection modeling. Unlike the mantle, the crust is very heterogeneous. It consists of materials with distinct rheological properties that are subject to highly varied conditions: low pressure and temperature near the surface of the Earth and relatively high pressure and temperature at a depth of several tens of kilometers. Moreover, the deformation in the upper crust is mostly brittle, and the strain is concentrated into narrow shear zones and thrusts. In order to simulate the brittle behavior of the crust, we implemented pressure-dependent visco-plastic rheology. The material heterogeneity and chemical convection are implemented in terms of active markers. Another special feature of the crust, the moving free surface, is already included in Elmer by means of a moving computational grid. Erosion can easily be added in this scheme. We tested the properties of our formulation of plastic flow on several numerical experiments simulating the deformation of material under compressional and extensional stresses. In the first step, we examined the angles of shear zones that form in a plastically deforming material for different material parameters and grid resolutions. A more complex setting of "sandbox-type" experiments containing heterogeneous material, strain-softening, and boundary friction was considered as a next testing case. To illustrate the abilities of the extended Elmer software in crustal deformation studies, we present two models of geological processes: diapirism of the lower crust and a channel flow forced by indentation. Both these processes are assumed to take
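The abstract does not specify the form of the pressure-dependent visco-plastic rheology. One common formulation (assumed here, not necessarily the authors') caps the viscous stress with a Drucker-Prager-like yield stress; a minimal sketch:

```python
import math

def effective_viscosity(eta_viscous, pressure, strain_rate_II,
                        cohesion=20e6, friction_angle_deg=30.0):
    """Cap the viscous viscosity with a pressure-dependent yield stress,
    sigma_y = C*cos(phi) + P*sin(phi) (Drucker-Prager-like).  Where the
    viscous stress 2*eta*e_II would exceed sigma_y, the effective
    viscosity is reduced so the stress stays on the yield surface."""
    phi = math.radians(friction_angle_deg)
    sigma_y = cohesion * math.cos(phi) + pressure * math.sin(phi)
    eta_plastic = sigma_y / (2.0 * strain_rate_II)
    return min(eta_viscous, eta_plastic)

# Near the surface (low pressure) the material yields at low stress; at
# depth the yield stress, and hence the viscosity cap, is much higher.
shallow = effective_viscosity(1e23, pressure=1e7, strain_rate_II=1e-14)
deep = effective_viscosity(1e23, pressure=1e9, strain_rate_II=1e-14)
```

In a marker-based code, this effective viscosity would be evaluated per marker (or per integration point) each nonlinear iteration; the material parameters above are illustrative only.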

  3. Decision support systems and applications in ophthalmology: literature and commercial review focused on mobile apps.

    PubMed

    de la Torre-Díez, Isabel; Martínez-Pérez, Borja; López-Coronado, Miguel; Díaz, Javier Rodríguez; López, Miguel Maldonado

    2015-01-01

The growing importance of mobile devices in daily life has also reached health care and medicine. This is changing the health care paradigm and making the concept of mHealth, or mobile health, more relevant, whose main essence is the apps. This new reality makes it possible for doctors who are not specialists to have easy access to all the information generated in different corners of the world, making them potential keepers of that knowledge. However, the volume of new daily information exceeds the limits of the human intellect, making Decision Support Systems (DSS) necessary to help doctors diagnose diseases and to decide what attitude to take towards these diagnoses. Such systems could improve health care in remote areas and developing countries. All of this is even more important for diseases that are prevalent in primary care and that directly affect people's quality of life; this is the case for ophthalmological problems, where first patient care does not involve a specialist in ophthalmology. The goal of this paper is to analyze the state of the art of DSS in ophthalmology, many of which focus on diseases affecting the eye's posterior pole. To achieve the main purpose of this research work, a literature review and an analysis of commercial apps were carried out. The databases and systems used were IEEE Xplore, Web of Science (WoS), Scopus, and PubMed. The search was limited to articles published from 2000 until now. Different Mobile Decision Support Systems (MDSS) in ophthalmology were then analyzed in the virtual stores for Android and iOS. Of a total of 600 articles found in the above-cited databases, 37 were selected according to their theme (posterior pole, anterior pole, Electronic Health Records (EHRs), cloud, data mining, algorithms and structures for DSS, and other). Very few mobile apps were found in the different stores. It can be concluded that almost all existing mobile apps are focused on the eye's posterior

  4. History and modern applications of nano-composite materials carrying GA/cm2 current density due to a Bose-Einstein Condensate at room temperature produced by Focused Electron Beam Induced Processing for many extraordinary novel technical applications

    NASA Astrophysics Data System (ADS)

    Koops, Hans W. P.

    2015-12-01

The discovery of Focused Electron Beam Induced Processing and early applications of this technology led to the possible use of a novel nanogranular material, “Koops-GranMat®”, based on Pt/C and Au/C material, which carries at room temperature a current density more than 50 times the current density that high-TC superconductors can carry. The explanation for the characteristics of this novel material is given. This fact allows producing novel products for many applications using a Dual Beam system having a gas supply and X.Y.T stream-data programming, rather than GDSII layout pattern control software. Novel products are possible for energy transportation, distribution, and switching; photon detection above 65 meV energy for very efficient energy harvesting; bright field-emission electron sources used for vacuum electronic devices such as amplifiers for HF electronics; micro-tubes; 30 GHz to 6 THz switching amplifiers with signal-to-noise ratio >10(!); and THz power sources up to 1 Watt, in combination with miniaturized vacuum pumps, vacuum gauges, IR to THz detectors, and EUV and X-ray sources. Since focused electron beam induced deposition works also at low energy, self-cloning multibeam production machines become possible for field-emitter lamps, displays, multi-beam lithography, imaging, and inspection, energy harvesting, and power distribution with switches controlling field-emitter arrays for kA of currents but with <100 V switching voltage. Finally, the replacement of HTC superconductors and their applications by the Koops-GranMat®, having Koops pairs at room temperature, will allow the investigation of devices similar to Josephson junctions and their applications, now called QUIDART (Quantum Interference Devices at Room Temperature). All these possibilities will support a revolution in optical, electric, power, and electronic technology.

  5. Collection and focusing of laser accelerated ion beams for therapy applications

    NASA Astrophysics Data System (ADS)

    Hofmann, Ingo; Meyer-Ter-Vehn, Jürgen; Yan, Xueqing; Orzhekhovskaya, Anna; Yaramyshev, Stepan

    2011-03-01

Experimental results in laser acceleration of protons and ions, and theoretical predictions that the currently achieved energies might be raised by factors of 5-10 in the next few years, have stimulated research exploring this new technology for oncology as a compact alternative to conventional synchrotron-based accelerator technology. The emphasis of this paper is on collection and focusing of the laser-produced particles, using simulation data from a specific laser acceleration model. We present a scaling law for the “chromatic emittance” of the collector (here assumed to be a solenoid lens) and apply it to the particle energy and angular spectra of the simulation output. For a 10 Hz laser system we find that particle collection by a solenoid magnet well satisfies the intensity and beam-quality requirements of depth-scanning irradiation. This includes a sufficiently large safety margin for intensity, whereas a scheme without collection (using mere aperture collimation) hardly reaches the needed intensities.

  6. Establishment and preliminary application of a rapid fluorescent focus inhibition test (RFFIT) for rabies virus.

    PubMed

    Yu, Pengcheng; Lv, Xinjun; Shen, Xinxin; Tang, Qing; Liang, Guodong

    2013-08-01

    The World Health Organization (WHO) standard assay for determining levels of the rabies virus neutralization antibody (RVNA) is the rapid fluorescent focus inhibition test (RFFIT), which is used to evaluate the immunity effect after vaccination against rabies. For RFFIT, CVS-11 was used as the challenge virus, BSR cells as the adapted cells, and WHO rabies immunoglobulin (WHO STD) as the reference serum in this study. With reference to WHO and Pasteur RFFIT procedures, a micro-RFFIT procedure adapted to our laboratory was produced, and its specificity and reproducibility were tested. We tested levels of RVNA in human serum samples after immunization with different human rabies vaccines (domestic purified Vero cell rabies vaccine (PVRV) and imported purified chick embryo cell vaccine (PCECV)) using different regimens (Zagreb regimen and Essen regimen). We analyzed the levels of RVNA, and compared the immune efficacy of domestic PVRV and imported PCECV using different immunization regimens. The results showed that the immune efficacy of domestic PVRV using the Zagreb regimen was as good as that of the imported PCECV, but virus antibodies were generated more rapidly with the Zagreb regimen than with the Essen regimen. The RFFIT procedure established in our laboratory will enhance the comprehensive detection ability of institutions involved in rabies surveillance in China. PMID:23913179

  7. Audiovisual focus of attention and its application to Ultra High Definition video compression

    NASA Astrophysics Data System (ADS)

    Rerabek, Martin; Nemoto, Hiromi; Lee, Jong-Seok; Ebrahimi, Touradj

    2014-02-01

Using Focus of Attention (FoA) as a perceptual process in image and video compression is a well-known approach to increasing coding efficiency. It has been shown that foveated coding, in which compression quality varies across the image according to region of interest, is more efficient than uniform coding, in which all regions are compressed in a similar way. However, widespread use of such foveated compression has been prevented by two main conflicting factors, namely the complexity and the efficiency of algorithms for FoA detection. One way around these is to use as much information as possible from the scene. Since most video sequences have associated audio, and in many cases there is a correlation between the audio and the visual content, audiovisual FoA can improve the efficiency of the detection algorithm while remaining of low complexity. This paper discusses a simple yet efficient audiovisual FoA algorithm based on the correlation of dynamics between audio and video signal components. The results of the audiovisual FoA detection algorithm are subsequently taken into account for foveated coding and compression. This approach is implemented in an H.265/HEVC encoder, producing a bitstream that is fully compliant with any H.265/HEVC decoder. The influence of audiovisual FoA on the perceived quality of high- and ultra-high-definition audiovisual sequences is explored, and the amount of gain in compression efficiency is analyzed.
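The paper does not detail its correlation measure; as a hypothetical simplified sketch of the idea (correlating audio dynamics with per-region visual dynamics to locate a likely sound source), one could do:

```python
import numpy as np

def rank_regions_by_av_correlation(frames, audio, grid=(4, 4)):
    """Score each spatial region by how well its temporal dynamics
    (mean absolute frame difference) correlate with the audio energy
    dynamics; a high score suggests the audiovisual focus of attention.
    frames: (T, H, W) grayscale video; audio: (T,) per-frame energy."""
    T, H, W = frames.shape
    gh, gw = H // grid[0], W // grid[1]
    motion = np.abs(np.diff(frames.astype(float), axis=0))  # (T-1, H, W)
    env = np.abs(np.diff(audio.astype(float)))              # audio dynamics
    scores = np.zeros(grid)
    for i in range(grid[0]):
        for j in range(grid[1]):
            block = motion[:, i*gh:(i+1)*gh, j*gw:(j+1)*gw].mean(axis=(1, 2))
            c = np.corrcoef(block, env)[0, 1]
            scores[i, j] = 0.0 if np.isnan(c) else c
    return scores

# Synthetic check: one region flickers in step with the audio envelope.
rng = np.random.default_rng(0)
T, H, W = 64, 32, 32
audio = rng.random(T)
frames = rng.random((T, H, W)) * 0.01
frames[:, 0:8, 0:8] += audio[:, None, None]  # top-left follows the audio
scores = rank_regions_by_av_correlation(frames, audio)
```

In a foveated encoder, the resulting score map would then drive the per-region quantization parameter; the grid size and correlation measure here are illustrative assumptions.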

  8. DigitalVHI--a freeware open-source software application to capture the Voice Handicap Index and other questionnaire data in various languages.

    PubMed

    Herbst, Christian T; Oh, Jinook; Vydrová, Jitka; Švec, Jan G

    2015-07-01

    In this short report we introduce DigitalVHI, a free open-source software application for obtaining Voice Handicap Index (VHI) and other questionnaire data, which can be put on a computer in clinics and used in clinical practice. The software can simplify performing clinical studies since it makes the VHI scores directly available for analysis in a digital form. It can be downloaded from http://www.christian-herbst.org/DigitalVHI/. PMID:24047114

  9. Roadmap for Peridynamic Software Implementation

    SciTech Connect

    Littlewood, David John

    2015-10-01

The application of peridynamics for engineering analysis requires an efficient and robust software implementation. Key elements include processing of the discretization, the proximity search for identification of pairwise interactions, evaluation of the constitutive model, application of a bond-damage law, and contact modeling. Additional requirements may arise from the choice of time integration scheme, for example estimation of the maximum stable time step for explicit schemes, and construction of the tangent stiffness matrix for many implicit approaches. This report summarizes progress to date on the software implementation of the peridynamic theory of solid mechanics. Discussion is focused on parallel implementation of the meshfree discretization scheme of Silling and Askari [33] in three dimensions, although much of the discussion applies to computational peridynamics in general.
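The proximity search mentioned above (finding all node pairs within the peridynamic horizon) is commonly accelerated with a spatial tree rather than an O(N^2) scan. A minimal sketch of that step, under the assumption of a simple uniform grid discretization (not taken from the report):

```python
import numpy as np
from scipy.spatial import cKDTree

def build_bond_list(points, horizon):
    """Find all pairwise peridynamic interactions: pairs of nodes whose
    distance is below the horizon delta.  A k-d tree reduces the naive
    O(N^2) search to roughly O(N log N)."""
    tree = cKDTree(points)
    return tree.query_pairs(r=horizon, output_type='ndarray')  # (M, 2)

# Regular 3D grid with spacing h.  With horizon = 1.5*h, each interior
# node bonds to its 6 face neighbors (distance h) and 12 edge neighbors
# (distance sqrt(2)*h), but not the 8 corner neighbors (sqrt(3)*h).
h = 1.0
axis = np.arange(5, dtype=float)
points = np.array(np.meshgrid(axis, axis, axis)).reshape(3, -1).T * h
bonds = build_bond_list(points, horizon=1.5 * h)
```

A damage law would then operate on this bond list, removing pairs whose stretch exceeds a critical value.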

  10. Visual Recognition Software for Binary Classification and Its Application to Spruce Pollen Identification

    PubMed Central

    Tcheng, David K.; Nayak, Ashwin K.; Fowlkes, Charless C.; Punyasena, Surangi W.

    2016-01-01

    Discriminating between black and white spruce (Picea mariana and Picea glauca) is a difficult palynological classification problem that, if solved, would provide valuable data for paleoclimate reconstructions. We developed an open-source visual recognition software (ARLO, Automated Recognition with Layered Optimization) capable of differentiating between these two species at an accuracy on par with human experts. The system applies pattern recognition and machine learning to the analysis of pollen images and discovers general-purpose image features, defined by simple features of lines and grids of pixels taken at different dimensions, size, spacing, and resolution. It adapts to a given problem by searching for the most effective combination of both feature representation and learning strategy. This results in a powerful and flexible framework for image classification. We worked with images acquired using an automated slide scanner. We first applied a hash-based “pollen spotting” model to segment pollen grains from the slide background. We next tested ARLO’s ability to reconstruct black to white spruce pollen ratios using artificially constructed slides of known ratios. We then developed a more scalable hash-based method of image analysis that was able to distinguish between the pollen of black and white spruce with an estimated accuracy of 83.61%, comparable to human expert performance. Our results demonstrate the capability of machine learning systems to automate challenging taxonomic classifications in pollen analysis, and our success with simple image representations suggests that our approach is generalizable to many other object recognition problems. PMID:26867017
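The abstract describes ARLO's features as lines and grids of pixels taken at different sizes and resolutions. A hypothetical, much-simplified stand-in for such grid features (not ARLO's actual implementation) is to pool mean intensities over coarse grids at several resolutions:

```python
import numpy as np

def grid_features(image, resolutions=(2, 4, 8)):
    """Build a feature vector from mean pixel intensities over coarse
    grids at several resolutions -- a simplified stand-in for the
    grid-of-pixels features that ARLO searches over."""
    feats = []
    H, W = image.shape
    for r in resolutions:
        gh, gw = H // r, W // r
        for i in range(r):
            for j in range(r):
                feats.append(image[i*gh:(i+1)*gh, j*gw:(j+1)*gw].mean())
    return np.array(feats)

# An 8x8 test image with pixel value 8*row + col.
img = np.arange(64, dtype=float).reshape(8, 8)
f = grid_features(img)  # 2x2 + 4x4 + 8x8 = 84 features
```

A learner (in ARLO's case, selected by searching over feature representations and learning strategies) would then classify pollen grains from such vectors.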

  11. Visual Recognition Software for Binary Classification and Its Application to Spruce Pollen Identification.

    PubMed

    Tcheng, David K; Nayak, Ashwin K; Fowlkes, Charless C; Punyasena, Surangi W

    2016-01-01

    Discriminating between black and white spruce (Picea mariana and Picea glauca) is a difficult palynological classification problem that, if solved, would provide valuable data for paleoclimate reconstructions. We developed an open-source visual recognition software (ARLO, Automated Recognition with Layered Optimization) capable of differentiating between these two species at an accuracy on par with human experts. The system applies pattern recognition and machine learning to the analysis of pollen images and discovers general-purpose image features, defined by simple features of lines and grids of pixels taken at different dimensions, size, spacing, and resolution. It adapts to a given problem by searching for the most effective combination of both feature representation and learning strategy. This results in a powerful and flexible framework for image classification. We worked with images acquired using an automated slide scanner. We first applied a hash-based "pollen spotting" model to segment pollen grains from the slide background. We next tested ARLO's ability to reconstruct black to white spruce pollen ratios using artificially constructed slides of known ratios. We then developed a more scalable hash-based method of image analysis that was able to distinguish between the pollen of black and white spruce with an estimated accuracy of 83.61%, comparable to human expert performance. Our results demonstrate the capability of machine learning systems to automate challenging taxonomic classifications in pollen analysis, and our success with simple image representations suggests that our approach is generalizable to many other object recognition problems. PMID:26867017

  12. An open-source software package for multivariate modeling and clustering: applications to air quality management.

    PubMed

    Wang, Xiuquan; Huang, Guohe; Zhao, Shan; Guo, Junhong

    2015-09-01

    This paper presents an open-source software package, rSCA, which is developed based upon a stepwise cluster analysis method and serves as a statistical tool for modeling the relationships between multiple dependent and independent variables. The rSCA package is efficient in dealing with both continuous and discrete variables, as well as nonlinear relationships between the variables. It divides the sample sets of dependent variables into different subsets (or subclusters) through a series of cutting and merging operations based upon the theory of multivariate analysis of variance (MANOVA). The modeling results are given by a cluster tree, which includes both intermediate and leaf subclusters as well as the flow paths from the root of the tree to each leaf subcluster specified by a series of cutting and merging actions. The rSCA package is a handy and easy-to-use tool and is freely available at http://cran.r-project.org/package=rSCA . By applying the developed package to air quality management in an urban environment, we demonstrate its effectiveness in dealing with the complicated relationships among multiple variables in real-world problems. PMID:25966889
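rSCA's cutting operation is based on MANOVA; as a rough univariate illustration of the idea (a hypothetical simplification, not the package's algorithm), a cut point on a predictor can be scored by the between-group sum of squares it induces in the dependent variable:

```python
import numpy as np

def best_cut(x, y):
    """Find the cut point on predictor x that best separates the
    dependent variable y, scoring candidate cuts by between-group sum
    of squares (a univariate stand-in for rSCA's MANOVA-based
    cutting criterion)."""
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    best_cut_point, best_score = None, -np.inf
    for k in range(1, len(xs)):
        left, right = ys[:k], ys[k:]
        bss = (len(left) * (left.mean() - ys.mean()) ** 2
               + len(right) * (right.mean() - ys.mean()) ** 2)
        if bss > best_score:
            best_cut_point, best_score = (xs[k-1] + xs[k]) / 2.0, bss
    return best_cut_point

# Two regimes of y separated at x = 5: the cut should land at 4.5.
rng = np.random.default_rng(1)
x = np.arange(10, dtype=float)
y = np.where(x < 5, 0.0, 10.0) + rng.normal(0, 0.1, 10)
cut = best_cut(x, y)
```

Applied recursively, with a statistical test deciding whether to cut or merge subclusters, this yields the kind of cluster tree the abstract describes.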

  13. Software architecture for a kinematically dissimilar master-slave telethesis for exploring rehabilitation applications

    NASA Astrophysics Data System (ADS)

    Wunderlich, Joseph; Chen, Shoupu; Pino, D.; Rahman, Tariq; Harwin, William S.

    1993-12-01

A person with limited arm and hand function could benefit from technology based on teleoperation principles, particularly where the mechanism provides proprioceptive-like information to the operator, giving an idea of the forces encountered in the environment and the positions of the slave robot. A test-bed system is being prepared to evaluate the potential for adapting telemanipulator technology to the needs of people with high-level spinal cord injury. This test-bed uses a kinematically dissimilar master and slave pair and will be adaptable to a variety of disabilities. The master will be head-controlled and, when combined with auxiliary functions, will provide the degrees of freedom necessary to attempt any task. In the simplest case, this mapping could be direct, with the slave amplifying the person's movements and forces. It is unrealistic, however, to expect that the operator will have the range of head movement required for accurate operation of the slave over the full workspace. Neither is it likely that the person will be able to achieve simultaneous and independent control of the 6 or more degrees of freedom required to move the slave. Thus a set of more general mappings must be available that can be chosen to relate to the intrinsic coordinates of the task. The software structure to implement the control of this master-slave system is based on two IBM PC computers linked via Ethernet.

  14. A software architecture for multidisciplinary applications: Integrating task and data parallelism

    NASA Technical Reports Server (NTRS)

    Chapman, Barbara; Mehrotra, Piyush; Vanrosendale, John; Zima, Hans

    1994-01-01

    Data parallel languages such as Vienna Fortran and HPF can be successfully applied to a wide range of numerical applications. However, many advanced scientific and engineering applications are of a multidisciplinary and heterogeneous nature and thus do not fit well into the data parallel paradigm. In this paper we present new Fortran 90 language extensions to fill this gap. Tasks can be spawned as asynchronous activities in a homogeneous or heterogeneous computing environment; they interact by sharing access to Shared Data Abstractions (SDA's). SDA's are an extension of Fortran 90 modules, representing a pool of common data, together with a set of Methods for controlled access to these data and a mechanism for providing persistent storage. Our language supports the integration of data and task parallelism as well as nested task parallelism and thus can be used to express multidisciplinary applications in a natural and efficient way.
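The Shared Data Abstractions described above are Fortran 90 language extensions; as a loose conceptual analogue only (not the proposed syntax), the same pattern of a common data pool reachable solely through access-controlled methods can be sketched in Python:

```python
import threading

class SharedDataAbstraction:
    """Loose analogue of an SDA: a pool of common data reachable only
    through methods that serialize access, so concurrently spawned
    tasks interact through controlled entry points rather than by
    touching shared state directly."""
    def __init__(self):
        self._lock = threading.Lock()
        self._pool = {}

    def put(self, key, value):
        with self._lock:
            self._pool[key] = value

    def get(self, key, default=None):
        with self._lock:
            return self._pool.get(key, default)

# Two "tasks" (threads here) interact only through the SDA.
sda = SharedDataAbstraction()

def producer():
    for i in range(1000):
        sda.put('count', i + 1)

t = threading.Thread(target=producer)
t.start()
t.join()
result = sda.get('count')
```

The Fortran 90 extensions additionally provide persistent storage for SDAs and asynchronous task spawning across heterogeneous machines, which this sketch does not attempt to model.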

  15. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 15

    NASA Technical Reports Server (NTRS)

    1997-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  16. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 14

    NASA Technical Reports Server (NTRS)

    1996-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  17. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 13

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  18. User Friendly Processing of Sediment CT Data: Software and Application in High Resolution Non-Destructive Sediment Core Data Sets

    NASA Astrophysics Data System (ADS)

    Reilly, B. T.; Stoner, J. S.; Wiest, J.; Abbott, M. B.; Francus, P.; Lapointe, F.

    2015-12-01

Computed Tomography (CT) of sediment cores allows for high-resolution images, three-dimensional volumes, and down-core profiles, generated through the attenuation of X-rays as a function of density and atomic number. When using a medical CT scanner, these quantitative data are stored in pixels using the Hounsfield scale, which is relative to the attenuation of X-rays in water and air at standard temperature and pressure. Here we present MATLAB-based software specifically designed for sedimentary applications, with a user-friendly graphical interface to process DICOM files and stitch overlapping CT scans. For visualization, the software allows easy generation of core slice images with grayscale and false color relative to a user-defined Hounsfield number range. For comparison to other high-resolution non-destructive methods, down-core Hounsfield number profiles are extracted using a method robust to coring imperfections, like deformation, bowing, gaps, and gas expansion. We demonstrate the usefulness of this technique with lacustrine sediment cores from the Western United States and the Canadian High Arctic, including Fish Lake, Oregon, and Sawtooth Lake, Ellesmere Island. These sites represent two different depositional environments and provide examples of a variety of common coring defects and lithologies. The Hounsfield profiles and images can be used in combination with other high-resolution data sets, including sediment magnetic parameters, XRF core scans, and many other types of data, to provide unique insights into how lithology influences paleoenvironmental and paleomagnetic records and their interpretations.
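The core of such a pipeline is converting raw CT values to Hounsfield units (HU = raw * slope + intercept, with slope and intercept taken from the DICOM header) and extracting a down-core profile that is robust to voids. A minimal sketch of that idea (the threshold and the use of a per-slice median are illustrative assumptions, not the presented software's method):

```python
import numpy as np

def downcore_profile(slices, slope=1.0, intercept=-1024.0, air_hu=-500.0):
    """Convert raw CT values to Hounsfield units and extract a
    down-core profile as the median HU of sediment voxels per slice.
    Thresholding out air (HU near -1000) and taking the median makes
    the profile robust to gaps, cracks, and gas-expansion voids."""
    hu = slices.astype(float) * slope + intercept
    profile = np.empty(hu.shape[0])
    for k, sl in enumerate(hu):
        sediment = sl[sl > air_hu]              # drop voids/air voxels
        profile[k] = np.median(sediment) if sediment.size else np.nan
    return profile

# Synthetic core: raw value 2024 -> 1000 HU sediment, with an air crack
# (raw 24 -> -1000 HU) in the second slice that should not bias it.
core = np.full((3, 10, 10), 2024.0)
core[1, :5, :] = 24.0
profile = downcore_profile(core)
```

In practice the slope and intercept would be read per file (e.g. from the DICOM RescaleSlope and RescaleIntercept fields) rather than assumed.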

  19. User's guide to DIANE Version 2. 1: A microcomputer software package for modeling battery performance in electric vehicle applications

    SciTech Connect

    Marr, W.W.; Walsh, W.J. . Energy Systems Div.); Symons, P.C. )

    1990-06-01

DIANE is an interactive microcomputer software package for the analysis of battery performance in electric vehicle (EV) applications. The principal objective of this software package is to enable the prediction of EV performance on the basis of laboratory test data for batteries. The model provides a second-by-second simulation of battery voltage and current for any specified velocity/time or power/time profile. The capability of the battery is modeled by an algorithm that relates the battery voltage to the withdrawn current, taking into account the effect of battery depth-of-discharge (DOD). Because of the lack of test data and other constraints, the current version of DIANE deals only with vehicles using "fresh" batteries with or without regenerative braking. Deterioration of battery capability due to aging can presently be simulated with user-input parameters accounting for an increase of effective internal resistance and/or a decrease of cell no-load voltage. DIANE 2.1 is written in FORTRAN language for use on IBM-compatible microcomputers. 7 refs.
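The heart of such a second-by-second simulation is solving the battery's voltage-current relation against the power the drive cycle demands. A simplified sketch (constant open-circuit voltage and internal resistance are my assumptions; DIANE additionally varies both with depth-of-discharge):

```python
import math

def battery_step(power_demand, v_oc, r_int):
    """One time step: given the power demanded at the battery terminals,
    solve P = V*I with V = V_oc - I*R for the current, then compute the
    terminal voltage.  From I*(v_oc - I*r) = P we get the quadratic
    r*I^2 - v_oc*I + P = 0 and take the physical (smaller) root."""
    disc = v_oc ** 2 - 4.0 * r_int * power_demand
    if disc < 0:
        raise ValueError("demand exceeds the battery's peak power")
    current = (v_oc - math.sqrt(disc)) / (2.0 * r_int)
    return current, v_oc - current * r_int

# Example: 10 kW drawn from a 120 V pack with 50 mOhm resistance.
current, voltage = battery_step(10e3, v_oc=120.0, r_int=0.05)
```

Stepping this over a power/time profile, while integrating the withdrawn charge to update depth-of-discharge (and, from it, the effective V_oc and R), reproduces the structure of the simulation described.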

  20. User's guide to DIANE version 2.1: A microcomputer software package for modeling battery performance in electric vehicle applications

    NASA Astrophysics Data System (ADS)

    Marr, W. W.; Walsh, W. J.; Symons, P. C.

    1990-06-01

DIANE is an interactive microcomputer software package for the analysis of battery performance in electric vehicle (EV) applications. The principal objective of this software package is to enable the prediction of EV performance on the basis of laboratory test data for batteries. The model provides a second-by-second simulation of battery voltage and current for any specified velocity/time or power/time profile. The capability of the battery is modeled by an algorithm that relates the battery voltage to the withdrawn current, taking into account the effect of battery depth-of-discharge (DOD). Because of the lack of test data and other constraints, the current version of DIANE deals only with vehicles using fresh batteries with or without regenerative braking. Deterioration of battery capability due to aging can presently be simulated with user-input parameters accounting for an increase of effective internal resistance and/or a decrease of cell no-load voltage. DIANE 2.1 is written in FORTRAN language for use on IBM-compatible microcomputers.