Science.gov

Sample records for focused application software

  1. Focus: Software Secrets.

    ERIC Educational Resources Information Center

    Borman, Stuart A.

    1983-01-01

    A controversy has arisen as to whether instrument vendors are providing sufficient information about instrument-associated software to permit users to understand invisible data transformations going on inside of the equipment. Topics addressed include software protection, user modifications, and publishing software. (Author/JN)

  2. Knowledge focus via software agents

    NASA Astrophysics Data System (ADS)

    Henager, Donald E.

    2001-09-01

The essence of military Command and Control (C2) is making knowledge intensive decisions in a limited amount of time using uncertain, incorrect, or outdated information. It is essential to give decision-makers tools that provide: * Management of friendly forces by treating the "friendly resources as a system". * Rapid assessment of effects of military actions against the "enemy as a system". * Assessment of how an enemy should, can, and could react to friendly military activities. Software agents in the form of mission agents, target agents, maintenance agents, and logistics agents can meet this information challenge. The role of each agent is to know all the details about its assigned mission, target, maintenance, or logistics entity. The Mission Agent would fight for mission resources based on the mission priority and analyze the effect that a proposed mission's results would have on the enemy. The Target Agent (TA) communicates with other targets to determine its role in the system of targets. A system of TAs would be able to inform a planner or analyst of the status of a system of targets, the effect of that status, and the effect of attacks on that system. The system of TAs would also be able to analyze possible enemy reactions to attack by determining ways to minimize the effect of attack, such as rerouting traffic or using deception. The Maintenance Agent would schedule maintenance events and notify the maintenance unit. The Logistics Agent would manage shipment and delivery of supplies to maintain appropriate levels of weapons, fuel and spare parts. The central idea underlying this use of software agents is knowledge focus. Software agents are created automatically to focus their attention on individual real-world entities (e.g., missions, targets) and view the world from that entity's perspective. The agent autonomously monitors the entity, identifies problems/opportunities, formulates solutions, and informs the decision-maker. The agent must be
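The "one agent per real-world entity" idea described above can be sketched in a few lines; all class, field, and entity names here are hypothetical illustrations, not part of the system the abstract describes:

```python
from dataclasses import dataclass

@dataclass
class Target:
    """A single real-world entity an agent is assigned to (illustrative)."""
    name: str
    operational: bool = True
    priority: int = 1

class TargetAgent:
    """Monitors one Target and reports from that entity's perspective."""
    def __init__(self, target: Target):
        self.target = target

    def assess(self) -> str:
        # Identify problems/opportunities and formulate a recommendation.
        if not self.target.operational:
            return f"{self.target.name}: degraded; recommend re-tasking"
        return f"{self.target.name}: nominal"

# A "system of TAs" is then just the collection of per-entity agents.
agents = [TargetAgent(Target("bridge-7")),
          TargetAgent(Target("depot-2", operational=False))]
reports = [a.assess() for a in agents]
```

Each agent owns exactly one entity, which is what lets the collection as a whole answer questions about the system of targets.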

  3. Using Digital Devices in a First Year Classroom: A Focus on the Design and Use of Phonics Software Applications

    ERIC Educational Resources Information Center

    Nicholas, Maria; McKenzie, Sophie; Wells, Muriel A.

    2017-01-01

    When integrated within a holistic literacy program, phonics applications can be used in classrooms to facilitate students' self-directed learning of letter-sound knowledge; but are they designed to allow for such a purpose? With most phonics software applications making heavy use of image cues, this project has more specifically investigated…

  4. SU-E-J-04: Integration of Interstitial High Intensity Therapeutic Ultrasound Applicators On a Clinical MRI-Guided High Intensity Focused Ultrasound Treatment Planning Software Platform

    SciTech Connect

    Ellens, N; Partanen, A; Ghoshal, G; Burdette, E; Farahani, K

    2015-06-15

    Purpose: Interstitial high intensity therapeutic ultrasound (HITU) applicators can be used to ablate tissue percutaneously, allowing for minimally-invasive treatment without ionizing radiation [1,2]. The purpose of this study was to evaluate the feasibility and usability of combining multielement interstitial HITU applicators with a clinical magnetic resonance imaging (MRI)-guided focused ultrasound software platform. Methods: The Sonalleve software platform (Philips Healthcare, Vantaa, Finland) combines anatomical MRI for target selection and multi-planar MRI thermometry to provide real-time temperature information. The MRI-compatible interstitial US applicators (Acoustic MedSystems, Savoy, IL, USA) had 1–4 cylindrical US elements, each 1 cm long with either 180° or 360° of active surface. Each applicator (4 Fr diameter, enclosed within a 13 Fr flexible catheter) was inserted into a tissue-mimicking agar-silica phantom. Degassed water was circulated around the transducers for cooling and coupling. Based on the location of the applicator, a virtual transducer overlay was added to the software to assist targeting and to allow automatic thermometry slice placement. The phantom was sonicated at 7 MHz for 5 minutes with 6–8 W of acoustic power for each element. MR thermometry data were collected during and after sonication. Results: Preliminary testing indicated that the applicator location could be identified in the planning images and the transducer locations predicted within 1 mm accuracy using the overlay. Ablation zones (thermal dose ≥ 240 CEM43) for 2 active, adjacent US elements ranged from 18 mm × 24 mm (width × length) to 25 mm × 25 mm for the 6 W and 8 W sonications, respectively. Conclusion: The combination of interstitial HITU applicators and this software platform holds promise for novel approaches in minimally-invasive MRI-guided therapy, especially when bony structures or air-filled cavities may preclude extracorporeal HIFU.[1] Diederich et al
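The ablation threshold cited above (thermal dose ≥ 240 CEM43) follows the standard cumulative-equivalent-minutes-at-43 °C formulation of Sapareto and Dewey; a minimal sketch of that calculation, assuming one temperature sample per time step:

```python
def cem43(temps_c, dt_min):
    """Cumulative equivalent minutes at 43 degC for a series of
    temperature samples, each held for dt_min minutes."""
    dose = 0.0
    for t in temps_c:
        r = 0.5 if t >= 43.0 else 0.25  # standard R values above/below 43 degC
        dose += (r ** (43.0 - t)) * dt_min
    return dose

# Five 1-minute samples at a steady 50 degC accumulate
# 0.5**(-7) * 5 = 640 CEM43, well past the 240 CEM43 threshold.
dose = cem43([50.0] * 5, dt_min=1.0)
```

In practice the temperature series would come from the MR thermometry data rather than a constant, but the dose accumulation is the same.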

  5. Cartographic applications software

    USGS Publications Warehouse

    1992-01-01

    The Office of the Assistant Division Chief for Research, National Mapping Division, develops computer software for the solution of geometronic problems in the fields of surveying, geodesy, remote sensing, and photogrammetry. Software that has been developed using public funds is available on request for a nominal charge to recover the cost of duplication.

  6. Future Trends of Software Technology and Applications: Software Architecture

    DTIC Science & Technology

    2006-01-01

Sponsored by the U.S. Department of Defense. © 2006 by Carnegie Mellon University, Pittsburgh, PA 15213-3890. Paul Clements, Software Engineering Institute, Carnegie Mellon University.

  7. Neurofeedback training aimed to improve focused attention and alertness in children with ADHD: a study of relative power of EEG rhythms using custom-made software application.

    PubMed

    Hillard, Brent; El-Baz, Ayman S; Sears, Lonnie; Tasman, Allan; Sokhadze, Estate M

    2013-07-01

Neurofeedback is a nonpharmacological treatment for attention-deficit hyperactivity disorder (ADHD). We propose that operant conditioning of the electroencephalogram (EEG) in neurofeedback training aimed at mitigating inattention and low arousal in ADHD will be accompanied by changes in the relative power of EEG bands. Patients were 18 children diagnosed with ADHD. The neurofeedback protocol ("Focus/Alertness" by Peak Achievement Trainer) has a focused attention and alertness training mode, providing a single summary measure each for Focus and for Alertness; this does not allow for collecting information about changes in the power of specific EEG bands (delta, theta, alpha, low and high beta, and gamma) within the 2 to 45 Hz range. Quantitative EEG analysis was therefore completed on each of twelve 25-minute-long sessions using a custom-made MATLAB application to determine the relative power of each of the aforementioned EEG bands throughout each session, and from the first session to the last. Additional statistical analysis determined significant changes in relative power within sessions (from minute 1 to minute 25) and between sessions (from session 1 to session 12). The analysis covered the relative power of theta, alpha, and low and high beta, along with theta/alpha, theta/beta, theta/low beta, and theta/high beta ratios. Secondary measures of patients' post-neurofeedback outcomes were assessed using an audiovisual selective attention test (IVA+Plus) and behavioral evaluation scores from the Aberrant Behavior Checklist. Analysis of the data computed in the MATLAB application determined that theta/low beta and theta/alpha ratios decreased significantly from session 1 to session 12, and from minute 1 to minute 25 within sessions. These findings regarding EEG changes resulting from brain wave self-regulation training, along with the behavioral evaluations, will help elucidate the neural mechanisms of neurofeedback aimed at improving focused attention and alertness in ADHD.
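The band-wise relative power such an application computes can be illustrated with a simple periodogram sketch (NumPy assumed; the band edges and the 2-45 Hz total range follow the abstract, while the test signal and function name are invented for illustration):

```python
import numpy as np

def relative_band_power(signal, fs, bands):
    """Relative power of each EEG band, normalized over the 2-45 Hz range."""
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2          # simple periodogram
    total = psd[(freqs >= 2) & (freqs <= 45)].sum()  # analysis range from the study
    return {name: psd[(freqs >= lo) & (freqs < hi)].sum() / total
            for name, (lo, hi) in bands.items()}

fs = 256                                   # Hz, illustrative sampling rate
t = np.arange(fs * 4) / fs
# Theta-dominant synthetic signal: strong 6 Hz plus weak 20 Hz component.
sig = np.sin(2 * np.pi * 6 * t) + 0.2 * np.sin(2 * np.pi * 20 * t)
bands = {"theta": (4, 8), "alpha": (8, 12), "beta": (12, 30)}
rel = relative_band_power(sig, fs, bands)
```

A real pipeline would use a windowed estimator such as Welch's method per minute of data, but the band-ratio arithmetic (e.g. theta/low beta) is computed from exactly these relative powers.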

  8. Tired of Teaching Software Applications?

    ERIC Educational Resources Information Center

    Lippert, Susan K.; Granger, Mary J.

    Many university business schools have an instructor-led course introducing computer software application packages. This course is often required for all undergraduates and is a prerequisite to other courses, such as accounting, finance, marketing, and operations management. Knowledge and skills gained in this course should enable students not only…

  9. Applications Software as Cognitive Enhancers.

    ERIC Educational Resources Information Center

    Lambrecht, Judith J.

    1993-01-01

    Discusses the use of microcomputer applications software in education, reviews the literature involving studies on the development of problem-solving skills and on the instructional use of spreadsheets, and describes a project for secondary school students using spreadsheets that provides cognitive support. (Contains 35 references.) (LRW)

  10. Girls' Preferences in Software Design: Insights from a Focus Group.

    ERIC Educational Resources Information Center

    Miller, Leslie; And Others

    1996-01-01

    A lack of gender-sensitive computer games exacerbates female disinterest in technology. Girls-only focus groups revealed phenomena that may help software developers awaken girls' enthusiasm for computing. For instance, girls placed a premium on richly textured video and audio, on collaborating rather than competing, on interacting with male…

  11. Risk reduction using DDP (Defect Detection and Prevention): Software support and software applications

    NASA Technical Reports Server (NTRS)

    Feather, M. S.

    2001-01-01

    Risk assessment and mitigation is the focus of the Defect Detection and Prevention (DDP) process, which has been applied to spacecraft technology assessments and planning, both hardware and software. DDP's major elements and their relevance to core requirement engineering concerns are summarized. The accompanying research demonstration illustrates DDP's tool support, and further customizations for application to software.

  12. Ray-tracing software comparison for linear focusing solar collectors

    NASA Astrophysics Data System (ADS)

    Osório, Tiago; Horta, Pedro; Larcher, Marco; Pujol-Nadal, Ramón; Hertel, Julian; van Rooyen, De Wet; Heimsath, Anna; Schneider, Simon; Benitez, Daniel; Frein, Antoine; Denarie, Alice

    2016-05-01

Ray-tracing software tools have been widely used in the optical design of solar concentrating collectors. In spite of the ability of these tools to assess the geometrical and material aspects impacting the optical performance of concentrators, their use in combination with experimental measurements in the framework of collector testing procedures has not been implemented, to date, in any of the current solar collector testing standards. In the latest revision of ISO 9806 an effort was made to include linear focusing concentrating collectors, but some practical and theoretical difficulties emerged. A ray-tracing analysis could provide important contributions to overcoming these issues, complementing the experimental results obtained through thermal testing and allowing more thorough testing outputs with lower experimental requirements. In order to evaluate the different available software tools, a comparison study was conducted. Taking the Parabolic Trough Collector and the Linear Fresnel Reflector Collector as representative line-focus technologies, two exemplary cases with predefined conditions - geometry, sun model and material properties - were simulated with different software tools. This work was carried out within IEA/SHC Task 49 "Solar Heat Integration in Industrial Processes".
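As a minimal illustration of the kind of geometry such ray tracers verify for a parabolic trough, the sketch below reflects a vertical ray off the 2-D parabola y = x²/(4f) and checks that it crosses the axis at the focal height f; the function name and specific geometry are illustrative only:

```python
import math

def reflect_vertical_ray(x, f):
    """Reflect a vertical downward ray hitting y = x**2/(4*f) at abscissa x;
    return the height where the reflected ray crosses x = 0.
    By the parabola's focal property this should always equal f."""
    y = x * x / (4.0 * f)
    # Surface slope is x/(2f); an (unnormalized) upward normal is (-slope, 1).
    nx, ny = -x / (2.0 * f), 1.0
    norm = math.hypot(nx, ny)
    nx, ny = nx / norm, ny / norm
    # Incoming direction d = (0, -1); reflected r = d - 2*(d . n)*n.
    dot = -ny
    rx, ry = -2.0 * dot * nx, -1.0 - 2.0 * dot * ny
    t = -x / rx            # ray parameter where the reflected ray reaches x = 0
    return y + t * ry

# Rays at two different abscissae both converge on the focal line at f = 1.
h1 = reflect_vertical_ray(2.0, 1.0)
h2 = reflect_vertical_ray(0.5, 1.0)
```

Production tools add sun-shape models, surface errors, and material properties on top of this basic reflection step, which is precisely where the compared packages differ.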

  13. Standardization of Software Application Development and Governance

    DTIC Science & Technology

    2015-03-01

software-defined systems continues to increase. Size and complexity of the systems also continue to increase, and the design problems go beyond algorithms... software expects to meet the requirements as it is about defining system-coding methodology. There are many styles of software architectures, and they...development can take place. A software framework is commonly defined as "a platform for developing applications. It provides the foundation on which software

  14. The LBT double prime focus camera control software

    NASA Astrophysics Data System (ADS)

    Di Paola, Andrea; Baruffolo, Andrea; Gallozzi, Stefano; Pedichini, Fernando; Speziali, Roberto

    2004-09-01

The LBT double prime focus camera (LBC) is composed of twin CCD mosaic imagers. The instrument is designed to match the double-channel structure of the LBT telescope and to exploit parallel observing mode by optimizing one camera for the blue and the other for the red side of the visible spectrum. As a result, LBC activity will likely consist of simultaneous multi-wavelength observation of specific targets, with both channels working at the same time to acquire and download images at different rates. The LBC Control Software is responsible for coordinating these activities by managing the scientific sensors and all the ancillary devices such as rotators, filter wheels, optical corrector focusing, house-keeping information, tracking, and Active Optics wavefront sensors. This is achieved using four dedicated PCs to control the four CCD controllers and one dual-processor PC to manage all other aspects, including the instrument operator interface. The general architecture of the LBC Control Software is described, as well as solutions and details about its implementation.

  15. Programming Language Software For Graphics Applications

    NASA Technical Reports Server (NTRS)

    Beckman, Brian C.

    1993-01-01

    New approach reduces repetitive development of features common to different applications. High-level programming language and interactive environment with access to graphical hardware and software created by adding graphical commands and other constructs to standardized, general-purpose programming language, "Scheme". Designed for use in developing other software incorporating interactive computer-graphics capabilities into application programs. Provides alternative to programming entire applications in C or FORTRAN, specifically ameliorating design and implementation of complex control and data structures typifying applications with interactive graphics. Enables experimental programming and rapid development of prototype software, and yields high-level programs serving as executable versions of software-design documentation.

  17. Software Component Technologies and Space Applications

    NASA Technical Reports Server (NTRS)

    Batory, Don

    1995-01-01

    In the near future, software systems will be more reconfigurable than hardware. This will be possible through the advent of software component technologies which have been prototyped in universities and research labs. In this paper, we outline the foundations for those technologies and suggest how they might impact software for space applications.

  18. Software applications for flux balance analysis.

    PubMed

    Lakshmanan, Meiyappan; Koh, Geoffrey; Chung, Bevan K S; Lee, Dong-Yup

    2014-01-01

Flux balance analysis (FBA) is a widely used computational method for characterizing and engineering intrinsic cellular metabolism. The increasing number of its successful applications and its growing popularity are possibly attributable to the availability of specific software tools for FBA. Each tool has its unique features and limitations with respect to operational environment, user interface and supported analysis algorithms. Presented herein is an in-depth evaluation of currently available FBA applications, focusing mainly on usability, functionality, graphical representation and inter-operability. Overall, most of the applications are able to perform the basic features of model creation and FBA simulation. The COBRA toolbox, OptFlux and FASIMU are versatile enough to support advanced in silico algorithms to identify environmental and genetic targets for strain design. SurreyFBA, WEbcoli, Acorn, FAME, GEMSiRV and MetaFluxNet are distinct tools that provide user-friendly interfaces for model handling. In terms of software architecture, FBA-SimVis and OptFlux have flexible environments, as they enable plug-in/add-on features to aid prospective functional extensions. Notably, an increasing trend towards the implementation of more tailored e-services, such as a central model repository and assistance for collaborative efforts, was observed among the web-based applications with the help of advanced web technologies. Furthermore, the most recent applications such as the Model SEED, FAME, MetaFlux and MicrobesFlux have even included several routines to facilitate the reconstruction of genome-scale metabolic models. Finally, a brief discussion of the future directions of FBA applications is given for the benefit of potential tool developers.
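At their core, all the FBA tools surveyed solve the same linear program: maximize an objective flux c·v subject to steady-state mass balance S·v = 0 and flux bounds. A toy sketch using SciPy's linprog (the three-reaction network is hypothetical and chosen so the optimum is obvious):

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical toy network: uptake -> A (v1), A -> B (v2), B -> biomass (v3).
# Stoichiometric matrix S (rows = metabolites A, B; columns = v1, v2, v3).
S = np.array([
    [1, -1,  0],   # A: produced by v1, consumed by v2
    [0,  1, -1],   # B: produced by v2, consumed by v3
])
bounds = [(0, 10), (0, 1000), (0, 1000)]  # uptake v1 capped at 10
c = [0, 0, -1]  # linprog minimizes, so negate the biomass objective v3

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
optimal_growth = -res.fun  # maximal biomass flux, limited by uptake: 10
```

Genome-scale models differ only in size (thousands of reactions) and in the pre/post-processing each tool layers around this optimization.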

  19. Database Handling Software and Scientific Applications.

    ERIC Educational Resources Information Center

    Gabaldon, Diana J.

    1984-01-01

    Discusses the general characteristics of database management systems and file systems. Also gives a basic framework for evaluating such software and suggests characteristics that should be considered when buying software for specific scientific applications. A list of vendor addresses for popular database management systems is included. (JN)

  20. Firing Room Remote Application Software Development

    NASA Technical Reports Server (NTRS)

    Liu, Kan

    2015-01-01

    The Engineering and Technology Directorate (NE) at National Aeronautics and Space Administration (NASA) Kennedy Space Center (KSC) is designing a new command and control system for the checkout and launch of Space Launch System (SLS) and future rockets. The purposes of the semester long internship as a remote application software developer include the design, development, integration, and verification of the software and hardware in the firing rooms, in particular with the Mobile Launcher (ML) Launch Accessories (LACC) subsystem. In addition, a software test verification procedure document was created to verify and checkout LACC software for Launch Equipment Test Facility (LETF) testing.

  1. Collaborative Software and Focused Distraction in the Classroom

    ERIC Educational Resources Information Center

    Rhine, Steve; Bailey, Mark

    2011-01-01

    In search of strategies for increasing their pre-service teachers' thoughtful engagement with content and in an effort to model connection between choice of technology and pedagogical goals, the authors utilized collaborative software during class time. Collaborative software allows all students to write simultaneously on a single collective…

  2. Software engineering with application-specific languages

    NASA Technical Reports Server (NTRS)

    Campbell, David J.; Barker, Linda; Mitchell, Deborah; Pollack, Robert H.

    1993-01-01

    Application-Specific Languages (ASL's) are small, special-purpose languages that are targeted to solve a specific class of problems. Using ASL's on software development projects can provide considerable cost savings, reduce risk, and enhance quality and reliability. ASL's provide a platform for reuse within a project or across many projects and enable less-experienced programmers to tap into the expertise of application-area experts. ASL's have been used on several software development projects for the Space Shuttle Program. On these projects, the use of ASL's resulted in considerable cost savings over conventional development techniques. Two of these projects are described.
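A miniature example of the ASL idea: a few-line special-purpose language for unit conversions, turned into executable form by a short interpreter. The syntax and names are invented for illustration; the actual ASLs used on the Space Shuttle Program were of course far richer:

```python
# A tiny hypothetical ASL: each line names a conversion and gives
# "scale" and optional "offset" parameters in domain terms.
RULES = """
ft_to_m: scale 0.3048
c_to_f: scale 1.8 offset 32
"""

def compile_asl(src):
    """Compile ASL source into a dict of conversion functions."""
    converters = {}
    for line in src.strip().splitlines():
        name, body = line.split(":")
        tokens = body.split()
        # Pair up keyword/value tokens, e.g. {"scale": 1.8, "offset": 32.0}.
        spec = dict(zip(tokens[::2], map(float, tokens[1::2])))
        scale, offset = spec.get("scale", 1.0), spec.get("offset", 0.0)
        converters[name.strip()] = lambda x, s=scale, o=offset: x * s + o
    return converters

conv = compile_asl(RULES)
```

The point is the division of labor: a domain expert writes RULES without touching the host language, which is how ASLs let less-experienced programmers tap application-area expertise.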

  3. Workshop and conference on Grand Challenges applications and software technology

    SciTech Connect

    Not Available

    1993-12-31

On May 4-7, 1993, nine federal agencies sponsored a four-day meeting on Grand Challenge applications and software technology. The objective was to bring High-Performance Computing and Communications (HPCC) Grand Challenge applications research groups supported under the federal HPCC program together with HPCC software technologists to: discuss multidisciplinary computational science research issues and approaches, identify major technology challenges facing users and providers, and refine software technology requirements for Grand Challenge applications research. The first day and a half focused on applications. Presentations were given by speakers from universities, national laboratories, and government agencies actively involved in Grand Challenge research. Five areas of research were covered: environmental and earth sciences; computational physics; computational biology, chemistry, and materials sciences; computational fluid and plasma dynamics; and applications of artificial intelligence. The next day and a half was spent in working groups in which the applications researchers were joined by software technologists. Nine breakout sessions took place: I/O, Data, and File Systems; Parallel Programming Paradigms; Performance Characterization and Evaluation of Massively Parallel Processing Applications; Program Development Tools; Building Multidisciplinary Applications; Algorithms and Libraries I; Algorithms and Libraries II; Graphics and Visualization; and National HPCC Infrastructure.

  4. Firing Room Remote Application Software Development & Swamp Works Laboratory Robot Software Development

    NASA Technical Reports Server (NTRS)

    Garcia, Janette

    2016-01-01

The National Aeronautics and Space Administration (NASA) is creating a way to send humans beyond low Earth orbit, and later to Mars. Kennedy Space Center (KSC) is working to make this possible by developing a Spaceport Command and Control System (SCCS) which will allow the launch of the Space Launch System (SLS). This paper focuses on the work performed by the author, during the first and second parts of her internship, as a remote application software developer. During the first part of the internship, the author worked on the SCCS software application layer, assisting multiple ground subsystems teams, including Launch Accessories (LACC) and Environmental Control System (ECS), with the design, development, integration, and testing of remote control software applications. During the second part, the author worked on robot software development at the Swamp Works Laboratory, a research and technology development group that focuses on inventing new technology to help future In-Situ Resource Utilization (ISRU) missions.

  5. Remote Software Application and Display Development

    NASA Technical Reports Server (NTRS)

    Sanders, Brandon T.

    2014-01-01

The era of the shuttle program has come to an end, but only to give rise to newer and more exciting projects. Now is the time of the Orion spacecraft, a work of art designed to exceed all previous endeavors of man. NASA is exiting the time of exploration and is entering a new period, a period of pioneering. With this new mission, many of NASA's organizations must undergo a great deal of change and development to support the Orion missions. The Spaceport Command and Control System (SCCS) is the new system that will provide NASA the ability to launch rockets into orbit and thus control Orion and other spacecraft as the goal of populating Mars becomes increasingly tangible. Since the previous control system, the Launch Processing System (LPS), was primarily designed to launch the shuttles, SCCS was needed as Kennedy Space Center (KSC) reorganized into a multiuser spaceport for commercial flights, providing more versatile control over rockets. Within SCCS is the Launch Control System (LCS), which is the remote software behind the command and monitoring of flight and ground system hardware. This internship at KSC has involved two main components of LCS: remote software application and display development. The display environment provides a graphical user interface for an operator to view and see if any cautions are raised, while the remote applications are the backbone that communicates with hardware and then relays the data back to the displays. These elements go hand in hand as they provide monitoring and control over hardware and software alike from the safety of the Launch Control Center. The remote software applications are written in Application Control Language (ACL), which must undergo unit testing to ensure data integrity. This paper describes both the implementation and writing of unit tests in ACL code for remote software applications, as well as the building of remote displays to be used in the Launch Control Center (LCC).

  6. Firing Room Remote Application Software Development

    NASA Technical Reports Server (NTRS)

    Liu, Kan

    2014-01-01

    The Engineering and Technology Directorate (NE) at National Aeronautics and Space Administration (NASA) Kennedy Space Center (KSC) is designing a new command and control system for the checkout and launch of Space Launch System (SLS) and future rockets. The purposes of the semester long internship as a remote application software developer include the design, development, integration, and verification of the software and hardware in the firing rooms, in particular with the Mobile Launcher (ML) Launch Accessories subsystem. In addition, a Conversion Fusion project was created to show specific approved checkout and launch engineering data for public-friendly display purposes.

  7. E-learning for Critical Thinking: Using Nominal Focus Group Method to Inform Software Content and Design.

    PubMed

    Parker, Steve; Mayner, Lidia; Michael Gillham, David

    2015-12-01

    Undergraduate nursing students are often confused by multiple understandings of critical thinking. In response to this situation, the Critiique for critical thinking (CCT) project was implemented to provide consistent structured guidance about critical thinking. This paper introduces Critiique software, describes initial validation of the content of this critical thinking tool and explores wider applications of the Critiique software. Critiique is flexible, authorable software that guides students step-by-step through critical appraisal of research papers. The spelling of Critiique was deliberate, so as to acquire a unique web domain name and associated logo. The CCT project involved implementation of a modified nominal focus group process with academic staff working together to establish common understandings of critical thinking. Previous work established a consensus about critical thinking in nursing and provided a starting point for the focus groups. The study was conducted at an Australian university campus with the focus group guided by open ended questions. Focus group data established categories of content that academic staff identified as important for teaching critical thinking. This emerging focus group data was then used to inform modification of Critiique software so that students had access to consistent and structured guidance in relation to critical thinking and critical appraisal. The project succeeded in using focus group data from academics to inform software development while at the same time retaining the benefits of broader philosophical dimensions of critical thinking.

  8. E-learning for Critical Thinking: Using Nominal Focus Group Method to Inform Software Content and Design

    PubMed Central

    Parker, Steve; Mayner, Lidia; Michael Gillham, David

    2015-01-01

    Background: Undergraduate nursing students are often confused by multiple understandings of critical thinking. In response to this situation, the Critiique for critical thinking (CCT) project was implemented to provide consistent structured guidance about critical thinking. Objectives: This paper introduces Critiique software, describes initial validation of the content of this critical thinking tool and explores wider applications of the Critiique software. Materials and Methods: Critiique is flexible, authorable software that guides students step-by-step through critical appraisal of research papers. The spelling of Critiique was deliberate, so as to acquire a unique web domain name and associated logo. The CCT project involved implementation of a modified nominal focus group process with academic staff working together to establish common understandings of critical thinking. Previous work established a consensus about critical thinking in nursing and provided a starting point for the focus groups. The study was conducted at an Australian university campus with the focus group guided by open ended questions. Results: Focus group data established categories of content that academic staff identified as important for teaching critical thinking. This emerging focus group data was then used to inform modification of Critiique software so that students had access to consistent and structured guidance in relation to critical thinking and critical appraisal. Conclusions: The project succeeded in using focus group data from academics to inform software development while at the same time retaining the benefits of broader philosophical dimensions of critical thinking. PMID:26835469

  9. Application development using the ALMA common software

    NASA Astrophysics Data System (ADS)

    Chiozzi, G.; Caproni, A.; Jeram, B.; Sommer, H.; Wang, V.; Plesko, M.; Sekoranja, M.; Zagar, K.; Fugate, D. W.; Harrington, S.; Di Marcantonio, P.; Cirami, R.

    2006-06-01

The ALMA Common Software (ACS) provides the software infrastructure used by ALMA and by several other telescope projects, thanks also to the choice of the LGPL public license. ACS is a set of application frameworks providing the basic services needed for object-oriented distributed computing. Among these are transparent remote object invocation; object deployment and location based on a container/component model; and distributed error, alarm handling, logging and events. ACS is based on CORBA and built on top of free CORBA implementations. Free software is extensively used wherever possible. The general architecture of ACS was presented at SPIE 2002. ACS has been under development for 6 years and is midway through its development life. Many applications have been written using ACS; the ALMA test facility, APEX and other telescopes are running systems based on ACS. This is therefore a good time to look back and see what the strong and weak points of ACS have been so far, in terms of architecture and implementation. In this perspective, it is very important to analyze the applications based on ACS, the feedback received from users, and the impact that this feedback has had on the development of ACS itself by favoring the development of some features over others. The purpose of this paper is to describe the results of this analysis and to discuss what we would like to do to extend and improve ACS in the coming years, in particular to make application development easier and more efficient.

  10. A Software Architecture for High Level Applications

    SciTech Connect

    Shen,G.

    2009-05-04

    A modular software platform for high-level applications is under development at the National Synchrotron Light Source II project. The platform is based on a client-server architecture, and the components of high-level applications on it will be modular and distributed, and therefore reusable. An online model server is indispensable for model-based control. Different accelerator facilities have different requirements for online simulation. To support various accelerator simulators, a set of narrow and general application programming interfaces has been developed based on Tracy-3 and Elegant. This paper describes the system architecture for the modular high-level applications, the design of the narrow and general application programming interfaces for an online model server, and a prototype of the online model server.

  11. 36 CFR 1194.21 - Software applications and operating systems.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 36 Parks, Forests, and Public Property 3 2011-07-01 2011-07-01 false Software applications and... Standards § 1194.21 Software applications and operating systems. (a) When software is designed to run on a...) Software shall not use flashing or blinking text, objects, or other elements having a flash or...

  12. 36 CFR 1194.21 - Software applications and operating systems.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 36 Parks, Forests, and Public Property 3 2010-07-01 2010-07-01 false Software applications and... Standards § 1194.21 Software applications and operating systems. (a) When software is designed to run on a...) Software shall not use flashing or blinking text, objects, or other elements having a flash or...

  13. 36 CFR 1194.21 - Software applications and operating systems.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 36 Parks, Forests, and Public Property 3 2014-07-01 2014-07-01 false Software applications and... Standards § 1194.21 Software applications and operating systems. (a) When software is designed to run on a...) Software shall not use flashing or blinking text, objects, or other elements having a flash or blink...

  14. 36 CFR 1194.21 - Software applications and operating systems.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 36 Parks, Forests, and Public Property 3 2012-07-01 2012-07-01 false Software applications and... Standards § 1194.21 Software applications and operating systems. (a) When software is designed to run on a...) Software shall not use flashing or blinking text, objects, or other elements having a flash or blink...

  15. 36 CFR 1194.21 - Software applications and operating systems.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 36 Parks, Forests, and Public Property 3 2013-07-01 2012-07-01 true Software applications and... Standards § 1194.21 Software applications and operating systems. (a) When software is designed to run on a...) Software shall not use flashing or blinking text, objects, or other elements having a flash or...

  16. Parallel Algorithms and Software for Nuclear, Energy, and Environmental Applications. Part II: Multiphysics Software

    SciTech Connect

    Derek Gaston; Luanjing Guo; Glen Hansen; Hai Huang; Richard Johnson; Dana Knoll; Chris Newman; Hyeong Kae Park; Robert Podgorney; Michael Tonks; Richard Williamson

    2012-09-01

    This paper is the second part of a two-part sequence on multiphysics algorithms and software. The first [1] focused on the algorithms; this part treats the multiphysics software framework and applications based on it. Tight coupling is typically designed into the analysis application at inception, as such an application is strongly tied to a composite nonlinear solver that arrives at the final solution by treating all equations simultaneously. The application must also take care to minimize both time and space error between the physics, particularly if more than one mesh representation is needed in the solution process. This paper presents an application framework that was specifically designed to support tightly coupled multiphysics analysis. The Multiphysics Object Oriented Simulation Environment (MOOSE) is based on the Jacobian-free Newton-Krylov (JFNK) method combined with physics-based preconditioning to provide the underlying mathematical structure for applications. The report concludes with the presentation of a host of nuclear, energy, and environmental applications that demonstrate the efficacy of the approach and the utility of a well-designed multiphysics framework.
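
    MOOSE's mathematical core is JFNK, in which the Newton linear systems are solved by a Krylov method using only Jacobian-vector products. As an illustration of the matrix-free idea only (a hedged sketch, not MOOSE's implementation; the function name and test problem are invented here), the product can be approximated by a finite difference:

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

def jfnk(F, u0, tol=1e-10, max_newton=50, eps=1e-7):
    """Solve F(u) = 0 by Newton-Krylov without ever forming the Jacobian."""
    u = np.asarray(u0, dtype=float).copy()
    for _ in range(max_newton):
        Fu = F(u)
        if np.linalg.norm(Fu) < tol:
            break
        # Matrix-free Jacobian action: J v ~= (F(u + eps*v) - F(u)) / eps
        J = LinearOperator((u.size, u.size),
                           matvec=lambda v: (F(u + eps * v) - Fu) / eps)
        du, _ = gmres(J, -Fu)   # Krylov solve of the Newton correction
        u = u + du
    return u

# Toy example: the fixed-point system u = cos(u), as F(u) = u - cos(u) = 0.
root = jfnk(lambda u: u - np.cos(u), np.zeros(3))
```

    Physics-based preconditioning, which MOOSE adds on top of this scheme, would supply an approximate inverse to `gmres` via its preconditioner argument.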

  17. Evaluation of the DDSolver software applications.

    PubMed

    Zuo, Jieyu; Gao, Yuan; Bou-Chacra, Nadia; Löbenberg, Raimar

    2014-01-01

    When a new oral dosage form is developed, its dissolution behavior must be quantitatively analyzed. Dissolution analysis involves a comparison of the dissolution profiles and the application of mathematical models to describe the drug release pattern. This report aims to assess the application of DDSolver, an Excel add-in software package designed to analyze data obtained from dissolution experiments. The data used in this report were chosen from two dissolution studies. The results of the DDSolver analysis were compared with those obtained using an Excel worksheet. Comparisons among three different products yielded similarity factor (f2) values of 23.21, 46.66, and 17.91 using both DDSolver and the Excel worksheet. The results differed when DDSolver and Excel were used to calculate the release exponent "n" in the Korsmeyer-Peppas model. Performing routine quantitative analysis proved to be much easier using the DDSolver program than an Excel spreadsheet. Using DDSolver reduced the calculation time and has the potential to eliminate calculation errors, making this software package a convenient tool for dissolution comparison.
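
    For orientation, the two computations named in the abstract, the f2 similarity factor and the Korsmeyer-Peppas release exponent, can be sketched in a few lines of Python (a generic illustration of the standard formulas, not DDSolver's code; function names are invented here):

```python
import numpy as np

def similarity_f2(ref, test):
    """f2 = 50*log10(100 / sqrt(1 + mean squared difference)) between two
    dissolution profiles (% dissolved at matching time points).
    By convention f2 >= 50 indicates similar profiles."""
    ref, test = np.asarray(ref, float), np.asarray(test, float)
    msd = np.mean((ref - test) ** 2)
    return 50.0 * np.log10(100.0 / np.sqrt(1.0 + msd))

def korsmeyer_peppas_n(t, frac_released):
    """Release exponent n from a log-log fit of Mt/Minf = k * t**n."""
    slope, _intercept = np.polyfit(np.log(t), np.log(frac_released), 1)
    return slope
```

    Identical profiles give the maximum f2 of 100; the exponent n is just the slope of the release curve on log-log axes.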

  18. Plasma focus: Present status and potential applications

    SciTech Connect

    Brzosko, J.S.; Nardi, V.; Powell, C.

    1997-12-01

    Dense plasma focus (DPF) machines were constructed independently by Filippov in Moscow and Mather in Los Alamos at the end of the 1950s. Since then, more than 30 laboratories have carried out vigorous DPF programs, oriented mainly toward studies of the physics of ion acceleration and trapping in the plasma focus environment. Applications of the DPF as an intense neutron and X-ray source have been recognized since its discovery but not implemented, for various reasons. Recently, some groups (including AES) have addressed the issue of DPF applications, and some of these are briefly discussed in this paper.

  19. Software reliability models for critical applications

    SciTech Connect

    Pham, H.; Pham, M.

    1991-12-01

    This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying existing software reliability models and proposes a state-of-the-art software reliability model relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault-tolerant software reliability models and their related issues, (2) a proposed technique for software reliability enhancement, and (3) general discussion and future research. The development of the proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.
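
    As one classic example of the family of models such surveys review, the Goel-Okumoto NHPP growth model predicts cumulative defects found by test time t as m(t) = a(1 - exp(-b t)). The sketch below is illustrative only; the report's own proposed model is not specified here, and the function names are invented:

```python
import numpy as np

def goel_okumoto_mean(t, a, b):
    """Expected cumulative defects discovered by test time t:
    m(t) = a * (1 - exp(-b*t)), where a is the eventual total number of
    defects and b the per-defect detection rate."""
    return a * (1.0 - np.exp(-b * np.asarray(t, dtype=float)))

def residual_defects(t, a, b):
    """Defects expected to remain undiscovered after testing for time t."""
    return a - goel_okumoto_mean(t, a, b)

# With an estimated 100 total defects and detection rate 0.1 per unit time,
# about 63 should surface by t = 10 (since 1 - e**-1 ~ 0.632).
found_by_10 = goel_okumoto_mean(10.0, a=100.0, b=0.1)
```

    Fitting a and b to observed failure times (e.g., by maximum likelihood) is what turns such a curve into a reliability prediction.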

  1. Designing Control System Application Software for Change

    NASA Technical Reports Server (NTRS)

    Boulanger, Richard

    2001-01-01

    The Unified Modeling Language (UML) was used to design the Environmental Systems Test Stand (ESTS) control system software. The UML was chosen for its ability to facilitate a clear dialog between software designer and customer, from which requirements are discovered and documented in a manner that maps directly to program objects. Applying the UML to control system software design has resulted in a baseline set of documents from which change, and the effort of that change, can be accurately measured. As the Environmental Systems Test Stand evolves, accurate estimates of the time and effort required to change the control system software will be made. Accurate quantification of the cost of software change can be made before implementation, improving schedule and budget accuracy.

  3. Software environment for implementing engineering applications on MIMD computers

    NASA Technical Reports Server (NTRS)

    Lopez, L. A.; Valimohamed, K. A.; Schiff, S.

    1990-01-01

    In this paper the concept for a software environment for developing engineering application systems for multiprocessor hardware (MIMD) is presented. The philosophy employed is to solve the largest problems possible in a reasonable amount of time, rather than solve existing problems faster. In the proposed environment most of the problems concerning parallel computation and handling of large distributed data spaces are hidden from the application program developer, thereby facilitating the development of large-scale software applications. Applications developed under the environment can be executed on a variety of MIMD hardware; it protects the application software from the effects of a rapidly changing MIMD hardware technology.

  4. 2003 SNL ASCI applications software quality engineering assessment report.

    SciTech Connect

    Schofield, Joseph Richard, Jr.; Ellis, Molly A.; Williamson, Charles Michael; Bonano, Lora A.

    2004-02-01

    This document describes the 2003 SNL ASCI Software Quality Engineering (SQE) assessment of twenty ASCI application code teams and the results of that assessment. The purpose of this assessment was to determine code team compliance with the Sandia National Laboratories ASCI Applications Software Quality Engineering Practices, Version 2.0 as part of an overall program assessment.

  5. 2002 SNL ASCI Applications Software Engineering Assessment Report

    SciTech Connect

    WILLIAMSON, CHARLES MICHAEL; OGDEN, HARVEY C.; BYLE, KATHLEEN A.

    2002-07-01

    This document describes the 2002 SNL Accelerated Strategic Computing Initiative (ASCI) Applications Software Quality Engineering (SQE) Assessment and the assessment results. The primary purpose of the assessment was to establish the current state of software engineering practices within the SNL ASCI Applications Program.

  6. Analysis of Software Development Methodologies to Build Safety Software Applications for the SATEX-II: A Mexican Experimental Satellite

    NASA Astrophysics Data System (ADS)

    Aguilar Cisneros, Jorge; Vargas Martinez, Hector; Pedroza Melendez, Alejandro; Alonso Arevalo, Miguel

    2013-09-01

    Mexico is a country where experience building software for satellite applications is just beginning. This is a delicate situation because in the near future we will need to develop software for the SATEX-II (Mexican Experimental Satellite). SATEX-II is a project of SOMECyTA (the Mexican Society of Aerospace Science and Technology). We have experience applying software development methodologies, such as TSP (Team Software Process) and SCRUM, in other areas. We analyzed these methodologies and concluded that they can be applied to develop software for the SATEX-II, supported by the ESA PSS-05-0 Standard, in particular ESA PSS-05-11. Our analysis focused on the main characteristics of each methodology and how they could be used together with the ESA PSS-05-0 Standard. Our outcomes may be used in general by teams who need to build small satellites, and in particular they will be used when we build the on-board software applications for the SATEX-II.

  7. Classroom Applications of Electronic Spreadsheet Computer Software.

    ERIC Educational Resources Information Center

    Tolbert, Patricia H.; Tolbert, Charles M., II

    1983-01-01

    Details classroom use of SuperCalc, a software accounting package developed originally for small businesses, as a computerized gradebook. A procedure which uses data from the computer gradebook to produce weekly printed reports for parents is also described. (MBR)

  9. Sandia National Laboratories ASCI Applications Software Quality Engineering Practices

    SciTech Connect

    ZEPPER, JOHN D.; ARAGON, KATHRYN MARY; ELLIS, MOLLY A.; BYLE, KATHLEEN A.; EATON, DONNA SUE

    2003-04-01

    This document provides a guide to the deployment of the software verification activities, software engineering practices, and project management principles that guide the development of Accelerated Strategic Computing Initiative (ASCI) applications software at Sandia National Laboratories (Sandia). The goal of this document is to identify practices and activities that will foster the development of reliable and trusted products produced by the ASCI Applications program. Document contents include an explanation of the structure and purpose of the ASCI Quality Management Council, an overview of the software development lifecycle, an outline of the practices and activities that should be followed, and an assessment tool.

  10. SHMTools: a general-purpose software tool for SHM applications

    SciTech Connect

    Harvey, Dustin; Farrar, Charles; Taylor, Stuart; Park, Gyuhae; Flynn, Eric B; Kpotufe, Samory; Dondi, Denis; Mollov, Todor; Todd, Michael D; Rosin, Tajana S; Figueiredo, Eloi

    2010-11-30

    This paper describes a new software package for various structural health monitoring (SHM) applications. The software is a set of standardized MATLAB routines covering three main stages of SHM: data acquisition, feature extraction, and feature classification for damage identification. A subset of the software in SHMTools is embeddable; it consists of MATLAB functions that can be cross-compiled into generic 'C' programs to be run on target hardware. The software is also designed to accommodate multiple sensing modalities, including piezoelectric active-sensing, which has been widely used in SHM practice. The software package, including standardized datasets, is publicly available for use by the SHM community. The details of this embeddable software are discussed, along with several example processes that can serve as guidelines for future use of the software.
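
    SHMTools itself is a MATLAB package; purely to illustrate the feature-extraction and feature-classification stages it standardizes, here is a hedged Python sketch of one common SHM pattern, autoregressive (AR) coefficients as damage-sensitive features scored by Mahalanobis distance from an undamaged baseline (not SHMTools code; function names are invented for illustration):

```python
import numpy as np

def ar_features(signal, order=4):
    """Feature extraction: least-squares AR(order) coefficients of a
    vibration record, a common damage-sensitive feature."""
    x = np.asarray(signal, dtype=float)
    # Regression x[t] ~ a1*x[t-1] + ... + a_order*x[t-order]
    X = np.column_stack([x[order - k - 1:-k - 1] for k in range(order)])
    y = x[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def mahalanobis_score(feature, baseline):
    """Feature classification: distance of a new feature vector from the
    cloud of baseline (undamaged-condition) feature vectors; large values
    flag a potential damage state."""
    mu = baseline.mean(axis=0)
    cov = np.cov(baseline, rowvar=False)
    d = np.asarray(feature, dtype=float) - mu
    return float(np.sqrt(d @ np.linalg.solve(cov, d)))
```

    In practice one would collect many baseline records, extract features from each, and threshold the score of new records against the baseline distribution.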

  11. Software framework for nano- and microscale measurement applications

    NASA Astrophysics Data System (ADS)

    Röning, Juha; Tuhkanen, Ville; Sipola, Risto; Vallius, Tero

    2011-01-01

    Development of new instruments and measurement methods has advanced research in the field of nanotechnology. Development of the measurement systems used in research requires support from reconfigurable software. Application frameworks can be used to develop domain-specific application skeletons; new applications are specialized from the framework by filling its extension points. This paper presents an application framework for nano- and micro-scale measurement applications. The framework consists of an implementation of a robotic control architecture and components that implement features commonly needed in measurement applications. To ease the development of user interfaces for measurement systems, the framework also contains ready-to-use user interface components. The goal of the framework is to ease the development of new applications for measurement systems. Features of the implemented framework were examined through two test cases. Benefits gained by using the framework were analyzed by determining the work needed to specialize new applications from the framework; the degree of reusability of the specialized applications was also examined. The work shows that the developed framework can be used to implement software for measurement systems and that the major part of such software can be implemented using reusable components of the framework. When developing new software, a developer only needs to develop the components related to the hardware used and to performing the measurement task, so development takes less time. The framework also unifies the structure of the developed software.

  12. Safety Characteristics in System Application Software for Human Rated Exploration

    NASA Technical Reports Server (NTRS)

    Mango, E. J.

    2016-01-01

    NASA and its industry and international partners are embarking on a bold and inspiring development effort to design and build an exploration class space system. The space system is made up of the Orion system, the Space Launch System (SLS) and the Ground Systems Development and Operations (GSDO) system. All are highly coupled together and dependent on each other for the combined safety of the space system. A key area of system safety focus needs to be in the ground and flight application software system (GFAS). In the development, certification and operations of GFAS, there are a series of safety characteristics that define the approach to ensure mission success. This paper will explore and examine the safety characteristics of the GFAS development.

  13. An evaluation of the Interactive Software Invocation System (ISIS) for software development applications. [flight software

    NASA Technical Reports Server (NTRS)

    Noland, M. S.

    1981-01-01

    The Interactive Software Invocation System (ISIS), which allows a user to build, modify, control, and process a total flight software system without direct communication with the host computer, is described. This interactive data management system provides the user with a file manager, a text editor, a tool invoker, and an Interactive Programming Language (IPL). The basic file design of ISIS is a five-level hierarchical structure. The file manager controls this hierarchical file structure and permits the user to create, save, access, and purge pages of information. The text editor is used to manipulate pages of text, and the tool invoker allows the user to communicate with the host computer through a RUN file created by the user. The IPL is based on PASCAL and contains most of the statements found in a high-level programming language. To evaluate the effectiveness of the system as applied to a flight project, the collection of software components required to support the Annular Suspension and Pointing System (ASPS) flight project was integrated using ISIS. The ASPS software system and its integration into ISIS are described.

  14. End User Software Development for Transportation Applications.

    DTIC Science & Technology

    1987-09-01

    patience, attention, and guidance throughout this research project. I also wish to thank Captain Demetrius Glass of the HQ USAF Transportation Plans and...necessary management and direction to refine non-standardized, interim software computer programs. According to Captain Demetrius Glass of the

  15. Software Applications To Increase Administrative and Teacher Effectiveness.

    ERIC Educational Resources Information Center

    Garland, Virginia E.

    Arguing that the most effective types of managerial computer software for teacher use are word processing, database management, and electronic spreadsheet packages, this paper uses Apple Writer, PFS File, and VisiCalc as examples of such software and suggests ways in which they can be used by classroom teachers. Applications of Apple Writer that…

  16. Software Process Improvement Journey: IBM Australia Application Management Services

    DTIC Science & Technology

    2005-03-01

    See Section 5.1.2) - Client Relationship Management (CRM) processes - specifically, Solution Design and Solution Delivery - Worldwide Project Management ...plex systems life-cycle management, rapid solutions development, custom development, package selection and implementation, maintenance, minor...Carnegie Mellon Software Engineering Institute. Software Process Improvement Journey: IBM Australia Application Management Services. Robyn Nichols

  17. Software development for safety-critical medical applications

    NASA Technical Reports Server (NTRS)

    Knight, John C.

    1992-01-01

    There are many computer-based medical applications in which safety and not reliability is the overriding concern. Reduced, altered, or no functionality of such systems is acceptable as long as no harm is done. A precise, formal definition of what software safety means is essential, however, before any attempt can be made to achieve it. Without this definition, it is not possible to determine whether a specific software entity is safe. A set of definitions pertaining to software safety will be presented and a case study involving an experimental medical device will be described. Some new techniques aimed at improving software safety will also be discussed.

  18. Designing application software in wide area network settings

    NASA Technical Reports Server (NTRS)

    Makpangou, Mesaac; Birman, Ken

    1990-01-01

    Progress in methodologies for developing robust local area network software has not been matched by similar results for wide area settings. The design of application software spanning multiple local area environments is examined. For important classes of applications, simple design techniques are presented that yield fault tolerant wide area programs. An implementation of these techniques as a set of tools for use within the ISIS system is described.

  19. Sandia National Laboratories ASCI Applications Software Quality Engineering Practices

    SciTech Connect

    ZEPPER, JOHN D.; ARAGON, KATHRYN MARY; ELLIS, MOLLY A.; BYLE, KATHLEEN A.; EATON, DONNA SUE

    2002-01-01

    This document provides a guide to the deployment of the software verification activities, software engineering practices, and project management principles that guide the development of Accelerated Strategic Computing Initiative (ASCI) applications software at Sandia National Laboratories (Sandia). The goal of this document is to identify practices and activities that will foster the development of reliable and trusted products produced by the ASCI Applications program. Document contents include an explanation of the structure and purpose of the ASCI Quality Management Council, an overview of the software development lifecycle, an outline of the practices and activities that should be followed, and an assessment tool. These sections map practices and activities at Sandia to the ASCI Software Quality Engineering: Goals, Principles, and Guidelines, a Department of Energy document.

  20. [Application of password manager software in health care].

    PubMed

    Ködmön, József

    2016-12-01

    When using multiple IT systems, handling passwords in a secure manner is a potential source of problems. The most frequent issues are choosing passwords of appropriate length and complexity, and then remembering those strong passwords. Password manager software provides a good solution to this problem, while greatly increasing the security of sensitive medical data. This article introduces a password manager software and provides basic information about the application. It also discusses how to select a really secure password manager and suggests a practical approach to its efficient, safe and comfortable use in health care. Orv. Hetil., 2016, 157(52), 2066-2073.

  1. Application Reuse Library for Software, Requirements, and Guidelines

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Thronesbery, Carroll

    1994-01-01

    Better designs are needed for expert systems and other operations automation software, for more reliable, usable and effective human support. A prototype computer-aided Application Reuse Library shows feasibility of supporting concurrent development and improvement of advanced software by users, analysts, software developers, and human-computer interaction experts. Such a library expedites development of quality software, by providing working, documented examples, which support understanding, modification and reuse of requirements as well as code. It explicitly documents and implicitly embodies design guidelines, standards and conventions. The Application Reuse Library provides application modules with Demo-and-Tester elements. Developers and users can evaluate applicability of a library module and test modifications, by running it interactively. Sub-modules provide application code and displays and controls. The library supports software modification and reuse, by providing alternative versions of application and display functionality. Information about human support and display requirements is provided, so that modifications will conform to guidelines. The library supports entry of new application modules from developers throughout an organization. Example library modules include a timer, some buttons and special fonts, and a real-time data interface program. The library prototype is implemented in the object-oriented G2 environment for developing real-time expert systems.

  2. HSCT4.0 Application: Software Requirements Specification

    NASA Technical Reports Server (NTRS)

    Salas, A. O.; Walsh, J. L.; Mason, B. H.; Weston, R. P.; Townsend, J. C.; Samareh, J. A.; Green, L. L.

    2001-01-01

    The software requirements for the High Performance Computing and Communication Program High Speed Civil Transport application project, referred to as HSCT4.0, are described. The objective of the HSCT4.0 application project is to demonstrate the application of high-performance computing techniques to the problem of multidisciplinary design optimization of a supersonic transport configuration, using high-fidelity analysis simulations. Descriptions of the various functions (and the relationships among them) that make up the multidisciplinary application as well as the constraints on the software design are provided. This document serves to establish an agreement between the suppliers and the customer as to what the HSCT4.0 application should do and provides to the software developers the information necessary to design and implement the system.

  3. A computationally efficient software application for calculating vibration from underground railways

    NASA Astrophysics Data System (ADS)

    Hussein, M. F. M.; Hunt, H. E. M.

    2009-08-01

    The PiP model is a software application with a user-friendly interface for calculating vibration from underground railways. This paper reports on the software, with a focus on its latest version and the plans for future developments. The software calculates the Power Spectral Density of vibration due to a moving train on floating-slab track, with track irregularity described by typical spectra for tracks in good, average and bad condition. The latest version accounts for a tunnel embedded in a half-space by employing a toolbox developed at K.U. Leuven which calculates Green's functions for a multi-layered half-space.
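
    The quantity PiP outputs, a Power Spectral Density, describes how vibration energy is distributed over frequency. As a generic reminder of what a PSD estimate looks like (unrelated to PiP's internals; the synthetic signal below is invented for illustration), Welch's method in SciPy:

```python
import numpy as np
from scipy.signal import welch

fs = 1000.0                              # sampling rate, Hz
t = np.arange(0.0, 10.0, 1.0 / fs)
rng = np.random.default_rng(0)
# Synthetic vibration record: a 40 Hz tone buried in broadband noise
x = np.sin(2 * np.pi * 40.0 * t) + 0.1 * rng.standard_normal(t.size)

f, pxx = welch(x, fs=fs, nperseg=2048)   # one-sided PSD estimate
peak_hz = f[np.argmax(pxx)]              # dominant vibration frequency
```

    Averaging over overlapping segments (the `nperseg` window) trades frequency resolution for a smoother, lower-variance spectrum.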

  4. Control system software, simulation, and robotic applications

    NASA Technical Reports Server (NTRS)

    Frisch, Harold P.

    1991-01-01

    All essential existing capabilities needed to create a man-machine interaction dynamics and performance (MMIDAP) capability are reviewed. The multibody system dynamics software program Order N DISCOS will be used for machine and musculo-skeletal dynamics modeling. The program JACK will be used for estimating and animating whole-body human response to given loading situations and motion constraints. The basic elements of performance (BEP) task decomposition methodologies associated with the Human Performance Institute database will be used for performance assessment. Techniques for resolving the statically indeterminate muscular load-sharing problem will be used for a detailed understanding of potential musculotendon or ligamentous fatigue, pain, discomfort, and trauma. The envisioned capability is to be used for mechanical system design, human performance assessment, extrapolation of man/machine interaction test data, biomedical engineering, and soft prototyping within a concurrent engineering (CE) system.

  5. Focused force angioplasty: Theory and application

    SciTech Connect

    Solar, Ronald J.; Ischinger, Thomas A

    2003-03-01

    Focused force angioplasty is a technique in which the forces resulting from inflating an angioplasty balloon in a stenosis are concentrated and focused at one or more locations within the stenosis. While the technique has been shown to be useful in resolving resistant stenoses, its real value may be in minimizing the vascular trauma associated with balloon angioplasty and subsequently improving the outcome.

  7. Development and Application of New Quality Model for Software Projects

    PubMed Central

    Karnavel, K.; Dillibabu, R.

    2014-01-01

    The IT industry tries to employ a number of models to identify the defects in the construction of software projects. In this paper, we present COQUALMO and its limitations and aim to increase the quality without increasing the cost and time. The computation time, cost, and effort to predict the residual defects are very high; this was overcome by developing an appropriate new quality model named the software testing defect corrective model (STDCM). The STDCM was used to estimate the number of remaining residual defects in the software product; a few assumptions and the detailed steps of the STDCM are highlighted. The application of the STDCM is explored in software projects. The implementation of the model is validated using statistical inference, which shows there is a significant improvement in the quality of the software projects. PMID:25478594

  8. Computer Applications in Marketing. An Annotated Bibliography of Computer Software.

    ERIC Educational Resources Information Center

    Burrow, Jim; Schwamman, Faye

    This bibliography contains annotations of 95 items of educational and business software with applications in seven marketing and business functions. The annotations, which appear in alphabetical order by title, provide this information: category (related application), title, date, source and price, equipment, supplementary materials, description…

  9. Simplifying applications software for vision guided robot implementation

    NASA Technical Reports Server (NTRS)

    Duncheon, Charlie

    1994-01-01

    A simple approach to robot applications software is described. The idea is to use commercially available software and hardware wherever possible to minimize system costs, schedules, and risks. The U.S. has been slow in the adoption of robots and flexible automation compared to the flourishing growth of robot implementation in Japan. The U.S. can benefit from this approach because of a more flexible array of vision guided robot technologies.

  10. Software Acquisition: Evolution, Total Quality Management, and Applications to the Army Tactical Missile System.

    DTIC Science & Technology

    presents the concept of software Total Quality Management (TQM), which focuses on the entire process of software acquisition, as a partial solution to...software TQM can be applied to software acquisition. Software Development, Software Acquisition, Total Quality Management (TQM), Army Tactical Missile

  11. Software Defined GPS API: Development and Implementation of GPS Correlator Architectures Using MATLAB with Focus on SDR Implementations

    DTIC Science & Technology

    2014-05-18

    and Implementation of GPS Correlator Architectures Using MATLAB with Focus on SDR Implementations. The Software Defined GPS API was created with the...

  12. Solar-terrestrial models and application software

    NASA Technical Reports Server (NTRS)

    Bilitza, Dieter

    1990-01-01

    The empirical models related to the solar-terrestrial sciences that are available in the form of computer programs are listed and described. Also included are programs that use one or more of these models for application-specific purposes. The entries are grouped according to the region of the solar-terrestrial environment to which they belong and according to the parameter they describe. Regions considered include the ionosphere, atmosphere, magnetosphere, planets, interplanetary space, and heliosphere. Information is also provided on the accessibility of solar-terrestrial models and on specifying the magnetic and solar activity conditions.

  13. Mission design applications of QUICK. [software for interactive trajectory calculation

    NASA Technical Reports Server (NTRS)

    Skinner, David L.; Bass, Laura E.; Byrnes, Dennis V.; Cheng, Jeannie T.; Fordyce, Jess E.; Knocke, Philip C.; Lyons, Daniel T.; Pojman, Joan L.; Stetson, Douglas S.; Wolf, Aron A.

    1990-01-01

    An overview of an interactive software environment for space mission design termed QUICK is presented. This stand-alone program provides a programmable FORTRAN-like calculator interface to a wide range of both built-in and user-defined functions. QUICK has evolved into a general-purpose software environment that can be intrinsically and dynamically customized for a wide range of mission design applications. Specific applications are described for some space programs, e.g., the Earth-Venus-Mars mission, the Cassini mission to Saturn, the Mars Observer, the Galileo Project, and the Magellan Spacecraft.

  15. Static and Dynamic Verification of Critical Software for Space Applications

    NASA Astrophysics Data System (ADS)

    Moreira, F.; Maia, R.; Costa, D.; Duro, N.; Rodríguez-Dapena, P.; Hjortnaes, K.

    Space technology is no longer used only for highly specialised research activities or for sophisticated manned space missions. Modern society relies more and more on space technology and applications for everyday activities. Worldwide telecommunications, Earth observation, navigation and remote sensing are only a few examples of space applications on which we rely daily. The European-driven global navigation system Galileo and its associated applications, e.g. air traffic management and vessel and car navigation, will significantly expand the already stringent safety requirements for space-based applications. Apart from their usefulness and practical applications, every single piece of onboard software deployed into space represents an enormous investment. With a long operational lifetime, and being extremely difficult to maintain and upgrade compared with "mainstream" software development, the importance of ensuring their correctness before deployment is immense. Verification & Validation techniques and technologies have a key role in ensuring that the onboard software is correct and error-free, or at least free from errors that can potentially lead to catastrophic failures. Many RAMS techniques, including both static criticality analysis and dynamic verification techniques, have been used as a means to verify and validate critical software and to ensure its correctness. Traditionally, however, these have been applied in isolation. One of the main reasons is the immaturity of this field as concerns its application to the growing number of software products within space systems. This paper presents an innovative way of combining static and dynamic techniques, exploiting their synergy and complementarity for software fault removal. The proposed methodology is based on the combination of Software FMEA and FTA with fault-injection techniques. The case study herein described is implemented with support from two tools: the SoftCare tool for the SFMEA and SFTA
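
    As a hedged sketch of the dynamic side of the approach described above, the following toy example injects a fault into a component and shows an error-detection check catching it; the component, the wrapper, and the residual check are illustrative assumptions, not the SoftCare tool's API.

```python
# Illustrative fault-injection sketch (assumed names, not from the paper):
# replace a component with a faulty stand-in and verify that the system's
# error-detection mechanism flags the corrupted result.

def divide(a, b):
    """The component under test."""
    return a / b

def inject_fault(fn, faulty_return):
    """Return a stand-in that ignores the real component and always
    produces a corrupted result (a stuck-at fault model)."""
    def faulty(*args, **kwargs):
        return faulty_return
    return faulty

def guarded_divide(fn, a, b):
    """Error-detection mechanism under test: a residual sanity check."""
    result = fn(a, b)
    if result * b != a:  # the residual check catches corrupted results
        raise RuntimeError("fault detected")
    return result

ok = guarded_divide(divide, 8, 2)  # nominal run passes the check
try:
    guarded_divide(inject_fault(divide, 99), 8, 2)
except RuntimeError:
    print("injected fault was detected")
```

    In a real campaign the injected values would be drawn from the failure modes identified by the SFMEA, so each injection experiment tests one row of the analysis.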

  16. Enhancement of computer system for applications software branch

    NASA Technical Reports Server (NTRS)

    Bykat, Alex

    1987-01-01

    Presented is a compilation of the history of a two-month project concerned with the survey, evaluation, and specification of a new computer system for the Applications Software Branch of the Software and Data Management Division of the Information and Electronic Systems Laboratory at NASA's Marshall Space Flight Center. Information gathering consisted of discussions and surveys of branch activities, evaluation of computer manufacturer literature, and presentations by vendors, and was followed by evaluation of the proposed systems. The evaluation criteria were: the (tentative) architecture selected for the new system, the type of network architecture supported, software tools, and to some extent the price. The information received from the vendors, together with additional research, led to a detailed design of a suitable system. This design included considerations of hardware and software environments as well as personnel issues such as training. Design of the system culminated in a recommendation for a new computing system for the Branch.

  17. Web Application Software for Ground Operations Planning Database (GOPDb) Management

    NASA Technical Reports Server (NTRS)

    Lanham, Clifton; Kallner, Shawn; Gernand, Jeffrey

    2013-01-01

    A Web application facilitates collaborative development of the ground operations planning document. This will reduce costs and development time for new programs by incorporating the data governance, access control, and revision tracking of the ground operations planning data. Ground Operations Planning requires the creation and maintenance of detailed timelines and documentation. The GOPDb Web application was created using state-of-the-art Web 2.0 technologies, and was deployed as SaaS (Software as a Service), with an emphasis on data governance and security needs. Application access is managed using two-factor authentication, with data write permissions tied to user roles and responsibilities. Multiple instances of the application can be deployed on a Web server to meet the robust needs for multiple, future programs with minimal additional cost. This innovation features high availability and scalability, with no additional software that needs to be bought or installed. For data governance and security (data quality, management, business process management, and risk management for data handling), the software uses NAMS. No local copy/cloning of data is permitted. Data change log/tracking is addressed, as well as collaboration, work flow, and process standardization. The software provides on-line documentation and detailed Web-based help. There are multiple ways that this software can be deployed on a Web server to meet ground operations planning needs for future programs. The software could be used to support commercial crew ground operations planning, as well as commercial payload/satellite ground operations planning. The application source code and database schema are owned by NASA.

  18. Application of Focused Schlieren to Nozzle Ejector Flowfields

    NASA Technical Reports Server (NTRS)

    Mitchell, L. Kerry; Ponton, Michael K.; Seiner, John M.; Manning, James C.; Jansen, Bernard J.; Lagen, Nicholas T.

    1999-01-01

    The motivation of the testing was to reduce noise generated by eddy Mach wave emission via enhanced mixing in the jet plume. This was to be accomplished through the use of an ejector shroud, which would bring in cooler ambient fluid to mix with the hotter jet flow. In addition, the contour of the mixer, with its chutes and lobes, would accentuate the merging of the outer and inner flows. The objective of the focused schlieren work was to characterize the mixing performance inside of the ejector. Using flow visualization allowed this to be accomplished in a non-intrusive manner.

  19. Application of Plagiarism Screening Software in the Chemical Engineering Curriculum

    ERIC Educational Resources Information Center

    Cooper, Matthew E.; Bullard, Lisa G.

    2014-01-01

    Plagiarism is an area of increasing concern for written ChE assignments, such as laboratory and design reports, due to ease of access to text and other materials via the internet. This study examines the application of plagiarism screening software to four courses in a university chemical engineering curriculum. The effectiveness of plagiarism…

  1. QFD Application to a Software - Intensive System Development Project

    NASA Technical Reports Server (NTRS)

    Tran, T. L.

    1996-01-01

    This paper describes the use of Quality Function Deployment (QFD), adapted to requirements engineering for a software-intensive system development project, and synthesizes the lessons learned from the application of QFD to the Network Control System (NCS) pre-project of the Deep Space Network.

  2. Application of software technology to automatic test data analysis

    NASA Technical Reports Server (NTRS)

    Stagner, J. R.

    1991-01-01

    The verification process for a major software subsystem was partially automated as part of a feasibility demonstration. The methods employed are generally useful and applicable to other types of subsystems. The effort resulted in substantial savings in test engineer analysis time and offers a method for inclusion of automatic verification as a part of regression testing.

  3. Software Applications Course as an Early Indicator of Academic Performance

    ERIC Educational Resources Information Center

    Benham, Harry C.; Bielinska-Kwapisz, Agnieszka; Brown, F. William

    2013-01-01

    This study's objective is to determine if students who were unable to successfully complete a required sophomore level business software applications course encountered unique academic difficulties in that course, or if their difficulty signaled more general academic achievement problems in business. The study points to the importance of including…

  4. Laboratory and software applications for clinical trials: the global laboratory environment.

    PubMed

    Briscoe, Chad

    2011-11-01

    The Applied Pharmaceutical Software Meeting is held annually. It is sponsored by The Boston Society, a not-for-profit organization that coordinates a series of meetings within the global pharmaceutical industry. The meeting generally focuses on laboratory applications, but in recent years has expanded to include some software applications for clinical trials. The 2011 meeting emphasized the global laboratory environment. Global clinical trials generate massive amounts of data in many locations that must be centralized and processed for efficient analysis. Thus, the meeting had a strong focus on establishing networks and systems for dealing with the computer infrastructure to support such environments. In addition to the globally installed laboratory information management system, electronic laboratory notebook and other traditional laboratory applications, cloud computing is quickly becoming the answer to provide efficient, inexpensive options for managing the large volumes of data and computing power, and thus it served as a central theme for the meeting.

  5. High Intensity Focused Ultrasound Tumor Therapy System and Its Application

    NASA Astrophysics Data System (ADS)

    Sun, Fucheng; He, Ye; Li, Rui

    2007-05-01

    At the end of last century, a High Intensity Focused Ultrasound (HIFU) tumor therapy system was successfully developed and manufactured in China, which has been already applied to clinical therapy. This article aims to discuss the HIFU therapy system and its application. Detailed research includes the following: power amplifiers for high-power ultrasound, ultrasound transducers with large apertures, accurate 3-D mechanical drives, a software control system (both high-voltage control and low-voltage control), and the B-mode ultrasonic diagnostic equipment used for treatment monitoring. Research on the dosage of ultrasound required for tumour therapy in multiple human cases has made it possible to relate a dosage formula, presented in this paper, to other significant parameters such as the volume of thermal tumor solidification, the acoustic intensity (I), and the ultrasound emission time (tn). Moreover, the HIFU therapy system can be applied to the clinical treatment of both benign and malignant tumors in the pelvic and abdominal cavity, such as uterine fibroids, liver cancer and pancreatic carcinoma.

  6. Applications of isoelectric focusing in forensic serology.

    PubMed

    Murch, R S; Budowle, B

    1986-07-01

    The typing of certain polymorphic proteins present in human body fluids is an important aspect of the analysis of serological evidence. This is particularly true when dealing with evidence related to violent criminal activity such as homicide, assault, or rape. Until recently, the routine analysis of the genetic polymorphisms of interest relied upon conventional electrophoretic techniques such as horizontal starch or agarose slab gels, cellulose acetate, and vertical polyacrylamide gradient gel methods. These techniques adequately separate a limited number of common variants, and in some cases these methods are still those of choice. However, as a result of the nature of the conventional approach, problems with the time required for analysis, resolution, diffusion of bands, sensitivity of protein detection, and cost are often encountered. Isoelectric focusing (IEF) offers an effective alternative to conventional electrophoresis for genetic marker typing. This method exploits the isoelectric point of allelic products rather than the charge-to-mass ratio in a particular pH environment. The advantages of employing IEF include: reduction of analysis time, increased resolution of protein bands, the possibility of subtyping existing phenotypes, increased sensitivity of detection, the counteraction of diffusion effects, and reduced cost per sample.

  7. The Application of Solution-Focused Work in Employment Counseling

    ERIC Educational Resources Information Center

    Bezanson, Birdie J.

    2004-01-01

    The author explores the applicability of a solution-focused therapy (SFT) model as a comprehensive approach to employment counseling. SFT focuses the client on developing a vision of a preferred future and assumes that the client has the talents and resources that can be accessed in the employment counseling process. The solution-focused counselor…

  8. Final Report. Center for Scalable Application Development Software

    SciTech Connect

    Mellor-Crummey, John

    2014-10-26

    The Center for Scalable Application Development Software (CScADS) was established as a partnership between Rice University, Argonne National Laboratory, University of California Berkeley, University of Tennessee – Knoxville, and University of Wisconsin – Madison. CScADS pursued an integrated set of activities with the aim of increasing the productivity of DOE computational scientists by catalyzing the development of systems software, libraries, compilers, and tools for leadership computing platforms. Principal Center activities were workshops to engage the research community in the challenges of leadership computing, research and development of open-source software, and work with computational scientists to help them develop codes for leadership computing platforms. This final report summarizes CScADS activities at Rice University in these areas.

  9. The application of image processing software: Photoshop in environmental design

    NASA Astrophysics Data System (ADS)

    Dong, Baohua; Zhang, Chunmi; Zhuo, Chen

    2011-02-01

    In the process of environmental design and creation, the design sketch holds a very important position in that it not only illuminates the design's idea and concept but also shows the design's visual effects to the client. In the field of environmental design, computer-aided design has made significant improvement. Many types of specialized design software for environmental performance of the drawings and post artistic processing have been implemented. Additionally, with the use of this software, working efficiency has greatly increased and drawings have become more specific and more specialized. By analyzing the application of the image processing software Photoshop in environmental design, and by comparing and contrasting traditional hand drawing with drawing using modern technology, this essay further explores the way for computer technology to play a bigger role in environmental design.

  10. Software radio technology and applications to law enforcement

    NASA Astrophysics Data System (ADS)

    Mitola, Joseph, III

    1997-02-01

    Law enforcement use of radio includes the rapid creation of networks for the dozens of law enforcement organizations who come together in situations as diverse as the TWA 800 disaster in New York or the SunFest celebration in Palm Beach. The software radio is a proven technology for rapidly building such interoperable networks, including seamless bridging across sub-networks of different frequency bands, channel modulations, and information formats. In addition, law enforcement must manage the costs of related radio base station infrastructure, mobile units, and handsets. The software radio is a collection of engineering techniques for creating radio infrastructure that can be programmed for new standards and that can be dynamically updated with new software personalities, even 'over the air,' reducing the need to purchase new hardware to remain current with emerging radio interface standards. Although relatively expensive today, continuing DoD, federal, and commercial investment in software radio technology will bring products within the reach of law enforcement applications within the next few years. The Modular Multifunction Information Transfer Systems (MMITS) Forum provides further impetus for cost reductions through the market efficiencies of open architecture. This article summarizes software radio technology and key trends in the marketplace, including the progress of the MMITS Forum. Expanded law enforcement participation in this forum would accelerate the availability of low-cost products for law enforcement.

  11. IFDOTMETER: A New Software Application for Automated Immunofluorescence Analysis.

    PubMed

    Rodríguez-Arribas, Mario; Pizarro-Estrella, Elisa; Gómez-Sánchez, Rubén; Yakhine-Diop, S M S; Gragera-Hidalgo, Antonio; Cristo, Alejandro; Bravo-San Pedro, Jose M; González-Polo, Rosa A; Fuentes, José M

    2016-04-01

    Most laboratories interested in autophagy use different imaging software for managing and analyzing heterogeneous parameters in immunofluorescence experiments (e.g., LC3-puncta quantification and determination of the number and size of lysosomes). One solution would be software that works on a user's laptop or workstation that can access all image settings and provide quick and easy-to-use analysis of data. Thus, we have designed and implemented an application called IFDOTMETER, which can run on all major operating systems because it has been programmed using JAVA (Sun Microsystems). Briefly, IFDOTMETER software has been created to quantify a variety of biological hallmarks, including mitochondrial morphology and nuclear condensation. The program interface is intuitive and user-friendly, making it useful for users not familiar with computer handling. By setting previously defined parameters, the software can automatically analyze a large number of images without the supervision of the researcher. Once analysis is complete, the results are stored in a spreadsheet. Using software for high-throughput cell image analysis offers researchers the possibility of performing comprehensive and precise analysis of a high number of images in an automated manner, making this routine task easier.
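
    IFDOTMETER's internal algorithm is not detailed in the abstract above, so the following is only an illustrative sketch of the core step of automated dot counting: thresholding an image and counting connected bright regions ("puncta") with a simple flood fill.

```python
# Illustrative sketch (not IFDOTMETER's code): count "puncta" as
# 4-connected components of above-threshold pixels in a small image.

def count_puncta(image, threshold):
    """Return the number of connected bright regions in a 2-D pixel grid."""
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if image[r][c] >= threshold and not seen[r][c]:
                count += 1
                stack = [(r, c)]  # flood-fill one connected region
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols \
                            and image[y][x] >= threshold and not seen[y][x]:
                        seen[y][x] = True
                        stack.extend([(y + 1, x), (y - 1, x),
                                      (y, x + 1), (y, x - 1)])
    return count

pixels = [
    [0, 9, 0, 0],
    [0, 9, 0, 8],
    [0, 0, 0, 8],
]
print(count_puncta(pixels, threshold=5))  # -> 2
```

    Real tools add per-region size and shape measurements on top of this labeling step before writing results to a spreadsheet.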

  12. Software.

    ERIC Educational Resources Information Center

    Journal of Chemical Education, 1989

    1989-01-01

    Presented are reviews of two computer software packages for Apple II computers; "Organic Spectroscopy," and "Videodisc Display Program" for use with "The Periodic Table Videodisc." A sample spectrograph from "Organic Spectroscopy" is included. (CW)

  13. Methods, Software and Tools for Three Numerical Applications. Final report

    SciTech Connect

    E. R. Jessup

    2000-03-01

    This is a report of the results of the author's work supported by DOE contract DE-FG03-97ER25325. The author proposed to study three numerical problems: (1) the extension of the PMESC parallel programming library; (2) the development of algorithms and software for certain generalized eigenvalue and singular value decomposition (SVD) problems; and (3) the application of techniques of linear algebra to an information retrieval technique known as latent semantic indexing (LSI).

  14. Software architecture for time-constrained machine vision applications

    NASA Astrophysics Data System (ADS)

    Usamentiaga, Rubén; Molleda, Julio; García, Daniel F.; Bulnes, Francisco G.

    2013-01-01

    Real-time image and video processing applications require skilled architects, and recent trends in the hardware platform make the design and implementation of these applications increasingly complex. Many frameworks and libraries have been proposed or commercialized to simplify the design and tuning of real-time image processing applications. However, they tend to lack flexibility, because they are normally oriented toward particular types of applications, or they impose specific data processing models such as the pipeline. Other issues include large memory footprints, difficulty for reuse, and inefficient execution on multicore processors. We present a novel software architecture for time-constrained machine vision applications that addresses these issues. The architecture is divided into three layers. The platform abstraction layer provides a high-level application programming interface for the rest of the architecture. The messaging layer provides a message-passing interface based on a dynamic publish/subscribe pattern. A topic-based filtering in which messages are published to topics is used to route the messages from the publishers to the subscribers interested in a particular type of message. The application layer provides a repository for reusable application modules designed for machine vision applications. These modules, which include acquisition, visualization, communication, user interface, and data processing, take advantage of the power of well-known libraries such as OpenCV, Intel IPP, or CUDA. Finally, the proposed architecture is applied to a real machine vision application: a jam detector for steel pickling lines.
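
    The topic-based publish/subscribe routing described above can be sketched minimally as follows; the class and method names are illustrative assumptions, not the paper's API.

```python
# Minimal sketch of topic-based publish/subscribe: messages published to a
# topic are routed only to the callbacks subscribed to that topic.
from collections import defaultdict

class MessageBus:
    def __init__(self):
        self._subscribers = defaultdict(list)  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        # Topic-based filtering: only this topic's subscribers are notified.
        for callback in self._subscribers[topic]:
            callback(message)

bus = MessageBus()
frames = []
bus.subscribe("camera/frames", frames.append)
bus.publish("camera/frames", "frame-001")     # routed to the subscriber
bus.publish("camera/status", "jam-detected")  # no subscriber, dropped
print(frames)  # -> ['frame-001']
```

    Decoupling publishers from subscribers this way is what lets acquisition, visualization, and processing modules be recombined per application, as the architecture's application layer intends.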

  15. Open source software engineering for geoscientific modeling applications

    NASA Astrophysics Data System (ADS)

    Bilke, L.; Rink, K.; Fischer, T.; Kolditz, O.

    2012-12-01

    OpenGeoSys (OGS) is a scientific open source project for numerical simulation of thermo-hydro-mechanical-chemical (THMC) processes in porous and fractured media. The OGS software development community is distributed all over the world and people with different backgrounds are contributing code to a complex software system. The following points have to be addressed for successful software development: - Platform independent code - A unified build system - A version control system - A collaborative project web site - Continuous builds and testing - Providing binaries and documentation for end users OGS should run on a PC as well as on a computing cluster regardless of the operating system. Therefore the code should not include any platform specific feature or library. Instead open source and platform independent libraries like Qt for the graphical user interface or VTK for visualization algorithms are used. A source code management and version control system is a definite requirement for distributed software development. For this purpose Git is used, which enables developers to work on separate versions (branches) of the software and to merge those versions at some point to the official one. The version control system is integrated into an information and collaboration website based on a wiki system. The wiki is used for collecting information such as tutorials, application examples and case studies. Discussions take place in the OGS mailing list. To improve code stability and to verify code correctness a continuous build and testing system, based on the Jenkins Continuous Integration Server, has been established. 
This server is connected to the version control system and does the following on every code change: - Compiles (builds) the code on every supported platform (Linux, Windows, MacOS) - Runs a comprehensive test suite of over 120 benchmarks and verifies the results Runs software development related metrics on the code (like compiler warnings, code complexity

  16. Software Transition Project Retrospectives and the Application of SEL Effort Estimation Model and Boehm's COCOMO to Complex Software Transition Projects

    NASA Technical Reports Server (NTRS)

    McNeill, Justin

    1995-01-01

    The Multimission Image Processing Subsystem (MIPS) at the Jet Propulsion Laboratory (JPL) has managed transitions of application software sets from one operating system and hardware platform to multiple operating systems and hardware platforms. As a part of these transitions, cost estimates were generated from the personal experience of in-house developers and managers to calculate the total effort required for such projects. Productivity measures have been collected for two such transitions, one very large and the other relatively small in terms of source lines of code. These estimates used a cost estimation model similar to the Software Engineering Laboratory (SEL) Effort Estimation Model. Experience in transitioning software within JPL MIPS has uncovered a high incidence of interface complexity. Interfaces, both internal and external to individual software applications, have contributed to software transition project complexity, and thus to scheduling difficulties and larger than anticipated design work on software to be ported.
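
    The abstract above mentions Boehm's COCOMO without giving the calibration used; as a hedged illustration, the sketch below implements the published basic COCOMO model, E = a * KLOC^b person-months, with the standard organic/semi-detached/embedded constants (the SEL-style calibration in the paper may differ).

```python
# Basic COCOMO sketch (published constants, not the paper's calibration):
# estimated effort in person-months is E = a * KLOC ** b.
COCOMO_MODES = {
    "organic":       (2.4, 1.05),
    "semi-detached": (3.0, 1.12),
    "embedded":      (3.6, 1.20),
}

def cocomo_effort(kloc, mode="organic"):
    """Estimated effort in person-months for a project of `kloc` KLOC."""
    a, b = COCOMO_MODES[mode]
    return a * kloc ** b

# A hypothetical 32-KLOC transition project under the organic mode:
print(round(cocomo_effort(32.0), 1))
```

    Note that basic COCOMO sizes effort from new-code KLOC; transition projects with heavy interface rework, as described above, typically need adjusted size inputs or cost drivers.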

  17. Application of software to development of reactor-safety codes

    SciTech Connect

    Wilburn, N.P.; Niccoli, L.G.

    1980-09-01

    Over the past two-and-a-half decades, the application of new techniques has reduced hardware cost for digital computer systems and increased computational speed by several orders of magnitude. A corresponding cost reduction in business and scientific software development has not occurred. The same situation is seen for software developed to model the thermohydraulic behavior of nuclear systems under hypothetical accident situations. This is particularly noticeable when costs over the total software life cycle are considered. A solution to this dilemma for reactor safety code systems has been demonstrated by applying the software engineering techniques which have been developed over the course of the last few years in the aerospace and business communities. These techniques have been applied recently with a great deal of success in three major projects at the Hanford Engineering Development Laboratory (HEDL): 1) a rewrite of a major safety code (MELT); 2) development of a new code system (CONACS) for description of the response of LMFBR containment to hypothetical accidents; and 3) development of two new modules for reactor safety analysis.

  18. Improving ICT Governance by Reorganizing Operation of ICT and Software Applications: The First Step to Outsource

    NASA Astrophysics Data System (ADS)

    Johansson, Björn

    During recent years great attention has been paid to outsourcing as well as to the reverse, insourcing (Dibbern et al., 2004). There has been a strong focus on how the management of software applications and information and communication technology (ICT), expressed as ICT management versus ICT governance, should be carried out (Grembergen, 2004). The maintenance and operation of software applications and ICT use a lot of the resources spent on ICT in organizations today (Bearingpoint, 2004), and managers are asked to increase the business benefits of these investments (Weill & Ross, 2004). That is, they are asked to improve the usage of ICT and to develop new business critical solutions supported by ICT. It also means that investments in ICT and software applications need to be shown to be worthwhile. Basically there are two considerations to take into account with ICT usage: cost reduction and improving business value. How the governance and management of ICT and software applications are organized is important. This means that the improvement of the control of maintenance and operation may be of interest to executives of organizations. It can be stated that usage is dependent on how it is organized. So, if an increase of ICT governance is the same as having well-organized ICT resources, could this be seen as the first step in organizations striving for external provision of ICT? This question is dealt with to some degree in this paper.

  19. Reflective Array with Controlled Focusing for Radiotomographic Application

    NASA Astrophysics Data System (ADS)

    Shipilov, S. E.; Eremeev, A. I.; Yakubov, V. P.

    2016-01-01

    The possibility in principle of creating controlled reflectors to form a prescribed field distribution in the focal area is considered. The reflectors change their reflection coefficient in response to an external control signal. Theoretical modeling of such a controlled focusing device, which focuses the field to a specific point for a given arrangement of reflectors, is proposed. On the basis of numerical simulation, the application of this approach to the solution of the problem of radiotomography is considered.

  20. Software Receiver Processing for Deep Space Telemetry Applications

    NASA Astrophysics Data System (ADS)

    Lay, N.; Lyubarev, M.; Tkacenko, A.; Srinivasan, M.; Andrews, K.; Finley, S.; Goodhart, C.; Navarro, R.

    2010-02-01

    Recently, much effort has been placed toward the development of the Reconfigurable Wideband Ground Receiver (RWGR): a variable-data-rate, reprogrammable receiver, whose technologies are intended for infusion into the Deep Space Network. A significant thrust of that effort has been focused on the development of field-programmable gate array (FPGA)-based algorithms for processing high-rate waveforms up to 640 Mbps. In this article, we describe the development of software receiver algorithms used to perform telemetry demodulation of low- to medium-data-rate signals.

  1. Application and systems software in Ada: Development experiences

    NASA Technical Reports Server (NTRS)

    Kuschill, Jim

    1986-01-01

    In its most basic sense, software development involves describing the tasks to be solved, including the given objects and the operations to be performed on those objects. Unfortunately, the way people describe objects and operations usually bears little resemblance to source code in most contemporary computer languages. There are two ways around this problem. One is to allow users to describe what they want the computer to do in everyday, typically imprecise English. The PRODOC methodology and software development environment is based on a second, more flexible and possibly even easier-to-use approach. Rather than hiding program structure, PRODOC represents such structure graphically using visual programming techniques. In addition, the program terminology used in PRODOC may be customized so as to match the way human experts in any given application area naturally describe the relevant data and operations. The PRODOC methodology is described in detail.

  2. Evaluation of the Trajectory Operations Applications Software Task (TOAST)

    NASA Technical Reports Server (NTRS)

    Perkins, Sharon; Martin, Andrea; Bavinger, Bill

    1990-01-01

    The Trajectory Operations Applications Software Task (TOAST) is a software development project under the auspices of the Mission Operations Directorate. Its purpose is to provide trajectory operation pre-mission and real-time support for the Space Shuttle program. As an Application Manager, TOAST provides an isolation layer between the underlying Unix operating system and the series of user programs. It provides two main services: a common interface to operating system functions with semantics appropriate for C or FORTRAN, and a structured input and output package that can be utilized by user application programs. In order to evaluate TOAST as an Application Manager, the task was to assess current and planned capabilities, compare those capabilities to functions available in commercial off-the-shelf (COTS) software, and survey Flight Analysis Design System (FADS) users regarding TOAST implementation. As a result of the investigation, it was found that the current version of TOAST is well implemented and meets the needs of the real-time users. The plans for migrating TOAST to the X Window System are essentially sound; the Executive will port with minor changes, while the Menu Handler will require a total rewrite. A series of recommendations for future TOAST directions is included.

  3. A Software Development Platform for Wearable Medical Applications.

    PubMed

    Zhang, Ruikai; Lin, Wei

    2015-10-01

    Wearable medical devices have become a leading trend in the healthcare industry. Microcontrollers are single-chip computers with sufficient processing power and are the preferred embedded computing units in those devices. We have developed a software platform specifically for the design of wearable medical applications with a small code footprint on the microcontrollers. It is supported by the open source real time operating system FreeRTOS and supplemented with a set of standard APIs for the architecture-specific hardware interfaces on the microcontrollers for data acquisition and wireless communication. We modified the tick counter routine in FreeRTOS to include a real time soft clock. When combined with the multitasking features in FreeRTOS, the platform offers quick development of wearable applications and easy porting of the application code to different microprocessors. Test results have demonstrated that application software developed using this platform is highly efficient in CPU usage while maintaining a small code footprint to accommodate the limited memory space in microcontrollers.
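
The tick-counter soft clock described above can be sketched in outline. The tick rate, epoch handling, and method names below are assumptions for illustration; the authors' actual implementation modifies the FreeRTOS tick interrupt routine in C.

```python
# Sketch of a soft real-time clock driven by an RTOS tick counter, after the
# FreeRTOS tick-hook approach described in the abstract. The 1 ms tick rate
# and the (seconds, milliseconds) representation are illustrative assumptions.

TICK_RATE_HZ = 1000  # assumed 1 ms tick, as with a typical configTICK_RATE_HZ

class SoftClock:
    def __init__(self, epoch_seconds=0):
        self.ticks = 0
        self.epoch = epoch_seconds

    def on_tick(self):
        # Called once per (simulated) tick interrupt; must stay O(1).
        self.ticks += 1

    def now(self):
        # Wall-clock time derived on demand from the tick count.
        total_ms = self.ticks * 1000 // TICK_RATE_HZ
        return self.epoch + total_ms // 1000, total_ms % 1000
```

Deriving the time lazily in `now()` keeps the per-tick work minimal, which matters when the tick handler runs in interrupt context.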

  4. Software tools for developing parallel applications. Part 1: Code development and debugging

    SciTech Connect

    Brown, J.; Geist, A.; Pancake, C.; Rover, D.

    1997-04-01

    Developing an application for parallel computers can be a lengthy and frustrating process, making it a perfect candidate for software tool support. Yet application programmers are often the last to hear about new tools emerging from R and D efforts. This paper provides an overview of two areas of tool support: code development and debugging. Each is discussed in terms of the programmer needs addressed, the extent to which representative current tools meet those needs, and what new levels of tool support are important if parallel computing is to become more widespread.

  5. Applications of adaptive focused acoustics to compound management.

    PubMed

    Nixon, Elizabeth; Holland-Crimmin, Sue; Lupotsky, Brian; Chan, James; Curtis, Jon; Dobbs, Karen; Blaxill, Zoe

    2009-06-01

    Since the introduction of lithotripsy for kidney stone therapy, focused acoustics and its properties have been thoroughly utilized in medicine and exploration. More recently, Compound Management is exploring its applications and benefits to sample integrity. There are 2 forms of focused acoustics: Acoustic Droplet Ejection and Adaptive Focused Acoustics, which work by emitting high-powered acoustic waves through water toward a focused point. This focused power results in noncontact plate-to-plate sample transfer or sample dissolution, respectively. For the purposes of this article, only Adaptive Focused Acoustics will be addressed. Adaptive Focused Acoustics uses high-powered acoustic waves to mix, homogenize, dissolve, and thaw samples. It facilitates transferable samples through noncontact, closed-container, isothermal mixing. Experimental results show significantly reduced mixing times, limited degradation, and ideal use for heat-sensitive compounds. Upon implementation, acoustic dissolution has reduced the number of samples requiring longer mixing times as well as the number impacted by incomplete compound dissolution. It has also helped in increasing the overall sample concentration from 6-8 mM to 8-10 mM by ensuring complete compound solubilization. The application of Adaptive Focused Acoustics, however, cannot be applied to all Compound Management processes, such as sample thawing and low-volume sample reconstitution. This article will go on to describe the areas where Adaptive Focused Acoustics adds value as well as areas in which it has shown no clear benefit.

  6. Supporting SBML as a model exchange format in software applications.

    PubMed

    Keating, Sarah M; Le Novère, Nicolas

    2013-01-01

    This chapter describes the Systems Biology Markup Language (SBML) from its origins. It describes the rationale behind and importance of having a common language when it comes to representing models. This chapter mentions the development of SBML and outlines the structure of an SBML model. It provides a section on libSBML, a useful application programming interface (API) library for reading, writing, manipulating and validating content expressed in the SBML format. Finally the chapter also provides a description of the SBML Toolbox which provides a means of facilitating the import and export of SBML from both MATLAB and Octave ( http://www.gnu.org/software/octave/) environments.
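
The structure of an SBML model outlined above (a model containing lists of species and reactions) can be illustrated with a minimal standard-library parser. The toy model below is invented for illustration; real applications would use libSBML, which also validates the content.

```python
# Minimal sketch of reading the structure of an SBML model with the standard
# library, illustrating the listOfSpecies/listOfReactions layout. This is NOT
# libSBML; it does no validation and handles only this toy fragment.
import xml.etree.ElementTree as ET

SBML = """<sbml xmlns="http://www.sbml.org/sbml/level3/version1/core"
              level="3" version="1">
  <model id="toy">
    <listOfSpecies>
      <species id="A"/><species id="B"/>
    </listOfSpecies>
    <listOfReactions>
      <reaction id="A_to_B"/>
    </listOfReactions>
  </model>
</sbml>"""

NS = {"s": "http://www.sbml.org/sbml/level3/version1/core"}

def model_summary(text):
    """Return the species ids and reaction ids declared in an SBML document."""
    root = ET.fromstring(text)
    species = [e.get("id") for e in root.findall(".//s:species", NS)]
    reactions = [e.get("id") for e in root.findall(".//s:reaction", NS)]
    return species, reactions
```

The namespace mapping is essential: SBML elements live in a level- and version-specific XML namespace, so an unqualified `findall(".//species")` would find nothing.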

  7. Adaptive Signal Processing Testbed application software: User's manual

    NASA Astrophysics Data System (ADS)

    Parliament, Hugh A.

    1992-05-01

    The Adaptive Signal Processing Testbed (ASPT) application software is a set of programs that provide general data acquisition and minimal processing functions on live digital data. The data are obtained from a digital input interface whose data source is the DAR4000 digital quadrature receiver that receives a phase shift keying signal at 21.4 MHz intermediate frequency. The data acquisition software is used to acquire raw unprocessed data from the DAR4000 and store it on disk in the Sun workstation based ASPT. File processing utilities are available to convert the stored files for analysis. The data evaluation software is used for the following functions: acquisition of data from the DAR4000, conversion to IEEE format, and storage to disk; acquisition of data from the DAR4000, power spectrum estimation, and on-line plotting on the graphics screen; and processing of disk file data, power spectrum estimation, and display and/or storage to disk in the new format. A user's guide is provided that describes the acquisition and evaluation programs along with how to acquire, evaluate, and use the data.
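
The power spectrum estimation step described above can be sketched as a plain DFT periodogram. The synthetic tone below stands in for DAR4000 samples; the ASPT's actual estimator is not specified in the abstract.

```python
# Illustrative power spectrum estimate: a plain DFT periodogram of sampled
# data. A synthetic 64-sample tone at bin 5 stands in for live receiver data.
import cmath, math

def periodogram(x):
    """Power at each DFT bin, |X[k]|^2 / N (direct DFT, O(N^2))."""
    n = len(x)
    spec = []
    for k in range(n):
        s = sum(x[j] * cmath.exp(-2j * math.pi * k * j / n) for j in range(n))
        spec.append(abs(s) ** 2 / n)
    return spec

signal = [math.cos(2 * math.pi * 5 * j / 64) for j in range(64)]
power = periodogram(signal)
peak_bin = max(range(33), key=lambda k: power[k])  # positive frequencies only
```

A real-time implementation would use an FFT and windowing/averaging; the direct DFT is shown only to keep the estimate's definition explicit.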

  8. APPLICATION OF SOFTWARE QUALITY ASSURANCE CONCEPTS AND PROCEDURES TO ENVIRONMENTAL RESEARCH INVOLVING SOFTWARE DEVELOPMENT

    EPA Science Inventory

    As EPA’s environmental research expands into new areas that involve the development of software, quality assurance concepts and procedures that were originally developed for environmental data collection may not be appropriate. Fortunately, software quality assurance is a ...

  10. WinTRAX: A raytracing software package for the design of multipole focusing systems

    NASA Astrophysics Data System (ADS)

    Grime, G. W.

    2013-07-01

    The software package TRAX was a simulation tool for modelling the path of charged particles through linear cylindrical multipole fields described by analytical expressions and was a development of the earlier OXRAY program (Grime and Watt, 1983; Grime et al., 1982) [1,2]. In a 2005 comparison of raytracing software packages (Incerti et al., 2005) [3], TRAX/OXRAY was compared with Geant4 and Zgoubi and was found to give close agreement with the more modern codes. TRAX was a text-based program which was only available for operation in a now rare VMS workstation environment, so a new program, WinTRAX, has been developed for the Windows operating system. This implements the same basic computing strategy as TRAX, and key sections of the code are direct translations from FORTRAN to C++, but the Windows environment is exploited to make an intuitive graphical user interface which simplifies and enhances many operations including system definition and storage, optimisation, beam simulation (including with misaligned elements) and aberration coefficient determination. This paper describes the program and presents comparisons with other software and real installations.
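
The kind of raytracing such packages perform can be illustrated for the simplest analytic multipole field, a paraxial magnetic quadrupole, integrated with RK4. The field strength and length below are arbitrary; WinTRAX's actual numerical scheme is not described in the abstract.

```python
# Sketch of paraxial raytracing through a quadrupole: in the focusing plane
# the ray obeys x'' = -k*x, integrated here with classical RK4. The values of
# k and length are illustrative, not taken from any real installation.
import math

def trace_quadrupole(x, xp, k, length, steps=1000):
    """Integrate x' = xp, xp' = -k*x over `length`; return final (x, x')."""
    h = length / steps

    def deriv(x_, xp_):
        return xp_, -k * x_

    for _ in range(steps):
        k1x, k1p = deriv(x, xp)
        k2x, k2p = deriv(x + h / 2 * k1x, xp + h / 2 * k1p)
        k3x, k3p = deriv(x + h / 2 * k2x, xp + h / 2 * k2p)
        k4x, k4p = deriv(x + h * k3x, xp + h * k3p)
        x += h / 6 * (k1x + 2 * k2x + 2 * k3x + k4x)
        xp += h / 6 * (k1p + 2 * k2p + 2 * k3p + k4p)
    return x, xp

# A ray entering parallel to the axis in the focusing plane is bent toward it.
x_out, xp_out = trace_quadrupole(x=0.01, xp=0.0, k=4.0, length=0.5)
```

For this linear field the exact solution is x(L) = x0·cos(√k·L), which makes the integrator easy to validate, one reason analytic multipole descriptions are attractive for optics design codes.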

  11. Computational protein design: the Proteus software and selected applications.

    PubMed

    Simonson, Thomas; Gaillard, Thomas; Mignon, David; Schmidt am Busch, Marcel; Lopes, Anne; Amara, Najette; Polydorides, Savvas; Sedano, Audrey; Druart, Karen; Archontis, Georgios

    2013-10-30

    We describe an automated procedure for protein design, implemented in a flexible software package, called Proteus. System setup and calculation of an energy matrix are done with the XPLOR modeling program and its sophisticated command language, supporting several force fields and solvent models. A second program provides algorithms to search sequence space. It allows a decomposition of the system into groups, which can be combined in different ways in the energy function, for both positive and negative design. The whole procedure can be controlled by editing 2-4 scripts. Two applications consider the tyrosyl-tRNA synthetase enzyme and its successful redesign to bind both O-methyl-tyrosine and D-tyrosine. For the latter, we present Monte Carlo simulations where the D-tyrosine concentration is gradually increased, displacing L-tyrosine from the binding pocket and yielding the binding free energy difference, in good agreement with experiment. Complete redesign of the Crk SH3 domain is presented. The top 10000 sequences are all assigned to the correct fold by the SUPERFAMILY library of Hidden Markov Models. Finally, we report the acid/base behavior of the SNase protein. Sidechain protonation is treated as a form of mutation; it is then straightforward to perform constant-pH Monte Carlo simulations, which yield good agreement with experiment. Overall, the software can be used for a wide range of applications, producing not only native-like sequences but also thermodynamic properties with errors that appear comparable to other current software packages. Copyright © 2013 Wiley Periodicals, Inc.
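
The constant-pH Monte Carlo idea above, protonation treated as a mutation accepted or rejected by a Metropolis test, can be sketched for a single titratable site. This toy reproduces the Henderson-Hasselbalch titration curve; it is a minimal illustration, not the Proteus implementation.

```python
# Toy constant-pH Monte Carlo: one titratable site, two states. Flipping the
# protonation state costs ln(10)*(pKa - pH) in units of kT (sign depending on
# direction), and is accepted by the Metropolis criterion.
import math, random

def titrate(pka, ph, steps=200000, seed=1):
    """Return the equilibrium protonated fraction from a Metropolis chain."""
    random.seed(seed)
    ln10 = math.log(10.0)
    protonated = True
    count = 0
    for _ in range(steps):
        # Energy change (in kT) for flipping the current protonation state.
        dE = ln10 * (pka - ph) if protonated else ln10 * (ph - pka)
        if dE <= 0 or random.random() < math.exp(-dE):
            protonated = not protonated
        count += protonated
    return count / steps

frac = titrate(pka=7.0, ph=7.0)  # half-protonated when pH equals pKa
```

The stationary distribution satisfies P(deprotonated)/P(protonated) = 10^(pH-pKa), i.e. exactly the Henderson-Hasselbalch relation, which is what makes the "protonation as mutation" framing thermodynamically consistent.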

  12. Intracranial Applications of MR Imaging-Guided Focused Ultrasound.

    PubMed

    Khanna, N; Gandhi, D; Steven, A; Frenkel, V; Melhem, E R

    2017-03-01

    Initially used in the treatment of prostate cancer and uterine fibroids, the role of focused ultrasound has expanded as transcranial acoustic wave distortion and other limitations have been overcome. Its utility relies on focal energy deposition via acoustic wave propagation. The duty cycle and intensity of focused ultrasound influence the rate of energy deposition and result in unique physiologic and biomechanical effects. Thermal ablation via high-intensity continuous exposure generates coagulative necrosis of tissues. High-intensity, pulsed application reduces temporally averaged energy deposition, resulting in mechanical effects, including reversible, localized BBB disruption, which enhances neurotherapeutic agent delivery. While the precise mechanisms remain unclear, low-intensity, pulsed exposures can influence neuronal activity with preservation of cytoarchitecture. Its noninvasive nature, high-resolution, radiation-free features allow focused ultrasound to compare favorably with other modalities. We discuss the physical characteristics of focused ultrasound devices, the biophysical mechanisms at the tissue level, and current and emerging applications.

  13. An investigation of modelling and design for software service applications

    PubMed Central

    2017-01-01

    Software services offer the opportunity to use a component-based approach for the design of applications. However, this needs a deeper understanding of how to develop service-based applications in a systematic manner, and of the set of properties that need to be included in the ‘design model’. We have used a realistic application to explore systematically how service-based designs can be created and described. We first identified the key properties of an SOA (service oriented architecture) and then undertook a single-case case study to explore its use in the development of a design for a large-scale application in energy engineering, modelling this with existing notations wherever possible. We evaluated the resulting design model using two walkthroughs with both domain and application experts. We were able to successfully develop a design model around the ten properties identified, and to describe it by adapting existing design notations. A component-based approach to designing such systems does appear to be feasible. However, it needs the assistance of a more integrated set of notations for describing the resulting design model. PMID:28489905

  14. An investigation of modelling and design for software service applications.

    PubMed

    Anjum, Maria; Budgen, David

    2017-01-01

    Software services offer the opportunity to use a component-based approach for the design of applications. However, this needs a deeper understanding of how to develop service-based applications in a systematic manner, and of the set of properties that need to be included in the 'design model'. We have used a realistic application to explore systematically how service-based designs can be created and described. We first identified the key properties of an SOA (service oriented architecture) and then undertook a single-case case study to explore its use in the development of a design for a large-scale application in energy engineering, modelling this with existing notations wherever possible. We evaluated the resulting design model using two walkthroughs with both domain and application experts. We were able to successfully develop a design model around the ten properties identified, and to describe it by adapting existing design notations. A component-based approach to designing such systems does appear to be feasible. However, it needs the assistance of a more integrated set of notations for describing the resulting design model.

  15. Application of automated topography focus corrections for volume manufacturing

    NASA Astrophysics Data System (ADS)

    Wiltshire, Timothy J.; Liegl, Bernhard R.; Hwang, Emily M.; Lucksinger, Mark R.

    2010-03-01

    This work describes the implementation and performance of AGILE focus corrections for advanced photo lithography in volume production as well as advanced development in IBM's 300mm facility. In particular, a logic hierarchy that manages the air gage sub-system corrections to optimize tool productivity while sampling with sufficient frequency to ensure focus accuracy for stable production processes is described. The information reviewed includes: General AGILE implementation approaches; Sample focus correction contours for critical 45nm, 32nm, and 22nm applications; An outline of the IBM Advanced Process Control (APC) logic and system(s) that manage the focus correction sets; Long term, historical focus correction data for stable 45nm processes as well as development stage 32nm processes; Practical issues encountered and possible enhancements to the methodology.
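
A run-to-run correction update of the general kind an APC system applies to per-tool focus offsets can be sketched as an exponentially weighted feedback step. The weighting scheme and values below are assumptions for illustration; the abstract does not describe IBM's actual control logic or parameters.

```python
# Generic run-to-run feedback: blend each newly measured focus error into the
# running correction with weight `weight`. This is a common APC pattern, shown
# here as an assumption; it is not the documented IBM implementation.

def update_correction(previous, measured_error, weight=0.3):
    """Return the new correction after observing one sampled focus error."""
    return previous + weight * measured_error

corr = 0.0
for err in [12.0, 8.0, 5.0, 2.0]:  # hypothetical sampled focus errors, in nm
    corr = update_correction(corr, err)
```

A small weight damps measurement noise at the cost of slower convergence, which is the same trade-off between sampling frequency and focus accuracy that the abstract's correction-management logic must balance.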

  16. CHSSI Software for Geometrically Complex Unsteady Aerodynamic Applications

    NASA Technical Reports Server (NTRS)

    Chan, William M.; Meakin, Robert L.; Potsdam, Mark A.

    2001-01-01

    A comprehensive package of scalable overset grid CFD software is reviewed. The software facilitates accurate simulation of complete aircraft aerodynamics, including viscous effects, unsteadiness, and relative motion between component parts. The software significantly lowers the manpower and computer costs normally associated with such efforts. The software is discussed in terms of current capabilities and planned future enhancements.

  17. The Application of Software Safety to the Constellation Program Launch Control System

    NASA Technical Reports Server (NTRS)

    Kania, James; Hill, Janice

    2011-01-01

    The application of software safety practices on the LCS project resulted in the successful implementation of the NASA Software Safety Standard NASA-STD-8719.13B and CxP software safety requirements. The GOP-GEN-GSW-011 Hazard Report was the first report developed at KSC to identify software hazard causes and their controls. This approach can be applied to similar large software-intensive systems where loss of control can lead to a hazard.

  18. Weighted Ensemble Simulation: Review of Methodology, Applications, and Software.

    PubMed

    Zuckerman, Daniel M; Chong, Lillian T

    2017-05-22

    The weighted ensemble (WE) methodology orchestrates quasi-independent parallel simulations run with intermittent communication that can enhance sampling of rare events such as protein conformational changes, folding, and binding. The WE strategy can achieve superlinear scaling: the unbiased estimation of key observables such as rate constants and equilibrium state populations to greater precision than would be possible with ordinary parallel simulation. WE software can be used to control any dynamics engine, such as standard molecular dynamics and cell-modeling packages. This article reviews the theoretical basis of WE and goes on to describe successful applications to a number of complex biological processes: protein conformational transitions, (un)binding, and assembly processes, as well as cell-scale processes in systems biology. We furthermore discuss the challenges that need to be overcome in the next phase of WE methodological development. Overall, the combined advances in WE methodology and software have enabled the simulation of long-timescale processes that would otherwise not be practical on typical computing resources using standard simulation.
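
The WE resampling step, splitting and merging walkers within a bin while conserving total probability weight, can be sketched as follows. Binning and the dynamics engine are omitted, and the split/merge policy shown is a simplification of what production WE software implements.

```python
# Minimal sketch of weighted-ensemble resampling within one bin: split or
# merge (state, weight) walkers until the bin holds `target` walkers, while
# conserving the bin's total probability weight exactly.
import random

def resample_bin(walkers, target, seed=0):
    """walkers: list of (state, weight). Returns `target` walkers, same total weight."""
    rng = random.Random(seed)
    out = list(walkers)
    while len(out) < target:                      # split the heaviest walker
        i = max(range(len(out)), key=lambda j: out[j][1])
        s, w = out[i]
        out[i] = (s, w / 2)
        out.append((s, w / 2))
    while len(out) > target:                      # merge the two lightest
        out.sort(key=lambda sw: sw[1])
        (s1, w1), (s2, w2) = out[0], out[1]
        # Keep one state, chosen with probability proportional to its weight,
        # so the merge is unbiased; the survivor absorbs both weights.
        keep = s1 if rng.random() < w1 / (w1 + w2) else s2
        out = [(keep, w1 + w2)] + out[2:]
    return out
```

Because weight is only ever divided or summed, estimates built from walker weights (rates, populations) remain unbiased, which is the key property behind WE's "superlinear scaling" claim.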

  19. Optimizing Flight Control Software With an Application Platform

    NASA Technical Reports Server (NTRS)

    Smith, Irene Skupniewicz; Shi, Nija; Webster, Christopher

    2012-01-01

    Flight controllers in NASA's mission control centers work day and night to ensure that missions succeed and crews are safe. The IT goals of NASA mission control centers are similar to those of most businesses: to evolve IT infrastructure from basic to dynamic. This paper describes Mission Control Technologies (MCT), an application platform that is powering mission control today and is designed to meet the needs of future NASA control centers. MCT is an extensible platform that provides GUI components and a runtime environment. The platform enables NASA's IT goals through its use of lightweight interfaces and configurable components, which promote standardization and incorporate useful solution patterns. The MCT architecture positions mission control centers to reach the goal of dynamic IT, leading to lower cost of ownership, and treating software as a strategic investment.

  20. Practical Application of Model Checking in Software Verification

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Skakkebaek, Jens Ulrik

    1999-01-01

    This paper presents our experiences in applying JAVA PATHFINDER (JPF), a recently developed JAVA-to-SPIN translator, to find synchronization bugs in a Chinese Chess game server application written in JAVA. We give an overview of JPF and the subset of JAVA that it supports, and describe the abstraction and verification of the game server. Finally, we analyze the results of the effort. We argue that abstraction by under-approximation is necessary for obtaining sufficiently small models for verification purposes; that user guidance is crucial for effective abstraction; and that current model checkers do not conveniently support the computational models of software in general and JAVA in particular.
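
Explicit-state model checking of the sort SPIN and JPF perform can be illustrated in miniature: the toy checker below enumerates all interleavings of two threads whose test-and-set is (deliberately) non-atomic and searches for a mutual-exclusion violation. The model is invented for illustration and is not the game server from the paper.

```python
# Tiny explicit-state model checker: breadth-first search over the reachable
# states of two threads racing on a lock whose read and write are not atomic.
# Per-thread pc: 0 = idle, 1 = saw lock free, 2 = in critical section.
from collections import deque

def successors(state):
    pcs, lock = state
    for t in (0, 1):
        pc = pcs[t]
        if pc == 0 and lock == 0:          # read the lock (separate from write)
            yield (pcs[:t] + (1,) + pcs[t + 1:], lock)
        elif pc == 1:                       # write the lock, enter critical section
            yield (pcs[:t] + (2,) + pcs[t + 1:], 1)

def find_violation(initial=((0, 0), 0)):
    """Return a reachable state with both threads in the critical section, or None."""
    seen, queue = {initial}, deque([initial])
    while queue:
        state = queue.popleft()
        if state[0] == (2, 2):              # mutual exclusion violated
            return state
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return None
```

The checker finds the classic race: both threads observe the lock free before either writes it. Real tools add the abstraction the paper discusses precisely because exhaustive state enumeration does not scale to full programs.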

  1. Process Orchestration With Modular Software Applications On Intelligent Field Devices

    NASA Astrophysics Data System (ADS)

    Orfgen, Marius; Schmitt, Mathias

    2015-07-01

    The method developed by the DFKI-IFS for extending the functionality of intelligent field devices through the use of reloadable software applications (so-called Apps) is to be further augmented with a methodology and communication concept for process orchestration. The concept allows individual Apps from different manufacturers to decentrally share information. This way of communicating forms the basis for the dynamic orchestration of Apps to complete processes, in that it allows the actions of one App (e.g. detecting a component part with a sensor App) to trigger reactions in other Apps (e.g. triggering the processing of that component part). A holistic methodology and its implementation as a configuration tool allows one to model the information flow between Apps, as well as automatically introduce it into physical production hardware via available interfaces provided by the Field Device Middleware. Consequently, configuring industrial facilities is made simpler, resulting in shorter changeover and shutdown times.

  2. Bibliographic Management Software: A Focus Group Study of the Preferences and Practices of Undergraduate Students

    ERIC Educational Resources Information Center

    Salem, Jamie; Fehrmann, Paul

    2013-01-01

    With the growing population of undergraduate students on our campus and an increased focus on their success, librarians at a large midwestern university were interested in the citation management styles of this university cohort. Our university library spends considerable resources each year to maintain and promote access to the robust…

  4. Application of Zernike polynomials towards accelerated adaptive focusing of transcranial high intensity focused ultrasound

    PubMed Central

    Kaye, Elena A.; Hertzberg, Yoni; Marx, Michael; Werner, Beat; Navon, Gil; Levoy, Marc; Pauly, Kim Butts

    2012-01-01

    initial estimates based on using the average of the phase aberration data from the individual subgroups of subjects was shown to increase the intensity at the focal spot for the five subjects. Conclusions: The application of ZPs to phase aberration correction was shown to be beneficial for adaptive focusing of transcranial ultrasound. The skull-based phase aberrations were found to be well approximated by the number of ZP modes representing only a fraction of the number of elements in the hemispherical transducer. Implementing the initial phase aberration estimate together with Zernike-based algorithm can be used to improve the robustness and can potentially greatly increase the viability of MR-ARFI-based focusing for a clinical transcranial MRgFUS therapy. PMID:23039661

  5. Application of Zernike polynomials towards accelerated adaptive focusing of transcranial high intensity focused ultrasound.

    PubMed

    Kaye, Elena A; Hertzberg, Yoni; Marx, Michael; Werner, Beat; Navon, Gil; Levoy, Marc; Pauly, Kim Butts

    2012-10-01

    phase aberration data from the individual subgroups of subjects was shown to increase the intensity at the focal spot for the five subjects. The application of ZPs to phase aberration correction was shown to be beneficial for adaptive focusing of transcranial ultrasound. The skull-based phase aberrations were found to be well approximated by the number of ZP modes representing only a fraction of the number of elements in the hemispherical transducer. Implementing the initial phase aberration estimate together with Zernike-based algorithm can be used to improve the robustness and can potentially greatly increase the viability of MR-ARFI-based focusing for a clinical transcranial MRgFUS therapy.
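
The central claim, that skull-induced phase aberrations are well approximated by far fewer Zernike modes than transducer elements, can be illustrated with a toy projection onto just piston and defocus. The phase map and mode set below are synthetic and far simpler than the MR-ARFI data in the study.

```python
# Toy Zernike decomposition: fit a synthetic phase map over the unit disk with
# two modes (piston and defocus Z2^0) by per-mode least squares on a grid.
import math

def defocus(x, y):
    """Zernike defocus mode Z2^0, orthonormal on the unit disk."""
    return math.sqrt(3.0) * (2.0 * (x * x + y * y) - 1.0)

def fit_modes(phase):
    """Least-squares coefficients for piston and defocus on a disk grid."""
    pts = [(i / 50.0, j / 50.0) for i in range(-50, 51) for j in range(-50, 51)
           if (i / 50.0) ** 2 + (j / 50.0) ** 2 <= 1.0]
    modes = [lambda x, y: 1.0, defocus]
    coeffs = []
    for m in modes:  # modes are (nearly) orthogonal, so fit one at a time
        num = sum(phase(x, y) * m(x, y) for x, y in pts)
        den = sum(m(x, y) ** 2 for x, y in pts)
        coeffs.append(num / den)
    return coeffs

# Synthetic aberration built from the same two modes; the fit recovers them.
c_piston, c_defocus = fit_modes(lambda x, y: 0.2 + 0.7 * defocus(x, y))
```

Orthogonality over the aperture is what lets each coefficient be estimated independently; a real correction would use many more modes, yet still far fewer unknowns than elements in a hemispherical array.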

  6. Operational excellence (six sigma) philosophy: Application to software quality assurance

    SciTech Connect

    Lackner, M.

    1997-11-01

    This report contains viewgraphs on the operational excellence philosophy of six sigma applied to software quality assurance. This report outlines the following: goal of six sigma; six sigma tools; manufacturing vs administrative processes; software quality assurance document inspections; map software quality assurance requirements document; failure mode effects analysis for requirements document; measuring the right response variables; and questions.

  7. Dynamically focused optical coherence tomography for endoscopic applications

    NASA Astrophysics Data System (ADS)

    Divetia, Asheesh; Hsieh, Tsung-Hsi; Zhang, Jun; Chen, Zhongping; Bachman, Mark; Li, Guann-Pyng

    2005-03-01

    We report a demonstration of a small liquid-filled polymer lens that may be used to dynamically provide scanning depth focus for endoscopic optical coherence tomography (OCT) applications. The focal depth of the lens is controlled by changing the hydraulic pressure within the lens, enabling dynamic focal depth control without the need for articulated parts. The 1 mm diameter lens is shown to have resolving power of 5 μm, and can enable depth scans of 2.5 mm, making it suitable for use with OCT-enabled optical biopsy applications.

  8. High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    SciTech Connect

    Habib, Salman; Roser, Robert

    2015-10-28

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.

  9. Focused microwave-assisted Soxhlet extraction: devices and applications.

    PubMed

    Luque-García, J L; Luque de Castro, M D

    2004-10-20

    An overview of a new extraction technique called focused microwave-assisted Soxhlet extraction (FMASE) is presented here. This technique is based on the same principles as conventional Soxhlet extraction but uses microwaves as auxiliary energy to accelerate the process. The different devices designed and constructed so far, their advantages and limitations, as well as their main applications in environmental and food analysis, are discussed in this article.

  10. Migrating data from TcSE to DOORS : an evaluation of the T-Plan Integrator software application.

    SciTech Connect

    Post, Debra S.; Manzanares, David A.; Taylor, Jeffrey L.

    2011-02-01

    This report describes our evaluation of the T-Plan Integrator software application as it was used to transfer a real data set from the Teamcenter for Systems Engineering (TcSE) software application to the DOORS software application. The T-Plan Integrator was evaluated to determine whether it would meet the needs of Sandia National Laboratories to migrate our existing data sets from TcSE to DOORS. This report presents the challenges of migrating data and focuses on how the Integrator can be used to map a data set and its data architecture from TcSE to DOORS. Finally, this report describes how the bulk of the migration can take place using the Integrator; however, about 20-30% of the data would need to be transferred from TcSE to DOORS manually. This report does not evaluate the transfer of data from DOORS to TcSE.

  11. Optimizing Focusing X-Ray Optics for Planetary Science Applications

    NASA Astrophysics Data System (ADS)

    Melso, Nicole; Romaine, Suzanne; Hong, Jaesub; Cotroneo, Vincenzo

    2015-01-01

    X-ray observations are a valuable tool for studying the composition, formation and evolution of the numerous X-ray emitting objects in our Solar System. Although there are plenty of useful applications for in situ X-ray focusing instrumentation, X-ray focusing optics have never been feasible for use onboard planetary missions due to their mass and cost. Recent advancements in small-scale X-ray instrumentation have made focusing X-ray technology more practical and affordable for use onboard in situ spacecraft. Specifically, the technology of a metal-ceramic hybrid material combined with Electroformed Nickel Replication (ENR) holds great promise for realizing lightweight X-ray optics. We are working to optimize these lightweight focusing X-ray optics for use in planetary science applications. We have explored multiple configurations and geometries that maximize the telescope's effective area and field of view while meeting practical mass and volume requirements. Each configuration was modeled via analytic calculations and Monte Carlo ray tracing simulations and compared to alternative Micro-pore Optics designs. The improved performance of our approach using hybrid materials has many exciting implications for the future of planetary science, X-ray instrumentation, and the exploration of X-ray sources in our Solar System. This work was supported in part by the NSF REU and DoD ASSURE programs under NSF grant no. 1262851 and by the Smithsonian Institution.

  12. Application of Department of Defense Software Management Techniques to Medical Software Development Projects

    PubMed Central

    Stanford, Jean Hanmer; Siegel, Jean Lafaye

    1981-01-01

    The Department of Defense (DoD) is probably the biggest buyer and developer of computer software in the world. Over the years the DoD has developed, implemented, and tested various software management techniques. In this paper the authors describe the major formalized techniques and indicate how they could be applied to software development projects in medical environments. The major control technique used to manage the design of a software project is formal, standardized system documentation. The military departments have developed detailed definitions of the required content for the system documents; these can be found in Mil. Std. 7935.1-S and in Mil. Std. 490. The authors present the major documents in the 7935.1-S documentation suite and indicate their appropriate use. The DoD also uses another formal technique, Configuration Management, to control the contents of the documents and to control changes to the system as its development progresses. After the software is installed at one or more sites, the Status Accounting portion of the Configuration Management discipline is used to keep track of the contents of the operating versions of the software out in the field.

  13. Clinical applications of high-intensity focused ultrasound.

    PubMed

    She, W H; Cheung, T T; Jenkins, C R; Irwin, M G

    2016-08-01

    Ultrasound has been developed for therapeutic use in addition to its diagnostic ability. The use of focused ultrasound energy can offer a non-invasive method for tissue ablation, and can therefore be used to treat various solid tumours. High-intensity focused ultrasound is being increasingly used in the treatment of both primary and metastatic tumours as these can be precisely located for ablation. It has been shown to be particularly useful in the treatment of uterine fibroids, and various solid tumours including those of the pancreas and liver. High-intensity focused ultrasound is a valid treatment option for liver tumours in patients with significant medical co-morbidity who are at high risk for surgery or who have relatively poor liver function that may preclude hepatectomy. It has also been used as a form of bridging therapy for patients awaiting cadaveric donor liver transplantation. In this article, we outline the principles of high-intensity focused ultrasound and its clinical applications, including the management protocol development in the treatment of hepatocellular carcinoma in Hong Kong, by performing a search on MEDLINE (OVID), EMBASE, and PubMed. The search of these databases ranged from the date of their establishment until December 2015. The search terms used were: high-intensity focused ultrasound, ultrasound, magnetic resonance imaging, liver tumour, hepatocellular carcinoma, pancreas, renal cell carcinoma, prostate cancer, breast cancer, fibroids, bone tumour, atrial fibrillation, glaucoma, Parkinson's disease, essential tremor, and neuropathic pain.

  14. Software Portability Considerations for Multiple Applications over Multiple Sites

    PubMed Central

    Munnecke, Thomas

    1981-01-01

    There are great benefits to be obtained by distributing the cost of software development over multiple sites. Both economies and dis-economies of scale become prominent when broad-based software portability is examined carefully. However, traditional data processing techniques are oriented toward making software for specific users, rather than general software for a class of users. This trend toward overspecification is getting worse with traditional data processing languages, while other standard languages are confronting it.

  15. Application of Neural Networks in Software Engineering: A Review

    NASA Astrophysics Data System (ADS)

    Singh, Yogesh; Bhatia, Pradeep Kumar; Kaur, Arvinder; Sangwan, Omprakash

    Software engineering is a comparatively new and ever-changing field. The challenge of meeting tight project schedules with quality software requires that the field of software engineering be automated to a large extent and that human intervention be minimized to an optimum level. To achieve this goal, researchers have explored the potential of machine learning approaches, as they are adaptable, have learning capabilities, and are non-parametric. In this paper, we take a look at how Neural Networks (NN) can be used to build tools for software development and maintenance tasks.

  16. Cloud based, Open Source Software Application for Mitigating Herbicide Drift

    NASA Astrophysics Data System (ADS)

    Saraswat, D.; Scott, B.

    2014-12-01

    The spread of herbicide-resistant weeds has resulted in the need for clearly marked fields. In response to this need, the University of Arkansas Cooperative Extension Service launched a program named Flag the Technology in 2011. This program uses color-coded flags as a visual alert of the herbicide trait technology within a farm field. The flag-based program also serves to help avoid herbicide misapplication and prevent herbicide drift damage between fields with differing crop technologies. This program has been endorsed by the Southern Weed Science Society of America and is attracting interest from across the USA, Canada, and Australia. However, flags risk being misplaced or lost through mischief or severe windstorms/thunderstorms. This presentation will discuss the design and development of a cloud-based, free application utilizing open-source technologies, called Flag the Technology Cloud (FTTCloud), that allows agricultural stakeholders to color-code their farm fields to indicate herbicide-resistant technologies. The developed software utilizes modern web development practices, widely used design technologies, and basic geographic information system (GIS) based interactive interfaces for representing, color-coding, searching, and visualizing fields. The program has also been made compatible with devices of different sizes: smartphones, tablets, desktops, and laptops.

  17. Web Services Provide Access to SCEC Scientific Research Application Software

    NASA Astrophysics Data System (ADS)

    Gupta, N.; Gupta, V.; Okaya, D.; Kamb, L.; Maechling, P.

    2003-12-01

    Web services offer scientific communities a new paradigm for sharing research codes and communicating results. While there are formal technical definitions of what constitutes a web service, for a user community such as the Southern California Earthquake Center (SCEC), we may conceptually consider a web service to be functionality provided on-demand by an application which is run on a remote computer located elsewhere on the Internet. The value of a web service is that it can (1) run a scientific code without the user needing to install and learn the intricacies of running the code; (2) provide the technical framework which allows a user's computer to talk to the remote computer which performs the service; (3) provide the computational resources to run the code; and (4) bundle several analysis steps and provide the end results in digital or (post-processed) graphical form. Within an NSF-sponsored ITR project coordinated by SCEC, we are constructing web services using architectural protocols and programming languages (e.g., Java). However, because the SCEC community has a rich pool of scientific research software (written in traditional languages such as C and FORTRAN), we also emphasize making existing scientific codes available by constructing web service frameworks which wrap around and directly run these codes. In doing so we attempt to broaden community usage of these codes. Web service wrapping of a scientific code can be done using a "web servlet" construction or by using a SOAP/WSDL-based framework. This latter approach is widely adopted in IT circles although it is subject to rapid evolution. Our wrapping framework attempts to "honor" the original codes with as little modification as is possible. For versatility we identify three methods of user access: (A) a web-based GUI (written in HTML and/or Java applets); (B) a Linux/OSX/UNIX command line "initiator" utility (shell-scriptable); and (C) direct access from within any Java application (and with the
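    The wrapping idea this abstract describes, invoking an existing command-line scientific code without modifying it and relaying its output, can be sketched minimally. The SCEC frameworks themselves were Java servlet/SOAP based; this Python stand-in uses `echo` in place of a real C/FORTRAN code, so the executable name and arguments are purely illustrative:

    ```python
    import subprocess

    def run_legacy_code(executable: str, args: list[str]) -> str:
        """Wrap an existing command-line code: run it unmodified, as a
        web-service handler would, and hand back its standard output."""
        result = subprocess.run([executable, *args], capture_output=True,
                                text=True, check=True)
        return result.stdout

    # Stand-in for a wrapped C/FORTRAN code (any CLI program works the same way).
    output = run_legacy_code("echo", ["synthetic", "seismogram"])
    print(output.strip())
    ```

    A real service would expose `run_legacy_code` behind an HTTP endpoint and marshal the arguments from the request, but the "honor the original code" principle is the same: the legacy executable runs untouched.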

  18. Update on Clinical MR-guided Focused Ultrasound Applications

    PubMed Central

    McDannold, Nathan

    2015-01-01

    Focused ultrasound (FUS) can be used to thermally ablate tissue. Performing FUS under magnetic resonance (MR) guidance enables aiming the focus at the target, accurate treatment planning, real-time temperature mapping, and evaluation of the treatment. This review updates several clinical applications of MR-guided FUS. MR-guided FUS has a CE mark and FDA approval for thermal ablation of uterine fibroids and for pain management related to bone metastases. Thousands of uterine fibroid patients have been treated successfully with minor side effects. Technical improvements, increased experience, and the use of a screening MRI examination should further improve treatment outcomes. When used for bone metastases and other bone diseases, thermal ablation leads to pain relief due to denervation, and to debulking of the tumor. The use of a hemispherical multi-element transducer and phase corrections have enabled application of FUS through the skull. Transcranial MR-guided FUS has received CE certification for ablation of deep, central locations in the brain such as the thalamus. Thermal ablation of specific parts of the thalamus can relieve symptoms in neurological disorders such as essential tremor, Parkinson's disease, and neuropathic pain. No CE mark or FDA approval has been obtained as yet for treatment of prostate cancer or breast cancer, but several approaches have been proposed, and clinical trials should show the potential of MR-guided FUS for these and other applications. PMID:26499282

  19. Does Screencast Teaching Software Application Needs Narration for Effective Learning?

    ERIC Educational Resources Information Center

    Mohamad Ali, Ahmad Zamzuri; Samsudin, Khairulanuar; Hassan, Mohamad; Sidek, Salman Firdaus

    2011-01-01

    The aim of this study was to investigate the effects of screencasts with and without narration on learning performance. A series of screencasts teaching Flash animation software was developed using screen capture software for the purpose of this research. The screencast series was uploaded to specialized channels created in…

  20. Fully-Focused SAR Altimetry for Oceanographic Applications

    NASA Astrophysics Data System (ADS)

    Egido, A.; Smith, W. H. F.

    2016-12-01

    In this paper we present an innovative approach for the processing of SAR altimetry data. By accounting for the phase evolution of the targets in the scene, it is possible to focus the complex echoes along the aperture and perform a coherent integration potentially as long as the target illumination time. This process, similar to that of SAR imaging systems, reduces the along-track resolution to the theoretical limit of L/2, where L is the antenna length. We call this fully focused SAR altimetry processing. For the development of the technique we have used CryoSat-2 SAR mode data, but our methods could also be used with similar data from Sentinel-3 or Sentinel-6/Jason-CS. The footprint of a fully focused SAR altimeter measurement is an elongated strip on the surface, which is pulse-limited across-track and SAR-focused along-track. The technique has been demonstrated using transponder data, showing an achievable along-track resolution of 0.5 meters. Despite the asymmetry of the altimeter footprint, the fully focused technique may be useful for applications in which one needs to separate specific targets within highly heterogeneous scenes, such as sea-ice lead detection, hydrology, and coastal altimetry. On a random rough surface like the open ocean, the fully focused altimeter waveform is a random realization of speckle noise. The single looks of the surface are uncorrelated with each other, so they can be incoherently averaged to obtain a multi-looked waveform with significant improvements over conventional and delay/Doppler (unfocused SAR) altimeters. This paper reviews the results that we have obtained so far from CryoSat-2 SAR mode observations of the open ocean, where a consistent performance improvement of the square root of 2 with respect to the ESA L2 product is obtained for both sea surface height range and significant wave height.
The performance improvement is lower than expected for an ideal FF-SAR, as the
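    The incoherent multi-looking step described in this abstract, averaging uncorrelated single looks to suppress speckle, can be illustrated numerically. The exponential speckle model, look count, and trial count below are illustrative assumptions, not values from the paper:

    ```python
    import random
    import statistics

    random.seed(42)

    # Single-look power over a random rough surface is speckle: exponentially
    # distributed about the mean backscattered power.
    def single_looks(mean_power: float, n: int) -> list[float]:
        return [random.expovariate(1.0 / mean_power) for _ in range(n)]

    # Incoherently averaging N uncorrelated looks cuts the noise standard
    # deviation by roughly sqrt(N).
    n_looks, n_trials, mean_power = 64, 2000, 1.0
    single = single_looks(mean_power, n_trials)
    multi = [statistics.fmean(single_looks(mean_power, n_looks)) for _ in range(n_trials)]

    ratio = statistics.stdev(single) / statistics.stdev(multi)
    print(f"std reduction factor ≈ {ratio:.1f} (sqrt(64) = 8)")
    ```

    With 64 looks the measured reduction factor comes out close to 8, which is the sqrt(N) behavior that underlies the noise improvement of multi-looked waveforms over single looks.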

  1. Transfer of Learning: The Effects of Different Instruction Methods on Software Application Learning

    ERIC Educational Resources Information Center

    Larson, Mark E.

    2010-01-01

    Human Resource Departments (HRD), especially instructors, are challenged to keep pace with rapidly changing computer software applications and technology. The problem under investigation was whether, after instruction in a software application, a particular method of instruction was a predictor of transfer of learning when other risk factors were…

  3. Integrating open-source software applications to build molecular dynamics systems.

    PubMed

    Allen, Bruce M; Predecki, Paul K; Kumosa, Maciej

    2014-04-05

    Three open-source applications, NanoEngineer-1, packmol, and msi2lmp, are integrated using an open-source file format to quickly create molecular dynamics (MD) cells for simulation. The three software applications collectively make up the open-source software (OSS) suite known as MD Studio (MDS). The software is validated through software engineering practices and is verified through simulation of the diglycidyl ether of bisphenol-A and isophorone diamine (DGEBA/IPD) system. Multiple simulations are run using the MDS software to create MD cells, and the data generated are used to calculate the density, bulk modulus, and glass transition temperature of the DGEBA/IPD system. Simulation results compare well with published experimental and numerical results. The MDS software prototype confirms that OSS applications can be analyzed against real-world research requirements and integrated to create a new capability. Copyright © 2014 Wiley Periodicals, Inc.

  4. Laboratory Connections. Commercial Interfacing Packages: Part II: Software and Applications.

    ERIC Educational Resources Information Center

    Powers, Michael H.

    1989-01-01

    Describes the titration of a weak base with a strong acid and subsequent treatment of experimental data using commercially available software. Provides a BASIC program for determining first and second derivatives of data input. Lists 6 references. (YP)
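    The derivative treatment the article implements in BASIC can be sketched with central differences in Python; the titration data below are invented for illustration, not taken from the article:

    ```python
    def first_derivative(x, y):
        """Central-difference dy/dx at the interior data points."""
        return [(y[i+1] - y[i-1]) / (x[i+1] - x[i-1]) for i in range(1, len(x) - 1)]

    def second_derivative(x, y):
        """Central-difference d2y/dx2 at the interior points (uniform spacing assumed)."""
        h = x[1] - x[0]
        return [(y[i+1] - 2*y[i] + y[i-1]) / h**2 for i in range(1, len(x) - 1)]

    # Illustrative titration data: the endpoint is where dpH/dV is steepest
    # and d2pH/dV2 changes sign.
    volume = [0.0, 1.0, 2.0, 3.0, 4.0]    # mL of titrant added
    ph     = [10.8, 10.2, 7.0, 3.8, 3.2]  # measured pH

    d1 = first_derivative(volume, ph)
    d2 = second_derivative(volume, ph)
    print(d1)  # steepest (most negative) slope near the 2 mL point
    print(d2)  # sign change brackets the endpoint
    ```

    On this data the second derivative passes through zero at the 2 mL point, which is how the endpoint is located from the derivative plots.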

  5. Applications of Microcomputers in the Teaching of Physics 6502 Software.

    ERIC Educational Resources Information Center

    Marsh, David P.

    1980-01-01

    Described is a variety of uses of the microcomputer when coupled with software available for systems using 6502 microprocessors. Included are several computer programs which exhibit some of the possibilities for programing the 6502 microprocessors. (DS)

  6. A Taxonomy of Knowledge Management Software Tools: Origins and Applications.

    ERIC Educational Resources Information Center

    Tyndale, Peter

    2002-01-01

    Examines, evaluates, and organizes a wide variety of knowledge management software tools by examining the literature related to the selection and evaluation of knowledge management tools. (Author/SLD)

  7. Application of NX Siemens PLM software in educational process in preparing students of engineering branch

    NASA Astrophysics Data System (ADS)

    Sadchikova, G. M.

    2017-01-01

    This article discusses the results of introducing the NX computer-aided design system by Siemens PLM Software into classes at a higher education institution. The necessity of applying modern information technologies in teaching engineering students, and of selecting an appropriate software product, is substantiated. The author describes the stages of studying the software module in relation to specific courses and considers the features of the NX software that require the creation of standard and unified product databases. The article also gives examples of research carried out by students with the various software modules.

  8. Current focus of stem cell application in retinal repair

    PubMed Central

    Alonso-Alonso, María L; Srivastava, Girish K

    2015-01-01

    The relevance of retinal diseases, both to society's economy and to the quality of life of people who suffer from them, has made stem cell therapy an interesting topic for research. Embryonic stem cells (ESCs), induced pluripotent stem cells (iPSCs) and adipose-derived mesenchymal stem cells (ADMSCs) are the focus of current endeavors as a source of different retinal cells, such as photoreceptors and retinal pigment epithelial cells. The aim is to apply them for cell replacement as an option for treating retinal diseases that are so far untreatable in their advanced stages. ESCs, despite their great potential for differentiation, carry the dangerous risk of teratoma formation as well as ethical issues, which must be resolved before starting a clinical trial. iPSCs, like ESCs, are able to differentiate into several types of retinal cells. However, the process of obtaining them for personalized cell therapy has a high cost in terms of time and money. Researchers are working to resolve this, since iPSCs seem to be a realistic option for treating retinal diseases. ADMSCs have the advantage that the procedures to obtain them are easier. Despite advancements in stem cell application, several challenges still need to be overcome before transferring research results to clinical application. This paper reviews recent research achievements in the application of these three types of stem cells as well as clinical trials currently based on them. PMID:25914770

  9. Software Application for Computer Aided Vocabulary Learning in a Blended Learning Environment

    ERIC Educational Resources Information Center

    Essam, Rasha

    2010-01-01

    This study focuses on the effect of computer-aided vocabulary learning software called "ArabCAVL" on students' vocabulary acquisition. It was hypothesized that students who use the ArabCAVL software in blended learning environment will surpass students who use traditional vocabulary learning strategies in face-to-face learning…

  10. Spectromicroscopy and coherent diffraction imaging: focus on energy materials applications.

    PubMed

    Hitchcock, Adam P; Toney, Michael F

    2014-09-01

    Current and future capabilities of X-ray spectromicroscopy are discussed based on coherence-limited imaging methods which will benefit from the dramatic increase in brightness expected from a diffraction-limited storage ring (DLSR). The methods discussed include advanced coherent diffraction techniques and nanoprobe-based real-space imaging using Fresnel zone plates or other diffractive optics whose performance is affected by the degree of coherence. The capabilities of current systems, improvements which can be expected, and some of the important scientific themes which will be impacted are described, with focus on energy materials applications. Potential performance improvements of these techniques based on anticipated DLSR performance are estimated. Several examples of energy sciences research problems which are out of reach of current instrumentation, but which might be solved with the enhanced DLSR performance, are discussed.

  11. Verification of operating software for cooperative monitoring applications

    SciTech Connect

    Tolk, K.M.; Rembold, R.K.

    1997-08-01

    Monitoring agencies often use computer-based equipment to control instruments and to collect data at sites that are being monitored under international safeguards or other cooperative monitoring agreements. In order for these data to be used as an independent verification of data supplied by the host at the facility, the software used must be trusted by the monitoring agency. The monitoring party must be sure that the software has not been altered to give results that could lead to erroneous conclusions about nuclear materials inventories or other operating conditions at the site. The host might also want to verify that the software being used is the software that has been previously inspected, in order to be assured that only data allowed under the agreement are being collected. A method to provide this verification using keyed hash functions is described, along with how the proposed method overcomes possible vulnerabilities in methods currently in use, such as loading the software from trusted disks. The use of public key data authentication for this purpose is also discussed.
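    The keyed-hash verification this abstract describes can be sketched with Python's standard `hmac` module; the image bytes and key below are hypothetical placeholders, and HMAC-SHA256 stands in for whatever keyed hash a monitoring agency would actually deploy:

    ```python
    import hashlib
    import hmac

    def software_digest(data: bytes, key: bytes) -> bytes:
        """Compute a keyed hash (HMAC-SHA256) over a software image."""
        return hmac.new(key, data, hashlib.sha256).digest()

    def verify_software(data: bytes, key: bytes, expected: bytes) -> bool:
        """Return True only if the image matches the digest recorded at inspection time."""
        return hmac.compare_digest(software_digest(data, key), expected)

    # The monitoring agency records a digest when the software is inspected...
    inspected_image = b"monitoring-firmware-v1.0"   # hypothetical image contents
    secret_key = b"agency-held key material"        # hypothetical key
    reference = software_digest(inspected_image, secret_key)

    # ...and later re-verifies the deployed copy against that reference.
    assert verify_software(inspected_image, secret_key, reference)         # unmodified
    assert not verify_software(b"tampered image", secret_key, reference)   # altered
    ```

    Because the key is held by the monitoring party, the host cannot substitute altered software and forge a matching digest, which is the vulnerability of simply loading from "trusted" disks.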

  12. Selection of bioprocess simulation software for industrial applications.

    PubMed

    Shanklin, T; Roper, K; Yegneswaran, P K; Marten, M R

    2001-02-20

    Two commercially available, process-simulation software packages (Aspen Batch Plus v1.2, Aspen Technology, Inc., Cambridge, Massachusetts, and Intelligen SuperPro v3.0, Intelligen, Inc., Scotch Plains, New Jersey) are evaluated for use in modeling industrial, biotechnology processes. Software is quantitatively evaluated by Kepner-Tregoe Decision Analysis (Kepner and Tregoe, 1981). This evaluation shows that Aspen Batch Plus v1.2 (ABP) and Intelligen SuperPro v3.0 (ISP) can successfully perform specific simulation tasks but do not provide a complete model of all phenomena occurring within a biotechnology process. Software is best suited to provide a format for process management, using material and energy balances to answer scheduling questions, explore equipment change-outs, and calculate cost data. The ability of simulation software to accurately predict unit operation scale-up and optimize bioprocesses is limited. To realistically evaluate the software, a vaccine manufacturing process under development at Merck & Company is simulated. Case studies from the vaccine process are presented as examples of how ABP and ISP can be used to shed light on real-world processing issues.
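    Kepner-Tregoe Decision Analysis, as used in this evaluation, amounts at its core to weighted scoring of alternatives against weighted criteria. A minimal sketch follows; the criteria, weights, and scores are invented for illustration and are not the paper's actual figures:

    ```python
    # Kepner-Tregoe "want" scoring: criterion weight x satisfaction score,
    # summed per alternative; higher totals indicate a better fit.
    criteria = {            # criterion: weight (illustrative values)
        "scheduling":    9,
        "cost analysis": 7,
        "scale-up":      5,
    }
    alternatives = {        # per-package satisfaction scores (illustrative)
        "Package A": {"scheduling": 8, "cost analysis": 7, "scale-up": 3},
        "Package B": {"scheduling": 7, "cost analysis": 8, "scale-up": 4},
    }

    def kt_score(scores: dict) -> int:
        """Weighted total for one alternative."""
        return sum(criteria[c] * scores[c] for c in criteria)

    ranked = sorted(alternatives, key=lambda a: kt_score(alternatives[a]), reverse=True)
    for name in ranked:
        print(name, kt_score(alternatives[name]))
    ```

    The method's value is less the arithmetic than the discipline of making the weights explicit, which is what allows a quantitative comparison of packages like ABP and ISP.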

  13. Universally applicable three-dimensional hydrodynamic microfluidic flow focusing.

    PubMed

    Chiu, Yu-Jui; Cho, Sung Hwan; Mei, Zhe; Lien, Victor; Wu, Tsung-Feng; Lo, Yu-Hwa

    2013-05-07

    We have demonstrated a microfluidic device that can not only achieve three-dimensional flow focusing but also confine particles to the center stream along the channel. The device has a sample channel of smaller height and two sheath flow channels of greater height, merged into the downstream main channel where 3D focusing effects occur. We have demonstrated that both beads and cells in our device display significantly lower CVs in velocity and position distributions as well as reduced probability of coincidental events than they do in conventional 2D-confined microfluidic channels. The improved particle confinement in the microfluidic channel is highly desirable for microfluidic flow cytometers and in fluorescence-activated cell sorting (FACS). We have also reported a novel method to measure the velocity of each individual particle in the microfluidic channel. The method is compatible with the flow cytometer setup and requires no sophisticated visualization equipment. The principles and methods of device design and characterization can be applicable to many types of microfluidic systems.

  14. Application of a Genetic Algorithm to Optimize Quality Assurance in Software Development

    DTIC Science & Technology

    1993-09-01

    Naval Postgraduate School thesis, Monterey, California (AD-A273 193): Application of a Genetic Algorithm to Optimize Quality Assurance in Software Development.

  15. Application of software technology to a future spacecraft computer design

    NASA Technical Reports Server (NTRS)

    Labaugh, R. J.

    1980-01-01

    A study was conducted to determine how major improvements in spacecraft computer systems can be obtained from recent advances in hardware and software technology. Investigations into integrated circuit technology indicated that the CMOS/SOS chip set being developed for the Air Force Avionics Laboratory at Wright Patterson had the best potential for improving the performance of spaceborne computer systems. An integral part of the chip set is the bit slice arithmetic and logic unit. The flexibility allowed by microprogramming, combined with the software investigations, led to the specification of a baseline architecture and instruction set.

  16. Progress in Addressing DNFSB Recommendation 2002-1 Issues: Improving Accident Analysis Software Applications

    SciTech Connect

    VINCENT, ANDREW

    2005-04-25

    Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2002-1 ("Quality Assurance for Safety-Related Software") identified a number of quality assurance issues on the use of software in Department of Energy (DOE) facilities for analyzing hazards, and designing and operating controls to prevent or mitigate potential accidents. Over the last year, DOE has begun several processes and programs as part of the Implementation Plan commitments, and in particular, has made significant progress in addressing several sets of issues particularly important in the application of software for performing hazard and accident analysis. The work discussed here demonstrates that through these actions, Software Quality Assurance (SQA) guidance and software tools are available that can be used to improve resulting safety analysis. Specifically, five of the primary actions corresponding to the commitments made in the Implementation Plan to Recommendation 2002-1 are identified and discussed in this paper. Included are the web-based DOE SQA Knowledge Portal and the Central Registry, guidance and gap analysis reports, an electronic bulletin board and discussion forum, and a DOE safety software guide. These SQA products can benefit DOE safety contractors in the development of hazard and accident analysis by precluding inappropriate software applications and utilizing best practices when incorporating software results into safety basis documentation. The improvement actions discussed here mark a beginning toward establishing stronger, standard-compliant programs, practices, and processes in SQA among safety software users, managers, and reviewers throughout the DOE Complex. Additional effort is needed, however, particularly in: (1) processes to add new software applications to the DOE Safety Software Toolbox; (2) improving the effectiveness of software issue communication; and (3) promoting a safety software quality assurance culture.

  17. Problem-Solving Software

    NASA Technical Reports Server (NTRS)

    1992-01-01

    CBR Express software solves problems by adapting stored solutions to new problems specified by a user. It is applicable to a wide range of situations. The technology was originally developed by Inference Corporation for Johnson Space Center's Advanced Software Development Workstation. The project focused on the reuse of software designs, and Inference used CBR as part of the ACCESS prototype software. The commercial CBR Express is used as a "help desk" for customer support, enabling reuse of existing information when necessary. It has been adopted by several companies, among them American Airlines, which uses it to solve reservation system software problems.

  18. Current and Perspective Applications of Dense Plasma Focus Devices

    NASA Astrophysics Data System (ADS)

    Gribkov, V. A.

    2008-04-01

    Applications of Dense Plasma Focus (DPF) devices are described, both those intended to support the mainstream large-scale nuclear fusion programs (NFP), in fundamental problems of dense magnetized plasma physics as well as in engineering issues, and those elaborated for immediate use in a number of other fields. In the first direction, such problems as self-generated magnetic fields and the implosion stability of plasma shells having a high aspect ratio are important for Inertial Confinement Fusion (ICF) programs (e.g. NIF), whereas various problems of the current disruption phenomenon, plasma turbulence, and the mechanisms of generation of fast particles and neutrons in magnetized plasmas are of great interest for the large devices of Magnetic Plasma Confinement, MPC (e.g. ITER). Among the engineering problems of NFP, it is shown that radiation material science in particular has in DPF a very efficient tool for radiation tests of prospective materials and for improvement of their characteristics. In the field of current broad-band applications, results obtained in radiation material science, radiobiology, nuclear medicine, express Neutron Activation Analysis (including single-shot interrogation of hidden illegal objects), dynamic non-destructive quality control, X-ray microlithography and micromachining, and micro-radiography are presented. As examples of potential future applications, it is proposed to use DPF as a powerful high-flux neutron source generating very powerful neutron pulses of nanosecond (ns) duration for innovative experiments in nuclear physics, for radiation treatment of malignant tumors, and for neutron tests of materials of the first wall, blankets and NFP device constructions (with fluences up to 1 dpa per year), as well as ns pulses of fast electrons, neutrons and hard X-rays for brachytherapy.

  19. Extending the Role of the Corporate Library: Corporate Database Applications Using BRS/Search Software.

    ERIC Educational Resources Information Center

    Lammert, Diana

    1993-01-01

    Describes the McKenna Information Center's application of BRS/SEARCH information retrieval software as part of its services to Kennametal Inc., its parent company. Features and uses of the software are covered, including commands, custom searching, menu-driven interfaces, preparing reports, and designing databases. Nine examples of software…

  20. Free and open-source software application for the evaluation of coronary computed tomography angiography images.

    PubMed

    Hadlich, Marcelo Souza; Oliveira, Gláucia Maria Moraes; Feijóo, Raúl A; Azevedo, Clerio F; Tura, Bernardo Rangel; Ziemer, Paulo Gustavo Portela; Blanco, Pablo Javier; Pina, Gustavo; Meira, Márcio; Souza e Silva, Nelson Albuquerque de

    2012-10-01

    Medical images were standardized in 1993 with the DICOM (Digital Imaging and Communications in Medicine) standard. Many examinations use this standard, and it is increasingly necessary to design software applications capable of handling this type of image; however, such software applications are not usually free and open-source, which hinders their adjustment to the most diverse interests. Our goal was to develop and validate a free and open-source software application capable of handling DICOM coronary computed tomography angiography images. We developed and tested the ImageLab software in the evaluation of 100 examinations randomly selected from a database. Two observers carried out 600 evaluations using ImageLab and another software application sold with Philips Brilliance computed tomography scanners, assessing coronary lesions and plaques in the left main coronary artery (LMCA) and the anterior descending artery (ADA). To evaluate intraobserver, interobserver and intersoftware agreement, we used simple agreement and kappa statistics. The agreement observed between the software applications was generally classified as substantial or almost perfect in most comparisons. ImageLab agreed with the Philips software in the evaluation of coronary computed tomography angiography examinations, especially in patients without lesions, with lesions < 50% in the LMCA and < 70% in the ADA. The agreement for lesions > 70% in the ADA was lower, but this is also observed when the anatomical reference standard is used.
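    The kappa statistic used in this study to grade observer and software agreement is simple to compute: it corrects the raw fraction of matching ratings for the agreement expected by chance. A minimal sketch follows; the lesion gradings in the example are invented, not data from the study.

```python
# Hedged sketch: Cohen's kappa for inter-observer agreement, as commonly
# used in studies like the ImageLab comparison (ratings below are invented).

from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters over the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of items where the raters match.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement, from each rater's marginal frequencies.
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical lesion gradings ("none", "<50%", ">70%") from two observers:
obs1 = ["none", "none", "<50%", ">70%", "none", "<50%"]
obs2 = ["none", "none", "<50%", "<50%", "none", "<50%"]
print(round(cohens_kappa(obs1, obs2), 3))  # → 0.714
```

Values above about 0.6 are conventionally read as "substantial" agreement and above 0.8 as "almost perfect", the categories the abstract reports.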

  1. High Performance Computing Software Applications Institute for Space Situational Awareness (HSAI-SSA)

    NASA Astrophysics Data System (ADS)

    Sabol, C.; Schumacher, P.; Duncan, B.

    This poster paper first reports the status of the Institute project team's work over the past year, starting with brief HSAI background from the Department of Defense (DOD) High Performance Computing Modernization Program (DOD HPCMP). HSAI-SSA is one of only nine DOD institute projects selected by the Deputy Under Secretary of Defense (Science and Technology) to focus advanced computational science and high performance computing on accelerating solutions to the DOD's highest priority challenges and on making important advances in research, development, test, and evaluation. HSAI-SSA is the only DOD institute project focused on space. We next describe Space Situational Awareness (SSA), explain how its many challenges necessitate supercomputing, identify the types of disciplines required to solve many SSA problems, and outline the role of the Institute, which is led by the Air Force Research Laboratory's AFRL/RD Directorate with an onsite director located on Maui. A short discussion of the vision, mission, and overview of the Institute's strategic goals and core competencies follows. HSAI-SSA core competencies include Image Enhancement, Astrodynamics, Non-Resolvable Satellite Characterization, Data Integration, and High Performance Computing. Most of the poster then presents the technical status of several of our current software applications projects and the high performance computing metrics achieved to date. In closing, we summarize HSAI-SSA challenges, members and partners, and technology transition payoffs for selected applications and users.

  2. ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (SILICON GRAPHICS VERSION)

    NASA Technical Reports Server (NTRS)

    Walters, D.

    1994-01-01

    The Science and Technology Laboratory Applications Software (ELAS) was originally designed to analyze and process digital imagery data, specifically remotely-sensed scanner data. This capability includes the processing of Landsat multispectral data; aircraft-acquired scanner data; digitized topographic data; and numerous other ancillary data, such as soil types and rainfall information, that can be stored in digitized form. ELAS has the subsequent capability to geographically reference this data to dozens of standard, as well as user created projections. As an integrated image processing system, ELAS offers the user of remotely-sensed data a wide range of capabilities in the areas of land cover analysis and general purpose image analysis. ELAS is designed for flexible use and operation and includes its own FORTRAN operating subsystem and an expandable set of FORTRAN application modules. Because all of ELAS resides in one "logical" FORTRAN program, data inputs and outputs, directives, and module switching are convenient for the user. There are over 230 modules presently available to aid the user in performing a wide range of land cover analyses and manipulation. The file management modules enable the user to allocate, define, access, and specify usage for all types of files (ELAS files, subfiles, external files etc.). Various other modules convert specific types of satellite, aircraft, and vector-polygon data into files that can be used by other ELAS modules. The user also has many module options which aid in displaying image data, such as magnification/reduction of the display; true color display; and several memory functions. Additional modules allow for the building and manipulation of polygonal areas of the image data. Finally, there are modules which allow the user to select and classify the image data. An important feature of the ELAS subsystem is that its structure allows new applications modules to be easily integrated in the future. 
ELAS has as a standard
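    The architecture the abstract describes, over 230 application modules living inside one "logical" program with convenient module switching, is essentially a named-module dispatch pattern. The following is a small illustrative sketch of that pattern in Python, not ELAS itself (ELAS is FORTRAN, and the module names and image operations below are invented):

```python
# Hedged sketch of ELAS-style module dispatch: all modules are registered
# in one program and the user invokes them by name via directives.
# Module names and behavior here are invented for illustration.

registry = {}

def module(name):
    """Register a function as a named application module."""
    def wrap(fn):
        registry[name] = fn
        return fn
    return wrap

@module("MAGNIFY")
def magnify(image, factor=2):
    # Nearest-neighbor magnification of a 2-D list-of-lists image.
    return [[px for px in row for _ in range(factor)]
            for row in image for _ in range(factor)]

@module("STATS")
def stats(image):
    # Minimum and maximum pixel value over the whole image.
    flat = [px for row in image for px in row]
    return min(flat), max(flat)

def run(name, *args, **kwargs):
    """Dispatch a directive to the named module."""
    return registry[name](*args, **kwargs)

img = [[0, 1], [2, 3]]
print(run("STATS", run("MAGNIFY", img)))  # → (0, 3)
```

The point of the pattern, as in ELAS, is that new modules can be added to the registry without touching the dispatcher or the existing modules.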

  3. ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (CONCURRENT VERSION)

    NASA Technical Reports Server (NTRS)

    Pearson, R. W.

    1994-01-01

    The Science and Technology Laboratory Applications Software (ELAS) was originally designed to analyze and process digital imagery data, specifically remotely-sensed scanner data. This capability includes the processing of Landsat multispectral data; aircraft-acquired scanner data; digitized topographic data; and numerous other ancillary data, such as soil types and rainfall information, that can be stored in digitized form. ELAS has the subsequent capability to geographically reference this data to dozens of standard, as well as user created projections. As an integrated image processing system, ELAS offers the user of remotely-sensed data a wide range of capabilities in the areas of land cover analysis and general purpose image analysis. ELAS is designed for flexible use and operation and includes its own FORTRAN operating subsystem and an expandable set of FORTRAN application modules. Because all of ELAS resides in one "logical" FORTRAN program, data inputs and outputs, directives, and module switching are convenient for the user. There are over 230 modules presently available to aid the user in performing a wide range of land cover analyses and manipulation. The file management modules enable the user to allocate, define, access, and specify usage for all types of files (ELAS files, subfiles, external files etc.). Various other modules convert specific types of satellite, aircraft, and vector-polygon data into files that can be used by other ELAS modules. The user also has many module options which aid in displaying image data, such as magnification/reduction of the display; true color display; and several memory functions. Additional modules allow for the building and manipulation of polygonal areas of the image data. Finally, there are modules which allow the user to select and classify the image data. An important feature of the ELAS subsystem is that its structure allows new applications modules to be easily integrated in the future. 
ELAS has as a standard

  4. ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (SUN VERSION)

    NASA Technical Reports Server (NTRS)

    Walters, D.

    1994-01-01

    The Science and Technology Laboratory Applications Software (ELAS) was originally designed to analyze and process digital imagery data, specifically remotely-sensed scanner data. This capability includes the processing of Landsat multispectral data; aircraft-acquired scanner data; digitized topographic data; and numerous other ancillary data, such as soil types and rainfall information, that can be stored in digitized form. ELAS has the subsequent capability to geographically reference this data to dozens of standard, as well as user created projections. As an integrated image processing system, ELAS offers the user of remotely-sensed data a wide range of capabilities in the areas of land cover analysis and general purpose image analysis. ELAS is designed for flexible use and operation and includes its own FORTRAN operating subsystem and an expandable set of FORTRAN application modules. Because all of ELAS resides in one "logical" FORTRAN program, data inputs and outputs, directives, and module switching are convenient for the user. There are over 230 modules presently available to aid the user in performing a wide range of land cover analyses and manipulation. The file management modules enable the user to allocate, define, access, and specify usage for all types of files (ELAS files, subfiles, external files etc.). Various other modules convert specific types of satellite, aircraft, and vector-polygon data into files that can be used by other ELAS modules. The user also has many module options which aid in displaying image data, such as magnification/reduction of the display; true color display; and several memory functions. Additional modules allow for the building and manipulation of polygonal areas of the image data. Finally, there are modules which allow the user to select and classify the image data. An important feature of the ELAS subsystem is that its structure allows new applications modules to be easily integrated in the future. 
ELAS has as a standard

  5. ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (DEC VAX VERSION)

    NASA Technical Reports Server (NTRS)

    Junkin, B. G.

    1994-01-01

    The Science and Technology Laboratory Applications Software (ELAS) was originally designed to analyze and process digital imagery data, specifically remotely-sensed scanner data. This capability includes the processing of Landsat multispectral data; aircraft-acquired scanner data; digitized topographic data; and numerous other ancillary data, such as soil types and rainfall information, that can be stored in digitized form. ELAS has the subsequent capability to geographically reference this data to dozens of standard, as well as user created projections. As an integrated image processing system, ELAS offers the user of remotely-sensed data a wide range of capabilities in the areas of land cover analysis and general purpose image analysis. ELAS is designed for flexible use and operation and includes its own FORTRAN operating subsystem and an expandable set of FORTRAN application modules. Because all of ELAS resides in one "logical" FORTRAN program, data inputs and outputs, directives, and module switching are convenient for the user. There are over 230 modules presently available to aid the user in performing a wide range of land cover analyses and manipulation. The file management modules enable the user to allocate, define, access, and specify usage for all types of files (ELAS files, subfiles, external files etc.). Various other modules convert specific types of satellite, aircraft, and vector-polygon data into files that can be used by other ELAS modules. The user also has many module options which aid in displaying image data, such as magnification/reduction of the display; true color display; and several memory functions. Additional modules allow for the building and manipulation of polygonal areas of the image data. Finally, there are modules which allow the user to select and classify the image data. An important feature of the ELAS subsystem is that its structure allows new applications modules to be easily integrated in the future. 
ELAS has as a standard

  6. ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (MASSCOMP VERSION)

    NASA Technical Reports Server (NTRS)

    Walters, D.

    1994-01-01

    The Science and Technology Laboratory Applications Software (ELAS) was originally designed to analyze and process digital imagery data, specifically remotely-sensed scanner data. This capability includes the processing of Landsat multispectral data; aircraft-acquired scanner data; digitized topographic data; and numerous other ancillary data, such as soil types and rainfall information, that can be stored in digitized form. ELAS has the subsequent capability to geographically reference this data to dozens of standard, as well as user created projections. As an integrated image processing system, ELAS offers the user of remotely-sensed data a wide range of capabilities in the areas of land cover analysis and general purpose image analysis. ELAS is designed for flexible use and operation and includes its own FORTRAN operating subsystem and an expandable set of FORTRAN application modules. Because all of ELAS resides in one "logical" FORTRAN program, data inputs and outputs, directives, and module switching are convenient for the user. There are over 230 modules presently available to aid the user in performing a wide range of land cover analyses and manipulation. The file management modules enable the user to allocate, define, access, and specify usage for all types of files (ELAS files, subfiles, external files etc.). Various other modules convert specific types of satellite, aircraft, and vector-polygon data into files that can be used by other ELAS modules. The user also has many module options which aid in displaying image data, such as magnification/reduction of the display; true color display; and several memory functions. Additional modules allow for the building and manipulation of polygonal areas of the image data. Finally, there are modules which allow the user to select and classify the image data. An important feature of the ELAS subsystem is that its structure allows new applications modules to be easily integrated in the future. 
ELAS has as a standard

  7. ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (MASSCOMP VERSION)

    NASA Technical Reports Server (NTRS)

    Walters, D.

    1994-01-01

    The Science and Technology Laboratory Applications Software (ELAS) was originally designed to analyze and process digital imagery data, specifically remotely-sensed scanner data. This capability includes the processing of Landsat multispectral data; aircraft-acquired scanner data; digitized topographic data; and numerous other ancillary data, such as soil types and rainfall information, that can be stored in digitized form. ELAS has the subsequent capability to geographically reference this data to dozens of standard, as well as user created projections. As an integrated image processing system, ELAS offers the user of remotely-sensed data a wide range of capabilities in the areas of land cover analysis and general purpose image analysis. ELAS is designed for flexible use and operation and includes its own FORTRAN operating subsystem and an expandable set of FORTRAN application modules. Because all of ELAS resides in one "logical" FORTRAN program, data inputs and outputs, directives, and module switching are convenient for the user. There are over 230 modules presently available to aid the user in performing a wide range of land cover analyses and manipulation. The file management modules enable the user to allocate, define, access, and specify usage for all types of files (ELAS files, subfiles, external files etc.). Various other modules convert specific types of satellite, aircraft, and vector-polygon data into files that can be used by other ELAS modules. The user also has many module options which aid in displaying image data, such as magnification/reduction of the display; true color display; and several memory functions. Additional modules allow for the building and manipulation of polygonal areas of the image data. Finally, there are modules which allow the user to select and classify the image data. An important feature of the ELAS subsystem is that its structure allows new applications modules to be easily integrated in the future. 
ELAS has as a standard

  8. ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (DEC VAX VERSION)

    NASA Technical Reports Server (NTRS)

    Junkin, B. G.

    1994-01-01

    The Science and Technology Laboratory Applications Software (ELAS) was originally designed to analyze and process digital imagery data, specifically remotely-sensed scanner data. This capability includes the processing of Landsat multispectral data; aircraft-acquired scanner data; digitized topographic data; and numerous other ancillary data, such as soil types and rainfall information, that can be stored in digitized form. ELAS has the subsequent capability to geographically reference this data to dozens of standard, as well as user created projections. As an integrated image processing system, ELAS offers the user of remotely-sensed data a wide range of capabilities in the areas of land cover analysis and general purpose image analysis. ELAS is designed for flexible use and operation and includes its own FORTRAN operating subsystem and an expandable set of FORTRAN application modules. Because all of ELAS resides in one "logical" FORTRAN program, data inputs and outputs, directives, and module switching are convenient for the user. There are over 230 modules presently available to aid the user in performing a wide range of land cover analyses and manipulation. The file management modules enable the user to allocate, define, access, and specify usage for all types of files (ELAS files, subfiles, external files etc.). Various other modules convert specific types of satellite, aircraft, and vector-polygon data into files that can be used by other ELAS modules. The user also has many module options which aid in displaying image data, such as magnification/reduction of the display; true color display; and several memory functions. Additional modules allow for the building and manipulation of polygonal areas of the image data. Finally, there are modules which allow the user to select and classify the image data. An important feature of the ELAS subsystem is that its structure allows new applications modules to be easily integrated in the future. 
ELAS has as a standard

  9. ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (SILICON GRAPHICS VERSION)

    NASA Technical Reports Server (NTRS)

    Walters, D.

    1994-01-01

    The Science and Technology Laboratory Applications Software (ELAS) was originally designed to analyze and process digital imagery data, specifically remotely-sensed scanner data. This capability includes the processing of Landsat multispectral data; aircraft-acquired scanner data; digitized topographic data; and numerous other ancillary data, such as soil types and rainfall information, that can be stored in digitized form. ELAS has the subsequent capability to geographically reference this data to dozens of standard, as well as user created projections. As an integrated image processing system, ELAS offers the user of remotely-sensed data a wide range of capabilities in the areas of land cover analysis and general purpose image analysis. ELAS is designed for flexible use and operation and includes its own FORTRAN operating subsystem and an expandable set of FORTRAN application modules. Because all of ELAS resides in one "logical" FORTRAN program, data inputs and outputs, directives, and module switching are convenient for the user. There are over 230 modules presently available to aid the user in performing a wide range of land cover analyses and manipulation. The file management modules enable the user to allocate, define, access, and specify usage for all types of files (ELAS files, subfiles, external files etc.). Various other modules convert specific types of satellite, aircraft, and vector-polygon data into files that can be used by other ELAS modules. The user also has many module options which aid in displaying image data, such as magnification/reduction of the display; true color display; and several memory functions. Additional modules allow for the building and manipulation of polygonal areas of the image data. Finally, there are modules which allow the user to select and classify the image data. An important feature of the ELAS subsystem is that its structure allows new applications modules to be easily integrated in the future. 
ELAS has as a standard

  10. ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (CONCURRENT VERSION)

    NASA Technical Reports Server (NTRS)

    Pearson, R. W.

    1994-01-01

    The Science and Technology Laboratory Applications Software (ELAS) was originally designed to analyze and process digital imagery data, specifically remotely-sensed scanner data. This capability includes the processing of Landsat multispectral data; aircraft-acquired scanner data; digitized topographic data; and numerous other ancillary data, such as soil types and rainfall information, that can be stored in digitized form. ELAS has the subsequent capability to geographically reference this data to dozens of standard, as well as user created projections. As an integrated image processing system, ELAS offers the user of remotely-sensed data a wide range of capabilities in the areas of land cover analysis and general purpose image analysis. ELAS is designed for flexible use and operation and includes its own FORTRAN operating subsystem and an expandable set of FORTRAN application modules. Because all of ELAS resides in one "logical" FORTRAN program, data inputs and outputs, directives, and module switching are convenient for the user. There are over 230 modules presently available to aid the user in performing a wide range of land cover analyses and manipulation. The file management modules enable the user to allocate, define, access, and specify usage for all types of files (ELAS files, subfiles, external files etc.). Various other modules convert specific types of satellite, aircraft, and vector-polygon data into files that can be used by other ELAS modules. The user also has many module options which aid in displaying image data, such as magnification/reduction of the display; true color display; and several memory functions. Additional modules allow for the building and manipulation of polygonal areas of the image data. Finally, there are modules which allow the user to select and classify the image data. An important feature of the ELAS subsystem is that its structure allows new applications modules to be easily integrated in the future. 
ELAS has as a standard

  11. ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (SUN VERSION)

    NASA Technical Reports Server (NTRS)

    Walters, D.

    1994-01-01

    The Science and Technology Laboratory Applications Software (ELAS) was originally designed to analyze and process digital imagery data, specifically remotely-sensed scanner data. This capability includes the processing of Landsat multispectral data; aircraft-acquired scanner data; digitized topographic data; and numerous other ancillary data, such as soil types and rainfall information, that can be stored in digitized form. ELAS has the subsequent capability to geographically reference this data to dozens of standard, as well as user created projections. As an integrated image processing system, ELAS offers the user of remotely-sensed data a wide range of capabilities in the areas of land cover analysis and general purpose image analysis. ELAS is designed for flexible use and operation and includes its own FORTRAN operating subsystem and an expandable set of FORTRAN application modules. Because all of ELAS resides in one "logical" FORTRAN program, data inputs and outputs, directives, and module switching are convenient for the user. There are over 230 modules presently available to aid the user in performing a wide range of land cover analyses and manipulation. The file management modules enable the user to allocate, define, access, and specify usage for all types of files (ELAS files, subfiles, external files etc.). Various other modules convert specific types of satellite, aircraft, and vector-polygon data into files that can be used by other ELAS modules. The user also has many module options which aid in displaying image data, such as magnification/reduction of the display; true color display; and several memory functions. Additional modules allow for the building and manipulation of polygonal areas of the image data. Finally, there are modules which allow the user to select and classify the image data. An important feature of the ELAS subsystem is that its structure allows new applications modules to be easily integrated in the future. 
ELAS has as a standard

  12. ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (CONCURRENT VERSION)

    NASA Technical Reports Server (NTRS)

    Pearson, R. W.

    1994-01-01

    The Science and Technology Laboratory Applications Software (ELAS) was originally designed to analyze and process digital imagery data, specifically remotely-sensed scanner data. This capability includes the processing of Landsat multispectral data; aircraft-acquired scanner data; digitized topographic data; and numerous other ancillary data, such as soil types and rainfall information, that can be stored in digitized form. ELAS has the subsequent capability to geographically reference this data to dozens of standard, as well as user created projections. As an integrated image processing system, ELAS offers the user of remotely-sensed data a wide range of capabilities in the areas of land cover analysis and general purpose image analysis. ELAS is designed for flexible use and operation and includes its own FORTRAN operating subsystem and an expandable set of FORTRAN application modules. Because all of ELAS resides in one "logical" FORTRAN program, data inputs and outputs, directives, and module switching are convenient for the user. There are over 230 modules presently available to aid the user in performing a wide range of land cover analyses and manipulation. The file management modules enable the user to allocate, define, access, and specify usage for all types of files (ELAS files, subfiles, external files etc.). Various other modules convert specific types of satellite, aircraft, and vector-polygon data into files that can be used by other ELAS modules. The user also has many module options which aid in displaying image data, such as magnification/reduction of the display; true color display; and several memory functions. Additional modules allow for the building and manipulation of polygonal areas of the image data. Finally, there are modules which allow the user to select and classify the image data. An important feature of the ELAS subsystem is that its structure allows new applications modules to be easily integrated in the future. 
ELAS has as a standard

  13. ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (DEC VAX VERSION)

    NASA Technical Reports Server (NTRS)

    Junkin, B. G.

    1994-01-01

    The Science and Technology Laboratory Applications Software (ELAS) was originally designed to analyze and process digital imagery data, specifically remotely-sensed scanner data. This capability includes the processing of Landsat multispectral data; aircraft-acquired scanner data; digitized topographic data; and numerous other ancillary data, such as soil types and rainfall information, that can be stored in digitized form. ELAS has the subsequent capability to geographically reference this data to dozens of standard, as well as user created projections. As an integrated image processing system, ELAS offers the user of remotely-sensed data a wide range of capabilities in the areas of land cover analysis and general purpose image analysis. ELAS is designed for flexible use and operation and includes its own FORTRAN operating subsystem and an expandable set of FORTRAN application modules. Because all of ELAS resides in one "logical" FORTRAN program, data inputs and outputs, directives, and module switching are convenient for the user. There are over 230 modules presently available to aid the user in performing a wide range of land cover analyses and manipulation. The file management modules enable the user to allocate, define, access, and specify usage for all types of files (ELAS files, subfiles, external files etc.). Various other modules convert specific types of satellite, aircraft, and vector-polygon data into files that can be used by other ELAS modules. The user also has many module options which aid in displaying image data, such as magnification/reduction of the display; true color display; and several memory functions. Additional modules allow for the building and manipulation of polygonal areas of the image data. Finally, there are modules which allow the user to select and classify the image data. An important feature of the ELAS subsystem is that its structure allows new applications modules to be easily integrated in the future. 
ELAS has as a standard

  14. ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (MASSCOMP VERSION)

    NASA Technical Reports Server (NTRS)

    Walters, D.

    1994-01-01

The Science and Technology Laboratory Applications Software (ELAS) was originally designed to analyze and process digital imagery data, specifically remotely sensed scanner data. This capability includes the processing of Landsat multispectral data; aircraft-acquired scanner data; digitized topographic data; and numerous other ancillary data, such as soil types and rainfall information, that can be stored in digitized form. ELAS can then geographically reference these data to dozens of standard, as well as user-created, projections. As an integrated image processing system, ELAS offers the user of remotely sensed data a wide range of capabilities in land cover analysis and general-purpose image analysis. ELAS is designed for flexible use and operation and includes its own FORTRAN operating subsystem and an expandable set of FORTRAN application modules. Because all of ELAS resides in one "logical" FORTRAN program, data inputs and outputs, directives, and module switching are convenient for the user. More than 230 modules are presently available to aid the user in performing a wide range of land cover analyses and manipulations. The file management modules enable the user to allocate, define, access, and specify usage for all types of files (ELAS files, subfiles, external files, etc.). Various other modules convert specific types of satellite, aircraft, and vector-polygon data into files that can be used by other ELAS modules. The user also has many module options that aid in displaying image data, such as magnification/reduction of the display, true-color display, and several memory functions. Additional modules allow for the building and manipulation of polygonal areas of the image data. Finally, there are modules that allow the user to select and classify the image data. An important feature of the ELAS subsystem is that its structure allows new application modules to be easily integrated in the future.
ELAS has as a standard

  15. ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (SUN VERSION)

    NASA Technical Reports Server (NTRS)

    Walters, D.

    1994-01-01

The Science and Technology Laboratory Applications Software (ELAS) was originally designed to analyze and process digital imagery data, specifically remotely sensed scanner data. This capability includes the processing of Landsat multispectral data; aircraft-acquired scanner data; digitized topographic data; and numerous other ancillary data, such as soil types and rainfall information, that can be stored in digitized form. ELAS can then geographically reference these data to dozens of standard, as well as user-created, projections. As an integrated image processing system, ELAS offers the user of remotely sensed data a wide range of capabilities in land cover analysis and general-purpose image analysis. ELAS is designed for flexible use and operation and includes its own FORTRAN operating subsystem and an expandable set of FORTRAN application modules. Because all of ELAS resides in one "logical" FORTRAN program, data inputs and outputs, directives, and module switching are convenient for the user. More than 230 modules are presently available to aid the user in performing a wide range of land cover analyses and manipulations. The file management modules enable the user to allocate, define, access, and specify usage for all types of files (ELAS files, subfiles, external files, etc.). Various other modules convert specific types of satellite, aircraft, and vector-polygon data into files that can be used by other ELAS modules. The user also has many module options that aid in displaying image data, such as magnification/reduction of the display, true-color display, and several memory functions. Additional modules allow for the building and manipulation of polygonal areas of the image data. Finally, there are modules that allow the user to select and classify the image data. An important feature of the ELAS subsystem is that its structure allows new application modules to be easily integrated in the future.
ELAS has as a standard

  16. ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (SILICON GRAPHICS VERSION)

    NASA Technical Reports Server (NTRS)

    Walters, D.

    1994-01-01

The Science and Technology Laboratory Applications Software (ELAS) was originally designed to analyze and process digital imagery data, specifically remotely sensed scanner data. This capability includes the processing of Landsat multispectral data; aircraft-acquired scanner data; digitized topographic data; and numerous other ancillary data, such as soil types and rainfall information, that can be stored in digitized form. ELAS can then geographically reference these data to dozens of standard, as well as user-created, projections. As an integrated image processing system, ELAS offers the user of remotely sensed data a wide range of capabilities in land cover analysis and general-purpose image analysis. ELAS is designed for flexible use and operation and includes its own FORTRAN operating subsystem and an expandable set of FORTRAN application modules. Because all of ELAS resides in one "logical" FORTRAN program, data inputs and outputs, directives, and module switching are convenient for the user. More than 230 modules are presently available to aid the user in performing a wide range of land cover analyses and manipulations. The file management modules enable the user to allocate, define, access, and specify usage for all types of files (ELAS files, subfiles, external files, etc.). Various other modules convert specific types of satellite, aircraft, and vector-polygon data into files that can be used by other ELAS modules. The user also has many module options that aid in displaying image data, such as magnification/reduction of the display, true-color display, and several memory functions. Additional modules allow for the building and manipulation of polygonal areas of the image data. Finally, there are modules that allow the user to select and classify the image data. An important feature of the ELAS subsystem is that its structure allows new application modules to be easily integrated in the future.
ELAS has as a standard

  17. [Genetic algorithm application to multi-focus patterns of 256-element phased array for focused ultrasound surgery].

    PubMed

    Xu, Feng; Wan, Mingxi; Lu, Mingzhu

    2008-10-01

The genetic optimization algorithm and the sound-field calculation approach for a spherical-section phased array are presented in this paper. The in-house manufactured 256-element phased-array focused ultrasound surgery system is briefly described. The on-axis single focus and off-axis single focus are simulated, along with the axis-symmetric six-focus pattern and the axis-asymmetric four-focus pattern, using a 256-element phased array together with the genetic optimization algorithm and the sound-field calculation approach. The experimental results of the described 256-element phased-array focused ultrasound surgery system acting on organic glass and phantom are also analyzed. The results of the simulations and experiments confirm the applicability of the genetic algorithm and field calculation approaches for accurately steering three-dimensional foci.
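The abstract above does not include the authors' implementation. As a rough illustration of the idea only, the sketch below uses a toy genetic algorithm to choose per-element phases for a small linear array so that the radiated field is strong at two hypothetical focal points; the geometry, element count, and GA parameters are all invented for the example, and a real FUS array is a 3D spherical section.

```python
import cmath
import math
import random

random.seed(42)

# Hypothetical geometry: 16 elements on a line (half-wavelength pitch)
# and two desired focal points, in units of the wavelength.
N_ELEM = 16
K = 2 * math.pi  # wavenumber for unit wavelength
ELEMENTS = [(0.5 * i, 0.0) for i in range(N_ELEM)]
FOCI = [(3.0, 4.0), (5.0, 4.0)]

def field_at(point, phases):
    """|sum of unit-amplitude spherical waves| from all elements at `point`."""
    px, py = point
    total = 0j
    for (ex, ey), ph in zip(ELEMENTS, phases):
        r = math.hypot(px - ex, py - ey)
        total += cmath.exp(1j * (K * r + ph)) / r
    return abs(total)

def fitness(phases):
    # Drive up the weakest focus so both foci grow together.
    return min(field_at(f, phases) for f in FOCI)

def evolve(pop_size=40, generations=60, mut_rate=0.2):
    pop = [[random.uniform(0.0, 2 * math.pi) for _ in range(N_ELEM)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]           # elitist selection
        parents = list(survivors)
        while len(survivors) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_ELEM)      # one-point crossover
            child = a[:cut] + b[cut:]
            for i in range(N_ELEM):                # random-reset mutation
                if random.random() < mut_rate:
                    child[i] = random.uniform(0.0, 2 * math.pi)
            survivors.append(child)
        pop = survivors
    return max(pop, key=fitness)

best = evolve()
```

Because the selection step is elitist, the best phase set never gets worse from one generation to the next; a multi-focus pattern emerges as the population trades off field strength between the two target points.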

  18. Software Agents Applications Using Real-Time CORBA

    NASA Astrophysics Data System (ADS)

    Fowell, S.; Ward, R.; Nielsen, M.

This paper describes current projects being performed by SciSys in the area of the use of software agents, built using CORBA middleware, to improve operations within autonomous satellite/ground systems. These concepts have been developed and demonstrated in a series of experiments variously funded by ESA's Technology Flight Opportunity Initiative (TFO) and Leading Edge Technology for SMEs (LET-SME), and the British National Space Centre's (BNSC) National Technology Programme. Some of this earlier work has already been reported in [1]. This paper will address the trends, issues and solutions associated with this software agent architecture concept, together with its implementation using CORBA within an on-board environment, that is to say, taking account of its real-time and resource-constrained nature.

  19. Application of Lightweight Formal Methods to Software Security

    NASA Technical Reports Server (NTRS)

    Gilliam, David P.; Powell, John D.; Bishop, Matt

    2005-01-01

Formal specification and verification of security has proven a challenging task. There is no single method that has proven feasible. Instead, an integrated approach which combines several formal techniques can increase the confidence in the verification of software security properties. Such an approach, which specifies security properties in a library that can be reused by two instruments, and the methodologies developed for the National Aeronautics and Space Administration (NASA) at the Jet Propulsion Laboratory (JPL), are described herein. The Flexible Modeling Framework (FMF) is a model-based verification instrument that uses Promela and the SPIN model checker. The Property Based Tester (PBT) uses TASPEC and a Text Execution Monitor (TEM). They are used to reduce vulnerabilities and unwanted exposures in software during the development and maintenance life cycles.

  1. Alternatives for Developing User Documentation for Applications Software

    DTIC Science & Technology

    1991-09-01

computers do not operate in offices alone. In some cases, programs and data in offices are shared among machines and users via the exchanging of disks...supported, and understood of the types of software documentation noted. Automated systems such as CASE tools, rapidly becoming available for producing...more important. In many cases, the qualities of effective online documentation must be abstracted from the qualities of effective paper documentation

  2. Surfing the Edge of Chaos: Applications to Software Engineering

    DTIC Science & Technology

    2000-01-01

forecast the winner. 5. The edge of chaos In the fiction novel "The Lost World", Michael Crichton explains why some species evolve while others...Cusumano, Michael How Microsoft Makes Large Teams Work Like Small Teams. Sloan Management Review. Fall, 1997. (Dooley, 1994) Dooley, K. and Flor, R...Paper submitted to SEKE 2000. (Porter, 1980) Porter, Michael. Competitive Strategy. Free Press, 1980. (Putnam, 1980) Putnam, L. Software Cost Estimating

  3. Applications of multigrid software in the atmospheric sciences

    NASA Technical Reports Server (NTRS)

    Adams, J.; Garcia, R.; Gross, B.; Hack, J.; Haidvogel, D.; Pizzo, V.

    1992-01-01

    Elliptic partial differential equations from different areas in the atmospheric sciences are efficiently and easily solved utilizing the multigrid software package named MUDPACK. It is demonstrated that the multigrid method is more efficient than other commonly employed techniques, such as Gaussian elimination and fixed-grid relaxation. The efficiency relative to other techniques, both in terms of storage requirement and computational time, increases quickly with grid size.
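MUDPACK itself is a Fortran package, so as a textbook-style sketch of why multigrid beats fixed-grid relaxation, here is a minimal V-cycle solver for the 1D Poisson problem -u'' = f on (0, 1) with zero boundary values, using weighted-Jacobi smoothing, full-weighting restriction, and linear interpolation (this illustrates the method only, not MUDPACK's algorithmic detail):

```python
# V-cycle multigrid for -u'' = f on (0, 1), u(0) = u(1) = 0,
# with n = 2^k - 1 interior points of spacing h.

def residual(u, f, h):
    n = len(u)
    r = [0.0] * n
    for i in range(n):
        left = u[i - 1] if i > 0 else 0.0
        right = u[i + 1] if i < n - 1 else 0.0
        r[i] = f[i] - (2.0 * u[i] - left - right) / (h * h)
    return r

def jacobi(u, f, h, sweeps, w=2.0 / 3.0):
    """Weighted-Jacobi relaxation (the smoother)."""
    n = len(u)
    for _ in range(sweeps):
        new = [0.0] * n
        for i in range(n):
            left = u[i - 1] if i > 0 else 0.0
            right = u[i + 1] if i < n - 1 else 0.0
            new[i] = (1.0 - w) * u[i] + w * 0.5 * (left + right + h * h * f[i])
        u = new
    return u

def restrict(r):
    """Full-weighting restriction: n fine points -> (n - 1) // 2 coarse points."""
    return [0.25 * r[2 * j] + 0.5 * r[2 * j + 1] + 0.25 * r[2 * j + 2]
            for j in range((len(r) - 1) // 2)]

def prolong(e):
    """Linear interpolation: m coarse points -> 2m + 1 fine points."""
    m = len(e)
    fine = [0.0] * (2 * m + 1)
    for j in range(m):
        fine[2 * j + 1] = e[j]
    for j in range(0, 2 * m + 1, 2):
        left = fine[j - 1] if j > 0 else 0.0
        right = fine[j + 1] if j < 2 * m else 0.0
        fine[j] = 0.5 * (left + right)
    return fine

def vcycle(u, f, h):
    if len(u) <= 3:
        return jacobi(u, f, h, 50)          # coarsest level: relax to convergence
    u = jacobi(u, f, h, 3)                  # pre-smooth
    coarse_r = restrict(residual(u, f, h))
    coarse_e = vcycle([0.0] * len(coarse_r), coarse_r, 2.0 * h)
    u = [ui + ei for ui, ei in zip(u, prolong(coarse_e))]
    return jacobi(u, f, h, 3)               # post-smooth
```

For f = 1 the exact solution u(x) = x(1 - x)/2 is quadratic, so the discrete solution matches it exactly at the grid points; a handful of V-cycles drives the error down by orders of magnitude, whereas fixed-grid relaxation alone would need thousands of sweeps on a fine grid.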

  4. Generating Safety-Critical PLC Code From a High-Level Application Software Specification

    NASA Technical Reports Server (NTRS)

    2008-01-01

    The benefits of automatic-application code generation are widely accepted within the software engineering community. These benefits include raised abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at Kennedy Space Center recognized the need for PLC code generation while developing the new ground checkout and launch processing system, called the Launch Control System (LCS). Engineers developed a process and a prototype software tool that automatically translates a high-level representation or specification of application software into ladder logic that executes on a PLC. All the computer hardware in the LCS is planned to be commercial off the shelf (COTS), including industrial controllers or PLCs that are connected to the sensors and end items out in the field. Most of the software in LCS is also planned to be COTS, with only small adapter software modules that must be developed in order to interface between the various COTS software products. A domain-specific language (DSL) is a programming language designed to perform tasks and to solve problems in a particular domain, such as ground processing of launch vehicles. The LCS engineers created a DSL for developing test sequences of ground checkout and launch operations of future launch vehicle and spacecraft elements, and they are developing a tabular specification format that uses the DSL keywords and functions familiar to the ground and flight system users. The tabular specification format, or tabular spec, allows most ground and flight system users to document how the application software is intended to function and requires little or no software programming knowledge or experience. A small sample from a prototype tabular spec application is
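The LCS tabular spec format and DSL are not reproduced in the summary above, so the sketch below invents a minimal stand-in: spec rows of (step, keyword, target, value) translated into IEC 61131-3-style structured text (shown instead of graphical ladder logic purely because text is printable). Every keyword, signal name, and template here is hypothetical.

```python
# Hypothetical tabular spec: each row is (step, keyword, target, value).
SPEC = [
    ("1", "VERIFY", "VENT_VALVE", "CLOSED"),
    ("2", "COMMAND", "FILL_VALVE", "OPEN"),
    ("3", "WAIT_UNTIL", "TANK_PRESSURE", "< 35.0"),
]

# One structured-text template per DSL keyword (all invented for the sketch).
TEMPLATES = {
    "VERIFY": "IF {t} <> {v} THEN abort_step := TRUE; END_IF;",
    "COMMAND": "{t} := {v};",
    "WAIT_UNTIL": "step_done := ({t} {v});",
}

def generate(spec):
    """Translate spec rows into structured-text lines, one commented step each."""
    lines = []
    for step, keyword, target, value in spec:
        template = TEMPLATES.get(keyword)
        if template is None:
            raise ValueError(f"unknown keyword {keyword!r} in step {step}")
        lines.append(f"(* step {step}: {keyword} {target} *)")
        lines.append(template.format(t=target, v=value))
    return "\n".join(lines)

print(generate(SPEC))
```

The point of the pattern is that ground and flight system users edit only the table; the keyword-to-template mapping is defined once and rejects rows it cannot translate instead of emitting wrong controller code.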

  5. Bringing Legacy Visualization Software to Modern Computing Devices via Application Streaming

    NASA Astrophysics Data System (ADS)

    Fisher, Ward

    2014-05-01

    Planning software compatibility across forthcoming generations of computing platforms is a problem commonly encountered in software engineering and development. While this problem can affect any class of software, data analysis and visualization programs are particularly vulnerable. This is due in part to their inherent dependency on specialized hardware and computing environments. A number of strategies and tools have been designed to aid software engineers with this task. While generally embraced by developers at 'traditional' software companies, these methodologies are often dismissed by the scientific software community as unwieldy, inefficient and unnecessary. As a result, many important and storied scientific software packages can struggle to adapt to a new computing environment; for example, one in which much work is carried out on sub-laptop devices (such as tablets and smartphones). Rewriting these packages for a new platform often requires significant investment in terms of development time and developer expertise. In many cases, porting older software to modern devices is neither practical nor possible. As a result, replacement software must be developed from scratch, wasting resources better spent on other projects. Enabled largely by the rapid rise and adoption of cloud computing platforms, 'Application Streaming' technologies allow legacy visualization and analysis software to be operated wholly from a client device (be it laptop, tablet or smartphone) while retaining full functionality and interactivity. It mitigates much of the developer effort required by other more traditional methods while simultaneously reducing the time it takes to bring the software to a new platform. This work will provide an overview of Application Streaming and how it compares against other technologies which allow scientific visualization software to be executed from a remote computer. We will discuss the functionality and limitations of existing application streaming

  6. Editorial: Focus on Atom Optics and its Applications

    NASA Astrophysics Data System (ADS)

    Schmidt-Kaler, F.; Pfau, T.; Schmelcher, P.; Schleich, W.

    2010-06-01

    Atom optics employs the modern techniques of quantum optics and laser cooling to enable applications which often outperform current standard technologies. Atomic matter wave interferometers allow for ultra-precise sensors; metrology and clocks are pushed to an extraordinary accuracy of 17 digits using single atoms. Miniaturization and integration are driven forward for both atomic clocks and atom optical circuits. With the miniaturization of information-storage and -processing devices, the scale of single atoms is approached in solid state devices, where the laws of quantum physics lead to novel, advantageous features and functionalities. An upcoming branch of atom optics is the control of single atoms, potentially allowing solid state devices to be built atom by atom; some of which would be applicable in future quantum information processing devices. Selective manipulation of individual atoms also enables trace analysis of extremely rare isotopes. Additionally, sources of neutral atoms with high brightness are being developed and, if combined with photo ionization, even novel focused ion beam sources are within reach. Ultracold chemistry is fertilized by atomic techniques, when reactions of chemical constituents are investigated between ions, atoms, molecules, trapped or aligned in designed fields and cooled to ultra-low temperatures such that the reaction kinetics can be studied in a completely state-resolved manner. 
Focus on Atom Optics and its Applications: Contents
Sensitive gravity-gradiometry with atom interferometry: progress towards an improved determination of the gravitational constant (F Sorrentino, Y-H Lien, G Rosi, L Cacciapuoti, M Prevedelli and G M Tino)
A single-atom detector integrated on an atom chip: fabrication, characterization and application (D Heine, W Rohringer, D Fischer, M Wilzbach, T Raub, S Loziczky, XiYuan Liu, S Groth, B Hessmo and J Schmiedmayer)
Interaction of a propagating guided matter wave with a localized potential (G L Gattobigio, A

  7. Directory of Industry and University Collaborations with a Focus on Software Engineering Education and Training, Version 7

    DTIC Science & Technology

    1999-02-01

engineering curricula, coalitions in other countries, Software Process Improvement Networks (SPINs), and the Capability Maturity Model® (CMM®) for Software...the goals of the collaborations. Capability Maturity Model and CMM are registered in the U.S. Patent and Trademark Office. CMU/SEI-99-SR-001 2...Department of Defense. Both of these divisions have been rated at the Software Engineering Institute (SEI) Capability Maturity Model (CMM) Level 3 or

  8. Software Quality Evaluation Models Applicable in Health Information and Communications Technologies. A Review of the Literature.

    PubMed

    Villamor Ordozgoiti, Alberto; Delgado Hito, Pilar; Guix Comellas, Eva María; Fernandez Sanchez, Carlos Manuel; Garcia Hernandez, Milagros; Lluch Canut, Teresa

    2016-01-01

The use of Information and Communications Technologies (ICTs) in healthcare has increased the need to consider quality criteria through standardised processes. The aim of this study was to analyse the software quality evaluation models applicable to healthcare from the perspective of ICT purchasers. Through a systematic literature review with the keywords software, product, quality, evaluation and health, we selected and analysed 20 original research papers published from 2005-2016 in health science and technology databases. The results showed four main topics: non-ISO models, software quality evaluation models based on ISO/IEC standards, studies analysing software quality evaluation models, and studies analysing ISO standards for software quality evaluation. The models provide cost-efficiency criteria for specific software and improve use outcomes. The ISO/IEC 25000 standard is shown to be the most suitable for evaluating the quality of ICTs for healthcare use from the perspective of institutional acquisition.

  9. Handbook of software quality assurance techniques applicable to the nuclear industry

    SciTech Connect

    Bryant, J.L.; Wilburn, N.P.

    1987-08-01

Pacific Northwest Laboratory is conducting a research project to recommend good engineering practices for applying 10 CFR 50, Appendix B requirements to assure quality in the development and use of computer software for the design and operation of nuclear power plants, for the NRC and industry. This handbook defines the content of a software quality assurance program by enumerating the applicable techniques. Definitions, descriptions, and references where further information may be obtained are provided for each topic.

  10. Software Application for Supporting the Education of Database Systems

    ERIC Educational Resources Information Center

    Vágner, Anikó

    2015-01-01

    The article introduces an application which supports the education of database systems, particularly the teaching of SQL and PL/SQL in Oracle Database Management System environment. The application has two parts, one is the database schema and its content, and the other is a C# application. The schema is to administrate and store the tasks and the…

  11. Application of Domain Knowledge to Software Quality Assurance

    NASA Technical Reports Server (NTRS)

    Wild, Christian W.

    1997-01-01

This work focused on capturing, using, and evolving a qualitative decision support structure across the life cycle of a project. The particular application of this study was business process reengineering and the representation of the business process in a set of Business Rules (BR). In this work, we defined a decision model which captured the qualitative decision deliberation process. It represented arguments both for and against proposed alternatives to a problem. It was felt that the subjective nature of many critical business policy decisions required a qualitative modeling approach similar to that of Lee and Mylopoulos. While previous work was limited almost exclusively to the decision capture phase, which occurs early in the project life cycle, we investigated the use of such a model during the later stages as well. One of our significant developments was the use of the decision model during the operational phase of a project. By operational phase, we mean the phase in which the system or set of policies decided upon earlier is deployed and put into practice. By making the decision model available to operational decision makers, they have access to the arguments pro and con for a variety of actions and can thus make a more informed decision which balances the often conflicting criteria by which the value of an action is measured. We also developed the concept of a 'monitored decision', in which metrics of performance were identified during the decision making process and used to evaluate the quality of that decision. It is important to monitor those decisions which seem at highest risk of not meeting their stated objectives. Operational decisions are also potentially high-risk decisions. Finally, we investigated the use of performance metrics for monitored decisions and audit logs of operational decisions in order to feed an evolutionary phase of the life cycle. During evolution, decisions are revisited, assumptions verified or refuted

  12. Tightly Coupled Geodynamic Systems: Software, Implicit Solvers & Applications

    NASA Astrophysics Data System (ADS)

    May, D.; Le Pourhiet, L.; Brown, J.

    2011-12-01

The generic term "multi-physics" is used to define physical processes which are described by a collection of partial differential equations, or "physics". Numerous processes in geodynamics fall into this category: for example, the evolution of viscous fluid flow and heat transport within the mantle (Stokes flow + energy conservation), the dynamics of melt migration (Stokes flow + Darcy flow + porosity evolution) and landscape evolution (Stokes + diffusion/advection over a surface). Software developed to numerically investigate processes that are described through the composition of different physics components is typically (a) designed for one particular set of physics and never intended to be extended or coupled to other processes, or (b) built so that certain non-linearities (or couplings) are explicitly removed from the system for reasons of computational efficiency, or due to the lack of a robust non-linear solver (e.g. most models in the mantle convection community). We describe a software infrastructure which enables us to easily introduce new physics with minimal code modifications; tightly couple all physics without introducing splitting errors; exploit modern linear/non-linear solvers and permit the re-use of monolithic preconditioners for individual physics blocks (e.g. saddle-point preconditioners for Stokes). Here we present a number of examples to illustrate the flexibility and importance of using this software infrastructure. Using the Stokes system as a prototype, we show results illustrating (i) visco-plastic shear banding experiments, (ii) how coupling Stokes flow with the evolution of the material coordinates can yield temporal stability in the free-surface evolution and (iii) the discretisation error associated with decoupling the Stokes equation from the heat transport equation in models of mantle convection with various rheologies.

  13. Scoring of medical publications with SIGAPS software: Application to orthopedics.

    PubMed

    Rouvillain, J-L; Derancourt, C; Moore, N; Devos, P

    2014-11-01

SIGAPS is a bibliometric software tool developed in France to identify and analyze Medline-indexed publications that are produced by a researcher or research group. This measurement takes into account the author's ranking on the paper along with the journal's prestige, according to its impact factor within the research field. However, use of this impact factor is the primary limitation of SIGAPS. The impact of the journal Revue de Chirurgie Orthopédique and its successor, Orthopaedics & Traumatology: Surgery & Research, was compared using the Medline-based ISI (SIGAPS) and SCOPUS-based SCImago journal rankings.

  14. The application of formal software engineering methods to the unattended and remote monitoring software suite at Los Alamos National Laboratory

    SciTech Connect

    Determan, John Clifford; Longo, Joseph F; Michel, Kelly D

    2009-01-01

The Unattended and Remote Monitoring (UNARM) system is a collection of specialized hardware and software used by the International Atomic Energy Agency (IAEA) to institute nuclear safeguards at many nuclear facilities around the world. The hardware consists of detectors, instruments, and networked computers for acquiring various forms of data, including but not limited to radiation data, global position coordinates, camera images, isotopic data, and operator declarations. The software provides two primary functions: the secure and reliable collection of this data from the instruments, and the ability to perform an integrated review and analysis of the disparate data sources. Several years ago the team responsible for maintaining the software portion of the UNARM system began the process of formalizing its operations. These formal operations include a configuration management system, a change control board, an issue tracking system, and extensive formal testing for both functionality and reliability. Functionality is tested with formal test cases chosen to fully represent the data types and methods of analysis that will be commonly encountered. Reliability is tested with iterative, concurrent testing, where up to five analyses are executed simultaneously for thousands of cycles. Iterative concurrent testing helps ensure that there are no resource conflicts or leaks when multiple system components are in use simultaneously. The goal of this work is to provide a high-quality, reliable product, commensurate with the criticality of the application. Testing results are presented that demonstrate this goal has been achieved, along with the impact of introducing a formal software engineering framework to the UNARM product.
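The iterative, concurrent testing described above can be imitated in miniature: run several copies of an analysis concurrently for many cycles, then check that no updates to a shared resource were lost. The sketch below is a generic Python version of that pattern, not the UNARM test harness; the analysis body is a placeholder.

```python
import threading

def analysis(worker_id, iterations, results, lock):
    """Stand-in for one review/analysis task; touches a shared resource."""
    for i in range(iterations):
        with lock:
            results.append((worker_id, i))

def stress_test(workers=5, iterations=1000):
    """Run `workers` analyses concurrently and check for lost updates."""
    results, lock = [], threading.Lock()
    threads = [
        threading.Thread(target=analysis, args=(w, iterations, results, lock))
        for w in range(workers)
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    # Every (worker, iteration) pair must appear exactly once.
    expected = workers * iterations
    return len(results) == expected and len(set(results)) == expected
```

Repeating such a run for thousands of cycles is what surfaces intermittent races and resource leaks that a single-threaded functional test never exercises.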

  15. Unit Testing for the Application Control Language (ACL) Software

    NASA Technical Reports Server (NTRS)

    Heinich, Christina Marie

    2014-01-01

In the software development process, code needs to be tested before it can be packaged for release, both to make sure the program actually does what it is supposed to do and to check how the program deals with errors and edge cases (such as negative or very large numbers). One of the major parts of the testing process is unit testing, where specific units of the code are tested to make sure each individual part works. This project is about unit testing many different components of the ACL software and fixing any errors encountered. To do this, mocks of other objects need to be created and every line of code needs to be exercised to make sure every case is accounted for. Mocks are important because they give direct control of the environment the unit lives in, instead of requiring the test to work with the entire program. This makes it easier to achieve the second goal of exercising every line of code.
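As a generic illustration of the mocking pattern described above (the ACL internals are not public, so the unit under test here is an invented retry helper and `transport` is a stand-in object):

```python
import unittest
from unittest import mock

# Invented unit under test: retry a command through a transport object.
def send_with_retry(transport, command, attempts=3):
    for _ in range(attempts):
        if transport.send(command):
            return True
    return False

class SendWithRetryTest(unittest.TestCase):
    def test_succeeds_on_second_attempt(self):
        transport = mock.Mock()
        transport.send.side_effect = [False, True]  # first call fails
        self.assertTrue(send_with_retry(transport, "ARM"))
        self.assertEqual(transport.send.call_count, 2)

    def test_gives_up_after_all_attempts(self):
        transport = mock.Mock()
        transport.send.return_value = False         # edge case: never succeeds
        self.assertFalse(send_with_retry(transport, "ARM"))
        self.assertEqual(transport.send.call_count, 3)
```

Because the mock replaces the real transport, the test controls exactly when the environment fails, which is how every branch of the retry loop gets exercised.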

  16. Application of Gaia Analysis Software AGIS to Nano-JASMINE

    NASA Astrophysics Data System (ADS)

    Yamada, Y.; Lammers, U.; Gouda, N.

    2011-07-01

The core data reduction for the Nano-JASMINE mission is planned to be done with Gaia's Astrometric Global Iterative Solution (AGIS). Nano-JASMINE is an ultra-small (35 kg) satellite for astrometry observations in Japan, and Gaia is ESA's large (over 1000 kg) next-generation astrometry mission. The accuracy of Nano-JASMINE is about 3 mas, comparable to the Hipparcos mission, Gaia's predecessor of some 20 years ago. Performing real scientific observations with such a small satellite is challenging. The collaboration for sharing software started in 2007. In addition to the similar design and operating principles of the two missions, this is possible thanks to the encapsulation of all Gaia-specific aspects of AGIS in a Parameter Database. Nano-JASMINE will be the test bench for the Gaia AGIS software. We present this idea in detail and the necessary practical steps to make AGIS work with Nano-JASMINE data. We also show the key mission parameters, goals, and status of the data reduction for Nano-JASMINE.

  17. Software applications toward quantitative metabolic flux analysis and modeling.

    PubMed

    Dandekar, Thomas; Fieselmann, Astrid; Majeed, Saman; Ahmed, Zeeshan

    2014-01-01

Metabolites and their pathways are central for adaptation and survival. Metabolic modeling elucidates in silico all the possible flux pathways (flux balance analysis, FBA) and predicts the actual fluxes under a given situation; further refinement of these models is possible by including experimental isotopologue data. In this review, we initially introduce the key theoretical concepts and different analysis steps in the modeling process before comparing flux calculation and metabolite analysis programs such as C13, BioOpt, COBRA toolbox, Metatool, efmtool, FiatFlux, ReMatch, VANTED, iMAT and YANA. Their respective strengths and limitations are discussed and compared to alternative software. While data analysis of metabolites, calculation of metabolic fluxes, pathways and their condition-specific changes are all possible, we highlight the considerations that need to be taken into account before deciding on a specific software package. Current challenges in the field include the computation of large-scale networks (in elementary mode analysis), regulatory interactions and detailed kinetics, and these are discussed in the light of powerful new approaches.
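To make the FBA idea concrete without pulling in one of the packages named above, the sketch below hand-solves the optimization for a toy three-reaction chain, where the steady-state constraint S v = 0 forces all fluxes to be equal, so the optimum is simply the bottleneck capacity. The network and bounds are invented; a real tool (e.g. the COBRA toolbox) solves the general case as a linear program.

```python
# Toy FBA: rows of S = metabolites A, B; columns = reactions
# v1 (uptake -> A), v2 (A -> B), v3 (B -> biomass).
S = [
    [1, -1, 0],   # A: produced by v1, consumed by v2
    [0, 1, -1],   # B: produced by v2, consumed by v3
]
UPPER = [10.0, 8.0, 12.0]  # hypothetical flux capacities

def steady_state(S, v, tol=1e-9):
    """Check S v = 0, i.e. no metabolite accumulates or drains."""
    return all(abs(sum(s * vi for s, vi in zip(row, v))) < tol for row in S)

def fba_linear_chain(upper):
    """Maximise v3 subject to S v = 0 and 0 <= v <= upper.
    For this chain, S v = 0 implies v1 = v2 = v3, so the optimum is
    min(upper) -- a hand-solvable special case of the FBA linear program."""
    vmax = min(upper)
    v = [vmax] * len(upper)
    assert steady_state(S, v)
    return v

print(fba_linear_chain(UPPER))  # -> [8.0, 8.0, 8.0]
```

The same structure (stoichiometric matrix, steady-state equality, flux bounds, linear objective) carries over unchanged to genome-scale networks; only the solver has to become a general LP.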

  18. Application of parallelized software architecture to an autonomous ground vehicle

    NASA Astrophysics Data System (ADS)

    Shakya, Rahul; Wright, Adam; Shin, Young Ho; Momin, Orko; Petkovsek, Steven; Wortman, Paul; Gautam, Prasanna; Norton, Adam

    2011-01-01

This paper presents improvements made to Q, an autonomous ground vehicle designed to participate in the Intelligent Ground Vehicle Competition (IGVC). For the 2010 IGVC, Q was upgraded with a new parallelized software architecture and a new vision processor. Improvements were made to the power system, reducing the number of batteries required for operation from six to one. In previous years, a single state machine was used to execute the bulk of processing activities, including sensor interfacing, data processing, path planning, navigation algorithms and motor control. This inefficient approach led to poor performance and made the software difficult to maintain and modify. For IGVC 2010, the team implemented a modular parallel architecture using the National Instruments (NI) LabVIEW programming language. The new architecture divides all the necessary tasks (motor control, navigation, sensor data collection, etc.) into well-organized components that execute in parallel, providing considerable flexibility and facilitating efficient use of processing power. Computer vision is used to detect white lines on the ground and determine their location relative to the robot. With the new vision processor and some optimization of the image processing algorithm used the previous year, two frames can be acquired and processed in 70 ms. With all these improvements, Q placed 2nd in the autonomous challenge.
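The modular parallel design described above can be illustrated with a minimal Python analogue (the actual system was built in LabVIEW, and the module names here are hypothetical): each task runs in its own thread, and components exchange data over queues instead of sharing one monolithic state machine.

```python
# Hypothetical sketch of a modular parallel architecture: a sensor module
# and a navigation module run concurrently, decoupled by a queue.
import threading
import queue

sensor_q = queue.Queue()

def sensor_module(n_frames):
    # stand-in for vision/sensor acquisition
    for i in range(n_frames):
        sensor_q.put(("line_position", i))
    sensor_q.put(None)  # shutdown sentinel

def navigation_module(out):
    # consumes sensor data and would normally produce steering decisions
    while True:
        item = sensor_q.get()
        if item is None:
            break
        out.append(item)

results = []
t1 = threading.Thread(target=sensor_module, args=(5,))
t2 = threading.Thread(target=navigation_module, args=(results,))
t1.start(); t2.start(); t1.join(); t2.join()
print(len(results))  # 5 sensor readings consumed
```

Because each module only touches its queues, modules can be added, removed, or replaced without rewriting the others, which is the maintainability benefit the paper attributes to the new architecture.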

  19. Optimizing the use of open-source software applications in drug discovery.

    PubMed

    Geldenhuys, Werner J; Gaasch, Kevin E; Watson, Mark; Allen, David D; Van der Schyf, Cornelis J

    2006-02-01

Drug discovery is a time-consuming and costly process. Recently, a trend towards the use of in silico computational chemistry and molecular modeling for computer-aided drug design has gained significant momentum. This review investigates the application of free and/or open-source software in the drug discovery process. Among the reviewed software programs are applications written in Java, Perl and Python, as well as resources including software libraries. These programs might be useful for cheminformatics approaches to drug discovery, including QSAR studies, energy minimization and docking studies in drug design endeavors. Furthermore, this review explores options for integrating available open-source computer modeling applications into drug discovery programs.

  20. Software Defined Radio Standard Architecture and its Application to NASA Space Missions

    NASA Technical Reports Server (NTRS)

    Andro, Monty; Reinhart, Richard C.

    2006-01-01

A software defined radio (SDR) architecture used in space-based platforms proposes to standardize certain aspects of radio development, such as interface definitions, functional control and execution, and application software and firmware development. NASA has chartered a team to develop an open software defined radio hardware and software architecture to support NASA missions and determine the viability of an Agency-wide standard. A draft concept of the proposed standard has been released and discussed among organizations in the SDR community. Appropriate leveraging of the JTRS SCA, OMG's SWRadio Architecture and other aspects is considered. A standard radio architecture offers potential value by employing common waveform software instantiation, operation, testing and software maintenance. While software defined radios offer greater flexibility, they also pose challenges for radio development in the space environment in terms of size, mass, power consumption and available technology. An SDR architecture for space must recognize and address the constraints of space flight hardware and systems, along with flight heritage and culture. NASA is actively participating in the development of technology and standards related to software defined radios. As NASA considers a standard radio architecture for space communications, input and coordination from government agencies, industry, academia, and standards bodies are key to a successful architecture. The unique aspects of space require thorough investigation of relevant terrestrial technologies properly adapted to space. The talk will describe NASA's current effort to investigate SDR applications to space missions and give a brief overview of a candidate architecture under consideration for space-based platforms.

  1. Data-Interpolating Variational Analysis (DIVA) software : recent development and application

    NASA Astrophysics Data System (ADS)

    Watelet, Sylvain; Barth, Alexander; Troupin, Charles; Ouberdous, Mohamed; Beckers, Jean-Marie

    2014-05-01

The Data-Interpolating Variational Analysis (DIVA) software is a tool designed to reconstruct a continuous field from discrete measurements. The method is based on a numerical implementation of the Variational Inverse Model (VIM), which minimizes a cost function to select the analysed field that best fits the data sets. The problem is solved efficiently using a finite-element method. This statistical method is particularly suited to irregularly-spaced observations, producing outputs on a regular grid. Initially created to work in two dimensions, the software is now able to handle 3D or even 4D analyses, making it easy to produce ocean climatologies. These analyses can be further improved by taking advantage of DIVA's ability to take topographic and dynamic constraints into account (coastal relief, prevailing winds impacting the advection, ...). In DIVA, we assume that errors on measurements are not correlated, which means we do not consider the effect of correlated observation errors on the analysis and therefore use a diagonal observation error covariance matrix. However, oceanographic data sets are generally clustered in space and time, which introduces correlation between observations. In order to determine the impact of this approximation and provide strategies to mitigate its effects, we conducted several synthetic experiments with known correlation structure. Overall, the best results were obtained with a variant of the covariance inflation method. Finally, a new application of DIVA to satellite altimetry data will be presented: these data have particular space and time distributions, as they consist of repeated tracks (~10-35 days) of measurements less than 10 km apart within a given track. The tools designed to determine the analysis parameters were adapted to these specificities. Moreover, different weights were applied to measurements in order to
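The covariance inflation idea can be sketched numerically: clustered observations carry partially redundant information, so their diagonal error variances are inflated (equivalently, their analysis weights are reduced). This is an illustrative toy, not DIVA's actual scheme; the density-based inflation rule and all numbers are assumptions.

```python
# Toy covariance inflation: inflate each observation's error variance by the
# number of neighbouring observations within a correlation length.
import numpy as np

obs_x = np.array([0.0, 0.1, 0.15, 5.0])   # positions; the first three are clustered
base_var = 1.0                             # uncorrelated error variance
radius = 0.5                               # hypothetical correlation length

# neighbours within the correlation length (each point counts itself)
density = np.array([(np.abs(obs_x - x) < radius).sum() for x in obs_x])
inflated_var = base_var * density          # clustered points get larger variance
weights = 1.0 / inflated_var               # and hence smaller analysis weight

print(inflated_var)  # [3. 3. 3. 1.]
```

The net effect is that the three clustered observations together carry roughly the weight of one independent observation, mimicking what a full off-diagonal error covariance would achieve at far lower cost.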

  2. Automated Translation of Safety Critical Application Software Specifications into PLC Ladder Logic

    NASA Technical Reports Server (NTRS)

    Leucht, Kurt W.; Semmel, Glenn S.

    2008-01-01

    The numerous benefits of automatic application code generation are widely accepted within the software engineering community. A few of these benefits include raising the abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at the NASA Kennedy Space Center (KSC) recognized the need for PLC code generation while developing their new ground checkout and launch processing system. They developed a process and a prototype software tool that automatically translates a high-level representation or specification of safety critical application software into ladder logic that executes on a PLC. This process and tool are expected to increase the reliability of the PLC code over that which is written manually, and may even lower life-cycle costs and shorten the development schedule of the new control system at KSC. This paper examines the problem domain and discusses the process and software tool that were prototyped by the KSC software engineers.
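The kind of translation described can be sketched in miniature. The rule syntax and the translator below are hypothetical, not KSC's specification format; the output is IEC 61131-3 Instruction List, a textual sibling of ladder logic.

```python
# Hypothetical spec-to-PLC translator: compile a boolean rule of the form
# "output := input1 AND NOT input2" into Instruction List statements
# (LD = load, AND/ANDN = and / and-not, ST = store).
def translate(rule):
    out, expr = [s.strip() for s in rule.split(":=")]
    il, op, negate = [], "LD", False
    for tok in expr.split():
        if tok == "AND":
            op = "AND"
        elif tok == "NOT":
            negate = True
        else:  # an operand: emit the pending operator, with N suffix if negated
            il.append(op + ("N" if negate else "") + " " + tok)
            negate = False
    il.append("ST " + out)
    return il

print(translate("output := input1 AND NOT input2"))
# ['LD input1', 'ANDN input2', 'ST output']
```

A real generator like the one prototyped at KSC would of course parse a much richer safety-critical specification and emit vendor-specific ladder rungs, but the core idea is the same mechanical, rule-by-rule translation.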

  3. Reducing errors in the management of hyperbilirubinaemia: validating a software application.

    PubMed

    Balaguer, A; Quiroga-González, R; Camprubí, M; Milá-Farnés, M; Escribano, J; Girabent-Farrés, M

    2009-01-01

Objective: To verify the usefulness and reliability of a software tool we developed to help apply the American Academy of Pediatrics (AAP) 2004 guidelines on hyperbilirubinaemia according to the infant's age in hours and clinical risk factors. Design: Randomised, cross-over, controlled trial with 20 simulated clinical cases comparing "manual" application of the guidelines with our software application. Participants: Fifteen doctors (eight final-year residents and seven consultants) from two hospitals in Spain. Main outcome measures: Major errors (defined a priori as any deviation from the AAP guidelines involving a risk of morbidity or mortality for the patient), minor errors (those causing discomfort and/or, in extremely rare cases, morbidity) and time spent. Results: Fifteen doctors each managed 20 simulated cases, half using the guidelines alone and half using the software tool. Without the software application, 42 "minor" errors were made; with it, only 25. "Major" errors also decreased from 10 to 2 with the software. As a group, the residents benefited most, making on average 1.8 fewer errors per 10 cases. Use of the software reduced the time the residents took to resolve the cases, although the mean reduction in time was not significant for the consultants. Conclusions: The use of simulated clinical cases revealed many errors in the routine management of hyperbilirubinaemia. The software helped clinicians make fewer errors and saved time for residents, but not consultants.

  4. Comprehensive, powerful, efficient, intuitive: a new software framework for clinical imaging applications

    NASA Astrophysics Data System (ADS)

    Augustine, Kurt E.; Holmes, David R., III; Hanson, Dennis P.; Robb, Richard A.

    2006-03-01

    One of the greatest challenges for a software engineer is to create a complex application that is comprehensive enough to be useful to a diverse set of users, yet focused enough for individual tasks to be carried out efficiently with minimal training. This "powerful yet simple" paradox is particularly prevalent in advanced medical imaging applications. Recent research in the Biomedical Imaging Resource (BIR) at Mayo Clinic has been directed toward development of an imaging application framework that provides powerful image visualization/analysis tools in an intuitive, easy-to-use interface. It is based on two concepts very familiar to physicians - Cases and Workflows. Each case is associated with a unique patient and a specific set of routine clinical tasks, or a workflow. Each workflow is comprised of an ordered set of general-purpose modules which can be re-used for each unique workflow. Clinicians help describe and design the workflows, and then are provided with an intuitive interface to both patient data and analysis tools. Since most of the individual steps are common to many different workflows, the use of general-purpose modules reduces development time and results in applications that are consistent, stable, and robust. While the development of individual modules may reflect years of research by imaging scientists, new customized workflows based on the new modules can be developed extremely fast. If a powerful, comprehensive application is difficult to learn and complicated to use, it will be unacceptable to most clinicians. Clinical image analysis tools must be intuitive and effective or they simply will not be used.
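The Case/Workflow composition described above can be sketched as follows; the class names and API are hypothetical, not the BIR framework's actual interfaces.

```python
# Hypothetical sketch of the Case/Workflow idea: a workflow is an ordered
# list of reusable, general-purpose modules applied to a case's data in turn.
class Module:
    def __init__(self, name, fn):
        self.name, self.fn = name, fn

    def run(self, data):
        return self.fn(data)

class Workflow:
    def __init__(self, modules):
        self.modules = modules

    def run(self, case_data):
        # execute the ordered steps of this clinical task
        for m in self.modules:
            case_data = m.run(case_data)
        return case_data

# the same general-purpose modules can be reused across many workflows
load = Module("load", lambda d: d + ["loaded"])
segment = Module("segment", lambda d: d + ["segmented"])
measure = Module("measure", lambda d: d + ["measured"])

tumor_workflow = Workflow([load, segment, measure])
print(tumor_workflow.run([]))  # ['loaded', 'segmented', 'measured']
```

Because each module is independent, a new workflow for a different clinical task is just a new ordering of existing modules, which is what makes new customized workflows fast to build.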

  5. Supporting Fourth-Grade Students' Word Identification Using Application Software

    ERIC Educational Resources Information Center

    Moser, Gary P.; Morrison, Timothy G.; Wilcox, Brad

    2017-01-01

    A quasi-experimental study examined effects of a 10-week word structure intervention with fourth-grade students. During daily 10-15-minute practice periods, students worked individually with mobile apps focused on specific aspects of word identification. Pre- and post-treatment assessments showed no differences in rate and accuracy of oral reading…

  6. An experimental investigation of fault tolerant software structures in an avionics application

    NASA Technical Reports Server (NTRS)

    Caglayan, Alper K.; Eckhardt, Dave E., Jr.

    1989-01-01

    The objective of this experimental investigation is to compare the functional performance and software reliability of competing fault tolerant software structures utilizing software diversity. In this experiment, three versions of the redundancy management software for a skewed sensor array have been developed using three diverse failure detection and isolation algorithms and incorporated into various N-version, recovery block and hybrid software structures. The empirical results show that, for maximum functional performance improvement in the selected application domain, the results of diverse algorithms should be voted before being processed by multiple versions without enforced diversity. Results also suggest that when the reliability gain with an N-version structure is modest, recovery block structures are more feasible since higher reliability can be obtained using an acceptance check with a modest reliability.

  8. Office Computer Software: A Comprehensive Review of Software Programs.

    ERIC Educational Resources Information Center

    Secretary, 1992

    1992-01-01

    Describes types of software including system software, application software, spreadsheets, accounting software, graphics packages, desktop publishing software, database, desktop and personal information management software, project and records management software, groupware, and shareware. (JOW)

  9. Conceptions of Software Development by Project Managers: A Study of Managing the Outsourced Development of Software Applications for United States Federal Government Agencies

    ERIC Educational Resources Information Center

    Eisen, Daniel

    2013-01-01

    This study explores how project managers, working for private federal IT contractors, experience and understand managing the development of software applications for U.S. federal government agencies. Very little is known about how they manage their projects in this challenging environment. Software development is a complex task and only grows in…

  11. User Manual for the Data-Series Interface of the Gr Application Software

    USGS Publications Warehouse

    Donovan, John M.

    2009-01-01

This manual describes the data-series interface of the Gr Application software. Basic tasks such as plotting, editing, manipulating, and printing data series are presented. The properties of the various types of data objects and graphical objects used within the application, and the relationships between them, are also presented. Descriptions of compatible data-series file formats are provided.

  12. Cost Effective Applications of High Integrity Software Processes

    DTIC Science & Technology

    2011-05-18

Inspections/peer reviews • Checklists • Programming languages and coding standards • Static code analysis • Code complexity • Unit testing ... from their own perspective. Inspection/peer reviews: reduce costly rework through a focus on defect removal ... (inspections/peer reviews) to remove up to 80 percent of their defects. It doesn't have to be hard: reviews can be of many different types (very formal

  13. Software Attribution for Geoscience Applications in the Computational Infrastructure for Geodynamics

    NASA Astrophysics Data System (ADS)

    Hwang, L.; Dumit, J.; Fish, A.; Soito, L.; Kellogg, L. H.; Smith, M.

    2015-12-01

Scientific software is largely developed by individual scientists and represents a significant intellectual contribution to the field. As the scientific culture and funding agencies move towards an expectation that software be open source, there is a corresponding need for mechanisms to cite software, both to provide credit and recognition to developers and to aid the discoverability of software and scientific reproducibility. We assess the geodynamic modeling community's current citation practices by examining more than 300 predominantly self-reported publications from the past 5 years that use scientific software available through the Computational Infrastructure for Geodynamics (CIG). Preliminary results indicate that authors cite and attribute software by citing (in rank order) peer-reviewed scientific publications, a user's manual, and/or a paper describing the software code. Attributions may be found directly in the text, in acknowledgements, in figure captions, or in footnotes. What is considered citable varies widely. Citations predominantly lack software version numbers or persistent identifiers for finding the software package. Versioning may be implied through reference to a versioned user manual. Authors sometimes report code features used and whether they have modified the code. As an open-source community, CIG requests that researchers contribute their modifications to the repository. However, such modifications may not be contributed back to a repository code branch, decreasing the chances of discoverability and reproducibility. Survey results from CIG's Software Attribution for Geoscience Applications (SAGA) project suggest that a lack of knowledge, tools, and workflows to cite codes is a barrier to effectively implementing the emerging citation norms. On-demand attribution generation on software landing pages and a prototype extensible plug-in that automatically generates attributions in codes are first steps towards reproducibility.

  14. A beamline matching application based on open source software

    SciTech Connect

    2000-12-21

    An interactive Beamline Matching application has been developed using beamline and automatic differentiation class libraries. Various freely available components were used; in particular, the user interface is based on FLTK, a C++ toolkit distributed under the terms of the GNU Public License (GPL). The result is an application that compiles without modifications under both X-Windows and Win32 and offers the same look and feel under both operating environments. In this paper, we discuss some of the practical issues that were confronted and the choices that were made. In particular, we discuss object-based event propagation mechanisms, multithreading, language mixing and persistence.

  15. Beehive: A Software Application for Synchronous Collaborative Learning

    ERIC Educational Resources Information Center

    Turani, Aiman; Calvo, Rafael A.

    2006-01-01

    Purpose: The purpose of this paper is to describe Beehive, a new web application framework for designing and supporting synchronous collaborative learning. Design/methodology/approach: Our web engineering approach integrates educational design expertise into a technology for building tools for collaborative learning activities. Beehive simplifies…

  16. Scaling Irregular Applications through Data Aggregation and Software Multithreading

    SciTech Connect

    Morari, Alessandro; Tumeo, Antonino; Chavarría-Miranda, Daniel; Villa, Oreste; Valero, Mateo

    2014-05-30

Bioinformatics, data analytics, semantic databases, and knowledge discovery are emerging high performance application areas that exploit dynamic, linked data structures such as graphs, unbalanced trees or unstructured grids. These data structures are usually very large, requiring significantly more memory than is available on single shared memory systems. Additionally, these data structures are difficult to partition on distributed memory systems. They also present poor spatial and temporal locality, thus generating unpredictable memory and network accesses. The Partitioned Global Address Space (PGAS) programming model seems suitable for these applications, because it allows using a shared memory abstraction across distributed-memory clusters. However, current PGAS languages and libraries are built to target regular remote data accesses and block transfers. Furthermore, they usually rely on the Single Program Multiple Data (SPMD) parallel control model, which is not well suited to the fine-grained, dynamic and unbalanced parallelism of irregular applications. In this paper we present GMT (Global Memory and Threading library), a custom runtime library that enables efficient execution of irregular applications on commodity clusters. GMT integrates a PGAS data substrate with simple fork/join parallelism and provides automatic load balancing on a per-node basis. It implements multi-level aggregation and lightweight multithreading to maximize memory and network bandwidth with fine-grained data accesses and to tolerate long data access latencies. A key innovation in the GMT runtime is its thread specialization (workers, helpers and communication threads), which realizes the overall functionality. We compare our approach with other PGAS models, such as UPC running over GASNet, and with hand-optimized MPI code on a set of typical large-scale irregular applications, demonstrating speedups of an order of magnitude.
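The aggregation idea at the heart of GMT can be illustrated with a toy model (not the GMT API): fine-grained remote requests destined for the same node are buffered and shipped as one block message, trading a little latency for bandwidth.

```python
# Toy message aggregator: buffer small remote requests per destination node
# and send one block message whenever a buffer fills.
from collections import defaultdict

class Aggregator:
    def __init__(self, block_size):
        self.block_size = block_size
        self.buffers = defaultdict(list)   # destination node -> pending requests
        self.messages_sent = 0

    def request(self, node, addr):
        buf = self.buffers[node]
        buf.append(addr)
        if len(buf) >= self.block_size:
            self.flush(node)

    def flush(self, node):
        if self.buffers[node]:
            self.messages_sent += 1        # one network message per block
            self.buffers[node].clear()

agg = Aggregator(block_size=64)
for i in range(256):                       # 256 fine-grained accesses to node 0
    agg.request(0, i)
print(agg.messages_sent)  # 4 block messages instead of 256 small ones
```

While a request waits in a buffer, GMT's lightweight multithreading keeps the processor busy with other tasks, which is how the latency cost of aggregation is hidden.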

  17. The Enterprise Derivative Application: Flexible Software for Optimizing Manufacturing Processes

    SciTech Connect

    Ward, Richard C; Allgood, Glenn O; Knox, John R

    2008-11-01

The Enterprise Derivative Application (EDA) implements the enterprise-derivative analysis for optimization of an industrial process (Allgood and Manges, 2001). It is a tool to help industry planners choose the most productive way of manufacturing their products while minimizing cost. Developed in MS Access, the application allows users to input initial data ranging from raw materials to variable costs and enables the tracking of specific information as material is passed from one process to another. Energy-derivative analysis is based on the calculation of sensitivity parameters. For the specific application to a steel production process these include: the cost to product sensitivity, the product to energy sensitivity, the energy to efficiency sensitivity, and the efficiency to cost sensitivity. Using the EDA, the user can display a particular sensitivity for any process, or compare all sensitivities across all processes. Although energy-derivative analysis was originally designed for use by the steel industry, it is flexible enough to be applied to many other industrial processes. Examples of processes where energy-derivative analysis would prove useful are wireless monitoring of processes in the petroleum cracking industry and wireless monitoring of motor failure for determining the optimum time to replace motor parts. One advantage of the MS Access-based application is its flexibility in defining the process flow and establishing the relationships between parent and child processes and the products resulting from a process. Due to the general design of the program, a process can be anything that occurs over time with resulting output (products), so the application can be easily adapted to many different industrial and organizational environments. Another advantage is the flexibility of defining sensitivity parameters. Sensitivities can be determined between all possible variables in the process flow as a function of time. Thus the dynamic development of the
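A sensitivity parameter of the kind listed above can be sketched as a finite-difference derivative of one process quantity with respect to another. The process relations and numbers below are invented for illustration and are not taken from the EDA.

```python
# Illustrative sensitivity calculation: estimate d f / d x by central
# difference for toy (invented) steel-process relations.
def sensitivity(f, x, dx=1e-6):
    """Central-difference estimate of the derivative of f at x."""
    return (f(x + dx) - f(x - dx)) / (2 * dx)

# hypothetical process relations, purely for illustration
energy_per_ton = lambda tons: 500.0 + 2.0 * tons   # MWh required as output grows
cost_of_energy = lambda mwh: 80.0 * mwh            # dollars per MWh consumed

s_prod_energy = sensitivity(energy_per_ton, 100.0)  # product -> energy sensitivity
s_energy_cost = sensitivity(cost_of_energy, 700.0)  # energy -> cost sensitivity
print(s_prod_energy, s_energy_cost)  # 2.0 80.0 (both relations are linear)
```

Chaining such sensitivities along the process flow (cost to product, product to energy, and so on) is what lets a planner see where a small operational change propagates into the largest cost effect.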

  18. A wind tunnel application of large-field focusing schlieren

    NASA Technical Reports Server (NTRS)

    Ponton, Michael K.; Seiner, John M.; Mitchell, L. K.; Manning, James C.; Jansen, Bernard J.; Lagen, Nicholas T.

    1992-01-01

    A large-field focusing schlieren apparatus was installed in the NASA Lewis Research Center 9 by 15 foot wind tunnel in an attempt to determine the density gradient flow field of a free jet issuing from a supersonic nozzle configuration. The nozzle exit geometry was designed to reduce acoustic emissions from the jet by enhancing plume mixing. Thus, the flow exhibited a complex three-dimensional structure which warranted utilizing the sharp focusing capability of this type of schlieren method. Design considerations concerning tunnel limitations, high-speed photography, and video tape recording are presented in the paper.

  20. Applications of focused ion beam systems in gunshot residue investigation.

    PubMed

    Niewöhner, L; Wenz, H W

    1999-01-01

    Scanning ion microscopy technology has opened a new door to forensic scientists, allowing the GSR investigator to see inside a particle's core. Using a focused ion beam, particles can be cross-sectioned, revealing interior morphology and character that can be utilized for identification of the ammunition manufacturer.

  1. Description of phase singularities and their application to focusing design.

    PubMed

    Martínez-Niconoff, G; Muñoz-Lopez, J; Méndez-Martínez, E

    2001-09-01

We describe the focusing region associated with transmittances by analyzing the associated phase function. We show that generic features can be studied from the differential equation for the focusing geometry, which is obtained through the angular representation of diffraction fields. With this treatment, we recover the results for circular zone plates, and by introducing a linear transformation into the transmittance function we generate structures that retain the ability to focus. Depending on the choice of the parameters involved, the diffraction field presents new focusing regions, whose three-dimensional geometry and spatial evolution can be described selectively by analyzing only the phase singularities associated with the diffraction field, avoiding the integral representation. The treatment is also applied to a simple lens. We recover the theoretical predictions obtained by Berry and Upstill [M. V. Berry and C. Upstill, in Progress in Optics, E. Wolf, ed. (North-Holland, Amsterdam, 1980), Vol. XVIII, p. 259], and these predictions are corroborated experimentally.

  2. Generation of Focused Shock Waves in Water for Biomedical Applications

    NASA Astrophysics Data System (ADS)

    Lukeš, Petr; Šunka, Pavel; Hoffer, Petr; Stelmashuk, Vitaliy; Beneš, Jiří; Poučková, Pavla; Zadinová, Marie; Zeman, Jan

The physical characteristics of focused two successive (tandem) shock waves (FTSW) in water and their biological effects are presented. FTSW were generated by underwater multichannel electrical discharges in a highly conductive saline solution using two porous ceramic-coated cylindrical electrodes of different diameter and surface area. The primary cylindrical pressure wave generated at each composite electrode was focused by a metallic parabolic reflector to a common focal point to form two strong shock waves with a variable time delay between them. The pressure field and the interaction between the first and second shock waves at the focus were investigated using schlieren photography and polyvinylidene fluoride (PVDF) shock gauge sensors. The strongest interaction was obtained for a time delay of 8-15 μs between the waves, producing a negative pressure phase of the second shock wave down to -80 MPa and a large number of cavitations at the focus. The biological effects of FTSW were demonstrated in vitro by damage to B16 melanoma cells, in vivo by targeted lesions in the thigh muscles of rabbits, and by the growth delay of sarcoma tumors in Lewis rats treated in vivo with FTSW, compared to untreated controls.

  3. Applications of software-defined radio (SDR) technology in hospital environments.

    PubMed

    Chávez-Santiago, Raúl; Mateska, Aleksandra; Chomu, Konstantin; Gavrilovska, Liljana; Balasingham, Ilangko

    2013-01-01

    A software-defined radio (SDR) is a radio communication system where the major part of its functionality is implemented by means of software in a personal computer or embedded system. Such a design paradigm has the major advantage of producing devices that can receive and transmit widely different radio protocols based solely on the software used. This flexibility opens several application opportunities in hospital environments, where a large number of wired and wireless electronic devices must coexist in confined areas like operating rooms and intensive care units. This paper outlines some possible applications in the 2360-2500 MHz frequency band. These applications include the integration of wireless medical devices in a common communication platform for seamless interoperability, and cognitive radio (CR) for body area networks (BANs) and wireless sensor networks (WSNs) for medical environmental surveillance. The description of a proof-of-concept CR prototype is also presented.

  4. Generic domain models in software engineering

    NASA Technical Reports Server (NTRS)

    Maiden, Neil

    1992-01-01

    This paper outlines three research directions related to domain-specific software development: (1) reuse of generic models for domain-specific software development; (2) empirical evidence to determine these generic models, namely elicitation of mental knowledge schema possessed by expert software developers; and (3) exploitation of generic domain models to assist modelling of specific applications. It focuses on knowledge acquisition for domain-specific software development, with emphasis on tool support for the most important phases of software development.

  5. A model of cloud application assignments in software-defined storages

    NASA Astrophysics Data System (ADS)

    Bolodurina, Irina P.; Parfenov, Denis I.; Polezhaev, Petr N.; Shukhman, Alexander E.

    2017-01-01

    The aim of this study is to analyze the structure and interaction mechanisms of typical cloud applications and to suggest approaches for optimizing their placement in storage systems. In this paper, we describe a generalized model of cloud applications comprising three basic layers: a model of the application, a model of the service, and a model of the resource. A distinctive feature of the suggested model is that it analyzes cloud resources both from the user's point of view and from the point of view of the software-defined infrastructure of the virtual data center (DC). Its novelty lies in describing the application data placements and the state of the virtual environment at the same time, taking the network topology into account. The model of software-defined storage has been developed as a submodel within the resource model; it allows implementing an algorithm for controlling cloud application assignments in software-defined storages. Experimental results showed that this algorithm decreases cloud application response time and increases performance in user request processing. The use of software-defined data storages also allows a decrease in the number of physical storage devices, which demonstrates the efficiency of our algorithm.
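As an illustration of the kind of assignment logic such a resource-layer model enables, the sketch below greedily places application data volumes onto software-defined storage nodes by free capacity and network distance. The node names, capacities, and the greedy rule are all hypothetical; the abstract does not specify the paper's actual algorithm.

```python
from dataclasses import dataclass

@dataclass
class StorageNode:
    name: str
    capacity_gb: int
    hops: int          # network distance from the application to the node
    used_gb: int = 0

def place(volumes, nodes):
    """Largest-first greedy placement onto the nearest node with room."""
    assignment = {}
    for vol, size in sorted(volumes.items(), key=lambda v: -v[1]):
        fits = [n for n in nodes if n.capacity_gb - n.used_gb >= size]
        best = min(fits, key=lambda n: (n.hops, n.used_gb))
        best.used_gb += size
        assignment[vol] = best.name
    return assignment

nodes = [StorageNode("sds-a", 500, hops=1), StorageNode("sds-b", 1000, hops=3)]
print(place({"db": 400, "logs": 300, "cache": 50}, nodes))
# {'db': 'sds-a', 'logs': 'sds-b', 'cache': 'sds-a'}
```

The largest volume claims the nearest node, the next one overflows to the remote node, and the small cache still fits locally; a topology-aware model would refine the `hops` cost with measured link state.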

  6. [Impact of a software application to improve medication reconciliation at hospital discharge].

    PubMed

    Corral Baena, S; Garabito Sánchez, M J; Ruíz Rómero, M V; Vergara Díaz, M A; Martín Chacón, E R; Fernández Moyano, A

    2014-01-01

    To assess the impact of a software application to improve the quality of information concerning current patient medications and changes in the discharge report after hospitalization, and to analyze the incidence of errors and classify them. Quasi-experimental pre/post study with a non-equivalent control group, in medical patients at hospital discharge; the intervention was the implementation of a software application. Outcome measures were the percentage of patient medication reconciled on discharge, and the percentage of patients with more than one unjustified discrepancy. A total of 349 patients were assessed: 199 (pre-intervention phase) and 150 (post-intervention phase). Before implementation of the application, medication reconciliation had been completed for 157 patients (78.8%), with reconciliation errors found in 99 (63.0%). The most frequent type of error, 339 (78.5%), was missing dose or administration-frequency information. After implementation, all patient prescriptions were reconciled when the software was used. The percentage of patients with unjustified discrepancies decreased from 63.0% to 11.8% with the use of the application (p<.001). The main type of discrepancy found when using the application was a confusing prescription, attributable to professionals not yet being used to the new tool. The use of a software application has been shown to improve the quality of information on patient treatment in the hospital discharge report, but further development is still needed as a strategy for improving medication reconciliation. Copyright © 2014 SECA. Published by Elsevier España. All rights reserved.
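The reported drop in unjustified discrepancies (63.0% to 11.8%, p < .001) can be checked with a standard chi-square test. The cell counts below are illustrative reconstructions from the reported percentages (99/157 pre-intervention; an assumed 17/144 post-intervention, since the exact post-intervention denominator is not given in the abstract).

```python
from scipy.stats import chi2_contingency

# Illustrative counts back-computed from the reported percentages:
# 99/157 ≈ 63.0% pre-intervention, 17/144 ≈ 11.8% post-intervention.
pre_err, pre_ok = 99, 157 - 99
post_err, post_ok = 17, 144 - 17

chi2, p, dof, _ = chi2_contingency([[pre_err, pre_ok], [post_err, post_ok]])
print(f"chi2 = {chi2:.1f}, p = {p:.1e}")   # p far below .001
```

Any plausible post-intervention denominator near 150 yields the same conclusion, consistent with the p < .001 the authors report.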

  7. Application of Regulatory Focus Theory to Search Advertising.

    PubMed

    Mowle, Elyse N; Georgia, Emily J; Doss, Brian D; Updegraff, John A

    The purpose of this paper is to test the utility of regulatory focus theory principles in a real-world setting; specifically, Internet hosted text advertisements. Effect of compatibility of the ad text with the regulatory focus of the consumer was examined. Advertisements were created using Google AdWords. Data were collected for the number of views and clicks each ad received. Effect of regulatory fit was measured using logistic regression. Logistic regression analyses demonstrated that there was a strong main effect for keyword, such that users were almost six times as likely to click on a promotion advertisement as a prevention advertisement, as well as a main effect for compatibility, such that users were twice as likely to click on an advertisement with content that was consistent with their keyword. Finally, there was a strong interaction of these two variables, such that the effect of consistent advertisements was stronger for promotion searches than for prevention searches. The effect of ad compatibility had medium to large effect sizes, suggesting that individuals' state may have more influence on advertising response than do individuals' traits (e.g. personality traits). Measurement of regulatory fit was limited by the constraints of Google AdWords. The results of this study provide a possible framework for ad creation for Internet advertisers. This paper is the first study to demonstrate the utility of regulatory focus theory in online advertising.

  8. Application of Regulatory Focus Theory to Search Advertising

    PubMed Central

    Mowle, Elyse N.; Georgia, Emily J.; Doss, Brian D.; Updegraff, John A.

    2015-01-01

    Purpose The purpose of this paper is to test the utility of regulatory focus theory principles in a real-world setting; specifically, Internet hosted text advertisements. Effect of compatibility of the ad text with the regulatory focus of the consumer was examined. Design/methodology/approach Advertisements were created using Google AdWords. Data were collected for the number of views and clicks each ad received. Effect of regulatory fit was measured using logistic regression. Findings Logistic regression analyses demonstrated that there was a strong main effect for keyword, such that users were almost six times as likely to click on a promotion advertisement as a prevention advertisement, as well as a main effect for compatibility, such that users were twice as likely to click on an advertisement with content that was consistent with their keyword. Finally, there was a strong interaction of these two variables, such that the effect of consistent advertisements was stronger for promotion searches than for prevention searches. Research limitations/implications The effect of ad compatibility had medium to large effect sizes, suggesting that individuals’ state may have more influence on advertising response than do individuals’ traits (e.g. personality traits). Measurement of regulatory fit was limited by the constraints of Google AdWords. Practical implications The results of this study provide a possible framework for ad creation for Internet advertisers. Originality/value This paper is the first study to demonstrate the utility of regulatory focus theory in online advertising. PMID:26430293

  9. Application of an impedance matching transformer to a plasma focus.

    PubMed

    Bures, B L; James, C; Krishnan, M; Adler, R

    2011-10-01

    A plasma focus was constructed using an impedance matching transformer to improve power transfer between the pulse power and the dynamic plasma load. The system relied on two switches and twelve transformer cores to produce a 100 kA pulse in short circuit on the secondary at 27 kV on the primary with 110 J stored. With the two transformer systems in parallel, the Thevenin equivalent circuit parameters on the secondary side of the driver are: C = 10.9 μF, V(0) = 4.5 kV, L = 17 nH, and R = 5 mΩ. An equivalent direct-drive circuit would require a large number of switches in parallel to achieve the same Thevenin equivalent. The benefits of this approach are the replacement of consumable switches with non-consumable transformer cores, reduction of the driver inductance and resistance as viewed by the dynamic load, and reduction of the stored energy needed to produce a given peak current. The system is designed to operate at 100 Hz, so minimizing the stored energy results in less load on the thermal management system. When operated at 1 Hz, the neutron yield from the transformer-matched plasma focus was similar to the neutron yield from a conventional (directly driven) plasma focus at the same peak current.
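The quoted Thevenin parameters are internally consistent, which a short calculation confirms: the stored energy is ½CV₀² ≈ 110 J, and the lightly damped LC peak current comes out near the reported 100 kA. The damping approximation below is a standard series-RLC estimate, not a formula from the paper.

```python
import math

# Thevenin parameters reported on the secondary side of the driver.
C = 10.9e-6   # F
V0 = 4.5e3    # V
L = 17e-9     # H
R = 5e-3      # ohm

energy = 0.5 * C * V0**2                      # stored energy, 0.5*C*V0^2
i_lossless = V0 * math.sqrt(C / L)            # undamped LC peak current
# Light series-RLC damping attenuates the first current peak by roughly
# exp(-alpha * pi / (2*omega)) with alpha = R/(2L), omega = 1/sqrt(LC).
damping = math.exp(-(R / 2) * math.sqrt(C / L) * math.pi / 2)
i_peak = i_lossless * damping

print(f"stored energy ≈ {energy:.0f} J")          # close to the reported 110 J
print(f"peak current ≈ {i_peak/1e3:.0f} kA")      # close to the reported 100 kA
```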

  10. Cold atomic beam ion source for focused ion beam applications

    NASA Astrophysics Data System (ADS)

    Knuffman, B.; Steele, A. V.; McClelland, J. J.

    2013-07-01

    We report measurements and modeling of an ion source that is based on ionization of a laser-cooled atomic beam. We show a high brightness and a low energy spread, suitable for use in next-generation, high-resolution focused ion beam systems. Our measurements of total ion current as a function of ionization conditions support an analytical model that also predicts the cross-sectional current density and spatial distribution of ions created in the source. The model predicts a peak brightness of 2 × 10⁷ A m⁻² sr⁻¹ eV⁻¹ and an energy spread less than 0.34 eV. The model is also combined with Monte-Carlo simulations of the inter-ion Coulomb forces to show that the source can be operated at several picoamperes with a brightness above 1 × 10⁷ A m⁻² sr⁻¹ eV⁻¹. We estimate that when combined with a conventional ion focusing column, an ion source with these properties could focus a 1 pA beam into a spot smaller than 1 nm. A total current greater than 5 nA was measured in a lower-brightness configuration of the ion source, demonstrating the possibility of a high current mode of operation.
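A rough brightness-limited spot-size estimate shows why such a source could focus 1 pA below 1 nm. The beam energy and convergence half-angle below are assumed values typical of focused ion beam columns, not figures from the paper.

```python
import math

# Reduced brightness from the paper; beam energy and convergence
# half-angle are assumed, typical-FIB-column values.
B_r = 2e7       # A m^-2 sr^-1 eV^-1 (peak reduced brightness)
I = 1e-12       # A (1 pA probe current)
U = 30e3        # eV  -- assumed column beam energy
alpha = 5e-3    # rad -- assumed convergence half-angle

# Brightness-limited probe diameter from
# B_r = I / (pi*(d/2)^2 * pi*alpha^2 * U), solved for d.
d = (2 / (math.pi * alpha)) * math.sqrt(I / (B_r * U))
print(f"brightness-limited spot ≈ {d*1e9:.2f} nm")   # well below 1 nm
```

Real probes are broadened further by aberrations and the 0.34 eV energy spread, but the brightness term alone leaves ample margin below 1 nm.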

  11. Cold atomic beam ion source for focused ion beam applications

    SciTech Connect

    Knuffman, B.; Steele, A. V.; McClelland, J. J.

    2013-07-28

    We report measurements and modeling of an ion source that is based on ionization of a laser-cooled atomic beam. We show a high brightness and a low energy spread, suitable for use in next-generation, high-resolution focused ion beam systems. Our measurements of total ion current as a function of ionization conditions support an analytical model that also predicts the cross-sectional current density and spatial distribution of ions created in the source. The model predicts a peak brightness of 2 × 10⁷ A m⁻² sr⁻¹ eV⁻¹ and an energy spread less than 0.34 eV. The model is also combined with Monte-Carlo simulations of the inter-ion Coulomb forces to show that the source can be operated at several picoamperes with a brightness above 1 × 10⁷ A m⁻² sr⁻¹ eV⁻¹. We estimate that when combined with a conventional ion focusing column, an ion source with these properties could focus a 1 pA beam into a spot smaller than 1 nm. A total current greater than 5 nA was measured in a lower-brightness configuration of the ion source, demonstrating the possibility of a high current mode of operation.

  12. Application of an APP Store Software Model within the DoD

    DTIC Science & Technology

    2012-05-17

    GRADUATE SCHOOL PANEL: "Application of an APP Store Software Model within the DoD," Monterey, California. Distribution is unlimited (17 May 2012). US Army "APP Store for War": To streamline its operations in a rapidly advancing world, the Army is

  13. Evaluation of the Red Blood Cell Advanced Software Application on the CellaVision DM96.

    PubMed

    Criel, M; Godefroid, M; Deckers, B; Devos, H; Cauwelier, B; Emmerechts, J

    2016-08-01

    The CellaVision Advanced Red Blood Cell (RBC) Software Application is a new software application for advanced morphological analysis of RBCs on a digital microscopy system. Upon automated precharacterization into 21 categories, the software offers the possibility of reclassification of RBCs by the operator. We aimed to define the optimal cut-off to detect morphological RBC abnormalities and to evaluate the precharacterization performance of this software. Thirty-eight blood samples of healthy donors and sixty-eight samples of hospitalized patients were analyzed. Different methodologies to define a cut-off between negativity and positivity were used. Sensitivity and specificity were calculated according to these different cut-offs using the manual microscopic method as the gold standard. Imprecision was assessed by measuring analytical within-run and between-run variability and by measuring between-observer variability. By optimizing the cut-off between negativity and positivity, sensitivities exceeded 80% for 'critical' RBC categories (target cells, tear drop cells, spherocytes, sickle cells, and parasites), while specificities exceeded 80% for the other RBC morphological categories. Results of within-run, between-run, and between-observer variabilities were all clinically acceptable. The CellaVision Advanced RBC Software Application is an easy-to-use software that helps to detect most RBC morphological abnormalities in a sensitive and specific way without increasing workload, provided the proper cut-offs are chosen. However, evaluation of the images by an experienced observer remains necessary. © 2016 John Wiley & Sons Ltd.
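The cut-off-selection procedure can be sketched as a sweep over candidate thresholds on the fraction of pre-classified abnormal cells, scoring each threshold against the manual gold standard. The data below are synthetic and the Youden-style selection criterion is one common choice; neither is taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic per-sample percentages of cells the software pre-classifies as
# abnormal (invented data): 40 gold-standard negatives, 20 positives.
neg = rng.normal(0.5, 0.3, 40).clip(min=0)
pos = rng.normal(4.0, 1.0, 20).clip(min=0)

def sens_spec(cutoff):
    sensitivity = np.mean(pos >= cutoff)   # positives correctly flagged
    specificity = np.mean(neg < cutoff)    # negatives correctly passed
    return sensitivity, specificity

# Sweep candidate cutoffs; keep the one maximizing sensitivity + specificity.
best = max(np.linspace(0, 6, 121), key=lambda c: sum(sens_spec(c)))
se, sp = sens_spec(best)
print(f"cutoff = {best:.2f}%  sensitivity = {se:.2f}  specificity = {sp:.2f}")
```

With well-separated populations, both figures clear the 80% targets reported in the abstract; overlapping categories would force a sensitivity/specificity trade-off at any cutoff.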

  14. Engineering of Data Acquiring Mobile Software and Sustainable End-User Applications

    NASA Technical Reports Server (NTRS)

    Smith, Benton T.

    2013-01-01

    The criteria by which data-acquiring software and its supporting infrastructure are designed should take the following two points into account: the reusability and organization of stored online and remote data and content, and an assessment of whether abandoning a platform-optimized design in favor of a multi-platform solution significantly reduces the performance of an end-user application. Furthermore, in-house applications that control or process instrument-acquired data for end-users should be designed with a communication and control interface such that the application's modules can be reused as plug-in modular components in larger software systems. The above is applied using two loosely related projects: a mobile application, and a website containing live and simulated data. For the intelligent devices mobile application AIDM, the end-user interface has a platform- and data-type-optimized design, while the database and back-end applications store this information in an organized manner and restrict access to authorized end-user application(s). Finally, the content for the website was derived from a database so that the content can be included uniformly in all applications accessing it. With these projects ongoing, I have concluded from my research that the methods presented are feasible for both projects, and that a multi-platform design only marginally drops the performance of the mobile application.

  15. Using WWW to Improve Software Development and Maintenance: Application of the Light System to Aleph Programs

    NASA Astrophysics Data System (ADS)

    Aimar, A.; Aimar, M.; Khodabandeh, A.; Palazzi, P.; Rousseau, B.; Ruggier, M.; Cattaneo, M.; Comas Illas, P.

    Programmers who develop, use, maintain, and modify software are faced with the problem of scanning and understanding large amounts of documents, ranging from source code to requirements, analysis and design diagrams, and user and reference manuals. This task is nontrivial and time consuming because of the number and size of documents and the many implicit cross-references that they contain. In large distributed development teams, where software and related documents are produced at various sites, the problem can be even more severe. LIGHT, Life cycle Global HyperText, is an attempt to solve the problem using WWW technology. The basic idea is to make all the software documents, including code, available and cross-connected on the WWW. The first application of this concept to go into production is JULIA/LIGHT, a system to convert and publish on the WWW the software documentation of the JULIA reconstruction program of the ALEPH experiment at CERN, the European Organization for Nuclear Research, Geneva.

  16. Focusing particle concentrator with application to ultrafine particles

    DOEpatents

    Hering, Susanne; Lewis, Gregory; Spielman, Steven R.

    2013-06-11

    Technology is presented for the high efficiency concentration of fine and ultrafine airborne particles into a small fraction of the sampled airflow by condensational enlargement, aerodynamic focusing and flow separation. A nozzle concentrator structure including an acceleration nozzle with a flow extraction structure may be coupled to a containment vessel. The containment vessel may include a water condensation growth tube to facilitate the concentration of ultrafine particles. The containment vessel may further include a separate carrier flow introduced at the center of the sampled flow, upstream of the acceleration nozzle of the nozzle concentrator to facilitate the separation of particle and vapor constituents.

  17. Development of a controlled vocabulary and software application to analyze fruit shape variation in tomato and other plant species.

    PubMed

    Brewer, Marin Talbot; Lang, Lixin; Fujimura, Kikuo; Dujmovic, Nancy; Gray, Simon; van der Knaap, Esther

    2006-05-01

    The domestication and improvement of fruit-bearing crops resulted in a large diversity of fruit form. To facilitate consistent terminology pertaining to shape, a controlled vocabulary focusing specifically on fruit shape traits was developed. Mathematical equations were established for the attributes so that objective, quantitative measurements of fruit shape could be conducted. The controlled vocabulary and equations were integrated into a newly developed software application, Tomato Analyzer, which conducts semiautomatic phenotypic measurements. To demonstrate the utility of Tomato Analyzer in the detection of shape variation, fruit from two F2 populations of tomato (Solanum spp.) were analyzed. Principal components analysis was used to identify the traits that best described shape variation within as well as between the two populations. The three principal components were analyzed as traits, and several significant quantitative trait loci (QTL) were identified in both populations. The usefulness and flexibility of the software was further demonstrated by analyzing the distal fruit end angle of fruit at various user-defined settings. Results of the QTL analyses indicated that significance levels of detected QTL were greatly improved by selecting the setting that maximized phenotypic variation in a given population. Tomato Analyzer was also applied to conduct phenotypic analyses of fruit from several other species, demonstrating that many of the algorithms developed for tomato could be readily applied to other plants. The controlled vocabulary, algorithms, and software application presented herein will provide plant scientists with novel tools to consistently, accurately, and efficiently describe two-dimensional fruit shapes.
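The principal-components step described above can be sketched as follows: standardized shape attributes are decomposed with PCA, and the component scores are then treated as quantitative traits for QTL mapping. The attribute names and the synthetic measurements below are invented for illustration.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
n = 120
# A latent elongation factor drives two of the three invented shape traits.
elongation = rng.normal(1.5, 0.4, n)
traits = np.column_stack([
    elongation + rng.normal(0, 0.05, n),               # fruit shape index
    2.0 - 0.8 * elongation + rng.normal(0, 0.05, n),   # blockiness
    rng.normal(150, 10, n),                            # distal end angle (deg)
])

pca = PCA(n_components=3)
# Standardize columns so each trait contributes equally, then decompose.
scores = pca.fit_transform((traits - traits.mean(0)) / traits.std(0))
print(pca.explained_variance_ratio_)
# PC1 is dominated by the elongation axis; the per-fruit PC scores can then
# be analyzed as quantitative traits in the QTL mapping step.
```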

  18. Internet-based hardware/software co-design framework for embedded 3D graphics applications

    NASA Astrophysics Data System (ADS)

    Yeh, Chi-Tsai; Wang, Chun-Hao; Huang, Ing-Jer; Wong, Weng-Fai

    2011-12-01

    Advances in technology are making it possible to run three-dimensional (3D) graphics applications on embedded and handheld devices. In this article, we propose a hardware/software co-design environment for 3D graphics application development that includes the 3D graphics software, OpenGL ES application programming interface (API), device driver, and 3D graphics hardware simulators. We developed a 3D graphics system-on-a-chip (SoC) accelerator using transaction-level modeling (TLM). This gives software designers early access to the hardware even before it is ready. On the other hand, hardware designers also stand to gain from the more complex test benches made available in the software for verification. A unique aspect of our framework is that it allows hardware and software designers from geographically dispersed areas to cooperate and work on the same framework. Designs can be entered and executed from anywhere in the world without full access to the entire framework, which may include proprietary components. This results in controlled and secure transparency and reproducibility, granting leveled access to users of various roles.

  19. A Software Package for Neural Network Applications Development

    NASA Technical Reports Server (NTRS)

    Baran, Robert H.

    1993-01-01

    Original Backprop (Version 1.2) is an MS-DOS package of four stand-alone C-language programs that enable users to develop neural network solutions to a variety of practical problems. Original Backprop generates three-layer, feed-forward (series-coupled) networks which map fixed-length input vectors into fixed-length output vectors through an intermediate (hidden) layer of binary threshold units. Version 1.2 can handle up to 200 input vectors at a time, each having up to 128 real-valued components. The first subprogram, TSET, appends a number (up to 16) of classification bits to each input, thus creating a training set of input-output pairs. The second subprogram, BACKPROP, creates a trilayer network to do the prescribed mapping and modifies the weights of its connections incrementally until the training set is learned. The learning algorithm is the 'back-propagating error correction procedure' first described by F. Rosenblatt in 1961. The third subprogram, VIEWNET, lets the trained network be examined, tested, and 'pruned' (by the deletion of unnecessary hidden units). The fourth subprogram, DONET, builds a TSR (terminate-and-stay-resident) routine by which the finished product of the neural net design-and-training exercise can be consulted under other MS-DOS applications.
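A minimal modern analogue of the BACKPROP subprogram is sketched below: a three-layer feed-forward network trained by gradient back-propagation on the XOR mapping. Note that this sketch uses differentiable sigmoid hidden units, whereas Original Backprop used binary threshold units with a Rosenblatt-style error-correction rule.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)   # XOR training set

W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)   # input -> hidden
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)   # hidden -> output
sig = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(10000):
    h = sig(X @ W1 + b1)                # forward pass, hidden layer
    y = sig(h @ W2 + b2)                # forward pass, output layer
    dy = (y - Y) * y * (1 - y)          # output-layer error signal
    dh = (dy @ W2.T) * h * (1 - h)      # error back-propagated to hidden
    W2 -= h.T @ dy; b2 -= dy.sum(0)     # gradient step, learning rate 1
    W1 -= X.T @ dh; b1 -= dh.sum(0)

pred = sig(sig(X @ W1 + b1) @ W2 + b2)
print(np.round(pred.ravel(), 2))        # approaches [0, 1, 1, 0]
```

The "pruning" done by VIEWNET corresponds to deleting hidden columns of `W1` (and rows of `W2`) whose removal leaves the mapping intact.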

  20. MAPI: a software framework for distributed biomedical applications

    PubMed Central

    2013-01-01

    Background The amount of web-based resources (databases, tools etc.) in biomedicine has increased, but the integrated usage of those resources is complex due to differences in access protocols and data formats. However, distributed data processing is becoming inevitable in several domains, in particular in biomedicine, where researchers face rapidly increasing data sizes. This big data is difficult to process locally because of the large processing, memory and storage capacity required. Results This manuscript describes a framework, called MAPI, which provides a uniform representation of resources available over the Internet, in particular for Web Services. The framework enhances their interoperability and collaborative use by enabling a uniform and remote access. The framework functionality is organized in modules that can be combined and configured in different ways to fulfil concrete development requirements. Conclusions The framework has been tested in the biomedical application domain where it has been a base for developing several clients that are able to integrate different web resources. The MAPI binaries and documentation are freely available at http://www.bitlab-es.com/mapi under the Creative Commons Attribution-No Derivative Works 2.5 Spain License. The MAPI source code is available by request (GPL v3 license). PMID:23311574

  1. MAPI: a software framework for distributed biomedical applications.

    PubMed

    Karlsson, Johan; Trelles, Oswaldo

    2013-01-11

    The amount of web-based resources (databases, tools etc.) in biomedicine has increased, but the integrated usage of those resources is complex due to differences in access protocols and data formats. However, distributed data processing is becoming inevitable in several domains, in particular in biomedicine, where researchers face rapidly increasing data sizes. This big data is difficult to process locally because of the large processing, memory and storage capacity required. This manuscript describes a framework, called MAPI, which provides a uniform representation of resources available over the Internet, in particular for Web Services. The framework enhances their interoperability and collaborative use by enabling a uniform and remote access. The framework functionality is organized in modules that can be combined and configured in different ways to fulfil concrete development requirements. The framework has been tested in the biomedical application domain where it has been a base for developing several clients that are able to integrate different web resources. The MAPI binaries and documentation are freely available at http://www.bitlab-es.com/mapi under the Creative Commons Attribution-No Derivative Works 2.5 Spain License. The MAPI source code is available by request (GPL v3 license).

  2. A lightweight focusing reflector concept for space power applications

    NASA Astrophysics Data System (ADS)

    Wallace, T.; Bussard, R. W.

    A very lightweight membrane mirror system which can function as a flat or concave mirror and has applications in space power systems is described. The structural properties, including steady-state design and dynamic effects, are addressed along with optical properties. Operational issues are briefly discussed, including orbit stabilization, deformation by solar pressure, and pointing control. The design of the mirror provides a simple means of altering the mirror focal length.

  3. An overview of PET/MR, focused on clinical applications.

    PubMed

    Catalano, Onofrio Antonio; Masch, William Roger; Catana, Ciprian; Mahmood, Umar; Sahani, Dushyant Vasudeo; Gee, Michael Stanley; Menezes, Leon; Soricelli, Andrea; Salvatore, Marco; Gervais, Debra; Rosen, Bruce Robert

    2017-02-01

    Hybrid PET/MR scanners are innovative imaging devices that simultaneously or sequentially acquire and fuse anatomical and functional data from magnetic resonance (MR) with metabolic information from positron emission tomography (PET) (Delso et al. in J Nucl Med 52:1914-1922, 2011; Zaidi et al. in Phys Med Biol 56:3091-3106, 2011). Hybrid PET/MR scanners have the potential to greatly impact not only medical research but also, and more importantly, patient management. Although their clinical applications are still under investigation, the increased worldwide availability of PET/MR scanners and the growing published literature are important determinants of their rising utilization for primarily clinical applications. In this manuscript, we provide a summary of the physical features of PET/MR, including its limitations, that are most relevant to clinical PET/MR implementation and interpretation. Thereafter, we discuss the most important current and emergent clinical applications of this hybrid technology in the abdomen and pelvis, both in oncologic and non-oncologic imaging, and we provide, when possible, a comparison with clinically consolidated imaging techniques such as PET/CT.

  4. Focused Magnetic Resonance Coupling Coils for Electromagnetic Therapy Applications.

    PubMed

    Yeung, Sai Ho; Pradhan, Raunaq; Feng, Xiaohua; Zheng, Yuanjin

    2015-11-01

    This paper presents the design and construction of a pair of figure-of-eight coils, coupled by magnetic resonance coupling (MRC), which can generate an electric field of 150 V/m per ampere of coil current at the focal points for electromagnetic-therapy-related applications. The E field generated at the targeted site is significantly enhanced for the same current flowing through the MRC figure-of-eight coils compared to normal coils, due to the superposition of the E fields contributed by the coils. The MRC figure-of-eight coil design and its results are verified in theory, simulation, and experiment. In the ex vivo tissue measurement, 35% current and 82% ohmic power improvements were observed. Since it can enhance the current and ohmic power, the MRC figure-of-eight coil pair is a promising solution for electromagnetic therapy applications. Potential applications of the coils include noninvasive radio frequency (RF) stimulation, thermoacoustic imaging, electromagnetic field therapies, and RF ablation.
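The reported current and power improvements are mutually consistent, since ohmic power scales as the square of current: a 35% current gain implies about 82% more dissipated power.

```python
# Consistency check: ohmic power is I^2 * R, so a fractional current gain g
# yields a power gain of (1 + g)^2 - 1.
current_gain = 1.35
power_gain = current_gain**2 - 1
print(f"ohmic power improvement ≈ {power_gain:.0%}")   # prints "82%"
```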

  5. Cloud computing geospatial application for water resources based on free and open source software and open standards - a prototype

    NASA Astrophysics Data System (ADS)

    Delipetrev, Blagoj

    2016-04-01

    Presently, most existing software is desktop-based, designed to work on a single computer, which is a major limitation in many ways, starting from limited processing and storage capacity, accessibility, and availability. The only feasible solution lies in the web and the cloud. This abstract presents research and development of a cloud computing geospatial application for water resources based on free and open source software and open standards, using a hybrid public/private cloud deployment model running on two separate virtual machines (VMs). The first (VM1) runs on Amazon Web Services (AWS) and the second (VM2) runs on a Xen cloud platform. The presented cloud application is developed using free and open source software, open standards, and prototype code. It presents a framework for developing a specialized cloud geospatial application that needs only a web browser to be used. This cloud application is the ultimate collaboration geospatial platform because multiple users across the globe with an internet connection and a browser can jointly model geospatial objects, enter attribute data and information, execute algorithms, and visualize results. The presented cloud application is available all the time, accessible from everywhere, scalable, works in a distributed computing environment, creates a real-time multiuser collaboration platform, uses interoperable programming language code and components, and is flexible in including additional components. The cloud geospatial application is implemented as a specialized water resources application with three web services for 1) data infrastructure (DI), 2) support for water resources modelling (WRM), and 3) user management. The web services run on the two VMs, which communicate over the internet to provide services to users. The application was tested on the Zletovica river basin case study with concurrent multiple users. The application is a state

  6. Directory of Industry and University Collaborations with a Focus on Software Engineering Education and Training, Version 6

    DTIC Science & Technology

    1997-11-01

    pointer to potential new client bases. A short bibliography points the reader to background material on software engineering curricula, coalitions...studies and influence on course material) is a condition of the funding. Points of Contact for further information Dr. Jacob Slonim Head of Research...1995-96 academic year, the Computer Science Department experimented with a new approach to teaching its first two courses in the undergraduate

  7. 25 CFR 547.8 - What are the minimum technical software standards applicable to Class II gaming systems?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    Title 25 (Indians), § 547.8: What are the minimum technical software standards applicable to Class II gaming systems? This section provides general software standards for Class II gaming systems for the...

  8. Saturn: a software application of tensor utilities for research in neuroimaging.

    PubMed

    Cárdenes, Rubén; Muñoz-Moreno, Emma; Tristan-Vega, Antonio; Martin-Fernandez, Marcos

    2010-03-01

    We present an advanced software tool designed for the visualization and quantitative analysis of Diffusion Tensor Imaging (DTI) data, called Saturn. The software is specially developed to help clinicians and researchers in neuroimaging, and includes a complete set of visualization capabilities to browse and efficiently analyze DTI data, also making it a powerful tool for diagnostic purposes. The software includes a robust quantification method for DTI data, using an atlas-based approach to automatically obtain equivalent anatomical fiber bundles and regions of interest among different DTI data sets. A set of measurements is also implemented to perform robust group studies between subjects affected by neurological disorders and control groups in order to look for significant differences. Finally, a comparison with five similar DTI applications is presented, showing the advantages offered by this tool. Copyright 2009 Elsevier Ireland Ltd. All rights reserved.
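A representative DTI quantity such a tool computes is fractional anisotropy (FA), a standard rotation-invariant scalar derived from the tensor eigenvalues. The tensor values below are merely illustrative of coherent white matter.

```python
import numpy as np

def fractional_anisotropy(D):
    """FA = sqrt(3/2 * sum((lam - MD)^2) / sum(lam^2)) for a 3x3 tensor."""
    lam = np.linalg.eigvalsh(D)
    md = lam.mean()                        # mean diffusivity
    return float(np.sqrt(1.5 * np.sum((lam - md) ** 2) / np.sum(lam ** 2)))

# Prolate tensor typical of coherent white matter (units of 1e-3 mm^2/s).
D = np.diag([1.7, 0.3, 0.3])
fa = fractional_anisotropy(D)
print(round(fa, 2))   # 0.8
```

FA ranges from 0 (isotropic diffusion) to 1 (diffusion confined to one axis), which is why it is a common per-voxel measurement in the group studies described above.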

  9. Evaluation of the Trajectory Operations Applications Software Task (TOAST). Volume 2: Interview transcripts

    NASA Technical Reports Server (NTRS)

    Perkins, Sharon; Martin, Andrea; Bavinger, Bill

    1990-01-01

The Trajectory Operations Applications Software Task (TOAST) is a software development project whose purpose is to provide pre-mission and real-time trajectory operations support for the Space Shuttle. The purpose of the evaluation was to assess TOAST as an Application Manager: to assess current and planned capabilities, compare those capabilities to commercial off-the-shelf (COTS) software, and analyze the requirements of the MCC and the Flight Analysis Design System (FADS) for TOAST implementation. As a major part of the data gathering for the evaluation, interviews were conducted with NASA and contractor personnel. Real-time and flight design users, orbit navigation users, the TOAST developers, and management were interviewed. Code reviews and demonstrations were also held. Each of these interviews was videotaped and transcribed as appropriate. Transcripts were edited and are presented chronologically.

  10. Performance diagnostics software for gas turbines in pipeline and cogeneration applications. Final report, July 1985-September 1989

    SciTech Connect

    Levine, P.

    1989-12-01

The development experience for the PEGASYS and COGENT software is presented. The PEGASYS software is applicable to two-shaft gas turbines in simple, regenerative and combined cycle systems. The COGENT software is applicable to cogeneration systems. The test results show that the software is able to define the deviations between measured and expected power and thermal efficiency. Further, the software is able to identify the components causing the performance losses. The results show that axial compressor fouling is a major cause of performance losses and that the performance can be recovered by washing. An on-line version of PEGASYS is also described.

  11. Application of the Open Software Foundation (OSF)distributed computing environment to global PACS

    NASA Astrophysics Data System (ADS)

    Martinez, Ralph; Alsafadi, Yasser H.; Kim, Jinman

    1994-05-01

    In this paper, we present our approach to developing Global Picture Archiving and Communication System (GPACS) applications using the Open Software Foundation (OSF) Distributed Computing Environment (DCE) services and toolkits. The OSF DCE services include remote procedure calls, naming service, threads service, time service, file management services, and security service. Several OSF DCE toolkits are currently available from computer and software vendors. Designing distributed Global PACS applications using the OSF DCE approach will feature an open architecture, heterogeneity, and technology independence for GPACS remote consultation and diagnosis applications, including synchronized image annotation, and system privacy and security. The applications can communicate through various transport services and communications networks in a Global PACS environment. The use of OSF DCE services for Global PACS will enable us to develop a robust distributed structure and new user services which feature reliability and scalability for Global PACS environments.

  12. [Evaluation of Web-based software applications for administrating and organising an ophthalmological clinical trial site].

    PubMed

    Kortüm, K; Reznicek, L; Leicht, S; Ulbig, M; Wolf, A

    2013-07-01

The importance and complexity of clinical trials are continuously increasing, especially in innovative specialties like ophthalmology. An efficient organisational structure for the clinical trial site is therefore essential, and in the internet age this can be accomplished with web-based applications. In total, 3 software applications (Vibe on Prem, SharePoint and an open source solution) were evaluated at a clinical trial site in ophthalmology. Assessment criteria were set: reliability, ease of administration, usability, scheduling, task lists, knowledge management, operating costs and worldwide availability. Vibe on Prem, customised by the local university, met the assessment criteria best; the other applications were not as strong. By introducing a web-based application for administrating and organising an ophthalmological trial site, studies can be conducted in a more efficient and reliable manner. Georg Thieme Verlag KG Stuttgart · New York.

  13. Novice and Expert Collaboration in Educational Software Development: Evaluating Application Effectiveness

    ERIC Educational Resources Information Center

    Friedman, Rob; Saponara, Adam

    2008-01-01

    In an attempt to hone the role of learners as designers, this study investigates the effectiveness of an instructional software application resulting from a design process founded on the tenets of participatory design, informant design, and contextual inquiry, as well as a set of established design heuristics. Collaboration occurred among learning…

  14. Application of thermoluminescence for detection of cascade shower 1: Hardware and software of reader system

    NASA Technical Reports Server (NTRS)

    Akashi, M.; Kawaguchi, S.; Watanabe, Z.; Misaki, A.; Niwa, M.; Okamoto, Y.; Fujinaga, T.; Ichimura, M.; Shibata, T.; Dake, S.

    1985-01-01

A reader system for the detection of cascade showers via luminescence induced by heating a sensitive material (BaSO4:Eu) is developed. The reader system is composed of the following six instruments: (1) heater, (2) light guide, (3) image intensifier, (4) CCD camera, (5) image processor, and (6) microcomputer. The performance of these apparatuses and the software application for image analysis are reported.

  15. Expert Panel: A New Strategy for Creating a Student-Centred Learning Environment for Software Applications

    ERIC Educational Resources Information Center

    Wang, Sy-Chyi

    2011-01-01

    Education reforms from teacher-centred to student-centred courses usually come with the adoption of new teaching strategies. However, following the growing design and development of student-centred teaching and learning innovations in many fields of study, not many efforts have been found in the field of software application teaching. Therefore,…

  16. VARK Learning Preferences and Mobile Anatomy Software Application Use in Pre-Clinical Chiropractic Students

    ERIC Educational Resources Information Center

    Meyer, Amanda J.; Stomski, Norman J.; Innes, Stanley I.; Armson, Anthony J.

    2016-01-01

    Ubiquitous smartphone ownership and reduced face-to-face teaching time may lead to students making greater use of mobile technologies in their learning. This is the first study to report on the prevalence of mobile gross anatomy software applications (apps) usage in pre-clinical chiropractic students and to ascertain if a relationship exists…

  17. Spreading CD-ROM Technology beyond the Library: Applications for Remote Communications Software.

    ERIC Educational Resources Information Center

    Bell, Stephen

    1990-01-01

    Discusses the use of remote communications software (RCS) with microcomputers as an inexpensive way to deliver information technologies such as CD-ROM databases to users at remote locations. Typical library applications at the University of Pennsylvania are described, potential disadvantages are presented, and an appendix lists vendors of RCS.…

  18. Technology survey of computer software as applicable to the MIUS project

    NASA Technical Reports Server (NTRS)

    Fulbright, B. E.

    1975-01-01

    Existing computer software, available from either governmental or private sources, applicable to modular integrated utility system program simulation is surveyed. Several programs and subprograms are described to provide a consolidated reference, and a bibliography is included. The report covers the two broad areas of design simulation and system simulation.

  19. A Software Application for Assessing Readability in the Japanese EFL Context

    ERIC Educational Resources Information Center

    Ozasa, Toshiaki; Weir, George R. S.; Fukui, Masayasu

    2010-01-01

    We have been engaged in developing a readability index and its application software attuned for Japanese EFL learners. The index program, Ozasa-Fukui Year Level Program, Ver. 1.0, was used in developing the readability metric Ozasa-Fukui Year Level Index but tended to assume a high level of computer knowledge in its users. As a result, the…
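The Ozasa-Fukui Year Level Index formula itself is not reproduced in this record. Purely as an illustration of how a readability index of this general kind is computed, here is the classic Flesch Reading Ease score in Python (syllable counting is left to the caller; this is not the Ozasa-Fukui formula):

```python
def flesch_reading_ease(total_words: int, total_sentences: int,
                        total_syllables: int) -> float:
    """Classic Flesch Reading Ease score (higher = easier to read).

    Shown only as an example of a readability index; the Ozasa-Fukui
    Year Level Index uses its own formula, not given in the record above.
    """
    words_per_sentence = total_words / total_sentences
    syllables_per_word = total_syllables / total_words
    return 206.835 - 1.015 * words_per_sentence - 84.6 * syllables_per_word
```

For example, a text of 100 words in 5 sentences with 130 syllables scores about 76.6, i.e. fairly easy English.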

  2. Compact plasma focus devices: Flexible laboratory sources for applications

    SciTech Connect

    Lebert, R.; Engel, A.; Bergmann, K.; Treichel, O.; Gavrilescu, C.; Neff, W.

    1997-05-05

    Small pinch plasma devices are intense sources of pulsed XUV-radiation. Because of their low costs and their compact sizes pinch plasmas seem well suited to supplement research activities based on synchrotrons. With correct optimisation, both continuous radiation and narrowband line radiation can be tailored for specific applications. For the special demand of optimising narrowband emission from these plasmas the scaling of K-shell line emission of intermediate atomic number pinch plasmas with respect to device parameters has been studied. Scaling laws, especially taking into account the transient behaviour of the pinch plasma, give design criteria. Investigations of the transition between column and micropinch mode offer predictable access to shorter wavelengths and smaller source sizes. Results on proximity x-ray lithography, imaging and contact x-ray microscopy, x-ray fluorescence (XFA) microscopy and photo-electron spectroscopy (XPS) were achieved.

  3. Independent Mars spacecraft precise orbit determination software development and its applications

    NASA Astrophysics Data System (ADS)

    Yan, Jianguo; Yang, Xuan; Ye, Mao; Li, Fei; Jin, Weitong; Barriot, Jean-Pierre

    2017-07-01

In this paper, we present independent software for Mars spacecraft precise orbit determination and gravity field recovery, the Mars Gravity Recovery and Analysis Software (MAGREAS), which is designed to analyze tracking data from the Chinese Mars exploration mission and similar NASA and ESA Mars-related projects. The design structure, module distribution, and functions of the software are described in this manuscript. A detailed cross-validation against the mature precise orbit determination platform Geodyn-II was performed. Additionally, we used MAGREAS to process MEX orbital tracking data in two-way and three-way tracking modes separately. The measurement residuals and the difference from the reconstructed ephemeris provided by the Royal Observatory of Belgium indicate that our software is reliable. Beyond describing our software and validating it against Geodyn-II, we present a simulation case close to the Chinese Mars exploration mission to illustrate the software's application: a simulation of a four-way tracking mode between an Earth tracking station, a Mars orbiter, and a Mars lander that validates the effectiveness of our MAGREAS-based approach for Mars orbiter determination and lander positioning. Experimental results show that the proposed tracking mode significantly improves positioning accuracy. This work will provide a reference for the design of the Chinese Mars exploration mission as well as for the processing of its orbital tracking data.

  4. Common characteristics of open source software development and applicability for drug discovery: a systematic review

    PubMed Central

    2011-01-01

Background: Innovation through an open source model has proven to be successful for software development. This success has led many to speculate if open source can be applied to other industries with similar success. We attempt to provide an understanding of open source software development characteristics for researchers, business leaders and government officials who may be interested in utilizing open source innovation in other contexts and with an emphasis on drug discovery. Methods: A systematic review was performed by searching relevant, multidisciplinary databases to extract empirical research regarding the common characteristics and barriers of initiating and maintaining an open source software development project. Results: Common characteristics of open source software development pertinent to open source drug discovery were extracted. The characteristics were then grouped into the areas of participant attraction, management of volunteers, control mechanisms, legal framework and physical constraints. Lastly, their applicability to drug discovery was examined. Conclusions: We believe that the open source model is viable for drug discovery, although it is unlikely that it will exactly follow the form used in software development. Hybrids will likely develop that suit the unique characteristics of drug discovery. We suggest potential motivations for organizations to join an open source drug discovery project. We also examine specific differences between software and medicines, specifically how the need for laboratories and physical goods will impact the model as well as the effect of patents. PMID:21955914

  5. Common characteristics of open source software development and applicability for drug discovery: a systematic review.

    PubMed

    Ardal, Christine; Alstadsæter, Annette; Røttingen, John-Arne

    2011-09-28

Innovation through an open source model has proven to be successful for software development. This success has led many to speculate if open source can be applied to other industries with similar success. We attempt to provide an understanding of open source software development characteristics for researchers, business leaders and government officials who may be interested in utilizing open source innovation in other contexts and with an emphasis on drug discovery. A systematic review was performed by searching relevant, multidisciplinary databases to extract empirical research regarding the common characteristics and barriers of initiating and maintaining an open source software development project. Common characteristics of open source software development pertinent to open source drug discovery were extracted. The characteristics were then grouped into the areas of participant attraction, management of volunteers, control mechanisms, legal framework and physical constraints. Lastly, their applicability to drug discovery was examined. We believe that the open source model is viable for drug discovery, although it is unlikely that it will exactly follow the form used in software development. Hybrids will likely develop that suit the unique characteristics of drug discovery. We suggest potential motivations for organizations to join an open source drug discovery project. We also examine specific differences between software and medicines, specifically how the need for laboratories and physical goods will impact the model as well as the effect of patents.

  6. Development of Oceanographic Software Tools and Applications for Navy Operational Use

    DTIC Science & Technology

    1997-09-30

    DEVELOPMENT OF OCEANOGRAPHIC SOFTWARE TOOLS AND APPLICATIONS FOR NAVY OPERATIONAL USE James H. Corbin Center for Air Sea Technology Mississippi State...applications, were significantly reduced. Accordingly, the CAST objective for FY97 was to develop interactive graphical tools for shipboard METOC briefers...This was in response to a COMSIXTHFLT validated METOC requirement to provide visualization briefing tools , animations, and 3–D graphical depictions

  7. Constructing a working taxonomy of functional Ada software components for real-time embedded system applications

    NASA Technical Reports Server (NTRS)

    Wallace, Robert

    1986-01-01

A major impediment to a systematic attack on Ada software reusability is the lack of an effective taxonomy for software component functions. The scope of all possible applications of Ada software is considered too great to allow the practical development of a working taxonomy. Instead, for the purposes herein, the scope of Ada software application is limited to device and subsystem control in real-time embedded systems. A functional approach is taken in constructing the taxonomy tree for the identified Ada domain. The use of modular software functions as a starting point fits well with the object-oriented programming philosophy of Ada. Examples of the types of functions represented within the working taxonomy are real-time kernels, interrupt service routines, synchronization and message passing, data conversion, digital filtering and signal conditioning, and device control. The constructed taxonomy is proposed as a framework from which a needs analysis can be performed to reveal voids in current Ada real-time embedded programming efforts for Space Station.

  9. Genoviz Software Development Kit: Java tool kit for building genomics visualization applications.

    PubMed

    Helt, Gregg A; Nicol, John W; Erwin, Ed; Blossom, Eric; Blanchard, Steven G; Chervitz, Stephen A; Harmon, Cyrus; Loraine, Ann E

    2009-08-25

    Visualization software can expose previously undiscovered patterns in genomic data and advance biological science. The Genoviz Software Development Kit (SDK) is an open source, Java-based framework designed for rapid assembly of visualization software applications for genomics. The Genoviz SDK framework provides a mechanism for incorporating adaptive, dynamic zooming into applications, a desirable feature of genome viewers. Visualization capabilities of the Genoviz SDK include automated layout of features along genetic or genomic axes; support for user interactions with graphical elements (Glyphs) in a map; a variety of Glyph sub-classes that promote experimentation with new ways of representing data in graphical formats; and support for adaptive, semantic zooming, whereby objects change their appearance depending on zoom level and zooming rate adapts to the current scale. Freely available demonstration and production quality applications, including the Integrated Genome Browser, illustrate Genoviz SDK capabilities. Separation between graphics components and genomic data models makes it easy for developers to add visualization capability to pre-existing applications or build new applications using third-party data models. Source code, documentation, sample applications, and tutorials are available at http://genoviz.sourceforge.net/.

  10. Imaging In focus: Reflected light imaging: Techniques and applications.

    PubMed

    Guggenheim, Emily J; Lynch, Iseult; Rappoport, Joshua Z

    2017-02-01

    Reflectance imaging is a broad term that describes the formation of images by the detection of illumination light that is back-scattered from reflective features within a sample. Reflectance imaging can be performed in a variety of different configurations, such as confocal, oblique angle illumination, structured illumination, interferometry and total internal reflectance, permitting a plethora of biomedical applications. Reflectance imaging has proven indispensable for critical investigations into the safety and understanding of biomedically and environmentally relevant nano-materials, an area of high priority and investment. The non-destructive in vivo imaging ability of reflectance techniques permits alternative diagnostic strategies that may eventually facilitate the eradication of some invasive biopsy procedures. Reflectance can also provide additional structural information and clarity necessary in fluorescent based in vivo studies. Near-coverslip interrogation techniques, such as reflectance interferometry and total internal reflection, have provided a label free means to investigate cell-surface contacts, cell motility and vesicle trafficking in vivo and in vitro. Other key advances include the ability to acquire superresolution reflectance images providing increased spatial resolution.

  11. Software Graphics Processing Unit (sGPU) for Deep Space Applications

    NASA Technical Reports Server (NTRS)

    McCabe, Mary; Salazar, George; Steele, Glen

    2015-01-01

A graphics processing capability will be required for deep space missions and must support a range of applications, from safety-critical vehicle health status to telemedicine for crew health. However, preliminary radiation testing of commercial graphics processing cards suggests they cannot operate in the deep space radiation environment. Investigation into a Software Graphics Processing Unit (sGPU) composed of commercial-equivalent radiation-hardened/tolerant single board computers, field programmable gate arrays, and safety-critical display software shows promising results. Preliminary performance of approximately 30 frames per second (FPS) has been achieved. Use of multi-core processors may provide a significant increase in performance.

  12. [Application of Stata software to test heterogeneity in meta-analysis method].

    PubMed

    Wang, Dan; Mou, Zhen-yun; Zhai, Jun-xia; Zong, Hong-xia; Zhao, Xiao-dong

    2008-07-01

To introduce the application of Stata software to heterogeneity testing in meta-analysis. A data set was set up according to the example in the study, and the corresponding commands in Stata 9 were applied to test the example. The methods used were the Q-test and the I2 statistic attached to the fixed-effect model forest plot, the H statistic, and the Galbraith plot. The existence of heterogeneity among studies could be detected by the Q-test and the H statistic, and the degree of heterogeneity could be quantified by the I2 statistic. The outliers that were the sources of the heterogeneity could be spotted in the Galbraith plot. Heterogeneity testing in meta-analysis can be completed by the four methods in Stata simply and quickly. The H and I2 statistics are more robust, and among the four methods the outliers responsible for heterogeneity can be seen most clearly in the Galbraith plot.
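The statistics this abstract names are standard and easy to reproduce outside Stata. A minimal Python sketch of Cochran's Q, Higgins' I2, and the H statistic under an inverse-variance fixed-effect model (the study effects and variances below are made-up toy numbers, not data from the study):

```python
import math

def heterogeneity(effects, variances):
    """Cochran's Q, Higgins' I2 (%), and H for k studies.

    effects: per-study effect estimates; variances: their variances.
    """
    w = [1.0 / v for v in variances]                       # inverse-variance weights
    pooled = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - pooled) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0    # clipped at 0
    h = math.sqrt(q / df) if df > 0 else float("nan")
    return pooled, q, i2, h
```

For three toy studies with effects [0.5, 0.6, 0.4] and variances [0.04, 0.05, 0.04], Q falls below its degrees of freedom, so I2 is clipped to 0, i.e. no heterogeneity beyond chance.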

  13. Applications of Formal Methods to Specification and Safety of Avionics Software

    NASA Technical Reports Server (NTRS)

    Hoover, D. N.; Guaspari, David; Humenn, Polar

    1996-01-01

This report treats several topics in applications of formal methods to avionics software development. Most of these topics concern decision tables, an orderly, easy-to-understand format for formally specifying complex choices among alternative courses of action. The topics relating to decision tables include: generalizations of decision tables that are more concise and support the use of decision tables in a refinement-based formal software development process; a formalism for systems of decision tables with behaviors; an exposition of Parnas tables for users of decision tables; and test coverage criteria and decision tables. We outline features of a revised version of ORA's decision table tool, Tablewise, which will support many of the new ideas described in this report. We also survey formal safety analysis of specifications and software.
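A decision table, as described here, maps each combination of conditions to an action, so in code it can be represented directly as a lookup keyed on condition tuples. A minimal sketch with hypothetical avionics conditions (the condition and action names are illustrative, not taken from the report):

```python
from itertools import product

# Hypothetical two-condition decision table: alert level from
# (altitude_low, descent_fast). Names are illustrative only.
DECISION_TABLE = {
    (False, False): "no_alert",
    (False, True):  "caution",
    (True,  False): "caution",
    (True,  True):  "pull_up",
}

def decide(altitude_low: bool, descent_fast: bool) -> str:
    """Look up the action for one combination of conditions."""
    return DECISION_TABLE[(altitude_low, descent_fast)]

def is_complete(table, n_conditions: int) -> bool:
    """Coverage check: every condition combination has a row,
    the kind of property the report's coverage criteria address."""
    return all(combo in table
               for combo in product([False, True], repeat=n_conditions))
```

The explicit completeness check mirrors one appeal of decision tables for safety analysis: missing rows are mechanically detectable.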

  14. An application of machine learning to the organization of institutional software repositories

    NASA Technical Reports Server (NTRS)

    Bailin, Sidney; Henderson, Scott; Truszkowski, Walt

    1993-01-01

    Software reuse has become a major goal in the development of space systems, as a recent NASA-wide workshop on the subject made clear. The Data Systems Technology Division of Goddard Space Flight Center has been working on tools and techniques for promoting reuse, in particular in the development of satellite ground support software. One of these tools is the Experiment in Libraries via Incremental Schemata and Cobweb (ElvisC). ElvisC applies machine learning to the problem of organizing a reusable software component library for efficient and reliable retrieval. In this paper we describe the background factors that have motivated this work, present the design of the system, and evaluate the results of its application.

  15. Development of a comprehensive software application for calculations in nuclear medicine and radiopharmacy.

    PubMed

    Perales, Jesús Luis Gómez; Mendoza, Antonio García

    2010-09-01

    In the daily practice of in-hospital or centralized radiopharmacies, there is a need to perform reliable numeric calculations. Furthermore, several nuclear medicine diagnostic tests also involve carrying out calculations. In both cases, these calculations are sometimes complex or tedious and prone to error. We report the development of a computer software program that performs a comprehensive range of calculations required in radiopharmacy and nuclear medicine diagnostic tests. This software was developed and compiled in the Visual Basic programming language using algorithms and methods reflected in the scientific literature. We developed 2 versions of the software program, which we call Nucleolab. It automatically performs calculations relating to radiopharmacy practice as well as 9 diagnostic nuclear medicine tests. The 0.1 version performs all these calculations, and the 1.2 version also has a database that enables the user to save and recover diagnostic test results and issue custom reports. The software can be downloaded at www.radiofarmacia.org/nucleolab-english. To our knowledge, ours is the first attempt to develop a comprehensive software application that facilitates calculations in nuclear medicine and radiopharmacy, reducing errors and improving efficiency and accuracy.
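Nucleolab's own routines are not published in this record. As one illustration of the kind of calculation such radiopharmacy software automates, decay correction of a measured activity follows directly from the nuclide's half-life:

```python
import math

def decay_correct(activity_mbq: float, elapsed_min: float,
                  half_life_min: float) -> float:
    """Activity remaining after elapsed_min: A(t) = A0 * exp(-ln2 * t / T_half).

    Illustrative only; Nucleolab's actual routines are not shown in the record.
    """
    decay_constant = math.log(2.0) / half_life_min
    return activity_mbq * math.exp(-decay_constant * elapsed_min)
```

For Tc-99m (half-life about 6 hours), 100 MBq decays to 50 MBq after one half-life, exactly the routine arithmetic that is tedious and error-prone by hand.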

  16. An Exploration in Implementing Fault Tolerance in Scientific Simulation Application Software

    SciTech Connect

    DRAKE, RICHARD R.; SUMMERS, RANDALL M.

    2003-05-01

The ability of scientific simulation software to detect and recover from errors and failures of supporting hardware and software layers is becoming more important due to the pressure to shift from large, specialized multi-million dollar ASCI computing platforms to smaller, less expensive interconnected machines consisting of off-the-shelf hardware. As evidenced by the CPlant(TM) experiences, fault tolerance can be necessary even on such a homogeneous system and may also prove useful in the next generation of ASCI platforms. This report describes a research effort intended to study, implement, and test the feasibility of various fault tolerance mechanisms controlled at the simulation code level. Errors and failures would be detected by underlying software layers, communicated to the application through a convenient interface, and then handled by the simulation code itself. Targeted faults included corrupt communication messages, processor node dropouts, and unacceptable slowdown of service from processing nodes. Recovery techniques such as re-sending communication messages and dynamic reallocation of failing processor nodes were considered. However, most fault tolerance mechanisms rely on underlying software layers which were discovered to be lacking to such a degree that mechanisms at the application level could not be implemented. This research effort has been postponed and shifted to these supporting layers.
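One of the recovery techniques considered, re-sending communication messages, can be sketched generically. The retry-with-backoff wrapper below is my own illustration of that idea, not code from the report:

```python
import time

def send_with_retry(send, message, retries=3, backoff_s=0.01):
    """Re-send a message on failure (a recovery technique named above).

    `send` is any callable that raises on a failed or corrupt transfer;
    the wrapper retries with exponential backoff, then re-raises.
    """
    for attempt in range(retries + 1):
        try:
            return send(message)
        except Exception:
            if attempt == retries:
                raise
            time.sleep(backoff_s * (2 ** attempt))  # exponential backoff
```

A sender that fails transiently twice and then succeeds would be recovered on the third attempt without the simulation code above it ever seeing the fault.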

  17. Next generation of decision making software for nanopatterns characterization: application to semiconductor industry

    NASA Astrophysics Data System (ADS)

    Dervilllé, A.; Labrosse, A.; Zimmermann, Y.; Foucher, J.; Gronheid, R.; Boeckx, C.; Singh, A.; Leray, P.; Halder, S.

    2016-03-01

The dimensional scaling in IC manufacturing strongly drives the demands on CD and defect metrology techniques and their measurement uncertainties. Defect review has become as important as CD metrology, and together they create a new metrology paradigm: a completely new need for flexible, robust and scalable metrology software. Current software architectures and metrology algorithms are performant, but they must be pushed to a higher level in order to follow roadmap speed and requirements: for example, managing defects and CD in a one-step algorithm, customizing algorithms and output features for each R&D team's environment, and providing software updates every day or every week so that R&D teams can easily explore various development strategies. The final goal is to avoid spending hours and days manually tuning algorithms to analyze metrology data and to allow R&D teams to stay focused on their expertise. The benefits are drastic cost reductions, more efficient R&D teams and better process quality. In this paper, we propose a new generation of software platform and development infrastructure which can integrate specific metrology business modules. For example, we show the integration of a chemistry module dedicated to electronics materials such as Directed Self-Assembly features. We show a new generation of image analysis algorithms able to manage defect rates, image classification, CD and roughness measurements at the same time, with high-throughput performance compatible with HVM. In a second part, we assess the reliability, the customization of algorithms and the software platform's capability to meet new semiconductor metrology software requirements: flexibility, robustness, high throughput and scalability. Finally, we demonstrate how such an environment has allowed a drastic reduction of data analysis cycle time.

  18. Detection of patient setup errors with a portal image - DRR registration software application.

    PubMed

    Sutherland, Kenneth; Ishikawa, Masayori; Bengua, Gerard; Ito, Yoichi M; Miyamoto, Yoshiko; Shirato, Hiroki

    2011-02-18

    The purpose of this study was to evaluate a custom portal image - digitally reconstructed radiograph (DRR) registration software application. The software works by transforming the portal image into the coordinate space of the DRR image using three control points placed on each image by the user, and displaying the fused image. In order to test statistically that the software actually improves setup error estimation, an intra- and interobserver phantom study was performed. Portal images of anthropomorphic thoracic and pelvis phantoms with virtually placed irradiation fields at known setup errors were prepared. A group of five doctors was first asked to estimate the setup errors by examining the portal and DRR images side by side, without using the software. A second group of four technicians then estimated the same set of images using the registration software. These two groups of human subjects were then compared with an auto-registration feature of the software, which is based on the mutual information between the portal and DRR images. For the thoracic case, the average distance between the actual setup error and the estimated error was 4.3 ± 3.0 mm for doctors using the side-by-side method, 2.1 ± 2.4 mm for technicians using the registration method, and 0.8 ± 0.4 mm for the automatic algorithm. For the pelvis case, the average distance between the actual setup error and the estimated error was 2.0 ± 0.5 mm for the doctors using the side-by-side method, 2.5 ± 0.4 mm for technicians using the registration method, and 2.0 ± 1.0 mm for the automatic algorithm. The ability of humans to estimate offset values improved statistically using our software for the chest phantom that we tested. Setup error estimation was further improved using our automatic error estimation algorithm. Estimations were not statistically different for the pelvis case. Consistency improved using the software for both the chest and pelvis phantoms. We also tested the automatic algorithm with a
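    The three-control-point alignment described above amounts to fitting a 2D affine transform between the portal image and the DRR. The sketch below is a minimal pure-Python illustration with hypothetical point coordinates; the abstract does not state the software's exact transform model, so the affine assumption is ours.

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def affine_from_points(src, dst):
    """Fit x' = a*x + b*y + c, y' = d*x + e*y + f from 3 point pairs."""
    A, b = [], []
    for (x, y), (xp, yp) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0]); b.append(xp)
        A.append([0, 0, 0, x, y, 1]); b.append(yp)
    return solve(A, b)

# Hypothetical control points: portal image -> DRR coordinates,
# here a pure translation of (+2, +4)
src = [(10, 10), (50, 12), (30, 40)]
dst = [(12, 14), (52, 16), (32, 44)]
coeffs = affine_from_points(src, dst)  # [a, b, c, d, e, f]
```

    Three non-collinear point pairs determine the six affine coefficients exactly; with the translated points above, the fit recovers a = e = 1, b = d = 0, c = 2, f = 4.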

  19. Time-Reversal Acoustic Focusing with Liquid Resonator for Medical Applications

    NASA Astrophysics Data System (ADS)

    Sinelnikov, Yegor D.; Sutin, Alexandre Y.; Sarvazyan, Armen P.

    2007-05-01

    A Time Reversal Acoustic (TRA) focusing system based on the use of liquid-filled resonators with a single transducer or a few transducers is demonstrated to effectively converge acoustic energy in space and time. Because the wavelength in liquid is typically smaller than in solids, liquid-based TRA focusing resonators can have smaller dimensions than solid resonators. The efficiency of liquid-based TRA focusing resonators in transmitting acoustic power to soft tissues is improved by impedance matching of the acoustic transducer assembly to the surrounding liquid. Experiments were conducted to understand the properties of TRA focusing with liquid-filled resonators and the possible use of TRA systems in biomedical applications. The factors defining the efficiency of liquid-based TRA focusing resonators were explored. In media with high attenuation, the binary mode of ultrasound delivery yielded noticeably narrower focusing of ultrasound than conventional analog focusing.
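    The principle behind TRA focusing — record the multipath impulse response of the resonator, then re-emit it time-reversed so the energy converges back at the source point — can be illustrated in one dimension. This is a sketch of the principle with a synthetic random response, not a model of the authors' hardware.

```python
import random

def convolve(x, h):
    """Full discrete convolution of two sequences."""
    y = [0.0] * (len(x) + len(h) - 1)
    for i, xi in enumerate(x):
        for j, hj in enumerate(h):
            y[i + j] += xi * hj
    return y

random.seed(1)
# Hypothetical decaying multipath impulse response of a reverberant resonator
h = [random.uniform(-1, 1) * 0.9 ** k for k in range(64)]

# Transmit the time-reversed response; the medium applies h again,
# so the received signal is the autocorrelation of h
received = convolve(list(reversed(h)), h)
peak = max(range(len(received)), key=lambda i: abs(received[i]))
```

    By the Cauchy-Schwarz inequality the autocorrelation peaks at zero lag (index `len(h) - 1` here), which is the time-compressed focus that makes TRA effective even in strongly reverberant cavities.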

  20. An application of software design and documentation language. [Galileo spacecraft command and data subsystem

    NASA Technical Reports Server (NTRS)

    Callender, E. D.; Clarkson, T. B.; Frasier, C. E.

    1980-01-01

    The software design and documentation language (SDDL) is a general-purpose processor supporting a language for the description of any system, structure, concept, or procedure that may be presented from the viewpoint of a collection of hierarchical entities linked together by means of binary connections. The language comprises a set of rules of syntax, primitive construct classes (module, block, and module invocation), and language control directives. The result is a language with a fixed grammar, variable alphabet and punctuation, and an extendable vocabulary. The application of SDDL to the detailed software design of the Command and Data Subsystem for the Galileo spacecraft is discussed. A set of constructs was developed and applied. These constructs are evaluated and examples of their application are considered.

  1. Software Process Improvement Initiatives Based on Quality Assurance Strategies: A QATAM Pilot Application

    NASA Astrophysics Data System (ADS)

    Winkler, Dietmar; Elberzhager, Frank; Biffl, Stefan; Eschbach, Robert

    Quality Assurance (QA) strategies, i.e., bundles of verification and validation approaches embedded within a balanced software process, can support project and quality managers in systematically planning and implementing improvement initiatives. New and modified processes and methods frequently emerge that seem to be promising candidates for improvement. Nevertheless, the impact of processes and methods strongly depends on individual project contexts. A major challenge is how to systematically select and implement "best practices" for product construction, verification, and validation. In this paper we present the Quality Assurance Tradeoff Analysis Method (QATAM) that supports engineers in (a) systematically identifying candidate QA strategies and (b) evaluating QA strategy variants in a given project context. We evaluate feasibility and usefulness in a pilot application in a medium-size software engineering organization. The main results were that QATAM was considered useful for identifying and evaluating various improvement initiatives, applicable to large organizations as well as small and medium enterprises.
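    The abstract does not specify QATAM's concrete evaluation procedure; as a generic illustration of trading off QA strategy variants in a given project context, a weighted-sum scoring sketch (criteria, weights, and scores all hypothetical) might look like:

```python
# Hypothetical project-context criteria weights (sum to 1.0)
weights = {"defect_detection": 0.4, "effort": 0.3, "cycle_time": 0.3}

# Hypothetical QA strategy variants scored 1-5 against each criterion
variants = {
    "inspection+unit-test": {"defect_detection": 4, "effort": 3, "cycle_time": 3},
    "test-first": {"defect_detection": 3, "effort": 4, "cycle_time": 4},
}

def score(variant):
    """Weighted-sum score of one QA strategy variant."""
    return sum(weights[c] * s for c, s in variant.items())

best = max(variants, key=lambda name: score(variants[name]))
```

    With these numbers, "test-first" scores 3.6 against 3.4, so it would be ranked first; changing the weights to reflect a different project context can reverse the ranking, which is the point of a tradeoff analysis.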

  2. [Quality assurance of a virtual simulation software: application to IMAgo and SIMAgo (ISOgray)].

    PubMed

    Isambert, A; Beaudré, A; Ferreira, I; Lefkopoulos, D

    2007-06-01

    The virtual simulation process is often used to prepare three-dimensional conformal radiation therapy treatments. As the quality of the treatment depends greatly on this step, it is mandatory to perform extensive controls on this software before clinical use. The tests presented in this work have been carried out on the treatment planning system ISOgray (DOSIsoft), including the delineation module IMAgo and the virtual simulation module SIMAgo. Based on our experience, the most relevant controls from international protocols have been selected. These tests mainly focused on measuring and delineation tools and virtual simulation functionalities, and have been performed with three phantoms: the Quasar Multi-Purpose Body Phantom, the Quasar MLC Beam Geometry Phantom (Modus Medical Devices Inc.) and a phantom developed at Hospital Tenon. No major issues were identified while performing the tests. These controls have emphasized the necessity for the user to consider the results displayed by virtual simulation software with a critical eye. The visualisation contrast, the slice thickness, and the calculation and display mode of 3D structures used by the software are all sources of uncertainty. A virtual simulation software quality assurance procedure has been written and applied on a set of CT images. Similar tests have to be performed periodically, and at a minimum at each major version change.

  3. Towards the Goal of Modular Climate Data Services: An Overview of NCPP Applications and Software

    NASA Astrophysics Data System (ADS)

    Koziol, B. W.; Cinquini, L.; Treshansky, A.; Murphy, S.; DeLuca, C.

    2013-12-01

    In August 2013, the National Climate Predictions and Projections Platform (NCPP) organized a workshop focusing on the quantitative evaluation of downscaled climate data products (QED-2013). The QED-2013 workshop focused on real-world application problems drawn from several sectors (e.g. hydrology, ecology, environmental health, agriculture), and required that downscaled data products be dynamically accessed, generated, manipulated, annotated, and evaluated. The cyberinfrastructure elements that were integrated to support the workshop included (1) a wiki-based project hosting environment (Earth System CoG) with an interface to data services provided by an Earth System Grid Federation (ESGF) data node; (2) metadata tools provided by the Earth System Documentation (ES-DOC) collaboration; and (3) a Python-based library, OpenClimateGIS (OCGIS), for subsetting and converting NetCDF-based climate data to GIS and tabular formats. Collectively, this toolset represents a first deployment of a 'ClimateTranslator' that enables users to access, interpret, and apply climate information at local and regional scales. This presentation will provide an overview of the components above, how they were used in the workshop, and a discussion of current and potential integration. The long-term strategy for this software stack is to offer the suite of services described on a customizable, per-project basis. Additional detail on the three components is below. (1) Earth System CoG is a web-based collaboration environment that integrates data discovery and access services with tools for supporting governance and the organization of information. QED-2013 utilized these capabilities to share with workshop participants a suite of downscaled datasets, associated images derived from those datasets, and metadata files describing the downscaling techniques involved. The collaboration side of CoG was used for workshop organization, discussion, and results. (2) The ES-DOC Questionnaire

  4. Software Bridge

    NASA Technical Reports Server (NTRS)

    1995-01-01

    I-Bridge is a commercial version of software developed by I-Kinetics under a NASA Small Business Innovation Research (SBIR) contract. The software allows users of Windows applications to gain quick, easy access to databases, programs and files on UNIX servers. Information goes directly onto spreadsheets and other applications; users need not manually locate, transfer and convert data.

  5. [CD15 focus score for diagnostics of periprosthetic joint infections : Neutrophilic granulocytes quantification mode and the development of morphometric software (CD15 quantifier)].

    PubMed

    Kölbel, B; Wienert, S; Dimitriadis, J; Kendoff, D; Gehrke, T; Huber, M; Frommelt, L; Tiemann, A; Saeger, K; Krenn, V

    2015-09-01

    The aim of this project was to devise a quantification method for neutrophils within a single focal point through the development of a CD15 focus score which enables bacterial infections in synovial-like interface membranes (SLIM) to be diagnosed. In this study a histopathological classification of 91 SLIM removed during revision surgery from the hips (n = 59) and knees (n = 32) was performed. Neutrophils were identified immunohistochemically by means of a CD15-specific monoclonal antibody. The quantitative evaluation of CD15-positive neutrophils (CD15Ne) used the principle of maximum focal infiltration (focus) together with an assessment of a single focal point (0.3 mm(2)). This immunohistochemical approach made it possible to develop the CD15 quantifier software, which automatically quantifies CD15Ne. The SLIM cases with positive microbiological findings (n = 47) had significantly (p < 0.001, Mann-Whitney U-test) more CD15Ne/focal point than cases with negative microbiological findings (n = 44). A count of 50 CD15Ne/focal point was identified as the optimum threshold when diagnosing periprosthetic joint infections (PJI) using the CD15 focus score. If the microbiological findings are used as a gold standard, the diagnostic sensitivity is 0.83, and the specificity is 0.864 with a positive predictive value (PPV) of 0.87, a negative predictive value (NPV) of 0.83, an accuracy of 0.846 and an area under the curve (AUC) of 0.878. The evaluation of findings for the preparations using the CD15 quantifier software (n = 31) deviated by an average of 12 cells from the histopathological evaluation findings (CD15 focus score). Above a cell count of 62, the CD15-quantifier needs on average 32 s less than the pathologist. The immunohistochemical CD15 focus score has a high diagnostic value and allowed the development of the CD15 quantifier software. This provides an automated procedure, which shortens the mentally tiring and time-consuming process of
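    The diagnostic figures quoted above follow from a standard confusion-matrix calculation against the microbiological gold standard. The sketch below uses hypothetical cell counts chosen to be consistent with the reported 47 positive and 44 negative SLIM cases (the paper's exact counts are not given in the abstract):

```python
def diagnostics(tp, fn, fp, tn):
    """Standard diagnostic metrics from a 2x2 confusion matrix."""
    total = tp + fn + fp + tn
    return {
        "sensitivity": tp / (tp + fn),   # true positive rate
        "specificity": tn / (tn + fp),   # true negative rate
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
        "accuracy": (tp + tn) / total,
    }

# Hypothetical counts: 47 microbiology-positive cases (39 above the
# 50-cell CD15 threshold), 44 negative cases (38 below the threshold)
m = diagnostics(tp=39, fn=8, fp=6, tn=38)
```

    With these counts the metrics reproduce the reported values: sensitivity 39/47 ≈ 0.83, specificity 38/44 ≈ 0.864, PPV ≈ 0.87, NPV ≈ 0.83, accuracy 77/91 ≈ 0.846.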

  6. Lessons Learned from Application of System and Software Level RAMS Analysis to a Space Control System

    NASA Astrophysics Data System (ADS)

    Silva, N.; Esper, A.

    2012-01-01

    The work presented in this article represents the results of applying RAMS analysis to a critical space control system, both at system and software levels. The system-level RAMS analysis allowed the assignment of criticalities to the high-level components, which was further refined by a tailored software-level RAMS analysis. The importance of the software-level RAMS analysis in the identification of new failure modes and its impact on the system-level RAMS analysis is discussed. Recommendations of changes in the software architecture have also been proposed in order to reduce the criticality of the SW components to an acceptable minimum. The dependability analysis was performed in accordance with ECSS-Q-ST-80, which had to be tailored and complemented in some aspects. This tailoring will also be detailed in the article, and lessons learned from its application will be shared, underlining their importance for space system safety evaluations. The paper presents the applied techniques, the relevant results obtained, the effort required for performing the tasks and the planned strategy for ROI estimation, as well as the soft skills required and acquired during these activities.

  7. GWASS: GRASS web application software system based on the GeoBrain web service

    NASA Astrophysics Data System (ADS)

    Qiu, Fang; Ni, Feng; Chastain, Bryan; Huang, Haiting; Zhao, Peisheng; Han, Weiguo; Di, Liping

    2012-10-01

    GRASS is a well-known geographic information system developed more than 30 years ago. As one of the earliest GIS systems, GRASS survives today mainly as free, open-source desktop GIS software, with users primarily limited to the research community or to programmers who use it to create customized functions. To allow average GIS end users to continue taking advantage of this widely-used software, we developed the GRASS Web Application Software System (GWASS), a distributed, web-based, multi-tiered Geospatial Information System (GIS) built on top of the GeoBrain web service, a project sponsored by NASA using the latest service-oriented architecture (SOA). This SOA-enabled system offers an effective and practical alternative to current commercial desktop GIS solutions. With GWASS, all geospatial processing and analyses are conducted by the server, so users are not required to install any software at the client side, which reduces the cost of access for users. The only resource needed to use GWASS is access to the Internet, and anyone who knows how to use a web browser can operate the system. The SOA framework is revitalizing GRASS as a new means to bring powerful geospatial analysis and resources to more users with concurrent access.

  8. Development of a web application for water resources based on open source software

    NASA Astrophysics Data System (ADS)

    Delipetrev, Blagoj; Jonoski, Andreja; Solomatine, Dimitri P.

    2014-01-01

    This article presents research and development of a prototype web application for water resources using the latest advancements in Information and Communication Technologies (ICT), open source software and web GIS. The web application has three web services for: (1) managing, presenting and storing geospatial data, (2) supporting water resources modeling and (3) water resources optimization. The web application is developed using several programming languages (PHP, Ajax, JavaScript, Java), libraries (OpenLayers, JQuery) and open source software components (GeoServer, PostgreSQL, PostGIS). The presented web application has several main advantages: it is available all the time, it is accessible from everywhere, it creates a real-time multi-user collaboration platform, the programming language code and components are interoperable and designed to work in a distributed computer environment, it is flexible for adding additional components and services, and it is scalable depending on the workload. The application was successfully tested on a case study with concurrent multi-user access.

  9. A software tool of digital tomosynthesis application for patient positioning in radiotherapy.

    PubMed

    Yan, Hui; Dai, Jian-Rong

    2016-03-08

    Digital tomosynthesis (DTS) is an imaging modality that reconstructs tomographic images from two-dimensional kV projections covering a narrow scan angle. Compared with conventional cone-beam CT (CBCT), it requires less time and radiation dose in data acquisition, and it is feasible to apply this technique to patient positioning in radiotherapy. To facilitate its clinical application, a software tool was developed and the reconstruction processes were accelerated by a graphics processing unit (GPU). Two reconstruction and two registration processes are required for DTS application, unlike conventional CBCT application, which requires one image reconstruction process and one image registration process. The reconstruction stage consists of the production of two types of DTS. One type of DTS is reconstructed from cone-beam (CB) projections covering a narrow scan angle and is named onboard DTS (ODTS), which represents the real patient position in the treatment room. The other type of DTS is reconstructed from digitally reconstructed radiographs (DRRs) and is named reference DTS (RDTS), which represents the ideal patient position in the treatment room. Prior to the reconstruction of RDTS, the DRRs are reconstructed from the planning CT using the same acquisition settings as the CB projections. The registration stage consists of two matching processes between ODTS and RDTS. The target shifts in the lateral and longitudinal axes are obtained from the matching between ODTS and RDTS in the coronal view, while the target shifts in the longitudinal and vertical axes are obtained from the matching between ODTS and RDTS in the sagittal view. In this software, both DRR and DTS reconstruction algorithms were implemented in GPU environments for acceleration purposes. A comprehensive evaluation of this software tool was performed, including geometric accuracy, image quality, registration accuracy, and reconstruction efficiency. The average correlation coefficient between DRR/DTS generated by GPU-based algorithm
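    Each of the two matching steps above reduces to estimating a 2D translation between an ODTS and an RDTS slice. A minimal brute-force sketch over toy 8x8 arrays is shown below; the software's actual registration algorithm is not specified in the abstract, so the sum-of-squared-differences search here is only illustrative.

```python
def best_shift(ref, mov, max_shift=3):
    """Integer (dy, dx) minimizing mean squared difference between
    ref[y][x] and mov[y+dy][x+dx] over the overlap region."""
    h, w = len(ref), len(ref[0])
    best = None
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            ssd, n = 0.0, 0
            for y in range(h):
                for x in range(w):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        ssd += (ref[y][x] - mov[yy][xx]) ** 2
                        n += 1
            ssd /= n  # normalize so small overlaps are not favored
            if best is None or ssd < best[0]:
                best = (ssd, dy, dx)
    return best[1], best[2]

# Hypothetical 8x8 "RDTS" with a bright blob, and an "ODTS" with the
# same blob shifted by (1, 2)
ref = [[0.0] * 8 for _ in range(8)]
ref[3][3] = ref[3][4] = ref[4][3] = ref[4][4] = 1.0
mov = [[0.0] * 8 for _ in range(8)]
mov[4][5] = mov[4][6] = mov[5][5] = mov[5][6] = 1.0
dy, dx = best_shift(ref, mov)
```

    Running the coronal-view search yields the lateral/longitudinal shift pair; repeating it on the sagittal view yields the longitudinal/vertical pair, together giving a 3D couch correction.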

  10. Modelface: an Application Programming Interface (API) for Homology Modeling Studies Using Modeller Software

    PubMed Central

    Sakhteman, Amirhossein; Zare, Bijan

    2016-01-01

    An interactive application, Modelface, was presented for the Modeller software on the Windows platform. The application is able to run all steps of homology modeling, including PDB-to-FASTA generation, running Clustal, model building and loop refinement. Other modules of Modeller, including energy calculation, energy minimization and the ability to make single-point mutations in PDB structures, are also implemented inside Modelface. The API is a simple batch-based application with a minimal memory footprint and is free of charge for academic use. The application is also able to repair missing atom types in PDB structures, making it suitable for many molecular modeling studies such as docking and molecular dynamics simulation. Some successful instances of modeling studies using Modelface are also reported. PMID:28243276

  11. Safety Characteristics in System Application of Software for Human Rated Exploration Missions for the 8th IAASS Conference

    NASA Technical Reports Server (NTRS)

    Mango, Edward J.

    2016-01-01

    NASA and its industry and international partners are embarking on a bold and inspiring development effort to design and build an exploration-class space system. The space system is made up of the Orion system, the Space Launch System (SLS) and the Ground Systems Development and Operations (GSDO) system. All are highly coupled and dependent on each other for the combined safety of the space system. A key area of system safety focus is the ground and flight application software system (GFAS). In the development, certification and operations of GFAS, there is a series of safety characteristics that define the approach to ensure mission success. This paper will explore and examine the safety characteristics of the GFAS development. The GFAS system integrates the flight software packages of Orion and SLS with the ground systems and launch countdown sequencers through an 'agile' software development process. A unique approach is needed to develop the GFAS project capabilities within this agile process. NASA has defined the software development process through a set of standards. The standards were written during the infancy of the so-called industry 'agile development' movement and must be tailored to adapt to the highly integrated environment of human exploration systems. Safety of the space systems and the eventual crew on board is paramount during the preparation of the exploration flight systems. A series of software safety characteristics have been incorporated into the development and certification efforts to ensure readiness for use and compatibility with the space systems. Three underlying factors in the exploration architecture require the GFAS system to be unique in its approach to ensuring safety for the space systems, both the flight and the ground systems. The first is the missions themselves, which are exploration in nature and go far beyond the comfort of low Earth orbit operations. The second is the current exploration

  12. An image-processing software package: UU and Fig for optical metrology applications

    NASA Astrophysics Data System (ADS)

    Chen, Lujie

    2013-06-01

    Modern optical metrology applications are largely supported by computational methods, such as phase shifting [1], the Fourier transform [2], digital image correlation [3], camera calibration [4], etc., in which image processing is a critical and indispensable component. While it is not too difficult to obtain a wide variety of image-processing programs from the internet, few are catered to the relatively special area of optical metrology. This paper introduces an image-processing software package, UU (data processing) and Fig (data rendering), that incorporates many useful functions for processing optical metrological data. The cross-platform programs UU and Fig are developed based on wxWidgets. At the time of writing, they have been tested on Windows, Linux and Mac OS. The user interface is designed to offer precise control of the underlying processing procedures in a scientific manner. The data input/output mechanism is designed to accommodate diverse file formats and to facilitate interaction with other independent programs. In terms of robustness, although the software was initially developed for personal use, it is comparably stable and accurate to most commercial software of a similar nature. In addition to functions for optical metrology, the software package has a rich collection of useful tools in the following areas: real-time image streaming from USB and GigE cameras, computational geometry, computer vision, data fitting, 3D image processing, vector image processing, precision device control (rotary stages, PZT stages, etc.), point cloud to surface reconstruction, volume rendering, and batch processing. The software package is currently used in a number of universities for teaching and research.
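    As an example of the phase-shifting computation such a package typically implements, the classic four-step formula recovers the wrapped phase from four fringe frames shifted by pi/2 each (this is the textbook algorithm, not necessarily UU's internal implementation):

```python
import math

def four_step_phase(i1, i2, i3, i4):
    """Wrapped phase from four frames with pi/2 phase steps:
    phi = atan2(I4 - I2, I1 - I3)."""
    return math.atan2(i4 - i2, i1 - i3)

# Synthetic fringe intensities I_k = A + B*cos(phi + k*pi/2),
# with hypothetical background A, modulation B, and phase phi
A, B, phi = 2.0, 1.5, 0.7
frames = [A + B * math.cos(phi + k * math.pi / 2) for k in range(4)]
recovered = four_step_phase(*frames)
```

    Since I4 - I2 = 2B*sin(phi) and I1 - I3 = 2B*cos(phi), the background and modulation cancel and the arctangent returns the phase directly (wrapped to (-pi, pi]).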

  13. Application of the AHP method in modeling the trust and reputation of software agents

    NASA Astrophysics Data System (ADS)

    Zytniewski, Mariusz; Klementa, Marek; Skorupka, Dariusz; Stanek, Stanislaw; Duchaczek, Artur

    2016-06-01

    Given the unique characteristics of cyberspace and, in particular, the number of inherent security threats, communication between software agents becomes a highly complex issue and a major challenge that, on the one hand, needs to be continuously monitored and, on the other, awaits new solutions addressing its vulnerabilities. An approach that has recently come into view mimics mechanisms typical of social systems and is based on trust and reputation that assist agents in deciding which other agents to interact with. The paper offers an enhancement to existing trust and reputation models, involving the application of the AHP method that is widely used for decision support in social systems, notably for risk analysis. To this end, it is proposed to expand the underlying conceptual basis by including such notions as self-trust and social trust, and to apply these to software agents. The discussion is concluded with an account of an experiment aimed at testing the effectiveness of the proposed solution.
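    A core step of the AHP method is deriving a priority vector from a pairwise-comparison matrix. A common approximation uses normalized row geometric means; the sketch below applies it to three hypothetical trust criteria (the paper's actual criteria and comparison values are not given in the abstract):

```python
import math

def ahp_priorities(M):
    """Priority vector of a pairwise-comparison matrix,
    approximated by normalized row geometric means."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in M]
    s = sum(gm)
    return [g / s for g in gm]

# Hypothetical pairwise comparisons of three trust criteria:
# direct experience vs. reputation reports vs. self-trust
# (M[i][j] = how much more important criterion i is than j, Saaty scale)
M = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 3.0],
    [1 / 5, 1 / 3, 1.0],
]
w = ahp_priorities(M)
```

    The resulting weights sum to one and preserve the stated dominance order (here roughly 0.64, 0.26, 0.10); an agent can then combine per-criterion trust scores with these weights when deciding whom to interact with.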

  14. The Application of V&V within Reuse-Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward

    1996-01-01

    Verification and Validation (V&V) is performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors as early as possible during the development process. Early discovery is important in order to minimize the cost and other impacts of correcting these errors. In reuse-based software engineering, decisions on the requirements, design and even implementation of domain assets can be made prior to beginning development of a specific system. In order to bring the effectiveness of V&V to bear within reuse-based software engineering, V&V must be incorporated within the domain engineering process.

  15. Application of the ARRAMIS Risk and Reliability Software to the Nuclear Accident Progression

    SciTech Connect

    Wyss, Gregory D.; Daniel, Sharon L.; Hays, Kelly M.; Brown, Thomas D.

    1997-06-01

    The ARRAMIS risk and reliability analysis software suite developed by Sandia National Laboratories enables analysts to evaluate the safety and reliability of a wide range of complex systems whose failure results in high consequences. This software was originally designed to model the systems, responses, and phenomena associated with potential severe accidents at commercial nuclear power reactors by solving very large fault tree and event tree models. However, because of its power and versatility, ARRAMIS and its constituent analysis engines have recently been used to evaluate a wide variety of systems, including nuclear weapons, telecommunications facilities, robotic material handling systems, and aircraft systems using hybrid fault tree event tree analysis techniques incorporating fully integrated uncertainty analysis capabilities. This paper describes recent applications in the area of nuclear reactor accident progression analysis using a large event tree methodology and the ARRAMIS package.

  16. Lifelong personal health data and application software via virtual machines in the cloud.

    PubMed

    Van Gorp, Pieter; Comuzzi, Marco

    2014-01-01

    Personal Health Records (PHRs) should remain the lifelong property of patients, who should be able to show them conveniently and securely to selected caregivers and institutions. In this paper, we present MyPHRMachines, a cloud-based PHR system taking a radically new architectural solution to health record portability. In MyPHRMachines, health-related data and the application software to view and/or analyze it are separately deployed in the PHR system. After uploading their medical data to MyPHRMachines, patients can access them again from remote virtual machines that contain the right software to visualize and analyze them without any need for conversion. Patients can share their remote virtual machine session with selected caregivers, who will need only a Web browser to access the pre-loaded fragments of their lifelong PHR. We discuss a prototype of MyPHRMachines applied to two use cases, i.e., radiology image sharing and personalized medicine.

  17. Using Solution-Focused Applications for Transitional Coping of Workplace Survivors

    ERIC Educational Resources Information Center

    Germain, Marie-Line; Palamara, Sherry A.

    2007-01-01

    Solution-focused applications are proposed to assist survivor employees to return to workplace homeostasis after co-workers voluntarily or involuntarily leave the organization. A model for transitional coping is presented as well as a potential case study illustrating the application of the model. Implications for the theory, practice, and…

  18. Computer software.

    PubMed

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of type, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  19. Applications of on-product diffraction-based focus metrology in logic high volume manufacturing

    NASA Astrophysics Data System (ADS)

    Noyes, Ben F.; Mokaberi, Babak; Bolton, David; Li, Chen; Palande, Ashwin; Park, Kevin; Noot, Marc; Kea, Marc

    2016-03-01

    The integration of on-product diffraction-based focus (DBF) capability into the majority of immersion lithography layers in leading-edge logic manufacturing has enabled new applications targeted at improving cycle time and yield. A CD-based detection method is the process of record (POR) for excursion detection. The drawback of this method is increased cycle time and limited sampling due to CD-SEM metrology capacity constraints. The DBF-based method allows the addition of focus metrology samples to the existing overlay measurements on the integrated metrology (IM) system. The result enables the addition of measured focus to the SPC system, allowing a faster excursion detection method. For focus targeting, the current method involves using a dedicated focus-exposure matrix (FEM) on all scanners, resulting in lengthy analysis times and uncertainty in the best focus. The DBF method allows the measurement to occur on the IM system, on a regular production wafer, and at the same time as the exposure. This results in a cycle time gain as well as a less subjective determination of best focus. A third application aims to use the novel on-product focus metrology data in order to apply per-exposure focus corrections to the scanner. These corrections are particularly effective at the edge of the wafer, where systematic layer-dependent effects can be removed using DBF-based scanner feedback. This paper will discuss the development of a methodology to accomplish each of these applications in a high-volume production environment. The new focus metrology method, sampling schemes, feedback mechanisms and analysis methods lead to improved focus control, as well as earlier detection of failures.
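    Feedback of measured focus errors to the scanner is typically implemented as a run-to-run controller. The EWMA sketch below illustrates the idea on a constant systematic offset; the paper's actual feedback mechanism, gain, and the 30 nm offset are all assumptions for illustration.

```python
def ewma_controller(disturbance, lam=0.3, runs=20):
    """Track a systematic focus offset with an EWMA run-to-run loop.
    Each run, DBF metrology reports the residual error under the
    current correction, and the correction is updated by lam * residual."""
    correction, history = 0.0, []
    for _ in range(runs):
        residual = disturbance - correction  # what DBF metrology would report
        correction += lam * residual         # feedback update
        history.append(residual)
    return correction, history

# Hypothetical constant 30 nm systematic focus offset at the wafer edge
corr, hist = ewma_controller(30.0)
```

    The residual decays geometrically (by a factor 1 - lam per run), so after 20 runs the correction has converged to within a fraction of a nanometer of the 30 nm offset; a smaller lam filters metrology noise harder at the cost of slower convergence.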

  20. A flexible software architecture for scalable real-time image and video processing applications

    NASA Astrophysics Data System (ADS)

    Usamentiaga, Rubén; Molleda, Julio; García, Daniel F.; Bulnes, Francisco G.

    2012-06-01

Real-time image and video processing applications require skilled architects, and recent trends in hardware platforms make the design and implementation of these applications increasingly complex. Many frameworks and libraries have been proposed or commercialized to simplify the design and tuning of real-time image processing applications. However, they tend to lack flexibility because they are normally oriented towards particular types of applications, or they impose specific data processing models such as the pipeline. Other issues include large memory footprints, difficulty of reuse, and inefficient execution on multicore processors. This paper presents a novel software architecture for real-time image and video processing applications which addresses these issues. The architecture is divided into three layers: the platform abstraction layer, the messaging layer, and the application layer. The platform abstraction layer provides a high-level application programming interface for the rest of the architecture. The messaging layer provides a message-passing interface based on a dynamic publish/subscribe pattern. Topic-based filtering, in which messages are published to topics, routes messages from publishers to the subscribers interested in a particular type of message. The application layer provides a repository of reusable application modules designed for real-time image and video processing applications. These modules, which include acquisition, visualization, communication, user interface and data processing modules, take advantage of the power of other well-known libraries such as OpenCV, Intel IPP, or CUDA. Finally, we present different prototypes and applications to show the possibilities of the proposed architecture.
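The topic-based publish/subscribe routing the abstract describes can be sketched minimally in Python; the class and method names here are illustrative, not the paper's API:

```python
from collections import defaultdict

class MessageBroker:
    """Minimal topic-based publish/subscribe router (illustrative sketch)."""

    def __init__(self):
        self._subscribers = defaultdict(list)  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        # Route the message only to subscribers of this topic.
        for callback in self._subscribers[topic]:
            callback(message)

# Example: an acquisition module publishes frames; a visualization
# module subscribes only to the "frames" topic.
broker = MessageBroker()
received = []
broker.subscribe("frames", received.append)
broker.publish("frames", {"id": 1})
broker.publish("stats", {"fps": 30})  # no subscriber; message is dropped
```

The filtering happens at routing time: a publisher never needs to know which modules consume its output, which is what makes such modules reusable across pipelines.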

  1. Isolating the Effects of a Mobile Phone on the Usability and Safety of eHealth Software Applications.

    PubMed

    Borycki, Elizabeth M; Griffith, Janessa; Monkman, Helen; Reid-Haughian, Cheryl

    2017-01-01

    Mobile phones are used in conjunction with mobile eHealth software applications. These mobile software applications can be used to access, review and document clinical information. The objective of this research was to explore the relationship between mobile phones, usability and safety. Clinical simulations and semi-structured interviews were used to investigate this relationship. The findings revealed that mobile phones may lead to specific types of usability issues that may introduce some types of errors.

  2. Effects of a Passive Online Software Application on Heart Rate Variability and Autonomic Nervous System Balance

    PubMed Central

    2017-01-01

    Abstract Objective: This study investigated whether short-term exposure to a passive online software application of purported subtle energy technology would affect heart rate variability (HRV) and associated autonomic nervous system measures. Methods: This was a randomized, double-blinded, sham-controlled clinical trial (RCT). The study took place in a nonprofit laboratory in Emeryville, California. Twenty healthy, nonsmoking subjects (16 females), aged 40–75 years, participated. Quantum Code Technology™ (QCT), a purported subtle energy technology, was delivered through a passive software application (Heart+ App) on a smartphone placed <1 m from subjects who were seated and reading a catalog. HRV was measured for 5 min in triplicate for each condition via finger plethysmography using a Food and Drug Administration medically approved HRV measurement device. Measurements were made at baseline and 35 min following exposure to the software applications. The following parameters were calculated and analyzed: heart rate, total power, standard deviation node-to-node, root mean square sequential difference, low frequency to high frequency ratio (LF/HF), low frequency (LF), and high frequency (HF). Results: Paired samples t-tests showed that for the Heart+ App, mean LF/HF decreased (p = 9.5 × 10–4), while mean LF decreased in a trend (p = 0.06), indicating reduced sympathetic dominance. Root mean square sequential difference increased for the Heart+ App, showing a possible trend (p = 0.09). Post–pre differences in LF/HF for sham compared with the Heart+ App were also significant (p < 0.008) by independent t-test, indicating clinical relevance. Conclusions: Significant beneficial changes in mean LF/HF, along with possible trends in mean LF and root mean square sequential difference, were observed in subjects following 35 min exposure to the Heart+ App that was working in the background on an active smartphone untouched by the subjects

  3. Effects of a Passive Online Software Application on Heart Rate Variability and Autonomic Nervous System Balance.

    PubMed

    Rubik, Beverly

    2017-01-01

    This study investigated whether short-term exposure to a passive online software application of purported subtle energy technology would affect heart rate variability (HRV) and associated autonomic nervous system measures. This was a randomized, double-blinded, sham-controlled clinical trial (RCT). The study took place in a nonprofit laboratory in Emeryville, California. Twenty healthy, nonsmoking subjects (16 females), aged 40-75 years, participated. Quantum Code Technology(™) (QCT), a purported subtle energy technology, was delivered through a passive software application (Heart+ App) on a smartphone placed <1 m from subjects who were seated and reading a catalog. HRV was measured for 5 min in triplicate for each condition via finger plethysmography using a Food and Drug Administration medically approved HRV measurement device. Measurements were made at baseline and 35 min following exposure to the software applications. The following parameters were calculated and analyzed: heart rate, total power, standard deviation node-to-node, root mean square sequential difference, low frequency to high frequency ratio (LF/HF), low frequency (LF), and high frequency (HF). Paired samples t-tests showed that for the Heart+ App, mean LF/HF decreased (p = 9.5 × 10(-4)), while mean LF decreased in a trend (p = 0.06), indicating reduced sympathetic dominance. Root mean square sequential difference increased for the Heart+ App, showing a possible trend (p = 0.09). Post-pre differences in LF/HF for sham compared with the Heart+ App were also significant (p < 0.008) by independent t-test, indicating clinical relevance. Significant beneficial changes in mean LF/HF, along with possible trends in mean LF and root mean square sequential difference, were observed in subjects following 35 min exposure to the Heart+ App that was working in the background on an active smartphone untouched by the subjects. This may be the first RCT to show that specific
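The paired-samples t-test underlying the reported pre/post LF/HF comparisons can be sketched with synthetic numbers; the data below are invented for illustration and are not the study's measurements:

```python
import math

def paired_t_statistic(pre, post):
    """t statistic for paired samples: t = mean(d) / (sd(d) / sqrt(n)),
    where d are the per-subject post-minus-pre differences."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)
    return mean_d / math.sqrt(var_d / n)

# Synthetic LF/HF ratios before and after an exposure (n = 5 subjects).
pre  = [2.1, 1.8, 2.5, 2.0, 1.9]
post = [1.6, 1.5, 2.0, 1.7, 1.6]
t = paired_t_statistic(pre, post)  # negative t: LF/HF decreased
```

A negative t with small p would correspond to the decrease in sympathetic dominance the abstract reports.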

  4. A Study of the Feasibility of Duplicating JAMPS Applications Software in the Ada Programming Language.

    DTIC Science & Technology

    1984-04-01

A benchmark program based on the Sieve of Eratosthenes [7] finds all of the prime numbers between 3 and 16381. The benchmark test results (Table 3, "Benchmark Test Results Using the Sieve of Eratosthenes [7]": execution and operating time) appear to be quite reasonable for the JAMPS application. Related sections include a sizing analysis for existing software written in "C", sizing data with adjustments, and conversion from "C" to Ada.
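The benchmark named here, the Sieve of Eratosthenes over 3 to 16381, is a classic and can be sketched in Python (the report's versions were presumably in "C" and Ada; this is only an illustration):

```python
def sieve_primes(limit):
    """Sieve of Eratosthenes: return all primes up to and including limit."""
    is_prime = [True] * (limit + 1)
    is_prime[0] = is_prime[1] = False
    for p in range(2, int(limit ** 0.5) + 1):
        if is_prime[p]:
            # Mark every multiple of p, starting from p*p, as composite.
            for multiple in range(p * p, limit + 1, p):
                is_prime[multiple] = False
    return [n for n in range(limit + 1) if is_prime[n]]

# The benchmark range: all primes between 3 and 16381.
primes = [p for p in sieve_primes(16381) if p >= 3]
```

This range (odd numbers up to 16381, i.e. 8190 candidate flags) matches the well-known BYTE sieve benchmark, which yields 1899 primes.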

  5. Second International Workshop on Software Engineering and Code Design in Parallel Meteorological and Oceanographic Applications

    NASA Technical Reports Server (NTRS)

    OKeefe, Matthew (Editor); Kerr, Christopher L. (Editor)

    1998-01-01

    This report contains the abstracts and technical papers from the Second International Workshop on Software Engineering and Code Design in Parallel Meteorological and Oceanographic Applications, held June 15-18, 1998, in Scottsdale, Arizona. The purpose of the workshop is to bring together software developers in meteorology and oceanography to discuss software engineering and code design issues for parallel architectures, including Massively Parallel Processors (MPP's), Parallel Vector Processors (PVP's), Symmetric Multi-Processors (SMP's), Distributed Shared Memory (DSM) multi-processors, and clusters. Issues to be discussed include: (1) code architectures for current parallel models, including basic data structures, storage allocation, variable naming conventions, coding rules and styles, i/o and pre/post-processing of data; (2) designing modular code; (3) load balancing and domain decomposition; (4) techniques that exploit parallelism efficiently yet hide the machine-related details from the programmer; (5) tools for making the programmer more productive; and (6) the proliferation of programming models (F--, OpenMP, MPI, and HPF).

  6. Spectra acquisition software for clinical applications of the USB4000 spectrometer

    NASA Astrophysics Data System (ADS)

    Martínez Rodríguez, A. E.; Delgado Atencio, J. A.; Vázquez Y Montiel, S.; Romero Hernández, R. A.

    2011-08-01

The non-invasive clinical method of diffuse reflectance spectroscopy (DRS) for the diagnosis of human skin lesions can be performed using spectrometric devices together with fiber-optic probes. However, most commercially available devices are not specifically designed for clinical applications. As a result, the commercial software and the optical hardware of these spectrometers are impractical when trying to reconcile the requirements of a clinical procedure with their operation for DRS-based diagnosis. Therefore, the development of home-built acquisition software will result in a more reliable and practical spectrometric system for the clinical environment. This work presents the development of an automation system that includes both a graphical user interface and a control system enabling more reliable and faster acquisition of clinical spectra. The software features voice control of the spectra acquisition process. The main contribution of this work is the use of available programming platforms to implement a preliminary spectra-processing tool that will lead to real-time acquisition of skin reflectance spectra of a given patient.

  7. Design of single phase inverter using microcontroller assisted by data processing applications software

    NASA Astrophysics Data System (ADS)

    Ismail, K.; Muharam, A.; Amin; Widodo Budi, S.

    2015-12-01

Inverters are widely used for industrial, office, and residential purposes. The inverter supports the development of alternative energy sources such as solar cells, wind turbines and fuel cells by converting DC voltage to AC voltage. Inverters have been built with a variety of hardware and software combinations, such as pure analog circuits and various types of microcontrollers as controllers. When a pure analog circuit is used, modification is difficult because it requires changing the entire hardware. In a microcontroller-based inverter design (with software), the calculations that generate the AC modulation are done in the microcontroller. This increases programming complexity and the amount of code downloaded to the microcontroller chip (flash memory capacity in the microcontroller is limited). This paper discusses the design of a single-phase inverter using unipolar modulation of a sine wave and a triangular wave, computed outside the microcontroller using a data processing software application (Microsoft Excel). Results show that programming complexity is reduced and that the sampling resolution strongly influences THD; a sampling resolution of half a degree is required to obtain the best THD (15.8%).
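The sine/triangle comparison that the paper precomputes outside the microcontroller (in Excel) can be sketched in Python. The half-degree sampling (720 samples per cycle) follows the paper's finding; the carrier ratio and modulation index below are illustrative assumptions:

```python
import math

def unipolar_spwm_table(steps_per_cycle, carrier_ratio, mod_index=0.9):
    """Precompute unipolar SPWM switching states by comparing a sine
    reference against a triangular carrier (illustrative parameters)."""
    table = []
    for k in range(steps_per_cycle):
        angle = 2 * math.pi * k / steps_per_cycle
        reference = mod_index * math.sin(angle)
        # Triangular carrier in [-1, 1], at carrier_ratio times the
        # fundamental frequency.
        phase = (k * carrier_ratio / steps_per_cycle) % 1.0
        carrier = 4 * abs(phase - 0.5) - 1
        table.append(1 if reference > carrier else 0)
    return table

# Half-degree resolution: 720 samples per fundamental cycle.
table = unipolar_spwm_table(720, carrier_ratio=21)
```

The resulting 0/1 table is what would be stored and played back by the microcontroller, so no trigonometry runs on the chip itself.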

  8. Surgical model-view-controller simulation software framework for local and collaborative applications

    PubMed Central

    Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu

    2010-01-01

    Purpose Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. Methods A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. Results The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. Conclusion A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users. PMID:20714933

  9. Surgical model-view-controller simulation software framework for local and collaborative applications.

    PubMed

    Maciel, Anderson; Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu

    2011-07-01

    Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users.
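The decoupled-simulation idea, each process updating at its own frame rate under one master loop, can be illustrated with a drift-free integer scheduler (a sketch of the concept, not the paper's framework):

```python
def run_decoupled(sim_time_s, rates_hz):
    """Count updates of each named process when all are driven by a
    master loop stepping at the fastest rate. Integer arithmetic
    avoids floating-point drift in the firing schedule."""
    base = max(rates_hz.values())          # master loop rate in Hz
    steps = int(sim_time_s * base)
    counts = {name: 0 for name in rates_hz}
    for s in range(steps):
        for name, rate in rates_hz.items():
            # A process fires whenever this master step crosses one of
            # its own update periods.
            if (s + 1) * rate // base > s * rate // base:
                counts[name] += 1
    return counts

# Haptics at 1000 Hz, physics at 100 Hz, rendering at 30 Hz,
# over one second of simulated time.
counts = run_decoupled(1.0, {"haptics": 1000, "physics": 100, "render": 30})
```

In the real framework each process would run in its own thread so a slow renderer cannot stall the 1,000 Hz haptic loop; the scheduler above only shows the rate decoupling.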

  10. Statistical software applications used in health services research: analysis of published studies in the U.S

    PubMed Central

    2011-01-01

    Background This study aims to identify the statistical software applications most commonly employed for data analysis in health services research (HSR) studies in the U.S. The study also examines the extent to which information describing the specific analytical software utilized is provided in published articles reporting on HSR studies. Methods Data were extracted from a sample of 1,139 articles (including 877 original research articles) published between 2007 and 2009 in three U.S. HSR journals, that were considered to be representative of the field based upon a set of selection criteria. Descriptive analyses were conducted to categorize patterns in statistical software usage in those articles. The data were stratified by calendar year to detect trends in software use over time. Results Only 61.0% of original research articles in prominent U.S. HSR journals identified the particular type of statistical software application used for data analysis. Stata and SAS were overwhelmingly the most commonly used software applications employed (in 46.0% and 42.6% of articles respectively). However, SAS use grew considerably during the study period compared to other applications. Stratification of the data revealed that the type of statistical software used varied considerably by whether authors were from the U.S. or from other countries. Conclusions The findings highlight a need for HSR investigators to identify more consistently the specific analytical software used in their studies. Knowing that information can be important, because different software packages might produce varying results, owing to differences in the software's underlying estimation methods. PMID:21977990
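The categorization and year-stratified tallying described in the methods can be sketched with synthetic records (the article tuples and package names below are invented for illustration):

```python
from collections import Counter, defaultdict

def tally_by_year(articles):
    """Tally statistical-software mentions per publication year.
    articles: iterable of (year, package) pairs."""
    by_year = defaultdict(Counter)
    for year, package in articles:
        by_year[year][package] += 1
    return by_year

# Synthetic sample of extracted (year, package) records.
articles = [(2007, "Stata"), (2007, "SAS"), (2008, "SAS"),
            (2009, "SAS"), (2009, "Stata"), (2009, "SPSS")]
tallies = tally_by_year(articles)
```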

  11. The development and application of composite complexity models and a relative complexity metric in a software maintenance environment

    NASA Technical Reports Server (NTRS)

    Hops, J. M.; Sherif, J. S.

    1994-01-01

A great deal of effort is now being devoted to the study, analysis, prediction, and minimization of the expected cost of software maintenance, long before software is delivered to users or customers. It has been estimated that, on average, the effort spent on software maintenance is as costly as the effort spent on all other software costs. Software design methods should be the starting point for alleviating the problems of software maintenance complexity and high costs. Two aspects of maintenance deserve attention: (1) protocols for locating and rectifying defects, and for ensuring that no new defects are introduced in the development phase of the software process; and (2) protocols for modification, enhancement, and upgrading. This article focuses primarily on the second aspect, the development of protocols to help increase the quality and reduce the costs associated with modifications, enhancements, and upgrades of existing software. This study developed parsimonious models and a relative complexity metric for complexity measurement of software that were used to rank the modules in the system relative to one another. Some success was achieved in using the models and the relative metric to identify maintenance-prone modules.
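A relative complexity metric of the kind described, standardizing several raw metrics per module and ranking modules by the composite, can be sketched as follows; the particular metrics and equal weighting are illustrative assumptions, not the article's model:

```python
import statistics

def relative_complexity_ranking(modules):
    """Rank modules by a composite of z-scored raw metrics.
    modules: dict of module name -> dict of raw metrics (same keys each)."""
    names = list(modules)
    metric_keys = list(next(iter(modules.values())))
    scores = {name: 0.0 for name in names}
    for key in metric_keys:
        values = [modules[n][key] for n in names]
        mean, sd = statistics.mean(values), statistics.stdev(values)
        for n in names:
            # z-score makes metrics with different units comparable.
            scores[n] += (modules[n][key] - mean) / sd
    # Highest composite score = most maintenance-prone candidate.
    return sorted(names, key=lambda n: scores[n], reverse=True)

modules = {
    "parser":  {"loc": 1200, "cyclomatic": 45, "fan_out": 12},
    "logger":  {"loc": 150,  "cyclomatic": 6,  "fan_out": 2},
    "planner": {"loc": 800,  "cyclomatic": 30, "fan_out": 9},
}
ranking = relative_complexity_ranking(modules)
```

The point of a relative metric is exactly this ranking: it flags which modules deserve maintenance attention first, without claiming the absolute scores mean anything on their own.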

  12. Complementing anatomy education using three-dimensional anatomy mobile software applications on tablet computers.

    PubMed

    Lewis, T L; Burnett, B; Tunstall, R G; Abrahams, P H

    2014-04-01

    Anatomy has traditionally been a cornerstone of medical education, which has been taught via dissection and didactic lectures. The rising prevalence of mobile tablet technology means medical software applications ("apps") play an increasingly important role in medical education. The applications highlighted in this article will aid anatomical educators to identify which are the most useful in clinical, academic, and educational environments. These have been systematically identified by downloading all applications with keywords related to anatomy and then carrying out qualitative assessment. Novel anatomy applications from developers such as Visible Body, 3D4Medical, and Pocket Anatomy allow students to visualize and manipulate complex anatomical structures using detailed 3D models. They often contain additional content including clinical correlations and a range of media from instructional videos to interactive quiz functions. The strength of tablet technology lies in its ability to consolidate and present anatomical information to the user in the most appropriate manner for their learning style. The only question mark remains over the level of detail and accuracy of these applications. Innovative medical educators who embrace tablet technology will find that anatomy applications serve as a useful learning tool when used in conjunction with existing teaching setups.

  13. Solar thermal power systems point-focusing thermal and electric applications projects. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Marriott, A.

    1980-01-01

The activities of the Point-Focusing Thermal and Electric Applications (PFTEA) Project for fiscal year 1979 are summarized. The main thrust of the PFTEA Project, the small community solar thermal power experiment, was completed. Concept definition studies included a small central receiver approach, a point-focusing distributed receiver system with central power generation, and a point-focusing distributed receiver concept with distributed power generation. The first experiment in the Isolated Application Series was initiated. Planning for the third engineering experiment series, which addresses the industrial market sector, was also initiated. In addition to the experiment-related activities, several contracts to industry were let and studies were conducted to explore the market potential for point-focusing distributed receiver (PFDR) systems. System analysis studies were completed that looked at PFDR technology relative to other small power system technology candidates for the utility market sector.

  14. The application software of the CERN PS accelerator controls system — analysis of its cost and resources

    NASA Astrophysics Data System (ADS)

    Benincasa, Gianpaolo; Daneels, Axel; Heymans, Paul; Serre, Christian

    1986-06-01

The CERN PS accelerators have evolved into one of the world's most sophisticated high-energy physics facilities. The variety of beams and their high repetition rate mean that a most sophisticated controls system is required. This is reflected in the application software. By the time the new control system was completed, nearly 1000 programs, amounting to around 450 000 lines of code, had been developed at a cost of approximately 120 man-years. The span of this software ranges from real-time application programs to special-purpose development and management tools. This paper documents the cost, resources and production of this software project. These are analyzed in terms of the structure of the application software. Rules-of-thumb are suggested for estimating the required effort at various phases of the project and for defining the implementation strategy.

  15. Transcranial MR-Guided Focused Ultrasound: A Review of the Technology and Neuro Applications

    PubMed Central

    Ghanouni, Pejman; Pauly, Kim Butts; Elias, W. Jeff; Henderson, Jaimie; Sheehan, Jason; Monteith, Stephen; Wintermark, Max

    2015-01-01

    MR guided focused ultrasound is a new, minimally invasive method of targeted tissue thermal ablation that may be of use to treat central neuropathic pain, essential tremor, Parkinson tremor, and brain tumors. The system has also been used to temporarily disrupt the blood-brain barrier to allow targeted drug delivery to brain tumors. This article reviews the physical principles of MR guided focused ultrasound and discusses current and potential applications of this exciting technology. PMID:26102394

  16. A Java and XML Application to Support Numerical Model Development within the Geologic Sequestration Software Suite (GS3)

    NASA Astrophysics Data System (ADS)

    Williams, M. D.; Wurstner, S. K.; Thorne, P. D.; Freedman, V. L.; Litofsky, A.; Huda, S. A.; Gurumoorthi, V.

    2010-12-01

    A Java and XML based application is currently under development as part of the Geologic Sequestration Software Suite (GS3) to support the generation of input files for multiple subsurface multifluid flow and transport simulators. The application will aid in the translation of conceptual models to a numerical modeling framework, and will provide the capability of generating multi-scale (spatial and temporal) numerical models in support of a variety of collaborative geologic sequestration studies. User specifications for numerical models include grids, geology, boundary and initial conditions, source terms, material properties, geochemical reactions, and geomechanics. Some of these inputs are defined as part of the conceptual model, while others are specified during the numerical model development process. To preserve the distinction between the conceptual model and its translation to a numerical modeling framework, the application manages data associated with each specification independently. This facilitates 1) building multi-scale numerical models from a common conceptual model, 2) building numerical models from multiple conceptual models, 3) building numerical models and input files for different simulators from a common conceptual model, 4) ease in re-generating numerical models in response to revisions of the conceptual model, and 5) revising the numerical model specification during the development process (e.g., grid modifications and resulting re-assignment of material property values and distributions). A key aspect of the model definition software is the ability to define features in the numerical model by specifying them as geometric objects, eliminating the need for the user to specify node/element numbers that often change when the grid is revised. The GS3 platform provides the capability of tracking provenance and dependencies for data files used in the numerical model definition. Within this framework, metadata is generated to support configuration
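Defining model features as geometric objects rather than node/element numbers can be sketched as follows: cells are classified by their center coordinates, so a regenerated or refined grid re-derives the assignment automatically. Function and parameter names here are hypothetical, and the example is 2-D for brevity:

```python
def assign_material(cell_centers, box, material_id, default_id=0):
    """Assign material_id to every cell whose center lies inside an
    axis-aligned box ((xmin, ymin), (xmax, ymax)). The definition is
    grid-independent: no node or element numbers appear anywhere."""
    (xmin, ymin), (xmax, ymax) = box
    materials = []
    for (x, y) in cell_centers:
        inside = xmin <= x <= xmax and ymin <= y <= ymax
        materials.append(material_id if inside else default_id)
    return materials

# A coarse 4-cell unit-square grid; refining the grid would reuse the
# same geometric definition without renumbering any cells.
coarse = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]
mats = assign_material(coarse, ((0.5, 0.0), (1.0, 1.0)), material_id=7)
```

Because the geometry, not the mesh, carries the specification, revising the grid (point 4 in the abstract's list) costs nothing: the material field is simply recomputed.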

  17. CG-DAMS: Concrete gravity dam stability analysis software. Application manual, final report

    SciTech Connect

    Not Available

    1993-01-01

CG-DAMS is a finite element based program written specifically for the stability analysis of concrete gravity dams. The code automates the prediction and evaluation of cracking in the dam, along the dam-rock interface, and in the foundation using incremental nonlinear analysis techniques based on the "smeared crack" approach. Its primary application is in the computation of dam-rock interface sliding stability factors of safety. The automated procedure for crack propagation analysis replaces the trial-and-error cracked-base analysis method commonly used in gravity dam safety analyses. This Application manual of CG-DAMS illustrates, through sample problems, the many features of the software. Example problems illustrate the capabilities of both CG-DAMS-PC and CG-DAMS-ABAQUS. CG-DAMS-PC is a menu driven program that runs on 386/486 PCs under the DOS operating system (4 Megabytes RAM, 25 Megabytes of hard disk space). CG-DAMS-ABAQUS is a pre- and post-processor along with a concrete constitutive model and distributed load module that interfaces with the ABAQUS general purpose finite element program. The PC program contains thermal analysis capabilities, a rough crack constitutive model, and an interface to the CRFLOOD software not available with the ABAQUS version. The CG-DAMS-ABAQUS program contains time marching dynamic analysis capabilities not available with the PC program. Example analyses presented include static, pseudo dynamic, and time marching dynamic analyses. The manual also presents sensitivity evaluations on mesh size and foundation material strength. Comparisons are presented between CG-DAMS and gravity method calculations. Comparisons with other finite element software are included for the dynamic time history analyses.

  18. VANESA - a software application for the visualization and analysis of networks in system biology applications.

    PubMed

    Brinkrolf, Christoph; Janowski, Sebastian Jan; Kormeier, Benjamin; Lewinski, Martin; Hippe, Klaus; Borck, Daniela; Hofestädt, Ralf

    2014-06-23

    VANESA is a modeling software for the automatic reconstruction and analysis of biological networks based on life-science database information. Using VANESA, scientists are able to model any kind of biological processes and systems as biological networks. It is now possible for scientists to automatically reconstruct important molecular systems with information from the databases KEGG, MINT, IntAct, HPRD, and BRENDA. Additionally, experimental results can be expanded with database information to better analyze the investigated elements and processes in an overall context. Users also have the possibility to use graph theoretical approaches in VANESA to identify regulatory structures and significant actors within the modeled systems. These structures can then be further investigated in the Petri net environment of VANESA. It is platform-independent, free-of-charge, and available at http://vanesa.sf.net.

  19. Intracoronary optical coherence tomography: Clinical and research applications and intravascular imaging software overview.

    PubMed

    Tenekecioglu, Erhan; Albuquerque, Felipe N; Sotomi, Yohei; Zeng, Yaping; Suwannasom, Pannipa; Tateishi, Hiroki; Cavalcante, Rafael; Ishibashi, Yuki; Nakatani, Shimpei; Abdelghani, Mohammad; Dijkstra, Jouke; Bourantas, Christos; Collet, Carlos; Karanasos, Antonios; Radu, Maria; Wang, Ancong; Muramatsu, Takashi; Landmesser, Ulf; Okamura, Takayuki; Regar, Evelyn; Räber, Lorenz; Guagliumi, Giulio; Pyo, Robert T; Onuma, Yoshinobu; Serruys, Patrick W

    2017-01-21

    By providing valuable information about the coronary artery wall and lumen, intravascular imaging may aid in optimizing interventional procedure results and thereby could improve clinical outcomes following percutaneous coronary intervention (PCI). Intravascular optical coherence tomography (OCT) is a light-based technology with a tissue penetration of approximately 1 to 3 mm and provides near histological resolution. It has emerged as a technological breakthrough in intravascular imaging with multiple clinical and research applications. OCT provides detailed visualization of the vessel following PCI and provides accurate assessment of post-procedural stent performance including detection of edge dissection, stent struts apposition, tissue prolapse, and healing parameters. Additionally, it can provide accurate characterization of plaque morphology and provides key information to optimize post-procedural outcomes. This manuscript aims to review the current clinical and research applications of intracoronary OCT and summarize the analytic OCT imaging software packages currently available. © 2017 Wiley Periodicals, Inc.

  20. Software-defined radio with flexible RF front end for satellite maritime radio applications

    NASA Astrophysics Data System (ADS)

    Budroweit, Jan

    2016-09-01

    This paper presents the concept of a software-defined radio with a flexible RF front end. The design and architecture of this system, as well as possible application examples will be explained. One specific scenario is the operation in maritime frequency bands. A well-known service is the Automatic Identification System (AIS), which has been captured by the DLR mission AISat, and will be chosen as a maritime application example. The results of an embedded solution for AIS on the SDR platform are presented in this paper. Since there is an increasing request for more performance on maritime radio bands, services like AIS will be enhanced by the International Association of Marine Aids to Navigation and Lighthouse Authorities (IALA). The new VHF Data Exchange Service (VDES) shall implement a dedicated satellite link. This paper describes that the SDR with a flexible RF front end can be used as a technology demonstration platform for this upcoming data exchange service.

  1. Freely available compound data sets and software tools for chemoinformatics and computational medicinal chemistry applications.

    PubMed

    Hu, Ye; Bajorath, Jurgen

    2012-01-01

We have generated a number of compound data sets and programs for different types of applications in pharmaceutical research. These data sets and programs were originally designed for our research projects and are made publicly available. Without consulting original literature sources, it is difficult to understand specific features of data sets and software tools, basic ideas underlying their design, and applicability domains. Currently, 30 different entries are available for download from our website. In this data article, we provide an overview of the data and tools we make available and designate the areas of research for which they should be useful. For selected data sets and methods/programs, detailed descriptions are given. This article should help interested readers to select data and tools for specific computational investigations.

  2. Application of the Software as a Service Model to the Control of Complex Building Systems

    SciTech Connect

    Stadler, Michael; Donadee, Jon; Marnay, Chris; Mendes, Goncalo; Appen, Jan von; Mégel, Oliver; Bhattacharya, Prajesh; DeForest, Nicholas; Lai, Judy

    2011-03-18

    In an effort to create broad access to its optimization software, Lawrence Berkeley National Laboratory (LBNL), in collaboration with the University of California at Davis (UC Davis) and OSISoft, has recently developed a Software as a Service (SaaS) Model for reducing energy costs, cutting peak power demand, and reducing carbon emissions for multipurpose buildings. UC Davis currently collects and stores energy usage data from buildings on its campus. Researchers at LBNL sought to demonstrate that a SaaS application architecture could be built on top of this data system to optimize the scheduling of electricity and heat delivery in the building. The SaaS interface, known as WebOpt, consists of two major parts: a) the investment & planning module and b) the operations module, which builds on the investment & planning module. The operational scheduling and load shifting optimization models within the operations module use data from load prediction and electrical grid emissions models to create an optimal operating schedule for the next week, reducing peak electricity consumption while maintaining quality of energy services. LBNL's application also provides facility managers with suggested energy infrastructure investments for achieving their energy cost and emission goals based on historical data collected with OSISoft's system. This paper describes these models as well as the SaaS architecture employed by LBNL researchers to provide asset scheduling services to UC Davis. The peak demand, emissions, and cost implications of the asset operation schedule and investments suggested by this optimization model are analyzed.
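
The load-shifting goal of the operations module can be caricatured with a toy greedy heuristic (an invented sketch for illustration only; the actual WebOpt models solve a week-long cost and emissions optimization): clip each hour's demand at a target peak and reassign the clipped energy to the cheapest hours that still have headroom.

```python
# Toy peak-shaving sketch (not the WebOpt formulation): shiftable load
# is moved out of peak hours into cheap hours, conserving total energy.

def shift_load(load, cap, prices):
    """Cap hourly demand at `cap` and move the excess into the
    cheapest hours that remain below the cap."""
    shifted = [min(x, cap) for x in load]
    excess = sum(load) - sum(shifted)       # energy that must be rescheduled
    for h in sorted(range(len(load)), key=lambda h: prices[h]):
        take = min(cap - shifted[h], excess)
        shifted[h] += take
        excess -= take
        if excess <= 0:
            break
    return shifted, excess                  # excess > 0 means cap infeasible
```

For example, `shift_load([2, 3, 8, 9, 4], 6, [1, 2, 5, 5, 3])` moves the 5 kWh clipped from the two peak hours into the two cheapest hours, reducing the peak from 9 to 6 while keeping total energy constant.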

  3. Application of the Software as a Service Model to the Control of Complex Building Systems

    SciTech Connect

    Stadler, Michael; Donadee, Jonathan; Marnay, Chris; Mendes, Goncalo; Appen, Jan von; Megel, Oliver; Bhattacharya, Prajesh; DeForest, Nicholas; Lai, Judy

    2011-03-17

    In an effort to create broad access to its optimization software, Lawrence Berkeley National Laboratory (LBNL), in collaboration with the University of California at Davis (UC Davis) and OSISoft, has recently developed a Software as a Service (SaaS) Model for reducing energy costs, cutting peak power demand, and reducing carbon emissions for multipurpose buildings. UC Davis currently collects and stores energy usage data from buildings on its campus. Researchers at LBNL sought to demonstrate that a SaaS application architecture could be built on top of this data system to optimize the scheduling of electricity and heat delivery in the building. The SaaS interface, known as WebOpt, consists of two major parts: a) the investment & planning module and b) the operations module, which builds on the investment & planning module. The operational scheduling and load shifting optimization models within the operations module use data from load prediction and electrical grid emissions models to create an optimal operating schedule for the next week, reducing peak electricity consumption while maintaining quality of energy services. LBNL's application also provides facility managers with suggested energy infrastructure investments for achieving their energy cost and emission goals based on historical data collected with OSISoft's system. This paper describes these models as well as the SaaS architecture employed by LBNL researchers to provide asset scheduling services to UC Davis. The peak demand, emissions, and cost implications of the asset operation schedule and investments suggested by this optimization model are analysed.

  4. Scalable, high-performance 3D imaging software platform: system architecture and application to virtual colonoscopy.

    PubMed

    Yoshida, Hiroyuki; Wu, Yin; Cai, Wenli; Brett, Bevin

    2012-01-01

    One of the key challenges in three-dimensional (3D) medical imaging is enabling the fast turnaround time often required for interactive or real-time response. This inevitably requires not only high computational power but also high memory bandwidth, due to the massive amount of data that need to be processed. In this work, we have developed a software platform designed to support high-performance 3D medical image processing for a wide range of applications using increasingly available and affordable commodity computing systems: multi-core, cluster, and cloud computing systems. To achieve scalable, high-performance computing, our platform (1) employs size-adaptive, distributable block volumes as a core data structure for efficient parallelization of a wide range of 3D image processing algorithms; (2) supports task scheduling for efficient load distribution and balancing; and (3) consists of layered parallel software libraries that allow a wide range of medical applications to share the same functionality. We evaluated the performance of our platform by applying it to an electronic cleansing system in virtual colonoscopy, with initial experimental results showing a tenfold performance improvement on an 8-core workstation over the original sequential implementation of the system.
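
The block-volume idea can be sketched in a few lines (an assumed minimal Python analogue, not the authors' platform): tile the volume into fixed-size blocks, map a per-block filter across a worker pool, and stitch the results back together.

```python
# Minimal block-volume sketch: a per-block threshold filter distributed
# over a thread pool. Real platforms would use size-adaptive blocks and
# a task scheduler; here the block size is fixed for simplicity.
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def block_slices(shape, size):
    """Yield slice tuples tiling a 3D volume with cubes of side `size`."""
    for z in range(0, shape[0], size):
        for y in range(0, shape[1], size):
            for x in range(0, shape[2], size):
                yield (slice(z, z + size), slice(y, y + size), slice(x, x + size))

def parallel_threshold(vol, t, size=32, workers=4):
    """Apply a threshold to each block in parallel and reassemble."""
    slices = list(block_slices(vol.shape, size))
    out = np.empty(vol.shape, dtype=np.uint8)
    with ThreadPoolExecutor(max_workers=workers) as ex:
        for s, b in zip(slices, ex.map(lambda s: (vol[s] > t).astype(np.uint8), slices)):
            out[s] = b
    return out
```

Because each block is independent, the same decomposition scales from a multi-core workstation to a cluster by changing only how the block tasks are dispatched.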

  5. Software architecture for a distributed real-time system in Ada, with application to telerobotics

    NASA Technical Reports Server (NTRS)

    Olsen, Douglas R.; Messiora, Steve; Leake, Stephen

    1992-01-01

    The architecture and software design methodology presented here are described in the context of a telerobotic application in Ada, specifically the Engineering Test Bed (ETB), which was developed to support the Flight Telerobotic Servicer (FTS) Program at GSFC. However, the nature of the architecture is such that it has applications to any multiprocessor distributed real-time system. The ETB architecture, which is a derivation of the NASA/NBS Standard Reference Model (NASREM), defines a hierarchy for representing a telerobot system. Within this hierarchy, a module is a logical entity consisting of the software associated with a set of related hardware components in the robot system. A module comprises submodules, which are cyclically executing processes that each perform a specific set of functions. The submodules in a module can run on separate processors. The submodules in the system communicate via command/status (C/S) interface channels, which are used to send commands down and relay status back up the system hierarchy. Submodules also communicate via setpoint data links, which are used to transfer control data from one submodule to another. A submodule invokes submodule algorithms (SMAs) to perform algorithmic operations. Data that describe or model a physical component of the system are stored as objects in the World Model (WM). The WM is a system-wide distributed database that is accessible to submodules in all modules of the system for creating, reading, and writing objects.
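
The command/status channel pattern is easy to sketch outside Ada (Python here; the class and function names are invented and are not the ETB API): a parent pushes commands down one queue, and the child's cyclic executive reports status back up a second queue.

```python
# Hedged sketch of a C/S interface channel between hierarchy levels:
# commands flow down, status flows back up.
from queue import Queue

class CSChannel:
    """Command/status interface between a parent and child submodule."""
    def __init__(self):
        self.command = Queue()   # parent -> child
        self.status = Queue()    # child -> parent

def submodule_cycle(chan: CSChannel, execute):
    """One cyclic execution: take a command, act on it, report status."""
    cmd = chan.command.get()
    try:
        result = execute(cmd)
        chan.status.put(("done", result))
    except Exception as exc:
        chan.status.put(("error", str(exc)))
```

In the ETB each level's submodules run such cycles continuously, possibly on separate processors, with setpoint data links carrying control data alongside the C/S traffic.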

  6. A complete software application for automatic registration of x-ray mammography and magnetic resonance images

    SciTech Connect

    Solves-Llorens, J. A.; Rupérez, M. J.; Monserrat, C.; Lloret, M.

    2014-08-15

    Purpose: This work presents a complete and automatic software application to aid radiologists in breast cancer diagnosis. The application is a fully automated method that performs a complete registration of magnetic resonance (MR) images and x-ray (XR) images in both directions (from MR to XR and from XR to MR) and for both x-ray mammograms, craniocaudal (CC), and mediolateral oblique (MLO). This new approximation allows radiologists to mark points in the MR images and, without any manual intervention, it provides their corresponding points in both types of XR mammograms and vice versa. Methods: The application automatically segments magnetic resonance images and x-ray images using the C-Means method and the Otsu method, respectively. It compresses the magnetic resonance images in both directions, CC and MLO, using a biomechanical model of the breast that distinguishes the specific biomechanical behavior of each one of its three tissues (skin, fat, and glandular tissue) separately. It makes a projection of both compressions and registers them with the original XR images using affine transformations and nonrigid registration methods. Results: The application has been validated by two expert radiologists. This was carried out through a quantitative validation on 14 data sets in which the Euclidean distance between points marked by the radiologists and the corresponding points obtained by the application were measured. The results showed a mean error of 4.2 ± 1.9 mm for the MRI to CC registration, 4.8 ± 1.3 mm for the MRI to MLO registration, and 4.1 ± 1.3 mm for the CC and MLO to MRI registration. Conclusions: A complete software application that automatically registers XR and MR images of the breast has been implemented. The application permits radiologists to estimate the position of a lesion that is suspected of being a tumor in an imaging modality based on its position in another different modality with a clinically acceptable error. The results show that the
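
Of the two segmentation methods named, Otsu's method is fully determined by the image histogram: it picks the threshold that maximizes the between-class variance. A generic implementation (not the authors' code) looks like this:

```python
# Generic Otsu thresholding sketch: choose the gray level that best
# separates the histogram into two classes (e.g., breast vs. background
# in an x-ray mammogram).
import numpy as np

def otsu_threshold(image, bins=256):
    """Return the threshold maximizing between-class variance."""
    hist, edges = np.histogram(image, bins=bins)
    p = hist / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(p)              # class-0 probability up to each level
    mu = np.cumsum(p * centers)    # cumulative mean up to each level
    mu_t = mu[-1]                  # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * w0 - mu) ** 2 / (w0 * (1 - w0))
    sigma_b = np.nan_to_num(sigma_b)   # empty classes get zero variance
    return centers[np.argmax(sigma_b)]
```

The fuzzy C-Means step used for the MR images is iterative rather than histogram-based, but both produce the tissue masks that feed the biomechanical compression model.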

  7. A complete software application for automatic registration of x-ray mammography and magnetic resonance images.

    PubMed

    Solves-Llorens, J A; Rupérez, M J; Monserrat, C; Feliu, E; García, M; Lloret, M

    2014-08-01

    This work presents a complete and automatic software application to aid radiologists in breast cancer diagnosis. The application is a fully automated method that performs a complete registration of magnetic resonance (MR) images and x-ray (XR) images in both directions (from MR to XR and from XR to MR) and for both x-ray mammograms, craniocaudal (CC), and mediolateral oblique (MLO). This new approximation allows radiologists to mark points in the MR images and, without any manual intervention, it provides their corresponding points in both types of XR mammograms and vice versa. The application automatically segments magnetic resonance images and x-ray images using the C-Means method and the Otsu method, respectively. It compresses the magnetic resonance images in both directions, CC and MLO, using a biomechanical model of the breast that distinguishes the specific biomechanical behavior of each one of its three tissues (skin, fat, and glandular tissue) separately. It makes a projection of both compressions and registers them with the original XR images using affine transformations and nonrigid registration methods. The application has been validated by two expert radiologists. This was carried out through a quantitative validation on 14 data sets in which the Euclidean distance between points marked by the radiologists and the corresponding points obtained by the application were measured. The results showed a mean error of 4.2 ± 1.9 mm for the MRI to CC registration, 4.8 ± 1.3 mm for the MRI to MLO registration, and 4.1 ± 1.3 mm for the CC and MLO to MRI registration. A complete software application that automatically registers XR and MR images of the breast has been implemented. The application permits radiologists to estimate the position of a lesion that is suspected of being a tumor in an imaging modality based on its position in another different modality with a clinically acceptable error. The results show that the application can accelerate the

  8. Evaluation of commercial drilling and geological software for deep drilling applications

    NASA Astrophysics Data System (ADS)

    Pierdominici, Simona; Prevedel, Bernhard; Conze, Ronald; Tridec Team

    2013-04-01

    risks and costs. This procedure enables timely, efficient and accurate data access and exchange among the rig-site data acquisition system, office-based software applications and data storage. The loading of real-time data has to be quick and efficient in order to refine the model and learn lessons for the next drilling operations.

  9. RICIS Software Engineering 90 Symposium: Aerospace Applications and Research Directions Proceedings Appendices

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Papers presented at RICIS Software Engineering Symposium are compiled. The following subject areas are covered: flight critical software; management of real-time Ada; software reuse; megaprogramming software; Ada net; POSIX and Ada integration in the Space Station Freedom Program; and assessment of formal methods for trustworthy computer systems.

  10. Coordination and organization of security software process for power information application environment

    NASA Astrophysics Data System (ADS)

    Wang, Qiang

    2017-09-01

    As an important part of software engineering, the software process decides the success or failure of a software product. The design and development features of a security software process are discussed, as are the necessity and present significance of using such a process. Coordinating with the functional software, the process for security software and its testing are discussed in depth. The process includes requirement analysis, design, coding, debugging and testing, submission, and maintenance. For each phase, the paper proposes subprocesses to support software security. As an example, the paper applies the above process to the power information platform.

  11. Optimal patterns for sequentially multiple focusing in high intensity focused ultrasound and their application to thermal dose

    NASA Astrophysics Data System (ADS)

    Shim, Mun-Bo; Lee, Hyoungki; Lee, Hotaik; Park, Junho; Ahn, Minsu

    2012-11-01

    The purpose of this study is to propose a new method for multiple-focus generation that shortens overall treatment time while avoiding the formation of high-intensity regions outside the target volume. A numerical simulation of the acoustic fields produced by a 1017-element spherical-section ultrasound phased-array transducer operating at a frequency of 1.0 MHz with a 16 cm radius of curvature is performed for the proposed multiple-focus generation. The foci are partitioned into several patterns, because multiple focusing generally gives rise to grating lobes outside the three-dimensional region of interest even when the intensity gain is optimized in determining the phases and amplitudes of the excitation source vector. The optimization problem is repeatedly formulated in terms of the focal points until the multiple-focus patterns cover all the focal points. A genetic algorithm is used to select patterns without grating lobes. The resulting set of multiple-focus patterns can be applied sequentially to necrose a given volume quickly as well as safely. The proposed method might prove useful for improving the speed and safety of focused ultrasound thermal ablation. This strategy should also be effective for other transducers and for other cases of multiple-focus generation.
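
The pattern-selection step can be caricatured with a toy genetic algorithm (the "interference" model below is invented purely for illustration; the paper's fitness involves the acoustic intensity gain and grating-lobe constraints): assign foci to patterns so that no pattern contains a pair assumed to interfere.

```python
# Toy GA sketch: partition focal points into patterns while penalizing
# pairs flagged as mutually interfering (a stand-in for grating lobes).
import random

def conflicts(assign, bad_pairs):
    """Number of interfering focus pairs placed in the same pattern."""
    return sum(1 for i, j in bad_pairs if assign[i] == assign[j])

def evolve(n_foci, bad_pairs, n_patterns=3, pop=40, gens=200, seed=0):
    """Evolve an assignment of foci to patterns with few conflicts."""
    rng = random.Random(seed)
    popl = [[rng.randrange(n_patterns) for _ in range(n_foci)] for _ in range(pop)]
    for _ in range(gens):
        popl.sort(key=lambda a: conflicts(a, bad_pairs))        # best first
        survivors = popl[: pop // 2]
        children = []
        for _ in range(pop - len(survivors)):
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n_foci)                       # one-point crossover
            child = a[:cut] + b[cut:]
            child[rng.randrange(n_foci)] = rng.randrange(n_patterns)  # mutation
            children.append(child)
        popl = survivors + children
    return min(popl, key=lambda a: conflicts(a, bad_pairs))
```

The real problem replaces this toy penalty with full acoustic field simulation per candidate pattern, which is what makes a stochastic search such as a GA attractive.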

  12. Application of software engineering to development of reactor-safety codes

    SciTech Connect

    Wilburn, N P; Niccoli, L G

    1980-11-01

    As a result of the drastically increasing cost of software and the lack of an engineering approach, the technology of Software Engineering is being developed. Software Engineering provides an answer to the increasing cost of developing and maintaining software. It has been applied extensively in the business and aerospace communities and is just now being applied to the development of scientific software and, in particular, to the development of reactor safety codes at HEDL.

  13. The object-oriented development of a parallel application in protein dynamics: why we need software tools for HPCN applications

    NASA Astrophysics Data System (ADS)

    Bækdal, Lars; Joosen, Wouter; Larsen, Thomas; Kolafa, Jiri; Ovesen, Jens H.; Perram, John W.; Petersen, Henrik G.; Bywater, Robert; Ratner, Mark

    1996-08-01

    We analyse the concurrency and performance of the various types of force calculations involved in the molecular dynamics simulation of large protein or polyelectrolyte molecules. Although this analysis can in principle be used to write a meta-program to optimize load-balancing of this application on an MPP system, we argue that it is an enormous undertaking not appropriate for the computational scientist. Instead we argue that it is better to exploit research in parallel execution environments which provide automatic load-balancing for concurrent Object-Oriented applications. We also argue that use of Object-Oriented technology in the design of simulation software encapsulates the natural concurrency of the system. We illustrate this point with a discussion of the constraint force calculation for a polymeric molecule.

  14. Information Engineering and the Information Engineering Facility versus Rapid Application Development and Focus

    DTIC Science & Technology

    1992-12-01

    business requirements that are resilient and responsive to continuous change and improvement. Business re-engineering focuses on the strategic vision and... Supply department of the Naval Postgraduate School (NPS). Future possibilities for applications include integrating the curricular offices, travel and... operations (supply, public works, and financial and personnel resources) are directed by military personnel with predominantly civilian staffs. The

  15. Total lithography system based on a new application software platform enabling smart scanner management

    NASA Astrophysics Data System (ADS)

    Kono, Hirotaka; Masaki, Kazuo; Matsuyama, Tomoyuki; Wakamoto, Shinji; Park, Seemoon; Sugihara, Taro; Shibazaki, Yuichi

    2015-03-01

    Along with device shrinkage, higher accuracy will continuously be required of photo-lithography tools in order to enhance on-product yield. To achieve higher yield, advanced photo-lithography tools must be equipped with sophisticated tuning knobs and with software flexible enough to be applied per layer. This means photo-lithography tools must be capable of handling many types of sub-recipes and parameters simultaneously. To make managing such a large amount of data easy and the setup of lithography tools smooth, we have developed a total lithography system called Litho Turnkey Solution, based on a new software application platform which we call Plug and Play Manager (PPM). PPM has its own graphical user interface, which enables total management of various data: recipes, sub-recipes, tuning parameters, measurement results, and so on. Through PPM, parameter generation by intelligent applications such as CDU/Overlay tuning tools can easily be implemented. In addition, PPM is also linked to metrology tools and the customer's host computer, which enables data-flow automation. Based on measurement data received from the metrology tools, PPM calculates correction parameters and sends them to the scanners automatically. This scheme makes calibration feedback loops possible. It should be noted that the abovementioned functions run on the same platform through a user-friendly interface. This leads to smart scanner management and improved usability. In this paper, we demonstrate the latest development status of Nikon's total lithography solution based on PPM, describe details of each application, and provide supporting data for the accuracy and usability of the system.

  16. Mercury: Reusable software application for Metadata Management, Data Discovery and Access

    NASA Astrophysics Data System (ADS)

    Devarakonda, Ranjeet; Palanisamy, Giri; Green, James; Wilson, Bruce E.

    2009-12-01

    simple, keyword, spatial and temporal searches across these metadata sources. The search user interface software has two API categories; a common core API which is used by all the Mercury user interfaces for querying the index and a customized API for project specific user interfaces. For our work in producing a reusable, portable, robust, feature-rich application, Mercury received a 2008 NASA Earth Science Data Systems Software Reuse Working Group Peer-Recognition Software Reuse Award. The new Mercury system is based on a Service Oriented Architecture and effectively reuses components for various services such as Thesaurus Service, Gazetteer Web Service and UDDI Directory Services. The software also provides various search services including: RSS, Geo-RSS, OpenSearch, Web Services and Portlets, integrated shopping cart to order datasets from various data centers (ORNL DAAC, NSIDC) and integrated visualization tools. Other features include: Filtering and dynamic sorting of search results, book-markable search results, save, retrieve, and modify search criteria.

  17. Adaptation of Control Center Software to Commercial Real-Time Display Applications

    NASA Technical Reports Server (NTRS)

    Collier, Mark D.

    1994-01-01

    NASA-Marshall Space Flight Center (MSFC) is currently developing an enhanced Huntsville Operations Support Center (HOSC) system designed to support multiple spacecraft missions. The Enhanced HOSC is based upon a distributed computing architecture using graphic workstation hardware and industry-standard software including POSIX, X Windows, Motif, TCP/IP, and ANSI C. Southwest Research Institute (SwRI) is currently developing a prototype of the Display Services application for this system. Display Services provides the capability to generate and operate real-time data-driven graphic displays. This prototype is a highly functional application designed to allow system end users to easily generate complex data-driven displays. The prototype is easy to use, flexible, highly functional, and portable. Although this prototype is being developed for NASA-MSFC, the general-purpose real-time display capability can be reused in similar mission and process control environments. This includes any environment depending heavily upon real-time data acquisition and display. Reuse of the prototype will be a straightforward transition because the prototype is portable, is designed to add new display types easily, has a user interface that is separated from the application code, and is largely independent of the specifics of NASA-MSFC's system. Reuse of this prototype in other environments is an excellent alternative to creating a new custom application or, for environments with a large number of users, to purchasing a COTS package.

  18. Electrically tunable-focusing and polarizer-free liquid crystal lenses for ophthalmic applications.

    PubMed

    Lin, Yi-Hsin; Chen, Hung-Shan

    2013-04-22

    An electrically tunable-focusing and polarizer-free liquid crystal (LC) lens for ophthalmic applications is demonstrated. The optical mechanism of an LC lens used in the human eye system is introduced. A polarizer-free LC lens for myopia and presbyopia based on artificial accommodation is demonstrated. The continuously tunable focusing properties of LC lenses make them practical for people with different vision conditions. The concept we propose can also be applied to other types of lenses as long as their focusing properties are tunable. The concept in this paper can also be extended to imaging and projection systems, such as cameras in cell phones, pico projectors, and endoscopes.
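
A standard thin-lens approximation for gradient-index LC lenses (a textbook relation, not a formula quoted from this paper) ties the focal length f to the aperture radius r, the LC cell gap d, and the voltage-controlled effective birefringence difference δn: f = r²/(2·d·δn). Tuning δn electrically therefore tunes the focus continuously.

```python
# Back-of-envelope sketch of the thin GRIN LC lens approximation
# f = r^2 / (2 * d * delta_n); all lengths in metres.

def lc_focal_length(aperture_radius, cell_gap, delta_n):
    """Focal length of a parabolic-phase LC lens in the thin-lens limit."""
    return aperture_radius ** 2 / (2 * cell_gap * delta_n)

# Example: a 1 mm aperture radius, 50 um cell gap, and delta_n = 0.2
# gives a 5 cm focal length; halving delta_n doubles f.
```

The inverse dependence on δn is what makes the "artificial accommodation" scheme work: a small change in applied voltage sweeps the lens power over the range needed for different viewing distances.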

  19. Design and Application of the Reconstruction Software for the BaBar Calorimeter

    SciTech Connect

    Strother, Philip David; /Imperial Coll., London

    2006-07-07

    The BaBar high energy physics experiment will be in operation at the PEP-II asymmetric e⁺e⁻ collider in Spring 1999. The primary purpose of the experiment is the investigation of CP violation in the neutral B meson system. The electromagnetic calorimeter forms a central part of the experiment, and new techniques are employed in data acquisition and reconstruction software to maximize the capability of this device. The use of a matched digital filter in the feature extraction in the front-end electronics is presented. The performance of the filter in the presence of the expected high levels of soft photon background from the machine is evaluated. The high luminosity of the PEP-II machine and the demands on the precision of the calorimeter require reliable software that allows for increased physics capability. BaBar has selected C++ as its primary programming language and object-oriented analysis and design as its coding paradigm. The application of this technology to the reconstruction software for the calorimeter is presented. The design of the systems for clustering, cluster division, track matching, particle identification and global calibration is discussed, with emphasis on the provisions in the design for increased physics capability as levels of understanding of the detector increase. The CP-violating channel B⁰ → J/ψ K⁰_S has been studied in the two-lepton, two-π⁰ final state. The contribution of this channel to the evaluation of the angle sin 2β of the unitarity triangle is compared to that from the charged pion final state. An error of 0.34 on this quantity is expected after one year of running at design luminosity.
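
The matched digital filter used for feature extraction can be illustrated generically (a NumPy sketch, not the BaBar front-end firmware): correlating the sampled waveform with the known pulse shape maximizes signal-to-noise and localizes the pulse in time.

```python
# Generic matched-filter sketch: cross-correlate a waveform with the
# known pulse shape; the correlation peak marks the pulse position.
import numpy as np

def matched_filter(signal, pulse):
    """Cross-correlation of the waveform with the pulse template."""
    return np.convolve(signal, pulse[::-1], mode="valid")

def pulse_start(signal, pulse):
    """Sample index where the pulse template best aligns."""
    return int(np.argmax(matched_filter(signal, pulse)))
```

In a calorimeter front end the same correlation also yields the pulse amplitude (the peak value, suitably normalized), which is the quantity digitized per crystal.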

  20. The Environment for Application Software Integration and Execution (EASIE) version 1.0. Volume 1: Executive overview

    NASA Technical Reports Server (NTRS)

    Rowell, Lawrence F.; Davis, John S.

    1989-01-01

    The Environment for Application Software Integration and Execution (EASIE) provides a methodology and a set of software utility programs to ease the task of coordinating engineering design and analysis codes. EASIE was designed to meet the needs of conceptual design engineers that face the task of integrating many stand-alone engineering analysis programs. Using EASIE, programs are integrated through a relational database management system. Volume 1, Executive Overview, gives an overview of the functions provided by EASIE and describes their use. Three operational design systems based upon the EASIE software are briefly described.

  1. ADFNE: Open source software for discrete fracture network engineering, two and three dimensional applications

    NASA Astrophysics Data System (ADS)

    Fadakar Alghalandis, Younes

    2017-05-01

    A rapidly growing field, discrete fracture network engineering (DFNE) has already attracted many talents from diverse disciplines in academia and industry around the world to challenge difficult problems related to mining, geothermal, civil, oil and gas, water and many other projects. Although a few commercial software packages provide some of the functionality fundamental to DFNE, their cost, closed-code (black box) distribution, and hence limited programmability and tractability encouraged us to respond to this rising demand with a new solution. This paper introduces an open source, comprehensive software package for stochastic modeling of fracture networks in two and three dimensions in a discrete formulation. Functionalities included are geometric modeling (e.g., complex polygonal fracture faces, and utilizing directional statistics), simulations, characterizations (e.g., intersection, clustering and connectivity analyses) and applications (e.g., fluid flow). The package is completely written in the Matlab scripting language. Significant efforts have been made to bring maximum flexibility to the functions in order to solve problems in both two and three dimensions in an easy and unified way that is suitable for beginner, advanced and experienced users.
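
A minimal two-dimensional analogue of the package's geometric core can be sketched as follows (Python rather than Matlab; the names are invented and are not ADFNE's API): stochastic generation of line-segment fractures, plus the segment-intersection test that underlies intersection and connectivity analysis.

```python
# 2D DFN sketch: fractures as random line segments in the unit square,
# with a proper segment-intersection test for connectivity analysis.
import math
import random

def make_fractures(n, mean_len=0.3, seed=1):
    """Generate n fractures with uniform random centers and orientations."""
    rng = random.Random(seed)
    fracs = []
    for _ in range(n):
        cx, cy = rng.random(), rng.random()
        ang = rng.uniform(0, math.pi)
        h = mean_len / 2
        fracs.append(((cx - h * math.cos(ang), cy - h * math.sin(ang)),
                      (cx + h * math.cos(ang), cy + h * math.sin(ang))))
    return fracs

def intersects(s1, s2):
    """Proper segment-segment intersection via orientation tests."""
    (ax, ay), (bx, by) = s1
    (cx, cy), (dx, dy) = s2
    def cross(ox, oy, px, py, qx, qy):
        return (px - ox) * (qy - oy) - (py - oy) * (qx - ox)
    d1 = cross(cx, cy, dx, dy, ax, ay)
    d2 = cross(cx, cy, dx, dy, bx, by)
    d3 = cross(ax, ay, bx, by, cx, cy)
    d4 = cross(ax, ay, bx, by, dx, dy)
    return d1 * d2 < 0 and d3 * d4 < 0
```

Treating fractures as nodes and intersections as edges turns clustering and connectivity (percolation) analysis into standard graph problems; the 3D case replaces segments with polygonal faces but follows the same pattern.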

  2. Igpet software for modeling igneous processes: examples of application using the open educational version

    NASA Astrophysics Data System (ADS)

    Carr, Michael J.; Gazel, Esteban

    2016-09-01

    We provide here an open version of the Igpet software, called t-Igpet to emphasize its application for teaching and research in forward modeling of igneous geochemistry. There are three programs: a norm utility; a petrologic mixing program using least squares; and Igpet, a graphics program that includes many forms of numerical modeling. Igpet is a multifaceted tool that provides the following basic capabilities: igneous rock identification using the IUGS (International Union of Geological Sciences) classification and several supplementary diagrams; tectonic discrimination diagrams; pseudo-quaternary projections; least squares fitting of lines, polynomials and hyperbolae; magma mixing using two endmembers; histograms, x-y plots, ternary plots and spider-diagrams. The advanced capabilities of Igpet are multi-element mixing and magma evolution modeling. Mixing models are particularly useful for understanding the isotopic variations in rock suites that evolved by mixing different sources. The important melting models include batch melting, fractional melting and aggregated fractional melting. Crystallization models include equilibrium and fractional crystallization and AFC (assimilation and fractional crystallization). Theses, reports and proposals concerning igneous petrology are improved by numerical modeling. For reviewed publications, some elements of modeling are practically a requirement. Our intention in providing this software is to facilitate improved communication and lower entry barriers to research, especially for students.
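
The least-squares mixing calculation can be sketched with NumPy (a generic formulation, not Igpet's own code): solve for the endmember proportions that best reproduce a sample's composition across the measured oxides.

```python
# Generic petrologic mixing sketch: least-squares proportions of
# endmember compositions that reproduce an observed sample.
import numpy as np

def mixing_proportions(endmembers, sample):
    """Least-squares endmember proportions, renormalized to sum to 1.

    endmembers: (n_oxides, n_endmembers) array of compositions;
    sample:     (n_oxides,) observed composition."""
    x, *_ = np.linalg.lstsq(endmembers, sample, rcond=None)
    return x / x.sum()
```

With more oxides than endmembers the system is overdetermined, and the residual of the fit is itself diagnostic: a large residual suggests the chosen endmembers cannot explain the sample by mixing alone.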

  3. A New Digital Image Correlation Software for Displacements Field Measurement in Structural Applications

    NASA Astrophysics Data System (ADS)

    Ravanelli, R.; Nascetti, A.; Di Rita, M.; Belloni, V.; Mattei, D.; Nisticó, N.; Crespi, M.

    2017-07-01

    Recently, there has been a growing interest in studying non-contact techniques for strain and displacement measurement. Within photogrammetry, Digital Image Correlation (DIC) has received particular attention thanks to recent advances in low-cost, high-resolution digital cameras, computer power and memory storage. DIC is indeed an optical technique able to measure full-field displacements and strain by comparing digital images of the surface of a material sample at different stages of deformation, and thus can play a major role in structural monitoring applications. For all these reasons, a free and open-source 2D DIC software package, named py2DIC, was developed at the Geodesy and Geomatics Division of DICEA, University of Rome La Sapienza. Completely written in Python, the software is based on the template matching method and computes the displacement and strain fields. The potentialities of py2DIC were evaluated by processing the images captured during a tensile test performed in the Lab of Structural Engineering, where three different Glass Fiber Reinforced Polymer samples were subjected to controlled tension by means of a universal testing machine. The results, compared with the values independently measured by several strain gauges fixed on the samples, demonstrate the possibility of successfully characterizing the deformation mechanism of the investigated material. py2DIC is indeed able to detect displacements at the level of a few microns, in reasonable agreement with the reference, both in terms of displacements (again, within a few microns on average) and Poisson's ratio.
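
The template-matching core of DIC can be sketched with zero-normalized cross-correlation (a generic brute-force version for illustration, not py2DIC's implementation): a subset of the reference image is located in the deformed image, and the shift of its best match is the local displacement.

```python
# DIC template-matching sketch: zero-normalized cross-correlation (ZNCC)
# search for the integer-pixel displacement of a reference subset.
import numpy as np

def zncc(a, b):
    """ZNCC score in [-1, 1]; invariant to brightness offset and scale."""
    a = a - a.mean()
    b = b - b.mean()
    d = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / d if d else 0.0

def match(template, image):
    """Return the (row, col) position in `image` maximizing ZNCC."""
    th, tw = template.shape
    H, W = image.shape
    best, pos = -2.0, (0, 0)
    for r in range(H - th + 1):
        for c in range(W - tw + 1):
            s = zncc(template, image[r:r + th, c:c + tw])
            if s > best:
                best, pos = s, (r, c)
    return pos
```

Production DIC codes refine this integer-pixel match to sub-pixel accuracy (e.g., by interpolating the correlation surface), which is what makes micron-level displacement measurement possible.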

  4. Igpet software for modeling igneous processes: examples of application using the open educational version

    NASA Astrophysics Data System (ADS)

    Carr, Michael J.; Gazel, Esteban

    2017-04-01

    We provide here an open version of the Igpet software, called t-Igpet to emphasize its application for teaching and research in forward modeling of igneous geochemistry. There are three programs: a norm utility; a petrologic mixing program using least squares; and Igpet, a graphics program that includes many forms of numerical modeling. Igpet is a multifaceted tool that provides the following basic capabilities: igneous rock identification using the IUGS (International Union of Geological Sciences) classification and several supplementary diagrams; tectonic discrimination diagrams; pseudo-quaternary projections; least squares fitting of lines, polynomials and hyperbolae; magma mixing using two endmembers; histograms, x-y plots, ternary plots and spider-diagrams. The advanced capabilities of Igpet are multi-element mixing and magma evolution modeling. Mixing models are particularly useful for understanding the isotopic variations in rock suites that evolved by mixing different sources. The important melting models include batch melting, fractional melting and aggregated fractional melting. Crystallization models include equilibrium and fractional crystallization and AFC (assimilation and fractional crystallization). Theses, reports and proposals concerning igneous petrology are improved by numerical modeling. For reviewed publications, some elements of modeling are practically a requirement. Our intention in providing this software is to facilitate improved communication and lower entry barriers to research, especially for students.

  5. Managing MDO Software Development Projects

    NASA Technical Reports Server (NTRS)

    Townsend, J. C.; Salas, A. O.

    2002-01-01

    Over the past decade, the NASA Langley Research Center developed a series of 'grand challenge' applications demonstrating the use of parallel and distributed computation and multidisciplinary design optimization. All but the last of these applications were focused on the high-speed civil transport vehicle; the final application focused on reusable launch vehicles. Teams of discipline experts developed these multidisciplinary applications by integrating legacy engineering analysis codes. As teams became larger and the application development became more complex with increasing levels of fidelity and numbers of disciplines, the need for applying software engineering practices became evident. This paper briefly introduces the application projects and then describes the approaches taken in project management and software engineering for each project; lessons learned are highlighted.

  6. NASA's Software Safety Standard

    NASA Technical Reports Server (NTRS)

    Ramsay, Christopher M.

    2007-01-01

    NASA relies more and more on software to control, monitor, and verify its safety-critical systems, facilities, and operations. Since the 1960s, hardly a spacecraft has been launched that does not carry a computer providing command and control services. There have been recent incidents in which software played a role in high-profile mission failures and hazardous incidents; for example, the Mars Orbiter, Mars Polar Lander, DART (Demonstration of Autonomous Rendezvous Technology), and MER (Mars Exploration Rover) Spirit anomalies were all caused by, or contributed to by, software. The Mission Control Centers for the Shuttle, ISS, and unmanned programs are highly dependent on software for data displays, analysis, and mission planning. Despite this growing dependence on software control and monitoring, there has been little consistent application of software safety practices and methodology to NASA projects with safety-critical software. Meanwhile, academia and private industry have been stepping forward with procedures and standards for safety-critical systems and software, for example Dr. Nancy Leveson's book Safeware: System Safety and Computers. The NASA Software Safety Standard, originally published in 1997, was widely ignored due to its complexity and poor organization; it also focused on concepts rather than definite procedural requirements organized around a software project lifecycle. Led by the NASA Headquarters Office of Safety and Mission Assurance, the NASA Software Safety Standard has recently undergone a significant update. The new standard provides procedures and guidelines for evaluating a project for safety criticality and then lays out the minimum project lifecycle requirements to assure that the software is created, operated, and maintained in the safest possible manner.
This update of the standard clearly delineates the minimum set of software safety requirements for a project without detailing the implementation for those

  8. Summary of the CSRI Workshop on Combinatorial Algebraic Topology (CAT): Software, Applications, & Algorithms

    SciTech Connect

    Bennett, Janine Camille; Day, David Minot; Mitchell, Scott A.

    2009-11-20

    This report summarizes the Combinatorial Algebraic Topology: software, applications & algorithms workshop (CAT Workshop). The workshop was sponsored by the Computer Science Research Institute of Sandia National Laboratories. It was organized by CSRI staff members Scott Mitchell and Shawn Martin. It was held in Santa Fe, New Mexico, August 29-30. The CAT Workshop website has links to some of the talk slides and other information, http://www.cs.sandia.gov/CSRI/Workshops/2009/CAT/index.html. The purpose of the report is to summarize the discussions and recap the sessions. There is a special emphasis on technical areas that are ripe for further exploration, and the plans for follow-up amongst the workshop participants. The intended audiences are the workshop participants, other researchers in the area, and the workshop sponsors.

  9. Instruction set extension for software defined radio in mobile GNSS applications

    NASA Astrophysics Data System (ADS)

    Marcinek, Krzysztof; Pleskacz, Witold A.

    2013-07-01

    The variety of currently operational GNSS frequencies and the rapid development of new navigation satellite systems have resulted in the need for interoperability and compatibility. Recent state-of-the-art integrated GNSS front-ends are capable of simultaneously supporting multiple navigation systems [1, 2]. A unification of the signal-processing part is also possible and, moreover, desirable. This paper introduces a proposal for a universal instruction set extension (ISE) for accelerating signal processing in SDR (Software Defined Radio) based GNSS applications. The results of this work were implemented and tested on AGATE [3], a chip-multithreading general-purpose processor, running on the Xilinx Virtex-6 ML605 FPGA evaluation board.

  10. ELAS - A geobased information system that is transferable to several computers. [Earth resources Laboratory Applications Software

    NASA Technical Reports Server (NTRS)

    Whitley, S. L.; Pearson, R. W.; Seyfarth, B. R.; Graham, M. H.

    1981-01-01

    In the early years of remote sensing, emphasis was placed on processing and analyzing data from a single multispectral sensor, such as the Landsat Multispectral Scanner System (MSS). However, attempts to use the data for resource management revealed many deficiencies in single data sets, establishing a need to geographically reference the MSS data and to register data from disparate sources with it. Technology transfer activities have also required system concepts that can be easily moved to computers of different types in other organizations. ELAS (Earth Resources Laboratory Applications Software), a geographically based information system, was developed to meet these needs. ELAS accepts data from a variety of sources and contains programs to geographically reference the data to the Universal Transverse Mercator grid. One of the primary functions of ELAS is to produce a surface cover map.

  11. A validated software application to measure fiber organization in soft tissue.

    PubMed

    Morrill, Erica E; Tulepbergenov, Azamat N; Stender, Christina J; Lamichhane, Roshani; Brown, Raquel J; Lujan, Trevor J

    2016-12-01

    The mechanical behavior of soft connective tissue is governed by a dense network of fibrillar proteins in the extracellular matrix. Characterization of this fibrous network requires the accurate extraction of descriptive structural parameters from imaging data, including fiber dispersion and mean fiber orientation. Common methods to quantify fiber parameters include fast Fourier transforms (FFT) and structure tensors; however, information is limited on the accuracy of these methods. In this study, we compared these two methods using test images of fiber networks with varying topology. The FFT method with a band-pass filter was the most accurate, with an error of [Formula: see text] in measuring mean fiber orientation and an error of [Formula: see text] in measuring fiber dispersion in the test images. The accuracy of the structure tensor method was approximately five times worse than the FFT band-pass method when measuring fiber dispersion. A free software application, FiberFit, was then developed that utilizes an FFT band-pass filter to fit fiber orientations to a semicircular von Mises distribution. FiberFit was used to measure collagen fibril organization in confocal images of bovine ligament at magnifications of [Formula: see text] and [Formula: see text]. Grayscale conversion prior to FFT analysis gave the most accurate results, with errors of [Formula: see text] for mean fiber orientation and [Formula: see text] for fiber dispersion when measuring confocal images at [Formula: see text]. By developing and validating a software application that facilitates the automated analysis of fiber organization, this study can help advance a mechanistic understanding of collagen networks and help clarify the mechanobiology of soft tissue remodeling and repair.
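The core measurement FiberFit automates, mean orientation and dispersion of axial (period-π) fiber angles, can be illustrated with circular statistics. The sketch below is a simplified stand-in for the FFT band-pass plus von Mises fitting pipeline described above: it doubles the angles (so that θ and θ+π count as the same fiber direction), computes the circular mean and the resultant length R, and halves the mean back; R near 1 corresponds to low dispersion.

```python
import numpy as np

def mean_orientation(theta):
    """Mean orientation and resultant length R for axial data (period pi):
    double the angles, take the circular mean, halve the result.
    R close to 1 means tightly aligned fibers (low dispersion)."""
    C = np.cos(2 * theta).mean()
    S = np.sin(2 * theta).mean()
    mu = 0.5 * np.arctan2(S, C)
    R = np.hypot(C, S)
    return mu, R

# Synthetic fiber angles clustered around 30 degrees with 5-degree noise.
rng = np.random.default_rng(1)
theta = np.deg2rad(30) + rng.normal(0.0, np.deg2rad(5), 1000)
mu, R = mean_orientation(theta)
print(round(float(np.rad2deg(mu)), 1))  # close to 30.0
```

In a full pipeline, the angle samples would come from the orientation of FFT energy in a band-pass-filtered image patch, and R (or the fitted von Mises concentration) quantifies fiber dispersion.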

  12. Collision of Physics and Software in the Monte Carlo Application Toolkit (MCATK)

    SciTech Connect

    Sweezy, Jeremy Ed

    2016-01-21

    The topic is presented in a series of slides organized as follows: MCATK overview, development strategy, available algorithms, problem modeling (sources, geometry, data, tallies), parallelism, miscellaneous tools/features, example MCATK application, recent areas of research, and summary and future work. MCATK is a C++ component-based Monte Carlo neutron-gamma transport software library with continuous-energy neutron and photon transport. Designed to build specialized applications and to provide new functionality in existing general-purpose Monte Carlo codes like MCNP, it reads ACE-formatted nuclear data generated by NJOY. The motivation behind MCATK was to reduce costs. MCATK physics includes continuous-energy neutron and gamma transport with multi-temperature treatment, static eigenvalue (keff and α) algorithms, a time-dependent algorithm, and fission chain algorithms. MCATK geometry includes mesh geometries and solid-body geometries. MCATK provides verified, unit-tested Monte Carlo components, flexibility in Monte Carlo application development, and numerous tools such as geometry and cross-section plotters.

  13. Energy-Based Adaptive Focusing of waves: Application to Ultrasonic Transcranial Therapy

    NASA Astrophysics Data System (ADS)

    Herbert, E.; Pernot, M.; Montaldo, G.; Tanter, M.; Fink, M.

    2009-04-01

    We propose a general concept of adaptive focusing through complex media based on the estimation or measurement of the wave energy density at the desired focal spot. As it does not require knowledge of phase information, this technique has many potential applications in acoustics and optics for focusing light through diffusive media. We present here the application of this technique to the problem of ultrasonic aberration correction for HIFU treatments. The estimation of wave energy density is based on maximization of the ultrasound radiation force, using a 64-element array. A spatial coded-excitation method is developed using ad hoc virtual transducers that include all the elements for each emission. The radiation force is maximized by optimizing the displacement of a small target at the focus; the target displacement is measured using ultrasound pulse-echo on the same elements. A method using spatial coded excitation is developed to estimate the phase and amplitude aberration from the target displacement. We validated this method for phase aberrations up to 2π. After phase correction, the pressure field was measured using a needle hydrophone, showing that the acoustic intensity at the focus is restored through very large aberrations. Basic experiments for brain HIFU treatment are presented: optimal transcranial adaptive focusing is performed using a limited number of short ultrasonic radiation force pushes.
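A much-simplified numerical illustration of the idea (not the paper's coded-excitation algorithm): treat the focal intensity, standing in for the radiation-force or displacement signal, as a black box that can only be measured, and recover the unknown per-element aberration phases of a 64-element array by coordinate ascent, one element at a time. All names and parameters here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 64                                  # array elements
aberr = rng.uniform(0.0, 2 * np.pi, N)  # unknown aberrator phase per element

def focal_intensity(tx_phases):
    """|coherent sum|^2 at the focus -- a stand-in for the radiation-force /
    target-displacement signal maximized in the paper."""
    return abs(np.exp(1j * (tx_phases + aberr)).sum()) ** 2

phases = np.zeros(N)
candidates = np.linspace(0.0, 2 * np.pi, 16, endpoint=False)
for _ in range(2):                      # two sweeps of coordinate ascent
    for n in range(N):                  # optimize one element at a time
        trial = phases.copy()
        scores = []
        for p in candidates:
            trial[n] = p
            scores.append(focal_intensity(trial))
        phases[n] = candidates[int(np.argmax(scores))]

# Fraction of the ideal coherent focus recovered (1.0 = perfect correction):
print(focal_intensity(phases) / N**2 > 0.9)  # -> True
```

The coded-excitation scheme in the paper serves the same end with far fewer emissions than this naive per-element search.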

  14. Principles and applications of a dynamically focused phased array real time ultrasound system.

    PubMed

    Morgan, C L; Trought, W S; Clark, W M; Von Ramm, O T; Thurstone, F L

    1978-12-01

    The physical principles and clinical applications of a high-resolution, dynamically focused, phased-array real-time ultrasound system are described. Advantages of the real-time technique include rapid survey capability, efficient selection of an appropriate tomographic plane, identification of pulsating structures, and dynamic studies. The capabilities of a high-resolution phased array with extended dynamic focusing to a range of 15-20 cm are demonstrated in vascular, abdominal, and obstetric imaging. Appropriate clinical examples showing normal and pathological anatomy are presented, and comparisons with conventional B-scans are illustrated.

  15. How Precise Are Preinterventional Measurements Using Centerline Analysis Applications? Objective Ground Truth Evaluation Reveals Software-Specific Centerline Characteristics.

    PubMed

    Hoegen, Philipp; Wörz, Stefan; Müller-Eschner, Matthias; Geisbüsch, Philipp; Liao, Wei; Rohr, Karl; Schmitt, Matthias; Rengier, Fabian; Kauczor, Hans-Ulrich; von Tengg-Kobligk, Hendrik

    2017-08-01

    To evaluate different centerline analysis applications against objective ground truth from realistic aortic aneurysm phantoms with precisely defined geometry and centerlines, overcoming the problem of unknown true dimensions in previously published in vivo validation studies. Three aortic phantoms were created using computer-aided design (CAD) software and a 3-dimensional (3D) printer. Computed tomography angiograms (CTAs) of the phantoms and of 3 patients were analyzed with 3 clinically approved software applications and 1 research application. The 3D centerline coordinates, intraluminal diameters, and lengths were validated against the CAD ground truth using a dedicated evaluation software platform. The 3D centerline position mean error ranged from 0.7±0.8 to 2.9±2.5 mm between tested applications. All applications calculated centerlines significantly different from ground truth. Diameter mean errors varied from 0.5±1.2 to 1.1±1.0 mm among 3 applications, but exceeded 8.0±11.0 mm with one application due to an unsteady distortion of luminal dimensions along the centerline. All tested commercially available software tools systematically underestimated centerline total lengths by -4.6±0.9 mm to -10.4±4.3 mm (maximum error -14.6 mm). Applications with the highest 3D centerline accuracy yielded the most precise diameter and length measurements. One clinically approved application did not provide reproducible centerline-based analysis results, while another approved application showed length errors that might influence stent-graft choice and procedure success. The variety and specific characteristics of endovascular aneurysm repair planning software tools require scientific evaluation and user awareness.

  16. RICIS Software Engineering 90 Symposium: Aerospace Applications and Research Directions Proceedings

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Papers presented at RICIS Software Engineering Symposium are compiled. The following subject areas are covered: synthesis - integrating product and process; Serpent - a user interface management system; prototyping distributed simulation networks; and software reuse.

  17. Dental application of novel finite element analysis software for three-dimensional finite element modeling of a dentulous mandible from its computed tomography images.

    PubMed

    Nakamura, Keiko; Tajima, Kiyoshi; Chen, Ker-Kong; Nagamatsu, Yuki; Kakigawa, Hiroshi; Masumi, Shin-ich

    2013-12-01

    This study focused on the application of novel finite-element analysis software for constructing a finite-element model from the computed tomography data of a human dentulous mandible. The finite-element model is necessary for evaluating the mechanical response of the alveolar part of the mandible resulting from the occlusal force applied to the teeth during biting. Commercially available, patient-specific, computed tomography-based finite-element analysis software was used on its own for the entire workflow, starting from extraction of the computed tomography data. The mandibular bone with teeth was extracted from the original images; both the enamel and the dentin were extracted after image processing, and the periodontal ligament was created from the segmented dentin. The constructed finite-element model was reasonably accurate, using a total of 234,644 nodes, 1,268,784 tetrahedral elements, and 40,665 shell elements. The elastic moduli of the heterogeneous mandibular bone were determined from the bone density data of the computed tomography images. The results suggest that the software applied in this study is both useful and powerful for creating an accurate three-dimensional finite-element model of a dentulous mandible from computed tomography data without the need for any other software.

  18. Teacher-Designed Software for Interactive Linear Equations: Concepts, Interpretive Skills, Applications & Word-Problem Solving.

    ERIC Educational Resources Information Center

    Lawrence, Virginia

    No longer just a user of commercial software, the 21st-century teacher is a designer of interactive software based on theories of learning. This software, a comprehensive study of straight-line equations, enhances conceptual understanding, sketching, graphic-interpretive, and word-problem-solving skills, as well as making connections to real-life and…

  19. The Evolution of Software Pricing: From Box Licenses to Application Service Provider Models.

    ERIC Educational Resources Information Center

    Bontis, Nick; Chung, Honsan

    2000-01-01

    Describes three different pricing models for software. Findings of this case study support the proposition that software pricing is a complex and subjective process. The key determinant of alignment between vendor and user is the nature of value in the software to the buyer. This value proposition may range from increased cost reduction to…

  20. Application issues in the use of depth from (de)focus analysis methods

    NASA Astrophysics Data System (ADS)

    Daneshpanah, M.; Abramovich, G.; Harding, K.; Vemury, A.

    2011-06-01

    Recovering 3D object information by analyzing image focus (or defocus) has been shown to be a potential tool in situations where only a single viewing point is possible. Precise modeling and manipulation of imaging-system parameters (e.g., depth of field, modulation transfer function, and sensor characteristics), as well as lighting conditions and object surface characteristics, are critical for the effectiveness of such methods; sub-optimal performance results when one or more of these parameters are dictated by other factors. In this paper, we discuss the implicit requirements imposed by the most common depth from focus/defocus (DFF/DFD) analysis methods and offer related application considerations. We also describe how a priori information about the objects of interest can be used to improve performance in realistic applications of this technology.
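A minimal depth-from-focus sketch (illustrative only; the paper discusses the requirements of such methods rather than one specific implementation): compute a per-pixel focus measure, here the squared Laplacian response, for every slice of a focal stack and take the argmax over slices as the depth index.

```python
import numpy as np

def laplacian(img):
    """5-point discrete Laplacian with zero padding at the borders."""
    p = np.pad(img, 1)
    return (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]
            - 4.0 * p[1:-1, 1:-1])

def depth_from_focus(stack):
    """Per-pixel index of the sharpest slice, using the squared
    Laplacian response as the focus measure."""
    measures = np.stack([laplacian(s) ** 2 for s in stack])
    return measures.argmax(axis=0)

def box_blur(img):
    """3x3 box blur standing in for defocus (square images assumed)."""
    p = np.pad(img, 1, mode='edge')
    n = img.shape[0]
    return sum(p[i:i + n, j:j + n] for i in range(3) for j in range(3)) / 9.0

# Synthetic focal stack: slice 1 is in focus, slices 0 and 2 are defocused.
rng = np.random.default_rng(3)
sharp = rng.random((32, 32))
stack = [box_blur(box_blur(sharp)), sharp, box_blur(sharp)]
idx = depth_from_focus(stack)
print(round(float((idx == 1).mean()), 2))  # large majority of pixels pick slice 1
```

The sketch makes the paper's point concrete: the focus measure only discriminates where the surface has texture and the optics produce measurably different blur between slices, which is exactly where depth of field, MTF, lighting, and surface characteristics enter.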

  1. Supervised Semi-Automated Data Analysis Software for Gas Chromatography / Differential Mobility Spectrometry (GC/DMS) Metabolomics Applications.

    PubMed

    Peirano, Daniel J; Pasamontes, Alberto; Davis, Cristina E

    2016-09-01

    Modern differential mobility spectrometers (DMS) produce complex and multi-dimensional data streams that allow for near-real-time or post-hoc chemical detection in a variety of applications. An active area of interest for this technology is metabolite monitoring for biological applications, and these data sets regularly have unique technical and data analysis end-user requirements. While there are initial publications on how investigators have individually processed and analyzed their DMS metabolomic data, there are no user-ready commercial or open source software packages that are easily used for this purpose. We have created custom software uniquely suited to analyze gas chromatography / differential mobility spectrometry (GC/DMS) data from biological sources. Here we explain the implementation of the software, describe the user features that are available, and provide an example of how this software functions using a previously published data set. The software is compatible with many commercial or home-made DMS systems. Because the software is versatile, it can also potentially be used for other similarly structured data sets, such as GC/GC and other IMS modalities.

  2. In-Situ Real-Time Focus Detection during Laser Processing Using Double-Hole Masks and Advanced Image Sensor Software

    PubMed Central

    Hoang, Phuong Le; Ahn, Sanghoon; Kim, Jeng-o; Kang, Heeshin; Noh, Jiwhan

    2017-01-01

    In modern high-intensity ultrafast laser processing, detecting the focal position of the working laser beam, at which the intensity is the highest and the beam diameter is the lowest, and immediately locating the target sample at that point are challenging tasks. A system that allows in-situ real-time focus determination and fabrication using a high-power laser has been in high demand among both engineers and scientists. Conventional techniques require the complicated mathematical theory of wave optics, employing interference as well as diffraction phenomena to detect the focal position; however, these methods are ineffective and expensive for industrial application. Moreover, these techniques could not perform detection and fabrication simultaneously. In this paper, we propose an optical design capable of detecting the focal point and fabricating complex patterns on a planar sample surface simultaneously. In-situ real-time focus detection is performed using a bandpass filter, which only allows for the detection of laser transmission. The technique enables rapid, non-destructive, and precise detection of the focal point. Furthermore, it is sufficiently simple for application in both science and industry for mass production, and it is expected to contribute to the next generation of laser equipment, which can be used to fabricate micro-patterns with high complexity. PMID:28671566

  3. In-Situ Real-Time Focus Detection during Laser Processing Using Double-Hole Masks and Advanced Image Sensor Software.

    PubMed

    Cao, Binh Xuan; Hoang, Phuong Le; Ahn, Sanghoon; Kim, Jeng-O; Kang, Heeshin; Noh, Jiwhan

    2017-07-01

    In modern high-intensity ultrafast laser processing, detecting the focal position of the working laser beam, at which the intensity is the highest and the beam diameter is the lowest, and immediately locating the target sample at that point are challenging tasks. A system that allows in-situ real-time focus determination and fabrication using a high-power laser has been in high demand among both engineers and scientists. Conventional techniques require the complicated mathematical theory of wave optics, employing interference as well as diffraction phenomena to detect the focal position; however, these methods are ineffective and expensive for industrial application. Moreover, these techniques could not perform detection and fabrication simultaneously. In this paper, we propose an optical design capable of detecting the focal point and fabricating complex patterns on a planar sample surface simultaneously. In-situ real-time focus detection is performed using a bandpass filter, which only allows for the detection of laser transmission. The technique enables rapid, non-destructive, and precise detection of the focal point. Furthermore, it is sufficiently simple for application in both science and industry for mass production, and it is expected to contribute to the next generation of laser equipment, which can be used to fabricate micro-patterns with high complexity.

  4. Agile hardware and software systems engineering for critical military space applications

    NASA Astrophysics Data System (ADS)

    Huang, Philip M.; Knuth, Andrew A.; Krueger, Robert O.; Garrison-Darrin, Margaret A.

    2012-06-01

    The Multi Mission Bus Demonstrator (MBD) is a successful demonstration of agile program management and systems engineering in a high-risk technology application where new, untraditional development strategies were necessary. MBD produced two fully functioning spacecraft for a military/DOD application in a record-breaking time frame and at dramatically reduced cost. This paper describes the adaptation and application of concepts developed in agile software engineering to hardware product and system development for critical military applications. This challenging spacecraft did not use existing key technology (heritage hardware) and represented a large paradigm shift from traditional spacecraft development. The insertion of new technologies and methods in space hardware has long been a problem due to long build times, the preference for heritage hardware, and the lack of effective process. The role of momentum in the innovative process can be exploited to tackle ongoing technology disruptions and to mitigate risk interactions in a disciplined manner. Examples of how these concepts were used during the MBD program are delineated. Maintaining project momentum was essential for assessing the constant non-recurring technological challenges, which needed to be retired rapidly from the engineering risk list. Development never slowed, thanks to tactical assessment of the hardware and the adoption of the SCRUM technique, which was adapted to mitigate technical risk while allowing design freeze later in the program's development cycle. By using agile systems engineering and management techniques that enabled decisive action, product development momentum was used effectively to produce two novel space vehicles in a fraction of the usual time and at dramatically reduced cost.

  5. Laboratory Information Management Software for genotyping workflows: applications in high throughput crop genotyping

    PubMed Central

    Jayashree, B; Reddy, Praveen T; Leeladevi, Y; Crouch, Jonathan H; Mahalakshmi, V; Buhariwalla, Hutokshi K; Eshwar, KE; Mace, Emma; Folksterma, Rolf; Senthilvel, S; Varshney, Rajeev K; Seetha, K; Rajalakshmi, R; Prasanth, VP; Chandra, Subhash; Swarupa, L; SriKalyani, P; Hoisington, David A

    2006-01-01

    Background: With the advances in DNA sequencer-based technologies, it has become possible to automate several steps of the genotyping process, leading to increased throughput. To efficiently handle the large amounts of genotypic data generated and to help with quality control, there is a strong need for a software system that can help with the tracking of samples and the capture and management of data at the different steps of the process. Such systems, while serving to manage the workflow precisely, also encourage good laboratory practice by standardizing protocols and by recording and annotating data from every step of the workflow. Results: A laboratory information management system (LIMS) has been designed and implemented at the International Crops Research Institute for the Semi-Arid Tropics (ICRISAT) that meets the requirements of a moderately high throughput molecular genotyping facility. The application is designed as modules and is simple to learn and use. The application leads the user through each step of the process, from starting an experiment to the storing of output data from the genotype detection step with auto-binning of alleles, thus ensuring that every DNA sample is handled in an identical manner and all the necessary data are captured. The application keeps track of DNA samples and generated data. Data entry into the system is through the use of forms for file uploads. The LIMS provides functions to trace back to the electrophoresis gel files or sample source for any genotypic data and for repeating experiments. The LIMS is presently being used for the capture of high throughput SSR (simple-sequence repeat) genotyping data from the legume (chickpea, groundnut and pigeonpea) and cereal (sorghum and millets) crops of importance in the semi-arid tropics. Conclusion: A laboratory information management system is available that has been found useful in the management of microsatellite genotype data in a moderately high throughput genotyping laboratory. The application

  6. Implementation of a Software Application for Presurgical Case History Review of Frozen Section Pathology Cases.

    PubMed

    Norgan, Andrew P; Okeson, Mathew L; Juskewitch, Justin E; Shah, Kabeer K; Sukov, William R

    2017-01-01

The frozen section pathology practice at Mayo Clinic in Rochester performs ~20,000 intraoperative consultations a year (~70-80/weekday). To prepare for intraoperative consultations, surgical pathology fellows and residents review the case history, previous pathology, and relevant imaging the day before surgery. Before the work described herein, review of pending surgical pathology cases was a paper-based process requiring handwritten transcription from the electronic health record, a laborious and potentially error-prone process. To facilitate more efficient case review, a modular extension of an existing surgical listing software application (Surgical and Procedure Scheduling [SPS]) was developed. The module (SPS-pathology-specific module [PM]) added pathology-specific functionality including recording case notes, prefetching of radiology, pathology, and operative reports from the medical record, flagging infectious cases, and real-time tracking of cases in the operating room. After implementation, users were surveyed about its impact on the surgical pathology practice. There were 16 survey respondents (five staff pathologists and eleven residents or fellows). All trainees (11/11) responded that the application improved an aspect of surgical list review including abstraction from medical records (10/11), identification of possibly infectious cases (7/11), and speed of list preparation (10/11). The average reported time savings in list preparation was 1.4 h/day. Respondents indicated the application improved the speed (11/16), clarity (13/16), and accuracy (10/16) of morning report. During the workday, respondents reported the application improved real-time case review (14/16) and situational awareness of ongoing cases (13/16). A majority of respondents found the SPS-PM improved all preparatory and logistical aspects of the Mayo Clinic frozen section surgical pathology practice. In addition, use of the SPS-PM saved an average of 1.4 h/day for residents and fellows.

  7. Implementation of a Software Application for Presurgical Case History Review of Frozen Section Pathology Cases

    PubMed Central

    Norgan, Andrew P.; Okeson, Mathew L.; Juskewitch, Justin E.; Shah, Kabeer K.; Sukov, William R.

    2017-01-01

    Background: The frozen section pathology practice at Mayo Clinic in Rochester performs ~20,000 intraoperative consultations a year (~70–80/weekday). To prepare for intraoperative consultations, surgical pathology fellows and residents review the case history, previous pathology, and relevant imaging the day before surgery. Before the work described herein, review of pending surgical pathology cases was a paper-based process requiring handwritten transcription from the electronic health record, a laborious and potentially error-prone process. Methods: To facilitate more efficient case review, a modular extension of an existing surgical listing software application (Surgical and Procedure Scheduling [SPS]) was developed. The module (SPS-pathology-specific module [PM]) added pathology-specific functionality including recording case notes, prefetching of radiology, pathology, and operative reports from the medical record, flagging infectious cases, and real-time tracking of cases in the operating room. After implementation, users were surveyed about its impact on the surgical pathology practice. Results: There were 16 survey respondents (five staff pathologists and eleven residents or fellows). All trainees (11/11) responded that the application improved an aspect of surgical list review including abstraction from medical records (10/11), identification of possibly infectious cases (7/11), and speed of list preparation (10/11). The average reported time savings in list preparation was 1.4 h/day. Respondents indicated the application improved the speed (11/16), clarity (13/16), and accuracy (10/16) of morning report. During the workday, respondents reported the application improved real-time case review (14/16) and situational awareness of ongoing cases (13/16). Conclusions: A majority of respondents found the SPS-PM improved all preparatory and logistical aspects of the Mayo Clinic frozen section surgical pathology practice. In addition, use of the SPS-PM saved an

  8. Detailed design and first tests of the application software for the instrument control unit of Euclid-NISP

    NASA Astrophysics Data System (ADS)

    Ligori, S.; Corcione, L.; Capobianco, V.; Bonino, D.; Sirri, G.; Fornari, F.; Giacomini, F.; Patrizii, L.; Valenziano, L.; Travaglini, R.; Colodro, C.; Bortoletto, F.; Bonoli, C.; Chiarusi, T.; Margiotta, A.; Mauri, N.; Pasqualini, L.; Spurio, M.; Tenti, M.; Dal Corso, F.; Dusini, S.; Laudisio, F.; Sirignano, C.; Stanco, L.; Ventura, S.; Auricchio, N.; Balestra, A.; Franceschi, E.; Morgante, G.; Trifoglio, M.; Medinaceli, E.; Guizzo, G. P.; Debei, S.; Stephen, J. B.

    2016-07-01

    In this paper we describe the detailed design of the application software (ASW) of the instrument control unit (ICU) of NISP, the Near-Infrared Spectro-Photometer of the Euclid mission. This software is based on a real-time operating system (RTEMS) and will interface with all the subunits of NISP, as well as the command and data management unit (CDMU) of the spacecraft for telecommand and housekeeping management. We briefly review the main requirements driving the design and the architecture of the software that is approaching the Critical Design Review level. The interaction with the data processing unit (DPU), which is the intelligent subunit controlling the detector system, is described in detail, as well as the concept for the implementation of the failure detection, isolation and recovery (FDIR) algorithms. The first version of the software is under development on a Breadboard model produced by AIRBUS/CRISA. We describe the results of the tests and the main performances and budgets.

  9. High intensity focused ultrasound surgery (HIFU) of the brain: A historical perspective, with modern applications

    PubMed Central

    Jagannathan, Jay; Sanghvi, Narendra K; Crum, Lawrence A; Yen, Chun-Po; Medel, Ricky; Dumont, Aaron S; Sheehan, Jason P; Steiner, Ladislau; Jolesz, Ferenc; Kassell, Neal F

    2014-01-01

    The field of MRI-guided high intensity focused ultrasound surgery (MRgFUS) is a rapidly evolving one with many potential applications in neurosurgery. This is the first of three articles on MRgFUS; this paper focuses on the historical development of the technology and its potential applications to modern neurosurgery. The evolution of MRgFUS has occurred in parallel with modern neurological surgery, and the two seemingly distinct disciplines share many of the same pioneering figures. Early studies on focused ultrasound treatment in the 1940s and 1950s demonstrated the ability to perform precise lesioning in the human brain, with a favorable risk-benefit profile. However, the need for a craniotomy, as well as the lack of sophisticated imaging technology, resulted in limited growth of HIFU for neurosurgery. More recently, technological advances have permitted the combination of HIFU with MRI guidance, providing an opportunity to effectively treat a variety of CNS disorders. Although challenges remain, HIFU-mediated neurosurgery may offer the ability to target and treat CNS conditions that were previously extremely difficult to address. The remaining two articles in this series will focus on the physical principles of modern MRgFUS as well as current and future avenues for investigation. PMID:19190451

  10. Platinum metallization for MEMS application: focus on coating adhesion for biomedical applications.

    PubMed

    Guarnieri, Vittorio; Biazi, Leonardo; Marchiori, Roberto; Lago, Alexandre

    2014-01-01

    The adherence of a platinum thin film on a Si/SiO2 wafer was studied using chromium, titanium, or alumina (Cr, Ti, Al2O3) as an interlayer. The adhesion of Pt is a fundamental property in several areas, for example in MEMS devices, which operate under high-temperature conditions, as well as in biomedical applications, where the adhesion of a Pt film to the substrate is a major challenge in several industrial health applications and biomedical devices, such as stents [1-4]. We investigated the properties of chromium, titanium, and alumina (Cr, Ti, and Al2O3) used as adhesion layers for a platinum (Pt) electrode. Thin films of chromium, titanium, and alumina were deposited on silicon/silicon dioxide (Si/SiO2) wafers by electron beam. We introduced Al2O3 as a new adhesion layer to test the behavior of the Pt film at higher temperatures using a ceramic adhesion thin film. Electrical behavior was measured for different annealing temperatures to characterize the performance of the Cr/Pt, Ti/Pt, and Al2O3/Pt metallic films in the gas sensor application. All these metal layers showed good adhesion to Si/SiO2 and good Au wire bondability at room temperature, but at temperatures higher than 400 °C the thin Cr/Pt and Ti/Pt films showed poor adhesion due to atomic inter-diffusion between platinum and the metal adhesion layers [5]. The proposed Al2O3/Pt ceramic-metal layers confirmed better adherence at the higher temperatures tested.

  11. Platinum metallization for MEMS application. Focus on coating adhesion for biomedical applications.

    PubMed

    Guarnieri, Vittorio; Biazi, Leonardo; Marchiori, Roberto; Lago, Alexandre

    2014-01-01

    The adherence of a platinum thin film on a Si/SiO2 wafer was studied using chromium, titanium, or alumina (Cr, Ti, Al2O3) as an interlayer. The adhesion of Pt is a fundamental property in several areas, for example in MEMS devices, which operate under high-temperature conditions, as well as in biomedical applications, where the adhesion of a Pt film to the substrate is a major challenge in several industrial health applications and biomedical devices, such as stents. We investigated the properties of chromium, titanium, and alumina (Cr, Ti, and Al2O3) used as adhesion layers for a platinum (Pt) electrode. Thin films of chromium, titanium, and alumina were deposited on silicon/silicon dioxide (Si/SiO2) wafers by electron beam. We introduced Al2O3 as a new adhesion layer to test the behavior of the Pt film at higher temperatures using a ceramic adhesion thin film. Electrical behavior was measured for different annealing temperatures to characterize the performance of the Cr/Pt, Ti/Pt, and Al2O3/Pt metallic films in the gas sensor application. All these metal layers showed good adhesion to Si/SiO2 and good Au wire bondability at room temperature, but at temperatures higher than 400 °C the thin Cr/Pt and Ti/Pt films showed poor adhesion due to atomic inter-diffusion between platinum and the metal adhesion layers. The proposed Al2O3/Pt ceramic-metal layers confirmed better adherence at the higher temperatures tested.

  12. GEnomes Management Application (GEM.app): a new software tool for large-scale collaborative genome analysis.

    PubMed

    Gonzalez, Michael A; Lebrigio, Rafael F Acosta; Van Booven, Derek; Ulloa, Rick H; Powell, Eric; Speziani, Fiorella; Tekin, Mustafa; Schüle, Rebecca; Züchner, Stephan

    2013-06-01

    Novel genes are now identified at a rapid pace for many Mendelian disorders, and increasingly, for genetically complex phenotypes. However, new challenges have also become evident: (1) effectively managing larger exome and/or genome datasets, especially for smaller labs; (2) direct hands-on analysis and contextual interpretation of variant data in large genomic datasets; and (3) many small and medium-sized clinical and research-based investigative teams around the world are generating data that, if combined and shared, will significantly increase the opportunities for the entire community to identify new genes. To address these challenges, we have developed GEnomes Management Application (GEM.app), a software tool to annotate, manage, visualize, and analyze large genomic datasets (https://genomics.med.miami.edu/). GEM.app currently contains ∼1,600 whole exomes from 50 different phenotypes studied by 40 principal investigators from 15 different countries. The focus of GEM.app is on user-friendly analysis for nonbioinformaticians to make next-generation sequencing data directly accessible. Yet, GEM.app provides powerful and flexible filter options, including single family filtering, across family/phenotype queries, nested filtering, and evaluation of segregation in families. In addition, the system is fast, obtaining results within 4 sec across ∼1,200 exomes. We believe that this system will further enhance identification of genetic causes of human disease. © 2013 Wiley Periodicals, Inc.
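
    The family-based segregation filtering described above can be illustrated with a minimal sketch (illustrative only; the function and variant names are hypothetical and do not reflect GEM.app's actual API): a variant "segregates" in a family if every affected member carries it and no unaffected member does.

```python
# Illustrative sketch of family segregation filtering (hypothetical
# names; not GEM.app's actual implementation).

def segregates(variant_carriers, affected, unaffected):
    """True if all affected members carry the variant and no
    unaffected member does."""
    carriers = set(variant_carriers)
    return affected <= carriers and not (unaffected & carriers)

# Hypothetical family: pedigree IDs and per-variant carrier sets
affected = {"II-1", "II-3"}
unaffected = {"I-1", "II-2"}
calls = {
    "chr1:12345A>G": {"II-1", "II-3"},          # all affected, no unaffected
    "chr2:9876C>T": {"II-1", "II-2", "II-3"},   # also in an unaffected member
    "chr3:555G>A": {"II-1"},                    # missing from one affected member
}

hits = [v for v, carriers in calls.items()
        if segregates(carriers, affected, unaffected)]
# hits == ["chr1:12345A>G"]
```

    In a system like GEM.app, this per-family predicate would be composed with across-family/phenotype queries and annotation filters to shortlist candidate genes.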

  13. GEnomes Management Application (GEM.app): A new software tool for large-scale collaborative genome analysis

    PubMed Central

    Gonzalez, Michael A.; Acosta Lebrigio, Rafael F.; Van Booven, Derek; Ulloa, Rick H.; Powell, Eric; Speziani, Fiorella; Tekin, Mustafa; Schule, Rebecca; Zuchner, Stephan

    2015-01-01

    Novel genes are now identified at a rapid pace for many Mendelian disorders, and increasingly, for genetically complex phenotypes. However, new challenges have also become evident: (1) effectively managing larger exome and/or genome datasets, especially for smaller labs; (2) direct hands-on analysis and contextual interpretation of variant data in large genomic datasets; and (3) many small and medium-sized clinical and research-based investigative teams around the world are generating data that, if combined and shared, will significantly increase the opportunities for the entire community to identify new genes. To address these challenges we have developed GEnomes Management Application (GEM.app), a software tool to annotate, manage, visualize, and analyze large genomic datasets (https://genomics.med.miami.edu/). GEM.app currently contains ~1,600 whole exomes from 50 different phenotypes studied by 40 principal investigators from 15 different countries. The focus of GEM.app is on user-friendly analysis for non-bioinformaticians to make NGS data directly accessible. Yet, GEM.app provides powerful and flexible filter options, including single family filtering, across family/phenotype queries, nested filtering, and evaluation of segregation in families. In addition, the system is fast, obtaining results within 4 seconds across ~1,200 exomes. We believe that this system will further enhance identification of genetic causes of human disease. PMID:23463597

  14. SEE: improving nurse-patient communications and preventing software piracy in nurse call applications.

    PubMed

    Unluturk, Mehmet S

    2012-06-01

    A nurse call system is an electrically operated system by which patients can call for assistance from a bedside station or a duty station. When a call is placed, an intermittent tone sounds and a corridor lamp outside the room blinks at a slower or faster rate depending on where the call originated. It is essential to alert nurses on time so that they can offer care and comfort without delay. Many devices are currently available to improve communication between nurses and patients in a nurse call system, such as pagers, RFID (radio frequency identification) badges, and wireless phones. To integrate all these devices into an existing nurse call system and make them communicate with each other, this paper proposes software client applications called bridges. We also propose a Windows server application called SEE (Supervised Event Executive) that delivers messages among these devices. A single hardware dongle is utilized for authentication and copy protection for SEE. Protecting SEE with only the security provided by the dongle is a weak defense against hackers. In this paper, we develop several defense patterns against hackers, such as calculating checksums at runtime, calling the dongle from multiple places in the code, and handling errors properly by logging them to a database.
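
    The "checksums in runtime" defense pattern mentioned in the abstract can be sketched as follows (an illustrative sketch only; function names are hypothetical, and the paper's implementation details are not reproduced here): the running program hashes its own code files and compares the result to a digest recorded at build time, logging any mismatch rather than crashing.

```python
# Illustrative sketch of a runtime integrity check (hypothetical names;
# not the SEE implementation).
import hashlib

def file_checksum(path):
    """SHA-256 digest of a code file, computed at runtime."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def verify_integrity(path, expected_digest, log):
    """Compare against a digest recorded at build time; on mismatch,
    log the failure (per the paper's error-logging pattern) and
    report tampering to the caller."""
    actual = file_checksum(path)
    if actual != expected_digest:
        log.append(f"integrity failure for {path}")
        return False
    return True
```

    In practice, such checks (like the dongle calls) are scattered across many call sites so that patching out any single check is not sufficient to defeat the protection.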

  15. ODS2: a multiplatform software application for creating integrated physical and genetic maps.

    PubMed

    Hall, D; Bhandarkar, S M; Wang, J

    2001-03-01

    A contig map is a physical map that shows the native order of a library of overlapping genomic clones. One common method for creating such maps involves using hybridization to detect clone overlaps. False-positive and false-negative hybridization errors, the presence of chimeric clones, and gaps in library coverage lead to ambiguity and error in the clone order. Genomes with good genetic maps, such as Neurospora crassa, provide a means for reducing ambiguities and errors when constructing contig maps if clones can be anchored with genetic markers to the genetic map. A software application called ODS2 for creating contig maps based on clone-clone hybridization data is presented. This application is also designed to exploit partial ordering information provided by anchorage of clones to a genetic map. This information, along with clone-clone hybridization data, is used by a clone ordering algorithm and is represented graphically, allowing users to interactively align physical and genetic maps. ODS2 has a graphical user interface and is implemented entirely in Java, so it runs on multiple platforms. Other features include the flexibility of storing data in a local file or relational database and the ability to create full or minimum tiling contig maps.

  16. Challenges in software applications for the cognitive evaluation and stimulation of the elderly

    PubMed Central

    2014-01-01

    Background Computer-based cognitive stimulation applications can help the elderly maintain and improve their cognitive skills. In this research paper, our objectives are to verify the usability of PESCO (an open-software application for cognitive evaluation and stimulation) and to determine the concurrent validity of cognitive assessment tests and the effectiveness of PESCO’s cognitive stimulation exercises. Methods Two studies were conducted in various community computer centers in the province of Granada. The first study tested tool usability by observing 43 elderly people and considering their responses to a questionnaire. In the second study, 36 elderly people completed pen-and-paper and PESCO tests followed by nine cognitive stimulation sessions. Meanwhile, a control group with 34 participants used computers for nine non-structured sessions. Results Analysis of the first study revealed that although PESCO had been developed by taking usability guidelines into account, there was room for improvement. Results from the second study indicated moderate concurrent validity between PESCO and standardized tests (Pearson’s r from .501 to .702) and highlighted the effectiveness of training exercises for improving attention (F = -4.111, p < .001) and planning (F = 5.791, p < .001) functions. Conclusions PESCO can be used by the elderly. The PESCO cognitive test module demonstrated its concurrent validity with traditional cognitive evaluation tests. The stimulation module is effective for improving attention and planning skills. PMID:24886420

  17. Application of open source image guided therapy software in MR-guided therapies.

    PubMed

    Hata, Nobuhiko; Piper, Steve; Jolesz, Ferenc A; Tempany, Clare M C; Black, Peter McL; Morikawa, Shigehiro; Iseki, Hiroshi; Hashizume, Makoto; Kikinis, Ron

    2007-01-01

    We present software engineering methods to provide free open-source software for MR-guided therapy. We report that graphical representation of the surgical tools, interconnectivity with the tracking device, patient-to-image registration, and MRI-based thermal mapping are crucial components of MR-guided therapy when sharing such software. The software process includes a network-based distribution mechanism built on the multi-platform build tool CMake, the version control system CVS, and the quality assurance software DART. We developed six procedures at four separate clinical sites using the proposed software engineering methods and process, and found the approach feasible for facilitating multicenter clinical trials of MR-guided therapies. Our future studies include use of the software in non-MR-guided therapies.

  18. Data-Interpolating Variational Analysis (DIVA) software : recent development and application

    NASA Astrophysics Data System (ADS)

    Watelet, Sylvain; Beckers, Jean-Marie; Barth, Alexander; Back, Örjan

    2016-04-01

    The Data-Interpolating Variational Analysis (DIVA) software is a tool designed to reconstruct a continuous field from discrete measurements. The method is based on the numerical implementation of the Variational Inverse Model (VIM), which minimizes a cost function so as to select the analysed field that best fits the data sets. The problem is solved efficiently using a finite-element method. This statistical method is particularly suited to irregularly-spaced observations, producing outputs on a regular grid. Initially created to work in two dimensions, the software is now able to handle 3D or even 4D analyses, making it easy to produce ocean climatologies. These analyses can be further improved by taking advantage of DIVA's ability to account for topographic and dynamic constraints (coastal relief, prevailing winds impacting the advection, etc.). DIVA is an open-source software package that is continuously upgraded and distributed for free through frequent version releases. The development is funded by the EMODnet and SeaDataNet projects and includes many discussions and feedback from the user community. Here, we present two recent major upgrades: the data weighting option and the bottom-based analyses. Since DIVA works with a diagonal observation error covariance matrix, it is assumed that the observation errors are uncorrelated in space and time. In practice, this assumption is not always valid, especially when dealing with, e.g., cruise measurements (same instrument) or time series at a fixed geographic point (representativity error). The data weighting option decreases the weights of such observations in the analysis. These weights are based on an exponential function of a 3D (x,y,t) distance between observations. A comparison between unweighted and weighted analyses will be shown.
It has been a recurrent request from the DIVA users to improve the way the analyses near the ocean bottom
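
    The idea of down-weighting clustered observations via an exponential function of 3D distance can be sketched as follows (a minimal sketch under an assumed weight formula, w_i = 1 / Σ_j exp(-d_ij / L); DIVA's exact expression may differ):

```python
# Sketch of distance-based down-weighting of clustered observations
# (assumed formula, not necessarily DIVA's exact one): points close to
# many others, e.g. along a cruise track, receive smaller weights.
import math

def weights(points, length_scale):
    """points: list of (x, y, t) observation coordinates."""
    w = []
    for xi, yi, ti in points:
        s = 0.0
        for xj, yj, tj in points:
            d = math.sqrt((xi - xj)**2 + (yi - yj)**2 + (ti - tj)**2)
            s += math.exp(-d / length_scale)  # d = 0 self term contributes 1
        w.append(1.0 / s)
    return w

isolated = weights([(0, 0, 0), (100, 0, 0)], length_scale=1.0)
clustered = weights([(0, 0, 0), (0.1, 0, 0)], length_scale=1.0)
# isolated weights are near 1; clustered weights are reduced
```

    Well-separated observations keep weights near 1, while near-duplicate observations (same cruise, same mooring) share their influence on the analysis.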

  19. Agile Software Development

    ERIC Educational Resources Information Center

    Biju, Soly Mathew

    2008-01-01

    Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of change in the requirement at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…

  1. Collective Focusing of Intense Ion Beam Pulses for High-energy Density Physics Applications

    SciTech Connect

    Dorf, Mikhail A.; Kaganovich, Igor D.; Startsev, Edward A.; Davidson, Ronald C.

    2011-04-27

    The collective focusing concept, in which a weak magnetic lens provides strong focusing of an intense ion beam pulse carrying a neutralizing electron background, is investigated by making use of advanced particle-in-cell simulations and reduced analytical models. The original analysis by Robertson [Phys. Rev. Lett. 48, 149 (1982)] is extended to the parameter regimes of particular importance for several high-energy density physics applications. The present paper investigates (1) the effects of non-neutral collective focusing in a moderately strong magnetic field; (2) the diamagnetic effects leading to suppression of the applied magnetic field due to the presence of the beam pulse; and (3) the influence of a finite-radius conducting wall surrounding the beam cross-section on beam neutralization. In addition, it is demonstrated that the use of the collective focusing lens can significantly simplify the technical realization of the final focusing of ion beam pulses in the Neutralized Drift Compression Experiment-I (NDCX-I), and the conceptual designs of possible experiments on NDCX-I are investigated by making use of advanced numerical simulations.

  2. Neutron focusing using capillary optics and its applications to elemental analysis

    SciTech Connect

    Chen-Mayer, H. H.; Mildner, D. F. R.; Lamaze, G. P.; Paul, R. L.; Lindstrom, R. M.

    1999-06-10

    Capillary optics (Kumakhov lenses) have been characterized and tested at two cold neutron beam facilities at the NIST reactor: the Neutron Depth Profiling (NDP) and the Prompt Gamma Activation Analysis (PGAA) spectrometers. Lenses of both multifiber and monolithic types focus cold neutron beams from guides of cm transverse dimensions onto a sub-mm spot size with higher current densities at the expense of the angular resolution, which is acceptable for applications employing neutron absorption. These lenses can improve the sensitivity and detection limits for NDP and PGAA measurements on small samples, and enable sample scanning to study spatial non-uniformity or to perform compositional mapping. A summary of the neutron focusing effort is given, with examples of a multifiber lens with on-axis focusing, a bender-focuser with off-axis focusing, and a monolithic lens with a more compact size. Preliminary results and existing problems in applying these lenses to NDP and PGAA are presented, and current and future directions are discussed.

  3. Neutron focusing using capillary optics and its applications to elemental analysis

    NASA Astrophysics Data System (ADS)

    Chen-Mayer, H. H.; Mildner, D. F. R.; Lamaze, G. P.; Paul, R. L.; Lindstrom, R. M.

    1999-06-01

    Capillary optics (Kumakhov lenses) have been characterized and tested at two cold neutron beam facilities at the NIST reactor: the Neutron Depth Profiling (NDP) and the Prompt Gamma Activation Analysis (PGAA) spectrometers. Lenses of both multifiber and monolithic types focus cold neutron beams from guides of cm transverse dimensions onto a sub-mm spot size with higher current densities at the expense of the angular resolution, which is acceptable for applications employing neutron absorption. These lenses can improve the sensitivity and detection limits for NDP and PGAA measurements on small samples, and enable sample scanning to study spatial non-uniformity or to perform compositional mapping. A summary of the neutron focusing effort is given, with examples of a multifiber lens with on-axis focusing, a bender-focuser with off-axis focusing, and a monolithic lens with a more compact size. Preliminary results and existing problems in applying these lenses to NDP and PGAA are presented, and current and future directions are discussed.

  4. Impact of focusing of Ground Based SAR data on the quality of interferometric SAR applications

    NASA Astrophysics Data System (ADS)

    Zonno, Mariantonietta; Mascolo, Luigi; Guccione, Pietro; Nico, Giovanni; Di Pasquale, Andrea

    2014-10-01

    A Ground-Based Synthetic Aperture Radar (GB-SAR) is nowadays employed in several applications. The processing of ground-based, spaceborne, and airborne SAR data relies on the same physical principles. Nevertheless, specific algorithms for the focusing of data acquired by GB-SAR systems have been proposed in the literature. In this work, the impact of the main focusing methods on the interferometric phase dispersion and on the coherence has been studied using a real dataset obtained from a dedicated experiment. Several acquisitions were made of a scene containing a corner reflector mounted on a micrometric screw; before some acquisitions, the screw was displaced by a few millimetres in the line-of-sight direction. The images were first focused using two different algorithms and, correspondingly, two different sets of interferograms were generated. The mean and standard deviation of the phase values at the corner reflector were compared to those obtained from the known displacement of the micrometric screw. The mean phase, its dispersion, and the coherence values for each focusing algorithm were quantified, and both the precision and the accuracy of the interferometric phase measurements obtained with the two focusing methods were assessed.
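
    The interferometric quantities compared in the paper, phase and coherence between two focused SAR images, can be sketched in plain Python (a minimal sketch on toy complex samples; no SAR library or the authors' processing chain is assumed):

```python
# Minimal sketch of interferogram phase and coherence between two
# focused SAR images, represented as lists of complex pixel values.
import cmath, math

def interferogram_phase(s1, s2):
    """Per-pixel phase of s1 * conj(s2), in radians."""
    return [cmath.phase(a * b.conjugate()) for a, b in zip(s1, s2)]

def coherence(s1, s2):
    """|<s1 s2*>| / sqrt(<|s1|^2> <|s2|^2>) over the pixel window."""
    num = sum(a * b.conjugate() for a, b in zip(s1, s2))
    p1 = sum(abs(a)**2 for a in s1)
    p2 = sum(abs(b)**2 for b in s2)
    return abs(num) / math.sqrt(p1 * p2)

# Two identical images up to a constant phase offset: coherence is 1
# and the interferogram phase equals the imposed offset (0.5 rad).
s1 = [1 + 0j, 0 + 1j, -1 + 0j]
s2 = [a * cmath.exp(-1j * 0.5) for a in s1]
```

    For a corner reflector displaced along the line of sight, the measured interferometric phase is proportional to the displacement, which is what allows the micrometric-screw ground truth to calibrate phase precision and accuracy.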

  5. Using Focused Regression for Accurate Time-Constrained Scaling of Scientific Applications

    SciTech Connect

    Barnes, B; Garren, J; Lowenthal, D; Reeves, J; de Supinski, B; Schulz, M; Rountree, B

    2010-01-28

    Many large-scale clusters now have hundreds of thousands of processors, and processor counts will be over one million within a few years. Computational scientists must scale their applications to exploit these new clusters. Time-constrained scaling, which is often used, tries to hold total execution time constant while increasing the problem size along with the processor count. However, complex interactions between parameters, the processor count, and execution time complicate determining the input parameters that achieve this goal. In this paper we develop a novel gray-box, focused regression-based approach that assists the computational scientist with maintaining constant run time on increasing processor counts. Combining application-level information from a small set of training runs, our approach allows prediction of the input parameters that result in similar per-processor execution time at larger scales. Our experimental validation across seven applications showed that median prediction errors are less than 13%.
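
    A drastically simplified version of time-constrained scaling can be sketched as follows (illustrative only; the paper's gray-box model combines richer application-level information, whereas this sketch assumes a naive run-time model t = c * n**a / p fitted from training runs):

```python
# Toy time-constrained scaling: fit t = c * n**a / p by least squares
# in log space, then pick the problem size n that keeps t at T_target
# on a larger processor count p. (Assumed model, not the paper's.)
import math

def fit_power_law(runs):
    """runs: list of (n, p, t). Fit log(t * p) = log c + a * log n."""
    xs = [math.log(n) for n, p, t in runs]
    ys = [math.log(t * p) for n, p, t in runs]
    m = len(runs)
    xbar, ybar = sum(xs) / m, sum(ys) / m
    a = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
    c = math.exp(ybar - a * xbar)
    return a, c

def problem_size_for(T_target, p, a, c):
    """Solve c * n**a / p = T_target for n."""
    return (T_target * p / c) ** (1.0 / a)

# Synthetic training runs generated from t = 2 * n**1.5 / p
runs = [(100, 4, 2 * 100**1.5 / 4),
        (200, 8, 2 * 200**1.5 / 8),
        (400, 16, 2 * 400**1.5 / 16)]
a, c = fit_power_law(runs)
n_big = problem_size_for(T_target=500.0, p=1024, a=a, c=c)
```

    Real applications rarely follow a single clean power law, which is why the paper's focused regression restricts the model to the application-level terms that actually drive run time.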

  6. Eddy Covariance Method for CO2 Emission Measurements: CCS Applications, Principles, Instrumentation and Software

    NASA Astrophysics Data System (ADS)

    Burba, George; Madsen, Rod; Feese, Kristin

    2013-04-01

    The Eddy Covariance method is a micrometeorological technique for direct high-speed measurements of the transport of gases, heat, and momentum between the earth's surface and the atmosphere. Gas fluxes, emission and exchange rates are carefully characterized from single-point in-situ measurements using permanent or mobile towers, or moving platforms such as automobiles, helicopters, airplanes, etc. Since the early 1990s, this technique has been widely used by micrometeorologists across the globe for quantifying CO2 emission rates from various natural, urban and agricultural ecosystems [1,2], including areas of agricultural carbon sequestration. Presently, over 600 eddy covariance stations are in operation in over 120 countries. In the last 3-5 years, advancements in instrumentation and software have reached the point where they can be effectively used outside the area of micrometeorology, and can prove valuable for geological carbon capture and sequestration, landfill emission measurements, high-precision agriculture and other non-micrometeorological industrial and regulatory applications. In the field of geological carbon capture and sequestration, the magnitude of CO2 seepage fluxes depends on a variety of factors. Emerging projects utilize eddy covariance measurement to monitor large areas where CO2 may escape from the subsurface, to detect and quantify CO2 leakage, and to assure the efficiency of CO2 geological storage [3,4,5,6,7,8]. Although Eddy Covariance is one of the most direct and defensible ways to measure and calculate turbulent fluxes, the method is mathematically complex, and requires careful setup, execution and data processing tailored to a specific site and project.
With this in mind, step-by-step instructions were created to introduce a novice to the conventional Eddy Covariance technique [9], and to assist in further understanding the method through more advanced references such as graduate-level textbooks, flux networks guidelines, journals
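
    The core of the method can be sketched in a few lines (a minimal sketch; a real eddy covariance workflow adds coordinate rotation, detrending, and spectral and density corrections): the vertical gas flux is the covariance of high-frequency vertical wind speed w and gas concentration c over an averaging period.

```python
# Minimal eddy covariance flux: mean of w'c', where primes denote
# deviations from the averaging-period means.

def ec_flux(w, c):
    """w: vertical wind samples; c: gas concentration samples."""
    n = len(w)
    w_mean = sum(w) / n
    c_mean = sum(c) / n
    return sum((wi - w_mean) * (ci - c_mean)
               for wi, ci in zip(w, c)) / n

# Toy 4-sample series: updrafts (w > 0) carry higher CO2 concentration,
# giving a positive (upward) flux.
w = [0.5, -0.5, 0.4, -0.4]
c = [400.2, 399.8, 400.1, 399.9]
flux = ec_flux(w, c)   # positive: net upward transport
```

    In practice the samples come from a sonic anemometer and a fast gas analyzer at 10-20 Hz, averaged over periods of roughly 30 minutes.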

  7. Open Source Subtitle Editor Software Study for Section 508 Close Caption Applications

    NASA Technical Reports Server (NTRS)

    Murphy, F. Brandon

    2013-01-01

    This paper focuses on a specific item within the NASA Electronic Information Accessibility Policy: multimedia presentations shall have synchronized captions, thus making information accessible to persons with hearing impairment. Synchronized captions assist a person with a hearing or cognitive disability in accessing the same information as everyone else. This paper focuses on the research and implementation of CC (subtitle option) support for video multimedia. The goal of this research is to identify the best available open-source (free) software for achieving the synchronized caption requirement and realizing savings, while meeting the security requirement for Government information integrity and assurance. CC and subtitling are processes that display text within a video to provide additional or interpretive information for those who may need it or those who choose it. Closed captions typically show the transcription of the audio portion of a program (video) as it occurs (either verbatim or in edited form), sometimes including non-speech elements (such as sound effects). The transcript can be provided by a third-party source or can be extracted word for word from the video. This feature can be made available for videos in two forms: Soft-Coded or Hard-Coded. Soft-Coded is the optional version of CC, which viewers can turn on or off as they choose. Most of the time, when using the Soft-Coded option, the transcript is also provided to the viewer alongside the video. This option is subject to compromise, because the transcript is merely a text file that can be changed by anyone who has access to it. With this option, the integrity of the CC is at the mercy of the user. Hard-Coded CC is a more permanent form of CC: a Hard-Coded transcript is embedded within the video, without the option of removal.

  8. The Environment for Application Software Integration and Execution (EASIE) version 1.0. Volume 4: System installation and maintenance guide

    NASA Technical Reports Server (NTRS)

    Randall, Donald P.; Jones, Kennie H.; Rowell, Lawrence F.

    1988-01-01

    The Environment for Application Software Integration and Execution (EASIE) provides both a methodology and a set of software utility programs to ease the task of coordinating engineering design and analysis codes. This document provides the information necessary for installing the EASIE software on a host computer system. The target host is a DEC VAX running VMS version 4; host dependencies are noted where appropriate. Relevant directories and individual files are identified, and compile/load/execute sequences are specified. In the case of the data management utilities, database management system (DBMS)-specific features are described in an effort to assist the maintenance programmer in converting to a new DBMS. The document also describes a sample EASIE program directory structure to guide the program implementer in establishing his/her application-dependent environment.

  9. Advanced software development workstation. Knowledge base design: Design of knowledge base for flight planning application

    NASA Technical Reports Server (NTRS)

    Izygon, Michel E.

    1992-01-01

    The development process of the knowledge base for the generation of Test Libraries for Mission Operations Computer (MOC) Command Support focused on a series of information-gathering interviews. These knowledge-capture sessions supported the development of a prototype for evaluating the capabilities of INTUIT on such an application. The prototype includes functions related to POCC (Payload Operation Control Center) processing. It prompts the end users for input through a series of panels and then generates the Meds associated with the initialization and the update of hazardous command tables for a POCC Processing TLIB.

  10. Software Review.

    ERIC Educational Resources Information Center

    McGrath, Diane, Ed.

    1989-01-01

    Reviewed is a computer software package entitled "Audubon Wildlife Adventures: Grizzly Bears" for Apple II and IBM microcomputers. Included are availability, hardware requirements, cost, and a description of the program. Stressed is the murder-mystery flavor of this program, which focuses on illegal hunting and game…

  11. Integrating model behavior, optimization, and sensitivity/uncertainty analysis: overview and application of the MOUSE software toolbox

    USDA-ARS?s Scientific Manuscript database

    This paper provides an overview of the Model Optimization, Uncertainty, and SEnsitivity Analysis (MOUSE) software application, an open-source, Java-based toolbox of visual and numerical analysis components for the evaluation of environmental models. MOUSE is based on the OPTAS model calibration syst...

  12. Adopting Open-Source Software Applications in U. S. Higher Education: A Cross-Disciplinary Review of the Literature

    ERIC Educational Resources Information Center

    van Rooij, Shahron Williams

    2009-01-01

    Higher Education institutions in the United States are considering Open Source software applications such as the Moodle and Sakai course management systems and the Kuali financial system to build integrated learning environments that serve both academic and administrative needs. Open Source is presumed to be more flexible and less costly than…

  13. The Relationship between Teacher Attitudes towards Software Applications and Student Achievement in Fourth and Fifth Grade Classrooms

    ERIC Educational Resources Information Center

    Spencer, Laura K.

    2010-01-01

    The problem: The problem addressed in this study was to examine how teacher attitudes towards software applications affect student achievement in the classroom. Method: A correlational study was conducted, and 50 fourth and fifth grade teachers who taught in the Santee School District, were administered a survey assessing their attitudes…

  15. 25 CFR 547.8 - What are the minimum technical software standards applicable to Class II gaming systems?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... OF CLASS II GAMES § 547.8 What are the minimum technical software standards applicable to Class II... of Class II games. (a) Player interface displays. (1) If not otherwise provided to the player, the player interface shall display the following: (i) The purchase or wager amount; (ii) Game results;...

  16. 25 CFR 547.8 - What are the minimum technical software standards applicable to Class II gaming systems?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... OF CLASS II GAMES § 547.8 What are the minimum technical software standards applicable to Class II... of Class II games. (a) Player interface displays. (1) If not otherwise provided to the player, the player interface shall display the following: (i) The purchase or wager amount; (ii) Game results;...

  17. [The Development and Application of the Orthopaedics Implants Failure Database Software Based on WEB].

    PubMed

    Huang, Jiahua; Zhou, Hai; Zhang, Binbin; Ding, Biao

    2015-09-01

    This article develops a new failure-database software package for orthopaedics implants based on the WEB. The software is based on the B/S (browser/server) model; ASP dynamic web technology is used as the main development language to achieve data interactivity, and Microsoft Access is used to create the database. These mature technologies make the software easy to extend or upgrade. In this article, the design and development idea of the software, the software working process and functions, as well as relevant technical features, are presented. With this software, many different types of fault events of orthopaedics implants can be stored and the failure data can be statistically analyzed; at the macroscopic level, it can be used to evaluate the reliability of orthopaedics implants and operations, and it can ultimately guide doctors in improving the level of clinical treatment.

  18. Application of MR-guided focused pulsed ultrasound for destroying clots in vitro using thrombolytic drugs

    NASA Astrophysics Data System (ADS)

    Hadjisavvas, V.; Ioannides, K.; Damianou, C.

    2011-09-01

    In this paper an MR-guided focused pulsed ultrasound system for the treatment of stroke using thrombolytic drugs in an in vitro model is presented. A single-element spherically focused transducer of 5 cm diameter, focusing at 10 cm and operating at 0.5 MHz or 1 MHz, was used. The transducer was mounted in an MR-compatible robot. The artery was modelled using a silicone tube. Tissue was modelled using polyacrylamide gel. Coagulated blood was used to model the thrombus. A thermocouple was placed in the thrombus in order to measure its temperature. The effects of power, beam, and frequency were investigated. The goal was to maintain a temperature increase of less than 1 °C during the application of pulsed ultrasound (called the safe temperature). With the application of ultrasound alone there was no notable destruction of the thrombus. With the combination of ultrasound and thrombolytic drugs, destruction occurred after 60 min of pulsed exposure (pulse repetition period = 1 s, duty factor = 10%, and with the thrombus placed 1 cm deep in the tissue). This simple in vitro model proved very successful for evaluating MRgFUS as a modality for treating stroke. In the future we plan to apply this treatment protocol in live animals and humans.

  19. Application of Real Options Theory to DoD Software Acquisitions

    DTIC Science & Technology

    2009-02-20

    2008). A Monte Carlo simulation of the risk model (Figure 5) was run using the Risk Simulator software, taking into account interdependencies between...and we run the Monte Carlo simulation of the model with the revised risk estimates again. Based on the risk of requirements uncertainty presented...design and implementation, software validation and software evolution uncertainties, all of which can be categorized as exhibiting both Heisenberg-type

  20. Application of an integrated multi-criteria decision making AHP-TOPSIS methodology for ETL software selection.

    PubMed

    Hanine, Mohamed; Boutkhoum, Omar; Tikniouine, Abdessadek; Agouti, Tarik

    2016-01-01

    Currently, a range of ETL (Extract, Transform and Load) software packages is available, constituting a major investment market. Each ETL tool uses its own techniques for extracting, transforming and loading data into a data warehouse, which makes evaluating ETL software very difficult. However, choosing the right ETL software is critical to the success or failure of any Business Intelligence project. Because many factors impact the selection of ETL software, the selection process is considered a complex multi-criteria decision making (MCDM) problem. In this study, a decision-making methodology that employs two well-known MCDM techniques, namely the Analytic Hierarchy Process (AHP) and the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS), is designed. In this respect, AHP is used to analyze the structure of the ETL software selection problem and obtain weights for the selected criteria. TOPSIS is then used to calculate the alternatives' ratings. An example is given to illustrate the proposed methodology. Finally, a software prototype demonstrating both methods is implemented.
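    The AHP-then-TOPSIS pipeline can be sketched in a few lines. The criteria, pairwise judgments, and alternative scores below are invented for illustration (the paper's actual criteria and data are not reproduced), and AHP weights are approximated with the geometric-mean method rather than the principal eigenvector:

```python
import numpy as np

def ahp_weights(pairwise):
    """Approximate AHP priority weights via the geometric-mean method."""
    gm = np.prod(pairwise, axis=1) ** (1.0 / pairwise.shape[1])
    return gm / gm.sum()

def topsis(decision, weights, benefit):
    """Score alternatives by closeness to the ideal solution."""
    norm = decision / np.linalg.norm(decision, axis=0)  # vector normalization
    v = norm * weights                                  # weighted normalized matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)                      # closeness coefficient in [0, 1]

# Three hypothetical criteria: functionality, ease of use, cost.
pairwise = np.array([[1.0, 3.0, 5.0],
                     [1/3, 1.0, 3.0],
                     [1/5, 1/3, 1.0]])
w = ahp_weights(pairwise)

# Three hypothetical ETL tools scored against the criteria; cost is a "cost" criterion.
decision = np.array([[8.0, 7.0, 4.0],
                     [6.0, 9.0, 2.0],
                     [9.0, 5.0, 6.0]])
scores = topsis(decision, w, benefit=np.array([True, True, False]))
print(w.round(3), scores.round(3))
```

    The alternative with the highest closeness coefficient is the recommended ETL tool under the stated judgments.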

  1. Caltech/JPL Conference on Image Processing Technology, Data Sources and Software for Commercial and Scientific Applications

    NASA Technical Reports Server (NTRS)

    Redmann, G. H.

    1976-01-01

    Recent advances in image processing and new applications are presented to the user community to stimulate the development and transfer of this technology to industrial and commercial applications. The Proceedings contains 37 papers and abstracts, including many illustrations (some in color) and provides a single reference source for the user community regarding the ordering and obtaining of NASA-developed image-processing software and science data.

  2. Ultrasonic focusing through inhomogeneous media by application of the inverse scattering problem

    PubMed Central

    Haddadin, Osama S.; Ebbini, Emad S.

    2010-01-01

    A new approach is introduced for self-focusing phased arrays through inhomogeneous media for therapeutic and imaging applications. This algorithm utilizes solutions to the inverse scattering problem to estimate the impulse response (Green’s function) of the desired focal point(s) at the elements of the array. This approach is a two-stage procedure: in the first stage, the Green’s function is estimated from measurements of the scattered field taken outside the region of interest. In the second stage, these estimates are used in the pseudoinverse method to compute excitation weights satisfying a predefined set of constraints on the structure of the field at the focus points. These scalar, complex-valued excitation weights are used to modulate the incident field for retransmission. The pseudoinverse pattern synthesis method requires knowing the Green’s function between the focus points and the array, which is difficult to attain for an unknown inhomogeneous medium. However, the solution to the inverse scattering problem, the scattering function, can be used directly to compute the required inhomogeneous Green’s function. This inverse-scattering-based self-focusing is noninvasive and does not require a strong point scatterer at or near the desired focus point. It simply requires measurements of the scattered field outside the region of interest. It can be used for high resolution imaging and enhanced therapeutic effects through inhomogeneous media without making any assumptions on the shape, size, or location of the inhomogeneity. This technique is outlined and numerical simulations are shown which validate this technique for single and multiple focusing using a circular array. PMID:9670525
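    The second-stage pseudoinverse step can be sketched as follows. Here H is a stand-in for the estimated matrix of Green's functions between array elements and focal points, and f holds the desired complex field values at the foci; both are random illustrative values, not data from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 64, 2                          # array elements, focal points
# Stand-in Green's-function matrix (focal points x elements), complex valued.
H = rng.normal(size=(M, N)) + 1j * rng.normal(size=(M, N))
f = np.array([1.0, 1.0], dtype=complex)  # desired complex field at each focus

w = np.linalg.pinv(H) @ f             # minimum-norm excitation weights
achieved = H @ w                      # field actually produced at the foci
print(np.allclose(achieved, f))       # exact fit when H has full row rank (M <= N)
```

    The pseudoinverse yields the minimum-norm excitation vector reproducing the prescribed field values, which is why the method generalizes directly from single to multiple foci.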

  3. The Point-Focusing Thermal and Electric Applications Project - A progress report. [small solar power systems applications

    NASA Technical Reports Server (NTRS)

    Marriott, A. T.

    1979-01-01

    The paper discusses the Point-Focusing Thermal and Electric Applications Project which encompasses three primary activities: (1) applications analysis and development, in which potential markets for small power systems (less than 10 MWe) are identified and characterized in order to provide requirements for design and information for activities relating to market development; (2) systems engineering and development, for analyses that will define the most appropriate small power system designs based on specific user requirements; and (3) experiment implementation and test, which deals with the design and placement of engineering experiments in various applications environments in order to test the readiness of the selected technology in an operational setting. Progress to date and/or key results are discussed throughout the text.

  4. Numerical simulation of shock wave focusing at fold caustics, with application to sonic boom.

    PubMed

    Marchiano, Régis; Coulouvrat, François; Grenon, Richard

    2003-10-01

    Weak shock wave focusing at fold caustics is described by the mixed-type elliptic/hyperbolic nonlinear Tricomi equation. This paper presents a new and original numerical method for solving this equation, using a potential formulation and an "exact" numerical solver for handling nonlinearities. Validation tests demonstrate quantitatively the efficiency of the algorithm, which is able to handle complex waveforms as may come out of "optimized" aircraft designed to minimize sonic booms. It provides a real alternative to the approximate method of the hodograph transform. This motivated its application to evaluating the ground track focusing of sonic boom for an accelerating aircraft, by coupling CFD Euler simulations performed around the mock-up on an adapted mesh grid, atmospheric propagation modeling, and the Tricomi algorithm. The chosen configuration is the European Eurosup mock-up. Convergence of the focused boom at the ground level as a function of the matching distance is investigated to demonstrate the efficiency of the numerical process. As a conclusion, it is indicated how the present work may pave the way towards a study on sonic superboom (focused boom) mitigation.

  5. Application of precision harmonic gear drive in focusing mechanism of space camera

    NASA Astrophysics Data System (ADS)

    Zhang, Xinjie; Yan, Changxiang

    2010-10-01

    A precision harmonic gear drive for the focusing mechanism of a space camera is studied, which adopts an external-meshing complex wave transmission mode. The wave generator combines an elliptic cam with a flexible bearing around it. The flexspline is a single-wave structure with a matching tooth count. The output shaft is supported by a single cross-roller bearing. Ball screws connected to the output shaft translate the rotational motion into linear motion and drive the focusing mirror back and forth along a linear guide. The output rigid wheel is connected to an absolute encoder to detect the displacement of the focusing movement. The drive has the characteristics of a large transmission ratio, high precision, compact structure, high efficiency, and smooth running. Based on the practical application of this harmonic gear drive in the space camera, the relationship between the displacement of the focusing structure and the focal plane movement is derived, the system error is analyzed, and the accuracy is tested with an open-loop control method. Experimental results show that the transmission ratio of the instrument is 1:70 and the repeated positioning accuracy is 2 μm, which meets the requirements for use.

  6. MaRiMba: a software application for spectral library-based MRM transition list assembly.

    PubMed

    Sherwood, Carly A; Eastham, Ashley; Lee, Lik Wee; Peterson, Amelia; Eng, Jimmy K; Shteynberg, David; Mendoza, Luis; Deutsch, Eric W; Risler, Jenni; Tasman, Natalie; Aebersold, Ruedi; Lam, Henry; Martin, Daniel B

    2009-10-01

    Multiple reaction monitoring mass spectrometry (MRM-MS) is a targeted analysis method that has been increasingly viewed as an avenue to explore proteomes with unprecedented sensitivity and throughput. We have developed a software tool, called MaRiMba, to automate the creation of explicitly defined MRM transition lists required to program triple quadrupole mass spectrometers in such analyses. MaRiMba creates MRM transition lists from downloaded or custom-built spectral libraries, restricts output to specified proteins or peptides, and filters based on precursor peptide and product ion properties. MaRiMba can also create MRM lists containing corresponding transitions for isotopically heavy peptides, for which the precursor and product ions are adjusted according to user specifications. This open-source application is operated through a graphical user interface incorporated into the Trans-Proteomic Pipeline, and it outputs the final MRM list to a text file for upload to MS instruments. To illustrate the use of MaRiMba, we used the tool to design and execute an MRM-MS experiment in which we targeted the proteins of a well-defined and previously published standard mixture.
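    The core of an MRM transition is a precursor/product m/z pair, with a second pair generated for the isotopically heavy version of the peptide. As a minimal sketch (the peptide sequence, the y2 product ion, and the +8.0142 Da heavy-lysine label are example choices, not values from the paper or the tool's code):

```python
PROTON = 1.007276   # mass of a proton, Da
WATER = 18.010565   # monoisotopic mass of H2O, Da
# Monoisotopic residue masses (Da) for the residues used below.
RES = {"P": 97.05276, "E": 129.04259, "K": 128.09496}

def mz(mass, z):
    """m/z of an ion of the given neutral mass and charge."""
    return (mass + z * PROTON) / z

def peptide_mass(seq):
    """Monoisotopic neutral mass of a peptide: residues plus one water."""
    return sum(RES[aa] for aa in seq) + WATER

seq = "PEPK"                      # hypothetical tryptic peptide
light = peptide_mass(seq)
heavy = light + 8.014199          # 13C6 15N2 label on the C-terminal Lys

# y2 product ion (C-terminal fragment "PK"): residues + water, singly charged.
y2_light = mz(RES["P"] + RES["K"] + WATER, 1)
print(round(mz(light, 2), 4), round(mz(heavy, 2), 4), round(y2_light, 4))
```

    Note the heavy precursor shifts by the label mass divided by the charge, which is the adjustment the abstract describes for heavy-peptide transitions.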

  7. MaRiMba: A Software Application for Spectral Library-Based MRM Transition List Assembly

    PubMed Central

    Sherwood, Carly A.; Eastham, Ashley; Lee, Lik Wee; Peterson, Amelia; Eng, Jimmy K.; Shteynberg, David; Mendoza, Luis; Deutsch, Eric W.; Risler, Jenni; Tasman, Natalie; Aebersold, Ruedi; Lam, Henry; Martin, Daniel B.

    2009-01-01

    Multiple reaction monitoring mass spectrometry (MRM-MS) is a targeted analysis method that has been increasingly viewed as an avenue to explore proteomes with unprecedented sensitivity and throughput. We have developed a software tool, called MaRiMba, to automate the creation of explicitly defined MRM transition lists required to program triple quadrupole mass spectrometers in such analyses. MaRiMba creates MRM transition lists from downloaded or custom-built spectral libraries, restricts output to specified proteins or peptides, and filters based on precursor peptide and product ion properties. MaRiMba can also create MRM lists containing corresponding transitions for isotopically heavy peptides, for which the precursor and product ions are adjusted according to user specifications. This open-source application is operated through a graphical user interface incorporated into the Trans-Proteomic Pipeline, and it outputs the final MRM list to a text file for upload to MS instruments. To illustrate the use of MaRiMba, we used the tool to design and execute an MRM-MS experiment in which we targeted the proteins of a well-defined and previously published standard mixture. PMID:19603829

  8. Conformity assessment of the measurement accuracy in testing laboratories using a software application

    NASA Astrophysics Data System (ADS)

    Diniţă, A.

    2017-02-01

    This article presents a method for assessing the accuracy of measurements obtained in different tests conducted in laboratories by implementing the interlaboratory comparison method (organization, performance and evaluation of measurements of tests on the same or similar items by two or more laboratories under predetermined conditions). The program (an independent software application), realised by the author and described in this paper, analyses the measurement accuracy and performance of a testing laboratory by comparing the results obtained from different tests using the modified Youden diagram, helping to identify the different types of errors that can occur in measurement, according to ISO 13528:2015, Statistical methods for use in proficiency testing by interlaboratory comparison. A case study is presented in which the chemical composition of identical samples was determined by five different laboratories. The Youden diagram obtained from this case study was used to identify errors in the laboratory testing equipment. This paper was accepted for publication in the Proceedings after a double peer-review process but was not presented at the ROTRIB'16 conference.
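    The classic two-sample analysis behind a Youden diagram can be sketched as follows: each laboratory reports one result on sample A and one on sample B; systematic error moves a lab's point along the 45-degree line through the medians, while random error moves it perpendicular to that line. The five results below are invented for illustration, not the paper's data:

```python
import statistics

labs = {                       # lab: (result on sample A, result on sample B)
    "L1": (10.2, 10.4),
    "L2": (10.0, 10.1),
    "L3": (10.9, 11.0),        # consistently high on both -> systematic error
    "L4": (9.8, 10.0),
    "L5": (10.1, 9.6),         # inconsistent between samples -> random error
}
mx = statistics.median(a for a, _ in labs.values())
my = statistics.median(b for _, b in labs.values())

for lab, (a, b) in labs.items():
    # Decompose each lab's deviation from the median point into a
    # component along the 45-degree line (systematic) and across it (random).
    systematic = ((a - mx) + (b - my)) / 2
    random_part = ((a - mx) - (b - my)) / 2
    print(lab, round(systematic, 3), round(random_part, 3))
```

    In the diagram itself, these two components are read off geometrically; the decomposition above is the numerical equivalent.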

  9. Neuroinformatics Software Applications Supporting Electronic Data Capture, Management, and Sharing for the Neuroimaging Community

    PubMed Central

    Nichols, B. Nolan; Pohl, Kilian M.

    2017-01-01

    Accelerating insight into the relation between brain and behavior entails conducting small and large-scale research endeavors that lead to reproducible results. Consensus is emerging between funding agencies, publishers, and the research community that data sharing is a fundamental requirement to ensure all such endeavors foster data reuse and fuel reproducible discoveries. Funding agency and publisher mandates to share data are bolstered by a growing number of data sharing efforts that demonstrate how information technologies can enable meaningful data reuse. Neuroinformatics evaluates scientific needs and develops solutions to facilitate the use of data across the cognitive and neurosciences. For example, electronic data capture and management tools designed to facilitate human neurocognitive research can decrease the setup time of studies, improve quality control, and streamline the process of harmonizing, curating, and sharing data across data repositories. In this article we outline the advantages and disadvantages of adopting software applications that support these features by reviewing the tools available and then presenting two contrasting neuroimaging study scenarios in the context of conducting a cross-sectional and a multisite longitudinal study. PMID:26267019

  10. Software Aspects of IEEE Floating-Point Computations for Numerical Applications in High Energy Physics

    SciTech Connect

    2010-05-11

    Floating-point computations are at the heart of much of the computing done in high energy physics. The correctness, speed and accuracy of these computations are of paramount importance. The lack of any of these characteristics can mean the difference between new, exciting physics and an embarrassing correction. This talk will examine practical aspects of IEEE 754-2008 floating-point arithmetic as encountered in HEP applications. After describing the basic features of IEEE floating-point arithmetic, the presentation will cover: common hardware implementations (SSE, x87); techniques for improving the accuracy of summation, multiplication and data interchange; compiler options for gcc and icc affecting floating-point operations; and hazards to be avoided. About the speaker: Jeffrey M. Arnold is a Senior Software Engineer in the Intel Compiler and Languages group at Intel Corporation. He has been part of the Digital->Compaq->Intel compiler organization for nearly 20 years; for part of that time, he worked on both low- and high-level math libraries. Prior to that, he was in the VMS Engineering organization at Digital Equipment Corporation. In the late 1980s, Jeff spent 2½ years at CERN as part of the CERN/Digital Joint Project. In 2008, he returned to CERN to spend 10 weeks working with CERN openlab. Since that time, he has returned to CERN multiple times to teach at openlab workshops and consult with various LHC experiments. Jeff received his Ph.D. in physics from Case Western Reserve University.
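    One standard summation-accuracy technique of the kind alluded to above is Kahan (compensated) summation; the sketch below is a generic illustration, not material from the talk. It carries a running correction term that recovers the low-order bits lost when a small addend is rounded away:

```python
def kahan_sum(xs):
    """Compensated summation: error stays O(eps) independent of len(xs)."""
    total = 0.0
    c = 0.0                    # running compensation for lost low-order bits
    for x in xs:
        y = x - c
        t = total + y
        c = (t - total) - y    # (t - total) is the part of y that survived rounding
        total = t
    return total

# Each 1e-16 addend individually rounds away against 1.0, so the naive
# sum stays at 1.0 while the compensated sum recovers the 1e-13 total.
xs = [1.0] + [1e-16] * 1000
print(sum(xs), kahan_sum(xs))
```

    Note that Kahan summation can still lose accuracy when an addend is much larger in magnitude than the running total; Neumaier's variant handles that case as well.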

  11. Software Aspects of IEEE Floating-Point Computations for Numerical Applications in High Energy Physics

    ScienceCinema

    None

    2016-07-12

    Floating-point computations are at the heart of much of the computing done in high energy physics. The correctness, speed and accuracy of these computations are of paramount importance. The lack of any of these characteristics can mean the difference between new, exciting physics and an embarrassing correction. This talk will examine practical aspects of IEEE 754-2008 floating-point arithmetic as encountered in HEP applications. After describing the basic features of IEEE floating-point arithmetic, the presentation will cover: common hardware implementations (SSE, x87); techniques for improving the accuracy of summation, multiplication and data interchange; compiler options for gcc and icc affecting floating-point operations; and hazards to be avoided. About the speaker: Jeffrey M. Arnold is a Senior Software Engineer in the Intel Compiler and Languages group at Intel Corporation. He has been part of the Digital->Compaq->Intel compiler organization for nearly 20 years; for part of that time, he worked on both low- and high-level math libraries. Prior to that, he was in the VMS Engineering organization at Digital Equipment Corporation. In the late 1980s, Jeff spent 2½ years at CERN as part of the CERN/Digital Joint Project. In 2008, he returned to CERN to spend 10 weeks working with CERN openlab. Since that time, he has returned to CERN multiple times to teach at openlab workshops and consult with various LHC experiments. Jeff received his Ph.D. in physics from Case Western Reserve University.

  12. Clinical Application of High-intensity Focused Ultrasound in Cancer Therapy

    PubMed Central

    Hsiao, Yi-Hsuan; Kuo, Shou-Jen; Tsai, Horng-Der; Chou, Ming-Chih; Yeh, Guang-Perng

    2016-01-01

    The treatment of cancer is an important issue in both developing and developed countries. Clinical use of ultrasound in cancer extends beyond diagnosis to treatment. Focused ultrasound surgery (FUS) is a noninvasive technique. By combining high-intensity focused ultrasound (HIFU) with an imaging method, FUS has the potential to ablate tumor lesions precisely. The main mechanisms of HIFU ablation involve mechanical and thermal effects. Recent advances in HIFU have increased its popularity. Promising results have been achieved in managing various malignancies, including pancreas, prostate, liver, kidney, breast and bone. Other applications include brain tumor ablation and disruption of the blood-brain barrier. We aim to briefly outline the clinical utility of FUS as a noninvasive technique for treating a variety of cancer types. PMID:26918034

  13. Application of Bacteriorhodopsin Films in an Adaptive-Focusing Schlieren System

    NASA Technical Reports Server (NTRS)

    Downie, John D.

    1995-01-01

    The photochromic property of bacteriorhodopsin films is exploited in the application of a focusing schlieren optical system for the visualization of optical phase information. By encoding an image on the film with light of one wavelength and reading out with a different wavelength, the readout beam can effectively see the photographic negative of the original image. The potential advantage of this system over previous focusing schlieren systems is that the updatable nature of the bacteriorhodopsin film allows system adaptation. I discuss two image encoding and readout techniques for the bacteriorhodopsin and use film transmission characteristics to choose the more appropriate method. I demonstrate the system principle with experimental results using argon-ion and He-Cd lasers as the two light sources of different wavelengths, and I discuss current limitations to implementation with a white-light source.

  14. Application of bacteriorhodopsin films in an adaptive-focusing schlieren system

    NASA Astrophysics Data System (ADS)

    Downie, John D.

    1995-09-01

    The photochromic property of bacteriorhodopsin films is exploited in the application of a focusing schlieren optical system for the visualization of optical phase information. By encoding an image on the film with light of one wavelength and reading out with a different wavelength, the readout beam can effectively see the photographic negative of the original image. The potential advantage of this system over previous focusing schlieren systems is that the updatable nature of the bacteriorhodopsin film allows system adaptation. I discuss two image encoding and readout techniques for the bacteriorhodopsin and use film transmission characteristics to choose the more appropriate method. I demonstrate the system principle with experimental results using argon-ion and He-Cd lasers as the two light sources of different wavelengths, and I discuss current limitations to implementation with a white-light source.

  15. Abdominal and obstetric applications of a dynamically focused phased array real time ultrasound system.

    PubMed

    Morgan, C L; Trought, W S; von Ramm, O T; Thurstone, F L

    1980-05-01

    Abdominal and obstetric applications of a dynamically focused phased array real time ultrasonic system are described. This work was performed utilising both the Thaumascan (two-dimensional, high resolution, actual time, ultrasound, multi-element array scanner) and the first commercial unit based on this system, the Grumman RT-400. Examples of normal and pathological anatomy are presented from over 300 examinations performed to date, including a series of 28 abdominal aortic aneurysms studied with the RT-400. Following electronic alterations in the Thaumascan with resultant improvement in the grey scale, prospective analyses in 86 obstetric and 23 abdominal examinations were undertaken. These studies indicate that fetal, intra-uterine, and abdominal structures can be rapidly and consistently imaged. The value of real time ultrasonic scanning in obstetric and abdominal examinations is illustrated. The principles of dynamically focused phased arrays are described, and the merits and limitations of these systems are discussed.

  16. Driving circuitry for focused ultrasound noninvasive surgery and drug delivery applications.

    PubMed

    El-Desouki, Munir M; Hynynen, Kullervo

    2011-01-01

    Recent works on focused ultrasound (FUS) have shown great promise for cancer therapy. Researchers are continuously trying to improve system performance, which is resulting in an increased complexity that is more apparent when using multi-element phased array systems. This has led to significant efforts to reduce system size and cost by relying on system integration. Although ideas from other fields such as microwave antenna phased arrays can be adopted in FUS, the application requirements differ significantly since the frequency range used in FUS is much lower. In this paper, we review recent efforts to design efficient power monitoring, phase shifting and output driving techniques used specifically for high intensity focused ultrasound (HIFU).

  17. Investigating the application of AOP methodology in development of Financial Accounting Software using Eclipse-AJDT Environment

    NASA Astrophysics Data System (ADS)

    Sharma, Amita; Sarangdevot, S. S.

    2010-11-01

    The Aspect-Oriented Programming (AOP) methodology has been investigated in the development of real-world business application software—Financial Accounting Software. The Eclipse-AJDT environment has been used as open-source enhanced IDE support for programming in the AOP language AspectJ. Crosscutting concerns have been identified and modularized as aspects. This reduces the complexity of the design considerably, owing to the elimination of code scattering and tangling. Improvements in modularity, quality and performance are achieved. The study concludes that the AOP methodology in the Eclipse-AJDT environment offers powerful support for the modular design and implementation of real-world, quality business software.
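    AspectJ itself is a Java extension, so as a language-neutral sketch a Python decorator can play the role of an aspect: a crosscutting logging concern is written once instead of being scattered and tangled through every accounting function. The function and aspect names below are invented for illustration, not taken from the paper:

```python
import functools

def logged(fn):
    """The "aspect": before/after advice wrapped around any function."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        print(f"entering {fn.__name__}")   # before advice
        result = fn(*args, **kwargs)
        print(f"leaving {fn.__name__}")    # after advice
        return result
    return wrapper

@logged          # applying the decorator is the analogue of weaving at a join point
def post_journal_entry(amount):
    """A core business function, free of any logging code."""
    return {"posted": amount}

post_journal_entry(150.0)
```

    The business logic stays untangled from the logging concern, which is the modularity benefit the study attributes to AOP.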

  18. Computer Software for Library/Media Center Applications [and] An Update.

    ERIC Educational Resources Information Center

    Deacon, Jim

    1983-01-01

    Software titles are listed under the name and address of the company or institution from which they are commercially available, but no price information is provided. Types of microcomputers are indicated for which each software package is available. Titles cover library skills instruction for students or other users as well as library technical…

  19. Applications of Logic Coverage Criteria and Logic Mutation to Software Testing

    ERIC Educational Resources Information Center

    Kaminski, Garrett K.

    2011-01-01

    Logic is an important component of software. Thus, software logic testing has enjoyed significant research over a period of decades, with renewed interest in the last several years. One approach to detecting logic faults is to create and execute tests that satisfy logic coverage criteria. Another approach to detecting faults is to perform mutation…
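
    The two approaches named in this abstract can be illustrated with a small, self-contained sketch (a hypothetical example, not taken from the cited work): a predicate, a mutant created by swapping one logical operator, and an exhaustive search for the test inputs that "kill" the mutant, i.e. make the two versions disagree.

```python
from itertools import product

# Hypothetical predicate (illustrative only, not from the cited work),
# and a mutant obtained by replacing `or` with `and`.
original = lambda a, b, c: a and (b or c)
mutant   = lambda a, b, c: a and (b and c)

# Exhaustively find every test input that kills the mutant,
# i.e. makes the original and the mutant disagree.
killing = [t for t in product([False, True], repeat=3)
           if original(*t) != mutant(*t)]

print(killing)
```

    A test set satisfying a logic coverage criterion can then be judged, in part, by whether it happens to include such killing inputs.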

  1. An application generator for rapid prototyping of Ada real-time control software

    NASA Technical Reports Server (NTRS)

    Johnson, Jim; Biglari, Haik; Lehman, Larry

    1990-01-01

    The need to increase engineering productivity and decrease software life cycle costs in real-time system development establishes a motivation for a method of rapid prototyping. The design by iterative rapid prototyping technique is described. A tool which facilitates such a design methodology for the generation of embedded control software is described.

  2. Software Engineering Environments for Mission Critical Applications -- STARS Alternative Programmatic Approaches.

    DTIC Science & Technology

    1984-08-01

    John Manley, Gil Myers, Tricia Oberndorf, Donn Philpot, Brian Schaar, and Bob Wasilausky for reviewing and commenting on the draft. The authors also...Tools by B. Kernighan and T. Plauger (Prentice-Hall, 1975) and the paper "A Software Engineering Environment for Weapon System Software" by H. Steubing

  3. Optical diagnostic and therapy applications of femtosecond laser radiation using lens-axicon focusing.

    PubMed

    Parigger, Christian G; Johnson, Jacqueline A; Splinter, Robert

    2013-01-01

    Diagnostic modalities by means of optical and/or near infrared femtosecond radiation through biological media can in principle be adapted to therapeutic applications. Of specific interest are soft tissue diagnostics and subsequent therapy through hard tissue such as bone. Femtosecond laser pulses are delivered to hydroxyapatite representing bone, and photo-acoustic spectroscopy is presented in order to identify the location of optical anomalies in an otherwise homogeneous medium. Imaging through bone is being considered for diagnostic, and potentially therapeutic, applications related to brain tumors. The use of mesomeric optics such as lens-axicon combinations is of interest to achieve a favorable distribution of focused radiation. Direct therapy by increasing local temperature to induce hyperthermia is one mode of brain tumor therapy. This can be enhanced by seeding the tumor with nanoparticles. Opto-acoustic imaging using femtosecond laser radiation is a further opportunity for diagnosis.

  4. Theranostic applications of carbon nanomaterials in cancer: Focus on imaging and cargo delivery.

    PubMed

    Chen, Daiqin; Dougherty, Casey A; Zhu, Kaicheng; Hong, Hao

    2015-07-28

    Carbon based nanomaterials have attracted significant attention over the past decades due to their unique physical properties, versatile functionalization chemistry, and biological compatibility. In this review, we will summarize the current state-of-the-art applications of carbon nanomaterials in cancer imaging and drug delivery/therapy. The carbon nanomaterials will be categorized into fullerenes, nanotubes, nanohorns, nanodiamonds, nanodots and graphene derivatives based on their morphologies. The chemical conjugation/functionalization strategies of each category will be introduced before focusing on their applications in cancer imaging (fluorescence/bioluminescence, magnetic resonance (MR), positron emission tomography (PET), single-photon emission computed tomography (SPECT), photoacoustic, Raman imaging, etc.) and cargo (chemo/gene/therapy) delivery. The advantages and limitations of each category and the potential clinical utilization of these carbon nanomaterials will be discussed. Multifunctional carbon nanoplatforms have the potential to serve as optimal candidates for image-guided delivery vectors for cancer.

  5. Expert System Software Assistant for Payload Operations

    NASA Technical Reports Server (NTRS)

    Rogers, Mark N.

    1997-01-01

    The broad objective of this expert system software based application was to demonstrate the enhancements and cost savings that can be achieved through expert system software utilization in a spacecraft ground control center. Spacelab provided a valuable proving ground for this advanced software technology; a technology that will be exploited and expanded for future ISS operations. Our specific focus was on demonstrating payload cadre command and control efficiency improvements through the use of "smart" software which monitors flight telemetry, provides enhanced schematic-based data visualization, and performs advanced engineering data analysis.

  6. Application of the Golden Software Surfer mapping software for automation of visualisation of meteorological and oceanographic data in IMGW Maritime Branch.

    NASA Astrophysics Data System (ADS)

    Piliczewski, B.

    2003-04-01

    The Golden Software Surfer package has been used in the IMGW Maritime Branch for more than ten years. This tool provides ActiveX Automation objects, which allow scripts to control practically every feature of Surfer. These objects can be accessed from any Automation-enabled environment, such as Visual Basic or Excel. Several applications based on Surfer have been developed in IMGW. The first example is an on-line oceanographic service, which presents forecasts of water temperature, sea level and currents originating from the HIROMB model and is automatically updated every day. Surfer was also utilised in MERMAID, an international project supported by the EC under the 5th Framework Programme. The main aim of this project was to create a prototype of an Internet-based data brokerage system that would enable users to search, extract, buy and download datasets containing meteorological or oceanographic data. During the project IMGW developed an online application, called Mermaid Viewer, which enables communication with the data broker and automatic visualisation of the downloaded data using Surfer. Both of the above-mentioned applications were developed in Visual Basic. Adoption of Surfer for the monitoring service, which provides access to the data collected in monitoring of the Baltic Sea environment, is currently under consideration.

  7. Application of Open Source Software by the Lunar Mapping and Modeling Project

    NASA Astrophysics Data System (ADS)

    Ramirez, P.; Goodale, C. E.; Bui, B.; Chang, G.; Kim, R. M.; Law, E.; Malhotra, S.; Rodriguez, L.; Sadaqathullah, S.; Mattmann, C. A.; Crichton, D. J.

    2011-12-01

    The Lunar Mapping and Modeling Project (LMMP), led by the Marshall Space Flight Center (MSFC), is responsible for the development of an information system to support lunar exploration, decision analysis, and release of lunar data to the public. The data available through the lunar portal is predominantly derived from present lunar missions (e.g., the Lunar Reconnaissance Orbiter (LRO)) and from historical missions (e.g., Apollo). This project has created a gold source of data, models, and tools for lunar explorers to exercise and incorporate into their activities. At the Jet Propulsion Laboratory (JPL), we focused on engineering and building the infrastructure to support cataloging, archiving, accessing, and delivery of lunar data. We decided to use a RESTful service-oriented architecture to enable us to abstract from the underlying technology choices and focus on interfaces to be used internally and externally. This decision allowed us to leverage several open source software components and integrate them by either writing a thin REST service layer or relying on the API they provided; the approach chosen depended on the targeted consumer of a given interface. We will discuss our varying experience using open source products, namely Apache OODT, Oracle Berkeley DB XML, Apache Solr, and Oracle OpenSSO (now named OpenAM). Apache OODT, developed at NASA's Jet Propulsion Laboratory and recently migrated over to Apache, provided the means for ingestion and cataloguing of products within the infrastructure. Its usage was based upon team experience with the project and past benefit received on other projects internal and external to JPL. Berkeley DB XML, distributed by Oracle for both commercial and open source use, was the storage technology chosen for our metadata. This decision was based in part on our use of Federal Geographic Data Committee (FGDC) metadata, which is expressed in XML, and the desire to keep it in its native form and exploit other technologies built on

  8. Excel2Genie: A Microsoft Excel application to improve the flexibility of the Genie-2000 Spectroscopic software.

    PubMed

    Forgács, Attila; Balkay, László; Trón, Lajos; Raics, Péter

    2014-12-01

    Excel2Genie, a simple and user-friendly Microsoft Excel interface, has been developed for the Genie-2000 Spectroscopic Software of Canberra Industries. This Excel application can directly control a Canberra Multichannel Analyzer (MCA), process the acquired data and visualize them. Combining Genie-2000 with Excel2Genie results in remarkably increased flexibility and the possibility to carry out repetitive data acquisitions, even with changing parameters, and more sophisticated analysis. The developed software package comprises three worksheets that display the parameters and results of data acquisition, data analysis and mathematical operations carried out on the measured gamma spectra, while also allowing control of these processes. Excel2Genie is freely available to assist gamma spectrum measurements and data evaluation by interested Canberra users. With access to the Visual Basic for Applications (VBA) source code of this application, users are able to modify the developed interface according to their intentions.

  9. Printable Electrochemical Biosensors: A Focus on Screen-Printed Electrodes and Their Application

    PubMed Central

    Yamanaka, Keiichiro; Vestergaard, Mun’delanji C.; Tamiya, Eiichi

    2016-01-01

    In this review we present electrochemical biosensor developments, focusing on screen-printed electrodes (SPEs) and their applications. In particular, we discuss how SPEs enable simple integration, and the portability needed for on-field applications. First, we briefly discuss the general concept of biosensors and quickly move on to electrochemical biosensors. Drawing from research undertaken in this area, we cover the development of electrochemical DNA biosensors in great detail. Through specific examples, we describe the fabrication and surface modification of printed electrodes for sensitive and selective detection of targeted DNA sequences, as well as integration with reverse transcription-polymerase chain reaction (RT-PCR). For a more rounded approach, we also touch on electrochemical immunosensors and enzyme-based biosensors. Last, we present some electrochemical devices specifically developed for use with SPEs, including a USB-powered compact mini-potentiostat. This coupling demonstrates the practical use of printable electrode technologies for application at the point of use. Although tremendous advances have indeed been made in this area, a few challenges remain. One of the main challenges is application of these technologies for on-field analysis, which involves complicated sample matrices. PMID:27775661

  10. Printable Electrochemical Biosensors: A Focus on Screen-Printed Electrodes and Their Application.

    PubMed

    Yamanaka, Keiichiro; Vestergaard, Mun'delanji C; Tamiya, Eiichi

    2016-10-21

    In this review we present electrochemical biosensor developments, focusing on screen-printed electrodes (SPEs) and their applications. In particular, we discuss how SPEs enable simple integration, and the portability needed for on-field applications. First, we briefly discuss the general concept of biosensors and quickly move on to electrochemical biosensors. Drawing from research undertaken in this area, we cover the development of electrochemical DNA biosensors in great detail. Through specific examples, we describe the fabrication and surface modification of printed electrodes for sensitive and selective detection of targeted DNA sequences, as well as integration with reverse transcription-polymerase chain reaction (RT-PCR). For a more rounded approach, we also touch on electrochemical immunosensors and enzyme-based biosensors. Last, we present some electrochemical devices specifically developed for use with SPEs, including a USB-powered compact mini-potentiostat. This coupling demonstrates the practical use of printable electrode technologies for application at the point of use. Although tremendous advances have indeed been made in this area, a few challenges remain. One of the main challenges is application of these technologies for on-field analysis, which involves complicated sample matrices.

  11. Visual Recognition Software for Binary Classification and its Application to Pollen Identification

    NASA Astrophysics Data System (ADS)

    Punyasena, S. W.; Tcheng, D. K.; Nayak, A.

    2014-12-01

    An underappreciated source of uncertainty in paleoecology is the uncertainty of palynological identifications. The confidence of any given identification is not regularly reported in published results, so it cannot be incorporated into subsequent meta-analyses. Automated identification systems potentially provide a means of objectively measuring the confidence of a given count or single identification, as well as a mechanism for increasing sample sizes and throughput. We developed the software ARLO (Automated Recognition with Layered Optimization) to tackle difficult visual classification problems such as pollen identification. ARLO applies pattern recognition and machine learning to the analysis of pollen images. The features that the system discovers are not the traditional features of pollen morphology. Instead, general-purpose image features, such as pixel lines and grids of different dimensions, size, spacing, and resolution, are used. ARLO adapts to a given problem by searching for the most effective combination of feature representation and learning strategy. We present a two-phase approach which uses our machine learning process to first segment pollen grains from the background and then classify pollen pixels and report species ratios. We conducted two separate experiments that utilized two distinct sets of algorithms and optimization procedures. The first analysis focused on reconstructing black and white spruce pollen ratios, training and testing our classification model at the slide level. This allowed us to directly compare our automated counts and expert counts to slides of known spruce ratios. Our second analysis focused on maximizing classification accuracy at the individual pollen grain level. Instead of predicting ratios of given slides, we predicted the species represented in a given image window. The resulting analysis was more scalable, as we were able to adapt the most efficient parts of the methodology from our first analysis. ARLO was able to
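
    ARLO itself is not reproduced here; the following is only a minimal sketch of the two-phase idea the abstract describes, with entirely synthetic data and assumed thresholds and centroids: phase one segments "grain" pixels from background by thresholding, and phase two assigns each segmented pixel to the nearest class centroid and reports a species ratio.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic single-channel "slide": background near 0, and two species
# of grain pixels with distinct mean intensities (0.4 and 0.8).
background = rng.normal(0.05, 0.01, size=600)
species_a  = rng.normal(0.40, 0.02, size=300)
species_b  = rng.normal(0.80, 0.02, size=100)
pixels = np.concatenate([background, species_a, species_b])

# Phase 1: segment grain pixels from background with an assumed threshold.
grains = pixels[pixels > 0.2]

# Phase 2: classify each grain pixel by its nearest class centroid
# (centroids are assumed known here, as if learned from training slides).
centroids = np.array([0.4, 0.8])
labels = np.argmin(np.abs(grains[:, None] - centroids[None, :]), axis=1)

ratio_a = np.mean(labels == 0)   # fraction classified as species A
print(round(float(ratio_a), 2))  # prints 0.75
```

    A real pipeline would of course operate on 2-D image windows and learned features rather than raw intensities, but the segment-then-classify-then-count structure is the same.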

  12. Software Reviews.

    ERIC Educational Resources Information Center

    Smith, Richard L., Ed.

    1988-01-01

    Contains evaluations of two computer software packages, "Simulation Experiments 45-48 in Epstein's Laboratory Manual for Chemistry" and "Maps and Legends--the Cartographer (Ver 3.0)." Includes a brief description, applications, and the perceived strengths and weaknesses for each package. (CW)

  13. Software Configuration Management Guidebook

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The growth in cost and importance of software to NASA has caused NASA to address the improvement of software development across the agency. One of the products of this program is a series of guidebooks that define a NASA concept of the assurance processes which are used in software development. The Software Assurance Guidebook, SMAP-GB-A201, issued in September, 1989, provides an overall picture of the concepts and practices of NASA in software assurance. Lower level guidebooks focus on specific activities that fall within the software assurance discipline, and provide more detailed information for the manager and/or practitioner. This is the Software Configuration Management Guidebook which describes software configuration management in a way that is compatible with practices in industry and at NASA Centers. Software configuration management is a key software development process, and is essential for doing software assurance.

  14. New horizons for focused ultrasound (FUS) - therapeutic applications in neurodegenerative diseases.

    PubMed

    Miller, Diane B; O'Callaghan, James P

    2017-04-01

    Access to the CNS and delivery of therapeutics across the blood-brain barrier remains a challenge for most treatments of major neurological diseases such as AD or PD. Focused ultrasound represents a potential approach for overcoming these barriers to treating AD and PD and perhaps other neurological diseases. Ultrasound (US) is best known for its imaging capabilities of organs in the periphery, but various arrangements of the transducers producing the acoustic signal allow the energy to be precisely focused (F) within the skull. Using FUS in combination with MRI and contrast agents further enhances accuracy by providing clear information on location. Varying the acoustic power allows FUS to be used in applications ranging from imaging, stimulation of brain circuits, to ablation of tissue. In several transgenic mouse models of AD, the use of FUS with microbubbles reduces plaque load and improves cognition and suggests the need to investigate this technology for plaque removal in AD. In PD, FUS is being explored as a way to non-invasively ablate the brain areas responsible for the tremor and dyskinesia associated with the disease, but has yet to be utilized for non-invasive delivery of putative therapeutics. The FUS approach also greatly increases the range of possible CNS therapeutics as it overcomes the issues of BBB penetration. In this review we discuss how the characteristics and various applications of FUS may advance the therapeutics available for treating or preventing neurodegenerative disorders with an emphasis on treating AD and PD.

  15. An application of the focused liquid jet: needle free drug injection system

    NASA Astrophysics Data System (ADS)

    Kiyama, Akihito; Katsuta, Chihiro; Kawamoto, Sennosuke; Endo, Nanami; Tanaka, Akane; Tagawa, Yoshiyuki

    2016-11-01

    Recently, the focused liquid jet has drawn great attention since it can be used in various applications (e.g., inkjet printing, medical devices). In our research, in order to assess its applicability to a needle-free drug injection system, we shoot a focused liquid jet at very high speed onto animal skin. Previously, penetration of this jet into gelatin and an artificial skin had been studied in order to model the jet penetration process. However, experiments on jet injection into animal skin had not yet been conducted. In this presentation, we inject ink as the liquid jet into the skin of the hairless rat. We observe the top/back view and the cross-sectional view of the injected (ink-stained) skin. We capture the stained area of the skin in order to find characteristics of the jet penetration. We discuss the criteria for jet penetration into the skin. This work was supported by JSPS KAKENHI Grant Numbers JP26709007, JP16J08521.

  16. A strong-focusing 800 MeV cyclotron for high-current applications

    NASA Astrophysics Data System (ADS)

    Pogue, N.; Assadi, S.; Badgley, K.; Comeaux, J.; Kellams, J.; McInturff, A.; McIntyre, P.; Sattarov, A.

    2013-04-01

    A superconducting strong-focusing cyclotron (SFC) is being developed for high-current applications. It incorporates four innovations. Superconducting quarter-wave cavities are used to provide >20 MV/turn acceleration. The orbit separation is thereby opened so that bunch-bunch interactions between successive orbits are eliminated. Quadrupole focusing channels are incorporated within the sectors so that alternating-gradient strong-focusing transport is maintained throughout. Dipole windings on the inner and outer orbits provide enhanced control for injection and extraction of bunches. Finally, each sector magnet is configured as a flux-coupled stack of independent apertures, so that any desired number of independent cyclotrons can be integrated within a common footprint. Preliminary simulations indicate that each SFC should be capable of accelerating 10 mA CW to 800 MeV with very low loss and >50% energy efficiency. A primary motivation for SFC is as a proton driver for accelerator-driven subcritical fission in a molten salt core. The cores are fueled solely with the transuranics from spent nuclear fuel from a conventional nuclear power plant. The beams from one SFC stack would destroy all of the transuranics and long-lived fission products that are produced by a GWe reactor [1]. This capability offers the opportunity to close the nuclear fuel cycle and provide a path to green nuclear energy.

  17. Multi-focus, high resolution inspection system for extended range applications

    NASA Astrophysics Data System (ADS)

    Harding, Kevin

    2016-05-01

    Visual inspection of parts or structures for defects typically requires good spatial resolution to see the defects, but may also require a large focus range. To obtain the best resolution from an imaging system, however, it needs to have a low f-number, which limits the usable depth of field. Methods using autofocus or focus stacking provide more range at high resolution, but often at the expense of computation time, loss of a real-time image and uncertainty in scale changes. This paper describes an approach to quickly move through a range of focus positions, without the need to move optics mechanically, in a manner that is highly repeatable, maintains high resolution and offers the potential for a live image directly viewable by an inspector, even at microscope-level magnifications. We present the approach we investigated and discuss its pros and cons for a range of applications, from large structures to small-feature inspection. The paper presents examples of the resolution achieved and shows how the multiple images might also be used to determine other parameters, such as the pose of a test surface.
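
    The resolution-versus-depth-of-field trade-off described above can be made concrete with the standard thin-lens approximation DOF ≈ 2Nc(m+1)/m^2, where N is the f-number, c the acceptable circle of confusion and m the magnification (a textbook formula, not taken from this paper; the numbers below are illustrative only):

```python
def depth_of_field(f_number, coc_mm, magnification):
    """Standard thin-lens DOF approximation 2*N*c*(m+1)/m^2, in mm."""
    m = magnification
    return 2 * f_number * coc_mm * (m + 1) / m**2

# Illustrative values: 0.01 mm circle of confusion at 1:1 magnification.
# Halving resolution limits (raising N) buys depth of field linearly.
for n in (2, 8, 16):
    print(n, round(depth_of_field(n, 0.01, 1.0), 3))
```

    At f/2 the usable depth of field is only 0.08 mm under these assumptions, which is why a low f-number system needs some form of focus scanning to cover an extended range.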

  18. In Vivo application and localization of transcranial focused ultrasound using dual-mode ultrasound arrays.

    PubMed

    Haritonova, Alyona; Liu, Dalong; Ebbini, Emad S

    2015-12-01

    Focused ultrasound (FUS) has been proposed for a variety of transcranial applications, including neuromodulation, tumor ablation, and blood-brain barrier opening. A flurry of activity in recent years has generated encouraging results demonstrating its feasibility in these and other applications. To date, monitoring of FUS beams has been primarily accomplished using MR guidance, where both MR thermography and elastography have been used. The recent introduction of real-time dual-mode ultrasound array (DMUA) systems offers a new paradigm in transcranial focusing. In this paper, we present first experimental results of ultrasound-guided transcranial FUS (tFUS) application in a rodent brain, both ex vivo and in vivo. DMUA imaging is used for visualization of the treatment region for placement of the focal spot within the brain. This includes the detection and localization of pulsating blood vessels at or near the target point(s). In addition, DMUA imaging is used to monitor and localize the FUS-tissue interactions in real time. In particular, a concave (40 mm radius of curvature), 32-element, 3.5-MHz DMUA prototype was used for imaging and tFUS application in ex vivo and in vivo rat models. The ex vivo experiments were used to evaluate the point spread function of the transcranial DMUA imaging at various points within the brain. In addition, DMUA-based transcranial ultrasound thermography measurements were compared with thermocouple measurements of subtherapeutic tFUS heating in rat brain ex vivo. The ex vivo setting was also used to demonstrate the capability of DMUA to produce localized thermal lesions. The in vivo experiments were designed to demonstrate the ability of the DMUA to apply, monitor, and localize subtherapeutic tFUS patterns that could be beneficial in transient blood-brain barrier opening. The results show that although the DMUA focus is degraded due to the propagation through the skull, it still produces localized heating effects within a sub

  19. In Vivo Application and Localization of Transcranial Focused Ultrasound Using Dual-Mode Ultrasound Arrays

    PubMed Central

    Haritonova, Alyona; Liu, Dalong; Ebbini, Emad S.

    2015-01-01

    Focused ultrasound (FUS) has been proposed for a variety of transcranial applications, including neuromodulation, tumor ablation, and blood brain barrier opening. A flurry of activity in recent years has generated encouraging results demonstrating its feasibility in these and other applications. To date, monitoring of FUS beams have been primarily accomplished using MR guidance, where both MR thermography and elastography have been used. The recent introduction of real-time dual-mode ultrasound array (DMUA) systems offers a new paradigm in transcranial focusing. In this paper, we present first experimental results of ultrasound-guided transcranial FUS (tFUS) application in a rodent brain, both ex vivo and in vivo. DMUA imaging is used for visualization of the treatment region for placement of the focal spot within the brain. This includes the detection and localization of pulsating blood vessels at or near the target point(s). In addition, DMUA imaging is used to monitor and localize the FUS-tissue interactions in real-time. In particular, a concave (40-mm radius of curvature), 32-element, 3.5 MHz DMUA prototype was used for imaging and tFUS application in ex vivo and in vivo rat model. The ex vivo experiments were used to evaluate the point spread function (psf) of the transcranial DMUA imaging at various points within the brain. In addition, DMUA-based transcranial ultrasound thermography measurements were compared with thermocouple measurements of subtherapeutic tFUS heating in rat brain ex vivo. The ex vivo setting was also used to demonstrate the DMUA capability to produce localized thermal lesions. The in vivo experiments were designed to demonstrate the ability of the DMUA to apply, monitor, and localize subtherapeutic tFUS patterns that could be beneficial in transient blood brain barrier opening. The results show that, while the DMUA focus is degraded due to the propagation through the skull, it still produces localized heating effects within sub

  20. The Program for Climate Model Diagnosis and Intercomparison (PCMDI) Software Development: Applications, Infrastructure, and Middleware/Networks

    SciTech Connect

    Williams, Dean N.

    2011-06-30

    The status of and future plans for the Program for Climate Model Diagnosis and Intercomparison (PCMDI) hinge on software that PCMDI is either currently distributing or plans to distribute to the climate community in the near future. These software products include standard conventions, national and international federated infrastructures, and community analysis and visualization tools. This report also mentions other secondary software, not necessarily led by or developed at PCMDI, to provide a complete picture of the overarching applications, infrastructures, and middleware/networks. Much of the software described anticipates the use of future technologies envisioned over the next one to ten years. These technologies, together with the software, will be the catalyst required to address extreme-scale data warehousing, scalability issues, and service-level requirements for a diverse set of well-known projects essential for predicting climate change. These tools, unlike the static analysis tools of the past, will support the co-existence of many users in a productive, shared virtual environment. This advanced technological world, driven by extreme-scale computing and the data it generates, will increase scientists' productivity, exploit national and international relationships, and push research to new levels of understanding.

  1. Choice: 36 band feature selection software with applications to multispectral pattern recognition

    NASA Technical Reports Server (NTRS)

    Jones, W. C.

    1973-01-01

    Feature selection software was developed at the Earth Resources Laboratory that is capable of accepting up to 36 input channels and selecting channel subsets according to several criteria based on divergence. One of the criteria used is compatible with the requirements of the table look-up classifier. The software indicates which channel subset best separates (based on average divergence) each class from all other classes. The software employs an exhaustive search technique, and computer time is not prohibitive. A typical task, selecting the best 4 of 22 channels for 12 classes, takes 9 minutes on a Univac 1108 computer.
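
    The exhaustive divergence-based search can be sketched generically as follows (an illustrative reconstruction, not the ERL program: classes are modeled as Gaussians, and the symmetric divergence between each class pair is averaged over every channel subset of the requested size):

```python
from itertools import combinations
import numpy as np

def divergence(m0, c0, m1, c1):
    """Symmetric divergence between two Gaussian class distributions."""
    inv0, inv1 = np.linalg.inv(c0), np.linalg.inv(c1)
    d = m0 - m1
    return 0.5 * (np.trace(inv1 @ c0) + np.trace(inv0 @ c1)
                  - 2 * len(d) + d @ (inv0 + inv1) @ d)

def best_channel_subset(means, covs, k):
    """Exhaustively pick the k channels maximizing average pairwise divergence."""
    best, best_score = None, float("-inf")
    pairs = list(combinations(range(len(means)), 2))
    for subset in combinations(range(means.shape[1]), k):
        sel = list(subset)
        idx = np.ix_(sel, sel)
        score = sum(divergence(means[i][sel], covs[i][idx],
                               means[j][sel], covs[j][idx])
                    for i, j in pairs) / len(pairs)
        if score > best_score:
            best, best_score = subset, score
    return best, best_score

# Toy data: 3 classes over 5 channels; only channels 1 and 3 separate them.
means = np.zeros((3, 5))
means[1, 1] = 3.0
means[2, 3] = 3.0
covs = np.stack([np.eye(5)] * 3)

best, score = best_channel_subset(means, covs, k=2)
print(best)  # the channel pair that best separates the classes: (1, 3)
```

    For 4 of 22 channels the search visits C(22, 4) = 7315 subsets, which is why exhaustive search was tractable even on 1970s hardware.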

  2. OPERA, an automatic PSF reconstruction software for Shack-Hartmann AO systems: application to Altair

    NASA Astrophysics Data System (ADS)

    Jolissaint, Laurent; Veran, Jean-Pierre; Marino, Jose

    2004-10-01

    When doing high angular resolution imaging with adaptive optics (AO), it is of crucial importance to have an accurate knowledge of the point spread function associated with each observation. Applications are numerous: image contrast enhancement by deconvolution, improved photometry and astrometry, as well as real time AO performance evaluation. In this paper, we present our work on automatic PSF reconstruction based on control loop data acquired simultaneously with the observation. This problem has already been solved for curvature AO systems. To adapt this method to another type of WFS, a specific analytical noise propagation model must be established. For the Shack-Hartmann WFS, we are able to derive a very accurate estimate of the noise on each slope measurement, based on the covariances of the WFS CCD pixel values in the corresponding sub-aperture. These covariances can be either derived off-line from telemetry data, or calculated by the AO computer during the acquisition. We present improved methods to determine 1) r0 from the DM drive commands, which includes an estimation of the outer scale L0, and 2) the contribution of the high spatial frequency component of the turbulent phase, which is not corrected by the AO system and is scaled by r0. This new method has been implemented in an IDL-based software package called OPERA (Performance of Adaptive Optics). We have tested OPERA on Altair, the recently commissioned Gemini-North AO system, and present our preliminary results. We also summarize the AO data required to run OPERA on any other AO system.

  3. VARK learning preferences and mobile anatomy software application use in pre-clinical chiropractic students.

    PubMed

    Meyer, Amanda J; Stomski, Norman J; Innes, Stanley I; Armson, Anthony J

    2016-05-06

    Ubiquitous smartphone ownership and reduced face-to-face teaching time may lead to students making greater use of mobile technologies in their learning. This is the first study to report on the prevalence of mobile gross anatomy software applications (apps) usage in pre-clinical chiropractic students and to ascertain if a relationship exists between preferred learning styles as determined by the validated VARK(©) questionnaire and use of mobile anatomy apps. The majority of the students who completed the VARK questionnaire were multimodal learners with kinesthetic and visual preferences. Sixty-seven percent (73/109) of students owned one or more mobile anatomy apps which were used by 57 students. Most of these students owned one to five apps and spent less than 30 minutes per week using them. Six of the top eight mobile anatomy apps owned and recommended by the students were developed by 3D4Medical. Visual learning preferences were not associated with time spent using mobile anatomy apps (OR = 0.40, 95% CI 0.12-1.40). Similarly, kinesthetic learning preferences (OR = 1.88, 95% CI 0.18-20.2), quadmodal preferences (OR = 0.71, 95% CI 0.06-9.25), or gender (OR = 1.51, 95% CI 0.48-4.81) did not affect the time students spent using mobile anatomy apps. Learning preferences do not appear to influence students' time spent using mobile anatomy apps. Anat Sci Educ 9: 247-254. © 2015 American Association of Anatomists.
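    The odds ratios quoted above follow the standard 2 × 2 table form. A brief sketch with hypothetical counts (not the study's data) shows how an OR and its Wald 95% confidence interval are derived:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a, b = outcome yes/no in the exposed group;
    c, d = outcome yes/no in the unexposed group.
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts, for illustration only.
or_, lo, hi = odds_ratio_ci(10, 20, 15, 30)
```

    A CI that straddles 1.0, as in every comparison reported above, indicates no detectable association.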

  4. Surface modification of electrospun fibres for biomedical applications: A focus on radical polymerization methods.

    PubMed

    Duque Sánchez, Lina; Brack, Narelle; Postma, Almar; Pigram, Paul J; Meagher, Laurence

    2016-11-01

    The development of electrospun ultrafine fibres from biodegradable and biocompatible polymers has created exciting opportunities for biomedical applications. Fibre meshes with high surface area, suitable porosity and stiffness have been produced. Despite desirable structural and topographical properties, for most synthetic and some naturally occurring materials, the nature of the fibre surface chemistry has inhibited development. Hydrophobicity, undesirable non-specific protein adsorption and bacterial attachment and growth, coupled with a lack of surface functionality in many cases and an incomplete understanding of the myriad of interactions between cells and extracellular matrix (ECM) proteins have impeded the application of these systems. Chemical and physical treatments have been applied in order to modify or control the surface properties of electrospun fibres, with some success. Chemical modification using controlled radical polymerization, referred to here as reversible-deactivation radical polymerization (RDRP), has successfully introduced advanced surface functionality in some fibre systems. Atom transfer radical polymerization (ATRP) and reversible addition fragmentation chain transfer (RAFT) are the most widely investigated techniques. This review analyses the practical applications of electrospinning for the fabrication of high quality ultrafine fibres and evaluates the techniques available for the surface modification of electrospun ultrafine fibres and includes a detailed focus on RDRP approaches.

  5. The international river interface cooperative: Public domain flow and morphodynamics software for education and applications

    USGS Publications Warehouse

    Nelson, Jonathan M.; Shimizu, Yasuyuki; Abe, Takaaki; Asahi, Kazutake; Gamou, Mineyuki; Inoue, Takuya; Iwasaki, Toshiki; Kakinuma, Takaharu; Kawamura, Satomi; Kimura, Ichiro; Kyuka, Tomoko; McDonald, Richard R.; Nabi, Mohamed; Nakatsugawa, Makoto; Simoes, Francisco J.; Takebayashi, Hiroshi; Watanabe, Yasunori

    2016-01-01

    This paper describes a new, public-domain interface for modeling flow, sediment transport and morphodynamics in rivers and other geophysical flows. The interface is named after the International River Interface Cooperative (iRIC), the group that constructed the interface and many of the current solvers included in iRIC. The interface is entirely free to any user and currently houses thirteen models ranging from simple one-dimensional models through three-dimensional large-eddy simulation models. Solvers are only loosely coupled to the interface so it is straightforward to modify existing solvers or to introduce other solvers into the system. Six of the most widely-used solvers are described in detail including example calculations to serve as an aid for users choosing what approach might be most appropriate for their own applications. The example calculations range from practical computations of bed evolution in natural rivers to highly detailed predictions of the development of small-scale bedforms on an initially flat bed. The remaining solvers are also briefly described. Although the focus of most solvers is coupled flow and morphodynamics, several of the solvers are also specifically aimed at providing flood inundation predictions over large spatial domains. Potential users can download the application, solvers, manuals, and educational materials including detailed tutorials at www.i-ric.org. The iRIC development group encourages scientists and engineers to use the tool and to consider adding their own methods to the iRIC suite of tools.

  6. The international river interface cooperative: Public domain flow and morphodynamics software for education and applications

    NASA Astrophysics Data System (ADS)

    Nelson, Jonathan M.; Shimizu, Yasuyuki; Abe, Takaaki; Asahi, Kazutake; Gamou, Mineyuki; Inoue, Takuya; Iwasaki, Toshiki; Kakinuma, Takaharu; Kawamura, Satomi; Kimura, Ichiro; Kyuka, Tomoko; McDonald, Richard R.; Nabi, Mohamed; Nakatsugawa, Makoto; Simões, Francisco R.; Takebayashi, Hiroshi; Watanabe, Yasunori

    2016-07-01

    This paper describes a new, public-domain interface for modeling flow, sediment transport and morphodynamics in rivers and other geophysical flows. The interface is named after the International River Interface Cooperative (iRIC), the group that constructed the interface and many of the current solvers included in iRIC. The interface is entirely free to any user and currently houses thirteen models ranging from simple one-dimensional models through three-dimensional large-eddy simulation models. Solvers are only loosely coupled to the interface so it is straightforward to modify existing solvers or to introduce other solvers into the system. Six of the most widely-used solvers are described in detail including example calculations to serve as an aid for users choosing what approach might be most appropriate for their own applications. The example calculations range from practical computations of bed evolution in natural rivers to highly detailed predictions of the development of small-scale bedforms on an initially flat bed. The remaining solvers are also briefly described. Although the focus of most solvers is coupled flow and morphodynamics, several of the solvers are also specifically aimed at providing flood inundation predictions over large spatial domains. Potential users can download the application, solvers, manuals, and educational materials including detailed tutorials at www.i-ric.org. The iRIC development group encourages scientists and engineers to use the tool and to consider adding their own methods to the iRIC suite of tools.

  7. Feature Selection for Evolutionary Commercial-off-the-Shelf Software: Studies Focusing on Time-to-Market, Innovation and Hedonic-Utilitarian Trade-Offs

    ERIC Educational Resources Information Center

    Kakar, Adarsh Kumar

    2013-01-01

    Feature selection is one of the most important decisions made by product managers. This three article study investigates the concepts, tools and techniques for making trade-off decisions of introducing new features in evolving Commercial-Off-The-Shelf (COTS) software products. The first article investigates the efficacy of various feature…

  9. Software component quality evaluation

    NASA Technical Reports Server (NTRS)

    Clough, A. J.

    1991-01-01

    The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.

  10. Towards the Application of Open Source Software in Developing National Electronic Health Record-Narrative Review Article.

    PubMed

    Aminpour, Farzaneh; Sadoughi, Farahnaz; Ahmadi, Maryam

    2013-12-01

    Electronic Health Record (EHR) is a repository of patient health information shared among multiple authorized users. As a modern method of storing and processing health information, it is a solution for improving the quality, safety and efficiency of patient care and the health system. However, establishment of EHR requires a significant investment of time and money. While many healthcare providers have very limited capital, application of open source software can be considered a solution in developing a national electronic health record, especially in countries with low income. The evidence showed that financial limitation is one of the obstacles to implementing electronic health records in developing countries. Therefore, establishment of an open source EHR system capable of modification according to the national requirements seems to be inevitable in Iran. The present study identifies the impact of application of open source software in developing a national electronic health record in Iran.

  11. Towards the Application of Open Source Software in Developing National Electronic Health Record-Narrative Review Article

    PubMed Central

    AMINPOUR, Farzaneh; SADOUGHI, Farahnaz; AHMADI, Maryam

    2013-01-01

    Electronic Health Record (EHR) is a repository of patient health information shared among multiple authorized users. As a modern method of storing and processing health information, it is a solution for improving the quality, safety and efficiency of patient care and the health system. However, establishment of EHR requires a significant investment of time and money. While many healthcare providers have very limited capital, application of open source software can be considered a solution in developing a national electronic health record, especially in countries with low income. The evidence showed that financial limitation is one of the obstacles to implementing electronic health records in developing countries. Therefore, establishment of an open source EHR system capable of modification according to the national requirements seems to be inevitable in Iran. The present study identifies the impact of application of open source software in developing a national electronic health record in Iran. PMID:26060634

  12. The Use of Mobile Health Applications Among Youth and Young Adults Living with HIV: Focus Group Findings

    PubMed Central

    Saberi, Parya; Siedle-Khan, Robert; Sheon, Nicolas; Lightfoot, Marguerita

    2016-01-01

    The objective of this study was to conduct focus groups with youth (18–29 years old) living with HIV (YLWH) to better understand preferences for mobile applications in general and to inform the design of a mobile health application aimed at improving retention and engagement in healthcare and adherence to antiretroviral therapy. We conducted four focus groups with YLWH to elicit the names and characteristics of applications that they commonly used, reasons they deleted applications, and the features of an ideal mobile health application. A diverse sample of youth (N = 17) with a mean age of 25 years, 88.2% male, and 29.4% African American participated in four focus groups. Positive attributes of applications included informative, simple, allowing for networking, timely updates, little overlap with other applications, unlimited access to entertainment, and with ongoing advancement. Participants identified several reasons for deleting applications, including engaging in excessive behaviors (e.g., spending money), for hook ups only, too many notifications or restrictions, occupied too much space on device, or required wireless connectivity or frequent updates. Participants suggested that a mobile health application that they would find useful should have the ability to connect to a community of other YLWH, readily access healthcare providers, track personal data and information (such as laboratory data), and obtain health news and education. Privacy was a key factor in a mobile health application for all participants. Researchers can use the information provided by focus group participants in creating mobile health applications for YLWH. PMID:27214751

  13. The Use of Mobile Health Applications Among Youth and Young Adults Living with HIV: Focus Group Findings.

    PubMed

    Saberi, Parya; Siedle-Khan, Robert; Sheon, Nicolas; Lightfoot, Marguerita

    2016-06-01

    The objective of this study was to conduct focus groups with youth (18-29 years old) living with HIV (YLWH) to better understand preferences for mobile applications in general and to inform the design of a mobile health application aimed at improving retention and engagement in healthcare and adherence to antiretroviral therapy. We conducted four focus groups with YLWH to elicit the names and characteristics of applications that they commonly used, reasons they deleted applications, and the features of an ideal mobile health application. A diverse sample of youth (N = 17) with a mean age of 25 years, 88.2% male, and 29.4% African American participated in four focus groups. Positive attributes of applications included informative, simple, allowing for networking, timely updates, little overlap with other applications, unlimited access to entertainment, and with ongoing advancement. Participants identified several reasons for deleting applications, including engaging in excessive behaviors (e.g., spending money), for hook ups only, too many notifications or restrictions, occupied too much space on device, or required wireless connectivity or frequent updates. Participants suggested that a mobile health application that they would find useful should have the ability to connect to a community of other YLWH, readily access healthcare providers, track personal data and information (such as laboratory data), and obtain health news and education. Privacy was a key factor in a mobile health application for all participants. Researchers can use the information provided by focus group participants in creating mobile health applications for YLWH.

  14. Application of an ultrasonic focusing radiator for acoustic levitation of submillimeter samples

    NASA Technical Reports Server (NTRS)

    Lee, M. C.

    1981-01-01

    An acoustic apparatus has been specifically developed to handle samples of submillimeter size in a gaseous medium. This apparatus consists of an acoustic levitation device, deployment devices for small liquid and solid samples, heat sources for sample heat treatment, acoustic alignment devices, a cooling system and data-acquisition instrumentation. The levitation device includes a spherical aluminum dish of 12 in. diameter and 0.6 in. thickness, 130 pieces of PZT transducers attached to the back side of the dish and a spherical concave reflector situated in the vicinity of the center of curvature of the dish. The three lowest operating frequencies for the focusing-radiator levitation device are 75, 105 and 163 kHz, respectively. In comparison with other levitation apparatus, it possesses a large radiation pressure and a high lateral positional stability. This apparatus can be used most advantageously in the study of droplets and spherical shell systems, for instance, for fusion target applications.

  15. Application of an ultrasonic focusing radiator for acoustic levitation of submillimeter samples

    SciTech Connect

    Lee, M.C.

    1981-01-01

    An acoustic apparatus has been specifically developed to handle samples of submillimeter size in a gaseous medium. This apparatus consists of an acoustic levitation device, deployment devices for small liquid and solid samples, heat sources for sample heat treatment, acoustic alignment devices, a cooling system and data-acquisition instrumentation. The levitation device includes a spherical aluminum dish of 12 in. diameter and 0.6 in. thickness, 130 pieces of PZT transducers attached to the back side of the dish and a spherical concave reflector situated in the vicinity of the center of curvature of the dish. The three lowest operating frequencies for the focusing-radiator levitation device are 75, 105 and 163 kHz, respectively. In comparison with other levitation apparatus, it possesses a large radiation pressure and a high lateral positional stability. This apparatus can be used most advantageously in the study of droplets and spherical shell systems, for instance, for fusion target applications.

  16. Characterization of new FPS (Focus Projection and Scan) vidicons for scientific imaging applications

    NASA Astrophysics Data System (ADS)

    Yates, G. J.; Jaramillo, S. A.; Holmes, V. H.; Black, J. P.

    1988-06-01

    Several new photoconductors now commercially available as targets in Type 7803 FPS (Focus Projection and Scan) electrostatically focussed vidicons have been characterized for use as radiometric sensors in transient illumination and single frame applications. These include Saticon (Se + Te + As), Newvicon (ZnSe), Pasecon (CdSe), and Plumbicon (PbO). Samples from several domestic and foreign manufacturers have been evaluated for photoconductive response time and responsivity at selected narrow wavelength bands, including 410 nm, 560 nm, and 822 nm. These data are compared with performance data from older target materials including antimony trisulfide (Sb2S3) and silicon. Dynamic range and resolution trade-offs as functions of read-beam aperture diameter and raster size are also presented. The point spread functions for standard 1-mil vidicons and for increased apertures of 1.5, 2.0, 3.0, and 4.0-mil are also discussed.

  17. Software Epistemology

    DTIC Science & Technology

    2016-03-01

    epistemology have focused on two contrary goals: first, small signatures that are able to identify malware that may have polymorphic presentation and...one version of a library can interoperate with another version of the same library. In the case of small signatures for malware, signatures must be...from source code or machine binaries—enables the rapid identification of known software vulnerabilities, unsafe use cases, and hidden malware in

  18. Focusing on energy and optoelectronic applications: a journey for graphene and graphene oxide at large scale.

    PubMed

    Wan, Xiangjian; Huang, Yi; Chen, Yongsheng

    2012-04-17

    Carbon is the only element that has stable allotropes in the 0th through the 3rd dimension, all of which have many outstanding properties. Graphene is the basic building block of other important carbon allotropes. Studies of graphene became much more active after the Geim group isolated "free" and "perfect" graphene sheets and demonstrated the unprecedented electronic properties of graphene in 2004. So far, no other individual material combines so many important properties, including high mobility, Hall effect, transparency, mechanical strength, and thermal conductivity. In this Account, we briefly review our studies of bulk scale graphene and graphene oxide (GO), including their synthesis and applications focused on energy and optoelectronics. Researchers use many methods to produce graphene materials: bottom-up and top-down methods and scalable methods such as chemical vapor deposition (CVD) and chemical exfoliation. Each fabrication method has both advantages and limitations. CVD could represent the most important production method for electronic applications. The chemical exfoliation method offers the advantages of easy scale up and easy solution processing but also produces graphene oxide (GO), which leads to defects and the introduction of heavy functional groups. However, most of these additional functional groups and defects can be removed by chemical reduction or thermal annealing. Because solution processing is required for many film and device applications, including transparent electrodes for touch screens, light-emitting devices (LED), field-effect transistors (FET), and photovoltaic devices (OPV), flexible electronics, and composite applications, the use of GO is important for the production of graphene. Because graphene has an intrinsic zero band gap, this issue needs to be tackled for its FET applications. The studies for transparent electrode related applications have made great progress, but researchers need to improve sheet resistance while

  19. Software Formal Inspections Standard

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This Software Formal Inspections Standard (hereinafter referred to as Standard) is applicable to NASA software. This Standard defines the requirements that shall be fulfilled by the software formal inspections process whenever this process is specified for NASA software. The objective of this Standard is to define the requirements for a process that inspects software products to detect and eliminate defects as early as possible in the software life cycle. The process also provides for the collection and analysis of inspection data to improve the inspection process as well as the quality of the software.

  20. Information flow and application to epileptogenic focus localization from intracranial EEG.

    PubMed

    Sabesan, Shivkumar; Good, Levi B; Tsakalis, Konstantinos S; Spanias, Andreas; Treiman, David M; Iasemidis, Leon D

    2009-06-01

    Transfer entropy (TE) is a recently proposed measure of the information flow between coupled linear or nonlinear systems. In this study, we suggest improvements in the selection of parameters for the estimation of TE that significantly enhance its accuracy and robustness in identifying the direction and the level of information flow between observed data series generated by coupled complex systems. We show the application of the improved TE method to long (on the order of days; approximately a total of 600 h across all patients), continuous, intracranial electroencephalograms (EEG) recorded in two different medical centers from four patients with focal temporal lobe epilepsy (TLE) for localization of their foci. All patients underwent ablative surgery of their clinically assessed foci. Based on a surrogate statistical analysis of the TE results, it is shown that the potential focal sites identified through the suggested analysis were in agreement with the clinically assessed sites of the epileptogenic focus in all patients analyzed. It is noteworthy that the analysis was conducted on the available whole-duration multielectrode EEG, that is, without any subjective prior selection of EEG segments or electrodes for analysis. The above, in conjunction with the use of surrogate data, make the results of this analysis robust. These findings suggest a critical role TE may play in epilepsy research in general, and as a tool for robust localization of the epileptogenic focus/foci in patients with focal epilepsy in particular.
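    For intuition, a minimal histogram-based TE estimator with history length 1 on discretized signals (far simpler than the tuned estimator used in the study) can be written as:

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Histogram estimate of T(Y -> X) in bits, history length 1:
    sum over p(x1, x0, y0) * log2[ p(x1 | x0, y0) / p(x1 | x0) ].
    Series are assumed to be already discretized into a few symbols.
    """
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))
    pairs_xy = Counter(zip(x[:-1], y[:-1]))
    pairs_xx = Counter(zip(x[1:], x[:-1]))
    singles = Counter(x[:-1])
    n = len(x) - 1
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_cond_xy = c / pairs_xy[(x0, y0)]
        p_cond_x = pairs_xx[(x1, x0)] / singles[x0]
        te += p_joint * np.log2(p_cond_xy / p_cond_x)
    return te

rng = np.random.default_rng(0)
y = rng.integers(0, 2, 5000)
x = np.empty_like(y)
x[0] = 0
x[1:] = y[:-1]   # x copies y with a one-step lag, so information flows y -> x
print(transfer_entropy(x, y))  # close to 1 bit; the reverse direction is near 0
```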

  1. A DERATING METHOD FOR THERAPEUTIC APPLICATIONS OF HIGH INTENSITY FOCUSED ULTRASOUND

    PubMed Central

    Bessonova, O.V.; Khokhlova, V.A.; Canney, M.S.; Bailey, M.R.; Crum, L.A.

    2010-01-01

    Current methods of determining high intensity focused ultrasound (HIFU) fields in tissue rely on extrapolation of measurements in water assuming linear wave propagation both in water and in tissue. Neglecting nonlinear propagation effects in the derating process can result in significant errors. In this work, a new method based on scaling the source amplitude is introduced to estimate focal parameters of nonlinear HIFU fields in tissue. Focal values of acoustic field parameters in absorptive tissue are obtained from a numerical solution to a KZK-type equation and are compared to those simulated for propagation in water. Focal waveforms, peak pressures, and intensities are calculated over a wide range of source outputs and linear focusing gains. Our modeling indicates, that for the high gain sources which are typically used in therapeutic medical applications, the focal field parameters derated with our method agree well with numerical simulation in tissue. The feasibility of the derating method is demonstrated experimentally in excised bovine liver tissue. PMID:20582159
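    As context, the conventional linear derating that the authors improve upon simply attenuates the water-measured focal pressure along the tissue path. A generic sketch, using the common 0.3 dB/cm/MHz soft-tissue coefficient (this is the linear baseline, not the paper's amplitude-scaling method):

```python
def derate_pressure(p_water, f_mhz, depth_cm, alpha_db=0.3):
    """Linearly derate a water-measured focal pressure by attenuating it
    over the tissue path (alpha_db in dB/cm/MHz). This is the linear
    baseline criticized above, not the paper's source-scaling method."""
    atten_db = alpha_db * f_mhz * depth_cm
    return p_water * 10 ** (-atten_db / 20)   # /20 because pressure, not intensity

# A 2 MPa water measurement at 1 MHz, 5 cm deep in tissue:
p_tissue = derate_pressure(2.0e6, 1.0, 5.0)   # about 1.68 MPa
```

    At the high focal pressures typical of HIFU, nonlinear steepening invalidates this simple scaling, which is the error the source-amplitude-scaling method is designed to avoid.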

  2. RECOLLIMATION AND RADIATIVE FOCUSING OF RELATIVISTIC JETS: APPLICATIONS TO BLAZARS AND M87

    SciTech Connect

    Bromberg, Omer; Levinson, Amir

    2009-07-10

    Recent observations of M87 and some blazars reveal violent activity in small regions located at relatively large distances from the central engine. Motivated by these considerations, we study the hydrodynamic collimation of a relativistic cooling outflow using a semianalytical model developed earlier. We first demonstrate that radiative cooling of the shocked outflow layer can lead to a focusing of the outflow and its reconfinement in a region having a very small cross-sectional radius. Such a configuration can produce rapid variability at large distances from the central engine via reflections of the converging recollimation shock. Possible applications of this model to TeV blazars are discussed. We then apply our model to M87. The low radiative efficiency of the M87 jet renders focusing unlikely. However, the shallow profile of the ambient medium pressure inferred from observations results in extremely good collimation that can explain the reported variability of the X-ray flux emitted from the HST-1 knot.

  3. Software Design Improvements. Part 2; Software Quality and the Design and Inspection Process

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom

    1997-01-01

    The application of assurance engineering techniques improves the duration of failure-free performance of software. The totality of features and characteristics of a software product are what determine its ability to satisfy customer needs. Software in safety-critical systems is very important to NASA. We follow the System Safety Working Group's definition of system safety software as: 'The optimization of system safety in the design, development, use and maintenance of software and its integration with safety-critical systems in an operational environment.' 'If it is not safe, say so' has become our motto. This paper goes over methods that have been used by NASA to make software design improvements by focusing on software quality and the design and inspection process.

  4. A Statistical Testing Approach for Quantifying Software Reliability; Application to an Example System

    SciTech Connect

    Chu, Tsong-Lun; Varuttamaseni, Athi; Baek, Joo-Seok

    2016-11-01

    The U.S. Nuclear Regulatory Commission (NRC) encourages the use of probabilistic risk assessment (PRA) technology in all regulatory matters, to the extent supported by the state-of-the-art in PRA methods and data. Although much has been accomplished in the area of risk-informed regulation, risk assessment for digital systems has not been fully developed. The NRC established a plan for research on digital systems to identify and develop methods, analytical tools, and regulatory guidance for (1) including models of digital systems in the PRAs of nuclear power plants (NPPs), and (2) incorporating digital systems in the NRC's risk-informed licensing and oversight activities. Under NRC's sponsorship, Brookhaven National Laboratory (BNL) explored approaches for addressing the failures of digital instrumentation and control (I&C) systems in the current NPP PRA framework. Specific areas investigated included PRA modeling of digital hardware, development of a philosophical basis for defining software failure, and identification of desirable attributes of quantitative software reliability methods. Based on the earlier research, statistical testing is considered a promising method for quantifying software reliability. This paper describes a statistical software testing approach for quantifying software reliability and applies it to the loop-operating control system (LOCS) of an experimental loop of the Advanced Test Reactor (ATR) at Idaho National Laboratory (INL).

  5. Design principle and calculations of a Scheffler fixed focus concentrator for medium temperature applications

    SciTech Connect

    Munir, A.; Hensel, O.; Scheffler, W.

    2010-08-15

    Scheffler fixed focus concentrators are successfully used for medium temperature applications in different parts of the world. These concentrators are taken as lateral sections of paraboloids and provide a fixed focus away from the path of incident beam radiation throughout the year. The paper presents a complete description of the design principle and construction details of an 8 m² surface area Scheffler concentrator. The first part of the paper presents the mathematical calculations to design the reflector parabola curve and reflector elliptical frame with respect to equinox (solar declination = 0) by selecting a specific lateral part of a paraboloid. Crossbar equations and their ellipses, arc lengths and their radii are also calculated to form the required lateral section of the paraboloid. Thereafter, the seasonal parabola equations are calculated for two extreme positions of summer and winter in the northern hemisphere (standing reflectors). The slopes of the parabola equations for equinox (solar declination = 0), summer (solar declination = +23.5) and winter (solar declination = -23.5) for the Scheffler reflector (8 m² surface area) are calculated to be 0.17, 0.28, and 0.13 respectively. The y-intercepts of the parabola equations for equinox, summer and winter are calculated as 0, 0.54, and -0.53 respectively. By comparing with the equinox parabola curve, the summer parabola is found to be smaller in size and uses the top part of the parabola curve, while the winter parabola is bigger in size and uses the lower part of the parabola curve to give the fixed focus. For this purpose, the reflector assembly is composed of flexible crossbars and a frame to induce the required change of the parabola curves with the changing solar declination. The paper also presents the calculation procedure of seasonal parabola equations for standing reflectors in the southern hemisphere as well as for laying reflectors in the northern and southern hemispheres.
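    Treating each seasonal curve as y = m·x² + c with the reported coefficients (an assumed reading of the quoted "slopes" and y-intercepts), the three parabolas can be compared directly:

```python
def parabola(m, c):
    """Seasonal reflector curve modeled as y = m*x**2 + c (assumed form,
    using the coefficient pairs reported in the abstract)."""
    return lambda x: m * x**2 + c

equinox = parabola(0.17, 0.00)   # solar declination 0
summer  = parabola(0.28, 0.54)   # declination +23.5: more curved, shifted up
winter  = parabola(0.13, -0.53)  # declination -23.5: flatter, shifted down

# Larger curvature means a "smaller" (tighter) parabola, matching the
# abstract's description of the summer curve relative to the winter one.
print(summer(1.0), equinox(1.0), winter(1.0))
```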

  6. High intensity focused ultrasound technology, its scope and applications in therapy and drug delivery.

    PubMed

    Phenix, Christopher Peter; Togtema, Melissa; Pichardo, Samuel; Zehbe, Ingeborg; Curiel, Laura

    2014-01-01

    Ultrasonography is a safe, inexpensive and widespread diagnostic tool capable of producing real-time non-invasive images without significant biological effects. However, the propagation of higher energy, intensity and frequency ultrasound waves through living tissues can induce thermal, mechanical and chemical effects useful for a variety of therapeutic applications. With the recent development of clinically approved High Intensity Focused Ultrasound (HIFU) systems, therapeutic ultrasound is now a medical reality. Indeed, HIFU has been used for the thermal ablation of pathological lesions; localized, minimally invasive ultrasound-mediated drug delivery through the transient formation of pores on cell membranes; the temporary disruption of skin and the blood brain barrier; the ultrasound-induced break-down of blood clots; and the targeted release of drugs using ultrasound and temperature-sensitive drug carriers. This review seeks to engage the pharmaceutical research community by providing an overview of the biological effects of ultrasound as well as highlighting important therapeutic applications, current deficiencies and future directions.

  7. Benchmarking Software Assurance Implementation

    DTIC Science & Technology

    2011-05-18

    product The chicken#. (a.k.a. Process Focused Assessment) – Management Systems (ISO 9001, ISO 27001, ISO 20000) – Capability Maturity Models (CMMI...How – Executive leadership commitment – Translate ROI to project manager vocabulary (cost, schedule, quality) – Start small and build – Use...collaboration Vocabulary Reserved Words Software Acquisition Information Assurance Project Management System Engineering Software Engineering Software

  8. Software Engineering for Portability.

    ERIC Educational Resources Information Center

    Stanchev, Ivan

    1990-01-01

    Discussion of the portability of educational software focuses on the software design and development process. Topics discussed include levels of portability; the user-computer dialog; software engineering principles; design techniques for student performance records; techniques of courseware programing; and suggestions for further research and…

  9. TOUGH2 software qualification

    SciTech Connect

    Pruess, K.; Simmons, A.; Wu, Y.S.; Moridis, G.

    1996-02-01

    TOUGH2 is a numerical simulation code for multi-dimensional coupled fluid and heat flow of multiphase, multicomponent fluid mixtures in porous and fractured media. It belongs to the MULKOM ("MULti-KOMponent") family of codes and is a more general version of the TOUGH simulator. The MULKOM family of codes was originally developed with a focus on geothermal reservoir simulation. They are suited to modeling systems which contain different fluid mixtures, with applications to flow problems arising in the context of high-level nuclear waste isolation, oil and gas recovery and storage, and groundwater resource protection. TOUGH2 is essentially a subset of MULKOM, consisting of a selection of the better tested and documented MULKOM program modules. The purpose of this package of reports is to provide all software baseline documents necessary for the software qualification of TOUGH2.

  10. Open source hardware and software platform for robotics and artificial intelligence applications

    NASA Astrophysics Data System (ADS)

    Liang, S. Ng; Tan, K. O.; Lai Clement, T. H.; Ng, S. K.; Mohammed, A. H. Ali; Mailah, Musa; Azhar Yussof, Wan; Hamedon, Zamzuri; Yussof, Zulkifli

    2016-02-01

    Recent developments in open source hardware and software platforms (Android, Arduino, Linux, OpenCV, etc.) have enabled rapid development of previously expensive and sophisticated systems on a lower budget and with flatter learning curves for developers. Using these platforms, we designed and developed a Java-based 3D robotic simulation system with a graph database, which is integrated in online and offline modes with an Android-Arduino based rubbish-picking remote control car. The combination of the open source hardware and software created a flexible and expandable platform for further developments in the future, both in the software and hardware areas, in particular in combination with a graph database for artificial intelligence, as well as more sophisticated hardware, such as legged or humanoid robots.

  11. Detecting Optic Atrophy in Multiple Sclerosis Patients Using New Colorimetric Analysis Software: From Idea to Application.

    PubMed

    Bambo, Maria Pilar; Garcia-Martin, Elena; Perez-Olivan, Susana; Larrosa-Povés, José Manuel; Polo-Llorens, Vicente; Gonzalez-De la Rosa, Manuel

    2016-01-01

    Neuro-ophthalmologists typically observe a temporal pallor of the optic disc in patients with multiple sclerosis. Here, we describe the emergence of an idea to quantify these optic disc color changes in multiple sclerosis patients. We recruited 12 multiple sclerosis patients with a previous optic neuritis attack and obtained photographs of their optic discs. The Laguna ONhE, new colorimetric software that uses hemoglobin as the reference pigment in the papilla, was used for the analysis. The papillae of these multiple sclerosis patients showed greater pallor, especially in the temporal sector. The software detected the pallor and assigned hemoglobin percentages below normal reference values. Measurements of optic disc hemoglobin levels obtained with the Laguna ONhE software program had good ability to detect optic atrophy and, consequently, axonal loss in multiple sclerosis patients. This new technology is easy to implement in routine clinical practice.

  12. The 'Densitometric Image Analysis Software' and its application to determine stepwise equilibrium constants from electrophoretic mobility shift assays.

    PubMed

    van Oeffelen, Liesbeth; Peeters, Eveline; Nguyen Le Minh, Phu; Charlier, Daniël

    2014-01-01

    Current software applications for densitometric analysis, such as ImageJ, QuantityOne (BioRad) and the Intelligent or Advanced Quantifier (Bio Image), do not allow the non-linearity of autoradiographic films to be taken into account during calibration. As a consequence, quantification of autoradiographs is often regarded as problematic, and phosphorimaging is the preferred alternative. However, the non-linear behaviour of autoradiographs can be described mathematically, so it can be accounted for. Therefore, the 'Densitometric Image Analysis Software' has been developed, which allows the user to quantify electrophoretic bands in autoradiographs, as well as in gels and phosphorimages, while providing optimized band-selection support. Moreover, the program can determine protein-DNA binding constants from Electrophoretic Mobility Shift Assays (EMSAs). For this purpose, the software calculates a chosen stepwise equilibrium constant for each migration lane within the EMSA, and estimates the errors due to non-uniformity of the background noise, smear caused by complex dissociation or denaturation of double-stranded DNA, and technical errors such as pipetting inaccuracies. Thereby, the program helps the user to optimize experimental parameters and to choose the best lanes for estimating an average equilibrium constant. This process can reduce the inaccuracy of equilibrium constants from the usual factor of 2 to about 20%, which is particularly useful when determining position weight matrices and cooperative binding constants to predict genomic binding sites. The MATLAB source code, platform-dependent software and installation instructions are available via the website http://micr.vub.ac.be.
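    The per-lane estimate described in this abstract can be illustrated with a minimal sketch of a simple 1:1 binding model. The band intensities, protein concentrations, and the excess-protein approximation below are hypothetical; the actual software additionally models film non-linearity and the error sources listed above.

    ```python
    def stepwise_K(i_bound, i_free, protein_conc):
        """Estimate a stepwise equilibrium constant for one EMSA lane.

        i_bound, i_free : background-corrected band intensities (arbitrary units)
        protein_conc    : protein concentration in the lane (M); assumed to be
                          in large excess over DNA, so total ~ free protein.
        """
        theta = i_bound / (i_bound + i_free)           # bound fraction of DNA
        return theta / ((1.0 - theta) * protein_conc)  # K = [PD] / ([P][D])

    # One K per migration lane; lanes with theta near 0 or 1 are the least
    # reliable, which is why averaging over well-chosen lanes helps.
    lanes = [(30.0, 70.0, 1e-7), (50.0, 50.0, 2.5e-7), (80.0, 20.0, 1e-6)]
    ks = [stepwise_K(b, f, p) for b, f, p in lanes]
    k_mean = sum(ks) / len(ks)
    ```

    In this toy data set the three lanes give mutually consistent constants, which is the situation the program's lane-selection support is meant to help the user reach.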

  13. Application of Artificial Intelligence technology to the analysis and synthesis of reliable software systems

    NASA Technical Reports Server (NTRS)

    Wild, Christian; Eckhardt, Dave

    1987-01-01

    The development of a methodology for the production of highly reliable software is one of the greatest challenges facing the computer industry. Meeting this challenge will undoubtably involve the integration of many technologies. This paper describes the use of Artificial Intelligence technologies in the automated analysis of the formal algebraic specifications of abstract data types. These technologies include symbolic execution of specifications using techniques of automated deduction and machine learning through the use of examples. On-going research into the role of knowledge representation and problem solving in the process of developing software is also discussed.

  14. Microcomputer technology applications: Charger and regulator software for a breadboard programmable power processor

    NASA Technical Reports Server (NTRS)

    Green, D. M.

    1978-01-01

    Software programs are described, one which implements a voltage regulation function, and one which implements a charger function with peak-power tracking of its input. The software, written in modular fashion, is intended as a vehicle for further experimentation with the P-3 system. A control teleprinter allows an operator to make parameter modifications to the control algorithm during experiments. The programs each require 3K of ROM and 2K of RAM. User manuals for each system are included, as well as a third program for simple I/O control.

  15. An application of the IMC software to controller design for the JPL LSCL Experiment Facility

    NASA Technical Reports Server (NTRS)

    Zhu, Guoming; Skelton, Robert E.

    1993-01-01

    A software package which Integrates Model reduction and Controller design (The IMC software) is applied to design controllers for the JPL Large Spacecraft Control Laboratory Experiment Facility. Modal Cost Analysis is used for the model reduction, and various Output Covariance Constraints are guaranteed by the controller design. The main motivation is to find the controller with the 'best' performance with respect to output variances. Indeed it is shown that by iterating on the reduced order design model, the controller designed does have better performance than that obtained with the first model reduction.

  16. Application of Artificial Intelligence technology to the analysis and synthesis of reliable software systems

    NASA Technical Reports Server (NTRS)

    Wild, Christian; Eckhardt, Dave

    1987-01-01

    The development of a methodology for the production of highly reliable software is one of the greatest challenges facing the computer industry. Meeting this challenge will undoubtably involve the integration of many technologies. This paper describes the use of Artificial Intelligence technologies in the automated analysis of the formal algebraic specifications of abstract data types. These technologies include symbolic execution of specifications using techniques of automated deduction and machine learning through the use of examples. On-going research into the role of knowledge representation and problem solving in the process of developing software is also discussed.

  17. An investigation of HPGe gamma efficiency calibration software (ANGLE V.3) for applications in nuclear decommissioning.

    PubMed

    Bell, S J; Judge, S M; Regan, P H

    2012-12-01

    High resolution gamma spectrometry offers a rapid method to characterise waste materials on a decommissioning nuclear site. To meet regulatory requirements, measurements must be traceable to national standards, meaning that the spectrometers must be calibrated for a wide range of materials. Semi-empirical modelling software (such as ANGLE™) offers a convenient method to carry out such calibrations. This paper describes an assessment of the modelling software for use by a small laboratory based on a nuclear site. The results confirmed the need for accurate information on the detector construction if the calibration were to be accurate to within 10%. Copyright © 2012 Elsevier Ltd. All rights reserved.

  18. Experimental demonstration of elastic optical networks based on enhanced software defined networking (eSDN) for data center application.

    PubMed

    Zhang, Jie; Yang, Hui; Zhao, Yongli; Ji, Yuefeng; Li, Hui; Lin, Yi; Li, Gang; Han, Jianrui; Lee, Young; Ma, Teng

    2013-11-04

    Due to the high burstiness and high-bandwidth characteristics of the applications, data center interconnection over elastic optical networks has attracted much attention from network operators and service providers. Many data center applications require lower delay and higher availability with end-to-end guaranteed quality of service. In this paper, we propose and implement a novel elastic optical network based on an enhanced software defined networking (eSDN) architecture for data center applications, by introducing a transport-aware cross stratum optimization (TA-CSO) strategy. eSDN can enable cross stratum optimization of application and elastic optical network stratum resources and provides elastic physical-layer parameter adjustment, e.g., modulation format and bandwidth. We have designed and experimentally verified software defined path provisioning on our testbed with four real OpenFlow-enabled elastic optical nodes for data center applications. The overall feasibility and efficiency of the proposed architecture is also experimentally demonstrated and compared with individual CSO and physical layer adjustment strategies in terms of path setup/release/adjustment latency, blocking probability and resource occupation rate.

  19. A fast auto-focusing technique for the long focal lens TDI CCD camera in remote sensing applications

    NASA Astrophysics Data System (ADS)

    Wang, Dejiang; Ding, Xu; Zhang, Tao; Kuang, Haipeng

    2013-02-01

    The key issue in automatic focus adjustment for long focal lens TDI CCD cameras in remote sensing applications is to reach the optimum focus position as fast as possible. Existing auto-focusing techniques consume too much time because the mechanical focusing parts of the camera move in steps during the searching procedure. In this paper, we demonstrate a fast auto-focusing technique, which employs the internal optical elements and the TDI CCD itself to directly sense deviations in the back focal distance of the lens and restore the imaging system to the best available focus. It is particularly advantageous for focus determination because the relative motion between the TDI CCD and the focusing element can proceed without interruption. Moreover, theoretical formulas describing the effect of image motion on the focusing precision and the effective focusing range are also developed. Finally, an experimental setup is constructed to evaluate the performance of the proposed technique. The results of the experiment show a ±5 μm precision of auto-focusing over a range of ±500 μm defocus, and the searching procedure can be accomplished within 0.125 s, which leads to a remarkable improvement in the real-time imaging capability of high resolution TDI CCD cameras in remote sensing applications.

  20. Software Applications to Access Earth Science Data: Building an ECHO Client

    NASA Astrophysics Data System (ADS)

    Cohen, A.; Cechini, M.; Pilone, D.

    2010-12-01

    Historically, developing an ECHO (NASA’s Earth Observing System (EOS) ClearingHOuse) client required interaction with its SOAP API. SOAP, as a framework for web service communication, has numerous advantages for enterprise applications and Java/C# type programming languages. However, as interest has grown in quick development cycles and more intriguing “mashups,” ECHO has seen the SOAP API lose its appeal. In order to address these changing needs, ECHO has introduced two new interfaces facilitating simple access to its metadata holdings. The first interface is built upon the OpenSearch format and ESIP Federated Search framework. The second interface is built upon the Representational State Transfer (REST) architecture. Using the REST and OpenSearch APIs to access ECHO makes development with modern languages much more feasible and simpler. Client developers can leverage the simple interaction with ECHO to focus more of their time on the advanced functionality they are presenting to users. To demonstrate the simplicity of developing with the REST API, participants will be led through a hands-on experience where they will develop an ECHO client that performs the following actions: login; provider discovery; provider-based dataset discovery; dataset, temporal, and spatial constraint based granule discovery; and online data access.
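    The granule-discovery step above amounts to assembling a query URL with keyword, temporal, and spatial constraints. The sketch below shows the general OpenSearch-style pattern; the endpoint URL and parameter names are illustrative conventions, not the actual ECHO API, which should be taken from its documentation.

    ```python
    from urllib.parse import urlencode

    def granule_search_url(base, keyword, start, end, bbox):
        """Assemble an OpenSearch-style granule query URL.

        `base` and the parameter names are hypothetical stand-ins for the
        real service's documented template.
        """
        params = {
            "keyword": keyword,
            "startTime": start,   # temporal constraint
            "endTime": end,
            # spatial constraint as west,south,east,north
            "boundingBox": ",".join(str(v) for v in bbox),
        }
        return base + "?" + urlencode(params)

    url = granule_search_url(
        "https://example.gov/opensearch/granules",  # hypothetical endpoint
        "MODIS", "2010-01-01", "2010-01-31", (-180, -90, 180, 90),
    )
    ```

    A client would issue an HTTP GET on the resulting URL and parse the Atom or JSON response; the point of the REST/OpenSearch interfaces is that this whole exchange needs no SOAP tooling.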

  1. Microfabrication and Test of a Three-Dimensional Polymer Hydro-focusing Unit for Flow Cytometry Applications

    NASA Technical Reports Server (NTRS)

    Yang, Ren; Feeback, Daniel L.; Wang, Wanjun

    2004-01-01

    This paper details a novel three-dimensional (3D) hydro-focusing micro cell sorter for micro flow cytometry applications. The unit was microfabricated by means of SU-8 3D lithography. The 3D microstructure for coaxial sheathing was designed, microfabricated, and tested. Three-dimensional hydro-focusing capability was demonstrated with an experiment to sort labeled tanned sheep erythrocytes (red blood cells). This polymer hydro-focusing microstructure is easily microfabricated and integrated with other polymer microfluidic structures.

  2. Computer Software.

    ERIC Educational Resources Information Center

    Kay, Alan

    1984-01-01

    Discusses the nature and development of computer software. Programing, programing languages, types of software (including dynamic spreadsheets), and software of the future are among the topics considered. (JN)

  3. Contemporary Applications of Computer Technology: Development of Meaningful Software in Special Education/Rehabilitation.

    ERIC Educational Resources Information Center

    Mills, Russell

    Four elements of clinical programming must be considered during development in order for a software program to be truly useful in rehabilitation: presentation of a useful task; treatment parameters selectable by clinicians; data collection/analysis; and authoring capability. These criteria govern the development of all Brain-Link Software…

  4. Early Algebra with Graphics Software as a Type II Application of Technology

    ERIC Educational Resources Information Center

    Abramovich, Sergei

    2006-01-01

    This paper describes the use of Kid Pix-graphics software for creative activities of young children--in the context of early algebra as determined by the mathematics core curriculum of New York state. It shows how grade-two appropriate pedagogy makes it possible to bring about a qualitative change in the learning process of those commonly…

  5. Contemporary Applications of Computer Technology: Development of Meaningful Software in Special Education/Rehabilitation.

    ERIC Educational Resources Information Center

    Mills, Russell

    Four elements of clinical programming must be considered during development in order for a software program to be truly useful in rehabilitation: presentation of a useful task; treatment parameters selectable by clinicians; data collection/analysis; and authoring capability. These criteria govern the development of all Brain-Link Software…

  6. Application of neural networks to software quality modeling of a very large telecommunications system.

    PubMed

    Khoshgoftaar, T M; Allen, E B; Hudepohl, J P; Aud, S J

    1997-01-01

    Society relies on telecommunications to such an extent that telecommunications software must have high reliability. Enhanced measurement for early risk assessment of latent defects (EMERALD) is a joint project of Nortel and Bell Canada for improving the reliability of telecommunications software products. This paper reports a case study of neural-network modeling techniques developed for the EMERALD system. The resulting neural network is currently in the prototype testing phase at Nortel. Neural-network models can be used to identify fault-prone modules for extra attention early in development, and thus reduce the risk of operational problems with those modules. We modeled a subset of modules representing over seven million lines of code from a very large telecommunications software system. The set consisted of those modules reused with changes from the previous release. The dependent variable was membership in the class of fault-prone modules. The independent variables were principal components of nine measures of software design attributes. We compared the neural-network model with a nonparametric discriminant model and found the neural-network model had better predictive accuracy.
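    The modeling pipeline described here (principal components of nine design metrics feeding a classifier of fault-prone vs. not) can be sketched on synthetic data. The data, the two-component truncation, and the nearest-class-mean stand-in classifier below are illustrative assumptions; the paper itself used a neural network and a nonparametric discriminant model.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic stand-in for the study's data: nine design metrics per module,
    # with fault-prone modules (label 1) shifted upward on three metrics.
    X0 = rng.normal(0.0, 1.0, size=(200, 9))                 # not fault-prone
    X1 = rng.normal(0.0, 1.0, size=(200, 9)); X1[:, :3] += 1.5  # fault-prone
    X = np.vstack([X0, X1])
    y = np.array([0] * 200 + [1] * 200)

    # Principal components of the standardized metrics.
    Z = (X - X.mean(0)) / X.std(0)
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    scores = Z @ Vt[:2].T                 # keep the first two components

    # Nearest-class-mean discriminant on the PC scores.
    m0, m1 = scores[y == 0].mean(0), scores[y == 1].mean(0)
    pred = (np.linalg.norm(scores - m1, axis=1) <
            np.linalg.norm(scores - m0, axis=1)).astype(int)
    accuracy = (pred == y).mean()
    ```

    The value of such a model in practice is ranking modules for extra review effort early in development, so predictive accuracy on the fault-prone class matters more than overall accuracy.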

  7. Possible Application of Quality Function Deployment in Software Systems Development in the United States Air Force

    DTIC Science & Technology

    1991-12-01

    his cooperation in acquiring QFD Designer. I also wish to thank Mr Allen Chartier of the American Supplier Institute for his help in identifying...and What Didn’t," Transactions from the Symposium on Quality Function Deployment. 305-335. Dearborn MI: ASI Press, 1989. Pressman, Roger S. Software

  8. Study of application of space telescope science operations software for SIRTF use

    NASA Technical Reports Server (NTRS)

    Dignam, F.; Stetson, E.; Allendoerfer, W.

    1985-01-01

    The design and development of the Space Telescope Science Operations Ground System (ST SOGS) was evaluated to compile a history of lessons learned that would benefit NASA's Space Infrared Telescope Facility (SIRTF). Forty-nine specific recommendations resulted and were categorized as follows: (1) requirements: a discussion of the content, timeliness and proper allocation of the system and segment requirements and the resulting impact on SOGS development; (2) science instruments: a consideration of the impact of the Science Instrument design and data streams on SOGS software; and (3) contract phasing: an analysis of the impact of beginning the various ST program segments at different times. Approximately half of the software design and source code might be useable for SIRTF. Transportability of this software requires, at minimum, a compatible DEC VAX-based architecture and VMS operating system, system support software similar to that developed for SOGS, and continued evolution of the SIRTF operations concept and requirements such that they remain compatible with ST SOGS operation.

  9. Application of modern software packages to calculating the solidification of high-speed steels

    NASA Astrophysics Data System (ADS)

    Morozov, S. I.

    2015-12-01

    The solidification of high-speed steels is calculated with the Pandat and JMatPro software packages. The results of calculating equilibrium and nonequilibrium solidification are presented and discussed. The nonequilibrium solidification is simulated using the Scheil-Gulliver model. The fraction of carbides changes as a function of the carbon content in the steels.
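    The Scheil-Gulliver model referenced here has a closed form for the liquid composition as solidification proceeds, which is what drives solute enrichment and carbide formation in the last liquid to freeze. A minimal sketch follows; the partition coefficient and nominal carbon content are illustrative numbers, not values from the paper.

    ```python
    def scheil_liquid_conc(c0, k, fs):
        """Scheil-Gulliver liquid composition at solid fraction fs.

        Assumes no diffusion in the solid and complete mixing in the liquid:
            C_L = C0 * (1 - fs)**(k - 1)
        c0 : nominal solute content (e.g. wt% C); k : partition coefficient.
        """
        return c0 * (1.0 - fs) ** (k - 1.0)

    # Illustrative values: k = 0.34 (carbon-like), c0 = 0.9 wt% C.
    cl_mid = scheil_liquid_conc(0.9, 0.34, 0.5)    # at 50% solid
    cl_late = scheil_liquid_conc(0.9, 0.34, 0.95)  # near the end of freezing
    ```

    Because k < 1, the residual liquid becomes progressively enriched in solute, which is why nonequilibrium (Scheil) calculations predict more interdendritic carbide than the equilibrium lever rule.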

  10. Application of fuzzy logic in intelligent software agents for IP selection

    NASA Astrophysics Data System (ADS)

    Liu, Jian; Shragowitz, Eugene B.

    1999-11-01

    IPs (Intellectual Properties) are becoming increasingly essential in today's electronic system design. One of the important issues in design reuse is IP selection, i.e. finding an existing solution that best matches the user's expectations. This paper describes an Internet-based intelligent software system (Software Agent) that helps the user to pick out the optimal designs among those marketed by the IP vendors. The Software Agent for IP Selection (SAFIPS) conducts dialogues with both the IP users and IP vendors, narrowing the choices after evaluating general characteristics first, followed by matching at the behavioral, RTL, logic, and physical levels. The SAFIPS system conducts reasoning based on fuzzy logic rules derived in the process of dialogues of the software agent with the IP users and vendors. In addition to the dialogue system and fuzzy logic inference system, the SAFIPS includes an HDL simulator and a fuzzy logic evaluator that are used to measure the level of matching of the user's behavioral model with the IP vendor's model.

  11. Altering the Application of the Traditional Systems Development Life Cycle for Air Force Software Programs.

    DTIC Science & Technology

    1987-04-01

    York: North Holland, Inc., 1981. 2. Fox, Joseph M. Software and Its Development. Englewood Cliffs, New Jersey: Prentice-Hall, Inc., 1982. 3. Gujarati, Damodar. Basic Econometrics. New York: McGraw-Hill Book Company, 1978. 4. Larr, L., et al. Planning Guide for Computer Programming Development

  12. A Component Approach to Collaborative Scientific Software Development: Tools and Techniques Utilized by the Quantum Chemistry Science Application Partnership

    DOE PAGES

    Kenny, Joseph P.; Janssen, Curtis L.; Gordon, Mark S.; ...

    2008-01-01

    Cutting-edge scientific computing software is complex, increasingly involving the coupling of multiple packages to combine advanced algorithms or simulations at multiple physical scales. Component-based software engineering (CBSE) has been advanced as a technique for managing this complexity, and complex component applications have been created in the quantum chemistry domain, as well as several other simulation areas, using the component model advocated by the Common Component Architecture (CCA) Forum. While programming models do indeed enable sound software engineering practices, the selection of programming model is just one building block in a comprehensive approach to large-scale collaborative development which must also address interface and data standardization, and language and package interoperability. We provide an overview of the development approach utilized within the Quantum Chemistry Science Application Partnership, identifying design challenges, describing the techniques which we have adopted to address these challenges and highlighting the advantages which the CCA approach offers for collaborative development.

  13. Pulsed application of focused ultrasound to the LI4 elicits deqi sensations: pilot study.

    PubMed

    Yoo, Seung-Schik; Lee, Wonhye; Kim, Hyungmin

    2014-08-01

    Focused ultrasound (FUS) techniques enable the delivery of acoustic pressure waves to a localized, specific region of anatomy, and mechanically stimulate the sonicated region when given in a train of pulses. The present pilot study examines whether the pulsed application of acoustic waves focused to an acupuncture point (LI4, Hegu), i.e. FUS acupuncture, can elicit deqi sensations. The FUS was generated by a single-element ultrasound transducer, and delivered to the LI4 of acupuncture-naïve participants (n=10) for a duration of 1 s using a 2 ms tone-burst duration and a 50 Hz pulse repetition frequency. The subjective ratings of deqi descriptors were obtained across different conditions, i.e. FUS acupuncture using acoustic intensities of 1 and 3 W/cm² (spatial-peak temporal-averaged intensity, Ispta), a sham sonication condition, tactile stimulation using a von Frey monofilament, and needle-based real and sham acupuncture. We also measured the presence of sharp pain, unpleasantness, and anxiety level during each condition. The FUS acupuncture given at 3 W/cm² elicited deqi sensation ratings similar to those acquired during the needle-based acupuncture condition across the subjects, with significantly reduced levels of non-deqi-related sensations, such as sharp pain, anxiety and unpleasantness. The lower acoustic intensity also generated deqi sensations, but to a lesser degree than the ones acquired using the higher acoustic intensity. Neither the sham conditions nor the tactile stimulation elicited deqi sensations. The present data on acoustic acupuncture, with its exquisite spatial and depth control, along with the ability to electronically adjust its intensity, may suggest its potential utilization as an alternative mode of acupuncture, although further study is needed to probe its clinical efficacy. Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. Characterization and application of simultaneously spatio-temporally focused ultrafast laser pulses

    NASA Astrophysics Data System (ADS)

    Greco, Michael J.

    Chirped pulse amplification of ultrafast laser pulses has become an essential technology in the fields of micromachining, tissue ablation, and microscopy. With specifically tailored pulses of light we have been able to begin investigation into lab-on-a-chip technology, which has the potential of revolutionizing the medical industry. Advances in microscopy have allowed sub-diffraction-limited resolution to become a reality, as well as lensless imaging of single molecules. An intimate knowledge of ultrafast optical pulses, and the ability to manipulate an optical spectrum and generate an optical pulse of a specific temporal shape, allow us to continue pushing these fields forward as well as open new ones. This thesis investigates the spatio-temporal construction of pulses which are simultaneously spatio-temporally focused (SSTF), and their current and future applications. By laterally chirping a compressed laser pulse we have confined the peak intensity to a shorter distance along the optical axis than can be achieved by conventional methods. This also brings about interesting changes to the structure of the pulse intensity, such as pulse front tilt (PFT), an effect where the pulse energy is delayed across the focal spot at the focal plane by durations longer than the pulse itself. Though these pulses have found utility in microscopy and micromachining, in-situ methods for characterizing them spatially and temporally are not yet widespread. I present here an in-situ characterization technique for both spatial and temporal diagnosis of SSTF pulses. By performing a knife-edge scan and collecting the light in a spectrometer, the relative spectral position as well as the beam size can be deduced. Temporal characterization is done by dispersion scan, where a second harmonic crystal is moved through the beam focus. Combining the unknown phase of the pulse with the known phase (a result of angular dispersion) allows the unknown phase to be extracted from the second harmonic spectra.

  15. THE APPLICATION OF THE SXF LATTICE DESCRIPTION AND THE UAL SOFTWARE ENVIRONMENT TO THE ANALYSIS OF THE LHC.

    SciTech Connect

    FISCHER, W.; PILAT, F.; PTITSYN, V.

    1999-03-29

    A software environment for accelerator modeling has been developed which includes the UAL (Unified Accelerator Library), a collection of accelerator physics libraries with a Perl interface for scripting, and the SXF (Standard eXchange Format), a format for accelerator description which extends the MAD sequence by including deviations from design values. SXF interfaces have been written for several programs, including MAD9 and MAD8 via the DOOM database, Cosy, TevLat and UAL itself, which includes Teapot++. After an overview of the software we describe the application of the tools to the analysis of the LHC lattice stability, in the presence of alignment and coupling errors, and to the correction of the first turn and closed orbit in the machine.

  16. The Study on Neuro-IE Management Software in Manufacturing Enterprises. -The Application of Video Analysis Technology

    NASA Astrophysics Data System (ADS)

    Bian, Jun; Fu, Huijian; Shang, Qian; Zhou, Xiangyang; Ma, Qingguo

    This paper analyzes the outstanding problems in current industrial production by reviewing the three stages of Industrial Engineering development. Based on investigations and interviews in enterprises, we propose the new idea of applying "computer video analysis technology" in new industrial engineering management software, and add a working-station "loose coefficient" to this software in order to arrange production scientifically and humanely. Meanwhile, we suggest utilizing biofeedback technology to promote further research on "the rules of workers' physiological, psychological and emotional changes in production". This new kind of combination will push forward industrial engineering theories and benefit enterprises in progressing towards flexible social production; thus it will be of great value for theoretical innovation, social significance and application.

  17. MisTec: A software application for supporting space exploration scenario options and technology development analysis and planning

    NASA Technical Reports Server (NTRS)

    Horsham, Gary A. P.

    1991-01-01

The structure and composition of a new, emerging software application, which models and analyzes space exploration scenario options for feasibility based on technology development projections, are presented. The software application consists of four main components: a scenario generator for designing and inputting scenario options and constraints; a processor which performs algorithmic coupling and options analyses of mission activity requirements and technology capabilities; a results display which graphically and textually shows coupling and options analysis results; and a data/knowledge base which contains information on a variety of mission activities and (power and propulsion) technology system capabilities. The general long-range study process used by NASA to support recent studies is briefly introduced to provide the primary basis of comparison for discussing the potential advantages to be gained from developing and applying this kind of application. A hypothetical example of a scenario option is presented to facilitate a conceptual understanding of what the application is, how it works, and when it might be applied.
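The "processor" component described above couples mission activity requirements to projected technology capabilities. A hypothetical Python sketch of that coupling step, with activities, parameters and numbers invented purely for illustration (none are from the MisTec application), might look like:

```python
# Hypothetical coupling analysis: flag each mission activity as feasible
# only if every one of its requirements is met by the projected
# technology capabilities. All names and figures are illustrative.

requirements = {           # mission activity -> minimum needs
    "lunar_cargo_lander": {"power_kw": 25, "thrust_kn": 60},
    "mars_transfer":      {"power_kw": 120, "thrust_kn": 40},
}
capabilities = {           # projected technology system capabilities
    "power_kw": 100,       # e.g. a power system projection
    "thrust_kn": 80,       # e.g. a propulsion system projection
}

def feasible(needs, caps):
    """True if every requirement is within the projected capability."""
    return all(caps.get(k, 0) >= v for k, v in needs.items())

results = {activity: feasible(needs, capabilities)
           for activity, needs in requirements.items()}
```

A results display would then report which scenario options survive the coupling analysis and which fail on specific capability shortfalls.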

  18. MisTec - A software application for supporting space exploration scenario options and technology development analysis and planning

    NASA Technical Reports Server (NTRS)

    Horsham, Gary A. P.

    1992-01-01

The structure and composition of a new, emerging software application, which models and analyzes space exploration scenario options for feasibility based on technology development projections, are presented. The software application consists of four main components: a scenario generator for designing and inputting scenario options and constraints; a processor which performs algorithmic coupling and options analyses of mission activity requirements and technology capabilities; a results display which graphically and textually shows coupling and options analysis results; and a data/knowledge base which contains information on a variety of mission activities and (power and propulsion) technology system capabilities. The general long-range study process used by NASA to support recent studies is briefly introduced to provide the primary basis of comparison for discussing the potential advantages to be gained from developing and applying this kind of application. A hypothetical example of a scenario option is presented to facilitate a conceptual understanding of what the application is, how it works, and when it might be applied.

  20. [Development of a current version of a software application for research and practice in human nutrition (GRUNUMUR 2.0)].

    PubMed

    Pérez-Llamas, F; Garaulet, M; Torralba, C; Zamora, S

    2012-01-01

The aim of this paper is to describe a new version of the software application GRUNUMUR, a useful tool for human nutrition studies designed by the Nutrition Research Group of the University of Murcia. Like the first version, this second version can address different types of study: dietary habits (24-h recall, 7-day dietary record and food frequency questionnaire) as well as epidemiological, anthropometrical and clinical studies. The new version, called GRUNUMUR 2.0 and compatible with the first, adds an online help system covering all functions of the application to guide the user through its tasks. It allows safe storage of a virtually unlimited number of results in an orderly and organized way, which can be retrieved when required through a system of scheduled, unattended backups and maintenance performed by a server. Another advantage is its total accessibility: it works via a web browser (http://senver.inf.um.es/esen), both from the university intranet (www.um.es) and from the internet. Finally, it allows data to be exported to Excel for further processing with other applications, as well as reports to be published in PDF for delivery to study participants if necessary. The new version has been validated by comparing its results with those obtained from other software, with no significant differences for any of the variables analyzed. GRUNUMUR 2.0 is an improved, useful and reliable tool for addressing human nutrition studies.
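The core computation behind a 24-h recall study of the kind GRUNUMUR supports is summing nutrient intake from recorded portions against a food composition table. The sketch below is not GRUNUMUR code; the foods and per-100 g values are invented for illustration:

```python
# Illustrative nutrient-intake calculation for a 24-hour dietary recall.
# Composition values (per 100 g) and food names are made up.

composition = {
    "bread": {"energy_kcal": 265, "protein_g": 9.0},
    "milk":  {"energy_kcal": 64,  "protein_g": 3.3},
}

def daily_intake(recall):
    """Sum nutrients over (food, grams) entries of a 24-h recall."""
    totals = {}
    for food, grams in recall:
        for nutrient, per_100g in composition[food].items():
            totals[nutrient] = totals.get(nutrient, 0.0) + per_100g * grams / 100
    return totals

intake = daily_intake([("bread", 150), ("milk", 250)])
```

Such per-subject totals are what the application would store, compare across study groups, and export to Excel for further analysis.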