Science.gov

Sample records for focused application software

  1. Knowledge focus via software agents

    NASA Astrophysics Data System (ADS)

    Henager, Donald E.

    2001-09-01

    The essence of military Command and Control (C2) is making knowledge-intensive decisions in a limited amount of time using uncertain, incorrect, or outdated information. It is essential to provide tools to decision-makers that provide: * Management of friendly forces by treating the "friendly resources as a system". * Rapid assessment of effects of military actions against the "enemy as a system". * Assessment of how an enemy should, can, and could react to friendly military activities. Software agents in the form of mission agents, target agents, maintenance agents, and logistics agents can meet this information challenge. The role of each agent is to know all the details about its assigned mission, target, maintenance, or logistics entity. The Mission Agent would fight for mission resources based on the mission priority and analyze the effect that a proposed mission's results would have on the enemy. The Target Agent (TA) communicates with other targets to determine its role in the system of targets. A system of TAs would be able to inform a planner or analyst of the status of a system of targets, the effect of that status, and the effect of attacks on that system. The system of TAs would also be able to analyze possible enemy reactions to attack by determining ways to minimize the effect of attack, such as rerouting traffic or using deception. The Maintenance Agent would schedule maintenance events and notify the maintenance unit. The Logistics Agent would manage shipment and delivery of supplies to maintain appropriate levels of weapons, fuel, and spare parts. The central idea underlying this use of software agents is knowledge focus. Software agents are created automatically to focus their attention on individual real-world entities (e.g., missions, targets) and view the world from that entity's perspective. The agent autonomously monitors the entity, identifies problems/opportunities, formulates solutions, and informs the decision-maker. The agent must be
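
The monitor/identify/formulate/inform loop described in this abstract can be sketched in code. The following Python fragment is purely illustrative; every class, field, and message below is invented for this sketch and does not come from the paper:

```python
from dataclasses import dataclass, field

@dataclass
class TargetAgent:
    """One agent per real-world entity, viewing the world from
    that entity's perspective (the 'knowledge focus' idea)."""
    target_id: str
    status: str = "operational"
    linked_targets: list = field(default_factory=list)
    alerts: list = field(default_factory=list)

    def monitor(self, observation: dict) -> None:
        # Autonomously update the agent's view of its assigned entity.
        self.status = observation.get("status", self.status)

    def assess_system_effect(self) -> str:
        # A system of TAs reports the status of the whole target system
        # and possible mitigations such as rerouting.
        degraded = [t for t in self.linked_targets if t.status != "operational"]
        if degraded:
            return f"{len(degraded)} linked target(s) degraded; rerouting possible"
        return "target system nominal"

    def inform(self) -> list:
        # Formulate findings and pass them to the decision-maker.
        self.alerts.append(self.assess_system_effect())
        return self.alerts
```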

  2. Using Digital Devices in a First Year Classroom: A Focus on the Design and Use of Phonics Software Applications

    ERIC Educational Resources Information Center

    Nicholas, Maria; McKenzie, Sophie; Wells, Muriel A.

    2017-01-01

    When integrated within a holistic literacy program, phonics applications can be used in classrooms to facilitate students' self-directed learning of letter-sound knowledge; but are they designed to allow for such a purpose? With most phonics software applications making heavy use of image cues, this project has more specifically investigated…

  3. SU-E-J-04: Integration of Interstitial High Intensity Therapeutic Ultrasound Applicators On a Clinical MRI-Guided High Intensity Focused Ultrasound Treatment Planning Software Platform

    SciTech Connect

    Ellens, N; Partanen, A; Ghoshal, G; Burdette, E; Farahani, K

    2015-06-15

    Purpose: Interstitial high intensity therapeutic ultrasound (HITU) applicators can be used to ablate tissue percutaneously, allowing for minimally-invasive treatment without ionizing radiation [1,2]. The purpose of this study was to evaluate the feasibility and usability of combining multielement interstitial HITU applicators with a clinical magnetic resonance imaging (MRI)-guided focused ultrasound software platform. Methods: The Sonalleve software platform (Philips Healthcare, Vantaa, Finland) combines anatomical MRI for target selection and multi-planar MRI thermometry to provide real-time temperature information. The MRI-compatible interstitial US applicators (Acoustic MedSystems, Savoy, IL, USA) had 1–4 cylindrical US elements, each 1 cm long with either 180° or 360° of active surface. Each applicator (4 Fr diameter, enclosed within a 13 Fr flexible catheter) was inserted into a tissue-mimicking agar-silica phantom. Degassed water was circulated around the transducers for cooling and coupling. Based on the location of the applicator, a virtual transducer overlay was added to the software to assist targeting and to allow automatic thermometry slice placement. The phantom was sonicated at 7 MHz for 5 minutes with 6–8 W of acoustic power for each element. MR thermometry data were collected during and after sonication. Results: Preliminary testing indicated that the applicator location could be identified in the planning images and the transducer locations predicted within 1 mm accuracy using the overlay. Ablation zones (thermal dose ≥ 240 CEM43) for 2 active, adjacent US elements ranged from 18 mm × 24 mm (width × length) to 25 mm × 25 mm for the 6 W and 8 W sonications, respectively. Conclusion: The combination of interstitial HITU applicators and this software platform holds promise for novel approaches in minimally-invasive MRI-guided therapy, especially when bony structures or air-filled cavities may preclude extracorporeal HIFU.[1] Diederich et al
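
The ablation threshold quoted above (thermal dose >= 240 CEM43) follows the standard Sapareto-Dewey cumulative-equivalent-minutes formulation, which can be computed directly from MR thermometry samples. A minimal sketch, with function name and defaults chosen here rather than taken from the paper:

```python
def cem43(temps_c, dt_min, r_above=0.5, r_below=0.25):
    """Cumulative equivalent minutes at 43 degC (Sapareto-Dewey).

    temps_c: temperature samples in degC (e.g. from MR thermometry);
    dt_min:  sampling interval in minutes;
    r_above/r_below: conventional R values for T >= 43 and T < 43 degC.
    """
    dose = 0.0
    for t in temps_c:
        r = r_above if t >= 43.0 else r_below
        dose += (r ** (43.0 - t)) * dt_min
    return dose
```

Tissue held at exactly 43 degC accumulates 1 CEM43 per minute, so the 240 CEM43 threshold corresponds to 240 min at 43 degC or, with R = 0.5, 120 min at 44 degC.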

  4. Cartographic applications software

    USGS Publications Warehouse

    ,

    1992-01-01

    The Office of the Assistant Division Chief for Research, National Mapping Division, develops computer software for the solution of geometronic problems in the fields of surveying, geodesy, remote sensing, and photogrammetry. Software that has been developed using public funds is available on request for a nominal charge to recover the cost of duplication.

  5. Neurofeedback training aimed to improve focused attention and alertness in children with ADHD: a study of relative power of EEG rhythms using custom-made software application.

    PubMed

    Hillard, Brent; El-Baz, Ayman S; Sears, Lonnie; Tasman, Allan; Sokhadze, Estate M

    2013-07-01

    Neurofeedback is a nonpharmacological treatment for attention-deficit hyperactivity disorder (ADHD). We propose that operant conditioning of the electroencephalogram (EEG) in neurofeedback training aimed to mitigate inattention and low arousal in ADHD will be accompanied by changes in EEG bands' relative power. Patients were 18 children diagnosed with ADHD. The neurofeedback protocol ("Focus/Alertness" by Peak Achievement Trainer) has a focused attention and alertness training mode, providing one summary measure for Focus and one for Alertness; this does not allow for collecting information regarding changes in the power of specific EEG bands (delta, theta, alpha, low and high beta, and gamma) within the 2 to 45 Hz range. Quantitative EEG analysis was therefore completed on each of twelve 25-minute-long sessions using a custom-made MATLAB application to determine the relative power of each of the aforementioned EEG bands throughout each session, and from the first session to the last session. Additional statistical analysis determined significant changes in relative power within sessions (from minute 1 to minute 25) and between sessions (from session 1 to session 12). The analysis covered the relative power of theta, alpha, low and high beta, and the theta/alpha, theta/beta, theta/low beta, and theta/high beta ratios. Additional secondary measures of patients' post-neurofeedback outcomes were assessed using an audiovisual selective attention test (IVA + Plus) and behavioral evaluation scores from the Aberrant Behavior Checklist. Analysis of data computed in the MATLAB application determined that theta/low beta and theta/alpha ratios decreased significantly from session 1 to session 12, and from minute 1 to minute 25 within sessions. The findings regarding EEG changes resulting from brain wave self-regulation training, along with behavioral evaluations, will help elucidate neural mechanisms of neurofeedback aimed to improve focused attention and alertness in ADHD.
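
The relative-power computation described here can be reproduced with standard tools. The band edges below are our reading of the abstract's 2-45 Hz decomposition and are illustrative; this is not the authors' MATLAB application:

```python
import numpy as np
from scipy.signal import welch

# Assumed band edges in Hz, covering the 2-45 Hz range of the study.
BANDS = {"delta": (2, 4), "theta": (4, 8), "alpha": (8, 12),
         "low_beta": (12, 18), "high_beta": (18, 30), "gamma": (30, 45)}

def relative_band_power(eeg, fs):
    """Power in each band as a fraction of total 2-45 Hz power."""
    freqs, psd = welch(eeg, fs=fs, nperseg=min(len(eeg), 2 * int(fs)))
    in_range = (freqs >= 2) & (freqs <= 45)
    total = psd[in_range].sum()
    return {name: float(psd[(freqs >= lo) & (freqs < hi)].sum() / total)
            for name, (lo, hi) in BANDS.items()}
```

Ratios such as theta/low beta then follow directly from the returned dictionary.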

  6. Tired of Teaching Software Applications?

    ERIC Educational Resources Information Center

    Lippert, Susan K.; Granger, Mary J.

    Many university business schools have an instructor-led course introducing computer software application packages. This course is often required for all undergraduates and is a prerequisite to other courses, such as accounting, finance, marketing, and operations management. Knowledge and skills gained in this course should enable students not only…

  7. Ray-tracing software comparison for linear focusing solar collectors

    NASA Astrophysics Data System (ADS)

    Osório, Tiago; Horta, Pedro; Larcher, Marco; Pujol-Nadal, Ramón; Hertel, Julian; van Rooyen, De Wet; Heimsath, Anna; Schneider, Simon; Benitez, Daniel; Frein, Antoine; Denarie, Alice

    2016-05-01

    Ray-tracing software tools have been widely used in the optical design of solar concentrating collectors. In spite of the ability of these tools to assess the geometrical and material aspects impacting the optical performance of concentrators, their use in combination with experimental measurements in the framework of collector testing procedures has not been implemented, to date, in any of the current solar collector testing standards. In the latest revision of ISO 9806 an effort was made to include linear focusing concentrating collectors, but some practical and theoretical difficulties emerged. A ray-tracing analysis could provide important contributions to overcoming these issues, complementing the experimental results obtained through thermal testing and allowing the achievement of more thorough testing outputs with lower experimental requirements. In order to evaluate different available software tools, a comparison study was conducted. Taking the Parabolic Trough Collector and the Linear Fresnel Reflector Collector as representative technologies for line-focus concentrators, two exemplary cases with predefined conditions - geometry, sun model and material properties - were simulated with different software tools. This work was carried out within IEA/SHC Task 49 "Solar Heat Integration in Industrial Processes".
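
As a toy illustration of the kind of geometry such ray-tracing tools verify, the defining property of a parabolic trough, that rays parallel to the optical axis reflect through the focal line, can be checked numerically. This sketch is ours and is not drawn from any of the compared tools:

```python
import numpy as np

def reflect_to_focus_height(x0, f):
    """Trace a vertical ray hitting the trough profile y = x^2/(4f)
    at x = x0 and return the height where the reflected ray crosses
    the optical axis x = 0 (ideally the focal height f)."""
    y0 = x0**2 / (4 * f)
    # Unit normal of the surface y - x^2/(4f) = 0 at the hit point.
    n = np.array([-x0 / (2 * f), 1.0])
    n /= np.linalg.norm(n)
    d = np.array([0.0, -1.0])          # incoming ray travels straight down
    r = d - 2 * np.dot(d, n) * n       # specular reflection
    t = -x0 / r[0]                     # parameter where the ray reaches x = 0
    return y0 + t * r[1]
```

For an ideal parabola the returned height equals f for every off-axis hit point, which is exactly the property a real tool perturbs with slope errors and sun-shape models.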

  8. Risk reduction using DDP (Defect Detection and Prevention): Software support and software applications

    NASA Technical Reports Server (NTRS)

    Feather, M. S.

    2001-01-01

    Risk assessment and mitigation is the focus of the Defect Detection and Prevention (DDP) process, which has been applied to spacecraft technology assessments and planning, both hardware and software. DDP's major elements and their relevance to core requirement engineering concerns are summarized. The accompanying research demonstration illustrates DDP's tool support, and further customizations for application to software.

  9. The LBT double prime focus camera control software

    NASA Astrophysics Data System (ADS)

    Di Paola, Andrea; Baruffolo, Andrea; Gallozzi, Stefano; Pedichini, Fernando; Speziali, Roberto

    2004-09-01

    The LBT double prime focus camera (LBC) is composed of twin CCD mosaic imagers. The instrument is designed to match the double channel structure of the LBT telescope and to exploit parallel observing mode by optimizing one camera at blue and the other at red side of the visible spectrum. Because of these facts, the LBC activity will likely consist of simultaneous multi-wavelength observation of specific targets, with both channels working at the same time to acquire and download images at different rates. The LBC Control Software is responsible for coordinating these activities by managing scientific sensors and all the ancillary devices such as rotators, filter wheels, optical correctors focusing, house-keeping information, tracking and Active Optics wavefront sensors. The result is obtained using four dedicated PCs to control the four CCD controllers and one dual processor PC to manage all the other aspects including instrument operator interface. The general architecture of the LBC Control Software is described as well as solutions and details about its implementation.

  10. Software Component Technologies and Space Applications

    NASA Technical Reports Server (NTRS)

    Batory, Don

    1995-01-01

    In the near future, software systems will be more reconfigurable than hardware. This will be possible through the advent of software component technologies which have been prototyped in universities and research labs. In this paper, we outline the foundations for those technologies and suggest how they might impact software for space applications.

  11. Programming Language Software For Graphics Applications

    NASA Technical Reports Server (NTRS)

    Beckman, Brian C.

    1993-01-01

    New approach reduces repetitive development of features common to different applications. High-level programming language and interactive environment with access to graphical hardware and software created by adding graphical commands and other constructs to standardized, general-purpose programming language, "Scheme". Designed for use in developing other software incorporating interactive computer-graphics capabilities into application programs. Provides alternative to programming entire applications in C or FORTRAN, specifically ameliorating design and implementation of complex control and data structures typifying applications with interactive graphics. Enables experimental programming and rapid development of prototype software, and yields high-level programs serving as executable versions of software-design documentation.

  12. Collaborative Software and Focused Distraction in the Classroom

    ERIC Educational Resources Information Center

    Rhine, Steve; Bailey, Mark

    2011-01-01

    In search of strategies for increasing their pre-service teachers' thoughtful engagement with content and in an effort to model connection between choice of technology and pedagogical goals, the authors utilized collaborative software during class time. Collaborative software allows all students to write simultaneously on a single collective…

  13. Database Handling Software and Scientific Applications.

    ERIC Educational Resources Information Center

    Gabaldon, Diana J.

    1984-01-01

    Discusses the general characteristics of database management systems and file systems. Also gives a basic framework for evaluating such software and suggests characteristics that should be considered when buying software for specific scientific applications. A list of vendor addresses for popular database management systems is included. (JN)

  14. Firing Room Remote Application Software Development

    NASA Technical Reports Server (NTRS)

    Liu, Kan

    2015-01-01

    The Engineering and Technology Directorate (NE) at National Aeronautics and Space Administration (NASA) Kennedy Space Center (KSC) is designing a new command and control system for the checkout and launch of Space Launch System (SLS) and future rockets. The purposes of the semester-long internship as a remote application software developer include the design, development, integration, and verification of the software and hardware in the firing rooms, in particular with the Mobile Launcher (ML) Launch Accessories (LACC) subsystem. In addition, a software test verification procedure document was created to verify and checkout LACC software for Launch Equipment Test Facility (LETF) testing.

  15. Software engineering with application-specific languages

    NASA Technical Reports Server (NTRS)

    Campbell, David J.; Barker, Linda; Mitchell, Deborah; Pollack, Robert H.

    1993-01-01

    Application-Specific Languages (ASL's) are small, special-purpose languages that are targeted to solve a specific class of problems. Using ASL's on software development projects can provide considerable cost savings, reduce risk, and enhance quality and reliability. ASL's provide a platform for reuse within a project or across many projects and enable less-experienced programmers to tap into the expertise of application-area experts. ASL's have been used on several software development projects for the Space Shuttle Program. On these projects, the use of ASL's resulted in considerable cost savings over conventional development techniques. Two of these projects are described.

  16. Firing Room Remote Application Software Development & Swamp Works Laboratory Robot Software Development

    NASA Technical Reports Server (NTRS)

    Garcia, Janette

    2016-01-01

    The National Aeronautics and Space Administration (NASA) is creating a way to send humans beyond low Earth orbit, and later to Mars. Kennedy Space Center (KSC) is working to make this possible by developing a Spaceport Command and Control System (SCCS) which will allow the launch of the Space Launch System (SLS). This paper's focus is on the work performed by the author in the first and second parts of her internship as a remote application software developer. During the first part of her internship, the author worked on the SCCS software application layer by assisting multiple ground subsystems teams, including Launch Accessories (LACC) and Environmental Control System (ECS), on the design, development, integration, and testing of remote control software applications. During the second part of the internship, the author worked on the development of robot software at the Swamp Works Laboratory, a research and technology development group focused on inventing new technology to help future In-Situ Resource Utilization (ISRU) missions.

  17. Workshop and conference on Grand Challenges applications and software technology

    SciTech Connect

    Not Available

    1993-12-31

    On May 4-7, 1993, nine federal agencies sponsored a four-day meeting on Grand Challenge applications and software technology. The objective was to bring High-Performance Computing and Communications (HPCC) Grand Challenge applications research groups supported under the federal HPCC program together with HPCC software technologists to: discuss multidisciplinary computational science research issues and approaches, identify major technology challenges facing users and providers, and refine software technology requirements for Grand Challenge applications research. The first day and a half focused on applications. Presentations were given by speakers from universities, national laboratories, and government agencies actively involved in Grand Challenge research. Five areas of research were covered: environmental and earth sciences; computational physics; computational biology, chemistry, and materials sciences; computational fluid and plasma dynamics; and applications of artificial intelligence. The next day and a half was spent in working groups in which the applications researchers were joined by software technologists. Nine breakout sessions took place: I/O, Data, and File Systems; Parallel Programming Paradigms; Performance Characterization and Evaluation of Massively Parallel Processing Applications; Program Development Tools; Building Multidisciplinary Applications; Algorithms and Libraries I; Algorithms and Libraries II; Graphics and Visualization; and National HPCC Infrastructure.

  18. E-learning for Critical Thinking: Using Nominal Focus Group Method to Inform Software Content and Design

    PubMed Central

    Parker, Steve; Mayner, Lidia; Michael Gillham, David

    2015-01-01

    Background: Undergraduate nursing students are often confused by multiple understandings of critical thinking. In response to this situation, the Critiique for critical thinking (CCT) project was implemented to provide consistent structured guidance about critical thinking. Objectives: This paper introduces Critiique software, describes initial validation of the content of this critical thinking tool and explores wider applications of the Critiique software. Materials and Methods: Critiique is flexible, authorable software that guides students step-by-step through critical appraisal of research papers. The spelling of Critiique was deliberate, so as to acquire a unique web domain name and associated logo. The CCT project involved implementation of a modified nominal focus group process with academic staff working together to establish common understandings of critical thinking. Previous work established a consensus about critical thinking in nursing and provided a starting point for the focus groups. The study was conducted at an Australian university campus with the focus group guided by open ended questions. Results: Focus group data established categories of content that academic staff identified as important for teaching critical thinking. This emerging focus group data was then used to inform modification of Critiique software so that students had access to consistent and structured guidance in relation to critical thinking and critical appraisal. Conclusions: The project succeeded in using focus group data from academics to inform software development while at the same time retaining the benefits of broader philosophical dimensions of critical thinking. PMID:26835469

  19. Firing Room Remote Application Software Development

    NASA Technical Reports Server (NTRS)

    Liu, Kan

    2014-01-01

    The Engineering and Technology Directorate (NE) at National Aeronautics and Space Administration (NASA) Kennedy Space Center (KSC) is designing a new command and control system for the checkout and launch of Space Launch System (SLS) and future rockets. The purposes of the semester-long internship as a remote application software developer include the design, development, integration, and verification of the software and hardware in the firing rooms, in particular with the Mobile Launcher (ML) Launch Accessories subsystem. In addition, a Conversion Fusion project was created to show specific approved checkout and launch engineering data for public-friendly display purposes.

  20. Application development using the ALMA common software

    NASA Astrophysics Data System (ADS)

    Chiozzi, G.; Caproni, A.; Jeram, B.; Sommer, H.; Wang, V.; Plesko, M.; Sekoranja, M.; Zagar, K.; Fugate, D. W.; Harrington, S.; Di Marcantonio, P.; Cirami, R.

    2006-06-01

    The ALMA Common Software (ACS) provides the software infrastructure used by ALMA and by several other telescope projects, thanks also to the choice of adopting the LGPL public license. ACS is a set of application frameworks providing the basic services needed for object-oriented distributed computing. Among these are transparent remote object invocation, object deployment and location based on a container/component model, distributed error and alarm handling, logging, and events. ACS is based on CORBA and built on top of free CORBA implementations. Free software is extensively used wherever possible. The general architecture of ACS was presented at SPIE 2002. ACS has been under development for 6 years and is midway through its development life. Many applications have been written using ACS; the ALMA test facility, APEX, and other telescopes are running systems based on ACS. This is therefore a good time to look back and see what the strong and weak points of ACS have been so far, in terms of architecture and implementation. In this perspective, it is very important to analyze the applications based on ACS, the feedback received from users, and the impact that this feedback has had on the development of ACS itself, by favoring the development of some features with respect to others. The purpose of this paper is to describe the results of this analysis and discuss what we would like to do to extend and improve ACS in the coming years, in particular to make application development easier and more efficient.

  1. A Software Architecture for High Level Applications

    SciTech Connect

    Shen, G.

    2009-05-04

    A modular software platform for high level applications is under development at the National Synchrotron Light Source II project. This platform is based on a client-server architecture, and the components of high level applications on this platform will be modular and distributed, and therefore reusable. An online model server is indispensable for model-based control. Different accelerator facilities have different requirements for online simulation. To support various accelerator simulators, a set of narrow and general application programming interfaces has been developed based on Tracy-3 and Elegant. This paper describes the system architecture for the modular high level applications, the design of the narrow and general application programming interfaces for an online model server, and the prototype of the online model server.

  2. 36 CFR 1194.21 - Software applications and operating systems.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 36 Parks, Forests, and Public Property 3 2011-07-01 2011-07-01 false Software applications and... Standards § 1194.21 Software applications and operating systems. (a) When software is designed to run on a...) Software shall not use flashing or blinking text, objects, or other elements having a flash or...

  3. 36 CFR 1194.21 - Software applications and operating systems.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 36 Parks, Forests, and Public Property 3 2013-07-01 2012-07-01 true Software applications and... Standards § 1194.21 Software applications and operating systems. (a) When software is designed to run on a...) Software shall not use flashing or blinking text, objects, or other elements having a flash or...

  4. Parallel Algorithms and Software for Nuclear, Energy, and Environmental Applications. Part II: Multiphysics Software

    SciTech Connect

    Derek Gaston; Luanjing Guo; Glen Hansen; Hai Huang; Richard Johnson; Dana Knoll; Chris Newman; Hyeong Kae Park; Robert Podgorney; Michael Tonks; Richard Williamson

    2012-09-01

    This paper is the second part of a two part sequence on multiphysics algorithms and software. The first [1] focused on the algorithms; this part treats the multiphysics software framework and applications based on it. Tight coupling is typically designed into the analysis application at inception, as such an application is strongly tied to a composite nonlinear solver that arrives at the final solution by treating all equations simultaneously. The application must also take care to minimize both time and space error between the physics, particularly if more than one mesh representation is needed in the solution process. This paper presents an application framework that was specifically designed to support tightly coupled multiphysics analysis. The Multiphysics Object Oriented Simulation Environment (MOOSE) is based on the Jacobian-free Newton-Krylov (JFNK) method combined with physics-based preconditioning to provide the underlying mathematical structure for applications. The report concludes with the presentation of a host of nuclear, energy, and environmental applications that demonstrate the efficacy of the approach and the utility of a well-designed multiphysics framework.
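
As a minimal illustration of the JFNK idea named above, solving all coupled equations simultaneously with a Krylov method and no explicitly assembled Jacobian, SciPy's `newton_krylov` can be applied to a toy two-equation residual. The system below is invented for illustration and is far simpler than anything MOOSE targets:

```python
import numpy as np
from scipy.optimize import newton_krylov

def coupled_residual(u):
    """Residual of a toy tightly coupled system: both 'physics'
    equations are treated simultaneously, not operator-split."""
    x, y = u
    return np.array([x**2 + y - 3.0,   # nonlinear "physics 1"
                     x + y**2 - 5.0])  # nonlinear "physics 2"

# Jacobian-free Newton-Krylov: Jacobian-vector products are approximated
# by finite differences of the residual; the Jacobian is never formed.
solution = newton_krylov(coupled_residual, np.array([1.2, 1.8]), f_tol=1e-10)
```

Physics-based preconditioning, as in MOOSE, would supply an approximate inverse to accelerate the inner Krylov iterations; it is omitted from this sketch.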

  5. Plasma focus: Present status and potential applications

    SciTech Connect

    Brzosko, J.S.; Nardi, V.; Powell, C.

    1997-12-01

    Initially, dense plasma focus (DPF) machines were constructed independently by Filippov in Moscow and Mather in Los Alamos at the end of the 1950s. Since then, more than 30 laboratories have carried out vigorous DPF programs, oriented mainly toward studies of the physics of ion acceleration and trapping in the plasma focus environment. Applications of the DPF as intense neutron and X-ray sources have been recognized since its discovery but not implemented for various reasons. Recently, some groups (including AES) addressed the issue of DPF applications, and some of them are briefly discussed in this paper.

  6. Evaluation of the DDSolver software applications.

    PubMed

    Zuo, Jieyu; Gao, Yuan; Bou-Chacra, Nadia; Löbenberg, Raimar

    2014-01-01

    When a new oral dosage form is developed, its dissolution behavior must be quantitatively analyzed. Dissolution analysis involves a comparison of the dissolution profiles and the application of mathematical models to describe the drug release pattern. This report aims to assess the application of the DDSolver, an Excel add-in software package, which is designed to analyze data obtained from dissolution experiments. The data used in this report were chosen from two dissolution studies. The results of the DDSolver analysis were compared with those obtained using an Excel worksheet. The comparisons among three different products yielded similarity factors (f2) of 23.21, 46.66, and 17.91 using both DDSolver and the Excel worksheet. The results differed when DDSolver and Excel were used to calculate the release exponent "n" in the Korsmeyer-Peppas model. Performing routine quantitative analysis proved to be much easier using the DDSolver program than an Excel spreadsheet. The use of the DDSolver program reduced the calculation time and has the potential to eliminate calculation errors, thus making this software package a convenient tool for dissolution comparison.
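
The similarity factor f2 reported above follows the standard FDA/EMA definition, which is easy to verify independently. This helper is our sketch, not DDSolver code:

```python
import math

def f2_similarity(reference, test):
    """Similarity factor f2 = 50 * log10(100 / sqrt(1 + MSE)), where
    MSE is the mean squared difference between two dissolution profiles
    (% dissolved at matched time points). f2 >= 50 means 'similar'."""
    diffs = [(r - t) ** 2 for r, t in zip(reference, test)]
    mse = sum(diffs) / len(diffs)
    return 50.0 * math.log10(100.0 / math.sqrt(1.0 + mse))
```

Identical profiles give f2 = 100; an average point-wise difference of 10% dissolved gives f2 just below 50, the conventional similarity cut-off.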

  7. Software reliability models for critical applications

    SciTech Connect

    Pham, H.; Pham, M.

    1991-12-01

    This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying the existing software reliability models and proposes a state-of-the-art software reliability model that is relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault tolerant software reliability models and their related issues, (2) proposed technique for software reliability enhancement, and (3) general discussion and future research. The development of this proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.

  9. Designing Control System Application Software for Change

    NASA Technical Reports Server (NTRS)

    Boulanger, Richard

    2001-01-01

    The Unified Modeling Language (UML) was used to design the Environmental Systems Test Stand (ESTS) control system software. The UML was chosen for its ability to facilitate a clear dialog between software designer and customer, from which requirements are discovered and documented in a manner that translates directly to program objects. Applying the UML to control system software design has resulted in a baseline set of documents from which change, and the effort of that change, can be accurately measured. As the Environmental Systems Test Stand evolves, accurate estimates of the time and effort required to change the control system software will be made. Accurate quantification of the cost of software change can thus be made before implementation, improving schedule and budget accuracy.

  10. Analysis of Software Development Methodologies to Build Safety Software Applications for the SATEX-II: A Mexican Experimental Satellite

    NASA Astrophysics Data System (ADS)

    Aguilar Cisneros, Jorge; Vargas Martinez, Hector; Pedroza Melendez, Alejandro; Alonso Arevalo, Miguel

    2013-09-01

    Mexico is a country where experience in building software for satellite applications is just beginning. This is a delicate situation because in the near future we will need to develop software for the SATEX-II (Mexican Experimental Satellite). SATEX-II is a project of SOMECyTA (the Mexican Society of Aerospace Science and Technology). We have experience applying software development methodologies, such as TSP (Team Software Process) and SCRUM, in other areas. We analyzed these methodologies and concluded that they can be applied to develop software for the SATEX-II, supported by the ESA PSS-05-0 Standard, in particular ESA PSS-05-11. Our analysis focused on the main characteristics of each methodology and on how these methodologies could be used with the ESA PSS-05-0 Standard. Our outcomes may be used, in general, by teams that need to build small satellites and, in particular, will be used when we build the onboard software applications for the SATEX-II.

  11. Classroom Applications of Electronic Spreadsheet Computer Software.

    ERIC Educational Resources Information Center

    Tolbert, Patricia H.; Tolbert, Charles M., II

    1983-01-01

    Details classroom use of SuperCalc, a software accounting package developed originally for small businesses, as a computerized gradebook. A procedure which uses data from the computer gradebook to produce weekly printed reports for parents is also described. (MBR)

  12. SHMTools: a general-purpose software tool for SHM applications

    SciTech Connect

    Harvey, Dustin; Farrar, Charles; Taylor, Stuart; Park, Gyuhae; Flynn, Eric B; Kpotufe, Samory; Dondi, Denis; Mollov, Todor; Todd, Michael D; Rosin, Tajana S; Figueiredo, Eloi

    2010-11-30

    This paper describes a new software package for various structural health monitoring (SHM) applications. The software is a set of standardized MATLAB routines covering three main stages of SHM: data acquisition, feature extraction, and feature classification for damage identification. A subset of the software in SHMTools is embeddable, consisting of MATLAB functions that can be cross-compiled into generic 'C' programs to run on target hardware. The software is also designed to accommodate multiple sensing modalities, including piezoelectric active-sensing, which has been widely used in SHM practice. The software package, including standardized datasets, is publicly available for use by the SHM community. The details of this embeddable software are discussed, along with several example processes that can serve as guidelines for future use of the software.
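    The three-stage pipeline the abstract describes (data acquisition, feature extraction, feature classification) can be sketched in a few lines. The sketch below is illustrative Python, not the MATLAB SHMTools API: it uses autoregressive-model coefficients as damage-sensitive features and scores a new record against an undamaged baseline with a Mahalanobis distance, a common pairing in the SHM literature.

    ```python
    import numpy as np

    def extract_features(signal, order=4):
        """Feature extraction: fit an autoregressive model by least squares
        and use its coefficients as a damage-sensitive feature vector."""
        # Lagged design matrix: predict x[t] from x[t-1..t-order]
        X = np.column_stack([signal[order - k - 1 : len(signal) - k - 1]
                             for k in range(order)])
        y = signal[order:]
        coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
        return coeffs

    def mahalanobis_score(feature, baseline_features):
        """Feature classification: distance of a new feature vector from the
        baseline (undamaged) distribution; large scores suggest damage."""
        mu = baseline_features.mean(axis=0)
        cov = np.cov(baseline_features, rowvar=False)
        diff = feature - mu
        return float(diff @ np.linalg.solve(cov, diff))

    # Baseline: many undamaged-condition records; then score one new record
    rng = np.random.default_rng(0)
    baseline = np.array([extract_features(rng.standard_normal(500))
                         for _ in range(50)])
    score = mahalanobis_score(extract_features(rng.standard_normal(500)), baseline)
    ```

    In practice the classification stage would compare the score against a threshold learned from the baseline data.
    
    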

  13. Safety Characteristics in System Application Software for Human Rated Exploration

    NASA Technical Reports Server (NTRS)

    Mango, E. J.

    2016-01-01

    NASA and its industry and international partners are embarking on a bold and inspiring development effort to design and build an exploration class space system. The space system is made up of the Orion system, the Space Launch System (SLS) and the Ground Systems Development and Operations (GSDO) system. All are highly coupled together and dependent on each other for the combined safety of the space system. A key area of system safety focus needs to be in the ground and flight application software system (GFAS). In the development, certification and operations of GFAS, there are a series of safety characteristics that define the approach to ensure mission success. This paper will explore and examine the safety characteristics of the GFAS development.

  14. An evaluation of the Interactive Software Invocation System (ISIS) for software development applications. [flight software

    NASA Technical Reports Server (NTRS)

    Noland, M. S.

    1981-01-01

    The Interactive Software Invocation System (ISIS), which allows a user to build, modify, control, and process a total flight software system without direct communications with the host computer, is described. This interactive data management system provides the user with a file manager, a text editor, a tool invoker, and an Interactive Programming Language (IPL). The basic file design of ISIS is a five-level hierarchical structure. The file manager controls this hierarchical file structure and permits the user to create, save, access, and purge pages of information. The text editor is used to manipulate pages of text to be modified, and the tool invoker allows the user to communicate with the host computer through a RUN file created by the user. The IPL is based on PASCAL and contains most of the statements found in a high-level programming language. In order to evaluate the effectiveness of the system as applied to a flight project, the collection of software components required to support the Annular Suspension and Pointing System (ASPS) flight project was integrated using ISIS. The ASPS software system and its integration into ISIS are described.
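    The file manager's five-level hierarchy, with its create/save/access/purge operations on pages, can be illustrated with a toy model. Everything below (class name, method names) is hypothetical; ISIS itself was an interactive system on a host computer, not a Python library.

    ```python
    class PageStore:
        """Toy five-level hierarchical page store, loosely modeled on the
        ISIS file manager's save/access/purge operations (hypothetical API)."""
        MAX_DEPTH = 5

        def __init__(self):
            self._pages = {}            # path tuple -> page text

        def save(self, path, text):
            parts = tuple(path.strip("/").split("/"))
            if len(parts) > self.MAX_DEPTH:
                raise ValueError("hierarchy is limited to five levels")
            self._pages[parts] = text

        def access(self, path):
            return self._pages[tuple(path.strip("/").split("/"))]

        def purge(self, path):
            # Purging a node removes every page beneath it in the hierarchy
            prefix = tuple(path.strip("/").split("/"))
            for key in [k for k in self._pages if k[:len(prefix)] == prefix]:
                del self._pages[key]

    store = PageStore()
    store.save("project/flight/asps/notes/page1", "RUN file draft")
    text = store.access("project/flight/asps/notes/page1")
    store.purge("project/flight")       # removes the whole subtree
    ```
    
    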

  15. Software framework for nano- and microscale measurement applications

    NASA Astrophysics Data System (ADS)

    Röning, Juha; Tuhkanen, Ville; Sipola, Risto; Vallius, Tero

    2011-01-01

    Development of new instruments and measurement methods has advanced research in the field of nanotechnology. Development of the measurement systems used in research requires support from reconfigurable software. Application frameworks can be used to develop domain-specific application skeletons; new applications are specialized from the framework by filling its extension points. This paper presents an application framework for nano- and microscale applications. The framework consists of an implementation of a robotic control architecture and components that implement features available in measurement applications. To ease the development of user interfaces for measurement systems, the framework also contains ready-to-use user interface components. The goal of the framework was to ease the development of new applications for measurement systems. Features of the implemented framework were examined through two test cases. Benefits gained by using the framework were analyzed by determining the work needed to specialize new applications from the framework; the degree of reusability of specialized applications was also examined. The work shows that the developed framework can be used to implement software for measurement systems and that the major part of the software can be implemented using reusable components of the framework. When developing new software, a developer only needs to develop the components related to the hardware used and to performing the measurement task. Using the framework, developing new software takes less time. The framework also unifies the structure of the developed software.

  16. Value Engineering: An Application to Computer Software

    DTIC Science & Technology

    1995-06-01

    of Value Engineering to a software development process. Purchasing agents for the State of New Mexico were tasked to reduce the amount of waiting costs...of VE in software acquisition/development (i.e., education, award programs, designate Govt. savings for use in generating additional savings

  17. End User Software Development for Transportation Applications.

    DTIC Science & Technology

    1987-09-01

    patience, attention, and guidance throughout this research project. I also wish to thank Captain Demetrius Glass of the HQ USAF Transportation Plans and...necessary management and direction to refine non-standardized, interim software computer programs. According to Captain Demetrius Glass of the

  18. Software Applications To Increase Administrative and Teacher Effectiveness.

    ERIC Educational Resources Information Center

    Garland, Virginia E.

    Arguing that the most effective types of managerial computer software for teacher use are word processing, database management, and electronic spreadsheet packages, this paper uses Apple Writer, PFS File, and VisiCalc as examples of such software and suggests ways in which they can be used by classroom teachers. Applications of Apple Writer that…

  19. 36 CFR 1194.21 - Software applications and operating systems.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... operating systems. 1194.21 Section 1194.21 Parks, Forests, and Public Property ARCHITECTURAL AND... Standards § 1194.21 Software applications and operating systems. (a) When software is designed to run on a... shall not disrupt or disable activated features of any operating system that are identified...

  20. Software development for safety-critical medical applications

    NASA Technical Reports Server (NTRS)

    Knight, John C.

    1992-01-01

    There are many computer-based medical applications in which safety and not reliability is the overriding concern. Reduced, altered, or no functionality of such systems is acceptable as long as no harm is done. A precise, formal definition of what software safety means is essential, however, before any attempt can be made to achieve it. Without this definition, it is not possible to determine whether a specific software entity is safe. A set of definitions pertaining to software safety will be presented and a case study involving an experimental medical device will be described. Some new techniques aimed at improving software safety will also be discussed.

  1. Focused force angioplasty Theory and application

    SciTech Connect

    Solar, Ronald J.; Ischinger, Thomas A

    2003-03-01

    Focused force angioplasty is a technique in which the forces resulting from inflating an angioplasty balloon in a stenosis are concentrated and focused at one or more locations within the stenosis. While the technique has been shown to be useful in resolving resistant stenoses, its real value may be in minimizing the vascular trauma associated with balloon angioplasty and subsequently improving the outcome.

  2. Designing application software in wide area network settings

    NASA Technical Reports Server (NTRS)

    Makpangou, Mesaac; Birman, Ken

    1990-01-01

    Progress in methodologies for developing robust local area network software has not been matched by similar results for wide area settings. The design of application software spanning multiple local area environments is examined. For important classes of applications, simple design techniques are presented that yield fault tolerant wide area programs. An implementation of these techniques as a set of tools for use within the ISIS system is described.

  3. [Application of password manager software in health care].

    PubMed

    Ködmön, József

    2016-12-01

    When using multiple IT systems, handling passwords in a secure manner is a potential source of problems. The most frequent issues are choosing passwords of appropriate length and complexity, and then remembering those strong passwords. Password manager software provides a good solution for this problem, while greatly increasing the security of sensitive medical data. This article introduces password manager software and provides basic information about such applications. It also discusses how to select a truly secure password manager and offers practical suggestions for its efficient, safe, and comfortable use in health care. Orv. Hetil., 2016, 157(52), 2066-2073.
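    Two core mechanisms behind any password manager, generating strong random passwords and deriving a vault-encryption key from one master password, can be sketched with the Python standard library (`secrets`, `hashlib`). This is a generic illustration, not the specific software reviewed in the article.

    ```python
    import hashlib
    import secrets
    import string

    def generate_password(length=16):
        """Generate a strong random password of the kind a manager stores,
        using a cryptographically secure random source."""
        alphabet = string.ascii_letters + string.digits + string.punctuation
        return "".join(secrets.choice(alphabet) for _ in range(length))

    def derive_vault_key(master_password, salt, iterations=600_000):
        """Derive the key that encrypts the password vault from a single
        master password, so the user need remember only one strong secret."""
        return hashlib.pbkdf2_hmac("sha256", master_password.encode(),
                                   salt, iterations)

    salt = secrets.token_bytes(16)          # stored alongside the vault
    key = derive_vault_key("correct horse battery staple", salt)
    pw = generate_password()
    ```

    The high PBKDF2 iteration count deliberately slows down offline guessing of the master password.
    
    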

  4. A computationally efficient software application for calculating vibration from underground railways

    NASA Astrophysics Data System (ADS)

    Hussein, M. F. M.; Hunt, H. E. M.

    2009-08-01

    The PiP model is a software application with a user-friendly interface for calculating vibration from underground railways. This paper reports on the software, with a focus on its latest version and plans for future development. The software calculates the power spectral density of vibration due to a moving train on floating-slab track, with track irregularity described by typical spectra for tracks in good, average, and bad condition. The latest version accounts for a tunnel embedded in a half-space by employing a toolbox developed at K.U. Leuven that calculates Green's functions for a multi-layered half-space.
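    The quantity the software reports, the power spectral density of the vibration response, can be illustrated with a minimal periodogram estimate in NumPy. This is a generic PSD computation on a synthetic signal, not the PiP model's train-track-tunnel calculation.

    ```python
    import numpy as np

    def psd(x, fs):
        """One-sided power spectral density via a Hann-windowed periodogram
        (a minimal stand-in for the PSD output a tool like PiP produces)."""
        n = len(x)
        window = np.hanning(n)
        X = np.fft.rfft(x * window)
        # Normalize by window power so the PSD level is window-independent
        scale = fs * np.sum(window ** 2)
        Pxx = (np.abs(X) ** 2) / scale
        Pxx[1:-1] *= 2                      # fold in negative frequencies
        freqs = np.fft.rfftfreq(n, d=1 / fs)
        return freqs, Pxx

    # Synthetic record: a 20 Hz tone buried in broadband noise
    fs = 1000.0
    t = np.arange(0, 2.0, 1 / fs)
    rng = np.random.default_rng(1)
    x = np.sin(2 * np.pi * 20 * t) + 0.1 * rng.standard_normal(len(t))
    freqs, Pxx = psd(x, fs)
    peak = freqs[np.argmax(Pxx)]            # expect a peak near 20 Hz
    ```
    
    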

  5. HSCT4.0 Application: Software Requirements Specification

    NASA Technical Reports Server (NTRS)

    Salas, A. O.; Walsh, J. L.; Mason, B. H.; Weston, R. P.; Townsend, J. C.; Samareh, J. A.; Green, L. L.

    2001-01-01

    The software requirements for the High Performance Computing and Communication Program High Speed Civil Transport application project, referred to as HSCT4.0, are described. The objective of the HSCT4.0 application project is to demonstrate the application of high-performance computing techniques to the problem of multidisciplinary design optimization of a supersonic transport configuration, using high-fidelity analysis simulations. Descriptions of the various functions (and the relationships among them) that make up the multidisciplinary application, as well as the constraints on the software design, are provided. This document serves to establish an agreement between the suppliers and the customer as to what the HSCT4.0 application should do and provides the software developers the information necessary to design and implement the system.

  6. Standardization of Software Application Development and Governance

    DTIC Science & Technology

    2015-03-01

    Janssen, 2014a). The JavaBeans concept allows the encapsulation of multiple objects into a single Bean. Beans register to receive or send objects...to other applications or other parts of the system. A program can re-use the Beans or objects in multiple Beans depending on how the application

  7. Control system software, simulation, and robotic applications

    NASA Technical Reports Server (NTRS)

    Frisch, Harold P.

    1991-01-01

    All essential existing capabilities needed to create a man-machine interaction dynamics and performance (MMIDAP) capability are reviewed. The multibody system dynamics software program Order N DISCOS will be used for machine and musculo-skeletal dynamics modeling. The program JACK will be used for estimating and animating whole-body human response to given loading situations and motion constraints. The basic elements of performance (BEP) task decomposition methodologies associated with the Human Performance Institute database will be used for performance assessment. Techniques for resolving the statically indeterminate muscular load-sharing problem will be used for a detailed understanding of potential musculotendon or ligamentous fatigue, pain, discomfort, and trauma. The envisioned capability is to be used for mechanical system design, human performance assessment, extrapolation of man/machine interaction test data, biomedical engineering, and soft prototyping within a concurrent engineering (CE) system.

  8. Development and application of new quality model for software projects.

    PubMed

    Karnavel, K; Dillibabu, R

    2014-01-01

    The IT industry tries to employ a number of models to identify the defects in the construction of software projects. In this paper, we present COQUALMO and its limitations and aim to increase the quality without increasing the cost and time. The computation time, cost, and effort to predict the residual defects are very high; this was overcome by developing an appropriate new quality model named the software testing defect corrective model (STDCM). The STDCM was used to estimate the number of remaining residual defects in the software product; a few assumptions and the detailed steps of the STDCM are highlighted. The application of the STDCM is explored in software projects. The implementation of the model is validated using statistical inference, which shows there is a significant improvement in the quality of the software projects.
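    The abstract's goal of estimating the remaining residual defects in a software product can be illustrated with a classic capture-recapture estimate from two independent inspections. This is a standard textbook technique (Lincoln-Petersen), shown here only as an illustration, not the STDCM computation from the paper.

    ```python
    def estimate_residual_defects(found_a, found_b, found_both, found_total):
        """Lincoln-Petersen capture-recapture estimate of total defects from
        two independent inspections; residual = estimated total minus the
        unique defects actually found. (Illustrative only -- not the STDCM.)"""
        total_estimate = (found_a * found_b) / found_both
        return total_estimate - found_total

    # Inspection A found 40 defects, B found 30, 20 were found by both,
    # so 50 unique defects were seen in total.
    residual = estimate_residual_defects(40, 30, 20, 50)
    # 40*30/20 = 60 estimated total -> about 10 estimated remaining
    ```

    The overlap between the two inspections drives the estimate: a small overlap suggests many defects remain undiscovered.
    
    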

  9. Development and Application of New Quality Model for Software Projects

    PubMed Central

    Karnavel, K.; Dillibabu, R.

    2014-01-01

    The IT industry tries to employ a number of models to identify the defects in the construction of software projects. In this paper, we present COQUALMO and its limitations and aim to increase the quality without increasing the cost and time. The computation time, cost, and effort to predict the residual defects are very high; this was overcome by developing an appropriate new quality model named the software testing defect corrective model (STDCM). The STDCM was used to estimate the number of remaining residual defects in the software product; a few assumptions and the detailed steps of the STDCM are highlighted. The application of the STDCM is explored in software projects. The implementation of the model is validated using statistical inference, which shows there is a significant improvement in the quality of the software projects. PMID:25478594

  10. Computer Applications in Marketing. An Annotated Bibliography of Computer Software.

    ERIC Educational Resources Information Center

    Burrow, Jim; Schwamman, Faye

    This bibliography contains annotations of 95 items of educational and business software with applications in seven marketing and business functions. The annotations, which appear in alphabetical order by title, provide this information: category (related application), title, date, source and price, equipment, supplementary materials, description…

  11. Simplifying applications software for vision guided robot implementation

    NASA Technical Reports Server (NTRS)

    Duncheon, Charlie

    1994-01-01

    A simple approach to robot applications software is described. The idea is to use commercially available software and hardware wherever possible to minimize system costs, schedules, and risks. The U.S. has been slow in the adoption of robots and flexible automation compared to the flourishing growth of robot implementation in Japan. The U.S. can benefit from this approach because of a more flexible array of vision guided robot technologies.

  12. Applications of isoelectric focusing in forensic serology.

    PubMed

    Murch, R S; Budowle, B

    1986-07-01

    The typing of certain polymorphic proteins present in human body fluids is an important aspect of the analysis of serological evidence. This is particularly true when dealing with evidence related to violent criminal activity such as homicide, assault, or rape. Until recently, the routine analysis of the genetic polymorphisms of interest relied upon conventional electrophoretic techniques such as horizontal starch or agarose slab gels, cellulose acetate, and vertical polyacrylamide gradient gel methods. These techniques adequately separate a limited number of common variants, and in some cases they are still the methods of choice. However, as a result of the nature of the conventional approach, problems with analysis time, resolution, diffusion of bands, sensitivity of protein detection, and cost are often encountered. Isoelectric focusing (IEF) offers an effective alternative to conventional electrophoresis for genetic marker typing. This method exploits the isoelectric point of allelic products rather than the charge-to-mass ratio in a particular pH environment. The advantages of employing IEF include reduced analysis time, increased resolution of protein bands, the possibility of subtyping existing phenotypes, increased sensitivity of detection, counteraction of diffusion effects, and reduced cost per sample.

  13. High Intensity Focused Ultrasound Tumor Therapy System and Its Application

    NASA Astrophysics Data System (ADS)

    Sun, Fucheng; He, Ye; Li, Rui

    2007-05-01

    At the end of last century, a High Intensity Focused Ultrasound (HIFU) tumor therapy system was successfully developed and manufactured in China, which has been already applied to clinical therapy. This article aims to discuss the HIFU therapy system and its application. Detailed research includes the following: power amplifiers for high-power ultrasound, ultrasound transducers with large apertures, accurate 3-D mechanical drives, a software control system (both high-voltage control and low-voltage control), and the B-mode ultrasonic diagnostic equipment used for treatment monitoring. Research on the dosage of ultrasound required for tumour therapy in multiple human cases has made it possible to relate a dosage formula, presented in this paper, to other significant parameters such as the volume of thermal tumor solidification, the acoustic intensity (I), and the ultrasound emission time (tn). Moreover, the HIFU therapy system can be applied to the clinical treatment of both benign and malignant tumors in the pelvic and abdominal cavity, such as uterine fibroids, liver cancer and pancreatic carcinoma.

  14. Solar-terrestrial models and application software

    NASA Technical Reports Server (NTRS)

    Bilitza, Dieter

    1990-01-01

    The empirical models related to solar-terrestrial sciences that are available in the form of computer programs are listed and described. Also included are programs that use one or more of these models for application-specific purposes. The entries are grouped according to the region of the solar-terrestrial environment to which they belong and according to the parameter which they describe. Regions considered include the ionosphere, atmosphere, magnetosphere, planets, interplanetary space, and heliosphere. Information is also provided on the accessibility of the solar-terrestrial models and on the specification of magnetic and solar activity conditions.

  15. Mission design applications of QUICK. [software for interactive trajectory calculation

    NASA Technical Reports Server (NTRS)

    Skinner, David L.; Bass, Laura E.; Byrnes, Dennis V.; Cheng, Jeannie T.; Fordyce, Jess E.; Knocke, Philip C.; Lyons, Daniel T.; Pojman, Joan L.; Stetson, Douglas S.; Wolf, Aron A.

    1990-01-01

    An overview of an interactive software environment for space mission design termed QUICK is presented. This stand-alone program provides a programmable FORTRAN-like calculator interface to a wide range of both built-in and user-defined functions. QUICK has evolved into a general-purpose software environment that can be intrinsically and dynamically customized for a wide range of mission design applications. Specific applications are described for several space programs, e.g., the Earth-Venus-Mars mission, the Cassini mission to Saturn, the Mars Observer, the Galileo Project, and the Magellan spacecraft.

  16. Static and Dynamic Verification of Critical Software for Space Applications

    NASA Astrophysics Data System (ADS)

    Moreira, F.; Maia, R.; Costa, D.; Duro, N.; Rodríguez-Dapena, P.; Hjortnaes, K.

    Space technology is no longer used only for highly specialised research activities or for sophisticated manned space missions. Modern society relies more and more on space technology and applications for everyday activities. Worldwide telecommunications, Earth observation, navigation and remote sensing are only a few examples of space applications on which we rely daily. The European-driven global navigation system Galileo and its associated applications, e.g. air traffic management and vessel and car navigation, will significantly expand the already stringent safety requirements for space-based applications. Apart from their usefulness and practical applications, every single piece of onboard software deployed into space represents an enormous investment. With a long operational lifetime, and being extremely difficult to maintain and upgrade, at least compared with "mainstream" software development, the importance of ensuring their correctness before deployment is immense. Verification & Validation techniques and technologies have a key role in ensuring that the onboard software is correct and error free, or at least free from errors that can potentially lead to catastrophic failures. Many RAMS techniques, including both static criticality analysis and dynamic verification techniques, have been used as a means to verify and validate critical software and to ensure its correctness. Traditionally, however, these have been applied in isolation. One of the main reasons is the immaturity of this field with respect to its application to the growing software content of space systems. This paper presents an innovative way of combining static and dynamic techniques, exploiting their synergy and complementarity for software fault removal. The proposed methodology is based on the combination of software FMEA and FTA with fault-injection techniques. The case study described herein is implemented with support from two tools: the SoftCare tool for the SFMEA and SFTA
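    The dynamic half of such a methodology, fault injection, can be sketched as a loop that corrupts a component's input and checks whether a safety invariant still holds in the output. The harness below is hypothetical Python, not the SoftCare tool or any actual onboard software.

    ```python
    import random

    def fault_injection_trial(component, checker, inject, rng):
        """One dynamic-verification trial: run the component on a corrupted
        input and report whether the safety invariant held on the output."""
        good_input = rng.uniform(0.0, 100.0)
        faulty_input = inject(good_input, rng)
        result = component(faulty_input)
        return checker(result)

    # Component under test: must keep a thruster duty cycle within [0, 1];
    # it defends itself by clamping the demanded value.
    component = lambda demand: min(max(demand / 100.0, 0.0), 1.0)
    checker = lambda duty: 0.0 <= duty <= 1.0            # safety invariant
    inject = lambda x, rng: x + rng.choice([-1e6, 1e6])  # gross corruption

    rng = random.Random(42)
    held = sum(fault_injection_trial(component, checker, inject, rng)
               for _ in range(1000))
    print(f"invariant held in {held}/1000 faulty runs")
    ```

    A static analysis (e.g. an FMEA worksheet) would identify which inputs and invariants are worth injecting against; the dynamic campaign then exercises them.
    
    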

  17. Enhancement of computer system for applications software branch

    NASA Technical Reports Server (NTRS)

    Bykat, Alex

    1987-01-01

    Presented is a compilation of the history of a two-month project concerned with a survey, evaluation, and specification of a new computer system for the Applications Software Branch of the Software and Data Management Division of the Information and Electronic Systems Laboratory at Marshall Space Flight Center, NASA. Information gathering consisted of discussions and surveys of branch activities, evaluation of computer manufacturer literature, and presentations by vendors, and was followed by evaluation of the candidate systems. The criteria for the latter were the (tentative) architecture selected for the new system, the type of network architecture supported, software tools, and, to some extent, price. The information received from the vendors, together with additional research, led to the detailed design of a suitable system. This design included considerations of hardware and software environments as well as personnel issues such as training. Design of the system culminated in a recommendation for a new computing system for the Branch.

  18. Web Application Software for Ground Operations Planning Database (GOPDb) Management

    NASA Technical Reports Server (NTRS)

    Lanham, Clifton; Kallner, Shawn; Gernand, Jeffrey

    2013-01-01

    A Web application facilitates collaborative development of the ground operations planning document. This will reduce costs and development time for new programs by incorporating the data governance, access control, and revision tracking of the ground operations planning data. Ground Operations Planning requires the creation and maintenance of detailed timelines and documentation. The GOPDb Web application was created using state-of-the-art Web 2.0 technologies, and was deployed as SaaS (Software as a Service), with an emphasis on data governance and security needs. Application access is managed using two-factor authentication, with data write permissions tied to user roles and responsibilities. Multiple instances of the application can be deployed on a Web server to meet the robust needs for multiple, future programs with minimal additional cost. This innovation features high availability and scalability, with no additional software that needs to be bought or installed. For data governance and security (data quality, management, business process management, and risk management for data handling), the software uses NAMS. No local copy/cloning of data is permitted. Data change log/tracking is addressed, as well as collaboration, work flow, and process standardization. The software provides on-line documentation and detailed Web-based help. There are multiple ways that this software can be deployed on a Web server to meet ground operations planning needs for future programs. The software could be used to support commercial crew ground operations planning, as well as commercial payload/satellite ground operations planning. The application source code and database schema are owned by NASA.
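    The access-control model described, write permissions tied to user roles and responsibilities, reduces to a small lookup in its simplest form. The role and permission names below are hypothetical; the real application ties roles to NAMS-managed identities and two-factor authentication.

    ```python
    # Hypothetical role table; a real deployment would load this from the
    # application's governance configuration rather than hard-code it.
    ROLE_PERMISSIONS = {
        "viewer":   {"read"},
        "editor":   {"read", "write"},
        "approver": {"read", "write", "approve"},
    }

    def can(user_roles, action):
        """True if any of the user's roles grants the requested action."""
        return any(action in ROLE_PERMISSIONS.get(role, set())
                   for role in user_roles)

    assert can(["editor"], "write")          # editors may modify timelines
    assert not can(["viewer"], "write")      # viewers are read-only
    ```
    
    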

  19. The Application of Solution-Focused Work in Employment Counseling

    ERIC Educational Resources Information Center

    Bezanson, Birdie J.

    2004-01-01

    The author explores the applicability of a solution-focused therapy (SFT) model as a comprehensive approach to employment counseling. SFT focuses the client on developing a vision of a preferred future and assumes that the client has the talents and resources that can be accessed in the employment counseling process. The solution-focused counselor…

  20. 36 CFR 1194.21 - Software applications and operating systems.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... accessibility features where the application programming interface for those accessibility features has been...-defined on-screen indication of the current focus shall be provided that moves among interactive interface... technology can track focus and focus changes. (d) Sufficient information about a user interface...

  1. 36 CFR 1194.21 - Software applications and operating systems.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... accessibility features where the application programming interface for those accessibility features has been...-defined on-screen indication of the current focus shall be provided that moves among interactive interface... technology can track focus and focus changes. (d) Sufficient information about a user interface...

  2. Application of Plagiarism Screening Software in the Chemical Engineering Curriculum

    ERIC Educational Resources Information Center

    Cooper, Matthew E.; Bullard, Lisa G.

    2014-01-01

    Plagiarism is an area of increasing concern for written ChE assignments, such as laboratory and design reports, due to ease of access to text and other materials via the internet. This study examines the application of plagiarism screening software to four courses in a university chemical engineering curriculum. The effectiveness of plagiarism…

  3. Software Applications Course as an Early Indicator of Academic Performance

    ERIC Educational Resources Information Center

    Benham, Harry C.; Bielinska-Kwapisz, Agnieszka; Brown, F. William

    2013-01-01

    This study's objective is to determine if students who were unable to successfully complete a required sophomore level business software applications course encountered unique academic difficulties in that course, or if their difficulty signaled more general academic achievement problems in business. The study points to the importance of including…

  4. Final Report. Center for Scalable Application Development Software

    SciTech Connect

    Mellor-Crummey, John

    2014-10-26

    The Center for Scalable Application Development Software (CScADS) was established as a partnership between Rice University, Argonne National Laboratory, University of California Berkeley, University of Tennessee – Knoxville, and University of Wisconsin – Madison. CScADS pursued an integrated set of activities with the aim of increasing the productivity of DOE computational scientists by catalyzing the development of systems software, libraries, compilers, and tools for leadership computing platforms. Principal Center activities were workshops to engage the research community in the challenges of leadership computing, research and development of open-source software, and work with computational scientists to help them develop codes for leadership computing platforms. This final report summarizes CScADS activities at Rice University in these areas.

  5. The application of image processing software: Photoshop in environmental design

    NASA Astrophysics Data System (ADS)

    Dong, Baohua; Zhang, Chunmi; Zhuo, Chen

    2011-02-01

    In the process of environmental design and creation, the design sketch holds a very important position in that it not only illuminates the design's idea and concept but also shows the design's visual effects to the client. In the field of environmental design, computer-aided design has made significant improvements. Many types of specialized design software for environmental rendering and post-production artistic processing have been implemented. With this software, working efficiency has greatly increased and drawings have become more specific and more specialized. By analyzing the application of the Photoshop image processing software in environmental design, and by comparing and contrasting traditional hand drawing with drawing using modern technology, this essay explores how computer technology can play a bigger role in environmental design.

  6. Software.

    ERIC Educational Resources Information Center

    Journal of Chemical Education, 1989

    1989-01-01

    Presented are reviews of two computer software packages for Apple II computers; "Organic Spectroscopy," and "Videodisc Display Program" for use with "The Periodic Table Videodisc." A sample spectrograph from "Organic Spectroscopy" is included. (CW)

  7. Software radio technology and applications to law enforcement

    NASA Astrophysics Data System (ADS)

    Mitola, Joseph, III

    1997-02-01

Law enforcement use of radio includes the rapid creation of networks for the dozens of law enforcement organizations who come together in situations as diverse as the TWA 800 disaster in New York or the SunFest celebration in Palm Beach. The software radio is a proven technology for rapidly building such interoperable networks, including seamless bridging across sub-networks of different frequency bands, channel modulations and information formats. In addition, law enforcement must manage the costs of related radio base station infrastructure, mobile units and handsets. The software radio is a collection of engineering techniques for creating radio infrastructure that can be programmed for new standards and that can be dynamically updated with new software personalities even 'over the air,' reducing the need to purchase new hardware to remain current with emerging radio interface standards. Although relatively expensive today, continuing DoD, federal and commercial investment in software radio technology will bring products within the reach of law enforcement applications within the next few years. The Modular Multifunction Information Transfer Systems (MMITS) Forum provides further impetus for cost reductions through the market efficiencies of open architecture. This article summarizes software radio technology and key trends in the marketplace including the progress of the MMITS Forum. Expanded law enforcement participation in this forum would accelerate the availability of low cost products for law enforcement.

  8. IFDOTMETER: A New Software Application for Automated Immunofluorescence Analysis.

    PubMed

    Rodríguez-Arribas, Mario; Pizarro-Estrella, Elisa; Gómez-Sánchez, Rubén; Yakhine-Diop, S M S; Gragera-Hidalgo, Antonio; Cristo, Alejandro; Bravo-San Pedro, Jose M; González-Polo, Rosa A; Fuentes, José M

    2016-04-01

    Most laboratories interested in autophagy use different imaging software for managing and analyzing heterogeneous parameters in immunofluorescence experiments (e.g., LC3-puncta quantification and determination of the number and size of lysosomes). One solution would be software that works on a user's laptop or workstation that can access all image settings and provide quick and easy-to-use analysis of data. Thus, we have designed and implemented an application called IFDOTMETER, which can run on all major operating systems because it has been programmed using JAVA (Sun Microsystems). Briefly, IFDOTMETER software has been created to quantify a variety of biological hallmarks, including mitochondrial morphology and nuclear condensation. The program interface is intuitive and user-friendly, making it useful for users not familiar with computer handling. By setting previously defined parameters, the software can automatically analyze a large number of images without the supervision of the researcher. Once analysis is complete, the results are stored in a spreadsheet. Using software for high-throughput cell image analysis offers researchers the possibility of performing comprehensive and precise analysis of a high number of images in an automated manner, making this routine task easier.

  9. Methods, Software and Tools for Three Numerical Applications. Final report

    SciTech Connect

    E. R. Jessup

    2000-03-01

This is a report of the results of the authors' work supported by DOE contract DE-FG03-97ER25325. They proposed to study three numerical problems. They are: (1) the extension of the PMESC parallel programming library; (2) the development of algorithms and software for certain generalized eigenvalue and singular value (SVD) problems, and (3) the application of techniques of linear algebra to an information retrieval technique known as latent semantic indexing (LSI).

  10. Improving ICT Governance by Reorganizing Operation of ICT and Software Applications: The First Step to Outsource

    NASA Astrophysics Data System (ADS)

    Johansson, Björn

    During recent years great attention has been paid to outsourcing as well as to the reverse, insourcing (Dibbern et al., 2004). There has been a strong focus on how the management of software applications and information and communication technology (ICT), expressed as ICT management versus ICT governance, should be carried out (Grembergen, 2004). The maintenance and operation of software applications and ICT use a lot of the resources spent on ICT in organizations today (Bearingpoint, 2004), and managers are asked to increase the business benefits of these investments (Weill & Ross, 2004). That is, they are asked to improve the usage of ICT and to develop new business critical solutions supported by ICT. It also means that investments in ICT and software applications need to be shown to be worthwhile. Basically there are two considerations to take into account with ICT usage: cost reduction and improving business value. How the governance and management of ICT and software applications are organized is important. This means that the improvement of the control of maintenance and operation may be of interest to executives of organizations. It can be stated that usage is dependent on how it is organized. So, if an increase of ICT governance is the same as having well-organized ICT resources, could this be seen as the first step in organizations striving for external provision of ICT? This question is dealt with to some degree in this paper.

  11. Software Transition Project Retrospectives and the Application of SEL Effort Estimation Model and Boehm's COCOMO to Complex Software Transition Projects

    NASA Technical Reports Server (NTRS)

    McNeill, Justin

    1995-01-01

The Multimission Image Processing Subsystem (MIPS) at the Jet Propulsion Laboratory (JPL) has managed transitions of application software sets from one operating system and hardware platform to multiple operating systems and hardware platforms. As a part of these transitions, cost estimates were generated from the personal experience of in-house developers and managers to calculate the total effort required for such projects. Productivity measures have been collected for two such transitions, one very large and the other relatively small in terms of source lines of code. These estimates used a cost estimation model similar to the Software Engineering Laboratory (SEL) Effort Estimation Model. Experience in transitioning software within JPL MIPS has uncovered a high incidence of interface complexity. Interfaces, both internal and external to individual software applications, have contributed to software transition project complexity, and thus to scheduling difficulties and larger than anticipated design work on software to be ported.

  12. Open source software engineering for geoscientific modeling applications

    NASA Astrophysics Data System (ADS)

    Bilke, L.; Rink, K.; Fischer, T.; Kolditz, O.

    2012-12-01

    OpenGeoSys (OGS) is a scientific open source project for numerical simulation of thermo-hydro-mechanical-chemical (THMC) processes in porous and fractured media. The OGS software development community is distributed all over the world and people with different backgrounds are contributing code to a complex software system. The following points have to be addressed for successful software development: - Platform independent code - A unified build system - A version control system - A collaborative project web site - Continuous builds and testing - Providing binaries and documentation for end users OGS should run on a PC as well as on a computing cluster regardless of the operating system. Therefore the code should not include any platform specific feature or library. Instead open source and platform independent libraries like Qt for the graphical user interface or VTK for visualization algorithms are used. A source code management and version control system is a definite requirement for distributed software development. For this purpose Git is used, which enables developers to work on separate versions (branches) of the software and to merge those versions at some point to the official one. The version control system is integrated into an information and collaboration website based on a wiki system. The wiki is used for collecting information such as tutorials, application examples and case studies. Discussions take place in the OGS mailing list. To improve code stability and to verify code correctness a continuous build and testing system, based on the Jenkins Continuous Integration Server, has been established. 
This server is connected to the version control system and does the following on every code change: - Compiles (builds) the code on every supported platform (Linux, Windows, MacOS) - Runs a comprehensive test suite of over 120 benchmarks and verifies the results - Runs software-development-related metrics on the code (such as compiler warnings and code complexity).
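The benchmark-verification step described above can be sketched as a small harness that compares each benchmark's computed values against stored reference values within a tolerance (a minimal illustration; the benchmark names and reference data below are hypothetical, not from OGS):

```python
import math

def verify_benchmark(results, reference, rel_tol=1e-6):
    """Compare computed benchmark values against stored reference values.

    Returns a list of (name, computed, expected) tuples for every benchmark
    that is missing or outside the relative tolerance.
    """
    failures = []
    for name, expected in reference.items():
        computed = results.get(name)
        if computed is None or not math.isclose(computed, expected, rel_tol=rel_tol):
            failures.append((name, computed, expected))
    return failures

# Hypothetical benchmark outputs and stored reference data
reference = {"pressure_node_42": 1.0132e5, "temperature_node_7": 293.15}
results = {"pressure_node_42": 1.01320000001e5, "temperature_node_7": 293.15}

print(verify_benchmark(results, reference))  # prints [] (all benchmarks pass)
```

A CI job would fail the build whenever the returned failure list is non-empty.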

  13. Software architecture for time-constrained machine vision applications

    NASA Astrophysics Data System (ADS)

    Usamentiaga, Rubén; Molleda, Julio; García, Daniel F.; Bulnes, Francisco G.

    2013-01-01

    Real-time image and video processing applications require skilled architects, and recent trends in the hardware platform make the design and implementation of these applications increasingly complex. Many frameworks and libraries have been proposed or commercialized to simplify the design and tuning of real-time image processing applications. However, they tend to lack flexibility, because they are normally oriented toward particular types of applications, or they impose specific data processing models such as the pipeline. Other issues include large memory footprints, difficulty for reuse, and inefficient execution on multicore processors. We present a novel software architecture for time-constrained machine vision applications that addresses these issues. The architecture is divided into three layers. The platform abstraction layer provides a high-level application programming interface for the rest of the architecture. The messaging layer provides a message-passing interface based on a dynamic publish/subscribe pattern. A topic-based filtering in which messages are published to topics is used to route the messages from the publishers to the subscribers interested in a particular type of message. The application layer provides a repository for reusable application modules designed for machine vision applications. These modules, which include acquisition, visualization, communication, user interface, and data processing, take advantage of the power of well-known libraries such as OpenCV, Intel IPP, or CUDA. Finally, the proposed architecture is applied to a real machine vision application: a jam detector for steel pickling lines.
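The topic-based publish/subscribe routing described for the messaging layer can be illustrated with a minimal broker, in which messages published to a topic reach only the subscribers registered for that topic (a generic sketch, not the article's actual API; the topic names are hypothetical):

```python
from collections import defaultdict

class MessageBus:
    """Minimal topic-based publish/subscribe broker."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        """Register a callback to receive messages published to `topic`."""
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        """Route the message only to callbacks registered for this topic."""
        for callback in self._subscribers[topic]:
            callback(message)

bus = MessageBus()
received = []
bus.subscribe("frames/acquired", received.append)
bus.publish("frames/acquired", {"frame_id": 1})
bus.publish("jams/detected", {"line": "pickling"})  # no subscriber: dropped
print(received)  # [{'frame_id': 1}]
```

In the architecture described, acquisition, visualization, and processing modules would each publish to and subscribe from such a bus rather than calling one another directly.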

  14. WinTRAX: A raytracing software package for the design of multipole focusing systems

    NASA Astrophysics Data System (ADS)

    Grime, G. W.

    2013-07-01

    The software package TRAX was a simulation tool for modelling the path of charged particles through linear cylindrical multipole fields described by analytical expressions and was a development of the earlier OXRAY program (Grime and Watt, 1983; Grime et al., 1982) [1,2]. In a 2005 comparison of raytracing software packages (Incerti et al., 2005) [3], TRAX/OXRAY was compared with Geant4 and Zgoubi and was found to give close agreement with the more modern codes. TRAX was a text-based program which was only available for operation in a now rare VMS workstation environment, so a new program, WinTRAX, has been developed for the Windows operating system. This implements the same basic computing strategy as TRAX, and key sections of the code are direct translations from FORTRAN to C++, but the Windows environment is exploited to make an intuitive graphical user interface which simplifies and enhances many operations including system definition and storage, optimisation, beam simulation (including with misaligned elements) and aberration coefficient determination. This paper describes the program and presents comparisons with other software and real installations.

  15. Software Receiver Processing for Deep Space Telemetry Applications

    NASA Astrophysics Data System (ADS)

    Lay, N.; Lyubarev, M.; Tkacenko, A.; Srinivasan, M.; Andrews, K.; Finley, S.; Goodhart, C.; Navarro, R.

    2010-02-01

    Recently, much effort has been placed toward the development of the Reconfigurable Wideband Ground Receiver (RWGR): a variable-data-rate, reprogrammable receiver, whose technologies are intended for infusion into the Deep Space Network. A significant thrust of that effort has been focused on the development of field-programmable gate array (FPGA)-based algorithms for processing high-rate waveforms up to 640 Mbps. In this article, we describe the development of software receiver algorithms used to perform telemetry demodulation of low- to medium-data-rate signals.
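Telemetry demodulation of the kind described can be sketched, for the simplest case of coherent BPSK with a known, phase-locked carrier, as a correlate-and-threshold loop over symbol intervals (an idealized illustration, not the RWGR algorithms; all signal parameters below are hypothetical):

```python
import math

def bpsk_demodulate(samples, carrier_freq, sample_rate, symbol_rate):
    """Recover bits from a BPSK waveform by correlating each symbol
    interval with the (assumed phase-locked) carrier and thresholding."""
    samples_per_symbol = int(sample_rate / symbol_rate)
    bits = []
    for start in range(0, len(samples) - samples_per_symbol + 1, samples_per_symbol):
        corr = sum(
            samples[start + n]
            * math.cos(2 * math.pi * carrier_freq * (start + n) / sample_rate)
            for n in range(samples_per_symbol)
        )
        bits.append(1 if corr > 0 else 0)
    return bits

# Synthesize a clean test waveform carrying the bits 1, 0, 1
fs, fc, rs = 8000, 1000, 250       # sample rate, carrier, symbol rate (Hz)
tx_bits = [1, 0, 1]
sps = fs // rs
wave = [math.cos(2 * math.pi * fc * n / fs + (0 if tx_bits[n // sps] else math.pi))
        for n in range(sps * len(tx_bits))]
print(bpsk_demodulate(wave, fc, fs, rs))  # [1, 0, 1]
```

A real receiver additionally needs carrier and symbol-timing recovery, which is where much of the FPGA and software effort described in the article goes.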

  16. Computational protein design: the Proteus software and selected applications.

    PubMed

    Simonson, Thomas; Gaillard, Thomas; Mignon, David; Schmidt am Busch, Marcel; Lopes, Anne; Amara, Najette; Polydorides, Savvas; Sedano, Audrey; Druart, Karen; Archontis, Georgios

    2013-10-30

We describe an automated procedure for protein design, implemented in a flexible software package, called Proteus. System setup and calculation of an energy matrix are done with the XPLOR modeling program and its sophisticated command language, supporting several force fields and solvent models. A second program provides algorithms to search sequence space. It allows a decomposition of the system into groups, which can be combined in different ways in the energy function, for both positive and negative design. The whole procedure can be controlled by editing 2-4 scripts. Two applications consider the tyrosyl-tRNA synthetase enzyme and its successful redesign to bind both O-methyl-tyrosine and D-tyrosine. For the latter, we present Monte Carlo simulations where the D-tyrosine concentration is gradually increased, displacing L-tyrosine from the binding pocket and yielding the binding free energy difference, in good agreement with experiment. Complete redesign of the Crk SH3 domain is presented. The top 10000 sequences are all assigned to the correct fold by the SUPERFAMILY library of Hidden Markov Models. Finally, we report the acid/base behavior of the SNase protein. Sidechain protonation is treated as a form of mutation; it is then straightforward to perform constant-pH Monte Carlo simulations, which yield good agreement with experiment. Overall, the software can be used for a wide range of applications, producing not only native-like sequences but also thermodynamic properties with errors that appear comparable to other current software packages.
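The constant-pH Monte Carlo idea — treating protonation as a mutation sampled by Metropolis moves — can be illustrated on a single titratable site, where the sampled protonated fraction should reproduce the Henderson-Hasselbalch curve (a toy sketch under simplifying assumptions, not the Proteus implementation):

```python
import math
import random

def protonation_fraction(pka, ph, steps=20000, seed=42):
    """Metropolis Monte Carlo on a single titratable site.

    State 1 = protonated, 0 = deprotonated; the free-energy cost of
    protonation at a given pH is ln(10) * (pH - pKa) in units of kT.
    """
    rng = random.Random(seed)
    delta_e = math.log(10) * (ph - pka)  # energy of protonated state
    state, protonated_count = 1, 0
    for _ in range(steps):
        trial = 1 - state
        d = delta_e if trial == 1 else -delta_e
        # Metropolis criterion: always accept downhill, else with exp(-dE)
        if d <= 0 or rng.random() < math.exp(-d):
            state = trial
        protonated_count += state
    return protonated_count / steps

# At pH == pKa the site should be protonated about half the time,
# matching the Henderson-Hasselbalch prediction 1 / (1 + 10**(pH - pKa)).
print(protonation_fraction(pka=4.0, ph=4.0))
```

Scaling this idea to a protein means coupling many such sites to sidechain rotamer moves under a full energy matrix, which is what the package described performs.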

  17. Intracranial Applications of MR Imaging-Guided Focused Ultrasound.

    PubMed

    Khanna, N; Gandhi, D; Steven, A; Frenkel, V; Melhem, E R

    2017-03-01

Initially used in the treatment of prostate cancer and uterine fibroids, the role of focused ultrasound has expanded as transcranial acoustic wave distortion and other limitations have been overcome. Its utility relies on focal energy deposition via acoustic wave propagation. The duty cycle and intensity of focused ultrasound influence the rate of energy deposition and result in unique physiologic and biomechanical effects. Thermal ablation via high-intensity continuous exposure generates coagulative necrosis of tissues. High-intensity, pulsed application reduces temporally averaged energy deposition, resulting in mechanical effects, including reversible, localized BBB disruption, which enhances neurotherapeutic agent delivery. While the precise mechanisms remain unclear, low-intensity, pulsed exposures can influence neuronal activity with preservation of cytoarchitecture. Its noninvasive, high-resolution, radiation-free nature allows focused ultrasound to compare favorably with other modalities. We discuss the physical characteristics of focused ultrasound devices, the biophysical mechanisms at the tissue level, and current and emerging applications.

  18. Application and systems software in Ada: Development experiences

    NASA Technical Reports Server (NTRS)

    Kuschill, Jim

    1986-01-01

    In its most basic sense software development involves describing the tasks to be solved, including the given objects and the operations to be performed on those objects. Unfortunately, the way people describe objects and operations usually bears little resemblance to source code in most contemporary computer languages. There are two ways around this problem. One is to allow users to describe what they want the computer to do in everyday, typically imprecise English. The PRODOC methodology and software development environment is based on a second more flexible and possibly even easier to use approach. Rather than hiding program structure, PRODOC represents such structure graphically using visual programming techniques. In addition, the program terminology used in PRODOC may be customized so as to match the way human experts in any given application area naturally describe the relevant data and operations. The PRODOC methodology is described in detail.

  19. Evaluation of the Trajectory Operations Applications Software Task (TOAST)

    NASA Technical Reports Server (NTRS)

    Perkins, Sharon; Martin, Andrea; Bavinger, Bill

    1990-01-01

The Trajectory Operations Applications Software Task (TOAST) is a software development project under the auspices of the Mission Operations Directorate. Its purpose is to provide trajectory operation pre-mission and real-time support for the Space Shuttle program. As an Application Manager, TOAST provides an isolation layer between the underlying Unix operating system and the series of user programs. It provides two main services: a common interface to operating system functions with semantics appropriate for C or FORTRAN, and a structured input and output package that can be utilized by user application programs. In order to evaluate TOAST as an Application Manager, the task was to assess current and planned capabilities, compare those capabilities to functions available in commercially-available off-the-shelf (COTS) software, and survey Flight Analysis Design System (FADS) users regarding TOAST implementation. As a result of the investigation, it was found that the current version of TOAST is well implemented and meets the needs of the real-time users. The plans for migrating TOAST to the X Window System are essentially sound; the Executive will port with minor changes, while Menu Handler will require a total rewrite. A series of recommendations for future TOAST directions are included.

  20. Software tools for developing parallel applications. Part 1: Code development and debugging

    SciTech Connect

    Brown, J.; Geist, A.; Pancake, C.; Rover, D.

    1997-04-01

    Developing an application for parallel computers can be a lengthy and frustrating process making it a perfect candidate for software tool support. Yet application programmers are often the last to hear about new tools emerging from R and D efforts. This paper provides an overview of two focuses of tool support: code development and debugging. Each is discussed in terms of the programmer needs addressed, the extent to which representative current tools meet those needs, and what new levels of tool support are important if parallel computing is to become more widespread.

  1. Adaptive Signal Processing Testbed application software: User's manual

    NASA Astrophysics Data System (ADS)

    Parliament, Hugh A.

    1992-05-01

    The Adaptive Signal Processing Testbed (ASPT) application software is a set of programs that provide general data acquisition and minimal processing functions on live digital data. The data are obtained from a digital input interface whose data source is the DAR4000 digital quadrature receiver that receives a phase shift keying signal at 21.4 MHz intermediate frequency. The data acquisition software is used to acquire raw unprocessed data from the DAR4000 and store it on disk in the Sun workstation based ASPT. File processing utilities are available to convert the stored files for analysis. The data evaluation software is used for the following functions: acquisition of data from the DAR4000, conversion to IEEE format, and storage to disk; acquisition of data from the DAR4000, power spectrum estimation, and on-line plotting on the graphics screen; and processing of disk file data, power spectrum estimation, and display and/or storage to disk in the new format. A user's guide is provided that describes the acquisition and evaluation programs along with how to acquire, evaluate, and use the data.
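The power spectrum estimation step mentioned above can be sketched as a basic unwindowed periodogram computed from the discrete Fourier transform (an illustrative implementation, not the ASPT code; a production version would use an FFT and windowing):

```python
import cmath
import math

def periodogram(samples):
    """Power spectrum estimate via the discrete Fourier transform.

    Returns |X[k]|**2 / N for k = 0..N-1: a basic, unwindowed periodogram.
    """
    n = len(samples)
    spectrum = []
    for k in range(n):
        x_k = sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                  for t in range(n))
        spectrum.append(abs(x_k) ** 2 / n)
    return spectrum

# A pure tone exactly on DFT bin 4 concentrates all power in that bin
# (and its mirror image at N - 4).
n = 32
tone = [math.cos(2 * math.pi * 4 * t / n) for t in range(n)]
psd = periodogram(tone)
print(round(psd[4], 2))  # 8.0
```

For the on-line plotting described, each acquired block would be passed through such an estimator before being drawn to the graphics screen.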

  2. Supporting SBML as a model exchange format in software applications.

    PubMed

    Keating, Sarah M; Le Novère, Nicolas

    2013-01-01

This chapter describes the Systems Biology Markup Language (SBML) from its origins. It describes the rationale behind and importance of having a common language when it comes to representing models. This chapter mentions the development of SBML and outlines the structure of an SBML model. It provides a section on libSBML, a useful application programming interface (API) library for reading, writing, manipulating and validating content expressed in the SBML format. Finally, the chapter also provides a description of the SBML Toolbox which provides a means of facilitating the import and export of SBML from both MATLAB and Octave (http://www.gnu.org/software/octave/) environments.
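While libSBML is the chapter's recommended route for full reading, writing, and validation, the basic structure of an SBML model — a namespaced XML document listing species, compartments, and reactions — can be illustrated with a stdlib-only parse (the model fragment below is simplified for illustration; real SBML documents carry more required attributes):

```python
import xml.etree.ElementTree as ET

# A minimal SBML Level 3 model fragment (simplified for illustration).
SBML_DOC = """<sbml xmlns="http://www.sbml.org/sbml/level3/version1/core" level="3" version="1">
  <model id="toy">
    <listOfSpecies>
      <species id="glucose" compartment="cytosol"/>
      <species id="atp" compartment="cytosol"/>
    </listOfSpecies>
  </model>
</sbml>"""

def list_species(sbml_text):
    """Return the species ids declared in an SBML document."""
    ns = {"sbml": "http://www.sbml.org/sbml/level3/version1/core"}
    root = ET.fromstring(sbml_text)
    return [s.get("id") for s in root.findall(".//sbml:species", ns)]

print(list_species(SBML_DOC))  # ['glucose', 'atp']
```

libSBML exposes the same information through typed accessors (and adds schema validation), which is why tools exchanging models are encouraged to use it rather than ad hoc XML handling.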

  3. APPLICATION OF SOFTWARE QUALITY ASSURANCE CONCEPTS AND PROCEDURES TO ENVIRONMENTAL RESEARCH INVOLVING SOFTWARE DEVELOPMENT

    EPA Science Inventory

    As EPA’s environmental research expands into new areas that involve the development of software, quality assurance concepts and procedures that were originally developed for environmental data collection may not be appropriate. Fortunately, software quality assurance is a ...

  4. Bibliographic Management Software: A Focus Group Study of the Preferences and Practices of Undergraduate Students

    ERIC Educational Resources Information Center

    Salem, Jamie; Fehrmann, Paul

    2013-01-01

    With the growing population of undergraduate students on our campus and an increased focus on their success, librarians at a large midwestern university were interested in the citation management styles of this university cohort. Our university library spends considerable resources each year to maintain and promote access to the robust…

  5. Application of Zernike polynomials towards accelerated adaptive focusing of transcranial high intensity focused ultrasound

    PubMed Central

    Kaye, Elena A.; Hertzberg, Yoni; Marx, Michael; Werner, Beat; Navon, Gil; Levoy, Marc; Pauly, Kim Butts

    2012-01-01

initial estimates based on using the average of the phase aberration data from the individual subgroups of subjects was shown to increase the intensity at the focal spot for the five subjects. Conclusions: The application of ZPs to phase aberration correction was shown to be beneficial for adaptive focusing of transcranial ultrasound. The skull-based phase aberrations were found to be well approximated by the number of ZP modes representing only a fraction of the number of elements in the hemispherical transducer. Implementing the initial phase aberration estimate together with the Zernike-based algorithm can improve the robustness and can potentially greatly increase the viability of MR-ARFI-based focusing for a clinical transcranial MRgFUS therapy. PMID:23039661
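The Zernike polynomial modes used in such an aberration expansion are built from a standard radial polynomial R_n^m(ρ); a direct implementation of the textbook formula (a generic sketch, not the authors' code) is:

```python
from math import factorial

def zernike_radial(n, m, rho):
    """Radial part R_n^m(rho) of the Zernike polynomial, 0 <= rho <= 1.

    Vanishes when n - |m| is odd; otherwise the standard finite sum
    over k = 0 .. (n - |m|)/2 of signed factorial coefficients.
    """
    m = abs(m)
    if (n - m) % 2:
        return 0.0
    total = 0.0
    for k in range((n - m) // 2 + 1):
        coeff = ((-1) ** k * factorial(n - k)
                 / (factorial(k)
                    * factorial((n + m) // 2 - k)
                    * factorial((n - m) // 2 - k)))
        total += coeff * rho ** (n - 2 * k)
    return total

# Defocus mode: R_2^0(rho) = 2*rho**2 - 1
print(zernike_radial(2, 0, 1.0))  # 1.0
```

The full mode multiplies this radial part by cos(mθ) or sin(mθ) over the transducer aperture; fitting phase data with a modest number of such modes is what reduces the dimensionality relative to the per-element corrections.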

  6. The Application of Software Safety to the Constellation Program Launch Control System

    NASA Technical Reports Server (NTRS)

    Kania, James; Hill, Janice

    2011-01-01

The application of software safety practices on the LCS project resulted in the successful implementation of the NASA Software Safety Standard NASA-STD-8719.13B and CxP software safety requirements. The GOP-GEN-GSW-011 Hazard Report was the first report developed at KSC to identify software hazard causes and their controls. This approach can be applied to similar large software-intensive systems where loss of control can lead to a hazard.

  7. Dynamically focused optical coherence tomography for endoscopic applications

    NASA Astrophysics Data System (ADS)

    Divetia, Asheesh; Hsieh, Tsung-Hsi; Zhang, Jun; Chen, Zhongping; Bachman, Mark; Li, Guann-Pyng

    2005-03-01

    We report a demonstration of a small liquid-filled polymer lens that may be used to dynamically provide scanning depth focus for endoscopic optical coherence tomography (OCT) applications. The focal depth of the lens is controlled by changing the hydraulic pressure within the lens, enabling dynamic focal depth control without the need for articulated parts. The 1 mm diameter lens is shown to have resolving power of 5 μm, and can enable depth scans of 2.5 mm, making it suitable for use with OCT-enabled optical biopsy applications.

  8. Focused microwave-assisted Soxhlet extraction: devices and applications.

    PubMed

    Luque-García, J L; Luque de Castro, M D

    2004-10-20

An overview of a new extraction technique called focused microwave-assisted Soxhlet extraction (FMASE) is presented here. This technique is based on the same principles as conventional Soxhlet extraction but using microwaves as auxiliary energy to accelerate the process. The different devices designed and constructed so far, their advantages and limitations as well as their main applications on environmental and food analysis are discussed in this article.

  9. Optimizing Focusing X-Ray Optics for Planetary Science Applications

    NASA Astrophysics Data System (ADS)

    Melso, Nicole; Romaine, Suzanne; Hong, Jaesub; Cotroneo, Vincenzo

    2015-01-01

X-Ray observations are a valuable tool for studying the composition, formation and evolution of the numerous X-Ray emitting objects in our Solar System. Although there are plenty of useful applications for in situ X-Ray focusing instrumentation, X-Ray focusing optics have never been feasible for use onboard planetary missions due to their mass and cost. Recent advancements in small-scale X-Ray instrumentation have made focusing X-Ray technology more practical and affordable for use onboard in situ spacecraft. Specifically, the technology of a metal-ceramic hybrid material combined with Electroformed Nickel Replication (ENR) holds great promise for realizing lightweight X-ray optics. We are working to optimize these lightweight focusing X-Ray optics for use in planetary science applications. We have explored multiple configurations and geometries that maximize the telescope's effective area and field of view while meeting practical mass and volume requirements. Each configuration was modeled via analytic calculations and Monte Carlo ray tracing simulations and compared to alternative Micro-pore Optics designs. The improved performance of our approach using hybrid materials has many exciting implications for the future of planetary science, X-Ray instrumentation, and the exploration of X-Ray sources in our Solar System. This work was supported in part by the NSF REU and DoD ASSURE programs under NSF grant no. 1262851 and by the Smithsonian Institution.

  10. Clinical applications of high-intensity focused ultrasound.

    PubMed

    She, W H; Cheung, T T; Jenkins, C R; Irwin, M G

    2016-08-01

Ultrasound has been developed for therapeutic use in addition to its diagnostic ability. The use of focused ultrasound energy can offer a non-invasive method for tissue ablation, and can therefore be used to treat various solid tumours. High-intensity focused ultrasound is being increasingly used in the treatment of both primary and metastatic tumours as these can be precisely located for ablation. It has been shown to be particularly useful in the treatment of uterine fibroids, and various solid tumours including those of the pancreas and liver. High-intensity focused ultrasound is a valid treatment option for liver tumours in patients with significant medical co-morbidity who are at high risk for surgery or who have relatively poor liver function that may preclude hepatectomy. It has also been used as a form of bridging therapy for patients awaiting cadaveric donor liver transplantation. In this article, we outline the principles of high-intensity focused ultrasound and its clinical applications, including the management protocol development in the treatment of hepatocellular carcinoma in Hong Kong by performing a search on MEDLINE (OVID), EMBASE, and PubMed. The search of these databases ranged from the date of their establishment until December 2015. The search terms used were: high-intensity focused ultrasound, ultrasound, magnetic resonance imaging, liver tumour, hepatocellular carcinoma, pancreas, renal cell carcinoma, prostate cancer, breast cancer, fibroids, bone tumour, atrial fibrillation, glaucoma, Parkinson's disease, essential tremor, and neuropathic pain.

  11. Optimizing Flight Control Software With an Application Platform

    NASA Technical Reports Server (NTRS)

    Smith, Irene Skupniewicz; Shi, Nija; Webster, Christopher

    2012-01-01

Flight controllers in NASA's mission control centers work day and night to ensure that missions succeed and crews are safe. The IT goals of NASA mission control centers are similar to those of most businesses: to evolve IT infrastructure from basic to dynamic. This paper describes Mission Control Technologies (MCT), an application platform that is powering mission control today and is designed to meet the needs of future NASA control centers. MCT is an extensible platform that provides GUI components and a runtime environment. The platform enables NASA's IT goals through its use of lightweight interfaces and configurable components, which promote standardization and incorporate useful solution patterns. The MCT architecture positions mission control centers to reach the goal of dynamic IT, leading to lower cost of ownership, and treating software as a strategic investment.

  12. Practical Application of Model Checking in Software Verification

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Skakkebaek, Jens Ulrik

    1999-01-01

This paper presents our experiences in applying JAVA PATHFINDER (J(sub PF)), a recently developed JAVA to SPIN translator, in the finding of synchronization bugs in a Chinese Chess game server application written in JAVA. We give an overview of J(sub PF) and the subset of JAVA that it supports and describe the abstraction and verification of the game server. Finally, we analyze the results of the effort. We argue that abstraction by under-approximation is necessary for obtaining sufficiently small models for verification purposes; that user guidance is crucial for effective abstraction; and that current model checkers do not conveniently support the computational models of software in general and JAVA in particular.

  13. Process Orchestration With Modular Software Applications On Intelligent Field Devices

    NASA Astrophysics Data System (ADS)

    Orfgen, Marius; Schmitt, Mathias

    2015-07-01

    The method developed by the DFKI-IFS for extending the functionality of intelligent field devices through the use of reloadable software applications (so-called Apps) is to be further augmented with a methodology and communication concept for process orchestration. The concept allows individual Apps from different manufacturers to decentrally share information. This way of communicating forms the basis for the dynamic orchestration of Apps to complete processes, in that it allows the actions of one App (e.g. detecting a component part with a sensor App) to trigger reactions in other Apps (e.g. triggering the processing of that component part). A holistic methodology and its implementation as a configuration tool allows one to model the information flow between Apps, as well as automatically introduce it into physical production hardware via available interfaces provided by the Field Device Middleware. Consequently, configuring industrial facilities is made simpler, resulting in shorter changeover and shutdown times.

  14. Operational excellence (six sigma) philosophy: Application to software quality assurance

    SciTech Connect

    Lackner, M.

    1997-11-01

    This report contains viewgraphs on operational excellence philosophy of six sigma applied to software quality assurance. This report outlines the following: goal of six sigma; six sigma tools; manufacturing vs administrative processes; Software quality assurance document inspections; map software quality assurance requirements document; failure mode effects analysis for requirements document; measuring the right response variables; and questions.

  15. High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    SciTech Connect

    Habib, Salman; Roser, Robert

    2015-10-28

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.

  16. Migrating data from TcSE to DOORS : an evaluation of the T-Plan Integrator software application.

    SciTech Connect

    Post, Debra S.; Manzanares, David A.; Taylor, Jeffrey L.

    2011-02-01

    This report describes our evaluation of the T-Plan Integrator software application as it was used to transfer a real data set from the Teamcenter for Systems Engineering (TcSE) software application to the DOORS software application. The T-Plan Integrator was evaluated to determine whether it would meet the needs of Sandia National Laboratories to migrate our existing data sets from TcSE to DOORS. This report presents the challenges of migrating data and focuses on how the Integrator can be used to map a data set and its data architecture from TcSE to DOORS. Finally, this report describes how the bulk of the migration can take place using the Integrator; however, about 20-30% of the data would need to be transferred from TcSE to DOORS manually. This report does not evaluate the transfer of data from DOORS to TcSE.
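    The mapping step at the heart of such a migration can be sketched as a simple attribute-map transform: attributes with a known mapping move automatically, and the remainder are flagged for manual transfer, as in the 20-30% case above. All field names here are invented for illustration and are not taken from TcSE or DOORS.

    ```python
    # Hypothetical sketch of a TcSE-to-DOORS style attribute mapping.
    FIELD_MAP = {  # TcSE attribute -> DOORS attribute (illustrative only)
        "req_id": "Object Identifier",
        "req_text": "Object Text",
        "priority": "Priority",
    }

    def migrate_record(tcse_record: dict) -> tuple[dict, dict]:
        """Split a record into auto-migrated and manual-transfer attributes."""
        migrated, manual = {}, {}
        for field, value in tcse_record.items():
            if field in FIELD_MAP:
                migrated[FIELD_MAP[field]] = value
            else:
                manual[field] = value  # no mapping: needs hand transfer
        return migrated, manual

    record = {"req_id": "R-101", "req_text": "The valve shall close in 2 s.",
              "priority": "High", "legacy_note": "imported 2004"}
    auto, by_hand = migrate_record(record)
    print(auto)     # mapped attributes transfer automatically
    print(by_hand)  # {'legacy_note': 'imported 2004'} flagged for manual work
    ```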

  17. Update on Clinical MR-guided Focused Ultrasound Applications

    PubMed Central

    McDannold, Nathan

    2015-01-01

    SYNOPSIS Focused ultrasound (FUS) can be used to thermally ablate tissue. Performing FUS under magnetic resonance (MR) guidance enables aiming the focus at the target, accurate treatment planning, real-time temperature mapping, and evaluation of the treatment. This review updates several clinical applications of MR-guided FUS. MR-guided FUS has a CE mark and FDA approval for thermal ablation of uterine fibroids and for pain management in bone metastases. Thousands of uterine fibroid patients have been treated successfully with only minor side effects. Technical improvements, increased experience, and the use of a screening MRI examination should further improve treatment outcomes. When used for bone metastases and other bone diseases, thermal ablation leads to pain relief due to denervation and to debulking of the tumor. The use of a hemispherical multi-element transducer and phase corrections has enabled application of FUS through the skull. Transcranial MR-guided FUS has received CE certification for ablation of deep, central locations in the brain such as the thalamus. Thermal ablation of specific parts of the thalamus can relieve symptoms in neurological disorders such as essential tremor, Parkinson's disease, and neuropathic pain. No CE mark or FDA approval has yet been obtained for treatment of prostate cancer or breast cancer, but several approaches have been proposed, and clinical trials should show the potential of MR-guided FUS for these and other applications. PMID:26499282

  18. Software Portability Considerations for Multiple Applications over Multiple Sites

    PubMed Central

    Munnecke, Thomas

    1981-01-01

    There are great benefits to be obtained by distributing the cost of software development over multiple sites. Both economies and dis-economies of scale become prominent when broad-based software portability is examined carefully. However, traditional data processing techniques are oriented toward making specific software for specific users, rather than general software for a class of users. This trend toward overspecification is getting worse with traditional data processing languages, while other standard languages are confronting it.

  19. Cloud based, Open Source Software Application for Mitigating Herbicide Drift

    NASA Astrophysics Data System (ADS)

    Saraswat, D.; Scott, B.

    2014-12-01

    The spread of herbicide-resistant weeds has resulted in the need for clearly marked fields. In response to this need, the University of Arkansas Cooperative Extension Service launched a program named Flag the Technology in 2011. This program uses color-coded flags as a visual alert of the herbicide trait technology within a farm field. The flag-based program also serves to help avoid herbicide misapplication and prevent herbicide drift damage between fields with differing crop technologies. The program has been endorsed by the Southern Weed Science Society of America and is attracting interest from across the USA, Canada, and Australia. However, flags risk being misplaced through mischief or lost in severe windstorms and thunderstorms. This presentation will discuss the design and development of a cloud-based, free application built on open-source technologies, called Flag the Technology Cloud (FTTCloud), that allows agricultural stakeholders to color-code their farm fields to indicate herbicide-resistant technologies. The software uses modern web development practices, widely used design technologies, and basic geographic information system (GIS) based interactive interfaces for representing, color-coding, searching, and visualizing fields. The program has also been made compatible with devices of different sizes: smartphones, tablets, desktops, and laptops.
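    The core color-coding idea can be sketched as a lookup from a field's herbicide-trait technology to a flag color attached to the field's display properties. The trait-to-color table below follows the general spirit of the flagging scheme, but its exact entries are assumptions for illustration, not values taken from FTTCloud.

    ```python
    # Illustrative sketch: map herbicide-trait technology to a flag color.
    FLAG_COLORS = {  # assumed example entries, not the official scheme
        "conventional": "white",
        "glyphosate-tolerant": "green",
        "glufosinate-tolerant": "yellow",
    }

    def color_field(field_id: str, trait: str) -> dict:
        """Return display properties for rendering one farm field."""
        return {"field_id": field_id, "trait": trait,
                "flag_color": FLAG_COLORS.get(trait, "unknown")}

    print(color_field("NW-40", "glufosinate-tolerant"))
    ```

    In a GIS-backed application, a dict like this would become the style properties of the field's polygon feature.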

  20. Current focus of stem cell application in retinal repair

    PubMed Central

    Alonso-Alonso, María L; Srivastava, Girish K

    2015-01-01

    The relevance of retinal diseases, both in society’s economy and in the quality of people’s life who suffer with them, has made stem cell therapy an interesting topic for research. Embryonic stem cells (ESCs), induced pluripotent stem cells (iPSCs) and adipose derived mesenchymal stem cells (ADMSCs) are the focus in current endeavors as a source of different retinal cells, such as photoreceptors and retinal pigment epithelial cells. The aim is to apply them for cell replacement as an option for treating retinal diseases which so far are untreatable in their advanced stage. ESCs, despite the great potential for differentiation, have the dangerous risk of teratoma formation as well as ethical issues, which must be resolved before starting a clinical trial. iPSCs, like ESCs, are able to differentiate in to several types of retinal cells. However, the process to get them for personalized cell therapy has a high cost in terms of time and money. Researchers are working to resolve this since iPSCs seem to be a realistic option for treating retinal diseases. ADMSCs have the advantage that the procedures to obtain them are easier. Despite advancements in stem cell application, there are still several challenges that need to be overcome before transferring the research results to clinical application. This paper reviews recent research achievements of the applications of these three types of stem cells as well as clinical trials currently based on them. PMID:25914770

  1. Web Services Provide Access to SCEC Scientific Research Application Software

    NASA Astrophysics Data System (ADS)

    Gupta, N.; Gupta, V.; Okaya, D.; Kamb, L.; Maechling, P.

    2003-12-01

    Web services offer scientific communities a new paradigm for sharing research codes and communicating results. While there are formal technical definitions of what constitutes a web service, for a user community such as the Southern California Earthquake Center (SCEC), we may conceptually consider a web service to be functionality provided on-demand by an application which is run on a remote computer located elsewhere on the Internet. The value of a web service is that it can (1) run a scientific code without the user needing to install and learn the intricacies of running the code; (2) provide the technical framework which allows a user's computer to talk to the remote computer which performs the service; (3) provide the computational resources to run the code; and (4) bundle several analysis steps and provide the end results in digital or (post-processed) graphical form. Within an NSF-sponsored ITR project coordinated by SCEC, we are constructing web services using architectural protocols and programming languages (e.g., Java). However, because the SCEC community has a rich pool of scientific research software (written in traditional languages such as C and FORTRAN), we also emphasize making existing scientific codes available by constructing web service frameworks which wrap around and directly run these codes. In doing so we attempt to broaden community usage of these codes. Web service wrapping of a scientific code can be done using a "web servlet" construction or by using a SOAP/WSDL-based framework. This latter approach is widely adopted in IT circles although it is subject to rapid evolution. Our wrapping framework attempts to "honor" the original codes with as little modification as is possible. For versatility we identify three methods of user access: (A) a web-based GUI (written in HTML and/or Java applets); (B) a Linux/OSX/UNIX command line "initiator" utility (shell-scriptable); and (C) direct access from within any Java application (and with the
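    The "wrapping" approach described above can be sketched as a thin HTTP layer that runs an unmodified command-line code on demand. This is a bare-bones illustration only: the executable and its arguments are placeholders, and SCEC's actual services use servlet and SOAP/WSDL frameworks rather than this minimal server.

    ```python
    # Sketch: expose a legacy command-line code as a web service, unmodified.
    import json
    import subprocess
    from http.server import BaseHTTPRequestHandler, HTTPServer

    # Stand-in for an unmodified legacy C/FORTRAN binary and its arguments.
    LEGACY_CODE = ["echo", "synthetic seismogram"]

    class WrapHandler(BaseHTTPRequestHandler):
        """Runs the legacy code on each request and returns its output as JSON."""
        def do_GET(self):
            result = subprocess.run(LEGACY_CODE, capture_output=True, text=True)
            body = json.dumps({"stdout": result.stdout.strip()}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)

    # The wrapped code can still be invoked directly, honoring the original:
    output = subprocess.run(LEGACY_CODE, capture_output=True, text=True).stdout.strip()
    print(output)  # synthetic seismogram

    # To expose the service:
    # HTTPServer(("localhost", 8080), WrapHandler).serve_forever()
    ```

    The design choice matches the passage: the scientific code is never edited, only invoked, so the wrapper can be replaced (servlet, SOAP/WSDL) without touching the science.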

  2. Does Screencast Teaching Software Application Needs Narration for Effective Learning?

    ERIC Educational Resources Information Center

    Mohamad Ali, Ahmad Zamzuri; Samsudin, Khairulanuar; Hassan, Mohamad; Sidek, Salman Firdaus

    2011-01-01

    The aim of this study was to investigate the effects of screencast with narration and without narration in enhancing learning performance. A series of screencast teaching Flash animation software was developed using screen capture software for the purpose of this research. The screencast series were uploaded to specialized channels created in…

  3. Spectromicroscopy and coherent diffraction imaging: focus on energy materials applications.

    PubMed

    Hitchcock, Adam P; Toney, Michael F

    2014-09-01

    Current and future capabilities of X-ray spectromicroscopy are discussed based on coherence-limited imaging methods which will benefit from the dramatic increase in brightness expected from a diffraction-limited storage ring (DLSR). The methods discussed include advanced coherent diffraction techniques and nanoprobe-based real-space imaging using Fresnel zone plates or other diffractive optics whose performance is affected by the degree of coherence. The capabilities of current systems, improvements which can be expected, and some of the important scientific themes which will be impacted are described, with focus on energy materials applications. Potential performance improvements of these techniques based on anticipated DLSR performance are estimated. Several examples of energy sciences research problems which are out of reach of current instrumentation, but which might be solved with the enhanced DLSR performance, are discussed.

  4. Transfer of Learning: The Effects of Different Instruction Methods on Software Application Learning

    ERIC Educational Resources Information Center

    Larson, Mark E.

    2010-01-01

    Human Resource Departments (HRD), especially instructors, are challenged to keep pace with rapidly changing computer software applications and technology. The problem under investigation revealed after instruction of a software application if a particular method of instruction was a predictor of transfer of learning, when other risk factors were…

  5. Universally applicable three-dimensional hydrodynamic microfluidic flow focusing.

    PubMed

    Chiu, Yu-Jui; Cho, Sung Hwan; Mei, Zhe; Lien, Victor; Wu, Tsung-Feng; Lo, Yu-Hwa

    2013-05-07

    We have demonstrated a microfluidic device that can not only achieve three-dimensional flow focusing but also confine particles to the center stream along the channel. The device has a sample channel of smaller height and two sheath flow channels of greater height, merged into the downstream main channel where 3D focusing effects occur. We have demonstrated that both beads and cells in our device display significantly lower CVs (coefficients of variation) in velocity and position distributions, as well as a reduced probability of coincidental events, compared with conventional 2D-confined microfluidic channels. The improved particle confinement in the microfluidic channel is highly desirable for microfluidic flow cytometers and for fluorescence-activated cell sorting (FACS). We have also reported a novel method to measure the velocity of each individual particle in the microfluidic channel. The method is compatible with the flow cytometer setup and requires no sophisticated visualization equipment. The principles and methods of device design and characterization can be applicable to many types of microfluidic systems.

  6. Laboratory Connections. Commercial Interfacing Packages: Part II: Software and Applications.

    ERIC Educational Resources Information Center

    Powers, Michael H.

    1989-01-01

    Describes the titration of a weak base with a strong acid and subsequent treatment of experimental data using commercially available software. Provides a BASIC program for determining first and second derivatives of data input. Lists 6 references. (YP)

  7. Applications of Microcomputers in the Teaching of Physics 6502 Software.

    ERIC Educational Resources Information Center

    Marsh, David P.

    1980-01-01

    Described is a variety of uses of the microcomputer when coupled with software available for systems using 6502 microprocessors. Included are several computer programs which exhibit some of the possibilities for programing the 6502 microprocessors. (DS)

  8. A Taxonomy of Knowledge Management Software Tools: Origins and Applications.

    ERIC Educational Resources Information Center

    Tyndale, Peter

    2002-01-01

    Examines, evaluates, and organizes a wide variety of knowledge management software tools by examining the literature related to the selection and evaluation of knowledge management tools. (Author/SLD)

  9. [Genetic algorithm application to multi-focus patterns of 256-element phased array for focused ultrasound surgery].

    PubMed

    Xu, Feng; Wan, Mingxi; Lu, Mingzhu

    2008-10-01

    The genetic optimization algorithm and the sound field calculation approach for a spherical-section phased array are presented in this paper. The in-house manufactured 256-element phased array focused ultrasound surgery system is briefly described. The on-axis single focus and off-axis single focus are simulated, along with the axis-symmetric six-focus pattern and the axis-asymmetric four-focus pattern, using the 256-element phased array, the genetic optimization algorithm, and the sound field calculation approach. The experimental results of the described 256-element phased array focused ultrasound surgery system acting on organic glass and phantom are also analyzed. The results of the simulations and experiments confirm the applicability of the genetic algorithm and field calculation approaches for accurately steering single and multiple foci in three dimensions.
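    The idea of the paper can be sketched in miniature: a genetic algorithm searches the phase of each array element so that the field amplitude is high at several focal points at once. The geometry (a 1-D, 16-element line array), the fitness function, and all parameters below are simplified illustrations, not the 256-element spherical-section array or its field model.

    ```python
    # Toy genetic algorithm for multi-focus phase patterns of a line array.
    import cmath
    import math
    import random

    random.seed(1)

    N_ELEM = 16
    K = 2 * math.pi                                  # wavenumber, unit wavelength
    ELEMS = [(n * 1.0, 0.0) for n in range(N_ELEM)]  # element positions
    FOCI = [(4.0, 30.0), (11.0, 30.0)]               # two desired foci

    def amplitude(phases, focus):
        """Field magnitude at a focus: sum of element contributions."""
        fx, fy = focus
        field = sum(cmath.exp(1j * (p - K * math.hypot(ex - fx, ey - fy)))
                    for p, (ex, ey) in zip(phases, ELEMS))
        return abs(field)

    def fitness(phases):
        # Reward the weakest focus, so all foci are driven up together.
        return min(amplitude(phases, f) for f in FOCI)

    def mutate(phases, rate=0.1):
        return [p + random.gauss(0, 0.5) if random.random() < rate else p
                for p in phases]

    def crossover(a, b):
        cut = random.randrange(1, N_ELEM)
        return a[:cut] + b[cut:]

    pop = [[random.uniform(0, 2 * math.pi) for _ in range(N_ELEM)]
           for _ in range(40)]
    initial_best = max(map(fitness, pop))
    for _ in range(60):
        pop.sort(key=fitness, reverse=True)
        elite = pop[:10]  # elitism: the best phase patterns always survive
        pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                       for _ in range(30)]
    final_best = max(map(fitness, pop))
    print(round(initial_best, 2), "->", round(final_best, 2))
    ```

    Because of elitism, the best fitness never decreases from generation to generation; a realistic implementation would instead evaluate a full acoustic field model at each candidate phase set.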

  10. Problem-Solving Software

    NASA Technical Reports Server (NTRS)

    1992-01-01

    CBR Express software solves problems by adapting stored solutions to new problems specified by a user. It is applicable to a wide range of situations. The technology was originally developed by Inference Corporation for Johnson Space Center's Advanced Software Development Workstation. The project focused on the reuse of software designs, and Inference used case-based reasoning (CBR) as part of the ACCESS prototype software. The commercial CBR Express is used as a "help desk" for customer support, enabling reuse of existing information when necessary. It has been adopted by several companies, among them American Airlines, which uses it to solve reservation system software problems.

  11. Verification of operating software for cooperative monitoring applications

    SciTech Connect

    Tolk, K.M.; Rembold, R.K.

    1997-08-01

    Monitoring agencies often use computer-based equipment to control instruments and to collect data at sites that are being monitored under international safeguards or other cooperative monitoring agreements. In order for this data to be used as an independent verification of data supplied by the host at the facility, the software used must be trusted by the monitoring agency. The monitoring party must be sure that the software has not been altered to give results that could lead to erroneous conclusions about nuclear materials inventories or other operating conditions at the site. The host might also want to verify that the software being used is the software that was previously inspected, in order to be assured that only data allowed under the agreement is being collected. A method to provide this verification using keyed hash functions is described, along with how it overcomes possible vulnerabilities in methods currently in use, such as loading the software from trusted disks. The use of public key data authentication for this purpose is also discussed.
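    The keyed-hash idea can be sketched with a standard HMAC: the verifying party records a keyed hash of the inspected software image and later recomputes it on site, so a party that can read the software but not the key cannot forge a matching tag. The key and image bytes below are illustrative placeholders, not the report's actual protocol.

    ```python
    # Sketch of software verification with a keyed hash (HMAC-SHA256).
    import hashlib
    import hmac

    KEY = b"monitoring-agency-secret"        # held only by the verifying party
    inspected_image = b"\x7fELF...approved build"

    # Recorded at inspection time.
    reference_tag = hmac.new(KEY, inspected_image, hashlib.sha256).hexdigest()

    def verify(image: bytes, tag: str) -> bool:
        """Recompute the keyed hash on site and compare in constant time."""
        expected = hmac.new(KEY, image, hashlib.sha256).hexdigest()
        return hmac.compare_digest(expected, tag)

    print(verify(inspected_image, reference_tag))              # True
    print(verify(b"\x7fELF...tampered build", reference_tag))  # False
    ```

    Unlike loading from "trusted disks", the check binds the running image to the inspected one; the public-key variant mentioned above would replace the shared key with a signature.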

  12. Selection of bioprocess simulation software for industrial applications.

    PubMed

    Shanklin, T; Roper, K; Yegneswaran, P K; Marten, M R

    2001-02-20

    Two commercially available process-simulation software packages (Aspen Batch Plus v1.2, Aspen Technology, Inc., Cambridge, Massachusetts, and Intelligen SuperPro v3.0, INTELLIGEN, INC., Scotch Plains, New Jersey) are evaluated for use in modeling industrial biotechnology processes. The software is quantitatively evaluated by Kepner-Tregoe Decision Analysis (Kepner and Tregoe, 1981). This evaluation shows that Aspen Batch Plus v1.2 (ABP) and Intelligen SuperPro v3.0 (ISP) can successfully perform specific simulation tasks but do not provide a complete model of all phenomena occurring within a biotechnology process. The software is best suited to providing a format for process management, using material and energy balances to answer scheduling questions, explore equipment change-outs, and calculate cost data. The ability of simulation software to accurately predict unit operation scale-up and optimize bioprocesses is limited. To realistically evaluate the software, a vaccine manufacturing process under development at Merck & Company is simulated. Case studies from the vaccine process are presented as examples of how ABP and ISP can be used to shed light on real-world processing issues.
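    Kepner-Tregoe Decision Analysis, as used above, amounts to a weighted scoring matrix: each criterion gets a weight, each alternative a score, and the weighted sums are compared. The criteria, weights, and scores below are invented placeholders for illustration, not the values from the Merck evaluation.

    ```python
    # Hypothetical Kepner-Tregoe style weighted scoring of the two packages.
    CRITERIA = {  # criterion -> weight (1-10), illustrative only
        "material/energy balances": 9,
        "scheduling support": 8,
        "cost analysis": 6,
        "scale-up prediction": 4,
    }

    def weighted_score(scores: dict) -> int:
        """Sum of criterion weight times alternative score."""
        return sum(CRITERIA[c] * s for c, s in scores.items())

    abp = {"material/energy balances": 8, "scheduling support": 9,
           "cost analysis": 7, "scale-up prediction": 3}
    isp = {"material/energy balances": 7, "scheduling support": 8,
           "cost analysis": 8, "scale-up prediction": 4}

    print("ABP:", weighted_score(abp))  # ABP: 198
    print("ISP:", weighted_score(isp))  # ISP: 191
    ```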

  13. Software Defined GPS API: Development and Implementation of GPS Correlator Architectures Using MATLAB with Focus on SDR Implementations

    DTIC Science & Technology

    2014-05-18

    ...with the intention of offering improved software libraries for GNSS signal acquisition. It has been the team mission to implement new and improved techniques to

  14. Editorial: Focus on Atom Optics and its Applications

    NASA Astrophysics Data System (ADS)

    Schmidt-Kaler, F.; Pfau, T.; Schmelcher, P.; Schleich, W.

    2010-06-01

    Atom optics employs the modern techniques of quantum optics and laser cooling to enable applications which often outperform current standard technologies. Atomic matter wave interferometers allow for ultra-precise sensors; metrology and clocks are pushed to an extraordinary accuracy of 17 digits using single atoms. Miniaturization and integration are driven forward for both atomic clocks and atom optical circuits. With the miniaturization of information-storage and -processing devices, the scale of single atoms is approached in solid state devices, where the laws of quantum physics lead to novel, advantageous features and functionalities. An upcoming branch of atom optics is the control of single atoms, potentially allowing solid state devices to be built atom by atom; some of which would be applicable in future quantum information processing devices. Selective manipulation of individual atoms also enables trace analysis of extremely rare isotopes. Additionally, sources of neutral atoms with high brightness are being developed and, if combined with photo ionization, even novel focused ion beam sources are within reach. Ultracold chemistry is fertilized by atomic techniques, when reactions of chemical constituents are investigated between ions, atoms, molecules, trapped or aligned in designed fields and cooled to ultra-low temperatures such that the reaction kinetics can be studied in a completely state-resolved manner. 
Focus on Atom Optics and its Applications: Contents
    Sensitive gravity-gradiometry with atom interferometry: progress towards an improved determination of the gravitational constant (F Sorrentino, Y-H Lien, G Rosi, L Cacciapuoti, M Prevedelli and G M Tino)
    A single-atom detector integrated on an atom chip: fabrication, characterization and application (D Heine, W Rohringer, D Fischer, M Wilzbach, T Raub, S Loziczky, XiYuan Liu, S Groth, B Hessmo and J Schmiedmayer)
    Interaction of a propagating guided matter wave with a localized potential (G L Gattobigio, A

  15. Progress in Addressing DNFSB Recommendation 2002-1 Issues: Improving Accident Analysis Software Applications

    SciTech Connect

    VINCENT, ANDREW

    2005-04-25

    Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2002-1 (''Quality Assurance for Safety-Related Software'') identified a number of quality assurance issues on the use of software in Department of Energy (DOE) facilities for analyzing hazards, and designing and operating controls to prevent or mitigate potential accidents. Over the last year, DOE has begun several processes and programs as part of the Implementation Plan commitments, and in particular, has made significant progress in addressing several sets of issues particularly important in the application of software for performing hazard and accident analysis. The work discussed here demonstrates that through these actions, Software Quality Assurance (SQA) guidance and software tools are available that can be used to improve resulting safety analysis. Specifically, five of the primary actions corresponding to the commitments made in the Implementation Plan to Recommendation 2002-1 are identified and discussed in this paper. Included are the web-based DOE SQA Knowledge Portal and the Central Registry, guidance and gap analysis reports, electronic bulletin board and discussion forum, and a DOE safety software guide. These SQA products can benefit DOE safety contractors in the development of hazard and accident analysis by precluding inappropriate software applications and utilizing best practices when incorporating software results into safety basis documentation. The improvement actions discussed here mark a beginning to establishing stronger, standard-compliant programs, practices, and processes in SQA among safety software users, managers, and reviewers throughout the DOE Complex. Additional effort is needed, however, particularly in: (1) processes to add new software applications to the DOE Safety Software Toolbox; (2) improving the effectiveness of software issue communication; and (3) promoting a safety software quality assurance culture.

  16. Application of software technology to a future spacecraft computer design

    NASA Technical Reports Server (NTRS)

    Labaugh, R. J.

    1980-01-01

    A study was conducted to determine how major improvements in spacecraft computer systems can be obtained from recent advances in hardware and software technology. Investigations into integrated circuit technology indicated that the CMOS/SOS chip set being developed for the Air Force Avionics Laboratory at Wright Patterson had the best potential for improving the performance of spaceborne computer systems. An integral part of the chip set is the bit slice arithmetic and logic unit. The flexibility allowed by microprogramming, combined with the software investigations, led to the specification of a baseline architecture and instruction set.

  17. Extending the Role of the Corporate Library: Corporate Database Applications Using BRS/Search Software.

    ERIC Educational Resources Information Center

    Lammert, Diana

    1993-01-01

    Describes the McKenna Information Center's application of BRS/SEARCH, information retrieval software, as part of its services to Kennametal Inc., its parent company. Features and uses of the software, including commands, custom searching, menu-driven interfaces, preparing reports, and designing databases are covered. Nine examples of software…

  18. ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (MASSCOMP VERSION)

    NASA Technical Reports Server (NTRS)

    Walters, D.

    1994-01-01

    The Science and Technology Laboratory Applications Software (ELAS) was originally designed to analyze and process digital imagery data, specifically remotely-sensed scanner data. This capability includes the processing of Landsat multispectral data; aircraft-acquired scanner data; digitized topographic data; and numerous other ancillary data, such as soil types and rainfall information, that can be stored in digitized form. ELAS has the subsequent capability to geographically reference this data to dozens of standard, as well as user created projections. As an integrated image processing system, ELAS offers the user of remotely-sensed data a wide range of capabilities in the areas of land cover analysis and general purpose image analysis. ELAS is designed for flexible use and operation and includes its own FORTRAN operating subsystem and an expandable set of FORTRAN application modules. Because all of ELAS resides in one "logical" FORTRAN program, data inputs and outputs, directives, and module switching are convenient for the user. There are over 230 modules presently available to aid the user in performing a wide range of land cover analyses and manipulation. The file management modules enable the user to allocate, define, access, and specify usage for all types of files (ELAS files, subfiles, external files etc.). Various other modules convert specific types of satellite, aircraft, and vector-polygon data into files that can be used by other ELAS modules. The user also has many module options which aid in displaying image data, such as magnification/reduction of the display; true color display; and several memory functions. Additional modules allow for the building and manipulation of polygonal areas of the image data. Finally, there are modules which allow the user to select and classify the image data. An important feature of the ELAS subsystem is that its structure allows new applications modules to be easily integrated in the future. 
ELAS has as a standard

  19. ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (DEC VAX VERSION)

    NASA Technical Reports Server (NTRS)

    Junkin, B. G.

    1994-01-01

    The Science and Technology Laboratory Applications Software (ELAS) was originally designed to analyze and process digital imagery data, specifically remotely-sensed scanner data. This capability includes the processing of Landsat multispectral data; aircraft-acquired scanner data; digitized topographic data; and numerous other ancillary data, such as soil types and rainfall information, that can be stored in digitized form. ELAS has the subsequent capability to geographically reference this data to dozens of standard, as well as user created projections. As an integrated image processing system, ELAS offers the user of remotely-sensed data a wide range of capabilities in the areas of land cover analysis and general purpose image analysis. ELAS is designed for flexible use and operation and includes its own FORTRAN operating subsystem and an expandable set of FORTRAN application modules. Because all of ELAS resides in one "logical" FORTRAN program, data inputs and outputs, directives, and module switching are convenient for the user. There are over 230 modules presently available to aid the user in performing a wide range of land cover analyses and manipulation. The file management modules enable the user to allocate, define, access, and specify usage for all types of files (ELAS files, subfiles, external files etc.). Various other modules convert specific types of satellite, aircraft, and vector-polygon data into files that can be used by other ELAS modules. The user also has many module options which aid in displaying image data, such as magnification/reduction of the display; true color display; and several memory functions. Additional modules allow for the building and manipulation of polygonal areas of the image data. Finally, there are modules which allow the user to select and classify the image data. An important feature of the ELAS subsystem is that its structure allows new applications modules to be easily integrated in the future. 
ELAS has as a standard

  20. ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (SILICON GRAPHICS VERSION)

    NASA Technical Reports Server (NTRS)

  1. ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (CONCURRENT VERSION)

    NASA Technical Reports Server (NTRS)

    Pearson, R. W.

    1994-01-01

    The Science and Technology Laboratory Applications Software (ELAS) was originally designed to analyze and process digital imagery data, specifically remotely sensed scanner data. This capability includes the processing of Landsat multispectral data; aircraft-acquired scanner data; digitized topographic data; and numerous other ancillary data, such as soil types and rainfall information, that can be stored in digitized form. ELAS can then geographically reference these data to dozens of standard, as well as user-created, projections. As an integrated image processing system, ELAS offers the user of remotely sensed data a wide range of capabilities in the areas of land cover analysis and general-purpose image analysis. ELAS is designed for flexible use and operation and includes its own FORTRAN operating subsystem and an expandable set of FORTRAN application modules. Because all of ELAS resides in one "logical" FORTRAN program, data inputs and outputs, directives, and module switching are convenient for the user. More than 230 modules are presently available to aid the user in performing a wide range of land cover analyses and manipulations. The file management modules enable the user to allocate, define, access, and specify usage for all types of files (ELAS files, subfiles, external files, etc.). Various other modules convert specific types of satellite, aircraft, and vector-polygon data into files that can be used by other ELAS modules. The user also has many module options that aid in displaying image data, such as magnification/reduction of the display, true-color display, and several memory functions. Additional modules allow for the building and manipulation of polygonal areas of the image data. Finally, there are modules that allow the user to select and classify the image data. An important feature of the ELAS subsystem is that its structure allows new application modules to be easily integrated in the future.
ELAS has as a standard
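The "one logical program" design described above (a shared file table plus directive-driven module switching) can be sketched in miniature. This is a hypothetical Python analogue for illustration only; ELAS itself is a FORTRAN system, and every name below is invented:

```python
# Hypothetical sketch of the "one logical program" module-switching design
# described for ELAS: a shared file table plus a registry of named modules.
# All names here are invented; ELAS itself is FORTRAN.

MODULES = {}          # directive name -> module function
FILE_TABLE = {}       # shared state: file name -> data

def module(name):
    """Register a function as a named application module."""
    def register(fn):
        MODULES[name] = fn
        return fn
    return register

@module("ALLOCATE")
def allocate(fname, size):
    FILE_TABLE[fname] = [0] * size

@module("MAGNIFY")
def magnify(fname, factor):
    # Toy stand-in for display magnification: scale every pixel value.
    FILE_TABLE[fname] = [v * factor for v in FILE_TABLE[fname]]

def run(directive, *args):
    """Dispatch a user directive to the matching module."""
    return MODULES[directive](*args)

run("ALLOCATE", "IMG1", 4)
FILE_TABLE["IMG1"] = [1, 2, 3, 4]
run("MAGNIFY", "IMG1", 2)
print(FILE_TABLE["IMG1"])   # -> [2, 4, 6, 8]
```

Because every module shares the one file table and dispatch loop, adding a new module is just one more registration, which is the extensibility property the abstract emphasizes.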

  2. ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (SUN VERSION)

    NASA Technical Reports Server (NTRS)

    Walters, D.

    1994-01-01

    The Science and Technology Laboratory Applications Software (ELAS) was originally designed to analyze and process digital imagery data, specifically remotely sensed scanner data. This capability includes the processing of Landsat multispectral data; aircraft-acquired scanner data; digitized topographic data; and numerous other ancillary data, such as soil types and rainfall information, that can be stored in digitized form. ELAS can then geographically reference these data to dozens of standard, as well as user-created, projections. As an integrated image processing system, ELAS offers the user of remotely sensed data a wide range of capabilities in the areas of land cover analysis and general-purpose image analysis. ELAS is designed for flexible use and operation and includes its own FORTRAN operating subsystem and an expandable set of FORTRAN application modules. Because all of ELAS resides in one "logical" FORTRAN program, data inputs and outputs, directives, and module switching are convenient for the user. More than 230 modules are presently available to aid the user in performing a wide range of land cover analyses and manipulations. The file management modules enable the user to allocate, define, access, and specify usage for all types of files (ELAS files, subfiles, external files, etc.). Various other modules convert specific types of satellite, aircraft, and vector-polygon data into files that can be used by other ELAS modules. The user also has many module options that aid in displaying image data, such as magnification/reduction of the display, true-color display, and several memory functions. Additional modules allow for the building and manipulation of polygonal areas of the image data. Finally, there are modules that allow the user to select and classify the image data. An important feature of the ELAS subsystem is that its structure allows new application modules to be easily integrated in the future.
ELAS has as a standard

  4. ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (DEC VAX VERSION)

    NASA Technical Reports Server (NTRS)

    Junkin, B. G.

    1994-01-01

    The Science and Technology Laboratory Applications Software (ELAS) was originally designed to analyze and process digital imagery data, specifically remotely sensed scanner data. This capability includes the processing of Landsat multispectral data; aircraft-acquired scanner data; digitized topographic data; and numerous other ancillary data, such as soil types and rainfall information, that can be stored in digitized form. ELAS can then geographically reference these data to dozens of standard, as well as user-created, projections. As an integrated image processing system, ELAS offers the user of remotely sensed data a wide range of capabilities in the areas of land cover analysis and general-purpose image analysis. ELAS is designed for flexible use and operation and includes its own FORTRAN operating subsystem and an expandable set of FORTRAN application modules. Because all of ELAS resides in one "logical" FORTRAN program, data inputs and outputs, directives, and module switching are convenient for the user. More than 230 modules are presently available to aid the user in performing a wide range of land cover analyses and manipulations. The file management modules enable the user to allocate, define, access, and specify usage for all types of files (ELAS files, subfiles, external files, etc.). Various other modules convert specific types of satellite, aircraft, and vector-polygon data into files that can be used by other ELAS modules. The user also has many module options that aid in displaying image data, such as magnification/reduction of the display, true-color display, and several memory functions. Additional modules allow for the building and manipulation of polygonal areas of the image data. Finally, there are modules that allow the user to select and classify the image data. An important feature of the ELAS subsystem is that its structure allows new application modules to be easily integrated in the future.
ELAS has as a standard

  5. ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (MASSCOMP VERSION)

    NASA Technical Reports Server (NTRS)

    Walters, D.

    1994-01-01

    The Science and Technology Laboratory Applications Software (ELAS) was originally designed to analyze and process digital imagery data, specifically remotely sensed scanner data. This capability includes the processing of Landsat multispectral data; aircraft-acquired scanner data; digitized topographic data; and numerous other ancillary data, such as soil types and rainfall information, that can be stored in digitized form. ELAS can then geographically reference these data to dozens of standard, as well as user-created, projections. As an integrated image processing system, ELAS offers the user of remotely sensed data a wide range of capabilities in the areas of land cover analysis and general-purpose image analysis. ELAS is designed for flexible use and operation and includes its own FORTRAN operating subsystem and an expandable set of FORTRAN application modules. Because all of ELAS resides in one "logical" FORTRAN program, data inputs and outputs, directives, and module switching are convenient for the user. More than 230 modules are presently available to aid the user in performing a wide range of land cover analyses and manipulations. The file management modules enable the user to allocate, define, access, and specify usage for all types of files (ELAS files, subfiles, external files, etc.). Various other modules convert specific types of satellite, aircraft, and vector-polygon data into files that can be used by other ELAS modules. The user also has many module options that aid in displaying image data, such as magnification/reduction of the display, true-color display, and several memory functions. Additional modules allow for the building and manipulation of polygonal areas of the image data. Finally, there are modules that allow the user to select and classify the image data. An important feature of the ELAS subsystem is that its structure allows new application modules to be easily integrated in the future.
ELAS has as a standard

  7. ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (SILICON GRAPHICS VERSION)

    NASA Technical Reports Server (NTRS)

    Walters, D.

    1994-01-01

    The Science and Technology Laboratory Applications Software (ELAS) was originally designed to analyze and process digital imagery data, specifically remotely sensed scanner data. This capability includes the processing of Landsat multispectral data; aircraft-acquired scanner data; digitized topographic data; and numerous other ancillary data, such as soil types and rainfall information, that can be stored in digitized form. ELAS can then geographically reference these data to dozens of standard, as well as user-created, projections. As an integrated image processing system, ELAS offers the user of remotely sensed data a wide range of capabilities in the areas of land cover analysis and general-purpose image analysis. ELAS is designed for flexible use and operation and includes its own FORTRAN operating subsystem and an expandable set of FORTRAN application modules. Because all of ELAS resides in one "logical" FORTRAN program, data inputs and outputs, directives, and module switching are convenient for the user. More than 230 modules are presently available to aid the user in performing a wide range of land cover analyses and manipulations. The file management modules enable the user to allocate, define, access, and specify usage for all types of files (ELAS files, subfiles, external files, etc.). Various other modules convert specific types of satellite, aircraft, and vector-polygon data into files that can be used by other ELAS modules. The user also has many module options that aid in displaying image data, such as magnification/reduction of the display, true-color display, and several memory functions. Additional modules allow for the building and manipulation of polygonal areas of the image data. Finally, there are modules that allow the user to select and classify the image data. An important feature of the ELAS subsystem is that its structure allows new application modules to be easily integrated in the future.
ELAS has as a standard

  8. Generating Safety-Critical PLC Code From a High-Level Application Software Specification

    NASA Technical Reports Server (NTRS)

    2008-01-01

    The benefits of automatic application-code generation are widely accepted within the software engineering community. These benefits include a raised abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at Kennedy Space Center recognized the need for PLC code generation while developing the new ground checkout and launch processing system, called the Launch Control System (LCS). Engineers developed a process and a prototype software tool that automatically translates a high-level representation or specification of application software into ladder logic that executes on a PLC. All the computer hardware in the LCS is planned to be commercial off-the-shelf (COTS), including industrial controllers or PLCs that are connected to the sensors and end items out in the field. Most of the software in LCS is also planned to be COTS, with only small adapter software modules that must be developed in order to interface between the various COTS software products. A domain-specific language (DSL) is a programming language designed to perform tasks and to solve problems in a particular domain, such as ground processing of launch vehicles. The LCS engineers created a DSL for developing test sequences of ground checkout and launch operations of future launch vehicle and spacecraft elements, and they are developing a tabular specification format that uses the DSL keywords and functions familiar to the ground and flight system users. The tabular specification format, or tabular spec, allows most ground and flight system users to document how the application software is intended to function and requires little or no software programming knowledge or experience. A small sample from a prototype tabular spec application is
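As a rough illustration of the idea, the sketch below turns a tiny invented tabular spec into IEC 61131-style structured text. The row format, signal names, and output dialect are all assumptions made for this example; the actual KSC tool generates ladder logic from its own DSL-based tabular spec:

```python
# Hypothetical sketch of generating PLC code from a tabular application
# spec, in the spirit of the KSC/LCS tool described above. The row format
# and the structured-text output are invented for illustration; the real
# tool emits ladder logic from a domain-specific tabular spec.

rows = [
    # (step name, condition, action)
    ("OpenValve",  "PressureOK AND NOT Fault", "Valve := TRUE"),
    ("CloseValve", "Fault",                    "Valve := FALSE"),
]

def to_structured_text(rows):
    lines = []
    for step, cond, action in rows:
        lines.append(f"(* step: {step} *)")
        lines.append(f"IF {cond} THEN")
        lines.append(f"    {action};")
        lines.append("END_IF;")
    return "\n".join(lines)

print(to_structured_text(rows))
```

The value of the approach is that the table stays readable to ground and flight system users while the generator guarantees uniform, reviewable output.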

  9. Software Agents Applications Using Real-Time CORBA

    NASA Astrophysics Data System (ADS)

    Fowell, S.; Ward, R.; Nielsen, M.

    This paper describes current projects being performed by SciSys in the area of the use of software agents, built using CORBA middleware, to improve operations within autonomous satellite/ground systems. These concepts have been developed and demonstrated in a series of experiments variously funded by ESA's Technology Flight Opportunity Initiative (TFO) and Leading Edge Technology for SMEs (LET-SME), and the British National Space Centre's (BNSC) National Technology Programme. Some of this earlier work has already been reported in [1]. This paper will address the trends, issues and solutions associated with this software agent architecture concept, together with its implementation using CORBA within an on-board environment, that is to say taking account of its real-time and resource-constrained nature.

  10. Application of Lightweight Formal Methods to Software Security

    NASA Technical Reports Server (NTRS)

    Gilliam, David P.; Powell, John D.; Bishop, Matt

    2005-01-01

    Formal specification and verification of security has proven a challenging task. There is no single method that has proven feasible. Instead, an integrated approach that combines several formal techniques can increase confidence in the verification of software security properties. Such an approach, which specifies security properties in a library that can be reused, is embodied in two instruments and their methodologies developed for the National Aeronautics and Space Administration (NASA) at the Jet Propulsion Laboratory (JPL) and described herein. The Flexible Modeling Framework (FMF) is a model-based verification instrument that uses Promela and the SPIN model checker. The Property Based Tester (PBT) uses TASPEC and a Test Execution Monitor (TEM). They are used to reduce vulnerabilities and unwanted exposures in software during the development and maintenance life cycles.
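The core of model-based verification with a tool like SPIN is exhaustive exploration of a system's state space against a property. The toy checker below illustrates that idea in Python on an invented two-process mutual-exclusion model; the real FMF work writes its models in Promela and relies on SPIN:

```python
# Toy explicit-state model check, illustrating the idea behind the
# SPIN-based verification described above (the real FMF models are
# written in Promela). The protocol and property here are invented.
from collections import deque

# A tiny mutual-exclusion protocol: two processes, each either
# idle (0), trying (1), or in the critical section (2).
def successors(state):
    for i in (0, 1):
        pc = list(state)
        if pc[i] == 0:
            pc[i] = 1                        # start trying
        elif pc[i] == 1 and pc[1 - i] != 2:  # enter only if peer is out
            pc[i] = 2
        elif pc[i] == 2:
            pc[i] = 0                        # leave critical section
        else:
            continue                         # blocked; no move
        yield tuple(pc)

def check_invariant(init, invariant):
    """Breadth-first exploration of all reachable states."""
    seen, queue = {init}, deque([init])
    while queue:
        s = queue.popleft()
        if not invariant(s):
            return s                         # counterexample state
        for t in successors(s):
            if t not in seen:
                seen.add(t)
                queue.append(t)
    return None                              # property holds everywhere

mutex = lambda s: not (s[0] == 2 and s[1] == 2)
print(check_invariant((0, 0), mutex))        # -> None (no violation found)
```

SPIN does the same reachability search at vastly larger scale, with partial-order reduction and temporal-logic properties rather than simple invariants.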

  11. Bringing Legacy Visualization Software to Modern Computing Devices via Application Streaming

    NASA Astrophysics Data System (ADS)

    Fisher, Ward

    2014-05-01

    Planning software compatibility across forthcoming generations of computing platforms is a problem commonly encountered in software engineering and development. While this problem can affect any class of software, data analysis and visualization programs are particularly vulnerable. This is due in part to their inherent dependency on specialized hardware and computing environments. A number of strategies and tools have been designed to aid software engineers with this task. While generally embraced by developers at 'traditional' software companies, these methodologies are often dismissed by the scientific software community as unwieldy, inefficient and unnecessary. As a result, many important and storied scientific software packages can struggle to adapt to a new computing environment; for example, one in which much work is carried out on sub-laptop devices (such as tablets and smartphones). Rewriting these packages for a new platform often requires significant investment in terms of development time and developer expertise. In many cases, porting older software to modern devices is neither practical nor possible. As a result, replacement software must be developed from scratch, wasting resources better spent on other projects. Enabled largely by the rapid rise and adoption of cloud computing platforms, 'Application Streaming' technologies allow legacy visualization and analysis software to be operated wholly from a client device (be it laptop, tablet or smartphone) while retaining full functionality and interactivity. This approach mitigates much of the developer effort required by more traditional methods while simultaneously reducing the time it takes to bring the software to a new platform. This work will provide an overview of Application Streaming and how it compares against other technologies which allow scientific visualization software to be executed from a remote computer. We will discuss the functionality and limitations of existing application streaming

  12. Applications of multigrid software in the atmospheric sciences

    NASA Technical Reports Server (NTRS)

    Adams, J.; Garcia, R.; Gross, B.; Hack, J.; Haidvogel, D.; Pizzo, V.

    1992-01-01

    Elliptic partial differential equations from different areas in the atmospheric sciences are efficiently and easily solved utilizing the multigrid software package named MUDPACK. It is demonstrated that the multigrid method is more efficient than other commonly employed techniques, such as Gaussian elimination and fixed-grid relaxation. The efficiency relative to other techniques, both in terms of storage requirement and computational time, increases quickly with grid size.
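To see why multigrid outpaces fixed-grid relaxation, consider a minimal two-grid cycle for the 1D Poisson equation. This sketch is only illustrative; MUDPACK itself is a FORTRAN package treating far more general separable elliptic problems on 2D and 3D grids:

```python
# Minimal two-grid multigrid solve of the 1D Poisson problem -u'' = f,
# u(0) = u(1) = 0, illustrating why multigrid outperforms fixed-grid
# relaxation as reported above for MUDPACK.

OMEGA = 2.0 / 3.0  # damped-Jacobi weight

def residual(u, f, h):
    out = []
    for i in range(len(u)):
        left = u[i - 1] if i > 0 else 0.0
        right = u[i + 1] if i < len(u) - 1 else 0.0
        out.append(f[i] - (2.0 * u[i] - left - right) / h ** 2)
    return out

def jacobi(u, f, h, sweeps):
    for _ in range(sweeps):
        r = residual(u, f, h)
        u = [u[i] + OMEGA * (h ** 2 / 2.0) * r[i] for i in range(len(u))]
    return u

def solve_direct(rhs, h):
    """Thomas algorithm for the tridiagonal matrix (1/h^2) * [-1, 2, -1]."""
    n, c = len(rhs), -1.0 / h ** 2
    b, d = [2.0 / h ** 2] * n, list(rhs)
    for i in range(1, n):
        m = c / b[i - 1]
        b[i] -= m * c
        d[i] -= m * d[i - 1]
    x = [0.0] * n
    x[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):
        x[i] = (d[i] - c * x[i + 1]) / b[i]
    return x

def two_grid(u, f, h, pre=3, post=3):
    u = jacobi(u, f, h, pre)                       # pre-smooth
    r = residual(u, f, h)
    nc = (len(u) - 1) // 2                         # coarse grid, spacing 2h
    rc = [0.25 * r[2*i] + 0.5 * r[2*i+1] + 0.25 * r[2*i+2] for i in range(nc)]
    ec = solve_direct(rc, 2 * h)                   # exact coarse correction
    e = [0.0] * len(u)                             # interpolate back to fine grid
    for i in range(nc):
        e[2 * i + 1] = ec[i]
    for i in range(0, len(u), 2):
        left = e[i - 1] if i > 0 else 0.0
        right = e[i + 1] if i < len(u) - 1 else 0.0
        e[i] = 0.5 * (left + right)
    u = [u[i] + e[i] for i in range(len(u))]
    return jacobi(u, f, h, post)                   # post-smooth

n, h = 31, 1.0 / 32
f = [1.0] * n
norm = lambda v: max(abs(x) for x in v)
r_mg = norm(residual(two_grid([0.0] * n, f, h), f, h))
r_jac = norm(residual(jacobi([0.0] * n, f, h, 6), f, h))
print(r_mg < r_jac)   # -> True: one V-cycle beats six plain Jacobi sweeps
```

Relaxation damps only the oscillatory error components; the coarse-grid correction removes the smooth ones cheaply, which is the source of the efficiency gap the abstract reports.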

  13. Software Quality Evaluation Models Applicable in Health Information and Communications Technologies. A Review of the Literature.

    PubMed

    Villamor Ordozgoiti, Alberto; Delgado Hito, Pilar; Guix Comellas, Eva María; Fernandez Sanchez, Carlos Manuel; Garcia Hernandez, Milagros; Lluch Canut, Teresa

    2016-01-01

    The use of Information and Communications Technologies (ICTs) in healthcare has increased the need to consider quality criteria through standardised processes. The aim of this study was to analyse the software quality evaluation models applicable to healthcare from the perspective of ICT purchasers. Through a systematic literature review with the keywords software, product, quality, evaluation and health, we selected and analysed 20 original research papers published from 2005-2016 in health science and technology databases. The results showed four main topics: non-ISO models, software quality evaluation models based on ISO/IEC standards, studies analysing software quality evaluation models, and studies analysing ISO standards for software quality evaluation. The models provide cost-efficiency criteria for specific software and improve use outcomes. The ISO/IEC 25000 standard emerged as the most suitable for evaluating the quality of ICTs for healthcare use from the perspective of institutional acquisition.

  14. Handbook of software quality assurance techniques applicable to the nuclear industry

    SciTech Connect

    Bryant, J.L.; Wilburn, N.P.

    1987-08-01

    Pacific Northwest Laboratory is conducting a research project for the NRC and industry to recommend good engineering practices in the application of 10 CFR 50, Appendix B requirements, so as to assure quality in the development and use of computer software for the design and operation of nuclear power plants. This handbook defines the content of a software quality assurance program by enumerating the applicable techniques. Definitions, descriptions, and references where further information may be obtained are provided for each topic.

  15. Software Application for Supporting the Education of Database Systems

    ERIC Educational Resources Information Center

    Vágner, Anikó

    2015-01-01

    The article introduces an application which supports the education of database systems, particularly the teaching of SQL and PL/SQL in Oracle Database Management System environment. The application has two parts, one is the database schema and its content, and the other is a C# application. The schema is to administrate and store the tasks and the…

  16. Application of Domain Knowledge to Software Quality Assurance

    NASA Technical Reports Server (NTRS)

    Wild, Christian W.

    1997-01-01

    This work focused on capturing, using, and evolving a qualitative decision support structure across the life cycle of a project. The particular application of this study was business process reengineering and the representation of the business process in a set of Business Rules (BR). In this work, we defined a decision model which captured the qualitative decision deliberation process. It represented arguments both for and against proposed alternatives to a problem. It was felt that the subjective nature of many critical business policy decisions required a qualitative modeling approach similar to that of Lee and Mylopoulos. While previous work was limited almost exclusively to the decision capture phase, which occurs early in the project life cycle, we investigated the use of such a model during the later stages as well. One of our significant developments was the use of the decision model during the operational phase of a project. By operational phase, we mean the phase in which the system or set of policies decided earlier is deployed and put into practice. By making the decision model available to operational decision makers, they have access to the arguments pro and con for a variety of actions and can thus make a more informed decision that balances the often conflicting criteria by which the value of an action is measured. We also developed the concept of a 'monitored decision', in which metrics of performance are identified during the decision making process and used to evaluate the quality of that decision. It is important to monitor those decisions that seem at highest risk of not meeting their stated objectives. Operational decisions are also potentially high-risk decisions. Finally, we investigated the use of performance metrics for monitored decisions and audit logs of operational decisions in order to feed an evolutionary phase of the life cycle. During evolution, decisions are revisited, assumptions verified or refuted
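The 'monitored decision' concept can be made concrete with a small sketch. All names, arguments and metrics below are invented; the point is only that a decision carries both its pro/con deliberation and the metrics by which its quality will later be judged in operation:

```python
# Illustrative sketch (all names invented) of the "monitored decision"
# idea described above: a decision records its pro/con arguments and the
# performance metrics against which it will be evaluated in operation.
from dataclasses import dataclass, field

@dataclass
class MonitoredDecision:
    choice: str
    pros: list = field(default_factory=list)
    cons: list = field(default_factory=list)
    # metric name -> (target, actual value observed so far)
    metrics: dict = field(default_factory=dict)

    def at_risk(self):
        """Flag the decision if any monitored metric misses its target."""
        return [name for name, (target, actual) in self.metrics.items()
                if actual < target]

d = MonitoredDecision(
    choice="route orders through the new approval workflow",
    pros=["enforces audit trail"],
    cons=["adds latency"],
    metrics={"orders_per_day": (100, 82), "approvals_logged": (1, 1)},
)
print(d.at_risk())   # -> ['orders_per_day']
```

Feeding such at-risk flags, together with the recorded arguments, back into a review is precisely the evolutionary step the abstract describes.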

  17. The application of formal software engineering methods to the unattended and remote monitoring software suite at Los Alamos National Laboratory

    SciTech Connect

    Determan, John Clifford; Longo, Joseph F; Michel, Kelly D

    2009-01-01

    The Unattended and Remote Monitoring (UNARM) system is a collection of specialized hardware and software used by the International Atomic Energy Agency (IAEA) to institute nuclear safeguards at many nuclear facilities around the world. The hardware consists of detectors, instruments, and networked computers for acquiring various forms of data, including but not limited to radiation data, global position coordinates, camera images, isotopic data, and operator declarations. The software provides two primary functions: the secure and reliable collection of this data from the instruments and the ability to perform an integrated review and analysis of the disparate data sources. Several years ago the team responsible for maintaining the software portion of the UNARM system began the process of formalizing its operations. These formal operations include a configuration management system, a change control board, an issue tracking system, and extensive formal testing, for both functionality and reliability. Functionality is tested with formal test cases chosen to fully represent the data types and methods of analysis that will be commonly encountered. Reliability is tested with iterative, concurrent testing where up to five analyses are executed simultaneously for thousands of cycles. Iterative concurrent testing helps ensure that there are no resource conflicts or leaks when multiple system components are in use simultaneously. The goal of this work is to provide a high quality, reliable product, commensurate with the criticality of the application. Testing results will be presented that demonstrate that this goal has been achieved and the impact of the introduction of a formal software engineering framework to the UNARM product will be presented.
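The iterative, concurrent testing strategy can be sketched as follows. The 'analysis' here is a trivial stand-in for a real UNARM data review; the pattern of running several copies simultaneously for many cycles is what surfaces resource conflicts and leaks:

```python
# Sketch of the "iterative, concurrent testing" strategy described above:
# several analyses run simultaneously for many cycles to surface resource
# conflicts. The analysis function is a trivial stand-in; the real UNARM
# tests run up to five concurrent reviews for thousands of cycles.
import threading

shared_log = []
lock = threading.Lock()

def analysis(worker_id, cycles):
    for i in range(cycles):
        # Guard shared state; forgetting this lock is exactly the kind
        # of defect this style of testing is designed to expose.
        with lock:
            shared_log.append((worker_id, i))

threads = [threading.Thread(target=analysis, args=(w, 1000)) for w in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(len(shared_log))   # -> 5000: no entries lost to races
```

Checking that the shared log holds exactly the expected number of records after thousands of interleaved cycles is a miniature version of the reliability criterion described above.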

  18. Scoring of medical publications with SIGAPS software: Application to orthopedics.

    PubMed

    Rouvillain, J-L; Derancourt, C; Moore, N; Devos, P

    2014-11-01

    SIGAPS is a bibliometric software tool developed in France to identify and analyze Medline-indexed publications produced by a researcher or research group. The score takes into account the author's position on the paper along with the journal's prestige according to its impact factor within the research field. However, use of the impact factor is the primary limitation of SIGAPS. SIGAPS analysis results are used to assign a financial value to hospital facilities. The impact of the journal Revue de Chirurgie Orthopédique and its successor, Orthopaedics & Traumatology: Surgery & Research, was compared using the Medline-based ISI (SIGAPS) and SCOPUS-based SCImago journal rankings.
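By way of illustration, a SIGAPS-style score multiplies a journal-category weight by an author-position weight. The coefficients below follow commonly cited SIGAPS values but should be treated as assumptions of this sketch; consult the SIGAPS documentation for the authoritative figures:

```python
# Illustrative SIGAPS-style score: journal-category weight times
# author-position weight. The coefficients below are commonly cited
# SIGAPS values, included as assumptions for illustration only.

CATEGORY_POINTS = {"A": 8, "B": 6, "C": 4, "D": 3, "E": 2, "NC": 1}

def position_points(position, n_authors):
    if position == 1 or position == n_authors:   # first or last author
        return 4
    if position == 2:
        return 3
    if position == 3:
        return 2
    return 1

def sigaps_score(category, position, n_authors):
    return CATEGORY_POINTS[category] * position_points(position, n_authors)

# Last author on a category-B paper with six authors:
print(sigaps_score("B", 6, 6))   # -> 24
```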

  19. Optimizing the use of open-source software applications in drug discovery.

    PubMed

    Geldenhuys, Werner J; Gaasch, Kevin E; Watson, Mark; Allen, David D; Van der Schyf, Cornelis J

    2006-02-01

    Drug discovery is a time-consuming and costly process. Recently, a trend towards the use of in silico computational chemistry and molecular modeling for computer-aided drug design has gained significant momentum. This review investigates the application of free and/or open-source software in the drug discovery process. Among the reviewed software programs are applications programmed in JAVA, Perl and Python, as well as resources including software libraries. These programs might be useful for cheminformatics approaches to drug discovery, including QSAR studies, energy minimization and docking studies in drug design endeavors. Furthermore, this review explores options for integrating available open-source computer modeling software applications in drug discovery programs.

  20. Data-Interpolating Variational Analysis (DIVA) software : recent development and application

    NASA Astrophysics Data System (ADS)

    Watelet, Sylvain; Barth, Alexander; Troupin, Charles; Ouberdous, Mohamed; Beckers, Jean-Marie

    2014-05-01

    The Data-Interpolating Variational Analysis (DIVA) software is a tool designed to reconstruct a continuous field from discrete measurements. The method is based on the numerical implementation of the Variational Inverse Model (VIM), which consists of minimizing a cost function, allowing the choice of the analyzed field that best fits the data sets. The problem is solved efficiently using a finite-element method. This statistical method is particularly suited to dealing with irregularly spaced observations, producing outputs on a regular grid. Initially created to work in two dimensions, the software is now able to handle 3D or even 4D analyses, in order to easily produce ocean climatologies. These analyses can easily be improved by taking advantage of DIVA's ability to take topographic and dynamic constraints into account (coastal relief, prevailing wind impacting the advection, ...). In DIVA, we assume errors on measurements are not correlated, which means we do not consider the effect of correlated observation errors on the analysis and we therefore use a diagonal observation error covariance matrix. However, oceanographic data sets are generally clustered in space and time, thus introducing some correlation between observations. In order to determine the impact of such an approximation and provide strategies to mitigate its effects, we conducted several synthetic experiments with known correlation structure. Overall, the best results were obtained with a variant of the covariance inflation method. Finally, a new application of DIVA to satellite altimetry data will be presented: these data have particular space and time distributions, as they consist of repeated tracks (~10-35 days) of measurements with a distance of less than 10 km between two successive measurements in a given track. The tools designed to determine the analysis parameters were adapted to these specificities. 
Moreover, different weights were applied to measurements in order to
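The covariance-inflation idea in the abstract above can be illustrated with a toy optimal-interpolation analysis: inflating the diagonal observation-error variances down-weights a clustered pair of observations, mimicking the effect of accounting for their error correlation. This is a minimal NumPy sketch of the general principle, not DIVA's actual VIM implementation; all values and the inflation factor are invented.

```python
import numpy as np

def oi_analysis(y, H, B, R, xb):
    """Analysis x_a = x_b + K (y - H x_b) with gain K = B H^T (H B H^T + R)^-1."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return xb + K @ (y - H @ xb)

# Two clustered observations of the same scalar grid-point value.
xb = np.array([0.0])             # background estimate
H = np.array([[1.0], [1.0]])     # both observations see the same point
B = np.array([[1.0]])            # background error variance
y = np.array([1.0, 1.0])         # the (redundant) observations

R_diag = np.eye(2) * 0.5         # independent-error assumption
R_infl = np.eye(2) * 0.5 * 2.0   # inflated variances stand in for correlation

xa_diag = oi_analysis(y, H, B, R_diag, xb)   # pulled strongly toward the pair
xa_infl = oi_analysis(y, H, B, R_infl, xb)   # the clustered pair counts less
```

With the diagonal assumption the redundant pair is treated as two independent pieces of information; inflation reduces their combined weight toward that of a single observation.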

  1. Software applications toward quantitative metabolic flux analysis and modeling.

    PubMed

    Dandekar, Thomas; Fieselmann, Astrid; Majeed, Saman; Ahmed, Zeeshan

    2014-01-01

Metabolites and their pathways are central to adaptation and survival. Metabolic modeling elucidates in silico all possible flux pathways (flux balance analysis, FBA) and predicts the actual fluxes under a given condition; these models can be further refined by including experimental isotopologue data. In this review, we first introduce the key theoretical concepts and the different analysis steps in the modeling process, then compare flux calculation and metabolite analysis programs such as C13, BioOpt, COBRA toolbox, Metatool, efmtool, FiatFlux, ReMatch, VANTED, iMAT and YANA. Their respective strengths and limitations are discussed and compared to alternative software. While data analysis of metabolites, calculation of metabolic fluxes and pathways, and their condition-specific changes are all possible, we highlight the considerations that need to be taken into account before deciding on a specific software package. Current challenges in the field include the computation of large-scale networks (in elementary mode analysis), regulatory interactions, and detailed kinetics, and these are discussed in the light of powerful new approaches.
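Flux balance analysis, mentioned above, is at its core a linear program: maximize an objective flux subject to the steady-state mass balance S v = 0 and flux bounds. The three-reaction network below is invented purely for illustration (real models have hundreds to thousands of reactions), and the sketch uses SciPy's generic LP solver rather than any of the reviewed packages.

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: v0 (uptake -> A), v1 (A -> B), v2 (B -> product).
# Rows of S are metabolite balances; steady state requires S v = 0.
S = np.array([
    [1.0, -1.0,  0.0],   # metabolite A: produced by v0, consumed by v1
    [0.0,  1.0, -1.0],   # metabolite B: produced by v1, consumed by v2
])
c = [0.0, 0.0, -1.0]                     # linprog minimizes, so negate v2
bounds = [(0, 10), (0, 100), (0, 100)]   # uptake v0 capped at 10 units

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
# The chain forces v0 = v1 = v2, so the optimum is uptake-limited at 10.
```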

  2. Application of Gaia Analysis Software AGIS to Nano-JASMINE

    NASA Astrophysics Data System (ADS)

    Yamada, Y.; Lammers, U.; Gouda, N.

    2011-07-01

The core data reduction for the Nano-JASMINE mission is planned to be done with Gaia's Astrometric Global Iterative Solution (AGIS). Nano-JASMINE is an ultra-small (35 kg) satellite for astrometry observations in Japan, and Gaia is ESA's large (over 1000 kg) next-generation astrometry mission. The accuracy of Nano-JASMINE is about 3 mas, comparable to the Hipparcos mission, Gaia's predecessor of some 20 years ago. That such a small satellite can perform genuine scientific observations is a considerable challenge. The collaboration for sharing software started in 2007. In addition to the similar design and operating principles of the two missions, this is possible thanks to the encapsulation of all Gaia-specific aspects of AGIS in a Parameter Database. Nano-JASMINE will be the test bench for the Gaia AGIS software. We present this idea in detail and the practical steps necessary to make AGIS work with Nano-JASMINE data. We also show the key mission parameters, goals, and status of the data reduction for Nano-JASMINE.

  3. Application of parallelized software architecture to an autonomous ground vehicle

    NASA Astrophysics Data System (ADS)

    Shakya, Rahul; Wright, Adam; Shin, Young Ho; Momin, Orko; Petkovsek, Steven; Wortman, Paul; Gautam, Prasanna; Norton, Adam

    2011-01-01

This paper presents improvements made to Q, an autonomous ground vehicle designed to participate in the Intelligent Ground Vehicle Competition (IGVC). For the 2010 IGVC, Q was upgraded with a new parallelized software architecture and a new vision processor. Improvements were made to the power system, reducing the number of batteries required for operation from six to one. In previous years, a single state machine was used to execute the bulk of processing activities, including sensor interfacing, data processing, path planning, navigation algorithms, and motor control. This inefficient approach led to poor software performance and made the system difficult to maintain or modify. For IGVC 2010, the team implemented a modular parallel architecture using the National Instruments (NI) LabVIEW programming language. The new architecture divides all the necessary tasks (motor control, navigation, sensor data collection, etc.) into well-organized components that execute in parallel, providing considerable flexibility and facilitating efficient use of processing power. Computer vision is used to detect white lines on the ground and determine their location relative to the robot. With the new vision processor and some optimization of the image processing algorithm used the previous year, two frames can be acquired and processed in 70 ms. With all these improvements, Q placed 2nd in the autonomous challenge.
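The modular parallel pattern described above (independent components communicating rather than one monolithic state machine) can be sketched with threads and queues. The actual system was written in LabVIEW; this Python sketch is only an analogy, and the module names and data fields are invented.

```python
import queue
import threading

# Hypothetical pipeline: a sensor module feeds a navigation module,
# which emits steering commands; each runs in its own thread.
sensor_q = queue.Queue()
command_q = queue.Queue()

def sensor_module(n):
    for i in range(n):
        sensor_q.put({"line_offset": i * 0.1})   # fake vision measurement
    sensor_q.put(None)                            # shutdown sentinel

def navigation_module():
    while (reading := sensor_q.get()) is not None:
        # Trivial controller: steer opposite the detected line offset.
        command_q.put({"steer": -reading["line_offset"]})
    command_q.put(None)

threads = [threading.Thread(target=sensor_module, args=(3,)),
           threading.Thread(target=navigation_module)]
for t in threads:
    t.start()

commands = []
while (cmd := command_q.get()) is not None:   # motor-control side drains queue
    commands.append(cmd)
for t in threads:
    t.join()
```

Because each module blocks only on its own queue, adding or swapping a module does not disturb the others, which is the maintainability gain the team reports.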

  4. Software Defined Radio Standard Architecture and its Application to NASA Space Missions

    NASA Technical Reports Server (NTRS)

    Andro, Monty; Reinhart, Richard C.

    2006-01-01

A software defined radio (SDR) architecture used in space-based platforms proposes to standardize certain aspects of radio development, such as interface definitions, functional control and execution, and application software and firmware development. NASA has chartered a team to develop an open software defined radio hardware and software architecture to support NASA missions and determine the viability of an Agency-wide standard. A draft concept of the proposed standard has been released and discussed among organizations in the SDR community. Appropriate leveraging of the JTRS SCA, OMG's SWRadio Architecture, and other aspects is considered. A standard radio architecture offers potential value by enabling common waveform software instantiation, operation, testing, and software maintenance. While software defined radios offer greater flexibility, they also pose challenges to radio development for the space environment in terms of size, mass, power consumption, and available technology. An SDR architecture for space must recognize and address the constraints of space flight hardware and systems, along with flight heritage and culture. NASA is actively participating in the development of technology and standards related to software defined radios. As NASA considers a standard radio architecture for space communications, input and coordination from government agencies, industry, academia, and standards bodies is key to a successful architecture. The unique aspects of space require thorough investigation of relevant terrestrial technologies properly adapted to space. The talk will describe NASA's current effort to investigate SDR applications to space missions and give a brief overview of a candidate architecture under consideration for space-based platforms.

  5. A wind tunnel application of large-field focusing schlieren

    NASA Technical Reports Server (NTRS)

    Ponton, Michael K.; Seiner, John M.; Mitchell, L. K.; Manning, James C.; Jansen, Bernard J.; Lagen, Nicholas T.

    1992-01-01

    A large-field focusing schlieren apparatus was installed in the NASA Lewis Research Center 9 by 15 foot wind tunnel in an attempt to determine the density gradient flow field of a free jet issuing from a supersonic nozzle configuration. The nozzle exit geometry was designed to reduce acoustic emissions from the jet by enhancing plume mixing. Thus, the flow exhibited a complex three-dimensional structure which warranted utilizing the sharp focusing capability of this type of schlieren method. Design considerations concerning tunnel limitations, high-speed photography, and video tape recording are presented in the paper.

  6. Comprehensive, powerful, efficient, intuitive: a new software framework for clinical imaging applications

    NASA Astrophysics Data System (ADS)

    Augustine, Kurt E.; Holmes, David R., III; Hanson, Dennis P.; Robb, Richard A.

    2006-03-01

    One of the greatest challenges for a software engineer is to create a complex application that is comprehensive enough to be useful to a diverse set of users, yet focused enough for individual tasks to be carried out efficiently with minimal training. This "powerful yet simple" paradox is particularly prevalent in advanced medical imaging applications. Recent research in the Biomedical Imaging Resource (BIR) at Mayo Clinic has been directed toward development of an imaging application framework that provides powerful image visualization/analysis tools in an intuitive, easy-to-use interface. It is based on two concepts very familiar to physicians - Cases and Workflows. Each case is associated with a unique patient and a specific set of routine clinical tasks, or a workflow. Each workflow is comprised of an ordered set of general-purpose modules which can be re-used for each unique workflow. Clinicians help describe and design the workflows, and then are provided with an intuitive interface to both patient data and analysis tools. Since most of the individual steps are common to many different workflows, the use of general-purpose modules reduces development time and results in applications that are consistent, stable, and robust. While the development of individual modules may reflect years of research by imaging scientists, new customized workflows based on the new modules can be developed extremely fast. If a powerful, comprehensive application is difficult to learn and complicated to use, it will be unacceptable to most clinicians. Clinical image analysis tools must be intuitive and effective or they simply will not be used.
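The Case/Workflow design described above is essentially composition of reusable processing steps over per-patient data. A minimal sketch of that structure follows; the class names, module names, and data fields are invented for illustration and do not come from the BIR framework.

```python
# Hypothetical sketch of the Case/Workflow pattern: each workflow is an
# ordered list of general-purpose modules applied to one patient case.
class Module:
    """A reusable processing step; concrete modules override run()."""
    def run(self, case):
        raise NotImplementedError

class LoadImages(Module):
    def run(self, case):
        return dict(case, images=["ct_scan"])          # stand-in for real I/O

class Segment(Module):
    def run(self, case):
        return dict(case, mask="segmented:" + case["images"][0])

class Workflow:
    """Chains modules; new workflows reuse existing modules in new orders."""
    def __init__(self, modules):
        self.modules = modules

    def run(self, case):
        for module in self.modules:
            case = module.run(case)
        return case

case = {"patient_id": "anon-001"}                      # one unique patient
result = Workflow([LoadImages(), Segment()]).run(case)
```

Because each module is self-contained, a new clinical task is assembled by reordering or extending the module list rather than writing a new application, which is the development-time saving the abstract claims.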

  7. Office Computer Software: A Comprehensive Review of Software Programs.

    ERIC Educational Resources Information Center

    Secretary, 1992

    1992-01-01

    Describes types of software including system software, application software, spreadsheets, accounting software, graphics packages, desktop publishing software, database, desktop and personal information management software, project and records management software, groupware, and shareware. (JOW)

  8. Automated Translation of Safety Critical Application Software Specifications into PLC Ladder Logic

    NASA Technical Reports Server (NTRS)

    Leucht, Kurt W.; Semmel, Glenn S.

    2008-01-01

    The numerous benefits of automatic application code generation are widely accepted within the software engineering community. A few of these benefits include raising the abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at the NASA Kennedy Space Center (KSC) recognized the need for PLC code generation while developing their new ground checkout and launch processing system. They developed a process and a prototype software tool that automatically translates a high-level representation or specification of safety critical application software into ladder logic that executes on a PLC. This process and tool are expected to increase the reliability of the PLC code over that which is written manually, and may even lower life-cycle costs and shorten the development schedule of the new control system at KSC. This paper examines the problem domain and discusses the process and software tool that were prototyped by the KSC software engineers.
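The core of the translation step above is mapping a high-level specification of an interlock onto PLC logic. The KSC tool and its specification format are not public here, so the following is a hypothetical mini-translator: it renders a boolean interlock rule as instruction-list-style text (a common textual companion to ladder logic); the rule, mnemonics, and signal names are all invented.

```python
# Hypothetical spec-to-PLC-text translator: AND together a rule's input
# conditions and drive one output coil, in instruction-list style.
def translate(rule):
    """rule is (output_name, [input condition names])."""
    out, terms = rule
    lines = [f"LD  {terms[0]}"]                 # load the first contact
    lines += [f"AND {t}" for t in terms[1:]]    # series (ANDed) contacts
    lines.append(f"OUT {out}")                  # energize the output coil
    return "\n".join(lines)

spec = ("OPEN_VALVE", ["PRESSURE_OK", "DOOR_CLOSED", "ARM_SWITCH"])
print(translate(spec))
```

Generating such text mechanically from a reviewed specification is what gives the approach its consistency and reliability advantage over hand-written rungs.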

  9. Applications of focused ion beam systems in gunshot residue investigation.

    PubMed

    Niewöhner, L; Wenz, H W

    1999-01-01

    Scanning ion microscopy technology has opened a new door to forensic scientists, allowing the GSR investigator to see inside a particle's core. Using a focused ion beam, particles can be cross-sectioned, revealing interior morphology and character that can be utilized for identification of the ammunition manufacturer.

  10. Generation of Focused Shock Waves in Water for Biomedical Applications

    NASA Astrophysics Data System (ADS)

    Lukeš, Petr; Šunka, Pavel; Hoffer, Petr; Stelmashuk, Vitaliy; Beneš, Jiří; Poučková, Pavla; Zadinová, Marie; Zeman, Jan

The physical characteristics of focused two-successive (tandem) shock waves (FTSW) in water and their biological effects are presented. FTSW were generated by underwater multichannel electrical discharges in a highly conductive saline solution using two porous ceramic-coated cylindrical electrodes of different diameter and surface area. The primary cylindrical pressure wave generated at each composite electrode was focused by a metallic parabolic reflector to a common focal point to form two strong shock waves with a variable time delay between them. The pressure field and the interaction between the first and second shock waves at the focus were investigated using schlieren photography and polyvinylidene fluoride (PVDF) shock gauge sensors. The strongest interaction was obtained for a time delay of 8-15 μs between the waves, driving the negative-pressure phase of the second shock wave down to -80 MPa and producing a large number of cavitation bubbles at the focus. The biological effects of FTSW were demonstrated in vitro by damage to B16 melanoma cells, and in vivo by targeted lesions in the thigh muscles of rabbits and by the growth delay of sarcoma tumors in Lewis rats treated with FTSW, compared to untreated controls.

  11. An experimental investigation of fault tolerant software structures in an avionics application

    NASA Technical Reports Server (NTRS)

    Caglayan, Alper K.; Eckhardt, Dave E., Jr.

    1989-01-01

    The objective of this experimental investigation is to compare the functional performance and software reliability of competing fault tolerant software structures utilizing software diversity. In this experiment, three versions of the redundancy management software for a skewed sensor array have been developed using three diverse failure detection and isolation algorithms and incorporated into various N-version, recovery block and hybrid software structures. The empirical results show that, for maximum functional performance improvement in the selected application domain, the results of diverse algorithms should be voted before being processed by multiple versions without enforced diversity. Results also suggest that when the reliability gain with an N-version structure is modest, recovery block structures are more feasible since higher reliability can be obtained using an acceptance check with a modest reliability.
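The recovery-block structure compared in the study above runs a primary algorithm, applies an acceptance test to its result, and falls back to an alternate version only on failure. This is a generic sketch of that pattern; the algorithms and acceptance check are illustrative stand-ins, not the redundancy-management software from the experiment.

```python
import math
import cmath

def recovery_block(variants, accept, x):
    """Try each variant in order; return the first result the acceptance
    test approves. A raised exception counts as a failed variant."""
    for algorithm in variants:
        try:
            result = algorithm(x)
        except Exception:
            continue
        if accept(x, result):
            return result
    raise RuntimeError("all variants rejected by the acceptance test")

primary = math.sqrt                            # fast, but raises for x < 0
alternate = cmath.sqrt                         # slower, handles the full domain
accept = lambda x, r: abs(r * r - x) < 1e-9    # a modest acceptance check

recovery_block([primary, alternate], accept, 9.0)    # primary succeeds
recovery_block([primary, alternate], accept, -4.0)   # falls back to alternate
```

Note that the whole structure is only as reliable as the acceptance test, which matches the study's finding that recovery blocks pay off when even a modestly reliable check is available.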

  12. Conceptions of Software Development by Project Managers: A Study of Managing the Outsourced Development of Software Applications for United States Federal Government Agencies

    ERIC Educational Resources Information Center

    Eisen, Daniel

    2013-01-01

    This study explores how project managers, working for private federal IT contractors, experience and understand managing the development of software applications for U.S. federal government agencies. Very little is known about how they manage their projects in this challenging environment. Software development is a complex task and only grows in…

  13. Cold atomic beam ion source for focused ion beam applications

    SciTech Connect

    Knuffman, B.; Steele, A. V.; McClelland, J. J.

    2013-07-28

We report measurements and modeling of an ion source that is based on ionization of a laser-cooled atomic beam. We show a high brightness and a low energy spread, suitable for use in next-generation, high-resolution focused ion beam systems. Our measurements of total ion current as a function of ionization conditions support an analytical model that also predicts the cross-sectional current density and spatial distribution of ions created in the source. The model predicts a peak brightness of 2 × 10^7 A m^-2 sr^-1 eV^-1 and an energy spread less than 0.34 eV. The model is also combined with Monte-Carlo simulations of the inter-ion Coulomb forces to show that the source can be operated at several picoamperes with a brightness above 1 × 10^7 A m^-2 sr^-1 eV^-1. We estimate that when combined with a conventional ion focusing column, an ion source with these properties could focus a 1 pA beam into a spot smaller than 1 nm. A total current greater than 5 nA was measured in a lower-brightness configuration of the ion source, demonstrating the possibility of a high current mode of operation.

  14. Application of an impedance matching transformer to a plasma focus.

    PubMed

    Bures, B L; James, C; Krishnan, M; Adler, R

    2011-10-01

    A plasma focus was constructed using an impedance matching transformer to improve power transfer between the pulse power and the dynamic plasma load. The system relied on two switches and twelve transformer cores to produce a 100 kA pulse in short circuit on the secondary at 27 kV on the primary with 110 J stored. With the two transformer systems in parallel, the Thevenin equivalent circuit parameters on the secondary side of the driver are: C = 10.9 μF, V(0) = 4.5 kV, L = 17 nH, and R = 5 mΩ. An equivalent direct drive circuit would require a large number of switches in parallel, to achieve the same Thevenin equivalent. The benefits of this approach are replacement of consumable switches with non-consumable transformer cores, reduction of the driver inductance and resistance as viewed by the dynamic load, and reduction of the stored energy to produce a given peak current. The system is designed to operate at 100 Hz, so minimizing the stored energy results in less load on the thermal management system. When operated at 1 Hz, the neutron yield from the transformer matched plasma focus was similar to the neutron yield from a conventional (directly driven) plasma focus at the same peak current.
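The Thevenin parameters quoted above can be sanity-checked with the textbook lossless-RLC estimate of peak discharge current, I ≈ V0·sqrt(C/L). This is a back-of-envelope sketch using the abstract's numbers, not the authors' circuit model; series resistance and the dynamic load damp the ideal value toward the measured ~100 kA.

```python
import math

# Secondary-side Thevenin parameters from the abstract.
C = 10.9e-6    # capacitance, F
V0 = 4.5e3     # charge voltage, V
L = 17e-9      # inductance, H
R = 5e-3       # resistance, ohm

Z0 = math.sqrt(L / C)        # characteristic impedance, ~0.04 ohm
I_undamped = V0 / Z0         # lossless peak-current estimate, ~114 kA
```

That the undamped estimate lands just above the measured 100 kA short-circuit current suggests the quoted parameters are internally consistent.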

  15. Cold atomic beam ion source for focused ion beam applications

    NASA Astrophysics Data System (ADS)

    Knuffman, B.; Steele, A. V.; McClelland, J. J.

    2013-07-01

We report measurements and modeling of an ion source that is based on ionization of a laser-cooled atomic beam. We show a high brightness and a low energy spread, suitable for use in next-generation, high-resolution focused ion beam systems. Our measurements of total ion current as a function of ionization conditions support an analytical model that also predicts the cross-sectional current density and spatial distribution of ions created in the source. The model predicts a peak brightness of 2 × 10^7 A m^-2 sr^-1 eV^-1 and an energy spread less than 0.34 eV. The model is also combined with Monte-Carlo simulations of the inter-ion Coulomb forces to show that the source can be operated at several picoamperes with a brightness above 1 × 10^7 A m^-2 sr^-1 eV^-1. We estimate that when combined with a conventional ion focusing column, an ion source with these properties could focus a 1 pA beam into a spot smaller than 1 nm. A total current greater than 5 nA was measured in a lower-brightness configuration of the ion source, demonstrating the possibility of a high current mode of operation.

  16. Application of Regulatory Focus Theory to Search Advertising

    PubMed Central

    Mowle, Elyse N.; Georgia, Emily J.; Doss, Brian D.; Updegraff, John A.

    2015-01-01

Purpose: The purpose of this paper is to test the utility of regulatory focus theory principles in a real-world setting; specifically, Internet-hosted text advertisements. The effect of compatibility of the ad text with the regulatory focus of the consumer was examined. Design/methodology/approach: Advertisements were created using Google AdWords. Data were collected for the number of views and clicks each ad received. The effect of regulatory fit was measured using logistic regression. Findings: Logistic regression analyses demonstrated that there was a strong main effect for keyword, such that users were almost six times as likely to click on a promotion advertisement as a prevention advertisement, as well as a main effect for compatibility, such that users were twice as likely to click on an advertisement with content that was consistent with their keyword. Finally, there was a strong interaction of these two variables, such that the effect of consistent advertisements was stronger for promotion searches than for prevention searches. Research limitations/implications: The effect of ad compatibility had medium to large effect sizes, suggesting that individuals' state may have more influence on advertising response than do individuals' traits (e.g. personality traits). Measurement of regulatory fit was limited by the constraints of Google AdWords. Practical implications: The results of this study provide a possible framework for ad creation for Internet advertisers. Originality/value: This paper is the first study to demonstrate the utility of regulatory focus theory in online advertising. PMID:26430293

  17. User Manual for the Data-Series Interface of the Gr Application Software

    USGS Publications Warehouse

    Donovan, John M.

    2009-01-01

    This manual describes the data-series interface for the Gr Application software. Basic tasks such as plotting, editing, manipulating, and printing data series are presented. The properties of the various types of data objects and graphical objects used within the application, and the relationships between them also are presented. Descriptions of compatible data-series file formats are provided.

  18. Cost Effective Applications of High Integrity Software Processes

    DTIC Science & Technology

    2011-05-18

Topics covered include: inspections/peer reviews • checklists • programming languages and coding standards • static code analysis • code complexity • unit testing ... from their own perspective. (© 2011 Lockheed Martin Corporation, AER201103026) Inspections/peer reviews reduce costly rework by focusing on defect removal ... peer reviews can remove up to 80 percent of defects. It doesn't have to be hard: reviews can be of many different types (very formal

  19. Generic domain models in software engineering

    NASA Technical Reports Server (NTRS)

    Maiden, Neil

    1992-01-01

    This paper outlines three research directions related to domain-specific software development: (1) reuse of generic models for domain-specific software development; (2) empirical evidence to determine these generic models, namely elicitation of mental knowledge schema possessed by expert software developers; and (3) exploitation of generic domain models to assist modelling of specific applications. It focuses on knowledge acquisition for domain-specific software development, with emphasis on tool support for the most important phases of software development.

  20. Focusing particle concentrator with application to ultrafine particles

    DOEpatents

    Hering, Susanne; Lewis, Gregory; Spielman, Steven R.

    2013-06-11

    Technology is presented for the high efficiency concentration of fine and ultrafine airborne particles into a small fraction of the sampled airflow by condensational enlargement, aerodynamic focusing and flow separation. A nozzle concentrator structure including an acceleration nozzle with a flow extraction structure may be coupled to a containment vessel. The containment vessel may include a water condensation growth tube to facilitate the concentration of ultrafine particles. The containment vessel may further include a separate carrier flow introduced at the center of the sampled flow, upstream of the acceleration nozzle of the nozzle concentrator to facilitate the separation of particle and vapor constituents.

  1. A beamline matching application based on open source software

    SciTech Connect

    2000-12-21

An interactive Beamline Matching application has been developed using beamline and automatic differentiation class libraries. Various freely available components were used; in particular, the user interface is based on FLTK, a C++ toolkit distributed under the terms of the GNU General Public License (GPL). The result is an application that compiles without modifications under both X-Windows and Win32 and offers the same look and feel under both operating environments. In this paper, we discuss some of the practical issues that were confronted and the choices that were made. In particular, we discuss object-based event propagation mechanisms, multithreading, language mixing, and persistence.

  2. Directory of Industry and University Collaborations with a Focus on Software Engineering Education and Training, Version 6

    DTIC Science & Technology

    1997-11-01

pointer to potential new client bases. A short bibliography points the reader to background material on software engineering curricula, coalitions... studies and influence on course material) is a condition of the funding. Points of contact for further information: Dr. Jacob Slonim, Head of Research... In the 1995-96 academic year, the Computer Science Department experimented with a new approach to teaching its first two courses in the undergraduate

  3. Beehive: A Software Application for Synchronous Collaborative Learning

    ERIC Educational Resources Information Center

    Turani, Aiman; Calvo, Rafael A.

    2006-01-01

    Purpose: The purpose of this paper is to describe Beehive, a new web application framework for designing and supporting synchronous collaborative learning. Design/methodology/approach: Our web engineering approach integrates educational design expertise into a technology for building tools for collaborative learning activities. Beehive simplifies…

  4. A lightweight focusing reflector concept for space power applications

    NASA Astrophysics Data System (ADS)

    Wallace, T.; Bussard, R. W.

    A very lightweight membrane mirror system which can function as a flat or concave mirror and has applications in space power systems is described. The structural properties, including steady-state design and dynamic effects, are addressed along with optical properties. Operational issues are briefly discussed, including orbit stabilization, deformation by solar pressure, and pointing control. The design of the mirror provides a simple means of altering the mirror focal length.

  5. Scaling Irregular Applications through Data Aggregation and Software Multithreading

    SciTech Connect

    Morari, Alessandro; Tumeo, Antonino; Chavarría-Miranda, Daniel; Villa, Oreste; Valero, Mateo

    2014-05-30

Bioinformatics, data analytics, semantic databases, and knowledge discovery are emerging high performance application areas that exploit dynamic, linked data structures such as graphs, unbalanced trees, or unstructured grids. These data structures are usually very large, requiring significantly more memory than is available on single shared-memory systems. Additionally, these data structures are difficult to partition on distributed-memory systems. They also present poor spatial and temporal locality, thus generating unpredictable memory and network accesses. The Partitioned Global Address Space (PGAS) programming model seems suitable for these applications, because it provides a shared-memory abstraction across distributed-memory clusters. However, current PGAS languages and libraries are built to target regular remote data accesses and block transfers. Furthermore, they usually rely on the Single Program Multiple Data (SPMD) parallel control model, which is not well suited to the fine-grained, dynamic, and unbalanced parallelism of irregular applications. In this paper we present GMT (Global Memory and Threading library), a custom runtime library that enables efficient execution of irregular applications on commodity clusters. GMT integrates a PGAS data substrate with simple fork/join parallelism and provides automatic load balancing on a per-node basis. It implements multi-level aggregation and lightweight multithreading to maximize memory and network bandwidth with fine-grained data accesses and to tolerate long data access latencies. A key innovation in the GMT runtime is its thread specialization (workers, helpers, and communication threads) that realizes the overall functionality. We compare our approach with other PGAS models, such as UPC running over GASNet, and with hand-optimized MPI code on a set of typical large-scale irregular applications, demonstrating speedups of an order of magnitude.
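The aggregation idea central to GMT (buffering many fine-grained remote requests per destination and shipping them as one block) can be sketched in a few lines. This is a toy illustration of the general technique only; the threshold, message format, and class names are invented and bear no relation to GMT's actual implementation.

```python
from collections import defaultdict

class Aggregator:
    """Buffer fine-grained requests per destination node; ship a whole
    block once a threshold is reached, amortizing per-message overhead."""
    def __init__(self, threshold, send):
        self.threshold = threshold
        self.send = send                      # callback: send(node, block)
        self.buffers = defaultdict(list)

    def put(self, node, request):
        buf = self.buffers[node]
        buf.append(request)
        if len(buf) >= self.threshold:        # block full: ship it
            self.send(node, buf)
            self.buffers[node] = []

    def flush(self):
        for node, buf in self.buffers.items():
            if buf:
                self.send(node, buf)          # drain partial blocks
        self.buffers.clear()

sent = []
agg = Aggregator(3, lambda node, block: sent.append((node, list(block))))
for i in range(7):
    agg.put(i % 2, ("load", i))               # fine-grained accesses, 2 nodes
agg.flush()
```

Seven single-word requests become three network messages here; at scale this trade of latency for bandwidth is what lets fine-grained irregular access patterns saturate the network.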

  6. Focused Magnetic Resonance Coupling Coils for Electromagnetic Therapy Applications.

    PubMed

    Yeung, Sai Ho; Pradhan, Raunaq; Feng, Xiaohua; Zheng, Yuanjin

    2015-11-01

This paper presents the design and construction of a pair of figure-of-eight coils, coupled by magnetic resonance coupling (MRC), which can generate an electric field of 150 V/m per ampere at the focal points for electromagnetic therapy related applications. The E field generated at the targeted site is significantly enhanced, for the same current flowing through the MRC figure-of-eight coils compared to normal coils, due to the superposition of the E fields contributed by the coils. The MRC figure-of-eight coil is designed and the results are verified in theory, simulation, and experiments. In the ex vivo tissue measurement, 35% current and 82% ohmic power improvements were observed. Since they can enhance the current and ohmic power, the MRC figure-of-eight coils are a promising solution for electromagnetic therapy applications. The potential applications of the coils include noninvasive radio frequency (RF) stimulation, thermoacoustic imaging, electromagnetic field therapies, and RF ablation.

  7. An overview of PET/MR, focused on clinical applications.

    PubMed

    Catalano, Onofrio Antonio; Masch, William Roger; Catana, Ciprian; Mahmood, Umar; Sahani, Dushyant Vasudeo; Gee, Michael Stanley; Menezes, Leon; Soricelli, Andrea; Salvatore, Marco; Gervais, Debra; Rosen, Bruce Robert

    2017-02-01

Hybrid PET/MR scanners are innovative imaging devices that simultaneously or sequentially acquire and fuse anatomical and functional data from magnetic resonance (MR) with metabolic information from positron emission tomography (PET) (Delso et al. in J Nucl Med 52:1914-1922, 2011; Zaidi et al. in Phys Med Biol 56:3091-3106, 2011). Hybrid PET/MR scanners have the potential to greatly impact not only medical research but also, and more importantly, patient management. Although their clinical applications are still under investigation, the increased worldwide availability of PET/MR scanners and the growing published literature are important determinants of their rising utilization for primarily clinical applications. In this manuscript, we provide a summary of the physical features of PET/MR, including its limitations, that are most relevant to clinical PET/MR implementation and interpretation. Thereafter, we discuss the most important current and emergent clinical applications of this hybrid technology in the abdomen and pelvis, in both oncologic and non-oncologic imaging, and we provide, when possible, a comparison with clinically consolidated imaging techniques, such as PET/CT.

  8. The Enterprise Derivative Application: Flexible Software for Optimizing Manufacturing Processes

    SciTech Connect

    Ward, Richard C; Allgood, Glenn O; Knox, John R

    2008-11-01

    The Enterprise Derivative Application (EDA) implements the enterprise-derivative analysis for optimization of an industrial process (Allgood and Manges, 2001). It is a tool to help industry planners choose the most productive way of manufacturing their products while minimizing their cost. Developed in MS Access, the application allows users to input initial data ranging from raw material to variable costs and enables the tracking of specific information as material is passed from one process to another. Energy-derivative analysis is based on calculation of sensitivity parameters. For the specific application to a steel production process these include: the cost to product sensitivity, the product to energy sensitivity, the energy to efficiency sensitivity, and the efficiency to cost sensitivity. Using the EDA, for all processes the user can display a particular sensitivity or all sensitivities can be compared for all processes. Although energy-derivative analysis was originally designed for use by the steel industry, it is flexible enough to be applied to many other industrial processes. Examples of processes where energy-derivative analysis would prove useful are wireless monitoring of processes in the petroleum cracking industry and wireless monitoring of motor failure for determining the optimum time to replace motor parts. One advantage of the MS Access-based application is its flexibility in defining the process flow and establishing the relationships between parent and child process and products resulting from a process. Due to the general design of the program, a process can be anything that occurs over time with resulting output (products). So the application can be easily modified to many different industrial and organizational environments. Another advantage is the flexibility of defining sensitivity parameters. Sensitivities can be determined between all possible variables in the process flow as a function of time. Thus the dynamic development of the
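The sensitivity parameters at the heart of the analysis above are derivatives of one process quantity with respect to another, and chained sensitivities follow the chain rule. The sketch below estimates two such sensitivities by finite differences and chains them; the functional forms and numbers are invented for illustration and are not taken from the EDA.

```python
def sensitivity(f, x, h=1e-6):
    """Central-difference estimate of the derivative df/dx at x."""
    return (f(x + h) - f(x - h)) / (2 * h)

# Invented toy relationships for a production step:
energy = lambda product_tons: 2.0 * product_tons       # MWh used per output
cost = lambda energy_mwh: 50.0 * energy_mwh + 1000.0   # $ as a function of MWh

product_to_energy = sensitivity(energy, 100.0)   # ~2.0 MWh per ton
energy_to_cost = sensitivity(cost, 200.0)        # ~50.0 $ per MWh

# Chain rule: sensitivity of cost to product output is the product
# of the intermediate sensitivities (~100 $ per ton here).
product_to_cost = product_to_energy * energy_to_cost
```

Comparing such chained sensitivities across processes is what lets a planner see which stage of the flow dominates cost, which is the comparison the EDA displays.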

  9. Applications of software-defined radio (SDR) technology in hospital environments.

    PubMed

    Chávez-Santiago, Raúl; Mateska, Aleksandra; Chomu, Konstantin; Gavrilovska, Liljana; Balasingham, Ilangko

    2013-01-01

    A software-defined radio (SDR) is a radio communication system where the major part of its functionality is implemented by means of software in a personal computer or embedded system. Such a design paradigm has the major advantage of producing devices that can receive and transmit widely different radio protocols based solely on the software used. This flexibility opens several application opportunities in hospital environments, where a large number of wired and wireless electronic devices must coexist in confined areas like operating rooms and intensive care units. This paper outlines some possible applications in the 2360-2500 MHz frequency band. These applications include the integration of wireless medical devices in a common communication platform for seamless interoperability, and cognitive radio (CR) for body area networks (BANs) and wireless sensor networks (WSNs) for medical environmental surveillance. The description of a proof-of-concept CR prototype is also presented.

  10. A model of cloud application assignments in software-defined storages

    NASA Astrophysics Data System (ADS)

    Bolodurina, Irina P.; Parfenov, Denis I.; Polezhaev, Petr N.; Shukhman, Alexander E.

    2017-01-01

    The aim of this study is to analyze the structure and mechanisms of interaction of typical cloud applications and to suggest approaches to optimizing their placement in storage systems. In this paper, we describe a generalized model of cloud applications including the three basic layers: a model of application, a model of service, and a model of resource. The distinctive feature of the suggested model is that it analyzes cloud resources both from the user point of view and from the point of view of a software-defined infrastructure of the virtual data center (DC). The innovative character of this model lies in describing the application data placements and, at the same time, the state of the virtual environment, taking into account the network topology. The model of software-defined storage has been developed as a submodel within the resource model. This model allows implementing the algorithm for control of cloud application assignments in software-defined storages. Experimental research showed that this algorithm decreases cloud application response time and increases performance in processing user requests. The use of software-defined data storages allows a decrease in the number of physical storage devices, which demonstrates the efficiency of our algorithm.

  11. Compact plasma focus devices: Flexible laboratory sources for applications

    SciTech Connect

    Lebert, R.; Engel, A.; Bergmann, K.; Treichel, O.; Gavrilescu, C.; Neff, W.

    1997-05-05

    Small pinch plasma devices are intense sources of pulsed XUV-radiation. Because of their low costs and compact sizes, pinch plasmas seem well suited to supplement research activities based on synchrotrons. With correct optimisation, both continuous radiation and narrowband line radiation can be tailored for specific applications. For the special demand of optimising narrowband emission from these plasmas, the scaling of K-shell line emission of intermediate atomic number pinch plasmas with respect to device parameters has been studied. Scaling laws, especially those taking into account the transient behaviour of the pinch plasma, give design criteria. Investigations of the transition between column and micropinch mode offer predictable access to shorter wavelengths and smaller source sizes. Results were achieved on proximity x-ray lithography, imaging and contact x-ray microscopy, x-ray fluorescence (XFA) microscopy, and photo-electron spectroscopy (XPS).

  12. Engineering of Data Acquiring Mobile Software and Sustainable End-User Applications

    NASA Technical Reports Server (NTRS)

    Smith, Benton T.

    2013-01-01

    The criteria by which data acquiring software and its supporting infrastructure should be designed should take the following two points into account: the reusability and organization of stored online and remote data and content, and an assessment of whether abandoning a platform-optimized design in favor of a multi-platform solution significantly reduces the performance of an end-user application. Furthermore, in-house applications that control or process instrument-acquired data for end-users should be designed with a communication and control interface such that the application's modules can be reused as plug-in modular components in greater software systems. The above is applied using two loosely related projects: a mobile application, and a website containing live and simulated data. For the intelligent devices mobile application AIDM, the end-user interface has a platform- and data-type-optimized design, while the database and back-end applications store this information in an organized manner and restrict access to that data to authorized end-user application(s). Finally, the content for the website was derived from a database such that the content can be included in, and uniform across, all applications accessing it. With these projects being ongoing, I have concluded from my research that the applicable methods presented are feasible for both projects, and that a multi-platform design for the mobile application only marginally drops its performance.

  13. Using WWW to Improve Software Development and Maintenance: Application of the Light System to Aleph Programs

    NASA Astrophysics Data System (ADS)

    Aimar, A.; Aimar, M.; Khodabandeh, A.; Palazzi, P.; Rousseau, B.; Ruggier, M.; Cattaneo, M.; Comas Illas, P.

    Programmers who develop, use, maintain, or modify software are faced with the problem of scanning and understanding large amounts of documents, ranging from source code to requirements, analysis and design diagrams, user and reference manuals, etc. This task is non-trivial and time consuming because of the number and size of documents and the many implicit cross-references that they contain. In large distributed development teams, where software and related documents are produced at various sites, the problem can be even more severe. LIGHT, Life cycle Global HyperText, is an attempt to solve the problem using WWW technology. The basic idea is to make all the software documents, including code, available and cross-connected on the WWW. The first application of this concept to go into production is JULIA/LIGHT, a system to convert and publish on WWW the software documentation of the JULIA reconstruction program of the ALEPH experiment at CERN, the European Organisation for Particle Physics, Geneva.

  14. Imaging In focus: Reflected light imaging: Techniques and applications.

    PubMed

    Guggenheim, Emily J; Lynch, Iseult; Rappoport, Joshua Z

    2017-02-01

    Reflectance imaging is a broad term that describes the formation of images by the detection of illumination light that is back-scattered from reflective features within a sample. Reflectance imaging can be performed in a variety of different configurations, such as confocal, oblique angle illumination, structured illumination, interferometry and total internal reflectance, permitting a plethora of biomedical applications. Reflectance imaging has proven indispensable for critical investigations into the safety and understanding of biomedically and environmentally relevant nanomaterials, an area of high priority and investment. The non-destructive in vivo imaging ability of reflectance techniques permits alternative diagnostic strategies that may eventually facilitate the eradication of some invasive biopsy procedures. Reflectance can also provide additional structural information and clarity necessary in fluorescence-based in vivo studies. Near-coverslip interrogation techniques, such as reflectance interferometry and total internal reflection, have provided a label-free means to investigate cell-surface contacts, cell motility and vesicle trafficking in vivo and in vitro. Other key advances include the ability to acquire super-resolution reflectance images, providing increased spatial resolution.

  15. Internet-based hardware/software co-design framework for embedded 3D graphics applications

    NASA Astrophysics Data System (ADS)

    Yeh, Chi-Tsai; Wang, Chun-Hao; Huang, Ing-Jer; Wong, Weng-Fai

    2011-12-01

    Advances in technology are making it possible to run three-dimensional (3D) graphics applications on embedded and handheld devices. In this article, we propose a hardware/software co-design environment for 3D graphics application development that includes the 3D graphics software, OpenGL ES application programming interface (API), device driver, and 3D graphics hardware simulators. We developed a 3D graphics system-on-a-chip (SoC) accelerator using transaction-level modeling (TLM). This gives software designers early access to the hardware even before it is ready. On the other hand, hardware designers also stand to gain from the more complex test benches made available in the software for verification. A unique aspect of our framework is that it allows hardware and software designers from geographically dispersed areas to cooperate and work on the same framework. Designs can be entered and executed from anywhere in the world without full access to the entire framework, which may include proprietary components. This results in controlled and secure transparency and reproducibility, granting leveled access to users of various roles.

  16. Cloud computing geospatial application for water resources based on free and open source software and open standards - a prototype

    NASA Astrophysics Data System (ADS)

    Delipetrev, Blagoj

    2016-04-01

    Presently, most of the existing software is desktop-based, designed to work on a single computer, which represents a major limitation in many ways, starting from limited computer processing, storage power, accessibility, availability, etc. The only feasible solution lies in the web and cloud. This abstract presents research and development of a cloud computing geospatial application for water resources based on free and open source software and open standards, using a hybrid deployment model of public - private cloud, running on two separate virtual machines (VMs). The first one (VM1) is running on Amazon web services (AWS) and the second one (VM2) is running on a Xen cloud platform. The presented cloud application is developed using free and open source software, open standards and prototype code. The cloud application presents a framework for developing specialized cloud geospatial applications that need only a web browser to be used. This cloud application is the ultimate collaboration geospatial platform because multiple users across the globe with an internet connection and browser can jointly model geospatial objects, enter attribute data and information, execute algorithms, and visualize results. The presented cloud application is: available all the time, accessible from everywhere, scalable, works in a distributed computer environment, creates a real-time multiuser collaboration platform, the programming language code and components are interoperable, and it is flexible in including additional components. The cloud geospatial application is implemented as a specialized water resources application with three web services for 1) data infrastructure (DI), 2) support for water resources modelling (WRM), 3) user management. The web services are running on two VMs that are communicating over the internet providing services to users. The application was tested on the Zletovica river basin case study with concurrent multiple users. The application is a state

  17. MAPI: a software framework for distributed biomedical applications

    PubMed Central

    2013-01-01

    Background The amount of web-based resources (databases, tools etc.) in biomedicine has increased, but the integrated usage of those resources is complex due to differences in access protocols and data formats. However, distributed data processing is becoming inevitable in several domains, in particular in biomedicine, where researchers face rapidly increasing data sizes. This big data is difficult to process locally because of the large processing, memory and storage capacity required. Results This manuscript describes a framework, called MAPI, which provides a uniform representation of resources available over the Internet, in particular for Web Services. The framework enhances their interoperability and collaborative use by enabling a uniform and remote access. The framework functionality is organized in modules that can be combined and configured in different ways to fulfil concrete development requirements. Conclusions The framework has been tested in the biomedical application domain where it has been a base for developing several clients that are able to integrate different web resources. The MAPI binaries and documentation are freely available at http://www.bitlab-es.com/mapi under the Creative Commons Attribution-No Derivative Works 2.5 Spain License. The MAPI source code is available by request (GPL v3 license). PMID:23311574

  18. A Software Package for Neural Network Applications Development

    NASA Technical Reports Server (NTRS)

    Baran, Robert H.

    1993-01-01

    Original Backprop (Version 1.2) is an MS-DOS package of four stand-alone C-language programs that enable users to develop neural network solutions to a variety of practical problems. Original Backprop generates three-layer, feed-forward (series-coupled) networks which map fixed-length input vectors into fixed-length output vectors through an intermediate (hidden) layer of binary threshold units. Version 1.2 can handle up to 200 input vectors at a time, each having up to 128 real-valued components. The first subprogram, TSET, appends a number (up to 16) of classification bits to each input, thus creating a training set of input/output pairs. The second subprogram, BACKPROP, creates a trilayer network to do the prescribed mapping and modifies the weights of its connections incrementally until the training set is learned. The learning algorithm is the 'back-propagating error correction procedure' first described by F. Rosenblatt in 1961. The third subprogram, VIEWNET, lets the trained network be examined, tested, and 'pruned' (by the deletion of unnecessary hidden units). The fourth subprogram, DONET, makes a TSR routine by which the finished product of the neural net design-and-training exercise can be consulted under other MS-DOS applications.
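    The three-layer feed-forward training described above can be sketched minimally in Python (the package itself is in C). Sigmoid hidden units stand in for the binary threshold units so the error is differentiable; the network sizes, learning rate, and XOR training set are illustrative assumptions, not the package's defaults.

```python
# Minimal sketch of a three-layer feed-forward network trained by
# back-propagation of error, as described in the abstract.
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

n_in, n_hid, n_out = 2, 4, 1
# Each row holds one unit's input weights plus a trailing bias weight.
w1 = [[random.uniform(-1, 1) for _ in range(n_in + 1)] for _ in range(n_hid)]
w2 = [[random.uniform(-1, 1) for _ in range(n_hid + 1)] for _ in range(n_out)]

def forward(x):
    h = [sigmoid(sum(w * v for w, v in zip(row, x + [1.0]))) for row in w1]
    o = [sigmoid(sum(w * v for w, v in zip(row, h + [1.0]))) for row in w2]
    return h, o

def total_error(tset):
    return sum((forward(x)[1][0] - t[0]) ** 2 for x, t in tset)

# Training set of input/output pairs (XOR as a toy stand-in for TSET output).
tset = [([0.0, 0.0], [0.0]), ([0.0, 1.0], [1.0]),
        ([1.0, 0.0], [1.0]), ([1.0, 1.0], [0.0])]

before = total_error(tset)
lr = 0.5
for _ in range(5000):
    for x, t in tset:
        h, o = forward(x)
        # Output-layer deltas, then back-propagate them to the hidden layer.
        d_out = [(t[k] - o[k]) * o[k] * (1 - o[k]) for k in range(n_out)]
        d_hid = [h[j] * (1 - h[j]) * sum(d_out[k] * w2[k][j] for k in range(n_out))
                 for j in range(n_hid)]
        # Incremental weight updates, as in the BACKPROP subprogram.
        for k in range(n_out):
            for j, v in enumerate(h + [1.0]):
                w2[k][j] += lr * d_out[k] * v
        for j in range(n_hid):
            for i, v in enumerate(x + [1.0]):
                w1[j][i] += lr * d_hid[j] * v

after = total_error(tset)
print(f"squared error: {before:.3f} -> {after:.3f}")
```

    "Pruning", as done by VIEWNET, would correspond here to deleting hidden rows of w1 (and the matching columns of w2) whose removal does not raise the training error.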

  19. Evaluation of the Trajectory Operations Applications Software Task (TOAST). Volume 2: Interview transcripts

    NASA Technical Reports Server (NTRS)

    Perkins, Sharon; Martin, Andrea; Bavinger, Bill

    1990-01-01

    The Trajectory Operations Applications Software Task (TOAST) is a software development project whose purpose is to provide trajectory operation pre-mission and real-time support for the Space Shuttle. The purpose of the evaluation was to evaluate TOAST as an Application Manager - to assess current and planned capabilities, compare capabilities to commercially-available off the shelf (COTS) software, and analyze requirements of MCC and Flight Analysis Design System (FADS) for TOAST implementation. As a major part of the data gathering for the evaluation, interviews were conducted with NASA and contractor personnel. Real-time and flight design users, orbit navigation users, the TOAST developers, and management were interviewed. Code reviews and demonstrations were also held. Each of these interviews was videotaped and transcribed as appropriate. Transcripts were edited and are presented chronologically.

  20. Performance diagnostics software for gas turbines in pipeline and cogeneration applications. Final report, July 1985-September 1989

    SciTech Connect

    Levine, P.

    1989-12-01

    The development experience for the PEGASYS and COGENT software is presented. The PEGASYS software is applicable to two-shaft gas turbines in simple, regenerative and combined cycle systems. The COGENT software is applicable to cogeneration systems. The test results show that the software is able to define the deviations between measured and expected power and thermal efficiency. Further, the software is able to identify the components causing the performance losses. The results show that axial compressor fouling is a major cause of performance losses and that the performance can be recovered by washing. An on-line version of PEGASYS is also described.

  1. Novice and Expert Collaboration in Educational Software Development: Evaluating Application Effectiveness

    ERIC Educational Resources Information Center

    Friedman, Rob; Saponara, Adam

    2008-01-01

    In an attempt to hone the role of learners as designers, this study investigates the effectiveness of an instructional software application resulting from a design process founded on the tenets of participatory design, informant design, and contextual inquiry, as well as a set of established design heuristics. Collaboration occurred among learning…

  2. VARK Learning Preferences and Mobile Anatomy Software Application Use in Pre-Clinical Chiropractic Students

    ERIC Educational Resources Information Center

    Meyer, Amanda J.; Stomski, Norman J.; Innes, Stanley I.; Armson, Anthony J.

    2016-01-01

    Ubiquitous smartphone ownership and reduced face-to-face teaching time may lead to students making greater use of mobile technologies in their learning. This is the first study to report on the prevalence of mobile gross anatomy software applications (apps) usage in pre-clinical chiropractic students and to ascertain if a relationship exists…

  3. Technology survey of computer software as applicable to the MIUS project

    NASA Technical Reports Server (NTRS)

    Fulbright, B. E.

    1975-01-01

    Existing computer software, available from either governmental or private sources, applicable to modular integrated utility system program simulation is surveyed. Several programs and subprograms are described to provide a consolidated reference, and a bibliography is included. The report covers the two broad areas of design simulation and system simulation.

  4. Application of thermoluminescence for detection of cascade shower 1: Hardware and software of reader system

    NASA Technical Reports Server (NTRS)

    Akashi, M.; Kawaguchi, S.; Watanabe, Z.; Misaki, A.; Niwa, M.; Okamoto, Y.; Fujinaga, T.; Ichimura, M.; Shibata, T.; Dake, S.

    1985-01-01

    A reader system for the detection of cascade showers via luminescence induced by heating sensitive material (BaSO4:Eu) is developed. The reader system is composed of the following six instruments: (1) heater, (2) light guide, (3) image intensifier, (4) CCD camera, (5) image processor, (6) microcomputer. The efficiency of these apparatuses and the application of software for image analysis are reported.

  5. Application of the Open Software Foundation (OSF)distributed computing environment to global PACS

    NASA Astrophysics Data System (ADS)

    Martinez, Ralph; Alsafadi, Yasser H.; Kim, Jinman

    1994-05-01

    In this paper, we present our approach to developing Global Picture Archiving and Communication System (GPACS) applications using the Open Software Foundation (OSF) Distributed Computing Environment (DCE) services and toolkits. The OSF DCE services include remote procedure calls, naming service, threads service, time service, file management services, and security service. Several OSF DCE toolkits are currently available from computer and software vendors. Designing distributed Global PACS applications using the OSF DCE approach will feature an open architecture, heterogeneity, and technology independence for GPACS remote consultation and diagnosis applications, including synchronized image annotation, and system privacy and security. The applications can communicate through various transport services and communications networks in a Global PACS environment. The use of OSF DCE services for Global PACS will enable us to develop a robust distributed structure and new user services which feature reliability and scalability for Global PACS environments.

  6. Common characteristics of open source software development and applicability for drug discovery: a systematic review

    PubMed Central

    2011-01-01

    Background Innovation through an open source model has proven to be successful for software development. This success has led many to speculate if open source can be applied to other industries with similar success. We attempt to provide an understanding of open source software development characteristics for researchers, business leaders and government officials who may be interested in utilizing open source innovation in other contexts and with an emphasis on drug discovery. Methods A systematic review was performed by searching relevant, multidisciplinary databases to extract empirical research regarding the common characteristics and barriers of initiating and maintaining an open source software development project. Results Common characteristics to open source software development pertinent to open source drug discovery were extracted. The characteristics were then grouped into the areas of participant attraction, management of volunteers, control mechanisms, legal framework and physical constraints. Lastly, their applicability to drug discovery was examined. Conclusions We believe that the open source model is viable for drug discovery, although it is unlikely that it will exactly follow the form used in software development. Hybrids will likely develop that suit the unique characteristics of drug discovery. We suggest potential motivations for organizations to join an open source drug discovery project. We also examine specific differences between software and medicines, specifically how the need for laboratories and physical goods will impact the model as well as the effect of patents. PMID:21955914

  7. Development of Oceanographic Software Tools and Applications for Navy Operational Use

    DTIC Science & Technology

    1997-09-30

    DEVELOPMENT OF OCEANOGRAPHIC SOFTWARE TOOLS AND APPLICATIONS FOR NAVY OPERATIONAL USE James H. Corbin Center for Air Sea Technology Mississippi State...applications, were significantly reduced. Accordingly, the CAST objective for FY97 was to develop interactive graphical tools for shipboard METOC briefers...This was in response to a COMSIXTHFLT validated METOC requirement to provide visualization briefing tools , animations, and 3–D graphical depictions

  8. Constructing a working taxonomy of functional Ada software components for real-time embedded system applications

    NASA Technical Reports Server (NTRS)

    Wallace, Robert

    1986-01-01

    A major impediment to a systematic attack on Ada software reusability is the lack of an effective taxonomy for software component functions. The scope of all possible applications of Ada software is considered too great to allow the practical development of a working taxonomy. Instead, for the purposes herein, the scope of Ada software application is limited to device and subsystem control in real-time embedded systems. A functional approach is taken in constructing the taxonomy tree for the identified Ada domain. The use of modular software functions as a starting point fits well with the object-oriented programming philosophy of Ada. Examples of the types of functions represented within the working taxonomy are real-time kernels, interrupt service routines, synchronization and message passing, data conversion, digital filtering and signal conditioning, and device control. The constructed taxonomy is proposed as a framework from which a needs analysis can be performed to reveal voids in current Ada real-time embedded programming efforts for the Space Station.
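    A functional taxonomy tree like the one described can be sketched as a simple nested structure. Only the leaf categories come from the abstract; the grouping into intermediate branches is an illustrative assumption.

```python
# Illustrative taxonomy tree for functional software components in the
# real-time embedded control domain; leaves are the function types named
# in the abstract, interior nodes are hypothetical groupings.
taxonomy = {
    "real-time embedded control": {
        "executive services": ["real-time kernels", "interrupt service routines"],
        "communication": ["synchronization and message passing"],
        "signal handling": ["data conversion",
                            "digital filtering and signal conditioning"],
        "hardware interface": ["device control"],
    }
}

def leaves(node):
    """Flatten the taxonomy tree into its leaf component functions."""
    if isinstance(node, list):
        return list(node)
    out = []
    for child in node.values():
        out.extend(leaves(child))
    return out

print(leaves(taxonomy))
```

    A needs analysis of the kind proposed would then amount to comparing the set of leaves against the components actually present in a repository and reporting the empty categories.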

  9. Software Graphics Processing Unit (sGPU) for Deep Space Applications

    NASA Technical Reports Server (NTRS)

    McCabe, Mary; Salazar, George; Steele, Glen

    2015-01-01

    A graphics processing capability will be required for deep space missions and must include a range of applications, from safety-critical vehicle health status to telemedicine for crew health. However, preliminary radiation testing of commercial graphics processing cards suggests that they cannot operate in the deep space radiation environment. Investigation into a Software Graphics Processing Unit (sGPU) comprised of commercial-equivalent radiation hardened/tolerant single board computers, field programmable gate arrays, and safety-critical display software shows promising results. Preliminary performance of approximately 30 frames per second (FPS) has been achieved. Use of multi-core processors may provide a significant increase in performance.

  10. Applications of Formal Methods to Specification and Safety of Avionics Software

    NASA Technical Reports Server (NTRS)

    Hoover, D. N.; Guaspari, David; Humenn, Polar

    1996-01-01

    This report treats several topics in applications of formal methods to avionics software development. Most of these topics concern decision tables, an orderly, easy-to-understand format for formally specifying complex choices among alternative courses of action. The topics relating to decision tables include: generalizations of decision tables that are more concise and support the use of decision tables in a refinement-based formal software development process; a formalism for systems of decision tables with behaviors; an exposition of Parnas tables for users of decision tables; and test coverage criteria and decision tables. We outline features of a revised version of ORA's decision table tool, Tablewise, which will support many of the new ideas described in this report. We also survey formal safety analysis of specifications and software.

  11. An application of machine learning to the organization of institutional software repositories

    NASA Technical Reports Server (NTRS)

    Bailin, Sidney; Henderson, Scott; Truszkowski, Walt

    1993-01-01

    Software reuse has become a major goal in the development of space systems, as a recent NASA-wide workshop on the subject made clear. The Data Systems Technology Division of Goddard Space Flight Center has been working on tools and techniques for promoting reuse, in particular in the development of satellite ground support software. One of these tools is the Experiment in Libraries via Incremental Schemata and Cobweb (ElvisC). ElvisC applies machine learning to the problem of organizing a reusable software component library for efficient and reliable retrieval. In this paper we describe the background factors that have motivated this work, present the design of the system, and evaluate the results of its application.

  12. [Application of Stata software to test heterogeneity in meta-analysis method].

    PubMed

    Wang, Dan; Mou, Zhen-yun; Zhai, Jun-xia; Zong, Hong-xia; Zhao, Xiao-dong

    2008-07-01

    To introduce the application of Stata software to heterogeneity testing in meta-analysis. A data set was set up according to the example in the study, and the corresponding commands of the methods in Stata 9 software were applied to test the example. The methods used were the Q-test and the I2 statistic attached to the fixed-effect-model forest plot, the H statistic, and the Galbraith plot. The existence of heterogeneity among studies could be detected by the Q-test and the H statistic, and the degree of heterogeneity could be detected by the I2 statistic. The outliers which were the sources of the heterogeneity could be spotted from the Galbraith plot. Heterogeneity testing in meta-analysis can be completed simply and quickly by the four methods in Stata software. The H and I2 statistics are more robust, and among the four methods the outliers of the heterogeneity can be seen most clearly in the Galbraith plot.
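    The statistics named above have standard definitions that can be sketched outside Stata: Cochran's Q is the weighted sum of squared deviations from the pooled fixed-effect estimate, H = sqrt(Q/df), and I2 = (Q - df)/Q (floored at zero). The effect sizes and standard errors below are hypothetical, shown in Python for illustration.

```python
# Fixed-effect heterogeneity statistics for a meta-analysis:
# Cochran's Q, the H statistic, and I-squared.

def heterogeneity(effects, ses):
    w = [1.0 / se**2 for se in ses]                      # inverse-variance weights
    pooled = sum(wi * ei for wi, ei in zip(w, effects)) / sum(w)
    q = sum(wi * (ei - pooled) ** 2 for wi, ei in zip(w, effects))
    df = len(effects) - 1
    h = (q / df) ** 0.5 if df > 0 else float("nan")
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0  # percent
    return pooled, q, h, i2

effects = [0.30, 0.45, 0.12, 0.60, 0.25]   # hypothetical study effect sizes
ses = [0.10, 0.12, 0.15, 0.11, 0.09]       # hypothetical standard errors

pooled, q, h, i2 = heterogeneity(effects, ses)
print(f"pooled={pooled:.3f} Q={q:.2f} H={h:.2f} I2={i2:.1f}%")
```

    A Galbraith plot would then chart each study's standardized effect (effect/se) against its precision (1/se); studies falling far from the regression line are the outliers the abstract refers to.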

  13. Next generation of decision making software for nanopatterns characterization: application to semiconductor industry

    NASA Astrophysics Data System (ADS)

    Dervilllé, A.; Labrosse, A.; Zimmermann, Y.; Foucher, J.; Gronheid, R.; Boeckx, C.; Singh, A.; Leray, P.; Halder, S.

    2016-03-01

    The dimensional scaling in IC manufacturing strongly drives the demands on CD and defect metrology techniques and their measurement uncertainties. Defect review has become as important as CD metrology, and together they create a new metrology paradigm and a completely new need for flexible, robust and scalable metrology software. Current software architectures and metrology algorithms are performant, but they must be pushed to a higher level in order to follow roadmap speed and requirements. For example: manage defects and CD in a one-step algorithm, customize algorithms and output features for each R&D team environment, and provide software updates every day or every week so that R&D teams can easily explore various development strategies. The final goal is to avoid spending hours and days manually tuning algorithms to analyze metrology data and to allow R&D teams to stay focused on their expertise. The benefits are drastic cost reductions, more efficient R&D teams and better process quality. In this paper, we propose a new generation of software platform and development infrastructure which can integrate specific metrology business modules. For example, we will show the integration of a chemistry module dedicated to electronics materials such as Directed Self-Assembly features. We will show a new generation of image analysis algorithms which are able to manage at the same time defect rates, image classifications, CD and roughness measurements with high-throughput performance in order to be compatible with HVM. In a second part, we will assess the reliability, the customization of algorithms, and the software platform capabilities to follow new specific semiconductor metrology software requirements: flexibility, robustness, high throughput and scalability. Finally, we will demonstrate how such an environment has allowed a drastic reduction of data analysis cycle time.

  14. Software Bridge

    NASA Technical Reports Server (NTRS)

    1995-01-01

    I-Bridge is a commercial version of software developed by I-Kinetics under a NASA Small Business Innovation Research (SBIR) contract. The software allows users of Windows applications to gain quick, easy access to databases, programs and files on UNIX services. Information goes directly onto spreadsheets and other applications; users need not manually locate, transfer and convert data.

  15. Towards the Goal of Modular Climate Data Services: An Overview of NCPP Applications and Software

    NASA Astrophysics Data System (ADS)

    Koziol, B. W.; Cinquini, L.; Treshansky, A.; Murphy, S.; DeLuca, C.

    2013-12-01

    In August 2013, the National Climate Predictions and Projections Platform (NCPP) organized a workshop focusing on the quantitative evaluation of downscaled climate data products (QED-2013). The QED-2013 workshop focused on real-world application problems drawn from several sectors (e.g. hydrology, ecology, environmental health, agriculture), and required that downscaled data products be dynamically accessed, generated, manipulated, annotated, and evaluated. The cyberinfrastructure elements that were integrated to support the workshop included (1) a wiki-based project hosting environment (Earth System CoG) with an interface to data services provided by an Earth System Grid Federation (ESGF) data node; (2) metadata tools provided by the Earth System Documentation (ES-DOC) collaboration; and (3) a Python-based library, OpenClimateGIS (OCGIS), for subsetting and converting NetCDF-based climate data to GIS and tabular formats. Collectively, this toolset represents a first deployment of a 'ClimateTranslator' that enables users to access, interpret, and apply climate information at local and regional scales. This presentation will provide an overview of the components above, how they were used in the workshop, and a discussion of current and potential integration. The long-term strategy for this software stack is to offer the suite of services described on a customizable, per-project basis. Additional detail on the three components is below. (1) Earth System CoG is a web-based collaboration environment that integrates data discovery and access services with tools for supporting governance and the organization of information. QED-2013 utilized these capabilities to share with workshop participants a suite of downscaled datasets, associated images derived from those datasets, and metadata files describing the downscaling techniques involved. The collaboration side of CoG was used for workshop organization, discussion, and results. (2) The ES-DOC Questionnaire

  16. An application of software design and documentation language. [Galileo spacecraft command and data subsystem

    NASA Technical Reports Server (NTRS)

    Callender, E. D.; Clarkson, T. B.; Frasier, C. E.

    1980-01-01

    The software design and documentation language (SDDL) is a general purpose processor supporting a language for the description of any system, structure, concept, or procedure that may be presented from the viewpoint of a collection of hierarchical entities linked together by means of binary connections. The language comprises a set of rules of syntax, primitive construct classes (module, block, and module invocation), and language control directives. The result is a language with a fixed grammar, a variable alphabet and punctuation, and an extendable vocabulary. The application of SDDL to the detailed software design of the Command and Data Subsystem for the Galileo spacecraft is discussed. A set of constructs was developed and applied. These constructs are evaluated and examples of their application are considered.

  17. Software Process Improvement Initiatives Based on Quality Assurance Strategies: A QATAM Pilot Application

    NASA Astrophysics Data System (ADS)

    Winkler, Dietmar; Elberzhager, Frank; Biffl, Stefan; Eschbach, Robert

    Quality Assurance (QA) strategies, i.e., bundles of verification and validation approaches embedded within a balanced software process, can support project and quality managers in systematically planning and implementing improvement initiatives. New and modified processes and methods come up frequently that seem to be promising candidates for improvement. Nevertheless, the impact of processes and methods strongly depends on individual project contexts. A major challenge is how to systematically select and implement "best practices" for product construction, verification, and validation. In this paper we present the Quality Assurance Tradeoff Analysis Method (QATAM), which supports engineers in (a) systematically identifying candidate QA strategies and (b) evaluating QA strategy variants in a given project context. We evaluate feasibility and usefulness in a pilot application in a medium-size software engineering organization. The main results were that QATAM was considered useful for identifying and evaluating various improvement initiatives applicable to large organizations as well as to small and medium enterprises.

  18. Using Solution-Focused Applications for Transitional Coping of Workplace Survivors

    ERIC Educational Resources Information Center

    Germain, Marie-Line; Palamara, Sherry A.

    2007-01-01

    Solution-focused applications are proposed to assist survivor employees to return to workplace homeostasis after co-workers voluntarily or involuntarily leave the organization. A model for transitional coping is presented as well as a potential case study illustrating the application of the model. Implications for the theory, practice, and…

  19. Applications of on-product diffraction-based focus metrology in logic high volume manufacturing

    NASA Astrophysics Data System (ADS)

    Noyes, Ben F.; Mokaberi, Babak; Bolton, David; Li, Chen; Palande, Ashwin; Park, Kevin; Noot, Marc; Kea, Marc

    2016-03-01

    The integration of on-product diffraction-based focus (DBF) capability into the majority of immersion lithography layers in leading edge logic manufacturing has enabled new applications targeted towards improving cycle time and yield. A CD-based detection method is the process of record (POR) for excursion detection. The drawback of this method is increased cycle time and limited sampling due to CD-SEM metrology capacity constraints. The DBF-based method allows the addition of focus metrology samples to the existing overlay measurements on the integrated metrology (IM) system. The result enables the addition of measured focus to the SPC system, allowing a faster excursion detection method. For focus targeting, the current method involves using a dedicated focus-exposure matrix (FEM) on all scanners, resulting in lengthy analysis times and uncertainty in the best focus. The DBF method allows the measurement to occur on the IM system, on a regular production wafer, and at the same time as the exposure. This results in a cycle time gain as well as a less subjective determination of best focus. A third application aims to use the novel on-product focus metrology data in order to apply per-exposure focus corrections to the scanner. These corrections are particularly effective at the edge of the wafer, where systematic layer-dependent effects can be removed using DBF-based scanner feedback. This paper will discuss the development of a methodology to accomplish each of these applications in a high-volume production environment. The new focus metrology method, sampling schemes, feedback mechanisms and analysis methods lead to improved focus control, as well as earlier detection of failures.
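The per-exposure feedback idea can be sketched as a simple smoothed correction loop: measured focus errors are filtered and fed back as a scanner offset. The EWMA gain and the simulated constant 20 nm offset below are illustrative assumptions, not the paper's actual control algorithm:

```python
# Hedged sketch of a focus feedback loop: each lot's measured residual error
# is partially fed back as a correction (exponentially weighted moving
# average). Gain and error values are illustrative only.

def run_feedback(true_errors, gain=0.5):
    """Return the residual error the metrology tool would see at each lot."""
    correction = 0.0
    residuals = []
    for true_error in true_errors:
        residual = true_error - correction   # measured after correction
        residuals.append(residual)
        correction += gain * residual        # feed part of it back
    return residuals

# A constant 20 nm focus offset is driven toward zero over a few lots.
residuals = run_feedback([20.0] * 8)
print(round(residuals[0], 3), round(residuals[-1], 3))
```

The gain trades convergence speed against sensitivity to metrology noise, which is why such loops are tuned per layer in practice.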

  20. GWASS: GRASS web application software system based on the GeoBrain web service

    NASA Astrophysics Data System (ADS)

    Qiu, Fang; Ni, Feng; Chastain, Bryan; Huang, Haiting; Zhao, Peisheng; Han, Weiguo; Di, Liping

    2012-10-01

    GRASS is a well-known geographic information system developed more than 30 years ago. As one of the earliest GIS systems, GRASS has survived mainly as free, open-source desktop GIS software, with users primarily limited to the research community or among programmers who use it to create customized functions. To allow average GIS end users to continue taking advantage of this widely-used software, we developed the GRASS Web Application Software System (GWASS), a distributed, web-based, multi-tiered Geospatial Information System (GIS) built on top of the GeoBrain web service, a project sponsored by NASA using the latest service oriented architecture (SOA). This SOA-enabled system offers an effective and practical alternative to current commercial desktop GIS solutions. With GWASS, all geospatial processing and analyses are conducted by the server, so users are not required to install any software at the client side, which reduces the cost of access for users. The only resource needed to use GWASS is access to the Internet, and anyone who knows how to use a web browser can operate the system. The SOA framework is revitalizing GRASS as a new means to bring powerful geospatial analysis and resources to more users with concurrent access.

  1. A software tool of digital tomosynthesis application for patient positioning in radiotherapy.

    PubMed

    Yan, Hui; Dai, Jian-Rong

    2016-03-08

    Digital Tomosynthesis (DTS) is an image modality that reconstructs tomographic images from two-dimensional kV projections covering a narrow scan angle. Compared with conventional cone-beam CT (CBCT), it requires less time and radiation dose in data acquisition. It is feasible to apply this technique to patient positioning in radiotherapy. To facilitate its clinical application, a software tool was developed and the reconstruction processes were accelerated by a graphics processing unit (GPU). Two reconstruction and two registration processes are required for the DTS application, in contrast to the conventional CBCT application, which requires one image reconstruction process and one image registration process. The reconstruction stage consists of the production of two types of DTS. One type of DTS is reconstructed from cone-beam (CB) projections covering a narrow scan angle and is named onboard DTS (ODTS), which represents the real patient position in the treatment room. Another type of DTS is reconstructed from digitally reconstructed radiography (DRR) and is named reference DTS (RDTS), which represents the ideal patient position in the treatment room. Prior to the reconstruction of RDTS, the DRRs are reconstructed from the planning CT using the same acquisition settings as the CB projections. The registration stage consists of two matching processes between ODTS and RDTS. The target shifts along the lateral and longitudinal axes are obtained from the matching between ODTS and RDTS in the coronal view, while the target shifts along the longitudinal and vertical axes are obtained from the matching between ODTS and RDTS in the sagittal view. In this software, both DRR and DTS reconstruction algorithms were implemented in GPU environments for acceleration purposes. A comprehensive evaluation of this software tool was performed, including geometric accuracy, image quality, registration accuracy, and reconstruction efficiency. The average correlation coefficient between DRR/DTS generated by GPU-based algorithm
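The matching step between ODTS and RDTS views can be illustrated with a minimal shift search by cross-correlation; a 1-D intensity profile stands in for an image row here, and all names and values are invented for illustration, not the tool's actual registration algorithm:

```python
# Hedged sketch of translation recovery by cross-correlation, the basic idea
# behind matching a reference view (RDTS) with an acquired view (ODTS).

def best_shift(reference, moving, max_shift):
    """Return the integer shift of `moving` that best matches `reference`."""
    best, best_score = 0, float("-inf")
    n = len(reference)
    for s in range(-max_shift, max_shift + 1):
        score = sum(reference[i] * moving[i - s]
                    for i in range(n) if 0 <= i - s < n)
        if score > best_score:
            best, best_score = s, score
    return best

profile = [0, 0, 1, 4, 9, 4, 1, 0, 0, 0]   # a bright feature at index 4
shifted = profile[2:] + [0, 0]              # same feature moved 2 pixels left
print(best_shift(profile, shifted, 4))     # 2
```

In 2-D the same search over coronal and sagittal views yields the lateral/longitudinal and longitudinal/vertical couch shifts described above.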

  2. Development of a web application for water resources based on open source software

    NASA Astrophysics Data System (ADS)

    Delipetrev, Blagoj; Jonoski, Andreja; Solomatine, Dimitri P.

    2014-01-01

    This article presents the research and development of a prototype web application for water resources using the latest advancements in Information and Communication Technologies (ICT), open source software and web GIS. The web application has three web services for: (1) managing, presenting and storing geospatial data, (2) supporting water resources modeling and (3) water resources optimization. The web application is developed using several programming languages (PHP, Ajax, JavaScript, Java), libraries (OpenLayers, JQuery) and open source software components (GeoServer, PostgreSQL, PostGIS). The presented web application has several main advantages: it is available all the time, it is accessible from everywhere, it creates a real-time multi-user collaboration platform, the programming language code and components are interoperable and designed to work in a distributed computer environment, it is flexible for adding additional components and services, and it is scalable depending on the workload. The application was successfully tested on a case study with concurrent multi-user access.

  3. Safety Characteristics in System Application of Software for Human Rated Exploration Missions for the 8th IAASS Conference

    NASA Technical Reports Server (NTRS)

    Mango, Edward J.

    2016-01-01

    NASA and its industry and international partners are embarking on a bold and inspiring development effort to design and build an exploration class space system. The space system is made up of the Orion system, the Space Launch System (SLS) and the Ground Systems Development and Operations (GSDO) system. All are highly coupled together and dependent on each other for the combined safety of the space system. A key area of system safety focus needs to be in the ground and flight application software system (GFAS). In the development, certification and operations of GFAS, there are a series of safety characteristics that define the approach to ensure mission success. This paper will explore and examine the safety characteristics of the GFAS development. The GFAS system integrates the flight software packages of the Orion and SLS with the ground systems and launch countdown sequencers through the 'agile' software development process. A unique approach is needed to develop the GFAS project capabilities within this agile process. NASA has defined the software development process through a set of standards. The standards were written during the infancy of the so-called industry 'agile development' movement and must be tailored to adapt to the highly integrated environment of human exploration systems. Safety of the space systems and the eventual crew on board is paramount during the preparation of the exploration flight systems. A series of software safety characteristics have been incorporated into the development and certification efforts to ensure readiness for use and compatibility with the space systems. Three underlying factors in the exploration architecture require the GFAS system to be unique in its approach to ensure safety for the space systems, both the flight as well as the ground systems. The first are the missions themselves, which are exploration in nature, and go far beyond the comfort of low Earth orbit operations.
The second is the current exploration

  4. Modelface: an Application Programming Interface (API) for Homology Modeling Studies Using Modeller Software

    PubMed Central

    Sakhteman, Amirhossein; Zare, Bijan

    2016-01-01

    An interactive application, Modelface, is presented for the Modeller software, based on the Windows platform. The application is able to run all steps of homology modeling including PDB-to-FASTA generation, running Clustal, model building and loop refinement. Other modules of Modeller, including energy calculation, energy minimization and the ability to make single point mutations in PDB structures, are also implemented inside Modelface. The API is a simple batch-based application with no memory occupation and is free of charge for academic use. The application is also able to repair missing atom types in PDB structures, making it suitable for many molecular modeling studies such as docking and molecular dynamics simulation. Some successful instances of modeling studies using Modelface are also reported. PMID:28243276

  5. Application of the AHP method in modeling the trust and reputation of software agents

    NASA Astrophysics Data System (ADS)

    Zytniewski, Mariusz; Klementa, Marek; Skorupka, Dariusz; Stanek, Stanislaw; Duchaczek, Artur

    2016-06-01

    Given the unique characteristics of cyberspace and, in particular, the number of inherent security threats, communication between software agents becomes a highly complex issue and a major challenge that, on the one hand, needs to be continuously monitored and, on the other, awaits new solutions addressing its vulnerabilities. An approach that has recently come into view mimics mechanisms typical of social systems and is based on trust and reputation that assist agents in deciding which other agents to interact with. The paper offers an enhancement to existing trust and reputation models, involving the application of the AHP method that is widely used for decision support in social systems, notably for risk analysis. To this end, it is proposed to expand the underlying conceptual basis by including such notions as self-trust and social trust, and to apply these to software agents. The discussion is concluded with an account of an experiment aimed at testing the effectiveness of the proposed solution.
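The AHP step referred to above can be sketched as deriving priority weights from a pairwise-comparison matrix. The row geometric-mean approximation below is a common shortcut for the principal-eigenvector calculation, and the matrix values (comparing three hypothetical trust factors) are illustrative:

```python
import math

# Hedged sketch of the AHP weighting step: approximate priority weights from
# a reciprocal pairwise-comparison matrix via row geometric means.
# The factor comparisons are invented for illustration.

def ahp_weights(matrix):
    """Approximate AHP priority weights via normalized row geometric means."""
    gms = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gms)
    return [g / total for g in gms]

# A is 3x as important as B and 5x as important as C; B is 2x as important as C.
pairwise = [
    [1.0,     3.0, 5.0],
    [1 / 3.0, 1.0, 2.0],
    [1 / 5.0, 0.5, 1.0],
]
weights = ahp_weights(pairwise)
print([round(w, 3) for w in weights])
```

A full AHP implementation would also compute the consistency ratio to check that the judgments are not self-contradictory before using the weights.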

  6. Lifelong personal health data and application software via virtual machines in the cloud.

    PubMed

    Van Gorp, Pieter; Comuzzi, Marco

    2014-01-01

    Personal Health Records (PHRs) should remain the lifelong property of patients, who should be able to show them conveniently and securely to selected caregivers and institutions. In this paper, we present MyPHRMachines, a cloud-based PHR system that takes a radically new architectural approach to health record portability. In MyPHRMachines, health-related data and the application software to view and/or analyze it are separately deployed in the PHR system. After uploading their medical data to MyPHRMachines, patients can access them again from remote virtual machines that contain the right software to visualize and analyze them without any need for conversion. Patients can share their remote virtual machine session with selected caregivers, who will need only a Web browser to access the pre-loaded fragments of their lifelong PHR. We discuss a prototype of MyPHRMachines applied to two use cases, i.e., radiology image sharing and personalized medicine.

  7. The Application of V&V within Reuse-Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward

    1996-01-01

    Verification and Validation (V&V) is performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors as early as possible during the development process. Early discovery is important in order to minimize the cost and other impacts of correcting these errors. In reuse-based software engineering, decisions on the requirements, design and even implementation of domain assets can be made prior to beginning development of a specific system. In order to bring the effectiveness of V&V to bear within reuse-based software engineering, V&V must be incorporated within the domain engineering process.

  8. Application of the ARRAMIS Risk and Reliability Software to the Nuclear Accident Progression

    SciTech Connect

    Wyss, Gregory D.; Daniel, Sharon L.; Hays, Kelly M.; Brown, Thomas D.

    1997-06-01

    The ARRAMIS risk and reliability analysis software suite developed by Sandia National Laboratories enables analysts to evaluate the safety and reliability of a wide range of complex systems whose failure results in high consequences. This software was originally designed to model the systems, responses, and phenomena associated with potential severe accidents at commercial nuclear power reactors by solving very large fault tree and event tree models. However, because of its power and versatility, ARRAMIS and its constituent analysis engines have recently been used to evaluate a wide variety of systems, including nuclear weapons, telecommunications facilities, robotic material handling systems, and aircraft systems, using hybrid fault tree/event tree analysis techniques incorporating fully integrated uncertainty analysis capabilities. This paper describes recent applications in the area of nuclear reactor accident progression analysis using a large event tree methodology and the ARRAMIS package.

  9. Solar thermal power systems point-focusing thermal and electric applications projects. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Marriott, A.

    1980-01-01

    The activities of the Point-Focusing Thermal and Electric Applications (PFTEA) Project for fiscal year 1979 are summarized. The main thrust of the PFTEA Project, the small community solar thermal power experiment, was completed. Concept definition studies included a small central receiver approach, a point-focusing distributed receiver system with central power generation, and a point-focusing distributed receiver concept with distributed power generation. The first experiment in the Isolated Application Series was initiated. Planning for the third engineering experiment series, which addresses the industrial market sector, was also initiated. In addition to the experiment-related activities, several contracts to industry were let and studies were conducted to explore the market potential for point-focusing distributed receiver (PFDR) systems. System analysis studies were completed that compared PFDR technology with other small power system technology candidates for the utility market sector.

  10. Transcranial MR-Guided Focused Ultrasound: A Review of the Technology and Neuro Applications

    PubMed Central

    Ghanouni, Pejman; Pauly, Kim Butts; Elias, W. Jeff; Henderson, Jaimie; Sheehan, Jason; Monteith, Stephen; Wintermark, Max

    2015-01-01

    MR guided focused ultrasound is a new, minimally invasive method of targeted tissue thermal ablation that may be of use to treat central neuropathic pain, essential tremor, Parkinson tremor, and brain tumors. The system has also been used to temporarily disrupt the blood-brain barrier to allow targeted drug delivery to brain tumors. This article reviews the physical principles of MR guided focused ultrasound and discusses current and potential applications of this exciting technology. PMID:26102394

  11. Optimal patterns for sequentially multiple focusing in high intensity focused ultrasound and their application to thermal dose

    NASA Astrophysics Data System (ADS)

    Shim, Mun-Bo; Lee, Hyoungki; Lee, Hotaik; Park, Junho; Ahn, Minsu

    2012-11-01

    The purpose of this study is to propose a new method for multiple-focus generation to shorten overall treatment time as well as to avoid the formation of high intensity regions outside the target volume. A numerical simulation of the acoustic fields produced by a 1017-element spherical-section ultrasound phased array transducer operating at a frequency of 1.0 MHz with a 16 cm radius of curvature is performed for the proposed multiple-focus generation. The total foci are partitioned into several patterns because multiple focusing generally gives rise to grating lobes outside of the three-dimensional region of interest, even when the intensity gain is optimized in determining the phases and amplitudes of the excitation source vector. The optimization problem is repeatedly formulated in terms of the focal points until the multiple-focus patterns cover all the focal points. A genetic algorithm is used for selecting patterns without grating lobes. The obtained set of multiple-focus patterns can be used sequentially to necrose a given volume quickly as well as safely. The proposed method may prove useful in improving the speed and safety of focused ultrasound thermal ablation. This strategy will also be effective for any transducer as well as for other cases of multiple-focus generation.
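Single-focus phasing, the building block underneath each multiple-focus pattern, can be sketched as follows: each element is driven with a phase that cancels its propagation delay to the focal point so all contributions arrive in phase. The small linear array, frequency, and sound speed below are illustrative assumptions, not the paper's 1017-element spherical transducer:

```python
import math

# Hedged sketch of single-focus phasing for an ultrasound phased array.
# Element layout, frequency, and sound speed are illustrative only.

SPEED_OF_SOUND = 1500.0   # m/s, approximate value in tissue
FREQ = 1.0e6              # 1 MHz
K = 2 * math.pi * FREQ / SPEED_OF_SOUND   # wavenumber

def focus_phases(elements, focus):
    """Drive phase (radians) per element so all waves align at `focus`."""
    return [(K * math.dist(e, focus)) % (2 * math.pi) for e in elements]

# A small linear array along x (1 mm pitch), focusing 8 cm straight ahead.
elements = [(x * 1e-3, 0.0, 0.0) for x in range(-5, 6)]
focus = (0.0, 0.0, 0.08)
phases = focus_phases(elements, focus)

# With these drive phases, the net phase K*d - phase at the focus is the
# same (mod 2*pi) for every element, i.e. constructive interference.
arrival = [(K * math.dist(e, focus) - p) % (2 * math.pi)
           for e, p in zip(elements, phases)]
print(max(min(a, 2 * math.pi - a) for a in arrival))
```

Multiple-focus patterns additionally optimize amplitudes and phases jointly (as the abstract describes) so that several such focal points form at once without grating lobes.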

  12. A flexible software architecture for scalable real-time image and video processing applications

    NASA Astrophysics Data System (ADS)

    Usamentiaga, Rubén; Molleda, Julio; García, Daniel F.; Bulnes, Francisco G.

    2012-06-01

    Real-time image and video processing applications require skilled architects, and recent trends in the hardware platform make the design and implementation of these applications increasingly complex. Many frameworks and libraries have been proposed or commercialized to simplify the design and tuning of real-time image processing applications. However, they tend to lack flexibility because they are normally oriented towards particular types of applications, or they impose specific data processing models such as the pipeline. Other issues include large memory footprints, difficulty of reuse and inefficient execution on multicore processors. This paper presents a novel software architecture for real-time image and video processing applications which addresses these issues. The architecture is divided into three layers: the platform abstraction layer, the messaging layer, and the application layer. The platform abstraction layer provides a high level application programming interface for the rest of the architecture. The messaging layer provides a message passing interface based on a dynamic publish/subscribe pattern. Topic-based filtering, in which messages are published to topics, is used to route the messages from the publishers to the subscribers interested in a particular type of message. The application layer provides a repository for reusable application modules designed for real-time image and video processing applications. These modules, which include acquisition, visualization, communication, user interface and data processing modules, take advantage of the power of other well-known libraries such as OpenCV, Intel IPP, or CUDA. Finally, we present different prototypes and applications to show the possibilities of the proposed architecture.
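The topic-based publish/subscribe routing described for the messaging layer can be sketched in a few lines; the class, topic names, and subscriber modules are invented for illustration:

```python
from collections import defaultdict

# Minimal sketch of topic-based publish/subscribe: publishers post to a
# topic and only the subscribers of that topic receive the message.
# All names are illustrative, not the paper's actual framework API.

class MessageBus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        for callback in self._subscribers[topic]:
            callback(message)

bus = MessageBus()
frames, stats = [], []
bus.subscribe("frames", frames.append)       # e.g. a visualization module
bus.subscribe("statistics", stats.append)    # e.g. a logging module

bus.publish("frames", "frame-001")
bus.publish("statistics", {"fps": 60})
bus.publish("frames", "frame-002")

print(frames)  # ['frame-001', 'frame-002']
print(stats)   # [{'fps': 60}]
```

The decoupling shown here is what lets acquisition, processing, and visualization modules run at different rates without knowing about each other.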

  13. The development and application of composite complexity models and a relative complexity metric in a software maintenance environment

    NASA Technical Reports Server (NTRS)

    Hops, J. M.; Sherif, J. S.

    1994-01-01

    A great deal of effort is now being devoted to the study, analysis, prediction, and minimization of expected software maintenance cost, long before software is delivered to users or customers. It has been estimated that, on average, the effort spent on software maintenance is as costly as the effort spent on all other software costs. Software design methods should be the starting point for alleviating the problems of software maintenance complexity and high costs. Two aspects of maintenance deserve attention: (1) protocols for locating and rectifying defects, and for ensuring that no new defects are introduced in the development phase of the software process; and (2) protocols for modification, enhancement, and upgrading. This article focuses primarily on the second aspect, the development of protocols to help increase the quality and reduce the costs associated with modifications, enhancements, and upgrades of existing software. This study developed parsimonious models and a relative complexity metric for complexity measurement of software that were used to rank the modules in the system relative to one another. Some success was achieved in using the models and the relative metric to identify maintenance-prone modules.
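A relative complexity ranking of the general kind described can be sketched by standardizing several raw metrics per module and summing them; the metric names, values, and the z-score composite below are illustrative assumptions, not the study's actual models:

```python
import math

# Hedged sketch of a "relative complexity" ranking: standardize each metric
# across modules (z-scores), sum per module, and rank. Metrics and values
# are invented for illustration.

def zscores(values):
    mean = sum(values) / len(values)
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / len(values))
    return [(v - mean) / sd if sd else 0.0 for v in values]

def rank_modules(modules):
    """modules: {name: [metric1, metric2, ...]} -> names, most complex first."""
    names = list(modules)
    columns = zip(*(modules[n] for n in names))
    standardized = [zscores(list(col)) for col in columns]
    composite = {n: sum(col[i] for col in standardized)
                 for i, n in enumerate(names)}
    return sorted(names, key=lambda n: composite[n], reverse=True)

# [lines of code, cyclomatic complexity, fan-out] per module
modules = {
    "parser.c": [1200, 45, 12],
    "io.c":     [300,  8,  3],
    "report.c": [650,  20, 7],
}
print(rank_modules(modules))  # ['parser.c', 'report.c', 'io.c']
```

The point of a relative metric, as in the article, is that only the ordering matters: the highest-ranked modules are the candidates for maintenance attention.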

  14. Surgical model-view-controller simulation software framework for local and collaborative applications

    PubMed Central

    Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu

    2010-01-01

    Purpose Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. Methods A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. Results The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. Conclusion A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users. PMID:20714933

  15. Second International Workshop on Software Engineering and Code Design in Parallel Meteorological and Oceanographic Applications

    NASA Technical Reports Server (NTRS)

    OKeefe, Matthew (Editor); Kerr, Christopher L. (Editor)

    1998-01-01

    This report contains the abstracts and technical papers from the Second International Workshop on Software Engineering and Code Design in Parallel Meteorological and Oceanographic Applications, held June 15-18, 1998, in Scottsdale, Arizona. The purpose of the workshop is to bring together software developers in meteorology and oceanography to discuss software engineering and code design issues for parallel architectures, including Massively Parallel Processors (MPP's), Parallel Vector Processors (PVP's), Symmetric Multi-Processors (SMP's), Distributed Shared Memory (DSM) multi-processors, and clusters. Issues to be discussed include: (1) code architectures for current parallel models, including basic data structures, storage allocation, variable naming conventions, coding rules and styles, i/o and pre/post-processing of data; (2) designing modular code; (3) load balancing and domain decomposition; (4) techniques that exploit parallelism efficiently yet hide the machine-related details from the programmer; (5) tools for making the programmer more productive; and (6) the proliferation of programming models (F--, OpenMP, MPI, and HPF).

  16. Design of single phase inverter using microcontroller assisted by data processing applications software

    NASA Astrophysics Data System (ADS)

    Ismail, K.; Muharam, A.; Amin; Widodo Budi, S.

    2015-12-01

    Inverters are widely used for industrial, office, and residential purposes. Inverters support the development of alternative energy sources such as solar cells, wind turbines and fuel cells by converting DC voltage to AC voltage. Inverters have been made with a variety of hardware and software combinations, such as the use of pure analog circuits and various types of microcontrollers as controllers. When a pure analog circuit is used, modification is difficult because it requires changing the hardware components. In an inverter with a microcontroller-based design (with software), the calculations that generate the AC modulation are done in the microcontroller. This increases programming complexity and the amount of code downloaded to the microcontroller chip (flash memory capacity in the microcontroller is limited). This paper discusses the design of a single phase inverter using unipolar modulation of a sine wave and a triangular wave, which is done outside the microcontroller using a data processing software application (Microsoft Excel). Results show that programming complexity was reduced and that the sampling resolution strongly influences THD; a sampling resolution of ½ degree is needed to obtain the best THD (15.8%).
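The sine-triangle comparison computed in the spreadsheet can be sketched in code: at each sample the switch output is high when the rectified sine reference exceeds the triangular carrier. The sample count, carrier ratio, and modulation index below are illustrative, not the paper's actual design values:

```python
import math

# Hedged sketch of unipolar sine-triangle PWM sample generation, the
# computation the paper performs in a data processing application.
# Sample counts, carrier ratio, and modulation index are illustrative.

def triangle(t):
    """Unit triangular carrier with period 1, range 0..1."""
    frac = t % 1.0
    return 2 * frac if frac < 0.5 else 2 * (1 - frac)

def unipolar_pwm(n_samples, carrier_ratio=32, m=0.8):
    """One fundamental period of a unipolar PWM pattern (0/1 samples)."""
    out = []
    for k in range(n_samples):
        t = k / n_samples                          # fundamental period = 1
        reference = m * abs(math.sin(2 * math.pi * t))
        out.append(1 if reference > triangle(t * carrier_ratio) else 0)
    return out

pwm = unipolar_pwm(4096)
duty = sum(pwm) / len(pwm)
print(duty)  # close to the mean of m*|sin|, i.e. 0.8 * 2/pi ~ 0.51
```

The sampling resolution chosen for the table (how many points per degree) sets how faithfully this pattern reproduces the reference, which is the THD effect the paper reports.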

  17. Spectra acquisition software for clinical applications of the USB4000 spectrometer

    NASA Astrophysics Data System (ADS)

    Martínez Rodríguez, A. E.; Delgado Atencio, J. A.; Vázquez Y Montiel, S.; Romero Hernández, R. A.

    2011-08-01

    The non-invasive clinical method of diffuse reflectance spectroscopy (DRS) for the diagnosis of human skin lesions can be performed by using spectrometric devices together with fiber optic probes. However, the operation of most of the commercially available devices is not specifically designed for clinical applications. As a result, the commercial software and the optical hardware of these spectrometers are impractical when trying to reconcile the requirements of a clinical procedure with their operation to perform DRS for diagnosis purposes. Therefore, the development of home-built acquisition software will result in a more reliable and practical spectrometric system for the clinical environment. This work presents the development of an automation system that includes both a graphical user interface and a control system that enable more reliable and faster acquisition of clinical spectra. The software features voice control of the spectra acquisition process. The impact of this work lies mostly in the use of available programming platforms to implement a preliminary spectra processing tool that will lead to real-time acquisition of skin reflectance spectra of a given patient.

  18. A Study of the Feasibility of Duplicating JAMPS Applications Software in the Ada Programming Language.

    DTIC Science & Technology

    1984-04-01

    ...the Sieve of Eratosthenes... Sizing Analysis for Existing Software Written in "C"... Sizing Data with Adjustments... Conversion from "C" to Ada... benchmark program based on the Sieve of Eratosthenes [7]. This program finds all of the prime numbers between 3 and 16381. The benchmark test results shown... appear to be quite reasonable for the JAMPS application. ...Table 3: Benchmark Test Results Using the Sieve of Eratosthenes [7]... Execution/Operating Time

  19. Effects of a Passive Online Software Application on Heart Rate Variability and Autonomic Nervous System Balance

    PubMed Central

    2017-01-01

    Abstract Objective: This study investigated whether short-term exposure to a passive online software application of purported subtle energy technology would affect heart rate variability (HRV) and associated autonomic nervous system measures. Methods: This was a randomized, double-blinded, sham-controlled clinical trial (RCT). The study took place in a nonprofit laboratory in Emeryville, California. Twenty healthy, nonsmoking subjects (16 females), aged 40–75 years, participated. Quantum Code Technology™ (QCT), a purported subtle energy technology, was delivered through a passive software application (Heart+ App) on a smartphone placed <1 m from subjects who were seated and reading a catalog. HRV was measured for 5 min in triplicate for each condition via finger plethysmography using a Food and Drug Administration medically approved HRV measurement device. Measurements were made at baseline and 35 min following exposure to the software applications. The following parameters were calculated and analyzed: heart rate, total power, standard deviation of normal-to-normal intervals, root mean square sequential difference, low frequency to high frequency ratio (LF/HF), low frequency (LF), and high frequency (HF). Results: Paired samples t-tests showed that for the Heart+ App, mean LF/HF decreased (p = 9.5 × 10⁻⁴), while mean LF decreased in a trend (p = 0.06), indicating reduced sympathetic dominance. Root mean square sequential difference increased for the Heart+ App, showing a possible trend (p = 0.09). Post–pre differences in LF/HF for sham compared with the Heart+ App were also significant (p < 0.008) by independent t-test, indicating clinical relevance. Conclusions: Significant beneficial changes in mean LF/HF, along with possible trends in mean LF and root mean square sequential difference, were observed in subjects following 35 min exposure to the Heart+ App that was working in the background on an active smartphone untouched by the subjects
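The time-domain HRV statistics named in the study can be sketched directly from a list of RR (beat-to-beat) intervals; the sample intervals below are invented for illustration:

```python
import math

# Hedged sketch of time-domain HRV statistics from RR intervals (ms).
# The interval values are illustrative, not study data.

def sdnn(rr):
    """Standard deviation of the RR intervals."""
    mean = sum(rr) / len(rr)
    return math.sqrt(sum((x - mean) ** 2 for x in rr) / len(rr))

def rmssd(rr):
    """Root mean square of successive RR differences."""
    diffs = [b - a for a, b in zip(rr, rr[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def heart_rate(rr):
    """Mean heart rate in beats per minute."""
    return 60000.0 / (sum(rr) / len(rr))

rr = [812, 790, 804, 826, 798, 810]  # ms
print(round(heart_rate(rr), 1))  # 74.4
print(round(sdnn(rr), 1))        # 11.4
print(round(rmssd(rr), 1))       # 20.5
```

The frequency-domain quantities (LF, HF, LF/HF) would additionally require a spectral estimate of the interval series, which is beyond this sketch.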

  20. Information Engineering and the Information Engineering Facility versus Rapid Application Development and Focus

    DTIC Science & Technology

    1992-12-01

    business requirements that are resilient and responsive to continuous change and improvement. Business re-engineering focuses on the strategic vision and... Supply department of the Naval Postgraduate School (NPS). Future possibilities for applications include integrating the curricular offices, travel and...operations (supply, public works, and financial and personnel resources) are directed by military personnel with predominantly civilian staffs. The

  1. Electrically tunable-focusing and polarizer-free liquid crystal lenses for ophthalmic applications.

    PubMed

    Lin, Yi-Hsin; Chen, Hung-Shan

    2013-04-22

    An electrically tunable-focusing and polarizer-free liquid crystal (LC) lens for ophthalmic applications is demonstrated. The optical mechanism of an LC lens used in the human eye system is introduced. The polarizer-free LC lens for myopia-presbyopia based on artificial accommodation is demonstrated. The continuously tunable-focusing properties of the LC lenses are more practical in applications for the different visual conditions of people. The concept we propose can also be applied to other types of lenses as long as their focusing properties are tunable, and it can be extended to imaging and projection systems such as cameras in cell phones, pico projectors, and endoscopes.

  2. Complementing anatomy education using three-dimensional anatomy mobile software applications on tablet computers.

    PubMed

    Lewis, T L; Burnett, B; Tunstall, R G; Abrahams, P H

    2014-04-01

    Anatomy has traditionally been a cornerstone of medical education, taught via dissection and didactic lectures. The rising prevalence of mobile tablet technology means medical software applications ("apps") play an increasingly important role in medical education. The applications highlighted in this article will aid anatomical educators in identifying which are the most useful in clinical, academic, and educational environments. These have been systematically identified by downloading all applications with keywords related to anatomy and then carrying out qualitative assessment. Novel anatomy applications from developers such as Visible Body, 3D4Medical, and Pocket Anatomy allow students to visualize and manipulate complex anatomical structures using detailed 3D models. They often contain additional content including clinical correlations and a range of media from instructional videos to interactive quiz functions. The strength of tablet technology lies in its ability to consolidate and present anatomical information to the user in the manner most appropriate for their learning style. The only remaining question concerns the level of detail and accuracy of these applications. Innovative medical educators who embrace tablet technology will find that anatomy applications serve as a useful learning tool when used in conjunction with existing teaching setups.

  3. A Java and XML Application to Support Numerical Model Development within the Geologic Sequestration Software Suite (GS3)

    NASA Astrophysics Data System (ADS)

    Williams, M. D.; Wurstner, S. K.; Thorne, P. D.; Freedman, V. L.; Litofsky, A.; Huda, S. A.; Gurumoorthi, V.

    2010-12-01

    A Java and XML based application is currently under development as part of the Geologic Sequestration Software Suite (GS3) to support the generation of input files for multiple subsurface multifluid flow and transport simulators. The application will aid in the translation of conceptual models to a numerical modeling framework, and will provide the capability of generating multi-scale (spatial and temporal) numerical models in support of a variety of collaborative geologic sequestration studies. User specifications for numerical models include grids, geology, boundary and initial conditions, source terms, material properties, geochemical reactions, and geomechanics. Some of these inputs are defined as part of the conceptual model, while others are specified during the numerical model development process. To preserve the distinction between the conceptual model and its translation to a numerical modeling framework, the application manages data associated with each specification independently. This facilitates 1) building multi-scale numerical models from a common conceptual model, 2) building numerical models from multiple conceptual models, 3) building numerical models and input files for different simulators from a common conceptual model, 4) ease in re-generating numerical models in response to revisions of the conceptual model, and 5) revising the numerical model specification during the development process (e.g., grid modifications and resulting re-assignment of material property values and distributions). A key aspect of the model definition software is the ability to define features in the numerical model by specifying them as geometric objects, eliminating the need for the user to specify node/element numbers that often change when the grid is revised. The GS3 platform provides the capability of tracking provenance and dependencies for data files used in the numerical model definition. Within this framework, metadata is generated to support configuration
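    The geometric-object approach to feature definition described above can be illustrated with a minimal sketch: a property is assigned to whichever grid cells fall inside a geometric region, so the assignment survives grid revisions instead of being tied to node/element numbers. All names and values here (the grid, the box region, the permeability figures) are hypothetical and not taken from GS3.

```python
import numpy as np

def in_box(centers, lo, hi):
    """Boolean mask of cell centers inside an axis-aligned box."""
    c = np.asarray(centers)
    return np.all((c >= lo) & (c <= hi), axis=1)

# Cell centers of a small 3D grid (flattened), 1 m spacing.
x, y, z = np.meshgrid(np.arange(4) + 0.5, np.arange(4) + 0.5,
                      np.arange(4) + 0.5, indexing="ij")
centers = np.column_stack([x.ravel(), y.ravel(), z.ravel()])

permeability = np.full(len(centers), 1e-15)         # background value
mask = in_box(centers, lo=(0, 0, 0), hi=(2, 2, 2))  # a geometric feature region
permeability[mask] = 1e-12                          # assigned by geometry, not by cell id
```

If the grid is refined, only `centers` changes; re-running the same geometric assignment re-populates the property field without any renumbering.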

  4. Scalable, high-performance 3D imaging software platform: system architecture and application to virtual colonoscopy.

    PubMed

    Yoshida, Hiroyuki; Wu, Yin; Cai, Wenli; Brett, Bevin

    2012-01-01

    One of the key challenges in three-dimensional (3D) medical imaging is to enable the fast turn-around times often required for interactive or real-time response. This inevitably requires not only high computational power but also high memory bandwidth, due to the massive amount of data that needs to be processed. In this work, we have developed a software platform that is designed to support high-performance 3D medical image processing for a wide range of applications using increasingly available and affordable commodity computing systems: multi-core, cluster, and cloud computing systems. To achieve scalable, high-performance computing, our platform (1) employs size-adaptive, distributable block volumes as a core data structure for efficient parallelization of a wide range of 3D image processing algorithms; (2) supports task scheduling for efficient load distribution and balancing; and (3) consists of layered parallel software libraries that allow a wide range of medical applications to share the same functionalities. We evaluated the performance of our platform by applying it to an electronic cleansing system in virtual colonoscopy, with initial experimental results showing a 10-fold performance improvement on an 8-core workstation over the original sequential implementation of the system.
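    A minimal sketch of the block-volume idea: the 3D volume is split into distributable blocks, a block-local operation is applied to each block in parallel, and the results are reassembled. This illustrates the general technique only; it is not the platform's actual data structure or scheduler.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def split_blocks(vol, bs):
    """Yield (slices, block) pairs covering a 3D volume with cubic blocks."""
    for i in range(0, vol.shape[0], bs):
        for j in range(0, vol.shape[1], bs):
            for k in range(0, vol.shape[2], bs):
                sl = (slice(i, i + bs), slice(j, j + bs), slice(k, k + bs))
                yield sl, vol[sl]

def process_parallel(vol, func, bs=16, workers=4):
    """Apply a block-local operation to each block in parallel and reassemble."""
    out = np.empty_like(vol)

    def work(item):
        sl, block = item
        out[sl] = func(block)  # each worker writes a disjoint region

    with ThreadPoolExecutor(max_workers=workers) as ex:
        list(ex.map(work, split_blocks(vol, bs)))
    return out
```

The sketch only works for operations that are strictly block-local; algorithms that need neighboring voxels would require overlapping ("halo") blocks, which is one reason size-adaptive blocks matter in practice.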

  5. Application of the Software as a Service Model to the Control of Complex Building Systems

    SciTech Connect

    Stadler, Michael; Donadee, Jon; Marnay, Chris; Lai, Judy; Mendes, Goncalo; Appen, Jan von; Mégel, Oliver; Bhattacharya, Prajesh; DeForest, Nicholas

    2011-03-18

    In an effort to create broad access to its optimization software, Lawrence Berkeley National Laboratory (LBNL), in collaboration with the University of California at Davis (UC Davis) and OSISoft, has recently developed a Software as a Service (SaaS) Model for reducing energy costs, cutting peak power demand, and reducing carbon emissions for multipurpose buildings. UC Davis currently collects and stores energy usage data from buildings on its campus. Researchers at LBNL sought to demonstrate that a SaaS application architecture could be built on top of this data system to optimize the scheduling of electricity and heat delivery in the building. The SaaS interface, known as WebOpt, consists of two major parts: a) the investment & planning and b) the operations module, which builds on the investment & planning module. The operational scheduling and load shifting optimization models within the operations module use data from load prediction and electrical grid emissions models to create an optimal operating schedule for the next week, reducing peak electricity consumption while maintaining quality of energy services. LBNL's application also provides facility managers with suggested energy infrastructure investments for achieving their energy cost and emission goals based on historical data collected with OSISoft's system. This paper describes these models as well as the SaaS architecture employed by LBNL researchers to provide asset scheduling services to UC Davis. The peak demand, emissions, and cost implications of the asset operation schedule and investments suggested by this optimization model are analyzed.

  6. Software architecture for a distributed real-time system in Ada, with application to telerobotics

    NASA Technical Reports Server (NTRS)

    Olsen, Douglas R.; Messiora, Steve; Leake, Stephen

    1992-01-01

    The architecture structure and software design methodology presented are described in the context of a telerobotic application in Ada, specifically the Engineering Test Bed (ETB), which was developed to support the Flight Telerobotic Servicer (FTS) Program at GSFC. However, the nature of the architecture is such that it has applications to any multiprocessor distributed real-time system. The ETB architecture, which is a derivation of the NASA/NBS Standard Reference Model (NASREM), defines a hierarchy for representing a telerobot system. Within this hierarchy, a module is a logical entity consisting of the software associated with a set of related hardware components in the robot system. A module comprises submodules, which are cyclically executing processes that each perform a specific set of functions. The submodules in a module can run on separate processors. The submodules in the system communicate via command/status (C/S) interface channels, which are used to send commands down and relay status back up the system hierarchy. Submodules also communicate via setpoint data links, which are used to transfer control data from one submodule to another. A submodule invokes submodule algorithms (SMA's) to perform algorithmic operations. Data that describe or model a physical component of the system are stored as objects in the World Model (WM). The WM is a system-wide distributed database that is accessible to submodules in all modules of the system for creating, reading, and writing objects.
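    The command/status channel pattern can be sketched with ordinary queues: commands flow down to a cyclically executing submodule, which relays status back up on its next cycle. The class and submodule names are hypothetical; the ETB itself is implemented in Ada, and this Python sketch is only a schematic of the pattern.

```python
import queue

class Channel:
    """A command/status (C/S) interface: commands flow down, status flows up."""
    def __init__(self):
        self.command = queue.Queue()
        self.status = queue.Queue()

class Submodule:
    """A cyclically executing process that serves commands from its parent."""
    def __init__(self, name, channel):
        self.name = name
        self.ch = channel

    def cycle(self):
        """One execution cycle: handle at most one pending command."""
        try:
            cmd = self.ch.command.get_nowait()
        except queue.Empty:
            return  # nothing to do this cycle
        # ... perform the commanded operation here ...
        self.ch.status.put((self.name, cmd, "DONE"))

# Parent sends a command down; the child's next cycle relays status back up.
ch = Channel()
arm = Submodule("arm_servo", ch)
ch.command.put("MOVE_TO_SETPOINT")
arm.cycle()
```

In the real architecture each submodule would run its cycle at a fixed rate on its own processor, with separate setpoint data links carrying control data between submodules.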

  7. Software-defined radio with flexible RF front end for satellite maritime radio applications

    NASA Astrophysics Data System (ADS)

    Budroweit, Jan

    2016-09-01

    This paper presents the concept of a software-defined radio (SDR) with a flexible RF front end. The design and architecture of this system, as well as possible application examples, are explained. One specific scenario is operation in the maritime frequency bands. A well-known service is the Automatic Identification System (AIS), which was captured by the DLR mission AISat and is chosen here as a maritime application example. The results of an embedded solution for AIS on the SDR platform are presented in this paper. Since there is an increasing demand for more performance on maritime radio bands, services like AIS will be enhanced by the International Association of Marine Aids to Navigation and Lighthouse Authorities (IALA). The new VHF Data Exchange Service (VDES) shall implement a dedicated satellite link. This paper shows that the SDR with a flexible RF front end can be used as a technology demonstration platform for this upcoming data exchange service.

  8. Intracoronary optical coherence tomography: Clinical and research applications and intravascular imaging software overview.

    PubMed

    Tenekecioglu, Erhan; Albuquerque, Felipe N; Sotomi, Yohei; Zeng, Yaping; Suwannasom, Pannipa; Tateishi, Hiroki; Cavalcante, Rafael; Ishibashi, Yuki; Nakatani, Shimpei; Abdelghani, Mohammad; Dijkstra, Jouke; Bourantas, Christos; Collet, Carlos; Karanasos, Antonios; Radu, Maria; Wang, Ancong; Muramatsu, Takashi; Landmesser, Ulf; Okamura, Takayuki; Regar, Evelyn; Räber, Lorenz; Guagliumi, Giulio; Pyo, Robert T; Onuma, Yoshinobu; Serruys, Patrick W

    2017-01-21

    By providing valuable information about the coronary artery wall and lumen, intravascular imaging may aid in optimizing interventional procedure results and thereby could improve clinical outcomes following percutaneous coronary intervention (PCI). Intravascular optical coherence tomography (OCT) is a light-based technology with a tissue penetration of approximately 1 to 3 mm and provides near histological resolution. It has emerged as a technological breakthrough in intravascular imaging with multiple clinical and research applications. OCT provides detailed visualization of the vessel following PCI and provides accurate assessment of post-procedural stent performance including detection of edge dissection, stent strut apposition, tissue prolapse, and healing parameters. Additionally, it can provide accurate characterization of plaque morphology and provides key information to optimize post-procedural outcomes. This manuscript aims to review the current clinical and research applications of intracoronary OCT and summarize the analytic OCT imaging software packages currently available. © 2017 Wiley Periodicals, Inc.

  9. VANESA - a software application for the visualization and analysis of networks in system biology applications.

    PubMed

    Brinkrolf, Christoph; Janowski, Sebastian Jan; Kormeier, Benjamin; Lewinski, Martin; Hippe, Klaus; Borck, Daniela; Hofestädt, Ralf

    2014-06-23

    VANESA is a modeling software for the automatic reconstruction and analysis of biological networks based on life-science database information. Using VANESA, scientists are able to model any kind of biological processes and systems as biological networks. It is now possible for scientists to automatically reconstruct important molecular systems with information from the databases KEGG, MINT, IntAct, HPRD, and BRENDA. Additionally, experimental results can be expanded with database information to better analyze the investigated elements and processes in an overall context. Users also have the possibility to use graph theoretical approaches in VANESA to identify regulatory structures and significant actors within the modeled systems. These structures can then be further investigated in the Petri net environment of VANESA. It is platform-independent, free-of-charge, and available at http://vanesa.sf.net.

  10. A complete software application for automatic registration of x-ray mammography and magnetic resonance images

    SciTech Connect

    Solves-Llorens, J. A.; Rupérez, M. J.; Monserrat, C.; Lloret, M.

    2014-08-15

    Purpose: This work presents a complete and automatic software application to aid radiologists in breast cancer diagnosis. The application is a fully automated method that performs a complete registration of magnetic resonance (MR) images and x-ray (XR) images in both directions (from MR to XR and from XR to MR) and for both x-ray mammograms, craniocaudal (CC) and mediolateral oblique (MLO). This new approach allows radiologists to mark points in the MR images and, without any manual intervention, it provides their corresponding points in both types of XR mammograms and vice versa. Methods: The application automatically segments magnetic resonance images and x-ray images using the C-Means method and the Otsu method, respectively. It compresses the magnetic resonance images in both directions, CC and MLO, using a biomechanical model of the breast that distinguishes the specific biomechanical behavior of each one of its three tissues (skin, fat, and glandular tissue) separately. It makes a projection of both compressions and registers them with the original XR images using affine transformations and nonrigid registration methods. Results: The application has been validated by two expert radiologists. This was carried out through a quantitative validation on 14 data sets in which the Euclidean distance between points marked by the radiologists and the corresponding points obtained by the application were measured. The results showed a mean error of 4.2 ± 1.9 mm for the MRI to CC registration, 4.8 ± 1.3 mm for the MRI to MLO registration, and 4.1 ± 1.3 mm for the CC and MLO to MRI registration. Conclusions: A complete software application that automatically registers XR and MR images of the breast has been implemented. The application permits radiologists to estimate the position of a lesion that is suspected of being a tumor in an imaging modality based on its position in another different modality with a clinically acceptable error. The results show that the
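    Of the two segmentation methods named above, Otsu's method is simple enough to sketch directly: it picks the gray-level threshold that maximizes the between-class variance of the image histogram. This is a generic textbook implementation, not the application's code.

```python
import numpy as np

def otsu_threshold(image):
    """Otsu's method: choose the threshold maximizing between-class variance."""
    hist = np.bincount(np.asarray(image).ravel(), minlength=256).astype(float)
    total = hist.sum()
    sum_all = float(np.dot(np.arange(256), hist))
    best_t, best_var = 0, -1.0
    w0 = 0.0   # cumulative weight of the "background" class
    sum0 = 0.0 # cumulative intensity sum of the "background" class
    for t in range(256):
        w0 += hist[t]
        if w0 == 0:
            continue
        w1 = total - w0
        if w1 == 0:
            break
        sum0 += t * hist[t]
        mu0 = sum0 / w0                  # background mean
        mu1 = (sum_all - sum0) / w1      # foreground mean
        var = w0 * w1 * (mu0 - mu1) ** 2 # between-class variance
        if var > best_var:
            best_var, best_t = var, t
    return best_t
```

In a mammogram this yields the threshold separating breast tissue from background; pixels above the threshold form the segmented region.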

  11. Energy-Based Adaptive Focusing of waves: Application to Ultrasonic Transcranial Therapy

    NASA Astrophysics Data System (ADS)

    Herbert, E.; Pernot, M.; Montaldo, G.; Tanter, M.; Fink, M.

    2009-04-01

    We propose a general concept of adaptive focusing through complex media based on the estimation or measurement of the wave energy density at the desired focal spot. As it does not require knowledge of phase information, this technique has many potential applications in acoustics and optics for light focusing through diffusive media. We present here the application of this technique to the problem of ultrasonic aberration correction for HIFU treatments. The estimation of wave energy density is based on the maximization of the ultrasound radiation force, using a 64-element array. A spatial coded excitation method is developed by using ad-hoc virtual transducers that include all the elements for each emission. The radiation force is maximized by optimizing the displacement of a small target at the focus. We measured the target displacement using ultrasound pulse echo on the same elements. A method using spatial coded excitation is developed in order to estimate the phase and amplitude aberration based on the target displacement. We validated this method with phase aberrations up to 2π. The phase correction is achieved and the pressure field is measured using a needle hydrophone. The acoustic intensity at the focus is restored through very large aberrations. Basic experiments for brain HIFU treatment are presented. Optimal transcranial adaptive focusing is performed using a limited number of short ultrasonic radiation force pushes.

  12. Evaluation of commercial drilling and geological software for deep drilling applications

    NASA Astrophysics Data System (ADS)

    Pierdominici, Simona; Prevedel, Bernhard; Conze, Ronald; Tridec Team

    2013-04-01

    risks and costs. This procedure enables a timely, efficient and accurate data access and exchange among the rig site data acquisition system, office-based software applications and data storage. The loading of real-time data has to be quick and efficient in order to refine the model and learn the lessons for the next drilling operations.

  13. RICIS Software Engineering 90 Symposium: Aerospace Applications and Research Directions Proceedings Appendices

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Papers presented at RICIS Software Engineering Symposium are compiled. The following subject areas are covered: flight critical software; management of real-time Ada; software reuse; megaprogramming software; Ada net; POSIX and Ada integration in the Space Station Freedom Program; and assessment of formal methods for trustworthy computer systems.

  14. Managing MDO Software Development Projects

    NASA Technical Reports Server (NTRS)

    Townsend, J. C.; Salas, A. O.

    2002-01-01

    Over the past decade, the NASA Langley Research Center developed a series of 'grand challenge' applications demonstrating the use of parallel and distributed computation and multidisciplinary design optimization. All but the last of these applications were focused on the high-speed civil transport vehicle; the final application focused on reusable launch vehicles. Teams of discipline experts developed these multidisciplinary applications by integrating legacy engineering analysis codes. As teams became larger and the application development became more complex with increasing levels of fidelity and numbers of disciplines, the need for applying software engineering practices became evident. This paper briefly introduces the application projects and then describes the approaches taken in project management and software engineering for each project; lessons learned are highlighted.

  15. NASA's Software Safety Standard

    NASA Technical Reports Server (NTRS)

    Ramsay, Christopher M.

    2007-01-01

    NASA relies more and more on software to control, monitor, and verify its safety critical systems, facilities and operations. Since the 1960s there has hardly been a spacecraft launched that does not have a computer on board that will provide command and control services. There have been recent incidents where software has played a role in high-profile mission failures and hazardous incidents. For example, the Mars Climate Orbiter, Mars Polar Lander, the DART (Demonstration of Autonomous Rendezvous Technology), and MER (Mars Exploration Rover) Spirit anomalies were all caused or contributed to by software. The Mission Control Centers for the Shuttle, ISS, and unmanned programs are highly dependent on software for data displays, analysis, and mission planning. Despite this growing dependence on software control and monitoring, there has been little to no consistent application of software safety practices and methodology to NASA's projects with safety critical software. Meanwhile, academia and private industry have been stepping forward with procedures and standards for safety critical systems and software, for example Dr. Nancy Leveson's book Safeware: System Safety and Computers. The NASA Software Safety Standard, originally published in 1997, was widely ignored due to its complexity and poor organization. It also focused on concepts rather than definite procedural requirements organized around a software project lifecycle. Led by NASA Headquarters Office of Safety and Mission Assurance, the NASA Software Safety Standard has recently undergone a significant update. This new standard provides the procedures and guidelines for evaluating a project for safety criticality and then lays out the minimum project lifecycle requirements to assure the software is created, operated, and maintained in the safest possible manner.
This update of the standard clearly delineates the minimum set of software safety requirements for a project without detailing the implementation for those

  16. Mercury: Reusable software application for Metadata Management, Data Discovery and Access

    NASA Astrophysics Data System (ADS)

    Devarakonda, Ranjeet; Palanisamy, Giri; Green, James; Wilson, Bruce E.

    2009-12-01

    simple, keyword, spatial and temporal searches across these metadata sources. The search user interface software has two API categories; a common core API which is used by all the Mercury user interfaces for querying the index and a customized API for project specific user interfaces. For our work in producing a reusable, portable, robust, feature-rich application, Mercury received a 2008 NASA Earth Science Data Systems Software Reuse Working Group Peer-Recognition Software Reuse Award. The new Mercury system is based on a Service Oriented Architecture and effectively reuses components for various services such as Thesaurus Service, Gazetteer Web Service and UDDI Directory Services. The software also provides various search services including: RSS, Geo-RSS, OpenSearch, Web Services and Portlets, integrated shopping cart to order datasets from various data centers (ORNL DAAC, NSIDC) and integrated visualization tools. Other features include: Filtering and dynamic sorting of search results, book-markable search results, save, retrieve, and modify search criteria.

  17. Total lithography system based on a new application software platform enabling smart scanner management

    NASA Astrophysics Data System (ADS)

    Kono, Hirotaka; Masaki, Kazuo; Matsuyama, Tomoyuki; Wakamoto, Shinji; Park, Seemoon; Sugihara, Taro; Shibazaki, Yuichi

    2015-03-01

    Along with device shrinkage, higher accuracy will continuously be required from photo-lithography tools in order to enhance on-product yield. In order to achieve higher yield, the advanced photo-lithography tools must be equipped with sophisticated tuning knobs on the tool and with software that is flexible enough to be applied per layer. This means photo-lithography tools must be capable of handling many types of sub-recipes and parameters simultaneously. To enable managing such a large amount of data easily and to setup lithography tools smoothly, we have developed a total lithography system called Litho Turnkey Solution based on a new software application platform, which we call Plug and Play Manager (PPM). PPM has its own graphical user interface, which enables total management of various data. Here various data means recipes, sub-recipes, tuning-parameters, measurement results, and so on. Through PPM, parameter making by intelligent applications such as CDU/Overlay tuning tools can easily be implemented. In addition, PPM is also linked to metrology tools and the customer's host computer, which enables data flow automation. Based on measurement data received from the metrology tools, PPM calculates correction parameters and sends them to the scanners automatically. This scheme can make calibration feedback loops possible. It should be noted that the abovementioned functions are running on the same platform through a user-friendly interface. This leads to smart scanner management and usability improvement. In this paper, we will demonstrate the latest development status of Nikon's total lithography solution based on PPM; describe details of each application; and provide supporting data for the accuracy and usability of the system.

  18. Design and Application of the Reconstruction Software for the BaBar Calorimeter

    SciTech Connect

    Strother, Philip David; /Imperial Coll., London

    2006-07-07

    The BaBar high energy physics experiment will be in operation at the PEP-II asymmetric e{sup +}e{sup -} collider in Spring 1999. The primary purpose of the experiment is the investigation of CP violation in the neutral B meson system. The electromagnetic calorimeter forms a central part of the experiment and new techniques are employed in data acquisition and reconstruction software to maximize the capability of this device. The use of a matched digital filter in the feature extraction in the front end electronics is presented. The performance of the filter in the presence of the expected high levels of soft photon background from the machine is evaluated. The high luminosity of the PEP-II machine and the demands on the precision of the calorimeter require reliable software that allows for increased physics capability. BaBar has selected C++ as its primary programming language and object oriented analysis and design as its coding paradigm. The application of this technology to the reconstruction software for the calorimeter is presented. The design of the systems for clustering, cluster division, track matching, particle identification and global calibration is discussed with emphasis on the provisions in the design for increased physics capability as levels of understanding of the detector increase. The CP violating channel B{sup 0} {yields} J/{Psi}K{sub S}{sup 0} has been studied in the two lepton, two {pi}{sup 0} final state. The contribution of this channel to the evaluation of the angle sin 2{beta} of the unitarity triangle is compared to that from the charged pion final state. An error of 0.34 on this quantity is expected after 1 year of running at design luminosity.
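    A matched filter of the kind used for feature extraction above can be sketched as a correlation of the sampled waveform with the known pulse shape, with the correlation peak marking the pulse position. The template and sample values below are illustrative, not BaBar's actual pulse shape or filter coefficients.

```python
import numpy as np

def matched_filter_peak(samples, template):
    """Slide the zero-mean template over the waveform and return the offset
    of maximum correlation (the best estimate of the pulse position)."""
    tpl = template - template.mean()
    corr = np.correlate(samples - samples.mean(), tpl, mode="valid")
    return int(np.argmax(corr))

# Illustrative: a small pulse shape buried in an otherwise flat waveform.
template = np.array([0.0, 1.0, 3.0, 1.0, 0.0])
signal = np.zeros(64)
signal[40:45] = template
offset = matched_filter_peak(signal, template)
```

Because the template is made zero-mean, the filter output is insensitive to a constant pedestal, which is part of why matched filtering holds up well against the soft photon background mentioned above.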

  19. The Environment for Application Software Integration and Execution (EASIE) version 1.0. Volume 1: Executive overview

    NASA Technical Reports Server (NTRS)

    Rowell, Lawrence F.; Davis, John S.

    1989-01-01

    The Environment for Application Software Integration and Execution (EASIE) provides a methodology and a set of software utility programs to ease the task of coordinating engineering design and analysis codes. EASIE was designed to meet the needs of conceptual design engineers that face the task of integrating many stand-alone engineering analysis programs. Using EASIE, programs are integrated through a relational database management system. Volume 1, Executive Overview, gives an overview of the functions provided by EASIE and describes their use. Three operational design systems based upon the EASIE software are briefly described.

  20. High intensity focused ultrasound surgery (HIFU) of the brain: A historical perspective, with modern applications

    PubMed Central

    Jagannathan, Jay; Sanghvi, Narendra K; Crum, Lawrence A; Yen, Chun-Po; Medel, Ricky; Dumont, Aaron S; Sheehan, Jason P; Steiner, Ladislau; Jolesz, Ferenc; Kassell, Neal F

    2014-01-01

    The field of MRI-guided high intensity focused ultrasound surgery (MRgFUS) is a rapidly evolving one with many potential applications in neurosurgery. This is the first of three articles on MRgFUS; it focuses on the historical development of the technology and its potential applications to modern neurosurgery. The evolution of MRgFUS has occurred in parallel with modern neurological surgery, and the two seemingly distinct disciplines share many of the same pioneering figures. Early studies on focused ultrasound treatment in the 1940s and 1950s demonstrated the ability to perform precise lesioning in the human brain, with a favorable risk-benefit profile. However, the need for a craniotomy, as well as the lack of sophisticated imaging technology, resulted in limited growth of HIFU for neurosurgery. More recently, technological advances have permitted the combination of HIFU with MRI guidance, providing an opportunity to effectively treat a variety of CNS disorders. Although challenges remain, HIFU-mediated neurosurgery may offer the ability to target and treat CNS conditions that were previously extremely difficult to treat. The remaining two articles in this series will focus on the physical principles of modern MRgFUS as well as current and future avenues for investigation. PMID:19190451

  1. Igpet software for modeling igneous processes: examples of application using the open educational version

    NASA Astrophysics Data System (ADS)

    Carr, Michael J.; Gazel, Esteban

    2016-09-01

    We provide here an open version of the Igpet software, called t-Igpet to emphasize its application for teaching and research in forward modeling of igneous geochemistry. There are three programs: a norm utility; a petrologic mixing program using least squares; and Igpet, a graphics program that includes many forms of numerical modeling. Igpet is a multifaceted tool that provides the following basic capabilities: igneous rock identification using the IUGS (International Union of Geological Sciences) classification and several supplementary diagrams; tectonic discrimination diagrams; pseudo-quaternary projections; least squares fitting of lines, polynomials and hyperbolae; magma mixing using two endmembers; histograms; x-y plots; ternary plots and spider-diagrams. The advanced capabilities of Igpet are multi-element mixing and magma evolution modeling. Mixing models are particularly useful for understanding the isotopic variations in rock suites that evolved by mixing different sources. The important melting models include batch melting, fractional melting and aggregated fractional melting. Crystallization models include equilibrium and fractional crystallization and AFC (assimilation and fractional crystallization). Theses, reports and proposals concerning igneous petrology are improved by numerical modeling; for reviewed publications, some elements of modeling are practically a requirement. Our intention in providing this software is to facilitate improved communication and lower entry barriers to research, especially for students.
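    The batch melting model mentioned above follows the standard equation C_l = C_0 / (D + F(1 − D)), where C_0 is the source concentration, D the bulk partition coefficient, and F the melt fraction. A one-line sketch of the calculation (not t-Igpet's code):

```python
def batch_melt(c0, D, F):
    """Batch melting: liquid concentration C_l = C_0 / (D + F * (1 - D)).

    c0: source concentration; D: bulk partition coefficient; F: melt fraction.
    """
    return c0 / (D + F * (1.0 - D))
```

The limiting cases behave as expected: at F = 0 an incompatible element (D << 1) is maximally enriched (C_l = C_0/D), and at F = 1 the liquid simply has the source composition (C_l = C_0).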

  2. Impact of focusing of Ground Based SAR data on the quality of interferometric SAR applications

    NASA Astrophysics Data System (ADS)

    Zonno, Mariantonietta; Mascolo, Luigi; Guccione, Pietro; Nico, Giovanni; Di Pasquale, Andrea

    2014-10-01

    A Ground-Based Synthetic Aperture Radar (GB-SAR) is nowadays employed in several applications. The processing of ground-based, space and airborne SAR data relies on the same physical principles; nevertheless, specific algorithms for the focusing of data acquired by GB-SAR systems have been proposed in the literature. In this work the impact of the main focusing methods on the interferometric phase dispersion and on the coherence has been studied using a real dataset obtained from a dedicated experiment. Several acquisitions of a scene with a corner reflector mounted on a micrometric screw were made; before some acquisitions the micrometric screw was displaced by a few millimetres in the Line-of-Sight direction. The images were first focused using two different algorithms and, correspondingly, two different sets of interferograms were generated. The mean and standard deviation of the phase values at the corner reflector were compared to those obtained from the known displacement of the micrometric screw. The mean phase, its dispersion and the coherence values for each focusing algorithm have been quantified, and both the precision and the accuracy of the interferometric phase measurements obtained using the two different focusing methods have been assessed.
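    Per pixel, the interferometric measurement described here reduces to the phase of the complex product of two focused images, converted to a line-of-sight displacement. A minimal sketch (the wavelength value is illustrative, not the instrument's actual one, and the sign convention depends on the acquisition geometry):

```python
import numpy as np

WAVELENGTH = 0.0176  # metres (~17 GHz Ku-band); illustrative value only


def los_displacement(img1, img2):
    # Interferometric phase: arg(img1 * conj(img2)) per pixel,
    # then displacement = phase * lambda / (4*pi) for a two-way path.
    phase = np.angle(img1 * np.conj(img2))
    return phase, phase * WAVELENGTH / (4.0 * np.pi)
```

    The mean and standard deviation of `phase` over the corner-reflector pixels are exactly the statistics the experiment compares against the known screw displacement.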

  3. Neutron focusing using capillary optics and its applications to elemental analysis

    SciTech Connect

    Chen-Mayer, H. H.; Mildner, D. F. R.; Lamaze, G. P.; Paul, R. L.; Lindstrom, R. M.

    1999-06-10

    Capillary optics (Kumakhov lenses) have been characterized and tested at two cold neutron beam facilities at the NIST reactor: the Neutron Depth Profiling (NDP) and the Prompt Gamma Activation Analysis (PGAA) spectrometers. Lenses of both multifiber and monolithic types focus cold neutron beams from guides of cm transverse dimensions onto a sub-mm spot size with higher current densities at the expense of the angular resolution, which is acceptable for applications employing neutron absorption. These lenses can improve the sensitivity and detection limits for NDP and PGAA measurements on small samples, and enable sample scanning to study spatial non-uniformity or to perform compositional mapping. A summary of the neutron focusing effort is given, with examples of a multifiber lens with on-axis focusing, a bender-focuser with off-axis focusing, and a monolithic lens with a more compact size. Preliminary results and existing problems in applying these lenses to NDP and PGAA are presented, and current and future directions are discussed.

  4. Neutron focusing using capillary optics and its applications to elemental analysis

    NASA Astrophysics Data System (ADS)

    Chen-Mayer, H. H.; Mildner, D. F. R.; Lamaze, G. P.; Paul, R. L.; Lindstrom, R. M.

    1999-06-01

    Capillary optics (Kumakhov lenses) have been characterized and tested at two cold neutron beam facilities at the NIST reactor: the Neutron Depth Profiling (NDP) and the Prompt Gamma Activation Analysis (PGAA) spectrometers. Lenses of both multifiber and monolithic types focus cold neutron beams from guides of cm transverse dimensions onto a sub-mm spot size with higher current densities at the expense of the angular resolution, which is acceptable for applications employing neutron absorption. These lenses can improve the sensitivity and detection limits for NDP and PGAA measurements on small samples, and enable sample scanning to study spatial non-uniformity or to perform compositional mapping. A summary of the neutron focusing effort is given, with examples of a multifiber lens with on-axis focusing, a bender-focuser with off-axis focusing, and a monolithic lens with a more compact size. Preliminary results and existing problems in applying these lenses to NDP and PGAA are presented, and current and future directions are discussed.

  5. Collective Focusing of Intense Ion Beam Pulses for High-energy Density Physics Applications

    SciTech Connect

    Dorf, Mikhail A.; Kaganovich, Igor D.; Startsev, Edward A.; Davidson, Ronald C.

    2011-04-27

    The collective focusing concept, in which a weak magnetic lens provides strong focusing of an intense ion beam pulse carrying a neutralizing electron background, is investigated by making use of advanced particle-in-cell simulations and reduced analytical models. The original analysis by Robertson [Phys. Rev. Lett. 48, 149 (1982)] is extended to the parameter regimes of particular importance for several high-energy density physics applications. The present paper investigates (1) the effects of non-neutral collective focusing in a moderately strong magnetic field; (2) the diamagnetic effects leading to suppression of the applied magnetic field due to the presence of the beam pulse; and (3) the influence of a finite-radius conducting wall surrounding the beam cross-section on beam neutralization. In addition, it is demonstrated that the use of the collective focusing lens can significantly simplify the technical realization of the final focusing of ion beam pulses in the Neutralized Drift Compression Experiment-I (NDCX-I), and the conceptual designs of possible experiments on NDCX-I are investigated by making use of advanced numerical simulations. © 2011 American Institute of Physics

  6. ELAS - A geobased information system that is transferable to several computers. [Earth resources Laboratory Applications Software

    NASA Technical Reports Server (NTRS)

    Whitley, S. L.; Pearson, R. W.; Seyfarth, B. R.; Graham, M. H.

    1981-01-01

    In the early years of remote sensing, emphasis was placed on the processing and analysis of data from a single multispectral sensor, such as the Landsat Multispectral Scanner System (MSS). However, in connection with attempts to use the data for resource management, it was realized that many deficiencies existed in single data sets, and a need was established to geographically reference the MSS data and to register with it data from disparate sources. Technology transfer activities have required systems concepts that can be easily transferred to computers of different types in other organizations. ELAS (Earth Resources Laboratory Applications Software), a geographically based information system, was developed to meet these needs. ELAS accepts data from a variety of sources and contains programs to geographically reference the data to the Universal Transverse Mercator grid. One of the primary functions of ELAS is to produce a surface cover map.

  7. Instruction set extension for software defined radio in mobile GNSS applications

    NASA Astrophysics Data System (ADS)

    Marcinek, Krzysztof; Pleskacz, Witold A.

    2013-07-01

    A variety of currently operational GNSS frequencies and rapid development of new navigation satellite systems resulted in the need for interoperability and compatibility. Recent state-of-the-art integrated GNSS front-ends are capable of simultaneous support of multiple navigation systems [1, 2]. A unification of the signal processing part is also possible and, what is more, desirable. This paper introduces a proposal for universal instruction set extension (ISE), which is used for accelerating signal processing in SDR (Software Defined Radio) based GNSS applications. The results of this work were implemented and tested on the chip multithreading general purpose processor - AGATE [3], running on the Xilinx Virtex-6 ML605 FPGA evaluation board.

  8. Summary of the CSRI Workshop on Combinatorial Algebraic Topology (CAT): Software, Applications, & Algorithms

    SciTech Connect

    Bennett, Janine Camille; Day, David Minot; Mitchell, Scott A.

    2009-11-20

    This report summarizes the Combinatorial Algebraic Topology: software, applications & algorithms workshop (CAT Workshop). The workshop was sponsored by the Computer Science Research Institute of Sandia National Laboratories. It was organized by CSRI staff members Scott Mitchell and Shawn Martin. It was held in Santa Fe, New Mexico, August 29-30. The CAT Workshop website has links to some of the talk slides and other information, http://www.cs.sandia.gov/CSRI/Workshops/2009/CAT/index.html. The purpose of the report is to summarize the discussions and recap the sessions. There is a special emphasis on technical areas that are ripe for further exploration, and the plans for follow-up amongst the workshop participants. The intended audiences are the workshop participants, other researchers in the area, and the workshop sponsors.

  9. Platinum metallization for MEMS application. Focus on coating adhesion for biomedical applications.

    PubMed

    Guarnieri, Vittorio; Biazi, Leonardo; Marchiori, Roberto; Lago, Alexandre

    2014-01-01

    The adherence of platinum thin films on Si/SiO2 wafers was studied using chromium, titanium or alumina (Cr, Ti, Al2O3) as an interlayer. The adhesion of Pt is a fundamental property in different areas, for example in MEMS devices, which operate at high temperatures, as well as in biomedical applications, where the adhesion of a Pt film to the substrate is a known challenge in several industrial health applications and in biomedical devices such as stents. We investigated the properties of chromium, titanium and alumina (Cr, Ti, and Al2O3) used as adhesion layers for Pt electrodes. Thin films of chromium, titanium and alumina were deposited on silicon/silicon dioxide (Si/SiO2) wafers by electron beam. We introduced Al2O3 as a new adhesion layer to test the behavior of the Pt film at higher temperature using a ceramic adhesion thin film. Electrical behavior was measured after different annealing temperatures to assess the performance of the Cr/Pt, Ti/Pt, and Al2O3/Pt metallic films for gas sensor applications. All these metal layers showed good adhesion to Si/SiO2 and also good Au wire bondability at room temperature, but at temperatures higher than 400 °C the thin Cr/Pt and Ti/Pt films showed poor adhesion due to atomic inter-diffusion between platinum and the metal adhesion layers. The proposed Al2O3/Pt ceramic-metal layers showed better adherence at the higher temperatures tested.

  10. A validated software application to measure fiber organization in soft tissue.

    PubMed

    Morrill, Erica E; Tulepbergenov, Azamat N; Stender, Christina J; Lamichhane, Roshani; Brown, Raquel J; Lujan, Trevor J

    2016-12-01

    The mechanical behavior of soft connective tissue is governed by a dense network of fibrillar proteins in the extracellular matrix. Characterization of this fibrous network requires the accurate extraction of descriptive structural parameters from imaging data, including fiber dispersion and mean fiber orientation. Common methods to quantify fiber parameters include fast Fourier transforms (FFT) and structure tensors; however, information is limited on the accuracy of these methods. In this study, we compared these two methods using test images of fiber networks with varying topology. The FFT method with a band-pass filter was the most accurate, with an error of [Formula: see text] in measuring mean fiber orientation and an error of [Formula: see text] in measuring fiber dispersion in the test images. The accuracy of the structure tensor method was approximately five times worse than the FFT band-pass method when measuring fiber dispersion. A free software application, FiberFit, was then developed that utilizes an FFT band-pass filter to fit fiber orientations to a semicircular von Mises distribution. FiberFit was used to measure collagen fibril organization in confocal images of bovine ligament at magnifications of [Formula: see text] and [Formula: see text]. Grayscale conversion prior to FFT analysis gave the most accurate results, with errors of [Formula: see text] for mean fiber orientation and [Formula: see text] for fiber dispersion when measuring confocal images at [Formula: see text]. By developing and validating a software application that facilitates the automated analysis of fiber organization, this study can help advance a mechanistic understanding of collagen networks and help clarify the mechanobiology of soft tissue remodeling and repair.
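    As a minimal sketch of the FFT approach (FiberFit itself additionally applies a band-pass filter and fits a semicircular von Mises distribution, neither shown here), the dominant fiber orientation of a grayscale image can be estimated from the peak of its 2-D power spectrum, which for a fibrous texture lies perpendicular to the fiber direction:

```python
import numpy as np


def dominant_fiber_angle(image):
    # Power spectrum of the (centered) 2-D FFT; spectral energy of a
    # striped/fibrous texture concentrates along the direction of
    # intensity variation, i.e. perpendicular to the fibers.
    spec = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2
    cy, cx = np.array(spec.shape) // 2
    spec[cy, cx] = 0.0  # suppress the DC component before peak search
    py, px = np.unravel_index(np.argmax(spec), spec.shape)
    peak_angle = np.degrees(np.arctan2(py - cy, px - cx))
    # Rotate by 90 degrees to go from the spectral peak direction to
    # the fiber direction; report on the half-circle [0, 180).
    return (peak_angle + 90.0) % 180.0
```

    Because the spectrum of a real image is point-symmetric, either of the two conjugate peaks yields the same orientation modulo 180 degrees.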

  11. Dental application of novel finite element analysis software for three-dimensional finite element modeling of a dentulous mandible from its computed tomography images.

    PubMed

    Nakamura, Keiko; Tajima, Kiyoshi; Chen, Ker-Kong; Nagamatsu, Yuki; Kakigawa, Hiroshi; Masumi, Shin-ich

    2013-12-01

    This study focused on the application of novel finite-element analysis software for constructing a finite-element model from the computed tomography data of a human dentulous mandible. The finite-element model is necessary for evaluating the mechanical response of the alveolar part of the mandible, resulting from occlusal force applied to the teeth during biting. Commercially available patient-specific general computed tomography-based finite-element analysis software was solely applied to the finite-element analysis for the extraction of computed tomography data. The mandibular bone with teeth was extracted from the original images. Both the enamel and the dentin were extracted after image processing, and the periodontal ligament was created from the segmented dentin. The constructed finite-element model was reasonably accurate using a total of 234,644 nodes and 1,268,784 tetrahedral and 40,665 shell elements. The elastic moduli of the heterogeneous mandibular bone were determined from the bone density data of the computed tomography images. The results suggested that the software applied in this study is both useful and powerful for creating a more accurate three-dimensional finite-element model of a dentulous mandible from the computed tomography data without the need for any other software.

  12. Collision of Physics and Software in the Monte Carlo Application Toolkit (MCATK)

    SciTech Connect

    Sweezy, Jeremy Ed

    2016-01-21

    The topic is presented in a series of slides organized as follows: MCATK overview, development strategy, available algorithms, problem modeling (sources, geometry, data, tallies), parallelism, miscellaneous tools/features, example MCATK application, recent areas of research, and summary and future work. MCATK is a C++ component-based Monte Carlo neutron-gamma transport software library with continuous energy neutron and photon transport. Designed to build specialized applications and to provide new functionality in existing general-purpose Monte Carlo codes like MCNP, it reads ACE formatted nuclear data generated by NJOY. The motivation behind MCATK was to reduce costs. MCATK physics involves continuous energy neutron & gamma transport with multi-temperature treatment, static eigenvalue (keff and α) algorithms, time-dependent algorithm, and fission chain algorithms. MCATK geometry includes mesh geometries and solid body geometries. MCATK provides verified, unit-test Monte Carlo components, flexibility in Monte Carlo application development, and numerous tools such as geometry and cross section plotters.

  13. Using Focused Regression for Accurate Time-Constrained Scaling of Scientific Applications

    SciTech Connect

    Barnes, B; Garren, J; Lowenthal, D; Reeves, J; de Supinski, B; Schulz, M; Rountree, B

    2010-01-28

    Many large-scale clusters now have hundreds of thousands of processors, and processor counts will be over one million within a few years. Computational scientists must scale their applications to exploit these new clusters. Time-constrained scaling, which is often used, tries to hold total execution time constant while increasing the problem size along with the processor count. However, complex interactions between parameters, the processor count, and execution time complicate determining the input parameters that achieve this goal. In this paper we develop a novel gray-box, focused regression-based approach that assists the computational scientist with maintaining constant run time on increasing processor counts. Combining application-level information from a small set of training runs, our approach allows prediction of the input parameters that result in similar per-processor execution time at larger scales. Our experimental validation across seven applications showed that median prediction errors are less than 13%.
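    The gray-box idea can be illustrated with a deliberately simple stand-in model (the a*(n/p) + b*log2(p) form and both helper names are assumptions for illustration, not the paper's actual regressors): fit the model on small training runs, then invert it for the problem size n that holds run time at a target value on a larger processor count p.

```python
import numpy as np


def fit_time_model(n, p, t):
    # Least-squares fit of an assumed model t ~ a*(n/p) + b*log2(p):
    # computation scales with per-processor work, communication with
    # the log of the processor count.
    X = np.column_stack([n / p, np.log2(p)])
    coef, *_ = np.linalg.lstsq(X, t, rcond=None)
    return coef  # (a, b)


def problem_size_for_time(coef, p, target_t):
    # Invert the fitted model for the problem size n that yields
    # the target execution time on p processors.
    a, b = coef
    return (target_t - b * np.log2(p)) * p / a
```

    The real approach additionally uses application-level information to focus the regression on the parameters that matter; this sketch shows only the fit-then-invert structure.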

  14. RICIS Software Engineering 90 Symposium: Aerospace Applications and Research Directions Proceedings

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Papers presented at RICIS Software Engineering Symposium are compiled. The following subject areas are covered: synthesis - integrating product and process; Serpent - a user interface management system; prototyping distributed simulation networks; and software reuse.

  15. Agile Software Development

    ERIC Educational Resources Information Center

    Biju, Soly Mathew

    2008-01-01

    Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of change in the requirement at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…

  16. The Evolution of Software Pricing: From Box Licenses to Application Service Provider Models.

    ERIC Educational Resources Information Center

    Bontis, Nick; Chung, Honsan

    2000-01-01

    Describes three different pricing models for software. Findings of this case study support the proposition that software pricing is a complex and subjective process. The key determinant of alignment between vendor and user is the nature of value in the software to the buyer. This value proposition may range from increased cost reduction to…

  17. Software Engineering Environments for Mission Critical Applications -- STARS Alternative Programmatic Approaches.

    DTIC Science & Technology

    1984-08-01

    various phases would select the best candidates to continue development , reducing parallel efforts. Reshaping the policies, practices and regulations...replaced to meet changing needs, while continuing to support existing software. Software developed in an environment should be transportable to a...maintenance personnel are the developer , a separately identified life cycle support contractor, or a Government software support facility. Management

  18. Teacher-Designed Software for Interactive Linear Equations: Concepts, Interpretive Skills, Applications & Word-Problem Solving.

    ERIC Educational Resources Information Center

    Lawrence, Virginia

    No longer just a user of commercial software, the 21st century teacher is a designer of interactive software based on theories of learning. This software, a comprehensive study of straightline equations, enhances conceptual understanding, sketching, graphic interpretive and word problem solving skills as well as making connections to real-life and…

  19. Ultrasonic focusing through inhomogeneous media by application of the inverse scattering problem

    PubMed Central

    Haddadin, Osama S.; Ebbini, Emad S.

    2010-01-01

    A new approach is introduced for self-focusing phased arrays through inhomogeneous media for therapeutic and imaging applications. This algorithm utilizes solutions to the inverse scattering problem to estimate the impulse response (Green's function) of the desired focal point(s) at the elements of the array. The approach is a two-stage procedure: in the first stage the Green's function is estimated from measurements of the scattered field taken outside the region of interest. In the second stage, these estimates are used in the pseudoinverse method to compute excitation weights satisfying a predefined set of constraints on the structure of the field at the focus points. These scalar, complex-valued excitation weights are used to modulate the incident field for retransmission. The pseudoinverse pattern synthesis method requires knowing the Green's function between the focus points and the array, which is difficult to attain for an unknown inhomogeneous medium. However, the solution to the inverse scattering problem, the scattering function, can be used directly to compute the required inhomogeneous Green's function. This inverse scattering based self-focusing is noninvasive and does not require a strong point scatterer at or near the desired focus point; it simply requires measurements of the scattered field outside the region of interest. It can be used for high resolution imaging and enhanced therapeutic effects through inhomogeneous media without making any assumptions on the shape, size, or location of the inhomogeneity. This technique is outlined and numerical simulations are shown which validate it for single and multiple focusing using a circular array. PMID:9670525
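    The second-stage computation reduces to a linear least-squares problem: with G a matrix of estimated Green's functions (one row per focus/control point, one column per array element) and d the desired field values at those points, the minimum-norm excitation weights are w = G⁺d. A minimal numerical sketch, with the array geometry and constraint structure abstracted away:

```python
import numpy as np


def focusing_weights(G, desired):
    # Excitation weights w minimizing ||G @ w - desired|| via the
    # Moore-Penrose pseudoinverse; for a full-row-rank G (fewer focus
    # points than elements) this achieves the desired focal field
    # exactly with minimum total excitation energy.
    return np.linalg.pinv(G) @ desired
```

    In the paper's setting, G is filled with the inhomogeneous Green's functions recovered from the inverse scattering solution rather than free-space ones, which is what makes the focusing work through the unknown medium.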

  20. Supervised Semi-Automated Data Analysis Software for Gas Chromatography / Differential Mobility Spectrometry (GC/DMS) Metabolomics Applications.

    PubMed

    Peirano, Daniel J; Pasamontes, Alberto; Davis, Cristina E

    2016-09-01

    Modern differential mobility spectrometers (DMS) produce complex and multi-dimensional data streams that allow for near-real-time or post-hoc chemical detection for a variety of applications. An active area of interest for this technology is metabolite monitoring for biological applications, and these data sets regularly have unique technical and data analysis end user requirements. While there are initial publications on how investigators have individually processed and analyzed their DMS metabolomic data, there are no user-ready commercial or open source software packages that are easily used for this purpose. We have created custom software uniquely suited to analyze gas chromatograph / differential mobility spectrometry (GC/DMS) data from biological sources. Here we explain the implementation of the software, describe the user features that are available, and provide an example of how this software functions using a previously-published data set. The software is compatible with many commercial or home-made DMS systems. Because the software is versatile, it can also potentially be used for other similarly structured data sets, such as GC/GC and other IMS modalities.

  1. GEnomes Management Application (GEM.app): A new software tool for large-scale collaborative genome analysis

    PubMed Central

    Gonzalez, Michael A.; Acosta Lebrigio, Rafael F.; Van Booven, Derek; Ulloa, Rick H.; Powell, Eric; Speziani, Fiorella; Tekin, Mustafa; Schule, Rebecca; Zuchner, Stephan

    2015-01-01

    Novel genes are now identified at a rapid pace for many Mendelian disorders, and increasingly, for genetically complex phenotypes. However, new challenges have also become evident: (1) effectively managing larger exome and/or genome datasets, especially for smaller labs; (2) direct hands-on analysis and contextual interpretation of variant data in large genomic datasets; and (3) many small and medium-sized clinical and research-based investigative teams around the world are generating data that, if combined and shared, will significantly increase the opportunities for the entire community to identify new genes. To address these challenges we have developed GEnomes Management Application (GEM.app), a software tool to annotate, manage, visualize, and analyze large genomic datasets (https://genomics.med.miami.edu/). GEM.app currently contains ~1,600 whole exomes from 50 different phenotypes studied by 40 principal investigators from 15 different countries. The focus of GEM.app is on user-friendly analysis for non-bioinformaticians to make NGS data directly accessible. Yet, GEM.app provides powerful and flexible filter options, including single family filtering, across family/phenotype queries, nested filtering, and evaluation of segregation in families. In addition, the system is fast, obtaining results within 4 seconds across ~1,200 exomes. We believe that this system will further enhance identification of genetic causes of human disease. PMID:23463597

  2. Numerical simulation of shock wave focusing at fold caustics, with application to sonic boom.

    PubMed

    Marchiano, Régis; Coulouvrat, François; Grenon, Richard

    2003-10-01

    Weak shock wave focusing at fold caustics is described by the mixed-type elliptic/hyperbolic nonlinear Tricomi equation. This paper presents a new and original numerical method for solving this equation, using a potential formulation and an "exact" numerical solver for handling nonlinearities. Validation tests demonstrate quantitatively the efficiency of the algorithm, which is able to handle complex waveforms such as may come from "optimized" aircraft designed to minimize sonic booms. It provides a real alternative to the approximate method of the hodograph transform. This motivated the application to evaluating the ground track focusing of sonic boom for an accelerating aircraft, by coupling CFD Euler simulations performed around the mock-up on an adapted mesh grid, atmospheric propagation modeling, and the Tricomi algorithm. The chosen configuration is the European Eurosup mock-up. Convergence of the focused boom at the ground level as a function of the matching distance is investigated to demonstrate the efficiency of the numerical process. As a conclusion, it is indicated how the present work may pave the way towards a study on sonic superboom (focused boom) mitigation.

  3. Detailed design and first tests of the application software for the instrument control unit of Euclid-NISP

    NASA Astrophysics Data System (ADS)

    Ligori, S.; Corcione, L.; Capobianco, V.; Bonino, D.; Sirri, G.; Fornari, F.; Giacomini, F.; Patrizii, L.; Valenziano, L.; Travaglini, R.; Colodro, C.; Bortoletto, F.; Bonoli, C.; Chiarusi, T.; Margiotta, A.; Mauri, N.; Pasqualini, L.; Spurio, M.; Tenti, M.; Dal Corso, F.; Dusini, S.; Laudisio, F.; Sirignano, C.; Stanco, L.; Ventura, S.; Auricchio, N.; Balestra, A.; Franceschi, E.; Morgante, G.; Trifoglio, M.; Medinaceli, E.; Guizzo, G. P.; Debei, S.; Stephen, J. B.

    2016-07-01

    In this paper we describe the detailed design of the application software (ASW) of the instrument control unit (ICU) of NISP, the Near-Infrared Spectro-Photometer of the Euclid mission. This software is based on a real-time operating system (RTEMS) and will interface with all the subunits of NISP, as well as the command and data management unit (CDMU) of the spacecraft for telecommand and housekeeping management. We briefly review the main requirements driving the design and the architecture of the software that is approaching the Critical Design Review level. The interaction with the data processing unit (DPU), which is the intelligent subunit controlling the detector system, is described in detail, as well as the concept for the implementation of the failure detection, isolation and recovery (FDIR) algorithms. The first version of the software is under development on a Breadboard model produced by AIRBUS/CRISA. We describe the results of the tests and the main performances and budgets.

  4. Implementation of a Software Application for Presurgical Case History Review of Frozen Section Pathology Cases

    PubMed Central

    Norgan, Andrew P.; Okeson, Mathew L.; Juskewitch, Justin E.; Shah, Kabeer K.; Sukov, William R.

    2017-01-01

    Background: The frozen section pathology practice at Mayo Clinic in Rochester performs ~20,000 intraoperative consultations a year (~70–80/weekday). To prepare for intraoperative consultations, surgical pathology fellows and residents review the case history, previous pathology, and relevant imaging the day before surgery. Before the work described herein, review of pending surgical pathology cases was a paper-based process requiring handwritten transcription from the electronic health record, a laborious and potentially error prone process. Methods: To facilitate more efficient case review, a modular extension of an existing surgical listing software application (Surgical and Procedure Scheduling [SPS]) was developed. The module (SPS-pathology-specific module [PM]) added pathology-specific functionality including recording case notes, prefetching of radiology, pathology, and operative reports from the medical record, flagging infectious cases, and real-time tracking of cases in the operating room. After implementation, users were surveyed about its impact on the surgical pathology practice. Results: There were 16 survey respondents (five staff pathologists and eleven residents or fellows). All trainees (11/11) responded that the application improved an aspect of surgical list review including abstraction from medical records (10/11), identification of possibly infectious cases (7/11), and speed of list preparation (10/11). The average reported time savings in list preparation was 1.4 h/day. Respondents indicated the application improved the speed (11/16), clarity (13/16), and accuracy (10/16) of morning report. During the workday, respondents reported the application improved real-time case review (14/16) and situational awareness of ongoing cases (13/16). Conclusions: A majority of respondents found the SPS-PM improved all preparatory and logistical aspects of the Mayo Clinic frozen section surgical pathology practice. In addition, use of the SPS-PM saved an

  5. Agile hardware and software systems engineering for critical military space applications

    NASA Astrophysics Data System (ADS)

    Huang, Philip M.; Knuth, Andrew A.; Krueger, Robert O.; Garrison-Darrin, Margaret A.

    2012-06-01

    The Multi Mission Bus Demonstrator (MBD) is a successful demonstration of agile program management and systems engineering in a high-risk technology application where utilizing and implementing new, untraditional development strategies was necessary. MBD produced two fully functioning spacecraft for a military/DOD application in a record-breaking time frame and at dramatically reduced cost. This paper discloses the adaptation and application of concepts developed in agile software engineering to hardware product and system development for critical military applications. This challenging spacecraft did not use existing key technology (heritage hardware) and created a large paradigm shift from traditional spacecraft development. The insertion of new technologies and methods in space hardware has long been a problem due to long build times, the desire to use heritage hardware, and lack of effective process. The role of momentum in the innovative process can be exploited to tackle ongoing technology disruptions and to allow risk interactions to be mitigated in a disciplined manner. Examples of how these concepts were used during the MBD program are delineated. Maintaining project momentum was essential to assess the constant nonrecurring technological challenges, which needed to be retired rapidly from the engineering risk liens. Development never slowed, due to tactical assessment of the hardware with the adoption of the SCRUM technique. We adapted this concept as a representation of mitigation of technical risk while allowing for design freeze later in the program's development cycle. By using agile systems engineering and management techniques that enabled decisive action, the product development momentum was effectively used to produce two novel space vehicles in a fraction of the time with dramatically reduced cost.

  6. Laboratory Information Management Software for genotyping workflows: applications in high throughput crop genotyping

    PubMed Central

    Jayashree, B; Reddy, Praveen T; Leeladevi, Y; Crouch, Jonathan H; Mahalakshmi, V; Buhariwalla, Hutokshi K; Eshwar, KE; Mace, Emma; Folksterma, Rolf; Senthilvel, S; Varshney, Rajeev K; Seetha, K; Rajalakshmi, R; Prasanth, VP; Chandra, Subhash; Swarupa, L; SriKalyani, P; Hoisington, David A

    2006-01-01

    Background With the advances in DNA sequencer-based technologies, it has become possible to automate several steps of the genotyping process leading to increased throughput. To efficiently handle the large amounts of genotypic data generated and help with quality control, there is a strong need for a software system that can help with the tracking of samples and capture and management of data at different steps of the process. Such systems, while serving to manage the workflow precisely, also encourage good laboratory practice by standardizing protocols, recording and annotating data from every step of the workflow. Results A laboratory information management system (LIMS) has been designed and implemented at the International Crops Research Institute for the Semi-Arid Tropics (ICRISAT) that meets the requirements of a moderately high throughput molecular genotyping facility. The application is designed as modules and is simple to learn and use. The application leads the user through each step of the process from starting an experiment to the storing of output data from the genotype detection step with auto-binning of alleles; thus ensuring that every DNA sample is handled in an identical manner and all the necessary data are captured. The application keeps track of DNA samples and generated data. Data entry into the system is through the use of forms for file uploads. The LIMS provides functions to trace back to the electrophoresis gel files or sample source for any genotypic data and for repeating experiments. The LIMS is being presently used for the capture of high throughput SSR (simple-sequence repeat) genotyping data from the legume (chickpea, groundnut and pigeonpea) and cereal (sorghum and millets) crops of importance in the semi-arid tropics. Conclusion A laboratory information management system is available that has been found useful in the management of microsatellite genotype data in a moderately high throughput genotyping laboratory. The application

  7. Software Review.

    ERIC Educational Resources Information Center

    McGrath, Diane, Ed.

    1989-01-01

    Reviewed is a computer software package entitled "Audubon Wildlife Adventures: Grizzly Bears" for Apple II and IBM microcomputers. Included are availability, hardware requirements, cost, and a description of the program. The program's murder-mystery flavor is stressed as it focuses on illegal hunting and game…

  8. Application of bacteriorhodopsin films in an adaptive-focusing schlieren system

    NASA Astrophysics Data System (ADS)

    Downie, John D.

    1995-09-01

    The photochromic property of bacteriorhodopsin films is exploited in the application of a focusing schlieren optical system for the visualization of optical phase information. By encoding an image on the film with light of one wavelength and reading out with a different wavelength, the readout beam can effectively see the photographic negative of the original image. The potential advantage of this system over previous focusing schlieren systems is that the updatable nature of the bacteriorhodopsin film allows system adaptation. I discuss two image encoding and readout techniques for the bacteriorhodopsin and use film transmission characteristics to choose the more appropriate method. I demonstrate the system principle with experimental results using argon-ion and He-Cd lasers as the two light sources of different wavelengths, and I discuss current limitations to implementation with a white-light source.

  9. Driving Circuitry for Focused Ultrasound Noninvasive Surgery and Drug Delivery Applications

    PubMed Central

    El-Desouki, Munir M.; Hynynen, Kullervo

    2011-01-01

    Recent works on focused ultrasound (FUS) have shown great promise for cancer therapy. Researchers are continuously trying to improve system performance, which is resulting in an increased complexity that is more apparent when using multi-element phased array systems. This has led to significant efforts to reduce system size and cost by relying on system integration. Although ideas from other fields such as microwave antenna phased arrays can be adopted in FUS, the application requirements differ significantly since the frequency range used in FUS is much lower. In this paper, we review recent efforts to design efficient power monitoring, phase shifting and output driving techniques used specifically for high intensity focused ultrasound (HIFU). PMID:22346589
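    The phase shifting surveyed in this record serves one purpose in a phased array: delaying each element so that all wavefronts arrive at the focal point in step. A minimal geometric sketch of computing those per-element phase shifts (the `focal_phase_delays` helper is hypothetical, and the ~1540 m/s tissue sound speed and array geometry are illustrative assumptions, not taken from the paper):

    ```python
    import math

    def focal_phase_delays(elements, focus, speed_of_sound=1540.0, freq_hz=1.0e6):
        """Per-element phase shifts (radians) that align wavefronts at `focus`.

        elements: list of (x, y, z) transducer positions in metres
        focus:    (x, y, z) target point in metres
        speed_of_sound: assumed sound speed in tissue, m/s
        freq_hz:  driving frequency
        """
        dists = [math.dist(e, focus) for e in elements]
        d_max = max(dists)
        # Elements farther from the focus fire first; phase = 2*pi*f*dt,
        # where dt is the extra travel time relative to the farthest element.
        return [2 * math.pi * freq_hz * (d_max - d) / speed_of_sound for d in dists]
    ```

    For a symmetric two-element array equidistant from the focus, both delays come out equal, as expected.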

  10. Clinical Application of High-intensity Focused Ultrasound in Cancer Therapy

    PubMed Central

    Hsiao, Yi-Hsuan; Kuo, Shou-Jen; Tsai, Horng-Der; Chou, Ming-Chih; Yeh, Guang-Perng

    2016-01-01

    The treatment of cancer is an important issue in both developing and developed countries. Clinical use of ultrasound in cancer is not only for diagnosis but also for treatment. Focused ultrasound surgery (FUS) is a noninvasive technique. By combining high-intensity focused ultrasound (HIFU) with an imaging method, FUS has the potential to ablate tumor lesions precisely. The main mechanisms of HIFU ablation involve mechanical and thermal effects. Recent advances in HIFU have increased its popularity. Some promising results have been achieved in managing various malignancies, including pancreas, prostate, liver, kidney, breast and bone. Other applications include brain tumor ablation and disruption of the blood-brain barrier. We aim to briefly outline the clinical utility of FUS as a noninvasive technique for a variety of types of cancer treatment. PMID:26918034

  11. Application of Bacteriorhodopsin Films in an Adaptive-Focusing Schlieren System

    NASA Technical Reports Server (NTRS)

    Downie, John D.

    1995-01-01

    The photochromic property of bacteriorhodopsin films is exploited in the application of a focusing schlieren optical system for the visualization of optical phase information. By encoding an image on the film with light of one wavelength and reading out with a different wavelength, the readout beam can effectively see the photographic negative of the original image. The potential advantage of this system over previous focusing schlieren systems is that the updatable nature of the bacteriorhodopsin film allows system adaptation. I discuss two image encoding and readout techniques for the bacteriorhodopsin and use film transmission characteristics to choose the more appropriate method. I demonstrate the system principle with experimental results using argon-ion and He-Cd lasers as the two light sources of different wavelengths, and I discuss current limitations to implementation with a white-light source.

  12. Driving circuitry for focused ultrasound noninvasive surgery and drug delivery applications.

    PubMed

    El-Desouki, Munir M; Hynynen, Kullervo

    2011-01-01

    Recent works on focused ultrasound (FUS) have shown great promise for cancer therapy. Researchers are continuously trying to improve system performance, which is resulting in an increased complexity that is more apparent when using multi-element phased array systems. This has led to significant efforts to reduce system size and cost by relying on system integration. Although ideas from other fields such as microwave antenna phased arrays can be adopted in FUS, the application requirements differ significantly since the frequency range used in FUS is much lower. In this paper, we review recent efforts to design efficient power monitoring, phase shifting and output driving techniques used specifically for high intensity focused ultrasound (HIFU).

  13. Challenges in software applications for the cognitive evaluation and stimulation of the elderly

    PubMed Central

    2014-01-01

    Background Computer-based cognitive stimulation applications can help the elderly maintain and improve their cognitive skills. In this research paper, our objectives are to verify the usability of PESCO (an open-software application for cognitive evaluation and stimulation) and to determine the concurrent validity of cognitive assessment tests and the effectiveness of PESCO’s cognitive stimulation exercises. Methods Two studies were conducted in various community computer centers in the province of Granada. The first study tested tool usability by observing 43 elderly people and considering their responses to a questionnaire. In the second study, 36 elderly people completed pen-and-paper and PESCO tests followed by nine cognitive stimulation sessions. Meanwhile, a control group with 34 participants used computers for nine non-structured sessions. Results Analysis of the first study revealed that although PESCO had been developed by taking usability guidelines into account, there was room for improvement. Results from the second study indicated moderate concurrent validity between PESCO and standardized tests (Pearson’s r from .501 to .702) and highlighted the effectiveness of training exercises for improving attention (F = -4.111, p < .001) and planning (F = 5.791, p < .001) functions. Conclusions PESCO can be used by the elderly. The PESCO cognitive test module demonstrated its concurrent validity with traditional cognitive evaluation tests. The stimulation module is effective for improving attention and planning skills. PMID:24886420
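    The concurrent-validity figures quoted above (Pearson's r from .501 to .702) are product-moment correlations between PESCO scores and pen-and-paper test scores. For reference, the statistic can be computed directly; this is a generic sketch, not PESCO code:

    ```python
    import math

    def pearson_r(xs, ys):
        """Pearson product-moment correlation between two paired score lists."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        # Covariance numerator and the two standard-deviation terms
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
        sy = math.sqrt(sum((y - my) ** 2 for y in ys))
        return cov / (sx * sy)
    ```

    Perfectly linear paired scores give r = 1.0 (or -1.0 when inversely related); values such as .501-.702 indicate moderate agreement between the two instruments.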

  14. SEE: improving nurse-patient communications and preventing software piracy in nurse call applications.

    PubMed

    Unluturk, Mehmet S

    2012-06-01

    A nurse call system is an electrically functioning system by which patients can call for assistance from a bedside station or from a duty station. An intermittent tone is heard and a corridor lamp located outside the room starts blinking at a slower or faster rate depending on the call origination. It is essential to alert nurses on time so that they can offer care and comfort without delay. There are currently many devices available for a nurse call system to improve communication between nurses and patients, such as pagers, RFID (radio frequency identification) badges, wireless phones and so on. To integrate all these devices into an existing nurse call system and make them communicate with each other, we propose software client applications called bridges in this paper. We also propose a Windows server application called SEE (Supervised Event Executive) that delivers messages among these devices. A single hardware dongle is utilized for authentication and copy protection for SEE. Protecting SEE only with the security provided by the dongle is a weak defense against hackers. In this paper, we develop some defense patterns against hackers, such as calculating checksums at runtime, making calls to the dongle from multiple places in the code, and handling errors properly by logging them into a database.
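    The first defense pattern mentioned, calculating checksums at runtime, amounts to hashing the deployed binary and comparing it against a digest recorded at build time, logging any mismatch. A minimal sketch of the idea (SEE itself is a Windows product; the function names here are hypothetical):

    ```python
    import hashlib

    def code_checksum(path):
        """SHA-256 digest of an executable/module file, computed at runtime."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            # Read in chunks so large binaries don't load into memory at once
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    def verify_integrity(path, expected_digest, log):
        """Compare against the digest recorded at build time; log tampering."""
        if code_checksum(path) != expected_digest:
            log.append(f"integrity check failed for {path}")
            return False
        return True
    ```

    In the pattern the paper describes, such a check would be invoked from multiple, non-obvious places in the code rather than a single guard that an attacker can patch out.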

  15. Application of open source image guided therapy software in MR-guided therapies.

    PubMed

    Hata, Nobuhiko; Piper, Steve; Jolesz, Ferenc A; Tempany, Clare M C; Black, Peter McL; Morikawa, Shigehiro; Iseki, Horoshi; Hashizume, Makoto; Kikinis, Ron

    2007-01-01

    We present software engineering methods to provide free open-source software for MR-guided therapy. We report that graphical representation of the surgical tools, interconnectivity with the tracking device, patient-to-image registration, and MRI-based thermal mapping are crucial components of MR-guided therapy in sharing such software. The software process includes a network-based distribution mechanism built on the multi-platform build tool CMake, the version control system CVS, and the quality assurance software DART. We developed six procedures at four separate clinical sites using the proposed software engineering and process, and found the proposed method feasible for facilitating multicenter clinical trials of MR-guided therapies. Our future studies include use of the software in non-MR-guided therapies.

  16. Data-Interpolating Variational Analysis (DIVA) software : recent development and application

    NASA Astrophysics Data System (ADS)

    Watelet, Sylvain; Beckers, Jean-Marie; Barth, Alexander; Back, Örjan

    2016-04-01

    The Data-Interpolating Variational Analysis (DIVA) software is a tool designed to reconstruct a continuous field from discrete measurements. This method is based on the numerical implementation of the Variational Inverse Model (VIM), which consists of the minimization of a cost function, allowing the choice of the analysed field that best fits the data sets. The problem is solved efficiently using a finite-element method. This statistical method is particularly suited to dealing with irregularly-spaced observations, producing outputs on a regular grid. Initially created to work in a two-dimensional way, the software is now able to handle 3D or even 4D analyses, in order to easily produce ocean climatologies. These analyses can easily be improved by taking advantage of DIVA's ability to take topographic and dynamic constraints into account (coastal relief, prevailing wind impacting the advection, ...). DIVA is an open-source software package which is continuously upgraded and distributed for free through frequent version releases. The development is funded by the EMODnet and SeaDataNet projects and includes many discussions and feedback from the user community. Here, we present two recent major upgrades: the data weighting option and the bottom-based analyses. Since DIVA works with a diagonal observation error covariance matrix, it is assumed that the observation errors are uncorrelated in space and time. In practice, this assumption is not always valid, especially when dealing e.g. with cruise measurements (same instrument) or with time series at a fixed geographic point (representativity error). The data weighting option decreases the weights in the analysis of such observations. These weights are based on an exponential function using a 3D (x,y,t) distance between observations. A comparison between unweighted and weighted analyses will be shown. 
It has been a recurrent request from the DIVA users to improve the way the analyses near the ocean bottom
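    The data-weighting idea described above can be illustrated in a few lines: each observation is down-weighted according to the exponentially decaying proximity of its neighbours in (x, y, t), so clustered cruise or time-series points share their influence. This is only a sketch of the concept; DIVA's actual weighting formula and parameters may differ:

    ```python
    import math

    def correlated_obs_weights(points, length_scale):
        """Down-weight observations that cluster in (x, y, t).

        points: list of (x, y, t) tuples in consistent, pre-scaled units
        Each observation's weight is 1 / (1 + sum over neighbours of
        exp(-d / length_scale)), so an isolated point keeps weight ~1
        while coincident points split their influence.
        """
        weights = []
        for i, p in enumerate(points):
            redundancy = sum(
                math.exp(-math.dist(p, q) / length_scale)
                for j, q in enumerate(points) if j != i
            )
            weights.append(1.0 / (1.0 + redundancy))
        return weights
    ```

    Two coincident observations each receive weight 0.5 under this scheme, while a point far from all others keeps a weight near 1.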

  17. The Point-Focusing Thermal and Electric Applications Project - A progress report. [small solar power systems applications

    NASA Technical Reports Server (NTRS)

    Marriott, A. T.

    1979-01-01

    The paper discusses the Point-Focusing Thermal and Electric Applications Project which encompasses three primary activities: (1) applications analysis and development, in which potential markets for small power systems (less than 10 MWe) are identified and characterized in order to provide requirements for design and information for activities relating to market development; (2) systems engineering and development, for analyses that will define the most appropriate small power system designs based on specific user requirements; and (3) experiment implementation and test, which deals with the design and placement of engineering experiments in various applications environments in order to test the readiness of the selected technology in an operational setting. Progress to date and/or key results are discussed throughout the text.

  18. Open Source Subtitle Editor Software Study for Section 508 Close Caption Applications

    NASA Technical Reports Server (NTRS)

    Murphy, F. Brandon

    2013-01-01

    This paper will focus on a specific item within the NASA Electronic Information Accessibility Policy - Multimedia Presentations shall have synchronized captions, thus making information accessible to persons with hearing impairment. This synchronized caption will assist a person with a hearing or cognitive disability to access the same information as everyone else. This paper focuses on the research and implementation of CC (subtitle option) support for video multimedia. The goal of this research is to identify the best available open-source (free) software to meet the synchronized-caption requirement and achieve savings, while meeting the security requirements for Government information integrity and assurance. CC and subtitling are processes that display text within a video to provide additional or interpretive information for those who may need it or those who choose it. Closed captions typically show the transcription of the audio portion of a program (video) as it occurs (either verbatim or in its edited form), sometimes including non-speech elements (such as sound effects). The transcript can be provided by a third-party source or can be extracted word for word from the video. This feature can be made available for videos in two forms: either Soft-Coded or Hard-Coded. Soft-Coded is the optional form of CC, which the viewer can choose to turn on or off. Most of the time, when using the Soft-Coded option, the transcript is also provided to the viewer alongside the video. This option is subject to compromise, since the transcript is merely a text file that can be changed by anyone who has access to it. With this option the integrity of the CC is at the mercy of the user. Hard-Coded CC is a more permanent form of CC. A Hard-Coded CC transcript is embedded within a video, without the option of removal.

  19. Theranostic applications of carbon nanomaterials in cancer: Focus on imaging and cargo delivery.

    PubMed

    Chen, Daiqin; Dougherty, Casey A; Zhu, Kaicheng; Hong, Hao

    2015-07-28

    Carbon based nanomaterials have attracted significant attention over the past decades due to their unique physical properties, versatile functionalization chemistry, and biological compatibility. In this review, we will summarize the current state-of-the-art applications of carbon nanomaterials in cancer imaging and drug delivery/therapy. The carbon nanomaterials will be categorized into fullerenes, nanotubes, nanohorns, nanodiamonds, nanodots and graphene derivatives based on their morphologies. The chemical conjugation/functionalization strategies of each category will be introduced before focusing on their applications in cancer imaging (fluorescence/bioluminescence, magnetic resonance (MR), positron emission tomography (PET), single-photon emission computed tomography (SPECT), photoacoustic, Raman imaging, etc.) and cargo (chemo/gene/therapy) delivery. The advantages and limitations of each category and the potential clinical utilization of these carbon nanomaterials will be discussed. Multifunctional carbon nanoplatforms have the potential to serve as optimal candidates for image-guided delivery vectors for cancer.

  20. Optical diagnostic and therapy applications of femtosecond laser radiation using lens-axicon focusing.

    PubMed

    Parigger, Christian G; Johnson, Jacqueline A; Splinter, Robert

    2013-01-01

    Diagnostic modalities by means of optical and/or near-infrared femtosecond radiation through biological media can in principle be adapted to therapeutic applications. Of specific interest are soft tissue diagnostics and subsequent therapy through hard tissue such as bone. Femtosecond laser pulses are delivered to hydroxyapatite representing bone, and photo-acoustic spectroscopy is presented in order to identify the location of optical anomalies in an otherwise homogeneous medium. Imaging through bone is being considered for diagnostic, and potentially therapeutic, applications related to brain tumors. The use of mesomeric optics such as lens-axicon combinations is of interest to achieve a favorable distribution of focused radiation. Direct therapy by increasing local temperature to induce hyperthermia is one mode of brain tumor therapy. This can be enhanced by seeding the tumor with nanoparticles. Opto-acoustic imaging using femtosecond laser radiation is a further opportunity for diagnosis.

  1. The Environment for Application Software Integration and Execution (EASIE) version 1.0. Volume 4: System installation and maintenance guide

    NASA Technical Reports Server (NTRS)

    Randall, Donald P.; Jones, Kennie H.; Rowell, Lawrence F.

    1988-01-01

    The Environment for Application Software Integration and Execution (EASIE) provides both a methodology and a set of software utility programs to ease the task of coordinating engineering design and analysis codes. This document provides the necessary information for installing the EASIE software on a host computer system. The target host is a DEC VAX running VMS version 4; host dependencies are noted when appropriate. Relevant directories and individual files are identified, and compile/load/execute sequences are specified. In the case of the data management utilities, database management system (DBMS) specific features are described in an effort to assist the maintenance programmer in converting to a new DBMS. The document also describes a sample EASIE program directory structure to guide the program implementer in establishing his/her application-dependent environment.

  2. Advanced software development workstation. Knowledge base design: Design of knowledge base for flight planning application

    NASA Technical Reports Server (NTRS)

    Izygon, Michel E.

    1992-01-01

    The development process of the knowledge base for the generation of Test Libraries for Mission Operations Computer (MOC) Command Support focused on a series of information-gathering interviews. These knowledge capture sessions are supporting the development of a prototype for evaluating the capabilities of INTUIT on such an application. The prototype includes functions related to POCC (Payload Operation Control Center) processing. It prompts the end-users for input through a series of panels and then generates the Meds associated with the initialization and the update of hazardous command tables for a POCC Processing TLIB.

  3. Eddy Covariance Method for CO2 Emission Measurements: CCS Applications, Principles, Instrumentation and Software

    NASA Astrophysics Data System (ADS)

    Burba, George; Madsen, Rod; Feese, Kristin

    2013-04-01

    The Eddy Covariance method is a micrometeorological technique for direct high-speed measurements of the transport of gases, heat, and momentum between the earth's surface and the atmosphere. Gas fluxes, emission and exchange rates are carefully characterized from single-point in-situ measurements using permanent or mobile towers, or moving platforms such as automobiles, helicopters, airplanes, etc. Since the early 1990s, this technique has been widely used by micrometeorologists across the globe for quantifying CO2 emission rates from various natural, urban and agricultural ecosystems [1,2], including areas of agricultural carbon sequestration. Presently, over 600 eddy covariance stations are in operation in over 120 countries. In the last 3-5 years, advancements in instrumentation and software have reached the point when they can be effectively used outside the area of micrometeorology, and can prove valuable for geological carbon capture and sequestration, landfill emission measurements, high-precision agriculture and other non-micrometeorological industrial and regulatory applications. In the field of geological carbon capture and sequestration, the magnitude of CO2 seepage fluxes depends on a variety of factors. Emerging projects utilize eddy covariance measurement to monitor large areas where CO2 may escape from the subsurface, to detect and quantify CO2 leakage, and to assure the efficiency of CO2 geological storage [3,4,5,6,7,8]. Although Eddy Covariance is one of the most direct and defensible ways to measure and calculate turbulent fluxes, the method is mathematically complex, and requires careful setup, execution and data processing tailor-fit to a specific site and a project. 
With this in mind, step-by-step instructions were created to introduce a novice to the conventional Eddy Covariance technique [9], and to assist in further understanding the method through more advanced references such as graduate-level textbooks, flux networks guidelines, journals
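    At its core, the Eddy Covariance calculation the abstract describes computes a flux as the covariance of high-rate vertical wind speed and gas concentration fluctuations (Reynolds decomposition). A bare-bones sketch of that central step, deliberately omitting the coordinate rotations, despiking, and density corrections a real processing chain applies:

    ```python
    def eddy_covariance_flux(w, c):
        """Gas flux as the covariance of vertical wind speed `w` (m/s) and
        gas concentration `c` (e.g. mmol/m^3) over one averaging period.

        Flux = mean(w'c'), where primes denote deviations from the
        averaging-period means (Reynolds decomposition).
        """
        n = len(w)
        w_mean = sum(w) / n
        c_mean = sum(c) / n
        return sum((wi - w_mean) * (ci - c_mean) for wi, ci in zip(w, c)) / n
    ```

    When updrafts systematically carry higher concentrations than downdrafts, the covariance is positive, indicating a net upward (emission) flux; uncorrelated fluctuations yield a flux near zero.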

  4. Printable Electrochemical Biosensors: A Focus on Screen-Printed Electrodes and Their Application

    PubMed Central

    Yamanaka, Keiichiro; Vestergaard, Mun’delanji C.; Tamiya, Eiichi

    2016-01-01

    In this review we present electrochemical biosensor developments, focusing on screen-printed electrodes (SPEs) and their applications. In particular, we discuss how SPEs enable simple integration, and the portability needed for on-field applications. First, we briefly discuss the general concept of biosensors and quickly move on to electrochemical biosensors. Drawing from research undertaken in this area, we cover the development of electrochemical DNA biosensors in great detail. Through specific examples, we describe the fabrication and surface modification of printed electrodes for sensitive and selective detection of targeted DNA sequences, as well as integration with reverse transcription-polymerase chain reaction (RT-PCR). For a more rounded approach, we also touch on electrochemical immunosensors and enzyme-based biosensors. Last, we present some electrochemical devices specifically developed for use with SPEs, including USB-powered compact mini potentiostat. The coupling demonstrates the practical use of printable electrode technologies for application at point-of-use. Although tremendous advances have indeed been made in this area, a few challenges remain. One of the main challenges is application of these technologies for on-field analysis, which involves complicated sample matrices. PMID:27775661

  5. Printable Electrochemical Biosensors: A Focus on Screen-Printed Electrodes and Their Application.

    PubMed

    Yamanaka, Keiichiro; Vestergaard, Mun'delanji C; Tamiya, Eiichi

    2016-10-21

    In this review we present electrochemical biosensor developments, focusing on screen-printed electrodes (SPEs) and their applications. In particular, we discuss how SPEs enable simple integration, and the portability needed for on-field applications. First, we briefly discuss the general concept of biosensors and quickly move on to electrochemical biosensors. Drawing from research undertaken in this area, we cover the development of electrochemical DNA biosensors in great detail. Through specific examples, we describe the fabrication and surface modification of printed electrodes for sensitive and selective detection of targeted DNA sequences, as well as integration with reverse transcription-polymerase chain reaction (RT-PCR). For a more rounded approach, we also touch on electrochemical immunosensors and enzyme-based biosensors. Last, we present some electrochemical devices specifically developed for use with SPEs, including USB-powered compact mini potentiostat. The coupling demonstrates the practical use of printable electrode technologies for application at point-of-use. Although tremendous advances have indeed been made in this area, a few challenges remain. One of the main challenges is application of these technologies for on-field analysis, which involves complicated sample matrices.

  6. An application of the focused liquid jet: needle free drug injection system

    NASA Astrophysics Data System (ADS)

    Kiyama, Akihito; Katsuta, Chihiro; Kawamoto, Sennosuke; Endo, Nanami; Tanaka, Akane; Tagawa, Yoshiyuki

    2016-11-01

    Recently, the focused liquid jet has drawn great attention since it can be applied in various applications (e.g., inkjet printing, medical devices). In our research, in order to discuss its applicability to a needle-free drug injection system, we shoot a focused liquid jet at an animal skin at very high speed. Previously, the penetration of this jet into gelatin and an artificial skin had been studied in order to model the jet penetration process. However, experiments on jet injection into animal skin had not yet been conducted. In this presentation, we inject ink as the liquid jet into the skin of the hairless rat. We observe the top/back view and the cross-sectional view of the injected (ink-stained) skin. We capture the stained area of the skin in order to find characteristics of the jet penetration. We discuss the criteria for jet penetration into the skin. This work was supported by JSPS KAKENHI Grant Numbers JP26709007, JP16J08521.

  7. New horizons for focused ultrasound (FUS) - therapeutic applications in neurodegenerative diseases.

    PubMed

    Miller, Diane B; O'Callaghan, James P

    2017-04-01

    Access to the CNS and delivery of therapeutics across the blood-brain barrier remains a challenge for most treatments of major neurological diseases such as AD or PD. Focused ultrasound represents a potential approach for overcoming these barriers to treating AD and PD and perhaps other neurological diseases. Ultrasound (US) is best known for its imaging capabilities of organs in the periphery, but various arrangements of the transducers producing the acoustic signal allow the energy to be precisely focused (F) within the skull. Using FUS in combination with MRI and contrast agents further enhances accuracy by providing clear information on location. Varying the acoustic power allows FUS to be used in applications ranging from imaging, stimulation of brain circuits, to ablation of tissue. In several transgenic mouse models of AD, the use of FUS with microbubbles reduces plaque load and improves cognition and suggests the need to investigate this technology for plaque removal in AD. In PD, FUS is being explored as a way to non-invasively ablate the brain areas responsible for the tremor and dyskinesia associated with the disease, but has yet to be utilized for non-invasive delivery of putative therapeutics. The FUS approach also greatly increases the range of possible CNS therapeutics as it overcomes the issues of BBB penetration. In this review we discuss how the characteristics and various applications of FUS may advance the therapeutics available for treating or preventing neurodegenerative disorders with an emphasis on treating AD and PD.

  8. The Relationship between Teacher Attitudes towards Software Applications and Student Achievement in Fourth and Fifth Grade Classrooms

    ERIC Educational Resources Information Center

    Spencer, Laura K.

    2010-01-01

    The problem: The problem addressed in this study was to examine how teacher attitudes towards software applications affect student achievement in the classroom. Method: A correlational study was conducted, and 50 fourth and fifth grade teachers who taught in the Santee School District, were administered a survey assessing their attitudes…

  9. Integrating model behavior, optimization, and sensitivity/uncertainty analysis: overview and application of the MOUSE software toolbox

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This paper provides an overview of the Model Optimization, Uncertainty, and SEnsitivity Analysis (MOUSE) software application, an open-source, Java-based toolbox of visual and numerical analysis components for the evaluation of environmental models. MOUSE is based on the OPTAS model calibration syst...

  10. Adopting Open-Source Software Applications in U. S. Higher Education: A Cross-Disciplinary Review of the Literature

    ERIC Educational Resources Information Center

    van Rooij, Shahron Williams

    2009-01-01

    Higher Education institutions in the United States are considering Open Source software applications such as the Moodle and Sakai course management systems and the Kuali financial system to build integrated learning environments that serve both academic and administrative needs. Open Source is presumed to be more flexible and less costly than…

  11. 25 CFR 547.8 - What are the minimum technical software standards applicable to Class II gaming systems?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... OF CLASS II GAMES § 547.8 What are the minimum technical software standards applicable to Class II... of Class II games. (a) Player interface displays. (1) If not otherwise provided to the player, the player interface shall display the following: (i) The purchase or wager amount; (ii) Game results;...

  12. 25 CFR 547.8 - What are the minimum technical software standards applicable to Class II gaming systems?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... OF CLASS II GAMES § 547.8 What are the minimum technical software standards applicable to Class II... of Class II games. (a) Player interface displays. (1) If not otherwise provided to the player, the player interface shall display the following: (i) The purchase or wager amount; (ii) Game results;...

  13. 25 CFR 547.8 - What are the minimum technical software standards applicable to Class II gaming systems?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... OF CLASS II GAMES § 547.8 What are the minimum technical software standards applicable to Class II... of Class II games. (a) Player interface displays. (1) If not otherwise provided to the player, the player interface shall display the following: (i) The purchase or wager amount; (ii) Game results;...

  14. Multi-focus, high resolution inspection system for extended range applications

    NASA Astrophysics Data System (ADS)

    Harding, Kevin

    2016-05-01

    Visual inspection of parts or structures for defects typically requires good spatial resolution to see the defects, but may also require a large focus range. To obtain the best resolution, however, an imaging system needs a low f-number, which limits the usable depth of field. Methods that use autofocus or focus stacking provide more range at high resolution, but often at the expense of computation time, loss of a real-time image, and uncertainty in scale changes. This paper describes an approach to quickly move through a range of focus positions without mechanically moving optics, in a manner that is highly repeatable, maintains high resolution, and provides the potential for a live image directly viewable by an inspector, even at microscope-level magnifications. This paper will present the approach we investigated and discuss the pros and cons for a range of applications, from large structures to small-feature inspection. The paper will present examples of the resolution achieved and show how the multiple images might also be used to determine other parameters, such as the pose of a test surface.
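    The focus-stacking alternative mentioned above can be sketched in a few lines: for each image region, keep the sample from whichever focus slice is locally sharpest. The toy 1-D data and the simple gradient-energy sharpness measure below are illustrative assumptions, not the paper's optical method.

```python
def gradient_energy(row, i):
    # Simple 1-D sharpness measure: squared central difference around i.
    lo, hi = max(i - 1, 0), min(i + 1, len(row) - 1)
    return (row[hi] - row[lo]) ** 2

def focus_stack(slices):
    # Per position, take the sample from the locally sharpest focus slice.
    return [max(slices, key=lambda s: gradient_energy(s, i))[i]
            for i in range(len(slices[0]))]

# Two focus slices of the same 1-D "scene": 'a' is sharp on the left edge,
# 'b' on the right; each is flat (defocused) elsewhere.
a = [0, 0, 10, 10, 3, 3, 3, 3]
b = [3, 3, 3, 3, 0, 0, 10, 10]
fused = focus_stack([a, b])
```

The fused result keeps the high-contrast edge from each slice, which is exactly the computation-time cost the paper's optical approach is designed to avoid.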

  15. A strong-focusing 800 MeV cyclotron for high-current applications

    NASA Astrophysics Data System (ADS)

    Pogue, N.; Assadi, S.; Badgley, K.; Comeaux, J.; Kellams, J.; McInturff, A.; McIntyre, P.; Sattarov, A.

    2013-04-01

    A superconducting strong-focusing cyclotron (SFC) is being developed for high-current applications. It incorporates four innovations. Superconducting quarter-wave cavities are used to provide >20 MV/turn acceleration. The orbit separation is thereby opened so that bunch-bunch interactions between successive orbits are eliminated. Quadrupole focusing channels are incorporated within the sectors so that alternating-gradient strong-focusing transport is maintained throughout. Dipole windings on the inner and outer orbits provide enhanced control for injection and extraction of bunches. Finally, each sector magnet is configured as a flux-coupled stack of independent apertures, so that any desired number of independent cyclotrons can be integrated within a common footprint. Preliminary simulations indicate that each SFC should be capable of accelerating 10 mA CW to 800 MeV with very low loss and >50% energy efficiency. A primary motivation for SFC is as a proton driver for accelerator-driven subcritical fission in a molten salt core. The cores are fueled solely with the transuranics from spent nuclear fuel from a conventional nuclear power plant. The beams from one SFC stack would destroy all of the transuranics and long-lived fission products that are produced by a GWe reactor [1]. This capability offers the opportunity to close the nuclear fuel cycle and provide a path to green nuclear energy.

  16. [The Development and Application of the Orthopaedics Implants Failure Database Software Based on WEB].

    PubMed

    Huang, Jiahua; Zhou, Hai; Zhang, Binbin; Ding, Biao

    2015-09-01

    This article describes the development of a new WEB-based failure database software for orthopaedics implants. The software is based on the B/S (browser/server) mode; ASP dynamic web technology is used as its main development language to achieve data interactivity, and Microsoft Access is used to create the database. These mature technologies make the software easy to extend and upgrade. In this article, the design and development ideas behind the software, its working process and functions, as well as its relevant technical features, are presented. With this software, many different types of failure events of orthopaedics implants can be stored and the failure data statistically analyzed; at the macroscopic level, it can be used to evaluate the reliability of orthopaedics implants and operations, and it can ultimately guide doctors in improving the clinical treatment level.

  17. Application of an integrated multi-criteria decision making AHP-TOPSIS methodology for ETL software selection.

    PubMed

    Hanine, Mohamed; Boutkhoum, Omar; Tikniouine, Abdessadek; Agouti, Tarik

    2016-01-01

    A range of ETL (Extract, Transform and Load) software packages is now available, constituting a major investment market. Each ETL tool uses its own techniques for extracting, transforming and loading data into a data warehouse, which makes evaluating ETL software very difficult. However, choosing the right ETL software is critical to the success or failure of any Business Intelligence project. Because many factors affect the selection of ETL software, the process is considered a complex multi-criteria decision making (MCDM) problem. In this study, a decision-making methodology that employs two well-known MCDM techniques, the Analytic Hierarchy Process (AHP) and the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS), is designed. AHP is used to analyze the structure of the ETL software selection problem and obtain weights for the selected criteria; TOPSIS is then used to calculate the alternatives' ratings. An example is given to illustrate the proposed methodology. Finally, a software prototype demonstrating both methods is implemented.
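    The TOPSIS step described above can be sketched compactly: normalize the decision matrix, weight it, and score each alternative by its relative closeness to the ideal solution. The decision matrix, weights, and criteria below are hypothetical stand-ins, not the paper's AHP-derived values.

```python
def topsis(matrix, weights, benefit):
    """Score alternatives; 1.0 means coinciding with the ideal solution."""
    cols = range(len(matrix[0]))
    norms = [sum(row[j] ** 2 for row in matrix) ** 0.5 for j in cols]
    # Weighted, vector-normalized decision matrix.
    v = [[weights[j] * row[j] / norms[j] for j in cols] for row in matrix]
    # Ideal best/worst per criterion; direction depends on benefit vs cost.
    best = [max(c) if benefit[j] else min(c) for j, c in enumerate(zip(*v))]
    worst = [min(c) if benefit[j] else max(c) for j, c in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_best = sum((x - b) ** 2 for x, b in zip(row, best)) ** 0.5
        d_worst = sum((x - w) ** 2 for x, w in zip(row, worst)) ** 0.5
        scores.append(d_worst / (d_best + d_worst))  # relative closeness
    return scores

# Three hypothetical ETL tools: criterion 0 = cost (lower is better),
# criterion 1 = feature score (higher is better); AHP-style weights assumed.
scores = topsis([[100, 8], [80, 6], [120, 9]],
                weights=[0.4, 0.6], benefit=[False, True])
ranking = sorted(range(len(scores)), key=lambda i: -scores[i])
```

In a full AHP-TOPSIS workflow the weight vector would come from pairwise-comparison matrices rather than being supplied directly as here.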

  18. Application of Real Options Theory to DoD Software Acquisitions

    DTIC Science & Technology

    2009-02-20

    2008). A Monte Carlo simulation of the risk model (Figure 5) was run using the Risk Simulator software, taking into account interdependencies between...and we run the Monte Carlo simulation of the model with the revised risk estimates again. Based on the risk of requirements uncertainty presented...design and implementation, software validation and software evolution uncertainties, all of which can be categorized as exhibiting both Heisenberg-type

  19. In Vivo Application and Localization of Transcranial Focused Ultrasound Using Dual-Mode Ultrasound Arrays

    PubMed Central

    Haritonova, Alyona; Liu, Dalong; Ebbini, Emad S.

    2015-01-01

    Focused ultrasound (FUS) has been proposed for a variety of transcranial applications, including neuromodulation, tumor ablation, and blood brain barrier opening. A flurry of activity in recent years has generated encouraging results demonstrating its feasibility in these and other applications. To date, monitoring of FUS beams has been primarily accomplished using MR guidance, where both MR thermography and elastography have been used. The recent introduction of real-time dual-mode ultrasound array (DMUA) systems offers a new paradigm in transcranial focusing. In this paper, we present the first experimental results of ultrasound-guided transcranial FUS (tFUS) application in a rodent brain, both ex vivo and in vivo. DMUA imaging is used for visualization of the treatment region for placement of the focal spot within the brain. This includes the detection and localization of pulsating blood vessels at or near the target point(s). In addition, DMUA imaging is used to monitor and localize the FUS-tissue interactions in real-time. In particular, a concave (40-mm radius of curvature), 32-element, 3.5 MHz DMUA prototype was used for imaging and tFUS application in ex vivo and in vivo rat models. The ex vivo experiments were used to evaluate the point spread function (psf) of the transcranial DMUA imaging at various points within the brain. In addition, DMUA-based transcranial ultrasound thermography measurements were compared with thermocouple measurements of subtherapeutic tFUS heating in rat brain ex vivo. The ex vivo setting was also used to demonstrate the DMUA capability to produce localized thermal lesions. The in vivo experiments were designed to demonstrate the ability of the DMUA to apply, monitor, and localize subtherapeutic tFUS patterns that could be beneficial in transient blood brain barrier opening. The results show that, while the DMUA focus is degraded due to the propagation through the skull, it still produces localized heating effects within sub

  20. Caltech/JPL Conference on Image Processing Technology, Data Sources and Software for Commercial and Scientific Applications

    NASA Technical Reports Server (NTRS)

    Redmann, G. H.

    1976-01-01

    Recent advances in image processing and new applications are presented to the user community to stimulate the development and transfer of this technology to industrial and commercial applications. The Proceedings contains 37 papers and abstracts, including many illustrations (some in color) and provides a single reference source for the user community regarding the ordering and obtaining of NASA-developed image-processing software and science data.

  1. Software Aspects of IEEE Floating-Point Computations for Numerical Applications in High Energy Physics

    ScienceCinema

    None

    2016-07-12

    Floating-point computations are at the heart of much of the computing done in high energy physics. The correctness, speed and accuracy of these computations are of paramount importance. The lack of any of these characteristics can mean the difference between new, exciting physics and an embarrassing correction. This talk will examine practical aspects of IEEE 754-2008 floating-point arithmetic as encountered in HEP applications. After describing the basic features of IEEE floating-point arithmetic, the presentation will cover: common hardware implementations (SSE, x87); techniques for improving the accuracy of summation, multiplication and data interchange; compiler options for gcc and icc affecting floating-point operations; and hazards to be avoided. About the speaker: Jeffrey M Arnold is a Senior Software Engineer in the Intel Compiler and Languages group at Intel Corporation. He has been part of the Digital->Compaq->Intel compiler organization for nearly 20 years; for part of that time, he worked on both low- and high-level math libraries. Prior to that, he was in the VMS Engineering organization at Digital Equipment Corporation. In the late 1980s, Jeff spent 2½ years at CERN as part of the CERN/Digital Joint Project. In 2008, he returned to CERN to spend 10 weeks working with CERN/openlab. Since that time, he has returned to CERN multiple times to teach at openlab workshops and consult with various LHC experiments. Jeff received his Ph.D. in physics from Case Western Reserve University.
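    A classic example of the summation-accuracy techniques such talks cover is compensated (Kahan) summation, which carries the low-order bits lost by each addition forward as a correction. This sketch is illustrative background, not material taken from the talk itself.

```python
def kahan_sum(values):
    """Compensated summation: recover low-order bits lost by each add."""
    total = 0.0
    comp = 0.0                  # running compensation
    for v in values:
        y = v - comp            # apply the correction from the prior step
        t = total + y           # low-order bits of y may be lost here...
        comp = (t - total) - y  # ...and are recovered algebraically
        total = t
    return total

# Ten 0.1s: naive left-to-right addition accumulates rounding error.
data = [0.1] * 10
naive = sum(data)            # 0.9999999999999999
accurate = kahan_sum(data)
```

Here `kahan_sum(data)` returns exactly 1.0, matching `math.fsum`; the naive loop does not. The same idea scales to the long accumulations common in HEP histogramming.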

  2. Conformity assessment of the measurement accuracy in testing laboratories using a software application

    NASA Astrophysics Data System (ADS)

    Diniţă, A.

    2017-02-01

    This article presents a method for assessing the accuracy of measurements obtained in different tests conducted in laboratories by implementing the interlaboratory comparison method (organization, performance and evaluation of measurements of tests on the same or similar items by two or more laboratories under predetermined conditions). The program (an independent software application), realised by the author and described in this paper, analyses the measurement accuracy and performance of a testing laboratory by comparing the results obtained from different tests using the modified Youden diagram, helping identify different types of errors that can occur in measurement, according to ISO 13528:2015, Statistical methods for use in proficiency testing by interlaboratory comparison. A case study is presented in which the chemical composition of identical samples was determined by five different laboratories. The Youden diagram obtained from this case study was used to identify errors in the laboratory testing equipment. This paper was accepted for publication in Proceedings after a double peer review process but was not presented at the Conference ROTRIB’16
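    The logic behind a Youden two-sample diagram can be sketched numerically: each laboratory reports results for two similar samples, and a point's deviation from the coordinate medians distinguishes systematic error (both deviations large, same sign, along the 45-degree diagonal) from random error. The data, `tol` threshold, and labels below are illustrative; ISO 13528 prescribes more formal statistics than this toy rule.

```python
def median(xs):
    s = sorted(xs)
    mid = len(s) // 2
    return s[mid] if len(s) % 2 else (s[mid - 1] + s[mid]) / 2

def youden_classify(pairs, tol):
    a_med = median([a for a, _ in pairs])
    b_med = median([b for _, b in pairs])
    labels = []
    for a, b in pairs:
        da, db = a - a_med, b - b_med
        if abs(da) > tol and abs(db) > tol and da * db > 0:
            labels.append("systematic")  # shifted along the 45-degree line
        elif abs(da) > tol or abs(db) > tol:
            labels.append("random")      # large scatter off the diagonal
        else:
            labels.append("ok")
    return labels

# Five hypothetical labs, each reporting samples A and B (same quantity, %).
pairs = [(4.10, 4.12), (4.08, 4.11), (4.30, 4.33), (4.09, 4.10), (4.11, 3.90)]
labels = youden_classify(pairs, tol=0.05)
```

Lab 3 is biased high on both samples (systematic), while lab 5 disagrees on only one sample (random error).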

  3. MaRiMba: a software application for spectral library-based MRM transition list assembly.

    PubMed

    Sherwood, Carly A; Eastham, Ashley; Lee, Lik Wee; Peterson, Amelia; Eng, Jimmy K; Shteynberg, David; Mendoza, Luis; Deutsch, Eric W; Risler, Jenni; Tasman, Natalie; Aebersold, Ruedi; Lam, Henry; Martin, Daniel B

    2009-10-01

    Multiple reaction monitoring mass spectrometry (MRM-MS) is a targeted analysis method that has been increasingly viewed as an avenue to explore proteomes with unprecedented sensitivity and throughput. We have developed a software tool, called MaRiMba, to automate the creation of explicitly defined MRM transition lists required to program triple quadrupole mass spectrometers in such analyses. MaRiMba creates MRM transition lists from downloaded or custom-built spectral libraries, restricts output to specified proteins or peptides, and filters based on precursor peptide and product ion properties. MaRiMba can also create MRM lists containing corresponding transitions for isotopically heavy peptides, for which the precursor and product ions are adjusted according to user specifications. This open-source application is operated through a graphical user interface incorporated into the Trans-Proteomic Pipeline, and it outputs the final MRM list to a text file for upload to MS instruments. To illustrate the use of MaRiMba, we used the tool to design and execute an MRM-MS experiment in which we targeted the proteins of a well-defined and previously published standard mixture.
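    The heavy-peptide adjustment the abstract describes amounts to shifting the precursor m/z, and any fragment that contains the labeled residue, by the label mass divided by the charge. This sketch assumes SILAC-style C-terminal labels (+8.0142 Da for 13C6,15N2-lysine, +10.0083 Da for 13C6,15N4-arginine); the function and example values are illustrative, not MaRiMba's actual code.

```python
# Assumed SILAC label mass shifts in daltons (illustrative values).
LABEL_SHIFT = {"K": 8.0142, "R": 10.0083}

def heavy_transition(precursor_mz, precursor_z, fragment_mz, fragment_z,
                     c_term_residue, fragment_type):
    """Shift a light MRM transition to its heavy-labeled counterpart."""
    shift = LABEL_SHIFT[c_term_residue]
    heavy_precursor = precursor_mz + shift / precursor_z
    # Only C-terminal (y-type) fragments contain the labeled residue;
    # b-type fragments keep their light m/z.
    heavy_fragment = fragment_mz + (shift / fragment_z
                                    if fragment_type == "y" else 0.0)
    return heavy_precursor, heavy_fragment

# Doubly charged peptide ending in K with a singly charged y-ion.
prec, frag = heavy_transition(523.77, 2, 802.40, 1, "K", "y")
```

The precursor moves by half the label mass (charge 2) while the y-ion moves by the full label mass, which is the per-transition bookkeeping the tool automates across a whole spectral library.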

  4. MaRiMba: A Software Application for Spectral Library-Based MRM Transition List Assembly

    PubMed Central

    Sherwood, Carly A.; Eastham, Ashley; Lee, Lik Wee; Peterson, Amelia; Eng, Jimmy K.; Shteynberg, David; Mendoza, Luis; Deutsch, Eric W.; Risler, Jenni; Tasman, Natalie; Aebersold, Ruedi; Lam, Henry; Martin, Daniel B.

    2009-01-01

    Multiple reaction monitoring mass spectrometry (MRM-MS) is a targeted analysis method that has been increasingly viewed as an avenue to explore proteomes with unprecedented sensitivity and throughput. We have developed a software tool, called MaRiMba, to automate the creation of explicitly defined MRM transition lists required to program triple quadrupole mass spectrometers in such analyses. MaRiMba creates MRM transition lists from downloaded or custom-built spectral libraries, restricts output to specified proteins or peptides, and filters based on precursor peptide and product ion properties. MaRiMba can also create MRM lists containing corresponding transitions for isotopically heavy peptides, for which the precursor and product ions are adjusted according to user specifications. This open-source application is operated through a graphical user interface incorporated into the Trans-Proteomic Pipeline, and it outputs the final MRM list to a text file for upload to MS instruments. To illustrate the use of MaRiMba, we used the tool to design and execute an MRM-MS experiment in which we targeted the proteins of a well-defined and previously published standard mixture. PMID:19603829

  5. Software Configuration Management Guidebook

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The growth in cost and importance of software to NASA has caused NASA to address the improvement of software development across the agency. One of the products of this program is a series of guidebooks that define a NASA concept of the assurance processes which are used in software development. The Software Assurance Guidebook, SMAP-GB-A201, issued in September, 1989, provides an overall picture of the concepts and practices of NASA in software assurance. Lower level guidebooks focus on specific activities that fall within the software assurance discipline, and provide more detailed information for the manager and/or practitioner. This is the Software Configuration Management Guidebook which describes software configuration management in a way that is compatible with practices in industry and at NASA Centers. Software configuration management is a key software development process, and is essential for doing software assurance.

  6. Feature Selection for Evolutionary Commercial-off-the-Shelf Software: Studies Focusing on Time-to-Market, Innovation and Hedonic-Utilitarian Trade-Offs

    ERIC Educational Resources Information Center

    Kakar, Adarsh Kumar

    2013-01-01

    Feature selection is one of the most important decisions made by product managers. This three article study investigates the concepts, tools and techniques for making trade-off decisions of introducing new features in evolving Commercial-Off-The-Shelf (COTS) software products. The first article investigates the efficacy of various feature…

  7. Surface modification of electrospun fibres for biomedical applications: A focus on radical polymerization methods.

    PubMed

    Duque Sánchez, Lina; Brack, Narelle; Postma, Almar; Pigram, Paul J; Meagher, Laurence

    2016-11-01

    The development of electrospun ultrafine fibres from biodegradable and biocompatible polymers has created exciting opportunities for biomedical applications. Fibre meshes with high surface area, suitable porosity and stiffness have been produced. Despite desirable structural and topographical properties, for most synthetic and some naturally occurring materials, the nature of the fibre surface chemistry has inhibited development. Hydrophobicity, undesirable non-specific protein adsorption and bacterial attachment and growth, coupled with a lack of surface functionality in many cases and an incomplete understanding of the myriad of interactions between cells and extracellular matrix (ECM) proteins have impeded the application of these systems. Chemical and physical treatments have been applied in order to modify or control the surface properties of electrospun fibres, with some success. Chemical modification using controlled radical polymerization, referred to here as reversible-deactivation radical polymerization (RDRP), has successfully introduced advanced surface functionality in some fibre systems. Atom transfer radical polymerization (ATRP) and reversible addition fragmentation chain transfer (RAFT) are the most widely investigated techniques. This review analyses the practical applications of electrospinning for the fabrication of high quality ultrafine fibres and evaluates the techniques available for the surface modification of electrospun ultrafine fibres and includes a detailed focus on RDRP approaches.

  8. Investigating the application of AOP methodology in development of Financial Accounting Software using Eclipse-AJDT Environment

    NASA Astrophysics Data System (ADS)

    Sharma, Amita; Sarangdevot, S. S.

    2010-11-01

    Aspect-Oriented Programming (AOP) methodology has been investigated in the development of real-world business application software—Financial Accounting Software. The Eclipse-AJDT environment has been used as open-source enhanced IDE support for programming in the AOP language AspectJ. Crosscutting concerns have been identified and modularized as aspects, which considerably reduces design complexity by eliminating code scattering and tangling. Improvements in modularity, quality and performance are achieved. The study concludes that the AOP methodology in the Eclipse-AJDT environment offers powerful support for the modular design and implementation of real-world, high-quality business software.
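    For readers unfamiliar with AOP, the crosscutting-concern idea can be illustrated outside AspectJ: here is a rough Python analogue in which "advice" (audit logging) is woven around business methods by a decorator rather than by AspectJ's compile-time weaving. All names are hypothetical and the example is a caricature of the technique, not the paper's accounting system.

```python
import functools

AUDIT_LOG = []

def audited(func):
    """A decorator playing the role of a before/after logging aspect."""
    @functools.wraps(func)
    def advice(*args, **kwargs):
        AUDIT_LOG.append(f"before {func.__name__}{args}")  # 'before' advice
        result = func(*args, **kwargs)
        AUDIT_LOG.append(f"after {func.__name__}")         # 'after' advice
        return result
    return advice

@audited
def post_entry(account, amount):
    # Pure business logic: no logging code scattered or tangled here.
    return {"account": account, "amount": amount}

entry = post_entry("cash", 250.0)
```

The logging concern lives entirely in `audited`; removing or changing it touches no business code, which is the scattering/tangling elimination the abstract credits to aspects.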

  9. Software Reviews.

    ERIC Educational Resources Information Center

    Smith, Richard L., Ed.

    1988-01-01

    Contains evaluations of two computer software packages, "Simulation Experiments 45-48 in Epstein's Laboratory Manual for Chemistry" and "Maps and Legends--the Cartographer (Ver 3.0)." Includes a brief description, applications, and the perceived strengths and weaknesses for each package. (CW)

  10. An application generator for rapid prototyping of Ada real-time control software

    NASA Technical Reports Server (NTRS)

    Johnson, Jim; Biglari, Haik; Lehman, Larry

    1990-01-01

    The need to increase engineering productivity and decrease software life cycle costs in real-time system development establishes a motivation for a method of rapid prototyping. The design by iterative rapid prototyping technique is described. A tool which facilitates such a design methodology for the generation of embedded control software is described.

  11. Applications of Logic Coverage Criteria and Logic Mutation to Software Testing

    ERIC Educational Resources Information Center

    Kaminski, Garrett K.

    2011-01-01

    Logic is an important component of software. Thus, software logic testing has enjoyed significant research over a period of decades, with renewed interest in the last several years. One approach to detecting logic faults is to create and execute tests that satisfy logic coverage criteria. Another approach to detecting faults is to perform mutation…

  12. Application of the Golden Software Surfer mapping software for automation of visualisation of meteorological and oceanographic data in IMGW Maritime Branch.

    NASA Astrophysics Data System (ADS)

    Piliczewski, B.

    2003-04-01

    The Golden Software Surfer package has been used in the IMGW Maritime Branch for more than ten years. This tool provides ActiveX Automation objects, which allow scripts to control practically every feature of Surfer. These objects can be accessed from any Automation-enabled environment, such as Visual Basic or Excel. Several applications based on Surfer have been developed at IMGW. The first example is an online oceanographic service, which presents forecasts of the water temperature, sea level and currents originating from the HIROMB model and is automatically updated every day. Surfer was also utilised in MERMAID, an international project supported by the EC under the 5th Framework Programme. The main aim of this project was to create a prototype of an Internet-based data brokerage system that would enable users to search, extract, buy and download datasets containing meteorological or oceanographic data. During the project, IMGW developed an online application, called Mermaid Viewer, which enables communication with the data broker and automatic visualisation of the downloaded data using Surfer. Both of the above-mentioned applications were developed in Visual Basic. Adoption of Surfer is currently being considered for the monitoring service, which provides access to the data collected in monitoring of the Baltic Sea environment.

  13. Application of Open Source Software by the Lunar Mapping and Modeling Project

    NASA Astrophysics Data System (ADS)

    Ramirez, P.; Goodale, C. E.; Bui, B.; Chang, G.; Kim, R. M.; Law, E.; Malhotra, S.; Rodriguez, L.; Sadaqathullah, S.; Mattmann, C. A.; Crichton, D. J.

    2011-12-01

    The Lunar Mapping and Modeling Project (LMMP), led by the Marshall Space Flight Center (MSFC), is responsible for the development of an information system to support lunar exploration, decision analysis, and release of lunar data to the public. The data available through the lunar portal is predominantly derived from present lunar missions (e.g., the Lunar Reconnaissance Orbiter (LRO)) and from historical missions (e.g., Apollo). This project has created a gold source of data, models, and tools for lunar explorers to exercise and incorporate into their activities. At the Jet Propulsion Laboratory (JPL), we focused on engineering and building the infrastructure to support cataloging, archiving, accessing, and delivering lunar data. We decided to use a RESTful service-oriented architecture, which let us abstract away the underlying technology choices and focus on the interfaces to be used internally and externally. This decision allowed us to leverage several open source software components and integrate them by either writing a thin REST service layer or relying on the API they provided; the approach chosen depended on the targeted consumer of a given interface. We will discuss our varying experience using open source products, namely Apache OODT, Oracle Berkeley DB XML, Apache Solr, and Oracle OpenSSO (now named OpenAM). Apache OODT, developed at NASA's Jet Propulsion Laboratory and recently migrated to Apache, provided the means for ingestion and cataloguing of products within the infrastructure. Its use was based upon the team's experience with it and the benefits it had provided on past projects internal and external to JPL. Berkeley DB XML, distributed by Oracle for both commercial and open source use, was the storage technology chosen for our metadata. This decision was based in part on our use of Federal Geographic Data Committee (FGDC) metadata, which is expressed in XML, and the desire to keep it in its native form and exploit other technologies built on

  14. Excel2Genie: A Microsoft Excel application to improve the flexibility of the Genie-2000 Spectroscopic software.

    PubMed

    Forgács, Attila; Balkay, László; Trón, Lajos; Raics, Péter

    2014-12-01

    Excel2Genie, a simple and user-friendly Microsoft Excel interface, has been developed for the Genie-2000 Spectroscopic Software of Canberra Industries. This Excel application can directly control a Canberra Multichannel Analyzer (MCA), process the acquired data and visualize it. Combining Genie-2000 with Excel2Genie results in remarkably increased flexibility and the possibility of carrying out repetitive data acquisitions, even with changing parameters, and more sophisticated analyses. The developed software package comprises three worksheets that display the parameters and results of data acquisition, data analysis and mathematical operations carried out on the measured gamma spectra, while also allowing control of these processes. Excel2Genie is freely available to assist interested Canberra users with gamma spectrum measurements and data evaluation. With access to the Visual Basic for Applications (VBA) source code of the application, users can modify the developed interface according to their intentions.

  15. Visual Recognition Software for Binary Classification and its Application to Pollen Identification

    NASA Astrophysics Data System (ADS)

    Punyasena, S. W.; Tcheng, D. K.; Nayak, A.

    2014-12-01

    An underappreciated source of uncertainty in paleoecology is the uncertainty of palynological identifications. The confidence of any given identification is not regularly reported in published results, so cannot be incorporated into subsequent meta-analyses. Automated identification systems potentially provide a means of objectively measuring the confidence of a given count or single identification, as well as a mechanism for increasing sample sizes and throughput. We developed the software ARLO (Automated Recognition with Layered Optimization) to tackle difficult visual classification problems such as pollen identification. ARLO applies pattern recognition and machine learning to the analysis of pollen images. The features that the system discovers are not the traditional features of pollen morphology. Instead, general purpose image features, such as pixel lines and grids of different dimensions, size, spacing, and resolution, are used. ARLO adapts to a given problem by searching for the most effective combination of feature representation and learning strategy. We present a two-phase approach which uses our machine learning process to first segment pollen grains from the background and then classify pollen pixels and report species ratios. We conducted two separate experiments that utilized two distinct sets of algorithms and optimization procedures. The first analysis focused on reconstructing black and white spruce pollen ratios, training and testing our classification model at the slide level. This allowed us to directly compare our automated counts and expert counts to slides of known spruce ratios. Our second analysis focused on maximizing classification accuracy at the individual pollen grain level. Instead of predicting ratios of given slides, we predicted the species represented in a given image window. The resulting analysis was more scalable, as we were able to adapt the most efficient parts of the methodology from our first analysis. ARLO was able to
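    ARLO's two-phase pipeline (segment foreground pixels, then classify them and report a species ratio) can be caricatured with a trivial intensity-threshold "classifier". Everything below is an illustrative stand-in for ARLO's learned feature representations, not its actual algorithms.

```python
def segment(pixels, background=0):
    # Phase 1 stand-in: separate pollen (foreground) from background.
    return [p for p in pixels if p != background]

def species_ratio(pixels, threshold):
    # Phase 2 stand-in: label each foreground pixel as species A or B
    # and report the fraction assigned to species A.
    a = sum(1 for p in pixels if p >= threshold)
    return a / len(pixels)

image = [0, 0, 3, 9, 8, 0, 2, 7, 0, 1]   # toy 1-D "slide"
foreground = segment(image)
ratio = species_ratio(foreground, threshold=5)
```

In the real system both phases are learned models tuned per problem, and the slide-level ratio (as in the spruce experiment) is what gets compared against expert counts.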

  16. The Use of Mobile Health Applications Among Youth and Young Adults Living with HIV: Focus Group Findings.

    PubMed

    Saberi, Parya; Siedle-Khan, Robert; Sheon, Nicolas; Lightfoot, Marguerita

    2016-06-01

    The objective of this study was to conduct focus groups with youth (18-29 years old) living with HIV (YLWH) to better understand their preferences for mobile applications in general and to inform the design of a mobile health application aimed at improving retention and engagement in healthcare and adherence to antiretroviral therapy. We conducted four focus groups with YLWH to elicit the names and characteristics of applications that they commonly used, the reasons they deleted applications, and the features of an ideal mobile health application. A diverse sample of youth (N = 17), with a mean age of 25 years, 88.2% male, and 29.4% African American, participated in the four focus groups. Positive attributes of applications included being informative and simple, allowing for networking, timely updates, little overlap with other applications, unlimited access to entertainment, and ongoing advancement. Participants identified several reasons for deleting applications, including that they encouraged excessive behaviors (e.g., spending money), were for hook-ups only, had too many notifications or restrictions, occupied too much space on the device, or required wireless connectivity or frequent updates. Participants suggested that a useful mobile health application should have the ability to connect to a community of other YLWH, provide ready access to healthcare providers, track personal data and information (such as laboratory data), and deliver health news and education. Privacy was a key factor in a mobile health application for all participants. Researchers can use the information provided by focus group participants in creating mobile health applications for YLWH.

  17. Characterization of new FPS (Focus Projection and Scan) vidicons for scientific imaging applications

    NASA Astrophysics Data System (ADS)

    Yates, G. J.; Jaramillo, S. A.; Holmes, V. H.; Black, J. P.

    1988-06-01

    Several new photoconductors now commercially available as targets in Type 7803 FPS (Focus Projection and Scan) electrostatically focussed vidicons have been characterized for use as radiometric sensors in transient illumination and single frame applications. These include Saticon (Se + Te + As), Newvicon (ZnSe), Pasecon (CdSe), and Plumbicon (PbO). Samples from several domestic and foreign manufacturers have been evaluated for photoconductive response time and responsivity at selected narrow wavelength bands, including 410 nm, 560 nm, and 822 nm. These data are compared with performance data from older target materials including antimony trisulfide (Sb2S3) and silicon. Dynamic range and resolution trade-offs as functions of read-beam aperture diameter and raster size are also presented. The point spread functions for standard 1-mil vidicons and for increased apertures of 1.5, 2.0, 3.0, and 4.0-mil are also discussed.

  18. RECOLLIMATION AND RADIATIVE FOCUSING OF RELATIVISTIC JETS: APPLICATIONS TO BLAZARS AND M87

    SciTech Connect

    Bromberg, Omer; Levinson, Amir

    2009-07-10

    Recent observations of M87 and some blazars reveal violent activity in small regions located at relatively large distances from the central engine. Motivated by these considerations, we study the hydrodynamic collimation of a relativistic cooling outflow using a semianalytical model developed earlier. We first demonstrate that radiative cooling of the shocked outflow layer can lead to a focusing of the outflow and its reconfinement in a region having a very small cross-sectional radius. Such a configuration can produce rapid variability at large distances from the central engine via reflections of the converging recollimation shock. Possible applications of this model to TeV blazars are discussed. We then apply our model to M87. The low radiative efficiency of the M87 jet renders focusing unlikely. However, the shallow profile of the ambient medium pressure inferred from observations results in extremely good collimation that can explain the reported variability of the X-ray flux emitted from the HST-1 knot.

  19. Information flow and application to epileptogenic focus localization from intracranial EEG.

    PubMed

    Sabesan, Shivkumar; Good, Levi B; Tsakalis, Konstantinos S; Spanias, Andreas; Treiman, David M; Iasemidis, Leon D

    2009-06-01

    Transfer entropy (TE) is a recently proposed measure of the information flow between coupled linear or nonlinear systems. In this study, we suggest improvements in the selection of parameters for the estimation of TE that significantly enhance its accuracy and robustness in identifying the direction and the level of information flow between observed data series generated by coupled complex systems. We show the application of the improved TE method to long (in the order of days; approximately a total of 600 h across all patients), continuous, intracranial electroencephalograms (EEG) recorded in two different medical centers from four patients with focal temporal lobe epilepsy (TLE) for localization of their foci. All patients underwent ablative surgery of their clinically assessed foci. Based on a surrogate statistical analysis of the TE results, it is shown that the identified potential focal sites through the suggested analysis were in agreement with the clinically assessed sites of the epileptogenic focus in all patients analyzed. It is noteworthy that the analysis was conducted on the available whole-duration multielectrode EEG, that is, without any subjective prior selection of EEG segments or electrodes for analysis. The above, in conjunction with the use of surrogate data, make the results of this analysis robust. These findings suggest a critical role TE may play in epilepsy research in general, and as a tool for robust localization of the epileptogenic focus/foci in patients with focal epilepsy in particular.
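
The transfer entropy in this record can be illustrated with a minimal plug-in (histogram) estimator. The sketch below assumes history length 1 and naive equal-width binning; it does not reproduce the paper's improved parameter-selection scheme:

```python
from collections import Counter
from math import log2

def transfer_entropy(x, y, bins=2):
    """Plug-in estimate of TE(Y -> X) in bits, with history length 1:
    TE = sum p(x_{t+1}, x_t, y_t) * log2[ p(x_{t+1}|x_t,y_t) / p(x_{t+1}|x_t) ]
    """
    def disc(s):  # equal-width binning of a real-valued series
        lo, hi = min(s), max(s)
        w = (hi - lo) / bins or 1.0
        return [min(int((v - lo) / w), bins - 1) for v in s]

    xd, yd = disc(x), disc(y)
    n = len(xd) - 1
    triples = Counter(zip(xd[1:], xd[:-1], yd[:-1]))  # (x_{t+1}, x_t, y_t)
    pairs_xy = Counter(zip(xd[:-1], yd[:-1]))         # (x_t, y_t)
    pairs_xx = Counter(zip(xd[1:], xd[:-1]))          # (x_{t+1}, x_t)
    singles = Counter(xd[:-1])                        # x_t
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_x1_given_xy = c / pairs_xy[(x0, y0)]
        p_x1_given_x = pairs_xx[(x1, x0)] / singles[x0]
        te += p_joint * log2(p_x1_given_xy / p_x1_given_x)
    return te
```

With y driving x (e.g., x lagging y by one step), TE(Y→X) comes out near 1 bit for binary binning while TE(X→Y) stays near zero; this directional asymmetry is what focus localization exploits.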

  20. The Program for Climate Model Diagnosis and Intercomparison (PCMDI) Software Development: Applications, Infrastructure, and Middleware/Networks

    SciTech Connect

    Williams, Dean N.

    2011-06-30

    The status of and future plans for the Program for Climate Model Diagnosis and Intercomparison (PCMDI) hinge on software that PCMDI is either currently distributing or plans to distribute to the climate community in the near future. These software products include standard conventions, national and international federated infrastructures, and community analysis and visualization tools. This report also mentions other secondary software not necessarily led by or developed at PCMDI to provide a complete picture of the overarching applications, infrastructures, and middleware/networks. Much of the software described anticipates the use of future technologies envisioned over the next one to ten years. These technologies, together with the software, will be the catalyst required to address extreme-scale data warehousing, scalability issues, and service-level requirements for a diverse set of well-known projects essential for predicting climate change. These tools, unlike the static analysis tools of the past, will support the co-existence of many users in a productive, shared virtual environment. This advanced technological world driven by extreme-scale computing and the data it generates will increase scientists’ productivity, exploit national and international relationships, and push research to new levels of understanding.

  1. Design principle and calculations of a Scheffler fixed focus concentrator for medium temperature applications

    SciTech Connect

    Munir, A.; Hensel, O.; Scheffler, W.

    2010-08-15

    Scheffler fixed focus concentrators are successfully used for medium temperature applications in different parts of the world. These concentrators are taken as lateral sections of paraboloids and provide fixed focus away from the path of incident beam radiation throughout the year. The paper presents a complete description of the design principle and construction details of an 8 m{sup 2} surface area Scheffler concentrator. The first part of the paper presents the mathematical calculations to design the reflector parabola curve and reflector elliptical frame with respect to equinox (solar declination = 0) by selecting a specific lateral part of a paraboloid. Crossbar equations and their ellipses, arc lengths and their radii are also calculated to form the required lateral section of the paraboloid. Thereafter, the seasonal parabola equations are calculated for two extreme positions of summer and winter in the northern hemisphere (standing reflectors). The slopes of the parabola equations for equinox (solar declination = 0), summer (solar declination = +23.5) and winter (solar declination = -23.5) for the Scheffler reflector (8 m{sup 2} surface area) are calculated to be 0.17, 0.28, and 0.13 respectively. The y-intercepts of the parabola equations for equinox, summer and winter are calculated as 0, 0.54, and -0.53 respectively. By comparing with the equinox parabola curve, the summer parabola is found to be smaller in size and uses the top part of the parabola curve while the winter parabola is bigger in size and uses the lower part of the parabola curve to give the fixed focus. For this purpose, the reflector assembly is composed of flexible crossbars and a frame to induce the required change of the parabola curves with the changing solar declination. The paper also presents the calculation procedure of seasonal parabola equations for standing reflectors in the southern hemisphere as well as for laying reflectors in the northern and southern hemispheres.

  2. Software component quality evaluation

    NASA Technical Reports Server (NTRS)

    Clough, A. J.

    1991-01-01

    The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.

  3. High intensity focused ultrasound technology, its scope and applications in therapy and drug delivery.

    PubMed

    Phenix, Christopher Peter; Togtema, Melissa; Pichardo, Samuel; Zehbe, Ingeborg; Curiel, Laura

    2014-01-01

    Ultrasonography is a safe, inexpensive and wide-spread diagnostic tool capable of producing real-time non-invasive images without significant biological effects. However, the propagation of higher energy, intensity and frequency ultrasound waves through living tissues can induce thermal, mechanical and chemical effects useful for a variety of therapeutic applications. With the recent development of clinically approved High Intensity Focused Ultrasound (HIFU) systems, therapeutic ultrasound is now a medical reality. Indeed, HIFU has been used for the thermal ablation of pathological lesions; localized, minimally invasive ultrasound-mediated drug delivery through the transient formation of pores on cell membranes; the temporary disruption of skin and the blood brain barrier; the ultrasound induced break-down of blood clots; and the targeted release of drugs using ultrasound and temperature sensitive drug carriers. This review seeks to engage the pharmaceutical research community by providing an overview on the biological effects of ultrasound as well as highlighting important therapeutic applications, current deficiencies and future directions.

  4. VARK learning preferences and mobile anatomy software application use in pre-clinical chiropractic students.

    PubMed

    Meyer, Amanda J; Stomski, Norman J; Innes, Stanley I; Armson, Anthony J

    2016-05-06

    Ubiquitous smartphone ownership and reduced face-to-face teaching time may lead to students making greater use of mobile technologies in their learning. This is the first study to report on the prevalence of mobile gross anatomy software applications (apps) usage in pre-clinical chiropractic students and to ascertain if a relationship exists between preferred learning styles as determined by the validated VARK(©) questionnaire and use of mobile anatomy apps. The majority of the students who completed the VARK questionnaire were multimodal learners with kinesthetic and visual preferences. Sixty-seven percent (73/109) of students owned one or more mobile anatomy apps which were used by 57 students. Most of these students owned one to five apps and spent less than 30 minutes per week using them. Six of the top eight mobile anatomy apps owned and recommended by the students were developed by 3D4Medical. Visual learning preferences were not associated with time spent using mobile anatomy apps (OR = 0.40, 95% CI 0.12-1.40). Similarly, kinesthetic learning preferences (OR = 1.88, 95% CI 0.18-20.2), quadmodal preferences (OR = 0.71, 95% CI 0.06-9.25), and gender (OR = 1.51, 95% CI 0.48-4.81) did not affect the time students spent using mobile anatomy apps. Learning preferences do not appear to influence students' time spent using mobile anatomy apps. Anat Sci Educ 9: 247-254. © 2015 American Association of Anatomists.
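
The OR and 95% CI figures quoted above are odds ratios with confidence intervals. As a reading aid, here is the textbook Wald calculation on a hypothetical unadjusted 2×2 table (the study's actual estimates may come from an adjusted model):

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table of counts:
    a/b = outcome yes/no in the exposed group,
    c/d = outcome yes/no in the unexposed group (all counts hypothetical)."""
    or_ = (a * d) / (b * c)
    se_log_or = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log odds ratio
    lo = exp(log(or_) - z * se_log_or)
    hi = exp(log(or_) + z * se_log_or)
    return or_, lo, hi
```

A confidence interval that contains 1.0 (as in every interval reported above) indicates no detectable association between the learning preference and app usage time.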

  5. OPERA, an automatic PSF reconstruction software for Shack-Hartmann AO systems: application to Altair

    NASA Astrophysics Data System (ADS)

    Jolissaint, Laurent; Veran, Jean-Pierre; Marino, Jose

    2004-10-01

    When doing high angular resolution imaging with adaptive optics (AO), it is of crucial importance to have an accurate knowledge of the point spread function associated with each observation. Applications are numerous: image contrast enhancement by deconvolution, improved photometry and astrometry, as well as real time AO performance evaluation. In this paper, we present our work on automatic PSF reconstruction based on control loop data, acquired simultaneously with the observation. This problem has already been solved for curvature AO systems. To adapt this method to another type of WFS, a specific analytical noise propagation model must be established. For the Shack-Hartmann WFS, we are able to derive a very accurate estimate of the noise on each slope measurement, based on the covariances of the WFS CCD pixel values in the corresponding sub-aperture. These covariances can be either derived off-line from telemetry data, or calculated by the AO computer during the acquisition. We present improved methods to determine 1) r0 from the DM drive commands, which includes an estimation of the outer scale L0, and 2) the contribution of the high spatial frequency component of the turbulent phase, which is not corrected by the AO system and is scaled by r0. This new method has been implemented in IDL-based software called OPERA (Performance of Adaptive Optics). We have tested OPERA on Altair, the recently commissioned Gemini-North AO system, and present our preliminary results. We also summarize the AO data required to run OPERA on any other AO system.

  6. A fast auto-focusing technique for the long focal lens TDI CCD camera in remote sensing applications

    NASA Astrophysics Data System (ADS)

    Wang, Dejiang; Ding, Xu; Zhang, Tao; Kuang, Haipeng

    2013-02-01

    The key issue in automatic focus adjustment for long focal lens TDI CCD cameras in remote sensing applications is to achieve the optimum focus position as fast as possible. Existing auto-focusing techniques consume too much time as the mechanical focusing parts of the camera move in steps during the searching procedure. In this paper, we demonstrate a fast auto-focusing technique, which employs the internal optical elements and the TDI CCD itself to directly sense the deviations in back focal distance of the lens and restore the imaging system to a best-available focus. It is particularly advantageous for determination of the focus, because the relative motion between the TDI CCD and the focusing element can proceed without interruption. Moreover, the theoretical formulas describing the effect of imaging motion on the focusing precision and the effective focusing range are also developed. Finally, an experimental setup is constructed to evaluate the performance of the proposed technique. The results of the experiment show a ±5 μm precision of auto-focusing in a range of ±500 μm defocus, and the searching procedure could be accomplished within 0.125 s, which leads to remarkable improvement on the real-time imaging capability for high resolution TDI CCD cameras in remote sensing applications.

  7. Software Design Improvements. Part 2; Software Quality and the Design and Inspection Process

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom

    1997-01-01

    The application of assurance engineering techniques improves the duration of failure-free performance of software. The totality of features and characteristics of a software product are what determine its ability to satisfy customer needs. Software in safety-critical systems is very important to NASA. We follow the System Safety Working Group's definition for system safety software as: 'The optimization of system safety in the design, development, use and maintenance of software and its integration with safety-critical systems in an operational environment.' 'If it is not safe, say so' has become our motto. This paper goes over methods that have been used by NASA to make software design improvements by focusing on software quality and the design and inspection process.

  8. Microfabrication and Test of a Three-Dimensional Polymer Hydro-focusing Unit for Flow Cytometry Applications

    NASA Technical Reports Server (NTRS)

    Yang, Ren; Feeback, Daniel L.; Wang, Wanjun

    2004-01-01

    This paper details a novel three-dimensional (3D) hydro-focusing micro cell sorter for micro flow cytometry applications. The unit was microfabricated by means of SU-8 3D lithography. The 3D microstructure for coaxial sheathing was designed, microfabricated, and tested. Three-dimensional hydro-focusing capability was demonstrated with an experiment to sort labeled tanned sheep erythrocytes (red blood cells). This polymer hydro-focusing microstructure is easily microfabricated and integrated with other polymer microfluidic structures.

  9. The international river interface cooperative: Public domain flow and morphodynamics software for education and applications

    NASA Astrophysics Data System (ADS)

    Nelson, Jonathan M.; Shimizu, Yasuyuki; Abe, Takaaki; Asahi, Kazutake; Gamou, Mineyuki; Inoue, Takuya; Iwasaki, Toshiki; Kakinuma, Takaharu; Kawamura, Satomi; Kimura, Ichiro; Kyuka, Tomoko; McDonald, Richard R.; Nabi, Mohamed; Nakatsugawa, Makoto; Simões, Francisco R.; Takebayashi, Hiroshi; Watanabe, Yasunori

    2016-07-01

    This paper describes a new, public-domain interface for modeling flow, sediment transport and morphodynamics in rivers and other geophysical flows. The interface is named after the International River Interface Cooperative (iRIC), the group that constructed the interface and many of the current solvers included in iRIC. The interface is entirely free to any user and currently houses thirteen models ranging from simple one-dimensional models through three-dimensional large-eddy simulation models. Solvers are only loosely coupled to the interface so it is straightforward to modify existing solvers or to introduce other solvers into the system. Six of the most widely-used solvers are described in detail including example calculations to serve as an aid for users choosing what approach might be most appropriate for their own applications. The example calculations range from practical computations of bed evolution in natural rivers to highly detailed predictions of the development of small-scale bedforms on an initially flat bed. The remaining solvers are also briefly described. Although the focus of most solvers is coupled flow and morphodynamics, several of the solvers are also specifically aimed at providing flood inundation predictions over large spatial domains. Potential users can download the application, solvers, manuals, and educational materials including detailed tutorials at i-ric.org. The iRIC development group encourages scientists and engineers to use the tool and to consider adding their own methods to the iRIC suite of tools.

  10. The international river interface cooperative: Public domain flow and morphodynamics software for education and applications

    USGS Publications Warehouse

    Nelson, Jonathan M.; Shimizu, Yasuyuki; Abe, Takaaki; Asahi, Kazutake; Gamou, Mineyuki; Inoue, Takuya; Iwasaki, Toshiki; Kakinuma, Takaharu; Kawamura, Satomi; Kimura, Ichiro; Kyuka, Tomoko; McDonald, Richard R.; Nabi, Mohamed; Nakatsugawa, Makoto; Simoes, Francisco J.; Takebayashi, Hiroshi; Watanabe, Yasunori

    2016-01-01

    This paper describes a new, public-domain interface for modeling flow, sediment transport and morphodynamics in rivers and other geophysical flows. The interface is named after the International River Interface Cooperative (iRIC), the group that constructed the interface and many of the current solvers included in iRIC. The interface is entirely free to any user and currently houses thirteen models ranging from simple one-dimensional models through three-dimensional large-eddy simulation models. Solvers are only loosely coupled to the interface so it is straightforward to modify existing solvers or to introduce other solvers into the system. Six of the most widely-used solvers are described in detail including example calculations to serve as an aid for users choosing what approach might be most appropriate for their own applications. The example calculations range from practical computations of bed evolution in natural rivers to highly detailed predictions of the development of small-scale bedforms on an initially flat bed. The remaining solvers are also briefly described. Although the focus of most solvers is coupled flow and morphodynamics, several of the solvers are also specifically aimed at providing flood inundation predictions over large spatial domains. Potential users can download the application, solvers, manuals, and educational materials including detailed tutorials at i-ric.org. The iRIC development group encourages scientists and engineers to use the tool and to consider adding their own methods to the iRIC suite of tools.

  11. Software Formal Inspections Standard

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This Software Formal Inspections Standard (hereinafter referred to as Standard) is applicable to NASA software. This Standard defines the requirements that shall be fulfilled by the software formal inspections process whenever this process is specified for NASA software. The objective of this Standard is to define the requirements for a process that inspects software products to detect and eliminate defects as early as possible in the software life cycle. The process also provides for the collection and analysis of inspection data to improve the inspection process as well as the quality of the software.

  12. Benchmarking Software Assurance Implementation

    DTIC Science & Technology

    2011-05-18

    (a.k.a. Process Focused Assessment) – Management Systems (ISO 9001, ISO 27001, ISO 2000) – Capability Maturity Models (CMMI...) – How: executive leadership commitment – translate ROI to project manager vocabulary (cost, schedule, quality) – start small and build – use collaboration – Vocabulary reserved words: Software Acquisition, Information Assurance, Project Management, System Engineering, Software Engineering, Software

  13. Software Engineering for Portability.

    ERIC Educational Resources Information Center

    Stanchev, Ivan

    1990-01-01

    Discussion of the portability of educational software focuses on the software design and development process. Topics discussed include levels of portability; the user-computer dialog; software engineering principles; design techniques for student performance records; techniques of courseware programing; and suggestions for further research and…

  14. TOUGH2 software qualification

    SciTech Connect

    Pruess, K.; Simmons, A.; Wu, Y.S.; Moridis, G.

    1996-02-01

    TOUGH2 is a numerical simulation code for multi-dimensional coupled fluid and heat flow of multiphase, multicomponent fluid mixtures in porous and fractured media. It belongs to the MULKOM ("MULti-KOMponent") family of codes and is a more general version of the TOUGH simulator. The MULKOM family of codes was originally developed with a focus on geothermal reservoir simulation. They are suited to modeling systems which contain different fluid mixtures, with applications to flow problems arising in the context of high-level nuclear waste isolation, oil and gas recovery and storage, and groundwater resource protection. TOUGH2 is essentially a subset of MULKOM, consisting of a selection of the better tested and documented MULKOM program modules. The purpose of this package of reports is to provide all software baseline documents necessary for the software qualification of TOUGH2.

  15. Towards the Application of Open Source Software in Developing National Electronic Health Record-Narrative Review Article

    PubMed Central

    AMINPOUR, Farzaneh; SADOUGHI, Farahnaz; AHMADI, Maryam

    2013-01-01

    Abstract Electronic Health Record (EHR) is a repository of patient health information shared among multiple authorized users. As a modern method of storing and processing health information, it is a solution for improving the quality, safety and efficiency of patient care and the health system. However, establishment of EHR requires a significant investment of time and money. While many healthcare providers have very limited capital, application of open source software can be considered a solution for developing a national electronic health record, especially in low-income countries. The evidence showed that financial limitation is one of the obstacles to implementing electronic health records in developing countries. Therefore, establishment of an open source EHR system capable of modification according to the national requirements seems to be inevitable in Iran. The present study identifies the impact of application of open source software in developing a national electronic health record in Iran. PMID:26060634

  16. Characterization and application of simultaneously spatio-temporally focused ultrafast laser pulses

    NASA Astrophysics Data System (ADS)

    Greco, Michael J.

    Chirped pulse amplification of ultrafast laser pulses has become an essential technology in the fields of micromachining, tissue ablation, and microscopy. With specifically tailored pulses of light we have been able to begin investigation into lab-on-a-chip technology, which has the potential of revolutionizing the medical industry. Advances in microscopy have allowed sub-diffraction-limited resolution to become a reality as well as lensless imaging of single molecules. An intimate knowledge of ultrafast optical pulses, the ability to manipulate an optical spectrum and generate an optical pulse of a specific temporal shape, allows us to continue pushing these fields forward as well as open new ones. This thesis investigates the spatio-temporal construction of pulses which are simultaneously spatio-temporally focused (SSTF) and their current and future applications. By laterally chirping a compressed laser pulse we have confined the peak intensity to a shorter distance along the optical axis than can be achieved by conventional methods. This also brings about interesting changes to the structure of the pulse intensity such as pulse front tilt (PFT), an effect where the pulse energy is delayed across the focal spot at the focal plane by durations longer than the pulse itself. Though these pulses have found utility in microscopy and micromachining, in-situ methods for characterizing them spatially and temporally are not yet widespread. I present here an in-situ characterization technique for both spatial and temporal diagnosis of SSTF pulses. By performing a knife-edge scan and collecting the light in a spectrometer, the relative spectral position as well as beam size can be deduced. Temporal characterization is done by dispersion scan, in which a second harmonic crystal is scanned through the beam focus. Combining the unknown phase of the pulse with the known phase (a result of angular dispersion) allows the unknown phase to be extracted from the second harmonic spectra.

  17. A Statistical Testing Approach for Quantifying Software Reliability; Application to an Example System

    SciTech Connect

    Chu, Tsong-Lun; Varuttamaseni, Athi; Baek, Joo-Seok

    2016-11-01

    The U.S. Nuclear Regulatory Commission (NRC) encourages the use of probabilistic risk assessment (PRA) technology in all regulatory matters, to the extent supported by the state-of-the-art in PRA methods and data. Although much has been accomplished in the area of risk-informed regulation, risk assessment for digital systems has not been fully developed. The NRC established a plan for research on digital systems to identify and develop methods, analytical tools, and regulatory guidance for (1) including models of digital systems in the PRAs of nuclear power plants (NPPs), and (2) incorporating digital systems in the NRC's risk-informed licensing and oversight activities. Under NRC's sponsorship, Brookhaven National Laboratory (BNL) explored approaches for addressing the failures of digital instrumentation and control (I and C) systems in the current NPP PRA framework. Specific areas investigated included PRA modeling of digital hardware, development of a philosophical basis for defining software failure, and identification of desirable attributes of quantitative software reliability methods. Based on the earlier research, statistical testing is considered a promising method for quantifying software reliability. This paper describes a statistical software testing approach for quantifying software reliability and applies it to the loop-operating control system (LOCS) of an experimental loop of the Advanced Test Reactor (ATR) at Idaho National Laboratory (INL).

  18. Nanofocus of tenth of joules and a portable plasma focus of few joules for field applications

    SciTech Connect

    Soto, Leopoldo; Pavez, Cristian; Moreno, Jose; Tarifeno, Ariel; Pedreros, Jose; Altamirano, Luis

    2009-01-21

    A repetitive pinch plasma focus that works with stored energy less than 1 J per shot has been developed at the Chilean Nuclear Energy Commission. The main features of this device, the repetitive Nanofocus, are 5 nF of capacitance, 5 nH of inductance, 5-10 kV charging voltage, 60-250 mJ stored energy, and 5-10 kA peak current per shot. The device has been operated at 20 Hz in hydrogen and deuterium. X-ray radiographs of materials of different thicknesses were obtained. Neutrons were detected using a system based upon a {sup 3}He proportional counter in charge-integrated mode. However, the reproducibility of this miniaturized device is low and several technological subjects have to be solved before it can produce neutrons for periods greater than minutes. Further studies of the Nanofocus are being carried out. In addition, a device with a stored energy of a few joules is being explored. A preliminary compact, low weight (3 kg), portable PF device (25 cm x 5 cm x 5 cm) for field applications has been designed. This device was designed to operate at a few kilovolts (10 kV or less) with a stored energy of 2 J and a repetition rate of 10 Hz without cooling. A neutron flux of the order of 10{sup 4}-10{sup 5} n/s is expected.

  19. Multi-resolution correlative focused ion beam scanning electron microscopy: applications to cell biology.

    PubMed

    Narayan, Kedar; Danielson, Cindy M; Lagarec, Ken; Lowekamp, Bradley C; Coffman, Phil; Laquerre, Alexandre; Phaneuf, Michael W; Hope, Thomas J; Subramaniam, Sriram

    2014-03-01

    Efficient correlative imaging of small targets within large fields is a central problem in cell biology. Here, we demonstrate a series of technical advances in focused ion beam scanning electron microscopy (FIB-SEM) to address this issue. We report increases in the speed, robustness and automation of the process, and achieve consistent z slice thickness of ∼3 nm. We introduce "keyframe imaging" as a new approach to simultaneously image large fields of view and obtain high-resolution 3D images of targeted sub-volumes. We demonstrate application of these advances to image post-fusion cytoplasmic intermediates of the HIV core. Using fluorescently labeled cell membranes, proteins and HIV cores, we first produce a "target map" of an HIV infected cell by fluorescence microscopy. We then generate a correlated 3D EM volume of the entire cell as well as high-resolution 3D images of individual HIV cores, achieving correlative imaging across a volume scale of 10(9) in a single automated experimental run.

  20. A Spatially Focused Method for High Density Electrode-Based Functional Brain Mapping Applications.

    PubMed

    Chang, Chih-Wei; Hsin, Yue-Loong; Liu, Wentai

    2016-10-01

    Mapping the electric field of the brain with electrodes benefits from its superior temporal resolution but is prone to low spatial resolution compared with other modalities such as fMRI, which can directly impact the precision of clinical diagnosis. Simulations show that dense arrays with straightforwardly miniaturized electrodes in terms of size and pitch may not improve the spatial resolution but only strengthen the cross coupling between adjacent channels due to volume conduction. We present a new spatially focused method to improve the electrode spatial selectivity and consequently suppress the neural signal coupling from the sources in the vicinity. Compared with existing spatial filtering methods with fixed coefficients, the proposed method is adaptively optimized for the geometric parameters of the recording electrode arrays, including electrode size, pitch and source depth. The effective spatial bandwidth, characterized as Radius of Half Power, can be reduced by about 70% for ECoG and for distant-source scenarios. The proposed method has been applied to the analysis of high-frequency oscillations (HFOs) in seizures to study the ictal pathway in the epileptogenic region. The results reveal lucid HFO wavefront propagation in both preictal and ictal stages due to a 75% reduction in the coupling effect. The results also show that a specific power threshold of preictal HFOs is needed in order to initiate an epileptic seizure. This demonstrates that our method indeed facilitates the investigation of complex neurobiological signal preprocessing applications.
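
The "existing spatial filtering methods with fixed coefficients" that this record improves upon include Hjorth-style surface Laplacians. A minimal sketch of that fixed-coefficient baseline on a rectangular electrode grid (illustrative only, not the paper's adaptive method) is:

```python
def hjorth_laplacian(grid):
    """Fixed-coefficient spatial filter (Hjorth-style surface Laplacian):
    each channel minus the mean of its available 4-connected neighbours.
    `grid` is a rectangular list of rows of channel values."""
    rows, cols = len(grid), len(grid[0])
    out = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            nbrs = [grid[rr][cc]
                    for rr, cc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
                    if 0 <= rr < rows and 0 <= cc < cols]
            out[r][c] = grid[r][c] - sum(nbrs) / len(nbrs)
    return out
```

A spatially uniform (volume-conducted) component cancels exactly while a source under a single electrode is preserved; the paper's contribution is to replace such fixed coefficients with ones optimized for electrode size, pitch and source depth.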

  2. Development of the dense plasma focus for short-pulse applications

    DOE PAGES

    Bennett, N.; Blasco, M.; Breeding, K.; ...

    2017-01-05

    The dense plasma focus (DPF) has long been considered a compact source for pulsed neutrons and has traditionally been optimized for total neutron yield. Here, we describe efforts to optimize the DPF for short-pulse applications by introducing a reentrant cathode at the end of the coaxial plasma gun. We reduced the resulting neutron pulse widths by an average of 21±9% relative to the traditional long-drift DPF design. Pulse widths and yields achieved from deuterium-tritium fusion at 2 MA are 61.8±30.7 ns FWHM and 1.84±0.49×10^12 neutrons per shot. Simulations were conducted concurrently to elucidate the DPF operation and confirm the role of the reentrant cathode. Furthermore, a hybrid fluid-kinetic particle-in-cell modeling capability demonstrates correct sheath velocities, plasma instabilities, and fusion yield rates. Consistent with previous findings that the DPF is dominated by beam-target fusion from superthermal ions, we estimate that the thermonuclear contribution is at the 1% level.

  3. Generation of micro-sized PDMS particles by a flow focusing technique for biomicrofluidics applications.

    PubMed

    Muñoz-Sánchez, B N; Silva, S F; Pinho, D; Vega, E J; Lima, R

    2016-01-01

    Polydimethylsiloxane (PDMS), due to its remarkable properties, is one of the most widely used polymers in many industrial and medical applications. In this work, a flow focusing technique is used to produce spherical PDMS particles with sizes of a few microns. PDMS precursor is injected through a hypodermic needle to form a film/reservoir over the needle's outer surface. This film flows towards the needle tip until a liquid ligament is steadily ejected thanks to the action of a coflowing viscous liquid stream. The outcome is a capillary jet which breaks up into PDMS precursor droplets, due to the growth of capillary waves, producing a micrometer-scale emulsion. The PDMS liquid droplets in the solution are thermally cured into solid microparticles. The size distribution of the particles is analyzed before and after curing, showing an acceptable degree of monodispersity; the droplets undergo some shrinkage while curing. These microparticles can be used in varied technological fields, such as biomedicine, biotechnology, pharmacy, and industrial engineering.
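    The capillary-wave breakup step can be sketched with the classical Rayleigh-Plateau estimate, under which the most unstable wavelength yields droplets about 1.89 times the jet diameter (the jet diameter below is a hypothetical value chosen to land in the few-micron range, not a parameter from the experiment):

```python
import math

# Back-of-envelope droplet size from Rayleigh-Plateau breakup of a
# capillary jet: the most unstable wavelength is ~4.508 * d_jet, and the
# jet segment of that length pinches off into one sphere of equal volume.

def droplet_diameter(d_jet):
    """Droplet diameter from Rayleigh-Plateau breakup of a jet of diameter d_jet."""
    lam = 4.508 * d_jet                       # most unstable wavelength
    vol = math.pi / 4.0 * d_jet ** 2 * lam    # volume of one jet segment
    return (6.0 * vol / math.pi) ** (1.0 / 3.0)  # ≈ 1.89 * d_jet

d_jet = 2.0e-6                                # hypothetical 2 um jet
print(round(droplet_diameter(d_jet) * 1e6, 2), "um")  # → 3.78 um
```

    The estimate ignores the viscosity contrast and the post-breakup shrinkage the abstract reports, so it only fixes the order of magnitude.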

  4. Generation of micro-sized PDMS particles by a flow focusing technique for biomicrofluidics applications

    PubMed Central

    Vega, E. J.; Lima, R.

    2016-01-01

    Polydimethylsiloxane (PDMS), due to its remarkable properties, is one of the most widely used polymers in many industrial and medical applications. In this work, a flow focusing technique is used to produce spherical PDMS particles with sizes of a few microns. PDMS precursor is injected through a hypodermic needle to form a film/reservoir over the needle's outer surface. This film flows towards the needle tip until a liquid ligament is steadily ejected thanks to the action of a coflowing viscous liquid stream. The outcome is a capillary jet which breaks up into PDMS precursor droplets, due to the growth of capillary waves, producing a micrometer-scale emulsion. The PDMS liquid droplets in the solution are thermally cured into solid microparticles. The size distribution of the particles is analyzed before and after curing, showing an acceptable degree of monodispersity; the droplets undergo some shrinkage while curing. These microparticles can be used in varied technological fields, such as biomedicine, biotechnology, pharmacy, and industrial engineering. PMID:27042245

  5. Detecting Optic Atrophy in Multiple Sclerosis Patients Using New Colorimetric Analysis Software: From Idea to Application.

    PubMed

    Bambo, Maria Pilar; Garcia-Martin, Elena; Perez-Olivan, Susana; Larrosa-Povés, José Manuel; Polo-Llorens, Vicente; Gonzalez-De la Rosa, Manuel

    2016-01-01

    Neuro-ophthalmologists typically observe a temporal pallor of the optic disc in patients with multiple sclerosis. Here, we describe the emergence of an idea to quantify these optic disc color changes in multiple sclerosis patients. We recruited 12 multiple sclerosis patients with previous optic neuritis attack and obtained photographs of their optic discs. The Laguna ONhE, a new colorimetric software using hemoglobin as the reference pigment in the papilla, was used for the analysis. The papilla of these multiple sclerosis patients showed greater pallor, especially in the temporal sector. The software detected the pallor and assigned hemoglobin percentages below normal reference values. Measurements of optic disc hemoglobin levels obtained with the Laguna ONhE software program had good ability to detect optic atrophy and, consequently, axonal loss in multiple sclerosis patients. This new technology is easy to implement in routine clinical practice.

  6. Open source hardware and software platform for robotics and artificial intelligence applications

    NASA Astrophysics Data System (ADS)

    Liang, S. Ng; Tan, K. O.; Lai Clement, T. H.; Ng, S. K.; Mohammed, A. H. Ali; Mailah, Musa; Azhar Yussof, Wan; Hamedon, Zamzuri; Yussof, Zulkifli

    2016-02-01

    Recent developments in open source hardware and software platforms (Android, Arduino, Linux, OpenCV, etc.) have enabled rapid development of previously expensive and sophisticated systems on a lower budget and with flatter learning curves for developers. Using these platforms, we designed and developed a Java-based 3D robotic simulation system with a graph database, integrated in online and offline modes with an Android-Arduino based rubbish-picking remote control car. The combination of open source hardware and software created a flexible and expandable platform for further development in both software and hardware, in particular in combination with a graph database for artificial intelligence, as well as more sophisticated hardware such as legged or humanoid robots.

  7. The 'Densitometric Image Analysis Software' and its application to determine stepwise equilibrium constants from electrophoretic mobility shift assays.

    PubMed

    van Oeffelen, Liesbeth; Peeters, Eveline; Nguyen Le Minh, Phu; Charlier, Daniël

    2014-01-01

    Current software applications for densitometric analysis, such as ImageJ, QuantityOne (BioRad) and the Intelligent or Advanced Quantifier (Bio Image), do not allow the non-linearity of autoradiographic films to be taken into account during calibration. As a consequence, quantification of autoradiographs is often regarded as problematic, and phosphorimaging is the preferred alternative. However, the non-linear behaviour of autoradiographs can be described mathematically, so it can be accounted for. The 'Densitometric Image Analysis Software' has therefore been developed, which quantifies electrophoretic bands in autoradiographs, as well as in gels and phosphorimages, while providing optimized band selection support to the user. Moreover, the program can determine protein-DNA binding constants from Electrophoretic Mobility Shift Assays (EMSAs). For this purpose, the software calculates a chosen stepwise equilibrium constant for each migration lane within the EMSA, and estimates the errors due to non-uniformity of the background noise, smear caused by complex dissociation or denaturation of double-stranded DNA, and technical errors such as pipetting inaccuracies. The program thereby helps the user to optimize experimental parameters and to choose the best lanes for estimating an average equilibrium constant. This process can reduce the inaccuracy of equilibrium constants from the usual factor of 2 to about 20%, which is particularly useful when determining position weight matrices and cooperative binding constants to predict genomic binding sites. The MATLAB source code, platform-dependent software and installation instructions are available via the website http://micr.vub.ac.be.
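    The per-lane calculation can be sketched for the simplest case, a one-step complex P + D ⇌ PD with trace DNA so that free protein ≈ total protein. The intensities, concentrations and helper name `lane_K` below are hypothetical, and the program's background and non-linearity corrections are not reproduced:

```python
# Per-lane estimate of a one-step protein-DNA binding constant from EMSA
# band intensities, assuming trace DNA so free protein ~ total protein.
# Intensities (arbitrary units) and protein concentrations (M) are
# hypothetical, as is the helper name lane_K.

def lane_K(i_bound, i_free, protein_total):
    """K = [PD]/([P][D]) ~ f / ((1 - f) * P_total), f = bound fraction."""
    f = i_bound / (i_bound + i_free)
    return f / ((1.0 - f) * protein_total)

# Three lanes of a hypothetical titration: (bound, free, protein)
lanes = [(300.0, 700.0, 1.0e-8), (600.0, 400.0, 3.5e-8), (900.0, 100.0, 2.1e-7)]
estimates = [lane_K(b, fr, p) for b, fr, p in lanes]
K_avg = sum(estimates) / len(estimates)
print(f"{K_avg:.3g} M^-1")  # → 4.29e+07 M^-1
```

    Averaging over the best lanes, as the software does, is what pulls the error down from a factor of 2 toward ~20%.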

  8. Space Station Software Recommendations

    NASA Technical Reports Server (NTRS)

    Voigt, S. (Editor)

    1985-01-01

    Four panels of invited experts and NASA representatives focused on the following topics: software management, software development environment, languages, and software standards. Each panel deliberated in private, held two open sessions with audience participation, and developed recommendations for the NASA Space Station Program. The major thrusts of the recommendations were as follows: (1) The software management plan should establish policies, responsibilities, and decision points for software acquisition; (2) NASA should furnish a uniform modular software support environment and require its use for all space station software acquired (or developed); (3) The language Ada should be selected for space station software, and NASA should begin to address issues related to the effective use of Ada; and (4) The space station software standards should be selected (based upon existing standards where possible), and an organization should be identified to promulgate and enforce them. These and related recommendations are described in detail in the conference proceedings.

  9. Application of Artificial Intelligence technology to the analysis and synthesis of reliable software systems

    NASA Technical Reports Server (NTRS)

    Wild, Christian; Eckhardt, Dave

    1987-01-01

    The development of a methodology for the production of highly reliable software is one of the greatest challenges facing the computer industry. Meeting this challenge will undoubtedly involve the integration of many technologies. This paper describes the use of Artificial Intelligence technologies in the automated analysis of the formal algebraic specifications of abstract data types. These technologies include symbolic execution of specifications using techniques of automated deduction, and machine learning through the use of examples. On-going research into the role of knowledge representation and problem solving in the process of developing software is also discussed.

  10. Microcomputer technology applications: Charger and regulator software for a breadboard programmable power processor

    NASA Technical Reports Server (NTRS)

    Green, D. M.

    1978-01-01

    Software programs are described: one implements a voltage regulation function, and one implements a charger function with peak-power tracking of its input. The software, written in modular fashion, is intended as a vehicle for further experimentation with the P-3 system. A control teleprinter allows an operator to make parameter modifications to the control algorithm during experiments. The programs require 3K of ROM and 2K of RAM each. User manuals for each system are included, as well as a third program for simple I/O control.
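    The report's actual charger algorithm is not given in the abstract; peak-power tracking is commonly done with a perturb-and-observe loop, which can be sketched as follows (the source curve, step size and function names are hypothetical):

```python
# Perturb-and-observe peak-power tracking: step the operating voltage,
# observe the power change, and reverse direction when power drops.
# The source curve and step size below are hypothetical.

def source_power(v):
    """Hypothetical source power curve with a single peak at v = 15.0."""
    return max(0.0, 60.0 - (v - 15.0) ** 2)

def track_peak(v=5.0, step=0.5, iters=100):
    p_prev = source_power(v)
    direction = 1.0
    for _ in range(iters):
        v += direction * step
        p = source_power(v)
        if p < p_prev:           # power dropped: reverse the perturbation
            direction = -direction
        p_prev = p
    return v

v_op = track_peak()
print(round(v_op, 1))            # settles within one step of the 15.0 V peak
```

    The steady-state behavior is a small oscillation around the peak, which is why the step size trades tracking speed against ripple.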

  11. An application of the IMC software to controller design for the JPL LSCL Experiment Facility

    NASA Technical Reports Server (NTRS)

    Zhu, Guoming; Skelton, Robert E.

    1993-01-01

    A software package which Integrates Model reduction and Controller design (The IMC software) is applied to design controllers for the JPL Large Spacecraft Control Laboratory Experiment Facility. Modal Cost Analysis is used for the model reduction, and various Output Covariance Constraints are guaranteed by the controller design. The main motivation is to find the controller with the 'best' performance with respect to output variances. Indeed it is shown that by iterating on the reduced order design model, the controller designed does have better performance than that obtained with the first model reduction.

  12. Experimental demonstration of elastic optical networks based on enhanced software defined networking (eSDN) for data center application.

    PubMed

    Zhang, Jie; Yang, Hui; Zhao, Yongli; Ji, Yuefeng; Li, Hui; Lin, Yi; Li, Gang; Han, Jianrui; Lee, Young; Ma, Teng

    2013-11-04

    Due to the high-burstiness and high-bandwidth characteristics of the applications, data center interconnection by elastic optical networks has attracted much attention from network operators and service providers. Many data center applications require lower delay and higher availability with end-to-end guaranteed quality of service. In this paper, we propose and implement a novel elastic optical network based on an enhanced software defined networking (eSDN) architecture for data center applications, by introducing a transport-aware cross stratum optimization (TA-CSO) strategy. eSDN can enable cross stratum optimization of application and elastic optical network stratum resources and provide elastic physical layer parameter adjustment, e.g., modulation format and bandwidth. We have experimentally verified software defined path provisioning on our testbed with four real OpenFlow-enabled elastic optical nodes for data center applications. The overall feasibility and efficiency of the proposed architecture is also experimentally demonstrated and compared with individual CSO and physical layer adjustment strategies in terms of path setup/release/adjustment latency, blocking probability and resource occupation rate.

  13. Software Epistemology

    DTIC Science & Technology

    2016-03-01

    corpuses at scale using deep neural networks, i.e., Deep Machine Learning, on high quality features computed from canonical representations of...the application of Deep Learning on software features to support automated vulnerability identification and repair. 1.2 Overview Draper’s...referenced in Table 2. Several web-based tools were maintained to show cluster processing status. Figure 10 shows a snapshot of the build inventory

  14. Application of Calibrated Peer Review (CPR) Writing Assignments to Enhance Experiments with an Environmental Chemistry Focus

    ERIC Educational Resources Information Center

    Margerum, Lawrence D.; Gulsrud, Maren; Manlapez, Ronald; Rebong, Rachelle; Love, Austin

    2007-01-01

    The browser-based software program, Calibrated Peer Review (CPR) developed by the Molecular Science Project enables instructors to create structured writing assignments in which students learn by writing and reading for content. Though the CPR project covers only one experiment in general chemistry, it might provide lab instructors with a method…

  15. Study of application of space telescope science operations software for SIRTF use

    NASA Technical Reports Server (NTRS)

    Dignam, F.; Stetson, E.; Allendoerfer, W.

    1985-01-01

    The design and development of the Space Telescope Science Operations Ground System (ST SOGS) was evaluated to compile a history of lessons learned that would benefit NASA's Space Infrared Telescope Facility (SIRTF). Forty-nine specific recommendations resulted and were categorized as follows: (1) requirements: a discussion of the content, timeliness and proper allocation of the system and segment requirements and the resulting impact on SOGS development; (2) science instruments: a consideration of the impact of the Science Instrument design and data streams on SOGS software; and (3) contract phasing: an analysis of the impact of beginning the various ST program segments at different times. Approximately half of the software design and source code might be useable for SIRTF. Transportability of this software requires, at minimum, a compatible DEC VAX-based architecture and VMS operating system, system support software similar to that developed for SOGS, and continued evolution of the SIRTF operations concept and requirements such that they remain compatible with ST SOGS operation.

  16. Contemporary Applications of Computer Technology: Development of Meaningful Software in Special Education/Rehabilitation.

    ERIC Educational Resources Information Center

    Mills, Russell

    Four elements of clinical programming must be considered during development in order for a software program to be truly useful in rehabilitation: presentation of a useful task; treatment parameters selectable by clinicians; data collection/analysis; and authoring capability. These criteria govern the development of all Brain-Link Software…

  17. Possible Application of Quality Function Deployment in Software Systems Development in the United States Air Force

    DTIC Science & Technology

    1991-12-01

    his cooperation in acquiring QFD Designer. I also wish to thank Mr Allen Chartier of the American Supplier Institute for his help in identifying...and What Didn’t," Transactions from the Symposium on Quality Function Deployment. 305-335. Dearborn MI: ASI Press, 1989. Pressman, Roger S. Software

  18. Altering the Application of the Traditional Systems Development Life Cycle for Air Force Software Programs.

    DTIC Science & Technology

    1987-04-01

    York: North Holland, Inc., 1981. 2. Fox, Joseph M. Software and Its Development. Englewood Cliffs, New Jersey: Prentice-Hall, Inc., 1982. 3. Gujarati, Damodar. Basic Econometrics. New York: McGraw-Hill Book Company, 1978. 4. Larr, L., et al. Planning Guide for Computer Programming Development

  19. A Component Approach to Collaborative Scientific Software Development: Tools and Techniques Utilized by the Quantum Chemistry Science Application Partnership

    DOE PAGES

    Kenny, Joseph P.; Janssen, Curtis L.; Gordon, Mark S.; ...

    2008-01-01

    Cutting-edge scientific computing software is complex, increasingly involving the coupling of multiple packages to combine advanced algorithms or simulations at multiple physical scales. Component-based software engineering (CBSE) has been advanced as a technique for managing this complexity, and complex component applications have been created in the quantum chemistry domain, as well as several other simulation areas, using the component model advocated by the Common Component Architecture (CCA) Forum. While programming models do indeed enable sound software engineering practices, the selection of programming model is just one building block in a comprehensive approach to large-scale collaborative development which must also address interface and data standardization, and language and package interoperability. We provide an overview of the development approach utilized within the Quantum Chemistry Science Application Partnership, identifying design challenges, describing the techniques which we have adopted to address these challenges and highlighting the advantages which the CCA approach offers for collaborative development.

  20. Further Evolution of Composite Doubler Aircraft Repairs Through a Focus on Niche Applications

    SciTech Connect

    ROACH,DENNIS P.

    2000-07-15

    The number of commercial airframes exceeding twenty years of service continues to grow. A typical aircraft can experience over 2,000 fatigue cycles (cabin pressurizations) and even greater flight hours in a single year. An unavoidable by-product of aircraft use is that crack and corrosion flaws develop throughout the aircraft's skin and substructure elements. Economic barriers to the purchase of new aircraft have created an aging aircraft fleet and placed even greater demands on efficient and safe repair methods. The use of bonded composite doublers offers airframe manufacturers and aircraft maintenance facilities a cost-effective method to safely extend the lives of their aircraft. Instead of riveting multiple steel or aluminum plates to facilitate an aircraft repair, it is now possible to bond a single Boron-Epoxy composite doubler to the damaged structure. The FAA's Airworthiness Assurance Center at Sandia National Labs (AANC) is conducting a program with Boeing and Federal Express to validate and introduce composite doubler repair technology to the US commercial aircraft industry. This project focuses on repair of DC-10 structure and builds on the foundation of the successful L-1011 door corner repair that was completed by the AANC, Lockheed-Martin, and Delta Air Lines. The L-1011 composite doubler repair was installed in 1997 and has not developed any flaws in over three years of service. As a follow-on effort, this DC-10 repair program investigated design, analysis, performance (durability, flaw containment, reliability), installation, and nondestructive inspection issues. Current activities are demonstrating regular use of composite doubler repairs on commercial aircraft. The primary goal of this program is to move the technology into niche applications and to streamline the design-to-installation process. Using the data accumulated to date, the team has designed, analyzed, and developed inspection techniques for an array of composite doubler repairs

  1. On Software Compatibility.

    ERIC Educational Resources Information Center

    Ershov, Andrei P.

    The problem of software compatibility hampers the development of computer applications. One solution lies in standardization of languages, terms, peripherals, operating systems, and computer characteristics. (AB)

  2. THE APPLICATION OF THE SXF LATTICE DESCRIPTION AND THE UAL SOFTWARE ENVIRONMENT TO THE ANALYSIS OF THE LHC.

    SciTech Connect

    FISCHER,W.; PILAT,F.; PTITSON,V.

    1999-03-29

    A software environment for accelerator modeling has been developed which includes the UAL (Unified Accelerator Library), a collection of accelerator physics libraries with a Perl interface for scripting, and the SXF (Standard eXchange Format), a format for accelerator description which extends the MAD sequence by including deviations from design values. SXF interfaces have been written for several programs, including MAD9 and MAD8 via the doom database, Cosy, TevLat and UAL itself, which includes Teapot++. After an overview of the software we describe the application of the tools to the analysis of LHC lattice stability, in the presence of alignment and coupling errors, and to the correction of the first turn and closed orbit in the machine.

  3. The Study on Neuro-IE Management Software in Manufacturing Enterprises. -The Application of Video Analysis Technology

    NASA Astrophysics Data System (ADS)

    Bian, Jun; Fu, Huijian; Shang, Qian; Zhou, Xiangyang; Ma, Qingguo

    This paper analyzes outstanding problems in current industrial production by reviewing the three stages of industrial engineering development. Based on investigations and interviews in enterprises, we propose the new idea of applying computer video analysis technology in new industrial engineering management software, and of adding a "loose coefficient" for each work station to this software in order to support scientific and humane production scheduling. Meanwhile, we suggest utilizing biofeedback technology to promote further research on the rules governing workers' physiological, psychological and emotional changes in production. This combination will push forward industrial engineering theory and help enterprises progress towards flexible social production, and is thus of great value for theoretical innovation, social significance and practical application.

  4. MisTec: A software application for supporting space exploration scenario options and technology development analysis and planning

    NASA Technical Reports Server (NTRS)

    Horsham, Gary A. P.

    1991-01-01

    The structure and composition of a new, emerging software application, which models and analyzes space exploration scenario options for feasibility based on technology development projections, is presented. The software application consists of four main components: a scenario generator for designing and inputting scenario options and constraints; a processor which performs algorithmic coupling and options analyses of mission activity requirements and technology capabilities; a results display which graphically and textually shows coupling and options analysis results; and a data/knowledge base which contains information on a variety of mission activities and (power and propulsion) technology system capabilities. The general long-range study process used by NASA to support recent studies is briefly introduced to provide the primary basis for comparison when discussing the potential advantages to be gained from developing and applying this kind of application. A hypothetical example of a scenario option is presented to facilitate a conceptual understanding of what the application is, how it works (the operating methodology), and when it might be applied.

  5. MisTec - A software application for supporting space exploration scenario options and technology development analysis and planning

    NASA Technical Reports Server (NTRS)

    Horsham, Gary A. P.

    1992-01-01

    The structure and composition of a new, emerging software application, which models and analyzes space exploration scenario options for feasibility based on technology development projections, is presented. The software application consists of four main components: a scenario generator for designing and inputting scenario options and constraints; a processor which performs algorithmic coupling and options analyses of mission activity requirements and technology capabilities; a results display which graphically and textually shows coupling and options analysis results; and a data/knowledge base which contains information on a variety of mission activities and (power and propulsion) technology system capabilities. The general long-range study process used by NASA to support recent studies is briefly introduced to provide the primary basis for comparison when discussing the potential advantages to be gained from developing and applying this kind of application. A hypothetical example of a scenario option is presented to facilitate a conceptual understanding of what the application is, how it works (the operating methodology), and when it might be applied.

  6. Design, Fabrication, and Characterization of Metamaterials for Transformation Optics and Focusing Applications

    DTIC Science & Technology

    2014-02-11

    of refraction in the region of the “lens”, successfully focusing surface plasmon polaritons (SPP). SUPERABSORBERS: The team used the Rigorous Coupled...PLASMONIC FOCUSING: The team constructed a device capable of splitting and focusing surface plasmon polaritons into different locations depending on the...surface plasmon polaritons, plasmonics

  7. Microfabrication and Test of a Three-Dimensional Polymer Hydro-focusing Unit for Flow Cytometry Applications

    NASA Technical Reports Server (NTRS)

    Yang, Ren; Feeback, Daniel L.; Wang, Wan-Jun

    2005-01-01

    This paper details a novel three-dimensional (3D) hydro-focusing micro cell sorter for micro flow cytometry applications. The unit was microfabricated by means of SU-8 3D lithography. The 3D microstructure for coaxial sheathing was designed, microfabricated, and tested. Three-dimensional hydrofocusing capability was demonstrated with an experiment to sort labeled tanned sheep erythrocytes (red blood cells). This polymer hydro-focusing microstructure is easily microfabricated and integrated with other polymer microfluidic structures. Keywords: SU-8, three-dimensional hydro-focusing, microfluidic, microchannel, cytometer

  8. Value Benefit Analysis Software and Its Application in Bolu-Lake Abant Natural Park

    PubMed Central

    Aytekin, Alper; Corbaci, Omer Lutfu

    2008-01-01

    Value benefit analysis (VBA) is a psychometric instrument for finding the best compromise in forestry multiple-use planning when the multiple objectives cannot be expressed in the same physical or monetary unit. It ensures a systematic assessment of the consequences of proposed alternatives and thoroughly documents the decision process. The method leads to a ranking of alternatives based upon weighting of the objectives and evaluation of the contribution of each alternative to these objectives. The use of the method is illustrated with hypothetical data about Bolu-Lake Abant Natural Park (BLANP). In addition, computer software that checks the consistency of the judgments was created for this study. This software puts into practice the method proposed by Churchman and Ackoff, and determines the significance of the alternatives quickly and accurately. PMID:27873837
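    The weighting-and-ranking core of the Churchman-Ackoff procedure reduces to normalizing the objective weights and scoring each alternative with a weighted sum; a minimal sketch (the objectives, weights and scores below are hypothetical, not BLANP data):

```python
# Value benefit analysis core: normalize raw objective weights, then rank
# alternatives by their weighted contribution to each objective.
# Objectives, weights and scores are hypothetical.

def rank_alternatives(weights, scores):
    """weights: {objective: raw weight}; scores: {alt: {objective: 0..1}}.
    Returns (alternative, total value) pairs, best first."""
    total_w = sum(weights.values())
    w = {k: v / total_w for k, v in weights.items()}     # normalize
    value = {alt: sum(w[obj] * s for obj, s in objs.items())
             for alt, objs in scores.items()}
    return sorted(value.items(), key=lambda kv: kv[1], reverse=True)

weights = {"recreation": 4.0, "conservation": 3.0, "revenue": 1.0}
scores = {
    "trail network": {"recreation": 0.9, "conservation": 0.6, "revenue": 0.3},
    "campground":    {"recreation": 0.7, "conservation": 0.4, "revenue": 0.8},
}
best, _ = rank_alternatives(weights, scores)[0]
print(best)  # → trail network
```

    The consistency check the study's software adds (comparing each weight against sums of lower-ranked weights) sits on top of this ranking step.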

  9. PubMedAlertMe--standalone Windows-based PubMed SDI software application.

    PubMed

    Ma'ayan, Avi

    2008-05-01

    PubMedAlertMe is a Windows-based software system for automatically receiving e-mail alert messages about recent publications listed on PubMed. The e-mail messages contain links to newly available abstracts listed on PubMed describing publications that were selectively returned from a specified list of queries. Links are also provided to directly export citations to EndNote, and links are provided to directly forward articles to colleagues. The program is standalone. Thus, it does not require a remote mail server or user registration. PubMedAlertMe is free software, and can be downloaded from: http://amp.pharm.mssm.edu/PubMedAlertMe/PubMedAlertMe_setup.zip.
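    The abstract does not describe the program's internals; the polling step behind such an SDI alert can be sketched against NCBI's public E-utilities `esearch` endpoint (the query and date window are hypothetical, and the URL is only constructed here, not fetched):

```python
from urllib.parse import urlencode

# Building an NCBI E-utilities esearch request for PubMed entries added
# in the last few days that match a saved query -- the polling step
# behind an SDI alert.  Query and date window are hypothetical.

BASE = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def alert_url(query, days=7):
    params = {
        "db": "pubmed",
        "term": query,
        "reldate": days,      # only records from the last <days> days
        "datetype": "edat",   # Entrez date: when PubMed indexed the record
        "retmode": "json",
    }
    return BASE + "?" + urlencode(params)

url = alert_url("ubiquitin ligase", days=7)
print(url)
```

    Fetching this URL on a schedule and e-mailing any new PMIDs is the essence of the alerting loop.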

  10. [Computers in nursing: development of free software application with care and management].

    PubMed

    dos Santos, Sérgio Ribeiro

    2010-06-01

    This study aimed at developing a nursing information system implementing both nursing care and service management. SisEnf--Information System in Nursing--is a free software module that comprises nursing care (history, clinical examination and care plan) and a management module (service shifts, personnel management, hospital indicators and other elements). The system was implemented at the Medical Clinic of the Lauro Wanderley University Hospital, at Universidade Federal da Paraiba. In view of the need to bring user and developer closer together, and given the constant change of functional requirements during the interactive process, the Unified Process method was used. SisEnf was developed on a WEB platform using free software. The work thus aimed at assisting the nursing work process, which will now have the opportunity to incorporate information technology into its routine.

  11. Parallel Domain Decomposition Formulation and Software for Large-Scale Sparse Symmetrical/Unsymmetrical Aeroacoustic Applications

    NASA Technical Reports Server (NTRS)

    Nguyen, D. T.; Watson, Willie R. (Technical Monitor)

    2005-01-01

    The overall objectives of this research are to formulate and validate efficient parallel algorithms, and to efficiently design and implement computer software, for solving large-scale acoustic problems arising from the unified framework of finite element procedures. The adopted parallel Finite Element (FE) Domain Decomposition (DD) procedures should take full advantage of the multiple processing capabilities offered by most modern high-performance computing platforms for efficient parallel computation. To achieve this objective, the formulation needs to integrate efficient sparse (and dense) assembly techniques, hybrid (or mixed) direct and iterative equation solvers, proper preconditioning strategies, unrolling strategies, and effective interprocessor communication schemes. Finally, the numerical performance of the developed parallel finite element procedures is evaluated by solving a series of structural and acoustic (symmetrical and unsymmetrical) problems on different computing platforms. Comparisons with existing "commercialized" and/or "public domain" software are also included whenever possible.

  12. Development and validation of a video analysis software for marine benthic applications

    NASA Astrophysics Data System (ADS)

    Romero-Ramirez, A.; Grémare, A.; Bernard, G.; Pascal, L.; Maire, O.; Duchêne, J. C.

    2016-10-01

    Our aim in the EU-funded JERICO project was to develop a flexible and scalable imaging platform that could be used in the widest possible set of ecological situations. Depending on research objectives, both image acquisition and analysis procedures may indeed differ. Up to now, attempts at automating image analysis procedures have consisted of pieces of software specifically designed for a given objective. This led to the conception of a new software package: AVIExplore. Its general architecture and its three constitutive modules, AVIExplore - Mobile, AVIExplore - Fixed and AVIExplore - ScriptEdit, are presented. AVIExplore provides a unique environment for video analysis. Its main features include: (1) image selection tools allowing for the division of videos into homogeneous sections, (2) automatic extraction of targeted information, (3) solutions for long-term time series as well as large-spatial-scale image acquisition, (4) real-time acquisition and, in some cases, real-time analysis, and (5) a large range of customized image-analysis possibilities through a script editor. The flexibility of use of AVIExplore is illustrated and validated by three case studies: (1) coral identification and mapping, (2) identification and quantification of different types of behaviors in a mud shrimp, and (3) quantification of filtering activity in a passive suspension feeder. The accuracy of the software, measured against visual assessment, is 90.2%, 82.7% and 98.3% for the three case studies, respectively. Some of the advantages and current limitations of the software, as well as some of its foreseen advancements, are then briefly discussed.

  13. Development of methods for equivalent transformation of GERT networks for application in multi-version software

    NASA Astrophysics Data System (ADS)

    Saramud, M. V.; Zelenkov, P. V.; Kovalev, I. V.; Kovalev, D. I.; Kartsan, I. N.

    2016-11-01

    The usage of information and control systems in various fields of production and research requires a certain level of reliability in the functioning of these systems. One of the most promising, and already positively proven, methodologies to ensure high reliability and fault tolerance of software is multi-version design. This factor is of greatest value in areas where control-system failure can result in significant financial and material losses, as well as harm to human health and life.

  14. Modeling Heterogeneous Carbon Nanotube Networks for Photovoltaic Applications Using Silvaco Atlas Software

    DTIC Science & Technology

    2012-06-01

    14 Figure 9. The spectral responses of gallium indium phosphide (GaInP), gallium arsenide (GaAs), and germanium (Ge) solar cells are graphed...power was not created until 1954 when Chapin, Fuller, and Pearson developed a silicon based solar cell for Bell Labs. Since the creation of the...top layer of a Gallium Arsenide (GaAs) solar cell which was simulated in ATLAS software. 3 II. BACKGROUND The use of CNTs to improve the

  15. Application of Real Options Theory to DoD Software Acquisitions

    DTIC Science & Technology

    2009-08-01

    financial valuation, Monte Carlo simulation, stochastic forecasting, optimization, and - vi - risk analysis. He also holds the position of the EU...opportunity exceeds the exercise price (Hevert, 2008). A Monte Carlo simulation of the risk model (Figure 5) was run using the Risk Simulator software...approach. We then ran the Monte Carlo simulation of the model with the revised risk estimates again. Based on the risk of requirements uncertainty8

  16. Perspectives in Medical Applications of Monte Carlo Simulation Software for Clinical Practice in Radiotherapy Treatments

    NASA Astrophysics Data System (ADS)

    Boschini, Matteo; Giani, Simone; Ivanchenko, Vladimir; Rancoita, Pier-Giorgio

    2006-04-01

    We discuss the physics requirements to accurately model radiation dosimetry in the human body as performed for oncological radiotherapy treatment. Recent advancements in computing hardware and software simulation technology allow precise dose calculation on real-life imaging output, with speed suitable for clinical needs. An experimental programme, based on published physics literature, is proposed to demonstrate the actual possibility of improving the precision of radiotherapy treatment planning.

  17. Interfacing US Census map files with statistical graphics software: Application and use in epidemiology

    SciTech Connect

    Rizzardi, M.; Mohr, M.S.; Merrill, D.W.; Selvin, S. (California Univ., Berkeley, CA. School of Public Health)

    1993-03-01

    In 1990, the United States Bureau of the Census released detailed geographic map files known as TIGER/Line (Topologically Integrated Geographic Encoding and Referencing). The TIGER files, accessible through purchase or Federal repository libraries, contain 24 billion characters of data describing various geographic features including coastlines, hydrography, transportation networks, political boundaries, etc. covering the entire United States. Many of these physical features are of potential interest in epidemiological case studies. Unfortunately, the TIGER database only provides raw alphanumeric data; no utility software, graphical or otherwise, is included. Recently, the S statistical software package has been extended to include a map display function. The map function augments S's high-level approach toward statistical analysis and graphical display of data. Coupling this statistical software with the map database developed for US Census data collection will facilitate epidemiological research. We discuss the technical background necessary to utilize the TIGER database for mapping with S. Two types of S maps, segment-based and polygon-based, are discussed along with methods to construct them from TIGER data. Polygon-based maps are useful for displaying regional statistical data; e.g., disease rates or incidence at the census tract level. Segment-based maps are easier to assemble and appropriate if the data are not regionalized. Census tract data of AIDS incidence in San Francisco (CA) and lung cancer case locations relative to petrochemical refinery sites in Contra Costa County (CA) are used to illustrate the methods and potential uses of interfacing the TIGER database with S.
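    The distinction between segment-based and polygon-based S maps can be sketched as follows. This is a minimal illustration with invented coordinate data, written in Python for readability; it is not actual TIGER/Line data or the S map function:

```python
# Hypothetical sketch: segment-based vs. polygon-based maps.
# Coordinates are made up; real TIGER/Line records are far richer.

def draw_segments(segments):
    """Segment-based map: each line segment is drawn independently,
    so no topology needs to be reconstructed."""
    return [(start, end) for start, end in segments]

def assemble_polygon(segments):
    """Polygon-based map: chain segments end-to-end into a closed
    boundary ring, so the enclosed region can be shaded by a statistic."""
    remaining = list(segments)
    start, end = remaining.pop(0)
    ring = [start, end]
    while remaining:
        for i, (a, b) in enumerate(remaining):
            if a == ring[-1]:      # segment continues the ring forward
                ring.append(b)
                remaining.pop(i)
                break
            if b == ring[-1]:      # segment is stored in reverse order
                ring.append(a)
                remaining.pop(i)
                break
        else:
            raise ValueError("segments do not form a closed ring")
    return ring

# A unit square given as unordered, partly reversed segments:
square = [((0, 0), (1, 0)), ((1, 1), (0, 1)),
          ((1, 0), (1, 1)), ((0, 1), (0, 0))]
ring = assemble_polygon(square)
print(ring[0] == ring[-1])  # the boundary closes on itself
```

    The extra chaining step is why, as the abstract notes, polygon-based maps are harder to assemble than segment-based ones; with real data the matching must also tolerate floating-point coordinates and borders shared between adjacent tracts.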

  18. Two-dimensional spatiotemporal focusing of femtosecond pulses and its applications in microscopy

    SciTech Connect

    Song, Qiyuan; Nakamura, Aoi; Hirosawa, Kenichi; Kannari, Fumihiko; Isobe, Keisuke; Midorikawa, Katsumi

    2015-08-15

    We demonstrate and theoretically analyze the two-dimensional spatiotemporal focusing of femtosecond pulses by utilizing a two-dimensional spectral disperser. Compared with spatiotemporal focusing with a diffraction grating, it can achieve widefield illumination with better sectioning ability for a multiphoton excitation process. By utilizing paraxial approximation, our analytical method improves the axial confinement ability and identifies that the free spectra range (FSR) of the two-dimensional spectral disperser affects the out-of-focus multiphoton excitation intensity due to the temporal self-imaging effect. Based on our numerical simulation, a FSR of 50 GHz is necessary to reduce the out-of-focus two-photon excitation by 2 orders of magnitude compared with that in a grating-based spatiotemporal focusing scheme for a 90-fs excitation laser pulse. We build a two-dimensional spatiotemporal focusing microscope using a virtually imaged phased array and achieve an axial resolution of 1.3 μm, which outperforms the resolution of conventional spatiotemporal focusing using a grating by a factor of 1.7, and demonstrate better image contrast inside a tissue-like phantom.

  19. Software for Fermat's Principle and Lenses

    ERIC Educational Resources Information Center

    Mihas, Pavlos

    2012-01-01

    Fermat's principle is considered as a unifying concept. It is usually presented erroneously as a "least time principle". In this paper we present some software that shows cases of maxima and minima and the application of Fermat's principle to the problem of focusing in lenses. (Contains 12 figures.)
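    The point that Fermat's principle is a stationarity condition, not strictly a "least time principle," can be shown numerically. The following sketch, with assumed refractive indices and endpoints, finds the stationary-time crossing point of a refracted ray and checks that it satisfies Snell's law:

```python
# Sketch (not the paper's software): Fermat's principle at a flat
# interface. Minimizing travel time over the crossing point x yields
# angles obeying Snell's law n1*sin(t1) = n2*sin(t2).
import math

n1, n2 = 1.0, 1.5                 # assumed indices (air -> glass)
A, B = (0.0, 1.0), (1.0, -1.0)    # endpoints on opposite sides of y = 0

def travel_time(x):
    """Optical path length of A -> (x, 0) -> B (time up to a factor 1/c)."""
    d1 = math.hypot(x - A[0], A[1])
    d2 = math.hypot(B[0] - x, B[1])
    return n1 * d1 + n2 * d2

# Ternary search: travel_time is convex in x, so this converges.
lo, hi = 0.0, 1.0
for _ in range(200):
    m1, m2 = lo + (hi - lo) / 3, hi - (hi - lo) / 3
    if travel_time(m1) < travel_time(m2):
        hi = m2
    else:
        lo = m1
x = (lo + hi) / 2

sin_t1 = (x - A[0]) / math.hypot(x - A[0], A[1])
sin_t2 = (B[0] - x) / math.hypot(B[0] - x, B[1])
print(abs(n1 * sin_t1 - n2 * sin_t2) < 1e-6)  # Snell's law holds
```

    Here the stationary point happens to be a minimum; the software described in the paper also exhibits configurations (e.g. reflection inside an ellipse) where the stationary path is a maximum.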

  20. A Practical Software Architecture for Virtual Universities

    ERIC Educational Resources Information Center

    Xiang, Peifeng; Shi, Yuanchun; Qin, Weijun

    2006-01-01

    This article introduces a practical software architecture called CUBES, which focuses on system integration and evolvement for online virtual universities. The key of CUBES is a supporting platform that helps to integrate and evolve heterogeneous educational applications developed by different organizations. Both standardized educational…

  1. Shuttle mission simulator software conceptual design

    NASA Technical Reports Server (NTRS)

    Burke, J. F.

    1973-01-01

    Software conceptual designs (SCD) are presented for meeting the simulator requirements for the shuttle missions. The major areas of the SCD discussed include: malfunction insertion, flight software, applications software, systems software, and computer complex.

  2. Development and application of a complex numerical model and software for the computation of dose conversion factors for radon progenies.

    PubMed

    Farkas, Árpád; Balásházy, Imre

    2015-04-01

    A more exact determination of dose conversion factors associated with radon progeny inhalation has become possible due to advancements in epidemiological health risk estimates in recent years. The enhancement of computational power and the development of numerical techniques allow computing dose conversion factors with increasing reliability. The objective of this study was to develop an integrated model and software, based on a self-developed airway deposition code, the authors' own bronchial dosimetry model and the computational methods accepted by the International Commission on Radiological Protection (ICRP), to calculate dose conversion coefficients for different exposure conditions. The model was tested by applying it to exposure and breathing conditions characteristic of mines and homes. The dose conversion factors were 8 and 16 mSv WLM(-1) for homes and mines when applying a stochastic deposition model combined with the ICRP dosimetry model (named the PM-A model), and 9 and 17 mSv WLM(-1) when applying the same deposition model combined with the authors' bronchial dosimetry model and the ICRP bronchiolar and alveolar-interstitial dosimetry model (called the PM-B model). User-friendly software for the computation of dose conversion factors has also been developed. The software allows one to compute conversion factors for a large range of exposure and breathing parameters and to perform sensitivity analyses.
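    As a minimal illustration of how such factors are applied (assumed usage, not the authors' software): the effective dose is the cumulative exposure in working-level months multiplied by the dose conversion factor.

```python
# Dose conversion factors quoted in the abstract (PM-A model):
DCF_HOME_PM_A = 8.0    # mSv per WLM, homes
DCF_MINE_PM_A = 16.0   # mSv per WLM, mines

def effective_dose(exposure_wlm, dcf):
    """Effective dose in mSv for a cumulative exposure given in WLM."""
    return exposure_wlm * dcf

print(effective_dose(0.25, DCF_HOME_PM_A))  # 2.0 mSv for 0.25 WLM at home
```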

  3. The Life Cycle Application of Intelligent Software Modeling for the First Materials Science Research Rack

    NASA Technical Reports Server (NTRS)

    Rice, Amanda; Parris, Frank; Nerren, Philip

    2000-01-01

    Marshall Space Flight Center (MSFC) has been funding development of intelligent software models to benefit payload ground operations for nearly a decade. Experience gained from simulator development and real-time monitoring and control is being applied to engineering design, testing, and operation of the First Materials Science Research Rack (MSRR-1). MSRR-1 is the first rack in a suite of three racks comprising the Materials Science Research Facility (MSRF), which will operate on the International Space Station (ISS). The MSRF will accommodate advanced microgravity investigations in areas such as solidification of metals and alloys, thermo-physical properties of polymers, crystal growth studies of semiconductor materials, and research in ceramics and glasses. The MSRR-1 is a joint venture between NASA and the European Space Agency (ESA) to study the behavior of different materials during high-temperature processing in a low-gravity environment. The planned MSRR-1 mission duration is five (5) years on-orbit and the total design life is ten (10) years. The MSRR-1 launch is scheduled on the third Utilization Flight (UF-3) to ISS, currently planned for February 2003. The objective of MSRR-1 is to provide an early capability on the ISS to conduct materials science, materials technology, and space product research investigations in microgravity. It will provide a modular, multi-user facility for microgravity research in materials crystal growth and solidification. An intelligent software model of MSRR-1 is under development and will serve multiple purposes to support the engineering analysis, testing, training, and operational phases of the MSRR-1 life cycle. The G2 real-time expert system software environment developed by Gensym Corporation was selected as the intelligent system shell for this development work, based on past experience and the effectiveness of the programming environment. Our approach of multiple uses of the simulation model and

  4. Interfacing U.S. census map files with statistical graphics software: application and use in epidemiology.

    PubMed

    Rizzardi, M; Mohr, M S; Merrill, D W; Selvin, S

    1993-10-01

    In 1990, the United States Bureau of the Census released detailed geographic map files known as TIGER/Line (Topologically Integrated Geographic Encoding and Referencing). The TIGER files, accessible through purchase or federal repository libraries, contain 24 billion characters of data describing various geographic features including coastlines, hydrography, transportation networks, political boundaries, etc. for the entire United States. Many of these physical features are of potential interest in epidemiological case studies. Unfortunately, the TIGER data base only provides raw alphanumeric data; no utility software, graphical or otherwise, is included. Recently, the S statistical software package has been extended to include a map display function. The map function augments S's high-level approach towards statistical analysis and graphical display of data. Coupling this statistical software with the map data base developed for U.S. Census data collection will facilitate epidemiological research. We discuss the technical background necessary to utilize the TIGER data base for mapping with S. Two types of S maps, segment-based and polygon-based, are discussed along with methods to construct them from TIGER data. Polygon-based maps are useful for displaying regional statistical data, such as disease rates or incidence at the census tract level. Segment-based maps are easier to assemble and are appropriate when the data are not regionalized. Census tract data of AIDS incidence in San Francisco and lung cancer case locations relative to petrochemical refinery sites in Contra Costa County are used to illustrate the methods and potential uses of interfacing the TIGER data base with S.

  5. Interfacing US Census map files with statistical graphics software: Application and use in epidemiology. Revision 1

    SciTech Connect

    Rizzardi, M.; Mohr, M.S.; Merrill, D.W.; Selvin, S.

    1993-03-01

    In 1990, the United States Bureau of the Census released detailed geographic map files known as TIGER/Line (Topologically Integrated Geographic Encoding and Referencing). The TIGER files, accessible through purchase or Federal repository libraries, contain 24 billion characters of data describing various geographic features including coastlines, hydrography, transportation networks, political boundaries, etc. covering the entire United States. Many of these physical features are of potential interest in epidemiological case studies. Unfortunately, the TIGER database only provides raw alphanumeric data; no utility software, graphical or otherwise, is included. Recently, the S statistical software package has been extended to include a map display function. The map function augments S's high-level approach toward statistical analysis and graphical display of data. Coupling this statistical software with the map database developed for US Census data collection will facilitate epidemiological research. We discuss the technical background necessary to utilize the TIGER database for mapping with S. Two types of S maps, segment-based and polygon-based, are discussed along with methods to construct them from TIGER data. Polygon-based maps are useful for displaying regional statistical data; e.g., disease rates or incidence at the census tract level. Segment-based maps are easier to assemble and appropriate if the data are not regionalized. Census tract data of AIDS incidence in San Francisco (CA) and lung cancer case locations relative to petrochemical refinery sites in Contra Costa County (CA) are used to illustrate the methods and potential uses of interfacing the TIGER database with S.

  6. Computerized accounting for the dental office. Using horizontal applications general ledger software.

    PubMed

    Garsson, B

    1988-01-01

    Remember that computer software is designed for accrual accounting, whereas your business operates and reports income on a cash basis. The rules of tax law stipulate that professional practices may use the cash method of accounting, but if accrual accounting is ever used to report taxable income the government may not permit a switch back to cash accounting. Therefore, always consider the computer as a bookkeeper, not a substitute for a qualified accountant. (Your accountant will have readily accessible payroll and general ledger data available for analysis and tax reports, thanks to the magic of computer processing.) Accounts Payable reports are interfaced with the general ledger and are of interest for transaction detail, open invoice and cash flow analysis, and for a record of payments by vendor. Payroll reports, including check register and withholding detail are provided and interfaced with the general ledger. The use of accounting software expands the use of in-office computers to areas beyond professional billing and insurance form generation. It simplifies payroll recordkeeping; maintains payables details; integrates payables, receivables, and payroll with general ledger files; provides instantaneous information on all aspects of the business office; and creates a continuous "audit-trail" following the entering of data. The availability of packaged accounting software allows the professional business office an array of choices. The person(s) responsible for bookkeeping and accounting should choose carefully, ensuring that any system is easy to use, has been thoroughly tested, and provides at least as much control over office records as has been outlined in this article.

  7. Final Technical Report - Center for Technology for Advanced Scientific Component Software (TASCS)

    SciTech Connect

    Sussman, Alan

    2014-10-21

    This is a final technical report for the University of Maryland work in the SciDAC Center for Technology for Advanced Scientific Component Software (TASCS). The Maryland work focused on software tools for coupling parallel software components built using the Common Component Architecture (CCA) APIs. Those tools are based on the Maryland InterComm software framework that has been used in multiple computational science applications to build large-scale simulations of complex physical systems that employ multiple separately developed codes.

  8. An FPGA hardware/software co-design towards evolvable spiking neural networks for robotics application.

    PubMed

    Johnston, S P; Prasad, G; Maguire, L; McGinnity, T M

    2010-12-01

    This paper presents an approach that permits the effective hardware realization of a novel Evolvable Spiking Neural Network (ESNN) paradigm on Field Programmable Gate Arrays (FPGAs). The ESNN possesses a hybrid learning algorithm that consists of a Spike Timing Dependent Plasticity (STDP) mechanism fused with a Genetic Algorithm (GA). The design and implementation direction utilizes the latest advancements in FPGA technology to provide a partitioned hardware/software co-design solution. The approach achieves the maximum FPGA flexibility obtainable for the ESNN paradigm. The algorithm was applied as an embedded intelligent system robotic controller to solve an autonomous navigation and obstacle avoidance problem.
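    A minimal sketch of the STDP component of such a hybrid learning rule follows; the constants and the exponential window shape are illustrative assumptions, not the paper's values:

```python
# STDP sketch: a synapse is strengthened when the presynaptic spike
# precedes the postsynaptic spike (causal pairing) and weakened
# otherwise, with an exponentially decaying window. Constants assumed.
import math

A_PLUS, A_MINUS = 0.01, 0.012   # learning rates (assumed)
TAU = 20.0                      # window time constant in ms (assumed)

def stdp_dw(t_pre, t_post):
    """Weight change for a single pre/post spike pair (times in ms)."""
    dt = t_post - t_pre
    if dt > 0:    # pre before post -> potentiation
        return A_PLUS * math.exp(-dt / TAU)
    else:         # post before (or with) pre -> depression
        return -A_MINUS * math.exp(dt / TAU)

print(stdp_dw(10.0, 15.0) > 0)   # causal pair strengthens the synapse
print(stdp_dw(15.0, 10.0) < 0)   # anti-causal pair weakens it
```

    In the paper's paradigm a Genetic Algorithm evolves network parameters on top of this local rule; the FPGA partitioning decides which of the two runs in hardware and which in software.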

  9. Development of software application dedicated to impulse- radar-based system for monitoring of human movements

    NASA Astrophysics Data System (ADS)

    Miękina, Andrzej; Wagner, Jakub; Mazurek, Paweł; Morawski, Roman Z.; Sudmann, Tobba T.; Børsheim, Ingebjørg T.; Øvsthus, Knut; Jacobsen, Frode F.; Ciamulski, Tomasz; Winiecki, Wiesław

    2016-11-01

    The importance of research on new technologies that could be employed in care services for elderly and disabled persons is highlighted. The advantages of radar sensors, when applied to non-invasive monitoring of such persons in their home environment, are indicated. A need for comprehensible visualisation of the intermediate results of measurement data processing is justified. The capability of an impulse-radar-based system to provide information of crucial importance to medical or healthcare personnel is investigated. An exemplary software interface, tailored for non-technical users, is proposed, and preliminary results of impulse-radar-based monitoring of human movements are demonstrated.

  10. Energy-based adaptive focusing of waves: application to noninvasive aberration correction of ultrasonic wavefields.

    PubMed

    Herbert, Eric; Pernot, Mathieu; Montaldo, Gabriel; Fink, Mathias; Tanter, Mickael

    2009-11-01

    An aberration correction method based on the maximization of the wave intensity at the focus of an emitting array is presented. The potential of this new adaptive focusing technique is investigated for ultrasonic focusing in biological tissues. The acoustic intensity is maximized noninvasively through direct measurement or indirect estimation of the beam energy at the focus for a series of spatially coded emissions. For ultrasonic waves, the acoustic energy at the desired focus can be indirectly estimated from the local displacements induced in tissues by the ultrasonic radiation force of the beam. Based on the measurement of these displacements, this method allows precise estimation of the phase and amplitude aberrations, and consequently the correction of aberrations along the beam travel path. The proof of concept is first performed experimentally using a large therapeutic array with strong electronic phase aberrations (up to 2π). Displacements induced by the ultrasonic radiation force at the desired focus are indirectly estimated using the time shift of backscattered echoes recorded on the array. The phase estimation is deduced accurately using a direct inversion algorithm which reduces the standard deviation of the phase distribution from σ = 1.89 rad before correction to σ = 0.53 rad following correction. The corrected beam focusing quality is verified using a needle hydrophone. The peak intensity obtained through the aberrator is found to be -7.69 dB below the reference intensity obtained without any aberration. Using the phase correction, a sharp focus is restored through the aberrator with a relative peak intensity of -0.89 dB. The technique is tested experimentally using a linear transmit/receive array through a real aberrating layer. The array is used to automatically correct its beam quality, as it both generates the radiation force with coded excitations and indirectly estimates the acoustic intensity at the focus
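    The core idea, that emitting with the conjugate of the aberration phases restores coherent summation at the focus, can be sketched with a toy model. This deliberately omits the paper's coded emissions and radiation-force displacement estimation; it only shows why intensity maximization identifies the right correction:

```python
# Toy phase-aberration model: each of N array elements contributes a
# unit-amplitude field with an unknown phase error. Applying the
# conjugate phases makes all contributions add in phase, so the focal
# intensity reaches its maximum N**2.
import cmath
import random

random.seed(0)
N = 64
aberration = [random.uniform(-3.0, 3.0) for _ in range(N)]  # rad

def focal_intensity(correction):
    """|sum of element fields|^2 at the focus for a given phase law."""
    field = sum(cmath.exp(1j * (a + c))
                for a, c in zip(aberration, correction))
    return abs(field) ** 2

uncorrected = focal_intensity([0.0] * N)
corrected = focal_intensity([-a for a in aberration])  # conjugate phases
print(corrected > uncorrected)          # correction raises focal intensity
```

    In the paper, the aberration phases are of course not known in advance; they are recovered by direct inversion from intensity estimates over spatially coded emissions, which is the part this sketch leaves out.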

  11. Trends in Educational Computing: Decreasing Interest and the Changing Focus of Instruction.

    ERIC Educational Resources Information Center

    Lockheed, Marlaine

    1986-01-01

    Presents a rationale for changing from the current emphasis of precollege computer courses on BASIC programming skills to a focus on teaching applications software skills. Reviews research regarding the quality of computer literacy courses. Discusses parallel cognitive and affective consequences of programming and applications software. Promotes…

  12. Software-Defined Ultra-wideband Radio Communications: A New RF Technology for Emergency Response Applications

    SciTech Connect

    Nekoogar, F; Dowla, F

    2009-10-19

    Reliable wireless communication links for local-area (short-range) and regional (long-range) reach capabilities are crucial for emergency response to disasters. Lack of a dependable communication system can result in disruptions in situational awareness between the local responders in the field and the emergency command and control centers. To date, wireless communications systems such as cell phones and walkie-talkies have used narrowband radio frequency (RF) signaling for data communication. However, the hostile radio propagation environment caused by collapsed structures and rubble in disaster sites results in significant degradation and attenuation of narrowband RF signals, leading to frequent communication breakdowns. To address the challenges of reliable radio communication in disaster fields, we propose an approach using ultra-wideband (UWB) or wideband RF waveforms implemented on Software Defined Radio (SDR) platforms. Ultra-wideband communication has been proven by many research groups to be effective in addressing many of the limitations faced by conventional narrowband radio technologies. In addition, LLNL's radio and wireless team has shown significant success in field deployment of various UWB communications systems for harsh environments, based on LLNL's patented UWB modulation and equalization techniques. Furthermore, using a software-defined radio platform for UWB communications offers a great deal of flexibility in operational parameters and helps the radio system dynamically adapt itself to its environment for optimal performance.

  13. Software Development With Application Generators: The Naval Aviation Logistics Command Management Information System Case

    DTIC Science & Technology

    1992-09-01

    Aviation Logistics Command Management Information System (NALCOMIS) prototyping development effort, the critical success factors required to implement prototyping with application generators in other areas of DoD.

  14. Software Surrogate

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In 1994, Blackboard Technology received a NASA Phase I SBIR award entitled "A Blackboard-Based Framework for Mixed-Initiative, Crewed-Space-System Applications." This research continued in Phase II at JSC, where a generic architecture was developed in which a software surrogate serves as the operator's representative in the fast-paced realm of nearly autonomous, intelligent systems. This SBIR research effort addressed the need to support human-operator monitoring and intervention with intelligent systems such as those being developed for NASA's crewed space program.

  15. Analysis Software

    NASA Technical Reports Server (NTRS)

    1994-01-01

    General Purpose Boundary Element Solution Technology (GPBEST) software employs the boundary element method of mechanical engineering analysis, as opposed to finite element. It is, according to one of its developers, 10 times faster in data preparation and more accurate than other methods. Its use results in less expensive products because the time between design and manufacturing is shortened. A commercial derivative of a NASA-developed computer code, it is marketed by Best Corporation to solve problems in stress analysis, heat transfer, fluid analysis and yielding and cracking of solids. Other applications include designing tractor and auto parts, household appliances and acoustic analysis.

  16. Simulation Software

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Various NASA Small Business Innovation Research grants from Marshall Space Flight Center, Langley Research Center and Ames Research Center were used to develop the 'kernel' of COMCO's modeling and simulation software, the PHLEX finite element code. NASA needed it to model designs of flight vehicles; one of many customized commercial applications is UNISIM, a PHLEX-based code for analyzing underground flows in oil reservoirs for Texaco, Inc. COMCO's products simulate a computational mechanics problem, estimate the solution's error and produce the optimal hp-adapted mesh for the accuracy the user chooses. The system is also used as a research or training tool in universities and in mechanical design in industrial corporations.

  17. Software assurance standard

    NASA Technical Reports Server (NTRS)

    1992-01-01

    This standard specifies the software assurance program for the provider of software. It also delineates the assurance activities for the provider and the assurance data that are to be furnished by the provider to the acquirer. In any software development effort, the provider is the entity or individual that actually designs, develops, and implements the software product, while the acquirer is the entity or individual who specifies the requirements and accepts the resulting products. This standard specifies at a high level an overall software assurance program for software developed for and by NASA. Assurance includes the disciplines of quality assurance, quality engineering, verification and validation, nonconformance reporting and corrective action, safety assurance, and security assurance. The application of these disciplines during a software development life cycle is called software assurance. Subsequent lower-level standards will specify the specific processes within these disciplines.

  18. The Environment for Application Software Integration and Execution (EASIE), version 1.0. Volume 2: Program integration guide

    NASA Technical Reports Server (NTRS)

    Jones, Kennie H.; Randall, Donald P.; Stallcup, Scott S.; Rowell, Lawrence F.

    1988-01-01

    The Environment for Application Software Integration and Execution, EASIE, provides a methodology and a set of software utility programs to ease the task of coordinating engineering design and analysis codes. EASIE was designed to meet the needs of conceptual design engineers who face the task of integrating many stand-alone engineering analysis programs. Using EASIE, programs are integrated through a relational data base management system. In Volume 2, a SYSTEM LIBRARY PROCESSOR is used to construct a DATA DICTIONARY describing all relations defined in the data base, and a TEMPLATE LIBRARY. A TEMPLATE is a description of all subsets of relations (including conditional selection criteria and sorting specifications) to be accessed as input or output for a given application. Together, these form the SYSTEM LIBRARY, which is used to automatically produce the data base schema, FORTRAN subroutines to retrieve/store data from/to the data base, and instructions to a generic REVIEWER program providing review/modification of data for a given template. Automation of these functions eliminates much of the tedious, error-prone work required by the usual approach to data base integration.
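    The template-driven generation of accessor routines can be sketched as follows. The relation name, attributes, and the Python target are invented for illustration; EASIE itself emits FORTRAN subroutines from its SYSTEM LIBRARY:

```python
# Sketch of data-dictionary-driven code generation: a dictionary entry
# names a relation and its attributes, and a retrieval function is
# generated mechanically from it (names here are hypothetical).
DATA_DICTIONARY = {
    "wing": ["span", "area", "sweep"],   # relation -> attribute list
}

def generate_accessor(relation):
    """Emit source code for a function that retrieves one relation."""
    attrs = DATA_DICTIONARY[relation]
    body = ", ".join(f"row['{a}']" for a in attrs)
    lines = [f"def get_{relation}(db):",
             f"    row = db['{relation}']",
             f"    return ({body})"]
    return "\n".join(lines)

src = generate_accessor("wing")
namespace = {}
exec(src, namespace)                      # compile the generated accessor
db = {"wing": {"span": 30.0, "area": 15.0, "sweep": 25.0}}
print(namespace["get_wing"](db))          # (30.0, 15.0, 25.0)
```

    Generating the accessors from a single dictionary, rather than hand-writing them per application, is what removes the tedious, error-prone integration work the abstract describes.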

  19. Software design specification. Part 2: Orbital Flight Test (OFT) detailed design specification. Volume 3: Applications. Book 2: System management

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The functions performed by the systems management (SM) application software are described along with the design employed to accomplish these functions. The operational sequences (OPS) control segments and the cyclic processes they control are defined. The SM specialist function control (SPEC) segments and the display controlled 'on-demand' processes that are invoked by either an OPS or SPEC control segment as a direct result of an item entry to a display are included. Each processing element in the SM application is described including an input/output table and a structured control flow diagram. The flow through the module and other information pertinent to that process and its interfaces to other processes are included.

  20. Focused ion beam post-processing of optical fiber Fabry-Perot cavities for sensing applications.

    PubMed

    André, Ricardo M; Pevec, Simon; Becker, Martin; Dellith, Jan; Rothhardt, Manfred; Marques, Manuel B; Donlagic, Denis; Bartelt, Hartmut; Frazão, Orlando

    2014-06-02

Focused ion beam technology is combined with chemical etching of specifically designed fibers to create Fabry-Perot interferometers. Hydrofluoric acid is used to etch special fibers and create microwires with diameters of 15 μm. These microwires are then milled with a focused ion beam to create two different structures, an indented Fabry-Perot structure and a cantilever Fabry-Perot structure, which are characterized in terms of temperature. The cantilever structure is also sensitive to vibrations and is capable of measuring frequencies in the range of 1 Hz to 40 kHz.

  1. Application of mixsep software package: Performance verification of male-mixed DNA analysis.

    PubMed

    Hu, Na; Cong, Bin; Gao, Tao; Chen, Yu; Shen, Junyi; Li, Shujin; Ma, Chunling

    2015-08-01

    An experimental model of male-mixed DNA (n=297) was constructed according to the mixed DNA construction principle. This comprised the use of the Applied Biosystems (ABI) 7500 quantitative polymerase chain reaction system, with scientific validation of mixture proportion (Mx; root-mean-square error ≤ 0.02). Statistical analysis was performed on locus separation accuracy using mixsep, a DNA mixture separation R-package, and the analytical performance of mixsep was assessed by examining the data distribution pattern of different mixed gradients, short tandem repeat (STR) loci and mixed DNA types. The results showed that locus separation accuracy had a negative linear correlation with the mixed gradient (R(2)=-0.7121). With increasing mixed gradient imbalance, locus separation accuracy first increased and then decreased, with the highest value detected at a gradient of 1:3 (≥ 90%). The mixed gradient, which is the theoretical Mx, was one of the primary factors that influenced the success of mixed DNA analysis. Among the 16 STR loci detected by Identifiler®, the separation accuracy was relatively high (>88%) for loci D5S818, D8S1179 and FGA, whereas the median separation accuracy value was lowest for the D7S820 locus. STR loci with relatively large numbers of allelic drop-out (ADO; >15) were all located in the yellow and red channels, including loci D18S51, D19S433, FGA, TPOX and vWA. These five loci featured low allele peak heights, which was consistent with the low sensitivity of the ABI 3130xl Genetic Analyzer to yellow and red fluorescence. The locus separation accuracy of the mixsep package was substantially different with and without the inclusion of ADO loci; inclusion of ADO significantly reduced the analytical performance of the mixsep package, which was consistent with the lack of an ADO functional module in this software. The present study demonstrated that the mixsep software had a number of advantages and was recommended for analysis of mixed DNA. This
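The mixture-proportion validation criterion quoted above (root-mean-square error of Mx ≤ 0.02) is straightforward to compute. A small sketch with invented Mx estimates for a 1:3 mixture (theoretical Mx = 1/(1+3) = 0.25):

```python
import math

def rmse(estimated, theoretical):
    """Root-mean-square error between estimated and theoretical
    mixture proportions (Mx)."""
    return math.sqrt(
        sum((e - t) ** 2 for e, t in zip(estimated, theoretical)) / len(estimated)
    )

# Illustrative values only: four replicate Mx estimates for a 1:3 mixture.
theoretical_mx = [0.25] * 4
estimated_mx = [0.24, 0.26, 0.25, 0.23]

err = rmse(estimated_mx, theoretical_mx)
print(f"RMSE = {err:.4f}")  # well under the 0.02 acceptance threshold
```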

  2. The Application of Solution-Focused Brief Therapy in a Public School Setting.

    ERIC Educational Resources Information Center

    Williams, G. Robert

    2000-01-01

    Research has found a relationship between family variables, including the involvement of parents in their children's education, and student success. This article highlights the use of solution-focused therapy in a public school setting to address the needs of both students and families. (GCP)

  3. ASR Application in Climate Change Adaptation: The Need, Issues and Research Focus

    EPA Science Inventory

This presentation will focus on four key points: (a) Aquifer storage and recovery: a long-held practice offering a potential tool for climate change adaptation, (b) The drivers: 1) hydrological perturbations related to climate change, 2) water imbalance in both Q and V between wat...

  4. Means for the focusing and acceleration of parallel beams of charged particles. [Patent application

    DOEpatents

    Maschke, A.W.

    1980-09-23

    Apparatus for focusing beams of charged particles comprising planar arrays of electrostatic quadrupoles. The array may be assembled from a single component which comprises a support plate containing uniform rows of poles. Each pole is separated by a hole through the plate designed to pass a beam. Two such plates may be positioned with their poles intermeshed to form a plurality of quadrupoles.

  5. Solution of the lossy nonlinear Tricomi equation with application to sonic boom focusing

    NASA Astrophysics Data System (ADS)

    Salamone, Joseph A., III

Sonic boom focusing theory has been augmented with new terms that account for mean flow effects in the direction of propagation and for atmospheric absorption/dispersion due to molecular relaxation of oxygen and nitrogen. The newly derived model equation was numerically implemented in a computer code. The computer code was numerically validated using a spectral solution for nonlinear propagation of a sinusoid through a lossy homogeneous medium. An additional numerical check was performed to verify the linear diffraction component of the code calculations. The computer code was experimentally validated using measured sonic boom focusing data from the NASA-sponsored Superboom Caustic Analysis and Measurement Program (SCAMP) flight test. The computer code was in good agreement with both the numerical and experimental validation. The newly developed code was applied to examine the focusing of a NASA low-boom demonstration vehicle concept. The resulting pressure field was calculated for several supersonic climb profiles. The shaping efforts designed into the signatures were still somewhat evident despite the effects of sonic boom focusing.

  6. Global Focused Identification of Germplasm Strategy (FIGS) application on Trifolium repens L.

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Trifolium repens L. is a legume species extensively used in grass pastures. Traits such as level of cyanogenic glucosides and flower production are important in breeding productive and nutritious varieties. The Focused Identification of Germplasm Strategy (FIGS) is an approach used to screen large g...

  7. Surveillance application using pattern recognition software at the EBR-II Reactor Facility

    SciTech Connect

    Olson, D.L.

    1992-05-01

The System State Analyzer (SSA) is a software-based pattern recognition system. For the past several years this system has been used at Argonne National Laboratory's Experimental Breeder Reactor 2 (EBR-2) for detection of degradation and other abnormalities in plant systems. Currently there are two versions of the SSA in use at EBR-2. One version is used for daily surveillance and trending of the reactor delta-T and startups of the reactor. The other version is the QSSA, which is used to monitor individual systems of the reactor such as the Secondary Sodium System, Secondary Sodium Pumps, and Steam Generator. This system has been able to detect problems such as signals being affected by temperature variations due to a failing temperature controller.
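The surveillance idea, learning a signature of normal operation and flagging signals that drift away from it, can be caricatured in a few lines. This is not SSA code; the signal names, readings, and tolerance are illustrative:

```python
# Minimal pattern-recognition surveillance sketch: learn the mean of each
# sensor signal during known-normal operation, then flag signals whose
# current reading falls outside a fractional tolerance band.

def learn_state(history):
    """history: {signal_name: [normal readings...]} -> {signal_name: mean}"""
    return {name: sum(vals) / len(vals) for name, vals in history.items()}

def flag_deviations(state, current, tolerance=0.05):
    """Return the names of signals deviating more than `tolerance`
    (as a fraction) from the learned normal value."""
    return [name for name, normal in state.items()
            if abs(current[name] - normal) / abs(normal) > tolerance]

normal = learn_state({"delta_T_C": [150.0, 151.0, 149.0],
                      "pump_flow": [400.0, 402.0, 398.0]})
alarms = flag_deviations(normal, {"delta_T_C": 162.0, "pump_flow": 401.0})
print(alarms)  # delta-T has drifted ~8%; pump flow stays within the band
```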

  8. Surveillance application using pattern recognition software at the EBR-II Reactor Facility

    SciTech Connect

    Olson, D.L.

    1992-01-01

The System State Analyzer (SSA) is a software-based pattern recognition system. For the past several years this system has been used at Argonne National Laboratory's Experimental Breeder Reactor 2 (EBR-2) for detection of degradation and other abnormalities in plant systems. Currently there are two versions of the SSA in use at EBR-2. One version is used for daily surveillance and trending of the reactor delta-T and startups of the reactor. The other version is the QSSA, which is used to monitor individual systems of the reactor such as the Secondary Sodium System, Secondary Sodium Pumps, and Steam Generator. This system has been able to detect problems such as signals being affected by temperature variations due to a failing temperature controller.

  9. w4CSeq: software and web application to analyze 4C-seq data.

    PubMed

    Cai, Mingyang; Gao, Fan; Lu, Wange; Wang, Kai

    2016-11-01

    Circularized Chromosome Conformation Capture followed by deep sequencing (4C-Seq) is a powerful technique to identify genome-wide partners interacting with a pre-specified genomic locus. Here, we present a computational and statistical approach to analyze 4C-Seq data generated from both enzyme digestion and sonication fragmentation-based methods. We implemented a command line software tool and a web interface called w4CSeq, which takes in the raw 4C sequencing data (FASTQ files) as input, performs automated statistical analysis and presents results in a user-friendly manner. Besides providing users with the list of candidate interacting sites/regions, w4CSeq generates figures showing genome-wide distribution of interacting regions, and sketches the enrichment of key features such as TSSs, TTSs, CpG sites and DNA replication timing around 4C sites.
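The core counting step of 4C-seq analysis, assigning mapped reads to restriction fragments around the genome, can be sketched as follows. This illustrates the general approach, not w4CSeq's implementation; coordinates are invented:

```python
# Assign mapped read positions to restriction fragments (given as sorted
# fragment start coordinates) and count reads per fragment. Fragments with
# unusually high counts are candidate interacting regions.

import bisect

def count_reads_per_fragment(fragment_starts, read_positions):
    """fragment_starts: sorted start coordinates of restriction fragments.
    Returns the number of reads falling in each fragment."""
    counts = [0] * len(fragment_starts)
    for pos in read_positions:
        i = bisect.bisect_right(fragment_starts, pos) - 1
        if i >= 0:
            counts[i] += 1
    return counts

fragments = [0, 1000, 2500, 4000]           # fragment boundaries (illustrative)
reads = [120, 980, 1100, 2600, 2700, 2900]  # mapped read positions
print(count_reads_per_fragment(fragments, reads))  # [2, 1, 3, 0]
```

A statistical layer (as in w4CSeq) would then test these counts against a background model to call significant interactions.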

  10. Module-based Hybrid Uncertainty Quantification for Multi-physics Applications: Theory and Software

    SciTech Connect

    Tong, Charles; Chen, Xiao; Iaccarino, Gianluca; Mittal, Akshay

    2013-10-08

In this project we proposed to develop an innovative uncertainty quantification (UQ) methodology that captures the best of the two competing approaches in UQ, namely, intrusive and non-intrusive approaches. The idea is to develop the mathematics and the associated computational framework and algorithms to facilitate the use of intrusive or non-intrusive UQ methods in different modules of a multi-physics, multi-module simulation model, in a way that physics code developers for different modules are shielded (as much as possible) from the chores of accounting for the uncertainties introduced by the other modules. As a result of our research and development, we have produced a number of publications, conference presentations, and a software product.
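A non-intrusive UQ method treats a physics module as a black box and propagates input uncertainty by sampling, which is what lets a module's developers stay shielded from the uncertainties introduced elsewhere. A minimal Monte Carlo sketch with an invented single-module model:

```python
# Non-intrusive Monte Carlo UQ sketch: sample an uncertain input, run the
# unmodified (black-box) module for each sample, and estimate output
# statistics. Intrusive methods would instead rewrite the module's equations.

import random
import statistics

def physics_module(k):
    """Black-box module (illustrative): steady temperature ~ 1/k at fixed flux."""
    return 100.0 / k

random.seed(0)  # reproducible sampling
# Uncertain conductivity: Gaussian with mean 2.0 and std 0.1 (invented numbers).
samples = [physics_module(random.gauss(2.0, 0.1)) for _ in range(10_000)]
mean = statistics.mean(samples)
std = statistics.stdev(samples)
print(f"output mean ~ {mean:.2f}, std ~ {std:.2f}")
```

A hybrid scheme of the kind proposed here would let one module use such sampling while another propagates uncertainty intrusively, exchanging statistics at the module interfaces.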

  11. Open-source hardware and software and web application for gamma dose rate network operation.

    PubMed

    Luff, R; Zähringer, M; Harms, W; Bleher, M; Prommer, B; Stöhlker, U

    2014-08-01

The German Federal Office for Radiation Protection operates a network of about 1800 gamma dose rate stations as part of the national emergency preparedness plan. Each of the six network centres is capable of operating the network alone. Most of the hardware and software used have been developed in-house under an open-source license. Short development cycles and close cooperation between developers and users ensure robustness, transparency and fast maintenance procedures, thus avoiding unnecessarily complex solutions. This also reduces the overall costs of network operation. An easy-to-expand web interface has been developed to make the complete system available to other interested network operators in order to increase cooperation between different countries. The interface is also used regularly for education, e.g. during training placements supported by the International Atomic Energy Agency, in which trainees operate a local area dose rate monitoring test network.

  12. Applications of artificial intelligence to space station and automated software techniques: High level robot command language

    NASA Technical Reports Server (NTRS)

    Mckee, James W.

    1989-01-01

The objective is to develop a system that will allow a person not necessarily skilled in the art of programming robots to quickly and naturally create the necessary data and commands to enable a robot to perform a desired task. The system will use a menu-driven graphical user interface. This interface will allow the user to input data to select objects to be moved. There will be an embedded expert system to process the knowledge about objects and the robot to determine how they are to be moved. There will be automatic path planning to avoid obstacles in the work space and to create a near-optimum path. The system will contain the software to generate the required robot instructions.
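The automatic path-planning component described above can be illustrated with a breadth-first search over an occupancy grid. A production planner would use A* with path smoothing; the workspace, start, and goal below are invented:

```python
# Obstacle-avoiding path planning sketch: BFS on a grid returns a shortest
# path (in number of moves) from start to goal around '#' obstacle cells.

from collections import deque

def plan_path(grid, start, goal):
    """grid: list of equal-length strings, '#' = obstacle.
    Returns a list of (row, col) cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:                      # reconstruct path by backtracking
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] != '#' and (nr, nc) not in prev):
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None  # goal unreachable

workspace = ["....",
             ".##.",
             "....",
             "...."]
path = plan_path(workspace, (0, 0), (2, 3))
print(path)
```

The expert-system layer would sit above this, deciding which objects to move and supplying start/goal poses to the planner.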

  13. Ecologically-focused Calibration of Hydrological Models for Environmental Flow Applications

    NASA Astrophysics Data System (ADS)

    Adams, S. K.; Bledsoe, B. P.

    2015-12-01

    Hydrologic alteration resulting from watershed urbanization is a common cause of aquatic ecosystem degradation. Developing environmental flow criteria for urbanizing watersheds requires quantitative flow-ecology relationships that describe biological responses to streamflow alteration. Ideally, gaged flow data are used to develop flow-ecology relationships; however, biological monitoring sites are frequently ungaged. For these ungaged locations, hydrologic models must be used to predict streamflow characteristics through calibration and testing at gaged sites, followed by extrapolation to ungaged sites. Physically-based modeling of rainfall-runoff response has frequently utilized "best overall fit" calibration criteria, such as the Nash-Sutcliffe Efficiency (NSE), that do not necessarily focus on specific aspects of the flow regime relevant to biota of interest. This study investigates the utility of employing flow characteristics known a priori to influence regional biological endpoints as "ecologically-focused" calibration criteria compared to traditional, "best overall fit" criteria. For this study, 19 continuous HEC-HMS 4.0 models were created in coastal southern California and calibrated to hourly USGS streamflow gages with nearby biological monitoring sites using one "best overall fit" and three "ecologically-focused" criteria: NSE, Richards-Baker Flashiness Index (RBI), percent of time when the flow is < 1 cfs (%<1), and a Combined Calibration (RBI and %<1). Calibrated models were compared using calibration accuracy, environmental flow metric reproducibility, and the strength of flow-ecology relationships. Results indicate that "ecologically-focused" criteria can be calibrated with high accuracy and may provide stronger flow-ecology relationships than "best overall fit" criteria, especially when multiple "ecologically-focused" criteria are used in concert, despite inabilities to accurately reproduce additional types of ecological flow metrics to which the
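Two of the calibration criteria named above, the Nash-Sutcliffe Efficiency and the Richards-Baker Flashiness Index, can be computed directly from flow series. A sketch with invented observed and simulated flows:

```python
# NSE compares simulated to observed flows (1 = perfect fit; <= 0 means the
# simulation is no better than predicting the observed mean). RBI measures
# hydrograph flashiness: total absolute flow change divided by total flow.

def nse(obs, sim):
    """Nash-Sutcliffe Efficiency of simulated flows against observations."""
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

def rbi(flows):
    """Richards-Baker Flashiness Index of a flow series."""
    changes = sum(abs(flows[i] - flows[i - 1]) for i in range(1, len(flows)))
    return changes / sum(flows)

obs = [1.0, 5.0, 3.0, 2.0, 1.5]   # observed flows (illustrative units)
sim = [1.2, 4.5, 3.2, 2.1, 1.4]   # simulated flows
print(f"NSE = {nse(obs, sim):.3f}, RBI(obs) = {rbi(obs):.3f}")
```

An "ecologically-focused" calibration would adjust model parameters to match RBI (and the %&lt;1 cfs duration) rather than, or in addition to, maximizing NSE.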

  14. Ion focusing

    DOEpatents

    Cooks, Robert Graham; Baird, Zane; Peng, Wen-Ping

    2017-01-17

    The invention generally relates to apparatuses for focusing ions at or above ambient pressure and methods of use thereof. In certain embodiments, the invention provides an apparatus for focusing ions that includes an electrode having a cavity, at least one inlet within the electrode configured to operatively couple with an ionization source, such that discharge generated by the ionization source is injected into the cavity of the electrode, and an outlet. The cavity in the electrode is shaped such that upon application of voltage to the electrode, ions within the cavity are focused and directed to the outlet, which is positioned such that a proximal end of the outlet receives the focused ions and a distal end of the outlet is open to ambient pressure.

  15. Ion focusing

    SciTech Connect

    Cooks, Robert Graham; Baird, Zane; Peng, Wen-Ping

    2015-11-10

    The invention generally relates to apparatuses for focusing ions at or above ambient pressure and methods of use thereof. In certain embodiments, the invention provides an apparatus for focusing ions that includes an electrode having a cavity, at least one inlet within the electrode configured to operatively couple with an ionization source, such that discharge generated by the ionization source is injected into the cavity of the electrode, and an outlet. The cavity in the electrode is shaped such that upon application of voltage to the electrode, ions within the cavity are focused and directed to the outlet, which is positioned such that a proximal end of the outlet receives the focused ions and a distal end of the outlet is open to ambient pressure.

  16. The Life Cycle Application of Intelligent Software Modeling for the First Materials Science Research Rack

    NASA Technical Reports Server (NTRS)

    Rice, Amanda; Parris, Frank; Nerren, Philip

    2000-01-01

Marshall Space Flight Center (MSFC) has been funding development of intelligent software models to benefit payload ground operations for nearly a decade. Experience gained from simulator development and real-time monitoring and control is being applied to engineering design, testing, and operation of the First Materials Science Research Rack (MSRR-1). MSRR-1 is the first rack in a suite of three racks comprising the Materials Science Research Facility (MSRF), which will operate on the International Space Station (ISS). The MSRF will accommodate advanced microgravity investigations in areas such as solidification of metals and alloys, thermo-physical properties of polymers, crystal growth studies of semiconductor materials, and research in ceramics and glasses. The MSRR-1 is a joint venture between NASA and the European Space Agency (ESA) to study the behavior of different materials during high-temperature processing in a low-gravity environment. The planned MSRR-1 mission duration is five (5) years on-orbit and the total design life is ten (10) years. The MSRR-1 launch is scheduled on the third Utilization Flight (UF-3) to the ISS, currently planned for February 2003. The objective of MSRR-1 is to provide an early capability on the ISS to conduct materials science, materials technology, and space product research investigations in microgravity. It will provide a modular, multi-user facility for microgravity research in materials crystal growth and solidification. An intelligent software model of MSRR-1 is under development and will serve multiple purposes to support the engineering analysis, testing, training, and operational phases of the MSRR-1 life cycle. The G2 real-time expert system software environment developed by Gensym Corporation was selected as the intelligent system shell for this development work, based on past experience and the effectiveness of the programming environment. Our approach of multiple uses of the simulation model and

  17. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 15

    NASA Technical Reports Server (NTRS)

    1997-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  18. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 14

    NASA Technical Reports Server (NTRS)

    1996-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  19. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 13

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  20. Roadmap for Peridynamic Software Implementation

    SciTech Connect

    Littlewood, David John

    2015-10-01

The application of peridynamics for engineering analysis requires an efficient and robust software implementation. Key elements include processing of the discretization, the proximity search for identification of pairwise interactions, evaluation of the constitutive model, application of a bond-damage law, and contact modeling. Additional requirements may arise from the choice of time integration scheme, for example estimation of the maximum stable time step for explicit schemes, and construction of the tangent stiffness matrix for many implicit approaches. This report summarizes progress to date on the software implementation of the peridynamic theory of solid mechanics. Discussion is focused on parallel implementation of the meshfree discretization scheme of Silling and Askari [33] in three dimensions, although much of the discussion applies to computational peridynamics in general.
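The proximity search that identifies pairwise interactions (bonds) can be sketched with a brute-force neighbor search: any two nodes closer than the horizon interact. Production peridynamics codes use cell lists or k-d trees for scalability; the node coordinates below are illustrative:

```python
# Build the bond list of a peridynamic discretization: every pair of nodes
# within the horizon delta of each other forms a bond. Brute-force O(n^2)
# version for clarity.

import math

def build_bonds(nodes, horizon):
    """nodes: list of (x, y, z) points. Returns pairs (i, j) with i < j
    and distance(node_i, node_j) <= horizon."""
    bonds = []
    for i in range(len(nodes)):
        for j in range(i + 1, len(nodes)):
            if math.dist(nodes[i], nodes[j]) <= horizon:
                bonds.append((i, j))
    return bonds

# Four nodes on a unit-spaced line; horizon of 1.5 grid spacings links
# nearest neighbors only.
nodes = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0), (3.0, 0.0, 0.0)]
print(build_bonds(nodes, 1.5))  # [(0, 1), (1, 2), (2, 3)]
```

Bond-damage laws then operate on this list, marking bonds broken when their stretch exceeds a critical value.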

  1. On various metrics used for validation of predictive QSAR models with applications in virtual screening and focused library design.

    PubMed

    Roy, Kunal; Mitra, Indrani

    2011-07-01

Quantitative structure-activity relationships (QSARs) have important applications in drug discovery research, environmental fate modeling, property prediction, etc. Validation has been recognized as a very important step in QSAR model development. Because one of the important objectives of QSAR modeling is to predict the activity/property/toxicity of new chemicals falling within the domain of applicability of the developed models, and because QSARs are used for regulatory decisions, checking the reliability of the models and the confidence of their predictions is a very important aspect, which can be judged during the validation process. One prime application of a statistically significant QSAR model is virtual screening for molecules with improved potency based on the pharmacophoric features and the descriptors appearing in the QSAR model. Validated QSAR models may also be utilized for the design of focused libraries, which may subsequently be screened for the selection of hits. The present review focuses on various metrics used for validation of predictive QSAR models, together with an overview of the application of QSAR models in the fields of virtual screening and focused library design for diverse series of compounds, with citation of some recent examples.
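One widely used external-validation metric of this kind is the predictive R² (often written r²_pred or Q²_F1): one minus the predicted residual sum of squares over the test set, divided by the sum of squares of test observations about the training-set mean. A sketch with invented activity values:

```python
# Predictive R^2 for external validation of a QSAR model: values near 1 mean
# the model predicts test-set activities much better than simply guessing the
# training-set mean activity.

def r2_pred(y_test_obs, y_test_pred, y_train_mean):
    """1 - PRESS / SS_total, with SS_total about the training-set mean."""
    press = sum((o - p) ** 2 for o, p in zip(y_test_obs, y_test_pred))
    ss_total = sum((o - y_train_mean) ** 2 for o in y_test_obs)
    return 1.0 - press / ss_total

# Illustrative test-set activities (e.g., pIC50) and model predictions:
y_obs = [5.2, 6.1, 4.8, 7.0]
y_pred = [5.0, 6.3, 5.0, 6.7]
print(f"r2_pred = {r2_pred(y_obs, y_pred, y_train_mean=5.5):.3f}")
```

Complementary metrics reviewed in the QSAR literature (q² from leave-one-out cross-validation, the Golbraikh-Tropsha criteria, Roy's r²_m) probe different failure modes, which is why no single value suffices.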

  2. Microfabrication and Test of a Three-Dimensional Polymer Hydro-Focusing Unit for Flow Cytometry Applications

    NASA Technical Reports Server (NTRS)

    Yang, Ren; Feedback, Daniel L.; Wang, Wanjun

    2004-01-01

    This paper details a novel three-dimensional (3D) hydro-focusing micro cell sorter for micro flow cytometry applications. The unit was micro-fabricated by means of SU-8 3D lithography. The 3D microstructure for coaxial sheathing was designed, micro-fabricated, and tested. Three-dimensional hydrofocusing capability was demonstrated with an experiment to sort labeled tanned sheep erythrocytes (red blood cells). This polymer hydro-focusing microstructure is easily micro-fabricated and integrated with other polymer microfluidic structures.

  3. Effects of Varying Interactive Strategies Provided by Computer-Based Tutorials for a Software Application Program.

    ERIC Educational Resources Information Center

    Tiemann, Philip W.; Markle, Susan M.

    1990-01-01

    Discussion of interaction in computer-based tutorials (CBT) focuses on a study that compared the performance of adult learners from training with three CBTs that varied the level of interactivity. The degrees of learner control, system control, and domain control are discussed, and the Lotus spreadsheet tutorials used are described. (24…

  4. Software Product Lines Essentials

    DTIC Science & Technology

    2008-07-01

Topics include process improvement, technology innovation, and reuse. Few systems are unique; most… Early reuse focus was small-grained, opportunistic, and technology-driven, and results did not meet business goals. Example product lines include print servers, storage servers, network camera and scanner servers; Bold Stroke avionics; and customized solutions for transportation industries (E-COM Technology).

  5. Application of symbolic and algebraic manipulation software in solving applied mechanics problems

    NASA Technical Reports Server (NTRS)

    Tsai, Wen-Lang; Kikuchi, Noboru

    1993-01-01

As its name implies, symbolic and algebraic manipulation is an operational tool which not only can retain symbols throughout computations but also can express results in terms of symbols. This report starts with a history of symbolic and algebraic manipulators and a review of the literature. With the help of selected examples, the capabilities of symbolic and algebraic manipulators are demonstrated. Their applications to problems of applied mechanics are then presented: the application of automatic formulation to applied mechanics problems, application to a materially nonlinear problem (rigid-plastic ring compression) by the finite element method (FEM), and application to plate problems by FEM. The advantages and difficulties, contributions, education, and perspectives of symbolic and algebraic manipulation are discussed. It is well known that there exist some fundamental difficulties in symbolic and algebraic manipulation, such as internal swelling and mathematical limitation. A remedy for these difficulties is proposed, and the three applications mentioned are solved successfully. For example, the closed-form solution of the stiffness matrix of the four-node isoparametric quadrilateral element for the 2-D elasticity problem was not available before; due to the work presented, its automatic construction becomes feasible. In addition, a newly found advantage of symbolic and algebraic manipulation is believed to be crucial in improving the efficiency of program execution in the future. This will substantially shorten the response time of a system, which is very significant for certain systems, such as missile and high-speed aircraft systems, in which time plays an important role.

  6. Applications of the Lithium Focused Ion Beam: Nanoscale Electrochemistry and Microdisk Mode Imaging

    NASA Astrophysics Data System (ADS)

McGehee, William; Takeuchi, Saya; Michels, Thomas; Oleshko, Vladimir; Aksyuk, Vladimir; Soles, Christopher; McClelland, Jabez; Center for Nanoscale Science and Technology at NIST Collaboration; Materials Measurement Laboratory at NIST Collaboration

    2016-05-01

The NIST-developed lithium Focused-Ion-Beam (LiFIB) system creates a low-energy, picoampere-scale ion beam from a photoionized gas of laser-cooled atoms. The ion beam can be focused to a <30 nm spot and scanned across a sample. This enables imaging through collection of ion-induced secondary electrons (similar to SEM) as well as the ability to selectively deposit lithium ions into nanoscale volumes in a material. We exploit this second ability of the LiFIB to selectively "titrate" lithium ions as a means of probing the optical modes in microdisk resonators as well as for exploring nanoscale Li-ion electrochemistry in battery-relevant materials. We present an overview of both measurements, including imaging of the optical mode in a silicon microdisk and a comparison of FIB and electrochemical lithiation of tin.

  7. High energy focused ion beam technology and applications at the Louisiana Accelerator Center

    NASA Astrophysics Data System (ADS)

    Glass, G. A.; Dymnikov, A. D.; Rout, B.; Zachry, D. P.

    2007-07-01

The high energy focused ion beam (HEFIB) system at the Louisiana Accelerator Center (LAC) of the University of Louisiana at Lafayette, Lafayette, USA, is constructed on one of the beamlines of a National Electrostatics Corporation 1.7 MV 5SDH-2 tandem accelerator. The HEFIB system has several components, including a versatile magnetic quadrupole sextuplet focusing system, termed the Russian magnetic sextuplet (RMS), which has the same demagnifications, focal lengths, and focal-point positions in the xz and yz planes as the Russian quadruplet; a one-piece concrete supporting base; and an integrated endstation with air isolation. A review of recent microlithography and HEFIB system developments at LAC is presented, as well as new results using heavy ion (HI) beam lithography on crystalline silicon.

  8. Potential medical applications of the plasma focus in the radioisotope production for PET imaging

    NASA Astrophysics Data System (ADS)

    Roshan, M. V.; Razaghi, S.; Asghari, F.; Rawat, R. S.; Springham, S. V.; Lee, P.; Lee, S.; Tan, T. L.

    2014-06-01

Devices other than accelerators are of interest for generating the high-energy particles needed to induce nuclear reactions and produce radioisotopes for positron emission tomography (PET). Experimental data from plasma focus (PF) devices are studied and an activity scaling law for External Solid Target (EST) activation is established. Based on the scaling law and on techniques to enhance radioisotope production, the feasibility of generating the activity required for PET imaging is studied.

  9. Classification software technique assessment

    NASA Technical Reports Server (NTRS)

    Jayroe, R. R., Jr.; Atkinson, R.; Dasarathy, B. V.; Lybanon, M.; Ramapryian, H. K.

    1976-01-01

    A catalog of software options is presented for the use of local user communities to obtain software for analyzing remotely sensed multispectral imagery. The resources required to utilize a particular software program are described. Descriptions of how a particular program analyzes data and the performance of that program for an application and data set provided by the user are shown. An effort is made to establish a statistical performance base for various software programs with regard to different data sets and analysis applications, to determine the status of the state-of-the-art.

  10. Conical octopole ion guide: Design, focusing, and its application to the deposition of low energetic clusters

    SciTech Connect

    Roettgen, Martin A.; Judai, Ken; Antonietti, Jean-Marie; Heiz, Ueli; Rauschenbach, Stephan; Kern, Klaus

    2006-01-15

    A design of a radio-frequency (rf) octopole ion guide with truncated conical rods arranged in a conical geometry is presented. The performance is tested in a cluster deposition apparatus used for the soft-landing of size-selected clusters on well-characterized substrates used as a model system in heterogeneous catalysis in ultrahigh vacuum. This device allows us to focus 500 pA of a mass-selected Ni{sub 20}{sup +} cluster ion beam from 9 mm down to a spot size of 2 mm in diameter. The transmittance is 70%{+-}5% at a rf voltage of 420 V{sub pp} applied over an amateur radio transceiver with an interposed homemade amplifier-transformer circuit. An increase of the cluster density by a factor of 15 has been achieved. Three ion trajectories are simulated by using SIMION6, which are relevant for this focusing device: transmitted, reflected, and absorbed. The observed effects in the simulations can be successfully explained by the adiabatic approximation. The focusing behavior of the conical octopole lens is demonstrated by experiment and simulations to be a very useful technique for increasing molecule or cluster densities on a substrate and thus reducing deposition time.

  11. Design of anisotropic focusing metasurface and its application for high-gain lens antenna

    NASA Astrophysics Data System (ADS)

    Guo, Wenlong; Wang, Guangming; Li, Haipeng; Li, Tangjing; Ge, Qichao; Zhuang, Yaqiang

    2017-03-01

    In this paper, we propose an anisotropic focusing metasurface that focuses orthogonally polarized waves in refraction and reflection modes, respectively. By employing four layers of metallic patches spaced by three dielectric layers, an anisotropic phase element is designed that transmits x-polarized waves but reflects y-polarized beams efficiently. Composed of 21 × 21 cells with a size of 105 × 105 mm2, a focusing metasurface operating at 15 GHz is designed with the same focal length of 30 mm for x- and y-polarized waves. By setting a patch antenna at the focal point, the metasurface sample is employed to enhance the gain of the radiation source. For verification, the metasurface sample was fabricated and measured. The antenna performance, in terms of realized boresight gain and operating bandwidth under x- and y-polarized wave illumination, is presented. Results show that the 1 dB gain bandwidths are 14.7 to 15.3 GHz and 14.7 to 15.2 GHz, respectively, and the gains are enhanced by 14.1 dB and 15.1 dB in refraction and reflection modes when the metasurface is illuminated by x- and y-polarized spherical waves. The proposed anisotropic metasurface may offer an alternative for designing anisotropic planar lenses or high-gain antennas.

  12. Focused tandem shock waves in water and their potential application in cancer treatment

    NASA Astrophysics Data System (ADS)

    Lukes, P.; Sunka, P.; Hoffer, P.; Stelmashuk, V.; Pouckova, P.; Zadinova, M.; Zeman, J.; Dibdiak, L.; Kolarova, H.; Tomankova, K.; Binder, S.; Benes, J.

    2014-01-01

    A generator of two focused successive (tandem) shock waves (FTSW) in water, produced by underwater multichannel electrical discharges at two composite electrodes with a time delay of 10 μs between the first and second shock waves, was developed. It produces at the focus a strong shock wave with a peak positive pressure of up to 80 MPa, followed by a tensile wave with a peak negative pressure of up to MPa, thus generating a large amount of cavitation at the focus. Biological effects of FTSW were demonstrated in vitro on hemolysis of erythrocytes and on cell viability of human acute lymphoblastic leukemia cells, as well as on tumor growth delay in ex vivo and in vivo experiments performed with B16 melanoma, T-lymphoma, and R5-28 sarcoma cell lines. It was demonstrated in vivo that FTSW can enhance the antitumor effects of chemotherapeutic drugs such as cisplatin, most likely due to increased permeability of the cancer cell membrane induced by FTSW. Synergistic cytotoxicity of FTSW with the sonosensitive porphyrin-based drug Photosan on tumor growth was observed, possibly due to the cavitation-induced sonodynamic effect of FTSW.

  13. Application of the focused ion beam technique in aerosol science: detailed investigation of selected, airborne particles.

    PubMed

    Kaegi, R; Gasser, Ph

    2006-11-01

    The focused ion beam technique was used to fabricate transmission electron microscope lamellas of selected, micrometre-sized airborne particles. Particles were sampled from ambient air on Nuclepore polycarbonate filters and analysed with an environmental scanning electron microscope. A large number of particles between 0.6 and 10 microm in diameter (projected optical equivalent diameter) were detected and analysed using computer-controlled scanning electron microscopy. From the resulting dataset, where the chemistry, morphology and position of each individual particle are stored, two particles were selected for a more detailed investigation. For that purpose, the particle-loaded filter was transferred from the environmental scanning electron microscope to the focused ion beam, where lamellas of the selected particles were fabricated. The definition of a custom coordinate system enabled the relocation of the particles after the transfer. The lamellas were finally analysed with an analytical transmission electron microscope. Internal structure and elemental distribution maps of the interior of the particles provided additional information about the particles, which helped to assign the particles to their sources. The combination of computer-controlled scanning electron microscopy, focused ion beam and transmission electron microscopy offers new possibilities for characterizing airborne particles in great detail, eventually enabling a detailed source apportionment of specific particles. The particle of interest can be selected from a large dataset (e.g. based on chemistry and/or morphology) and then investigated in more detail in the transmission electron microscope.

  14. Buyer's Guide to Communications Software.

    ERIC Educational Resources Information Center

    Powell, David B.

    1984-01-01

    In order to help users make informed decisions when buying communications software, this article suggests that buyers consider communications software compatibility; software protocols for data transmission; software products offering compatibility with Telenet, Prestel, and Ethernet/XTEN; specific intended applications; and specialized features…

  15. Application of Ion Mobility Spectrometry (IMS) in forensic chemistry and toxicology with focus on biological matrices

    NASA Technical Reports Server (NTRS)

    Bernhard, Werner; Keller, Thomas; Regenscheit, Priska

    1995-01-01

    The IMS (Ion Mobility Spectrometry) instrument 'Ionscan' takes advantage of the fact that trace quantities of illicit drugs are adsorbed on dust particles on clothes, in cars and on other items of evidence. The dust particles are collected on a membrane filter by a special attachment on a vacuum cleaner. The sample is then directly inserted into the spectrometer and can be analyzed immediately. We show casework applications of a forensic chemistry and toxicology laboratory. One new application of IMS in forensic chemistry is the detection of psilocybin in dried mushrooms without any further sample preparation.

  16. DigitalVHI--a freeware open-source software application to capture the Voice Handicap Index and other questionnaire data in various languages.

    PubMed

    Herbst, Christian T; Oh, Jinook; Vydrová, Jitka; Švec, Jan G

    2015-07-01

    In this short report we introduce DigitalVHI, a free open-source software application for obtaining Voice Handicap Index (VHI) and other questionnaire data, which can be put on a computer in clinics and used in clinical practice. The software can simplify performing clinical studies since it makes the VHI scores directly available for analysis in a digital form. It can be downloaded from http://www.christian-herbst.org/DigitalVHI/.

  17. Value Focused Thinking Applications to Supervised Pattern Classification With Extensions to Hyperspectral Anomaly Detection Algorithms

    DTIC Science & Technology

    2015-03-26

    VALUE FOCUSED THINKING APPLICATIONS TO SUPERVISED PATTERN CLASSIFICATION WITH EXTENSIONS TO HYPERSPECTRAL ANOMALY DETECTION ALGORITHMS. Thesis, March 2015. David E. Scanland, Captain, USAF. AFIT-ENS-MS-15-M-121. Department of the Air Force.

  18. New High Resolution Scanning Ion Microprobe and Focused Ion Beam Applications.

    DTIC Science & Technology

    1984-08-31

    toward a number of practical applications. Somewhat paradoxically, the structure of intercalated graphite is better known at the atomic level, through x... Timothy R. Fox, Ph.D. Thesis, The University of Chicago, December 1980. 2. Energy Loss of Diproton Clusters in Carbon Below the Fermi Velocity. Kin

  19. A software architecture for multidisciplinary applications: Integrating task and data parallelism

    NASA Technical Reports Server (NTRS)

    Chapman, Barbara; Mehrotra, Piyush; Vanrosendale, John; Zima, Hans

    1994-01-01

    Data parallel languages such as Vienna Fortran and HPF can be successfully applied to a wide range of numerical applications. However, many advanced scientific and engineering applications are of a multidisciplinary and heterogeneous nature and thus do not fit well into the data parallel paradigm. In this paper we present new Fortran 90 language extensions to fill this gap. Tasks can be spawned as asynchronous activities in a homogeneous or heterogeneous computing environment; they interact by sharing access to Shared Data Abstractions (SDAs). SDAs are an extension of Fortran 90 modules, representing a pool of common data, together with a set of methods for controlled access to these data and a mechanism for providing persistent storage. Our language supports the integration of data and task parallelism as well as nested task parallelism and thus can be used to express multidisciplinary applications in a natural and efficient way.
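
    The SDA concept described above can be illustrated compactly. The following is a minimal Python sketch, not the paper's Fortran 90 extensions: asynchronous tasks are spawned as threads, and a lock-guarded class stands in for an SDA's pool of common data with controlled-access methods. All names are illustrative.

```python
import threading

class SharedDataAbstraction:
    """A pool of common data plus methods for controlled access
    (loosely analogous to the paper's SDA concept; names are illustrative)."""
    def __init__(self):
        self._lock = threading.Lock()
        self._results = []

    def put(self, value):
        with self._lock:          # controlled, serialized access
            self._results.append(value)

    def snapshot(self):
        with self._lock:
            return list(self._results)

def spawn_task(sda, task_id):
    # each asynchronous task computes independently, then shares its result
    t = threading.Thread(target=lambda: sda.put(task_id * task_id))
    t.start()
    return t

sda = SharedDataAbstraction()
threads = [spawn_task(sda, i) for i in range(8)]
for t in threads:
    t.join()
print(sorted(sda.snapshot()))   # squares of 0..7, in sorted order
```

    The lock serializes every access to the shared pool, which is the essence of the "controlled access" that SDA methods provide; persistence and nested task parallelism are omitted from this sketch.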

  20. Application of a Software tool for Evaluating Human Factors in Accident Sequences

    SciTech Connect

    Queral, Cesar; Exposito, Antonio; Gonzalez, Isaac

    2006-07-01

    The Probabilistic Safety Assessment (PSA) includes operator actions as elements of the set of protection performances considered during accident sequences. Nevertheless, their impact throughout a sequence is not analyzed in a dynamic way. It is therefore convenient to study their importance in the dynamics of the sequences in more detail, allowing sensitivity studies with respect to human reliability and response times. For this reason, the CSN is involved in several activities oriented toward developing a new safety analysis methodology, the Integrated Safety Assessment (ISA), which must be able to incorporate operator actions into conventional thermo-hydraulic (TH) simulations. One of them is the collaboration project between CSN, HRP and the DSE-UPM that started in 2003. In the framework of this project, a software tool has been developed to incorporate operator actions into TH simulations. As part of the ISA, this tool makes it possible to quantify human error probabilities (HEP) and to evaluate their impact on the final state of the plant. Independently, it can be used to evaluate the impact of operators' execution of procedures and guidelines on the final state of the plant, and to evaluate the allowable response times for manual operator actions. The results obtained in the first pilot case are included in this paper. (authors)

  1. Visual Recognition Software for Binary Classification and Its Application to Spruce Pollen Identification.

    PubMed

    Tcheng, David K; Nayak, Ashwin K; Fowlkes, Charless C; Punyasena, Surangi W

    2016-01-01

    Discriminating between black and white spruce (Picea mariana and Picea glauca) is a difficult palynological classification problem that, if solved, would provide valuable data for paleoclimate reconstructions. We developed an open-source visual recognition software (ARLO, Automated Recognition with Layered Optimization) capable of differentiating between these two species at an accuracy on par with human experts. The system applies pattern recognition and machine learning to the analysis of pollen images and discovers general-purpose image features, defined by simple features of lines and grids of pixels taken at different dimensions, size, spacing, and resolution. It adapts to a given problem by searching for the most effective combination of both feature representation and learning strategy. This results in a powerful and flexible framework for image classification. We worked with images acquired using an automated slide scanner. We first applied a hash-based "pollen spotting" model to segment pollen grains from the slide background. We next tested ARLO's ability to reconstruct black to white spruce pollen ratios using artificially constructed slides of known ratios. We then developed a more scalable hash-based method of image analysis that was able to distinguish between the pollen of black and white spruce with an estimated accuracy of 83.61%, comparable to human expert performance. Our results demonstrate the capability of machine learning systems to automate challenging taxonomic classifications in pollen analysis, and our success with simple image representations suggests that our approach is generalizable to many other object recognition problems.
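
    The feature-search idea described above (simple features over grids of pixels at several resolutions, combined with a learning strategy) can be sketched in a few lines. This is a toy reconstruction, not ARLO itself: synthetic images are pooled into coarse grids at two resolutions and classified with a nearest-centroid rule; all data and parameters are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def grid_features(img, sizes=(2, 4)):
    """Coarse pixel-grid features at several resolutions
    (a simplified stand-in for ARLO's learned line/grid features)."""
    feats = []
    for s in sizes:
        h, w = img.shape[0] // s, img.shape[1] // s
        pooled = img[:s * h, :s * w].reshape(s, h, s, w).mean(axis=(1, 3))
        feats.append(pooled.ravel())
    return np.concatenate(feats)

def make_class(mean_left, n):
    """Synthetic 16x16 'pollen' images; the class signal lives in the left half."""
    imgs = rng.normal(0.0, 0.3, size=(n, 16, 16))
    imgs[:, :, :8] += mean_left
    return imgs

train_a, train_b = make_class(1.0, 50), make_class(-1.0, 50)
cent_a = np.mean([grid_features(i) for i in train_a], axis=0)
cent_b = np.mean([grid_features(i) for i in train_b], axis=0)

def classify(img):
    f = grid_features(img)
    return "A" if np.linalg.norm(f - cent_a) < np.linalg.norm(f - cent_b) else "B"

held_out = [(i, "A") for i in make_class(1.0, 20)] + [(i, "B") for i in make_class(-1.0, 20)]
acc = np.mean([classify(img) == label for img, label in held_out])
print(f"accuracy: {acc:.2f}")
```

    ARLO additionally searches over feature representations and learning strategies; this sketch fixes both, showing only the grid-pooling representation the abstract mentions.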

  2. A 4th-order reconfigurable analog baseband filter for software-defined radio applications

    NASA Astrophysics Data System (ADS)

    Weiwei, Wang; Xuegui, Chang; Xiao, Wang; Kefeng, Han; Xi, Tan; Na, Yan; Hao, Min

    2011-04-01

    This paper presents a 4th-order reconfigurable analog baseband filter for software-defined radios. The design exploits an active-RC low pass filter (LPF) structure with digital assistance, which allows flexible tuning of filter characteristics such as cut-off frequency, selectivity, type, noise, gain and power. A novel reconfigurable operational amplifier is proposed to realize the optimization of noise and scalability of power dissipation. The chip was fabricated in an SMIC 0.13 μm CMOS process. The main filter and frequency calibration circuit occupy 1.8 × 0.8 mm2 and 0.48 × 0.25 mm2 areas, respectively. The measurement results indicate that the filter provides Butterworth and Chebyshev responses with a wide frequency tuning range from 280 kHz to 15 MHz and a gain range from 0 to 18 dB. An IIP3 of 29 dBm is achieved under a 1.2 V power supply. The input-referred noise density varies from 41 to 133 according to the given standard, and the power consumptions are 5.46 mW for low band (from 280 kHz to 3 MHz) and 8.74 mW for high band (from 3 to 15 MHz) mode.
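
    The Butterworth response mentioned above has a closed-form magnitude, which makes the expected filter behavior easy to check numerically. The sketch below (plain NumPy, illustrative only, not the chip's circuit) evaluates an ideal 4th-order Butterworth low-pass at the cut-off extremes reported for the filter: the response is about -3 dB at the cut-off and rolls off about -80 dB one decade above it, as expected for a 4th-order filter.

```python
import numpy as np

def butterworth_mag_db(f, fc, order=4):
    """Magnitude response (dB) of an ideal n-th order Butterworth low-pass:
    |H(f)| = 1 / sqrt(1 + (f/fc)^(2n))."""
    return -10.0 * np.log10(1.0 + (f / fc) ** (2 * order))

for fc in (280e3, 3e6, 15e6):   # cut-off range reported for the chip
    at_fc = butterworth_mag_db(fc, fc)
    decade_up = butterworth_mag_db(10 * fc, fc)
    print(f"fc = {fc / 1e6:5.2f} MHz: {at_fc:6.2f} dB at fc, {decade_up:7.2f} dB one decade above")
```

    The -3 dB point and the -20n dB/decade roll-off are properties of the ideal response; the fabricated filter will deviate from them by its tolerances.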

  3. Visual Recognition Software for Binary Classification and Its Application to Spruce Pollen Identification

    PubMed Central

    Tcheng, David K.; Nayak, Ashwin K.; Fowlkes, Charless C.; Punyasena, Surangi W.

    2016-01-01

    Discriminating between black and white spruce (Picea mariana and Picea glauca) is a difficult palynological classification problem that, if solved, would provide valuable data for paleoclimate reconstructions. We developed an open-source visual recognition software (ARLO, Automated Recognition with Layered Optimization) capable of differentiating between these two species at an accuracy on par with human experts. The system applies pattern recognition and machine learning to the analysis of pollen images and discovers general-purpose image features, defined by simple features of lines and grids of pixels taken at different dimensions, size, spacing, and resolution. It adapts to a given problem by searching for the most effective combination of both feature representation and learning strategy. This results in a powerful and flexible framework for image classification. We worked with images acquired using an automated slide scanner. We first applied a hash-based “pollen spotting” model to segment pollen grains from the slide background. We next tested ARLO’s ability to reconstruct black to white spruce pollen ratios using artificially constructed slides of known ratios. We then developed a more scalable hash-based method of image analysis that was able to distinguish between the pollen of black and white spruce with an estimated accuracy of 83.61%, comparable to human expert performance. Our results demonstrate the capability of machine learning systems to automate challenging taxonomic classifications in pollen analysis, and our success with simple image representations suggests that our approach is generalizable to many other object recognition problems. PMID:26867017

  4. Software architecture for a kinematically dissimilar master-slave telethesis for exploring rehabilitation applications

    NASA Astrophysics Data System (ADS)

    Wunderlich, Joseph; Chen, Shoupu; Pino, D.; Rahman, Tariq; Harwin, William S.

    1993-12-01

    A person with limited arm and hand function could benefit from technology based on teleoperation principles, particularly where the mechanism provides proprioceptive-like information to the operator, giving an idea of the forces encountered in the environment and the positions of the slave robot. A test-bed system is being prepared to evaluate the potential for adapting telemanipulator technology to the needs of people with high-level spinal cord injury. This test-bed uses a kinematically dissimilar master and slave pair and will be adaptable to a variety of disabilities. The master will be head controlled and, when combined with auxiliary functions, will provide the degrees of freedom necessary to attempt any task. In the simplest case, this mapping could be direct, with the slave amplifying the person's movements and forces. It is unrealistic, however, to expect that the operator will have the range of head movement required for accurate operation of the slave over the full workspace. Neither is it likely that the person will be able to achieve simultaneous and independent control of the 6 or more degrees of freedom required to move the slave. Thus a set of more general mappings must be available that can be chosen to relate to the intrinsic coordinates of the task. The software structure to implement the control of this master-slave system is based on two IBM PC computers linked via Ethernet.

  5. Application of recursive manipulator dynamics to hybrid software/hardware simulation

    NASA Technical Reports Server (NTRS)

    Hill, Christopher J.; Hopping, Kenneth A.; Price, Charles R.

    1989-01-01

    Computer simulations of robotic mechanisms have traditionally solved the dynamic equations of motion for an N degree of freedom manipulator by formulating an N dimensional matrix equation combining the accelerations and torques (forces) for all joints. The use of an alternative formulation that is strictly recursive is described. The dynamic solution proceeds on a joint by joint basis, so it is possible to perform inverse dynamics at arbitrary joints. The dynamic formulation is generalized with respect to both rotational and translational joints, and it is also directly extendable to branched manipulator chains. A hardware substitution test is described in which a servo drive motor was integrated with a simulated manipulator arm. The form of the dynamic equation permits calculation of acceleration given torque or vice versa. Computing torque as a function of acceleration is required for the hybrid software/hardware simulation test described. For this test, a joint servo motor is controlled in conjunction with the simulation, and the dynamic torque on the servo motor is provided by a load motor on a common driveshaft.
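
    The two directions of the dynamic equation mentioned above (torque from acceleration for the hybrid test, acceleration from torque for simulation) can be shown at a single joint. This is a one-joint illustration with assumed inertia and friction values, not the paper's recursive multi-joint formulation.

```python
# Single rotational joint: tau = J * qdd + b * qd.
# J (inertia) and b (viscous friction) are illustrative values.
J, b = 0.05, 0.01      # kg m^2, N m s/rad

def inverse_dynamics(qd, qdd):
    """Torque required to produce acceleration qdd at velocity qd
    (the direction needed to drive the hardware servo from the simulation)."""
    return J * qdd + b * qd

def forward_dynamics(qd, tau):
    """Acceleration produced by torque tau at velocity qd
    (the direction used in pure software simulation)."""
    return (tau - b * qd) / J

tau = inverse_dynamics(qd=1.0, qdd=4.0)
assert abs(forward_dynamics(1.0, tau) - 4.0) < 1e-12   # the two directions agree
print(tau)
```

    The recursive formulation in the paper applies this same invertibility joint by joint along the chain, which is what lets the hardware servo be spliced in at an arbitrary joint.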

  6. Application of configuration software WinCC in logistics automatic control system

    NASA Astrophysics Data System (ADS)

    Weng, Yifang; Duan, Zhengang; Lian, Xiaoqin; Zhang, Xiaoli; Yang, Wenying

    2006-11-01

    This paper presents the application of WinCC in a logistics automatic control system on an experimental miniature-warehouse facility. The information management system, the supervisory control system and the PLC execution part were developed. Communication between WinCC and the PLC, based on the PROFIBUS protocol, was implemented. DDE communication, with VB as server and WinCC as client, was realized. The system combines information management and supervisory control and works well. With further study, it could be applied in industrial settings.

  7. User Friendly Processing of Sediment CT Data: Software and Application in High Resolution Non-Destructive Sediment Core Data Sets

    NASA Astrophysics Data System (ADS)

    Reilly, B. T.; Stoner, J. S.; Wiest, J.; Abbott, M. B.; Francus, P.; Lapointe, F.

    2015-12-01

    Computed Tomography (CT) of sediment cores allows for high resolution images, three-dimensional volumes, and down-core profiles, generated through the attenuation of X-rays as a function of density and atomic number. When using a medical CT scanner, these quantitative data are stored in pixels on the Hounsfield scale, which is defined relative to the attenuation of X-rays in water and air at standard temperature and pressure. Here we present MATLAB-based software specifically designed for sedimentary applications, with a user-friendly graphical interface to process DICOM files and stitch overlapping CT scans. For visualization, the software allows easy generation of core slice images with grayscale and false color relative to a user-defined Hounsfield number range. For comparison to other high resolution non-destructive methods, down-core Hounsfield number profiles are extracted using a method robust to coring imperfections such as deformation, bowing, gaps, and gas expansion. We demonstrate the usefulness of this technique with lacustrine sediment cores from the Western United States and the Canadian High Arctic, including Fish Lake, Oregon, and Sawtooth Lake, Ellesmere Island. These sites represent two different depositional environments and provide examples of a variety of common coring defects and lithologies. The Hounsfield profiles and images can be used in combination with other high resolution data sets, including sediment magnetic parameters, XRF core scans and many other types of data, to provide unique insights into how lithology influences paleoenvironmental and paleomagnetic records and their interpretations.
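
    The down-core extraction step described above can be sketched in a few lines. Assuming the standard DICOM linear rescale (HU = slope × stored value + intercept, from the RescaleSlope/RescaleIntercept tags) and treating air-like voxels as gaps, a per-depth median yields a profile robust to cracks and deformation. The array values below are synthetic, and the MATLAB tool's actual masking logic may differ.

```python
import numpy as np

def to_hounsfield(stored, slope=1.0, intercept=-1024.0):
    """DICOM stored values -> Hounsfield units (HU = slope*SV + intercept).
    Slope/intercept come from the DICOM RescaleSlope/RescaleIntercept tags."""
    return slope * np.asarray(stored, dtype=float) + intercept

def downcore_profile(hu_slice, air_cutoff=-500.0):
    """Median HU per depth row, ignoring air-filled gaps and cracks,
    which makes the profile robust to coring imperfections."""
    masked = np.where(hu_slice > air_cutoff, hu_slice, np.nan)
    return np.nanmedian(masked, axis=1)

# synthetic core slice: 4 depths x 5 pixels, with an air-filled crack at depth 1
stored = np.array([
    [2024, 2030, 2026, 2024, 2028],
    [2025,   10,    5, 2027, 2025],   # two air pixels (a crack)
    [2124, 2126, 2122, 2128, 2124],
    [2224, 2220, 2226, 2222, 2224],
])
profile = downcore_profile(to_hounsfield(stored))
print(profile)
```

    The median over each depth row discards the outlying air values, so the crack at depth 1 barely perturbs the profile.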

  8. Point Focusing Thermal and Electric Applications Project. Volume 2: Workshop proceedings

    NASA Technical Reports Server (NTRS)

    Landis, K. E. (Editor)

    1979-01-01

    Point focus distributed receiver solar thermal technology for the production of electric power and of industrial process heat is discussed. Thermal power systems are described. Emphasis is on the development of cost effective systems which will accelerate the commercialization and industrialization of plants, using parabolic dish collectors. The characteristics of PFDR systems and the cost targets for major subsystems hardware are identified. Markets for this technology and their size are identified, and expected levelized bus bar energy costs as a function of yearly production level are presented. The present status of the technology development effort is discussed.

  9. Study of transient spark discharge focused at NOx generation for biomedical applications

    NASA Astrophysics Data System (ADS)

    Janda, M.; Martišovitš, V.; Hensel, K.; Machala, Z.

    2016-10-01

    The paper focuses on nitrogen oxide generation by transient spark (TS) discharge in atmospheric pressure air. The TS is a DC-driven self-pulsing discharge with short-duration (~10-100 ns) high-current pulses (>1 A) at a repetition frequency of 1-10 kHz. Thanks to the short spark duration, highly reactive non-equilibrium plasma is generated, producing ~300 ppm of NOx per input energy density of 100 J l-1. Further optimization of NO/NO2 production to improve the biomedical/antimicrobial effects is possible by modifying the electric circuit generating the TS.
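
    The quoted production figures imply an energy yield that is easy to estimate. Below is a back-of-envelope sketch: the molar volume and NO molar mass are standard textbook values, NOx is counted as NO for simplicity, and only the 300 ppm per 100 J/l figure comes from the abstract.

```python
# Back-of-envelope NOx energy yield from the figures quoted above.
MOLAR_VOLUME_L = 24.0   # l/mol of ideal gas near room temperature, ~1 atm
M_NO = 30.0             # g/mol (counting NOx as NO for simplicity)

ppm, energy_density = 300e-6, 100.0          # mole fraction, J per litre
mol_per_litre = ppm / MOLAR_VOLUME_L         # mol of NOx made in each litre
grams_per_joule = mol_per_litre * M_NO / energy_density
grams_per_kwh = grams_per_joule * 3.6e6      # 1 kWh = 3.6e6 J
print(f"~{grams_per_kwh:.1f} g NOx per kWh")
```

    This order-of-magnitude estimate is useful for comparing discharge-based NOx sources against each other on an energy basis.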

  10. Application of strongly focused pulsed electron beam for the reaction wheels balancing

    NASA Astrophysics Data System (ADS)

    Borduleva, A. O.; Bleykher, G. A.; Solovev, V. V.; Krivobokov, V. P.; Babihina, M. N.

    2016-11-01

    In this work, the possibility of removing material with a strongly focused pulsed electron beam was investigated. The optimal mode for flywheel balancing was found: a power density of 1.6 MW/cm2 and a pulse duration of 0.65 s. At these parameters the evaporation rate is 11 g/(s cm2). It is possible to vary the amount of removed material from 1 to 100 mg, which is sufficient to balance the flywheel. It was found that treatment by an electron beam does not change the material structure.

  11. Bloodless Partial Nephrectomy Through Application of Non-Focused High-Intensity Ultrasound

    NASA Astrophysics Data System (ADS)

    Murat, François-Joseph; Lafon, Cyril; Gelet, Albert; Martin, Xavier; Cathignol, Dominique

    2005-03-01

    The goal of this study was to evaluate the hemostatic ability of a new interstitial applicator, composed of a planar ultrasonic transducer with a reflector, during partial nephrectomy in a porcine model. The new applicator was designed to make effective use of all the acoustic energy to coagulate the renal tissue: placement of the reflector opposite the transducer allows all of the acoustic energy to be used for coagulation. Despite the low transmission frequency, it is possible to work at a relatively weak intensity with the aid of the reflector. As a result, intense cooling of the transducer is no longer needed. The transducer operates at a frequency of 3.78 MHz. A movable brass plate was mounted on the applicator, parallel to the transducer, to reflect energy that was not absorbed during ultrasound wave transmission. Additionally, the plate served to immobilize the kidney during the treatment. Our methodology was to expose the kidneys of 9 pigs through abdominal laparotomy. An initial series of experiments on 5 pigs allowed exposure conditions to be selected. Thermocouples were implanted in the kidneys after exposure at 15, 20, and 25 mm from the renal capsule surface. The remaining 4 pigs underwent ultrasound treatment with the applicator before a bilateral lower-pole partial nephrectomy. The treatment consisted of juxtaposing elementary lesions (made at an intensity of 26 W/cm2 for 50 seconds) circumferentially in a subhilar location. The hemostatic efficacy was evaluated just after the shots and during the 30 minutes that followed the sectioning of the kidney's lower pole. In the event of persistent bleeding, it was possible to form an additional elementary lesion opposite the insufficiently treated zone. For an exposure duration of 50 seconds at 26 W/cm2, the lesions obtained covered the total thickness of the kidney, which varied between 22 and 36 mm. The temperatures observed within the treated tissues were 62°, 59°, and 58°C at 15, 20 and 25 mm respectively from the

  12. Digital image measurement of specimen deformation based on CCD cameras and Image J software: an application to human pelvic biomechanics

    NASA Astrophysics Data System (ADS)

    Jia, Yongwei; Cheng, Liming; Yu, Guangrong; Lou, Yongjian; Yu, Yan; Chen, Bo; Ding, Zuquan

    2008-03-01

    matched the clinical results. Digital image measurement of specimen deformation based on CCD cameras and ImageJ software shows good prospects for application in biomechanical research, with the advantages of a simple optical setup, non-contact measurement, high precision, and no special requirements on the test environment.

  13. Application of radiation processing in asia and the pacific region: Focus on malaysia

    NASA Astrophysics Data System (ADS)

    Mohd Dahlan, Khairul Zaman HJ.

    1995-09-01

    Applications of radiation processing in Malaysia and other developing countries in Asia and the Pacific region are increasing as these countries move toward industrialisation. At present, there are more than 85 gamma facilities and 334 electron accelerators in Asia and the Pacific region, mainly in Japan, the Rep. of Korea and China. The main applications of interest in the region are radiation sterilisation of medical products; radiation crosslinking of wire and cable, heat-shrinkable film and tube, and foam; radiation curing of surface coatings, printing inks and adhesives; radiation vulcanisation of natural rubber latex; radiation processing of agro-industrial waste; radiation treatment of sewage sludge and municipal waste; food irradiation; tissue grafts; and radiation synthesis of bioactive materials.

  14. Volume visualization: a technical overview with a focus on medical applications.

    PubMed

    Zhang, Qi; Eagleson, Roy; Peters, Terry M

    2011-08-01

    With the increasing availability of high-resolution isotropic three- or four-dimensional medical datasets from sources such as magnetic resonance imaging, computed tomography, and ultrasound, volumetric image visualization techniques have increased in importance. Over the past two decades, a number of new algorithms and improvements have been developed for practical clinical image display. More recently, further efficiencies have been attained by designing and implementing volume-rendering algorithms on graphics processing units (GPUs). In this paper, we review volumetric image visualization pipelines, algorithms, and medical applications. We also illustrate our algorithm implementation and evaluation results, and address the advantages and drawbacks of each algorithm in terms of image quality and efficiency. Within the outlined literature review, we have integrated our research results relating to new visualization, classification, enhancement, and multimodal data dynamic rendering. Finally, we illustrate issues related to modern GPU working pipelines, and their applications in the volume visualization domain.
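
    At the core of the volume-rendering algorithms surveyed above is front-to-back alpha compositing along each viewing ray. A minimal single-ray sketch follows, with illustrative sample values rather than any specific GPU implementation; it includes the early-ray-termination optimization commonly used on GPUs.

```python
import numpy as np

def composite_ray(colors, alphas):
    """Front-to-back alpha compositing along one ray:
        C += T * a_i * c_i ;  T *= (1 - a_i)
    with early termination once the ray is effectively opaque."""
    C, T = 0.0, 1.0            # accumulated color, remaining transmittance
    for c, a in zip(colors, alphas):
        C += T * a * c
        T *= 1.0 - a
        if T < 1e-3:           # early ray termination
            break
    return C, 1.0 - T          # composited color and accumulated opacity

colors = np.array([1.0, 0.5, 0.2])   # per-sample colors along the ray
alphas = np.array([0.3, 0.5, 0.9])   # per-sample opacities from the transfer function
C, opacity = composite_ray(colors, alphas)
print(round(C, 4), round(opacity, 4))
```

    Front-to-back ordering is what makes early termination possible: once transmittance is near zero, samples further along the ray cannot change the pixel.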

  15. Automating Embedded Analysis Capabilities and Managing Software Complexity in Multiphysics Simulation, Part II: Application to Partial Differential Equations

    DOE PAGES

    Pawlowski, Roger P.; Phipps, Eric T.; Salinger, Andrew G.; ...

    2012-01-01

    A template-based generic programming approach was presented in Part I of this series of papers [Sci. Program. 20 (2012), 197–219] that separates the development effort of programming a physical model from that of computing additional quantities, such as derivatives, needed for embedded analysis algorithms. In this paper, we describe the implementation details for using the template-based generic programming approach for simulation and analysis of partial differential equations (PDEs). We detail several of the hurdles that we have encountered, and some of the software infrastructure developed to overcome them. We end with a demonstration where we present shape optimization and uncertainty quantification results for a 3D PDE application.
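
    The central idea (write the physical model once over a generic scalar type, then instantiate it with a derivative-carrying type to obtain embedded derivatives) can be mimicked in Python, with duck typing standing in for C++ templates. Below is a minimal forward-mode dual-number sketch; the residual function is an invented illustration, not one of the paper's PDEs.

```python
class Dual:
    """Minimal forward-mode AD scalar: carries a value and a derivative.
    Re-running the same model code with Dual in place of float yields the
    derivative 'for free', mirroring template-based generic programming."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__

def residual(u):
    # the "physical model", written once for any scalar type
    return u * u * u + 2.0 * u + 1.0

x = Dual(2.0, 1.0)      # seed d/du at u = 2
r = residual(x)
print(r.val, r.der)     # value u^3 + 2u + 1 and derivative 3u^2 + 2 at u = 2
```

    In the papers' C++ setting the same separation is achieved with templated element kernels instantiated on AD types (e.g. from Sacado); the Python version only illustrates the pattern.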

  16. Straightforward production of encoded microbeads by Flow Focusing: potential applications for biomolecule detection.

    PubMed

    Gañán-Calvo, A M; Martín-Banderas, L; González-Prieto, R; Rodríguez-Gil, A; Berdún-Alvarez, T; Cebolla, A; Chávez, S; Flores-Mosquera, M

    2006-10-31

    Fluorescently encoded polymeric microparticles are acquiring great importance in the development of simultaneous multianalyte screening assays. We have developed a very versatile and straightforward method for the production of dye-labeled microparticles with a very reproducible size distribution and freely chosen, discernible fluorescent properties. Our method combines Flow Focusing technology with a solvent evaporation/extraction procedure in a single step, yielding spherical, non-aggregated, non-porous particles. We have designed a multi-coloured bead array which includes the possibility of modifying the surface properties of the microparticles, offering excellent properties for covalent attachment of biomolecules such as peptides, oligonucleotides, proteins, etc. We also show the potential of the fluorescently labeled microspheres for the detection of biomolecule (peptide and oligonucleotide) interactions using flow cytometry.

  17. Application of dual-focus fluorescence correlation spectroscopy to microfluidic flow-velocity measurement.

    PubMed

    Arbour, Tyler J; Enderlein, Jörg

    2010-05-21

    Several methods exist to measure and map fluid velocities in microfluidic devices, which are vital to understanding properties on the micro- and nano-scale. Fluorescence correlation spectroscopy (FCS) is a method traditionally exploited for its ability to measure molecular diffusion coefficients. However, several reports during the past decade have shown that FCS can also be successfully used to measure precise flow rates in microfluidics with very high spatial resolution, making it a competitive alternative to other common flow-measurement methods. In 2007 we introduced a modified version of conventional FCS that overcomes many of the artifacts troubling the standard technique. Here we show how the advantages of this method, called dual-focus FCS, extend to flow measurements. To do so, we have measured the velocity flow profile along the cross-section of a square-bore microfluidic channel and compared the result to the theoretical prediction.
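
    The flow measurement described above reduces to finding the transit-time lag between two foci a known distance apart; the velocity is then distance over lag. The synthetic sketch below cross-correlates two noisy intensity traces where the second focus sees a delayed copy of the first. All parameter values are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two detection foci a known distance apart; a molecule crossing focus 1
# produces a burst that reappears in focus 2 after d / v seconds.
d = 400e-9            # focus separation (m), an assumed shift distance
v_true = 2e-3         # flow velocity (m/s), assumed
dt = 1e-6             # sampling interval (s)

n = 4000
trace1 = rng.poisson(0.5, n).astype(float)
lag_true = int(round(d / v_true / dt))          # 200 samples
trace2 = np.roll(trace1, lag_true)              # delayed copy...
trace2 += rng.poisson(0.2, n)                   # ...plus uncorrelated noise

# cross-correlate the mean-subtracted traces and find the lag of the maximum
corr = np.correlate(trace2 - trace2.mean(), trace1 - trace1.mean(), mode="full")
lags = np.arange(-n + 1, n)
tau = lags[np.argmax(corr)] * dt                # transit time between foci
print(f"v = {d / tau * 1e3:.2f} mm/s")
```

    In real dual-focus FCS the focus separation d is fixed by the optics (a Nomarski prism), which is what removes the calibration uncertainty of single-focus FCS flow measurements.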

  18. Photomechanical ablation of biological tissue induced by focused femtosecond laser and its application for acupuncture

    NASA Astrophysics Data System (ADS)

    Hosokawa, Yoichiroh; Ohta, Mika; Ito, Akihiko; Takaoka, Yutaka

    2013-03-01

    Photomechanical laser ablation induced by focused femtosecond laser irradiation was performed on the hind legs of living mice, and its clinical influence on muscle cell proliferation was investigated via histological examination and reverse transcriptase-polymerase chain reaction (RT-PCR) analysis of the expression of the gene encoding myostatin, a growth repressor in muscle satellite cells. The histological examination suggested that tissue damage due to the femtosecond laser irradiation was localized to the epidermis and dermis and was hardly induced in the muscle tissue below. On the other hand, myostatin expression in muscle tissue was suppressed after laser irradiation; because myostatin is a growth repressor in muscle satellite cells, this suppression facilitates the proliferation of muscle cells. On the basis of these results, we recognize the potential of the femtosecond laser as a tool for noncontact, high-throughput acupuncture in the treatment of muscle disease.

  19. Tomographic reconstruction of tissue properties and temperature increase for high-intensity focused ultrasound applications.

    PubMed

    Yin, Lu; Gudur, Madhu Sudhan Reddy; Hsiao, Yi-Sing; Kumon, Ronald E; Deng, Cheri X; Jiang, Huabei

    2013-10-01

    The acoustic and thermal properties as well as the temperature change within a tissue volume during high-intensity focused ultrasound ablation are critically important for treatment planning and monitoring. Described in this article is a tomographic reconstruction method used to determine the tissue properties and increase in temperature in a 3-D volume. On the basis of the iterative finite-element solution to the bioheat equation coupled with Tikhonov regularization techniques, our reconstruction algorithm solves the inverse problem of bioheat transfer and uses the time-dependent temperature measured on a tissue surface to obtain the acoustic absorption coefficient, thermal diffusivity and temperature increase within the subsurface volume. Numerical simulations were performed to validate the reconstruction algorithm. The method was initially conducted in ex vivo experiments in which time-dependent temperature on a tissue surface was measured using high-resolution, non-invasive infrared thermography.
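    The reconstruction described here alternates forward solves of the bioheat equation with regularized inverse updates. The Tikhonov-damped normal-equation step at the heart of such algorithms can be sketched as follows (a generic Gauss-Newton update on a toy problem, not the authors' finite-element code; `J`, `lam` and all problem sizes are illustrative):

```python
import numpy as np

def tikhonov_step(J, residual, lam):
    """Solve the damped normal equations (J^T J + lam I) dx = J^T r.
    The lam*I term stabilizes the otherwise ill-posed inverse problem."""
    JtJ = J.T @ J
    rhs = J.T @ residual
    return np.linalg.solve(JtJ + lam * np.eye(JtJ.shape[0]), rhs)

# Toy ill-conditioned problem: recover parameters x from data y = J x.
rng = np.random.default_rng(1)
J = rng.standard_normal((50, 20))
J[:, -1] = J[:, 0] + 1e-6 * rng.standard_normal(50)   # near-collinear columns
x_true = rng.standard_normal(20)
y = J @ x_true
dx = tikhonov_step(J, y, lam=1e-3)    # regularized estimate; fits y closely
```

The regularization parameter trades data fit against stability; in practice it is chosen by an L-curve or discrepancy criterion rather than fixed a priori.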

  20. TOMOGRAPHIC RECONSTRUCTION OF TISSUE PROPERTIES AND TEMPERATURE INCREASE FOR HIGH-INTENSITY FOCUSED ULTRASOUND APPLICATIONS

    PubMed Central

    Yin, Lu; Gudur, Madhu Sudhan Reddy; Hsiao, Yi-Sing; Kumon, Ronald E.; Deng, Cheri X.; Jiang, Huabei

    2013-01-01

    The acoustic and thermal properties as well as the temperature change within a tissue volume during high-intensity focused ultrasound ablation are critically important for treatment planning and monitoring. Described in this article is a tomographic reconstruction method used to determine the tissue properties and increase in temperature in a 3-D volume. On the basis of the iterative finite-element solution to the bioheat equation coupled with Tikhonov regularization techniques, our reconstruction algorithm solves the inverse problem of bioheat transfer and uses the time-dependent temperature measured on a tissue surface to obtain the acoustic absorption coefficient, thermal diffusivity and temperature increase within the subsurface volume. Numerical simulations were performed to validate the reconstruction algorithm. The method was initially conducted in ex vivo experiments in which time-dependent temperature on a tissue surface was measured using high-resolution, non-invasive infrared thermography. PMID:23849388

  1. Radiation protection of PFMA-1, a plasma focus for medical applications.

    PubMed

    Fabbri, A; Frignani, M; Mannucci, S; Mostacci, D; Rocchi, F; Sumini, M; Teodori, F; Angeli, E; Tartari, A; Cucchi, G

    2007-12-01

    A plasma focus is being developed for breeding short-lived radionuclides. The different radiation protection issues and concerns posed by the machine once in operation are analysed and discussed. Activation is shown to be totally negligible, and likewise neutron emission is found to pose no concern at all. The only source of radiation risk is found to rest in the radionuclides produced, 18F and 15O, generating a peak exposure of 1.114 Sv y(-1) at the distance of closest approach of 2.5 m. Shielding to protect against this hazard is calculated to be 5 cm Pb or 54 cm concrete for the operation area and 5.5 cm Pb for the transportation flask.
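    The quoted lead thickness can be sanity-checked with simple narrow-beam exponential attenuation, I/I0 = exp(-mu x). The attenuation coefficient below (~1.7 cm^-1 for lead at 511 keV, the annihilation-photon energy of 18F and 15O) is an assumed textbook value, and the paper's own calculation presumably includes buildup and geometry, so this is only an order-of-magnitude check:

```python
import math

def attenuation_factor(mu_cm, thickness_cm):
    """Narrow-beam exponential attenuation: I/I0 = exp(-mu * x)."""
    return math.exp(-mu_cm * thickness_cm)

MU_PB_511KEV = 1.7          # cm^-1, assumed value for lead at 511 keV
unshielded_sv_y = 1.114     # peak exposure quoted in the abstract
shielded = unshielded_sv_y * attenuation_factor(MU_PB_511KEV, 5.0)
# exp(-8.5) ~ 2e-4, so ~1.1 Sv/y drops to a fraction of a millisievert per year
```
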

  2. Iodine enhanced focused-ion-beam etching of silicon for photonic applications

    SciTech Connect

    Schrauwen, Jonathan; Thourhout, Dries van; Baets, Roel

    2007-11-15

    Focused-ion-beam etching of silicon enables fast and versatile fabrication of micro- and nanophotonic devices. However, large optical losses due to crystal damage and ion implantation make the devices impractical when the optical mode is confined near the etched region. These losses are shown to be reduced by the local implantation and etching of silicon waveguides with iodine gas enhancement, followed by baking at 300 °C. The excess optical loss in the silicon waveguides drops from 3500 to 1700 dB/cm when iodine gas is used, and is further reduced to 200 dB/cm after baking at 300 °C. We present elemental and chemical surface analyses supporting that this reduction is caused by the desorption of iodine from the silicon surface. Finally, we present a model to extract the absorption coefficient from the measurements.

  3. Quasi-static elastography and its application in investigation of focused ultrasound induced tissue lesions

    NASA Astrophysics Data System (ADS)

    Wang, Bin; Ling, Tao; Shen, Yong; Wang, Yan; Zheng, Hairong; Li, Faqi

    2012-10-01

    Monitoring of focused ultrasound (FUS) therapy has always been a key factor in successful treatment. Although B-mode ultrasound has long been used to monitor FUS therapy, its gray-scale changes cannot precisely reflect lesion formation inside the tissue, while MR thermometry is considered too expensive. In this study, elastography was performed using a commercial ultrasound system to investigate lesions produced by FUS irradiation in vitro. Several motion detection algorithms were implemented to improve the motion detection accuracy of the elastography, and the effects of the different algorithms on accuracy were compared. Experimental results on FUS-induced lesions in swine muscle are presented. The results indicate that lesions induced by small doses of FUS inside the tissue can be successfully detected, which is of profound clinical significance for the monitoring of FUS therapy.
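    The motion-detection step in quasi-static elastography is typically a windowed cross-correlation between pre- and post-compression echo lines. A minimal 1-D normalized cross-correlation sketch (a generic illustration of the technique, not the authors' algorithms; the signal and lag sizes are arbitrary):

```python
import numpy as np

def estimate_shift_ncc(pre, post, max_lag):
    """Integer displacement of `post` relative to `pre`, chosen as the lag
    in [-max_lag, max_lag] that maximizes normalized cross-correlation."""
    best_lag, best_ncc = 0, -np.inf
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = pre[: len(pre) - lag], post[lag:]
        else:
            a, b = pre[-lag:], post[: len(post) + lag]
        ncc = np.dot(a - a.mean(), b - b.mean()) / (
            np.std(a) * np.std(b) * len(a) + 1e-12)
        if ncc > best_ncc:
            best_lag, best_ncc = lag, ncc
    return best_lag

# Synthetic echo segment shifted by 3 samples under compression.
rng = np.random.default_rng(2)
rf_pre = rng.standard_normal(256)
rf_post = np.roll(rf_pre, 3)
shift = estimate_shift_ncc(rf_pre, rf_post, max_lag=10)
```

Real implementations refine this with sub-sample interpolation of the correlation peak and then differentiate the displacement field to obtain strain.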

  4. Consumer familiarity, perspectives and expected value of personalized medicine with a focus on applications in oncology

    PubMed Central

    Garfeld, Susan; Douglas, Michael P; MacDonald, Karen V; Marshall, Deborah A; Phillips, Kathryn A

    2015-01-01

    Aims Knowledge of consumer perspectives of personalized medicine (PM) is limited. Our study assessed consumer perspectives of PM, with a focus on oncology care, to inform industry, clinician and payer stakeholders' programs and policy. Materials & Methods A nationally representative survey of 602 US consumers ≥30 years old explored familiarity, perspectives and expected value of PM. Results Most (73%) respondents have not heard of ‘personalized medicine,’ though after understanding the term most (95%) expect PM to have a positive benefit. Consumers' willingness to pay is associated with products' impact on survival, rather than with predicting disease risk. If testing indicates consumers are not candidates for oncology therapies, most (84%) would seek a second opinion or want therapy anyway. Conclusions Understanding heterogeneity in consumer perspectives of PM can inform program and policy development. PMID:25620993

  5. The application of sparse arrays in high frequency transcranial focused ultrasound therapy: A simulation study

    PubMed Central

    Pajek, Daniel; Hynynen, Kullervo

    2013-01-01

    Purpose: Transcranial focused ultrasound is an emerging therapeutic modality that can be used to perform noninvasive neurosurgical procedures. The current clinical transcranial phased array operates at 650 kHz; however, the development of a higher-frequency array would enable more precision while reducing the risk of standing waves. The smaller wavelength and the skull's increased distortion at this frequency are problematic, though, and an order of magnitude more elements would be required to create such an array. Random sparse arrays enable steering of a therapeutic array with fewer elements, but the tradeoffs inherent in the use of sparsity in a transcranial phased array have not been systematically investigated; the objective of this simulation study is therefore to investigate the effect of sparsity on transcranial arrays at a frequency of 1.5 MHz, which provides small focal spots for precise exposure control. Methods: Transcranial sonication simulations were conducted using a multilayer Rayleigh-Sommerfeld propagation model. Element size and element population were varied and the phased array's ability to steer was assessed. Results: The focal pressures decreased proportionally as elements were removed. However, off-focus hotspots were generated if a high degree of steering was attempted with very sparse arrays. A phased array consisting of 1588 elements 3 mm in size, a 10% population, was appropriate for steering up to 4 cm in all directions. However, a higher element population would be required if near-skull sonication is desired. Conclusions: This study demonstrated that the development of a sparse, hemispherical array at 1.5 MHz could enable more precision in therapies that utilize lower intensity sonications. PMID:24320540
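    The finding that focal pressure decreases proportionally with element count can be reproduced with a bare point-source coherent sum in a homogeneous medium (no skull, no Rayleigh-Sommerfeld obliquity factor; the hemisphere radius, element count and 1.5 MHz frequency below are illustrative assumptions):

```python
import numpy as np

def focal_pressure(element_xyz, focus_xyz, k):
    """Pressure magnitude at the focus from monopole elements, each phased
    so its contribution arrives in phase: |sum exp(i(kr + phi))/r|."""
    r = np.linalg.norm(element_xyz - focus_xyz, axis=1)
    phases = -k * r                           # delays that focus the array
    field = np.sum(np.exp(1j * (k * r + phases)) / r)
    return abs(field)

rng = np.random.default_rng(3)
k = 2 * np.pi * 1.5e6 / 1500.0                # wavenumber at 1.5 MHz in water
# Hemispherical shell of 15 cm radius: full population vs a random 10% subset.
n_full = 2000
phi = rng.uniform(0, 2 * np.pi, n_full)
cos_t = rng.uniform(0, 1, n_full)             # upper hemisphere
sin_t = np.sqrt(1 - cos_t**2)
elems = 0.15 * np.column_stack([sin_t * np.cos(phi), sin_t * np.sin(phi), cos_t])
focus = np.zeros(3)
p_full = focal_pressure(elems, focus, k)
p_sparse = focal_pressure(elems[rng.choice(n_full, n_full // 10, replace=False)],
                          focus, k)
ratio = p_sparse / p_full                     # ~0.1: pressure tracks element count
```
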

  6. The application of sparse arrays in high frequency transcranial focused ultrasound therapy: A simulation study

    SciTech Connect

    Pajek, Daniel Hynynen, Kullervo

    2013-12-15

    Purpose: Transcranial focused ultrasound is an emerging therapeutic modality that can be used to perform noninvasive neurosurgical procedures. The current clinical transcranial phased array operates at 650 kHz; however, the development of a higher-frequency array would enable more precision while reducing the risk of standing waves. The smaller wavelength and the skull's increased distortion at this frequency are problematic, though, and an order of magnitude more elements would be required to create such an array. Random sparse arrays enable steering of a therapeutic array with fewer elements, but the tradeoffs inherent in the use of sparsity in a transcranial phased array have not been systematically investigated; the objective of this simulation study is therefore to investigate the effect of sparsity on transcranial arrays at a frequency of 1.5 MHz, which provides small focal spots for precise exposure control. Methods: Transcranial sonication simulations were conducted using a multilayer Rayleigh-Sommerfeld propagation model. Element size and element population were varied and the phased array's ability to steer was assessed. Results: The focal pressures decreased proportionally as elements were removed. However, off-focus hotspots were generated if a high degree of steering was attempted with very sparse arrays. A phased array consisting of 1588 elements 3 mm in size, a 10% population, was appropriate for steering up to 4 cm in all directions. However, a higher element population would be required if near-skull sonication is desired. Conclusions: This study demonstrated that the development of a sparse, hemispherical array at 1.5 MHz could enable more precision in therapies that utilize lower intensity sonications.

  7. Image 100 procedures manual development: Applications system library definition and Image 100 software definition

    NASA Technical Reports Server (NTRS)

    Guseman, L. F., Jr.; Decell, H. P., Jr.

    1975-01-01

    An outline for an Image 100 procedures manual for Earth Resources Program image analysis was developed which sets forth guidelines that provide a basis for the preparation and updating of an Image 100 Procedures Manual. The scope of the outline was limited to definition of general features of a procedures manual together with special features of an interactive system. Computer programs were identified which should be implemented as part of an applications oriented library for the system.

  8. Mid-water Software Tools and the Application to Processing and Analysis of the Latest Generation Multibeam Sonars

    NASA Astrophysics Data System (ADS)

    Gee, L.; Doucet, M.

    2010-12-01

    The latest generation of multibeam sonars now has the ability to map the water column along with the seafloor. Currently, the users of these sonars have a limited view of the mid-water data in real-time, and if they do store the data, they are restricted to replaying it only, with no ability for further analysis. The water-column data has the potential to address a number of research areas including detection of small targets (wrecks, etc.) above the seabed, mapping of fish and marine mammals, and a wide range of physical oceanographic processes. However, researchers have been required to develop their own in-house software tools before they can even begin their study of the water-column data. This paper describes the development of more general software tools for the full processing of raw sonar data (bathymetry, backscatter and water column) to yield output products suitable for visualization in a 4D time-synchronized environment. The huge water-column data volumes generated by the new sonars, combined with the variety of data formats from the different sonar manufacturers, provide a significant challenge in the design and development of tools that can be applied to the wide variety of applications. The mid-water tools developed in this project address this problem by storing the water-column data in a unified generic water column format (GWC). The sonar data are converted into the GWC by re-integrating the water-column packets with time-based navigation and attitude, such that downstream in the workflow the tools have access to all relevant data of any particular ping. Depending on the application and the resolution requirements, the conversion process also allows simple sub-sampling. Additionally, each file is indexed to enable fast non-linear lookup and extraction of any packet type or packet-type collection in the sonar file. These tools also fully exploit multi-core and hyper-threading technologies to maximize throughput.
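    The per-file index described above, enabling direct seeks to any packet type, can be sketched as a one-pass scan that records byte offsets keyed by packet type. The 5-byte header layout below is hypothetical, not the actual GWC format:

```python
import io
import struct
from collections import defaultdict

HEADER = struct.Struct("<BI")   # hypothetical: 1-byte type, 4-byte payload length

def build_index(stream):
    """Scan a packet stream once, recording the byte offset of every packet
    keyed by its type, so later reads can seek straight to any packet."""
    index = defaultdict(list)
    while True:
        offset = stream.tell()
        header = stream.read(HEADER.size)
        if len(header) < HEADER.size:
            break
        ptype, length = HEADER.unpack(header)
        index[ptype].append(offset)
        stream.seek(length, io.SEEK_CUR)    # skip payload without parsing it
    return index

# Tiny synthetic file: bathymetry (1), water-column (2) and navigation (3) packets.
buf = io.BytesIO()
for ptype, payload in [(1, b"depths"), (2, b"wc-ping-0"),
                       (3, b"nav"), (2, b"wc-ping-1")]:
    buf.write(HEADER.pack(ptype, len(payload)) + payload)
buf.seek(0)
idx = build_index(buf)
# idx[2] now holds the offsets of both water-column packets
```
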

  9. Development of a web GIS application for emissions inventory spatial allocation based on open source software tools

    NASA Astrophysics Data System (ADS)

    Gkatzoflias, Dimitrios; Mellios, Giorgos; Samaras, Zissis

    2013-03-01

    Combining emission inventory methods and geographic information systems (GIS) remains a key issue for environmental modelling and management purposes. This paper examines the development of a web GIS application as part of an emission inventory system that produces maps and files with spatially allocated emissions in a grid format. The study is not confined to the maps produced but also presents the features and capabilities of a web application that can be used by any user, even one without prior knowledge of the GIS field. The development of the application was based on open source software tools such as MapServer for the GIS functions, PostgreSQL and PostGIS for data management, and HTML, PHP and JavaScript as programming languages. In addition, background processes are used in an innovative manner to handle the time-consuming and computationally costly procedures of the application. Furthermore, a web map service was created to provide maps to other clients, such as the Google Maps API v3 that is used as part of the user interface. The output of the application includes maps in vector and raster format, maps with temporal resolution on a daily and hourly basis, grid files that can be used by air quality management systems, and grid files consistent with the European Monitoring and Evaluation Programme grid. Although the system was developed and validated for the Republic of Cyprus, covering a remarkably wide range of pollutants and emission sources, it can be easily customized for use in other countries or smaller areas, as long as geospatial and activity data are available.
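    The core spatial-allocation step in such a system distributes a sector or national emission total over grid cells in proportion to a spatial proxy (road density, population, land use). A minimal sketch of that step (the proxy values and total are illustrative, not from the paper):

```python
import numpy as np

def allocate_to_grid(total_emission, proxy_weights):
    """Distribute an emission total over grid cells in proportion to a
    spatial proxy; cells with zero proxy weight receive zero emission."""
    w = np.asarray(proxy_weights, dtype=float)
    return total_emission * w / w.sum()

# 3x3 grid with a population proxy; 900 t of NOx allocated proportionally.
population = np.array([[10, 20, 10],
                       [40,  0, 10],
                       [ 5,  0,  5]])
grid = allocate_to_grid(900.0, population)
# The allocated cells sum back to the inventory total by construction.
```
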

  10. Exploring the focus and experiences of smartphone applications for addiction recovery.

    PubMed

    Savic, Michael; Best, David; Rodda, Simone; Lubman, Dan I

    2013-01-01

    Addiction recovery smartphone applications (apps) (n = 87) identified on the Google Play store in 2012 were coded, along with app user reviews, to explore their functions, foci, and user experiences. Content analysis revealed that apps typically provided information on recovery, content to enhance motivation and promote social support, and tools to monitor progress. App users commented that the apps helped to inform them, keep them focused, inspire them, and connect them with other people and groups. Because few addiction recovery apps appear to have been formally evaluated, further research is needed to ascertain their effectiveness as stand-alone or adjunctive interventions.

  11. Evaluation of an improved algorithm for producing realistic 3D breast software phantoms: Application for mammography

    PubMed Central

    Bliznakova, K.; Suryanarayanan, S.; Karellas, A.; Pallikarakis, N.

    2010-01-01

    Purpose: This work presents an improved algorithm for the generation of 3D breast software phantoms and its evaluation for mammography. Methods: The improved methodology has evolved from a previously presented 3D noncompressed breast modeling method used for the creation of breast models of different size, shape, and composition. The breast phantom is composed of breast surface, duct system and terminal ductal lobular units, Cooper’s ligaments, lymphatic and blood vessel systems, pectoral muscle, skin, 3D mammographic background texture, and breast abnormalities. The key improvement is the development of a new algorithm for 3D mammographic texture generation. Simulated images of the enhanced 3D breast model without lesions were produced by simulating mammographic image acquisition and were evaluated subjectively and quantitatively. For evaluation purposes, a database with regions of interest taken from simulated and real mammograms was created. Four experienced radiologists participated in a visual subjective evaluation trial, judging the quality of simulated mammograms obtained with the new algorithm against mammograms obtained with the old modeling approach. In addition, extensive quantitative evaluation included power spectral analysis and calculation of fractal dimension, skewness, and kurtosis of simulated and real mammograms from the database. Results: The results from the subjective evaluation strongly suggest that the new methodology for mammographic breast texture creates improved breast models compared to the old approach. Calculated parameters on simulated images, such as the β exponent deduced from the power-law spectral analysis and the fractal dimension, are similar to those calculated on real mammograms. The results for the kurtosis and skewness are also in good agreement with those calculated from clinical images. Comparison with similar calculations published in the literature showed good agreement in the majority of cases. Conclusions: The
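    The β exponent mentioned above is the slope of a log-log fit to the image power spectrum, P(f) ∝ 1/f^β. A 1-D sketch that synthesizes noise with a known β and recovers it from the spectrum (illustrative only; mammographic analyses use radially averaged 2-D spectra):

```python
import numpy as np

def beta_exponent(signal):
    """Fit P(f) ~ 1/f^beta on a log-log scale and return beta."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal))
    keep = (freqs > 0) & (freqs < 0.5)        # drop DC and Nyquist bins
    slope, _ = np.polyfit(np.log(freqs[keep]), np.log(spectrum[keep]), 1)
    return -slope

# Synthesize 1/f^3 noise: amplitude ~ f^(-beta/2) with random phases.
rng = np.random.default_rng(4)
n = 4096
f = np.fft.rfftfreq(n)
amp = np.zeros_like(f)
amp[1:] = f[1:] ** (-3 / 2)
phase = np.exp(2j * np.pi * rng.uniform(size=f.size))
noise = np.fft.irfft(amp * phase, n)
beta = beta_exponent(noise)                   # recovers beta close to 3
```
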

  12. Evaluation of an improved algorithm for producing realistic 3D breast software phantoms: Application for mammography

    SciTech Connect

    Bliznakova, K.; Suryanarayanan, S.; Karellas, A.; Pallikarakis, N.

    2010-11-15

    Purpose: This work presents an improved algorithm for the generation of 3D breast software phantoms and its evaluation for mammography. Methods: The improved methodology has evolved from a previously presented 3D noncompressed breast modeling method used for the creation of breast models of different size, shape, and composition. The breast phantom is composed of breast surface, duct system and terminal ductal lobular units, Cooper's ligaments, lymphatic and blood vessel systems, pectoral muscle, skin, 3D mammographic background texture, and breast abnormalities. The key improvement is the development of a new algorithm for 3D mammographic texture generation. Simulated images of the enhanced 3D breast model without lesions were produced by simulating mammographic image acquisition and were evaluated subjectively and quantitatively. For evaluation purposes, a database with regions of interest taken from simulated and real mammograms was created. Four experienced radiologists participated in a visual subjective evaluation trial, judging the quality of simulated mammograms obtained with the new algorithm against mammograms obtained with the old modeling approach. In addition, extensive quantitative evaluation included power spectral analysis and calculation of fractal dimension, skewness, and kurtosis of simulated and real mammograms from the database. Results: The results from the subjective evaluation strongly suggest that the new methodology for mammographic breast texture creates improved breast models compared to the old approach. Calculated parameters on simulated images, such as the β exponent deduced from the power-law spectral analysis and the fractal dimension, are similar to those calculated on real mammograms. The results for the kurtosis and skewness are also in good agreement with those calculated from clinical images. Comparison with similar calculations published in the literature showed good agreement in the majority of cases. Conclusions: The

  13. History and modern applications of nano-composite materials carrying GA/cm2 current density due to a Bose-Einstein Condensate at room temperature produced by Focused Electron Beam Induced Processing for many extraordinary novel technical applications

    NASA Astrophysics Data System (ADS)

    Koops, Hans W. P.

    2015-12-01

    The discovery of Focused Electron Beam Induced Processing and early applications of this technology led to the possible use of a novel nanogranular material, “Koops-GranMat®”, based on Pt/C and Au/C, which at room temperature carries a current density more than 50 times that which high-TC superconductors can carry. The explanation for the characteristics of this novel material is given. This allows novel products to be fabricated for many applications using a dual-beam system with a gas supply and X,Y,T stream-data programming, rather than GDSII layout pattern control software. Novel products are possible for energy transportation, distribution and switching; photon detection above 65 meV for very efficient energy harvesting; bright field-emission electron sources for vacuum electronic devices such as HF amplifiers; micro-tubes; 30 GHz to 6 THz switching amplifiers with signal-to-noise ratio >10; THz power sources up to 1 W; and, in combination with miniaturized vacuum pumps, vacuum gauges, IR to THz detectors, and EUV and X-ray sources. Since focused electron beam induced deposition also works at low energy, self-cloning multibeam production machines become possible for field-emitter lamps, displays, multi-beam lithography, imaging and inspection, energy harvesting, and power distribution with switches controlling field-emitter arrays for kA-scale currents at switching voltages below 100 V. Finally, replacing HTC superconductors and their applications with the Koops-GranMat®, which hosts Koops pairs at room temperature, will allow the investigation of devices similar to Josephson junctions and their applications, now called QUIDART (quantum interference devices at room temperature). All these possibilities will support a revolution in optical, electric, power, and electronic technology.

  14. Decision support systems and applications in ophthalmology: literature and commercial review focused on mobile apps.

    PubMed

    de la Torre-Díez, Isabel; Martínez-Pérez, Borja; López-Coronado, Miguel; Díaz, Javier Rodríguez; López, Miguel Maldonado

    2015-01-01

    The growing importance that mobile devices have in daily life has also reached health care and medicine. This is changing the health care paradigm and making the concept of mHealth, or mobile health, more relevant, with apps as its main essence. This new reality makes it possible for doctors who are not specialists to have easy access to all the information generated in different corners of the world, making them potential keepers of that knowledge. However, the daily flow of new information exceeds the limits of the human intellect, making Decision Support Systems (DSS) necessary to help doctors diagnose diseases and decide what attitude to take towards these diagnoses. These systems could improve health care in remote areas and developing countries. All of this is even more important for diseases that are prevalent in primary care and directly affect people's quality of life, as is the case with ophthalmological problems, where first patient care does not involve a specialist in ophthalmology. The goal of this paper is to analyse the state of the art of DSS in ophthalmology, many of which focus on diseases affecting the eye's posterior pole. To achieve the main purpose of this research work, a literature review and an analysis of commercial apps were carried out. The databases and systems used were IEEE Xplore, Web of Science (WoS), Scopus, and PubMed. The search was limited to articles published from 2000 onwards. Different Mobile Decision Support Systems (MDSS) in ophthalmology were then analyzed in the virtual stores for Android and iOS. A total of 37 articles were selected according to their theme (posterior pole, anterior pole, Electronic Health Records (EHRs), cloud, data mining, algorithms and structures for DSS, and other) from a total of 600 found in the above-cited databases. Very few mobile apps were found in the different stores. It can be concluded that almost all existing mobile apps are focused on the eye's posterior

  15. Collection and focusing of laser accelerated ion beams for therapy applications

    NASA Astrophysics Data System (ADS)

    Hofmann, Ingo; Meyer-Ter-Vehn, Jürgen; Yan, Xueqing; Orzhekhovskaya, Anna; Yaramyshev, Stepan

    2011-03-01

    Experimental results in laser acceleration of protons and ions, and theoretical predictions that the currently achieved energies might be raised by factors of 5-10 in the next few years, have stimulated research exploring this new technology for oncology as a compact alternative to conventional synchrotron-based accelerator technology. The emphasis of this paper is on collection and focusing of the laser-produced particles using simulation data from a specific laser acceleration model. We present a scaling law for the “chromatic emittance” of the collector (here assumed to be a solenoid lens) and apply it to the particle energy and angular spectra of the simulation output. For a 10 Hz laser system we find that particle collection by a solenoid magnet well satisfies the intensity and beam-quality requirements of depth-scanning irradiation, with a sufficiently large safety margin for intensity, whereas a scheme without collection, using mere aperture collimation, hardly reaches the needed intensities.
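    The chromatic effect at the heart of the collection problem can be illustrated with the standard thin-lens solenoid focal length, 1/f = (qB/2p)^2 L: since f scales as p^2, an energy spread maps directly into a focal-length spread (Δf/f ≈ 2Δp/p), which is what inflates the collected emittance. The field, length and energy values below are illustrative assumptions, not the paper's parameters:

```python
import math

E0 = 938.272e6                 # proton rest energy, eV

def proton_momentum(kinetic_ev):
    """Relativistic proton momentum in eV/c for a given kinetic energy."""
    total = kinetic_ev + E0
    return math.sqrt(total**2 - E0**2)

def solenoid_focal_length(B, length, kinetic_ev):
    """Thin-lens solenoid focusing: 1/f = (qB / 2p)^2 * L, with p in SI units."""
    q, c = 1.602e-19, 2.998e8
    p_si = proton_momentum(kinetic_ev) * q / c     # eV/c -> kg m/s
    return 1.0 / ((q * B / (2 * p_si)) ** 2 * length)

# Assumed collector: 8 T, 0.3 m solenoid; 10 MeV protons with +5% energy spread.
f0 = solenoid_focal_length(8.0, 0.3, 10.0e6)
f_hi = solenoid_focal_length(8.0, 0.3, 10.5e6)
chromatic_spread = (f_hi - f0) / f0
# ~5%: f tracks p^2, and non-relativistically p^2 is proportional to T
```
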

  16. [Focus on the clinical application of the first artificial bioengineered cornea in China].

    PubMed

    Shi, W Y; Xie, L X

    2016-03-01

    Corneal disease ranks second among blinding diseases in China, with corneal blindness accounting for one quarter of the blind population. It has a high incidence and a low rate of sight rehabilitation. The lack of cornea donors and the shortage of corneal specialists have become the bottleneck that hampers progress in treating corneal disease. It is of prime importance to find sufficient donor sources in order to alleviate the current situation. In 2015, the world's first artificial bioengineered cornea was approved by CFDA for clinical application as a substitute for human cornea in lamellar keratoplasty. It will not only relieve the cornea donor shortage, but will also spur the training and practice of corneal specialists. The application and clinical trials of such an innovative product will significantly boost fundamental and clinical research on corneal disease in our country, helping more patients with corneal blindness to recover their sight.

  17. Techtalk: Selecting Software for Instruction.

    ERIC Educational Resources Information Center

    Christ, Frank

    1983-01-01

    Explores primary steps in finding, selecting, and evaluating software for instructional applications. Cites directories of commercial vendors; periodicals that provide software announcements and reviews; and guides to assist in courseware selection and evaluation. (DMM)

  18. Characterization of Meteorites by Focused Ion Beam Sectioning: Recent Applications to CAIs and Primitive Meteorite Matrices

    NASA Technical Reports Server (NTRS)

    Christoffersen, Roy; Keller, Lindsay P.; Han, Jangmi; Rahman, Zia; Berger, Eve L.

    2015-01-01

    Focused ion beam (FIB) sectioning has revolutionized the preparation of meteorite samples for characterization by analytical transmission electron microscopy (TEM) and other techniques. Although FIB is not "non-destructive" in the purest sense, each extracted section amounts to no more than nanograms (approximately 500 cubic microns) removed intact from locations precisely controlled by SEM imaging and analysis. Physical alteration of surrounding material by ion damage, fracture or sputter contamination effects is localized to within a few micrometers around the lift-out point. This leaves adjacent material intact for coordinated geochemical analysis by SIMS, microdrill extraction/TIMS and other techniques. After lift-out, FIB sections can be quantitatively analyzed by electron microprobe prior to final thinning, by synchrotron x-ray techniques, and by the full range of state-of-the-art analytical field-emission scanning transmission electron microscope (FE-STEM) techniques once thinning is complete. Multiple meteorite studies supported by FIB/FE-STEM are currently underway at NASA-JSC, including coordinated analysis of refractory phase assemblages in CAIs and fine-grained matrices in carbonaceous chondrites. FIB sectioning of CAIs has uncovered epitaxial and other overgrowth relations among corundum, hibonite and spinel consistent with hibonite preceding corundum and/or spinel in non-equilibrium condensation sequences at combinations of higher gas pressures, dust-gas enrichments or significant nebular transport. For all of these cases, the ability of FIB to allow coordination with spatially associated isotopic data from SIMS provides immense value for constraining the formation scenarios of the particular CAI assemblage. For carbonaceous chondrite matrix material, FIB has allowed us to obtain intact continuous sections of the immediate outer surface of Murchison (CM2) after it has been experimentally ion-processed to simulate solar wind space weathering. The surface

  19. Applications of animal models of infectious arthritis in drug discovery: a focus on alphaviral disease.

    PubMed

    Herrero, Lara; Nelson, Michelle; Bettadapura, Jayaram; Gahan, Michelle E; Mahalingam, Suresh

    2011-06-01

    Animal models, which mimic human disease, are invaluable tools for understanding the mechanisms of disease pathogenesis and developing treatment strategies. In particular, animal models play important roles in the area of infectious arthritis. Alphaviruses, including Ross River virus (RRV), o'nyong-nyong virus, chikungunya virus (CHIKV), Mayaro virus, Semliki Forest virus and Sindbis virus, are globally distributed and cause transient illness characterized by fever, rash, myalgia, arthralgia and arthritis in humans. Severe forms of the disease result in chronic incapacitating arthralgia and arthritis. The mechanisms by which these viruses cause musculoskeletal disease are ill defined. In recent years, the use of a mouse model of RRV-induced disease has assisted in unraveling the pathobiology of infection and in discovering novel drugs to ameliorate disease. RRV as an infection model has the potential to provide key insights into such disease processes, particularly as many viruses other than alphaviruses are known to cause infectious arthritides. The emergence and outbreak of CHIKV in many parts of the world has necessitated the development of animal models of CHIKV disease. The development of non-human primate models of CHIKV disease has given insights into viral tropism and disease pathogenesis and facilitated the development of new treatment strategies. This review highlights the application of animal models of alphaviral diseases to the fundamental understanding of the mechanisms that contribute to disease and to defining the role that the immune response may have in disease pathogenesis, with the view of providing the foundation for new treatments.

  20. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 2: Software documentation

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed in design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
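    The core PFA idea of propagating parameter uncertainty through an engineering analysis model can be sketched with a small Monte Carlo simulation. The Basquin-type life model, the input distributions, and all numerical values below are illustrative assumptions, not values from the methodology itself:

    ```python
    import random

    # Hedged sketch of probabilistic failure assessment: draw uncertain model
    # parameters, push each draw through a simple fatigue-life model, and count
    # the fraction of draws that fail to reach the required service life.
    def simulate_failure_prob(n=20000, required_cycles=1e4, seed=0):
        rng = random.Random(seed)
        failures = 0
        for _ in range(n):
            # Uncertain inputs (illustrative): applied stress and the
            # log10 of an S-N curve coefficient.
            stress = rng.gauss(300.0, 20.0)   # MPa
            log_c = rng.gauss(12.0, 0.3)      # log10 of S-N coefficient C
            m = 3.0                           # S-N exponent, held fixed here
            # Basquin-type life model: N = C / S^m
            cycles_to_failure = 10 ** log_c / stress ** m
            if cycles_to_failure < required_cycles:
                failures += 1
        return failures / n

    print(simulate_failure_prob())
    ```

    In the full methodology the resulting failure probability distribution would then be updated with test and flight experience; this sketch shows only the forward uncertainty propagation step.
    
    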

  1. Tomato Analyzer: a useful software application to collect accurate and detailed morphological and colorimetric data from two-dimensional objects.

    PubMed

    Rodríguez, Gustavo R; Moyseenko, Jennifer B; Robbins, Matthew D; Morejón, Nancy Huarachi; Francis, David M; van der Knaap, Esther

    2010-03-16

    Measuring fruit morphology and color traits of vegetable and fruit crops in an objective and reproducible way is important for detailed phenotypic analyses of these traits. Tomato Analyzer (TA) is a software program that measures 37 attributes related to two-dimensional shape in a semi-automatic and reproducible manner. Many of these attributes, such as angles at the distal and proximal ends of the fruit and areas of indentation, are difficult to quantify manually. The attributes are organized in ten categories within the software: Basic Measurement, Fruit Shape Index, Blockiness, Homogeneity, Proximal Fruit End Shape, Distal Fruit End Shape, Asymmetry, Internal Eccentricity, Latitudinal Section and Morphometrics. The last category requires neither prior knowledge nor predetermined notions of the shape attributes, so morphometric analysis offers an unbiased option that may be better adapted to high-throughput analyses than attribute analysis. TA also offers the Color Test application that was designed to collect color measurements from scanned images and allow scanning devices to be calibrated using color standards. TA provides several options to export and analyze shape attribute, morphometric, and color data. The data may be exported to an Excel file in batch mode (more than 100 images at one time) or exported as individual images. The user can choose between output that displays the average for each attribute for the objects in each image (including standard deviation), or an output that displays the attribute values for each object on the image. TA has been a valuable and effective tool for identifying and confirming tomato fruit shape Quantitative Trait Loci (QTL), as well as performing in-depth analyses of the effect of key fruit shape genes on plant morphology. Also, TA can be used to objectively classify fruit into various shape categories. Lastly, fruit shape and color traits in other plant species as well as other plant organs such as leaves and seeds
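    The simplest of the attributes described, the fruit shape index (height-to-width ratio), can be illustrated on a toy binary mask. This is an illustrative sketch and not Tomato Analyzer code; the real software derives its 37 attributes from object boundaries detected in scanned images:

    ```python
    # Compute a Tomato Analyzer-style fruit shape index from a binary mask:
    # the ratio of the object's bounding-box height to its width.
    def shape_index(mask):
        rows = [r for r, row in enumerate(mask) if any(row)]
        cols = [c for row in mask for c, v in enumerate(row) if v]
        height = max(rows) - min(rows) + 1
        width = max(cols) - min(cols) + 1
        return height / width

    # A 5-pixel-tall, 3-pixel-wide blob: an elongated "fruit".
    mask = [
        [0, 1, 0],
        [1, 1, 1],
        [1, 1, 1],
        [1, 1, 1],
        [0, 1, 0],
    ]
    print(round(shape_index(mask), 2))  # prints 1.67
    ```

    A shape index above 1 indicates an elongated object; a round fruit would score close to 1.
    
    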

  2. Application of mass-separated focused ion beams in nano-technology

    NASA Astrophysics Data System (ADS)

    Bischoff, L.

    2008-04-01

    FIB applications like writing ion implantation, ion beam mixing or ion beam synthesis in the μm- or nm range often require ion species other than gallium. Therefore, alloy liquid metal ion sources (LMIS) have to be developed and applied in FIB tools. The energy distribution of ions emitted from an alloy LMIS is one of the crucial parameters for the performance of a FIB column. Different source materials like AuGe, AuSi, AuGeSi, CoNd, ErNi, ErFeNiCr, MnGe, GaBi, GaBiLi, SnPb, … were investigated with respect to the energy spread of the different ion species as a function of emission current, ion mass and emitter temperature. Different alloy LMISs have been developed and used in the FZD FIB system, especially for writing implantation to fabricate sub-μm patterns without any lithographic steps. Co and various other ion species were applied to generate CoSi2 nano-structures, like dots and wires, by ion beam synthesis or to manipulate the properties of magnetic films. Additionally, the possibility of varying the flux in the FIB by changing the pixel dwell-time can be used for the investigation of radiation damage and dynamic annealing in Si, Ge and SiC at elevated implantation temperatures. Furthermore, a broad spectrum of ions was employed to study in a fast manner the sputtering process depending on temperature, angle of incidence and ion mass on a couple of target materials. These studies are important for the 3D-fabrication of various kinds of micro-tools by FIB milling.

  3. Software Program: Software Management Guidebook

    NASA Technical Reports Server (NTRS)

    1996-01-01

    The purpose of this NASA Software Management Guidebook is twofold. First, this document defines the core products and activities required of NASA software projects. It defines life-cycle models and activity-related methods but acknowledges that no single life-cycle model is appropriate for all NASA software projects. It also acknowledges that the appropriate method for accomplishing a required activity depends on characteristics of the software project. Second, this guidebook provides specific guidance to software project managers and team leaders in selecting appropriate life cycles and methods to develop a tailored plan for a software engineering project.

  4. Steady-state heat transfer in transversely heated porous media with application to focused solar energy collectors

    NASA Technical Reports Server (NTRS)

    Nichols, L. D.

    1976-01-01

    A fluid flowing in a porous medium heated transversely to the fluid flow is considered. This configuration is applicable to a focused solar energy collector for use in an electric power generating system. A fluidized bed can be regarded as a porous medium with special properties. The solutions presented are valid for describing the effectiveness of such a fluidized bed for collecting concentrated solar energy to heat the working fluid of a heat engine. Results indicate the advantage of high thermal conductivity in the transverse direction and high operating temperature of the porous medium.

  5. Software For Genetic Algorithms

    NASA Technical Reports Server (NTRS)

    Wang, Lui; Bayer, Steve E.

    1992-01-01

    SPLICER computer program is genetic-algorithm software tool used to solve search and optimization problems. Provides underlying framework and structure for building genetic-algorithm application program. Written in Think C.
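    The search strategy such a tool provides a framework for can be sketched with a minimal genetic algorithm on the classic "one-max" toy problem. This is an illustrative example, not SPLICER itself (which is written in Think C); the population size, mutation rate, and fitness function are arbitrary choices:

    ```python
    import random

    # Fitness: number of 1-bits in the chromosome (the "one-max" problem).
    def one_max(bits):
        return sum(bits)

    def evolve(n_bits=20, pop_size=30, generations=60, p_mut=0.02, seed=1):
        rng = random.Random(seed)
        pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
        for _ in range(generations):
            # Tournament selection: keep the fitter of two random individuals.
            def select():
                a, b = rng.sample(pop, 2)
                return a if one_max(a) >= one_max(b) else b
            nxt = []
            while len(nxt) < pop_size:
                p1, p2 = select(), select()
                cut = rng.randrange(1, n_bits)          # one-point crossover
                child = p1[:cut] + p2[cut:]
                child = [b ^ (rng.random() < p_mut) for b in child]  # mutation
                nxt.append(child)
            pop = nxt
        return max(pop, key=one_max)

    best = evolve()
    print(one_max(best))  # close to the maximum of 20 on this toy problem
    ```

    A framework like SPLICER supplies this selection/crossover/mutation machinery so that only the problem-specific fitness function needs to be written.
    
    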

  6. Proprietary software

    NASA Technical Reports Server (NTRS)

    Marnock, M. J.

    1971-01-01

    The protection of intellectual property by a patent, a copyright, or trade secrets is reviewed. The present and future use of computers and software are discussed, along with the governmental uses of software. The popularity of contractual agreements for sale or lease of computer programs and software services is also summarized.

  7. NASA's Software Safety Standard

    NASA Technical Reports Server (NTRS)

    Ramsay, Christopher M.

    2005-01-01

    NASA (National Aeronautics and Space Administration) relies more and more on software to control, monitor, and verify its safety critical systems, facilities and operations. Since the 1960s there has hardly been a spacecraft (manned or unmanned) launched that did not have a computer on board that provided vital command and control services. Despite this growing dependence on software control and monitoring, there has been no consistent application of software safety practices and methodology to NASA's projects with safety critical software. Led by the NASA Headquarters Office of Safety and Mission Assurance, the NASA Software Safety Standard (NASA-STD-8719.13B) has recently undergone a significant update in an attempt to provide that consistency. This paper will discuss the key features of the new NASA Software Safety Standard. It will start with a brief history of the use and development of software in safety critical applications at NASA. It will then give a brief overview of the NASA Software Working Group and the approach it took to revise the software engineering process across the Agency.

  8. Advanced Software Development Workstation Project

    NASA Technical Reports Server (NTRS)

    Lee, Daniel

    1989-01-01

    The Advanced Software Development Workstation Project, funded by Johnson Space Center, is investigating knowledge-based techniques for software reuse in NASA software development projects. Two prototypes have been demonstrated and a third is now in development. The approach is to build a foundation that provides passive reuse support, add a layer that uses domain-independent programming knowledge, add a layer that supports the acquisition of domain-specific programming knowledge to provide active support, and enhance maintainability and modifiability through an object-oriented approach. The development of new application software would use specification-by-reformulation, based on a cognitive theory of retrieval from very long-term memory in humans, and using an Ada code library and an object base. Current tasks include enhancements to the knowledge representation of Ada packages and abstract data types, extensions to support Ada package instantiation knowledge acquisition, integration with Ada compilers and relational databases, enhancements to the graphical user interface, and demonstration of the system with a NASA contractor-developed trajectory simulation package. Future work will focus on investigating issues involving scale-up and integration.

  9. Requirements Engineering in Building Climate Science Software

    NASA Astrophysics Data System (ADS)

    Batcheller, Archer L.

    Software has an important role in supporting scientific work. This dissertation studies teams that build scientific software, focusing on the way that they determine what the software should do. These requirements engineering processes are investigated through three case studies of climate science software projects. The Earth System Modeling Framework assists modeling applications, the Earth System Grid distributes data via a web portal, and the NCAR (National Center for Atmospheric Research) Command Language is used to convert, analyze and visualize data. Document analysis, observation, and interviews were used to investigate the requirements-related work. The first research question is about how and why stakeholders engage in a project, and what they do for the project. Two key findings arise. First, user counts are a vital measure of project success, which makes adoption important and makes counting tricky and political. Second, despite the importance of quantities of users, a few particular "power users" develop a relationship with the software developers and play a special role in providing feedback to the software team and integrating the system into user practice. The second research question focuses on how project objectives are articulated and how they are put into practice. The team seeks to both build a software system according to product requirements but also to conduct their work according to process requirements such as user support. Support provides essential communication between users and developers that assists with refining and identifying requirements for the software. It also helps users to learn and apply the software to their real needs. User support is a vital activity for scientific software teams aspiring to create infrastructure. The third research question is about how change in scientific practice and knowledge leads to changes in the software, and vice versa. The "thickness" of a layer of software infrastructure impacts whether the

  10. Dosimetry in radiotherapy using a-Si EPIDs: Systems, methods, and applications focusing on 3D patient dose estimation

    NASA Astrophysics Data System (ADS)

    McCurdy, B. M. C.

    2013-06-01

    An overview is provided of the use of amorphous silicon electronic portal imaging devices (EPIDs) for dosimetric purposes in radiation therapy, focusing on 3D patient dose estimation. EPIDs were originally developed to provide on-treatment radiological imaging to assist with patient setup, but there has also been a natural interest in using them as dosimeters since they use the megavoltage therapy beam to form images. The current generation of clinically available EPID technology, amorphous-silicon (a-Si) flat panel imagers, possesses many characteristics that make it much better suited to dosimetric applications than earlier EPID technologies. Features such as linearity with dose/dose rate, high spatial resolution, real-time capability, minimal optical glare, and digital operation combine with the convenience of a compact, retractable detector system directly mounted on the linear accelerator to provide a system that is well-suited to dosimetric applications. This review will discuss clinically available a-Si EPID systems, highlighting dosimetric characteristics and remaining limitations. Methods for using EPIDs in dosimetry applications will be discussed. Dosimetric applications using a-Si EPIDs to estimate three-dimensional dose in the patient during treatment will be overviewed. Clinics throughout the world are implementing increasingly complex treatments such as dynamic intensity modulated radiation therapy and volumetric modulated arc therapy, as well as specialized treatment techniques using large doses per fraction and short treatment courses (i.e., hypofractionation and stereotactic radiosurgery). These factors drive the continued strong interest in using EPIDs as dosimeters for patient treatment verification.

  11. Gammasphere software development. Progress report

    SciTech Connect

    Piercey, R.B.

    1994-01-01

    This report describes the activities of the nuclear physics group at Mississippi State University which were performed during 1993. Significant progress has been made in the focus areas: chairing the Gammasphere Software Working Group (SWG); assisting with the porting and enhancement of the ORNL UPAK histogramming software package; and developing standard formats for Gammasphere data products. In addition, they have established a new public ftp archive to distribute software and software development tools and information.

  12. Software Released by LEWICE 2.0 Ice Accretion Software Development Project

    NASA Technical Reports Server (NTRS)

    Potapczuk, Mark G.

    2000-01-01

    Computational icing simulation methods are making the transition from the realm of research to commonplace use in design and certification. As such, standards of software management, design, validation, and documentation must be adjusted to accommodate the increased expectations of the user community with respect to accuracy, reliability, capability, and usability. With this in mind, in collaboration with Glenn's Engineering Design and Analysis Division, the Icing Branch of the NASA Glenn Research Center at Lewis Field began a software improvement project focused on the two-dimensional ice accretion simulation tool LEWICE. This project is serving as an introduction to the concepts of software management and is intended to serve as a pilot project for future icing simulation code development. The LEWICE 2.0 Software Development Project consisted of two major elements: software management and software validation. The software management element consisted of identifying features of well-designed and well-managed software that are appropriate for an analytical prediction tool such as LEWICE and applying them to a revised version of the code. This element included tasks such as identification of software requirements, development and implementation of coding standards, and implementation of software revision control practices. With the application of these techniques, the LEWICE ice accretion code became a more stable and reliable software product. In addition, the lessons learned about software development and maintenance can be factored into future software projects at the outset. The software validation activity was an integral part of our effort to make LEWICE a more accurate and reliable analysis tool. Because of the efforts taken to extensively validate this software, LEWICE 2.0 is more robust than previous releases and can reproduce results accurately across several computing platforms. It also differs from previous versions in the extensive quantitative

  13. A dedicated software application for treatment verification with off-line PET/CT imaging at the Heidelberg Ion Beam Therapy Center

    NASA Astrophysics Data System (ADS)

    Chen, W.; Bauer, J.; Kurz, C.; Tessonnier, T.; Handrack, J.; Haberer, T.; Debus, J.; Parodi, K.

    2017-01-01

    We present the workflow of the off-line PET-based range verification method used at the Heidelberg Ion Beam Therapy Center, detailing the functionalities of an in-house developed software application, SimInterface14, with which range analysis is performed. Moreover, we introduce the design of a decision support system that assesses uncertainties and assists physicians in decision making for plan adaptation.

  14. The Use of Commercial Application and Programming Software To Provide Language Students with Self-Access Materials Used in Their Own Time.

    ERIC Educational Resources Information Center

    Rogers, Paul

    This paper describes the use of commercial application and programming software to provide language students with self-access materials used in their own time. In all cases, it assumes such students will be unsupervised. The idea is that in place of buying expensive and restricted commercially produced authoring packages, a teacher with limited…

  15. Developing Software Product Lines for Science Data Systems (Invited)

    NASA Astrophysics Data System (ADS)

    Crichton, D. J.; Hughes, J. S.; Mattmann, C. A.; Law, E.; Hardman, S.

    2010-12-01

    Software reuse has traditionally been a challenging proposition. While the allure of reusing software has great appeal to increasing stability and reducing software costs, there has been limited success in building software that can be efficiently reused. In many cases, reuse is limited to the reuse of software expertise or repurposing existing software code. While there are certainly cultural challenges involved in reusing software, much of the challenge can be traced back to the strategy involved in developing reusable software. The discipline of software architecture plays an important role since software reuse is highly dependent on developing a reference architecture that can be used for the construction of software product lines. All too often software reference architectures are implicit or are highly focused on specific implementations. The challenge is developing a reference architecture that identifies core patterns that exist across many systems at an appropriate level of abstraction and then developing a reference implementation that can serve as a reusable product line. At the Jet Propulsion Laboratory (JPL), we have been involved in developing both reference architectures and software product lines for science data systems [1]. These reference architectures identify common patterns in data capture, data processing and product generation, data discovery, data access and distribution, and data movement. How those patterns are implemented is critical to establishing a reusable architecture. In addition, the separation of the technical and data architecture has proven critical to allowing for such product lines to be applied to multiple disciplines, where domain information models are developed and applied, rather than directly integrated into software. This presentation will focus on defining software architecture and product lines, the development of these capabilities at JPL, and the application to earth, planetary and biomedical domains. [1] C. Mattmann

  16. Optically deviated focusing method based high-speed SD-OCT for in vivo retinal clinical applications

    NASA Astrophysics Data System (ADS)

    Wijesinghe, Ruchire Eranga; Park, Kibeom; Kim, Pilun; Oh, Jaeryung; Kim, Seong-Woo; Kim, Kwangtae; Kim, Beop-Min; Jeon, Mansik; Kim, Jeehyun

    2016-04-01

    The aim of this study is to provide accurately focused, high-resolution in vivo human retinal depth images using an optically deviated focusing method with a spectral-domain optical coherence tomography (SD-OCT) system. The proposed method was applied to increase the speed of retinal diagnosis for patients with various retinal distances (i.e., the distance between the crystalline eye lens and the retina). The increased diagnostic speed was facilitated through an optical modification of the OCT sample arm configuration. Moreover, the optical path length matching process was compensated for using the proposed optically deviated focusing method. The developed system was mounted on a bench-top cradle to overcome motion artifacts. Further, we demonstrated the capability of the system by carrying out in vivo retinal imaging experiments. The clinical trials confirmed that the system was effective in diagnosing normal and abnormal retinal layers, as several retinal abnormalities were identified using non-averaged single-shot OCT images, which demonstrates the feasibility of the method for clinical applications.

  17. A Quantitative Software Risk Assessment Model

    NASA Technical Reports Server (NTRS)

    Lee, Alice

    2002-01-01

    This slide presentation reviews a risk assessment model as applied to software development. The presentation uses graphs to demonstrate basic concepts of software reliability. It also discusses the application of the risk model to the software development life cycle.
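    The kind of reliability curve such graphs typically show can be sketched with the Goel-Okumoto model, in which expected cumulative failures follow mu(t) = a(1 - e^(-bt)). The model choice and parameter values here are assumptions for illustration, not taken from the presentation:

    ```python
    import math

    # Goel-Okumoto NHPP: a = total expected failures, b = detection rate.
    def expected_failures(t, a=100.0, b=0.05):
        return a * (1.0 - math.exp(-b * t))

    # Probability of observing no failure in (t, t + dt] under the model.
    def reliability(t, dt, a=100.0, b=0.05):
        return math.exp(-(expected_failures(t + dt, a, b)
                          - expected_failures(t, a, b)))

    print(round(expected_failures(30), 1))  # prints 77.7
    print(round(reliability(30, 10), 3))
    ```

    The curve flattens as testing proceeds, which is the visual signature of reliability growth that such presentations plot.
    
    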

  18. Software for Statistical Analysis of Weibull Distributions with Application to Gear Fatigue Data: User Manual with Verification

    NASA Technical Reports Server (NTRS)

    Krantz, Timothy L.

    2002-01-01

    The Weibull distribution has been widely adopted for the statistical description and inference of fatigue data. This document provides user instructions, examples, and verification for software to analyze gear fatigue test data. The software was developed presuming the data are adequately modeled using a two-parameter Weibull distribution. The calculations are based on likelihood methods, and the approach taken is valid for data that include type I censoring. The software was verified by reproducing results published by others.
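    The two-parameter Weibull likelihood fit can be sketched as follows for complete (uncensored) samples; the censoring-aware likelihood handled by the actual software is omitted here, and the data values are made up for illustration:

    ```python
    import math

    # Maximum-likelihood fit of a two-parameter Weibull distribution to
    # complete life data, solving the profile-likelihood equation for the
    # shape parameter by bisection (it is monotone increasing in beta).
    def weibull_mle(times, lo=0.05, hi=50.0, tol=1e-10):
        n = len(times)
        mean_log = sum(math.log(t) for t in times) / n

        def g(beta):
            s = sum(t ** beta for t in times)
            s_log = sum((t ** beta) * math.log(t) for t in times)
            return s_log / s - 1.0 / beta - mean_log

        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if g(mid) > 0:
                hi = mid
            else:
                lo = mid
        beta = 0.5 * (lo + hi)
        eta = (sum(t ** beta for t in times) / n) ** (1.0 / beta)
        return beta, eta  # (shape, scale)

    # Illustrative fatigue-life data (arbitrary units).
    data = [72.4, 101.7, 56.3, 126.9, 88.1, 143.2, 64.0, 95.5, 110.8, 79.6]
    shape, scale = weibull_mle(data)
    print(round(shape, 2), round(scale, 1))
    ```

    Type I censoring adds a survival-function term for each unfailed unit to the likelihood, which changes the estimating equation but not the overall approach.
    
    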

  19. Software for Statistical Analysis of Weibull Distributions with Application to Gear Fatigue Data: User Manual with Verification

    NASA Technical Reports Server (NTRS)

    Krantz, Timothy L.

    2002-01-01

    The Weibull distribution has been widely adopted for the statistical description and inference of fatigue data. This document provides user instructions, examples, and verification for software to analyze gear fatigue test data. The software was developed presuming the data are adequately modeled using a two-parameter Weibull distribution. The calculations are based on likelihood methods, and the approach taken is valid for data that include type I censoring. The software was verified by reproducing results published by others.

  20. Lessons learned from development and quality assurance of software systems at the Halden Project

    SciTech Connect

    Bjorlo, T.J.; Berg, O.; Pehrsen, M.; Dahll, G.; Sivertsen, T.

    1996-03-01

    The OECD Halden Reactor Project has developed a number of software systems within its research programmes. These programmes have comprised a wide range of topics, like studies of software for safety-critical applications, development of different operator support systems, and software systems for building and implementing graphical user interfaces. The systems have ranged from simple prototypes to installations in process plants. In the development of these software systems, Halden has gained much experience in quality assurance of different types of software. This paper summarises the accumulated experience at the Halden Project in quality assurance of software systems. The different software systems being developed at the Halden Project may be grouped into three categories. These are plant-specific software systems (one-of-a-kind deliveries), generic software products, and safety-critical software systems. This classification has been found convenient as the categories have different requirements for the quality assurance process. In addition, the experience from use of software development tools and proprietary software systems at Halden is addressed. The paper also focuses on the experience gained from the complete software life cycle, starting with the software planning phase and ending with software operation and maintenance.