Sample records for software takes advantage

  1. Scaling Task Management in Space and Time: Reducing User Overhead in Ubiquitous-Computing Environments

    DTIC Science & Technology

    2005-03-28

    consequently users are torn between taking advantage of increasingly pervasive computing systems, and the price (in attention and skill) that they have to... advantage of the surrounding computing environments; and (c) that it is usable by non-experts. Second, from a software architect’s perspective, we...take full advantage of the computing systems accessible to them, much as they take advantage of the furniture in each physical space. In the example

  2. Stretching Your Technology Dollar

    ERIC Educational Resources Information Center

    Johnson, Doug

    2012-01-01

    A school district technology director offers 10 strategies to help schools make the most of their technology dollar. These include using effective budgeting techniques, taking advantage of the buying power of groups, practicing sustainable technology, purchasing the right tool for the right job, taking advantage of free software, using cloud…

  3. [Apple-Macintosh compatible software for documentation, management and evaluation of ultrasound findings in obstetrics].

    PubMed

    Kurmanavicius, J; Huch, R; Huch, A

    1993-02-01

    The advantage of using a computer to automate routine calculations and print out charts of the obstetrical ultrasound examination is obvious. This report describes software designed to simplify the documentation and analysis of ultrasound data in obstetrics. The system is easy to use, even for persons with little computer knowledge. The programme was written in FoxBase+/Mac (Fox Software, Inc., USA). FoxBase+/Mac takes full advantage of the easy-to-learn, easy-to-use Macintosh interface and is also very fast. Another advantage of this software is that it can be used in teaching. Inexperienced examiners can double-check the correctness of their scanning planes by observing the ultrasound pictures with the markers indicating the right measurement sites and the lists of standard values of biometrical parameters for the corresponding gestational age on the screen. In routine obstetrical ultrasound examinations it takes less than 5 min to enter the foetal biometry data and print out reports. These reports are informative and easy to interpret.

  4. Computer and control applications in a vegetable processing plant

    USDA-ARS?s Scientific Manuscript database

    There are many advantages to the use of computers and control in the food industry. Software in the food industry takes two forms - general purpose commercial computer software and software for specialized applications, such as drying and thermal processing of foods. Many applied simulation models for d...

  5. Domain analysis for the reuse of software development experiences

    NASA Technical Reports Server (NTRS)

    Basili, V. R.; Briand, L. C.; Thomas, W. M.

    1994-01-01

    We need to be able to learn from past experiences so we can improve our software processes and products. The Experience Factory is an organizational structure designed to support and encourage the effective reuse of software experiences. This structure consists of two organizations, separating project development concerns from the organizational concerns of experience packaging and learning. The Experience Factory provides the processes and support for analyzing, packaging, and improving the organization's stored experience. The project organization is structured to reuse this stored experience in its development efforts. However, a number of questions arise: What past experiences are relevant? Can they all be used (reused) on our current project? How do we take advantage of what has been learned in other parts of the organization? How do we take advantage of experience in the world-at-large? Can someone else's best practices be used in our organization with confidence? This paper describes approaches to help answer these questions. We propose both quantitative and qualitative approaches for effectively reusing software development experiences.

  6. Using the CoRE Requirements Method with ADARTS. Version 01.00.05

    DTIC Science & Technology

    1994-03-01

    requirements; combining ADARTS processes and objects derived from CoRE requirements into an ADARTS software architecture design; and taking advantage of...CoRE’s precision in the ADARTS process structuring, class structuring, and software architecture design activities. Object-oriented requirements and

  7. Framework programmable platform for the advanced software development workstation. Integration mechanism design document

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Reddy, Uday; Ackley, Keith; Futrell, Mike

    1991-01-01

    The Framework Programmable Software Development Platform (FPP) is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software development environment. Guided by this model, this system development framework will take advantage of an integrated operating environment to effectively automate the management of the software development process so that costly mistakes during the development phase can be eliminated.

  8. Education and the Asian Surge: A Comparison of the Education Systems in India and China

    DTIC Science & Technology

    2008-01-01

    countries similar to those that other researchers have faced. For instance, Bardhan (2003) notes that fewer reliability checks and internal consistency tests...with a critical mass to take advantage of the software outsourcing boom... According to UNESCO, although the definition of literacy may vary from one...need to be targeted. For instance, too much emphasis on the study of information technology to take advantage of the current outsourcing trends could

  9. Enhancing the Student Learning Experience in Software Engineering Project Courses

    ERIC Educational Resources Information Center

    Marques, Maira; Ochoa, Sergio F.; Bastarrica, Maria Cecilia; Gutierrez, Francisco J.

    2018-01-01

    Carrying out real-world software projects in their academic studies helps students to understand what they will face in industry, and to experience first-hand the challenges involved when working collaboratively. Most of the instructional strategies used to help students take advantage of these activities focus on supporting agile programming,…

  10. Debate and the World Debates with You

    ERIC Educational Resources Information Center

    Read, Tina

    2011-01-01

    These days, many children do not enjoy the freedom to play outside. However, one of the advantages for children growing up now is the new technology that allows them to communicate with people their age, anywhere in the world. The author's company, Illumination Educational Software, decided to take advantage of these advances in technology to get…

  11. Facilities | Computational Science | NREL

    Science.gov Websites

    technology innovation by providing scientists and engineers the ability to tackle energy challenges and to take full advantage of advanced computing hardware and software resources

  12. Framework Programmable Platform for the advanced software development workstation: Framework processor design document

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Ackley, Keith A.; Crump, Wes; Sanders, Les

    1991-01-01

    The design of the Framework Processor (FP) component of the Framework Programmable Software Development Platform (FPP) is described. The FPP is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software development environment. Guided by the model, this Framework Processor will take advantage of an integrated operating environment to provide automated support for the management and control of the software development process so that costly mistakes during the development phase can be eliminated.

  13. Whole earth modeling: developing and disseminating scientific software for computational geophysics.

    NASA Astrophysics Data System (ADS)

    Kellogg, L. H.

    2016-12-01

    Historically, a great deal of specialized scientific software for modeling and data analysis has been developed by individual researchers or small groups of scientists working on their own specific research problems. As the magnitude of available data and computer power has increased, so has the complexity of scientific problems addressed by computational methods, creating both a need to sustain existing scientific software, and expand its development to take advantage of new algorithms, new software approaches, and new computational hardware. To that end, communities like the Computational Infrastructure for Geodynamics (CIG) have been established to support the use of best practices in scientific computing for solid earth geophysics research and teaching. Working as a scientific community enables computational geophysicists to take advantage of technological developments, improve the accuracy and performance of software, build on prior software development, and collaborate more readily. The CIG community, and others, have adopted an open-source development model, in which code is developed and disseminated by the community in an open fashion, using version control and software repositories like Git. One emerging issue is how to adequately identify and credit the intellectual contributions involved in creating open source scientific software. The traditional method of disseminating scientific ideas, peer reviewed publication, was not designed for reviewing or crediting scientific software, although emerging publication strategies such as software journals are attempting to address the need. We are piloting an integrated approach in which authors are identified and credited as scientific software is developed and run. Successful software citation requires integration with the scholarly publication and indexing mechanisms as well, to assign credit, ensure discoverability, and provide provenance for software.

  14. Java for flight software

    NASA Technical Reports Server (NTRS)

    Benowitz, E. G.; Niessner, A. F.

    2003-01-01

    We have successfully demonstrated a portion of the spacecraft attitude control and fault protection, running on a standard Java platform, and are currently in the process of taking advantage of the features provided by the RTSJ.

  15. Framework Programmable Platform for the Advanced Software Development Workstation (FPP/ASDW). Demonstration framework document. Volume 1: Concepts and activity descriptions

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Blinn, Thomas M.; Dewitte, Paul S.; Crump, John W.; Ackley, Keith A.

    1992-01-01

    The Framework Programmable Software Development Platform (FPP) is a project aimed at effectively combining tool and data integration mechanisms with a model of the software development process to provide an intelligent integrated software development environment. Guided by the model, this system development framework will take advantage of an integrated operating environment to effectively automate the management of the software development process so that costly mistakes during the development phase can be eliminated. The Advanced Software Development Workstation (ASDW) program is conducting research into the development of advanced technologies for Computer Aided Software Engineering (CASE).

  16. JTAG-based remote configuration of FPGAs over optical fibers

    DOE PAGES

    Deng, B.; Xu, H.; Liu, C.; ...

    2015-01-28

    In this study, a remote FPGA-configuration method based on JTAG extension over optical fibers is presented. The method takes advantage of commercial components and ready-to-use software such as iMPACT and does not require any hardware or software development. The method combines the advantages of the slow remote JTAG configuration and the fast local flash memory configuration. The method has been verified successfully and used in the Demonstrator of Liquid-Argon Trigger Digitization Board (LTDB) for the ATLAS liquid argon calorimeter Phase-I trigger upgrade. All components on the FPGA side are verified to meet the radiation tolerance requirements.

  17. Potential of the Cogex Software Platform to Replace Logbooks in Capstone Design Projects

    ERIC Educational Resources Information Center

    Foley, David; Charron, François; Plante, Jean-Sébastien

    2018-01-01

    Recent technologies are offering the power to share and grow knowledge and ideas in unprecedented ways. The CogEx software platform was developed to take advantage of the digital world with innovative ideas to support designers work in both industrial and academic contexts. This paper presents a qualitative study on the usage of CogEx during…

  18. Engineering intelligent tutoring systems

    NASA Technical Reports Server (NTRS)

    Warren, Kimberly C.; Goodman, Bradley A.

    1993-01-01

    We have defined an object-oriented software architecture for Intelligent Tutoring Systems (ITS's) to facilitate the rapid development, testing, and fielding of ITS's. This software architecture partitions the functionality of the ITS into a collection of software components with well-defined interfaces and execution concepts. The architecture was designed to isolate advanced technology components, partition domain dependencies, take advantage of the increased availability of commercial software packages, and reduce the risks involved in acquiring ITS's. A key component of the architecture, the Executive, is a publish-and-subscribe message-handling component that coordinates all communication between ITS components.

  19. Software and resources for computational medicinal chemistry

    PubMed Central

    Liao, Chenzhong; Sitzmann, Markus; Pugliese, Angelo; Nicklaus, Marc C

    2011-01-01

    Computer-aided drug design plays a vital role in drug discovery and development and has become an indispensable tool in the pharmaceutical industry. Computational medicinal chemists can take advantage of all kinds of software and resources in the computer-aided drug design field for the purposes of discovering and optimizing biologically active compounds. This article reviews software and other resources related to computer-aided drug design approaches, putting particular emphasis on structure-based drug design, ligand-based drug design, chemical databases and chemoinformatics tools. PMID:21707404

  20. Matpar: Parallel Extensions for MATLAB

    NASA Technical Reports Server (NTRS)

    Springer, P. L.

    1998-01-01

    Matpar is a set of client/server software that allows a MATLAB user to take advantage of a parallel computer for very large problems. The user can replace calls to certain built-in MATLAB functions with calls to Matpar functions.
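
    The usage pattern described, keeping the script unchanged apart from swapping a built-in call for a parallel equivalent, is sketched below in Python terms (the record names no specific Matpar functions, so the eigenvalue example is purely illustrative):

    ```python
    import numpy as np
    from multiprocessing import Pool

    def eig_serial(mats):
        # the "built-in" path: one core, one matrix at a time
        return [np.linalg.eigvals(m) for m in mats]

    def eig_parallel(mats):
        # the drop-in replacement: same inputs, same outputs, many workers
        with Pool() as pool:
            return pool.map(np.linalg.eigvals, mats)

    if __name__ == "__main__":
        mats = [np.random.default_rng(i).standard_normal((200, 200)) for i in range(8)]
        serial, parallel = eig_serial(mats), eig_parallel(mats)
        assert all(np.allclose(np.sort_complex(s), np.sort_complex(p))
                   for s, p in zip(serial, parallel))
        print("parallel results match the built-in path")
    ```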

  1. Taking advantage of ground data systems attributes to achieve quality results in testing software

    NASA Technical Reports Server (NTRS)

    Sigman, Clayton B.; Koslosky, John T.; Hageman, Barbara H.

    1994-01-01

    During the software development life cycle process, basic testing starts with the development team. At the end of the development process, an acceptance test is performed for the user to ensure that the deliverable is acceptable. Ideally, the delivery is an operational product with zero defects. However, the goal of zero defects is normally not achieved, only approached to varying degrees. With the emphasis on building low cost ground support systems while maintaining a quality product, a key element in the test process is simulator capability. This paper reviews the Transportable Payload Operations Control Center (TPOCC) Advanced Spacecraft Simulator (TASS) test tool that is used in the acceptance test process for unmanned satellite operations control centers. The TASS is designed to support the development, test and operational environments of the Goddard Space Flight Center (GSFC) operations control centers. The TASS uses the same basic architecture as the operations control center. This architecture is characterized by its use of distributed processing, industry standards, commercial off-the-shelf (COTS) hardware and software components, and reusable software. The TASS uses much of the same TPOCC architecture and reusable software that the operations control center developer uses. The TASS also makes use of reusable simulator software in the mission specific versions of the TASS. Very little new software needs to be developed, mainly mission specific telemetry communication and command processing software. By taking advantage of the ground data system attributes, successful software reuse for operational systems provides the opportunity to extend the reuse concept into the test area. Consistency in test approach is a major step in achieving quality results.

  2. Keeping Things Interesting: A Reuse Case Study

    NASA Astrophysics Data System (ADS)

    Troisi, V.; Swick, R.; Seufert, E.

    2006-12-01

    Software reuse has several obvious advantages. By taking advantage of the experience and skill of colleagues, one not only saves time, money, and resources but can also jump-start a project that might otherwise have floundered from the start, or not even have been possible. One of the least talked about advantages of software reuse is that it helps keep the work interesting for the developers. Reuse prevents developers from spending time and energy writing software solutions to problems that have already been solved, and frees them to concentrate on solving new problems, developing new components, and doing things that have never been done before. At the National Snow and Ice Data Center we are fortunate that our user community has some unique needs that aren't met by mainstream solutions. Consequently, we look for reuse opportunities wherever possible so we can focus on the tasks that add value for our user community. This poster offers a case study of one thread through a decade of reuse at NSIDC that has involved eight different development efforts to date.

  3. Utilizing GIS to evaluate base schedules in paratransit operations

    DOT National Transportation Integrated Search

    1999-02-02

    With ready access to street file names and inexpensive GIS software, paratransit systems can take advantage of GIS technology to evaluate base schedules on a regular basis in order to maintain system efficiency at consistently high levels. This proje...

  4. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation

    EPA Science Inventory

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  5. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation.

    EPA Science Inventory

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  6. Models for Deploying Open Source and Commercial Software to Support Earth Science Data Processing and Distribution

    NASA Astrophysics Data System (ADS)

    Yetman, G.; Downs, R. R.

    2011-12-01

    Software deployment is needed to process and distribute scientific data throughout the data lifecycle. Developing software in-house can take software development teams away from other software development projects and can require efforts to maintain the software over time. Adopting and reusing software and system modules that have been previously developed by others can reduce in-house software development and maintenance costs and can contribute to the quality of the system being developed. A variety of models are available for reusing and deploying software and systems that have been developed by others. These deployment models include open source software, vendor-supported open source software, commercial software, and combinations of these approaches. Deployment in Earth science data processing and distribution has demonstrated the advantages and drawbacks of each model. Deploying open source software offers advantages for developing and maintaining scientific data processing systems and applications. By joining an open source community that is developing a particular system module or application, a scientific data processing team can contribute to aspects of the software development without having to commit to developing the software alone. Communities of interested developers can share the work while focusing on activities that utilize in-house expertise and address internal requirements. Maintenance is also shared by members of the community. Deploying vendor-supported open source software offers similar advantages to open source software. However, by procuring the services of a vendor, the in-house team can rely on the vendor to provide, install, and maintain the software over time. Vendor-supported open source software may be ideal for teams that recognize the value of an open source software component or application and would like to contribute to the effort, but do not have the time or expertise to contribute extensively. Vendor-supported software may also have the additional benefits of guaranteed up-time, bug fixes, and vendor-added enhancements. Deploying commercial software can be advantageous for obtaining system or software components offered by a vendor that meet in-house requirements. The vendor can be contracted to provide installation, support and maintenance services as needed. Combining these options offers a menu of choices, enabling selection of system components or software modules that meet the evolving requirements encountered throughout the scientific data lifecycle.

  7. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation (presentation)

    EPA Science Inventory

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  8. Framework for ReSTful Web Services in OSGi

    NASA Technical Reports Server (NTRS)

    Shams, Khawaja S.; Norris, Jeffrey S.; Powell, Mark W.; Crockett, Thomas M.; Mittman, David S.; Fox, Jason M.; Joswig, Joseph C.; Wallick, Michael N.; Torres, Recaredo J.; Rabe, Kenneth

    2009-01-01

    Ensemble ReST is a software system that eases the development, deployment, and maintenance of server-side application programs to perform functions that would otherwise be performed by client software. Ensemble ReST takes advantage of the proven disciplines of ReST (Representational State Transfer). ReST leverages the standardized HTTP protocol to enable developers to offer services to a diverse variety of clients: from shell scripts to sophisticated Java application suites.
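
    The ReST discipline the record leans on, offering a service over plain HTTP so that any client from a shell script to a Java suite can consume it, can be sketched with the Python standard library (the /status resource and its JSON payload are invented for illustration, not Ensemble ReST's actual interface):

    ```python
    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class RestHandler(BaseHTTPRequestHandler):
        """Expose one read-only resource over standard HTTP GET."""

        def do_GET(self):
            if self.path == "/status":
                body = json.dumps({"service": "demo", "state": "ok"}).encode()
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.send_header("Content-Length", str(len(body)))
                self.end_headers()
                self.wfile.write(body)
            else:
                self.send_error(404)

    if __name__ == "__main__":
        # any HTTP client can now consume the service, e.g. `curl localhost:8000/status`
        HTTPServer(("localhost", 8000), RestHandler).serve_forever()
    ```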

  9. Neutron Scattering Announcements

    Science.gov Websites

    will be added. We encourage everyone interested in neutron scattering to take full advantage of this neutron source ESS. After an initial layout phase using analytical considerations further assessment of...

  10. Improved Foundry Castings Utilizing CAD/CAM (Computer Aided Design/ Computer Aided Manufacture). Volume 1. Overview

    DTIC Science & Technology

    1988-06-30

    writing new casting analysis and design routines. The new routines would take advantage of advanced criteria for predicting casting soundness and cast...properties and technical advances in computer hardware and software. UPCAST, a comprehensive software package, has been developed for...

  11. GRO/EGRET data analysis software: An integrated system of custom and commercial software using standard interfaces

    NASA Technical Reports Server (NTRS)

    Laubenthal, N. A.; Bertsch, D.; Lal, N.; Etienne, A.; Mcdonald, L.; Mattox, J.; Sreekumar, P.; Nolan, P.; Fierro, J.

    1992-01-01

    The Energetic Gamma Ray Telescope Experiment (EGRET) on the Compton Gamma Ray Observatory has been in orbit for more than a year and is being used to map the full sky for gamma rays in a wide energy range from 30 to 20,000 MeV. Already these measurements have resulted in a wide range of exciting new information on quasars, pulsars, galactic sources, and diffuse gamma ray emission. The central part of the analysis is done with sky maps that typically cover an 80 x 80 degree section of the sky for an exposure time of several days. Specific software developed for this program generates the counts, exposure, and intensity maps. The analysis is done on a network of UNIX based workstations and takes full advantage of a custom-built user interface called X-dialog. The maps that are generated are stored in the FITS format for a collection of energies. These, along with similar diffuse emission background maps generated from a model calculation, serve as input to a maximum likelihood program that produces maps of likelihood with optional contours that are used to evaluate regions for sources. Likelihood also evaluates the background corrected intensity at each location for each energy interval from which spectra can be generated. Being in a standard FITS format permits all of the maps to be easily accessed by the full complement of tools available in several commercial astronomical analysis systems. In the EGRET case, IDL is used to produce graphics plots in two and three dimensions and to quickly implement any special evaluation that might be desired. Other custom-built software, such as the spectral and pulsar analyses, takes advantage of the XView toolkit for display and Postscript output for the color hard copy. This poster paper outlines the data flow and provides examples of the user interfaces and output products. It stresses the advantages that are derived from the integration of the specific instrument-unique software and powerful commercial tools for graphics and statistical evaluation. This approach has several proven advantages including flexibility, a minimum of development effort, ease of use, and portability.
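
    The portability argument rests on FITS being a self-describing standard: any FITS-aware tool can open the maps. A minimal sketch using the astropy library (assumed installed; the file name counts.fits is hypothetical):

    ```python
    import numpy as np
    from astropy.io import fits  # a standard FITS reader, independent of instrument software

    with fits.open("counts.fits") as hdul:   # hypothetical counts map
        counts = hdul[0].data                # image array from the primary HDU
        header = hdul[0].header              # coordinate and exposure metadata travel with the data

    print(header.get("TELESCOP"), counts.shape, np.nansum(counts))
    ```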

  12. Introducing a New Software for Geodetic Analysis

    NASA Astrophysics Data System (ADS)

    Hjelle, Geir Arne; Dähnn, Michael; Fausk, Ingrid; Kirkvik, Ann-Silje; Mysen, Eirik

    2017-04-01

    At the Norwegian Mapping Authority, we are currently developing Where, a new software package for geodetic analysis. Where is built on our experiences with the Geosat software, and will be able to analyse and combine data from VLBI, SLR, GNSS and DORIS. The software is mainly written in Python, which has proved very fruitful. The code is quick to write and the architecture is easily extendable and maintainable, while at the same time taking advantage of well-tested code like the SOFA and IERS libraries. This presentation will show some of the current capabilities of Where, including benchmarks against other software packages, and outline our plans for further progress. In addition we will report on some investigations we have done experimenting with alternative weighting strategies for VLBI.

  13. Extend Instruction outside the Classroom: Take Advantage of Your Learning Management System

    ERIC Educational Resources Information Center

    Jensen, Lauren A.

    2010-01-01

    Numerous institutions of higher education have implemented a learning management system (LMS) or are considering doing so. This web-based software package provides self-service and quick (often personalized) access to content in a dynamic environment. Learning management systems support administrative, reporting, and documentation activities. LMSs…

  14. Software Should be Written by Writers.

    ERIC Educational Resources Information Center

    Sheridan, James

    1983-01-01

    Considering the computer a collaborator rather than a machine, the author encourages those in the humanities and the arts to take advantage of the great potential that artificial intelligence can offer. Stresses that unless deliberately restricted, the computer is an inherently interdisciplinary medium, and capable of interacting with any…

  15. Neutron Scattering Home Page (Low-Graphics)

    Science.gov Websites

    will be added. We encourage everyone interested in neutron scattering to take full advantage of this... A new portal for neutron scattering has just been established

  16. The Market for Educational Software.

    ERIC Educational Resources Information Center

    Harvey, James, Ed.

    This report summarizes one of a series of workshops organized by RAND's Critical Technologies Institute, on behalf of the U.S. Department of Education, to take advantage of the experience of those already implementing new technologies in the schools. The workshop consisted chiefly of dialogues with educators and experts from the private sector who…

  17. CINDA-3G: Improved Numerical Differencing Analyzer Program for Third-Generation Computers

    NASA Technical Reports Server (NTRS)

    Gaski, J. D.; Lewis, D. R.; Thompson, L. R.

    1970-01-01

    The goal of this work was to develop a new and versatile program to supplement or replace the original Chrysler Improved Numerical Differencing Analyzer (CINDA) thermal analyzer program in order to take advantage of the improved systems software and machine speeds of the third-generation computers.

  18. Towards cheaper control centers

    NASA Technical Reports Server (NTRS)

    Baize, Lionel

    1994-01-01

    Today, any approach to the design of new space systems must take into consideration an important constraint, namely costs. This approach is our guideline for new missions and also applies to the ground segment, and particularly to the control center. CNES has carried out a study on a recent control center for application satellites in order to take advantage of the experience gained. This analysis, the purpose of which is to determine, a posteriori, the costs of architecture needs and choices, takes hardware and software costs into account and makes a number of recommendations.

  19. Development of a Next Generation Concurrent Framework for the ATLAS Experiment

    NASA Astrophysics Data System (ADS)

    Calafiura, P.; Lampl, W.; Leggett, C.; Malon, D.; Stewart, G.; Wynne, B.

    2015-12-01

    The ATLAS experiment has successfully used its Gaudi/Athena software framework for data taking and analysis during the first LHC run, with billions of events successfully processed. However, the design of Gaudi/Athena dates from early 2000 and the software and the physics code has been written using a single threaded, serial design. This programming model has increasing difficulty in exploiting the potential of current CPUs, which offer their best performance only through taking full advantage of multiple cores and wide vector registers. Future CPU evolution will intensify this trend, with core counts increasing and memory per core falling. With current memory consumption for 64 bit ATLAS reconstruction in a high luminosity environment approaching 4GB, it will become impossible to fully occupy all cores in a machine without exhausting available memory. However, since maximizing performance per watt will be a key metric, a mechanism must be found to use all cores as efficiently as possible. In this paper we report on our progress with a practical demonstration of the use of multithreading in the ATLAS reconstruction software, using the GaudiHive framework. We have expanded support to Calorimeter, Inner Detector, and Tracking code, discussing what changes were necessary in order to allow the serially designed ATLAS code to run, both to the framework and to the tools and algorithms used. We report on both the performance gains, and what general lessons were learned about the code patterns that had been employed in the software and which patterns were identified as particularly problematic for multi-threading. We also present our findings on implementing a hybrid multi-threaded / multi-process framework, to take advantage of the strengths of each type of concurrency, while avoiding some of their corresponding limitations.
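
    The hybrid multi-threaded / multi-process idea, a few worker processes each running several threads so that heavyweight state is shared within a process but not duplicated per event, can be sketched with Python's standard library (worker counts and the per-event work are illustrative; the framework itself is C++):

    ```python
    from concurrent.futures import ProcessPoolExecutor, ThreadPoolExecutor

    def reconstruct(event_id: int) -> int:
        return event_id * event_id       # stand-in for per-event reconstruction work

    def process_chunk(chunk):
        # each process runs its own thread pool, so memory-heavy state
        # (geometry, conditions data) is shared by threads, not duplicated per event
        with ThreadPoolExecutor(max_workers=4) as threads:
            return list(threads.map(reconstruct, chunk))

    if __name__ == "__main__":
        events = list(range(100))
        chunks = [events[i:i + 25] for i in range(0, len(events), 25)]
        with ProcessPoolExecutor(max_workers=4) as procs:   # the multi-process layer
            results = [r for part in procs.map(process_chunk, chunks) for r in part]
        print(len(results), "events processed")
    ```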

  20. Software Process Improvement: Supporting the Linking of the Software and the Business Strategies

    NASA Astrophysics Data System (ADS)

    Albuquerque, Adriano Bessa; Rocha, Ana Regina; Lima, Andreia Cavalcanti

    The market is becoming more and more competitive, many products and services depend on the software product, and software is one of the most important assets influencing organizations' businesses. In this context, companies must deal carefully with software, whether developing or acquiring it. One perspective that can help take advantage of software, effectively supporting the business, is to invest in the organization's software processes. This paper presents an approach to evaluate and improve the process assets of software organizations, based on internationally well-known standards and process models. This approach is supported by automated tools from the TABA Workstation and is part of a wider improvement strategy consisting of three layers (organizational layer, process execution layer and external entity layer). Moreover, this paper presents the experience of its use and the results obtained.

  1. RELAP-7 Software Verification and Validation Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Curtis L.; Choi, Yong-Joon; Zou, Ling

    This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on the INL’s modern scientific software development framework – MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5’s capability and extends the analysis capability for all reactor system simulation scenarios.

  2. Framework Programmable Platform for the Advanced Software Development Workstation: Preliminary system design document

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Ackley, Keith A.; Crump, John W., IV; Henderson, Richard; Futrell, Michael T.

    1991-01-01

    The Framework Programmable Software Development Platform (FPP) is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software environment. Guided by the model, this system development framework will take advantage of an integrated operating environment to effectively automate the management of the software development process so that costly mistakes during the development phase can be eliminated. The focus here is on the design of components that make up the FPP. These components serve as supporting systems for the Integration Mechanism and the Framework Processor and provide the 'glue' that ties the FPP together. Also discussed are the components that allow the platform to operate in a distributed, heterogeneous environment and to manage the development and evolution of software system artifacts.

  3. Implementing a modeling software for animated protein-complex interactions using a physics simulation library.

    PubMed

    Ueno, Yutaka; Ito, Shuntaro; Konagaya, Akihiko

    2014-12-01

    To better understand the behaviors and structural dynamics of proteins within a cell, novel software tools are being developed that can create molecular animations based on the findings of structural biology. This study proposes a method, developed from our prototypes, to detect collisions and examine the soft-body dynamics of molecular models. The code was implemented with a software development toolkit for rigid-body dynamics simulation and a three-dimensional graphics library. The essential functions of the target software system included the basic molecular modeling environment, collision detection in the molecular models, and physical simulations of the movement of the model. Taking advantage of recent software technologies such as physics simulation modules and interpreted scripting languages, the functions required for accurate and meaningful molecular animation were implemented efficiently.
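
    At its simplest, the collision detection named above reduces to sphere-sphere overlap tests between atoms. A toy sketch (coordinates and radii are invented; production engines replace the all-pairs loop with spatial partitioning):

    ```python
    import itertools
    import math

    # toy "atoms": (x, y, z) centers with van-der-Waals-like radii, values invented
    atoms = [((0.0, 0.0, 0.0), 1.7), ((2.0, 0.5, 0.0), 1.5), ((9.0, 9.0, 9.0), 1.2)]

    def colliding(a, b):
        (pa, ra), (pb, rb) = a, b
        return math.dist(pa, pb) < ra + rb   # overlap iff centers are closer than summed radii

    for a, b in itertools.combinations(atoms, 2):
        if colliding(a, b):
            print("collision between", a[0], "and", b[0])
    ```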

  4. Genten: Software for Generalized Tensor Decompositions v. 1.0.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phipps, Eric T.; Kolda, Tamara G.; Dunlavy, Daniel

    Tensors, or multidimensional arrays, are a powerful mathematical means of describing multiway data. This software provides computational means for decomposing or approximating a given tensor in terms of smaller tensors of lower dimension, focusing on decomposition of large, sparse tensors. These techniques have applications in many scientific areas, including signal processing, linear algebra, computer vision, numerical analysis, data mining, graph analysis, neuroscience and more. The software is designed to take advantage of parallelism present in emerging computer architectures such as multi-core CPUs, many-core accelerators such as the Intel Xeon Phi, and computation-oriented GPUs to enable efficient processing of large tensors.
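
    The decomposition in question is the CP model, which approximates a tensor by a sum of rank-one terms. A compact dense NumPy sketch of the alternating-least-squares fit, for illustration only; Genten itself targets large sparse tensors on parallel hardware:

    ```python
    import numpy as np

    def khatri_rao(U, V):
        # column-wise Kronecker product: row (i, j) holds U[i, :] * V[j, :]
        return np.einsum('ir,jr->ijr', U, V).reshape(U.shape[0] * V.shape[0], U.shape[1])

    def cp_als(X, rank, iters=100):
        """Fit a CP model to a dense 3-way tensor by alternating least squares."""
        I, J, K = X.shape
        rng = np.random.default_rng(0)
        A, B, C = (rng.standard_normal((n, rank)) for n in (I, J, K))
        X0 = X.reshape(I, J * K)                      # mode-0 unfolding
        X1 = np.moveaxis(X, 1, 0).reshape(J, I * K)   # mode-1 unfolding
        X2 = np.moveaxis(X, 2, 0).reshape(K, I * J)   # mode-2 unfolding
        for _ in range(iters):
            A = X0 @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
            B = X1 @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
            C = X2 @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
        return A, B, C

    # recover a synthetic rank-3 tensor and report the relative fit error
    rng = np.random.default_rng(1)
    G, H, W = (rng.standard_normal((n, 3)) for n in (4, 5, 6))
    X = np.einsum('ir,jr,kr->ijk', G, H, W)
    A, B, C = cp_als(X, rank=3)
    print(np.linalg.norm(X - np.einsum('ir,jr,kr->ijk', A, B, C)) / np.linalg.norm(X))
    ```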

  5. Using Web Speech Technology with Language Learning Applications

    ERIC Educational Resources Information Center

    Daniels, Paul

    2015-01-01

    In this article, the author presents the history of human-to-computer interaction based upon the design of sophisticated computerized speech recognition algorithms. Advancements such as the arrival of cloud-based computing and software like Google's Web Speech API allow anyone with an Internet connection and Chrome browser to take advantage of…

  6. Taking the "Total Cost of Ownership" Concept to the Classroom.

    ERIC Educational Resources Information Center

    Fitzgerald, Sara

    2001-01-01

    Suggests school leaders must understand the total cost of ownership (TCO)-all of the costs involved with installing, operating, and maintaining computers-if they are going to use them to full advantage and cost-effectively. Discusses the major components of TCO after initial hardware investment (professional development, software, support, and…

  7. Managing Contracts For Educational Equity: Emerging Trends and Issues

    ERIC Educational Resources Information Center

    Burch, Patricia

    2010-01-01

    The past decade has witnessed an unprecedented expansion of the influence of the private sector in all aspects of public education. Across the United States, test publishers, software companies, virtual charter school operators, and other industries are rapidly moving to take advantage of the significant revenues made available by public policies.…

  8. Infrastructure and the Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Dowler, P.; Gaudet, S.; Schade, D.

    2011-07-01

    The modern data center is faced with architectural and software engineering challenges that grow along with the challenges facing observatories: massive data flow, distributed computing environments, and distributed teams collaborating on large and small projects. By using VO standards as key components of the infrastructure, projects can take advantage of a decade of intellectual investment by the IVOA community. By their nature, these standards are proven and tested designs that already exist. Adopting VO standards saves considerable design effort, allows projects to take advantage of open-source software and test suites to speed development, and enables the use of third party tools that understand the VO protocols. The evolving CADC architecture now makes heavy use of VO standards. We show examples of how these standards may be used directly, coupled with non-VO standards, or extended with custom capabilities to solve real problems and provide value to our users. In the end, we use VO services as major parts of the core infrastructure to reduce cost rather than as an extra layer with additional cost, and we can deliver more general-purpose and robust services to our user community.

  9. A new approach for instrument software at Gemini

    NASA Astrophysics Data System (ADS)

    Gillies, Kim; Nunez, Arturo; Dunn, Jennifer

    2008-07-01

    Gemini Observatory is now developing its next generation of astronomical instruments, the Aspen instruments. These new instruments are sophisticated and costly, requiring large, distributed, collaborative teams. Instrument software groups often include experienced team members with existing mature code. Gemini has taken its experience from the previous generation of instruments and current hardware and software technology to create an approach for developing instrument software that takes advantage of the strengths of our instrument builders and our own operations needs. This paper describes this new software approach that couples a lightweight infrastructure and software library with aspects of modern agile software development. The Gemini Planet Imager instrument project, which is currently approaching its critical design review, is used to demonstrate aspects of this approach. New facilities under development will face similar issues in the future, and the approach presented here can be applied to other projects.

  10. pcircle - A Suite of Scalable Parallel File System Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    WANG, FEIYI

    2015-10-01

    Most software related to file systems is written for conventional local file systems; it is serialized and cannot take advantage of a large-scale parallel file system. The "pcircle" software builds on top of MPI, which is ubiquitous in cluster computing environments, and a "work-stealing" pattern to provide a scalable, high-performance suite of file system tools. In particular, it implements parallel data copying and parallel data checksumming, with advanced features such as asynchronous progress reporting, checkpoint and restart, as well as integrity checking.
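
    The parallel-checksumming idea, hashing many files concurrently and folding the per-file digests into one signature, can be sketched with the standard library (pcircle's real MPI work-stealing scheduler is more elaborate than the fixed worker pool shown here):

    ```python
    import hashlib
    from multiprocessing import Pool
    from pathlib import Path

    def file_digest(path: str) -> tuple[str, str]:
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for block in iter(lambda: f.read(1 << 20), b""):   # stream in 1 MiB blocks
                h.update(block)
        return path, h.hexdigest()

    if __name__ == "__main__":
        files = [str(p) for p in Path(".").rglob("*") if p.is_file()]
        with Pool() as pool:                 # one worker per core; pcircle uses MPI ranks
            digests = sorted(pool.map(file_digest, files))
        # order-independent tree signature: hash of the sorted per-file digests
        tree = hashlib.sha256("".join(d for _, d in digests).encode()).hexdigest()
        print(tree)
    ```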

  11. OSI for hardware/software interoperability

    NASA Astrophysics Data System (ADS)

    Wood, Richard J.; Harvey, Donald L.; Linderman, Richard W.; Gardener, Gary A.; Capraro, Gerard T.

    1994-03-01

    There is a need in public safety for real-time data collection and transmission from one or more sensors. The Rome Laboratory and the Ballistic Missile Defense Organization are pursuing an effort to bring the benefits of Open System Architectures (OSA) to embedded systems within the Department of Defense. When developed properly, OSA provides interoperability, commonality, graceful upgradeability, survivability, and hardware/software transportability, greatly reducing life cycle, integration, and supportability costs. Architecture flexibility can be achieved to take advantage of commercial accomplishments by basing these developments on vendor-neutral commercially accepted standards and protocols.

  12. Yearbook Production: Yearbook Staffs Can Now "Blame" Strengths, Weaknesses on Computer as They Take More Control of Their Publications.

    ERIC Educational Resources Information Center

    Hall, H. L.

    1988-01-01

    Reports on the advantages and disadvantages of desktop publishing, using the Apple Macintosh and "Pagemaker" software, to produce a high school yearbook. Asserts that while desktop publishing may be initially more time consuming for those unfamiliar with computers, desktop publishing gives high school journalism staffs more control over…

  13. Chrysler improved numerical differencing analyzer for third generation computers CINDA-3G

    NASA Technical Reports Server (NTRS)

    Gaski, J. D.; Lewis, D. R.; Thompson, L. R.

    1972-01-01

    A new and versatile method has been developed to supplement or replace use of the original CINDA thermal analyzer program in order to take advantage of the improved systems software and machine speeds of third-generation computers. CINDA-3G program options offer a variety of methods for the solution of thermal analog models presented in network format.

  14. Top 10 Threats to Computer Systems Include Professors and Students

    ERIC Educational Resources Information Center

    Young, Jeffrey R.

    2008-01-01

    User awareness is growing in importance when it comes to computer security. Not long ago, keeping college networks safe from cyberattackers mainly involved making sure computers around campus had the latest software patches. New computer worms or viruses would pop up, taking advantage of some digital hole in the Windows operating system or in…

  15. Bayesian Inference: with ecological applications

    USGS Publications Warehouse

    Link, William A.; Barker, Richard J.

    2010-01-01

    This text provides a mathematically rigorous yet accessible and engaging introduction to Bayesian inference with relevant examples that will be of interest to biologists working in the fields of ecology, wildlife management and environmental studies as well as students in advanced undergraduate statistics. This text opens the door to Bayesian inference, taking advantage of modern computational efficiencies and easily accessible software to evaluate complex hierarchical models.
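
    As a taste of the computation such software automates, the conjugate beta-binomial model has a closed-form posterior: a Beta(a, b) prior on a probability becomes Beta(a + k, b + n - k) after observing k successes in n trials. A minimal sketch assuming SciPy is available (the survey numbers are invented):

    ```python
    from scipy import stats

    a, b = 1.0, 1.0      # uniform Beta(1, 1) prior on a detection probability
    n, k = 30, 12        # hypothetical survey: 12 detections in 30 visits

    posterior = stats.beta(a + k, b + n - k)   # conjugate update, no sampling needed
    print(posterior.mean())                    # posterior mean, about 0.406
    print(posterior.interval(0.95))            # central 95% credible interval
    ```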

  16. Biotechnology software in the digital age: are you winning?

    PubMed

    Scheitz, Cornelia Johanna Franziska; Peck, Lawrence J; Groban, Eli S

    2018-01-16

    There is a digital revolution taking place, and biotechnology companies are slow to adapt. Many pharmaceutical, biotechnology, and industrial bio-production companies believe that software must be developed and maintained in-house and that data are more secure on internal servers than on the cloud. In fact, most companies in this space continue to employ large IT and software teams and acquire computational infrastructure in the form of in-house servers. This is due to a fear of the cloud not sufficiently protecting in-house resources and the belief that their software is valuable IP. Over the next decade, the ability to quickly adapt to changing market conditions, with agile software teams, will become a compelling competitive advantage. Biotechnology companies that do not adopt the new regime may lose on key business metrics such as return on invested capital, revenue, profitability, and eventually market share.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pachuilo, Andrew R; Ragan, Eric; Goodall, John R

    Visualization tools can take advantage of multiple coordinated views to support analysis of large, multidimensional data sets. Effective design of such views and layouts can be challenging, but understanding users' analysis strategies can inform design improvements. We outline an approach for intelligent design configuration of visualization tools with multiple coordinated views, and we discuss a proposed software framework to support the approach. The proposed software framework could capture and learn from user interaction data to automate new compositions of views and widgets. Such a framework could reduce the time needed for meta-analysis of visualization use and lead to more effective visualization design.

  18. Evaluation of the discrete vortex wake cross flow model using vector computers. Part 1: Theory and application

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The objective of the current program was to modify a discrete vortex wake method to efficiently compute the aerodynamic forces and moments on high fineness ratio bodies (f approximately 10.0). The approach is to increase computational efficiency by structuring the program to take advantage of new computer vector software and by developing new algorithms when vector software cannot be used efficiently. An efficient program was written and substantial savings achieved. Several test cases were run for fineness ratios up to f = 16.0 and angles of attack up to 50 degrees.
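
    The restructuring described, recasting element-by-element loops as whole-array operations that vector hardware can stream through, looks like this in modern NumPy terms (the pairwise induced-speed kernel is illustrative, not the report's actual algorithm):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    pts = rng.standard_normal((200, 2))     # field points
    vort = rng.standard_normal((100, 2))    # discrete vortex positions
    gamma = rng.standard_normal(100)        # vortex strengths

    def induced_speed_loop(pts, vort, gamma):
        out = np.zeros(len(pts))
        for i, p in enumerate(pts):                 # scalar code: one pairwise term at a time
            for v, g in zip(vort, gamma):
                out[i] += g / (np.linalg.norm(p - v) + 1e-9)
        return out

    def induced_speed_vec(pts, vort, gamma):
        # vectorized: every point-vortex distance in one (200, 100) array operation
        d = np.linalg.norm(pts[:, None, :] - vort[None, :, :], axis=-1)
        return (gamma / (d + 1e-9)).sum(axis=1)

    assert np.allclose(induced_speed_loop(pts, vort, gamma),
                       induced_speed_vec(pts, vort, gamma))
    ```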

  19. Supporting geoscience with graphical-user-interface Internet tools for the Macintosh

    NASA Astrophysics Data System (ADS)

    Robin, Bernard

    1995-07-01

    This paper describes a suite of Macintosh graphical-user-interface (GUI) software programs that can be used in conjunction with the Internet to support geoscience education. These software programs allow science educators to access and retrieve a large body of resources from an increasing number of network sites, taking advantage of the intuitive, simple-to-use Macintosh operating system. With these tools, educators easily can locate, download, and exchange not only text files but also sound resources, video movie clips, and software application files from their desktop computers. Another major advantage of these software tools is that they are available at no cost and may be distributed freely. The following GUI software tools are described including examples of how they can be used in an educational setting: ∗ Eudora—an e-mail program ∗ NewsWatcher—a newsreader ∗ TurboGopher—a Gopher program ∗ Fetch—a software application for easy File Transfer Protocol (FTP) ∗ NCSA Mosaic—a worldwide hypertext browsing program. An explosive growth of online archives currently is underway as new electronic sites are being added continuously to the Internet. Many of these resources may be of interest to science educators who learn they can share not only ASCII text files, but also graphic image files, sound resources, QuickTime movie clips, and hypermedia projects with colleagues from locations around the world. These powerful, yet simple to learn GUI software tools are providing a revolution in how knowledge can be accessed, retrieved, and shared.

  20. Tevatron beam position monitor upgrade

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wolbers, Stephen; Banerjee, B.; Barker, B.

    2005-05-01

    The Tevatron Beam Position Monitor (BPM) readout electronics and software have been upgraded to improve measurement precision, functionality and reliability. The original system, designed and built in the early 1980's, became inadequate for current and future operations of the Tevatron. The upgraded system consists of 960 channels of new electronics to process analog signals from 240 BPMs, new front-end software, new online and controls software, and modified applications to take advantage of the improved measurements and support the new functionality. The new system reads signals from both ends of the existing directional stripline pickups to provide simultaneous proton and antiproton position measurements. Measurements using the new system are presented that demonstrate its improved resolution and overall performance.

  1. Aspect-Oriented Model-Driven Software Product Line Engineering

    NASA Astrophysics Data System (ADS)

    Groher, Iris; Voelter, Markus

    Software product line engineering aims to reduce development time, effort, cost, and complexity by taking advantage of the commonality within a portfolio of similar products. The effectiveness of a software product line approach directly depends on how well feature variability within the portfolio is implemented and managed throughout the development lifecycle, from early analysis through maintenance and evolution. This article presents an approach that facilitates variability implementation, management, and tracing by integrating model-driven and aspect-oriented software development. Features are separated in models and composed using aspect-oriented composition techniques at the model level. Model transformations support the transition from problem to solution space models. Aspect-oriented techniques enable the explicit expression and modularization of variability at the model, template, and code levels. The presented concepts are illustrated with a case study of a home automation system.

  2. A Simplified Mesh Deformation Method Using Commercial Structural Analysis Software

    NASA Technical Reports Server (NTRS)

    Hsu, Su-Yuen; Chang, Chau-Lyan; Samareh, Jamshid

    2004-01-01

    Mesh deformation in response to redefined or moving aerodynamic surface geometries is a frequently encountered task in many applications. Most existing methods are either mathematically too complex or computationally too expensive for use in practical design and optimization. We propose a simplified mesh deformation method based on linear elastic finite element analyses that can be easily implemented by using commercially available structural analysis software. Using a prescribed displacement at the mesh boundaries, a simple structural analysis is constructed based on a spatially varying Young's modulus to move the entire mesh in accordance with the surface geometry redefinitions. A variety of surface movements, such as translation, rotation, or incremental surface reshaping that often takes place in an optimization procedure, may be handled by the present method. We describe the numerical formulation and implementation using the NASTRAN software in this paper. The use of commercial software bypasses tedious reimplementation and takes advantage of the computational efficiency offered by the vendor. A two-dimensional airfoil mesh and a three-dimensional aircraft mesh were used as test cases to demonstrate the effectiveness of the proposed method. Euler and Navier-Stokes calculations were performed for the deformed two-dimensional meshes.
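
    The method's core idea, treating the mesh as an elastic solid whose stiffness varies in space and solving for interior displacements under prescribed boundary motion, can be shown in one dimension with a chain of springs (the stiffness profile and boundary values are invented):

    ```python
    import numpy as np

    n = 11                                  # nodes along one mesh line
    k = np.linspace(10.0, 1.0, n - 1)       # stiffer near the moving surface (node 0)

    # assemble the 1-D elastic system K u = 0 with displacements prescribed at both ends
    K = np.zeros((n, n))
    for i, ki in enumerate(k):              # each spring couples nodes i and i+1
        K[i, i] += ki; K[i + 1, i + 1] += ki
        K[i, i + 1] -= ki; K[i + 1, i] -= ki

    u = np.zeros(n)
    u[0], u[-1] = 1.0, 0.0                  # surface moves by 1, far boundary held fixed
    inner = np.arange(1, n - 1)
    # condense out the boundary: solve K_II u_I = -K_IB u_B for the interior nodes
    u[inner] = np.linalg.solve(K[np.ix_(inner, inner)],
                               -K[np.ix_(inner, [0, n - 1])] @ u[[0, n - 1]])
    print(np.round(u, 3))                   # stiff region near the surface moves almost rigidly
    ```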

  3. BarraCUDA - a fast short read sequence aligner using graphics processing units

    PubMed Central

    2012-01-01

    Background With the maturation of next-generation DNA sequencing (NGS) technologies, the throughput of DNA sequencing reads has soared to over 600 gigabases from a single instrument run. General purpose computing on graphics processing units (GPGPU) extracts the computing power from hundreds of parallel stream processors within graphics processing cores and provides a cost-effective and energy efficient alternative to traditional high-performance computing (HPC) clusters. In this article, we describe the implementation of BarraCUDA, a GPGPU sequence alignment software that is based on BWA, to accelerate the alignment of sequencing reads generated by these instruments to a reference DNA sequence. Findings Using the NVIDIA Compute Unified Device Architecture (CUDA) software development environment, we ported the most computationally intensive alignment component of BWA to GPU to take advantage of the massive parallelism. As a result, BarraCUDA offers a magnitude of performance boost in alignment throughput when compared to a CPU core while delivering the same level of alignment fidelity. The software is also capable of supporting multiple CUDA devices in parallel to further accelerate the alignment throughput. Conclusions BarraCUDA is designed to take advantage of the parallelism of GPU to accelerate the alignment of millions of sequencing reads generated by NGS instruments. By doing this, we could, at least in part, streamline the current bioinformatics pipeline such that the wider scientific community could benefit from the sequencing technology. BarraCUDA is currently available from http://seqbarracuda.sf.net PMID:22244497
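
    The porting strategy amounts to expressing per-read scoring as identical arithmetic over thousands of reads at once so that stream processors stay busy. A CPU-side caricature in NumPy, with Hamming distance standing in for BWA's far more involved inexact-match search:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    ref = rng.integers(0, 4, size=2_000, dtype=np.int8)          # reference bases coded 0..3
    reads = rng.integers(0, 4, size=(500, 36), dtype=np.int8)    # many short reads

    def best_hits(reads, ref):
        L = reads.shape[1]
        # every reference window of read length, as a (positions, L) view
        windows = np.lib.stride_tricks.sliding_window_view(ref, L)
        # one mismatch count per (read, position): the whole block is data-parallel
        mism = (reads[:, None, :] != windows[None, :, :]).sum(axis=2)
        return mism.argmin(axis=1), mism.min(axis=1)

    pos, score = best_hits(reads, ref)
    print(pos[:3], score[:3])   # best alignment position and mismatch count per read
    ```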

  4. Software Solution of Web Questionnaires for the Analysis of the Economy in Relation to the Competence of Students

    ERIC Educational Resources Information Center

    Simeunovic, Vlado; Milic, Sanja

    2018-01-01

    The basic idea of the research was to take advantage of IT and establish a direct contact between businesses (employers) and universities in order to exchange relevant data on the knowledge, skills and competencies of students who got their first job in the economy. We used the best practices from previous papers that dealt with designing web…

  5. Data-Driven Software Framework for Web-Based ISS Telescience

    NASA Technical Reports Server (NTRS)

    Tso, Kam S.

    2005-01-01

    Software that enables authorized users to monitor and control scientific payloads aboard the International Space Station (ISS) from diverse terrestrial locations equipped with Internet connections is undergoing development. This software reflects a data-driven approach to distributed operations. A Web-based software framework leverages prior developments in Java and Extensible Markup Language (XML) to create portable code and portable data, to which one can gain access via Web-browser software on almost any common computer. Open-source software is used extensively to minimize cost; the framework also accommodates enterprise-class server software to satisfy needs for high performance and security. To accommodate the diversity of ISS experiments and users, the framework emphasizes openness and extensibility. Users can take advantage of available viewer software to create their own client programs according to their particular preferences, and can upload these programs for custom processing of data, generation of views, and planning of experiments. The same software system, possibly augmented with a subset of data and additional software tools, could be used for public outreach by enabling public users to replay telescience experiments, conduct their experiments with simulated payloads, and create their own client programs and other custom software.

  6. Making software get along: integrating optical and mechanical design programs

    NASA Astrophysics Data System (ADS)

    Shackelford, Christie J.; Chinnock, Randal B.

    2001-03-01

    As modern optomechanical engineers, we have the good fortune of having very sophisticated software programs available to us. The current optical design, mechanical design, industrial design, and CAM programs are very powerful tools with some very desirable features. However, no one program can do everything necessary to complete an entire optomechanical system design. Each program has a unique set of features and benefits, and typically two or more will be used during the product development process. At a minimum, an optical design program and a mechanical CAD package will be employed. As we strive for efficient, cost-effective, and rapid progress in our development projects, we must use these programs to their full advantage, while keeping redundant tasks to a minimum. Together, these programs offer the promise of a 'seamless' flow of data from concept all the way to the download of part designs directly to the machine shop for fabrication. In reality, transferring data from one software package to the next is often frustrating. Overcoming these problems takes some know-how, a bit of creativity, and a lot of persistence. This paper describes a complex optomechanical development effort in which a variety of software tools were used from the concept stage to prototyping. It will describe what software was used for each major design task, how we learned to use them together to best advantage, and how we overcame the frustrations of software that didn't get along.

  7. Reuse at the Software Productivity Consortium

    NASA Technical Reports Server (NTRS)

    Weiss, David M.

    1989-01-01

    The Software Productivity Consortium is sponsored by 14 aerospace companies as a developer of software engineering methods and tools. Software reuse and prototyping are currently the major emphasis areas. The Methodology and Measurement Project in the Software Technology Exploration Division has developed some concepts for reuse which they intend to develop into a synthesis process. They have identified two approaches to software reuse: opportunistic and systematic. The assumptions underlying the systematic approach, phrased as hypotheses, are the following: the redevelopment hypothesis, i.e., software developers solve the same problems repeatedly; the oracle hypothesis, i.e., developers are able to predict variations from one redevelopment to others; and the organizational hypothesis, i.e., software must be organized according to behavior and structure to take advantage of the predictions that the developers make. The conceptual basis for reuse includes: program families, information hiding, abstract interfaces, uses and information hiding hierarchies, and process structure. The primary reusable software characteristics are black-box descriptions, structural descriptions, and composition and decomposition based on program families. Automated support can be provided for systematic reuse, and the Consortium is developing a prototype reuse library and guidebook. The software synthesis process that the Consortium is aiming toward includes modeling, refinement, prototyping, reuse, assessment, and new construction.

  8. Yaxx: Yet another X-ray extractor

    NASA Astrophysics Data System (ADS)

    Aldcroft, Tom

    2013-06-01

    Yaxx is a Perl script that facilitates batch data processing using Perl open-source software and commonly available packages such as CIAO/Sherpa, S-lang, SAS, and FTOOLS. For Chandra and XMM analysis it includes automated spectral extraction, fitting, and report generation. Yaxx can be run without climbing an extensive learning curve; even so, it is highly configurable and can be customized to support complex analysis. Yaxx uses template files and takes full advantage of the unique Sherpa/S-lang environment to make much of the processing user configurable. Although originally developed with an emphasis on X-ray data analysis, Yaxx evolved to be a general-purpose pipeline scripting package.

  9. SDR implementation of the receiver of adaptive communication system

    NASA Astrophysics Data System (ADS)

    Skarzynski, Jacek; Darmetko, Marcin; Kozlowski, Sebastian; Kurek, Krzysztof

    2016-04-01

    The paper presents the software implementation of a receiver forming part of an adaptive communication system. The system is intended for communication with a satellite in low Earth orbit (LEO). The ability to adapt is expected to increase the total amount of data transmitted from the satellite to the ground station. Depending on the signal-to-noise ratio (SNR) of the received signal, adaptive transmission is realized using different transmission modes, i.e., different modulation schemes (BPSK, QPSK, 8-PSK, and 16-APSK) and different convolutional code rates (1/2, 2/3, 3/4, 5/6, and 7/8). The receiver consists of a software-defined radio (SDR) module (National Instruments USRP-2920) and multithreaded reception software running on the Windows operating system. To increase the speed of signal processing, the software takes advantage of single-instruction, multiple-data (SIMD) instructions supported by the x86 processor architecture.
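
    As a loose illustration of that data-parallel idea, vectorized NumPy calls dispatch to SIMD-capable compiled kernels, so a single expression processes many complex samples at once. The snippet below is a generic Python sketch with synthetic samples, not the receiver code described above.

```python
# Synthetic stand-in for received complex baseband samples; not the paper's receiver.
import numpy as np

rng = np.random.default_rng(0)
samples = rng.standard_normal(10_000) + 1j * rng.standard_normal(10_000)

# Per-sample Python loop (scalar arithmetic) ...
power_loop = sum(abs(s) ** 2 for s in samples)
# ... versus one vectorized call, which NumPy executes in SIMD-capable compiled code.
power_simd = np.sum(np.abs(samples) ** 2)

assert np.isclose(power_loop, power_simd)
```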

  10. Web-based Quality Control Tool used to validate CERES products on a cluster of Linux servers

    NASA Astrophysics Data System (ADS)

    Chu, C.; Sun-Mack, S.; Heckert, E.; Chen, Y.; Mlynczak, P.; Mitrescu, C.; Doelling, D.

    2014-12-01

    A few desktop tools have been popular in the Earth science community for validating science data. Because of the limited capacity of desktop hardware, such as disk space and CPUs, those tools are not able to display large amounts of data from files. This poster describes an in-house developed web-based software system built on a cluster of Linux servers, which allows users to take advantage of several Linux servers working in parallel to generate hundreds of images in a short period of time. The poster will demonstrate: (1) the hardware and software architecture used to provide high throughput of images; (2) the software structure that can incorporate new products and new requirements quickly; (3) the user interface through which users can manipulate the data and control how the images are displayed.
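
    The fan-out pattern described above reduces to a small sketch: a pool of workers, standing in for the Linux servers, renders many plots in parallel. The render function, data, and file names below are all hypothetical.

```python
# Hypothetical fan-out: worker processes stand in for the Linux servers.
import multiprocessing as mp
import numpy as np
import matplotlib
matplotlib.use("Agg")              # headless rendering, as on a server
import matplotlib.pyplot as plt

def render(index):
    """Render one QC plot; the data and file name are invented."""
    data = np.sin(np.linspace(0, 2 * np.pi, 200) * (index + 1))
    fig, ax = plt.subplots()
    ax.plot(data)
    fig.savefig(f"qc_plot_{index:03d}.png")
    plt.close(fig)
    return index

if __name__ == "__main__":
    with mp.Pool(processes=4) as pool:   # one worker per server/CPU in the real system
        done = pool.map(render, range(8))
    print(f"rendered {len(done)} images")
```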

  11. Enhancing E-Health Information Systems with Agent Technology

    PubMed Central

    Nguyen, Minh Tuan; Fuhrer, Patrik; Pasquier-Rocha, Jacques

    2009-01-01

    Agent technology is an emerging and promising research area in software technology, which increasingly contributes to the development of value-added information systems for large healthcare organizations. Through the MediMAS prototype, resulting from a case study conducted at a local Swiss hospital, this paper aims at presenting the advantages of reinforcing such a complex E-health man-machine information organization with software agents. The latter work on behalf of human agents, taking care of routine tasks and thus increasing the speed, consistency, and ultimately the reliability of the information exchanges. We further claim that the modeling of the software agent layer can be methodically derived from the actual “classical” laboratory organization and practices, as well as seamlessly integrated with the existing information system. PMID:19096509

  12. A Disk-Based System for Producing and Distributing Science Products from MODIS

    NASA Technical Reports Server (NTRS)

    Masuoka, Edward; Wolfe, Robert; Sinno, Scott; Ye Gang; Teague, Michael

    2007-01-01

    Since beginning operations in 1999, the MODIS Adaptive Processing System (MODAPS) has evolved to take advantage of trends in information technology, such as the falling cost of computing cycles and disk storage and the availability of high quality open-source software (Linux, Apache and Perl), to achieve substantial gains in processing and distribution capacity and throughput while driving down the cost of system operations.

  13. Research on computer-aided design of modern marine power systems

    NASA Astrophysics Data System (ADS)

    Ding, Dongdong; Zeng, Fanming; Chen, Guojun

    2004-03-01

    To make the MPS (Marine Power System) design process more economical and easier, a new CAD scheme is proposed that takes advantage of VR (Virtual Reality) and AI (Artificial Intelligence) technologies. This CAD system can shorten the design period and greatly reduce the demands on designers' experience. Some key issues, such as the selection of hardware and software for such a system, are also discussed.

  14. Craniux: A LabVIEW-Based Modular Software Framework for Brain-Machine Interface Research

    PubMed Central

    Degenhart, Alan D.; Kelly, John W.; Ashmore, Robin C.; Collinger, Jennifer L.; Tyler-Kabara, Elizabeth C.; Weber, Douglas J.; Wang, Wei

    2011-01-01

    This paper presents “Craniux,” an open-access, open-source software framework for brain-machine interface (BMI) research. Developed in LabVIEW, a high-level graphical programming environment, Craniux offers both out-of-the-box functionality and a modular BMI software framework that is easily extendable. Specifically, it allows researchers to take advantage of multiple features inherent to the LabVIEW environment for on-the-fly data visualization, parallel processing, multithreading, and data saving. This paper introduces the basic features and system architecture of Craniux and describes the validation of the system under real-time BMI operation using simulated and real electrocorticographic (ECoG) signals. Our results indicate that Craniux is able to operate consistently in real time, enabling a seamless work flow to achieve brain control of cursor movement. The Craniux software framework is made available to the scientific research community to provide a LabVIEW-based BMI software platform for future BMI research and development. PMID:21687575

  16. Web Applications and Thin Clients in the Navy

    DTIC Science & Technology

    2011-09-01

    say thank you to his family and shipmates for all the encouragement and distractions, when he needed them the most. ...to take full advantage of touch screen features, like journal software that converts handwriting to standard text (Mallick, 2003). 5. Smart Pads...outsourcing Web Applications have no direct control or access to the system and therefore no say in how the network is managed (Clouse, n.d.). Any issues

  17. XMM-Newton Science Analysis Software: How to Bring New Technologies to Long-life Satellite Missions

    NASA Astrophysics Data System (ADS)

    Ibarra, A.; Calle, I.; Gabriel, C.; Salgado, J.; Osuna, P.

    2009-09-01

    We present here the beta version of the Remote Interface to SAS Analysis (RISA), a web service-based system that allows users to analyze XMM-Newton data making use of all existing SAS functionality. RISA takes advantage of a GRID architecture to run SAS, achieving high performance in resource management. We are also making the SAS remote analysis compatible with present and future VO standards.

  18. DOD Weapon Systems Software Management Study, Appendix B. Shipborne Systems

    DTIC Science & Technology

    1975-06-01

    program management, from inception to development maintenance, 2. Detailed documentation requirements, 3. Standard high-level language development (CS-1...the Guided Missile School (GMS) at Dam Neck. The APL Land-Based Test Site (LETS) consisted of a Mk 152 digital fire control computer, SPG-55B radar...instruction and data segments are respectively placed in low and high core addresses to take advantage of UYK-7 memory accessing time savings. UYK-7

  19. Automated RTOP Management System

    NASA Technical Reports Server (NTRS)

    Hayes, P.

    1984-01-01

    The structure of NASA's Office of Aeronautics and Space Technology electronic information system network from 1983 to 1985 is illustrated. The RTOP automated system takes advantage of existing hardware, software, and expertise, and provides: (1) computerized cover sheet and resources forms; (2) electronic signature and transmission; (3) a data-based information system; (4) graphics; (5) intercenter communications; (6) management information; and (7) text editing. The system is coordinated with Headquarters efforts in codes R, E, and T.

  20. Full speed ahead for software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wolfe, A.

    1986-03-10

    Supercomputing software is moving into high gear, spurred by the rapid spread of supercomputers into new applications. The critical challenge is how to develop tools that will make it easier for programmers to write applications that take advantage of vectorizing in the classical supercomputer and the parallelism that is emerging in supercomputers and minisupercomputers. Writing parallel software is a challenge that every programmer must face because parallel architectures are springing up across the range of computing. Cray is developing a host of tools for programmers. Tools to support multitasking (in supercomputer parlance, multitasking means dividing up a single program to run on multiple processors) are high on Cray's agenda. On tap for multitasking is Premult, dubbed a microtasking tool. As a preprocessor for Cray's CFT77 FORTRAN compiler, Premult will provide fine-grain multitasking.
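
    Dividing one program's loop iterations across processors is the essence of that multitasking idea. As a language-agnostic illustration (Python here rather than the CFT77 FORTRAN toolchain the article describes), the sketch below splits a single summation across worker processes; the chunk boundaries are arbitrary.

```python
# Split one loop's iterations across processes; chunk boundaries are arbitrary.
from concurrent.futures import ProcessPoolExecutor

def partial_sum(bounds):
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

if __name__ == "__main__":
    chunks = [(0, 250_000), (250_000, 500_000),
              (500_000, 750_000), (750_000, 1_000_000)]
    with ProcessPoolExecutor() as pool:
        total = sum(pool.map(partial_sum, chunks))
    # same answer as the serial loop, computed by four cooperating processes
    print(total == sum(i * i for i in range(1_000_000)))
```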

  1. IMAT graphics manual

    NASA Technical Reports Server (NTRS)

    Stockwell, Alan E.; Cooper, Paul A.

    1991-01-01

    The Integrated Multidisciplinary Analysis Tool (IMAT) consists of a menu driven executive system coupled with a relational database which links commercial structures, structural dynamics and control codes. The IMAT graphics system, a key element of the software, provides a common interface for storing, retrieving, and displaying graphical information. The IMAT Graphics Manual shows users of commercial analysis codes (MATRIXx, MSC/NASTRAN and I-DEAS) how to use the IMAT graphics system to obtain high quality graphical output using familiar plotting procedures. The manual explains the key features of the IMAT graphics system, illustrates their use with simple step-by-step examples, and provides a reference for users who wish to take advantage of the flexibility of the software to customize their own applications.

  2. ASTEC: Controls analysis for personal computers

    NASA Technical Reports Server (NTRS)

    Downing, John P.; Bauer, Frank H.; Thorpe, Christopher J.

    1989-01-01

    The ASTEC (Analysis and Simulation Tools for Engineering Controls) software is under development at Goddard Space Flight Center (GSFC). The design goal is to provide a wide selection of controls analysis tools at the personal computer level, as well as the capability to upload compute-intensive jobs to a mainframe or supercomputer. The project is a follow-on to the INCA (INteractive Controls Analysis) program that has been developed at GSFC over the past five years. While ASTEC makes use of the algorithms and expertise developed for the INCA program, the user interface was redesigned to take advantage of the capabilities of the personal computer. The design philosophy and the current capabilities of the ASTEC software are described.

  3. Programming Makes Software; Support Makes Users

    NASA Astrophysics Data System (ADS)

    Batcheller, A. L.

    2010-12-01

    Skilled software engineers may build fantastic software for climate modeling, yet fail to achieve their project’s objectives. Software support and related activities are just as critical as writing software. This study followed three different software projects in the climate sciences, using interviews, observation, and document analysis to examine the value added by support work. Supporting the project and interacting with users was a key task for software developers, who often spent 50% of their time on it. Such support work most often involved replying to questions on an email list, but also included talking to users on teleconference calls and in person. Software support increased adoption by building the software’s reputation and showing individuals how the software could meet their needs. In the process of providing support, developers often learned of new requirements as users reported features they desired and bugs they had found. As software matures and gains widespread use, support work often increases. In fact, such increases can be one signal that the software has achieved broad acceptance. Maturing projects also find demand for instructional classes, online tutorials and detailed examples of how to use the software. The importance of support highlights the fact that building software systems involves both social and technical aspects. Yes, we need to build the software, but we also need to “build” the users and practices that can take advantage of it.

  4. Visualization of Octree Adaptive Mesh Refinement (AMR) in Astrophysical Simulations

    NASA Astrophysics Data System (ADS)

    Labadens, M.; Chapon, D.; Pomaréde, D.; Teyssier, R.

    2012-09-01

    Computer simulations are important in current cosmological research. These simulations run in parallel on thousands of processors and produce huge amounts of data. Adaptive mesh refinement is used to reduce the computing cost while keeping good numerical accuracy in regions of interest. RAMSES is a cosmological code developed by the Commissariat à l'énergie atomique et aux énergies alternatives (English: Atomic Energy and Alternative Energies Commission) which uses octree adaptive mesh refinement. Compared to grid-based AMR, octree AMR has the advantage of fitting the adaptive resolution of the grid very precisely to the local problem complexity. However, this specific octree data type needs specific software to be visualized, as generic visualization tools work on Cartesian grid data types. This is why the PYMSES software has also been developed by our team. It relies on the Python scripting language to ensure modular and easy access for exploring these specific data. In order to take advantage of the high-performance computer which runs the RAMSES simulation, it also uses MPI and multiprocessing to run some code in parallel. We present our PYMSES software in more detail, with some performance benchmarks. PYMSES currently has two visualization techniques which work directly on the AMR. The first is a splatting technique, and the second is a custom ray-tracing technique. Both have their own advantages and drawbacks. We have also compared two parallel programming techniques, the Python multiprocessing library versus the use of MPI runs. The load-balancing strategy has to be defined carefully in order to achieve a good speed-up in our computation. Results obtained with this software are illustrated in the context of a massive, 9000-processor parallel simulation of a Milky Way-like galaxy.

  5. The different ways to obtain digital images of urine microscopy findings: Their advantages and limitations.

    PubMed

    Fogazzi, G B; Garigali, G

    2017-03-01

    We describe three ways to take digital images of urine sediment findings. Way 1 encompasses a digital camera permanently mounted on the microscope and connected to a computer equipped with proprietary software to acquire, process and store the images. Way 2 is based on the use of inexpensive compact digital cameras, held by hand or mounted on a tripod, close to one eyepiece of the microscope. Way 3 is based on the use of smartphones, held by hand close to one eyepiece of the microscope or connected to the microscope by an adapter. The procedures, advantages and limitations of each way are reported. Copyright © 2017. Published by Elsevier B.V.

  6. 21st Century Military Operations in a Complex Electromagnetic Environment

    DTIC Science & Technology

    2015-07-01

    critically important, should not be viewed as complete. More is likely needed and it is hoped, as a result of improvements in governance of EW enterprise...strategy and take away the U.S. advantage. A commitment of $2.3 billion per year is viewed by this study as a relatively small down payment to...Mr. Al Munson Potomac Institute for Policy Studies Maj. Gen. Paul Nielsen, USAF (ret) Software Engineering Institute, Carnegie Mellon University Mr

  7. Hirabayashi, Satoshi; Kroll, Charles N.; Nowak, David J. 2011. Component-based development and sensitivity analyses of an air pollutant dry deposition model. Environmental Modelling & Software. 26(6): 804-816.

    Treesearch

    Satoshi Hirabayashi; Chuck Kroll; David Nowak

    2011-01-01

    The Urban Forest Effects-Deposition model (UFORE-D) was developed with a component-based modeling approach. Functions of the model were separated into components that are responsible for user interface, data input/output, and core model functions. Taking advantage of the component-based approach, three UFORE-D applications were developed: a base application to estimate...

  8. Multicore: Fallout from a Computing Evolution

    ScienceCinema

    Yelick, Kathy [Director, NERSC

    2017-12-09

    July 22, 2008 Berkeley Lab lecture: Parallel computing used to be reserved for big science and engineering projects, but in two years that's all changed. Even laptops and hand-helds use parallel processors. Unfortunately, the software hasn't kept pace. Kathy Yelick, Director of the National Energy Research Scientific Computing Center at Berkeley Lab, describes the resulting chaos and the computing community's efforts to develop exciting applications that take advantage of tens or hundreds of processors on a single chip.

  9. Multi-Spacecraft Analysis with Generic Visualization Tools

    NASA Astrophysics Data System (ADS)

    Mukherjee, J.; Vela, L.; Gonzalez, C.; Jeffers, S.

    2010-12-01

    To handle the needs of scientists today and in the future, software tools are going to have to take better advantage of the currently available hardware. Specifically, computing power, memory, and disk space have become cheaper, while bandwidth has become more expensive due to the explosion of online applications. To overcome these limitations, we have enhanced our Southwest Data Display and Analysis System (SDDAS) to take better advantage of the hardware by utilizing threads and data caching. Furthermore, the system was enhanced to support a framework for adding data formats and data visualization methods without costly rewrites. Visualization tools can speed analysis of many common scientific tasks and we will present a suite of tools that encompass the entire process of retrieving data from multiple data stores to common visualizations of the data. The goals for the end user are ease of use and interactivity with the data and the resulting plots. The data can be simultaneously plotted in a variety of formats and/or time and spatial resolutions. The software will allow one to slice and separate data to achieve other visualizations. Furthermore, one can interact with the data using the GUI or through an embedded language based on the Lua scripting language. The data presented will be primarily from the Cluster and Mars Express missions; however, the tools are data type agnostic and can be used for virtually any type of data.

  10. Trends in Mobile Application Development

    NASA Astrophysics Data System (ADS)

    Holzer, Adrian; Ondrus, Jan

    Major software companies, such as Apple and Google, are disrupting the relatively safe and established actors of the mobile application business. These newcomers have caused significant structural changes by imposing and enforcing their own rules for the future of mobile application development. The implications of these changes do not only concern the mobile network operators and mobile phone manufacturers. This changed environment also brings additional opportunities and constraints for current mobile application developers. Therefore, developers need to assess what their options are and how they can take advantage of these current trends. In this paper, we take a developer’s perspective in order to explore how the structural changes will influence the mobile application development markets. Moreover, we discuss what aspects developers need to take into account in order to position themselves within the current trends.

  11. The Use of UML for Software Requirements Expression and Management

    NASA Technical Reports Server (NTRS)

    Murray, Alex; Clark, Ken

    2015-01-01

    It is common practice to write English-language "shall" statements to embody detailed software requirements in aerospace software applications. This paper explores the use of the UML language as a replacement for the English language for this purpose. Among the advantages offered by the Unified Modeling Language (UML) is a high degree of clarity and precision in the expression of domain concepts as well as architecture and design. Can this quality of UML be exploited for the definition of software requirements? While expressing logical behavior, interface characteristics, timeliness constraints, and other constraints on software using UML is commonly done and relatively straightforward, achieving the additional aspects of the expression and management of software requirements that stakeholders expect, especially traceability, is far less so. These other characteristics, concerned with auditing and quality control, include the ability to trace a requirement to a parent requirement (which may well be an English "shall" statement), to trace a requirement to verification activities or scenarios which verify that requirement, and to trace a requirement to elements of the software design which implement that requirement. UML Use Cases, designed for capturing requirements, have not always been satisfactory. Some applications of them simply use the Use Case model element as a repository for English requirement statements. Other applications of Use Cases, in which Use Cases are incorporated into behavioral diagrams that successfully communicate the behaviors and constraints required of the software, do indeed take advantage of UML's clarity, but not in ways that support the traceability features mentioned above. Our approach uses the Stereotype construct of UML to precisely identify elements of UML constructs, especially behaviors such as State Machines and Activities, as requirements, and also to achieve the necessary mapping capabilities. We describe this approach in the context of a space-based software application currently under development at the Jet Propulsion Laboratory.

  12. Using PAFEC as a preprocessor for COSMIC/NASTRAN

    NASA Technical Reports Server (NTRS)

    Gray, W. H.; Baudry, T. V.

    1983-01-01

    Programs for Automatic Finite Element Calculations (PAFEC) is a general-purpose, three-dimensional linear and nonlinear finite element program (ref. 1). PAFEC's features include free-format input utilizing engineering keywords, powerful mesh generating facilities, sophisticated database management procedures, and extensive data validation checks. Presented here is a description of a software interface that permits PAFEC to be used as a preprocessor for COSMIC/NASTRAN. This user-friendly software, called PAFCOS, frees the stress analyst from the laborious and error-prone procedure of creating and debugging a rigid-format COSMIC/NASTRAN bulk data deck. By interactively creating and debugging a finite element model with PAFEC, thus taking full advantage of the free-format, engineering-keyword-oriented data structure of PAFEC, the amount of time spent during model generation can be drastically reduced. The PAFCOS software will automatically convert a PAFEC data structure into a COSMIC/NASTRAN bulk data deck. The capabilities and limitations of the PAFCOS software are fully discussed in the following report.
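
    For a flavor of the output side of such a converter, NASTRAN small-field bulk data lays each card out in consecutive 8-character fields. The Python below writes GRID cards for three invented nodes; it is only a sketch of the format, not PAFCOS itself.

```python
# Three invented nodes; a real converter would emit the full bulk data deck.
nodes = {1: (0.0, 0.0, 0.0), 2: (1.0, 0.0, 0.0), 3: (1.0, 1.0, 0.0)}

with open("model.bdf", "w") as deck:
    for nid, (x, y, z) in nodes.items():
        # NASTRAN small-field format: 8-character fields holding the card name,
        # grid ID, a blank coordinate-system field, then X1, X2, X3.
        deck.write(f"{'GRID':<8}{nid:<8d}{'':8}{x:<8.3f}{y:<8.3f}{z:<8.3f}\n")
```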

  13. PAVE: program for assembling and viewing ESTs.

    PubMed

    Soderlund, Carol; Johnson, Eric; Bomhoff, Matthew; Descour, Anne

    2009-08-26

    New sequencing technologies are rapidly emerging. Many laboratories are simultaneously working with the traditional Sanger ESTs and experimenting with ESTs generated by the 454 Life Sciences sequencers. Though Sanger ESTs have been used to generate contigs for many years, no program takes full advantage of the 5' and 3' mate-pair information; hence, many tentative transcripts are assembled into two separate contigs. The new 454 technology has the benefit of high-throughput expression profiling, but introduces time and space problems for assembling large contigs. The PAVE (Program for Assembling and Viewing ESTs) assembler takes advantage of the 5' and 3' mate-pair information by requiring that the mate-pairs be assembled into the same contig and joined by n's if the two sub-contigs do not overlap. It handles the depth of 454 data sets by "burying" similar ESTs during assembly, which retains the expression-level information while circumventing time and space problems. PAVE uses MegaBLAST for the clustering step and CAP3 for assembly; however, it assembles incrementally to enforce the mate-pair constraint, bury ESTs, and reduce incorrect joins and splits. The PAVE data management system uses a MySQL database to store multiple libraries of ESTs along with their metadata; the management system allows multiple assemblies with variations on libraries and parameters. Analysis routines provide standard annotation for the contigs, including a measure of differentially expressed genes across the libraries. A Java viewer program is provided for display and analysis of the results. Our results clearly show the benefit of using the PAVE assembler to explicitly use mate-pair information and bury ESTs for large contigs. The PAVE assembler provides a software package for assembling Sanger and/or 454 ESTs. The assembly software, data management software, Java viewer and user's guide are freely available.
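
    The mate-pair rule is easy to picture in code. Below is a toy Python illustration, not PAVE's actual algorithm: if a clone's 5' and 3' sub-contigs share no overlap, they are concatenated with a spacer of n's so that both land in one contig. The sequences, minimum overlap, and spacer length are hypothetical.

```python
# Toy mate-pair join; sequences, overlap threshold, and spacer length are invented.
def join_mate_pairs(five_prime, three_prime, min_overlap=20, gap="n" * 50):
    # look for a suffix of the 5' sub-contig that is a prefix of the 3' sub-contig
    for k in range(min(len(five_prime), len(three_prime)), min_overlap - 1, -1):
        if five_prime[-k:] == three_prime[:k]:
            return five_prime + three_prime[k:]   # true overlap: merge directly
    return five_prime + gap + three_prime         # no overlap: bridge with n's

print(join_mate_pairs("ACGT" * 10, "TTTT" * 10, min_overlap=4))
```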

  14. Advanced Software V&V for Civil Aviation and Autonomy

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume P.

    2017-01-01

    With advances in high-performance computing platforms (e.g., advanced graphics processing units or multi-core processors), computationally intensive software techniques such as the ones used in artificial intelligence or formal methods have provided us with an opportunity to further increase safety in the aviation industry. Some of these techniques have facilitated building safety in at design time, as in aircraft engines or software verification and validation, and others can introduce safety benefits during operations as long as we adapt our processes. In this talk, I will present how NASA is taking advantage of these new software techniques to build in safety at design time through advanced software verification and validation, which can be applied earlier and earlier in the design life cycle and thus also help reduce the cost of aviation assurance. I will then show how run-time techniques (such as runtime assurance or data analytics) offer us a chance to catch even more complex problems, even in the face of changing and unpredictable environments. These new techniques will be extremely useful as our aviation systems become more complex and more autonomous.

  15. Some design constraints required for the assembly of software components: The incorporation of atomic abstract types into generically structured abstract types

    NASA Technical Reports Server (NTRS)

    Johnson, Charles S.

    1986-01-01

    It is nearly axiomatic that taking the greatest advantage of the useful features available in a development system, while avoiding the negative interactions of those features, requires the exercise of a design methodology which constrains their use. A major design support feature of the Ada language is abstraction: for data, functions, processes, resources, and system elements in general. Atomic abstract types can be created in packages defining those private types and all of the overloaded operators, functions, and hidden data required for their use in an application. Generically structured abstract types can be created in generic packages defining those structured private types, as buildups from the user-defined data types which are input as parameters. A study is made of the design constraints required for software incorporating either atomic or generically structured abstract types, if the integration of software components based on them is to be subsequently performed. The impact of these techniques on the reusability of software and the creation of project-specific software support environments is also discussed.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walker, La Tonya Nicole; Malczynski, Leonard A.

    DYNAMO is a computer program for building and running 'continuous' simulation models. It was developed by the Industrial Dynamics Group at the Massachusetts Institute of Technology for simulating dynamic feedback models of business, economic, and social systems. The history of the system dynamics method since 1957 includes many classic models built in DYNAMO. DYNAMO was not supplanted until the late 1980s, when software was built to take advantage of the rise of personal computers and graphical user interfaces. There is much learning and insight to be gained from examining the DYNAMO models and their accompanying research papers. We believe that it is a worthwhile exercise to convert DYNAMO models to more recent software packages. We have made an attempt to make it easier to turn these models into a more current system dynamics software language, Powersim Studio, produced by Powersim AS of Bergen, Norway. This guide shows how to convert DYNAMO syntax into Studio syntax.

  17. Extreme Ultraviolet Imaging Telescope (EIT)

    NASA Technical Reports Server (NTRS)

    Lemen, J. R.; Freeland, S. L.

    1997-01-01

    Efforts concentrated on development and implementation of the SolarSoft (SSW) data analysis system. From an EIT analysis perspective, this system was designed to facilitate efficient reuse and conversion of software developed for Yohkoh/SXT and to take advantage of a large existing body of software developed by the SDAC, Yohkoh, and SOHO instrument teams. Another strong motivation for this system was to provide an EIT analysis environment which permits coordinated analysis of EIT data in conjunction with data from important supporting instruments, including Yohkoh/SXT and the other SOHO coronal instruments: CDS, SUMER, and LASCO. In addition, the SSW system will support coordinated EIT/TRACE analysis (by design) when TRACE data is available; TRACE launch is currently planned for March 1998. Working with Jeff Newmark, the Chianti software package (K. P. Dere et al.) and UV/EUV database were fully integrated into the SSW system to facilitate EIT temperature and emission analysis.

  18. A CCD experimental platform for large telescope in Antarctica based on FPGA

    NASA Astrophysics Data System (ADS)

    Zhu, Yuhua; Qi, Yongjun

    2014-07-01

    The CCD detector is one of the important components of astronomical telescopes. For a large telescope in Antarctica, a CCD detector system with large size, high sensitivity and low noise is indispensable. Because of the extremely low temperatures and unattended operation, system maintenance and software and hardware upgrades become hard problems. This paper introduces a general CCD controller experimental platform based on a field-programmable gate array (FPGA), which is, in fact, a large-scale field-reconfigurable array. Taking advantage of how easily such a system can be modified, the platform implements the driving circuits, a digital signal processing module, a network communication interface, control algorithm validation, and a remote reconfiguration module. With the concept of integrated hardware and software, the paper discusses the key technologies for building a scientific CCD system suitable for the special working environment in Antarctica, focusing on the method of remote reconfiguration of the controller via the network and offering a feasible hardware and software solution.

  19. Integrating existing software toolkits into VO system

    NASA Astrophysics Data System (ADS)

    Cui, Chenzhou; Zhao, Yong-Heng; Wang, Xiaoqian; Sang, Jian; Luo, Ze

    2004-09-01

    The Virtual Observatory (VO) is a collection of interoperating data archives and software tools. Taking advantage of the latest information technologies, it aims to provide a data-intensive online research environment for astronomers all around the world. Many high-quality astronomical software packages and libraries are powerful and easy to use, and have been widely used by astronomers for many years. Integrating those toolkits into the VO system is a necessary and important task for the VO developers. VO architecture greatly depends on Grid and Web services, so the general VO integration route is "Java Ready - Grid Ready - VO Ready". In the paper, we discuss the importance of VO integration for existing toolkits and the possible solutions. We introduce two efforts in this field from the China-VO project, "gImageMagick" and "Galactic abundance gradients statistical research under grid environment". We also discuss what additional work should be done to convert Grid services to VO services.

  20. Parallel Domain Decomposition Formulation and Software for Large-Scale Sparse Symmetrical/Unsymmetrical Aeroacoustic Applications

    NASA Technical Reports Server (NTRS)

    Nguyen, D. T.; Watson, Willie R. (Technical Monitor)

    2005-01-01

    The overall objectives of this research work are to formulate and validate efficient parallel algorithms, and to efficiently design and implement computer software for solving large-scale acoustic problems arising from the unified frameworks of finite element procedures. The adopted parallel finite element (FE) domain decomposition (DD) procedures should take full advantage of the multiple processing capabilities offered by most modern high-performance computing platforms for efficient parallel computation. To achieve this objective, the formulation needs to integrate efficient sparse (and dense) assembly techniques, hybrid (or mixed) direct and iterative equation solvers, proper preconditioning strategies, unrolling strategies, and effective interprocessor communication schemes. Finally, the numerical performance of the developed parallel finite element procedures will be evaluated by solving a series of structural and acoustic (symmetric and unsymmetric) problems on different computing platforms. Comparisons with existing "commercialized" and/or "public domain" software are also included, whenever possible.

  1. The pseudo-Boolean optimization approach to form the N-version software structure

    NASA Astrophysics Data System (ADS)

    Kovalev, I. V.; Kovalev, D. I.; Zelenkov, P. V.; Voroshilova, A. A.

    2015-10-01

    The problem of developing an optimal structure for an N-version software system is a very complex optimization problem, which makes deterministic optimization methods inappropriate for solving it. In this view, exploiting heuristic strategies looks more rational. In the field of pseudo-Boolean optimization theory, the so-called method of varied probabilities (MVP) has been developed to solve problems of large dimensionality. Some additional modifications of MVP have been made to solve the problem of N-version system design. These algorithms take into account the discovered specific features of the objective function. Practical experiments have shown the advantage of using these algorithm modifications, as they reduce the search space.
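
    As a hedged sketch of what a probability-vector heuristic of this kind can look like (the abstract does not give the real MVP details or the real reliability/cost objective), the Python below samples binary version-selection vectors and nudges the sampling probabilities toward the best one found; the objective function and all constants are invented.

```python
# The objective below is a made-up stand-in for the real reliability/cost model.
import numpy as np

rng = np.random.default_rng(1)

def objective(x):
    return x.sum() - 0.1 * x.sum() ** 2          # hypothetical score to maximize

p = np.full(12, 0.5)                              # one probability per candidate version
best_x, best_f = None, -np.inf
for _ in range(200):
    x = (rng.random(p.size) < p).astype(int)      # sample a candidate structure
    f = objective(x)
    if f > best_f:
        best_x, best_f = x, f
    p = 0.9 * p + 0.1 * best_x                    # nudge probabilities toward the best point
print(best_x, best_f)
```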

  2. Preparing Colorful Astronomical Images II

    NASA Astrophysics Data System (ADS)

    Levay, Z. G.; Frattare, L. M.

    2002-12-01

    We present additional techniques for using mainstream graphics software (Adobe Photoshop and Illustrator) to produce composite color images and illustrations from astronomical data. These techniques have been used on numerous images from the Hubble Space Telescope to produce photographic, print and web-based products for news, education and public presentation as well as illustrations for technical publication. We expand on a previous paper to present more detail and additional techniques, taking advantage of new or improved features available in the latest software versions. While Photoshop is not intended for quantitative analysis of full dynamic range data (as are IRAF or IDL, for example), we have had much success applying Photoshop's numerous, versatile tools to work with scaled images, masks, text and graphics in multiple semi-transparent layers and channels.

  3. Managing Data From Signal-Propagation Experiments

    NASA Technical Reports Server (NTRS)

    Kantak, A. V.

    1989-01-01

    Computer programs generate characteristic plots from amplitudes and phases. Software system enables minicomputer to process data on amplitudes and phases of signals received during experiments in ground-mobile/satellite radio propagation. Takes advantage of file-handling capabilities of UNIX operating system and C programming language. Interacts with user, under whose guidance programs in FORTRAN language generate plots of spectra or other curves of types commonly used to characterize signals. FORTRAN programs used to process file-handling outputs into any of several useful forms.

  4. Generalized Preconditioned Locally Harmonic Residual Eigensolver (GPLHR) v0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    VECHARYNSKI, EUGENE; YANG, CHAO

    The software contains a MATLAB implementation of the Generalized Preconditioned Locally Harmonic Residual (GPLHR) method for solving standard and generalized non-Hermitian eigenproblems. The method is particularly useful for computing a subset of eigenvalues, and their eigen- or Schur vectors, closest to a given shift. The proposed method is based on block iterations and can take advantage of a preconditioner if it is available. It does not need to perform exact shift-and-invert transformation. Standard and generalized eigenproblems are handled in a unified framework.
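
    For orientation, the quantity GPLHR computes iteratively at scale can be stated with a naive dense baseline: the k eigenvalues of a non-Hermitian matrix closest to a shift sigma. The Python below is only that baseline, with an arbitrary random matrix and shift, not the GPLHR algorithm itself.

```python
# Arbitrary random non-Hermitian test matrix; GPLHR solves this iteratively at scale.
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((50, 50))
sigma, k = 0.5 + 0.1j, 5

lam = np.linalg.eigvals(A)
nearest = np.argsort(np.abs(lam - sigma))[:k]   # k eigenvalues closest to the shift
print(lam[nearest])
```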

  5. Teaching physics with Angry Birds: exploring the kinematics and dynamics of the game

    NASA Astrophysics Data System (ADS)

    Rodrigues, M.; Simeão Carvalho, P.

    2013-07-01

    In this paper, we present classroom strategies for teaching kinematics at middle and high school levels, using Rovio’s famous game Angry Birds and the video analyser software Tracker. We show how to take advantage of this entertaining video game, by recording appropriate motions of birds that students can explore by manipulating data, characterizing the red bird’s motion and fitting results to physical models. A dynamic approach is also addressed to link gravitational force to projectile trajectories.
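
    The fitting step the students perform reduces to a small calculation: regress the tracked vertical positions on time with a quadratic and read the acceleration off the leading coefficient. The Python below uses synthetic points standing in for Tracker output.

```python
# Synthetic (t, y) points standing in for Tracker output; true g is 9.8 m/s^2.
import numpy as np

t = np.linspace(0.0, 1.0, 21)
y = 2.0 + 8.0 * t - 0.5 * 9.8 * t**2 \
    + np.random.default_rng(3).normal(0, 0.01, t.size)

a2, a1, a0 = np.polyfit(t, y, 2)                # quadratic fit to the trajectory
print(f"y0={a0:.2f} m, v0={a1:.2f} m/s, g={-2 * a2:.2f} m/s^2")
```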

  6. Study for incorporating time-synchronized approach control into the CH-47/VALT digital navigation system

    NASA Technical Reports Server (NTRS)

    Mcconnell, W. J., Jr.

    1979-01-01

    Techniques for obtaining time-synchronized (4D) approach control in the VALT research helicopter are described. Various 4D concepts and their compatibility with the existing VALT digital computer navigation and guidance system hardware and software are examined. Modifications to various techniques were investigated in order to take advantage of the unique operating characteristics of the helicopter in the terminal area. A 4D system is proposed, combining the direct-to maneuver with the existing VALT curved path generation capability.

  7. Multicore: Fallout From a Computing Evolution (LBNL Summer Lecture Series)

    ScienceCinema

    Yelick, Kathy [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC)

    2018-05-07

    Summer Lecture Series 2008: Parallel computing used to be reserved for big science and engineering projects, but in two years that's all changed. Even laptops and hand-helds use parallel processors. Unfortunately, the software hasn't kept pace. Kathy Yelick, Director of the National Energy Research Scientific Computing Center at Berkeley Lab, describes the resulting chaos and the computing community's efforts to develop exciting applications that take advantage of tens or hundreds of processors on a single chip.

  8. BioNetFit: a fitting tool compatible with BioNetGen, NFsim and distributed computing environments

    DOE PAGES

    Thomas, Brandon R.; Chylek, Lily A.; Colvin, Joshua; ...

    2015-11-09

    Rule-based models are analyzed with specialized simulators, such as those provided by the BioNetGen and NFsim open-source software packages. Here we present BioNetFit, a general-purpose fitting tool that is compatible with BioNetGen and NFsim. BioNetFit is designed to take advantage of distributed computing resources. This feature facilitates fitting (i.e., optimization of parameter values for consistency with data) when simulations are computationally expensive.
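
    The core loop such a tool distributes is easy to sketch: evaluate many candidate parameter sets against data in parallel and keep the best. The Python below is a toy stand-in, not BioNetFit's optimizer; the exponential model, data, and candidate ranges are invented.

```python
# Invented exponential decay model and "experimental" data; not BioNetFit itself.
from concurrent.futures import ProcessPoolExecutor
import numpy as np

t = np.linspace(0, 5, 50)
data = 2.0 * np.exp(-0.7 * t)

def cost(params):
    a, k = params
    return float(np.sum((a * np.exp(-k * t) - data) ** 2))

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    candidates = [(rng.uniform(0, 5), rng.uniform(0, 2)) for _ in range(64)]
    with ProcessPoolExecutor() as pool:          # distributed evaluation in miniature
        scores = list(pool.map(cost, candidates))
    best = min(zip(scores, candidates))[1]       # keep the best-scoring parameter set
    print(best)
```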

  9. Advanced Resistive Exercise Device (ARED) Flight Software (FSW): A Unique Approach to Exercise in Long Duration Habitats

    NASA Technical Reports Server (NTRS)

    Mangieri, Mark

    2005-01-01

    ARED flight instrumentation software is associated with an overall custom-designed resistive exercise system that will be deployed on the International Space Station (ISS). This innovative software application fuses many diverse and new technologies into a robust and usable package. The software takes advantage of touchscreen user interface technology by providing a graphical user interface on a Windows-based tablet PC, meeting a design constraint of keyboard-less interaction with flight crewmembers. The software interacts with modified commercial data acquisition (DAQ) hardware to acquire multiple channels of sensor measurements from the ARED device. This information is recorded on the tablet PC and made available, via the International Space Station (ISS) Wireless LAN (WLAN) and telemetry subsystems, to ground-based mission medics and trainers for analysis. The software includes a feature to accept electronically encoded prescriptions of exercises that guide crewmembers through a customized regimen of resistive weight training, based on personal analysis. These electronically encoded prescriptions are provided to the crew via the ISS WLAN and telemetry subsystems. All personal data is securely associated with an individual crew member, based on a PIN ID mechanism.

  10. The transition of GTDS to the Unix workstation environment

    NASA Technical Reports Server (NTRS)

    Carter, D.; Metzinger, R.; Proulx, R.; Cefola, P.

    1995-01-01

    Future Flight Dynamics systems should take advantage of the possibilities provided by current and future generations of low-cost, high performance workstation computing environments with Graphical User Interface. The port of the existing mainframe Flight Dynamics systems to the workstation environment offers an economic approach for combining the tremendous engineering heritage that has been encapsulated in these systems with the advantages of the new computing environments. This paper will describe the successful transition of the Draper Laboratory R&D version of GTDS (Goddard Trajectory Determination System) from the IBM Mainframe to the Unix workstation environment. The approach will be a mix of historical timeline notes, descriptions of the technical problems overcome, and descriptions of associated SQA (software quality assurance) issues.

  11. Development of interactive multimedia applications

    NASA Technical Reports Server (NTRS)

    Leigh, Albert; Wang, Lui

    1993-01-01

    Multimedia is making an increasingly significant contribution to our informational society. The usefulness of this technology is already evident in education, business presentations, informational kiosks (e.g., in museums), training and the entertainment environment. Institutions, from grade schools to medical schools, are exploring the use of multifaceted electronic textbooks and teaching aids to enhance course materials. Through multimedia, teachers and students can take full advantage of the cognitive value of animation, audio, video and other media types in a seamless application. The Software Technology Branch at NASA Johnson Space Center (NASA/JSC) is taking similar approaches to apply state-of-the-art technology to space training, mission operations and other applications. This paper discusses the characteristics and development of multimedia applications at NASA/JSC.

  12. Development of a polarized neutron beam line at Algerian research reactors using McStas software

    NASA Astrophysics Data System (ADS)

    Makhloufi, M.; Salah, H.

    2017-02-01

    Unpolarized instrumentation has long been studied and designed using the McStas simulation tool, but only recently have new models been developed for McStas to simulate polarized neutron scattering instruments. In the present contribution, we used McStas software to design a polarized neutron beam line, taking advantage of the reflectometer and diffractometer spectrometers available in Algeria. Both thermal and cold neutrons were considered. The polarization was achieved by two types of supermirror polarizers, FeSi and CoCu, provided by the HZB institute. For the sake of performance and comparison, the polarizers were characterized and their characteristics reproduced. The simulated instruments are reported. A flipper and electromagnets for the guide field were developed. Further developments, including analyzers and upgrading of the existing spectrometers, are underway.

  13. Avionics upgrade strategies for the Space Shuttle and derivatives

    NASA Astrophysics Data System (ADS)

    Swaim, Richard A.; Wingert, William B.

    Some approaches aimed at providing a low-cost, low-risk strategy to upgrade the shuttle onboard avionics are described. These approaches allow migration to a shuttle-derived vehicle and provide commonality with Space Station Freedom avionics to the extent practical. Some goals of the Shuttle cockpit upgrade include: offloading of the main computers by distributing avionics display functions, reducing crew workload, reducing maintenance cost, and providing display reconfigurability and context sensitivity. These goals are being met by using a combination of off-the-shelf and newly developed software and hardware. The software will be developed using Ada. Advanced active matrix liquid crystal displays are being used to meet the tight space, weight, and power consumption requirements. Eventually, it is desirable to upgrade the current shuttle data processing system with a system that has more in common with the Space Station data management system. This will involve not only changes in Space Shuttle onboard hardware, but changes in the software. Possible approaches to maximizing the use of the existing software base while taking advantage of new language capabilities are discussed.

  14. Launching GUPPI: the Green Bank Ultimate Pulsar Processing Instrument

    NASA Astrophysics Data System (ADS)

    DuPlain, Ron; Ransom, Scott; Demorest, Paul; Brandt, Patrick; Ford, John; Shelton, Amy L.

    2008-08-01

    The National Radio Astronomy Observatory (NRAO) is launching the Green Bank Ultimate Pulsar Processing Instrument (GUPPI), a prototype flexible digital signal processor designed for pulsar observations with the Robert C. Byrd Green Bank Telescope (GBT). GUPPI uses field programmable gate array (FPGA) hardware and design tools developed by the Center for Astronomy Signal Processing and Electronics Research (CASPER) at the University of California, Berkeley. The NRAO has been concurrently developing GUPPI software and hardware using minimal software resources. The software handles instrument monitor and control, data acquisition, and hardware interfacing. GUPPI is currently an expert-only spectrometer, but supports future integration with the full GBT production system. The NRAO was able to take advantage of the unique flexibility of the CASPER FPGA hardware platform, develop hardware and software in parallel, and build a suite of software tools for monitoring, controlling, and acquiring data with a new instrument over a short timeline of just a few months. The NRAO interacts regularly with CASPER and its users, and GUPPI stands as an example of what reconfigurable computing and open-source development can do for radio astronomy. GUPPI is modular for portability, and the NRAO provides the results of development as an open-source resource.

  15. Flexible control techniques for a lunar base

    NASA Technical Reports Server (NTRS)

    Kraus, Thomas W.

    1992-01-01

    The fundamental elements found in every terrestrial control system can be employed in all lunar applications. These elements include sensors which measure physical properties, controllers which acquire sensor data and calculate a control response, and actuators which apply the control output to the process. The unique characteristics of the lunar environment will certainly require the development of new control system technology. However, weightlessness, harsh atmospheric conditions, temperature extremes, and radiation hazards will most significantly impact the design of sensors and actuators. The controller and associated control algorithms, which are the most complex element of any control system, can be derived in their entirety from existing technology. Lunar process control applications -- ranging from small-scale research projects to full-scale processing plants -- will benefit greatly from the controller advances being developed today. In particular, new software technology aimed at commercial process monitoring and control applications will almost completely eliminate the need for custom programs and the lengthy development and testing cycle they require. The applicability of existing industrial software to lunar applications has other significant advantages in addition to cost and quality. This software is designed to run on standard hardware platforms and takes advantage of existing LAN and telecommunications technology. Further, in order to exploit the existing commercial market, the software is being designed to be implemented by users of all skill levels -- typically users who are familiar with their process, but not necessarily with software or control theory. This means that specialized technical support personnel will not need to be on-hand, and the associated costs are eliminated. Finally, the latest industrial software designed for the commercial market is extremely flexible, in order to fit the requirements of many types of processing applications with little or no customization. This means that lunar process control projects will not be delayed by unforeseen problems or last minute process modifications. The software will include all of the tools needed to adapt to virtually any changes. In contrast to other space programs which required the development of tremendous amounts of custom software, lunar-based processing facilities will benefit from the use of existing software technology which is being proven in commercial applications on Earth.
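
    The sensor-controller-actuator loop that every such application shares can be shown in a few lines. The Python below is a toy discrete PID regulator; the gains, the plant model, and the habitat-temperature setpoint are all invented for illustration, not drawn from any lunar design.

```python
# Toy discrete PID loop; gains, plant model, and setpoint are all invented.
kp, ki, kd, dt = 2.0, 0.5, 0.1, 0.1
setpoint, pv = 20.0, 15.0                 # e.g., a habitat temperature (hypothetical)
integral, prev_err = 0.0, setpoint - pv

for _ in range(1000):
    err = setpoint - pv                   # sensor: compare measurement with target
    integral += err * dt
    output = kp * err + ki * integral + kd * (err - prev_err) / dt
    prev_err = err
    # actuator drives a crude first-order plant with ambient heat loss
    pv += (0.05 * output - 0.01 * (pv - 10.0)) * dt
print(round(pv, 2))
```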

  16. Integration of USB and firewire cameras in machine vision applications

    NASA Astrophysics Data System (ADS)

    Smith, Timothy E.; Britton, Douglas F.; Daley, Wayne D.; Carey, Richard

    1999-08-01

    Digital cameras have been around for many years, but a new breed of consumer-market cameras is hitting the mainstream. By using these devices, system designers and integrators will be well positioned to take advantage of technological advances developed to support multimedia and imaging applications on the PC platform. Having these new cameras on the consumer market means lower cost, but it does not necessarily guarantee ease of integration. There are many issues that need to be accounted for, such as image quality, maintainable frame rates, image size and resolution, supported operating systems, and ease of software integration. This paper briefly describes a couple of the consumer digital standards and then discusses some of the advantages and pitfalls of integrating both USB and FireWire cameras into computer/machine vision applications.
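
    As a concrete illustration of one integration path, OpenCV's VideoCapture wraps the operating system's camera driver (USB Video Class or FireWire/DCAM, depending on the platform); the device index and requested format below are assumptions for the sketch.

      # Grab one frame from the first enumerated consumer camera via OpenCV.
      import cv2

      cap = cv2.VideoCapture(0)                  # device index 0 is assumed
      cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)     # request a format; the
      cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)    # driver may quietly ignore it
      ok, frame = cap.read()                     # frame arrives as a BGR array
      if ok:
          print(frame.shape, cap.get(cv2.CAP_PROP_FPS))  # verify size and rate
      cap.release()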

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rohe, Daniel Peter

    Sandia National Laboratories has recently purchased a Polytec 3D Scanning Laser Doppler Vibrometer for vibration measurement. This device has proven to be a very nice tool for making vibration measurements, and has a number of advantages over traditional sensors such as accelerometers. The non-contact nature of the laser vibrometer means there is no mass loading due to measuring the response. Additionally, the laser scanning heads can position the laser spot much more quickly and accurately than placing an accelerometer or performing a roving hammer impact. The disadvantage of the system is that a significant amount of time must be invested to align the lasers with each other and the part so that the laser spots can be accurately positioned. The Polytec software includes a number of nice tools to aid in this procedure; however, certain portions are still tedious. Luckily, the Polytec software is readily extensible by programming macros for the system, so tedious portions of the procedure can be made easier by automating the process. The Polytec software includes a WinWrap (similar to Visual Basic) editor and interface to run macros written in that programming language. The author, however, is much more proficient in Python, and the latter also has a much larger set of libraries that can be used to create very complex macros, while taking advantage of Python's inherent readability and maintainability.

  18. Healthcare applications of knowledge discovery in databases.

    PubMed

    DeGruy, K B

    2000-01-01

    Many healthcare leaders find themselves overwhelmed with data, but lack the information they need to make informed decisions. Knowledge discovery in databases (KDD) can help organizations turn their data into information. KDD is the process of finding complex patterns and relationships in data. The tools and techniques of KDD have achieved impressive results in other industries, and healthcare needs to take advantage of advances in this exciting field. Recent advances in the KDD field have brought it from the realm of research institutions and large corporations to many smaller companies. Software and hardware advances enable small organizations to tap the power of KDD using desktop PCs. KDD has been used extensively for fraud detection and focused marketing. There is a wealth of data available within the healthcare industry that would benefit from the application of KDD tools and techniques. Providers and payers have a vast quantity of data (such as charges and claims), but no effective way to analyze the data to accurately determine relationships and trends. Organizations that take advantage of KDD techniques will find that they offer valuable assistance in the quest to lower healthcare costs while improving healthcare quality.

  19. Automated daily processing of more than 1000 ground-based GPS receivers for studying intense ionospheric storms

    NASA Technical Reports Server (NTRS)

    Komjathy, Attila; Sparks, Lawrence; Wilson, Brian D.; Mannucci, Anthony J.

    2005-01-01

    To take advantage of the vast amount of GPS data, researchers use a number of techniques to estimate satellite and receiver interfrequency biases and the total electron content (TEC) of the ionosphere. Most techniques estimate vertical ionospheric structure and, simultaneously, hardware-related biases treated as nuisance parameters. These methods are often limited to 200 GPS receivers and use a sequential least-squares or Kalman filter approach. The biases are later removed from the measurements to obtain unbiased TEC. In our approach to calibrating GPS receiver and transmitter interfrequency biases, we take advantage of all available GPS receivers using a new processing algorithm based on the Global Ionospheric Mapping (GIM) software developed at the Jet Propulsion Laboratory. This new capability is designed to estimate receiver biases for all stations. We solve for the instrumental biases by modeling the ionospheric delay and removing it from the observation equation using precomputed GIM maps. The precomputed GIM maps rely on 200 globally distributed GPS receivers to establish the "background" used to model the ionosphere at the remaining 800 GPS sites.
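
    A minimal sketch of the calibration idea with synthetic numbers: once the ionospheric delay modeled from a precomputed map is removed, the receiver interfrequency bias is the constant offset remaining in the slant observations.

      # Toy bias calibration: slant TEC = mapping * GIM vertical TEC + bias.
      import numpy as np

      rng = np.random.default_rng(0)
      elev = np.radians(rng.uniform(20, 80, 50))     # satellite elevations
      map_fn = 1.0 / np.sin(elev)                    # simple mapping function
      vtec_gim = rng.uniform(10, 40, 50)             # precomputed vertical TEC (TECU)
      true_bias = 3.7                                # unknown receiver bias (TECU)
      stec_obs = map_fn * vtec_gim + true_bias + rng.normal(0, 0.5, 50)

      # Remove the modeled delay; the mean residual estimates the bias.
      bias_est = np.mean(stec_obs - map_fn * vtec_gim)
      print(round(bias_est, 2))                      # close to 3.7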

  20. NASTRAN as a resource in code development

    NASA Technical Reports Server (NTRS)

    Stanton, E. L.; Crain, L. M.; Neu, T. F.

    1975-01-01

    A case history is presented in which the NASTRAN system provided both guidelines and working software for use in the development of a discrete element program, PATCHES-111. To avoid duplication and to take advantage of the wide spread user familiarity with NASTRAN, the PATCHES-111 system uses NASTRAN bulk data syntax, NASTRAN matrix utilities, and the NASTRAN linkage editor. Problems in developing the program are discussed along with details on the architecture of the PATCHES-111 parametric cubic modeling system. The system includes model construction procedures, checkpoint/restart strategies, and other features.

  1. Spaceborne Hybrid-FPGA System for Processing FTIR Data

    NASA Technical Reports Server (NTRS)

    Bekker, Dmitriy; Blavier, Jean-Francois L.; Pingree, Paula J.; Lukowiak, Marcin; Shaaban, Muhammad

    2008-01-01

    Progress has been made in a continuing effort to develop a spaceborne computer system for processing readout data from a Fourier-transform infrared (FTIR) spectrometer to reduce the volume of data transmitted to Earth. The approach followed in this effort, oriented toward reducing design time and reducing the size and weight of the spectrometer electronics, has been to exploit the versatility of recently developed hybrid field-programmable gate arrays (FPGAs) to run diverse software on embedded processors while also taking advantage of the reconfigurable hardware resources of the FPGAs.

  2. Electronic dental records: start taking the steps.

    PubMed

    Bergoff, Jana

    2011-01-01

    Converting paper patient charts into their electronic counterparts (electronic dental records, EDRs) not only has many advantages but could also become a legal requirement in the future. Several steps are key to a successful transition, including assessing the needs of the dental team and what they require as part of the implementation. Existing software and hardware must be evaluated for continued use and expansion. Proper protocols for information transfer must be established to ensure complete records while maintaining compliance with HIPAA regulations regarding patient privacy. Anxiety can be reduced by setting realistic deadlines and using trusted back-up methods.

  3. Edge directed image interpolation with Bamberger pyramids

    NASA Astrophysics Data System (ADS)

    Rosiles, Jose Gerardo

    2005-08-01

    Image interpolation is a standard feature in digital image editing software, digital camera systems, and printers. Classical methods for resizing produce blurred images of unacceptable quality. Bamberger pyramids and filter banks have been successfully used for texture and image analysis, providing excellent multiresolution and directional selectivity. In this paper we present an edge-directed image interpolation algorithm which takes advantage of the simultaneous spatial-directional edge localization at the subband level. The proposed algorithm outperforms classical schemes such as bilinear and bicubic interpolation from both the visual and numerical points of view.

  4. Advanced Automation for Ion Trap Mass Spectrometry-New Opportunities for Real-Time Autonomous Analysis

    NASA Technical Reports Server (NTRS)

    Palmer, Peter T.; Wong, C. M.; Salmonson, J. D.; Yost, R. A.; Griffin, T. P.; Yates, N. A.; Lawless, James G. (Technical Monitor)

    1994-01-01

    The utility of MS/MS for both target compound analysis and the structure elucidation of unknowns has been described in a number of references. Broader acceptance of this technique has not yet been realized, as it requires large, complex, and costly instrumentation that has not been competitive with more conventional techniques. Recent advancements in ion trap mass spectrometry promise to change this situation. Although the ion trap's small size, sensitivity, and ability to perform multiple stages of mass spectrometry have made it eminently suitable for on-line, real-time monitoring applications, advanced automation techniques are required to make these capabilities more accessible to non-experts. Towards this end we have developed custom software for the design and implementation of MS/MS experiments. This software allows the user to take full advantage of the ion trap's versatility with respect to ionization techniques, scan proxies, and ion accumulation/ejection methods. Additionally, expert system software has been developed for autonomous target compound analysis. This software has been linked to the ion trap control software and a commercial data system to bring all of the steps in the analysis cycle under control of the expert system. These software development efforts and their utilization for a number of trace analysis applications will be described.

  5. Using parallel computing for the display and simulation of the space debris environment

    NASA Astrophysics Data System (ADS)

    Möckel, M.; Wiedemann, C.; Flegel, S.; Gelhaus, J.; Vörsmann, P.; Klinkrad, H.; Krag, H.

    2011-07-01

    Parallelism is becoming the leading paradigm in today's computer architectures. In order to take full advantage of this development, new algorithms have to be specifically designed for parallel execution while many old ones have to be upgraded accordingly. One field in which parallel computing has been firmly established for many years is computer graphics. Calculating and displaying three-dimensional computer generated imagery in real time requires complex numerical operations to be performed at high speed on a large number of objects. Since most of these objects can be processed independently, parallel computing is applicable in this field. Modern graphics processing units (GPUs) have become capable of performing millions of matrix and vector operations per second on multiple objects simultaneously. As a side project, a software tool is currently being developed at the Institute of Aerospace Systems that provides an animated, three-dimensional visualization of both actual and simulated space debris objects. Due to the nature of these objects it is possible to process them individually and independently from each other. Therefore, an analytical orbit propagation algorithm has been implemented to run on a GPU. By taking advantage of all its processing power a huge performance increase, compared to its CPU-based counterpart, could be achieved. For several years efforts have been made to harness this computing power for applications other than computer graphics. Software tools for the simulation of space debris are among those that could profit from embracing parallelism. With recently emerged software development tools such as OpenCL it is possible to transfer the new algorithms used in the visualization outside the field of computer graphics and implement them, for example, into the space debris simulation environment. This way they can make use of parallel hardware such as GPUs and Multi-Core-CPUs for faster computation. In this paper the visualization software will be introduced, including a comparison between the serial and the parallel method of orbit propagation. Ways of how to use the benefits of the latter method for space debris simulation will be discussed. An introduction to OpenCL will be given as well as an exemplary algorithm from the field of space debris simulation.
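
    The propagation step is data-parallel because every object is advanced independently; the numpy sketch below is a CPU analogue of such a GPU kernel, using a bare Keplerian mean-motion advance (a real propagator would add perturbations).

      # Advance one million debris objects by one time step, all at once.
      import numpy as np

      MU = 398600.4418                                  # km^3/s^2 for Earth
      n_obj = 1_000_000
      a = np.random.uniform(6900.0, 42000.0, n_obj)     # semi-major axes (km)
      M = np.random.uniform(0.0, 2 * np.pi, n_obj)      # mean anomalies (rad)

      dt = 60.0                                         # one-minute step
      mean_motion = np.sqrt(MU / a**3)                  # rad/s, vectorized
      M = (M + mean_motion * dt) % (2 * np.pi)          # independent per object,
                                                        # hence one GPU thread each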

  7. Design and implementation of a cloud based lithography illumination pupil processing application

    NASA Astrophysics Data System (ADS)

    Zhang, Youbao; Ma, Xinghua; Zhu, Jing; Zhang, Fang; Huang, Huijie

    2017-02-01

    Pupil parameters are important parameters for evaluating the quality of a lithography illumination system. In this paper, a cloud-based, full-featured pupil processing application is implemented. A web browser is used for the UI (User Interface), the WebSocket protocol and JSON format are used for communication between the client and the server, and the computing part is implemented on the server side, where the application integrates a variety of high-quality professional libraries, such as the image processing libraries libvips and ImageMagick and the LaTeX-based automatic reporting system, to support the program. The cloud-based framework takes advantage of the server's superior computing power and rich software collection, and the program can run anywhere there is a modern browser thanks to its web UI design. Compared to the traditional software operation model (purchased, licensed, shipped, downloaded, installed, maintained, and upgraded), the new cloud-based approach, which requires no installation and is easy to use and maintain, opens up a new way of working. Cloud-based applications may well be the future of software development.
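
    A minimal sketch of the client/server exchange, using the Python websockets package in place of a browser client; the endpoint URL and message fields are illustrative assumptions, not the application's actual protocol.

      # Send a JSON task request over a WebSocket and read the JSON reply.
      import asyncio, json, websockets

      async def request_pupil_metrics():
          async with websockets.connect("ws://localhost:8765") as ws:
              await ws.send(json.dumps({"task": "pupil_params",
                                        "image": "scan_001.png"}))
              reply = json.loads(await ws.recv())   # server computes remotely
              print(reply)

      asyncio.run(request_pupil_metrics())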

  8. Embedded systems for supporting computer accessibility.

    PubMed

    Mulfari, Davide; Celesti, Antonio; Fazio, Maria; Villari, Massimo; Puliafito, Antonio

    2015-01-01

    Nowadays, customized AT (assistive technology) software solutions allow their users to interact with various kinds of computer systems. Such tools are generally available on personal devices (e.g., smartphones, laptops and so on) commonly used by a person with a disability. In this paper, we investigate a way of using such AT equipment to access many different devices that offer no assistive configuration of their own. The solution takes advantage of open-source hardware, and its core component consists of an affordable Linux embedded system: it grabs data coming from the assistive software, which runs on the user's personal device, and, after processing, generates native keyboard and mouse HID commands for the target computing device controlled by the end user. This process supports any operating system available on the target machine and requires no specialized software installation; therefore the user with a disability can rely on a single assistive tool to control a wide range of computing platforms, including conventional computers and many kinds of mobile devices, which receive input commands through the USB HID protocol.
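
    A hedged sketch of the final stage, assuming a Linux board configured as a USB HID gadget exposing /dev/hidg0 (the device path depends on how the board is set up): a keystroke is an 8-byte boot-keyboard report followed by a key-up report.

      # Emit one native keyboard keystroke to the target machine over USB HID.
      def type_key(keycode, device="/dev/hidg0"):
          press = bytes([0, 0, keycode, 0, 0, 0, 0, 0])  # modifiers, reserved, keys
          release = bytes(8)                             # all zeros = key up
          with open(device, "wb") as hid:
              hid.write(press)
              hid.write(release)

      type_key(0x04)   # HID usage 0x04 types the letter 'a' on the target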

  9. Aerosol and Surface Parameter Retrievals for a Multi-Angle, Multiband Spectrometer

    NASA Technical Reports Server (NTRS)

    Broderick, Daniel

    2012-01-01

    This software retrieves the surface and atmosphere parameters of multi-angle, multiband spectra. The synthetic spectra are generated by applying the modified Rahman-Pinty-Verstraete Bidirectional Reflectance Distribution Function (BRDF) model and a single-scattering-dominated atmosphere model to surface reflectance data from the Multi-angle Imaging SpectroRadiometer (MISR). The aerosol physical model uses a single-scattering approximation with Rayleigh-scattering molecules and Henyey-Greenstein aerosols. The surface and atmosphere parameters of the models are retrieved using the Levenberg-Marquardt algorithm. The software can retrieve the surface and atmosphere parameters at two different scales: the surface parameters are retrieved pixel-by-pixel, while the atmosphere parameters are retrieved for a group of pixels to which the same atmosphere model parameters are applied. This two-scale approach allows one to select the natural scale of the atmosphere properties relative to surface properties. The software also takes advantage of an intelligent initial condition given by the solution of the neighboring pixels.
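
    The sketch below illustrates the Levenberg-Marquardt retrieval step on a toy two-parameter reflectance model; the real software jointly fits the BRDF and atmosphere models, so this model function is only a stand-in.

      # Fit two surface parameters to synthetic multi-angle reflectances.
      import numpy as np
      from scipy.optimize import least_squares

      angles = np.radians([0.0, 26.1, 45.6, 60.0, 70.5])   # MISR-like view angles
      def model(p, ang):                                   # p = (amplitude, anisotropy)
          return p[0] * (1.0 + p[1] * np.cos(ang))

      truth = np.array([0.3, 0.4])
      obs = model(truth, angles) + np.random.normal(0, 0.002, angles.size)

      fit = least_squares(lambda p: model(p, angles) - obs,
                          x0=[0.2, 0.0], method="lm")      # Levenberg-Marquardt
      print(fit.x)                                         # recovers ~[0.3, 0.4]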

  10. Adding Hierarchical Objects to Relational Database General-Purpose XML-Based Information Managements

    NASA Technical Reports Server (NTRS)

    Lin, Shu-Chun; Knight, Chris; La, Tracy; Maluf, David; Bell, David; Tran, Khai Peter; Gawdiak, Yuri

    2006-01-01

    NETMARK is a flexible, high-throughput software system for managing, storing, and rapidly searching unstructured and semi-structured documents. NETMARK transforms such documents from their original highly complex, constantly changing, heterogeneous data formats into well-structured, common data formats using Hypertext Markup Language (HTML) and/or Extensible Markup Language (XML). The software implements an object-relational database system that combines the best practices of the relational model, utilizing Structured Query Language (SQL), with those of the object-oriented, semantic database model for creating complex data. In particular, NETMARK takes advantage of the Oracle 8i object-relational database model, using physical-address data types for very efficient keyword searches of records across both context and content. NETMARK also supports multiple international standards such as WebDAV for drag-and-drop file management and SOAP for integrated information management using Web services. The document-organization and -searching capabilities afforded by NETMARK are likely to make this software attractive for use in disciplines as diverse as science, auditing, and law enforcement.

  11. Automated software system for checking the structure and format of ACM SIG documents

    NASA Astrophysics Data System (ADS)

    Mirza, Arsalan Rahman; Sah, Melike

    2017-04-01

    Microsoft (MS) Office Word is one of the most commonly used software tools for creating documents. MS Word 2007 and above uses XML to represent the structure of MS Word documents. Metadata about the documents is automatically created using Office Open XML (OOXML) syntax. We develop a new framework, called ADFCS (Automated Document Format Checking System), that takes advantage of the OOXML metadata in order to extract semantic information from MS Office Word documents. In particular, we develop a new ontology for Association for Computing Machinery (ACM) Special Interest Group (SIG) documents, representing the structure and format of these documents in OWL (Web Ontology Language). The metadata is then extracted automatically in RDF (Resource Description Framework) according to this ontology using the developed software. Finally, we generate extensive rules in order to infer whether the documents are formatted according to ACM SIG standards. This paper introduces the ACM SIG ontology, the metadata extraction process, the inference engine, the ADFCS online user interface, the system evaluation and user study evaluations.
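
    As a sketch of the extraction step (the file name is illustrative): a .docx file is a ZIP archive whose word/document.xml carries the OOXML markup, so the paragraph styles that the format-checking rules reason over can be read directly.

      # List the paragraph style names used in a .docx document.
      import zipfile
      import xml.etree.ElementTree as ET

      W = "http://schemas.openxmlformats.org/wordprocessingml/2006/main"

      with zipfile.ZipFile("paper.docx") as docx:
          root = ET.fromstring(docx.read("word/document.xml"))

      for para in root.iter("{%s}p" % W):
          style = para.find("w:pPr/w:pStyle", {"w": W})  # e.g. 'Title', 'Heading1'
          if style is not None:
              print(style.get("{%s}val" % W))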

  12. Improving the strength of additively manufactured objects via modified interior structure

    NASA Astrophysics Data System (ADS)

    Al, Can Mert; Yaman, Ulas

    2017-10-01

    Additive manufacturing (AM), in other words 3D printing, is becoming more common because of its crucial advantages, such as geometric complexity and functional interior structures, over traditional manufacturing methods. Fused Filament Fabrication (FFF) 3D printing in particular is frequently used, since desktop variants of these printers suit many different fields and are improving rapidly. Despite the significant advantages of AM, the strength of parts fabricated with AM is still a major problem, especially when plastic materials such as acrylonitrile butadiene styrene (ABS), polylactic acid (PLA), or nylon are utilized. In this study, an alternative method is proposed in which the strength of AM-fabricated parts is improved employing a direct slicing approach. Traditional Computer-Aided Manufacturing (CAM) software for 3D printers takes only the geometry as an input, in triangular mesh form (stereolithography, STL file), generated by Computer-Aided Design software. This file format includes data only about the outer boundaries of the geometry. Interiors of the artifacts are manufactured with homogeneous infill patterns, such as diagonal, honeycomb, or linear, according to the paths generated in the CAM software. The method developed in this study provides a way to fabricate parts with heterogeneous infill patterns by utilizing the stress field data obtained from a Finite Element Analysis package such as ABAQUS. According to the performed tensile tests, the strength of the test specimen is improved by about 45% compared to the conventional way of 3D printing.

  13. A pluggable framework for parallel pairwise sequence search.

    PubMed

    Archuleta, Jeremy; Feng, Wu-chun; Tilevich, Eli

    2007-01-01

    The current and near future of the computing industry is one of multi-core and multi-processor technology. Most existing sequence-search tools have been designed with a focus on single-core, single-processor systems. This discrepancy between software design and hardware architecture substantially hinders sequence-search performance by not allowing full utilization of the hardware. This paper presents a novel framework that will aid the conversion of serial sequence-search tools into a parallel version that can take full advantage of the available hardware. The framework, which is based on a software architecture called mixin layers with refined roles, enables modules to be plugged into the framework with minimal effort. The inherent modular design improves maintenance and extensibility, thus opening up a plethora of opportunities for advanced algorithmic features to be developed and incorporated while routine maintenance of the codebase persists.
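
    A loose Python rendering of the pluggable idea (class names invented for illustration): search phases are mixins composed onto a core driver, so a parallel scheduler can be plugged in without touching the rest of the code.

      # Mixin-layer flavor: swap the scheduling layer, keep the core search.
      class CoreSearch:
          def run(self, queries):
              return [self.schedule(q) for q in queries]

      class SerialScheduler:
          def schedule(self, query):
              return f"scored {query} on one core"

      class MultiCoreScheduler:
          def schedule(self, query):
              return f"scored {query} on a worker pool"

      class SerialSearch(SerialScheduler, CoreSearch):
          pass

      class ParallelSearch(MultiCoreScheduler, CoreSearch):   # plugged-in layer
          pass

      print(ParallelSearch().run(["seq1", "seq2"]))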

  14. Shape optimization of road tunnel cross-section by simulated annealing

    NASA Astrophysics Data System (ADS)

    Sobótka, Maciej; Pachnicz, Michał

    2016-06-01

    The paper concerns shape optimization of a tunnel excavation cross-section. The study incorporates the simulated annealing (SA) optimization procedure. The form of the cost function derives from the energetic optimality condition formulated in the authors' previous papers. The algorithm takes advantage of the optimization procedure already published by the authors. Unlike other approaches presented in the literature, the one introduced in this paper takes into consideration the practical requirement of preserving a fixed clearance gauge. Itasca FLAC software is utilized in the numerical examples. The optimal excavation shapes are determined for five different in situ stress ratios; this factor significantly affects the optimal topology of the excavation. The resulting shapes are elongated in the direction of the greater principal stress. Moreover, the obtained optimal shapes have smooth contours circumscribing the gauge.
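
    For reference, a generic simulated-annealing skeleton of the kind such a shape search builds on; the quadratic cost below is a stand-in for the paper's energetic criterion, and the clearance-gauge constraint would enter as a rejection test on candidate shapes.

      # Generic simulated annealing: accept downhill moves always, uphill
      # moves with a temperature-controlled probability.
      import math, random

      def anneal(cost, x0, step, t0=1.0, cooling=0.995, iters=5000):
          x = best = x0
          t = t0
          for _ in range(iters):
              cand = step(x)                       # perturb the current design
              d = cost(cand) - cost(x)
              if d < 0 or random.random() < math.exp(-d / t):
                  x = cand
              if cost(x) < cost(best):
                  best = x
              t *= cooling                         # cool the temperature
          return best

      print(anneal(cost=lambda v: (v - 3.0) ** 2,
                   x0=0.0,
                   step=lambda v: v + random.uniform(-0.5, 0.5)))   # ~3.0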

  15. Energy-Saving Traffic Scheduling in Hybrid Software Defined Wireless Rechargeable Sensor Networks

    PubMed Central

    Wei, Yunkai; Ma, Xiaohui; Yang, Ning; Chen, Yijin

    2017-01-01

    Software Defined Wireless Rechargeable Sensor Networks (SDWRSNs) are an inexorable trend for Wireless Sensor Networks (WSNs), including Wireless Rechargeable Sensor Networks (WRSNs). However, traditional network devices cannot be completely substituted in the short term, so hybrid SDWRSNs, where software-defined devices and traditional devices coexist, will last for a long time. Hybrid SDWRSNs bring new challenges as well as opportunities for energy saving, which remains a key problem considering that the wireless chargers are also exhaustible, especially in harsh environments away from mains supply. Numerous energy-saving schemes for WSNs, and even some for WRSNs, are no longer suitable for the new features of hybrid SDWRSNs. To solve this problem, this paper puts forward an Energy-saving Traffic Scheduling (ETS) algorithm. The ETS algorithm takes the new characteristics of hybrid SDWRSNs into account and takes advantage of the Software Defined Networking (SDN) controller's direct control over SDN nodes and indirect control over normal nodes. The simulation results show that, compared with the traditional Minimum Transmission Energy (MTE) protocol, ETS can substantially improve the energy efficiency in hybrid SDWRSNs by 20–40% while ensuring feasible data delay. PMID:28914816

  17. High-performance software-only H.261 video compression on PC

    NASA Astrophysics Data System (ADS)

    Kasperovich, Leonid

    1996-03-01

    This paper describes an implementation of a software H.261 codec for the PC that takes advantage of the fast computational algorithms for DCT-based video compression presented by the author at the February 1995 SPIE/IS&T meeting. The motivation for developing the H.261 prototype system is to demonstrate the feasibility of a real-time, software-only videoconferencing solution that operates across a wide range of network bandwidths, frame rates, and input video resolutions. As network bandwidths increase, higher frame rates and resolutions of transmitted video become possible, which in turn requires a software codec able to compress pictures of CIF (352 x 288) resolution at up to 30 frames/sec. Running on a 133 MHz Pentium PC, the codec presented is capable of compressing video in CIF format at 21-23 frames/sec. This result is comparable to known hardware-based H.261 solutions, but it does not require any specific hardware. The methods used to achieve high performance and the program optimization techniques for the Pentium microprocessor are presented, along with a performance profile showing the actual contribution of the different encoding/decoding stages to the overall computational process.

  18. Representation of Serendipitous Scientific Data

    NASA Technical Reports Server (NTRS)

    James, Mark

    2006-01-01

    A computer program defines and implements an innovative kind of data structure that can be used for representing information derived from serendipitous discoveries made via collection of scientific data on long exploratory spacecraft missions. Data structures capable of collecting any kind of data can easily be implemented in advance, but the task of designing a fixed and efficient data structure suitable for processing raw data into useful information and taking advantage of serendipitous scientific discovery is becoming increasingly difficult as missions go deeper into space. The present software eases the task by enabling definition of arbitrarily complex data structures that can adapt at run time as raw data are transformed into other types of information. This software runs on a variety of computers, and can be distributed in either source code or binary code form. It must be run in conjunction with any one of a number of Lisp compilers that are available commercially or as shareware. It has no specific memory requirements and depends upon the other software with which it is used. This program is implemented as a library that is called by, and becomes folded into, the other software with which it is used.

  19. Robonaut's Flexible Information Technology Infrastructure

    NASA Technical Reports Server (NTRS)

    Askew, Scott; Bluethmann, William; Alder, Ken; Ambrose, Robert

    2003-01-01

    Robonaut, NASA's humanoid robot, is designed to work as both an astronaut assistant and, in certain situations, an astronaut surrogate. This highly dexterous robot performs complex tasks under telepresence control that could previously only be carried out directly by humans. Currently with 47 degrees of freedom (DOF), Robonaut is a state-of-the-art, human-size telemanipulator system. While many of Robonaut's embedded components have been custom designed to meet packaging or environmental requirements, the primary computing systems used in Robonaut are currently commercial-off-the-shelf (COTS) products which have some correlation to flight-qualified computer systems. This loose coupling of information technology (IT) resources allows Robonaut to exploit cost-effective solutions while floating the technology base to take advantage of the rapid pace of IT advances. These IT systems utilize a software development environment which is compatible with both COTS hardware and flight-proven computing systems, preserving the majority of software development for a flight system. The ability to use highly integrated and flexible COTS software development tools improves productivity while minimizing redesign for a space flight system. Further, the flexibility of Robonaut's software and communication architecture has allowed it to become a widely used distributed development testbed for integrating new capabilities and furthering experimental research.

  20. A Custom Approach for a Flexible, Real-Time and Reliable Software Defined Utility.

    PubMed

    Zaballos, Agustín; Navarro, Joan; Martín De Pozuelo, Ramon

    2018-02-28

    Information and communication technologies (ICTs) have enabled the evolution of traditional electric power distribution networks towards a new paradigm referred to as the smart grid. However, the different elements that compose the ICT plane of a smart grid are usually conceived as isolated systems that typically result in rigid hardware architectures, which are hard to interoperate, manage and adapt to new situations. In recent years, software-defined systems that take advantage of software and high-speed data network infrastructures have emerged as a promising alternative to classic ad hoc approaches in terms of integration, automation, real-time reconfiguration and resource reusability. The purpose of this paper is to propose the usage of software-defined utilities (SDUs) to address the latent deployment and management limitations of smart grids. More specifically, the implementation of a smart grid's data storage and management system prototype by means of SDUs is introduced, which exhibits the feasibility of this alternative approach. This system features a hybrid cloud architecture able to meet the data storage requirements of electric utilities and adapt itself to their ever-evolving needs. The experiments conducted endorse the feasibility of this solution and encourage practitioners to direct their efforts in this direction.

  2. Implementation of a system to provide mobile satellite services in North America

    NASA Technical Reports Server (NTRS)

    Johanson, Gary A.; Davies, N. George; Tisdale, William R. H.

    1993-01-01

    This paper describes the implementation of the ground network to support Mobile Satellite Services (MSS). The system is designed to take advantage of a powerful new satellite series and provides significant improvements in capacity and throughput over systems in service today. The system is described in terms of the services provided and the system architecture being implemented to deliver those services. The system operation is described including examples of a circuit switched and packet switched call placement. The physical architecture is presented showing the major hardware components and software functionality placement within the hardware.

  3. High throughput computing: a solution for scientific analysis

    USGS Publications Warehouse

    O'Donnell, M.

    2011-01-01

    …handle job failures due to hardware, software, or network interruptions (obviating the need to manually resubmit the job after each stoppage); be affordable; and, most importantly, allow us to complete very large, complex analyses that otherwise would not even be possible. In short, we envisioned a job-management system that would take advantage of unused FORT CPUs within a local area network (LAN) to effectively distribute and run highly complex analytical processes. What we found was a solution that uses High Throughput Computing (HTC) and High Performance Computing (HPC) systems to do exactly that (Figure 1).

  4. Telescience Resource Kit

    NASA Technical Reports Server (NTRS)

    Schneider, Michelle; Lippincott, Jeff; Chubb, Steve; Whitaker, Jimmy; Rice, Jim; Gillis, Robert; Sims, Chris; Sellers, Donna; Bailey, Darrell (Technical Monitor)

    2002-01-01

    The Telescience Resource Kit (TReK) is a PC based ground control system. It can be used by a single individual or in a group environment to monitor and control spacecraft systems and payloads. Capabilities include data receipt, data processing, data storage, data management, and data transmission. Commercial-Off-The-Shelf (COTS) hardware and software have been employed to reduce development costs, operations and maintenance costs, and to effectively take advantage of new commercial products as they become available. The TReK system is currently being used to monitor and control payloads aboard the International Space Station. It is located at sites around the world.

  5. Real-time control system for adaptive resonator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flath, L; An, J; Brase, J

    2000-07-24

    Sustained operation of high-average-power solid-state lasers currently requires an adaptive resonator to produce optimal beam quality. We describe the architecture of a real-time adaptive control system for correcting intra-cavity aberrations in a heat-capacity laser. Image data collected from a wavefront sensor are processed and used to control phase with a high-spatial-resolution deformable mirror. Our controller takes advantage of recent developments in low-cost, high-performance processor technology: a desktop-based computational engine and an object-oriented software architecture replace the high-cost rack-mount embedded computers of previous systems.

  6. Evolution of a standard microprocessor-based space computer

    NASA Technical Reports Server (NTRS)

    Fernandez, M.

    1980-01-01

    An existing, in-inventory computer hardware/software package (B-1 RFS/ECM) was repackaged and applied to multiple missile/space programs. Concurrently with the application efforts, low-risk modifications were made to the computer from program to program to take advantage of newer, advanced technology and to meet increasingly demanding requirements (computational and memory capabilities, longer life, and fault-tolerant autonomy). It is concluded that microprocessors hold promise in a number of critical areas for future space computer applications. However, the benefits of the DoD VHSIC Program are required, and the old proliferation problem must be revisited.

  7. An economic analysis for optimal distributed computing resources for mask synthesis and tape-out in production environment

    NASA Astrophysics Data System (ADS)

    Cork, Chris; Lugg, Robert; Chacko, Manoj; Levi, Shimon

    2005-06-01

    With the exponential increase in output database size due to the aggressive optical proximity correction (OPC) and resolution enhancement techniques (RET) required for deep sub-wavelength process nodes, the CPU time required for mask tape-out continues to increase significantly. For integrated device manufacturers (IDMs), this can impact the time-to-market for their products, where even a few days' delay could have a huge commercial impact and loss of market window opportunity. For foundries, a shorter turnaround time provides a competitive advantage in their demanding market: being too slow could mean customers look elsewhere for these services, while a fast turnaround may even command a higher price. With fab turnaround of a mature, plain-vanilla CMOS process at around 20-30 days, a delay of several days in mask tape-out would add a significant fraction to the total time to deliver prototypes. Unlike silicon processing, mask tape-out time can be decreased by simply purchasing extra computing resources and software licenses. Mask tape-out groups are taking advantage of the ever-decreasing cost and increasing power of commodity processors. The significant distributability inherent in some commercial mask synthesis software can be leveraged to address this critical business issue. Different implementations have different fractions of code that cannot be parallelized, and this affects the efficiency with which they scale, as described by Amdahl's law. Very few are efficient enough to allow the effective use of thousands of processors, enabling run times to drop from days to only minutes. What follows is a cost-aware methodology to quantify the scalability of this class of software, and thus act as a guide to estimating the optimal investment in terms of hardware and software licenses.
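
    Amdahl's law makes the scaling question quantitative: with serial fraction s, the speedup on n processors is 1/(s + (1-s)/n), so returns diminish and there is a cost-optimal fleet size. The 2% serial fraction below is purely illustrative.

      # Diminishing returns from adding CPUs/licenses under Amdahl's law.
      def speedup(n_cpus, serial_fraction=0.02):
          return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_cpus)

      for n in (1, 10, 100, 1000, 10000):
          print(n, round(speedup(n), 1))
      # 1 -> 1.0, 10 -> 8.5, 100 -> 33.6, 1000 -> 47.7, 10000 -> 49.8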

  8. Using all of your CPU's in HIPE

    NASA Astrophysics Data System (ADS)

    Jacobson, J. D.; Fadda, D.

    2012-09-01

    Modern computer architectures increasingly feature multi-core CPUs. For example, the MacBook Pro features the Intel quad-core i7 processor. Through the use of hyper-threading, where each core can execute two threads simultaneously, the quad-core i7 can support eight simultaneous processing threads. All this on your laptop! This CPU power can now be put into service by scientists to perform data reduction tasks, but only if the software has been designed to take advantage of multiple-processor architectures. Up to now, the software for Herschel data reduction (HIPE), written in Jython and Java, has been single-threaded and can only utilize a single processor, so users of HIPE get no advantage from the additional processors. Why not put all of the CPU resources to work reducing your data? We present a multi-threaded software application that corrects long-term transients in the signal from the PACS unchopped spectroscopy line scan mode. In this poster, we present a multi-threaded software framework for achieving performance improvements from parallel execution, and we show how the task that corrects transients in the PACS spectroscopy pipeline for the unchopped line scan mode has been threaded. This computation-intensive task uses either a one-parameter or a three-parameter exponential function to characterize the transient. The task uses a Java implementation of MINPACK, translated by the authors from the C (Moshier) and IDL (Markwardt) versions, to optimize the correction parameters. We also explain how to determine whether a task can benefit from threading (Amdahl's law) and whether it is safe to thread. The design and implementation, using the completion service of the Java concurrency package, are described. Pitfalls, timing bugs, thread safety, resource control, testing, and performance improvements are described and plotted.
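
    A minimal Python analogue of the completion-service pattern described above; the per-signal "fit" is a stand-in for the exponential transient correction.

      # Submit independent fits to a pool and consume results as they finish.
      import math
      from concurrent.futures import ThreadPoolExecutor, as_completed

      def fit_transient(signal_id):
          tau = 1.0 + (signal_id % 5) * 0.1       # pretend per-signal fit result
          return signal_id, math.exp(-1.0 / tau)

      with ThreadPoolExecutor(max_workers=8) as pool:   # one worker per thread
          futures = [pool.submit(fit_transient, i) for i in range(32)]
          for fut in as_completed(futures):             # arrive in finish order
              sid, correction = fut.result()
              print(sid, round(correction, 3))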

  9. A Measurement Framework for Team Level Assessment of Innovation Capability in Early Requirements Engineering

    NASA Astrophysics Data System (ADS)

    Regnell, Björn; Höst, Martin; Nilsson, Fredrik; Bengtsson, Henrik

    When developing software-intensive products for a market-place it is important for a development organisation to create innovative features for coming releases in order to achieve advantage over competitors. This paper focuses on assessment of innovation capability at team level in relation to the requirements engineering that is taking place before the actual product development projects are decided, when new business models, technology opportunities and intellectual property rights are created and investigated through e.g. prototyping and concept development. The result is a measurement framework focusing on four areas: innovation elicitation, selection, impact and ways-of-working. For each area, candidate measurements were derived from interviews to be used as inspiration in the development of a tailored measurement program. The framework is based on interviews with participants of a software team with specific innovation responsibilities and validated through cross-case analysis and feedback from practitioners.

  10. Rule-Based Design of Plant Expression Vectors Using GenoCAD.

    PubMed

    Coll, Anna; Wilson, Mandy L; Gruden, Kristina; Peccoud, Jean

    2015-01-01

    Plant synthetic biology requires software tools to assist on the design of complex multi-genic expression plasmids. Here a vector design strategy to express genes in plants is formalized and implemented as a grammar in GenoCAD, a Computer-Aided Design software for synthetic biology. It includes a library of plant biological parts organized in structural categories and a set of rules describing how to assemble these parts into large constructs. Rules developed here are organized and divided into three main subsections according to the aim of the final construct: protein localization studies, promoter analysis and protein-protein interaction experiments. The GenoCAD plant grammar guides the user through the design while allowing users to customize vectors according to their needs. Therefore the plant grammar implemented in GenoCAD will help plant biologists take advantage of methods from synthetic biology to design expression vectors supporting their research projects.

  11. Hadronic energy resolution of a highly granular scintillator-steel hadron calorimeter using software compensation techniques

    NASA Astrophysics Data System (ADS)

    Adloff, C.; Blaha, J.; Blaising, J.-J.; Drancourt, C.; Espargilière, A.; Gaglione, R.; Geffroy, N.; Karyotakis, Y.; Prast, J.; Vouters, G.; Francis, K.; Repond, J.; Smith, J.; Xia, L.; Baldolemar, E.; Li, J.; Park, S. T.; Sosebee, M.; White, A. P.; Yu, J.; Buanes, T.; Eigen, G.; Mikami, Y.; Watson, N. K.; Goto, T.; Mavromanolakis, G.; Thomson, M. A.; Ward, D. R.; Yan, W.; Benchekroun, D.; Hoummada, A.; Khoulaki, Y.; Benyamna, M.; Cârloganu, C.; Fehr, F.; Gay, P.; Manen, S.; Royer, L.; Blazey, G. C.; Dyshkant, A.; Lima, J. G. R.; Zutshi, V.; Hostachy, J.-Y.; Morin, L.; Cornett, U.; David, D.; Falley, G.; Gadow, K.; Göttlicher, P.; Günter, C.; Hermberg, B.; Karstensen, S.; Krivan, F.; Lucaci-Timoce, A.-I.; Lu, S.; Lutz, B.; Morozov, S.; Morgunov, V.; Reinecke, M.; Sefkow, F.; Smirnov, P.; Terwort, M.; Vargas-Trevino, A.; Feege, N.; Garutti, E.; Marchesini, I.; Ramilli, M.; Eckert, P.; Harion, T.; Kaplan, A.; Schultz-Coulon, H.-Ch; Shen, W.; Stamen, R.; Tadday, A.; Bilki, B.; Norbeck, E.; Onel, Y.; Wilson, G. W.; Kawagoe, K.; Dauncey, P. D.; Magnan, A.-M.; Wing, M.; Salvatore, F.; Calvo Alamillo, E.; Fouz, M.-C.; Puerta-Pelayo, J.; Balagura, V.; Bobchenko, B.; Chadeeva, M.; Danilov, M.; Epifantsev, A.; Markin, O.; Mizuk, R.; Novikov, E.; Rusinov, V.; Tarkovsky, E.; Kirikova, N.; Kozlov, V.; Smirnov, P.; Soloviev, Y.; Buzhan, P.; Dolgoshein, B.; Ilyin, A.; Kantserov, V.; Kaplin, V.; Karakash, A.; Popova, E.; Smirnov, S.; Kiesling, C.; Pfau, S.; Seidel, K.; Simon, F.; Soldner, C.; Szalay, M.; Tesar, M.; Weuste, L.; Bonis, J.; Bouquet, B.; Callier, S.; Cornebise, P.; Doublet, Ph; Dulucq, F.; Faucci Giannelli, M.; Fleury, J.; Li, H.; Martin-Chassard, G.; Richard, F.; de la Taille, Ch; Pöschl, R.; Raux, L.; Seguin-Moreau, N.; Wicek, F.; Anduze, M.; Boudry, V.; Brient, J.-C.; Jeans, D.; Mora de Freitas, P.; Musat, G.; Reinhard, M.; Ruan, M.; Videau, H.; Bulanek, B.; Zacek, J.; Cvach, J.; Gallus, P.; Havranek, M.; Janata, M.; Kvasnicka, J.; Lednicky, D.; Marcisovsky, M.; Polak, I.; Popule, J.; Tomasek, L.; Tomasek, M.; Ruzicka, P.; Sicho, P.; Smolik, J.; Vrba, V.; Zalesak, J.; Belhorma, B.; Ghazlane, H.; Takeshita, T.; Uozumi, S.; Sauer, J.; Weber, S.; Zeitnitz, C.

    2012-09-01

    The energy resolution of a highly granular 1 m³ analogue scintillator-steel hadronic calorimeter is studied using charged pions with energies from 10 GeV to 80 GeV at the CERN SPS. The energy resolution for single hadrons is determined to be approximately 58%/√(E/GeV). This resolution is improved to approximately 45%/√(E/GeV) with software compensation techniques. These techniques take advantage of the event-by-event information about the substructure of hadronic showers which is provided by the imaging capabilities of the calorimeter. The energy reconstruction is improved either with corrections based on the local energy density or by applying a single correction factor to the event energy sum derived from a global measure of the shower energy density. The application of the compensation algorithms to Geant4 simulations yields resolution improvements comparable to those observed for real data.

  12. Managing Ada development

    NASA Technical Reports Server (NTRS)

    Green, James R.

    1986-01-01

    The Ada programming language was developed under the sponsorship of the Department of Defense to address the soaring costs associated with software development and maintenance. Ada is powerful, yet taking full advantage of that power means dealing with a language sufficiently complex and different from current programming approaches that there is considerable risk associated with committing a program to be done in Ada. There are also few programs of any substantial size that have been implemented using Ada that may be studied to determine those management methods that resulted in a successful Ada project. The items presented are the author's opinions, formed as a result of going through a software development experience. The difficulties faced, risks assumed, management methods applied, lessons learned and, most importantly, the techniques that were successful are all valuable sources of management information for those managers ready to undertake major Ada development projects.

  13. Models and algorithm of optimization launch and deployment of virtual network functions in the virtual data center

    NASA Astrophysics Data System (ADS)

    Bolodurina, I. P.; Parfenov, D. I.

    2017-10-01

    The goal of our investigation is the optimization of network operation in a virtual data center. The advantage of modern infrastructure virtualization lies in the possibility of using software-defined networks. However, existing algorithmic optimization solutions do not take into account the specific features of working with multiple classes of virtual network functions. The current paper describes models characterizing the basic structural objects of a virtual data center, including: a level distribution model of the software-defined infrastructure of a virtual data center, a generalized model of a virtual network function, and a neural network model for the identification of virtual network functions. We also developed an efficient algorithm for containerizing virtual network functions in a virtual data center, and we propose an efficient algorithm for placing virtual network functions. In our investigation we also generalize the well-known heuristic and deterministic Karmarkar-Karp algorithms.

  14. The pilot climate data system

    NASA Technical Reports Server (NTRS)

    Reph, M. G.; Treinish, L. A.; Smith, P. H.

    1984-01-01

    The Pilot Climate Data System (PCDS) is an interactive scientific information management system for locating, obtaining, manipulating, and displaying climate-research data. The PCDS was developed to manage a large collection of data of interest to the National Aeronautics and Space Administration's (NASA) research community and currently provides such support for approximately twenty data sets. In order to provide the PCDS capabilities, NASA's Goddard Space Flight Center (NASA/GSFC) has integrated the capabilities of several general-purpose software packages with specialized software for reading and reformatting the supported data sets. These capabilities were integrated in a manner which allows the PCDS to be easily expanded, either to provide support for additional data sets or to provide additional functional capabilities. This also allows the PCDS to take advantage of new technology as it becomes available, since parts of the system can be replaced with more powerful components without significantly affecting the user interface.

  15. A graphic user interface for efficient 3D photo-reconstruction based on free software

    NASA Astrophysics Data System (ADS)

    Castillo, Carlos; James, Michael; Gómez, Jose A.

    2015-04-01

    Recently, different studies have stressed the applicability of 3D photo-reconstruction based on Structure from Motion algorithms in a wide range of geoscience applications. For image photo-reconstruction, a number of commercial and freely available software packages have been developed (e.g. Agisoft Photoscan, VisualSFM). The workflow typically involves different stages, such as image matching, sparse and dense photo-reconstruction, point cloud filtering, and georeferencing. For approaches using open and free software, each of these stages usually requires a different application. In this communication, we present an easy-to-use graphic user interface (GUI), developed in Matlab® code, as a tool for efficient 3D photo-reconstruction making use of powerful existing software: VisualSFM (Wu, 2015) for photo-reconstruction and CloudCompare (Girardeau-Montaut, 2015) for point cloud processing. The GUI acts as a manager of configurations and algorithms, taking advantage of the command-line modes of the existing software, which allows an intuitive and automated processing workflow for the geoscience user. The GUI includes several additional features: a) a routine for significantly reducing the duration of the image matching operation, normally the most time-consuming stage; b) graphical outputs for understanding the overall performance of the algorithm (e.g. camera connectivity, point cloud density); c) a number of useful options typically performed before and after the photo-reconstruction stage (e.g. removal of blurry images, image renaming, vegetation filtering); d) a manager of batch processing for the automated reconstruction of different image datasets. In this study we explore the advantages of this new tool by testing its performance using imagery collected in several soil erosion applications. References: Girardeau-Montaut, D. 2015. CloudCompare documentation, accessed at http://cloudcompare.org/ ; Wu, C. 2015. VisualSFM documentation, accessed at http://ccwu.me/vsfm/doc.html#.

  16. A Microarray Tool Provides Pathway and GO Term Analysis.

    PubMed

    Koch, Martin; Royer, Hans-Dieter; Wiese, Michael

    2011-12-01

    Analysis of gene expression profiles is no longer exclusively a task for bioinformatics experts. However, obtaining statistically significant results is challenging and requires both biological knowledge and computational know-how. Here we present a novel, user-friendly microarray reporting tool called maRt. The software provides access to bioinformatic resources, such as gene ontology terms and biological pathways, by use of the DAVID and BioMart web services. Results are summarized in structured HTML reports, each presenting a different layer of information. In these reports, content from diverse sources is integrated and interlinked. To speed up processing, maRt takes advantage of the multi-core technology of modern desktop computers by using parallel processing. Since the software is built upon an RCP infrastructure, it may be a starting point for developers aiming to integrate novel R-based applications. The installer, documentation and various kinds of tutorials are available under the LGPL license at the website of our institute, http://www.pharma.uni-bonn.de/www/mart. This software is free for academic use. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Novel Method of Storing and Reconstructing Events at Fermilab E-906/SeaQuest Using a MySQL Database

    NASA Astrophysics Data System (ADS)

    Hague, Tyler

    2010-11-01

    Fermilab E-906/SeaQuest is a fixed target experiment at Fermi National Accelerator Laboratory. We are investigating the antiquark asymmetry in the nucleon sea. By examining the ratio of the Drell-Yan cross sections of proton-proton and proton-deuterium collisions we can determine the asymmetry ratio. An essential step in developing the analysis software is updating the event reconstruction to modern software tools. We are taking a unique approach by performing the majority of the calculations within an SQL database. Using a MySQL database allows us to take advantage of off-the-shelf software without sacrificing ROOT compatibility, and to avoid network bottlenecks through server-side data selection. Using our raw data we create stubs, or partial tracks, at each station, which are pieced together to create full tracks. Our reconstruction process uses dynamically created SQL statements to analyze the data. These SQL statements create tables that contain the final reconstructed tracks as well as intermediate values. This poster will explain the reconstruction process and how it is being implemented.
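    A minimal sketch of the idea of pushing reconstruction into the database: SQL is generated per station and executed server-side, so only result tables travel over the network. The schema, table names, cut value, and connection parameters below are hypothetical, not SeaQuest's actual ones:

        # Sketch of dynamically generated SQL for stub finding (schema invented).
        STATIONS = [1, 2, 3]

        def stub_sql(station: int) -> str:
            # Pair hits from two measuring planes of one station into stubs.
            return f"""
                CREATE TABLE stubs_st{station} AS
                SELECT a.hit_id AS hit_x, b.hit_id AS hit_y,
                       (a.position + b.position) / 2 AS position_estimate
                FROM hits a JOIN hits b
                  ON a.event_id = b.event_id
                WHERE a.station = {station} AND a.plane = 'X'
                  AND b.station = {station} AND b.plane = 'Y'
                  AND ABS(a.position - b.position) < 5.0
            """

        import mysql.connector  # off-the-shelf MySQL driver

        conn = mysql.connector.connect(host="localhost", user="reader",
                                       password="...", database="seaquest")
        cur = conn.cursor()
        for st in STATIONS:
            cur.execute(stub_sql(st))   # the heavy lifting happens server-side
        conn.commit()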

  18. A real-time GNSS-R system based on software-defined radio and graphics processing units

    NASA Astrophysics Data System (ADS)

    Hobiger, Thomas; Amagai, Jun; Aida, Masanori; Narita, Hideki

    2012-04-01

    Reflected signals of the Global Navigation Satellite System (GNSS) from the sea or land surface can be utilized to deduce and monitor physical and geophysical parameters of the reflecting area. Unlike most other remote sensing techniques, GNSS-Reflectometry (GNSS-R) operates as a passive radar that takes advantage of the increasing number of navigation satellites that broadcast their L-band signals. To date, however, most GNSS-R receiver architectures have been based on dedicated hardware solutions. Software-defined radio (SDR) technology has advanced in recent years and now enables signal processing in real time, which makes it an ideal candidate for the realization of a flexible GNSS-R system. Additionally, modern commodity graphics cards, which offer massive parallel computing performance, make it possible to handle the whole signal processing chain without interfering with the PC's CPU. Thus, this paper describes a GNSS-R system that has been developed on the principles of software-defined radio supported by General Purpose Graphics Processing Units (GPGPUs), and presents results from initial field tests which confirm the anticipated capability of the system.
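    The core operation such a software receiver parallelizes is the correlation of received samples against a local replica of the satellite code. A CPU sketch with NumPy of that one step (on the GPU the same FFTs run across the card's parallel cores; this is the generic technique, not the paper's implementation):

        import numpy as np

        def circular_correlation(signal: np.ndarray, replica: np.ndarray) -> np.ndarray:
            """Correlate one code period of complex baseband samples against a
            local PRN replica. The FFT turns an O(N^2) sliding correlation into
            O(N log N), which is what makes real-time processing feasible."""
            return np.fft.ifft(np.fft.fft(signal) * np.conj(np.fft.fft(replica)))

        # The code phase of the (reflected) signal is the peak of the magnitude:
        # delay = np.argmax(np.abs(circular_correlation(rx, replica)))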

  19. Auto-Generated Semantic Processing Services

    NASA Technical Reports Server (NTRS)

    Davis, Rodney; Hupf, Greg

    2009-01-01

    Auto-Generated Semantic Processing (AGSP) Services is a suite of software tools for automated generation of other computer programs, denoted cross-platform semantic adapters, that support interoperability of computer-based communication systems that utilize a variety of both new and legacy communication software running in a variety of operating-system/computer-hardware combinations. AGSP has numerous potential uses in military, space-exploration, and other government applications as well as in commercial telecommunications. The cross-platform semantic adapters take advantage of common features of computer-based communication systems to enforce semantics, messaging protocols, and standards of processing of streams of binary data to ensure integrity of data and consistency of meaning among interoperating systems. The auto-generation aspect of AGSP Services reduces development time and effort by emphasizing specification and minimizing implementation: in effect, the design, building, and debugging of software for effecting conversions among complex communication protocols, custom device mappings, and unique data-manipulation algorithms are replaced with metadata specifications that map to an abstract platform-independent communications model. AGSP Services is modular and has been shown to be easily integrable into new and legacy NASA flight and ground communication systems.
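    A toy sketch of what "generate the adapter from a metadata specification" can mean in practice; AGSP's real metadata format is not described here, so the specification dictionary, field names, and types below are invented for illustration:

        import struct

        # Hypothetical platform-independent message specification.
        SPEC = {
            "name": "Telemetry",
            "byte_order": ">",                      # big-endian on the wire
            "fields": [("seq", "I"), ("temp", "f"), ("status", "B")],
        }

        def make_adapter(spec):
            """Generate pack/unpack functions from the metadata specification,
            so no per-message serialization code is written by hand."""
            fmt = spec["byte_order"] + "".join(code for _, code in spec["fields"])
            names = [name for name, _ in spec["fields"]]
            def pack(msg: dict) -> bytes:
                return struct.pack(fmt, *(msg[n] for n in names))
            def unpack(raw: bytes) -> dict:
                return dict(zip(names, struct.unpack(fmt, raw)))
            return pack, unpack

        pack, unpack = make_adapter(SPEC)
        assert unpack(pack({"seq": 1, "temp": 21.5, "status": 0}))["seq"] == 1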

  20. Implementation of a data management software system for SSME test history data

    NASA Technical Reports Server (NTRS)

    Abernethy, Kenneth

    1986-01-01

    The implementation of a software system for managing Space Shuttle Main Engine (SSME) test/flight historical data is presented. The software system uses the database management system RIM7 for primary data storage and routine data management, but includes several FORTRAN programs, described here, which provide customized access to the RIM7 database. The consolidation, modification, and transfer of data from the database THIST to the RIM7 database THISRM is discussed. The RIM7 utility modules for generating some standard reports from THISRM and performing some routine updating and maintenance are briefly described. The FORTRAN accessing programs described include programs for initial loading of large data sets into the database, capturing data from files for database inclusion, and producing specialized statistical reports which cannot be provided by the RIM7 report generator utility. An expert system tutorial, constructed using the expert system shell product INSIGHT2, is described. Finally, a potential expert system, which would analyze data in the database, is outlined. This system could also use INSIGHT2 and would take advantage of RIM7's compatibility with the microcomputer database system RBase 5000.

  1. Permuting input for more effective sampling of 3D conformer space

    NASA Astrophysics Data System (ADS)

    Carta, Giorgio; Onnis, Valeria; Knox, Andrew J. S.; Fayne, Darren; Lloyd, David G.

    2006-03-01

    SMILES strings and other classic 2D structural formats offer a convenient way to represent molecules as a simple connection table, with the inherent advantages of ease of handling and storage. In the context of virtual screening, chemical databases to be screened are often initially represented by canonicalised SMILES strings that can be filtered and pre-processed in a number of ways, resulting in molecules that occupy similar regions of chemical space to active compounds of a therapeutic target. A wide variety of software exists to convert molecules into SMILES format, including Mol2smi (Daylight Inc.), MOE (Chemical Computing Group) and Babel (Openeye Scientific Software). Depending on the algorithm employed, the atoms of a SMILES string defining a molecule can be ordered differently, and upon conversion to 3D coordinates these orderings produce ostensibly the same molecule. In this work we show how different permutations of a SMILES string can affect conformer generation, affecting the reliability and repeatability of the results. Furthermore, we propose a novel procedure for the generation of conformers, taking advantage of the permutation of the input strings (both SMILES and other 2D formats), leading to more effective sampling of conformational space in the output, and also implementing a fingerprint and principal component analysis step to post-process and visualise the results.
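    Generating such input permutations is straightforward with open tools. A short sketch using RDKit (used here as a freely available stand-in for the converters named above; the paper itself does not use RDKit):

        import random
        from rdkit import Chem

        def smiles_permutations(smiles: str, n: int = 5, seed: int = 0):
            """Return SMILES strings that encode the same molecule with
            different atom orderings, i.e. the kind of input permutation
            whose effect on conformer generation is studied above."""
            rng = random.Random(seed)
            mol = Chem.MolFromSmiles(smiles)
            out = set()
            for _ in range(n):
                order = list(range(mol.GetNumAtoms()))
                rng.shuffle(order)
                permuted = Chem.RenumberAtoms(mol, order)
                out.add(Chem.MolToSmiles(permuted, canonical=False))
            return sorted(out)

        print(smiles_permutations("CC(=O)Oc1ccccc1C(=O)O"))  # aspirin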

  2. Adapting a Computerized Medical Dictation System to Prepare Academic Papers in Radiology.

    PubMed

    Sánchez, Yadiel; Prabhakar, Anand M; Uppot, Raul N

    2017-09-14

    Every day, radiologists use dictation software to compose clinical reports of imaging findings. The dictation software is tailored for medical use and to the speech pattern of each radiologist. Over the past 10 years we have used dictation software to compose academic manuscripts, correspondence letters, and the text of educational exhibits. The main advantage of voice dictation is faster composition of manuscripts. However, use of such software requires preparation. The purpose of this article is to review the steps of adapting clinical dictation software for dictating academic manuscripts and to detail the advantages and limitations of this technique. Copyright © 2017 Elsevier Inc. All rights reserved.

  3. The Chorus Conflict and Loss of Separation Resolution Algorithms

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Hagen, George E.; Maddalon, Jeffrey M.

    2013-01-01

    The Chorus software is designed to investigate near-term, tactical conflict and loss of separation detection and resolution concepts for air traffic management. This software is currently being used in two different problem domains: en-route self-separation and sense-and-avoid for unmanned aircraft systems. This paper describes the core resolution algorithms that are part of Chorus. The combination of several features of the Chorus program distinguishes this software from other approaches to conflict and loss of separation resolution. First, the program stores a history of state information over time, which enables it to handle communication dropouts and take advantage of previous input data. Second, the underlying conflict algorithms find resolutions that solve the most urgent conflict, but also seek to prevent secondary conflicts with the other aircraft. Third, if the program is run on multiple aircraft, and the two aircraft maneuver at the same time, the result will be implicitly coordinated. This implicit coordination property is established by ensuring that a resolution produced by Chorus will comply with a mathematically defined criterion whose correctness has been formally verified. Fourth, the program produces both instantaneous solutions and kinematic solutions, which are based on simple acceleration models. Finally, the program provides resolutions for recovery from loss of separation. Versions of this software are implemented as both Java and C++ programs.

  4. Design of testbed and emulation tools

    NASA Technical Reports Server (NTRS)

    Lundstrom, S. F.; Flynn, M. J.

    1986-01-01

    The research summarized was concerned with the design of testbed and emulation tools suitable to assist in projecting, with reasonable accuracy, the expected performance of highly concurrent computing systems on large, complete applications. Such testbed and emulation tools are intended for the eventual use of those exploring new concurrent system architectures and organizations, either as users or as designers of such systems. While a range of alternatives was considered, a software-based set of hierarchical tools was chosen to provide maximum flexibility, to ease moving to new computers as technology improves, and to take advantage of the inherent reliability and availability of commercially available computing systems.

  5. Hitchhiker On Space Station

    NASA Technical Reports Server (NTRS)

    Daelemans, Gerard; Goldsmith, Theodore

    1999-01-01

    The NASA/GSFC Shuttle Small Payloads Projects Office (SSPPO) has been studying the feasibility of migrating Hitchhiker customers, past, present and future, to the International Space Station via a "Hitchhiker like" carrier system. SSPPO has been tasked to make maximum use of existing hardware and software systems and infrastructure in its study of an ISS based carrier system. This paper summarizes the results of the SSPPO Hitchhiker on International Space Station (ISS) study. Included are a number of "Hitchhiker like" carrier system concepts that take advantage of the various ISS attached payload accommodation sites. Emphasis will be given to a HH concept that attaches to the Japanese Experiment Module - Exposed Facility (JEM-EF).

  6. Fast Photon Monte Carlo for Water Cherenkov Detectors

    NASA Astrophysics Data System (ADS)

    Latorre, Anthony; Seibert, Stanley

    2012-03-01

    We present Chroma, a high performance optical photon simulation for large particle physics detectors, such as the water Cherenkov far detector option for LBNE. This software takes advantage of the CUDA parallel computing platform to propagate photons using modern graphics processing units. In a computer model of a 200 kiloton water Cherenkov detector with 29,000 photomultiplier tubes, Chroma can propagate 2.5 million photons per second, around 200 times faster than the same simulation with Geant4. Chroma uses a surface-based approach to modeling geometry, which offers many benefits over the solid-based modeling approach used in other simulations such as Geant4.
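    In a surface-based model the detector is a triangle mesh, and photon steps are found by intersecting rays with triangles. As a generic illustration of that kernel (the standard Möller-Trumbore test, shown here in Python; Chroma's own tracking is implemented in CUDA and this is not its code):

        import numpy as np

        def ray_triangle(origin, direction, v0, v1, v2, eps=1e-9):
            """Distance along the ray to the triangle, or None if it misses."""
            e1, e2 = v1 - v0, v2 - v0
            p = np.cross(direction, e2)
            det = np.dot(e1, p)
            if abs(det) < eps:            # ray parallel to the triangle plane
                return None
            inv = 1.0 / det
            s = origin - v0
            u = np.dot(s, p) * inv        # first barycentric coordinate
            if u < 0.0 or u > 1.0:
                return None
            q = np.cross(s, e1)
            v = np.dot(direction, q) * inv
            if v < 0.0 or u + v > 1.0:
                return None
            t = np.dot(e2, q) * inv       # distance along the ray
            return t if t > eps else None

    On a GPU this test runs for millions of photons in parallel, one thread per photon, which is where the reported speed-up comes from.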

  7. Customer social network affects marketing strategy: A simulation analysis based on competitive diffusion model

    NASA Astrophysics Data System (ADS)

    Hou, Rui; Wu, Jiawen; Du, Helen S.

    2017-03-01

    To explain the competition phenomenon and outcome between QQ and MSN (China) in the Chinese instant messaging software market, this paper develops a new population competition model based on the customer social network. The simulation results show that the firm whose product has the greater network externality effect will gain more market share than its rival when the same marketing strategy is used. The firm with a time advantage, derived from the initial scale effect, will become more competitive than its rival when the two compete for a common pool of customers within a social network, verifying the winner-take-all phenomenon in this case.
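    A toy version of the mechanism (an aggregate sketch with invented coefficients; the paper's model is network-based and richer than this):

        # Two-product diffusion where growth scales with the installed base
        # (the network externality) times the pool of uncaptured customers.
        def simulate(steps=200, N=1.0, k1=0.35, k2=0.30, dt=0.1):
            x1, x2 = 0.01, 0.01              # initial adopter fractions
            for _ in range(steps):
                pool = N - x1 - x2           # customers not yet captured
                x1 += dt * k1 * x1 * pool
                x2 += dt * k2 * x2 * pool
            return x1, x2

        print(simulate())  # the stronger-externality product ends with more share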

  8. Evolution of the ATLAS Software Framework towards Concurrency

    NASA Astrophysics Data System (ADS)

    Jones, R. W. L.; Stewart, G. A.; Leggett, C.; Wynne, B. M.

    2015-05-01

    The ATLAS experiment has successfully used its Gaudi/Athena software framework for data taking and analysis during the first LHC run, with billions of events successfully processed. However, the design of Gaudi/Athena dates from the early 2000s, and the framework and physics code have been written using a single-threaded, serial design. This programming model has increasing difficulty in exploiting the potential of current CPUs, which offer their best performance only when full advantage is taken of multiple cores and wide vector registers. Future CPU evolution will intensify this trend, with core counts increasing and memory per core falling. Maximising performance per watt will be a key metric, so all of these cores must be used as efficiently as possible. In order to address the deficiencies of the current framework, ATLAS has embarked upon two projects: first, a practical demonstration of the use of multi-threading in our reconstruction software, using the GaudiHive framework; second, an exercise to gather requirements for an updated framework, going back to first principles of how event processing occurs. In this paper we report on both aspects of this work. For the hive-based demonstrators, we discuss what changes were necessary in order to allow the serially designed ATLAS code to run, both to the framework and to the tools and algorithms used. We report on the general lessons learned about the code patterns that had been employed in the software and which patterns were identified as particularly problematic for multi-threading. These lessons were fed into our considerations of a new framework, and we present preliminary conclusions on this work. In particular we identify areas where the framework can be simplified in order to aid the implementation of a concurrent event processing scheme. Finally, we discuss the practical difficulties involved in migrating a large established code base to a multi-threaded framework and how this can be achieved for LHC Run 3.

  9. 3D OCT imaging in clinical settings: toward quantitative measurements of retinal structures

    NASA Astrophysics Data System (ADS)

    Zawadzki, Robert J.; Fuller, Alfred R.; Zhao, Mingtao; Wiley, David F.; Choi, Stacey S.; Bower, Bradley A.; Hamann, Bernd; Izatt, Joseph A.; Werner, John S.

    2006-02-01

    The acquisition speed of current FD-OCT (Fourier Domain - Optical Coherence Tomography) instruments allows rapid screening of three-dimensional (3D) volumes of human retinas in clinical settings. Taking advantage of this ability requires software that lets physicians display and access volumetric data and that supports post-processing to extract important quantitative information such as thickness maps and segmented volumes. We describe our clinical FD-OCT system used to acquire 3D data from the human retina over the macula and optic nerve head. B-scans are registered to remove motion artifacts and post-processed with customized 3D visualization and analysis software. Our analysis software includes standard 3D visualization techniques along with a machine-learning support vector machine (SVM) algorithm that allows a user to semi-automatically segment different retinal structures and layers. Our program makes possible measurements of retinal layer thickness as well as volumes of structures of interest, despite the presence of noise and structural deformations associated with retinal pathology. Our software has been tested successfully in clinical settings for its efficacy in assessing 3D retinal structures in healthy as well as diseased cases. Our tool facilitates diagnosis and treatment monitoring of retinal diseases.
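    Semi-automatic SVM segmentation in miniature, sketched with scikit-learn (a stand-in; the paper's tool and its feature set are its own, and the two features used here, voxel intensity and axial depth, are placeholders):

        import numpy as np
        from sklearn.svm import SVC

        def segment(volume, seeds, labels):
            """Classify every voxel of a 3D volume from a few user-labeled
            seed voxels. seeds: list of (z, y, x) tuples; labels: class ids."""
            z = np.indices(volume.shape)[0]
            feats = np.stack([volume.ravel(), z.ravel()], axis=1).astype(float)
            train = np.array([[volume[s], s[0]] for s in seeds], dtype=float)
            clf = SVC(kernel="rbf", gamma="scale").fit(train, labels)
            return clf.predict(feats).reshape(volume.shape)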

  10. Real-Time Point Positioning Performance Evaluation of Single-Frequency Receivers Using NASA's Global Differential GPS System

    NASA Technical Reports Server (NTRS)

    Muellerschoen, Ronald J.; Iijima, Byron; Meyer, Robert; Bar-Sever, Yoaz; Accad, Elie

    2004-01-01

    This paper evaluates the performance of a single-frequency receiver using the 1-Hz differential corrections provided by NASA's global differential GPS system. While the dual-frequency user has the ability to eliminate the ionosphere error by taking a linear combination of observables, the single-frequency user must remove or calibrate this error by other means. To remove the ionosphere error we take advantage of the fact that the ionospheric group delay in the range observable and the carrier-phase advance have the same magnitude but opposite sign. A way to calibrate this error is to use a real-time database of grid points computed by JPL's RTI (Real-Time Ionosphere) software. In both cases we evaluate the positional accuracy of a kinematic carrier-phase-based point positioning method on a global scale.
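    In symbols, to first order (the standard model; the epsilon terms collect noise and other errors, \rho is the non-dispersive part, and \lambda N is the carrier-phase ambiguity):

        P            = \rho + I + \epsilon_P
        \Phi         = \rho - I + \lambda N + \epsilon_\Phi
        (P + \Phi)/2 = \rho + \lambda N / 2 + \epsilon

    The ionospheric term I cancels in the half-sum (known in the GPS literature as the GRAPHIC combination), at the price of a halved, but still constant, ambiguity that the positioning filter must estimate.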

  11. GPU Accelerated Chemical Similarity Calculation for Compound Library Comparison

    PubMed Central

    Ma, Chao; Wang, Lirong; Xie, Xiang-Qun

    2012-01-01

    Chemical similarity calculation plays an important role in compound library design, virtual screening, and “lead” optimization. In this manuscript, we present a novel GPU-accelerated algorithm for all-vs-all Tanimoto matrix calculation and nearest neighbor search. By taking advantage of the multi-core GPU architecture and CUDA parallel programming technology, the algorithm is up to 39 times faster than existing commercial software running on CPUs. Because of the utilization of intrinsic GPU instructions, this approach is nearly 10 times faster than an existing GPU-accelerated sparse-vector algorithm when Unity fingerprints are used for the Tanimoto calculation. The GPU program that implements this new method takes about 20 minutes to complete the calculation of Tanimoto coefficients between 32M PubChem compounds and 10K Active Probes compounds, i.e., 324G Tanimoto coefficients, on a 128-CUDA-core GPU. PMID:21692447
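    The quantity being computed is easy to state. A CPU reference in NumPy for the all-vs-all Tanimoto matrix (the paper's speed comes from the CUDA kernel and intrinsic population-count instructions, which this sketch does not attempt to reproduce):

        import numpy as np

        def tanimoto_matrix(A: np.ndarray, B: np.ndarray) -> np.ndarray:
            """All-vs-all Tanimoto coefficients for binary fingerprints.
            A: (n, bits), B: (m, bits) arrays of 0/1. T = c / (a + b - c),
            with a, b the on-bits per fingerprint and c the bits in common."""
            A = A.astype(np.float32)
            B = B.astype(np.float32)
            c = A @ B.T                       # common on-bits via matrix product
            a = A.sum(axis=1)[:, None]
            b = B.sum(axis=1)[None, :]
            return c / (a + b - c)

        rng = np.random.default_rng(0)
        fp = rng.integers(0, 2, size=(100, 1024))
        print(tanimoto_matrix(fp, fp).diagonal()[:3])   # self-similarity = 1.0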

  12. Successful Teaching of Radiobiology Students in the Medical Management of Acute Radiation Effects From Real Case Histories Using Clinical Signs and Symptoms and Taking Advantage of Recently Developed Software Tools.

    PubMed

    Majewski, Matthäus; Combs, Stephanie E; Trott, Klaus-Rüdiger; Abend, Michael; Port, Matthias

    2018-07-01

    In 2015, the Bundeswehr Institute of Radiobiology organized a North Atlantic Treaty Organization exercise to examine the significance of clinical signs and symptoms for the prediction of late-occurring acute radiation syndrome. Cases were generated using either the Medical Treatment Protocols for Radiation Accident Victims (METREPOL, n = 167) system or real-case descriptions extracted from a database system for evaluation and archiving of radiation accidents based on case histories (SEARCH, n = 24). The cases ranged from unexposed [response category 0 (RC 0, n = 89)] to mild (RC 1, n = 45), moderate (RC 2, n = 19), severe (RC 3, n = 20), and lethal (RC 4, n = 18) acute radiation syndrome. During the previous exercise, expert teams successfully predicted hematological acute radiation syndrome severity, determined whether hospitalization was required, and gave treatment recommendations, taking advantage of different software tools developed by the North Atlantic Treaty Organization teams. The authors provided the same data set to radiobiology students who were introduced to the medical management of acute effects after radiation exposure and to the software tools during a class lasting 15 h. Corresponding to the previous results, difficulties in the discrimination between RC 0/RC 1 and RC 3/RC 4, as well as a systematic underestimation of RC 1 and RC 2, were observed. Nevertheless, after merging reported response categories into clinically relevant groups (RC 0-1, RC 2-3, and RC 3-4), it was found that the majority of cases (95.2% ± 2.2 standard deviations) were correctly identified, and that 94.7% (±2.6 standard deviations) of cases developing acute radiation syndrome and 96.4% (±1.6 standard deviations) of cases requiring hospitalization were identified correctly. Two out of three student teams also provided a dose estimate. These results are comparable to those of the best-performing team of the 2015 North Atlantic Treaty Organization exercise (response category: 92.5%; acute radiation syndrome: 95.8%; hospitalization: 96.3%).

  13. [Comparison among various software for LMS growth curve fitting methods].

    PubMed

    Han, Lin; Wu, Wenhong; Wei, Qiuxia

    2015-03-01

    To explore methods for realizing growth curve fitting with the skewness-median-coefficient of variation (LMS) method in different software packages, and to identify the best statistical tool for grass-roots child and adolescent health staff. Regular physical examination data of head circumference for normal infants aged 3, 6, 9 and 12 months in Baotou City were analyzed. Statistical packages including SAS, R, STATA and SPSS were used to fit the LMS growth curve, and the results were evaluated with respect to convenience of use, learning curve, user interface, forms of result display, and software updating and maintenance. All packages produced the same fitting results, and each had its own advantages and disadvantages. Taking all evaluation aspects into consideration, R excelled the others in LMS growth curve fitting and is therefore recommended for grass-roots child and adolescent health staff.

  14. Astronomical Software Directory Service

    NASA Astrophysics Data System (ADS)

    Hanisch, Robert J.; Payne, Harry; Hayes, Jeffrey

    1997-01-01

    With the support of NASA's Astrophysics Data Program (NRA 92-OSSA-15), we have developed the Astronomical Software Directory Service (ASDS): a distributed, searchable, WWW-based database of software packages and their related documentation. ASDS provides integrated access to 56 astronomical software packages, with more than 16,000 URLs indexed for full-text searching. Users are performing about 400 searches per month. A new aspect of our service is the inclusion of telescope and instrumentation manuals, which prompted us to change the name to the Astronomical Software and Documentation Service. ASDS was originally conceived to serve two purposes: to provide a useful Internet service in an area of expertise of the investigators (astronomical software), and to serve as a research project investigating various architectures for searching through a set of documents distributed across the Internet. Two of the co-investigators were then installing and maintaining astronomical software as their primary job responsibility. We felt that a service which incorporated our experience in this area would be more useful than a straightforward listing of software packages. The original concept was for a service based on the client/server model, which would function as a directory/referral service rather than as an archive. For performing the searches, we began our investigation with a decision to evaluate the Isite software from the Center for Networked Information Discovery and Retrieval (CNIDR). This software was intended as a replacement for Wide-Area Information Service (WAIS), a client/server technology for performing full-text searches through a set of documents. Isite had some additional features that we considered attractive, and we enjoyed the cooperation of the Isite developers, who were happy to have ASDS as a demonstration project. We ended up staying with the software throughout the project, making modifications to take advantage of new features as they came along, as well as influencing the software development. The Web interface to the search engine is provided by a gateway program written in C++ by a consultant to the project (A. Warnock).

  15. ConfocalVR: Immersive Visualization Applied to Confocal Microscopy.

    PubMed

    Stefani, Caroline; Lacy-Hulbert, Adam; Skillman, Thomas

    2018-06-24

    ConfocalVR is a virtual reality (VR) application created to improve the ability of researchers to study the complexity of cell architecture. Confocal microscopes take pictures of fluorescently labeled proteins or molecules at different focal planes to create a stack of 2D images throughout the specimen. Current software applications reconstruct the 3D image and render it as a 2D projection onto a computer screen, where users need to rotate the image to expose the full 3D structure. This process is mentally taxing, breaks down if the rotation stops, and does not take advantage of the eye's full field of view. ConfocalVR exploits consumer-grade VR systems to fully immerse the user in the 3D cellular image. In this virtual environment the user can: 1) adjust image viewing parameters without leaving the virtual space, 2) reach out and grab the image to quickly rotate and scale it to focus on key features, and 3) interact with other users in a shared virtual space, enabling real-time collaborative exploration and discussion. We found that immersive VR technology allows the user to rapidly understand cellular architecture and protein or molecule distribution. We note that it is impossible to understand the value of immersive visualization without experiencing it first hand, so we encourage readers to get access to a VR system, download this software, and evaluate it for themselves. The ConfocalVR software is available for download at http://www.confocalvr.com, and is free for nonprofits. Copyright © 2018. Published by Elsevier Ltd.

  16. Reliability of infarct volumetry: Its relevance and the improvement by a software-assisted approach.

    PubMed

    Friedländer, Felix; Bohmann, Ferdinand; Brunkhorst, Max; Chae, Ju-Hee; Devraj, Kavi; Köhler, Yvette; Kraft, Peter; Kuhn, Hannah; Lucaciu, Alexandra; Luger, Sebastian; Pfeilschifter, Waltraud; Sadler, Rebecca; Liesz, Arthur; Scholtyschik, Karolina; Stolz, Leonie; Vutukuri, Rajkumar; Brunkhorst, Robert

    2017-08-01

    Despite the efficacy of neuroprotective approaches in animal models of stroke, their translation from bench to bedside has so far failed. One reason is presumed to be the low quality of preclinical study design, leading to bias and low a priori power. In this study, we propose that the key read-out of experimental stroke studies, the volume of the ischemic damage as commonly measured by free-hand planimetry of TTC-stained brain sections, is subject to an unrecognized low inter-rater and test-retest reliability, with strong implications for statistical power and bias. As an alternative approach, we suggest a simple, open-source, software-assisted method taking advantage of automatic thresholding techniques. We demonstrate the validity of the automated method for tMCAO infarct volumetry and the improvement in reliability it provides. In addition, we show the probable consequences of increased reliability for precision, p-values, effect inflation, and power calculation, exemplified by a systematic analysis of experimental stroke studies published in the year 2015. Our study reveals an underappreciated quality problem in translational stroke research and suggests that software-assisted infarct volumetry might help to improve reproducibility and therefore the robustness of bench-to-bedside translation.
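    The automatic-thresholding idea in a few lines, sketched with scikit-image (illustrative only; the published tool differs, and the convention that infarcted tissue appears as the brighter class in grayscale TTC images is an assumption of this sketch):

        import numpy as np
        from skimage.filters import threshold_otsu

        def infarct_volume(stack: np.ndarray, slice_thickness_mm: float,
                           pixel_area_mm2: float) -> float:
            """Estimate lesion volume from a stack of grayscale TTC sections.
            Otsu's threshold replaces free-hand outlining: voxels above the
            automatically chosen threshold are counted as infarct."""
            t = threshold_otsu(stack)
            n_voxels = int((stack > t).sum())
            return n_voxels * pixel_area_mm2 * slice_thickness_mm

    Because the threshold is computed from the image histogram rather than drawn by hand, repeated runs on the same sections give identical volumes, which is exactly the test-retest reliability the paper is after.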

  17. Informed-Proteomics: open-source software package for top-down proteomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Jungkap; Piehowski, Paul D.; Wilkins, Christopher

    Top-down proteomics involves the analysis of intact proteins. This approach is very attractive as it allows for analyzing proteins in their endogenous form without proteolysis, preserving valuable information about post-translational modifications, isoforms, proteolytic processing, or their combinations, collectively called proteoforms. Moreover, the quality of top-down LC-MS/MS datasets is rapidly increasing due to advances in liquid chromatography and mass spectrometry instrumentation and sample processing protocols. However, top-down mass spectra are substantially more complex compared to the more conventional bottom-up data. To take full advantage of the increasing quality of top-down LC-MS/MS datasets there is an urgent need to develop algorithms and software tools for confident proteoform identification and quantification. In this study we present a new open source software suite for top-down proteomics analysis consisting of an LC-MS feature finding algorithm, a database search algorithm, and an interactive results viewer. The presented tool, along with several other popular tools, was evaluated using human-in-mouse xenograft luminal and basal breast tumor samples that are known to have significant differences in protein abundance based on bottom-up analysis.

  18. Systems Architecture for Fully Autonomous Space Missions

    NASA Technical Reports Server (NTRS)

    Esper, Jamie; Schnurr, R.; VanSteenberg, M.; Brumfield, Mark (Technical Monitor)

    2002-01-01

    The NASA Goddard Space Flight Center is working to develop a revolutionary new system architecture concept in support of fully autonomous missions. As part of GSFC's contribution to the New Millennium Program (NMP) Space Technology 7 Autonomy and on-Board Processing (ST7-A) Concept Definition Study, the system incorporates the latest commercial Internet and software development ideas and extends them into NASA ground and space segment architectures. The unique challenges facing the exploration of remote and inaccessible locales and the need to incorporate corresponding autonomy technologies within reasonable cost necessitate the re-thinking of traditional mission architectures. A measure of the resiliency of this architecture in its application to a broad range of future autonomy missions will depend on its effectiveness in leveraging commercial tools developed for the personal computer and Internet markets. Specialized test stations and supporting software become a thing of the past as spacecraft take advantage of the extensive tools and research investments of billion-dollar commercial ventures. The projected improvements of the Internet and supporting infrastructure go hand-in-hand with market pressures that provide continuity in research. By taking advantage of consumer-oriented methods and processes, space-flight missions will continue to leverage investments tailored to provide better services at reduced cost. The application of ground and space segment architectures each based on Local Area Networks (LAN), the use of personal computer-based operating systems, and the execution of activities and operations through a Wide Area Network (Internet) enable a revolution in spacecraft mission formulation, implementation, and flight operations. Hardware and software design, development, integration, test, and flight operations are all tied closely to a common thread that enables the smooth transitioning between program phases. The application of commercial software development techniques lays the foundation for delivery of product-oriented flight software modules and models. Software can then be readily applied to support the on-board autonomy required for mission self-management. An on-board intelligent system, based on advanced scripting languages, facilitates the mission autonomy required to offload ground system resources, and enables the spacecraft to manage itself safely through an efficient and effective process of reactive planning, science data acquisition, synthesis, and transmission to the ground. Autonomous ground systems in turn coordinate and support scheduled contact times with the spacecraft. Specific autonomy software modules on-board include mission and science planners, instrument and subsystem control, and fault tolerance response software, all residing within a distributed computing environment supported through the flight LAN. Autonomy also requires the minimization of human intervention between users on the ground and the spacecraft, and hence calls for the elimination of the traditional operations control center as a funnel for data manipulation. Basic goal-oriented commands are sent directly from the user to the spacecraft through a distributed internet-based payload operations "center". The ensuing architecture calls for the use of spacecraft as point extensions on the Internet. This paper will detail the system architecture implementation chosen to enable cost-effective autonomous missions with applicability to a broad range of conditions. It will define the structure needed for implementation of such missions, including software and hardware infrastructures. The overall architecture is then laid out as a common thread in the mission life cycle from formulation through implementation and flight operations.

  19. Verdant: automated annotation, alignment and phylogenetic analysis of whole chloroplast genomes.

    PubMed

    McKain, Michael R; Hartsock, Ryan H; Wohl, Molly M; Kellogg, Elizabeth A

    2017-01-01

    Chloroplast genomes are now produced in the hundreds for angiosperm phylogenetics projects, but current methods for annotation, alignment and tree estimation still require some manual intervention, reducing throughput and increasing analysis time for large chloroplast systematics projects. Verdant is a web-based software suite and database built to take advantage of a novel annotation program, annoBTD. Using annoBTD, Verdant provides accurate annotation of chloroplast genomes without manual intervention. Subsequent alignment and tree estimation can incorporate newly annotated and publicly available plastomes and can accommodate a large number of taxa. Verdant sharply reduces the time required for analysis of assembled chloroplast genomes and removes the need for pipelines and software on personal hardware. Verdant is available at http://verdant.iplantcollaborative.org/plastidDB/ and is implemented in PHP, Perl, MySQL, Javascript, HTML and CSS, with all major browsers supported. Contact: mrmckain@gmail.com. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  20. Reducing Design Cycle Time and Cost Through Process Resequencing

    NASA Technical Reports Server (NTRS)

    Rogers, James L.

    2004-01-01

    In today's competitive environment, companies are under enormous pressure to reduce the time and cost of their design cycle. One method for reducing both time and cost is to develop an understanding of the flow of the design processes and the effects of the iterative subcycles that are found in complex design projects. Once these aspects are understood, the design manager can make decisions that take advantage of decomposition, concurrent engineering, and parallel processing techniques to reduce the total time and the total cost of the design cycle. One software tool that can aid in this decision-making process is the Design Manager's Aid for Intelligent Decomposition (DeMAID). The DeMAID software minimizes the feedback couplings that create iterative subcycles, groups processes into iterative subcycles, and decomposes the subcycles into a hierarchical structure. The real benefits of producing the best design in the least time and at a minimum cost are obtained from sequencing the processes in the subcycles.
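    DeMAID's own heuristics are documented in the NASA reports; as a small illustration of the underlying idea, the iterative subcycles of a design project are the strongly connected components of its process-dependency graph, which off-the-shelf graph tools can extract (process names below are invented):

        import networkx as nx

        # Processes are nodes, data dependencies are directed edges; any
        # strongly connected component with more than one node is a feedback
        # loop that must be iterated as a subcycle.
        deps = [("aero", "loads"), ("loads", "structures"),
                ("structures", "aero"),            # feedback coupling
                ("structures", "costing")]
        g = nx.DiGraph(deps)
        subcycles = [c for c in nx.strongly_connected_components(g) if len(c) > 1]
        print(subcycles)                           # [{'aero', 'loads', 'structures'}]
        # Collapsing each subcycle gives an acyclic graph that can be sequenced:
        print(list(nx.topological_sort(nx.condensation(g))))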

  1. BigWig and BigBed: enabling browsing of large distributed datasets.

    PubMed

    Kent, W J; Zweig, A S; Barber, G; Hinrichs, A S; Karolchik, D

    2010-09-01

    BigWig and BigBed files are compressed binary indexed files containing data at several resolutions that allow the high-performance display of next-generation sequencing experiment results in the UCSC Genome Browser. The visualization is implemented using a multi-layered software approach that takes advantage of specific capabilities of web-based protocols, Linux and UNIX operating system files, R trees, and various indexing and compression tricks. As a result, only the data needed to support the current browser view is transmitted rather than the entire file, enabling fast remote access to large distributed data sets. Binaries for the BigWig and BigBed creation and parsing utilities may be downloaded at http://hgdownload.cse.ucsc.edu/admin/exe/linux.x86_64/. Source code for the creation and visualization software is freely available for non-commercial use at http://hgdownload.cse.ucsc.edu/admin/jksrc.zip, implemented in C and supported on Linux. The UCSC Genome Browser is available at http://genome.ucsc.edu.
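    The same indexed access is available outside the browser, for example through the pyBigWig bindings; only the blocks covering the requested interval are fetched, so this works against a remote file as well (the URL below is a placeholder):

        import pyBigWig

        bw = pyBigWig.open("http://example.org/coverage.bw")
        print(bw.chroms("chr1"))                          # chromosome length
        print(bw.stats("chr1", 0, 1_000_000, type="mean", nBins=10))
        print(bw.values("chr1", 100_000, 100_010))        # per-base values
        bw.close()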

  2. Automatic building information model query generation

    DOE PAGES

    Jiang, Yufei; Yu, Nan; Ming, Jiang; ...

    2015-12-01

    Energy efficient building design and construction calls for extensive collaboration between different subfields of the Architecture, Engineering and Construction (AEC) community. Performing building design and construction engineering raises challenges in data integration and software interoperability. Using a Building Information Modeling (BIM) data hub to host and integrate building models is a promising solution to address those challenges, which can ease building design information management. However, the partial model query mechanism of the current BIM data hub collaboration model has several limitations, which prevent designers and engineers from taking full advantage of BIM. To address this problem, we propose a general and effective approach to generate query code based on a Model View Definition (MVD). This approach is demonstrated through a software prototype called QueryGenerator. By demonstrating a case study using multi-zone air flow analysis, we show how our approach and tool can help domain experts to use BIM to drive building design with less labour and lower overhead cost.

  3. BEASTling: A software tool for linguistic phylogenetics using BEAST 2

    PubMed Central

    Forkel, Robert; Kaiping, Gereon A.; Atkinson, Quentin D.

    2017-01-01

    We present a new open source software tool called BEASTling, designed to simplify the preparation of Bayesian phylogenetic analyses of linguistic data using the BEAST 2 platform. BEASTling transforms comparatively short and human-readable configuration files into the XML files used by BEAST to specify analyses. By taking advantage of Creative Commons-licensed data from the Glottolog language catalog, BEASTling allows the user to conveniently filter datasets using names for recognised language families, to impose monophyly constraints so that inferred language trees are backward compatible with Glottolog classifications, or to assign geographic location data to languages for phylogeographic analyses. Support for the emerging cross-linguistic linked data format (CLDF) permits easy incorporation of data published in cross-linguistic linked databases into analyses. BEASTling is intended to make the power of Bayesian analysis more accessible to historical linguists without strong programming backgrounds, in the hopes of encouraging communication and collaboration between those developing computational models of language evolution (who are typically not linguists) and relevant domain experts. PMID:28796784

  4. Scale factor measure method without turntable for angular rate gyroscope

    NASA Astrophysics Data System (ADS)

    Qi, Fangyi; Han, Xuefei; Yao, Yanqing; Xiong, Yuting; Huang, Yuqiong; Wang, Hua

    2018-03-01

    In this paper, a scale factor test method without a turntable is designed for the angular rate gyroscope. A test system consisting of a test device, a data acquisition circuit, and data processing software based on the LabVIEW platform is described. Taking advantage of the gyroscope's sensitivity to angular rate, a gyroscope with a known scale factor serves as a standard gyroscope. The standard gyroscope is installed on the test device together with the gyroscope under test. By shaking the test device around the edge that is parallel to the input axes of the gyroscopes, the scale factor of the gyroscope under test can be obtained in real time by the data processing software. This test method is fast, and it keeps the test system miniaturized and easy to carry or move. Measuring a quartz MEMS gyroscope's scale factor multiple times with this method, the difference is less than 0.2%. Compared with testing on a turntable, the scale factor difference is less than 1%. The accuracy and repeatability of the test system appear good.
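    The estimation step reduces to a ratio fit: both gyros sense the same shaking rate, so the reference output converts to a rate that calibrates the test output. A minimal sketch (the paper's actual processing pipeline is not reproduced here):

        import numpy as np

        def scale_factor(v_ref: np.ndarray, v_test: np.ndarray, sf_ref: float) -> float:
            """Estimate the test gyro's scale factor without a turntable.
            rate = v_ref / sf_ref (known reference scale factor), and
            v_test = sf_test * rate, so a least-squares line through
            (rate, v_test) gives sf_test as its slope."""
            rate = v_ref / sf_ref
            sf_test, _intercept = np.polyfit(rate, v_test, 1)
            return sf_test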

  5. BEASTling: A software tool for linguistic phylogenetics using BEAST 2.

    PubMed

    Maurits, Luke; Forkel, Robert; Kaiping, Gereon A; Atkinson, Quentin D

    2017-01-01

    We present a new open source software tool called BEASTling, designed to simplify the preparation of Bayesian phylogenetic analyses of linguistic data using the BEAST 2 platform. BEASTling transforms comparatively short and human-readable configuration files into the XML files used by BEAST to specify analyses. By taking advantage of Creative Commons-licensed data from the Glottolog language catalog, BEASTling allows the user to conveniently filter datasets using names for recognised language families, to impose monophyly constraints so that inferred language trees are backward compatible with Glottolog classifications, or to assign geographic location data to languages for phylogeographic analyses. Support for the emerging cross-linguistic linked data format (CLDF) permits easy incorporation of data published in cross-linguistic linked databases into analyses. BEASTling is intended to make the power of Bayesian analysis more accessible to historical linguists without strong programming backgrounds, in the hopes of encouraging communication and collaboration between those developing computational models of language evolution (who are typically not linguists) and relevant domain experts.

  6. Exploiting current-generation graphics hardware for synthetic-scene generation

    NASA Astrophysics Data System (ADS)

    Tanner, Michael A.; Keen, Wayne A.

    2010-04-01

    Increasing seeker frame rate and pixel count, as well as the demand for higher levels of scene fidelity, have driven scene generation software for hardware-in-the-loop (HWIL) and software-in-the-loop (SWIL) testing to higher levels of parallelization. Because modern PC graphics cards provide multiple computational cores (e.g., 240 shader cores on current NVIDIA GeForce and Quadro cards), implementation of phenomenology codes on graphics processing units (GPUs) offers significant potential for simultaneous enhancement of simulation frame rate and fidelity. Taking advantage of this potential requires algorithm implementations that are structured to minimize data transfers between the central processing unit (CPU) and the GPU. In this paper, preliminary methodologies developed at the Kinetic Hardware In-The-Loop Simulator (KHILS) will be presented, including tradeoffs among conventional shader programming, the Compute Unified Device Architecture (CUDA) and the Open Computing Language (OpenCL), performance comparisons, and possible pathways for future tool development.

  7. SAFE Software and FED Database to Uncover Protein-Protein Interactions using Gene Fusion Analysis.

    PubMed

    Tsagrasoulis, Dimosthenis; Danos, Vasilis; Kissa, Maria; Trimpalis, Philip; Koumandou, V Lila; Karagouni, Amalia D; Tsakalidis, Athanasios; Kossida, Sophia

    2012-01-01

    Domain Fusion Analysis takes advantage of the fact that certain proteins in a given proteome A, are found to have statistically significant similarity with two separate proteins in another proteome B. In other words, the result of a fusion event between two separate proteins in proteome B is a specific full-length protein in proteome A. In such a case, it can be safely concluded that the protein pair has a common biological function or even interacts physically. In this paper, we present the Fusion Events Database (FED), a database for the maintenance and retrieval of fusion data both in prokaryotic and eukaryotic organisms and the Software for the Analysis of Fusion Events (SAFE), a computational platform implemented for the automated detection, filtering and visualization of fusion events (both available at: http://www.bioacademy.gr/bioinformatics/projects/ProteinFusion/index.htm). Finally, we analyze the proteomes of three microorganisms using these tools in order to demonstrate their functionality.
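    The detection criterion can be stated in a few lines: a protein of proteome A whose alignments hit two different proteins of proteome B on (nearly) disjoint regions of A is a fusion candidate. A toy sketch over BLAST-like tabular rows (the rows, identifiers, and the 20-residue overlap tolerance are invented; SAFE's own filters are richer):

        from collections import defaultdict

        # (query in A, subject in B, query alignment start, query alignment end)
        hits = [("A1", "B7", 10, 180), ("A1", "B9", 200, 390), ("A2", "B3", 5, 300)]

        by_query = defaultdict(list)
        for query, subject, qstart, qend in hits:
            by_query[query].append((subject, qstart, qend))

        for query, hs in by_query.items():
            for i in range(len(hs)):
                for j in range(i + 1, len(hs)):
                    (s1, a1, b1), (s2, a2, b2) = hs[i], hs[j]
                    overlap = min(b1, b2) - max(a1, a2)
                    if s1 != s2 and overlap < 20:   # near-disjoint regions of A
                        print(f"{query}: candidate fusion of {s1} + {s2}")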

  8. SAFE Software and FED Database to Uncover Protein-Protein Interactions using Gene Fusion Analysis

    PubMed Central

    Tsagrasoulis, Dimosthenis; Danos, Vasilis; Kissa, Maria; Trimpalis, Philip; Koumandou, V. Lila; Karagouni, Amalia D.; Tsakalidis, Athanasios; Kossida, Sophia

    2012-01-01

    Domain Fusion Analysis takes advantage of the fact that certain proteins in a given proteome A are found to have statistically significant similarity with two separate proteins in another proteome B. In other words, the result of a fusion event between two separate proteins in proteome B is a specific full-length protein in proteome A. In such a case, it can be safely concluded that the protein pair has a common biological function or even interacts physically. In this paper, we present the Fusion Events Database (FED), a database for the maintenance and retrieval of fusion data in both prokaryotic and eukaryotic organisms, and the Software for the Analysis of Fusion Events (SAFE), a computational platform implemented for the automated detection, filtering and visualization of fusion events (both available at: http://www.bioacademy.gr/bioinformatics/projects/ProteinFusion/index.htm). Finally, we analyze the proteomes of three microorganisms using these tools in order to demonstrate their functionality. PMID:22267904

  9. Around and about an application of the GAMLSS package to non-stationary flood frequency analysis

    NASA Astrophysics Data System (ADS)

    Debele, S. E.; Bogdanowicz, E.; Strupczewski, W. G.

    2017-08-01

    The non-stationarity of hydrologic processes due to climate change or human activities is challenging for researchers and practitioners. However, the practical requirements for taking non-stationarity into account as a support for decision-making procedures outpace the current development of both the theory and the software. Currently, the most popular and freely available software package that allows for non-stationary statistical analysis is the GAMLSS (generalized additive models for location, scale and shape) package. GAMLSS has been used in a variety of fields. There are also several papers recommending GAMLSS for hydrological problems; however, there are still important issues which have not previously been discussed, concerning mainly the applicability of GAMLSS not only for research and academic purposes but also in design practice. In this paper, we present a summary of our experiences in the implementation of GAMLSS for non-stationary flood frequency analysis, highlighting its advantages and pointing out weaknesses with regard to methodological and practical topics.
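    For concreteness, one common GAMLSS-style specification lets the parameters of the fitted distribution of annual maxima Q_t drift with time (an illustrative model, not the one analysed in the paper):

        Q_t \sim \mathrm{GEV}\big(\mu(t), \sigma(t), \xi\big)
        \mu(t)         = \beta_0 + \beta_1 t
        \log \sigma(t) = \gamma_0 + \gamma_1 t

    The beta and gamma coefficients are estimated by (penalized) maximum likelihood; the stationary analysis is recovered as the special case beta_1 = gamma_1 = 0, which is also how the need for a non-stationary model can be tested.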

  10. The LBT real-time based control software to mitigate and compensate vibrations

    NASA Astrophysics Data System (ADS)

    Borelli, J.; Trowitzsch, J.; Brix, M.; Kürster, M.; Gässler, W.; Bertram, T.; Briegel, F.

    2010-07-01

    The Large Binocular Telescope (LBT) uses two 8.4-meter active primary mirrors and two adaptive secondary mirrors on the same mounting to take advantage of its interferometric capabilities. Both applications, interferometry and AO, are sensitive to vibrations. Several measurement campaigns have been carried out at the LBT, and their results strongly indicate that a vibration monitoring system is required to improve the performance of LINC-NIRVANA, LBTI, and ARGOS, the laser-guided ground-layer adaptive optics system. Currently, control software for mitigating and compensating the vibrations is being designed. A complex set of algorithms collects real-time vibration data, archives it for further analysis and, in parallel, generates the tip-tilt and optical path difference (OPD) data for the control loop of the instruments. A real-time data acquisition device equipped with embedded real-time Linux is used in our systems. A set of quick-look tools is currently under development in order to verify whether the conditions at the telescope are suitable for interferometric/adaptive observations.

  11. Augmented reality and haptic interfaces for robot-assisted surgery.

    PubMed

    Yamamoto, Tomonori; Abolhassani, Niki; Jung, Sung; Okamura, Allison M; Judkins, Timothy N

    2012-03-01

    Current teleoperated robot-assisted minimally invasive surgical systems do not take full advantage of the potential performance enhancements offered by various forms of haptic feedback to the surgeon. Direct and graphical haptic feedback systems can be integrated with vision and robot control systems in order to provide haptic feedback to improve safety and tissue mechanical property identification. An interoperable interface for teleoperated robot-assisted minimally invasive surgery was developed to provide haptic feedback and augmented visual feedback using three-dimensional (3D) graphical overlays. The software framework consists of control and command software, robot plug-ins, image processing plug-ins and 3D surface reconstructions. The feasibility of the interface was demonstrated in two tasks performed with artificial tissue: palpation to detect hard lumps and surface tracing, using vision-based forbidden-region virtual fixtures to prevent the patient-side manipulator from entering unwanted regions of the workspace. The interoperable interface enables fast development and successful implementation of effective haptic feedback methods in teleoperation. Copyright © 2011 John Wiley & Sons, Ltd.

  12. Automatic building information model query generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Yufei; Yu, Nan; Ming, Jiang

    Energy efficient building design and construction calls for extensive collaboration between different subfields of the Architecture, Engineering and Construction (AEC) community. Performing building design and construction engineering raises challenges in data integration and software interoperability. Using a Building Information Modeling (BIM) data hub to host and integrate building models is a promising solution to address those challenges, which can ease building design information management. However, the partial model query mechanism of the current BIM data hub collaboration model has several limitations, which prevent designers and engineers from taking full advantage of BIM. To address this problem, we propose a general and effective approach to generate query code based on a Model View Definition (MVD). This approach is demonstrated through a software prototype called QueryGenerator. By demonstrating a case study using multi-zone air flow analysis, we show how our approach and tool can help domain experts to use BIM to drive building design with less labour and lower overhead cost.

  13. Diagnostic ability of computed tomography using DentaScan software in endodontics: case reports.

    PubMed

    Siotia, Jaya; Gupta, Sunil K; Acharya, Shashi R; Saraswathi, Vidya

    2011-01-01

    Radiographic examination is essential in diagnosis and treatment planning in endodontics. Conventional radiographs depict structures in two dimensions only; the ability to assess the area of interest in three dimensions is advantageous. Computed tomography is an imaging technique which produces three-dimensional images of an object by taking a series of two-dimensional sectional X-ray images. DentaScan is a computed tomography software program that allows the mandible and maxilla to be imaged in three planes: axial, panoramic, and cross-sectional. As computed tomography is used in endodontics, DentaScan can play a wider role in endodontic diagnosis. It provides valuable information in the assessment of root canal morphology, the diagnosis of root fractures, internal and external resorption, the pre-operative assessment of anatomic structures, etc. The aim of this article is to explore the clinical usefulness of computed tomography and DentaScan in endodontic diagnosis, through a series of four cases of different endodontic problems.

  14. New Software Architecture Options for the TCL Data Acquisition System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valenton, Emmanuel

    2014-09-01

    The Turbulent Combustion Laboratory (TCL) conducts research on combustion in turbulent flow environments. To conduct this research, the TCL utilizes several pulse lasers, a traversable wind tunnel, flow controllers, scientific-grade CCD cameras, and numerous other components. Responsible for managing these different data-acquiring instruments and data processing components is the Data Acquisition (DAQ) software. However, the current system is constrained to running through VXI hardware (an instrument-computer interface) that is several years old, requiring the use of an outdated version of the visual programming language LabVIEW. A new acquisition system is being programmed which will borrow heavily from either a programming model known as the Current Value Table (CVT) System or another model known as the Server-Client System. The CVT System model is, in essence, a giant spreadsheet from which data or commands may be retrieved and to which they may be written, and the Server-Client System is based on network connections between a server and a client, very much like the server-client model of the Internet. Currently, the bare elements of a CVT DAQ software have been implemented, consisting of client programs in addition to a server program that the CVT runs on. This system is being rigorously tested to evaluate the merits of pursuing the CVT System model and to uncover any potential flaws before further implementation. If the CVT System is chosen, which is likely, then future work will consist of building up the system until enough client programs have been created to run the individual components of the lab. The advantages of such a system will be flexibility, portability, and polymorphism. Additionally, the new DAQ software will allow the lab to replace the VXI with a newer instrument interface, the PXI, and take advantage of the capabilities of current and future versions of LabVIEW.
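    The CVT data model is small enough to sketch. A minimal Python version, assuming only that clients read and write named current values (the lab's implementation is networked LabVIEW code; this shows the data model, not their system):

        import threading

        class CurrentValueTable:
            """One shared, thread-safe table that instrument clients write to
            and read from by key, like cells of a giant spreadsheet."""
            def __init__(self):
                self._table = {}
                self._lock = threading.Lock()

            def write(self, key: str, value):
                with self._lock:
                    self._table[key] = value

            def read(self, key: str, default=None):
                with self._lock:
                    return self._table.get(key, default)

        cvt = CurrentValueTable()
        cvt.write("camera/exposure_ms", 35.0)   # one client posts a command value
        print(cvt.read("camera/exposure_ms"))   # another client polls it

    Decoupling clients through the table is what buys the flexibility and portability mentioned above: clients never talk to each other directly, only to the table.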

  15. SBSI: an extensible distributed software infrastructure for parameter estimation in systems biology.

    PubMed

    Adams, Richard; Clark, Allan; Yamaguchi, Azusa; Hanlon, Neil; Tsorman, Nikos; Ali, Shakir; Lebedeva, Galina; Goltsov, Alexey; Sorokin, Anatoly; Akman, Ozgur E; Troein, Carl; Millar, Andrew J; Goryanin, Igor; Gilmore, Stephen

    2013-03-01

    Complex computational experiments in Systems Biology, such as fitting model parameters to experimental data, can be challenging to perform. Not only do they frequently require a high level of computational power, but the software needed to run the experiment needs to be usable by scientists with varying levels of computational expertise, and modellers need to be able to obtain up-to-date experimental data resources easily. We have developed a software suite, the Systems Biology Software Infrastructure (SBSI), to facilitate the parameter-fitting process. SBSI is a modular software suite composed of three major components: SBSINumerics, a high-performance library containing parallelized algorithms for performing parameter fitting; SBSIDispatcher, a middleware application to track experiments and submit jobs to back-end servers; and SBSIVisual, an extensible client application used to configure optimization experiments and view results. Furthermore, we have created a plugin infrastructure to enable project-specific modules to be easily installed. Plugin developers can take advantage of the existing user-interface and application framework to customize SBSI for their own uses, facilitated by SBSI's use of standard data formats. All SBSI binaries and source-code are freely available from http://sourceforge.net/projects/sbsi under an Apache 2 open-source license. The server-side SBSINumerics runs on any Unix-based operating system; both SBSIVisual and SBSIDispatcher are written in Java and are platform independent, allowing use on Windows, Linux and Mac OS X. The SBSI project website at http://www.sbsi.ed.ac.uk provides documentation and tutorials.

  16. A Tour of Big Data, Open Source Data Management Technologies from the Apache Software Foundation

    NASA Astrophysics Data System (ADS)

    Mattmann, C. A.

    2012-12-01

    The Apache Software Foundation, a non-profit foundation charged with dissemination of open source software for the public good, provides a suite of data management technologies for distributed archiving, data ingestion, data dissemination, processing, triage and a host of other functionalities that are becoming critical in the Big Data regime. Apache is the world's largest open source software organization, boasting over 3000 developers from around the world, all contributing to some of the most pervasive technologies in use today, from the HTTPD web server that powers a majority of Internet web sites to the Hadoop technology that is now projected to be over a $1B industry. Apache data management technologies are emerging as de facto off-the-shelf components for searching, distributing, processing and archiving key science data sets, from the geophysical, space, and planetary domains all the way to biomedicine. In this talk, I will give a virtual tour of the Apache Software Foundation, its meritocracy and governance structure, and also its key big data technologies that organizations can take advantage of today and use to save cost, schedule, and resources in implementing their Big Data needs. I'll illustrate the Apache technologies in the context of several national priority projects, including the U.S. National Climate Assessment (NCA), and in the International Square Kilometre Array (SKA) project, that are stretching the boundaries of volume, velocity, complexity, and other key Big Data dimensions.

  17. Supporting Development of Satellite's Guidance Navigation and Control Software: A Product Line Approach

    NASA Technical Reports Server (NTRS)

    McComas, David; Stark, Michael; Leake, Stephen; White, Michael; Morisio, Maurizio; Travassos, Guilherme H.; Powers, Edward I. (Technical Monitor)

    2000-01-01

    The NASA Goddard Space Flight Center Flight Software Branch (FSB) is developing a Guidance, Navigation, and Control (GNC) Flight Software (FSW) product line. The demand for increasingly complex flight software in less time, while maintaining the same level of quality, has motivated us to look for better FSW development strategies. The GNC FSW product line has been planned to address the core GNC FSW functionality, which has been very similar across many low/near-Earth missions in the last ten years. Unfortunately, these missions have not accomplished significant drops in development cost, since a systematic approach towards reuse has not been adopted. In addition, new demands are continually being placed upon the FSW, which means the FSB must become more adept at providing the core GNC FSW functionality so that it can accommodate additional requirements. These domain features, together with engineering concepts, are influencing the specification, description and evaluation of the FSW product line. Domain engineering is the foundation for emerging product line software development approaches. A product line is 'a family of products designed to take advantage of their common aspects and predicted variabilities'. In our product line approach, domain engineering includes the engineering activities needed to produce reusable artifacts for a domain. Application engineering refers to developing an application in the domain starting from reusable artifacts. The focus of this paper is the software process, lessons learned, and how the GNC FSW product line manages variability. Existing domain engineering approaches do not enforce any specific notation for domain analysis or commonality and variability analysis. Usually, natural language text is the preferred tool. The advantage is the flexibility and adaptability of natural language. However, one also has to accept its well-known drawbacks, such as ambiguity, inconsistency, and contradictions. While most domain analysis approaches are functionally oriented, the idea of applying the object-oriented approach in domain analysis is not new. Some authors propose to use UML as the notation underlying domain analysis. Our work is based on the same idea of merging UML and domain analysis. Further, we propose a few extensions to UML in order to express variability, and we define their semantics precisely so that a tool can support them. The extensions are designed to be implemented on the API of a popular industrial CASE tool, with obvious advantages in cost and availability of tool support. The paper outlines the product line processes and identifies where variability must be addressed. Then it describes the product line products with respect to how they accommodate variability. The Celestial Body subdomain is used as a working example. Our results to date are summarized and plans for the future are described.

  18. CosmoQuest: A Cyber-Infrastructure for Crowdsourcing Planetary Surface Mapping and More

    NASA Astrophysics Data System (ADS)

    Gay, P.; Lehan, C.; Moore, J.; Bracey, G.; Gugliucci, N.

    2014-04-01

    The design and implementation of programs to crowdsource science presents a unique set of challenges to system architects, programmers, and designers. The CosmoQuest Citizen Science Builder (CSB) is an open-source platform designed to take advantage of crowd computing and open-source platforms to solve crowdsourcing problems in Planetary Science. CSB combines a clean user interface with a powerful back end to allow the quick design and deployment of citizen science sites that meet the needs of both the random Joe Public and the detail-driven Albert Professional. In this talk, the software will be overviewed, and the results of usability testing and accuracy testing with both citizen and professional scientists will be discussed.

  19. Strehl-constrained iterative blind deconvolution for post-adaptive-optics data

    NASA Astrophysics Data System (ADS)

    Desiderà, G.; Carbillet, M.

    2009-12-01

    Aims: We aim to improve blind deconvolution applied to post-adaptive-optics (AO) data by taking into account one of their basic characteristics, resulting from the necessarily partial AO correction: the Strehl ratio. Methods: We apply a Strehl constraint in the framework of iterative blind deconvolution (IBD) of post-AO near-infrared images simulated in a detailed end-to-end manner and considering a case that is as realistic as possible. Results: The results obtained clearly show the advantage of using such a constraint, from the point of view of both performance and stability, especially for poorly AO-corrected data. The proposed algorithm has been implemented in the freely-distributed and CAOS-based Software Package AIRY.
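
    A schematic sketch of the idea, assuming Richardson-Lucy style alternating updates with the Strehl constraint imposed as a cap on the PSF peak; the published algorithm and the AIRY implementation differ in their details.

```python
import numpy as np
from scipy.signal import fftconvolve

def rl_step(estimate, other, image, eps=1e-12):
    # One Richardson-Lucy multiplicative update; eps avoids division by zero.
    ratio = image / (fftconvolve(estimate, other, mode="same") + eps)
    return estimate * fftconvolve(ratio, other[::-1, ::-1], mode="same")

def strehl_constrained_ibd(image, psf_dl, strehl, n_iter=50):
    """Schematic Strehl-constrained iterative blind deconvolution.

    image  : observed post-AO frame (non-negative)
    psf_dl : diffraction-limited PSF, normalized to unit sum
    strehl : assumed Strehl ratio of the data
    """
    obj = np.full_like(image, image.mean())
    psf = psf_dl.copy()
    for _ in range(n_iter):
        obj = rl_step(obj, psf, image)   # update the object, PSF held fixed
        psf = rl_step(psf, obj, image)   # update the PSF, object held fixed
        psf = np.clip(psf, 0.0, None)
        psf /= psf.sum()                 # unit energy
        # Strehl constraint: cap the PSF peak at the value implied by
        # the assumed Strehl ratio times the diffraction-limited peak.
        psf = np.minimum(psf, strehl * psf_dl.max())
        psf /= psf.sum()
    return obj, psf
```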

  20. Developing the online survey.

    PubMed

    Gordon, Jeffry S; McNew, Ryan

    2008-12-01

    Institutions of higher education are now using Internet-based technology tools to conduct surveys for data collection. Research shows that the type and quality of responses one receives with online surveys are comparable with what one receives in paper-based surveys. Data collection can take place on Web-based surveys, e-mail-based surveys, and personal digital assistants/Smartphone devices. Web surveys can be subscription templates, software packages installed on one's own server, or created from scratch using Web programming development tools. All of these approaches have their advantages and disadvantages. The survey owner must make informed decisions as to the right technology to implement. The correct choice can save hours of work in sorting, organizing, and analyzing data.

  1. Populating the Semantic Web by Macro-reading Internet Text

    NASA Astrophysics Data System (ADS)

    Mitchell, Tom M.; Betteridge, Justin; Carlson, Andrew; Hruschka, Estevam; Wang, Richard

    A key question regarding the future of the semantic web is "how will we acquire structured information to populate the semantic web on a vast scale?" One approach is to enter this information manually. A second approach is to take advantage of pre-existing databases, and to develop common ontologies, publishing standards, and reward systems to make this data widely accessible. We consider here a third approach: developing software that automatically extracts structured information from unstructured text present on the web. We also describe preliminary results demonstrating that machine learning algorithms can learn to extract tens of thousands of facts to populate a diverse ontology, with imperfect but reasonably good accuracy.

  2. A Tool for Longitudinal Beam Dynamics in Synchrotrons

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ostiguy, J.-F.; Lebedev, V. A.

    2017-05-01

    A number of codes are available to simulate longitudinal dynamics in synchrotrons. Some established ones include TIBETAN, LONG1D, ESME and ORBIT. While they embody a wealth of accumulated wisdom and experience, most of these codes were written decades ago and to some extent they reflect the constraints of their time. As a result, there is interest in updated tools taking better advantage of modern software and hardware capabilities. At Fermilab, the PIP-II project has provided the impetus for development of such a tool. In this contribution, we discuss design decisions and code architecture. A selection of test cases based on an initial prototype is also presented.
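
    For context, the sketch below tracks the textbook longitudinal kick-drift map that codes of this kind integrate turn by turn; the parameter values are illustrative and unrelated to PIP-II.

```python
import numpy as np

def track_turn(phi, delta, phi_s=0.0, eV_over_E=1e-6, drift_coeff=0.05):
    """One turn of the standard longitudinal kick-drift map.

    phi, delta : RF phase and fractional momentum deviation (arrays)
    The RF kick strength and the drift coefficient (harmonic number and
    slip factor folded together) are placeholder values.
    """
    # Energy kick from the RF cavity relative to the synchronous particle.
    delta = delta + eV_over_E * (np.sin(phi) - np.sin(phi_s))
    # Phase drift driven by the momentum offset.
    phi = phi + drift_coeff * delta
    return phi, delta

# Track a small bunch for many turns.
rng = np.random.default_rng(1)
phi = rng.normal(0.0, 0.1, 1000)
delta = rng.normal(0.0, 1e-4, 1000)
for _ in range(5000):
    phi, delta = track_turn(phi, delta)
```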

  3. Organizing Space Shuttle parametric data for maintainability

    NASA Technical Reports Server (NTRS)

    Angier, R. C.

    1983-01-01

    A model of organization and management of Space Shuttle data is proposed. Shuttle avionics software is parametrically altered by a reconfiguration process for each flight. As the flight rate approaches an operational level, current methods of data management would become increasingly complex. An alternative method is introduced, using modularized standard data, and its implications for data collection, integration, validation, and reconfiguration processes are explored. Information modules are cataloged for later use, and may be combined in several levels for maintenance. For each flight, information modules can then be selected from the catalog at a high level. These concepts take advantage of the reusability of Space Shuttle information to reduce the cost of reconfiguration as flight experience increases.

  4. Tritium permeation model for plasma facing components

    NASA Astrophysics Data System (ADS)

    Longhurst, G. R.

    1992-12-01

    This report documents the development of a simplified one-dimensional tritium permeation and retention model. The model makes use of the same physical mechanisms as more sophisticated, time-transient codes, such as implantation, recombination, diffusion, trapping and thermal-gradient effects. It takes advantage of a number of simplifications and approximations to solve the steady-state problem and then provides interpolating functions to make estimates of intermediate states based on the steady-state solution. The model is developed for solution using commercial spreadsheet software such as Lotus 1-2-3. Comparison calculations against the verified and validated TMAP4 transient code show good agreement. Results of calculations for the ITER CDA divertor are also included.
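
    As a flavor of the spreadsheet-style steady-state estimates such a model enables, the sketch below evaluates the standard diffusion-limited permeation flux under Sieverts' law; the property values are placeholders, not ITER CDA inputs, and the real model includes recombination, trapping and thermal-gradient effects omitted here.

```python
import math

def permeation_flux(perm, p_up, p_down, thickness):
    """Steady-state, diffusion-limited permeation flux through a metal
    wall obeying Sieverts' law: J = Phi * (sqrt(p_up) - sqrt(p_down)) / d.

    perm      : permeability, mol m^-1 s^-1 Pa^-0.5 (placeholder below)
    p_up/down : upstream/downstream tritium partial pressures, Pa
    thickness : wall thickness, m
    """
    return perm * (math.sqrt(p_up) - math.sqrt(p_down)) / thickness

# Placeholder numbers only, not divertor design values.
print(permeation_flux(perm=1e-10, p_up=100.0, p_down=0.0, thickness=5e-3))
```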

  5. A Generic Communication Protocol for Remote Laboratories: an Implementation on e-lab

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henriques, Rafael B.; Fernandes, H.; Duarte, Andre S.

    2015-07-01

    The remote laboratories at IST (Instituto Superior Tecnico), e-lab, serve as a valuable tool for education and training based on remote control technologies. Due to the high number and increase of remotely operated experiments a generic protocol was developed to perform the communication between the software driver and the respective experimental setup in an easier and more unified way. The training in these fields of students and personnel can take advantage of such infrastructure with the purpose of deploying new experiments in a faster way. More than 10 experiments using the generic protocol are available on-line in a 24 xmore » 7 way. (authors)« less

  6. Quantitative Analysis of Venus Radar Backscatter Data in ArcGIS

    NASA Technical Reports Server (NTRS)

    Long, S. M.; Grosfils, E. B.

    2005-01-01

    Ongoing mapping of the Ganiki Planitia (V14) quadrangle of Venus and definition of material units has involved an integrated but qualitative analysis of Magellan radar backscatter images and topography using standard geomorphological mapping techniques. However, such analyses do not take full advantage of the quantitative information contained within the images. Analysis of the backscatter coefficient allows a much more rigorous statistical comparison between mapped units, permitting first-order self-similarity tests of geographically separated materials assigned identical geomorphological labels. Such analyses cannot be performed directly on pixel (DN) values from Magellan backscatter images, because the pixels are scaled to the Muhleman law for radar echoes on Venus and are not corrected for latitudinal variations in incidence angle. Therefore, DN values must be converted based on pixel latitude back to their backscatter coefficient values before accurate statistical analysis can occur. Here we present a method for performing the conversions and analysis of Magellan backscatter data using commonly available ArcGIS software and illustrate the advantages of the process for geological mapping.
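
    A sketch of the conversion structure only, assuming the commonly quoted form of the Muhleman backscatter law and a placeholder linear DN-to-dB scaling; the correct gain and offset constants must be taken from the Magellan product documentation, not from this example.

```python
import numpy as np

def muhleman_sigma0(theta_rad):
    # Muhleman's empirical average backscatter law for Venus.
    c, s = np.cos(theta_rad), np.sin(theta_rad)
    return 0.0118 * c / (s + 0.111 * c) ** 3

def dn_to_sigma0(dn, theta_rad, gain=0.2, offset=-20.0):
    """Undo the Muhleman scaling applied to Magellan image pixels.

    The linear DN-to-dB scaling (gain, offset) is a placeholder; the
    incidence angle theta_rad is derived from the pixel latitude.
    """
    db_rel = gain * np.asarray(dn, dtype=float) + offset  # dB relative to Muhleman
    return muhleman_sigma0(theta_rad) * 10.0 ** (db_rel / 10.0)
```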

  7. Courseware Review.

    ERIC Educational Resources Information Center

    Risley, John, Ed.

    1988-01-01

    Compares the features of the sonic rangers available from HRM Software, MICROMEASUREMENTS, NAGAWTIS Software Research, and PASCO Scientific for demonstrations and experiments in mechanics. Presents the advantages of the sonic rangers and the typical graphics displayed by each software package. (YP)

  8. Math Description Engine Software Development Kit

    NASA Technical Reports Server (NTRS)

    Shelton, Robert O.; Smith, Stephanie L.; Dexter, Dan E.; Hodgson, Terry R.

    2010-01-01

    The Math Description Engine Software Development Kit (MDE SDK) can be used by software developers to make computer-rendered graphs more accessible to blind and visually-impaired users. The MDE SDK generates alternative graph descriptions in two forms: textual descriptions and non-verbal sound renderings, or sonification. It also enables display of an animated trace of a graph sonification on a visual graph component, with color and line-thickness options for users having low vision or color-related impairments. A set of accessible graphical user interface widgets is provided for operation by end users and for control of accessible graph displays. Version 1.0 of the MDE SDK generates text descriptions for 2D graphs commonly seen in math and science curriculum (and practice). The mathematically rich text descriptions can also serve as a virtual math and science assistant for blind and sighted users, making graphs more accessible for everyone. The MDE SDK has a simple application programming interface (API) that makes it easy for programmers and Web-site developers to make graphs accessible with just a few lines of code. The source code is written in Java for cross-platform compatibility and to take advantage of Java's built-in support for building accessible software application interfaces. Compiled-library and NASA Open Source versions are available with API documentation and Programmer's Guide at http://prime.jsc.nasa.gov.

  9. The EMIR experience in the use of software control simulators to speed up the time to telescope

    NASA Astrophysics Data System (ADS)

    Lopez Ramos, Pablo; López-Ruiz, J. C.; Moreno Arce, Heidy; Rosich, Josefina; Perez Menor, José Maria

    2012-09-01

    One of the main problems facing development teams working on instrument control systems is the need to access mechanisms which are not available until well into the integration phase. The need to work with real hardware creates additional problems: among others, certain faults cannot be tested due to the possibility of hardware damage, taking the system to the limit may shorten its operational lifespan, and the full system may not be available during some periods due to maintenance and/or testing of individual components. These problems can be addressed with the use of simulators and by applying software/hardware standards. Since information on the construction and performance of electro-mechanical systems is available at relatively early stages of the project, simulators are developed in advance (before the existence of the mechanism) or, if conventions and standards have been correctly followed, a previously developed simulator might be reused. This article describes our experience in building software simulators and the main advantages we have identified: the control software can be developed even in the absence of real hardware; critical tests can be prepared using the simulated systems; system behavior can be tested in hardware-failure situations that would put the real system at risk; and in-house integration of the entire instrument is sped up. The use of simulators allows us to reduce development, testing and integration time.

  10. ScaffoldSeq: Software for characterization of directed evolution populations.

    PubMed

    Woldring, Daniel R; Holec, Patrick V; Hackel, Benjamin J

    2016-07-01

    ScaffoldSeq is software designed for the numerous applications, including directed evolution analysis, in which a user generates a population of DNA sequences encoding for partially diverse proteins with related functions and would like to characterize the single-site and pairwise amino acid frequencies across the population. A common scenario for enzyme maturation, antibody screening, and alternative scaffold engineering involves naïve and evolved populations that contain diversified regions, varying in both sequence and length, within a conserved framework. Analyzing the diversified regions of such populations is facilitated by high-throughput sequencing platforms; however, length variability within these regions (e.g., antibody CDRs) encumbers the alignment process. To overcome this challenge, the ScaffoldSeq algorithm takes advantage of conserved framework sequences to quickly identify diverse regions. Beyond this, unintended biases in sequence frequency are generated throughout the experimental workflow required to evolve and isolate clones of interest prior to DNA sequencing. ScaffoldSeq software uniquely handles this issue by providing tools to quantify and remove background sequences, cluster similar protein families, and dampen the impact of dominant clones. The software produces graphical and tabular summaries for each region of interest, allowing users to evaluate diversity in a site-specific manner as well as identify epistatic pairwise interactions. The code and detailed information are freely available at http://research.cems.umn.edu/hackel. Proteins 2016; 84:869-874. © 2016 Wiley Periodicals, Inc.
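
    The flank-anchored extraction step can be pictured with a few lines of Python; the framework sequences below are invented for illustration and this is not ScaffoldSeq code.

```python
import re

# Illustrative conserved framework flanks; a real analysis would use
# the scaffold's actual framework sequences.
LEFT, RIGHT = "TGGCCA", "GGTACC"
PATTERN = re.compile(re.escape(LEFT) + "([ACGT]+?)" + re.escape(RIGHT))

def diversified_region(read):
    """Pull the length-variable diversified region out of a read by
    anchoring on the conserved framework on either side."""
    m = PATTERN.search(read)
    return m.group(1) if m else None

print(diversified_region("AAATGGCCACGTACGTTGGTACCAAA"))  # -> CGTACGTT
```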

  11. Portable Medical Laboratory Applications Software

    PubMed Central

    Silbert, Jerome A.

    1983-01-01

    Portability implies that a program can be run on a variety of computers with minimal software revision. The advantages of portability are outlined and design considerations for portable laboratory software are discussed. Specific approaches for achieving this goal are presented.

  12. 75 FR 21077 - Self-Regulatory Organizations; New York Stock Exchange LLC; Notice of Filing of Proposed Rule...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-22

    ... of the NYSE BBO service in any calendar month. In order to take advantage of the per-query fee, a... most likely to take advantage of the proposed service; (iv) The contribution of market data revenues that the Exchange believes is appropriate for entities that are most likely to take advantage of the...

  13. Web-Based Tools for Data Visualization and Decision Support for South Asia

    NASA Astrophysics Data System (ADS)

    Jones, N.; Nelson, J.; Pulla, S. T.; Ames, D. P.; Souffront, M.; David, C. H.; Zaitchik, B. F.; Gatlin, P. N.; Matin, M. A.

    2017-12-01

    The objective of the NASA SERVIR project is to assist developing countries in using information provided by Earth observing satellites to assess and manage climate risks, land use, and water resources. We present a collection of web apps that integrate Earth observations and in situ data to facilitate deployment of data and water resources models as decision-making tools in support of this effort. The interactive nature of web apps makes this an excellent medium for creating decision support tools that harness cutting-edge modeling techniques. Thin-client apps hosted in a cloud portal eliminate the need for decision makers to procure and maintain the high-performance hardware required by the models, deal with issues related to software installation and platform incompatibilities, or monitor and install software updates, a problem that is exacerbated for many of the regional SERVIR hubs where both financial and technical capacity may be limited. All that is needed to use the system is an Internet connection and a web browser. We take advantage of these technologies to develop tools which can be centrally maintained but openly accessible. Advanced mapping and visualization make results intuitive and the derived information actionable. We also take advantage of the emerging standards for sharing water information across the web using the OGC- and WMO-approved WaterML standards. This makes our tools interoperable and extensible via application programming interfaces (APIs) so that tools and data from other projects can both consume and share the tools developed in our project. Our approach enables the integration of multiple types of data and models, thus facilitating collaboration between science teams in SERVIR. The apps developed thus far by our team process time-varying netCDF files from Earth observations and large-scale computer simulations and allow visualization and exploration via raster animation and extraction of time series at selected points and/or regions.
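
    A minimal sketch of the kind of point extraction such apps perform, assuming a netCDF file with 1-D lat/lon coordinate variables and a (time, lat, lon) variable layout; the file, variable, and dimension names are hypothetical.

```python
import numpy as np
from netCDF4 import Dataset

def point_time_series(path, var, lat0, lon0):
    """Extract the time series of `var` at the grid cell nearest
    (lat0, lon0) from a time-varying netCDF file."""
    with Dataset(path) as nc:
        lats = nc.variables["lat"][:]
        lons = nc.variables["lon"][:]
        i = int(np.abs(lats - lat0).argmin())   # nearest latitude index
        j = int(np.abs(lons - lon0).argmin())   # nearest longitude index
        return nc.variables[var][:, i, j]        # assumes (time, lat, lon)

# Hypothetical usage for a runoff field over Kathmandu:
# series = point_time_series("gldas_runoff.nc", "runoff", 27.7, 85.3)
```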

  14. Educational Software: A Developer's Perspective.

    ERIC Educational Resources Information Center

    Armstrong, Timothy C.; Loane, Russell F.

    1994-01-01

    Examines the current status and short-term future of computer software development in higher education. Topics discussed include educational advantages of software; current program development techniques, including object oriented programming; and market trends, including IBM versus Macintosh and multimedia programs. (LRW)

  15. Open-Source Software in Computational Research: A Case Study

    DOE PAGES

    Syamlal, Madhava; O'Brien, Thomas J.; Benyahia, Sofiane; ...

    2008-01-01

    A case study of open-source (OS) development of the computational research software MFIX, used for multiphase computational fluid dynamics simulations, is presented here. The verification and validation steps required for constructing modern computational software and the advantages of OS development in those steps are discussed. The infrastructure used for enabling the OS development of MFIX is described. The impact of OS development on computational research and education in gas-solids flow, as well as the dissemination of information to other areas such as geophysical and volcanology research, is demonstrated. This study shows that the advantages of OS development were realized in the case of MFIX: verification by many users, which enhances software quality; the use of software as a means for accumulating and exchanging information; and the facilitation of peer review of the results of computational research.

  16. 5 CFR 792.207 - When does the child care subsidy program law become effective and how may agencies take advantage...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false When does the child care subsidy program law become effective and how may agencies take advantage of this law? 792.207 Section 792.207... When does the child care subsidy program law become effective and how may agencies take advantage of...

  17. Learning motion concepts using real-time microcomputer-based laboratory tools

    NASA Astrophysics Data System (ADS)

    Thornton, Ronald K.; Sokoloff, David R.

    1990-09-01

    Microcomputer-based laboratory (MBL) tools have been developed which interface to Apple II and Macintosh computers. Students use these tools to collect physical data that are graphed in real time and then can be manipulated and analyzed. The MBL tools have made possible discovery-based laboratory curricula that embody results from educational research. These curricula allow students to take an active role in their learning and encourage them to construct physical knowledge from observation of the physical world. The curricula encourage collaborative learning by taking advantage of the fact that MBL tools present data in an immediately understandable graphical form. This article describes one of the tools—the motion detector (hardware and software)—and the kinematics curriculum. The effectiveness of this curriculum compared to traditional college and university methods for helping students learn basic kinematics concepts has been evaluated by pre- and post-testing and by observation. There is strong evidence for significantly improved learning and retention by students who used the MBL materials, compared to those taught in lecture.

  18. Using CASE Software to Teach Undergraduates Systems Analysis and Design.

    ERIC Educational Resources Information Center

    Wilcox, Russell E.

    1988-01-01

    Describes the design and delivery of a college course for information system students utilizing a Computer-Aided Software Engineering program. Discusses class assignments, cooperative learning, student attitudes, and the advantages of using this software in the course. (CW)

  19. Open Source Software in Medium Size Organizations: Key Factors for Adoption

    ERIC Educational Resources Information Center

    Solomon, Jerry T.

    2010-01-01

    For-profit organizations are constantly evaluating new technologies to gain competitive advantage. One such technology, application software, has changed significantly over the past 25 years with the introduction of Open Source Software (OSS). In contrast to commercial software that is developed by private companies and sold to organizations, OSS…

  20. Wavelet-enabled progressive data Access and Storage Protocol (WASP)

    NASA Astrophysics Data System (ADS)

    Clyne, J.; Frank, L.; Lesperance, T.; Norton, A.

    2015-12-01

    Current practices for storing numerical simulation outputs hail from an era when the disparity between compute and I/O performance was not as great as it is today. The memory contents for every sample, computed at every grid point location, are simply saved at some prescribed temporal frequency. Though straightforward, this approach fails to take advantage of the coherency in neighboring grid points that invariably exists in numerical solutions to mathematical models. Exploiting such coherence is essential to digital multimedia; DVD-Video, digital cameras, streaming movies and audio are all possible today because of transform-based compression schemes that make substantial reductions in data possible by taking advantage of the strong correlation between adjacent samples in both space and time. Such methods can also be exploited to enable progressive data refinement in a manner akin to that used in ubiquitous digital mapping applications: views from far away are shown in coarsened detail to provide context, and can be progressively refined as the user zooms in on a localized region of interest. The NSF-funded WASP project aims to provide a common, NetCDF-compatible software framework for supporting wavelet-based, multi-scale, progressive data access, enabling interactive exploration of large data sets for the geoscience communities. This presentation will provide an overview of this work in progress to develop community cyber-infrastructure for the efficient analysis of very large data sets.
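
    A generic PyWavelets sketch of the progressive-refinement idea, not the WASP format itself: decompose a field once, then reconstruct it with progressively more detail levels retained.

```python
import numpy as np
import pywt

# Decompose a field once; reconstruct at increasing detail by zeroing
# the fine-scale coefficients that a far-away view does not need.
field = np.random.default_rng(0).standard_normal((256, 256))
coeffs = pywt.wavedec2(field, "db4", level=4)

def reconstruct(coeffs, keep_levels, wavelet="db4"):
    """Rebuild the field keeping only the coarsest `keep_levels`
    detail bands; the rest are zeroed (i.e., not yet transmitted)."""
    kept = [coeffs[0]]  # coarsest approximation always kept
    for k, detail in enumerate(coeffs[1:], start=1):
        if k <= keep_levels:
            kept.append(detail)
        else:
            kept.append(tuple(np.zeros_like(d) for d in detail))
    return pywt.waverec2(kept, wavelet)

coarse = reconstruct(coeffs, 1)  # context view, little data needed
full = reconstruct(coeffs, 4)    # fully refined region of interest
```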

  1. Comparison of cyclic correlation and the wavelet method for symbol rate detection

    NASA Astrophysics Data System (ADS)

    Carr, Richard; Whitney, James

    Software defined radio (SDR) is a relatively new technology that holds a great deal of promise in the communication field in general, and, in particular, the area of space communications. Traditional communication systems are comprised of a transmitter and a receiver, where, through prior planning and scheduling, the transmitter and receiver are pre-configured for a particular communication modality. For any particular modality the radio circuitry is configured to transmit, receive, and resolve one type of modulation at a certain data rate. Traditional radios are limited by the fact that the circuitry is fixed. Software defined radios, on the other hand, do not suffer from this limitation. SDRs are comprised mainly of software modules which allow them to be flexible, in that they can resolve various types of modulation that occur at different data rates. This ability is of very high importance in space, where parameters of the communications link may need to be changed due to channel fading, reduced power, or other unforeseen events. In these cases the ability to autonomously change aspects of the radio's configuration becomes an absolute necessity in order to maintain communications. In order for the technology to work, the receiver has to be able to determine the modulation type and the data rate of the signal. The data rate of the signal is one of the first parameters to be resolved, as it is needed to find the other signal parameters such as modulation type and the signal-to-noise ratio. There are a number of algorithms that have been developed to detect or estimate the data rate of a signal. This paper investigates two of these algorithms, namely, the cyclic correlation algorithm and a wavelet-based detection algorithm. Both are feature-based algorithms, meaning that they make their estimations based on certain inherent features of the signals to which they are applied. The cyclic correlation algorithm takes advantage of the cyclostationary nature of MPSK signals, while the wavelet-based algorithm takes advantage of its ability to detect transient changes in the signal, i.e., transitions from '1' to '0'. Both algorithms are tested under various signal-to-noise conditions to see which has the better performance, and the results are presented in this paper.
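
    A minimal sketch of the cyclic-feature idea on a band-limited PSK burst: the squared envelope of such a signal carries a spectral line at the symbol rate. The signal parameters below are invented for the test, and this is a simplification of the full cyclic correlation algorithm.

```python
import numpy as np

def estimate_symbol_rate(x, fs, f_min=100.0):
    """Estimate the symbol rate from the spectral line that the
    squared envelope of a band-limited PSK signal exhibits."""
    env = np.abs(x) ** 2
    env = env - env.mean()                      # suppress the DC line
    spec = np.abs(np.fft.rfft(env))
    freqs = np.fft.rfftfreq(env.size, d=1.0 / fs)
    band = freqs > f_min                        # ignore low-frequency content
    return freqs[band][spec[band].argmax()]

# Toy test: BPSK at 1 kbaud with smoothed (band-limited) transitions.
rng = np.random.default_rng(2)
fs, sps = 16000, 16                             # 16 kHz sampling, 16 samples/symbol
bits = np.sign(rng.standard_normal(512))
base = np.convolve(np.repeat(bits, sps), np.hamming(sps), mode="same")
x = base * np.exp(2j * np.pi * 2000.0 * np.arange(base.size) / fs)
print(estimate_symbol_rate(x + 0.1 * rng.standard_normal(base.size), fs))  # ~1000 Hz
```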

  2. AWIPS II in the University Community: Unidata's efforts and capabilities of the software

    NASA Astrophysics Data System (ADS)

    Ramamurthy, Mohan; James, Michael

    2015-04-01

    The Advanced Weather Interactive Processing System, version II (AWIPS II) is a weather forecasting, display and analysis tool that is used by the National Oceanic and Atmospheric Administration/National Weather Service (NOAA/NWS) and the National Centers for Environmental Prediction (NCEP) to ingest, analyze and disseminate operational weather data. The AWIPS II software is built on a Service Oriented Architecture, takes advantage of open source software, and its design affords expandability, flexibility, and portability. Since many university meteorology programs are eager to use the same tools used by NWS forecasters, Unidata community interest in AWIPS II is high. The Unidata Program Center (UPC) has worked closely with NCEP staff during AWIPS II development in order to devise a way to make it available to the university community. The Unidata AWIPS II software was released in beta form in 2014, and it incorporates a number of key changes to the baseline U.S. National Weather Service release to process and display additional data formats and run all components in a single-server standalone configuration. In addition to making available open-source instances of the software libraries that can be downloaded and run at any university, Unidata has also deployed the data-server side of AWIPS II, known as EDEX, in the Amazon Web Services and Microsoft Azure cloud environments. In this setup, universities receive all of the data from remote cloud instances, while they only have to run the AWIPS II client, known as CAVE, to analyze and visualize the data. In this presentation, we will describe Unidata's AWIPS II efforts, including the capabilities of the software in visualizing many different types of real-time meteorological data and its myriad uses in university and other settings.

  3. Development of a software framework for data assimilation and its applications for streamflow forecasting in Japan

    NASA Astrophysics Data System (ADS)

    Noh, S. J.; Tachikawa, Y.; Shiiba, M.; Yorozu, K.; Kim, S.

    2012-04-01

    Data assimilation methods have received increased attention as a way to accomplish uncertainty assessment and to enhance forecasting capability in various areas. Despite their potential, applicable software frameworks for probabilistic approaches and data assimilation are still limited because most hydrologic modeling software is based on a deterministic approach. In this study, we developed a hydrological modeling framework for sequential data assimilation, called MPI-OHyMoS. MPI-OHyMoS allows users to develop their own element models and to easily build a total simulation system model for hydrological simulations. Unlike process-based modeling frameworks, this software framework benefits from its object-oriented design to flexibly represent hydrological processes without any change to the main library. Sequential data assimilation based on particle filters is available for any hydrologic model built on MPI-OHyMoS, considering various sources of uncertainty originating from input forcing, parameters and observations. The particle filters are a Bayesian learning process in which the propagation of all uncertainties is carried out by a suitable selection of randomly generated particles without any assumptions about the nature of the distributions. In MPI-OHyMoS, ensemble simulations are parallelized, which can take advantage of high-performance computing (HPC) systems. We applied this software framework to short-term streamflow forecasting of several catchments in Japan using a distributed hydrologic model. Uncertainty in model parameters and in remotely sensed rainfall data such as X-band or C-band radar is estimated and mitigated in the sequential data assimilation.
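
    A minimal bootstrap (SIR) particle-filter cycle of the kind such a framework parallelizes across ensemble members; the `propagate` and `likelihood` callables stand in for the hydrologic model and the observation operator, and all names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_step(particles, weights, observation, propagate, likelihood):
    """One assimilation cycle of a bootstrap (SIR) particle filter.

    propagate : advances each particle with the (stochastic) model
    likelihood: p(observation | particle), evaluated per particle
    """
    particles = propagate(particles)                  # model forecast + noise
    weights = weights * likelihood(observation, particles)
    weights = weights / weights.sum()
    # Resample when the effective ensemble size collapses.
    n = len(weights)
    if 1.0 / np.sum(weights ** 2) < 0.5 * n:
        idx = rng.choice(n, size=n, p=weights)
        particles = particles[idx]
        weights = np.full(n, 1.0 / n)
    return particles, weights
```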

  4. OsiriX: an open-source software for navigating in multidimensional DICOM images.

    PubMed

    Rosset, Antoine; Spadola, Luca; Ratib, Osman

    2004-09-01

    A multidimensional image navigation and display software was designed for display and interpretation of large sets of multidimensional and multimodality images such as combined PET-CT studies. The software is developed in Objective-C on a Macintosh platform under the MacOS X operating system using the GNUstep development environment. It also benefits from the extremely fast and optimized 3D graphic capabilities of the OpenGL graphic standard widely used for computer games optimized for taking advantage of any hardware graphic accelerator boards available. In the design of the software special attention was given to adapt the user interface to the specific and complex tasks of navigating through large sets of image data. An interactive jog-wheel device widely used in the video and movie industry was implemented to allow users to navigate in the different dimensions of an image set much faster than with a traditional mouse or on-screen cursors and sliders. The program can easily be adapted for very specific tasks that require a limited number of functions, by adding and removing tools from the program's toolbar and avoiding an overwhelming number of unnecessary tools and functions. The processing and image rendering tools of the software are based on the open-source libraries ITK and VTK. This ensures that all new developments in image processing that could emerge from other academic institutions using these libraries can be directly ported to the OsiriX program. OsiriX is provided free of charge under the GNU open-source licensing agreement at http://homepage.mac.com/rossetantoine/osirix.

  5. Implementation of highly parallel and large scale GW calculations within the OpenAtom software

    NASA Astrophysics Data System (ADS)

    Ismail-Beigi, Sohrab

    The need to describe electronic excitations with better accuracy than provided by band structures produced by Density Functional Theory (DFT) has been a long-term enterprise for the computational condensed matter and materials theory communities. In some cases, appropriate theoretical frameworks have existed for some time but have been difficult to apply widely due to computational cost. For example, the GW approximation incorporates a great deal of important non-local and dynamical electronic interaction effects but has been too computationally expensive for routine use in large materials simulations. OpenAtom is an open source massively parallel ab initio density functional software package based on plane waves and pseudopotentials (http://charm.cs.uiuc.edu/OpenAtom/) that takes advantage of the Charm++ parallel framework. At present, it is developed via a three-way collaboration, funded by an NSF SI2-SSI grant (ACI-1339804), between Yale (Ismail-Beigi), IBM T. J. Watson (Glenn Martyna) and the University of Illinois at Urbana-Champaign (Laxmikant Kale). We will describe the project and our current approach towards implementing large scale GW calculations with OpenAtom. Potential applications of large scale parallel GW software for problems involving electronic excitations in semiconductor and/or metal oxide systems will also be pointed out.

  6. Meta-analysis in Stata using gllamm.

    PubMed

    Bagos, Pantelis G

    2015-12-01

    There are several user-written programs for performing meta-analysis in Stata (Stata Statistical Software: College Station, TX: Stata Corp LP). These include metan, metareg, mvmeta, and glst. However, there are several cases for which these programs do not suffice. For instance, there is no software for performing univariate meta-analysis with correlated estimates, for multilevel or hierarchical meta-analysis, or for meta-analysis of longitudinal data. In this work, we show with practical applications that many disparate models, including but not limited to the ones mentioned earlier, can be fitted using gllamm. The software is very versatile and can handle a wide variety of models with applications in a wide range of disciplines. The method presented here takes advantage of these modeling capabilities and makes use of appropriate transformations, based on the Cholesky decomposition of the inverse of the covariance matrix, known as generalized least squares, in order to handle correlated data. The models described earlier can be thought of as special instances of a general linear mixed-model formulation, but to the author's knowledge, a general exposition in order to incorporate all the available models for meta-analysis as special cases and the instructions to fit them in Stata has not been presented so far. Source code is available at http://www.compgen.org/tools/gllamm. Copyright © 2015 John Wiley & Sons, Ltd.
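
    The Cholesky/GLS transformation is easy to state outside Stata; the NumPy sketch below whitens a toy set of three correlated effect estimates and fits an intercept-only (pooled-effect) model. Whitening with L from V = LL' is equivalent to weighting by the inverse covariance, and the numbers are invented.

```python
import numpy as np

def gls(X, y, V):
    """Generalized least squares via Cholesky whitening: with V = L L',
    transform X and y by L^-1 and run ordinary least squares."""
    L = np.linalg.cholesky(V)
    Xw = np.linalg.solve(L, X)   # L^-1 X
    yw = np.linalg.solve(L, y)   # L^-1 y
    beta, *_ = np.linalg.lstsq(Xw, yw, rcond=None)
    return beta

# Toy meta-analysis: three correlated effect estimates, intercept-only model.
y = np.array([0.30, 0.25, 0.40])
V = np.array([[0.04, 0.01, 0.00],
              [0.01, 0.05, 0.01],
              [0.00, 0.01, 0.03]])
X = np.ones((3, 1))
print(gls(X, y, V))  # pooled effect estimate
```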

  7. Contributing opportunistic resources to the grid with HTCondor-CE-Bosco

    NASA Astrophysics Data System (ADS)

    Weitzel, Derek; Bockelman, Brian

    2017-10-01

    The HTCondor-CE [1] is the primary Compute Element (CE) software for the Open Science Grid. While it offers many advantages for large sites, for smaller, WLCG Tier-3 sites or opportunistic clusters, it can be a difficult task to install, configure, and maintain the HTCondor-CE. Installing a CE typically involves understanding several pieces of software, installing hundreds of packages on a dedicated node, updating several configuration files, and implementing grid authentication mechanisms. On the other hand, accessing remote clusters from personal computers has been dramatically improved with Bosco: site admins only need to setup SSH public key authentication and appropriate accounts on a login host. In this paper, we take a new approach with the HTCondor-CE-Bosco, a CE which combines the flexibility and reliability of the HTCondor-CE with the easy-to-install Bosco. The administrators of the opportunistic resource are not required to install any software: only SSH access and a user account are required from the host site. The OSG can then run the grid-specific portions from a central location. This provides a new, more centralized, model for running grid services, which complements the traditional distributed model. We will show the architecture of a HTCondor-CE-Bosco enabled site, as well as feedback from multiple sites that have deployed it.

  8. Using Mach threads to control DSN operational sequences

    NASA Technical Reports Server (NTRS)

    Urista, Juan

    1993-01-01

    The Link Monitor and Control Operator Assistant prototype (LMCOA) is a state-of-the-art, semiautomated monitor and control system based on an object-oriented design. The purpose of the LMCOA prototyping effort is to both investigate new technology (such as artificial intelligence) to support automation and to evaluate advances in information systems toward developing systems that take advantage of the technology. The emergence of object-oriented design methodology has enabled a major change in how software is designed and developed. This paper describes how the object-oriented approach was used to design and implement the LMCOA and the results of operational testing. The LMCOA is implemented on a NeXT workstation using the Mach operating system and the Objective-C programming language.

  9. Development of a multimedia CD-ROM on telemedicine and teleradiology

    NASA Astrophysics Data System (ADS)

    Schnur, Mark T.; Williamson, Morgan P.; Goeringer, Fred; Zimnik, Paul; Linn, Reid; Suitor, Charles T.; Rocca, Mitra A.; Strother, Thomas

    1996-04-01

    The Department of Defense Telemedicine Test Bed produced a CD-ROM including information on telemedicine, teleradiology and military medical advanced technology projects. The CD-ROM was produced using media from the Telemedicine Test Bed World Wide Web site and academic papers and presentations. Apple Media Tools software was used to produce the interactive program and the authoring was done on a high speed Apple Macintosh Power PC computer. The process took roughly 100 hours to author 50 MB of data into 200 frames of interactive material. Future versions of the Telemedicine CD-ROM are in progress which will include much more material to take advantage of the 650 MB available on a compact disc. This paper graphically depicts and explains the authoring process.

  10. Medical image informatics infrastructure design and applications.

    PubMed

    Huang, H K; Wong, S T; Pietka, E

    1997-01-01

    Picture archiving and communication systems (PACS) is a system integration of multimodality images and health information systems designed for improving the operation of a radiology department. As it evolves, PACS becomes a hospital image document management system with a voluminous image and related data file repository. A medical image informatics infrastructure can be designed to take advantage of existing data, providing PACS with add-on value for health care service, research, and education. A medical image informatics infrastructure (MIII) consists of the following components: medical images and associated data (including PACS database), image processing, data/knowledge base management, visualization, graphic user interface, communication networking, and application oriented software. This paper describes these components and their logical connection, and illustrates some applications based on the concept of the MIII.

  11. Accuracy of laser-scanned models compared to plaster models and cone-beam computed tomography.

    PubMed

    Kim, Jooseong; Heo, Giseon; Lagravère, Manuel O

    2014-05-01

    To compare the accuracy of measurements obtained from three-dimensional (3D) laser scans to those taken from cone-beam computed tomography (CBCT) scans and those obtained from plaster models. Eighteen different measurements, encompassing mesiodistal width of teeth and both maxillary and mandibular arch length and width, were selected using various landmarks. CBCT scans and plaster models were prepared from 60 patients. Plaster models were scanned using the Ortho Insight 3D laser scanner, and the selected landmarks were measured using its software. CBCT scans were imported and analyzed using the Avizo software, and the 26 landmarks corresponding to the selected measurements were located and recorded. The plaster models were also measured using a digital caliper. Descriptive statistics and the intraclass correlation coefficient (ICC) were used to analyze the data. The ICC results showed that the values obtained by the three different methods were highly correlated in all measurements, all having correlations > 0.808. When checking the differences between values and methods, the largest mean difference found was 0.59 mm ± 0.38 mm. In conclusion, plaster models, CBCT models, and laser-scanned models are three different diagnostic records, each with its own advantages and disadvantages. The present results showed that the laser-scanned models are highly accurate compared with plaster models and CBCT scans. This gives general clinicians an alternative, taking into consideration the advantages of laser-scanned models over plaster models and CBCT reconstructions.

  12. Evaluation of the FIR Example using Xilinx Vivado High-Level Synthesis Compiler

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Zheming; Finkel, Hal; Yoshii, Kazutomo

    Compared to central processing units (CPUs) and graphics processing units (GPUs), field programmable gate arrays (FPGAs) have major advantages in reconfigurability and performance achieved per watt. The FPGA development flow has been augmented with a high-level synthesis (HLS) flow that can convert programs written in a high-level programming language to a Hardware Description Language (HDL). Using high-level programming languages such as C, C++, and OpenCL for FPGA-based development allows software developers who have little FPGA knowledge to take advantage of FPGA-based application acceleration. This improves developer productivity and makes FPGA-based acceleration accessible to hardware and software developers. The Xilinx Vivado HLS compiler is a high-level synthesis tool that enables C, C++ and SystemC specifications to be directly targeted into Xilinx FPGAs without the need to create RTL manually. The white paper [1] published recently by Xilinx uses a finite impulse response (FIR) example to demonstrate the variable-precision features in the Vivado HLS compiler and the resource and power benefits of converting a design from floating point to fixed point. To get a better understanding of the variable-precision features in terms of resource usage and performance, this report presents the experimental results of evaluating the FIR example using Vivado HLS 2017.1 and a Kintex UltraScale FPGA. In addition, we evaluated the half-precision floating-point data type against the double-precision and single-precision data types and present the detailed results.
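
    Not Vivado HLS code, but a quick way to see the float-versus-fixed trade-off the white paper quantifies in hardware resources: quantize FIR taps and inputs to a fixed-point grid and measure the error against a double-precision reference. The bit widths and signal values are arbitrary choices for illustration.

```python
import numpy as np

def to_fixed(x, frac_bits):
    """Round values to a fixed-point grid with `frac_bits` fractional
    bits, mimicking what an arbitrary-precision HLS type would store."""
    scale = 2.0 ** frac_bits
    return np.round(x * scale) / scale

rng = np.random.default_rng(0)
taps = rng.normal(0.0, 0.2, 32)   # FIR coefficients
x = rng.normal(0.0, 1.0, 4096)    # input signal

y_float = np.convolve(x, taps)                                # double-precision reference
y_fixed = np.convolve(to_fixed(x, 10), to_fixed(taps, 10))    # 10 fractional bits

print("max abs error:", np.max(np.abs(y_float - y_fixed)))
```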

  13. Incorporating Code-Based Software in an Introductory Statistics Course

    ERIC Educational Resources Information Center

    Doehler, Kirsten; Taylor, Laura

    2015-01-01

    This article is based on the experiences of two statistics professors who have taught students to write and effectively utilize code-based software in a college-level introductory statistics course. Advantages of using software and code-based software in this context are discussed. Suggestions are made on how to ease students into using code with…

  14. Stereo and IMU-Assisted Visual Odometry for Small Robots

    NASA Technical Reports Server (NTRS)

    2012-01-01

    This software performs two functions: (1) taking stereo image pairs as input, it computes stereo disparity maps from them by cross-correlation to achieve 3D (three-dimensional) perception; (2) taking a sequence of stereo image pairs as input, it tracks features in the image sequence to estimate the motion of the cameras between successive image pairs. A real-time stereo vision system with IMU (inertial measurement unit)-assisted visual odometry was implemented on a single 750 MHz/520 MHz OMAP3530 SoC (system on chip) from TI (Texas Instruments). Frame rates of 46 fps (frames per second) were achieved at QVGA (Quarter Video Graphics Array, i.e., 320×240) resolution, or 8 fps at VGA (Video Graphics Array, 640×480) resolution, while simultaneously tracking up to 200 features, taking full advantage of the OMAP3530's integer DSP (digital signal processor) and floating-point ARM processors. This is a substantial advancement over previous work, as the stereo implementation produces 146 Mde/s (millions of disparities evaluated per second) in 2.5 W, yielding a stereo energy efficiency of 58.8 Mde/J, which is 3.75× better than prior DSP stereo while providing more functionality.
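
    A didactic block-matching sketch of the disparity step, using sum-of-absolute-differences in place of the cross-correlation mentioned above; it is far from the optimized DSP implementation, and the block size and disparity range are arbitrary.

```python
import numpy as np

def disparity_sad(left, right, max_disp=32, block=7):
    """Brute-force block-matching stereo: for each pixel in the left
    image, pick the horizontal shift into the right image that
    minimizes the sum of absolute differences over a small window."""
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(half, h - half):
        for x in range(max_disp + half, w - half):
            patch = left[y - half:y + half + 1, x - half:x + half + 1]
            costs = [np.abs(patch - right[y - half:y + half + 1,
                                          x - d - half:x - d + half + 1]).sum()
                     for d in range(max_disp)]
            disp[y, x] = int(np.argmin(costs))
    return disp
```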

  15. Teaching Software Componentization: A Bar Chart Java Bean

    ERIC Educational Resources Information Center

    Mitri, Michel

    2010-01-01

    In the current object-oriented paradigm, software construction increasingly involves creating and utilizing "software components". These components can serve a variety of functions, from common algorithmic processes to database connectivity to graphical interfaces. The advantage of component architectures is that programmers can use pre-existing…

  16. LEGOS: Object-based software components for mission-critical systems. Final report, June 1, 1995--December 31, 1997

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-08-01

    An estimated 85% of the installed base of software is a custom application with a production quantity of one. In practice, almost 100% of military software systems are custom software. Paradoxically, the marginal costs of producing additional units are near zero. So why hasn't the software market, a market with high design costs and low production costs, evolved like other similar custom widget industries, such as automobiles and hardware chips? The military software industry seems immune to market pressures that have motivated a multilevel supply-chain structure in other widget industries: design cost recovery, improved quality through specialization, and rapid assembly from purchased components. The primary goal of the ComponentWare Consortium (CWC) technology plan was to overcome barriers to building and deploying mission-critical information systems by using verified, reusable software components (ComponentWare). The adoption of the ComponentWare infrastructure is predicated upon a critical mass of the leading platform vendors' inevitable adoption of emerging, object-based, distributed computing frameworks--initially CORBA and COM/OLE. The long-range goal of this work is to build and deploy military systems from verified reusable architectures. The promise of component-based applications is to enable developers to snap together new applications by mixing and matching prefabricated software components. A key result of this effort is the concept of reusable software architectures. A second important contribution is the notion that a software architecture is something that can be captured in a formal language and reused across multiple applications. The formalization and reuse of software architectures provide major cost and schedule improvements. The Unified Modeling Language (UML) is fast becoming the industry standard for object-oriented analysis and design notation for object-based systems. However, the lack of a standard real-time distributed object operating system, the lack of a standard Computer-Aided Software Engineering (CASE) tool notation, and the lack of a standard CASE tool repository have limited the realization of component software. The approach to fulfilling this need is the software component factory innovation. The factory approach takes advantage of emerging standards such as UML, CORBA, Java and the Internet. The key technical innovation of the software component factory is the ability to assemble and test new system configurations, as well as to assemble new tools on demand, from existing tools and architecture design repositories.

  17. Leveraging Existing Mission Tools in a Re-Usable, Component-Based Software Environment

    NASA Technical Reports Server (NTRS)

    Greene, Kevin; Grenander, Sven; Kurien, James; O'Reilly, Taifun

    2006-01-01

    Emerging methods in component-based software development offer significant advantages but may seem incompatible with existing mission operations applications. In this paper we relate our positive experiences integrating existing mission applications into component-based tools we are delivering to three missions. In most operations environments, a number of software applications have been integrated together to form the mission operations software. In contrast, with component-based software development, chunks of related functionality and data structures, referred to as components, can be individually delivered, integrated and re-used. With the advent of powerful tools for managing component-based development, complex software systems can potentially see significant benefits in ease of integration, testability and reusability from these techniques. These benefits motivate us to ask how component-based development techniques can be relevant in a mission operations environment, where there is significant investment in software tools that are not component-based and may not be written in languages for which component-based tools even exist. Trusted and complex software tools for sequencing, validation, navigation, and other vital functions cannot simply be re-written or abandoned in order to gain the advantages offered by emerging component-based software techniques. Thus some middle ground must be found. We have faced exactly this issue, and have found several solutions. Ensemble is an open platform for development, integration, and deployment of mission operations software that we are developing. Ensemble itself is an extension of an open-source, component-based software development platform called Eclipse. Due to the advantages of component-based development, we have been able to rapidly develop mission operations tools for three surface missions by mixing and matching from a common set of mission operations components. We have also had to determine how to integrate existing mission applications for sequence development, sequence validation, high-level activity planning, and other functions into a component-based environment. For each of these, we used a somewhat different technique based upon the structure and usage of the existing application.

  18. Mocking the weak lensing universe: The LensTools Python computing package

    NASA Astrophysics Data System (ADS)

    Petri, A.

    2016-10-01

    We present a newly developed software package which implements a wide range of routines frequently used in Weak Gravitational Lensing (WL). With the continuously increasing size of the WL scientific community, we feel that easy-to-use Application Program Interfaces (APIs) for common calculations are a necessity to ensure efficiency and coordination across different working groups. Coupled with existing open-source codes, such as CAMB (Lewis et al., 2000) and Gadget2 (Springel, 2005), LensTools brings together a cosmic shear simulation pipeline which, complemented with a variety of WL feature measurement tools and parameter sampling routines, provides easy access to the numerics for theoretical studies of WL as well as for experiment forecasts. Being implemented in Python (Rossum, 1995), LensTools takes full advantage of a range of state-of-the-art techniques developed by the large and growing open-source software community (Jones et al., 2001; McKinney, 2010; Astropy Collaboration, 2013; Pedregosa et al., 2011; Foreman-Mackey et al., 2013). We made the LensTools code available on the Python Package Index and published its documentation at http://lenstools.readthedocs.io.

  19. The Spectral Image Processing System (SIPS): Software for integrated analysis of AVIRIS data

    NASA Technical Reports Server (NTRS)

    Kruse, F. A.; Lefkoff, A. B.; Boardman, J. W.; Heidebrecht, K. B.; Shapiro, A. T.; Barloon, P. J.; Goetz, A. F. H.

    1992-01-01

    The Spectral Image Processing System (SIPS) is a software package developed by the Center for the Study of Earth from Space (CSES) at the University of Colorado, Boulder, in response to a perceived need to provide integrated tools for analysis of imaging spectrometer data both spectrally and spatially. SIPS was specifically designed to deal with data from the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) and the High Resolution Imaging Spectrometer (HIRIS), but was tested with other datasets including the Geophysical and Environmental Research Imaging Spectrometer (GERIS), GEOSCAN images, and Landsat TM. SIPS was developed using the 'Interactive Data Language' (IDL). It takes advantage of high speed disk access and fast processors running under the UNIX operating system to provide rapid analysis of entire imaging spectrometer datasets. SIPS allows analysis of single or multiple imaging spectrometer data segments at full spatial and spectral resolution. It also allows visualization and interactive analysis of image cubes derived from quantitative analysis procedures such as absorption band characterization and spectral unmixing. SIPS consists of three modules: SIPS Utilities, SIPS_View, and SIPS Analysis. SIPS version 1.1 is described below.

  20. G-DYN Multibody Dynamics Engine

    NASA Technical Reports Server (NTRS)

    Acikmese, Behcet; Blackmore, James C.; Broderick, Daniel

    2011-01-01

    G-DYN is a multi-body dynamic simulation software engine that automatically assembles and integrates equations of motion for arbitrarily connected multibody dynamic systems. The algorithm behind G-DYN is based on a primal-dual formulation of the dynamics that captures the position and velocity vectors (primal variables) of each body and the interaction forces (dual variables) between bodies, which are particularly useful for control and estimation analysis and synthesis. It also takes full advantage of the sparse matrix structure resulting from the system dynamics to numerically integrate the equations of motion efficiently. Furthermore, the dynamic model for each body can easily be replaced without re-deriving the overall equations of motion, and the assembly of the equations of motion is done automatically. G-DYN proved to be an essential software tool in the simulation of spacecraft systems used for small celestial body surface sampling, specifically in simulating touch-and-go (TAG) maneuvers of a robotic sampling system from a comet and asteroid. It is used extensively in validating mission concepts for small body sample return, such as the Comet Odyssey and Galahad New Frontiers proposals.
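
    The primal-dual idea can be seen in miniature on a single constrained body, a planar pendulum on a rigid rod; this sketch is ours, not G-DYN code, and the small KKT solve below becomes a block-sparse system when many bodies are connected:

```python
# Primal variables: body position x and velocity v. Dual variable: the
# constraint force lambda holding the body on the rod, recovered each step
# from a KKT solve. For many bodies this matrix is block sparse.
import numpy as np
from scipy.integrate import solve_ivp

M = np.eye(2)            # unit point mass in the plane
L, g = 1.0, 9.81         # rod length, gravitational acceleration

def rhs(t, y):
    x, v = y[:2], y[2:]
    G = 2.0 * x[None, :]                       # Jacobian of g(x) = x.x - L^2
    kkt = np.block([[M, -G.T], [G, np.zeros((1, 1))]])
    b = np.concatenate([[0.0, -g], [-2.0 * (v @ v)]])
    accel_lam = np.linalg.solve(kkt, b)        # primal accelerations + dual force
    return np.concatenate([v, accel_lam[:2]])

y0 = [L, 0.0, 0.0, 0.0]                        # start horizontal, at rest
sol = solve_ivp(rhs, (0.0, 5.0), y0, rtol=1e-8)
```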

  1. Cytoprophet: a Cytoscape plug-in for protein and domain interaction networks inference.

    PubMed

    Morcos, Faruck; Lamanna, Charles; Sikora, Marcin; Izaguirre, Jesús

    2008-10-01

    Cytoprophet is a software tool that allows prediction and visualization of protein and domain interaction networks. It is implemented as a plug-in of Cytoscape, an open source software framework for analysis and visualization of molecular networks. Cytoprophet implements three algorithms that predict new potential physical interactions using the domain composition of proteins and experimental assays. The algorithms for protein and domain interaction inference include maximum likelihood estimation (MLE) using expectation maximization (EM), the set-cover approach maximum specificity set cover (MSSC), and the sum-product algorithm (SPA). After accepting an input set of proteins with UniProt ID/accession numbers and a selected prediction algorithm, Cytoprophet draws a network of potential interactions with probability scores and GO distances as edge attributes. A network of domain interactions between the domains of the initial protein list can also be generated. Cytoprophet was designed to take advantage of the visual capabilities of Cytoscape and be simple to use. An example of inference in a signaling network of the myxobacterium Myxococcus xanthus is presented and available at Cytoprophet's website, http://cytoprophet.cse.nd.edu.

  2. Research to Support the Determination of Spacecraft Maximum Acceptable Concentrations of Potential Atmospheric Contaminants

    NASA Technical Reports Server (NTRS)

    Orr, John L.

    1997-01-01

    In many ways, the typical approach to the handling of bibliographic material for generating review articles and similar manuscripts has changed little since the use of xerographic reproduction has become widespread. The basic approach is to collect reprints of the relevant material and place it in folders or stacks based on its dominant content. As the amount of information available increases with the passage of time, the viability of this mechanical approach to bibliographic management decreases. The personal computer revolution has changed the way we deal with many familiar tasks. For example, word processing on personal computers has supplanted the typewriter for many applications. Similarly, spreadsheets have not only replaced many routine uses of calculators but have also made possible new applications because the cost of calculation is extremely low. Objective The objective of this research was to use personal computer bibliographic software technology to support the determination of spacecraft maximum acceptable concentration (SMAC) values. Specific Aims The specific aims were to produce draft SMAC documents for hydrogen sulfide and tetrachloroethylene taking maximum advantage of the bibliographic software.

  3. Digital evaluation of sitting posture comfort in human-vehicle system under Industry 4.0 framework

    NASA Astrophysics Data System (ADS)

    Tao, Qing; Kang, Jinsheng; Sun, Wenlei; Li, Zhaobo; Huo, Xiao

    2016-09-01

    Most previous studies on the vibration ride comfort of the human-vehicle system focused on only one or two aspects of the investigation. A hybrid approach is described which integrates investigation methods in both real and virtual environments. The real experimental environment includes the WBV (whole body vibration) test, questionnaires for human subjective sensation, and motion capture. The virtual experimental environment includes the theoretical calculation on a simplified 5-DOF human body vibration model, the vibration simulation and analysis within the ADAMS/Vibration(TM) module, and the digital human biomechanics and occupational health analysis in Jack software. While the real experimental environment provides realistic and accurate test results, it also serves as the core of, and validation for, the virtual experimental environment. The virtual experimental environment takes full advantage of currently available vibration simulation and digital human modelling software, and makes it possible to evaluate sitting posture comfort in a human-vehicle system with various human anthropometric parameters. How this digital evaluation system for car seat comfort design fits in the Industry 4.0 framework is also proposed.
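
    The abstract does not reproduce the simplified 5-DOF model, but lumped-parameter ride-comfort models of this kind share a generic matrix form (our notation, not necessarily the paper's):

```latex
% Generic lumped-parameter vibration model: M, C, K are the 5x5 mass, damping
% and stiffness matrices of the body segments, q collects segment
% displacements, F(t) is the seat excitation; T is a typical comfort metric.
\[
  \mathbf{M}\,\ddot{\mathbf{q}}(t) + \mathbf{C}\,\dot{\mathbf{q}}(t)
    + \mathbf{K}\,\mathbf{q}(t) = \mathbf{F}(t),
  \qquad
  T(\omega) = \left|\frac{\hat{q}_{\text{head}}(\omega)}{\hat{q}_{\text{seat}}(\omega)}\right|
\]
```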

  4. X-train: teaching professionals remotely.

    PubMed

    Santerre, Charles R

    2005-05-01

    Increased popularity of the Internet, along with the development of new software applications, has dramatically improved our ability to create and deliver online continuing education trainings to professionals in the areas of nutrition and food safety. In addition, these technological advances permit effective and affordable measurement of training outcomes, i.e., changes in knowledge, attitude, and behavior, that result from these educational efforts. Impact assessment of engagement programs is becoming increasingly important for demonstrating the value of training activities to stakeholders. A novel software program, called X-Train, takes advantage of technological advances (databases, computer graphics, Web-based interfaces, and network speed) for delivering high-quality trainings to teachers and health care professionals. X-Train automatically collects outcome data, generates and sends certificates of completion, and communicates with participants through electronic messages. X-Train can be used as a collaborative tool whereby experts from various academic institutions are brought together to develop Web-based trainings. Finally, X-Train uses a unique approach that encourages cooperative extension specialists and educators to promote these educational opportunities within their state or county.

  5. Integrating automated structured analysis and design with Ada programming support environments

    NASA Technical Reports Server (NTRS)

    Hecht, Alan; Simmons, Andy

    1986-01-01

    Ada Programming Support Environments (APSEs) include many powerful tools that address the implementation of Ada code, but these tools do not address the entire software development process. Structured analysis is a methodology that addresses the creation of complete and accurate system specifications. Structured design takes a specification, derives a plan to decompose the system into subcomponents, and provides heuristics to optimize the software design so as to minimize errors and maintenance; it can also guide the creation of usable modules. Studies have shown that most software errors result from poor system specifications, and that these errors become more expensive to fix as the development process continues. Structured analysis and design help to uncover errors in the early stages of development. The APSE tools help to ensure that the code produced is correct and aid in finding obscure coding errors; however, they cannot detect errors in specifications or poor designs. An automated system for structured analysis and design, TEAMWORK, which can be integrated with an APSE to support software systems development from specification through implementation, is described. These tools complement each other to help developers improve quality and productivity, as well as to reduce development and maintenance costs. Complete system documentation and reusable code also result from the use of these tools. Integrating an APSE with automated tools for structured analysis and design provides capabilities and advantages beyond those realized with either system used by itself.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Germain, Shawn

    Nuclear Power Plant (NPP) refueling outages create some of the most challenging activities the utilities face in both tracking and coordinating thousands of activities in a short period of time. Other challenges, including nuclear safety concerns arising from atypical system configurations and resource allocation issues, can create delays and schedule overruns, driving up outage costs. Today the majority of the outage communication is done using processes that do not take advantage of advances in modern technologies that enable enhanced communication, collaboration and information sharing. Some of the common practices include: runners that deliver paper-based requests for approval, radios, telephones, desktop computers, daily schedule printouts, and static whiteboards that are used to display information. Many gains have been made to reduce the challenges facing outage coordinators; however, new opportunities can be realized by utilizing modern technological advancements in communication and information tools that can enhance the collective situational awareness of plant personnel, leading to improved decision-making. Ongoing research as part of the Light Water Reactor Sustainability Program (LWRS) has been targeting NPP outage improvement. As part of this research, various applications of collaborative software have been demonstrated through pilot project utility partnerships. Collaboration software can be utilized as part of the larger concept of Computer-Supported Cooperative Work (CSCW). Collaborative software can be used for emergent issue resolution, Outage Control Center (OCC) displays, and schedule monitoring. Use of collaboration software enables outage staff and subject matter experts (SMEs) to view and update critical outage information from any location on site or off.

  7. UAF: a generic OPC unified architecture framework

    NASA Astrophysics Data System (ADS)

    Pessemier, Wim; Deconinck, Geert; Raskin, Gert; Saey, Philippe; Van Winckel, Hans

    2012-09-01

    As an emerging Service Oriented Architecture (SOA) specifically designed for industrial automation and process control, the OPC Unified Architecture specification should be regarded as an attractive candidate for controlling scientific instrumentation. Even though an industry-backed standard such as OPC UA can offer substantial added value to these projects, its inherent complexity poses an important obstacle for adopting the technology. Building OPC UA applications requires considerable effort, even when taking advantage of a COTS Software Development Kit (SDK). The OPC Unified Architecture Framework (UAF) attempts to reduce this burden by introducing an abstraction layer between the SDK and the application code in order to achieve a better separation of the technical and the functional concerns. True to its industrial origin, the primary requirement of the framework is to maintain interoperability by staying close to the standard specifications, and by expecting the minimum compliance from other OPC UA servers and clients. UAF can therefore be regarded as a software framework to quickly and comfortably develop and deploy OPC UA-based applications, while remaining compatible with third party OPC UA-compliant toolkits, servers (such as PLCs) and clients (such as SCADA software). In the first phase, as covered by this paper, only the client-side of UAF has been tackled in order to transparently handle discovery, session management, subscriptions, monitored items etc. We describe the design principles and internal architecture of our open-source software project, the first results of the framework running at the Mercator Telescope, and we give a preview of the planned server-side implementation.
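
    To see the kind of plumbing UAF abstracts away, here is a bare synchronous read using the community python-opcua package (the endpoint URL and node identifier are invented for illustration); UAF's value lies in hiding discovery, session, and subscription management behind a uniform layer:

```python
# Raw client-side OPC UA without a framework: session management by hand.
from opcua import Client

client = Client("opc.tcp://mercator.example.org:4840")  # invented endpoint
try:
    client.connect()                                    # explicit session setup
    node = client.get_node("ns=2;s=Telescope.Dome.Temperature")
    print("dome temperature:", node.get_value())        # one synchronous read
finally:
    client.disconnect()
```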

  8. Department-Generated Microcomputer Software.

    ERIC Educational Resources Information Center

    Mantei, Erwin J.

    1986-01-01

    Explains how self-produced software can be used to perform rapid number analysis or number-crunching duties in geology classes. Reviews programs in mineralogy and petrology and identifies areas in geology where computers can be used effectively. Discusses the advantages and benefits of integrating department-generated software into a geology…

  9. Open Source Molecular Modeling

    PubMed Central

    Pirhadi, Somayeh; Sunseri, Jocelyn; Koes, David Ryan

    2016-01-01

    The success of molecular modeling and computational chemistry efforts is, by definition, dependent on quality software applications. Open source software development provides many advantages to users of modeling applications, not the least of which is that the software is free and completely extendable. In this review we categorize, enumerate, and describe available open source software packages for molecular modeling and computational chemistry. PMID:27631126

  10. Effects of Contoured Pallets on AMC Mission Efficiency

    DTIC Science & Technology

    2011-06-01

    carrier moves it on a B-747-100 as if it was a B-747-400, all while not planning to take advantage of the additional cargo capacity of the newer...nature of the cargo being moved and determine if opportunities existed to take advantage of increased MD-11 airlift. Each model had different...types of cargo can help efficiency by planning contoured requirements to take advantage of the fact that while moving these pallets underutilizes

  11. Methods and Software for Building Bibliographic Data Bases.

    ERIC Educational Resources Information Center

    Daehn, Ralph M.

    1985-01-01

    This in-depth look at database management systems (DBMS) for microcomputers covers data entry, information retrieval, security, DBMS software and design, and downloading of literature search results. The advantages of in-house systems versus online search vendors are discussed, and specifications of three software packages and 14 sources are…

  12. A Web-Based Learning System for Software Test Professionals

    ERIC Educational Resources Information Center

    Wang, Minhong; Jia, Haiyang; Sugumaran, V.; Ran, Weijia; Liao, Jian

    2011-01-01

    Fierce competition, globalization, and technology innovation have forced software companies to search for new ways to improve competitive advantage. Web-based learning is increasingly being used by software companies as an emergent approach for enhancing the skills of knowledge workers. However, the current practice of Web-based learning is…

  13. Improving mixing efficiency of a polymer micromixer by use of a plastic shim divider

    NASA Astrophysics Data System (ADS)

    Li, Lei; Lee, L. James; Castro, Jose M.; Yi, Allen Y.

    2010-03-01

    In this paper, a critical modification to a polymer based affordable split-and-recombination static micromixer is described. To evaluate the improvement, both the original and the modified design were carefully investigated using an experimental setup and a numerical modeling approach. The structure of the micromixer was designed to take advantage of the process capabilities of both ultraprecision micromachining and the microinjection molding process. Specifically, the original and the modified design were numerically simulated using the commercial finite element method software ANSYS CFX to assist the re-design of the micromixers. The simulation results have shown that both designs are capable of performing mixing, while the modified design has a much improved performance. Mixing experiments with two different fluids, carried out using the original and the modified mixers, again showed a significantly improved mixing uniformity for the latter. The measured mixing coefficient for the original design was 0.11, and for the improved design it was 0.065. The developed manufacturing process based on ultraprecision machining and microinjection molding processes for device fabrication has the advantages of high dimensional precision, low cost and manufacturing flexibility.
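
    The abstract does not define its mixing coefficient; one common choice, consistent with lower values meaning better mixing here, is the coefficient of variation of the concentration field sampled at N points (our notation; the authors' exact definition may differ):

```latex
% Coefficient-of-variation mixing index: c_i are sampled concentrations,
% c-bar their mean; M = 0 corresponds to perfect mixing.
\[
  M \;=\; \frac{1}{\bar{c}}
  \sqrt{\frac{1}{N}\sum_{i=1}^{N}\bigl(c_i - \bar{c}\bigr)^{2}},
  \qquad
  \bar{c} \;=\; \frac{1}{N}\sum_{i=1}^{N} c_i
\]
```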

  14. European Regional Climate Zone Modeling of a Commercial Absorption Heat Pump Hot Water Heater

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharma, Vishaldeep; Shen, Bo; Keinath, Chris

    2017-01-01

    High-efficiency gas-burning hot water heating takes advantage of a condensing heat exchanger to deliver improved combustion efficiency over a standard non-condensing configuration; even so, the heat delivered to the water is always less than the heating value of the gas burned. In contrast, Gas Absorption Heat Pump (GAHP) hot water heating combines the efficiency of gas burning with the performance increase from a heat pump to offer significant gas energy savings. An ammonia-water system also has the advantage of zero Ozone Depletion Potential and low Global Warming Potential. In comparison with air-source electric heat pumps, the absorption system can maintain higher coefficients of performance in colder climates. In this work, a GAHP commercial water heating system was compared to a condensing gas storage system for a range of locations and climate zones across Europe. The thermodynamic performance map of a single-effect ammonia-water absorption system was used in building energy modeling software that could also incorporate the changing ambient air temperature and water mains temperature for a specific location, as well as a full-service restaurant water draw pattern.

  15. Knowledge-based engineering of a PLC controlled telescope

    NASA Astrophysics Data System (ADS)

    Pessemier, Wim; Raskin, Gert; Saey, Philippe; Van Winckel, Hans; Deconinck, Geert

    2016-08-01

    As the new control system of the Mercator Telescope is being finalized, we can review some technologies and design methodologies that are advantageous, despite their relative uncommonness in astronomical instrumentation. Particular to the Mercator Telescope is that it is controlled by a single high-end soft-PLC (Programmable Logic Controller). Using off-the-shelf components only, our distributed embedded system controls all subsystems of the telescope such as the pneumatic primary mirror support, the hydrostatic bearing, the telescope axes, the dome, the safety system, and so on. We show how real-time application logic can be written conveniently in typical PLC languages (IEC 61131-3) and in C++ (to implement the pointing kernel) using the commercial TwinCAT 3 programming environment. This software processes the inputs and outputs of the distributed system in real-time via an observatory-wide EtherCAT network, which is synchronized with high precision to an IEEE 1588 (PTP, Precision Time Protocol) time reference clock. Taking full advantage of the ability of soft-PLCs to run both real-time and non-real-time software, the same device also hosts the most important user interfaces (HMIs or Human Machine Interfaces) and communication servers (OPC UA for process data, FTP for XML configuration data, and VNC for remote control). To manage the complexity of the system and to streamline the development process, we show how most of the software, electronics and systems engineering aspects of the control system have been modeled as a set of scripts written in a Domain Specific Language (DSL). When executed, these scripts populate a Knowledge Base (KB) which can be queried to retrieve specific information. By feeding the results of those queries to a template system, we were able to generate very detailed "browsable" web-based documentation about the system, but also PLC software code, Python client code, model verification reports, etc. The aim of this paper is to demonstrate the added value that technologies such as soft-PLCs and DSL scripts and design methodologies such as knowledge-based engineering can bring to astronomical instrumentation.
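
    The DSL-to-knowledge-base-to-template pipeline can be sketched in a few lines (a toy rendition with our own names, not the Mercator project's code):

```python
# DSL scripts populate a knowledge base; queries pull facts back out; a
# "template" turns query results into generated artifacts (docs, code, ...).
KB = []   # the knowledge base: a flat list of fact dictionaries

def actuator(name, bus, address):
    """One DSL statement: declare an actuator on a fieldbus."""
    KB.append({"kind": "actuator", "name": name, "bus": bus, "address": address})

# The "DSL script" is ordinary code that states facts about the system.
actuator("primary_mirror_support", bus="EtherCAT", address=0x1001)
actuator("dome_shutter", bus="EtherCAT", address=0x1002)

def query(**criteria):
    """Retrieve every fact matching all given key/value criteria."""
    return [f for f in KB if all(f.get(k) == v for k, v in criteria.items())]

# A minimal template: generate browsable documentation stubs from the KB.
for fact in query(kind="actuator", bus="EtherCAT"):
    print(f"{fact['name']}: EtherCAT address {fact['address']:#06x}")
```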

  16. Using articulation and inscription as catalysts for reflection: Design principles for reflective inquiry

    NASA Astrophysics Data System (ADS)

    Loh, Ben Tun-Bin

    2003-07-01

    The demand for students to engage in complex student-driven and information-rich inquiry investigations poses challenges to existing learning environments. Students are not familiar with this style of work, and lack the skills, tools, and expectations it demands, often forging blindly forward in the investigation. If students are to be successful, they need to learn to be reflective inquirers, periodically stepping back from an investigation to evaluate their work. The fundamental goal of my dissertation is to understand how to design learning environments to promote and support reflective inquiry. I have three basic research questions: how to define this mode of work, how to help students learn it, and understanding how it facilitates reflection when enacted in a classroom. I take an exploratory approach in which, through iterative cycles of design, development, and reflection, I develop principles of design for reflective inquiry, instantiate those principles in the design of a software environment, and test that software in the context of classroom work. My work contributes to the understanding of reflective inquiry in three ways: First, I define a task model that describes the kinds of operations (cognitive tasks) that students should engage in as reflective inquirers. These operations are defined in terms of two basic tasks: articulation and inscription, which serve as catalysts for externalizing student thinking as objects of and triggers for reflection. Second, I instantiate the task model in the design of software tools (the Progress Portfolio). And, through proof of concept pilot studies, I examine how the task model and tools helped students with their investigative classroom work. Finally, I take a step back from these implementations and articulate general design principles for reflective inquiry with the goal of informing the design of other reflective inquiry learning environments. There are three design principles: (1) Provide a designated work space for reflection activities to focus student attention on reflection. (2) Help students create and use artifacts that represent their work and their thinking as a means to create referents for reflection. (3) Support and take advantage of social processes that help students reflect on their own work.

  17. Supporting the Loewenstein occupational therapy cognitive assessment using distributed user interfaces.

    PubMed

    Tesoriero, Ricardo; Gallud Lazaro, Jose A; Altalhi, Abdulrahman H

    2017-02-01

    Improve the quantity and quality of information obtained from traditional Loewenstein Occupational Therapy Cognitive Assessment Battery systems to monitor the evolution of patients' rehabilitation process as well as to compare different rehabilitation therapies. The system replaces traditional artefacts with virtual versions of them to take advantage of cutting-edge interaction technology. The system is defined as a Distributed User Interface (DUI) supported by a display ecosystem, including mobile devices as well as multi-touch surfaces. Due to the heterogeneity of the devices involved in the system, the software technology is based on a client-server architecture using the Web as the software platform. The system provides therapists with information that is not available (or is very difficult to gather) using traditional technologies (i.e. response time measurements, object tracking, information storage and retrieval facilities, etc.). The use of DUIs allows therapists to gather information that is unavailable using traditional assessment methods as well as adapt the system to patients' profile to increase the range of patients that are able to take this assessment. Implications for Rehabilitation: Using a Distributed User Interface environment to carry out LOTCAs improves the quality of the information gathered during the rehabilitation assessment. This system captures physical data regarding the patient's interaction during the assessment to improve the rehabilitation process analysis. It allows professionals to adapt the assessment procedure to create different versions according to patients' profile. It improves the availability of patients' profile information to therapists to adapt the assessment procedure.

  18. Prospectus 2000

    NASA Astrophysics Data System (ADS)

    Holmes, Jon L.; Gettys, Nancy S.

    2000-01-01

    We begin 2000 with a message about our plans for JCE Software and what you will be seeing in this column as the year progresses.

    Floppy Disk --> CD-ROM

    Most software today is distributed on CD-ROM or by downloading from the Internet. Several new computers no longer include a floppy disk drive as "standard equipment". Today's software no longer fits on one or two floppies (the installation software alone can require two disks), and the cost of reproducing and distributing several disks is prohibitive. In short, distribution of software on floppy disks is no longer practical. Therefore, JCE Software will distribute all new software publications on CD-ROM rather than on disks.

    Regular Issues --> Collections

    Distribution of all our software on CD-ROM allows us to extend our concept of software collections that we started with the General Chemistry Collection. Such collections will contain all the previously published software that is still "in print" (i.e., is compatible with current operating systems and hardware) and any new programs that fall under the topic of the collection. Proposed topics in addition to General Chemistry currently include Advanced Chemistry, Instrument and Laboratory Simulations, and Spectroscopy. Eventually, all regular issues will be replaced by these collections, which will be updated annually or semiannually with new programs and updates to existing programs. Abstracts for all new programs will continue to appear in this column when a collection or its update is ready for publication. We will continue to offer special issues of single larger programs (e.g., Periodic Table Live!, Chemistry Comes Alive! volumes) on CD-ROM and video on videotape.

    Connect with Your Students outside Class

    JCE Software has always offered network licenses to allow instructors to make our software available to students in computer labs, but that model no longer fits the way many instructors and students work with computers. Many students (or their families) own a personal computer, allowing them much more flexibility than a campus computer lab. Many instructors utilize the World Wide Web, creating HTML pages for students to use. JCE Software has options available to take advantage of both of these developments.

    Software Adoption

    To provide students who own computers access to JCE Software programs, consider adopting one or more of our CD-ROMs as you would a textbook. The General Chemistry Collection has been adopted by several general chemistry courses. We can arrange to bundle CDs with laboratory manuals or to sell them separately to students through the campus bookstore. The cost per CD can be quite low (as little as $5) when large numbers are ordered, making this a cost-effective method of allowing students access to the software they need whenever and wherever they desire.

    Web-Ready Publications

    Several JCE Software programs use HTML to present the material. Viewed with the ubiquitous Internet browser, HTML is compatible with both Mac OS and Windows (as well as most other current operating systems) and provides a flexible hypermedia interface that is familiar to an increasing number of instructors and students. HTML-based publications are also ready for use on local intranets, with appropriate licensing, and can be readily incorporated into other HTML-based materials. Already published in this format are: Chemistry Comes Alive!, Volumes 1 and 2 (Special Issues 18 and 21), Flying over Atoms (Special Issue 19), and Periodic Table Live! Second Edition (Special Issue 17). Solid State Resources Second Edition (Special Issue 12) and Chemistry Comes Alive!, Volume 3 (Special Issue 23) will be available soon. Other submissions being developed in HTML format include ChemPages Laboratory and Multimedia General Chemistry Problems. Contact the JCE Software office to learn about licensing alternatives that take advantage of the World Wide Web. [Figure: Periodic Table Live! 2nd ed. is one of JCE Software's "Web-ready" publications.]

    Publication Plans for 2000

    We have several exciting new issues planned for publication in the coming year.

    Chemistry Comes Alive!

    The Chemistry Comes Alive! (CCA!) series continues with additional CD-ROMs for Mac OS and Windows. Each volume in this series contains video and animations of chemical reactions that can be easily incorporated into your own computer-based presentations. Our digital video now uses state-of-the-art compression that yields higher quality video with smaller file sizes and data rates more suited for WWW delivery. Video for Periodic Table Live! 2nd edition, Chemistry Comes Alive! Volume 3, ChemPages Laboratory, and Multimedia General Chemistry Problems uses this new format. We will be releasing updates of CCA! Volumes 1 and 2 to take advantage of this new technology. We are very pleased with the results and think you will be also. [Figure: The reaction of aluminum with chlorine is included in Chemistry Comes Alive! Volume 3.]

    ChemPages Laboratory

    ChemPages Laboratory, developed by the New Traditions Curriculum Project at the University of Wisconsin-Madison, is an HTML-based CD-ROM for Mac OS and Windows that contains lessons and tutorials to prepare introductory chemistry students to work in the laboratory. It includes text, photographs, computer graphics, animations, digital video, and voice narration to introduce students to laboratory equipment and procedures. [Figure: ChemPages Laboratory teaches introductory chemistry students about laboratory instruments, equipment, and procedures.]

    Versatile Video

    Video demonstrating the "drinking bird" is included in the Chemistry Comes Alive! video collection. Video from this collection can be incorporated into many other projects. As an example, David Whisnant has used the drinking bird in his Multimedia General Chemistry Problems, where students view the video and are asked to explain why the bird bobs up and down. JCE Software anticipates publication of Multimedia General Chemistry Problems on CD-ROM for Mac OS and Windows in 2000. It will be "Web-ready".

    General Chemistry Collection, 4th Edition

    The General Chemistry Collection will be revised early in the summer and CDs will be shipped in time for fall adoptions. The 4th edition will include JCE Software publications for general chemistry published in 1999, as well as any programs for general chemistry accepted in 2000.

    Regular Issues

    We have had many recent submissions and submissions of work in progress. In 2000 we will work with the authors and our peer-reviewers to complete and publish these submissions individually or as part of a software collection on CD-ROM.

    An Invitation

    In collaboration with JCE Online we plan to make available in 2000 more support files for JCE Software. These will include not only troubleshooting tips and technical support notes, but also supporting information submitted by users such as lessons, specific assignments, and activities using JCE Software publications. All JCE Software users are invited to contribute to this area. Get in touch with JCE Software and let us know how you are using our materials so that we can share your ideas with others! Although the word software is in our name, many of our publications are not traditional software. We also publish video on videotape, videodisc, and CD-ROM, and electronic documents (Mathcad and Mathematica files, spreadsheet files and macros, HTML documents, and PowerPoint presentations). Most chemistry instructors who use a computer in their teaching have created or considered creating one or more of these for their classes. If you have an original computer presentation, electronic document, animation, video, or any other item that is not printed text, it is probably an appropriate submission for JCE Software. By publishing your work in any branch of the Journal of Chemical Education, you will share your efforts with chemistry instructors and students all over the world and get professional recognition for your achievements. All JCE Software publications are Y2K compliant.

  19. Embracing Open Software Development in Solar Physics

    NASA Astrophysics Data System (ADS)

    Hughitt, V. K.; Ireland, J.; Christe, S.; Mueller, D.

    2012-12-01

    We discuss two ongoing software projects in solar physics that have adopted best practices of the open source software community. The first, the Helioviewer Project, is a powerful data visualization tool which includes online and Java interfaces inspired by Google Maps (tm). This effort allows users to find solar features and events of interest, and download the corresponding data. Having found data of interest, the user now has to analyze it. The dominant solar data analysis platform is an open-source library called SolarSoft (SSW). Although SSW itself is open-source, the programming language used is IDL, a proprietary language with licensing costs that are prohibitive for many institutions and individuals. SSW is composed of a collection of related scripts written by missions and individuals for solar data processing and analysis, without any consistent data structures or common interfaces. Further, at the time when SSW was initially developed, many of the best software development processes of today (mirrored and distributed version control, unit testing, continuous integration, etc.) were not standard, and have not since been adopted. The challenges inherent in developing SolarSoft led to a second software project known as SunPy. SunPy is an open-source Python-based library which seeks to create a unified solar data analysis environment including a number of core datatypes such as Maps, Lightcurves, and Spectra which have consistent interfaces and behaviors. By taking advantage of the large and sophisticated body of scientific software already available in Python (e.g. SciPy, NumPy, Matplotlib), and by adopting many of the best practices refined in open-source software development, SunPy has been able to develop at a very rapid pace while still ensuring a high level of reliability. The Helioviewer Project and SunPy represent two pioneering technologies in solar physics - simple yet flexible data visualization and a powerful, new data analysis environment. We discuss the development of both these efforts and how they are beginning to influence the solar physics community.
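
    A short taste of the unified datatypes SunPy introduces (this assumes a SunPy installation with its optional sample data; the attributes shown reflect the public Map interface but may vary across versions):

```python
# One consistent Map datatype regardless of which instrument produced the data.
import sunpy.map
import sunpy.data.sample

aia = sunpy.map.Map(sunpy.data.sample.AIA_171_IMAGE)   # load a sample AIA image
print(aia.date, aia.wavelength)                        # uniform metadata access
aia.peek()                                             # quicklook plot
```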

  20. Taking Advantage of Student Engagement Results in Student Affairs

    ERIC Educational Resources Information Center

    Kinzie, Jillian; Hurtado, Sarah S.

    2017-01-01

    This chapter urges student affairs professionals committed to enhancing student success through data-informed decision making to take full advantage of opportunities to apply and use student engagement results.

  1. The UNIX Operating System: A Model for Software Design.

    ERIC Educational Resources Information Center

    Kernighan, Brian W.; Morgan, Samuel P.

    1982-01-01

    Describes UNIX time-sharing operating system, including the program environment, software development tools, flexibility and ease of change, portability and other advantages, and five applications and three nonapplications of the system. (JN)

  2. Professional Ethics of Software Engineers: An Ethical Framework.

    PubMed

    Lurie, Yotam; Mark, Shlomo

    2016-04-01

    The purpose of this article is to propose an ethical framework for software engineers that connects software developers' ethical responsibilities directly to their professional standards. The implementation of such an ethical framework can overcome the traditional dichotomy between professional skills and ethical skills, which plagues the engineering professions, by proposing an approach to the fundamental tasks of the practitioner, i.e., software development, in which the professional standards are intrinsically connected to the ethical responsibilities. In so doing, the ethical framework improves the practitioner's professionalism and ethics. We call this approach Ethical-Driven Software Development (EDSD). EDSD manifests the advantages of an ethical framework as an alternative to the all too familiar approach in professional ethics that advocates "stand-alone codes of ethics". We believe that one outcome of this synergy between professional and ethical skills is simply better engineers. Moreover, since there are often different software solutions which the engineer can provide to an issue at stake, the ethical framework provides a guiding principle, within the process of software development, that helps the engineer evaluate the advantages and disadvantages of the different software solutions. It does not and cannot affect the end-product in and of itself. However, it can and should make the software engineer more conscious and aware of the ethical ramifications of certain engineering decisions within the process.

  3. Building Your Own Web Course: The Case for Off-the-Shelf Component Software.

    ERIC Educational Resources Information Center

    Kaplan, Howard

    1998-01-01

    Compares the features, advantages, and disadvantages of two major software options available for designing web courses: (1) component, off-the shelf software that allows for creation of audio slide lectures, course materials, discussion forums, animations, synchronous chat groups, quiz creators, and electronic mail, and (2) integrated packages…

  4. SBSI: an extensible distributed software infrastructure for parameter estimation in systems biology

    PubMed Central

    Adams, Richard; Clark, Allan; Yamaguchi, Azusa; Hanlon, Neil; Tsorman, Nikos; Ali, Shakir; Lebedeva, Galina; Goltsov, Alexey; Sorokin, Anatoly; Akman, Ozgur E.; Troein, Carl; Millar, Andrew J.; Goryanin, Igor; Gilmore, Stephen

    2013-01-01

    Summary: Complex computational experiments in Systems Biology, such as fitting model parameters to experimental data, can be challenging to perform. Not only do they frequently require a high level of computational power, but the software needed to run the experiment needs to be usable by scientists with varying levels of computational expertise, and modellers need to be able to obtain up-to-date experimental data resources easily. We have developed a software suite, the Systems Biology Software Infrastructure (SBSI), to facilitate the parameter-fitting process. SBSI is a modular software suite composed of three major components: SBSINumerics, a high-performance library containing parallelized algorithms for performing parameter fitting; SBSIDispatcher, a middleware application to track experiments and submit jobs to back-end servers; and SBSIVisual, an extensible client application used to configure optimization experiments and view results. Furthermore, we have created a plugin infrastructure to enable project-specific modules to be easily installed. Plugin developers can take advantage of the existing user-interface and application framework to customize SBSI for their own uses, facilitated by SBSI’s use of standard data formats. Availability and implementation: All SBSI binaries and source-code are freely available from http://sourceforge.net/projects/sbsi under an Apache 2 open-source license. The server-side SBSINumerics runs on any Unix-based operating system; both SBSIVisual and SBSIDispatcher are written in Java and are platform independent, allowing use on Windows, Linux and Mac OS X. The SBSI project website at http://www.sbsi.ed.ac.uk provides documentation and tutorials. Contact: stg@inf.ed.ac.uk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:23329415
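
    In miniature, the parameter-fitting task that SBSI distributes looks like the following single-machine sketch, with SciPy standing in for SBSINumerics (the model and data are synthetic):

```python
# Recover two parameters of a toy exponential-decay model from noisy data.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 50)
observed = 2.0 * np.exp(-0.5 * t) + rng.normal(scale=0.05, size=t.size)

def residuals(params):
    amplitude, rate = params
    return amplitude * np.exp(-rate * t) - observed

fit = least_squares(residuals, x0=[1.0, 1.0])
print("fitted amplitude and rate:", fit.x)   # close to (2.0, 0.5)
```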

  5. 47 CFR 2.944 - Software defined radios.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 1 2013-10-01 2013-10-01 false Software defined radios. 2.944 Section 2.944... Authorization § 2.944 Software defined radios. (a) Manufacturers must take steps to ensure that only software that has been approved with a software defined radio can be loaded into the radio. The software must...

  6. 47 CFR 2.944 - Software defined radios.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Software defined radios. 2.944 Section 2.944... Authorization § 2.944 Software defined radios. (a) Manufacturers must take steps to ensure that only software that has been approved with a software defined radio can be loaded into the radio. The software must...

  7. 47 CFR 2.944 - Software defined radios.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 1 2012-10-01 2012-10-01 false Software defined radios. 2.944 Section 2.944... Authorization § 2.944 Software defined radios. (a) Manufacturers must take steps to ensure that only software that has been approved with a software defined radio can be loaded into the radio. The software must...

  8. 47 CFR 2.944 - Software defined radios.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 1 2011-10-01 2011-10-01 false Software defined radios. 2.944 Section 2.944... Authorization § 2.944 Software defined radios. (a) Manufacturers must take steps to ensure that only software that has been approved with a software defined radio can be loaded into the radio. The software must...

  9. 47 CFR 2.944 - Software defined radios.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 1 2014-10-01 2014-10-01 false Software defined radios. 2.944 Section 2.944... Authorization § 2.944 Software defined radios. (a) Manufacturers must take steps to ensure that only software that has been approved with a software defined radio can be loaded into the radio. The software must...

  10. TCGA2BED: extracting, extending, integrating, and querying The Cancer Genome Atlas.

    PubMed

    Cumbo, Fabio; Fiscon, Giulia; Ceri, Stefano; Masseroli, Marco; Weitschek, Emanuel

    2017-01-03

    Data extraction and integration methods are becoming essential to effectively access and take advantage of the huge amounts of heterogeneous genomics and clinical data increasingly available. In this work, we focus on The Cancer Genome Atlas (TCGA), a comprehensive archive of tumoral data containing the results of high-throughput experiments, mainly Next Generation Sequencing, for more than 30 cancer types. We propose TCGA2BED, a software tool to search and retrieve TCGA data and convert them to the structured BED format for their seamless use and integration. Additionally, it supports conversion to the CSV, GTF, JSON, and XML standard formats. Furthermore, TCGA2BED extends TCGA data with information extracted from other genomic databases (i.e., NCBI Entrez Gene, HGNC, UCSC, and miRBase). We also provide and maintain an automatically updated data repository with publicly available Copy Number Variation, DNA-methylation, DNA-seq, miRNA-seq, and RNA-seq (V1, V2) experimental data of TCGA converted into the BED format, together with their associated clinical and biospecimen metadata in attribute-value text format. The availability of the valuable TCGA data in BED format reduces the time needed to take advantage of them: it is possible to deal with huge amounts of cancer genomic data efficiently, effectively, and integratively, and to search, retrieve, and extend them with additional information. The BED format helps investigators carry out knowledge discovery analyses on all tumor types in TCGA, with the final aim of understanding pathological mechanisms and aiding cancer treatments.
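
    Part of why conversion to BED aids integration is that BED is a minimal tab-separated interval format; a sketch of the core idea (coordinates below are illustrative, not taken from TCGA2BED output):

```python
# BED: one genomic interval per line, tab-separated fields.
records = [
    # chrom, start (0-based), end (exclusive), name, score, strand
    ("chr17", 7668401, 7687549, "TP53", 0, "-"),
    ("chr13", 32315507, 32400268, "BRCA2", 0, "+"),
]
with open("example.bed", "w") as bed:
    for rec in records:
        bed.write("\t".join(map(str, rec)) + "\n")
```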

  11. Soft robotics: a review and progress towards faster and higher torque actuators (presentation video)

    NASA Astrophysics Data System (ADS)

    Shepherd, Robert

    2014-03-01

    Last year, nearly 160,000 industrial robots were shipped worldwide, into a total market valued at $26 Bn (including hardware, software, and peripherals).[1] Service robots for professional (e.g., defense, medical, agriculture) and personal (e.g., household, handicap assistance, toys, and education) use accounted for 16,000 units ($3.4 Bn) and 3,000,000 units ($1.2 Bn), respectively.[1] The vast majority of these robotic systems use fully actuated, rigid components that take little advantage of passive dynamics. Soft robotics is a field that is taking advantage of compliant actuators and passive dynamics to achieve several goals: reduced design, manufacturing and control complexity, improved energy efficiency, more sophisticated motions, and safe human-machine interactions, to name a few. The potential for societal impact is immense. In some instances, soft actuators have achieved commercial success; however, large scale adoption will require improved methods of controlling non-linear systems, greater reliability in their function, and increased utility from faster and more forceful actuation. In my talk, I will describe efforts from my work in the Whitesides group at Harvard to demonstrate sophisticated motions in these machines using simple controls, as well as capabilities unique to soft machines. I will also describe the potential for combinations of different classes of soft actuators (e.g., electrically and pneumatically actuated systems) to improve the utility of soft robots. 1. World Robotics - Industrial Robots 2013, 2013, International Federation of Robotics.

  12. Expert system technologies for Space Shuttle decision support: Two case studies

    NASA Technical Reports Server (NTRS)

    Ortiz, Christopher J.; Hasan, David A.

    1994-01-01

    This paper addresses the issue of integrating the C Language Integrated Production System (CLIPS) into distributed data acquisition environments. In particular, it presents preliminary results of some ongoing software development projects aimed at exploiting CLIPS technology in the new mission control center (MCC) being built at NASA Johnson Space Center. One interesting aspect of the control center is its distributed architecture; it consists of networked workstations which acquire and share data through the NASA/JSC-developed information sharing protocol (ISP). This paper outlines some approaches taken to integrate CLIPS and ISP in order to permit the development of intelligent data analysis applications which can be used in the MCC. Three approaches to CLIPS/ISP integration are discussed. The initial approach involves clearly separating CLIPS from ISP using user-defined functions for gathering and sending data to and from a local storage buffer. Memory and performance drawbacks of this design are summarized. The second approach involves taking full advantage of CLIPS and the CLIPS Object-Oriented Language (COOL) by using objects to directly transmit data and state changes from ISP to COOL. Changes within the object slots eliminate the need for both a local data structure and external function calls, thus taking advantage of the object matching capabilities within CLIPS 6.0. The final approach is to treat CLIPS and ISP as peer toolkits. Neither is embedded in the other; rather, the application interweaves calls to each directly in the application source code.

  13. Magnetic resonance angiography: current status and future directions

    PubMed Central

    2011-01-01

    With recent improvements in hardware and software techniques, magnetic resonance angiography (MRA) has undergone significant changes in technique and approach. The advent of 3.0 T magnets has allowed reduction in exogenous contrast dose without compromising overall image quality. The use of novel intravascular contrast agents substantially extends the imaging window and decreases contrast dose. Additionally, the lower risk and cost of non-contrast-enhanced (NCE) MRA has sparked renewed interest in these methods. This article discusses the current state of both contrast-enhanced (CE) and NCE-MRA. New CE-MRA methods take advantage of dose reduction at 3.0 T, novel contrast agents, and parallel imaging methods. The risks of gadolinium-based contrast media, and the NCE-MRA methods of time-of-flight, steady-state free precession, and phase contrast are discussed. PMID:21388544

  14. Expert system development methodology and the transition from prototyping to operations: FIESTA, a case study

    NASA Technical Reports Server (NTRS)

    Happell, Nadine; Miksell, Steve; Carlisle, Candace

    1989-01-01

    A major barrier in taking expert systems from prototype to operational status involves instilling end user confidence in the operational system. Different software life cycle models are examined, and the advantages and disadvantages of each when applied to expert system development are explored. The Fault Isolation Expert System for Tracking and data relay satellite system Applications (FIESTA) is presented as a case study of the development of an expert system. The end user confidence necessary for operational use of this system is accentuated by the fact that it will handle real-time data in a secure environment, allowing little tolerance for errors. How FIESTA is dealing with transition problems as it moves from an off-line standalone prototype to an on-line real-time system is discussed.

  15. VEDA: a web-based virtual environment for dynamic atomic force microscopy.

    PubMed

    Melcher, John; Hu, Shuiqing; Raman, Arvind

    2008-06-01

    We describe here the theory and applications of virtual environment dynamic atomic force microscopy (VEDA), a suite of state-of-the-art simulation tools deployed on nanoHUB (www.nanohub.org) for the accurate simulation of tip motion in dynamic atomic force microscopy (dAFM) over organic and inorganic samples. VEDA takes advantage of nanoHUB's cyberinfrastructure to run high-fidelity dAFM tip dynamics computations on local clusters and the TeraGrid. Consequently, these tools are freely accessible and the dAFM simulations are run using standard web-based browsers without requiring additional software. A wide range of issues in dAFM, ranging from optimal probe choice, probe stability, tip-sample interaction forces, and power dissipation to material property extraction and scanning dynamics over heterogeneous samples, can be addressed.
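
    As background (our summary, not VEDA's documentation), the workhorse model behind dAFM simulators of this kind is a driven, damped point-mass cantilever subject to a nonlinear tip-sample force:

```latex
% Point-mass cantilever model commonly used in dAFM simulation (our notation):
% q is tip deflection, k stiffness, Q quality factor, omega_0 resonance frequency,
% F_ts the nonlinear tip-sample interaction force.
\[
  \frac{\ddot{q}(t)}{\omega_0^{2}} + \frac{\dot{q}(t)}{Q\,\omega_0} + q(t)
  \;=\; \frac{F_{\text{drive}}(t) + F_{\text{ts}}\!\bigl(q(t)\bigr)}{k}
\]
```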

  16. Invited Article: VEDA: A web-based virtual environment for dynamic atomic force microscopy

    NASA Astrophysics Data System (ADS)

    Melcher, John; Hu, Shuiqing; Raman, Arvind

    2008-06-01

    We describe here the theory and applications of virtual environment dynamic atomic force microscopy (VEDA), a suite of state-of-the-art simulation tools deployed on nanoHUB (www.nanohub.org) for the accurate simulation of tip motion in dynamic atomic force microscopy (dAFM) over organic and inorganic samples. VEDA takes advantage of nanoHUB's cyberinfrastructure to run high-fidelity dAFM tip dynamics computations on local clusters and the TeraGrid. Consequently, these tools are freely accessible and the dAFM simulations are run using standard web-based browsers without requiring additional software. A wide range of issues in dAFM, ranging from optimal probe choice, probe stability, tip-sample interaction forces, and power dissipation to material property extraction and scanning dynamics over heterogeneous samples, can be addressed.

  17. An adaptive, object oriented strategy for base calling in DNA sequence analysis.

    PubMed Central

    Giddings, M C; Brumley, R L; Haker, M; Smith, L M

    1993-01-01

    An algorithm has been developed for the determination of nucleotide sequence from data produced in fluorescence-based automated DNA sequencing instruments employing the four-color strategy. This algorithm takes advantage of object oriented programming techniques for modularity and extensibility. The algorithm is adaptive in that data sets from a wide variety of instruments and sequencing conditions can be used with good results. Confidence values are provided on the base calls as an estimate of accuracy. The algorithm iteratively employs confidence determinations from several different modules, each of which examines a different feature of the data for accurate peak identification. Modules within this system can be added or removed for increased performance or for application to a different task. In comparisons with commercial software, the algorithm performed well. PMID:8233787
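
    The modular confidence scheme can be caricatured in a few lines (our sketch, not the published algorithm, which iterates and weights its modules more carefully):

```python
# Each module scores one feature of a candidate peak; scores combine into a
# single confidence, and modules can be added or removed independently.
def spacing_module(peak):
    """Score peak-to-peak spacing regularity (toy rule)."""
    return 0.9 if peak["spacing_ok"] else 0.4

def height_module(peak):
    """Score relative peak height (toy rule)."""
    return min(1.0, peak["height"] / peak["mean_height"])

MODULES = [spacing_module, height_module]

def call_confidence(peak):
    scores = [module(peak) for module in MODULES]
    return sum(scores) / len(scores)   # simple average; the paper iterates

peak = {"spacing_ok": True, "height": 420.0, "mean_height": 500.0}
print(call_confidence(peak))           # ~0.87
```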

  18. Combining Semantic and Lexical Methods for Mapping MedDRA to VCM Icons.

    PubMed

    Lamy, Jean-Baptiste; Tsopra, Rosy

    2018-01-01

    VCM (Visualization of Concept in Medicine) is an iconic language that represents medical concepts, such as disorders, by icons. VCM has a formal semantics described by an ontology. The icons can be used in medical software for providing a visual summary or enriching texts. However, the use of VCM icons in user interfaces requires to map standard medical terminologies to VCM. Here, we present a method combining semantic and lexical approaches for mapping MedDRA to VCM. The method takes advantage of the hierarchical relations in MedDRA. It also analyzes the groups of lemmas in the term's labels, and relies on a manual mapping of these groups to the concepts in the VCM ontology. We evaluate the method on 50 terms. Finally, we discuss the method and suggest perspectives.
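
    The lexical step can be caricatured as follows (a toy with invented lemma groups and concept names; the real method uses proper lemmatization and the VCM ontology):

```python
# Manual mapping from groups of lemmas to icon-ontology concepts.
LEMMA_GROUPS = {
    frozenset({"cardiac"}): "heart",
    frozenset({"failure"}): "insufficiency",
    frozenset({"acute"}): "acute_modifier",
}

def map_term(label):
    """Return every concept whose lemma group appears in the term label."""
    lemmas = set(label.lower().split())       # stand-in for real lemmatization
    return [concept for group, concept in LEMMA_GROUPS.items()
            if group <= lemmas]

print(map_term("Acute cardiac failure"))
# ['heart', 'insufficiency', 'acute_modifier']
```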

  19. Miniaturization as a key factor to the development and application of advanced metrology systems

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Dobrev, Ivo; Harrington, Ellery; Hefti, Peter; Khaleghi, Morteza

    2012-10-01

    Recent technological advances in miniaturization engineering are enabling the realization of components and systems with unprecedented capabilities. Such capabilities, which are significantly beneficial to scientific and engineering applications, are impacting the development and application of optical metrology systems for investigations under complex boundary, loading, and operating conditions. In this paper, an overview of the metrology systems that we are developing is presented. Systems are being developed and applied to high-speed and high-resolution measurements of shape and deformations under actual operating conditions for such applications as sustainability, health, medical diagnosis, security, and urban infrastructure. The systems take advantage of recent developments in light sources and modulators, detectors, microelectromechanical systems (MEMS) sensors and actuators, kinematic positioners, rapid prototyping fabrication technologies, as well as software engineering.

  20. Expert system development methodology and the transition from prototyping to operations - Fiesta, a case study

    NASA Technical Reports Server (NTRS)

    Happell, Nadine; Miksell, Steve; Carlisle, Candace

    1989-01-01

    A major barrier in taking expert systems from prototype to operational status involves instilling end-user confidence in the operational system. Different software life cycle models are examined, and the advantages and disadvantages of each when applied to expert system development are explored. The Fault Isolation Expert System for Tracking and data relay satellite system Applications (FIESTA) is presented as a case study of the development of an expert system. The end-user confidence necessary for operational use of this system is accentuated by the fact that it will handle real-time data in a secure environment, allowing little tolerance for errors. How FIESTA is dealing with transition problems as it moves from an off-line standalone prototype to an on-line real-time system is discussed.

  1. Countermeasures for Time-Cheat Detection in Multiplayer Online Games

    NASA Astrophysics Data System (ADS)

    Ferretti, Stefano

    Cheating is an important issue in games. Depending on the system over which the game is deployed, several types of malicious actions may be carried out so as to take an unfair and unexpected advantage over the game and over the (digital, human) adversaries. When the game is a standalone application, cheats typically relate only to the specific software code developed to build the application. It is no surprise to find (on the Web and in specialized magazines) people who explain cheats for specific games, stating, for instance, which configuration files can be altered (and how to do it) to automatically gain some bonus during the game. To avoid this, game developers are hence motivated to build stable code, with related data that should be securely managed and made difficult to alter.

  2. Cavallo's multiplier for in situ generation of high voltage

    NASA Astrophysics Data System (ADS)

    Clayton, S. M.; Ito, T. M.; Ramsey, J. C.; Wei, W.; Blatnik, M. A.; Filippone, B. W.; Seidel, G. M.

    2018-05-01

    A classic electrostatic induction machine, Cavallo's multiplier, is suggested for in situ production of very high voltage in cryogenic environments. The device is suitable for generating a large electrostatic field under conditions of very small load current. Operation of the Cavallo multiplier is analyzed, with quantitative description in terms of mutual capacitances between electrodes in the system. A demonstration apparatus was constructed, and measured voltages are compared to predictions based on measured capacitances in the system. The simplicity of the Cavallo multiplier makes it amenable to electrostatic analysis using finite element software, and electrode shapes can be optimized to take advantage of a high dielectric strength medium such as liquid helium. A design study is presented for a Cavallo multiplier in a large-scale, cryogenic experiment to measure the neutron electric dipole moment.
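
    An idealized charge-accumulation recurrence conveys the flavor of the quantitative analysis; the notation below is generic and the loss model is an assumption, not the paper's derivation.

```latex
% Idealized Cavallo-multiplier charge recurrence (generic notation, assumed
% loss model; not the paper's derivation). Per mechanical cycle:
\[
  Q_{n+1} = Q_n + C_m V_0 - \alpha\, Q_n ,
  \qquad
  V_n = \frac{Q_n}{C_s},
\]
% so the output voltage rises geometrically toward the saturation value
\[
  V_\infty = \frac{C_m V_0}{\alpha\, C_s}.
\]
% Here $V_0$ is the inducer potential, $C_m$ the inducer--collector mutual
% capacitance, $C_s$ the storage capacitance, and $\alpha$ the fraction of
% stored charge lost per cycle. Because $V_\infty$ is set by capacitance
% ratios, electrode shapes can be optimized with electrostatic finite
% element software, as the abstract notes.
```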

  3. Software Review. Macintosh Laboratory Automation: Three Software Packages.

    ERIC Educational Resources Information Center

    Jezl, Barbara Ann

    1990-01-01

    Reviewed are "LABTECH NOTEBOOK,""LabVIEW," and "Parameter Manager pmPLUS/pmTALK." Each package is described including functions, uses, hardware, and costs. Advantages and disadvantages of this type of laboratory approach are discussed. (CW)

  4. 24 CFR 1710.4 - Exemptions-general.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ..., practice, or course of business which operates or would operate as a fraud or deceit upon a purchaser. (c... Secretary in order to take advantage of an exemption. If a developer elects to take advantage of an...

  5. A Software Designed For STP Data Plot and Analysis Based on Object-oriented Methodology

    NASA Astrophysics Data System (ADS)

    Lina, L.; Murata, K.

    2006-12-01

    In the present study, we design a system named "STARS (Solar-Terrestrial data Analysis and Reference System)". STARS provides a research environment in which researchers can refer to and analyze a variety of data with a single piece of software. The software design is based on the OMT (Object Modeling Technique). The OMT is one of the object-oriented techniques, with advantages in maintainability, reuse, and long-term development of a system. At the Center for Information Technology, Ehime University, we have already started implementing STARS following our design. The latest version, STARS5, was released in 2006. Any user can download the system from our WWW site (http://www.infonet.cite.ehime-u.ac.jp/STARS). The present paper is mainly devoted to the design of a data analysis software system. Throughout the design we took care to keep it flexible and applicable when other developers design software for a similar purpose; if our model were particular only to our own purpose, it would be useless to other developers. In designing the domain object model, we carefully removed the parts that depend on system resources, e.g., hardware and software, and put those dependent parts into the application object model. In the present design, therefore, the domain object model and the utility object model are independent of computer resources. This helps another developer construct his or her own system based on the present design: one simply modifies the application object model according to the available system resources. This division of the design into three object models, separating resource-dependent and resource-independent parts, is one of the advantages of the OMT. If the design of the software is done completely along with the OMT, implementation is rather simple and almost automatic: developers simply map their designs onto programs. If one creates "another STARS" in a different programming language such as Java, the programmer simply follows the present design as long as the language is object-oriented. Researchers may want to add their own data to STARS. In this case, they simply add their own data class to the domain object model, because any satellite data has properties such as time or date, which are inherited from the upper class. In this way, their effort is less than with older methodologies. In the OMT, the description format of the system is rather strictly standardized. When new developers take part in the STARS project, they have only to understand each model to obtain an overview of STARS; they can then follow the designs and documents to implement the system. The OMT makes it easy for a newcomer to join a project already running.
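
    The inheritance idea described above, where a new data class inherits common properties such as time and date from an upper class, can be sketched in Python; the class and attribute names are hypothetical, not the actual STARS design.

```python
# Sketch of the domain-object-model idea: new satellite data classes inherit
# common properties (time, date) from an upper class. Names are hypothetical.
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class ObservationData:            # upper class: resource-independent behavior
    start: datetime
    end: datetime

    def covers(self, t: datetime) -> bool:
        return self.start <= t <= self.end

@dataclass
class SatelliteData(ObservationData):
    satellite: str = "unknown"
    values: List[float] = field(default_factory=list)

# A researcher adds a new data set by subclassing; time/date handling is inherited.
@dataclass
class MagnetometerData(SatelliteData):
    component: str = "Bz"

d = MagnetometerData(datetime(2006, 1, 1), datetime(2006, 1, 2),
                     satellite="GEOTAIL", values=[1.2, 0.8], component="Bx")
print(d.covers(datetime(2006, 1, 1, 12)))  # True
```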

  6. Beam Position and Phase Monitor - Wire Mapping System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watkins, Heath A; Shurter, Robert B.; Gilpatrick, John D.

    2012-04-10

    The Los Alamos Neutron Science Center (LANSCE) deploys many cylindrical beam position and phase monitors (BPPM) throughout the linac to measure the beam central position, phase and bunched-beam current. Each monitor is calibrated and qualified prior to installation to ensure it meets LANSCE requirements. The BPPM wire mapping system is used to map the BPPM electrode offset, sensitivity and higher order coefficients. This system uses a three-axis motion table to position the wire antenna structure within the cavity, simulating the beam excitation of a BPPM at a fundamental frequency of 201.25 MHz. RF signal strength is measured and recorded for the four electrodes as the antenna position is updated. An effort is underway to extend the system's service to the LANSCE facility by replacing obsolete electronic hardware and taking advantage of software enhancements. This paper describes the upgraded wire positioning system's new hardware and software capabilities, including its revised antenna structure, motion control interface, RF measurement equipment and LabVIEW software upgrades. The main purpose of the wire mapping system at LANSCE is to characterize the amplitude response versus beam central position of BPPMs before they are installed in the beam line. The wire mapping system is able to simulate a beam using a thin wire and measure the signal response as the wire position is varied within the BPPM aperture.
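
    Near the electrode axis, a first-order difference-over-sum estimate illustrates the kind of position response such a wire map calibrates; the sensitivity constants below are made-up placeholders for the per-device coefficients the mapping measures.

```python
# First-order difference-over-sum position estimate for a four-electrode BPM.
# The sensitivity constants kx, ky are per-device calibration values obtained
# from exactly this kind of wire map; the numbers here are made up.
def beam_position(top: float, bottom: float, left: float, right: float,
                  kx: float = 10.0, ky: float = 10.0):  # mm per unit ratio
    x = kx * (right - left) / (right + left)
    y = ky * (top - bottom) / (top + bottom)
    return x, y  # valid near the axis; higher-order terms need the full map

# Amplitudes (arbitrary units) measured at 201.25 MHz for one wire position:
print(beam_position(top=1.05, bottom=0.95, left=0.90, right=1.10))
# -> (1.0, 0.5): the wire sits right of and above the electrical center
```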

  7. Strategic directions of computing at Fermilab

    NASA Astrophysics Data System (ADS)

    Wolbers, Stephen

    1998-05-01

    Fermilab computing has changed a great deal over the years, driven by the demands of the Fermilab experimental community to record and analyze larger and larger datasets, by the desire to take advantage of advances in computing hardware and software, and by the advances coming from the R&D efforts of the Fermilab Computing Division. The strategic directions of Fermilab Computing continue to be driven by the needs of the experimental program. The current fixed-target run will produce over 100 TBytes of raw data and systems must be in place to allow the timely analysis of the data. The collider run II, beginning in 1999, is projected to produce of order 1 PByte of data per year. There will be a major change in methodology and software language as the experiments move away from FORTRAN and into object-oriented languages. Increased use of automation and the reduction of operator-assisted tape mounts will be required to meet the needs of the large experiments and large data sets. Work will continue on higher-rate data acquisition systems for future experiments and projects. R&D projects will be pursued as necessary to provide software, tools, or systems which cannot be purchased or acquired elsewhere. A closer working relation with other high energy laboratories will be pursued to reduce duplication of effort and to allow effective collaboration on many aspects of HEP computing.

  8. The deep space 1 extended mission

    NASA Astrophysics Data System (ADS)

    Rayman, Marc D.; Varghese, Philip

    2001-03-01

    The primary mission of Deep Space 1 (DS1), the first flight of the New Millennium program, completed successfully in September 1999, having exceeded its objectives of testing new, high-risk technologies important for future space and Earth science missions. DS1 is now in its extended mission, with plans to take advantage of the advanced technologies, including solar electric propulsion, to conduct an encounter with comet 19P/Borrelly in September 2001. During the extended mission, the spacecraft's commercial star tracker failed; this critical loss prevented the spacecraft from achieving three-axis attitude control or knowledge. A two-phase approach to recovering the mission was undertaken. The first involved devising a new method of pointing the high-gain antenna to Earth using the radio signal received at the Deep Space Network as an indicator of spacecraft attitude. The second was the development of new flight software that allowed the spacecraft to return to three-axis operation without substantial ground assistance. The principal new feature of this software is the use of the science camera as an attitude sensor. The differences between the science camera and the star tracker have important implications not only for the design of the new software but also for the methods of operating the spacecraft and conducting the mission. The ambitious rescue was fully successful, and the extended mission is back on track.

  9. Recent advances in the structure elucidation of small organic molecules by the LSD software.

    PubMed

    Plainchont, Bertrand; de Paulo Emerenciano, Vicente; Nuzillard, Jean-Marc

    2013-08-01

    The LSD software proposes the structures of small organic molecules that fit with structural constraints from 1D and 2D NMR spectroscopy. Its initial design introduced limits that needed to be eliminated to extend its scope and help its users choose the most likely structure among those proposed. The LSD software code has been improved, so that it recognizes a wider set of atom types to build molecules. More flexibility has been given in the interpretation of 2D NMR data, including the automatic detection of very long-range correlations. A program named pyLSD was written to deal with problems in which atom types are ambiguously defined. It also provides a (13)C NMR chemical shift-based solution ranking algorithm. PyLSD was able to propose the correct structure of hexacyclinol, a natural product whose structure determination has been highly controversial. The solution was ranked first within a list of ten structures that were produced by pyLSD from the literature NMR data. The structure of an aporphine natural product was determined by pyLSD, taking advantage of the possibility of handling electrically charged atoms. The structure generation of the insect antifeedant azadirachtin by LSD was reinvestigated by pyLSD, considering that three (13)C resonances did not lead to univocal hybridization states. Copyright © 2013 John Wiley & Sons, Ltd.

  10. 75 FR 27341 - Increasing Market and Planning Efficiency Through Improved Software; Notice of Technical...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-14

    ... Market and Planning Efficiency Through Improved Software; Notice of Technical Conference To Discuss Increasing Market and Planning Efficiency Through Improved Software May 7, 2010. Take notice that Commission... planning efficiency through improved software.

  11. Dynamic Optical Networks for Future Internet Environments

    NASA Astrophysics Data System (ADS)

    Matera, Francesco

    2014-05-01

    This article reports an overview of the evolution of the optical network scenario, taking into account the exponential growth of connected devices, big data, and cloud computing that is driving a concrete transformation impacting the information and communication technology world. This hyper-connected scenario is deeply affecting relationships between individuals, enterprises, citizens, and public administrations, fostering innovative use cases in practically any environment and market, and introducing new opportunities and new challenges. The successful realization of this hyper-connected scenario depends on different elements of the ecosystem. In particular, it builds on connectivity and functionalities allowed by converged next-generation networks and their capacity to support and integrate with the Internet of Things, machine-to-machine communication, and cloud computing. This article aims to provide some insight into this scenario and to contribute to the analysis of its impact on optical system and network issues and requirements. In particular, the role of the software-defined network is investigated by taking into account all scenarios regarding data centers, cloud computing, and machine-to-machine communication, illustrating the advantages that could be introduced by advanced optical communications.

  12. Liquid argon scintillation light studies in LArIAT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kryczynski, Pawel

    2016-10-12

    The LArIAT experiment is using its Liquid Argon Time Projection Chamber (LArTPC) in the second run of data-taking at the Fermilab Test Beam Facility. The goal of the experiment is to study the response of LArTPCs to charged particles of energies relevant for planned neutrino experiments. In addition, it will help to develop and evaluate the performance of the simulation, analysis, and reconstruction software used in other LAr neutrino experiments. Particles from a tertiary beam detected by LArIAT (mainly protons, pions and muons) are identified using a set of beamline detectors, including Wire Chambers, Time of Flight counters and Cherenkov counters, as well as a simplified sampling detector used to detect muons. In its effort towards augmenting LArTPC technology for other neutrino experiments, LArIAT also takes advantage of the scintillating capabilities of LAr and is testing the possibility of using the light signal to help reconstruct calorimetric information and particle ID. In this report, we present results from these studies of the scintillation light signal to evaluate detector performance and calorimetry.

  13. Integrated Electronic Health Record Database Management System: A Proposal.

    PubMed

    Schiza, Eirini C; Panos, George; David, Christiana; Petkov, Nicolai; Schizas, Christos N

    2015-01-01

    eHealth has attained significant importance as a new mechanism for health management and medical practice. However, the technological growth of eHealth is still limited by the technical expertise needed to develop appropriate products. Researchers are constantly developing and testing new software for building and handling clinical medical records, now termed Electronic Health Record (EHR) systems; EHRs take full advantage of technological developments and at the same time provide increased diagnostic and treatment capabilities to doctors. A step to be considered for facilitating this aim is to involve the doctor more actively in building the fundamental steps for creating the EHR system and database. A global clinical patient record database management system can be electronically created by simulating real-life medical-practice health record taking, and by utilizing and analyzing the recorded parameters. This proposed approach demonstrates the effective implementation of a universal classic medical record in electronic form, a procedure by which clinicians are led to utilize algorithms and intelligent systems for their differential diagnosis, final diagnosis, and treatment strategies.

  14. TOPDOM: database of conservatively located domains and motifs in proteins.

    PubMed

    Varga, Julia; Dobson, László; Tusnády, Gábor E

    2016-09-01

    The TOPDOM database, originally created as a collection of domains and motifs located consistently on the same side of the membrane in α-helical transmembrane proteins, has been updated and extended by taking into consideration consistently localized domains and motifs in globular proteins, too. By taking advantage of the recently developed CCTOP algorithm to determine the type of a protein and predict topology in the case of transmembrane proteins, and by applying a thorough search for domains and motifs as well as utilizing the most up-to-date version of all source databases, we managed to reach a 6-fold increase in the size of the whole database and a 2-fold increase in the number of transmembrane proteins. The TOPDOM database is available at http://topdom.enzim.hu. The webpage utilizes the common Apache, PHP5 and MySQL software to provide the user interface for accessing and searching the database. The database itself is generated on a high-performance computer. Contact: tusnady.gabor@ttk.mta.hu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  15. A New Blondin System for Surveying and Photogrammetry

    PubMed Central

    Cuesta, Federico; Lopez-Rodriguez, Francisco M.; Esteban, Antonio

    2013-01-01

    The main objective of the system presented in this paper is to provide surveyors and engineers with a new photogrammetry device that can be easily integrated with surveying total stations and a global navigation satellite system (GNSS) infrastructure at a construction site, taking advantage of their accuracy and overcoming the limitations of aerial vehicles with respect to weight, autonomy and skilled-operator requirements in aerial photogrammetry. The system moves between two mounting points, in a blondin ropeway configuration, at the construction site, taking pictures and recording position and orientation data along the cable path. A cascaded extended Kalman filter is used to integrate measurements from the on-board inertial measurement unit (IMU), a GPS and a GNSS. Experimental results taken at a construction site show the system performance, including the validation of the position estimation, with a robotic surveying total station, and the creation of a digital surface model (DSM), using the emergent structure-from-motion (SfM) techniques and open-source software. The georeferencing of the DSM is performed based on estimated camera positions or using ground control points (GCPs).

  16. An Instructional Note on Linear Programming--A Pedagogically Sound Approach.

    ERIC Educational Resources Information Center

    Mitchell, Richard

    1998-01-01

    Discusses the place of linear programming in college curricula and the advantages of using linear-programming software. Lists important characteristics of computer software used in linear programming for more effective teaching and learning. (ASK)

  17. Non-Classroom Use of "Presentation Software" in Accelerated Classes: Student Use and Perceptions of Value

    ERIC Educational Resources Information Center

    Davies, Thomas; Korte, Leon; Cornelsen, Erin

    2016-01-01

    Numerous articles found in education literature discuss the advantages and disadvantages of using "presentation" software to deliver critical course content to students. Frequently the perceived value of the use of software such as PowerPoint is dependent upon how it is used, for instance, the extent to which bells and whistles are…

  18. Pi-Sat: A Low Cost Small Satellite and Distributed Spacecraft Mission System Test Platform

    NASA Technical Reports Server (NTRS)

    Cudmore, Alan

    2015-01-01

    Current technology and budget trends indicate a shift in satellite architectures from large, expensive single-satellite missions to small, low-cost distributed spacecraft missions. At the center of this shift is the SmallSat/CubeSat architecture. The primary goal of the Pi-Sat project is to create a low-cost and easy-to-use Distributed Spacecraft Mission (DSM) test bed to facilitate the research and development of next-generation DSM technologies and concepts. This test bed also serves as a realistic software development platform for small satellite and CubeSat architectures. The Pi-Sat is based on the popular $35 Raspberry Pi single-board computer featuring a 700 MHz ARM processor, 512 MB of RAM, a flash memory card, and a wealth of I/O options. The Raspberry Pi runs the Linux operating system and can easily run Code 582's Core Flight System flight software architecture. The low cost and high availability of the Raspberry Pi make it an ideal platform for Distributed Spacecraft Mission and CubeSat software development. The Pi-Sat models currently include a Pi-Sat 1U Cube, a Pi-Sat Wireless Node, and a Pi-Sat CubeSat processor card. The Pi-Sat project takes advantage of many popular trends in the Maker community, including low-cost electronics, 3D printing, and rapid prototyping, in order to provide a realistic platform for flight software testing, training, and technology development. The Pi-Sat has also provided fantastic hands-on training opportunities for NASA summer interns and Pathways students.

  19. Link Analysis in the Mission Planning Lab

    NASA Technical Reports Server (NTRS)

    McCarthy, Jessica A.; Cervantes, Benjamin W.; Daugherty, Sarah C.; Arroyo, Felipe; Mago, Divyang

    2011-01-01

    The legacy communications link analysis software currently used at Wallops Flight Facility involves processes that are different for command destruct, radar, and telemetry. There is a clear advantage to developing an easy-to-use tool that combines all the processes in one application. Link Analysis in the Mission Planning Lab (MPL) uses custom software and algorithms integrated with Analytical Graphics Inc. Satellite Toolkit (AGI STK). The MPL link analysis tool uses pre/post-mission data to conduct a dynamic link analysis between ground assets and the launch vehicle. Just as the legacy methods do, the MPL link analysis tool calculates signal strength and signal-to-noise ratio according to the accepted processes for command destruct, radar, and telemetry assets. Graphs and other custom data are generated rapidly in formats for reports and presentations. STK is used for analysis as well as to depict plume angles and antenna gain patterns in 3D. The MPL has developed two interfaces with the STK software (see figure). The first interface is an HTML utility, which was developed in Visual Basic to enhance analysis for plume modeling and to offer a more user-friendly, flexible tool. A graphical user interface (GUI) written in MATLAB (see figure, upper right-hand corner) is also used to quickly depict link budget information for multiple ground assets. This new method yields a dramatic decrease in the time it takes to provide launch managers with the required link budgets to make critical pre-mission decisions. The software code used for these two custom utilities is a product of NASA's MPL.
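
    A minimal dB-domain link budget of the kind such tools automate is sketched below; the asset parameters are illustrative numbers, not Wallops values.

```python
# Minimal dB-domain link budget; all numbers are illustrative assumptions.
import math

def free_space_path_loss_db(distance_km: float, freq_mhz: float) -> float:
    # FSPL(dB) = 20 log10(d_km) + 20 log10(f_MHz) + 32.45
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.45

def received_power_dbm(eirp_dbm: float, rx_gain_dbi: float,
                       distance_km: float, freq_mhz: float,
                       misc_losses_db: float = 2.0) -> float:
    return (eirp_dbm + rx_gain_dbi
            - free_space_path_loss_db(distance_km, freq_mhz)
            - misc_losses_db)

# Telemetry example: 10 W (40 dBm) EIRP, 30 dBi ground antenna, 500 km, S-band.
pr = received_power_dbm(40.0, 30.0, 500.0, 2250.0)
noise_floor_dbm = -174 + 10 * math.log10(1e6) + 3   # kTB + noise figure, 1 MHz BW
print(f"Pr = {pr:.1f} dBm, SNR = {pr - noise_floor_dbm:.1f} dB")
```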

  20. Development of Total Knee Replacement Digital Templating Software

    NASA Astrophysics Data System (ADS)

    Yusof, Siti Fairuz; Sulaiman, Riza; Thian Seng, Lee; Mohd. Kassim, Abdul Yazid; Abdullah, Suhail; Yusof, Shahril; Omar, Masbah; Abdul Hamid, Hamzaini

    In this study, by taking full advantage of digital X-ray and computer technology, we have developed a semi-automated procedure for templating knee implants using a digital templating method. Using this approach, a software system called OrthoKneeTM has been designed and developed. The system is to be used in a study in the Department of Orthopaedic and Traumatology of the medical faculty, UKM (FPUKM). The OrthoKneeTM templating process employs a technique similar to the one many surgeons use with acetate templates over X-ray films. The templating technique makes it easy to template various implants from any manufacturer with a comprehensive database of templates. The templating functionality includes knee templates and manufacturers' templates (Smith & Nephew; Zimmer). From a patient X-ray image, OrthoKneeTM helps to quickly and easily read the approximate template size needed. The visual templating features then allow us to quickly review multiple template sizes against the X-ray and thus obtain a nearly precise view of the implant size required. The system can assist by templating on one patient image and will generate reports that can accompany patient notes. The software system was implemented in Visual Basic 6.0 Pro using object-oriented techniques to manage the graphics and objects. The approach used for image scaling is discussed. Several measurements used in the orthopedic diagnosis process have been studied and added to this software as measurement tool features, using mathematical theorems and equations. The study compared the results of the semi-automated (digital templating) method to the conventional method to demonstrate the accuracy of the system.
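
    The image-scaling step can be illustrated with a short Python sketch, assuming a calibration marker of known physical size is visible in the X-ray; the function and variable names are hypothetical.

```python
# Sketch of the X-ray image-scaling step: a calibration marker of known size
# fixes the millimeters-per-pixel ratio, so template dimensions can be drawn
# to scale. Function and variable names are illustrative.
import math

def mm_per_pixel(marker_px_a, marker_px_b, marker_true_mm: float) -> float:
    # Pixel distance between the two marker endpoints, then mm per pixel.
    dx = marker_px_b[0] - marker_px_a[0]
    dy = marker_px_b[1] - marker_px_a[1]
    return marker_true_mm / math.hypot(dx, dy)

def template_size_px(template_mm: float, scale_mm_per_px: float) -> float:
    return template_mm / scale_mm_per_px

# A 30 mm calibration ball spans these two pixel coordinates on the X-ray:
scale = mm_per_pixel((120, 340), (120, 520), 30.0)     # -> ~0.1667 mm/px
print(template_size_px(70.0, scale))                   # 70 mm implant ≈ 420 px
```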

  1. Constraint Network Analysis (CNA): a Python software package for efficiently linking biomacromolecular structure, flexibility, (thermo-)stability, and function.

    PubMed

    Pfleger, Christopher; Rathi, Prakash Chandra; Klein, Doris L; Radestock, Sebastian; Gohlke, Holger

    2013-04-22

    For deriving maximal advantage from information on biomacromolecular flexibility and rigidity, results from rigidity analyses must be linked to biologically relevant characteristics of a structure. Here, we describe the Python-based software package Constraint Network Analysis (CNA) developed for this task. CNA functions as a front- and backend to the graph-based rigidity analysis software FIRST. CNA goes beyond the mere identification of flexible and rigid regions in a biomacromolecule in that it (I) provides a refined modeling of thermal unfolding simulations that also considers the temperature-dependence of hydrophobic tethers, (II) allows performing rigidity analyses on ensembles of network topologies, either generated from structural ensembles or by using the concept of fuzzy noncovalent constraints, and (III) computes a set of global and local indices for quantifying biomacromolecular stability. This leads to more robust results from rigidity analyses and extends the application domain of rigidity analyses in that phase transition points ("melting points") and unfolding nuclei ("structural weak spots") are determined automatically. Furthermore, CNA robustly handles small-molecule ligands in general. Such advancements are important for applying rigidity analysis to data-driven protein engineering and for estimating the influence of ligand molecules on biomacromolecular stability. CNA maintains the efficiency of FIRST such that the analysis of a single protein structure takes a few seconds for systems of several hundred residues on a single core. These features make CNA an interesting tool for linking biomacromolecular structure, flexibility, (thermo-)stability, and function. CNA is available from http://cpclab.uni-duesseldorf.de/software for nonprofit organizations.

  2. MDWiZ: a platform for the automated translation of molecular dynamics simulations.

    PubMed

    Rusu, Victor H; Horta, Vitor A C; Horta, Bruno A C; Lins, Roberto D; Baron, Riccardo

    2014-03-01

    A variety of popular molecular dynamics (MD) simulation packages were independently developed in recent decades to reach diverse scientific goals. However, such non-coordinated development of software, force fields, and analysis tools for molecular simulations gave rise to an array of software formats and arbitrary conventions for the routine preparation and analysis of simulation input and output data. Different formats and/or parameter definitions are used at each stage of the modeling process despite largely containing redundant information across alternative software tools. Such a Babel of languages that cannot be easily and univocally translated into one another poses one of the major technical obstacles to the preparation, translation, and comparison of molecular simulation data that users face on a daily basis. Here, we present the MDWiZ platform, a freely accessible online portal designed to aid the fast and reliable preparation and conversion of file formats, allowing researchers to reproduce or generate data from MD simulations using different setups, including force fields and models with different underlying potential forms. The general structure of MDWiZ is presented, the features of version 1.0 are detailed, and an extensive validation based on GROMACS to LAMMPS conversion is presented. We believe that MDWiZ will be largely useful to the molecular dynamics community. Such fast format and force field exchange for a given system allows tailoring the chosen system to a given computer platform and/or taking advantage of specific capabilities offered by different software engines. Copyright © 2013 The Authors. Published by Elsevier Inc. All rights reserved.

  3. High-Performance Data Analysis Tools for Sun-Earth Connection Missions

    NASA Technical Reports Server (NTRS)

    Messmer, Peter

    2011-01-01

    The data analysis tool of choice for many Sun-Earth Connection missions is the Interactive Data Language (IDL) by ITT VIS. The increasing amount of data produced by these missions and the increasing complexity of image processing algorithms requires access to higher computing power. Parallel computing is a cost-effective way to increase the speed of computation, but algorithms oftentimes have to be modified to take advantage of parallel systems. Enhancing IDL to work on clusters gives scientists access to increased performance in a familiar programming environment. The goal of this project was to enable IDL applications to benefit from both computing clusters as well as graphics processing units (GPUs) for accelerating data analysis tasks. The tool suite developed in this project enables scientists now to solve demanding data analysis problems in IDL that previously required specialized software, and it allows them to be solved orders of magnitude faster than on conventional PCs. The tool suite consists of three components: (1) TaskDL, a software tool that simplifies the creation and management of task farms, collections of tasks that can be processed independently and require only small amounts of data communication; (2) mpiDL, a tool that allows IDL developers to use the Message Passing Interface (MPI) inside IDL for problems that require large amounts of data to be exchanged among multiple processors; and (3) GPULib, a tool that simplifies the use of GPUs as mathematical coprocessors from within IDL. mpiDL is unique in its support for the full MPI standard and its support of a broad range of MPI implementations. GPULib is unique in enabling users to take advantage of an inexpensive piece of hardware, possibly already installed in their computer, and achieve orders of magnitude faster execution time for numerically complex algorithms. TaskDL enables the simple setup and management of task farms on compute clusters. The products developed in this project have the potential to interact, so one can build a cluster of PCs, each equipped with a GPU, and use mpiDL to communicate between the nodes and GPULib to accelerate the computations on each node.
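
    The task-farm pattern that TaskDL provides for IDL can be illustrated by analogy with Python's standard multiprocessing pool; this is the same pattern, not TaskDL's API.

```python
# Analogy sketch of the task-farm idea: independent tasks with negligible
# inter-task communication, farmed out to a pool of worker processes.
# This is not TaskDL's (IDL) API, just the same pattern in Python.
from multiprocessing import Pool

def process_frame(frame_id: int) -> float:
    # Stand-in for an independent data-analysis task (e.g., one image).
    return sum(i * i for i in range(frame_id * 1000)) % 97

if __name__ == "__main__":
    with Pool(processes=4) as pool:               # the "farm" of workers
        results = pool.map(process_frame, range(32))
    print(results[:8])
```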

  4. Stream Flow Prediction and Flood Mapping in the Hindu Kush-Himalaya with the ICIMOD Water Resources App Portal (IWRAP)

    NASA Astrophysics Data System (ADS)

    Nelson, J.; Ames, D. P.; Jones, N.; Souffront, M.

    2016-12-01

    Earth observations of precipitation, temperature, moisture, and other atmospheric and land surface conditions form the foundation of global hydrologic forecasts that are increasingly available as native as well as derived products. The European Centre for Medium-Range Weather Forecasts (ECMWF) has developed such products for global flood awareness, which can be downscaled to smaller regions and used for stream flow prediction in underserved areas such as the Hindu Kush-Himalaya. Combined with digital elevation data, now available at 30 meters through the Shuttle Radar Topography Mission (SRTM), reconnaissance-level flood maps can be generated across wide regions where this would otherwise not be possible; where more information is available to drive higher-resolution models, the same forecasts can be used to provide forcing inflows for improved flood maps. Advances in cloud computing offer a unique opportunity to facilitate deployment of water resources models as decision-making tools in the cloud-based ICIMOD Water Resources App Portal (IWRAP). The interactive nature of web apps makes this an excellent medium for creating decision support tools that harness cutting-edge modeling techniques. Thin-client apps hosted in a cloud portal eliminate the need for decision makers to procure and maintain the high-performance hardware required by the models, deal with issues related to software installation and platform incompatibilities, or monitor and install software updates, a problem that is exacerbated in the Hindu Kush-Himalaya, where both financial and technical capacity are limited. All that is needed to use the system is an Internet connection and a web browser. We will take advantage of these technologies to develop tools which can be centrally maintained but openly accessible. Advanced mapping and visualization will make results intuitive and the information derived actionable. We will also take advantage of the emerging standards for sharing water information across the web using the OGC- and WMO-approved WaterML standards. This will make our tools interoperable, and we will help train those we work with so that tools and data from other projects can both feed and consume from the tools developed in our project.

  5. Automation of Military Civil Engineering and Site Design Functions: Software Evaluation

    DTIC Science & Technology

    1989-09-01

    promising advantage over manual methods, USACERL is to evaluate available software to determine which, if any, is best suited to the type of civil... moved. Therefore, original surface data were assembled by scaling the northing and easting distances of field elevations and entering them manually into... in the software or requesting an update or addition to the software or manuals. Responses to forms submitted during the test were received at

  6. Fine Mapping Causal Variants with an Approximate Bayesian Method Using Marginal Test Statistics

    PubMed Central

    Chen, Wenan; Larrabee, Beth R.; Ovsyannikova, Inna G.; Kennedy, Richard B.; Haralambieva, Iana H.; Poland, Gregory A.; Schaid, Daniel J.

    2015-01-01

    Two recently developed fine-mapping methods, CAVIAR and PAINTOR, demonstrate better performance over other fine-mapping methods. They also have the advantage of using only the marginal test statistics and the correlation among SNPs. Both methods leverage the fact that the marginal test statistics asymptotically follow a multivariate normal distribution and are likelihood based. However, their relationship with Bayesian fine mapping, such as BIMBAM, is not clear. In this study, we first show that CAVIAR and BIMBAM are actually approximately equivalent to each other. This leads to a fine-mapping method using marginal test statistics in the Bayesian framework, which we call CAVIAR Bayes factor (CAVIARBF). Another advantage of the Bayesian framework is that it can answer both association and fine-mapping questions. We also used simulations to compare CAVIARBF with other methods under different numbers of causal variants. The results showed that both CAVIARBF and BIMBAM have better performance than PAINTOR and other methods. Compared to BIMBAM, CAVIARBF has the advantage of using only marginal test statistics and takes about one-quarter to one-fifth of the running time. We applied different methods on two independent cohorts of the same phenotype. Results showed that CAVIARBF, BIMBAM, and PAINTOR selected the same top 3 SNPs; however, CAVIARBF and BIMBAM had better consistency in selecting the top 10 ranked SNPs between the two cohorts. Software is available at https://bitbucket.org/Wenan/caviarbf. PMID:25948564
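
    For a single SNP, a Wakefield-style asymptotic Bayes factor shows how a Bayes factor can be computed from a marginal test statistic alone; this is a sketch of the general approach, not CAVIARBF's exact multi-SNP computation, which also uses the SNP correlation matrix.

```python
# Single-SNP asymptotic Bayes factor from a marginal test statistic
# (Wakefield-style approximation; illustrative, not CAVIARBF's algorithm).
import math

def approx_log10_bf(z: float, se: float, w: float = 0.04) -> float:
    """z: marginal z-score; se: standard error of beta-hat; w: prior variance."""
    v = se * se                     # variance of the effect estimate
    r = w / (v + w)                 # shrinkage factor
    log_bf = 0.5 * (math.log(v / (v + w)) + z * z * r)   # log BF (H1 vs H0)
    return log_bf / math.log(10)

print(approx_log10_bf(z=5.2, se=0.05))  # ~4.9: strong evidence for association
```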

  7. Software for Optical Archive and Retrieval (SOAR) user's guide, version 4.2

    NASA Technical Reports Server (NTRS)

    Davis, Charles

    1991-01-01

    The optical disk is an emerging technology. Because it is not a magnetic medium, it offers a number of distinct advantages over the established form of storage, advantages that make it extremely attractive. They are as follows: (1) the ability to store much more data within the same space; (2) the random access characteristics of the Write Once Read Many optical disk; (3) a much longer life than that of traditional storage media; and (4) much greater data access rate. Software for Optical Archive and Retrieval (SOAR) user's guide is presented.

  8. Science Gateways, Scientific Workflows and Open Community Software

    NASA Astrophysics Data System (ADS)

    Pierce, M. E.; Marru, S.

    2014-12-01

    Science gateways and scientific workflows occupy different ends of the spectrum of user-focused cyberinfrastructure. Gateways, sometimes called science portals, provide a way of enabling large numbers of users to take advantage of advanced computing resources (supercomputers, advanced storage systems, science clouds) by providing Web and desktop interfaces and supporting services. Scientific workflows, at the other end of the spectrum, support advanced usage of cyberinfrastructure that enables "power users" to undertake computational experiments that are not easily done through the usual mechanisms (managing simulations across multiple sites, for example). Despite these different target communities, gateways and workflows share many similarities and can potentially be accommodated by the same software system. For example, pipelines to process InSAR imagery sets or to data-mine GPS time series data are workflows. The results and the ability to make downstream products may be made available through a gateway, and power users may want to provide their own custom pipelines. In this abstract, we discuss our efforts to build an open source software system, Apache Airavata, that can accommodate both gateway and workflow use cases. Our approach is general, and we have applied the software to problems in a number of scientific domains. In this talk, we discuss our applications to usage scenarios specific to earth science, focusing on earthquake physics examples drawn from the QuakSim.org and GeoGateway.org efforts. We also examine the role of the Apache Software Foundation's open community model as a way to build up common community codes that do not depend upon a single "owner" to sustain them. Pushing beyond open source software, we also see the need to provide gateways and workflow systems as cloud services. These services centralize operations, provide well-defined programming interfaces, scale elastically, and have global-scale fault tolerance. We discuss our work providing Apache Airavata as a hosted service to provide these features.

  9. Information and communication technologies in geography education in the 21st Century

    NASA Astrophysics Data System (ADS)

    Vangelova, Rumyana

    2014-05-01

    In 2013 I attended a course on the Introduction to the Use of Spatial Thinking and Geoinformation (in geography and related subjects) organized by the European Association of Geographers. This course has helped me to realize what tomorrow's classroom will be. We can change geography education in the classroom by using the following information technologies. Envision in the classroom: this software solution provides an interactive environment for the whole learning experience of students. Envision helps enhance the quality of teaching and also keeps children engaged. An advantage of Envision is that it integrates ICT in education in a natural and easy-to-implement way, improving the quality of education by making it a more positive experience for all involved parties. It is easy for teachers to use because it provides a flexible way to present lessons. The educational software system supports collaborative learning, giving teachers a powerful and easy-to-use tool for teaching and learning. It gives students the opportunity to take an active part in lessons and develops teamwork and collaboration skills. This software is suitable for very different topics in the classroom: geographical location, boundaries, climate, the political map, etc. Teachers benefit by easily engaging the full attention of children, taking advantage of best practices and exchanging experience with their colleagues. Children use their mice to interact with the system and can answer questions as individuals or as a group. They solve puzzles, categorize objects/concepts or locate objects on a map, and type answers using a virtual keyboard. During the lesson, Envision tracks the behavior of each child. Interactive classboard: the Interactive StarBoard Software helps students better acquire and understand new academic information. Children show great interest and greater independence, which helps them learn more easily. The use of educational games in teaching geography with this software helps to strengthen individual work, stimulating students' independent thinking and competitive nature. It helps them master the material and acquire knowledge and skills in geography in a fun environment. Using an interactive classboard and creating different products such as diagrams, maps, and drawings enhances students' learning abilities, creativity, and knowledge of environmental concepts and theories such as sustainable development and eco-thinking. Visualization of new learning content allows students to receive more information in a short time. Geomedia and GIS: geomedia is the visualization of information from different media sources and is concerned with digital content and its processing based on place, position, and location. Geoinformation can be used to create awareness of contemporary problems - environmental, demographic, and economic. 21st-century school education needs to include geomedia in daily teaching and learning. Students use ArcGIS to create their own interactive maps related to Bulgarian geography education, and in that way they develop their spatial thinking skills. Using different techniques and approaches, including geoinformation, geomedia, and the interactive classboard, supports green thinking and behavior in students by involving them actively in studying environmental problems and issues. Students can easily understand human impacts and the management issues which arise in conserving the earth's unique ecosystems.

  10. Clearing your Desk! Software and Data Services for Collaborative Web Based GIS Analysis

    NASA Astrophysics Data System (ADS)

    Tarboton, D. G.; Idaszak, R.; Horsburgh, J. S.; Ames, D. P.; Goodall, J. L.; Band, L. E.; Merwade, V.; Couch, A.; Hooper, R. P.; Maidment, D. R.; Dash, P. K.; Stealey, M.; Yi, H.; Gan, T.; Gichamo, T.; Yildirim, A. A.; Liu, Y.

    2015-12-01

    Can your desktop computer crunch the large GIS datasets that are becoming increasingly common across the geosciences? Do you have access to or the know-how to take advantage of advanced high performance computing (HPC) capability? Web based cyberinfrastructure takes work off your desk or laptop computer and onto infrastructure or "cloud" based data and processing servers. This talk will describe the HydroShare collaborative environment and web based services being developed to support the sharing and processing of hydrologic data and models. HydroShare supports the upload, storage, and sharing of a broad class of hydrologic data including time series, geographic features and raster datasets, multidimensional space-time data, and other structured collections of data. Web service tools and a Python client library provide researchers with access to HPC resources without requiring them to become HPC experts. This reduces the time and effort spent in finding and organizing the data required to prepare the inputs for hydrologic models and facilitates the management of online data and execution of models on HPC systems. This presentation will illustrate the use of web based data and computation services from both the browser and desktop client software. These web-based services implement the Terrain Analysis Using Digital Elevation Model (TauDEM) tools for watershed delineation, generation of hydrology-based terrain information, and preparation of hydrologic model inputs. They allow users to develop scripts on their desktop computer that call analytical functions that are executed completely in the cloud, on HPC resources using input datasets stored in the cloud, without installing specialized software, learning how to use HPC, or transferring large datasets back to the user's desktop. These cases serve as examples for how this approach can be extended to other models to enhance the use of web and data services in the geosciences.
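
    The pattern of calling a cloud-hosted analysis function from a desktop script can be sketched as follows; the endpoint URL and parameter names are hypothetical, as the actual HydroShare/TauDEM services define their own API.

```python
# Sketch of calling a cloud-hosted terrain-analysis service from a desktop
# script. The endpoint URL and parameter names are hypothetical.
import requests

SERVICE = "https://example.org/taudem/api/watershed"   # hypothetical endpoint

def delineate_watershed(dem_resource_id: str, outlet_lat: float,
                        outlet_lon: float) -> dict:
    resp = requests.post(SERVICE, json={
        "dem": dem_resource_id,          # DEM already stored in the cloud
        "outlet": [outlet_lon, outlet_lat],
    }, timeout=300)
    resp.raise_for_status()
    return resp.json()                   # e.g., links to the result rasters

# result = delineate_watershed("hs-abc123", 41.74, -111.83)
```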

  11. Putting ROSE to Work: A Proposed Application of a Request-Oriented Scheduling Engine for Space Station Operations

    NASA Technical Reports Server (NTRS)

    Jaap, John; Muery, Kim

    2000-01-01

    Scheduling engines are found at the core of software systems that plan and schedule activities and resources. A Request-Oriented Scheduling Engine (ROSE) is one that processes a single request (adding a task to a timeline) and then waits for another request. For the International Space Station, a robust ROSE-based system would support multiple, simultaneous users, each formulating requests (defining scheduling requirements), submitting these requests via the internet to a single scheduling engine operating on a single timeline, and immediately viewing the resulting timeline. ROSE is significantly different from the engine currently used to schedule Space Station operations. The current engine supports essentially one person at a time, with a pre-defined set of requirements from many payloads, working in either a "batch" scheduling mode or an interactive/manual scheduling mode. A planning and scheduling process that takes advantage of the features of ROSE could produce greater customer satisfaction at reduced cost and reduced flow time. This paper describes a possible ROSE-based scheduling process and identifies the additional software component required to support it. Resulting changes to the management and control of the process are also discussed.

  12. Genomic similarity and kernel methods I: advancements by building on mathematical and statistical foundations.

    PubMed

    Schaid, Daniel J

    2010-01-01

    Measures of genomic similarity are the basis of many statistical analytic methods. We review the mathematical and statistical basis of similarity methods, particularly based on kernel methods. A kernel function converts information for a pair of subjects to a quantitative value representing either similarity (larger values meaning more similar) or distance (smaller values meaning more similar), with the requirement that it must create a positive semidefinite matrix when applied to all pairs of subjects. This review emphasizes the wide range of statistical methods and software that can be used when similarity is based on kernel methods, such as nonparametric regression, linear mixed models and generalized linear mixed models, hierarchical models, score statistics, and support vector machines. The mathematical rigor for these methods is summarized, as is the mathematical framework for making kernels. This review provides a framework to move from intuitive and heuristic approaches to define genomic similarities to more rigorous methods that can take advantage of powerful statistical modeling and existing software. A companion paper reviews novel approaches to creating kernels that might be useful for genomic analyses, providing insights with examples [1]. Copyright © 2010 S. Karger AG, Basel.
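
    A minimal Python sketch illustrates the defining requirement stated above: a kernel applied to all pairs of subjects must yield a positive semidefinite matrix. A linear kernel on genotype counts is used here as the simplest illustrative choice.

```python
# Building a genomic similarity kernel and checking positive semidefiniteness.
import numpy as np

rng = np.random.default_rng(0)
G = rng.integers(0, 3, size=(6, 100)).astype(float)   # 6 subjects x 100 SNPs

Gc = G - G.mean(axis=0)                 # center genotypes per SNP
K = Gc @ Gc.T / G.shape[1]              # linear kernel: similarity per pair

eigvals = np.linalg.eigvalsh(K)         # symmetric, so use eigvalsh
assert eigvals.min() > -1e-10           # PSD up to numerical noise
print(K.shape, eigvals.round(3))
```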

  13. New Version of SeismicHandler (SHX) based on ObsPy

    NASA Astrophysics Data System (ADS)

    Stammler, Klaus; Walther, Marcus

    2016-04-01

    The command-line version of SeismicHandler (SH), a scientific analysis tool for seismic waveform data developed around 1990, has been redesigned in recent years, based on a project funded by the Deutsche Forschungsgemeinschaft (DFG). The aim was to address new data access techniques, simplified metadata handling and a modularized software design. As a result, the program was rewritten in Python in its main parts, taking advantage of the simplicity of this scripting language and its variety of well-developed software libraries, including ObsPy. SHX provides easy access to waveforms and metadata via the arclink and FDSN webservice protocols; access to event catalogs is also implemented. With single commands, whole networks or stations within a certain area may be read in; the metadata are retrieved from the servers and stored in a local database. For data processing, the large set of SH commands is available, as well as the SH scripting language. Via SH-language scripts or additional Python modules, the command set of SHX is easily extendable. The program is open source and tested on Linux operating systems; documentation and downloads are found at "https://www.seismic-handler.org/".
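
    A minimal example of the FDSN web-service access that SHX builds on, written directly against ObsPy (standard ObsPy calls; the service endpoint and station are arbitrary choices):

```python
# Fetch and pre-process one waveform via an FDSN web service using ObsPy.
from obspy import UTCDateTime
from obspy.clients.fdsn import Client

client = Client("IRIS")                       # any FDSN web service endpoint
t0 = UTCDateTime("2016-04-01T00:00:00")
st = client.get_waveforms(network="IU", station="ANMO", location="00",
                          channel="BHZ", starttime=t0, endtime=t0 + 600)
st.detrend("linear")                          # routine pre-processing
st.filter("bandpass", freqmin=0.05, freqmax=1.0)
print(st)                                     # one 10-minute vertical trace
```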

  14. Performance Evaluation and Software Design for EVA Robotic Assistant Stereo Vision Heads

    NASA Technical Reports Server (NTRS)

    DiPaolo, Daniel

    2003-01-01

    The purpose of this project was to aid the EVA Robotic Assistant project by evaluating and designing the necessary interfaces for two stereo vision heads - the TracLabs Biclops pan-tilt-verge head and the Helpmate Zebra pan-tilt-verge head. The first half of the project consisted of designing the necessary software interface so that the other modules of the EVA Robotic Assistant had proper access to all of the functionality offered by each of the stereo vision heads. This half took most of the project time, due to a lack of ready-made CORBA drivers for either of the heads. Once this was overcome, the evaluation stage of the project began. The second half of the project was to take these interfaces and to evaluate each of the stereo vision heads in terms of usefulness to the project. In the key project areas such as stability and reliability, the Zebra pan-tilt-verge head came out on top. However, the Biclops did have many more advantages over the Zebra, such as lower power consumption, faster communications, and a simpler, cleaner API. Overall, the Biclops pan-tilt-verge head outperformed the Zebra pan-tilt-verge head.

  15. Cardiology office computer use: primer, pointers, pitfalls.

    PubMed

    Shepard, R B; Blum, R I

    1986-10-01

    An office computer is a utility, like an automobile, with benefits and costs that are both direct and hidden and potential for disaster. For the cardiologist or cardiovascular surgeon, the increasing power and decreasing costs of computer hardware and the availability of software make use of an office computer system an increasingly attractive possibility. Management of office business functions is common; handling and scientific analysis of practice medical information are less common. The cardiologist can also access national medical information systems for literature searches and for interactive further education. Selection and testing of programs and the entire computer system before purchase of computer hardware will reduce the chances of disappointment or serious problems. Personnel pretraining and planning for office information flow and medical information security are necessary. Some cardiologists design their own office systems, buy hardware and software as needed, write programs for themselves and carry out the implementation themselves. For most cardiologists, the better course will be to take advantage of the professional experience of expert advisors. This article provides a starting point from which the practicing cardiologist can approach considering, specifying or implementing an office computer system for business functions and for scientific analysis of practice results.

  16. World Wide Web Metaphors for Search Mission Data

    NASA Technical Reports Server (NTRS)

    Norris, Jeffrey S.; Wallick, Michael N.; Joswig, Joseph C.; Powell, Mark W.; Torres, Recaredo J.; Mittman, David S.; Abramyan, Lucy; Crockett, Thomas M.; Shams, Khawaja S.; Fox, Jason M.; hide

    2010-01-01

    A software program that searches and browses mission data emulates a Web browser, containing standard metaphors for Web browsing. By taking advantage of back-end URLs, users may save and share search states. Also, since a Web interface is familiar to users, training time is reduced. Familiar back and forward buttons move through a local search history. A refresh/reload button regenerates a query and loads in any new data. URLs can be constructed to save search results. Adding context to the current search is also handled through a familiar Web metaphor. The query is constructed by clicking on hyperlinks that represent new components of the search query. The selection of a link appears to the user as a page change; the choice of links changes to represent the updated search, and the results are filtered by the new criteria. Selecting a navigation link changes the current query and also the URL that is associated with it. The back button can be used to return to the previous search state. This software is part of the MSLICE release, which was written in Java. It will run on any current Windows, Macintosh, or Linux system.
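
    The back-end-URL idea, encoding the full search state in the query string so that saving or sharing the URL reproduces the search, can be sketched in a few lines of Python; the endpoint and parameter names are illustrative.

```python
# Round-tripping a search state through a URL query string.
from urllib.parse import urlencode, urlsplit, parse_qs

BASE = "https://example.org/mission-search"    # hypothetical search endpoint

def search_url(state: dict) -> str:
    # doseq=True expands list values into repeated query parameters.
    return BASE + "?" + urlencode(state, doseq=True)

def state_from_url(url: str) -> dict:
    return parse_qs(urlsplit(url).query)

url = search_url({"sol": 245, "instrument": ["NAVCAM", "HAZCAM"], "page": 3})
print(url)
print(state_from_url(url))   # recovers the search state for back/forward
```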

  17. Toward an automated parallel computing environment for geosciences

    NASA Astrophysics Data System (ADS)

    Zhang, Huai; Liu, Mian; Shi, Yaolin; Yuen, David A.; Yan, Zhenzhen; Liang, Guoping

    2007-08-01

    Software for geodynamic modeling has not kept up with the fast growing computing hardware and network resources. In the past decade supercomputing power has become available to most researchers in the form of affordable Beowulf clusters and other parallel computer platforms. However, to take full advantage of such computing power requires developing parallel algorithms and associated software, a task that is often too daunting for geoscience modelers whose main expertise is in geosciences. We introduce here an automated parallel computing environment built on open-source algorithms and libraries. Users interact with this computing environment by specifying the partial differential equations, solvers, and model-specific properties using an English-like modeling language in the input files. The system then automatically generates the finite element codes that can be run on distributed or shared memory parallel machines. This system is dynamic and flexible, allowing users to address different problems in geosciences. It is capable of providing web-based services, enabling users to generate source codes online. This unique feature will facilitate high-performance computing to be integrated with distributed data grids in the emerging cyber-infrastructures for geosciences. In this paper we discuss the principles of this automated modeling environment and provide examples to demonstrate its versatility.

  18. Framework for architecture-independent run-time reconfigurable applications

    NASA Astrophysics Data System (ADS)

    Lehn, David I.; Hudson, Rhett D.; Athanas, Peter M.

    2000-10-01

    Configurable Computing Machines (CCMs) have emerged as a technology with the computational benefits of custom ASICs as well as the flexibility and reconfigurability of general-purpose microprocessors. Significant effort from the research community has focused on techniques to move this reconfigurability from a rapid application development tool to a run-time tool. This requires the ability to change the hardware design while the application is executing and is known as Run-Time Reconfiguration (RTR). Widespread acceptance of run-time reconfigurable custom computing depends upon the existence of high-level automated design tools. Such tools must reduce the designer's effort to port applications between different platforms as the architecture, hardware, and software evolve. A Java implementation of a high-level application framework, called Janus, is presented here. In this environment, developers create Java classes that describe the structural behavior of an application. The framework allows hardware and software modules to be freely mixed and interchanged. A compilation phase of the development process analyzes the structure of the application and adapts it to the target platform. Janus is capable of structuring the run-time behavior of an application to take advantage of the memory and computational resources available.

  19. Transition From NASA Space Communication Systems to Commercial Communication Products

    NASA Technical Reports Server (NTRS)

    Ghazvinian, Farzad; Lindsey, William C.

    1994-01-01

    Transitioning from twenty-five years of space communication system architecting, engineering and development to creating and marketing commercial communication system hardware and software products is no simple task for small, high-tech system engineering companies whose major source of revenue has been the U.S. Government. Yet, many small businesses are faced with this onerous and perplexing task. The purpose of this talk/paper is to present one small business's (LinCom's) approach to taking advantage of the systems engineering expertise and knowledge captured in physical neural networks and simulation software through support of numerous National Aeronautics and Space Administration (NASA) and Department of Defense (DoD) projects, e.g., Space Shuttle, TDRSS, Space Station, DCSC, Milstar, etc. The innovative ingredients needed for a systems house to transition to a wireless communication system products house that supports personal communication services and networks (PCS and PCN) development in a global economy will be discussed. Efficient methods for using past government-sponsored space system research and development to transition to VLSI communication chip set products will be presented, along with notions of how synergy between government and industry can be maintained to benefit both parties.

  20. RELAP-7 Closure Correlations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zou, Ling; Berry, R. A.; Martineau, R. C.

    The RELAP-7 code is the next generation nuclear reactor system safety analysis code being developed at the Idaho National Laboratory (INL). The code is based on INL's modern scientific software development framework, MOOSE (Multi-Physics Object Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5's and TRACE's capabilities and extends their analysis capabilities for all reactor system simulation scenarios. The RELAP-7 code utilizes the well-posed 7-equation two-phase flow model for compressible two-phase flow. Closure models used in the TRACE code have been reviewed and selected to reflect the progress made during the past decades and to provide a basis for the closure correlations implemented in the RELAP-7 code. This document provides a summary of the closure correlations that are currently implemented in the RELAP-7 code. The closure correlations include sub-grid models that describe interactions between the fluids and the flow channel, and interactions between the two phases.

  1. Openly Published Environmental Sensing (OPEnS) | Advancing Open-Source Research, Instrumentation, and Dissemination

    NASA Astrophysics Data System (ADS)

    Udell, C.; Selker, J. S.

    2017-12-01

    The increasing availability and functionality of Open-Source software and hardware along with 3D printing, low-cost electronics, and proliferation of open-access resources for learning rapid prototyping are contributing to fundamental transformations and new technologies in environmental sensing. These tools invite reevaluation of time-tested methodologies and devices toward more efficient, reusable, and inexpensive alternatives. Building upon Open-Source design facilitates community engagement and invites a Do-It-Together (DIT) collaborative framework for research where solutions to complex problems may be crowd-sourced. However, barriers persist that prevent researchers from taking advantage of the capabilities afforded by open-source software, hardware, and rapid prototyping. Some of these include: requisite technical skillsets, knowledge of equipment capabilities, identifying inexpensive sources for materials, money, space, and time. A university MAKER space staffed by engineering students to assist researchers is one proposed solution to overcome many of these obstacles. This presentation investigates the unique capabilities the USDA-funded Openly Published Environmental Sensing (OPEnS) Lab affords researchers, within Oregon State and internationally, and the unique functions these types of initiatives support at the intersection of MAKER spaces, Open-Source academic research, and open-access dissemination.

  2. Adaptive software architecture based on confident HCI for the deployment of sensitive services in Smart Homes.

    PubMed

    Vega-Barbas, Mario; Pau, Iván; Martín-Ruiz, María Luisa; Seoane, Fernando

    2015-03-25

    Smart spaces foster the development of natural and appropriate forms of human-computer interaction by taking advantage of home customization. The interaction potential of the Smart Home, which is a special type of smart space, is of particular interest in fields in which the acceptance of new technologies is limited and restrictive. The integration of smart home design patterns with sensitive solutions can increase user acceptance. In this paper, we present the main challenges that have been identified in the literature for the successful deployment of sensitive services (e.g., telemedicine and assistive services) in smart spaces and a software architecture that models the functionalities of a Smart Home platform that are required to maintain and support such sensitive services. This architecture emphasizes user interaction as a key concept to facilitate the acceptance of sensitive services by end-users and utilizes activity theory to support its innovative design. The application of activity theory to the architecture eases the handling of novel concepts, such as understanding of the system by patients at home or the affordability of assistive services. Finally, we provide a proof-of-concept implementation of the architecture and compare the results with other architectures from the literature.

  3. Prospectus 1999

    NASA Astrophysics Data System (ADS)

    Holmes, Jon L.; Gettys, Nancy S.

    1999-01-01

    We begin 1999 with a message to all Journal subscribers about our plans for JCE Software and what you will be seeing in this column as the year progresses.

    Series News. JCE Software will continue to publish individual programs, one to an issue as they become ready for distribution. The old Series B, C, and D designations no longer exist. Regular Issue numbers for 1999 will start with 99 and end with M for Mac OS, W for Windows, or MW for programs that will run under both the Mac OS and Windows. Windows programs will be compatible with Windows 95/98 and may or may not be compatible with Windows 3.1. Special Issues, such as CD-ROMs and videotapes, will continue to be designated with SP followed by a number.

    Publication Plans for 1999. Periodic Table Live! Second Edition is a new version of one of JCE Software's most popular publications. The best features of Illustrated Periodic Table (1) for Windows and Chemistry Navigator (2) for Mac OS are combined in a new HTML-based, multimedia presentation format. Together with the video from Periodic Table Videodisc (3), digitized to take advantage of new features available in QuickTime 3 (4), the new Periodic Table Live! will be easy to use, with complete features available to both Windows and Mac OS users.

    The Chemistry Comes Alive! (CCA!) series continues in 1999 with CD-ROMs for Mac OS and Windows. Like the first two volumes (5, 6), new CDs will contain video and animations of chemical reactions, including clips from our videodiscs ChemDemos (7), ChemDemos II (8), and Titration Techniques (9). Other clips are new, available for the first time in Chemistry Comes Alive! New CCA! CDs will be made available in two varieties for individual users: one to take advantage of the high-quality video that can be displayed by new, faster computers, and another that will play well on older, slower models. In addition, a third variation for network licensing will include video optimized for delivery via the World Wide Web. If all goes according to plan, two new CCA! volumes will be announced in 1999, and CCA! 1 and CCA! 2 will be updated to take advantage of the latest digital video technology.

    Chem Pages, Laboratory Techniques, was developed by the New Traditions Curriculum Project at the University of Wisconsin-Madison. It is an HTML-based CD-ROM for Mac OS and Windows that contains lessons and tutorials to prepare introductory chemistry students to work in the laboratory. It includes text, photographs, computer graphics, animations, digital video, and voice narration to introduce students to laboratory equipment and procedures.

    Regular Issues. Programs that have been accepted for publication as Regular Issues in 1999 include a gas chromatography simulation for Windows 95 by Bruce Armitage, a collection of lessons on torsional rotation for organic chemistry students by Ronald Starkey, and a tutorial on pericyclic reactions, also for organic chemistry, by Albert Lee, C. T. So, and C. L. Chan. We have had many recent submissions, as well as submissions of work in progress. In 1999 we will work with the authors and our peer reviewers to complete and publish these submissions. Submissions include Multimedia Problems for General Chemistry by David Whisnant; lessons on point groups and crystallography by Margaret Kastner et al.; a mass spectrum simulator by Stephen W. Bigger and Robert A. Craig; a tutorial for introductory chemistry on determining the pH of very dilute acid and base solutions by Paul Mihas and George Papageorgiou; and many others. Also under development by the JCE Software staff are The General Chemistry Collection (instructor's edition) CD-ROM along with an updated student edition.

    An Invitation. In collaboration with JCE Online we plan to make available in 1999 support files for JCE Software. These will include not only troubleshooting tips and technical support notes, but also supporting information such as lessons, specific assignments, and activities using JCE Software publications submitted by users. All JCE Software users are invited to contribute to this area. Get in touch with JCE Software and let us know how you are using our materials so that we can share your ideas with others! Although the word software is in our name, many of our publications are not traditional software. We also publish video on videotape, videodisc, and CD-ROM, and electronic documents (Mathcad and Mathematica files, spreadsheet files and macros, HTML documents, and PowerPoint presentations). Most chemistry instructors who use a computer in their teaching have created or considered creating one or more of these for their classes. If you have an original computer presentation, electronic document, animation, video, or any other item that is not printed text, it is probably an appropriate submission for JCE Software. By publishing your work in any branch of the Journal of Chemical Education, you will share your efforts with chemistry instructors and students all over the world and get professional recognition for your achievements.

    Literature Cited
    1. Schatz, P. F.; Moore, J. W.; Holmes, J. L. Illustrated Periodic Table; J. Chem. Educ. Software 1995, 2D2.
    2. Kotz, J. C.; Young, S. Chemistry Navigator; J. Chem. Educ. Software 1995, 6C2.
    3. Banks, A. Periodic Table Videodisc, 2nd ed.; J. Chem. Educ. Software 1996, SP 1.
    4. QuickTime 3.0, Apple Computer, Inc.: 1 Infinite Loop, Cupertino, CA 95014-2084.
    5. Jacobsen, J. J.; Moore, J. W. Chemistry Comes Alive!, Volume 1; J. Chem. Educ. Software 1997, SP 18.
    6. Jacobsen, J. J.; Moore, J. W. Chemistry Comes Alive!, Volume 2; J. Chem. Educ. Software 1998, SP 21.
    7. Moore, J. W.; Jacobsen, J. J.; Hunsberger, L. R.; Gammon, S. D.; Jetzer, K. H.; Zimmerman, J. ChemDemos Videodisc; J. Chem. Educ. Software 1994, SP 8.
    8. Moore, J. W.; Jacobsen, J. J.; Jetzer, K. H.; Gilbert, G.; Mattes, F.; Phillips, D.; Lisensky, G.; Zweerink, G. ChemDemos II; J. Chem. Educ. Software 1996, SP 14.
    9. Jacobsen, J. J.; Jetzer, K. H.; Patani, N.; Zimmerman, J. Titration Techniques Videodisc; J. Chem. Educ. Software 1995, SP 9.

    JCE Software CD-ROMs. In addition to more than 100 traditional computer programs and videodiscs, JCE Software has published nine CD-ROMs and four videotapes. Recently published CDs now available include:

    • JCE CD 98
    • Solid State Resources, 2nd Edition
    • General Chemistry Collection, 2nd Edition (Student Edition)
    • Chemistry Comes Alive!, Volumes 1 and 2
    • Flying over Atoms
    Information for all CDs can be found on our WWW site.

    Ordering and Information. JCE Software is a publication of the Journal of Chemical Education. There is an order form inserted in this issue that provides prices and other ordering information. If this card is not available or if you need additional information, contact: JCE Software, University of Wisconsin-Madison, 1101 University Avenue, Madison, WI 53706-1396; phone: 608/262-5153 or 800/991-5534; fax: 608/265-8094; email: jcesoft@chem.wisc.edu. Information about all of our publications (including abstracts, descriptions, and updates) is available from our World Wide Web site: http://jchemed.chem.wisc.edu/JCESoft/

  4. Lean Mission Operations Systems Design - Using Agile and Lean Development Principles for Mission Operations Design and Development

    NASA Technical Reports Server (NTRS)

    Trimble, Jay Phillip

    2014-01-01

    The Resource Prospector Mission seeks to rove the lunar surface with an in-situ resource utilization payload in search of volatiles at a polar region. The mission operations system (MOS) will need to perform the short-duration mission while taking advantage of the near real time control that the short one-way light time to the Moon provides. To maximize our use of limited resources for the design and development of the MOS we are utilizing agile and lean methods derived from our previous experience with applying these methods to software. By using methods such as "say it then sim it" we will spend less time in meetings and more time focused on the one outcome that counts - the effective utilization of our assets on the Moon to meet mission objectives.

  5. BioNetFit: a fitting tool compatible with BioNetGen, NFsim and distributed computing environments

    PubMed Central

    Thomas, Brandon R.; Chylek, Lily A.; Colvin, Joshua; Sirimulla, Suman; Clayton, Andrew H.A.; Hlavacek, William S.; Posner, Richard G.

    2016-01-01

    Summary: Rule-based models are analyzed with specialized simulators, such as those provided by the BioNetGen and NFsim open-source software packages. Here, we present BioNetFit, a general-purpose fitting tool that is compatible with BioNetGen and NFsim. BioNetFit is designed to take advantage of distributed computing resources. This feature facilitates fitting (i.e. optimization of parameter values for consistency with data) when simulations are computationally expensive. Availability and implementation: BioNetFit can be used on stand-alone Mac, Windows/Cygwin, and Linux platforms and on Linux-based clusters running SLURM, Torque/PBS, or SGE. The BioNetFit source code (Perl) is freely available (http://bionetfit.nau.edu). Supplementary information: Supplementary data are available at Bioinformatics online. Contact: bionetgen.help@gmail.com PMID:26556387
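
    As a rough illustration of the fitting pattern the abstract describes (independent simulations farmed out to workers and scored against data), here is a minimal Python sketch; the toy model, cost function, and parameter grid are invented for the example and are not BioNetFit's actual Perl implementation.

```python
import numpy as np
from multiprocessing import Pool

# Toy "simulation": stands in for a BioNetGen/NFsim run of a rule-based model.
def simulate(k, t=np.linspace(0, 10, 50)):
    return np.exp(-k * t)

DATA = simulate(0.7) + np.random.default_rng(0).normal(0, 0.02, 50)

def cost(k):
    # Sum-of-squares distance between the simulation and the data.
    return float(np.sum((simulate(k) - DATA) ** 2))

if __name__ == "__main__":
    candidates = np.linspace(0.1, 2.0, 64)   # parameter sets to evaluate
    with Pool() as pool:                     # each evaluation is independent,
        scores = pool.map(cost, candidates)  # so they parallelize trivially
    best = candidates[int(np.argmin(scores))]
    print(f"best-fit k ~ {best:.2f}")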

  6. Application of low-noise CID imagers in scientific instrumentation cameras

    NASA Astrophysics Data System (ADS)

    Carbone, Joseph; Hutton, J.; Arnold, Frank S.; Zarnowski, Jeffrey J.; Vangorden, Steven; Pilon, Michael J.; Wadsworth, Mark V.

    1991-07-01

    CIDTEC has developed a PC-based instrumentation camera incorporating a preamplifier-per-row CID imager and a microprocessor/LCA camera controller. The camera takes advantage of CID X-Y addressability to randomly read individual pixels and potentially overlapping pixel subsets in true nondestructive (NDRO) as well as destructive readout modes. Using an oxynitride-fabricated CID and the NDRO readout technique, pixel full well and noise levels of approximately 1×10^6 and 40 electrons, respectively, were measured. Data taken from test structures indicate that noise levels (which appear to be 1/f limited) can be reduced by a factor of two by eliminating the nitride under the preamplifier gate. Due to software programmability, versatile readout capabilities, wide dynamic range, and extended UV/IR capability, this camera appears to be ideally suited for use in spectroscopy and other scientific applications.

  7. Mining biomedical images towards valuable information retrieval in biomedical and life sciences

    PubMed Central

    Ahmed, Zeeshan; Zeeshan, Saman; Dandekar, Thomas

    2016-01-01

    Biomedical images are helpful sources for scientists and practitioners in drawing significant hypotheses, exemplifying approaches and describing experimental results in published biomedical literature. In recent decades, there has been an enormous increase in the amount of heterogeneous biomedical image production and publication, which creates a need for bioimaging platforms that can extract and analyze the features, text, and content of biomedical images in order to build effective information retrieval systems. In this review, we summarize technologies related to data mining of figures. We describe and compare the potential of different approaches in terms of their developmental aspects, used methodologies, produced results, achieved accuracies and limitations. Our comparative conclusions include current challenges for bioimaging software with selective image mining, embedded text extraction and processing of complex natural language queries. PMID:27538578

  8. Sequence History Update Tool

    NASA Technical Reports Server (NTRS)

    Khanampompan, Teerapat; Gladden, Roy; Fisher, Forest; DelGuercio, Chris

    2008-01-01

    The Sequence History Update Tool performs Web-based sequence statistics archiving for Mars Reconnaissance Orbiter (MRO). Using a single UNIX command, the software takes advantage of sequencing conventions to automatically extract the needed statistics from multiple files. This information is used to populate a PHP database, which is seamlessly formatted into a dynamic Web page. The tool replaces a previous tedious and error-prone process of manually editing HTML code to construct a Web-based table. Because the tool manages all of the statistics gathering and file delivery to and from multiple data sources spread across multiple servers, there is also a considerable saving of time and effort. With the Sequence History Update Tool, what previously took minutes is now done in less than 30 seconds, and the result is a more accurate archival record of the sequence commanding for MRO.
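
    The core trick, extracting statistics from files via a naming convention and regenerating a Web table, can be sketched as follows; the filename pattern, "CMD" line convention, and directory are hypothetical stand-ins for MRO's actual sequencing conventions.

```python
import glob, re
from html import escape

# Hypothetical convention: files named <sequence_id>_<date>.seq, with one
# "CMD ..." line per command. The real MRO conventions differ.
rows = []
for path in sorted(glob.glob("sequences/*_*.seq")):
    m = re.search(r"(\w+)_(\d+)\.seq$", path)
    with open(path) as f:
        n_cmds = sum(1 for line in f if line.startswith("CMD"))
    rows.append((m.group(1), m.group(2), n_cmds))

# Render the archive as an HTML table instead of hand-editing HTML.
cells = "\n".join(
    f"<tr><td>{escape(s)}</td><td>{d}</td><td>{n}</td></tr>" for s, d, n in rows
)
print(f"<table><tr><th>Sequence</th><th>Date</th><th>Commands</th></tr>\n{cells}\n</table>")
```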

  9. Cloud computing in pharmaceutical R&D: business risks and mitigations.

    PubMed

    Geiger, Karl

    2010-05-01

    Cloud computing provides information processing power and business services, delivering these services over the Internet from centrally hosted locations. Major technology corporations aim to supply these services to every sector of the economy. Deploying business processes 'in the cloud' requires special attention to the regulatory and business risks assumed when running on both hardware and software that are outside the direct control of a company. The identification of risks at the correct service level allows a good mitigation strategy to be selected. The pharmaceutical industry can take advantage of existing risk management strategies that have already been tested in the finance and electronic commerce sectors. In this review, the business risks associated with the use of cloud computing are discussed, and mitigations achieved through knowledge from securing services for electronic commerce and from good IT practice are highlighted.

  10. cOSPREY: A Cloud-Based Distributed Algorithm for Large-Scale Computational Protein Design

    PubMed Central

    Pan, Yuchao; Dong, Yuxi; Zhou, Jingtian; Hallen, Mark; Donald, Bruce R.; Xu, Wei

    2016-01-01

    Finding the global minimum energy conformation (GMEC) of a huge combinatorial search space is the key challenge in computational protein design (CPD) problems. Traditional algorithms lack a scalable and efficient distributed design scheme, preventing researchers from taking full advantage of current cloud infrastructures. We design cloud OSPREY (cOSPREY), an extension to the widely used protein design software OSPREY, to allow the original design framework to scale to commercial cloud infrastructures. We propose several novel designs to integrate both algorithm and system optimizations, such as GMEC-specific pruning, state search partitioning, asynchronous algorithm state sharing, and fault tolerance. We evaluate cOSPREY on three different cloud platforms using different technologies and show that it can solve a number of large-scale protein design problems that have not been possible with previous approaches. PMID:27154509
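
    For orientation, the GMEC problem itself can be stated in a few lines: pick one rotamer per design position so as to minimize the sum of single-body and pairwise energies. The brute-force sketch below is only a toy; OSPREY and cOSPREY use pruning (e.g., dead-end elimination) and distributed search rather than enumeration, and the energy tables here are random placeholders.

```python
import itertools
import numpy as np

# Toy GMEC search: one rotamer per residue position, minimizing the sum
# of single-body and pairwise energies over the combinatorial space.
rng = np.random.default_rng(1)
n_pos, n_rot = 4, 3
E1 = rng.normal(size=(n_pos, n_rot))                 # single-body energies
E2 = rng.normal(size=(n_pos, n_pos, n_rot, n_rot))   # pairwise energies

def energy(conf):
    e = sum(E1[i, r] for i, r in enumerate(conf))
    e += sum(E2[i, j, conf[i], conf[j]]
             for i in range(n_pos) for j in range(i + 1, n_pos))
    return e

gmec = min(itertools.product(range(n_rot), repeat=n_pos), key=energy)
print("GMEC rotamer assignment:", gmec, "energy:", round(energy(gmec), 3))
```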

  11. The B-dot Earth Average Magnetic Field

    NASA Technical Reports Server (NTRS)

    Capo-Lugo, Pedro A.; Rakoczy, John; Sanders, Devon

    2013-01-01

    The average Earth's magnetic field is obtained from complex mathematical models based on a mean-square integral. Depending on the selection of the Earth magnetic model, the average Earth's magnetic field can have different solutions. This paper presents a simple technique that takes advantage of the damping effects of the b-dot controller and does not depend on the Earth magnetic model; it does, however, depend on the satellite's magnetic torquers, which are not taken into consideration in the known mathematical models. The solution given by this new technique can be implemented so easily that the flight software can be updated during flight, giving the control system current gains for the magnetic torquers. Finally, the technique is verified and validated using flight data from a satellite that has been in orbit for three years.
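
    The b-dot controller the technique relies on is the classic detumbling law m = -k dB/dt, saturated at the torquers' dipole limit. A minimal sketch follows; the gain and limits are chosen for illustration and are not values from the paper.

```python
import numpy as np

# Classic b-dot detumbling law: command a magnetic dipole opposing the
# rate of change of the measured field, m = -k * dB/dt, saturated at the
# torquers' maximum dipole. Gain and limit here are illustrative only.
K_GAIN = 5.0e4          # controller gain (hypothetical)
M_MAX = 10.0            # torquer dipole limit, A*m^2 (hypothetical)

def bdot_command(b_now, b_prev, dt):
    b_dot = (b_now - b_prev) / dt        # finite-difference estimate of dB/dt
    m = -K_GAIN * b_dot
    return np.clip(m, -M_MAX, M_MAX)     # respect torquer saturation

b1 = np.array([22e-6, -3e-6, 40e-6])     # magnetometer samples, tesla
b2 = np.array([21e-6, -2e-6, 41e-6])
print(bdot_command(b2, b1, dt=1.0))      # commanded dipole per axis
```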

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurth, Thorsten; Pochinsky, Andrew; Sarje, Abhinav

    Practitioners of lattice QCD/QFT have been some of the primary pioneer users of the state-of-the-art high-performance computing systems, and contribute towards the stress tests of such new machines as soon as they become available. As with all aspects of high-performance computing, I/O is becoming an increasingly specialized component of these systems. In order to take advantage of the latest available high-performance I/O infrastructure, to ensure reliability and backwards compatibility of data files, and to help unify the data structures used in lattice codes, we have incorporated parallel HDF5 I/O into the SciDAC-supported USQCD software stack. Here we present the design and implementation of this I/O framework. Our HDF5 implementation outperforms optimized QIO at the 10-20% level and leaves room for further improvement by utilizing appropriate dataset chunking.
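
    The pattern described, parallel HDF5 with dataset chunking aligned to each rank's data, can be sketched with the standard h5py/mpi4py interface. This is the generic pattern, not the USQCD implementation, and it assumes an MPI-enabled HDF5/h5py build.

```python
# Standard parallel-HDF5 pattern with h5py + mpi4py; run with e.g.
# `mpiexec -n 4 python io_demo.py` (filename and shapes are illustrative).
import numpy as np
import h5py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()
n_local = 1024                                # rows owned by this rank

with h5py.File("lattice.h5", "w", driver="mpio", comm=comm) as f:
    # One shared dataset; the chunk shape matches each rank's slab so the
    # chunking is aligned with the access pattern (cf. the QIO comparison).
    dset = f.create_dataset("field", (size * n_local, 4),
                            dtype="f8", chunks=(n_local, 4))
    data = np.full((n_local, 4), float(rank))
    dset[rank * n_local:(rank + 1) * n_local, :] = data  # per-rank slab write
```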

  13. Computer codes for checking, plotting and processing of neutron cross-section covariance data and their application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sartori, E.; Roussin, R.W.

    This paper presents a brief review of computer codes concerned with checking, plotting, processing and using covariances of neutron cross-section data. It concentrates on those available from the computer code information centers of the United States and the OECD/Nuclear Energy Agency. Emphasis is placed also on codes using covariances for specific applications such as uncertainty analysis, data adjustment and data consistency analysis. Recent evaluations contain neutron cross-section covariance information for all isotopes of major importance for technological applications of nuclear energy. It is therefore important that the available software tools needed for taking advantage of this information are widely known, as they permit the determination of better safety margins and allow the optimization of more economical designs of nuclear energy systems.

  14. 78 FR 31916 - Increasing Market and Planning Efficiency Through Improved Software; Supplemental Agenda Notice

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-28

    ... Market and Planning Efficiency Through Improved Software; Supplemental Agenda Notice Take notice that... for increasing real-time and day-ahead market efficiency through improved software. A detailed agenda..., the software industry, government, research centers and academia and is intended to build on the...

  15. 77 FR 19280 - Increasing Market and Planning Efficiency Through Improved Software; Notice of Technical...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-30

    ... Market and Planning Efficiency Through Improved Software; Notice of Technical Conference: Increasing Real-Time and Day- Ahead Market Efficiency Through Improved Software Take notice that Commission staff will...-time and day-ahead market efficiency through improved software. A detailed agenda with the list of and...

  16. 76 FR 28022 - Increasing Market and Planning Efficiency Through Improved Software; Notice of Technical...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-13

    ... Market and Planning Efficiency Through Improved Software; Notice of Technical Conference: Increasing Real-Time and Day- Ahead Market Efficiency Through Improved Software Take notice that Commission staff will... for increasing real-time and day-ahead market efficiency through improved software. This conference...

  17. TCP/IP Interface for the Satellite Orbit Analysis Program (SOAP)

    NASA Technical Reports Server (NTRS)

    Carnright, Robert; Stodden, David; Coggi, John

    2009-01-01

    The Transmission Control Protocol/Internet Protocol (TCP/IP) interface for the Satellite Orbit Analysis Program (SOAP) provides the means for the software to establish real-time interfaces with other software. Such interfaces can operate between two programs, either on the same computer or on different computers joined by a network. The SOAP TCP/IP module employs a client/server interface where SOAP is the server and other applications can be clients. Real-time interfaces between software offer a number of advantages over embedding all of the common functionality within a single program. One advantage is that they allow each program to divide the computational labor between processors or computers running the separate applications. Another is that each program can provide its own domain of expertise, which other programs can then draw upon.
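
    A minimal client/server exchange over TCP illustrates the kind of interface described; the message strings and port below are placeholders, and SOAP's actual wire protocol is not shown.

```python
import socket, threading, time

# Minimal TCP client/server pattern: the "server" stands in for SOAP,
# the "client" for another application requesting data in real time.
def server(port=5555):
    with socket.socket() as srv:
        srv.bind(("localhost", port))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            request = conn.recv(1024).decode()       # e.g. a state query
            conn.sendall(f"ack:{request}".encode())  # server's reply

threading.Thread(target=server, daemon=True).start()
time.sleep(0.2)                                      # let the server start listening

with socket.socket() as cli:                         # the client application
    cli.connect(("localhost", 5555))
    cli.sendall(b"get_orbit_state")
    print(cli.recv(1024).decode())                   # -> ack:get_orbit_state
```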

  18. Using Visual Basic to Teach Programming for Geographers.

    ERIC Educational Resources Information Center

    Slocum, Terry A.; Yoder, Stephen C.

    1996-01-01

    Outlines reasons why computer programming should be taught to geographers. These include experience using macro (scripting) languages and sophisticated visualization software, and developing a deeper understanding of general hardware and software capabilities. Discusses the distinct advantages and few disadvantages of the programming language…

  19. INTERFACING SAS TO ORACLE IN THE UNIX ENVIRONMENT

    EPA Science Inventory

    SAS is an EPA standard data and statistical analysis software package, while ORACLE is EPA's standard database management system software package. ORACLE has the advantage over SAS in data retrieval and storage capabilities but has limited data and statistical analysis capability....

  20. Doclet To Synthesize UML

    NASA Technical Reports Server (NTRS)

    Barry, Matthew R.; Osborne, Richard N.

    2005-01-01

    The RoseDoclet computer program extends the capability of Java doclet software to automatically synthesize Unified Modeling Language (UML) content from Java language source code. [Doclets are Java-language programs that use the doclet application programming interface (API) to specify the content and format of the output of Javadoc. Javadoc is a program, originally designed to generate API documentation from Java source code, now also useful as an extensible engine for processing Java source code.] RoseDoclet takes advantage of Javadoc comments and tags already in the source code to produce a UML model of that code. RoseDoclet applies the doclet API to create a doclet passed to Javadoc. The Javadoc engine applies the doclet to the source code, emitting the output format specified by the doclet. RoseDoclet emits a Rose model file and populates it with fully documented packages, classes, methods, variables, and class diagrams identified in the source code. The way in which UML models are generated can be controlled by use of new Javadoc comment tags that RoseDoclet provides. The advantage of using RoseDoclet is that Javadoc documentation becomes leveraged for two purposes: documenting the as-built API and keeping the design documentation up to date.

  1. In situ Raman mapping of art objects

    PubMed Central

    Brondeel, Ph.; Moens, L.; Vandenabeele, P.

    2016-01-01

    Raman spectroscopy has grown to be one of the techniques of interest for the investigation of art objects. The approach has several advantageous properties, and the non-destructive character of the technique has allowed it to be used for in situ investigations. However, compared with laboratory approaches, it would be useful to take advantage of the small spectral footprint of the technique and use Raman spectroscopy to study the spatial distribution of different compounds. In this work, an in situ Raman mapping system is developed to be able to relate chemical information to its spatial distribution. Challenges for the development are discussed, including the need for stable positioning and proper data treatment. To avoid focusing problems, nineteenth-century porcelain cards are used to test the system. This work focuses mainly on the post-processing of the large dataset, which proceeds in four steps: (i) importing the data into the software; (ii) visualization of the dataset; (iii) extraction of the variables; and (iv) creation of a Raman image. It is shown that despite the challenging task of developing the full in situ Raman mapping system, the first steps are very promising. This article is part of the themed issue ‘Raman spectroscopy in art and archaeology’. PMID:27799424
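
    Steps (iii) and (iv), extracting one variable from each spectrum and arranging the values on the scan grid, reduce to a few lines of array manipulation. The grid size, band limits, and random stand-in spectra below are illustrative, not data from the paper.

```python
import numpy as np

# Sketch of steps (iii) and (iv): for each mapped point, extract one
# variable (here, peak intensity in a chosen band) and arrange the values
# on the scan grid to form a Raman image.
nx, ny, n_wn = 40, 30, 1024                 # scan grid and spectral axis
wavenumbers = np.linspace(200, 1800, n_wn)  # cm^-1
spectra = np.random.default_rng(2).random((ny, nx, n_wn))  # stand-in dataset

band = (wavenumbers > 950) & (wavenumbers < 1010)  # band of a target pigment
image = spectra[:, :, band].max(axis=2)            # peak intensity per point

print(image.shape)   # (30, 40): one intensity value per mapped position
```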

  2. Semi-Autonomous Vehicle Project

    NASA Technical Reports Server (NTRS)

    Stewart, Christopher

    2016-01-01

    The primary objective this summer is "evaluating standards for wireless architecture for the Internet of Things". The Internet of Things is the network of physical objects or "things" embedded with electronics, software, sensors and network connectivity, which enables these objects to collect and exchange data and make decisions based on said data. This was accomplished by creating a semi-autonomous vehicle that takes advantage of multiple sensors, cameras, and onboard computers, combined with a mesh network that enabled communication across large distances with little to no interruption. The mesh network took advantage of Disruption Tolerant Networking (DTN), which according to NASA is the new communications protocol that is "the first step towards interplanetary internet." The use of DTN comes from the fact that it will store information if an interruption in communications is detected and even forward that information via other relays within range so that the data is not lost. This translates well into the project because as the car moves further away from whatever is sending it commands (in this case a joystick), the information can still be forwarded to the car with little to no loss of information thanks to the mesh nodes around the driving area.

  3. TAKING SCIENTIFIC ADVANTAGE OF A DISASTROUS OIL SPILL

    EPA Science Inventory

    On 19 January 1996, the North Cape barge ran aground on Moonstone Beach in southern Rhode Island, releasing 828,000 gallons of refined oil. This opportunistic study was designed to take scientific advantage of the most severely affected seabird, the common loon (Gavia immer) . As...

  4. Contribution of 3D inversion of Electrical Resistivity Tomography data applied to volcanic structures

    NASA Astrophysics Data System (ADS)

    Portal, Angélie; Fargier, Yannick; Lénat, Jean-François; Labazuy, Philippe

    2016-04-01

    The electrical resistivity tomography (ERT) method, initially developed for environmental and engineering exploration, is now commonly used for imaging geological structures. Such structures can present complex characteristics that conventional 2D inversion processes cannot fully capture. Here we present a new 3D inversion algorithm named EResI, first developed for levee investigation and now applied to the study of a complex lava dome (the Puy de Dôme volcano, France). The EResI algorithm is based on a conventional regularized Gauss-Newton inversion scheme and a 3D non-structured discretization of the model (a double-grid method based on tetrahedrons). This discretization accurately models the topography of the investigated structure (without a mesh-deformation procedure) and also permits precise location of the electrodes. Moreover, we demonstrate that a fully 3D unstructured discretization limits the number of inversion cells and is better adapted to the resolution capacity of tomography than a structured discretization. This study shows that a 3D inversion with a non-structured parametrization has several advantages over classical 2D inversion. First, 2D inversion produces artefacts due to 3D effects (3D topography, 3D internal resistivity). Second, the ability to align electrodes along an axis in the field (for 2D surveys) depends on field constraints such as topography; the 2D assumption made by 2.5D inversion software prevents it from modeling electrodes that lie off this axis, leading to artefacts in the inversion result. Third, the mesh-deformation techniques used to model topography accurately in 2D software with structured discretizations (e.g., Res2dinv) are unsuitable for strong topography (>60%) and introduce small computational errors. A wide geophysical survey was carried out on the Puy de Dôme volcano, resulting in 12 ERT profiles with approximately 800 electrodes. We performed two processing stages, inverting each profile independently in 2D (RES2DINV software) and the complete data set in 3D (EResI). The comparison of the 3D inversion results with those obtained through a conventional 2D inversion process showed that EResI accurately takes random electrode positioning into account and reduces artefacts in the inversion models caused by positioning errors off the profile axis. The comparison also highlighted the advantage of integrating several ERT lines to compute 3D models of complex volcanic structures. Finally, the resulting 3D model allows a better interpretation of the Puy de Dôme volcano.
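
    The regularized Gauss-Newton scheme mentioned above has a compact generic form: each iteration solves (JᵀJ + λLᵀL)Δm = Jᵀr − λLᵀLm for the model update. The sketch below shows that generic form on a toy linear problem (EResI's own code is not reproduced here); forward model, Jacobian, and regularizer are placeholders.

```python
import numpy as np

# Generic regularized Gauss-Newton step: minimize
#   ||d - f(m)||^2 + lam * ||L m||^2   over model parameters m.
def gauss_newton_step(m, d, f, jac, L, lam):
    r = d - f(m)                          # data residual
    J = jac(m)                            # sensitivity (Jacobian) matrix
    A = J.T @ J + lam * (L.T @ L)         # regularized normal matrix
    g = J.T @ r - lam * (L.T @ L) @ m     # gradient-side term
    return m + np.linalg.solve(A, g)

# Tiny linear toy problem: f(m) = G m, so a single step recovers the model.
G = np.array([[1.0, 0.5], [0.2, 1.0], [0.3, 0.3]])
m_true = np.array([2.0, -1.0])
d = G @ m_true
L = np.eye(2)                             # identity (Tikhonov) regularizer
m = gauss_newton_step(np.zeros(2), d, lambda m: G @ m, lambda m: G, L, lam=1e-6)
print(m)                                  # ~ [ 2., -1.]
```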

  5. Optimal Propellant Maneuver Flight Demonstrations on ISS

    NASA Technical Reports Server (NTRS)

    Bhatt, Sagar; Bedrossian, Nazareth; Longacre, Kenneth; Nguyen, Louis

    2013-01-01

    In this paper, the first ever flight demonstrations of the Optimal Propellant Maneuver (OPM), a method of propulsive rotational state transition for spacecraft controlled using thrusters, are presented for the International Space Station (ISS). On August 1, 2012, two ISS reorientations of about 180° each were performed using OPMs. These maneuvers were in preparation for the same-day launch and rendezvous of a Progress vehicle, also a first for ISS visiting vehicles. The first maneuver used 9.7 kg of propellant, whereas the second used 10.2 kg. Identical maneuvers performed without using OPMs would have used approximately 151.1 kg and 150.9 kg, respectively. The OPM method uses a pre-planned attitude command trajectory to accomplish a rotational state transition. The trajectory is designed to take advantage of the complete nonlinear system dynamics. The trajectory choice directly influences the cost of the maneuver, in this case, propellant. For example, while an eigenaxis maneuver is kinematically the shortest path between two orientations, following that path requires overcoming the nonlinear system dynamics, thereby increasing the cost of the maneuver. The eigenaxis path is used for ISS maneuvers using thrusters. By considering a longer angular path, the path dependence of the system dynamics can be exploited to reduce the cost. The benefits of OPM for the ISS include not only reduced lifetime propellant use, but also reduced loads, erosion, and contamination from thrusters due to fewer firings. Another advantage of the OPM is that it does not require ISS flight software modifications, since it is a set of commands tailored to the specific attitude control architecture. The OPM takes advantage of the existing ISS control system architecture for propulsive rotation, called the USTO control mode. USTO was originally developed to provide ISS-Orbiter stack attitude control capability for a contingency tile-repair scenario, where the Orbiter is maneuvered using its robotic manipulator relative to the ISS. Since 2005 USTO has been used for nominal ISS operations.

  6. Real-time immune-inspired optimum state-of-charge trajectory estimation using upcoming route information preview and neural networks for plug-in hybrid electric vehicles fuel economy

    NASA Astrophysics Data System (ADS)

    Mozaffari, Ahmad; Vajedi, Mahyar; Azad, Nasser L.

    2015-06-01

    The main proposition of the current investigation is to develop a computational intelligence-based framework that can be used for the real-time estimation of the optimum battery state-of-charge (SOC) trajectory in plug-in hybrid electric vehicles (PHEVs). The estimated SOC trajectory can then be employed for intelligent power management to significantly improve the fuel economy of the vehicle. The devised intelligent SOC trajectory builder takes advantage of a preview of upcoming route information to achieve the lowest possible total cost of electricity and fossil fuel. To reduce the complexity of real-time optimization, the authors propose an immune system-based clustering approach that categorizes the route information into a predefined number of segments. The intelligent real-time optimizer is likewise inspired by interactions in biological immune systems, and is called the artificial immune algorithm (AIA). The objective function of the optimizer is derived from a computationally efficient artificial neural network (ANN), which is trained on a database obtained from a high-fidelity model of the vehicle built in the Autonomie software. The simulation results demonstrate that the integration of the immune-inspired clustering tool, AIA and ANN results in a powerful framework that can generate a near-global-optimum SOC trajectory for the baseline vehicle, that is, the Toyota Prius PHEV. The outcomes of the current investigation show that by taking advantage of intelligent approaches, it is possible to design a computationally efficient and powerful SOC trajectory builder for the intelligent power management of PHEVs.

  7. Automated daily processing of more than 1000 ground-based GPS receivers for studying intense ionospheric storms

    NASA Astrophysics Data System (ADS)

    Komjathy, Attila; Sparks, Lawrence; Wilson, Brian D.; Mannucci, Anthony J.

    2005-12-01

    As the number of ground-based and space-based receivers tracking the Global Positioning System (GPS) satellites steadily increases, it is becoming possible to monitor changes in the ionosphere continuously and on a global scale with unprecedented accuracy and reliability. As of August 2005, there are more than 1000 globally distributed dual-frequency GPS receivers available using publicly accessible networks including, for example, the International GPS Service and the continuously operating reference stations. To take advantage of the vast amount of GPS data, researchers use a number of techniques to estimate satellite and receiver interfrequency biases and the total electron content (TEC) of the ionosphere. Most techniques estimate vertical ionospheric structure and, simultaneously, hardware-related biases treated as nuisance parameters. These methods often are limited to 200 GPS receivers and use a sequential least squares or Kalman filter approach. The biases are later removed from the measurements to obtain unbiased TEC. In our approach to calibrating GPS receiver and transmitter interfrequency biases we take advantage of all available GPS receivers using a new processing algorithm based on the Global Ionospheric Mapping (GIM) software developed at the Jet Propulsion Laboratory. This new capability is designed to estimate receiver biases for all stations. We solve for the instrumental biases by modeling the ionospheric delay and removing it from the observation equation using precomputed GIM maps. The precomputed GIM maps rely on 200 globally distributed GPS receivers to establish the "background" used to model the ionosphere at the remaining 800 GPS sites.
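
    The TEC estimation this work builds on starts from the standard dual-frequency geometry-free combination, with the interfrequency biases removed before the TEC is used. A sketch follows; the bias values are placeholders, not calibrated numbers.

```python
# Standard dual-frequency TEC computation: the ionospheric delay at
# frequency f is 40.3*TEC/f^2, so the P2-P1 difference isolates TEC.
F1, F2 = 1.57542e9, 1.22760e9        # GPS L1/L2 frequencies, Hz

def slant_tec(p1, p2, b_rx=0.0, b_sat=0.0):
    """TEC in TECU (1 TECU = 1e16 el/m^2) from pseudoranges p1, p2 (meters),
    after removing receiver and satellite interfrequency biases (meters)."""
    geometry_free = (p2 - p1) - b_rx - b_sat
    tec = geometry_free * (F1**2 * F2**2) / (40.3 * (F1**2 - F2**2))
    return tec / 1e16

# 5 m of differential delay with 1.2 m of combined bias -> ~36 TECU:
print(round(slant_tec(p1=22_000_000.0, p2=22_000_005.0, b_rx=0.8, b_sat=0.4), 1))
```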

  8. Open source molecular modeling.

    PubMed

    Pirhadi, Somayeh; Sunseri, Jocelyn; Koes, David Ryan

    2016-09-01

    The success of molecular modeling and computational chemistry efforts is, by definition, dependent on quality software applications. Open source software development provides many advantages to users of modeling applications, not the least of which is that the software is free and completely extendable. In this review we categorize, enumerate, and describe available open source software packages for molecular modeling and computational chemistry. An updated online version of this catalog can be found at https://opensourcemolecularmodeling.github.io.

  9. Eleven quick tips for architecting biomedical informatics workflows with cloud computing.

    PubMed

    Cole, Brian S; Moore, Jason H

    2018-03-01

    Cloud computing has revolutionized the development and operations of hardware and software across diverse technological arenas, yet academic biomedical research has lagged behind despite the numerous and weighty advantages that cloud computing offers. Biomedical researchers who embrace cloud computing can reap rewards in cost reduction, decreased development and maintenance workload, increased reproducibility, ease of sharing data and software, enhanced security, horizontal and vertical scalability, high availability, a thriving technology partner ecosystem, and much more. Despite these advantages that cloud-based workflows offer, the majority of scientific software developed in academia does not utilize cloud computing and must be migrated to the cloud by the user. In this article, we present 11 quick tips for architecting biomedical informatics workflows on compute clouds, distilling knowledge gained from experience developing, operating, maintaining, and distributing software and virtualized appliances on the world's largest cloud. Researchers who follow these tips stand to benefit immediately by migrating their workflows to cloud computing and embracing the paradigm of abstraction.

  10. Eleven quick tips for architecting biomedical informatics workflows with cloud computing

    PubMed Central

    Moore, Jason H.

    2018-01-01

    Cloud computing has revolutionized the development and operations of hardware and software across diverse technological arenas, yet academic biomedical research has lagged behind despite the numerous and weighty advantages that cloud computing offers. Biomedical researchers who embrace cloud computing can reap rewards in cost reduction, decreased development and maintenance workload, increased reproducibility, ease of sharing data and software, enhanced security, horizontal and vertical scalability, high availability, a thriving technology partner ecosystem, and much more. Despite these advantages that cloud-based workflows offer, the majority of scientific software developed in academia does not utilize cloud computing and must be migrated to the cloud by the user. In this article, we present 11 quick tips for architecting biomedical informatics workflows on compute clouds, distilling knowledge gained from experience developing, operating, maintaining, and distributing software and virtualized appliances on the world’s largest cloud. Researchers who follow these tips stand to benefit immediately by migrating their workflows to cloud computing and embracing the paradigm of abstraction. PMID:29596416

  11. Markov Chains For Testing Redundant Software

    NASA Technical Reports Server (NTRS)

    White, Allan L.; Sjogren, Jon A.

    1990-01-01

    A preliminary design is developed for a validation experiment that addresses problems unique to assuring extremely high quality of multiple-version programs in process-control software. The approach takes into account the inertia of the controlled system, in the sense that it takes more than one failure of the control program to cause the controlled system to fail. The verification procedure consists of two steps: experimentation (numerical simulation) and computation, with a Markov model for each step.
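
    The inertia idea can be made concrete with a toy simulation: the controlled plant fails only after k consecutive control-cycle failures. The failure probability, streak length, and trial counts below are illustrative, not values from the NASA experiment.

```python
import random

# Toy version of the inertia idea: the plant fails only when the control
# program fails on k consecutive control cycles.
def system_fails(p_fail=0.05, k=3, n_cycles=10_000, rng=random.Random(0)):
    streak = 0
    for _ in range(n_cycles):
        if rng.random() < p_fail:
            streak += 1
            if streak == k:        # inertia exhausted: plant failure
                return True
        else:
            streak = 0             # one good cycle resets the plant
    return False

# Monte Carlo estimate of the probability of system failure per mission.
trials = 400
est = sum(system_fails() for _ in range(trials)) / trials
print(f"estimated probability of system failure: {est:.3f}")
```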

  12. [EpiData: the natural heir to EpiInfo 6?].

    PubMed

    Bohigas, Pedro Arias; Lauritsen, Jens L

    2007-01-01

    EpiData is an epidemiological software package developed by the EpiData Association (www.epidata.dk). Following the EpiInfo 6 philosophy, EpiData offers all the advantages of EpiInfo 6: simplicity, applicability, and modest operating-system and communication requirements, extending them with a clear focus on data quality and documentation, plus the advantages that the Windows operating system has for many users. The aim of this note is to introduce potential users to the strengths and limitations of EpiData, a software package that could within a short time become the equivalent of what EpiInfo 6 was a few years ago.

  13. The Relevance of Software Development Education for Students

    ERIC Educational Resources Information Center

    Liebenberg, Janet; Huisman, Magda; Mentz, Elsa

    2015-01-01

    Despite a widely-acknowledged shortage of software developers, and reports of a gap between industry needs and software education, the possible gap between students' needs and software development education has not been explored in detail. In their university education, students want to take courses and carry out projects that clearly relate to…

  14. Evolution of Flexible Multibody Dynamics for Simulation Applications Supporting Human Spaceflight

    NASA Technical Reports Server (NTRS)

    Huynh, An; Brain, Thomas A.; MacLean, John R.; Quiocho, Leslie J.

    2016-01-01

    During the course of transition from the Space Shuttle and International Space Station programs to the Orion and Journey to Mars exploration programs, a generic flexible multibody dynamics formulation and associated software implementation has evolved to meet an ever changing set of requirements at the NASA Johnson Space Center (JSC). Challenging problems related to large transitional topologies and robotic free-flyer vehicle capture/ release, contact dynamics, and exploration missions concept evaluation through simulation (e.g., asteroid surface operations) have driven this continued development. Coupled with this need is the requirement to oftentimes support human spaceflight operations in real-time. Moreover, it has been desirable to allow even more rapid prototyping of on-orbit manipulator and spacecraft systems, to support less complex infrastructure software for massively integrated simulations, to yield further computational efficiencies, and to take advantage of recent advances and availability of multi-core computing platforms. Since engineering analysis, procedures development, and crew familiarity/training for human spaceflight is fundamental to JSC's charter, there is also a strong desire to share and reuse models in both the non-realtime and real-time domains, with the goal of retaining as much multibody dynamics fidelity as possible. Three specific enhancements are reviewed here: (1) linked list organization to address large transitional topologies, (2) body level model order reduction, and (3) parallel formulation/implementation. This paper provides a detailed overview of these primary updates to JSC's flexible multibody dynamics algorithms as well as a comparison of numerical results to previous formulations and associated software.

  15. Reconfigurable HIL Testing of Earth Satellites

    NASA Technical Reports Server (NTRS)

    2008-01-01

    In recent years, hardware-in-the-loop (HIL) testing has carved a strong niche in several industries, such as automotive, aerospace, telecom, and consumer electronics. As desktop computers have realized gains in speed, memory size, and data storage capacity, hardware/software platforms have evolved into high-performance, deterministic HIL platforms capable of hosting the most demanding applications for testing components and subsystems. Using simulation software to emulate the digital and analog I/O signals of system components, engineers of all disciplines can now test new systems in realistic environments to evaluate their function and performance prior to field deployment. Within the aerospace industry, space-borne satellite systems are arguably some of the most demanding in terms of their requirement for custom engineering and testing. Typically, spacecraft are built one or a few at a time to fulfill a space science or defense mission. In contrast to other industries that can amortize the cost of HIL systems over thousands, even millions of units, spacecraft HIL systems have been built as one-of-a-kind solutions, expensive in terms of schedule, cost, and risk, to assure satellite and spacecraft systems reliability. The focus of this paper is to present a new approach to HIL testing for spacecraft systems that takes advantage of a highly flexible hardware/software architecture based on National Instruments PXI reconfigurable hardware and virtual instruments developed using LabVIEW. This new approach to HIL is based on a multistage/multimode spacecraft bus emulation development model called Reconfigurable Hardware In-the-Loop or RHIL.

  16. Standardization: Hardware and Software Standardization Can Reduce Costs and Save Time

    ERIC Educational Resources Information Center

    Brooks-Young, Susan

    2005-01-01

    Sadly, technical support doesn't come cheap. One money-saving strategy that's gained popularity among school technicians is equipment and software standardization. When it works, standardization can be very effective. However, standardization has its drawbacks. This article discusses the advantages and disadvantages of standardization.

  17. What's New in Software? Hot New Tool: The Hypertext.

    ERIC Educational Resources Information Center

    Hedley, Carolyn N.

    1989-01-01

    This article surveys recent developments in hypertext software, a highly interactive nonsequential reading/writing/database approach to research and teaching that allows paths to be created through related materials including text, graphics, video, and animation sources. Described are uses, advantages, and problems of hypertext. (PB)

  18. The Benefits of Multimedia Computer Software for Students with Disabilities.

    ERIC Educational Resources Information Center

    Green, Douglas W.

    This paper assesses the current state of research and informed opinion on the benefits of multimedia computer software for students with disabilities. Topics include: a definition of multimedia; advantages of multimedia; Multiple Intelligence Theory, which states that intellectual abilities consist of seven components; motivation and behavior…

  19. Evaluation of drug interaction microcomputer software: comparative study.

    PubMed

    Poirier, T I; Giudici, R

    1991-01-01

    Twelve drug interaction microcomputer software programs were evaluated and compared using general and specific criteria. This article summarizes and compares the features, ratings, advantages, and disadvantages of each program. Features of an ideal drug interaction program are noted. Recommended programs based on three price ranges are suggested.

  20. What Librarians Still Don't Know about Free Software

    ERIC Educational Resources Information Center

    Chudnov, Daniel

    2009-01-01

    Free software isn't about cost, and it isn't about hype, it isn't about taking business away from vendors. It's about four kinds of freedom--the freedom to use the software for any purpose, the freedom to study how the software works, the freedom to modify the software to adapt it to one's needs, and the freedom to copy and share copies of the…

  1. Near-Earth Object Survey Simulation Software

    NASA Astrophysics Data System (ADS)

    Naidu, Shantanu P.; Chesley, Steven R.; Farnocchia, Davide

    2017-10-01

    There is a significant interest in Near-Earth objects (NEOs) because they pose an impact threat to Earth, offer valuable scientific information, and are potential targets for robotic and human exploration. The number of NEO discoveries has been rising rapidly over the last two decades with over 1800 being discovered last year, making the total number of known NEOs >16000. Pan-STARRS and the Catalina Sky Survey are currently the most prolific NEO surveys, having discovered >1600 NEOs between them in 2016. As next generation surveys such as Large Synoptic Survey Telescope (LSST) and the proposed Near-Earth Object Camera (NEOCam) become operational in the next decade, the discovery rate is expected to increase tremendously. Coordination between various survey telescopes will be necessary in order to optimize NEO discoveries and create a unified global NEO discovery network. We are collaborating on a community-based, open-source software project to simulate asteroid surveys to facilitate such coordination and develop strategies for improving discovery efficiency. Our effort so far has focused on development of a fast and efficient tool capable of accepting user-defined asteroid population models and telescope parameters such as a list of pointing angles and camera field-of-view, and generating an output list of detectable asteroids. The software takes advantage of the widely used and tested SPICE library and architecture developed by NASA’s Navigation and Ancillary Information Facility (Acton, 1996) for saving and retrieving asteroid trajectories and camera pointing. Orbit propagation is done using OpenOrb (Granvik et al. 2009) but future versions will allow the user to plug in a propagator of their choice. The software allows the simulation of both ground-based and space-based surveys. Performance is being tested using the Grav et al. (2011) asteroid population model and the LSST simulated survey “enigma_1189”.
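
    The detection step at the heart of such a simulator reduces to a pointing-geometry test: an asteroid is in frame when its direction lies within half the field of view of the camera boresight. The sketch below uses made-up vectors and an LSST-like FOV; a real run would take both from propagated orbits and the pointing list.

```python
import numpy as np

# Geometric core of a survey simulator's detection step: the asteroid is
# "in frame" when the angle between the camera boresight and the unit
# vector toward the asteroid is within half the field of view.
def in_fov(boresight, target_dir, fov_deg):
    cos_sep = np.dot(boresight, target_dir) / (
        np.linalg.norm(boresight) * np.linalg.norm(target_dir))
    return np.degrees(np.arccos(np.clip(cos_sep, -1.0, 1.0))) <= fov_deg / 2

boresight = np.array([1.0, 0.0, 0.0])            # camera pointing (made up)
asteroids = np.array([[0.999, 0.02, 0.0],        # near the boresight
                      [0.50, 0.86, 0.0]])        # far off-axis
print([in_fov(boresight, a, fov_deg=3.5) for a in asteroids])  # [True, False]
```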

  2. Experiences in improving the state of the practice in verification and validation of knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Culbert, Chris; French, Scott W.; Hamilton, David

    1994-01-01

    Knowledge-based systems (KBS's) are in general use in a wide variety of domains, both commercial and government. As reliance on these types of systems grows, the need to assess their quality and validity reaches critical importance. As with any software, the reliability of a KBS can be directly attributed to the application of disciplined programming and testing practices throughout the development life-cycle. However, there are some essential differences between conventional software and KBSs, both in construction and use. The identification of these differences affect the verification and validation (V&V) process and the development of techniques to handle them. The recognition of these differences is the basis of considerable on-going research in this field. For the past three years IBM (Federal Systems Company - Houston) and the Software Technology Branch (STB) of NASA/Johnson Space Center have been working to improve the 'state of the practice' in V&V of Knowledge-based systems. This work was motivated by the need to maintain NASA's ability to produce high quality software while taking advantage of new KBS technology. To date, the primary accomplishment has been the development and teaching of a four-day workshop on KBS V&V. With the hope of improving the impact of these workshops, we also worked directly with NASA KBS projects to employ concepts taught in the workshop. This paper describes two projects that were part of this effort. In addition to describing each project, this paper describes problems encountered and solutions proposed in each case, with particular emphasis on implications for transferring KBS V&V technology beyond the NASA domain.

  3. Improvements to NASA's Debris Assessment Software

    NASA Technical Reports Server (NTRS)

    Opiela, J.; Johnson, Nicholas L.

    2007-01-01

    NASA's Debris Assessment Software (DAS) has been substantially revised and expanded. DAS is designed to assist NASA programs in performing orbital debris assessments, as described in NASA's Guidelines and Assessment Procedures for Limiting Orbital Debris. The extensive upgrade of DAS was undertaken to reflect changes in the debris mitigation guidelines, to incorporate recommendations from DAS users, and to take advantage of recent software capabilities for greater user utility. DAS 2.0 includes an updated environment model and enhanced orbital propagators and reentry-survivability models. The ORDEM96 debris environment model has been replaced by ORDEM2000 in DAS 2.0, which is also designed to accept anticipated revisions to the environment definition. Numerous upgrades have also been applied to the assessment of human casualty potential due to reentering debris. Routines derived from the Object Reentry Survival Analysis Tool, Version 6 (ORSAT 6), determine which objects are assessed to survive reentry, and the resulting risk of human casualty is calculated directly based upon the orbital inclination and a future world population database. When evaluating reentry risks, the user may enter up to 200 unique hardware components for each launched object, in up to four nested levels. This last feature allows the software to more accurately model components that are exposed below the initial breakup altitude. The new DAS 2.0 provides an updated set of tools for users to assess their mission's compliance with the NASA Safety Standard and does so with a clear and easy-to-understand interface. The new native Microsoft Windows graphical user interface (GUI) is a vast improvement over the previous DOS-based interface. In the new version, functions are more clearly laid out, and the GUI includes the standard Windows-style Help functions. The underlying routines within the DAS code are also improved.

  4. A Language for Specifying Compiler Optimizations for Generic Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Willcock, Jeremiah J.

    2007-01-01

    Compiler optimization is important to software performance, and modern processor architectures make optimization even more critical. However, many modern software applications use libraries providing high levels of abstraction. Such libraries often hinder effective optimization — the libraries are difficult to analyze using current compiler technology. For example, high-level libraries often use dynamic memory allocation and indirectly expressed control structures, such as iterator-based loops. Programs using these libraries often cannot achieve an optimal level of performance. On the other hand, software libraries have also been recognized as potentially aiding in program optimization. One proposed implementation of library-based optimization is to allow the library author, or a library user, to define custom analyses and optimizations. Only limited systems have been created to take advantage of this potential, however. One problem in creating a framework for defining new optimizations and analyses is how users are to specify them: implementing them by hand inside a compiler is difficult and prone to errors. Thus, a domain-specific language for library-based compiler optimizations would be beneficial. Many optimization specification languages have appeared in the literature, but they tend to be either limited in power or unnecessarily difficult to use. Therefore, I have designed, implemented, and evaluated the Pavilion language for specifying program analyses and optimizations, designed for library authors and users. These analyses and optimizations can be based on the implementation of a particular library, its use in a specific program, or on the properties of a broad range of types, expressed through concepts. The new system is intended to provide a high level of expressiveness, even though the intended users are unlikely to be compiler experts.

  5. VOC Emission Reduction Study at the Hill Air Force Base Building 515 Painting Facility

    DTIC Science & Technology

    1990-09-01

    occurs during painting. A system for decreasing the flow to a downstream VOC emission control device can be designed that takes advantage of this...paint application process. A flow-reducing ventilation system that takes advantage of this operating characteristic can be designed in which the...flow from the second duct is vented to a VOC emission control device. The advantage of this system is that the flow rate to a VOC emission control

  6. Managing Digital Archives Using Open Source Software Tools

    NASA Astrophysics Data System (ADS)

    Barve, S.; Dongare, S.

    2007-10-01

    This paper describes the use of open source software tools such as MySQL and PHP for creating database-backed websites. Such websites offer many advantages over ones built from static HTML pages. This paper discusses how OSS tools are used and their benefits, and how, after the successful implementation of these tools, the library took the initiative to implement an institutional repository using the DSpace open source software.

  7. An Experimental Investigation of Computer Program Development Approaches and Computer Programming Metrics.

    DTIC Science & Technology

    1979-12-01

    team programming in reducing software development costs relative to ad hoc approaches and improving software product quality relative to...are interpreted as demonstrating the advantages of disciplined team programming in reducing software development costs relative to ad hoc approaches...is due partially to the cost and impracticality of a valid experimental setup within a production environment. Thus the question remains, are

  8. SHI(EL)DS: A Novel Hardware-Based Security Backplane to Enhance Security with Minimal Impact to System Operation

    DTIC Science & Technology

    2008-03-01

    executables. The current roadblock to detecting Type I malware consistently is the practice of legitimate software, such as antivirus programs, using this... Software Security Systems... Advantages of Hardware... Trustworthiness of Information... Towards a Hardware Security Backplane... Review of State of the Art Computer Security Solutions... Software

  9. Evolution of Secondary Software Businesses: Understanding Industry Dynamics

    NASA Astrophysics Data System (ADS)

    Tyrväinen, Pasi; Warsta, Juhani; Seppänen, Veikko

    The primary software industry originates from IBM's decision to unbundle software-related computer system development activities to external partners. This kind of outsourcing of an enterprise's internal software development activity is a common means of starting a new software business serving a vertical software market. It combines knowledge of the vertical market process with competence in software development. In this research, we present and analyze the key figures of the Finnish secondary software industry, in order to quantify its interaction with the primary software industry during the period 2000-2003. On the basis of the empirical data, we present a model for the evolution of a secondary software business which makes the industry dynamics explicit. It represents the shift from internal software developed for competitive advantage to the development of products supporting standard business processes on top of standardized technologies. We also discuss the implications for software business strategies in each phase.

  10. Software beamforming: comparison between a phased array and synthetic transmit aperture.

    PubMed

    Li, Yen-Feng; Li, Pai-Chi

    2011-04-01

    The data-transfer and computation requirements are compared between software-based beamforming using a phased array (PA) and a synthetic transmit aperture (STA). The advantages of a software-based architecture are reduced system complexity and lower hardware cost. Although this architecture can be implemented using commercial CPUs or GPUs, the high computation and data-transfer requirements limit its real-time beamforming performance. In particular, transferring the raw rf data from the front-end subsystem to the software back end remains challenging with current state-of-the-art electronics technologies, which offsets the cost advantage of the software back end. This study investigated the tradeoff between the data-transfer and computation requirements. Two beamforming methods, based on a PA and an STA, respectively, were used: the former requires a higher data-transfer rate and the latter requires more memory operations. The beamformers were implemented on an NVIDIA GeForce GTX 260 GPU and an Intel Core i7 920 CPU. The frame rate of PA beamforming was 42 fps with a 128-element array transducer, with 2048 samples per firing and 189 beams per image (a 95 MB/frame data-transfer requirement). The frame rate of STA beamforming was 40 fps with 16 firings per image (an 8 MB/frame data-transfer requirement). Both approaches achieved real-time beamforming performance, but each had its own bottleneck: STA beamforming considerably reduced the required data-transfer speed but needed more memory operations, which limited the overall computation time. The advantages of the GPU approach over the CPU approach were clearly demonstrated.
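
    The beamforming computation at issue is, at its core, delay-and-sum: for every image point, compute the transmit and receive propagation delays, index the rf samples, and accumulate. The sketch below is a deliberately naive NumPy rendering of that kernel for one synthetic-aperture firing with a virtual source at the array origin; the array geometry and sampling parameters are assumptions, and a real-time implementation would map the per-pixel loop onto GPU threads.

    ```python
    import numpy as np

    def das_beamform(rf, elem_x, fs, c, xs, zs):
        """Delay-and-sum one firing: rf[channel, sample], elements at lateral elem_x."""
        img = np.zeros((len(zs), len(xs)))
        chans = np.arange(rf.shape[0])
        for ix, x in enumerate(xs):
            for iz, z in enumerate(zs):
                # two-way delay: virtual source -> pixel -> each receive element
                t = (np.hypot(x, z) + np.hypot(x - elem_x, z)) / c
                idx = np.round(t * fs).astype(int)
                ok = idx < rf.shape[1]              # drop samples past the record end
                img[iz, ix] = rf[chans[ok], idx[ok]].sum()
        return img
    ```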

  11. Science 101: How Does Speech-Recognition Software Work?

    ERIC Educational Resources Information Center

    Robertson, Bill

    2016-01-01

    This column provides background science information for elementary teachers. Many innovations with computer software begin with analysis of how humans do a task. This article takes a look at how humans recognize spoken words and explains the origins of speech-recognition software.

  12. Second Generation Product Line Engineering Takes Hold in the DoD

    DTIC Science & Technology

    2014-01-01

    Feature-Oriented Domain Analysis (FODA) Feasibility Study” (CMU/SEI-90-TR-021, ADA235785). Pittsburgh, PA: Software Engineering Institute...software product line engineering and software architecture documentation and analysis. Clements is co-author of three practitioner-oriented books about

  13. Modern Corneal Eye-Banking Using a Software-Based IT Management Solution.

    PubMed

    Kern, C; Kortuem, K; Wertheimer, C; Nilmayer, O; Dirisamer, M; Priglinger, S; Mayer, W J

    2018-01-01

    Increasing government legislation and regulations in manufacturing have led to additional documentation regarding the pharmaceutical product requirements of corneal grafts in the European Union. The aim of this project was to develop software within a hospital information system (HIS) to support the documentation process, to improve the management of the patient waiting list and to increase information flow between the clinic and the eye bank. After an analysis of the current documentation process, a new workflow and software were implemented in our electronic health record (EHR) system. The software takes over most of the documentation and reduces the time required for record keeping. It guarantees real-time tracing of all steps during human corneal tissue processing, from the start of production until allocation during surgery, and includes follow-up within the HIS. Moreover, listing of the patient for surgery as well as waiting-list management takes place in the same system. The new software for corneal eye banking supports the whole process chain by taking over most of the required documentation as well as the management of the transplant waiting list. It may provide a standardized IT-based solution for German eye banks working within the same HIS.

  14. Cardiovascular Physiology Teaching: Computer Simulations vs. Animal Demonstrations.

    ERIC Educational Resources Information Center

    Samsel, Richard W.; And Others

    1994-01-01

    At the introductory level, the computer provides an effective alternative to using animals for laboratory teaching. Computer software can simulate the operation of multiple organ systems. Advantages of software include alteration of variables that are not easily changed in vivo, repeated interventions, and cost-effective hands-on student access.…

  15. Software for Better Documentation of Other Software

    NASA Technical Reports Server (NTRS)

    Pinedo, John

    2003-01-01

    The Literate Programming Extraction Engine is a Practical Extraction and Reporting Language- (PERL-)based computer program that facilitates and simplifies the implementation of a concept of self-documented literate programming in a fashion tailored to the typical needs of scientists. The advantage for the programmer is that documentation and source code are written side-by-side in the same file, reducing the likelihood that the documentation will be inconsistent with the code and improving the verification that the code performs its intended functions. The advantage for the user is the knowledge that the documentation matches the software because they come from the same file. This program unifies the documentation process for a variety of programming languages, including C, C++, and several versions of FORTRAN. This program can process the documentation in any markup language, and incorporates the LaTeX typesetting software. The program includes sample Makefile scripts for automating both the code-compilation (when appropriate) and documentation-generation processes into a single command-line statement. Also included are macro instructions for the Emacs display-editor software, making it easy for a programmer to toggle between editing in a code or a documentation mode.

  16. Active in-database processing to support ambient assisted living systems.

    PubMed

    de Morais, Wagner O; Lundström, Jens; Wickström, Nicholas

    2014-08-12

    As an alternative to the existing software architectures that underpin the development of smart homes and ambient assisted living (AAL) systems, this work presents a database-centric architecture that takes advantage of active databases and in-database processing. Current platforms supporting AAL systems use database management systems (DBMSs) exclusively for data storage. Active databases employ database triggers to detect and react to events taking place inside or outside of the database. DBMSs can be extended with stored procedures and functions that enable in-database processing. This means that the data processing is integrated and performed within the DBMS. The feasibility and flexibility of the proposed approach were demonstrated with the implementation of three distinct AAL services. The active database was used to detect bed-exits and to discover common room transitions and deviations during the night. In-database machine learning methods were used to model early night behaviors. Consequently, active in-database processing avoids transferring sensitive data outside the database, and this improves performance, security and privacy. Furthermore, centralizing the computation into the DBMS facilitates code reuse, adaptation and maintenance. These are important system properties that take into account the evolving heterogeneity of users, their needs and the devices that are characteristic of smart homes and AAL systems. Therefore, DBMSs can provide capabilities to address requirements for scalability, security, privacy, dependability and personalization in applications of smart environments in healthcare.
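
    The two mechanisms named here, triggers for the "active" behavior and user-defined routines for in-database processing, can be illustrated in miniature with SQLite from Python. The schema, threshold, and classifier function below are invented for illustration; the paper itself does not specify them.

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")

    # In-database processing: a Python function registered as a SQL function,
    # so the raw sensor value never has to leave the database to be classified.
    conn.create_function("detect_bed_exit", 1, lambda pressure: int(pressure < 5.0))

    conn.executescript("""
    CREATE TABLE sensor_events (sensor TEXT, value REAL, ts TEXT);
    CREATE TABLE alerts (kind TEXT, ts TEXT);

    -- Active database: the trigger reacts to every insert, no polling client needed.
    CREATE TRIGGER bed_exit_check AFTER INSERT ON sensor_events
    WHEN NEW.sensor = 'bed_pressure' AND detect_bed_exit(NEW.value) = 1
    BEGIN
        INSERT INTO alerts VALUES ('bed-exit', NEW.ts);
    END;
    """)

    conn.execute("INSERT INTO sensor_events VALUES ('bed_pressure', 0.4, '03:12')")
    print(conn.execute("SELECT * FROM alerts").fetchall())  # [('bed-exit', '03:12')]
    ```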

  17. Active In-Database Processing to Support Ambient Assisted Living Systems

    PubMed Central

    de Morais, Wagner O.; Lundström, Jens; Wickström, Nicholas

    2014-01-01

    As an alternative to the existing software architectures that underpin the development of smart homes and ambient assisted living (AAL) systems, this work presents a database-centric architecture that takes advantage of active databases and in-database processing. Current platforms supporting AAL systems use database management systems (DBMSs) exclusively for data storage. Active databases employ database triggers to detect and react to events taking place inside or outside of the database. DBMSs can be extended with stored procedures and functions that enable in-database processing. This means that the data processing is integrated and performed within the DBMS. The feasibility and flexibility of the proposed approach were demonstrated with the implementation of three distinct AAL services. The active database was used to detect bed-exits and to discover common room transitions and deviations during the night. In-database machine learning methods were used to model early night behaviors. Consequently, active in-database processing avoids transferring sensitive data outside the database, and this improves performance, security and privacy. Furthermore, centralizing the computation into the DBMS facilitates code reuse, adaptation and maintenance. These are important system properties that take into account the evolving heterogeneity of users, their needs and the devices that are characteristic of smart homes and AAL systems. Therefore, DBMSs can provide capabilities to address requirements for scalability, security, privacy, dependability and personalization in applications of smart environments in healthcare. PMID:25120164

  18. Intelligent QoS routing algorithm based on improved AODV protocol for Ad Hoc networks

    NASA Astrophysics Data System (ADS)

    Huibin, Liu; Jun, Zhang

    2016-04-01

    Mobile ad hoc networks play an increasingly important part in disaster relief, military battlefields and scientific exploration. However, routing in such networks remains difficult because of their inherent structure. This paper proposes an improved cuckoo-search-based Ad hoc On-Demand Distance Vector routing protocol (CSAODV). It elaborates the calculation method of the optimal-routing algorithm used by the protocol and the transmission mechanism for communication packets. By adding QoS constraints to the cuckoo-search optimization, the routes found conform to specified bandwidth and time-delay requirements, and a balance is obtained among computation cost, bandwidth and time delay. The NS2 simulation software was used to test the protocol's performance in three scenarios and to validate the feasibility and validity of CSAODV. The results show that the CSAODV routing protocol adapts to changes in network topology better than AODV, effectively improves the packet delivery fraction, reduces the transmission delay of the network, reduces the extra load that control information places on the network, and improves routing efficiency.

  19. Predicting Software Suitability Using a Bayesian Belief Network

    NASA Technical Reports Server (NTRS)

    Beaver, Justin M.; Schiavone, Guy A.; Berrios, Joseph S.

    2005-01-01

    The ability to reliably predict the end quality of software under development presents a significant advantage for a development team. It provides an opportunity to address high risk components earlier in the development life cycle, when their impact is minimized. This research proposes a model that captures the evolution of the quality of a software product, and provides reliable forecasts of the end quality of the software being developed in terms of product suitability. Development team skill, software process maturity, and software problem complexity are hypothesized as driving factors of software product quality. The cause-effect relationships between these factors and the elements of software suitability are modeled using Bayesian Belief Networks, a machine learning method. This research presents a Bayesian Network for software quality, and the techniques used to quantify the factors that influence and represent software quality. The developed model is found to be effective in predicting the end product quality of small-scale software development efforts.
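
    The cause-effect structure described (team skill, process maturity, and problem complexity driving product suitability) maps directly onto a small discrete Bayesian network. A sketch using the pgmpy library follows; the network structure is taken from the abstract, but the library choice and every probability value are assumptions invented purely for illustration.

    ```python
    from pgmpy.models import BayesianNetwork
    from pgmpy.factors.discrete import TabularCPD
    from pgmpy.inference import VariableElimination

    model = BayesianNetwork([("skill", "suitability"),
                             ("maturity", "suitability"),
                             ("complexity", "suitability")])

    # States are 0 = low, 1 = high; all numbers are illustrative placeholders.
    model.add_cpds(
        TabularCPD("skill", 2, [[0.4], [0.6]]),
        TabularCPD("maturity", 2, [[0.5], [0.5]]),
        TabularCPD("complexity", 2, [[0.7], [0.3]]),
        TabularCPD("suitability", 2,
                   # columns: the 8 combinations of (skill, maturity, complexity)
                   [[0.3, 0.6, 0.2, 0.5, 0.2, 0.4, 0.1, 0.3],
                    [0.7, 0.4, 0.8, 0.5, 0.8, 0.6, 0.9, 0.7]],
                   evidence=["skill", "maturity", "complexity"],
                   evidence_card=[2, 2, 2]))
    assert model.check_model()

    # Forecast end-product suitability given what is known early in the life cycle.
    posterior = VariableElimination(model).query(
        ["suitability"], evidence={"skill": 1, "complexity": 1})
    print(posterior)
    ```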

  20. Model Driven Engineering with Ontology Technologies

    NASA Astrophysics Data System (ADS)

    Staab, Steffen; Walter, Tobias; Gröner, Gerd; Parreiras, Fernando Silva

    Ontologies constitute formal models of some aspect of the world that may be used for drawing interesting logical conclusions even for large models. Software models capture relevant characteristics of a software artifact to be developed, yet, most often these software models have limited formal semantics, or the underlying (often graphical) software language varies from case to case in a way that makes it hard if not impossible to fix its semantics. In this contribution, we survey the use of ontology technologies for software modeling in order to carry over advantages from ontology technologies to the software modeling domain. It will turn out that ontology-based metamodels constitute a core means for exploiting expressive ontology reasoning in the software modeling domain while remaining flexible enough to accommodate varying needs of software modelers.

  1. The Design of Software for Three-Phase Induction Motor Test System

    NASA Astrophysics Data System (ADS)

    Haixiang, Xu; Fengqi, Wu; Jiai, Xue

    2017-11-01

    The design and development of control-system software is important to three-phase induction motor test equipment, and requires complete familiarity with the test process and the control procedure of the test equipment. In this paper, the software is developed in the VB language according to the national standard (GB/T1032-2005) for three-phase induction motor test methods. The control system, the data-analysis software and the implementation of the motor test system are described individually; the system has the advantages of high automation and high accuracy.

  2. The Personal Software Process: Downscaling the factory

    NASA Technical Reports Server (NTRS)

    Roy, Daniel M.

    1994-01-01

    It is argued that the next wave of software process improvement (SPI) activities will be based on a people-centered paradigm. The most promising such paradigm, Watts Humphrey's personal software process (PSP), is summarized and its advantages are listed. The concepts of the PSP are shown also to fit a down-scaled version of Basili's experience factory. The author's data and lessons learned while practicing the PSP are presented along with personal experience, observations, and advice from the perspective of a consultant and teacher for the personal software process.

  3. Fine Mapping Causal Variants with an Approximate Bayesian Method Using Marginal Test Statistics.

    PubMed

    Chen, Wenan; Larrabee, Beth R; Ovsyannikova, Inna G; Kennedy, Richard B; Haralambieva, Iana H; Poland, Gregory A; Schaid, Daniel J

    2015-07-01

    Two recently developed fine-mapping methods, CAVIAR and PAINTOR, demonstrate better performance than other fine-mapping methods. They also have the advantage of using only the marginal test statistics and the correlation among SNPs. Both methods leverage the fact that the marginal test statistics asymptotically follow a multivariate normal distribution and are likelihood based. However, their relationship with Bayesian fine mapping, such as BIMBAM, is not clear. In this study, we first show that CAVIAR and BIMBAM are actually approximately equivalent to each other. This leads to a fine-mapping method using marginal test statistics in the Bayesian framework, which we call CAVIAR Bayes factor (CAVIARBF). Another advantage of the Bayesian framework is that it can answer both association and fine-mapping questions. We also used simulations to compare CAVIARBF with other methods under different numbers of causal variants. The results showed that both CAVIARBF and BIMBAM have better performance than PAINTOR and other methods. Compared to BIMBAM, CAVIARBF has the advantage of using only marginal test statistics and takes about one-quarter to one-fifth of the running time. We applied the different methods to two independent cohorts of the same phenotype. Results showed that CAVIARBF, BIMBAM, and PAINTOR selected the same top 3 SNPs; however, CAVIARBF and BIMBAM had better consistency in selecting the top 10 ranked SNPs between the two cohorts. Software is available at https://bitbucket.org/Wenan/caviarbf. Copyright © 2015 by the Genetics Society of America.
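
    The asymptotic relationship the comparison rests on can be written compactly. In one standard formulation of the marginal-statistics likelihood (notation assumed here, not copied from the paper), the vector of marginal z-scores depends only on the LD correlation matrix and the causal effects, which is why fine mapping can proceed without individual-level data:

    ```latex
    % z: marginal z-scores, R: SNP correlation (LD) matrix, \lambda: causal effects
    z \mid \lambda \;\sim\; \mathcal{N}(R\lambda,\; R), \qquad
    \mathrm{BF} \;=\; \frac{\int p(z \mid \lambda)\, p(\lambda)\, d\lambda}
                          {p(z \mid \lambda = 0)}
    ```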

  4. NASA JPL Distributed Systems Technology (DST) Object-Oriented Component Approach for Software Inter-Operability and Reuse

    NASA Technical Reports Server (NTRS)

    Hall, Laverne; Hung, Chaw-Kwei; Lin, Imin

    2000-01-01

    The purpose of this paper is to provide a description of NASA JPL Distributed Systems Technology (DST) Section's object-oriented component approach to open inter-operable systems software development and software reuse. It will address what is meant by the terminology object component software, give an overview of the component-based development approach and how it relates to infrastructure support of software architectures and promotes reuse, enumerate on the benefits of this approach, and give examples of application prototypes demonstrating its usage and advantages. Utilization of the object-oriented component technology approach for system development and software reuse will apply to several areas within JPL, and possibly across other NASA Centers.

  5. BioNetFit: a fitting tool compatible with BioNetGen, NFsim and distributed computing environments.

    PubMed

    Thomas, Brandon R; Chylek, Lily A; Colvin, Joshua; Sirimulla, Suman; Clayton, Andrew H A; Hlavacek, William S; Posner, Richard G

    2016-03-01

    Rule-based models are analyzed with specialized simulators, such as those provided by the BioNetGen and NFsim open-source software packages. Here, we present BioNetFit, a general-purpose fitting tool that is compatible with BioNetGen and NFsim. BioNetFit is designed to take advantage of distributed computing resources. This feature facilitates fitting (i.e. optimization of parameter values for consistency with data) when simulations are computationally expensive. BioNetFit can be used on stand-alone Mac, Windows/Cygwin, and Linux platforms and on Linux-based clusters running SLURM, Torque/PBS, or SGE. The BioNetFit source code (Perl) is freely available (http://bionetfit.nau.edu). Supplementary data are available at Bioinformatics online. bionetgen.help@gmail.com. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  6. Graphics Processing Unit Assisted Thermographic Compositing

    NASA Technical Reports Server (NTRS)

    Ragasa, Scott; Russell, Samuel S.

    2012-01-01

    Objective: Develop a software application utilizing high-performance computing techniques, including general-purpose graphics processing units (GPGPUs), for the analysis and visualization of large thermographic data sets. Over the past several years, an increasing effort among scientists and engineers to utilize graphics processing units (GPUs) in a more general-purpose fashion is allowing for previously unobtainable levels of computation by individual workstations. As data sets grow, the methods used to work with them grow at an equal, and often greater, pace. Certain common computations can take advantage of the massively parallel and optimized hardware constructs of the GPU, which yield significant increases in performance. These common computations have high degrees of data parallelism; that is, they are the same computation applied to a large set of data, where the result does not depend on other data elements. Image processing is one area where GPUs are being used to greatly increase the performance of certain analysis and visualization techniques.
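
    The data-parallel pattern the abstract describes (the same arithmetic applied independently to every pixel) can be sketched with CuPy as a stand-in GPU library; the library choice and the compositing recipe are assumptions, since the paper does not name its software stack.

    ```python
    import cupy as cp

    frames = cp.random.rand(500, 1024, 1024)   # hypothetical thermographic stack
    mean = frames.mean(axis=0)                 # per-pixel reductions run on the GPU
    std = frames.std(axis=0) + 1e-9
    contrast = (frames - mean) / std           # elementwise: no pixel depends on another
    composite = cp.abs(contrast).max(axis=0)   # one composited image
    result = cp.asnumpy(composite)             # copy the final image back to the host
    ```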

  7. Graphics Processing Unit Assisted Thermographic Compositing

    NASA Technical Reports Server (NTRS)

    Ragasa, Scott; McDougal, Matthew; Russell, Sam

    2013-01-01

    Objective: To develop a software application utilizing general-purpose graphics processing units (GPUs) for the analysis of large sets of thermographic data. Background: Over the past few years, an increasing effort among scientists and engineers to utilize the GPU in a more general-purpose fashion is allowing for supercomputer-level results at individual workstations. As data sets grow, the methods used to work with them grow at an equal, and often greater, pace. Certain common computations can take advantage of the massively parallel and optimized hardware constructs of the GPU to allow for throughput that was previously reserved for compute clusters. These common computations have high degrees of data parallelism; that is, they are the same computation applied to a large set of data, where the result does not depend on other data elements. Signal (image) processing is one area where GPUs are being used to greatly increase the performance of certain algorithms and analysis techniques.

  8. High-Performance I/O: HDF5 for Lattice QCD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurth, Thorsten; Pochinsky, Andrew; Sarje, Abhinav

    2015-01-01

    Practitioners of lattice QCD/QFT have been some of the primary pioneer users of the state-of-the-art high-performance-computing systems, and contribute towards the stress tests of such new machines as soon as they become available. As with all aspects of high-performance-computing, I/O is becoming an increasingly specialized component of these systems. In order to take advantage of the latest available high-performance I/O infrastructure, to ensure reliability and backwards compatibility of data files, and to help unify the data structures used in lattice codes, we have incorporated parallel HDF5 I/O into the SciDAC supported USQCD software stack. Here we present the design and implementation of this I/O framework. Our HDF5 implementation outperforms optimized QIO at the 10-20% level and leaves room for further improvement by utilizing appropriate dataset chunking.
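
    A minimal sketch of the two ingredients named here, parallel (MPI-driven) file access and dataset chunking, via h5py, assuming an h5py build against a parallel HDF5; the dataset name, shape, and chunk layout are illustrative, not taken from the USQCD stack.

    ```python
    from mpi4py import MPI
    import h5py
    import numpy as np

    comm = MPI.COMM_WORLD
    # Collective file open: every rank participates through the MPI-IO driver.
    f = h5py.File("gauge_field.h5", "w", driver="mpio", comm=comm)

    # Chunk shape matches one rank's contiguous slab, so each write hits one chunk.
    dset = f.create_dataset("links", shape=(comm.size, 32, 32, 32),
                            dtype="f8", chunks=(1, 32, 32, 32))
    dset[comm.rank] = np.random.rand(32, 32, 32)   # each rank writes its own slab
    f.close()
    ```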

  9. An FPGA-based heterogeneous image fusion system design method

    NASA Astrophysics Data System (ADS)

    Song, Le; Lin, Yu-chi; Chen, Yan-hua; Zhao, Mei-rong

    2011-08-01

    Taking advantage of the FPGA's low cost and compact structure, an FPGA-based heterogeneous image fusion platform is established in this study. Altera's Cyclone IV series FPGA is adopted as the core processor of the platform, and a visible-light CCD camera and an infrared thermal imager are used as the image-capturing devices in order to obtain dual-channel heterogeneous video images. Tailor-made image fusion algorithms such as gray-scale weighted averaging, maximum selection and minimum selection methods are analyzed and compared. The VHDL language and the synchronous design method are utilized to perform a reliable RTL-level description. Altera's Quartus II 9.0 software is applied to simulate and implement the algorithm modules. The contrast experiments of the various fusion algorithms show that favorable image quality of the heterogeneous image fusion can be obtained with the proposed system. The applied range of the different fusion algorithms is also discussed.
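
    The three fusion rules compared in the paper are simple per-pixel operations; a software rendering of them (NumPy standing in for the FPGA datapath) makes the comparison concrete. Inputs are assumed to be registered single-channel frames scaled to [0, 1].

    ```python
    import numpy as np

    def fuse(vis, ir, rule="weighted", w=0.5):
        """Fuse a visible-light frame and an infrared frame, pixel by pixel."""
        if rule == "weighted":      # gray-scale weighted averaging
            return w * vis + (1.0 - w) * ir
        if rule == "maximum":       # maximum selection
            return np.maximum(vis, ir)
        if rule == "minimum":       # minimum selection
            return np.minimum(vis, ir)
        raise ValueError(f"unknown fusion rule: {rule}")
    ```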

  10. Modular uncooled video engines based on a DSP processor

    NASA Astrophysics Data System (ADS)

    Schapiro, F.; Milstain, Y.; Aharon, A.; Neboshchik, A.; Ben-Simon, Y.; Kogan, I.; Lerman, I.; Mizrahi, U.; Maayani, S.; Amsterdam, A.; Vaserman, I.; Duman, O.; Gazit, R.

    2011-06-01

    The market demand for low-SWaP (Size, Weight and Power) uncooled engines keeps growing. Low SWaP is especially critical in battery-operated applications such as goggles and thermal weapon sights. A new approach to the design of the engines was implemented by SCD to optimize size and power consumption at the system level. The new approach described in the paper consists of: 1. A modular hardware design that allows the user to define the exact level of integration needed for his system. 2. An "open architecture" based on the OMAPTM530 DSP that allows the integrator to take advantage of unused hardware (FPGA) and software (DSP) resources for the implementation of additional algorithms or functionality. The approach was successfully implemented on the first generation of 25 μm pitch BIRD detectors, and more recently on the new 640 x 480, 17 μm pitch detector.

  11. A case study for cloud based high throughput analysis of NGS data using the globus genomics system

    DOE PAGES

    Bhuvaneshwar, Krithika; Sulakhe, Dinanath; Gauba, Robinder; ...

    2015-01-01

    Next generation sequencing (NGS) technologies produce massive amounts of data requiring a powerful computational infrastructure, high quality bioinformatics software, and skilled personnel to operate the tools. We present a case study of a practical solution to this data management and analysis challenge that simplifies terabyte scale data handling and provides advanced tools for NGS data analysis. These capabilities are implemented using the “Globus Genomics” system, which is an enhanced Galaxy workflow system made available as a service that offers users the capability to process and transfer data easily, reliably and quickly to address end-to-end NGS analysis requirements. The Globus Genomics system is built on Amazon's cloud computing infrastructure. The system takes advantage of elastic scaling of compute resources to run multiple workflows in parallel and it also helps meet the scale-out analysis needs of modern translational genomics research.

  12. Haptics – Touchfeedback Technology Widening the Horizon of Medicine

    PubMed Central

    Kapoor, Shalini; Arora, Pallak; Kapoor, Vikas; Jayachandran, Mahesh; Tiwari, Manish

    2014-01-01

    Haptics, or touch-sense haptic technology, is a major breakthrough in medical and dental interventions. Haptic perception is the process of recognizing objects through touch. Haptic sensations are created by actuators or motors which generate vibrations to the users and are controlled by embedded software which is integrated into the device. It takes advantage of a combination of the somatosensory pattern of the skin and proprioception of hand position. Anatomical and diagnostic knowledge, when combined with this touch-sense technology, has revolutionized medical education. This amalgamation of the worlds of diagnosis and surgical intervention adds precise robotic touch to the skill of the surgeon. A systematic literature review was done using MEDLINE, Google Search and PubMed. The aim of this article was to introduce the fundamentals of haptic technology, its current applications in medical training and robotic surgeries, the limitations of haptics and the future aspects of haptics in medicine. PMID:24783164

  13. Fusing modeling techniques to support domain analysis for reuse opportunities identification

    NASA Technical Reports Server (NTRS)

    Hall, Susan Main; Mcguire, Eileen

    1993-01-01

    Functional modeling techniques or object-oriented graphical representations: which are more useful to someone trying to understand the general design or high-level requirements of a system? For a recent domain analysis effort, the answer was a fusion of popular modeling techniques of both types. By using both functional and object-oriented techniques, the analysts involved were able to lean on their experience in function-oriented software development, while taking advantage of the descriptive power available in object-oriented models. In addition, a base of familiar modeling methods permitted the group of mostly new domain analysts to learn the details of the domain analysis process while producing a quality product. This paper describes the background of this project and then provides a high-level definition of domain analysis. The majority of this paper focuses on the modeling method developed and utilized during this analysis effort.

  14. A combined-slip predictive control of vehicle stability with experimental verification

    NASA Astrophysics Data System (ADS)

    Jalali, Milad; Hashemi, Ehsan; Khajepour, Amir; Chen, Shih-ken; Litkouhi, Bakhtiar

    2018-02-01

    In this paper, a model predictive vehicle stability controller is designed based on a combined-slip LuGre tyre model. Variations in the lateral tyre forces due to changes in tyre slip ratios are considered in the prediction model of the controller. It is observed that the proposed combined-slip controller takes advantage of the more accurate tyre model and can adjust tyre slip ratios based on lateral forces of the front axle. This results in an interesting closed-loop response that challenges the notion of braking only the wheels on one side of the vehicle in differential braking. The performance of the proposed controller is evaluated in software simulations and is compared to a similar pure-slip controller. Furthermore, experimental tests are conducted on a rear-wheel-drive electric Chevrolet Equinox equipped with differential brakes to evaluate the closed-loop response of the model predictive controller.

  15. High-Performance I/O: HDF5 for Lattice QCD

    DOE PAGES

    Kurth, Thorsten; Pochinsky, Andrew; Sarje, Abhinav; ...

    2017-05-09

    Practitioners of lattice QCD/QFT have been some of the primary pioneer users of the state-of-the-art high-performance-computing systems, and contribute towards the stress tests of such new machines as soon as they become available. As with all aspects of high-performance-computing, I/O is becoming an increasingly specialized component of these systems. In order to take advantage of the latest available high-performance I/O infrastructure, to ensure reliability and backwards compatibility of data files, and to help unify the data structures used in lattice codes, we have incorporated parallel HDF5 I/O into the SciDAC supported USQCD software stack. Here we present the design and implementation of this I/O framework. Our HDF5 implementation outperforms optimized QIO at the 10-20% level and leaves room for further improvement by utilizing appropriate dataset chunking.

  16. Real-Time Hardware-in-the-Loop Simulation of Ares I Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Tobbe, Patrick; Matras, Alex; Walker, David; Wilson, Heath; Fulton, Chris; Alday, Nathan; Betts, Kevin; Hughes, Ryan; Turbe, Michael

    2009-01-01

    The Ares Real-Time Environment for Modeling, Integration, and Simulation (ARTEMIS) has been developed for use by the Ares I launch vehicle System Integration Laboratory at the Marshall Space Flight Center. The primary purpose of the Ares System Integration Laboratory is to test the vehicle avionics hardware and software in a hardware-in-the-loop environment to certify that the integrated system is prepared for flight. ARTEMIS has been designed to be the real-time simulation backbone to stimulate all required Ares components for verification testing. ARTEMIS provides high-fidelity dynamics, actuator, and sensor models to simulate an accurate flight trajectory in order to ensure realistic test conditions. ARTEMIS has been designed to take advantage of the advances in underlying computational power now available to support hardware-in-the-loop testing to achieve real-time simulation with unprecedented model fidelity. A modular real-time design relying on a fully distributed computing architecture has been implemented.

  17. An Optimal Partial Differential Equations-based Stopping Criterion for Medical Image Denoising.

    PubMed

    Khanian, Maryam; Feizi, Awat; Davari, Ali

    2014-01-01

    Improving the quality of medical images before and after surgical operations is necessary for beginning and speeding up the recovery process. Partial differential equations-based models have become a powerful and well-known tool in different areas of image processing, such as denoising, multiscale image analysis, edge detection and other fields of image processing and computer vision. In this paper, an algorithm for medical image denoising using an anisotropic diffusion filter with a convenient stopping criterion is presented. In this regard, the paper introduces two strategies: utilizing the efficient explicit method, together with an effective software technique for solving the anisotropic diffusion filter, which is mathematically unstable; and proposing an automatic stopping criterion that takes into consideration only the input image, as opposed to other stopping criteria, besides the quality of the denoised image, ease of use and time. Various medical images are examined to confirm the claim.
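
    For reference, the explicit anisotropic diffusion scheme in question is the classic Perona-Malik iteration. The sketch below uses a deliberately simple mean-change threshold in place of the paper's more elaborate, input-dependent stopping criterion; kappa, dt, and the tolerance are illustrative values (dt <= 0.25 keeps the 4-neighbor explicit scheme stable).

    ```python
    import numpy as np

    def anisotropic_diffusion(img, n_iter=100, kappa=30.0, dt=0.2, tol=1e-3):
        """Perona-Malik denoising with an explicit update and a naive stop rule."""
        u = img.astype(float)
        for _ in range(n_iter):
            # differences toward the four neighbors
            dn = np.roll(u, -1, 0) - u; ds = np.roll(u, 1, 0) - u
            de = np.roll(u, -1, 1) - u; dw = np.roll(u, 1, 1) - u
            # edge-stopping conduction coefficients
            g = lambda d: np.exp(-(d / kappa) ** 2)
            step = dt * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
            u += step
            if np.abs(step).mean() < tol:   # stop once the image barely changes
                break
        return u
    ```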

  18. Multicore Challenges and Benefits for High Performance Scientific Computing

    DOE PAGES

    Nielsen, Ida M. B.; Janssen, Curtis L.

    2008-01-01

    Until recently, performance gains in processors were achieved largely by improvements in clock speeds and instruction level parallelism. Thus, applications could obtain performance increases with relatively minor changes by upgrading to the latest generation of computing hardware. Currently, however, processor performance improvements are realized by using multicore technology and hardware support for multiple threads within each core, and taking full advantage of this technology to improve the performance of applications requires exposure of extreme levels of software parallelism. We will here discuss the architecture of parallel computers constructed from many multicore chips as well as techniques for managing the complexity of programming such computers, including the hybrid message-passing/multi-threading programming model. We will illustrate these ideas with a hybrid distributed memory matrix multiply and a quantum chemistry algorithm for energy computation using Møller–Plesset perturbation theory.
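
    The hybrid message-passing/multi-threading model mentioned here can be sketched for the distributed matrix multiply: MPI moves data between nodes, while the per-node multiply exploits a multi-threaded BLAS. A minimal mpi4py rendering, with problem size and data layout chosen only for illustration:

    ```python
    from mpi4py import MPI
    import numpy as np

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    n = 1024                                 # assumed divisible by the rank count
    A_local = np.random.rand(n // size, n)   # each rank owns a block of rows of A
    B = np.empty((n, n))
    if rank == 0:
        B[:] = np.random.rand(n, n)
    comm.Bcast(B, root=0)                    # message passing across nodes
    C_local = A_local @ B                    # threaded BLAS inside each node
    blocks = comm.gather(C_local, root=0)    # rank 0 assembles the row blocks
    if rank == 0:
        C = np.vstack(blocks)
    ```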

  19. Radio frequency identification (RFID) in health care: privacy and security concerns limiting adoption.

    PubMed

    Rosenbaum, Benjamin P

    2014-03-01

    Radio frequency identification (RFID) technology has been implemented in a wide variety of industries. Health care is no exception. This article explores implementations and limitations of RFID in several health care domains: authentication, medication safety, patient tracking, and blood transfusion medicine. Each domain has seen increasing utilization of unique applications of RFID technology. Given the importance of protecting patient and data privacy, potential privacy and security concerns in each domain are discussed. Such concerns, some of which are inherent to existing RFID hardware and software technology, may limit ubiquitous adoption. In addition, an apparent lack of security standards within the RFID domain and specifically health care may also hinder the growth and utility of RFID within health care for the foreseeable future. Safeguarding the privacy of patient data may be the most important obstacle to overcome to allow the health care industry to take advantage of the numerous benefits RFID technology affords.

  20. Interactive Visualization of Computational Fluid Dynamics using Mosaic

    NASA Technical Reports Server (NTRS)

    Clucas, Jean; Watson, Velvin; Chancellor, Marisa K. (Technical Monitor)

    1994-01-01

    The Web provides new methods for accessing information world-wide, but the current text-and-pictures approach neither utilizes all the Web's possibilities nor provides for its limitations. While the inclusion of pictures and animations in a paper communicates more effectively than text alone, it is essentially an extension of the concept of "publication." Also, as use of the Web increases, putting images and animations online will quickly load even the "Information Superhighway." We need to find forms of communication that take advantage of the special nature of the Web. This paper presents one approach: the use of the Internet and the Mosaic interface for data sharing and collaborative analysis. We will describe (and in the presentation, demonstrate) our approach: using FAST (Flow Analysis Software Toolkit), a scientific visualization package, as a data viewer and interactive tool called from Mosaic. Our intent is to stimulate the development of other tools that utilize the unique nature of electronic communication.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ilsche, Thomas; Schuchart, Joseph; Cope, Joseph

    Event tracing is an important tool for understanding the performance of parallel applications. As concurrency increases in leadership-class computing systems, the quantity of performance log data can overload the parallel file system, perturbing the application being observed. In this work we present a solution for event tracing at leadership scales. We enhance the I/O forwarding system software to aggregate and reorganize log data prior to writing to the storage system, significantly reducing the burden on the underlying file system for this type of traffic. Furthermore, we augment the I/O forwarding system with a write buffering capability to limit the impact of artificial perturbations from log data accesses on traced applications. To validate the approach, we modify the Vampir tracing tool to take advantage of this new capability and show that the approach increases the maximum traced application size by a factor of 5x to more than 200,000 processors.
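
    The write-buffering idea is independent of the I/O forwarding layer and easy to state in code: accumulate many small event records in memory and issue few large sequential writes, so logging perturbs the traced application less. A generic sketch (the record format and flush threshold are invented, not taken from the paper):

    ```python
    class BufferedTraceWriter:
        """Aggregate small event records into large writes to spare the file system."""
        def __init__(self, path, flush_bytes=8 << 20):   # flush every ~8 MiB
            self.f = open(path, "ab")
            self.buf = bytearray()
            self.flush_bytes = flush_bytes

        def log_event(self, record: bytes):
            self.buf += record
            if len(self.buf) >= self.flush_bytes:        # one big sequential write
                self.f.write(self.buf)
                self.buf.clear()

        def close(self):
            if self.buf:
                self.f.write(self.buf)
            self.f.close()
    ```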

  2. Accounting for informatively missing data in logistic regression by means of reassessment sampling.

    PubMed

    Lin, Ji; Lyles, Robert H

    2015-05-20

    We explore the 'reassessment' design in a logistic regression setting, where a second wave of sampling is applied to recover a portion of the missing data on a binary exposure and/or outcome variable. We construct a joint likelihood function based on the original model of interest and a model for the missing data mechanism, with emphasis on non-ignorable missingness. The estimation is carried out by numerical maximization of the joint likelihood function with close approximation of the accompanying Hessian matrix, using sharable programs that take advantage of general optimization routines in standard software. We show how likelihood ratio tests can be used for model selection and how they facilitate direct hypothesis testing for whether missingness is at random. Examples and simulations are presented to demonstrate the performance of the proposed method. Copyright © 2015 John Wiley & Sons, Ltd.
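
    The computational pattern described, writing down a log-likelihood and handing it to a general optimizer in standard software, is easy to illustrate on the complete-data part of the problem. The sketch below fits an ordinary logistic regression by numerical maximization with SciPy; the full reassessment design would add a missingness-mechanism model to the same joint likelihood, which is omitted here, and the data are simulated.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def neg_log_lik(beta, X, y):
        """Negative logistic log-likelihood, written with a stable log(1+exp)."""
        eta = X @ beta
        return np.logaddexp(0.0, eta).sum() - y @ eta

    rng = np.random.default_rng(0)                      # simulated example data
    X = np.column_stack([np.ones(500), rng.normal(size=500)])
    y = rng.binomial(1, 1 / (1 + np.exp(-X @ np.array([-0.5, 1.0]))))

    fit = minimize(neg_log_lik, x0=np.zeros(2), args=(X, y), method="BFGS")
    print(fit.x)   # maximum likelihood estimate of (intercept, slope)
    ```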

  3. Technical and economic analysis on grid-connected wind farm based on hybrid energy storage system and distributed generators

    NASA Astrophysics Data System (ADS)

    Zhang, Xinhua; Zhou, Zhongkang; Chen, Xiaochun; Song, Jishuang; Shi, Maolin

    2017-05-01

    A hybrid energy storage system is proposed based on NaS batteries and lithium-ion batteries: the former is the main large-scale energy storage technology used and developed worldwide, and the latter is a flexible way to provide both power and energy capacity. The hybrid energy storage system, which takes advantage of the two complementary technologies to provide large power and energy capacities, is chosen for an economical-environmental evaluation based on critical excess electricity production (CEEP), CO2 emissions and annual total costs, calculated for the specific given conditions using the EnergyPLAN software. The results show that the hybrid storage system has strengths in environmental benefits, can absorb more otherwise-discarded wind power than a single storage system, and is a potential way to push forward the application of wind power and even other types of renewable energy resources.

  4. Mining biomedical images towards valuable information retrieval in biomedical and life sciences.

    PubMed

    Ahmed, Zeeshan; Zeeshan, Saman; Dandekar, Thomas

    2016-01-01

    Biomedical images are helpful sources for scientists and practitioners in drawing significant hypotheses, exemplifying approaches and describing experimental results in the published biomedical literature. In recent decades, there has been an enormous increase in the amount of heterogeneous biomedical image production and publication, which results in a need for bioimaging platforms for feature extraction and analysis of text and content in biomedical images, which can be taken advantage of in implementing effective information retrieval systems. In this review, we summarize technologies related to data mining of figures. We describe and compare the potential of different approaches in terms of their developmental aspects, used methodologies, produced results, achieved accuracies and limitations. Our comparative conclusions include current challenges for bioimaging software with selective image mining, embedded text extraction and processing of complex natural language queries. © The Author(s) 2016. Published by Oxford University Press.

  5. Optimizing Excited-State Electronic-Structure Codes for Intel Knights Landing: A Case Study on the BerkeleyGW Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deslippe, Jack; da Jornada, Felipe H.; Vigil-Fowler, Derek

    2016-10-06

    We profile and optimize calculations performed with the BerkeleyGW code on the Xeon-Phi architecture. BerkeleyGW depends both on hand-tuned critical kernels as well as on BLAS and FFT libraries. We describe the optimization process and performance improvements achieved. We discuss a layered parallelization strategy to take advantage of vector, thread and node-level parallelism. We discuss locality changes (including the consequence of the lack of L3 cache) and effective use of the on-package high-bandwidth memory. We show preliminary results on Knights-Landing including a roofline study of code performance before and after a number of optimizations. We find that the GW method is particularly well-suited for many-core architectures due to the ability to exploit a large amount of parallelism over plane-wave components, band-pairs, and frequencies.

  6. A case study for cloud based high throughput analysis of NGS data using the globus genomics system

    PubMed Central

    Bhuvaneshwar, Krithika; Sulakhe, Dinanath; Gauba, Robinder; Rodriguez, Alex; Madduri, Ravi; Dave, Utpal; Lacinski, Lukasz; Foster, Ian; Gusev, Yuriy; Madhavan, Subha

    2014-01-01

    Next generation sequencing (NGS) technologies produce massive amounts of data requiring a powerful computational infrastructure, high quality bioinformatics software, and skilled personnel to operate the tools. We present a case study of a practical solution to this data management and analysis challenge that simplifies terabyte scale data handling and provides advanced tools for NGS data analysis. These capabilities are implemented using the “Globus Genomics” system, which is an enhanced Galaxy workflow system made available as a service that offers users the capability to process and transfer data easily, reliably and quickly to address end-to-end NGS analysis requirements. The Globus Genomics system is built on Amazon's cloud computing infrastructure. The system takes advantage of elastic scaling of compute resources to run multiple workflows in parallel and it also helps meet the scale-out analysis needs of modern translational genomics research. PMID:26925205

  7. Designing Image Operators for MRI-PET Image Fusion of the Brain

    NASA Astrophysics Data System (ADS)

    Márquez, Jorge; Gastélum, Alfonso; Padilla, Miguel A.

    2006-09-01

    Our goal is to obtain images combining in a useful and precise way the information from 3D volumes of medical imaging sets. We address two modalities, combining anatomy (Magnetic Resonance Imaging, or MRI) and functional information (Positron Emission Tomography, or PET). Commercial imaging software offers image fusion tools based on fixed blending or color-channel combination of two modalities, and color Look-Up Tables (LUTs), without considering the anatomical and functional character of the image features. We used an approach for image fusion that takes advantage mainly of the HSL (Hue, Saturation and Luminosity) color space in order to enhance the fusion results. We further tested operators for gradient and contour extraction to enhance anatomical details, plus other spatial-domain filters for functional features corresponding to wide point-spread-function responses in PET images. A set of image-fusion operators was formulated and tested on PET and MRI acquisitions.
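
    The mapping the authors describe, anatomy carried by luminosity and function carried by hue and saturation, can be sketched in a few lines. The snippet below uses HSV via matplotlib as a convenient stand-in for HSL, and the particular hue range is an arbitrary choice, not the paper's:

    ```python
    import numpy as np
    from matplotlib.colors import hsv_to_rgb

    def fuse_mri_pet(mri, pet):
        """Fuse registered 2D slices: MRI -> brightness, PET -> hue/saturation."""
        mri_n = (mri - mri.min()) / (np.ptp(mri) + 1e-9)
        pet_n = (pet - pet.min()) / (np.ptp(pet) + 1e-9)
        hsv = np.empty(mri.shape + (3,))
        hsv[..., 0] = 0.66 * (1.0 - pet_n)  # hue: blue (low uptake) to red (high)
        hsv[..., 1] = pet_n                 # saturation: gray where PET is quiet
        hsv[..., 2] = mri_n                 # value/luminosity: anatomy from MRI
        return hsv_to_rgb(hsv)
    ```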

  8. Launching genomics into the cloud: deployment of Mercury, a next generation sequence analysis pipeline.

    PubMed

    Reid, Jeffrey G; Carroll, Andrew; Veeraraghavan, Narayanan; Dahdouli, Mahmoud; Sundquist, Andreas; English, Adam; Bainbridge, Matthew; White, Simon; Salerno, William; Buhay, Christian; Yu, Fuli; Muzny, Donna; Daly, Richard; Duyk, Geoff; Gibbs, Richard A; Boerwinkle, Eric

    2014-01-29

    Massively parallel DNA sequencing generates staggering amounts of data. Decreasing cost, increasing throughput, and improved annotation have expanded the diversity of genomics applications in research and clinical practice. This expanding scale creates analytical challenges: accommodating peak compute demand, coordinating secure access for multiple analysts, and sharing validated tools and results. To address these challenges, we have developed the Mercury analysis pipeline and deployed it in local hardware and the Amazon Web Services cloud via the DNAnexus platform. Mercury is an automated, flexible, and extensible analysis workflow that provides accurate and reproducible genomic results at scales ranging from individuals to large cohorts. By taking advantage of cloud computing and with Mercury implemented on the DNAnexus platform, we have demonstrated a powerful combination of a robust and fully validated software pipeline and a scalable computational resource that, to date, we have applied to more than 10,000 whole genome and whole exome samples.

  9. Simple solution to the medical instrumentation software problem

    NASA Astrophysics Data System (ADS)

    Leif, Robert C.; Leif, Suzanne B.; Leif, Stephanie H.; Bingue, E.

    1995-04-01

    Medical devices now include a substantial software component, which is both difficult and expensive to produce and maintain. Medical software must be developed according to 'Good Manufacturing Practices' (GMP). Good Manufacturing Practices, as specified by the FDA and ISO, require the definition of and compliance with a software process which ensures quality products by specifying a detailed method of software construction. The software process should be based on accepted standards. US Department of Defense software standards and technology can both facilitate the development and improve the quality of medical systems. We describe the advantages of employing Mil-Std-498, Software Development and Documentation, and the Ada programming language. Ada provides the very broad range of functionalities, from embedded real-time to management information systems, required by many medical devices. It also includes advanced facilities for object-oriented programming and software engineering.

  10. WINDS: A Web-Based Intelligent Interactive Course on Data-Structures

    ERIC Educational Resources Information Center

    Sirohi, Vijayalaxmi

    2007-01-01

    The Internet has opened new ways of learning and has brought several advantages to computer-aided education. Global access, self-paced learning, asynchronous teaching, interactivity, and multimedia usage are some of these. Along with the advantages comes the challenge of designing the software using the available facilities. Integrating online…

  11. PINT, a New Pulsar Timing Software

    NASA Astrophysics Data System (ADS)

    Luo, Jing; Jenet, Fredrick A.; Ransom, Scott M.; Demorest, Paul; Van Haasteren, Rutger; Archibald, Anne

    2015-01-01

    We present PINT, a new pulsar timing software package. The pulsar timing community currently depends heavily on Tempo/Tempo2, a package for analyzing pulsar data. However, high-accuracy pulsar timing projects, such as pulsar timing for gravitational-wave detection, need an independent package for cross-checking results. We are therefore developing Tempo-independent software with a different structure, whose modules are designed to be more isolated and easier to extend. Instead of C, we use Python as our programming language for its flexibility and powerful docstring support. Here we present the detailed design and the first results of the software.
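
    A minimal sketch (our illustration, not PINT's actual API; class and attribute names are hypothetical) of the isolated, swappable timing-model modules the abstract describes:

        # Illustrative only: a hypothetical modular timing-model design in the
        # spirit of the isolated, extensible components described above.
        class DelayComponent:
            """Each physical effect contributes a delay in seconds."""
            def delay(self, toa):
                raise NotImplementedError

        class DispersionDelay(DelayComponent):
            def __init__(self, dm):
                self.dm = dm  # dispersion measure, pc cm^-3
            def delay(self, toa):
                K = 4.149e3   # dispersion constant, s MHz^2 cm^3 / pc
                return K * self.dm / toa.freq_mhz ** 2  # assumes a freq_mhz attribute

        class TimingModel:
            """Modules stay isolated; adding an effect means adding a component."""
            def __init__(self, components):
                self.components = components
            def total_delay(self, toa):
                return sum(c.delay(toa) for c in self.components)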

  12. 3 CFR 8524 - Proclamation 8524 of May 20, 2010. National Safe Boating Week, 2010

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ..., let us recommit during National Safe Boating Week to practicing safe techniques so boaters of all ages... awareness and teach safe boating practices. Boaters can take advantage of these opportunities to learn, make... activities to observe this occasion by learning more about safe boating practices and to take advantage of...

  13. Neural Correlates of Traditional Chinese Medicine Induced Advantageous Risk-Taking Decision Making

    ERIC Educational Resources Information Center

    Lee, Tiffany M. Y.; Guo, Li-guo; Shi, Hong-zhi; Li, Yong-zhi; Luo, Yue-jia; Sung, Connie Y. Y.; Chan, Chetwyn C. H.; Lee, Tatia M. C.

    2009-01-01

    This fMRI study examined the neural correlates of the observed improvement in advantageous risk-taking behavior, as measured by the number of adjusted pumps in the Balloon Analogue Risk Task (BART), following a 60-day course of a Traditional Chinese Medicine (TCM) recipe, specifically designed to regulate impulsiveness in order to modulate…

  14. Development and Evaluation of LEGUME ID: A ToolBook Multimedia Module.

    ERIC Educational Resources Information Center

    Hannaway, David B.; And Others

    1992-01-01

    Describes the development and advantages of LEGUME ID, a multimedia module for agricultural education. LEGUME ID is an example of how teachers, given the opportunity through accessible computer software programs, can create powerful teaching tools. Summarized is a student response to the use of this teacher-produced software program. (MCO)

  15. Software system architecture for corporate user support

    NASA Astrophysics Data System (ADS)

    Sukhopluyeva, V. S.; Kuznetsov, D. Y.

    2017-01-01

    In this article, several existing ready-to-use HelpDesk solutions are reviewed, and the advantages and disadvantages of these systems are identified. The architecture of a software solution for a corporate user support system is then presented in the form of use case, state, and component diagrams described using the Unified Modeling Language (UML).

  16. Simulated Analysis of Linear Reversible Enzyme Inhibition with SCILAB

    ERIC Educational Resources Information Center

    Antuch, Manuel; Ramos, Yaquelin; Álvarez, Rubén

    2014-01-01

    SCILAB is a lesser-known program than MATLAB for numeric simulations and has the advantage of being free software. A challenging software-based activity to analyze the most common linear reversible inhibition types with SCILAB is described. Students establish typical values for the concentration of enzyme, substrate, and inhibitor to simulate…
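
    A minimal sketch of the kind of simulation described, written here in Python rather than SCILAB, assuming the textbook Michaelis-Menten rate laws for the three classic linear reversible inhibition types:

        import numpy as np

        def rate(S, I, Vmax=1.0, Km=0.5, Ki=0.2, mode="competitive"):
            """Initial rate v for substrate S and inhibitor I (arbitrary units)."""
            a = 1.0 + I / Ki
            if mode == "competitive":        # inhibitor binds free enzyme: Km scales
                return Vmax * S / (Km * a + S)
            if mode == "uncompetitive":      # inhibitor binds the ES complex
                return Vmax * S / (Km + S * a)
            if mode == "noncompetitive":     # inhibitor binds both forms: Vmax scales
                return Vmax * S / ((Km + S) * a)
            raise ValueError(mode)

        S = np.linspace(0.01, 5.0, 100)
        v0 = rate(S, 0.0)                        # uninhibited curve
        v1 = rate(S, 0.1, mode="competitive")    # compare, e.g., on a Lineweaver-Burk plot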

  17. Using Information Technology in Teaching of Business Statistics in Nigeria Business School

    ERIC Educational Resources Information Center

    Hamadu, Dallah; Adeleke, Ismaila; Ehie, Ike

    2011-01-01

    This paper discusses the use of Microsoft Excel software in the teaching of statistics in the Faculty of Business Administration at the University of Lagos, Nigeria. Problems associated with existing traditional methods are identified and a novel pedagogy using Excel is proposed. The advantages of using this software over other specialized…

  18. Macintosh Computer Classroom and Laboratory Security: Preventing Unwanted Changes to the System.

    ERIC Educational Resources Information Center

    Senn, Gary J.; Smyth, Thomas J. C.

    Because of the graphical interface and "openness" of the operating system, Macintosh computers are susceptible to undesirable changes by the user. This presentation discusses the advantages and disadvantages of software packages that offer protection for the Macintosh system. The two basic forms of software security packages include a…

  19. Advantages and Disadvantages in Image Processing with Free Software in Radiology.

    PubMed

    Mujika, Katrin Muradas; Méndez, Juan Antonio Juanes; de Miguel, Andrés Framiñan

    2018-01-15

    Currently, there are sophisticated applications that make it possible to visualize medical images and even to manipulate them. These software applications are of great interest, both from a teaching and a radiological perspective. In addition, some of these applications are known as Free Open Source Software because they are free and the source code is freely available, and therefore they can be easily obtained even on personal computers. Two examples of free open source software are Osirix Lite® and 3D Slicer®. However, these free applications have limitations in their use. In the radiological field, manipulating and post-processing images is increasingly important. Consequently, sophisticated computing tools that combine software and hardware to process medical images are needed. In radiology, graphic workstations allow their users to process, review, analyse, communicate and exchange multidimensional digital images acquired with different image-capturing radiological devices. These radiological devices are basically CT (Computerised Tomography), MRI (Magnetic Resonance Imaging), PET (Positron Emission Tomography), etc. Nevertheless, the programs included in these workstations have a high cost, which depends on the software provider and is subject to its norms and requirements. With this study, we aim to present the advantages and disadvantages of these radiological image visualization systems in the advanced management of radiological studies. We will compare the features of the VITREA2® and AW VolumeShare 5® radiology workstations with free open source software applications like OsiriX® and 3D Slicer®, with examples from specific studies.

  20. Enabling Arctic Research Through Science and Engineering Partnerships

    NASA Astrophysics Data System (ADS)

    Kendall, E. A.; Valentic, T. A.; Stehle, R. H.

    2014-12-01

    Under an Arctic Research Support and Logistics contract from NSF (GEO/PLR), SRI International, as part of the CH2M HILL Polar Services (CPS) program, forms partnerships with Arctic research teams to provide data transfer, remote operations, and safety/operations communications. This teamwork is integral to the success of real-time science results and often allows for unmanned operations, which are both cost-effective and safer. The CPS program utilizes a variety of communications networks, services and technologies to support researchers and instruments throughout the Arctic, including Iridium, VSAT, Inmarsat BGAN, HughesNet, TeleGreenland, radios, and personal locator beacons. Program-wide IT and communications limitations fall into the broad categories of bandwidth, availability, and power. At these sites it is essential to conserve bandwidth and power by using efficient software, coding and scheduling techniques. There are interesting new products and services on the horizon that the program may be able to take advantage of in the future, such as Iridium NEXT, Inmarsat Xpress, and Omnispace mobile satellite services. Additionally, there are engineering and computer software opportunities to develop more efficient products. We will present an overview of science/engineering partnerships formed by the CPS program, discuss current limitations and identify future technological possibilities that could further advance Arctic science goals.

  1. WarpEngine, a Flexible Platform for Distributed Computing Implemented in the VEGA Program and Specially Targeted for Virtual Screening Studies.

    PubMed

    Pedretti, Alessandro; Mazzolari, Angelica; Vistoli, Giulio

    2018-05-21

    The manuscript describes WarpEngine, a novel platform implemented within the VEGA ZZ suite of software for performing distributed simulations in both local and wide area networks. Despite being tailored for structure-based virtual screening campaigns, WarpEngine possesses the flexibility required to carry out distributed calculations using various pieces of software, which can be easily encapsulated within the platform without changing their source code. WarpEngine takes advantage of all the cheminformatics features implemented in the VEGA ZZ program, as well as its highly customizable scripting architecture, thus allowing an efficient distribution of various time-demanding simulations. To illustrate WarpEngine's potential, the manuscript includes a set of virtual screening campaigns based on the ACE data set of the DUD-E collection, using PLANTS as the docking application. Benchmarking analyses revealed satisfactory linearity of WarpEngine's performance, with speed-up values roughly equal to the number of utilized cores. The computed scalability values likewise showed that a vast majority (i.e., >90%) of the performed simulations benefit from the distributed platform presented here. WarpEngine can be freely downloaded along with the VEGA ZZ program at www.vegazz.net.
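
    For reference, the benchmark quantities reported here follow the standard definitions (our notation):

        S(n) = T_1 / T_n        (speed-up on n cores)
        E(n) = S(n) / n         (parallel efficiency; E(n) ≈ 1 indicates linear scaling)

    so the reported near-linear speed-up corresponds to a parallel efficiency close to one.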

  2. Using WNTR to Model Water Distribution System Resilience ...

    EPA Pesticide Factsheets

    The Water Network Tool for Resilience (WNTR) is a new open source Python package developed by the U.S. Environmental Protection Agency and Sandia National Laboratories to model and evaluate the resilience of water distribution systems. WNTR can be used to simulate a wide range of disruptive events, including earthquakes, contamination incidents, floods, climate change, and fires. The software includes the EPANET solver as well as a WNTR solver with the ability to model pressure-driven demand hydraulics, pipe breaks, component degradation and failure, changes to supply and demand, and cascading failure. Damage to individual components in the network (e.g., pipes, tanks) can be selected probabilistically using fragility curves. WNTR can also simulate different types of resilience-enhancing actions, including scheduled pipe repair or replacement, water conservation efforts, addition of back-up power, and use of contamination warning systems. The software can be used to estimate potential damage in a network, evaluate preparedness, prioritize repair strategies, and identify worst-case scenarios. As a Python package, WNTR takes advantage of many existing Python capabilities, including parallel processing of scenarios and graphics capabilities. This presentation will outline the modeling components in WNTR, demonstrate their use, give the audience information on how to get started using the code, and invite others to participate in this open source project.
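
    A brief usage sketch based on WNTR's documented Python API (the input file name is illustrative, and option names vary somewhat across versions):

        import wntr

        # Build a network model from an EPANET INP file (file name illustrative)
        wn = wntr.network.WaterNetworkModel('Net3.inp')

        # Run hydraulics with the WNTR solver; wntr.sim.EpanetSimulator wraps the
        # classic EPANET engine, while WNTRSimulator adds pressure-driven demand.
        sim = wntr.sim.WNTRSimulator(wn)
        results = sim.run_sim()

        pressure = results.node['pressure']  # pandas DataFrame: rows = times, columns = nodes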

  3. AIBench: a rapid application development framework for translational research in biomedicine.

    PubMed

    Glez-Peña, D; Reboiro-Jato, M; Maia, P; Rocha, M; Díaz, F; Fdez-Riverola, F

    2010-05-01

    Applied research in both biomedical discovery and translational medicine today often requires the rapid development of fully featured applications containing both advanced and specific functionalities, for real use in practice. New tools are therefore demanded that allow for efficient generation, deployment and reutilization of such biomedical applications as well as their associated functionalities. Against this background, this paper presents AIBench, an open-source Java desktop application framework for scientific software development, with the goal of providing support to both fundamental and applied research in the domain of translational biomedicine. AIBench incorporates a powerful plug-in engine and a flexible scripting platform, and takes advantage of Java annotations, reflection and various design principles in order to make it easy to use, lightweight and non-intrusive. By following a basic input-processing-output life cycle, it is possible to fully develop multiplatform applications using only three types of concepts: operations, data-types and views. The framework automatically provides functionalities that are present in a typical scientific application, including user parameter definition, logging facilities, multi-threading execution, experiment repeatability and user interface workflow management, among others. The proposed framework architecture defines a reusable component model which also allows assembling new applications by the reuse of libraries from past projects or third-party software.
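
    To make the operation/data-type/view idea concrete, here is our own illustration in Python (AIBench itself is a Java framework driven by annotations; all names below are invented for the example):

        class Operation:
            """Declares typed inputs and outputs; the framework can then generate
            parameter dialogs, logging, and threading around run()."""
            inputs = {}
            outputs = {}
            def run(self, **kwargs):
                raise NotImplementedError

        class NormalizeRows(Operation):
            inputs = {"matrix": "ExpressionMatrix"}       # data-type names illustrative
            outputs = {"normalized": "ExpressionMatrix"}
            def run(self, matrix):
                return {"normalized": [[x / max(row) for x in row] for row in matrix]}

        # A "view" renders a data-type; the framework wires operation outputs to views.
        def matrix_view(matrix):
            return "\n".join(" ".join(f"{x:.2f}" for x in row) for row in matrix)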

  4. Adaptive Software Architecture Based on Confident HCI for the Deployment of Sensitive Services in Smart Homes

    PubMed Central

    Vega-Barbas, Mario; Pau, Iván; Martín-Ruiz, María Luisa; Seoane, Fernando

    2015-01-01

    Smart spaces foster the development of natural and appropriate forms of human-computer interaction by taking advantage of home customization. The interaction potential of the Smart Home, which is a special type of smart space, is of particular interest in fields in which the acceptance of new technologies is limited and restrictive. The integration of smart home design patterns with sensitive solutions can increase user acceptance. In this paper, we present the main challenges that have been identified in the literature for the successful deployment of sensitive services (e.g., telemedicine and assistive services) in smart spaces and a software architecture that models the functionalities of a Smart Home platform that are required to maintain and support such sensitive services. This architecture emphasizes user interaction as a key concept to facilitate the acceptance of sensitive services by end-users and utilizes activity theory to support its innovative design. The application of activity theory to the architecture eases the handling of novel concepts, such as understanding of the system by patients at home or the affordability of assistive services. Finally, we provide a proof-of-concept implementation of the architecture and compare the results with other architectures from the literature. PMID:25815449

  5. Computer Technology for Industry

    NASA Technical Reports Server (NTRS)

    1979-01-01

    In this age of the computer, more and more business firms are automating their operations for increased efficiency in a great variety of jobs, from simple accounting to managing inventories, from precise machining to analyzing complex structures. In the interest of national productivity, NASA is providing assistance both to longtime computer users and newcomers to automated operations. Through a special technology utilization service, NASA saves industry time and money by making available already developed computer programs which have secondary utility. A computer program is essentially a set of instructions which tells the computer how to produce desired information or effect by drawing upon its stored input. Developing a new program from scratch can be costly and time-consuming. Very often, however, a program developed for one purpose can readily be adapted to a totally different application. To help industry take advantage of existing computer technology, NASA operates the Computer Software Management and Information Center (COSMIC)(registered TradeMark),located at the University of Georgia. COSMIC maintains a large library of computer programs developed for NASA, the Department of Defense, the Department of Energy and other technology-generating agencies of the government. The Center gets a continual flow of software packages, screens them for adaptability to private sector usage, stores them and informs potential customers of their availability.

  6. Viewpoints: A High-Performance High-Dimensional Exploratory Data Analysis Tool

    NASA Astrophysics Data System (ADS)

    Gazis, P. R.; Levit, C.; Way, M. J.

    2010-12-01

    Scientific data sets continue to increase in both size and complexity. In the past, dedicated graphics systems at supercomputing centers were required to visualize large data sets, but as the price of commodity graphics hardware has dropped and its capability has increased, it is now possible, in principle, to view large complex data sets on a single workstation. To do this in practice, an investigator will need software that is written to take advantage of the relevant graphics hardware. The Viewpoints visualization package described herein is an example of such software. Viewpoints is an interactive tool for exploratory visual analysis of large high-dimensional (multivariate) data. It leverages the capabilities of modern graphics boards (GPUs) to run on a single workstation or laptop. Viewpoints is minimalist: it attempts to do a small set of useful things very well (or at least very quickly) in comparison with similar packages today. Its basic feature set includes linked scatter plots with brushing, dynamic histograms, normalization, and outlier detection/removal. Viewpoints was originally designed for astrophysicists, but it has since been used in a variety of fields that range from astronomy, quantum chemistry, fluid dynamics, machine learning, bioinformatics, and finance to information technology server log mining. In this article, we describe the Viewpoints package and show examples of its usage.

  7. Techniques for designing rotorcraft control systems

    NASA Technical Reports Server (NTRS)

    Levine, William S.; Barlow, Jewel

    1993-01-01

    This report summarizes the work that was done on the project from 1 Apr. 1992 to 31 Mar. 1993. The main goal of this research is to develop a practical tool for rotorcraft control system design based on interactive optimization tools (CONSOL-OPTCAD) and classical rotorcraft design considerations (ADOCS). This approach enables the designer to combine engineering intuition and experience with parametric optimization. The combination should make it possible to produce a better design faster than would be possible using either pure optimization or pure intuition and experience. We emphasize that the goal of this project is not to develop an algorithm. It is to develop a tool. We want to keep the human designer in the design process to take advantage of his or her experience and creativity. The role of the computer is to perform the calculation necessary to improve and to display the performance of the nominal design. Briefly, during the first year we have connected CONSOL-OPTCAD, an existing software package for optimizing parameters with respect to multiple performance criteria, to a simplified nonlinear simulation of the UH-60 rotorcraft. We have also created mathematical approximations to the Mil-specs for rotorcraft handling qualities and input them into CONSOL-OPTCAD. Finally, we have developed the additional software necessary to use CONSOL-OPTCAD for the design of rotorcraft controllers.

  8. Loran-C flight test software

    NASA Technical Reports Server (NTRS)

    Nickum, J. D.

    1978-01-01

    The software package developed for the KIM-1 Micro-System and the Mini-L PLL receiver to simplify taking flight test data is described along with the address and data bus buffers used in the KIM-1 Micro-system. The interface hardware and timing are also presented to describe completely the software programs.

  9. Application of Real Options Theory to Software Engineering for Strategic Decision Making in Software Related Capital Investments

    DTIC Science & Technology

    2008-12-01

    between our current project and the historical projects. Therefore to refine the historical volatility estimate of the previously completed software... historical volatility estimates obtained in the form of beliefs and plausibility based on subjective probabilities that take into consideration unique

  10. NOBLE - Flexible concept recognition for large-scale biomedical natural language processing.

    PubMed

    Tseytlin, Eugene; Mitchell, Kevin; Legowski, Elizabeth; Corrigan, Julia; Chavan, Girish; Jacobson, Rebecca S

    2016-01-14

    Natural language processing (NLP) applications are increasingly important in biomedical data analysis, knowledge engineering, and decision support. Concept recognition is an important component task for NLP pipelines, and can be either general-purpose or domain-specific. We describe a novel, flexible, and general-purpose concept recognition component for NLP pipelines, and compare its speed and accuracy against five commonly used alternatives on both a biological and clinical corpus. NOBLE Coder implements a general algorithm for matching terms to concepts from an arbitrary vocabulary set. The system's matching options can be configured individually or in combination to yield specific system behavior for a variety of NLP tasks. The software is open source, freely available, and easily integrated into UIMA or GATE. We benchmarked speed and accuracy of the system against the CRAFT and ShARe corpora as reference standards and compared it to MMTx, MGrep, Concept Mapper, cTAKES Dictionary Lookup Annotator, and cTAKES Fast Dictionary Lookup Annotator. We describe key advantages of the NOBLE Coder system and associated tools, including its greedy algorithm, configurable matching strategies, and multiple terminology input formats. These features provide unique functionality when compared with existing alternatives, including state-of-the-art systems. On two benchmarking tasks, NOBLE's performance exceeded commonly used alternatives, performing almost as well as the most advanced systems. Error analysis revealed differences in error profiles among systems. NOBLE Coder is comparable to other widely used concept recognition systems in terms of accuracy and speed. Advantages of NOBLE Coder include its interactive terminology builder tool, ease of configuration, and adaptability to various domains and tasks. NOBLE provides a term-to-concept matching system suitable for general concept recognition in biomedical NLP pipelines.
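
    As an illustration of greedy dictionary-based concept matching of the general kind described (a simplified sketch, not NOBLE Coder's actual algorithm, which adds normalization, word-order options, and efficient dictionary data structures):

        VOCAB = {  # term tokens -> concept ID (toy vocabulary)
            ("heart", "attack"): "C0027051",
            ("heart",): "C0018787",
        }
        MAX_LEN = max(len(term) for term in VOCAB)

        def recognize(tokens):
            i, mentions = 0, []
            while i < len(tokens):
                for n in range(min(MAX_LEN, len(tokens) - i), 0, -1):
                    term = tuple(w.lower() for w in tokens[i:i + n])
                    if term in VOCAB:
                        mentions.append((i, term, VOCAB[term]))
                        i += n
                        break          # greedy: prefer the longest match
                else:
                    i += 1
            return mentions

        print(recognize("The patient had a heart attack".split()))
        # -> [(4, ('heart', 'attack'), 'C0027051')]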

  11. Fitmunk: improving protein structures by accurate, automatic modeling of side-chain conformations.

    PubMed

    Porebski, Przemyslaw Jerzy; Cymborowski, Marcin; Pasenkiewicz-Gierula, Marta; Minor, Wladek

    2016-02-01

    Improvements in crystallographic hardware and software have allowed automated structure-solution pipelines to approach a near-'one-click' experience for the initial determination of macromolecular structures. However, in many cases the resulting initial model requires a laborious, iterative process of refinement and validation. A new method has been developed for the automatic modeling of side-chain conformations that takes advantage of rotamer-prediction methods in a crystallographic context. The algorithm, which is based on deterministic dead-end elimination (DEE) theory, uses new dense conformer libraries and a hybrid energy function derived from experimental data and prior information about rotamer frequencies to find the optimal conformation of each side chain. In contrast to existing methods, which incorporate the electron-density term into protein-modeling frameworks, the proposed algorithm is designed to take advantage of the highly discriminatory nature of electron-density maps. This method has been implemented in the program Fitmunk, which uses extensive conformational sampling. This improves the accuracy of the modeling and makes it a versatile tool for crystallographic model building, refinement and validation. Fitmunk was extensively tested on over 115 new structures, as well as a subset of 1100 structures from the PDB. It is demonstrated that the ability of Fitmunk to model more than 95% of side chains accurately is beneficial for improving the quality of crystallographic protein models, especially at medium and low resolutions. Fitmunk can be used for model validation of existing structures and as a tool to assess whether side chains are modeled optimally or could be better fitted into electron density. Fitmunk is available as a web service at http://kniahini.med.virginia.edu/fitmunk/server/ or at http://fitmunk.bitbucket.org/.

  12. Software Requirements Specification for an Ammunition Management System

    DTIC Science & Technology

    1986-09-01

    thesis takes the form of a software requirements specification. Such a specification, according to Pressman [Ref. 7], establishes a complete...defined by Pressman , is depicted in Figure 1.1. 11 Figure 1.1 Generalized Software Life Cycle The common thread which binds the various phases together...application of software engineering principles requires an established methodology. This methodology, according to Pressman [Ref. 8:p. 151 is an

  13. Developing Confidence Limits For Reliability Of Software

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J.

    1991-01-01

    Technique developed for estimating reliability of software by use of Moranda geometric de-eutrophication model. Pivotal method enables straightforward construction of exact bounds with associated degree of statistical confidence about reliability of software. Confidence limits thus derived provide precise means of assessing quality of software. Limits take into account number of bugs found while testing and effects of sampling variation associated with random order of discovering bugs.
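
    For context, the Moranda geometric de-eutrophication model assumes that the software failure rate drops by a constant factor each time a bug is found and fixed; in the standard form (our notation):

        λ_i = D · k^(i-1),   0 < k < 1

    where λ_i is the failure rate between discovery of the (i-1)th and ith bugs, D is the initial failure rate, and k is the de-eutrophication (improvement) factor. Confidence bounds on reliability then follow from the number of bugs observed during testing.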

  14. A Highly Scalable Data Service (HSDS) using Cloud-based Storage Technologies for Earth Science Data

    NASA Astrophysics Data System (ADS)

    Michaelis, A.; Readey, J.; Votava, P.; Henderson, J.; Willmore, F.

    2017-12-01

    Cloud-based infrastructure may offer several key benefits (scalability, built-in redundancy, security mechanisms, and reduced total cost of ownership) compared with a traditional data center approach. However, most of the tools and legacy software systems developed for online data repositories within the federal government were not designed with cloud-based infrastructure in mind and do not fully take advantage of commonly available cloud-based technologies. Moreover, services based on object storage are well established and provided by all the leading cloud service providers (Amazon Web Services, Microsoft Azure, Google Cloud, etc.), and can often provide unmatched "scale-out" capabilities and data availability to a large and growing consumer base at a price point unachievable with in-house solutions. We describe a system that uses object storage rather than traditional file-system-based storage to vend earth science data. The system described is not only cost-effective but shows a performance advantage for running many different analytics tasks in the cloud. To enable compatibility with existing tools and applications, we outline client libraries that are API compatible with existing libraries for HDF5 and NetCDF4. Performance of the system is demonstrated using cloud services running on Amazon Web Services.
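
    A brief sketch of the API-compatibility idea using the h5pyd client library, which mirrors the h5py interface (the domain path, dataset name, and endpoint below are illustrative placeholders):

        import h5pyd  # client library with an h5py-compatible interface

        # Open a server-side HSDS "domain" instead of a local HDF5 file
        f = h5pyd.File("/shared/sample/data.h5", "r",
                       endpoint="http://hsds.example.org")
        dset = f["/temperature"]       # same indexing style as h5py
        tile = dset[0:10, 0:10]        # the server reads only the selected chunks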

  15. Fully parallel write/read in resistive synaptic array for accelerating on-chip learning

    NASA Astrophysics Data System (ADS)

    Gao, Ligang; Wang, I.-Ting; Chen, Pai-Yu; Vrudhula, Sarma; Seo, Jae-sun; Cao, Yu; Hou, Tuo-Hung; Yu, Shimeng

    2015-11-01

    A neuro-inspired computing paradigm beyond the von Neumann architecture is emerging; it generally takes advantage of massive parallelism and is aimed at complex tasks that involve intelligence and learning. The cross-point array architecture with synaptic devices has been proposed for on-chip implementation of the weighted sum and weight update in the learning algorithms. In this work, forming-free, silicon-process-compatible Ta/TaOx/TiO2/Ti synaptic devices are fabricated, in which >200 levels of conductance states could be continuously tuned by identical programming pulses. In order to demonstrate the advantages of parallelism of the cross-point array architecture, a novel fully parallel write scheme is designed and experimentally demonstrated in a small-scale crossbar array to accelerate the weight update in the training process, at a speed that is independent of the array size. Compared to the conventional row-by-row write scheme, it achieves >30× speed-up and >30× improvement in energy efficiency as projected in a large-scale array. If realistic synaptic device characteristics such as device variations are taken into account in an array-level simulation, the proposed array architecture is able to achieve ∼95% recognition accuracy of MNIST handwritten digits, which is close to the accuracy achieved by software using the ideal sparse coding algorithm.

  16. Reuse Metrics for Object Oriented Software

    NASA Technical Reports Server (NTRS)

    Bieman, James M.

    1998-01-01

    One way to increase the quality of software products and the productivity of software development is to reuse existing software components when building new software systems. In order to monitor improvements in reuse, the level of reuse must be measured. In this NASA supported project we (1) derived a suite of metrics which quantify reuse attributes for object oriented, object based, and procedural software, (2) designed prototype tools to take these measurements in Ada, C++, Java, and C software, (3) evaluated the reuse in available software, (4) analyzed the relationship between coupling, cohesion, inheritance, and reuse, (5) collected object oriented software systems for our empirical analyses, and (6) developed quantitative criteria and methods for restructuring software to improve reusability.

  17. Research into software executives for space operations support

    NASA Technical Reports Server (NTRS)

    Collier, Mark D.

    1990-01-01

    Research concepts pertaining to a software (workstation) executive which will support a distributed processing command and control system characterized by high-performance graphics workstations used as computing nodes are presented. Although a workstation-based distributed processing environment offers many advantages, it also introduces a number of new concerns. In order to solve these problems, allow the environment to function as an integrated system, and present a functional development environment to application programmers, it is necessary to develop an additional layer of software. This 'executive' software integrates the system, provides real-time capabilities, and provides the tools necessary to support the application requirements.

  18. Factoring symmetric indefinite matrices on high-performance architectures

    NASA Technical Reports Server (NTRS)

    Jones, Mark T.; Patrick, Merrell L.

    1990-01-01

    The Bunch-Kaufman algorithm is the method of choice for factoring symmetric indefinite matrices in many applications. However, the Bunch-Kaufman algorithm does not take advantage of high-performance architectures such as the Cray Y-MP. Three new algorithms, based on Bunch-Kaufman factorization, that take advantage of such architectures are described. Results from an implementation of the third algorithm are presented.
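
    For illustration, LAPACK's Bunch-Kaufman factorization is exposed in SciPy, so the baseline (reference) algorithm discussed here can be exercised directly:

        import numpy as np
        from scipy.linalg import ldl   # wraps LAPACK ?sytrf (Bunch-Kaufman pivoting)

        rng = np.random.default_rng(0)
        A = rng.standard_normal((6, 6))
        A = A + A.T                    # symmetric but, in general, indefinite

        L, D, perm = ldl(A)            # D contains 1x1 and 2x2 pivot blocks
        assert np.allclose(L @ D @ L.T, A)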

  19. Improving Military Educational Benefits

    DTIC Science & Technology

    1983-03-16

    military to take advantage of their educational benefits. o Adding recruiters or increasing bonuses are less costly ways to increase the number of...as members left to take advantage of their benefits -- poorer retention would cancel out five percentage points of that gain. The overall cost of our...IMPROVING MILITARY EDUCATIONAL BENEFITS Statement of Robert F. Hale, Assistant Director for National Security and International Affairs

  20. KinSNP software for homozygosity mapping of disease genes using SNP microarrays

    PubMed Central

    2010-01-01

    Consanguineous families affected with a recessive genetic disease caused by homozygotisation of a mutation offer a unique advantage for positional cloning of rare diseases. Homozygosity mapping of patient genotypes is a powerful technique for the identification of the genomic locus harbouring the causative mutation. This strategy relies on the observation that in these patients a large region spanning the disease locus is also homozygous with high probability. The high marker density in single nucleotide polymorphism (SNP) arrays is extremely advantageous for homozygosity mapping. We present KinSNP, a user-friendly software tool for homozygosity mapping using SNP arrays. The software searches for stretches of SNPs which are homozygous for the same allele in all ascertained affected individuals. User-specified parameters control the number of allowed genotyping 'errors' within homozygous blocks. Candidate disease regions are then reported in a detailed, coloured Excel file, along with genotypes of family members and healthy controls. An interactive genome browser has been included which shows homozygous blocks, individual genotypes, genes and further annotations along the chromosomes, with zooming and scrolling capabilities. The software has been used to identify the location of a mutated gene causing insensitivity to pain in a large Bedouin family. KinSNP is freely available from http://bioinfo.bgu.ac.il/bsu/software/kinSNP. PMID:20846928
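
    An illustrative sketch of the block-search idea (our simplification, not KinSNP's actual code): find runs of SNPs at which every affected individual is homozygous for the same allele, tolerating a small number of genotyping "errors" between runs.

        def runs(flags):
            """Yield half-open (start, end) runs where flags[i] is True."""
            start = None
            for i, f in enumerate(list(flags) + [False]):   # sentinel flushes last run
                if f and start is None:
                    start = i
                elif not f and start is not None:
                    yield (start, i)
                    start = None

        def shared_homozygous_blocks(genos, max_gap=2, min_len=50):
            # genos: one genotype string per affected individual;
            # 'A'/'B' = homozygous for either allele, 'H' = heterozygous.
            n = len(genos[0])
            consistent = [all(g[i] == genos[0][i] for g in genos) and genos[0][i] in "AB"
                          for i in range(n)]
            blocks = []
            for start, end in runs(consistent):
                if blocks and start - blocks[-1][1] <= max_gap:
                    blocks[-1] = (blocks[-1][0], end)       # absorb a small error gap
                else:
                    blocks.append((start, end))
            return [(s, e) for s, e in blocks if e - s >= min_len]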

  1. The Consumer Juggernaut: Web-Based and Mobile Applications as Innovation Pioneer

    NASA Astrophysics Data System (ADS)

    Messerschmitt, David G.

    As happened previously in electronics, software targeted at consumers is increasingly the focus of investment and innovation. Some of the areas where it is leading are animated interfaces, treating users as a community, audio and video information, software as a service, agile software development, and the integration of business models with software design. As a risk-taking and experimental market, and as a source of ideas, consumer software can benefit other areas of applications software. The influence of consumer software can be magnified by research into the internal organizations and processes of the innovative firms at its foundation.

  2. Lessons from 30 Years of Flight Software

    NASA Technical Reports Server (NTRS)

    McComas, David C.

    2015-01-01

    This presentation takes a brief historical look at flight software over the past 30 years, extracts lessons learned and shows how many of the lessons learned are embodied in the Flight Software product line called the core Flight System (cFS). It also captures the lessons learned from developing and applying the cFS.

  3. Optimal Software Strategies in the Presence of Network Externalities

    ERIC Educational Resources Information Center

    Liu, Yipeng

    2009-01-01

    Network externalities or alternatively termed network effects are pervasive in computer software markets. While software vendors consider pricing strategies, they must also take into account the impact of network externalities on their sales. My main interest in this research is to describe a firm's strategies and behaviors in the presence of…

  4. SONAR: A High-Throughput Pipeline for Inferring Antibody Ontogenies from Longitudinal Sequencing of B Cell Transcripts.

    PubMed

    Schramm, Chaim A; Sheng, Zizhang; Zhang, Zhenhai; Mascola, John R; Kwong, Peter D; Shapiro, Lawrence

    2016-01-01

    The rapid advance of massively parallel or next-generation sequencing technologies has made possible the characterization of B cell receptor repertoires in ever greater detail, and these developments have triggered a proliferation of software tools for processing and annotating these data. Of especial interest, however, is the capability to track the development of specific antibody lineages across time, which remains beyond the scope of most current programs. We have previously reported on the use of techniques such as inter- and intradonor analysis and CDR3 tracing to identify transcripts related to an antibody of interest. Here, we present Software for the Ontogenic aNalysis of Antibody Repertoires (SONAR), capable of automating both general repertoire analysis and specialized techniques for investigating specific lineages. SONAR annotates next-generation sequencing data, identifies transcripts in a lineage of interest, and tracks lineage development across multiple time points. SONAR also generates figures, such as identity-divergence plots and longitudinal phylogenetic "birthday" trees, and provides interfaces to other programs such as DNAML and BEAST. SONAR can be downloaded as a ready-to-run Docker image or manually installed on a local machine. In the latter case, it can also be configured to take advantage of a high-performance computing cluster for the most computationally intensive steps, if available. In summary, this software provides a useful new tool for the processing of large next-generation sequencing datasets and the ontogenic analysis of neutralizing antibody lineages. SONAR can be found at https://github.com/scharch/SONAR, and the Docker image can be obtained from https://hub.docker.com/r/scharch/sonar/.

  5. General Purpose Fortran Program for Discrete-Ordinate-Method Radiative Transfer in Scattering and Emitting Layered Media: An Update of DISORT

    NASA Technical Reports Server (NTRS)

    Tsay, Si-Chee; Stamnes, Knut; Wiscombe, Warren; Laszlo, Istvan; Einaudi, Franco (Technical Monitor)

    2000-01-01

    This update reports a state-of-the-art discrete ordinate algorithm for monochromatic unpolarized radiative transfer in non-isothermal, vertically inhomogeneous, but horizontally homogeneous media. The physical processes included are Planckian thermal emission, scattering with arbitrary phase function, absorption, and surface bidirectional reflection. The system may be driven by parallel or isotropic diffuse radiation incident at the top boundary, as well as by internal thermal sources and thermal emission from the boundaries. Radiances, fluxes, and mean intensities are returned at user-specified angles and levels. DISORT has enjoyed considerable popularity in the atmospheric science and other communities since its introduction in 1988. Several new DISORT features are described in this update: intensity correction algorithms designed to compensate for the δ-M (delta-M) forward-peak scaling and obtain accurate intensities even in low orders of approximation; a more general surface bidirectional reflection option; and an exponential-linear approximation of the Planck function allowing more accurate solutions in the presence of large temperature gradients. DISORT has been designed to be an exemplar of good scientific software as well as a program of intrinsic utility. An extraordinary effort has been made to make it numerically well-conditioned, error-resistant, and user-friendly, and to take advantage of robust existing software tools. A thorough test suite is provided to verify the program both against published results, and for consistency where there are no published results. This careful attention to software design has been just as important in DISORT's popularity as its powerful algorithmic content.

  6. The Automated Instrumentation and Monitoring System (AIMS) reference manual

    NASA Technical Reports Server (NTRS)

    Yan, Jerry; Hontalas, Philip; Listgarten, Sherry

    1993-01-01

    Whether a researcher is designing the 'next parallel programming paradigm,' another 'scalable multiprocessor' or investigating resource allocation algorithms for multiprocessors, a facility that enables parallel program execution to be captured and displayed is invaluable. Careful analysis of execution traces can help computer designers and software architects to uncover system behavior and to take advantage of specific application characteristics and hardware features. A software tool kit that facilitates performance evaluation of parallel applications on multiprocessors is described. The Automated Instrumentation and Monitoring System (AIMS) has four major software components: a source code instrumentor which automatically inserts active event recorders into the program's source code before compilation; a run-time performance-monitoring library, which collects performance data; a trace file animation and analysis tool kit which reconstructs program execution from the trace file; and a trace post-processor which compensates for data collection overhead. Besides being used as a prototype for developing new techniques for instrumenting, monitoring, and visualizing parallel program execution, AIMS is also being incorporated into the run-time environments of various hardware test beds to evaluate their impact on user productivity. Currently, AIMS instrumentors accept FORTRAN and C parallel programs written for Intel's NX operating system on the iPSC family of multicomputers. A run-time performance-monitoring library for the iPSC/860 is included in this release. We plan to release monitors for other platforms (such as PVM and TMC's CM-5) in the near future. Performance data collected can be graphically displayed on workstations (e.g. Sun Sparc and SGI) supporting X-Windows (in particular, X11R5, Motif 1.1.3).

  7. FPGA Coprocessor for Accelerated Classification of Images

    NASA Technical Reports Server (NTRS)

    Pingree, Paula J.; Scharenbroich, Lucas J.; Werne, Thomas A.

    2008-01-01

    An effort related to that described in the preceding article focuses on developing a spaceborne processing platform for fast and accurate onboard classification of image data, a critical part of modern satellite image processing. The approach again has been to exploit the versatility of the recently developed hybrid Virtex-4FX field-programmable gate array (FPGA) to run diverse science applications on embedded processors while taking advantage of the reconfigurable hardware resources of the FPGAs. In this case, the FPGA serves as a coprocessor that implements legacy C-language support-vector-machine (SVM) image-classification algorithms to detect and identify natural phenomena such as flooding, volcanic eruptions, and sea-ice break-up. The FPGA provides hardware acceleration for greater onboard processing capability than previously demonstrated in software. The original C-language program, demonstrated on an imaging instrument aboard the Earth Observing-1 (EO-1) satellite, implements a linear-kernel SVM algorithm for classifying parts of the images as snow, water, ice, land, cloud, or unclassified. Current onboard processors, such as on EO-1, have limited computing power, extremely limited active storage capability and are no longer considered state-of-the-art. Using commercially available software that translates C-language programs into hardware description language (HDL) files, the legacy C-language program, and two newly formulated programs for a more capable expanded-linear-kernel and a more accurate polynomial-kernel SVM algorithm, have been implemented in the Virtex-4FX FPGA. In tests, the FPGA implementations have exhibited significant speedups over conventional software implementations running on general-purpose hardware.
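
    For reference, the two kernel forms mentioned take the standard shapes below (a generic sketch; parameter values and names are ours, not those of the flight implementation):

        import numpy as np

        def decision(x, support_vecs, alpha_y, b, kernel):
            """Generic kernel-SVM decision function."""
            return np.sign(sum(ay * kernel(sv, x)
                               for sv, ay in zip(support_vecs, alpha_y)) + b)

        linear_kernel = lambda u, v: u @ v                        # legacy EO-1 classifier
        poly_kernel = lambda u, v, d=2, c=1.0: (u @ v + c) ** d   # more accurate variant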

  8. Applying heuristic evaluation to improve the usability of a telemedicine system.

    PubMed

    Tang, Zhihua; Johnson, Todd R; Tindall, R Douglas; Zhang, Jiajie

    2006-02-01

    The development of a telemedicine system should not only take advantage of technological advances but also pay close attention to users and the human issues involved. In this paper we examine the utility of heuristic evaluation in improving the usability of a digital emergency medical services (EMS) system equipped on an ambulance. The digital EMS system used advanced communication technologies to help remotely located trauma specialists gain access to patient data in real-time and direct life-saving measures in a timely fashion. To improve its usability, three experts inspected prototypes of the system according to 14 software usability heuristics. The analyses revealed information on the prevalence, severity, and nature of heuristic violations in the user interface design. The results were subsequently utilized to guide the iterative software design process. A comparison between two consecutive prototypes showed that the second design had only half as many usability violations as the first prototype and had considerable improvement in a number of usability heuristic categories. The validity of heuristic evaluation was examined in an ethnographic study of paramedics using a prototype of the system in their work environment. Users' task performances partially verified heuristic evaluation results. However, they also revealed problems that were not identified in heuristic evaluation but only became prominent during field observation. In conclusion, we argue that usability should be given high priority in the development of a telemedicine system, and that heuristic evaluation can be an effective and efficient way to identify usability problems in the early stage of software development.

  9. Application of troposphere model from NWP and GNSS data into real-time precise positioning

    NASA Astrophysics Data System (ADS)

    Wilgan, Karina; Hadas, Tomasz; Kazmierski, Kamil; Rohm, Witold; Bosy, Jaroslaw

    2016-04-01

    Empirical tropospheric delay models are usually functions of meteorological parameters (temperature, pressure and humidity). The application of standard atmosphere parameters or global models, such as the GPT (global pressure/temperature) model or the UNB3 (University of New Brunswick, version 3) model, may not be sufficient, especially for positioning in non-standard weather conditions. A possible solution is to use regional troposphere models based on real-time or near-real-time measurements. We implement a regional troposphere model in the PPP (Precise Point Positioning) software GNSS-WARP (Wroclaw Algorithms for Real-time Positioning) developed at Wroclaw University of Environmental and Life Sciences. The software is capable of processing static and kinematic multi-GNSS data in real-time and post-processing mode and takes advantage of final IGS (International GNSS Service) products as well as IGS RTS (Real-Time Service) products. A shortcoming of the PPP technique is the time required for the solution to converge. One of the reasons is the high correlation among the estimated parameters: troposphere delay, receiver clock offset and receiver height. To decorrelate these parameters efficiently, a significant change in satellite geometry is required. An alternative solution is to introduce an external high-quality regional troposphere delay model to constrain the troposphere estimates. The proposed model consists of zenith total delays (ZTD) and mapping functions calculated from meteorological parameters from the Numerical Weather Prediction model WRF (Weather Research and Forecasting) and from ZTDs from ground-based GNSS stations, combined using the least-squares collocation software COMEDIE (Collocation of Meteorological Data for Interpretation and Estimation of Tropospheric Pathdelays) developed at ETH Zurich.
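
    For reference, the modeled quantities are related in the standard way (conventional notation):

        ZTD = ZHD + ZWD
        STD(e) = mf_h(e) · ZHD + mf_w(e) · ZWD

    where ZHD and ZWD are the zenith hydrostatic and wet delays, e is the satellite elevation angle, and mf_h and mf_w are the hydrostatic and wet mapping functions that project the zenith delays onto the slant path (STD).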

  10. An integrated approach for increasing breeding efficiency in apple and peach in Europe.

    PubMed

    Laurens, Francois; Aranzana, Maria José; Arus, Pere; Bassi, Daniele; Bink, Marco; Bonany, Joan; Caprera, Andrea; Corelli-Grappadelli, Luca; Costes, Evelyne; Durel, Charles-Eric; Mauroux, Jehan-Baptiste; Muranty, Hélène; Nazzicari, Nelson; Pascal, Thierry; Patocchi, Andrea; Peil, Andreas; Quilot-Turion, Bénédicte; Rossini, Laura; Stella, Alessandra; Troggio, Michela; Velasco, Riccardo; van de Weg, Eric

    2018-01-01

    Despite the availability of whole genome sequences of apple and peach, there has been a considerable gap between genomics and breeding. To bridge the gap, the European Union funded the FruitBreedomics project (March 2011 to August 2015) involving 28 research institutes and private companies. Three complementary approaches were pursued: (i) tool and software development, (ii) deciphering the genetic control of the main horticultural traits taking allelic diversity into account, and (iii) developing plant materials, tools and methodologies for breeders. Decisive breakthroughs were made, including ready-to-go DNA diagnostic tests for Marker Assisted Breeding, development of new, dense SNP arrays in apple and peach, new phenotyping methods for some complex traits, and software for gene/QTL discovery on breeding germplasm via Pedigree Based Analysis (PBA). This resulted in the discovery of highly predictive molecular markers for traits of horticultural interest via PBA and via Genome Wide Association Studies (GWAS) on several European genebank collections. FruitBreedomics also developed pre-breeding plant materials in which multiple sources of resistance were pyramided, and software that can support breeders in their selection activities. Through FruitBreedomics, significant progress was made in apple and peach breeding, genetics, genomics and bioinformatics, of which breeders, germplasm curators and scientists will take advantage. A major part of the data collected during the project has been stored in the FruitBreedomics database and made available to the public. This review covers the scientific discoveries made in this major endeavour, and gives perspectives on apple and peach breeding and genomics in Europe and beyond.

  11. Performance of twist-coupled blades on variable speed rotors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lobitz, D.W.; Veers, P.S.; Laino, D.J.

    1999-12-07

    The load mitigation and energy capture characteristics of twist-coupled HAWT blades that are mounted on a variable speed rotor are investigated in this paper. These blades are designed to twist toward feather as they bend with pretwist set to achieve a desirable twist distribution at rated power. For this investigation, the ADAMS-WT software has been modified to include blade models with bending-twist coupling. Using twist-coupled and uncoupled models, the ADAMS software is exercised for steady wind environments to generate C_p curves at a number of operating speeds to compare the efficiencies of the two models. The ADAMS software is also used to generate the response of a twist-coupled variable speed rotor to a spectrum of stochastic wind time series. This spectrum contains time series with two mean wind speeds at two turbulence levels. Power control is achieved by imposing a reactive torque on the low speed shaft proportional to the RPM squared with the coefficient specified so that the rotor operates at peak efficiency in the linear aerodynamic range, and by limiting the maximum RPM to take advantage of the stall controlled nature of the rotor. Fatigue calculations are done for the generated load histories using a range of material exponents that represent materials from welded steel to aluminum to composites, and results are compared with the damage computed for the rotor without twist-coupling. Results indicate that significant reductions in damage are achieved across the spectrum of applied wind loading without any degradation in power production.
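
    For context, fatigue damage from a load history is conventionally accumulated with Miner's rule, where the material exponent m mentioned above sets the slope of the S-N curve (standard formulation, our notation):

        D = Σ_i n_i / N_i,   with   N_i = C / S_i^m

    where n_i is the number of cycles at stress range S_i, N_i the cycles to failure at that range, and C a material constant; the larger m is (composites versus welded steel), the more the damage is dominated by the largest load cycles.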

  12. The Automated Instrumentation and Monitoring System (AIMS): Design and Architecture. 3.2

    NASA Technical Reports Server (NTRS)

    Yan, Jerry C.; Schmidt, Melisa; Schulbach, Cathy; Bailey, David (Technical Monitor)

    1997-01-01

    Whether a researcher is designing the 'next parallel programming paradigm', another 'scalable multiprocessor' or investigating resource allocation algorithms for multiprocessors, a facility that enables parallel program execution to be captured and displayed is invaluable. Careful analysis of such information can help computer and software architects to capture, and therefore exploit, behavioral variations among and within parallel programs to take advantage of specific hardware characteristics. A software tool-set that facilitates performance evaluation of parallel applications on multiprocessors has been put together at NASA Ames Research Center under the sponsorship of NASA's High Performance Computing and Communications Program over the past five years. The Automated Instrumentation and Monitoring System (AIMS) has three major software components: a source code instrumentor which automatically inserts active event recorders into program source code before compilation; a run-time performance monitoring library which collects performance data; and a visualization tool-set which reconstructs program execution based on the data collected. Besides being used as a prototype for developing new techniques for instrumenting, monitoring and presenting parallel program execution, AIMS is also being incorporated into the run-time environments of various hardware testbeds to evaluate their impact on user productivity. Currently, the execution of FORTRAN and C programs on the Intel Paragon and PALM workstations can be automatically instrumented and monitored. Performance data thus collected can be displayed graphically on various workstations. The process of performance tuning with AIMS will be illustrated using various NAS Parallel Benchmarks. This report includes a description of the internal architecture of AIMS and a listing of the source code.

  13. Selecting Advanced Software Technology in Two Small Manufacturing Enterprises

    DTIC Science & Technology

    2004-05-01

    improving workflow to further reduce delivery times, enhance customer service, and obtain a competitive advantage. The company wanted help... environment, stakeholders' needs, ecommerce, shop floor visualization, and collaboration capability. These statements are not significantly different...for the purpose of describing a software environment. This identification does not imply any recommendation or endorsement by NIST, the SEI, CMU, or

  14. The Advantages and Disadvantages of Five Common Computer Assisted Instruction Modes.

    ERIC Educational Resources Information Center

    Davidson, Robert L.; Traylor, Karen

    1987-01-01

    This article reviews five modes of computer-assisted instruction software so that teachers will be more aware of them and use computers more in their classrooms. The five modes are the following: (1) drill and practice; (2) tutorial; (3) simulation; (4) demonstration; and (5) instructional games. Teachers should review software and choose programs that meet…

  15. New Software for Market Segmentation Analysis: A Chi-Square Interaction Detector. AIR 1983 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Lay, Robert S.

    The advantages and disadvantages of new software for market segmentation analysis are discussed, and the application of this new, chi-square based procedure (CHAID), is illustrated. A comparison is presented of an earlier, binary segmentation technique (THAID) and a multiple discriminant analysis. It is suggested that CHAID is superior to earlier…

  16. Quick and Easy: Use Screen Capture Software to Train and Communicate

    ERIC Educational Resources Information Center

    Schuster, Ellen

    2011-01-01

    Screen capture (screen cast) software can be used to develop short videos for training purposes. Developing videos is quick and easy. This article describes how these videos are used as tools to reinforce face-to-face and interactive TV curriculum training in a nutrition education program. Advantages of developing these videos are shared.…

  17. A Review of MS-DOS Bulletin Board Software Suitable for Long Distance Learning.

    ERIC Educational Resources Information Center

    Sessa, Anneliese

    This paper describes the advantages of using computer bulletin boards systems (BBS) for distance learning, including the use of the New York City Education Network (NYCENET) to access various databases and to communicate with individuals or the public. Questions to be answered in order to determine the most appropriate software for running a BBS…

  18. Passive Superconducting Shielding: Experimental Results and Computer Models

    NASA Technical Reports Server (NTRS)

    Warner, B. A.; Kamiya, K.

    2003-01-01

    Passive superconducting shielding for magnetic refrigerators has advantages over active shielding and passive ferromagnetic shielding in that it is lightweight and easy to construct. However, it is not as easy to model and does not fail gracefully. Failure of a passive superconducting shield may lead to persistent flux and persistent currents. Unfortunately, modeling software for superconducting materials is not as easily available as is software for simple coils or for ferromagnetic materials. This paper will discuss ways of using available software to model passive superconducting shielding.

  19. A general observatory control software framework design for existing small and mid-size telescopes

    NASA Astrophysics Data System (ADS)

    Ge, Liang; Lu, Xiao-Meng; Jiang, Xiao-Jun

    2015-07-01

    A general framework for observatory control software would help to improve the efficiency of observation and operation of telescopes, and would also be advantageous for remote and joint observations. We describe a general framework for observatory control software, which considers principles of flexibility and inheritance to meet the expectations from observers and technical personnel. This framework includes observation scheduling, device control and data storage. The design is based on a finite state machine that controls the whole process.
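
    As a rough illustration of the finite-state-machine idea, consider the sketch below. The states and events are hypothetical, not the framework's actual design; they show how a table of allowed transitions drives a whole observation cycle.

```python
# Allowed transitions for one observation cycle: (state, event) -> next state.
TRANSITIONS = {
    ("idle",     "schedule"):  "slewing",
    ("slewing",  "on_target"): "exposing",
    ("exposing", "readout"):   "storing",
    ("storing",  "done"):      "idle",
}

class ObservatoryFSM:
    def __init__(self):
        self.state = "idle"

    def handle(self, event):
        key = (self.state, event)
        if key not in TRANSITIONS:
            raise ValueError(f"event {event!r} not allowed in state {self.state!r}")
        self.state = TRANSITIONS[key]
        return self.state

fsm = ObservatoryFSM()
for ev in ("schedule", "on_target", "readout", "done"):
    print(ev, "->", fsm.handle(ev))
```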

  20. Software Maintenance Exercises for a Software Engineering Project Course

    DTIC Science & Technology

    1989-02-01

    what is program style and how can it be measured? Program style has been defined as a "followed convention with respect to punctuation, capitalization ...convention with respect to punctuation, capitalization, and typographic arrangement and display." *DASC is a software tool that takes a syntactically...Specifications: A Framework * CM-12 Software Metrics CM-13 Introduction to Software Verification and Validation CM-14 Intellectual Property Protection for

  1. Software Development for EECU Platform of Turbofan Engine

    NASA Astrophysics Data System (ADS)

    Kim, Bo Gyoung; Kwak, Dohyup; Kim, Byunghyun; Choi, Hee ju; Kong, Changduk

    2017-04-01

    The operation of a turbofan engine involves a number of hardware and software components. The engine is controlled by an Electronic Engine Control Unit (EECU). In order to control the engine, the EECU communicates with the aircraft system, the Actuator Drive Unit (ADU), the Engine Power Unit (EPU), and sensors on the engine. This paper investigates the process from starting to take-off, designs the EECU software modes, and defines the communication data format. The software is implemented according to the designed software modes.

  2. Development of a Cross-Flow Fan Rotor for Vertical Take-Off and Landing Aircraft

    DTIC Science & Technology

    2013-06-01

    ANSYS CFX, along with the commercial computer-aided design software SolidWorks, was used to model and perform a parametric study on the number of rotor...the results found using ANSYS CFX. The experimental and analytical models were successfully compared at speeds ranging from 4,000 to 7,000 RPM...will make vertical take-off possible. The commercial computational fluid dynamics software ANSYS CFX, along with the commercial computer-aided design

  3. India's Information Technology Sector: What Contribution to Broader Economic Development? OECD Development Centre Working Paper, No. 207 (Formerly Technical Paper No. 207)

    ERIC Educational Resources Information Center

    Singh, Nirvikar

    2003-01-01

    What contribution can information technology (IT) make to India's overall economic development? This paper provides an analytical framework centred around the concepts of comparative advantage, complementarities, and innovation. There is strong evidence that India has a strong and sustainable comparative advantage in software development and…

  4. 75 FR 21088 - Self-Regulatory Organizations; NYSE Amex LLC; Notice of Filing of Proposed Rule Change To...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-22

    ... example, for a single device to which the Vendor has granted two people access, the Vendor should report... Amex BBO service in any calendar month. In order to take advantage of the per-query fee, a NYSE Amex... anticipates will be the most likely to take advantage of the proposed service; (iv) the contribution of market...

  5. User-driven integrated software lives: ``Paleomag'' paleomagnetics analysis on the Macintosh

    NASA Astrophysics Data System (ADS)

    Jones, Craig H.

    2002-12-01

    "PaleoMag," a paleomagnetics analysis package originally developed for the Macintosh operating system in 1988, allows examination of demagnetization of individual samples and analysis of directional data from collections of samples. Prior to recent reinvigorated development of the software for both Macintosh and Windows, it was widely used despite not running properly on machines and operating systems sold after 1995. This somewhat surprising situation demonstrates that there is a continued need for integrated analysis software within the earth sciences, in addition to well-developed scripting and batch-mode software. One distinct advantage of software like PaleoMag is in the ability to combine quality control with analysis within a unique graphical environment. Because such demands are frequent within the earth sciences, means of nurturing the development of similar software should be found.

  6. Teaching science and ethics to undergraduates: a multidisciplinary approach.

    PubMed

    McGowan, Alan H

    2013-06-01

    The teaching of the ethical implications of scientific advances in science courses for undergraduates has significant advantages for both science and non-science majors. The article describes three courses taught by the author as examples of the concept, and examines the disadvantages as well as the advantages. A significant advantage of this approach is that many students who would not otherwise take science enroll in the courses primarily because of the ethical component. A disadvantage is less time in the course for the science; arguably, this is outweighed by the greater retention of the science when it is put into context.

  7. Are Future Teachers Methodically Trained to Distinguish Good from Bad Educational Software?

    ERIC Educational Resources Information Center

    Pjanic, Karmelita; Hamzabegovic, Jasna

    2016-01-01

    In the era of information technology and general digitization of society, an invasion of every kind of software is evident. However laudable the existence and development of educational software may be, its quality and whether it achieves the desired goal are, given its role, very important considerations. In addition to programming experts it is…

  8. A usability evaluation of medical software at an expert conference setting.

    PubMed

    Bond, Raymond Robert; Finlay, Dewar D; Nugent, Chris D; Moore, George; Guldenring, Daniel

    2014-01-01

    A usability test was employed to evaluate two medical software applications at an expert conference setting. One application is a medical diagnostic tool (an electrocardiogram [ECG] viewer) and the other is a medical research tool (an electrode misplacement simulator [EMS]). These novel applications have yet to be adopted by the healthcare domain; thus, (1) we wanted to determine the potential user acceptance of these applications and (2) we wanted to determine the feasibility of evaluating medical diagnostic and medical research software at a conference setting as opposed to the conventional laboratory setting. The medical diagnostic tool (ECG viewer) was evaluated using seven delegates and the medical research tool (EMS) was evaluated using 17 delegates, who were recruited at the 2010 International Conference on Computing in Cardiology. Each delegate/participant was required to use the software and undertake a set of predefined tasks during the session breaks at the conference. User interactions with the software were recorded using screen-recording software. The 'think-aloud' protocol was also used to elicit verbal feedback from the participants whilst they attempted the predefined tasks. Before and after each session, participants completed a pre-test and a post-test questionnaire, respectively. The average duration of a usability session at the conference was 34.69 min (SD=10.28). However, taking into account that 10 min were dedicated to the pre-test and post-test questionnaires, the average time dedicated to user interaction with the medical software was 24.69 min (SD=10.28). Given that we have shown usability data can be collected at conferences, this paper details the advantages of conference-based usability studies over the laboratory-based approach. For example, given that delegates gather at one geographical location, a conference-based usability evaluation facilitates recruitment of a convenient sample of international subject experts, which would otherwise be very expensive to arrange. A conference-based approach also allows data to be collected over a few days as opposed to months, by avoiding the administrative duties normally involved in a laboratory-based approach, e.g. mailing invitation letters as part of a recruitment campaign. Following analysis of the user video recordings, 41 (previously unknown) use errors were identified in the advanced ECG viewer and 29 were identified in the EMS application. All use errors were given a consensus severity rating by two independent usability experts. On a rating scale of 1 to 4 (where 1=cosmetic and 4=critical), the average severity rating for the ECG viewer was 2.24 (SD=1.09) and the average severity rating for the EMS application was 2.34 (SD=0.97). We were also able to extract task completion rates and times from the video recordings to determine the effectiveness of the software applications. For example, six out of seven tasks were completed by all participants when using both applications; this statistic alone suggests both applications already have a high degree of usability. As well as extracting data from the video recordings, we were able to extract data from the questionnaires. Using a semantic differential scale (where 1=poor and 5=excellent), delegates highly rated the 'responsiveness', 'usefulness', 'learnability' and the 'look and feel' of both applications. This study has shown the potential user acceptance and user-friendliness of the novel EMS and ECG viewer applications within the healthcare domain.
It has also shown that both medical diagnostic software and medical research software can be evaluated for their usability at an expert conference setting. The primary advantage of a conference-based usability evaluation over a laboratory-based evaluation is the high concentration of experts at one location, which is convenient, less time consuming and less expensive. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  9. The CUORE slow monitoring systems

    NASA Astrophysics Data System (ADS)

    Gladstone, L.; Biare, D.; Cappelli, L.; Cushman, J. S.; Del Corso, F.; Fujikawa, B. K.; Hickerson, K. P.; Moggi, N.; Pagliarone, C. E.; Schmidt, B.; Wagaarachchi, S. L.; Welliver, B.; Winslow, L. A.

    2017-09-01

    CUORE is a cryogenic experiment searching primarily for neutrinoless double beta decay in 130Te. It will begin data-taking operations in 2016. To monitor the cryostat and detector during commissioning and data taking, we have designed and developed Slow Monitoring systems. In addition to real-time systems using LabVIEW, we have an alarm, analysis, and archiving website that uses MongoDB, AngularJS, and Bootstrap software. These modern, state-of-the-art software packages make the monitoring system transparent, easily maintainable, and accessible on many platforms, including mobile devices.

  10. The Use of Video Technology for the Fast-Prototyping of Artificially Intelligent Software.

    ERIC Educational Resources Information Center

    Klein, Gary L.

    This paper describes the use of video to provide a screenplay depiction of a proposed artificial intelligence software system. Advantages of such use are identified: (1) the video can be used to provide a clear conceptualization of the proposed system; (2) it can illustrate abstract technical concepts; (3) it can simulate the functions of the…

  11. A Dozen Years after Open Source's 1998 Birth, It's Time for "OpenTechComm"

    ERIC Educational Resources Information Center

    Still, Brian

    2010-01-01

    2008 marked the 10-year Anniversary of the Open Source movement, which has had a substantial impact on not only software production and adoption, but also on the sharing and distribution of information. Technical communication as a discipline has taken some advantage of the movement or its derivative software, but this article argues not as much…

  12. The Osborne 1.

    ERIC Educational Resources Information Center

    McWilliams, Peter

    1982-01-01

    Describes the unique features, available software, performance capabilities, system options, costs, advantages, disadvantages, and eccentricities of the Osborne 1 microcomputer. A table summarizes specifications, features, and costs. (JL)

  13. [Comparison of Different Methods of Area Measurement in Irregular Scars].

    PubMed

    Ran, D; Li, W J; Sun, Q G; Li, J Q; Xia, Q

    2016-10-01

    To determine a measurement standard for irregular scar area by comparing the advantages and disadvantages of different methods of measuring the same irregular scar area. Irregular scar area was scanned by digital scanning and measured by the coordinate reading method, the AutoCAD pixel method, the Photoshop lasso pixel method, the Photoshop magic wand filled-pixel method, and Foxit PDF reading software; aspects of these methods such as measurement time, repeatability, whether they could be recorded, and whether they could be traced were compared and analyzed. There was no significant difference in the scar areas given by the measurement methods above. However, there were statistical differences in measurement time and in repeatability between one performer and multiple performers, and only the Foxit PDF reading software could be traced back. The methods above can all be used for measuring scar area, but each has its advantages and disadvantages. It is necessary to develop new measurement software for forensic identification. Copyright© by the Editorial Department of Journal of Forensic Medicine
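
    All of the pixel-based methods compared above reduce to counting mask pixels and scaling by the physical pixel size. A hedged sketch of that idea follows; the mask and scan resolution are synthetic assumptions, not data from the study.

```python
import numpy as np

# Synthetic binary mask of a scanned scar: 1 = scar pixel, 0 = background.
# In practice the mask would come from a lasso or magic-wand selection.
yy, xx = np.ogrid[:200, :200]
mask = ((yy - 100) ** 2 / 80**2 + (xx - 100) ** 2 / 40**2 <= 1.0).astype(np.uint8)

dpi = 300                    # assumed scan resolution
pixel_side_mm = 25.4 / dpi   # physical side length of one pixel
area_mm2 = mask.sum() * pixel_side_mm ** 2
print(f"{mask.sum()} pixels = {area_mm2:.1f} mm^2")
```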

  14. Evaluation of the efficiency and reliability of software generated by code generators

    NASA Technical Reports Server (NTRS)

    Schreur, Barbara

    1994-01-01

    There are numerous studies which show that CASE Tools greatly facilitate software development. As a result of these advantages, an increasing amount of software development is done with CASE Tools. As more software engineers become proficient with these tools, their experience and feedback lead to further development with the tools themselves. What has not been widely studied, however, is the reliability and efficiency of the actual code produced by the CASE Tools. This investigation considered these matters. Three segments of code generated by MATRIXx, one of many commercially available CASE Tools, were chosen for analysis: ETOFLIGHT, a portion of the Earth to Orbit Flight software, and ECLSS and PFMC, modules for Environmental Control and Life Support System and Pump Fan Motor Control, respectively.

  15. Graphical Technique to Support the Teaching/Learning Process of Software Process Reference Models

    NASA Astrophysics Data System (ADS)

    Espinosa-Curiel, Ismael Edrein; Rodríguez-Jacobo, Josefina; Fernández-Zepeda, José Alberto

    In this paper, we propose a set of diagrams to visualize software process reference models (PRMs). The diagrams, called dimods, are a combination of visual and process modeling techniques such as rich pictures, mind maps, IDEF and RAD diagrams. We show the use of this technique by designing a set of dimods for the Mexican Software Industry Process Model (MoProSoft). Additionally, we perform an evaluation of the usefulness of dimods. The result of the evaluation shows that dimods may be a support tool that facilitates the understanding, memorization, and learning of software PRMs in both software development organizations and universities. The results also show that dimods may have advantages over the traditional description methods for these types of models.

  16. Computer-assisted qualitative data analysis software.

    PubMed

    Cope, Diane G

    2014-05-01

    Advances in technology have provided new approaches for data collection methods and analysis for researchers. Data collection is no longer limited to paper-and-pencil format, and numerous methods are now available through Internet and electronic resources. With these techniques, researchers are not burdened with entering data manually and data analysis is facilitated by software programs. Quantitative research is supported by the use of computer software and provides ease in the management of large data sets and rapid analysis of numeric statistical methods. New technologies are emerging to support qualitative research with the availability of computer-assisted qualitative data analysis software (CAQDAS). CAQDAS will be presented with a discussion of advantages, limitations, controversial issues, and recommendations for this type of software use.

  17. Getting their appetite back. In need of capital, not-for-profit hospitals take advantage of dropping interest rates.

    PubMed

    Evans, Melanie

    2011-10-31

    After steering clear of the municipal bond market this year, not-for-profit hospitals are being lured back by dropping interest rates. "We're taking advantage of the current market," says Jim Budzinski, executive vice president and chief financial officer of WellStar Health System. The Georgia provider's recent bond deal helped erase $4.2 million in interest costs.

  18. Effects of Vaporized Decontamination Systems on Selected Building Interior Materials: Chlorine Dioxide

    DTIC Science & Technology

    2009-02-01

    Chemical Biological Center (ECBC) to take advantage of ECBC's extensive expertise and specialized research facilities for the decontamination of surfaces...Agency (EPA) established an Interagency Agreement with the U.S. Army Edgewood Chemical Biological Center (ECBC) to take advantage of ECBC's extensive...Detector (Waters Corporation, Milford, MA). Conductivity suppression was carried out using an ERIS 1000HP Autosuppressor (Alltech Corporation, Deerfield

  19. Applying formal methods and object-oriented analysis to existing flight software

    NASA Technical Reports Server (NTRS)

    Cheng, Betty H. C.; Auernheimer, Brent

    1993-01-01

    Correctness is paramount for safety-critical software control systems. Critical software failures in medical radiation treatment, communications, and defense are familiar to the public. The significant quantity of software malfunctions regularly reported to the software engineering community, the laws concerning liability, and a recent NRC Aeronautics and Space Engineering Board report additionally motivate the use of error-reducing and defect-detection software development techniques. The benefits of formal methods in requirements-driven software development ('forward engineering') are well documented. One advantage of rigorously engineering software is that formal notations are precise, verifiable, and facilitate automated processing. This paper describes the application of formal methods to reverse engineering, where formal specifications are developed for a portion of the shuttle on-orbit digital autopilot (DAP). Three objectives of the project were to: demonstrate the use of formal methods on a shuttle application, facilitate the incorporation and validation of new requirements for the system, and verify the safety-critical properties to be exhibited by the software.

  20. Geospatial Authentication

    NASA Technical Reports Server (NTRS)

    Lyle, Stacey D.

    2009-01-01

    A software package has been developed that performs authentication by determining whether a rover is within a set of boundaries or a specific area before it may access critical geospatial information, using GPS signal structures as a means to authenticate mobile devices into a network wirelessly and in real time. The advantage lies in that the system admits only devices within the designated geospatial boundaries or areas into the server. The Geospatial Authentication software has two parts: Server and Client. The server software is a virtual private network (VPN) developed on the Linux operating system using the Perl programming language. The server can be a stand-alone VPN server or can be combined with other applications and services. The client software is a Windows CE GUI application, or Mobile Graphical Software, that allows users to authenticate into a network. The purpose of the client software is to pass the needed satellite information to the server for authentication.
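
    The boundary test at the heart of such a system can be sketched with a standard ray-casting point-in-polygon check. The coordinates below are hypothetical, and the real package additionally authenticates the GPS signal structure itself.

```python
def point_in_polygon(lon, lat, polygon):
    """Ray casting: count boundary crossings to the left of the point."""
    inside = False
    j = len(polygon) - 1
    for i in range(len(polygon)):
        xi, yi = polygon[i]
        xj, yj = polygon[j]
        if (yi > lat) != (yj > lat):
            x_cross = (xj - xi) * (lat - yi) / (yj - yi) + xi
            if lon < x_cross:
                inside = not inside
        j = i
    return inside

# Hypothetical authorized operating area and two GPS fixes from a rover.
boundary = [(-97.40, 27.70), (-97.30, 27.70), (-97.30, 27.80), (-97.40, 27.80)]
print(point_in_polygon(-97.35, 27.75, boundary))  # True: inside, grant access
print(point_in_polygon(-97.20, 27.75, boundary))  # False: outside, deny
```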

  1. [Research progress of probe design software of oligonucleotide microarrays].

    PubMed

    Chen, Xi; Wu, Zaoquan; Liu, Zhengchun

    2014-02-01

    DNA microarray has become an essential medical genetic diagnostic tool for its high throughput, miniaturization, and automation. The design and selection of oligonucleotide probes are critical for preparing gene chips of high quality. Several probe design software packages have been developed and are now available to perform this work. Each package targets different sequences and shows different advantages and limitations. In this article, the research and development of these packages are reviewed in line with three main criteria: specificity, sensitivity, and melting temperature (Tm). In addition, based on experimental results from the literature, these packages are classified according to their applications. This review will be helpful for users choosing appropriate probe design software. It will also reduce the costs of microarrays, improve the application efficiency of microarrays, and promote both the research and development (R&D) and commercialization of high-performance probe design software.
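
    Of the review's three criteria, melting temperature is the easiest to illustrate. The sketch below uses two widely known textbook approximations (the Wallace rule for short probes and a simple GC-content formula for longer ones); real probe design software applies more sophisticated nearest-neighbor thermodynamic models.

```python
def melting_temperature(probe: str) -> float:
    """Rough Tm estimate (deg C) for an oligonucleotide probe."""
    seq = probe.upper()
    at = seq.count("A") + seq.count("T")
    gc = seq.count("G") + seq.count("C")
    if len(seq) < 14:
        return 2.0 * at + 4.0 * gc               # Wallace rule: 2(A+T) + 4(G+C)
    return 64.9 + 41.0 * (gc - 16.4) / len(seq)  # simple GC-content formula

print(melting_temperature("ACGTACGTACGT"))          # short probe, Wallace rule
print(melting_temperature("ACGTACGTACGTACGTACGT"))  # longer probe, GC formula
```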

  2. R as a Lingua Franca: Advantages of Using R for Quantitative Research in Applied Linguistics

    ERIC Educational Resources Information Center

    Mizumoto, Atsushi; Plonsky, Luke

    2016-01-01

    In this article, we suggest that using R, a statistical software environment, is advantageous for quantitative researchers in applied linguistics. We first provide a brief overview of the reasons why R is popular among researchers in other fields and why we recommend its use for analyses in applied linguistics. In order to illustrate these…

  3. 35 Ways to Take a "Byte" out of Software Costs. Fund Raising Ideas from COMPress Customers.

    ERIC Educational Resources Information Center

    COMPress, Wentworth, NH.

    Based on a survey sponsored by COMPress Quarterly of various schools to determine the extent of the problem of lack of funds for purchasing computer software and how schools have coped with the problem, this booklet describes numerous ways to raise funds for software purchases. Nearly 1,000 questionnaires were returned and this booklet was…

  4. Applying Evolutionary Prototyping In Developing LMIS: A Spatial Web-Based System For Land Management

    NASA Astrophysics Data System (ADS)

    Agustiono, W.

    2018-01-01

    Software development projects are difficult tasks. Especially for software designed to comply with regulations that are constantly being introduced or changed, it is almost impossible to get by with a single round of changes during the development process; the developers may face a bulk of rework to fix the design to meet the specified needs. This iterative work also takes additional time and potentially leads to missing the original schedule and budget. Under such inevitable change, it is essential for developers to carefully consider and use an appropriate method to help them carry out software project development. This research examines the implementation of a software development method called evolutionary prototyping for developing regulatory-compliance software. It investigates the development of the Land Management Information System (pseudonym), initiated by the Australian government for use by farmers to meet the regulatory demands of the Soil and Land Conservation Act. By doing so, it seeks to provide an understanding of the efficacy of evolutionary prototyping in helping developers address frequently changing requirements and iterative work while staying on schedule. The findings also offer useful practical insights for other developers who seek to build similar regulatory-compliance software.

  5. 12 CFR 563.201 - Corporate opportunity.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... duty to a savings association, you must not take advantage of corporate opportunities belonging to the...; and (2) The opportunity is of present or potential practical advantage to the savings association, either directly or through its subsidiary. (c) OTS will not deem you to have taken advantage of a...

  6. EMMA: a new paradigm in configurable software

    DOE PAGES

    Nogiec, J. M.; Trombly-Freytag, K.

    2017-11-23

    EMMA is a framework designed to create a family of configurable software systems, with emphasis on extensibility and flexibility. It is based on a loosely coupled, event driven architecture. The EMMA framework has been built upon the premise of composing software systems from independent components. It opens up opportunities for reuse of components and their functionality and composing them together in many different ways. As a result, it provides the developer of test and measurement applications with a lightweight alternative to microservices, while sharing their various advantages, including composability, loose coupling, encapsulation, and reuse.
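
    A minimal publish/subscribe sketch conveys the flavor of composing independent components through events. This is not EMMA's actual API, only an illustrative Python analogue of a loosely coupled, event-driven composition.

```python
from collections import defaultdict

class EventBus:
    """Components never call each other; they interact only via topics."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, payload):
        for handler in self._subscribers[topic]:
            handler(payload)

bus = EventBus()

# Two independent, reusable components composed only through the bus.
def logger(reading):
    print("log:", reading)

def threshold_alarm(reading):
    if reading["value"] > 4.0:
        print("alarm: limit exceeded on", reading["channel"])

bus.subscribe("measurement", logger)
bus.subscribe("measurement", threshold_alarm)
bus.publish("measurement", {"channel": "B1", "value": 4.2})
```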

  7. Information systems analysis approach in hospitals: a national survey.

    PubMed

    Wong, B K; Sellaro, C L; Monaco, J A

    1995-03-01

    A survey of 216 hospitals reveals that some hospitals do not conduct cost-benefit analyses or analyze possible adverse effects in feasibility studies. In determining and analyzing system requirements, external factors that initiate the transaction are not examined, and computer-aided software engineering (CASE) tools are seldom used. Some hospitals do not investigate the advantages and disadvantages of using in-house-developed software versus purchased software packages in the evaluation of alternatives. The survey finds that, overall, most hospitals follow the traditional systems development life cycle (SDLC) approach in analyzing information systems.

  8. EMMA: A New Paradigm in Configurable Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nogiec, J. M.; Trombly-Freytag, K.

    EMMA is a framework designed to create a family of configurable software systems, with emphasis on extensibility and flexibility. It is based on a loosely coupled, event driven architecture. The EMMA framework has been built upon the premise of composing software systems from independent components. It opens up opportunities for reuse of components and their functionality and composing them together in many different ways. It provides the developer of test and measurement applications with a lightweight alternative to microservices, while sharing their various advantages, including composability, loose coupling, encapsulation, and reuse.

  9. EMMA: a new paradigm in configurable software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nogiec, J. M.; Trombly-Freytag, K.

    EMMA is a framework designed to create a family of configurable software systems, with emphasis on extensibility and flexibility. It is based on a loosely coupled, event driven architecture. The EMMA framework has been built upon the premise of composing software systems from independent components. It opens up opportunities for reuse of components and their functionality and composing them together in many different ways. As a result, it provides the developer of test and measurement applications with a lightweight alternative to microservices, while sharing their various advantages, including composability, loose coupling, encapsulation, and reuse.

  10. EMMA: a new paradigm in configurable software

    NASA Astrophysics Data System (ADS)

    Nogiec, J. M.; Trombly-Freytag, K.

    2017-10-01

    EMMA is a framework designed to create a family of configurable software systems, with emphasis on extensibility and flexibility. It is based on a loosely coupled, event driven architecture. The EMMA framework has been built upon the premise of composing software systems from independent components. It opens up opportunities for reuse of components and their functionality and composing them together in many different ways. It provides the developer of test and measurement applications with a lightweight alternative to microservices, while sharing their various advantages, including composability, loose coupling, encapsulation, and reuse.

  11. Implementation and Testing of VLBI Software Correlation at the USNO

    NASA Technical Reports Server (NTRS)

    Fey, Alan; Ojha, Roopesh; Boboltz, Dave; Geiger, Nicole; Kingham, Kerry; Hall, David; Gaume, Ralph; Johnston, Ken

    2010-01-01

    The Washington Correlator (WACO) at the U.S. Naval Observatory (USNO) is a dedicated VLBI processor built on special-purpose ASIC hardware. The WACO is currently over 10 years old and is nearing the end of its expected lifetime. Plans for implementation and testing of software correlation at the USNO are currently being considered. The VLBI correlation process is, by its very nature, well suited to a parallelized computing environment. Commercial off-the-shelf computer hardware has advanced in processing power to the point where software correlation is now both economically and technologically feasible. The advantages of software correlation are manifold and include flexibility, scalability, and easy adaptability to changing environments and requirements. We discuss our experience with and plans for use of software correlation at USNO with emphasis on the use of the DiFX software correlator.
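
    The core of software correlation is ordinary, highly parallel signal processing, which is why commodity hardware suits it so well. The FX-style sketch below uses synthetic data and is only a toy; a production correlator such as DiFX adds fringe rotation, channelization, and integration, among much else.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two stations see the same sky signal; station 2's copy is delayed by
# 5 samples, and each stream carries independent receiver noise.
signal = rng.normal(size=4096)
s1 = signal + 0.5 * rng.normal(size=4096)
s2 = np.roll(signal, 5) + 0.5 * rng.normal(size=4096)

# FX correlation: Fourier transform each stream, form the cross-spectrum,
# and inverse-transform; the peak sits at the relative delay.
X1, X2 = np.fft.fft(s1), np.fft.fft(s2)
xcorr = np.fft.ifft(X2 * np.conj(X1)).real

lag = int(np.argmax(xcorr))
lag = lag if lag <= len(xcorr) // 2 else lag - len(xcorr)
print("recovered delay (samples):", lag)  # expect 5
```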

  12. Multi-Level Bitmap Indexes for Flash Memory Storage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Kesheng; Madduri, Kamesh; Canon, Shane

    2010-07-23

    Due to their low access latency, high read speed, and power-efficient operation, flash memory storage devices are rapidly emerging as an attractive alternative to traditional magnetic storage devices. However, tests show that the most efficient indexing methods are not able to take advantage of flash memory storage devices. In this paper, we present a set of multi-level bitmap indexes that can effectively take advantage of flash storage devices. These indexing methods use coarsely binned indexes to answer queries approximately, and then use finely binned indexes to refine the answers. Our new methods read significantly lower volumes of data at the expense of an increased disk access count, thus taking full advantage of the improved read speed and low access latency of flash devices. To demonstrate the advantage of these new indexes, we measure their performance on a number of storage systems using a standard data warehousing benchmark called the Set Query Benchmark. We observe that multi-level strategies on flash drives are up to 3 times faster than traditional indexing strategies on magnetic disk drives.
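
    The coarse-then-refine strategy is easy to sketch. In the toy below, plain boolean arrays stand in for compressed bitmaps, and the bin widths and data are illustrative assumptions; only the partially covered edge bins of a range query need refinement with the fine index.

```python
import numpy as np

rng = np.random.default_rng(1)
values = rng.uniform(0.0, 100.0, size=100_000)

# One membership vector per bin, standing in for a compressed bitmap.
coarse = {b: (values >= 10 * b) & (values < 10 * (b + 1)) for b in range(10)}
fine = {b: (values >= b) & (values < b + 1) for b in range(100)}

def query(lo, hi):
    """Answer lo <= v < hi (integer bounds) with coarse bins, refining
    only the edge bins with the finely binned index."""
    hit = np.zeros(values.size, dtype=bool)
    for b in range(lo // 10, (hi + 9) // 10):
        if 10 * b >= lo and 10 * (b + 1) <= hi:
            hit |= coarse[b]          # fully covered coarse bin: no refinement
        else:
            for f in range(max(10 * b, lo), min(10 * (b + 1), hi)):
                hit |= fine[f]        # edge bin: refine with fine bins
    return hit

result = query(37, 62)
print(result.sum(), "rows match; expected about", int(values.size * 0.25))
```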

  13. High-Fidelity Aerodynamic Analysis and Design of Engine Integration on a BWB Aircraft

    NASA Astrophysics Data System (ADS)

    Mirzaei Amirabad, Mojtaba

    BWB (Blended Wing Body) is an innovative type of aircraft based on the flying wing concept. In this configuration, the wing and the fuselage are blended together smoothly. The BWB offers economic and environmental advantages by reducing fuel consumption through improved aerodynamic performance. In this project, the goal is to improve aerodynamic performance by optimizing the main body of a BWB that comes from conceptual design. The high-fidelity methods applied in this project have been less frequently addressed in the literature. This research develops an automatic optimization procedure to reduce the drag force on the main body. The optimization is carried out in two main stages: before and after engine installation. Our objective is to minimize drag while taking several constraints into account in a high-fidelity optimization. The commercial software Isight is chosen as the optimizer, from which MATLAB is called to start the optimization process. Geometry is generated using ANSYS-DesignModeler, the unstructured mesh is created by ANSYS-Mesh, and CFD calculations are done with ANSYS-Fluent. All of these tools are coupled together in the ANSYS-Workbench environment, which is called by MATLAB. The high-fidelity methods solve the Navier-Stokes equations during optimization. To verify the results, a finer structured mesh is created with the ICEM software for use in each stage of the optimization. The first stage comprises a 3D optimization of the surface of the main body, before adding the engine. The optimized case is then used as the input for the second stage, in which the nacelle is added. This study obtains an appreciable reduction in the drag coefficient of the BWB without the nacelle. In the second stage (adding the nacelle), drag minimization is also achieved by performing a local optimization. Furthermore, the flow separation created in the main body-nacelle zone is reduced.

  14. Help: first aid issues.

    PubMed

    Granitoff, N; Whitaker, I Y; Diccini, S; Goncalves, V C; Marin, H F

    1995-01-01

    First aid is the initial and immediate care given to a victim outside the hospital environment, with the purpose of preserving life and preventing conditions from worsening until the victim receives qualified assistance. Providing immediate aid requires calm and, above all, knowledge of what should and should not be done in each situation. Beyond treatment by health professionals, the chances are high that a victim will first receive care from bystanders. In Brazil, access to information, and the possibility of reviewing it whenever necessary, may contribute greatly to the assimilation of this knowledge, in addition to exercises on simulated cases. Informatics has been shown to be an extremely useful tool in the development of educational software, considering its multiplicity of resources: it motivates users through an interactive experience, provides individualized teaching that takes into account each user's own rhythm and desired complexity level, and builds the user's capacity for solving problems through simulated situations. Considering that the number of people prepared to act as first aid helpers in situations of life-threatening accidents or sudden illness is still very small, that the use of the computer as a means of spreading information in schools, enterprises, and even households is ever increasing, and that educational software offers users advantages regarding the storage and retrieval of information when needed, we proposed the creation of an interactive teaching software package. This software is being developed using Storyboard live. The methodology is the following: literature review, selection of images, development of the program, and application tests. The initially selected issues are: assessment of the victim, cardiorespiratory arrest and resuscitation, airway obstruction, wounds, and hemorrhages. After using the program, the user should be able to resolve hypothetical situations, providing minimal initial care and maximal physical comfort for a victim of an accident or sudden illness.

  15. BioPortal: An Open-Source Community-Based Ontology Repository

    NASA Astrophysics Data System (ADS)

    Noy, N.; NCBO Team

    2011-12-01

    Advances in computing power and new computational techniques have changed the way researchers approach science. In many fields, one of the most fruitful approaches has been to use semantically aware software to break down the barriers among disparate domains, systems, data sources, and technologies. Such software facilitates data aggregation, improves search, and ultimately allows the detection of new associations that were previously not detectable. Achieving these analyses requires software systems that take advantage of the semantics and that can intelligently negotiate domains and knowledge sources, identifying commonality across systems that use different and conflicting vocabularies, while understanding apparent differences that may be concealed by the use of superficially similar terms. An ontology, a semantically rich vocabulary for a domain of interest, is the cornerstone of software for bridging systems, domains, and resources. However, as ontologies become the foundation of all semantic technologies in e-science, we must develop an infrastructure for sharing ontologies, finding and evaluating them, integrating and mapping among them, and using ontologies in applications that help scientists process their data. BioPortal [1] is an open-source on-line community-based ontology repository that has been used as a critical component of semantic infrastructure in several domains, including biomedicine and bio-geochemical data. BioPortal uses social approaches in the Web 2.0 style to bring structure and order to the collection of biomedical ontologies. It enables users to provide and discuss a wide array of knowledge components, from submitting the ontologies themselves, to commenting on and discussing classes in the ontologies, to reviewing ontologies in the context of their own ontology-based projects, to creating mappings between overlapping ontologies and discussing and critiquing the mappings. Critically, it provides web-service access to all its content, enabling its integration in semantically enriched applications. [1] Noy, N.F., Shah, N.H., et al., BioPortal: ontologies and integrated data resources at the click of a mouse. Nucleic Acids Res, 2009. 37(Web Server issue): p. W170-3.

  16. SCOS 2: An object oriented software development approach

    NASA Technical Reports Server (NTRS)

    Symonds, Martin; Lynenskjold, Steen; Mueller, Christian

    1994-01-01

    The Spacecraft Control and Operations System 2 (SCOS 2) is intended to provide the generic mission control system infrastructure for future ESA missions. It represents a bold step forward in order to take advantage of state-of-the-art technology and current practices in the area of software engineering. Key features include: (1) use of object oriented analysis and design techniques; (2) use of UNIX, C++ and a distributed architecture as the enabling implementation technology; (3) a goal of re-use for development, maintenance and mission specific software implementation; and (4) introduction of the concept of a spacecraft control model. This paper touches upon some of the traditional beliefs surrounding Object Oriented development and describes their relevance to SCOS 2. It gives the rationale for why particular approaches were adopted and others not, and describes the impact of these decisions. The development approach followed is discussed, highlighting the evolutionary nature of the overall process and the iterative nature of the various tasks carried out. The emphasis of this paper is on the process of the development, with the following being covered: (1) the three phases of the SCOS 2 project - prototyping & analysis, design & implementation, and configuration/delivery of mission specific systems; (2) the close cooperation and continual interaction with the users during the development; (3) the management approach - the split between client staff, industry, and some of the required project management activities; (4) the lifecycle adopted, an enhancement of the ESA PSS-05 standard with SCOS 2 specific activities and approaches; and (5) an examination of some of the difficulties encountered and the solutions adopted. Finally, the lessons learned from the SCOS 2 experience are highlighted, identifying those issues to be used as feedback into future developments of this nature. This paper does not intend to describe the finished product and its operation; it focuses on the journey to arrive there, concentrating on the process rather than the products of the SCOS 2 software development.

  17. Software agents for the dissemination of remote terrestrial sensing data

    NASA Technical Reports Server (NTRS)

    Toomey, Christopher N.; Simoudis, Evangelos; Johnson, Raymond W.; Mark, William S.

    1994-01-01

    Remote terrestrial sensing (RTS) data is constantly being collected from a variety of space-based and earth-based sensors. The collected data, and especially 'value-added' analyses of the data, are finding growing application for commercial, government, and scientific purposes. The scale of this data collection and analysis is truly enormous; e.g., by 1995, the amount of data available in just one sector, NASA space science, will reach 5 petabytes. Moreover, the amount of data, and the value of analyzing the data, are expected to increase dramatically as new satellites and sensors become available (e.g., NASA's Earth Observing System satellites). Lockheed and other companies are beginning to provide data and analysis commercially. A critical issue for the exploitation of collected data is the dissemination of data and value-added analyses to a diverse and widely distributed customer base. Customers must be able to use their computational environment (eventually the National Information Infrastructure) to obtain timely and complete information, without having to know the details of where the relevant data resides and how it is accessed. Customers must be able to routinely use standard, widely available (and, therefore, low cost) analyses, while also being able to readily create on demand highly customized analyses to make crucial decisions. The diversity of user needs creates a difficult software problem: how can users easily state their needs, while the computational environment assumes the responsibility of finding (or creating) relevant information, and then delivering the results in a form that users understand? A software agent is a self-contained, active software module that contains an explicit representation of its operational knowledge. This explicit representation allows agents to examine their own capabilities in order to modify their goals to meet changing needs and to take advantage of dynamic opportunities. In addition, the explicit representation allows agents to advertise their capabilities and results to other agents, thereby allowing the collection of agents to reuse each other's work.

  18. Wavelet Compression of Satellite-Transmitted Digital Mammograms

    NASA Technical Reports Server (NTRS)

    Zheng, Yuan F.

    2001-01-01

    Breast cancer is one of the major causes of cancer death in women in the United States. The most effective way to treat breast cancer is to detect it at an early stage by screening patients periodically. Conventional film-screen mammography uses X-ray films which are effective in detecting early abnormalities of the breast. Direct digital mammography has the potential to improve image quality and to take advantage of convenient storage, efficient transmission, powerful computer-aided diagnosis, etc. One effective alternative to direct digital imaging is secondary digitization of X-ray films. This technique may not provide as high an image quality as the direct digital approach, but it definitely has the other advantages inherent to digital images. One of them is the use of satellite transmission for transferring digital mammograms between a remote image-acquisition site and a central image-reading site. This technique can benefit a large population of women who reside in remote areas where major screening and diagnosing facilities are not available. The NASA Lewis Research Center (LeRC), in collaboration with the Cleveland Clinic Foundation (CCF), has begun a pilot study to investigate the application of the Advanced Communications Technology Satellite (ACTS) network to telemammography. The bandwidth of a T1 transmission is limited (1.544 Mbps) while the size of a mammographic image is huge, so it takes a long time to transmit a single mammogram. For example, a mammogram of 4k by 4k pixels with 16 bits per pixel needs more than 4 minutes to transmit; the four images of a typical screening exam would take more than 16 minutes. This is too long for convenient screening. Consequently, compression is necessary to make satellite transmission of mammographic images practically possible. The Wavelet Research Group of the Department of Electrical Engineering at The Ohio State University (OSU) participated in the LeRC-CCF collaboration by providing advanced compression technology using the wavelet transform. OSU developed a time-efficient software package with various wavelets to compress a series of mammographic images. This document reports the results of the compression activities.
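
    The transmission arithmetic is easy to reproduce. In the sketch below, the image geometry and T1 rate come from the abstract; the compression ratios at the end are illustrative assumptions about what a wavelet coder might achieve.

```python
# One 4k x 4k, 16-bit mammogram over a 1.544 Mbps T1 line.
bits = 4096 * 4096 * 16        # about 268 million bits
t1_rate = 1.544e6              # bits per second

raw_seconds = bits / t1_rate
print(f"raw payload: {raw_seconds / 60:.1f} min per image")  # about 2.9 min

# Protocol overhead and acknowledgements push observed times higher,
# consistent with the "more than 4 minutes" quoted above.
for ratio in (10, 20, 40):     # assumed wavelet compression ratios
    print(f"{ratio}:1 compression: {raw_seconds / ratio:.0f} s per image")
```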

  19. Intelligent systems technology infrastructure for integrated systems

    NASA Technical Reports Server (NTRS)

    Lum, Henry

    1991-01-01

    A system infrastructure must be properly designed and integrated from the conceptual development phase to accommodate evolutionary intelligent technologies. Several technology development activities were identified that may have application to rendezvous and capture systems. Optical correlators in conjunction with fuzzy logic control might be used for the identification, tracking, and capture of either cooperative or non-cooperative targets without the intensive computational requirements associated with vision processing. A hybrid digital/analog system was developed and tested with a robotic arm. An aircraft refueling application demonstration is planned within two years. Initially this demonstration will be ground based with a follow-on air based demonstration. System dependability measurement and modeling techniques are being developed for fault management applications. This involves usage of incremental solution/evaluation techniques and modularized systems to facilitate reuse and to take advantage of natural partitions in system models. Though not yet commercially available and currently subject to accuracy limitations, technology is being developed to perform optical matrix operations to enhance computational speed. Optical terrain recognition using camera image sequencing processed with optical correlators is being developed to determine position and velocity in support of lander guidance. The system is planned for testing in conjunction with Dryden Flight Research Facility. Advanced architecture technology is defining open architecture design constraints, test bed concepts (processors, multiple hardware/software and multi-dimensional user support, knowledge/tool sharing infrastructure), and software engineering interface issues.

  20. Real time 3D structural and Doppler OCT imaging on graphics processing units

    NASA Astrophysics Data System (ADS)

    Sylwestrzak, Marcin; Szlag, Daniel; Szkulmowski, Maciej; Gorczyńska, Iwona; Bukowska, Danuta; Wojtkowski, Maciej; Targowski, Piotr

    2013-03-01

    In this report, the application of graphics processing unit (GPU) programming for real-time 3D Fourier-domain Optical Coherence Tomography (FdOCT) imaging, with implementation of Doppler algorithms for visualization of flows in capillary vessels, is presented. Generally, the time needed to process FdOCT data on the computer's main processor (CPU) constitutes the main limitation for real-time imaging. Employing additional algorithms, such as Doppler OCT analysis, makes this processing even more time consuming. Recently developed GPUs, which offer very high computational power, provide a solution to this problem: taking advantage of them for massively parallel data processing allows real-time imaging in FdOCT. The presented software for structural and Doppler OCT performs the whole processing and visualization of 2D data consisting of 2000 A-scans generated from 2048-pixel spectra at a frame rate of about 120 fps. The 3D imaging in the same mode, on volume data built of 220 × 100 A-scans, is performed at a rate of about 8 frames per second. In this paper, the software architecture, the organization of threads, and the optimizations applied are described. For illustration, screen shots recorded during real-time imaging of a phantom (homogeneous water solution of Intralipid in a glass capillary) and the human eye in vivo are presented.
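
    At its core, the per-frame processing described here is a windowed inverse FFT along the spectral axis, applied independently to every A-scan, which is what makes it so amenable to GPUs. The sketch below uses synthetic data with the frame geometry quoted in the abstract; NumPy on the CPU stands in for the GPU kernels.

```python
import numpy as np

rng = np.random.default_rng(2)

# One frame: 2000 A-scans, each a 2048-pixel interference spectrum.
spectra = rng.normal(size=(2000, 2048))

# Window each spectrum, inverse-transform along the spectral axis, and
# keep the magnitude of the positive-depth half as the depth profile.
window = np.hanning(2048)
ascans = np.abs(np.fft.ifft(spectra * window, axis=1))[:, :1024]

print("structural frame:", ascans.shape)  # 2000 profiles, 1024 pixels deep
```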
