Sample records for open application programming

  1. Open Admissions: A Bibliography for Research and Application.

    ERIC Educational Resources Information Center

    Shrier, Irene; Lavin, David E.

    This bibliography presents materials for research and application of open admissions policies in higher education. Sections cover: open admissions; factors influencing high school graduates to attend college; disadvantaged and minority students; precollege and special programs; English and reading skills; general compensatory programs; dropouts;…

  2. 75 FR 50986 - Notice of Contract Proposal (NOCP) for Payments to Eligible Advanced Biofuel Producers

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-18

    ... remaining available Fiscal Year 2009 program funds. This Notice opens an application window for certain... opening a new application window from August 18, 2010 through September 17, 2010 to accept applications... opening a new application window to accept additional applications for the remaining available Fiscal Year...

  3. Cancer Prevention Fellowship Program Application Period is Open until August 25 | Division of Cancer Prevention

    Cancer.gov

    The application period for the NCI Cancer Prevention Fellowship Program (CPFP) is open. Since 1987, CPFP has provided funding support for post-doctoral Fellows to train the next generation of researchers and leaders in the field. |

  4. SWMM5 Application Programming Interface and PySWMM: A Python Interfacing Wrapper

    EPA Science Inventory

    In support of the OpenWaterAnalytics open source initiative, the PySWMM project encompasses the development of a Python interfacing wrapper to SWMM5 with parallel ongoing development of the USEPA Stormwater Management Model (SWMM5) application programming interface (API). ...

  5. Dynamic mobility applications open source application development portal : Task 3.3 : concept of operations : final report.

    DOT National Transportation Integrated Search

    2016-10-12

    The Dynamic Mobility Applications (DMA) program seeks to promote the highest level of collaboration and preservation of intellectual capital generated from application development and associated research activities funded by the program. The program ...

  6. Controlled Multivariate Evaluation of Open Education: Application of a Critical Model.

    ERIC Educational Resources Information Center

    Sewell, Alan F.; And Others

    This paper continues previous reports of a controlled multivariate evaluation of a junior high school open-education program. A new method of estimating program objectives and implementation is presented, together with the nature and degree of obtained student outcomes. Open-program students were found to approve more highly of their learning…

  7. Swan: A tool for porting CUDA programs to OpenCL

    NASA Astrophysics Data System (ADS)

    Harvey, M. J.; De Fabritiis, G.

    2011-04-01

    The use of modern, high-performance graphical processing units (GPUs) for acceleration of scientific computation has been widely reported. The majority of this work has used the CUDA programming model supported exclusively by GPUs manufactured by NVIDIA. An industry standardisation effort has recently produced the OpenCL specification for GPU programming. This offers the benefits of hardware-independence and reduced dependence on proprietary tool-chains. Here we describe a source-to-source translation tool, "Swan", for facilitating the conversion of an existing CUDA code to use the OpenCL model, as a means to aid programmers experienced with CUDA in evaluating OpenCL and alternative hardware. While the performance of equivalent OpenCL and CUDA code on fixed hardware should be comparable, we find that a real-world CUDA application ported to OpenCL exhibits an overall 50% increase in runtime, a reduction in performance attributable to the immaturity of contemporary compilers. The ported application is shown to have platform independence, running on both NVIDIA and AMD GPUs without modification. We conclude that OpenCL is a viable platform for developing portable GPU applications but that the more mature CUDA tools continue to provide best performance.
    Program summary
    Program title: Swan
    Catalogue identifier: AEIH_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEIH_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GNU Public License version 2
    No. of lines in distributed program, including test data, etc.: 17 736
    No. of bytes in distributed program, including test data, etc.: 131 177
    Distribution format: tar.gz
    Programming language: C
    Computer: PC
    Operating system: Linux
    RAM: 256 Mbytes
    Classification: 6.5
    External routines: NVIDIA CUDA, OpenCL
    Nature of problem: Graphical Processing Units (GPUs) from NVIDIA are preferentially programmed with the proprietary CUDA programming toolkit. An alternative programming model promoted as an industry standard, OpenCL, provides similar capabilities to CUDA and is also supported on non-NVIDIA hardware (including multicore x86 CPUs, AMD GPUs and IBM Cell processors). The adaptation of a program from CUDA to OpenCL is relatively straightforward but laborious. The Swan tool facilitates this conversion.
    Solution method: Swan performs a translation of CUDA kernel source code into an OpenCL equivalent. It also generates the C source code for entry point functions, simplifying kernel invocation from the host program. A concise host-side API abstracts the CUDA and OpenCL APIs. A program adapted to use Swan has no dependency on the CUDA compiler for the host-side program. The converted program may be built for either CUDA or OpenCL, with the selection made at compile time.
    Restrictions: No support for CUDA C++ features
    Running time: Nominal
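    The flavor of Swan's source-to-source translation can be illustrated with a toy keyword mapping (an illustrative sketch only; the actual Swan translation tables and parsing are far more extensive):

```python
import re

# Illustrative subset of CUDA-to-OpenCL source mappings (not Swan's actual tables).
CUDA_TO_OPENCL = {
    r"\b__global__\b": "__kernel",
    r"\b__shared__\b": "__local",
    r"\bthreadIdx\.x\b": "get_local_id(0)",
    r"\bblockIdx\.x\b": "get_group_id(0)",
    r"\bblockDim\.x\b": "get_local_size(0)",
    r"\b__syncthreads\(\)": "barrier(CLK_LOCAL_MEM_FENCE)",
}

def translate_kernel(cuda_src: str) -> str:
    """Apply the keyword substitutions to a CUDA kernel source string."""
    out = cuda_src
    for pattern, repl in CUDA_TO_OPENCL.items():
        out = re.sub(pattern, repl, out)
    return out
```

    A real translator must also rewrite argument qualifiers and launch syntax; this sketch shows only the token-level part of the job.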

  8. Open Source and ROI: Open Source Has Made Significant Leaps in Recent Years. What Does It Have to Offer Education?

    ERIC Educational Resources Information Center

    Guhlin, Miguel

    2007-01-01

    A switch to free open source software can minimize cost and allow funding to be diverted to equipment and other programs. For instance, the OpenOffice suite is an alternative to expensive basic application programs offered by major vendors. Many such programs on the market offer features seldom used in education but for which educators must pay.…

  9. Software for Real-Time Analysis of Subsonic Test Shot Accuracy

    DTIC Science & Technology

    2014-03-01

    used the C++ programming language, the Open Source Computer Vision (OpenCV®) software library, and Microsoft Windows® Application Programming...video for comparison through OpenCV image analysis tools. Based on the comparison, the software then computed the coordinates of each shot relative to...DWB researchers wanted to use the Open Source Computer Vision (OpenCV) software library for capturing and analyzing frames of video. OpenCV contains

  10. Leveraging OpenStudio's Application Programming Interfaces: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Long, N.; Ball, B.; Goldwasser, D.

    2013-11-01

    OpenStudio development efforts have been focused on providing Application Programming Interfaces (APIs) through which users can extend OpenStudio without the need to compile the open source libraries. This paper discusses the basic purposes and functionalities of the core libraries that have been wrapped with APIs, including the Building Model, Results Processing, Advanced Analysis, Uncertainty Quantification, and Data Interoperability through Translators. Several building energy modeling applications have been produced using OpenStudio's API and Software Development Kits (SDK), including the United States Department of Energy's Asset Score Calculator, a mobile-based audit tool, an energy design assistance reporting protocol, and a portfolio-scale incentive optimization analysis methodology. Each of these software applications is discussed briefly, describing how the APIs were leveraged for various uses including high-level modeling, data transformations from detailed building audits, error checking/quality assurance of models, and use of high-performance computing for mass simulations.

  11. The open-source movement: an introduction for forestry professionals

    Treesearch

    Patrick Proctor; Paul C. Van Deusen; Linda S. Heath; Jeffrey H. Gove

    2005-01-01

    In recent years, the open-source movement has yielded a generous and powerful suite of software and utilities that rivals those developed by many commercial software companies. Open-source programs are available for many scientific needs: operating systems, databases, statistical analysis, Geographic Information System applications, and object-oriented programming....

  12. Toward Enhancing OpenMP's Work-Sharing Directives

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chapman, B M; Huang, L; Jin, H

    2006-05-17

    OpenMP provides a portable programming interface for shared memory parallel computers (SMPs). Although this interface has proven successful for small SMPs, it requires greater flexibility in light of the steadily growing size of individual SMPs and the recent advent of multithreaded chips. In this paper, we describe two application development experiences that exposed these expressivity problems in the current OpenMP specification. We then propose mechanisms to overcome these limitations, including thread subteams and thread topologies. Thus, we identify language features that improve OpenMP application performance on emerging and large-scale platforms while preserving ease of programming.
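    The subteam idea can be sketched in Python (a conceptual illustration using threads, not the authors' proposed OpenMP syntax): a flat team of thread ids is partitioned into named subteams that each receive different work:

```python
from concurrent.futures import ThreadPoolExecutor

def subteam_of(tid: int, nthreads: int, split: float = 0.5) -> str:
    """Assign the first `split` fraction of thread ids to 'compute', the rest to 'io'."""
    return "compute" if tid < int(nthreads * split) else "io"

def run_team(nthreads: int = 4):
    """Launch a team of threads and record which subteam each one joined."""
    def work(tid):
        team = subteam_of(tid, nthreads)
        # In a real setting each subteam would execute a different loop/kernel here.
        return tid, team
    with ThreadPoolExecutor(max_workers=nthreads) as pool:
        return dict(pool.map(work, range(nthreads)))
```

    The point of the proposed constructs is to express exactly this kind of partitioning declaratively instead of by hand-coded thread-id arithmetic.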

  13. 44 CFR 80.1 - Purpose and scope.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... date, applicable program regulations and guidance in effect for the funding program (available at http... requirements of the funding grant program and must be read in conjunction with the relevant program regulations... oversight, applies to projects for which the funding program application period opens or for which funding...

  14. 44 CFR 80.1 - Purpose and scope.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... date, applicable program regulations and guidance in effect for the funding program (available at http... requirements of the funding grant program and must be read in conjunction with the relevant program regulations... oversight, applies to projects for which the funding program application period opens or for which funding...

  15. 44 CFR 80.1 - Purpose and scope.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... date, applicable program regulations and guidance in effect for the funding program (available at http... requirements of the funding grant program and must be read in conjunction with the relevant program regulations... oversight, applies to projects for which the funding program application period opens or for which funding...

  16. 44 CFR 80.1 - Purpose and scope.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... date, applicable program regulations and guidance in effect for the funding program (available at http... requirements of the funding grant program and must be read in conjunction with the relevant program regulations... oversight, applies to projects for which the funding program application period opens or for which funding...

  17. Characterizing and Mitigating Work Time Inflation in Task Parallel Programs

    DOE PAGES

    Olivier, Stephen L.; de Supinski, Bronis R.; Schulz, Martin; ...

    2013-01-01

    Task parallelism raises the level of abstraction in shared memory parallel programming to simplify the development of complex applications. However, task parallel applications can exhibit poor performance due to thread idleness, scheduling overheads, and work time inflation – additional time spent by threads in a multithreaded computation beyond the time required to perform the same work in a sequential computation. We identify the contributions of each factor to lost efficiency in various task parallel OpenMP applications and diagnose the causes of work time inflation in those applications. Increased data access latency can cause significant work time inflation in NUMA systems. Our locality framework for task parallel OpenMP programs mitigates this cause of work time inflation. Our extensions to the Qthreads library demonstrate that locality-aware scheduling can improve performance up to 3X compared to the Intel OpenMP task scheduler.
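    The locality-aware placement described above can be caricatured in a few lines of Python (a toy sketch, not the Qthreads implementation): each task is queued on its home NUMA domain and executed there, so work stays local and remote-access inflation is avoided:

```python
from collections import deque

def locality_schedule(tasks, ndomains):
    """Place each (task_id, home_domain) pair on its home domain's queue and
    record where and in what order it runs. Returns task_id -> (domain, step)."""
    queues = [deque() for _ in range(ndomains)]
    for tid, dom in tasks:
        queues[dom % ndomains].append(tid)   # keep the task near its data
    order = {}
    for dom, q in enumerate(queues):
        step = 0
        while q:
            order[q.popleft()] = (dom, step)  # executed on its home domain
            step += 1
    return order
```

    A full scheduler would add work stealing between domains when one queue drains early; the sketch shows only the locality-preserving placement.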

  18. Development of a web application for water resources based on open source software

    NASA Astrophysics Data System (ADS)

    Delipetrev, Blagoj; Jonoski, Andreja; Solomatine, Dimitri P.

    2014-01-01

    This article presents research and development of a prototype web application for water resources using the latest advancements in Information and Communication Technologies (ICT), open source software and web GIS. The web application has three web services for: (1) managing, presenting and storing geospatial data, (2) supporting water resources modeling and (3) water resources optimization. The web application is developed using several programming languages (PHP, Ajax, JavaScript, Java), libraries (OpenLayers, JQuery) and open source software components (GeoServer, PostgreSQL, PostGIS). The presented web application has several main advantages: it is available all the time, it is accessible from everywhere, it creates a real-time multi-user collaboration platform, the programming language code and components are interoperable and designed to work in a distributed computer environment, it is flexible for adding additional components and services, and it is scalable depending on the workload. The application was successfully tested on a case study with concurrent multi-user access.
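    A typical interaction with the geospatial service of such an application can be sketched as building a standard WFS GetFeature request to GeoServer (the layer name below is a placeholder, not from the paper):

```python
from urllib.parse import urlencode

def wfs_getfeature_url(geoserver_base: str, layer: str,
                       out_format: str = "application/json") -> str:
    """Build a WFS 1.0.0 GetFeature request URL, the kind of call a web GIS
    client issues to GeoServer to retrieve a vector layer as GeoJSON."""
    params = {
        "service": "WFS",
        "version": "1.0.0",
        "request": "GetFeature",
        "typeName": layer,
        "outputFormat": out_format,
    }
    return f"{geoserver_base}/wfs?{urlencode(params)}"
```

    The browser-side OpenLayers code would fetch this URL and render the returned features on the map.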

  19. An OpenACC-Based Unified Programming Model for Multi-accelerator Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Jungwon; Lee, Seyong; Vetter, Jeffrey S

    2015-01-01

    This paper proposes a novel SPMD programming model of OpenACC. Our model integrates the different granularities of parallelism from vector-level parallelism to node-level parallelism into a single, unified model based on OpenACC. It allows programmers to write programs for multiple accelerators using a uniform programming model whether they are in shared or distributed memory systems. We implement a prototype of our model and evaluate its performance with a GPU-based supercomputer using three benchmark applications.

  20. Support of Multidimensional Parallelism in the OpenMP Programming Model

    NASA Technical Reports Server (NTRS)

    Jin, Hao-Qiang; Jost, Gabriele

    2003-01-01

    OpenMP is the current standard for shared-memory programming. While providing ease of parallel programming, the OpenMP programming model also has limitations which often affect the scalability of applications. Examples of these limitations are work distribution and point-to-point synchronization among threads. We propose extensions to the OpenMP programming model which allow the user to easily distribute the work in multiple dimensions and synchronize the workflow among the threads. The proposed extensions include four new constructs and the associated runtime library. They do not require changes to the source code and can be implemented based on the existing OpenMP standard. We illustrate the concept in a prototype translator and test with benchmark codes and a cloud modeling code.
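    The multidimensional work distribution these extensions target can be sketched as a plain block decomposition of a 2D iteration space over a grid of threads (illustrative arithmetic only, not the proposed OpenMP constructs):

```python
def block_range(n: int, nparts: int, p: int):
    """Contiguous half-open range [lo, hi) for part p of n iterations split nparts ways."""
    base, rem = divmod(n, nparts)
    lo = p * base + min(p, rem)
    return lo, lo + base + (1 if p < rem else 0)

def distribute_2d(ni: int, nj: int, ti: int, tj: int):
    """Map a ti x tj logical thread grid onto an ni x nj iteration space,
    returning each thread's (i-range, j-range)."""
    return {(pi, pj): (block_range(ni, ti, pi), block_range(nj, tj, pj))
            for pi in range(ti) for pj in range(tj)}
```

    Expressing this decomposition (and the matching neighbor synchronization) directly in directives is what the proposed constructs would spare the programmer from hand-coding.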

  1. Automatic Generation of OpenMP Directives and Its Application to Computational Fluid Dynamics Codes

    NASA Technical Reports Server (NTRS)

    Yan, Jerry; Jin, Haoqiang; Frumkin, Michael; Yan, Jerry (Technical Monitor)

    2000-01-01

    The shared-memory programming model is a very effective way to achieve parallelism on shared memory parallel computers. As great progress has been made in hardware and software technologies, the performance of parallel programs with compiler directives has improved substantially. The introduction of OpenMP directives, the industrial standard for shared-memory programming, has minimized the issue of portability. In this study, we have extended CAPTools, a computer-aided parallelization toolkit, to automatically generate OpenMP-based parallel programs with nominal user assistance. We outline techniques used in the implementation of the tool and discuss the application of this tool on the NAS Parallel Benchmarks and several computational fluid dynamics codes. This work demonstrates the great potential of using the tool to quickly port parallel programs and also achieve good performance that exceeds some of the commercial tools.
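    The effect of such a tool can be mimicked, very crudely, by a text transform that prefixes loops with a work-sharing directive (a toy sketch: CAPTools decides which loops are safely parallel via in-depth dependence analysis, which this skips entirely and assumes every loop qualifies):

```python
import re

def insert_omp_directives(c_src: str) -> str:
    """Naively prefix each `for` loop in a C source string with an OpenMP
    work-sharing directive, preserving the loop's indentation."""
    lines = []
    for line in c_src.splitlines():
        if re.match(r"\s*for\s*\(", line):
            indent = line[: len(line) - len(line.lstrip())]
            lines.append(indent + "#pragma omp parallel for")
        lines.append(line)
    return "\n".join(lines)
```

    The hard part of automatic parallelization is not emitting the pragma but proving it legal; that is what the interprocedural analysis in the toolkit provides.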

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barbara Chapman

    OpenMP was not well recognized at the beginning of the project, around 2003, because of its limited use in DoE production applications and the immature hardware support for an efficient implementation. Yet in recent years it has gradually been adopted both in HPC applications, mostly in the form of MPI+OpenMP hybrid code, and in mid-scale desktop applications for scientific and experimental studies. We have observed this trend and worked diligently to improve our OpenMP compiler and runtimes, as well as to work with the OpenMP standard organization to make sure OpenMP evolves in a direction aligned with DoE missions. In the Center for Programming Models for Scalable Parallel Computing project, the HPCTools team at the University of Houston (UH), directed by Dr. Barbara Chapman, has been working with project partners, external collaborators and hardware vendors to increase the scalability and applicability of OpenMP for multi-core (and future manycore) platforms and for distributed memory systems by exploring different programming models, language extensions, compiler optimizations, as well as runtime library support.

  3. OpenSHMEM-UCX : Evaluation of UCX for implementing OpenSHMEM Programming Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, Matthew B; Gorentla Venkata, Manjunath; Aderholdt, William Ferrol

    2016-01-01

    The OpenSHMEM reference implementation was developed towards the goal of an open source and high-performing OpenSHMEM implementation. To achieve portability and performance across various networks, the OpenSHMEM reference implementation uses GASNet and UCCS for network operations. Recently, new network layers have emerged with the promise of providing high performance, scalability, and portability for HPC applications. In this paper, we implement the OpenSHMEM reference implementation to use the UCX framework for network operations. Then, we evaluate its performance and scalability on Cray XK systems to understand UCX's suitability for developing the OpenSHMEM programming model. Further, we develop a benchmark called SHOMS for evaluating the OpenSHMEM implementation. Our experimental results show that OpenSHMEM-UCX outperforms the vendor-supplied OpenSHMEM implementation in most cases on the Cray XK system, by up to 40% with respect to message rate and up to 70% for the execution of application kernels.

  4. Logo Talks Back.

    ERIC Educational Resources Information Center

    Bearden, Donna; Muller, Jim

    1983-01-01

    In addition to turtle graphics, the Logo programming language has list and text processing capabilities that open up opportunities for word games, language programs, word processing, and other applications. Provided are examples of these applications using both Apple and MIT Logo versions. Includes sample interactive programs. (JN)

  5. Policy analysis and recommendations for the open source application development portal (OSADP).

    DOT National Transportation Integrated Search

    2012-06-01

    This white paper addresses the policy and institutional issues that are associated with the development of an open source applications development portal (OSADP), part of a larger research effort being conducted under the ITS Programs Dynamic Mobi...

  6. A proposed application programming interface for a physical volume repository

    NASA Technical Reports Server (NTRS)

    Jones, Merritt; Williams, Joel; Wrenn, Richard

    1996-01-01

    The IEEE Storage System Standards Working Group (SSSWG) has developed the Reference Model for Open Storage Systems Interconnection, Mass Storage System Reference Model Version 5. This document provides the framework for a series of standards for application and user interfaces to open storage systems. More recently, the SSSWG has been developing Application Programming Interfaces (APIs) for the individual components defined by the model. The API for the Physical Volume Repository is the most fully developed, but work is also being done on APIs for the Physical Volume Library and for the Mover. The SSSWG meets every other month, and meetings are open to all interested parties. The Physical Volume Repository (PVR) is responsible for managing the storage of removable media cartridges and for mounting and dismounting these cartridges onto drives. This document describes a model which defines a Physical Volume Repository, and gives a brief summary of the Application Programming Interface (API) which the IEEE Storage Systems Standards Working Group (SSSWG) is proposing as the standard interface for the PVR.
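    The PVR's mount/dismount responsibility can be sketched as a small interface (the class and method names below are hypothetical illustrations, not the SSSWG's proposed API):

```python
class PhysicalVolumeRepository:
    """Hypothetical sketch of a PVR-style interface: it tracks where removable
    cartridges are shelved and which drive, if any, each is mounted on."""

    def __init__(self, slots):
        self.slots = dict(slots)   # cartridge_id -> shelf location
        self.mounted = {}          # drive -> cartridge_id

    def mount(self, cartridge_id, drive):
        """Mount a known cartridge onto a free drive."""
        if cartridge_id not in self.slots:
            raise KeyError(f"unknown cartridge {cartridge_id!r}")
        if drive in self.mounted:
            raise RuntimeError(f"drive {drive!r} busy")
        self.mounted[drive] = cartridge_id
        return drive

    def dismount(self, drive):
        """Dismount whatever cartridge is on the drive and return its id."""
        return self.mounted.pop(drive)
```

    The actual standard additionally covers robot arms, import/export slots, and error reporting; the sketch captures only the core resource-management contract.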

  7. BUILDING MODEL ANALYSIS APPLICATIONS WITH THE JOINT UNIVERSAL PARAMETER IDENTIFICATION AND EVALUATION OF RELIABILITY (JUPITER) API

    EPA Science Inventory

    The open-source, public domain JUPITER (Joint Universal Parameter IdenTification and Evaluation of Reliability) API (Application Programming Interface) provides conventions and Fortran-90 modules to develop applications (computer programs) for analyzing process models. The input ...

  8. State-of-the-practice and lessons learned on implementing open data and open source policies.

    DOT National Transportation Integrated Search

    2012-05-01

    This report describes the current government, academic, and private sector practices associated with open data and open source application development. These practices are identified; and the potential uses with the ITS Programs Data Capture and M...

  9. NASA STI Program Coordinating Council Twelfth Meeting: Standards

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The theme of this NASA Scientific and Technical Information Program Coordinating Council Meeting was standards and their formation and application. Topics covered included scientific and technical information architecture, the Open Systems Interconnection Transmission Control Protocol/Internet Protocol, Machine-Readable Cataloging (MARC) open system environment procurement, and the Government Information Locator Service.

  10. The Automatic Parallelisation of Scientific Application Codes Using a Computer Aided Parallelisation Toolkit

    NASA Technical Reports Server (NTRS)

    Ierotheou, C.; Johnson, S.; Leggett, P.; Cross, M.; Evans, E.; Jin, Hao-Qiang; Frumkin, M.; Yan, J.; Biegel, Bryan (Technical Monitor)

    2001-01-01

    The shared-memory programming model is a very effective way to achieve parallelism on shared memory parallel computers. Historically, the lack of a programming standard for using directives and the rather limited performance due to scalability have affected the take-up of this programming model approach. Significant progress has been made in hardware and software technologies; as a result, the performance of parallel programs with compiler directives has also improved. The introduction of an industrial standard for shared-memory programming with directives, OpenMP, has also addressed the issue of portability. In this study, we have extended the computer aided parallelization toolkit (developed at the University of Greenwich) to automatically generate OpenMP-based parallel programs with nominal user assistance. We outline the way in which loop types are categorized and how efficient OpenMP directives can be defined and placed using the in-depth interprocedural analysis that is carried out by the toolkit. We also discuss the application of the toolkit on the NAS Parallel Benchmarks and a number of real-world application codes. This work not only demonstrates the great potential of using the toolkit to quickly parallelize serial programs but also the good performance achievable on up to 300 processors for hybrid message passing and directive-based parallelizations.

  11. SWMM5 Application Programming Interface and PySWMM: A ...

    EPA Pesticide Factsheets

    In support of the OpenWaterAnalytics open source initiative, the PySWMM project encompasses the development of a Python interfacing wrapper to SWMM5 with parallel ongoing development of the USEPA Stormwater Management Model (SWMM5) application programming interface (API). ... The purpose of this work is to increase the utility of the SWMM dll by creating a Toolkit API for accessing its functionality. The utility of the Toolkit is further enhanced with a wrapper to allow access from the Python scripting language. This work is being carried out as part of an Open Source development strategy and is being performed by volunteer software developers.
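    The wrapper idea, a Pythonic iterator over a low-level stepping API, can be sketched as follows (a conceptual stand-in: the real PySWMM binds the SWMM5 shared library, and all names here are illustrative):

```python
class _ToolkitStub:
    """Stand-in for the low-level SWMM5 Toolkit API (normally reached through
    the SWMM5 shared library); here it just advances a counter."""
    def __init__(self, inp_file):
        self.inp_file, self.t, self.end = inp_file, 0, 3
    def swmm_step(self):
        self.t += 1
        return self.t if self.t < self.end else 0  # 0 signals end of simulation

class Simulation:
    """Pythonic iterator wrapper in the spirit of PySWMM's interface
    (a sketch, not the actual pyswmm implementation)."""
    def __init__(self, inp_file):
        self._tk = _ToolkitStub(inp_file)
    def __iter__(self):
        return self
    def __next__(self):
        t = self._tk.swmm_step()
        if t == 0:
            raise StopIteration
        return t
```

    The value of the wrapper is that scripting users get `for step in Simulation(...)` instead of hand-managing C-style step calls and end-of-run sentinels.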

  12. Research on e-commerce transaction networks using multi-agent modelling and open application programming interface

    NASA Astrophysics Data System (ADS)

    Piao, Chunhui; Han, Xufang; Wu, Harris

    2010-08-01

    We provide a formal definition of an e-commerce transaction network. Agent-based modelling is used to simulate e-commerce transaction networks. For real-world analysis, we studied the open application programming interfaces (APIs) from eBay and Taobao e-commerce websites and captured real transaction data. Pajek is used to visualise the agent relationships in the transaction network. We derived one-mode networks from the transaction network and analysed them using degree and betweenness centrality. Integrating multi-agent modelling, open APIs and social network analysis, we propose a new way to study large-scale e-commerce systems.
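    The one-mode projection and degree analysis described above can be sketched directly (a minimal illustration; the paper uses Pajek and real eBay/Taobao transaction data):

```python
from itertools import combinations

def one_mode_degree(transactions):
    """Project buyer-seller transactions onto a buyer-buyer one-mode network
    (two buyers are linked when they share a seller) and return each buyer's
    degree in that network."""
    buyers_by_seller = {}
    for buyer, seller in transactions:
        buyers_by_seller.setdefault(seller, set()).add(buyer)
    edges = set()
    for buyers in buyers_by_seller.values():
        for a, b in combinations(sorted(buyers), 2):
            edges.add((a, b))
    degree = {}
    for a, b in edges:
        degree[a] = degree.get(a, 0) + 1
        degree[b] = degree.get(b, 0) + 1
    return degree
```

    Betweenness centrality would be computed on the same projected network with shortest-path counting; degree is shown here because it fits in a few lines.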

  13. Open Scenario Study, Phase II Report: Assessment and Development of Approaches for Satisfying Unclassified Scenario Needs

    DTIC Science & Technology

    2010-01-01

    interface, another providing the application logic (a program used to manipulate the data), and a server running Microsoft SQL Server or Oracle RDBMS... Oracle ) • Mysql (Open Source) • Other What application server software will be needed? • Application Server • CGI PHP/Perl (Open Source...are used throughout DoD and serve a variety of functions. While DoD has a codified and institutionalized process for the development of a common set

  14. An Open-Source and Java-Technologies Approach to Web Applications

    DTIC Science & Technology

    2003-09-01

    program for any purpose (Freedom 0). • The freedom to study how the program works, and adapt it to individual needs (Freedom 1). Access to the source...manage information for many purposes. Today a key technology that allows developers to make Web applications is server-side programming to generate a

  15. Design, implementation and practice of JBEI-ICE: an open source biological part registry platform and tools.

    PubMed

    Ham, Timothy S; Dmytriv, Zinovii; Plahar, Hector; Chen, Joanna; Hillson, Nathan J; Keasling, Jay D

    2012-10-01

    The Joint BioEnergy Institute Inventory of Composable Elements (JBEI-ICE) is an open source registry platform for managing information about biological parts. It is capable of recording information about 'legacy' parts, such as plasmids, microbial host strains and Arabidopsis seeds, as well as DNA parts in various assembly standards. ICE is built on the idea of a web of registries and thus provides strong support for distributed interconnected use. The information deposited in an ICE installation instance is accessible both via a web browser and through the web application programming interfaces, which allow automated access to parts via third-party programs. JBEI-ICE includes several useful web browser-based graphical applications for sequence annotation, manipulation and analysis that are also open source. As with open source software, users are encouraged to install, use and customize JBEI-ICE and its components for their particular purposes. As a web application programming interface, ICE provides well-developed parts storage functionality for other synthetic biology software projects. A public instance is available at public-registry.jbei.org, where users can try out features, upload parts or simply use it for their projects. The ICE software suite is available via Google Code, a hosting site for community-driven open source projects.
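    Automated access to parts from a third-party program can be sketched as constructing a REST request against an ICE instance (the endpoint path and authentication header below are assumptions for illustration; consult the ICE API documentation for the actual interface):

```python
import urllib.request

def fetch_part_request(base_url: str, part_id: int, session_id: str):
    """Construct (without sending) an HTTP request for a part record from an
    ICE-style registry. Path and header names are illustrative assumptions."""
    return urllib.request.Request(
        f"{base_url}/rest/parts/{part_id}",
        headers={
            "X-ICE-Authentication-SessionId": session_id,  # assumed auth header
            "Accept": "application/json",
        },
    )
```

    A client would pass the returned request to `urllib.request.urlopen` and parse the JSON body to retrieve the part's metadata and sequence.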

  16. What Multilevel Parallel Programs do when you are not Watching: A Performance Analysis Case Study Comparing MPI/OpenMP, MLP, and Nested OpenMP

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Labarta, Jesus; Gimenez, Judit

    2004-01-01

    With the current trend in parallel computer architectures towards clusters of shared memory symmetric multi-processors, parallel programming techniques have evolved that support parallelism beyond a single level. When comparing the performance of applications based on different programming paradigms, it is important to differentiate between the influence of the programming model itself and other factors, such as implementation-specific behavior of the operating system (OS) or architectural issues. Rewriting a large scientific application in order to employ a new programming paradigm is usually a time-consuming and error-prone task. Before embarking on such an endeavor it is important to determine that there is really a gain that would not be possible with the current implementation. A detailed performance analysis is crucial to clarify these issues. The multilevel programming paradigms considered in this study are hybrid MPI/OpenMP, MLP, and nested OpenMP. The hybrid MPI/OpenMP approach is based on using MPI [7] for the coarse grained parallelization and OpenMP [9] for fine grained loop level parallelism. The MPI programming paradigm assumes a private address space for each process. Data is transferred by explicitly exchanging messages via calls to the MPI library. This model was originally designed for distributed memory architectures but is also suitable for shared memory systems. The second paradigm under consideration is MLP, which was developed by Taft. The approach is similar to MPI/OpenMP, using a mix of coarse grain process level parallelization and loop level OpenMP parallelization. As is the case with MPI, a private address space is assumed for each process. The MLP approach was developed for ccNUMA architectures and explicitly takes advantage of the availability of shared memory. A shared memory arena which is accessible by all processes is required. Communication is done by reading from and writing to the shared memory.
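    The contrast between the two communication styles can be sketched with Python threads (a conceptual analogy only: real MPI processes have truly private address spaces, which threads do not):

```python
import queue
import threading

def message_passing_sum(data, nworkers=2):
    """MPI-style: each worker owns a private slice and sends its partial sum
    over an explicit channel; no state is shared."""
    chan = queue.Queue()
    def worker(slice_):
        chan.put(sum(slice_))  # explicit message
    threads = [threading.Thread(target=worker, args=(data[i::nworkers],))
               for i in range(nworkers)]
    for t in threads: t.start()
    for t in threads: t.join()
    return sum(chan.get() for _ in range(nworkers))

def shared_memory_sum(data, nworkers=2):
    """MLP/OpenMP-style: workers accumulate into a shared arena under a lock."""
    arena = {"total": 0}
    lock = threading.Lock()
    def worker(slice_):
        s = sum(slice_)
        with lock:
            arena["total"] += s  # communication through shared memory
    threads = [threading.Thread(target=worker, args=(data[i::nworkers],))
               for i in range(nworkers)]
    for t in threads: t.start()
    for t in threads: t.join()
    return arena["total"]
```

    Both produce the same result; the performance studies in the paper are about which style maps better onto a given machine's memory system.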

  17. Evaluation of CHO Benchmarks on the Arria 10 FPGA using Intel FPGA SDK for OpenCL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Zheming; Yoshii, Kazutomo; Finkel, Hal

    The OpenCL standard is an open programming model for accelerating algorithms on heterogeneous computing systems. OpenCL extends the C programming language for developing portable code on platforms such as CPUs, graphics processing units (GPUs), digital signal processors (DSPs) and field-programmable gate arrays (FPGAs). The Intel FPGA SDK for OpenCL is a suite of tools that allows developers to abstract away the complex FPGA-based development flow in favor of a high-level software development flow. Users can focus on designing hardware-accelerated kernel functions in OpenCL and then direct the tools to generate the low-level FPGA implementations. This approach makes FPGA-based development more accessible to software users as the need for hybrid computing with CPUs and FPGAs increases. It can also significantly reduce hardware development time, since users can evaluate different ideas in a high-level language without deep FPGA domain knowledge. Benchmarking an OpenCL-based framework is an effective way to analyze system performance by studying the execution of benchmark applications. CHO is a suite of benchmark applications that provides support for OpenCL [1]; the authors presented CHO as an OpenCL port of the CHStone benchmark. Using the Altera OpenCL (AOCL) compiler to synthesize the benchmark applications, they listed the resource usage and performance of each kernel that the compiler could successfully synthesize. In this report, we evaluate the resource usage and performance of the CHO benchmark applications using the Intel FPGA SDK for OpenCL and a Nallatech 385A FPGA board that features an Arria 10 FPGA device. The focus of the report is a better understanding of the resource usage and performance of the kernel implementations on Arria 10 FPGA devices compared to Stratix V FPGA devices. In addition, we also gain knowledge about the limitations of the current compiler when it fails to synthesize a benchmark application.
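
    The kernel-based model this report evaluates can be sketched briefly. Below is an illustrative OpenCL C vector-add kernel held as a source string, as it would be handed to an offline compiler such as the one in the Intel FPGA SDK, together with a plain-C routine reproducing its per-work-item semantics. This is an editor's sketch of the general OpenCL programming model, not code from the report.

    ```c
    #include <stddef.h>

    /* OpenCL C kernel source as it would be passed to an offline
     * compiler; each work-item handles one array index.
     * (Illustrative only, not from the report.) */
    static const char *vec_add_kernel_src =
        "__kernel void vec_add(__global const float *a,\n"
        "                      __global const float *b,\n"
        "                      __global float *c) {\n"
        "    size_t i = get_global_id(0);\n"
        "    c[i] = a[i] + b[i];\n"
        "}\n";

    /* Host-side reference: launching the kernel over a global range of
     * n work-items is equivalent to this serial loop, with the loop
     * index playing the role of get_global_id(0). */
    void vec_add_reference(const float *a, const float *b, float *c, size_t n)
    {
        for (size_t i = 0; i < n; ++i)
            c[i] = a[i] + b[i];
    }
    ```

    The FPGA tool flow compiles such kernel source into a hardware pipeline, while the host-side semantics stay those of the simple loop above.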

  18. Comparing the OpenMP, MPI, and Hybrid Programming Paradigm on an SMP Cluster

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Jin, Hao-Qiang; anMey, Dieter; Hatay, Ferhat F.

    2003-01-01

    Clusters of SMP (Symmetric Multi-Processor) nodes support a wide range of parallel programming paradigms. The shared address space within each node is suitable for OpenMP parallelization. Message passing can be employed within and across the nodes of a cluster. Multiple levels of parallelism can be achieved by combining message passing and OpenMP parallelization. Which programming paradigm is best depends on the nature of the given problem, the hardware components of the cluster, the network, and the available software. In this study we compare the performance of different implementations of the same CFD benchmark application, using the same numerical algorithm but employing different programming paradigms.
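
    The two-level decomposition compared in this study can be sketched in a few lines of C. This is a minimal illustration, not the paper's benchmark code: the outer loop stands in for MPI ranks (one per SMP node) and the inner loop is annotated for OpenMP threads within a node. Compiled without OpenMP support, the pragma is ignored and the result is unchanged.

    ```c
    /* Two-level parallelism sketch: the outer loop models MPI ranks,
     * the inner loop is a candidate for OpenMP threads inside a node.
     * (Editor's illustration, not code from the paper.) */
    double hybrid_sum(const double *data, int n, int nranks)
    {
        double total = 0.0;
        for (int rank = 0; rank < nranks; ++rank) {      /* MPI level */
            int lo = rank * n / nranks;                  /* rank's slice */
            int hi = (rank + 1) * n / nranks;
            double partial = 0.0;
            #pragma omp parallel for reduction(+:partial) /* OpenMP level */
            for (int i = lo; i < hi; ++i)
                partial += data[i];
            total += partial;   /* stands in for an MPI_Reduce */
        }
        return total;
    }
    ```

    In a real hybrid code each rank would run as a separate MPI process and the final accumulation would be a collective operation; here the serial outer loop only makes the decomposition explicit.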

  19. Design and optimization of a portable LQCD Monte Carlo code using OpenACC

    NASA Astrophysics Data System (ADS)

    Bonati, Claudio; Coscetti, Simone; D'Elia, Massimo; Mesiti, Michele; Negro, Francesco; Calore, Enrico; Schifano, Sebastiano Fabio; Silvi, Giorgio; Tripiccione, Raffaele

    The present panorama of HPC architectures is extremely heterogeneous, ranging from traditional multi-core CPU processors, supporting a wide class of applications but delivering moderate computing performance, to many-core Graphics Processing Units (GPUs), which exploit aggressive data-parallelism and deliver higher performance for streaming computing applications. In this scenario, code portability (and performance portability) becomes necessary for easy maintainability of applications; this is very relevant in scientific computing, where code changes are very frequent and keeping different code versions aligned is tedious and error-prone. In this work, we present the design and optimization of a state-of-the-art production-level LQCD Monte Carlo application using the directive-based OpenACC programming model. OpenACC abstracts parallel programming to a descriptive level, relieving programmers from specifying how codes should be mapped onto the target architecture. We describe the implementation of a code fully written in OpenACC and show that we are able to target several different architectures, including state-of-the-art traditional CPUs and GPUs, with the same code. We also measure performance, evaluating the computing efficiency of our OpenACC code on several architectures, comparing with GPU-specific implementations and showing that a good level of performance portability can be reached.
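
    The descriptive, directive-based style discussed above can be illustrated with a simple saxpy loop (an editor's sketch, not the paper's LQCD code). A compiler with OpenACC support maps the loop and its data movement onto an accelerator; any other compiler ignores the pragma and runs the same source serially, which is precisely the portability argument.

    ```c
    #include <stddef.h>

    /* Directive-based offload sketch in the OpenACC style: the pragma
     * describes parallelism and data movement; it does not prescribe a
     * mapping onto the hardware. Without OpenACC support the pragma is
     * ignored and the loop runs serially with identical results. */
    void saxpy(size_t n, float alpha, const float *x, float *y)
    {
        #pragma acc parallel loop copyin(x[0:n]) copy(y[0:n])
        for (size_t i = 0; i < n; ++i)
            y[i] = alpha * x[i] + y[i];
    }
    ```

    The same source therefore serves CPUs and GPUs alike, with the accelerator mapping left to the compiler.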

  20. BOWS (bioinformatics open web services) to centralize bioinformatics tools in web services.

    PubMed

    Velloso, Henrique; Vialle, Ricardo A; Ortega, J Miguel

    2015-06-02

    Bioinformaticians face a range of difficulties in getting locally installed tools running and producing results; they would greatly benefit from a system that could centralize most of the tools behind an easy interface for input and output. Web services, due to their universal nature and widely known interface, are a very good option to achieve this goal. Bioinformatics open web services (BOWS) is a system based on generic web services produced to allow programmatic access to applications running on high-performance computing (HPC) clusters. BOWS mediates access to registered tools by providing front-end and back-end web services. Programmers can install applications on HPC clusters in any programming language and use the back-end service to check for new jobs and their parameters, and then to send the results to BOWS. Programs running on simple computers consume the BOWS front-end service to submit new processes and read results. BOWS compiles Java clients, which encapsulate the front-end web service requests, and automatically creates a web page that lists the registered applications and clients. BOWS-registered applications can be accessed from virtually any programming language through web services, or using the standard Java clients. The back-end can run on HPC clusters, allowing bioinformaticians to remotely run processing-intensive applications directly from their machines.
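
    The submit/poll/read cycle described above can be simulated in miniature. All names below are hypothetical (this is not the actual BOWS API): a client submits a job through the front-end, the back-end polls for pending jobs and runs the "tool", and the client then reads the result.

    ```c
    /* Toy in-memory simulation of a front-end/back-end job cycle.
     * Capacity checks and networking are omitted for brevity; every
     * identifier here is invented for illustration. */
    enum { PENDING, DONE };

    struct job { int id; int input; int status; int result; };

    static struct job queue[8];
    static int njobs = 0;

    int submit_job(int input)              /* front-end: submit a job */
    {
        queue[njobs] = (struct job){ njobs, input, PENDING, 0 };
        return njobs++;
    }

    void backend_poll(void)                /* back-end: check & run jobs */
    {
        for (int i = 0; i < njobs; ++i)
            if (queue[i].status == PENDING) {
                queue[i].result = queue[i].input * queue[i].input; /* the "tool" */
                queue[i].status = DONE;
            }
    }

    int read_result(int id)                /* front-end: read a result */
    {
        return queue[id].status == DONE ? queue[id].result : -1;
    }
    ```

    In BOWS the two halves run on different machines and communicate over web services; the decoupled queue is what lets the compute side live on an HPC cluster.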

  1. E-Standards For Mass Properties Engineering

    NASA Technical Reports Server (NTRS)

    Cerro, Jeffrey A.

    2008-01-01

    A proposal is put forth to promote the concept of a Society of Allied Weight Engineers (SAWE) developed voluntary consensus standard for mass properties engineering. This standard would be an e-standard, and would encompass data, data manipulation, and reporting functionality. The standard would be implemented via an open-source SAWE distribution site with full SAWE member body access. Engineering societies and global standards initiatives are progressing toward modern engineering standards, which become functioning deliverable data sets. These data sets, if properly standardized, will integrate easily between supplier and customer, enabling technically precise mass properties data exchange. The concepts of object-oriented programming support all of these requirements, and the use of a Java-based open-source development initiative is proposed. Results are reported for activity sponsored by the NASA Langley Research Center Innovation Institute to scope out requirements for developing a mass properties engineering e-standard. An initial software distribution is proposed. Upon completion, an open-source application programming interface will be available to SAWE members for the development of more specific programming requirements that are tailored to company and project requirements. A fully functioning application programming interface will permit code extension via company proprietary techniques, as well as through continued open-source initiatives.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Junghyun; Gangwon, Jo; Jaehoon, Jung

    Applications written solely in OpenCL or CUDA cannot execute on a cluster as a whole. Most previous approaches that extend these programming models to clusters are based on a common idea: designating a centralized host node and coordinating the other nodes with the host for computation. However, the centralized host node is a serious performance bottleneck when the number of nodes is large. In this paper, we propose a scalable and distributed OpenCL framework called SnuCL-D for large-scale clusters. SnuCL-D's remote device virtualization provides an OpenCL application with the illusion that all compute devices in a cluster are confined in a single node. To reduce the amount of control-message and data communication between nodes, SnuCL-D replicates the OpenCL host program execution and data in each node. We also propose a new OpenCL host API function and a queueing optimization technique that significantly reduce the overhead incurred by previous centralized approaches. To show the effectiveness of SnuCL-D, we evaluate SnuCL-D with a microbenchmark and eleven benchmark applications on a large-scale CPU cluster and a medium-scale GPU cluster.

  3. Context-Based Mobile Security Enclave

    DTIC Science & Technology

    2012-09-01

    [Front-matter excerpt: contents entries "Change IMSI", "Change CellID", "Change Geolocation"; acronym list: Assisted Global Positioning System; ADB, Android Debugger; API, Application Programming Interface; APK, Android Application Package; BSC, Base Station Controller. The report notes that Android's Application Programming Interfaces (APIs) use Java-compatible libraries based on Apache Harmony, an open-source Java implementation developed by the Apache project.]

  4. 12 CFR 226.53 - Allocation of payments.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... (CONTINUED) TRUTH IN LENDING (REGULATION Z) Special Rules Applicable to Credit Card Accounts and Open-End... periodic payment for a credit card account under an open-end (not home-secured) consumer credit plan, the... program. When a balance on a credit card account under an open-end (not home-secured) consumer credit plan...

  5. 12 CFR 226.53 - Allocation of payments.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... TRUTH IN LENDING (REGULATION Z) Special Rules Applicable to Credit Card Accounts and Open-End Credit... payment for a credit card account under an open-end (not home-secured) consumer credit plan, the card... program. When a balance on a credit card account under an open-end (not home-secured) consumer credit plan...

  6. 12 CFR 226.53 - Allocation of payments.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... (CONTINUED) TRUTH IN LENDING (REGULATION Z) Special Rules Applicable to Credit Card Accounts and Open-End... periodic payment for a credit card account under an open-end (not home-secured) consumer credit plan, the... program. When a balance on a credit card account under an open-end (not home-secured) consumer credit plan...

  7. 75 FR 11841 - Repowering Assistance Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-12

    ... application window. SUMMARY: RBS is announcing a new application window to submit applications for the...-time application window for remaining FY 2009 funds. Paperwork Reduction Act In accordance with the... allocate all of the FY 2009 authorized funds. Therefore, the Agency is opening a new application window to...

  8. 45 CFR 2517.500 - How is an application reviewed?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... COMMUNITY SERVICE COMMUNITY-BASED SERVICE-LEARNING PROGRAMS Application Review § 2517.500 How is an... ensure that the projects are open to participants of different ages, races, genders, ethnicities...

  9. RINGMesh: A programming library for developing mesh-based geomodeling applications

    NASA Astrophysics Data System (ADS)

    Pellerin, Jeanne; Botella, Arnaud; Bonneau, François; Mazuyer, Antoine; Chauvin, Benjamin; Lévy, Bruno; Caumon, Guillaume

    2017-07-01

    RINGMesh is a C++ open-source programming library for manipulating discretized geological models. It is designed to ease the development of applications and workflows that use discretized 3D models. It is neither a geomodeler nor a meshing software package. RINGMesh implements functionality to read discretized surface-based or volumetric structural models and to check their validity. The models can then be exported in various file formats. RINGMesh provides data structures to represent geological structural models, defined by their discretized boundary surfaces and/or by discretized volumes. A programming interface allows the development of new geomodeling methods and the integration of external software. The goal of RINGMesh is to help researchers focus on the implementation of their specific method rather than on tedious tasks common to many applications. The documented code is open-source and distributed under the modified BSD license. It is available at https://www.ring-team.org/index.php/software/ringmesh.

  10. 78 FR 52900 - Request for Applications: The Community Forest and Open Space Conservation Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-27

    ... regulations, contact Scott Stewart, Program Manager, 202-205-1618, [email protected] or Maya Solomon, Program... (TDD) may call the Federal Relay Service (FRS) at 1-800-877-8339 twenty-four hours a day, every day of...

  11. Your Personal Analysis Toolkit - An Open Source Solution

    NASA Astrophysics Data System (ADS)

    Mitchell, T.

    2009-12-01

    Open source software is commonly known for its web browsers, word processors and programming languages. However, there is a vast array of open source software focused on geographic information management and geospatial application building in general. As geo-professionals, having easy access to tools for our jobs is crucial. Open source software provides the opportunity to add a tool to your tool belt and carry it with you for your entire career - with no license fees, a supportive community and the opportunity to test, adopt and upgrade at your own pace. OSGeo is a US registered non-profit representing more than a dozen mature geospatial data management applications and programming resources. Tools cover areas such as desktop GIS, web-based mapping frameworks, metadata cataloging, spatial database analysis, image processing and more. Learn about some of these tools as they apply to AGU members, as well as how you can join OSGeo and its members in getting the job done with powerful open source tools. If you haven't heard of OSSIM, MapServer, OpenLayers, PostGIS, GRASS GIS or the many other projects under our umbrella - then you need to hear this talk. Invest in yourself - use open source!

  12. Tank Information System (tis): a Case Study in Migrating Web Mapping Application from Flex to Dojo for Arcgis Server and then to Open Source

    NASA Astrophysics Data System (ADS)

    Pulsani, B. R.

    2017-11-01

    Tank Information System is a web application that provides comprehensive information about the minor irrigation tanks of Telangana State. As part of the program, a web mapping application using Flex and ArcGIS Server was developed to make the data available to the public. In course of time, as Flex became outdated, the client interface was migrated to the latest JavaScript-based technologies. Initially, the Flex-based application was migrated to the ArcGIS JavaScript API using the Dojo Toolkit; both client applications used published services from ArcGIS Server. To check the migration pattern from proprietary to open source, the JavaScript-based ArcGIS application was later migrated to OpenLayers and the Dojo Toolkit, using published services from GeoServer. The migration pattern noticed in the study especially emphasizes the use of the Dojo Toolkit and a PostgreSQL database with ArcGIS Server, so that migration to open source can be performed effortlessly. The current application provides a case study that could assist organizations in migrating their proprietary ArcGIS web applications to open source. Furthermore, the study reveals the cost benefits of adopting open source over commercial software.

  13. Software Applications on the Peregrine System | High-Performance Computing

    Science.gov Websites

    [Fragments of the applications table: "... programming and optimization"; Gaussian (chemistry): program for calculating molecular electronic structure; (materials science): open-source classical molecular dynamics program designed for massively parallel systems; Q-Chem (chemistry): ab initio quantum chemistry package for predicting molecular structures.]

  14. Mobile service for open data visualization on geo-based images

    NASA Astrophysics Data System (ADS)

    Lee, Kiwon; Kim, Kwangseob; Kang, Sanggoo

    2015-12-01

    Since the early 2010s, governments in most countries have adopted and promoted open data policies and open data platforms. Korea is in the same situation: government and public organizations have operated publicly accessible open data portal systems since 2011, and the number of open data sets and data types has been increasing every year. These trends are even more expandable and extensible in mobile environments. The purpose of this study is to design and implement a mobile application service to visualize public open data of various types and formats together with geo-based images on the mobile web. Open data cover downloadable data sets and openly accessible data application programming interfaces (APIs). Geo-based images mean multi-sensor satellite imagery that is georeferenced and matched with digital map sets. The system components for the mobile service are fully based on open sources and open development environments, without any commercial tools: PostgreSQL for the database management system, OTB for remote sensing image processing, GDAL for data conversion, GeoServer for the application server, OpenLayers for mobile web mapping, R for data analysis and D3.js for web-based data graphics. The client-side mobile application was implemented using HTML5 for cross-browser and cross-platform support. The result shows many advantages, such as linking open data and geo-based data, integrating open data and open source, and demonstrating mobile applications with open data. It is expected that this approach is a cost-effective and process-efficient implementation strategy for intelligent Earth-observing data.

  15. Enhancing Application Performance Using Mini-Apps: Comparison of Hybrid Parallel Programming Paradigms

    NASA Technical Reports Server (NTRS)

    Lawson, Gary; Poteat, Michael; Sosonkina, Masha; Baurle, Robert; Hammond, Dana

    2016-01-01

    In this work, several mini-apps have been created to enhance a real-world application performance, namely the VULCAN code for complex flow analysis developed at the NASA Langley Research Center. These mini-apps explore hybrid parallel programming paradigms with Message Passing Interface (MPI) for distributed memory access and either Shared MPI (SMPI) or OpenMP for shared memory accesses. Performance testing shows that MPI+SMPI yields the best execution performance, while requiring the largest number of code changes. A maximum speedup of 23X was measured for MPI+SMPI, but only 10X was measured for MPI+OpenMP.

  16. Computational toxicology using the OpenTox application programming interface and Bioclipse

    PubMed Central

    2011-01-01

    Background Toxicity is a complex phenomenon involving the potential adverse effect on a range of biological functions. Predicting toxicity involves using a combination of experimental data (endpoints) and computational methods to generate a set of predictive models. Such models rely strongly on being able to integrate information from many sources. The required integration of biological and chemical information sources requires, however, a common language to express our knowledge ontologically, and interoperating services to build reliable predictive toxicology applications. Findings This article describes progress in extending the integrative bio- and cheminformatics platform Bioclipse to interoperate with OpenTox, a semantic web framework which supports open data exchange and toxicology model building. The Bioclipse workbench environment enables functionality from OpenTox web services and easy access to OpenTox resources for evaluating toxicity properties of query molecules. Relevant cases and interfaces based on ten neurotoxins are described to demonstrate the capabilities provided to the user. The integration takes advantage of semantic web technologies, thereby providing an open and simplifying communication standard. Additionally, the use of ontologies ensures proper interoperation and reliable integration of toxicity information from both experimental and computational sources. Conclusions A novel computational toxicity assessment platform was generated from integration of two open science platforms related to toxicology: Bioclipse, that combines a rich scriptable and graphical workbench environment for integration of diverse sets of information sources, and OpenTox, a platform for interoperable toxicology data and computational services. The combination provides improved reliability and operability for handling large data sets by the use of the Open Standards from the OpenTox Application Programming Interface. 
This enables simultaneous access to a variety of distributed predictive toxicology databases, and algorithm and model resources, taking advantage of the Bioclipse workbench handling the technical layers. PMID:22075173

  17. Open data models for smart health interconnected applications: the example of openEHR.

    PubMed

    Demski, Hans; Garde, Sebastian; Hildebrand, Claudia

    2016-10-22

    Smart Health is known as a concept that enhances networking, intelligent data processing and combining patient data with other parameters. Open data models can play an important role in creating a framework for providing interoperable data services that support the development of innovative Smart Health applications profiting from data fusion and sharing. This article describes a model-driven engineering approach based on standardized clinical information models and explores its application for the development of interoperable electronic health record systems. The following possible model-driven procedures were considered: provision of data schemes for data exchange, automated generation of artefacts for application development and native platforms that directly execute the models. The applicability of the approach in practice was examined using the openEHR framework as an example. A comprehensive infrastructure for model-driven engineering of electronic health records is presented using the example of the openEHR framework. It is shown that data schema definitions to be used in common practice software development processes can be derived from domain models. The capabilities for automatic creation of implementation artefacts (e.g., data entry forms) are demonstrated. Complementary programming libraries and frameworks that foster the use of open data models are introduced. Several compatible health data platforms are listed. They provide standard based interfaces for interconnecting with further applications. Open data models help build a framework for interoperable data services that support the development of innovative Smart Health applications. Related tools for model-driven application development foster semantic interoperability and interconnected innovative applications.

  18. Hybrid MPI+OpenMP Programming of an Overset CFD Solver and Performance Investigations

    NASA Technical Reports Server (NTRS)

    Djomehri, M. Jahed; Jin, Haoqiang H.; Biegel, Bryan (Technical Monitor)

    2002-01-01

    This report describes a two-level parallelization of a Computational Fluid Dynamics (CFD) solver with multi-zone overset structured grids. The approach is based on a hybrid MPI+OpenMP programming model suitable for shared-memory machines and clusters of shared-memory machines. The performance investigations of the hybrid application on an SGI Origin2000 (O2K) machine are reported using medium- and large-scale test problems.

  19. Spatializing Open Data for the Assessment and the Improvement of Territorial and Social Cohesion

    NASA Astrophysics Data System (ADS)

    Scorza, F.; Las Casas, G. B.; Murgante, B.

    2016-09-01

    An integrated place-based approach to improving territorial and social cohesion is the new demand placed on the planning disciplines by the EU's New Cohesion Policies. This paper considers the territorial impact assessment of regional development policies as a precondition for developing balanced and effective operative programs at national and regional levels. The contribution of `open data' appears mature enough to support this application, and in this paper we present a spatial analysis technique for evaluating the effects of EU funds at the territorial level, starting from open data provided by Open Cohesion. The application focuses on internal areas of the Basilicata Region: the Agri river valley, a complex context where traditional environmental and agricultural vocations conflict with the recent development of oil extraction industries. Conclusions concern further applications and perspectives to improve and support regional development planning through the exploitation of open data sources and spatial analysis.

  20. Traleika Glacier X-Stack Extension Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fryman, Joshua

    The XStack Extension Project continued along the direction of the XStack program, exploring software tools and frameworks to support a task-based community runtime towards the goal of Exascale programming. The momentum built during the XStack project, with the development of the task-based Open Community Runtime (OCR) and related tools, was carried through the XStack Extension with focus areas of easing application development, improving performance and supporting more features. The infrastructure set up for community-driven open-source development continued to be used towards these areas, with continued co-development of the runtime and applications. A variety of OCR programming environments were studied, as described in the Revolutionary Programming Environments & Applications sections, to assist with application development on OCR, and we developed OCR Translator, a ROSE-based source-to-source compiler that parses high-level annotations in an MPI program to generate equivalent OCR code. Figure 2 compares the number of OCR objects needed to generate the 2D stencil workload using the translator against manual approaches based on an SPMD library or native coding. The rate of increase with the translator, as the number of ranks increases, is consistent with the other approaches. This is explored further in the OCR Translator section.

  1. 12 CFR 606.602 - Application.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... open meetings of the Farm Credit Board. (2) Making inquiries or filing complaints. (3) Using the FCA library in McLean, Virginia. (4) Seeking employment with FCA. (5) Attending any meeting, conference, seminar, or other program open to the public. This list is illustrative only and failure to include an...

  2. 12 CFR 606.602 - Application.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... open meetings of the Farm Credit Board. (2) Making inquiries or filing complaints. (3) Using the FCA library in McLean, Virginia. (4) Seeking employment with FCA. (5) Attending any meeting, conference, seminar, or other program open to the public. This list is illustrative only and failure to include an...

  3. Writing in the Disciplines versus Corporate Workplaces: On the Importance of Conflicting Disciplinary Discourses in the Open Source Movement and the Value of Intellectual Property

    ERIC Educational Resources Information Center

    Ballentine, Brian D.

    2009-01-01

    Writing programs and more specifically, Writing in the Disciplines (WID) initiatives have begun to embrace the use of and the ideology inherent to, open source software. The Conference on College Composition and Communication has passed a resolution stating that whenever feasible educators and their institutions consider open source applications.…

  4. Cargo Movement Operations System (CMOS). Software Requirements Specification (Applications CSCI) Increment 1, Update

    DTIC Science & Technology

    1990-05-31

    [Excerpt of comment-disposition forms (CMOS PMO / ERCI acceptance, open/closed status; originator control number SRS1-0004); one comment concerns the operational state of the SBSS.]

  5. Division XII / Commission 46 / Program Group Exchange of Astronomers

    NASA Astrophysics Data System (ADS)

    Percy, John R.; Leung, Kam-Ching; Tolbert, Charles R.

    The Commission 46 Program Group Exchange of Astronomers (PG-EA) provides travel grants to astronomers and advanced students for research or study trips of at least three months duration. Highest priority is given to applicants from developing countries whose visits will benefit them, their institution and country, and the institution visited. This program, if used strategically, has the potential to support other Commission 46 programs such as Teaching for Astronomical Development (PG-TAD) and World Wide Development of Astronomy (PG-WWDA). Complete information about the program, and the application procedure, can be found at .

  6. 77 FR 13261 - Request for Applications: The Community Forest and Open Space Conservation Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-06

    ....us or Maya Solomon, Program Coordinator, 202-205-1376, [email protected] . Individuals who use telecommunication devices for the deaf (TDD) may call the Federal Relay Service (FRS) at 1-800-877-8339 twenty-four...

  7. 77 FR 8801 - Request for Applications: The Community Forest and Open Space Conservation Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-15

    ... Solomon, Program Coordinator, 202-205-1376, [email protected] . Individuals who use telecommunication devices for the deaf (TDD) may call the Federal Relay Service (FRS) at 1-800-877-8339 twenty-four hours a...

  8. 2016 RFA for Great Lakes Long-Term Biology Monitoring Program: Phytoplankton Component

    EPA Pesticide Factsheets

    This Request for Applications solicits applications from eligible entities for a cooperative agreement to be awarded for a project to continue the long-term monitoring of phytoplankton in the open waters of the Great Lakes.

  9. Cargo Movement Operations System (CMOS). Final Software Requirements Specification, (Applications CSCI), Increment II

    DTIC Science & Technology

    1991-01-29

    [Excerpt of comment-disposition forms (open/closed status; originator control number SRS1-0002); one comment concerns a floppy diskette interface with CMOS.]

  10. Real-time dispatching modelling for trucks with different capacities in open pit mines / Modelowanie w czasie rzeczywistym przewozów ciężarówek o różnej ładowności w kopalni odkrywkowej

    NASA Astrophysics Data System (ADS)

    Ahangaran, Daryoush Kaveh; Yasrebi, Amir Bijan; Wetherelt, Andy; Foster, Patrick

    2012-10-01

    Application of fully automated systems for truck dispatching plays a major role in decreasing transportation costs, which often represent the majority of the costs of open pit mining. Consequently, the application of a truck dispatching system has become fundamentally important in most of the world's open pit mines. Recent experience indicates that a truck dispatching system, by decreasing a truck's travelling time and the waiting time of its associated shovel, considerably improves the rate of production. Computer-based truck dispatching systems using algorithms and advanced, accurate software are examples of these innovations. Developing an algorithm for a computer-based program appropriate to a specific mine's conditions is one of the most important activities in connection with computer-based dispatching in open pit mines. In this paper the changing trends in programming, dispatching control algorithms and automation conditions are discussed. Furthermore, since the transportation fleets of most mines use trucks with different capacities, innovative methods, operational optimisation techniques and the best possible methods for developing the required real-time dispatching algorithm are selected by researching mathematically based planning methods. Finally, a real-time dispatching model compatible with the requirements of trucks with different capacities is developed using two techniques: flow networks and integer programming.
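
    The real-time assignment problem described above can be illustrated with a deliberately simple greedy rule (an editor's sketch, not the paper's flow-network or integer-programming model): each arriving truck is dispatched to the shovel that will free up first, and that shovel's next-free time is advanced in proportion to the truck's capacity.

    ```c
    /* Greedy real-time dispatch sketch: send each truck to the shovel
     * with the earliest next-free time, then advance that time by a
     * load time scaled to the truck's capacity. All names and the
     * cost model are invented for illustration. */
    int dispatch(double *shovel_free, int nshovels,
                 double truck_capacity, double base_load_time)
    {
        int best = 0;
        for (int s = 1; s < nshovels; ++s)
            if (shovel_free[s] < shovel_free[best])
                best = s;
        /* larger trucks occupy the shovel longer */
        shovel_free[best] += base_load_time * truck_capacity;
        return best;
    }
    ```

    The paper's flow-network and integer-programming formulations replace this myopic rule with a global optimization over the whole fleet, which is what makes mixed truck capacities tractable.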

  11. Enhancing Application Performance Using Mini-Apps: Comparison of Hybrid Parallel Programming Paradigms

    NASA Technical Reports Server (NTRS)

    Lawson, Gary; Sosonkina, Masha; Baurle, Robert; Hammond, Dana

    2017-01-01

    In many fields, real-world applications for High Performance Computing have already been developed. For these applications to stay up-to-date, new parallel strategies must be explored to yield the best performance; however, restructuring or modifying a real-world application may be daunting depending on the size of the code. In this case, a mini-app may be employed to quickly explore such options without modifying the entire code. In this work, several mini-apps have been created to enhance a real-world application performance, namely the VULCAN code for complex flow analysis developed at the NASA Langley Research Center. These mini-apps explore hybrid parallel programming paradigms with Message Passing Interface (MPI) for distributed memory access and either Shared MPI (SMPI) or OpenMP for shared memory accesses. Performance testing shows that MPI+SMPI yields the best execution performance, while requiring the largest number of code changes. A maximum speedup of 23 was measured for MPI+SMPI, but only 11 was measured for MPI+OpenMP.

  12. Testing New Programming Paradigms with NAS Parallel Benchmarks

    NASA Technical Reports Server (NTRS)

    Jin, H.; Frumkin, M.; Schultz, M.; Yan, J.

    2000-01-01

    Over the past decade, high performance computing has evolved rapidly, not only in hardware architectures but also in the increasing complexity of real applications. Technologies have been developed that aim at scaling to thousands of processors on both distributed and shared memory systems. Developing parallel programs on these computers is always a challenging task. Today, writing parallel programs with message passing (e.g., MPI) is the most popular way of achieving scalability and high performance. However, writing message passing programs is difficult and error prone. In recent years, new efforts have been made to define new parallel programming paradigms. The best examples are HPF (based on data parallelism) and OpenMP (based on shared memory parallelism). Both provide simple and clear extensions to sequential programs, thus greatly simplifying the tedious tasks encountered in writing message passing programs. HPF is independent of the memory hierarchy; however, due to the immaturity of compiler technology, its performance is still questionable. Although the use of parallel compiler directives is not new, OpenMP offers a portable solution in the shared-memory domain. Another important development is the tremendous progress of the internet and its associated technology. Although still in its infancy, Java promises portability in a heterogeneous environment and offers the possibility to "compile once and run anywhere." To test these new technologies, we implemented new parallel versions of the NAS Parallel Benchmarks (NPBs) with HPF and OpenMP directives, and extended the work with Java and Java threads. The purpose of this study is to examine the effectiveness of alternative programming paradigms. The NPBs consist of five kernels and three simulated applications that mimic the computation and data movement of large-scale computational fluid dynamics (CFD) applications. We started with the serial version included in NPB2.3. Optimization of memory and cache usage was applied to several benchmarks, notably BT and SP, resulting in better sequential performance. To overcome the lack of an HPF performance model and to guide the development of the HPF codes, we employed an empirical performance model for several primitives found in the benchmarks. We encountered a few limitations of HPF, such as the lack of support for the "REDISTRIBUTION" directive and no easy way to handle irregular computation. The parallelization with OpenMP directives was done at the outermost loop level to achieve the largest granularity. The performance of six HPF and OpenMP benchmarks is compared with their MPI counterparts for the Class A problem size in the figure on the next page. These results were obtained on an SGI Origin2000 (195 MHz) with the MIPSpro f77 compiler 7.2.1 for the OpenMP and MPI codes and the PGI pghpf 2.4.3 compiler with MPI interface for the HPF programs.
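    The outermost-loop strategy described above can be mimicked in miniature; this is a hypothetical Python analogue of an OpenMP parallel-do placed on the outer loop, not the benchmark code itself:

    ```python
    # Outermost-loop parallelism, analogous to placing `!$omp parallel do` on
    # the outer loop: each worker handles a whole row, the largest-grain unit
    # of work, minimizing scheduling overhead.
    from concurrent.futures import ThreadPoolExecutor

    def process_row(row):            # stand-in for one outer iteration of a kernel
        return [x * x for x in row]

    grid = [[1, 2], [3, 4], [5, 6]]
    with ThreadPoolExecutor(max_workers=3) as pool:
        result = list(pool.map(process_row, grid))   # one task per outer iteration
    print(result)  # [[1, 4], [9, 16], [25, 36]]
    ```

    Parallelizing an inner loop instead would create many small tasks per outer iteration, which is the granularity penalty the abstract's choice avoids.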

  13. 77 FR 10783 - Meeting of National Council on the Humanities

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-23

    ... Meetings (Open to the Public) Policy Discussion 9-10:30 a.m. Digital Humanities Room 402 Education Programs... Research Programs Room 315 (Closed to the Public) Discussion of Specific Grant Applications and Programs.... Presentation by Joan Houston Hall, editor of the Dictionary of American Regional English (DARE) 3. Staff Report...

  14. Leveraging Open Standard Interfaces in Providing Efficient Discovery, Retrieval, and Information of NASA-Sponsored Observations and Predictions

    NASA Astrophysics Data System (ADS)

    Cole, M.; Alameh, N.; Bambacus, M.

    2006-05-01

    The Applied Sciences Program at NASA focuses on extending the results of NASA's Earth-Sun system science research beyond the science and research communities to contribute to national priority applications with societal benefits. By employing a systems engineering approach, supporting interoperable data discovery and access, and developing partnerships with federal agencies and national organizations, the Applied Sciences Program facilitates the transition from research to operations in national applications. In particular, the Applied Sciences Program identifies twelve national applications, listed at http://science.hq.nasa.gov/earth-sun/applications/, which can be best served by the results of NASA aerospace research and development of science and technologies. The ability to use and integrate NASA data and science results into these national applications results in enhanced decision support and significant socio-economic benefits for each of the applications. This paper focuses on leveraging the power of interoperability and specifically open standard interfaces in providing efficient discovery, retrieval, and integration of NASA's science research results. Interoperability (the ability to access multiple, heterogeneous geoprocessing environments, either local or remote by means of open and standard software interfaces) can significantly increase the value of NASA-related data by increasing the opportunities to discover, access and integrate that data in the twelve identified national applications (particularly in non-traditional settings). Furthermore, access to data, observations, and analytical models from diverse sources can facilitate interdisciplinary and exploratory research and analysis. To streamline this process, the NASA GeoSciences Interoperability Office (GIO) is developing the NASA Earth-Sun System Gateway (ESG) to enable access to remote geospatial data, imagery, models, and visualizations through open, standard web protocols. 
The gateway (online at http://esg.gsfc.nasa.gov) acts as a flexible and searchable registry of NASA-related resources (files, services, models, etc) and allows scientists, decision makers and others to discover and retrieve a wide variety of observations and predictions of natural and human phenomena related to Earth Science from NASA and other sources. To support the goals of the Applied Sciences national applications, GIO staff is also working with the national applications communities to identify opportunities where open standards-based discovery and access to NASA data can enhance the decision support process of the national applications. This paper describes the work performed to-date on that front, and summarizes key findings in terms of identified data sources and benefiting national applications. The paper also highlights the challenges encountered in making NASA-related data accessible in a cross-cutting fashion and identifies areas where interoperable approaches can be leveraged.

  15. METLIN-PC: An applications-program package for problems of mathematical programming

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pshenichnyi, B.N.; Sobolenko, L.A.; Sosnovskii, A.A.

    1994-05-01

    The METLIN-PC applications-program package (APP) was developed at the V.M. Glushkov Institute of Cybernetics of the Academy of Sciences of Ukraine for IBM PC XT and AT computers. The present version of the package was written in Turbo Pascal and Fortran-77. METLIN-PC is chiefly designed for the solution of smooth problems of mathematical programming and is a further development of the METLIN prototype, which was created earlier on a BESM-6 computer. The principal property of the previous package is retained: the applications modules employ a single approach based on the linearization method of B.N. Pshenichnyi, hence the name "METLIN."

  16. Exploring social support and job satisfaction among associate degree program directors in California.

    PubMed

    Mintz-Binder, Ronda D; Fitzpatrick, Joyce J

    2009-01-01

    A troubling trend noted in California has been an increase in the number of open positions for program directors of associate degree registered nursing (ADRN) programs. Positions remain open for extended periods, and the number of qualified applicants for such positions is insufficient. The loss and ensuing slow replacement of ADRN program directors can put these programs in jeopardy of student admission suspension or, worse yet, closure by the state nursing board. In this exploratory study, the variables of social support and job satisfaction were examined. Findings included limited opportunities for peer interaction, expressed discontent, and retention concerns. A significant positive relationship between job satisfaction and social support was noted. Recommendations for future research are offered.

  17. Portable LQCD Monte Carlo code using OpenACC

    NASA Astrophysics Data System (ADS)

    Bonati, Claudio; Calore, Enrico; Coscetti, Simone; D'Elia, Massimo; Mesiti, Michele; Negro, Francesco; Fabio Schifano, Sebastiano; Silvi, Giorgio; Tripiccione, Raffaele

    2018-03-01

    Varying from multi-core CPU processors to many-core GPUs, the present scenario of HPC architectures is extremely heterogeneous. In this context, code portability is increasingly important for easy maintainability of applications; this is relevant in scientific computing where code changes are numerous and frequent. In this talk we present the design and optimization of a state-of-the-art production level LQCD Monte Carlo application, using the OpenACC directives model. OpenACC aims to abstract parallel programming to a descriptive level, where programmers do not need to specify the mapping of the code on the target machine. We describe the OpenACC implementation and show that the same code is able to target different architectures, including state-of-the-art CPUs and GPUs.

  18. Effective Technology Insertion: The Key to Evolutionary Acquisition Program

    DTIC Science & Technology

    2004-05-03

    Army War College, 7 April 2003. 47Orazia A. Di Marca, Stephen B. Rejto, and Thomas Gomez, "Open System Design and Evolutionary Acquisition...to Military Applications-Report No. D-2002-107. 14 June 2002. Di Marca, Orazia A.; Rejto, Stephen B.; and Gomez, Thomas, "Open System Design and

  19. Interface handbook : National Airspace System (NAS) Open System Environment (OSE) application services

    DOT National Transportation Integrated Search

    1997-06-27

    The Intelligent Vehicle Highway System (IVHS) program is a major new national program which has dramatically come of age in the last five years. Internationally, similar events have also occurred in both Japan and Europe. However, what may be suspect...

  20. An open-source framework for testing tracking devices using Lego Mindstorms

    NASA Astrophysics Data System (ADS)

    Jomier, Julien; Ibanez, Luis; Enquobahrie, Andinet; Pace, Danielle; Cleary, Kevin

    2009-02-01

    In this paper, we present an open-source framework for testing tracking devices in surgical navigation applications. At the core of image-guided intervention systems is the tracking interface that handles communication with the tracking device and gathers tracking information. Given that the correctness of tracking information is critical for protecting patient safety and for ensuring the successful execution of an intervention, the tracking software component needs to be thoroughly tested on a regular basis. Furthermore, with the widespread use of extreme programming methodologies that emphasize continuous and incremental testing of application components, testing design becomes critical. While it is easy to automate most of the testing process, it is often more difficult to test components that require manual intervention, such as a tracking device. Our framework consists of a robotic arm built from a set of Lego Mindstorms and an open-source toolkit written in C++ to control the robot movements and assess the accuracy of the tracking devices. The application program interface (API) is cross-platform and runs on Windows, Linux and MacOS. We applied this framework to the continuous testing of the Image-Guided Surgery Toolkit (IGSTK), an open-source toolkit for image-guided surgery, and have shown that regression testing of tracking devices can be performed at low cost and significantly improves the quality of the software.
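    The regression-testing idea, commanding known poses and checking the tracker's reports against a tolerance, can be sketched as follows; the simulated tracker and all names are hypothetical, and IGSTK's actual API differs:

    ```python
    # Hedged sketch of tracker regression testing: command a known pose, read
    # the tracker, and require agreement within a tolerance. A real harness
    # would drive the robotic arm; here a biased simulator stands in.
    import math

    class SimulatedTracker:
        """Stands in for a real tracking device during automated tests."""
        def __init__(self, error=0.2):
            self.error = error              # fixed measurement bias, in mm
            self.pose = (0.0, 0.0, 0.0)
        def move_to(self, pose):            # the arm moves the tracked tool here
            self.pose = pose
        def read(self):                     # tracker report, offset by the bias
            x, y, z = self.pose
            return (x + self.error, y, z)

    def regression_test(tracker, poses, tol=1.0):
        """True if every reported pose is within `tol` mm of its command."""
        for pose in poses:
            tracker.move_to(pose)
            if math.dist(pose, tracker.read()) > tol:
                return False
        return True

    print(regression_test(SimulatedTracker(), [(0, 0, 0), (10, 20, 5)]))  # True
    ```

    Run on every build, such a check turns tracker accuracy into a pass/fail signal, which is what makes continuous testing of a hardware-facing component practical.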

  1. HydroDesktop as a Community Designed and Developed Resource for Hydrologic Data Discovery and Analysis

    NASA Astrophysics Data System (ADS)

    Ames, D. P.

    2013-12-01

    As has been seen in other informatics fields, well-documented and appropriately licensed open source software tools have the potential to significantly increase both opportunities and motivation for inter-institutional science and technology collaboration. The CUAHSI HIS (and related HydroShare) projects have aimed to foster such activities in hydrology, resulting in the development of many useful community software components, including the HydroDesktop software application. HydroDesktop is an open source, GIS-based, scriptable software application for discovering data on the CUAHSI Hydrologic Information System and related resources. It includes a well-defined plugin architecture and interface that allow third-party developers to create extensions and add new functionality without recompiling the full source code. HydroDesktop is built in the C# programming language and uses the open source DotSpatial GIS engine for spatial data management. Capabilities include data search, discovery, download, visualization, and export. An extension that integrates the R programming language with HydroDesktop provides scripting and data automation capabilities, and an OpenMI plugin provides the ability to link models. Current revisions and updates to HydroDesktop include migration of the core business logic to cross-platform, scriptable Python code modules that can be executed in any operating system or linked into other software front-end applications.
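    The plugin pattern described, extensions registering against a stable interface so the host gains features without recompilation, can be sketched generically; none of the names below are HydroDesktop's actual API:

    ```python
    # Minimal plugin-registry pattern: the host defines a small interface,
    # extensions register classes against it, and loading activates each one.
    # All names here are hypothetical illustrations.
    class Plugin:
        name = "base"
        def activate(self, app):
            raise NotImplementedError

    class PluginHost:
        def __init__(self):
            self.registry = {}
            self.loaded = []
        def register(self, plugin_cls):
            self.registry[plugin_cls.name] = plugin_cls
        def load_all(self, app):
            for cls in self.registry.values():
                cls().activate(app)            # each plugin extends the app state
                self.loaded.append(cls.name)

    class RScriptPlugin(Plugin):               # hypothetical scripting extension
        name = "r-scripting"
        def activate(self, app):
            app.setdefault("commands", []).append("run-r-script")

    host = PluginHost()
    host.register(RScriptPlugin)
    app_state = {}
    host.load_all(app_state)
    print(host.loaded, app_state)  # ['r-scripting'] {'commands': ['run-r-script']}
    ```

    The key property is that the host never imports a plugin by name; it only iterates the registry, so third parties can extend it independently.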

  2. Automatically Preparing Safe SQL Queries

    NASA Astrophysics Data System (ADS)

    Bisht, Prithvi; Sistla, A. Prasad; Venkatakrishnan, V. N.

    We present the first sound program source transformation approach for automatically transforming the code of a legacy web application to employ PREPARE statements in place of unsafe SQL queries. Our approach therefore opens the way for eradicating the SQL injection threat vector from legacy web applications.
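    The target form of such a transformation can be shown with the standard sqlite3 module: the same lookup written unsafely by string concatenation and safely with a parameterized (prepared-style) query. The table and payload are illustrative only:

    ```python
    # Contrast of a legacy string-built query with its prepared counterpart.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
    conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

    user_input = "alice' OR '1'='1"      # classic injection payload

    # Unsafe legacy style: the payload becomes part of the SQL text.
    unsafe = conn.execute(
        "SELECT role FROM users WHERE name = '%s'" % user_input).fetchall()

    # Prepared/parameterized style: the payload is bound as a literal value.
    safe = conn.execute(
        "SELECT role FROM users WHERE name = ?", (user_input,)).fetchall()

    print(unsafe)  # [('admin',)] -- injection matched every row
    print(safe)    # []           -- no user literally has that name
    ```

    The rewrite moves user input out of the query string and into the bound-parameter tuple, which is exactly the structural change a PREPARE-statement transformation performs.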

  3. An open-source java platform for automated reaction mapping.

    PubMed

    Crabtree, John D; Mehta, Dinesh P; Kouri, Tina M

    2010-09-27

    This article presents software applications that have been built upon a modular, open-source, reaction mapping library that can be used in both cheminformatics and bioinformatics research. We first describe the theoretical underpinnings and modular architecture of the core software library. We then describe two applications that have been built upon that core. The first is a generic reaction viewer and mapper, and the second classifies reactions according to rules that can be modified by end users with little or no programming skills.

  4. Openlobby: an open game server for lobby and matchmaking

    NASA Astrophysics Data System (ADS)

    Zamzami, E. M.; Tarigan, J. T.; Jaya, I.; Hardi, S. M.

    2018-03-01

    Online multiplayer is one of the most essential features of modern games. However, while a multiplayer feature can be implemented with simple network programming, creating a balanced multiplayer session requires additional player-management components such as a game lobby and a matchmaking system. Our objective is to develop OpenLobby, a server that is available for other developers to use to support their multiplayer applications. The proposed system acts as a lobby and matchmaker where queueing players are matched to other players according to criteria defined by the developer. The solution provides an application programming interface that developers can use to interact with the server. For testing purposes, we developed a game that uses the server as its multiplayer server.
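    A lobby queue that pairs players under a developer-defined criterion might look like this minimal sketch; the rating window and all names are hypothetical, not OpenLobby's API:

    ```python
    # Toy matchmaker: queued players are paired with the first waiting player
    # whose rating is within a developer-defined window.
    from collections import deque

    def matchmake(queue, max_gap=100):
        """queue: deque of (player, rating). Returns list of matched pairs."""
        matches, waiting = [], deque()
        while queue:
            player, rating = queue.popleft()
            for i, (other, other_rating) in enumerate(waiting):
                if abs(rating - other_rating) <= max_gap:
                    matches.append((other, player))   # pair found: remove from lobby
                    del waiting[i]
                    break
            else:
                waiting.append((player, rating))      # no partner yet: keep waiting
        return matches

    q = deque([("ana", 1500), ("bo", 2100), ("cy", 1550), ("dee", 2050)])
    pairs = matchmake(q)
    print(pairs)  # [('ana', 'cy'), ('bo', 'dee')]
    ```

    A production matchmaker would widen the window over waiting time and handle team sizes; this only shows the queue-and-criterion structure.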

  5. Graduate Medical Education Viewed from the National Intern and Resident Matching Program

    ERIC Educational Resources Information Center

    Graettinger, John S.

    1976-01-01

    The total number of applicants for first-year programs in graduate medical education through the National Intern and Resident Matching Program in 1976 exceeded the number of positions offered for the second consecutive year. There were deficits in the number of openings offered in the primary care specialties and surfeits in medical and surgical…

  6. Ensemble: an Architecture for Mission-Operations Software

    NASA Technical Reports Server (NTRS)

    Norris, Jeffrey; Powell, Mark; Fox, Jason; Rabe, Kenneth; Shu, IHsiang; McCurdy, Michael; Vera, Alonso

    2008-01-01

    Ensemble is the name of an open architecture for, and a methodology for the development of, spacecraft mission operations software. Ensemble is also potentially applicable to the development of non-spacecraft mission-operations-type software. Ensemble capitalizes on the strengths of the open-source Eclipse software and its architecture to address several issues that have arisen repeatedly in the development of mission-operations software: Heretofore, mission-operations application programs have been developed in disparate programming environments and integrated during the final stages of development of missions. The programs have been poorly integrated, and it has been costly to develop, test, and deploy them. Users of each program have been forced to interact with several different graphical user interfaces (GUIs). Also, the strategy typically used in integrating the programs has yielded serial chains of operational software tools of such a nature that during use of a given tool, it has not been possible to gain access to the capabilities afforded by other tools. In contrast, the Ensemble approach offers a low-risk path towards tighter integration of mission-operations software tools.

  7. Ted Kwasnik | NREL

    Science.gov Websites

    Architecture/Implementation of GIS Applications Open Source Programming and Web Development Spatial Analysis and Cartography Research Interests Transportation Systems and Urban Mobility Wind and Solar Resource

  8. AZOrange - High performance open source machine learning for QSAR modeling in a graphical programming environment

    PubMed Central

    2011-01-01

    Background Machine learning has a vast range of applications. In particular, advanced machine learning methods are routinely and increasingly used in quantitative structure activity relationship (QSAR) modeling. QSAR data sets often encompass tens of thousands of compounds and the size of proprietary, as well as public data sets, is rapidly growing. Hence, there is a demand for computationally efficient machine learning algorithms, easily available to researchers without extensive machine learning knowledge. In granting the scientific principles of transparency and reproducibility, Open Source solutions are increasingly acknowledged by regulatory authorities. Thus, an Open Source state-of-the-art high performance machine learning platform, interfacing multiple, customized machine learning algorithms for both graphical programming and scripting, to be used for large scale development of QSAR models of regulatory quality, is of great value to the QSAR community. Results This paper describes the implementation of the Open Source machine learning package AZOrange. AZOrange is specially developed to support batch generation of QSAR models in providing the full work flow of QSAR modeling, from descriptor calculation to automated model building, validation and selection. The automated work flow relies upon the customization of the machine learning algorithms and a generalized, automated model hyper-parameter selection process. Several high performance machine learning algorithms are interfaced for efficient data set specific selection of the statistical method, promoting model accuracy. Using the high performance machine learning algorithms of AZOrange does not require programming knowledge as flexible applications can be created, not only at a scripting level, but also in a graphical programming environment. 
Conclusions AZOrange is a step towards meeting the needs for an Open Source high performance machine learning platform, supporting the efficient development of highly accurate QSAR models fulfilling regulatory requirements. PMID:21798025

  9. AZOrange - High performance open source machine learning for QSAR modeling in a graphical programming environment.

    PubMed

    Stålring, Jonna C; Carlsson, Lars A; Almeida, Pedro; Boyer, Scott

    2011-07-28

    Machine learning has a vast range of applications. In particular, advanced machine learning methods are routinely and increasingly used in quantitative structure activity relationship (QSAR) modeling. QSAR data sets often encompass tens of thousands of compounds and the size of proprietary, as well as public data sets, is rapidly growing. Hence, there is a demand for computationally efficient machine learning algorithms, easily available to researchers without extensive machine learning knowledge. In granting the scientific principles of transparency and reproducibility, Open Source solutions are increasingly acknowledged by regulatory authorities. Thus, an Open Source state-of-the-art high performance machine learning platform, interfacing multiple, customized machine learning algorithms for both graphical programming and scripting, to be used for large scale development of QSAR models of regulatory quality, is of great value to the QSAR community. This paper describes the implementation of the Open Source machine learning package AZOrange. AZOrange is specially developed to support batch generation of QSAR models in providing the full work flow of QSAR modeling, from descriptor calculation to automated model building, validation and selection. The automated work flow relies upon the customization of the machine learning algorithms and a generalized, automated model hyper-parameter selection process. Several high performance machine learning algorithms are interfaced for efficient data set specific selection of the statistical method, promoting model accuracy. Using the high performance machine learning algorithms of AZOrange does not require programming knowledge as flexible applications can be created, not only at a scripting level, but also in a graphical programming environment. 
AZOrange is a step towards meeting the needs for an Open Source high performance machine learning platform, supporting the efficient development of highly accurate QSAR models fulfilling regulatory requirements.

  10. 48 CFR 22.1014 - Delay over 60 days in bid opening or commencement of work.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Delay over 60 days in bid... ACQUISITION REGULATION SOCIOECONOMIC PROGRAMS APPLICATION OF LABOR LAWS TO GOVERNMENT ACQUISITIONS Service Contract Act of 1965, as Amended 22.1014 Delay over 60 days in bid opening or commencement of work. If a...

  11. Mission Systems Open Architecture Science and Technology (MOAST) program

    NASA Astrophysics Data System (ADS)

    Littlejohn, Kenneth; Rajabian-Schwart, Vahid; Kovach, Nicholas; Satterthwaite, Charles P.

    2017-04-01

    The Mission Systems Open Architecture Science and Technology (MOAST) program is an AFRL effort that is developing and demonstrating Open System Architecture (OSA) component prototypes, along with methods and tools, to strategically evolve current OSA standards and technical approaches, promote affordable capability evolution, reduce integration risk, and address emerging challenges [1]. Within the context of open architectures, the program is conducting advanced research and concept development in the following areas: (1) Evolution of standards; (2) Cyber-resiliency; (3) Emerging concepts and technologies; (4) Risk reduction studies and experimentation; and (5) Advanced technology demonstrations. Current research includes the development of methods, tools, and techniques to characterize the performance of OMS data interconnection methods for representative mission system applications; of particular interest are the OMS Critical Abstraction Layer (CAL), the Avionics Service Bus (ASB), and the Bulk Data Transfer interconnects. The program is also developing and demonstrating cybersecurity countermeasure techniques to detect and mitigate cyberattacks against open-architecture-based mission systems and to ensure continued mission operations. The focus is on cybersecurity techniques that augment traditional cybersecurity controls and those currently defined within the Open Mission Systems (OMS) and UCI standards. AFRL is also developing code generation tools and simulation tools to support evaluation and experimentation of OSA-compliant implementations.

  12. HTSstation: a web application and open-access libraries for high-throughput sequencing data analysis.

    PubMed

    David, Fabrice P A; Delafontaine, Julien; Carat, Solenne; Ross, Frederick J; Lefebvre, Gregory; Jarosz, Yohan; Sinclair, Lucas; Noordermeer, Daan; Rougemont, Jacques; Leleu, Marion

    2014-01-01

    The HTSstation analysis portal is a suite of simple web forms coupled to modular analysis pipelines for various applications of High-Throughput Sequencing including ChIP-seq, RNA-seq, 4C-seq and re-sequencing. HTSstation offers biologists the possibility to rapidly investigate their HTS data using an intuitive web application with heuristically pre-defined parameters. A number of open-source software components have been implemented and can be used to build, configure and run HTS analysis pipelines reactively. Besides, our programming framework empowers developers with the possibility to design their own workflows and integrate additional third-party software. The HTSstation web application is accessible at http://htsstation.epfl.ch.
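    The modular-pipeline idea, named analysis steps chained in order with room for third-party additions, can be sketched generically; the stages below are toy stand-ins, not HTSstation's actual modules:

    ```python
    # Minimal composable-pipeline pattern: steps are plain functions chained
    # in order, so a developer can insert a third-party step anywhere.
    def pipeline(*steps):
        def run(data):
            for step in steps:
                data = step(data)          # output of each step feeds the next
            return data
        return run

    def demultiplex(reads):                # toy stand-ins for real HTS stages
        return [r for r in reads if r.startswith("@")]

    def trim(reads):
        return [r[:5] for r in reads]

    chipseq = pipeline(demultiplex, trim)  # a hypothetical workflow definition
    print(chipseq(["@read1ACGT", "bad", "@read2TTGA"]))  # ['@read', '@read']
    ```

    Because a workflow is just an ordered list of callables, integrating additional software reduces to wrapping it as one more step.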

  13. HTSstation: A Web Application and Open-Access Libraries for High-Throughput Sequencing Data Analysis

    PubMed Central

    David, Fabrice P. A.; Delafontaine, Julien; Carat, Solenne; Ross, Frederick J.; Lefebvre, Gregory; Jarosz, Yohan; Sinclair, Lucas; Noordermeer, Daan; Rougemont, Jacques; Leleu, Marion

    2014-01-01

    The HTSstation analysis portal is a suite of simple web forms coupled to modular analysis pipelines for various applications of High-Throughput Sequencing including ChIP-seq, RNA-seq, 4C-seq and re-sequencing. HTSstation offers biologists the possibility to rapidly investigate their HTS data using an intuitive web application with heuristically pre-defined parameters. A number of open-source software components have been implemented and can be used to build, configure and run HTS analysis pipelines reactively. Besides, our programming framework empowers developers with the possibility to design their own workflows and integrate additional third-party software. The HTSstation web application is accessible at http://htsstation.epfl.ch. PMID:24475057

  14. Automatic Generation of Directive-Based Parallel Programs for Shared Memory Parallel Systems

    NASA Technical Reports Server (NTRS)

    Jin, Hao-Qiang; Yan, Jerry; Frumkin, Michael

    2000-01-01

    The shared-memory programming model is a very effective way to achieve parallelism on shared memory parallel computers. As great progress has been made in hardware and software technologies, the performance of parallel programs written with compiler directives has improved substantially. The introduction of OpenMP directives, the industry standard for shared-memory programming, has minimized the issue of portability. Owing to its ease of programming and its good performance, the technique has become very popular. In this study, we have extended CAPTools, a computer-aided parallelization toolkit, to automatically generate directive-based OpenMP parallel programs. We outline the techniques used in the implementation of the tool and present test results on the NAS Parallel Benchmarks and ARC3D, a CFD application. This work demonstrates the great potential of using computer-aided tools to quickly port parallel programs while achieving good performance.

  15. Methods/Labor Standards Application Program - Phase IV

    DTIC Science & Technology

    1985-01-01

    Annual maintenance items for the Gantry #3 and Gantry #4 engine platforms: a. pressure switch; b. compressor motor; c. voltage regulator; d. open and clean generator exciter and main windings. Main collector and travel motors: open and clean motors; slip rings.

  16. If Minicomputers Are the Answer, What Was the Question?

    ERIC Educational Resources Information Center

    GRI Computer Corp., Newton, MA.

    The availability of low-cost minicomputers in the last few years has opened up many new control and special purpose applications for computers. However, using general purpose computers for these specialized applications often leads to inefficiencies in programming and operation. GRI Computer Corporation has developed a common-sense approach called…

  17. Factors influencing the number of applications submitted per applicant to orthopedic residency programs

    PubMed Central

    Finkler, Elissa S.; Fogel, Harold A.; Kroin, Ellen; Kliethermes, Stephanie; Wu, Karen; Nystrom, Lukas M.; Schiff, Adam P.

    2016-01-01

    Background From 2002 to 2014, the orthopedic surgery residency applicant pool increased by 25% while the number of applications submitted per applicant rose by 69%, resulting in an increase of 109% in the number of applications received per program. Objective This study aimed to identify applicant factors associated with an increased number of applications to orthopedic surgery residency programs. Design An anonymous survey was sent to all applicants applying to the orthopedic surgery residency program at Loyola University. Questions were designed to define the number of applications submitted per respondent as well as the strength of their application. Of 733 surveys sent, 140 (19.1%) responses were received. Setting An academic institution in Maywood, IL. Participants Fourth-year medical students applying to the orthopedic surgery residency program at Loyola University. Results An applicant's perception of how competitive he or she was (applicants who rated themselves as ‘average’ submitted more applications than those who rated themselves as either ‘good’ or ‘outstanding’, p=0.001) and the number of away rotations (those who completed >2 away rotations submitted more applications, p=0.03) were significantly associated with an increased number of applications submitted. No other responses were found to be associated with an increased number of applications submitted. Conclusion Less qualified candidates are not applying to significantly more programs than their more qualified counterparts. The increasing number of applications represents a financial strain on the applicant, given the costs required to apply to more programs, and a time burden on individual programs to screen increasing numbers of applicants. In order to stabilize or reverse this alarming trend, orthopedic surgery residency programs should openly disclose admission criteria to prospective candidates, and medical schools should provide additional guidance for candidates in this process. 
PMID:27448634

  18. scikit-image: image processing in Python.

    PubMed

    van der Walt, Stéfan; Schönberger, Johannes L; Nunez-Iglesias, Juan; Boulogne, François; Warner, Joshua D; Yager, Neil; Gouillart, Emmanuelle; Yu, Tony

    2014-01-01

    scikit-image is an image processing library that implements algorithms and utilities for use in research, education and industry applications. It is released under the liberal Modified BSD open source license, provides a well-documented API in the Python programming language, and is developed by an active, international team of collaborators. In this paper we highlight the advantages of open source to achieve the goals of the scikit-image library, and we showcase several real-world image processing applications that use scikit-image. More information can be found on the project homepage, http://scikit-image.org.
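The kind of routine scikit-image packages behind its API can be illustrated with a plain-NumPy sketch. The function below re-implements the idea of Otsu thresholding (whose real counterpart is skimage.filters.threshold_otsu); it is a self-contained illustration, not the library's code.

```python
import numpy as np

def otsu_threshold(image):
    """Return the grey level that maximises between-class variance."""
    hist, _ = np.histogram(image, bins=256, range=(0, 256))
    hist = hist.astype(float)
    total = hist.sum()
    mu_total = (np.arange(256) * hist).sum() / total
    best_t, best_var = 0, -1.0
    cum_w, cum_mu = 0.0, 0.0
    for t in range(256):
        cum_w += hist[t]
        cum_mu += t * hist[t]
        if cum_w == 0 or cum_w == total:
            continue
        w0 = cum_w / total
        mu0 = cum_mu / cum_w
        mu1 = (mu_total * total - cum_mu) / (total - cum_w)
        var_between = w0 * (1 - w0) * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Two well-separated grey populations: the threshold falls between them.
img = np.concatenate([np.full(100, 40), np.full(100, 200)]).reshape(10, 20)
t = otsu_threshold(img)
binary = img > t
```

In practice one would simply call the library (`from skimage.filters import threshold_otsu`); the sketch only shows what such a well-documented API hides.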

  19. 76 FR 30368 - Announcement of the Publication of Funding Opportunity Announcements under the Runaway and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-25

    .../index.html ) on 4/25/2011: Funding opportunity number Application Funding opportunity title (FON) Access... grants/open/foa/view/HHS- 2011-ACF-ACYF-CY-0166. Street Outreach Program HHS-2011-ACF-ACYF-YO-0168. http://www.acf.hhs.gov/ 6/24/2011 grants/open/foa/view/HHS- 2011-ACF-ACYF-YO-0168. [[Page 30369

  20. Creating Open Education Resources for Teaching and Community Development through Action Research: The Milk Production and Hygiene Module

    ERIC Educational Resources Information Center

    Ssajjakambwe, Paul; Setumba, Christopher; Kisaka, Stevens; Bahizi, Gloria; Vudriko, Patrick; Kabasa, John D.; Kaneene, John B.

    2013-01-01

    One of the cornerstones of the AgShare program is the application of an information loop of action research in the training of graduate students to generate new and practical educational materials and interventions for creating open education research (OER) modules for teaching at universities, and for designing interventions and training…

  1. 75 FR 6729 - Meeting of National Council on the Humanities

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-10

    ... Applications and Programs Before the Council 10:30 a.m. until Adjourned Digital Humanities--Room M-07 Education...: Committee Meetings (Open to the Public) Policy Discussion 9-10:30 a.m. Digital Humanities--Room M-07 Education Programs and Federal/State Partnership--Room 510A Preservation and Access--Room 415 Public...

  2. The Uneven March toward Social Justice: Diversity, Conflict, and Complexity in Educational Administration Programs

    ERIC Educational Resources Information Center

    McClellan, Rhonda; Dominguez, Ramon

    2006-01-01

    Purpose: This paper aims to provide a framework for the development and implementation of educational administration programs that encourage practitioners and educational administration faculty to push application and preparation beyond reproducing tendencies of the status quo as well as to open education to the potential of embracing silenced or…

  3. Migration of a telehealth program to a e-education health program

    NASA Astrophysics Data System (ADS)

    Gomez, A.; Montano, L. F.; Amaro, L.; Aleman, B.

    This paper presents the results of nine years of experience with Telehealth in Mexico, within a national program at one public health institution. A retrospective analysis, together with a prospective analysis of future applications, was carried out, emphasizing the specification of the characteristics of the application sites and the impact measures Cost/Opportunity, Cost/Benefit, and Cost/Efficiency, anticipating investment in and reorganization of the network where appropriate, and situating distance medical care beyond institutional technology platforms. A range of possibilities is now open to e-education programs that support preventive medicine, self-care, and distance medical education at all levels of care, extending coverage not only to doctors, paramedics, and nurses but also to the general population, making it more equitable and reaching minorities such as rural populations, people with disabilities, and indigenous populations, above all in developing countries, and identifying impact measures for evaluating the training given to doctors, teachers, students, and the open population. A Latin American E-Education Network for Health is also proposed.

  4. Leveraging Web Services in Providing Efficient Discovery, Retrieval, and Integration of NASA-Sponsored Observations and Predictions

    NASA Astrophysics Data System (ADS)

    Bambacus, M.; Alameh, N.; Cole, M.

    2006-12-01

    The Applied Sciences Program at NASA focuses on extending the results of NASA's Earth-Sun system science research beyond the science and research communities to contribute to national priority applications with societal benefits. By employing a systems engineering approach, supporting interoperable data discovery and access, and developing partnerships with federal agencies and national organizations, the Applied Sciences Program facilitates the transition from research to operations in national applications. In particular, the Applied Sciences Program identifies twelve national applications, listed at http://science.hq.nasa.gov/earth-sun/applications/, which can be best served by the results of NASA aerospace research and development of science and technologies. The ability to use and integrate NASA data and science results into these national applications results in enhanced decision support and significant socio-economic benefits for each of the applications. This paper focuses on leveraging the power of interoperability and specifically open standard interfaces in providing efficient discovery, retrieval, and integration of NASA's science research results. Interoperability (the ability to access multiple, heterogeneous geoprocessing environments, either local or remote by means of open and standard software interfaces) can significantly increase the value of NASA-related data by increasing the opportunities to discover, access and integrate that data in the twelve identified national applications (particularly in non-traditional settings). Furthermore, access to data, observations, and analytical models from diverse sources can facilitate interdisciplinary and exploratory research and analysis. To streamline this process, the NASA GeoSciences Interoperability Office (GIO) is developing the NASA Earth-Sun System Gateway (ESG) to enable access to remote geospatial data, imagery, models, and visualizations through open, standard web protocols. 
The gateway (online at http://esg.gsfc.nasa.gov) acts as a flexible and searchable registry of NASA-related resources (files, services, models, etc) and allows scientists, decision makers and others to discover and retrieve a wide variety of observations and predictions of natural and human phenomena related to Earth Science from NASA and other sources. To support the goals of the Applied Sciences national applications, GIO staff is also working with the national applications communities to identify opportunities where open standards-based discovery and access to NASA data can enhance the decision support process of the national applications. This paper describes the work performed to-date on that front, and summarizes key findings in terms of identified data sources and benefiting national applications. The paper also highlights the challenges encountered in making NASA-related data accessible in a cross-cutting fashion and identifies areas where interoperable approaches can be leveraged.

  5. T-Check in Technologies for Interoperability: Web Services and Security--Single Sign-On

    DTIC Science & Technology

    2007-12-01

    following tools: • Apache Tomcat 6.0—a Java Servlet container to host the Web services and a simple Web client application [Apache 2007a] • Apache Axis...Eclipse. Eclipse – an open development platform. http://www.eclipse.org/ (2007) [Hunter 2001] Hunter, Jason. Java Servlet Programming, 2nd Edition...Citation SAML 1.1 Java Toolkit SAML Ping Identity’s SAML-1.1 implementation [SourceID 2006] OpenSAML SAML An open source implementation of SAML 1.1

  6. GiPSi:a framework for open source/open architecture software development for organ-level surgical simulation.

    PubMed

    Cavuşoğlu, M Cenk; Göktekin, Tolga G; Tendick, Frank

    2006-04-01

    This paper presents the architectural details of an evolving open source/open architecture software framework for developing organ-level surgical simulations. Our goal is to facilitate shared development of reusable models, to accommodate heterogeneous models of computation, and to provide a framework for interfacing multiple heterogeneous models. The framework provides an application programming interface for interfacing dynamic models defined over spatial domains. It is specifically designed to be independent of the specifics of the modeling methods used, and therefore facilitates seamless integration of heterogeneous models and processes. Furthermore, each model has separate geometries for visualization, simulation, and interfacing, allowing the model developer to choose the most natural geometric representation for each case. Input/output interfaces for visualization and haptics for real-time interactive applications have also been provided.
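The core idea — each dynamic model exposing separate geometries for visualization, simulation, and interfacing — can be sketched in Python. This is not GiPSi's actual API (which is C++-based); the class names and the `role` strings are illustrative assumptions.

```python
from abc import ABC, abstractmethod

class DynamicModel(ABC):
    """Hypothetical model interface: step the dynamics, expose geometries."""
    @abstractmethod
    def step(self, dt): ...
    @abstractmethod
    def geometry(self, role):
        """role is 'visualization', 'simulation', or 'interface'."""

class MassSpringTissue(DynamicModel):
    def __init__(self):
        self.pos, self.vel = 0.0, 0.0   # one degree of freedom, for brevity
        self.k, self.m = 4.0, 1.0
    def step(self, dt):
        acc = -self.k / self.m * self.pos
        self.vel += acc * dt
        self.pos += self.vel * dt
    def geometry(self, role):
        # A real framework could return a different mesh per role;
        # here the same state is tagged with the requested role.
        return {"role": role, "nodes": [self.pos]}

model = MassSpringTissue()
model.pos = 1.0
for _ in range(10):
    model.step(0.01)
vis = model.geometry("visualization")
```

The point of the separation is that a renderer, a solver, and a haptics interface can each request the representation most natural to them without knowing the model's internals.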

  7. Optics Program Modified for Multithreaded Parallel Computing

    NASA Technical Reports Server (NTRS)

    Lou, John; Bedding, Dave; Basinger, Scott

    2006-01-01

    A powerful high-performance computer program for simulating and analyzing adaptive and controlled optical systems has been developed by modifying the serial version of the Modeling and Analysis for Controlled Optical Systems (MACOS) program to impart capabilities for multithreaded parallel processing on computing systems ranging from supercomputers down to Symmetric Multiprocessing (SMP) personal computers. The modifications included the incorporation of OpenMP, a portable and widely supported application programming interface that can be used to explicitly add multithreaded parallelism to an application program under a shared-memory programming model. OpenMP was applied to parallelize ray-tracing calculations, one of the major computing components in MACOS. Multithreading is also used in the diffraction propagation of light in MACOS, based on pthreads (POSIX Threads, where "POSIX" signifies a portable operating system interface for UNIX). In tests of the parallelized version of MACOS, the speedup in ray-tracing calculations was found to be linear, or proportional to the number of processors, while the speedup in diffraction calculations ranged from 50 to 60 percent, depending on the type and number of processors. The parallelized version of MACOS is portable, and, to the user, its interface is basically the same as that of the original serial version of MACOS.
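The property OpenMP exploits here — rays that are mutually independent, so the loop over them parallelizes cleanly — can be shown with an analogous Python sketch (not MACOS code; the per-ray workload is a stand-in):

```python
from concurrent.futures import ThreadPoolExecutor

def trace_ray(ray_id):
    # Stand-in for an optical path computation: deterministic per-ray work.
    acc = 0.0
    for i in range(1, 200):
        acc += (ray_id % 7 + 1) / i
    return acc

rays = list(range(64))
# Parallel map over independent rays, analogous to an OpenMP parallel-for.
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel = list(pool.map(trace_ray, rays))
# The serial loop produces identical results, which is what makes the
# parallelization safe in the first place.
serial = [trace_ray(r) for r in rays]
```

In the Fortran/C source the same transformation is a single `!$omp parallel do` (or `#pragma omp parallel for`) directive over the ray loop.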

  8. Cloud based, Open Source Software Application for Mitigating Herbicide Drift

    NASA Astrophysics Data System (ADS)

    Saraswat, D.; Scott, B.

    2014-12-01

    The spread of herbicide-resistant weeds has resulted in the need for clearly marked fields. In response to this need, the University of Arkansas Cooperative Extension Service launched a program named Flag the Technology in 2011. This program uses color-coded flags as a visual alert of the herbicide-trait technology within a farm field. The flag-based program also serves to help avoid herbicide misapplication and prevent herbicide drift damage between fields with differing crop technologies. The program has been endorsed by the Southern Weed Science Society of America and is attracting interest from across the USA, Canada, and Australia. However, flags risk being misplaced or disappearing due to mischief or severe windstorms and thunderstorms. This presentation will discuss the design and development of a free, cloud-based application built with open-source technologies, called Flag the Technology Cloud (FTTCloud), that allows agricultural stakeholders to color-code their farm fields to indicate herbicide-resistant technologies. The software uses modern web development practices, widely used design technologies, and basic geographic information system (GIS) based interactive interfaces for representing, color-coding, searching, and visualizing fields. The program is also compatible with devices of different sizes: smartphones, tablets, desktops, and laptops.
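At its core such an application needs a lookup from herbicide-trait technology to flag colour. The colour assignments below only loosely follow the published Flag the Technology scheme and should be treated as illustrative, not authoritative:

```python
# Illustrative technology-to-colour table (hedged: not the official chart).
FLAG_COLORS = {
    "Roundup Ready": "white",
    "LibertyLink": "bright green",
    "Clearfield": "teal",
    "conventional": "red",
}

def flags_for_fields(fields):
    """Map each field's herbicide-trait technology to its flag colour."""
    return {name: FLAG_COLORS.get(tech, "unknown")
            for name, tech in fields.items()}

flags = flags_for_fields({"North 40": "LibertyLink",
                          "Creek field": "conventional"})
```

The cloud application layers GIS polygons and search on top of exactly this mapping, so neighbours can see at a glance which technology a field carries.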

  9. Norms and Standards for Computer Education (MCA, BCA) through Distance Mode.

    ERIC Educational Resources Information Center

    Rausaria, R.R., Ed.; Lele, Nalini A., Ed.; Bhushan, Bharat, Ed.

    This document presents the norms and standards for computer education in India through distance mode, including the Masters in Computer Applications (MCA) and Bachelor in Computer Applications (BCA) programs. These norms and standards were considered and approved by the Distance Education Council, Indira Gandhi National Open University (India), at…

  10. Performance Analysis of a Hybrid Overset Multi-Block Application on Multiple Architectures

    NASA Technical Reports Server (NTRS)

    Djomehri, M. Jahed; Biswas, Rupak

    2003-01-01

    This paper presents a detailed performance analysis of a multi-block overset grid computational fluid dynamics application on multiple state-of-the-art computer architectures. The application is implemented using a hybrid MPI+OpenMP programming paradigm that exploits both coarse- and fine-grain parallelism; the former via MPI message passing and the latter via OpenMP directives. The hybrid model also extends the applicability of multi-block programs to large clusters of SMP nodes by overcoming the restriction that the number of processors be less than the number of grid blocks. A key kernel of the application, namely the LU-SGS linear solver, had to be modified to enhance the performance of the hybrid approach on the target machines. Investigations were conducted on cacheless Cray SX6 vector processors, cache-based IBM Power3 and Power4 architectures, and single-system-image SGI Origin3000 platforms. Overall results for complex vortex dynamics simulations demonstrate that the SX6 achieves the highest performance and outperforms the RISC-based architectures; however, the best scaling performance was achieved on the Power3.
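The two-level structure can be sketched in Python (illustrative only; the real code distributes blocks over MPI ranks and cells over OpenMP threads): an outer pool hands out whole grid blocks, and each block's cells are processed by an inner pool.

```python
from concurrent.futures import ThreadPoolExecutor

def relax_cell(value):
    return 0.5 * value          # stand-in for a fine-grain cell update

def process_block(block):
    # Fine-grain level (the OpenMP analogue): parallel over cells.
    with ThreadPoolExecutor(max_workers=2) as fine:
        return list(fine.map(relax_cell, block))

blocks = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
# Coarse-grain level (the MPI analogue): parallel over grid blocks.
with ThreadPoolExecutor(max_workers=3) as coarse:
    result = list(coarse.map(process_block, blocks))
```

Because the fine level subdivides each block, the total worker count can exceed the block count — the restriction the hybrid model removes.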

  11. 12 CFR 268.701 - Purpose and application.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... conducted by the Board. Such programs and activities include: (i) Holding open meetings of the Board or... library facilities; and (iv) Any other lawful interaction with the Board or its staff in any official...

  12. 12 CFR 268.701 - Purpose and application.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... conducted by the Board. Such programs and activities include: (i) Holding open meetings of the Board or... library facilities; and (iv) Any other lawful interaction with the Board or its staff in any official...

  13. 76 FR 744 - Community Forest and Open Space Conservation Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-06

    ... INFORMATION CONTACT: Maya Solomon, U.S. Forest Service, State and Private Forestry, Cooperative Forestry, (202...) Research; (4) Existing liens or taxes owed; and (5) Costs associated with preparation of the application...

  14. Collaborative development of predictive toxicology applications

    PubMed Central

    2010-01-01

    OpenTox provides an interoperable, standards-based Framework for the support of predictive toxicology data management, algorithms, modelling, validation and reporting. It is relevant to satisfying the chemical safety assessment requirements of the REACH legislation as it supports access to experimental data, (Quantitative) Structure-Activity Relationship models, and toxicological information through an integrating platform that adheres to regulatory requirements and OECD validation principles. Initial research defined the essential components of the Framework including the approach to data access, schema and management, use of controlled vocabularies and ontologies, architecture, web service and communications protocols, and selection and integration of algorithms for predictive modelling. OpenTox provides end-user oriented tools to non-computational specialists, risk assessors, and toxicological experts in addition to Application Programming Interfaces (APIs) for developers of new applications. OpenTox actively supports public standards for data representation, interfaces, vocabularies and ontologies, Open Source approaches to core platform components, and community-based collaboration approaches, so as to progress system interoperability goals. The OpenTox Framework includes APIs and services for compounds, datasets, features, algorithms, models, ontologies, tasks, validation, and reporting which may be combined into multiple applications satisfying a variety of different user needs. OpenTox applications are based on a set of distributed, interoperable OpenTox API-compliant REST web services. The OpenTox approach to ontology allows for efficient mapping of complementary data coming from different datasets into a unifying structure having a shared terminology and representation. 
Two initial OpenTox applications are presented as an illustration of the potential impact of OpenTox for high-quality and consistent structure-activity relationship modelling of REACH-relevant endpoints: ToxPredict which predicts and reports on toxicities for endpoints for an input chemical structure, and ToxCreate which builds and validates a predictive toxicity model based on an input toxicology dataset. Because of the extensible nature of the standardised Framework design, barriers of interoperability between applications and content are removed, as the user may combine data, models and validation from multiple sources in a dependable and time-effective way. PMID:20807436
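The "API-compliant REST web services" phrasing above translates into ordinary HTTP client code. The sketch below only assembles (and does not send) a request; the service root and the `/model/{id}` route are hypothetical stand-ins for OpenTox-style endpoints, not the framework's documented routes.

```python
import json
import urllib.request

BASE = "https://example.org/opentox"   # hypothetical service root

def build_prediction_request(model_id, smiles):
    """Assemble (but do not send) a REST request for a model prediction."""
    payload = json.dumps({"compound": {"smiles": smiles}}).encode()
    return urllib.request.Request(
        url=f"{BASE}/model/{model_id}",
        data=payload,
        headers={"Content-Type": "application/json",
                 "Accept": "application/json"},
        method="POST",
    )

req = build_prediction_request("42", "c1ccccc1")
```

The interoperability claim is precisely that any compliant client built this way can combine compound, model, and validation services from different providers.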

  15. Collaborative development of predictive toxicology applications.

    PubMed

    Hardy, Barry; Douglas, Nicki; Helma, Christoph; Rautenberg, Micha; Jeliazkova, Nina; Jeliazkov, Vedrin; Nikolova, Ivelina; Benigni, Romualdo; Tcheremenskaia, Olga; Kramer, Stefan; Girschick, Tobias; Buchwald, Fabian; Wicker, Joerg; Karwath, Andreas; Gütlein, Martin; Maunz, Andreas; Sarimveis, Haralambos; Melagraki, Georgia; Afantitis, Antreas; Sopasakis, Pantelis; Gallagher, David; Poroikov, Vladimir; Filimonov, Dmitry; Zakharov, Alexey; Lagunin, Alexey; Gloriozova, Tatyana; Novikov, Sergey; Skvortsova, Natalia; Druzhilovsky, Dmitry; Chawla, Sunil; Ghosh, Indira; Ray, Surajit; Patel, Hitesh; Escher, Sylvia

    2010-08-31

    OpenTox provides an interoperable, standards-based Framework for the support of predictive toxicology data management, algorithms, modelling, validation and reporting. It is relevant to satisfying the chemical safety assessment requirements of the REACH legislation as it supports access to experimental data, (Quantitative) Structure-Activity Relationship models, and toxicological information through an integrating platform that adheres to regulatory requirements and OECD validation principles. Initial research defined the essential components of the Framework including the approach to data access, schema and management, use of controlled vocabularies and ontologies, architecture, web service and communications protocols, and selection and integration of algorithms for predictive modelling. OpenTox provides end-user oriented tools to non-computational specialists, risk assessors, and toxicological experts in addition to Application Programming Interfaces (APIs) for developers of new applications. OpenTox actively supports public standards for data representation, interfaces, vocabularies and ontologies, Open Source approaches to core platform components, and community-based collaboration approaches, so as to progress system interoperability goals. The OpenTox Framework includes APIs and services for compounds, datasets, features, algorithms, models, ontologies, tasks, validation, and reporting which may be combined into multiple applications satisfying a variety of different user needs. OpenTox applications are based on a set of distributed, interoperable OpenTox API-compliant REST web services. 
    The OpenTox approach to ontology allows for efficient mapping of complementary data coming from different datasets into a unifying structure having a shared terminology and representation. Two initial OpenTox applications are presented as an illustration of the potential impact of OpenTox for high-quality and consistent structure-activity relationship modelling of REACH-relevant endpoints: ToxPredict which predicts and reports on toxicities for endpoints for an input chemical structure, and ToxCreate which builds and validates a predictive toxicity model based on an input toxicology dataset. Because of the extensible nature of the standardised Framework design, barriers of interoperability between applications and content are removed, as the user may combine data, models and validation from multiple sources in a dependable and time-effective way.

  16. Using Simulation to Understand Consistency in Treatment Effects: An Application to School Choice

    ERIC Educational Resources Information Center

    Maroulis, Spiro; Bakshy, Eytan; Gomez, Louis; Wilensky, Uri

    2012-01-01

    The authors examine the sensitivity of school choice treatment effects--defined as the difference between participants and non-participants in open enrollment programs--to differences in (i) the underlying student/household preferences of a school district, and (ii) the program participation rates of the district. Data detailed and broad…

  17. Mental Skills for Sport and Life.

    ERIC Educational Resources Information Center

    Unestahl, Lars-Eric

    The acquisition and the application of mental skills in sports and non-sport settings are discussed in this paper. The paper opens with an overview of the situation in Sweden; it is noted that 25% of the Swedish population have used mental training programs and that in the future all Swedes will have experienced such programs, since basic mental…

  18. Stan: A Probabilistic Programming Language for Bayesian Inference and Optimization

    ERIC Educational Resources Information Center

    Gelman, Andrew; Lee, Daniel; Guo, Jiqiang

    2015-01-01

    Stan is a free and open-source C++ program that performs Bayesian inference or optimization for arbitrary user-specified models and can be called from the command line, R, Python, Matlab, or Julia and has great promise for fitting large and complex statistical models in many areas of application. We discuss Stan from users' and developers'…

  19. scikit-image: image processing in Python

    PubMed Central

    van der Walt, Stéfan; Schönberger, Johannes L.; Nunez-Iglesias, Juan; Boulogne, François; Warner, Joshua D.; Yager, Neil; Gouillart, Emmanuelle; Yu, Tony

    2014-01-01

    scikit-image is an image processing library that implements algorithms and utilities for use in research, education and industry applications. It is released under the liberal Modified BSD open source license, provides a well-documented API in the Python programming language, and is developed by an active, international team of collaborators. In this paper we highlight the advantages of open source to achieve the goals of the scikit-image library, and we showcase several real-world image processing applications that use scikit-image. More information can be found on the project homepage, http://scikit-image.org. PMID:25024921

  20. Building model analysis applications with the Joint Universal Parameter IdenTification and Evaluation of Reliability (JUPITER) API

    USGS Publications Warehouse

    Banta, E.R.; Hill, M.C.; Poeter, E.; Doherty, J.E.; Babendreier, J.

    2008-01-01

    The open-source, public domain JUPITER (Joint Universal Parameter IdenTification and Evaluation of Reliability) API (Application Programming Interface) provides conventions and Fortran-90 modules to develop applications (computer programs) for analyzing process models. The input and output conventions allow application users to access various applications and the analysis methods they embody with a minimum of time and effort. Process models simulate, for example, physical, chemical, and (or) biological systems of interest using phenomenological, theoretical, or heuristic approaches. The types of model analyses supported by the JUPITER API include, but are not limited to, sensitivity analysis, data needs assessment, calibration, uncertainty analysis, model discrimination, and optimization. The advantages provided by the JUPITER API for users and programmers allow for rapid programming and testing of new ideas. Application-specific coding can be in languages other than the Fortran-90 of the API. This article briefly describes the capabilities and utility of the JUPITER API, lists existing applications, and uses UCODE_2005 as an example.
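One analysis type the text lists, sensitivity analysis, reduces to perturbing each parameter of a process model and recording the finite-difference response. The sketch below is in Python rather than the API's Fortran-90, and the `process_model` function is a made-up stand-in:

```python
def process_model(params):
    # Stand-in "process model": any function of named parameters.
    k, n = params["k"], params["n"]
    return k * n ** 2

def sensitivities(model, params, rel_step=1e-6):
    """Forward-difference sensitivity of the model to each parameter."""
    base = model(params)
    out = {}
    for name, value in params.items():
        step = rel_step * max(abs(value), 1.0)
        bumped = dict(params, **{name: value + step})
        out[name] = (model(bumped) - base) / step
    return out

s = sensitivities(process_model, {"k": 2.0, "n": 3.0})
```

The value of an API like JUPITER is that the input/output conventions around exactly this kind of loop are standardized, so one model definition can feed calibration, uncertainty analysis, and data-needs assessment alike.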

  1. OpenVigil FDA - Inspection of U.S. American Adverse Drug Events Pharmacovigilance Data and Novel Clinical Applications.

    PubMed

    Böhm, Ruwen; von Hehn, Leocadie; Herdegen, Thomas; Klein, Hans-Joachim; Bruhn, Oliver; Petri, Holger; Höcker, Jan

    2016-01-01

    Pharmacovigilance contributes to health care. However, direct access to the underlying data for academic institutions and individual physicians or pharmacists is intricate, and easily employable analysis modes for everyday clinical situations are missing. This underlines the need for a tool to bring pharmacovigilance to the clinics. To address these issues, we have developed OpenVigil FDA, a novel web-based pharmacovigilance analysis tool which uses the openFDA online interface of the Food and Drug Administration (FDA) to access U.S. American and international pharmacovigilance data from the Adverse Event Reporting System (AERS). OpenVigil FDA provides disproportionality analyses to (i) identify the drug most likely evoking a new adverse event, (ii) compare two drugs concerning their safety profile, (iii) check arbitrary combinations of two drugs for unknown drug-drug interactions and (iv) enhance the relevance of results by identifying confounding factors and eliminating them using background correction. We present examples for these applications and discuss the promises and limits of pharmacovigilance, openFDA and OpenVigil FDA. OpenVigil FDA is the first publicly available tool to apply pharmacovigilance findings directly to real-life clinical problems. OpenVigil FDA does not require special licenses or statistical programs.
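The disproportionality analyses mentioned above boil down to 2x2 contingency tables over report counts. This is a generic reporting-odds-ratio (ROR) sketch with illustrative counts, not OpenVigil FDA's own code:

```python
import math

def reporting_odds_ratio(a, b, c, d):
    """a: drug+event, b: drug+other events, c: other drugs+event, d: rest."""
    ror = (a / b) / (c / d)
    # 95% confidence interval, computed on the log scale.
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    ci_low = math.exp(math.log(ror) - 1.96 * se)
    ci_high = math.exp(math.log(ror) + 1.96 * se)
    return ror, ci_low, ci_high

# Illustrative counts: 40 reports pair the drug with the event of interest.
ror, ci_low, ci_high = reporting_odds_ratio(40, 960, 120, 8880)
signal = ci_low > 1.0   # a common screening criterion for a safety signal
```

Tools like OpenVigil layer query building and background correction on top of this arithmetic; the criterion used here (lower confidence bound above 1) is one conventional screening rule among several.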

  2. Kokkos: Enabling manycore performance portability through polymorphic memory access patterns

    DOE PAGES

    Carter Edwards, H.; Trott, Christian R.; Sunderland, Daniel

    2014-07-22

    The manycore revolution can be characterized by increasing thread counts, decreasing memory per thread, and diversity of continually evolving manycore architectures. High performance computing (HPC) applications and libraries must exploit increasingly finer levels of parallelism within their codes to sustain scalability on these devices. We found that a major obstacle to performance portability is the diverse and conflicting set of constraints on memory access patterns across devices. Contemporary portable programming models address manycore parallelism (e.g., OpenMP, OpenACC, OpenCL) but fail to address memory access patterns. The Kokkos C++ library enables applications and domain libraries to achieve performance portability on diverse manycore architectures by unifying abstractions for both fine-grain data parallelism and memory access patterns. In this paper we describe Kokkos’ abstractions, summarize its application programmer interface (API), present performance results for unit-test kernels and mini-applications, and outline an incremental strategy for migrating legacy C++ codes to Kokkos. Furthermore, the Kokkos library is under active research and development to incorporate capabilities from new generations of manycore architectures, and to address a growing list of applications and domain libraries.
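Kokkos' central abstraction can be mirrored in a Python sketch (hedged: Kokkos is C++; this only illustrates the idea): a "view" whose 2-D index is mapped to flat storage by a swappable layout, so the same loop body works under either memory access pattern.

```python
class View2D:
    """Toy multidimensional view with a swappable index-to-storage layout."""
    def __init__(self, rows, cols, layout="right"):
        self.rows, self.cols, self.layout = rows, cols, layout
        self.data = [0.0] * (rows * cols)
    def _index(self, i, j):
        if self.layout == "right":      # row-major: cache-friendly CPU strides
            return i * self.cols + j
        return j * self.rows + i        # "left": column-major, GPU coalescing
    def __setitem__(self, ij, v):
        self.data[self._index(*ij)] = v
    def __getitem__(self, ij):
        return self.data[self._index(*ij)]

a = View2D(2, 3, layout="right")
b = View2D(2, 3, layout="left")
for i in range(2):
    for j in range(3):
        a[i, j] = b[i, j] = i * 10 + j   # identical loop body, both layouts
```

Because only `_index` differs, the application code never changes when the target architecture demands the other access pattern, which is the portability claim in the abstract.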

  3. Open source software in a practical approach for post processing of radiologic images.

    PubMed

    Valeri, Gianluca; Mazza, Francesco Antonino; Maggi, Stefania; Aramini, Daniele; La Riccia, Luigi; Mazzoni, Giovanni; Giovagnoni, Andrea

    2015-03-01

    The purpose of this paper is to evaluate the use of open source software (OSS) to process DICOM images. We selected 23 programs for Windows and 20 programs for Mac from 150 possible OSS programs including DICOM viewers and various tools (converters, DICOM header editors, etc.). The programs selected all meet the basic requirements such as free availability, stand-alone application, presence of a graphical user interface, ease of installation and advanced features beyond simple image display. Capabilities of data import, data export, metadata, 2D viewer, 3D viewer, supported platform and usability of each selected program were evaluated on a scale ranging from 1 to 10 points. Twelve programs received a score higher than or equal to eight. Among them, five obtained a score of 9: 3D Slicer, MedINRIA, MITK 3M3, VolView, VR Render; while OsiriX received 10. OsiriX appears to be the only program able to perform all the operations taken into consideration, similar to a workstation equipped with proprietary software, allowing the analysis and interpretation of images in a simple and intuitive way. OsiriX is a DICOM PACS workstation for medical imaging and software for image processing for medical research, functional imaging, 3D imaging, confocal microscopy and molecular imaging. This application is also a good tool for teaching activities because it facilitates the attainment of learning objectives among students and other specialists.

  4. OpenDrop: An Integrated Do-It-Yourself Platform for Personal Use of Biochips

    PubMed Central

    Alistar, Mirela; Gaudenz, Urs

    2017-01-01

    Biochips, or digital labs-on-chip, are developed with the purpose of being used by laboratory technicians or biologists in laboratories or clinics. In this article, we expand this vision with the goal of enabling everyone, regardless of their expertise, to use biochips for their own personal purposes. We developed OpenDrop, an integrated electromicrofluidic platform that allows users to develop and program their own bio-applications. We address the main challenges that users may encounter: accessibility, bio-protocol design and interaction with microfluidics. OpenDrop consists of a do-it-yourself biochip, an automated software tool with visual interface and a detailed technique for at-home operations of microfluidics. We report on two years of use of OpenDrop, released as an open-source platform. Our platform attracted a highly diverse user base with participants originating from maker communities, academia and industry. Our findings show that 47% of attempts to replicate OpenDrop were successful, the main challenge remaining the assembly of the device. In terms of usability, the users managed to operate their platforms at home and are working on designing their own bio-applications. Our work provides a step towards a future in which everyone will be able to create microfluidic devices for their personal applications, thereby democratizing parts of health care. PMID:28952524

  5. Cargo Movement Operations System (CMOS) Updated Draft Software Requirements Specification (Applications CSCI) Increment II

    DTIC Science & Technology

    1990-11-29

    appropriate to combine them into one paragraph. CMOS PMO ACCEPTS COMMENT: YES [ ] NO [ ] ERCI ACCEPTS COMMENT: YES [ ] NO [ ] COMMENT DISPOSITION: COMMENT...COMMENT: YES [ ] NO [ ] ERCI ACCEPTS COMMENT: YES [ ] NO [ ] COMMENT DISPOSITION: COMMENT STATUS: OPEN [ ] CLOSED [ ] ORIGINATOR CONTROL NUMBER: SRS1-0004...ERCI ACCEPTS COMMENT: YES [ ] NO [ ] COMMENT DISPOSITION: COMMENT STATUS: OPEN [ ] CLOSED [ ] ORIGINATOR CONTROL NUMBER: SRS1-0005 PROGRAM OFFICE

  6. Establishment of a Uniform Format for Data Reporting of Structural Material Properties for Reliability Analysis

    DTIC Science & Technology

    1994-06-30

(Indexed excerpt: "Crack Tip Opening Displacement (CTOD) Fracture Toughness Measurement"; the method has found application in elastic-plastic fracture mechanics (EPFM); proposed material property database format and hierarchy; sample application of the material property database; quality indicators applicable to the present program, per the E 49.05 sub-committee: source of data and statistical basis of data.)

  7. Documentation--INFO: A Small Computer Data Base Management System for School Applications. The Illinois Series on Educational Application of Computers, No. 24e.

    ERIC Educational Resources Information Center

    Cox, John

    This paper documents the program used in the application of the INFO system for data storage and retrieval in schools, from the viewpoints of both the unsophisticated user and the experienced programmer interested in using the INFO system or modifying it for use within an existing school's computer system. The opening user's guide presents simple…

  8. SU-E-J-114: Web-Browser Medical Physics Applications Using HTML5 and Javascript.

    PubMed

    Bakhtiari, M

    2012-06-01

    Since 2010, HTML5 has received great attention: application developers and browser makers have embraced it, and consumers have begun to as well, as more users understand the benefits and potential it holds for the future of the web. Modern browsers such as Firefox, Google Chrome, and Safari offer increasingly robust support for HTML5, CSS3, and JavaScript. The goal of this work is to introduce HTML5 to the medical physics community for open-source software development. A key benefit of HTML5 is that the resulting software is portable. The HTML5, CSS, and JavaScript programming languages were used to develop several applications for quality assurance in radiation therapy. The canvas element of HTML5 was used to handle and display images, and JavaScript was used to manipulate the data. Sample applications were developed to: 1. analyze the flatness and symmetry of radiotherapy fields in a web browser, 2. analyze Dynalog files from Varian machines, 3. visualize animated dynamic MLC files, 4. run Monte Carlo simulations, and 5. perform interactive image manipulation. The programs showed excellent performance and speed in uploading data and displaying results; the flatness and symmetry program and the Dynalog file analyzer each ran in a fraction of a second. One reason for this performance is that JavaScript is a lower-level language than most scientific programming packages such as Matlab; another is that JavaScript runs locally on the client's computer rather than on the web server. HTML5 and JavaScript can be used to develop useful applications that run online or offline in different modern web browsers, most of which (such as Firefox) are themselves open source. © 2012 American Association of Physicists in Medicine.

  9. Model driven development of clinical information systems using openEHR.

    PubMed

    Atalag, Koray; Yang, Hong Yul; Tempero, Ewan; Warren, Jim

    2011-01-01

    openEHR and the recent international standard (ISO 13606) define a model driven software development methodology for health information systems. However, there is little evidence in the literature describing implementations, especially for desktop clinical applications. This paper presents an implementation pathway using .Net/C# technology for Microsoft Windows desktop platforms. An endoscopy reporting application driven by openEHR Archetypes and Templates has been developed. A set of novel GUI directives, which guide the automatic graphical user interface generator to render widgets properly, has been defined and presented. We also describe the development steps and important design decisions, from modelling to the final software product. This may provide guidance for other developers and form the evidence required for the adoption of these standards by vendors and national programs alike.

  10. Open source tools and toolkits for bioinformatics: significance, and where are we?

    PubMed

    Stajich, Jason E; Lapp, Hilmar

    2006-09-01

    This review summarizes important work in open-source bioinformatics software that has occurred over the past couple of years. The survey is intended to illustrate how programs and toolkits whose source code has been developed or released under an Open Source license have changed informatics-heavy areas of life science research. Rather than creating a comprehensive list of all tools developed over the last 2-3 years, we use a few selected projects encompassing toolkit libraries, analysis tools, data analysis environments and interoperability standards to show how freely available and modifiable open-source software can serve as the foundation for building important applications, analysis workflows and resources.

  11. Research on Ajax and Hibernate technology in the development of E-shop system

    NASA Astrophysics Data System (ADS)

    Yin, Luo

    2011-12-01

    Hibernate is an open-source object-relational mapping framework that provides lightweight object encapsulation of JDBC, letting Java programmers manipulate a database using object-oriented concepts. The advent of Ajax (asynchronous JavaScript and XML) opened the era of partial page refresh, allowing developers to build web applications with much stronger interaction. This paper illustrates the concrete application of Ajax and Hibernate to the development of an E-shop, using them to divide the entire program code into relatively independent parts that nevertheless cooperate with one another. In this way, the program becomes easier to maintain and extend.

  12. Cloud computing geospatial application for water resources based on free and open source software and open standards - a prototype

    NASA Astrophysics Data System (ADS)

    Delipetrev, Blagoj

    2016-04-01

    Presently, most existing software is desktop-based and designed to run on a single computer, which imposes major limitations: restricted processing power, storage, accessibility, availability, and so on. The only feasible solution lies in the web and the cloud. This abstract presents research and development of a cloud computing geospatial application for water resources based on free and open-source software and open standards, using a hybrid public-private cloud deployment model running on two separate virtual machines (VMs). The first (VM1) runs on Amazon Web Services (AWS) and the second (VM2) on a Xen cloud platform. The application demonstrates a framework for developing specialized cloud geospatial applications that require only a web browser to use. It serves as a collaboration platform: multiple users across the globe with an internet connection and a browser can jointly model geospatial objects, enter attribute data and information, execute algorithms, and visualize results. The application is available at all times, accessible from everywhere, scalable, and runs in a distributed computing environment; it provides a real-time multiuser collaboration platform, its code and components are interoperable, and it is flexible enough to accommodate additional components. The cloud geospatial application is implemented as a specialized water resources application with three web services: 1) data infrastructure (DI), 2) support for water resources modelling (WRM), and 3) user management. The web services run on the two VMs, which communicate over the internet to provide services to users. The application was tested on the Zletovica river basin case study with concurrent multiple users.
The presented solution is a prototype and can be used as a foundation for developing any specialized cloud geospatial application. Further research will focus on distributing the cloud application across additional VMs and on testing the scalability and availability of its services.

  13. Analysis of Parallel Algorithms on SMP Node and Cluster of Workstations Using Parallel Programming Models with New Tile-based Method for Large Biological Datasets.

    PubMed

    Shrimankar, D D; Sathe, S R

    2016-01-01

    Sequence alignment is an important tool for describing the relationships between DNA sequences. Many sequence alignment algorithms exist, differing in efficiency, in their models of the sequences, and in the relationship between sequences. The focus of this study is to obtain an optimal alignment between two sequences of biological data, particularly DNA sequences. The algorithm is discussed with particular emphasis on time, speedup, and efficiency optimizations. Parallel programming presents a number of critical challenges to application developers. Today's supercomputers often consist of clusters of SMP nodes, and programming paradigms such as OpenMP and MPI are used to write parallel codes for such architectures. OpenMP programs, however, cannot scale beyond a single SMP node, whereas MPI programs can span multiple SMP nodes at the cost of internode communication overhead. In this work, we explore the tradeoffs between using OpenMP and MPI. We demonstrate that communication overhead is significant even in OpenMP loop execution and increases with the number of participating cores, and we present a communication model to approximate the overhead of communication in OpenMP loops. Our results are consistent across a wide variety of input data files. We have also developed our own load balancing and cache optimization techniques for the message passing model. Our experimental results show that these techniques give optimum performance of our parallel algorithm for various input parameters, such as sequence size and tile size, on a wide variety of multicore architectures.
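
    The optimal pairwise alignment at the heart of this study can be sketched with a minimal serial Needleman-Wunsch dynamic program. The sketch below is illustrative only: the scoring values and function name are assumptions, and it omits the tiling, load balancing and OpenMP/MPI parallelization that the paper actually contributes.

```python
# Minimal serial Needleman-Wunsch global alignment score.
# Scoring values (match/mismatch/gap) are illustrative assumptions,
# not the parameters used in the paper.
def nw_score(a, b, match=1, mismatch=-1, gap=-1):
    rows, cols = len(a) + 1, len(b) + 1
    score = [[0] * cols for _ in range(rows)]
    # Boundary conditions: aligning a prefix against the empty string.
    for i in range(1, rows):
        score[i][0] = i * gap
    for j in range(1, cols):
        score[0][j] = j * gap
    # Fill the DP table: best of diagonal (match/mismatch) and two gap moves.
    for i in range(1, rows):
        for j in range(1, cols):
            diag = score[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            score[i][j] = max(diag, score[i - 1][j] + gap, score[i][j - 1] + gap)
    return score[-1][-1]

print(nw_score("GATTACA", "GCATGCU"))  # prints 0
```

A tiled parallel version would partition this table into blocks and process anti-diagonal waves of blocks concurrently, which is where the communication overhead discussed in the abstract arises.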

  14. Analysis of Parallel Algorithms on SMP Node and Cluster of Workstations Using Parallel Programming Models with New Tile-based Method for Large Biological Datasets

    PubMed Central

    Shrimankar, D. D.; Sathe, S. R.

    2016-01-01

    Sequence alignment is an important tool for describing the relationships between DNA sequences. Many sequence alignment algorithms exist, differing in efficiency, in their models of the sequences, and in the relationship between sequences. The focus of this study is to obtain an optimal alignment between two sequences of biological data, particularly DNA sequences. The algorithm is discussed with particular emphasis on time, speedup, and efficiency optimizations. Parallel programming presents a number of critical challenges to application developers. Today's supercomputers often consist of clusters of SMP nodes, and programming paradigms such as OpenMP and MPI are used to write parallel codes for such architectures. OpenMP programs, however, cannot scale beyond a single SMP node, whereas MPI programs can span multiple SMP nodes at the cost of internode communication overhead. In this work, we explore the tradeoffs between using OpenMP and MPI. We demonstrate that communication overhead is significant even in OpenMP loop execution and increases with the number of participating cores, and we present a communication model to approximate the overhead of communication in OpenMP loops. Our results are consistent across a wide variety of input data files. We have also developed our own load balancing and cache optimization techniques for the message passing model. Our experimental results show that these techniques give optimum performance of our parallel algorithm for various input parameters, such as sequence size and tile size, on a wide variety of multicore architectures. PMID:27932868

  15. An Overview of R in Health Decision Sciences.

    PubMed

    Jalal, Hawre; Pechlivanoglou, Petros; Krijkamp, Eline; Alarid-Escudero, Fernando; Enns, Eva; Hunink, M G Myriam

    2017-10-01

    As the complexity of health decision science applications increases, high-level programming languages are increasingly adopted for statistical analyses and numerical computations. These programming languages facilitate sophisticated modeling, model documentation, and analysis reproducibility. Among the high-level programming languages, the statistical programming framework R is gaining increased recognition. R is freely available, cross-platform compatible, and open source. A large community of users who have generated an extensive collection of well-documented packages and functions supports it. These functions facilitate applications of health decision science methodology as well as the visualization and communication of results. Although R's popularity is increasing among health decision scientists, methodological extensions of R in the field of decision analysis remain isolated. The purpose of this article is to provide an overview of existing R functionality that is applicable to the various stages of decision analysis, including model design, input parameter estimation, and analysis of model outputs.

  16. 48 CFR 22.404-4 - Solicitations issued without wage determinations for the primary site of the work.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Acquisition Regulations System FEDERAL ACQUISITION REGULATION SOCIOECONOMIC PROGRAMS APPLICATION OF LABOR LAWS... issued as an amendment to the solicitation. (b) In sealed bidding, bids may not be opened until a...

  17. Lapin Data Interchange Among Database, Analysis and Display Programs Using XML-Based Text Files

    NASA Technical Reports Server (NTRS)

    2005-01-01

    The purpose of grant NCC3-966 was to investigate and evaluate the interchange of application-specific data among multiple programs, each carrying out part of an analysis and design task. Previously, this was done by creating a custom program to read data produced by one application and then write that data to a file whose format is specific to the second application needing all or part of the data. In this investigation, the data of interest are described using the XML markup language, which allows the data to be stored as a text string. Software to transform a task's output data into an XML string, and software to read an XML string and extract all or a portion of the data needed by another application, are used to link two independent applications together as part of an overall design effort. This approach was initially used with a standard analysis program, Lapin, along with standard applications (a spreadsheet program, a relational database program, and a conventional dialog and display program) to demonstrate the successful sharing of data among independent programs. Most of the effort beyond that demonstration has been concentrated on the inclusion of more complex display programs. Specifically, a custom-written windowing program, organized around dialogs to control the interactions, has been combined with an independent CAD program (Open Cascade) that supports sophisticated display of CAD elements such as lines, spline curves, and surfaces, and with turbine-blade data produced by an independent blade design program (UD0300).
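
    The XML-string interchange described above can be sketched with Python's standard library. The element names and helper functions below are illustrative assumptions, not the grant's actual software.

```python
import xml.etree.ElementTree as ET

# Serialize one task's output parameters into an XML string (hypothetical schema).
def to_xml_string(name, params):
    root = ET.Element("task", attrib={"name": name})
    for key, value in params.items():
        child = ET.SubElement(root, "param", attrib={"name": key})
        child.text = str(value)
    return ET.tostring(root, encoding="unicode")

# Read an XML string back and extract all parameters, or only the subset
# the downstream application actually needs.
def read_params(xml_string, wanted=None):
    root = ET.fromstring(xml_string)
    return {p.get("name"): p.text for p in root.findall("param")
            if wanted is None or p.get("name") in wanted}

xml_text = to_xml_string("blade_design", {"chord": 0.12, "span": 1.5})
print(read_params(xml_text, wanted={"chord"}))  # prints {'chord': '0.12'}
```

Because the intermediate form is a plain text string, any pair of independent programs that agree on the element names can exchange data this way without custom per-pair file formats.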

  18. Measuring the effectiveness and impact of an open innovation platform.

    PubMed

    Carroll, Glenn P; Srivastava, Sanjay; Volini, Adam S; Piñeiro-Núñez, Marta M; Vetman, Tatiana

    2017-05-01

    Today, most pharmaceutical companies complement their traditional R&D models with some variation on the Open Innovation (OI) approach in an effort to better access global scientific talent, ideas and hypotheses. Traditional performance indicators that measure economic returns from R&D through commercialization are often not applicable to the practical assessment of these OI approaches, particularly within the context of early drug discovery. This leaves OI programs focused on early R&D without a standard assessment framework from which to evaluate overall performance. This paper proposes a practical dashboard for such assessment, encompassing quantitative and qualitative elements, to enable decision-making and improvement of future performance. The use of this dashboard is illustrated using real-time data from the Lilly Open Innovation Drug Discovery (OIDD) program. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.

  19. Elastic-plastic analysis of a propagating crack under cyclic loading

    NASA Technical Reports Server (NTRS)

    Newman, J. C., Jr.; Armen, H., Jr.

    1974-01-01

    Development and application of a two-dimensional finite-element analysis to predict crack-closure and crack-opening stresses during specified histories of cyclic loading. An existing finite-element computer program which accounts for elastic-plastic material behavior under cyclic loading was modified to account for changing boundary conditions - crack growth and intermittent contact of crack surfaces. This program was subsequently used to study the crack-closure behavior under constant-amplitude and simple block-program loading.

  20. Distributed chemical computing using ChemStar: an open source java remote method invocation architecture applied to large scale molecular data from PubChem.

    PubMed

    Karthikeyan, M; Krishnan, S; Pandey, Anil Kumar; Bender, Andreas; Tropsha, Alexander

    2008-04-01

    We present the application of a Java remote method invocation (RMI) based open source architecture to distributed chemical computing. This architecture was previously employed for distributed data harvesting of chemical information from the Internet via the Google application programming interface (API; ChemXtreme). Due to its open source character and its flexibility, the underlying server/client framework can be quickly adapted to virtually every computational task that can be parallelized. Here, we present the server/client communication framework as well as an application to distributed computing of chemical properties on a large scale (currently the size of PubChem; about 18 million compounds), using both the Marvin toolkit as well as the open source JOELib package. As an application, for this set of compounds, the agreement of log P and TPSA between the packages was compared. Outliers were found to be mostly non-druglike compounds, and differences could usually be explained by differences in the underlying algorithms. ChemStar is the first open source distributed chemical computing environment built on Java RMI, which is also easily adaptable to user demands due to its "plug-in architecture". The complete source codes as well as calculated properties along with links to PubChem resources are available on the Internet via a graphical user interface at http://moltable.ncl.res.in/chemstar/.

  1. Closed-Loop, Multichannel Experimentation Using the Open-Source NeuroRighter Electrophysiology Platform

    PubMed Central

    Newman, Jonathan P.; Zeller-Townson, Riley; Fong, Ming-Fai; Arcot Desai, Sharanya; Gross, Robert E.; Potter, Steve M.

    2013-01-01

    Single neuron feedback control techniques, such as voltage clamp and dynamic clamp, have enabled numerous advances in our understanding of ion channels, electrochemical signaling, and neural dynamics. Although commercially available multichannel recording and stimulation systems are commonly used for studying neural processing at the network level, they provide little native support for real-time feedback. We developed the open-source NeuroRighter multichannel electrophysiology hardware and software platform for closed-loop multichannel control with a focus on accessibility and low cost. NeuroRighter allows 64 channels of stimulation and recording for around US $10,000, along with the ability to integrate with other software and hardware. Here, we present substantial enhancements to the NeuroRighter platform, including a redesigned desktop application, a new stimulation subsystem allowing arbitrary stimulation patterns, low-latency data servers for accessing data streams, and a new application programming interface (API) for creating closed-loop protocols that can be inserted into NeuroRighter as plugin programs. This greatly simplifies the design of sophisticated real-time experiments without sacrificing the power and speed of a compiled programming language. Here we present a detailed description of NeuroRighter as a stand-alone application, its plugin API, and an extensive set of case studies that highlight the system’s abilities for conducting closed-loop, multichannel interfacing experiments. PMID:23346047

  2. Cinfony – combining Open Source cheminformatics toolkits behind a common interface

    PubMed Central

    O'Boyle, Noel M; Hutchison, Geoffrey R

    2008-01-01

    Background Open Source cheminformatics toolkits such as OpenBabel, the CDK and the RDKit share the same core functionality but support different sets of file formats and forcefields, and calculate different fingerprints and descriptors. Despite their complementary features, using these toolkits in the same program is difficult as they are implemented in different languages (C++ versus Java), have different underlying chemical models and have different application programming interfaces (APIs). Results We describe Cinfony, a Python module that presents a common interface to all three of these toolkits, allowing the user to easily combine methods and results from any of the toolkits. In general, the run time of the Cinfony modules is almost as fast as accessing the underlying toolkits directly from C++ or Java, but Cinfony makes it much easier to carry out common tasks in cheminformatics such as reading file formats and calculating descriptors. Conclusion By providing a simplified interface and improving interoperability, Cinfony makes it easy to combine complementary features of OpenBabel, the CDK and the RDKit. PMID:19055766
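
    The common-interface idea behind Cinfony can be illustrated as a thin adapter layer: each backend exposes the same minimal API, so calling code is toolkit-agnostic. The sketch below is a generic illustration of the pattern in plain Python with made-up toy backends standing in for OpenBabel, the CDK and the RDKit; it is not Cinfony's actual code or API.

```python
# Two hypothetical backend toolkits with incompatible internals but the
# same capability (here, a trivial "atom count" from a SMILES string).
class ToolkitA:
    def atom_count(self, smiles):
        # Pretend backend A counts element letters itself.
        return sum(1 for ch in smiles if ch.isalpha())

class ToolkitB:
    def atom_count(self, smiles):
        # Pretend backend B has a different internal routine.
        return len([ch for ch in smiles if ch.isalpha()])

class Molecule:
    """Common wrapper: identical calls regardless of the underlying toolkit."""
    def __init__(self, smiles, toolkit):
        self.smiles = smiles
        self.toolkit = toolkit

    def atom_count(self):
        # Delegate to whichever backend this molecule was built with.
        return self.toolkit.atom_count(self.smiles)

# The same client code runs against either backend.
for backend in (ToolkitA(), ToolkitB()):
    print(Molecule("CCO", backend).atom_count())  # prints 3 for each backend
```

The design choice mirrors the abstract: by hiding backend differences behind one wrapper class, results from complementary toolkits can be combined in a single program with little run-time cost beyond the delegation call.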

  3. Application of Modern Fortran to Spacecraft Trajectory Design and Optimization

    NASA Technical Reports Server (NTRS)

    Williams, Jacob; Falck, Robert D.; Beekman, Izaak B.

    2018-01-01

    In this paper, applications of the modern Fortran programming language to the field of spacecraft trajectory optimization and design are examined. Modern object-oriented Fortran has many advantages for scientific programming, although many legacy Fortran aerospace codes have not been upgraded to use the newer standards (or have been rewritten in other languages perceived to be more modern). NASA's Copernicus spacecraft trajectory optimization program, originally a combination of Fortran 77 and Fortran 95, has attempted to keep up with modern standards and makes significant use of the new language features. Various algorithms and methods are presented from trajectory tools such as Copernicus, as well as modern Fortran open source libraries and other projects.

  4. Technology of interdisciplinary open-ended designing in engineering education

    NASA Astrophysics Data System (ADS)

    Isaev, A. P.; Plotnikov, L. V.; Fomin, N. I.

    2017-11-01

    The authors' technology of interdisciplinary open-ended engineering design is presented in this article. The technology is an integrated teaching method that significantly increases the practical component of the educational program and creates conditions for overcoming shortcomings in engineering education. The article examines the basic ideas of open-ended design technology, the experience of implementing them in higher education, and the authors' vision of the teaching technology. The main stages of developing the technology for preparing undergraduate (bachelor) students in technical fields are presented, together with the complex of methodological tools and procedures that forms the basis of the training technology used in the educational process at the higher school of engineering (UrFU). The organizational model of the technology, which integrates the functions involved in creating and implementing the whole educational program, is presented, along with an analysis of the characteristics of the educational activity of students working under the technology. Intermediate results of applying the technology in the undergraduate engineering educational process are shown.

  5. Vehicle infrastructure integration (VII) based road-condition warning system for highway collision prevention.

    DOT National Transportation Integrated Search

    2009-05-01

    As a major ITS initiative, the Vehicle Infrastructure Integration (VII) program is to revolutionize transportation by creating an enabling communication infrastructure that will open up a wide range of safety applications. The road-condition warn...

  6. Compiled MPI: Cost-Effective Exascale Applications Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bronevetsky, G; Quinlan, D; Lumsdaine, A

    2012-04-10

    The complexity of petascale and exascale machines makes it increasingly difficult to develop applications that can take advantage of them. Future systems are expected to feature billion-way parallelism, complex heterogeneous compute nodes and poor availability of memory (Peter Kogge, 2008). This new challenge for application development is motivating a significant amount of research and development on new programming models and runtime systems designed to simplify large-scale application development. Unfortunately, DoE has significant multi-decadal investment in a large family of mission-critical scientific applications. Scaling these applications to exascale machines will require a significant investment that will dwarf the costs of hardware procurement. A key reason for the difficulty in transitioning today's applications to exascale hardware is their reliance on explicit programming techniques, such as the Message Passing Interface (MPI) programming model to enable parallelism. MPI provides a portable and high performance message-passing system that enables scalable performance on a wide variety of platforms. However, it also forces developers to lock the details of parallelization together with application logic, making it very difficult to adapt the application to significant changes in the underlying system. Further, MPI's explicit interface makes it difficult to separate the application's synchronization and communication structure, reducing the amount of support that can be provided by compiler and run-time tools. This is in contrast to the recent research on more implicit parallel programming models such as Chapel, OpenMP and OpenCL, which promise to provide significantly more flexibility at the cost of reimplementing significant portions of the application.
We are developing CoMPI, a novel compiler-driven approach to enable existing MPI applications to scale to exascale systems with minimal modifications that can be made incrementally over the application's lifetime. It includes: (1) a new set of source code annotations, inserted either manually or automatically, that clarify the application's use of MPI to the compiler infrastructure, enabling greater accuracy where needed; (2) a compiler transformation framework that leverages these annotations to transform the original MPI source code to improve its performance and scalability; (3) novel MPI runtime implementation techniques that provide a rich set of functionality extensions to be used by applications that have been transformed by our compiler; and (4) a novel compiler analysis that leverages simple user annotations to automatically extract the application's communication structure and synthesize the most complex code annotations.

  7. National Utility Rate Database: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ong, S.; McKeel, R.

    2012-08-01

    When modeling solar energy technologies and other distributed energy systems, access to an expansive collection of high-quality electricity rates is essential. The National Renewable Energy Laboratory (NREL) developed a utility rate platform for entering, storing, updating, and accessing a large collection of utility rates from around the United States. This utility rate platform lives on the Open Energy Information (OpenEI) website, OpenEI.org, allowing the data to be accessed programmatically from a web browser using an application programming interface (API). The semantic-based utility rate platform currently has records of 1,885 utility rates and covers over 85% of the electricity consumption in the United States.
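
    Programmatic access of the kind described might look like the following sketch. The base URL and query parameter names are placeholders for illustration, not OpenEI's documented API.

```python
from urllib.parse import urlencode

# Build a query URL for a hypothetical utility-rate API endpoint.
# The base URL and parameter names below are placeholders, not
# OpenEI's documented interface.
def rate_query_url(base, api_key, **filters):
    params = {"api_key": api_key, "format": "json", **filters}
    # Sort for a deterministic query string.
    return base + "?" + urlencode(sorted(params.items()))

url = rate_query_url("https://example.org/utility_rates",
                     api_key="DEMO_KEY", sector="Residential")
print(url)
```

A client would then fetch this URL and parse the JSON response into rate structures for the energy model.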

  8. CheD: chemical database compilation tool, Internet server, and client for SQL servers.

    PubMed

    Trepalin, S V; Yarkov, A V

    2001-01-01

    An efficient program, which runs on a personal computer, for the storage, retrieval, and processing of chemical information is presented. The program can work as a stand-alone application or in conjunction with a specifically written Web server application or with standard SQL servers, e.g., Oracle, Interbase, and MS SQL. New types of data fields are introduced, e.g., arrays for spectral information storage, HTML and database links, and user-defined functions. CheD has an open architecture; thus, custom data types, controls, and services may be added. A WWW server application for chemical data retrieval features an easy and user-friendly installation on Windows NT or 95 platforms.

  9. Performance evaluation of throughput computing workloads using multi-core processors and graphics processors

    NASA Astrophysics Data System (ADS)

    Dave, Gaurav P.; Sureshkumar, N.; Blessy Trencia Lincy, S. S.

    2017-11-01

    The current trend in processor manufacturing focuses on multi-core architectures rather than increasing clock speed for performance improvement. Graphics processors have become commodity hardware providing fast co-processing in computer systems. Developments in IoT, social networking web applications, and big data have created huge demand for data processing, and such throughput-intensive applications inherently contain data-level parallelism, which is well suited to SIMD-based GPU architectures. This paper reviews the architectural aspects of multi/many-core processors and graphics processors. Several case studies are used to compare the performance of throughput computing applications using shared-memory programming in OpenMP and CUDA API based programming.

  10. Electronic Health Record Application Support Service Enablers.

    PubMed

    Neofytou, M S; Neokleous, K; Aristodemou, A; Constantinou, I; Antoniou, Z; Schiza, E C; Pattichis, C S; Schizas, C N

    2015-08-01

    There is a huge need for open source software solutions in the healthcare domain, given the flexibility, interoperability and resource-saving characteristics they offer. In this context, this paper presents the development of three open source libraries - Specific Enablers (SEs) for eHealth applications that were developed under the European project titled "Future Internet Social and Technological Alignment Research" (FI-STAR) funded under the "Future Internet Public Private Partnership" (FI-PPP) program. The three SEs developed under the Electronic Health Record Application Support Service Enablers (EHR-EN) correspond to: a) an Electronic Health Record enabler (EHR SE), b) a patient summary enabler based on the EU project "European patient Summary Open Source services" (epSOS SE) supporting patient mobility and the offering of interoperable services, and c) a Picture Archiving and Communications System (PACS) enabler (PACS SE) based on the dcm4che open source system for the support of medical imaging functionality. The EHR SE follows the HL7 Clinical Document Architecture (CDA) V2.0 and supports the Integrating the Healthcare Enterprise (IHE) profiles (recently awarded in Connectathon 2015). These three FI-STAR platform enablers are designed to facilitate the deployment of innovative applications and value added services in the health care sector. They can be downloaded from the FI-STAR catalogue website. Work in progress focuses on validation and evaluation scenarios for proving and demonstrating the usability, applicability and adaptability of the proposed enablers.

  11. Implementation of Open-Source Web Mapping Technologies to Support Monitoring of Governmental Schemes

    NASA Astrophysics Data System (ADS)

    Pulsani, B. R.

    2015-10-01

    Several schemes are undertaken by the government to uplift the social and economic condition of the people. The monitoring of these schemes is done through information technology, where the involvement of Geographic Information Systems (GIS) is lacking. To demonstrate the benefits of thematic mapping as a tool for assisting officials in making decisions, a web mapping application was built for three government programs: the Mother and Child Tracking System (MCTS), Telangana State Housing Corporation Limited (TSHCL) and Ground Water Quality Mapping (GWQM). The three applications depict the distribution of various parameters thematically and helped in identifying areas with higher and weaker distributions. Based on the three applications, the study finds that many government schemes share characteristics suited to thematic mapping and hence proposes this kind of approach for other schemes as well. These applications have been developed using the SharpMap C# library, a free and open source mapping library for developing geospatial applications. The study highlights the cost benefits of SharpMap, brings out the advantages of this library over proprietary vendors, and further discusses its advantages over other open source libraries as well.

  12. Research on the application of satellite remote sensing to local, state, regional and national programs involved with resource management and environmental quality

    NASA Technical Reports Server (NTRS)

    Barr, B. G.

    1974-01-01

    A program designed to involve state, regional and local agency personnel in the application of remote sensing is reported. During this period fifteen application projects were initiated in support of twenty-five separate state, county and municipal agencies or entities. Eight of the projects were completed with positive results which aided the agencies involved. These results included information which contributed to decisions on: (1) selection of a route for a scenic parkway, (2) policy development on open land use, (3) policy related to urban development, (4) a major reservoir project by a governor's staff, (5) control tactics and damage assessment during flooding conditions on the Kansas and Missouri rivers, and (6) initiating a program of habitat inventory by remote sensing by the Kansas Forestry, Fish and Game Commission.

  13. Investigating the application of AOP methodology in development of Financial Accounting Software using Eclipse-AJDT Environment

    NASA Astrophysics Data System (ADS)

    Sharma, Amita; Sarangdevot, S. S.

    2010-11-01

    Aspect-Oriented Programming (AOP) methodology has been investigated in the development of real world business application software—Financial Accounting Software. The Eclipse-AJDT environment has been used as open source enhanced IDE support for programming in the AOP language AspectJ. Crosscutting concerns have been identified and modularized as aspects. This reduces the complexity of the design considerably due to the elimination of code scattering and tangling. Improvement in modularity, quality and performance is achieved. The study concludes that AOP methodology in the Eclipse-AJDT environment offers powerful support for the modular design and implementation of real world quality business software.

  14. Open System Interconnection - NASA program communications of the future. [developed by International Organization for Standardization]

    NASA Technical Reports Server (NTRS)

    Brady, Charles D.

    1987-01-01

    Open Systems Interconnection (OSI) standards are being developed by the ISO and the Consultative Committee on International Telephone and Telegraph with the support of industry. These standards are being developed to allow the interconnecting of computer systems and the interworking of applications such that the applications can be independent of any equipment manufacturer. Significant progress has been made, and the establishment of government OSI standards is being considered. There is considerable interest within NASA in the potential benefits of OSI and in communications standards in general. The OSI standards are being considered for possible application in the Space Station onboard data management system. The OSI standards have reached a high level of maturity, and it is now imperative that NASA plan for future migration to OSI where appropriate.

  15. The results of the investigations of Russian Research Center - "Kurchatov Institute" on molten salt applications to problems of nuclear energy systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Novikov, V.M.

    1995-10-01

    The results of investigations on molten salt (MS) applications to problems of nuclear energy systems that have been conducted in the Russian Research Center "Kurchatov Institute" are presented and discussed. The spectrum of these investigations is rather broad and covers the following items: physical characteristics of molten salt nuclear energy systems (MSNES); nuclear and radiation safety of MSNES; construction materials compatible with MS of different compositions; technological aspects of MS loops; and in-reactor loop testing. It is shown that the main findings of the completed program support the conclusion that there are neither physical nor technological obstacles in the way of MS application to different nuclear energy systems.

  16. Finding Services for an Open Architecture: A Review of Existing Applications and Programs in PEO C4I

    DTIC Science & Technology

    2011-01-01

    Two key SOA success factors listed were as follows: 1. Shared Services Strategy: existence of a strategy to identify overlapping business and ... or eliminating redundancies and overlaps through use of shared services. 2. Funding Model: existence of an IT funding model aligned with and supportive of a shared services strategy. (Sun Microsystems, 2004)

  17. Spatial Information Processing: Standards-Based Open Source Visualization Technology

    NASA Astrophysics Data System (ADS)

    Hogan, P.

    2009-12-01

    Spatial information intelligence is a global issue that will increasingly affect our ability to survive as a species. Collectively we must better appreciate the complex relationships that make life on Earth possible. Providing spatial information in its native context can accelerate our ability to process that information. To maximize this ability to process information, three basic elements are required: data delivery (server technology), data access (client technology), and data processing (information intelligence). NASA World Wind provides open source client and server technologies based on open standards. The possibilities for data processing and data sharing are enhanced by this inclusive infrastructure for geographic information. It is interesting that this open source and open standards approach, unfettered by proprietary constraints, simultaneously provides for entirely proprietary use of this same technology. 1. WHY WORLD WIND? NASA World Wind began as a single program with specific functionality, to deliver NASA content. But as the possibilities for virtual globe technology became more apparent, we found that while enabling a new class of information technology, we were also getting in the way. Researchers, developers and even users expressed their desire for World Wind functionality in ways that would service their specific needs. They want it in their web pages. They want to add their own features. They want to manage their own data. They told us that only with this kind of flexibility could their objectives and the potential for this technology be truly realized. World Wind client technology is a set of development tools, a software development kit (SDK) that allows a software engineer to create applications requiring geographic visualization technology. 2. MODULAR COMPONENTRY Accelerated evolution of a technology requires that the essential elements of that technology be modular components such that each can advance independent of the other elements. World Wind therefore changed its mission from providing a single information browser to enabling a whole class of 3D geographic applications. Instead of creating a single program, World Wind is a suite of components that can be selectively used in any number of programs. World Wind technology can be a part of any application, or it can be a window in a web page. Or it can be extended with additional functionalities by application and web developers. World Wind makes it possible to include virtual globe visualization and server technology in support of any objective. The world community can continually benefit from advances made in the technology by NASA in concert with the world community. 3. OPEN SOURCE AND OPEN STANDARDS NASA World Wind is NASA Open Source software. This means that the source code is fully accessible for anyone to freely use, even in association with proprietary technology. Imagery and other data provided by the World Wind servers reside in the public domain, including the data server technology itself. This allows others to deliver their own geospatial data and to provide custom solutions based on users' specific needs.

  18. AMBIT RESTful web services: an implementation of the OpenTox application programming interface.

    PubMed

    Jeliazkova, Nina; Jeliazkov, Vedrin

    2011-05-16

    The AMBIT web services package is one of the several existing independent implementations of the OpenTox Application Programming Interface and is built according to the principles of the Representational State Transfer (REST) architecture. The Open Source Predictive Toxicology Framework, developed by the partners in the EC FP7 OpenTox project, aims at providing unified access to toxicity data and predictive models, as well as validation procedures. This is achieved by i) an information model, based on a common OWL-DL ontology; ii) links to related ontologies; iii) data and algorithms, available through a standardized REST web services interface, where every compound, data set or predictive method has a unique web address, used to retrieve its Resource Description Framework (RDF) representation, or initiate the associated calculations. The AMBIT web services package has been developed as an extension of AMBIT modules, adding the ability to create (Quantitative) Structure-Activity Relationship (QSAR) models and providing an OpenTox API compliant interface. The representation of data and processing resources in the W3C Resource Description Framework facilitates integrating the resources as Linked Data. By uploading datasets with chemical structures and an arbitrary set of properties, they become automatically available online in several formats. The services provide unified interfaces to several descriptor calculation, machine learning and similarity searching algorithms, as well as to applicability domain and toxicity prediction models. All Toxtree modules for predicting the toxicological hazard of chemical compounds are also integrated within this package. The complexity and diversity of the processing is reduced to the simple paradigm "read data from a web address, perform processing, write to a web address". The online service allows users to easily run predictions without installing any software, as well as to share online datasets and models. The downloadable web application allows researchers to set up an arbitrary number of service instances for specific purposes and at suitable locations. These services could be used as a distributed framework for processing of resource-intensive tasks and data sharing, or in a fully independent way, according to specific needs. The advantage of exposing the functionality via the OpenTox API is seamless interoperability, not only within a single web application, but also in a network of distributed services. Last, but not least, the services provide a basis for building web mashups and end user applications with friendly GUIs, as well as for embedding the functionalities in existing workflow systems.
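
    The REST convention described above, where every compound, dataset or model has its own unique web address, can be sketched in a few lines of Python. The service root and path layout below are hypothetical, for illustration only, not the actual AMBIT routes.

```python
from urllib.parse import urljoin

def resource_url(base, kind, identifier):
    """Build the unique web address of a REST resource (compound, dataset,
    model, ...) by joining a kind and an identifier onto the service root.
    The path layout is illustrative, not AMBIT's actual routing."""
    return urljoin(base, f"{kind}/{identifier}")

# Hypothetical service root used for this sketch
BASE = "https://example.org/ambit/"
compound = resource_url(BASE, "compound", 42)   # one address per resource
dataset = resource_url(BASE, "dataset", "train")
```

    A client then follows the paradigm quoted in the abstract: GET the RDF representation from such an address, process it, and POST results to another address.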

  19. AMBIT RESTful web services: an implementation of the OpenTox application programming interface

    PubMed Central

    2011-01-01

    The AMBIT web services package is one of the several existing independent implementations of the OpenTox Application Programming Interface and is built according to the principles of the Representational State Transfer (REST) architecture. The Open Source Predictive Toxicology Framework, developed by the partners in the EC FP7 OpenTox project, aims at providing unified access to toxicity data and predictive models, as well as validation procedures. This is achieved by i) an information model, based on a common OWL-DL ontology; ii) links to related ontologies; iii) data and algorithms, available through a standardized REST web services interface, where every compound, data set or predictive method has a unique web address, used to retrieve its Resource Description Framework (RDF) representation, or initiate the associated calculations. The AMBIT web services package has been developed as an extension of AMBIT modules, adding the ability to create (Quantitative) Structure-Activity Relationship (QSAR) models and providing an OpenTox API compliant interface. The representation of data and processing resources in the W3C Resource Description Framework facilitates integrating the resources as Linked Data. By uploading datasets with chemical structures and an arbitrary set of properties, they become automatically available online in several formats. The services provide unified interfaces to several descriptor calculation, machine learning and similarity searching algorithms, as well as to applicability domain and toxicity prediction models. All Toxtree modules for predicting the toxicological hazard of chemical compounds are also integrated within this package. The complexity and diversity of the processing is reduced to the simple paradigm "read data from a web address, perform processing, write to a web address". The online service allows users to easily run predictions without installing any software, as well as to share online datasets and models. The downloadable web application allows researchers to set up an arbitrary number of service instances for specific purposes and at suitable locations. These services could be used as a distributed framework for processing of resource-intensive tasks and data sharing, or in a fully independent way, according to specific needs. The advantage of exposing the functionality via the OpenTox API is seamless interoperability, not only within a single web application, but also in a network of distributed services. Last, but not least, the services provide a basis for building web mashups and end user applications with friendly GUIs, as well as for embedding the functionalities in existing workflow systems. PMID:21575202

  20. Thermoviscous analysis of open photoacoustic cells

    NASA Astrophysics Data System (ADS)

    Mannoor, Madhusoodanan; Kang, Sangmo

    2017-11-01

    Open photoacoustic cells, apart from conventional spectroscopic applications, are increasingly useful in biomedical applications such as in vivo blood sugar measurement. Maximising the acoustic pressure amplitude and the quality factor are major design considerations associated with open cells. Conventionally, resonant photoacoustic cells are analyzed by either the transmission-line analogy or the eigenmode expansion method. In this study, we conducted a more comprehensive thermoviscous analysis of open photoacoustic cells. A Helmholtz cell and a T-shaped cell, which are acoustically different, are considered for analysis. The effects of geometrical dimensions on the acoustic pressure, quality factor and the intrusion of noise are analyzed and compared between these cells. Specific attention is given to the sizing of the opening and the fixtures on it to minimize radiation losses and the intrusion of noise. Our results are useful for the proper selection of the type of open photoacoustic cell for in vivo blood sugar measurement and the optimization of the geometric variables of such cells. This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT and Future Planning (2017R1A2B4005006).
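
    As a rough point of reference for the Helmholtz-type cell discussed above, the classical lossless resonator formula f = (c/2π)·√(A/(V·L)) gives a first-order estimate of the resonance frequency from the neck cross-section A, effective neck length L and cavity volume V. The thermoviscous analysis in the paper goes well beyond this; the dimensions below are illustrative only.

```python
import math

def helmholtz_frequency(c, neck_area, neck_length, cavity_volume):
    """First-order (lossless) Helmholtz resonance frequency:
    f = (c / 2*pi) * sqrt(A / (V * L)), with SI units throughout."""
    return (c / (2.0 * math.pi)) * math.sqrt(neck_area / (cavity_volume * neck_length))

# Illustrative cell: 1 cm^2 neck, 1 cm effective neck length, 10 cm^3 cavity
f0 = helmholtz_frequency(343.0, 1e-4, 0.01, 1e-5)   # roughly 1.7 kHz
```

    The formula also shows the design trade-off the abstract alludes to: enlarging the cavity or lengthening the neck lowers the resonance frequency.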

  1. Lapin Data Interchange Among Database, Analysis and Display Programs Using XML-Based Text Files

    NASA Technical Reports Server (NTRS)

    2004-01-01

    The purpose was to investigate and evaluate the interchange of application-specific data among multiple programs, each carrying out part of an analysis and design task. Previously this has been done by creating a custom program to read data produced by one application and then write that data to a file whose format is specific to the second application that needs all or part of the data. In this investigation, the data of interest are described using the XML markup language, which allows the data to be stored in a text string. Software to transform the output data of a task into an XML string, and software to read an XML string and extract all or a portion of the data needed by another application, are used to link two independent applications together as part of an overall design effort. This approach was initially used with a standard analysis program, Lapin, along with standard applications (a spreadsheet program, a relational database program, and a conventional dialog and display program) to demonstrate the successful sharing of data among independent programs. See Engineering Analysis Using a Web-Based Protocol by J.D. Schoeffler and R.W. Claus, NASA TM-2002-211981, October 2002. Most of the effort beyond that demonstration has been concentrated on the inclusion of more complex display programs. Specifically, a custom-written windowing program organized around dialogs to control the interactions has been combined with an independent CAD program (Open Cascade) that supports sophisticated display of CAD elements such as lines, spline curves, and surfaces, and with turbine-blade data produced by an independent blade design program (UD0300).
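
    The XML-string interchange described above can be sketched as a simple round trip: one program serializes named values into an XML text string, and another reads back all or only a subset of them. This is a minimal illustration using Python's standard library, not the actual Lapin tooling; element and attribute names are invented for the example.

```python
import xml.etree.ElementTree as ET

def to_xml_string(name, values):
    """Serialize a named dict of numeric results into an XML text string,
    so an independent program can consume part or all of the data."""
    root = ET.Element("dataset", name=name)
    for key, value in values.items():
        item = ET.SubElement(root, "item", key=key)
        item.text = repr(value)
    return ET.tostring(root, encoding="unicode")

def from_xml_string(text, keys=None):
    """Parse the XML string back, optionally extracting only the keys of interest."""
    root = ET.fromstring(text)
    out = {}
    for item in root.findall("item"):
        key = item.get("key")
        if keys is None or key in keys:
            out[key] = float(item.text)
    return out
```

    Because the intermediate form is plain text, any application that can parse XML can take just the fields it needs, which is the point of the approach.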

  2. OpenMS: a flexible open-source software platform for mass spectrometry data analysis.

    PubMed

    Röst, Hannes L; Sachsenberg, Timo; Aiche, Stephan; Bielow, Chris; Weisser, Hendrik; Aicheler, Fabian; Andreotti, Sandro; Ehrlich, Hans-Christian; Gutenbrunner, Petra; Kenar, Erhan; Liang, Xiao; Nahnsen, Sven; Nilse, Lars; Pfeuffer, Julianus; Rosenberger, George; Rurik, Marc; Schmitt, Uwe; Veit, Johannes; Walzer, Mathias; Wojnar, David; Wolski, Witold E; Schilling, Oliver; Choudhary, Jyoti S; Malmström, Lars; Aebersold, Ruedi; Reinert, Knut; Kohlbacher, Oliver

    2016-08-30

    High-resolution mass spectrometry (MS) has become an important tool in the life sciences, contributing to the diagnosis and understanding of human diseases, elucidating biomolecular structural information and characterizing cellular signaling networks. However, the rapid growth in the volume and complexity of MS data makes transparent, accurate and reproducible analysis difficult. We present OpenMS 2.0 (http://www.openms.de), a robust, open-source, cross-platform software specifically designed for the flexible and reproducible analysis of high-throughput MS data. The extensible OpenMS software implements common mass spectrometric data processing tasks through a well-defined application programming interface in C++ and Python and through standardized open data formats. OpenMS additionally provides a set of 185 tools and ready-made workflows for common mass spectrometric data processing tasks, which enable users to perform complex quantitative mass spectrometric analyses with ease.

  3. CISUS: an integrated 3D ultrasound system for IGT using a modular tracking API

    NASA Astrophysics Data System (ADS)

    Boctor, Emad M.; Viswanathan, Anand; Pieper, Steve; Choti, Michael A.; Taylor, Russell H.; Kikinis, Ron; Fichtinger, Gabor

    2004-05-01

    Ultrasound has become popular in clinical/surgical applications, both as the primary image guidance modality and in conjunction with other modalities like CT or MRI. Three-dimensional ultrasound (3DUS) systems have also demonstrated usefulness in image-guided therapy (IGT). At the same time, however, the current lack of open-source and open-architecture multi-modal medical visualization systems prevents 3DUS from fulfilling its potential. Several stand-alone 3DUS systems, like Stradx or In-Vivo, exist today. Although these systems have been found to be useful in real clinical settings, it is difficult to augment their functionality and integrate them in versatile IGT systems. To address these limitations, a robotic/freehand 3DUS open environment (CISUS) is being integrated into the 3D Slicer, an open-source research tool developed for medical image analysis and surgical planning. In addition, the system capitalizes on generic application programming interfaces (APIs) for tracking devices and robotic control. The resulting platform-independent open-source system may serve as a valuable tool to the image guided surgery community. Other researchers could straightforwardly integrate the generic CISUS system along with other functionalities (i.e. dual view visualization, registration, real-time tracking, segmentation, etc.) to rapidly create their medical/surgical applications. Our current driving clinical application is robotically assisted and freehand 3DUS-guided liver ablation, which is being fully integrated under the CISUS-3D Slicer. Initial functionality and pre-clinical feasibility are demonstrated on phantom and ex-vivo animal models.

  4. OpenARC: Extensible OpenACC Compiler Framework for Directive-Based Accelerator Programming Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Seyong; Vetter, Jeffrey S

    2014-01-01

    Directive-based, accelerator programming models such as OpenACC have arisen as an alternative solution to program emerging Scalable Heterogeneous Computing (SHC) platforms. However, the increased complexity in the SHC systems incurs several challenges in terms of portability and productivity. This paper presents an open-sourced OpenACC compiler, called OpenARC, which serves as an extensible research framework to address those issues in the directive-based accelerator programming. This paper explains important design strategies and key compiler transformation techniques needed to implement the reference OpenACC compiler. Moreover, this paper demonstrates the efficacy of OpenARC as a research framework for directive-based programming study, by proposing and implementing OpenACC extensions in the OpenARC framework to 1) support hybrid programming of the unified memory and separate memory and 2) exploit architecture-specific features in an abstract manner. Porting thirteen standard OpenACC programs and three extended OpenACC programs to CUDA GPUs shows that OpenARC performs similarly to a commercial OpenACC compiler, while it serves as a high-level research framework.

  5. Evaluating Theory-Based Evaluation: Information, Norms, and Adherence

    ERIC Educational Resources Information Center

    Jacobs, W. Jake; Sisco, Melissa; Hill, Dawn; Malter, Frederic; Figueredo, Aurelio Jose

    2012-01-01

    Programmatic social interventions attempt to produce appropriate social-norm-guided behavior in an open environment. A marriage of applicable psychological theory, appropriate program evaluation theory, and outcome of evaluations of specific social interventions assures the acquisition of cumulative theory and the production of successful social…

  6. 44 CFR 80.13 - Application information.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... HOMELAND SECURITY INSURANCE AND HAZARD MITIGATION National Flood Insurance Program PROPERTY ACQUISITION AND... acquisition of property for the purpose of open space must include: (1) A photograph that represents the... language, which shall be consistent with the FEMA model deed restriction that the local government will...

  7. The Future of Library Automation in Schools.

    ERIC Educational Resources Information Center

    Anderson, Elaine

    2000-01-01

    Addresses the future of library automation programs for schools. Discusses requirements of emerging OPACs and circulation systems; the Schools Interoperability Framework (SIF), an industry initiative to develop an open specification for ensuring that K-12 instructional and administrative software applications work together more effectively; home…

  8. Research, Collaboration, and Open Science Using Web 2.0

    PubMed Central

    Shee, Kevin; Strong, Michael; Guido, Nicholas J.; Lue, Robert A.; Church, George M.; Viel, Alain

    2010-01-01

    There is little doubt that the Internet has transformed the world in which we live. Information that was once archived in bricks and mortar libraries is now only a click away, and people across the globe have become connected in a manner inconceivable only 20 years ago. Although many scientists and educators have embraced the Internet as an invaluable tool for research, education and data sharing, some have been somewhat slower to take full advantage of emerging Web 2.0 technologies. Here we discuss the benefits and challenges of integrating Web 2.0 applications into undergraduate research and education programs, based on our experience utilizing these technologies in a summer undergraduate research program in synthetic biology at Harvard University. We discuss the use of applications including wiki-based documentation, digital brainstorming, and open data sharing via the Web, to facilitate the educational aspects and collaborative progress of undergraduate research projects. We hope to inspire others to integrate these technologies into their own coursework or research projects. PMID:23653712

  9. Fracture mechanics life analytical methods verification testing

    NASA Technical Reports Server (NTRS)

    Favenesi, J. A.; Clemons, T. G.; Riddell, W. T.; Ingraffea, A. R.; Wawrzynek, P. A.

    1994-01-01

    The objective was to evaluate NASCRAC (trademark) version 2.0, a second generation fracture analysis code, for verification and validity. NASCRAC was evaluated using a combination of comparisons to the literature, closed-form solutions, numerical analyses, and tests. Several limitations and minor errors were detected. Additionally, a number of major flaws were discovered. These major flaws were generally due to application of a specific method or theory, not due to programming logic. Results are presented for the following program capabilities: K versus a, J versus a, crack opening area, life calculation due to fatigue crack growth, tolerable crack size, proof test logic, tearing instability, creep crack growth, crack transitioning, crack retardation due to overloads, and elastic-plastic stress redistribution. It is concluded that the code is an acceptable fracture tool for K solutions of simplified geometries, for a limited number of J and crack opening area solutions, and for fatigue crack propagation with the Paris equation and constant amplitude loads when the Paris equation is applicable.
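
    The fatigue-crack-growth capability mentioned above rests on the Paris equation, da/dN = C·(ΔK)^m with stress-intensity range ΔK = Y·Δσ·√(πa) under constant-amplitude loading. A minimal numerical life integration might look like the following; this is a generic sketch, not NASCRAC's implementation, and the parameter values in use are illustrative.

```python
import math

def paris_life(a0, af, delta_sigma, C, m, Y=1.0, steps=10000):
    """Estimate fatigue life in cycles by integrating the Paris equation
    da/dN = C * (dK)^m, with dK = Y * delta_sigma * sqrt(pi * a),
    from initial crack size a0 to final size af (midpoint rule)."""
    cycles = 0.0
    da = (af - a0) / steps
    a = a0
    for _ in range(steps):
        a_mid = a + 0.5 * da                      # midpoint of this crack increment
        dk = Y * delta_sigma * math.sqrt(math.pi * a_mid)
        cycles += da / (C * dk ** m)              # dN = da / (da/dN)
        a += da
    return cycles
```

    A useful sanity check follows directly from the equation: with m = 3, doubling the stress range cuts the predicted life by a factor of exactly 2³ = 8.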

  10. Impact of Doximity Residency Rankings on Emergency Medicine Applicant Rank Lists.

    PubMed

    Peterson, William J; Hopson, Laura R; Khandelwal, Sorabh; White, Melissa; Gallahue, Fiona E; Burkhardt, John; Rolston, Aimee M; Santen, Sally A

    2016-05-01

    This study investigates the impact of the Doximity rankings on the rank list choices made by residency applicants in emergency medicine (EM). We sent an 11-item survey by email to all students who applied to EM residency programs at four different institutions representing diverse geographical regions. Students were asked questions about their perception of Doximity rankings and how it may have impacted their rank list decisions. Response rate was 58% of 1,372 opened electronic surveys. This study found that a majority of medical students applying to residency in EM were aware of the Doximity rankings prior to submitting rank lists (67%). One-quarter of these applicants changed the number of programs and ranks of those programs when completing their rank list based on the Doximity rankings (26%). Though the absolute number of programs changed on the rank lists was small, the results demonstrate that the EM Doximity rankings impact applicant decision-making in ranking residency programs. While applicants do not find the Doximity rankings to be important compared to other factors in the application process, the Doximity rankings result in a small change in residency applicant ranking behavior. This unvalidated ranking, based principally on reputational data rather than objective outcome criteria, thus has the potential to be detrimental to students, programs, and the public. We feel it important for specialties to develop consensus around measurable training outcomes and provide freely accessible metrics for candidate education.

  11. PedVizApi: a Java API for the interactive, visual analysis of extended pedigrees.

    PubMed

    Fuchsberger, Christian; Falchi, Mario; Forer, Lukas; Pramstaller, Peter P

    2008-01-15

    PedVizApi is a Java API (application program interface) for the visual analysis of large and complex pedigrees. It provides all the necessary functionality for the interactive exploration of extended genealogies. While available packages are mostly focused on a static representation or cannot be added to an existing application, PedVizApi is a highly flexible open source library for the efficient construction of visual-based applications for the analysis of family data. An extensive demo application and an R interface are provided. http://www.pedvizapi.org

  12. 3D geospatial visualizations: Animation and motion effects on spatial objects

    NASA Astrophysics Data System (ADS)

    Evangelidis, Konstantinos; Papadopoulos, Theofilos; Papatheodorou, Konstantinos; Mastorokostas, Paris; Hilas, Constantinos

    2018-02-01

    Digital Elevation Models (DEMs), in combination with high quality raster graphics, provide realistic three-dimensional (3D) representations of the globe (virtual globe) and an amazing navigation experience over the terrain through earth browsers. In addition, the adoption of interoperable geospatial mark-up languages (e.g. KML) and open programming libraries (Javascript) also makes it possible to create 3D spatial objects and convey on them the sensation of any type of texture by utilizing open 3D representation models (e.g. Collada). One step beyond, by employing WebGL frameworks (e.g. Cesium.js, three.js), animation and motion effects can be attributed to 3D models. However, major GIS-based functionalities in combination with all the above mentioned visualization capabilities, such as animation effects on selected areas of the terrain texture (e.g. sea waves) as well as motion effects on 3D objects moving along dynamically defined georeferenced terrain paths (e.g. the motion of an animal over a hill, or of a big fish in an ocean), are not widely supported, at least by open geospatial applications or development frameworks. Towards this end, we developed and made available to the research community an open geospatial software application prototype that provides high level capabilities for dynamically creating user defined virtual geospatial worlds populated by selected animated and moving 3D models on user specified locations, paths and areas. At the same time, the generated code may enhance existing open visualization frameworks and programming libraries dealing with 3D simulations with the geospatial aspect of a virtual world.

  13. pytc: Open-Source Python Software for Global Analyses of Isothermal Titration Calorimetry Data.

    PubMed

    Duvvuri, Hiranmayi; Wheeler, Lucas C; Harms, Michael J

    2018-05-08

    Here we describe pytc, an open-source Python package for global fits of thermodynamic models to multiple isothermal titration calorimetry experiments. Key features include simplicity, the ability to implement new thermodynamic models, a robust maximum likelihood fitter, a fast Bayesian Markov-Chain Monte Carlo sampler, rigorous implementation, extensive documentation, and full cross-platform compatibility. pytc fitting can be done using an application program interface or via a graphical user interface. It is available for download at https://github.com/harmslab/pytc .
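
    The core idea of a global fit, one parameter constrained simultaneously by several experiments, can be sketched in pure Python. The model below (a shared slope K with per-experiment baselines) and all names are invented for illustration and are not pytc's API:

```python
# Illustration only: a "global fit" shares one parameter across several
# datasets, as pytc does for thermodynamic models across ITC experiments.
# Here the shared parameter is a slope K; each dataset keeps its own baseline.

def global_fit(datasets):
    """Least-squares fit of y = K*x + b_i with K shared across all datasets."""
    num = den = 0.0
    for xs, ys in datasets:
        mx = sum(xs) / len(xs)
        my = sum(ys) / len(ys)
        num += sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        den += sum((x - mx) ** 2 for x in xs)
    K = num / den
    baselines = [sum(ys) / len(ys) - K * sum(xs) / len(xs) for xs, ys in datasets]
    return K, baselines

# Two synthetic "experiments" generated from K = 2 with different baselines.
exp1 = ([0.0, 1.0, 2.0], [1.0, 3.0, 5.0])     # baseline  1
exp2 = ([0.0, 1.0, 2.0], [-4.0, -2.0, 0.0])   # baseline -4
K, (b1, b2) = global_fit([exp1, exp2])
print(K, b1, b2)   # -> 2.0 1.0 -4.0
```

    Because every dataset constrains the shared parameter, the global estimate is typically tighter than fitting each experiment alone, which is the motivation for global fitting of multiple ITC runs.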

  14. Orchestrating high-throughput genomic analysis with Bioconductor

    PubMed Central

    Huber, Wolfgang; Carey, Vincent J.; Gentleman, Robert; Anders, Simon; Carlson, Marc; Carvalho, Benilton S.; Bravo, Hector Corrada; Davis, Sean; Gatto, Laurent; Girke, Thomas; Gottardo, Raphael; Hahne, Florian; Hansen, Kasper D.; Irizarry, Rafael A.; Lawrence, Michael; Love, Michael I.; MacDonald, James; Obenchain, Valerie; Oleś, Andrzej K.; Pagès, Hervé; Reyes, Alejandro; Shannon, Paul; Smyth, Gordon K.; Tenenbaum, Dan; Waldron, Levi; Morgan, Martin

    2015-01-01

    Bioconductor is an open-source, open-development software project for the analysis and comprehension of high-throughput data in genomics and molecular biology. The project aims to enable interdisciplinary research, collaboration and rapid development of scientific software. Based on the statistical programming language R, Bioconductor comprises 934 interoperable packages contributed by a large, diverse community of scientists. Packages cover a range of bioinformatic and statistical applications. They undergo formal initial review and continuous automated testing. We present an overview for prospective users and contributors. PMID:25633503

  15. 48 CFR 22.404-8 - Notification of improper wage determination before award.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... FEDERAL ACQUISITION REGULATION SOCIOECONOMIC PROGRAMS APPLICATION OF LABOR LAWS TO GOVERNMENT ACQUISITIONS... determination is withdrawn by the Administrative Review Board. (b) In sealed bidding, the contracting officer... determination for the primary site of the work reaches the contracting officer before bid opening, the...

  16. A Review of Gas-Cooled Reactor Concepts for SDI Applications

    DTIC Science & Technology

    1989-08-01

    710 program.) Wire-Core Reactor (proposed by Rockwell). The wire-core reactor utilizes thin fuel wires woven between spacer wires to form an open...reactor is based on results of developmental studies of nuclear rocket propulsion systems. The reactor core is made up of annular fuel assemblies of...XE Addendum to Volume II. NERVA Fuel Development, Westinghouse Astronuclear Laboratory, TNR-230, July 15, 1972. J I8- Rover Program Reactor Tests

  17. Applications of Dynamic Deployment of Services in Industrial Automation

    NASA Astrophysics Data System (ADS)

    Candido, Gonçalo; Barata, José; Jammes, François; Colombo, Armando W.

    Service-oriented Architecture (SOA) is becoming a de facto paradigm for business and enterprise integration. SOA is expanding into several domains of application, envisioning a unified solution suitable across all the different layers of an enterprise infrastructure. The application of SOA based on open web standards can significantly enhance the interoperability and openness of such devices. By embedding a dynamic deployment service even into small field devices, it becomes possible for machine builders to provide built-in services while still allowing the integrator to deploy on-the-run the services that best fit the current application. This approach allows developers to keep their preferred development language and still deliver a SOA-compliant application. A dynamic deployment service is envisaged as a fundamental framework to support more complex applications, reducing deployment delays while increasing overall system agility. As a use-case scenario, a dynamic deployment service was implemented over the DPWS and WS-Management specifications, allowing an automation application to be designed and programmed using IEC 61131 languages and its components to be deployed as web services into devices.

  18. Clinical software development for the Web: lessons learned from the BOADICEA project

    PubMed Central

    2012-01-01

    Background In the past 20 years, society has witnessed the following landmark scientific advances: (i) the sequencing of the human genome, (ii) the distribution of software by the open source movement, and (iii) the invention of the World Wide Web. Together, these advances have provided a new impetus for clinical software development: developers now translate the products of human genomic research into clinical software tools; they use open-source programs to build them; and they use the Web to deliver them. Whilst this open-source component-based approach has undoubtedly made clinical software development easier, clinical software projects are still hampered by problems that traditionally accompany the software process. This study describes the development of the BOADICEA Web Application, a computer program used by clinical geneticists to assess risks to patients with a family history of breast and ovarian cancer. The key challenge of the BOADICEA Web Application project was to deliver a program that was safe, secure and easy for healthcare professionals to use. We focus on the software process, problems faced, and lessons learned. Our key objectives are: (i) to highlight key clinical software development issues; (ii) to demonstrate how software engineering tools and techniques can facilitate clinical software development for the benefit of individuals who lack software engineering expertise; and (iii) to provide a clinical software development case report that can be used as a basis for discussion at the start of future projects. Results We developed the BOADICEA Web Application using an evolutionary software process. Our approach to Web implementation was conservative and we used conventional software engineering tools and techniques. The principal software development activities were: requirements, design, implementation, testing, documentation and maintenance. The BOADICEA Web Application has now been widely adopted by clinical geneticists and researchers. 
BOADICEA Web Application version 1 was released for general use in November 2007. By May 2010, we had > 1200 registered users based in the UK, USA, Canada, South America, Europe, Africa, Middle East, SE Asia, Australia and New Zealand. Conclusions We found that an evolutionary software process was effective when we developed the BOADICEA Web Application. The key clinical software development issues identified during the BOADICEA Web Application project were: software reliability, Web security, clinical data protection and user feedback. PMID:22490389

  19. Clinical software development for the Web: lessons learned from the BOADICEA project.

    PubMed

    Cunningham, Alex P; Antoniou, Antonis C; Easton, Douglas F

    2012-04-10

    In the past 20 years, society has witnessed the following landmark scientific advances: (i) the sequencing of the human genome, (ii) the distribution of software by the open source movement, and (iii) the invention of the World Wide Web. Together, these advances have provided a new impetus for clinical software development: developers now translate the products of human genomic research into clinical software tools; they use open-source programs to build them; and they use the Web to deliver them. Whilst this open-source component-based approach has undoubtedly made clinical software development easier, clinical software projects are still hampered by problems that traditionally accompany the software process. This study describes the development of the BOADICEA Web Application, a computer program used by clinical geneticists to assess risks to patients with a family history of breast and ovarian cancer. The key challenge of the BOADICEA Web Application project was to deliver a program that was safe, secure and easy for healthcare professionals to use. We focus on the software process, problems faced, and lessons learned. Our key objectives are: (i) to highlight key clinical software development issues; (ii) to demonstrate how software engineering tools and techniques can facilitate clinical software development for the benefit of individuals who lack software engineering expertise; and (iii) to provide a clinical software development case report that can be used as a basis for discussion at the start of future projects. We developed the BOADICEA Web Application using an evolutionary software process. Our approach to Web implementation was conservative and we used conventional software engineering tools and techniques. The principal software development activities were: requirements, design, implementation, testing, documentation and maintenance. The BOADICEA Web Application has now been widely adopted by clinical geneticists and researchers. 
BOADICEA Web Application version 1 was released for general use in November 2007. By May 2010, we had > 1200 registered users based in the UK, USA, Canada, South America, Europe, Africa, Middle East, SE Asia, Australia and New Zealand. We found that an evolutionary software process was effective when we developed the BOADICEA Web Application. The key clinical software development issues identified during the BOADICEA Web Application project were: software reliability, Web security, clinical data protection and user feedback.

  20. Code Parallelization with CAPO: A User Manual

    NASA Technical Reports Server (NTRS)

    Jin, Hao-Qiang; Frumkin, Michael; Yan, Jerry; Biegel, Bryan (Technical Monitor)

    2001-01-01

    A software tool has been developed to assist the parallelization of scientific codes. This tool, CAPO, extends an existing parallelization toolkit, CAPTools, developed at the University of Greenwich, to generate OpenMP parallel codes for shared memory architectures. It is an interactive toolkit for transforming a serial Fortran application code into an equivalent parallel version of the software - in a small fraction of the time normally required for a manual parallelization. We first discuss the way in which loop types are categorized and how efficient OpenMP directives can be defined and inserted into the existing code using in-depth interprocedural analysis. The use of the toolkit on a number of application codes, ranging from benchmarks to real-world applications, is then presented. This demonstrates the great potential of using the toolkit to quickly parallelize serial programs, as well as the good performance achievable on a large number of processors. The second part of the document gives references for the parameters and the graphical user interface implemented in the toolkit. Finally, a set of tutorials is included for hands-on experience with this toolkit.

  1. Open Source Web-Based Solutions for Disseminating and Analyzing Flood Hazard Information at the Community Level

    NASA Astrophysics Data System (ADS)

    Santillan, M. M.-M.; Santillan, J. R.; Morales, E. M. O.

    2017-09-01

    We discuss in this paper the development, including the features and functionalities, of an open source web-based flood hazard information dissemination and analytical system called "Flood EViDEns". Flood EViDEns is short for "Flood Event Visualization and Damage Estimations", an application developed by Caraga State University to address the needs of local disaster managers in the Caraga Region in Mindanao, Philippines in accessing timely and relevant flood hazard information before, during and after the occurrence of flood disasters at the community (i.e., barangay and household) level. The web application makes use of various free/open source web mapping and visualization technologies (GeoServer, GeoDjango, OpenLayers, Bootstrap), various geospatial datasets including LiDAR-derived elevation and information products, hydro-meteorological data, and flood simulation models to visualize various scenarios of flooding and its associated damage to infrastructure. The Flood EViDEns application facilitates the release and utilization of this flood-related information through a user-friendly front-end interface consisting of web maps and tables. A public version of the application can be accessed at http://121.97.192.11:8082/. The application is currently being expanded to cover additional sites in Mindanao, Philippines through the "Geo-informatics for the Systematic Assessment of Flood Effects and Risks for a Resilient Mindanao" or "Geo-SAFER Mindanao" Program.

  2. Access Control of Web- and Java-Based Applications

    NASA Technical Reports Server (NTRS)

    Tso, Kam S.; Pajevski, Michael J.

    2013-01-01

    Cybersecurity has become a great concern as threats of service interruption, unauthorized access, stealing and altering of information, and spreading of viruses have become more prevalent and serious. Application layer access control of applications is a critical component in the overall security solution that also includes encryption, firewalls, virtual private networks, antivirus, and intrusion detection. An access control solution, based on an open-source access manager augmented with custom software components, was developed to provide protection to both Web-based and Java-based client and server applications. The DISA Security Service (DISA-SS) provides common access control capabilities for AMMOS software applications through a set of application programming interfaces (APIs) and network-accessible security services for authentication, single sign-on, authorization checking, and authorization policy management. The OpenAM access management technology designed for Web applications can be extended to meet the needs of Java thick clients and standalone servers that are commonly used in the JPL AMMOS environment. The DISA-SS reusable components have greatly reduced the effort for each AMMOS subsystem to develop its own access control strategy. The novelty of this work is that it leverages an open-source access management product that was designed for Web-based applications to provide access control for Java thick clients and Java standalone servers. Thick clients and standalone servers are still commonly used in businesses and government, especially for applications that require rich graphical user interfaces and high-performance visualization that cannot be met by thin clients running on Web browsers.
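
    The authorization-checking pattern described above, application code consulting a central policy before a protected operation runs, can be sketched as follows. The decorator, the policy store and all names are invented for illustration and are not the DISA-SS API:

```python
# Hypothetical sketch of application-layer authorization checking, in the
# spirit of the access-control APIs described above (names are invented).
import functools

# A toy in-memory policy: role -> set of permitted operations.
POLICY = {"operator": {"view"}, "admin": {"view", "configure"}}

class AccessDenied(Exception):
    pass

def requires(operation):
    """Decorator: allow the call only if the user's role permits `operation`."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(user, *args, **kwargs):
            if operation not in POLICY.get(user["role"], set()):
                raise AccessDenied(f"{user['name']} may not {operation}")
            return func(user, *args, **kwargs)
        return wrapper
    return decorator

@requires("configure")
def update_settings(user, key, value):
    return f"{key} set to {value}"

print(update_settings({"name": "alice", "role": "admin"}, "timeout", 30))
# The same call made by an "operator" raises AccessDenied.
```

    In a real deployment the policy lookup would be a call to a network-accessible policy service rather than an in-process dictionary, but the control-flow shape is the same.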

  3. POSTOP: Postbuckled open-stiffener optimum panels, user's manual

    NASA Technical Reports Server (NTRS)

    Biggers, S. B.; Dickson, J. N.

    1984-01-01

    The computer program POSTOP was developed to serve as an aid in the analysis and sizing of stiffened composite panels that may be loaded in the postbuckling regime. It is intended for the preliminary design of metal or composite panels with open-section stiffeners, subjected to multiple combined biaxial compression (or tension), shear and normal pressure load cases. Longitudinal compression, however, is assumed to be the dominant loading. Temperature, initial bow eccentricity and load eccentricity effects are included. The panel geometry is assumed to be repetitive over several bays in the longitudinal (stiffener) direction as well as in the transverse direction. Analytical routines are included to compute panel stiffnesses, strains, local and panel buckling loads, and skin/stiffener interface stresses. The resulting program is applicable to stiffened panels as commonly used in fuselage, wing, or empennage structures. The capabilities and limitations of the code are described. Instructions required to use the program and several example problems are included.

  4. Clock Agreement Among Parallel Supercomputer Nodes

    DOE Data Explorer

    Jones, Terry R.; Koenig, Gregory A.

    2014-04-30

    This dataset presents measurements that quantify the clock synchronization time-agreement characteristics among several high performance computers, including the current world's most powerful machine for open science, the U.S. Department of Energy's Titan machine sited at Oak Ridge National Laboratory. These ultra-fast machines derive much of their computational capability from extreme node counts (over 18000 nodes in the case of the Titan machine). Time-agreement is commonly utilized by parallel programming applications and tools, distributed programming applications and tools, and system software. Our time-agreement measurements detail the degree of time variance between nodes and how that variance changes over time. The dataset includes empirical measurements and the accompanying spreadsheets.

  5. cp-R, an interface to the R programming language for clinical laboratory method comparisons.

    PubMed

    Holmes, Daniel T

    2015-02-01

    Clinical scientists frequently need to compare two different bioanalytical methods as part of assay validation/monitoring. As a matter of necessity, regression methods for quantitative comparison in clinical chemistry, hematology and other clinical laboratory disciplines must allow for error in both the x and y variables. Traditionally the methods popularized by 1) Deming and 2) Passing and Bablok have been recommended. While commercial tools exist, no simple open source tool is available. The purpose of this work was to develop an entirely open-source GUI-driven program for bioanalytical method comparisons capable of performing these regression methods and able to produce highly customized graphical output. The GUI is written in Python and PyQt4, with R scripts performing the regression and graphical functions. The program can be run from source code or as a pre-compiled binary executable. The software performs three forms of regression and offers weighting where applicable. Confidence bands of the regression are calculated using bootstrapping for the Deming and Passing-Bablok methods. Users can customize regression plots according to the tools available in R and can produce output in any of jpg, png, tiff, or bmp at any desired resolution, or in the ps and pdf vector formats. Bland-Altman plots and some regression diagnostic plots are also generated. The correctness of the regression parameter estimates was confirmed against existing R packages. The program allows for rapid and highly customizable graphical output capable of conforming to the publication requirements of any clinical chemistry journal. Quick method comparisons can also be performed and cut and pasted into spreadsheet or word processing applications. We present a simple and intuitive open source tool for quantitative method comparison in a clinical laboratory environment. Copyright © 2014 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
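
    Of the regression methods mentioned, Deming regression has a simple closed form that can be sketched in pure Python. Here `lambda_` denotes the assumed ratio of the y to x error variances (1.0 gives orthogonal regression), and the sample data are invented:

```python
# A minimal pure-Python Deming regression (error in both x and y), the first
# of the regression methods the abstract mentions.
import math

def deming(xs, ys, lambda_=1.0):
    """Slope and intercept of the Deming fit; lambda_ is the y/x error-variance ratio."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs) / (n - 1)
    syy = sum((y - my) ** 2 for y in ys) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    disc = math.sqrt((syy - lambda_ * sxx) ** 2 + 4 * lambda_ * sxy ** 2)
    slope = (syy - lambda_ * sxx + disc) / (2 * sxy)
    intercept = my - slope * mx
    return slope, intercept

# Two methods measuring the same analyte; perfect agreement up to a shift.
method_a = [1.0, 2.0, 3.0, 4.0, 5.0]
method_b = [1.1, 2.1, 3.1, 4.1, 5.1]
slope, intercept = deming(method_a, method_b)
print(slope, intercept)   # slope 1.0, intercept 0.1 (up to rounding)
```

    Unlike ordinary least squares, which attributes all error to y, this fit treats both measurement methods as noisy, which is why it is preferred for method-comparison studies.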

  6. Aerodynamic design guidelines and computer program for estimation of subsonic wind tunnel performance

    NASA Technical Reports Server (NTRS)

    Eckert, W. T.; Mort, K. W.; Jope, J.

    1976-01-01

    General guidelines are given for the design of diffusers, contractions, corners, and the inlets and exits of non-return tunnels. A system of equations, reflecting the current technology, has been compiled and assembled into a computer program (a user's manual for this program is included) for determining the total pressure losses. The formulation presented is applicable to compressible flow through most closed- or open-throat, single-, double-, or non-return wind tunnels. A comparison of estimated performance with that actually achieved by several existing facilities produced generally good agreement.

  7. OpenMM 4: A Reusable, Extensible, Hardware Independent Library for High Performance Molecular Simulation.

    PubMed

    Eastman, Peter; Friedrichs, Mark S; Chodera, John D; Radmer, Randall J; Bruns, Christopher M; Ku, Joy P; Beauchamp, Kyle A; Lane, Thomas J; Wang, Lee-Ping; Shukla, Diwakar; Tye, Tony; Houston, Mike; Stich, Timo; Klein, Christoph; Shirts, Michael R; Pande, Vijay S

    2013-01-08

    OpenMM is a software toolkit for performing molecular simulations on a range of high performance computing architectures. It is based on a layered architecture: the lower layers function as a reusable library that can be invoked by any application, while the upper layers form a complete environment for running molecular simulations. The library API hides all hardware-specific dependencies and optimizations from the users and developers of simulation programs: they can be run without modification on any hardware on which the API has been implemented. The current implementations of OpenMM include support for graphics processing units using the OpenCL and CUDA frameworks. In addition, OpenMM was designed to be extensible, so new hardware architectures can be accommodated and new functionality (e.g., energy terms and integrators) can be easily added.
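
    The layered, hardware-independent design described above can be sketched as a stable front-end API dispatching to interchangeable backends. The class names below are invented for illustration and are not OpenMM's actual API:

```python
# Illustrative sketch (not OpenMM's real classes): a hardware-independent
# front end dispatching to interchangeable backend "platforms".
from abc import ABC, abstractmethod

class Platform(ABC):
    """Backend layer: each hardware target implements the same interface."""
    @abstractmethod
    def compute_forces(self, positions):
        ...

class ReferencePlatform(Platform):
    def compute_forces(self, positions):
        # Trivial stand-in for a real force kernel (harmonic restoring force).
        return [-x for x in positions]

class Simulation:
    """Front-end layer: user code never touches hardware specifics."""
    def __init__(self, platform: Platform):
        self.platform = platform

    def step(self, positions):
        return self.platform.compute_forces(positions)

sim = Simulation(ReferencePlatform())
print(sim.step([1.0, -2.0, 0.5]))   # -> [-1.0, 2.0, -0.5]
```

    Supporting a new accelerator in this scheme means adding another `Platform` implementation; user-facing simulation code is unchanged, which is the extensibility property the abstract describes.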

  8. OpenMM 4: A Reusable, Extensible, Hardware Independent Library for High Performance Molecular Simulation

    PubMed Central

    Eastman, Peter; Friedrichs, Mark S.; Chodera, John D.; Radmer, Randall J.; Bruns, Christopher M.; Ku, Joy P.; Beauchamp, Kyle A.; Lane, Thomas J.; Wang, Lee-Ping; Shukla, Diwakar; Tye, Tony; Houston, Mike; Stich, Timo; Klein, Christoph; Shirts, Michael R.; Pande, Vijay S.

    2012-01-01

    OpenMM is a software toolkit for performing molecular simulations on a range of high performance computing architectures. It is based on a layered architecture: the lower layers function as a reusable library that can be invoked by any application, while the upper layers form a complete environment for running molecular simulations. The library API hides all hardware-specific dependencies and optimizations from the users and developers of simulation programs: they can be run without modification on any hardware on which the API has been implemented. The current implementations of OpenMM include support for graphics processing units using the OpenCL and CUDA frameworks. In addition, OpenMM was designed to be extensible, so new hardware architectures can be accommodated and new functionality (e.g., energy terms and integrators) can be easily added. PMID:23316124

  9. Spectral-element Seismic Wave Propagation on CUDA/OpenCL Hardware Accelerators

    NASA Astrophysics Data System (ADS)

    Peter, D. B.; Videau, B.; Pouget, K.; Komatitsch, D.

    2015-12-01

    Seismic wave propagation codes are essential tools to investigate a variety of wave phenomena in the Earth. Furthermore, they can now be used for seismic full-waveform inversions in regional- and global-scale adjoint tomography. Although these seismic wave propagation solvers are crucial ingredients to improve the resolution of tomographic images to answer important questions about the nature of Earth's internal processes and subsurface structure, their practical application is often limited due to high computational costs. They thus need high-performance computing (HPC) facilities to improve the current state of knowledge. At present, numerous large HPC systems embed many-core architectures such as graphics processing units (GPUs) to enhance numerical performance. Such hardware accelerators can be programmed using either the CUDA programming environment or the OpenCL language standard. CUDA software development targets NVIDIA graphics cards, while OpenCL has been adopted by additional hardware accelerators such as AMD graphics cards, ARM-based processors and Intel Xeon Phi coprocessors. For seismic wave propagation simulations using the open-source spectral-element code package SPECFEM3D_GLOBE, we incorporated an automatic source-to-source code generation tool (BOAST) which allows us to use meta-programming of all computational kernels for forward and adjoint runs. Using our BOAST kernels, we generate optimized source code for both the CUDA and OpenCL languages within the source code package. Thus, seismic wave simulations are now able to fully utilize CUDA and OpenCL hardware accelerators. We show benchmarks of forward seismic wave propagation simulations using SPECFEM3D_GLOBE on CUDA/OpenCL GPUs, validating results and comparing performance for different simulations and hardware usages.
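
    The source-to-source idea can be sketched as template substitution: one abstract kernel description rendered into boilerplate for several target languages. The templates below are illustrative toys; BOAST itself is a far richer Ruby-based DSL with real transformations:

```python
# Toy sketch of meta-programmed kernel generation: one kernel description
# rendered to source for more than one target (templates are invented).
from string import Template

TEMPLATES = {
    "cuda": Template(
        "__global__ void $name(float *a, float *b, int n) {\n"
        "  int i = blockIdx.x * blockDim.x + threadIdx.x;\n"
        "  if (i < n) a[i] = $expr;\n"
        "}\n"),
    "opencl": Template(
        "__kernel void $name(__global float *a, __global float *b, int n) {\n"
        "  int i = get_global_id(0);\n"
        "  if (i < n) a[i] = $expr;\n"
        "}\n"),
}

def generate(target, name, expr):
    """Render the kernel body `expr` into the boilerplate for `target`."""
    return TEMPLATES[target].substitute(name=name, expr=expr)

print(generate("cuda", "scale", "2.0f * b[i]"))
```

    Keeping the kernel logic in one place and generating per-target boilerplate is what lets a single code base follow both the CUDA and OpenCL ecosystems without duplicated maintenance.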

  10. Argobots: A Lightweight Low-Level Threading and Tasking Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seo, Sangmin; Amer, Abdelhalim; Balaji, Pavan

    In the past few decades, a number of user-level threading and tasking models have been proposed in the literature to address the shortcomings of OS-level threads, primarily with respect to cost and flexibility. Current state-of-the-art user-level threading and tasking models, however, are either too specific to applications or architectures or are not as powerful or flexible. In this paper, we present Argobots, a lightweight, low-level threading and tasking framework that is designed as a portable and performant substrate for high-level programming models or runtime systems. Argobots offers a carefully designed execution model that balances generality of functionality with providing a rich set of controls to allow specialization by the user or high-level programming model. We describe the design, implementation, and optimization of Argobots and present integrations with three example high-level models: OpenMP, MPI, and co-located I/O service. Evaluations show that (1) Argobots outperforms existing generic threading runtimes; (2) our OpenMP runtime offers more efficient interoperability capabilities than production OpenMP runtimes do; (3) when MPI interoperates with Argobots instead of Pthreads, it enjoys reduced synchronization costs and better latency hiding capabilities; and (4) I/O service with Argobots reduces interference with co-located applications, achieving performance competitive with that of the Pthreads version.
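
    The essence of user-level tasking, cheap cooperative scheduling in user space rather than OS preemption, can be sketched with Python generators. This is a conceptual toy only, not Argobots:

```python
# Conceptual sketch: user-level tasks are cheap because they yield
# cooperatively instead of relying on OS preemption. Python generators give
# a minimal round-robin scheduler in the same spirit (this is not Argobots).
from collections import deque

log = []

def worker(name, steps):
    for i in range(steps):
        log.append(f"{name} step {i}")
        yield   # cooperative yield point: hand control back to the scheduler

def run(tasks):
    """Round-robin over user-level tasks until all complete."""
    ready = deque(tasks)
    while ready:
        task = ready.popleft()
        try:
            next(task)          # resume the task until its next yield
            ready.append(task)  # still runnable: requeue it
        except StopIteration:
            pass                # task finished

run([worker("A", 2), worker("B", 2)])
print(log)   # -> ['A step 0', 'B step 0', 'A step 1', 'B step 1']
```

    Because switching between such tasks is an ordinary function resume rather than a kernel context switch, millions of them can coexist cheaply, which is the cost argument made for user-level threading above.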

  11. Open source software projects of the caBIG In Vivo Imaging Workspace Software special interest group.

    PubMed

    Prior, Fred W; Erickson, Bradley J; Tarbox, Lawrence

    2007-11-01

    The Cancer Bioinformatics Grid (caBIG) program was created by the National Cancer Institute to facilitate sharing of IT infrastructure, data, and applications among the National Cancer Institute-sponsored cancer research centers. The program was launched in February 2004 and now links more than 50 cancer centers. In April 2005, the In Vivo Imaging Workspace was added to promote the use of imaging in cancer clinical trials. At the inaugural meeting, four special interest groups (SIGs) were established. The Software SIG was charged with identifying projects that focus on open-source software for image visualization and analysis. To date, two projects have been defined by the Software SIG. The eXtensible Imaging Platform project has produced a rapid application development environment that researchers may use to create targeted workflows customized for specific research projects. The Algorithm Validation Tools project will provide a set of tools and data structures that will be used to capture measurement information and associated metadata needed to allow a gold standard to be defined for the given database against which change analysis algorithms can be tested. Through these and future efforts, the caBIG In Vivo Imaging Workspace Software SIG endeavors to advance imaging informatics and provide new open-source software tools to advance cancer research.

  12. Ocean energy program summary. Volume 2: Research summaries

    NASA Astrophysics Data System (ADS)

    1990-01-01

    The oceans are the world's largest solar energy collector and storage system. Covering 71 percent of the earth's surface, the oceans store energy that is realized as waves, currents, and thermal and salinity gradients. The purpose of the Federal Ocean Energy Technology (OET) Program is to develop techniques that harness this ocean energy in a cost effective and environmentally acceptable manner. The OET Program seeks to develop ocean energy technology to a point where the commercial sector can assess whether applications of the technology are viable energy conversion alternatives or supplements to systems. Past studies conducted by the U.S. Department of Energy (DOE) have identified ocean thermal energy conversion (OTEC) as the largest potential contributor to United States energy supplies from the ocean resource. As a result, the OET Program concentrates on research to advance OTEC technology. Current program emphasis has shifted to open-cycle OTEC power system research because the closed-cycle OTEC system is at a more advanced stage of development and has already attracted industrial interest. During FY 1989, the OET Program focused primarily on the technical uncertainties associated with near-shore open-cycle OTEC systems ranging in size from 2 to 15 MWe. Activities were performed under three major program elements: thermodynamic research and analysis, experimental verification and testing, and materials and structures research. These efforts addressed a variety of technical problems whose resolution is crucial to demonstrating the viability of open-cycle OTEC technology. This publication is one of a series of documents on the Renewable Energy programs sponsored by the U.S. Department of Energy. An overview of all the programs is available, entitled Programs in Renewable Energy.

  13. Conditions for Effective Application of Analysis of Symmetrically-Predicted Endogenous Subgroups

    ERIC Educational Resources Information Center

    Peck, Laura R.

    2015-01-01

    Several analytic strategies exist for opening up the "black box" to reveal more about what drives policy and program impacts. This article focuses on one of these strategies: the Analysis of Symmetrically-Predicted Endogenous Subgroups (ASPES). ASPES uses exogenous baseline data to identify endogenously-defined subgroups, keeping the…

  14. 78 FR 25717 - Applications for New Awards; Strengthening Institutions Program (SIP)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-02

    ... use of time, staff, money, or other resources while improving student learning or other educational... migrant, or who have disabilities. Open educational resources (OER) means teaching, learning, and research... awards for individual development grants in rank order from the funding slate according to the average...

  15. Process control charts in infection prevention: Make it simple to make it happen.

    PubMed

    Wiemken, Timothy L; Furmanek, Stephen P; Carrico, Ruth M; Mattingly, William A; Persaud, Annuradha K; Guinn, Brian E; Kelley, Robert R; Ramirez, Julio A

    2017-03-01

    Quality improvement is central to Infection Prevention and Control (IPC) programs. Challenges may occur when applying quality improvement methodologies like process control charts, often due to the limited exposure of typical IPs. Because of this, our team created an open-source database with a process control chart generator for IPC programs. The objectives of this report are to outline the development of the application and to demonstrate its application using simulated data. We used Research Electronic Data Capture (REDCap Consortium, Vanderbilt University, Nashville, TN), R (R Foundation for Statistical Computing, Vienna, Austria), and R Studio Shiny (R Foundation for Statistical Computing) to create an open source data collection system with automated process control chart generation. We used simulated data to test and visualize both in-control and out-of-control processes for commonly used metrics in IPC programs. The R code for implementing the control charts and the Shiny application can be found on our Web site (https://github.com/ul-research-support/spcapp). Screen captures of the workflow and simulated data indicating both common cause and special cause variation are provided. Process control charts can be easily developed based on individual facility needs using freely available software. By providing our work free to all interested parties, we hope that others will be able to harness the power and ease of use of the application to improve the quality of care and patient safety in their facilities. Copyright © 2017 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
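
    As an illustration of what such a generator computes, the limits of an individuals (I-MR) control chart follow from the average moving range. The sketch below uses invented counts, not the application's code:

```python
# A minimal individuals (I-MR) control chart computation: center line and
# 3-sigma control limits estimated from the average moving range
# (1.128 is the standard d2 constant for moving ranges of size 2).
def i_chart_limits(values):
    n = len(values)
    center = sum(values) / n
    moving_ranges = [abs(values[i] - values[i - 1]) for i in range(1, n)]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    sigma = mr_bar / 1.128          # estimated process sigma
    return center - 3 * sigma, center, center + 3 * sigma

# Simulated monthly infection counts; the last point may signal special cause.
rates = [4, 5, 3, 4, 6, 5, 4, 12]
lcl, cl, ucl = i_chart_limits(rates)
out_of_control = [x for x in rates if x > ucl or x < lcl]
print(cl, out_of_control)
```

    Points outside the limits suggest special cause variation worth investigating, while points within them reflect common cause variation, the distinction the simulated screen captures above demonstrate.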

  16. [Series: Medical Applications of the PHITS Code (2): Acceleration by Parallel Computing].

    PubMed

    Furuta, Takuya; Sato, Tatsuhiko

    2015-01-01

Time-consuming Monte Carlo dose calculations have become feasible owing to advances in computer technology. Much of this recent progress, however, comes from the emergence of multi-core high-performance computers, so parallel computing has become key to achieving good software performance. The Monte Carlo simulation code PHITS contains two parallel computing functions: distributed-memory parallelization using the message passing interface (MPI) protocol, and shared-memory parallelization using open multi-processing (OpenMP) directives. Users can choose between the two functions according to their needs. This paper explains the two functions, with their advantages and disadvantages, and provides some test applications to show their performance on a typical multi-core high-performance workstation.
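    The distinction between the two parallel models can be illustrated outside PHITS. The sketch below uses Python's multiprocessing, in which each worker runs in its own address space with its own random stream, analogous to distributed-memory MPI ranks, to split a Monte Carlo estimate of pi across workers. It is a conceptual analogue only, not PHITS code:

    ```python
    import random
    from multiprocessing import Pool

    def count_hits(args):
        """One worker's share of a Monte Carlo estimate of pi.

        Each process owns private memory and a private RNG, like a
        distributed-memory (MPI-style) rank; results are combined only
        at the end, which is the message-passing step.
        """
        seed, n = args
        rng = random.Random(seed)
        return sum(1 for _ in range(n)
                   if rng.random() ** 2 + rng.random() ** 2 <= 1.0)

    def parallel_pi(n_per_worker=50_000, workers=4):
        # Fan the work out to independent processes and reduce the counts.
        with Pool(workers) as pool:
            hits = pool.map(count_hits, [(seed, n_per_worker)
                                         for seed in range(workers)])
        return 4.0 * sum(hits) / (n_per_worker * workers)

    if __name__ == "__main__":
        print(round(parallel_pi(), 2))
    ```

    A shared-memory (OpenMP-style) version would instead have threads update a common accumulator in one address space; Python's threads are a poor fit for CPU-bound loops, which is why the process-based form is shown.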

  17. AccessMRS: integrating OpenMRS with smart forms on Android.

    PubMed

    Fazen, Louis E; Chemwolo, Benjamin T; Songok, Julia J; Ruhl, Laura J; Kipkoech, Carolyne; Green, James M; Ikemeri, Justus E; Christoffersen-Deb, Astrid

    2013-01-01

    We present a new open-source Android application, AccessMRS, for interfacing with an electronic medical record system (OpenMRS) and loading 'Smart Forms' on a mobile device. AccessMRS functions as a patient-centered interface for viewing OpenMRS data; managing patient information in reminders, task lists, and previous encounters; and launching patient-specific 'Smart Forms' for electronic data collection and dissemination of health information. We present AccessMRS in the context of related software applications we developed to serve Community Health Workers, including AccessInfo, AccessAdmin, AccessMaps, and AccessForms. The specific features and design of AccessMRS are detailed in relationship to the requirements that drove development: the workflows of the Kenyan Ministry of Health Community Health Volunteers (CHVs) supported by the AMPATH Primary Health Care Program. Specifically, AccessMRS was designed to improve the quality of community-based Maternal and Child Health services delivered by CHVs in Kosirai Division. AccessMRS is currently in use by more than 80 CHVs in Kenya and undergoing formal assessment of acceptability, effectiveness, and cost.

  18. Development of AN Open-Source Automatic Deformation Monitoring System for Geodetical and Geotechnical Measurements

    NASA Astrophysics Data System (ADS)

    Engel, P.; Schweimler, B.

    2016-04-01

The deformation monitoring of structures and buildings is an important task in modern engineering surveying, ensuring the stability and reliability of the monitored objects over a long period. Several commercial hardware and software solutions for realizing such monitoring measurements are available on the market. In addition, a research team at the Neubrandenburg University of Applied Sciences (NUAS) is actively developing a software package for monitoring purposes in geodesy and geotechnics, distributed under an open-source licence and free of charge. The task of managing an open-source project is well known in computer science, but it is fairly new in a geodetic context. This paper contributes to that issue by detailing applications, frameworks, and interfaces for the design and implementation of open hardware and software solutions for sensor control, sensor networks, and data management in automatic deformation monitoring. It also discusses how the development effort of networked applications can be reduced by using free programming tools, cloud computing technologies, and rapid prototyping methods.

  19. Resilient workflows for computational mechanics platforms

    NASA Astrophysics Data System (ADS)

    Nguyên, Toàn; Trifan, Laurentiu; Désidéri, Jean-Antoine

    2010-06-01

Workflow management systems have recently been the focus of much interest, research, and deployment for scientific applications worldwide [26, 27]. Their ability to abstract applications by wrapping application codes has also underscored the usefulness of such systems for multidiscipline applications [23, 24]. When complex applications need to provide seamless interfaces hiding the technicalities of the computing infrastructures, their high-level modeling, monitoring and execution functionalities help give production teams seamless and effective facilities [25, 31, 33]. Software integration infrastructures based on programming paradigms such as Python, Matlab and Scilab have also provided evidence of the usefulness of such approaches for the tight coupling of multidiscipline application codes [22, 24]. Further, high-performance computing based on multi-core multi-cluster infrastructures opens new opportunities for more accurate, more extensive and more robust multi-discipline simulations in the decades to come [28]. This supports the goal of full flight dynamics simulation for 3D aircraft models within the next decade, opening the way to virtual flight-tests and certification of aircraft in the future [23, 24, 29].

  20. Forward and adjoint spectral-element simulations of seismic wave propagation using hardware accelerators

    NASA Astrophysics Data System (ADS)

    Peter, Daniel; Videau, Brice; Pouget, Kevin; Komatitsch, Dimitri

    2015-04-01

Improving the resolution of tomographic images is crucial to answer important questions on the nature of Earth's subsurface structure and internal processes. Seismic tomography is the most prominent approach, in which seismic signals from ground-motion records are used to infer physical properties of internal structures such as compressional- and shear-wave speeds, anisotropy and attenuation. Recent advances in regional- and global-scale seismic inversions move towards full-waveform inversions, which require accurate simulations of seismic wave propagation in complex 3D media, providing access to the full 3D seismic wavefields. However, these numerical simulations are computationally very expensive and need high-performance computing (HPC) facilities for further improving the current state of knowledge. During recent years, many-core architectures such as graphics processing units (GPUs) have been added to available large HPC systems. Such GPU-accelerated computing, together with advances in multi-core central processing units (CPUs), can greatly accelerate scientific applications. There are mainly two choices of language support for GPU cards: the CUDA programming environment and the OpenCL language standard. CUDA software development targets NVIDIA graphics cards while OpenCL was adopted mainly by AMD graphics cards. In order to employ such hardware accelerators for seismic wave propagation simulations, we incorporated the code generation tool BOAST into the existing spectral-element code package SPECFEM3D_GLOBE. This allows us to use meta-programming of computational kernels and to generate optimized source code for both CUDA and OpenCL, running simulations on either kind of hardware accelerator. We show here applications of forward and adjoint seismic wave propagation on CUDA/OpenCL GPUs, validating results and comparing performance for different simulations and hardware usages.
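    The meta-programming idea can be shown in miniature: write one logical kernel and render it for either target. The toy Python generator below substitutes CUDA or OpenCL qualifiers into a shared vector-add template; it is a simplified analogue of what BOAST does, not its actual API:

    ```python
    def generate_vector_add(target):
        """Render one logical kernel as CUDA or OpenCL C source.

        The kernel body is written once; only the target-specific
        qualifiers, pointer attributes, and thread-index expression
        are substituted per dialect.
        """
        dialects = {
            "cuda": {
                "qualifier": "__global__",
                "ptr": "float *",
                "index": "blockIdx.x * blockDim.x + threadIdx.x",
            },
            "opencl": {
                "qualifier": "__kernel",
                "ptr": "__global float *",
                "index": "get_global_id(0)",
            },
        }
        d = dialects[target]
        return (
            f"{d['qualifier']} void vec_add({d['ptr']}a, {d['ptr']}b, "
            f"{d['ptr']}c, int n) {{\n"
            f"    int i = {d['index']};\n"
            f"    if (i < n) c[i] = a[i] + b[i];\n"
            f"}}\n"
        )
    ```

    The payoff of the single-source approach is that an optimization applied to the template (loop unrolling, tiling) propagates to both backends at once.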

  1. Panel C report: Standards needed for the use of ISO Open Systems Interconnection - basic reference model

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The use of an International Standards Organization (ISO) Open Systems Interconnection (OSI) Reference Model and its relevance to interconnecting an Applications Data Service (ADS) pilot program for data sharing is discussed. A top level mapping between the conjectured ADS requirements and identified layers within the OSI Reference Model was performed. It was concluded that the OSI model represents an orderly architecture for the ADS networking planning and that the protocols being developed by the National Bureau of Standards offer the best available implementation approach.

  2. Open Astronomy Catalogs API

    NASA Astrophysics Data System (ADS)

    Guillochon, James; Cowperthwaite, Philip S.

    2018-05-01

    We announce the public release of the application program interface (API) for the Open Astronomy Catalogs (OACs), the OACAPI. The OACs serve near-complete collections of supernova, tidal disruption, kilonova, and fast stars data (including photometry, spectra, radio, and X-ray observations) via a user-friendly web interface that displays the data interactively and offers full data downloads. The OACAPI, by contrast, enables users to specifically download particular pieces of the OAC dataset via a flexible programmatic syntax, either via URL GET requests, or via a module within the astroquery Python package.
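    A programmatic GET request against such an API reduces to assembling a URL. The sketch below builds an OACAPI-style query string with the standard library; the base URL and the event/quantity/attribute path layout are assumptions inferred from the pattern described, so treat the exact endpoint and parameter names as illustrative:

    ```python
    from urllib.parse import urlencode

    def oac_query_url(event, quantity, attributes, **filters):
        """Build an OACAPI-style GET request URL.

        Assumed layout: /EVENT/QUANTITY/ATTR1+ATTR2?filter=value.
        The base URL and parameter names here are illustrative,
        not guaranteed to match the live service.
        """
        base = "https://api.astrocats.space"  # assumed endpoint
        path = f"/{event}/{quantity}/{'+'.join(attributes)}"
        query = urlencode(filters)
        return f"{base}{path}?{query}" if query else f"{base}{path}"
    ```

    For instance, `oac_query_url("SN2014J", "photometry", ["time", "magnitude"], format="csv")` assembles a single request for two photometry attributes of one event.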

  3. MEIGO: an open-source software suite based on metaheuristics for global optimization in systems biology and bioinformatics.

    PubMed

    Egea, Jose A; Henriques, David; Cokelaer, Thomas; Villaverde, Alejandro F; MacNamara, Aidan; Danciu, Diana-Patricia; Banga, Julio R; Saez-Rodriguez, Julio

    2014-05-10

Optimization is the key to solving many problems in computational biology. Global optimization methods, which provide a robust methodology, and metaheuristics in particular have proven to be the most efficient methods for many applications. Despite their utility, there is a limited availability of metaheuristic tools. We present MEIGO, an R and Matlab optimization toolbox (also available in Python via a wrapper of the R version) that implements metaheuristics capable of solving diverse problems arising in systems biology and bioinformatics. The toolbox includes the enhanced scatter search method (eSS) for continuous nonlinear programming (cNLP) and mixed-integer programming (MINLP) problems, and variable neighborhood search (VNS) for integer programming (IP) problems. Additionally, the R version includes BayesFit for parameter estimation by Bayesian inference. The eSS and VNS methods can be run on a single thread or in parallel using a cooperative strategy. The code is supplied under GPLv3 and is available at http://www.iim.csic.es/~gingproc/meigo.html. Documentation and examples are included. The R package has been submitted to BioConductor. We evaluate MEIGO against optimization benchmarks, and illustrate its applicability to a series of case studies in bioinformatics and systems biology where it outperforms other state-of-the-art methods. MEIGO provides a free, open-source platform for optimization that can be applied to multiple domains of systems biology and bioinformatics. It includes efficient state-of-the-art metaheuristics, and its open and modular structure allows the addition of further methods.
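    The flavor of variable neighborhood search can be conveyed in a few lines. The Python sketch below is a much-reduced analogue of a VNS for integer problems, with a shaking radius that grows on failure and resets on improvement; it is not MEIGO's implementation:

    ```python
    import random

    def vns_minimize(f, x0, k_max=3, iters=400, seed=0):
        """Minimal variable neighborhood search sketch for integer vectors.

        Shakes the incumbent within neighborhoods of radius k (uniform
        integer perturbations in [-k, k] per coordinate), accepts strict
        improvements, resets to the closest neighborhood on success, and
        widens the neighborhood on failure.
        """
        rng = random.Random(seed)
        best, best_val = list(x0), f(x0)
        for _ in range(iters):
            k = 1
            while k <= k_max:
                cand = [xi + rng.randint(-k, k) for xi in best]
                val = f(cand)
                if val < best_val:
                    best, best_val = cand, val
                    k = 1          # improvement: restart close to the incumbent
                else:
                    k += 1         # no improvement: widen the neighborhood
        return best, best_val
    ```

    On a separable quadratic such as `f(x) = sum((xi - 3)**2)`, this greedy shaking scheme walks the incumbent to the integer optimum.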

  4. MEIGO: an open-source software suite based on metaheuristics for global optimization in systems biology and bioinformatics

    PubMed Central

    2014-01-01

Background Optimization is the key to solving many problems in computational biology. Global optimization methods, which provide a robust methodology, and metaheuristics in particular have proven to be the most efficient methods for many applications. Despite their utility, there is a limited availability of metaheuristic tools. Results We present MEIGO, an R and Matlab optimization toolbox (also available in Python via a wrapper of the R version) that implements metaheuristics capable of solving diverse problems arising in systems biology and bioinformatics. The toolbox includes the enhanced scatter search method (eSS) for continuous nonlinear programming (cNLP) and mixed-integer programming (MINLP) problems, and variable neighborhood search (VNS) for integer programming (IP) problems. Additionally, the R version includes BayesFit for parameter estimation by Bayesian inference. The eSS and VNS methods can be run on a single thread or in parallel using a cooperative strategy. The code is supplied under GPLv3 and is available at http://www.iim.csic.es/~gingproc/meigo.html. Documentation and examples are included. The R package has been submitted to BioConductor. We evaluate MEIGO against optimization benchmarks, and illustrate its applicability to a series of case studies in bioinformatics and systems biology where it outperforms other state-of-the-art methods. Conclusions MEIGO provides a free, open-source platform for optimization that can be applied to multiple domains of systems biology and bioinformatics. It includes efficient state-of-the-art metaheuristics, and its open and modular structure allows the addition of further methods. PMID:24885957

  5. OpenFIRE - A Web GIS Service for Distributing the Finnish Reflection Experiment Datasets

    NASA Astrophysics Data System (ADS)

    Väkevä, Sakari; Aalto, Aleksi; Heinonen, Aku; Heikkinen, Pekka; Korja, Annakaisa

    2017-04-01

The Finnish Reflection Experiment (FIRE) is a land-based deep seismic reflection survey conducted between 2001 and 2003 by a research consortium of the Universities of Helsinki and Oulu, the Geological Survey of Finland, and the Russian state-owned enterprise SpetsGeofysika. The dataset consists of 2100 kilometers of high-resolution profiles across the Archaean and Proterozoic nuclei of the Fennoscandian Shield. Although FIRE data have been available on request since 2009, the data have remained underused outside the original research consortium. The original FIRE data have now been quality-controlled: the shot gathers have been cross-checked and a comprehensive errata list has been created. The brute stacks provided by the Russian seismic contractor have been reprocessed into seismic sections and replotted. Complete documentation of the intermediate processing steps is provided, together with guidelines for setting up a computing environment and plotting the data. An open-access web service, "OpenFIRE", for visualizing and downloading FIRE data has been created. The service includes a mobile-responsive map application capable of enriching seismic sections with data from other sources, such as open data from the National Land Survey and the Geological Survey of Finland. The AVAA team of the Finnish Open Science and Research Initiative has provided a tailored Liferay portal with the necessary web components, including an API (Application Programming Interface) for download requests. INSPIRE (Infrastructure for Spatial Information in Europe) compliant discovery metadata have been produced, and geospatial data will be exposed as Open Geospatial Consortium standard services. The technical guidelines of the European Plate Observing System have been followed, and the service can be considered a reference application for sharing reflection seismic data. The OpenFIRE web service is available at www.seismo.helsinki.fi/openfire.

  6. BioWord: A sequence manipulation suite for Microsoft Word

    PubMed Central

    2012-01-01

    Background The ability to manipulate, edit and process DNA and protein sequences has rapidly become a necessary skill for practicing biologists across a wide swath of disciplines. In spite of this, most everyday sequence manipulation tools are distributed across several programs and web servers, sometimes requiring installation and typically involving frequent switching between applications. To address this problem, here we have developed BioWord, a macro-enabled self-installing template for Microsoft Word documents that integrates an extensive suite of DNA and protein sequence manipulation tools. Results BioWord is distributed as a single macro-enabled template that self-installs with a single click. After installation, BioWord will open as a tab in the Office ribbon. Biologists can then easily manipulate DNA and protein sequences using a familiar interface and minimize the need to switch between applications. Beyond simple sequence manipulation, BioWord integrates functionality ranging from dyad search and consensus logos to motif discovery and pair-wise alignment. Written in Visual Basic for Applications (VBA) as an open source, object-oriented project, BioWord allows users with varying programming experience to expand and customize the program to better meet their own needs. Conclusions BioWord integrates a powerful set of tools for biological sequence manipulation within a handy, user-friendly tab in a widely used word processing software package. The use of a simple scripting language and an object-oriented scheme facilitates customization by users and provides a very accessible educational platform for introducing students to basic bioinformatics algorithms. PMID:22676326

  7. BioWord: a sequence manipulation suite for Microsoft Word.

    PubMed

    Anzaldi, Laura J; Muñoz-Fernández, Daniel; Erill, Ivan

    2012-06-07

    The ability to manipulate, edit and process DNA and protein sequences has rapidly become a necessary skill for practicing biologists across a wide swath of disciplines. In spite of this, most everyday sequence manipulation tools are distributed across several programs and web servers, sometimes requiring installation and typically involving frequent switching between applications. To address this problem, here we have developed BioWord, a macro-enabled self-installing template for Microsoft Word documents that integrates an extensive suite of DNA and protein sequence manipulation tools. BioWord is distributed as a single macro-enabled template that self-installs with a single click. After installation, BioWord will open as a tab in the Office ribbon. Biologists can then easily manipulate DNA and protein sequences using a familiar interface and minimize the need to switch between applications. Beyond simple sequence manipulation, BioWord integrates functionality ranging from dyad search and consensus logos to motif discovery and pair-wise alignment. Written in Visual Basic for Applications (VBA) as an open source, object-oriented project, BioWord allows users with varying programming experience to expand and customize the program to better meet their own needs. BioWord integrates a powerful set of tools for biological sequence manipulation within a handy, user-friendly tab in a widely used word processing software package. The use of a simple scripting language and an object-oriented scheme facilitates customization by users and provides a very accessible educational platform for introducing students to basic bioinformatics algorithms.
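    The kind of everyday manipulation BioWord embeds in Word is easy to illustrate. One of the staple operations, the reverse complement of a DNA sequence, looks like this in Python (BioWord itself implements such tools in VBA; this is an analogue for illustration):

    ```python
    # Translation table mapping each base to its complement, both cases.
    COMPLEMENT = str.maketrans("ACGTacgt", "TGCAtgca")

    def reverse_complement(seq):
        """Return the reverse complement of a DNA sequence:
        complement every base, then reverse the string."""
        return seq.translate(COMPLEMENT)[::-1]
    ```

    For example, `reverse_complement("ATGC")` yields `"GCAT"`, and case is preserved so mixed-case annotations survive the operation.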

  8. The ESA standard for telemetry and telecommand packet utilisation: PUS

    NASA Technical Reports Server (NTRS)

    Kaufeler, Jean-Francois

    1994-01-01

    ESA has developed standards for packet telemetry and telecommand, which are derived from the recommendations of the Inter-Agency Consultative Committee for Space Data Systems (CCSDS). These standards are now mandatory for future ESA programs as well as for many programs currently under development. However, while these packet standards address the end-to-end transfer of telemetry and telecommand data between applications on the ground and Application Processes on-board, they leave open the internal structure or content of the packets. This paper presents the ESA Packet Utilization Standard (PUS) which addresses this very subject and, as such, serves to extend and complement the ESA packet standards. The goal of the PUS is to be applicable to future ESA missions in all application areas (Telecommunications, Science, Earth Resources, microgravity, etc.). The production of the PUS falls under the responsibility of the ESA Committee for Operations and EGSE Standards (COES).
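    The division of labor described above, CCSDS-style packet framing with the PUS defining the data field content, can be sketched with a packed header. The Python example below packs a 6-byte CCSDS space packet primary header; the specific field values chosen (version 0, telemetry type, secondary header present, standalone sequence flags) are illustrative, and the PUS-defined service type/subtype would then live in the packet's data field:

    ```python
    import struct

    def ccsds_primary_header(apid, seq_count, data_len):
        """Pack a 6-byte CCSDS space packet primary header.

        word1: version (3 bits) | type (1) | secondary-header flag (1) | APID (11)
        word2: sequence flags (2 bits) | sequence count (14)
        word3: data length field, defined as (data length in bytes) - 1
        Values here (telemetry type, standalone packet) are illustrative.
        """
        word1 = (0 << 13) | (0 << 12) | (1 << 11) | (apid & 0x7FF)
        word2 = (0b11 << 14) | (seq_count & 0x3FFF)   # 0b11 = unsegmented packet
        word3 = data_len - 1                          # length field is len - 1
        return struct.pack(">HHH", word1, word2, word3)
    ```

    The point the PUS makes is precisely that this header and framing are standardized while the bytes that follow it are not; the PUS fills that gap with service-level structures.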

  9. Open-Source Assisted Laboratory Automation through Graphical User Interfaces and 3D Printers: Application to Equipment Hyphenation for Higher-Order Data Generation.

    PubMed

    Siano, Gabriel G; Montemurro, Milagros; Alcaráz, Mirta R; Goicoechea, Héctor C

    2017-10-17

Higher-order data generation implies some automation challenges, mainly related to the hidden programming languages and electronic details of the equipment. When techniques and/or equipment hyphenation are the key to obtaining higher-order data, the required simultaneous control of them demands funds for new hardware, software, and licenses, in addition to very skilled operators. In this work, we present Design of Inputs-Outputs with Sikuli (DIOS), a free and open-source program that provides a general framework for the design of automated experimental procedures without prior knowledge of programming or electronics. Basically, instruments and devices are considered as nodes in a network, and every node is associated with both physical and virtual inputs and outputs. Virtual components, such as the graphical user interfaces (GUIs) of equipment, are handled by means of the image recognition tools provided by the Sikuli scripting language, while their physical counterparts are handled using an adapted open-source three-dimensional (3D) printer. Two previously reported experiments of our research group, related to fluorescence matrices derived from kinetics and high-performance liquid chromatography, were adapted to be carried out in a more automated fashion. Satisfactory results were obtained in terms of analytical performance, and the advantages of open-source tool assistance were evident, mainly in less operator intervention and cost savings.

  10. An Integration of Mobile Applications into Physical Education Programs

    ERIC Educational Resources Information Center

    Yu, Hyeonho; Kulinna, Pamela Hodges; Lorenz, Kent A.

    2018-01-01

    Even though technology in physical education has the potential to open up a variety of teaching and learning avenues by enhancing active experiences to help students develop the skills, attitudes, knowledge and behaviors needed for a lifetime of physical activity, some teachers may have a hard time finding ways to integrate technology into their…

  11. 76 FR 14996 - Request for Certification of Compliance-Rural Industrialization Loan and Grant Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-18

    ... 4279-2) for the following: Applicant/Location: SoloPower, Inc., Wilsonville, Oregon. Principal Product... capacity by opening a new facility in Wilsonville, Oregon. The NAICS industry code for this enterprise is: 334413 (Solar cells manufacturing). DATES: All interested parties may submit comments in writing no later...

  12. 78 FR 71628 - National Institute on Alcohol Abuse and Alcoholism; Notice of Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-29

    ... attendance limited to space available. Individuals who plan to attend and need special assistance, such as... applications and the discussions could disclose confidential trade secrets or commercial property such as...' evaluation of NIAAA's Intramural Programs. Open: 2:00 p.m. to 5:00 p.m. Agenda: Presentations and other...

  13. WAGES (Women and Girls Employment Enabling Service): Final Report.

    ERIC Educational Resources Information Center

    Thomas, Leathia S.; Dickey, Sandy

    The two-year report of the WAGES project (Women and Girls Employment Enabling Service) documents the growth, problems, and success experienced through efforts to open nontraditional fields of employment to women by way of a community-based program in Memphis, Tennessee. Staff and volunteers provided counseling and referrals to applicants. Personal…

  14. TOWARD DEVELOPMENT OF A COMMON SOFTWARE APPLICATION PROGRAMMING INTERFACE (API) FOR UNCERTAINTY, SENSITIVITY, AND PARAMETER ESTIMATION METHODS AND TOOLS

    EPA Science Inventory

    The final session of the workshop considered the subject of software technology and how it might be better constructed to support those who develop, evaluate, and apply multimedia environmental models. Two invited presentations were featured along with an extended open discussio...

  15. Application of Component Scoring to a Complicated Cognitive Domain.

    ERIC Educational Resources Information Center

    Tatsuoka, Kikumi K.; Yamamoto, Kentaro

    This study used the Montague-Riley Test to introduce a new scoring procedure that revealed errors in cognitive processes occurring at subcomponents of an electricity problem. The test, consisting of four parts with 36 open-ended problems each, was administered to 250 high school students. A computer program, ELTEST, was written applying a…

  16. An Open-Sourced and Interactive Ebook Development Program for Minority Languages

    ERIC Educational Resources Information Center

    Sheepy, Emily; Sundberg, Ross; Laurie, Anne

    2017-01-01

    According to Long (2014), genuine task-based pedagogy is centered around the real-world activities that learners need to complete using the target language. We are developing the OurStories mobile application to support learners and instructors of minority languages in the development of personally relevant, task-based learning resources. The…

  17. Supporting secure programming in web applications through interactive static analysis.

    PubMed

    Zhu, Jun; Xie, Jing; Lipford, Heather Richter; Chu, Bill

    2014-07-01

Many security incidents are caused by software developers' failure to adhere to secure programming practices. Static analysis tools have been used to detect software vulnerabilities, but their wide usage by developers is limited by the special training required to write rules customized to application-specific logic. Our approach, interactive static analysis, integrates static analysis into the Integrated Development Environment (IDE) and provides in-situ secure programming support to help developers prevent vulnerabilities during code construction. No additional training is required, nor are there any assumptions about the way programs are built. Our work is motivated in part by the observation that many vulnerabilities are introduced due to failure to practice secure programming even by knowledgeable developers. We implemented a prototype interactive static analysis tool as a plug-in for Java in Eclipse. Our technical evaluation of the prototype detected multiple zero-day vulnerabilities in a large open-source project. Our evaluations also suggest that false positives may be limited to a very small class of use cases.
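    A toy version of such a detector conveys the idea: scan source lines for SQL assembled by string concatenation, the classic injection-prone pattern an in-IDE analysis would flag as the developer types. The regex below is a hypothetical stand-in for illustration, not one of the paper's actual rules:

    ```python
    import re

    # Heuristic: a SQL keyword, then a closing quote, then "+ identifier" --
    # i.e. a query string being concatenated with a variable.
    SQL_CONCAT = re.compile(
        r'(SELECT|INSERT|UPDATE|DELETE)[^"\']*["\']\s*\+\s*\w+',
        re.IGNORECASE,
    )

    def scan(source):
        """Return (line_number, line) pairs that look like SQL built by
        string concatenation; parameterized queries do not match."""
        return [(i, line.strip())
                for i, line in enumerate(source.splitlines(), 1)
                if SQL_CONCAT.search(line)]
    ```

    An interactive tool differs from this batch sketch mainly in when and where it reports: findings surface in the editor at the offending line, with a suggested remediation such as switching to a parameterized query.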

  18. Supporting secure programming in web applications through interactive static analysis

    PubMed Central

    Zhu, Jun; Xie, Jing; Lipford, Heather Richter; Chu, Bill

    2013-01-01

Many security incidents are caused by software developers' failure to adhere to secure programming practices. Static analysis tools have been used to detect software vulnerabilities, but their wide usage by developers is limited by the special training required to write rules customized to application-specific logic. Our approach, interactive static analysis, integrates static analysis into the Integrated Development Environment (IDE) and provides in-situ secure programming support to help developers prevent vulnerabilities during code construction. No additional training is required, nor are there any assumptions about the way programs are built. Our work is motivated in part by the observation that many vulnerabilities are introduced due to failure to practice secure programming even by knowledgeable developers. We implemented a prototype interactive static analysis tool as a plug-in for Java in Eclipse. Our technical evaluation of the prototype detected multiple zero-day vulnerabilities in a large open-source project. Our evaluations also suggest that false positives may be limited to a very small class of use cases. PMID:25685513

  19. Moving targets: Promoting physical activity in public spaces via open streets in the US.

    PubMed

    Hipp, J Aaron; Bird, Alyssa; van Bakergem, Margaret; Yarnall, Elizabeth

    2017-10-01

Popularity of Open Streets programs, which temporarily open streets to communities and close them to vehicles, has recently surged in the US. As of January 2016, 122 cities have hosted an Open Streets program. Even with this great expansion, the sustainability of Open Streets remains a challenge in many cities, and overall, Open Streets in the US differ from their successful counterparts in Central and South America. Between summer 2015 and winter 2016, we reviewed the websites and social media of the 122 identified programs and interviewed 32 unique Open Streets programs. Websites and social media were reviewed for program initiation, number of Open Streets days, length of routes, duration of program, and reported participation. Interview questions focused on barriers to and facilitators of expanding Open Streets, with specific questioning regarding local evaluation activities. All interviews were transcribed verbatim and analyzed with constant comparative methodology. Over three-quarters of US Open Streets programs have been initiated since 2010, with a median frequency of once per year, 4 hours per date, and 5000-9999 participants. Seventy-seven percent of program routes are under 5 km in length. Success of programs was measured by enthusiasm, attendance, social media, survey metrics, and sustainability. Thirteen of 32 program organizers expressed interest in expanding their programs to 12 dates per year, but noted consistent barriers to expansion including funding, permitting, and branding. Though many cities now host Open Streets programs, their ability to affect public health remains limited with few program dates per year. Coordinated efforts, especially around funding, permitting, and branding, may assist in expanding program dates. Copyright © 2016 Elsevier Inc. All rights reserved.

  20. Igpet software for modeling igneous processes: examples of application using the open educational version

    NASA Astrophysics Data System (ADS)

    Carr, Michael J.; Gazel, Esteban

    2017-04-01

We provide here an open version of the Igpet software, called t-Igpet to emphasize its application for teaching and research in forward modeling of igneous geochemistry. There are three programs: a norm utility, a petrologic mixing program using least squares, and Igpet, a graphics program that includes many forms of numerical modeling. Igpet is a multifaceted tool that provides the following basic capabilities: igneous rock identification using the IUGS (International Union of Geological Sciences) classification and several supplementary diagrams; tectonic discrimination diagrams; pseudo-quaternary projections; least squares fitting of lines, polynomials and hyperbolae; magma mixing using two endmembers; histograms, x-y plots, ternary plots and spider-diagrams. The advanced capabilities of Igpet are multi-element mixing and magma evolution modeling. Mixing models are particularly useful for understanding the isotopic variations in rock suites that evolved by mixing different sources. The important melting models include batch melting, fractional melting and aggregated fractional melting. Crystallization models include equilibrium and fractional crystallization and AFC (assimilation and fractional crystallization). Theses, reports and proposals concerning igneous petrology are improved by numerical modeling; for peer-reviewed publications, some element of modeling is practically a requirement. Our intention in providing this software is to facilitate improved communication and lower entry barriers to research, especially for students.
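    The melting models named above follow standard trace-element equations that are easy to state in code. A minimal Python version of modal batch and fractional melting, giving the liquid concentration as a function of the bulk partition coefficient D and melt fraction F, is (standard textbook forms, not Igpet's source):

    ```python
    def batch_melting(c0, D, F):
        """Liquid concentration for modal batch melting:
        C_L = C_0 / (D + F * (1 - D)),
        with source concentration c0, bulk partition coefficient D,
        and melt fraction F."""
        return c0 / (D + F * (1 - D))

    def fractional_melting(c0, D, F):
        """Instantaneous liquid for modal fractional melting:
        C_L = (C_0 / D) * (1 - F) ** (1/D - 1)."""
        return (c0 / D) * (1 - F) ** (1 / D - 1)
    ```

    Both expressions reduce to C_0/D as F approaches 0, and batch melting reduces to C_0 at F = 1, which makes the end-member behavior easy to sanity-check.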

  1. Personalization of structural PDB files.

    PubMed

    Woźniak, Tomasz; Adamiak, Ryszard W

    2013-01-01

The PDB format is the one most commonly used by programs to define the three-dimensional structure of biomolecules. However, programs often use different versions of the format, and thus far no comprehensive solution for unifying the PDB formats has been developed. Here we present an open-source, Python-based tool called PDBinout for processing and converting various versions of the PDB file format for biostructural applications. Moreover, PDBinout allows users to create their own PDB versions. PDBinout is freely available under the LGPL licence at http://pdbinout.ibch.poznan.pl.
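    Much of what such a tool normalizes comes down to the PDB format's fixed-column records. A minimal Python parser for the coordinate fields of an ATOM line, with column ranges per the wwPDB format description, illustrates the idea; it is not PDBinout's code:

    ```python
    def parse_atom_line(line):
        """Parse fixed-column fields of a PDB ATOM/HETATM record.

        1-based column ranges (wwPDB format): 13-16 atom name,
        18-20 residue name, 23-26 residue sequence number,
        31-38 / 39-46 / 47-54 the x / y / z coordinates.
        """
        return {
            "name": line[12:16].strip(),
            "res_name": line[17:20].strip(),
            "res_seq": int(line[22:26]),
            "x": float(line[30:38]),
            "y": float(line[38:46]),
            "z": float(line[46:54]),
        }
    ```

    The fragility PDBinout addresses is visible here: any dialect that shifts a column or widens a field silently breaks slice-based readers like this one, which is why version-aware conversion is useful.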

  2. Design of an Open Smart Energy Gateway for Smart Meter Data Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Page, Janie; McParland, Chuck; Piette, Mary Ann

    With the widespread deployment of electronic interval meters, commonly known as smart meters, came the promise of real-time data on electric energy consumption. Recognizing an opportunity to give consumers access to their near real-time energy consumption data directly from their installed smart meter, we designed a mechanism for capturing those data for consumer use via an open smart energy gateway (OpenSEG). By design, OpenSEG provides a clearly defined boundary for equipment and data ownership. OpenSEG is an open-source data management platform that enables better management of smart meter data. Effectively, it is an information architecture designed to work with the ZigBee Smart Energy Profile 1.x (SEP 1.x). It was specifically designed to reduce cyber-security risks and provide secure information directly from smart meters to consumers in near real time, using display devices already owned by the consumers. OpenSEG stores 48 hours of recent consumption data in a circular cache using a format consistent with commonly available archived (not real-time) consumption data such as Green Button, which is based on the Energy Services Provider Interface (ESPI) data standard. It consists of a common XML format for energy usage information and a data exchange protocol to facilitate automated data transfer upon utility customer authorization. Included in the design is an application program interface by which users can acquire data from OpenSEG for further post-processing. A sample data display application is included in the initial software product; it demonstrates that OpenSEG allows electricity use data to be retrieved from a smart meter and ported to a wide variety of user-owned devices such as cell phones or a user-selected database. This system can be used for homes, multi-family buildings, or small commercial buildings in California.
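
    A 48-hour circular cache of interval readings can be sketched as a bounded buffer. The 15-minute interval length below is an assumption for illustration (actual meter intervals vary), giving 192 entries for 48 hours; this is not OpenSEG's actual implementation.

```python
# Hypothetical sketch of a 48-hour circular cache of smart-meter readings.
# Assumes 15-minute intervals (192 entries); a bounded deque evicts the
# oldest reading automatically once full.
from collections import deque

INTERVALS_PER_48H = 48 * 60 // 15  # 192 fifteen-minute readings

class ConsumptionCache:
    def __init__(self):
        self._readings = deque(maxlen=INTERVALS_PER_48H)

    def record(self, timestamp, kwh):
        """Store one interval reading; the oldest entry is dropped when full."""
        self._readings.append((timestamp, kwh))

    def recent(self):
        """Readings oldest-to-newest, at most 48 hours' worth."""
        return list(self._readings)

cache = ConsumptionCache()
for t in range(300):            # simulate 300 readings (more than 48 h)
    cache.record(t, 0.25)
print(len(cache.recent()))      # 192 -- only the newest 48 hours survive
```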

  3. MPI, HPF or OpenMP: A Study with the NAS Benchmarks

    NASA Technical Reports Server (NTRS)

    Jin, Hao-Qiang; Frumkin, Michael; Hribar, Michelle; Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1999-01-01

    Porting applications to new high performance parallel and distributed platforms is a challenging task. Writing parallel code by hand is time consuming and costly, but the task can be simplified by high level languages and even automated by parallelizing tools and compilers. The definition of the HPF (High Performance Fortran, based on the data parallel model) and OpenMP (based on the shared memory parallel model) standards has offered great opportunities in this respect. Both provide simple and clear interfaces to languages like FORTRAN and simplify many tedious tasks encountered in writing message passing programs. In our study, we implemented parallel versions of the NAS Benchmarks with HPF and OpenMP directives. Comparison of their performance with the MPI implementation and the pros and cons of the different approaches are discussed, along with our experience of using computer-aided tools to help parallelize these benchmarks. Based on the study, the potential of applying some of these techniques to realistic aerospace applications is presented.

  4. MPI, HPF or OpenMP: A Study with the NAS Benchmarks

    NASA Technical Reports Server (NTRS)

    Jin, H.; Frumkin, M.; Hribar, M.; Waheed, A.; Yan, J.; Saini, Subhash (Technical Monitor)

    1999-01-01

    Porting applications to new high performance parallel and distributed platforms is a challenging task. Writing parallel code by hand is time consuming and costly, but the task can be simplified by high level languages and even automated by parallelizing tools and compilers. The definition of the HPF (High Performance Fortran, based on the data parallel model) and OpenMP (based on the shared memory parallel model) standards has offered great opportunities in this respect. Both provide simple and clear interfaces to languages like FORTRAN and simplify many tedious tasks encountered in writing message passing programs. In our study, we implemented parallel versions of the NAS Benchmarks with HPF and OpenMP directives. Comparison of their performance with the MPI implementation and the pros and cons of the different approaches are discussed, along with our experience of using computer-aided tools to help parallelize these benchmarks. Based on the study, the potential of applying some of these techniques to realistic aerospace applications is presented.

  5. Design and Implementation of a Modern Automatic Deformation Monitoring System

    NASA Astrophysics Data System (ADS)

    Engel, Philipp; Schweimler, Björn

    2016-03-01

    The deformation monitoring of structures and buildings is an important task in modern engineering surveying, ensuring the stability and reliability of supervised objects over a long period. Several commercial hardware and software solutions for the realization of such monitoring measurements are available on the market. In addition, a research team at the University of Applied Sciences in Neubrandenburg (NUAS) is actively developing a software package for monitoring purposes in geodesy and geotechnics, which is distributed under an open source licence and free of charge. The task of managing an open source project is well known in computer science, but it is fairly new in a geodetic context. This paper contributes to that issue by detailing applications, frameworks, and interfaces for the design and implementation of open hardware and software solutions for sensor control, sensor networks, and data management in automatic deformation monitoring. It also discusses how the development effort for networked applications can be reduced by using free programming tools, cloud computing technologies, and rapid prototyping methods.

  6. Squid - a simple bioinformatics grid.

    PubMed

    Carvalho, Paulo C; Glória, Rafael V; de Miranda, Antonio B; Degrave, Wim M

    2005-08-03

    BLAST is a widely used genetic research tool for analysis of similarity between nucleotide and protein sequences. This paper presents a software application entitled "Squid" that makes use of grid technology. The current version, as an example, is configured for BLAST applications, but adaptation for other computing-intensive repetitive tasks can be easily accomplished in the open source version. This enables the allocation of remote resources to perform distributed computing, making large BLAST queries viable without the need for high-end computers. Most distributed computing/grid solutions have complex installation procedures requiring a computer specialist, or have limitations regarding operating systems. Squid is a multi-platform, open-source program designed to "keep things simple" while offering high-end computing power for large scale applications. Squid also has an efficient fault tolerance and crash recovery system against data loss, being able to re-route jobs upon node failure and recover even if the master machine fails. Our results show that a Squid application, working with N nodes and proper network resources, can process BLAST queries almost N times faster than if working with only one computer. Squid offers high-end computing, even for the non-specialist, and is freely available at the project web site. Its open-source and binary Windows distributions contain detailed instructions and a "plug-n-play" installation containing a pre-configured example.
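
    The near-N-fold speedup described above comes from splitting a large query set into independent chunks. Squid's internals are not shown in the abstract; the sketch below only illustrates that core idea, distributing FASTA query records round-robin across a hypothetical set of worker nodes.

```python
# Illustrative work-splitting step for a distributed BLAST run:
# divide FASTA query records round-robin across n_nodes worker chunks.

def split_queries(fasta_text, n_nodes):
    """Return n_nodes lists of FASTA records, assigned round-robin."""
    records, current = [], []
    for line in fasta_text.splitlines():
        if line.startswith(">"):          # header starts a new record
            if current:
                records.append("\n".join(current))
            current = [line]
        elif line:
            current.append(line)
    if current:
        records.append("\n".join(current))
    chunks = [[] for _ in range(n_nodes)]
    for i, rec in enumerate(records):
        chunks[i % n_nodes].append(rec)   # node i%N processes record i
    return chunks

fasta = ">q1\nACGT\n>q2\nGGCC\n>q3\nTTAA"
print([len(c) for c in split_queries(fasta, 2)])  # [2, 1]
```

    Each chunk can then be searched independently, and because BLAST queries do not depend on one another, lost chunks can simply be re-queued on another node, which is the basis of the fault-tolerance behaviour described.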

  7. Building Automatic Grading Tools for Basic of Programming Lab in an Academic Institution

    NASA Astrophysics Data System (ADS)

    Harimurti, Rina; Iwan Nurhidayat, Andi; Asmunin

    2018-04-01

    The skill of computer programming is a core competency that must be mastered by students majoring in computer science. The best way to improve this skill is through the practice of writing many programs to solve various problems, from simple to complex. It takes hard work and a long time to check and evaluate the results of student lab exercises one by one, especially when the number of students is large. Based on these constraints, we propose Automatic Grading Tools (AGT), an application that can evaluate and thoroughly check source code in C and C++. The application architecture consists of students, a web-based application, compilers, and the operating system. AGT implements the MVC architecture using open source software, including the Laravel framework version 5.4, PostgreSQL 9.6, Bootstrap 3.3.7, and the jQuery library. AGT has also been tested on real problems by submitting source code in C/C++ and then compiling it. The test results show that the AGT application runs well.
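
    The abstract does not publish AGT's grading logic; the sketch below is a hypothetical version of the judging step such tools share: after compiling and running a submission on each test input, compare its output against the expected output (here ignoring trailing whitespace).

```python
# Hypothetical output-comparison step of an automatic grader.
# Compilation and execution of the C/C++ submission are assumed to have
# already produced actual_outputs, one string per test case.

def grade(actual_outputs, expected_outputs):
    """Return (passed, total) over the test cases, ignoring trailing whitespace."""
    passed = sum(
        a.rstrip() == e.rstrip()
        for a, e in zip(actual_outputs, expected_outputs)
    )
    return passed, len(expected_outputs)

# e.g. a submission run on three inputs, with one wrong answer:
print(grade(["3\n", "7\n", "11 \n"], ["3", "7", "12"]))  # (2, 3)
```

    Real graders typically add resource limits (CPU time, memory) and sandboxing around the run step, which is why the architecture in the abstract includes the compiler and operating system as explicit components.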

  8. Time-domain seismic modeling in viscoelastic media for full waveform inversion on heterogeneous computing platforms with OpenCL

    NASA Astrophysics Data System (ADS)

    Fabien-Ouellet, Gabriel; Gloaguen, Erwan; Giroux, Bernard

    2017-03-01

    Full Waveform Inversion (FWI) aims at recovering the elastic parameters of the Earth by matching recordings of the ground motion with the direct solution of the wave equation. Modeling the wave propagation for realistic scenarios is computationally intensive, which limits the applicability of FWI. The current hardware evolution brings increasing parallel computing power that can speed up the computations in FWI. However, to take advantage of the diversity of parallel architectures presently available, new programming approaches are required. In this work, we explore the use of OpenCL to develop a portable code that can take advantage of the many parallel processor architectures now available. We present a program called SeisCL for 2D and 3D viscoelastic FWI in the time domain. The code computes the forward and adjoint wavefields using finite differences and outputs the gradient of the misfit function given by the adjoint state method. To demonstrate the code's portability on different architectures, the performance of SeisCL is tested on three different devices: Intel CPUs, NVidia GPUs and the Intel Xeon Phi. Results show that the use of GPUs with OpenCL can speed up the computations by nearly two orders of magnitude over a single-threaded application on the CPU. Although OpenCL allows code portability, we show that some device-specific optimization is still required to get the best performance out of a specific architecture. Using OpenCL in conjunction with MPI allows the domain decomposition of large models on several devices located on different nodes of a cluster. For large enough models, the speedup of the domain decomposition varies quasi-linearly with the number of devices. Finally, we investigate two different approaches to compute the gradient by the adjoint state method and show the significant advantages of using OpenCL for FWI.
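
    SeisCL's OpenCL kernels are not reproduced here; this pure-Python sketch shows the kind of stencil they parallelize, a second-order finite-difference update for the 1D acoustic wave equation, where each OpenCL work-item would update one grid point per time step.

```python
# Second-order finite-difference time step for the 1D acoustic wave equation,
# u_tt = c^2 u_xx, with fixed (zero) boundaries. Illustrative of the stencil
# an OpenCL kernel applies per grid point; not SeisCL's actual code.

def step(u_prev, u_curr, c, dt, dx):
    """Advance the wavefield one time step."""
    n = len(u_curr)
    u_next = [0.0] * n
    r2 = (c * dt / dx) ** 2          # squared Courant number; stable if <= 1
    for i in range(1, n - 1):
        u_next[i] = (2 * u_curr[i] - u_prev[i]
                     + r2 * (u_curr[i + 1] - 2 * u_curr[i] + u_curr[i - 1]))
    return u_next

# A single-point initial disturbance spreads symmetrically:
u0 = [0.0] * 11
u1 = [0.0] * 11
u1[5] = 1.0
u2 = step(u0, u1, c=1.0, dt=0.5, dx=1.0)
print(u2[4], u2[5], u2[6])  # 0.25 1.5 0.25
```

    Because every interior point reads only its two neighbours, the update maps directly onto data-parallel hardware, which is why FWI forward modeling benefits so strongly from GPUs.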

  9. Quantitative computed tomography (QCT) as a radiology reporting tool by using optical character recognition (OCR) and macro program.

    PubMed

    Lee, Young Han; Song, Ho-Taek; Suh, Jin-Suck

    2012-12-01

    The objectives are (1) to introduce a new concept for building a quantitative computed tomography (QCT) reporting system by using optical character recognition (OCR) and a macro program, and (2) to illustrate practical uses of the QCT reporting system in the radiology reading environment. The reporting system was created with an open-source OCR program and an open-source macro program. The main module was designed to apply OCR to QCT images in the radiology reading process. The principal steps are as follows: (1) save a QCT report as a graphic file, (2) recognize the characters in the image as text, (3) extract the T-scores from the text, (4) perform error correction, (5) reformat the values into the QCT radiology reporting template, and (6) paste the report into the electronic medical record (EMR) or picture archiving and communication system (PACS). The accuracy of OCR was tested on randomly selected QCTs. The system successfully performed OCR of QCT reports, and the diagnosis of normal, osteopenia, or osteoporosis was also determined. Error correction of the OCR output is done with an AutoHotkey-coded module. The T-scores of the femoral neck and lumbar vertebrae were extracted with accuracies of 100% and 95.4%, respectively. A convenient QCT reporting system can thus be established using open-source OCR software and an open-source macro program, and the method can be easily adapted for other QCT applications and PACS/EMR systems.
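
    The extract-and-classify step (3) can be sketched as below. The regex and report text are illustrative, not the paper's actual templates; the T-score thresholds used for classification (normal at -1.0 and above, osteoporosis at -2.5 and below) are the standard WHO criteria.

```python
# Illustrative sketch of pulling T-scores out of OCR'd report text and
# classifying them by the standard WHO criteria. Report wording is hypothetical.
import re

def extract_t_scores(ocr_text):
    """Map each labelled site to its T-score value."""
    return {m.group(1).lower(): float(m.group(2))
            for m in re.finditer(r"(\w[\w ]*?)\s*T-score[:\s]+(-?\d+\.\d+)",
                                 ocr_text)}

def classify(t):
    if t >= -1.0:
        return "normal"
    if t > -2.5:
        return "osteopenia"
    return "osteoporosis"

text = "Femoral neck T-score: -1.8  Lumbar spine T-score: -2.7"
scores = extract_t_scores(text)
print({site: classify(t) for site, t in scores.items()})
# {'femoral neck': 'osteopenia', 'lumbar spine': 'osteoporosis'}
```

    OCR noise (e.g. a minus sign read as a hyphenated dash, or "1" as "l") is exactly what the paper's error-correction step (4) exists to repair before classification.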

  10. X based interactive computer graphics applications for aerodynamic design and education

    NASA Technical Reports Server (NTRS)

    Benson, Thomas J.; Higgs, C. Fred, III

    1995-01-01

    Six computer application packages have been developed to solve a variety of aerodynamic problems in an interactive environment on a single workstation. The packages perform classical one-dimensional analysis under the control of a graphical user interface and can be used for preliminary design or educational purposes. The programs were originally developed on a Silicon Graphics workstation and used the GL version of the FORMS library as the graphical user interface. These programs have recently been converted to the XFORMS library of X-based graphics widgets and have been tested on SGI, IBM, Sun, HP and PC-Linux computers. The paper will show results from the new VU-DUCT program as a prime example. VU-DUCT has been developed as an educational package for the study of subsonic open- and closed-loop wind tunnels.

  11. Naval research fellowships

    NASA Astrophysics Data System (ADS)

    The American Society for Engineering Education (ASEE) is seeking applicants for 40 fellowships that will be awarded by the Office of Naval Research (ONR) in 1984. This program is designed to increase the number of U.S. citizens doing graduate work in such fields as ocean engineering, applied physics, electrical engineering, computer science, naval architecture, materials science, and aerospace and mechanical engineering. The fellowships are awarded on the recommendation of a panel of scientists and engineers convened by the ASEE. The deadline for applications is February 15, 1984. The program is open to graduating seniors who already have or will shortly have baccalaureates in disciplines vital to the research aims of the Navy and critical to national defense. As a reflection of the quality of the program, 1983 fellows had an average cumulative grade point average of 3.88; nine had a perfect 4.0.

  12. A uniform approach for programming distributed heterogeneous computing systems

    PubMed Central

    Grasso, Ivan; Pellegrini, Simone; Cosenza, Biagio; Fahringer, Thomas

    2014-01-01

    Large-scale compute clusters of heterogeneous nodes equipped with multi-core CPUs and GPUs are getting increasingly popular in the scientific community. However, such systems require a combination of different programming paradigms making application development very challenging. In this article we introduce libWater, a library-based extension of the OpenCL programming model that simplifies the development of heterogeneous distributed applications. libWater consists of a simple interface, which is a transparent abstraction of the underlying distributed architecture, offering advanced features such as inter-context and inter-node device synchronization. It provides a runtime system which tracks dependency information enforced by event synchronization to dynamically build a DAG of commands, on which we automatically apply two optimizations: collective communication pattern detection and device-host-device copy removal. We assess libWater’s performance in three compute clusters available from the Vienna Scientific Cluster, the Barcelona Supercomputing Center and the University of Innsbruck, demonstrating improved performance and scaling with different test applications and configurations. PMID:25844015
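
    The abstract describes libWater's runtime turning event-synchronization dependencies between enqueued commands into a DAG. libWater's actual API is not shown there; this hypothetical sketch only illustrates that idea, deriving a valid execution order from command dependencies with Kahn's topological sort.

```python
# Illustrative dependency-DAG scheduling of enqueued device commands,
# in the spirit of (but not taken from) libWater's runtime.
from collections import defaultdict, deque

def execution_order(commands, depends_on):
    """Kahn's topological sort over command-dependency edges."""
    indegree = {c: 0 for c in commands}
    children = defaultdict(list)
    for cmd, deps in depends_on.items():
        for d in deps:
            children[d].append(cmd)
            indegree[cmd] += 1
    ready = deque(c for c in commands if indegree[c] == 0)
    order = []
    while ready:
        c = ready.popleft()
        order.append(c)
        for child in children[c]:
            indegree[child] -= 1
            if indegree[child] == 0:    # all dependencies satisfied
                ready.append(child)
    return order

# Two independent copy-ins feed a kernel, which feeds a copy-out:
cmds = ["copy_a", "copy_b", "kernel", "copy_out"]
deps = {"kernel": ["copy_a", "copy_b"], "copy_out": ["kernel"]}
print(execution_order(cmds, deps))  # ['copy_a', 'copy_b', 'kernel', 'copy_out']
```

    Optimizations like the collective-communication detection and copy-removal mentioned in the abstract are passes over such a DAG: patterns become visible once commands and their dependencies are explicit.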

  13. A uniform approach for programming distributed heterogeneous computing systems.

    PubMed

    Grasso, Ivan; Pellegrini, Simone; Cosenza, Biagio; Fahringer, Thomas

    2014-12-01

    Large-scale compute clusters of heterogeneous nodes equipped with multi-core CPUs and GPUs are getting increasingly popular in the scientific community. However, such systems require a combination of different programming paradigms making application development very challenging. In this article we introduce libWater, a library-based extension of the OpenCL programming model that simplifies the development of heterogeneous distributed applications. libWater consists of a simple interface, which is a transparent abstraction of the underlying distributed architecture, offering advanced features such as inter-context and inter-node device synchronization. It provides a runtime system which tracks dependency information enforced by event synchronization to dynamically build a DAG of commands, on which we automatically apply two optimizations: collective communication pattern detection and device-host-device copy removal. We assess libWater's performance in three compute clusters available from the Vienna Scientific Cluster, the Barcelona Supercomputing Center and the University of Innsbruck, demonstrating improved performance and scaling with different test applications and configurations.

  14. caGrid 1.0 : an enterprise Grid infrastructure for biomedical research.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oster, S.; Langella, S.; Hastings, S.

    To develop software infrastructure that will provide support for discovery, characterization, integrated access, and management of diverse and disparate collections of information sources, analysis methods, and applications in biomedical research. Design: An enterprise Grid software infrastructure, called caGrid version 1.0 (caGrid 1.0), has been developed as the core Grid architecture of the NCI-sponsored cancer Biomedical Informatics Grid (caBIG™) program. It is designed to support a wide range of use cases in basic, translational, and clinical research, including (1) discovery, (2) integrated and large-scale data analysis, and (3) coordinated study. Measurements: The caGrid is built as a Grid software infrastructure and leverages Grid computing technologies and the Web Services Resource Framework standards. It provides a set of core services, toolkits for the development and deployment of new community provided services, and application programming interfaces for building client applications. Results: The caGrid 1.0 was released to the caBIG community in December 2006. It is built on open source components and caGrid source code is publicly and freely available under a liberal open source license. The core software, associated tools, and documentation can be downloaded from the following URL: .

  15. Argobots: A Lightweight Low-Level Threading and Tasking Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seo, Sangmin; Amer, Abdelhalim; Balaji, Pavan

    In the past few decades, a number of user-level threading and tasking models have been proposed in the literature to address the shortcomings of OS-level threads, primarily with respect to cost and flexibility. Current state-of-the-art user-level threading and tasking models, however, either are too specific to applications or architectures or are not as powerful or flexible. In this paper, we present Argobots, a lightweight, low-level threading and tasking framework that is designed as a portable and performant substrate for high-level programming models or runtime systems. Argobots offers a carefully designed execution model that balances generality of functionality with providing a rich set of controls to allow specialization by end users or high-level programming models. We describe the design, implementation, and performance characterization of Argobots and present integrations with three high-level models: OpenMP, MPI, and colocated I/O services. Evaluations show that (1) Argobots, while providing richer capabilities, is competitive with existing simpler generic threading runtimes; (2) our OpenMP runtime offers more efficient interoperability capabilities than production OpenMP runtimes do; (3) when MPI interoperates with Argobots instead of Pthreads, it enjoys reduced synchronization costs and better latency-hiding capabilities; and (4) I/O services with Argobots reduce interference with colocated applications while achieving performance competitive with that of a Pthreads approach.
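
    Argobots is a C library, and its API is not shown in the abstract. As a conceptual illustration only, the Python sketch below mimics user-level tasking: many lightweight tasks multiplexed cooperatively in user space, switching only at explicit yield points rather than under OS preemption.

```python
# Conceptual model of cooperative user-level tasks: generators stand in for
# user-level threads, and a round-robin scheduler stands in for an execution
# stream. Not Argobots code -- an illustration of the execution model only.
from collections import deque

def run(tasks):
    """Resume each task in turn until it yields or finishes."""
    ready = deque(tasks)
    trace = []
    while ready:
        task = ready.popleft()
        try:
            trace.append(next(task))   # run task up to its next yield point
            ready.append(task)         # still alive: requeue it
        except StopIteration:
            pass                       # task finished; drop it
    return trace

def worker(name, steps):
    for i in range(steps):
        yield f"{name}:{i}"            # explicit yield = context-switch point

print(run([worker("A", 2), worker("B", 3)]))
# ['A:0', 'B:0', 'A:1', 'B:1', 'B:2']
```

    Because context switches happen only at yields, no locks or OS scheduling are involved, which is the source of the low cost and flexibility that user-level threading models aim for.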

  16. Argobots: A Lightweight Low-Level Threading and Tasking Framework

    DOE PAGES

    Seo, Sangmin; Amer, Abdelhalim; Balaji, Pavan; ...

    2017-10-24

    In the past few decades, a number of user-level threading and tasking models have been proposed in the literature to address the shortcomings of OS-level threads, primarily with respect to cost and flexibility. Current state-of-the-art user-level threading and tasking models, however, are either too specific to applications or architectures or are not as powerful or flexible. In this article, we present Argobots, a lightweight, low-level threading and tasking framework that is designed as a portable and performant substrate for high-level programming models or runtime systems. Argobots offers a carefully designed execution model that balances generality of functionality with providing a rich set of controls to allow specialization by the user or high-level programming model. Here, we describe the design, implementation, and optimization of Argobots and present integrations with three example high-level models: OpenMP, MPI, and co-located I/O service. Evaluations show that (1) Argobots outperforms existing generic threading runtimes; (2) our OpenMP runtime offers more efficient interoperability capabilities than production OpenMP runtimes do; (3) when MPI interoperates with Argobots instead of Pthreads, it enjoys reduced synchronization costs and better latency hiding capabilities; and (4) I/O service with Argobots reduces interference with co-located applications, achieving performance competitive with that of the Pthreads version.

  17. Argobots: A Lightweight Low-Level Threading and Tasking Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seo, Sangmin; Amer, Abdelhalim; Balaji, Pavan

    In the past few decades, a number of user-level threading and tasking models have been proposed in the literature to address the shortcomings of OS-level threads, primarily with respect to cost and flexibility. Current state-of-the-art user-level threading and tasking models, however, are either too specific to applications or architectures or are not as powerful or flexible. In this article, we present Argobots, a lightweight, low-level threading and tasking framework that is designed as a portable and performant substrate for high-level programming models or runtime systems. Argobots offers a carefully designed execution model that balances generality of functionality with providing a rich set of controls to allow specialization by the user or high-level programming model. Here, we describe the design, implementation, and optimization of Argobots and present integrations with three example high-level models: OpenMP, MPI, and co-located I/O service. Evaluations show that (1) Argobots outperforms existing generic threading runtimes; (2) our OpenMP runtime offers more efficient interoperability capabilities than production OpenMP runtimes do; (3) when MPI interoperates with Argobots instead of Pthreads, it enjoys reduced synchronization costs and better latency hiding capabilities; and (4) I/O service with Argobots reduces interference with co-located applications, achieving performance competitive with that of the Pthreads version.

  18. Lattice QCD simulations using the OpenACC platform

    NASA Astrophysics Data System (ADS)

    Majumdar, Pushan

    2016-10-01

    In this article we explore the OpenACC platform for programming Graphics Processing Units (GPUs). The OpenACC platform offers a directive-based programming model for GPUs which avoids the detailed data flow control and memory management necessary in a CUDA programming environment. In the OpenACC model, programs can be written in high level languages with OpenMP-like directives. We present some examples of QCD simulation codes using OpenACC and discuss their performance on the Fermi and Kepler GPUs.

  19. NASA space communications R and D (Research and Development): Issues, derived benefits, and future directions

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Space communication has made immense strides since ECHO was launched in 1962. It was a simple passive reflector of signals that demonstrated the concept. Today, satellites incorporating transponders, sophisticated high-gain antennas, and stabilization systems provide voice, video, and data communications to millions of people nationally and worldwide. Applications of emerging technology, typified by NASA's Advanced Communications Technology Satellite (ACTS) to be launched in 1992, will use newer portions of the frequency spectrum (the Ka-band at 30/20 GHz), along with antennas and signal-processing that could open up new markets and services. Government programs, directly or indirectly, are responsible for many space communications accomplishments. They have been sponsored and funded in part by NASA and the U.S. Department of Defense since the early 1950s. The industry is growing rapidly and is achieving international preeminence under joint private and government sponsorship. Now, however, the U.S. space communications industry - satellite manufacturers and users, launch services providers, and communications services companies - is being forced to adapt to a different environment. International competition is growing, and terrestrial technologies such as fiber optics are claiming markets until recently dominated by satellites. At the same time, advancing technology is opening up opportunities for new applications and new markets in space exploration, for defense, and for commercial applications of several types. Space communications research, development, and applications (RD and A) programs need to adjust to these realities, be better coordinated and more efficient, and be more closely attuned to commercial markets. The programs must take advantage of RD and A results in other agencies - and in other nations.

  20. NASA space communications R and D (Research and Development): Issues, derived benefits, and future directions

    NASA Astrophysics Data System (ADS)

    1989-02-01

    Space communication has made immense strides since ECHO was launched in 1962. It was a simple passive reflector of signals that demonstrated the concept. Today, satellites incorporating transponders, sophisticated high-gain antennas, and stabilization systems provide voice, video, and data communications to millions of people nationally and worldwide. Applications of emerging technology, typified by NASA's Advanced Communications Technology Satellite (ACTS) to be launched in 1992, will use newer portions of the frequency spectrum (the Ka-band at 30/20 GHz), along with antennas and signal-processing that could open up new markets and services. Government programs, directly or indirectly, are responsible for many space communications accomplishments. They have been sponsored and funded in part by NASA and the U.S. Department of Defense since the early 1950s. The industry is growing rapidly and is achieving international preeminence under joint private and government sponsorship. Now, however, the U.S. space communications industry - satellite manufacturers and users, launch services providers, and communications services companies - is being forced to adapt to a different environment. International competition is growing, and terrestrial technologies such as fiber optics are claiming markets until recently dominated by satellites. At the same time, advancing technology is opening up opportunities for new applications and new markets in space exploration, for defense, and for commercial applications of several types. Space communications research, development, and applications (RD and A) programs need to adjust to these realities, be better coordinated and more efficient, and be more closely attuned to commercial markets. The programs must take advantage of RD and A results in other agencies - and in other nations.

  1. Ruby on Rails Applications

    NASA Technical Reports Server (NTRS)

    Hochstadt, Jake

    2011-01-01

    Ruby on Rails is an open source web application framework for the Ruby programming language. The first application I built was a web application to manage and authenticate other applications. One of the main requirements for this application was a single sign-on service, which allowed authentication to be built in one location and implemented across many different applications. For example, users would be able to log in using their existing credentials and access other NASA applications without authenticating again. The second application I worked on was an internal qualification plan app. Previously, the viewing of employee qualifications was managed through Excel spreadsheets. I built a database-driven application to streamline the process of managing qualifications. Employees would be able to log in securely to view, edit and update their personal qualifications.

  2. Parallelization of interpolation, solar radiation and water flow simulation modules in GRASS GIS using OpenMP

    NASA Astrophysics Data System (ADS)

    Hofierka, Jaroslav; Lacko, Michal; Zubal, Stanislav

    2017-10-01

    In this paper, we describe the parallelization of three complex and computationally intensive modules of GRASS GIS using the OpenMP application programming interface for multi-core computers. These include the v.surf.rst module for spatial interpolation, the r.sun module for solar radiation modeling and the r.sim.water module for water flow simulation. We briefly describe the functionality of the modules and the parallelization approaches used in them. Our approach includes analysis of each module's functionality, identification of source code segments suitable for parallelization and proper application of OpenMP parallelization code to create efficient threads processing the subtasks. We document the efficiency of the solutions using airborne laser scanning data representing the land surface in the test area and derived high-resolution digital terrain model grids. We discuss the performance speed-up and parallelization efficiency depending on the number of processor threads. The study showed a substantial increase in computation speed on a standard multi-core computer while maintaining the accuracy of results in comparison to the output from the original modules. The presented parallelization approach shows the simplicity and efficiency of parallelizing open-source GRASS GIS modules using OpenMP, leading to increased performance of this geospatial software on standard multi-core computers.
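
    The GRASS modules themselves are C code with OpenMP pragmas; the sketch below only models the thread-count behaviour such studies measure. By Amdahl's law, a code whose fraction p is parallelized speeds up at most by 1 / ((1 - p) + p / n) on n threads.

```python
# Amdahl's-law model of speed-up versus thread count -- an illustration of
# why measured speed-ups flatten as threads are added, not GRASS code.

def amdahl_speedup(p, n):
    """Ideal speed-up for parallel fraction p (0..1) on n threads."""
    return 1.0 / ((1.0 - p) + p / n)

# Even a 90%-parallel interpolation kernel saturates well below n:
for n in (1, 2, 4, 8):
    print(n, round(amdahl_speedup(0.9, n), 2))
```

    The serial 10% in this example caps the speed-up at 10x no matter how many cores are used, which is why identifying the largest parallelizable code segments, as the authors do, matters more than adding threads.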

  3. Early Experiences Writing Performance Portable OpenMP 4 Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joubert, Wayne; Hernandez, Oscar R

    In this paper, we evaluate the recently available directives in OpenMP 4 to parallelize a computational kernel using both the traditional shared memory approach and the newer accelerator targeting capabilities. In addition, we explore various transformations that attempt to increase application performance portability, and examine the expressiveness and performance implications of using these approaches. For example, we want to understand if the target map directives in OpenMP 4 improve data locality when mapped to a shared memory system, as opposed to the traditional first touch policy approach in traditional OpenMP. To that end, we use recent Cray and Intel compilers to measure the performance variations of a simple application kernel when executed on the OLCF's Titan supercomputer with NVIDIA GPUs and the Beacon system with Intel Xeon Phi accelerators attached. To better understand these trade-offs, we compare our results from traditional OpenMP shared memory implementations to the newer accelerator programming model when it is used to target both the CPU and an attached heterogeneous device. We believe the results and lessons learned as presented in this paper will be useful to the larger user community by providing guidelines that can assist programmers in the development of performance portable code.

  4. THE CDF ARCHIVE: HERSCHEL PACS AND SPIRE SPECTROSCOPIC DATA PIPELINE AND PRODUCTS FOR PROTOSTARS AND YOUNG STELLAR OBJECTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, Joel D.; Yang, Yao-Lun; Evans, Neal J., II

    2016-03-15

    We present the COPS-DIGIT-FOOSH (CDF) Herschel spectroscopy data product archive, and related ancillary data products, along with data fidelity assessments, and a user-created archive in collaboration with the Herschel-PACS and SPIRE ICC groups. Our products include datacubes, contour maps, automated line fitting results, and best 1D spectra products for all protostellar and disk sources observed with PACS in RangeScan mode for two observing programs: the DIGIT Open Time Key Program (KPOT-nevans-1 and SDP-nevans-1; PI: N. Evans), and the FOOSH Open Time Program (OT1-jgreen02-2; PI: J. Green). In addition, we provide our best SPIRE-FTS spectroscopic products for the COPS Open Time Program (OT2-jgreen02-6; PI: J. Green) and FOOSH sources. We include details of data processing, descriptions of output products, and tests of their reliability for user applications. We identify the parts of the data set to be used with caution. The resulting absolute flux calibration has improved in almost all cases. Compared to previous reductions, the resulting rotational temperatures and numbers of CO molecules have changed substantially in some sources. On average, however, the rotational temperatures have not changed substantially (<2%), but the number of warm (T_rot ∼ 300 K) CO molecules has increased by about 18%.

  5. The CDF Archive: Herschel PACS and SPIRE Spectroscopic Data Pipeline and Products for Protostars and Young Stellar Objects

    NASA Astrophysics Data System (ADS)

    Green, Joel D.; Yang, Yao-Lun; Evans, Neal J., II; Karska, Agata; Herczeg, Gregory; van Dishoeck, Ewine F.; Lee, Jeong-Eun; Larson, Rebecca L.; Bouwman, Jeroen

    2016-03-01

    We present the COPS-DIGIT-FOOSH (CDF) Herschel spectroscopy data product archive, and related ancillary data products, along with data fidelity assessments, and a user-created archive in collaboration with the Herschel-PACS and SPIRE ICC groups. Our products include datacubes, contour maps, automated line fitting results, and best 1D spectra products for all protostellar and disk sources observed with PACS in RangeScan mode for two observing programs: the DIGIT Open Time Key Program (KPOT_nevans1 and SDP_nevans_1; PI: N. Evans), and the FOOSH Open Time Program (OT1_jgreen02_2; PI: J. Green). In addition, we provide our best SPIRE-FTS spectroscopic products for the COPS Open Time Program (OT2_jgreen02_6; PI: J. Green) and FOOSH sources. We include details of data processing, descriptions of output products, and tests of their reliability for user applications. We identify the parts of the data set to be used with caution. The resulting absolute flux calibration has improved in almost all cases. Compared to previous reductions, the resulting rotational temperatures and numbers of CO molecules have changed substantially in some sources. On average, however, the rotational temperatures have not changed substantially (<2%), but the number of warm (Trot ∼ 300 K) CO molecules has increased by about 18%.

  6. On the Pontryagin maximum principle for systems with delays. Economic applications

    NASA Astrophysics Data System (ADS)

    Kim, A. V.; Kormyshev, V. M.; Kwon, O. B.; Mukhametshin, E. R.

    2017-11-01

    The Pontryagin maximum principle [6] is the keystone of finite-dimensional optimal control theory [1, 2, 5]. Since its discovery, it has been important to extend the maximum principle to various classes of dynamical systems. In the paper we consider some aspects of the application of i-smooth analysis [3, 4] in the theory of the Pontryagin maximum principle [6] for systems with delays; the obtained results can be applied to elaborating optimal program controls in economic models with delays.
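For reference, the finite-dimensional maximum principle that the paper starts from can be stated in its standard form (the delay extension modifies the adjoint equation):

```latex
% Control system and Hamiltonian (standard finite-dimensional form)
\dot{x}(t) = f(x(t), u(t), t), \qquad
H(\psi, x, u, t) = \psi^{\top} f(x, u, t) - f^{0}(x, u, t)

% Adjoint equation and maximum condition along an optimal pair (x^*, u^*)
\dot{\psi}(t) = -\frac{\partial H}{\partial x}\bigl(\psi(t), x^{*}(t), u^{*}(t), t\bigr),
\qquad
H\bigl(\psi(t), x^{*}(t), u^{*}(t), t\bigr) = \max_{u \in U} H\bigl(\psi(t), x^{*}(t), u, t\bigr)
```

For a system with delay, f also depends on the retarded state x(t-τ), and the adjoint equation acquires a shifted (advanced) term; handling such functional dependence rigorously is where i-smooth analysis enters.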

  7. ClusterControl: a web interface for distributing and monitoring bioinformatics applications on a Linux cluster.

    PubMed

    Stocker, Gernot; Rieder, Dietmar; Trajanoski, Zlatko

    2004-03-22

    ClusterControl is a web interface that simplifies distributing and monitoring bioinformatics applications on Linux cluster systems. We have developed a modular concept that enables integration of command-line-oriented programs into the application framework of ClusterControl. The system facilitates the integration of different applications accessed through one interface and executed on a distributed cluster system. The package is based on freely available technologies such as Apache as web server, PHP as server-side scripting language and OpenPBS as queuing system, and is available free of charge for academic and non-profit institutions. http://genome.tugraz.at/Software/ClusterControl

  8. PAPARA(ZZ)I: An open-source software interface for annotating photographs of the deep-sea

    NASA Astrophysics Data System (ADS)

    Marcon, Yann; Purser, Autun

    PAPARA(ZZ)I is a lightweight and intuitive image annotation program developed for the study of benthic megafauna. It offers functionalities such as free, grid and random point annotation. Annotations may be made following existing classification schemes for marine biota and substrata or with the use of user defined, customised lists of keywords, which broadens the range of potential application of the software to other types of studies (e.g. marine litter distribution assessment). If Internet access is available, PAPARA(ZZ)I can also query and use standardised taxa names directly from the World Register of Marine Species (WoRMS). Program outputs include abundances, densities and size calculations per keyword (e.g. per taxon). These results are written into text files that can be imported into spreadsheet programs for further analyses. PAPARA(ZZ)I is open-source and is available at http://papara-zz-i.github.io. Compiled versions exist for most 64-bit operating systems: Windows, Mac OS X and Linux.

  9. POSTOP: Postbuckled open-stiffener optimum panels-theory and capability

    NASA Technical Reports Server (NTRS)

    Dickson, J. N.; Biggers, S. B.

    1984-01-01

    The computer program POSTOP was developed to serve as an aid in the analysis and sizing of stiffened composite panels that are loaded in the postbuckling regime. A comprehensive set of analysis routines was coupled to a widely used optimization program to produce this sizing code. POSTOP is intended for the preliminary design of metal or composite panels with open-section stiffeners, subjected to multiple combined biaxial compression (or tension), shear and normal pressure load cases. Longitudinal compression, however, is assumed to be the dominant loading. Temperature, initial bow eccentricity and load eccentricity effects are included. The panel geometry is assumed to be repetitive over several bays in the longitudinal (stiffener) direction as well as in the transverse direction. Analytical routines are included to compute panel stiffnesses, strains, local and panel buckling loads, and skin/stiffener interface stresses. The resulting program is applicable to stiffened panels as commonly used in fuselage, wing, or empennage structures. The analysis procedures and rationale for the assumptions used therein are described in detail.

  10. Comparing the OpenMP, MPI, and Hybrid Programming Paradigm on an SMP Cluster

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Jin, Haoqiang; anMey, Dieter; Hatay, Ferhat F.

    2003-01-01

    With the advent of parallel hardware and software technologies users are faced with the challenge to choose a programming paradigm best suited for the underlying computer architecture. With the current trend in parallel computer architectures towards clusters of shared memory symmetric multi-processors (SMP), parallel programming techniques have evolved to support parallelism beyond a single level. Which programming paradigm is the best will depend on the nature of the given problem, the hardware architecture, and the available software. In this study we will compare different programming paradigms for the parallelization of a selected benchmark application on a cluster of SMP nodes. We compare the timings of different implementations of the same CFD benchmark application employing the same numerical algorithm on a cluster of Sun Fire SMP nodes. The rest of the paper is structured as follows: In section 2 we briefly discuss the programming models under consideration. We describe our compute platform in section 3. The different implementations of our benchmark code are described in section 4 and the performance results are presented in section 5. We conclude our study in section 6.

  11. Wireless access to a pharmaceutical database: a demonstrator for data driven Wireless Application Protocol (WAP) applications in medical information processing.

    PubMed

    Schacht Hansen, M; Dørup, J

    2001-01-01

    The Wireless Application Protocol technology implemented in newer mobile phones has built-in facilities for handling much of the information processing needed in clinical work. To test a practical approach we ported a relational database of the Danish pharmaceutical catalogue to Wireless Application Protocol using open source freeware at all steps. We used Apache 1.3 web software on a Linux server. Data containing the Danish pharmaceutical catalogue were imported from an ASCII file into a MySQL 3.22.32 database using a Practical Extraction and Report Language script for easy update of the database. Data were distributed in 35 interrelated tables. Each pharmaceutical brand name was given its own card with links to general information about the drug, active substances, contraindications etc. Access was available through 1) browsing therapeutic groups and 2) searching for a brand name. The database interface was programmed in the server-side scripting language PHP3. A free, open source Wireless Application Protocol gateway to a pharmaceutical catalogue was established to allow dial-in access independent of commercial Wireless Application Protocol service providers. The application was tested on the Nokia 7110 and Ericsson R320s cellular phones. We have demonstrated that Wireless Application Protocol-based access to a dynamic clinical database can be established using open source freeware. The project opens perspectives for a further integration of Wireless Application Protocol phone functions in clinical information processing: Global System for Mobile communication telephony for bilateral communication, asynchronous unilateral communication via e-mail and Short Message Service, built-in calculator, calendar, personal organizer, phone number catalogue and Dictaphone function via answering machine technology. An independent Wireless Application Protocol gateway may be placed within hospital firewalls, which may be an advantage with respect to security. 
However, if Wireless Application Protocol phones are to become effective tools for physicians, special attention must be paid to the limitations of the devices. Input tools of Wireless Application Protocol phones should be improved, for instance by increased use of speech control.

  12. Wireless access to a pharmaceutical database: A demonstrator for data driven Wireless Application Protocol applications in medical information processing

    PubMed Central

    Hansen, Michael Schacht

    2001-01-01

    Background The Wireless Application Protocol technology implemented in newer mobile phones has built-in facilities for handling much of the information processing needed in clinical work. Objectives To test a practical approach we ported a relational database of the Danish pharmaceutical catalogue to Wireless Application Protocol using open source freeware at all steps. Methods We used Apache 1.3 web software on a Linux server. Data containing the Danish pharmaceutical catalogue were imported from an ASCII file into a MySQL 3.22.32 database using a Practical Extraction and Report Language script for easy update of the database. Data were distributed in 35 interrelated tables. Each pharmaceutical brand name was given its own card with links to general information about the drug, active substances, contraindications etc. Access was available through 1) browsing therapeutic groups and 2) searching for a brand name. The database interface was programmed in the server-side scripting language PHP3. Results A free, open source Wireless Application Protocol gateway to a pharmaceutical catalogue was established to allow dial-in access independent of commercial Wireless Application Protocol service providers. The application was tested on the Nokia 7110 and Ericsson R320s cellular phones. Conclusions We have demonstrated that Wireless Application Protocol-based access to a dynamic clinical database can be established using open source freeware. The project opens perspectives for a further integration of Wireless Application Protocol phone functions in clinical information processing: Global System for Mobile communication telephony for bilateral communication, asynchronous unilateral communication via e-mail and Short Message Service, built-in calculator, calendar, personal organizer, phone number catalogue and Dictaphone function via answering machine technology. 
An independent Wireless Application Protocol gateway may be placed within hospital firewalls, which may be an advantage with respect to security. However, if Wireless Application Protocol phones are to become effective tools for physicians, special attention must be paid to the limitations of the devices. Input tools of Wireless Application Protocol phones should be improved, for instance by increased use of speech control. PMID:11720946

  13. Rotor systems research aircraft simulation mathematical model

    NASA Technical Reports Server (NTRS)

    Houck, J. A.; Moore, F. L.; Howlett, J. J.; Pollock, K. S.; Browne, M. M.

    1977-01-01

    An analytical model developed for evaluating and verifying advanced rotor concepts is discussed. The model was used in both open-loop and real-time man-in-the-loop simulations during the design of the rotor systems research aircraft. Future applications include pilot training, preflight of test programs, and the evaluation of promising concepts before their implementation on the flight vehicle.

  14. A Critical Analysis of a New Model for Occupational Therapy Education: Its Applicability for Other Occupations and Systems.

    ERIC Educational Resources Information Center

    National Committee on Employment of Youth, New York, NY.

    The symposium report focuses on an upgrading program (designed by the Consortium for Occupational Therapy Education) to develop alternate routes to credentialled education and training, resulting in opening up occupational therapy career opportunities to young people. The consortium was composed of four New York State hospitals, two academic…

  15. A Directory of Human Performance Models for System Design (Defence Research Group Panel 8 on the Defence Applications of Human and Bio-Medical Sciences)

    DTIC Science & Technology

    1992-12-27

    quantities, but they are not continuously dependent on these quantities. This pure open-loop programmed-control-like behaviour is called precognitive. Like...and largely accomplished by the precognitive action and then may be completed with compensatory error-reduction operations. 304. A quasilinear or

  16. A collection of open source applications for mass spectrometry data mining.

    PubMed

    Gallardo, Óscar; Ovelleiro, David; Gay, Marina; Carrascal, Montserrat; Abian, Joaquin

    2014-10-01

    We present several bioinformatics applications for the identification and quantification of phosphoproteome components by MS. These applications include a front-end graphical user interface that combines several Thermo RAW formats to MASCOT™ Generic Format extractors (EasierMgf), two graphical user interfaces for search engines OMSSA and SEQUEST (OmssaGui and SequestGui), and three applications, one for the management of databases in FASTA format (FastaTools), another for the integration of search results from up to three search engines (Integrator), and another one for the visualization of mass spectra and their corresponding database search results (JsonVisor). These applications were developed to solve some of the common problems found in proteomic and phosphoproteomic data analysis and were integrated in the workflow for data processing and feeding on our LymPHOS database. Applications were designed modularly and can be used standalone. These tools are written in Perl and Python programming languages and are supported on Windows platforms. They are all released under an Open Source Software license and can be freely downloaded from our software repository hosted at GoogleCode. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Open Babel: An open chemical toolbox

    PubMed Central

    2011-01-01

    Background A frequent problem in computational modeling is the interconversion of chemical structures between different formats. While standard interchange formats exist (for example, Chemical Markup Language) and de facto standards have arisen (for example, SMILES format), the need to interconvert formats is a continuing problem due to the multitude of different application areas for chemistry data, differences in the data stored by different formats (0D versus 3D, for example), and competition between software along with a lack of vendor-neutral formats. Results We discuss, for the first time, Open Babel, an open-source chemical toolbox that speaks the many languages of chemical data. Open Babel version 2.3 interconverts over 110 formats. The need to represent such a wide variety of chemical and molecular data requires a library that implements a wide range of cheminformatics algorithms, from partial charge assignment and aromaticity detection, to bond order perception and canonicalization. We detail the implementation of Open Babel, describe key advances in the 2.3 release, and outline a variety of uses both in terms of software products and scientific research, including applications far beyond simple format interconversion. Conclusions Open Babel presents a solution to the proliferation of multiple chemical file formats. In addition, it provides a variety of useful utilities from conformer searching and 2D depiction, to filtering, batch conversion, and substructure and similarity searching. For developers, it can be used as a programming library to handle chemical data in areas such as organic chemistry, drug design, materials science, and computational chemistry. It is freely available under an open-source license from http://openbabel.org. PMID:21982300

  18. Personalized reminiscence therapy M-health application for patients living with dementia: Innovating using open source code repository.

    PubMed

    Zhang, Melvyn W B; Ho, Roger C M

    2017-01-01

    Dementia is known to be an illness which brings forth marked disability amongst elderly individuals. At times, patients living with dementia also experience non-cognitive symptoms, including hallucinations, delusional beliefs, emotional lability, sexualized behaviours and aggression. According to the National Institute of Clinical Excellence (NICE) guidelines, non-pharmacological techniques are typically the first-line option prior to the consideration of adjuvant pharmacological options. Reminiscence and music therapy are thus viable options. Lazar et al. [3] previously performed a systematic review of the utilization of technology to deliver reminiscence-based therapy to individuals living with dementia and highlighted that technology does have benefits in the delivery of reminiscence therapy. However, to date, there has been a paucity of M-health innovations in this area. In addition, most current innovations are not personalized for each person living with dementia. Prior research has highlighted the utility of open source repositories in bioinformatics research. The authors explain how they made use of an open source code repository in the development of a personalized M-health reminiscence therapy innovation for patients living with dementia. The availability of open source code repositories has changed the way healthcare professionals and developers develop smartphone applications today. Conventionally, a long iterative process is needed in the development of a native application, mainly because of the need for native programming and coding, especially so if the application needs interactive features or features that can be personalized. Such repositories enable the rapid and cost-effective development of applications. Moreover, developers are also able to further innovate, as less time is spent in the iterative process.

  19. Virtual Hubs for facilitating access to Open Data

    NASA Astrophysics Data System (ADS)

    Mazzetti, Paolo; Latre, Miguel Á.; Ernst, Julia; Brumana, Raffaella; Brauman, Stefan; Nativi, Stefano

    2015-04-01

    In October 2014 the ENERGIC-OD (European NEtwork for Redistributing Geospatial Information to user Communities - Open Data) project, funded by the European Union under the Competitiveness and Innovation framework Programme (CIP), started. In response to the EU call, the general objective of the project is to "facilitate the use of open (freely available) geographic data from different sources for the creation of innovative applications and services through the creation of Virtual Hubs". In ENERGIC-OD, Virtual Hubs are conceived as information systems supporting the full life cycle of Open Data: publishing, discovery and access. They facilitate the use of Open Data by lowering and possibly removing the main barriers that hamper geo-information (GI) usage by end-users and application developers. Data and data services heterogeneity is recognized as one of the major barriers to Open Data (re-)use. It forces end-users and developers to spend a lot of effort in accessing different infrastructures and harmonizing datasets. Such heterogeneity cannot be completely removed through the adoption of standard specifications for service interfaces, metadata and data models, since different infrastructures adopt different standards to answer specific challenges and to address specific use-cases. Thus, beyond a certain extent, heterogeneity is irreducible, especially in interdisciplinary contexts. ENERGIC-OD Virtual Hubs address heterogeneity by adopting a mediation and brokering approach: specific components (brokers) are dedicated to harmonizing service interfaces, metadata and data models, enabling seamless discovery and access to heterogeneous infrastructures and datasets. As an innovation project, ENERGIC-OD will integrate several existing technologies to implement Virtual Hubs as single points of access to geospatial datasets provided by new or existing platforms and infrastructures, including INSPIRE-compliant systems and Copernicus services.
ENERGIC OD will deploy a set of five Virtual Hubs (VHs) at national level in France, Germany, Italy, Poland, Spain and an additional one at the European level. VHs will be provided according to the cloud Software-as-a-Services model. The main expected impact of VHs is the creation of new business opportunities opening up access to Research Data and Public Sector Information. Therefore, ENERGIC-OD addresses not only end-users, who will have the opportunity to access the VH through a geo-portal, but also application developers who will be able to access VH functionalities through simple Application Programming Interfaces (API). ENERGIC-OD Consortium will develop ten different applications on top of the deployed VHs. They aim to demonstrate how VHs facilitate the development of new and multidisciplinary applications based on the full exploitation of (open) GI, hence stimulating innovation and business activities.

  20. Numerical modeling of exciton-polariton Bose-Einstein condensate in a microcavity

    NASA Astrophysics Data System (ADS)

    Voronych, Oksana; Buraczewski, Adam; Matuszewski, Michał; Stobińska, Magdalena

    2017-06-01

    A novel, optimized numerical method of modeling of an exciton-polariton superfluid in a semiconductor microcavity was proposed. Exciton-polaritons are spin-carrying quasiparticles formed from photons strongly coupled to excitons. They possess unique properties, interesting from the point of view of fundamental research as well as numerous potential applications. However, their numerical modeling is challenging due to the structure of nonlinear differential equations describing their evolution. In this paper, we propose to solve the equations with a modified Runge-Kutta method of 4th order, further optimized for efficient computations. The algorithms were implemented in the form of C++ programs fitted for parallel environments and utilizing vector instructions. The programs form the EPCGP suite which has been used for theoretical investigation of exciton-polaritons. Catalogue identifier: AFBQ_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AFBQ_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: BSD-3 No. of lines in distributed program, including test data, etc.: 2157 No. of bytes in distributed program, including test data, etc.: 498994 Distribution format: tar.gz Programming language: C++ with OpenMP extensions (main numerical program), Python (helper scripts). Computer: Modern PC (tested on AMD and Intel processors), HP BL2x220. Operating system: Unix/Linux and Windows. Has the code been vectorized or parallelized?: Yes (OpenMP) RAM: 200 MB for single run Classification: 7, 7.7. Nature of problem: An exciton-polariton superfluid is a novel, interesting physical system allowing investigation of high temperature Bose-Einstein condensation of exciton-polaritons, quasiparticles carrying spin. They have attracted a lot of attention due to their unique properties and potential applications in polariton-based optoelectronic integrated circuits.
This is an out-of-equilibrium quantum system confined within a semiconductor microcavity. It is described by a set of nonlinear differential equations similar in spirit to the Gross-Pitaevskii (GP) equation, but their unique properties do not allow standard GP solving frameworks to be utilized. Finding an accurate and efficient numerical algorithm as well as development of optimized numerical software is necessary for effective theoretical investigation of exciton-polaritons. Solution method: A Runge-Kutta method of 4th order was employed to solve the set of differential equations describing exciton-polariton superfluids. The method was fitted for the exciton-polariton equations and further optimized. The C++ programs utilize OpenMP extensions and vector operations in order to fully utilize the computer hardware. Running time: 6h for 100 ps evolution, depending on the values of parameters

  1. Understanding Portability of a High-Level Programming Model on Contemporary Heterogeneous Architectures

    DOE PAGES

    Sabne, Amit J.; Sakdhnagool, Putt; Lee, Seyong; ...

    2015-07-13

    Accelerator-based heterogeneous computing is gaining momentum in the high-performance computing arena. However, the increased complexity of heterogeneous architectures demands more generic, high-level programming models. OpenACC is one such attempt to tackle this problem. Although the abstraction provided by OpenACC offers productivity, it raises questions concerning both functional and performance portability. In this article, the authors propose HeteroIR, a high-level, architecture-independent intermediate representation, to map high-level programming models, such as OpenACC, to heterogeneous architectures. They present a compiler approach that translates OpenACC programs into HeteroIR and accelerator kernels to obtain OpenACC functional portability. They then evaluate the performance portability obtained by OpenACC with their approach on 12 OpenACC programs on Nvidia CUDA, AMD GCN, and Intel Xeon Phi architectures. They study the effects of various compiler optimizations and OpenACC program settings on these architectures to provide insights into the achieved performance portability.

  2. Recent evaluations of crack-opening-area in circumferentially cracked pipes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rahman, S.; Brust, F.; Ghadiali, N.

    1997-04-01

    Leak-before-break (LBB) analyses for circumferentially cracked pipes are currently being conducted in the nuclear industry to justify elimination of pipe whip restraints and jet shields which are present because of the expected dynamic effects from pipe rupture. The application of the LBB methodology frequently requires calculation of leak rates. The leak rates depend on the crack-opening area of the through-wall crack in the pipe. In addition to LBB analyses which assume a hypothetical flaw size, there is also interest in the integrity of actual leaking cracks corresponding to current leakage detection requirements in NRC Regulatory Guide 1.45, or for assessing temporary repair of Class 2 and 3 pipes that have leaks as are being evaluated in ASME Section XI. The objectives of this study were to review, evaluate, and refine current predictive models for performing crack-opening-area analyses of circumferentially cracked pipes. The results from twenty-five full-scale pipe fracture experiments, conducted in the Degraded Piping Program, the International Piping Integrity Research Group Program, and the Short Cracks in Piping and Piping Welds Program, were used to verify the analytical models. Standard statistical analyses were performed to quantitatively assess the accuracy of the predictive models. The evaluation also involved finite element analyses for determining the crack-opening profile often needed to perform leak-rate calculations.

  3. Blue guardian: an open architecture for rapid ISR demonstration

    NASA Astrophysics Data System (ADS)

    Barrett, Donald A.; Borntrager, Luke A.; Green, David M.

    2016-05-01

    Throughout the Department of Defense (DoD), acquisition, platform integration, and life cycle costs for weapons systems have continued to rise. Although Open Architecture (OA) interface standards are one of the primary methods being used to reduce these costs, the Air Force Rapid Capabilities Office (AFRCO) has extended the OA concept and chartered the Open Mission System (OMS) initiative with industry to develop and demonstrate a consensus-based, non-proprietary, OA standard for integrating subsystems and services into airborne platforms. The new OMS standard provides the capability to decouple vendor-specific sensors, payloads, and service implementations from platform-specific architectures and is still in the early stages of maturation and demonstration. The Air Force Research Laboratory (AFRL) - Sensors Directorate has developed the Blue Guardian program to demonstrate advanced sensing technology utilizing open architectures in operationally relevant environments. Over the past year, Blue Guardian has developed a platform architecture using the Air Force's OMS reference architecture and conducted a ground and flight test program of multiple payload combinations. Systems tested included a vendor-unique variety of Full Motion Video (FMV) systems, a Wide Area Motion Imagery (WAMI) system, a multi-mode radar system, processing and database functions, multiple decompression algorithms, multiple communications systems, and a suite of software tools. Initial results of the Blue Guardian program show the promise of OA to DoD acquisitions, especially for Intelligence, Surveillance and Reconnaissance (ISR) payload applications. Specifically, the OMS reference architecture was extremely useful in reducing the cost and time required for integrating new systems.

  4. Repeatable, accurate, and high speed multi-level programming of memristor 1T1R arrays for power efficient analog computing applications.

    PubMed

    Merced-Grafals, Emmanuelle J; Dávila, Noraica; Ge, Ning; Williams, R Stanley; Strachan, John Paul

    2016-09-09

    Beyond use as high density non-volatile memories, memristors have potential as synaptic components of neuromorphic systems. We investigated the suitability of tantalum oxide (TaOx) transistor-memristor (1T1R) arrays for such applications, particularly the ability to accurately, repeatedly, and rapidly reach arbitrary conductance states. Programming is performed by applying an adaptive pulsed algorithm that utilizes the transistor gate voltage to control the SET switching operation and increase programming speed of the 1T1R cells. We show the capability of programming 64 conductance levels with <0.5% average accuracy using 100 ns pulses and studied the trade-offs between programming speed and programming error. The algorithm is also utilized to program 16 conductance levels on a population of cells in the 1T1R array showing robustness to cell-to-cell variability. In general, the proposed algorithm results in approximately 10× improvement in programming speed over standard algorithms that do not use the transistor gate to control memristor switching. In addition, after only two programming pulses (an initialization pulse followed by a programming pulse), the resulting conductance values are within 12% of the target values in all cases. Finally, endurance of more than 10(6) cycles is shown through open-loop (single pulses) programming across multiple conductance levels using the optimized gate voltage of the transistor. These results are relevant for applications that require high speed, accurate, and repeatable programming of the cells such as in neural networks and analog data processing.
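    The closed-loop programming described above can be sketched abstractly. The toy sketch below is an illustrative assumption, not the paper's measured TaOx device physics: an invented device model responds to simulated SET pulses, and a read-verify loop raises or lowers the transistor gate voltage until the cell's conductance is within tolerance of an arbitrary target.

```python
# Hedged sketch of an adaptive, gate-voltage-controlled SET programming loop.
# The device model, voltages, and step sizes are invented for illustration.

def program_cell(read, set_pulse, target, tol=0.005, v_gate=0.9, v_step=0.05, max_pulses=20):
    """Apply SET pulses, adjusting the gate voltage until conductance is
    within `tol` (fractional error) of `target`."""
    for pulses in range(1, max_pulses + 1):
        set_pulse(v_gate)                      # simulated 100 ns SET pulse
        g = read()                             # read-verify step
        if abs(g - target) / target <= tol:
            return g, pulses                   # converged
        if g < target:
            v_gate += v_step                   # under target: stronger SET
        else:
            v_gate -= v_step                   # overshoot: back off
    return read(), max_pulses

class ToyCell:
    """Toy device: each pulse moves conductance toward a gate-dependent ceiling."""
    def __init__(self):
        self.g = 10e-6                         # start at 10 uS
    def set_pulse(self, v_gate):
        ceiling = 100e-6 * v_gate              # assumed gate-controlled limit
        self.g += 0.5 * (ceiling - self.g)     # partial SET per pulse
    def read(self):
        return self.g

cell = ToyCell()
g, n = program_cell(cell.read, cell.set_pulse, target=60e-6)
```

    In this toy run the loop converges to within 0.5% of the 60 uS target in well under the pulse budget, illustrating why read-verify with gate-voltage feedback converges faster than fixed-amplitude pulsing.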

  5. Final Report, University Research Program in Robotics (URPR), Nuclear Facilities Clean-up

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tesar, Delbert; Kapoor, Chetan; Pryor, Mitch

    This final report describes the research activity at the University of Texas at Austin with application to EM needs at DOE. This research activity is divided into two major thrusts and contributes to the overall University Research Program in Robotics (URPR) thrust by providing mechanically oriented robotic solutions based on modularity and generalized software. These thrusts are also the core strengths of the UT Austin program, which has a 40-year history in machine development, 30 years specifically devoted to robotics. Since 1975, much of this effort has been to establish the general analytical and design infrastructure for an open (modular) architecture of systems with many degrees of freedom that are able to satisfy a broad range of applications for future production machines. This work has coalesced from two principal areas: standardized actuators and generalized software.

  6. Automation of Data Traffic Control on DSM Architecture

    NASA Technical Reports Server (NTRS)

    Frumkin, Michael; Jin, Hao-Qiang; Yan, Jerry

    2001-01-01

    The design of distributed shared memory (DSM) computers liberates users from the duty of distributing data across processors and allows for the incremental development of parallel programs using, for example, OpenMP or Java threads. DSM architecture greatly simplifies the development of parallel programs having good performance on a few processors. However, achieving good program scalability on DSM computers requires that the user understand data flow in the application and use various techniques to avoid data traffic congestion. In this paper we discuss a number of such techniques, including data blocking, data placement, data transposition and page size control, and evaluate their efficiency on the NAS (NASA Advanced Supercomputing) Parallel Benchmarks. We also present a tool which automates the detection of constructs causing data congestion in Fortran array-oriented codes and advises the user on code transformations for improving data traffic in the application.
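    The data-blocking technique named above can be illustrated in miniature. The sketch below (sizes and names are invented; the NAS benchmarks are Fortran, but this document's examples use Python) traverses a matrix in small tiles so that reads and writes touch a compact working set at a time, which is the locality property blocking buys on a DSM machine:

```python
# Minimal sketch of loop tiling ("data blocking"): a transpose that walks
# the matrix in BxB tiles instead of full rows. Purely illustrative.

def transpose_blocked(a, n, block=4):
    """Transpose an n x n matrix (list of lists) tile by tile."""
    t = [[0] * n for _ in range(n)]
    for ii in range(0, n, block):            # tile row origin
        for jj in range(0, n, block):        # tile column origin
            for i in range(ii, min(ii + block, n)):
                for j in range(jj, min(jj + block, n)):
                    t[j][i] = a[i][j]        # accesses stay within two tiles
    return t

n = 6
a = [[i * n + j for j in range(n)] for i in range(n)]
t = transpose_blocked(a, n)
```

    The result is the ordinary transpose; only the traversal order changes, which is why blocking is a pure performance transformation.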

  7. SMART on FHIR: a standards-based, interoperable apps platform for electronic health records

    PubMed Central

    Kreda, David A; Mandl, Kenneth D; Kohane, Isaac S; Ramoni, Rachel B

    2016-01-01

    Objective In early 2010, Harvard Medical School and Boston Children’s Hospital began an interoperability project with the distinctive goal of developing a platform to enable medical applications to be written once and run unmodified across different healthcare IT systems. The project was called Substitutable Medical Applications and Reusable Technologies (SMART). Methods We adopted contemporary web standards for application programming interface transport, authorization, and user interface, and standard medical terminologies for coded data. In our initial design, we created our own openly licensed clinical data models to enforce consistency and simplicity. During the second half of 2013, we updated SMART to take advantage of the clinical data models and the application-programming interface described in a new, openly licensed Health Level Seven draft standard called Fast Health Interoperability Resources (FHIR). Signaling our adoption of the emerging FHIR standard, we called the new platform SMART on FHIR. Results We introduced the SMART on FHIR platform with a demonstration that included several commercial healthcare IT vendors and app developers showcasing prototypes at the Health Information Management Systems Society conference in February 2014. This established the feasibility of SMART on FHIR, while highlighting the need for commonly accepted pragmatic constraints on the base FHIR specification. Conclusion In this paper, we describe the creation of SMART on FHIR, relate the experience of the vendors and developers who built SMART on FHIR prototypes, and discuss some challenges in going from early industry prototyping to industry-wide production use. PMID:26911829
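    The kind of payload a SMART on FHIR app consumes over the FHIR REST API can be sketched concretely. The resource below is inlined as a string rather than fetched, so the example is self-contained; the field names follow the FHIR Observation resource, while the specific values are invented for illustration:

```python
# Hedged sketch: parsing a FHIR Observation resource (JSON) as a SMART on
# FHIR app would after fetching it from an EHR's FHIR endpoint.
import json

observation_json = """
{
  "resourceType": "Observation",
  "status": "final",
  "code": {"coding": [{"system": "http://loinc.org", "code": "29463-7",
                       "display": "Body Weight"}]},
  "valueQuantity": {"value": 72.5, "unit": "kg"}
}
"""

obs = json.loads(observation_json)
display = obs["code"]["coding"][0]["display"]
value = obs["valueQuantity"]["value"]
unit = obs["valueQuantity"]["unit"]
summary = f"{display}: {value} {unit}"
```

    Because the resource shape is standardized, the same parsing code works against any conformant FHIR server, which is the "write once, run unmodified" property the project pursued.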

  8. NRC Grants for Federal Research

    NASA Astrophysics Data System (ADS)

    The National Research Council is accepting applications for the 1989 Resident, Cooperative, and Postdoctoral Research Associateship Programs in science and engineering. NRC administers the awards for 30 federal agencies and research institutions, which have 115 participating laboratories in the U.S. About 450 new full-time Associateships will be given for research in the biological, health, and behavioral sciences and biotechnology; chemistry; Earth and atmospheric sciences; engineering and applied sciences; mathematics; physics; and space and planetary sciences. Most of the programs are open to recent Ph.D.s and senior investigators and to citizens of the U.S. and other countries. More than 5500 scientists have received Associateships since the programs began in 1954.

  9. The UK National Quantum Technologies Hub in sensors and metrology (Keynote Paper)

    NASA Astrophysics Data System (ADS)

    Bongs, K.; Boyer, V.; Cruise, M. A.; Freise, A.; Holynski, M.; Hughes, J.; Kaushik, A.; Lien, Y.-H.; Niggebaum, A.; Perea-Ortiz, M.; Petrov, P.; Plant, S.; Singh, Y.; Stabrawa, A.; Paul, D. J.; Sorel, M.; Cumming, D. R. S.; Marsh, J. H.; Bowtell, R. W.; Bason, M. G.; Beardsley, R. P.; Campion, R. P.; Brookes, M. J.; Fernholz, T.; Fromhold, T. M.; Hackermuller, L.; Krüger, P.; Li, X.; Maclean, J. O.; Mellor, C. J.; Novikov, S. V.; Orucevic, F.; Rushforth, A. W.; Welch, N.; Benson, T. M.; Wildman, R. D.; Freegarde, T.; Himsworth, M.; Ruostekoski, J.; Smith, P.; Tropper, A.; Griffin, P. F.; Arnold, A. S.; Riis, E.; Hastie, J. E.; Paboeuf, D.; Parrotta, D. C.; Garraway, B. M.; Pasquazi, A.; Peccianti, M.; Hensinger, W.; Potter, E.; Nizamani, A. H.; Bostock, H.; Rodriguez Blanco, A.; Sinuco-Leon, G.; Hill, I. R.; Williams, R. A.; Gill, P.; Hempler, N.; Malcolm, G. P. A.; Cross, T.; Kock, B. O.; Maddox, S.; John, P.

    2016-04-01

    The UK National Quantum Technology Hub in Sensors and Metrology is one of four flagship initiatives in the UK National Quantum Technologies Programme. As part of a 20-year vision, it translates laboratory demonstrations into deployable practical devices, with game-changing miniaturized components and prototypes that transform the state of the art for quantum sensors and metrology. It brings together experts from the Universities of Birmingham, Glasgow, Nottingham, Southampton, Strathclyde and Sussex and NPL, and currently links to over 15 leading international academic institutions and over 70 companies to build the supply chains and routes to market needed to bring 10-1000x improvements in sensing applications. It seeks, and is open to, additional partners for new application development and creates a point of easy open access to the facilities and supply chains that it stimulates or nurtures.

  10. An overview of the Hadoop/MapReduce/HBase framework and its current applications in bioinformatics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, Ronald C.

    Bioinformatics researchers are increasingly confronted with analysis of ultra large-scale data sets, a problem that will only grow at an alarming rate in coming years. Recent developments in open source software, that is, the Hadoop project and associated software, provide a foundation for scaling to petabyte-scale data warehouses on Linux clusters, providing fault-tolerant parallelized analysis of such data using a programming style named MapReduce. An overview is given of the current usage within the bioinformatics community of Hadoop, a top-level Apache Software Foundation project, and of associated open source software projects. The concepts behind Hadoop and the associated HBase project are defined, and current bioinformatics software that employs Hadoop is described. The focus is on next-generation sequencing, as the leading application area to date.
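    The MapReduce programming style named above can be sketched on a toy bioinformatics task. This is a single-process illustration of the model's map/shuffle/reduce phases (counting k-mers across sequencing reads), not of Hadoop's distributed execution; the reads and k value are invented:

```python
# Minimal sketch of the MapReduce style: map emits (key, value) pairs,
# a sort stands in for the shuffle, and reduce aggregates per key.
from itertools import groupby

def map_phase(read, k=3):
    for i in range(len(read) - k + 1):
        yield read[i:i + k], 1               # emit (k-mer, 1)

def reduce_phase(pairs):
    pairs = sorted(pairs)                    # "shuffle": group by key
    return {key: sum(v for _, v in grp)
            for key, grp in groupby(pairs, key=lambda p: p[0])}

reads = ["GATTACA", "TTACAGA"]
pairs = [p for r in reads for p in map_phase(r)]
counts = reduce_phase(pairs)
```

    Because map runs independently per read and reduce independently per key, the same structure parallelizes across a cluster, which is what Hadoop automates.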

  11. Application Characterization at Scale: Lessons learned from developing a distributed Open Community Runtime system for High Performance Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Landwehr, Joshua B.; Suetterlein, Joshua D.; Marquez, Andres

    2016-05-16

    Since 2012, the U.S. Department of Energy’s X-Stack program has been developing solutions including runtime systems, programming models, languages, compilers, and tools for Exascale system software to address crucial performance and power requirements. Fine grain programming models and runtime systems show great potential to efficiently utilize the underlying hardware; thus, they are essential to many X-Stack efforts. An abundance of small tasks can better utilize the vast parallelism available on current and future machines. Moreover, finer tasks can recover faster and adapt better, due to a decrease in state and control. Nevertheless, current applications have been written to exploit old paradigms (such as Communicating Sequential Processes and Bulk Synchronous Parallel processing). To fully utilize the advantages of these new systems, applications need to be adapted to these new paradigms. As part of the applications' porting process, in-depth characterization studies, focused on both application characteristics and runtime features, need to take place to fully understand the application performance bottlenecks and how to resolve them. This paper presents a characterization study for a novel high performance runtime system, called the Open Community Runtime, using key HPC kernels as its vehicle. This study has the following contributions: one of the first high performance, fine grain, distributed memory runtime systems implementing the OCR standard (version 0.99a); and a characterization study of key HPC kernels in terms of runtime primitives running in both intra- and inter-node environments. Running on a general purpose cluster, we have found up to a 1635x relative speed-up for a parallel tiled Cholesky kernel on 128 nodes with 16 cores each and a 1864x relative speed-up for a parallel tiled Smith-Waterman kernel on 128 nodes with 30 cores.
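    The fine-grain, event-driven execution model such runtimes embody can be sketched in a few lines. The toy scheduler below is an illustrative assumption, not the OCR API: tasks become runnable only once every dependence is satisfied, which is how an abundance of small tasks exposes parallelism:

```python
# Toy sketch of dependence-driven task execution: a task runs once all of
# its prerequisites have completed. Task names and the graph are invented.
from collections import deque

def run_tasks(tasks, deps):
    """tasks: name -> callable; deps: name -> set of prerequisite names."""
    remaining = {t: set(d) for t, d in deps.items()}
    ready = deque(t for t, d in remaining.items() if not d)
    order = []
    while ready:
        t = ready.popleft()
        tasks[t]()                           # execute the task
        order.append(t)
        for u, d in remaining.items():       # satisfy dependences on t
            if t in d:
                d.discard(t)
                if not d:
                    ready.append(u)          # u just became runnable
    return order

log = []
tasks = {name: (lambda n=name: log.append(n)) for name in "ABCD"}
deps = {"A": set(), "B": {"A"}, "C": {"A"}, "D": {"B", "C"}}
order = run_tasks(tasks, deps)
```

    Here B and C become runnable together once A finishes; in a real runtime they would execute concurrently on different workers or nodes.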

  12. Approaches to Linked Open Data at data.oceandrilling.org

    NASA Astrophysics Data System (ADS)

    Fils, D.

    2012-12-01

    The data.oceandrilling.org web application applies Linked Open Data (LOD) patterns to expose Deep Sea Drilling Project (DSDP), Ocean Drilling Program (ODP) and Integrated Ocean Drilling Program (IODP) data. Ocean drilling data is represented in a rich range of data formats: high resolution images, file-based data sets and sample-based data. This richness of data types has been well met by semantic approaches, as will be demonstrated. Data has been extracted from CSV, HTML and RDBMS sources through custom software and existing packages for loading into a SPARQL 1.1 compliant triple store. Practices have been developed to streamline the maintenance of the RDF graphs and properly expose them using LOD approaches like VoID and HTML-embedded structured data. Custom and existing vocabularies are used to express semantic relations between resources. Use of the W3C draft RDF Data Cube Vocabulary and other approaches for encoding time scales, taxonomic fossil data and other graphs will be shown. A software layer written in Google Go mediates the RDF-to-web pipeline. The approach used is general and can be applied to other similar environments like node.js or Python Twisted. To facilitate communication, user interface software libraries such as D3 and packages such as S2S and LodLive have been used. Additionally, OpenSearch APIs, structured data in HTML and SPARQL endpoints provide various access methods for applications. data.oceandrilling.org is viewed not as a web site but as an application that communicates with a range of clients. This approach helps guide the development more along software practices than along web site authoring approaches.
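    A client of such a SPARQL endpoint works with the W3C SPARQL 1.1 JSON results format. In the self-contained sketch below the response is inlined instead of fetched over HTTP; the query, vocabulary URIs, and data values are invented, while the result envelope ("head"/"vars", "results"/"bindings") follows the standard format:

```python
# Hedged sketch of consuming SPARQL 1.1 JSON results. Only the envelope
# structure is standard; the query and values are illustrative.
import json

query = """
SELECT ?leg ?site WHERE { ?hole <http://example.org/voc/leg> ?leg ;
                                <http://example.org/voc/site> ?site } LIMIT 2
"""

response = """
{"head": {"vars": ["leg", "site"]},
 "results": {"bindings": [
   {"leg": {"type": "literal", "value": "199"},
    "site": {"type": "literal", "value": "1218"}},
   {"leg": {"type": "literal", "value": "208"},
    "site": {"type": "literal", "value": "1262"}}]}}
"""

data = json.loads(response)
rows = [{v: b[v]["value"] for v in data["head"]["vars"]}
        for b in data["results"]["bindings"]]
```

    Flattening bindings into plain dictionaries like this is the usual first step before handing results to a UI library such as D3.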

  13. Portable multi-node LQCD Monte Carlo simulations using OpenACC

    NASA Astrophysics Data System (ADS)

    Bonati, Claudio; Calore, Enrico; D'Elia, Massimo; Mesiti, Michele; Negro, Francesco; Sanfilippo, Francesco; Schifano, Sebastiano Fabio; Silvi, Giorgio; Tripiccione, Raffaele

    This paper describes a state-of-the-art parallel Lattice QCD Monte Carlo code for staggered fermions, purposely designed to be portable across different computer architectures, including GPUs and commodity CPUs. Portability is achieved using the OpenACC parallel programming model, used to develop a code that can be compiled for several processor architectures. The paper focuses on parallelization across multiple computing nodes, using OpenACC to manage parallelism within the node and OpenMPI to manage parallelism among the nodes. We first discuss the available strategies to maximize performance, then describe selected relevant details of the code, and finally measure the level of performance and scaling that we are able to achieve. The work focuses mainly on GPUs, which offer a significantly high level of performance for this application, but also compares with results measured on other processors.
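    The node-level decomposition behind such multi-node codes reduces to simple index arithmetic: split the lattice into contiguous slabs, one per rank, and identify the periodic neighbors each rank exchanges boundary ("halo") data with. The sketch below is illustrative only (a 1-D decomposition with invented sizes), not this code's actual layout:

```python
# Minimal sketch of 1-D domain decomposition with periodic neighbors, the
# index math underlying halo exchange in multi-node lattice codes.

def local_extent(rank, nodes, L):
    """Contiguous [start, stop) slice of L sites owned by `rank`; any
    remainder is spread over the first L % nodes ranks."""
    base, rem = divmod(L, nodes)
    start = rank * base + min(rank, rem)
    stop = start + base + (1 if rank < rem else 0)
    return start, stop

def neighbors(rank, nodes):
    """Left and right neighbor ranks on a periodic lattice."""
    return (rank - 1) % nodes, (rank + 1) % nodes

extents = [local_extent(r, 4, 10) for r in range(4)]
```

    Each rank would compute on its own slice with OpenACC and exchange only the slice boundaries with its two neighbors via MPI, which keeps communication volume proportional to the surface rather than the volume.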

  14. Teachers' Perceptions of the Efficacy of the Open Court Program for English Proficient and English Language Learners

    ERIC Educational Resources Information Center

    Lee, Steven K.; Ajayi, Lasisi; Richards, Rachel

    2007-01-01

    The purpose of the study is to examine teachers' perceptions of the Open Court language program. Open Court is published by McGraw Hill and has been approved by the "No Child Left Behind Act" as an appropriate research-based reading program. The Open Court program was adopted as part of the efforts to provide all elementary school…

  15. Cooperative analysis expert situation assessment research

    NASA Technical Reports Server (NTRS)

    Mccown, Michael G.

    1987-01-01

    For the past few decades, Rome Air Development Center (RADC) has been conducting research in Artificial Intelligence (AI). When the recent advances in hardware technology made many AI techniques practical, the Intelligence and Reconnaissance Directorate of RADC initiated an applications program entitled Knowledge Based Intelligence Systems (KBIS). The goal of the program is the development of a generic Intelligent Analyst System, an open machine with the framework for intelligence analysis, natural language processing, and man-machine interface techniques, needing only the specific problem domain knowledge to be operationally useful. The development of KBIS is described.

  16. Hypertext and hypermedia systems in information retrieval

    NASA Technical Reports Server (NTRS)

    Kaye, K. M.; Kuhn, A. D.

    1992-01-01

    This paper opens with a brief history of hypertext and hypermedia in the context of information management during the 'information age.' Relevant terms are defined and the approach of the paper is explained. Linear and hypermedia information access methods are contrasted. A discussion of hyperprogramming in the handling of complex scientific and technical information follows. A selection of innovative hypermedia systems is discussed. An analysis of the Clinical Practice Library of Medicine NASA STI Program hypermedia application is presented. The paper concludes with a discussion of the NASA STI Program's future hypermedia project plans.

  17. Consolidated fuel reprocessing program

    NASA Astrophysics Data System (ADS)

    1985-04-01

    A survey of electrochemical methods applications in fuel reprocessing was completed. A dummy fuel assembly shroud was cut using the remotely operated laser disassembly equipment. Operations and engineering efforts have continued to correct equipment operating, software, and procedural problems experienced during the previous uranium campaigns. Fuel cycle options were examined for the liquid metal reactor fuel cycle. In high temperature gas cooled reactor spent fuel studies, preconceptual designs were completed for the concrete storage cask and open field drywell storage concept. These and other tasks operating under the consolidated fuel reprocessing program are examined.

  18. Development of a North American paleoclimate pollen-based reconstruction database application

    NASA Astrophysics Data System (ADS)

    Ladd, Matthew; Mosher, Steven; Viau, Andre

    2013-04-01

    Recent efforts in synthesizing paleoclimate records across the globe have warranted an effort to standardize the different paleoclimate archives currently available, in order to facilitate data-model comparisons and hence improve our estimates of future climate change. It is often the case that the methodology and programs make it challenging for other researchers to reproduce the results of a reconstruction; therefore, there is a need to standardize paleoclimate reconstruction databases in an application specific to proxy data. Here we present a methodology using the open source R language and North American pollen databases (e.g. NAPD, NEOTOMA), where this application can easily be used to perform new reconstructions and quickly analyze and output/plot the data. The application was developed to easily test methodological and spatial/temporal issues that might affect the reconstruction results. The application allows users to spend more time analyzing and interpreting results instead of on data management and processing. Some of the unique features of this R program are the two modules, each with a menu making the user feel at ease with the program; the ability to use different pollen sums; selection of one of 70 available climate variables; substitution of an appropriate modern climate dataset; a user-friendly regional target domain; temporal resolution criteria; linear interpolation; and many other features for a thorough exploratory data analysis. The application program will be available for North American pollen-based reconstructions and will eventually be made available as a package through the CRAN repository by late 2013.
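    The core of a pollen-based reconstruction of this kind is the modern-analog idea: find the modern pollen samples most similar to a fossil sample and average their observed climates. The sketch below (in Python rather than R, to keep this document's examples in one language; the tiny dataset, the 2-analog choice, and the squared-chord metric as sole dissimilarity are illustrative assumptions) shows that logic:

```python
# Hedged sketch of a modern-analog temperature reconstruction: rank modern
# pollen samples by squared-chord distance to the fossil spectrum and
# average the climates of the k closest analogs. Data are invented.
import math

def squared_chord(p, q):
    return sum((math.sqrt(a) - math.sqrt(b)) ** 2 for a, b in zip(p, q))

def reconstruct(fossil, modern, k=2):
    """modern: list of (pollen_proportions, temperature) pairs."""
    ranked = sorted(modern, key=lambda m: squared_chord(fossil, m[0]))
    return sum(t for _, t in ranked[:k]) / k    # mean of k closest analogs

modern = [([0.6, 0.3, 0.1], 12.0),
          ([0.5, 0.4, 0.1], 11.0),
          ([0.1, 0.2, 0.7], 2.0)]
fossil = [0.55, 0.35, 0.10]
t_est = reconstruct(fossil, modern)
```

    The dissimilar third sample is excluded by the ranking, so the estimate is the mean of the two close analogs; varying k, the pollen sum, or the metric is exactly the kind of methodological test the application above automates.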

  19. The Bioperl Toolkit: Perl Modules for the Life Sciences

    PubMed Central

    Stajich, Jason E.; Block, David; Boulez, Kris; Brenner, Steven E.; Chervitz, Stephen A.; Dagdigian, Chris; Fuellen, Georg; Gilbert, James G.R.; Korf, Ian; Lapp, Hilmar; Lehväslaiho, Heikki; Matsalla, Chad; Mungall, Chris J.; Osborne, Brian I.; Pocock, Matthew R.; Schattner, Peter; Senger, Martin; Stein, Lincoln D.; Stupka, Elia; Wilkinson, Mark D.; Birney, Ewan

    2002-01-01

    The Bioperl project is an international open-source collaboration of biologists, bioinformaticians, and computer scientists that has evolved over the past 7 yr into the most comprehensive library of Perl modules available for managing and manipulating life-science information. Bioperl provides an easy-to-use, stable, and consistent programming interface for bioinformatics application programmers. The Bioperl modules have been successfully and repeatedly used to reduce otherwise complex tasks to only a few lines of code. The Bioperl object model has been proven to be flexible enough to support enterprise-level applications such as EnsEMBL, while maintaining an easy learning curve for novice Perl programmers. Bioperl is capable of executing analyses and processing results from programs such as BLAST, ClustalW, or the EMBOSS suite. Interoperation with modules written in Python and Java is supported through the evolving BioCORBA bridge. Bioperl provides access to data stores such as GenBank and SwissProt via a flexible series of sequence input/output modules, and to the emerging common sequence data storage format of the Open Bioinformatics Database Access project. This study describes the overall architecture of the toolkit, the problem domains that it addresses, and gives specific examples of how the toolkit can be used to solve common life-sciences problems. We conclude with a discussion of how the open-source nature of the project has contributed to the development effort. [Supplemental material is available online at www.genome.org. Bioperl is available as open-source software free of charge and is licensed under the Perl Artistic License (http://www.perl.com/pub/a/language/misc/Artistic.html). It is available for download at http://www.bioperl.org. Support inquiries should be addressed to bioperl-l@bioperl.org.] PMID:12368254
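    A taste of what such a toolkit automates: Bioperl's SeqIO turns sequence-file parsing into a few lines of Perl. In the same spirit, and in Python to keep this document's examples in one language, here is a minimal FASTA parser; the records are invented for illustration:

```python
# Illustrative FASTA parsing, the kind of routine task sequence toolkits
# like Bioperl's SeqIO reduce to one or two calls.

def parse_fasta(text):
    """Yield (identifier, sequence) records from FASTA-formatted text."""
    ident, chunks = None, []
    for line in text.strip().splitlines():
        if line.startswith(">"):
            if ident is not None:
                yield ident, "".join(chunks)
            ident, chunks = line[1:].split()[0], []
        else:
            chunks.append(line.strip())
    if ident is not None:
        yield ident, "".join(chunks)

fasta = """>seq1 test record
ACGTAC
GTA
>seq2
TTGCA
"""
records = dict(parse_fasta(fasta))
```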

  20. The image-guided surgery toolkit IGSTK: an open source C++ software toolkit.

    PubMed

    Enquobahrie, Andinet; Cheng, Patrick; Gary, Kevin; Ibanez, Luis; Gobbi, David; Lindseth, Frank; Yaniv, Ziv; Aylward, Stephen; Jomier, Julien; Cleary, Kevin

    2007-11-01

    This paper presents an overview of the image-guided surgery toolkit (IGSTK). IGSTK is an open source C++ software library that provides the basic components needed to develop image-guided surgery applications. It is intended for fast prototyping and development of image-guided surgery applications. The toolkit was developed through a collaboration between academic and industry partners. Because IGSTK was designed for safety-critical applications, the development team has adopted lightweight software processes that emphasize safety and robustness while, at the same time, supporting geographically separated developers. A software process philosophically similar to agile methods was adopted, emphasizing iterative, incremental, and test-driven development principles. The guiding principle in the architecture design of IGSTK is patient safety. The IGSTK team implemented a component-based architecture and used state machine software design methodologies to improve the reliability and safety of the components. Every IGSTK component has a well-defined set of features that are governed by state machines. The state machine ensures that the component is always in a valid state and that all state transitions are valid and meaningful. Realizing that the continued success and viability of an open source toolkit depends on a strong user community, the IGSTK team is following several key strategies to build an active user community. These include maintaining users' and developers' mailing lists, providing documentation (application programming interface reference document and book), presenting demonstration applications, and delivering tutorial sessions at relevant scientific conferences.
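    The state-machine discipline described above can be sketched in miniature. This toy component (states, events, and transitions are invented, and it is written in Python rather than IGSTK's C++) silently rejects any event not listed for its current state, so it can never enter an undefined state:

```python
# Hedged sketch of a state-machine-guarded component: only transitions in
# the table are permitted; invalid events leave the state unchanged.

class TrackerComponent:
    TRANSITIONS = {
        ("Idle", "attempt_open"): "Open",
        ("Open", "attempt_init"): "Initialized",
        ("Initialized", "start_tracking"): "Tracking",
        ("Tracking", "stop_tracking"): "Initialized",
    }

    def __init__(self):
        self.state = "Idle"

    def handle(self, event):
        nxt = self.TRANSITIONS.get((self.state, event))
        if nxt is None:
            return False          # invalid input: state left unchanged
        self.state = nxt
        return True

t = TrackerComponent()
ok_bad = t.handle("start_tracking")     # invalid: cannot track from Idle
t.handle("attempt_open")
t.handle("attempt_init")
ok_good = t.handle("start_tracking")    # valid after open + init
```

    Centralizing the transition table also makes every reachable state and edge auditable, which is the safety argument made above.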

  1. Advantages and Disadvantages in Image Processing with Free Software in Radiology.

    PubMed

    Mujika, Katrin Muradas; Méndez, Juan Antonio Juanes; de Miguel, Andrés Framiñan

    2018-01-15

    Currently, there are sophisticated applications that make it possible to visualize medical images and even to manipulate them. These software applications are of great interest, both from a teaching and a radiological perspective. In addition, some of these applications are known as Free Open Source Software because they are free and their source code is freely available, and therefore they can be easily obtained even on personal computers. Two examples of free open source software are Osirix Lite® and 3D Slicer®. However, this last group of free applications has limitations in its use. For the radiological field, manipulating and post-processing images is increasingly important. Consequently, sophisticated computing tools that combine software and hardware to process medical images are needed. In radiology, graphic workstations allow their users to process, review, analyse, communicate and exchange multidimensional digital images acquired with different image-capturing radiological devices. These radiological devices are basically CT (Computerised Tomography), MRI (Magnetic Resonance Imaging), PET (Positron Emission Tomography), etc. Nevertheless, the programs included in these workstations have a high cost which depends on the software provider and is subject to its norms and requirements. With this study, we aim to present the advantages and disadvantages of these radiological image visualization systems in the advanced management of radiological studies. We will compare the features of the VITREA2® and AW VolumeShare 5® radiology workstations with free open source software applications like OsiriX® and 3D Slicer®, with examples from specific studies.

  2. Object Management Group object transaction service based on an X/Open and International Organization for Standardization open systems interconnection transaction processing kernel

    NASA Astrophysics Data System (ADS)

    Liang, J.; Sédillot, S.; Traverson, B.

    1997-09-01

    This paper addresses federation of a transactional object standard - Object Management Group (OMG) object transaction service (OTS) - with the X/Open distributed transaction processing (DTP) model and International Organization for Standardization (ISO) open systems interconnection (OSI) transaction processing (TP) communication protocol. The two-phase commit propagation rules within a distributed transaction tree are similar in the X/Open, ISO and OMG models. Building an OTS on an OSI TP protocol machine is possible because the two specifications are somewhat complementary. OTS defines a set of external interfaces without a specific internal protocol machine, while OSI TP specifies an internal protocol machine without any application programming interface. Given these observations, and having already implemented an X/Open two-phase commit transaction toolkit based on an OSI TP protocol machine, we analyse the feasibility of using this implementation as a transaction service provider for OMG interfaces. Based on the favourable result of this feasibility study, we are implementing an OTS compliant system, which, by inheriting the extensibility and openness strengths of OSI TP, is able to provide interoperability between X/Open DTP and OMG OTS models.
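    The two-phase commit propagation rule shared by these models can be sketched in a few lines. The toy coordinator below is illustrative only (simulated participants, not X/Open XA or OSI TP calls): it collects prepare votes in phase one and only then issues a global commit or rollback in phase two:

```python
# Hedged sketch of a two-phase commit coordinator over simulated participants.

def two_phase_commit(participants):
    """participants: objects with prepare() -> bool, commit(), rollback()."""
    votes = [p.prepare() for p in participants]       # phase 1: voting
    if all(votes):
        for p in participants:                        # phase 2: commit all
            p.commit()
        return "committed"
    for p in participants:                            # phase 2: roll back all
        p.rollback()
    return "rolled back"

class Participant:
    def __init__(self, ok):
        self.ok, self.outcome = ok, None
    def prepare(self):
        return self.ok                                # vote yes/no
    def commit(self):
        self.outcome = "commit"
    def rollback(self):
        self.outcome = "rollback"

good = [Participant(True), Participant(True)]
mixed = [Participant(True), Participant(False)]
r1 = two_phase_commit(good)
r2 = two_phase_commit(mixed)
```

    A single "no" vote forces a global rollback, including for participants that voted yes; that all-or-nothing rule is what the X/Open, ISO, and OMG models have in common.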

  3. Fast Acceleration of 2D Wave Propagation Simulations Using Modern Computational Accelerators

    PubMed Central

    Wang, Wei; Xu, Lifan; Cavazos, John; Huang, Howie H.; Kay, Matthew

    2014-01-01

    Recent developments in modern computational accelerators like Graphics Processing Units (GPUs) and coprocessors provide great opportunities for making scientific applications run faster than ever before. However, efficient parallelization of scientific code using new programming tools like CUDA requires a high level of expertise that is not available to many scientists. This, plus the fact that parallelized code is usually not portable to different architectures, creates major challenges for exploiting the full capabilities of modern computational accelerators. In this work, we sought to overcome these challenges by studying how to achieve both automated parallelization using OpenACC and enhanced portability using OpenCL. We applied our parallelization schemes using GPUs as well as the Intel Many Integrated Core (MIC) coprocessor to reduce the run time of wave propagation simulations. We used a well-established 2D cardiac action potential model as a specific case study. To the best of our knowledge, we are the first to study auto-parallelization of 2D cardiac wave propagation simulations using OpenACC. Our results identify several approaches that provide substantial speedups. The OpenACC-generated GPU code achieved a substantial speedup over the sequential implementation and required the addition of only a few OpenACC pragmas to the code. An OpenCL implementation provided speedups on GPUs over both the sequential implementation and a parallelized OpenMP implementation. An implementation of OpenMP on the Intel MIC coprocessor provided speedups with only a few code changes to the sequential implementation. We highlight that OpenACC provides an automatic, efficient, and portable approach to achieving parallelization of 2D cardiac wave simulations on GPUs. Our approach of using OpenACC, OpenCL, and OpenMP to parallelize this particular model on modern computational accelerators should be applicable to other computational models of wave propagation in multi-dimensional media. PMID:24497950
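    The computation being parallelized above is a 2-D stencil update. The sketch below (in Python for consistency with this document's other examples; grid size, coefficient, and initial condition are invented, and it is a generic wave equation rather than the cardiac action potential model) shows one explicit time step, whose independent interior updates are exactly what OpenACC or OpenCL distribute across accelerator threads:

```python
# Illustrative explicit time step of the 2-D wave equation on a grid with
# fixed (untouched) boundaries: u_next = 2u - u_prev + c^2 dt^2 * laplacian(u).

def wave_step(u_prev, u, c2dt2=0.25):
    n = len(u)
    u_next = [row[:] for row in u_prev]
    for i in range(1, n - 1):                # each (i, j) is independent:
        for j in range(1, n - 1):            # the parallelizable loop nest
            lap = (u[i - 1][j] + u[i + 1][j] + u[i][j - 1] + u[i][j + 1]
                   - 4 * u[i][j])
            u_next[i][j] = 2 * u[i][j] - u_prev[i][j] + c2dt2 * lap
    return u_next

n = 5
u_prev = [[0.0] * n for _ in range(n)]
u = [[0.0] * n for _ in range(n)]
u[2][2] = 1.0                                # initial pulse in the center
u_next = wave_step(u_prev, u)
```

    Because every interior point reads only its four neighbors from the previous step, the loop nest has no cross-iteration dependences, which is why a few OpenACC pragmas suffice to offload it.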

  4. Application of the Transtheoretical Model to Exercise Behavior and Physical Activity in Patients after Open Heart Surgery.

    PubMed

    Huang, Hsin-Yi; Lin, Yu-Shan; Chuang, Yi-Cheng; Lin, Wei-Hsuan; Kuo, Li Ying; Chen, Jui Chun; Hsu, Ching Ling; Chen, Bo Yan; Tsai, Hui Yu; Cheng, Fei Hsin; Tsai, Mei-Wun

    2015-05-01

    To assess exercise behavior and physical activity levels after open heart surgery. This prospective cohort study included 130 patients (70.8% male, aged 61.0 ± 12.2 years, 53.8% coronary bypass grafting) who underwent open heart surgery. The exercise behavior and physical activity of these patients were assessed at the 3- and 6-month follow-up appointments. Additional interviews were also conducted to further assess exercise behavior. Physical activity duration and metabolic equivalents were calculated from self-reported questionnaire responses. Moreover, possible related demographic factors, clinical features, participation in cardiac rehabilitation programs, and physical activity levels were additionally evaluated. Six months after hospital discharge, most patients were in the action (39.2%) and maintenance (37.7%) stages. Other subjects were in the precontemplation (11.5%), contemplation (5.4%), and preparation (6.2%) stages. The average physical activity level was 332.6 ± 377.1 min/week and 1198.1 ± 1396.9 kJ/week. Subjects in the action and maintenance stages exercised an average of 399.4 ± 397.6 min/week, significantly longer than those in other stages (116.2 ± 176.2 min/week, p = 0.02). Subjects who participated in outpatient cardiac rehabilitation programs after discharge may have better exercise habits. Gender had no significant effect on exercise behavior 6 months after hospital discharge. Most subjects following open heart surgery may maintain regular exercise behavior at 6 months after hospital discharge. Physical activity levels sufficient for cardiac health were achieved by subjects in the action and maintenance stages. Outpatient cardiac rehabilitation programs are valuable for encouraging exercise behavior after heart surgery. Exercise behavior; Open heart surgery; Physical activity; Transtheoretical model.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Seyong; Kim, Jungwon; Vetter, Jeffrey S

This paper presents a directive-based, high-level programming framework for high-performance reconfigurable computing. It takes a standard, portable OpenACC C program as input and generates a hardware configuration file for execution on FPGAs. We implemented this prototype system using our open-source OpenARC compiler; it performs source-to-source translation and optimization of the input OpenACC program into OpenCL code, which is further compiled into an FPGA program by the backend Altera Offline OpenCL compiler. Internally, the design of OpenARC uses a high-level intermediate representation that separates concerns of program representation from underlying architectures, which facilitates portability of OpenARC. In fact, this design allowed us to create the OpenACC-to-FPGA translation framework with minimal extensions to our existing system. In addition, we show that our proposed FPGA-specific compiler optimizations and novel OpenACC pragma extensions assist the compiler in generating more efficient FPGA hardware configuration files. Our empirical evaluation on an Altera Stratix V FPGA with eight OpenACC benchmarks demonstrates the benefits of our strategy. To demonstrate the portability of OpenARC, we show results for the same benchmarks executing on other heterogeneous platforms, including NVIDIA GPUs, AMD GPUs, and Intel Xeon Phis. This initial evidence helps support the goal of using a directive-based, high-level programming strategy for performance portability across heterogeneous HPC architectures.

  6. Multi-board kernel communication using socket programming for embedded applications

    NASA Astrophysics Data System (ADS)

    Mishra, Ashish; Girdhar, Neha; Krishnia, Nikita

    2016-03-01

In large application projects, there is often a need to communicate between two different processors or two different kernels. The aim of this paper is to communicate between two different kernels using an efficient method. The TCP/IP protocol is implemented to communicate between two boards via the Ethernet port, using the lwIP (lightweight IP) stack, a smaller independent implementation of the TCP/IP stack suitable for use in embedded systems. While retaining TCP/IP functionality, the lwIP stack reduces memory use and even code size. In this communication process, a Raspberry Pi acts as the active client and a Field Programmable Gate Array (FPGA) board as the passive server, and they communicate via Ethernet. Three applications based on TCP/IP client-server network communication have been implemented. The Echo server application is used to communicate between the two different kernels of the two boards. Socket programming is used because it is independent of the platform and programming language. TCP transmit and receive throughput test applications are used to measure the maximum throughput of data transmission. These applications are modeled on the open-source tool iperf, which measures the throughput rate by sending or receiving a constant piece of data to the client or server according to the test application.
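The client-server echo exchange described in this record can be sketched in ordinary desktop Python (this is an illustrative analogue of the Echo application, not the lwIP/FPGA code; all names are ours):

```python
import socket
import threading

def echo_server(host="127.0.0.1", port=0):
    # Passive server (the FPGA side in the paper): bind, listen,
    # and echo one message back verbatim.
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, port))
    srv.listen(1)

    def serve():
        conn, _addr = srv.accept()
        with conn:
            data = conn.recv(1024)
            conn.sendall(data)  # echo back what was received
        srv.close()

    threading.Thread(target=serve, daemon=True).start()
    return srv.getsockname()[1]  # actual port assigned by the OS

def echo_client(port, payload):
    # Active client (the Raspberry Pi side): connect, send, read the echo.
    with socket.create_connection(("127.0.0.1", port)) as cli:
        cli.sendall(payload)
        return cli.recv(1024)

port = echo_server()
reply = echo_client(port, b"hello kernel")
print(reply)  # → b'hello kernel'
```

The same pattern (a blocking accept/recv/send loop on the server, a connect/send/recv on the client) is what a lwIP-based embedded implementation expresses through the lwIP socket or netconn API.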

  7. (On)line dancing: choosing an appropriate distance education partner.

    PubMed

    Menn, Mindy; Don Chaney, J

    2014-05-01

    Online-delivered distance education is a burgeoning component of professional development and continuing education. Distance education programs allow individuals to learn in a different location and/or at a different time from fellow learners, thereby increasing the flexibility and number of learning options. Selecting the "right" program for personal development from the ever-growing body of online-delivered education is an individualized decision that can become an overwhelming and challenging process. This Tool presents four important definitions for navigating distance education program description materials and outlines a five-step process to assist in identifying an appropriate program for personal development. The five-step process includes key questions and points to consider while conducting a candid self-assessment, identifying and investigating distance education programs, and then compiling information, comparing programs, and prioritizing a list of programs suitable for application. Furthermore, this Tool highlights important websites for distance education degree program reviews, accreditation information, and open educational resources.

  8. Assessment of RELAP5/MOD2 against a pressurizer spray valve inadvertent fully opening transient and recovery by natural circulation in Jose Cabrera Nuclear Station

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arroyo, R.; Rebollo, L.

    1993-06-01

This document presents a comparison between simulation results and plant measurements of a real event that took place at the JOSE CABRERA nuclear power plant on August 30, 1984. The event originated from the total, continuous, and inadvertent opening of the pressurizer spray valve PCV-400A. JOSE CABRERA is a single-loop Westinghouse PWR belonging to UNION ELECTRICA FENOSA, S.A. (UNION FENOSA), a Spanish utility that participates in the International Code Assessment and Applications Program (ICAP) as a member of UNIDAD ELECTRICA, S.A. (UNESA). This is the second of its two contributions to the Program: the first was an application case and this is an assessment case. The simulation was performed using the RELAP5/MOD2 cycle 36.04 code, running on a CDC CYBER 180/830 computer under the NOS 2.5 operating system. The main phenomena have been calculated correctly, and some conclusions have been drawn about the 3D characteristics of the condensation due to the spray and its simulation with a 1D tool.

  9. Global solutions of restricted open-shell Hartree-Fock theory from semidefinite programming with applications to strongly correlated quantum systems.

    PubMed

    Veeraraghavan, Srikant; Mazziotti, David A

    2014-03-28

    We present a density matrix approach for computing global solutions of restricted open-shell Hartree-Fock theory, based on semidefinite programming (SDP), that gives upper and lower bounds on the Hartree-Fock energy of quantum systems. While wave function approaches to Hartree-Fock theory yield an upper bound to the Hartree-Fock energy, we derive a semidefinite relaxation of Hartree-Fock theory that yields a rigorous lower bound on the Hartree-Fock energy. We also develop an upper-bound algorithm in which Hartree-Fock theory is cast as a SDP with a nonconvex constraint on the rank of the matrix variable. Equality of the upper- and lower-bound energies guarantees that the computed solution is the globally optimal solution of Hartree-Fock theory. The work extends a previously presented method for closed-shell systems [S. Veeraraghavan and D. A. Mazziotti, Phys. Rev. A 89, 010502-R (2014)]. For strongly correlated systems the SDP approach provides an alternative to the locally optimized Hartree-Fock energies and densities with a certificate of global optimality. Applications are made to the potential energy curves of C2, CN, Cr2, and NO2.
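The shape of the relaxation in this record can be sketched schematically (our notation, not the paper's). For the one-particle density matrix D of an N-electron system in an orthonormal basis, Hartree-Fock theory imposes the nonconvex idempotency constraint, which the semidefinite relaxation replaces with linear matrix inequalities:

```latex
% Hartree-Fock constraint on the 1-RDM (nonconvex):
D^2 = D, \qquad \operatorname{Tr} D = N
% Convex semidefinite relaxation, yielding a rigorous lower bound on E_{\mathrm{HF}}:
0 \preceq D \preceq I, \qquad \operatorname{Tr} D = N
```

Because every idempotent D satisfies the relaxed constraints, minimizing the energy over the larger convex set can only lower the optimum, which is why the SDP value bounds the Hartree-Fock energy from below.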

  10. Design and implementation of a cloud based lithography illumination pupil processing application

    NASA Astrophysics Data System (ADS)

    Zhang, Youbao; Ma, Xinghua; Zhu, Jing; Zhang, Fang; Huang, Huijie

    2017-02-01

Pupil parameters are important parameters for evaluating the quality of a lithography illumination system. In this paper, a cloud-based, full-featured pupil processing application is implemented. A web browser is used for the UI (User Interface), the WebSocket protocol and JSON format are used for communication between the client and the server, and the computation is implemented on the server side, where the application integrates a variety of high-quality professional libraries, such as the image processing libraries libvips and ImageMagick and the LaTeX automatic reporting system, to support the program. The cloud-based framework takes advantage of the server's superior computing power and rich software collection, and the program can run anywhere there is a modern browser thanks to its web UI design. Compared to the traditional software operation model (purchased, licensed, shipped, downloaded, installed, maintained, and upgraded), the new cloud-based approach, which requires no installation and is easy to use and maintain, opens up a new way. Cloud-based applications may well be the future of software development.
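The JSON request/reply exchange between browser client and compute server can be illustrated with Python's standard library. The message schema and the pupil-ellipticity formula below are hypothetical, chosen only to show the pattern; the actual field names and computations of the application are not published:

```python
import json

def make_request(action, params):
    # Client side: serialize a pupil-processing request into a JSON text
    # frame, as would be sent over the WebSocket connection.
    return json.dumps({"action": action, "params": params})

def handle_request(frame):
    # Server side: parse the frame, dispatch on the requested action,
    # and return a JSON-encoded reply.
    msg = json.loads(frame)
    if msg["action"] == "ellipticity":
        a = msg["params"]["major"]
        b = msg["params"]["minor"]
        # Illustrative pupil-shape metric (our choice, not the paper's).
        result = {"ellipticity": round(1.0 - b / a, 4)}
    else:
        result = {"error": "unknown action"}
    return json.dumps(result)

frame = make_request("ellipticity", {"major": 1.0, "minor": 0.95})
reply = json.loads(handle_request(frame))
print(reply)  # → {'ellipticity': 0.05}
```

Text-based JSON frames like these are easy to inspect in browser developer tools, which is part of why the WebSocket+JSON combination suits a browser-fronted compute service.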

  11. A Locality-Based Threading Algorithm for the Configuration-Interaction Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shan, Hongzhang; Williams, Samuel; Johnson, Calvin

The Configuration Interaction (CI) method has been widely used to solve the non-relativistic many-body Schrodinger equation. One great challenge to implementing it efficiently on manycore architectures is its immense memory and data movement requirements. To address this issue, within each node, we exploit a hybrid MPI+OpenMP programming model in lieu of the traditional flat MPI programming model. Here in this paper, we develop optimizations that partition the workloads among OpenMP threads based on data locality, which is essential in ensuring applications with complex data access patterns scale well on manycore architectures. The new algorithm scales to 256 threads on the 64-core Intel Knights Landing (KNL) manycore processor and 24 threads on dual-socket Ivy Bridge (Xeon) nodes. Compared with the original implementation, the performance has been improved by up to 7× on the Knights Landing processor and 3× on the dual-socket Ivy Bridge node.

  12. A Locality-Based Threading Algorithm for the Configuration-Interaction Method

    DOE PAGES

    Shan, Hongzhang; Williams, Samuel; Johnson, Calvin; ...

    2017-07-03

The Configuration Interaction (CI) method has been widely used to solve the non-relativistic many-body Schrodinger equation. One great challenge to implementing it efficiently on manycore architectures is its immense memory and data movement requirements. To address this issue, within each node, we exploit a hybrid MPI+OpenMP programming model in lieu of the traditional flat MPI programming model. Here in this paper, we develop optimizations that partition the workloads among OpenMP threads based on data locality, which is essential in ensuring applications with complex data access patterns scale well on manycore architectures. The new algorithm scales to 256 threads on the 64-core Intel Knights Landing (KNL) manycore processor and 24 threads on dual-socket Ivy Bridge (Xeon) nodes. Compared with the original implementation, the performance has been improved by up to 7× on the Knights Landing processor and 3× on the dual-socket Ivy Bridge node.

  13. A low-cost programmable pulse generator for physiology and behavior

    PubMed Central

    Sanders, Joshua I.; Kepecs, Adam

    2014-01-01

    Precisely timed experimental manipulations of the brain and its sensory environment are often employed to reveal principles of brain function. While complex and reliable pulse trains for temporal stimulus control can be generated with commercial instruments, contemporary options remain expensive and proprietary. We have developed Pulse Pal, an open source device that allows users to create and trigger software-defined trains of voltage pulses with high temporal precision. Here we describe Pulse Pal’s circuitry and firmware, and characterize its precision and reliability. In addition, we supply online documentation with instructions for assembling, testing and installing Pulse Pal. While the device can be operated as a stand-alone instrument, we also provide application programming interfaces in several programming languages. As an inexpensive, flexible and open solution for temporal control, we anticipate that Pulse Pal will be used to address a wide range of instrumentation timing challenges in neuroscience research. PMID:25566051
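A software-defined pulse train of the kind Pulse Pal renders in hardware can be sketched as a plain timing computation. The helper below is illustrative only (our names and interface, not the Pulse Pal API): it computes the (onset, offset) schedule of a square pulse train from frequency, pulse width, and total duration:

```python
def pulse_train(frequency_hz, pulse_width_s, duration_s):
    # Compute (onset, offset) times for a square pulse train -- the kind
    # of schedule a programmable pulse generator plays out with high
    # temporal precision. Illustrative helper, not the Pulse Pal API.
    period = 1.0 / frequency_hz
    times = []
    t = 0.0
    while t + pulse_width_s <= duration_s:
        times.append((round(t, 6), round(t + pulse_width_s, 6)))
        t += period
    return times

# 10 Hz train of 2 ms pulses over 0.5 s: onsets at 0, 0.1, 0.2, 0.3, 0.4 s.
train = pulse_train(frequency_hz=10, pulse_width_s=0.002, duration_s=0.5)
print(len(train), train[0])  # → 5 (0.0, 0.002)
```

A host-side API (such as the ones the authors provide in several languages) would transmit a schedule like this to the device, which then generates the voltage pulses independently of operating-system timing jitter.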

  14. Generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB.

    PubMed

    Lee, Leng-Feng; Umberger, Brian R

    2016-01-01

Computer modeling, simulation and optimization are powerful tools that have seen increased use in biomechanics research. Dynamic optimizations can be categorized as either data-tracking or predictive problems. The data-tracking approach has been used extensively to address human movement problems of clinical relevance. The predictive approach also holds great promise, but has seen limited use in clinical applications. Enhanced software tools would facilitate the application of predictive musculoskeletal simulations to clinically-relevant research. The open-source software OpenSim provides tools for generating tracking simulations but not predictive simulations. However, OpenSim includes an extensive application programming interface that permits extending its capabilities with scripting languages such as MATLAB. In the work presented here, we combine the computational tools provided by MATLAB with the musculoskeletal modeling capabilities of OpenSim to create a framework for generating predictive simulations of musculoskeletal movement based on direct collocation optimal control techniques. In many cases, the direct collocation approach can be used to solve optimal control problems considerably faster than traditional shooting methods. Cyclical and discrete movement problems were solved using a simple 1 degree of freedom musculoskeletal model and a model of the human lower limb, respectively. The problems could be solved in reasonable amounts of time (several seconds to 1-2 hours) using the open-source IPOPT solver. The problems could also be solved using the fmincon solver that is included with MATLAB, but the computation times were excessively long for all but the smallest of problems. The performance advantage for IPOPT was derived primarily by exploiting sparsity in the constraints Jacobian. The framework presented here provides a powerful and flexible approach for generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB. This should allow researchers to more readily use predictive simulation as a tool to address clinical conditions that limit human mobility.
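The direct-collocation transcription this record relies on can be illustrated with a toy problem. The sketch below (names and discretization ours, not the authors') builds the trapezoidal "defect" constraints for a double integrator (x' = v, v' = u); a solver such as IPOPT would drive these defects to zero while minimizing a cost:

```python
# Minimal sketch of trapezoidal direct-collocation defect constraints for
# a double integrator. In a real transcription these defects become the
# equality constraints of a nonlinear program.

def trapezoidal_defects(x, v, u, h):
    # One defect per state equation per interval. The trapezoidal rule
    # approximates the integral of the dynamics over each step of size h.
    dx = [x[k + 1] - x[k] - h / 2 * (v[k] + v[k + 1]) for k in range(len(x) - 1)]
    dv = [v[k + 1] - v[k] - h / 2 * (u[k] + u[k + 1]) for k in range(len(v) - 1)]
    return dx + dv

# The exact solution of x'' = 1 (x = t^2/2, v = t) satisfies the
# trapezoidal rule exactly, so all defects vanish at the nodes.
N, h = 10, 0.1
t = [k * h for k in range(N + 1)]
x = [tk ** 2 / 2 for tk in t]
v = t[:]
u = [1.0] * (N + 1)
defects = trapezoidal_defects(x, v, u, h)
print(max(abs(d) for d in defects))  # maximum defect; zero to machine precision
```

Because each defect couples only neighboring nodes, the resulting constraint Jacobian is sparse and banded, which is exactly the structure IPOPT exploits for the speedups the abstract reports.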

  15. Generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB

    PubMed Central

    Lee, Leng-Feng

    2016-01-01

Computer modeling, simulation and optimization are powerful tools that have seen increased use in biomechanics research. Dynamic optimizations can be categorized as either data-tracking or predictive problems. The data-tracking approach has been used extensively to address human movement problems of clinical relevance. The predictive approach also holds great promise, but has seen limited use in clinical applications. Enhanced software tools would facilitate the application of predictive musculoskeletal simulations to clinically-relevant research. The open-source software OpenSim provides tools for generating tracking simulations but not predictive simulations. However, OpenSim includes an extensive application programming interface that permits extending its capabilities with scripting languages such as MATLAB. In the work presented here, we combine the computational tools provided by MATLAB with the musculoskeletal modeling capabilities of OpenSim to create a framework for generating predictive simulations of musculoskeletal movement based on direct collocation optimal control techniques. In many cases, the direct collocation approach can be used to solve optimal control problems considerably faster than traditional shooting methods. Cyclical and discrete movement problems were solved using a simple 1 degree of freedom musculoskeletal model and a model of the human lower limb, respectively. The problems could be solved in reasonable amounts of time (several seconds to 1–2 hours) using the open-source IPOPT solver. The problems could also be solved using the fmincon solver that is included with MATLAB, but the computation times were excessively long for all but the smallest of problems. The performance advantage for IPOPT was derived primarily by exploiting sparsity in the constraints Jacobian. The framework presented here provides a powerful and flexible approach for generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB. This should allow researchers to more readily use predictive simulation as a tool to address clinical conditions that limit human mobility. PMID:26835184

  16. Multilevel Parallelization of AutoDock 4.2.

    PubMed

    Norgan, Andrew P; Coffman, Paul K; Kocher, Jean-Pierre A; Katzmann, David J; Sosa, Carlos P

    2011-04-28

Virtual (computational) screening is an increasingly important tool for drug discovery. AutoDock is a popular open-source application for performing molecular docking, the prediction of ligand-receptor interactions. AutoDock is a serial application, though several previous efforts have parallelized various aspects of the program. In this paper, we report on a multi-level parallelization of AutoDock 4.2 (mpAD4). Using MPI and OpenMP, AutoDock 4.2 was parallelized for use on MPI-enabled systems and to multithread the execution of individual docking jobs. In addition, code was implemented to reduce input/output (I/O) traffic by reusing grid maps at each node from docking to docking. Performance of mpAD4 was examined on two multiprocessor computers. Using MPI with OpenMP multithreading, mpAD4 scales with near linearity on the multiprocessor systems tested. In situations where I/O is limiting, reuse of grid maps reduces both system I/O and overall screening time. Multithreading of AutoDock's Lamarckian Genetic Algorithm with OpenMP increases the speed of execution of individual docking jobs, and when combined with MPI parallelization can significantly reduce the execution time of virtual screens. This work is significant in that mpAD4 speeds the execution of certain molecular docking workloads and allows the user to optimize the degree of system-level (MPI) and node-level (OpenMP) parallelization to best fit both workloads and computational resources.

  17. Bayesian Atmospheric Radiative Transfer (BART)Thermochemical Equilibrium Abundance (TEA) Code and Application to WASP-43b

    NASA Astrophysics Data System (ADS)

    Blecic, Jasmina; Harrington, Joseph; Bowman, Matthew O.; Cubillos, Patricio E.; Stemm, Madison; Foster, Andrew

    2014-11-01

We present a new, open-source, Thermochemical Equilibrium Abundances (TEA) code that calculates the abundances of gaseous molecular species. TEA uses the Gibbs-free-energy minimization method with an iterative Lagrangian optimization scheme. It initializes the radiative-transfer calculation in our Bayesian Atmospheric Radiative Transfer (BART) code. Given elemental abundances, TEA calculates molecular abundances for a particular temperature and pressure or a list of temperature-pressure pairs. The code is tested against the original method developed by White et al. (1958), the analytic method developed by Burrows and Sharp (1999), and the Newton-Raphson method implemented in the open-source Chemical Equilibrium with Applications (CEA) code. TEA is written in Python and is available to the community via the open-source development site GitHub.com. We also present BART applied to eclipse depths of the exoplanet WASP-43b, constraining atmospheric thermal and chemical parameters. This work was supported by NASA Planetary Atmospheres grant NNX12AI69G and NASA Astrophysics Data Analysis Program grant NNX13AF38G. JB holds a NASA Earth and Space Science Fellowship.
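The minimization at the core of such a code can be written schematically, following the classic White et al. (1958) formulation (symbols below are ours): minimize the dimensionless Gibbs function over the species mole numbers n_i, subject to elemental-abundance constraints,

```latex
\min_{n_i \ge 0}\; \frac{G(n)}{RT}
  = \sum_{i} n_i \left[ \frac{g_i^{\circ}}{RT} + \ln P
    + \ln \frac{n_i}{\sum_j n_j} \right],
\qquad \text{subject to } \sum_{i} a_{ij}\, n_i = b_j ,
```

where a_ij is the number of atoms of element j in species i, b_j is the total elemental abundance of element j, and g_i° is the standard-state Gibbs free energy of species i. Enforcing the constraints with Lagrange multipliers and iterating is the "iterative Lagrangian optimization scheme" the abstract refers to.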

  18. Harvey Mudd 2014-2015 Computer Science Conduit Clinic Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aspesi, G; Bai, J; Deese, R

    2015-05-12

Conduit, a new open-source library developed at Lawrence Livermore National Laboratory, provides a C++ application programming interface (API) to describe and access scientific data. Conduit's primary use is for in-memory data exchange in high performance computing (HPC) applications. Our team tested and improved Conduit to make it more appealing to potential adopters in the HPC community. We extended Conduit's capabilities by prototyping four libraries: one for parallel communication using MPI, one for I/O functionality, one for aggregating performance data, and one for data visualization.

  19. Plowshare Program - American Atomic Bomb Tests For Industrial Applications

    ScienceCinema

    None

    2018-01-16

    The United States Atomic Energy Commission (AEC) established the Plowshare Program as a research and development activity to explore the technical and economic feasibility of using nuclear explosives for industrial applications. The reasoning was that the relatively inexpensive energy available from nuclear explosions could prove useful for a wide variety of peaceful purposes. The Plowshare Program began in 1958 and continued through 1975. Between December 1961 and May 1973, the United States conducted 27 Plowshare nuclear explosive tests comprising 35 individual detonations. Conceptually, industrial applications resulting from the use of nuclear explosives could be divided into two broad categories: 1) large-scale excavation and quarrying, where the energy from the explosion was used to break up and/or move rock; and 2) underground engineering, where the energy released from deeply buried nuclear explosives increased the permeability and porosity of the rock by massive breaking and fracturing. Possible excavation applications included: canals, harbors, highway and railroad cuts through mountains, open pit mining, construction of dams, and other quarry and construction-related projects. Underground nuclear explosion applications included: stimulation of natural gas production, preparation of leachable ore bodies for in situ leaching, creation of underground zones of fractured oil shale for in situ retorting, and formation of underground natural gas and petroleum storage reservoirs.

  20. Plowshare Program - American Atomic Bomb Tests For Industrial Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2012-04-22

The United States Atomic Energy Commission (AEC) established the Plowshare Program as a research and development activity to explore the technical and economic feasibility of using nuclear explosives for industrial applications. The reasoning was that the relatively inexpensive energy available from nuclear explosions could prove useful for a wide variety of peaceful purposes. The Plowshare Program began in 1958 and continued through 1975. Between December 1961 and May 1973, the United States conducted 27 Plowshare nuclear explosive tests comprising 35 individual detonations. Conceptually, industrial applications resulting from the use of nuclear explosives could be divided into two broad categories: 1) large-scale excavation and quarrying, where the energy from the explosion was used to break up and/or move rock; and 2) underground engineering, where the energy released from deeply buried nuclear explosives increased the permeability and porosity of the rock by massive breaking and fracturing. Possible excavation applications included: canals, harbors, highway and railroad cuts through mountains, open pit mining, construction of dams, and other quarry and construction-related projects. Underground nuclear explosion applications included: stimulation of natural gas production, preparation of leachable ore bodies for in situ leaching, creation of underground zones of fractured oil shale for in situ retorting, and formation of underground natural gas and petroleum storage reservoirs.

  1. U.S. Seismic Design Maps Web Application

    NASA Astrophysics Data System (ADS)

    Martinez, E.; Fee, J.

    2015-12-01

    The application computes earthquake ground motion design parameters compatible with the International Building Code and other seismic design provisions. It is the primary method for design engineers to obtain ground motion parameters for multiple building codes across the country. When designing new buildings and other structures, engineers around the country use the application. Users specify the design code of interest, location, and other parameters to obtain necessary ground motion information consisting of a high-level executive summary as well as detailed information including maps, data, and graphs. Results are formatted such that they can be directly included in a final engineering report. In addition to single-site analysis, the application supports a batch mode for simultaneous consideration of multiple locations. Finally, an application programming interface (API) is available which allows other application developers to integrate this application's results into larger applications for additional processing. Development on the application has proceeded in an iterative manner working with engineers through email, meetings, and workshops. Each iteration provided new features, improved performance, and usability enhancements. This development approach positioned the application to be integral to the structural design process and is now used to produce over 1800 reports daily. Recent efforts have enhanced the application to be a data-driven, mobile-first, responsive web application. Development is ongoing, and source code has recently been published into the open-source community on GitHub. Open-sourcing the code facilitates improved incorporation of user feedback to add new features ensuring the application's continued success.

  2. CHARMM-GUI Input Generator for NAMD, GROMACS, AMBER, OpenMM, and CHARMM/OpenMM Simulations Using the CHARMM36 Additive Force Field

    DOE PAGES

    Lee, Jumin; Cheng, Xi; Swails, Jason M.; ...

    2015-11-12

Here we report that proper treatment of nonbonded interactions is essential for the accuracy of molecular dynamics (MD) simulations, especially in studies of lipid bilayers. The use of the CHARMM36 force field (C36 FF) in different MD simulation programs can result in disagreements with published simulations performed with CHARMM due to differences in the protocols used to treat the long-range and 1-4 nonbonded interactions. In this study, we systematically test the use of the C36 lipid FF in NAMD, GROMACS, AMBER, OpenMM, and CHARMM/OpenMM. A wide range of Lennard-Jones (LJ) cutoff schemes and integrator algorithms were tested to find the optimal simulation protocol to best match bilayer properties of six lipids with varying acyl chain saturation and head groups. MD simulations of a 1,2-dipalmitoyl-sn-phosphatidylcholine (DPPC) bilayer were used to obtain the optimal protocol for each program. MD simulations with all programs were found to reasonably match the DPPC bilayer properties (surface area per lipid, chain order parameters, and area compressibility modulus) obtained using the standard protocol used in CHARMM as well as from experiments. The optimal simulation protocol was then applied to the other five lipid simulations and resulted in excellent agreement between results from most simulation programs as well as with experimental data. AMBER compared least favorably with the expected membrane properties, which appears to be due to its use of the hard-truncation in the LJ potential versus a force-based switching function used to smooth the LJ potential as it approaches the cutoff distance. The optimal simulation protocol for each program has been implemented in CHARMM-GUI. This protocol is expected to be applicable to the remainder of the additive C36 FF including the proteins, nucleic acids, carbohydrates, and small molecules.

  3. CHARMM-GUI Input Generator for NAMD, GROMACS, AMBER, OpenMM, and CHARMM/OpenMM Simulations Using the CHARMM36 Additive Force Field.

    PubMed

    Lee, Jumin; Cheng, Xi; Swails, Jason M; Yeom, Min Sun; Eastman, Peter K; Lemkul, Justin A; Wei, Shuai; Buckner, Joshua; Jeong, Jong Cheol; Qi, Yifei; Jo, Sunhwan; Pande, Vijay S; Case, David A; Brooks, Charles L; MacKerell, Alexander D; Klauda, Jeffery B; Im, Wonpil

    2016-01-12

    Proper treatment of nonbonded interactions is essential for the accuracy of molecular dynamics (MD) simulations, especially in studies of lipid bilayers. The use of the CHARMM36 force field (C36 FF) in different MD simulation programs can result in disagreements with published simulations performed with CHARMM due to differences in the protocols used to treat the long-range and 1-4 nonbonded interactions. In this study, we systematically test the use of the C36 lipid FF in NAMD, GROMACS, AMBER, OpenMM, and CHARMM/OpenMM. A wide range of Lennard-Jones (LJ) cutoff schemes and integrator algorithms were tested to find the optimal simulation protocol to best match bilayer properties of six lipids with varying acyl chain saturation and head groups. MD simulations of a 1,2-dipalmitoyl-sn-phosphatidylcholine (DPPC) bilayer were used to obtain the optimal protocol for each program. MD simulations with all programs were found to reasonably match the DPPC bilayer properties (surface area per lipid, chain order parameters, and area compressibility modulus) obtained using the standard protocol used in CHARMM as well as from experiments. The optimal simulation protocol was then applied to the other five lipid simulations and resulted in excellent agreement between results from most simulation programs as well as with experimental data. AMBER compared least favorably with the expected membrane properties, which appears to be due to its use of the hard-truncation in the LJ potential versus a force-based switching function used to smooth the LJ potential as it approaches the cutoff distance. The optimal simulation protocol for each program has been implemented in CHARMM-GUI. This protocol is expected to be applicable to the remainder of the additive C36 FF including the proteins, nucleic acids, carbohydrates, and small molecules.

  4. Application-oriented integrated control center (AICC) for heterogeneous optical networks

    NASA Astrophysics Data System (ADS)

    Zhao, Yongli; Zhang, Jie; Cao, Xuping; Wang, Dajiang; Wu, Koubo; Cai, Yinxiang; Gu, Wanyi

    2011-12-01

Various broadband services, such as data center applications and cloud computing, have been consuming the bandwidth resources of optical networks. Although available bandwidth is increasing with the development of transmission technologies, challenges remain for future optical networks. The relationship between the upper application layer and the lower network resource layer requires further research. To improve the efficiency of network resources and the capability of service provisioning, heterogeneous optical network resources can be abstracted as unified Application Programming Interfaces (APIs), which can be opened to various upper applications through the Application-oriented Integrated Control Center (AICC) proposed in this paper. A novel OpenFlow-based unified control architecture is proposed for the optimization of cross-layer resources. Numerical results from simulation experiments show the good performance of AICC.

  5. A Flexible Method for Producing F.E.M. Analysis of Bone Using Open-Source Software

    NASA Technical Reports Server (NTRS)

    Boppana, Abhishektha; Sefcik, Ryan; Meyers, Jerry G.; Lewandowski, Beth E.

    2016-01-01

    This project, performed in support of the NASA GRC Space Academy summer program, sought to develop an open-source workflow methodology that segmented medical image data, created a 3D model from the segmented data, and prepared the model for finite-element analysis. In an initial step, a technological survey evaluated the performance of various existing open-source software that claim to perform these tasks. However, the survey concluded that no single software exhibited the wide array of functionality required for the potential NASA application in the area of bone, muscle and bio fluidic studies. As a result, development of a series of Python scripts provided the bridging mechanism to address the shortcomings of the available open source tools. The implementation of the VTK library provided the most quick and effective means of segmenting regions of interest from the medical images; it allowed for the export of a 3D model by using the marching cubes algorithm to build a surface mesh. To facilitate the development of the model domain from this extracted information required a surface mesh to be processed in the open-source software packages Blender and Gmsh. The Preview program of the FEBio suite proved to be sufficient for volume filling the model with an unstructured mesh and preparing boundaries specifications for finite element analysis. To fully allow FEM modeling, an in house developed Python script allowed assignment of material properties on an element by element basis by performing a weighted interpolation of voxel intensity of the parent medical image correlated to published information of image intensity to material properties, such as ash density. A graphical user interface combined the Python scripts and other software into a user friendly interface. The work using Python scripts provides a potential alternative to expensive commercial software and inadequate, limited open-source freeware programs for the creation of 3D computational models. 
More work will be needed to validate this approach in creating finite-element models.
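    The element-wise material assignment described above can be sketched in a few lines of Python. The calibration pairs and the element-to-voxel mapping below are hypothetical placeholders, not values from the project; the sketch only illustrates the piecewise-linear interpolation from voxel intensity to a material property such as ash density.

```python
# Hypothetical calibration pairs: (image intensity, ash density in g/cm^3).
calibration = [(0, 0.05), (500, 0.6), (1000, 1.2), (1500, 1.8)]

def intensity_to_density(v):
    """Piecewise-linear interpolation of a voxel intensity to a density."""
    pts = sorted(calibration)
    if v <= pts[0][0]:
        return pts[0][1]
    if v >= pts[-1][0]:
        return pts[-1][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= v <= x1:
            t = (v - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)

# Assign each element a property from the mean intensity of its voxels
# (hypothetical element IDs and voxel intensities).
element_voxels = {1: [400, 600], 2: [1200, 1400]}
props = {e: intensity_to_density(sum(vs) / len(vs))
         for e, vs in element_voxels.items()}
```

    A real pipeline would pull the voxel intensities from the parent image for each finite element; the interpolation step itself is no more complicated than this.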

  6. A contextual perspective on talented female participants and their development in extracurricular STEM programs.

    PubMed

    Stoeger, Heidrun; Schirner, Sigrun; Laemmle, Lena; Obergriesser, Stefanie; Heilemann, Michael; Ziegler, Albert

    2016-08-01

    We advocate a more contextual perspective in giftedness research. In our view, doing so opens up three particularly interesting research areas, which we refer to as the participation issue, the effectiveness issue, and the interaction issue. To illustrate their utility, we examined characteristics of females participating in German high achiever-track secondary education who had applied for participation in a 1-year extracurricular e-mentoring program in science, technology, engineering, and mathematics (STEM) (n = 1237). Their characteristics were compared with male and female random-sample control groups. We assessed the effectiveness of the mentoring program by comparing the developmental trajectories of program participants with those of three control groups: applicants who were randomly chosen for later participation (waiting-list control group) and a female and a male control group. Finally, we examined whether differences in program effectiveness could be partially explained by characteristics of the interaction with the domain. Program applicants possessed more advantageous individual characteristics but, unexpectedly, less advantageous home and school environments than female and male members of the control groups. Program participation effected positive changes in certainty about career goals (independent of STEM) and in the number of STEM activities. The amount of STEM communication partially explained differences in program effectiveness. © 2016 New York Academy of Sciences.

  7. Essential elements to the establishment and design of a successful robotic surgery programme.

    PubMed

    Patel, Vipul R

    2006-03-01

    The application of robotic-assisted technology has created a new era in surgery by addressing some of the limitations of conventional open and laparoscopic surgery. To optimize success, the incorporation of robotics into a surgical program must be performed with a structured approach. We discuss the key factors for building a successful robotic surgery program. Prior to implementing a robotics program, certain essential elements must be examined. One must assess the overall goals of the program, the initial applications of the technology, and the timeline for success. In addition, a financial analysis of the potential impact of the technology must be performed. Essential personnel should also be identified in order to form a cohesive robotic surgery team. These preparatory steps help coordinate the establishment of the program and prevent unrealistic expectations, while generating the best environment for success. Once the purchase of the robotic system has been approved, a robotic surgery team is created with certain essential components. This staff includes the surgeons, nursing staff, physician assistants, residents/fellows, a program coordinator, marketing, and a financial analysis team. This team will work together to achieve the common goals of the program. Robotic-assisted surgery has grown tremendously over the last half decade in certain surgical fields such as urology. The success of programs has been variable and often related to the infrastructure of the program. The key factors appear to be creation of a sound financial plan, early identification of applicable specialties, and a motivated surgical team. Copyright 2006 John Wiley & Sons, Ltd.

  8. KSC-02pd0618

    NASA Image and Video Library

    2002-04-29

    KENNEDY SPACE CENTER, FLA. -- Robert Ferl, professor in the horticultural sciences department and assistant director of the University of Florida Biotechnology Program, speaks during the opening ceremony to launch a new program called SABRE, Space Agricultural Biotechnology Research and Education, that involves UF and NASA. SABRE will focus on the discovery, development and application of the biological aspects of advanced life support strategies. The program will include faculty from UF's Institute of Food and Agricultural Sciences, who will be located at both KSC - in the state-owned Space Experiment Research and Processing Laboratory (SERPL) being built there - and UF in Gainesville. Ferl will direct and be responsible for coordinating the research and education efforts of UF and NASA.

  9. Workshop on Advanced Technologies for Planetary Instruments, part 1

    NASA Technical Reports Server (NTRS)

    Appleby, John F. (Editor)

    1993-01-01

    This meeting was conceived in response to new challenges facing NASA's robotic solar system exploration program. This volume contains papers presented at the Workshop on Advanced Technologies for Planetary Instruments on 28-30 Apr. 1993. Over the past several years, SDIO has sponsored a significant technology development program aimed, in part, at the production of instruments with these characteristics. This workshop provided an opportunity for specialists from the planetary science and DoD communities to establish contacts, to explore common technical ground in an open forum, and more specifically, to discuss the applicability of SDIO's technology base to planetary science instruments.

  10. Hybrid cloud and cluster computing paradigms for life science applications

    PubMed Central

    2010-01-01

    Background: Clouds and MapReduce have shown themselves to be a broadly useful approach to scientific computing, especially for parallel data-intensive applications. However, they have limited applicability to some areas, such as data mining, because MapReduce has poor performance on problems with an iterative structure present in the linear algebra that underlies much data analysis. Such problems can be run efficiently on clusters using MPI, leading to a hybrid cloud and cluster environment. This motivates the design and implementation of an open-source Iterative MapReduce system, Twister. Results: Comparisons of Amazon, Azure, and traditional Linux and Windows environments on common applications have shown encouraging performance and usability in several important non-iterative cases. These are linked to MPI applications for the final stages of the data analysis. Further, we have released the open-source Twister Iterative MapReduce and benchmarked it against basic MapReduce (Hadoop) and MPI in information retrieval and life sciences applications. Conclusions: The hybrid cloud (MapReduce) and cluster (MPI) approach offers an attractive production environment, while Twister promises a uniform programming environment for many Life Sciences applications. Methods: We used the commercial clouds Amazon and Azure and the NSF resource FutureGrid to perform detailed comparisons and evaluations of different approaches to data-intensive computing. Several applications were developed in MPI, MapReduce, and Twister in these different environments. PMID:21210982

  11. Hybrid cloud and cluster computing paradigms for life science applications.

    PubMed

    Qiu, Judy; Ekanayake, Jaliya; Gunarathne, Thilina; Choi, Jong Youl; Bae, Seung-Hee; Li, Hui; Zhang, Bingjing; Wu, Tak-Lon; Ruan, Yang; Ekanayake, Saliya; Hughes, Adam; Fox, Geoffrey

    2010-12-21

    Clouds and MapReduce have shown themselves to be a broadly useful approach to scientific computing, especially for parallel data-intensive applications. However, they have limited applicability to some areas, such as data mining, because MapReduce has poor performance on problems with an iterative structure present in the linear algebra that underlies much data analysis. Such problems can be run efficiently on clusters using MPI, leading to a hybrid cloud and cluster environment. This motivates the design and implementation of an open-source Iterative MapReduce system, Twister. Comparisons of Amazon, Azure, and traditional Linux and Windows environments on common applications have shown encouraging performance and usability in several important non-iterative cases. These are linked to MPI applications for the final stages of the data analysis. Further, we have released the open-source Twister Iterative MapReduce and benchmarked it against basic MapReduce (Hadoop) and MPI in information retrieval and life sciences applications. The hybrid cloud (MapReduce) and cluster (MPI) approach offers an attractive production environment, while Twister promises a uniform programming environment for many life sciences applications. We used the commercial clouds Amazon and Azure and the NSF resource FutureGrid to perform detailed comparisons and evaluations of different approaches to data-intensive computing. Several applications were developed in MPI, MapReduce, and Twister in these different environments.
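    The iterative structure that plain MapReduce handles poorly, and that systems like Twister target, can be illustrated with a single-process Python sketch: each iteration "maps" points to their nearest centroid and "reduces" each group to a new centroid, feeding the result back into the next iteration (the data and starting centroids below are made up for illustration).

```python
# Toy 1-D k-means expressed as an iterated map/reduce cycle.
points = [1.0, 1.2, 0.8, 8.0, 8.4, 7.6]
centroids = [0.0, 10.0]  # hypothetical starting guesses

for _ in range(10):  # fixed iteration cap for the sketch
    # "Map": emit (nearest-centroid index, point) pairs.
    pairs = [(min(range(len(centroids)),
                  key=lambda i: abs(p - centroids[i])), p)
             for p in points]
    # "Reduce": average the points assigned to each centroid index,
    # producing the centroids for the next iteration.
    centroids = [
        sum(p for i, p in pairs if i == k)
        / max(1, sum(1 for i, p in pairs if i == k))
        for k in range(len(centroids))
    ]
```

    In Hadoop-style MapReduce each such iteration is a separate job with data reloaded from disk, which is exactly the overhead an iterative runtime amortizes by keeping state resident between cycles.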

  12. IJ-OpenCV: Combining ImageJ and OpenCV for processing images in biomedicine.

    PubMed

    Domínguez, César; Heras, Jónathan; Pascual, Vico

    2017-05-01

    The effective processing of biomedical images usually requires the interoperability of diverse software tools that have different aims but are complementary. The goal of this work is to develop a bridge to connect two of those tools: ImageJ, a program for image analysis in life sciences, and OpenCV, a computer vision and machine learning library. Based on a thorough analysis of ImageJ and OpenCV, we detected the features of these systems that could be enhanced, and developed a library to combine both tools, taking advantage of the strengths of each system. The library was implemented on top of the SciJava converter framework. We also provide a methodology to use this library. We have developed the publicly available library IJ-OpenCV that can be employed to create applications combining features from both ImageJ and OpenCV. From the perspective of ImageJ developers, they can use IJ-OpenCV to easily create plugins that use any functionality provided by the OpenCV library and explore different alternatives. From the perspective of OpenCV developers, this library provides a link to the ImageJ graphical user interface and all its features to handle regions of interest. The IJ-OpenCV library bridges the gap between ImageJ and OpenCV, allowing the connection and the cooperation of these two systems. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Better Finite-Element Analysis of Composite Shell Structures

    NASA Technical Reports Server (NTRS)

    Clarke, Gregory

    2007-01-01

    A computer program implements a finite-element-based method of predicting the deformations of thin aerospace structures made of isotropic materials or anisotropic fiber-reinforced composite materials. The technique and corresponding software are applicable to thin shell structures in general and are particularly useful for analysis of thin beamlike members having open cross-sections (e.g. I-beams and C-channels) in which significant warping can occur.

  14. Undergraduate Research Opportunities in OSS

    NASA Astrophysics Data System (ADS)

    Boldyreff, Cornelia; Capiluppi, Andrea; Knowles, Thomas; Munro, James

    Using Open Source Software (OSS) in undergraduate teaching in universities is now commonplace. Students use OSS applications and systems in their courses on programming, operating systems, DBMS, and web development, to name but a few. Studying OSS projects from both a product and a process view also forms part of the software engineering curriculum at various universities. Many students have also taken part in OSS projects as developers.

  15. Opening the archive: how free data has enabled the science and monitoring promise of Landsat

    Treesearch

    Michael A. Wulder; Jeffrey G. Masek; Warren B. Cohen; Thomas R. Loveland; Curtis E. Woodcock

    2012-01-01

    Landsat occupies a unique position in the constellation of civilian earth observation satellites, with a long and rich scientific and applications heritage. With nearly 40 years of continuous observation—since launch of the first satellite in 1972—the Landsat program has benefited from insightful technical specification, robust engineering, and the necessary...

  16. Impacts and Viability of Open Source Software on Earth Science Metadata Clearing House and Service Registry Applications

    NASA Astrophysics Data System (ADS)

    Pilone, D.; Cechini, M. F.; Mitchell, A.

    2011-12-01

    Earth Science applications typically deal with large amounts of data and high throughput rates, if not also high transaction rates. While Open Source is frequently used for smaller scientific applications, large scale, highly available systems frequently fall back to "enterprise" class solutions like Oracle RAC or commercial grade JEE Application Servers. NASA's Earth Observing System Data and Information System (EOSDIS) provides end-to-end capabilities for managing NASA's Earth science data from multiple sources - satellites, aircraft, field measurements, and various other programs. A core capability of EOSDIS, the Earth Observing System (EOS) Clearinghouse (ECHO), is a highly available search and order clearinghouse of over 100 million pieces of science data that has evolved from its early R&D days to a fully operational system. Over the course of this maturity ECHO has largely transitioned from commercial frameworks, databases, and operating systems to Open Source solutions - and in some cases, back. In this talk we discuss the progression of our technological solutions and our lessons learned in the areas of:

    - High-performance, large-scale searching solutions
    - Geospatial search capabilities and dealing with multiple coordinate systems
    - Search and storage of variable-format source (science) data
    - Highly available deployment solutions
    - Scalable (elastic) solutions for visual searching and image handling

    Throughout the evolution of the ECHO system we have had to evaluate solutions with respect to performance, cost, developer productivity, reliability, and maintainability in the context of supporting global science users. Open Source solutions have played a significant role in our architecture and development but several critical commercial components remain (or have been reinserted) to meet our operational demands.

  17. USGS Science Data Catalog - Open Data Advances or Declines

    NASA Astrophysics Data System (ADS)

    Frame, M. T.; Hutchison, V.; Zolly, L.; Wheeler, B.; Latysh, N.; Devarakonda, R.; Palanisamy, G.; Shrestha, B.

    2014-12-01

    The recent Office of Science and Technology Policy (OSTP) White House Open Data Policies (2013) have required Federal agencies to establish formal catalogues of their science data holdings and make these data easily available on Web sites, portals, and applications. As an organization, the USGS has historically excelled at making its data holdings freely available on its various Web sites (i.e., National, Scientific Programs, or local Science Center). In response to these requirements, the USGS Core Science Analytics, Synthesis, and Libraries program, in collaboration with DOE's Oak Ridge National Laboratory (ORNL) Mercury Consortium (funded by NASA, USGS, and DOE), and a number of other USGS organizations, established the Science Data Catalog (http://data.usgs.gov) cyberinfrastructure, content management processes/tools, and supporting policies. The USGS Science Data Catalog led the charge at USGS to improve the robustness of existing/future metadata collections; streamline and develop sustainable publishing to external aggregators (i.e., data.gov); and provide leadership to the U.S. Department of Interior in emerging Open Data policies, techniques, and systems. The session will discuss the current successes, challenges, and movement toward meeting these Open Data policies for USGS scientific data holdings. A retrospective look at the last year of implementation of these efforts within USGS will occur to determine whether these Open Data Policies are improving data access or limiting data availability. To learn more about the USGS Science Data Catalog, visit us at http://data.usgs.gov/info/about.html

  18. WHATIF: an open-source desktop application for extraction and management of the incidental findings from next-generation sequencing variant data

    PubMed Central

    Ye, Zhan; Kadolph, Christopher; Strenn, Robert; Wall, Daniel; McPherson, Elizabeth; Lin, Simon

    2015-01-01

    Background: Identification and evaluation of incidental findings in patients following whole exome sequencing (WES) or whole genome sequencing (WGS) is challenging for both practicing physicians and researchers. The American College of Medical Genetics and Genomics (ACMG) recently recommended a list of reportable incidental genetic findings. However, no informatics tools are currently available to support evaluation of incidental findings in next-generation sequencing data. Methods: The Wisconsin Hierarchical Analysis Tool for Incidental Findings (WHATIF) was developed as a stand-alone Windows-based desktop executable to support the interactive analysis of incidental findings in the context of the ACMG recommendations. WHATIF integrates the European Bioinformatics Institute Variant Effect Predictor (VEP) tool for biological interpretation and the National Center for Biotechnology Information ClinVar tool for clinical interpretation. Results: An open-source desktop program was created to annotate incidental findings and present the results with a user-friendly interface. Further, a meaningful index (WHATIF Index) was devised for each gene to facilitate ranking of the relative importance of the variants and to estimate the potential workload associated with further evaluation of the variants. Our WHATIF application is available at: http://tinyurl.com/WHATIF-SOFTWARE Conclusions: The WHATIF application offers a user-friendly interface and allows users to investigate the extracted variant information efficiently and intuitively while always accessing up-to-date information on variants via application programming interface (API) connections. WHATIF's highly flexible design and straightforward implementation aid users in customizing the source code to meet their own special needs. PMID:25890833

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gearhart, Jared Lee; Adair, Kristin Lynn; Durfee, Justin David.

    When developing linear programming models, issues such as budget limitations, customer requirements, or licensing may preclude the use of commercial linear programming solvers. In such cases, one option is to use an open-source linear programming solver. A survey of linear programming tools was conducted to identify potential open-source solvers. From this survey, four open-source solvers were tested using a collection of linear programming test problems and the results were compared to IBM ILOG CPLEX Optimizer (CPLEX) [1], an industry standard. The solvers considered were: COIN-OR Linear Programming (CLP) [2], [3], GNU Linear Programming Kit (GLPK) [4], lp_solve [5] and Modular In-core Nonlinear Optimization System (MINOS) [6]. As no open-source solver outperforms CPLEX, this study demonstrates the power of commercial linear programming software. CLP was found to be the top performing open-source solver considered in terms of capability and speed. GLPK also performed well but cannot match the speed of CLP or CPLEX. lp_solve and MINOS were considerably slower and encountered issues when solving several test problems.
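    The study's benchmark problems are not reproduced here, but the kind of problem all of these solvers handle can be shown with a tiny two-variable LP. The pure-Python sketch below finds the optimum by enumerating the vertices of the feasible region (an LP optimum always lies at a vertex); the specific objective and constraints are invented for illustration, and real solvers such as CLP or GLPK search among these vertices far more efficiently via the simplex method.

```python
from itertools import combinations

# Maximize 3x + 2y subject to x + y <= 4, x <= 2, x >= 0, y >= 0.
# Each constraint is stored as (a, b, c) meaning a*x + b*y <= c.
cons = [(1, 1, 4), (1, 0, 2), (-1, 0, 0), (0, -1, 0)]
obj = (3, 2)

def intersect(c1, c2):
    """Intersection point of two constraint boundary lines, or None if parallel."""
    a1, b1, r1 = c1
    a2, b2, r2 = c2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        return None
    return ((r1 * b2 - r2 * b1) / det, (a1 * r2 - a2 * r1) / det)

def feasible(pt):
    return all(a * pt[0] + b * pt[1] <= c + 1e-9 for a, b, c in cons)

# Candidate vertices are feasible intersections of constraint pairs.
vertices = [p for c1, c2 in combinations(cons, 2)
            if (p := intersect(c1, c2)) and feasible(p)]
best = max(vertices, key=lambda p: obj[0] * p[0] + obj[1] * p[1])
```

    Here `best` is (2, 2) with objective value 10; vertex enumeration is exponential in general, which is why the solvers compared in the study matter.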

  20. Web Platform for Sharing Modeling Software in the Field of Nonlinear Optics

    NASA Astrophysics Data System (ADS)

    Dubenskaya, Julia; Kryukov, Alexander; Demichev, Andrey

    2018-02-01

    We describe the prototype of a Web platform intended for sharing software programs for computer modeling in the rapidly developing field of nonlinear optics phenomena. The suggested platform is built on top of the HUBZero open-source middleware. In addition to the basic HUBZero installation, we added to our platform the capability to run Docker containers via an external application server and to send calculation programs to those containers for execution. The presented web platform provides a wide range of features and might be of benefit to nonlinear optics researchers.

  1. Investigation of air transportation technology at Princeton University, 1984

    NASA Technical Reports Server (NTRS)

    Stengel, Robert F.

    1987-01-01

    The Air Transportation Technology Program at Princeton University, a program emphasizing graduate and undergraduate student research, proceeded along four avenues during 1984: (1) guidance and control strategies for penetration of microbursts and wind shear; (2) application of artificial intelligence in flight control systems; (3) effects of control saturation on closed loop stability; and (4) response of open loop unstable aircraft. Areas of investigation relate to guidance and control of commercial transports as well as to general aviation aircraft. Interaction between the flight crew and automatic systems is a subject of principal concern. These areas of investigation are briefly discussed.

  2. An Evaluation of Grazing-Incidence Optics for Neutron Imaging

    NASA Technical Reports Server (NTRS)

    Gubarev, M. V.

    2007-01-01

    The refractive index for most materials is slightly less than unity, which opens an opportunity to develop grazing-incidence neutron imaging optics. The ideal material for the optics would be natural nickel and its isotopes. Marshall Space Flight Center (MSFC) has an active development program on nickel replicated optics for use in x-ray astronomy. A brief status report on the program is presented. The results of the neutron focusing optics test carried out by the MSFC team at the National Institute of Standards and Technology (NIST) are also presented. Possible applications of the optics are briefly discussed.

  3. PiCO QL: A software library for runtime interactive queries on program data

    NASA Astrophysics Data System (ADS)

    Fragkoulis, Marios; Spinellis, Diomidis; Louridas, Panos

    PiCO QL is an open source C/C++ software whose scientific scope is real-time interactive analysis of in-memory data through SQL queries. It exposes a relational view of a system's or application's data structures, which is queryable through SQL. While the application or system is executing, users can input queries through a web-based interface or issue web service requests. Queries execute on the live data structures through the respective relational views. PiCO QL makes a good candidate for ad-hoc data analysis in applications and for diagnostics in systems settings. Applications of PiCO QL include the Linux kernel, the Valgrind instrumentation framework, a GIS application, a virtual real-time observatory of stellar objects, and a source code analyser.
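    PiCO QL itself is C/C++ and exposes live data structures through generated relational views, so its API is not reproduced here. As a rough stdlib-only analogue of the idea, the hedged Python sketch below copies an in-memory data structure into an in-memory SQLite database and queries it with SQL; the `proc`-style records are invented for illustration.

```python
import sqlite3

# Hypothetical in-memory application data: (pid, name, cpu_seconds) records,
# standing in for the live structures PiCO QL exposes relationally.
processes = [(1, "init", 0.2), (42, "httpd", 13.5), (99, "backup", 250.0)]

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE proc (pid INTEGER, name TEXT, cpu_seconds REAL)")
con.executemany("INSERT INTO proc VALUES (?, ?, ?)", processes)

# Ad-hoc diagnostic query over program data, expressed as SQL.
rows = con.execute(
    "SELECT name FROM proc WHERE cpu_seconds > 10 ORDER BY cpu_seconds DESC"
).fetchall()
```

    The key difference is that PiCO QL queries the live structures in place while the program runs, rather than a copied snapshot as in this sketch.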

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kegel, T.M.

    Calibration laboratories are faced with the need to become accredited or registered to one or more quality standards. One requirement common to all of these standards is the need to have in place a measurement assurance program. What is a measurement assurance program? Brian Belanger, in Measurement Assurance Programs: Part 1, describes it as a "quality assurance program for a measurement process that quantifies the total uncertainty of the measurements (both random and systematic components of error) with respect to national or designated standards and demonstrates that the total uncertainty is sufficiently small to meet the user's requirements." Rolf Schumacher is more specific in Measurement Assurance in Your Own Laboratory. He states, "Measurement assurance is the application of broad quality control principles to measurements of calibrations." Here, the focus is on one important part of any measurement assurance program: implementation of statistical process control (SPC). Paraphrasing Juran's Quality Control Handbook, a process is in statistical control if the only observed variations are those that can be attributed to random causes. Conversely, a process that exhibits variations due to assignable causes is not in a state of statistical control. Finally, Carrol Croarkin states, "In the measurement assurance context the measurement algorithm including instrumentation, reference standards and operator interactions is the process that is to be controlled, and its direct product is the measurement per se. The measurements are assumed to be valid if the measurement algorithm is operating in a state of control." Implicit in this statement is the important fact that an out-of-control process cannot produce valid measurements. 7 figs.
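    The SPC implementation the abstract discusses reduces, in its simplest Shewhart form, to establishing 3-sigma control limits from historical in-control data and flagging new measurements that fall outside them. The readings below are hypothetical; a laboratory would derive its limits from its own check-standard history.

```python
from statistics import mean, stdev

# Hypothetical check-standard readings gathered while the measurement
# process was known to be stable.
baseline = [10.01, 9.98, 10.02, 10.00, 9.99, 10.03, 10.01, 9.97]

center = mean(baseline)
sigma = stdev(baseline)
ucl = center + 3 * sigma  # upper control limit (Shewhart 3-sigma)
lcl = center - 3 * sigma  # lower control limit

def in_statistical_control(reading):
    """A reading outside the 3-sigma limits suggests an assignable cause."""
    return lcl <= reading <= ucl

# New calibration measurements are checked against the established limits.
ok = in_statistical_control(10.02)
suspect = in_statistical_control(10.25)
```

    Consistent with the Croarkin quote above, a `suspect` reading means the measurement process, not just the single value, must be treated as invalid until the assignable cause is found.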

  5. KSC-02pd0617

    NASA Image and Video Library

    2002-04-29

    KENNEDY SPACE CENTER, FLA. -- Florida Representative Bob Allen speaks to attendees at the opening ceremony kicking off a new program known as SABRE, Space Agricultural Biotechnology Research and Education. The program is a combined effort of the University of Florida and NASA. SABRE will focus on the discovery, development and application of the biological aspects of advanced life support strategies. The program will include faculty from UF's Institute of Food and Agricultural Sciences, who will be located at both KSC - in the state-owned Space Experiment Research and Processing Laboratory (SERPL) being built there - and UF in Gainesville. SABRE will be directed by Robert Ferl, professor in the horticultural sciences department and assistant director of UF's Biotechnology Program. He will be responsible for coordinating the research and education efforts of UF and NASA.

  6. An Overview of Air-Breathing Propulsion Efforts for 2015 SBIR Phase I

    NASA Technical Reports Server (NTRS)

    Nguyen, Hung D.; Steele, Gynelle C.

    2016-01-01

    NASA's Small Business Innovation Research (SBIR) program focuses on technological innovation by investing in development of innovative concepts and technologies to help NASA mission directorates address critical research needs for Agency programs. This report highlights 24 of the innovative SBIR 2015 Phase I projects that emphasize one of NASA Glenn Research Center's six core competencies-Air-Breathing Propulsion. The technologies cover a wide spectrum of applications such as hybrid nanocomposites for efficient aerospace structures; plasma flow control for drag reduction; physics-based aeroanalysis methods for open rotor conceptual designs; vertical lift by series hybrid power; fast pressure-sensitive paint systems for production wind tunnel testing; rugged, compact, and inexpensive airborne fiber sensor interrogators based on monolithic tunable lasers; and high sensitivity semiconductor sensor skins for multi-axis surface pressure characterization. Each featured technology describes an innovation and technical objective and highlights NASA commercial and industrial applications. This report provides an opportunity for NASA engineers, researchers, and program managers to learn how NASA SBIR technologies could help their programs and projects, and lead to collaborations and partnerships between the small SBIR companies and NASA that would benefit both.

  7. The NASA LeRC regenerative fuel cell system testbed program for government and commercial applications

    NASA Astrophysics Data System (ADS)

    Maloney, Thomas M.; Prokopius, Paul R.; Voecks, Gerald E.

    1995-01-01

    The Electrochemical Technology Branch of the NASA Lewis Research Center (LeRC) has initiated a program to develop a renewable energy system testbed to evaluate, characterize, and demonstrate a fully integrated regenerative fuel cell (RFC) system for space, military, and commercial applications. A multi-agency management team, led by NASA LeRC, is implementing the program through a unique international coalition which encompasses both government and industry participants. This open-ended teaming strategy optimizes the development of space, military, and commercial RFC system technologies. Program activities to date include system design and analysis, and reactant storage sub-system design, with a major emphasis centered upon testbed fabrication and installation and testing of two key RFC system components, namely, the fuel cells and electrolyzers. Construction of the LeRC 25 kW RFC system testbed at the NASA Jet Propulsion Laboratory (JPL) facility at Edwards Air Force Base (EAFB) is nearly complete and some sub-system components have already been installed. Furthermore, planning for the first commercial RFC system demonstration is underway.

  8. Emerging Leaders: AED's Open World Program.

    ERIC Educational Resources Information Center

    McDonald, Sandra

    2003-01-01

    Describes the Open World Program, funded and administered by the Library of Congress, with support from private organizations such as the Academy for Educational Development (AED). Open World Program allows community colleges to participate by hosting delegations from other countries. Some themes include: environment, women as leaders, economic…

  9. 47 CFR 76.1512 - Programming information.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... MULTICHANNEL VIDEO AND CABLE TELEVISION SERVICE Open Video Systems § 76.1512 Programming information. (a) An open video system operator shall not unreasonably discriminate in favor of itself or its affiliates... for the purpose of selecting programming on the open video system, or in the way such material or...

  10. 47 CFR 76.1512 - Programming information.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... MULTICHANNEL VIDEO AND CABLE TELEVISION SERVICE Open Video Systems § 76.1512 Programming information. (a) An open video system operator shall not unreasonably discriminate in favor of itself or its affiliates... for the purpose of selecting programming on the open video system, or in the way such material or...

  11. 47 CFR 76.1512 - Programming information.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... MULTICHANNEL VIDEO AND CABLE TELEVISION SERVICE Open Video Systems § 76.1512 Programming information. (a) An open video system operator shall not unreasonably discriminate in favor of itself or its affiliates... for the purpose of selecting programming on the open video system, or in the way such material or...

  12. 47 CFR 76.1512 - Programming information.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... MULTICHANNEL VIDEO AND CABLE TELEVISION SERVICE Open Video Systems § 76.1512 Programming information. (a) An open video system operator shall not unreasonably discriminate in favor of itself or its affiliates... for the purpose of selecting programming on the open video system, or in the way such material or...

  13. Amine Swingbed Payload Project Management

    NASA Technical Reports Server (NTRS)

    Hayley, Elizabeth; Curley, Su; Walsh, Mary

    2011-01-01

    The International Space Station (ISS) has been designed as a laboratory for demonstrating technologies in a microgravity environment, benefitting exploration programs by reducing the overall risk of implementing such technologies in new spacecraft. At the beginning of fiscal year 2010, the ISS program manager requested that the amine-based, pressure-swing carbon dioxide and humidity absorption technology (designed by Hamilton Sundstrand, baselined for the ORION Multi-Purpose Crew Vehicle, and tested at the Johnson Space Center in relevant environments, including with humans, since 2005) be developed into a payload for ISS Utilization. In addition to evaluating the amine technology in a flight environment before the first launch of the ORION vehicle, the ISS program wanted to determine the capability of the amine technology to remove carbon dioxide from the ISS cabin environment at the metabolic rate of the full 6-person crew. Because the amine technology vents the absorbed carbon dioxide and water vapor to space vacuum (open loop), additional hardware needed to be developed to minimize the amount of air and water resources lost overboard. Additionally, the payload system would be launched on two separate Space Shuttle flights, with the heart of the payload the swingbed unit itself launching a full year before the remainder of the payload. This paper discusses the project management and challenges of developing the amine swingbed payload in order to accomplish the technology objectives of both the open-loop ORION application as well as the closed-loop ISS application.

  14. Amine Swingbed Payload Project Management

    NASA Technical Reports Server (NTRS)

    Walsch, Mary; Curley, Su

    2013-01-01

    The International Space Station (ISS) has been designed as a laboratory for demonstrating technologies in a microgravity environment, benefitting exploration programs by reducing the overall risk of implementing such technologies in new spacecraft. At the beginning of fiscal year 2010, the ISS program manager requested that the amine-based, pressure-swing carbon dioxide and humidity absorption technology (designed by Hamilton Sundstrand, baselined for the Orion Multi-Purpose Crew Vehicle, and tested at the Johnson Space Center in relevant environments, including with humans, since 2005) be developed into a payload for ISS Utilization. In addition to evaluating the amine technology in a flight environment before the first launch of the Orion vehicle, the ISS program wanted to determine the capability of the amine technology to remove carbon dioxide from the ISS cabin environment at the metabolic rate of the full 6-person crew. Because the amine technology vents the absorbed carbon dioxide and water vapor to space vacuum (open loop), additional hardware needed to be developed to minimize the amount of air and water resources lost overboard. Additionally, the payload system would be launched on two separate Space Shuttle flights, with the heart of the payload, the swingbed unit itself, launching a full year before the remainder of the payload. This paper discusses the project management and challenges of developing the amine swingbed payload in order to accomplish the technology objectives of both the open-loop Orion application as well as the closed-loop ISS application.

  15. Promoting organ donation through an entertainment-education TV program in Korea: Open Your Eyes.

    PubMed

    Lee, Byoung Kwan; Park, Hyun Soon; Choi, Myung-Il; Kim, Cheon Soo

    2010-01-01

    The purpose of this study is to investigate the effects of the characteristics of Open Your Eyes, an entertainment-education TV program in Korea, on parasocial interaction and behavioral intention for organ donation. The results indicated that affective evaluation positively affected parasocial interaction with the program, but cognitive evaluation negatively affected involvement with beneficiaries in the program. It was also found that cognitive evaluation of Open Your Eyes had a significant positive effect on behavioral intention, as did program engagement. Thus, the results indicate that individuals who feel engaged with Open Your Eyes are more likely to proceed with organ donation. However, no direct effect of involvement with the beneficiary and program hosts was found.

  16. Disambiguate: An open-source application for disambiguating two species in next generation sequencing data from grafted samples.

    PubMed

    Ahdesmäki, Miika J; Gray, Simon R; Johnson, Justin H; Lai, Zhongwu

    2016-01-01

    Grafting of cell lines and primary tumours is a crucial step in the drug development process between cell line studies and clinical trials. Disambiguate is a program for computationally separating the sequencing reads of two species derived from grafted samples. Disambiguate operates on DNA or RNA-seq alignments to the two species and separates the components at very high sensitivity and specificity, as illustrated in artificially mixed human-mouse samples. This allows for maximum recovery of data from target tumours for more accurate variant calling and gene expression quantification. Given that no general-use open-source algorithm accessible to the bioinformatics community exists for separating the two species' data, the proposed Disambiguate tool presents a novel approach and an improvement to sequence analysis of grafted samples. Both Python and C++ implementations are available, and they are integrated into several open and closed source pipelines. Disambiguate is open source and is freely available at https://github.com/AstraZeneca-NGS/disambiguate.
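    The core idea described above, assigning each read to the species whose genome aligns it with the higher score and flagging ties as ambiguous, can be sketched in a few lines. This is an illustrative reimplementation of the general technique, not Disambiguate's actual code; the per-read score dictionaries and the tie rule are assumptions for the sketch.

```python
def assign_reads(human_scores, mouse_scores):
    """Partition read IDs into human, mouse, and ambiguous bins by
    comparing per-read alignment scores against each genome.
    Reads that aligned to only one genome go to that species;
    equal scores against both genomes cannot be disambiguated."""
    human, mouse, ambiguous = [], [], []
    for read_id in set(human_scores) | set(mouse_scores):
        h = human_scores.get(read_id)
        m = mouse_scores.get(read_id)
        if m is None or (h is not None and h > m):
            human.append(read_id)
        elif h is None or m > h:
            mouse.append(read_id)
        else:  # tie: keep out of both species' output
            ambiguous.append(read_id)
    return human, mouse, ambiguous
```

In a real pipeline the scores would come from alignment records (e.g. BAM tags) rather than dictionaries, but the partitioning logic is the same.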

  17. Maintaining Scientific Community Vocabularies in Drupal through Consumption of Linked Open Data and Web Services

    NASA Astrophysics Data System (ADS)

    Shepherd, A.; Arko, R. A.; Maffei, A. R.; Chandler, C. L.

    2012-12-01

    In the Summer of 2011, a one-year pilot project was funded by the National Science Foundation to build a pre-cruise planning application using the Drupal content management system (CMS). This application will be used to assist the individual operators of research vessels in the UNOLS fleet. A large portion of the operator's pre-cruise process revolves around a questionnaire presented to the principal investigator (PI) that is used to gather information about the nature of their upcoming cruise. The Drupal-based application will be delivered as a distribution for use by any operator of a UNOLS vessel to construct customized questionnaires and provide an interface for the PI to complete this questionnaire at their leisure. A major goal of the project is to develop an application that will require as little programming maintenance as possible after the initial development effort. One of the strategies employed is the reuse of existing controlled vocabularies and linked open data wherever possible for fields of the questionnaire - most notably to populate the concepts of Country, Organization, Port, and Ship. The Rolling Deck to Repository (R2R) program manages controlled vocabularies for these concepts and currently exposes these concepts as linked open data. Furthermore, R2R has identified the authoritative source for pertinent oceanographic community vocabularies as ICES for Ship, UNOLS for Port, IANA for Organization, ISO for Country, ISO for Language, SeaDataNet for Device, FIPS for State, and IHO for Sea Area as described at http://www.rvdata.us/voc. The scope of the terms provided by these sources matches the scope of the operator's needs for these concepts, and so the application is being designed to automatically consume served information about these vocabulary terms to populate and update Drupal taxonomies for use in the questionnaire. Where newer terms are required for a PI to complete a questionnaire (before they appear in the vocabularies), the Drupal-based application employs features that provide extensibility to the Drupal taxonomies while striving for lower development and maintenance costs through the use of existing Drupal modules such as web_taxonomy, autocomplete field widgets and custom modules for consuming and managing data at SPARQL endpoints.
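    The consumption step described above depends on the standard SPARQL 1.1 JSON results format that endpoints return for SELECT queries. A minimal sketch of turning such a document into (URI, label) pairs suitable for taxonomy terms, assuming the query bound variables named uri and label (the vocabulary URI below is a placeholder, not a real R2R term):

```python
import json

def terms_from_sparql_json(payload):
    """Extract (uri, label) taxonomy-term pairs from a SPARQL 1.1
    JSON results document, the format an endpoint returns for a
    SELECT ?uri ?label query."""
    doc = json.loads(payload)
    return [(b["uri"]["value"], b["label"]["value"])
            for b in doc["results"]["bindings"]]

# A sample results document of the kind an endpoint would return.
sample = json.dumps({
    "head": {"vars": ["uri", "label"]},
    "results": {"bindings": [
        {"uri": {"type": "uri", "value": "http://example.org/ship/1"},
         "label": {"type": "literal", "value": "R/V Knorr"}},
    ]},
})
```

In the Drupal application these pairs would then be written into a taxonomy vocabulary; only the parsing step is sketched here.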

  18. An automated program for reinforcement requirements for openings in cylindrical pressure vessels

    NASA Technical Reports Server (NTRS)

    Wilson, J. F.; Taylor, J. T.

    1975-01-01

    An automated interactive program for calculating the reinforcement requirements for openings in cylindrical pressure vessels subjected to internal pressure is described. The program is written for an electronic desk top calculator. The program calculates the required area of reinforcement for a given opening and compares this value with the area of reinforcement provided by a proposed design. All program steps, operating instructions, and example problems with input and sample output are documented.
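    The area-replacement idea behind such a program can be sketched as follows. This is a simplified illustration of the general rule (metal removed by the opening must be replaced by reinforcement near it), not the paper's actual code; real pressure-vessel codes add correction factors and limits of reinforcement that are omitted here.

```python
def required_thickness(pressure, radius, allowable_stress):
    """Required shell thickness from the thin-wall hoop-stress
    formula t_r = P*R / (S - 0.6*P) (ASME-style, internal
    pressure, weld joint efficiency taken as 1)."""
    return pressure * radius / (allowable_stress - 0.6 * pressure)

def reinforcement_check(d_opening, t_required, area_provided):
    """Simplified area-replacement check for an opening in a
    cylindrical shell: the required reinforcement area is the
    opening diameter times the required shell thickness. Returns
    the required area and whether the proposed design provides it."""
    area_required = d_opening * t_required
    return area_required, area_provided >= area_required
```

For example, a 4 in. opening in a shell whose required thickness is 0.25 in. needs 1.0 sq in. of reinforcement under this simplified rule.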

  19. High-performance heat pipes for heat recovery applications

    NASA Technical Reports Server (NTRS)

    Saaski, E. W.; Hartl, J. H.

    1980-01-01

    Methods to improve the performance of reflux heat pipes for heat recovery applications were examined both analytically and experimentally. Various models for the estimation of reflux heat pipe transport capacity were surveyed in the literature and compared with experimental data. A high transport capacity reflux heat pipe was developed that provides up to a factor of 10 capacity improvement over conventional open tube designs; analytical models were developed for this device and incorporated into a computer program HPIPE. Good agreement of the model predictions with data for R-11 and benzene reflux heat pipes was obtained.

  20. Education of Intellectual Properties for the Training of Creative Engineers

    NASA Astrophysics Data System (ADS)

    Ito, Yoshifumi; Kajiwara, Katuhiko; Oodan, Kyouji

    Kurume National College of Technology has achieved results by combining intellectual property education with education in invention. In the education program, students learn about industrial property and gain practical expertise such as searching open patents, drawing up patent maps, and preparing patent applications for the Patent Office under the guidance of a teacher, a patent adviser, and a patent attorney. As a result, some of the creative students have already applied for patents. In the future, we plan to establish an intellectual property management system at our college to strengthen cooperative patent applications with local companies.

  1. Assistance to Firefighters Program: Distribution of Fire Grant Funding

    DTIC Science & Technology

    2010-07-13

    shows the appropriations history for firefighter assistance, including AFG, SAFER, and the Fire Station Construction Grants (SCG) provided in the...1.109 billion $210 million $6.521 billion a. Assistance to Firefighters Fire Station Construction Grants (SCG) grants were funded by the American...and FY2010. The application period for ARRA Assistance to Firefighters Fire Station Construction Grants (SCG) opened on June 11 and closed on July 10

  2. World Wind Tools Reveal Environmental Change

    NASA Technical Reports Server (NTRS)

    2012-01-01

    Originally developed under NASA's Learning Technologies program as a tool to engage and inspire students, World Wind software was released under the NASA Open Source Agreement license. Honolulu, Hawaii-based Intelesense Technologies is one of the companies currently making use of the technology for environmental, public health, and other monitoring applications for nonprofit organizations and Government agencies. The company saved about $1 million in development costs by using the NASA software.

  3. An overview of the Hadoop/MapReduce/HBase framework and its current applications in bioinformatics

    PubMed Central

    2010-01-01

    Background Bioinformatics researchers are now confronted with analysis of ultra large-scale data sets, a problem that will only increase at an alarming rate in coming years. Recent developments in open source software, that is, the Hadoop project and associated software, provide a foundation for scaling to petabyte scale data warehouses on Linux clusters, providing fault-tolerant parallelized analysis on such data using a programming style named MapReduce. Description An overview is given of the current usage within the bioinformatics community of Hadoop, a top-level Apache Software Foundation project, and of associated open source software projects. The concepts behind Hadoop and the associated HBase project are defined, and current bioinformatics software that employ Hadoop is described. The focus is on next-generation sequencing, as the leading application area to date. Conclusions Hadoop and the MapReduce programming paradigm already have a substantial base in the bioinformatics community, especially in the field of next-generation sequencing analysis, and such use is increasing. This is due to the cost-effectiveness of Hadoop-based analysis on commodity Linux clusters, and in the cloud via data upload to cloud vendors who have implemented Hadoop/HBase; and due to the effectiveness and ease-of-use of the MapReduce method in parallelization of many data analysis algorithms. PMID:21210976

  4. An overview of the Hadoop/MapReduce/HBase framework and its current applications in bioinformatics.

    PubMed

    Taylor, Ronald C

    2010-12-21

    Bioinformatics researchers are now confronted with analysis of ultra large-scale data sets, a problem that will only increase at an alarming rate in coming years. Recent developments in open source software, that is, the Hadoop project and associated software, provide a foundation for scaling to petabyte scale data warehouses on Linux clusters, providing fault-tolerant parallelized analysis on such data using a programming style named MapReduce. An overview is given of the current usage within the bioinformatics community of Hadoop, a top-level Apache Software Foundation project, and of associated open source software projects. The concepts behind Hadoop and the associated HBase project are defined, and current bioinformatics software that employ Hadoop is described. The focus is on next-generation sequencing, as the leading application area to date. Hadoop and the MapReduce programming paradigm already have a substantial base in the bioinformatics community, especially in the field of next-generation sequencing analysis, and such use is increasing. This is due to the cost-effectiveness of Hadoop-based analysis on commodity Linux clusters, and in the cloud via data upload to cloud vendors who have implemented Hadoop/HBase; and due to the effectiveness and ease-of-use of the MapReduce method in parallelization of many data analysis algorithms.
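    The MapReduce style both abstracts describe can be illustrated with the canonical word-count example, written here as a sequential Python sketch of the map, shuffle, and reduce phases that Hadoop would distribute across a cluster:

```python
from collections import defaultdict

def map_phase(records):
    """Map: emit a (word, 1) pair for every word in every record."""
    for record in records:
        for word in record.split():
            yield word, 1

def shuffle(pairs):
    """Shuffle: group all emitted values by key, as the framework
    does between the map and reduce phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: sum the counts emitted for each word."""
    return {word: sum(counts) for word, counts in groups.items()}

counts = reduce_phase(shuffle(map_phase(["deep blue sea", "deep sea reads"])))
```

The appeal noted in the abstracts is that a programmer writes only the map and reduce functions; Hadoop supplies the distributed shuffle, fault tolerance, and scheduling.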

  5. An overview on integrated data system for archiving and sharing marine geology and geophysical data in Korea Institute of Ocean Science & Technology (KIOST)

    NASA Astrophysics Data System (ADS)

    Choi, Sang-Hwa; Kim, Sung Dae; Park, Hyuk Min; Lee, SeungHa

    2016-04-01

    We established and operate an integrated data system for managing, archiving, and sharing the marine geology and geophysical data around Korea produced by various research projects and programs at the Korea Institute of Ocean Science & Technology (KIOST). First, to keep the data system consistent through continuous data updates, we set up standard operating procedures (SOPs) for data archiving, data processing and conversion, data quality control, data uploading, DB maintenance, etc. The system comprises two databases: ARCHIVE DB, which stores data in the original forms and formats received from data providers, and GIS DB, which manages all other compilation, processed, and reproduction data and information for data services and GIS application services. Oracle 11g was adopted as the relational DBMS, and open-source GIS techniques were applied for the GIS services: OpenLayers for the user interface, GeoServer as the application server, and PostGIS with PostgreSQL for the GIS database. To make geophysical data in SEG-Y format convenient to use, a viewer program was developed and embedded in the system. Users can search data through the GIS user interface and save the results as a report.
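    As a sketch of what an embedded SEG-Y reader must do, the following reads three fields from the 400-byte binary file header that follows the 3200-byte textual header. The offsets follow the SEG-Y rev 1 layout (sample interval at bytes 3217-3218, samples per trace at 3221-3222, data format code at 3225-3226, all big-endian 16-bit integers); this is an independent illustration, not KIOST's viewer code.

```python
import struct

def read_segy_binary_header(f):
    """Read the sample interval, samples per trace, and data sample
    format code from a SEG-Y file's binary header. The binary
    header occupies bytes 3200-3599, after the textual header."""
    f.seek(3200)
    header = f.read(400)
    sample_interval_us, = struct.unpack_from(">h", header, 16)
    samples_per_trace, = struct.unpack_from(">h", header, 20)
    format_code, = struct.unpack_from(">h", header, 24)
    return {"sample_interval_us": sample_interval_us,
            "samples_per_trace": samples_per_trace,
            "format_code": format_code}
```

A production reader (e.g. one based on ObsPy or segyio) would also handle trace headers and the various sample encodings; only the fixed-offset header fields are shown.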

  6. The Lunar and Planetary Institute Summer Intern Program in Planetary Science

    NASA Astrophysics Data System (ADS)

    Kramer, G. Y.

    2017-12-01

    Since 1977, the Lunar and Planetary Institute (LPI) Summer Intern Program brings undergraduate students from across the world to Houston for 10 weeks of their summer, where they work one-on-one with a scientist at either LPI or Johnson Space Center on a cutting-edge research project in the planetary sciences. The program is geared for students finishing their sophomore and junior years, although graduating seniors may also apply. It is open to international undergraduates as well as students from the United States. Applicants must have at least 50 semester hours of credit (or equivalent sophomore status) and an interest in pursuing a career in the sciences. The application process is somewhat rigorous, requiring three letters of recommendation, official college transcripts, and a letter describing their background, interests, and career goals. The deadline for applications is in early January of the year of the internship. More information about the program and how to apply can be found on the LPI website: http://www.lpi.usra.edu/lpiintern/. Each advisor reads through the applications, looking for academically excellent students and those with scientific interests and backgrounds compatible with the advisor's specific project. Interns are selected fairly from the applicant pool; there are no pre-arranged agreements or selections based on who knows whom. The projects are different every year as new advisors come into the program and existing ones change their research interests and directions. The LPI Summer Intern Program gives students the opportunity to participate in peer-reviewed research, learn from top-notch planetary scientists, and preview various careers in science. For many interns, this program was a defining moment in their careers: when they decided whether or not to follow an academic path, which direction they would take, and how. While past interns can be found all over the world and in a wide variety of occupations, all share the common bond of that summer in Houston.

  7. SCTE: An open-source Perl framework for testing equipment control and data acquisition

    NASA Astrophysics Data System (ADS)

    Mostaço-Guidolin, Luiz C.; Frigori, Rafael B.; Ruchko, Leonid; Galvão, Ricardo M. O.

    2012-07-01

    SCTE intends to provide a simple, yet powerful, framework for building data acquisition and equipment control systems for experimental physics and related areas. Via its SCTE::Instrument module, RS-232, USB, and LAN buses are supported, and the intricacies of hardware communication are encapsulated underneath an object-oriented abstraction layer. Written in Perl, and using the SCPI protocol, enabled instruments can be easily programmed to perform a wide variety of tasks. While this work presents general aspects of the development of data acquisition systems using the SCTE framework, it is illustrated by particular applications designed for the calibration of several in-house developed devices for power measurement in the tokamak TCABR Alfvén Waves Excitement System.
    Catalogue identifier: AELZ_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AELZ_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GNU General Public License Version 3
    No. of lines in distributed program, including test data, etc.: 13 811
    No. of bytes in distributed program, including test data, etc.: 743 709
    Distribution format: tar.gz
    Programming language: Perl version 5.10.0 or higher
    Computer: PC; SCPI-capable digital oscilloscope with RS-232, USB, or LAN communication ports; null-modem, USB, or Ethernet cables
    Operating system: GNU/Linux (2.6.28-11); should also work on any Unix-based operating system
    Classification: 4.14
    External routines: Perl modules Device::SerialPort, Term::ANSIColor, Math::GSL, Net::HTTP; Gnuplot 4.0 or higher
    Nature of problem: Automation of experiments and data acquisition often requires expensive equipment and in-house development of software applications. Nowadays personal computers and test equipment come with fast and easy-to-use communication ports. Instrument vendors often supply application programs capable of controlling such devices, but these are very restricted in functionality; for instance, they cannot control more than one test instrument at a time or automate repetitive tasks. SCTE provides a way of using auxiliary equipment to automate experimental procedures at low cost, using only a free, open-source operating system and libraries.
    Solution method: SCTE provides a Perl module that implements RS-232, USB, and LAN communication, allowing the use of SCPI-capable instruments [1], thereby providing a straightforward way of creating automation and data acquisition applications using personal computers and testing instruments [2].
    References: [1] SCPI Consortium, Standard Commands for Programmable Instruments, 1999, http://www.scpiconsortium.org. [2] L.C.B. Mostaço-Guidolin, Determinação da configuração de ondas de Alfvén excitadas no tokamak TCABR, Master's thesis, Universidade de São Paulo (2007), http://www.teses.usp.br/teses/disponiveis/43/43134/tde-23042009-230419/.
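    Independent of Perl, the SCPI conventions the framework relies on are easy to illustrate: commands are colon-separated node paths, and the standard *IDN? query returns four comma-separated identification fields. A small Python sketch (not part of SCTE) of composing and parsing such strings:

```python
def scpi_query(subsystem_path, suffix="?"):
    """Compose a SCPI query string from a list of node names,
    e.g. ["MEASure", "VOLTage", "DC"] -> "MEASure:VOLTage:DC?"."""
    return ":".join(subsystem_path) + suffix

def parse_idn(reply):
    """Split a standard *IDN? reply into its four comma-separated
    fields: manufacturer, model, serial number, firmware level."""
    fields = [f.strip() for f in reply.strip().split(",")]
    return dict(zip(["manufacturer", "model", "serial", "firmware"], fields))
```

In a real system these strings would be written to and read from the instrument over the serial, USB, or LAN bus; only the string handling is shown here.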

  8. 43 CFR 3816.2 - Application to open lands to location.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 43 Public Lands: Interior 2 2013-10-01 2013-10-01 false Application to open lands to location... LOCATION Mineral Locations in Reclamation Withdrawals § 3816.2 Application to open lands to location. Application to open lands to location under the Act may be filed by a person, association or corporation...

  9. 43 CFR 3816.2 - Application to open lands to location.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 43 Public Lands: Interior 2 2012-10-01 2012-10-01 false Application to open lands to location... LOCATION Mineral Locations in Reclamation Withdrawals § 3816.2 Application to open lands to location. Application to open lands to location under the Act may be filed by a person, association or corporation...

  10. Sustainable Data Evolution Technology for Power Grid Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    The SDET Tool is used to create open-access power grid data sets and facilitate updates of these data sets by the community. Pacific Northwest National Laboratory (PNNL) and its power industry and software vendor partners are developing an innovative sustainable data evolution technology (SDET) to create open-access power grid datasets and facilitate updates to these datasets by the power grid community. The objective is to make this a sustained effort within and beyond the ARPA-E GRID DATA program so that the datasets can evolve over time and meet the current and future needs for power grid optimization and potentially other applications in power grid operation and planning.

  11. Automatic helmet-wearing detection for law enforcement using CCTV cameras

    NASA Astrophysics Data System (ADS)

    Wonghabut, P.; Kumphong, J.; Satiennam, T.; Ung-arunyawee, R.; Leelapatra, W.

    2018-04-01

    The objective of this research is to develop an application for enforcing helmet wearing using CCTV cameras. The developed application aims to help law enforcement by police, eventually changing risk behaviours and consequently reducing the number of accidents and their severity. Conceptually, the application software, implemented in C++ with the OpenCV library, uses two CCTV cameras with different angles of view. Video frames recorded by the wide-angle CCTV camera are used to detect motorcyclists. If any motorcyclist without a helmet is found, the zoomed (narrow-angle) CCTV camera is activated to capture images of the violating motorcyclist and the motorcycle license plate in real time. Captured images are managed in a MySQL database for ticket issuing. The results show that the developed program is able to detect 81% of motorcyclists on various motorcycle types during daytime and night-time. The validation results reveal that the program achieves 74% accuracy in detecting motorcyclists without helmets.

  12. Testing and Validating Machine Learning Classifiers by Metamorphic Testing☆

    PubMed Central

    Xie, Xiaoyuan; Ho, Joshua W. K.; Murphy, Christian; Kaiser, Gail; Xu, Baowen; Chen, Tsong Yueh

    2011-01-01

    Machine Learning algorithms have provided core functionality to many application domains - such as bioinformatics, computational linguistics, etc. However, it is difficult to detect faults in such applications because often there is no “test oracle” to verify the correctness of the computed outputs. To help address the software quality, in this paper we present a technique for testing the implementations of machine learning classification algorithms which support such applications. Our approach is based on the technique “metamorphic testing”, which has been shown to be effective to alleviate the oracle problem. Also presented include a case study on a real-world machine learning application framework, and a discussion of how programmers implementing machine learning algorithms can avoid the common pitfalls discovered in our study. We also conduct mutation analysis and cross-validation, which reveal that our method has high effectiveness in killing mutants, and that observing expected cross-validation result alone is not sufficiently effective to detect faults in a supervised classification program. The effectiveness of metamorphic testing is further confirmed by the detection of real faults in a popular open-source classification program. PMID:21532969
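    A concrete metamorphic relation for a classifier can be sketched without any test oracle: shuffling the training set must not change a 1-nearest-neighbour classifier's predictions. The sketch below illustrates the testing idea only, not the paper's framework; a violation of the relation would signal a fault even though the "correct" labels are unknown.

```python
import random

def predict_1nn(train, query):
    """1-nearest-neighbour prediction: return the label of the
    closest training point (squared Euclidean distance)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    features, label = min(train, key=lambda fl: dist2(fl[0], query))
    return label

def metamorphic_permutation_test(train, queries, trials=10):
    """Metamorphic relation: permuting the training set must not
    change any prediction. Checks the relation over several random
    shuffles; returns False on the first violation."""
    baseline = [predict_1nn(train, q) for q in queries]
    rng = random.Random(0)  # fixed seed for reproducibility
    for _ in range(trials):
        shuffled = train[:]
        rng.shuffle(shuffled)
        if [predict_1nn(shuffled, q) for q in queries] != baseline:
            return False
    return True
```

Other relations discussed in this literature follow the same pattern, e.g. consistently permuting feature columns or duplicating a training sample should also leave predictions unchanged.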

  13. Comparison of cyclic correlation algorithm implemented in matlab and python

    NASA Astrophysics Data System (ADS)

    Carr, Richard; Whitney, James

    Simulation is a necessary step for all engineering projects. Simulation gives engineers an approximation of how their devices will perform under different circumstances, without having to build a physical prototype first. This is especially true for space-bound devices, i.e., space communication systems, where the impact of system malfunction or failure is several orders of magnitude over that of terrestrial applications. Therefore, having a reliable simulation tool is key in developing these devices and systems. MathWorks MATLAB (Matrix Laboratory) is a matrix-based software package used by scientists and engineers to solve problems and perform complex simulations. MATLAB has applications in a wide variety of fields, including communications, signal processing, image processing, mathematics, economics, and physics. Because of its many uses, MATLAB has become the preferred software for many engineers; it is also very expensive, especially for students and startups. One alternative to MATLAB is Python. Python is a powerful, easy-to-use, open-source programming environment that can perform many of the same functions as MATLAB, and it has been steadily gaining popularity in niche programming circles. While the core software does not include as many functions as MATLAB, many open-source functions have been developed and are freely available for download. This paper illustrates how Python can implement the cyclic correlation algorithm and compares the results to the cyclic correlation algorithm implemented in the MATLAB environment. Some of the characteristics to be compared are the accuracy and precision of the results and the length of the programs. The paper demonstrates that Python is capable of performing simulations of complex algorithms such as cyclic correlation.
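    For reference, the cyclic (circular) correlation the paper implements in both environments follows directly from its definition. A pure-Python sketch of the O(N^2) direct form (an FFT-based form computes the same result faster for long sequences):

```python
def cyclic_correlation(x, y):
    """Cyclic cross-correlation by the direct definition:
    r[k] = sum over n of x[n] * y[(n + k) mod N], for k = 0..N-1.
    Both sequences must have the same length N."""
    n = len(x)
    return [sum(x[i] * y[(i + k) % n] for i in range(n))
            for k in range(n)]
```

Correlating against a unit impulse simply rotates the sequence, which makes the function easy to sanity-check by hand.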

  14. 78 FR 33090 - Re-Opening of the Public Comment Period for the Draft Uranium Leasing Program Programmatic...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-03

    ... DEPARTMENT OF ENERGY Re-Opening of the Public Comment Period for the Draft Uranium Leasing Program Programmatic Environmental Impact Statement AGENCY: Department of Energy. ACTION: Re-opening of the public... the Draft Uranium Leasing Program Programmatic Environmental Impact Statement (Draft ULP PEIS, DOE/EIS...

  15. OpenClimateGIS - A Web Service Providing Climate Model Data in Commonly Used Geospatial Formats

    NASA Astrophysics Data System (ADS)

    Erickson, T. A.; Koziol, B. W.; Rood, R. B.

    2011-12-01

    The goal of the OpenClimateGIS project is to make climate model datasets readily available in commonly used, modern geospatial formats used by GIS software, browser-based mapping tools, and virtual globes. The climate modeling community typically stores climate data in multidimensional gridded formats capable of efficiently storing large volumes of data (such as netCDF, grib) while the geospatial community typically uses flexible vector and raster formats that are capable of storing small volumes of data (relative to the multidimensional gridded formats). OpenClimateGIS seeks to address this difference in data formats by clipping climate data to user-specified vector geometries (i.e. areas of interest) and translating the gridded data on-the-fly into multiple vector formats. The OpenClimateGIS system does not store climate data archives locally, but rather works in conjunction with external climate archives that expose climate data via the OPeNDAP protocol. OpenClimateGIS provides a RESTful API web service for accessing climate data resources via HTTP, allowing a wide range of applications to access the climate data. The OpenClimateGIS system has been developed using open source development practices and the source code is publicly available. The project integrates libraries from several other open source projects (including Django, PostGIS, numpy, Shapely, and netcdf4-python). OpenClimateGIS development is supported by a grant from NOAA's Climate Program Office.

  16. First International Symposium on Strain Gauge Balances. Pt. 1

    NASA Technical Reports Server (NTRS)

    Tripp, John S. (Editor); Tcheng, Ping (Editor)

    1999-01-01

    The first International Symposium on Strain Gauge Balances was sponsored and held at NASA Langley Research Center during October 22-25, 1996. The symposium provided an open international forum for presentation, discussion, and exchange of technical information among wind tunnel test technique specialists and strain gauge balance designers. The Symposium also served to initiate organized professional activities among the participating and relevant international technical communities. Over 130 delegates from 15 countries were in attendance. The program opened with a panel discussion, followed by technical paper sessions, and guided tours of the National Transonic Facility (NTF) wind tunnel, a local commercial balance fabrication facility, and the LaRC balance calibration laboratory. The opening panel discussion addressed "Future Trends in Balance Development and Applications." Forty-six technical papers were presented in 11 technical sessions covering the following areas: calibration, automatic calibration, data reduction, facility reports, design, accuracy and uncertainty analysis, strain gauges, instrumentation, balance design, thermal effects, finite element analysis, applications, and special balances. At the conclusion of the Symposium, a steering committee representing most of the nations and several U.S. organizations attending the Symposium was established to initiate planning for a second international balance symposium, to be held in 1999 in the UK.

  17. First International Symposium on Strain Gauge Balances. Part 2

    NASA Technical Reports Server (NTRS)

    Tripp, John S. (Editor); Tcheng, Ping (Editor)

    1999-01-01

    The first International Symposium on Strain Gauge Balances was sponsored and held at NASA Langley Research Center during October 22-25, 1996. The symposium provided an open international forum for presentation, discussion, and exchange of technical information among wind tunnel test technique specialists and strain gauge balance designers. The Symposium also served to initiate organized professional activities among the participating and relevant international technical communities. Over 130 delegates from 15 countries were in attendance. The program opened with a panel discussion, followed by technical paper sessions, and guided tours of the National Transonic Facility (NTF) wind tunnel, a local commercial balance fabrication facility, and the LaRC balance calibration laboratory. The opening panel discussion addressed "Future Trends in Balance Development and Applications." Forty-six technical papers were presented in 11 technical sessions covering the following areas: calibration, automatic calibration, data reduction, facility reports, design, accuracy and uncertainty analysis, strain gauges, instrumentation, balance design, thermal effects, finite element analysis, applications, and special balances. At the conclusion of the Symposium, a steering committee representing most of the nations and several U.S. organizations attending the Symposium was established to initiate planning for a second international balance symposium, to be held in 1999 in the UK.

  18. Modal analysis and dynamic stresses for acoustically excited shuttle insulation tiles

    NASA Technical Reports Server (NTRS)

    Ojalvo, I. U.; Ogilvie, P. L.

    1975-01-01

    Improvements and extensions to the RESIST computer program developed for determining the normalized modal stress response of shuttle insulation tiles are described. The new version of RESIST can accommodate primary structure panels with closed-cell stringers, in addition to the capability for treating open-cell stringers. In addition, the present version of RESIST numerically solves vibration problems several times faster than its predecessor. A new digital computer program, titled ARREST (Acoustic Response of Reusable Shuttle Tiles), is also described. Starting with modal information contained on output tapes from RESIST computer runs, ARREST determines RMS stresses, deflections, and accelerations of shuttle panels with reusable surface insulation tiles. Both programs are applicable to stringer-stiffened structural panels with or without reusable surface insulation tiles.

  19. KSC-02pd0614

    NASA Image and Video Library

    2002-04-29

    KENNEDY SPACE CENTER, FLA. -- U.S. Representative Dave Weldon addresses a large group attending the opening of a new program known as SABRE, Space Agricultural Biotechnology Research and Education, that involves the University of Florida and NASA. SABRE will focus on the discovery, development and application of the biological aspects of advanced life support strategies. The program will include faculty from UF's Institute of Food and Agricultural Sciences, who will be located at both KSC - in the state-owned Space Experiment Research and Processing Laboratory (SERPL) being built there - and UF in Gainesville. SABRE will be directed by Robert Ferl, professor in the horticultural sciences department and assistant director of UF's Biotechnology Program. He will be responsible for coordinating the research and education efforts of UF and NASA.

  20. KSC-02pd0613

    NASA Image and Video Library

    2002-04-29

    KENNEDY SPACE CENTER, FLA. -- Center Director Roy D. Bridges Jr. speaks to a large group attending the opening of a new program known as SABRE, Space Agricultural Biotechnology Research and Education, that involves the University of Florida and NASA. SABRE will focus on the discovery, development and application of the biological aspects of advanced life support strategies. The program will include faculty from UF's Institute of Food and Agricultural Sciences, who will be located at both KSC - in the state-owned Space Experiment Research and Processing Laboratory (SERPL) being built there - and UF in Gainesville. SABRE will be directed by Robert Ferl, professor in the horticultural sciences department and assistant director of UF's Biotechnology Program. He will be responsible for coordinating the research and education efforts of UF and NASA.

  1. BioSmalltalk: a pure object system and library for bioinformatics.

    PubMed

    Morales, Hernán F; Giovambattista, Guillermo

    2013-09-15

    We have developed BioSmalltalk, a new environment for pure object-oriented bioinformatics programming. Adaptive end-user programming systems tend to become more important for discovering biological knowledge, as is demonstrated by the emergence of open-source programming toolkits for bioinformatics in recent years. Our software is intended to bridge the gap between bioscientists and rapid software prototyping while preserving the possibility of scaling to whole-system biology applications. BioSmalltalk performs better in terms of execution time and memory usage than Biopython and BioPerl in some classical situations. BioSmalltalk is cross-platform and freely available (MIT license) through Google Project Hosting at http://code.google.com/p/biosmalltalk. Contact: hernan.morales@gmail.com. Supplementary data are available at Bioinformatics online.

  2. Standardized emissions inventory methodology for open-pit mining areas.

    PubMed

    Huertas, Jose I; Camacho, Dumar A; Huertas, Maria E

    2011-08-01

    There is still interest in a unified methodology to quantify the mass of particulate material emitted into the atmosphere by activities inherent to open-pit mining. For the case of total suspended particles (TSP), the current practice is to estimate such emissions by developing inventories based on the emission factors recommended by the USEPA for this purpose. However, there are disputes over the specific emission factors that must be used for each activity and the applicability of such factors to cases quite different from the ones under which they were obtained. There is also a need for particulate matter with an aerodynamic diameter less than 10 μm (PM(10)) emission inventories and for metrics to evaluate the emission control programs implemented by open-pit mines. To address these needs, work was carried out to establish a standardized TSP and PM(10) emission inventory methodology for open-pit mining areas. The proposed methodology was applied to seven of the eight mining companies operating in the northern part of Colombia, home to one of the world's largest open-pit coal mining operations (∼70 Mt/year). The results obtained show that transport on unpaved roads is the mining activity that generates most of the emissions and that the total emissions may be reduced up to 72% by spraying water on the unpaved roads. Performance metrics were defined for the emission control programs implemented by mining companies. It was found that coal open-pit mines are emitting 0.726 and 0.180 kg of TSP and PM(10), respectively, per ton of coal produced. It was also found that these mines are using on average 1.148 m(2) of land per ton of coal produced per year.
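The per-ton metrics reported above can be turned into a back-of-the-envelope inventory. The sketch below is illustrative arithmetic only: it applies the abstract's reported factors (0.726 kg TSP and 0.180 kg PM10 per ton of coal, and an up-to-72% reduction from road watering) to an annual production figure.

```python
# Illustrative arithmetic using the emission metrics reported in the abstract.
TSP_KG_PER_TON = 0.726
PM10_KG_PER_TON = 0.180
WATERING_REDUCTION = 0.72  # maximum fraction of emissions avoided by spraying roads

def annual_emissions_t(production_tons, watered=False):
    """Return (TSP, PM10) emissions in metric tons for a given coal production."""
    factor = (1 - WATERING_REDUCTION) if watered else 1.0
    tsp = production_tons * TSP_KG_PER_TON * factor / 1000.0
    pm10 = production_tons * PM10_KG_PER_TON * factor / 1000.0
    return tsp, pm10

# ~70 Mt/year, as reported for the northern Colombia mining district
tsp, pm10 = annual_emissions_t(70e6)
tsp_watered, _ = annual_emissions_t(70e6, watered=True)
```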

  3. Application of low cost technology for the management of irrigation in organic orchards

    NASA Astrophysics Data System (ADS)

    Horcajo, Daniel; Patrícia Prazeres Marques, Karina; Rodríguez Sinobas, Leonor

    2014-05-01

    Throughout history, humans have cyclically returned to old traditions such as the organic orchard. Nowadays, organic orchards have been integrated into modern cities and can supply fresh vegetables for the daily diet, improving human health. Organic orchards grow crops without pesticides or artificial fertilizers; thus, they are respectful of the environment and guarantee food safety. In modern society, the application of new technology is a must, in this case to obtain efficient irrigation. In order to monitor irrigation properly and save water and energy, probes are used to measure soil water content. Among them, capacitive probes, monitored with a specific data logger, are typically used. Most of these components, especially the data loggers, are expensive and in many cases go unused. In this work, we have applied the open hardware Arduino to build and program a low cost data logger for the scheduling of irrigation in an experimental organic orchard. Results showed that with such low cost technology, which is readily available in the market and easy to understand, anyone can build and program their own device, helping to manage water resources in organic orchards.
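On the host side, readings from a datalogger like the one described might be consumed as follows. This is a hypothetical sketch: the CSV log format ("timestamp,volumetric_pct" lines) and the trigger threshold are illustrative assumptions, not details from the paper.

```python
# Hypothetical host-side consumer for an Arduino soil-moisture datalogger.
# Assumes one "timestamp,volumetric_pct" CSV line per reading (illustrative).
import csv
import io

THRESHOLD_PCT = 22.0  # assumed irrigation trigger for this soil/crop

def should_irrigate(log_text, threshold=THRESHOLD_PCT):
    """Return True when the mean logged soil-water content falls below threshold."""
    readings = [float(row[1]) for row in csv.reader(io.StringIO(log_text)) if row]
    return bool(readings) and sum(readings) / len(readings) < threshold

sample = "2014-05-01T10:00,24.1\n2014-05-01T11:00,21.3\n2014-05-01T12:00,19.8\n"
```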

  4. Op-Ug TD Optimizer Tool Based on Matlab Code to Find Transition Depth From Open Pit to Block Caving / Narzędzie Optymalizacyjne Oparte O Kod Matlab Wykorzystane Do Określania Głębokości Przejściowej Od Wydobycia Odkrywkowego Do Wybierania Komorami

    NASA Astrophysics Data System (ADS)

    Bakhtavar, E.

    2015-09-01

    In this study, the transition from open pit to block caving has been considered as a challenging problem. For this purpose, linear integer programming code was initially developed in Matlab on the basis of the binary integer model proposed by Bakhtavar et al. (2012). Then a program with a graphical user interface (GUI) was set up and named "Op-Ug TD Optimizer". It is a beneficial tool for simple application of the model in all situations where open pit is considered together with the block caving method for mining an ore deposit. Finally, Op-Ug TD Optimizer is explained step by step through solving the open pit to block caving transition problem for a case ore deposit.
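The decision the optimizer makes can be illustrated with a toy model. The sketch below is not Bakhtavar's binary integer formulation; it only shows the underlying idea: each bench level carries a (hypothetical) profit under open pit versus block caving, and the transition depth is the switch point that maximizes total profit.

```python
# Toy illustration of transition-depth selection (not the published model):
# levels above the transition index are mined by open pit, levels below by caving.

def best_transition(open_pit_profit, caving_profit):
    """Profit lists are indexed by bench level, shallow to deep.
    Returns (transition_index, total_profit)."""
    n = len(open_pit_profit)
    best = max(
        range(n + 1),
        key=lambda k: sum(open_pit_profit[:k]) + sum(caving_profit[k:]),
    )
    total = sum(open_pit_profit[:best]) + sum(caving_profit[best:])
    return best, total

# Hypothetical per-bench profits: open pit degrades with depth, caving improves
op = [9, 7, 4, 1, -3]
bc = [2, 3, 5, 6, 8]
k, profit = best_transition(op, bc)
```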

  5. Integrating the visualization concept of the medical imaging interaction toolkit (MITK) into the XIP-Builder visual programming environment

    NASA Astrophysics Data System (ADS)

    Wolf, Ivo; Nolden, Marco; Schwarz, Tobias; Meinzer, Hans-Peter

    2010-02-01

    The Medical Imaging Interaction Toolkit (MITK) and the eXtensible Imaging Platform (XIP) both aim at facilitating the development of medical imaging applications, but provide support on different levels. MITK offers support from the toolkit level, whereas XIP comes with a visual programming environment. XIP is strongly based on Open Inventor. Open Inventor, with its scene graph-based rendering paradigm, was not specifically designed for medical imaging, but focuses on creating dedicated visualizations. MITK has a visualization concept with a model-view-controller-like design that assists in implementing multiple, consistent views on the same data, which is typically required in medical imaging. In addition, MITK defines a unified means of describing position, orientation, bounds, and (if required) local deformation of data and views, supporting e.g. images acquired with gantry tilt and curved reformations. The actual rendering is largely delegated to the Visualization Toolkit (VTK). This paper presents an approach for integrating the visualization concept of MITK with XIP, especially into the XIP-Builder. This is a first step toward combining the advantages of both platforms. It enables experimenting with algorithms in the XIP visual programming environment without requiring a detailed understanding of Open Inventor. Using MITK-based add-ons to XIP, any number of data objects (images, surfaces, etc.) produced by algorithms can simply be added to an MITK DataStorage object and rendered into any number of slice-based (2D) or 3D views. Both MITK and XIP are open-source C++ platforms. The extensions presented in this paper will be available from www.mitk.org.

  6. Evaluating theory-based evaluation: information, norms, and adherence.

    PubMed

    Jacobs, W Jake; Sisco, Melissa; Hill, Dawn; Malter, Frederic; Figueredo, Aurelio José

    2012-08-01

    Programmatic social interventions attempt to produce appropriate social-norm-guided behavior in an open environment. A marriage of applicable psychological theory, appropriate program evaluation theory, and outcome of evaluations of specific social interventions assures the acquisition of cumulative theory and the production of successful social interventions--the marriage permits us to advance knowledge by making use of both success and failures. We briefly review well-established principles within the field of program evaluation, well-established processes involved in changing social norms and social-norm adherence, the outcome of several program evaluations focusing on smoking prevention, pro-environmental behavior, and rape prevention and, using the principle of learning from our failures, examine why these programs often do not perform as expected. Finally, we discuss the promise of learning from our collective experiences to develop a cumulative science of program evaluation and to improve the performance of extant and future interventions. Copyright © 2012. Published by Elsevier Ltd.

  7. The Open AUC Project.

    PubMed

    Cölfen, Helmut; Laue, Thomas M; Wohlleben, Wendel; Schilling, Kristian; Karabudak, Engin; Langhorst, Bradley W; Brookes, Emre; Dubbs, Bruce; Zollars, Dan; Rocco, Mattia; Demeler, Borries

    2010-02-01

    Progress in analytical ultracentrifugation (AUC) has been hindered by obstructions to hardware innovation and by software incompatibility. In this paper, we announce and outline the Open AUC Project. The goals of the Open AUC Project are to stimulate AUC innovation by improving instrumentation, detectors, acquisition and analysis software, and collaborative tools. These improvements are needed for the next generation of AUC-based research. The Open AUC Project combines on-going work from several different groups. A new base instrument is described, one that is designed from the ground up to be an analytical ultracentrifuge. This machine offers an open architecture, hardware standards, and application programming interfaces for detector developers. All software will use the GNU Public License to assure that intellectual property is available in open source format. The Open AUC strategy facilitates collaborations, encourages sharing, and eliminates the chronic impediments that have plagued AUC innovation for the last 20 years. This ultracentrifuge will be equipped with multiple and interchangeable optical tracks so that state-of-the-art electronics and improved detectors will be available for a variety of optical systems. The instrument will be complemented by a new rotor, enhanced data acquisition and analysis software, as well as collaboration software. Described here are the instrument, the modular software components, and a standardized database that will encourage and ease integration of data analysis and interpretation software.

  8. OpenFlyData: an exemplar data web integrating gene expression data on the fruit fly Drosophila melanogaster.

    PubMed

    Miles, Alistair; Zhao, Jun; Klyne, Graham; White-Cooper, Helen; Shotton, David

    2010-10-01

    Integrating heterogeneous data across distributed sources is a major requirement for in silico bioinformatics supporting translational research. For example, genome-scale data on patterns of gene expression in the fruit fly Drosophila melanogaster are widely used in functional genomic studies in many organisms to inform candidate gene selection and validate experimental results. However, current data integration solutions tend to be heavyweight, and require significant initial and ongoing investment of effort. Development of a common Web-based data integration infrastructure (a.k.a. data web), using Semantic Web standards, promises to alleviate these difficulties, but little is known about the feasibility, costs, risks or practical means of migrating to such an infrastructure. We describe the development of OpenFlyData, a proof-of-concept system integrating gene expression data on D. melanogaster, combining Semantic Web standards with lightweight approaches to Web programming based on Web 2.0 design patterns. To support researchers designing and validating functional genomic studies, OpenFlyData includes user-facing search applications providing intuitive access to and comparison of gene expression data from FlyAtlas, the BDGP in situ database, and FlyTED, using data from FlyBase to expand and disambiguate gene names. OpenFlyData's services are also openly accessible, and are available for reuse by other bioinformaticians and application developers. Semi-automated methods and tools were developed to support labour- and knowledge-intensive tasks involved in deploying SPARQL services. These include methods for generating ontologies and relational-to-RDF mappings for relational databases, which we illustrate using the FlyBase Chado database schema; and methods for mapping gene identifiers between databases. The advantages of using Semantic Web standards for biomedical data integration are discussed, as are open issues.
In particular, although the performance of open source SPARQL implementations is sufficient to query gene expression data directly from user-facing applications such as Web-based data fusions (a.k.a. mashups), we found open SPARQL endpoints to be vulnerable to denial-of-service-type problems, which must be mitigated to ensure reliability of services based on this standard. These results are relevant to data integration activities in translational bioinformatics. The gene expression search applications and SPARQL endpoints developed for OpenFlyData are deployed at http://openflydata.org. FlyUI, a library of JavaScript widgets providing re-usable user-interface components for Drosophila gene expression data, is available at http://flyui.googlecode.com. Software and ontologies to support transformation of data from FlyBase, FlyAtlas, BDGP and FlyTED to RDF are available at http://openflydata.googlecode.com. SPARQLite, an implementation of the SPARQL protocol, is available at http://sparqlite.googlecode.com. All software is provided under the GPL version 3 open source license.
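The shape of query served by such SPARQL endpoints can be sketched as follows. The prefix and predicate names here are hypothetical placeholders, not the actual FlyBase/FlyAtlas vocabularies; only the SPARQL structure is the point.

```python
# Build a SPARQL query for gene-expression lookup by gene symbol.
# The ex: vocabulary is an illustrative placeholder, not a real schema.

def expression_query(gene_symbol, limit=10):
    return f"""
PREFIX ex: <http://example.org/flyexpr#>
SELECT ?tissue ?level WHERE {{
  ?gene ex:symbol "{gene_symbol}" .
  ?obs  ex:gene ?gene ;
        ex:tissue ?tissue ;
        ex:level ?level .
}} LIMIT {limit}
""".strip()

q = expression_query("Adh")
```

A query string like this would be sent to an endpoint via the SPARQL protocol (an HTTP GET/POST with a `query` parameter).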

  9. Gpufit: An open-source toolkit for GPU-accelerated curve fitting.

    PubMed

    Przybylski, Adrian; Thiel, Björn; Keller-Findeisen, Jan; Stock, Bernd; Bates, Mark

    2017-11-16

    We present a general purpose, open-source software library for estimation of non-linear parameters by the Levenberg-Marquardt algorithm. The software, Gpufit, runs on a Graphics Processing Unit (GPU) and executes computations in parallel, resulting in a significant gain in performance. We measured a speed increase of up to 42 times when comparing Gpufit with an identical CPU-based algorithm, with no loss of precision or accuracy. Gpufit is designed such that it is easily incorporated into existing applications or adapted for new ones. Multiple software interfaces, including C, Python, and Matlab bindings, ensure that Gpufit is accessible from most programming environments. The full source code is published as an open source software repository, making its function transparent to the user and facilitating future improvements and extensions. As a demonstration, we used Gpufit to accelerate an existing scientific image analysis package, yielding significantly improved processing times for super-resolution fluorescence microscopy datasets.
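The algorithm Gpufit parallelizes can be sketched on the CPU for a single fit. The following is an illustrative reference implementation of the Levenberg-Marquardt idea, not the Gpufit API: it fits y = a·exp(-b·x) using a numerically differentiated Jacobian and a damped normal-equation step.

```python
# Minimal Levenberg-Marquardt sketch for y = a * exp(-b * x) (not the Gpufit API).
import math

def fit_exp(xs, ys, a=1.0, b=0.1, lam=1e-3, iters=50):
    def residuals(a_, b_):
        return [y - a_ * math.exp(-b_ * x) for x, y in zip(xs, ys)]

    for _ in range(iters):
        r = residuals(a, b)
        eps = 1e-6
        # Forward-difference Jacobian of the residual vector w.r.t. (a, b)
        ja = [(r2 - r1) / eps for r1, r2 in zip(r, residuals(a + eps, b))]
        jb = [(r2 - r1) / eps for r1, r2 in zip(r, residuals(a, b + eps))]
        # Damped normal equations (J^T J + lam*I) d = -J^T r, solved as a 2x2 system
        aa = sum(v * v for v in ja) + lam
        bb = sum(v * v for v in jb) + lam
        ab = sum(u * v for u, v in zip(ja, jb))
        ga = -sum(u * v for u, v in zip(ja, r))
        gb = -sum(u * v for u, v in zip(jb, r))
        det = aa * bb - ab * ab
        a += (ga * bb - gb * ab) / det
        b += (aa * gb - ab * ga) / det
    return a, b

xs = list(range(10))
ys = [2.0 * math.exp(-0.3 * x) for x in xs]  # noiseless data, a=2, b=0.3
a_fit, b_fit = fit_exp(xs, ys)
```

Gpufit's contribution is running thousands of such independent fits concurrently on the GPU; each fit is the same small iteration shown here.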

  10. Hydropower application of confined space regulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Franseen, H.W.

    1995-12-31

    OSHA's "Permit Required Confined Space" rules, 1910.146, became effective April 15, 1993. The rules define a "confined space" and a "permit required confined space"; provide general requirements for those entering the confined space, for the attendant, and for the entry supervisor; define what a confined space program and permit system should be; and describe training requirements and rescue considerations. Tapoco Inc. began preparing confined space procedures in 1992 using Alcoa Engineering Standards and OSHA's proposed rules. A joint union-management team was formed, and this team began evaluating spaces that meet the confined space definition. In 1993, employees were trained, and all entries into spaces were done according to Alcoa's and OSHA's proposed rules. Rescue teams have been trained at each site. Some unique confined spaces and/or unique entry conditions have been encountered which have required extensive evaluation.

  11. HYDRA: Hyperspectral Data Research Application

    NASA Astrophysics Data System (ADS)

    Rink, T.; Whittaker, T.

    2005-12-01

    HYDRA is a freely available, easy-to-install tool for visualization and analysis of large local or remote hyper/multi-spectral datasets. HYDRA is implemented on top of the open source VisAD Java library via Jython - the Java implementation of the user-friendly Python programming language. VisAD provides data integration, through its generalized data model, user-display interaction, and display rendering. Jython has an easy-to-read, concise, scripting-like syntax which eases software development. HYDRA allows data sharing of large datasets through its support of the OpenDAP and OpenADDE server-client protocols. Users can explore and interrogate data, and subset in physical and/or spectral space to isolate key areas of interest for further analysis without having to download an entire dataset. It also has an extensible data input architecture to recognize new instruments and understand different local file formats; currently NetCDF and HDF4 are supported.

  12. Open source tools for the information theoretic analysis of neural data.

    PubMed

    Ince, Robin A A; Mazzoni, Alberto; Petersen, Rasmus S; Panzeri, Stefano

    2010-01-01

    The recent and rapid development of open source software tools for the analysis of neurophysiological datasets consisting of simultaneous multiple recordings of spikes, field potentials and other neural signals holds the promise for a significant advance in the standardization, transparency, quality, reproducibility and variety of techniques used to analyze neurophysiological data and for the integration of information obtained at different spatial and temporal scales. In this review we focus on recent advances in open source toolboxes for the information theoretic analysis of neural responses. We also present examples of their use to investigate the role of spike timing precision, correlations across neurons, and field potential fluctuations in the encoding of sensory information. These information toolboxes, available both in MATLAB and Python programming environments, hold the potential to enlarge the domain of application of information theory to neuroscience and to lead to new discoveries about how neurons encode and transmit information.
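The core quantity these toolboxes estimate can be shown in a few lines. The sketch below computes the Shannon mutual information I(S;R) between a stimulus and a neural response directly from a joint count table; it omits the sampling-bias corrections that the real toolboxes implement.

```python
# Plug-in estimate of mutual information from a joint count table.
import math

def mutual_information(joint_counts):
    """joint_counts[s][r] = number of trials with stimulus s and response r.
    Returns I(S;R) in bits, using the naive (uncorrected) plug-in estimator."""
    n = sum(sum(row) for row in joint_counts)
    ps = [sum(row) / n for row in joint_counts]          # marginal P(s)
    pr = [sum(col) / n for col in zip(*joint_counts)]    # marginal P(r)
    mi = 0.0
    for s, row in enumerate(joint_counts):
        for r, c in enumerate(row):
            if c:
                p = c / n
                mi += p * math.log2(p / (ps[s] * pr[r]))
    return mi
```

A perfectly informative response (each stimulus always evokes a distinct response) gives 1 bit for two equiprobable stimuli; a response independent of the stimulus gives 0 bits.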

  13. Using OpenMP vs. Threading Building Blocks for Medical Imaging on Multi-cores

    NASA Astrophysics Data System (ADS)

    Kegel, Philipp; Schellmann, Maraike; Gorlatch, Sergei

    We compare two parallel programming approaches for multi-core systems: the well-known OpenMP and the recently introduced Threading Building Blocks (TBB) library by Intel®. The comparison is made using the parallelization of a real-world numerical algorithm for medical imaging. We develop several parallel implementations, and compare them w.r.t. programming effort, programming style and abstraction, and runtime performance. We show that TBB requires a considerable program re-design, whereas with OpenMP simple compiler directives are sufficient. While TBB appears to be less appropriate for parallelizing existing implementations, it fosters a good programming style and higher abstraction level for newly developed parallel programs. Our experimental measurements on a dual quad-core system demonstrate that OpenMP slightly outperforms TBB in our implementation.

  14. AplusB: A Web Application for Investigating A + B Designs for Phase I Cancer Clinical Trials.

    PubMed

    Wheeler, Graham M; Sweeting, Michael J; Mander, Adrian P

    2016-01-01

    In phase I cancer clinical trials, the maximum tolerated dose of a new drug is often found by a dose-escalation method known as the A + B design. We have developed an interactive web application, AplusB, which computes and returns exact operating characteristics of A + B trial designs. The application has a graphical user interface (GUI), requires no programming knowledge and is free to access and use on any device that can open an internet browser. A customised report is available for download for each design that contains tabulated operating characteristics and informative plots, which can then be compared with other dose-escalation methods. We present a step-by-step guide on how to use this application and provide several illustrative examples of its capabilities.
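One exact operating characteristic of this kind can be computed analytically. The sketch below assumes the common 3+3 variant of the A + B design (A = B = 3: escalate on 0/3 DLTs, or on 1/3 followed by 0/3 in the expansion cohort); it is an illustration of the computation, not the AplusB application itself.

```python
# Exact escalation probability for the standard 3+3 design.
from math import comb

def p_escalate(p):
    """Probability of escalating past a dose whose true DLT probability is p,
    under the 3+3 rule: escalate on 0/3 DLTs, or on 1/3 then 0/3 more."""
    p0of3 = comb(3, 0) * (1 - p) ** 3          # 0 DLTs in first cohort
    p1of3 = comb(3, 1) * p * (1 - p) ** 2      # exactly 1 DLT in first cohort
    return p0of3 + p1of3 * (1 - p) ** 3        # second cohort must show 0 DLTs
```

Evaluating this over a grid of true DLT probabilities is exactly the kind of tabulated operating characteristic the AplusB report provides.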

  15. Remote sensing applications to resource problems in South Dakota

    NASA Technical Reports Server (NTRS)

    Myers, V. I. (Principal Investigator); Best, R. G.; Dalsted, K. J.; Devries, M. E.; Eidenshink, J. C.; Fowler, R.; Heilman, J.; Schmer, F. A.

    1980-01-01

    Cooperative projects between RSI and numerous South Dakota agencies have provided a means of incorporating remote sensing techniques into operational programs. Eight projects discussed in detail are: (1) detection of high moisture zones near Interstate 90; (2) thermal infrared census of Canada geese in South Dakota; (3) Dutch elm disease detection in an urban environment; (4) a feasibility study for monitoring effective precipitation in South Dakota using TIROS-N; (5) open and abandoned dump sites in Spink County; (6) the influence of soil reflectance on LANDSAT signatures of crops; (7) a model implementation program for the Lake Herman watershed; and (8) the Six-Mile Creek investigation follow-on.

  16. PLOCAN glider portal: a gateway for useful data management and visualization system

    NASA Astrophysics Data System (ADS)

    Morales, Tania; Lorenzo, Alvaro; Viera, Josue; Barrera, Carlos; José Rueda, María

    2014-05-01

    Nowadays, monitoring ocean behavior and its characteristics involves a wide range of sources able to gather and provide vast amounts of data at spatio-temporal scales. Multiplatform infrastructures like PLOCAN hold a variety of autonomous Lagrangian and Eulerian devices that collect information which is then transferred to land in near-real time. Managing all this data collection in an efficient way is a major issue. Advances in ocean observation technologies, in which underwater autonomous gliders play a key role, have improved spatio-temporal resolution, offering a deeper understanding of the ocean but requiring a bigger effort in the data management process. There are general requirements for data management in such environments: processing raw data at different levels to obtain valuable information, storing data coherently, and providing accurate products to final users according to their specific needs. Managing large amounts of data can be tedious and complex without the right tools and operational procedures; hence, automating these tasks through software applications saves time and reduces errors. Moreover, data distribution is highly relevant, since scientists tend to assimilate different sources for comparison and validation. The use of web applications has boosted the necessary scientific dissemination. In this context, PLOCAN has implemented a set of independent but compatible applications to process, store, and disseminate information gathered by different oceanographic platforms. These applications have been implemented using open standards, such as HTML and CSS, and open source software, with Python as the programming language and Django as the web framework. More specifically, a glider application has been developed within the framework of the FP7-GROOM project. Regarding data management, this project focuses on collecting and making available consistent, quality-controlled datasets, as well as fostering open access to glider data.

  17. The Scientific Filesystem.

    PubMed

    Sochat, Vanessa

    2018-05-01

    Here, we present the Scientific Filesystem (SCIF), an organizational format that supports exposure of executables and metadata for discoverability of scientific applications. The format includes a known filesystem structure, a definition for a set of environment variables describing it, and functions for generation of the variables and interaction with the libraries, metadata, and executables located within. SCIF makes it easy to expose metadata, multiple environments, installation steps, files, and entry points to render scientific applications consistent, modular, and discoverable. A SCIF can be installed on a traditional host or in a container technology such as Docker or Singularity. We start by reviewing the background and rationale for the SCIF, followed by an overview of the specification and the different levels of internal modules ("apps") that the organizational format affords. Finally, we demonstrate that SCIF is useful by implementing and discussing several use cases that improve user interaction and understanding of scientific applications. SCIF is released along with a client and integration in the Singularity 2.4 software to quickly install and interact with SCIF. When used inside of a reproducible container, a SCIF is a recipe for reproducibility and introspection of the functions and users that it serves. We use SCIF to evaluate container software, provide metrics, serve scientific workflows, and execute a primary function under different contexts. To encourage collaboration and sharing of applications, we developed tools along with an open source, version-controlled, tested, and programmatically accessible web infrastructure. SCIF and associated resources are available at https://sci-f.github.io. The ease of using SCIF, especially in the context of containers, offers promise for scientists' work to be self-documenting and programmatically parseable for maximum reproducibility.
SCIF provides an abstraction over the underlying programming languages and packaging logic used by scientific applications, opening new opportunities for scientific software development.
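The "known filesystem structure plus environment variables" idea can be sketched concretely. The layout and variable names below follow the general pattern the abstract describes but are simplified assumptions for illustration, not the full SCIF specification.

```python
# Sketch of deriving per-app environment variables from a SCIF-style layout.
# Directory layout and variable names are simplified assumptions.
import os

def scif_environment(base, app):
    """Given an install root (e.g. /scif) and an app name, derive the
    per-app variables a SCIF-style client would export before execution."""
    approot = os.path.join(base, "apps", app)
    return {
        "SCIF_APPNAME": app,
        "SCIF_APPROOT": approot,
        "SCIF_APPBIN": os.path.join(approot, "bin"),
        "SCIF_APPLIB": os.path.join(approot, "lib"),
        "SCIF_APPMETA": os.path.join(approot, "scif"),
    }

env = scif_environment("/scif", "align")
```

Because every app's executables and metadata live at predictable paths, tooling can discover and run apps without knowing anything about their internals.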

  18. CALCMIN - an EXCEL™ Visual Basic application for calculating mineral structural formulae from electron microprobe analyses

    NASA Astrophysics Data System (ADS)

    Brandelik, Andreas

    2009-07-01

    CALCMIN, an open source Visual Basic program, was implemented in EXCEL™. The program was primarily developed to support geoscientists in their routine task of calculating structural formulae of minerals on the basis of chemical analysis mainly obtained by electron microprobe (EMP) techniques. Calculation programs for various minerals are already included in the form of sub-routines. These routines are arranged in separate modules containing a minimum of code. The architecture of CALCMIN allows the user to easily develop new calculation routines or modify existing routines with little knowledge of programming techniques. By means of a simple mouse-click, the program automatically generates a rudimentary framework of code using the object model of the Visual Basic Editor (VBE). Within this framework simple commands and functions, which are provided by the program, can be used, for example, to perform various normalization procedures or to output the results of the computations. For the clarity of the code, element symbols are used as variables initialized by the program automatically. CALCMIN does not set any boundaries in complexity of the code used, resulting in a wide range of possible applications. Thus, matrix and optimization methods can be included, for instance, to determine end member contents for subsequent thermodynamic calculations. Diverse input procedures are provided, such as the automated read-in of output files created by the EMP. Furthermore, a subsequent filter routine enables the user to extract specific analyses in order to use them for a corresponding calculation routine. An event-driven, interactive operating mode was selected for easy application of the program. CALCMIN leads the user from the beginning to the end of the calculation process.
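The core normalization that CALCMIN automates can be illustrated outside Excel. The sketch below converts oxide weight percents from a microprobe analysis into cations per formula unit on a fixed number of oxygens (olivine on 4 oxygens); the molar masses are standard values, but the routine itself is a generic sketch, not CALCMIN's code.

```python
# Oxide wt% -> cations per formula unit, normalized to a fixed oxygen count.
# (molar mass g/mol, cations per oxide, oxygens per oxide)
OXIDES = {"SiO2": (60.08, 1, 2), "MgO": (40.30, 1, 1), "FeO": (71.84, 1, 1)}

def formula(wt_pct, n_oxygens=4):
    mol_cat, mol_oxy = {}, 0.0
    for oxide, wt in wt_pct.items():
        mw, ncat, noxy = OXIDES[oxide]
        mols = wt / mw
        mol_cat[oxide] = mols * ncat
        mol_oxy += mols * noxy
    scale = n_oxygens / mol_oxy          # renormalize to the ideal oxygen basis
    return {ox: c * scale for ox, c in mol_cat.items()}

# Pure forsterite (Mg2SiO4) should give ~1 Si and ~2 Mg per 4 oxygens
fo = formula({"SiO2": 42.7, "MgO": 57.3})
```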

  19. Scalable Unix commands for parallel processors : a high-performance implementation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ong, E.; Lusk, E.; Gropp, W.

    2001-06-22

    We describe a family of MPI applications we call the Parallel Unix Commands. These commands are natural parallel versions of common Unix user commands such as ls, ps, and find, together with a few similar commands particular to the parallel environment. We describe the design and implementation of these programs and present some performance results on a 256-node Linux cluster. The Parallel Unix Commands are open source and freely available.

  20. BossPro: a biometrics-based obfuscation scheme for software protection

    NASA Astrophysics Data System (ADS)

    Kuseler, Torben; Lami, Ihsan A.; Al-Assam, Hisham

    2013-05-01

    This paper proposes to integrate biometric-based key generation into an obfuscated interpretation algorithm to protect authentication application software from illegitimate use or reverse engineering. This is especially necessary for mCommerce, because application programs on mobile devices such as smartphones and tablet PCs are typically open to misuse by hackers. The scheme proposed in this paper therefore ensures that correct interpretation/execution of the obfuscated program code of the authentication application requires a valid biometrically generated key of the actual person to be authenticated, in real time. Without this key, the real semantics of the program cannot be understood by an attacker even if he/she gains access to the application code. Furthermore, the security provided by this scheme can be a vital aspect in protecting any application running on mobile devices, which are increasingly used to perform business, financial, or other security-related applications but are easily lost or stolen. The scheme starts by creating a personalised copy of any application based on the biometric key generated during an enrolment process with the authenticator, as well as a nonce created at the time of communication between the client and the authenticator. The obfuscated code is then shipped to the client's mobile device and integrated with biometric data extracted from the client in real time to form the unlocking key during execution. The novelty of this scheme lies in the close binding of the application program to the biometric key of the client, thus making the application unusable for others. Trials and experimental results on biometric key generation, based on clients' faces, and an implemented scheme prototype, based on the Android emulator, prove the concept and novelty of the proposed scheme.
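
    The key-binding idea can be sketched as follows; this is a conceptual illustration, not BossPro's actual obfuscation algorithm, and the key-derivation scheme and its inputs are assumptions made for the example.

```python
# Conceptual sketch of keyed interpretation: program bytes are XOR-masked
# with a keystream derived from a biometric-based secret plus a per-session
# nonce. With the wrong key, the interpreter recovers only garbage.
import hashlib

def derive_key(biometric_secret: bytes, nonce: bytes, length: int) -> bytes:
    """Expand secret+nonce into a keystream of the requested length."""
    stream, counter = b"", 0
    while len(stream) < length:
        stream += hashlib.sha256(
            biometric_secret + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return stream[:length]

def mask(code: bytes, key: bytes) -> bytes:
    """XOR masking; applying the same key twice restores the original."""
    return bytes(c ^ k for c, k in zip(code, key))

code = b"print('hello')"
key = derive_key(b"face-template-bits", b"session-nonce", len(code))
obfuscated = mask(code, key)
recovered = mask(obfuscated, key)   # correct key -> original code
wrong = mask(obfuscated, derive_key(b"other-user", b"session-nonce", len(code)))
```

    The point of the sketch is the binding: only the combination of the enrolled biometric secret and the session nonce reproduces the keystream that makes the code interpretable.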

  1. OpenFDA: an innovative platform providing access to a wealth of FDA's publicly available data.

    PubMed

    Kass-Hout, Taha A; Xu, Zhiheng; Mohebbi, Matthew; Nelsen, Hans; Baker, Adam; Levine, Jonathan; Johanson, Elaine; Bright, Roselie A

    2016-05-01

    The objective of openFDA is to facilitate access and use of big important Food and Drug Administration public datasets by developers, researchers, and the public through harmonization of data across disparate FDA datasets provided via application programming interfaces (APIs). Using cutting-edge technologies deployed on FDA's new public cloud computing infrastructure, openFDA provides open data for easier, faster (over 300 requests per second per process), and better access to FDA datasets; open source code and documentation shared on GitHub for open community contributions of examples, apps and ideas; and infrastructure that can be adopted for other public health big data challenges. Since its launch on June 2, 2014, openFDA has developed four APIs for drug and device adverse events, recall information for all FDA-regulated products, and drug labeling. There have been more than 20 million API calls (more than half from outside the United States), 6000 registered users, 20,000 connected Internet Protocol addresses, and dozens of new software (mobile or web) apps developed. A case study demonstrates a use of openFDA data to understand an apparent association of a drug with an adverse event. With easier and faster access to these datasets, consumers worldwide can learn more about FDA-regulated products. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved.
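
    A query against this API can be sketched as follows; the `api.fda.gov` host and the `search`/`limit` parameters follow openFDA's documented conventions, but the specific query is illustrative and is only constructed here, not sent.

```python
# Sketch of building an openFDA drug adverse-event query URL. Only URL
# construction is shown; no network request is made.
from urllib.parse import urlencode

def openfda_query(endpoint: str, search: str, limit: int = 10) -> str:
    """Build a query URL for an openFDA endpoint such as drug/event.json."""
    base = "https://api.fda.gov"
    return f"{base}/{endpoint}?" + urlencode({"search": search, "limit": limit})

url = openfda_query("drug/event.json",
                    'patient.drug.medicinalproduct:"aspirin"', limit=5)
```
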

  2. OpenFDA: an innovative platform providing access to a wealth of FDA’s publicly available data

    PubMed Central

    Kass-Hout, Taha A; Mohebbi, Matthew; Nelsen, Hans; Baker, Adam; Levine, Jonathan; Johanson, Elaine; Bright, Roselie A

    2016-01-01

    Objective: The objective of openFDA is to facilitate access and use of big important Food and Drug Administration public datasets by developers, researchers, and the public through harmonization of data across disparate FDA datasets provided via application programming interfaces (APIs). Materials and Methods: Using cutting-edge technologies deployed on FDA’s new public cloud computing infrastructure, openFDA provides open data for easier, faster (over 300 requests per second per process), and better access to FDA datasets; open source code and documentation shared on GitHub for open community contributions of examples, apps and ideas; and infrastructure that can be adopted for other public health big data challenges. Results: Since its launch on June 2, 2014, openFDA has developed four APIs for drug and device adverse events, recall information for all FDA-regulated products, and drug labeling. There have been more than 20 million API calls (more than half from outside the United States), 6000 registered users, 20,000 connected Internet Protocol addresses, and dozens of new software (mobile or web) apps developed. A case study demonstrates a use of openFDA data to understand an apparent association of a drug with an adverse event. Conclusion: With easier and faster access to these datasets, consumers worldwide can learn more about FDA-regulated products. PMID:26644398

  3. 46 CFR 308.506 - Application for an Open Cargo Policy.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 8 2014-10-01 2014-10-01 false Application for an Open Cargo Policy. 308.506 Section 308.506 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Cargo Insurance Open Policy War Risk Cargo Insurance § 308.506 Application for an Open...

  4. A Study of Learners Perception and Attitude towards BA/BSS Program of SSHL of Bangladesh Open University

    ERIC Educational Resources Information Center

    Sultana, Sabiha; Jahan, Tasrun; Numan, Sharker Md.

    2011-01-01

    In the present day, open and distance education has become a significant avenue for the development of higher education. Bangladesh Open University (BOU), the only public institution of its kind in Bangladesh, offers several formal and non-formal programs from secondary to post-graduate level through distance mode. The main objective of BOU's program is to provide…

  5. Massively parallel sparse matrix function calculations with NTPoly

    NASA Astrophysics Data System (ADS)

    Dawson, William; Nakajima, Takahito

    2018-04-01

    We present NTPoly, a massively parallel library for computing the functions of sparse, symmetric matrices. The theory of matrix functions is a well-developed framework with a wide range of applications including differential equations, graph theory, and electronic structure calculations. One particularly important application area is diagonalization-free methods in quantum chemistry. When the input and output of the matrix function are sparse, methods based on polynomial expansions can be used to compute matrix functions in linear time. We present a library based on these methods that can compute a variety of matrix functions. Distributed memory parallelization is based on a communication-avoiding sparse matrix multiplication algorithm. OpenMP task parallelization is utilized to implement hybrid parallelization. We describe NTPoly's interface and show how it can be integrated with programs written in many different programming languages. We demonstrate the merits of NTPoly by performing large scale calculations on the K computer.
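
    The polynomial-expansion idea can be illustrated with a dense Taylor approximation of the matrix exponential; NTPoly itself operates on distributed sparse matrices with more sophisticated expansions, so this NumPy sketch shows only the principle.

```python
# Truncated Taylor series for f(A) = exp(A) of a symmetric matrix. Each
# term reuses the previous one, so only matrix multiplications are needed;
# with sparse matrices (as in NTPoly) this is what yields linear scaling.
import numpy as np

def matrix_exp_taylor(A, terms=20):
    """Approximate exp(A) by sum_{k<terms} A^k / k!."""
    result = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for k in range(1, terms):
        term = term @ A / k        # A^k / k! built incrementally
        result = result + term
    return result

A = np.array([[0.0, 0.1],
              [0.1, 0.0]])        # small symmetric test matrix
E = matrix_exp_taylor(A)
```

    For this 2x2 case the exact answer is known in closed form (cosh/sinh of 0.1 on and off the diagonal), which makes the truncation easy to check.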

  6. A new era of space transportation. [Space Shuttle system utilization

    NASA Technical Reports Server (NTRS)

    Fletcher, J. C.

    1976-01-01

    It is pointed out that, building on the experiences of Apollo, Skylab, and the Apollo/Soyuz mission, an era is being entered that will be characterized by a displacement of the interface between the experimenter and his experiment from the control center on the ground to the laboratory in orbit. A new world has been opened by going into space. Economic applications are related to the achievement of an enormous efficiency in world communications at a much lower cost. However, programs of space exploration and usage are under severe economic constraints. A primary tool for lowering the cost of programs is to be the Space Transportation System using the Space Shuttle. It is emphasized that the Shuttle system is an international enterprise. Attention is also given to the results of the Viking missions, the Landsat satellites, and applications of space technology for science and commerce.

  7. [Application of a mathematical algorithm for the detection of electroneuromyographic results in the pathogenesis study of facial dyskinesia].

    PubMed

    Gribova, N P; Iudel'son, Ia B; Golubev, V L; Abramenkova, I V

    2003-01-01

    To carry out a differential diagnosis of two facial dyskinesia (FD) models, facial hemispasm (FH) and facial paraspasm (FP), a combined program of electroneuromyographic (ENMG) examination was created using statistical analyses, including object identification based on a hybrid neural network with an adaptive fuzzy-logic method, and standard statistical tests (Wilcoxon, Student). In FH, a lesion of the peripheral facial neuromotor apparatus with augmentation of interneuron function at segmental and supra-segmental stem levels predominated. In FP, primary afferent strengthening in the mimic muscles was accompanied by increased motor neuron activity and reciprocal augmentation of the interneurons inhibiting the motor portion of cranial nerve V. The mathematical algorithm for ENMG result recognition worked out in the study provides a precise differentiation of the two FD models and opens possibilities for differential diagnosis of other facial motor disorders.

  8. An open-source LabVIEW application toolkit for phasic heart rate analysis in psychophysiological research.

    PubMed

    Duley, Aaron R; Janelle, Christopher M; Coombes, Stephen A

    2004-11-01

    The cardiovascular system has been extensively measured in a variety of research and clinical domains. Despite technological and methodological advances in cardiovascular science, the analysis and evaluation of phasic changes in heart rate persists as a way to assess numerous psychological concomitants. Some researchers, however, have pointed to constraints on data analysis when evaluating cardiac activity indexed by heart rate or heart period. Thus, an off-line application toolkit for heart rate analysis is presented. The program, written with National Instruments' LabVIEW, incorporates a variety of tools for off-line extraction and analysis of heart rate data. Current methods and issues concerning heart rate analysis are highlighted, and how the toolkit provides a flexible environment to ameliorate common problems that typically lead to trial rejection is discussed. Source code for this program may be downloaded from the Psychonomic Society Web archive at www.psychonomic.org/archive/.
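
    The basic transformation underlying phasic heart-rate analysis, converting interbeat intervals to beat-to-beat heart rate, can be sketched in a few lines; the toolkit itself is a LabVIEW application, so this Python fragment is purely illustrative.

```python
# Convert R-R interbeat intervals (IBIs, in milliseconds) into beat-to-beat
# heart rate in beats per minute: 60000 ms per minute / interval length.
# The interval values are invented illustration data.

def ibi_to_hr(ibis_ms):
    """Return instantaneous heart rate (bpm) for each interbeat interval."""
    return [60000.0 / ibi for ibi in ibis_ms]

hr = ibi_to_hr([1000, 800, 750])
```

    Artifact handling (the trial-rejection problem the abstract mentions) amounts to filtering implausible intervals before this conversion, e.g. dropping IBIs outside a physiological range.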

  9. Bio-inspired self-shaping ceramics

    PubMed Central

    Bargardi, Fabio L.; Le Ferrand, Hortense; Libanori, Rafael; Studart, André R.

    2016-01-01

    Shaping ceramics into complex and intricate geometries using cost-effective processes is desirable in many applications but still remains an open challenge. Inspired by plant seed dispersal units that self-fold on differential swelling, we demonstrate that self-shaping can be implemented in ceramics by programming the material's microstructure to undergo local anisotropic shrinkage during heat treatment. Such microstructural design is achieved by magnetically aligning functionalized ceramic platelets in a liquid ceramic suspension, subsequently consolidated through an established enzyme-catalysed reaction. By fabricating alumina compacts exhibiting bio-inspired bilayer architectures, we achieve deliberate control over shape change during the sintering step. Bending, twisting or combinations of these two basic movements can be successfully programmed to obtain a myriad of complex shapes. The simplicity and the universality of such a bottom-up shaping method makes it attractive for applications that would benefit from low-waste ceramic fabrication, temperature-resistant interlocking structures or unusual geometries not accessible using conventional top–down manufacturing. PMID:28008930

  10. Bio-inspired self-shaping ceramics

    NASA Astrophysics Data System (ADS)

    Bargardi, Fabio L.; Le Ferrand, Hortense; Libanori, Rafael; Studart, André R.

    2016-12-01

    Shaping ceramics into complex and intricate geometries using cost-effective processes is desirable in many applications but still remains an open challenge. Inspired by plant seed dispersal units that self-fold on differential swelling, we demonstrate that self-shaping can be implemented in ceramics by programming the material's microstructure to undergo local anisotropic shrinkage during heat treatment. Such microstructural design is achieved by magnetically aligning functionalized ceramic platelets in a liquid ceramic suspension, subsequently consolidated through an established enzyme-catalysed reaction. By fabricating alumina compacts exhibiting bio-inspired bilayer architectures, we achieve deliberate control over shape change during the sintering step. Bending, twisting or combinations of these two basic movements can be successfully programmed to obtain a myriad of complex shapes. The simplicity and the universality of such a bottom-up shaping method makes it attractive for applications that would benefit from low-waste ceramic fabrication, temperature-resistant interlocking structures or unusual geometries not accessible using conventional top-down manufacturing.
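
    As a hedged back-of-envelope model for the bilayer bending described above, Timoshenko's bimetallic-strip formula relates curvature to the strain mismatch between two bonded layers; the parameter values below are illustrative, not measurements from the paper.

```python
# Timoshenko bimetallic-strip curvature for a bonded bilayer of total
# thickness h, thickness ratio m = t1/t2, stiffness ratio n = E1/E2, and
# differential strain d_eps (here, the anisotropic shrinkage mismatch).
# All numeric inputs below are invented for illustration.

def bilayer_curvature(d_eps, h, m=1.0, n=1.0):
    """Curvature (1/length) of a two-layer strip with strain mismatch d_eps."""
    num = 6.0 * d_eps * (1.0 + m) ** 2
    den = h * (3.0 * (1.0 + m) ** 2
               + (1.0 + m * n) * (m ** 2 + 1.0 / (m * n)))
    return num / den

# Equal layer thicknesses and moduli reduce to kappa = 3 * d_eps / (2 * h):
kappa = bilayer_curvature(d_eps=0.01, h=1e-3)
```

    With a 1% shrinkage mismatch across a 1 mm bilayer this gives a curvature of 15 per metre, i.e. a bending radius of roughly 7 cm, which shows why modest sintering anisotropy suffices for pronounced shape change.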

  11. Proceedings of the Advanced Seminar on one-dimensional, open-channel Flow and transport modeling

    USGS Publications Warehouse

    Schaffranek, Raymond W.

    1989-01-01

    In view of the increased use of mathematical/numerical simulation models, of the diversity of both model investigations and informational project objectives, and of the technical demands of complex model applications by U.S. Geological Survey personnel, an advanced seminar on one-dimensional open-channel flow and transport modeling was organized and held on June 15-18, 1987, at the National Space Technology Laboratory, Bay St. Louis, Mississippi. Principal emphasis in the Seminar was on one-dimensional flow and transport model-implementation techniques, operational practices, and application considerations. The purposes of the Seminar were to provide a forum for the exchange of information, knowledge, and experience among model users, as well as to identify immediate and future needs with respect to model development and enhancement, user support, training requirements, and technology transfer. The Seminar program consisted of a mix of topical and project presentations by Geological Survey personnel. This report is a compilation of short papers that summarize the presentations made at the Seminar.

  12. A portable platform for accelerated PIC codes and its application to GPUs using OpenACC

    NASA Astrophysics Data System (ADS)

    Hariri, F.; Tran, T. M.; Jocksch, A.; Lanti, E.; Progsch, J.; Messmer, P.; Brunner, S.; Gheller, C.; Villard, L.

    2016-10-01

    We present a portable platform, called PIC_ENGINE, for accelerating Particle-In-Cell (PIC) codes on heterogeneous many-core architectures such as Graphics Processing Units (GPUs). The aim of this development is efficient simulations on future exascale systems by allowing different parallelization strategies depending on the application problem and the specific architecture. To this end, this platform contains the basic steps of the PIC algorithm and has been designed as a test bed for different algorithmic options and data structures. Among the architectures that this engine can explore, particular attention is given here to systems equipped with GPUs. The study demonstrates that our portable PIC implementation based on the OpenACC programming model can achieve performance closely matching theoretical predictions. Using the Cray XC30 system, Piz Daint, at the Swiss National Supercomputing Centre (CSCS), we show that PIC_ENGINE running on an NVIDIA Kepler K20X GPU can outperform the one on an Intel Sandy Bridge 8-core CPU by a factor of 3.4.
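
    The core particle-push step that any PIC engine must implement can be sketched as a leapfrog-style update; this fragment is a generic illustration in Python, not PIC_ENGINE's OpenACC code, and the field and parameter values are arbitrary.

```python
# Generic leapfrog particle push: accelerate the velocity by the local
# field, then advance the position with the updated velocity. In a real
# PIC code this loop runs over millions of particles per step, which is
# what makes it a prime target for GPU offloading.

def push(x, v, e_field, q_over_m, dt):
    """One velocity-then-position update for a single particle."""
    v_new = v + q_over_m * e_field * dt
    x_new = x + v_new * dt
    return x_new, v_new

x, v = 0.0, 1.0
x, v = push(x, v, e_field=2.0, q_over_m=1.0, dt=0.5)
```
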

  13. Performance Characteristics of the Multi-Zone NAS Parallel Benchmarks

    NASA Technical Reports Server (NTRS)

    Jin, Haoqiang; VanderWijngaart, Rob F.

    2003-01-01

    We describe a new suite of computational benchmarks that models applications featuring multiple levels of parallelism. Such parallelism is often available in realistic flow computations on systems of grids, but had not previously been captured in benchmarks. The new suite, named NPB Multi-Zone, is extended from the NAS Parallel Benchmarks suite, and involves solving the application benchmarks LU, BT and SP on collections of loosely coupled discretization meshes. The solutions on the meshes are updated independently, but after each time step they exchange boundary value information. This strategy provides relatively easily exploitable coarse-grain parallelism between meshes. Three reference implementations are available: one serial, one hybrid using the Message Passing Interface (MPI) and OpenMP, and another hybrid using a shared memory multi-level programming model (SMP+OpenMP). We examine the effectiveness of hybrid parallelization paradigms in these implementations on three different parallel computers. We also use an empirical formula to investigate the performance characteristics of the multi-zone benchmarks.
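
    The multi-zone strategy, independent interior updates followed by a boundary-value exchange, can be shown with a toy 1-D example; the averaging "solver" below is invented purely to exhibit the pattern, not the LU/BT/SP kernels themselves.

```python
# Toy multi-zone step: each zone updates its interior points independently
# (coarse-grain parallelism between zones), then neighbouring zones
# reconcile their shared boundary value.

def update_interior(zone):
    """Relax interior points toward the average of their neighbours."""
    return ([zone[0]]
            + [(zone[i - 1] + zone[i + 1]) / 2.0
               for i in range(1, len(zone) - 1)]
            + [zone[-1]])

def exchange_boundaries(left, right):
    """Average the touching boundary values of two adjacent zones."""
    shared = (left[-1] + right[0]) / 2.0
    left[-1] = right[0] = shared
    return left, right

z1, z2 = [0.0, 1.0, 2.0], [4.0, 5.0, 6.0]
z1, z2 = update_interior(z1), update_interior(z2)   # independent updates
z1, z2 = exchange_boundaries(z1, z2)                # then boundary exchange
```

    In the benchmarks the interior updates map naturally onto per-zone processes or threads, and only the exchange step requires communication.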

  14. PD5: a general purpose library for primer design software.

    PubMed

    Riley, Michael C; Aubrey, Wayne; Young, Michael; Clare, Amanda

    2013-01-01

    Complex PCR applications for large genome-scale projects require fast, reliable and often highly sophisticated primer design software applications. Presently, such applications use pipelining methods to utilise many third-party applications, and this involves file parsing, interfacing and data conversion, which is slow and prone to error. A fully integrated suite of software tools for primer design would considerably improve the development time, the processing speed, and the reliability of bespoke primer design software applications. The PD5 software library is an open-source collection of classes and utilities, providing a complete collection of software building blocks for primer design and analysis. It is written in object-oriented C++ with an emphasis on classes suitable for efficient and rapid development of bespoke primer design programs. The modular design of the software library simplifies the development of specific applications and also integration with existing third-party software where necessary. We demonstrate several applications created using this software library that have already proved to be effective, but we view the project as a dynamic environment for building primer design software and it is open for future development by the bioinformatics community. Therefore, the PD5 software library is published under the terms of the GNU General Public License, which guarantee access to source-code and allow redistribution and modification. The PD5 software library is downloadable from Google Code and the accompanying Wiki includes instructions and examples: http://code.google.com/p/primer-design.
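
    PD5 is a C++ library, but one of the primer-analysis building blocks it covers can be illustrated language-neutrally: the Wallace rule estimates a short primer's melting temperature as Tm = 2(A+T) + 4(G+C). This Python sketch is an independent illustration of that standard rule, not PD5's API.

```python
# Wallace-rule melting temperature for short primers (roughly < 14 bases):
# each A/T pair contributes ~2 degrees C, each G/C pair ~4 degrees C.

def wallace_tm(primer: str) -> int:
    """Estimate Tm (degrees C) of a short primer via the Wallace rule."""
    p = primer.upper()
    at = p.count("A") + p.count("T")
    gc = p.count("G") + p.count("C")
    return 2 * at + 4 * gc

tm = wallace_tm("ATGCATGCATGC")   # 6 A/T and 6 G/C bases
```

    Real primer-design libraries combine such thermodynamic estimates with checks for hairpins, self-dimers, and target specificity, which is where an integrated toolkit pays off over a pipeline of separate tools.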

  15. JACOB: an enterprise framework for computational chemistry.

    PubMed

    Waller, Mark P; Dresselhaus, Thomas; Yang, Jack

    2013-06-15

    Here, we present just a collection of beans (JACOB): an integrated batch-based framework designed for the rapid development of computational chemistry applications. The framework expedites developer productivity by handling the generic infrastructure tier, and can be easily extended by user-specific scientific code. Paradigms from enterprise software engineering were rigorously applied to create a scalable, testable, secure, and robust framework. A centralized web application is used to configure and control the operation of the framework. The application-programming interface provides a set of generic tools for processing large-scale noninteractive jobs (e.g., systematic studies), or for coordinating systems integration (e.g., complex workflows). The code for the JACOB framework is open sourced and is available at: www.wallerlab.org/jacob. Copyright © 2013 Wiley Periodicals, Inc.

  16. Finding the Perfect Match: Factors That Influence Family Medicine Residency Selection.

    PubMed

    Wright, Katherine M; Ryan, Elizabeth R; Gatta, John L; Anderson, Lauren; Clements, Deborah S

    2016-04-01

    Residency program selection is a significant experience for emerging physicians, yet there is limited information about how applicants narrow their list of potential programs. This study examines factors that influence residency program selection among medical students interested in family medicine at the time of application. Medical students with an expressed interest in family medicine were invited to participate in a 37-item, online survey. Students were asked to rate factors that may impact residency selection on a 6-point Likert scale in addition to three open-ended qualitative questions. Mean values were calculated for each survey item and were used to determine a rank order for selection criteria. Logistic regression analysis was performed to identify factors that predict a strong interest in urban, suburban, and rural residency programs. Logistic regression was also used to identify factors that predict a strong interest in academic health center-based residencies, community-based residencies, and community-based residencies with an academic affiliation. A total of 705 medical students from 32 states across the country completed the survey. Location, work/life balance, and program structure (curriculum, schedule) were rated the most important factors for residency selection. Logistic regression analysis was used to refine our understanding of how each factor relates to specific types of residencies. These findings have implications for how to best advise students in selecting a residency, as well as marketing residencies to the right candidates. Refining the recruitment process will ensure a better fit between applicants and potential programs. Limited recruitment resources may be better utilized by focusing on targeted dissemination strategies.

  17. OpenCL: A Parallel Programming Standard for Heterogeneous Computing Systems.

    PubMed

    Stone, John E; Gohara, David; Shi, Guochun

    2010-05-01

    We provide an overview of the key architectural features of recent microprocessor designs and describe the programming model and abstractions provided by OpenCL, a new parallel programming standard targeting these architectures.

  18. 46 CFR 308.521 - Application for Open Cargo Policy, Form MA-301.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Section 308.521 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Cargo Insurance Ii-Open Policy War Risk Cargo Insurance § 308.521 Application for Open Cargo Policy, Form MA-301. The standard form of application for a War Risk Open Cargo Policy...

  19. 46 CFR 308.506 - Application for an Open Cargo Policy.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 8 2010-10-01 2010-10-01 false Application for an Open Cargo Policy. 308.506 Section 308.506 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Cargo Insurance Ii-Open Policy War Risk Cargo Insurance § 308.506 Application for an Open...

  20. 46 CFR 308.521 - Application for Open Cargo Policy, Form MA-301.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Section 308.521 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Cargo Insurance Ii-Open Policy War Risk Cargo Insurance § 308.521 Application for Open Cargo Policy, Form MA-301. The standard form of application for a War Risk Open Cargo Policy...

  1. 46 CFR 308.506 - Application for an Open Cargo Policy.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 8 2013-10-01 2013-10-01 false Application for an Open Cargo Policy. 308.506 Section 308.506 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Cargo Insurance Ii-Open Policy War Risk Cargo Insurance § 308.506 Application for an Open...

  2. 46 CFR 308.521 - Application for Open Cargo Policy, Form MA-301.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Section 308.521 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Cargo Insurance Ii-Open Policy War Risk Cargo Insurance § 308.521 Application for Open Cargo Policy, Form MA-301. The standard form of application for a War Risk Open Cargo Policy...

  3. 46 CFR 308.506 - Application for an Open Cargo Policy.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 8 2011-10-01 2011-10-01 false Application for an Open Cargo Policy. 308.506 Section 308.506 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Cargo Insurance Ii-Open Policy War Risk Cargo Insurance § 308.506 Application for an Open...

  4. 46 CFR 308.521 - Application for Open Cargo Policy, Form MA-301.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Section 308.521 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Cargo Insurance Open Policy War Risk Cargo Insurance § 308.521 Application for Open Cargo Policy, Form MA-301. The standard form of application for a War Risk Open Cargo Policy may...

  5. 46 CFR 308.506 - Application for an Open Cargo Policy.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 8 2012-10-01 2012-10-01 false Application for an Open Cargo Policy. 308.506 Section 308.506 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Cargo Insurance Ii-Open Policy War Risk Cargo Insurance § 308.506 Application for an Open...

  6. 46 CFR 308.521 - Application for Open Cargo Policy, Form MA-301.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Section 308.521 Shipping MARITIME ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EMERGENCY OPERATIONS WAR RISK INSURANCE War Risk Cargo Insurance Ii-Open Policy War Risk Cargo Insurance § 308.521 Application for Open Cargo Policy, Form MA-301. The standard form of application for a War Risk Open Cargo Policy...

  7. DICOM router: an open source toolbox for communication and correction of DICOM objects.

    PubMed

    Hackländer, Thomas; Kleber, Klaus; Martin, Jens; Mertens, Heinrich

    2005-03-01

    Today, the exchange of medical images and clinical information is well defined by the digital imaging and communications in medicine (DICOM) and Health Level Seven (HL7) standards. The interoperability among information systems is specified by the integration profiles of IHE (Integrating the Healthcare Enterprise). However, older imaging modalities frequently do not correctly support these interfaces and integration profiles, and some use cases are not yet specified by IHE. Therefore, corrections of DICOM objects are necessary to establish conformity. The aim of this project was to develop a toolbox that can automatically perform these recurrent corrections of DICOM objects. The toolbox is composed of three main components: 1) a receiver to receive DICOM objects, 2) a processing pipeline to correct each object, and 3) one or more senders to forward each corrected object to predefined addressees. The toolbox is implemented in Java as an open source project. The processing pipeline is realized by means of plug-ins. One of the plug-ins can be programmed by the user via an external eXtensible Stylesheet Language (XSL) file. Using this plug-in, DICOM objects can also be converted into eXtensible Markup Language (XML) documents or other data formats. DICOM storage services, DICOM CD-ROMs, and the local file system are defined as input and output channels. The toolbox is used clinically in different application areas: the automatic correction of DICOM objects from non-IHE-conforming modalities, the import of DICOM CD-ROMs into the picture archiving and communication system, and the pseudonymization of DICOM images. The toolbox has been accepted by users in a clinical setting. Because of the open programming interfaces, the functionality can easily be adapted to future applications.
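
    The receiver, correction pipeline, and sender structure can be sketched as follows; the actual toolbox is written in Java, and the plug-in functions and attribute names here are invented for illustration.

```python
# Sketch of a receiver -> plug-in pipeline -> senders router. Each plug-in
# takes a DICOM object (modelled here as a plain dict of attributes) and
# returns a corrected copy; senders then forward the result.

def fix_missing_institution(attrs):
    """Example correction plug-in: supply a default for a missing tag."""
    out = dict(attrs)
    out.setdefault("InstitutionName", "UNKNOWN")
    return out

def normalize_patient_name(attrs):
    """Example correction plug-in: enforce upper-case patient names."""
    out = dict(attrs)
    if "PatientName" in out:
        out["PatientName"] = out["PatientName"].upper()
    return out

def route(dicom_object, pipeline, senders):
    """Run the object through every plug-in, then hand it to every sender."""
    for plugin in pipeline:
        dicom_object = plugin(dicom_object)
    for send in senders:
        send(dicom_object)
    return dicom_object

delivered = []
route({"PatientName": "doe^john"},
      pipeline=[fix_missing_institution, normalize_patient_name],
      senders=[delivered.append])
```

    Keeping each correction as an isolated plug-in is what lets one toolbox serve very different use cases (CD-ROM import, modality fix-ups, pseudonymization) by recombining the same pipeline machinery.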

  8. Hot/Wet Open Hole Compression Strength of Carbon/Epoxy Laminates for Launch Vehicle Applications

    NASA Technical Reports Server (NTRS)

    Nettles, Alan T.

    2009-01-01

    This Technical Memorandum examines the effects of heat and absorbed moisture on the open hole compression strength of carbon/epoxy laminates with the material and layup intended for the Ares I composite interstage. The knockdown due to temperature, the amount of moisture absorbed, and the interaction between the two are examined. Results show that temperature is much more critical than the amount of moisture absorbed. The environmental knockdown factor was found to be low for this material and layup, and thus the value of obtaining a statistically significant number for it needs to be weighed against a program's cost and schedule, since basis values, damage tolerance, and safety factors all contribute much more to the overall knockdown factor.

  9. Application of an Aligned and Unaligned Signal Processing Technique to Investigate Tones and Broadband Noise in Fan and Contra-Rotating Open Rotor Acoustic Spectra

    NASA Technical Reports Server (NTRS)

    Miles, Jeffrey Hilton; Hultgren, Lennart S.

    2015-01-01

    The study of noise from a two-shaft contra-rotating open rotor (CROR) is challenging since the shafts are not phase-locked in most cases. Consequently, phase averaging of the acoustic data keyed to a single shaft rotation speed is not meaningful. An unaligned spectrum procedure that was developed to estimate a signal coherence threshold and reveal concealed spectral lines in turbofan engine combustion noise is applied to fan and CROR acoustic data in this paper (also available as NASA/TM-2015-218865). The NASA Advanced Air Vehicles Program, Advanced Air Transport Technology Project, Aircraft Noise Reduction Subproject supported the current work. The fan and open rotor data were obtained under previous efforts supported by the NASA Quiet Aircraft Technology (QAT) Project and the NASA Environmentally Responsible Aviation (ERA) Project of the Integrated Systems Research Program in collaboration with GE Aviation, respectively. The overarching goal of the Advanced Air Transport Technology (AATT) Project is to explore and develop technologies and concepts to revolutionize the energy efficiency and environmental compatibility of fixed-wing transport aircraft. These technological solutions are critical in reducing the impact of aviation on the environment even as this industry and the corresponding global transportation system continue to grow.

  10. Elevated temperature crack growth

    NASA Technical Reports Server (NTRS)

    Yau, J. F.; Malik, S. N.; Kim, K. S.; Vanstone, R. H.; Laflen, J. H.

    1985-01-01

    The objective of the Elevated Temperature Crack Growth Project is to evaluate proposed nonlinear fracture mechanics methods for application to combustor liners of aircraft gas turbine engines. During the first year of this program, proposed path-independent (P-I) integrals were reviewed for such applications. Several P-I integrals were implemented into a finite-element postprocessor which was developed and verified as part of the work. Alloy 718 was selected as the analog material for use in the forthcoming experimental work. A buttonhead, single-edge notch specimen was designed and verified for use in elevated-temperature strain control testing with significant inelastic strains. A crack mouth opening displacement measurement device was developed for further use.

  11. Deployment of Directory Service for IEEE N Bus Test System Information

    NASA Astrophysics Data System (ADS)

    Barman, Amal; Sil, Jaya

    2008-10-01

    Exchanging information over the Internet and intranets has become a de facto standard in computer applications, among various users and organizations. Distributed system studies, e-governance, etc. require transparent information exchange between applications, constituencies, manufacturers, and vendors. To serve these purposes a database system is needed for storing system data and other relevant information. A directory service, which is a specialized database along with an access protocol, could be a single solution since it runs over TCP/IP, is supported by all POSIX-compliant platforms, and is based on open standards. This paper describes a way to deploy a directory service to store IEEE n bus test system data and to integrate a load flow program with it.
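    As a rough illustration of the idea in this record, the sketch below renders one bus of a test system as an LDIF entry, the text format directory servers load with tools such as ldapadd. The attribute names and the busRecord object class are assumptions for illustration; the paper does not publish its actual schema.

    ```python
    def bus_to_ldif(bus_id, base_dn, attrs):
        """Render one bus record as an LDIF entry string (RFC 2849 style)."""
        lines = [
            f"dn: busId={bus_id},{base_dn}",
            "objectClass: top",
            "objectClass: busRecord",  # hypothetical object class
            f"busId: {bus_id}",
        ]
        # One "attribute: value" line per recorded quantity.
        lines += [f"{k}: {v}" for k, v in attrs.items()]
        return "\n".join(lines) + "\n"

    entry = bus_to_ldif(
        14, "ou=ieee14,dc=example,dc=org",
        {"voltageMag": "1.06", "voltageAngle": "0.0", "busType": "slack"},
    )
    print(entry)
    ```

    A load flow program would then read these entries back over LDAP rather than from a flat data file.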

  12. The NASA LeRC regenerative fuel cell system testbed program for government and commercial applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maloney, T.M.; Prokopius, P.R.; Voecks, G.E.

    1995-01-25

    The Electrochemical Technology Branch of the NASA Lewis Research Center (LeRC) has initiated a program to develop a renewable energy system testbed to evaluate, characterize, and demonstrate a fully integrated regenerative fuel cell (RFC) system for space, military, and commercial applications. A multi-agency management team, led by NASA LeRC, is implementing the program through a unique international coalition which encompasses both government and industry participants. This open-ended teaming strategy optimizes the development of space, military, and commercial RFC system technologies. Program activities to date include system design and analysis, and reactant storage sub-system design, with a major emphasis centered upon testbed fabrication and installation and testing of two key RFC system components, namely, the fuel cells and electrolyzers. Construction of the LeRC 25 kW RFC system testbed at the NASA Jet Propulsion Laboratory (JPL) facility at Edwards Air Force Base (EAFB) is nearly complete and some sub-system components have already been installed. Furthermore, planning for the first commercial RFC system demonstration is underway. © 1995 American Institute of Physics.

  13. Cargo Movement Operations System (CMOS). Software Requirements Specification, Increment 1, Change 02

    DTIC Science & Technology

    1990-05-24

    [Extraction residue of comment disposition forms: COMMENT STATUS (OPEN/CLOSED) entries for originator control numbers SRS2-0002, SRS2-0003, and SRS2-0004.]

  14. A markup language for electrocardiogram data acquisition and analysis (ecgML)

    PubMed Central

    Wang, Haiying; Azuaje, Francisco; Jung, Benjamin; Black, Norman

    2003-01-01

    Background The storage and distribution of electrocardiogram data is based on different formats. There is a need to promote the development of standards for their exchange and analysis. Such models should be platform-, system- and application-independent, flexible and open to every member of the scientific community. Methods A minimum set of information for the representation and storage of electrocardiogram signals has been synthesised from existing recommendations. This specification is encoded into an XML vocabulary. The model may aid in a flexible exchange and analysis of electrocardiogram information. Results Based on advantages of XML technologies, ecgML has the ability to present a system-, application- and format-independent solution for representation and exchange of electrocardiogram data. The distinction between the proposal developed by the U.S. Food and Drug Administration and the ecgML model is given. A series of tools, which aim to facilitate ecgML-based applications, are presented. Conclusions The models proposed here can facilitate the generation of a data format, which opens ways for better and clearer interpretation by both humans and machines. Its structured and transparent organisation will allow researchers to expand and test its capabilities in different application domains. The specification and programs for this protocol are publicly available. PMID:12735790
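    As a loose illustration of an XML vocabulary for ECG data, the standard library suffices to build and round-trip such a record. The element and attribute names below are invented for illustration; they are not the published ecgML schema.

    ```python
    import xml.etree.ElementTree as ET

    # Build a small ecgML-style document (hypothetical element names).
    root = ET.Element("ecgML")
    patient = ET.SubElement(root, "patientInfo")
    ET.SubElement(patient, "id").text = "P-001"
    rec = ET.SubElement(root, "recording", {"leads": "12", "samplingRate": "500"})
    ET.SubElement(rec, "waveform", {"lead": "II"}).text = "0.01 0.03 0.12 0.40"

    doc = ET.tostring(root, encoding="unicode")

    # Round-trip: any XML-aware tool can recover the metadata.
    parsed = ET.fromstring(doc)
    print(parsed.find("recording").get("samplingRate"))
    ```

    The point of such a format is exactly this round-trip: the same document is readable by humans and parseable by any XML library, independent of platform or application.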

  15. caGrid 1.0: An Enterprise Grid Infrastructure for Biomedical Research

    PubMed Central

    Oster, Scott; Langella, Stephen; Hastings, Shannon; Ervin, David; Madduri, Ravi; Phillips, Joshua; Kurc, Tahsin; Siebenlist, Frank; Covitz, Peter; Shanbhag, Krishnakant; Foster, Ian; Saltz, Joel

    2008-01-01

    Objective To develop software infrastructure that will provide support for discovery, characterization, integrated access, and management of diverse and disparate collections of information sources, analysis methods, and applications in biomedical research. Design An enterprise Grid software infrastructure, called caGrid version 1.0 (caGrid 1.0), has been developed as the core Grid architecture of the NCI-sponsored cancer Biomedical Informatics Grid (caBIG™) program. It is designed to support a wide range of use cases in basic, translational, and clinical research, including 1) discovery, 2) integrated and large-scale data analysis, and 3) coordinated study. Measurements The caGrid is built as a Grid software infrastructure and leverages Grid computing technologies and the Web Services Resource Framework standards. It provides a set of core services, toolkits for the development and deployment of new community provided services, and application programming interfaces for building client applications. Results The caGrid 1.0 was released to the caBIG community in December 2006. It is built on open source components and caGrid source code is publicly and freely available under a liberal open source license. The core software, associated tools, and documentation can be downloaded from the following URL: https://cabig.nci.nih.gov/workspaces/Architecture/caGrid. Conclusions While caGrid 1.0 is designed to address use cases in cancer research, the requirements associated with discovery, analysis and integration of large scale data, and coordinated studies are common in other biomedical fields. In this respect, caGrid 1.0 is the realization of a framework that can benefit the entire biomedical community. PMID:18096909

  16. caGrid 1.0: an enterprise Grid infrastructure for biomedical research.

    PubMed

    Oster, Scott; Langella, Stephen; Hastings, Shannon; Ervin, David; Madduri, Ravi; Phillips, Joshua; Kurc, Tahsin; Siebenlist, Frank; Covitz, Peter; Shanbhag, Krishnakant; Foster, Ian; Saltz, Joel

    2008-01-01

    To develop software infrastructure that will provide support for discovery, characterization, integrated access, and management of diverse and disparate collections of information sources, analysis methods, and applications in biomedical research. An enterprise Grid software infrastructure, called caGrid version 1.0 (caGrid 1.0), has been developed as the core Grid architecture of the NCI-sponsored cancer Biomedical Informatics Grid (caBIG) program. It is designed to support a wide range of use cases in basic, translational, and clinical research, including 1) discovery, 2) integrated and large-scale data analysis, and 3) coordinated study. The caGrid is built as a Grid software infrastructure and leverages Grid computing technologies and the Web Services Resource Framework standards. It provides a set of core services, toolkits for the development and deployment of new community provided services, and application programming interfaces for building client applications. The caGrid 1.0 was released to the caBIG community in December 2006. It is built on open source components and caGrid source code is publicly and freely available under a liberal open source license. The core software, associated tools, and documentation can be downloaded from the following URL: https://cabig.nci.nih.gov/workspaces/Architecture/caGrid. While caGrid 1.0 is designed to address use cases in cancer research, the requirements associated with discovery, analysis and integration of large scale data, and coordinated studies are common in other biomedical fields. In this respect, caGrid 1.0 is the realization of a framework that can benefit the entire biomedical community.

  17. KSC-02pd0611

    NASA Image and Video Library

    2002-04-29

    KENNEDY SPACE CENTER, FLA. -- Center Director Roy D. Bridges Jr. speaks at the opening ceremony to launch a new program called SABRE, Space Agricultural Biotechnology Research and Education, involving the University of Florida and NASA. Officials from UF and NASA attended the event. SABRE will focus on the discovery, development and application of the biological aspects of advanced life support strategies. The program will include faculty from UF's Institute of Food and Agricultural Sciences, who will be located at both KSC - in the state-owned Space Experiment Research and Processing Laboratory (SERPL) being built there - and UF in Gainesville. SABRE will be directed by Robert Ferl, professor in the horticultural sciences department and assistant director of UF's Biotechnology Program. He will be responsible for coordinating the research and education efforts of UF and NASA.

  18. KSC-02pd0619

    NASA Image and Video Library

    2002-04-29

    KENNEDY SPACE CENTER, FLA. -- At the opening ceremony for the new program known as SABRE, Space Agricultural Biotechnology Research and Education, William Knott speaks to attendees. Knott is senior scientist in the NASA biological sciences office. SABRE is a joint effort of the University of Florida and NASA and will focus on the discovery, development and application of the biological aspects of advanced life support strategies. The program will include faculty from UF's Institute of Food and Agricultural Sciences, who will be located at both KSC - in the state-owned Space Experiment Research and Processing Laboratory (SERPL) being built there - and UF in Gainesville. Robert Ferl, professor in the horticultural sciences department and assistant director of the University of Florida Biotechnology Program, will direct and be responsible for coordinating the research and education.

  19. KSC-02pd0610

    NASA Image and Video Library

    2002-04-29

    KENNEDY SPACE CENTER, FLA. -- Mike Martin, University of Florida vice president for agriculture and natural resources, speaks during the opening ceremony to launch a new program called SABRE, Space Agricultural Biotechnology Research and Education, that involves UF and NASA. Officials from UF and NASA attended the event. SABRE will focus on the discovery, development and application of the biological aspects of advanced life support strategies. The program will include faculty from UF's Institute of Food and Agricultural Sciences, who will be located at both KSC - in the state-owned Space Experiment Research and Processing Laboratory (SERPL) being built there - and UF in Gainesville. SABRE will be directed by Robert Ferl, professor in the horticultural sciences department and assistant director of UF's Biotechnology Program. He will be responsible for coordinating the research and education efforts of UF and NASA.

  20. Checkpointing Shared Memory Programs at the Application-level

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bronevetsky, G; Schulz, M; Szwed, P

    2004-09-08

    Trends in high-performance computing are making it necessary for long-running applications to tolerate hardware faults. The most commonly used approach is checkpoint and restart (CPR): the state of the computation is saved periodically on disk, and when a failure occurs, the computation is restarted from the last saved state. At present, it is the responsibility of the programmer to instrument applications for CPR. Our group is investigating the use of compiler technology to instrument codes to make them self-checkpointing and self-restarting, thereby providing an automatic solution to the problem of making long-running scientific applications resilient to hardware faults. Our previous work focused on message-passing programs. In this paper, we describe such a system for shared-memory programs running on symmetric multiprocessors. The system has two components: (i) a pre-compiler for source-to-source modification of applications, and (ii) a runtime system that implements a protocol for coordinating CPR among the threads of the parallel application. For the sake of concreteness, we focus on a non-trivial subset of OpenMP that includes barriers and locks. One of the advantages of this approach is that the ability to tolerate faults becomes embedded within the application itself, so applications become self-checkpointing and self-restarting on any platform. We demonstrate this by showing that our transformed benchmarks can checkpoint and restart on three different platforms (Windows/x86, Linux/x86, and Tru64/Alpha). Our experiments show that the overhead introduced by this approach is usually quite small; they also suggest ways in which the current implementation can be tuned to reduce overheads further.
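    The compiler-based system above inserts this instrumentation automatically; the underlying idea of application-level CPR can be sketched by hand in a few lines. This is a single-threaded toy, not the paper's OpenMP coordination protocol: the loop periodically persists its own state, so a restarted run resumes from the last checkpoint instead of from iteration zero.

    ```python
    import os
    import pickle

    CKPT = "demo_state.ckpt"

    def run(n_iters, ckpt_every=100):
        # Restart path: resume from the last saved state if a checkpoint exists.
        if os.path.exists(CKPT):
            with open(CKPT, "rb") as f:
                i, total = pickle.load(f)
        else:
            i, total = 0, 0
        while i < n_iters:
            total += i  # stand-in for real work
            i += 1
            if i % ckpt_every == 0:
                with open(CKPT, "wb") as f:
                    pickle.dump((i, total), f)  # persist loop state to disk
        return total

    result = run(1000)  # a kill/restart mid-run would lose at most ckpt_every iterations
    if os.path.exists(CKPT):
        os.remove(CKPT)  # clean up after the demo
    print(result)       # sum of 0..999
    ```

    A production scheme must also make the checkpoint write atomic (write to a temp file, then rename) and, for shared-memory programs, coordinate so all threads save a mutually consistent state, which is precisely what the paper's runtime protocol provides.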

  1. Sustaining and Extending the Open Science Grid: Science Innovation on a PetaScale Nationwide Facility (DE-FC02-06ER41436) SciDAC-2 Closeout Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Livny, Miron; Shank, James; Ernst, Michael

    Under this SciDAC-2 grant the project's goal was to stimulate new discoveries by providing scientists with effective and dependable access to an unprecedented national distributed computational facility: the Open Science Grid (OSG). We proposed to achieve this through the work of the Open Science Grid Consortium: a unique hands-on multi-disciplinary collaboration of scientists, software developers and providers of computing resources. Together the stakeholders in this consortium sustain and use a shared distributed computing environment that transforms simulation and experimental science in the US. The OSG consortium is an open collaboration that actively engages new research communities. We operate an open facility that brings together a broad spectrum of compute, storage, and networking resources and interfaces to other cyberinfrastructures, including the US XSEDE (previously TeraGrid) and the Enabling Grids for E-sciencE (EGEE), as well as campus and regional grids. We leverage middleware provided by computer science groups, facility IT support organizations, and computing programs of application communities for the benefit of consortium members and the US national CI.

  2. Use Computer-Aided Tools to Parallelize Large CFD Applications

    NASA Technical Reports Server (NTRS)

    Jin, H.; Frumkin, M.; Yan, J.

    2000-01-01

    Porting applications to high performance parallel computers is always a challenging task. It is time consuming and costly. With rapid progress in hardware architectures and the increasing complexity of real applications in recent years, the problem has become even more severe. Today, scalability and high performance are mostly achieved with hand-written parallel programs using message-passing libraries (e.g., MPI). However, this process is very difficult and often error-prone. The recent reemergence of shared memory parallel (SMP) architectures, such as the cache coherent Non-Uniform Memory Access (ccNUMA) architecture used in the SGI Origin 2000, shows good prospects for scaling beyond hundreds of processors. Programming on an SMP is simplified by working in a globally accessible address space. The user can supply compiler directives, such as OpenMP, to parallelize the code. As an industry standard for portable implementation of parallel programs for SMPs, OpenMP is a set of compiler directives and callable runtime library routines that extend Fortran, C and C++ to express shared memory parallelism. It promises an incremental path for parallel conversion of existing software, as well as scalability and performance for a complete rewrite or an entirely new development. Perhaps the main disadvantage of programming with directives is that inserted directives may not necessarily enhance performance; in the worst cases, they can create erroneous results. While vendors have provided tools to perform error-checking and profiling, automation in directive insertion is very limited and often fails on large programs, primarily due to the lack of a thorough enough data dependence analysis. To overcome this deficiency, we have developed a toolkit, CAPO, to automatically insert OpenMP directives in Fortran programs and apply certain degrees of optimization.
CAPO is aimed at taking advantage of the detailed inter-procedural dependence analysis provided by CAPTools, developed by the University of Greenwich, to reduce potential errors made by users. Earlier tests on the NAS Benchmarks and ARC3D have demonstrated good success of this tool. In this study, we have applied CAPO to parallelize three large applications in the area of computational fluid dynamics (CFD): OVERFLOW, TLNS3D and INS3D. These codes are widely used for solving Navier-Stokes equations with complicated boundary conditions and turbulence models in multiple zones. Each comprises from 50K to 100K lines of FORTRAN77. As an example, CAPO took 77 hours to complete the data dependence analysis of OVERFLOW on a workstation (SGI, 175MHz, R10K processor). A fair amount of effort was spent on correcting false dependencies due to lack of necessary knowledge during the analysis. Even so, CAPO provides an easy way for the user to interact with the parallelization process. The OpenMP version was generated within a day after the analysis was completed. Due to the sequential algorithms involved, code sections in TLNS3D and INS3D needed to be restructured by hand to produce more efficient parallel codes. An included figure shows preliminary test results of the generated OVERFLOW with several test cases in a single zone. The MPI data points for the small test case were taken from a hand-coded MPI version. As we can see, CAPO's version achieved an 18-fold speedup on 32 nodes of the SGI O2K. For the small test case, it outperformed the MPI version. These results are very encouraging, but further work is needed. For example, although CAPO attempts to place directives on the outermost parallel loops in an interprocedural framework, it does not insert directives based on the best manual strategy. In particular, it lacks support for parallelization at the multi-zone level.
Future work will emphasize the development of a methodology to work at the multi-zone level and with a hybrid approach. Development of tools to perform more complicated code transformation is also needed.

  3. Publications of the Fossil Energy Advanced Research and Technology Development Materials Program: April 1, 1993-March 31, 1995

    NASA Astrophysics Data System (ADS)

    Carlson, Paul T.

    1995-04-01

    The objective of the Fossil Energy Advanced Research and Technology Development (AR and TD) Materials Program is to conduct research and development on materials for fossil energy applications, with a focus on the longer-term needs for materials with general applicability to the various fossil fuel technologies. The Program includes research aimed at a better understanding of materials behavior in fossil energy environments and on the development of new materials capable of substantial improvement in plant operations and reliability. The scope of the Program addresses materials requirements for all fossil energy systems, including materials for coal preparation, coal liquefaction, coal gasification, heat engines and heat recovery, combustion systems, and fuel cells. Work on the Program is conducted at national and government laboratories, universities, and industrial research facilities. This bibliography covers the period of April 1, 1993, through March 31, 1995, and is a supplement to previous bibliographies in this series. It is the intent of this series of bibliographies to list only those publications that can be conveniently obtained by a researcher through relatively normal channels. The publications listed in this document have been limited to topical reports, open literature publications in refereed journals, full-length papers in published proceedings of conferences, full-length papers in unrefereed journals, and books and book articles.

  4. medplot: a web application for dynamic summary and analysis of longitudinal medical data based on R.

    PubMed

    Ahlin, Črt; Stupica, Daša; Strle, Franc; Lusa, Lara

    2015-01-01

    In biomedical studies the patients are often evaluated numerous times and a large number of variables are recorded at each time-point. Data entry and manipulation of longitudinal data can be performed using spreadsheet programs, which usually include some data plotting and analysis capabilities and are straightforward to use, but are not designed for the analyses of complex longitudinal data. Specialized statistical software offers more flexibility and capabilities, but first-time users with a biomedical background often find its use difficult. We developed medplot, an interactive web application that simplifies the exploration and analysis of longitudinal data. The application can be used to summarize, visualize and analyze data by researchers who are not familiar with statistical programs and whose knowledge of statistics is limited. The summary tools produce publication-ready tables and graphs. The analysis tools include features that are seldom available in spreadsheet software, such as correction for multiple testing, repeated measurement analyses and flexible non-linear modeling of the association of the numerical variables with the outcome. medplot is freely available and open source, it has an intuitive graphical user interface (GUI), it is accessible via the Internet and can be used within a web browser, without the need for installing and maintaining programs locally on the user's computer. This paper describes the application and gives detailed examples describing how to use the application on real data from a clinical study including patients with early Lyme borreliosis.
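    One analysis feature mentioned above, correction for multiple testing, can be illustrated with the Holm-Bonferroni step-down procedure; the record does not say which method medplot actually uses, so this is just one standard choice, sketched with the standard library:

    ```python
    def holm_adjust(pvals):
        """Return Holm-Bonferroni adjusted p-values, preserving input order."""
        m = len(pvals)
        # Process p-values from smallest to largest (step-down).
        order = sorted(range(m), key=lambda i: pvals[i])
        adjusted = [0.0] * m
        running_max = 0.0
        for rank, i in enumerate(order):
            adj = min(1.0, (m - rank) * pvals[i])
            running_max = max(running_max, adj)  # enforce monotonicity
            adjusted[i] = running_max
        return adjusted

    print(holm_adjust([0.01, 0.04, 0.03, 0.005]))
    ```

    Unlike the plain Bonferroni correction (which would multiply every p-value by m), Holm's step-down scheme is uniformly more powerful while still controlling the family-wise error rate.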

  5. Technology Applications Team: Applications of aerospace technology

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Highlights of the Research Triangle Institute (RTI) Applications Team activities over the past quarter are presented in Section 1.0. The Team's progress in fulfilling the requirements of the contract is summarized in Section 2.0. In addition to our market-driven approach to applications project development, RTI has placed increased effort on activities to commercialize technologies developed at NASA Centers. These Technology Commercialization efforts are summarized in Section 3.0. New problem statements prepared by the Team in the reporting period are presented in Section 4.0. The Team's transfer activities for ongoing projects with the NASA Centers are presented in Section 5.0. Section 6.0 summarizes the status of four add-on tasks. Travel for the reporting period is described in Section 7.0. The RTI Team staff and consultants and their project responsibilities are listed in Appendix A. The authors gratefully acknowledge the contributions of many individuals to the RTI Technology Applications Team program. The time and effort contributed by managers, engineers, and scientists throughout NASA were essential to program success. Most important to the program has been a productive working relationship with the NASA Field Center Technology Utilization (TU) Offices. The RTI Team continues to strive for improved effectiveness as a resource to these offices. Industry managers, technical staff, medical researchers, and clinicians have been cooperative and open in their participation. The RTI Team looks forward to continuing expansion of its interaction with U.S. industry to facilitate the transfer of aerospace technology to the private sector.

  6. OpenCL: A Parallel Programming Standard for Heterogeneous Computing Systems

    PubMed Central

    Stone, John E.; Gohara, David; Shi, Guochun

    2010-01-01

    We provide an overview of the key architectural features of recent microprocessor designs and describe the programming model and abstractions provided by OpenCL, a new parallel programming standard targeting these architectures. PMID:21037981

  7. 77 FR 43084 - Multiple Award Schedule (MAS) Program Continuous Open Season-Operational Change

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-23

    ... Award Schedule (MAS) Program Continuous Open Season- Operational Change AGENCY: Federal Acquisition... proposing this operational change to enhance the performance of and modernize the MAS program in three key program areas: Small business viability, operational efficiency, and cost control. The DBM will realign...

  8. CCP MRAP Run

    NASA Image and Video Library

    2018-04-20

    Following a training run on the Shuttle Landing Facility at NASA's Kennedy Space Center in Florida, MRAP back doors are opened showing seating in the armored vehicle. The 45,000-pound mine-resistant ambush protected vehicle, or MRAP, was originally designed for military applications, but will support the agency's Commercial Crew Program at the spaceport. The MRAP offers a mobile bunker for astronauts and ground crews in the unlikely event they have to get away from the launch pad quickly in an emergency.

  9. Modeling and Simulation Behavior Validation Methodology and Extension Model Validation for the Individual Soldier

    DTIC Science & Technology

    2015-03-01

    domains. Major model functions include: • Ground combat: Light and heavy forces. • Air mobile forces. • Future forces. • Fixed-wing and rotary-wing... Constraints: • Study must be completed no later than 31 December 2014. • Entity behavior limited to select COMBATXXI Mobility, Unmanned Aerial System... and SQL backend, as well as any open application programming interface (API). • Allows data transparency and data-driven navigation through the model

  10. Evaluation of Carbon Nanotube Thin Films for Optically Transparent Microwave Applications Using On-Wafer Probing of Corbino Disc Test Structures

    DTIC Science & Technology

    2013-03-01

    the Material Under Test (MUT) against an open end of a coaxial cable. The novelty of our measurement scheme is the aspect of on-wafer probing. This...Directorate, ARL Julia B. Doggett George Washington University Henning Richter and Ramesh Sivarajan Nano-C, Inc...and Engineering Apprenticeship Program, George Washington University/Department of Defense, Washington, D.C., 20052 † Nano-C, Inc., 33 Southwest Park

  11. Open Space, Open Education, and Pupil Performance.

    ERIC Educational Resources Information Center

    Lukasevich, Ann; Gray, Roland F.

    1978-01-01

    Explores the relationship between instructional style (open and non-open programs), architectural style (open and non-open facilities) and selected cognitive and affective outcomes of third grade pupils. (CM)

  12. McIDAS-V: Data Analysis and Visualization for NPOESS and GOES-R

    NASA Astrophysics Data System (ADS)

    Rink, T.; Achtor, T. H.

    2009-12-01

    McIDAS-V, the next-generation McIDAS, is being built on top of a modern, cross-platform software framework which supports development of 4-D, interactive displays and integration of a wide array of geophysical data. As the replacement of McIDAS, the development emphasis is on future satellite observation platforms such as NPOESS and GOES-R. Data interrogation, analysis and visualization capabilities have been developed for multi- and hyper-spectral instruments like MODIS, AIRS and IASI, and are being extended for application to VIIRS and CrIS. Compatibility with GOES-R ABI level 1 and level 2 product storage formats has been demonstrated. The abstract data model, which can internalize almost any geophysical data, opens up new possibilities for data fusion techniques, for example, polar and geostationary (LEO/GEO) synergy for research and validation. McIDAS-V follows an object-oriented design model, using the Java programming language, allowing specialized extensions for new sources of data, and novel displays and interactive behavior. The reference application, what the user sees on startup, can be customized, and the system has a persistence mechanism allowing sharing of the application state across the internet. McIDAS-V is open source and free to the public.

  13. Automatic Thread-Level Parallelization in the Chombo AMR Library

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christen, Matthias; Keen, Noel; Ligocki, Terry

    2011-05-26

    The increasing on-chip parallelism has some substantial implications for HPC applications. Currently, hybrid programming models (typically MPI+OpenMP) are employed for mapping software to the hardware in order to leverage the hardware's architectural features. In this paper, we present an approach that automatically introduces thread-level parallelism into Chombo, a parallel adaptive mesh refinement framework for finite difference type PDE solvers. In Chombo, core algorithms are specified in ChomboFortran, a macro language extension to F77 that is part of the Chombo framework. This domain-specific language forms an already-used target language for an automatic migration of the large number of existing algorithms into a hybrid MPI+OpenMP implementation. It also provides access to the auto-tuning methodology that enables tuning certain aspects of an algorithm to hardware characteristics. Performance measurements are presented for a few of the most relevant kernels with respect to a specific application benchmark using this technique as well as benchmark results for the entire application. The kernel benchmarks show that, using auto-tuning, up to a factor of 11 in performance was gained with 4 threads with respect to the serial reference implementation.

  14. An open experimental database for exploring inorganic materials

    DOE PAGES

    Zakutayev, Andriy; Wunder, Nick; Schwarting, Marcus; ...

    2018-04-03

    The use of advanced machine learning algorithms in experimental materials science is limited by the lack of sufficiently large and diverse datasets amenable to data mining. If publicly open, such data resources would also enable materials research by scientists without access to expensive experimental equipment. Here, we report on our progress towards a publicly open High Throughput Experimental Materials (HTEM) Database (htem.nrel.gov). This database currently contains 140,000 sample entries, characterized by structural (100,000), synthetic (80,000), chemical (70,000), and optoelectronic (50,000) properties of inorganic thin film materials, grouped in >4,000 sample entries across >100 materials systems; more than half of these data are publicly available. This article shows how the HTEM database may enable scientists to explore materials by browsing a web-based user interface and an application programming interface. This paper also describes an HTE approach to generating materials data, and discusses the laboratory information management system (LIMS) that underpins the HTEM database. Finally, this manuscript illustrates how advanced machine learning algorithms can be adopted to materials science problems using this open data resource.
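    As a sketch of how a client might compose a query against a REST-style materials API such as the one this record describes, the standard library can build a properly encoded URL. The endpoint path and parameter names below are assumptions for illustration, not the documented HTEM API.

    ```python
    from urllib.parse import urlencode, urlunsplit

    def build_query_url(host, path, **params):
        """Compose an HTTPS GET URL with sorted, percent-encoded query parameters."""
        query = urlencode(sorted(params.items()))
        return urlunsplit(("https", host, path, query, ""))

    # Hypothetical query: thin-film samples containing Zn, filtered on band gap.
    url = build_query_url("htem.nrel.gov", "/api/sample_library",
                          element="Zn", property="band_gap")
    print(url)
    ```

    Sorting the parameters makes the URLs deterministic, which helps when caching responses or logging queries during a data-mining run.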

  15. Digital time stamping system based on open source technologies.

    PubMed

    Miskinis, Rimantas; Smirnov, Dmitrij; Urba, Emilis; Burokas, Andrius; Malysko, Bogdan; Laud, Peeter; Zuliani, Francesco

    2010-03-01

    A digital time stamping system based on open source technologies (LINUX-UBUNTU, OpenTSA, OpenSSL, MySQL) is described in detail, including all important testing results. The system, called BALTICTIME, was developed under a project sponsored by the European Commission under the Sixth Framework Programme (FP6). It was designed to meet the requirements imposed on systems for legal and accountable time stamping and to be applicable to the hardware commonly used by national time metrology laboratories. The BALTICTIME system is intended for use by governmental and other institutions as well as private individuals. Testing results demonstrate that the time stamps issued to the user by BALTICTIME and saved in BALTICTIME's archives (which implies that the time stamps are accountable) meet all the regulatory requirements. Moreover, BALTICTIME in its present implementation is able to issue more than 10 digital time stamps per second, and the system can be enhanced if needed. The test version of the BALTICTIME service is free and available at http://baltictime.pfi.lt:8080/btws/ and http://baltictime.lnmc.lv:8080/btws/.
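
    The core obligation of such a service is to bind a document digest to a trusted time value in a verifiable way. The sketch below illustrates that flow with an HMAC and a hypothetical shared key; a real RFC 3161-style authority (as implemented by OpenTSA/OpenSSL) instead signs the digest-time pair with its private key:

    ```python
    import hashlib, hmac, json, time

    # Hypothetical TSA secret for the demo; real services use asymmetric signatures.
    TSA_KEY = b"demo-key"

    def issue_time_stamp(document: bytes, now=None):
        """Bind a document digest to a time value and authenticate the pair."""
        digest = hashlib.sha256(document).hexdigest()
        stamp = {"digest": digest, "time": now if now is not None else int(time.time())}
        payload = json.dumps(stamp, sort_keys=True).encode()
        stamp["mac"] = hmac.new(TSA_KEY, payload, hashlib.sha256).hexdigest()
        return stamp

    def verify_time_stamp(document: bytes, stamp):
        """Check both the authenticity of the stamp and that it matches the document."""
        digest = hashlib.sha256(document).hexdigest()
        payload = json.dumps({"digest": stamp["digest"], "time": stamp["time"]},
                             sort_keys=True).encode()
        mac = hmac.new(TSA_KEY, payload, hashlib.sha256).hexdigest()
        return digest == stamp["digest"] and hmac.compare_digest(mac, stamp["mac"])
    ```

    Accountability in the article's sense follows from archiving the issued stamps so that any of them can be re-verified later.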

  16. An open experimental database for exploring inorganic materials.

    PubMed

    Zakutayev, Andriy; Wunder, Nick; Schwarting, Marcus; Perkins, John D; White, Robert; Munch, Kristin; Tumas, William; Phillips, Caleb

    2018-04-03

    The use of advanced machine learning algorithms in experimental materials science is limited by the lack of sufficiently large and diverse datasets amenable to data mining. If publicly open, such data resources would also enable materials research by scientists without access to expensive experimental equipment. Here, we report on our progress towards a publicly open High Throughput Experimental Materials (HTEM) Database (htem.nrel.gov). This database currently contains 140,000 sample entries, characterized by structural (100,000), synthetic (80,000), chemical (70,000), and optoelectronic (50,000) properties of inorganic thin film materials, grouped in >4,000 sample entries across >100 materials systems; more than half of these data are publicly available. This article shows how the HTEM database may enable scientists to explore materials by browsing a web-based user interface and using an application programming interface. This paper also describes the HTE approach to generating materials data, and discusses the laboratory information management system (LIMS) that underpins the HTEM database. Finally, this manuscript illustrates how advanced machine learning algorithms can be applied to materials science problems using this open data resource.

  17. An open experimental database for exploring inorganic materials

    PubMed Central

    Zakutayev, Andriy; Wunder, Nick; Schwarting, Marcus; Perkins, John D.; White, Robert; Munch, Kristin; Tumas, William; Phillips, Caleb

    2018-01-01

    The use of advanced machine learning algorithms in experimental materials science is limited by the lack of sufficiently large and diverse datasets amenable to data mining. If publicly open, such data resources would also enable materials research by scientists without access to expensive experimental equipment. Here, we report on our progress towards a publicly open High Throughput Experimental Materials (HTEM) Database (htem.nrel.gov). This database currently contains 140,000 sample entries, characterized by structural (100,000), synthetic (80,000), chemical (70,000), and optoelectronic (50,000) properties of inorganic thin film materials, grouped in >4,000 sample entries across >100 materials systems; more than half of these data are publicly available. This article shows how the HTEM database may enable scientists to explore materials by browsing a web-based user interface and using an application programming interface. This paper also describes the HTE approach to generating materials data, and discusses the laboratory information management system (LIMS) that underpins the HTEM database. Finally, this manuscript illustrates how advanced machine learning algorithms can be applied to materials science problems using this open data resource. PMID:29611842

  18. An open experimental database for exploring inorganic materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zakutayev, Andriy; Wunder, Nick; Schwarting, Marcus

    The use of advanced machine learning algorithms in experimental materials science is limited by the lack of sufficiently large and diverse datasets amenable to data mining. If publicly open, such data resources would also enable materials research by scientists without access to expensive experimental equipment. Here, we report on our progress towards a publicly open High Throughput Experimental Materials (HTEM) Database (htem.nrel.gov). This database currently contains 140,000 sample entries, characterized by structural (100,000), synthetic (80,000), chemical (70,000), and optoelectronic (50,000) properties of inorganic thin film materials, grouped in >4,000 sample entries across >100 materials systems; more than half of these data are publicly available. This article shows how the HTEM database may enable scientists to explore materials by browsing a web-based user interface and using an application programming interface. This paper also describes the HTE approach to generating materials data, and discusses the laboratory information management system (LIMS) that underpins the HTEM database. Finally, this manuscript illustrates how advanced machine learning algorithms can be applied to materials science problems using this open data resource.

  19. Effects of Ordering Strategies and Programming Paradigms on Sparse Matrix Computations

    NASA Technical Reports Server (NTRS)

    Oliker, Leonid; Li, Xiaoye; Husbands, Parry; Biswas, Rupak; Biegel, Bryan (Technical Monitor)

    2002-01-01

    The Conjugate Gradient (CG) algorithm is perhaps the best-known iterative technique for solving sparse linear systems that are symmetric and positive definite. For systems that are ill-conditioned, it is often necessary to use a preconditioning technique. In this paper, we investigate the effects of various ordering and partitioning strategies on the performance of parallel CG and ILU(0)-preconditioned CG (PCG) using different programming paradigms and architectures. Results show that, for this class of applications: ordering significantly improves overall performance on both distributed and distributed shared-memory systems; cache reuse may be more important than reducing communication; it is possible to achieve message-passing performance using shared-memory constructs through careful data ordering and distribution; and a hybrid MPI+OpenMP paradigm increases programming complexity with little performance gain. An implementation of CG on the Cray MTA does not require special ordering or partitioning to obtain high efficiency and scalability, giving it a distinct advantage for adaptive applications; however, it shows limited scalability for PCG due to a lack of thread-level parallelism.
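
    For reference, the unpreconditioned CG iteration the study builds on can be stated compactly. A minimal pure-Python version (dense and serial, illustrating only the algorithm, not the paper's parallel implementations):

    ```python
    def conjugate_gradient(A, b, tol=1e-10, max_iter=100):
        """Conjugate gradient for a symmetric positive definite system Ax = b.
        A is a list of rows, vectors are plain lists; tol bounds the squared
        residual norm."""
        n = len(b)
        matvec = lambda M, v: [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
        dot = lambda u, v: sum(x * y for x, y in zip(u, v))
        x = [0.0] * n
        r = b[:]                      # residual b - A x  (x = 0 initially)
        p = r[:]                      # first search direction
        rs = dot(r, r)
        for _ in range(max_iter):
            Ap = matvec(A, p)
            alpha = rs / dot(p, Ap)   # step length along p
            x = [xi + alpha * pi for xi, pi in zip(x, p)]
            r = [ri - alpha * api for ri, api in zip(r, Ap)]
            rs_new = dot(r, r)
            if rs_new < tol:
                break
            p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]  # conjugate update
            rs = rs_new
        return x
    ```

    The ordering and partitioning strategies studied in the paper affect where the matrix rows live and in what order the `matvec` accesses them, which is why they dominate cache and communication behavior at scale.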

  20. Utilizing Linked Open Data Sources for Automatic Generation of Semantic Metadata

    NASA Astrophysics Data System (ADS)

    Nummiaho, Antti; Vainikainen, Sari; Melin, Magnus

    In this paper we present an application that can be used to automatically generate semantic metadata for tags given as simple keywords. The application that we have implemented in Java programming language creates the semantic metadata by linking the tags to concepts in different semantic knowledge bases (CrunchBase, DBpedia, Freebase, KOKO, Opencyc, Umbel and/or WordNet). The steps that our application takes in doing so include detecting possible languages, finding spelling suggestions and finding meanings from amongst the proper nouns and common nouns separately. Currently, our application supports English, Finnish and Swedish words, but other languages could be included easily if the required lexical tools (spellcheckers, etc.) are available. The created semantic metadata can be of great use in, e.g., finding and combining similar contents, creating recommendations and targeting advertisements.
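
    The final linking step, mapping a keyword tag to candidate concepts, can be sketched as follows. The tiny local dictionary stands in for the real knowledge-base queries (DBpedia, WordNet, etc.), and the concept URIs shown are illustrative:

    ```python
    # Toy stand-in for the knowledge-base lookup step; a real implementation
    # queries CrunchBase, DBpedia, Freebase, WordNet, and similar services.
    KNOWLEDGE_BASE = {
        "jaguar": ["dbpedia:Jaguar_(animal)", "dbpedia:Jaguar_Cars"],
        "python": ["dbpedia:Python_(programming_language)", "dbpedia:Pythonidae"],
    }

    def link_tags(tags):
        """Map free-text tags to candidate concept URIs (empty list if unknown)."""
        return {tag: KNOWLEDGE_BASE.get(tag.strip().lower(), []) for tag in tags}
    ```

    The ambiguity visible above ("jaguar" as animal vs. car maker) is exactly why the application's language detection and noun-class separation steps matter before a single meaning is chosen.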

  1. KSC-02pd0616

    NASA Image and Video Library

    2002-04-29

    KENNEDY SPACE CENTER, FLA. -- The Honorable Diana Morgan speaks to attendees at the opening ceremony kicking off a new program known as SABRE, Space Agricultural Biotechnology Research and Education. In the foreground are Center Director Roy D. Bridges Jr. (left) and U.S. Representative Dave Weldon (right). The SABRE program is a combined effort of the University of Florida and NASA. Morgan is vice chair on the UF Board of Trustees. SABRE will focus on the discovery, development and application of the biological aspects of advanced life support strategies. The program will include faculty from UF's Institute of Food and Agricultural Sciences, who will be located at both KSC - in the state-owned Space Experiment Research and Processing Laboratory (SERPL) being built there - and UF in Gainesville. SABRE will be directed by Robert Ferl, professor in the horticultural sciences department and assistant director of UF's Biotechnology Program. He will be responsible for coordinating the research and education efforts of UF and NASA.

  2. Modern Data Center Services Supporting Science

    NASA Astrophysics Data System (ADS)

    Varner, J. D.; Cartwright, J.; McLean, S. J.; Boucher, J.; Neufeld, D.; LaRocque, J.; Fischman, D.; McQuinn, E.; Fugett, C.

    2011-12-01

    The National Oceanic and Atmospheric Administration's National Geophysical Data Center (NGDC) World Data Center for Geophysics and Marine Geology provides scientific stewardship, products and services for geophysical data, including bathymetry, gravity, magnetics, seismic reflection, data derived from sediment and rock samples, as well as historical natural hazards data (tsunamis, earthquakes, and volcanoes). Although NGDC has long made many of its datasets available through map and other web services, it has now developed a second generation of services to improve the discovery and access to data. These new services use off-the-shelf commercial and open source software, and take advantage of modern JavaScript and web application frameworks. Services are accessible using both RESTful and SOAP queries as well as Open Geospatial Consortium (OGC) standard protocols such as WMS, WFS, WCS, and KML. These new map services (implemented using ESRI ArcGIS Server) are finer-grained than their predecessors, feature improved cartography, and offer dramatic speed improvements through the use of map caches. Using standards-based interfaces allows customers to incorporate the services without having to coordinate with the provider. Providing fine-grained services increases flexibility for customers building custom applications. The Integrated Ocean and Coastal Mapping program and Coastal and Marine Spatial Planning program are two examples of national initiatives that require common data inventories from multiple sources and benefit from these modern data services. NGDC is also consuming its own services, providing a set of new browser-based mapping applications which allow the user to quickly visualize and search for data. One example is a new interactive mapping application to search and display information about historical natural hazards. 
NGDC continues to increase the amount of its data holdings that are accessible and is augmenting its capabilities with modern web application frameworks such as Groovy and Grails. Data discovery is being improved and simplified by leveraging ISO metadata standards along with ESRI Geoportal Server.
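
    Standards-based services such as these are consumed by composing OGC-conformant requests, which is what lets customers build applications without coordinating with the provider. A minimal sketch of building a WMS 1.3.0 GetMap URL (the base URL and layer name are placeholders, not actual NGDC endpoints):

    ```python
    from urllib.parse import urlencode

    def wms_getmap_url(base, layer, bbox, width=512, height=512,
                       crs="EPSG:4326", fmt="image/png"):
        """Build an OGC WMS 1.3.0 GetMap request URL; bbox is
        (min_lat, min_lon, max_lat, max_lon) in the axis order of the CRS."""
        params = {
            "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
            "LAYERS": layer, "CRS": crs,
            "BBOX": ",".join(str(v) for v in bbox),
            "WIDTH": width, "HEIGHT": height, "FORMAT": fmt,
        }
        return base + "?" + urlencode(params)

    # Hypothetical hazards layer served over the whole globe
    url = wms_getmap_url("https://example.gov/wms", "hazards:tsunami",
                         (-90, -180, 90, 180))
    ```

    Any WMS-aware client (a browser mapping library, a desktop GIS) can consume such a URL, which is the interoperability property the abstract emphasizes.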

  3. Using a web-based application to define the accuracy of diagnostic tests when the gold standard is imperfect.

    PubMed

    Lim, Cherry; Wannapinij, Prapass; White, Lisa; Day, Nicholas P J; Cooper, Ben S; Peacock, Sharon J; Limmathurotsakul, Direk

    2013-01-01

    Estimates of the sensitivity and specificity for new diagnostic tests based on evaluation against a known gold standard are imprecise when the accuracy of the gold standard is imperfect. Bayesian latent class models (LCMs) can be helpful under these circumstances, but the necessary analysis requires expertise in computational programming. Here, we describe open-access web-based applications that allow non-experts to apply Bayesian LCMs to their own data sets via a user-friendly interface. Applications for Bayesian LCMs were constructed on a web server using R and WinBUGS programs. The models provided (http://mice.tropmedres.ac) include two Bayesian LCMs: the two-tests in two-population model (Hui and Walter model) and the three-tests in one-population model (Walter and Irwig model). Both models are available with simplified and advanced interfaces. In the former, all settings for Bayesian statistics are fixed as defaults. Users input their data set into a table provided on the webpage. Disease prevalence and accuracy of diagnostic tests are then estimated using the Bayesian LCM, and provided on the web page within a few minutes. With the advanced interfaces, experienced researchers can modify all settings in the models as needed. These settings include correlation among diagnostic test results and prior distributions for all unknown parameters. The web pages provide worked examples with both models using the original data sets presented by Hui and Walter in 1980, and by Walter and Irwig in 1988. We also illustrate the utility of the advanced interface using the Walter and Irwig model on a data set from a recent melioidosis study. The results obtained from the web-based applications were comparable to those published previously. The newly developed web-based applications are open-access and provide an important new resource for researchers worldwide to evaluate new diagnostic tests.
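
    The underlying problem can be made concrete with a short calculation: when the gold standard is imperfect and errs independently of the new test given the true disease state, the apparent accuracy observed against it is biased. A sketch under that conditional-independence assumption (the parameter values are invented for illustration):

    ```python
    def apparent_accuracy(prev, se_t, sp_t, se_g, sp_g):
        """Apparent sensitivity/specificity of a new test T evaluated against an
        imperfect gold standard G, assuming T and G err independently given the
        latent disease state -- the setting where latent class models are needed."""
        # P(T+, G+) and P(G+) by total probability over the latent disease state
        p_tp_gp = prev * se_t * se_g + (1 - prev) * (1 - sp_t) * (1 - sp_g)
        p_gp = prev * se_g + (1 - prev) * (1 - sp_g)
        p_tn_gn = prev * (1 - se_t) * (1 - se_g) + (1 - prev) * sp_t * sp_g
        p_gn = 1 - p_gp
        return p_tp_gp / p_gp, p_tn_gn / p_gn  # (apparent Se, apparent Sp)

    # A test with true Se=Sp=0.95 judged against a gold standard with Se=0.80, Sp=0.90
    se_app, sp_app = apparent_accuracy(prev=0.3, se_t=0.95, sp_t=0.95,
                                       se_g=0.80, sp_g=0.90)
    ```

    The apparent sensitivity comes out well below the true 0.95, which is the bias the Bayesian LCMs on the web applications are designed to correct.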

  4. How Physician Assistant Programs Use the CASPA Personal Statement in Their Admissions Process.

    PubMed

    Lopes, John E; Badur, Michalina; Weis, Nicole

    2016-06-01

    This research surveyed physician assistant (PA) program admissions personnel to determine how the Central Application Service for Physician Assistants (CASPA) personal statements are used, what influence the statements had on certain admissions processes, whether there was any concern about authorship of the statements, and how important certain previously identified content themes were to admissions committees and personnel. The PA programs participating in CASPA were contacted and interviewed using a computer-assisted telephone interview system. Participants were asked a series of open-ended questions related to the usefulness of the personal statement and asked to score certain items using a Likert-type scale. The response rate for the telephone survey was 75%. Most of the programs (93%) used the personal statement in the applicant review process, and almost two-thirds (62%) indicated that the statement was useful or very useful. Three-fourths (76%) of respondents sometimes or always used the statement for the selection of candidates for interviews. Only 29% of respondents were very to extremely concerned that the statements were not written by the applicants. Despite the observation that the statements were relatively homogeneous in content, respondents ranked identified content themes as an important influence on decision-making. Almost all respondents used the personal statement in their admissions process, usually in the selection of interviewees. Although there was some concern that the statements were not the original work of the applicant, less than a third of respondents were very concerned about this possibility. The homogeneity of the statements was also a concern, but the importance placed on the identified theme content areas validates the applicants' inclusion of these themes in the statements.

  5. An open-source method to analyze optokinetic reflex responses in larval zebrafish.

    PubMed

    Scheetz, Seth D; Shao, Enhua; Zhou, Yangzhong; Cario, Clinton L; Bai, Qing; Burton, Edward A

    2018-01-01

    Optokinetic reflex (OKR) responses provide a convenient means to evaluate oculomotor, integrative and afferent visual function in larval zebrafish models, which are commonly used to elucidate molecular mechanisms underlying development, disease and repair of the vertebrate nervous system. We developed an open-source MATLAB-based solution for automated quantitative analysis of OKR responses in larval zebrafish. The package includes applications to: (i) generate sinusoidally-transformed animated grating patterns suitable for projection onto a cylindrical screen to elicit the OKR; (ii) determine and record the angular orientations of the eyes in each frame of a video recording showing the OKR response; and (iii) analyze angular orientation data from the tracking program to yield a set of parameters that quantify essential elements of the OKR. The method can be employed without modification using the operating manual provided. In addition, annotated source code is included, allowing users to modify or adapt the software for other applications. We validated the algorithms and measured OKR responses in normal larval zebrafish, showing good agreement with published quantitative data, where available. We provide the first open-source method to elicit and analyze the OKR in larval zebrafish. The wide range of parameters that are automatically quantified by our algorithms significantly expands the scope of quantitative analysis previously reported. Our method for quantifying OKR responses will be useful for numerous applications in neuroscience using the genetically- and chemically-tractable zebrafish model. Published by Elsevier B.V.
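
    One of the simpler summary parameters such an analysis yields is the gain of the eye movement relative to the grating. A toy sketch (not the package's actual MATLAB algorithm) estimating gain by least squares from simulated angle traces:

    ```python
    import math

    def okr_gain(stimulus, eye):
        """Least-squares gain of the eye-angle trace onto the grating-angle
        trace: g minimizing sum((eye - g*stimulus)**2)."""
        num = sum(s * e for s, e in zip(stimulus, eye))
        den = sum(s * s for s in stimulus)
        return num / den

    # Simulated 0.2 Hz grating oscillation and an attenuated tracking response
    t = [i * 0.05 for i in range(200)]
    stim = [10.0 * math.sin(2 * math.pi * 0.2 * ti) for ti in t]
    eye = [6.0 * math.sin(2 * math.pi * 0.2 * ti) for ti in t]
    gain = okr_gain(stim, eye)
    ```

    In practice the eye traces come from the frame-by-frame angular orientations produced by the tracking program, and additional parameters (phase lag, saccade frequency, per-eye asymmetry) are extracted alongside the gain.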

  6. Development of a Magneto-Resistive Angular Position Sensor for Space Mechanisms

    NASA Technical Reports Server (NTRS)

    Hahn, Robert; Schmidt, Tilo; Seifart, Klaus; Olberts, Bastian; Romera, Fernando

    2016-01-01

    Magnetic microsystems in the form of magneto-resistive (MR) sensors are firmly established in automobiles and industrial applications. They are used to measure travel, angle, electrical current, or magnetic fields. MR technology opens up new sensor possibilities in space applications and can be an enabling technology for optimal performance, high robustness and long lifetime at reasonable costs. In some science missions, the technology is already applied; however, the designs are proprietary and case specific, for instance the angular sensors used for JPL/NASA's Mars rover Curiosity [1]. Since 2013 HTS GmbH and Sensitec GmbH have teamed up to develop and qualify a standardized yet flexible-to-use MR angular sensor for space mechanisms. Starting with a first assessment study and market survey performed under ESA contract, a very strong industry interest in novel, contactless position measurement means was found. Currently a detailed and comprehensive development program is being performed by HTS and Sensitec. The objective of this program is to advance the sensor design up to Engineering Qualification Model level and to perform qualification testing for a representative space application. The paper briefly reviews the basics of magneto-resistive effects and possible sensor applications and describes the key benefits of MR angular sensors with reference to currently operational industrial and space applications. The key applications and specifications are presented, and the preliminary baseline mechanical and electrical design is discussed. An outlook on the upcoming development and test stages as well as the qualification program is provided.

  7. Developing inclusive employment: lessons from Telenor Open Mind.

    PubMed

    Kalef, Laura; Barrera, Magda; Heymann, Jody

    2014-01-01

    Despite significant gains in legal rights for people with disabilities, the employment rate for individuals with disabilities in many countries remains extremely low. Programs to promote the inclusion of people with disabilities in the workforce can have an important impact on individuals' economic and social prospects, as well as societal benefits. This article aims to explore Telenor Open Mind, a job training program at Norway's largest telecommunications company with financial support from Norway's Labor and Welfare Organization (NAV), which acts as a springboard for individuals with disabilities into the workplace. A qualitative case study design was utilized to explore the Telenor Open Mind Program. Drawing on field research conducted in Oslo during 2011, this article explores subjective experiences of individuals involved with the program, through interviews and program observations. Telenor Open Mind's two-year program is comprised of a three month training period, in which individuals participate in computer and self-development courses followed by a 21-month paid internship where participants gain hands-on experience. The program has an average 75% rate of employment upon completion and a high rate of participant satisfaction. Participation in the program led to increased self-confidence and social development. The company experienced benefits from greater workplace satisfaction and reductions in sick leave rates. The Telenor Open Mind program has provided benefits for participants, the company, and society as a whole. Participants gain training, work experience, and increased employability. Telenor gains dedicated and trained employees, in addition to reducing sick leave absences among all employees. Finally, society benefits from the Open Mind program as the individuals who gain employment become tax-payers, and no longer need to receive benefits from the government.

  8. An open science approach to modeling and visualizing ...

    EPA Pesticide Factsheets

    It is expected that cyanobacteria blooms will increase in frequency, duration, and severity as inputs of nutrients increase and the impacts of climate change are realized. Partly in response to this, federal, state, and local entities have ramped up efforts to better understand blooms which has resulted in new life for old datasets, new monitoring programs, and novel uses for non-traditional sources of data. To fully benefit from these datasets, it is also imperative that the full body of work including data, code, and manuscripts be openly available (i.e., open science). This presentation will provide several examples of our work which occurs at the intersection of open science and research on cyanobacetria blooms in lakes and ponds. In particular we will discuss 1) why open science is particularly important for environmental human health issues; 2) the lakemorpho and elevatr R packages and how we use those to model lake morphometry; 3) Shiny server applications to visualize data collected as part of the Cyanobacteria Monitoring Collaborative; and 4) distribution of our research and models via open access publications and as R packages on GitHub. Modelling and visualizing information on cyanobacteria blooms is important as it provides estimates of the extent of potential problems associated with these blooms. Furthermore, conducting this work in the open allows others to access our code, data, and results. In turn, this allows for a greater impact because the

  9. Challenges of agricultural monitoring: integration of the Open Farm Management Information System into GEOSS and Digital Earth

    NASA Astrophysics Data System (ADS)

    Řezník, T.; Kepka, M.; Charvát, K.; Charvát, K., Jr.; Horáková, S.; Lukas, V.

    2016-04-01

    From a global perspective, agriculture is the single largest user of freshwater resources, each country using an average of 70% of all its surface water supplies. An essential proportion of agricultural water is recycled back to surface water and/or groundwater. Agriculture and water pollution is therefore the subject of (inter)national legislation, such as the Clean Water Act in the United States of America, the European Water Framework Directive, and the Law of the People's Republic of China on the Prevention and Control of Water Pollution. Regular monitoring by means of sensor networks is needed in order to provide evidence of water pollution in agriculture. This paper describes the benefits of, and open issues stemming from, regular sensor monitoring provided by an Open Farm Management Information System. Emphasis is placed on descriptions of the processes and functionalities available to users, the underlying open data model, and definitions of open and lightweight application programming interfaces for the efficient management of collected (spatial) data. The presented Open Farm Management Information System has already been successfully registered under Phase 8 of the Global Earth Observation System of Systems (GEOSS) Architecture Implementation Pilot in order to support the wide variety of demands that are primarily aimed at agriculture pollution monitoring. The final part of the paper deals with the integration of the Open Farm Management Information System into the Digital Earth framework.

  10. From WSN towards WoT: Open API Scheme Based on oneM2M Platforms.

    PubMed

    Kim, Jaeho; Choi, Sung-Chan; Ahn, Il-Yeup; Sung, Nak-Myoung; Yun, Jaeseok

    2016-10-06

    Conventional computing systems have been able to be integrated into daily objects and connected to each other due to advances in computing and network technologies, such as wireless sensor networks (WSNs), forming a global network infrastructure, called the Internet of Things (IoT). To support the interconnection and interoperability between heterogeneous IoT systems, the availability of standardized, open application programming interfaces (APIs) is one of the key features of common software platforms for IoT devices, gateways, and servers. In this paper, we present a standardized way of extending previously-existing WSNs towards IoT systems, building the world of the Web of Things (WoT). Based on the oneM2M software platforms developed in the previous project, we introduce a well-designed open API scheme and device-specific thing adaptation software (TAS) enabling WSN elements, such as a wireless sensor node, to be accessed in a standardized way on a global scale. Three pilot services are implemented (i.e., a WiFi-enabled smart flowerpot, voice-based control for ZigBee-connected home appliances, and WiFi-connected AR.Drone control) to demonstrate the practical usability of the open API scheme and TAS modules. Full details on the method of integrating WSN elements into three example systems are described at the programming code level, which is expected to help future researchers in integrating their WSN systems in IoT platforms, such as oneM2M. We hope that the flexibly-deployable, easily-reusable common open API scheme and TAS-based integration method working with the oneM2M platforms will help the conventional WSNs in diverse industries evolve into the emerging WoT solutions.
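
    In oneM2M deployments, a sensor reading is typically posted as a contentInstance under a container resource on the common services entity. A minimal sketch of composing such a body, using the standard resource short names (m2m:cin, con, rn) but omitting the surrounding HTTP request and addressing details:

    ```python
    import json

    def content_instance(value, resource_name=None):
        """Build a oneM2M-style contentInstance body for posting a sensor
        reading to a CSE container. 'con' carries the serialized content and
        'rn' the optional resource name, per oneM2M short-name conventions."""
        cin = {"con": json.dumps(value)}
        if resource_name:
            cin["rn"] = resource_name
        return {"m2m:cin": cin}

    # Hypothetical reading from a WSN temperature node bridged through a TAS
    body = content_instance({"temperature": 21.5}, resource_name="reading-001")
    ```

    The thing adaptation software (TAS) described in the paper sits exactly at this boundary, translating device-specific WSN payloads into standardized resources like the one above.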

  11. From WSN towards WoT: Open API Scheme Based on oneM2M Platforms

    PubMed Central

    Kim, Jaeho; Choi, Sung-Chan; Ahn, Il-Yeup; Sung, Nak-Myoung; Yun, Jaeseok

    2016-01-01

    Conventional computing systems have been able to be integrated into daily objects and connected to each other due to advances in computing and network technologies, such as wireless sensor networks (WSNs), forming a global network infrastructure, called the Internet of Things (IoT). To support the interconnection and interoperability between heterogeneous IoT systems, the availability of standardized, open application programming interfaces (APIs) is one of the key features of common software platforms for IoT devices, gateways, and servers. In this paper, we present a standardized way of extending previously-existing WSNs towards IoT systems, building the world of the Web of Things (WoT). Based on the oneM2M software platforms developed in the previous project, we introduce a well-designed open API scheme and device-specific thing adaptation software (TAS) enabling WSN elements, such as a wireless sensor node, to be accessed in a standardized way on a global scale. Three pilot services are implemented (i.e., a WiFi-enabled smart flowerpot, voice-based control for ZigBee-connected home appliances, and WiFi-connected AR.Drone control) to demonstrate the practical usability of the open API scheme and TAS modules. Full details on the method of integrating WSN elements into three example systems are described at the programming code level, which is expected to help future researchers in integrating their WSN systems in IoT platforms, such as oneM2M. We hope that the flexibly-deployable, easily-reusable common open API scheme and TAS-based integration method working with the oneM2M platforms will help the conventional WSNs in diverse industries evolve into the emerging WoT solutions. PMID:27782058

  12. Summary and evaluation of the Strategic Defense Initiative Space Power Architecture Study

    NASA Technical Reports Server (NTRS)

    Edenburn, M. (Editor); Smith, J. M. (Editor)

    1989-01-01

    The Space Power Architecture Study (SPAS) identified and evaluated power subsystem options for multimegawatt electric (MMWE) space based weapons and surveillance platforms for Strategic Defense Initiative (SDI) applications. Steady state requirements of less than 1 MMWE are adequately covered by the SP-100 nuclear space power program and hence were not addressed in the SPAS. Four steady state power systems less than 1 MMWE were investigated with little difference between them on a mass basis. The majority of the burst power systems utilized H(2) from the weapons and were either closed (no effluent), open (effluent release) or steady state with storage (no effluent). Closed systems used nuclear or combustion heat sources with thermionic, Rankine, turboalternator, fuel cell and battery conversion devices. Open systems included nuclear or combustion heat sources using turboalternator, magnetohydrodynamic, fuel cell or battery power conversion devices. The steady state systems with storage used the SP-100 or Star-M reactors as energy sources and flywheels, fuel cells or batteries to store energy for burst applications. As with other studies, the open systems are by far the lightest, most compact and simplest (most reliable) systems. However, unlike other studies, the SPAS studied potential platform operational problems caused by effluents or vibration.

  13. An Open-Source Bayesian Atmospheric Radiative Transfer (BART) Code, with Application to WASP-12b

    NASA Astrophysics Data System (ADS)

    Harrington, Joseph; Blecic, Jasmina; Cubillos, Patricio; Rojo, Patricio; Loredo, Thomas J.; Bowman, M. Oliver; Foster, Andrew S. D.; Stemm, Madison M.; Lust, Nate B.

    2015-01-01

    Atmospheric retrievals for solar-system planets typically fit, either with a minimizer or by eye, a synthetic spectrum to high-resolution (Δλ/λ ~ 1000-100,000) data with S/N > 100 per point. In contrast, exoplanet data often have S/N ~ 10 per point, and may have just a few points representing bandpasses larger than 1 um. To derive atmospheric constraints and robust parameter uncertainty estimates from such data requires a Bayesian approach. To date there are few investigators with the relevant codes, none of which are publicly available. We are therefore pleased to announce the open-source Bayesian Atmospheric Radiative Transfer (BART) code. BART uses a Bayesian phase-space explorer to drive a radiative-transfer model through the parameter phase space, producing the most robust estimates available for the thermal profile and chemical abundances in the atmosphere. We present an overview of the code and an initial application to Spitzer eclipse data for WASP-12b. We invite the community to use and improve BART via the open-source development site GitHub.com. This work was supported by NASA Planetary Atmospheres grant NNX12AI69G and NASA Astrophysics Data Analysis Program grant NNX13AF38G. JB holds a NASA Earth and Space Science Fellowship.
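
    The "Bayesian phase-space explorer" pattern can be illustrated with a toy Metropolis sampler on a one-parameter model. This is a generic stand-in, not BART's own sampler, and the Gaussian "posterior" is invented for the example (in a retrieval code, evaluating the log-posterior means running the radiative-transfer model against the data):

    ```python
    import math
    import random

    def metropolis(log_post, start, step, n, seed=0):
        """Minimal random-walk Metropolis sampler: propose a Gaussian step,
        accept with probability min(1, posterior ratio)."""
        rng = random.Random(seed)
        chain, x, lp = [], start, log_post(start)
        for _ in range(n):
            prop = x + rng.gauss(0.0, step)
            lp_prop = log_post(prop)
            if lp_prop - lp > math.log(rng.random()):  # accept/reject in log space
                x, lp = prop, lp_prop
            chain.append(x)
        return chain

    # Toy log-posterior: Gaussian with mean 5, sd 1 (e.g., one temperature parameter)
    log_post = lambda x: -0.5 * (x - 5.0) ** 2
    chain = metropolis(log_post, start=0.0, step=1.0, n=5000)
    ```

    The chain's histogram approximates the posterior, from which credible intervals for the thermal profile and abundance parameters would be read off in an actual retrieval.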

  14. An Open-Source Bayesian Atmospheric Radiative Transfer (BART) Code, and Application to WASP-12b

    NASA Astrophysics Data System (ADS)

    Harrington, Joseph; Blecic, Jasmina; Cubillos, Patricio; Rojo, Patricio M.; Loredo, Thomas J.; Bowman, Matthew O.; Foster, Andrew S.; Stemm, Madison M.; Lust, Nate B.

    2014-11-01

    Atmospheric retrievals for solar-system planets typically fit, either with a minimizer or by eye, a synthetic spectrum to high-resolution (Δλ/λ ~ 1000-100,000) data with S/N > 100 per point. In contrast, exoplanet data often have S/N ~ 10 per point, and may have just a few points representing bandpasses larger than 1 um. To derive atmospheric constraints and robust parameter uncertainty estimates from such data requires a Bayesian approach. To date there are few investigators with the relevant codes, none of which are publicly available. We are therefore pleased to announce the open-source Bayesian Atmospheric Radiative Transfer (BART) code. BART uses a Bayesian phase-space explorer to drive a radiative-transfer model through the parameter phase space, producing the most robust estimates available for the thermal profile and chemical abundances in the atmosphere. We present an overview of the code and an initial application to Spitzer eclipse data for WASP-12b. We invite the community to use and improve BART via the open-source development site GitHub.com. This work was supported by NASA Planetary Atmospheres grant NNX12AI69G and NASA Astrophysics Data Analysis Program grant NNX13AF38G. JB holds a NASA Earth and Space Science Fellowship.

  15. Theoretical investigation of gas-surface interactions

    NASA Technical Reports Server (NTRS)

    Dyall, Kenneth G.

    1992-01-01

    The investigation into the appearance of intruder states from the negative continuum when some of the two-electron integrals were omitted was completed. The work shows that, provided all integrals involving core contracted functions in an atomic general contraction are included, or that the core functions are radially localized, meaningful results are obtained and intruder states do not appear. In the area of program development, the Dirac-Hartree-Fock (DHF) program for closed-shell polyatomic molecules was extended to permit Kramers-restricted open-shell DHF calculations with one electron in an open shell or one hole in a closed shell, or state-averaged DHF calculations over several particle or hole doublet states. One application of the open-shell code was to the KO molecule. Another major area of program development is the transformation of integrals from the scalar basis in which they are generated to the 2-spinor basis employed in parts of the DHF program, and hence to supermatrix form. In particular, given the omission of small-component integrals and the increased availability of disk space, it is now possible to consider transforming the integrals. The use of ordered integrals, either in the scalar basis or in the 2-spinor basis, would considerably speed up the construction of the Fock matrix, and even more so if supermatrices were constructed. A considerable amount of effort was spent on analyzing the integral ordering and transformation for the DHF program. The work of assessing the reliability of the relativistic effective core potentials (RECPs) was continued with calculation of the group IV monoxides. The perturbation of the metal atom provided by oxygen is expected to be larger than that provided by hydrogen and thus provides a better test of the quality of the RECPs. Calculations on some platinum hydrides were carried out at the nonrelativistic (NR), perturbation theory (PT), and DHF levels. Reprints of four papers describing this work are included.

  16. Guide to NavyFOAM V1.0

    DTIC Science & Technology

    2011-04-01

    NavyFOAM has been developed using an open-source CFD software toolkit (OpenFOAM) that draws heavily upon object-oriented programming. The numerical methods and the physical models in the original version of OpenFOAM have been upgraded in an effort to improve accuracy and robustness of... Keywords: computational fluid dynamics (CFD), OpenFOAM, object-oriented programming (OOP), NavyFOAM.

  17. 76 FR 20799 - Intermediary Lending Pilot Program Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-13

    ... Pilot (ILP) program established by the Small Business Jobs Act of 2010. The meetings will be open to the... holding open meetings to discuss the ILP program established in the Small Business Jobs Act of 2010 (Pub... program public meeting, please contact: 1. San Francisco--Steve Bangs, (415) 744-6792, fax (415) 744-6812...

  18. Telemedicine Program

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Since the 1970s, NASA has been involved in the research and demonstration of telemedicine for its potential in the care of astronauts in flight and Earth-bound applications. A combination of NASA funding, expertise and off-the-shelf computer and networking systems made telemedicine possible for a medically underserved hospital in Texas. Through two-way audio/video relay, the program links pediatric oncology specialists at the University of Texas Health Science Center in San Antonio to South Texas Hospital in Harlingen, providing easier access and better care to children with cancer. Additionally, the hospital is receiving teleclinics on pediatric oncology nursing, family counseling and tuberculosis treatment. VTEL Corporation, Sprint, and the Healthcare Open Systems and Trials Consortium also contributed staff and hardware.

  19. NOBLAST and JAMBLAST: New Options for BLAST and a Java Application Manager for BLAST results.

    PubMed

    Lagnel, Jacques; Tsigenopoulos, Costas S; Iliopoulos, Ioannis

    2009-03-15

    NOBLAST (New Options for BLAST) is an open source program that provides a new user-friendly tabular output format for various NCBI BLAST programs (Blastn, Blastp, Blastx, Tblastn, Tblastx, Mega BLAST and Psi BLAST) without any use of a parser, and provides E-value correction when a segmented BLAST database is used. JAMBLAST, using the NOBLAST output, allows the user to manage, view and filter the BLAST hits using a number of selection criteria. A distribution package of NOBLAST and JAMBLAST including a detailed installation procedure is freely available from http://sourceforge.net/projects/JAMBLAST/ and http://sourceforge.net/projects/NOBLAST. Supplementary data are available at Bioinformatics online.
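Tabular BLAST output of the kind described here is straightforward to consume programmatically. A sketch, assuming the standard 12-column tabular layout (NOBLAST's exact column set may differ; the field names and filter thresholds below are illustrative assumptions):

```python
# Assumed column order, matching NCBI BLAST's common tabular format.
FIELDS = ["qseqid", "sseqid", "pident", "length", "mismatch", "gapopen",
          "qstart", "qend", "sstart", "send", "evalue", "bitscore"]

def parse_hit(line):
    """Split one tab-delimited hit line into a field dict."""
    hit = dict(zip(FIELDS, line.rstrip("\n").split("\t")))
    # Numeric conversions for the fields used in filtering
    hit["pident"] = float(hit["pident"])
    hit["evalue"] = float(hit["evalue"])
    hit["bitscore"] = float(hit["bitscore"])
    return hit

def filter_hits(lines, max_evalue=1e-5, min_identity=90.0):
    """Keep hits passing simple selection criteria, in the spirit of
    JAMBLAST's hit filtering."""
    return [h for h in (parse_hit(l) for l in lines)
            if h["evalue"] <= max_evalue and h["pident"] >= min_identity]
```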

  20. Application Period Open for NCI Biospecimen Use | Division of Cancer Prevention

    Cancer.gov

    The application period for investigators interested in obtaining biospecimens and data from the Prostate, Lung, Colorectal and Ovarian (PLCO) Cancer Screening Trial re-opened June 1. A separate application for obtaining biospecimens and data with research funding is also open. |

  1. An Assessment of Educational Benefits from the OpenOrbiter Space Program

    ERIC Educational Resources Information Center

    Straub, Jeremy; Whalen, David

    2013-01-01

    This paper analyzes the educational impact of the OpenOrbiter Small Spacecraft Development Initiative, a CubeSat development program underway at the University of North Dakota. OpenOrbiter includes traditional STEM activities (e.g., spacecraft engineering, software development); it also incorporates students from non-STEM disciplines not generally…

  2. Integrating an Automatic Judge into an Open Source LMS

    ERIC Educational Resources Information Center

    Georgouli, Katerina; Guerreiro, Pedro

    2011-01-01

    This paper presents the successful integration of the evaluation engine of Mooshak into the open source learning management system Claroline. Mooshak is an open source online automatic judge that has been used for international and national programming competitions. although it was originally designed for programming competitions, Mooshak has also…

  3. 47 CFR 76.1503 - Carriage of video programming providers on open video systems.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 4 2011-10-01 2011-10-01 false Carriage of video programming providers on open video systems. 76.1503 Section 76.1503 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) BROADCAST RADIO SERVICES MULTICHANNEL VIDEO AND CABLE TELEVISION SERVICE Open Video Systems § 76.1503...

  4. 47 CFR 76.1503 - Carriage of video programming providers on open video systems.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 4 2012-10-01 2012-10-01 false Carriage of video programming providers on open video systems. 76.1503 Section 76.1503 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) BROADCAST RADIO SERVICES MULTICHANNEL VIDEO AND CABLE TELEVISION SERVICE Open Video Systems § 76.1503...

  5. 47 CFR 76.1503 - Carriage of video programming providers on open video systems.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 4 2014-10-01 2014-10-01 false Carriage of video programming providers on open video systems. 76.1503 Section 76.1503 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) BROADCAST RADIO SERVICES MULTICHANNEL VIDEO AND CABLE TELEVISION SERVICE Open Video Systems § 76.1503...

  6. 47 CFR 76.1503 - Carriage of video programming providers on open video systems.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 4 2010-10-01 2010-10-01 false Carriage of video programming providers on open video systems. 76.1503 Section 76.1503 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) BROADCAST RADIO SERVICES MULTICHANNEL VIDEO AND CABLE TELEVISION SERVICE Open Video Systems § 76.1503...

  7. 47 CFR 76.1503 - Carriage of video programming providers on open video systems.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 4 2013-10-01 2013-10-01 false Carriage of video programming providers on open video systems. 76.1503 Section 76.1503 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) BROADCAST RADIO SERVICES MULTICHANNEL VIDEO AND CABLE TELEVISION SERVICE Open Video Systems § 76.1503...

  8. 78 FR 33853 - Announcement for the National Registry of Evidence-Based Programs and Practices (NREPP): Open...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-05

    ... Practices (NREPP): Open Submission Period for Fiscal Year 2014 Background The Substance Abuse and Mental... DEPARTMENT OF HEALTH AND HUMAN SERVICES Substance Abuse and Mental Health Services Administration (SAMHSA) Announcement for the National Registry of Evidence-Based Programs and Practices (NREPP): Open...

  9. JASPAR RESTful API: accessing JASPAR data from any programming language.

    PubMed

    Khan, Aziz; Mathelier, Anthony

    2018-05-01

    JASPAR is a widely used open-access database of curated, non-redundant transcription factor binding profiles. Currently, data from JASPAR can be retrieved as flat files or by using programming language-specific interfaces. Here, we present a programming language-independent application programming interface (API) to access JASPAR data using the Representational State Transfer (REST) architecture. The REST API enables programmatic access to JASPAR by most programming languages and returns data in eight widely used formats. Several endpoints are available to access the data and an endpoint is available to infer the TF binding profile(s) likely bound by a given DNA binding domain protein sequence. Additionally, it provides an interactive browsable interface for bioinformatics tool developers. This REST API is implemented in Python using the Django REST Framework. It is accessible at http://jaspar.genereg.net/api/ and the source code is freely available at https://bitbucket.org/CBGR/jaspar under GPL v3 license. aziz.khan@ncmm.uio.no or anthony.mathelier@ncmm.uio.no. Supplementary data are available at Bioinformatics online.
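Because the interface is plain REST over HTTP, a request URL can be assembled and a JSON response parsed with nothing but a standard library. A sketch (the endpoint path and field names are assumptions modeled on the abstract, not verified against the live API, and the "response" here is a hard-coded stand-in rather than a network call):

```python
import json
from urllib.parse import urlencode

BASE = "http://jaspar.genereg.net/api/v1"

def matrix_url(matrix_id, fmt="json"):
    """URL for a single binding-profile record (path layout assumed)."""
    return f"{BASE}/matrix/{matrix_id}/?{urlencode({'format': fmt})}"

# Offline demonstration: parse a response body shaped like a profile record.
sample_body = json.dumps({"matrix_id": "MA0004.1", "name": "Arnt",
                          "collection": "CORE"})
record = json.loads(sample_body)
```

In practice the URL would be fetched with any HTTP client (the point of a REST API being that every programming language has one), and the same JSON parsing applies to the real response.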

  10. A suicide awareness and intervention program for health professional students.

    PubMed

    De Silva, Eve; Bowerman, Lisa; Zimitat, Craig

    2015-01-01

    Many emergency service professionals and health professionals play important roles in the assessment and management of suicide risk but often receive inadequate mental health training in this area. A 'Suicide Awareness and Intervention Program' (SAIP) was developed for first-year medical, paramedical and pharmacy students at the University of Tasmania, Australia. The program aimed to increase students' knowledge and awareness about suicide-related issues, develop interpersonal skills around suicide screening, and increase awareness of available support services. A 5-hour experiential SAIP was embedded within the curriculum. A pre- and post-program evaluation of knowledge, skills and attitudes was conducted, with an open-ended follow-up survey regarding use of what was learned in the program. Pre- and post-SAIP surveys showed significant improvement in knowledge and practical skills. Feedback from students and the counselling service indicated an enduring impact of the program. Participation in the SAIP increased knowledge, skills and attitudes related to the assessment and management of individuals at risk for suicide, and the application of this ability to students' personal and professional lives.

  11. Abriendo Puertas/Opening Doors Parenting Program: Summary Report of Program Implementation and Impacts. Publication #2014-24

    ERIC Educational Resources Information Center

    Moore, Kristin A.; Caal, Selma; Lawner, Elizabeth K.; Rojas, Angela; Walker, Karen

    2014-01-01

    Child Trends conducted a random assignment evaluation of the Abriendo Puertas/Opening Doors program, one of the largest programs in the United States working with low-income Latino parents of children ages zero to five. Since it began in 2007, the program has served over 30,000 parents/families in 34 states. The evaluation study examined the…

  12. Speech-language therapy program for mouth opening in patients with oral and oropharyngeal cancer undergoing adjuvant radiotherapy: a pilot study.

    PubMed

    Marrafon, Caroline Somera; Matos, Leandro Luongo; Simões-Zenari, Marcia; Cernea, Claudio Roberto; Nemr, Katia

    2018-01-01

    Purpose: Assess the effectiveness of an orofacial myofunctional therapeutic program in patients with oral or oropharyngeal cancer submitted to adjuvant radiotherapy through pre- and post-program comparison of maximum mandibular opening. Methods: Prospective study involving five adult and five elderly patients recovering from oral cavity/oropharynx surgery who were awaiting the beginning of radiotherapy or had undergone fewer than five treatment sessions. The study participants had their maximum jaw opening measured using a sliding caliper at the beginning and end of the program. Two mobility exercises and three mandibular traction exercises were selected and monitored weekly in person for 10 weeks. Descriptive data and pre- and post-therapy comparative measures were statistically analyzed using the Wilcoxon test. Results: Ten patients (two women and eight men) with a mean age of 58.4 years (median 57.0 years) completed the therapeutic program. They presented mean maximum mandibular opening of 31.6 ± 11.7 and 36.4 ± 8.0 mm pre- and post-therapy, respectively (p = 0.021). Conclusion: The proposed orofacial myofunctional therapeutic program increased the maximum jaw opening of patients referred to adjuvant radiotherapy for oral cavity or oropharynx cancer treatment.

  13. Solvent-programmed microchip open-channel electrochromatography.

    PubMed

    Kutter, J P; Jacobson, S C; Matsubara, N; Ramsey, J M

    1998-08-01

    Open-channel electrochromatography in combination with solvent programming is demonstrated using a microchip device. Channel walls were coated with octadecylsilanes at ambient temperatures, yielding stationary phases for chromatographic separations of neutral dyes. The electroosmotic flow after coating was sufficient to ensure transport of all species and on-chip mixing of isocratic and gradient elution conditions with acetonitrile-buffer mixtures. Chips having different channel depths between 10.2 and 2.9 μm were evaluated for performance, and van Deemter plots were established. Channel depths of about 5 μm were found to be a good compromise between efficiency and ease of operation. Isocratic and gradient elution conditions were easily established and manipulated by computer-controlled application of voltages to the terminals of the microchip. Linear gradients with different slopes, start times, duration times, and start percentages of organic modifier proved to be powerful tools to tune selectivity and analysis time for the separation of a test mixture. Even very steep gradients still produced excellent efficiencies. Together with fast reconditioning times, complete runs could be finished in under 60 s.
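The gradient parameters the abstract enumerates (slope, start time, duration, start percentage of organic modifier) map directly onto a piecewise-linear schedule. A sketch of that mapping (an illustration of the schedule itself, not of the chip's voltage-control software):

```python
def gradient_percent(t, start_time, duration, start_pct, end_pct):
    """Percent organic modifier at time t (seconds) for a linear solvent
    gradient that holds start_pct until start_time, ramps linearly for
    `duration` seconds, then holds end_pct."""
    if t <= start_time:
        return start_pct
    if t >= start_time + duration:
        return end_pct
    frac = (t - start_time) / duration
    return start_pct + frac * (end_pct - start_pct)
```

Changing `duration` changes the gradient slope, and changing `start_time` or `start_pct` shifts the schedule, which is how the abstract's "linear gradients with different slopes, start times, duration times, and start percentages" tune selectivity and analysis time.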

  14. Investigation of air transportation technology at Princeton University, 1986

    NASA Technical Reports Server (NTRS)

    Stengel, Robert F.

    1988-01-01

    The Air Transportation Technology Program at Princeton proceeded along four avenues: Guidance and control strategies for penetration of microbursts and wind shear; Application of artificial intelligence in flight control systems; Computer aided control system design; and Effects of control saturation on closed loop stability and response of open loop unstable aircraft. Areas of investigation relate to guidance and control of commercial transports as well as general aviation aircraft. Interaction between the flight crew and automatic systems is a subject of prime concern.

  15. Naval Open Architecture Contract Guidebook for Program Managers

    DTIC Science & Technology

    2010-06-30

    a whole, transform inputs into outputs. [IEEE/EIA Std. 12207/1997] "AP233/ISO 10303", AP233, an "Application Protocol" for Systems Engineering...Language Metadata Interchange (XMI) and AP233/ISO 10303). The contractor shall identify the proposed standards and formats to be used. The contractor...ANSI ISO/IEC 9075-1, ISO/IEC 9075-2, ISO/IEC 9075-3, ISO/IEC 9075-4, ISO/IEC 9075-5) 2. HTML for presentation layer (e.g., XML 1.0

  16. The application of nanotechnology in medicine: treatment and diagnostics.

    PubMed

    Owen, Andrew; Dufès, Christine; Moscatelli, Davide; Mayes, Eric; Lovell, Jonathan F; Katti, Kattesh V; Sokolov, Konstantin; Mazza, Mariarosa; Fontaine, Olivier; Rannard, Steve; Stone, Vicki

    2014-07-01

    Nanomedicine 2014 Edinburgh, UK, 26-27 March 2014 The British Society for Nanomedicine (BSNM), in collaboration with SELECTBIO, organized Nanomedicine 2014. BSNM is a registered charity created to allow open access for industry, academia, clinicians and the public to news and details of ongoing nanomedicine research. The Nanomedicine 2014 program provided insight across a number of emerging nanotechnologies spanning treatment to diagnostics. A key objective of the meeting was provision of opportunities to build collaborations and rationalize nanoenabled healthcare solutions.

  17. Open Integrated Personal Learning Environment: Towards a New Conception of the ICT-Based Learning Processes

    NASA Astrophysics Data System (ADS)

    Conde, Miguel Ángel; García-Peñalvo, Francisco José; Casany, Marià José; Alier Forment, Marc

    Learning processes are changing in step with technological and sociological evolution, and a new learning strategy must therefore be considered. Specifically, what is needed is an effective step towards the consolidation of eLearning 2.0 environments. This implies fusing the advantages of the traditional LMS (Learning Management System), which is oriented towards formative program control and planning, with the social learning and flexibility of Web 2.0 educational applications.

  18. Current trends for customized biomedical software tools.

    PubMed

    Khan, Haseeb Ahmad

    2017-01-01

    In the past, biomedical scientists were solely dependent on expensive commercial software packages for various applications. However, the advent of user-friendly programming languages and open source platforms has revolutionized the development of simple and efficient customized software tools for solving specific biomedical problems. Many of these tools are designed and developed by biomedical scientists independently or with the support of computer experts and often made freely available for the benefit of scientific community. The current trends for customized biomedical software tools are highlighted in this short review.

  19. Integrating UniTree with the data migration API

    NASA Technical Reports Server (NTRS)

    Schrodel, David G.

    1994-01-01

    The Data Migration Application Programming Interface (DMAPI) has the potential to allow developers of open systems Hierarchical Storage Management (HSM) products to virtualize native file systems without the requirement to make changes to the underlying operating system. This paper describes advantages of virtualizing native file systems in hierarchical storage management systems, the DMAPI at a high level, what the goals are for the interface, and the integration of the Convex UniTree+HSM with DMAPI along with some of the benefits derived in the resulting product.

  20. Verification Tools Secure Online Shopping, Banking

    NASA Technical Reports Server (NTRS)

    2010-01-01

    Just like rover or rocket technology sent into space, the software that controls these technologies must be extensively tested to ensure reliability and effectiveness. Ames Research Center invented the open-source Java Pathfinder (JPF) toolset for the deep testing of Java-based programs. Fujitsu Labs of America Inc., based in Sunnyvale, California, improved the capabilities of the JPF Symbolic Pathfinder tool, establishing the tool as a means of thoroughly testing the functionality and security of Web-based Java applications such as those used for Internet shopping and banking.

  1. Building an open-source robotic stereotaxic instrument.

    PubMed

    Coffey, Kevin R; Barker, David J; Ma, Sisi; West, Mark O

    2013-10-29

    This protocol includes the designs and software necessary to upgrade an existing stereotaxic instrument to a robotic (CNC) stereotaxic instrument for around $1,000 (excluding a drill), using industry-standard stepper motors and CNC controlling software. Each axis has variable speed control and may be operated simultaneously or independently. The robot's flexibility and open coding system (g-code) make it capable of performing custom tasks that are not supported by commercial systems. Its applications include, but are not limited to, drilling holes, sharp-edge craniotomies, skull thinning, and lowering electrodes or cannulae. In order to expedite the writing of g-code for simple surgeries, we have developed custom scripts that allow individuals to design a surgery with no knowledge of programming. However, for users to get the most out of the motorized stereotax, it would be beneficial to be knowledgeable in mathematical programming and g-code (simple programming for CNC machining). The recommended drill speed is greater than 40,000 rpm. The stepper motor resolution is 1.8°/step, geared to 0.346°/step. A standard stereotax has a resolution of 2.88 μm/step. The maximum recommended cutting speed is 500 μm/sec. The maximum recommended jogging speed is 3,500 μm/sec. The maximum recommended drill bit size is HP 2.
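G-code of the kind the robot consumes is just text, so a move such as a slow electrode descent can be generated programmatically. A sketch that respects the protocol's recommended maximum cutting speed of 500 μm/sec (the axis naming, Z for the dorsoventral axis, and the helper name are assumptions; this is not the authors' script):

```python
MAX_CUTTING_SPEED_UM_S = 500  # from the protocol's recommendations

def lower_electrode(depth_um, speed_um_s=200):
    """Return g-code lines for a controlled electrode descent of depth_um
    micrometers at speed_um_s micrometers per second."""
    if speed_um_s > MAX_CUTTING_SPEED_UM_S:
        raise ValueError("speed exceeds recommended maximum cutting speed")
    feed_mm_min = speed_um_s * 60 / 1000.0  # um/s -> mm/min feed rate
    depth_mm = depth_um / 1000.0
    return [
        "G21",                                       # millimeter units
        "G91",                                       # relative positioning
        f"G1 Z-{depth_mm:.3f} F{feed_mm_min:.1f}",   # controlled descent
    ]
```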

  2. ProteinShader: illustrative rendering of macromolecules

    PubMed Central

    Weber, Joseph R

    2009-01-01

    Background Cartoon-style illustrative renderings of proteins can help clarify structural features that are obscured by space filling or balls and sticks style models, and recent advances in programmable graphics cards offer many new opportunities for improving illustrative renderings. Results The ProteinShader program, a new tool for macromolecular visualization, uses information from Protein Data Bank files to produce illustrative renderings of proteins that approximate what an artist might create by hand using pen and ink. A combination of Hermite and spherical linear interpolation is used to draw smooth, gradually rotating three-dimensional tubes and ribbons with a repeating pattern of texture coordinates, which allows the application of texture mapping, real-time halftoning, and smooth edge lines. This free platform-independent open-source program is written primarily in Java, but also makes extensive use of the OpenGL Shading Language to modify the graphics pipeline. Conclusion By programming to the graphics processor unit, ProteinShader is able to produce high quality images and illustrative rendering effects in real-time. The main feature that distinguishes ProteinShader from other free molecular visualization tools is its use of texture mapping techniques that allow two-dimensional images to be mapped onto the curved three-dimensional surfaces of ribbons and tubes with minimum distortion of the images. PMID:19331660
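The spherical linear interpolation that the abstract pairs with Hermite interpolation for drawing gradually rotating tubes can be sketched in a few lines. This is a plain-Python stand-in for the shader-side math (ProteinShader itself is Java and GLSL):

```python
import math

def slerp(u, v, t):
    """Spherical linear interpolation between unit vectors u and v,
    t in [0, 1]: rotates at constant angular rate from u to v."""
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(u, v))))
    theta = math.acos(dot)
    if theta < 1e-9:  # nearly parallel: fall back to linear interpolation
        return [a + t * (b - a) for a, b in zip(u, v)]
    s = math.sin(theta)
    w1 = math.sin((1 - t) * theta) / s
    w2 = math.sin(t * theta) / s
    return [w1 * a + w2 * b for a, b in zip(u, v)]
```

Unlike plain linear interpolation, the result stays on the unit sphere, which is what keeps the tube's orientation frame rotating smoothly rather than shrinking mid-segment.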

  3. The development and performance of smud grid-connected photovoltaic projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osborn, D.E.; Collier, D.E.

    1995-11-01

    The utility grid-connected market has been identified as a key market to be developed to accelerate the commercialization of photovoltaics. The Sacramento Municipal Utility District (SMUD) has completed the first two years of a continuing commercialization effort based on the sustained, orderly development of the grid-connected, utility PV market. This program is aimed at developing the experience needed to successfully integrate PV as distributed generation into the utility system and to stimulate the collaborative processes needed to accelerate the cost reductions necessary for PV to be cost-effective in these applications by the year 2000. In the first two years, SMUD has installed over 240 residential and commercial building, grid-connected, rooftop "PV Pioneer" systems totaling over 1 MW of capacity and four substation-sited, grid-support PV systems totaling 600 kW, bringing the SMUD distributed PV power systems to over 3.7 MW. The 1995 SMUD PV Program will add approximately another 800 kW of PV systems to the District's distributed PV power system. SMUD also established a partnership with its customers through the PV Pioneer "green pricing" program to advance PV commercialization.

  4. Portfolio-Scale Optimization of Customer Energy Efficiency Incentive and Marketing: Cooperative Research and Development Final Report, CRADA Number CRD-13-535

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brackney, Larry J.

    Northeast utility National Grid (NGrid) is developing a portfolio-scale application of OpenStudio designed to optimize incentive and marketing expenditures for their energy efficiency (EE) programs. NGrid wishes to leverage a combination of geographic information systems (GIS), public records, customer data, and content from the Building Component Library (BCL) to form a JavaScript Object Notation (JSON) input file that is consumed by an OpenStudio-based expert system for automated model generation. A baseline model for each customer building will be automatically tuned using electricity and gas consumption data, and a set of energy conservation measures (ECMs) associated with each NGrid incentive program will be applied to the model. The simulated energy performance and return on investment (ROI) will be compared with customer hurdle rates and available incentives to A) optimize the incentive required to overcome the customer hurdle rate and B) determine if marketing activity associated with the specific ECM is warranted for that particular customer. Repeated across their portfolio, this process will enable NGrid to substantially optimize their marketing and incentive expenditures, targeting those customers that will likely adopt and benefit from specific EE programs.
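The incentive-optimization step described above reduces, in its simplest form, to finding the smallest incentive that lifts a measure's return on investment over the customer's hurdle rate. A sketch under that simplification (all names, numbers, and the simple-ROI definition are hypothetical; the actual tool compares simulated energy performance, not a fixed savings figure):

```python
def minimum_incentive(install_cost, annual_savings, hurdle_rate,
                      max_incentive, step=100.0):
    """Smallest incentive (in $step increments) such that
    annual_savings / (install_cost - incentive) >= hurdle_rate,
    or None if even max_incentive falls short."""
    incentive = 0.0
    while incentive <= max_incentive:
        net_cost = install_cost - incentive
        roi = annual_savings / net_cost if net_cost > 0 else float("inf")
        if roi >= hurdle_rate:
            return incentive
        incentive += step
    return None
```

Run per customer and per ECM, a calculation like this yields both outputs the abstract names: the optimized incentive (A) and, via the None case, whether marketing the measure to that customer is warranted at all (B).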

  5. Teaching Introductory GIS Programming to Geographers Using an Open Source Python Approach

    ERIC Educational Resources Information Center

    Etherington, Thomas R.

    2016-01-01

    Computer programming is not commonly taught to geographers as a part of geographic information system (GIS) courses, but the advent of NeoGeography, big data and open GIS means that programming skills are becoming more important. To encourage the teaching of programming to geographers, this paper outlines a course based around a series of…

  6. Implementing OpenMRS for patient monitoring in an HIV/AIDS care and treatment program in rural Mozambique.

    PubMed

    Manders, Eric-Jan; José, Eurico; Solis, Manuel; Burlison, Janeen; Nhampossa, José Leopoldo; Moon, Troy

    2010-01-01

    We have adopted the Open Medical Record System (OpenMRS) framework to implement an electronic patient monitoring system for an HIV care and treatment program in Mozambique. The program provides technical assistance to the Ministry of Health supporting the scale up of integrated HIV care and support services in health facilities in rural resource limited settings. The implementation is in use for adult and pediatric programs, with ongoing roll-out to cover all supported sites. We describe early experiences in adapting the system to the program needs, addressing infrastructure challenges, creating a regional support team, training data entry staff, migrating a legacy database, deployment, and current use. We find that OpenMRS offers excellent prospects for in-country development of health information systems, even in severely resource limited settings. However, it also requires considerable organizational infrastructure investment and technical capacity building to ensure continued local support.

  7. BEAGLE: an application programming interface and high-performance computing library for statistical phylogenetics.

    PubMed

    Ayres, Daniel L; Darling, Aaron; Zwickl, Derrick J; Beerli, Peter; Holder, Mark T; Lewis, Paul O; Huelsenbeck, John P; Ronquist, Fredrik; Swofford, David L; Cummings, Michael P; Rambaut, Andrew; Suchard, Marc A

    2012-01-01

    Phylogenetic inference is fundamental to our understanding of most aspects of the origin and evolution of life, and in recent years, there has been a concentration of interest in statistical approaches such as Bayesian inference and maximum likelihood estimation. Yet, for large data sets and realistic or interesting models of evolution, these approaches remain computationally demanding. High-throughput sequencing can yield data for thousands of taxa, but scaling to such problems using serial computing often necessitates the use of nonstatistical or approximate approaches. The recent emergence of graphics processing units (GPUs) provides an opportunity to leverage their excellent floating-point computational performance to accelerate statistical phylogenetic inference. A specialized library for phylogenetic calculation would allow existing software packages to make more effective use of available computer hardware, including GPUs. Adoption of a common library would also make it easier for other emerging computing architectures, such as field programmable gate arrays, to be used in the future. We present BEAGLE, an application programming interface (API) and library for high-performance statistical phylogenetic inference. The API provides a uniform interface for performing phylogenetic likelihood calculations on a variety of compute hardware platforms. The library includes a set of efficient implementations and can currently exploit hardware including GPUs using NVIDIA CUDA, central processing units (CPUs) with Streaming SIMD Extensions and related processor supplementary instruction sets, and multicore CPUs via OpenMP. To demonstrate the advantages of a common API, we have incorporated the library into several popular phylogenetic software packages. The BEAGLE library is free open source software licensed under the Lesser GPL and available from http://beagle-lib.googlecode.com. An example client program is available as public domain software.
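The phylogenetic likelihood calculation that a library like BEAGLE accelerates is the combination of per-site partial likelihoods up a tree (Felsenstein pruning). A toy sketch of that core computation for a single site and a two-tip "cherry" under the Jukes-Cantor model (pure Python for illustration; BEAGLE's actual API is a C library, and none of the names below are from it):

```python
import math

def jc_prob(i, j, t):
    """Jukes-Cantor transition probability between nucleotide states
    i, j in 0..3 over branch length t (expected substitutions/site)."""
    same = 0.25 + 0.75 * math.exp(-4.0 * t / 3.0)
    diff = 0.25 - 0.25 * math.exp(-4.0 * t / 3.0)
    return same if i == j else diff

def cherry_site_likelihood(tip1, tip2, t1, t2):
    """Likelihood of one site for two tips joined at a root, summing over
    the unobserved root state with the uniform stationary distribution."""
    total = 0.0
    for root in range(4):
        total += 0.25 * jc_prob(root, tip1, t1) * jc_prob(root, tip2, t2)
    return total
```

A full inference repeats this combine step for every internal node, every site, and every proposed tree, which is why offloading it to GPUs or SIMD units pays off so heavily.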

  8. BEAGLE: An Application Programming Interface and High-Performance Computing Library for Statistical Phylogenetics

    PubMed Central

    Ayres, Daniel L.; Darling, Aaron; Zwickl, Derrick J.; Beerli, Peter; Holder, Mark T.; Lewis, Paul O.; Huelsenbeck, John P.; Ronquist, Fredrik; Swofford, David L.; Cummings, Michael P.; Rambaut, Andrew; Suchard, Marc A.

    2012-01-01

    Phylogenetic inference is fundamental to our understanding of most aspects of the origin and evolution of life, and in recent years, there has been a concentration of interest in statistical approaches such as Bayesian inference and maximum likelihood estimation. Yet, for large data sets and realistic or interesting models of evolution, these approaches remain computationally demanding. High-throughput sequencing can yield data for thousands of taxa, but scaling to such problems using serial computing often necessitates the use of nonstatistical or approximate approaches. The recent emergence of graphics processing units (GPUs) provides an opportunity to leverage their excellent floating-point computational performance to accelerate statistical phylogenetic inference. A specialized library for phylogenetic calculation would allow existing software packages to make more effective use of available computer hardware, including GPUs. Adoption of a common library would also make it easier for other emerging computing architectures, such as field programmable gate arrays, to be used in the future. We present BEAGLE, an application programming interface (API) and library for high-performance statistical phylogenetic inference. The API provides a uniform interface for performing phylogenetic likelihood calculations on a variety of compute hardware platforms. The library includes a set of efficient implementations and can currently exploit hardware including GPUs using NVIDIA CUDA, central processing units (CPUs) with Streaming SIMD Extensions and related processor supplementary instruction sets, and multicore CPUs via OpenMP. To demonstrate the advantages of a common API, we have incorporated the library into several popular phylogenetic software packages. The BEAGLE library is free open source software licensed under the Lesser GPL and available from http://beagle-lib.googlecode.com. An example client program is available as public domain software.
PMID:21963610
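The likelihood computation that a library like BEAGLE accelerates can be illustrated with a minimal sketch of Felsenstein's pruning algorithm under the Jukes-Cantor model. This is an independent toy implementation for illustration only, not BEAGLE's actual API; the tree, function names, and branch lengths are hypothetical.

```python
# Toy sketch of the phylogenetic likelihood calculation that BEAGLE-style
# libraries accelerate: Felsenstein's pruning algorithm for one site
# pattern on a tiny rooted tree, under the Jukes-Cantor (JC69) model.
import math

def jc69_transition(t):
    """4x4 Jukes-Cantor transition-probability matrix for branch length t."""
    same = 0.25 + 0.75 * math.exp(-4.0 * t / 3.0)
    diff = 0.25 - 0.25 * math.exp(-4.0 * t / 3.0)
    return [[same if i == j else diff for j in range(4)] for i in range(4)]

def tip_partial(state):
    """Partial likelihoods at a tip: 1 for the observed base, 0 elsewhere."""
    return [1.0 if i == state else 0.0 for i in range(4)]

def combine(children):
    """Pruning step: merge (partials, branch length) pairs into the parent."""
    parent = [1.0] * 4
    for partial, t in children:
        p = jc69_transition(t)
        for i in range(4):
            parent[i] *= sum(p[i][j] * partial[j] for j in range(4))
    return parent

# Two-taxon tree: observed states A (0) and C (1), branches of length 0.1.
root = combine([(tip_partial(0), 0.1), (tip_partial(1), 0.1)])
likelihood = sum(0.25 * root[i] for i in range(4))  # uniform base frequencies
```

In a production library these per-node products and sums are vectorized across sites and dispatched to GPU or SIMD backends, which is exactly the uniform-interface role the abstract describes.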

  9. OpenDanubia - An integrated, modular simulation system to support regional water resource management

    NASA Astrophysics Data System (ADS)

    Muerth, M.; Waldmann, D.; Heinzeller, C.; Hennicker, R.; Mauser, W.

    2012-04-01

    The multi-disciplinary research project GLOWA-Danube, now completed, developed a regional-scale, integrated modeling system that was successfully applied to the 77,000 km2 Upper Danube basin to investigate the impact of Global Change on both the natural and the anthropogenic water cycle. At the end of the last project phase, the integrated modeling system was transferred into the open source project OpenDanubia, which now provides both the core system and all major model components to the general public. First, this enables decision makers from government, business and management to use OpenDanubia as a tool for proactive management of water resources in the context of global change. Second, the model framework for integrated simulations and all simulation models developed for OpenDanubia within GLOWA-Danube remain available for future developments and research questions. OpenDanubia allows for the investigation of water-related scenarios considering different ecological and economic aspects, supporting both scientists and policy makers in designing policies for sustainable environmental management. OpenDanubia is designed as a framework-based, distributed system. The model system couples spatially distributed physical and socio-economic processes at run time, taking into account their mutual influence. To simulate the potential future impacts of Global Change on agriculture, industrial production, water supply, households and tourism businesses, so-called deep actor models are implemented in OpenDanubia. All important water-related fluxes and storages in the natural environment are implemented in OpenDanubia as spatially explicit, process-based modules. This includes the land surface water and energy balance, dynamic plant water uptake, groundwater recharge and flow, as well as river routing and reservoirs.
Although the complete system is relatively demanding in terms of data and hardware requirements, the modular structure and the generic core system (Core Framework, Actor Framework) allow application in new regions and the selection of a reduced set of modules for simulation. As part of the Open Source Initiative in GLOWA-Danube (opendanubia.glowa-danube.de), comprehensive documentation for system installation was created, and the program code of both the framework and all major components is licensed under the GNU General Public License. In addition, helpful programs and scripts for the operation and processing of input and result data sets are provided.
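The run-time coupling of physical and socio-economic modules described above can be sketched, under heavy simplification, as two components that exchange state each time step through a coupling loop. All class names, interfaces, and numbers here are hypothetical illustrations, not OpenDanubia's actual framework API.

```python
# Hypothetical sketch of run-time coupling between a physical module and a
# socio-economic "deep actor" module, in the spirit of a coupled framework
# like OpenDanubia. Names and numbers are illustrative only.

class HydrologyModule:
    """Toy land-surface water balance: storage fed by rain, drained by use."""
    def __init__(self):
        self.storage = 100.0  # mm of available water

    def step(self, rainfall, withdrawal):
        self.storage = max(0.0, self.storage + rainfall - withdrawal)
        return self.storage

class WaterUserActor:
    """Toy actor model: demand shrinks when storage runs low."""
    def demand(self, storage):
        return 10.0 if storage > 50.0 else 2.0

def run_coupled(steps, rainfall_series):
    """Advance both modules in lockstep, exchanging state every time step."""
    hydro, actor = HydrologyModule(), WaterUserActor()
    history, withdrawal = [], 0.0
    for rain in rainfall_series[:steps]:
        storage = hydro.step(rain, withdrawal)   # physical process
        withdrawal = actor.demand(storage)       # socio-economic response
        history.append((storage, withdrawal))
    return history

history = run_coupled(4, [5.0, 0.0, 0.0, 0.0])
```

The point of the sketch is the mutual influence: the actor's withdrawal feeds back into the next physical step, which is the coupling behavior the framework core mediates between distributed modules.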

  10. Online characterization of planetary surfaces: PlanetServer, an open-source analysis and visualization tool

    NASA Astrophysics Data System (ADS)

    Marco Figuera, R.; Pham Huu, B.; Rossi, A. P.; Minin, M.; Flahaut, J.; Halder, A.

    2018-01-01

    The lack of open-source tools for hyperspectral data visualization and analysis creates a demand for new tools. In this paper we present the new PlanetServer, a set of tools comprising a web Geographic Information System (GIS) and a recently developed Python Application Programming Interface (API) capable of visualizing and analyzing a wide variety of hyperspectral data from different planetary bodies. Current open-source WebGIS tools are evaluated to give an overview and to contextualize how PlanetServer can help in these matters. The web client is described in detail, as are the datasets available in PlanetServer. The Python API is also described, together with the rationale for its development. Two examples of mineral characterization of hydrosilicates such as chlorites, prehnites and kaolinites in the Nili Fossae area on Mars are presented. As the results compare favorably with previous literature for hyperspectral analysis and visualization, we suggest using the PlanetServer approach for such investigations.
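The spectral arithmetic behind this kind of mineral characterization can be sketched with a generic band-depth calculation, a standard hyperspectral technique for detecting absorption features of hydrated minerals. This is not PlanetServer's actual Python API; the function, wavelengths, and reflectances below are synthetic illustrations.

```python
# Illustrative band-depth computation of the kind used to map hydrated
# minerals (e.g. absorptions near 2.3 um for chlorites) in hyperspectral
# data. Generic spectral arithmetic, not PlanetServer's actual API.

def band_depth(wavelengths, reflectances, left, center, right):
    """1 - R_center / R_continuum, continuum interpolated between shoulders."""
    def at(w):
        # Nearest-sample lookup for the requested wavelength.
        i = min(range(len(wavelengths)), key=lambda k: abs(wavelengths[k] - w))
        return reflectances[i]
    r_left, r_center, r_right = at(left), at(center), at(right)
    frac = (center - left) / (right - left)
    continuum = r_left + frac * (r_right - r_left)
    return 1.0 - r_center / continuum

# Synthetic spectrum with an absorption feature near 2.30 um.
wl = [2.20, 2.25, 2.30, 2.35, 2.40]
refl = [0.30, 0.29, 0.24, 0.29, 0.30]
bd = band_depth(wl, refl, left=2.25, center=2.30, right=2.35)
```

A positive band depth flags an absorption at the center wavelength relative to the interpolated continuum; a flat spectrum yields a band depth of zero.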

  11. Imbibition of ``Open Capillary'': Fundamentals and Applications

    NASA Astrophysics Data System (ADS)

    Tani, Marie; Kawano, Ryuji; Kamiya, Koki; Okumura, Ko

    2015-11-01

    Control or transportation of small amounts of liquid is one of the most important issues in various contexts, from the medical sciences and pharmaceutical industries to fuel delivery. We studied imbibition of ``open capillaries'' both experimentally and theoretically, and found simple scaling laws for both the statics and the dynamics of the imbibition, similar to those for imbibition of capillary tubes. Furthermore, we revealed the existence of a ``precursor film'' that develops ahead of the imbibing front, whose dynamics are well described by another scaling law for capillary rise in a corner. Then, to show the capabilities of open capillaries, we demonstrated two experiments with fabricated micro-mixing devices, achieving (1) simultaneous multi-color change of a Bromothymol blue (BTB) solution and (2) expression of green fluorescent protein (GFP). This research was partly supported by the ImPACT Program of the Council for Science, Technology and Innovation (Cabinet Office, Government of Japan). M. T. is supported by the Japan Society for the Promotion of Science Research Fellowships for Young Scientists.
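For reference, the classical result for a closed capillary tube, to which the abstract compares its open-capillary scaling laws, is Washburn's law for the imbibition front height:

```latex
% Washburn's law for imbibition into a closed capillary tube of radius r;
% \gamma is the surface tension, \theta the contact angle, \mu the viscosity.
h(t) = \sqrt{\frac{\gamma\, r \cos\theta}{2\mu}\, t} \;\propto\; \sqrt{t}
```

The scaling laws the authors report for open capillaries and for the corner-flow precursor film are analogues of this closed-tube result, with different geometry-dependent prefactors and exponents.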

  12. Development of Laser-Polarized Noble Gas Magnetic Resonance Imaging (MRI) Technology

    NASA Technical Reports Server (NTRS)

    Walsworth, Ronald L.

    2004-01-01

    We are developing technology for laser-polarized noble gas nuclear magnetic resonance (NMR), with the aim of enabling it as a novel biomedical imaging tool for ground-based and eventually space-based applications. This emerging multidisciplinary technology enables high-resolution gas-space magnetic resonance imaging (MRI), e.g., of lung ventilation, perfusion, and gas exchange. In addition, laser-polarized noble gases (3He and 129Xe) do not require a large magnetic field for sensitive NMR detection, opening the door to practical MRI with novel, open-access magnet designs at very low magnetic fields (and hence in confined spaces). We are pursuing two specific aims in this technology development program. The first aim is to develop an open-access, low-field (less than 0.01 T) instrument for MRI studies of human gas inhalation as a function of subject orientation, and the second aim is to develop functional imaging of the lung using laser-polarized He-3 and Xe-129.

  13. KSC-02pd0621

    NASA Image and Video Library

    2002-04-29

    KENNEDY SPACE CENTER, FLA. -- At the opening ceremony for the new program known as SABRE, Space Agricultural Biotechnology Research and Education, key participants gather around the SABRE poster. From left are Robert Ferl, professor in the horticultural sciences department and assistant director of the University of Florida Biotechnology Program, who will direct and be responsible for coordinating the research and education; William Knott, senior scientist in the NASA biological sciences office; U.S. Representative Dave Weldon; Center Director Roy D. Bridges Jr.; and Florida Representative Bob Allen. Involving UF and NASA, SABRE will focus on the discovery, development and application of the biological aspects of advanced life support strategies. The program will include faculty from UF's Institute of Food and Agricultural Sciences, who will be located both at KSC - in the state-owned Space Experiment Research and Processing Laboratory (SERPL) being built there - and at UF in Gainesville.

  14. The SCEC Community Modeling Environment(SCEC/CME): A Collaboratory for Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Maechling, P. J.; Jordan, T. H.; Minster, J. B.; Moore, R.; Kesselman, C.

    2005-12-01

    The SCEC Community Modeling Environment (SCEC/CME) Project is an NSF-supported Geosciences/IT partnership that is actively developing an advanced information infrastructure for system-level earthquake science in Southern California. This partnership includes SCEC, USC's Information Sciences Institute (ISI), the San Diego Supercomputer Center (SDSC), the Incorporated Research Institutions for Seismology (IRIS), and the U.S. Geological Survey. The goal of the SCEC/CME is to develop seismological applications and information technology (IT) infrastructure to support the development of Seismic Hazard Analysis (SHA) programs and other geophysical simulations. The SHA application programs developed on the Project include a Probabilistic Seismic Hazard Analysis system called OpenSHA. OpenSHA computational elements that are currently available include a collection of attenuation relationships and several Earthquake Rupture Forecasts (ERFs). Geophysicists in the collaboration have also developed Anelastic Wave Models (AWMs) using both finite-difference and finite-element approaches. Earthquake simulations using these codes have been run for a variety of earthquake sources. Rupture Dynamic Model (RDM) codes have also been developed that simulate friction-based fault slip. The SCEC/CME collaboration has also developed IT software and hardware infrastructure to support the development, execution, and analysis of these SHA programs. To support computationally expensive simulations, we have constructed a grid-based scientific workflow system. Using the SCEC grid, project collaborators can submit computations from the SCEC/CME servers to High Performance Computers at USC and TeraGrid High Performance Computing Centers. Data generated and archived by the SCEC/CME is stored in a digital library system, the Storage Resource Broker (SRB). This system provides a robust and secure way of maintaining the association between the data sets and their metadata.
To provide an easy-to-use system for constructing SHA computations, a browser-based workflow assembly web portal has been developed. Users can compose complex SHA calculations, specifying SCEC/CME data sets as inputs and calling SCEC/CME computational programs to process the data and the output. Knowledge-based software tools have been implemented that utilize ontological descriptions of SHA software and data to validate workflows created with this pathway assembly tool. Data visualization software developed by the collaboration supports analysis and validation of data sets. Several programs have been developed to visualize SCEC/CME data, including GMT-based map-making software for PSHA codes, 4D wavefield propagation visualization software based on OpenGL, and 3D Geowall-based visualization of earthquakes, faults, and seismic wave propagation. The SCEC/CME Project also helps to sponsor the SCEC UseIT Intern Program, which provides research opportunities in both Geosciences and Information Technology to undergraduate students in a variety of fields. The UseIT group has developed a 3D data visualization tool, called SCEC-VDO, as part of this undergraduate research program.
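The core PSHA computation that a system like OpenSHA formalizes, combining an earthquake rupture forecast with an attenuation relationship into a hazard curve, can be sketched as a toy calculation. The scenario rates, median ground motions, and variability below are invented for illustration; this is not OpenSHA code or its API.

```python
# Toy probabilistic seismic hazard calculation: sum, over forecast
# scenarios, the annual rate of each scenario times the probability that
# its (lognormally distributed) ground motion exceeds a threshold.
import math

def lognormal_exceedance(median, sigma_ln, threshold):
    """P(ground motion > threshold) under a lognormal attenuation model."""
    z = (math.log(threshold) - math.log(median)) / sigma_ln
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# Hypothetical rupture forecast: (annual rate, median PGA in g) per scenario.
scenarios = [(0.05, 0.10), (0.01, 0.30), (0.002, 0.60)]
SIGMA_LN = 0.6  # assumed aleatory variability of the attenuation relationship

def exceedance_rate(threshold_g):
    """Annual rate of PGA exceeding threshold_g, summed over all scenarios."""
    return sum(rate * lognormal_exceedance(median, SIGMA_LN, threshold_g)
               for rate, median in scenarios)

rate_01g = exceedance_rate(0.1)  # one ordinate of the hazard curve, at 0.1 g
```

Evaluating `exceedance_rate` over a range of thresholds traces out the hazard curve; in OpenSHA-style systems the attenuation relationships and ERFs are pluggable components playing exactly these two roles.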

  15. PaaS for web applications with OpenShift Origin

    NASA Astrophysics Data System (ADS)

    Lossent, A.; Rodriguez Peon, A.; Wagner, A.

    2017-10-01

    The CERN Web Frameworks team has deployed OpenShift Origin to facilitate deployment of web applications and to improve efficiency in terms of computing resource usage. OpenShift leverages Docker containers and Kubernetes orchestration to provide a Platform-as-a-Service solution oriented toward web applications. We review use cases and how OpenShift was integrated with other services such as source control, web site management and authentication services.

  16. 75 FR 41102 - Energy Conservation Program: Re-Opening of the Public Comment Period for Commercial Refrigeration...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-15

    ... Conservation Program: Re-Opening of the Public Comment Period for Commercial Refrigeration Equipment AGENCY... document for commercial refrigeration equipment and provide notice of a public meeting. The NOPM provided... the framework document for commercial refrigeration equipment is to be re-opened from July 15, 2010 to...

  17. Usage of Credit Cards Received through College Student-Marketing Programs

    ERIC Educational Resources Information Center

    Barron, John M.; Staten, Michael E.

    2004-01-01

    This article provides benchmark measures of college student credit card usage by utilizing a pooled sample of over 300,000 recently opened credit card accounts. The analysis compares behavior over 12 months of account history for three groups of accounts: those opened by young adults through college student marketing programs; those opened through…

  18. 47 CFR 76.1507 - Competitive access to satellite cable programming.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... RADIO SERVICES MULTICHANNEL VIDEO AND CABLE TELEVISION SERVICE Open Video Systems § 76.1507 Competitive....1000 through 76.1003 shall also apply to an operator of an open video system and its affiliate which provides video programming on its open video system, except as limited by paragraph (a) (1)-(3) of this...

  19. 47 CFR 76.1509 - Syndicated program exclusivity.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... MULTICHANNEL VIDEO AND CABLE TELEVISION SERVICE Open Video Systems § 76.1509 Syndicated program exclusivity. (a) Sections 76.151 through 76.163 shall apply to open video systems in accordance with the provisions... to an open video system. (c) Any provision of § 76.155 that refers to a “cable system operator” or...

  20. 47 CFR 76.1507 - Competitive access to satellite cable programming.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... RADIO SERVICES MULTICHANNEL VIDEO AND CABLE TELEVISION SERVICE Open Video Systems § 76.1507 Competitive....1000 through 76.1003 shall also apply to an operator of an open video system and its affiliate which provides video programming on its open video system, except as limited by paragraph (a) (1)-(3) of this...

Top