The Application Programming Interface (API) for Uncertainty Analysis, Sensitivity Analysis, and Parameter Estimation (UA/SA/PE API) tool development, hereafter referred to as the Calibration, Optimization, and Sensitivity and Uncertainty Algorithms API (COSU-API), was initially d...
Banta, E.R.; Hill, M.C.; Poeter, E.; Doherty, J.E.; Babendreier, J.
2008-01-01
The open-source, public domain JUPITER (Joint Universal Parameter IdenTification and Evaluation of Reliability) API (Application Programming Interface) provides conventions and Fortran-90 modules to develop applications (computer programs) for analyzing process models. The input and output conventions allow application users to access various applications and the analysis methods they embody with a minimum of time and effort. Process models simulate, for example, physical, chemical, and (or) biological systems of interest using phenomenological, theoretical, or heuristic approaches. The types of model analyses supported by the JUPITER API include, but are not limited to, sensitivity analysis, data needs assessment, calibration, uncertainty analysis, model discrimination, and optimization. The advantages provided by the JUPITER API for users and programmers allow for rapid programming and testing of new ideas. Application-specific coding can be in languages other than the Fortran-90 of the API. This article briefly describes the capabilities and utility of the JUPITER API, lists existing applications, and uses UCODE_2005 as an example.
Banta, Edward R.; Poeter, Eileen P.; Doherty, John E.; Hill, Mary C.
2006-01-01
The Joint Universal Parameter IdenTification and Evaluation of Reliability Application Programming Interface (JUPITER API) improves the computer programming resources available to those developing applications (computer programs) for model analysis. The JUPITER API consists of eleven Fortran-90 modules that provide for encapsulation of data and operations on that data. Each module contains one or more entities: data, data types, subroutines, functions, and generic interfaces. The modules do not constitute computer programs themselves; instead, they are used to construct computer programs. Such computer programs are called applications of the API. The API provides common modeling operations for use by a variety of computer applications. The models being analyzed are referred to here as process models, and may, for example, represent the physics, chemistry, and (or) biology of a field or laboratory system. Process models commonly are constructed using published models such as MODFLOW (Harbaugh et al., 2000; Harbaugh, 2005), MT3DMS (Zheng and Wang, 1996), HSPF (Bicknell et al., 1997), PRMS (Leavesley and Stannard, 1995), and many others. The process model may be accessed by a JUPITER API application as an external program, or it may be implemented as a subroutine within a JUPITER API application. In either case, execution of the model takes place in a framework designed by the application programmer. This framework can be designed to take advantage of any parallel processing capabilities possessed by the process model, as well as the parallel-processing capabilities of the JUPITER API. Model analyses for which the JUPITER API could be useful include, for example: comparing model results to observed values to determine how well the model reproduces system processes and characteristics; using sensitivity analysis to determine the information provided by observations to parameters and predictions of interest; determining the additional data needed to improve selected model predictions; using calibration methods to modify parameter values and other aspects of the model; comparing predictions to regulatory limits; quantifying the uncertainty of predictions based on the results of one or many simulations using inferential or Monte Carlo methods; and determining how to manage the system to achieve stated objectives. The capabilities provided by the JUPITER API include, for example, communication with process models, parallel computations, compressed storage of matrices, and flexible input capabilities. The input capabilities use input blocks suitable for lists or arrays of data. The input blocks needed for one application can be included within one data file or distributed among many files. Data exchange between different JUPITER API applications or between applications and other programs is supported by data-exchange files. The JUPITER API has already been used to construct a number of applications. Three simple example applications are presented in this report. More complicated applications include the universal inverse code UCODE_2005 (Poeter et al., 2005), the multi-model analysis MMA (Eileen P. Poeter, Mary C. Hill, E.R. Banta, S.W. Mehl, and Steen Christensen, written commun., 2006), and a code named OPR_PPR (Matthew J. Tonkin, Claire R. Tiedeman, Mary C. Hill, and D. Matthew Ely, written commun., 2006). This report describes a set of underlying organizational concepts and complete specifics about the JUPITER API.
While understanding the organizational concept presented is useful to understanding the modules, other organizational concepts can be used in applications constructed using the JUPITER API.
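Although JUPITER itself is a set of Fortran-90 modules, the pattern it supports — driving an external process model, exchanging data through files, and comparing simulated values with observations — can be sketched compactly. The Python sketch below is purely illustrative: the file names, parameter names, and model command are hypothetical, and a real JUPITER application would use the API's input-block and data-exchange conventions instead.

```python
import subprocess
from pathlib import Path

# Hypothetical file names and executable; a real JUPITER application would use
# the API's Fortran-90 modules and input-block conventions instead.
PARAM_FILE = Path("params.in")
OUTPUT_FILE = Path("simulated.out")
MODEL_CMD = ["./process_model"]  # external process model, e.g. a MODFLOW-like code

def run_process_model(params: dict) -> list:
    """Write parameters, run the external process model, read simulated values."""
    PARAM_FILE.write_text("\n".join(f"{name} {value}" for name, value in params.items()))
    subprocess.run(MODEL_CMD, check=True)
    return [float(line) for line in OUTPUT_FILE.read_text().split()]

def sum_of_squared_residuals(simulated: list, observed: list) -> float:
    """Simple fit measure comparing simulated values with observations."""
    return sum((s - o) ** 2 for s, o in zip(simulated, observed))

if __name__ == "__main__":
    observations = [1.2, 3.4, 5.6]
    simulated = run_process_model({"hydraulic_conductivity": 1.0e-4, "recharge": 0.002})
    print("SSR:", sum_of_squared_residuals(simulated, observations))
```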
Efficient Strategies for Active Interface-Level Network Topology Discovery
2013-09-01
Network Information Centre; API Application Programming Interface; APNIC Asia-Pacific Network Information Centre; ARIN American Registry for Internet Numbers...very convenient Application Programming Interface (API) for easy primitive implementation. Ark's API facilitates easy development and rapid...prototyping – important attributes as the characteristics of our primitives evolve. The API allows a high level of abstraction, which in turn leads to rapid
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-09
... depend upon the Application Programming Interface (``API'') a Permit Holder is using.\\4\\ Currently, the Exchange offers two APIs: CBOE Market Interface (``CMi'') API and Financial Information eXchange (``FIX... available APIs, and if applicable, which version, it would like to use. \\4\\ An API is a computer interface...
PedVizApi: a Java API for the interactive, visual analysis of extended pedigrees.
Fuchsberger, Christian; Falchi, Mario; Forer, Lukas; Pramstaller, Peter P
2008-01-15
PedVizApi is a Java API (application program interface) for the visual analysis of large and complex pedigrees. It provides all the necessary functionality for the interactive exploration of extended genealogies. While available packages are mostly focused on a static representation or cannot be added to an existing application, PedVizApi is a highly flexible open source library for the efficient construction of visual-based applications for the analysis of family data. An extensive demo application and an R interface are provided. http://www.pedvizapi.org
PrismTech Data Distribution Service Java API Evaluation
NASA Technical Reports Server (NTRS)
Riggs, Cortney
2008-01-01
My internship duties with Launch Control Systems required me to start performance testing of the Object Management Group's (OMG) Data Distribution Service (DDS) specification implementation by PrismTech Limited through the Java programming language application programming interface (API). DDS is a networking middleware for Real-Time Data Distribution. The performance testing involves latency, redundant publishers, extended duration, redundant failover, and read performance. Time constraints allowed only for a data throughput test. I have designed the testing applications to perform all performance tests when time is allowed. Performance evaluation data such as megabits per second and central processing unit (CPU) time consumption were not easily attainable through the Java programming language; they required new methods and classes created in the test applications. Evaluation of this product showed the rate at which data can be sent across the network. Performance rates are better on Linux platforms than on AIX and Sun platforms. Compared to the previous C++ programming language API, the performance evaluation also shows the language differences for the implementation. The Java API of the DDS has a lower throughput performance than the C++ API.
Context-Based Mobile Security Enclave
2012-09-01
c. Change IMSI; d. Change CellID; e. Change Geolocation ...Assisted Global Positioning System; ADB Android Debugger; API Application Programming Interface; APK Android Application Package; BSC Base Station...Programming Interfaces (APIs), which use Java compatible libraries based on Apache Harmony (an open source Java implementation developed by the Apache
A proposed application programming interface for a physical volume repository
NASA Technical Reports Server (NTRS)
Jones, Merritt; Williams, Joel; Wrenn, Richard
1996-01-01
The IEEE Storage System Standards Working Group (SSSWG) has developed the Reference Model for Open Storage Systems Interconnection, Mass Storage System Reference Model Version 5. This document provides the framework for a series of standards for application and user interfaces to open storage systems. More recently, the SSSWG has been developing Application Programming Interfaces (APIs) for the individual components defined by the model. The API for the Physical Volume Repository is the most fully developed, but work is being done on APIs for the Physical Volume Library and for the Mover also. The SSSWG meets every other month, and meetings are open to all interested parties. The Physical Volume Repository (PVR) is responsible for managing the storage of removable media cartridges and for mounting and dismounting these cartridges onto drives. This document describes a model which defines a Physical Volume Repository, and gives a brief summary of the Application Programming Interface (API) which the IEEE Storage Systems Standards Working Group (SSSWG) is proposing as the standard interface for the PVR.
Leveraging OpenStudio's Application Programming Interfaces: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Long, N.; Ball, B.; Goldwasser, D.
2013-11-01
OpenStudio development efforts have been focused on providing Application Programming Interfaces (APIs) where users are able to extend OpenStudio without the need to compile the open source libraries. This paper will discuss the basic purposes and functionalities of the core libraries that have been wrapped with APIs including the Building Model, Results Processing, Advanced Analysis, Uncertainty Quantification, and Data Interoperability through Translators. Several building energy modeling applications have been produced using OpenStudio's API and Software Development Kits (SDK) including the United States Department of Energy's Asset Score Calculator, a mobile-based audit tool, an energy design assistance reporting protocol, and a portfolio-scale incentive optimization analysis methodology. Each of these software applications will be discussed briefly and will describe how the APIs were leveraged for various uses including high-level modeling, data transformations from detailed building audits, error checking/quality assurance of models, and use of high-performance computing for mass simulations.
ERIC Educational Resources Information Center
Wang, Kening; Mulvenon, Sean W.; Stegman, Charles; Anderson, Travis
2008-01-01
Google Maps API (Application Programming Interface), released in late June 2005 by Google, is an amazing technology that allows users to embed Google Maps in their own Web pages with JavaScript. Google Maps API has accelerated the development of new Google Maps based applications. This article reports a Web-based interactive mapping system…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hollman, David; Lifflander, Jonathon; Wilke, Jeremiah
2017-03-14
DARMA is a portability layer for asynchronous many-task (AMT) runtime systems. AMT runtime systems show promise to mitigate challenges imposed by next generation high performance computing architectures. However, current runtime system technologies are not production-ready. DARMA is a portability layer that seeks to insulate application developers from idiosyncrasies of individual runtime systems, thereby facilitating application-developer use of these technologies. DARMA comprises a frontend application programming interface (API) for application developers, a backend API for runtime system developers, and a translation layer that translates frontend API calls into backend API calls. Application developers use C++ abstractions to annotate both data and tasks in their code. The DARMA translation layer uses C++ template metaprogramming to capture data-task dependencies, and provides this information to a potential backend runtime system via a series of backend API calls.
The Application Programming Interface (API) for Uncertainty Analysis, Sensitivity Analysis, and Parameter Estimation (UA/SA/PE API) (also known as Calibration, Optimization and Sensitivity and Uncertainty (CUSO)) was developed in a joint effort between several members of both ...
JASPAR RESTful API: accessing JASPAR data from any programming language.
Khan, Aziz; Mathelier, Anthony
2018-05-01
JASPAR is a widely used open-access database of curated, non-redundant transcription factor binding profiles. Currently, data from JASPAR can be retrieved as flat files or by using programming language-specific interfaces. Here, we present a programming language-independent application programming interface (API) to access JASPAR data using the Representational State Transfer (REST) architecture. The REST API enables programmatic access to JASPAR by most programming languages and returns data in eight widely used formats. Several endpoints are available to access the data and an endpoint is available to infer the TF binding profile(s) likely bound by a given DNA binding domain protein sequence. Additionally, it provides an interactive browsable interface for bioinformatics tool developers. This REST API is implemented in Python using the Django REST Framework. It is accessible at http://jaspar.genereg.net/api/ and the source code is freely available at https://bitbucket.org/CBGR/jaspar under GPL v3 license. aziz.khan@ncmm.uio.no or anthony.mathelier@ncmm.uio.no. Supplementary data are available at Bioinformatics online.
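As a rough illustration of the programming-language-independent access the abstract describes, the following Python sketch queries the REST service with the requests library. The version path, endpoint name, and matrix identifier are assumptions based on the API's documented URL pattern and should be checked against the current documentation.

```python
import requests

BASE = "http://jaspar.genereg.net/api/v1"  # base URL from the article; version path assumed

def get_matrix(matrix_id: str) -> dict:
    """Fetch a single TF binding profile as JSON (one of the supported formats)."""
    resp = requests.get(f"{BASE}/matrix/{matrix_id}/", params={"format": "json"}, timeout=30)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    profile = get_matrix("MA0002.1")  # example JASPAR matrix identifier
    print(profile.get("name"), profile.get("collection"))
```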
Migrating Department of Defense (DoD) Web Service Based Applications to Mobile Computing Platforms
2012-03-01
World Wide Web Consortium (W3C) Geolocation API to identify the device’s location and then center the map on the device. Finally, we modify the entry...THIS PAGE INTENTIONALLY LEFT BLANK xii List of Acronyms and Abbreviations API Application Programming Interface CSS Cascading Style Sheets CLIMO...Java API for XML Web Services Reference Implementation JS JavaScript JSNI JavaScript Native Interface METOC Meteorological and Oceanographic MAA Mobile
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-09
... available to Participants various application programming interfaces (``APIs''),\\4\\ such as CBOE Market... certain order and trade data to the Exchange, which data the Exchange uses to conduct surveillances of its markets and Participants. \\4\\ APIs are computer programs that allow Participants to interface with the...
Internet SCADA Utilizing API's as Data Source
NASA Astrophysics Data System (ADS)
Robles, Rosslin John; Kim, Haeng-Kon; Kim, Tai-Hoon
An Application programming interface or API is an interface implemented by a software program that enables it to interact with other software. Many companies provide free API services which can be utilized in Control Systems. SCADA is an example of a control system and it is a system that collects data from various sensors at a factory, plant or in other remote locations and then sends this data to a central computer which then manages and controls the data. In this paper, we designed a scheme for Weather Condition in Internet SCADA Environment utilizing data from external API services. The scheme was designed to double check the weather information in SCADA.
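A minimal sketch of the double-checking idea described above: poll an external weather API and compare the returned value with the local SCADA sensor reading. The endpoint URL and JSON field names are hypothetical placeholders, not a specific provider's API.

```python
import requests

WEATHER_API_URL = "https://api.example.com/weather"  # hypothetical external API endpoint

def external_temperature(station_id: str) -> float:
    """Query a (hypothetical) external weather API for the current temperature."""
    resp = requests.get(WEATHER_API_URL, params={"station": station_id}, timeout=10)
    resp.raise_for_status()
    return float(resp.json()["temperature_c"])  # field name assumed

def cross_check(scada_reading_c: float, station_id: str, tolerance_c: float = 2.0) -> bool:
    """Return True if the SCADA sensor agrees with the external API within tolerance."""
    return abs(scada_reading_c - external_temperature(station_id)) <= tolerance_c

if __name__ == "__main__":
    print("Sensor consistent with API:", cross_check(scada_reading_c=21.5, station_id="KSEA"))
```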
An Application Programming Interface for Synthetic Snowflake Particle Structure and Scattering Data
NASA Technical Reports Server (NTRS)
Lammers, Matthew; Kuo, Kwo-Sen
2017-01-01
The work by Kuo and colleagues on growing synthetic snowflakes and calculating their single-scattering properties has demonstrated great potential to improve the retrievals of snowfall. To grant colleagues flexible and targeted access to their large collection of sizes and shapes at fifteen (15) microwave frequencies, we have developed a web-based Application Programming Interface (API) integrated with NASA Goddard's Precipitation Processing System (PPS) Group. It is our hope that the API will enable convenient programmatic utilization of the database. To help users better understand the API's capabilities, we have developed an interactive web interface called the OpenSSP API Query Builder, which implements an intuitive system of mechanisms for selecting shapes, sizes, and frequencies to generate queries, with which the API can then extract and return data from the database. The Query Builder also allows for the specification of normalized particle size distributions by setting pertinent parameters, with which the API can also return mean geometric and scattering properties for each size bin. Additionally, the Query Builder interface enables downloading of raw scattering and particle structure data packages. This presentation will describe some of the challenges and successes associated with developing such an API. Examples of its usage will be shown both through downloading output and pulling it into a spreadsheet, as well as querying the API programmatically and working with the output in code.
Reactome Pengine: A web-logic API to the homo sapiens reactome.
Neaves, Samuel R; Tsoka, Sophia; Millard, Louise A C
2018-03-30
Existing ways of accessing data from the Reactome database are limited. Either a researcher is restricted to particular queries defined by a web application programming interface (API), or they have to download the whole database. Reactome Pengine is a web service providing a logic programming based API to the human reactome. This gives researchers greater flexibility in data access than existing APIs, as users can send their own small programs (alongside queries) to Reactome Pengine. The server and an example notebook can be found at https://apps.nms.kcl.ac.uk/reactome-pengine. Source code is available at https://github.com/samwalrus/reactome-pengine and a Docker image is available at https://hub.docker.com/r/samneaves/rp4/ . samuel.neaves@kcl.ac.uk. Supplementary data are available at Bioinformatics online.
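A hedged sketch of how a client might submit a small logic program alongside a query, which is the distinguishing feature of this web-logic API. The HTTP conventions shown follow the generic SWI-Prolog Pengines JSON protocol, which is assumed here; the predicate names are invented for illustration, so consult the service's example notebook for the real conventions.

```python
import requests

SERVER = "https://apps.nms.kcl.ac.uk/reactome-pengine"  # server URL from the abstract

# A small user-supplied Prolog program sent alongside the query. Predicate names
# here are illustrative, not actual Reactome Pengine predicates.
SRC = "small_pathway(P) :- pathway(P), pathway_size(P, N), N < 5."

def ask(query: str, src_text: str) -> dict:
    """Create a pengine, run one query, and return the JSON answer.
    Follows the generic SWI-Prolog Pengines JSON protocol (assumed here)."""
    resp = requests.post(
        f"{SERVER}/pengine/create",
        json={"ask": query, "src_text": src_text, "format": "json", "destroy": True},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    print(ask("small_pathway(P)", SRC))
```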
GIANT API: an application programming interface for functional genomics
Roberts, Andrew M.; Wong, Aaron K.; Fisk, Ian; Troyanskaya, Olga G.
2016-01-01
GIANT API provides biomedical researchers programmatic access to tissue-specific and global networks in humans and model organisms, and associated tools, which includes functional re-prioritization of existing genome-wide association study (GWAS) data. Using tissue-specific interaction networks, researchers are able to predict relationships between genes specific to a tissue or cell lineage, identify the changing roles of genes across tissues and uncover disease-gene associations. Additionally, GIANT API enables computational tools like NetWAS, which leverages tissue-specific networks for re-prioritization of GWAS results. The web services covered by the API include 144 tissue-specific functional gene networks in human, global functional networks for human and six common model organisms and the NetWAS method. GIANT API conforms to the REST architecture, which makes it stateless, cacheable and highly scalable. It can be used by a diverse range of clients including web browsers, command terminals, programming languages and standalone apps for data analysis and visualization. The API is freely available for use at http://giant-api.princeton.edu. PMID:27098035
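The abstract gives only the base URL, so the sketch below is limited to a generic helper for issuing REST calls against it; the example path and parameter are placeholders, not documented GIANT API endpoints.

```python
import requests

BASE = "http://giant-api.princeton.edu"  # base URL given in the abstract

def get_json(path: str, **params) -> dict:
    """Generic helper for a REST endpoint under the GIANT API base URL.
    The concrete paths and parameters used below are placeholders; check the API docs."""
    resp = requests.get(f"{BASE}/{path.lstrip('/')}", params=params, timeout=30)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    # Hypothetical call: relationships for a gene within a tissue-specific network.
    data = get_json("networks/blood/genes", query="IL2RA")
    print(type(data))
```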
Programmatic access to logical models in the Cell Collective modeling environment via a REST API.
Kowal, Bryan M; Schreier, Travis R; Dauer, Joseph T; Helikar, Tomáš
2016-01-01
Cell Collective (www.cellcollective.org) is a web-based interactive environment for constructing, simulating and analyzing logical models of biological systems. Herein, we present a Web service to access models, annotations, and simulation data in the Cell Collective platform through the Representational State Transfer (REST) Application Programming Interface (API). The REST API provides a convenient method for obtaining Cell Collective data through almost any programming language. To ensure easy processing of the retrieved data, the request output from the API is available in a standard JSON format. The Cell Collective REST API is freely available at http://thecellcollective.org/tccapi. All public models in Cell Collective are available through the REST API. Users interested in creating and accessing their own models through the REST API first need to create an account in Cell Collective (http://thecellcollective.org). thelikar2@unl.edu. Technical user documentation: https://goo.gl/U52GWo. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
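A minimal Python sketch of consuming the JSON output described above. Only the API root URL comes from the abstract; the "/models" path is an assumption for illustration, so the technical user documentation should be consulted for the real endpoint names.

```python
import requests

API_ROOT = "http://thecellcollective.org/tccapi"  # root URL from the abstract

def list_public_models() -> list:
    """Retrieve the public logical models as JSON.
    The '/models' path is assumed for illustration; see the user documentation."""
    resp = requests.get(f"{API_ROOT}/models", timeout=30)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    models = list_public_models()
    print(f"{len(models)} public models retrieved")
```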
Application-Program-Installer Builder
NASA Technical Reports Server (NTRS)
Wolgast, Paul; Demore, Martha; Lowik, Paul
2007-01-01
A computer program builds application programming interfaces (APIs) and related software components for installing and uninstalling application programs in any of a variety of computers and operating systems that support the Java programming language in its binary form. This program is partly similar in function to commercial (e.g., Install-Shield) software. This program is intended to enable satisfaction of a quasi-industry-standard set of requirements for a set of APIs that would enable such installation and uninstallation and that would avoid the pitfalls that are commonly encountered during installation of software. The requirements include the following: 1) Properly detecting prerequisites to an application program before performing the installation; 2) Properly registering component requirements; 3) Correctly measuring the required hard-disk space, including accounting for prerequisite components that have already been installed; and 4) Correctly uninstalling an application program. Correct uninstallation includes (1) detecting whether any component of the program to be removed is required by another program, (2) not removing that component, and (3) deleting references to requirements of the to-be-removed program for components of other programs so that those components can be properly removed at a later time.
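Requirement 4 (dependency-aware uninstallation) is the most algorithmic of the list, and a small sketch makes the bookkeeping concrete. This is not the NASA program's API; the registry structure and component names are invented for illustration.

```python
# Minimal sketch (not the NASA program's API) of requirement 4: uninstall a program
# only after checking whether any of its components is still required by another
# installed program, and drop the reverse references so shared components can be
# removed later.

registry = {
    # program -> components it requires (illustrative data)
    "AppA": {"runtimeX", "libY"},
    "AppB": {"runtimeX"},
}

def uninstall(program: str) -> None:
    components = registry.pop(program, set())
    still_needed = {c for c in components
                    if any(c in deps for deps in registry.values())}
    for component in components - still_needed:
        print(f"removing component {component}")
    for component in still_needed:
        print(f"keeping shared component {component} (required by another program)")

if __name__ == "__main__":
    uninstall("AppA")   # keeps runtimeX (AppB still needs it), removes libY
    uninstall("AppB")   # now runtimeX can be removed
```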
Commercial Building Energy Saver, API
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hong, Tianzhen; Piette, Mary; Lee, Sang Hoon
2015-08-27
The CBES API provides Application Programming Interface to a suite of functions to improve energy efficiency of buildings, including building energy benchmarking, preliminary retrofit analysis using a pre-simulation database DEEP, and detailed retrofit analysis using energy modeling with the EnergyPlus simulation engine. The CBES API is used to power the LBNL CBES Web App. It can be adopted by third party developers and vendors into their software tools and platforms.
Measuring the impact of an API-first mentality with ScienceBase after 4.5 years
NASA Astrophysics Data System (ADS)
Bristol, S.; Tekell, S.
2016-12-01
ScienceBase is a research infrastructure developed and operated by the U.S. Geological Survey with users and uses across a number of other agency and organization partners. Over four years ago, we released an Application Programming Interface (API) as the foundation of the system and took on the mindset that our progress would be measured by the uptake of the API by others beyond ourselves in developing interesting applications. We now measure success more by someone finding ScienceBase, organizing their data and information, developing an innovative API-driven application and then serendipitous discovery through a science meeting. Because of the way we built the RESTful API, we can characterize what parts of the system are employed. Analysis of usage data helps us take the supposition out of what works and guides design and funding decisions. This analytics-based process facilitates regular adjustments to our thinking and allows us to test design decisions as hypotheses rather than untestable aspirations.
GIANT API: an application programming interface for functional genomics.
Roberts, Andrew M; Wong, Aaron K; Fisk, Ian; Troyanskaya, Olga G
2016-07-08
GIANT API provides biomedical researchers programmatic access to tissue-specific and global networks in humans and model organisms, and associated tools, which includes functional re-prioritization of existing genome-wide association study (GWAS) data. Using tissue-specific interaction networks, researchers are able to predict relationships between genes specific to a tissue or cell lineage, identify the changing roles of genes across tissues and uncover disease-gene associations. Additionally, GIANT API enables computational tools like NetWAS, which leverages tissue-specific networks for re-prioritization of GWAS results. The web services covered by the API include 144 tissue-specific functional gene networks in human, global functional networks for human and six common model organisms and the NetWAS method. GIANT API conforms to the REST architecture, which makes it stateless, cacheable and highly scalable. It can be used by a diverse range of clients including web browsers, command terminals, programming languages and standalone apps for data analysis and visualization. The API is freely available for use at http://giant-api.princeton.edu. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
DOE Office of Scientific and Technical Information (OSTI.GOV)
2012-08-21
NREL's Developer Network, developer.nrel.gov, provides data that users can access for use in their own analyses and in mobile and web applications. Developers can retrieve the data through a Web services API (application programming interface). The Developer Network handles the overhead of serving up web services such as key management, authentication, analytics, reporting, documentation standards, and throttling in a common architecture, while allowing web services and APIs to be maintained and managed independently.
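A brief example of calling one of the public web services behind developer.nrel.gov from Python. The endpoint, parameter names, and the DEMO_KEY test key reflect the public documentation at the time of writing and should be verified against the current docs.

```python
import requests

# One of the public web services behind developer.nrel.gov (endpoint and parameters
# as documented at the time of writing; verify against the current docs).
URL = "https://developer.nrel.gov/api/alt-fuel-stations/v1.json"

def fetch_stations(api_key: str, limit: int = 5) -> list:
    """Call an NREL Developer Network web service; the shared architecture handles
    key management, authentication, and throttling via the api_key parameter."""
    resp = requests.get(URL, params={"api_key": api_key, "limit": limit}, timeout=30)
    resp.raise_for_status()
    return resp.json().get("fuel_stations", [])

if __name__ == "__main__":
    for station in fetch_stations(api_key="DEMO_KEY"):  # DEMO_KEY: rate-limited test key
        print(station.get("station_name"))
```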
A knowledge discovery object model API for Java
Zuyderduyn, Scott D; Jones, Steven JM
2003-01-01
Background: Biological data resources have become heterogeneous and derive from multiple sources. This introduces challenges in the management and utilization of this data in software development. Although efforts are underway to create a standard format for the transmission and storage of biological data, this objective has yet to be fully realized. Results: This work describes an application programming interface (API) that provides a framework for developing an effective biological knowledge ontology for Java-based software projects. The API provides a robust framework for the data acquisition and management needs of an ontology implementation. In addition, the API contains classes to assist in creating GUIs to represent this data visually. Conclusions: The Knowledge Discovery Object Model (KDOM) API is particularly useful for medium to large applications, or for a number of smaller software projects with common characteristics or objectives. KDOM can be coupled effectively with other biologically relevant APIs and classes. Source code, libraries, documentation and examples are available at . PMID:14583100
An Auto-Configuration System for the GMSEC Architecture and API
NASA Technical Reports Server (NTRS)
Moholt, Joseph; Mayorga, Arturo
2007-01-01
A viewgraph presentation on an automated configuration concept for The Goddard Mission Services Evolution Center (GMSEC) architecture and Application Program Interface (API) is shown. The topics include: 1) The Goddard Mission Services Evolution Center (GMSEC); 2) Automated Configuration Concept; 3) Implementation Approach; and 4) Key Components and Benefits.
Activity-Centric Approach to Distributed Programming
NASA Technical Reports Server (NTRS)
Levy, Renato; Satapathy, Goutam; Lang, Jun
2004-01-01
The first phase of an effort to develop a NASA version of the Cybele software system has been completed. To give meaning to even a highly abbreviated summary of the modifications to be embodied in the NASA version, it is necessary to present the following background information on Cybele: Cybele is a proprietary software infrastructure for use by programmers in developing agent-based application programs [complex application programs that contain autonomous, interacting components (agents)]. Cybele provides support for event handling from multiple sources, multithreading, concurrency control, migration, and load balancing. A Cybele agent follows a programming paradigm, called activity-centric programming, that enables an abstraction over system-level thread mechanisms. Activity centric programming relieves application programmers of the complex tasks of thread management, concurrency control, and event management. In order to provide such functionality, activity-centric programming demands support of other layers of software. This concludes the background information. In the first phase of the present development, a new architecture for Cybele was defined. In this architecture, Cybele follows a modular service-based approach to coupling of the programming and service layers of software architecture. In a service-based approach, the functionalities supported by activity-centric programming are apportioned, according to their characteristics, among several groups called services. A well-defined interface among all such services serves as a path that facilitates the maintenance and enhancement of such services without adverse effect on the whole software framework. The activity-centric application-program interface (API) is part of a kernel. The kernel API calls the services by use of their published interface. This approach makes it possible for any application code written exclusively under the API to be portable for any configuration of Cybele.
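The service-based kernel idea can be sketched independently of Cybele. The following Python sketch is illustrative only, not Cybele's actual API, and shows a kernel whose API reaches registered services solely through their published interface.

```python
# Illustrative sketch (not Cybele's actual API) of the service-based approach the
# summary describes: a kernel exposes a small API, and functionality is apportioned
# among services reached only through their published interface.

from abc import ABC, abstractmethod

class Service(ABC):
    """Published interface that every service must implement."""
    @abstractmethod
    def handle(self, event: dict) -> None: ...

class EventService(Service):
    def handle(self, event: dict) -> None:
        print(f"dispatching event {event['type']} from {event['source']}")

class Kernel:
    """Kernel API used by application (agent) code; the application never touches
    threads or concurrency directly, only the registered services."""
    def __init__(self) -> None:
        self._services = {}

    def register(self, name: str, service: Service) -> None:
        self._services[name] = service

    def post(self, name: str, event: dict) -> None:
        self._services[name].handle(event)

if __name__ == "__main__":
    kernel = Kernel()
    kernel.register("events", EventService())
    kernel.post("events", {"type": "message", "source": "agent-1"})
```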
uPy: a ubiquitous computer graphics Python API with Biological Modeling Applications
Autin, L.; Johnson, G.; Hake, J.; Olson, A.; Sanner, M.
2015-01-01
In this paper we describe uPy, an extension module for the Python programming language that provides a uniform abstraction of the APIs of several 3D computer graphics programs called hosts, including: Blender, Maya, Cinema4D, and DejaVu. A plugin written with uPy is a unique piece of code that will run in all uPy-supported hosts. We demonstrate the creation of complex plug-ins for molecular/cellular modeling and visualization and discuss how uPy can more generally simplify programming for many types of projects (not solely science applications) intended for multi-host distribution. uPy is available at http://upy.scripps.edu PMID:24806987
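The core idea, one plugin body running over thin per-host adapters, can be sketched as an adapter pattern. This is not uPy's real API; the class and method names are invented, and the host-specific calls appear only as comments.

```python
# Illustrative sketch of the idea behind uPy (not its real API): one plugin calls a
# uniform helper, and a thin per-host adapter translates it to the host's own API
# (Blender, Maya, Cinema4D, DejaVu, ...). Host adapters here are stubs.

class HostAdapter:
    def add_sphere(self, name: str, radius: float) -> None:
        raise NotImplementedError

class BlenderAdapter(HostAdapter):
    def add_sphere(self, name: str, radius: float) -> None:
        # A real adapter would call Blender's bpy module here.
        print(f"[blender] sphere {name!r} r={radius}")

class MayaAdapter(HostAdapter):
    def add_sphere(self, name: str, radius: float) -> None:
        # A real adapter would call Maya's maya.cmds module here.
        print(f"[maya] sphere {name!r} r={radius}")

def get_host_adapter(host: str) -> HostAdapter:
    return {"blender": BlenderAdapter, "maya": MayaAdapter}[host]()

def plugin(helper: HostAdapter) -> None:
    """A single plugin body that runs unchanged in every supported host."""
    helper.add_sphere("ribosome", radius=12.5)

if __name__ == "__main__":
    for host in ("blender", "maya"):
        plugin(get_host_adapter(host))
```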
Numerical Integration with Graphical Processing Unit for QKD Simulation
2014-03-27
Windows system application programming interface (API) timer. The problem sizes studied produce speedups greater than 60x on the NVIDIA Tesla C2075... 2.3.3 CUDA API... 2.3.4 CUDA and NVIDIA GPU Hardware... Theoretical Floating-Point Operations per Second for Intel CPUs and NVIDIA GPUs [3
HDF-EOS 2 and HDF-EOS 5 Compatibility Library
NASA Technical Reports Server (NTRS)
Ullman, Richard; Bane, Bob; Yang, Jingli
2008-01-01
The HDF-EOS 2 and HDF-EOS 5 Compatibility Library contains C-language functions that provide uniform access to HDF-EOS 2 and HDF-EOS 5 files through one set of application programming interface (API) calls. ("HDF-EOS 2" and "HDF-EOS 5" are defined in the immediately preceding article.) Without this library, differences between the APIs of HDF-EOS 2 and HDF-EOS 5 would necessitate writing of different programs to cover HDF-EOS 2 and HDF-EOS 5. The API associated with this library is denoted "he25." For nearly every HDF-EOS 5 API call, there is a corresponding he25 API call. If a file in question is in the HDF-EOS 5 format, the code reverts to the corresponding HDF-EOS 5 call; if the file is in the HDF-EOS 2 format, the code translates the arguments to HDF-EOS 2 equivalents (if necessary), calls the HDF-EOS 2 call, and retranslates the results back to HDF-EOS 5 (if necessary).
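The dispatch logic can be sketched in a few lines. The sketch below is in Python rather than the library's C, and the two backend functions are stand-ins for the real HDF-EOS calls; only the HDF5 file-signature check reflects an actual convention.

```python
# Illustrative sketch (in Python, not the C library itself) of the "he25" pattern:
# one wrapper call inspects the file format and forwards to the HDF-EOS 2 or
# HDF-EOS 5 routine, translating arguments where needed. The two backend functions
# are stand-ins for the real C API calls.

def _hdfeos2_open(path: str):        # stand-in for the HDF-EOS 2 call
    return ("HDF-EOS 2 handle", path)

def _hdfeos5_open(path: str):        # stand-in for the HDF-EOS 5 call
    return ("HDF-EOS 5 handle", path)

def detect_format(path: str) -> int:
    """Toy detection: HDF5 files (the basis of HDF-EOS 5) begin with an 8-byte signature."""
    with open(path, "rb") as fh:
        return 5 if fh.read(8) == b"\x89HDF\r\n\x1a\n" else 2

def he25_open(path: str):
    """Uniform open call: revert to HDF-EOS 5 or translate to HDF-EOS 2 as needed."""
    return _hdfeos5_open(path) if detect_format(path) == 5 else _hdfeos2_open(path)
```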
SWMM5 Application Programming Interface and PySWMM: A ...
In support of the OpenWaterAnalytics open source initiative, the PySWMM project encompasses the development of a Python interfacing wrapper to SWMM5 with parallel ongoing development of the USEPA Stormwater Management Model (SWMM5) application programming interface (API). ... The purpose of this work is to increase the utility of the SWMM dll by creating a Toolkit API for accessing its functionality. The utility of the Toolkit is further enhanced with a wrapper to allow access from the Python scripting language. This work is being prosecuted as part of an Open Source development strategy and is being performed by volunteer software developers.
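A minimal usage sketch of the PySWMM wrapper described above, stepping through a simulation from Python. The input file name is a placeholder, and the calls shown follow the PySWMM documentation at the time of writing.

```python
from pyswmm import Simulation  # pip install pyswmm

# Minimal PySWMM usage sketch: step through a SWMM5 simulation from Python via the
# Toolkit API wrapper. "model.inp" is a placeholder for an existing SWMM input file.
with Simulation("model.inp") as sim:
    for _ in sim:                     # each iteration advances one routing step
        print(sim.current_time)
```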
NASA Astrophysics Data System (ADS)
Piao, Chunhui; Han, Xufang; Wu, Harris
2010-08-01
We provide a formal definition of an e-commerce transaction network. Agent-based modelling is used to simulate e-commerce transaction networks. For real-world analysis, we studied the open application programming interfaces (APIs) from eBay and Taobao e-commerce websites and captured real transaction data. Pajek is used to visualise the agent relationships in the transaction network. We derived one-mode networks from the transaction network and analysed them using degree and betweenness centrality. Integrating multi-agent modelling, open APIs and social network analysis, we propose a new way to study large-scale e-commerce systems.
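The one-mode projection and centrality steps can be reproduced on toy data. The sketch below uses the networkx library for illustration (the abstract itself names Pajek for visualization); the transaction records are invented.

```python
import networkx as nx
from networkx.algorithms import bipartite

# Illustrative sketch (toy data, not eBay/Taobao records): build a buyer-seller
# transaction network, derive a one-mode buyer network, and compute the degree and
# betweenness centrality measures mentioned in the abstract.
transactions = [("buyer1", "sellerA"), ("buyer1", "sellerB"),
                ("buyer2", "sellerA"), ("buyer3", "sellerB")]

g = nx.Graph()
g.add_nodes_from({b for b, _ in transactions}, bipartite=0)
g.add_nodes_from({s for _, s in transactions}, bipartite=1)
g.add_edges_from(transactions)

buyers = {n for n, d in g.nodes(data=True) if d["bipartite"] == 0}
one_mode = bipartite.projected_graph(g, buyers)   # buyers linked via shared sellers

print("degree centrality:     ", nx.degree_centrality(one_mode))
print("betweenness centrality:", nx.betweenness_centrality(one_mode))
```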
Validation results of specifications for motion control interoperability
NASA Astrophysics Data System (ADS)
Szabo, Sandor; Proctor, Frederick M.
1997-01-01
The National Institute of Standards and Technology (NIST) is participating in the Department of Energy Technologies Enabling Agile Manufacturing (TEAM) program to establish interface standards for machine tool, robot, and coordinate measuring machine controllers. At NIST, the focus is to validate potential application programming interfaces (APIs) that make it possible to exchange machine controller components with a minimal impact on the rest of the system. This validation is taking place in the enhanced machine controller (EMC) consortium and is in cooperation with users and vendors of motion control equipment. An area of interest is motion control, including closed-loop control of individual axes and coordinated path planning. Initial tests of the motion control APIs are complete. The APIs were implemented on two commercial motion control boards that run on two different machine tools. The results for a baseline set of APIs look promising, but several issues were raised. These include resolving differing approaches in how motions are programmed and defining a standard measurement of performance for motion control. This paper starts with a summary of the process used in developing a set of specifications for motion control interoperability. Next, the EMC architecture and its classification of motion control APIs into two classes, Servo Control and Trajectory Planning, are reviewed. Selected APIs are presented to explain the basic functionality and some of the major issues involved in porting the APIs to other motion controllers. The paper concludes with a summary of the main issues and ways to continue the standards process.
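The two API classes named above can be sketched as abstract interfaces. This is an illustrative Python rendering, not the actual EMC C APIs; the method names and units are placeholders.

```python
# Illustrative sketch (not the actual EMC C APIs) of the two API classes the paper
# names: Servo Control for closed-loop control of individual axes and Trajectory
# Planning for coordinated path planning. Method names and units are placeholders.

from abc import ABC, abstractmethod

class ServoControlAPI(ABC):
    @abstractmethod
    def enable_axis(self, axis: int) -> None: ...
    @abstractmethod
    def set_position_setpoint(self, axis: int, position_mm: float) -> None: ...
    @abstractmethod
    def read_feedback(self, axis: int) -> float: ...

class TrajectoryPlanningAPI(ABC):
    @abstractmethod
    def move_linear(self, target_mm: tuple, feed_mm_s: float) -> None: ...
    @abstractmethod
    def move_circular(self, center_mm: tuple, target_mm: tuple, feed_mm_s: float) -> None: ...

# A vendor board is "interoperable" in this sense if a thin driver can implement
# both interfaces without changes to the application layered on top of them.
```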
Enabling complex queries to drug information sources through functional composition.
Peters, Lee; Mortensen, Jonathan; Nguyen, Thang; Bodenreider, Olivier
2013-01-01
Our objective was to enable an end-user to create complex queries to drug information sources through functional composition, by creating sequences of functions from application program interfaces (APIs) to drug terminologies. The development of a functional composition model seeks to link functions from two distinct APIs. An ontology was developed using Protégé to model the functions of the RxNorm and NDF-RT APIs by describing the semantics of their input and output. A set of rules was developed to define the interoperable conditions for functional composition. The operational definition of interoperability between function pairs is established by executing the rules on the ontology. We illustrate that the functional composition model supports common use cases, including checking interactions for RxNorm drugs and deploying allergy lists defined in reference to drug properties in NDF-RT. This model supports the RxMix application (http://mor.nlm.nih.gov/RxMix/), an application we developed for enabling complex queries to the RxNorm and NDF-RT APIs.
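The composition rule, that f and g are chainable only when f's output semantics match g's input semantics, is easy to make concrete. The sketch below is illustrative: the function names mimic the RxNorm use case but are toy stand-ins, not actual RxNorm or NDF-RT API calls.

```python
# Illustrative sketch of the functional-composition idea (not the actual RxNorm or
# NDF-RT API functions): each API function is annotated with the semantics of its
# input and output, and two functions may be chained only when those types match.

from dataclasses import dataclass
from typing import Callable

@dataclass
class ApiFunction:
    name: str
    input_type: str      # e.g. "drug_name", "rxcui", "ndfrt_concept"
    output_type: str
    call: Callable

def composable(f: ApiFunction, g: ApiFunction) -> bool:
    """Interoperability rule: f's output semantics must match g's input semantics."""
    return f.output_type == g.input_type

def compose(f: ApiFunction, g: ApiFunction) -> Callable:
    if not composable(f, g):
        raise TypeError(f"{f.name} -> {g.name} is not an interoperable pair")
    return lambda x: g.call(f.call(x))

# Toy stand-ins for terminology-API calls.
find_rxcui = ApiFunction("findRxcuiByName", "drug_name", "rxcui", lambda name: "1191")
get_interactions = ApiFunction("findInteractions", "rxcui", "interaction_list",
                               lambda rxcui: [f"interaction data for {rxcui}"])

if __name__ == "__main__":
    check_drug = compose(find_rxcui, get_interactions)
    print(check_drug("aspirin"))
```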
An application programming interface for CellNetAnalyzer.
Klamt, Steffen; von Kamp, Axel
2011-08-01
CellNetAnalyzer (CNA) is a MATLAB toolbox providing computational methods for studying structure and function of metabolic and cellular signaling networks. In order to allow non-experts to use these methods easily, CNA provides GUI-based interactive network maps as a means of parameter input and result visualization. However, with the availability of high-throughput data, there is a need to make CNA's functionality also accessible in batch mode for automatic data processing. Furthermore, as some algorithms of CNA are of general relevance for network analysis it would be desirable if they could be called as sub-routines by other applications. For this purpose, we developed an API (application programming interface) for CNA allowing users (i) to access the content of network models in CNA, (ii) to use CNA's network analysis capabilities independent of the GUI, and (iii) to interact with the GUI to facilitate the development of graphical plugins. Here we describe the organization of network projects in CNA and the application of the new API functions to these projects. This includes the creation of network projects from scratch, loading and saving of projects and scenarios, and the application of the actual analysis methods. Furthermore, API functions for the import/export of metabolic models in SBML format and for accessing the GUI are described. Lastly, two example applications demonstrate the use and versatile applicability of CNA's API. CNA is freely available for academic use and can be downloaded from http://www.mpi-magdeburg.mpg.de/projects/cna/cna.html. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
SWMM5 Application Programming Interface and PySWMM: A Python Interfacing Wrapper
In support of the OpenWaterAnalytics open source initiative, the PySWMM project encompasses the development of a Python interfacing wrapper to SWMM5 with parallel ongoing development of the USEPA Stormwater Management Model (SWMM5) application programming interface (API). ...
Determinants of quality, latency, and amount of Stack Overflow answers about recent Android APIs
Filkov, Vladimir
2018-01-01
Stack Overflow is a popular crowdsourced question and answer website for programming-related issues. It is an invaluable resource for software developers; on average, questions posted there get answered in minutes to an hour. Questions about well established topics, e.g., the coercion operator in C++, or the difference between canonical and class names in Java, get asked often in one form or another, and answered very quickly. On the other hand, questions on previously unseen or niche topics take a while to get a good answer. This is particularly the case with questions about current updates to or the introduction of new application programming interfaces (APIs). In a hyper-competitive online market, getting good answers to current programming questions sooner could increase the chances of an app getting released and used. So, can developers anyhow, e.g., hasten the speed to good answers to questions about new APIs? Here, we empirically study Stack Overflow questions pertaining to new Android APIs and their associated answers. We contrast the interest in these questions, their answer quality, and timeliness of their answers to questions about old APIs. We find that Stack Overflow answerers in general prioritize with respect to currentness: questions about new APIs do get more answers, but good quality answers take longer. We also find that incentives in terms of question bounties, if used appropriately, can significantly shorten the time and increase answer quality. Interestingly, no operationalization of bounty amount shows significance in our models. In practice, our findings confirm the value of bounties in enhancing expert participation. In addition, they show that the Stack Overflow style of crowdsourcing, for all its glory in providing answers about established programming knowledge, is less effective with new API questions. PMID:29547620
Determinants of quality, latency, and amount of Stack Overflow answers about recent Android APIs.
Kavaler, David; Filkov, Vladimir
2018-01-01
Stack Overflow is a popular crowdsourced question and answer website for programming-related issues. It is an invaluable resource for software developers; on average, questions posted there get answered in minutes to an hour. Questions about well established topics, e.g., the coercion operator in C++, or the difference between canonical and class names in Java, get asked often in one form or another, and answered very quickly. On the other hand, questions on previously unseen or niche topics take a while to get a good answer. This is particularly the case with questions about current updates to or the introduction of new application programming interfaces (APIs). In a hyper-competitive online market, getting good answers to current programming questions sooner could increase the chances of an app getting released and used. So, can developers anyhow, e.g., hasten the speed to good answers to questions about new APIs? Here, we empirically study Stack Overflow questions pertaining to new Android APIs and their associated answers. We contrast the interest in these questions, their answer quality, and timeliness of their answers to questions about old APIs. We find that Stack Overflow answerers in general prioritize with respect to currentness: questions about new APIs do get more answers, but good quality answers take longer. We also find that incentives in terms of question bounties, if used appropriately, can significantly shorten the time and increase answer quality. Interestingly, no operationalization of bounty amount shows significance in our models. In practice, our findings confirm the value of bounties in enhancing expert participation. In addition, they show that the Stack Overflow style of crowdsourcing, for all its glory in providing answers about established programming knowledge, is less effective with new API questions.
Fenix, A Fault Tolerant Programming Framework for MPI Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gamell, Marc; Teranishi, Keita; Valenzuela, Eric
2016-10-05
Fenix provides APIs to allow the users to add fault tolerance capability to MPI-based parallel programs in a transparent manner. Fenix-enabled programs can run through process failures during program execution using a pool of spare processes accommodated by Fenix.
An overview of the CellML API and its implementation
2010-01-01
Background: CellML is an XML based language for representing mathematical models, in a machine-independent form which is suitable for their exchange between different authors, and for archival in a model repository. Allowing for the exchange and archival of models in a computer readable form is a key strategic goal in bioinformatics, because of the associated improvements in scientific record accuracy, the faster iterative process of scientific development, and the ability to combine models into large integrative models. However, for CellML models to be useful, tools which can process them correctly are needed. Due to some of the more complex features present in CellML models, such as imports, developing code ab initio to correctly process models can be an onerous task. For this reason, there is a clear and pressing need for an application programming interface (API), and a good implementation of that API, upon which tools can base their support for CellML. Results: We developed an API which allows the information in CellML models to be retrieved and/or modified. We also developed a series of optional extension APIs, for tasks such as simplifying the handling of connections between variables, dealing with physical units, validating models, and translating models into different procedural languages. We have also provided a Free/Open Source implementation of this application programming interface, optimised to achieve good performance. Conclusions: Tools have been developed using the API which are mature enough for widespread use. The API has the potential to accelerate the development of additional tools capable of processing CellML, and ultimately lead to an increased level of sharing of mathematical model descriptions. PMID:20377909
An overview of the CellML API and its implementation.
Miller, Andrew K; Marsh, Justin; Reeve, Adam; Garny, Alan; Britten, Randall; Halstead, Matt; Cooper, Jonathan; Nickerson, David P; Nielsen, Poul F
2010-04-08
CellML is an XML based language for representing mathematical models, in a machine-independent form which is suitable for their exchange between different authors, and for archival in a model repository. Allowing for the exchange and archival of models in a computer readable form is a key strategic goal in bioinformatics, because of the associated improvements in scientific record accuracy, the faster iterative process of scientific development, and the ability to combine models into large integrative models. However, for CellML models to be useful, tools which can process them correctly are needed. Due to some of the more complex features present in CellML models, such as imports, developing code ab initio to correctly process models can be an onerous task. For this reason, there is a clear and pressing need for an application programming interface (API), and a good implementation of that API, upon which tools can base their support for CellML. We developed an API which allows the information in CellML models to be retrieved and/or modified. We also developed a series of optional extension APIs, for tasks such as simplifying the handling of connections between variables, dealing with physical units, validating models, and translating models into different procedural languages. We have also provided a Free/Open Source implementation of this application programming interface, optimised to achieve good performance. Tools have been developed using the API which are mature enough for widespread use. The API has the potential to accelerate the development of additional tools capable of processing CellML, and ultimately lead to an increased level of sharing of mathematical model descriptions.
Adaptive runtime for a multiprocessing API
Antao, Samuel F.; Bertolli, Carlo; Eichenberger, Alexandre E.; O'Brien, John K.
2016-11-15
A computer-implemented method includes selecting a runtime for executing a program. The runtime includes a first combination of feature implementations, where each feature implementation implements a feature of an application programming interface (API). Execution of the program is monitored, and the execution uses the runtime. Monitor data is generated based on the monitoring. A second combination of feature implementations are selected, by a computer processor, where the selection is based at least in part on the monitor data. The runtime is modified by activating the second combination of feature implementations to replace the first combination of feature implementations.
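The monitor-and-swap loop described above can be sketched with a simple strategy swap. The feature implementations, monitor metric, and threshold below are invented for illustration and are not the patented method's actual mechanics.

```python
# Illustrative sketch of the described method: run with one combination of feature
# implementations, monitor execution, and activate a different combination based on
# the monitor data. The "features" and threshold here are invented for illustration.

import time

def reduction_serial(values):   # feature implementation A
    return sum(values)

def reduction_chunked(values):  # feature implementation B (stand-in alternative)
    chunk = 1024
    return sum(sum(values[i:i + chunk]) for i in range(0, len(values), chunk))

class AdaptiveRuntime:
    def __init__(self):
        self.active = {"reduction": reduction_serial}   # first combination
        self.monitor = {}

    def run(self, values):
        start = time.perf_counter()
        result = self.active["reduction"](values)
        self.monitor["reduction_seconds"] = time.perf_counter() - start
        return result

    def adapt(self):
        # Select a second combination of feature implementations from the monitor data.
        if self.monitor.get("reduction_seconds", 0.0) > 0.05:
            self.active["reduction"] = reduction_chunked

if __name__ == "__main__":
    rt = AdaptiveRuntime()
    data = list(range(2_000_000))
    print(rt.run(data))
    rt.adapt()
    print(rt.run(data))
```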
Adaptive runtime for a multiprocessing API
Antao, Samuel F.; Bertolli, Carlo; Eichenberger, Alexandre E.; O'Brien, John K.
2016-10-11
A computer-implemented method includes selecting a runtime for executing a program. The runtime includes a first combination of feature implementations, where each feature implementation implements a feature of an application programming interface (API). Execution of the program is monitored, and the execution uses the runtime. Monitor data is generated based on the monitoring. A second combination of feature implementations are selected, by a computer processor, where the selection is based at least in part on the monitor data. The runtime is modified by activating the second combination of feature implementations to replace the first combination of feature implementations.
Using the RxNorm web services API for quality assurance purposes.
Peters, Lee; Bodenreider, Olivier
2008-11-06
Auditing large, rapidly evolving terminological systems is still a challenge. In the case of RxNorm, a standardized nomenclature for clinical drugs, we argue that quality assurance processes can benefit from the recently released application programming interface (API) provided by RxNav. We demonstrate the usefulness of the API by performing a systematic comparison of alternative paths in the RxNorm graph, over several thousands of drug entities. This study revealed potential errors in RxNorm, currently under review. The results also prompted us to modify the implementation of RxNav to navigate the RxNorm graph more accurately. The RxNav web services API used in this experiment is robust and fast.
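For reference, a small Python example of the kind of RxNav web-service call the study relies on. The URL pattern and response fields follow RxNav's public documentation at the time of writing and should be verified before use.

```python
import requests

# RxNav's RESTful interface to RxNorm (URL pattern as documented by RxNav at the
# time of writing; verify against the current RxNav documentation).
BASE = "https://rxnav.nlm.nih.gov/REST"

def rxcui_for_name(drug_name: str) -> list:
    """Resolve a drug name to RxNorm concept identifiers (RxCUIs)."""
    resp = requests.get(f"{BASE}/rxcui.json", params={"name": drug_name}, timeout=30)
    resp.raise_for_status()
    return resp.json().get("idGroup", {}).get("rxnormId", [])

if __name__ == "__main__":
    print("aspirin ->", rxcui_for_name("aspirin"))
```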
Using the RxNorm Web Services API for Quality Assurance Purposes
Peters, Lee; Bodenreider, Olivier
2008-01-01
Auditing large, rapidly evolving terminological systems is still a challenge. In the case of RxNorm, a standardized nomenclature for clinical drugs, we argue that quality assurance processes can benefit from the recently released application programming interface (API) provided by RxNav. We demonstrate the usefulness of the API by performing a systematic comparison of alternative paths in the RxNorm graph, over several thousands of drug entities. This study revealed potential errors in RxNorm, currently under review. The results also prompted us to modify the implementation of RxNav to navigate the RxNorm graph more accurately. The RxNorm web services API used in this experiment is robust and fast. PMID:18999038
NASA Technical Reports Server (NTRS)
Barry, Matthew R.; Osborne, Richard N.
2005-01-01
The RoseDoclet computer program extends the capability of Java doclet software to automatically synthesize Unified Modeling Language (UML) content from Java language source code. [Doclets are Java-language programs that use the doclet application programming interface (API) to specify the content and format of the output of Javadoc. Javadoc is a program, originally designed to generate API documentation from Java source code, now also useful as an extensible engine for processing Java source code.] RoseDoclet takes advantage of Javadoc comments and tags already in the source code to produce a UML model of that code. RoseDoclet applies the doclet API to create a doclet passed to Javadoc. The Javadoc engine applies the doclet to the source code, emitting the output format specified by the doclet. RoseDoclet emits a Rose model file and populates it with fully documented packages, classes, methods, variables, and class diagrams identified in the source code. The way in which UML models are generated can be controlled by use of new Javadoc comment tags that RoseDoclet provides. The advantage of using RoseDoclet is that Javadoc documentation becomes leveraged for two purposes: documenting the as-built API and keeping the design documentation up to date.
NASA Astrophysics Data System (ADS)
Manuaba, I. B. P.; Rudiastini, E.
2018-01-01
Lecturer assessment is a tool used to measure lecturer performance. Lecturer assessment variables can be measured across three aspects: teaching activities, research, and community service. The broad scope of measuring lecturer performance requires a dedicated framework so that the system can be developed in a sustainable manner. The aim of this research is to create an API web service data tool so that the lecturer assessment system can be developed in various frameworks. The system was developed as a web service in the PHP programming language, with output delivered as JSON data. The conclusion of this research is that an API web service data application can be developed for several platforms, such as web and mobile applications.
Connected Lighting System Interoperability Study Part 1: Application Programming Interfaces
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaidon, Clement; Poplawski, Michael
This report, the first in a series of studies focusing on interoperability as realized by the use of Application Programming Interfaces (APIs), explores the diversity of such interfaces in several connected lighting systems; characterizes the extent of interoperability that they provide; and illustrates challenges, limitations, and tradeoffs that were encountered during this exploration.
Search without Boundaries Using Simple APIs
Tong, Qi
2009-01-01
The U.S. Geological Survey (USGS) Library, where the author serves as the digital services librarian, is increasingly challenged to make it easier for users to find information from many heterogeneous information sources. Information is scattered throughout different software applications (i.e., library catalog, federated search engine, link resolver, and vendor websites), and each specializes in one thing. How could the library integrate the functionalities of one application with another and provide a single point of entry for users to search across? To improve the user experience, the library launched an effort to integrate the federated search engine into the library's intranet website. The result is a simple search box that leverages the federated search engine's built-in application programming interfaces (APIs). In this article, the author describes how this project demonstrated the power of APIs and their potential to be used by other enterprise search portals inside or outside of the library.
MetNetAPI: A flexible method to access and manipulate biological network data from MetNet
2010-01-01
Background: Convenient programmatic access to different biological databases allows automated integration of scientific knowledge. Many databases support a function to download files or data snapshots, or a webservice that offers "live" data. However, the functionality that a database offers cannot be represented in a static data download file, and webservices may consume considerable computational resources from the host server. Results: MetNetAPI is a versatile Application Programming Interface (API) to the MetNetDB database. It abstracts, captures and retains operations away from a biological network repository and website. A range of database functions, previously only available online, can be immediately (and independently from the website) applied to a dataset of interest. Data is available in four layers: molecular entities, localized entities (linked to a specific organelle), interactions, and pathways. Navigation between these layers is intuitive (e.g. one can request the molecular entities in a pathway, as well as request in what pathways a specific entity participates). Data retrieval can be customized: Network objects allow the construction of new and integration of existing pathways and interactions, which can be uploaded back to our server. In contrast to webservices, the computational demand on the host server is limited to processing data-related queries only. Conclusions: An API provides several advantages to a systems biology software platform. MetNetAPI illustrates an interface with a central repository of data that represents the complex interrelationships of a metabolic and regulatory network. As an alternative to data-dumps and webservices, it allows access to a current and "live" database and exposes analytical functions to application developers. Yet it only requires limited resources on the server-side (thin server/fat client setup). The API is available for Java, Microsoft.NET and R programming environments and offers flexible query and broad data-retrieval methods. Data retrieval can be customized to client needs and the API offers a framework to construct and manipulate user-defined networks. The design principles can be used as a template to build programmable interfaces for other biological databases. The API software and tutorials are available at http://www.metnetonline.org/api. PMID:21083943
Analyzing Spacecraft Telecommunication Systems
NASA Technical Reports Server (NTRS)
Kordon, Mark; Hanks, David; Gladden, Roy; Wood, Eric
2004-01-01
Multi-Mission Telecom Analysis Tool (MMTAT) is a C-language computer program for analyzing proposed spacecraft telecommunication systems. MMTAT utilizes parameterized input and computational models that can be run on standard desktop computers to perform fast and accurate analyses of telecommunication links. MMTAT is easy to use and can easily be integrated with other software applications and run as part of almost any computational simulation. It is distributed as either a stand-alone application program with a graphical user interface or a linkable library with a well-defined set of application programming interface (API) calls. As a stand-alone program, MMTAT provides both textual and graphical output. The graphs make it possible to understand, quickly and easily, how telecommunication performance varies with variations in input parameters. A delimited text file that can be read by any spreadsheet program is generated at the end of each run. The API in the linkable-library form of MMTAT enables the user to control simulation software and to change parameters during a simulation run. Results can be retrieved either at the end of a run or by use of a function call at any time step.
Heterogeneous scalable framework for multiphase flows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morris, Karla Vanessa
2013-09-01
Two categories of challenges confront the developer of computational spray models: those related to the computation and those related to the physics. Regarding the computation, the trend towards heterogeneous, multi- and many-core platforms will require considerable re-engineering of codes written for the current supercomputing platforms. Regarding the physics, accurate methods for transferring mass, momentum and energy from the dispersed phase onto the carrier fluid grid have so far eluded modelers. Significant challenges also lie at the intersection between these two categories. To be competitive, any physics model must be expressible in a parallel algorithm that performs well on evolving computer platforms. This work created an application based on a software architecture where the physics and software concerns are separated in a way that adds flexibility to both. The developed spray-tracking package includes an application programming interface (API) that abstracts away the platform-dependent parallelization concerns, enabling the scientific programmer to write serial code that the API resolves into parallel processes and threads of execution. The project also developed the infrastructure required to provide similar APIs to other applications. The API allows object-oriented Fortran applications to interact directly with Trilinos to support memory management of distributed objects on central processing unit (CPU) and graphics processing unit (GPU) nodes for applications using C++.
Universal programming interface with concurrent access
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alferov, Oleg
2004-10-07
There exist a number of devices with a positioning nature of operation, such as mechanical linear stages, temperature controllers, or filterwheels with discrete states, and most of them have different programming interfaces. The Universal Positioner software handles all of them with a single approach: a particular hardware driver is created from the template by translating the actual commands used by the hardware to and from the universal programming interface. The software contains the universal API module itself, a demo simulation of hardware, and front-end programs to help developers write their own software drivers, along with example drivers for actual hardware controllers. The software allows user application programs to call devices simultaneously without race conditions (multitasking and concurrent access). The template suggested in this package permits developers to integrate various devices easily into their applications using the same API. The drivers can be stacked; i.e., they can call each other via the same interface.
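As a hedged illustration of the driver-template idea described above, the sketch below shows a universal positioning interface and one simulated driver; all names are hypothetical and do not reproduce the Universal Positioner package itself.

```python
# Hypothetical driver-template sketch: each driver translates the universal
# calls (move_to, get_position) into its own hardware command syntax.
from abc import ABC, abstractmethod


class PositionerDriver(ABC):
    """Universal positioning interface that every driver exposes."""

    @abstractmethod
    def move_to(self, position: float) -> None: ...

    @abstractmethod
    def get_position(self) -> float: ...


class SimulatedStage(PositionerDriver):
    """Stands in for a real linear stage; a real driver would write to a serial port."""

    def __init__(self) -> None:
        self._pos = 0.0

    def move_to(self, position: float) -> None:
        # A hardware driver might send e.g. "MOVE <pos>" to the controller here.
        self._pos = position

    def get_position(self) -> float:
        return self._pos


def scan(driver: PositionerDriver, positions):
    """Application code sees only the universal API, never the hardware syntax."""
    for p in positions:
        driver.move_to(p)
        yield driver.get_position()


print(list(scan(SimulatedStage(), [0.0, 1.5, 3.0])))
```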
NASA Astrophysics Data System (ADS)
Ward, K.
2015-12-01
Hidden within the terabytes of imagery in NASA's Global Imagery Browse Services (GIBS) collection are hundreds of daily natural events. Some events are newsworthy, devastating, and visibly obvious at a global scale, others are merely regional curiosities. Regardless of the scope and significance of any one event, it is likely that multiple GIBS layers can be viewed to provide a multispectral, dataset-based view of the event. To facilitate linking between the discrete event and the representative dataset imagery, NASA's Earth Observatory Group has developed a prototype application programming interface (API): the Earth Observatory Natural Event Tracker (EONET). EONET supports an API model that allows users to retrieve event-specific metadata--date/time, location, and type (wildfire, storm, etc.)--and web service layer-specific metadata which can be used to link to event-relevant dataset imagery in GIBS. GIBS' ability to ingest many near real time datasets, combined with its growing archive of past imagery, means that API users will be able to develop client applications that not only show ongoing events but can also look at imagery from before and after. In our poster, we will present the API and show examples of its use.
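A minimal sketch of an EONET query follows; the endpoint URL, parameters, and response fields are taken from the public EONET documentation as best recalled and should be verified against the current API reference.

```python
# Minimal sketch of querying EONET for recent open events; the endpoint and
# parameter names are assumed from public EONET documentation and may change.
import requests

EONET_EVENTS = "https://eonet.gsfc.nasa.gov/api/v3/events"  # assumed endpoint

resp = requests.get(EONET_EVENTS, params={"status": "open", "limit": 5}, timeout=30)
resp.raise_for_status()
for event in resp.json().get("events", []):
    # Each event carries a title, category list, and dated geometries that can
    # be matched against GIBS imagery layers for the same date and location.
    print(event["title"], [c["title"] for c in event["categories"]])
```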
From Petascale to Exascale: Eight Focus Areas of R&D Challenges for HPC Simulation Environments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Springmeyer, R; Still, C; Schulz, M
2011-03-17
Programming models bridge the gap between the underlying hardware architecture and the supporting layers of software available to applications. Programming models are different from both programming languages and application programming interfaces (APIs). Specifically, a programming model is an abstraction of the underlying computer system that allows for the expression of both algorithms and data structures. In comparison, languages and APIs provide implementations of these abstractions and allow the algorithms and data structures to be put into practice - a programming model exists independently of the choice of both the programming language and the supporting APIs. Programming models are typically focused on achieving increased developer productivity, performance, and portability to other system designs. The rapidly changing nature of processor architectures and the complexity of designing an exascale platform provide significant challenges for these goals. Several other factors are likely to impact the design of future programming models. In particular, the representation and management of increasing levels of parallelism, concurrency and memory hierarchies, combined with the ability to maintain a progressive level of interoperability with today's applications are of significant concern. Overall the design of a programming model is inherently tied not only to the underlying hardware architecture, but also to the requirements of applications and libraries including data analysis, visualization, and uncertainty quantification. Furthermore, the successful implementation of a programming model is dependent on exposed features of the runtime software layers and features of the operating system. Successful use of a programming model also requires effective presentation to the software developer within the context of traditional and new software development tools. Consideration must also be given to the impact of programming models on both languages and the associated compiler infrastructure. Exascale programming models must reflect several, often competing, design goals. These design goals include desirable features such as abstraction and separation of concerns. However, some aspects are unique to large-scale computing. For example, interoperability and composability with existing implementations will prove critical. In particular, performance is the essential underlying goal for large-scale systems. A key evaluation metric for exascale models will be the extent to which they support these goals rather than merely enable them.
CMR Catalog Service for the Web
NASA Technical Reports Server (NTRS)
Newman, Doug; Mitchell, Andrew
2016-01-01
With the impending retirement of Global Change Master Directory (GCMD) Application Programming Interfaces (APIs) the Common Metadata Repository (CMR) was charged with providing a collection-level Catalog Service for the Web (CSW) that provided the same level of functionality as GCMD. This talk describes the capabilities of the CMR CSW API with particular reference to the support of the Committee on Earth Observation Satellites (CEOS) Working Group on Information Systems and Services (WGISS) Integrated Catalog (CWIC).
GLobal Integrated Design Environment (GLIDE): A Concurrent Engineering Application
NASA Technical Reports Server (NTRS)
McGuire, Melissa L.; Kunkel, Matthew R.; Smith, David A.
2010-01-01
The GLobal Integrated Design Environment (GLIDE) is a client-server software application purpose-built to mitigate issues associated with real time data sharing in concurrent engineering environments and to facilitate discipline-to-discipline interaction between multiple engineers and researchers. GLIDE is implemented in multiple programming languages utilizing standardized web protocols to enable secure parameter data sharing between engineers and researchers across the Internet in closed and/or widely distributed working environments. A well defined, HyperText Transfer Protocol (HTTP) based Application Programming Interface (API) to the GLIDE client/server environment enables users to interact with GLIDE, and each other, within common and familiar tools. One such common tool, Microsoft Excel (Microsoft Corporation), paired with its add-in API for GLIDE, is discussed in this paper. The top-level examples given demonstrate how this interface improves the efficiency of the design process of a concurrent engineering study while reducing potential errors associated with manually sharing information between study participants.
Kougoulos, Eleftherios; Smales, Ian; Verrier, Hugh M
2011-03-01
A novel experimental approach describing the integration of drug substance and drug production design using particle engineering techniques such as sonocrystallization, high shear wet milling (HSWM) and dry impact (hammer) milling was used to manufacture samples of an active pharmaceutical ingredient (API) with diverse particle sizes and size distributions. The API instability was addressed using particle engineering and through judicious selection of excipients to reduce degradation reactions. API produced using a conventional batch cooling crystallization process resulted in content uniformity issues. Hammer milling increased fine particle formation resulting in reduced content uniformity and increased degradation compared to sonocrystallized and HSWM API in the formulation. To ensure at least a 2-year shelf life based on predictions using an Accelerated Stability Assessment Program, this API should have a D [v, 0.1] of 55 μm and a D [v, 0.5] of 140 μm. The particle size of the chief excipient in the drug product formulation needed to be close to that of the API to avoid content uniformity and stability issues but large enough to reduce lactam formation. The novel methodology described here has potential for application to other APIs. © 2011 American Association of Pharmaceutical Scientists
Dipole Models for UXO Discrimination at Live Sites - Pole Mountain
2012-06-01
ESTCP MR-201159, Pole Mountain Demonstration Report, April 2012. [Extracted report fragments: front matter, acronym list (API, Application Programming Interface), and anomaly data tables are not reproduced here.] The report notes ongoing work on transitioning the inversion algorithms to an API that will be generally accessible.
Investigating an API for resilient exascale computing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stearley, Jon R.; Tomkins, James; VanDyke, John P.
2013-05-01
Increased HPC capability comes with increased complexity, part counts, and fault occurrences. Increasing the resilience of systems and applications to faults is a critical requirement facing the viability of exascale systems, as the overhead of traditional checkpoint/restart is projected to outweigh its benefits due to fault rates outpacing I/O bandwidths. As faults occur and propagate throughout hardware and software layers, pervasive notification and handling mechanisms are necessary. This report describes an initial investigation of fault types and programming interfaces to mitigate them. Proof-of-concept APIs are presented for the frequent and important cases of memory errors and node failures, and a strategy is proposed for filesystem failures. These involve changes to the operating system, runtime, I/O library, and application layers. While a single API for fault handling among hardware and OS and application system-wide remains elusive, the effort increased our understanding of both the mountainous challenges and the promising trailheads.
49 CFR 195.440 - Public awareness.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Petroleum Institute's (API) Recommended Practice (RP) 1162 (incorporated by reference, see § 195.3). (b) The operator's program must follow the general program recommendations of API RP 1162 and assess the unique... general program recommendations, including baseline and supplemental requirements of API RP 1162, unless...
49 CFR 195.440 - Public awareness.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Petroleum Institute's (API) Recommended Practice (RP) 1162 (incorporated by reference, see § 195.3). (b) The operator's program must follow the general program recommendations of API RP 1162 and assess the unique... general program recommendations, including baseline and supplemental requirements of API RP 1162, unless...
49 CFR 195.440 - Public awareness.
Code of Federal Regulations, 2012 CFR
2012-10-01
... Petroleum Institute's (API) Recommended Practice (RP) 1162 (incorporated by reference, see § 195.3). (b) The operator's program must follow the general program recommendations of API RP 1162 and assess the unique... general program recommendations, including baseline and supplemental requirements of API RP 1162, unless...
Sakhteman, Amirhossein; Zare, Bijan
2016-01-01
An interactive application, Modelface, was presented for Modeller software based on windows platform. The application is able to run all steps of homology modeling including pdb to fasta generation, running clustal, model building and loop refinement. Other modules of modeler including energy calculation, energy minimization and the ability to make single point mutations in the PDB structures are also implemented inside Modelface. The API is a simple batch based application with no memory occupation and is free of charge for academic use. The application is also able to repair missing atom types in the PDB structures making it suitable for many molecular modeling studies such as docking and molecular dynamic simulation. Some successful instances of modeling studies using Modelface are also reported. PMID:28243276
JGromacs: a Java package for analyzing protein simulations.
Münz, Márton; Biggin, Philip C
2012-01-23
In this paper, we introduce JGromacs, a Java API (Application Programming Interface) that facilitates the development of cross-platform data analysis applications for Molecular Dynamics (MD) simulations. The API supports parsing and writing file formats applied by GROMACS (GROningen MAchine for Chemical Simulations), one of the most widely used MD simulation packages. JGromacs builds on the strengths of object-oriented programming in Java by providing a multilevel object-oriented representation of simulation data to integrate and interconvert sequence, structure, and dynamics information. The easy-to-learn, easy-to-use, and easy-to-extend framework is intended to simplify and accelerate the implementation and development of complex data analysis algorithms. Furthermore, a basic analysis toolkit is included in the package. The programmer is also provided with simple tools (e.g., XML-based configuration) to create applications with a user interface resembling the command-line interface of GROMACS applications. JGromacs and detailed documentation is freely available from http://sbcb.bioch.ox.ac.uk/jgromacs under a GPLv3 license .
JGromacs: A Java Package for Analyzing Protein Simulations
2011-01-01
In this paper, we introduce JGromacs, a Java API (Application Programming Interface) that facilitates the development of cross-platform data analysis applications for Molecular Dynamics (MD) simulations. The API supports parsing and writing file formats applied by GROMACS (GROningen MAchine for Chemical Simulations), one of the most widely used MD simulation packages. JGromacs builds on the strengths of object-oriented programming in Java by providing a multilevel object-oriented representation of simulation data to integrate and interconvert sequence, structure, and dynamics information. The easy-to-learn, easy-to-use, and easy-to-extend framework is intended to simplify and accelerate the implementation and development of complex data analysis algorithms. Furthermore, a basic analysis toolkit is included in the package. The programmer is also provided with simple tools (e.g., XML-based configuration) to create applications with a user interface resembling the command-line interface of GROMACS applications. Availability: JGromacs and detailed documentation is freely available from http://sbcb.bioch.ox.ac.uk/jgromacs under a GPLv3 license. PMID:22191855
Developing Cancer Informatics Applications and Tools Using the NCI Genomic Data Commons API.
Wilson, Shane; Fitzsimons, Michael; Ferguson, Martin; Heath, Allison; Jensen, Mark; Miller, Josh; Murphy, Mark W; Porter, James; Sahni, Himanso; Staudt, Louis; Tang, Yajing; Wang, Zhining; Yu, Christine; Zhang, Junjun; Ferretti, Vincent; Grossman, Robert L
2017-11-01
The NCI Genomic Data Commons (GDC) was launched in 2016 and makes available over 4 petabytes (PB) of cancer genomic and associated clinical data to the research community. This dataset continues to grow and currently includes over 14,500 patients. The GDC is an example of a biomedical data commons, which collocates biomedical data with storage and computing infrastructure and commonly used web services, software applications, and tools to create a secure, interoperable, and extensible resource for researchers. The GDC is (i) a data repository for downloading data that have been submitted to it, and also a system that (ii) applies a common set of bioinformatics pipelines to submitted data; (iii) reanalyzes existing data when new pipelines are developed; and (iv) allows users to build their own applications and systems that interoperate with the GDC using the GDC Application Programming Interface (API). We describe the GDC API and how it has been used both by the GDC itself and by third parties. Cancer Res; 77(21); e15-18. ©2017 AACR . ©2017 American Association for Cancer Research.
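As a hedged illustration, the sketch below queries the public GDC API for a few projects; the endpoint path and field names follow the GDC API documentation as best recalled and should be checked against the current reference.

```python
# Hedged sketch of a GDC API query; the endpoint and field names below are
# assumptions based on the public GDC API docs and should be verified there.
import requests

resp = requests.get(
    "https://api.gdc.cancer.gov/projects",             # assumed endpoint
    params={"size": 5, "fields": "project_id,name"},   # assumed field names
    timeout=30,
)
resp.raise_for_status()
for hit in resp.json()["data"]["hits"]:
    print(hit.get("project_id"), "-", hit.get("name"))
```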
A Shellcode Detection Method Based on Full Native API Sequence and Support Vector Machine
NASA Astrophysics Data System (ADS)
Cheng, Yixuan; Fan, Wenqing; Huang, Wei; An, Jing
2017-09-01
Dynamically monitoring the behavior of a program is widely used to discriminate between benign programs and malware. Such monitoring is usually based on dynamic characteristics of a program, such as the API call sequence or API call frequency. The key innovation of this paper is to consider the full Native API sequence and use a support vector machine to detect shellcode. We also use a Markov chain to extract and digitize Native API sequence features. Our experimental results show that the method proposed in this paper achieves high accuracy with a low false detection rate.
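The following sketch is not the authors' code; it illustrates, under stated assumptions, how Markov-chain transition frequencies over a toy Native API vocabulary can feed a support vector machine.

```python
# Illustrative sketch: turn Native API call sequences into Markov transition-
# frequency features and train an SVM on toy labeled data (1 = shellcode-like).
import numpy as np
from sklearn.svm import SVC

API_CALLS = ["NtOpenFile", "NtReadFile", "NtWriteFile", "NtCreateThreadEx"]
INDEX = {name: i for i, name in enumerate(API_CALLS)}


def markov_features(sequence):
    """Row-normalized transition counts between consecutive API calls."""
    counts = np.zeros((len(API_CALLS), len(API_CALLS)))
    for a, b in zip(sequence, sequence[1:]):
        counts[INDEX[a], INDEX[b]] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    probs = np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)
    return probs.ravel()


sequences = [
    (["NtOpenFile", "NtReadFile", "NtReadFile"], 0),
    (["NtOpenFile", "NtWriteFile", "NtCreateThreadEx"], 1),
    (["NtReadFile", "NtOpenFile", "NtReadFile"], 0),
    (["NtWriteFile", "NtCreateThreadEx", "NtCreateThreadEx"], 1),
]
X = np.array([markov_features(seq) for seq, _ in sequences])
y = np.array([label for _, label in sequences])

clf = SVC(kernel="linear").fit(X, y)
print(clf.predict([markov_features(["NtOpenFile", "NtWriteFile", "NtCreateThreadEx"])]))
```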
Enabling Mobile Air Quality App Development with an AirNow API
NASA Astrophysics Data System (ADS)
Dye, T.; White, J. E.; Ludewig, S. A.; Dickerson, P.; Healy, A. N.; West, J. W.; Prince, L. A.
2013-12-01
The U.S. Environmental Protection Agency's (EPA) AirNow program works with over 130 participating state, local, and federal air quality agencies to obtain, quality control, and store real-time air quality observations and forecasts. From these data, the AirNow system generates thousands of maps and products each hour. Each day, information from AirNow is published online and in other media to assist the public in making health-based decisions related to air quality. However, an increasing number of people use mobile devices as their primary tool for obtaining information, and AirNow has responded to this trend by publishing an easy-to-use Web API that is useful for mobile app developers. This presentation will describe the various features of the AirNow application programming interface (API), including Representational State Transfer (REST)-type web services, file outputs, and RSS feeds. In addition, a web portal for the AirNow API will be shown, including documentation on use of the system, a query tool for configuring and running web services, and general information about the air quality data and forecasts available. Data published via the AirNow API includes corresponding Air Quality Index (AQI) levels for each pollutant. We will highlight examples of mobile apps that are using the AirNow API to provide location-based, real-time air quality information. Examples will include mobile apps developed for Minnesota ('Minnesota Air') and Washington, D.C. ('Clean Air Partners Air Quality'), and an app developed by EPA ('EPA AirNow').
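A hedged sketch of an AirNow API observation query follows; the endpoint, parameter names, and response fields reflect the public AirNow API documentation as best recalled, and the API key is a placeholder.

```python
# Hedged sketch of an AirNow current-observation query by ZIP code; verify the
# endpoint and parameter names against the AirNow API portal before use.
import requests

params = {
    "format": "application/json",
    "zipCode": "20002",
    "distance": 25,
    "API_KEY": "YOUR_AIRNOW_KEY",  # obtain a key from the AirNow API portal
}
resp = requests.get("https://www.airnowapi.org/aq/observation/zipCode/current/",
                    params=params, timeout=30)
resp.raise_for_status()
for obs in resp.json():
    # Each observation includes the pollutant name, AQI value, and AQI category.
    print(obs.get("ParameterName"), obs.get("AQI"), obs.get("Category", {}).get("Name"))
```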
Home Energy Scoring Tools (website) and Application Programming Interfaces, APIs (aka HEScore)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mills, Evan; Bourassa, Norm; Rainer, Leo
A web-based residential energy rating tool with APIs, running on the LBNL website, that provides customized estimates of residential energy use and energy bills based on building description information provided by the user. Energy use is estimated using engineering models developed at LBNL. Space heating and cooling use is based on the DOE-2.1E building simulation model. Other end uses (water heating, appliances, lighting, and misc. equipment) are based on engineering models developed by LBNL.
Home Energy Scoring Tools (website) and Application Programming Interfaces, APIs (aka HEScore)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mills, Evan; Bourassa, Norm; Rainer, Leo
2016-04-22
A web-based residential energy rating tool with APIs, running on the LBNL website, that provides customized estimates of residential energy use and energy bills based on building description information provided by the user. Energy use is estimated using engineering models developed at LBNL. Space heating and cooling use is based on the DOE-2.1E building simulation model. Other end uses (water heating, appliances, lighting, and misc. equipment) are based on engineering models developed by LBNL.
The Ruby UCSC API: accessing the UCSC genome database using Ruby.
Mishima, Hiroyuki; Aerts, Jan; Katayama, Toshiaki; Bonnal, Raoul J P; Yoshiura, Koh-ichiro
2012-09-21
The University of California, Santa Cruz (UCSC) genome database is among the most used sources of genomic annotation in human and other organisms. The database offers an excellent web-based graphical user interface (the UCSC genome browser) and several means for programmatic queries. A simple application programming interface (API) in a scripting language aimed at the biologist was however not yet available. Here, we present the Ruby UCSC API, a library to access the UCSC genome database using Ruby. The API is designed as a BioRuby plug-in and built on the ActiveRecord 3 framework for the object-relational mapping, making writing SQL statements unnecessary. The current version of the API supports databases of all organisms in the UCSC genome database including human, mammals, vertebrates, deuterostomes, insects, nematodes, and yeast. The API uses the bin index, if available, when querying for genomic intervals. The API also supports genomic sequence queries using locally downloaded *.2bit files that are not stored in the official MySQL database. The API is implemented in pure Ruby and is therefore available in different environments and with different Ruby interpreters (including JRuby). Assisted by the straightforward object-oriented design of Ruby and ActiveRecord, the Ruby UCSC API will facilitate biologists to query the UCSC genome database programmatically. The API is available through the RubyGem system. Source code and documentation are available at https://github.com/misshie/bioruby-ucsc-api/ under the Ruby license. Feedback and help is provided via the website at http://rubyucscapi.userecho.com/.
The Ruby UCSC API: accessing the UCSC genome database using Ruby
2012-01-01
Background The University of California, Santa Cruz (UCSC) genome database is among the most used sources of genomic annotation in human and other organisms. The database offers an excellent web-based graphical user interface (the UCSC genome browser) and several means for programmatic queries. A simple application programming interface (API) in a scripting language aimed at the biologist was however not yet available. Here, we present the Ruby UCSC API, a library to access the UCSC genome database using Ruby. Results The API is designed as a BioRuby plug-in and built on the ActiveRecord 3 framework for the object-relational mapping, making writing SQL statements unnecessary. The current version of the API supports databases of all organisms in the UCSC genome database including human, mammals, vertebrates, deuterostomes, insects, nematodes, and yeast. The API uses the bin index—if available—when querying for genomic intervals. The API also supports genomic sequence queries using locally downloaded *.2bit files that are not stored in the official MySQL database. The API is implemented in pure Ruby and is therefore available in different environments and with different Ruby interpreters (including JRuby). Conclusions Assisted by the straightforward object-oriented design of Ruby and ActiveRecord, the Ruby UCSC API will facilitate biologists to query the UCSC genome database programmatically. The API is available through the RubyGem system. Source code and documentation are available at https://github.com/misshie/bioruby-ucsc-api/ under the Ruby license. Feedback and help is provided via the website at http://rubyucscapi.userecho.com/. PMID:22994508
Shuttle-Data-Tape XML Translator
NASA Technical Reports Server (NTRS)
Barry, Matthew R.; Osborne, Richard N.
2005-01-01
JSDTImport is a computer program for translating native Shuttle Data Tape (SDT) files from American Standard Code for Information Interchange (ASCII) format into databases in other formats. JSDTImport solves the problem of organizing the SDT content, affording flexibility to enable users to choose how to store the information in a database to better support client and server applications. JSDTImport can be dynamically configured by use of a simple Extensible Markup Language (XML) file. JSDTImport uses this XML file to define how each record and field will be parsed, its layout and definition, and how the resulting database will be structured. JSDTImport also includes a client application programming interface (API) layer that provides abstraction for the data-querying process. The API enables a user to specify the search criteria to apply in gathering all the data relevant to a query. The API can be used to organize the SDT content and translate into a native XML database. The XML format is structured into efficient sections, enabling excellent query performance by use of the XPath query language. Optionally, the content can be translated into a Structured Query Language (SQL) database for fast, reliable SQL queries on standard database server computers.
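For illustration only, the snippet below shows the kind of XPath query the abstract mentions, run against a small hand-written XML fragment rather than a real translated SDT database; the element and attribute names are hypothetical.

```python
# Generic illustration (not JSDTImport itself) of an XPath-style query against
# an XML representation of parsed records, using the standard library only.
import xml.etree.ElementTree as ET

xml_doc = """
<sdt>
  <section name="telemetry">
    <record id="1"><field name="MSID">V12345</field></record>
    <record id="2"><field name="MSID">V67890</field></record>
  </section>
</sdt>
"""
root = ET.fromstring(xml_doc)
# ElementTree supports a limited XPath subset: select all MSID fields in the
# telemetry section.
for field in root.findall("./section[@name='telemetry']/record/field[@name='MSID']"):
    print(field.text)
```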
Provider-Independent Use of the Cloud
NASA Astrophysics Data System (ADS)
Harmer, Terence; Wright, Peter; Cunningham, Christina; Perrott, Ron
Utility computing offers researchers and businesses the potential of significant cost-savings, making it possible for them to match the cost of their computing and storage to their demand for such resources. A utility compute provider enables the purchase of compute infrastructures on-demand; when a user requires computing resources a provider will provision a resource for them and charge them only for their period of use of that resource. There has been a significant growth in the number of cloud computing resource providers and each has a different resource usage model, application process and application programming interface (API); developing generic multi-resource-provider applications is thus difficult and time-consuming. We have developed an abstraction layer that provides a single resource usage model, user authentication model and API for compute providers that enables cloud-provider-neutral applications to be developed. In this paper we outline the issues in using external resource providers, give examples of using a number of the most popular cloud providers and provide examples of developing provider-neutral applications. In addition, we discuss the development of the API to create a generic provisioning model based on a common architecture for cloud computing providers.
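The sketch below is a hypothetical, minimal rendering of such an abstraction layer: application code targets one interface while concrete adapters translate to each provider; it is not the authors' implementation.

```python
# Hypothetical sketch of a provider-neutral abstraction layer: applications code
# against one API while concrete adapters translate to each provider's interface.
from abc import ABC, abstractmethod


class ComputeProvider(ABC):
    """Single resource-usage model exposed to provider-neutral applications."""

    @abstractmethod
    def provision(self, image: str, cores: int) -> str:
        """Return a handle for the provisioned resource."""

    @abstractmethod
    def release(self, handle: str) -> None: ...


class FakeProvider(ComputeProvider):
    """Stand-in adapter; a real one would call a vendor SDK or REST API."""

    def __init__(self) -> None:
        self._next_id = 0

    def provision(self, image: str, cores: int) -> str:
        self._next_id += 1
        return f"fake-{image}-{cores}c-{self._next_id}"

    def release(self, handle: str) -> None:
        print(f"released {handle}")


def run_job(provider: ComputeProvider) -> None:
    handle = provider.provision("ubuntu-22.04", cores=4)
    try:
        print(f"running on {handle}")
    finally:
        provider.release(handle)


run_job(FakeProvider())
```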
Buske, Orion J.; Schiettecatte, François; Hutton, Benjamin; Dumitriu, Sergiu; Misyura, Andriy; Huang, Lijia; Hartley, Taila; Girdea, Marta; Sobreira, Nara; Mungall, Chris; Brudno, Michael
2016-01-01
Despite the increasing prevalence of clinical sequencing, the difficulty of identifying additional affected families is a key obstacle to solving many rare diseases. There may only be a handful of similar patients worldwide, and their data may be stored in diverse clinical and research databases. Computational methods are necessary to enable finding similar patients across the growing number of patient repositories and registries. We present the Matchmaker Exchange Application Programming Interface (MME API), a protocol and data format for exchanging phenotype and genotype profiles to enable matchmaking among patient databases, facilitate the identification of additional cohorts, and increase the rate with which rare diseases can be researched and diagnosed. We designed the API to be straightforward and flexible in order to simplify its adoption on a large number of data types and workflows. We also provide a public test data set, curated from the literature, to facilitate implementation of the API and development of new matching algorithms. The initial version of the API has been successfully implemented by three members of the Matchmaker Exchange and was immediately able to reproduce previously-identified matches and generate several new leads currently being validated. The API is available at https://github.com/ga4gh/mme-apis. PMID:26255989
Buske, Orion J; Schiettecatte, François; Hutton, Benjamin; Dumitriu, Sergiu; Misyura, Andriy; Huang, Lijia; Hartley, Taila; Girdea, Marta; Sobreira, Nara; Mungall, Chris; Brudno, Michael
2015-10-01
Despite the increasing prevalence of clinical sequencing, the difficulty of identifying additional affected families is a key obstacle to solving many rare diseases. There may only be a handful of similar patients worldwide, and their data may be stored in diverse clinical and research databases. Computational methods are necessary to enable finding similar patients across the growing number of patient repositories and registries. We present the Matchmaker Exchange Application Programming Interface (MME API), a protocol and data format for exchanging phenotype and genotype profiles to enable matchmaking among patient databases, facilitate the identification of additional cohorts, and increase the rate with which rare diseases can be researched and diagnosed. We designed the API to be straightforward and flexible in order to simplify its adoption on a large number of data types and workflows. We also provide a public test data set, curated from the literature, to facilitate implementation of the API and development of new matching algorithms. The initial version of the API has been successfully implemented by three members of the Matchmaker Exchange and was immediately able to reproduce previously identified matches and generate several new leads currently being validated. The API is available at https://github.com/ga4gh/mme-apis. © 2015 WILEY PERIODICALS, INC.
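As a hedged illustration of an MME query, the sketch below posts a minimal patient profile to a hypothetical service; the /match path, content type, and payload fields are assumptions drawn from the public specification linked above.

```python
# Hedged sketch of an MME /match request; endpoint, headers, and payload fields
# are assumptions based on the public specification and a hypothetical service.
import requests

BASE_URL = "https://matchmaker.example.org"  # hypothetical MME service
headers = {
    "X-Auth-Token": "YOUR_TOKEN",
    "Content-Type": "application/vnd.ga4gh.matchmaker.v1.0+json",
    "Accept": "application/vnd.ga4gh.matchmaker.v1.0+json",
}
query = {
    "patient": {
        "id": "example-patient-1",
        "contact": {"name": "Example Clinician", "href": "mailto:clinician@example.org"},
        "features": [{"id": "HP:0001156"}],  # phenotype terms (HPO identifiers)
    }
}
resp = requests.post(f"{BASE_URL}/match", json=query, headers=headers, timeout=60)
resp.raise_for_status()
for result in resp.json().get("results", []):
    print(result.get("score"), result.get("patient", {}).get("id"))
```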
Home Energy Management System - VOLTTRON Integration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zandi, Helia
In most Home Energy Management Systems (HEMS) available in the market, different devices running different communication protocols cannot interact with each other and exchange information. As a result of this integration, information about devices running different communication protocols becomes accessible to other agents and devices running on the VOLTTRON platform. The integration process can be used by any HEMS available in the market regardless of the programming language it uses. If the existing HEMS provides an Application Programming Interface (API) based on the RESTful architecture, that API can be used for integration. Our candidate HEMS in this project is home-assistant (Hass). An agent is implemented that communicates with the Hass API and receives information about the devices exposed by the API. The agent publishes the information it receives on the VOLTTRON message bus so other agents can access it. On the other side, for each type of device an agent is implemented, such as a Climate Agent, Lock Agent, Switch Agent, or Light Agent. Each of these agents subscribes to the messages published on the message bus about its associated devices. These agents can also change the status of the devices by sending appropriate service calls to the API. Other agents and services on the platform can likewise access this information and coordinate their decision-making based on it.
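For illustration, the sketch below polls the Hass REST API and issues a service call, the two interactions the abstract describes; the paths and headers follow the public Home Assistant REST API documentation, and the token and entity names are placeholders.

```python
# Hedged sketch of reading device states from the Hass REST API and calling a
# service; URL, token, and entity_id are placeholders for a real installation.
import requests

HASS_URL = "http://localhost:8123"  # local Hass instance (assumed)
HEADERS = {"Authorization": "Bearer YOUR_LONG_LIVED_TOKEN"}

# Read the state of every device the API exposes; an integration agent would
# republish this information on the VOLTTRON message bus.
states = requests.get(f"{HASS_URL}/api/states", headers=HEADERS, timeout=10).json()
for entity in states[:5]:
    print(entity["entity_id"], entity["state"])

# Change a device's status via a service call (e.g., turn a switch on).
requests.post(f"{HASS_URL}/api/services/switch/turn_on",
              headers=HEADERS, json={"entity_id": "switch.water_heater"}, timeout=10)
```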
OpenSearch (ECHO-ESIP) & REST API for Earth Science Data Access
NASA Astrophysics Data System (ADS)
Mitchell, A.; Cechini, M.; Pilone, D.
2010-12-01
This presentation will provide a brief technical overview of OpenSearch, the Earth Science Information Partners (ESIP) Federated Search framework, and the REST architecture; discuss NASA’s Earth Observing System (EOS) ClearingHOuse’s (ECHO) implementation lessons learned; and demonstrate the simplified usage of these technologies. SOAP, as a framework for web service communication, has numerous advantages for Enterprise applications and Java/C# type programming languages. As a technical solution, SOAP has been a reliable framework on top of which many applications have been successfully developed and deployed. However, as interest grows for quick development cycles and more intriguing “mashups,” the SOAP API loses its appeal. Lightweight and simple are the vogue characteristics that are sought after. Enter the REST API architecture and OpenSearch format. Both of these items provide a new path for application development addressing some of the issues unresolved by SOAP. ECHO has made available all of its discovery, order submission, and data management services through a publicly accessible SOAP API. This interface is utilized by a variety of ECHO client and data partners to provide valuable capabilities to end users. As ECHO interacted with current and potential partners looking to develop Earth Science tools utilizing ECHO, it became apparent that the development overhead required to interact with the SOAP API was a growing barrier to entry. ECHO acknowledged the technical issues that were being uncovered by its partner community and chose to provide two new interfaces for interacting with the ECHO metadata catalog. The first interface is built upon the OpenSearch format and ESIP Federated Search framework. Leveraging these two items, a client (ECHO-ESIP) was developed with a focus on simplified searching and results presentation. The second interface is built upon the Representational State Transfer (REST) architecture. Leveraging the REST architecture, a new API has been made available that will provide access to the entire SOAP API suite of services. The results of these development activities have not only positioned ECHO to engage in the thriving world of mashup applications, but also provided an excellent real-world case study of how to successfully leverage these emerging technologies.
uPy: a ubiquitous CG Python API with biological-modeling applications.
Autin, Ludovic; Johnson, Graham; Hake, Johan; Olson, Arthur; Sanner, Michel
2012-01-01
The uPy Python extension module provides a uniform abstraction of the APIs of several 3D computer graphics programs (called hosts), including Blender, Maya, Cinema 4D, and DejaVu. A plug-in written with uPy can run in all uPy-supported hosts. Using uPy, researchers have created complex plug-ins for molecular and cellular modeling and visualization. uPy can simplify programming for many types of projects (not solely science applications) intended for multihost distribution. It's available at http://upy.scripps.edu. The first featured Web extra is a video that shows interactive analysis of a calcium dynamics simulation. YouTube URL: http://youtu.be/wvs-nWE6ypo. The second featured Web extra is a video that shows rotation of the HIV virus. YouTube URL: http://youtu.be/vEOybMaRoKc.
Software Applications to Access Earth Science Data: Building an ECHO Client
NASA Astrophysics Data System (ADS)
Cohen, A.; Cechini, M.; Pilone, D.
2010-12-01
Historically, developing an ECHO (NASA’s Earth Observing System (EOS) ClearingHOuse) client required interaction with its SOAP API. SOAP, as a framework for web service communication has numerous advantages for Enterprise applications and Java/C# type programming languages. However, as interest has grown for quick development cycles and more intriguing “mashups,” ECHO has seen the SOAP API lose its appeal. In order to address these changing needs, ECHO has introduced two new interfaces facilitating simple access to its metadata holdings. The first interface is built upon the OpenSearch format and ESIP Federated Search framework. The second interface is built upon the Representational State Transfer (REST) architecture. Using the REST and OpenSearch APIs to access ECHO makes development with modern languages much more feasible and simpler. Client developers can leverage the simple interaction with ECHO to focus more of their time on the advanced functionality they are presenting to users. To demonstrate the simplicity of developing with the REST API, participants will be led through a hands-on experience where they will develop an ECHO client that performs the following actions: + Login + Provider discovery + Provider based dataset discovery + Dataset, Temporal, and Spatial constraint based Granule discovery + Online Data Access
G2S: a web-service for annotating genomic variants on 3D protein structures.
Wang, Juexin; Sheridan, Robert; Sumer, S Onur; Schultz, Nikolaus; Xu, Dong; Gao, Jianjiong
2018-06-01
Accurately mapping and annotating genomic locations on 3D protein structures is a key step in structure-based analysis of genomic variants detected by recent large-scale sequencing efforts. There are several mapping resources currently available, but none of them provides a web API (Application Programming Interface) that supports programmatic access. We present G2S, a real-time web API that provides automated mapping of genomic variants on 3D protein structures. G2S can align genomic locations of variants, protein locations, or protein sequences to protein structures and retrieve the mapped residues from structures. G2S API uses REST-inspired design and it can be used by various clients such as web browsers, command terminals, programming languages and other bioinformatics tools for bringing 3D structures into genomic variant analysis. The webserver and source codes are freely available at https://g2s.genomenexus.org. g2s@genomenexus.org. Supplementary data are available at Bioinformatics online.
Integrated platform and API for electrophysiological data
Sobolev, Andrey; Stoewer, Adrian; Leonhardt, Aljoscha; Rautenberg, Philipp L.; Kellner, Christian J.; Garbers, Christian; Wachtler, Thomas
2014-01-01
Recent advancements in technology and methodology have led to growing amounts of increasingly complex neuroscience data recorded from various species, modalities, and levels of study. The rapid data growth has made efficient data access and flexible, machine-readable data annotation a crucial requisite for neuroscientists. Clear and consistent annotation and organization of data is not only an important ingredient for reproducibility of results and re-use of data, but also essential for collaborative research and data sharing. In particular, efficient data management and interoperability requires a unified approach that integrates data and metadata and provides a common way of accessing this information. In this paper we describe GNData, a data management platform for neurophysiological data. GNData provides a storage system based on a data representation that is suitable to organize data and metadata from any electrophysiological experiment, with a functionality exposed via a common application programming interface (API). Data representation and API structure are compatible with existing approaches for data and metadata representation in neurophysiology. The API implementation is based on the Representational State Transfer (REST) pattern, which enables data access integration in software applications and facilitates the development of tools that communicate with the service. Client libraries that interact with the API provide direct data access from computing environments like Matlab or Python, enabling integration of data management into the scientist's experimental or analysis routines. PMID:24795616
Integrated platform and API for electrophysiological data.
Sobolev, Andrey; Stoewer, Adrian; Leonhardt, Aljoscha; Rautenberg, Philipp L; Kellner, Christian J; Garbers, Christian; Wachtler, Thomas
2014-01-01
Recent advancements in technology and methodology have led to growing amounts of increasingly complex neuroscience data recorded from various species, modalities, and levels of study. The rapid data growth has made efficient data access and flexible, machine-readable data annotation a crucial requisite for neuroscientists. Clear and consistent annotation and organization of data is not only an important ingredient for reproducibility of results and re-use of data, but also essential for collaborative research and data sharing. In particular, efficient data management and interoperability requires a unified approach that integrates data and metadata and provides a common way of accessing this information. In this paper we describe GNData, a data management platform for neurophysiological data. GNData provides a storage system based on a data representation that is suitable to organize data and metadata from any electrophysiological experiment, with a functionality exposed via a common application programming interface (API). Data representation and API structure are compatible with existing approaches for data and metadata representation in neurophysiology. The API implementation is based on the Representational State Transfer (REST) pattern, which enables data access integration in software applications and facilitates the development of tools that communicate with the service. Client libraries that interact with the API provide direct data access from computing environments like Matlab or Python, enabling integration of data management into the scientist's experimental or analysis routines.
Design Considerations for Integrating Twitter into an Online Course
ERIC Educational Resources Information Center
Rohr, Linda E.; Costello, Jane; Hawkins, Thomas
2015-01-01
While the use of Twitter for communication and assessment activities in online courses is not new, it has not been without its challenges. This is increasingly true of high enrolment courses. The use of a Twitter Evaluation application which leverages a Learning Management System's (LMS's) application programming interface (API) provides a…
Helioviewer.org: Simple Solar and Heliospheric Data Visualization
NASA Astrophysics Data System (ADS)
Hughitt, V. K.; Ireland, J.; Mueller, D.
2011-12-01
Helioviewer.org is a free and open-source web application for exploring solar physics data in a simple and intuitive manner. Over the past several years, Helioviewer.org has enabled thousands of users from across the globe to explore the inner heliosphere, providing access to over ten million images from the SOHO, SDO, and STEREO missions. While Helioviewer.org has seen a surge in use by the public in recent months, it is still ultimately a science tool. The newest version of Helioviewer.org provides access to science-quality data for all available images through the Virtual Solar Observatory (VSO). In addition to providing a powerful platform for browsing heterogeneous sets of solar data, Helioviewer.org also seeks to be as flexible and extensible as possible, providing access to much of its functionality via a simple Application Programming Interface (API). Recently, the Helioviewer.org API was used for two such applications: a Wordpress plugin, and a Python library for solar physics data analysis (SunPy). These applications are discussed and examples of API usage are provided. Finally, Helioviewer.org is undergoing continual development, with new features being added on a regular basis. Recent updates to Helioviewer.org are discussed, along with a preview of things to come.
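A hedged sketch of one such API call follows; the endpoint name, parameters, and the sourceId value are assumptions to be verified against the current Helioviewer API documentation.

```python
# Hedged sketch of a Helioviewer API call; endpoint, parameters, and the
# sourceId value are assumptions drawn from the public API documentation.
import requests

resp = requests.get(
    "https://api.helioviewer.org/v2/getClosestImage/",        # assumed endpoint
    params={"date": "2011-12-01T00:00:00Z", "sourceId": 14},  # sourceId is illustrative
    timeout=30,
)
resp.raise_for_status()
meta = resp.json()
# The response metadata describes the image closest in time to the requested date.
print(meta.get("date"), meta.get("name"))
```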
NASA Astrophysics Data System (ADS)
Guillochon, James; Cowperthwaite, Philip S.
2018-05-01
We announce the public release of the application program interface (API) for the Open Astronomy Catalogs (OACs), the OACAPI. The OACs serve near-complete collections of supernova, tidal disruption, kilonova, and fast stars data (including photometry, spectra, radio, and X-ray observations) via a user-friendly web interface that displays the data interactively and offers full data downloads. The OACAPI, by contrast, enables users to specifically download particular pieces of the OAC dataset via a flexible programmatic syntax, either via URL GET requests, or via a module within the astroquery Python package.
Manycore Performance-Portability: Kokkos Multidimensional Array Library
Edwards, H. Carter; Sunderland, Daniel; Porter, Vicki; ...
2012-01-01
Large, complex scientific and engineering application codes have a significant investment in computational kernels to implement their mathematical models. Porting these computational kernels to the collection of modern manycore accelerator devices is a major challenge in that these devices have diverse programming models, application programming interfaces (APIs), and performance requirements. The Kokkos Array programming model provides a library-based approach to implementing computational kernels that are performance-portable to CPU-multicore and GPGPU accelerator devices. This programming model is based upon three fundamental concepts: (1) manycore compute devices each with its own memory space, (2) data parallel kernels and (3) multidimensional arrays. Kernel execution performance is, especially for NVIDIA® devices, extremely dependent on data access patterns. Optimal data access patterns can be different for different manycore devices, potentially leading to different implementations of computational kernels specialized for different devices. The Kokkos Array programming model supports performance-portable kernels by (1) separating data access patterns from computational kernels through a multidimensional array API and (2) introducing device-specific data access mappings when a kernel is compiled. An implementation of Kokkos Array is available through Trilinos [Trilinos website, http://trilinos.sandia.gov/, August 2011].
Development of NETCONF-Based Network Management Systems in Web Services Framework
NASA Astrophysics Data System (ADS)
Iijima, Tomoyuki; Kimura, Hiroyasu; Kitani, Makoto; Atarashi, Yoshifumi
To develop a network management system (NMS) more easily, the authors developed an application programming interface (API) for configuring network devices. Because this API is used in a Java development environment, an NMS can be developed by utilizing the API and other commonly available Java libraries. It is thus possible to easily develop an NMS that is highly compatible with other IT systems. Operations generated from the API and exchanged between the NMS and network devices are based on NETCONF, which is standardized by the Internet Engineering Task Force (IETF) as a next-generation network-configuration protocol. Adopting a standardized technology ensures that the NMS developed by using the API can manage network devices provided by multiple vendors in a unified manner. Furthermore, the configuration items exchanged over NETCONF are specified in an object-oriented design. They are therefore easier to manage than such items in the Management Information Base (MIB), which is defined as data to be managed by the Simple Network Management Protocol (SNMP). We actually developed several NMSs by using the API. Evaluation of these NMSs showed that, in terms of configuration time and development time, the NMS developed by using the API performed as well as NMSs developed by using a command line interface (CLI) and SNMP. The NMS developed by using the API showed the feasibility of achieving “autonomic network management” and “high interoperability with IT systems.”
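The authors' API is Java-based; as a language-neutral illustration of the same NETCONF exchange, the sketch below uses the Python ncclient library to fetch a running configuration from a hypothetical device.

```python
# Sketch of a NETCONF <get-config> exchange using the Python ncclient library
# (not the authors' Java API); host and credentials are hypothetical.
from ncclient import manager

with manager.connect(
    host="192.0.2.1",        # hypothetical network device
    port=830,                # standard NETCONF-over-SSH port
    username="admin",
    password="secret",
    hostkey_verify=False,
) as session:
    # Retrieve the running configuration as XML, the NETCONF analogue of the
    # object-oriented configuration items the authors' API exposes.
    running = session.get_config(source="running")
    print(running.xml[:500])
```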
An Airborne Onboard Parallel Processing Testbed
NASA Technical Reports Server (NTRS)
Mandl, Daniel J.
2014-01-01
This presentation provides information on the progress of the Intelligent Payload Module (IPM) development effort. In addition, a vision is presented for integrating the IPM architecture with the GeoSocial Application Program Interface (API) architecture to enable efficient distribution of satellite data products.
Model-Driven Energy Intelligence
2015-03-01
[Extracted report fragments; only the recoverable information is kept.] The report describes use of a building information model (BIM) for operations and an estimate of the potential impact on energy performance at Fort Jackson. Subject terms: Building Information Modeling (BIM), Energy, ECMs, monitoring. Acronyms include AHU (Air Handling Unit), API (Application Programming Interface), BIM (building information model), and BLCC (Building Life Cycle Cost).
Web Services--A Buzz Word with Potentials
János T. Füstös
2006-01-01
The simplest definition of a web service is an application that provides a web API. The web API exposes the functionality of the solution to other applications. The web API relies on other Internet-based technologies to manage communications. The resulting web services are pervasive, vendor-independent, language-neutral, and very low-cost. The main purpose of a web API...
NASA Astrophysics Data System (ADS)
Xiong, Wenhao; Tian, Xin; Chen, Genshe; Pham, Khanh; Blasch, Erik
2017-05-01
Software defined radio (SDR) has become a popular tool for the implementation and testing of communications systems. The advantages of the SDR approach include a re-configurable design, adaptive response to changing conditions, efficient development, and highly versatile implementation. In order to understand the benefits of SDR, the space telecommunication radio system (STRS) was proposed by the NASA Glenn Research Center (GRC) along with a standard application program interface (API) structure. Each component of the system uses a well-defined API to communicate with other components. The benefit of a standard API is to relax the platform limitations on each component and allow additional options. For example, the waveform generating process can run on a field programmable gate array (FPGA), a personal computer (PC), or an embedded system. As long as the API requirements are satisfied, the generated waveform will work with the complete system. In this paper, we demonstrate the design and development of an adaptive SDR following the STRS and standard API protocol. We introduce, step by step, the SDR testbed system, including the controlling graphical user interface (GUI), database, GNU Radio hardware control, and universal software radio peripheral (USRP) transceiving front end. In addition, a performance evaluation is shown demonstrating the effectiveness of the SDR approach for space telecommunication.
A Driving Cycle Detection Approach Using Map Service API
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Lei; Gonder, Jeffrey D
Following advancements in smartphone and portable global positioning system (GPS) data collection, wearable GPS data have realized extensive use in transportation surveys and studies. The task of detecting driving cycles (driving or car-mode trajectory segments) from wearable GPS data has been the subject of much research. Specifically, distinguishing driving cycles from other motorized trips (such as taking a bus) is the main research problem in this paper. Many mode detection methods only focus on raw GPS speed data while some studies apply additional information, such as geographic information system (GIS) data, to obtain better detection performance. Procuring and maintaining dedicated road GIS data are costly and not trivial, whereas the technical maturity and broad use of map service application program interface (API) queries offers opportunities for mode detection tasks. The proposed driving cycle detection method takes advantage of map service APIs to obtain high-quality car-mode API route information and uses a trajectory segmentation algorithm to find the best-matched API route. The car-mode API route data combined with the actual route information, including the actual mode information, are used to train a logistic regression machine learning model, which estimates car modes and non-car modes with probability rates. The experimental results show promise for the proposed method's ability to detect vehicle mode accurately.
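The sketch below is not the authors' implementation; it illustrates, with toy feature values, how route-match features could feed a logistic regression that outputs a car-mode probability.

```python
# Illustrative sketch: features comparing a GPS trajectory with its best-matched
# car-mode API route feed a logistic regression that outputs a car/non-car
# probability. Feature values below are toy numbers, not real data.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row: [mean speed (m/s), fraction of points within 25 m of the API route,
#            ratio of trip duration to the API-estimated driving duration]
X = np.array([
    [13.0, 0.95, 1.05],  # looks like driving
    [6.5, 0.90, 1.90],   # slower and much longer than the driving estimate (bus-like)
    [15.2, 0.97, 0.98],
    [5.8, 0.85, 2.10],
])
y = np.array([1, 0, 1, 0])  # 1 = driving cycle, 0 = other motorized mode

model = LogisticRegression().fit(X, y)
print(model.predict_proba([[12.0, 0.93, 1.10]])[0, 1])  # probability of car mode
```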
Automating spectral measurements
NASA Astrophysics Data System (ADS)
Goldstein, Fred T.
2008-09-01
This paper discusses the architecture of software utilized in spectroscopic measurements. As optical coatings become more sophisticated, there is mounting need to automate data acquisition (DAQ) from spectrophotometers. Such need is exacerbated when 100% inspection is required, ancillary devices are utilized, cost reduction is crucial, or security is vital. While instrument manufacturers normally provide point-and-click DAQ software, an application programming interface (API) may be missing. In such cases automation is impossible or expensive. An API is typically provided in libraries (*.dll, *.ocx) which may be embedded in user-developed applications. Users can thereby implement DAQ automation in several Windows languages. Another possibility, developed by FTG as an alternative to instrument manufacturers' software, is the ActiveX application (*.exe). ActiveX, a component of many Windows applications, provides means for programming and interoperability. This architecture permits a point-and-click program to act as automation client and server. Excel, for example, can control and be controlled by DAQ applications. Most importantly, ActiveX permits ancillary devices such as barcode readers and XY-stages to be easily and economically integrated into scanning procedures. Since an ActiveX application has its own user-interface, it can be independently tested. The ActiveX application then runs (visibly or invisibly) under DAQ software control. Automation capabilities are accessed via a built-in spectro-BASIC language with industry-standard (VBA-compatible) syntax. Supplementing ActiveX, spectro-BASIC also includes auxiliary serial port commands for interfacing programmable logic controllers (PLC). A typical application is automatic filter handling.
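As a hedged, Windows-only illustration of the COM/ActiveX automation pattern described above, the sketch below drives Excel through pywin32; "Vendor.Spectrometer" is a hypothetical instrument ProgID, while "Excel.Application" is the standard Excel automation server.

```python
# Windows-only sketch of COM/ActiveX automation via pywin32: a DAQ application
# acting as automation client, writing scan results into Excel.
import win32com.client

excel = win32com.client.Dispatch("Excel.Application")
excel.Visible = True
sheet = excel.Workbooks.Add().Worksheets(1)

# A real DAQ application would also dispatch the instrument's ActiveX server,
# e.g. win32com.client.Dispatch("Vendor.Spectrometer")  # hypothetical ProgID
# and loop over measured wavelengths instead of this toy data.
for row, (wavelength, transmittance) in enumerate([(400, 0.91), (500, 0.93)], start=1):
    sheet.Cells(row, 1).Value = wavelength
    sheet.Cells(row, 2).Value = transmittance
```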
NASA Astrophysics Data System (ADS)
Lukes, George E.; Cain, Joel M.
1996-02-01
The Advanced Distributed Simulation (ADS) Synthetic Environments Program seeks to create robust virtual worlds from operational terrain and environmental data sources of sufficient fidelity and currency to interact with the real world. While some applications can be met by direct exploitation of standard digital terrain data, more demanding applications -- particularly those supporting operations 'close to the ground' -- are well-served by emerging capabilities for 'value-adding' by the user working with controlled imagery. For users to rigorously refine and exploit controlled imagery within functionally different workstations, they must have a shared framework to allow interoperability within and between these environments in terms of passing image and object coordinates and other information using a variety of validated sensor models. The Synthetic Environments Program is now being expanded to address rapid construction of virtual worlds with research initiatives in digital mapping, softcopy workstations, and cartographic image understanding. The Synthetic Environments Program is also participating in a joint initiative for a sensor model applications programmer's interface (API) to ensure that a common controlled imagery exploitation framework is available to all researchers, developers and users. This presentation provides an introduction to ADS and the associated requirements for synthetic environments to support synthetic theaters of war. It provides a technical rationale for exploring applications of image understanding technology to automated cartography in support of ADS and related programs benefitting from automated analysis of mapping, earth resources and reconnaissance imagery. Finally, it provides an overview and status of the joint initiative for a sensor model API.
NASA Astrophysics Data System (ADS)
Larour, Eric; Cheng, Daniel; Perez, Gilberto; Quinn, Justin; Morlighem, Mathieu; Duong, Bao; Nguyen, Lan; Petrie, Kit; Harounian, Silva; Halkides, Daria; Hayes, Wayne
2017-12-01
Earth system models (ESMs) are becoming increasingly complex, requiring extensive knowledge and experience to deploy and use in an efficient manner. They run on high-performance architectures that are significantly different from the everyday environments that scientists use to pre- and post-process results (i.e., MATLAB, Python). This results in models that are hard to use for non-specialists and are increasingly specific in their application. It also makes them relatively inaccessible to the wider science community, not to mention to the general public. Here, we present a new software/model paradigm that attempts to bridge the gap between the science community and the complexity of ESMs by developing a new JavaScript application program interface (API) for the Ice Sheet System Model (ISSM). The aforementioned API allows cryosphere scientists to run ISSM on the client side of a web page within the JavaScript environment. When combined with a web server running ISSM (using a Python API), it enables the serving of ISSM computations in an easy and straightforward way. The deep integration and similarities between all the APIs in ISSM (MATLAB, Python, and now JavaScript) significantly shortens and simplifies the turnaround of state-of-the-art science runs and their use by the larger community. We demonstrate our approach via a new Virtual Earth System Laboratory (VESL) website (http://vesl.jpl.nasa.gov, VESL(2017)).
FirebrowseR: an R client to the Broad Institute’s Firehose Pipeline
Deng, Mario; Brägelmann, Johannes; Kryukov, Ivan; Saraiva-Agostinho, Nuno; Perner, Sven
2017-01-01
With its Firebrowse service (http://firebrowse.org/) the Broad Institute is making large-scale multi-platform omics data analysis results publicly available through a Representational State Transfer (REST) Application Programmable Interface (API). Querying this database through an API client from an arbitrary programming environment is an essential task, allowing other developers and researchers to focus on their analysis and avoid data wrangling. Hence, as a first result, we developed a workflow to automatically generate, test and deploy such clients for rapid response to API changes. Its underlying infrastructure, a combination of free and publicly available web services, facilitates the development of API clients. It decouples changes in server software from the client software by reacting to changes in the RESTful service and removing direct dependencies on a specific implementation of an API. As a second result, FirebrowseR, an R client to the Broad Institute’s RESTful Firehose Pipeline, is provided as a working example, which is built by the means of the presented workflow. The package’s features are demonstrated by an example analysis of cancer gene expression data. Database URL: https://github.com/mariodeng/ PMID:28062517
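As a companion to the abstract above, a direct query of the Firebrowse REST service can be sketched in Python (the FirebrowseR package wraps such calls in R). The API root and the Metadata/Cohorts endpoint shown here are assumptions for illustration; the authoritative routes are documented at http://firebrowse.org/.

```python
# Minimal sketch of querying the Firebrowse REST service with Python. The
# endpoint path and parameters are assumptions; consult the service
# documentation for the current API routes.
import requests

BASE = "http://firebrowse.org/api/v1"          # assumed API root

def get_cohorts():
    # Retrieve the list of TCGA cohorts as JSON (assumed endpoint name).
    resp = requests.get(f"{BASE}/Metadata/Cohorts", params={"format": "json"}, timeout=30)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    print(get_cohorts())
```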
FirebrowseR: an R client to the Broad Institute's Firehose Pipeline.
Deng, Mario; Brägelmann, Johannes; Kryukov, Ivan; Saraiva-Agostinho, Nuno; Perner, Sven
2017-01-01
With its Firebrowse service (http://firebrowse.org/) the Broad Institute is making large-scale multi-platform omics data analysis results publicly available through a Representational State Transfer (REST) Application Programmable Interface (API). Querying this database through an API client from an arbitrary programming environment is an essential task, allowing other developers and researchers to focus on their analysis and avoid data wrangling. Hence, as a first result, we developed a workflow to automatically generate, test and deploy such clients for rapid response to API changes. Its underlying infrastructure, a combination of free and publicly available web services, facilitates the development of API clients. It decouples changes in server software from the client software by reacting to changes in the RESTful service and removing direct dependencies on a specific implementation of an API. As a second result, FirebrowseR, an R client to the Broad Institute's RESTful Firehose Pipeline, is provided as a working example, which is built by the means of the presented workflow. The package's features are demonstrated by an example analysis of cancer gene expression data.Database URL: https://github.com/mariodeng/. © The Author(s) 2017. Published by Oxford University Press.
Coordinating complex decision support activities across distributed applications
NASA Technical Reports Server (NTRS)
Adler, Richard M.
1994-01-01
Knowledge-based technologies have been applied successfully to automate planning and scheduling in many problem domains. Automation of decision support can be increased further by integrating task-specific applications with supporting database systems, and by coordinating interactions between such tools to facilitate collaborative activities. Unfortunately, the technical obstacles that must be overcome to achieve this vision of transparent, cooperative problem-solving are daunting. Intelligent decision support tools are typically developed for standalone use, rely on incompatible, task-specific representational models and application programming interfaces (APIs), and run on heterogeneous computing platforms. Getting such applications to interact freely calls for platform-independent capabilities for distributed communication, as well as tools for mapping information across disparate representations. Symbiotics is developing a layered set of software tools (called NetWorks!) for integrating and coordinating heterogeneous distributed applications. The top layer of tools consists of an extensible set of generic, programmable coordination services. Developers access these services via high-level APIs to implement the desired interactions between distributed applications.
30 CFR 250.1920 - What are the auditing requirements for my SEMS program?
Code of Federal Regulations, 2012 CFR
2012-07-01
... designated and qualified personnel according to the requirements of this subpart and API RP 75, Section 12... thirteen elements of your SEMS program to evaluate compliance with the requirements of this subpart and API... audit plan and procedures must meet or exceed all of the recommendations included in API RP 75 section...
When Will It Be ...?: U.S. Naval Observatory Sidereal Time and Julian Date Calculators
NASA Astrophysics Data System (ADS)
Chizek Frouard, Malynda R.; Lesniak, Michael V.; Bartlett, Jennifer L.
2017-01-01
Sidereal time and Julian date are two values often used in observational astronomy that can be tedious to calculate. Fortunately, the U.S. Naval Observatory (USNO) has redesigned its on-line Sidereal Time and Julian Date (JD) calculators to provide data through an Application Programming Interface (API). This flexible interface returns dates and times in JavaScript Object Notation (JSON) that can be incorporated into third-party websites or applications. Via the API, Sidereal Time can be obtained for any location on Earth for any date occurring in the current, previous, or subsequent year. Up to 9999 iterations of sidereal time data with intervals from 1 second to 1095 days can be generated, as long as the data doesn't extend past the date limits. The API provides the Gregorian calendar date and time (in UT1), Greenwich Mean Sidereal Time, Greenwich Apparent Sidereal Time, Local Mean Sidereal Time, Local Apparent Sidereal Time, and the Equation of the Equinoxes. Julian Date can be converted to calendar date, either Julian or Gregorian as appropriate, for any date between JD 0 (January 1, 4713 BCE proleptic Julian) and JD 5373484 (December 31, 9999 CE Gregorian); the reverse calendar date to Julian Date conversion is also available. The calendar date and Julian Date are returned for all API requests; the day of the week is also returned for Julian Date to calendar date conversions. On-line documentation for using all USNO API-enabled calculators, including sample calls, is available (http://aa.usno.navy.mil/data/docs/api.php). For those who prefer using traditional data input forms, Sidereal Time can still be accessed at http://aa.usno.navy.mil/data/docs/siderealtime.php, and the Julian Date Converter at http://aa.usno.navy.mil/data/docs/JulianDate.php.
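A hedged sketch of such a call in Python follows; the endpoint name and query parameters are assumptions and should be checked against the USNO API documentation cited in the abstract (http://aa.usno.navy.mil/data/docs/api.php).

```python
# Hedged sketch of requesting sidereal time from the USNO service and parsing
# the JSON reply. The endpoint and parameter names below are illustrative
# assumptions; the authoritative call syntax is in the USNO API documentation.
import requests

def local_sidereal_time(date, time, coords, reps=1, intv_mag=1, intv_unit="minutes"):
    url = "http://api.usno.navy.mil/sidtime"       # assumed endpoint
    params = {                                      # assumed parameter names
        "date": date, "time": time, "coords": coords,
        "reps": reps, "intv_mag": intv_mag, "intv_unit": intv_unit,
    }
    resp = requests.get(url, params=params, timeout=30)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    print(local_sidereal_time("1/11/2017", "20:00:00", "38.92,-77.07"))
```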
A RESTful API for accessing microbial community data for MG-RAST.
Wilke, Andreas; Bischof, Jared; Harrison, Travis; Brettin, Tom; D'Souza, Mark; Gerlach, Wolfgang; Matthews, Hunter; Paczian, Tobias; Wilkening, Jared; Glass, Elizabeth M; Desai, Narayan; Meyer, Folker
2015-01-01
Metagenomic sequencing has produced significant amounts of data in recent years. For example, as of summer 2013, MG-RAST has been used to annotate over 110,000 data sets totaling over 43 Terabases. With metagenomic sequencing finding even wider adoption in the scientific community, the existing web-based analysis tools and infrastructure in MG-RAST provide limited capability for data retrieval and analysis, such as comparative analysis between multiple data sets. Moreover, although the system provides many analysis tools, it is not comprehensive. By opening MG-RAST up via a web services API (application programmers interface) we have greatly expanded access to MG-RAST data, as well as provided a mechanism for the use of third-party analysis tools with MG-RAST data. This RESTful API makes all data and data objects created by the MG-RAST pipeline accessible as JSON objects. As part of the DOE Systems Biology Knowledgebase project (KBase, http://kbase.us) we have implemented a web services API for MG-RAST. This API complements the existing MG-RAST web interface and constitutes the basis of KBase's microbial community capabilities. In addition, the API exposes a comprehensive collection of data to programmers. This API, which uses a RESTful (Representational State Transfer) implementation, is compatible with most programming environments and should be easy to use for end users and third parties. It provides comprehensive access to sequence data, quality control results, annotations, and many other data types. Where feasible, we have used standards to expose data and metadata. Code examples are provided in a number of languages both to show the versatility of the API and to provide a starting point for users. We present an API that exposes the data in MG-RAST for consumption by our users, greatly enhancing the utility of the MG-RAST service.
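The abstract notes that code examples are provided in a number of languages; a comparable minimal example in Python might look as follows. The API root and the accession used here are assumptions for illustration and should be checked against the MG-RAST API documentation.

```python
# Small example of retrieving one metagenome record from the MG-RAST RESTful
# API as a JSON object. Base URL and accession are assumptions for illustration.
import requests

API = "http://api.metagenomics.anl.gov/1"      # assumed API root

def get_metagenome(accession, verbosity="minimal"):
    resp = requests.get(f"{API}/metagenome/{accession}",
                        params={"verbosity": verbosity}, timeout=60)
    resp.raise_for_status()
    return resp.json()                          # MG-RAST objects are exposed as JSON

if __name__ == "__main__":
    record = get_metagenome("mgm4447943.3")     # hypothetical public accession
    print(record.get("name"), record.get("sequence_type"))
```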
2007-03-01
Intelligence AIS Artificial Immune System ANN Artificial Neural Networks API Application Programming Interface BFS Breadth-First Search BIS Biological...problem domain is too large for only one algorithm's application. It ranges from network-based sniffer systems, responsible for Enterprise-wide coverage...options to network administrators in choosing detectors to employ in future ID applications. Objectives Our hypothesis validity is based on a set
Yoink: An interaction-based partitioning API.
Zheng, Min; Waller, Mark P
2018-05-15
Herein, we describe the implementation details of our interaction-based partitioning API (application programming interface) called Yoink for QM/MM modeling and fragment-based quantum chemistry studies. Interactions are detected by computing density descriptors such as reduced density gradient, density overlap regions indicator, and single exponential decay detector. Only molecules having an interaction with a user-definable QM core are added to the QM region of a hybrid QM/MM calculation. Moreover, a set of molecule pairs having density-based interactions within a molecular system can be computed in Yoink, and an interaction graph can then be constructed. Standard graph clustering methods can then be applied to construct fragments for further quantum chemical calculations. The Yoink API is licensed under Apache 2.0 and can be accessed via yoink.wallerlab.org. © 2018 Wiley Periodicals, Inc. © 2018 Wiley Periodicals, Inc.
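Yoink itself is a Java API; the following Python sketch is only a schematic illustration of the workflow the abstract describes (detect interactions, build an interaction graph, cluster it into fragments), with a simple distance cutoff standing in for the density-based descriptors.

```python
# Schematic sketch (not the Yoink Java API) of the described workflow: detect
# pairwise interactions, build an interaction graph, then cluster it into
# fragments. A distance cutoff stands in for the density-based descriptors
# (reduced density gradient, DORI, SEDD) that Yoink actually computes.
import networkx as nx

# toy molecular centers (id -> 3-D coordinates), purely illustrative
molecules = {"A": (0.0, 0.0, 0.0), "B": (1.2, 0.0, 0.0),
             "C": (5.0, 5.0, 5.0), "D": (5.8, 5.1, 5.0)}

def interacting(p, q, cutoff=2.0):
    # placeholder interaction test; Yoink would evaluate density descriptors here
    return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5 < cutoff

graph = nx.Graph()
graph.add_nodes_from(molecules)
for i in molecules:
    for j in molecules:
        if i < j and interacting(molecules[i], molecules[j]):
            graph.add_edge(i, j)

# connected components as a stand-in for a graph-clustering step
fragments = [sorted(c) for c in nx.connected_components(graph)]
print(fragments)   # e.g. [['A', 'B'], ['C', 'D']]
```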
The Stratway Program for Strategic Conflict Resolution: User's Guide
NASA Technical Reports Server (NTRS)
Hagen, George E.; Butler, Ricky W.; Maddalon, Jeffrey M.
2016-01-01
Stratway is a strategic conflict detection and resolution program. It provides both intent-based conflict detection and conflict resolution for a single ownship in the presence of multiple traffic aircraft and weather cells defined by moving polygons. It relies on a set of heuristic search strategies to solve conflicts. These strategies are user configurable through multiple parameters. The program can be called from other programs through an application program interface (API) and can also be executed from a command line.
The jmzQuantML programming interface and validator for the mzQuantML data standard.
Qi, Da; Krishna, Ritesh; Jones, Andrew R
2014-03-01
The mzQuantML standard from the HUPO Proteomics Standards Initiative has recently been released, capturing quantitative data about peptides and proteins, following analysis of MS data. We present a Java application programming interface (API) for mzQuantML called jmzQuantML. The API provides robust bridges between Java classes and elements in mzQuantML files and allows random access to any part of the file. The API provides read and write capabilities, and is designed to be embedded in other software packages, enabling mzQuantML support to be added to proteomics software tools (http://code.google.com/p/jmzquantml/). The mzQuantML standard is designed around a multilevel validation system to ensure that files are structurally and semantically correct for different proteomics quantitative techniques. In this article, we also describe a Java software tool (http://code.google.com/p/mzquantml-validator/) for validating mzQuantML files, which is a formal part of the data standard. © 2014 The Authors. Proteomics published by Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
NSLS-II HIGH LEVEL APPLICATION INFRASTRUCTURE AND CLIENT API DESIGN
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shen, G.; Yang, L.
2011-03-28
The beam commissioning software framework of the NSLS-II project adopts a client/server-based architecture to replace the more traditional monolithic high-level application approach. It is an open-structure platform that provides a narrow API set for client applications. With this narrow API, existing applications developed in different languages under different architectures can be ported to the platform with small modifications. This paper describes the system infrastructure design, the client API, system integration, and the latest progress. As a new third-generation synchrotron light source with ultra-low emittance, NSLS-II poses new requirements and challenges for controlling and manipulating the beam. A use-case study and a theoretical analysis have been performed to clarify the requirements and challenges for the high-level application (HLA) software environment. To satisfy them, an adequate system architecture for the software framework is critical for beam commissioning, study, and operation. The existing traditional approaches are self-consistent and monolithic. Some of them have adopted a middle-layer concept to separate low-level hardware processing from numerical algorithm computing, physics modelling, data manipulation, plotting, and error handling. However, none of the existing approaches satisfies the requirements. A new design has been proposed that introduces service-oriented architecture technology. As in the traditional approach, the HLA is a combination of tools for accelerator physicists and operators; at NSLS-II these include monitoring applications and control routines. A scripting environment is very important for the latter part of the HLA, and both parts are designed on a common set of APIs. Physicists and operators are the users of these APIs, while control-system engineers and a few accelerator physicists are their developers. With our client/server-based approach, how to retrieve information is left to the developers of the APIs, and how to use them to form a physics application is left to the users. For example, how channels are related to a magnet, and what the current real-time setting of a magnet is in physics units, are internals of the APIs; a chromaticity measurement routine is a user of the APIs. All API users work with magnet and instrument names in physics units, and low-level communications in current or voltage units are minimized. In this paper, we discuss our recent progress on infrastructure development and the client API.
Atlas - a data warehouse for integrative bioinformatics.
Shah, Sohrab P; Huang, Yong; Xu, Tao; Yuen, Macaire M S; Ling, John; Ouellette, B F Francis
2005-02-21
We present a biological data warehouse called Atlas that locally stores and integrates biological sequences, molecular interactions, homology information, functional annotations of genes, and biological ontologies. The goal of the system is to provide data, as well as a software infrastructure for bioinformatics research and development. The Atlas system is based on relational data models that we developed for each of the source data types. Data stored within these relational models are managed through Structured Query Language (SQL) calls that are implemented in a set of Application Programming Interfaces (APIs). The APIs include three languages: C++, Java, and Perl. The methods in these API libraries are used to construct a set of loader applications, which parse and load the source datasets into the Atlas database, and a set of toolbox applications which facilitate data retrieval. Atlas stores and integrates local instances of GenBank, RefSeq, UniProt, Human Protein Reference Database (HPRD), Biomolecular Interaction Network Database (BIND), Database of Interacting Proteins (DIP), Molecular Interactions Database (MINT), IntAct, NCBI Taxonomy, Gene Ontology (GO), Online Mendelian Inheritance in Man (OMIM), LocusLink, Entrez Gene and HomoloGene. The retrieval APIs and toolbox applications are critical components that offer end-users flexible, easy, integrated access to this data. We present use cases that use Atlas to integrate these sources for genome annotation, inference of molecular interactions across species, and gene-disease associations. The Atlas biological data warehouse serves as data infrastructure for bioinformatics research and development. It forms the backbone of the research activities in our laboratory and facilitates the integration of disparate, heterogeneous biological sources of data enabling new scientific inferences. Atlas achieves integration of diverse data sets at two levels. First, Atlas stores data of similar types using common data models, enforcing the relationships between data types. Second, integration is achieved through a combination of APIs, ontology, and tools. The Atlas software is freely available under the GNU General Public License at: http://bioinformatics.ubc.ca/atlas/
Atlas – a data warehouse for integrative bioinformatics
Shah, Sohrab P; Huang, Yong; Xu, Tao; Yuen, Macaire MS; Ling, John; Ouellette, BF Francis
2005-01-01
Background We present a biological data warehouse called Atlas that locally stores and integrates biological sequences, molecular interactions, homology information, functional annotations of genes, and biological ontologies. The goal of the system is to provide data, as well as a software infrastructure for bioinformatics research and development. Description The Atlas system is based on relational data models that we developed for each of the source data types. Data stored within these relational models are managed through Structured Query Language (SQL) calls that are implemented in a set of Application Programming Interfaces (APIs). The APIs include three languages: C++, Java, and Perl. The methods in these API libraries are used to construct a set of loader applications, which parse and load the source datasets into the Atlas database, and a set of toolbox applications which facilitate data retrieval. Atlas stores and integrates local instances of GenBank, RefSeq, UniProt, Human Protein Reference Database (HPRD), Biomolecular Interaction Network Database (BIND), Database of Interacting Proteins (DIP), Molecular Interactions Database (MINT), IntAct, NCBI Taxonomy, Gene Ontology (GO), Online Mendelian Inheritance in Man (OMIM), LocusLink, Entrez Gene and HomoloGene. The retrieval APIs and toolbox applications are critical components that offer end-users flexible, easy, integrated access to this data. We present use cases that use Atlas to integrate these sources for genome annotation, inference of molecular interactions across species, and gene-disease associations. Conclusion The Atlas biological data warehouse serves as data infrastructure for bioinformatics research and development. It forms the backbone of the research activities in our laboratory and facilitates the integration of disparate, heterogeneous biological sources of data enabling new scientific inferences. Atlas achieves integration of diverse data sets at two levels. First, Atlas stores data of similar types using common data models, enforcing the relationships between data types. Second, integration is achieved through a combination of APIs, ontology, and tools. The Atlas software is freely available under the GNU General Public License at: PMID:15723693
Deployable Command and Control System for Over the Horizon Small Boat Operations
2006-09-01
the HP iPAQ Navigation System bundle. There is no programmable Application Programming Interface (API), nor otherwise accessible methods to ...High Point Software which comes complete with a C# library to allow customized programs to access Bluetooth enabled GPS devices. GPSAccess...data could be displayed along with ownship’s positional data, but the program was designed to only work with the Ross radios and the MS Windows XP
Oceanographic data at your fingertips: the SOCIB App for smartphones
NASA Astrophysics Data System (ADS)
Lora, Sebastian; Sebastian, Kristian; Troupin, Charles; Pau Beltran, Joan; Frontera, Biel; Gómara, Sonia; Tintoré, Joaquín
2015-04-01
The Balearic Islands Coastal Ocean Observing and Forecasting System (SOCIB, http://www.socib.es) is a multi-platform Marine Research Infrastructure that generates data from the nearshore to the open sea in the Western Mediterranean. In line with SOCIB's principles of discoverable, freely available, and standardized data, a smartphone application (App) has been designed to provide easy, real-time access to all the data managed by SOCIB: underwater gliders, drifters, profiling buoys, the research vessel, HF radar, and numerical model outputs (hydrodynamics and waves). The Data Centre, responsible for the acquisition, processing, and visualization of all SOCIB data, developed a REpresentational State Transfer (REST) application programming interface (API) called "DataDiscovery" (http://apps.socib.es/DataDiscovery/). This API is made up of RESTful web services that provide information on platforms, instruments, and deployments of instruments, as well as the data themselves. In this way, SOCIB data can be integrated in third-party applications, developed either by the Data Centre or externally. A single point of data distribution not only allows efficient management but also simplifies the concepts and data access for external developers, who are not necessarily familiar with the concepts and tools of oceanographic or atmospheric data. The SOCIB App for Android (https://play.google.com/store/apps/details?id=com.socib) uses the API as a data backend, so that the information shown by the application can be managed without modifying and re-uploading the App itself. The only pieces of information that do not depend on the services are the App "Sections" and "Screens"; the content displayed in each of them is obtained through requests to the web services. The API is not used only for the smartphone App: presently, most SOCIB applications for data visualization and access rely on it, for instance the corporate website, the deployment application (Dapp, http://apps.socib.es/dapp/), and the Sea Boards (http://seaboard.socib.es/).
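A hedged Python sketch of consuming such a service is shown below; only the DataDiscovery base URL comes from the abstract, while the resource path, parameter name, and response fields are assumptions.

```python
# Hedged sketch of consuming the SOCIB "DataDiscovery" REST API. Only the base
# URL comes from the text; path, parameter, and field names are assumptions.
import requests

BASE = "http://apps.socib.es/DataDiscovery"

def list_platforms(platform_type=None):
    params = {"platformType": platform_type} if platform_type else {}   # assumed parameter
    resp = requests.get(f"{BASE}/list-platforms", params=params, timeout=30)  # assumed path
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    for platform in list_platforms("glider"):
        print(platform.get("name"), platform.get("lastLatitude"), platform.get("lastLongitude"))
```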
Using Map Service API for Driving Cycle Detection for Wearable GPS Data: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Lei; Gonder, Jeffrey D
Following advancements in smartphone and portable global positioning system (GPS) data collection, wearable GPS data have realized extensive use in transportation surveys and studies. The task of detecting driving cycles (driving or car-mode trajectory segments) from wearable GPS data has been the subject of much research. Specifically, distinguishing driving cycles from other motorized trips (such as taking a bus) is the main research problem in this paper. Many mode detection methods only focus on raw GPS speed data while some studies apply additional information, such as geographic information system (GIS) data, to obtain better detection performance. Procuring and maintaining dedicated road GIS data are costly and not trivial, whereas the technical maturity and broad use of map service application program interface (API) queries offers opportunities for mode detection tasks. The proposed driving cycle detection method takes advantage of map service APIs to obtain high-quality car-mode API route information and uses a trajectory segmentation algorithm to find the best-matched API route. The car-mode API route data combined with the actual route information, including the actual mode information, are used to train a logistic regression machine learning model, which estimates car modes and non-car modes with probability rates. The experimental results show promise for the proposed method's ability to detect vehicle mode accurately.
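The classification step can be sketched as follows; the two features and the synthetic training data are assumptions chosen only to illustrate the logistic-regression stage, not the authors' actual feature set.

```python
# Minimal sketch of the classification step: given a per-trip feature measuring
# how well the GPS trajectory matches the best car-mode route from a map-service
# API (plus a simple speed feature), fit a logistic regression that outputs a
# car / non-car probability. Features and training data are illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression

# columns: [route_match_score in 0..1, mean_speed_kmh]; labels: 1 = car, 0 = other mode
X = np.array([[0.95, 42.0], [0.90, 55.0], [0.40, 18.0],
              [0.35, 22.0], [0.88, 38.0], [0.30, 15.0]])
y = np.array([1, 1, 0, 0, 1, 0])

model = LogisticRegression().fit(X, y)

new_trip = np.array([[0.82, 47.0]])            # hypothetical trip features
print("P(car mode) =", model.predict_proba(new_trip)[0, 1])
```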
ERIC Educational Resources Information Center
Black, August
2011-01-01
The research presented in this dissertation studies and describes how technical standards, protocols, and application programming interfaces (APIs) shape the aesthetic, functional, and affective nature of our most dominant mode of online communication, the World Wide Web (WWW). I examine the politically charged and contentious battle over browser…
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Barringer, Howard
2012-01-01
TraceContract is an API (Application Programming Interface) for trace analysis. A trace is a sequence of events, and can, for example, be generated by a running program, instrumented appropriately to generate events. An event can be any data object. An example of a trace is a log file containing events that a programmer has found important to record during a program execution. TraceContract takes as input such a trace together with a specification formulated using the API and reports on any violations of the specification, potentially calling code (reactions) to be executed when violations are detected. The software is developed as an internal DSL (Domain Specific Language) in the Scala programming language. Scala is a relatively new programming language that is specifically convenient for defining such internal DSLs due to a number of language characteristics. This includes Scala's elegant combination of object-oriented and functional programming, a succinct notation, and an advanced type system. The DSL offers a combination of data-parameterized state machines and temporal logic, which is novel. As an extension of Scala, it is a very expressive and convenient log file analysis framework.
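TraceContract itself is a Scala DSL; the short Python sketch below only illustrates the underlying idea of a data-parameterized monitor checking a temporal property over a trace, with an invented property and event format.

```python
# Illustrative sketch (not the TraceContract Scala DSL) of a data-parameterized
# monitor: consume a trace of events and report violations of a temporal
# property, here "every granted resource must eventually be released, and never
# granted twice without a release".
def monitor(trace):
    held = set()           # resources currently granted (the "data parameters")
    violations = []
    for i, (event, resource) in enumerate(trace):
        if event == "grant":
            if resource in held:
                violations.append(f"event {i}: {resource} granted twice")
            held.add(resource)
        elif event == "release":
            if resource not in held:
                violations.append(f"event {i}: {resource} released but not held")
            held.discard(resource)
    violations += [f"end of trace: {r} never released" for r in sorted(held)]
    return violations

log = [("grant", "r1"), ("grant", "r2"), ("release", "r1"), ("grant", "r2")]
print(monitor(log))
```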
Newman, Jonathan P.; Zeller-Townson, Riley; Fong, Ming-Fai; Arcot Desai, Sharanya; Gross, Robert E.; Potter, Steve M.
2013-01-01
Single neuron feedback control techniques, such as voltage clamp and dynamic clamp, have enabled numerous advances in our understanding of ion channels, electrochemical signaling, and neural dynamics. Although commercially available multichannel recording and stimulation systems are commonly used for studying neural processing at the network level, they provide little native support for real-time feedback. We developed the open-source NeuroRighter multichannel electrophysiology hardware and software platform for closed-loop multichannel control with a focus on accessibility and low cost. NeuroRighter allows 64 channels of stimulation and recording for around US $10,000, along with the ability to integrate with other software and hardware. Here, we present substantial enhancements to the NeuroRighter platform, including a redesigned desktop application, a new stimulation subsystem allowing arbitrary stimulation patterns, low-latency data servers for accessing data streams, and a new application programming interface (API) for creating closed-loop protocols that can be inserted into NeuroRighter as plugin programs. This greatly simplifies the design of sophisticated real-time experiments without sacrificing the power and speed of a compiled programming language. Here we present a detailed description of NeuroRighter as a stand-alone application, its plugin API, and an extensive set of case studies that highlight the system’s abilities for conducting closed-loop, multichannel interfacing experiments. PMID:23346047
Zhang, Mingyuan; Velasco, Ferdinand T.; Musser, R. Clayton; Kawamoto, Kensaku
2013-01-01
Enabling clinical decision support (CDS) across multiple electronic health record (EHR) systems has been a desired but largely unattained aim of clinical informatics, especially in commercial EHR systems. A potential opportunity for enabling such scalable CDS is to leverage vendor-supported, Web-based CDS development platforms along with vendor-supported application programming interfaces (APIs). Here, we propose a potential staged approach for enabling such scalable CDS, starting with the use of custom EHR APIs and moving towards standardized EHR APIs to facilitate interoperability. We analyzed three commercial EHR systems for their capabilities to support the proposed approach, and we implemented prototypes in all three systems. Based on these analyses and prototype implementations, we conclude that the approach proposed is feasible, already supported by several major commercial EHR vendors, and potentially capable of enabling cross-platform CDS at scale. PMID:24551426
QSPIN: A High Level Java API for Quantum Computing Experimentation
NASA Technical Reports Server (NTRS)
Barth, Tim
2017-01-01
QSPIN is a high level Java language API for experimentation in QC models used in the calculation of Ising spin glass ground states and related quadratic unconstrained binary optimization (QUBO) problems. The Java API is intended to facilitate research in advanced QC algorithms such as hybrid quantum-classical solvers, automatic selection of constraint and optimization parameters, and techniques for the correction and mitigation of model and solution errors. QSPIN includes high level solver objects tailored to the D-Wave quantum annealing architecture that implement hybrid quantum-classical algorithms [Booth et al.] for solving large problems on small quantum devices, elimination of variables via roof duality, and classical computing optimization methods such as GPU accelerated simulated annealing and tabu search for comparison. A test suite of documented NP-complete applications ranging from graph coloring, covering, and partitioning to integer programming and scheduling are provided to demonstrate current capabilities.
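QSPIN is a Java API; the Python sketch below only illustrates the problem class it targets, minimizing a small QUBO by brute force with an arbitrary example matrix. Realistic instances would be handed to a quantum annealer or to the classical heuristics named in the abstract.

```python
# Conceptual sketch (not the QSPIN Java API) of the target problem class:
# minimize a small QUBO, x^T Q x over binary x, here by exhaustive search.
import itertools
import numpy as np

# a tiny illustrative QUBO matrix; off-diagonal terms are couplings
Q = np.array([[-1.0,  2.0,  0.0],
              [ 0.0, -1.0,  2.0],
              [ 0.0,  0.0, -1.0]])

best_x, best_e = None, float("inf")
for bits in itertools.product([0, 1], repeat=Q.shape[0]):
    x = np.array(bits)
    energy = x @ Q @ x
    if energy < best_e:
        best_x, best_e = x, energy

print("ground state:", best_x, "energy:", best_e)
```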
Jefferson, Emily R.; Walsh, Thomas P.; Roberts, Timothy J.; Barton, Geoffrey J.
2007-01-01
SNAPPI-DB, a high performance database of Structures, iNterfaces and Alignments of Protein–Protein Interactions, and its associated Java Application Programming Interface (API) is described. SNAPPI-DB contains structural data, down to the level of atom co-ordinates, for each structure in the Protein Data Bank (PDB) together with associated data including SCOP, CATH, Pfam, SWISSPROT, InterPro, GO terms, Protein Quaternary Structures (PQS) and secondary structure information. Domain–domain interactions are stored for multiple domain definitions and are classified by their Superfamily/Family pair and interaction interface. Each set of classified domain–domain interactions has an associated multiple structure alignment for each partner. The API facilitates data access via PDB entries, domains and domain–domain interactions. Rapid development, fast database access and the ability to perform advanced queries without the requirement for complex SQL statements are provided via an object oriented database and the Java Data Objects (JDO) API. SNAPPI-DB contains many features which are not available in other databases of structural protein–protein interactions. It has been applied in three studies on the properties of protein–protein interactions and is currently being employed to train a protein–protein interaction predictor and a functional residue predictor. The database, API and manual are available for download at: . PMID:17202171
Environmental Models as a Service: Enabling Interoperability ...
Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantage of streamlined deployment processes and affordable cloud access to move algorithms and data to the web for discoverability and consumption. In these deployments, environmental models can become available to end users through RESTful web services and consistent application program interfaces (APIs) that consume, manipulate, and store modeling data. RESTful modeling APIs also promote discoverability and guide usability through self-documentation. Embracing the RESTful paradigm allows models to be accessible via a web standard, and the resulting endpoints are platform- and implementation-agnostic while simultaneously presenting significant computational capabilities for spatial and temporal scaling. RESTful APIs present data in a simple verb-noun web request interface: the verb dictates how a resource is consumed using HTTP methods (e.g., GET, POST, and PUT) and the noun represents the URL reference of the resource on which the verb will act. The RESTful API can self-document in both the HTTP response and an interactive web page using the Open API standard. This lets models function as an interoperable service that promotes sharing, documentation, and discoverability. Here, we discuss the
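The verb-noun pattern described here can be illustrated with a short, hypothetical Python client; the host, resource paths, and payload fields below are not from the text.

```python
# Hedged sketch of the RESTful verb-noun pattern for a model-as-a-service
# endpoint. The host, paths, and payload fields are hypothetical.
import requests

BASE = "https://example.org/api/watershed-model"     # hypothetical service

def run_model(inputs):
    created = requests.post(f"{BASE}/runs", json=inputs, timeout=60)   # verb POST, noun /runs
    created.raise_for_status()
    run_id = created.json()["id"]
    result = requests.get(f"{BASE}/runs/{run_id}", timeout=60)         # verb GET, noun /runs/{id}
    result.raise_for_status()
    return result.json()

if __name__ == "__main__":
    print(run_model({"precip_mm": 12.5, "start": "2020-01-01", "days": 30}))
```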
30 CFR 250.1900 - Must I have a SEMS program?
Code of Federal Regulations, 2012 CFR
2012-07-01
... Environmental Management Program for Offshore Operations and Facilities (API RP 75) (as incorporated by... subpart and API RP 75 (as incorporated by reference in § 250.198), you must follow the requirements of...
30 CFR 250.1900 - Must I have a SEMS program?
Code of Federal Regulations, 2011 CFR
2011-07-01
... Safety and Environmental Management Program for Offshore Operations and Facilities (API RP 75... requirements of this subpart and API RP 75 (incorporated by reference as specified in § 250.198), you must...
2016-03-01
Representational state transfer Java messaging service Java application programming interface (API) Internet relay chat (IRC)/extensible messaging and...JBoss application server or an Apache Tomcat servlet container instance. The relational database management system can be either PostgreSQL or MySQL ... Java library called direct web remoting. This library has been part of the core CACE architecture for quite some time; however, there have not been
From WSN towards WoT: Open API Scheme Based on oneM2M Platforms.
Kim, Jaeho; Choi, Sung-Chan; Ahn, Il-Yeup; Sung, Nak-Myoung; Yun, Jaeseok
2016-10-06
Conventional computing systems have been able to be integrated into daily objects and connected to each other due to advances in computing and network technologies, such as wireless sensor networks (WSNs), forming a global network infrastructure, called the Internet of Things (IoT). To support the interconnection and interoperability between heterogeneous IoT systems, the availability of standardized, open application programming interfaces (APIs) is one of the key features of common software platforms for IoT devices, gateways, and servers. In this paper, we present a standardized way of extending previously-existing WSNs towards IoT systems, building the world of the Web of Things (WoT). Based on the oneM2M software platforms developed in the previous project, we introduce a well-designed open API scheme and device-specific thing adaptation software (TAS) enabling WSN elements, such as a wireless sensor node, to be accessed in a standardized way on a global scale. Three pilot services are implemented (i.e., a WiFi-enabled smart flowerpot, voice-based control for ZigBee-connected home appliances, and WiFi-connected AR.Drone control) to demonstrate the practical usability of the open API scheme and TAS modules. Full details on the method of integrating WSN elements into three example systems are described at the programming code level, which is expected to help future researchers in integrating their WSN systems in IoT platforms, such as oneM2M. We hope that the flexibly-deployable, easily-reusable common open API scheme and TAS-based integration method working with the oneM2M platforms will help the conventional WSNs in diverse industries evolve into the emerging WoT solutions.
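As a hedged illustration of the kind of standardized access the open API scheme targets, the sketch below posts a sensor reading to a oneM2M-style resource over the HTTP binding; the CSE address, originator ID, and container name are hypothetical, and the header and payload conventions reflect our reading of the oneM2M HTTP binding rather than the authors' exact implementation.

```python
# Hedged sketch: push a sensor reading to a oneM2M-style resource tree over the
# HTTP binding. CSE address, originator, and container name are hypothetical;
# header and payload conventions should be checked against the platform in use.
import requests

CSE = "http://127.0.0.1:8080/cse-base"             # hypothetical oneM2M CSE base

def post_reading(container, value):
    headers = {
        "X-M2M-Origin": "S_sensor01",              # assumed originator (AE) ID
        "X-M2M-RI": "req-0001",                    # request identifier
        "Content-Type": "application/json;ty=4",   # ty=4: contentInstance
    }
    payload = {"m2m:cin": {"con": str(value)}}
    resp = requests.post(f"{CSE}/{container}", headers=headers, json=payload, timeout=10)
    resp.raise_for_status()
    return resp.status_code

if __name__ == "__main__":
    print(post_reading("flowerpot/soil_moisture", 41.7))
```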
From WSN towards WoT: Open API Scheme Based on oneM2M Platforms
Kim, Jaeho; Choi, Sung-Chan; Ahn, Il-Yeup; Sung, Nak-Myoung; Yun, Jaeseok
2016-01-01
Conventional computing systems have been able to be integrated into daily objects and connected to each other due to advances in computing and network technologies, such as wireless sensor networks (WSNs), forming a global network infrastructure, called the Internet of Things (IoT). To support the interconnection and interoperability between heterogeneous IoT systems, the availability of standardized, open application programming interfaces (APIs) is one of the key features of common software platforms for IoT devices, gateways, and servers. In this paper, we present a standardized way of extending previously-existing WSNs towards IoT systems, building the world of the Web of Things (WoT). Based on the oneM2M software platforms developed in the previous project, we introduce a well-designed open API scheme and device-specific thing adaptation software (TAS) enabling WSN elements, such as a wireless sensor node, to be accessed in a standardized way on a global scale. Three pilot services are implemented (i.e., a WiFi-enabled smart flowerpot, voice-based control for ZigBee-connected home appliances, and WiFi-connected AR.Drone control) to demonstrate the practical usability of the open API scheme and TAS modules. Full details on the method of integrating WSN elements into three example systems are described at the programming code level, which is expected to help future researchers in integrating their WSN systems in IoT platforms, such as oneM2M. We hope that the flexibly-deployable, easily-reusable common open API scheme and TAS-based integration method working with the oneM2M platforms will help the conventional WSNs in diverse industries evolve into the emerging WoT solutions. PMID:27782058
Hynes, Martin; Wang, Han; Kilmartin, Liam
2009-01-01
Over the last decade, there has been substantial research interest in the application of accelerometry data for many forms of automated gait and activity analysis algorithms. This paper introduces a summary of new "off-the-shelf" mobile phone handset platforms containing embedded accelerometers which support the development of custom software to implement real time analysis of the accelerometer data. An overview of the main software programming environments which support the development of such software, including Java ME based JSR 256 API, C++ based Motion Sensor API and the Python based "aXYZ" module, is provided. Finally, a sample application is introduced and its performance evaluated in order to illustrate how a standard mobile phone can be used to detect gait activity using such a non-intrusive and easily accepted sensing platform.
A RESTful API for accessing microbial community data for MG-RAST
Wilke, Andreas; Bischof, Jared; Harrison, Travis; ...
2015-01-08
Metagenomic sequencing has produced significant amounts of data in recent years. For example, as of summer 2013, MG-RAST has been used to annotate over 110,000 data sets totaling over 43 Terabases. With metagenomic sequencing finding even wider adoption in the scientific community, the existing web-based analysis tools and infrastructure in MG-RAST provide limited capability for data retrieval and analysis, such as comparative analysis between multiple data sets. Moreover, although the system provides many analysis tools, it is not comprehensive. By opening MG-RAST up via a web services API (application programmers interface) we have greatly expanded access to MG-RAST data, as well as provided a mechanism for the use of third-party analysis tools with MG-RAST data. This RESTful API makes all data and data objects created by the MG-RAST pipeline accessible as JSON objects. As part of the DOE Systems Biology Knowledgebase project (KBase, http://kbase.us) we have implemented a web services API for MG-RAST. This API complements the existing MG-RAST web interface and constitutes the basis of KBase’s microbial community capabilities. In addition, the API exposes a comprehensive collection of data to programmers. This API, which uses a RESTful (Representational State Transfer) implementation, is compatible with most programming environments and should be easy to use for end users and third parties. It provides comprehensive access to sequence data, quality control results, annotations, and many other data types. Where feasible, we have used standards to expose data and metadata. Code examples are provided in a number of languages both to show the versatility of the API and to provide a starting point for users. We present an API that exposes the data in MG-RAST for consumption by our users, greatly enhancing the utility of the MG-RAST service.
A RESTful API for Accessing Microbial Community Data for MG-RAST
Wilke, Andreas; Bischof, Jared; Harrison, Travis; Brettin, Tom; D'Souza, Mark; Gerlach, Wolfgang; Matthews, Hunter; Paczian, Tobias; Wilkening, Jared; Glass, Elizabeth M.; Desai, Narayan; Meyer, Folker
2015-01-01
Metagenomic sequencing has produced significant amounts of data in recent years. For example, as of summer 2013, MG-RAST has been used to annotate over 110,000 data sets totaling over 43 Terabases. With metagenomic sequencing finding even wider adoption in the scientific community, the existing web-based analysis tools and infrastructure in MG-RAST provide limited capability for data retrieval and analysis, such as comparative analysis between multiple data sets. Moreover, although the system provides many analysis tools, it is not comprehensive. By opening MG-RAST up via a web services API (application programmers interface) we have greatly expanded access to MG-RAST data, as well as provided a mechanism for the use of third-party analysis tools with MG-RAST data. This RESTful API makes all data and data objects created by the MG-RAST pipeline accessible as JSON objects. As part of the DOE Systems Biology Knowledgebase project (KBase, http://kbase.us) we have implemented a web services API for MG-RAST. This API complements the existing MG-RAST web interface and constitutes the basis of KBase's microbial community capabilities. In addition, the API exposes a comprehensive collection of data to programmers. This API, which uses a RESTful (Representational State Transfer) implementation, is compatible with most programming environments and should be easy to use for end users and third parties. It provides comprehensive access to sequence data, quality control results, annotations, and many other data types. Where feasible, we have used standards to expose data and metadata. Code examples are provided in a number of languages both to show the versatility of the API and to provide a starting point for users. We present an API that exposes the data in MG-RAST for consumption by our users, greatly enhancing the utility of the MG-RAST service. PMID:25569221
Develop 3G Application with The J2ME SATSA API
NASA Astrophysics Data System (ADS)
JunWu, Xu; JunLing, Liang
This paper describes research in the use of the Security and Trust Services API for J2ME (SATSA) to develop mobile applications for 3G networks. SATSA defines a set of APIs that allows J2ME applications to communicate with and access functionality, secure storage and cryptographic operations provided by security elements such as smart cards and Wireless Identification Modules (WIM). A Java Card application could also work as an authentication module in a J2ME-based e-bank application. The e-bank application would allow its users to access their bank accounts using their cell phones.
30 CFR 250.1900 - Must I have a SEMS program?
Code of Federal Regulations, 2013 CFR
2013-07-01
... Environmental Management Program for Offshore Operations and Facilities (API RP 75) (as incorporated by... conflicts between the requirements of this subpart and API RP 75; COS-2-01, COS-2-03, or COS-2-04; or ISO...
30 CFR 250.1900 - Must I have a SEMS program?
Code of Federal Regulations, 2014 CFR
2014-07-01
... Environmental Management Program for Offshore Operations and Facilities (API RP 75) (as incorporated by... conflicts between the requirements of this subpart and API RP 75; COS-2-01, COS-2-03, or COS-2-04; or ISO...
Detecting Runtime Anomalies in AJAX Applications through Trace Analysis
2011-08-10
statements by adding the instrumentation to the GWT UI classes, leaving the user code untouched. Some content management frameworks such as Drupal [12...Google web toolkit.” http://code.google.com/webtoolkit/. [12] “Form generation – drupal api.” http://api.drupal.org/api/group/form_api/6. 9
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spires, S.
This code provides an application programming interface to the Macintosh OSX Carbon Databrowser from Macintosh Common Lisp. The Databrowser API is made available to Lisp via high level native CLOS classes and methods, obviating the need to write low-level Carbon code. This code is primarily glue in that its job is to provide an interface between two extant software tools: Macintosh Common Lisp and the OSX Databrowser, both of which are COTS products from private vendors. The Databrowser is an extremely useful user interface widget that is provided with Apple's OSX (and to some extent, OS9) operating systems. One Apple-sanctioned method for using the Databrowser is via an API called Carbon, which is designed for C and C++ programmers. We have translated the low-level Carbon programming interface to the Databrowser into high-level object-oriented Common Lisp calls, functions, methods, and classes to enable MCL programmers to more readily take advantage of the Databrowser from Lisp programs.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-09
... makes available to Trading Permit Holders various application programming interfaces ("APIs"), such... Permit Holders to enter and execute orders, as well as submit certain order and trade data to the Exchange, which data the Exchange uses to conduct surveillances of its markets and Trading Permit Holders...
A "Simple Query Interface" Adapter for the Discovery and Exchange of Learning Resources
ERIC Educational Resources Information Center
Massart, David
2006-01-01
Developed as part of CEN/ISSS Workshop on Learning Technology efforts to improve interoperability between learning resource repositories, the Simple Query Interface (SQI) is an Application Program Interface (API) for querying heterogeneous repositories of learning resource metadata. In the context of the ProLearn Network of Excellence, SQI is used…
The final session of the workshop considered the subject of software technology and how it might be better constructed to support those who develop, evaluate, and apply multimedia environmental models. Two invited presentations were featured along with an extended open discussio...
Identifying and assessing highly hazardous drugs within quality risk management programs.
Sussman, Robert G; Schatz, Anthony R; Kimmel, Tracy A; Ader, Allan; Naumann, Bruce D; Weideman, Patricia A
2016-08-01
Historically, pharmaceutical industry regulatory guidelines have assigned certain active pharmaceutical ingredients (APIs) to various categories of concern, such as "cytotoxic", "hormones", and "steroids". These categories have been used to identify APIs requiring segregation or dedication in order to prevent cross-contamination and protect the quality and safety of drug products. Since these terms were never defined by regulatory authorities, and many novel pharmacological mechanisms challenge these categories, there is a recognized need to modify the historical use of these terms. The application of a risk-based approach using a health-based limit, such as an acceptable daily exposure (ADE), is more appropriate for the development of a Quality Risk Management Program (QRMP) than the use of categories of concern. The toxicological and pharmacological characteristics of these categories are discussed to help identify and prioritize compounds requiring special attention. Controlling airborne concentrations and the contamination of product contact surfaces in accordance with values derived from quantitative risk assessments can prevent adverse effects in workers and patients, regardless of specific categorical designations to which these APIs have been assigned. The authors acknowledge the movement away from placing compounds into categories and, while not yet universal, the importance of basing QRMPs on compound-specific ADEs and risk assessments. Based on the results of a risk assessment, segregation and dedication may also be required for some compounds to prevent cross contamination during manufacture of APIs. Copyright © 2016 Elsevier Inc. All rights reserved.
49 CFR 192.616 - Public awareness.
Code of Federal Regulations, 2012 CFR
2012-10-01
... follows the guidance provided in the American Petroleum Institute's (API) Recommended Practice (RP) 1162... recommendations of API RP 1162 and assess the unique attributes and characteristics of the operator's pipeline and... supplemental requirements of API RP 1162, unless the operator provides justification in its program or...
49 CFR 192.616 - Public awareness.
Code of Federal Regulations, 2014 CFR
2014-10-01
... follows the guidance provided in the American Petroleum Institute's (API) Recommended Practice (RP) 1162... recommendations of API RP 1162 and assess the unique attributes and characteristics of the operator's pipeline and... supplemental requirements of API RP 1162, unless the operator provides justification in its program or...
49 CFR 192.616 - Public awareness.
Code of Federal Regulations, 2013 CFR
2013-10-01
... follows the guidance provided in the American Petroleum Institute's (API) Recommended Practice (RP) 1162... recommendations of API RP 1162 and assess the unique attributes and characteristics of the operator's pipeline and... supplemental requirements of API RP 1162, unless the operator provides justification in its program or...
Ayres, Daniel L; Darling, Aaron; Zwickl, Derrick J; Beerli, Peter; Holder, Mark T; Lewis, Paul O; Huelsenbeck, John P; Ronquist, Fredrik; Swofford, David L; Cummings, Michael P; Rambaut, Andrew; Suchard, Marc A
2012-01-01
Phylogenetic inference is fundamental to our understanding of most aspects of the origin and evolution of life, and in recent years, there has been a concentration of interest in statistical approaches such as Bayesian inference and maximum likelihood estimation. Yet, for large data sets and realistic or interesting models of evolution, these approaches remain computationally demanding. High-throughput sequencing can yield data for thousands of taxa, but scaling to such problems using serial computing often necessitates the use of nonstatistical or approximate approaches. The recent emergence of graphics processing units (GPUs) provides an opportunity to leverage their excellent floating-point computational performance to accelerate statistical phylogenetic inference. A specialized library for phylogenetic calculation would allow existing software packages to make more effective use of available computer hardware, including GPUs. Adoption of a common library would also make it easier for other emerging computing architectures, such as field programmable gate arrays, to be used in the future. We present BEAGLE, an application programming interface (API) and library for high-performance statistical phylogenetic inference. The API provides a uniform interface for performing phylogenetic likelihood calculations on a variety of compute hardware platforms. The library includes a set of efficient implementations and can currently exploit hardware including GPUs using NVIDIA CUDA, central processing units (CPUs) with Streaming SIMD Extensions and related processor supplementary instruction sets, and multicore CPUs via OpenMP. To demonstrate the advantages of a common API, we have incorporated the library into several popular phylogenetic software packages. The BEAGLE library is free open source software licensed under the Lesser GPL and available from http://beagle-lib.googlecode.com. An example client program is available as public domain software.
Ayres, Daniel L.; Darling, Aaron; Zwickl, Derrick J.; Beerli, Peter; Holder, Mark T.; Lewis, Paul O.; Huelsenbeck, John P.; Ronquist, Fredrik; Swofford, David L.; Cummings, Michael P.; Rambaut, Andrew; Suchard, Marc A.
2012-01-01
Abstract Phylogenetic inference is fundamental to our understanding of most aspects of the origin and evolution of life, and in recent years, there has been a concentration of interest in statistical approaches such as Bayesian inference and maximum likelihood estimation. Yet, for large data sets and realistic or interesting models of evolution, these approaches remain computationally demanding. High-throughput sequencing can yield data for thousands of taxa, but scaling to such problems using serial computing often necessitates the use of nonstatistical or approximate approaches. The recent emergence of graphics processing units (GPUs) provides an opportunity to leverage their excellent floating-point computational performance to accelerate statistical phylogenetic inference. A specialized library for phylogenetic calculation would allow existing software packages to make more effective use of available computer hardware, including GPUs. Adoption of a common library would also make it easier for other emerging computing architectures, such as field programmable gate arrays, to be used in the future. We present BEAGLE, an application programming interface (API) and library for high-performance statistical phylogenetic inference. The API provides a uniform interface for performing phylogenetic likelihood calculations on a variety of compute hardware platforms. The library includes a set of efficient implementations and can currently exploit hardware including GPUs using NVIDIA CUDA, central processing units (CPUs) with Streaming SIMD Extensions and related processor supplementary instruction sets, and multicore CPUs via OpenMP. To demonstrate the advantages of a common API, we have incorporated the library into several popular phylogenetic software packages. The BEAGLE library is free open source software licensed under the Lesser GPL and available from http://beagle-lib.googlecode.com. An example client program is available as public domain software. PMID:21963610
Asynchronous Object Storage with QoS for Scientific and Commercial Big Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brim, Michael J; Dillow, David A; Oral, H Sarp
2013-01-01
This paper presents our design for an asynchronous object storage system intended for use in scientific and commercial big data workloads. Use cases from the target workload domains are used to motivate the key abstractions used in the application programming interface (API). The architecture of the Scalable Object Store (SOS), a prototype object storage system that supports the API's facilities, is presented. The SOS serves as a vehicle for future research into scalable and resilient big data object storage. We briefly review our research into providing efficient storage servers capable of providing quality of service (QoS) contracts relevant for big data use cases.
The Electricity Data Browser shows generation, consumption, fossil fuel receipts, stockpiles, retail sales, and electricity prices. The data appear on an interactive web page and are updated each month. The Electricity Data Browser includes all the datasets collected and published in EIA's Electric Power Monthly and allows users to perform dynamic charting of data sets as well as map the data by state. The data browser includes a series of reports that appear in the Electric Power Monthly and allows readers to drill down to plant level statistics, where available. All images and datasets are available for download. Users can also link to the data series in EIA's Application Programming Interface (API). An API makes our data machine-readable and more accessible to users.
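For readers who want a concrete sense of what "machine-readable" access means here, the following is a minimal Python sketch of calling a key-authenticated data-series API of this kind with the requests library. The endpoint, parameter names, and series identifier are placeholders assumed for illustration, not EIA's documented interface.

```python
# Hypothetical sketch of retrieving one data series from a
# key-authenticated REST API such as the one described above.
# The endpoint, parameter names, and series id are assumptions.
import requests

API_KEY = "YOUR_API_KEY"                        # assumed: issued on registration
BASE_URL = "https://api.example.gov/series"     # placeholder endpoint

def fetch_series(series_id: str) -> dict:
    """Request a single data series and return the decoded JSON payload."""
    resp = requests.get(
        BASE_URL,
        params={"api_key": API_KEY, "series_id": series_id},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    data = fetch_series("EXAMPLE.SERIES.ID")    # hypothetical series identifier
    print(sorted(data.keys()))
```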
Huang, Linda; Fernandes, Helen; Zia, Hamid; Tavassoli, Peyman; Rennert, Hanna; Pisapia, David; Imielinski, Marcin; Sboner, Andrea; Rubin, Mark A; Kluk, Michael; Elemento, Olivier
2017-05-01
This paper describes the Precision Medicine Knowledge Base (PMKB; https://pmkb.weill.cornell.edu ), an interactive online application for collaborative editing, maintenance, and sharing of structured clinical-grade cancer mutation interpretations. PMKB was built using the Ruby on Rails Web application framework. Leveraging existing standards such as the Human Genome Variation Society variant description format, we implemented a data model that links variants to tumor-specific and tissue-specific interpretations. Key features of PMKB include support for all major variant types, standardized authentication, distinct user roles including high-level approvers, and detailed activity history. A REpresentational State Transfer (REST) application-programming interface (API) was implemented to query the PMKB programmatically. At the time of writing, PMKB contains 457 variant descriptions with 281 clinical-grade interpretations. The EGFR, BRAF, KRAS, and KIT genes are associated with the largest numbers of interpretable variants. PMKB's interpretations have been used in over 1500 AmpliSeq tests and 750 whole-exome sequencing tests. The interpretations are accessed either directly via the Web interface or programmatically via the existing API. An accurate and up-to-date knowledge base of genomic alterations of clinical significance is critical to the success of precision medicine programs. The open-access, programmatically accessible PMKB represents an important attempt at creating such a resource in the field of oncology. The PMKB was designed to help collect and maintain clinical-grade mutation interpretations and facilitate reporting for clinical cancer genomic testing. The PMKB was also designed to enable the creation of clinical cancer genomics automated reporting pipelines via an API. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association.
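As a rough illustration of the programmatic access described above, the sketch below queries a hypothetical interpretations endpoint by gene symbol with Python's requests library; the path, query parameter, and response structure are assumptions for illustration, not PMKB's published routes.

```python
# Hypothetical sketch of querying a REST knowledge base such as PMKB
# for clinical-grade interpretations of variants in one gene.
# The API root path, query parameter, and JSON layout are assumptions.
import requests

BASE_URL = "https://pmkb.weill.cornell.edu/api"   # assumed API root

def interpretations_for_gene(gene_symbol: str) -> list:
    """Return the interpretations reported for a given gene symbol."""
    resp = requests.get(
        f"{BASE_URL}/interpretations",
        params={"gene": gene_symbol},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

for interpretation in interpretations_for_gene("EGFR"):
    print(interpretation)
```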
A RESTful application programming interface for the PubMLST molecular typing and genome databases
Bray, James E.; Maiden, Martin C. J.
2017-01-01
Abstract Molecular typing is used to differentiate microorganisms at the subspecies or strain level for epidemiological investigations, infection control, public health and environmental sampling. DNA sequence-based typing methods require authoritative databases that link sequence variants to nomenclature in order to facilitate communication and comparison of identified types in national or global settings. The PubMLST website (https://pubmlst.org/) fulfils this role for over a hundred microorganisms for which it hosts curated molecular sequence typing data, providing sequence and allelic profile definitions for multi-locus sequence typing (MLST) and single-gene typing approaches. In recent years, these have expanded to cover the whole genome with schemes such as core genome MLST (cgMLST) and whole genome MLST (wgMLST) which catalogue the allelic diversity found in hundreds to thousands of genes. These approaches provide a common nomenclature for high-resolution strain characterization and comparison. Molecular typing information is linked to isolate provenance, phenotype, and increasingly genome assemblies, providing a resource for outbreak investigation and research into population structure, gene association, global epidemiology and vaccine coverage. A Representational State Transfer (REST) Application Programming Interface (API) has been developed for the PubMLST website to make these large quantities of structured molecular typing and whole genome sequence data available for programmatic access by any third party application. The API is an integral component of the Bacterial Isolate Genome Sequence Database (BIGSdb) platform that is used to host PubMLST resources, and exposes all public data within the site. In addition to data browsing, searching and download, the API supports authentication and submission of new data to curator queues. Database URL: http://rest.pubmlst.org/ PMID:29220452
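A minimal sketch of programmatic access via the API root given above follows; it simply walks the hyperlinked resource listing that a discoverable REST interface of this kind returns. The shape of the JSON response is an assumption, so the loop only prints whatever entries the root advertises.

```python
# Minimal sketch of browsing a discoverable REST API such as PubMLST's.
# Only the root URL is taken from the record above; the structure of
# the returned JSON is treated as unknown and simply printed.
import requests

ROOT = "http://rest.pubmlst.org/"   # API root quoted in the record above

resp = requests.get(ROOT, timeout=30)
resp.raise_for_status()

# Print the top-level resources (databases, schemes, etc.) the root lists.
for entry in resp.json():
    print(entry)
```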
Layered approach to workstation design for medical image viewing
NASA Astrophysics Data System (ADS)
Haynor, David R.; Zick, Gregory L.; Heritage, Marcus B.; Kim, Yongmin
1992-07-01
Software engineering principles suggest that complex software systems are best constructed from independent, self-contained modules, thereby maximizing the portability, maintainability and modifiability of the produced code. This principle is important in the design of medical imaging workstations, where further developments in technology (CPU, memory, interface devices, displays, network connections) are required for clinically acceptable workstations, and it is desirable to provide different hardware platforms with the "same look and feel" for the user. In addition, the set of desired functions is relatively well understood, but the optimal user interface for delivering these functions on a clinically acceptable workstation is still different depending on department, specialty, or individual preference. At the University of Washington, we are developing a viewing station based on the IBM RISC/6000 computer and on new technologies that are just becoming commercially available. These include advanced voice recognition systems and an ultra-high-speed network. We are developing a set of specifications and a conceptual design for the workstation, and will be producing a prototype. This paper presents our current concepts concerning the architecture and software system design of the future prototype. Our conceptual design specifies requirements for a Database Application Programming Interface (DBAPI) and for a User API (UAPI). The DBAPI consists of a set of subroutine calls that define the admissible transactions between the workstation and an image archive. The UAPI describes the requests a user interface program can make of the workstation. It incorporates basic display and image processing functions, yet is specifically designed to allow extensions to the basic set at the application level. We will discuss the fundamental elements of the two APIs and illustrate their application to workstation design.
Huang, Linda; Fernandes, Helen; Zia, Hamid; Tavassoli, Peyman; Rennert, Hanna; Pisapia, David; Imielinski, Marcin; Sboner, Andrea; Rubin, Mark A; Kluk, Michael
2017-01-01
Objective: This paper describes the Precision Medicine Knowledge Base (PMKB; https://pmkb.weill.cornell.edu), an interactive online application for collaborative editing, maintenance, and sharing of structured clinical-grade cancer mutation interpretations. Materials and Methods: PMKB was built using the Ruby on Rails Web application framework. Leveraging existing standards such as the Human Genome Variation Society variant description format, we implemented a data model that links variants to tumor-specific and tissue-specific interpretations. Key features of PMKB include support for all major variant types, standardized authentication, distinct user roles including high-level approvers, and detailed activity history. A REpresentational State Transfer (REST) application-programming interface (API) was implemented to query the PMKB programmatically. Results: At the time of writing, PMKB contains 457 variant descriptions with 281 clinical-grade interpretations. The EGFR, BRAF, KRAS, and KIT genes are associated with the largest numbers of interpretable variants. PMKB’s interpretations have been used in over 1500 AmpliSeq tests and 750 whole-exome sequencing tests. The interpretations are accessed either directly via the Web interface or programmatically via the existing API. Discussion: An accurate and up-to-date knowledge base of genomic alterations of clinical significance is critical to the success of precision medicine programs. The open-access, programmatically accessible PMKB represents an important attempt at creating such a resource in the field of oncology. Conclusion: The PMKB was designed to help collect and maintain clinical-grade mutation interpretations and facilitate reporting for clinical cancer genomic testing. The PMKB was also designed to enable the creation of clinical cancer genomics automated reporting pipelines via an API. PMID:27789569
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-10
... that incorporates the management program and principles of API RP 75 is appropriate for vessels engaged... which would incorporate the management program and principles of API RP 75. Table 1 shows the current... to develop, implement, and maintain a SEMS that incorporates the management program and principles of...
Mobile Phone Application Development for the Classroom
NASA Astrophysics Data System (ADS)
Lewis, P.; Oostra, D.; Crecelius, S.; Chambers, L. H.
2012-08-01
With smartphone sales currently surpassing laptop sales, it is hard not to think that these devices will have a place in the classroom. More specifically, with little to no monetary investment, classroom-centric mobile applications have the ability to suit the needs of teachers. Previously, programming such an item was a daunting task for the classroom teacher. But now, through the use of online visual tools, anyone has the ability to generate a mobile application to suit individual classroom needs. The "MY NASA DATA" (MND) project has begun work on such an application. Using online tools that are directed at the non-programmer, the team has developed two usable mobile applications ("apps") that fit right into the science classroom. The two apps generated include a cloud dichotomous key for cloud identification in the field, and an atmospheric science glossary to help with standardized testing key vocabulary and classroom assignments. Through the use of free online tools, teachers and students now have the ability to customize mobile applications to meet their individual needs. As an extension of the mobile applications, the MND team is planning web-based application programming interfaces (APIs) that will be generated from data that is currently included in the MND Live Access Server. This will allow teachers and students to choose data sets that they want to include in the mobile application without having to populate the API themselves. Through the use of easy-to-understand online mobile app tutorials and MND data sets, teachers will have the ability to generate unit-specific mobile applications to further engage and empower students in the science classroom.
Programming distributed medical applications with XWCH2.
Ben Belgacem, Mohamed; Niinimaki, Marko; Abdennadher, Nabil
2010-01-01
Many medical applications utilise distributed/parallel computing to cope with large data or computing power requirements. In this paper, we present a new version of the XtremWeb-CH (XWCH) platform, and demonstrate two medical applications that run on XWCH. The platform is versatile in that it supports direct communication between tasks. When tasks cannot communicate directly, warehouses are used as intermediary nodes between "producer" and "consumer" tasks. New features have been developed to provide improved support for writing powerful distributed applications using an easy API.
Framework for End-User Programming of Cross-Smart Space Applications
Palviainen, Marko; Kuusijärvi, Jarkko; Ovaska, Eila
2012-01-01
Cross-smart space applications are specific types of software services that enable users to share information, monitor the physical and logical surroundings and control it in a way that is meaningful for the user's situation. For developing cross-smart space applications, this paper makes two main contributions: it introduces (i) a component design and scripting method for end-user programming of cross-smart space applications and (ii) a backend framework of components that interwork to support the brunt of the RDFScript translation, and the use and execution of ontology models. Before end-user programming activities, the software professionals must develop easy-to-apply Driver components for the APIs of existing software systems. Thereafter, end-users are able to create applications from the commands of the Driver components with the help of the provided toolset. The paper also introduces the reference implementation of the framework, tools for the Driver component development and end-user programming of cross-smart space applications and the first evaluation results on their application. PMID:23202169
Use of model organism and disease databases to support matchmaking for human disease gene discovery.
Mungall, Christopher J; Washington, Nicole L; Nguyen-Xuan, Jeremy; Condit, Christopher; Smedley, Damian; Köhler, Sebastian; Groza, Tudor; Shefchek, Kent; Hochheiser, Harry; Robinson, Peter N; Lewis, Suzanna E; Haendel, Melissa A
2015-10-01
The Matchmaker Exchange application programming interface (API) allows searching a patient's genotypic or phenotypic profiles across clinical sites, for the purposes of cohort discovery and variant disease causal validation. This API can be used not only to search for matching patients, but also to match against public disease and model organism data. This public disease data enable matching known diseases and variant-phenotype associations using phenotype semantic similarity algorithms developed by the Monarch Initiative. The model data can provide additional evidence to aid diagnosis, suggest relevant models for disease mechanism and treatment exploration, and identify collaborators across the translational divide. The Monarch Initiative provides an implementation of this API for searching multiple integrated sources of data that contextualize the knowledge about any given patient or patient family into the greater biomedical knowledge landscape. While this corpus of data can aid diagnosis, it is also the beginning of research to improve understanding of rare human diseases. © 2015 WILEY PERIODICALS, INC.
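To make the matchmaking idea concrete, here is a hypothetical sketch of a client posting a patient profile (phenotype terms plus a candidate gene) to a match endpoint and reading back scored results; the endpoint, authentication scheme, and payload fields are assumptions for illustration rather than the normative Matchmaker Exchange specification.

```python
# Hypothetical sketch of a Matchmaker-style match request. The node
# URL, token, and payload/response fields are illustrative assumptions.
import requests

ENDPOINT = "https://matchmaker-node.example.org/match"   # placeholder node
TOKEN = "YOUR_ACCESS_TOKEN"                              # assumed bearer token

patient_profile = {
    "patient": {
        "id": "example-patient-1",
        "features": [{"id": "HP:0001250"}],              # HPO phenotype term
        "genomicFeatures": [{"gene": {"id": "SCN1A"}}],  # candidate gene
    }
}

resp = requests.post(
    ENDPOINT,
    json=patient_profile,
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=60,
)
resp.raise_for_status()

for match in resp.json().get("results", []):
    print(match.get("score"), match.get("patient", {}).get("id"))
```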
An Embedded Systems Laboratory to Support Rapid Prototyping of Robotics and the Internet of Things
ERIC Educational Resources Information Center
Hamblen, J. O.; van Bekkum, G. M. E.
2013-01-01
This paper describes a new approach for a course and laboratory designed to allow students to develop low-cost prototypes of robotic and other embedded devices that feature Internet connectivity, I/O, networking, a real-time operating system (RTOS), and object-oriented C/C++. The application programming interface (API) libraries provided permit…
TraceContract: A Scala DSL for Trace Analysis
NASA Technical Reports Server (NTRS)
Barringer, Howard; Havelund, Klaus
2011-01-01
In this paper we describe TRACECONTRACT, an API for trace analysis, implemented in the SCALA programming language. We argue that for certain forms of trace analysis the best weapon is a high level programming language augmented with constructs for temporal reasoning. A trace is a sequence of events, which may for example be generated by a running program, instrumented appropriately to generate events. The API supports writing properties in a notation that combines an advanced form of data parameterized state machines with temporal logic. The implementation utilizes SCALA's support for defining internal Domain Specific Languages (DSLs). Furthermore SCALA's combination of object oriented and functional programming features, including partial functions and pattern matching, makes it an ideal host language for such an API.
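TRACECONTRACT itself is a Scala DSL; purely to illustrate the underlying idea of a data-parameterized monitor over an event trace, here is a small Python sketch (not the TRACECONTRACT API) that checks the temporal property "every acquired lock is released, and never released before it is acquired".

```python
# Conceptual illustration only (not the TraceContract API): a tiny
# data-parameterized monitor over a trace of (event, lock_id) pairs.
# State is parameterized by the lock identifier carried in each event.

def monitor(trace):
    held = set()      # locks currently acquired
    errors = []
    for step, (event, lock) in enumerate(trace):
        if event == "acquire":
            held.add(lock)
        elif event == "release":
            if lock not in held:
                errors.append(f"step {step}: release of {lock!r} before acquire")
            held.discard(lock)
    errors.extend(f"end of trace: {lock!r} never released" for lock in sorted(held))
    return errors

trace = [("acquire", "A"), ("release", "A"), ("release", "B")]
print(monitor(trace))   # reports the premature release of lock "B"
```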
Creating Mobile and Web Application Programming Interfaces (APIs) for NASA Science Data
NASA Astrophysics Data System (ADS)
Oostra, D.; Chambers, L. H.; Lewis, P. M.; Moore, S. W.
2011-12-01
The Atmospheric Science Data Center (ASDC) at the NASA Langley Research Center in Virginia houses almost three petabytes of data, a collection that increases every day. To put it into perspective, it is estimated that three petabytes of data storage could store a digitized copy of all printed material in U.S. research libraries. There are more than ten other NASA data centers like the ASDC. Scientists and the public use this data for research, science education, and to understand our environment. Most importantly these data provide the potential for all of us make new discoveries. NASA is about making discoveries. Galileo was quoted as saying, "All discoveries are easy to understand once they are discovered. The point is to discover them." To that end, NASA stores vast amounts of publicly available data. This paper examines an approach to create web applications that serve NASA data in ways that specifically address the mobile web application technologies that are quickly emerging. Mobile data is not a new concept. What is new, is that user driven tools have recently become available that allow users to create their own mobile applications. Through the use of these cloud-based tools users can produce complete native mobile applications. Thus, mobile apps can now be created by everyone, regardless of their programming experience or expertise. This work will explore standards and methods for creating dynamic and malleable application programming interfaces (APIs) that allow users to access and use NASA science data for their own needs. The focus will be on experiences that broaden and increase the scope and usage of NASA science data sets.
Developing cloud applications using the e-Science Central platform.
Hiden, Hugo; Woodman, Simon; Watson, Paul; Cala, Jacek
2013-01-28
This paper describes the e-Science Central (e-SC) cloud data processing system and its application to a number of e-Science projects. e-SC provides both software as a service (SaaS) and platform as a service for scientific data management, analysis and collaboration. It is a portable system and can be deployed on both private (e.g. Eucalyptus) and public clouds (Amazon AWS and Microsoft Windows Azure). The SaaS application allows scientists to upload data, edit and run workflows and share results in the cloud, using only a Web browser. It is underpinned by a scalable cloud platform consisting of a set of components designed to support the needs of scientists. The platform is exposed to developers so that they can easily upload their own analysis services into the system and make these available to other users. A representational state transfer-based application programming interface (API) is also provided so that external applications can leverage the platform's functionality, making it easier to build scalable, secure cloud-based applications. This paper describes the design of e-SC, its API and its use in three different case studies: spectral data visualization, medical data capture and analysis, and chemical property prediction.
Developing cloud applications using the e-Science Central platform
Hiden, Hugo; Woodman, Simon; Watson, Paul; Cala, Jacek
2013-01-01
This paper describes the e-Science Central (e-SC) cloud data processing system and its application to a number of e-Science projects. e-SC provides both software as a service (SaaS) and platform as a service for scientific data management, analysis and collaboration. It is a portable system and can be deployed on both private (e.g. Eucalyptus) and public clouds (Amazon AWS and Microsoft Windows Azure). The SaaS application allows scientists to upload data, edit and run workflows and share results in the cloud, using only a Web browser. It is underpinned by a scalable cloud platform consisting of a set of components designed to support the needs of scientists. The platform is exposed to developers so that they can easily upload their own analysis services into the system and make these available to other users. A representational state transfer-based application programming interface (API) is also provided so that external applications can leverage the platform's functionality, making it easier to build scalable, secure cloud-based applications. This paper describes the design of e-SC, its API and its use in three different case studies: spectral data visualization, medical data capture and analysis, and chemical property prediction. PMID:23230161
Overview of implementation of DARPA GPU program in SAIC
NASA Astrophysics Data System (ADS)
Braunreiter, Dennis; Furtek, Jeremy; Chen, Hai-Wen; Healy, Dennis
2008-04-01
This paper reviews the implementation of the DARPA MTO STAP-BOY program for both Phases I and II, conducted at Science Applications International Corporation (SAIC). The STAP-BOY program develops fast covariance factorization and tuning techniques for space-time adaptive processing (STAP) algorithm implementation on graphics processing unit (GPU) architectures for embedded systems. The first part of our presentation on the DARPA STAP-BOY program will focus on GPU implementation and algorithm innovations for a prototype radar STAP algorithm. The STAP algorithm will be implemented on the GPU, using stream programming (from companies such as PeakStream, ATI Technologies' CTM, and NVIDIA) and traditional graphics APIs. This algorithm will include fast range-adaptive STAP weight updates and beamforming applications, each of which has been modified to exploit the parallel nature of graphics architectures.
A parallel solver for huge dense linear systems
NASA Astrophysics Data System (ADS)
Badia, J. M.; Movilla, J. L.; Climente, J. I.; Castillo, M.; Marqués, M.; Mayo, R.; Quintana-Ortí, E. S.; Planelles, J.
2011-11-01
HDSS (Huge Dense Linear System Solver) is a Fortran Application Programming Interface (API) that facilitates the parallel solution of very large dense systems for scientists and engineers. The API makes use of parallelism to yield an efficient solution of the systems on a wide range of parallel platforms, from clusters of processors to massively parallel multiprocessors. It exploits out-of-core strategies to leverage secondary memory in order to solve huge linear systems (on the order of 100 000 equations). The API is based on the parallel linear algebra library PLAPACK, and on its Out-Of-Core (OOC) extension POOCLAPACK. Both PLAPACK and POOCLAPACK use the Message Passing Interface (MPI) as the communication layer and BLAS to perform the local matrix operations. The API provides a friendly interface to the users, hiding almost all the technical aspects related to the parallel execution of the code and the use of the secondary memory to solve the systems. In particular, the API can automatically select the best way to store and solve the systems, depending on the dimension of the system, the number of processes and the main memory of the platform. Experimental results on several parallel platforms report high performance, reaching more than 1 TFLOP with 64 cores to solve a system with more than 200 000 equations and more than 10 000 right-hand side vectors. New version program summary. Program title: Huge Dense System Solver (HDSS) Catalogue identifier: AEHU_v1_1 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHU_v1_1.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 87 062 No. of bytes in distributed program, including test data, etc.: 1 069 110 Distribution format: tar.gz Programming language: Fortran90, C Computer: Parallel architectures: multiprocessors, computer clusters Operating system: Linux/Unix Has the code been vectorized or parallelized?: Yes, includes MPI primitives. RAM: Tested for up to 190 GB Classification: 6.5 External routines: MPI (http://www.mpi-forum.org/), BLAS (http://www.netlib.org/blas/), PLAPACK (http://www.cs.utexas.edu/~plapack/), POOCLAPACK (ftp://ftp.cs.utexas.edu/pub/rvdg/PLAPACK/pooclapack.ps) (code for PLAPACK and POOCLAPACK is included in the distribution). Catalogue identifier of previous version: AEHU_v1_0 Journal reference of previous version: Comput. Phys. Comm. 182 (2011) 533 Does the new version supersede the previous version?: Yes Nature of problem: Huge-scale dense systems of linear equations, Ax=B, beyond standard LAPACK capabilities. Solution method: The linear systems are solved by means of parallelized routines based on the LU factorization, using efficient secondary storage algorithms when the available main memory is insufficient. Reasons for new version: In many applications we need to guarantee high accuracy in the solution of very large linear systems, which we can do by using double-precision arithmetic. Summary of revisions: Version 1.1 can be used to solve linear systems using double-precision arithmetic. New version of the initialization routine. The user can choose the kind of arithmetic and the values of several parameters of the environment.
Running time: About 5 hours to solve a system with more than 200 000 equations and more than 10 000 right-hand side vectors using double-precision arithmetic on an eight-node commodity cluster with a total of 64 Intel cores.
NASA Astrophysics Data System (ADS)
Dave, Gaurav P.; Sureshkumar, N.; Blessy Trencia Lincy, S. S.
2017-11-01
The current trend in processor manufacturing focuses on multi-core architectures rather than on increasing clock speed for performance improvement. Graphics processors have become commodity hardware for providing fast co-processing in computer systems. Developments in IoT, social networking web applications, and big data have created huge demand for data processing, and such throughput-intensive applications inherently contain data-level parallelism that is well suited to SIMD-based GPU architectures. This paper reviews the architectural aspects of multi-core and many-core processors and of graphics processors. Different case studies are taken to compare the performance of throughput computing applications using shared-memory programming in OpenMP and CUDA API-based programming.
Accessing the SEED genome databases via Web services API: tools for programmers.
Disz, Terry; Akhter, Sajia; Cuevas, Daniel; Olson, Robert; Overbeek, Ross; Vonstein, Veronika; Stevens, Rick; Edwards, Robert A
2010-06-14
The SEED integrates many publicly available genome sequences into a single resource. The database contains accurate and up-to-date annotations based on the subsystems concept that leverages clustering between genomes and other clues to accurately and efficiently annotate microbial genomes. The backend is used as the foundation for many genome annotation tools, such as the Rapid Annotation using Subsystems Technology (RAST) server for whole genome annotation, the metagenomics RAST server for random community genome annotations, and the annotation clearinghouse for exchanging annotations from different resources. In addition to a web user interface, the SEED also provides a Web services-based API for programmatic access to the data in the SEED, allowing the development of third-party tools and mash-ups. The currently exposed Web services encompass over forty different methods for accessing data related to microbial genome annotations. The Web services provide comprehensive access to the database back end, allowing any programmer access to the most consistent and accurate genome annotations available. The Web services are deployed using a platform-independent service-oriented approach that allows the user to choose the most suitable programming platform for their application. Example code demonstrates that Web services can be used to access the SEED using common bioinformatics programming languages such as Perl, Python, and Java. We present a novel approach to access the SEED database. Using Web services, a robust API for access to genomics data is provided, without requiring large-volume downloads all at once. The API ensures timely access to the most current datasets available, including the new genomes as soon as they come online.
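The sketch below shows, in Python, the general shape of such a web-service call; the service root, resource path, and genome identifier are placeholders assumed for illustration and do not reproduce the SEED's actual service definitions.

```python
# Schematic sketch of calling a genome-annotation web service in the
# style described for the SEED. The service root, path, and genome id
# are illustrative assumptions, not the SEED's documented interface.
import requests

SERVICE_ROOT = "https://annotation-service.example.org/api"   # placeholder

def get_annotations(genome_id: str) -> dict:
    """Fetch the annotation set for one genome identifier."""
    resp = requests.get(f"{SERVICE_ROOT}/genomes/{genome_id}/annotations",
                        timeout=60)
    resp.raise_for_status()
    return resp.json()

annotations = get_annotations("EXAMPLE_GENOME_ID")   # hypothetical identifier
print(len(annotations))
```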
Dynamic Policy-Driven Quality of Service in Service-Oriented Information Management Systems
2011-01-01
both DiffServ and IntServ network QoS mechanisms. Wang et al [48] provide middleware APIs to shield applications from directly interacting with...complex network QoS mechanism APIs. Middleware frameworks transparently converted the specified application QoS requirements into lower-level network...QoS mechanism APIs and provided network QoS assurances. Deployment-time resource allocation. Other prior work has focused on deploying applications
High-Throughput and Low-Latency Network Communication with NetIO
NASA Astrophysics Data System (ADS)
Schumacher, Jörn; Plessl, Christian; Vandelli, Wainer
2017-10-01
HPC network technologies like Infiniband, TrueScale or OmniPath provide low-latency and high-throughput communication between hosts, which makes them attractive options for data-acquisition systems in large-scale high-energy physics experiments. Like HPC networks, DAQ networks are local and include a well-specified number of systems. Unfortunately, traditional network communication APIs for HPC clusters like MPI or PGAS exclusively target the HPC community and are not well suited for DAQ applications. It is possible to build distributed DAQ applications using low-level system APIs like Infiniband Verbs, but it requires a non-negligible effort and expert knowledge. At the same time, message services like ZeroMQ have gained popularity in the HEP community. They make it possible to build distributed applications with a high-level approach and provide good performance. Unfortunately, their usage usually limits developers to TCP/IP-based networks. While it is possible to operate a TCP/IP stack on top of Infiniband and OmniPath, this approach may not be very efficient compared to a direct use of native APIs. NetIO is a simple, novel asynchronous message service that can operate on Ethernet, Infiniband and similar network fabrics. In this paper the design and implementation of NetIO is presented and described, and its use is evaluated in comparison to other approaches. NetIO supports different high-level programming models and typical workloads of HEP applications. The ATLAS FELIX project [1] successfully uses NetIO as its central communication platform. The architecture of NetIO is described in this paper, including the user-level API and the internal data-flow design. The paper includes a performance evaluation of NetIO including throughput and latency measurements. The performance is compared against the state-of-the-art ZeroMQ message service. Performance measurements are performed in a lab environment with Ethernet and FDR Infiniband networks.
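For readers unfamiliar with the message-service style that NetIO is benchmarked against, below is a minimal ZeroMQ request-reply example using the pyzmq bindings; it only illustrates the high-level socket abstraction mentioned above and says nothing about NetIO's own implementation.

```python
# Minimal ZeroMQ (pyzmq) request-reply example, shown only to illustrate
# the high-level message-service style NetIO is compared against.
import threading
import zmq

def reply_once(ctx):
    sock = ctx.socket(zmq.REP)
    sock.bind("tcp://127.0.0.1:5555")
    msg = sock.recv()            # wait for a single request
    sock.send(b"ack:" + msg)     # reply, then shut down
    sock.close()

ctx = zmq.Context()
server = threading.Thread(target=reply_once, args=(ctx,))
server.start()

client = ctx.socket(zmq.REQ)
client.connect("tcp://127.0.0.1:5555")
client.send(b"event-fragment")
print(client.recv())             # b'ack:event-fragment'

client.close()
server.join()
ctx.term()
```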
USDA-ARS?s Scientific Manuscript database
The objective of this study was to provide initial results in an application of single-step genomic BLUP with an inverse genomic relationship matrix (G^-1_APY) calculated using the Algorithm of Proven and Young (APY) to 305-day protein yield for US Holsteins. Two G^-1_APY matrices were tested; one was from 139,057 geno...
TH-D-BRB-04: Pinnacle Scripting: Improving Efficiency While Maintaining Safety
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moore, J.
2016-06-15
Scripting capabilities and application programming interfaces (APIs) are becoming commonly available in modern treatment planning systems. These links to the treatment planning system (TPS) allow users to read data from the TPS, and in some cases use TPS functionality and write data back to the TPS. Such tools are powerful extensions, allowing automation of routine clinical tasks and supporting research, particularly research involving repetitive tasks on large patient populations. The data and functionality exposed by scripting/API capabilities are vendor dependent, as are the languages used by script/API engines, such as the Microsoft .NET framework or Python. Scripts deployed in a clinical environment must be commissioned and validated like any other software tool. This session will provide an overview of scripting applications and a discussion of best practices, followed by a practical introduction to the scripting capabilities of three commercial treatment planning systems. Learning Objectives: Understand the scripting capabilities available in several treatment planning systems; learn how to get started using scripting capabilities; understand the best practices for safe script deployment in a clinical environment. R. Popple, Varian Medical Systems has provided research support unrelated to the topic of this session. R. Cardan, Varian Medical Systems for grant research, product evaluation, and teaching honorarium.
30 CFR 250.1920 - What are the auditing requirements for my SEMS program?
Code of Federal Regulations, 2011 CFR
2011-07-01
... or your designated and qualified personnel according to the requirements of this subpart and API RP... subpart and API RP 75 to identify areas in which safety and environmental performance needs to be improved. (b) Your audit plan and procedures must meet or exceed all of the recommendations included in API RP...
BioBlend.objects: metacomputing with Galaxy.
Leo, Simone; Pireddu, Luca; Cuccuru, Gianmauro; Lianas, Luca; Soranzo, Nicola; Afgan, Enis; Zanetti, Gianluigi
2014-10-01
BioBlend.objects is a new component of the BioBlend package, adding an object-oriented interface for the Galaxy REST-based application programming interface. It improves support for metacomputing on Galaxy entities by providing higher-level functionality and allowing users to more easily create programs to explore, query and create Galaxy datasets and workflows. BioBlend.objects is available online at https://github.com/afgane/bioblend. The new object-oriented API is implemented by the galaxy/objects subpackage. © The Author 2014. Published by Oxford University Press.
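A minimal usage sketch of the object-oriented interface described above follows, assuming bioblend is installed and that a Galaxy server URL and API key are available; the class and method names reflect the galaxy/objects subpackage but may differ between BioBlend versions, so treat this as illustrative rather than definitive.

```python
# Minimal sketch using BioBlend's object-oriented Galaxy interface.
# Assumes bioblend is installed; GALAXY_URL and API_KEY are placeholders,
# and method names may vary across BioBlend versions.
from bioblend.galaxy.objects import GalaxyInstance

GALAXY_URL = "https://galaxy.example.org"   # placeholder Galaxy server
API_KEY = "YOUR_API_KEY"

gi = GalaxyInstance(GALAXY_URL, api_key=API_KEY)

# Histories and workflows are exposed as first-class objects rather than
# raw JSON dictionaries, which is the point of the objects interface.
for history in gi.histories.list():
    print(history.name, history.id)

for workflow in gi.workflows.list():
    print(workflow.name)
```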
Academic Peer Instruction: Reference and Training Manual (with Answers)
ERIC Educational Resources Information Center
Zaritsky, Joyce; Toce, Andi
2013-01-01
This manual consists of an introduction to our Academic Peer Instruction (API) program at LaGuardia Community College, a compilation of the materials we have developed and use for training of our tutors (with answers), and a bibliography. API is based on an internationally recognized peer tutoring program, Supplemental Instruction. (Contains 6…
A Development of Lightweight Grid Interface
NASA Astrophysics Data System (ADS)
Iwai, G.; Kawai, Y.; Sasaki, T.; Watase, Y.
2011-12-01
To help rapid development of Grid/Cloud-aware applications, we have developed an API to abstract distributed computing infrastructures, based on SAGA (A Simple API for Grid Applications). SAGA, which is standardized in the OGF (Open Grid Forum), defines API specifications to access distributed computing infrastructures such as Grid, Cloud and local computing resources. The Universal Grid API (UGAPI), which is a set of command line interfaces (CLIs) and APIs, aims to offer a simpler API that combines several SAGA interfaces with richer functionality. The UGAPI CLIs offer typical functionality required by end users for job management and file access to the different distributed computing infrastructures as well as local computing resources. We have also built a web interface for particle therapy simulation and demonstrated a large-scale calculation using the different infrastructures at the same time. In this paper, we present how the web interface based on UGAPI and SAGA achieves more efficient utilization of computing resources over the different infrastructures, with technical details and practical experiences.
Developing an Approach to Harvesting, Cleaning, and Analyzing Data from Twitter Using R
ERIC Educational Resources Information Center
Hill, Stephen; Scott, Rebecca
2017-01-01
Using data from social media can be of great value to businesses and other interested parties. However, harvesting data from social media networks such as Twitter, cleaning the data, and analyzing the data can be difficult. In this article, a step-by-step approach to obtaining data via the Twitter application program interface (API) is described.…
Detecting Potentially Compromised Credentials in a Large-Scale Production Single-Signon System
2014-06-01
Excerpted snippets from this thesis mention Attention Deficit Hyperactivity Disorder (ADHD), Post-Traumatic Stress Disorder (PTSD), anxiety, neuroticism, and memory issues, and its list of acronyms and abbreviations includes API (Application Programming Interface), CAC (Common Access Card), CBL (Composite Blocking List), CDF (Cumulative Distribution...), and a Service Logons (DSLs) system.
Ontology-Oriented Programming for Biomedical Informatics.
Lamy, Jean-Baptiste
2016-01-01
Ontologies are now widely used in the biomedical domain. However, it is difficult to manipulate ontologies in a computer program and, consequently, it is not easy to integrate ontologies with databases or websites. Two main approaches have been proposed for accessing ontologies in a computer program: a traditional API (Application Programming Interface) and ontology-oriented programming, either static or dynamic. In this paper, we will review these approaches and discuss their appropriateness for biomedical ontologies. We will also present feedback from our experience integrating an ontology into software during the VIIIP research project. Finally, we will present OwlReady, the solution we developed.
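To make the contrast concrete, here is a small conceptual Python sketch (deliberately not the OwlReady API itself) of the ontology-oriented idea: ontology classes appear as native classes and individuals as objects whose relations are ordinary attributes, instead of being reached through generic query calls.

```python
# Conceptual illustration of ontology-oriented programming (not the
# OwlReady API): ontology classes map to native classes, individuals to
# objects, and relations to attributes, so traversal is plain attribute
# access rather than assembling generic triple/SPARQL queries.

class Drug:                          # stands in for an ontology class
    def __init__(self, name, active_ingredients):
        self.name = name
        self.active_ingredients = list(active_ingredients)

class Disorder:                      # another ontology class
    def __init__(self, name, treated_by=()):
        self.name = name
        self.treated_by = list(treated_by)   # object property as attribute

aspirin = Drug("aspirin", ["acetylsalicylic acid"])
headache = Disorder("headache", treated_by=[aspirin])

for drug in headache.treated_by:
    print(drug.name, drug.active_ingredients)
```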
2005-04-12
Hardware, Database, and Operating System independence using Java • Enterprise-class Architecture using Java 2 Enterprise Edition 1.4 • Standards-based...portal applications. Compliance with the Java Specification Request for Portlet APIs (JSR-168) (Portlet API) and Web Services for Remote Portals...authentication and authorization • Portal Standards using the Portlet API (JSR-168) and Web Services for Remote Portals
Bi, Yaw
2018-01-01
Although drug-based treatment is the primary intervention for malaria control and elimination, optimal use of targeted treatments remains unclear. From 2008 to 2016, three targeted programs on treatment were undertaken in Kachin Special Region II (KR2), Myanmar. Program I (2008–2011) treated all confirmed, clinical and suspected cases; program II (2012–2013) treated confirmed and clinical cases; and program III (2014–2016) targeted confirmed cases only. This study aims to evaluate the impacts of the three programs on the malaria burden individually, based on the annual parasite incidence (API), slide positivity rate (SPR) and their relative values. The API is calculated from the originally collected data, and the incidence rate ratio (IRR) for each year is calculated by using the first-year API as a reference in each program phase across the KR2. The same method is applied to calculate the SPR and risk ratio (RR) at the sentinel hospital. During program I (2008–2011), the malaria burden was reduced by 61% (95%CI: 58%-74%) and the actual API decreased from 9.8 (95%CI: 9.6–10.1) per 100 person-years in 2008 to 3.8 (3.6–4.1) per 100 person-years in 2011. During program II (2012–2013), the malaria burden increased by 33% (95%CI: 22%-46%) and the actual API increased from 2.1 (95%CI: 2.0–2.3) per 100 person-years in 2012 to 2.8 (95%CI: 2.7–2.9) per 100 person-years in 2013. During program III (2014–2016), the malaria burden increased further by 60% (95%CI: 51%-69%) and the actual API increased from 3.2 (95%CI: 3.0–3.3) per 100 person-years in 2014 to 5.1 (95%CI: 4.9–5.2) per 100 person-years in 2016. Slide positivity results from the sentinel hospital confirm these findings. Resurgence of malaria was mainly due to Plasmodium vivax during programs II and III. This study indicates that the strategy adopted in program I (2008–2011) should be more appropriate for the KR2. Quality-assured treatment of all confirmed, clinical and suspected malaria cases may be helpful for the reduction of malaria burden. PMID:29614088
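The indicators quoted above are simple ratios, and the short calculation below reproduces the program I figure as a sanity check; it is a sketch of the arithmetic only, not the authors' analysis code, which also produced the confidence intervals.

```python
# Sketch of the arithmetic behind the quoted indicators (not the
# authors' analysis code). API is cases per 100 person-years; the
# incidence rate ratio compares a year's API with the first year of
# the same program phase.

def annual_parasite_incidence(cases: float, person_years: float) -> float:
    """API expressed per 100 person-years."""
    return 100.0 * cases / person_years

def incidence_rate_ratio(api_year: float, api_reference: float) -> float:
    """IRR of a given year relative to the phase's reference year."""
    return api_year / api_reference

# Program I figures from the abstract (per 100 person-years).
api_2008, api_2011 = 9.8, 3.8
irr = incidence_rate_ratio(api_2011, api_2008)
print(f"IRR = {irr:.2f}, i.e. roughly a {100 * (1 - irr):.0f}% reduction")
```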
NASA Technical Reports Server (NTRS)
Mandl, Daniel; Cappelaere, Patrice; Frye, Stuart; Evans, John; Moe, Karen
2014-01-01
Data products derived from Earth observing satellites are difficult to find and share without specialized software and often a highly paid and specialized staff. For our research effort, we endeavored to prototype a distributed architecture that depends on a standardized communication protocol and applications program interface (API) that makes it easy for anyone to discover and access disaster-related data. Providers can easily supply the public with their disaster-related products by building an adapter for our API. Users can use the API to browse and find products that relate to the disaster at hand, for example floods, without a centralized catalogue, and can then share those data via social media. Furthermore, a longer-term goal for this architecture is to enable other users who see the shared disaster product to generate the same product for other areas of interest via simple point-and-click actions in the API on their mobile device. Furthermore, the user will be able to edit the data with on-the-ground local observations and return the updated information to the original repository of this information if configured for this function. This architecture leverages SensorWeb functionality [1] presented at previous IGARSS conferences. The architecture is divided into two pieces, the frontend, which is the GeoSocial API, and the backend, which is a standardized disaster node that knows how to talk to other disaster nodes, and also can communicate with the GeoSocial API. The GeoSocial API, along with the disaster node's basic functionality, enables crowdsourcing and thus can leverage in situ observations by people external to a group to perform tasks such as improving water reference maps, which are maps of existing water before floods. This can lower the cost of generating precision water maps. Keywords: Data Discovery, Disaster Decision Support, Disaster Management, Interoperability, CEOS WGISS Disaster Architecture
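As a sketch of the kind of discovery call such a distributed API could support, the hypothetical request below asks a disaster node for flood-related products over a bounding box; the node URL, resource path, and parameter names are assumptions for illustration, not the project's actual interface.

```python
# Hypothetical sketch of discovering flood products from a disaster node
# through a GeoSocial-style REST API. The node URL, path, and parameter
# names are illustrative assumptions only.
import requests

NODE = "https://disaster-node.example.org/api"   # placeholder disaster node

def find_products(hazard: str, bbox: tuple) -> list:
    """List products for a hazard type within a lon/lat bounding box."""
    resp = requests.get(
        f"{NODE}/products",
        params={"hazard": hazard, "bbox": ",".join(map(str, bbox))},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()

for product in find_products("flood", (-77.5, 36.5, -75.5, 38.5)):
    print(product)
```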
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grant, Ryan E.; Barrett, Brian W.; Pedretti, Kevin
The Portals reference implementation is based on the Portals 4.X API, published by Sandia National Laboratories as a freely available public document. It is designed to be an implementation of the Portals Networking Application Programming Interface and is used by several other upper layer protocols like SHMEM, GASNet and MPI. It is implemented over existing networks, specifically Ethernet and InfiniBand networks. This implementation provides Portals network functionality and serves as a software emulation of Portals compliant networking hardware. It can be used to develop software using the Portals API prior to the debut of Portals networking hardware, such as Bull's BXI interconnect, as well as a substitute for Portals hardware on development platforms that do not have Portals compliant hardware. The reference implementation provides new capabilities beyond that of a typical network, namely the ability to have messages matched in hardware in a way compatible with upper layer software such as MPI or SHMEM. It also offers methods of offloading network operations via triggered operations, which can be used to create offloaded collective operations. Specific details on the Portals API can be found at http://portals4.org.
Reisinger, Florian; Krishna, Ritesh; Ghali, Fawaz; Ríos, Daniel; Hermjakob, Henning; Vizcaíno, Juan Antonio; Jones, Andrew R
2012-03-01
We present a Java application programming interface (API), jmzIdentML, for the Human Proteome Organisation (HUPO) Proteomics Standards Initiative (PSI) mzIdentML standard for peptide and protein identification data. The API combines the power of Java Architecture for XML Binding (JAXB) and an XPath-based random-access indexer to allow a fast and efficient mapping of extensible markup language (XML) elements to Java objects. The internal references in the mzIdentML files are resolved in an on-demand manner, where the whole file is accessed as a random-access swap file, and only the relevant piece of XML is selected for mapping to its corresponding Java object. The API is highly efficient in its memory usage and can handle files of arbitrary sizes. The API follows the official release of the mzIdentML (version 1.1) specifications and is available in the public domain under a permissive licence at http://www.code.google.com/p/jmzidentml/. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
On Parallelizing Single Dynamic Simulation Using HPC Techniques and APIs of Commercial Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Diao, Ruisheng; Jin, Shuangshuang; Howell, Frederic
Time-domain simulations are heavily used in today’s planning and operation practices to assess power system transient stability and post-transient voltage/frequency profiles following severe contingencies to comply with industry standards. Because of the increased modeling complexity, it is several times slower than real time for state-of-the-art commercial packages to complete a dynamic simulation for a large-scale model. With the growing stochastic behavior introduced by emerging technologies, the power industry has seen a growing need for performing security assessment in real time. This paper presents a parallel implementation framework to speed up a single dynamic simulation by leveraging the existing stability model library in commercial tools through their application programming interfaces (APIs). Several high performance computing (HPC) techniques are explored, such as parallelizing the calculation of generator current injection, identifying fast linear solvers for network solution, and parallelizing data outputs when interacting with APIs in the commercial package, TSAT. The proposed method has been tested on a WECC planning base case with detailed synchronous generator models and exhibits outstanding scalable performance with sufficient accuracy.
LBMD : a layer-based mesh data structure tailored for generic API infrastructures.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ebeida, Mohamed S.; Knupp, Patrick Michael
2010-11-01
A new mesh data structure is introduced for the purpose of mesh processing in Application Programming Interface (API) infrastructures. This data structure utilizes a reduced mesh representation to increase its ability to handle significantly larger meshes compared to a full mesh representation. In spite of the reduced representation, each mesh entity (vertex, edge, face, and region) is represented using a unique handle, with no extra storage cost, which is a crucial requirement in most API libraries. The concept of mesh layers makes the data structure more flexible for mesh generation and mesh modification operations. This flexibility can have a favorable impact in solver-based queries of finite volume and multigrid methods. The capabilities of LBMD make it even more attractive for parallel implementations using Message Passing Interface (MPI) or Graphics Processing Units (GPUs). The data structure is associated with a new classification method to relate mesh entities to their corresponding geometrical entities. The classification technique stores the related information at the node level without introducing any ambiguities. Several examples are presented to illustrate the strength of this new data structure.
CAPRI: A Geometric Foundation for Computational Analysis and Design
NASA Technical Reports Server (NTRS)
Haimes, Robert
2006-01-01
CAPRI is a software building tool-kit that refers to two ideas: (1) A simplified, object-oriented, hierarchical view of a solid part integrating both geometry and topology definitions, and (2) programming access to this part or assembly and any attached data. A complete definition of the geometry and application programming interface can be found in the document CAPRI: Computational Analysis PRogramming Interface appended to this report. In summary, the interface is subdivided into the following functional components: 1. Utility routines -- These routines include the initialization of CAPRI, loading CAD parts and querying the operational status as well as closing the system down. 2. Geometry data-base queries -- This group of functions allows all top level applications to figure out and get detailed information on any geometric component in the Volume definition. 3. Point queries -- These calls allow grid generators, or solvers doing node adaptation, to snap points directly onto geometric entities. 4. Calculated or geometrically derived queries -- These entry points calculate data from the geometry to aid in grid generation. 5. Boundary data routines -- This part of CAPRI allows general data to be attached to Boundaries so that the boundary conditions can be specified and stored within CAPRI's data-base. 6. Tag based routines -- This part of the API allows the specification of properties associated with either the Volume (material properties) or Boundary (surface properties) entities. 7. Geometry based interpolation routines -- This part of the API facilitates multi-disciplinary coupling and allows zooming through Boundary Attachments. 8. Geometric creation and manipulation -- These calls facilitate constructing simple solid entities and perform the Boolean solid operations. Geometry constructed in this manner has the advantage that, if the data is kept consistent with the CAD package, a new design can be incorporated directly and is manufacturable. 9. Master Model access -- This addition to the API allows for the querying of the parameters and dimensions of the model. The feature tree is also exposed so it is easy to see where the parameters are applied. Calls exist to allow for the modification of the parameters and the suppression/unsuppression of nodes in the tree. Part regeneration is performed by a single API call and a new part becomes available within CAPRI (if the regeneration was successful). This is described in a separate document. Components 1-7 are considered the CAPRI base level reader.
Application Program Interface for the Orion Aerodynamics Database
NASA Technical Reports Server (NTRS)
Robinson, Philip E.; Thompson, James
2013-01-01
The Application Programming Interface (API) for the Crew Exploration Vehicle (CEV) Aerodynamic Database has been developed to provide the developers of software an easily implemented, fully self-contained method of accessing the CEV Aerodynamic Database for use in their analysis and simulation tools. The API is programmed in C and provides a series of functions to interact with the database, such as initialization, selecting various options, and calculating the aerodynamic data. No special functions (file read/write, table lookup) are required on the host system other than those included with a standard ANSI C installation. It reads one or more files of aero data tables. Previous releases of aerodynamic databases for space vehicles have only included data tables and a document of the algorithm and equations to combine them for the total aerodynamic forces and moments. This process required each software tool to have a unique implementation of the database code. Errors or omissions in the documentation, or errors in the implementation, led to a lengthy and burdensome process of having to debug each instance of the code. Additionally, input file formats differ for each space vehicle simulation tool, requiring the aero database tables to be reformatted to meet the tool's input file structure requirements. Finally, the capabilities for built-in table lookup routines vary for each simulation tool. Implementation of a new database may require an update to and verification of the table lookup routines. This may be required if the number of dimensions of a data table exceeds the capability of the simulation tool's built-in lookup routines. A single software solution was created to provide an aerodynamics software model that could be integrated into other simulation and analysis tools. The highly complex Orion aerodynamics model can then be quickly included in a wide variety of tools. The API code is written in ANSI C for ease of portability to a wide variety of systems. The input data files are in standard formatted ASCII, also for improved portability. The API contains its own implementation of multidimensional table reading and lookup routines. The same aerodynamics input file can be used without modification on all implementations. The turnaround time from aerodynamics model release to a working implementation is significantly reduced.
EXODUS II: A finite element data model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schoof, L.A.; Yarberry, V.R.
1994-09-01
EXODUS II is a model developed to store and retrieve data for finite element analyses. It is used for preprocessing (problem definition), postprocessing (results visualization), as well as code to code data transfer. An EXODUS II data file is a random access, machine independent, binary file that is written and read via C, C++, or Fortran library routines which comprise the Application Programming Interface (API).
2014-04-25
EA’s Java application programming interface (API), the team built a tool called OWL2EA that can ingest an OWL file and generate the corresponding UML...ObjectItemStructure specification shown in Figure 10. Running this script in the relational database server MySQL creates the physical schema that
NASA Astrophysics Data System (ADS)
Listyorini, Tri; Muzid, Syafiul
2017-06-01
The promotion team of Muria Kudus University (UMK) makes annual promotion visits to several senior high schools in Indonesia. The visits cover a number of schools in Kudus, Jepara, Demak, Rembang and Purwodadi. To simplify the visits, each visit round is limited to 15 (fifteen) schools. However, the team frequently faces obstacles during the visits, particularly in determining the route to take toward the targeted school. Long distances or difficult routes to a targeted school lead to prolonged travel duration and inefficient fuel costs. To solve these problems, an application was developed using a heuristic genetic algorithm based on dynamic population size, or Population Resizing on Fitness Improvement Genetic Algorithm (PRoFIGA). This Android-based application was developed to make the visits easier and to determine a shorter route for the team, so that the visiting period will be effective and efficient. The result of this research is an Android-based application that determines the shortest route by combining the heuristic method with the Google Maps Application Programming Interface (API) to display route options for the team.
Equalizer: a scalable parallel rendering framework.
Eilemann, Stefan; Makhinya, Maxim; Pajarola, Renato
2009-01-01
Continuing improvements in CPU and GPU performance as well as increasing multi-core processor and cluster-based parallelism demand flexible and scalable parallel rendering solutions that can exploit multipipe hardware-accelerated graphics. In fact, to achieve interactive visualization, scalable rendering systems are essential to cope with the rapid growth of data sets. However, parallel rendering systems are non-trivial to develop and often only application-specific implementations have been proposed. The task of developing a scalable parallel rendering framework is even more difficult if it should be generic to support various types of data and visualization applications, and at the same time work efficiently on a cluster with distributed graphics cards. In this paper we introduce a novel system called Equalizer, a toolkit for scalable parallel rendering based on OpenGL which provides an application programming interface (API) to develop scalable graphics applications for a wide range of systems ranging from large distributed visualization clusters and multi-processor multipipe graphics systems to single-processor single-pipe desktop machines. We describe the system architecture and the basic API, discuss its advantages over previous approaches, and present example configurations and usage scenarios as well as scalability results.
HIPAA Compliant Wireless Sensing Smartwatch Application for the Self-Management of Pediatric Asthma
Hosseini, Anahita; Buonocore, Chris M.; Hashemzadeh, Sepideh; Hojaiji, Hannaneh; Kalantarian, Haik; Sideris, Costas; Bui, Alex A.T.; King, Christine E.; Sarrafzadeh, Majid
2018-01-01
Asthma is the most prevalent chronic disease among pediatric patients, as it is the leading cause of student absenteeism and hospitalization for those under the age of 15. To address the significant need to manage this disease in children, the authors present a mobile health (mHealth) system that determines the risk of an asthma attack through physiological and environmental wireless sensors and representational state transfer application program interfaces (RESTful APIs). The data is sent from wireless sensors to a smartwatch application (app) via a Health Insurance Portability and Accountability Act (HIPAA) compliant cryptography framework, which then sends data to a cloud for real-time analytics. The asthma risk is then sent to the smartwatch and provided to the user via simple graphics for easy interpretation by children. After testing the safety and feasibility of the system in an adult with moderate asthma prior to testing in children, it was found that the analytics model is able to determine the overall asthma risk (high, medium, or low risk) with an accuracy of 80.10±14.13%. Furthermore, the features most important for assessing the risk of an asthma attack were multifaceted, highlighting the importance of continuously monitoring different wireless sensors and RESTful APIs. Future testing of this asthma attack risk prediction system in pediatric asthma patients may lead to an effective asthma self-management program. PMID:29354688
Artificial periodic irregularities in the high-latitude ionosphere excited by the HAARP facility
NASA Astrophysics Data System (ADS)
Bakhmetieva, N. V.; Grach, S. M.; Sergeev, E. N.; Shindin, A. V.; Milikh, G. M.; Siefring, C. L.; Bernhardt, P. A.; McCarrick, M.
2016-07-01
We present results of new observations of artificial periodic irregularities (APIs) in the ionosphere using the High Frequency Active Auroral Research Program (HAARP) heating facility, carried out in late May and early June 2014. The objective of this work is to detect API using a high-latitude facility and to analyze possible differences in the temporal and spatial variations of the API echoes at high (HAARP) and middle (Sura) latitudes. Irregularities were created by a powerful X-mode wave and were sounded using short X-mode probing pulses. API echoes were observed in the D, E, and F regions of the ionosphere. Amplitudes and characteristic times of the API echoes were measured. The API growth and decay times observed at HAARP (high latitudes) were similar to those at the Sura heating facility (midlatitudes).
The RNASeq-er API-a gateway to systematically updated analysis of public RNA-seq data.
Petryszak, Robert; Fonseca, Nuno A; Füllgrabe, Anja; Huerta, Laura; Keays, Maria; Tang, Y Amy; Brazma, Alvis
2017-07-15
The exponential growth of publicly available RNA-sequencing (RNA-Seq) data poses an increasing challenge to researchers wishing to discover, analyse and store such data, particularly those based in institutions with limited computational resources. EMBL-EBI is in an ideal position to address these challenges and to allow the scientific community easy access to not just raw, but also processed RNA-Seq data. We present a Web service providing access to the results of a systematically and continually updated standardized alignment as well as gene and exon expression quantification of all public bulk (and, in the near future, also single-cell) RNA-Seq runs in 264 species in the European Nucleotide Archive (ENA), using Representational State Transfer. The RNASeq-er API (Application Programming Interface) enables ontology-powered search for and retrieval of CRAM, bigwig and bedGraph files, gene and exon expression quantification matrices (Fragments Per Kilobase of Exon Per Million Fragments Mapped, Transcripts Per Million, raw counts), as well as sample attributes annotated with ontology terms. To date over 270 000 RNA-Seq runs in nearly 10 000 studies (1 PB of raw FASTQ data) in 264 species in ENA have been processed and made available via the API. The RNASeq-er API can be accessed at http://www.ebi.ac.uk/fg/rnaseq/api. The commands used to analyse the data are available in the supplementary materials and at https://github.com/nunofonseca/irap/wiki/iRAP-single-library. rnaseq@ebi.ac.uk; rpetry@ebi.ac.uk. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
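The service is a plain REST/JSON API, so it can be queried with any HTTP client. A hedged sketch using Python's requests package; the endpoint path and organism are illustrative guesses built on the base URL quoted in the abstract, not verified routes:

```python
# Hedged sketch: query the RNASeq-er REST API for analysed runs of one organism.
# The "/json/70/getRunsByOrganism/..." path is an illustrative guess, not a verified route.
import requests

BASE = "http://www.ebi.ac.uk/fg/rnaseq/api"   # base URL given in the abstract
resp = requests.get(f"{BASE}/json/70/getRunsByOrganism/homo_sapiens", timeout=60)
resp.raise_for_status()
runs = resp.json()
print(len(runs), "runs returned")
print("first record keys:", list(runs[0]) if runs else "none")
```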
NASA Astrophysics Data System (ADS)
Duffy, D.; Maxwell, T. P.; Doutriaux, C.; Williams, D. N.; Chaudhary, A.; Ames, S.
2015-12-01
As the size of remote sensing observations and model output data grows, the volume of the data has become overwhelming, even to many scientific experts. As societies are forced to better understand, mitigate, and adapt to climate changes, the combination of Earth observation data and global climate model projections is crucial not only to scientists but to policy makers, downstream applications, and even the public. Scientific progress on understanding climate is critically dependent on the availability of a reliable infrastructure that promotes data access, management, and provenance. The Earth System Grid Federation (ESGF) has created such an environment for the Intergovernmental Panel on Climate Change (IPCC). ESGF provides a federated global cyber infrastructure for data access and management of model outputs generated for the IPCC Assessment Reports (AR). The current generation of the ESGF federated grid allows consumers of the data to find and download data with limited capabilities for server-side processing. Since the amount of data for future AR is expected to grow dramatically, ESGF is working on integrating server-side analytics throughout the federation. The ESGF Compute Working Team (CWT) has created a Web Processing Service (WPS) Application Programming Interface (API) to enable access to scalable computational resources. The API is the exposure point to high performance computing resources across the federation. Specifically, the API allows users to execute simple operations, such as maximum, minimum, average, and anomalies, on ESGF data without having to download the data. These operations are executed at the ESGF data node site with access to large amounts of parallel computing capabilities. This presentation will highlight the WPS API and its capabilities, provide implementation details, and discuss future developments.
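Since the CWT service follows the OGC Web Processing Service convention, a computation reduces to an HTTP Execute request. A hedged sketch with a hypothetical node URL, operation identifier, and input syntax (the actual identifiers are defined by each ESGF compute node):

```python
# Hedged sketch of an OGC-style WPS Execute request to an ESGF compute node.
# The host, operation identifier, and datainputs string are hypothetical placeholders.
import requests

params = {
    "service": "WPS",
    "version": "1.0.0",
    "request": "Execute",
    "identifier": "CDAT.average",                                   # hypothetical server-side operation
    "datainputs": "[variable=tas;domain=global;years=1980-2010]",   # placeholder input syntax
}
resp = requests.get("https://esgf-node.example.org/wps", params=params, timeout=120)
print(resp.status_code)
print(resp.text[:500])   # WPS responses are XML status/result documents
```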
Open Core Data: Connecting scientific drilling data to scientists and community data resources
NASA Astrophysics Data System (ADS)
Fils, D.; Noren, A. J.; Lehnert, K.; Diver, P.
2016-12-01
Open Core Data (OCD) is an innovative, efficient, and scalable infrastructure for data generated by scientific drilling and coring to improve discoverability, accessibility, citability, and preservation of data from the oceans and continents. OCD is building on existing community data resources that manage, store, publish, and preserve scientific drilling data, filling a critical void that currently prevents linkages between these and other data systems and tools to realize the full potential of data generated through drilling and coring. We are developing this functionality through Linked Open Data (LOD) and semantic patterns that enable data access through the use of community ontologies such as GeoLink (geolink.org, an EarthCube Building Block), a collection of protocols, formats and vocabularies from a set of participating geoscience repositories. Common shared concepts of classes such as cruise, dataset, person and others allow easier resolution of common references through shared resource IDs. These graphs are then made available via SPARQL as well as incorporated into web pages following schema.org approaches. Additionally the W3C PROV vocabulary is under evaluation for use for documentation of provenance. Further, the application of persistent identifiers for samples (IGSNs); datasets, expeditions, and projects (DOIs); and people (ORCIDs), combined with LOD approaches, provides methods to resolve and incorporate metadata and datasets. Application Program Interfaces (APIs) complement these semantic approaches to the OCD data holdings. APIs are exposed following the Swagger guidelines (swagger.io) and will be evolved into the OpenAPI (openapis.org) approach. Currently APIs are in development for the NSF funded Flyover Country mobile geoscience app (fc.umn.edu), the Neotoma Paleoecology Database (neotomadb.org), Magnetics Information Consortium (MagIC; earthref.org/MagIC), and other community tools and data systems, as well as for internal OCD use.
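Because the OCD graphs are exposed via SPARQL, standard RDF tooling can query them directly. A small sketch using the SPARQLWrapper package against a hypothetical endpoint URL and schema.org-style terms; the real endpoint address and vocabulary should be taken from the OCD documentation:

```python
# Hedged sketch: list a few datasets from a SPARQL endpoint using generic RDF/schema.org terms.
# The endpoint URL and predicate IRIs are hypothetical placeholders.
from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("https://opencoredata.example.org/sparql")
sparql.setQuery("""
    SELECT ?dataset ?label WHERE {
        ?dataset a <http://schema.org/Dataset> ;
                 <http://www.w3.org/2000/01/rdf-schema#label> ?label .
    } LIMIT 10
""")
sparql.setReturnFormat(JSON)
results = sparql.query().convert()
for row in results["results"]["bindings"]:
    print(row["dataset"]["value"], "-", row["label"]["value"])
```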
Fault recovery in the reliable multicast protocol
NASA Technical Reports Server (NTRS)
Callahan, John R.; Montgomery, Todd L.; Whetten, Brian
1995-01-01
The Reliable Multicast Protocol (RMP) provides a unique, group-based model for distributed programs that need to handle reconfiguration events at the application layer. This model, called membership views, provides an abstraction in which events such as site failures, network partitions, and normal join-leave events are viewed as group reformations. RMP provides access to this model through an application programming interface (API) that notifies an application when a group is reformed as the result of some event. RMP provides applications with reliable delivery of messages using an underlying IP Multicast (12, 5) media to other group members in a distributed environment even in the case of reformations. A distributed application can use various Quality of Service (QoS) levels provided by RMP to tolerate group reformations. This paper explores the implementation details of the mechanisms in RMP that provide distributed applications with membership view information and fault recovery capabilities.
Specification and Design of a Fault Recovery Model for the Reliable Multicast Protocol
NASA Technical Reports Server (NTRS)
Montgomery, Todd; Callahan, John R.; Whetten, Brian
1996-01-01
The Reliable Multicast Protocol (RMP) provides a unique, group-based model for distributed programs that need to handle reconfiguration events at the application layer. This model, called membership views, provides an abstraction in which events such as site failures, network partitions, and normal join-leave events are viewed as group reformations. RMP provides access to this model through an application programming interface (API) that notifies an application when a group is reformed as the result of some event. RMP provides applications with reliable delivery of messages using an underlying IP Multicast media to other group members in a distributed environment even in the case of reformations. A distributed application can use various Quality of Service (QoS) levels provided by RMP to tolerate group reformations. This paper explores the implementation details of the mechanisms in RMP that provide distributed applications with membership view information and fault recovery capabilities.
Kearse, Matthew; Moir, Richard; Wilson, Amy; Stones-Havas, Steven; Cheung, Matthew; Sturrock, Shane; Buxton, Simon; Cooper, Alex; Markowitz, Sidney; Duran, Chris; Thierer, Tobias; Ashton, Bruce; Meintjes, Peter; Drummond, Alexei
2012-01-01
Summary: The two main functions of bioinformatics are the organization and analysis of biological data using computational resources. Geneious Basic has been designed to be an easy-to-use and flexible desktop software application framework for the organization and analysis of biological data, with a focus on molecular sequences and related data types. It integrates numerous industry-standard discovery analysis tools, with interactive visualizations to generate publication-ready images. One key contribution to researchers in the life sciences is the Geneious public application programming interface (API) that affords the ability to leverage the existing framework of the Geneious Basic software platform for virtually unlimited extension and customization. The result is an increase in the speed and quality of development of computation tools for the life sciences, due to the functionality and graphical user interface available to the developer through the public API. Geneious Basic represents an ideal platform for the bioinformatics community to leverage existing components and to integrate their own specific requirements for the discovery, analysis and visualization of biological data. Availability and implementation: Binaries and public API freely available for download at http://www.geneious.com/basic, implemented in Java and supported on Linux, Apple OSX and MS Windows. The software is also available from the Bio-Linux package repository at http://nebc.nerc.ac.uk/news/geneiousonbl. Contact: peter@biomatters.com PMID:22543367
Kearse, Matthew; Moir, Richard; Wilson, Amy; Stones-Havas, Steven; Cheung, Matthew; Sturrock, Shane; Buxton, Simon; Cooper, Alex; Markowitz, Sidney; Duran, Chris; Thierer, Tobias; Ashton, Bruce; Meintjes, Peter; Drummond, Alexei
2012-06-15
The two main functions of bioinformatics are the organization and analysis of biological data using computational resources. Geneious Basic has been designed to be an easy-to-use and flexible desktop software application framework for the organization and analysis of biological data, with a focus on molecular sequences and related data types. It integrates numerous industry-standard discovery analysis tools, with interactive visualizations to generate publication-ready images. One key contribution to researchers in the life sciences is the Geneious public application programming interface (API) that affords the ability to leverage the existing framework of the Geneious Basic software platform for virtually unlimited extension and customization. The result is an increase in the speed and quality of development of computation tools for the life sciences, due to the functionality and graphical user interface available to the developer through the public API. Geneious Basic represents an ideal platform for the bioinformatics community to leverage existing components and to integrate their own specific requirements for the discovery, analysis and visualization of biological data. Binaries and public API freely available for download at http://www.geneious.com/basic, implemented in Java and supported on Linux, Apple OSX and MS Windows. The software is also available from the Bio-Linux package repository at http://nebc.nerc.ac.uk/news/geneiousonbl.
Compositional mining of multiple object API protocols through state abstraction.
Dai, Ziying; Mao, Xiaoguang; Lei, Yan; Qi, Yuhua; Wang, Rui; Gu, Bin
2013-01-01
API protocols specify correct sequences of method invocations. Despite their usefulness, API protocols are often unavailable in practice because writing them is cumbersome and error prone. Multiple object API protocols are more expressive than single object API protocols. However, the huge number of objects of typical object-oriented programs poses a major challenge to the automatic mining of multiple object API protocols: besides maintaining scalability, it is important to capture various object interactions. Current approaches utilize various heuristics to focus on small sets of methods. In this paper, we present a general, scalable, multiple object API protocols mining approach that can capture all object interactions. Our approach uses abstract field values to label object states during the mining process. We first mine single object typestates as finite state automata whose transitions are annotated with states of interacting objects before and after the execution of the corresponding method and then construct multiple object API protocols by composing these annotated single object typestates. We implement our approach for Java and evaluate it through a series of experiments.
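To make the mining idea concrete, the sketch below shows only the core data structure: a single-object typestate encoded as a finite state automaton whose transitions carry the abstract state of an interacting object before and after each call. The toy iterator protocol and all names are illustrative, not the authors' implementation.

```python
# Illustrative-only data structure: an annotated single-object typestate FSA.
# Transitions: (state, method) -> (next_state, interacting-object state before and after the call)
iterator_typestate = {
    ("created", "hasNext"):        ("ready",   ("collection:stable",   "collection:stable")),
    ("ready",   "next"):           ("created", ("collection:stable",   "collection:stable")),
    ("created", "collection.add"): ("stale",   ("collection:stable",   "collection:modified")),
    ("stale",   "next"):           ("error",   ("collection:modified", "collection:modified")),
}

def run(trace, start="created"):
    """Replay a method trace against the mined typestate; report violations."""
    state = start
    for method in trace:
        key = (state, method)
        if key not in iterator_typestate:
            return f"no transition for {method!r} in state {state!r}"
        state, _annotation = iterator_typestate[key]
    return f"final state: {state}"

print(run(["hasNext", "next", "collection.add", "next"]))   # ends in 'error'
```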
Compositional Mining of Multiple Object API Protocols through State Abstraction
Mao, Xiaoguang; Qi, Yuhua; Wang, Rui; Gu, Bin
2013-01-01
API protocols specify correct sequences of method invocations. Despite their usefulness, API protocols are often unavailable in practice because writing them is cumbersome and error prone. Multiple object API protocols are more expressive than single object API protocols. However, the huge number of objects of typical object-oriented programs poses a major challenge to the automatic mining of multiple object API protocols: besides maintaining scalability, it is important to capture various object interactions. Current approaches utilize various heuristics to focus on small sets of methods. In this paper, we present a general, scalable, multiple object API protocols mining approach that can capture all object interactions. Our approach uses abstract field values to label object states during the mining process. We first mine single object typestates as finite state automata whose transitions are annotated with states of interacting objects before and after the execution of the corresponding method and then construct multiple object API protocols by composing these annotated single object typestates. We implement our approach for Java and evaluate it through a series of experiments. PMID:23844378
Update 0.2 to "pysimm: A python package for simulation of molecular systems"
NASA Astrophysics Data System (ADS)
Demidov, Alexander G.; Fortunato, Michael E.; Colina, Coray M.
2018-01-01
An update to the pysimm Python molecular simulation API is presented. A major part of the update is the implementation of a new interface with CASSANDRA - a modern, versatile Monte Carlo molecular simulation program. Several significant improvements in the LAMMPS communication module that allow better and more versatile simulation setup are reported as well. An example of an application implementing iterative CASSANDRA-LAMMPS interaction is illustrated.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jarocki, John Charles; Zage, David John; Fisher, Andrew N.
LinkShop is a software tool for applying the method of Linkography to the analysis of time-sequence data. LinkShop provides command line, web, and application programming interfaces (APIs) for input and processing of time-sequence data, abstraction models, and ontologies. The software creates graph representations of the abstraction model, ontology, and derived linkograph. Finally, the tool allows the user to perform statistical measurements of the linkograph and refine the ontology through direct manipulation of the linkograph.
Mining Program Source Code for Improving Software Quality
2013-01-01
conduct static verification on the software application under analysis to detect defects around APIs. ...
2014-06-01
from the ODM standard. Leveraging SPARX EA’s Java application programming interface (API), the team built a tool called OWL2EA that can ingest an OWL...server MySQL creates the physical schema that enables a user to store and retrieve data conforming to the vocabulary of the JC3IEDM. 6. GENERATING AN
scikit-image: image processing in Python.
van der Walt, Stéfan; Schönberger, Johannes L; Nunez-Iglesias, Juan; Boulogne, François; Warner, Joshua D; Yager, Neil; Gouillart, Emmanuelle; Yu, Tony
2014-01-01
scikit-image is an image processing library that implements algorithms and utilities for use in research, education and industry applications. It is released under the liberal Modified BSD open source license, provides a well-documented API in the Python programming language, and is developed by an active, international team of collaborators. In this paper we highlight the advantages of open source to achieve the goals of the scikit-image library, and we showcase several real-world image processing applications that use scikit-image. More information can be found on the project homepage, http://scikit-image.org.
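A minimal usage sketch of the documented API (the bundled sample image and the Sobel filter are simply convenient, commonly cited examples):

```python
# Minimal scikit-image sketch: load a bundled sample image and run an edge filter.
from skimage import data, filters

image = data.camera()          # built-in grayscale test image
edges = filters.sobel(image)   # Sobel edge-magnitude map (float array)
print(image.shape, float(edges.min()), float(edges.max()))
```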
Harvey Mudd 2014-2015 Computer Science Conduit Clinic Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aspesi, G; Bai, J; Deese, R
2015-05-12
Conduit, a new open-source library developed at Lawrence Livermore National Laboratory, provides a C++ application programming interface (API) to describe and access scientific data. Conduit’s primary use is for in-memory data exchange in high performance computing (HPC) applications. Our team tested and improved Conduit to make it more appealing to potential adopters in the HPC community. We extended Conduit’s capabilities by prototyping four libraries: one for parallel communication using MPI, one for I/O functionality, one for aggregating performance data, and one for data visualization.
Open SHMEM Reference Implementation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pritchard, Howard; Curtis, Anthony; Welch, Aaron
2016-05-12
OpenSHMEM is an effort to create a specification for a standardized API for parallel programming in the Partitioned Global Address Space. Along with the specification the project is also creating a reference implementation of the API. This implementation attempts to be portable, to allow it to be deployed in multiple environments, and to be a starting point for implementations targeted to particular hardware platforms. It will also serve as a springboard for future development of the API.
Code of Federal Regulations, 2014 CFR
2014-10-01
... accompanied by supporting materials sufficient to calculate required adjustments to each PCI, API, and SBI... that results in an API value that is equal to or less than the applicable PCI value, must be... proposed rates. (d) Each price cap tariff filing that proposes rates that will result in an API value that...
Code of Federal Regulations, 2013 CFR
2013-10-01
... accompanied by supporting materials sufficient to calculate required adjustments to each PCI, API, and SBI... that results in an API value that is equal to or less than the applicable PCI value, must be... proposed rates. (d) Each price cap tariff filing that proposes rates that will result in an API value that...
Code of Federal Regulations, 2012 CFR
2012-10-01
... accompanied by supporting materials sufficient to calculate required adjustments to each PCI, API, and SBI... that results in an API value that is equal to or less than the applicable PCI value, must be... proposed rates. (d) Each price cap tariff filing that proposes rates that will result in an API value that...
Code of Federal Regulations, 2011 CFR
2011-10-01
... accompanied by supporting materials sufficient to calculate required adjustments to each PCI, API, and SBI... that results in an API value that is equal to or less than the applicable PCI value, must be... proposed rates. (d) Each price cap tariff filing that proposes rates that will result in an API value that...
The Virtual Solar Observatory and the Heliophysics Meta-Virtual Observatory
NASA Astrophysics Data System (ADS)
Gurman, J. B.; Hourclé, J. A.; Bogart, R. S.; Tian, K.; Hill, F.; Suàrez-Sola, I.; Zarro, D. M.; Davey, A. R.; Martens, P. C.; Yoshimura, K.; Reardon, K. M.
2006-12-01
The Virtual Solar Observatory (VSO) has survived its infancy and provides metadata search and data identification for measurements from 45 instrument data sets held at 12 online archives, as well as flare and coronal mass ejection (CME) event lists. Like any toddler, the VSO is good at getting into anything and everything, and is now extending its grasp to more data sets, new missions, and new access methods using its application programming interface (API). We discuss and demonstrate recent changes, including developments for STEREO and SDO, and an IDL-callable interface for the VSO API. We urge the heliophysics community to help civilize this obstreperous youngster by providing input on ways to make the VSO even more useful for system science research in its role as part of the growing cluster of Heliophysics Virtual Observatories.
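Besides the IDL-callable interface mentioned above, the same VSO web API is wrapped for Python by the community SunPy package. A hedged sketch of that route (SunPy is a swapped-in client used here for illustration; the query values are arbitrary):

```python
# Hedged sketch: query the VSO through SunPy's Fido client (a community wrapper of the VSO API).
import astropy.units as u
from sunpy.net import Fido, attrs as a

result = Fido.search(
    a.Time("2011/06/07 06:00", "2011/06/07 06:10"),
    a.Instrument("AIA"),
    a.Wavelength(171 * u.angstrom),
)
print(result)                    # table of matching records returned by the federated search
# files = Fido.fetch(result)     # uncomment to download the matched data
```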
NASA Astrophysics Data System (ADS)
Xin, YANG; Si-qi, WU; Qi, ZHANG
2018-05-01
Beijing, London, Paris, and New York are representative world cities, so a comparative study of the green patterns of these four cities is important for identifying gaps and advantages and for mutual learning; the paper aims to provide a basis and new ideas for the development of metropolises in China. In the era of big data, API (Application Programming Interface) systems can provide extensive and accurate basic data for studying urban green patterns across different geographical environments at home and abroad. On this basis, the Average Nearest Neighbor, Kernel Density, and Standard Ellipse tools on the ArcGIS platform can process and summarize the data and support quantitative analysis of green patterns. The paper summarizes the uniqueness of the four cities' green patterns and the reasons for their formation on the basis of numerical comparison.
Eastman, Peter; Friedrichs, Mark S; Chodera, John D; Radmer, Randall J; Bruns, Christopher M; Ku, Joy P; Beauchamp, Kyle A; Lane, Thomas J; Wang, Lee-Ping; Shukla, Diwakar; Tye, Tony; Houston, Mike; Stich, Timo; Klein, Christoph; Shirts, Michael R; Pande, Vijay S
2013-01-08
OpenMM is a software toolkit for performing molecular simulations on a range of high performance computing architectures. It is based on a layered architecture: the lower layers function as a reusable library that can be invoked by any application, while the upper layers form a complete environment for running molecular simulations. The library API hides all hardware-specific dependencies and optimizations from the users and developers of simulation programs: they can be run without modification on any hardware on which the API has been implemented. The current implementations of OpenMM include support for graphics processing units using the OpenCL and CUDA frameworks. In addition, OpenMM was designed to be extensible, so new hardware architectures can be accommodated and new functionality (e.g., energy terms and integrators) can be easily added.
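A short sketch of the layered Python API described above, using the current openmm module naming and a hypothetical input structure (older releases expose the same classes under simtk.openmm):

```python
# Hedged OpenMM sketch: build and run a brief implicit-solvent simulation.
# "input.pdb" is a hypothetical, already-prepared protein structure.
from openmm import LangevinMiddleIntegrator
from openmm.app import PDBFile, ForceField, Simulation, NoCutoff
from openmm.unit import kelvin, picosecond, picoseconds

pdb = PDBFile("input.pdb")
forcefield = ForceField("amber14-all.xml", "implicit/gbn2.xml")
system = forcefield.createSystem(pdb.topology, nonbondedMethod=NoCutoff)
integrator = LangevinMiddleIntegrator(300 * kelvin, 1 / picosecond, 0.002 * picoseconds)

simulation = Simulation(pdb.topology, system, integrator)   # hardware platform chosen automatically
simulation.context.setPositions(pdb.positions)
simulation.minimizeEnergy()
simulation.step(1000)                                       # 2 ps of dynamics
```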
Eastman, Peter; Friedrichs, Mark S.; Chodera, John D.; Radmer, Randall J.; Bruns, Christopher M.; Ku, Joy P.; Beauchamp, Kyle A.; Lane, Thomas J.; Wang, Lee-Ping; Shukla, Diwakar; Tye, Tony; Houston, Mike; Stich, Timo; Klein, Christoph; Shirts, Michael R.; Pande, Vijay S.
2012-01-01
OpenMM is a software toolkit for performing molecular simulations on a range of high performance computing architectures. It is based on a layered architecture: the lower layers function as a reusable library that can be invoked by any application, while the upper layers form a complete environment for running molecular simulations. The library API hides all hardware-specific dependencies and optimizations from the users and developers of simulation programs: they can be run without modification on any hardware on which the API has been implemented. The current implementations of OpenMM include support for graphics processing units using the OpenCL and CUDA frameworks. In addition, OpenMM was designed to be extensible, so new hardware architectures can be accommodated and new functionality (e.g., energy terms and integrators) can be easily added. PMID:23316124
QuTiP 2: A Python framework for the dynamics of open quantum systems
NASA Astrophysics Data System (ADS)
Johansson, J. R.; Nation, P. D.; Nori, Franco
2013-04-01
We present version 2 of QuTiP, the Quantum Toolbox in Python. Compared to the preceding version [J.R. Johansson, P.D. Nation, F. Nori, Comput. Phys. Commun. 183 (2012) 1760.], we have introduced numerous new features, enhanced performance, and made changes in the Application Programming Interface (API) for improved functionality and consistency within the package, as well as increased compatibility with existing conventions used in other scientific software packages for Python. The most significant new features include efficient solvers for arbitrary time-dependent Hamiltonians and collapse operators, support for the Floquet formalism, and new solvers for Bloch-Redfield and Floquet-Markov master equations. Here we introduce these new features, demonstrate their use, and give a summary of the important backward-incompatible API changes introduced in this version. Catalog identifier: AEMB_v2_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEMB_v2_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: GNU General Public License, version 3 No. of lines in distributed program, including test data, etc.: 33625 No. of bytes in distributed program, including test data, etc.: 410064 Distribution format: tar.gz Programming language: Python. Computer: i386, x86-64. Operating system: Linux, Mac OSX. RAM: 2+ Gigabytes Classification: 7. External routines: NumPy, SciPy, Matplotlib, Cython Catalog identifier of previous version: AEMB_v1_0 Journal reference of previous version: Comput. Phys. Comm. 183 (2012) 1760 Does the new version supercede the previous version?: Yes Nature of problem: Dynamics of open quantum systems Solution method: Numerical solutions to Lindblad, Floquet-Markov, and Bloch-Redfield master equations, as well as the Monte Carlo wave function method. Reasons for new version: Compared to the preceding version we have introduced numerous new features, enhanced performance, and made changes in the Application Programming Interface (API) for improved functionality and consistency within the package, as well as increased compatibility with existing conventions used in other scientific software packages for Python. The most significant new features include efficient solvers for arbitrary time-dependent Hamiltonians and collapse operators, support for the Floquet formalism, and new solvers for Bloch-Redfield and Floquet-Markov master equations. Restrictions: Problems must meet the criteria for using the master equation in Lindblad, Floquet-Markov, or Bloch-Redfield form. Running time: A few seconds up to several tens of hours, depending on size of the underlying Hilbert space.
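A brief sketch of the kind of Lindblad master-equation problem the package targets; the operators and rates below are arbitrary choices, not taken from the paper:

```python
# Sketch: damped Rabi oscillations of a qubit solved with QuTiP's Lindblad solver.
import numpy as np
from qutip import basis, sigmax, sigmaz, sigmam, mesolve

H = 2 * np.pi * 0.1 * sigmax()            # drive Hamiltonian
psi0 = basis(2, 0)                        # start in |0>
tlist = np.linspace(0.0, 20.0, 200)
c_ops = [np.sqrt(0.05) * sigmam()]        # collapse operator: qubit decay
result = mesolve(H, psi0, tlist, c_ops, [sigmaz()])
print(result.expect[0][:5])               # first few <sigma_z> expectation values
```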
A new programming metaphor for image processing procedures
NASA Technical Reports Server (NTRS)
Smirnov, O. M.; Piskunov, N. E.
1992-01-01
Most image processing systems, besides an Application Program Interface (API) which lets users write their own image processing programs, also feature a higher level of programmability. Traditionally, this is a command or macro language, which can be used to build large procedures (scripts) out of simple programs or commands. This approach, a legacy of the teletypewriter, has serious drawbacks. A command language is clumsy when (and if!) it attempts to utilize the capabilities of a multitasking or multiprocessor environment, it is barely adequate for real-time data acquisition and processing, it has a fairly steep learning curve, and its user interface is very inefficient, especially when compared to a graphical user interface (GUI) that systems running under X11 or Windows should otherwise be able to provide. All these difficulties stem from one basic problem: a command language is not a natural metaphor for an image processing procedure. A more natural metaphor, an image processing factory, is described in detail. A factory is a set of programs (applications) that execute separate operations on images, connected by pipes that carry data (images and parameters) between them. The programs function concurrently, processing images as they arrive along pipes, and querying the user for whatever other input they need. From the user's point of view, programming (constructing) factories is a lot like playing with LEGO blocks - much more intuitive than writing scripts. Focus is on some of the difficulties of implementing factory support, most notably the design of an appropriate API. It is also shown that factories retain all the functionality of a command language (including loops and conditional branches), while suffering from none of the drawbacks outlined above. Other benefits of factory programming include self-tuning factories and the process of encapsulation, which lets a factory take the shape of a standard application both from the system's and the user's point of view, and thus be used as a component of other factories. A bare-bones prototype of factory programming was implemented under the PcIPS image processing system, and a complete version (on a multitasking platform) is under development.
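The factory metaphor maps naturally onto concurrent processes connected by pipes. A toy Python sketch of the idea only (not the PcIPS implementation): one stage produces image arrays and a second stage smooths them as they arrive.

```python
# Toy "image processing factory": two concurrent stages connected by a pipe.
# Illustrates the metaphor only; it is not the PcIPS implementation.
import numpy as np
from multiprocessing import Pipe, Process

def producer(conn, n_frames=3):
    for _ in range(n_frames):
        conn.send(np.random.rand(64, 64))   # pretend these are acquired images
    conn.send(None)                          # end-of-stream marker
    conn.close()

def smoother(conn):
    while True:
        frame = conn.recv()
        if frame is None:
            break
        blurred = (frame + np.roll(frame, 1, axis=0) + np.roll(frame, 1, axis=1)) / 3.0
        print("processed frame, mean =", round(float(blurred.mean()), 3))

if __name__ == "__main__":
    parent_conn, child_conn = Pipe()
    stages = [Process(target=producer, args=(parent_conn,)),
              Process(target=smoother, args=(child_conn,))]
    for p in stages:
        p.start()
    for p in stages:
        p.join()
```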
SERENITY Aware System Development Process
NASA Astrophysics Data System (ADS)
Serrano, Daniel; Maña, Antonio; Llarena, Rafael; Crespo, Beatriz Gallego-Nicasio; Li, Keqin
Traditionally, security patterns have successfully been used to describe security and dependability. In the SERENITY Project the notion of security and dependability (S&D) pattern has been extended to exact specifications of re-usable security mechanisms for Ambient Intelligence (AmI) systems. These S&D Patterns include information on the security properties satisfied by the solution and on the context conditions to be fulfilled. This chapter presents the development of applications supported by SERENITY. In the context of SERENITY we refer to these applications as Serenity-aware applications. Firstly, this chapter presents the Serenity-aware application design using S&D Artefacts. Secondly, it proposes a Java Application Programming Interface (API) to be used in the application development. And, finally, it introduces the development of an example Serenity-aware application.
30 CFR 250.1920 - What are the auditing requirements for my SEMS program?
Code of Federal Regulations, 2013 CFR
2013-07-01
... subpart and API RP 75, Section 12 (incorporated by reference as specified in § 250.198). The audit process...) Your audit plan and procedures must meet or exceed all of the recommendations included in API RP 75...
30 CFR 250.1920 - What are the auditing requirements for my SEMS program?
Code of Federal Regulations, 2014 CFR
2014-07-01
... subpart and API RP 75, Section 12 (incorporated by reference as specified in § 250.198). The audit process...) Your audit plan and procedures must meet or exceed all of the recommendations included in API RP 75...
ANTP Protocol Suite Software Implementation Architecture in Python
2011-06-03
a popular platform for network programming, an area in which C has traditionally dominated. ... visualisation of the running system. For example, using the Google Maps API, the main logging web page can show all the running nodes in the system. By ... communication between AeroNP and AeroRP and runs on the operating system as a daemon. Furthermore, it creates an API interface to manage the communication between
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steich, D J; Brugger, S T; Kallman, J S
2000-02-01
This final report describes our efforts on the Three-Dimensional Massively Parallel CEM Technologies LDRD project (97-ERD-009). Significant need exists for more advanced time domain computational electromagnetics modeling. Bookkeeping details and modifying inflexible software constitute a vast majority of the effort required to address such needs. The required effort escalates rapidly as problem complexity increases, for example with hybrid meshes requiring hybrid numerics on massively parallel platforms (MPPs). This project attempts to alleviate the above limitations by investigating flexible abstractions for these numerical algorithms on MPPs using object-oriented methods, providing a programming environment insulating physics from bookkeeping. The three major design iterations during the project, known as TIGER-I to TIGER-III, are discussed. Each version of TIGER is briefly discussed along with lessons learned during the development and implementation. An Application Programming Interface (API) of the object-oriented interface for Tiger-III is included in three appendices. The three appendices contain the Utilities, Entity-Attribute, and Mesh libraries developed during the project. The API libraries represent a snapshot of our latest attempt at insulating the physics from the bookkeeping.
30 CFR 250.920 - What are the MMS requirements for assessment of fixed platforms?
Code of Federal Regulations, 2011 CFR
2011-07-01
.... Assessment categories are defined in API RP 2A-WSD, Section 17.3. If MMS objects to the assessment category... more restrictive level (see Sections 17.2.1 through 17.2.5 of API RP 2A-WSD for a description of... assessment process of API RP 2A-WSD. You must submit applications for your mitigation actions (e.g., repair...
libChEBI: an API for accessing the ChEBI database.
Swainston, Neil; Hastings, Janna; Dekker, Adriano; Muthukrishnan, Venkatesh; May, John; Steinbeck, Christoph; Mendes, Pedro
2016-01-01
ChEBI is a database and ontology of chemical entities of biological interest. It is widely used as a source of identifiers to facilitate unambiguous reference to chemical entities within biological models, databases, ontologies and literature. ChEBI contains a wealth of chemical data, covering over 46,500 distinct chemical entities, and related data such as chemical formula, charge, molecular mass, structure, synonyms and links to external databases. Furthermore, ChEBI is an ontology, and thus provides meaningful links between chemical entities. Unlike many other resources, ChEBI is fully human-curated, providing a reliable, non-redundant collection of chemical entities and related data. While ChEBI is supported by a web service for programmatic access and a number of download files, it does not have an API library to facilitate the use of ChEBI and its data in cheminformatics software. To provide this missing functionality, libChEBI, a comprehensive API library for accessing ChEBI data, is introduced. libChEBI is available in Java, Python and MATLAB versions from http://github.com/libChEBI, and provides full programmatic access to all data held within the ChEBI database through a simple and documented API. libChEBI is reliant upon the (automated) download and regular update of flat files that are held locally. As such, libChEBI can be embedded in both on- and off-line software applications. libChEBI allows better support of ChEBI and its data in the development of new cheminformatics software. Covering three key programming languages, it allows for the entirety of the ChEBI database to be accessed easily and quickly through a simple API. All code is open access and freely available.
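A short usage sketch of the Python flavour, assuming the libchebipy package name and ChebiEntity accessors match the project's published examples; treat the exact method names as assumptions to check against the repository:

```python
# Hedged sketch of libChEBI's Python flavour; package and method names assumed from the project docs.
from libchebipy import ChebiEntity

water = ChebiEntity("CHEBI:15377")     # ChEBI identifier for water
print(water.get_name())                # expected: "water"
print(water.get_formula())             # expected: "H2O"
```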
Lab Streaming Layer Enabled Myo Data Collection Software User Manual
2017-06-07
time-series data over a local network. LSL handles the networking, time-synchronization, (near-) real-time access as well as, optionally, the ... series data collection (e.g., brain activity, heart activity, muscle activity) using the LSL application programming interface (API). Time-synchronized ... saved to a single extensible data format (XDF) file. Once the time-series data are collected in a Lab Recorder XDF file, users will be able to query
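The LSL API referred to above is also exposed to Python through the community pylsl binding, used here as an illustrative stand-in. A minimal outlet that publishes a synthetic time series onto the local network might look like this:

```python
# Hedged sketch using pylsl (a Python binding of the LSL API) to publish a small time series.
import math
import time
from pylsl import StreamInfo, StreamOutlet

info = StreamInfo(name="DemoEMG", type="EMG", channel_count=1,
                  nominal_srate=50, channel_format="float32", source_id="demo-uid-001")
outlet = StreamOutlet(info)

for i in range(200):                        # stream roughly 4 seconds of a synthetic signal
    outlet.push_sample([math.sin(i / 5.0)])
    time.sleep(1.0 / 50)
```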
Information Security Considerations for Applications Using Apache Accumulo
2014-09-01
... MySQL [13]. BigTable can process 20 petabytes per day [14]. High degree of scalability on commodity hardware. NoSQL databases do not rely on highly ... manipulation in relational databases. NoSQL databases each have a unique programming interface that uses a lower-level procedural language (e.g., Java ...
Development of the geometry database for the CBM experiment
NASA Astrophysics Data System (ADS)
Akishina, E. P.; Alexandrov, E. I.; Alexandrov, I. N.; Filozova, I. A.; Friese, V.; Ivanov, V. V.
2018-01-01
The paper describes the current state of the Geometry Database (Geometry DB) for the CBM experiment. The main purpose of this database is to provide convenient tools for: (1) managing the geometry modules; (2) assembling various versions of the CBM setup as a combination of geometry modules and additional files. The CBM users of the Geometry DB may use both GUI (Graphical User Interface) and API (Application Programming Interface) tools for working with it.
jqcML: an open-source java API for mass spectrometry quality control data in the qcML format.
Bittremieux, Wout; Kelchtermans, Pieter; Valkenborg, Dirk; Martens, Lennart; Laukens, Kris
2014-07-03
The awareness that systematic quality control is an essential factor to enable the growth of proteomics into a mature analytical discipline has increased over the past few years. To this aim, a controlled vocabulary and document structure have recently been proposed by Walzer et al. to store and disseminate quality-control metrics for mass-spectrometry-based proteomics experiments, called qcML. To facilitate the adoption of this standardized quality control routine, we introduce jqcML, a Java application programming interface (API) for the qcML data format. First, jqcML provides a complete object model to represent qcML data. Second, jqcML provides the ability to read, write, and work in a uniform manner with qcML data from different sources, including the XML-based qcML file format and the relational database qcDB. Interaction with the XML-based file format is obtained through the Java Architecture for XML Binding (JAXB), while generic database functionality is obtained by the Java Persistence API (JPA). jqcML is released as open-source software under the permissive Apache 2.0 license and can be downloaded from https://bitbucket.org/proteinspector/jqcml .
OSCAR API for Real-Time Low-Power Multicores and Its Performance on Multicores and SMP Servers
NASA Astrophysics Data System (ADS)
Kimura, Keiji; Mase, Masayoshi; Mikami, Hiroki; Miyamoto, Takamichi; Shirako, Jun; Kasahara, Hironori
OSCAR (Optimally Scheduled Advanced Multiprocessor) API has been designed for real-time embedded low-power multicores to generate parallel programs for various multicores from different vendors by using the OSCAR parallelizing compiler. The OSCAR API has been developed by Waseda University in collaboration with Fujitsu Laboratory, Hitachi, NEC, Panasonic, Renesas Technology, and Toshiba in an METI/NEDO project entitled "Multicore Technology for Realtime Consumer Electronics." By using the OSCAR API as an interface between the OSCAR compiler and backend compilers, the OSCAR compiler enables hierarchical multigrain parallel processing with memory optimization under capacity restriction for cache memory, local memory, distributed shared memory, and on-chip/off-chip shared memory; data transfer using a DMA controller; and power reduction control using DVFS (Dynamic Voltage and Frequency Scaling), clock gating, and power gating for various embedded multicores. In addition, a parallelized program automatically generated by the OSCAR compiler with OSCAR API can be compiled by the ordinary OpenMP compilers since the OSCAR API is designed on a subset of the OpenMP. This paper describes the OSCAR API and its compatibility with the OSCAR compiler by showing code examples. Performance evaluations of the OSCAR compiler and the OSCAR API are carried out using an IBM Power5+ workstation, an IBM Power6 high-end SMP server, and a newly developed consumer electronics multicore chip RP2 by Renesas, Hitachi and Waseda. From the results of scalability evaluation, it is found that on an average, the OSCAR compiler with the OSCAR API can exploit 5.8 times speedup over the sequential execution on the Power5+ workstation with eight cores and 2.9 times speedup on RP2 with four cores, respectively. In addition, the OSCAR compiler can accelerate an IBM XL Fortran compiler up to 3.3 times on the Power6 SMP server. Due to low-power optimization on RP2, the OSCAR compiler with the OSCAR API achieves a maximum power reduction of 84% in the real-time execution mode.
Using USNO's API to Obtain Data
NASA Astrophysics Data System (ADS)
Lesniak, Michael V.; Pozniak, Daniel; Punnoose, Tarun
2015-01-01
The U.S. Naval Observatory (USNO) is in the process of modernizing its publicly available web services into APIs (Application Programming Interfaces). Services configured as APIs offer greater flexibility to the user and allow greater usage. Depending on the particular service, users who implement our APIs will receive either a PNG (Portable Network Graphics) image or data in JSON (JavaScript Object Notation) format. This raw data can then be embedded in third-party web sites or in apps. Part of the USNO's mission is to provide astronomical and timing data to government agencies and the general public. To this end, the USNO provides accurate computations of astronomical phenomena such as dates of lunar phases, rise and set times of the Moon and Sun, and lunar and solar eclipse times. Users who navigate to our web site and select one of our 18 services are prompted to complete a web form, specifying parameters such as date, time, location, and object. Many of our services work for years between 1700 and 2100, meaning that past, present, and future events can be computed. Upon form submission, our web server processes the request, computes the data, and outputs it to the user. Over recent years, the use of the web by the general public has vastly changed. In response to this, the USNO is modernizing its web-based data services. This includes making our computed data easier to embed within third-party web sites as well as more easily querying from apps running on tablets and smart phones. To facilitate this, the USNO has begun converting its services into APIs. In addition to the existing web forms for the various services, users are able to make direct URL requests that return either an image or numerical data. To date, four of our web services have been configured to run with APIs. Two are image-producing services: "Apparent Disk of a Solar System Object" and "Day and Night Across the Earth." Two API data services are "Complete Sun and Moon Data for One Day" and "Dates of Primary Phases of the Moon." Instructions for how to use our API services as well as examples of their use can be found on one of our explanatory web pages and will be discussed here.
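As a concrete illustration of the request style described above, a data API call reduces to a single HTTP GET that returns JSON. The host path and parameter names below are hypothetical placeholders patterned on the service descriptions, not guaranteed endpoints:

```python
# Hedged sketch of a JSON request to a USNO-style data API.
# The URL path and parameter names are hypothetical placeholders.
import requests

params = {"date": "2015-01-04", "coords": "38.89,-77.03", "tz": -5}
resp = requests.get("https://aa.usno.navy.mil/api/rstt/oneday", params=params, timeout=30)
resp.raise_for_status()
data = resp.json()
print(data.get("properties", data))   # sun/moon rise, set, and phase information, if present
```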
Solar Eclipse Computer API: Planning Ahead for August 2017
NASA Astrophysics Data System (ADS)
Bartlett, Jennifer L.; Chizek Frouard, Malynda; Lesniak, Michael V.; Bell, Steve
2016-01-01
With the total solar eclipse of 2017 August 21 over the continental United States approaching, the U.S. Naval Observatory (USNO) on-line Solar Eclipse Computer can now be accessed via an application programming interface (API). This flexible interface returns local circumstances for any solar eclipse in JavaScript Object Notation (JSON) that can be incorporated into third-party Web sites or applications. For a given year, it can also return a list of solar eclipses that can be used to build a more specific request for local circumstances. Over the course of a particular eclipse as viewed from a specific site, several events may be visible: the beginning and ending of the eclipse (first and fourth contacts), the beginning and ending of totality (second and third contacts), the moment of maximum eclipse, sunrise, or sunset. For each of these events, the USNO Solar Eclipse Computer reports the time, Sun's altitude and azimuth, and the event's position and vertex angles. The computer also reports the duration of the total phase, the duration of the eclipse, the magnitude of the eclipse, and the percent of the Sun obscured for a particular eclipse site. On-line documentation for using the API-enabled Solar Eclipse Computer, including sample calls, is available (http://aa.usno.navy.mil/data/docs/api.php). The same Web page also describes how to reach the Complete Sun and Moon Data for One Day, Phases of the Moon, Day and Night Across the Earth, and Apparent Disk of a Solar System Object services using API calls. For those who prefer using a traditional data input form, local circumstances can still be requested that way at http://aa.usno.navy.mil/data/docs/SolarEclipses.php. In addition, the 2017 August 21 Solar Eclipse Resource page (http://aa.usno.navy.mil/data/docs/Eclipse2017.php) consolidates all of the USNO resources for this event, including a Google Map view of the eclipse track designed by Her Majesty's Nautical Almanac Office (HMNAO). Looking further ahead, a 2024 April 8 Solar Eclipse Resource page (http://aa.usno.navy.mil/data/docs/Eclipse2024.php) is also available.
When Will It Be... USNO Seasons and Apsides Calculator
NASA Astrophysics Data System (ADS)
Chizek Frouard, Malynda; Bartlett, Jennifer Lynn
2018-01-01
The turning points of the Earth’s seasons (solstices and equinoxes) and apsides (perihelions and aphelions) are often used in observational astronomy and are also of interest to the public. To avoid tedious calculations, the U.S. Naval Observatory (USNO) has developed an on-line interactive calculator, Earth’s Seasons and Apsides, to provide information about events between 1600 and 2200. The new data service uses an Application Programming Interface (API), which returns values in JavaScript Object Notation (JSON) that can be incorporated into third-party websites or applications. For a requested year, the Earth’s Seasons and Apsides API provides the Gregorian calendar date and time of the Vernal Equinox, Summer Solstice, Autumnal Equinox, Winter Solstice, Aphelion, and Perihelion. The user may specify the time zone for their results, including the optional addition of U.S. daylight saving time for years after 1966. On-line documentation for using the API-enabled Earth’s Seasons and Apsides is available, including sample calls (http://aa.usno.navy.mil/data/docs/api.php). A traditional forms-based interface is available as well (http://aa.usno.navy.mil/data/docs/EarthSeasons.php). This data service replaces the popular Earth's Seasons: Equinoxes, Solstices, Perihelion, and Aphelion page that provided a static list of events for 2000–2025. The USNO also provides API-enabled data services for Complete Sun and Moon Data for One Day (http://aa.usno.navy.mil/data/docs/RS_OneDay.php), Dates of the Primary Phases of the Moon (http://aa.usno.navy.mil/data/docs/MoonPhase.php), Selected Christian Observances (http://aa.usno.navy.mil/data/docs/easter.php), Selected Islamic Observances (http://aa.usno.navy.mil/data/docs/islamic.php), Selected Jewish Observances (http://aa.usno.navy.mil/data/docs/passover.php), Julian Date Conversion (http://aa.usno.navy.mil/data/docs/JulianDate.php), and Sidereal Time (http://aa.usno.navy.mil/data/docs/siderealtime.php) as well as its Solar Eclipse Computer (http://aa.usno.navy.mil/data/docs/SolarEclipses.php).
DDT, global strategies, and a malaria control crisis in South America.
Roberts, D R; Laughlin, L L; Hsheih, P; Legters, L J
1997-01-01
Malaria is reemerging in endemic-disease countries of South America. We examined the rate of real growth in annual parasite indexes (API) by adjusting APIs for all years to the annual blood examination rate of 1965 for each country. The standardized APIs calculated for Brazil, Peru, Guyana, and for 18 other malaria-endemic countries of the Americas presented a consistent pattern of low rates up through the late 1970s, followed by geometric growth in malaria incidence in subsequent years. True growth in malaria incidence corresponds temporally with changes in global strategies for malaria control. Underlying the concordance of these events is a causal link between decreased spraying of homes with DDT and increased malaria; two regression models defining this link showed statistically significant negative relationships between APIs and house-spray rates. Separate analyses of data from 1993 to 1995 showed that countries that have recently discontinued their spray programs are reporting large increases in malaria incidence. Ecuador, which has increased use of DDT since 1993, is the only country reporting a large reduction (61%) in malaria rates since 1993. DDT use for malaria control and application of the Global Malaria Control Strategy to the Americas should be subjects of urgent national and international debate. We discuss the recent actions to ban DDT, the health costs of such a ban, perspectives on DDT use in agriculture versus malaria control, and costs versus benefits of DDT and alternative insecticides.
DDT, global strategies, and a malaria control crisis in South America.
Roberts, D. R.; Laughlin, L. L.; Hsheih, P.; Legters, L. J.
1997-01-01
Malaria is reemerging in endemic-disease countries of South America. We examined the rate of real growth in annual parasite indexes (API) by adjusting APIs for all years to the annual blood examination rate of 1965 for each country. The standardized APIs calculated for Brazil, Peru, Guyana, and for 18 other malaria-endemic countries of the Americas presented a consistent pattern of low rates up through the late 1970s, followed by geometric growth in malaria incidence in subsequent years. True growth in malaria incidence corresponds temporally with changes in global strategies for malaria control. Underlying the concordance of these events is a causal link between decreased spraying of homes with DDT and increased malaria; two regression models defining this link showed statistically significant negative relationships between APIs and house-spray rates. Separate analyses of data from 1993 to 1995 showed that countries that have recently discontinued their spray programs are reporting large increases in malaria incidence. Ecuador, which has increased use of DDT since 1993, is the only country reporting a large reduction (61%) in malaria rates since 1993. DDT use for malaria control and application of the Global Malaria Control Strategy to the Americas should be subjects of urgent national and international debate. We discuss the recent actions to ban DDT, the health costs of such a ban, perspectives on DDT use in agriculture versus malaria control, and costs versus benefits of DDT and alternative insecticides. PMID:9284373
Many-core graph analytics using accelerated sparse linear algebra routines
NASA Astrophysics Data System (ADS)
Kozacik, Stephen; Paolini, Aaron L.; Fox, Paul; Kelmelis, Eric
2016-05-01
Graph analytics is a key component in identifying emerging trends and threats in many real-world applications. Large-scale graph analytics frameworks provide a convenient and highly scalable platform for developing algorithms to analyze large datasets. Although conceptually scalable, these techniques exhibit poor performance on modern computational hardware. Another model of graph computation has emerged that promises improved performance and scalability by using abstract linear algebra operations as the basis for graph analysis, as laid out by the GraphBLAS standard. By using sparse linear algebra as the basis, existing highly efficient algorithms can be adapted to perform computations on the graph. This approach, however, is often less intuitive to graph analytics experts, who are accustomed to vertex-centric APIs such as Giraph, GraphX, and Tinkerpop. We are developing an implementation of the high-level operations supported by these APIs in terms of linear algebra operations. This implementation is backed by many-core implementations of the required fundamental GraphBLAS operations and offers the advantages of both the intuitive programming model of a vertex-centric API and the performance of a sparse linear algebra implementation. This technology can reduce the number of nodes required, as well as the run-time for a graph analysis problem, enabling customers to perform more complex analysis with less hardware at lower cost. All of this can be accomplished without the requirement for the customer to make any changes to their analytics code, thanks to the compatibility with existing graph APIs.
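The core idea, expressing a vertex-centric traversal as sparse matrix-vector products, can be sketched in a few lines with SciPy's sparse routines; this is a simplified stand-in for GraphBLAS semiring operations, not the authors' many-core implementation.

```python
# Sketch: breadth-first search levels computed as repeated sparse matrix-vector products.
# A simplified stand-in for GraphBLAS-style semiring operations.
import numpy as np
from scipy.sparse import csr_matrix

# Adjacency matrix of a small directed graph: 0->1, 0->2, 1->3, 2->3, 3->4
edges = [(0, 1), (0, 2), (1, 3), (2, 3), (3, 4)]
rows, cols = zip(*edges)
A = csr_matrix((np.ones(len(edges)), (rows, cols)), shape=(5, 5))

frontier = np.zeros(5); frontier[0] = 1          # start BFS at vertex 0
level = np.full(5, -1); level[0] = 0
for depth in range(1, 5):
    frontier = (A.T @ frontier) * (level < 0)    # push one step, drop already-visited vertices
    if not frontier.any():
        break
    level[frontier > 0] = depth
print(level)    # expected: [0 1 1 2 3]
```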
Pisklak, Dariusz Maciej; Zielińska-Pisklak, Monika; Szeleszczuk, Łukasz
2016-11-20
Solid-state nuclear magnetic resonance (ssNMR) is a powerful and unique method for analyzing solid forms of active pharmaceutical ingredients (APIs) directly in their original formulations. Unfortunately, despite their wide range of applications, ssNMR experiments often suffer from low sensitivity and from overlap between API and excipient peaks. To overcome these limitations, the cross-polarization inversion recovery method was successfully used. The differences in the spin-lattice relaxation time constants for hydrogen atoms T1(H) between API and excipients were employed in order to separate and discriminate their peaks in ssNMR spectra as well as to increase the intensity of API signals in low-dose formulations. The versatility of this method was demonstrated on different examples, including an excipient mixture and commercial solid dosage forms (e.g., granules and tablets). Copyright © 2016 Elsevier B.V. All rights reserved.
Yoo, Terry S; Ackerman, Michael J; Lorensen, William E; Schroeder, Will; Chalana, Vikram; Aylward, Stephen; Metaxas, Dimitris; Whitaker, Ross
2002-01-01
We present the detailed planning and execution of the Insight Toolkit (ITK), an application programmers interface (API) for the segmentation and registration of medical image data. This public resource has been developed through the NLM Visible Human Project, and is in beta test as an open-source software offering under cost-free licensing. The toolkit concentrates on 3D medical data segmentation and registration algorithms, multimodal and multiresolution capabilities, and portable platform independent support for Windows, Linux/Unix systems. This toolkit was built using current practices in software engineering. Specifically, we embraced the concept of generic programming during the development of these tools, working extensively with C++ templates and the freedom and flexibility they allow. Software development tools for distributed consortium-based code development have been created and are also publicly available. We discuss our assumptions, design decisions, and some lessons learned.
The NIH BD2K center for big data in translational genomics
Paten, Benedict; Diekhans, Mark; Druker, Brian J; Friend, Stephen; Guinney, Justin; Gassner, Nadine; Guttman, Mitchell; James Kent, W; Mantey, Patrick; Margolin, Adam A; Massie, Matt; Novak, Adam M; Nothaft, Frank; Pachter, Lior; Patterson, David; Smuga-Otto, Maciej; Stuart, Joshua M; Van’t Veer, Laura; Haussler, David
2015-01-01
The world’s genomics data will never be stored in a single repository – rather, it will be distributed among many sites in many countries. No one site will have enough data to explain genotype to phenotype relationships in rare diseases; therefore, sites must share data. To accomplish this, the genetics community must forge common standards and protocols to make sharing and computing data among many sites a seamless activity. Through the Global Alliance for Genomics and Health, we are pioneering the development of shared application programming interfaces (APIs) to connect the world’s genome repositories. In parallel, we are developing an open source software stack (ADAM) that uses these APIs. This combination will create a cohesive genome informatics ecosystem. Using containers, we are facilitating the deployment of this software in a diverse array of environments. Through benchmarking efforts and big data driver projects, we are ensuring ADAM’s performance and utility. PMID:26174866
Registered File Support for Critical Operations Files at (Space Infrared Telescope Facility) SIRTF
NASA Technical Reports Server (NTRS)
Turek, G.; Handley, Tom; Jacobson, J.; Rector, J.
2001-01-01
The SIRTF Science Center's (SSC) Science Operations System (SOS) has to contend with nearly one hundred critical operations files, which it manages via comprehensive file management services. The management is accomplished via the registered file system (otherwise known as TFS), which manages these files in a registered file repository composed of a virtual file system accessible via a TFS server and a file registration database. The TFS server provides controlled, reliable, and secure file transfer and storage by registering all file transactions and metadata in the file registration database. An API is provided for application programs to communicate with TFS servers and the repository. A command-line client implementing this API has been developed as a client tool. This paper describes the architecture and current implementation and, more importantly, the evolution of these services based on evolving community use cases and emerging information system technology.
scikit-image: image processing in Python
Schönberger, Johannes L.; Nunez-Iglesias, Juan; Boulogne, François; Warner, Joshua D.; Yager, Neil; Gouillart, Emmanuelle; Yu, Tony
2014-01-01
scikit-image is an image processing library that implements algorithms and utilities for use in research, education and industry applications. It is released under the liberal Modified BSD open source license, provides a well-documented API in the Python programming language, and is developed by an active, international team of collaborators. In this paper we highlight the advantages of open source to achieve the goals of the scikit-image library, and we showcase several real-world image processing applications that use scikit-image. More information can be found on the project homepage, http://scikit-image.org. PMID:25024921
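A minimal usage sketch of the scikit-image API follows, assuming the standard data, filters and measure modules; the example image and the object-counting task are arbitrary choices for illustration.

    # Minimal scikit-image sketch: threshold an example image and count
    # connected objects using documented skimage functions.
    from skimage import data, filters, measure

    image = data.coins()                       # built-in example grayscale image
    thresh = filters.threshold_otsu(image)     # global Otsu threshold
    binary = image > thresh
    labels = measure.label(binary)             # label connected components
    print("objects found:", labels.max())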
A program to generate a Fortran interface for a C++ library
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, Lee
Shroud is a utility to create a Fortran and C interface for a C++ library. An existing C++ library API is described in an input file. Shroud reads the file and creates source files which can be compiled to provide a Fortran API for the library.
2015-03-01
domains. Major model functions include: • Ground combat: light and heavy forces. • Air mobile forces. • Future forces. • Fixed-wing and rotary-wing...Constraints: • Study must be completed no later than 31 December 2014. • Entity behavior limited to select COMBATXXI Mobility, Unmanned Aerial System...and SQL backend, as well as any open application programming interface (API). • Allows data transparency and data-driven navigation through the model
The Proteins API: accessing key integrated protein and genome information
Antunes, Ricardo; Alpi, Emanuele; Gonzales, Leonardo; Liu, Wudong; Luo, Jie; Qi, Guoying; Turner, Edd
2017-01-01
The Proteins API provides searching and programmatic access to protein and associated genomics data such as curated protein sequence positional annotations from UniProtKB, as well as mapped variation and proteomics data from large-scale data sources (LSS). Using the coordinates service, researchers are able to retrieve the genomic sequence coordinates for proteins in UniProtKB. Thus, the LSS genomics and proteomics data for UniProt proteins are available programmatically only through this service. A Swagger UI has been implemented to provide documentation and an interface that lets users with little or no programming experience ‘talk’ to the services, quickly and easily formulate queries, and obtain dynamically generated source code for popular programming languages such as Java, Perl, Python and Ruby. Search results are returned as standard JSON, XML or GFF data objects. The Proteins API is a scalable, reliable, fast and easy-to-use set of RESTful services that provides a broad protein information resource, allowing users to ask questions based upon their field of expertise and to gain an integrated overview of the protein annotations available to aid their understanding of proteins in biological processes. The Proteins API is available at http://www.ebi.ac.uk/proteins/api/doc. PMID:28383659
The Proteins API: accessing key integrated protein and genome information.
Nightingale, Andrew; Antunes, Ricardo; Alpi, Emanuele; Bursteinas, Borisas; Gonzales, Leonardo; Liu, Wudong; Luo, Jie; Qi, Guoying; Turner, Edd; Martin, Maria
2017-07-03
The Proteins API provides searching and programmatic access to protein and associated genomics data such as curated protein sequence positional annotations from UniProtKB, as well as mapped variation and proteomics data from large-scale data sources (LSS). Using the coordinates service, researchers are able to retrieve the genomic sequence coordinates for proteins in UniProtKB. Thus, the LSS genomics and proteomics data for UniProt proteins are available programmatically only through this service. A Swagger UI has been implemented to provide documentation and an interface that lets users with little or no programming experience 'talk' to the services, quickly and easily formulate queries, and obtain dynamically generated source code for popular programming languages such as Java, Perl, Python and Ruby. Search results are returned as standard JSON, XML or GFF data objects. The Proteins API is a scalable, reliable, fast and easy-to-use set of RESTful services that provides a broad protein information resource, allowing users to ask questions based upon their field of expertise and to gain an integrated overview of the protein annotations available to aid their understanding of proteins in biological processes. The Proteins API is available at (http://www.ebi.ac.uk/proteins/api/doc). © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
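A hedged sketch of programmatic access from Python is shown below; the endpoint path /proteins/api/proteins/{accession} and the response field names are assumptions that should be checked against the Swagger documentation cited above, and the accession used is only an example.

    # Hedged sketch of querying the Proteins API; endpoint path and JSON
    # field names are assumptions to verify against the Swagger docs.
    import requests

    accession = "P12345"   # example UniProtKB accession
    url = "https://www.ebi.ac.uk/proteins/api/proteins/" + accession
    resp = requests.get(url, headers={"Accept": "application/json"})
    resp.raise_for_status()
    entry = resp.json()
    print(entry.get("accession"), entry.get("sequence", {}).get("length"))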
Hardware Acceleration for Cyber Security
2010-11-01
perform different approaches. It includes behavioral analysis, by means of NetFlow monitoring, as well as packet content analysis, so-called Deep...Interface (API). An example of such an application is the NetFlow exporter described in [5]. • We provide a modified libpcap library using the libsze2 API. This...cards. The software applications using NIFIC include the FlowMon NetFlow/IPFIX generator, the Wireshark packet analyzer, iptables - the Linux kernel firewall, deep
An Object-Oriented Network-Centric Software Architecture for Physical Computing
NASA Astrophysics Data System (ADS)
Palmer, Richard
1997-08-01
Recent developments in object-oriented computer languages and infrastructure such as the Internet, Web browsers, and the like provide an opportunity to define a more productive computational environment for scientific programming that is based more closely on the underlying mathematics describing physics than traditional programming languages such as FORTRAN or C++. In this talk I describe an object-oriented software architecture for representing physical problems that includes classes for such common mathematical objects as geometry, boundary conditions, partial differential and integral equations, discretization and numerical solution methods, etc. In practice, a scientific program written using this architecture looks remarkably like the mathematics used to understand the problem, is typically an order of magnitude smaller than traditional FORTRAN or C++ codes, and is hence easier to understand, debug, describe, etc. All objects in this architecture are "network-enabled," which means that components of a software solution to a physical problem can be transparently loaded from anywhere on the Internet or other global network. The architecture is expressed as an "API," or application programmers interface specification, with reference embeddings in Java, Python, and C++. A C++ class library for an early version of this API has been implemented for machines ranging from PCs to the IBM SP2, meaning that identical codes run on all architectures.
Auralization Architectures for NASA's Next Generation Aircraft Noise Prediction Program
NASA Technical Reports Server (NTRS)
Rizzi, Stephen A.; Lopes, Leonard V.; Burley, Casey L.; Aumann, Aric R.
2013-01-01
Aircraft community noise is a significant concern due to continued growth in air traffic, increasingly stringent environmental goals, and operational limitations imposed by airport authorities. The assessment of human response to noise from future aircraft can only be afforded through laboratory testing using simulated flyover noise. Recent work by the authors demonstrated the ability to auralize predicted flyover noise for a state-of-the-art reference aircraft and a future hybrid wing body aircraft concept. This auralization used source noise predictions from NASA's Aircraft NOise Prediction Program (ANOPP) as input. The results from this process demonstrated that auralization based upon system noise predictions is consistent with, and complementary to, system noise predictions alone. To further develop and validate the auralization process, improvements to the interfaces between the synthesis capability and the system noise tools are required. This paper describes the key elements required for accurate noise synthesis and introduces auralization architectures for use with the next-generation ANOPP (ANOPP2). The architectures are built around a new auralization library and its associated Application Programming Interface (API) that utilize ANOPP2 APIs to access data required for auralization. The architectures are designed to make the process of auralizing flyover noise a common element of system noise prediction.
Marshall Space Flight Center Telescience Resource Kit
NASA Technical Reports Server (NTRS)
Wade, Gina
2016-01-01
Telescience Resource Kit (TReK) is a suite of software applications that can be used to monitor and control assets in space or on the ground. The Telescience Resource Kit was originally developed for the International Space Station program. Since then it has been used to support a variety of NASA programs and projects including the WB-57 Ascent Vehicle Experiment (WAVE) project, the Fast Affordable Science and Technology Satellite (FASTSAT) project, and the Constellation Program. The Payloads Operations Center (POC), also known as the Payload Operations Integration Center (POIC), provides the capability for payload users to operate their payloads at their home sites. In this environment, TReK provides local ground support system services and an interface to utilize remote services provided by the POC. TReK provides ground system services for local and remote payload user sites including International Partner sites, Telescience Support Centers, and U.S. Investigator sites in over 40 locations worldwide. General capabilities include:
• Support for various data interfaces such as User Datagram Protocol, Transmission Control Protocol, and serial interfaces.
• Data services: retrieve, process, record, playback, forward, and display data (ground-based data or telemetry data).
• Command: create, modify, send, and track commands.
• Command management: configure one TReK system to serve as a command server/filter for other TReK systems.
• Database: databases are used to store telemetry and command definition information.
• Application Programming Interface (API): an ANSI C interface compatible with commercial products such as Visual C++, Visual Basic, LabVIEW, Borland C++, etc. The TReK API provides a bridge for users to develop software to access and extend TReK services.
• Environments: development, test, simulations, training, and flight. Includes standalone training simulators.
DASMiner: discovering and integrating data from DAS sources
2009-01-01
Background DAS is a widely adopted protocol for providing syntactic interoperability among biological databases. The popularity of DAS is due to a simplified and elegant mechanism for data exchange that consists of sources exposing their RESTful interfaces for data access. As a growing number of DAS services are available for molecular biology resources, there is an incentive to explore this protocol in order to advance data discovery and integration among these resources. Results We developed DASMiner, a Matlab toolkit for querying DAS data sources that enables creation of integrated biological models using the information available in DAS-compliant repositories. DASMiner is composed of a browser application and an API that work together to facilitate gathering of data from different DAS sources, which can be used for creating enriched datasets from multiple sources. The browser is used to formulate queries and navigate data contained in DAS sources. Users can execute queries against these sources in an intuitive fashion, without the need to know the specific DAS syntax for the particular source. Using the source's metadata provided by the DAS Registry, the browser's layout adapts to expose only the set of commands and coordinate systems supported by the specific source. For this reason, the browser can interrogate any DAS source, independently of the type of data being served. The API component of DASMiner may be used for programmatic access to DAS sources by programs in Matlab. Once the desired data are found during navigation, the query is exported in the format of an API call to be used within any Matlab application. We illustrate the use of DASMiner by creating integrative models of histone modification maps and protein-protein interaction networks. These enriched datasets were built by retrieving and integrating distributed genomic and proteomic DAS sources using the API. Conclusion The support of the DAS protocol allows hundreds of molecular biology databases to be treated as a federated, online collection of resources. DASMiner enables full exploration of these resources, and can be used to deploy applications and create integrated views of biological systems using the information deposited in DAS repositories. PMID:19919683
Software for Remote Monitoring of Space-Station Payloads
NASA Technical Reports Server (NTRS)
Schneider, Michelle; Lippincott, Jeff; Chubb, Steve; Whitaker, Jimmy; Gillis, Robert; Sellers, Donna; Sims, Chris; Rice, James
2003-01-01
Telescience Resource Kit (TReK) is a suite of application programs that enable geographically dispersed users to monitor scientific payloads aboard the International Space Station (ISS). TReK provides local ground support services that can simultaneously receive, process, record, playback, and display data from multiple sources. TReK also provides interfaces to use the remote services provided by the Payload Operations Integration Center which manages all ISS payloads. An application programming interface (API) allows for payload users to gain access to all data processed by TReK and allows payload-specific tools and programs to be built or integrated with TReK. Used in conjunction with other ISS-provided tools, TReK provides the ability to integrate payloads with the operational ground system early in the lifecycle. This reduces the potential for operational problems and provides "cradle-to-grave" end-to-end operations. TReK contains user guides and self-paced tutorials along with training applications to allow the user to become familiar with the system.
A Scientific Data Provenance API for Distributed Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Raju, Bibi; Elsethagen, Todd O.; Stephan, Eric G.
Data provenance has been an active area of research as a means to standardize how the origin of data, process event history, and what or who was responsible for influencing results are explained. There are two approaches to capturing provenance information. The first approach is to collect observed evidence produced by an executing application using log files, event listeners, and temporary files that are used by the application or application developer. The provenance translated from these observations is an interpretation of the provided evidence. The second approach is called disclosed because the application provides a firsthand account of the provenance based on the anticipated questions on data flow, process flow, and responsible agents. Most observed provenance collection systems collect a lot of provenance information during an application run or workflow execution. The common trend in capturing provenance is to collect all possible information and then attempt to find the relevant information, which is not efficient. Existing disclosed provenance system APIs do not work well in distributed environments and have trouble finding where to fit the individual pieces of provenance information. This work focuses on determining more reliable solutions for provenance capture. As part of the Integrated End-to-end Performance Prediction and Diagnosis for Extreme Scientific Workflows (IPPD) project, an API was developed, called the Producer API (PAPI), which can disclose application-targeted provenance and is designed to work in distributed environments by means of unique object identification methods. The provenance disclosure approach used adds additional metadata to the provenance information to uniquely identify the pieces and connect them together. PAPI uses a common provenance model to support this provenance integration across disclosure sources. The API also provides the flexibility to let the user decide what to do with the collected provenance. The collected provenance can be sent to a triple store using REST services or it can be logged to a file.
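As a generic illustration of the disclosed-provenance idea (this is not the PAPI interface itself), the sketch below builds a small provenance record with globally unique identifiers and pushes it to a hypothetical REST endpoint, one of the destinations the abstract mentions.

    # Generic sketch of disclosed provenance (not the PAPI API): build a
    # PROV-style record with unique IDs and POST it to a REST service.
    # The endpoint URL and identifier scheme are hypothetical.
    import uuid
    import requests

    def disclose(activity, inputs, outputs, agent):
        """Build a small provenance record with a unique ID for later linking."""
        return {
            "id": str(uuid.uuid4()),
            "activity": activity,
            "used": inputs,          # identifiers of consumed data objects
            "generated": outputs,    # identifiers of produced data objects
            "wasAssociatedWith": agent,
        }

    record = disclose("workflow:step-preprocess",
                      inputs=["data:raw-001"],
                      outputs=["data:clean-001"],
                      agent="user:analyst42")
    requests.post("http://example.org/provenance", json=record).raise_for_status()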
NASA Astrophysics Data System (ADS)
Huang, C. Y.; Wu, C. H.
2016-06-01
The Internet of Things (IoT) is an infrastructure that interconnects uniquely-identifiable devices using the Internet. By interconnecting everyday appliances, various monitoring and physical mashup applications can be constructed to improve people's daily life. However, IoT devices created by different manufacturers follow different proprietary protocols and cannot communicate with each other. This heterogeneity issue causes different products to be locked in multiple closed ecosystems that we call IoT silos. In order to address this issue, a common industrial solution is the hub approach, which implements connectors to communicate with IoT devices following different protocols. However, with the growing number of proprietary protocols proposed by device manufacturers, IoT hubs need to support and maintain a lot of customized connectors. Hence, we believe the ultimate solution to address the heterogeneity issue is to follow open and interoperable standards. Among the existing IoT standards, the Open Geospatial Consortium (OGC) SensorThings API standard supports a comprehensive conceptual model and query functionalities. The first version of the SensorThings API mainly focuses on connecting to IoT devices and sharing sensor observations online, which is the sensing capability. Besides the sensing capability, IoT devices can also be controlled via the Internet, which is the tasking capability. While the tasking capability was not included in the first version of the SensorThings API standard, this research aims at defining a tasking capability profile and integrating it with the SensorThings API standard, which we call the extended-SensorThings API in this paper. In general, this research proposes a lightweight JSON-based web service description, the "Tasking Capability Description", allowing device owners and manufacturers to describe different IoT device protocols. Through the extended-SensorThings API, users and applications can follow a coherent protocol to control IoT devices that use different communication protocols, which could consequently achieve an interoperable Internet of Things infrastructure.
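A hedged sketch of interacting with a SensorThings-style service from Python follows. The base URL is hypothetical, and the /v1.0/Things and /v1.0/Observations resource paths reflect the standard SensorThings conventions for the sensing capability; the tasking extension proposed above is not shown.

    # Hedged sketch of the OGC SensorThings sensing capability; the base URL
    # is hypothetical and resource paths should be verified against the
    # target service.
    import requests

    BASE = "http://example.org/SensorThingsService/v1.0"   # hypothetical endpoint

    # List registered Things (IoT devices).
    things = requests.get(BASE + "/Things").json()
    for thing in things.get("value", []):
        print(thing.get("@iot.id"), thing.get("name"))

    # Push a new Observation into an existing Datastream (id assumed to exist).
    obs = {"result": 23.4, "Datastream": {"@iot.id": 1}}
    requests.post(BASE + "/Observations", json=obs).raise_for_status()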
Application-oriented integrated control center (AICC) for heterogeneous optical networks
NASA Astrophysics Data System (ADS)
Zhao, Yongli; Zhang, Jie; Cao, Xuping; Wang, Dajiang; Wu, Koubo; Cai, Yinxiang; Gu, Wanyi
2011-12-01
Various broadband services, such as data center applications and cloud computing, have been consuming the bandwidth resources of optical networks. There are still some challenges for future optical networks, although the available bandwidth is increasing with the development of transmission technologies. The relationship between the upper application layer and the lower network resource layer needs to be researched further. In order to improve the efficiency of network resources and the capability of service provisioning, heterogeneous optical network resources can be abstracted as unified Application Programming Interfaces (APIs) that can be opened to various upper-layer applications through the Application-oriented Integrated Control Center (AICC) proposed in this paper. A novel OpenFlow-based unified control architecture is proposed for the optimization of cross-layer resources. Numerical results from simulation experiments show good performance for the AICC.
Collaborative development of predictive toxicology applications
2010-01-01
OpenTox provides an interoperable, standards-based Framework for the support of predictive toxicology data management, algorithms, modelling, validation and reporting. It is relevant to satisfying the chemical safety assessment requirements of the REACH legislation as it supports access to experimental data, (Quantitative) Structure-Activity Relationship models, and toxicological information through an integrating platform that adheres to regulatory requirements and OECD validation principles. Initial research defined the essential components of the Framework including the approach to data access, schema and management, use of controlled vocabularies and ontologies, architecture, web service and communications protocols, and selection and integration of algorithms for predictive modelling. OpenTox provides end-user oriented tools to non-computational specialists, risk assessors, and toxicological experts in addition to Application Programming Interfaces (APIs) for developers of new applications. OpenTox actively supports public standards for data representation, interfaces, vocabularies and ontologies, Open Source approaches to core platform components, and community-based collaboration approaches, so as to progress system interoperability goals. The OpenTox Framework includes APIs and services for compounds, datasets, features, algorithms, models, ontologies, tasks, validation, and reporting which may be combined into multiple applications satisfying a variety of different user needs. OpenTox applications are based on a set of distributed, interoperable OpenTox API-compliant REST web services. The OpenTox approach to ontology allows for efficient mapping of complementary data coming from different datasets into a unifying structure having a shared terminology and representation. Two initial OpenTox applications are presented as an illustration of the potential impact of OpenTox for high-quality and consistent structure-activity relationship modelling of REACH-relevant endpoints: ToxPredict which predicts and reports on toxicities for endpoints for an input chemical structure, and ToxCreate which builds and validates a predictive toxicity model based on an input toxicology dataset. Because of the extensible nature of the standardised Framework design, barriers of interoperability between applications and content are removed, as the user may combine data, models and validation from multiple sources in a dependable and time-effective way. PMID:20807436
Collaborative development of predictive toxicology applications.
Hardy, Barry; Douglas, Nicki; Helma, Christoph; Rautenberg, Micha; Jeliazkova, Nina; Jeliazkov, Vedrin; Nikolova, Ivelina; Benigni, Romualdo; Tcheremenskaia, Olga; Kramer, Stefan; Girschick, Tobias; Buchwald, Fabian; Wicker, Joerg; Karwath, Andreas; Gütlein, Martin; Maunz, Andreas; Sarimveis, Haralambos; Melagraki, Georgia; Afantitis, Antreas; Sopasakis, Pantelis; Gallagher, David; Poroikov, Vladimir; Filimonov, Dmitry; Zakharov, Alexey; Lagunin, Alexey; Gloriozova, Tatyana; Novikov, Sergey; Skvortsova, Natalia; Druzhilovsky, Dmitry; Chawla, Sunil; Ghosh, Indira; Ray, Surajit; Patel, Hitesh; Escher, Sylvia
2010-08-31
OpenTox provides an interoperable, standards-based Framework for the support of predictive toxicology data management, algorithms, modelling, validation and reporting. It is relevant to satisfying the chemical safety assessment requirements of the REACH legislation as it supports access to experimental data, (Quantitative) Structure-Activity Relationship models, and toxicological information through an integrating platform that adheres to regulatory requirements and OECD validation principles. Initial research defined the essential components of the Framework including the approach to data access, schema and management, use of controlled vocabularies and ontologies, architecture, web service and communications protocols, and selection and integration of algorithms for predictive modelling. OpenTox provides end-user oriented tools to non-computational specialists, risk assessors, and toxicological experts in addition to Application Programming Interfaces (APIs) for developers of new applications. OpenTox actively supports public standards for data representation, interfaces, vocabularies and ontologies, Open Source approaches to core platform components, and community-based collaboration approaches, so as to progress system interoperability goals. The OpenTox Framework includes APIs and services for compounds, datasets, features, algorithms, models, ontologies, tasks, validation, and reporting which may be combined into multiple applications satisfying a variety of different user needs. OpenTox applications are based on a set of distributed, interoperable OpenTox API-compliant REST web services. The OpenTox approach to ontology allows for efficient mapping of complementary data coming from different datasets into a unifying structure having a shared terminology and representation. Two initial OpenTox applications are presented as an illustration of the potential impact of OpenTox for high-quality and consistent structure-activity relationship modelling of REACH-relevant endpoints: ToxPredict which predicts and reports on toxicities for endpoints for an input chemical structure, and ToxCreate which builds and validates a predictive toxicity model based on an input toxicology dataset. Because of the extensible nature of the standardised Framework design, barriers of interoperability between applications and content are removed, as the user may combine data, models and validation from multiple sources in a dependable and time-effective way.
USDA-ARS?s Scientific Manuscript database
Breeding honey bees (Apis mellifera) for physiological resistance to diseases is a highly desirable and environmentally safe approach to increasing colony survival. Selection of desirable traits is a critical element of any breeding program. In this study we investigate whether honey bee stocks dif...
Programming PHREEQC calculations with C++ and Python: a comparative study
Charlton, Scott R.; Parkhurst, David L.; Muller, Mike
2011-01-01
The new IPhreeqc module provides an application programming interface (API) to facilitate coupling of other codes with the U.S. Geological Survey geochemical model PHREEQC. Traditionally, loose coupling of PHREEQC with other applications required methods to create PHREEQC input files, start external PHREEQC processes, and process PHREEQC output files. IPhreeqc eliminates most of this effort by providing direct access to PHREEQC capabilities through a component object model (COM), a library, or a dynamically linked library (DLL). Input and calculations can be specified through internally programmed strings, and all data exchange between an application and the module can occur in computer memory. This study compares simulations programmed in C++ and Python that are tightly coupled with IPhreeqc modules to the traditional simulations that are loosely coupled to PHREEQC. The study compares performance, quantifies effort, and evaluates lines of code and the complexity of the design. The comparisons show that IPhreeqc offers a more powerful and simpler approach for incorporating PHREEQC calculations into transport models and other applications that need to perform PHREEQC calculations. The IPhreeqc module facilitates the design of coupled applications and significantly reduces run times. Even a moderate knowledge of one of the supported programming languages allows more efficient use of PHREEQC than the traditional loosely coupled approach.
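A hedged sketch of the tightly coupled, in-memory style of use is shown below, written against the PhreeqPy wrapper associated with one of the paper's authors. The module path and the method names load_database, run_string and get_selected_output_array are assumptions to be verified against the PhreeqPy documentation; the input block is ordinary PHREEQC keyword input passed as a string.

    # Hedged sketch of calling IPhreeqc from Python via PhreeqPy; module path
    # and method names are assumptions, database path is user-specific.
    from phreeqpy.iphreeqc.phreeqc_dll import IPhreeqc

    phreeqc = IPhreeqc()
    phreeqc.load_database("phreeqc.dat")        # path to a PHREEQC database file

    input_string = """
    SOLUTION 1
        temp  25
        pH    7
        Na    1.0
        Cl    1.0
    SELECTED_OUTPUT
        -pH true
    END
    """
    phreeqc.run_string(input_string)            # run the calculation in memory
    rows = phreeqc.get_selected_output_array()  # header row + result rows
    for row in rows:
        print(row)

Because the input string and results stay in memory, no PHREEQC input or output files need to be written, which is the main advantage over the loosely coupled approach the abstract compares against.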
Spatial Point Data Analysis of Geolocated Tweets in the First Day of Eid Al-Fitr 2017 in Java Island
NASA Astrophysics Data System (ADS)
Wibowo, T. W.
2017-12-01
Eid Al-Fitr is a worldwide Muslim feast day, which in Indonesia is generally accompanied by the tradition of going home (mudik). Demographic patterns generally shift at the time of the holiday, as many urban residents travel to their hometowns. This shift produces massive population mobility, generally accompanied by traffic congestion. The presence of location sensors on smartphone devices opens the opportunity to map the movement of the population in real time or near-real time, especially now that social media applications integrate the capability to include location information. One of the popular social media applications in Indonesia is Twitter, which provides microblogging facilities to its users. This study aims to analyze the pattern of geolocated Tweet data uploaded by Twitter users on the first day of Eid Al-Fitr (1 Syawal 1438H). Geolocated Tweet data mining was done using the Streaming API (Application Programming Interface) and the Python programming language. A total of 13,224 geolocated Tweet points were obtained for the study area. Various point data analysis techniques were applied to the collected data, such as density analysis, pattern analysis, and proximity analysis. In general, active Twitter users are dominated by residents of major cities, such as Jakarta, Bandung, Surabaya, Yogyakarta, Surakarta and Semarang. The results of the analysis can be used to determine whether geolocated Tweet data mined by the Streaming API method can represent the movement of the population during mudik.
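A hedged sketch of the kind of Streaming API collection described above, assuming the older tweepy 3.x interface; the credentials are placeholders and the bounding box for Java Island is approximate.

    # Hedged sketch of collecting geolocated Tweets with tweepy 3.x-style
    # streaming; credentials are placeholders, bounding box is approximate.
    import json
    import tweepy

    class GeoListener(tweepy.StreamListener):
        def on_status(self, status):
            if status.coordinates:                       # keep geolocated Tweets only
                lon, lat = status.coordinates["coordinates"]
                print(json.dumps({"lon": lon, "lat": lat,
                                  "time": str(status.created_at)}))

    auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
    auth.set_access_token("ACCESS_TOKEN", "ACCESS_SECRET")

    stream = tweepy.Stream(auth=auth, listener=GeoListener())
    # locations = [west_lon, south_lat, east_lon, north_lat] for Java Island (approx.)
    stream.filter(locations=[105.0, -9.0, 115.0, -5.0])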
Salud, Margaret C; Marshak, Helen Hopp; Natto, Zuhair S; Montgomery, Susanne
2014-01-01
While HIV rates are low for Asian/Pacific Islanders (APIs), they have been increasing, especially for API women in the USA. We conducted a cross-sectional study with 299 young API women (18-24 years old) in the Inland Empire region of Southern California to better understand their intention for HIV testing and their perceptions about HIV/AIDS. Data analyses included descriptive statistics, bivariate exploration for model building and multivariate analyses to determine variables associated with HIV-testing intentions. Results suggest that more lifetime sexual partners, greater perceived gender susceptibility, higher HIV/AIDS knowledge, sexually active, more positive attitudes about HIV testing and higher self-perceptions/experiences related to risk contribute to stronger intentions for HIV testing in young API women. Findings from this study will contribute to the limited literature on HIV/AIDS in API women and provide information that can be used for developing and implementing culturally appropriate programs that encourage HIV prevention and testing in this population.
Salud, Margaret C.; Marshak, Helen Hopp; Natto, Zuhair S.; Montgomery, Susanne
2015-01-01
While HIV rates are low for Asian/Pacific Islanders (APIs), they have been increasing, especially for API women in the USA. We conducted a cross-sectional study with 299 young API women (18–24 years old) in the Inland Empire region of Southern California to better understand their intention for HIV testing and their perceptions about HIV/AIDS. Data analyses included descriptive statistics, bivariate exploration for model building and multivariate analyses to determine variables associated with HIV-testing intentions. Results suggest that more lifetime sexual partners, greater perceived gender susceptibility, higher HIV/AIDS knowledge, sexually active, more positive attitudes about HIV testing and higher self-perceptions/experiences related to risk contribute to stronger intentions for HIV testing in young API women. Findings from this study will contribute to the limited literature on HIV/AIDS in API women and provide information that can be used for developing and implementing culturally appropriate programs that encourage HIV prevention and testing in this population. PMID:24111859
2013-06-01
Radio is a software development toolkit that provides signal processing blocks to drive the SDR. GNU Radio has many strong points – it is actively...maintained with a large user base, new capabilities are constantly being added, and compiled C code is fast for many real-time applications such as...programming interface (API) makes learning the architecture a daunting task, even for the experienced software developer. This requirement poses many
Querying and Computing with BioCyc Databases
Krummenacker, Markus; Paley, Suzanne; Mueller, Lukas; Yan, Thomas; Karp, Peter D.
2006-01-01
Summary We describe multiple methods for accessing and querying the complex and integrated cellular data in the BioCyc family of databases: access through multiple file formats, access through Application Program Interfaces (APIs) for LISP, Perl and Java, and SQL access through the BioWarehouse relational database. Availability The Pathway Tools software and 20 BioCyc DBs in Tiers 1 and 2 are freely available to academic users; fees apply to some types of commercial use. For download instructions see http://BioCyc.org/download.shtml PMID:15961440
Integrating UniTree with the data migration API
NASA Technical Reports Server (NTRS)
Schrodel, David G.
1994-01-01
The Data Migration Application Programming Interface (DMAPI) has the potential to allow developers of open systems Hierarchical Storage Management (HSM) products to virtualize native file systems without the requirement to make changes to the underlying operating system. This paper describes advantages of virtualizing native file systems in hierarchical storage management systems, the DMAPI at a high level, what the goals are for the interface, and the integration of the Convex UniTree+HSM with DMAPI along with some of the benefits derived in the resulting product.
Kasthurirathne, Suranga N; Mamlin, Burke; Grieve, Grahame; Biondich, Paul
2015-01-01
Interoperability is essential to address limitations caused by the ad hoc implementation of clinical information systems and the distributed nature of modern medical care. The HL7 V2 and V3 standards have played a significant role in ensuring interoperability for healthcare. FHIR is a next-generation standard created to address fundamental limitations in HL7 V2 and V3. FHIR is particularly relevant to OpenMRS, an Open Source Medical Record System widely used across emerging economies. FHIR has the potential to allow OpenMRS to move away from a bespoke, application-specific API to a standards-based API. We describe efforts to design and implement a FHIR-based API for the OpenMRS platform. Lessons learned from this effort were used to define long-term plans to transition from the legacy OpenMRS API to a FHIR-based API that greatly reduces the learning curve for developers and helps enhance adherence to standards.
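Because FHIR itself is a published standard, a client-side sketch can be given without reference to OpenMRS internals. The example below performs a standard FHIR R4 Patient search with Python's requests library; the base URL, module path and credentials are assumptions for illustration, not the documented OpenMRS endpoint of the work described above.

    # Hedged sketch of a FHIR R4 Patient search; base URL, path and
    # credentials are assumptions, the Bundle structure follows the FHIR spec.
    import requests

    BASE = "http://localhost:8080/openmrs/ws/fhir2/R4"    # hypothetical endpoint
    resp = requests.get(BASE + "/Patient",
                        params={"name": "Smith"},
                        auth=("admin", "Admin123"),        # placeholder credentials
                        headers={"Accept": "application/fhir+json"})
    resp.raise_for_status()
    bundle = resp.json()                                   # a FHIR Bundle resource
    for entry in bundle.get("entry", []):
        patient = entry["resource"]
        print(patient.get("id"), patient.get("name"))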
Subsetting and Formatting Landsat-7 L0R ETM+ Data Products
NASA Technical Reports Server (NTRS)
Reid, Michael R.
2000-01-01
The Landsat-7 Processing System (LPS) processes Landsat-7 Enhanced Thematic Mapper Plus (ETM+) instrument data into large, contiguous segments called "subintervals" and stores them in Level 0R (L0R) data files. The LPS-processed subinterval products must be subsetted and reformatted before the Level 1 processing systems can ingest them. The initial full subintervals produced by the LPS are stored mainly in HDF Earth Observing System (HDF-EOS) format, which is an extension to the Hierarchical Data Format (HDF). The final L0R products are stored in native HDF format. Primarily the EOS Core System (ECS), and alternately the DAAC Emergency System (DES), subset the subinterval data for the operational Landsat-7 data processing systems. The HDF and HDF-EOS application programming interfaces (APIs) can be used for extensive data subsetting and data reorganization. A stand-alone subsetter tool has been developed which is based on some of the DES code. This tool makes use of the HDF and HDF-EOS APIs to perform Landsat-7 L0R product subsetting and demonstrates how HDF and HDF-EOS can be used for creating various configurations of full L0R products. How these APIs can be used to efficiently subset, format, and organize Landsat-7 L0R data, as demonstrated by the subsetter tool and the DES, is discussed.
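As a hedged illustration of HDF-based subsetting in general (not the subsetter tool itself), the fragment below opens an HDF4 file through the pyhdf SD API and extracts a spatial subset of one dataset; the file and dataset names are hypothetical, since real Landsat-7 L0R products use mission-specific names.

    # Hedged sketch of HDF4 subsetting with pyhdf; file and dataset names
    # are hypothetical placeholders.
    from pyhdf.SD import SD, SDC

    hdf = SD("L7_subinterval.hdf", SDC.READ)       # hypothetical L0R subinterval file
    print(hdf.datasets().keys())                   # list available scientific datasets

    band = hdf.select("B10")                       # hypothetical band dataset name
    subset = band[0:1000, 0:1000]                  # extract a spatial subset (rows, cols)
    print(subset.shape)

    band.endaccess()
    hdf.end()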
NASA Technical Reports Server (NTRS)
LaMora, Andy; Raugh, A.; Erickson, K.; Grayzeck, E. J.; Knopf, W.; Morgan, T. H.
2012-01-01
NASA PDS hosts terabytes of valuable data from hundreds of data sources and spans decades of research. Data is stored on flat-file systems regulated through careful metadata dictionaries. PDS's data is available to the public through its website, which supports data searches through drill-down navigation. While the system returns data quickly, result sets in response to identical input differ depending on the drill-down path a user follows. To correct this issue, to allow custom searching, and to improve general accessibility, PDS sought to create a new data structure and API, and to use them to build applications that are a joy to use and showcase the value of the data to students, teachers and citizens. PDS engaged TopCoder and Harvard Business School through the NTL to pursue these objectives in a pilot effort. Scope was limited to Small Bodies Node data. NTL analyzed data, proposed a solution, and implemented it through a series of micro-contests. Contests focused on different segments of the problem: conceptualization, architectural design, implementation, testing, etc. To demonstrate the utility of the completed solution, NTL developed web-based and mobile applications that can compare targets, regardless of mission. To further explore the potential of the solution, NTL hosted "Mash-up" challenges that integrated the API with other publicly available assets, to produce consumer and teaching applications, including an Augmented Reality iPad tool. Two contests were also posted to middle and high school students via the NoNameSite.com platform, and as a result of these contests, PDS/SBN has initiated a Facebook program. These contests defined and implemented a data warehouse with the necessary migration tools to transform legacy data, produced a public web interface for the new search, developed a public API, and produced four mobile applications that we expect to appeal to users both within and without the academic community.
NASA Astrophysics Data System (ADS)
LaMora, Andy; Raugh, A.; Erickson, K.; Grayzeck, E. J.; Knopf, W.; Lydon, M.; Lakhani, K.; Crusan, J.; Morgan, T. H.
2012-10-01
NASA PDS hosts terabytes of valuable data from hundreds of data sources and spans decades of research. Data is stored on flat-file systems regulated through careful metadata dictionaries. PDS’s data is available to the public through its website, which supports data searches through drill-down navigation. While the system returns data quickly, result sets in response to identical input differ depending on the drill-down path a user follows. To correct this issue, to allow custom searching, and to improve general accessibility, PDS sought to create a new data structure and API, and to use them to build applications that are a joy to use and showcase the value of the data to students, teachers and citizens. PDS engaged TopCoder and Harvard Business School through the NTL to pursue these objectives in a pilot effort. Scope was limited to Small Bodies Node data. NTL analyzed data, proposed a solution, and implemented it through a series of micro-contests. Contests focused on different segments of the problem: conceptualization, architectural design, implementation, testing, etc. To demonstrate the utility of the completed solution, NTL developed web-based and mobile applications that can compare targets, regardless of mission. To further explore the potential of the solution, NTL hosted “Mash-up” challenges that integrated the API with other publicly available assets, to produce consumer and teaching applications, including an Augmented Reality iPad tool. Two contests were also posted to middle and high school students via the NoNameSite.com platform, and as a result of these contests, PDS/SBN has initiated a Facebook program. These contests defined and implemented a data warehouse with the necessary migration tools to transform legacy data, produced a public web interface for the new search, developed a public API, and produced four mobile applications that we expect to appeal to users both within and without the academic community.
NASA Technical Reports Server (NTRS)
Smith, Dan
2007-01-01
The Goddard Mission Services Evolution Center, or GMSEC, was started in 2001 to create a new standard approach for managing GSFC missions. Standardized approaches in the past involved selecting and then integrating the most appropriate set of functional tools. Assumptions were made that "one size fits all" and that tool changes would not be necessary for many years. GMSEC took a very different approach and has proven to be very successful. The core of the GMSEC architecture consists of a publish/subscribe message bus, standardized message formats, and an Applications Programming Interface (API). The API supports multiple operating systems, programming languages and messaging middleware products. We use a GMSEC-developed free middleware for low-cost development. A high capacity, robust middleware is used for operations and a messaging system with a very small memory footprint is used for on-board flight software. Software components can use the standard message formats or develop adapters to convert from their native formats to the GMSEC formats. We do not want vendors to modify their core products. Over 50 software components are now available for use with the GMSEC architecture. Most available commercial telemetry and command systems, including the GMV hifly Satellite Control System, have been adapted to run in the GMSEC labs.
Howe, E.A.; de Souza, A.; Lahr, D.L.; Chatwin, S.; Montgomery, P.; Alexander, B.R.; Nguyen, D.-T.; Cruz, Y.; Stonich, D.A.; Walzer, G.; Rose, J.T.; Picard, S.C.; Liu, Z.; Rose, J.N.; Xiang, X.; Asiedu, J.; Durkin, D.; Levine, J.; Yang, J.J.; Schürer, S.C.; Braisted, J.C.; Southall, N.; Southern, M.R.; Chung, T.D.Y.; Brudz, S.; Tanega, C.; Schreiber, S.L.; Bittker, J.A.; Guha, R.; Clemons, P.A.
2015-01-01
BARD, the BioAssay Research Database (https://bard.nih.gov/) is a public database and suite of tools developed to provide access to bioassay data produced by the NIH Molecular Libraries Program (MLP). Data from 631 MLP projects were migrated to a new structured vocabulary designed to capture bioassay data in a formalized manner, with particular emphasis placed on the description of assay protocols. New data can be submitted to BARD with a user-friendly set of tools that assist in the creation of appropriately formatted datasets and assay definitions. Data published through the BARD application program interface (API) can be accessed by researchers using web-based query tools or a desktop client. Third-party developers wishing to create new tools can use the API to produce stand-alone tools or new plug-ins that can be integrated into BARD. The entire BARD suite of tools therefore supports three classes of researcher: those who wish to publish data, those who wish to mine data for testable hypotheses, and those in the developer community who wish to build tools that leverage this carefully curated chemical biology resource. PMID:25477388
Introducing the PRIDE Archive RESTful web services.
Reisinger, Florian; del-Toro, Noemi; Ternent, Tobias; Hermjakob, Henning; Vizcaíno, Juan Antonio
2015-07-01
The PRIDE (PRoteomics IDEntifications) database is one of the world-leading public repositories of mass spectrometry (MS)-based proteomics data and it is a founding member of the ProteomeXchange Consortium of proteomics resources. In the original PRIDE database system, users could access data programmatically by accessing the web services provided by the PRIDE BioMart interface. New REST (REpresentational State Transfer) web services have been developed to serve the most popular functionality provided by BioMart (now discontinued due to data scalability issues) and address the data access requirements of the newly developed PRIDE Archive. Using the API (Application Programming Interface) it is now possible to programmatically query for and retrieve peptide and protein identifications, project and assay metadata and the originally submitted files. Searching and filtering is also possible by metadata information, such as sample details (e.g. species and tissues), instrumentation (mass spectrometer), keywords and other provided annotations. The PRIDE Archive web services were first made available in April 2014. The API has already been adopted by a few applications and standalone tools such as PeptideShaker, PRIDE Inspector, the Unipept web application and the Python-based BioServices package. This application is free and open to all users with no login requirement and can be accessed at http://www.ebi.ac.uk/pride/ws/archive/. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
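A hedged sketch of calling the service from Python is given below; the /project/list path and its query parameters are assumptions based on the service style described above and should be verified against the live API documentation at the URL given in the abstract.

    # Hedged sketch of a PRIDE Archive web service call; the resource path and
    # parameters are assumptions, only the base URL comes from the abstract.
    import requests

    BASE = "http://www.ebi.ac.uk/pride/ws/archive"
    resp = requests.get(BASE + "/project/list", params={"show": 5, "page": 0})
    resp.raise_for_status()
    for project in resp.json().get("list", []):
        print(project.get("accession"), project.get("title"))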
Vella, Michael; Cannon, Robert C; Crook, Sharon; Davison, Andrew P; Ganapathy, Gautham; Robinson, Hugh P C; Silver, R Angus; Gleeson, Padraig
2014-01-01
NeuroML is an XML-based model description language, which provides a powerful common data format for defining and exchanging models of neurons and neuronal networks. In the latest version of NeuroML, the structure and behavior of ion channel, synapse, cell, and network model descriptions are based on underlying definitions provided in LEMS, a domain-independent language for expressing hierarchical mathematical models of physical entities. While declarative approaches for describing models have led to greater exchange of model elements among software tools in computational neuroscience, a frequent criticism of XML-based languages is that they are difficult to work with directly. Here we describe two Application Programming Interfaces (APIs) written in Python (http://www.python.org), which simplify the process of developing and modifying models expressed in NeuroML and LEMS. The libNeuroML API provides a Python object model with a direct mapping to all NeuroML concepts defined by the NeuroML Schema, which facilitates reading and writing the XML equivalents. In addition, it offers a memory-efficient, array-based internal representation, which is useful for handling large-scale connectomics data. The libNeuroML API also includes support for performing common operations that are required when working with NeuroML documents. Access to the LEMS data model is provided by the PyLEMS API, which provides a Python implementation of the LEMS language, including the ability to simulate most models expressed in LEMS. Together, libNeuroML and PyLEMS provide a comprehensive solution for interacting with NeuroML models in a Python environment.
Vella, Michael; Cannon, Robert C.; Crook, Sharon; Davison, Andrew P.; Ganapathy, Gautham; Robinson, Hugh P. C.; Silver, R. Angus; Gleeson, Padraig
2014-01-01
NeuroML is an XML-based model description language, which provides a powerful common data format for defining and exchanging models of neurons and neuronal networks. In the latest version of NeuroML, the structure and behavior of ion channel, synapse, cell, and network model descriptions are based on underlying definitions provided in LEMS, a domain-independent language for expressing hierarchical mathematical models of physical entities. While declarative approaches for describing models have led to greater exchange of model elements among software tools in computational neuroscience, a frequent criticism of XML-based languages is that they are difficult to work with directly. Here we describe two Application Programming Interfaces (APIs) written in Python (http://www.python.org), which simplify the process of developing and modifying models expressed in NeuroML and LEMS. The libNeuroML API provides a Python object model with a direct mapping to all NeuroML concepts defined by the NeuroML Schema, which facilitates reading and writing the XML equivalents. In addition, it offers a memory-efficient, array-based internal representation, which is useful for handling large-scale connectomics data. The libNeuroML API also includes support for performing common operations that are required when working with NeuroML documents. Access to the LEMS data model is provided by the PyLEMS API, which provides a Python implementation of the LEMS language, including the ability to simulate most models expressed in LEMS. Together, libNeuroML and PyLEMS provide a comprehensive solution for interacting with NeuroML models in a Python environment. PMID:24795618
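A minimal libNeuroML sketch follows, assuming the NeuroMLDocument and Network classes and the neuroml.writers module as exposed by the library; class and module names should be checked against the libNeuroML documentation.

    # Hedged sketch of building and serialising a small model with libNeuroML;
    # class/module names mirror the NeuroML schema but are to be verified.
    import neuroml
    import neuroml.writers as writers

    doc = neuroml.NeuroMLDocument(id="example_doc")
    net = neuroml.Network(id="example_net")
    doc.networks.append(net)             # attach the (empty) network to the document

    writers.NeuroMLWriter.write(doc, "example.nml")   # produce NeuroML 2 XML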
Multicomponent Pharmaceutical Cocrystals: A Novel Approach for Combination Therapy.
Fatima, Zeeshan; Srivastava, Dipti; Kaur, Chanchal Deep
2018-03-05
Cocrystallization is a technique for modifying the physicochemical and pharmacokinetic properties of an active pharmaceutical ingredient (API), embodying the concept of the supramolecular synthon. Most of the examples cited in the literature are of cocrystals formed between an API and a coformer chosen from the generally recognized as safe (GRAS) substance list; however, a few examples exist where a cocrystal consists of two or more APIs. These cocrystals are commonly known as multi-API, multi-drug or drug-drug cocrystals. The formation of such cocrystals is feasible by virtue of non-covalent interactions between the APIs, which help them retain their biological activity. In addition, drug-drug cocrystals also offer a potential solution to limitations such as solubility and stability differences and chemical interactions between the APIs, which are often faced during traditional combination therapy. Cocrystallization of two or more APIs can be employed for the delivery of combination drugs for better and more efficacious management of many complex disorders where existing monotherapies do not furnish the desired therapeutic effect. This review of existing drug-drug cocrystals aims to provide insight for the better design of multi-API cocrystals with improved physicochemical and pharmacokinetic profiles and their application in multiple-target therapy. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
SMART Platforms: Building the App Store for Biosurveillance
Mandl, Kenneth D.
2013-01-01
Objective To enable public health departments to develop “apps” to run on electronic health records (EHRs) for (1) biosurveillance and case reporting and (2) delivering alerts to the point of care. We describe a novel health information technology platform with substitutable apps constructed around core services enabling EHRs to function as iPhone-like platforms. Introduction Health care information is a fundamental source of data for biosurveillance, yet configuring EHRs to report relevant data to health departments is technically challenging, labor intensive, and often requires custom solutions for each installation. Public health agencies wishing to deliver alerts to clinicians also must engage in an endless array of one-off systems integrations. Despite a $48B investment in HIT, and meaningful use criteria requiring reporting to biosurveillance systems, most vendor electronic health records are architected monolithically, making modification difficult for hospitals and physician practices. An alternative approach is to reimagine EHRs as iPhone-like platforms supporting substitutable apps-based functionality. Substitutability is the capability inherent in a system of replacing one application with another of similar functionality. Methods Substitutability requires that the purchaser of an app can replace one application with another without being technically expert, without requiring re-engineering of other applications that they are using, and without having to consult or require the assistance of any of the vendors of previously or currently installed applications. Apps necessarily compete with each other, promoting progress and adaptability. The Substitutable Medical Applications, Reusable Technologies (SMART) Platforms project is funded by a $15M grant from the Office of the National Coordinator of Health Information Technology’s Strategic Health IT Advanced Research Projects (SHARP) Program. All SMART standards are open and the core software is open source. The SMART project promotes substitutability through an application programming interface (API) that can be adopted as part of a “container” built around a wide variety of HIT, providing read-only access to the underlying data model and a software development toolkit to readily create apps. SMART containers are HIT systems that have implemented the SMART API or a portion of it. Containers marshal data sources and present them consistently across the SMART API. SMART applications consume the API and are substitutable. Results SMART provides a common platform supporting an “app store for biosurveillance” as an approach to enabling one-stop shopping for public health departments: create an app once, and distribute it everywhere. Further, such apps can be readily updated or created; for example, in the case of an emerging infection, an app may be designed to collect additional data at emergency department triage. Or a public health department may widely distribute an app, interoperable with any SMART-enabled EMR, that delivers contextualized alerts when patient electronic records are opened, or through background processes. SMART has sparked an ecosystem of app developers and attracted existing health information technology platforms to adopt the SMART API, including traditional, open-source, and next-generation EHRs, patient-facing platforms and health information exchanges.
SMART-enabled platforms to date include the Cerner EMR, the WorldVista EHR, the OpenMRS EHR, the i2b2 analytic platform, and the Indivo X personal health record. The SMART team is working with the Mirth Corporation to SMART-enable the HealthBridge and Redwood MedNet Health Information Exchanges. We have demonstrated that a single SMART app can run, unmodified, in all of these environments, as long as the underlying platform collects the required data types. Major EHR vendors are currently adapting the SMART API for their products. Conclusions The SMART system enables nimble customization of any electronic health record system to create either a reporting function (outgoing communication) or an alerting function (incoming communication), establishing a technology for a robust linkage between public health and clinical environments.
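As a hedged illustration of the substitutable-app pattern described above, the Python sketch below shows an app consuming a container's read-only interface over HTTP; the host, endpoint path, token, and resource names are hypothetical placeholders, not the actual SMART API.

```python
# Hypothetical sketch of a substitutable biosurveillance app consuming a
# container's read-only API over HTTP. Host, path, and token are placeholders.
import requests

CONTAINER_URL = "https://ehr.example.org/smart"   # assumed container base URL
ACCESS_TOKEN = "example-token"                    # assumed bearer token

def recent_problems(patient_id):
    """Fetch a patient's problem list from whichever container hosts the app."""
    resp = requests.get(
        f"{CONTAINER_URL}/records/{patient_id}/problems",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    # The same call would work unchanged against any container exposing this
    # read-only interface, which is what substitutability requires.
    print(recent_problems("12345"))
```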
Krintz, Chandra
2013-01-01
AppScale is an open source distributed software system that implements a cloud platform as a service (PaaS). AppScale makes cloud applications easy to deploy and scale over disparate cloud fabrics, implementing a set of APIs and architecture that also makes apps portable across the services they employ. AppScale is API-compatible with Google App Engine (GAE) and thus executes GAE applications on-premise or over other cloud infrastructures, without modification. PMID:23828721
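As a hedged illustration of the portability claim, the sketch below is a minimal App Engine-style Python handler of the kind that, per the abstract, should run unchanged on GAE or on an AppScale deployment; it uses the webapp2 framework from the classic GAE Python runtime, and the route and message are illustrative only.

```python
# Minimal GAE-style handler (classic Python runtime, webapp2 framework).
# Per the abstract, an app like this should run unmodified on Google App
# Engine or on an AppScale deployment; the route and text are illustrative.
import webapp2

class MainPage(webapp2.RequestHandler):
    def get(self):
        self.response.headers["Content-Type"] = "text/plain"
        self.response.write("Hello from GAE or AppScale")

# The WSGI application object referenced from the app's configuration.
app = webapp2.WSGIApplication([("/", MainPage)], debug=True)
```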
Beginning the 21st century with advanced Automatic Parts Identification (API)
NASA Technical Reports Server (NTRS)
Schramm, Fred; Roxby, Don
1994-01-01
Under the direction of the NASA George C. Marshall Space Flight Center, Huntsville, Alabama, the development and commercialization of an advanced Automated Parts Identification (API) system is being undertaken by Rockwell International Corporation. The new API system is based on a variable-sized, machine-readable, two-dimensional matrix symbol that can be applied directly onto most metallic and nonmetallic materials using safe, permanent marking methods. Its checkerboard-like structure is the most space-efficient of all symbologies. This high data-density symbology can be applied to products of different material sizes and geometries using application-dependent, computer-driven marking devices. The high-fidelity markings produced by these devices can then be captured using a specially designed camera linked to any IBM-compatible computer. Applications of compressed symbology technology will reduce costs and improve quality, productivity, and processes in a wide variety of federal and commercial applications.
The Error Reporting in the ATLAS TDAQ System
NASA Astrophysics Data System (ADS)
Kolos, Serguei; Kazarov, Andrei; Papaevgeniou, Lykourgos
2015-05-01
The ATLAS Error Reporting provides a service that allows experts and shift crew to track and address errors relating to the data taking components and applications. This service, called the Error Reporting Service (ERS), gives software applications the opportunity to collect and send comprehensive data about run-time errors to a place where it can be intercepted in real-time by any other system component. Other ATLAS online control and monitoring tools use the ERS as one of their main inputs to address system problems in a timely manner and to improve the quality of acquired data. The actual destination of the error messages depends solely on the run-time environment in which the online applications are operating. When an application sends information to ERS, depending on the configuration, it may end up in a local file, a database, or distributed middleware, which can transport it to an expert system or display it to users. Thanks to the open framework design of ERS, new information destinations can be added at any moment without touching the reporting and receiving applications. The ERS Application Program Interface (API) is provided in three programming languages used in the ATLAS online environment: C++, Java and Python. All APIs use exceptions for error reporting but each of them exploits advanced features of a given language to simplify end-user program writing. For example, since C++ offers no built-in support for concisely declaring rich exception class hierarchies, a number of macros have been designed to generate hierarchies of C++ exception classes at compile time. Using this approach a software developer can write a single line of code to generate boilerplate code for a fully qualified C++ exception class declaration with an arbitrary number of parameters and multiple constructors, which encapsulates all relevant static information about the given type of issue. When a corresponding error occurs at run time, the program just needs to create an instance of that class, passing relevant values to one of the available class constructors, and send this instance to ERS. This paper presents the original design solutions exploited for the ERS implementation and describes how it was used during the first ATLAS run period. The cross-system error reporting standardization introduced by ERS was one of the key points for the successful implementation of automated mechanisms for online error recovery.
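The reporting pattern described above (declare an exception class once with the static description of an issue, instantiate it with run-time values, and hand it to the reporting service) can be sketched in Python as follows; the class and function names are hypothetical illustrations of the pattern, not the actual ERS API.

```python
# Hypothetical sketch of the ERS-style pattern: an exception class carries the
# static description of an issue, instances carry run-time values, and a
# reporting call routes the instance to whatever destination is configured.
class FileReadError(Exception):
    """Static issue description: a data file could not be read."""
    severity = "ERROR"

    def __init__(self, filename, reason):
        super().__init__(f"cannot read {filename}: {reason}")
        self.filename = filename
        self.reason = reason

def report(issue):
    # Stand-in for the real reporting service; the destination (log file,
    # database, expert system, operator display) would depend on configuration.
    print(f"[{issue.severity}] {issue}")

try:
    open("/data/run_12345.raw")
except OSError as exc:
    report(FileReadError("/data/run_12345.raw", str(exc)))
```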
Construct exploit constraint in crash analysis by bypassing canary
NASA Astrophysics Data System (ADS)
Huang, Ning; Huang, Shuguang; Huang, Hui; Chang, Chao
2017-08-01
Selective symbolic execution is a common program testing technology. Crash analysis systems developed on this basis, such as CRAX, are often used to test the fragility of a program by constructing exploit constraints. From a study of crash analysis based on symbolic execution, this paper finds that the technique cannot bypass the canary stack protection mechanism. This paper improves on it by using API hooking in Linux. Experimental results show that the use of API hooking can effectively solve the problem that crash analysis cannot bypass canary protection.
Zhang, Shuang; Yao, Feng; Jing, Ting; Zhang, Mengchen; Zhao, Wei; Zou, Xiangyang; Sui, Linlin; Hou, Lin
2017-09-10
During the embryonic development of Artemia sinica, the diapause phenomenon can be induced by high salinity or low temperature conditions. The diapause embryo at the gastrula stage is maintained under the threat of apoptosis to guarantee the embryo's normal development. In this process, apoptosis inhibitor proteins play vital roles in protecting embryos against apoptosis. Apoptosis inhibitor 5 (API5) plays a pivotal role in regulating the cell cycle and preventing programmed cell death after growth factor starvation. In the present study, we cloned the full-length cDNA representing the api5 gene from A. sinica (As-api5), which encodes a 372-amino acid protein. In situ hybridization experiments revealed that As-api5 expression is not tissue- or organ-specific. Quantitative real-time PCR analyses of the developmental expression of As-api5 showed that it reached its highest level at 10 h, after which its expression decreased. High salinity and low temperature treatments increased the expression of As-api5. Western blotting was used to assess the abundance of As-API5 and related proteins (As-CyclinA, As-CyclinE, As-E2F1, As-CDK2, As-APAF1, and As-Caspase9). Downregulation of As-api5 expression using a short interfering RNA resulted in increased mortality and embryo malformation of A. sinica. Taken together, the results indicated that API5 plays a crucial role in embryonic diapause termination and early embryo development of A. sinica. Copyright © 2017. Published by Elsevier B.V.
2010-01-01
Background The staphylococci are among the most common environmental isolates found in clean room facilities. Consequently, isolation followed by comprehensive and accurate identification is an essential step in any environmental monitoring program. Findings We used the API Staph identification kit (bioMérieux, France), which depends on the expression of metabolic activities and/or morphological features, to identify the Staphylococcus isolates. The API Staph kit showed low sensitivity in the identification of some species, so we applied molecular methods based on PCR fingerprinting of the glyceraldehyde-3-phosphate dehydrogenase encoding gene as a taxonomic tool for examining Staphylococcus isolates. Conclusions Our results showed that the PCR protocol used in this study, which depends on genotypic features, was relatively accurate, rapid, and sensitive, and was superior to API Staph, which depends on phenotypic features, in the identification of at least 7 Staphylococcus species. PMID:21047438
Sheraba, Norhan S; Yassin, Aymen S; Amin, Magdy A
2010-11-04
The staphylococci are among the most common environmental isolates found in clean room facilities. Consequently, isolation followed by comprehensive and accurate identification is an essential step in any environmental monitoring program. We used the API Staph identification kit (bioMérieux, France), which depends on the expression of metabolic activities and/or morphological features, to identify the Staphylococcus isolates. The API Staph kit showed low sensitivity in the identification of some species, so we applied molecular methods based on PCR fingerprinting of the glyceraldehyde-3-phosphate dehydrogenase encoding gene as a taxonomic tool for examining Staphylococcus isolates. Our results showed that the PCR protocol used in this study, which depends on genotypic features, was relatively accurate, rapid, and sensitive, and was superior to API Staph, which depends on phenotypic features, in the identification of at least 7 Staphylococcus species.
An application programming interface for extreme precipitation and hazard products
NASA Astrophysics Data System (ADS)
Kirschbaum, D.; Stanley, T.; Cappelaere, P. G.; Reed, J.; Lammers, M.
2016-12-01
Remote sensing data provides situational awareness of extreme events and hazards over large areas in a way that is impossible to achieve with in situ data. However, more valuable than raw data is actionable information based on user needs. This information can take the form of derived products, extraction of a subset of variables in a larger data matrix, or data processing for a specific goal. These products can then stream to the end users, who can use these data to improve local to global decision making. This presentation will outline both the science and methodology of two new data products and tools that can provide relevant climate and hazard data for response and support. The Global Precipitation Measurement (GPM) mission provides near real-time information on rain and snow around the world every thirty minutes. Through a new application programming interface (API), these data can be freely accessed by consumers to visualize, analyze, and communicate where, when, and how much rain is falling worldwide. The second tool is a global landslide model that provides situational awareness of potential landslide activity in near real-time, utilizing several remotely sensed data products. This hazard information is also provided through an API and is being ingested by the emergency response community, international aid organizations, and others around the world. This presentation will highlight lessons learned through the development, implementation, and communication of these products and tools with the goal of enabling better and more effective decision making.
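A hedged sketch of how such an API might be consumed from Python is shown below; the endpoint URL and parameter names are assumptions for illustration only, not the documented GPM or landslide-model interfaces.

```python
# Hedged sketch: querying a precipitation/hazard web API for a bounding box.
# The endpoint URL and parameter names are illustrative assumptions only.
import requests

API_URL = "https://example.nasa.gov/precipitation/api/v1/latest"  # assumed

def latest_precipitation(min_lat, min_lon, max_lat, max_lon):
    params = {
        "bbox": f"{min_lon},{min_lat},{max_lon},{max_lat}",
        "format": "json",
    }
    resp = requests.get(API_URL, params=params, timeout=30)
    resp.raise_for_status()
    return resp.json()  # e.g. gridded rain rates for decision support

if __name__ == "__main__":
    data = latest_precipitation(-25.0, 10.0, -15.0, 25.0)
    print(len(data.get("values", [])), "grid cells returned")
```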
Avdeef, Alex
2017-12-15
A novel general computational approach is described to address many aspects of cocrystal (CC) solubility product (Ksp) determination of drug substances. The CC analysis program, pDISOL-X, was developed and validated with published model systems of various acid-base combinations of active pharmaceutical ingredients (APIs) and coformers: (i) carbamazepine cocrystal systems with 4-aminobenzoic acid, cinnamic acid, saccharin, and salicylic acid, (ii) for indomethacin with saccharin, (iii) for nevirapine with maleic acid, saccharin, and salicylic acid, and (iv) for gabapentin with 3-hydroxybenzoic acid. In all systems but gabapentin, the coformer is much more soluble than the API. The model systems selected are those with available published dual concentration-pH data, one set for the API and one set for the coformer, generally measured at eutectic points (thermodynamically-stable three phases: solution, cocrystal, and crystalline API or coformer). The carbamazepine-cinnamic acid CC showed a substantial elevation in the API equilibrium concentration above pH 5, consistent with the formation of a complex between carbamazepine and cinnamate anion. The analysis of the gabapentin:3-hydroxybenzoic acid 1:1 CC system indicated four zones of solid suspensions: coformer (pH < 3.25), coformer and cocrystal eutectic (pH 3.25-4.44), cocrystal (pH 4.44-5.62), and API (pH > 5.62). The general approach allows for testing of many possible equilibrium models, including those comprising drug-coformer complexation. The program calculates the ionic strength at each pH. From this, the equilibrium constants are adjusted for activity effects, based on the Stokes-Robinson hydration theory. The complete speciation analysis of the CC systems may provide useful insights into pH-sensitive dissolution effects that could potentially influence bioavailability. Copyright © 2017 Elsevier B.V. All rights reserved.
Namibia Dashboard Enhancements
NASA Technical Reports Server (NTRS)
Mandl, Daniel; Handy, Matthew
2014-01-01
The purpose of this presentation is for a Technical Interchange Meeting with the Namibia Hydrological Services (NHS) in Namibia. The meeting serves as a capacity-building exercise. This presentation goes over existing software functionality, called the Namibia Flood Dashboard, developed in collaboration with NHS over the past five years. Furthermore, it outlines new functionality developed over the past year and future functionality that will be developed. The main purpose of the Dashboard is to assist in decision support for flood warning. The Namibia Flood Dashboard already exists online in a cloud environment and has been used in prototype mode for the past few years. Functionality in the Dashboard includes river gauge hydrographs, TRMM rainfall estimates, EO-1 flood maps, infrastructure maps and other related functions. Future functionality includes attempting to integrate interoperability standards and crowd-sourcing capability. To this end, we are adding OpenStreetMap compatibility and an Applications Program Interface (API) called a GeoSocial API to enable discovery and sharing of data products useful for decision support via social media.
A Data Services Upgrade for Advanced Composition Explorer (ACE) Data
NASA Astrophysics Data System (ADS)
Davis, A. J.; Hamell, G.
2008-12-01
Since early in 1998, NASA's Advanced Composition Explorer (ACE) spacecraft has provided continuous measurements of solar wind, interplanetary magnetic field, and energetic particle activity from L1, located approximately 0.01 AU sunward of Earth. The spacecraft has enough fuel to stay in orbit about L1 until ~2024. The ACE Science Center (ASC) provides access to ACE data, and performs level 1 and browse data processing for the science instruments. Thanks to a NASA Data Services Upgrade grant, we have recently retooled our legacy web interface to ACE data, enhancing data subsetting capabilities and improving online plotting options. We have also integrated a new application programming interface (API) and we are working to ensure that it will be compatible with emerging Virtual Observatory (VO) data services standards. The new API makes extensive use of metadata created using the Space Physics Archive Search and Extract (SPASE) data model. We describe these recent improvements to the ACE Science Center data services, and our plans for integrating these services into the VO system.
NASA Astrophysics Data System (ADS)
Marco Figuera, R.; Pham Huu, B.; Rossi, A. P.; Minin, M.; Flahaut, J.; Halder, A.
2018-01-01
The lack of open-source tools for hyperspectral data visualization and analysis creates a demand for new tools. In this paper we present the new PlanetServer, a set of tools comprising a web Geographic Information System (GIS) and a recently developed Python Application Programming Interface (API) capable of visualizing and analyzing a wide variety of hyperspectral data from different planetary bodies. Current open-source WebGIS tools are evaluated in order to give an overview and to contextualize how PlanetServer can help in these matters. The web client is thoroughly described, as are the datasets available in PlanetServer. The Python API is also described, along with the reasons for its development. Two examples of mineral characterization of hydrosilicates such as chlorites, prehnites and kaolinites in the Nili Fossae area on Mars are presented. As the obtained results show a positive outcome for hyperspectral analysis and visualization compared with the previous literature, we suggest using the PlanetServer approach for such investigations.
Fortran interface layer of the framework for developing particle simulator FDPS
NASA Astrophysics Data System (ADS)
Namekata, Daisuke; Iwasawa, Masaki; Nitadori, Keigo; Tanikawa, Ataru; Muranushi, Takayuki; Wang, Long; Hosono, Natsuki; Nomura, Kentaro; Makino, Junichiro
2018-06-01
Numerical simulations based on particle methods have been widely used in various fields including astrophysics. To date, various versions of simulation software have been developed by individual researchers or research groups in each field, through a huge amount of time and effort, even though the numerical algorithms used are very similar. To improve the situation, we have developed a framework, called FDPS (Framework for Developing Particle Simulators), which enables researchers to develop massively parallel particle simulation codes for arbitrary particle methods easily. Until version 3.0, FDPS provided an API (application programming interface) for the C++ programming language only. This limitation comes from the fact that FDPS is developed using the template feature of C++, which is essential for supporting arbitrary particle data types. However, many researchers use Fortran to develop their codes, and the previous versions of FDPS required such people to invest much time in learning C++, which is inefficient. To cope with this problem, we developed a Fortran interface layer in FDPS, which provides an API for Fortran. In order to support arbitrary particle data types in Fortran, we designed the Fortran interface layer as follows. Based on a given Fortran derived data type representing a particle, a Python script provided by us automatically generates a library that manipulates the C++ core part of FDPS. From the Fortran side this library is seen as a Fortran module providing the FDPS API, and it uses C programs internally to interoperate Fortran with C++. In this way, we have overcome several technical issues encountered when emulating a `template' in Fortran. Using the Fortran interface, users can develop all parts of their codes in Fortran. We show that the overhead of the Fortran interface part is sufficiently small and that a code written in Fortran shows performance practically identical to that of one written in C++.
Working with HITRAN Database Using Hapi: HITRAN Application Programming Interface
NASA Astrophysics Data System (ADS)
Kochanov, Roman V.; Hill, Christian; Wcislo, Piotr; Gordon, Iouli E.; Rothman, Laurence S.; Wilzewski, Jonas
2015-06-01
A HITRAN Application Programming Interface (HAPI) has been developed to allow users on their local machines much more flexibility and power. HAPI is a programming interface for the main data-searching capabilities of the new "HITRANonline" web service (http://www.hitran.org). It provides the possibility to query spectroscopic data from the HITRAN database in a flexible manner using either functions or a query language. Some of the prominent current features of HAPI are: a) downloading line-by-line data from the HITRANonline site to a local machine; b) filtering and processing the data in an SQL-like fashion; c) conventional Python structures (lists, tuples, and dictionaries) for representing spectroscopic data; d) the possibility to use a large set of third-party Python libraries to work with the data; e) a Python implementation of the HT lineshape, which can be reduced to a number of conventional line profiles; f) a Python implementation of total internal partition sums (TIPS-2011) for spectra simulations; g) high-resolution spectra calculation accounting for pressure, temperature and optical path length; h) instrumental functions to simulate experimental spectra; i) the possibility to extend HAPI's functionality with custom line profiles, partition sums and instrumental functions. Currently the API is a module written in Python and uses the NumPy library, providing fast array operations. The API is designed to deal with data in multiple formats such as ASCII, CSV, HDF5 and XSAMS. This work has been supported by NASA Aura Science Team Grant NNX14AI55G and NASA Planetary Atmospheres Grant NNX13AI59G. L.S. Rothman et al., JQSRT, Volume 130, 2013, Pages 4-50; N.H. Ngo et al., JQSRT, Volume 129, November 2013, Pages 89-100; A.L. Laraia et al., Icarus, Volume 215, Issue 1, September 2011, Pages 391-400.
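A minimal usage sketch, following the documented HAPI workflow of downloading lines for a molecule into a local table and then computing an absorption spectrum, is given below; exact function signatures may differ between HAPI versions.

```python
# Minimal HAPI workflow sketch: fetch HITRAN lines into a local table, then
# compute an absorption coefficient. Calls follow the documented usage pattern
# and may differ slightly between HAPI versions.
from hapi import db_begin, fetch, absorptionCoefficient_Lorentz

db_begin("hitran_data")            # local folder holding downloaded tables
fetch("H2O", 1, 1, 3400, 4100)     # H2O: molecule 1, isotopologue 1, 3400-4100 cm-1

nu, coef = absorptionCoefficient_Lorentz(
    SourceTables="H2O",
    Environment={"T": 296.0, "p": 1.0},   # temperature (K) and pressure (atm)
)
print(nu[:5], coef[:5])
```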
Swan: A tool for porting CUDA programs to OpenCL
NASA Astrophysics Data System (ADS)
Harvey, M. J.; De Fabritiis, G.
2011-04-01
The use of modern, high-performance graphical processing units (GPUs) for acceleration of scientific computation has been widely reported. The majority of this work has used the CUDA programming model supported exclusively by GPUs manufactured by NVIDIA. An industry standardisation effort has recently produced the OpenCL specification for GPU programming. This offers the benefits of hardware-independence and reduced dependence on proprietary tool-chains. Here we describe a source-to-source translation tool, "Swan", for facilitating the conversion of an existing CUDA code to use the OpenCL model, as a means to aid programmers experienced with CUDA in evaluating OpenCL and alternative hardware. While the performance of equivalent OpenCL and CUDA code on fixed hardware should be comparable, we find that a real-world CUDA application ported to OpenCL exhibits an overall 50% increase in runtime, a reduction in performance attributable to the immaturity of contemporary compilers. The ported application is shown to have platform independence, running on both NVIDIA and AMD GPUs without modification. We conclude that OpenCL is a viable platform for developing portable GPU applications but that the more mature CUDA tools continue to provide best performance. Program summary: Program title: Swan. Catalogue identifier: AEIH_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEIH_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU Public License version 2. No. of lines in distributed program, including test data, etc.: 17 736. No. of bytes in distributed program, including test data, etc.: 131 177. Distribution format: tar.gz. Programming language: C. Computer: PC. Operating system: Linux. RAM: 256 Mbytes. Classification: 6.5. External routines: NVIDIA CUDA, OpenCL. Nature of problem: Graphical Processing Units (GPUs) from NVIDIA are preferentially programmed with the proprietary CUDA programming toolkit. An alternative programming model promoted as an industry standard, OpenCL, provides similar capabilities to CUDA and is also supported on non-NVIDIA hardware (including multicore x86 CPUs, AMD GPUs and IBM Cell processors). The adaptation of a program from CUDA to OpenCL is relatively straightforward but laborious. The Swan tool facilitates this conversion. Solution method: Swan performs a translation of CUDA kernel source code into an OpenCL equivalent. It also generates the C source code for entry point functions, simplifying kernel invocation from the host program. A concise host-side API abstracts the CUDA and OpenCL APIs. A program adapted to use Swan has no dependency on the CUDA compiler for the host-side program. The converted program may be built for either CUDA or OpenCL, with the selection made at compile time. Restrictions: No support for CUDA C++ features. Running time: Nominal.
Judicious use of custom development in an open source component architecture
NASA Astrophysics Data System (ADS)
Bristol, S.; Latysh, N.; Long, D.; Tekell, S.; Allen, J.
2014-12-01
Modern software engineering is not as much programming from scratch as innovative assembly of existing components. Seamlessly integrating disparate components into scalable, performant architecture requires sound engineering craftsmanship and can often result in increased cost efficiency and accelerated capabilities if software teams focus their creativity on the edges of the problem space. ScienceBase is part of the U.S. Geological Survey scientific cyberinfrastructure, providing data and information management, distribution services, and analysis capabilities in a way that strives to follow this pattern. ScienceBase leverages open source NoSQL and relational databases, search indexing technology, spatial service engines, numerous libraries, and one proprietary but necessary software component in its architecture. The primary engineering focus is cohesive component interaction, including construction of a seamless Application Programming Interface (API) across all elements. The API allows researchers and software developers alike to leverage the infrastructure in unique, creative ways. Scaling the ScienceBase architecture and core API with increasing data volume (more databases) and complexity (integrated science problems) is a primary challenge addressed by judicious use of custom development in the component architecture. Other data management and informatics activities in the earth sciences have independently resolved to a similar design of reusing and building upon established technology and are working through similar issues for managing and developing information (e.g., U.S. Geoscience Information Network; NASA's Earth Observing System Clearing House; GSToRE at the University of New Mexico). Recent discussions facilitated through the Earth Science Information Partners are exploring potential avenues to exploit the implicit relationships between similar projects for explicit gains in our ability to more rapidly advance global scientific cyberinfrastructure.
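As a hedged example of the programmatic use the abstract describes, the snippet below runs a keyword search against the ScienceBase catalog over its REST interface; the endpoint and parameter names follow the public catalog URL pattern but should be treated as assumptions here.

```python
# Hedged sketch: keyword search against the ScienceBase catalog REST interface.
# The endpoint and parameters are assumptions based on the public catalog URL.
import requests

def search_sciencebase(keyword, max_items=5):
    resp = requests.get(
        "https://www.sciencebase.gov/catalog/items",   # assumed endpoint
        params={"q": keyword, "format": "json", "max": max_items},
        timeout=30,
    )
    resp.raise_for_status()
    return [item.get("title") for item in resp.json().get("items", [])]

if __name__ == "__main__":
    for title in search_sciencebase("streamflow"):
        print(title)
```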
Oceans 2.0 API: Programmatic access to Ocean Networks Canada's sensor data.
NASA Astrophysics Data System (ADS)
Heesemann, M.; Ross, R.; Hoeberechts, M.; Pirenne, B.; MacArthur, M.; Jeffries, M. A.; Morley, M. G.
2017-12-01
Ocean Networks Canada (ONC) is a not-for-profit society that operates and manages innovative cabled observatories on behalf of the University of Victoria. These observatories supply continuous power and Internet connectivity to various scientific instruments located in coastal, deep-ocean and Arctic environments. The data from the instruments are relayed to the University of Victoria where they are archived, quality-controlled and made freely available to researchers, educators, and the public. The Oceans 2.0 data management system currently contains over 500 terabytes of data collected over 11 years from thousands of sensors. In order to facilitate access to the data, particularly for large datasets and long time series of high-resolution data, a project was started in 2016 to create a comprehensive Application Programming Interface, the "Oceans 2.0 API," to provide programmatic access to all ONC data products. The development is part of a project entitled "A Research Platform for User-Defined Oceanographic Data Products," funded through CANARIE, a Canadian organization responsible for the design and delivery of digital infrastructure for research, education and innovation [1]. Providing quick and easy access to ONC Data Products from within custom software solutions allows researchers, modelers and decision makers to focus on what is important: solving their problems, answering their questions and making informed decisions. In this paper, we discuss how to access ONC's vast archive of data programmatically, through the Oceans 2.0 API. In particular we discuss the following: access to ONC data products; access to ONC sensor data in near real-time; programming language support; and use cases. References: [1] CANARIE. Internet: https://www.canarie.ca/; accessed March 6, 2017.
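A hedged sketch of a token-authenticated request to an Oceans 2.0-style web service is shown below; the base URL, method, and parameter names are assumptions for illustration, and a real request requires a personal API token from ONC.

```python
# Hedged sketch of programmatic access to an Oceans 2.0-style web service.
# The base URL, service name, and parameters below are assumptions; a real
# request requires a personal API token issued by ONC.
import requests

BASE_URL = "https://data.oceannetworks.ca/api/scalardata"  # assumed service
TOKEN = "YOUR_ONC_TOKEN"                                   # placeholder

def get_scalar_data(location_code, device_category, date_from, date_to):
    params = {
        "method": "getByLocation",        # assumed method name
        "token": TOKEN,
        "locationCode": location_code,
        "deviceCategoryCode": device_category,
        "dateFrom": date_from,
        "dateTo": date_to,
    }
    resp = requests.get(BASE_URL, params=params, timeout=60)
    resp.raise_for_status()
    return resp.json()

# Example (hypothetical codes): one day of CTD data at a cabled location.
# data = get_scalar_data("BACAX", "CTD", "2017-06-01T00:00:00.000Z",
#                        "2017-06-02T00:00:00.000Z")
```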
Nowotka, Michał M; Gaulton, Anna; Mendez, David; Bento, A Patricia; Hersey, Anne; Leach, Andrew
2017-08-01
ChEMBL is a manually curated database of bioactivity data on small drug-like molecules, used by drug discovery scientists. Among many access methods, a REST API provides programmatic access, allowing the remote retrieval of ChEMBL data and its integration into other applications. This approach allows scientists to move from a world where they go to the ChEMBL web site to search for relevant data, to one where ChEMBL data can be simply integrated into their everyday tools and work environment. Areas covered: This review highlights some of the audiences who may benefit from using the ChEMBL API, and the goals they can address, through the description of several use cases. The examples cover a team communication tool (Slack), a data analytics platform (KNIME), batch job management software (Luigi) and Rich Internet Applications. Expert opinion: The advent of web technologies, cloud computing and microservices-oriented architectures has made REST APIs an essential ingredient of modern software development models. The widespread availability of tools consuming RESTful resources has made them useful for many groups of users. The ChEMBL API is a valuable resource of drug discovery bioactivity data for professional chemists, chemistry students, data scientists, scientific and web developers.
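As a hedged example of the kind of integration the review describes, the snippet below retrieves a compound record over the ChEMBL REST interface from a plain Python script; the URL follows ChEMBL's documented REST scheme, and the response fields used should be treated as assumptions.

```python
# Hedged sketch: retrieving a compound record from the ChEMBL REST API and
# using it inside an everyday tool (here, just a script). The URL follows
# ChEMBL's documented REST scheme; verify exact field names against it.
import requests

def get_molecule(chembl_id):
    url = f"https://www.ebi.ac.uk/chembl/api/data/molecule/{chembl_id}.json"
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    mol = get_molecule("CHEMBL25")   # aspirin
    props = mol.get("molecule_properties", {})
    print(mol.get("pref_name"), props.get("full_mwt"))
```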
Motion Adaptation, its Role in Motion Detection Under Natural Image Conditions and Target Detection
2005-06-02
Ibbotson, M.R. & Goodman, L.J. (1990) “Response characteristics of four wide-field motion sensitive descending interneurons in Apis mellifera,” J. Exp...libraries (in particular a module, PyGame, originally designed as an API for computer games applications). Andrew’s contribution to this effort was a
CernVM WebAPI - Controlling Virtual Machines from the Web
NASA Astrophysics Data System (ADS)
Charalampidis, I.; Berzano, D.; Blomer, J.; Buncic, P.; Ganis, G.; Meusel, R.; Segal, B.
2015-12-01
Lately, there is a trend in scientific projects to look for computing resources in the volunteering community. In addition, to reduce the development effort required to port the scientific software stack to all the known platforms, the use of Virtual Machines (VMs) is becoming increasingly popular. Unfortunately, their use further complicates software installation and operation, restricting the volunteer audience to sufficiently expert people. CernVM WebAPI is a software solution addressing this specific case in a way that opens up wide new application opportunities. It offers a very simple API for setting up, controlling and interfacing with a VM instance on the user's computer, while at the same time relieving the user of the burden of downloading, installing and configuring the hypervisor. WebAPI comes with a lightweight JavaScript library that guides the user through the application installation process. Malicious usage is prohibited by offering a per-domain PKI validation mechanism. In this contribution we will overview this new technology, discuss its security features and examine some test cases where it is already in use.
DOE Office of Scientific and Technical Information (OSTI.GOV)
The open source Project Haystack initiative defines metadata and communication standards related to data from buildings and intelligent devices. The Project Haystack REST API defines standard formats and operations for exchanging Haystack-tagged data over HTTP. The HaystackRuby gem wraps calls to this REST API to enable Ruby applications to easily integrate data hosted on a Project Haystack-compliant server. The HaystackRuby gem was developed at the National Renewable Energy Lab to support applications related to campus energy. We hope that this tool may be useful to others.
The Future of Healthcare Informatics: It Is Not What You Think
2012-01-01
Electronic health records (EHRs) offer many valuable benefits for patient safety, but it becomes apparent that the effective application of healthcare informatics creates problems and unintended consequences. One problem that seems particularly challenging is integration. Painfully missing are low-cost, easy to implement, plug-and-play, nonintrusive integration solutions—healthcare's “killer app.” Why is this? We must stop confusing application integration with information integration. Our goal must be to communicate data (ie, integrate information), not to integrate application functionality via complex and expensive application program interfaces (APIs). Communicating data simply requires a loosely coupled flow of data, as occurs today via email. In contrast, integration is a chief information officer's nightmare. Integrating applications, when we just wanted a bit of information, is akin to killing a gnat with a brick. PMID:24278826
Pulido, M J; Alvarado, E A; Berger, W; Nelson, A; Todoroff, C
2001-01-01
Hepatitis B virus (HBV) is a known cause of liver cancer, especially among Asian and Pacific Islanders (API). Despite national recommendations and school entry requirements for vaccination, many children are not fully vaccinated with the Hepatitis B vaccine (Hep B) before entering school. The purpose of this study was to measure ethnic group-specific hepatitis B vaccination rates among school-aged API children after implementation of universal recommendations and school laws, and to quantify ethnic-specific risk factors associated with late and incomplete vaccinations. A multilingual questionnaire was distributed to parents of second and fourth graders in nine Los Angeles County (LAC) elementary schools with high proportions of API students. Data on Hepatitis B vaccination dates, source of health care and health information, cultural factors, and general knowledge and attitudes about HBV and vaccination were collected and analyzed. Overall, 1,696 (77%) of 2,183 questionnaires were returned. Of these, 1,024 were from API children. The API second graders in this survey had a 72% coverage rate, ranging from 46% to 94% among the individual ethnic groups. Fifty-one percent of API fourth graders had three doses of Hep B vaccine, ranging from 38% to 69% among the individual ethnic groups. Factors influencing coverage levels among API fourth graders were speaking limited English at home, living in the United States for less than five years, and not having discussed hepatitis B vaccination with a health care provider. Factors influencing low immunization levels differed among the API ethnic groups. Analysis and intervention on a non-aggregate level are necessary for designing both effective and culturally specific outreach programs for diverse API communities such as LAC's.
ERIC Educational Resources Information Center
DeCiccio, Albert; Kenny, Tammy; Lippacher, Linda; Flanary, Barry
2011-01-01
Many first-year students interested in healthcare careers do not succeed in Anatomy and Physiology I (A&PI), which they take in their first semester. These first-year students withdraw from the course or the institution, or their final grade may be below the identified threshold for progressing in their programs. A&PI has become a…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Laros, James H.; Grant, Ryan; Levenhagen, Michael J.
Measuring and controlling the power and energy consumption of high performance computing systems by various components in the software stack is an active research area. Implementations in lower level software layers are beginning to emerge in some production systems, which is very welcome. To be most effective, a portable interface to measurement and control features would significantly facilitate participation by all levels of the software stack. We present a proposal for a standard power Application Programming Interface (API) that endeavors to cover the entire software space, from generic hardware interfaces to the input from the computer facility manager.
NASA Astrophysics Data System (ADS)
Mann, Christopher; Narasimhamurthi, Natarajan
1998-08-01
This paper discusses a specific implementation of a web- and component-based simulation system. The overall simulation container is implemented within a web page viewed with Microsoft's Internet Explorer 4.0 web browser. Microsoft's ActiveX/Distributed Component Object Model (DCOM) object interfaces are used in conjunction with the Microsoft DirectX graphics APIs to provide visualization functionality for the simulation. The MathWorks' Matlab computer-aided control system design program is used as an ActiveX automation server to provide the compute engine for the simulations.
Achieving High Performance With TCP Over 40 GbE on NUMA Architectures for CMS Data Acquisition
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bawej, Tomasz; et al.
2014-01-01
TCP and the socket abstraction have barely changed over the last two decades, but at the network layer there has been a giant leap from a few megabits to 100 gigabits in bandwidth. At the same time, CPU architectures have evolved into the multicore era and applications are expected to make full use of all available resources. Applications in the data acquisition domain based on the standard socket library running in a Non-Uniform Memory Access (NUMA) architecture are unable to reach full efficiency and scalability without the software being adequately aware of the IRQ (Interrupt Request), CPU and memory affinities. During the first long shutdown of the LHC, the CMS DAQ system is going to be upgraded for operation from 2015 onwards and a new software component has been designed and developed in the CMS online framework for transferring data with sockets. This software attempts to wrap the low-level socket library to ease higher-level programming with an API based on an asynchronous event-driven model similar to the DAT uDAPL API. It is an event-based application with NUMA optimizations that allows for a high throughput of data across a large distributed system. This paper describes the architecture, the technologies involved and the performance measurements of the software in the context of the CMS distributed event building.
OpenFDA: an innovative platform providing access to a wealth of FDA's publicly available data.
Kass-Hout, Taha A; Xu, Zhiheng; Mohebbi, Matthew; Nelsen, Hans; Baker, Adam; Levine, Jonathan; Johanson, Elaine; Bright, Roselie A
2016-05-01
The objective of openFDA is to facilitate access and use of big important Food and Drug Administration public datasets by developers, researchers, and the public through harmonization of data across disparate FDA datasets provided via application programming interfaces (APIs). Using cutting-edge technologies deployed on FDA's new public cloud computing infrastructure, openFDA provides open data for easier, faster (over 300 requests per second per process), and better access to FDA datasets; open source code and documentation shared on GitHub for open community contributions of examples, apps and ideas; and infrastructure that can be adopted for other public health big data challenges. Since its launch on June 2, 2014, openFDA has developed four APIs for drug and device adverse events, recall information for all FDA-regulated products, and drug labeling. There have been more than 20 million API calls (more than half from outside the United States), 6000 registered users, 20,000 connected Internet Protocol addresses, and dozens of new software (mobile or web) apps developed. A case study demonstrates a use of openFDA data to understand an apparent association of a drug with an adverse event. With easier and faster access to these datasets, consumers worldwide can learn more about FDA-regulated products. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved.
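A hedged sketch of an openFDA query from Python is given below; the endpoint and query syntax follow openFDA's public documentation, but the specific field names should be verified against it.

```python
# Hedged sketch: counting adverse-event reports for a drug via the openFDA
# drug/event endpoint. The URL and query syntax follow openFDA's public
# documentation; exact field names should be verified against it.
import requests

def adverse_event_count(drug_name):
    resp = requests.get(
        "https://api.fda.gov/drug/event.json",
        params={
            "search": f'patient.drug.medicinalproduct:"{drug_name}"',
            "limit": 1,
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["meta"]["results"]["total"]

if __name__ == "__main__":
    print("reports mentioning aspirin:", adverse_event_count("aspirin"))
```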
NASA Astrophysics Data System (ADS)
Fatland, R.; Tan, A.; Arendt, A. A.
2016-12-01
We describe a Python-based implementation of a PostgreSQL database accessed through an Application Programming Interface (API) hosted on the Amazon Web Services public cloud. The data is geospatial and concerns hydrological model results in the glaciated catchment basins of southcentral and southeast Alaska. This implementation, however, is intended to be generalized to other forms of geophysical data, particularly data that is intended to be shared across a collaborative team or publicly. An example (moderate-size) dataset is provided together with the code base and a complete installation tutorial on GitHub. An enthusiastic scientist with some familiarity with software installation can replicate the example system in two hours. This installation includes database, API, a test Client and a supporting Jupyter Notebook, specifically oriented towards Python 3 and markup text to comprise an executable paper. The installation 'on the cloud' often engenders discussion and consideration of cloud cost and safety. By treating the process as somewhat "cookbook" we hope to first demonstrate the feasibility of the proposition. A discussion of cost and data security is provided in this presentation and in the accompanying tutorial/documentation. This geospatial data system case study is part of a larger effort at the University of Washington to enable research teams to take advantage of the public cloud to meet challenges in data management and analysis.
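A minimal sketch of the pattern described above (a small Python web API in front of a PostgreSQL store of model results) is shown below, assuming Flask and the psycopg2 driver; the connection string, table, and column names are hypothetical.

```python
# Minimal sketch of a Python API over PostgreSQL, in the spirit of the system
# described above. Flask and psycopg2 are assumed; the table and column names
# (catchment_runoff, basin_id, date, runoff_mm) are hypothetical.
import psycopg2
from flask import Flask, jsonify

app = Flask(__name__)
DSN = "dbname=hydro user=reader password=secret host=localhost"  # placeholder

@app.route("/api/v1/runoff/<basin_id>")
def runoff(basin_id):
    with psycopg2.connect(DSN) as conn, conn.cursor() as cur:
        cur.execute(
            "SELECT date, runoff_mm FROM catchment_runoff "
            "WHERE basin_id = %s ORDER BY date",
            (basin_id,),
        )
        rows = [{"date": str(d), "runoff_mm": float(r)} for d, r in cur.fetchall()]
    return jsonify(rows)

if __name__ == "__main__":
    app.run(port=5000)
```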
OpenFDA: an innovative platform providing access to a wealth of FDA’s publicly available data
Kass-Hout, Taha A; Mohebbi, Matthew; Nelsen, Hans; Baker, Adam; Levine, Jonathan; Johanson, Elaine; Bright, Roselie A
2016-01-01
Objective The objective of openFDA is to facilitate access and use of big important Food and Drug Administration public datasets by developers, researchers, and the public through harmonization of data across disparate FDA datasets provided via application programming interfaces (APIs). Materials and Methods Using cutting-edge technologies deployed on FDA’s new public cloud computing infrastructure, openFDA provides open data for easier, faster (over 300 requests per second per process), and better access to FDA datasets; open source code and documentation shared on GitHub for open community contributions of examples, apps and ideas; and infrastructure that can be adopted for other public health big data challenges. Results Since its launch on June 2, 2014, openFDA has developed four APIs for drug and device adverse events, recall information for all FDA-regulated products, and drug labeling. There have been more than 20 million API calls (more than half from outside the United States), 6000 registered users, 20,000 connected Internet Protocol addresses, and dozens of new software (mobile or web) apps developed. A case study demonstrates a use of openFDA data to understand an apparent association of a drug with an adverse event. Conclusion With easier and faster access to these datasets, consumers worldwide can learn more about FDA-regulated products. PMID:26644398
Earth Science Mobile App Development for Non-Programmers
NASA Astrophysics Data System (ADS)
Oostra, D.; Crecelius, S.; Lewis, P.; Chambers, L. H.
2012-08-01
A number of cloud based visual development tools have emerged that provide methods for developing mobile applications quickly and without previous programming experience. The MY NASA DATA (MND) team would like to begin a discussion on how we can best leverage current mobile app technologies and available Earth science datasets. The MY NASA DATA team is developing an approach based on two main ideas. The first is to teach our constituents how to create mobile applications that interact with NASA datasets; the second is to provide web services or Application Programming Interfaces (APIs) that create sources of data that educators, students and scientists can use in their own mobile app development. This framework allows data providers to foster mobile application development and interaction while not becoming a software clearing house. MY NASA DATA's research has included meetings with local data providers, educators, libraries and individuals. A high level of interest has been identified from initial discussions and interviews. This overt interest combined with the marked popularity of mobile applications in our societies has created a new channel for outreach and communications with and between the science and educational communities.
CISUS: an integrated 3D ultrasound system for IGT using a modular tracking API
NASA Astrophysics Data System (ADS)
Boctor, Emad M.; Viswanathan, Anand; Pieper, Steve; Choti, Michael A.; Taylor, Russell H.; Kikinis, Ron; Fichtinger, Gabor
2004-05-01
Ultrasound has become popular in clinical/surgical applications, both as the primary image guidance modality and also in conjunction with other modalities like CT or MRI. Three dimensional ultrasound (3DUS) systems have also demonstrated usefulness in image-guided therapy (IGT). At the same time, however, current lack of open-source and open-architecture multi-modal medical visualization systems prevents 3DUS from fulfilling its potential. Several stand-alone 3DUS systems, like Stradx or In-Vivo exist today. Although these systems have been found to be useful in real clinical setting, it is difficult to augment their functionality and integrate them in versatile IGT systems. To address these limitations, a robotic/freehand 3DUS open environment (CISUS) is being integrated into the 3D Slicer, an open-source research tool developed for medical image analysis and surgical planning. In addition, the system capitalizes on generic application programming interfaces (APIs) for tracking devices and robotic control. The resulting platform-independent open-source system may serve as a valuable tool to the image guided surgery community. Other researchers could straightforwardly integrate the generic CISUS system along with other functionalities (i.e. dual view visualization, registration, real-time tracking, segmentation, etc) to rapidly create their medical/surgical applications. Our current driving clinical application is robotically assisted and freehand 3DUS-guided liver ablation, which is fully being integrated under the CISUS-3D Slicer. Initial functionality and pre-clinical feasibility are demonstrated on phantom and ex-vivo animal models.
TCIApathfinder: an R client for The Cancer Imaging Archive REST API.
Russell, Pamela; Fountain, Kelly; Wolverton, Dulcy; Ghosh, Debashis
2018-06-05
The Cancer Imaging Archive (TCIA) hosts publicly available de-identified medical images of cancer from over 25 body sites and over 30,000 patients. Over 400 published studies have utilized freely available TCIA images. Images and metadata are available for download through a web interface or a REST API. Here we present TCIApathfinder, an R client for the TCIA REST API. TCIApathfinder wraps API access in user-friendly R functions that can be called interactively within an R session or easily incorporated into scripts. Functions are provided to explore the contents of the large database and to download image files. TCIApathfinder provides easy access to TCIA resources in the highly popular R programming environment. TCIApathfinder is freely available under the MIT license as a package on CRAN (https://cran.r-project.org/web/packages/TCIApathfinder/index.html) and at https://github.com/pamelarussell/TCIApathfinder. Copyright ©2018, American Association for Cancer Research.
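As a hedged illustration of the underlying service that TCIApathfinder wraps in R, the Python snippet below calls the TCIA REST interface directly; the base URL and resource name follow TCIA's published query interface but should be treated as assumptions here.

```python
# Hedged sketch: calling the TCIA REST API directly over HTTP (TCIApathfinder
# wraps the same API in R). The base URL and resource name follow TCIA's
# published query interface but may need to be verified.
import requests

BASE = "https://services.cancerimagingarchive.net/services/v4/TCIA/query"  # assumed

def list_collections():
    resp = requests.get(f"{BASE}/getCollectionValues",
                        params={"format": "json"}, timeout=30)
    resp.raise_for_status()
    return [entry["Collection"] for entry in resp.json()]

if __name__ == "__main__":
    print(list_collections()[:10])
```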
DeepBlue epigenomic data server: programmatic data retrieval and analysis of epigenome region sets
Albrecht, Felipe; List, Markus; Bock, Christoph; Lengauer, Thomas
2016-01-01
Large amounts of epigenomic data are generated under the umbrella of the International Human Epigenome Consortium, which aims to establish 1000 reference epigenomes within the next few years. These data have the potential to unravel the complexity of epigenomic regulation. However, their effective use is hindered by the lack of flexible and easy-to-use methods for data retrieval. Extracting region sets of interest is a cumbersome task that involves several manual steps: identifying the relevant experiments, downloading the corresponding data files and filtering the region sets of interest. Here we present the DeepBlue Epigenomic Data Server, which streamlines epigenomic data analysis as well as software development. DeepBlue provides a comprehensive programmatic interface for finding, selecting, filtering, summarizing and downloading region sets. It contains data from four major epigenome projects, namely ENCODE, ROADMAP, BLUEPRINT and DEEP. DeepBlue comes with a user manual, examples and a well-documented application programming interface (API). The latter is accessed via the XML-RPC protocol supported by many programming languages. To demonstrate usage of the API and to enable convenient data retrieval for non-programmers, we offer an optional web interface. DeepBlue can be openly accessed at http://deepblue.mpi-inf.mpg.de. PMID:27084938
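A hedged sketch of programmatic access to DeepBlue from Python over XML-RPC is shown below; the endpoint path, the anonymous user key, and the (status, result) return convention are assumptions drawn from the project's documentation and may need adjustment.

```python
# Hedged sketch: talking to the DeepBlue server over XML-RPC from Python.
# The endpoint path, anonymous user key, and (status, result) convention are
# assumptions based on the project's documentation.
import xmlrpc.client

URL = "http://deepblue.mpi-inf.mpg.de/xmlrpc"   # assumed XML-RPC endpoint
USER_KEY = "anonymous_key"                      # assumed public/anonymous key

server = xmlrpc.client.ServerProxy(URL, allow_none=True)

# Commands are assumed to return a (status, result) pair.
status, genomes = server.list_genomes(USER_KEY)
if status == "okay":
    for genome_id, name in genomes:
        print(genome_id, name)
```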
2016-12-01
branches of our work. 3.1 Understanding Sensitive API Call and API Information Usage. Android applications are written in a type-safe language (Java) ... directly invoke resolved targets. Because DroidSafe works with a comprehensive model of the Android environment, it supports precise resolution of ...
Petzoldt, Christine; Bley, Oliver; Byard, Stephen J; Andert, Doris; Baumgartner, Bruno; Nagel, Norbert; Tappertzhofen, Christoph; Feth, Martin Philipp
2014-04-01
The so-called pharmaceutical solid chain, which encompasses drug substance micronisation to the final tablet production, at pilot plant scale is presented as a case study for a novel, highly potent, pharmaceutical compound: SAR114137. Various solid-state analytical methods, such as solid-state Nuclear Magnetic Resonance (ssNMR), Differential Scanning Calorimetry (DSC), Dynamic Water Vapour Sorption Gravimetry (DWVSG), hot-stage Raman spectroscopy and X-ray Powder Diffraction (XRPD) were applied and evaluated to characterise and quantify amorphous content during the course of the physical treatment of crystalline active pharmaceutical ingredient (API). DSC was successfully used to monitor the changes in amorphous content during micronisation of the API, as well as during stability studies. (19)F solid-state NMR was found to be the method of choice for the detection and quantification of low levels of amorphous API, even in the final drug product (DP), since compaction during tablet manufacture was identified as a further source for the formation of amorphous API. The application of different jet milling techniques was a critical factor with respect to amorphous content formation. In the present case, the change from spiral jet milling to loop jet milling led to a decrease in amorphous API content from 20-30 w/w% to nearly 0 w/w% respectively. The use of loop jet milling also improved the processability of the API. Stability investigations on both the milled API and the DP showed a marked tendency for recrystallisation of the amorphous API content on exposure to elevated levels of relative humidity. No significant impact of amorphous API on either the chemical stability or the dissolution rate of the API in drug formulation was observed. Therefore, the presence of amorphous content in the oral formulation was of no consequence for the clinical trial phases I and II. Copyright © 2013 Elsevier B.V. All rights reserved.
S-Band POSIX Device Drivers for RTEMS
NASA Technical Reports Server (NTRS)
Lux, James P.; Lang, Minh; Peters, Kenneth J.; Taylor, Gregory H.
2011-01-01
This is a set of POSIX device driver level abstractions in the RTEMS RTOS (Real-Time Executive for Multiprocessor Systems real-time operating system) for S-band radio hardware devices that have been instantiated in an FPGA (field-programmable gate array). These include A/D (analog-to-digital) sample capture, D/A (digital-to-analog) sample playback, PLL (phase-locked-loop) tuning, and PWM (pulse-width-modulation)-controlled gain. This software interfaces to S-band radio hardware in an attached Xilinx Virtex-2 FPGA. It uses plug-and-play device discovery to map memory to device IDs. Instead of interacting with hardware devices directly, using direct memory-mapped access at the application level, this driver provides an application programming interface (API) that uses standard POSIX function calls. This simplifies application programming, enables portability, and offers an additional level of protection to the hardware. There are three separate device drivers included in this package: sband_device (ADC capture and DAC playback), pll_device (RF front end PLL tuning), and pwm_device (RF front end AGC control).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Staley, Martin
2017-09-20
This high-performance ray tracing library provides very fast rendering; compact code; type flexibility through C++ "generic programming" techniques; and ease of use via an application programming interface (API) that operates independently of any GUI, on-screen display, or other enclosing application. Kip supports constructive solid geometry (CSG) models based on a wide variety of built-in shapes and logical operators, and also allows for user-defined shapes and operators to be provided. Additional features include basic texturing; input/output of models using a simple human-readable file format and with full error checking and detailed diagnostics; and support for shared data parallelism. Kip is written in pure, ANSI standard C++; is entirely platform independent; and is very easy to use. As a C++ "header only" library, it requires no build system, configuration or installation scripts, wizards, non-C++ preprocessing, makefiles, shell scripts, or external libraries.
LANCE in ECHO - Merging Science and Near Real-Time Data Search and Order
NASA Astrophysics Data System (ADS)
Kreisler, S.; Murphy, K. J.; Vollmer, B.; Lighty, L.; Mitchell, A. E.; Devine, N.
2012-12-01
NASA's Earth Observing System (EOS) Data and Information System (EOSDIS) Land Atmosphere Near real-time Capability for EOS (LANCE) project provides expedited data products from the Terra, Aqua, and Aura satellites within three hours of observation. In order to satisfy latency requirements, LANCE data are produced with relaxed ancillary data resulting in a product that may have minor differences from its science quality counterpart. LANCE products are used by a number of different groups to support research and applications that require near real-time earth observations, such as disaster relief, hazard and air quality monitoring, and weather forecasting. LANCE elements process raw rate-buffered and/or session-based production datasets into higher-level products, which are freely available to registered users via LANCE FTP sites. The LANCE project also generates near real-time full resolution browse imagery from these products, which can be accessed through the Global Imagery Browse Services (GIBS). In an effort to support applications and services that require timely access to these near real-time products, the project is currently implementing the publication of LANCE product metadata to the EOS ClearingHouse (ECHO), a centralized EOSDIS registry of EOS data. Metadata within ECHO is made available through an Application Program Interface (API), and applications can utilize the API to allow users to efficiently search and order LANCE data. Publishing near real-time data to ECHO will permit applications to access near real-time product metadata prior to the release of its science quality counterpart and to associate imagery from GIBS with its underlying data product.
Population genetics of commercial and feral honey bees in Western Australia.
Chapman, Nadine C; Lim, Julianne; Oldroyd, Benjamin P
2008-04-01
Due to the introduction of exotic honey bee (Apis mellifera L.) diseases in the eastern states, the borders of the state of Western Australia were closed to the import of bees for breeding and other purposes > 25 yr ago. To provide genetically improved stock for the industry, a closed population breeding program was established that now provides stock for the majority of Western Australian beekeepers. Given concerns that inbreeding may have resulted from the closed population breeding structure, we assessed the genetic diversity within and between the breeding lines by using microsatellite and mitochondrial markers. We found that the breeding population still maintains considerable genetic diversity, despite 25 yr of selective breeding. We also investigated the genetic distance of the closed population breeding program to that of beekeepers outside of the program, and the feral Western Australian honey bee population. The feral population is genetically distinct from the closed population, but not from the genetic stock maintained by beekeepers outside of the program. The honey bees of Western Australia show three mitotypes, originating from two subspecies: Apis mellifera ligustica (mitotypes C1 and M7b) and Apis mellifera iberica (mitotype M6). Only mitotypes C1 and M6 are present in the commercial populations. The feral population contains all three mitotypes.
JEnsembl: a version-aware Java API to Ensembl data systems.
Paterson, Trevor; Law, Andy
2012-11-01
The Ensembl Project provides release-specific Perl APIs for efficient high-level programmatic access to data stored in various Ensembl database schema. Although Perl scripts are perfectly suited for processing large volumes of text-based data, Perl is not ideal for developing large-scale software applications nor embedding in graphical interfaces. The provision of a novel Java API would facilitate type-safe, modular, object-orientated development of new Bioinformatics tools with which to access, analyse and visualize Ensembl data. The JEnsembl API implementation provides basic data retrieval and manipulation functionality from the Core, Compara and Variation databases for all species in Ensembl and EnsemblGenomes and is a platform for the development of a richer API to Ensembl datasources. The JEnsembl architecture uses a text-based configuration module to provide evolving, versioned mappings from database schema to code objects. A single installation of the JEnsembl API can therefore simultaneously and transparently connect to current and previous database instances (such as those in the public archive) thus facilitating better analysis repeatability and allowing 'through time' comparative analyses to be performed. Project development, released code libraries, Maven repository and documentation are hosted at SourceForge (http://jensembl.sourceforge.net).
PCIPS 2.0: Powerful multiprofile image processing implemented on PCs
NASA Technical Reports Server (NTRS)
Smirnov, O. M.; Piskunov, N. E.
1992-01-01
Over the years, the processing power of personal computers has steadily increased. Now, 386- and 486-based PC's are fast enough for many image processing applications, and inexpensive enough even for amateur astronomers. PCIPS is an image processing system based on these platforms that was designed to satisfy a broad range of data analysis needs, while requiring minimum hardware and providing maximum expandability. It will run (albeit at a slow pace) even on an 80286 with 640K memory, but will take full advantage of bigger memory and faster CPU's. Because the actual image processing is performed by external modules, the system can be easily upgraded by the user for all sorts of scientific data analysis. PCIPS supports large format 1D and 2D images in any numeric type from 8-bit integer to 64-bit floating point. The images can be displayed, overlaid, printed and any part of the data examined via an intuitive graphical user interface that employs buttons, pop-up menus, and a mouse. PCIPS automatically converts images between different types and sizes to satisfy the requirements of various applications. PCIPS features an API that lets users develop custom applications in C or FORTRAN. While doing so, a programmer can concentrate on the actual data processing, because PCIPS assumes responsibility for accessing images and interacting with the user. This also ensures that all applications, even custom ones, have a consistent and user-friendly interface. The API is compatible with factory programming, a metaphor for constructing image processing procedures that will be implemented in future versions of the system. Several application packages were created under PCIPS. The basic package includes elementary arithmetics and statistics, geometric transformations and import/export in various formats (FITS, binary, ASCII, and GIF). The CCD processing package and the spectral analysis package were successfully used to reduce spectra from the Nordic Telescope at La Palma. A photometry package is also available, and other packages are being developed. A multitasking version of PCIPS that utilizes the factory programming concept is currently under development. This version will remain compatible (on the source code level) with existing application packages and custom applications.
Configuration Management File Manager Developed for Numerical Propulsion System Simulation
NASA Technical Reports Server (NTRS)
Follen, Gregory J.
1997-01-01
One of the objectives of the High Performance Computing and Communication Project's (HPCCP) Numerical Propulsion System Simulation (NPSS) is to provide a common and consistent way to manage applications, data, and engine simulations. The NPSS Configuration Management (CM) File Manager integrated with the Common Desktop Environment (CDE) window management system provides a common look and feel for the configuration management of data, applications, and engine simulations for U.S. engine companies. In addition, CM File Manager provides tools to manage a simulation. Features include managing input files, output files, textual notes, and any other material normally associated with simulation. The CM File Manager includes a generic configuration management Application Program Interface (API) that can be adapted for the configuration management repositories of any U.S. engine company.
DGIdb 3.0: a redesign and expansion of the drug-gene interaction database.
Cotto, Kelsy C; Wagner, Alex H; Feng, Yang-Yang; Kiwala, Susanna; Coffman, Adam C; Spies, Gregory; Wollam, Alex; Spies, Nicholas C; Griffith, Obi L; Griffith, Malachi
2018-01-04
The drug-gene interaction database (DGIdb, www.dgidb.org) consolidates, organizes and presents drug-gene interactions and gene druggability information from papers, databases and web resources. DGIdb normalizes content from 30 disparate sources and allows for user-friendly advanced browsing, searching and filtering for ease of access through an intuitive web user interface, application programming interface (API) and public cloud-based server image. DGIdb v3.0 represents a major update of the database. Nine of the previously included 24 sources were updated. Six new resources were added, bringing the total number of sources to 30. These updates and additions of sources have cumulatively resulted in 56 309 interaction claims. This has also substantially expanded the comprehensive catalogue of druggable genes and anti-neoplastic drug-gene interactions included in the DGIdb. Along with these content updates, v3.0 has received a major overhaul of its codebase, including an updated user interface, preset interaction search filters, consolidation of interaction information into interaction groups, greatly improved search response times and upgrading the underlying web application framework. In addition, the expanded API features new endpoints which allow users to extract more detailed information about queried drugs, genes and drug-gene interactions, including listings of PubMed IDs, interaction type and other interaction metadata.
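As a rough illustration of the kind of programmatic access the expanded API offers, the sketch below queries drug-gene interactions for a list of gene symbols. The v2-style interactions endpoint and the response field names are assumptions based on earlier DGIdb releases and should be checked against the current documentation.

```python
# Hedged sketch of a drug-gene interaction query against a DGIdb-style JSON API.
import json
import urllib.parse
import urllib.request

API_URL = "https://dgidb.org/api/v2/interactions.json"  # assumed endpoint

def interactions_for_genes(genes):
    """Return the raw interaction payload for a list of gene symbols."""
    query = urllib.parse.urlencode({"genes": ",".join(genes)})
    with urllib.request.urlopen(f"{API_URL}?{query}") as resp:
        return json.load(resp)

# Example usage (field names such as "matchedTerms" and "drugName" are assumed):
# payload = interactions_for_genes(["BRAF", "EGFR"])
# for match in payload.get("matchedTerms", []):
#     for inter in match.get("interactions", []):
#         print(match.get("geneName"), inter.get("drugName"), inter.get("interactionTypes"))
```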
Daughton, Christian G; Ruhoy, Ilene S
2009-12-01
The combined excretion of active pharmaceutical ingredients (APIs) via urine and feces is considered the primary route by which APIs from human pharmaceuticals enter the environment. Disposal of unwanted, leftover medications by flushing into sewers has been considered a secondary route-one that does not contribute substantially to overall environmental loadings. The present study presents the first comprehensive examination of secondary routes of API release to the environment and for direct but unintentional human exposure. These include bathing, washing, and laundering, all of which release APIs remaining on the skin from the use of high-content dermal applications or from excretion to the skin via sweating, and disposal of unused and partially used high-content devices. Also discussed are the health hazards associated with: partially used devices, medication disposal practices of consumers, and interpersonal dermal transfer of API residues. Understanding these secondary routes is important from the perspective of pollution prevention, because actions can be designed more easily for reducing the environmental impact of APIs compared with the route of direct excretion (via urine and feces), for reducing the incidence of unintentional and purposeful poisonings of humans and pets, and for improving the quality and cost-effectiveness of health care. Overall, unintentional exposure to APIs for humans via these routes is possibly more important than exposure to trace residues recycled from the environment in drinking water or foods.
The Ensembl REST API: Ensembl Data for Any Language.
Yates, Andrew; Beal, Kathryn; Keenan, Stephen; McLaren, William; Pignatelli, Miguel; Ritchie, Graham R S; Ruffier, Magali; Taylor, Kieron; Vullo, Alessandro; Flicek, Paul
2015-01-01
We present a Web service to access Ensembl data using Representational State Transfer (REST). The Ensembl REST server enables the easy retrieval of a wide range of Ensembl data by most programming languages, using standard formats such as JSON and FASTA while minimizing client work. We also introduce bindings to the popular Ensembl Variant Effect Predictor tool permitting large-scale programmatic variant analysis independent of any specific programming language. The Ensembl REST API can be accessed at http://rest.ensembl.org and source code is freely available under an Apache 2.0 license from http://github.com/Ensembl/ensembl-rest. © The Author 2014. Published by Oxford University Press.
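A minimal sketch of using the REST service from Python follows; the /lookup/symbol endpoint and the content-type query parameter follow the public rest.ensembl.org documentation, while the printed fields are examples of what the JSON reply typically contains.

```python
# Minimal sketch of an Ensembl REST lookup using only the Python standard library.
import json
import urllib.request

def lookup_symbol(species, symbol):
    """Look up a gene by symbol and return its JSON description."""
    url = (f"https://rest.ensembl.org/lookup/symbol/{species}/{symbol}"
           "?content-type=application/json")
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

# Example usage; the printed keys are typical of the lookup reply.
# gene = lookup_symbol("homo_sapiens", "BRCA2")
# print(gene["id"], gene["seq_region_name"], gene["start"], gene["end"])
```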
Computer aided fixture design - A case based approach
NASA Astrophysics Data System (ADS)
Tanji, Shekhar; Raiker, Saiesh; Mathew, Arun Tom
2017-11-01
Automated fixture design plays an important role in process planning and the integration of CAD and CAM. An automated fixture setup design system is developed in which, once the fixturing surfaces and points are described, modular fixture components are automatically selected to generate fixture units and are placed into position so that the assembly conditions are satisfied. In the past, various knowledge-based systems have been developed to implement CAFD in practice. In this paper, to obtain an acceptable automated machining fixture design, a case-based reasoning method with a developed retrieval system is proposed. The Visual Basic (VB) programming language is used, integrated with the SolidWorks API (Application Programming Interface) module, to improve the retrieval procedure and reduce computational time. These properties are incorporated in a numerical simulation to determine the best fit for practical use.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gunter, Dan; Lee, Jason; Stoufer, Martin
2003-03-28
The NetLogger Toolkit is designed to monitor, under actual operating conditions, the behavior of all the elements of the application-to-application communication path in order to determine exactly where time is spent within a complex system. Using NetLogger, distributed application components are modified to produce timestamped logs of "interesting" events at all the critical points of the distributed system. Events from each component are correlated, which allows one to characterize the performance of all aspects of the system and network in detail. The NetLogger Toolkit itself consists of four components: an API and library of functions to simplify the generation of application-level event logs, a set of tools for collecting and sorting log files, an event archive system, and a tool for visualization and analysis of the log files. In order to instrument an application to produce event logs, the application developer inserts calls to the NetLogger API at all the critical points in the code, then links the application with the NetLogger library. All the tools in the NetLogger Toolkit share a common log format, and assume the existence of accurate and synchronized system clocks. NetLogger messages can be logged using an easy-to-read text-based format based on the IETF-proposed ULM format, or a binary format that can still be used through the same API but that is several times faster and smaller, with performance comparable to or better than binary message formats such as MPI, XDR, SDDF-Binary, and PBIO. The NetLogger binary format is both highly efficient and self-describing, thus optimized for the dynamic message construction and parsing of application instrumentation. NetLogger includes an "activation" API that allows NetLogger logging to be turned on, off, or modified by changing an external file. This is useful for activating logging in daemons/services (e.g., a GridFTP server). The NetLogger reliability API provides the ability to specify backup logging locations and to periodically try to reconnect a broken TCP pipe. A typical use for this is to store data on local disk while the network is down. An event archiver can log one or more incoming NetLogger streams to a local disk file (netlogd) or to a MySQL database (netarchd). We have found exploratory, visual analysis of the log event data to be the most useful means of determining the causes of performance anomalies. The NetLogger Visualization tool, nlv, has been developed to provide a flexible and interactive graphical representation of system-level and application-level events.
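The sketch below illustrates the instrumentation pattern the abstract describes: emitting timestamped, ULM-style key=value event lines at the critical points of a code path so that the events can later be correlated. It is a pattern illustration only, written with Python's standard library, and does not use the actual NetLogger API or its exact message format.

```python
# Illustration of timestamped event instrumentation in a ULM-like text format.
# This is a sketch of the general pattern, not the NetLogger library itself.
import socket
import sys
import time
from datetime import datetime, timezone

def write_event(event, stream=sys.stdout, **fields):
    """Emit one key=value event line with a wall-clock timestamp."""
    ts = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.%f")
    parts = [f"DATE={ts}", f"HOST={socket.gethostname()}", f"PROG={sys.argv[0]}",
             f"NL.EVNT={event}"] + [f"{k.upper()}={v}" for k, v in fields.items()]
    print(" ".join(parts), file=stream)

# Bracket a critical section with start/end events so logs from different
# components can later be correlated and the time spent here measured.
write_event("transfer.start", file="chunk-0042", bytes=1048576)
time.sleep(0.1)   # stand-in for the real work (e.g., a network transfer)
write_event("transfer.end", file="chunk-0042", bytes=1048576)
```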
Khan, Usman; Bloom, Raanan A; Nicell, James A; Laurenson, James P
2017-12-31
A select few prescription drugs can be especially harmful and, in some cases, fatal with just one dose when not used as prescribed. Therefore, the U. S. Food and Drug Administration (FDA) recommends that expired, unwanted, or otherwise unused portions of most of these drugs be disposed of quickly through a take-back program. If such an option is not readily available, FDA recommends that they be flushed down the sink or toilet. The goal of the current investigation was to evaluate the ecological and human-health risks associated with the environmental release of the 15 active pharmaceutical ingredients (APIs) currently on the FDA "flush list". The evaluation suggests that even when highly conservative assumptions are used-including that the entire API mass supplied for clinical use is flushed, all relevant sources in addition to clinical use of the API are considered, and no metabolic loss, environmental degradation, or dilution of wastewater effluents are used in estimating environmental concentrations-most of these APIs present a negligible eco-toxicological risk, both as individual compounds and as a mixture. For a few of these APIs, additional eco-toxicological data will need to be developed. Using similar conservative assumptions for human-health risks, all 15 APIs present negligible risk through ingestion of water and fish. Published by Elsevier B.V.
The NIH BD2K center for big data in translational genomics.
Paten, Benedict; Diekhans, Mark; Druker, Brian J; Friend, Stephen; Guinney, Justin; Gassner, Nadine; Guttman, Mitchell; Kent, W James; Mantey, Patrick; Margolin, Adam A; Massie, Matt; Novak, Adam M; Nothaft, Frank; Pachter, Lior; Patterson, David; Smuga-Otto, Maciej; Stuart, Joshua M; Van't Veer, Laura; Wold, Barbara; Haussler, David
2015-11-01
The world's genomics data will never be stored in a single repository - rather, it will be distributed among many sites in many countries. No one site will have enough data to explain genotype to phenotype relationships in rare diseases; therefore, sites must share data. To accomplish this, the genetics community must forge common standards and protocols to make sharing and computing data among many sites a seamless activity. Through the Global Alliance for Genomics and Health, we are pioneering the development of shared application programming interfaces (APIs) to connect the world's genome repositories. In parallel, we are developing an open source software stack (ADAM) that uses these APIs. This combination will create a cohesive genome informatics ecosystem. Using containers, we are facilitating the deployment of this software in a diverse array of environments. Through benchmarking efforts and big data driver projects, we are ensuring ADAM's performance and utility. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
The Earth System Grid Federation: An Open Infrastructure for Access to Distributed Geospatial Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ananthakrishnan, Rachana; Bell, Gavin; Cinquini, Luca
2013-01-01
The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing the software infrastructure needed to facilitate and empower the study of climate change on a global scale. The ESGF's architecture employs a system of geographically distributed peer nodes, which are independently administered yet united by the adoption of common federation protocols and application programming interfaces (APIs). The cornerstones of its interoperability are the peer-to-peer messaging that is continuously exchanged among all nodes in the federation; a shared architecture and API for search and discovery; and a security infrastructure based on industry standards (OpenID, SSL, GSI and SAML). The ESGF software is developed collaboratively across institutional boundaries and made available to the community as open source. It has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the entire model output used for the next international assessment report on climate change (IPCC-AR5) and a suite of satellite observations (obs4MIPs) and reanalysis data sets (ANA4MIPs).
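The sketch below shows the flavor of a faceted query against the federation's shared search API. The index-node URL, facet names, and response layout are assumptions modeled on the esg-search interface; any ESGF index node exposing that interface could be substituted.

```python
# Hedged sketch of a faceted dataset search against an ESGF index node.
import json
import urllib.parse
import urllib.request

SEARCH_URL = "https://esgf-node.llnl.gov/esg-search/search"  # assumed index node

def esgf_search(**facets):
    """Run a faceted dataset search and return the Solr-style JSON response."""
    params = dict(facets, format="application/solr+json", limit=10)
    with urllib.request.urlopen(f"{SEARCH_URL}?{urllib.parse.urlencode(params)}") as resp:
        return json.load(resp)

# Example usage (facet names such as "project" and "variable" are assumed):
# results = esgf_search(project="obs4MIPs", variable="ta")
# for doc in results["response"]["docs"]:
#     print(doc["id"])
```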
The Earth System Grid Federation: An Open Infrastructure for Access to Distributed Geo-Spatial Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cinquini, Luca; Crichton, Daniel; Miller, Neill
2012-01-01
The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing the software infrastructure needed to facilitate and empower the study of climate change on a global scale. The ESGF's architecture employs a system of geographically distributed peer nodes, which are independently administered yet united by the adoption of common federation protocols and application programming interfaces (APIs). The cornerstones of its interoperability are the peer-to-peer messaging that is continuously exchanged among all nodes in the federation; a shared architecture and API for search and discovery; and a security infrastructure based on industry standards (OpenID, SSL, GSI and SAML). The ESGF software is developed collaboratively across institutional boundaries and made available to the community as open source. It has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the entire model output used for the next international assessment report on climate change (IPCC-AR5) and a suite of satellite observations (obs4MIPs) and reanalysis data sets (ANA4MIPs).
The Earth System Grid Federation : an Open Infrastructure for Access to Distributed Geospatial Data
NASA Technical Reports Server (NTRS)
Cinquini, Luca; Crichton, Daniel; Mattmann, Chris; Harney, John; Shipman, Galen; Wang, Feiyi; Ananthakrishnan, Rachana; Miller, Neill; Denvil, Sebastian; Morgan, Mark;
2012-01-01
The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing the software infrastructure needed to facilitate and empower the study of climate change on a global scale. The ESGF's architecture employs a system of geographically distributed peer nodes, which are independently administered yet united by the adoption of common federation protocols and application programming interfaces (APIs). The cornerstones of its interoperability are the peer-to-peer messaging that is continuously exchanged among all nodes in the federation; a shared architecture and API for search and discovery; and a security infrastructure based on industry standards (OpenID, SSL, GSI and SAML). The ESGF software is developed collaboratively across institutional boundaries and made available to the community as open source. It has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the entire model output used for the next international assessment report on climate change (IPCC-AR5) and a suite of satellite observations (obs4MIPs) and reanalysis data sets (ANA4MIPs).
Summary of Structural Evaluation and Design Support for the Underground Nuclear Test Program.
1979-07-01
consider using API 5LX pipe as this pipe has been shown to have high ductility (better than A36). This pipe comes in several grades (X42, X46, X52, X56, X60, X65, X70) with the grade number representing the yield strength (ksi) of the steel. Grades X42 and X52 are readily available while the higher yield-strength steels are less readily available. I believe X52 has certainly a high enough yield strength (52,000 psi) for your application and that even
Nemesis I: Parallel Enhancements to ExodusII
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hennigan, Gary L.; John, Matthew S.; Shadid, John N.
2006-03-28
NEMESIS I is an enhancement to the EXODUS II finite element database model used to store and retrieve data for unstructured parallel finite element analyses. NEMESIS I adds data structures which facilitate the partitioning of a scalar (standard serial) EXODUS II file onto parallel disk systems found on many parallel computers. Since the NEMESIS I application programming interface (API) can be used to append information to an existing EXODUS II file, existing software that reads EXODUS II files can be used on files which contain NEMESIS I information. The NEMESIS I information is written and read via C or C++ callable functions which comprise the NEMESIS I API.
National Utility Rate Database: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ong, S.; McKeel, R.
2012-08-01
When modeling solar energy technologies and other distributed energy systems, using high-quality, expansive electricity rates is essential. The National Renewable Energy Laboratory (NREL) developed a utility rate platform for entering, storing, updating, and accessing a large collection of utility rates from around the United States. This utility rate platform lives on the Open Energy Information (OpenEI) website, OpenEI.org, allowing the data to be programmatically accessed from a web browser, using an application programming interface (API). The semantic-based utility rate platform currently contains records for 1,885 utility rates and covers over 85% of the electricity consumption in the United States.
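A hedged sketch of pulling rate records from the platform programmatically is shown below. The endpoint, parameter names, and the requirement for an API key are assumptions about the OpenEI web API; OpenEI.org documents the authoritative interface.

```python
# Hedged sketch of fetching utility rate records from an OpenEI-style web API.
import json
import urllib.parse
import urllib.request

RATE_URL = "https://api.openei.org/utility_rates"   # assumed endpoint

def fetch_rates(api_key, **query):
    """Fetch utility rate records matching the given query parameters."""
    params = dict(query, api_key=api_key, format="json", version="latest")
    with urllib.request.urlopen(f"{RATE_URL}?{urllib.parse.urlencode(params)}") as resp:
        # The "items" key is an assumed name for the list of rate records.
        return json.load(resp).get("items", [])

# Example usage (the query parameter name is an assumption):
# rates = fetch_rates("MY_KEY", ratesforutility="Public Service Co of Colorado")
```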
JEnsembl: a version-aware Java API to Ensembl data systems
Paterson, Trevor; Law, Andy
2012-01-01
Motivation: The Ensembl Project provides release-specific Perl APIs for efficient high-level programmatic access to data stored in various Ensembl database schema. Although Perl scripts are perfectly suited for processing large volumes of text-based data, Perl is not ideal for developing large-scale software applications nor embedding in graphical interfaces. The provision of a novel Java API would facilitate type-safe, modular, object-orientated development of new Bioinformatics tools with which to access, analyse and visualize Ensembl data. Results: The JEnsembl API implementation provides basic data retrieval and manipulation functionality from the Core, Compara and Variation databases for all species in Ensembl and EnsemblGenomes and is a platform for the development of a richer API to Ensembl datasources. The JEnsembl architecture uses a text-based configuration module to provide evolving, versioned mappings from database schema to code objects. A single installation of the JEnsembl API can therefore simultaneously and transparently connect to current and previous database instances (such as those in the public archive) thus facilitating better analysis repeatability and allowing ‘through time’ comparative analyses to be performed. Availability: Project development, released code libraries, Maven repository and documentation are hosted at SourceForge (http://jensembl.sourceforge.net). Contact: jensembl-develop@lists.sf.net, andy.law@roslin.ed.ac.uk, trevor.paterson@roslin.ed.ac.uk PMID:22945789
Improved Functionality and Curation Support in the ADS
NASA Astrophysics Data System (ADS)
Accomazzi, Alberto; Kurtz, Michael J.; Henneken, Edwin A.; Grant, Carolyn S.; Thompson, Donna; Chyla, Roman; Holachek, Alexandra; Sudilovsky, Vladimir; Murray, Stephen S.
2015-01-01
In this poster we describe the developments of the new ADS platform over the past year, focusing on the functionality which improves its discovery and curation capabilities.The ADS Application Programming Interface (API) is being updated to support authenticated access to the entire suite of ADS services, in addition to the search functionality itself. This allows programmatic access to resources which are specific to a user or class of users.A new interface, built directly on top of the API, now provides a more intuitive search experience and takes into account the best practices in web usability and responsive design. The interface now incorporates in-line views of graphics from the AAS Astroexplorer and the ADS All-Sky Survey image collections.The ADS Private Libraries, first introduced over 10 years ago, are now being enhanced to allow the bookmarking, tagging and annotation of records of interest. In addition, libraries can be shared with one or more ADS users, providing an easy way to collaborate in the curation of lists of papers. A library can also be explicitly made public and shared at large via the publishing of its URL.In collaboration with the AAS, the ADS plans to support the adoption of ORCID identifiers by implementing a plugin which will simplify the import of papers in ORCID via a query to the ADS API. Deeper integration between the two systems will depend on available resources and feedback from the community.
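As an illustration of the authenticated programmatic access described above, the sketch below runs a search through the ADS API. The /v1/search/query endpoint and bearer-token header follow the public ADS API documentation; the query string and returned fields are only examples.

```python
# Minimal sketch of an authenticated search against the ADS API.
import json
import urllib.parse
import urllib.request

API_URL = "https://api.adsabs.harvard.edu/v1/search/query"

def ads_search(token, query, fields="bibcode,title,year", rows=5):
    """Run an ADS search and return the list of matching documents."""
    params = urllib.parse.urlencode({"q": query, "fl": fields, "rows": rows})
    req = urllib.request.Request(f"{API_URL}?{params}",
                                 headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        # A Solr-style reply ({"response": {"docs": [...]}}) is expected.
        return json.load(resp)["response"]["docs"]

# Example: docs = ads_search("MY_ADS_TOKEN", 'author:"Accomazzi, A." year:2015')
```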
Nemoto, Tooru; Iwamoto, Mariko; Kamitani, Emiko; Morris, Anne; Sakata, Maria
2011-04-01
Access to culturally competent HIV/AIDS and substance abuse treatment and prevention services is limited for Asian and Pacific Islanders (APIs). Based on the intake data for a community outreach project in the San Francisco Bay Area (N = 1,349), HIV risk behaviors were described among the targeted API risk groups. The self-reported HIV prevalence was 6% among MSM. Inconsistent condom use for vaginal sex with casual partners in the past 6 months was reported among substance users (43%) and incarcerated participants (60%), whereas 26% of men who have sex with men reported inconsistent condom use for anal sex with casual partners. Overall, 56% and 29% had engaged in sex with casual partners under the influence of alcohol and drugs in the past 6 months, respectively. Although API organizations in the Bay Area have spearheaded HIV prevention, future programs must address substance use issues in relation to sexual risk behaviors, specific to API risk groups.
NASA Astrophysics Data System (ADS)
Kepner, J. V.; Janka, R. S.; Lebak, J.; Richards, M. A.
1999-12-01
The Vector/Signal/Image Processing Library (VSIPL) is a DARPA-initiated effort made up of industry, government and academic representatives who have defined an industry standard API for vector, signal, and image processing primitives for real-time signal processing on high performance systems. VSIPL supports a wide range of data types (int, float, complex, ...) and layouts (vectors, matrices and tensors) and is ideal for astronomical data processing. The VSIPL API is intended to serve as an open, vendor-neutral, industry standard interface. The object-based VSIPL API abstracts the memory architecture of the underlying machine by using the concept of memory blocks and views. Early experiments with VSIPL code conversions have been carried out by the High Performance Computing Program team at UCSD. Commercially, several major vendors of signal processors are actively developing implementations. VSIPL has also been explicitly required as part of a recent Rome Labs teraflop procurement. This poster presents the VSIPL API, its functionality and the status of various implementations.
Flexibility and Performance of Parallel File Systems
NASA Technical Reports Server (NTRS)
Kotz, David; Nieuwejaar, Nils
1996-01-01
As we gain experience with parallel file systems, it becomes increasingly clear that a single solution does not suit all applications. For example, it appears to be impossible to find a single appropriate interface, caching policy, file structure, or disk-management strategy. Furthermore, the proliferation of file-system interfaces and abstractions makes applications difficult to port. We propose that the traditional functionality of parallel file systems be separated into two components: a fixed core that is standard on all platforms, encapsulating only primitive abstractions and interfaces, and a set of high-level libraries to provide a variety of abstractions and application-programmer interfaces (API's). We present our current and next-generation file systems as examples of this structure. Their features, such as a three-dimensional file structure, strided read and write interfaces, and I/O-node programs, are specifically designed with the flexibility and performance necessary to support a wide range of applications.
Application and API for Real-time Visualization of Ground-motions and Tsunami
NASA Astrophysics Data System (ADS)
Aoi, S.; Kunugi, T.; Suzuki, W.; Kubo, T.; Nakamura, H.; Azuma, H.; Fujiwara, H.
2015-12-01
Due to recent progress in seismographs and communication environments, real-time and continuous ground-motion observation has become technically and economically feasible. K-NET and KiK-net, the nationwide strong-motion networks operated by NIED, cover all of Japan with about 1750 stations in total. More than half of the stations transmit ground-motion indexes and/or waveform data every second. Traditionally, strong-motion data were recorded by event-triggered instruments over non-continuous telephone lines connected only after an earthquake. Though the data from such networks mainly contribute to preparations for future earthquakes, the huge amount of real-time data from a dense network is expected to contribute directly to the mitigation of ongoing earthquake disasters through, e.g., automatically shutting down plants and helping decision-making for initial response. By generating distribution maps of these indexes and uploading them to the website, we implemented the real-time ground motion monitoring system, Kyoshin (strong-motion in Japanese) monitor. This web service (www.kyoshin.bosai.go.jp) started in 2008 and anyone can grasp the current ground motions of Japan. Though this service provides only a ground-motion map in GIF format, digital data are important for taking full advantage of real-time strong-motion data to mitigate ongoing disasters. We have developed a WebAPI to provide real-time data and related information such as ground motions (5 km mesh) and arrival times estimated from EEW (earthquake early warning). All response data from this WebAPI are in JSON format and are easy to parse. We also developed a Kyoshin monitor application for smartphones, 'Kmoni view', using the API. In this application, ground motions estimated from EEW are overlaid on the map with the observed one-second-interval indexes. The application can play back previous earthquakes for demonstrations or disaster drills. In a mobile environment, data traffic and battery are limited and it is not practical to visualize all the data continuously, so the application has an automatic starting (pop-up) function triggered by EEW. A similar WebAPI and application for tsunami are being prepared using the pressure data recorded by the dense offshore observation network (S-net), which is under construction along the Japan Trench.
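The sketch below suggests how a client might poll a JSON WebAPI of this kind once per second and summarize the incoming indexes. The endpoint URL and JSON keys are hypothetical placeholders rather than the actual Kyoshin monitor WebAPI, which is not specified in the abstract.

```python
# Sketch of polling a hypothetical real-time ground-motion JSON feed.
import json
import time
import urllib.request

FEED_URL = "https://example.org/kyoshin/latest.json"   # hypothetical endpoint

def poll_ground_motion(interval_s=1.0, cycles=5):
    """Poll the feed once per second and print the peak station intensity."""
    for _ in range(cycles):
        with urllib.request.urlopen(FEED_URL) as resp:
            frame = json.load(resp)
        # Hypothetical layout: a timestamp plus per-station intensity values.
        peak = max(frame.get("stations", {}).values(), default=None)
        print(frame.get("time"), "peak intensity:", peak)
        time.sleep(interval_s)
```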
An HTML Tool for Production of Interactive Stereoscopic Compositions.
Chistyakov, Alexey; Soto, Maria Teresa; Martí, Enric; Carrabina, Jordi
2016-12-01
The benefits of stereoscopic vision in medical applications were appreciated and have been thoroughly studied for more than a century. The usage of the stereoscopic displays has a proven positive impact on performance in various medical tasks. At the same time the market of 3D-enabled technologies is blooming. New high resolution stereo cameras, TVs, projectors, monitors, and head mounted displays become available. This equipment, completed with a corresponding application program interface (API), could be relatively easy implemented in a system. Such complexes could open new possibilities for medical applications exploiting the stereoscopic depth. This work proposes a tool for production of interactive stereoscopic graphical user interfaces, which could represent a software layer for web-based medical systems facilitating the stereoscopic effect. Further the tool's operation mode and the results of the conducted subjective and objective performance tests will be exposed.
NASA Astrophysics Data System (ADS)
Urbanova, Martina; Brus, Jiri; Sedenkova, Ivana; Policianova, Olivia; Kobera, Libor
In this contribution the ability of 19F MAS NMR spectroscopy to probe the structural variability of poorly water-soluble drugs formulated as solid dispersions in polymer matrices is discussed. The application potential of the proposed approach is demonstrated on a moderately sized active pharmaceutical ingredient (API, Atorvastatin) exhibiting extensive polymorphism. In this respect, a range of model systems with the API incorporated in a matrix of polyvinylpyrrolidone (PVP) was prepared. The extent of mixing of both components was determined by T1(1H) and T1ρ(1H) relaxation experiments, and it was found that the API forms nanosized domains. Subsequently, it was found that the polymer matrix induces two kinds of changes in the 19F MAS NMR spectra. The first is a high-frequency shift reaching 2-3 ppm, which is independent of the molecular structure of the API and which results from long-range polarization of the electron cloud around the 19F nucleus induced by electrostatic fields of the polymer matrix. The second is broadening of the signals and formation of shoulders reflecting changes in the molecular arrangement of the API. Because both contributions act simultaneously, to avoid misinterpretation of the recorded 19F MAS NMR spectra we applied a chemometric approach based on multivariate analysis. It is demonstrated that factor analysis of the recorded spectra can separate both spectral contributions, and the subtle structural differences in the molecular arrangement of the API in the nanosized domains can be traced. In this way 19F MAS NMR spectra of both pure APIs and APIs in solid dispersions can be directly compared. The proposed strategy thus provides a powerful tool for the analysis of new formulations of fluorinated pharmaceutical substances in polymer matrices.
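To make the chemometric idea concrete, the sketch below performs a simple principal-component (factor-analysis-style) decomposition of a set of synthetic spectra into a small number of spectral shapes and per-sample scores. It illustrates the general approach only and is not the authors' exact workflow.

```python
# Generic sketch: decompose a set of spectra into dominant spectral factors
# and per-sample scores via PCA (SVD). Synthetic data, illustration only.
import numpy as np

rng = np.random.default_rng(0)
n_spectra, n_points = 12, 512
ppm = np.linspace(-10, 10, n_points)

# Two synthetic "pure" contributions: a shifted narrow line and a broad one.
comp_shift = np.exp(-((ppm - 2.5) ** 2) / 0.2)
comp_broad = np.exp(-((ppm - 0.0) ** 2) / 4.0)
weights = rng.uniform(0, 1, size=(n_spectra, 2))
spectra = (weights @ np.vstack([comp_shift, comp_broad])
           + 0.01 * rng.standard_normal((n_spectra, n_points)))

# PCA via SVD of the mean-centred data matrix.
centred = spectra - spectra.mean(axis=0)
U, S, Vt = np.linalg.svd(centred, full_matrices=False)
loadings = Vt[:2]            # spectral shapes of the two dominant factors
scores = U[:, :2] * S[:2]    # per-sample contributions of each factor

print("explained variance ratios:", (S[:2] ** 2 / (S ** 2).sum()).round(3))
```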
DeepBlue epigenomic data server: programmatic data retrieval and analysis of epigenome region sets.
Albrecht, Felipe; List, Markus; Bock, Christoph; Lengauer, Thomas
2016-07-08
Large amounts of epigenomic data are generated under the umbrella of the International Human Epigenome Consortium, which aims to establish 1000 reference epigenomes within the next few years. These data have the potential to unravel the complexity of epigenomic regulation. However, their effective use is hindered by the lack of flexible and easy-to-use methods for data retrieval. Extracting region sets of interest is a cumbersome task that involves several manual steps: identifying the relevant experiments, downloading the corresponding data files and filtering the region sets of interest. Here we present the DeepBlue Epigenomic Data Server, which streamlines epigenomic data analysis as well as software development. DeepBlue provides a comprehensive programmatic interface for finding, selecting, filtering, summarizing and downloading region sets. It contains data from four major epigenome projects, namely ENCODE, ROADMAP, BLUEPRINT and DEEP. DeepBlue comes with a user manual, examples and a well-documented application programming interface (API). The latter is accessed via the XML-RPC protocol supported by many programming languages. To demonstrate usage of the API and to enable convenient data retrieval for non-programmers, we offer an optional web interface. DeepBlue can be openly accessed at http://deepblue.mpi-inf.mpg.de. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
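A minimal sketch of calling the server over XML-RPC from Python follows. The server address matches the one given above, but the method name and the anonymous user key are assumptions drawn from the DeepBlue documentation as remembered and may differ from the current interface.

```python
# Hedged sketch of a DeepBlue XML-RPC call from the Python standard library.
import xmlrpc.client

server = xmlrpc.client.ServerProxy("http://deepblue.mpi-inf.mpg.de/xmlrpc",
                                   allow_none=True)
user_key = "anonymous_key"   # assumed public key for read-only access

# DeepBlue calls conventionally return a (status, payload) pair; both the
# method name and the reply shape are assumptions here.
status, genomes = server.list_genomes(user_key)
if status == "okay":
    for genome in genomes:
        print(genome)
```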
Developing of Library for Proofs of Data Possession in Charm
2013-06-01
LIST OF ACRONYMS AND ABBREVIATIONS: API Application Programmer Interface; DTP Datatype-Preserving Encryption; FedRAMP U.S...proposed block-cipher mode for Datatype-Preserving Encryption (DTP) uses the Knuth Shuffle in one of its steps [19]. It may be advantageous to...http://www.clustal.org/omega/clustalo-api/util_8c.html. [19] U. T. Mattsson, "Format-controlling encryption using datatype-preserving encryption
NASA Astrophysics Data System (ADS)
McWhirter, J.; Boler, F. M.; Bock, Y.; Jamason, P.; Squibb, M. B.; Noll, C. E.; Blewitt, G.; Kreemer, C. W.
2010-12-01
Three geodesy Archive Centers, Scripps Orbit and Permanent Array Center (SOPAC), NASA's Crustal Dynamics Data Information System (CDDIS) and UNAVCO are engaged in a joint effort to define and develop a common Web Service Application Programming Interface (API) for accessing geodetic data holdings. This effort is funded by the NASA ROSES ACCESS Program to modernize the original GPS Seamless Archive Centers (GSAC) technology which was developed in the 1990s. A new web service interface, the GSAC-WS, is being developed to provide uniform and expanded mechanisms through which users can access our data repositories. In total, our respective archives hold tens of millions of files and contain a rich collection of site/station metadata. Though we serve similar user communities, we currently provide a range of different access methods, query services and metadata formats. This leads to a lack of consistency in the user's experience and a duplication of engineering efforts. The GSAC-WS API and its reference implementation in an underlying Java-based GSAC Service Layer (GSL) support metadata and data queries into site/station oriented data archives. The general nature of this API makes it applicable to a broad range of data systems. The overall goals of this project include providing consistent and rich query interfaces for end users and client programs, developing enabling technology to facilitate third-party repositories in developing these web service capabilities, and enabling data queries across a collection of federated GSAC-WS enabled repositories. A fundamental challenge faced in this project is to provide a common suite of query services across a heterogeneous collection of data while enabling each repository to expose its specific metadata holdings. To address this challenge we are developing a "capabilities" based service where a repository can describe its specific query and metadata capabilities. Furthermore, the architecture of the GSL is based on a model-view paradigm that decouples the underlying data model semantics from particular representations of the data model. This will allow GSAC-WS enabled repositories to evolve their service offerings to incorporate new metadata definition formats (e.g., ISO-19115, FGDC, JSON, etc.) and new techniques for accessing their holdings. Building on the core GSAC-WS implementations, the project is also developing a federated/distributed query service. This service will seamlessly integrate with the GSAC Service Layer and will support data and metadata queries across a collection of federated GSAC repositories.
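The sketch below conveys the kind of federated site query the GSAC-WS API is intended to support. The repository URL, resource path, and parameter names are hypothetical placeholders, not the published GSAC-WS specification.

```python
# Sketch of a site search against a hypothetical GSAC-enabled repository.
import json
import urllib.parse
import urllib.request

REPOSITORY = "https://example.org/gsacws"   # hypothetical repository base URL

def find_sites(bbox, output="site.json"):
    """Query a repository for sites inside a lon/lat bounding box."""
    west, south, east, north = bbox
    params = urllib.parse.urlencode({
        "output": output,                               # hypothetical parameter
        "site.bbox": f"{west},{south},{east},{north}",  # hypothetical parameter
    })
    with urllib.request.urlopen(f"{REPOSITORY}/gsacapi/site/search?{params}") as resp:
        return json.load(resp)

# Example: sites = find_sites((-125.0, 32.0, -114.0, 42.0))
```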
Concerns and Structural Barriers Associated with WIC Participation among WIC-Eligible Women.
Liu, Cindy H; Liu, Heidi
2016-09-01
To examine sociodemographic status, psychosocial concerns, and structural barriers associated with women's participation in the USDA's Women, Infants, and Children (WIC) program among those eligible for the program. A total of 1,634 White, African-American, Hispanic, and Asian/Pacific Islander (A/PI) women from the New York City area completed the Pregnancy Risk Assessment Monitoring System (PRAMS) from 2004 to 2007, a population-based survey. Data on WIC eligibility and participation, sociodemographic details, unintended pregnancy, social support, and structural barriers were evaluated. Hispanics and Blacks were 4.1 and 2.4 times more likely to participate, respectively, in the WIC program relative to Whites. Mothers reporting unplanned pregnancies, fewer social supports, and more structural barriers (e.g., transportation) were less likely to participate in WIC. Race-stratified analyses revealed race/ethnic differences in the pattern of barriers; unintended pregnancy and structural problems were barriers associated with WIC participation particularly for A/PI. WIC-eligible women with unintended pregnancies and fewer social supports tend to participate in WIC, but those who experience more structural barriers are less likely to participate. A/PI women may face specific challenges to WIC participation. Careful attention is needed to understand the unique attitudes and behaviors in the process of participating in WIC. © 2016 Wiley Periodicals, Inc.
Oxley, P R; Oldroyd, B P
2009-04-01
Establishment of a closed population honey bee, Apis mellifera L. (Hymenoptera: Apidae), breeding program based on 'black' strains has been proposed for eastern Australia. Long-term success of such a program requires a high level of genetic variance. To determine the likely extent of genetic variation available, 50 colonies from 11 different commercial apiaries were sequenced in the mitochondrial cytochrome oxidase I and II intergenic region. Five distinct and novel mitotypes were identified. No colonies were found with the A. mellifera mellifera mitotype, which is often associated with undesirable feral strains. One group of mitotypes was consistent with a caucasica origin, two with carnica, and two with ligustica. The results suggest that there is sufficient genetic diversity to support a breeding program provided all these five sources were pooled.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Owen, D; Anderson, C; Mayo, C
Purpose: To extend the functionality of a commercial treatment planning system (TPS) to support (i) direct use of quantitative image-based metrics within treatment plan optimization and (ii) evaluation of dose-functional volume relationships to assist in functional image adaptive radiotherapy. Methods: A script was written that interfaces with a commercial TPS via an Application Programming Interface (API). The script executes a program that performs dose-functional volume analyses. Written in C#, the script reads the dose grid and correlates it with image data on a voxel-by-voxel basis through API extensions that can access registration transforms. A user interface was designed through WinForms to input parameters and display results. To test the performance of this program, image- and dose-based metrics computed from perfusion SPECT images aligned to the treatment planning CT were generated, validated, and compared. Results: The integration of image analysis information was successfully implemented as a plug-in to a commercial TPS. Perfusion SPECT images were used to validate the calculation and display of image-based metrics as well as dose-intensity metrics and histograms for defined structures on the treatment planning CT. Various biological dose correction models, custom image-based metrics, dose-intensity computations, and dose-intensity histograms were applied to analyze the image-dose profile. Conclusion: It is possible to add image analysis features to commercial TPSs through custom scripting applications. A tool was developed to enable the evaluation of image-intensity-based metrics in the context of functional targeting and avoidance. In addition to providing dose-intensity metrics and histograms that can be easily extracted from a plan database and correlated with outcomes, the system can also be extended to a plug-in optimization system, which can directly use the computed metrics for optimization of post-treatment tumor or normal tissue response models. Supported by NIH - P01 - CA059827.
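The sketch below reproduces the underlying dose-functional-volume idea with plain NumPy on synthetic arrays: voxels are binned by functional-image intensity and the dose is summarized per bin. It is a generic illustration, not the commercial TPS scripting API used in the abstract.

```python
# Generic sketch of a dose-intensity summary on co-registered voxel grids.
# Synthetic data only; not a TPS API call.
import numpy as np

rng = np.random.default_rng(1)
dose = rng.uniform(0, 70, size=(40, 40, 20))        # Gy, synthetic dose grid
perfusion = rng.uniform(0, 100, size=dose.shape)    # synthetic SPECT intensities
mask = perfusion > 5                                # voxels inside a structure

edges = np.linspace(0, 100, 11)                     # 10 intensity bins
bins = np.digitize(perfusion[mask], edges) - 1
mean_dose = [dose[mask][bins == b].mean() if np.any(bins == b) else float("nan")
             for b in range(len(edges) - 1)]

for lo, hi, d in zip(edges[:-1], edges[1:], mean_dose):
    print(f"intensity {lo:5.1f}-{hi:5.1f}: mean dose {d:5.1f} Gy")
```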
An interactive web-based system using cloud for large-scale visual analytics
NASA Astrophysics Data System (ADS)
Kaseb, Ahmed S.; Berry, Everett; Rozolis, Erik; McNulty, Kyle; Bontrager, Seth; Koh, Youngsol; Lu, Yung-Hsiang; Delp, Edward J.
2015-03-01
Network cameras have been growing rapidly in recent years. Thousands of public network cameras provide tremendous amount of visual information about the environment. There is a need to analyze this valuable information for a better understanding of the world around us. This paper presents an interactive web-based system that enables users to execute image analysis and computer vision techniques on a large scale to analyze the data from more than 65,000 worldwide cameras. This paper focuses on how to use both the system's website and Application Programming Interface (API). Given a computer program that analyzes a single frame, the user needs to make only slight changes to the existing program and choose the cameras to analyze. The system handles the heterogeneity of the geographically distributed cameras, e.g. different brands, resolutions. The system allocates and manages Amazon EC2 and Windows Azure cloud resources to meet the analysis requirements.
Math Description Engine Software Development Kit
NASA Technical Reports Server (NTRS)
Shelton, Robert O.; Smith, Stephanie L.; Dexter, Dan E.; Hodgson, Terry R.
2010-01-01
The Math Description Engine Software Development Kit (MDE SDK) can be used by software developers to make computer-rendered graphs more accessible to blind and visually-impaired users. The MDE SDK generates alternative graph descriptions in two forms: textual descriptions and non-verbal sound renderings, or sonification. It also enables display of an animated trace of a graph sonification on a visual graph component, with color and line-thickness options for users having low vision or color-related impairments. A set of accessible graphical user interface widgets is provided for operation by end users and for control of accessible graph displays. Version 1.0 of the MDE SDK generates text descriptions for 2D graphs commonly seen in math and science curriculum (and practice). The mathematically rich text descriptions can also serve as a virtual math and science assistant for blind and sighted users, making graphs more accessible for everyone. The MDE SDK has a simple application programming interface (API) that makes it easy for programmers and Web-site developers to make graphs accessible with just a few lines of code. The source code is written in Java for cross-platform compatibility and to take advantage of Java's built-in support for building accessible software application interfaces. Compiled-library and NASA Open Source versions are available with API documentation and Programmer's Guide at http://prime.jsc.nasa.gov.
GIS tool to locate major Sikh temples in USA
NASA Astrophysics Data System (ADS)
Sharma, Saumya
This tool is a GIS-based interactive and graphical user interface tool that locates the major Sikh temples of the USA on a map. The tool uses the Java programming language along with MOJO (Map Object Java Object) provided by ESRI, the organization that provides the GIS software. It also includes some integration with Google APIs such as the Google Translator API. This application tells users about the origin of Sikhism in India and the USA and about the major Sikh temples in each state of the USA, including their location, name, and detailed information through their websites. The primary purpose of this application is to make people aware of this religion and culture. The tool will also measure the distance between two temple points on a map and display the result in miles and kilometers. Also, there is added support to convert each temple's website language from English to Punjabi or any other language using a language converter tool, so that people of different nationalities can understand the culture. Clicking on each point on the map opens a new window showing a picture of the temple and a hyperlink that redirects to the website of that particular temple. It also contains links to their dance, music, and history, as well as a help menu to guide users in using the software efficiently.
Takahashi, Lois M; Magalong, Michelle G; Debell, Paula; Fasudhani, Angela
2006-12-01
Though AIDS case rates among Asian Pacific Islander Americans (APIs) in the United States remain relatively low, the number has been steadily increasing. Scholars, policy makers, and service providers still know little about how confident APIs are in carrying out different HIV risk reduction strategies. This article addresses this gap by presenting an analysis of a survey of API women and youth in Orange County, California (N = 313), a suburban county in southern California with large concentrations of Asian residents. Multivariate logistic regression models using subsamples of API women and API youth respondents were used. Variations in reported self-efficacy for female respondents were explained by acculturation, comfort in asking medical practitioners about HIV/AIDS, and to a lesser degree, education, household size, whether respondents were currently dating, HIV knowledge, and whether respondents believed that HIV could be identified by physical appearance. For respondents younger than 25 years, variations in self-efficacy were related to gender, age, acculturation, HIV knowledge, taking over-the-counter medicines for illness, whether respondents were dating, and to a lesser degree, employment, recent serious illness, whether they believed that one could identify HIV by how one looks, and believing that illness was caused by germs. Implications for HIV prevention programs and future research are provided.
When Will It Be …?: U.S. Naval Observatory Calendar Computers
NASA Astrophysics Data System (ADS)
Bartlett, Jennifer L.; Chizek Frouard, Malynda; Lesniak, Michael V.
2016-06-01
Sensitivity to religious calendars is increasingly expected when planning activities. Consequently, the U.S. Naval Observatory (USNO) has redesigned its on-line calendar resources to allow the computation of select religious dates for specific years via an application programming interface (API). This flexible interface returns dates in JavaScript Object Notation (JSON) that can be incorporated into third-party websites or applications. Currently, the services compute Christian, Islamic, and Jewish events. The “Dates of Ash Wednesday and Easter” service (http://aa.usno.navy.mil/data/docs/easter.php) returns the dates of these two events for years after 1582 C.E. (1582 A.D.). The method of the western Christian churches is used to determine when Easter, a moveable feast, occurs. The “Dates of Islamic New Year and Ramadan” service (http://aa.usno.navy.mil/data/docs/islamic.php) returns the approximate Gregorian dates of these two events for years after 1582 C.E. (990 A.H.), and Julian dates are computed for the years 622-1582 C.E. (1-990 A.H.). The appropriate year in the Islamic calendar (anno Hegira) is also provided. Each event begins at 6 P.M. or sunset on the preceding day. These events are computed using a tabular calendar for planning purposes; in practice, the actual event is determined by observation of the appropriate new Moon. The “First Day of Passover” service (http://aa.usno.navy.mil/data/docs/passover.php) returns the Gregorian date corresponding to Nisan 15 for years after 1582 C.E. (5342 A.M.), and Julian dates are computed for the years 360-1582 C.E. (4120-5342 A.M.). The appropriate year in the Jewish calendar (anno Mundi) is also provided. Passover begins at 6 P.M. or sunset on the preceding day. On-line documentation for using the API-enabled calendar computers, including sample calls, is available (http://aa.usno.navy.mil/data/docs/api.php). The same web page also describes how to reach the Complete Sun and Moon Data for One Day, Phases of the Moon, Solar Eclipse Computer, Day and Night Across the Earth, and Apparent Disk of a Solar System Object services using API calls. An “Introduction to Calendars” (http://aa.usno.navy.mil/faq/docs/calendars.php) provides an overview of the topic and links to additional resources.
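The sketch below shows how such an API-enabled calendar computer might be called and its JSON reply read. The base URL and parameter names are placeholders; the actual endpoints and sample calls are documented on the api.php page cited above.

```python
# Sketch of calling an API-enabled calendar service and parsing its JSON reply.
# The URL and parameter names are placeholders, not the documented USNO routes.
import json
import urllib.parse
import urllib.request

BASE_URL = "https://example.org/usno/calendar"   # placeholder; see the api.php docs

def easter_date(year):
    """Ask the service for the date of Easter in a given Gregorian year."""
    params = urllib.parse.urlencode({"event": "easter", "year": year})  # assumed names
    with urllib.request.urlopen(f"{BASE_URL}?{params}") as resp:
        return json.load(resp)   # the JSON structure depends on the service

# Example: print(easter_date(2016))
```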
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aderholdt, Ferrol; Caldwell, Blake A.; Hicks, Susan Elaine
High performance computing environments are often used for a wide variety of workloads ranging from simulation, data transformation and analysis, and complex workflows to name just a few. These systems may process data at various security levels but in so doing are often enclaved at the highest security posture. This approach places significant restrictions on the users of the system even when processing data at a lower security level and exposes data at higher levels of confidentiality to a much broader population than otherwise necessary. The traditional approach of isolation, while effective in establishing security enclaves, poses significant challenges for the use of shared infrastructure in HPC environments. This report details the current state of the art in reconfigurable network enclaving through Software Defined Networking (SDN) and Network Function Virtualization (NFV) and their applicability to secure enclaves in HPC environments. SDN and NFV methods are based on a solid foundation of system-wide virtualization, the purpose of which is straightforward: the system administrator can deploy networks that are more amenable to customer needs and at the same time achieve increased scalability, making it easier to increase overall capacity as needed without negatively affecting functionality. The network administration of both the server system and the virtual sub-systems is simplified, allowing control of the infrastructure through well-defined APIs (Application Programming Interfaces). While SDN and NFV technologies offer significant promise in meeting these goals, they also provide the ability to address a significant component of the multi-tenant challenge in HPC environments, namely resource isolation. Traditional HPC systems are built upon scalable high-performance networking technologies designed to meet specific application requirements. Dynamic isolation of resources within these environments has remained difficult to achieve. SDN and NFV methodology provide us with relevant concepts and available open-standards-based APIs that isolate compute and storage resources within an otherwise common networking infrastructure. Additionally, the integration of the networking APIs within larger system frameworks such as OpenStack provides the tools necessary to establish isolated enclaves dynamically, allowing the benefits of HPC while providing a controlled security structure surrounding these systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
2014-12-04
The software serves two purposes. The first purpose of the software is to prototype the Sandia High Performance Computing Power Application Programming Interface Specification effort. The specification can be found at http://powerapi.sandia.gov. Prototypes of the specification were developed in parallel with the development of the specification. Release of the prototype will be instructive to anyone who intends to implement the specification. More specifically, our vendor collaborators will benefit from the availability of the prototype. The second is in direct support of the PowerInsight power measurement device, which was co-developed with Penguin Computing. The software provides a cluster-wide measurement capability enabled by the PowerInsight device. The software can be used by anyone who purchases a PowerInsight device. The software will allow the user to easily collect power and energy information of a node that is instrumented with PowerInsight. The software can also be used as an example prototype implementation of the High Performance Computing Power Application Programming Interface Specification.
The eNanoMapper database for nanomaterial safety information
Chomenidis, Charalampos; Doganis, Philip; Fadeel, Bengt; Grafström, Roland; Hardy, Barry; Hastings, Janna; Hegi, Markus; Jeliazkov, Vedrin; Kochev, Nikolay; Kohonen, Pekka; Munteanu, Cristian R; Sarimveis, Haralambos; Smeets, Bart; Sopasakis, Pantelis; Tsiliki, Georgia; Vorgrimmler, David; Willighagen, Egon
2015-01-01
Summary Background: The NanoSafety Cluster, a cluster of projects funded by the European Commission, identified the need for a computational infrastructure for toxicological data management of engineered nanomaterials (ENMs). Ontologies, open standards, and interoperable designs were envisioned to empower a harmonized approach to European research in nanotechnology. This setting provides a number of opportunities and challenges in the representation of nanomaterials data and the integration of ENM information originating from diverse systems. Within this cluster, eNanoMapper works towards supporting the collaborative safety assessment for ENMs by creating a modular and extensible infrastructure for data sharing, data analysis, and building computational toxicology models for ENMs. Results: The eNanoMapper database solution builds on the previous experience of the consortium partners in supporting diverse data through flexible data storage, open source components and web services. We have recently described the design of the eNanoMapper prototype database along with a summary of challenges in the representation of ENM data and an extensive review of existing nano-related data models, databases, and nanomaterials-related entries in chemical and toxicogenomic databases. This paper continues with a focus on the database functionality exposed through its application programming interface (API), and its use in visualisation and modelling. Considering the preferred community practice of using spreadsheet templates, we developed a configurable spreadsheet parser facilitating user friendly data preparation and data upload. We further present a web application able to retrieve the experimental data via the API and analyze it with multiple data preprocessing and machine learning algorithms. Conclusion: We demonstrate how the eNanoMapper database is used to import and publish online ENM and assay data from several data sources, how the “representational state transfer” (REST) API enables building user friendly interfaces and graphical summaries of the data, and how these resources facilitate the modelling of reproducible quantitative structure–activity relationships for nanomaterials (NanoQSAR). PMID:26425413
The eNanoMapper database for nanomaterial safety information.
Jeliazkova, Nina; Chomenidis, Charalampos; Doganis, Philip; Fadeel, Bengt; Grafström, Roland; Hardy, Barry; Hastings, Janna; Hegi, Markus; Jeliazkov, Vedrin; Kochev, Nikolay; Kohonen, Pekka; Munteanu, Cristian R; Sarimveis, Haralambos; Smeets, Bart; Sopasakis, Pantelis; Tsiliki, Georgia; Vorgrimmler, David; Willighagen, Egon
2015-01-01
The NanoSafety Cluster, a cluster of projects funded by the European Commission, identified the need for a computational infrastructure for toxicological data management of engineered nanomaterials (ENMs). Ontologies, open standards, and interoperable designs were envisioned to empower a harmonized approach to European research in nanotechnology. This setting provides a number of opportunities and challenges in the representation of nanomaterials data and the integration of ENM information originating from diverse systems. Within this cluster, eNanoMapper works towards supporting the collaborative safety assessment for ENMs by creating a modular and extensible infrastructure for data sharing, data analysis, and building computational toxicology models for ENMs. The eNanoMapper database solution builds on the previous experience of the consortium partners in supporting diverse data through flexible data storage, open source components and web services. We have recently described the design of the eNanoMapper prototype database along with a summary of challenges in the representation of ENM data and an extensive review of existing nano-related data models, databases, and nanomaterials-related entries in chemical and toxicogenomic databases. This paper continues with a focus on the database functionality exposed through its application programming interface (API), and its use in visualisation and modelling. Considering the preferred community practice of using spreadsheet templates, we developed a configurable spreadsheet parser facilitating user-friendly data preparation and data upload. We further present a web application able to retrieve the experimental data via the API and analyze it with multiple data preprocessing and machine learning algorithms. We demonstrate how the eNanoMapper database is used to import and publish online ENM and assay data from several data sources, how the "representational state transfer" (REST) API enables building user-friendly interfaces and graphical summaries of the data, and how these resources facilitate the modelling of reproducible quantitative structure-activity relationships for nanomaterials (NanoQSAR).
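As a hedged illustration of the kind of REST access described above, the sketch below queries a public eNanoMapper data instance for substances over HTTP using Python. The base URL and the query parameter names are assumptions based on the project's public documentation and may differ between deployments; no authentication is assumed.

```python
# Hedged sketch: querying an eNanoMapper/AMBIT-style substance resource as JSON.
# The base URL and the "search"/"pagesize" query parameters are assumptions and
# should be checked against the documentation of the instance actually used.
import requests

BASE = "https://data.enanomapper.net"   # assumed public prototype instance

def search_substances(term: str, page_size: int = 10) -> list:
    """Return substance records whose metadata matches a free-text term."""
    response = requests.get(
        f"{BASE}/substance",
        params={"search": term, "pagesize": page_size},  # parameter names assumed
        headers={"Accept": "application/json"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json().get("substance", [])

if __name__ == "__main__":
    for record in search_substances("titanium dioxide"):
        print(record.get("name"), record.get("ownerName"))
```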
The Ensembl REST API: Ensembl Data for Any Language
Yates, Andrew; Beal, Kathryn; Keenan, Stephen; McLaren, William; Pignatelli, Miguel; Ritchie, Graham R. S.; Ruffier, Magali; Taylor, Kieron; Vullo, Alessandro; Flicek, Paul
2015-01-01
Motivation: We present a Web service to access Ensembl data using Representational State Transfer (REST). The Ensembl REST server enables the easy retrieval of a wide range of Ensembl data by most programming languages, using standard formats such as JSON and FASTA while minimizing client work. We also introduce bindings to the popular Ensembl Variant Effect Predictor tool permitting large-scale programmatic variant analysis independent of any specific programming language. Availability and implementation: The Ensembl REST API can be accessed at http://rest.ensembl.org and source code is freely available under an Apache 2.0 license from http://github.com/Ensembl/ensembl-rest. Contact: ayates@ebi.ac.uk or flicek@ebi.ac.uk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25236461
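Because the service returns standard formats such as JSON, any language with an HTTP client can consume it. The short Python sketch below retrieves a sequence by stable ID; the endpoint path follows the public documentation at http://rest.ensembl.org, and the gene identifier is only an illustrative example.

```python
# Minimal sketch of querying the Ensembl REST service with the "requests"
# library; the /sequence/id endpoint follows the public documentation at
# http://rest.ensembl.org, and the gene ID below is only an example.
import requests

SERVER = "https://rest.ensembl.org"

def fetch_sequence(stable_id: str) -> dict:
    """Retrieve the sequence for an Ensembl stable ID as a JSON record."""
    response = requests.get(
        f"{SERVER}/sequence/id/{stable_id}",
        headers={"Content-Type": "application/json"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    record = fetch_sequence("ENSG00000157764")  # example human gene ID
    print(record["id"], len(record["seq"]), "bp")
```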
Java 3D Interactive Visualization for Astrophysics
NASA Astrophysics Data System (ADS)
Chae, K.; Edirisinghe, D.; Lingerfelt, E. J.; Guidry, M. W.
2003-05-01
We are developing a series of interactive 3D visualization tools that employ the Java 3D API. We have applied this approach initially to a simple 3-dimensional galaxy collision model (restricted 3-body approximation), with quite satisfactory results. Running either as an applet under Web browser control, or as a Java standalone application, this program permits real-time zooming, panning, and 3-dimensional rotation of the galaxy collision simulation under user mouse and keyboard control. We shall also discuss applications of this technology to 3-dimensional visualization for other problems of astrophysical interest such as neutron star mergers and the time evolution of element/energy production networks in X-ray bursts. *Managed by UT-Battelle, LLC, for the U.S. Department of Energy under contract DE-AC05-00OR22725.
Apis - a Digital Inventory of Archaeological Heritage Based on Remote Sensing Data
NASA Astrophysics Data System (ADS)
Doneus, M.; Forwagner, U.; Liem, J.; Sevara, C.
2017-08-01
Heritage managers are in need of dynamic spatial inventories of archaeological and cultural heritage that provide them with multipurpose tools to interactively understand information about archaeological heritage within its landscape context. Specifically, linking site information with the respective non-invasive prospection data is of increasing importance as it allows for the assessment of inherent uncertainties related to the use and interpretation of remote sensing data by the educated and knowledgeable heritage manager. APIS, the archaeological prospection information system of the Aerial Archive of the University of Vienna, is specifically designed to meet these needs. It provides storage and easy access to all data concerning aerial photographs and archaeological sites through a single GIS-based application. Furthermore, APIS has been developed in an open source environment, which allows it to be freely distributed and modified. This combination in one single open source system facilitates an easy workflow for data management, interpretation, storage, and retrieval. APIS and a sample dataset will be released free of charge under a Creative Commons license in the near future.
Students academic performance based on behavior
NASA Astrophysics Data System (ADS)
Maulida, Juwita Dien; Kariyam
2017-12-01
Data in an information system can support decision making by mining an existing data warehouse for useful information so that decisions are made correctly and accurately. The Experience API (xAPI) is one of the enabling technologies for collecting such data, so xAPI can serve as a data warehouse for various needs. One software application whose data is collected through xAPI is a learning management system (LMS). An LMS is software used in electronic learning that can handle all aspects of the learning process; with an LMS it is also possible to examine how learning proceeds and which aspects can affect learning achievement. One such aspect is each student's background, although a student with a good background is not necessarily an outstanding student, and vice versa. Therefore, action is needed to anticipate this problem. Prediction of student academic performance using the Naive Bayes algorithm obtained an accuracy of 67.7983% and an error of 32.2917%.
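For readers unfamiliar with how xAPI collects such data, the hedged Python sketch below sends a single learning-event statement to a Learning Record Store (LRS). The LRS URL and credentials are placeholders (hypothetical), while the statement structure and version header follow the public xAPI specification.

```python
# Illustrative sketch of recording a learning event as an xAPI statement.
# The LRS endpoint and credentials are hypothetical placeholders; the statement
# layout (actor/verb/object) and version header follow the xAPI specification.
import requests

LRS_URL = "https://lrs.example.org/xapi/statements"   # hypothetical LRS
AUTH = ("lrs_user", "lrs_password")                    # placeholder credentials

statement = {
    "actor": {"mbox": "mailto:student@example.edu", "name": "Example Student"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "https://lms.example.edu/course/statistics-101/quiz-1",
        "definition": {"name": {"en-US": "Quiz 1"}},
    },
    "result": {"score": {"scaled": 0.85}, "success": True},
}

response = requests.post(
    LRS_URL,
    json=statement,
    auth=AUTH,
    headers={"X-Experience-API-Version": "1.0.3"},
    timeout=30,
)
response.raise_for_status()
print("Stored statement ID(s):", response.json())
```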
PyPDB: a Python API for the Protein Data Bank.
Gilpin, William
2016-01-01
We have created a Python programming interface for the RCSB Protein Data Bank (PDB) that allows search and data retrieval for a wide range of result types, including BLAST and sequence motif queries. The API relies on the existing XML-based API and operates by creating custom XML requests from native Python types, allowing extensibility and straightforward modification. The package has the ability to perform many types of advanced search of the PDB that are otherwise only available through the PDB website. PyPDB is implemented exclusively in Python 3 using standard libraries for maximal compatibility. The most up-to-date version, including iPython notebooks containing usage tutorials, is available free-of-charge under an open-source MIT license via GitHub at https://github.com/williamgilpin/pypdb, and the full API reference is at http://williamgilpin.github.io/pypdb_docs/html/. The latest stable release is also available on PyPI. wgilpin@stanford.edu. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
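A hedged usage sketch is shown below. The Query helper reflects more recent releases of the package rather than necessarily the 2016 version described above, so the exact name and return type should be treated as assumptions and checked against the package documentation.

```python
# Hedged sketch of a PyPDB search; the Query helper and its return value are
# assumed from recent releases of the package and may differ between versions.
from pypdb import Query

def find_structures(term: str, limit: int = 5) -> list:
    """Search the PDB for entries matching a free-text term."""
    hits = Query(term).search()          # assumed to return a list of PDB IDs
    return hits[:limit] if hits else []

if __name__ == "__main__":
    for pdb_id in find_structures("hemoglobin"):
        print(pdb_id)
```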
There's An App For That: Planning Ahead for the Solar Eclipse in August 2017
NASA Astrophysics Data System (ADS)
Chizek Frouard, Malynda R.; Lesniak, Michael V.; Bell, Steve
2017-01-01
With the total solar eclipse of 2017 August 21 over the continental United States approaching, the U.S. Naval Observatory (USNO) on-line Solar Eclipse Computer can now be accessed via an Android application, available on Google Play. Over the course of the eclipse, as viewed from a specific site, several events may be visible: the beginning and ending of the eclipse (first and fourth contacts), the beginning and ending of totality (second and third contacts), the moment of maximum eclipse, sunrise, or sunset. For each of these events, the USNO Solar Eclipse 2017 Android application reports the time, Sun's altitude and azimuth, and the event's position and vertex angles. The app also lists the duration of the total phase, the duration of the eclipse, the magnitude of the eclipse, and the percent of the Sun obscured for a particular eclipse site. All of the data available in the app comes from the flexible USNO Solar Eclipse Computer Application Programming Interface (API), which produces JavaScript Object Notation (JSON) that can be incorporated into third-party Web sites or custom applications. Additional information is available in the on-line documentation (http://aa.usno.navy.mil/data/docs/api.php). For those who prefer using a traditional data input form, the local circumstances can still be requested at http://aa.usno.navy.mil/data/docs/SolarEclipses.php. In addition, the 2017 August 21 Solar Eclipse Resource page (http://aa.usno.navy.mil/data/docs/Eclipse2017.php) consolidates all of the USNO resources for this event, including a Google Map view of the eclipse track designed by Her Majesty's Nautical Almanac Office (HMNAO). Looking further ahead, a 2024 April 8 Solar Eclipse Resource page (http://aa.usno.navy.mil/data/docs/Eclipse2024.php) is also available.
jmzML, an open-source Java API for mzML, the PSI standard for MS data.
Côté, Richard G; Reisinger, Florian; Martens, Lennart
2010-04-01
We here present jmzML, a Java API for the Proteomics Standards Initiative mzML data standard. Based on the Java Architecture for XML Binding and an XPath-based, random-access XML indexer and parser, jmzML can handle arbitrarily large files in minimal memory, allowing easy and efficient processing of mzML files using the Java programming language. jmzML also automatically resolves internal XML references on-the-fly. The library (which includes a viewer) can be downloaded from http://jmzml.googlecode.com.
Rosebeck, Shaun; Lim, Megan S; Elenitoba-Johnson, Kojo S J; McAllister-Lucas, Linda M; Lucas, Peter C
2016-01-01
Lymphoma of mucosa-associated lymphoid tissue (MALT lymphoma) is the most common extranodal B cell tumor and accounts for 8% of non-Hodgkin’s lymphomas. Gastric MALT lymphoma is the best-studied example and is a prototypical neoplasm that occurs in the setting of chronic inflammation brought on by persistent infection or autoimmune disease. Cytogenetic abnormalities are commonly acquired during the course of disease and the most common is chromosomal translocation t(11;18)(q21;q21), which creates the API2-MALT1 fusion oncoprotein. t(11;18)-positive lymphomas can be clinically aggressive and have a higher rate of dissemination than t(11;18)-negative tumors. Many cancers, including MALT lymphomas, characteristically exhibit deregulated over-activation of cellular survival pathways, such as the nuclear factor-κB (NF-κB) pathway. Molecular characterization of API2-MALT1 has revealed it to be a potent activator of NF-κB, which is required for API2-MALT1-induced cellular transformation, however the mechanisms by which API2-MALT1 exerts these effects are only recently becoming apparent. The API2 moiety of the fusion binds tumor necrosis factor (TNF) receptor associated factor (TRAF) 2 and receptor interacting protein 1 (RIP1), two proteins essential for TNF receptor-induced NF-κB activation. By effectively mimicking ligand-bound TNF receptor, API2-MALT1 promotes TRAF2-dependent ubiquitination of RIP1, which then acts as a scaffold for nucleating and activating the canonical NF-κB machinery. Activation occurs, in part, through MALT1 moiety-dependent recruitment of TRAF6, which can directly modify NF-κB essential modulator, the principal downstream regulator of NF-κB. While the intrinsic MALT1 protease catalytic activity is dispensable for this canonical NF-κB signaling, it is critical for non-canonical NF-κB activation. In this regard, API2-MALT1 recognizes NF-κB inducing kinase (NIK), the essential upstream regulator of non-canonical NF-κB, and cleaves it to generate a stable, constitutively active fragment. Thus, API2-MALT1 harnesses multiple unique pathways to achieve deregulated NF-κB activation. Emerging data from our group and others have also detailed additional gain-of-function activities of API2-MALT1 that extend beyond NF-κB activation. Specifically, API2-MALT1 recruits and subverts multiple other signaling factors, including LIM domain and actin-binding protein 1 (LIMA1) and Smac/DIABLO. Like NIK, LIMA1 represents a unique substrate for API2-MALT1 protease activity, but unlike NIK, its cleavage sets in motion a major NF-κB-independent pathway for promoting oncogenesis. In this review, we highlight the most recent results characterizing these unique and diverse gain-of-function activities of API2-MALT1 and how they contribute to lymphomagenesis. PMID:26981201
The Matchmaker Exchange: a platform for rare disease gene discovery.
Philippakis, Anthony A; Azzariti, Danielle R; Beltran, Sergi; Brookes, Anthony J; Brownstein, Catherine A; Brudno, Michael; Brunner, Han G; Buske, Orion J; Carey, Knox; Doll, Cassie; Dumitriu, Sergiu; Dyke, Stephanie O M; den Dunnen, Johan T; Firth, Helen V; Gibbs, Richard A; Girdea, Marta; Gonzalez, Michael; Haendel, Melissa A; Hamosh, Ada; Holm, Ingrid A; Huang, Lijia; Hurles, Matthew E; Hutton, Ben; Krier, Joel B; Misyura, Andriy; Mungall, Christopher J; Paschall, Justin; Paten, Benedict; Robinson, Peter N; Schiettecatte, François; Sobreira, Nara L; Swaminathan, Ganesh J; Taschner, Peter E; Terry, Sharon F; Washington, Nicole L; Züchner, Stephan; Boycott, Kym M; Rehm, Heidi L
2015-10-01
There are few better examples of the need for data sharing than in the rare disease community, where patients, physicians, and researchers must search for "the needle in a haystack" to uncover rare, novel causes of disease within the genome. Impeding the pace of discovery has been the existence of many small siloed datasets within individual research or clinical laboratory databases and/or disease-specific organizations, hoping for serendipitous occasions when two distant investigators happen to learn they have a rare phenotype in common and can "match" these cases to build evidence for causality. However, serendipity has never proven to be a reliable or scalable approach in science. As such, the Matchmaker Exchange (MME) was launched to provide a robust and systematic approach to rare disease gene discovery through the creation of a federated network connecting databases of genotypes and rare phenotypes using a common application programming interface (API). The core building blocks of the MME have been defined and assembled. Three MME services have now been connected through the API and are available for community use. Additional databases that support internal matching are anticipated to join the MME network as it continues to grow. © 2015 WILEY PERIODICALS, INC.
The Matchmaker Exchange: A Platform for Rare Disease Gene Discovery
Philippakis, Anthony A.; Azzariti, Danielle R.; Beltran, Sergi; Brookes, Anthony J.; Brownstein, Catherine A.; Brudno, Michael; Brunner, Han G.; Buske, Orion J.; Carey, Knox; Doll, Cassie; Dumitriu, Sergiu; Dyke, Stephanie O.M.; den Dunnen, Johan T.; Firth, Helen V.; Gibbs, Richard A.; Girdea, Marta; Gonzalez, Michael; Haendel, Melissa A.; Hamosh, Ada; Holm, Ingrid A.; Huang, Lijia; Hurles, Matthew E.; Hutton, Ben; Krier, Joel B.; Misyura, Andriy; Mungall, Christopher J.; Paschall, Justin; Paten, Benedict; Robinson, Peter N.; Schiettecatte, François; Sobreira, Nara L.; Swaminathan, Ganesh J.; Taschner, Peter E.; Terry, Sharon F.; Washington, Nicole L.; Züchner, Stephan; Boycott, Kym M.; Rehm, Heidi L.
2015-01-01
There are few better examples of the need for data sharing than in the rare disease community, where patients, physicians, and researchers must search for “the needle in a haystack” to uncover rare, novel causes of disease within the genome. Impeding the pace of discovery has been the existence of many small siloed datasets within individual research or clinical laboratory databases and/or disease-specific organizations, hoping for serendipitous occasions when two distant investigators happen to learn they have a rare phenotype in common and can “match” these cases to build evidence for causality. However, serendipity has never proven to be a reliable or scalable approach in science. As such, the Matchmaker Exchange (MME) was launched to provide a robust and systematic approach to rare disease gene discovery through the creation of a federated network connecting databases of genotypes and rare phenotypes using a common application programming interface (API). The core building blocks of the MME have been defined and assembled. Three MME services have now been connected through the API and are available for community use. Additional databases that support internal matching are anticipated to join the MME network as it continues to grow. PMID:26295439
The Matchmaker Exchange: A Platform for Rare Disease Gene Discovery
Philippakis, Anthony A.; Azzariti, Danielle R.; Beltran, Sergi; ...
2015-09-17
There are few better examples of the need for data sharing than in the rare disease community, where patients, physicians, and researchers must search for "the needle in a haystack" to uncover rare, novel causes of disease within the genome. Impeding the pace of discovery has been the existence of many small siloed datasets within individual research or clinical laboratory databases and/or disease-specific organizations, hoping for serendipitous occasions when two distant investigators happen to learn they have a rare phenotype in common and can "match" these cases to build evidence for causality. However, serendipity has never proven to be a reliable or scalable approach in science. As such, the Matchmaker Exchange (MME) was launched to provide a robust and systematic approach to rare disease gene discovery through the creation of a federated network connecting databases of genotypes and rare phenotypes using a common application programming interface (API). The core building blocks of the MME have been defined and assembled. In conclusion, three MME services have now been connected through the API and are available for community use. Additional databases that support internal matching are anticipated to join the MME network as it continues to grow.
Griss, Johannes; Reisinger, Florian; Hermjakob, Henning; Vizcaíno, Juan Antonio
2012-03-01
We here present the jmzReader library: a collection of Java application programming interfaces (APIs) to parse the most commonly used peak list and XML-based mass spectrometry (MS) data formats: DTA, MS2, MGF, PKL, mzXML, mzData, and mzML (based on the already existing API jmzML). The library is optimized to be used in conjunction with mzIdentML, the recently released standard data format for reporting protein and peptide identifications, developed by the HUPO proteomics standards initiative (PSI). mzIdentML files do not contain spectra data but contain references to different kinds of external MS data files. As a key functionality, all parsers implement a common interface that supports the various methods used by mzIdentML to reference external spectra. Thus, when developing software for mzIdentML, programmers no longer have to support multiple MS data file formats but only this one interface. The library (which includes a viewer) is open source and, together with detailed documentation, can be downloaded from http://code.google.com/p/jmzreader/. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
The Matchmaker Exchange: A Platform for Rare Disease Gene Discovery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Philippakis, Anthony A.; Azzariti, Danielle R.; Beltran, Sergi
There are few better examples of the need for data sharing than in the rare disease community, where patients, physicians, and researchers must search for "the needle in a haystack" to uncover rare, novel causes of disease within the genome. Impeding the pace of discovery has been the existence of many small siloed datasets within individual research or clinical laboratory databases and/or disease-specific organizations, hoping for serendipitous occasions when two distant investigators happen to learn they have a rare phenotype in common and can "match" these cases to build evidence for causality. However, serendipity has never proven to be a reliable or scalable approach in science. As such, the Matchmaker Exchange (MME) was launched to provide a robust and systematic approach to rare disease gene discovery through the creation of a federated network connecting databases of genotypes and rare phenotypes using a common application programming interface (API). The core building blocks of the MME have been defined and assembled. In conclusion, three MME services have now been connected through the API and are available for community use. Additional databases that support internal matching are anticipated to join the MME network as it continues to grow.
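To make the federated matching concrete, the hedged Python sketch below submits a single query to one MME service endpoint. The server URL and token are placeholders (hypothetical), and the media type and field names follow the publicly documented MME API draft, so they should be verified against the current specification.

```python
# Hedged sketch of an MME match request; the server URL and token are
# placeholders, and the media type and field names follow the publicly
# documented MME API draft rather than any specific production service.
import requests

MME_SERVER = "https://mme.example.org/match"          # hypothetical endpoint
HEADERS = {
    "Content-Type": "application/vnd.ga4gh.matchmaker.v1.0+json",
    "Accept": "application/vnd.ga4gh.matchmaker.v1.0+json",
    "X-Auth-Token": "replace-with-issued-token",      # placeholder credential
}

query = {
    "patient": {
        "id": "example-patient-1",
        "contact": {"name": "Referring Clinician",
                    "href": "mailto:clinician@example.org"},
        "features": [{"id": "HP:0001250"}],              # phenotype (HPO term)
        "genomicFeatures": [{"gene": {"id": "SCN1A"}}],  # candidate gene
    }
}

response = requests.post(MME_SERVER, json=query, headers=HEADERS, timeout=60)
response.raise_for_status()
for match in response.json().get("results", []):
    print(match["patient"].get("id"), match.get("score"))
```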
Boulos, Maged N Kamel
2005-01-01
This eye-opener article aims at introducing the health GIS community to the emerging online consumer geoinformatics services from Google and Microsoft (MSN), and their potential utility in creating custom online interactive health maps. Using the programmable interfaces provided by Google and MSN, we created three interactive demonstrator maps of England's Strategic Health Authorities. These can be browsed online at – Google Maps API (Application Programming Interface) version, – Google Earth KML (Keyhole Markup Language) version, and – MSN Virtual Earth Map Control version. Google and MSN's worldwide distribution of "free" geospatial tools, imagery, and maps is to be commended as a significant step towards the ultimate "wikification" of maps and GIS. A discussion is provided of these emerging online mapping trends, their expected future implications and development directions, and associated individual privacy, national security and copyrights issues. Although ESRI have announced their planned response to Google (and MSN), it remains to be seen how their envisaged plans will materialize and compare to the offerings from Google and MSN, and also how Google and MSN mapping tools will further evolve in the near future. PMID:16176577
Lakshman, Jay P; Cao, Yu; Kowalski, James; Serajuddin, Abu T M
2008-01-01
Formulation of active pharmaceutical ingredients (API) in high-energy amorphous forms is a common strategy to enhance solubility, dissolution rate and, consequently, oral bioavailability of poorly water-soluble drugs. Amorphous APIs are, however, susceptible to recrystallization and, therefore, there is a need to physically stabilize them as solid dispersions in polymeric carriers. Hot melt extrusion has in recent years gained wide acceptance as a method of choice for the preparation of solid dispersions. There is a potential that the API, the polymer or both may degrade if excessively high temperature is needed in the melt extrusion process, especially when the melting point of the API is high. This report details a novel method where the API was first converted to an amorphous form by solvent evaporation and then melt-extruded with a suitable polymer at a drug load of at least 20% w/w. By this means, melt extrusion could be performed much below the melting temperature of the drug substance. Since the glass transition temperature of the amorphous drug was lower than that of the polymer used, the drug substance itself served as the plasticizer for the polymer. The addition of surfactants in the matrix enhanced dispersion and subsequent dissolution of the drug in aqueous media. The amorphous melt extrusion formulations showed higher bioavailability than formulations containing the crystalline API. There was no conversion of amorphous solid to its crystalline form during accelerated stability testing of dosage forms.
Miksík, Ivan; Mikulíková, Katerina; Pácha, Jirí; Kucka, Marek; Deyl, Zdenek
2004-02-05
A high-performance liquid chromatography-atmospheric pressure ionization-electrospray ionization mass spectrometry (HPLC-API-ESI-MS) method was developed for the analysis of steroids in a study of steroid-converting enzymes. Separations were done on a Zorbax Eclipse XDB-C18 column (eluted with a linear methanol-water-acetic acid gradient) and identification of the steroids involved was done by API-ESI-MS using positive ion mode and extracted ion analysis. The applicability of the present method for studying steroid metabolism was proven in assaying two steroid-converting enzymes (20beta-hydroxysteroid dehydrogenase and 11beta-hydroxysteroid dehydrogenase) in various biological samples (rat and chicken intestine, chicken oviduct).
Sudhinaraset, May; Ling, Irving; To, Tu My; Melo, Jason; Quach, Thu
2017-07-01
There are currently 1.5 million undocumented Asians and Pacific Islanders (APIs) in the US. Undocumented API young adults, in particular, come of age in a challenging political and social climate, but little is known about their health outcomes. To our knowledge, this is the first study to assess the psychosocial needs and health status of API undocumented young adults. Guided by social capital theory, this qualitative study describes the social context of API undocumented young adults (ages 18-31), including community and government perceptions, and how social relationships influence health. This study was conducted in Northern California and included four focus group discussions (FGDs) and 24 in-depth interviews (IDIs), with 32 unique participants total. FGDs used purposeful sampling by gender (two male and two female discussions) and education status (in school and out-of-school). Findings suggest low bonding and bridging social capital. Results indicate that community distrust is high, even within the API community, due to high levels of exploitation, discrimination, and threats of deportation. Participants described how documentation status is a barrier in accessing health services, particularly mental health and sexual and reproductive health services. This study identifies trusted community groups and discusses recommendations for future research, programs, and policies. Copyright © 2017 The Authors. Published by Elsevier Ltd.. All rights reserved.
Gas chromatography coupled to atmospheric pressure ionization mass spectrometry (GC-API-MS): review.
Li, Du-Xin; Gan, Lin; Bronja, Amela; Schmitz, Oliver J
2015-09-03
Although the coupling of GC/MS with atmospheric pressure ionization (API) was first reported in the 1970s, interest in coupling GC with atmospheric pressure ion sources has expanded in the last decade. A "soft" ion source that preserves the highly diagnostic molecular ion is desirable, compared to "hard" ionization techniques such as electron ionization (EI) in traditional GC/MS, which fragment the molecule extensively. These API sources include atmospheric pressure chemical ionization (APCI), atmospheric pressure photoionization (APPI), atmospheric pressure laser ionization (APLI), electrospray ionization (ESI) and low temperature plasma (LTP). This review discusses the advantages and drawbacks of this analytical platform. After an introduction to atmospheric pressure ionization, the review gives an overview of the history and explains the mechanisms of the various atmospheric pressure ionization techniques used in combination with GC, such as APCI, APPI, APLI, ESI and LTP. New developments in ion source geometry, ion source miniaturization and multipurpose ion source constructions are also discussed, and a comparison between GC-FID, GC-EI-MS and GC-API-MS shows the advantages and drawbacks of these techniques. The review ends with an overview of applications realized with GC-API-MS. Copyright © 2015. Published by Elsevier B.V.
Codec-on-Demand Based on User-Level Virtualization
NASA Astrophysics Data System (ADS)
Zhang, Youhui; Zheng, Weimin
At work, at home, and in some public places, a desktop PC is usually available nowadays. Therefore, it is important for users to be able to play various videos on different PCs smoothly, but the diversity of codec types complicates the situation. Although some mainstream media players can try to download the needed codec automatically, this may fail for average users because installing the codec usually requires administrator privileges to complete, while the user may not be the owner of the PC. We believe an ideal solution should work without users' intervention, and need no special privileges. This paper proposes such a user-friendly, program-transparent solution for Windows-based media players. It runs the media player in a user-mode virtualization environment, and then downloads the needed codec on-the-fly. Because of API (Application Programming Interface) interception, some resource-accessing API calls from the player will be redirected to the downloaded codec resources. Then from the viewpoint of the player, the necessary codec exists locally and it can handle the video smoothly, although neither system registry nor system folders was modified during this process. Besides convenience, the principle of least privilege is maintained and the host system is left clean. This paper completely analyzes the technical issues and presents such a prototype which can work with DirectShow-compatible players. Performance tests show that the overhead is negligible. Moreover, our solution conforms to the Software-As-A-Service (SaaS) mode, which is very promising in the Internet era.
Ytow, Nozomi
2016-01-01
The Species API of the Global Biodiversity Information Facility (GBIF) provides public access to taxonomic data aggregated from multiple data sources. Each data source follows its own classification which can be inconsistent with classifications from other sources. Even with a reference classification e.g. the GBIF Backbone taxonomy, a comprehensive method to compare classifications in the data aggregation is essential, especially for non-expert users. A Java application was developed to compare multiple taxonomies graphically using classification data acquired from GBIF's ChecklistBank via the GBIF Species API. It uses a table to display taxonomies where each column represents a taxonomy under comparison, with an aligner column to organise taxa by name. Each cell contains the name of a taxon if the classification in that column contains the name. Each column also has a cell showing the hierarchy of the taxonomy by a folder metaphor where taxa are aligned and synchronised in the aligner column. A set of those comparative tables shows taxa categorised by relationship between taxonomies. The result set is also available as tables in an Excel format file.
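For orientation, the short Python sketch below resolves a scientific name against the GBIF Backbone through the public Species API and prints its higher classification. The endpoint path follows the public developer documentation and requires no authentication; the response fields shown are only the commonly returned ones.

```python
# Minimal sketch of matching a name against the GBIF Backbone taxonomy via the
# public Species API (https://www.gbif.org/developer/species); no auth needed.
import requests

API = "https://api.gbif.org/v1/species"

def match_name(name: str) -> dict:
    """Fuzzy-match a scientific name against the GBIF Backbone taxonomy."""
    response = requests.get(f"{API}/match", params={"name": name}, timeout=30)
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    result = match_name("Puma concolor")
    for rank in ("kingdom", "phylum", "class", "order", "family", "genus"):
        print(f"{rank:>8}: {result.get(rank)}")
```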
Simulating storage part of application with Simgrid
NASA Astrophysics Data System (ADS)
Wang, Cong
2017-10-01
We designed a file system simulation and visualization system that uses the SimGrid API and visualization techniques to help users understand and improve the file system portion of their applications. The core of the simulator is the API provided by SimGrid; cluefs traces and captures the application's I/O operations. Running the simulator on an application generates an output visualization file that shows the proportion of I/O activity and its time series. Users can change parameters of the storage system in the configuration file, such as read and write bandwidth, adjust the storage strategy, and test performance, which makes it much easier to optimize the storage system. We have tested all aspects of the simulator, and the results suggest that its output is credible.
A Hybrid Task Graph Scheduler for High Performance Image Processing Workflows.
Blattner, Timothy; Keyrouz, Walid; Bhattacharyya, Shuvra S; Halem, Milton; Brady, Mary
2017-12-01
Designing applications for scalability is key to improving their performance in hybrid and cluster computing. Scheduling code to utilize parallelism is difficult, particularly when dealing with data dependencies, memory management, data motion, and processor occupancy. The Hybrid Task Graph Scheduler (HTGS) is an abstract execution model, framework, and API that improves programmer productivity when implementing hybrid workflows for multi-core and multi-GPU systems. HTGS manages dependencies between tasks, represents CPU and GPU memories independently, overlaps computations with disk I/O and memory transfers, keeps multiple GPUs occupied, and uses all available compute resources. Through these abstractions, data motion and memory are explicit; this makes data locality decisions more accessible. To demonstrate the HTGS application program interface (API), we present implementations of two example algorithms: (1) a matrix multiplication that shows how easily task graphs can be used; and (2) a hybrid implementation of microscopy image stitching that reduces code size by ≈ 43% compared to a manually coded hybrid workflow implementation and showcases the minimal overhead of task graphs in HTGS. Both of the HTGS-based implementations show good performance. In image stitching the HTGS implementation achieves similar performance to the hybrid workflow implementation. Matrix multiplication with HTGS achieves 1.3× and 1.8× speedup over the multi-threaded OpenBLAS library for 16k × 16k and 32k × 32k size matrices, respectively.
Cloud-Based Speech Technology for Assistive Technology Applications (CloudCAST).
Cunningham, Stuart; Green, Phil; Christensen, Heidi; Atria, José Joaquín; Coy, André; Malavasi, Massimiliano; Desideri, Lorenzo; Rudzicz, Frank
2017-01-01
The CloudCAST platform provides a series of speech recognition services that can be integrated into assistive technology applications. The platform and the services provided by the public API are described. Several exemplar applications have been developed to demonstrate the platform to potential developers and users.
Asymptomatic plasmodial infection in Colombian pregnant women.
Carmona-Fonseca, Jaime; Agudelo, Olga M; Arango, Eliana M
2017-08-01
Information about asymptomatic plasmodial infection (API) is scarce worldwide, and the current antimalarial program goals (control, elimination, and eradication) demand that this evidence be well documented in different populations and malaria transmission settings. This study aimed to measure the prevalence of API in Colombian pregnant women at delivery. A retrospective prevalence survey was used. Women were recruited at the hospital obstetric facility in each of the municipalities of Turbo and Necoclí in Antioquia department, and Puerto Libertador in Córdoba department. Malaria infection was tested by thick blood smear (TBS) and real-time quantitative PCR (qPCR). Ninety-six pregnant women at delivery were studied: 95% were asymptomatic (91/96), 45% had API by qPCR (41/91), and only 8% (7/91) had API by microscopy. The prevalence of submicroscopic infections (TBS negative and qPCR positive) was very high: 37% (34/91) in asymptomatic women and 41% (39/96) in all women studied (91 asymptomatic and 5 symptomatic). The prevalence of API in Colombian pregnant women is much higher than expected for a country that does not have the level of malaria transmission of Sub-Saharan African countries. Copyright © 2017 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Junghyun; Gangwon, Jo; Jaehoon, Jung
Applications written solely in OpenCL or CUDA cannot execute on a cluster as a whole. Most previous approaches that extend these programming models to clusters are based on a common idea: designating a centralized host node and coordinating the other nodes with the host for computation. However, the centralized host node is a serious performance bottleneck when the number of nodes is large. In this paper, we propose a scalable and distributed OpenCL framework called SnuCL-D for large-scale clusters. SnuCL-D's remote device virtualization provides an OpenCL application with an illusion that all compute devices in a cluster are confined in a single node. To reduce the amount of control-message and data communication between nodes, SnuCL-D replicates the OpenCL host program execution and data in each node. We also propose a new OpenCL host API function and a queueing optimization technique that significantly reduce the overhead incurred by the previous centralized approaches. To show the effectiveness of SnuCL-D, we evaluate SnuCL-D with a microbenchmark and eleven benchmark applications on a large-scale CPU cluster and a medium-scale GPU cluster.
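SnuCL-D itself is not sketched here, but as a point of reference the following Python snippet (using the pyopencl bindings) enumerates the platforms and devices visible to an ordinary, single-node OpenCL host program, which is the host-side view that SnuCL-D's remote device virtualization extends to a whole cluster. This is a generic OpenCL illustration, not SnuCL-D's API.

```python
# Generic OpenCL host-side sketch (not SnuCL-D): list the platforms and compute
# devices visible on a single node, using the pyopencl bindings.
import pyopencl as cl

for platform in cl.get_platforms():
    print("Platform:", platform.name)
    for device in platform.get_devices():
        print("  Device:", device.name,
              "| type:", cl.device_type.to_string(device.type),
              "| compute units:", device.max_compute_units)
```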
Faghih, Mohammad Mehdi; Moghaddam, Mohsen Ebrahimi
2011-01-01
Although much research in the area of Wireless Multimedia Sensor Networks (WMSNs) has been done in recent years, the programming of sensor nodes is still time-consuming and tedious. It requires expertise in low-level programming, mainly because of the use of resource constrained hardware and also the low-level API provided by current operating systems. The code of the resulting systems has typically no clear separation between application and system logic. This minimizes the possibility of reusing code and often leads to the necessity of major changes when the underlying platform is changed. In this paper, we present a service oriented middleware named SOMM to support application development for WMSNs. The main goal of SOMM is to enable the development of modifiable and scalable WMSN applications. A network which uses the SOMM is capable of providing multiple services to multiple clients at the same time with the specified Quality of Service (QoS). SOMM uses a virtual machine with the ability to support mobile agents. Services in SOMM are provided by mobile agents, and SOMM also provides a tuple space on each node which agents can use to communicate with each other. PMID:22346646
DOE Office of Scientific and Technical Information (OSTI.GOV)
Laros III, James H.; DeBonis, David; Grant, Ryan
Measuring and controlling the power and energy consumption of high performance computing systems by various components in the software stack is an active research area [13, 3, 5, 10, 4, 21, 19, 16, 7, 17, 20, 18, 11, 1, 6, 14, 12]. Implementations in lower level software layers are beginning to emerge in some production systems, which is very welcome. To be most effective, a portable interface to measurement and control features would significantly facilitate participation by all levels of the software stack. We present a proposal for a standard power Application Programming Interface (API) that endeavors to cover the entire software space, from generic hardware interfaces to the input from the computer facility manager.
Performance analysis of a proposed tightly-coupled medical instrument network based on CAN protocol.
Mujumdar, Shantanu; Thongpithoonrat, Pongnarin; Gurkan, D; McKneely, Paul K; Chapman, Frank M; Merchant, Fatima
2010-01-01
Advances in medical devices and health care has been phenomenal during the recent years. Although medical device manufacturers have been improving their instruments, network connection of these instruments still rely on proprietary technologies. Even if the interface has been provided by the manufacturer (e.g., RS-232, USB, or Ethernet coupled with a proprietary API), there is no widely-accepted uniform data model to access data of various bedside instruments. There is a need for a common standard which allows for internetworking with the medical devices from different manufacturers. ISO/IEEE 11073 (X73) is a standard attempting to unify the interfaces of all medical devices. X73 defines a client access mechanism that would be implemented into the communication controllers (residing between an instrument and the network) in order to access/network patient data. On the other hand, MediCAN™ technology suite has been demonstrated with various medical instruments to achieve interfacing and networking with a similar goal in its open standardization approach. However, it provides a more generic definition for medical data to achieve flexibility for networking and client access mechanisms. The instruments are in turn becoming more sophisticated; however, the operation of an instrument is still expected to be locally done by authorized medical personnel. Unfortunately, each medical instrument has its unique proprietary API (application programming interface - if any) to provide automated and electronic access to monitoring data. Integration of these APIs requires an agreement with the manufacturers towards realization of interoperable health care networking. As long as the interoperability of instruments with a network is not possible, ubiquitous access to patient status is limited only to manual entry based systems. This paper demonstrates an attempt to realize an interoperable medical instrument interface for networking using MediCAN technology suite as an open standard.
OpenAQ: A Platform to Aggregate and Freely Share Global Air Quality Data
NASA Astrophysics Data System (ADS)
Hasenkopf, C. A.; Flasher, J. C.; Veerman, O.; DeWitt, H. L.
2015-12-01
Thousands of ground-based air quality monitors around the world publicly publish real-time air quality data; however, researchers and the public do not have access to this information in the ways most useful to them. Often, air quality data are posted on obscure websites showing only current values, are programmatically inaccessible, and/or are in inconsistent data formats across sites. Yet, historical and programmatic access to such a global dataset would be transformative to several scientific fields, from epidemiology to low-cost sensor technologies to estimates of ground-level aerosol by satellite retrievals. To increase accessibility and standardize this disparate dataset, we have built OpenAQ, an innovative, open platform created by a group of scientists and open data programmers. The source code for the platform is viewable at github.com/openaq. Currently, we are aggregating, storing, and making publicly available real-time air quality data (PM2.5, PM10, SO2, NO2, and O3) via an Application Program Interface (API). We will present the OpenAQ platform, which currently has the following specific capabilities: a continuous ingest mechanism for some of the most polluted cities, generalizable to more sources; an API providing data-querying, including ability to filter by location, measurement type and value and date, as well as custom sort options; and a generalized, chart-based visualization tool to explore data accessible via the API. At this stage, we are seeking wider participation and input from multiple research communities in expanding our data retrieval sites, standardizing our protocols, receiving feedback on quality issues, and creating tools that can be built on top of this open platform.
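As an illustration of programmatic access, the hedged Python sketch below queries the measurements endpoint as the API was exposed around the time of this abstract (v1). Later API versions changed the paths and may require an API key, so treat the URL and parameters as assumptions to verify against the current documentation.

```python
# Hedged sketch of querying the OpenAQ v1 measurements endpoint; the path and
# parameter names reflect the early public API and may differ in later versions.
import requests

API = "https://api.openaq.org/v1/measurements"

params = {"city": "Delhi", "parameter": "pm25", "limit": 5}
response = requests.get(API, params=params, timeout=30)
response.raise_for_status()

for item in response.json().get("results", []):
    print(item["date"]["utc"], item["location"], item["value"], item["unit"])
```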
Adapting Stanford's Chronic Disease Self-Management Program to Hawaii's Multicultural Population
ERIC Educational Resources Information Center
Tomioka, Michiyo; Braun, Kathryn L.; Compton, Merlita; Tanoue, Leslie
2012-01-01
Purpose of the Study: Stanford's Chronic Disease Self-Management Program (CDSMP) has been proven to increase patients' ability to manage distress. We describe how we replicated CDSMP in Asian and Pacific Islander (API) communities. Design and Methods: We used the "track changes" tool to deconstruct CDSMP into its various components…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-07
... production is accounted for in API's certification program and that the follow-up audit program is showing... Selective Catalytic Reduction Technology AGENCY: Environmental Protection Agency (EPA). ACTION: Request for... reduction (SCR) technology to meet emission standards for oxides of nitrogen (NOx). This draft guidance...
Knösel, Michael; Ellenberger, David; Göldner, Yvonne; Sandoval, Paulo; Wiechmann, Dirk
2015-04-15
Sealant application during fixed-appliance orthodontic treatment for enamel protection is common; however, reliable data on its durability in vivo are rare. This study aims at assessing the durability of a sealant (OpalSeal, Ultradent) for protection against white-spot lesion formation in orthodontic patients over 26 weeks in vivo, taking into account the provision or absence of adequate oral hygiene. We tested the null hypotheses of (1) no significant abatement of the sealant after 26 weeks in fixed orthodontic treatment compared to baseline, and (2) no significant influence of the factor of brushing and oral hygiene (as screened by the approximal plaque index, API) on the abatement of the sealant. Integrity and abatement of OpalSeal applied directly following bracketing was assessed in thirty-six consecutive patients (n(teeth) = 796) undergoing orthodontic treatment with fixed appliances (male/female 12/24; mean age/SD 14.4/1.33 Y). Preservation of the fluorescing sealant was assessed with a black-light lamp, using a classification conceived in analogy to the ARI index (3 = sealant completely preserved; 2 = >50% preserved; 1 = <50% preserved; 0 = no sealant observable) immediately following application (baseline, T0), and after 2 (T1), 8 (T2), 14 (T3), 20 (T4) and 26 weeks (T5). API was assessed at T0 and T1. Statistical analysis was by non-parametric repeated-measures ANOVA (α = 5%, power >80%). At baseline, 43.4% of teeth had a positive API. Oral hygiene deteriorated significantly after bracketing (T1, 53%). Null hypothesis (1) was rejected, while (2) was accepted: mean scores of both the well-brushed and non-brushed anterior teeth fell below "1" at T3 (week 14). Despite slightly better preservation of the sealant before and after T3 in insufficiently brushed (API-positive) teeth, this finding was not statistically significant. A single application of OpalSeal is unlikely to last throughout the entire fixed-appliance treatment stage. On average, re-application of the sealant can be expected to be necessary after 3.5 months (week 14) in treatment.
DAS: A Data Management System for Instrument Tests and Operations
NASA Astrophysics Data System (ADS)
Frailis, M.; Sartor, S.; Zacchei, A.; Lodi, M.; Cirami, R.; Pasian, F.; Trifoglio, M.; Bulgarelli, A.; Gianotti, F.; Franceschi, E.; Nicastro, L.; Conforti, V.; Zoli, A.; Smart, R.; Morbidelli, R.; Dadina, M.
2014-05-01
The Data Access System (DAS) is a data management software system, providing a reusable solution for the storage of data acquired both from telescopes and auxiliary data sources during the instrument development phases and operations. It is part of the Customizable Instrument WorkStation system (CIWS-FW), a framework for the storage, processing and quick-look at the data acquired from scientific instruments. The DAS provides a data access layer mainly targeted to software applications: quick-look displays, pre-processing pipelines and scientific workflows. It is logically organized in three main components: an intuitive and compact Data Definition Language (DAS DDL) in XML format, aimed at user-defined data types; an Application Programming Interface (DAS API), automatically adding classes and methods supporting the DDL data types, and providing an object-oriented query language; a data management component, which maps the metadata of the DDL data types in a relational Data Base Management System (DBMS), and stores the data in a shared (network) file system. With the DAS DDL, developers define the data model for a particular project, specifying for each data type the metadata attributes, the data format and layout (if applicable), and named references to related or aggregated data types. Together with the DDL user-defined data types, the DAS API acts as the only interface to store, query and retrieve the metadata and data in the DAS system, providing both an abstract interface and a data model specific one in C, C++ and Python. The mapping of metadata in the back-end database is automatic and supports several relational DBMSs, including MySQL, Oracle and PostgreSQL.
Cho, Jaehyun; Jeon, Ikseong; Kim, Seong Yun; Lim, Soonho; Jho, Jae Young
2017-08-23
A series of polyketone (PK) nanocomposite films with varying content of noncovalently functionalized graphene nanoplatelet with 1-aminopyrene (GNP/APy) is prepared by solution blending with a solvent of hexafluoro-2-propanol. GNP/APy, prepared by a facile method, can effectively induce specific interaction such as hydrogen bonding between the amine functional group of GNP/APy and the carbonyl functional group of the PK matrix. With comparison of GNP and GNP/Py as reference materials, intensive investigation on filler-matrix interaction is achieved. In addition, the dispersion state of the functionalized GNP (f-GNPs; GNP/Py and GNP/APy) in the PK matrix is analyzed by three-dimensional nondestructive X-ray microcomputed tomography, and the increased dispersion state of those fillers results in significant improvement in the water vapor transmission rate (WVTR). The enhancement in WVTR of the PK/GNP/APy nanocomposite film at 1 wt % loading of filler leads to a barrier performance approximately 2 times larger compared to that of PK/GNP nanocomposite film and an approximately 92% reduction in WVTR compared to the case of pristine PK film. We expect that this facile method of graphene functionalization to enhance graphene dispersibility as well as interfacial interaction with the polymer matrix will be widely utilized to expand the potential of graphene materials to barrier film applications.
Alexander, Anthony J; Ma, Lianjia
2009-02-27
This paper focuses on the application of RPLC x RPLC to pharmaceutical analysis and addresses the specific problem of separating co-eluting impurities/degradation products that may be "hidden" within the peak envelope of the active pharmaceutical ingredient (API) and thus may escape detection by conventional methods. A comprehensive two-dimensional liquid chromatograph (LC x LC) was constructed from commercially available HPLC equipment. This system utilizes two independently configurable 2nd dimension binary pumping systems to deliver independent flow rates, gradient profiles and mobile phase compositions to dual Fused-Core secondary columns. Very fast gradient separations (30s total cycle time) were achieved at ambient temperature without excessive backpressure and without compromising optimal 1st dimension sampling rates. The operation of the interface is demonstrated for the analysis of a 1mg/ml standard mixture containing 0.05% of a minor component. The practicality of using RPLC x RPLC for the analysis of actual co-eluting pharmaceutical degradation products, by exploiting pH-induced changes in selectivity, is also demonstrated using a three component mixture. This mixture (an API, an oxidation product of the API at 1.0%, w/w, and a photo degradant of the API at 0.5%, w/w) was used to assess the stability indicating nature of an established LC method for analysis of the API.
Integrating R with GIS for innovative geocomputing - the examples of RQGIS and RSAGA
NASA Astrophysics Data System (ADS)
Muenchow, Jannes; Schratz, Patrick; Bangs, Donovan; Brenning, Alexander
2017-04-01
While Geographical information systems (GIS) are good at efficiently manipulating and processing large amounts of geospatial data, the programming language R excels in (geo-)statistical analyses. Thus, bringing GIS algorithms to the R console combines the best of the two worlds, and paves the way for innovative geocomputing. To promote this approach, we will contrast the new RQGIS package with the established RSAGA package in terms of architecture, functionality and ease-of-use. Both packages use already existing Application Programming Interfaces (API), namely the QGIS Python API and SAGA API, to access GIS functionality from within R. Overall, RQGIS has the advantage of providing a unified interface to several GIS toolboxes (GRASS, SAGA, GDAL, etc.) bringing more than 1000 geocomputing algorithms to the R console (though only a subset of the full functionality of a specific third-party provider might be available). To further support the unified interface, QGIS automatically converts the input data into the formats supported by the respective third-party module. Moreover, RQGIS is easier to use than RSAGA due to special convenience functions (open_help, get_args_man, run_qgis). Nevertheless, the experienced SAGA user will most likely prefer the RSAGA package since it lets the user access all SAGA algorithms for a wide range of SAGA versions (currently 2.0.4 - 2.2.3). Additionally, RSAGA includes numerous user-friendly wrapper functions with arguments and meaningful default values (e.g., rsaga.slope). What is more, RSAGA provides the user with special geocomputing R functions, i.e. functions which solely run in R without using SAGA in the background (e.g., pick.from.ascii.grid, grid.predict and multi.local.function). To demonstrate the advantages of each package, we will derive terrain attributes from digital elevation models to model species richness and landslide susceptibility using non-linear generalized linear or generalized additive models. In the end, the choice of RQGIS or RSAGA depends on the user's preferences, expertise and tasks at hand. But both packages will benefit anyone working with large spatio-temporal data in R.
Proved reserves definitions proposed for adoption by SPE, AAPG, and API
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1980-09-01
A joint-association committee was organized to write a unified set of definitions for proved reserves that SPE, AAPG, API, other interested organizations, industry and units of government can adopt and use. The proposed definitions appear for the purpose of soliciting member comments before their anticipated adoption by the 3 sponsoring organizations. The committee directed its carefully written proposed definitions toward all present and future applications in industry and government circles. These definitions include the terms proved reserves, crude oil, natural gas, natural gas liquids, reservoir, and enhanced recovery.
Miksík, I; Vylitová, M; Pácha, J; Deyl, Z
1999-04-16
High-performance liquid chromatography coupled to atmospheric pressure ionization-electrospray ionization mass spectrometry (API-ESI-MS) was investigated for the analysis of corticosterone metabolites; their characterization was obtained by combining the separation on Zorbax Eclipse XDB C18 column (eluted with a methanol-water-acetic acid gradient) with identification using positive ion mode API-ESI-MS and selected ion analysis. The applicability of this method was verified by monitoring the activity of steroid converting enzymes (20beta-hydroxysteroid dehydrogenase and 11beta-hydroxysteroid dehydrogenase) in avian intestines.
The Secret Life of Quarks, Final Report for the University of North Carolina at Chapel Hill
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fowler, Robert J.
This final report summarizes activities and results at the University of North Carolina as part of the SciDAC-2 Project The Secret Life of Quarks: National Computational Infrastructure for Lattice Quantum Chromodynamics. The overall objective of the project is to construct the software needed to study quantum chromodynamics (QCD), the theory of the strong interactions of subatomic physics, and similar strongly coupled gauge theories anticipated to be of importance in the LHC era. It built upon the successful efforts of the SciDAC-1 project National Computational Infrastructure for Lattice Gauge Theory, in which a QCD Applications Programming Interface (QCD API) was developed that enables lattice gauge theorists to make effective use of a wide variety of massively parallel computers. In the SciDAC-2 project, optimized versions of the QCD API were being created for the IBM BlueGene/L (BG/L) and BlueGene/P (BG/P), the Cray XT3/XT4 and its successors, and clusters based on multi-core processors and Infiniband communications networks. The QCD API is being used to enhance the performance of the major QCD community codes and to create new applications. Software libraries of physics tools have been expanded to contain sharable building blocks for inclusion in application codes, performance analysis and visualization tools, and software for automation of physics workflow. New software tools were designed for managing the large data sets generated in lattice QCD simulations, and for sharing them through the International Lattice Data Grid consortium. As part of the overall project, researchers at UNC were funded through ASCR to work in three general areas. The main thrust has been performance instrumentation and analysis in support of the SciDAC QCD code base as it evolved and as it moved to new computation platforms. In support of the performance activities, performance data was to be collected in a database for the purpose of broader analysis. Third, the UNC work was done at RENCI (Renaissance Computing Institute), which has extensive expertise and facilities for scientific data visualization, so we acted in an ongoing consulting and support role in that area.
Phillips, Joshua; Chilukuri, Ram; Fragoso, Gilberto; Warzel, Denise; Covitz, Peter A
2006-01-06
Robust, programmatically accessible biomedical information services that syntactically and semantically interoperate with other resources are challenging to construct. Such systems require the adoption of common information models, data representations and terminology standards as well as documented application programming interfaces (APIs). The National Cancer Institute (NCI) developed the cancer common ontologic representation environment (caCORE) to provide the infrastructure necessary to achieve interoperability across the systems it develops or sponsors. The caCORE Software Development Kit (SDK) was designed to provide developers both within and outside the NCI with the tools needed to construct such interoperable software systems. The caCORE SDK requires a Unified Modeling Language (UML) tool to begin the development workflow with the construction of a domain information model in the form of a UML Class Diagram. Models are annotated with concepts and definitions from a description logic terminology source using the Semantic Connector component. The annotated model is registered in the Cancer Data Standards Repository (caDSR) using the UML Loader component. System software is automatically generated using the Codegen component, which produces middleware that runs on an application server. The caCORE SDK was initially tested and validated using a seven-class UML model, and has been used to generate the caCORE production system, which includes models with dozens of classes. The deployed system supports access through object-oriented APIs with consistent syntax for retrieval of any type of data object across all classes in the original UML model. The caCORE SDK is currently being used by several development teams, including by participants in the cancer biomedical informatics grid (caBIG) program, to create compatible data services. caBIG compatibility standards are based upon caCORE resources, and thus the caCORE SDK has emerged as a key enabling technology for caBIG. The caCORE SDK substantially lowers the barrier to implementing systems that are syntactically and semantically interoperable by providing workflow and automation tools that standardize and expedite modeling, development, and deployment. It has gained acceptance among developers in the caBIG program, and is expected to provide a common mechanism for creating data service nodes on the data grid that is under development.
Chromium: A Stress-Processing Framework for Interactive Rendering on Clusters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Humphreys, G.; Houston, M.; Ng, Y.-R.
2002-01-11
We describe Chromium, a system for manipulating streams of graphics API commands on clusters of workstations. Chromium's stream filters can be arranged to create sort-first and sort-last parallel graphics architectures that, in many cases, support the same applications while using only commodity graphics accelerators. In addition, these stream filters can be extended programmatically, allowing the user to customize the stream transformations performed by nodes in a cluster. Because our stream processing mechanism is completely general, any cluster-parallel rendering algorithm can be either implemented on top of or embedded in Chromium. In this paper, we give examples of real-world applications that use Chromium to achieve good scalability on clusters of workstations, and describe other potential uses of this stream processing technology. By completely abstracting the underlying graphics architecture, network topology, and API command processing semantics, we allow a variety of applications to run in different environments.
Kokkos: Enabling manycore performance portability through polymorphic memory access patterns
Carter Edwards, H.; Trott, Christian R.; Sunderland, Daniel
2014-07-22
The manycore revolution can be characterized by increasing thread counts, decreasing memory per thread, and diversity of continually evolving manycore architectures. High performance computing (HPC) applications and libraries must exploit increasingly finer levels of parallelism within their codes to sustain scalability on these devices. We found that a major obstacle to performance portability is the diverse and conflicting set of constraints on memory access patterns across devices. Contemporary portable programming models address manycore parallelism (e.g., OpenMP, OpenACC, OpenCL) but fail to address memory access patterns. The Kokkos C++ library enables applications and domain libraries to achieve performance portability on diverse manycore architectures by unifying abstractions for both fine-grain data parallelism and memory access patterns. In this paper we describe Kokkos’ abstractions, summarize its application programmer interface (API), present performance results for unit-test kernels and mini-applications, and outline an incremental strategy for migrating legacy C++ codes to Kokkos. Furthermore, the Kokkos library is under active research and development to incorporate capabilities from new generations of manycore architectures, and to address a growing list of applications and domain libraries.
NASA Astrophysics Data System (ADS)
Zaghi, S.
2014-07-01
OFF, an open source (free software) code for performing fluid dynamics simulations, is presented. The aim of OFF is to solve, numerically, the unsteady (and steady) compressible Navier-Stokes equations of fluid dynamics by means of finite volume techniques: the research background is mainly focused on high-order (WENO) schemes for multi-fluid, multi-phase flows over complex geometries. To this purpose a highly modular, object-oriented application program interface (API) has been developed. In particular, the concepts of data encapsulation and inheritance available within the Fortran language (from standard 2003) have been stressed in order to represent each fluid dynamics "entity" (e.g. the conservative variables of a finite volume, its geometry, etc.) by a single object, so that a large variety of computational libraries can be easily (and efficiently) developed upon these objects. The main features of OFF can be summarized as follows. Programming Language: OFF is written in standard (compliant) Fortran 2003; its design is highly modular in order to enhance simplicity of use and maintenance without compromising efficiency. Parallel Frameworks Supported: the development of OFF has also been targeted at maximizing computational efficiency; the code is designed to run on shared-memory multi-core workstations and distributed-memory clusters of shared-memory nodes (supercomputers); the code's parallelization is based on the Open Multiprocessing (OpenMP) and Message Passing Interface (MPI) paradigms. Usability, Maintenance and Enhancement: in order to improve the usability, maintenance and enhancement of the code, the documentation has also been carefully taken into account; the documentation is built upon comprehensive comments placed directly into the source files (no external documentation files needed); these comments are parsed by means of the doxygen free software, producing high-quality html and latex documentation pages; the distributed versioning system referred to as git has been adopted in order to facilitate the collaborative maintenance and improvement of the code. Copyrights: OFF is free software that anyone can use, copy, distribute, study, change and improve under the GNU Public License version 3. The present paper is a manifesto of the OFF code and presents the currently implemented features and ongoing developments. This work is focused on the computational techniques adopted, and a detailed description of the main API characteristics is reported. OFF capabilities are demonstrated by means of one- and two-dimensional examples and a three-dimensional real application.
Goel, Meenakshi; Larson, Eli; Venkatramani, C J; Al-Sayah, Mohammad A
2018-05-01
Enantioselective analysis is an essential requirement during the pharmaceutical development of chiral drug molecules. In pre-clinical and clinical studies, the Food and Drug Administration (FDA) mandates the assessment of "in vivo" inter-conversion of chiral drugs to determine their physiological effects. In-vivo analysis of the active pharmaceutical ingredient (API) and its potential metabolites could be quite challenging due to their low abundance (ng/mL levels) and matrix interferences. Therefore, highly selective and sensitive analytical techniques are required to separate the API and its metabolites from the matrix components and one another. Additionally, for chiral APIs, further analytical separation is required to resolve the API and its potential metabolites from their corresponding enantiomers. In this work, we demonstrate the optimization of our previously designed two-dimensional liquid chromatography-supercritical fluid chromatography-mass spectrometry (2D-LC-SFC-MS) system to achieve 10 ng/mL detection limit [1]. The first LC dimension, used as a desalting step, could efficiently separate the API from its potential metabolites and matrix components. The API and its metabolites were then trapped/focused on small trapping columns and transferred onto the second SFC dimension for chiral separation. Detection can be achieved by ultraviolet (UV) or MS detection. Different system parameters such as column dimensions, transfer volumes, trapping column stationary phase, system tubing internal diameter (i.d.), and detection techniques, were optimized to enhance the sensitivity of the 2D-LC-SFC-MS system. The limit of detection was determined to be 10 ng/mL. An application is described where a mouse hepatocyte treated sample was analyzed using the optimized 2D-LC-SFC-MS system with successful assessment of the ratio of API to its metabolite (1D-LC), as well as the corresponding enantiomeric excess values (% e.e.) of each (2D-SFC). Copyright © 2018 Elsevier B.V. All rights reserved.
2016-01-01
Abstract Background The Species API of the Global Biodiversity Information Facility (GBIF) provides public access to taxonomic data aggregated from multiple data sources. Each data source follows its own classification which can be inconsistent with classifications from other sources. Even with a reference classification e.g. the GBIF Backbone taxonomy, a comprehensive method to compare classifications in the data aggregation is essential, especially for non-expert users. New information A Java application was developed to compare multiple taxonomies graphically using classification data acquired from GBIF’s ChecklistBank via the GBIF Species API. It uses a table to display taxonomies where each column represents a taxonomy under comparison, with an aligner column to organise taxa by name. Each cell contains the name of a taxon if the classification in that column contains the name. Each column also has a cell showing the hierarchy of the taxonomy by a folder metaphor where taxa are aligned and synchronised in the aligner column. A set of those comparative tables shows taxa categorised by relationship between taxonomies. The result set is also available as tables in an Excel format file. PMID:27932916
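To illustrate the kind of programmatic access the comparison tool builds on, the sketch below queries the GBIF Species API over HTTP from Python. The endpoint layout follows the publicly documented v1 API, but the exact parameters and response fields should be treated as assumptions.

    # Sketch: match a name against the GBIF Backbone and inspect its classification
    # via the GBIF Species API (v1). Field names are assumptions.
    import requests

    BASE = "https://api.gbif.org/v1"

    # Match a scientific name against the GBIF Backbone taxonomy.
    match = requests.get(f"{BASE}/species/match",
                         params={"name": "Apis mellifera"}).json()
    key = match["usageKey"]

    # Fetch the full name usage and print its higher classification.
    usage = requests.get(f"{BASE}/species/{key}").json()
    for rank in ("kingdom", "phylum", "class", "order", "family", "genus"):
        print(rank, usage.get(rank))

    # List one page of immediate children, the raw material for comparing
    # classifications across checklists.
    children = requests.get(f"{BASE}/species/{key}/children",
                            params={"limit": 20}).json()
    print([c.get("scientificName") for c in children.get("results", [])])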
Electronic Attack Platform Placement Optimization
2014-09-01
Excerpt from the report's list of figures: processing in VBA; client-server operation using two different Excel application instances; a screenshot of the VBA IDE contained within all Microsoft Office products; scheduling an application using MS Excel's Application.OnTime method; and the WINSOCK API functions needed to use TCP via VBA.
Data Services in Support of High Performance Computing-Based Distributed Hydrologic Models
NASA Astrophysics Data System (ADS)
Tarboton, D. G.; Horsburgh, J. S.; Dash, P. K.; Gichamo, T.; Yildirim, A. A.; Jones, N.
2014-12-01
We have developed web-based data services to support the application of hydrologic models on High Performance Computing (HPC) systems. The purpose of these services is to give hydrologic researchers, modelers, water managers, and users access to HPC resources without requiring them to become HPC experts or to understand the intrinsic complexities of the data services, thereby reducing the time and effort spent finding and organizing the data required to execute hydrologic models and data preprocessing tools on HPC systems. These services address some of the data challenges faced by hydrologic models that strive to take advantage of HPC. Input data are often not in the form required by such models, forcing researchers to spend time and effort on data preparation and preprocessing that inhibits or limits the application of these models. Another limitation is the difficult-to-use batch job control and queuing systems employed by HPC systems. We have developed a REST-based gateway application programming interface (API) for authenticated access to HPC systems that abstracts away many of the details that are barriers to HPC use and enhances accessibility from desktop programming and scripting languages such as Python and R. We have used this gateway API to establish software services that support the delineation of watersheds to define a modeling domain and then extract terrain and land use information to automatically configure the inputs required by hydrologic models. These services support the Terrain Analysis Using Digital Elevation Models (TauDEM) tools for watershed delineation and generation of hydrology-based terrain information such as wetness index and stream networks. They also support the derivation of inputs for the Utah Energy Balance snowmelt model, used to address questions such as how climate, land cover and land use change may affect snowmelt inputs to runoff generation. To enhance access to the time-varying climate data used to drive hydrologic models, we have developed services to downscale and re-grid nationally available climate analysis data from systems such as NLDAS and MERRA. These cases serve as examples of how this approach can be extended to other models to enhance the use of HPC for hydrologic modeling.
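A hedged sketch of how such a REST gateway might be driven from Python is given below. The host name, endpoint paths, token scheme, application identifier and parameters are purely illustrative assumptions, not the project's actual API.

    # Hypothetical sketch of calling a REST-based HPC gateway from a scripting
    # language. Host, endpoints, token handling and parameters are assumptions.
    import requests

    GATEWAY = "https://hpc-gateway.example.org/api"   # hypothetical base URL

    # 1) Authenticate and obtain a token.
    token = requests.post(f"{GATEWAY}/token",
                          data={"username": "user", "password": "secret"}).json()["token"]
    headers = {"Authorization": f"Bearer {token}"}

    # 2) Submit a watershed-delineation job (e.g., a TauDEM workflow) without
    #    interacting with the HPC batch scheduler directly.
    job = requests.post(f"{GATEWAY}/jobs", headers=headers, json={
        "application": "taudem-delineation",           # hypothetical application id
        "inputs": {"dem_url": "https://data.example.org/dem.tif",
                   "outlet": [-111.8, 41.7]},
    }).json()

    # 3) Poll for status; results would be fetched once the job completes.
    status = requests.get(f"{GATEWAY}/jobs/{job['id']}", headers=headers).json()
    print(status["state"])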
Cloud-Enabled Climate Analytics-as-a-Service using Reanalysis data: A case study.
NASA Astrophysics Data System (ADS)
Nadeau, D.; Duffy, D.; Schnase, J. L.; McInerney, M.; Tamkin, G.; Potter, G. L.; Thompson, J. H.
2014-12-01
The NASA Center for Climate Simulation (NCCS) maintains advanced data capabilities and facilities that allow researchers to access the enormous volume of data generated by weather and climate models. The NASA Climate Model Data Service (CDS) and the NCCS are merging their efforts to provide Climate Analytics-as-a-Service for the comparative study of the major reanalysis projects: ECMWF ERA-Interim, NASA/GMAO MERRA, NOAA/NCEP CFSR, NOAA/ESRL 20CR, JMA JRA25, and JRA55. These reanalyses have been repackaged into the netCDF4 file format following the CMIP5 Climate and Forecast (CF) metadata convention prior to being sequenced into the Hadoop Distributed File System (HDFS). A small set of operations that represent a common starting point in many analysis workflows was then created: min, max, sum, count, variance and average. In this example, reanalysis data exploration was performed with Hadoop MapReduce, and accessibility was achieved using the Climate Data Service (CDS) application programming interface (API) created at NCCS. This API provides a uniform treatment of large amounts of data. In this case study, we have limited our exploration to two variables, temperature and precipitation, using three operations (min, max and avg) and 30 years of reanalysis data for three regions of the world: global, polar, and subtropical.
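To make the analytics-as-a-service idea concrete, the sketch below shows how a client might request a server-side reduction over a hosted reanalysis. The URL and parameter names are illustrative assumptions only; the canonical operations (min, max, sum, count, variance, average) and regions are taken from the description above.

    # Hypothetical sketch of requesting a server-side reduction from a
    # Climate Analytics-as-a-Service endpoint. The endpoint and parameter
    # names are assumptions, not the actual CDS API.
    import requests

    CDS = "https://climate-analytics.example.nasa.gov/api"   # hypothetical

    response = requests.get(f"{CDS}/analytics", params={
        "collection": "MERRA",          # one of the hosted reanalyses
        "variable": "temperature",
        "operation": "avg",             # avg | min | max | sum | count | variance
        "region": "polar",              # global | polar | subtropical
        "start": "1981-01",
        "end": "2010-12",
    })
    print(response.json())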
Wu, Yan-Yan; Zhou, Ting; Wang, Qiang; Dai, Ping-Li; Xu, Shu-Fa; Jia, Hui-Ru; Wang, Xing
2015-08-01
Honey bees are at an unavoidable risk of exposure to neonicotinoid pesticides, which are used worldwide. Compared with the well-studied roles of these pesticides in nontarget sites (including the midgut, ovary, or salivary glands), little has been reported on the target site, the brain. In the current study, laboratory-reared adult worker honey bees (Apis mellifera L.) were treated with sublethal doses of imidacloprid. Neuronal apoptosis was detected using the TUNEL technique for DNA labeling. We observed significantly increased apoptotic markers, in a dose- and time-dependent manner, in the brains of bees exposed to imidacloprid. Neuronal activated caspase-3 and mRNA levels of caspase-1, as detected by immunofluorescence and real-time quantitative PCR, respectively, were significantly increased, suggesting that sublethal doses of imidacloprid may induce the caspase-dependent apoptotic pathway. Additionally, the overlap of apoptosis and autophagy in neurons was confirmed by transmission electron microscopy. These findings suggest that a relationship exists between neurotoxicity and the behavioral changes induced by sublethal doses of imidacloprid, and that there is a need to determine reasonable limits for imidacloprid application in the field to protect pollinators. © The Authors 2015. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Foundational Tools for Petascale Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Barton
2014-05-19
The Paradyn project has a history of developing algorithms, techniques, and software that push the cutting edge of tool technology for high-end computing systems. Under this funding, we are working on a three-year agenda to make substantial new advances in support of new and emerging Petascale systems. The overall goal for this work is to address the steady increase in complexity of these petascale systems. Our work covers two key areas: (1) The analysis, instrumentation and control of binary programs. Work in this area falls under the general framework of the Dyninst API tool kits. (2) Infrastructure for building tools and applications at extreme scale. Work in this area falls under the general framework of the MRNet scalability framework. Note that work done under this funding is closely related to work done under a contemporaneous grant, “High-Performance Energy Applications and Systems”, SC0004061/FG02-10ER25972, UW PRJ36WV.
The application of autostereoscopic display in smart home system based on mobile devices
NASA Astrophysics Data System (ADS)
Zhang, Yongjun; Ling, Zhi
2015-03-01
A smart home is a system for controlling home devices, and such systems are becoming increasingly popular in daily life. Mobile intelligent terminals for smart homes have been developed, making remote control and monitoring possible with smartphones or tablets. On the other hand, 3D stereo display technology has developed rapidly in recent years. Therefore, an iPad-based smart home system that adopts an autostereoscopic display as its control interface is proposed to improve the user-friendliness of the experience. In consideration of the iPad's limited hardware capabilities, we introduce a 3D image synthesis method based on parallel processing on the Graphics Processing Unit (GPU) and implement it with the OpenGL ES Application Programming Interface (API) library on the iOS platform for real-time autostereoscopic display. Compared with a traditional smart home system, the proposed system applies autostereoscopic display to the smart home control interface, enhancing the realism, user-friendliness and visual comfort of the interface.
Impact of memory bottleneck on the performance of graphics processing units
NASA Astrophysics Data System (ADS)
Son, Dong Oh; Choi, Hong Jun; Kim, Jong Myon; Kim, Cheol Hong
2015-12-01
Recent graphics processing units (GPUs) can process general-purpose applications as well as graphics applications with the help of various user-friendly application programming interfaces (APIs) supported by GPU vendors. Unfortunately, utilizing the hardware resources in the GPU efficiently is a challenging problem, since the GPU architecture is totally different from the traditional CPU architecture. To solve this problem, many studies have focused on techniques for improving system performance using GPUs. In this work, we analyze GPU performance while varying GPU parameters such as the number of cores and the clock frequency. According to our simulations, GPU performance can be improved by 125.8% and 16.2% on average as the number of cores and the clock frequency increase, respectively. However, the performance saturates when memory bottleneck problems occur due to huge data requests to the memory. The performance of GPUs can be further improved as the memory bottleneck is reduced by changing GPU parameters dynamically.
JPL Space Telecommunications Radio System Operating Environment
NASA Technical Reports Server (NTRS)
Lux, James P.; Lang, Minh; Peters, Kenneth J.; Taylor, Gregory H.; Duncan, Courtney B.; Orozco, David S.; Stern, Ryan A.; Ahten, Earl R.; Girard, Mike
2013-01-01
A flight-qualified implementation of a Software Defined Radio (SDR) Operating Environment for the JPL-SDR built for the CoNNeCT Project has been developed. It is compliant with the NASA Space Telecommunications Radio System (STRS) Architecture Standard, and provides the software infrastructure for STRS compliant waveform applications. This software provides a standards-compliant abstracted view of the JPL-SDR hardware platform. It uses industry standard POSIX interfaces for most functions, as well as exposing the STRS API (Application Programming Interface) required by the standard. This software includes a standardized interface for IP components instantiated within a Xilinx FPGA (Field Programmable Gate Array). The software provides a standardized abstracted interface to platform resources such as data converters, file system, etc., which can be used by STRS standards conformant waveform applications. It provides a generic SDR operating environment with a much smaller resource footprint than similar products such as SCA (Software Communications Architecture) compliant implementations, or the DoD Joint Tactical Radio Systems (JTRS).
interPopula: a Python API to access the HapMap Project dataset
2010-01-01
Background The HapMap project is a publicly available catalogue of common genetic variants that occur in humans, currently including several million SNPs across 1115 individuals spanning 11 different populations. This important database does not provide any programmatic access to the dataset; furthermore, no standard relational database interface is provided. Results interPopula is a Python API to access the HapMap dataset. interPopula provides integration facilities with both the Python ecology of software (e.g. Biopython and matplotlib) and other relevant human population datasets (e.g. Ensembl gene annotation and UCSC Known Genes). A set of guidelines and code examples to address possible inconsistencies across heterogeneous data sources is also provided. Conclusions interPopula is a straightforward and flexible Python API that facilitates the construction of scripts and applications that require access to the HapMap dataset. PMID:21210977
NASA Astrophysics Data System (ADS)
Thau, D.
2017-12-01
For the past seven years, Google has made petabytes of Earth observation data, and the tools to analyze them, freely available to researchers around the world via cloud computing. These data and tools were initially available via Google Earth Engine and are increasingly available on the Google Cloud Platform. We have introduced a number of APIs for both the analysis and presentation of geospatial data that have been successfully used to create impactful datasets and web applications, including studies of global surface water availability, global tree cover change, and crop yield estimation. Each of these projects used the cloud to analyze thousands to millions of Landsat scenes. The APIs support a range of publishing options, from outputting imagery and data for inclusion in papers, to providing tools for full-scale web applications that provide analysis tools of their own. Over the course of developing these tools, we have learned a number of lessons about how to build a publicly available cloud platform for geospatial analysis, and about how the characteristics of an API can affect the kinds of impacts a platform can enable. This study will present an overview of how Google Earth Engine works and how Google's geospatial capabilities are extending to the Google Cloud Platform. We will provide a number of case studies describing how these platforms, and the data they host, have been leveraged to build impactful decision support tools used by governments, researchers, and other institutions, and we will describe how the available APIs have shaped (or constrained) those tools. [Image Credit: Tyler A. Erickson]
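A compact sketch of the Earth Engine Python API is shown below to give a flavour of the server-side analyses described. The collection and band identifiers are assumptions for Landsat 8 Collection 2 Level-2 data, and authentication is assumed to have been configured beforehand.

    # Sketch: median Landsat 8 composite and mean NDVI over a region using the
    # Earth Engine Python API. Collection/band ids are assumptions; run
    # ee.Authenticate() once before ee.Initialize() on a new machine.
    import ee
    ee.Initialize()

    region = ee.Geometry.Rectangle([-122.6, 37.2, -121.8, 37.9])  # example bounds

    composite = (ee.ImageCollection("LANDSAT/LC08/C02/T1_L2")
                 .filterBounds(region)
                 .filterDate("2020-01-01", "2020-12-31")
                 .median())

    ndvi = composite.normalizedDifference(["SR_B5", "SR_B4"]).rename("NDVI")

    mean_ndvi = ndvi.reduceRegion(reducer=ee.Reducer.mean(),
                                  geometry=region, scale=30, maxPixels=1e9)
    print(mean_ndvi.getInfo())   # computation happens on Google's servers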
Tiwari, Roshan V.; Polk, Ashley N.; Patil, Hemlata; Ye, Xingyou; Pimparade, Manjeet B.; Repka, Michael A.
2017-01-01
Developing a pediatric oral formulation with an age-appropriate dosage form and taste masking of naturally bitter active pharmaceutical ingredients (APIs) are key challenges for formulation scientists. Several techniques are used for taste masking of bitter APIs to improve formulation palatability; however, not all of these techniques are applicable to pediatric dosage forms because of limitations on the kind and concentration of excipients that can be used. Hot-melt extrusion (HME) technology has been used successfully for taste masking of bitter APIs and overcomes some of the limitations of existing taste-masking techniques. Likewise, analytical taste assessment is an important quality control parameter evaluated by several in vivo and in vitro methods, such as human taste panels, electrophysiological methods, electronic sensors, and animal preference tests, to aid in selecting a taste-masked formulation. However, the most appropriate in vivo method to assess the taste-masking efficacy of pediatric formulations remains unclear, because it is not known to what extent a human taste panel or electronic tongue can predict palatability in pediatric patients. The purpose of this study was to develop taste-masked caffeine citrate extrudates via HME and to demonstrate the wide applicability of a single-bottle-test rat model to record and compare the volume consumed of the taste-masked solutions with that of the pure API. Thus, this rat model can be considered a low-cost alternative taste-assessment method to the commonly used, expensive human taste panel/electronic tongue methods for pediatric formulations. PMID:26573158
Nagy, Brigitta; Farkas, Attila; Gyürkés, Martin; Komaromy-Hiller, Szofia; Démuth, Balázs; Szabó, Bence; Nusser, Dávid; Borbás, Enikő; Marosi, György; Nagy, Zsombor Kristóf
2017-09-15
The integration of the Process Analytical Technology (PAT) initiative into the continuous production of pharmaceuticals is indispensable for reliable production. The present paper reports the implementation of in-line Raman spectroscopy in a continuous blending and tableting process for a three-component model pharmaceutical system, containing caffeine as the model active pharmaceutical ingredient (API), glucose as the model excipient and magnesium stearate as lubricant. The real-time analysis of API content, blend homogeneity, and tablet content uniformity was performed using a Partial Least Squares (PLS) quantitative method. The in-line Raman spectroscopic monitoring showed that the continuous blender was capable of producing blends with high homogeneity, and that technological malfunctions could be detected by the proposed PAT method. Raman spectroscopy-based feedback control of the API feeder was also established, creating a 'Process Analytically Controlled Technology' (PACT), which guarantees the required API content in the produced blend. This is, to the best of the authors' knowledge, the first application of Raman spectroscopy in continuous blending and the first Raman-based feedback control in the formulation technology of solid pharmaceuticals. Copyright © 2017 Elsevier B.V. All rights reserved.
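To illustrate the type of chemometric model involved (an illustrative sketch, not the authors' implementation or data), the snippet below fits a partial least squares regression between synthetic Raman-like spectra and API content using scikit-learn.

    # Illustrative PLS calibration between synthetic spectra and API content,
    # in the spirit of the in-line quantitative method described above.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n_samples, n_wavenumbers = 120, 500
    api_content = rng.uniform(5, 15, n_samples)              # % API in the blend
    axis = np.arange(n_wavenumbers)
    pure_api = np.exp(-0.5 * ((axis - 200) / 10) ** 2)       # synthetic API band
    pure_excipient = np.exp(-0.5 * ((axis - 350) / 20) ** 2) # synthetic excipient band
    spectra = (np.outer(api_content, pure_api)
               + np.outer(100 - api_content, pure_excipient)
               + rng.normal(0, 0.5, (n_samples, n_wavenumbers)))   # measurement noise

    X_train, X_test, y_train, y_test = train_test_split(spectra, api_content,
                                                        random_state=0)
    pls = PLSRegression(n_components=3)
    pls.fit(X_train, y_train)
    print("R^2 on held-out spectra:", round(pls.score(X_test, y_test), 3))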
Lee, Meta T; Bracamontes, Jennifer; Mosier, Evan; Davis, James; Maddock, Jay E
2011-03-01
A qualitative study was conducted to determine preferred smoking cessation methods among Asian or Pacific Islander (API) smokers who live with hospitalized children. The study took place in a children's hospital where a new cessation program would be developed. Twenty-six API smokers who live with children admitted to the hospital were interviewed and tape-recorded. Responses to survey questions were transcribed, categorized, and analyzed. Of the respondents, 73% were interested in quitting, 34% within the next 30 days. Few would independently use the quit-line (31%) or attend group classes (4%). However, if offered during their child's hospitalization, 52% would sign up for individualized counseling and 29% would attend group sessions. Respondents believed advice would be helpful from their physician (71%), their child's pediatrician (65%), a nurse (64%), a respiratory therapist (65%), or a smoking cessation counselor (75%). The majority of API smokers were interested in quitting and receptive to one-on-one counseling, and advice would be helpful from any healthcare professional. Hawaii Medical Journal Copyright 2011.
pyomocontrib_simplemodel v. 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hart, William
2017-03-02
Pyomo supports the formulation and analysis of mathematical models for complex optimization applications. This library extends the API of Pyomo to include a simple modeling representation: a list of objectives and constraints.
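As a small illustration of the kind of model this library simplifies, the sketch below uses the core Pyomo API (not the simplemodel extension itself, whose exact interface is not shown here); the GLPK solver is an assumption about the local environment.

    # A tiny linear program written with the core Pyomo API; the simplemodel
    # extension described above layers a simpler, list-style representation of
    # objectives and constraints on top of models like this. Solver choice
    # (glpk) is an assumption.
    from pyomo.environ import (ConcreteModel, Var, Objective, Constraint,
                               NonNegativeReals, maximize, SolverFactory)

    m = ConcreteModel()
    m.x = Var(domain=NonNegativeReals)
    m.y = Var(domain=NonNegativeReals)
    m.obj = Objective(expr=3 * m.x + 2 * m.y, sense=maximize)
    m.c1 = Constraint(expr=m.x + m.y <= 4)
    m.c2 = Constraint(expr=m.x + 3 * m.y <= 6)

    SolverFactory("glpk").solve(m)       # assumes a GLPK installation
    print(m.x.value, m.y.value)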
Visualization of Vgi Data Through the New NASA Web World Wind Virtual Globe
NASA Astrophysics Data System (ADS)
Brovelli, M. A.; Kilsedar, C. E.; Zamboni, G.
2016-06-01
GeoWeb 2.0, laying the foundations of Volunteered Geographic Information (VGI) systems, has led to platforms where users can contribute to geographic knowledge that is open to access. Moreover, as a result of advancements in 3D visualization, virtual globes able to visualize geographic data even in browsers have emerged. However, the integration of VGI systems and virtual globes has not been fully realized. The study presented aims to visualize volunteered data in 3D, also considering ease-of-use aspects for the general public, using Free and Open Source Software (FOSS). The new Application Programming Interface (API) of NASA, Web World Wind, written in JavaScript and based on the Web Graphics Library (WebGL), is cross-platform and cross-browser, so the virtual globe created using this API is accessible through any WebGL-supported browser on different operating systems and devices. As a result, it does not require any installation or configuration on the client side, making the collected data more usable, which is not the case with World Wind for Java, where installation and configuration of the Java Virtual Machine (JVM) are required. Furthermore, the data collected through various VGI platforms might be in different formats, stored in a traditional relational database or in a NoSQL database. The project developed aims to visualize and query data collected through the Open Data Kit (ODK) platform and a cross-platform application, where the data are stored in a relational PostgreSQL database and a NoSQL CouchDB database, respectively.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Del Bravo, R.; Pinacci, P.; Trifilo, R.
1998-07-01
This paper aims to give a general overview of the api Energia IGCC project, starting from the project background in 1992 and ending with the progress of construction. api Energia S.p.A., a joint venture between api anonima petroli italiana S.p.A., Roma, Italy (51%), ABB Sae Sadelmi S.p.A., Milano, Italy (25%) and Texaco Development Corporation (24%), is building a 280 MW Integrated Gasification Combined Cycle plant in the api refinery at Falconara Marittima, on Italy's Adriatic coast, using heavy oil residues. The plant is based on the modern concept of employing a highly efficient combined cycle power plant fed with a low-heating-value fuel gas produced by gasifying heavy refinery residues. This scheme provides substantial advantages in terms of efficiency and environmental impact over alternative applications of the refinery residues. The electric power produced will feed the national grid. The project has been financed using the ``project financing'' scheme: over 1,000 billion Lira, representing 75% of the overall capital requirement, have been provided by a pool of international banks. In November 1996 the project reached financial closure, and immediately afterwards the detailed design and procurement activities started. Engineering, Procurement and Construction activities, carried out by a Consortium of companies of the ABB group, are fully in line with the schedule. Commercial operation of the plant is scheduled for November 1999.
Continuous API-crystal coating via coacervation in a tubular reactor.
Besenhard, M O; Thurnberger, A; Hohl, R; Faulhammer, E; Rattenberger, J; Khinast, J G
2014-11-20
We present a proof-of-concept study of a continuous coating process of single API crystals in a tubular reactor using coacervation as a microencapsulation technique. Continuous API crystal coating can have several advantages, as in a single step (following crystallization) individual crystals can be prepared with a functional coating, either to change the release behavior, to protect the API from gastric juice or to modify the surface energetics of the API (i.e., to tailor the hydrophobic/hydrophilic characteristics, flowability or agglomeration tendency, etc.). The coating process was developed for the microencapsulation of a lipophilic core material (ibuprofen crystals of 20 μm- to 100 μm-size), with either hypromellose phthalate (HPMCP) or Eudragit L100-55. The core material was suspended in an aqueous solution containing one of these enteric polymers, fed into the tubing and mixed continuously with a sodium sulfate solution as an antisolvent to induce coacervation. A subsequent temperature treatment was applied to optimize the microencapsulation of crystals via the polymer-rich coacervate phase. Cross-linking of the coating shell was achieved by mixing the processed material with an acidic solution (pH<3). Flow rates, temperature profiles and polymer-to-antisolvent ratios had to be tightly controlled to avoid excessive aggregation, leading to pipe plugging. This work demonstrates the potential of a tubular reactor design for continuous coating applications and is the basis for future work, combining continuous crystallization and coating. Copyright © 2014 Elsevier B.V. All rights reserved.
AMBIT RESTful web services: an implementation of the OpenTox application programming interface.
Jeliazkova, Nina; Jeliazkov, Vedrin
2011-05-16
The AMBIT web services package is one of several existing independent implementations of the OpenTox Application Programming Interface and is built according to the principles of the Representational State Transfer (REST) architecture. The Open Source Predictive Toxicology Framework, developed by the partners in the EC FP7 OpenTox project, aims at providing unified access to toxicity data and predictive models, as well as validation procedures. This is achieved by i) an information model, based on a common OWL-DL ontology; ii) links to related ontologies; and iii) data and algorithms, available through a standardized REST web services interface, where every compound, data set or predictive method has a unique web address, used to retrieve its Resource Description Framework (RDF) representation or initiate the associated calculations. The AMBIT web services package has been developed as an extension of AMBIT modules, adding the ability to create (Quantitative) Structure-Activity Relationship (QSAR) models and providing an OpenTox API compliant interface. The representation of data and processing resources in the W3C Resource Description Framework facilitates integrating the resources as Linked Data. By uploading datasets with chemical structures and an arbitrary set of properties, they become automatically available online in several formats. The services provide unified interfaces to several descriptor calculation, machine learning and similarity searching algorithms, as well as to applicability domain and toxicity prediction models. All Toxtree modules for predicting the toxicological hazard of chemical compounds are also integrated within this package. The complexity and diversity of the processing is reduced to the simple paradigm "read data from a web address, perform processing, write to a web address". The online service allows users to easily run predictions without installing any software, as well as to share datasets and models online. The downloadable web application allows researchers to set up an arbitrary number of service instances for specific purposes and at suitable locations. These services could be used as a distributed framework for processing of resource-intensive tasks and data sharing, or in a fully independent way, according to specific needs. The advantage of exposing the functionality via the OpenTox API is seamless interoperability, not only within a single web application but also in a network of distributed services. Last but not least, the services provide a basis for building web mashups and end-user applications with friendly GUIs, as well as for embedding the functionalities in existing workflow systems.
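The "read data from a web address, perform processing, write to a web address" paradigm can be illustrated from Python as below. The host name and compound identifier are hypothetical placeholders, and the MIME types are assumptions; only the URI-per-resource and RDF-representation conventions described above are taken from the source.

    # Sketch of the OpenTox-style REST paradigm: every compound, dataset or
    # model is a web address whose representation can be retrieved by content
    # negotiation. Host, identifier and MIME types below are assumptions.
    import requests

    compound_uri = "https://ambit.example.org/compound/12345"   # hypothetical

    # Ask for the Resource Description Framework representation of the resource.
    rdf = requests.get(compound_uri, headers={"Accept": "application/rdf+xml"})
    print(rdf.status_code, rdf.headers.get("Content-Type"))

    # The same resource can typically be requested in other formats as well
    # (the chemical MIME type shown here is an assumption).
    smiles = requests.get(compound_uri,
                          headers={"Accept": "chemical/x-daylight-smiles"})
    print(smiles.text)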
The Virtual Solar Observatory and the Heliophysics Meta-Virtual Observatory
NASA Technical Reports Server (NTRS)
Gurman, Joseph B.
2007-01-01
The Virtual Solar Observatory (VSO) is now able to search for solar data ranging from the radio to gamma rays, obtained from space and groundbased observatories, from 26 sources at 12 data providers, and from 1915 to the present. The solar physics community can use a Web interface or an Application Programming Interface (API) that allows integrating VSO searches into other software, including other Web services. Over the next few years, this integration will be especially obvious as the NASA Heliophysics division sponsors the development of a heliophysics-wide virtual observatory (VO), based on existing VO's in heliospheric, magnetospheric, and ionospheric physics as well as the VSO. We examine some of the challenges and potential of such a "meta-VO."
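For illustration only, a VSO search can also be issued programmatically through the SunPy client library, a separate third-party client rather than the VSO's own Web interface or native API; the instrument and time range below are arbitrary examples.

    # Illustration of a programmatic VSO search via the SunPy Fido client
    # (a third-party client library, not the VSO's own API).
    from sunpy.net import Fido, attrs as a

    result = Fido.search(a.Time("2011-06-07 06:00", "2011-06-07 06:10"),
                         a.Instrument("AIA"))
    print(result)
    # files = Fido.fetch(result)   # download the matched records, if desired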
VidCat: an image and video analysis service for personal media management
NASA Astrophysics Data System (ADS)
Begeja, Lee; Zavesky, Eric; Liu, Zhu; Gibbon, David; Gopalan, Raghuraman; Shahraray, Behzad
2013-03-01
Cloud-based storage and consumption of personal photos and videos provides increased accessibility, functionality, and satisfaction for mobile users. One cloud service frontier that has recently been growing is personal media management. This work presents a system called VidCat that assists users in the tagging, organization, and retrieval of their personal media by faces and visual content similarity, time, and date information. Evaluations of the effectiveness of the copy detection and face recognition algorithms on standard datasets are also discussed. Finally, the system includes a set of application programming interfaces (APIs) allowing content to be uploaded, analyzed, and retrieved on any client with simple HTTP-based methods, as demonstrated with a prototype developed on the iOS and Android mobile platforms.
NASA Astrophysics Data System (ADS)
Ye, Z.; Xiang, H.
2014-04-01
The paper discusses the basic principles and the solutions to problems encountered during the design and implementation of a mobile GIS system; based on the research results, we developed the General Provincial Situation Visualization System of Shandong Province for iOS. The system is developed in the Objective-C programming language and uses the ArcGIS Runtime SDK for iOS as the development tool to call the "World-map Shandong" services, implementing the General Provincial Situation Visualization System on iOS devices. The system is currently available for download in the App Store and has been chosen as a typical application case of the ESRI China ArcGIS API for iOS.
nodeGame: Real-time, synchronous, online experiments in the browser.
Balietti, Stefano
2017-10-01
nodeGame is a free, open-source JavaScript/HTML5 framework for conducting synchronous experiments online and in the lab directly in the browser window. It is specifically designed to support behavioral research along three dimensions: (i) larger group sizes, (ii) real-time (but also discrete time) experiments, and (iii) batches of simultaneous experiments. nodeGame has a modular source code, and defines an API (application programming interface) through which experimenters can create new strategic environments and configure the platform. With zero-install, nodeGame can run on a great variety of devices, from desktop computers to laptops, smartphones, and tablets. The current version of the software is 3.0, and extensive documentation is available on the wiki pages at http://nodegame.org.
Behavioral studies of learning in the Africanized honey bee (Apis mellifera L.).
Abramson, Charles I; Aquino, Italo S
2002-01-01
Experiments on basic classical conditioning phenomena in adult and young Africanized honey bees (Apis mellifera L.) are described. Phenomena include conditioning to various stimuli, extinction (both unpaired and CS only), conditioned inhibition, color and odor discrimination. In addition to work on basic phenomena, experiments on practical applications of conditioning methodology are illustrated with studies demonstrating the effects of insecticides on learning and the reaction of bees to consumer products. Electron microscope photos are presented of Africanized workers, drones, and queen bees. Possible sub-species differences between Africanized and European bees are discussed. Copyright 2002 S. Karger AG, Basel
Automated Test Assembly Using lp_Solve Version 5.5 in R
ERIC Educational Resources Information Center
Diao, Qi; van der Linden, Wim J.
2011-01-01
This article reviews the use of the software program lp_solve version 5.5 for solving mixed-integer automated test assembly (ATA) problems. The program is freely available under Lesser General Public License 2 (LGPL2). It can be called from the statistical language R using the lpSolveAPI interface. Three empirical problems are presented to…
2013-04-01
machine transitions. 2. We developed the TraceContract API for trace analysis in the Scala programming language. TraceContract combines a high-level...awarded within the Software and Systems program. The original Program Manager was David Luginbuhl. Bob Bonneau took over as PM in March 2011. The award
MPT Prediction of Aircraft-Engine Fan Noise
NASA Technical Reports Server (NTRS)
Connell, Stuart D.
2004-01-01
A collection of computer programs has been developed that implements a procedure for predicting multiple-pure-tone (MPT) noise generated by the fan blades of an aircraft engine (e.g., a turbofan engine). MPT noise arises when the fan is operating with a supersonic relative tip Mach number. Under this flow condition, there is a strong upstream-running shock. The strength and position of this shock are very sensitive to blade geometry variations. For a fan where all the blades are identical, the primary tone observed upstream of the fan will be at the blade passing frequency. If there are small variations in geometry between blades, then tones below the blade passing frequency arise: these are MPTs. Stagger angle differences as small as 0.1° can give rise to significant MPT noise. It is also noted that MPT noise is more pronounced when the fan is operating in an unstarted mode. Computational results using a three-dimensional flow solver to compute the complete annulus flow with non-uniform fans indicate that MPT noise can be estimated in a relatively simple way. Hence, once the effect of a typical geometry variation of one blade in an otherwise uniform blade row is known, the effect of all the blades being different can be quickly computed via superposition. Two computer programs that were developed as part of this work are used in conjunction with a user's computational fluid dynamics (CFD) code to predict MPT spectra for a fan with a specified set of geometric variations: (1) The first program, ROTBLD, reads the user's CFD solution files for a single blade passage via an API (Application Program Interface). There are options to replicate and perturb the geometry with typical variations: stagger, camber, thickness, and pitch. The multi-passage CFD solution files are then written in the user's file format using the API. (2) The second program, SUPERPOSE, requires two input files: the first is the circumferential upstream pressure distribution extracted from the CFD solution on the multi-passage mesh; the second file defines the geometry variations of each blade in a complete fan. Superposition is used to predict the spectra resulting from the geometric variations.
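A schematic numerical sketch of the superposition idea is given below; it is not the SUPERPOSE program itself, and the single-blade perturbation signature, blade count and stagger deviations are synthetic stand-ins chosen only to show how low-order tones emerge.

    # Schematic illustration of the superposition idea: approximate the upstream
    # pressure signature of a fan with blade-to-blade geometry variations by
    # summing circumferentially shifted, scaled copies of the perturbation
    # signature of a single "off" blade. All inputs here are synthetic.
    import numpy as np

    n_blades = 22
    n_theta = n_blades * 64                      # circumferential samples
    theta = np.linspace(0.0, 2 * np.pi, n_theta, endpoint=False)

    # Uniform-fan signature: pure blade-passing-frequency (BPF) content.
    p_uniform = np.cos(n_blades * theta)

    # Synthetic single-blade perturbation signature and per-blade deviations.
    dp_single = np.exp(-((theta - np.pi) / 0.05) ** 2)
    rng = np.random.default_rng(1)
    deviations = rng.normal(0.0, 0.1, n_blades)  # e.g., stagger-like scatter

    p = p_uniform.copy()
    for k, dev in enumerate(deviations):
        shift = int(k * n_theta / n_blades)      # rotate signature to blade k's slot
        p += dev * np.roll(dp_single, shift)

    # Tone spectrum: energy now appears at engine orders below the BPF
    # (order 22), i.e., the multiple pure tones.
    spectrum = np.abs(np.fft.rfft(p)) / n_theta
    strongest = np.argsort(spectrum[1:n_blades + 1])[::-1][:5] + 1
    print("strongest low engine orders:", strongest)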
An open-source framework for testing tracking devices using Lego Mindstorms
NASA Astrophysics Data System (ADS)
Jomier, Julien; Ibanez, Luis; Enquobahrie, Andinet; Pace, Danielle; Cleary, Kevin
2009-02-01
In this paper, we present an open-source framework for testing tracking devices in surgical navigation applications. At the core of an image-guided intervention system is the tracking interface that handles communication with the tracking device and gathers tracking information. Given that the correctness of tracking information is critical for protecting patient safety and for ensuring the successful execution of an intervention, the tracking software component needs to be thoroughly tested on a regular basis. Furthermore, with the widespread use of extreme programming methodology, which emphasizes continuous and incremental testing of application components, testing design becomes critical. While it is easy to automate most of the testing process, it is often more difficult to test components that require manual intervention, such as the tracking device. Our framework consists of a robotic arm built from a set of Lego Mindstorms and an open-source toolkit written in C++ to control the robot movements and assess the accuracy of the tracking devices. The application program interface (API) is cross-platform and runs on Windows, Linux and MacOS. We applied this framework to the continuous testing of the Image-Guided Surgery Toolkit (IGSTK), an open-source toolkit for image-guided surgery, and have shown that regression testing of tracking devices can be performed at low cost and significantly improve the quality of the software.
HPC Programming on Intel Many-Integrated-Core Hardware with MAGMA Port to Xeon Phi
Dongarra, Jack; Gates, Mark; Haidar, Azzam; ...
2015-01-01
This paper presents the design and implementation of several fundamental dense linear algebra (DLA) algorithms for multicore with Intel Xeon Phi coprocessors. In particular, we consider algorithms for solving linear systems. Further, we give an overview of the MAGMA MIC library, an open source, high performance library, that incorporates the developments presented here and, more broadly, provides the DLA functionality equivalent to that of the popular LAPACK library while targeting heterogeneous architectures that feature a mix of multicore CPUs and coprocessors. The LAPACK-compliance simplifies the use of the MAGMA MIC library in applications, while providing them with portably performant DLA. High performance is obtained through the use of the high-performance BLAS, hardware-specific tuning, and a hybridization methodology whereby we split the algorithm into computational tasks of various granularities. Execution of those tasks is properly scheduled over the heterogeneous hardware by minimizing data movements and mapping algorithmic requirements to the architectural strengths of the various heterogeneous hardware components. Our methodology and programming techniques are incorporated into the MAGMA MIC API, which abstracts the application developer from the specifics of the Xeon Phi architecture and is therefore applicable to algorithms beyond the scope of DLA.
SpaceWire Driver Software for Special DSPs
NASA Technical Reports Server (NTRS)
Clark, Douglas; Lux, James; Nishimoto, Kouji; Lang, Minh
2003-01-01
A computer program provides a high-level C-language interface to electronics circuitry that controls a SpaceWire interface in a system based on a space-qualified version of the ADSP-21020 digital signal processor (DSP). SpaceWire is a spacecraft-oriented standard for packet-switching data-communication networks that comprise nodes connected through bidirectional digital serial links that utilize low-voltage differential signaling (LVDS). The software is tailored to the SMCS-332 application-specific integrated circuit (ASIC) (also available as the TSS901E), which provides three high-speed (150 Mbps) serial point-to-point links compliant with the proposed Institute of Electrical and Electronics Engineers (IEEE) Standard 1355.2 and the equivalent European Space Agency (ESA) Standard ECSS-E-50-12. In the specific application of this software, the SpaceWire ASIC was combined with the DSP processor, memory, and control logic in a Multi-Chip Module DSP (MCM-DSP). The software is a collection of low-level driver routines that provide a simple message-passing application programming interface (API) for software running on the DSP. Routines are provided for interrupt-driven access to the two styles of interface provided by the SMCS: (1) the "word at a time" conventional host interface (HOCI); and (2) a higher performance "dual port memory" style interface (COMI).
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-18
...-time MIAX API Testing and Certification fees are based upon the category of Member being tested and... Applications and Trading Permits. a. Application for MIAX Membership A one-time application fee based upon the... Trading Permit is issued will be pro-rated based on the number of trading days occurring after the date on...
Ren, Hui; Koo, Junghui; Guan, Baoxiang; Yue, Ping; Deng, Xingming; Chen, Mingwei; Khuri, Fadlo R; Sun, Shi-Yong
2013-11-22
The novel Akt inhibitor, API-1, induces apoptosis through undefined mechanisms. The current study focuses on revealing the mechanisms by which API-1 induces apoptosis. API-1 rapidly and potently reduced the levels of Mcl-1 primarily in API-1-sensitive lung cancer cell lines. Ectopic expression of Mcl-1 protected cells from induction of apoptosis by API-1. API-1 treatment decreased the half-life of Mcl-1, whereas inhibition of the proteasome with MG132 rescued Mcl-1 reduction induced by API-1. API-1 decreased Mcl-1 levels accompanied with a rapid increase in Mcl-1 phosphorylation (S159/T163). Moreover, inhibition of GSK3 inhibited Mcl-1 phosphorylation and reduction induced by API-1 and antagonized the effect of API-1 on induction of apoptosis. Knockdown of either FBXW7 or β-TrCP alone, both of which are E3 ubiquitin ligases involved in Mcl-1 degradation, only partially rescued Mcl-1 reduction induced by API-1. However, double knockdown of both E3 ubiquitin ligases enhanced the rescue of API-1-induced Mcl-1 reduction. API-1 induces GSK3-dependent, β-TrCP- and FBXW7-mediated Mcl-1 degradation, resulting in induction of apoptosis.
2013-01-01
Background The novel Akt inhibitor, API-1, induces apoptosis through undefined mechanisms. The current study focuses on revealing the mechanisms by which API-1 induces apoptosis. Results API-1 rapidly and potently reduced the levels of Mcl-1 primarily in API-1-sensitive lung cancer cell lines. Ectopic expression of Mcl-1 protected cells from induction of apoptosis by API-1. API-1 treatment decreased the half-life of Mcl-1, whereas inhibition of the proteasome with MG132 rescued Mcl-1 reduction induced by API-1. API-1 decreased Mcl-1 levels accompanied by a rapid increase in Mcl-1 phosphorylation (S159/T163). Moreover, inhibition of GSK3 inhibited Mcl-1 phosphorylation and reduction induced by API-1 and antagonized the effect of API-1 on induction of apoptosis. Knockdown of either FBXW7 or β-TrCP alone, both of which are E3 ubiquitin ligases involved in Mcl-1 degradation, only partially rescued Mcl-1 reduction induced by API-1. However, double knockdown of both E3 ubiquitin ligases enhanced the rescue of API-1-induced Mcl-1 reduction. Conclusions API-1 induces GSK3-dependent, β-TrCP- and FBXW7-mediated Mcl-1 degradation, resulting in induction of apoptosis. PMID:24261825
76 FR 51401 - Manufacturer of Controlled Substances; Notice of Application
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-18
... a bulk manufacturer of the following basic classes of controlled substances: Drug Schedule Marihuana... ingredients (APIs) for distribution to its customers. In reference to drug code 7360 (Marihuana), the company...
DOE Office of Scientific and Technical Information (OSTI.GOV)
LeGendre, M.
2012-04-01
We are seeking a code review of patches against DyninstAPI 8.0. DyninstAPI is an open source binary instrumentation library from the University of Wisconsin and University of Maryland. Our patches port DyninstAPI to the BlueGene/P and BlueGene/Q systems, as well as fix DyninstAPI bugs and implement minor new features in DyninstAPI.
Wirges, M; Funke, A; Serno, P; Knop, K; Kleinebudde, P
2013-05-05
Incorporation of an active pharmaceutical ingredient (API) into the coating layer of film-coated tablets is a method mainly used to formulate fixed-dose combinations. Uniform and precise spray-coating of an API represents a substantial challenge, which could be overcome by applying Raman spectroscopy as a process analytical tool. In the pharmaceutical industry, Raman spectroscopy is still mainly used as a bench-top laboratory analytical method and is usually not implemented in the production process. Regarding in-process application, many scientific approaches stop at the feasibility-study stage and never make the step to production scale and routine process use. The present work focuses on the scale-up of an active coating process, a step of the highest importance during pharmaceutical development. Active coating experiments were performed at lab and production scale. Using partial least squares (PLS), a multivariate model was constructed by correlating in-line measured Raman spectral data with the coated amount of API. By transferring this model, implemented for a lab-scale process, to a production-scale process, the robustness of this analytical method, and thus its applicability as a Process Analytical Technology (PAT) tool for correct endpoint determination in pharmaceutical manufacturing, could be shown. Finally, this method was validated according to the European Medicines Agency (EMA) guideline with respect to the special requirements of the applied in-line model development strategy. Copyright © 2013 Elsevier B.V. All rights reserved.
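For readers unfamiliar with the modelling step, the sketch below shows the general PLS calibration pattern described above, regressing spectra against the coated API amount, using scikit-learn and synthetic stand-in data; it is illustrative only and not the authors' model.

```python
# Illustrative PLS calibration: correlate (synthetic) in-line Raman spectra
# with the coated amount of API, then reuse the fitted model for prediction.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_spectra, n_wavenumbers = 120, 800
X = rng.standard_normal((n_spectra, n_wavenumbers))       # stand-in Raman spectra
w = rng.standard_normal(n_wavenumbers)                    # latent spectral signature
y = X @ w * 0.01 + rng.normal(0.0, 0.1, n_spectra)        # coated API amount (a.u.)

pls = PLSRegression(n_components=5)                       # number of latent variables
print("CV R^2:", cross_val_score(pls, X, y, cv=5, scoring="r2"))
pls.fit(X, y)                                             # calibrate (e.g., at lab scale)
print("predicted value:", pls.predict(X[:1]).ravel()[0])  # apply to a new spectrum
```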
Flexible Architecture for FPGAs in Embedded Systems
NASA Technical Reports Server (NTRS)
Clark, Duane I.; Lim, Chester N.
2012-01-01
Commonly, field-programmable gate arrays (FPGAs) being developed in cPCI embedded systems include the bus interface in the FPGA. This complicates the development because the interface is complicated and requires a lot of development time and FPGA resources. In addition, flight qualification requires a substantial amount of time be devoted to just this interface. Another complication of putting the cPCI interface into the FPGA being developed is that configuration information loaded into the device by the cPCI microprocessor is lost when a new bit file is loaded, requiring cumbersome operations to return the system to an operational state. Finally, SRAM-based FPGAs are typically programmed via specialized cables and software, with programming files being loaded either directly into the FPGA, or into PROM devices. This can be cumbersome when doing FPGA development in an embedded environment, and does not have an easy path to flight. Currently, FPGAs used in space applications are usually programmed via multiple space-qualified PROM devices that are physically large and require extra circuitry (typically including a separate one-time programmable FPGA) to enable them to be used for this application. This technology adds a cPCI interface device with a simple, flexible, high-performance backend interface supporting multiple backend FPGAs. It includes a mechanism for programming the FPGAs directly via the microprocessor in the embedded system, eliminating specialized hardware, software, and PROM devices and their associated circuitry. It has a direct path to flight, and no extra hardware and minimal software are required to support reprogramming in flight. The device added is currently a small FPGA, but an advantage of this technology is that the design of the device does not change, regardless of the application in which it is being used. This means that it needs to be qualified for flight only once, and is suitable for one-time programmable devices or an application specific integrated circuit (ASIC). An application programming interface (API) further reduces the development time needed to use the interface device in a system.
2013-01-01
Background Alpha-1 proteinase inhibitor (API) is a plasma serpin superfamily member that inhibits neutrophil elastase; variant API M358R inhibits thrombin and activated protein C (APC). Fusing residues 1-75 of another serpin, heparin cofactor II (HCII), to API M358R (in HAPI M358R) was previously shown to accelerate thrombin inhibition over API M358R by conferring thrombin exosite 1 binding properties. We hypothesized that replacing the HCII 1-75 region with the 13 C-terminal residues (triskaidecapeptide) of hirudin variant 3 (HV3 54-66) would further enhance the inhibitory potency of API M358R fusion proteins. We therefore expressed HV3API M358R (HV3 54-66 fused to API M358R) and HV3API RCL5 (HV3 54-66 fused to API F352A/L353V/E354V/A355I/I356A/I460L/M358R) as N-terminally hexahistidine-tagged polypeptides in E. coli. Results HV3API M358R inhibited thrombin 3.3-fold more rapidly than API M358R; for HV3API RCL5 the rate enhancement was 1.9-fold versus API RCL5; neither protein inhibited thrombin as rapidly as HAPI M358R. While the thrombin/Activated Protein C rate constant ratio was 77-fold higher for HV3API RCL5 than for HV3API M358R, most of the increased specificity derived from the API F352A/L353V/E354V/A355I/I356A/I460L API RCL5 mutations, since API RCL5 remained 3-fold more specific than HV3API RCL5. An HV3 54-66 peptide doubled the Thrombin Clotting Time (TCT) and halved the binding of thrombin to immobilized HCII 1-75 at lower concentrations than free HCII 1-75. HV3API RCL5 bound active site-inhibited FPR-chloromethyl ketone-thrombin more effectively than HAPI RCL5. Transferring the position of the fused HV3 triskaidecapeptide to the C-terminus of API M358R decreased the rate of thrombin inhibition relative to that mediated by HV3API M358R by 11- to 14-fold. Conclusions Fusing the C-terminal triskaidecapeptide of HV3 to API M358R-containing serpins significantly increased their effectiveness as thrombin inhibitors, but the enhancement was less than that seen in HCII 1-75–API M358R fusion proteins. HCII 1-75 was a superior fusion partner, in spite of the greater affinity of the HV3 triskaidecapeptide, manifested both in isolated and API-fused form, for thrombin exosite 1. Our results suggest that HCII 1-75 binds thrombin exosite 1 and orients the attached serpin scaffold for more efficient interaction with the active site of thrombin than the HV3 triskaidecapeptide. PMID:24215622
Automated Formal Testing of C API Using T2C Framework
NASA Astrophysics Data System (ADS)
Khoroshilov, Alexey V.; Rubanov, Vladimir V.; Shatokhin, Eugene A.
The problem of automated test development for checking the basic functionality of application programming interfaces (APIs) is discussed. Different technologies and corresponding tools are surveyed, and the T2C technology developed at ISPRAS is presented. The technology and associated tools facilitate development of "medium quality" (and "medium cost") tests. An important feature of T2C is that it enforces that each check in a developed test is explicitly linked to the corresponding place in the standard, and the T2C tools provide convenient means to create such linkage. The results of using T2C are illustrated by the example of a project for testing interfaces of Linux system libraries defined by the LSB standard.
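The sketch below illustrates, in a toy form, the core T2C idea of tying every individual check to the clause of the standard it verifies; the check helper, the requirement string, and the tested function are all invented for the example and are not the T2C tooling itself.

```python
# Hypothetical illustration of spec-linked checks: every pass/fail result is
# tagged with the clause of the standard it verifies. All names and the
# requirement reference below are invented for the example.
def check(condition, requirement, message):
    """Record a pass/fail result tagged with the clause it verifies."""
    status = "PASSED" if condition else "FAILED"
    print(f"{status}: {message} [{requirement}]")
    return condition

def test_abs():
    # Hypothetical requirement reference for the abs() behaviour being checked.
    check(abs(-5) == 5, "ISO/IEC 9899, 7.22.6.1", "abs() returns the magnitude")
    check(abs(0) == 0, "ISO/IEC 9899, 7.22.6.1", "abs(0) is 0")

test_abs()
```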
Underwater Munitions Expert System to Predict Mobility and Burial
2017-11-14
exposure and aggregation for underwater munitions. Subject terms: Underwater Munitions, Mobility, Burial, Application Programmer Interface... Munitions Expert System: Demonstration and Evaluation Report. Acronyms: API – Application Programmer Interface; APL – Applied Physics... comparisons and traditional metrics such as the coefficient of correlation. The summary statistic for the comparisons of burial results
USDA-ARS?s Scientific Manuscript database
Mosquito control reduces populations of mosquitoes to minimize the risk of mosquito-borne diseases. As part of an integrated approach to mosquito control, application of adulticides can be effective in rapidly reducing mosquito populations during times of high arbovirus transmission. However, impact...
do Carmo, Ana Cerúlia Moraes; Piras, Stefânia Schimaneski; Rocha, Nayrton Flávio Moura; Gratieri, Tais
2017-01-01
Objective. The marketing authorization of generic and similar pharmaceutical drug products involves the analysis of proposing company's administrative aspects as well as drug product technical description and scientific evaluations. This study evaluated the main reasons for registration refusal of generic and similar pharmaceutical drug products in Brazil. The aim is to help future applicants to better organize the proposal. Methods. A retrospective search of drug products registration processes was performed on the Brazilian Government Official Gazette from January 1, 2015, and December 31, 2015. Results. Drug product quality control, drug product stability study, deadline accomplishment, API quality control made by drug manufacturer, active pharmaceutical ingredient (API), and production report were the main reasons for marketing authorization application refusal of generic and similar pharmaceutical drug products in 2015. Conclusion. Disclosure of the reasons behind failed applications is a step forward on regulatory transparency. Sharing of experiences is essential to international regulatory authorities and organizations to improve legislation requirements for the marketing authorization of generic and similar pharmaceutical drug products.
do Carmo, Ana Cerúlia Moraes; Piras, Stefânia Schimaneski; Rocha, Nayrton Flávio Moura
2017-01-01
Objective. The marketing authorization of generic and similar pharmaceutical drug products involves the analysis of proposing company's administrative aspects as well as drug product technical description and scientific evaluations. This study evaluated the main reasons for registration refusal of generic and similar pharmaceutical drug products in Brazil. The aim is to help future applicants to better organize the proposal. Methods. A retrospective search of drug products registration processes was performed on the Brazilian Government Official Gazette from January 1, 2015, and December 31, 2015. Results. Drug product quality control, drug product stability study, deadline accomplishment, API quality control made by drug manufacturer, active pharmaceutical ingredient (API), and production report were the main reasons for marketing authorization application refusal of generic and similar pharmaceutical drug products in 2015. Conclusion. Disclosure of the reasons behind failed applications is a step forward on regulatory transparency. Sharing of experiences is essential to international regulatory authorities and organizations to improve legislation requirements for the marketing authorization of generic and similar pharmaceutical drug products. PMID:28280742
Xu, Xiaoma; van de Craats, Anick M; de Bruyn, Peter C A M
2004-11-01
A highly sensitive screening method based on high performance liquid chromatography atmospheric pressure ionization mass spectrometry (HPLC-API-MS) has been developed for the analysis of 21 nitroaromatic, nitramine and nitrate ester explosives, which include the explosives most commonly encountered in forensic science. Two atmospheric pressure ionization (API) methods, atmospheric pressure chemical ionization (APCI) and electrospray ionization (ESI), and various experimental conditions have been applied to allow for the detection of all 21 explosive compounds. The limit of detection (LOD) in the full-scan mode has been found to be 0.012-1.2 ng on column for the screening of most explosives investigated. For nitrobenzene, an LOD of 10 ng was found with the APCI method in the negative mode. Although the detection of nitrobenzene, 2-, 3-, and 4-nitrotoluene is hindered by the difficult ionization of these compounds, we have found that by forming an adduct with glycine, LOD values in the range of 3-16 ng on column can be achieved. Compared with previous screening methods with thermospray ionization, the API method has distinct advantages, including simplicity and stability of the method applied, an extended screening range and a low detection limit for the explosives studied.
Dobo, Krista L; Greene, Nigel; Cyr, Michelle O; Caron, Stéphane; Ku, Warren W
2006-04-01
Starting materials and intermediates used to synthesize pharmaceuticals are reactive in nature and may be present as impurities in the active pharmaceutical ingredient (API) used for preclinical safety studies and clinical trials. Furthermore, starting materials and intermediates may be known or suspected mutagens and/or carcinogens. Therefore, during drug development due diligence needs to be applied from two perspectives: (1) to understand potential mutagenic and carcinogenic risks associated with compounds used for synthesis and (2) to understand the capability of synthetic processes to control genotoxic impurities in the API. Recently, a task force composed of experts from the pharmaceutical industry proposed guidance, with recommendations for classification, testing, qualification and assessing risk of genotoxic impurities. In our experience the proposed structure-based classification has differentiated 75% of starting materials and intermediates as mutagenic and non-mutagenic with high concordance (92%) when compared with Ames results. Structure-based assessment has been used to identify genotoxic hazards, and prompted evaluation of the fate of genotoxic impurities in the API. These two assessments (safety and chemistry) culminate in identification of genotoxic impurities known or suspected to exceed acceptable levels in the API, thereby triggering actions needed to assure appropriate control and measurement methods are in place. Hypothetical case studies are presented demonstrating this multi-disciplinary approach.
Bluhm, Martina E. C.; Schneider, Viktoria A. F.; Schäfer, Ingo; Piantavigna, Stefania; Goldbach, Tina; Knappe, Daniel; Seibel, Peter; Martin, Lisandra L.; Veldhuizen, Edwin J. A.; Hoffmann, Ralf
2016-01-01
The Gram-negative bacterium Pseudomonas aeruginosa is a life-threatening nosocomial pathogen due to its generally low susceptibility toward antibiotics. Furthermore, many strains have acquired resistance mechanisms requiring new antimicrobials with novel mechanisms to enhance treatment options. Proline-rich antimicrobial peptides, such as the apidaecin analog Api137, are highly efficient against various Enterobacteriaceae infections in mice, but less active against P. aeruginosa in vitro. Here, we extended our recent work by optimizing lead peptides Api755 (gu-OIORPVYOPRPRPPHPRL-OH; gu = N,N,N′,N′-tetramethylguanidino, O = L-ornithine) and Api760 (gu-OWORPVYOPRPRPPHPRL-OH) by incorporation of Ile-Orn- and Trp-Orn-motifs, respectively. Api795 (gu-O(IO)2RPVYOPRPRPPHPRL-OH) and Api794 (gu-O(WO)3RPVYOPRPRPPHPRL-OH) were highly active against P. aeruginosa with minimal inhibitory concentrations of 8–16 and 8–32 μg/mL against Escherichia coli and Klebsiella pneumoniae. Assessed using a quartz crystal microbalance, these peptides inserted into a membrane layer and the surface activity increased gradually from Api137, over Api795, to Api794. This mode of action was confirmed by transmission electron microscopy indicating some membrane damage only at the high peptide concentrations. Api794 and Api795 were highly stable against serum proteases (half-life times >5 h) and non-hemolytic to human erythrocytes at peptide concentrations of 0.6 g/L. At this concentration, Api795 reduced the cell viability of HeLa cells only slightly, whereas the IC50 of Api794 was 0.23 ± 0.09 g/L. Confocal fluorescence microscopy revealed no colocalization of 5(6)-carboxyfluorescein-labeled Api794 or Api795 with the mitochondria, excluding interactions with the mitochondrial membrane. Interestingly, Api795 was localized in endosomes, whereas Api794 was present in endosomes and the cytosol. This was verified using flow cytometry showing a 50% higher uptake of Api794 in HeLa cells compared with Api795. The uptake was reduced for both peptides by 50 and 80%, respectively, after inhibiting endocytotic uptake with dynasore. In summary, Api794 and Api795 were highly active against P. aeruginosa in vitro. Both peptides passed across the bacterial membrane efficiently, most likely then disturbing the ribosome assembly, and resulting in further intracellular damage. Api795 with its IOIO-motif, which was particularly active and only slightly toxic in vitro, appears to represent a promising third generation lead compound for the development of novel antibiotics against P. aeruginosa. PMID:27243004
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Lei; Holden, Jacob; Gonder, Jeffrey D
The green routing strategy instructing a vehicle to select a fuel-efficient route benefits the current transportation system with fuel-saving opportunities. This paper introduces a navigation API route fuel-saving evaluation framework for estimating fuel advantages of alternative API routes based on large-scale, real-world travel data for conventional vehicles (CVs) and hybrid electric vehicles (HEVs). The navigation APIs, such as the Google Directions API, integrate traffic conditions and provide feasible alternative routes for origin-destination pairs. This paper develops two link-based fuel-consumption models stratified by link-level speed, road grade, and functional class (local/non-local), one for CVs and the other for HEVs. The link-based fuel-consumption models are built by assigning travel from a large number of GPS driving traces to the links in TomTom MultiNet as the underlying road network layer and road grade data from a U.S. Geological Survey elevation data set. Fuel consumption on a link is calculated by the proposed fuel consumption model. This paper envisions two kinds of applications: 1) identifying alternate routes that save fuel, and 2) quantifying the potential fuel savings for large amounts of travel. An experiment based on a large-scale California Household Travel Survey GPS trajectory data set is conducted. The fuel consumption and savings of CVs and HEVs are investigated. At the same time, the trade-off between fuel saving and time saving for choosing different routes is also examined for both powertrains.
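The sketch below shows the general evaluation pattern: request alternative routes from a navigation API (the Google Directions endpoint and parameters as publicly documented) and score each route with a per-link fuel estimate. The flat fuel rate is an invented placeholder, not the paper's stratified CV/HEV models, and the API key is a placeholder.

```python
# Hedged sketch: fetch alternative routes from a navigation API and compare
# them with a toy link-based fuel estimate (placeholder coefficients).
import requests

def fuel_for_route(route, litres_per_km=0.08):
    """Toy estimate: sum step distances and apply a flat consumption rate."""
    metres = sum(step["distance"]["value"]
                 for leg in route["legs"] for step in leg["steps"])
    return metres / 1000.0 * litres_per_km

params = {
    "origin": "Golden, CO",
    "destination": "Denver, CO",
    "alternatives": "true",          # ask for alternative routes
    "key": "YOUR_API_KEY",           # placeholder
}
resp = requests.get("https://maps.googleapis.com/maps/api/directions/json",
                    params=params, timeout=30)
for i, route in enumerate(resp.json().get("routes", [])):
    print(f"route {i}: ~{fuel_for_route(route):.2f} L")
```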
Park, Doori; Jung, Je Won; Lee, Mi Ok; Lee, Si Young; Kim, Boyun; Jin, Hye Jun; Kim, Jiyoung; Ahn, Young-Joon; Lee, Ki Won; Song, Yong Sang; Hong, Seunghun; Womack, James E; Kwon, Hyung Wook
2014-03-01
Insect-derived antimicrobial peptides (AMPs) have diverse antimicrobial properties and pharmacological activities such as anti-inflammatory and anticancer effects. Naturally occurring genetic polymorphisms have a direct and/or indirect influence on the pharmacological effects of AMPs; therefore, information on single nucleotide polymorphisms (SNPs) occurring in natural AMPs provides an important clue for therapeutic applications. Here we identified nucleotide polymorphisms in the melittin gene of honey bee populations, melittin being one of the most potent AMPs in bee venom. We found that a novel SNP of the melittin gene exists in two honey bee species, Apis mellifera and Apis cerana. Nine polymorphisms were identified within the coding region of the melittin gene, of which one resulted in a serine (Ser) to asparagine (Asn) substitution that can potentially affect the biological activities of the melittin peptide. Serine-substituted melittin (Mel-S) showed a stronger cytotoxic effect than asparagine-substituted melittin (Mel-N) against E. coli. Also, Mel-N and Mel-S had different inhibitory effects on the production of inflammatory factors such as IL-6 and TNF-α in BV-2 cells. Moreover, Mel-S showed stronger cytotoxic activities than the Mel-N peptide against two human ovarian cancer cell lines. Using a carbon nanotube-based transistor, we here characterized that Mel-S interacted with small unilamellar liposomes more strongly than Mel-N. Taken together, our present study demonstrates that the gene frequency and the biological activities of the melittin peptide differ between the two honey bee species, Apis mellifera and A. cerana. Copyright © 2014 Elsevier Inc. All rights reserved.
77 FR 30027 - Manufacturer of Controlled Substances; Notice of Application; Austin Pharma, LLC.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-21
... Schedule Marihuana (7360) I Tetrahydrocannabinols (7370) I The company plans to manufacture bulk active pharmaceutical ingredients (APIs) for distribution to its customers. In reference to drug code 7360 (Marihuana...
Sinpoo, Chainarong; Paxton, Robert J; Disayathanoowat, Terd; Krongdang, Sasiprapa; Chantawannakul, Panuwan
Nosema apis and Nosema ceranae are obligate intracellular microsporidian parasites infecting midgut epithelial cells of host adult honey bees, originally Apis mellifera and Apis cerana respectively. Each microsporidia cross-infects the other host and both microsporidia nowadays have a worldwide distribution. In this study, cross-infection experiments using both N. apis and N. ceranae in both A. mellifera and A. cerana were carried out to compare pathogen proliferation and impact on hosts, including host immune response. Infection by N. ceranae led to higher spore loads than by N. apis in both host species, and there was greater proliferation of microsporidia in A. mellifera compared to A. cerana. Both N. apis and N. ceranae were pathogenic in both host Apis species. N. ceranae induced subtly, though not significantly, higher mortality than N. apis in both host species, yet survival of A. cerana was no different to that of A. mellifera in response to N. apis or N. ceranae. Infections of both host species with N. apis and N. ceranae caused significant up-regulation of AMP genes and cellular mediated immune genes but did not greatly alter apoptosis-related gene expression. In this study, A. cerana enlisted a higher immune response and displayed lower loads of N. apis and N. ceranae spores than A. mellifera, suggesting it may be better able to defend itself against microsporidia infection. We caution against over-interpretation of our results, though, because differences between host and parasite species in survival were insignificant and because size differences between microsporidia species and between host Apis species may alternatively explain the differential proliferation of N. ceranae in A. mellifera. Copyright © 2017 Elsevier Ltd. All rights reserved.
Miyagi, Etsuko; Maruyama, Yasuyo; Mogami, Tae; Numazaki, Reiko; Ikeda, Atsuko; Yamamoto, Hiroshi; Hirahara, Fumiki
2017-02-01
We previously developed a new plasma amino acid profile-based index (API) to detect ovarian, cervical, and endometrial cancers. Here, we compared API to serum cancer antigen 125 (CA125) for distinguishing epithelial ovarian malignant tumors from benign growths. API and CA125 were measured preoperatively in patients with ovarian tumors, which were later classified into 59 epithelial ovarian cancers, 21 epithelial borderline malignant tumors, and 97 benign tumors including 40 endometriotic cysts. The diagnostic accuracy and cutoff points of API were evaluated using receiver operating characteristic (ROC) curves. The area under the ROC curves showed the equivalent performance of API and CA125 to discriminate between malignant/borderline malignant and benign tumors (both 0.77), and API was superior to CA125 for discrimination between malignant/borderline malignant lesions and endometriotic cysts (API, 0.75 vs. CA125, 0.59; p < 0.05). At the API cutoff level of 6.0, API and CA125 had equal positive rates of detecting cancers and borderline malignancies (API, 0.71 vs. CA125, 0.74; p = 0.84) or cancers alone (API, 0.73 vs. CA125, 0.85; p = 0.12). However, API had a significantly lower detection rate of benign endometriotic cysts (0.35; 95 % CI, 0.21-0.52) compared with that of CA125 (0.65; 95 % CI, 0.48-0.79) (p < 0.05). API is an effective new tumor marker to detect ovarian cancers and borderline malignancies with a low false-positive rate for endometriosis. A large-scale prospective clinical study using the cutoff value of API determined in this study is warranted to validate API for practical clinical use.
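The comparison above is a standard ROC analysis; the sketch below reproduces its general shape on synthetic data (not the study's patients): compute the area under the ROC curve for two markers and the positive rate at a fixed cutoff.

```python
# Illustrative only (synthetic scores, not the study's data): compare two
# tumour markers by ROC AUC and by the positive rate at a fixed cutoff.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
y = np.r_[np.ones(80), np.zeros(97)]                    # 1 = malignant/borderline
api   = np.r_[rng.normal(7.0, 2.0, 80), rng.normal(4.5, 2.0, 97)]
ca125 = np.r_[rng.normal(120, 80, 80), rng.normal(60, 60, 97)]

print("AUC API  :", round(roc_auc_score(y, api), 2))
print("AUC CA125:", round(roc_auc_score(y, ca125), 2))
print("API-positive rate among malignant cases at cutoff 6.0:",
      float(np.mean(api[y == 1] >= 6.0)))
```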
HTTP-based Search and Ordering Using ECHO's REST-based and OpenSearch APIs
NASA Astrophysics Data System (ADS)
Baynes, K.; Newman, D. J.; Pilone, D.
2012-12-01
Metadata is an important entity in the process of cataloging, discovering, and describing Earth science data. NASA's Earth Observing System (EOS) ClearingHOuse (ECHO) acts as the core metadata repository for EOSDIS data centers, providing a centralized mechanism for metadata and data discovery and retrieval. By supporting both the ESIP's Federated Search API and its own search and ordering interfaces, ECHO provides multiple capabilities that facilitate ease of discovery and access to its ever-increasing holdings. Users are able to search and export metadata in a variety of formats including ISO 19115, json, and ECHO10. This presentation aims to inform technically savvy clients interested in automating search and ordering of ECHO's metadata catalog. The audience will be introduced to practical and applicable examples of end-to-end workflows that demonstrate finding, sub-setting and ordering data that is bound by keyword, temporal and spatial constraints. Interaction with the ESIP OpenSearch Interface will be highlighted, as will ECHO's own REST-based API.
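As a hedged illustration of the kind of scripted workflow the presentation targets, the sketch below issues a keyword, temporal, and spatial query against an OpenSearch-style HTTP interface. The endpoint URL and parameter names are placeholders based on common OpenSearch conventions; the real request template comes from the service's OpenSearch description document.

```python
# Hedged sketch of a scripted metadata search over an OpenSearch-style
# interface. Endpoint and parameter names are placeholders.
import requests

params = {
    "keyword": "sea surface temperature",
    "startTime": "2012-01-01T00:00:00Z",
    "endTime": "2012-12-31T23:59:59Z",
    "boundingBox": "-180,-90,180,90",   # west,south,east,north
    "numberOfResults": 10,
}
resp = requests.get("https://example.gov/opensearch/granules.atom",  # placeholder
                    params=params, timeout=30)
print(resp.status_code, len(resp.text), "bytes of Atom results")
```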
NASA Astrophysics Data System (ADS)
Pulsani, B. R.
2017-11-01
Tank Information System is a web application which provides comprehensive information about minor irrigation tanks of Telangana State. As part of the program, a web mapping application using Flex and ArcGIS server was developed to make the data available to the public. In course of time as Flex be-came outdated, a migration of the client interface to the latest JavaScript based technologies was carried out. Initially, the Flex based application was migrated to ArcGIS JavaScript API using Dojo Toolkit. Both the client applications used published services from ArcGIS server. To check the migration pattern from proprietary to open source, the JavaScript based ArcGIS application was later migrated to OpenLayers and Dojo Toolkit which used published service from GeoServer. The migration pattern noticed in the study especially emphasizes upon the use of Dojo Toolkit and PostgreSQL database for ArcGIS server so that migration to open source could be performed effortlessly. The current ap-plication provides a case in study which could assist organizations in migrating their proprietary based ArcGIS web applications to open source. Furthermore, the study reveals cost benefits of adopting open source against commercial software's.
Event Driven Messaging with Role-Based Subscriptions
NASA Technical Reports Server (NTRS)
Bui, Tung; Bui, Bach; Malhotra, Shantanu; Chen, Fannie; Kim, rachel; Allen, Christopher; Luong, Ivy; Chang, George; Zendejas, Silvino; Sadaqathulla, Syed
2009-01-01
Event Driven Messaging with Role-Based Subscriptions (EDM-RBS) is a framework integrated into the Service Management Database (SMDB) to allow for role-based and subscription-based delivery of synchronous and asynchronous messages over JMS (Java Messaging Service), SMTP (Simple Mail Transfer Protocol), or SMS (Short Messaging Service). This allows for 24/7 operation with users in all parts of the world. The software classifies messages by triggering data type, application source, owner of data triggering event (mission), classification, sub-classification and various other secondary classifying tags. Messages are routed to applications or users based on subscription rules using a combination of the above message attributes. This program provides a framework for identifying connected users and their applications for targeted delivery of messages over JMS to the client applications the user is logged into. EDMRBS provides the ability to send notifications over e-mail or pager rather than having to rely on a live human to do it. It is implemented as an Oracle application that uses Oracle relational database management system intrinsic functions. It is configurable to use Oracle AQ JMS API or an external JMS provider for messaging. It fully integrates into the event-logging framework of SMDB (Subnet Management Database).
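The routing idea described above can be pictured with the small sketch below: each message carries classifying attributes, and it is delivered over the channel of every subscription whose rule matches those attributes. The rule format, attribute names, and channels are invented for the example and are not the EDM-RBS data model.

```python
# Minimal sketch (invented rule format) of attribute-based, subscription-driven
# message routing: a rule matches when every attribute it names equals the
# message's value for that attribute.
subscriptions = [
    {"user": "ops_lead",  "channel": "sms",  "rule": {"classification": "alarm"}},
    {"user": "scientist", "channel": "smtp", "rule": {"mission": "MRO",
                                                      "classification": "event"}},
]

def matches(rule, message):
    return all(message.get(k) == v for k, v in rule.items())

def route(message):
    return [(s["user"], s["channel"]) for s in subscriptions
            if matches(s["rule"], message)]

msg = {"mission": "MRO", "classification": "event", "source": "telemetry"}
print(route(msg))   # -> [('scientist', 'smtp')]
```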
When Will It Be …?: U.S. Naval Observatory Religious Calendar Computers Expanded
NASA Astrophysics Data System (ADS)
Bartlett, Jennifer L.; Chizek Frouard, Malynda; Ziegler, Cross; Lesniak, Michael V.
2017-01-01
Reflecting increasing sensitivity to differing religious practices, the U.S. Naval Observatory (USNO) has expanded its on-line calendar resources to compute additional religious dates for specific years via an Application Programming Interface (API). This flexible method now identifies Christian, Islamic, and Jewish events in JavaScript Object Notation (JSON) that anyone can use. Selected Christian Observances (http://aa.usno.navy.mil/data/docs/easter.php) returns dates of eight events for years after 1582 C.E. (A.D. 1582): Ash Wednesday, Palm Sunday, Good Friday, Easter, Ascension, Whit Sunday, Trinity Sunday, and the first Sunday of Advent. The determination of Easter, a moveable feast, uses the method of western Christian churches. Selected Islamic Observances (http://aa.usno.navy.mil/data/docs/islamic.php) returns approximate Gregorian dates of three events for years after 1582 C.E. (A.H. 990) and Julian dates for 622-1582 C.E. (A.H. 1-990) along with the corresponding Islamic year (anno Hegirae). Ramadân, Shawwál, and the Islamic year begin at sunset on the preceding Gregorian or Julian date. For planning purposes, the determination of these dates uses a tabular calendar; in practice, observation of the appropriate waxing crescent Moon determines the actual date, which may vary. Selected Jewish Observances (http://aa.usno.navy.mil/data/docs/passover.php) returns Gregorian dates of six events for years after 1582 C.E. (A.M. 5342) and Julian dates for the years 360-1582 C.E. (A.M. 4120-5342) along with the corresponding Jewish year (anno Mundi). Passover, Shavuot, Rosh Hashanah, Yom Kippur, and Hanukkah begin at sunset on the preceding Gregorian or Julian date. On-line documentation for using the API-enabled calendar computers, including sample calls, is available (http://aa.usno.navy.mil/data/docs/api.php). The webpage also describes how to use the API with the Complete Sun and Moon Data for One Day, Phases of the Moon, Solar Eclipse Computer, Day and Night Across the Earth, Apparent Disk of a Solar System Object, Julian Date Conversion, and Sidereal Time services. Introduction to Calendars (http://aa.usno.navy.mil/faq/docs/calendars.php) provides an overview of the topic and links to additional resources.
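A script can consume such a JSON calendar service in a few lines; the sketch below is hedged, with a placeholder host, path, parameter name, and response layout, since the real request template is given in the API documentation page cited in the abstract.

```python
# Hedged sketch of calling a JSON calendar service. The URL, parameters and
# response layout below are placeholders, not the documented USNO endpoints.
import requests

resp = requests.get("https://example.navy.mil/api/christian",  # placeholder URL
                    params={"year": 2017}, timeout=30)
data = resp.json()                      # JSON, per the abstract
for event in data.get("data", []):      # assumed response layout
    print(event)
```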
Framework for Development and Distribution of Hardware Acceleration
NASA Astrophysics Data System (ADS)
Thomas, David B.; Luk, Wayne W.
2002-07-01
This paper describes IGOL, a framework for developing reconfigurable data processing applications. While IGOL was originally designed to target imaging and graphics systems, its structure is sufficiently general to support a broad range of applications. IGOL adopts a four-layer architecture: application layer, operation layer, appliance layer and configuration layer. This architecture is intended to separate and co-ordinate both the development and execution of hardware and software components. Hardware developers can use IGOL as an instance testbed for verification and benchmarking, as well as for distribution. Software application developers can use IGOL to discover hardware accelerated data processors, and to access them in a transparent, non-hardware specific manner. IGOL provides extensive support for the RC1000-PP board via the Handel-C language, and a wide selection of image processing filters have been developed. IGOL also supplies plug-ins to enable such filters to be incorporated in popular applications such as Premiere, Winamp, VirtualDub and DirectShow. Moreover, IGOL allows the automatic use of multiple cards to accelerate an application, demonstrated using DirectShow. To enable transparent acceleration without sacrificing performance, a three-tiered COM (Component Object Model) API has been designed and implemented. This API provides a well-defined and extensible interface which facilitates the development of hardware data processors that can accelerate multiple applications.
NASA Astrophysics Data System (ADS)
Rahman, Nurul Hidayah Ab; Abdullah, Nurul Azma; Hamid, Isredza Rahmi A.; Wen, Chuah Chai; Jelani, Mohamad Shafiqur Rahman Mohd
2017-10-01
A Closed-Circuit TV (CCTV) system is one of the technologies in the surveillance field that addresses detection and monitoring by providing extra features such as email alerts or motion detection. However, detecting events and alerting the admin in a CCTV system can be complicated by the need to integrate the main program with an external Application Programming Interface (API). In this study, a pixel-processing algorithm is applied due to its efficiency, and an SMS alert is added as an alternative for users who opt out of the email alert system or have no Internet connection. A CCTV system with SMS alert (CMDSA) was developed using an evolutionary prototyping methodology. The system interface was implemented using Microsoft Visual Studio, while the backend components, namely the database and the code, were implemented on an SQLite database and in the C# programming language, respectively. The main modules of CMDSA are motion detection, capturing and saving video, image processing, and Short Message Service (SMS) alert functions. The system is able to reduce processing time, making detection faster, reduce the space and memory used to run the program, and alert the system admin instantly.
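The pixel-processing step described above is essentially frame differencing; the sketch below shows that general technique with OpenCV. The camera index, thresholds, and the alert hook are assumptions, and the authors' C# implementation is not reproduced here.

```python
# Rough sketch of pixel-based motion detection: flag motion when the fraction
# of pixels that changed between consecutive frames exceeds a threshold.
import cv2

cap = cv2.VideoCapture(0)                      # first attached camera (assumption)
ok, prev = cap.read()
if not ok:
    raise RuntimeError("no camera frame available")
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, prev)             # per-pixel change since last frame
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    if mask.mean() / 255.0 > 0.02:             # more than 2% of pixels changed
        print("motion detected -- hook SMS alert / video capture here")
    prev = gray

cap.release()
```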
NASA Astrophysics Data System (ADS)
Schroeder, P. C.; Luhmann, J. G.; Davis, A. J.; Russell, C. T.
2006-12-01
STEREO's IMPACT (In-situ Measurements of Particles and CME Transients) investigation provides the first opportunity for long duration, detailed observations of 1 AU magnetic field structures, plasma and suprathermal electrons, and energetic particles at points bracketing Earth's heliospheric location. The PLASTIC instrument takes plasma ion composition measurements completing STEREO's comprehensive in-situ perspective. Stereoscopic/3D information from the STEREO SECCHI imagers and SWAVES radio experiment make it possible to use both multipoint and quadrature studies to connect interplanetary Coronal Mass Ejections (ICME) and solar wind structures to CMEs and coronal holes observed at the Sun. The uniqueness of the STEREO mission requires novel data analysis tools and techniques to take advantage of the mission's full scientific potential. An interactive browser with the ability to create publication-quality plots has been developed which integrates STEREO's in-situ data with data from a variety of other missions including WIND and ACE. Also, an application program interface (API) is provided allowing users to create custom software that ties directly into STEREO's data set. The API allows for more advanced forms of data mining than currently available through most web-based data services. A variety of data access techniques and the development of cross-spacecraft data analysis tools allow the larger scientific community to combine STEREO's unique in-situ data with those of other missions, particularly the L1 missions, and, therefore, to maximize STEREO's scientific potential in gaining a greater understanding of the heliosphere.
Ruffier, Magali; Kähäri, Andreas; Komorowska, Monika; Keenan, Stephen; Laird, Matthew; Longden, Ian; Proctor, Glenn; Searle, Steve; Staines, Daniel; Taylor, Kieron; Vullo, Alessandro; Yates, Andrew; Zerbino, Daniel; Flicek, Paul
2017-01-01
The Ensembl software resources are a stable infrastructure to store, access and manipulate genome assemblies and their functional annotations. The Ensembl 'Core' database and Application Programming Interface (API) was our first major piece of software infrastructure and remains at the centre of all of our genome resources. Since its initial design more than fifteen years ago, the number of publicly available genomic, transcriptomic and proteomic datasets has grown enormously, accelerated by continuous advances in DNA-sequencing technology. Initially intended to provide annotation for the reference human genome, we have extended our framework to support the genomes of all species as well as richer assembly models. Cross-referenced links to other informatics resources facilitate searching our database with a variety of popular identifiers such as UniProt and RefSeq. Our comprehensive and robust framework storing a large diversity of genome annotations in one location serves as a platform for other groups to generate and maintain their own tailored annotation. We welcome reuse and contributions: our databases and APIs are publicly available, all of our source code is released with a permissive Apache v2.0 licence at http://github.com/Ensembl and we have an active developer mailing list ( http://www.ensembl.org/info/about/contact/index.html ). http://www.ensembl.org. © The Author(s) 2017. Published by Oxford University Press.
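The paper describes the Perl Core API; as a related illustration of programmatic access to the same annotation, the sketch below queries Ensembl's public REST service for a gene record, using the lookup endpoint as documented at rest.ensembl.org at the time of writing (field names in the response may change between releases).

```python
# Illustrative query against Ensembl's public REST service (not the Perl Core
# API the paper describes): look up a gene by symbol and print its location.
import requests

resp = requests.get(
    "https://rest.ensembl.org/lookup/symbol/homo_sapiens/BRCA2",
    headers={"Content-Type": "application/json"},
    timeout=30,
)
gene = resp.json()
print(gene.get("id"), gene.get("seq_region_name"),
      gene.get("start"), gene.get("end"))
```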
Fortunak, Joseph M; de Souza, Rodrigo O M A; Kulkarni, Amol A; King, Christopher L; Ellison, Tiffany; Miranda, Leandro S M
2014-01-01
Active pharmaceutical ingredients (APIs) are the molecular entities that exert the therapeutic effects of medicines. This article provides an overview of the major APIs that are entered into antiretroviral therapy (ART), outlines how APIs are manufactured, and examines the regulatory and cost frameworks of manufacturing ART APIs used in low- and middle-income countries (LMICs). Almost all APIs for ART are prepared by chemical synthesis. Roughly 15 APIs account for essentially all of the ARTs used in LMICs. Nearly all of the ART APIs purchased through the Global Fund for AIDS, TB and Malaria (GFATM) or the United States President's Emergency Plan for AIDS Relief (PEPFAR) are produced by generic companies. API costs are very important because they are the largest contribution to the overall cost of ART. Efficient API production requires substantial investment in chemical manufacturing technologies and the ready availability of raw materials and energy at competitive prices. Generic API production is practiced in only a limited number of countries; the API market for ART is dominated by Indian companies. The quality of these APIs is ensured by manufacturing under good manufacturing practice (GMP), including process validation, testing against previously established specifications and the demonstration of clinical bioequivalence. The investment and personnel costs of a quality management system for GMP contribute significantly to the cost of API production. Chinese companies are the major suppliers for many advanced intermediates in API production. Improved chemistry of manufacturing, economies of scale and optimization of procurement have enabled drastic cost reductions for many ART APIs. The available capacity for global production of quality-assured APIs is likely adequate to meet forecasted demand for 2015. The increased use of ART for paediatric treatment, for second-line and salvage therapy, and the introduction of new APIs and combinations are important factors for the future of treatment in LMICs. The introduction of new fixed-dose combinations for ART and use of new drug delivery technologies could plausibly provide robust, durable ART for all patients in need, at an overall cost that is only moderately higher than what is presently being spent.
Fortunak, Joseph M; de Souza, Rodrigo OMA; Kulkarni, Amol A; King, Christopher L; Ellison, Tiffany; Miranda, Leandro SM
2015-01-01
Active pharmaceutical ingredients (APIs) are the molecular entities that exert the therapeutic effects of medicines. This article provides an overview of the major APIs that are entered into antiretroviral therapy (ART), outlines how APIs are manufactured, and examines the regulatory and cost frameworks of manufacturing ART APIs used in low- and middle-income countries (LMICs). Almost all APIs for ART are prepared by chemical synthesis. Roughly 15 APIs account for essentially all of the ARTs used in LMICs. Nearly all of the ART APIs purchased through the Global Fund for AIDS, TB and Malaria (GFATM) or the United States President’s Emergency Plan for AIDS Relief (PEPFAR) are produced by generic companies. API costs are very important because they are the largest contribution to the overall cost of ART. Efficient API production requires substantial investment in chemical manufacturing technologies and the ready availability of raw materials and energy at competitive prices. Generic API production is practiced in only a limited number of countries; the API market for ART is dominated by Indian companies. The quality of these APIs is ensured by manufacturing under good manufacturing practice (GMP), including process validation, testing against previously established specifications and the demonstration of clinical bioequivalence. The investment and personnel costs of a quality management system for GMP contribute significantly to the cost of API production. Chinese companies are the major suppliers for many advanced intermediates in API production. Improved chemistry of manufacturing, economies of scale and optimization of procurement have enabled drastic cost reductions for many ART APIs. The available capacity for global production of quality-assured APIs is likely adequate to meet forecasted demand for 2015. The increased use of ART for paediatric treatment, for second-line and salvage therapy, and the introduction of new APIs and combinations are important factors for the future of treatment in LMICs. The introduction of new fixed-dose combinations for ART and use of new drug delivery technologies could plausibly provide robust, durable ART for all patients in need, at an overall cost that is only moderately higher than what is presently being spent. PMID:25310430
Cinfony – combining Open Source cheminformatics toolkits behind a common interface
O'Boyle, Noel M; Hutchison, Geoffrey R
2008-01-01
Background Open Source cheminformatics toolkits such as OpenBabel, the CDK and the RDKit share the same core functionality but support different sets of file formats and forcefields, and calculate different fingerprints and descriptors. Despite their complementary features, using these toolkits in the same program is difficult as they are implemented in different languages (C++ versus Java), have different underlying chemical models and have different application programming interfaces (APIs). Results We describe Cinfony, a Python module that presents a common interface to all three of these toolkits, allowing the user to easily combine methods and results from any of the toolkits. In general, the run time of the Cinfony modules is almost as fast as accessing the underlying toolkits directly from C++ or Java, but Cinfony makes it much easier to carry out common tasks in cheminformatics such as reading file formats and calculating descriptors. Conclusion By providing a simplified interface and improving interoperability, Cinfony makes it easy to combine complementary features of OpenBabel, the CDK and the RDKit. PMID:19055766
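The common-interface idea reads roughly as in the sketch below, written against Cinfony's documented readstring()/Molecule() pattern with wrapper modules for the underlying toolkits; exact module and attribute names may differ by Cinfony version and this is an assumption, not a verified excerpt of its API.

```python
# Hedged sketch of cross-toolkit use via a common interface, assuming
# Cinfony-style wrapper modules and methods (names may differ by version).
from cinfony import pybel, rdk

mol = pybel.readstring("smi", "CC(=O)Oc1ccccc1C(=O)O")   # aspirin, via the OpenBabel wrapper
print("MW (OpenBabel):", mol.molwt)

rdkit_mol = rdk.Molecule(mol)        # hand the same molecule to the RDKit wrapper
print("RDKit fingerprint:", rdkit_mol.calcfp())
```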
Training in pathology informatics: implementation at the University of Pittsburgh.
Harrison, James H; Stewart, Jimmie
2003-08-01
Pathology informatics is generally recognized as an important component of pathology training, but the scope, form, and goals of informatics training vary substantially between pathology residency programs. The Training and Education Committee of the Association for Pathology Informatics (API TEC) has developed a standard set of knowledge and skills objectives that are recommended for inclusion in pathology informatics training and may serve to standardize and formalize training programs in this area. The University of Pittsburgh (Pittsburgh, Pa) core rotation in pathology informatics includes most of these goals and is offered as an implementation model for pathology informatics training. The core rotation in pathology informatics is a 3-week, full-time rotation including didactic sessions and hands-on laboratories. Topics include general desktop computing and the Internet, but the primary focus of the rotation is vocabulary and concepts related to enterprise and pathology information systems, pathology practice, and research. The total contact time is 63 hours, and a total of 19 faculty and staff contribute. Pretests and posttests are given at the start and end of the rotation. Performance and course evaluation data were collected for 3 years (a total of 21 residents). The rotation implements 84% of the knowledge objectives and 94% of the skills objectives recommended by the API TEC. Residents scored an average of about 20% on the pretest and about 70% on the posttest for an average increase during the course of 50%. Posttest scores did not correlate with pretest scores or self-assessed computer skill level. The size of the pretest/posttest difference correlated negatively with the pretest scores and self-assessed computing skill level. Pretest scores were generally low regardless of whether residents were familiar with desktop computing and productivity applications, indicating that even residents who are computer "savvy" have limited knowledge of pathology informatics topics. Posttest scores showed that all residents' knowledge increased substantially during the course and that residents who were computing novices were not disadvantaged. In fact, novices tended to have higher pretest/posttest differences, indicating that the rotation effectively supported initially less knowledgeable residents in "catching up" to their peers and achieving an appropriate competency level. This rotation provides a formal training model that implements the API TEC recommendations with demonstrated success.
A modern Python interface for the Generic Mapping Tools
NASA Astrophysics Data System (ADS)
Uieda, L.; Wessel, P.
2017-12-01
Figures generated by The Generic Mapping Tools (GMT) are present in countless publications across the Earth sciences. The command-line interface of GMT lends the tool its flexibility but also creates a barrier to entry for beginners. Meanwhile, adoption of the Python programming language has grown across the scientific community. This growth is largely due to the simplicity and low barrier to entry of the language and its ecosystem of tools. Thus, it is not surprising that there have been at least three attempts to create Python interfaces for GMT: gmtpy (github.com/emolch/gmtpy), pygmt (github.com/ian-r-rose/pygmt), and PyGMT (github.com/glimmer-cism/PyGMT). None of these projects are currently active and, with the exception of pygmt, they do not use the GMT Application Programming Interface (API) introduced in GMT 5. The two main Python libraries for plotting data on maps are the matplotlib Basemap toolkit (matplotlib.org/basemap) and Cartopy (scitools.org.uk/cartopy), both of which rely on matplotlib (matplotlib.org) as the backend for generating the figures. Basemap is known to have limitations and is being discontinued. Cartopy is an improvement over Basemap but is still bound by the speed and memory constraints of matplotlib. We present a new Python interface for GMT (GMT/Python) that makes use of the GMT API and of new features being developed for the upcoming GMT 6 release. The GMT/Python library is designed according to the norms and styles of the Python community. The library integrates with the scientific Python ecosystem by using the "virtual files" from the GMT API to implement input and output of Python data types (numpy "ndarray" for tabular data and xarray "Dataset" for grids). Other features include an object-oriented interface for creating figures, the ability to display figures in the Jupyter notebook, and descriptive aliases for GMT arguments (e.g., "region" instead of "R" and "projection" instead of "J"). GMT/Python can also serve as a backend for developing new high-level interfaces, which can help make GMT more accessible to beginners and more intuitive for Python users. GMT/Python is an open-source project hosted on Github (github.com/GenericMappingTools/gmt-python) and is in the early stages of development. A first release will accompany the release of GMT 6, which is expected for early 2018.
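The descriptive-alias style mentioned above looks roughly like the sketch below, written against the interface that later shipped as PyGMT ("region" and "projection" in place of GMT's -R and -J); details may differ from the early gmt-python releases discussed in the abstract.

```python
# Minimal map in the object-oriented, alias-based style described above,
# using PyGMT (the successor of the gmt-python project).
import pygmt

fig = pygmt.Figure()
fig.coast(region=[-180, 180, -70, 70],   # -R: map bounds (west, east, south, north)
          projection="M20c",             # -J: Mercator projection, 20 cm wide
          land="gray", water="skyblue",
          frame=True)                    # -B: default frame and ticks
fig.savefig("world.png")
```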
Updates to the NASA Space Telecommunications Radio System (STRS) Architecture
NASA Technical Reports Server (NTRS)
Kacpura, Thomas J.; Handler, Louis M.; Briones, Janette; Hall, Charles S.
2008-01-01
This paper describes an update of the Space Telecommunications Radio System (STRS) open architecture for NASA space based radios. The STRS architecture has been defined as a framework for the design, development, operation and upgrade of space based software defined radios, where processing resources are constrained. The architecture has been updated based upon reviews by NASA missions, radio providers, and component vendors. The STRS Standard prescribes the architectural relationship between the software elements used in software execution and defines the Application Programmer Interface (API) between the operating environment and the waveform application. Modeling tools have been adopted to present the architecture. The paper will present a description of the updated API, configuration files, and constraints. Minimum compliance is discussed for early implementations. The paper then closes with a summary of the changes made and discussion of the relevant alignment with the Object Management Group (OMG) SWRadio specification, and enhancements to the specialized signal processing abstraction.
Neuhaus, Philipp; Doods, Justin; Dugas, Martin
2015-01-01
Automatic coding of medical terms is an important, but highly complicated and laborious task. To compare and evaluate different strategies a framework with a standardized web-interface was created. Two UMLS mapping strategies are compared to demonstrate the interface. The framework is a Java Spring application running on a Tomcat application server. It accepts different parameters and returns results in JSON format. To demonstrate the framework, a list of medical data items was mapped by two different methods: similarity search in a large table of terminology codes versus search in a manually curated repository. These mappings were reviewed by a specialist. The evaluation shows that the framework is flexible (due to standardized interfaces like HTTP and JSON), performant and reliable. Accuracy of automatically assigned codes is limited (up to 40%). Combining different semantic mappers into a standardized Web-API is feasible. This framework can be easily enhanced due to its modular design.
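A client of such a standardized HTTP/JSON mapping service might look like the sketch below; the URL, request fields, and response layout are invented for the example and are not the authors' actual endpoints.

```python
# Hypothetical client call illustrating a standardized HTTP/JSON terminology
# mapping interface. Endpoint, parameters and response layout are placeholders.
import requests

resp = requests.post(
    "https://terminology.example.org/map",        # placeholder endpoint
    json={"items": ["body temperature", "heart rate"],
          "strategy": "similarity"},               # e.g. table search vs. curated repository
    timeout=30,
)
for item, codes in resp.json().items():            # assumed response layout
    print(item, "->", codes)
```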
Fifth NASA Goddard Conference on Mass Storage Systems and Technologies, Volume 1
NASA Technical Reports Server (NTRS)
Kobler, Benjamin (Editor); Hariharan, P. C. (Editor)
1996-01-01
This document contains copies of those technical papers received in time for publication prior to the Fifth Goddard Conference on Mass Storage Systems and Technologies. As one of an ongoing series, this conference continues to serve as a unique medium for the exchange of information on topics relating to the ingestion and management of substantial amounts of data and the attendant problems involved. This year's discussion topics include storage architecture, database management, data distribution, file system performance and modeling, and optical recording technology. There will also be a paper on Application Programming Interfaces (API) for a Physical Volume Repository (PVR) defined in Version 5 of the Institute of Electrical and Electronics Engineers (IEEE) Reference Model (RM). In addition, there are papers on specific archives and storage products.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ball, G.; Kuznetsov, V.; Evans, D.
We present the Data Aggregation System, a system for information retrieval and aggregation from heterogeneous sources of relational and non-relational data for the Compact Muon Solenoid experiment on the CERN Large Hadron Collider. The experiment currently has a number of organically-developed data sources, including front-ends to a number of different relational databases and non-database data services which do not share common data structures or APIs (Application Programming Interfaces), and cannot at this stage be readily converged. DAS provides a single interface for querying all these services, a caching layer to speed up access to expensive underlying calls and the ability to merge records from different data services pertaining to a single primary key.
Development of Cross Section Library and Application Programming Interface (API)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, C. H.; Marin-Lafleche, A.; Smith, M. A.
2014-04-09
The goal of NEAMS neutronics is to develop a high-fidelity deterministic neutron transport code termed PROTEUS for use on all reactor types of interest, but focused primarily on sodium-cooled fast reactors. While PROTEUS-SN has demonstrated good accuracy for homogeneous fast reactor problems and partially heterogeneous fast reactor problems, the simulation results were not satisfactory when applied to fully heterogeneous thermal problems like the Advanced Test Reactor (ATR). This is mainly attributed to the quality of cross section data for heterogeneous geometries, since the conventional cross section generation approach does not work accurately for such irregular and complex geometries. Therefore, one of the NEAMS neutronics tasks since FY12 has been the development of a procedure to generate appropriate cross sections for a heterogeneous geometry core.
NASA Astrophysics Data System (ADS)
Buck, J. J. H.; Phillips, A.; Lorenzo, A.; Kokkinaki, A.; Hearn, M.; Gardner, T.; Thorne, K.
2017-12-01
The National Oceanography Centre (NOC) operates a fleet of approximately 36 autonomous marine platforms including submarine gliders, autonomous underwater vehicles, and autonomous surface vehicles. Each platform effectively has the capability to observe the ocean and collect data akin to a small research vessel. This is creating a growth in data volumes and complexity while the amount of resource available to manage data remains static. The OceanIds Command and Control (C2) project aims to solve these issues by fully automating the data archival, processing and dissemination. The data architecture being implemented jointly by NOC and the Scottish Association for Marine Science (SAMS) includes a single Application Programming Interface (API) gateway to handle authentication, forwarding and delivery of both metadata and data. Technicians and principal investigators will enter expedition data prior to deployment of vehicles, enabling automated data processing when vehicles are deployed. The system will support automated metadata acquisition from platforms as this technology moves towards operational implementation. The metadata exposure to the web builds on a prototype developed by the European Commission-supported SenseOCEAN project and is via open standards including World Wide Web Consortium (W3C) RDF/XML and the use of the Semantic Sensor Network ontology and Open Geospatial Consortium (OGC) SensorML standard. Data will be delivered in the marine domain Everyone's Glider Observatory (EGO) format and OGC Observations and Measurements. Additional formats will be served by implementation of endpoints such as the NOAA ERDDAP tool. This standardised data delivery via the API gateway enables timely near-real-time data to be served to Oceanids users, BODC users, operational users and big data systems. The use of open standards will also enable web interfaces to be rapidly built on the API gateway and delivery to European research infrastructures that include aligned reference models for data infrastructure.
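Serving data through an ERDDAP endpoint, as mentioned above, means downstream users can pull near-real-time observations with a simple URL query; the sketch below shows that pattern with a placeholder server, dataset ID, and variable names, following ERDDAP's documented tabledap URL grammar.

```python
# Hedged sketch of pulling glider-style data from an ERDDAP tabledap endpoint.
# Server URL, dataset ID, variable names and time constraint are placeholders.
import io
import requests
import pandas as pd

url = ("https://erddap.example.org/erddap/tabledap/glider_demo.csv"  # placeholder
       "?time,latitude,longitude,temperature"
       "&time>=2017-06-01T00:00:00Z")
resp = requests.get(url, timeout=60)
df = pd.read_csv(io.StringIO(resp.text), skiprows=[1])  # second row holds units in ERDDAP CSV
print(df.head())
```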
NASA Astrophysics Data System (ADS)
Vines, Aleksander; Hansen, Morten W.; Korosov, Anton
2017-04-01
Existing international and Norwegian infrastructure projects, e.g., NorDataNet, NMDC, and NORMAP, provide open data access through the OPeNDAP protocol following the Climate and Forecast (CF) metadata conventions, designed to promote the processing and sharing of files created with the NetCDF application programming interface (API). This approach is now also being implemented in the Norwegian Sentinel Data Hub (satellittdata.no) to provide satellite EO data to the user community. Simultaneously with providing simplified and unified data access, these projects also seek to use and establish common standards for use and discovery metadata. This in turn allows development of standardized tools for data search and (subset) streaming over the internet to perform actual scientific analysis. A combination of software tools, which we call a Scientific Platform as a Service (SPaaS), will take advantage of these opportunities to harmonize and streamline the search, retrieval, and analysis of integrated satellite and auxiliary observations of the oceans in a seamless system. The SPaaS is a cloud solution for integration of analysis tools with scientific datasets via an API. The core part of the SPaaS is a distributed metadata catalog that stores granular metadata describing the structure, location, and content of available satellite, model, and in situ datasets. The analysis tools include software for visualization (also online), interactive in-depth analysis, and server-based processing chains. The API conveys search requests between system nodes (i.e., interactive and server tools) and provides easy access to the metadata catalog, data repositories, and the tools. The SPaaS components are integrated in virtual machines, whose provisioning and deployment are automated using existing state-of-the-art open-source tools (e.g., Vagrant, Ansible, Docker). The open-source code for scientific tools and virtual machine configurations is under version control at https://github.com/nansencenter/ and is coupled to an online continuous integration system (e.g., Travis CI).
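The role of the SPaaS API as a search front-end to the distributed metadata catalog can be illustrated with a small sketch. The endpoint URL, query parameters, and response fields below are assumptions made for illustration and do not correspond to a published Nansen Center API.

```python
import requests

# Illustrative only: a hypothetical metadata-catalogue search call of the kind the SPaaS
# API is described as providing. Endpoint, parameter names, and response fields are assumed.

CATALOG = "https://spaas.example.org/api/search"

params = {
    "instrument": "SAR",                  # hypothetical search facet
    "bbox": "0,70,30,80",                 # lon/lat bounding box
    "start": "2017-01-01",
    "end": "2017-02-01",
}

resp = requests.get(CATALOG, params=params, timeout=30)
resp.raise_for_status()
for granule in resp.json().get("results", []):
    # Each record is assumed to expose the dataset location for streaming access.
    print(granule.get("title"), granule.get("opendap_url"))
```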
Using Mobile App Development Tools to Build a GIS Application
NASA Astrophysics Data System (ADS)
Mital, A.; Catchen, M.; Mital, K.
2014-12-01
Our group designed and built working web, Android, and iOS applications using different mapping libraries as bases on which to overlay fire data from NASA. The group originally planned to make app versions for Google Maps, Leaflet, and OpenLayers. However, because the Leaflet library did not load properly on Android, the group focused its efforts on the other two mapping libraries. For Google Maps, the group first designed a UI for the web app and made a working version of the app. After updating the source of fire data to one which also provided historical fire data, the design had to be modified to include the extra data. After completing a working version of the web app, the group used WebView on Android, a built-in component that allowed porting the web app to Android without rewriting the code for Android. Upon completing this, the group found that Apple iOS devices had a similar capability, and so decided to add an iOS app to the project using a function similar to WebView. Alongside this effort, the group began implementing an OpenLayers fire map using a simpler UI. This web app was completed fairly quickly relative to the Google Maps version; however, it did not include functionality such as satellite imagery or searchable locations. The group finished the project with a working Android version of the Google Maps based app supporting API levels 14-19 and an OpenLayers based app supporting API levels 8-19, as well as a Google Maps based iOS app supporting both old and new screen formats. This project was implemented by high school and college students under an SGT Inc. STEM internship program.
Satellite Estimation of Fractional Cover in Several California Specialty Crops
NASA Technical Reports Server (NTRS)
Johnson, Lee; Cahn, Michael; Rosevelt, Carolyn; Guzman, Alberto; Farrara, Barry; Melton, Forrest S.
2016-01-01
Past research in California and elsewhere has revealed strong relationships between satellite NDVI, photosynthetically active vegetation fraction (Fc), and crop evapotranspiration (ETc). Estimation of ETc can support efficiency of irrigation practice, which enhances water security and may mitigate nitrate leaching. The U.C. Cooperative Extension previously developed the CropManage (CM) web application for evaluation of crop water requirement and irrigation scheduling for several high-value specialty crops. CM currently uses empirical equations to predict daily Fc as a function of crop type, planting date and expected harvest date. The Fc prediction is transformed to fraction of reference ET and combined with reference data from the California Irrigation Management Information System to estimate daily ETc. In the current study, atmospherically-corrected Landsat NDVI data were compared with in-situ Fc estimates on several crops in the Salinas Valley during 2011-2014. The satellite data were observed on day of ground collection or were linearly interpolated across no more than an 8-day revisit period. Results will be presented for lettuce, spinach, celery, broccoli, cauliflower, cabbage, peppers, and strawberry. An application programming interface (API) allows CM and other clients to automatically retrieve NDVI and associated data from NASA's Satellite Irrigation Management Support (SIMS) web service. The SIMS API allows for queries both by individual points or user-defined polygons, and provides data for individual days or annual timeseries. Updates to the CM web app will convert these NDVI data to Fc on a crop-specific basis. The satellite observations are expected to play a support role in Salinas Valley, and may eventually serve as a primary data source as CM is extended to crop systems or regions where Fc is less predictable.
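The point-query capability of the SIMS API described above might be exercised along the following lines; the endpoint URL, parameter names, and response structure are hypothetical, and the NDVI-to-Fc coefficients shown are a commonly used linear form included only for illustration.

```python
import requests

# Sketch of a point query against an NDVI web service of the kind described above.
# The URL and parameter names are placeholders, not the documented SIMS API.

ENDPOINT = "https://sims.example.nasa.gov/api/ndvi"   # hypothetical endpoint

params = {
    "lon": -121.65, "lat": 36.60,     # a point in the Salinas Valley
    "start": "2014-01-01",
    "end": "2014-12-31",
}
resp = requests.get(ENDPOINT, params=params, timeout=30)
resp.raise_for_status()

for obs in resp.json().get("timeseries", []):
    ndvi = obs["ndvi"]
    # An empirical linear transform of this general form is commonly used to map
    # NDVI to fractional cover; the coefficients below are illustrative only.
    fc = min(max(1.26 * ndvi - 0.18, 0.0), 1.0)
    print(obs["date"], ndvi, round(fc, 2))
```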
Satellite Estimation of Fractional Cover in Several California Specialty Crops
NASA Astrophysics Data System (ADS)
Johnson, L.; Cahn, M.; Rosevelt, C.; Guzman, A.; Lockhart, T.; Farrara, B.; Melton, F. S.
2016-12-01
Past research in California and elsewhere has revealed strong relationships between satellite NDVI, photosynthetically active vegetation fraction (Fc), and crop evapotranspiration (ETc). Estimation of ETc can support efficiency of irrigation practice, which enhances water security and may mitigate nitrate leaching. The U.C. Cooperative Extension previously developed the CropManage (CM) web application for evaluation of crop water requirement and irrigation scheduling for several high-value specialty crops. CM currently uses empirical equations to predict daily Fc as a function of crop type, planting date and expected harvest date. The Fc prediction is transformed to fraction of reference ET and combined with reference data from the California Irrigation Management Information System to estimate daily ETc. In the current study, atmospherically-corrected Landsat NDVI data were compared with in-situ Fc estimates on several crops in the Salinas Valley during 2011-2014. The satellite data were observed on day of ground collection or were linearly interpolated across no more than an 8-day revisit period. Results will be presented for lettuce, spinach, celery, broccoli, cauliflower, cabbage, peppers, and strawberry. An application programming interface (API) allows CM and other clients to automatically retrieve NDVI and associated data from NASA's Satellite Irrigation Management Support (SIMS) web service. The SIMS API allows for queries both by individual points or user-defined polygons, and provides data for individual days or annual timeseries. Updates to the CM web app will convert these NDVI data to Fc on a crop-specific basis. The satellite observations are expected to play a support role in Salinas Valley, and may eventually serve as a primary data source as CM is extended to crop systems or regions where Fc is less predictable.
USDA-ARS?s Scientific Manuscript database
Few studies of honey bee colonies exist where varroa mite control is achieved by integrating broodless conditions, through either total brood removal or queen caging, in combination with oxalic acid (OA) applications. We observed significant varroa mortality after applications of OA in obtaining bro...
Current Standardization and Cooperative Efforts Related to Industrial Information Infrastructures.
1993-05-01
Data Management Systems: components used to store, manage, and retrieve data. Data management includes knowledge bases, database management... Application Development Tools and Methods; X/Open and POSIX APIs; Integrated Design Support System (IDS); Knowledge-Based Systems (KBS); Application... (IDEF1x); Yourdon; Jackson System Design (JSD); Knowledge-Based Systems (KBSs); Structured Systems Development (SSD); Semantic Unification Meta-Model
System and Network Security Acronyms and Abbreviations
2009-09-01
hazards of electromagnetic radiation to fuel; HERO hazards of electromagnetic radiation to ordnance; HERP hazards of electromagnetic ... authentication and key management; ALG application layer gateway; ANSI American National Standards Institute; AP access point; API application
Rapid Deployment of a RESTful Service for Oceanographic Research Cruises
NASA Astrophysics Data System (ADS)
Fu, Linyun; Arko, Robert; Leadbetter, Adam
2014-05-01
The Ocean Data Interoperability Platform (ODIP) seeks to increase data sharing across scientific domains and international boundaries by providing a forum to harmonize diverse regional data systems. ODIP participants from the US include the Rolling Deck to Repository (R2R) program, whose mission is to capture, catalog, and describe the underway/environmental sensor data from US oceanographic research vessels and submit the data to public long-term archives. R2R publishes information online as Linked Open Data, making it widely available using Semantic Web standards. Each vessel, sensor, cruise, dataset, person, organization, funding award, log, report, etc., has a Uniform Resource Identifier (URI). Complex queries that federate results from other data providers are supported using the SPARQL query language. To facilitate interoperability, R2R uses controlled vocabularies developed collaboratively by the science community (e.g., SeaDataNet device categories) and published online by the NERC Vocabulary Server (NVS). In response to user feedback, we are developing a standard application programming interface (API) and Web portal for R2R's Linked Open Data. The API provides a set of simple REST-type URLs that are translated on the fly into SPARQL queries, and supports common output formats (e.g., JSON). We will demonstrate an implementation based on the Epimorphics Linked Data API (ELDA) open-source Java package. Our experience shows that constructing a simple portal with limited schema elements in this way can significantly reduce development time and maintenance complexity.
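The two access paths described above (simple REST-style URLs translated to SPARQL, and direct SPARQL for more complex federated queries) can be sketched as follows. The host names, paths, vocabulary URIs, and parameter names are illustrative placeholders rather than the actual R2R or ELDA configuration.

```python
import requests

# Two equivalent ways to reach Linked Open Data of the kind described above. The host
# names, paths, vocabulary URIs, and parameter names are illustrative placeholders.

# 1) Simple REST-style call: the Linked Data API layer translates this into SPARQL.
rest_url = "https://linkeddata.example.org/api/cruises"
cruises = requests.get(rest_url,
                       params={"vessel": "Example_Vessel", "_format": "json"},
                       timeout=30).json()

# 2) Direct SPARQL, for queries the simple API does not cover.
sparql_endpoint = "https://linkeddata.example.org/sparql"
query = """
PREFIX dcterms: <http://purl.org/dc/terms/>
SELECT ?cruise ?title WHERE {
  ?cruise a <http://example.org/vocab/Cruise> ;
          dcterms:title ?title .
} LIMIT 10
"""
rows = requests.get(sparql_endpoint,
                    params={"query": query,
                            "format": "application/sparql-results+json"},
                    timeout=30).json()
for b in rows["results"]["bindings"]:
    print(b["cruise"]["value"], b["title"]["value"])
```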
ShareSync: A Solution for Deterministic Data Sharing over Ethernet
NASA Technical Reports Server (NTRS)
Dunn, Daniel J., II; Koons, William A.; Kennedy, Richard D.; Davis, Philip A.
2007-01-01
As part of upgrading the Contact Dynamics Simulation Laboratory (CDSL) at the NASA Marshall Space Flight Center (MSFC), a simple, cost effective method was needed to communicate data among the networked simulation machines and I/O controllers used to run the facility. To fill this need and similar applicable situations, a generic protocol was developed, called ShareSync. ShareSync is a lightweight, real-time, publish-subscribe Ethernet protocol for simple and deterministic data sharing across diverse machines and operating systems. ShareSync provides a simple Application Programming Interface (API) for simulation programmers to incorporate into their code. The protocol is compatible with virtually all Ethernet-capable machines, is flexible enough to support a variety of applications, is fast enough to provide soft real-time determinism, and is a low-cost resource for distributed simulation development, deployment, and maintenance. The first design cycle iteration of ShareSync has been completed, and the protocol has undergone several testing procedures including endurance and benchmarking tests and approaches the 2001ts data synchronization design goal for the CDSL.
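To illustrate the general publish-subscribe pattern that a lightweight data-sharing protocol of this kind follows, here is a minimal UDP-multicast sketch in Python. It is emphatically not the ShareSync API or wire format; the group address, port, and 24-byte message layout are arbitrary choices made for the example.

```python
import socket
import struct

# Minimal publish/subscribe sketch over UDP multicast, illustrating the general pattern
# of a lightweight data-sharing protocol. This is NOT the ShareSync API or wire format;
# group address, port, and message layout are arbitrary choices for the example.

GROUP, PORT = "239.1.2.3", 5007


def publish(name: str, value: float) -> None:
    """Send one named floating-point sample to all subscribers on the group."""
    payload = name.encode("utf-8").ljust(16, b"\0") + struct.pack("!d", value)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)
        sock.sendto(payload, (GROUP, PORT))


def subscribe():
    """Block on the multicast group and yield (name, value) pairs as they arrive."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))
    mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    while True:
        data, _ = sock.recvfrom(24)
        name = data[:16].rstrip(b"\0").decode("utf-8")
        (value,) = struct.unpack("!d", data[16:24])
        yield name, value
```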
IgE-Api m 4 Is Useful for Identifying a Particular Phenotype of Bee Venom Allergy.
Ruiz, B; Serrano, P; Moreno, C
Different clinical behaviors have been identified in patients allergic to bee venom. Component-resolved diagnosis could be an appropriate tool for investigating these differences. The aims of this study were to analyze whether specific IgE to Api m 4 (sIgE-Api m 4) can identify a particular kind of bee venom allergy and to describe the response to bee venom immunotherapy (bVIT). This was a prospective study of 31 patients allergic to bee venom who were assigned to phenotype group A (sIgE-Api m 4 <0.98 kU/L), treated with native aqueous (NA) extract, or phenotype group B (sIgE-Api m 4 ≥0.98 kU/L), treated with purified aqueous (PA) extract. Sex, age, cardiovascular risk, severity of preceding sting reaction, exposure to beekeeping, and immunological data (intradermal test, sIgE/sIgG4-Apis, sIgE-nApi m 1, sIgE-rApi m 2, and sIgE-Api m 4) were analyzed. Systemic reactions (SRs) during bVIT build-up were analyzed. Immunological and sting challenge outcomes were evaluated in each group after 1 and 2 years of bVIT. Phenotype B patients had more severe reactions (P=.049) and higher skin sensitivity (P=.011), baseline sIgE-Apis (P=.0004), sIgE-nApi m 1 (P=.0004), and sIgG4-Apis (P=.027) than phenotype A patients. Furthermore, 41% of patients in group B experienced SRs during the build-up phase with NA; the sting challenge success rate in this group was 82%. There were no significant reductions in serial intradermal test results, but an intense reduction in sIgE-nApi m 1 (P=.013) and sIgE-Api m 4 (P=.004) was observed after the first year of bVIT. Use of IgE-Api m 4 as the only discrimination criterion demonstrated differences in bee venom allergy. Further investigation with larger populations is necessary.
Gepp, Barbara; Lengger, Nina; Bublin, Merima; Hemmer, Wolfgang; Breiteneder, Heimo; Radauer, Christian
2014-01-01
Background Characterization of IgE-binding epitopes of allergens and determination of their patient-specific relevance is crucial for the diagnosis and treatment of allergy. Objective We sought to assess the contribution of specific surface areas of the major birch pollen allergen Bet v 1.0101 to binding IgE of individual patients. Methods Four distinct areas of Bet v 1 representing in total 81% of its surface were grafted onto the scaffold of its homolog, Api g 1.0101, to yield the chimeras Api-Bet-1 to Api-Bet-4. The chimeras were expressed in Escherichia coli and purified. IgE binding of 64 sera from Bet v 1–sensitized subjects with birch pollen allergy was determined by using direct ELISA. Specificity was assessed by means of inhibition ELISA. Results rApi g 1.0101, Api-Bet-1, Api-Bet-2, Api-Bet-3, and Api-Bet-4 bound IgE from 44%, 89%, 80%, 78%, and 48% of the patients, respectively. By comparing the amount of IgE binding to the chimeras and to rApi g 1.0101, 81%, 70%, 75%, and 45% of the patients showed significantly enhanced IgE binding to Api-Bet-1, Api-Bet-2, Api-Bet-3, and Api-Bet-4, respectively. The minority (8%) of the sera revealed enhanced IgE binding exclusively to a single chimera, whereas 31% showed increased IgE binding to all 4 chimeras compared with rApi g 1.0101. The chimeras inhibited up to 70% of IgE binding to rBet v 1.0101, confirming the specific IgE recognition of the grafted regions. Conclusion The Bet v 1–specific IgE response is polyclonal, and epitopes are spread across the entire Bet v 1 surface. Furthermore, the IgE recognition profile of Bet v 1 is highly patient specific. PMID:24529686
Frick, Marcel; Fischer, Jörg; Helbling, Arthur; Ruëff, Franziska; Wieczorek, Dorothea; Ollert, Markus; Pfützner, Wolfgang; Müller, Sabine; Huss-Marp, Johannes; Dorn, Britta; Biedermann, Tilo; Lidholm, Jonas; Ruecker, Gerta; Bantleon, Frank; Miehe, Michaela; Spillner, Edzard; Jakob, Thilo
2016-12-01
Component resolution recently identified distinct sensitization profiles in honey bee venom (HBV) allergy, some of which were dominated by specific IgE to Api m 3 and/or Api m 10, which have been reported to be underrepresented in therapeutic HBV preparations. We performed a retrospective analysis of component-resolved sensitization profiles in HBV-allergic patients and association with treatment outcome. HBV-allergic patients who had undergone controlled honey bee sting challenge after at least 6 months of HBV immunotherapy (n = 115) were included and classified as responder (n = 79) or treatment failure (n = 36) on the basis of absence or presence of systemic allergic reactions upon sting challenge. IgE reactivity to a panel of HBV allergens was analyzed in sera obtained before immunotherapy and before sting challenge. No differences were observed between responders and nonresponders regarding levels of IgE sensitization to Api m 1, Api m 2, Api m 3, and Api m 5. In contrast, Api m 10 specific IgE was moderately but significantly increased in nonresponders. Predominant Api m 10 sensitization (>50% of specific IgE to HBV) was the best discriminator (specificity, 95%; sensitivity, 25%) with an odds ratio of 8.444 (2.127-33.53; P = .0013) for treatment failure. Some but not all therapeutic HBV preparations displayed a lack of Api m 10, whereas Api m 1 and Api m 3 immunoreactivity was comparable to that of crude HBV. In line with this, significant Api m 10 sIgG 4 induction was observed only in those patients who were treated with HBV in which Api m 10 was detectable. Component-resolved sensitization profiles in HBV allergy suggest predominant IgE sensitization to Api m 10 as a risk factor for treatment failure in HBV immunotherapy. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
Gepp, Barbara; Lengger, Nina; Bublin, Merima; Hemmer, Wolfgang; Breiteneder, Heimo; Radauer, Christian
2014-07-01
Characterization of IgE-binding epitopes of allergens and determination of their patient-specific relevance is crucial for the diagnosis and treatment of allergy. We sought to assess the contribution of specific surface areas of the major birch pollen allergen Bet v 1.0101 to binding IgE of individual patients. Four distinct areas of Bet v 1 representing in total 81% of its surface were grafted onto the scaffold of its homolog, Api g 1.0101, to yield the chimeras Api-Bet-1 to Api-Bet-4. The chimeras were expressed in Escherichia coli and purified. IgE binding of 64 sera from Bet v 1-sensitized subjects with birch pollen allergy was determined by using direct ELISA. Specificity was assessed by means of inhibition ELISA. rApi g 1.0101, Api-Bet-1, Api-Bet-2, Api-Bet-3, and Api-Bet-4 bound IgE from 44%, 89%, 80%, 78%, and 48% of the patients, respectively. By comparing the amount of IgE binding to the chimeras and to rApi g 1.0101, 81%, 70%, 75%, and 45% of the patients showed significantly enhanced IgE binding to Api-Bet-1, Api-Bet-2, Api-Bet-3, and Api-Bet-4, respectively. The minority (8%) of the sera revealed enhanced IgE binding exclusively to a single chimera, whereas 31% showed increased IgE binding to all 4 chimeras compared with rApi g 1.0101. The chimeras inhibited up to 70% of IgE binding to rBet v 1.0101, confirming the specific IgE recognition of the grafted regions. The Bet v 1-specific IgE response is polyclonal, and epitopes are spread across the entire Bet v 1 surface. Furthermore, the IgE recognition profile of Bet v 1 is highly patient specific. Copyright © 2014 The Authors. Published by Mosby, Inc. All rights reserved.
Phillips, Joshua; Chilukuri, Ram; Fragoso, Gilberto; Warzel, Denise; Covitz, Peter A
2006-01-01
Background Robust, programmatically accessible biomedical information services that syntactically and semantically interoperate with other resources are challenging to construct. Such systems require the adoption of common information models, data representations and terminology standards as well as documented application programming interfaces (APIs). The National Cancer Institute (NCI) developed the cancer common ontologic representation environment (caCORE) to provide the infrastructure necessary to achieve interoperability across the systems it develops or sponsors. The caCORE Software Development Kit (SDK) was designed to provide developers both within and outside the NCI with the tools needed to construct such interoperable software systems. Results The caCORE SDK requires a Unified Modeling Language (UML) tool to begin the development workflow with the construction of a domain information model in the form of a UML Class Diagram. Models are annotated with concepts and definitions from a description logic terminology source using the Semantic Connector component. The annotated model is registered in the Cancer Data Standards Repository (caDSR) using the UML Loader component. System software is automatically generated using the Codegen component, which produces middleware that runs on an application server. The caCORE SDK was initially tested and validated using a seven-class UML model, and has been used to generate the caCORE production system, which includes models with dozens of classes. The deployed system supports access through object-oriented APIs with consistent syntax for retrieval of any type of data object across all classes in the original UML model. The caCORE SDK is currently being used by several development teams, including by participants in the cancer biomedical informatics grid (caBIG) program, to create compatible data services. caBIG compatibility standards are based upon caCORE resources, and thus the caCORE SDK has emerged as a key enabling technology for caBIG. Conclusion The caCORE SDK substantially lowers the barrier to implementing systems that are syntactically and semantically interoperable by providing workflow and automation tools that standardize and expedite modeling, development, and deployment. It has gained acceptance among developers in the caBIG program, and is expected to provide a common mechanism for creating data service nodes on the data grid that is under development. PMID:16398930
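The uniform, class-based query interface that the generated caCORE middleware exposes can be suggested with a conceptual sketch. The generated system itself is Java middleware produced from the annotated UML model; the Python fragment below, including the `ApplicationService` facade and `Gene` class, is purely illustrative.

```python
from dataclasses import dataclass
from typing import List

# Conceptual sketch only. The caCORE SDK generates Java middleware from the annotated
# UML model; this fragment merely illustrates the idea of a uniform, class-based query
# API in which every domain class is retrieved through the same call pattern.


@dataclass
class Gene:                      # stands in for one class of the UML domain model
    symbol: str
    taxon: str


class ApplicationService:
    """Hypothetical service facade offering one consistent query method for all classes."""

    def __init__(self, backend):
        self._backend = backend  # e.g. an HTTP or database layer

    def query(self, cls, **criteria) -> List[object]:
        # The same syntax works for any registered domain class.
        return [cls(**row) for row in self._backend.search(cls.__name__, criteria)]


# Usage (given some backend implementing .search(class_name, criteria)):
#   svc = ApplicationService(backend)
#   genes = svc.query(Gene, symbol="TP53")
```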
Extending the XNAT archive tool for image and analysis management in ophthalmology research
NASA Astrophysics Data System (ADS)
Wahle, Andreas; Lee, Kyungmoo; Harding, Adam T.; Garvin, Mona K.; Niemeijer, Meindert; Sonka, Milan; Abràmoff, Michael D.
2013-03-01
In ophthalmology, various modalities and tests are utilized to obtain vital information on the eye's structure and function. For example, optical coherence tomography (OCT) is utilized to diagnose, screen, and aid treatment of eye diseases such as macular degeneration or glaucoma. Such data are complemented by photographic retinal fundus images and functional tests of the visual field. DICOM is not yet widely used in this domain, however, and images are frequently encoded in proprietary formats. The eXtensible Neuroimaging Archive Tool (XNAT) is an open-source, NIH-funded framework for research PACS and is in use at the University of Iowa for neurological research applications. Its use for ophthalmology was hence desirable but posed new challenges due to data types not previously considered and the lack of standardized formats. We developed custom tools for data types not natively recognized by XNAT itself using XNAT's low-level REST API. Vendor-provided tools can be included as necessary to convert proprietary data sets into valid DICOM. Clients can access the data in a standardized format while still retaining the original format if needed by specific analysis tools. With respective project-specific permissions, results like segmentations or quantitative evaluations can be stored as additional resources attached to previously uploaded datasets. Applications can use our abstract-level Python or C/C++ API to communicate with the XNAT instance. This paper describes concepts and details of the designed upload script templates, which can be customized to the needs of specific projects, and the novel client-side communication API which allows integration into new or existing research applications.
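As an example of the kind of client-side interaction described above, the sketch below uses XNAT's REST interface through Python's `requests` library to list sessions and attach an analysis result as a resource. The project, subject, session, and resource labels are placeholders, and the exact REST paths should be checked against the documentation of the deployed XNAT version.

```python
import requests

# Minimal sketch, assuming an XNAT server reachable over HTTPS with basic authentication.
# Project, subject, session, and resource labels are placeholders; REST paths should be
# verified against the documentation for the deployed XNAT version.

XNAT = "https://xnat.example.edu"

session = requests.Session()
session.auth = ("username", "password")

# Query existing sessions for a subject (JSON listing).
r = session.get(f"{XNAT}/data/projects/OCT_STUDY/subjects/SUBJ01/experiments",
                params={"format": "json"}, timeout=60)
r.raise_for_status()

# Attach an analysis result (e.g. a layer segmentation) as a resource file to a session.
with open("macula_segmentation.xml", "rb") as fh:
    r = session.put(
        f"{XNAT}/data/projects/OCT_STUDY/subjects/SUBJ01/experiments/OCT_SCAN_01"
        f"/resources/SEGMENTATION/files/macula_segmentation.xml",
        data=fh, timeout=300)
    r.raise_for_status()
```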
Temporal and spatial behavior of pharmaceuticals in ...
The behavior of active pharmaceutical ingredients (APIs) in urban estuaries is not well understood. In this study, 15 high-volume-usage APIs were measured over a one-year period throughout Narragansett Bay, RI, USA, to determine factors controlling their concentration and distribution. Dissolved APIs ranged in concentration from not detected to 310 ng/L, with numerous APIs present at all sites and sampling periods. Eight APIs were present in suspended particulate material, ranging in concentration from <1 ng/g to 44 ng/g. Partitioning coefficients (Kds) were determined for APIs present in both the dissolved and particulate phases, with their range and variability remaining relatively constant during the study. Organic carbon normalization reduced the observed variability of several APIs to a small extent; however, other factors appear to play a role in controlling partitioning behavior. The continuous discharge of wastewater treatment plant effluents into upper Narragansett Bay resulted in sustained levels of APIs, creating a zone of “pseudo-persistence.” For most of the APIs, there was a strong relationship with salinity, indicating conservative behavior within the estuary. Short flushing times in Narragansett Bay coupled with APIs present primarily in the dissolved phase suggest that most APIs will be diluted and transported out of the estuary, with only small amounts of several compounds removed to and sequestered in sediments. This study ide...
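For readers unfamiliar with the partitioning quantities mentioned above, the following short example shows how a distribution coefficient (Kd) and its organic carbon normalized counterpart (Koc) are computed; the concentrations and organic carbon fraction are invented values, not measurements from this study.

```python
# Worked example of the partitioning quantities discussed above, using made-up
# concentrations (not measurements from this study).

c_particulate_ng_per_g = 20.0    # API concentration on suspended particulate material
c_dissolved_ng_per_L = 100.0     # dissolved API concentration
f_oc = 0.05                      # organic carbon fraction of the particulates

# Distribution coefficient: ratio of particulate to dissolved concentration (L/kg).
kd = (c_particulate_ng_per_g * 1000.0) / c_dissolved_ng_per_L
# Organic carbon normalized coefficient, which removes some sorbent-to-sorbent variability.
koc = kd / f_oc

print(f"Kd  = {kd:.0f} L/kg")
print(f"Koc = {koc:.0f} L/kg")
```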
49 CFR 195.565 - How do I install cathodic protection on breakout tanks?
Code of Federal Regulations, 2011 CFR
2011-10-01
...) capacity built to API Specification 12F, API Standard 620, or API Standard 650 (or its predecessor Standard 12C), you must install the system in accordance with API Recommended Practice 651. However, installation of the system need not comply with API Recommended Practice 651 on any tank for which you note in...
49 CFR 195.579 - What must I do to mitigate internal corrosion?
Code of Federal Regulations, 2012 CFR
2012-10-01
... API Specification 12F, API Standard 620, or API Standard 650 (or its predecessor Standard 12C), you must install the lining in accordance with API Recommended Practice 652. However, installation of the lining need not comply with API Recommended Practice 652 on any tank for which you note in the corrosion...
49 CFR 195.565 - How do I install cathodic protection on breakout tanks?
Code of Federal Regulations, 2014 CFR
2014-10-01
...) capacity built to API Specification 12F, API Standard 620, or API Standard 650 (or its predecessor Standard 12C), you must install the system in accordance with API Recommended Practice 651. However, installation of the system need not comply with API Recommended Practice 651 on any tank for which you note in...
49 CFR 195.565 - How do I install cathodic protection on breakout tanks?
Code of Federal Regulations, 2013 CFR
2013-10-01
...) capacity built to API Specification 12F, API Standard 620, or API Standard 650 (or its predecessor Standard 12C), you must install the system in accordance with API Recommended Practice 651. However, installation of the system need not comply with API Recommended Practice 651 on any tank for which you note in...
49 CFR 195.579 - What must I do to mitigate internal corrosion?
Code of Federal Regulations, 2010 CFR
2010-10-01
... API Specification 12F, API Standard 620, or API Standard 650 (or its predecessor Standard 12C), you must install the lining in accordance with API Recommended Practice 652. However, installation of the lining need not comply with API Recommended Practice 652 on any tank for which you note in the corrosion...
49 CFR 195.579 - What must I do to mitigate internal corrosion?
Code of Federal Regulations, 2013 CFR
2013-10-01
... API Specification 12F, API Standard 620, or API Standard 650 (or its predecessor Standard 12C), you must install the lining in accordance with API Recommended Practice 652. However, installation of the lining need not comply with API Recommended Practice 652 on any tank for which you note in the corrosion...
49 CFR 195.579 - What must I do to mitigate internal corrosion?
Code of Federal Regulations, 2011 CFR
2011-10-01
... API Specification 12F, API Standard 620, or API Standard 650 (or its predecessor Standard 12C), you must install the lining in accordance with API Recommended Practice 652. However, installation of the lining need not comply with API Recommended Practice 652 on any tank for which you note in the corrosion...
49 CFR 195.565 - How do I install cathodic protection on breakout tanks?
Code of Federal Regulations, 2012 CFR
2012-10-01
...) capacity built to API Specification 12F, API Standard 620, or API Standard 650 (or its predecessor Standard 12C), you must install the system in accordance with API Recommended Practice 651. However, installation of the system need not comply with API Recommended Practice 651 on any tank for which you note in...
49 CFR 195.579 - What must I do to mitigate internal corrosion?
Code of Federal Regulations, 2014 CFR
2014-10-01
... API Specification 12F, API Standard 620, or API Standard 650 (or its predecessor Standard 12C), you must install the lining in accordance with API Recommended Practice 652. However, installation of the lining need not comply with API Recommended Practice 652 on any tank for which you note in the corrosion...
49 CFR 195.565 - How do I install cathodic protection on breakout tanks?
Code of Federal Regulations, 2010 CFR
2010-10-01
...) capacity built to API Specification 12F, API Standard 620, or API Standard 650 (or its predecessor Standard 12C), you must install the system in accordance with API Recommended Practice 651. However, installation of the system need not comply with API Recommended Practice 651 on any tank for which you note in...
Programming and Runtime Support to Blaze FPGA Accelerator Deployment at Datacenter Scale
Huang, Muhuan; Wu, Di; Yu, Cody Hao; Fang, Zhenman; Interlandi, Matteo; Condie, Tyson; Cong, Jason
2017-01-01
With the end of CPU core scaling due to dark silicon limitations, customized accelerators on FPGAs have gained increased attention in modern datacenters due to their lower power, high performance, and energy efficiency. Evidenced by Microsoft's FPGA deployment in its Bing search engine and Intel's $16.7 billion acquisition of Altera, integrating FPGAs into datacenters is considered one of the most promising approaches to sustaining future datacenter growth. However, it is quite challenging for existing big data computing systems, such as Apache Spark and Hadoop, to access the performance and energy benefits of FPGA accelerators. In this paper we design and implement Blaze to provide programming and runtime support for enabling easy and efficient deployments of FPGA accelerators in datacenters. In particular, Blaze abstracts FPGA accelerators as a service (FaaS) and provides a set of clean programming APIs for big data processing applications to easily utilize those accelerators. Our Blaze runtime implements an FaaS framework to efficiently share FPGA accelerators among multiple heterogeneous threads on a single node, and extends Hadoop YARN with accelerator-centric scheduling to efficiently share them among multiple computing tasks in the cluster. Experimental results using four representative big data applications demonstrate that Blaze greatly reduces the programming efforts to access FPGA accelerators in systems like Apache Spark and YARN, and improves the system throughput by 1.7× to 3× (and energy efficiency by 1.5× to 2.7×) compared to a conventional CPU-only cluster. PMID:28317049
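The accelerator-as-a-service idea can be illustrated from the application side with a short PySpark sketch. The `FaaSClient` stub and its `invoke` call are hypothetical stand-ins for a client that forwards batches to a node-local accelerator service, not Blaze's published API.

```python
from pyspark import SparkContext

# Conceptual sketch of the accelerator-as-a-service (FaaS) idea described above, written
# against plain PySpark. The FaaSClient stub and its invoke() call are hypothetical
# illustrations, not Blaze's published API.


class FaaSClient:
    """Stand-in for a client that ships a batch of records to a shared FPGA service."""

    def __init__(self, accelerator_id: str):
        self.accelerator_id = accelerator_id

    def invoke(self, batch):
        # A real client would serialize the batch, send it to the node-local service
        # managing the FPGA, and fall back to software if no device is free.
        return [x * x for x in batch]          # software fallback for the sketch


def accelerated_partition(partition):
    client = FaaSClient("square_kernel")       # hypothetical accelerator name
    yield from client.invoke(list(partition))


if __name__ == "__main__":
    sc = SparkContext(appName="faas-sketch")
    data = sc.parallelize(range(1_000_000), numSlices=8)
    result = data.mapPartitions(accelerated_partition)
    print(result.take(5))
    sc.stop()
```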
Programming and Runtime Support to Blaze FPGA Accelerator Deployment at Datacenter Scale.
Huang, Muhuan; Wu, Di; Yu, Cody Hao; Fang, Zhenman; Interlandi, Matteo; Condie, Tyson; Cong, Jason
2016-10-01
With the end of CPU core scaling due to dark silicon limitations, customized accelerators on FPGAs have gained increased attention in modern datacenters due to their lower power, high performance, and energy efficiency. Evidenced by Microsoft's FPGA deployment in its Bing search engine and Intel's $16.7 billion acquisition of Altera, integrating FPGAs into datacenters is considered one of the most promising approaches to sustaining future datacenter growth. However, it is quite challenging for existing big data computing systems, such as Apache Spark and Hadoop, to access the performance and energy benefits of FPGA accelerators. In this paper we design and implement Blaze to provide programming and runtime support for enabling easy and efficient deployments of FPGA accelerators in datacenters. In particular, Blaze abstracts FPGA accelerators as a service (FaaS) and provides a set of clean programming APIs for big data processing applications to easily utilize those accelerators. Our Blaze runtime implements an FaaS framework to efficiently share FPGA accelerators among multiple heterogeneous threads on a single node, and extends Hadoop YARN with accelerator-centric scheduling to efficiently share them among multiple computing tasks in the cluster. Experimental results using four representative big data applications demonstrate that Blaze greatly reduces the programming efforts to access FPGA accelerators in systems like Apache Spark and YARN, and improves the system throughput by 1.7× to 3× (and energy efficiency by 1.5× to 2.7×) compared to a conventional CPU-only cluster.
Cloud-Based Perception and Control of Sensor Nets and Robot Swarms
2016-04-01
... distributed stream processing framework provides the necessary API and infrastructure to develop and execute such applications in a cluster of computation... streaming DDDAS applications based on challenges they present to the backend Cloud control system. [Figure 2: Parallel SLAM Application] ... state-of-the-art deep learning-based object detectors can recognize among hundreds of object classes and this capability would be very useful for mobile...
Structure simulation with calculated NMR parameters - integrating COSMOS into the CCPN framework.
Schneider, Olaf; Fogh, Rasmus H; Sternberg, Ulrich; Klenin, Konstantin; Kondov, Ivan
2012-01-01
The Collaborative Computing Project for NMR (CCPN) has built a software framework consisting of the CCPN data model (with APIs) for NMR-related data, the CcpNmr Analysis program, and additional tools such as CcpNmr FormatConverter. The open architecture allows for the integration of external software to extend the capabilities of the CCPN framework with additional calculation methods. Recently, we have carried out the first steps toward integrating our software Computer Simulation of Molecular Structures (COSMOS) into the CCPN framework. The COSMOS-NMR force field unites quantum chemical routines for the calculation of molecular properties with a molecular mechanics force field yielding the relative molecular energies. COSMOS-NMR allows NMR parameters to be introduced as constraints into molecular mechanics calculations. The resulting infrastructure will be made available to the NMR community. As a first application, we have tested the evaluation of calculated protein structures using COSMOS-derived 13C Cα and Cβ chemical shifts. In this paper we give an overview of the methodology and a roadmap for future developments and applications.
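As a simple illustration of evaluating calculated structures against chemical shift data, the sketch below scores agreement between calculated and experimental 13C Cα/Cβ shifts with an RMSD. The shift values are invented and the RMSD is a generic metric chosen for the example, not the specific COSMOS-NMR scoring term.

```python
import math

# Illustrative only: a simple way to score a candidate structure by comparing calculated
# 13C Calpha/Cbeta chemical shifts against experimental values. The shift values are
# invented and RMSD is a generic choice, not the specific COSMOS-NMR energy term.

calculated = {"ALA2_CA": 52.8, "ALA2_CB": 18.9, "VAL3_CA": 62.1, "VAL3_CB": 32.4}
experimental = {"ALA2_CA": 53.1, "ALA2_CB": 19.2, "VAL3_CA": 61.6, "VAL3_CB": 32.9}


def shift_rmsd(calc, exp):
    """Root-mean-square deviation over the atoms present in both shift sets (ppm)."""
    common = calc.keys() & exp.keys()
    return math.sqrt(sum((calc[k] - exp[k]) ** 2 for k in common) / len(common))


print(f"chemical shift RMSD = {shift_rmsd(calculated, experimental):.2f} ppm")
```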