Sample records for integrated software platform

  1. Framework programmable platform for the advanced software development workstation. Integration mechanism design document

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Reddy, Uday; Ackley, Keith; Futrell, Mike

    1991-01-01

    The Framework Programmable Software Development Platform (FPP) is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software development environment. Guided by this model, this system development framework will take advantage of an integrated operating environment to automate effectively the management of the software development process so that costly mistakes during the development phase can be eliminated.

  2. VA's Integrated Imaging System on three platforms.

    PubMed

    Dayhoff, R E; Maloney, D L; Majurski, W J

    1992-01-01

    The DHCP Integrated Imaging System provides users with integrated patient data including text, image, and graphics data. This system has been transferred from its original two-screen DOS-based MUMPS platform to an X Window workstation and a Microsoft Windows-based workstation. There are differences between these platforms that affect software design and software development strategy. Data structures and conventions were used to isolate hardware, operating system, imaging software, and user-interface differences between platforms in the implementation of functionality for text and image display and interaction. The use of an object-oriented approach greatly increased system portability.

  3. VA's Integrated Imaging System on three platforms.

    PubMed Central

    Dayhoff, R. E.; Maloney, D. L.; Majurski, W. J.

    1992-01-01

    The DHCP Integrated Imaging System provides users with integrated patient data including text, image, and graphics data. This system has been transferred from its original two-screen DOS-based MUMPS platform to an X Window workstation and a Microsoft Windows-based workstation. There are differences between these platforms that affect software design and software development strategy. Data structures and conventions were used to isolate hardware, operating system, imaging software, and user-interface differences between platforms in the implementation of functionality for text and image display and interaction. The use of an object-oriented approach greatly increased system portability. PMID:1482983

  4. Framework Programmable Platform for the Advanced Software Development Workstation: Preliminary system design document

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Ackley, Keith A.; Crump, John W., IV; Henderson, Richard; Futrell, Michael T.

    1991-01-01

    The Framework Programmable Software Development Platform (FPP) is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software environment. Guided by the model, this system development framework will take advantage of an integrated operating environment to automate effectively the management of the software development process so that costly mistakes during the development phase can be eliminated. The focus here is on the design of components that make up the FPP. These components serve as supporting systems for the Integration Mechanism and the Framework Processor and provide the 'glue' that ties the FPP together. Also discussed are the components that allow the platform to operate in a distributed, heterogeneous environment and to manage the development and evolution of software system artifacts.

  5. Framework Programmable Platform for the advanced software development workstation: Framework processor design document

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Ackley, Keith A.; Crump, Wes; Sanders, Les

    1991-01-01

    The design of the Framework Processor (FP) component of the Framework Programmable Software Development Platform (FPP) is described. The FPP is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software development environment. Guided by the model, this Framework Processor will take advantage of an integrated operating environment to provide automated support for the management and control of the software development process so that costly mistakes during the development phase can be eliminated.

  6. Framework Programmable Platform for the Advanced Software Development Workstation (FPP/ASDW). Demonstration framework document. Volume 1: Concepts and activity descriptions

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Blinn, Thomas M.; Dewitte, Paul S.; Crump, John W.; Ackley, Keith A.

    1992-01-01

    The Framework Programmable Software Development Platform (FPP) is a project aimed at effectively combining tool and data integration mechanisms with a model of the software development process to provide an intelligent integrated software development environment. Guided by the model, this system development framework will take advantage of an integrated operating environment to automate effectively the management of the software development process so that costly mistakes during the development phase can be eliminated. The Advanced Software Development Workstation (ASDW) program is conducting research into development of advanced technologies for Computer Aided Software Engineering (CASE).

  7. Integrated Power, Avionics, and Software (iPAS) Space Telecommunications Radio System (STRS) Radio User's Guide -- Advanced Exploration Systems (AES)

    NASA Technical Reports Server (NTRS)

    Roche, Rigoberto; Shalkhauser, Mary Jo Windmille

    2017-01-01

    The Integrated Power, Avionics and Software (iPAS) software defined radio (SDR) was implemented on the Reconfigurable, Intelligently-Adaptive Communication System (RIACS) platform for radio development at NASA Johnson Space Center. Software and hardware description language (HDL) code were delivered by NASA Glenn Research Center for use in the iPAS test bed and for development of their own Space Telecommunications Radio System (STRS) waveforms on the RIACS platform. The purpose of this document is to describe how to set up and operate the iPAS STRS Radio platform with its delivered test waveform.

  8. Towards an Open, Distributed Software Architecture for UxS Operations

    NASA Technical Reports Server (NTRS)

    Cross, Charles D.; Motter, Mark A.; Neilan, James H.; Qualls, Garry D.; Rothhaar, Paul M.; Tran, Loc; Trujillo, Anna C.; Allen, B. Danette

    2015-01-01

    To address the growing need to evaluate, test, and certify an ever expanding ecosystem of UxS platforms in preparation of cultural integration, NASA Langley Research Center's Autonomy Incubator (AI) has taken on the challenge of developing a software framework in which UxS platforms developed by third parties can be integrated into a single system which provides evaluation and testing, mission planning and operation, and out-of-the-box autonomy and data fusion capabilities. This software framework, named AEON (Autonomous Entity Operations Network), has two main goals. The first goal is the development of a cross-platform, extensible, onboard software system that provides autonomy at the mission execution and course-planning level, a highly configurable data fusion framework sensitive to the platform's available sensor hardware, and plug-and-play compatibility with a wide array of computer systems, sensors, software, and controls hardware. The second goal is the development of a ground control system that acts as a test-bed for integration of the proposed heterogeneous fleet, and allows for complex mission planning, tracking, and debugging capabilities. The ground control system should also be highly extensible and allow plug-and-play interoperability with third party software systems. In order to achieve these goals, this paper proposes an open, distributed software architecture which utilizes at its core the Data Distribution Service (DDS) standards, established by the Object Management Group (OMG), for inter-process communication and data flow. The design decisions proposed herein leverage the advantages of existing robotics software architectures and the DDS standards to develop software that is scalable, high-performance, fault tolerant, modular, and readily interoperable with external platforms and software.
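
    The key architectural choice here is DDS-style topic-based publish/subscribe, which decouples data producers from consumers. As a toy illustration only (not DDS, AEON, or OMG API code; the topic and field names are hypothetical), the Python sketch below shows the pattern with an in-process bus.

      # Toy in-process illustration of the topic-based publish/subscribe pattern
      # that DDS provides across processes and machines. This is not DDS or AEON
      # code, and the topic and field names are hypothetical.
      from collections import defaultdict
      from typing import Callable, Dict, List

      class Bus:
          """Minimal topic bus: publishers and subscribers share topics, not references to each other."""
          def __init__(self):
              self._subs: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

          def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
              self._subs[topic].append(handler)

          def publish(self, topic: str, sample: dict) -> None:
              for handler in self._subs[topic]:
                  handler(sample)

      bus = Bus()
      # A ground-control display and a data-fusion module both consume vehicle state
      # without knowing which onboard platform produced it.
      bus.subscribe("vehicle/state", lambda s: print("display:", s))
      bus.subscribe("vehicle/state", lambda s: print("fusion :", s))
      bus.publish("vehicle/state", {"id": "uas_01", "lat": 37.1, "lon": -76.4, "alt_m": 120.0})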

  9. Integrating MPI and deduplication engines: a software architecture roadmap.

    PubMed

    Baksi, Dibyendu

    2009-03-01

    The objective of this paper is to clarify the major concepts related to architecture and design of patient identity management software systems so that an implementor looking to solve a specific integration problem in the context of a Master Patient Index (MPI) and a deduplication engine can address the relevant issues. The ideas presented are illustrated in the context of a reference use case from the Integrating the Healthcare Enterprise Patient Identifier Cross-referencing (IHE PIX) profile. Sound software engineering principles using the latest design paradigm of model driven architecture (MDA) are applied to define different views of the architecture. The main contribution of the paper is a clear software architecture roadmap for implementors of patient identity management systems. Conceptual design in terms of static and dynamic views of the interfaces is provided as an example of a platform-independent model. This makes the roadmap applicable to any specific solutions of MPI, deduplication library or software platform. Stakeholders in need of integration of MPIs and deduplication engines can evaluate vendor specific solutions and software platform technologies in terms of fundamental concepts and can make informed decisions that preserve investment. This also allows freedom from vendor lock-in and the ability to kick-start integration efforts based on a solid architecture.
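
    To make the platform-independent interface view concrete, the Python sketch below separates an MPI cross-referencing service from a pluggable deduplication engine, in the spirit of the PIX use case. It is illustrative only: the class names, methods, and matching rule are hypothetical and are not taken from the paper or any vendor product.

      # Illustrative sketch only: class names, methods, and the matching rule are
      # hypothetical and are not taken from the paper or any vendor product.
      from abc import ABC, abstractmethod
      from dataclasses import dataclass
      from typing import Dict, List, Tuple

      @dataclass(frozen=True)
      class PatientId:
          assigning_authority: str   # e.g. a hospital or registry domain
          value: str                 # local identifier within that domain

      class DeduplicationEngine(ABC):
          """Pluggable record-matching component."""
          @abstractmethod
          def match(self, demographics: Dict[str, str],
                    candidates: List[Tuple[PatientId, Dict[str, str]]]) -> List[PatientId]:
              ...

      class ExactNameDobEngine(DeduplicationEngine):
          """Trivial matcher: exact last name and date of birth."""
          def match(self, demographics, candidates):
              key = (demographics.get("last_name"), demographics.get("dob"))
              return [pid for pid, demo in candidates
                      if (demo.get("last_name"), demo.get("dob")) == key]

      class MasterPatientIndex:
          """Cross-references local identifiers, in the spirit of the PIX use case."""
          def __init__(self, engine: DeduplicationEngine):
              self.engine = engine
              self.records: Dict[PatientId, Dict[str, str]] = {}

          def register(self, pid: PatientId, demographics: Dict[str, str]) -> None:
              self.records[pid] = demographics

          def cross_reference(self, demographics: Dict[str, str]) -> List[PatientId]:
              return self.engine.match(demographics, list(self.records.items()))

      mpi = MasterPatientIndex(ExactNameDobEngine())
      mpi.register(PatientId("HOSPITAL_A", "12345"), {"last_name": "Doe", "dob": "1970-01-01"})
      mpi.register(PatientId("CLINIC_B", "A-77"), {"last_name": "Doe", "dob": "1970-01-01"})
      print(mpi.cross_reference({"last_name": "Doe", "dob": "1970-01-01"}))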

  10. From proteomics to systems biology: MAPA, MASS WESTERN, PROMEX, and COVAIN as a user-oriented platform.

    PubMed

    Weckwerth, Wolfram; Wienkoop, Stefanie; Hoehenwarter, Wolfgang; Egelhofer, Volker; Sun, Xiaoliang

    2014-01-01

    Genome sequencing and systems biology are revolutionizing life sciences. Proteomics emerged as a fundamental technique of this novel research area as it is the basis for gene function analysis and modeling of dynamic protein networks. Here a complete proteomics platform suited for functional genomics and systems biology is presented. The strategy includes MAPA (mass accuracy precursor alignment; http://www.univie.ac.at/mosys/software.html ) as a rapid exploratory analysis step; MASS WESTERN for targeted proteomics; COVAIN ( http://www.univie.ac.at/mosys/software.html ) for multivariate statistical analysis, data integration, and data mining; and PROMEX ( http://www.univie.ac.at/mosys/databases.html ) as a database module for proteogenomics and proteotypic peptides for targeted analysis. Moreover, the presented platform can also be utilized to integrate metabolomics and transcriptomics data for the analysis of metabolite-protein-transcript correlations and time course analysis using COVAIN. Examples for the integration of MAPA and MASS WESTERN data, proteogenomic and metabolic modeling approaches for functional genomics, phosphoproteomics by integration of MOAC (metal-oxide affinity chromatography) with MAPA, and the integration of metabolomics, transcriptomics, proteomics, and physiological data using this platform are presented. All software and step-by-step tutorials for data processing and data mining can be downloaded from http://www.univie.ac.at/mosys/software.html.

  11. Future Directions for Astronomical Image Display

    NASA Technical Reports Server (NTRS)

    Mandel, Eric

    2000-01-01

    In the "Future Directions for Astronomical Image Display" project, the Smithsonian Astrophysical Observatory (SAO) and the National Optical Astronomy Observatories (NOAO) evolved our existing image display program into fully extensible, cross-platform image display software. We also devised messaging software to support integration of image display into astronomical analysis systems. Finally, we migrated our software from reliance on Unix and the X Window System to a platform-independent architecture that utilizes the cross-platform Tcl/Tk technology.

  12. Waveform Developer's Guide for the Integrated Power, Avionics, and Software (iPAS) Space Telecommunications Radio System (STRS) Radio

    NASA Technical Reports Server (NTRS)

    Shalkhauser, Mary Jo W.; Roche, Rigoberto

    2017-01-01

    The Space Telecommunications Radio System (STRS) provides a common, consistent framework for software defined radios (SDRs) to abstract the application software from the radio platform hardware. The STRS standard aims to reduce the cost and risk of using complex, configurable and reprogrammable radio systems across NASA missions. To promote the use of the STRS architecture for future NASA advanced exploration missions, NASA Glenn Research Center (GRC) developed an STRS-compliant SDR on a radio platform used by the Advanced Exploration Systems program at the Johnson Space Center (JSC) in their Integrated Power, Avionics, and Software (iPAS) laboratory. The iPAS STRS Radio was implemented on the Reconfigurable, Intelligently-Adaptive Communication System (RIACS) platform, currently being used for radio development at JSC. The platform consists of a Xilinx ML605 Virtex-6 FPGA board, an Analog Devices FMCOMMS1-EBZ RF transceiver board, and an Embedded PC (Axiomtek eBox 620-110-FL) running the Ubuntu 12.04 operating system. The result of this development is a very low cost STRS-compliant platform that can be used for waveform developments for multiple applications. The purpose of this document is to describe how to develop a new waveform using the RIACS platform and the Very High Speed Integrated Circuits (VHSIC) Hardware Description Language (VHDL) FPGA wrapper code and the STRS implementation on the Axiomtek processor.

  13. CILogon: An Integrated Identity and Access Management Platform for Science

    NASA Astrophysics Data System (ADS)

    Basney, J.

    2016-12-01

    When scientists work together, they use web sites and other software to share their ideas and data. To ensure the integrity of their work, these systems require the scientists to log in and verify that they are part of the team working on a particular science problem. Too often, the identity and access verification process is a stumbling block for the scientists. Scientific research projects are forced to invest time and effort into developing and supporting Identity and Access Management (IAM) services, distracting them from the core goals of their research collaboration. CILogon provides an IAM platform that enables scientists to work together to meet their IAM needs more effectively so they can allocate more time and effort to their core mission of scientific research. The CILogon platform enables federated identity management and collaborative organization management. Federated identity management enables researchers to use their home organization identities to access cyberinfrastructure, rather than requiring yet another username and password to log on. Collaborative organization management enables research projects to define user groups for authorization to collaboration platforms (e.g., wikis, mailing lists, and domain applications). CILogon's IAM platform serves the unique needs of research collaborations, namely the need to dynamically form collaboration groups across organizations and countries, sharing access to data, instruments, compute clusters, and other resources to enable scientific discovery. CILogon provides a software-as-a-service platform to ease integration with cyberinfrastructure, while making all software components publicly available under open source licenses to enable re-use. Figure 1 illustrates the components and interfaces of this platform. CILogon has been operational since 2010 and has been used by over 7,000 researchers from more than 170 identity providers to access cyberinfrastructure including Globus, LIGO, Open Science Grid, SeedMe, and XSEDE. The "CILogon 2.0" platform, launched in 2016, adds support for virtual organization (VO) membership management, identity linking, international collaborations, and standard integration protocols, through integration with the Internet2 COmanage collaboration software.

  14. Hardware Interface Description for the Integrated Power, Avionics, and Software (iPAS) Space Telecommunications Radio System (STRS) Radio

    NASA Technical Reports Server (NTRS)

    Shalkhauser, Mary Jo W.; Roche, Rigoberto

    2017-01-01

    The Space Telecommunications Radio System (STRS) provides a common, consistent framework for software defined radios (SDRs) to abstract the application software from the radio platform hardware. The STRS standard aims to reduce the cost and risk of using complex, configurable and reprogrammable radio systems across NASA missions. To promote the use of the STRS architecture for future NASA advanced exploration missions, NASA Glenn Research Center (GRC) developed an STRS-compliant SDR on a radio platform used by the Advanced Exploration Systems program at the Johnson Space Center (JSC) in their Integrated Power, Avionics, and Software (iPAS) laboratory. The iPAS STRS Radio was implemented on the Reconfigurable, Intelligently-Adaptive Communication System (RIACS) platform, currently being used for radio development at JSC. The platform consists of a Xilinx ML605 Virtex-6 FPGA board, an Analog Devices FMCOMMS1-EBZ RF transceiver board, and an Embedded PC (Axiomtek eBox 620-110-FL) running the Ubuntu 12.04 operating system. Figure 1 shows the RIACS platform hardware. The result of this development is a very low cost STRS-compliant platform that can be used for waveform developments for multiple applications. The purpose of this document is to describe how to develop a new waveform using the RIACS platform and the Very High Speed Integrated Circuits (VHSIC) Hardware Description Language (VHDL) FPGA wrapper code and the STRS implementation on the Axiomtek processor.

  15. Using Docker Compose for the Simple Deployment of an Integrated Drug Target Screening Platform.

    PubMed

    List, Markus

    2017-06-10

    Docker virtualization allows software tools to be executed in an isolated and controlled environment referred to as a container. In Docker containers, dependencies are provided exactly as intended by the developer and, consequently, they simplify the distribution of scientific software and foster reproducible research. The Docker paradigm is that each container encapsulates one particular software tool. However, to analyze complex biomedical data sets, it is often necessary to combine several software tools into elaborate workflows. To address this challenge, several Docker containers need to be instantiated and properly integrated, which complicates the software deployment process unnecessarily. Here, we demonstrate how an extension to Docker, Docker Compose, can be used to mitigate these problems by providing a unified setup routine that deploys several tools in an integrated fashion. We demonstrate the power of this approach by example of a Docker Compose setup for a drug target screening platform consisting of five integrated web applications and shared infrastructure, deployable in just two lines of code.
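
    As a rough illustration of such a unified setup routine, the Python sketch below generates a minimal Compose file for two hypothetical services plus shared database infrastructure and brings the stack up. The images, ports, and service names are placeholders rather than the actual drug target screening platform, and older installations may use the docker-compose command instead of docker compose.

      # Illustrative sketch: generate a minimal Compose file for two hypothetical
      # services plus a shared database, then bring the stack up. Image names,
      # ports, and service names are placeholders, not the platform from the paper.
      import pathlib
      import subprocess
      import yaml  # PyYAML

      compose = {
          "services": {
              "webapp": {
                  "image": "example/webapp:latest",
                  "ports": ["8080:8080"],
                  "depends_on": ["db"],
              },
              "db": {
                  "image": "postgres:15",
                  "environment": {"POSTGRES_PASSWORD": "example"},
              },
          }
      }
      pathlib.Path("docker-compose.yml").write_text(yaml.safe_dump(compose, sort_keys=False))

      # The "two lines of code" deployment: pull the images, then start everything.
      subprocess.run(["docker", "compose", "pull"], check=True)    # "docker-compose" on older installs
      subprocess.run(["docker", "compose", "up", "-d"], check=True)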

  16. Process and information integration via hypermedia

    NASA Technical Reports Server (NTRS)

    Hammen, David G.; Labasse, Daniel L.; Myers, Robert M.

    1990-01-01

    Success stories for advanced automation prototypes abound in the literature but the deployments of practical large systems are few in number. There are several factors that militate against the maturation of such prototypes into products. Here, the integration of advanced automation software into large systems is discussed. Advanced automation systems tend to be specific applications that need to be integrated and aggregated into larger systems. Systems integration can be achieved by providing expert user-developers with verified tools to efficiently create small systems that interface to large systems through standard interfaces. The use of hypermedia as such a tool in the context of the ground control centers that support Shuttle and space station operations is explored. Hypermedia can be an integrating platform for data, conventional software, and advanced automation software, enabling data integration through the display of diverse types of information and through the creation of associative links between chunks of information. Further, hypermedia enables process integration through graphical invoking of system functions. Through analysis and examples, researchers illustrate how diverse information and processing paradigms can be integrated into a single software platform.

  17. Toward an integrated software platform for systems pharmacology

    PubMed Central

    Ghosh, Samik; Matsuoka, Yukiko; Asai, Yoshiyuki; Hsin, Kun-Yi; Kitano, Hiroaki

    2013-01-01

    Understanding complex biological systems requires the extensive support of computational tools. This is particularly true for systems pharmacology, which aims to understand the action of drugs and their interactions in a systems context. Computational models play an important role as they can be viewed as an explicit representation of biological hypotheses to be tested. A series of software and data resources are used for model development, verification, and exploration of possible system behaviors that may not be feasible or cost-effective to investigate experimentally. Software platforms play a dominant role in supporting creativity and productivity and have transformed many industries; these techniques can be applied to biology as well. Establishing an integrated software platform will be the next important step in the field. © 2013 The Authors. Biopharmaceutics & Drug Disposition published by John Wiley & Sons, Ltd. PMID:24150748

  18. Arsenic removal from contaminated groundwater by membrane-integrated hybrid plant: optimization and control using Visual Basic platform.

    PubMed

    Chakrabortty, S; Sen, M; Pal, P

    2014-03-01

    Simulation software (ARRPA) has been developed on the Microsoft Visual Basic platform for optimization and control of a novel membrane-integrated arsenic separation plant, in the absence of any such software. The user-friendly, menu-driven software is based on a dynamic linearized mathematical model developed for the hybrid treatment scheme. The model captures the chemical kinetics in the pre-treating chemical reactor and the separation and transport phenomena involved in nanofiltration. The software has been validated through extensive experimental investigations. The agreement between the outputs of the computer simulation program and the experimental findings is excellent and consistent under varying operating conditions, reflecting the high degree of accuracy and reliability of the software. High values of the overall correlation coefficient (R² = 0.989) and Willmott d-index (0.989) are indicators of the capability of the software in analyzing the performance of the plant. The software permits pre-analysis and manipulation of input data, helps in optimization, and exhibits the performance of an integrated plant visually on a graphical platform. Performance analysis of the whole system as well as the individual units is possible using the tool. The software, the first of its kind in its domain and in the well-known Microsoft Excel environment, is likely to be very useful in successful design, optimization and operation of an advanced hybrid treatment plant for removal of arsenic from contaminated groundwater.
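
    For readers unfamiliar with the two agreement statistics quoted above, the Python sketch below shows one common way of computing R² (as the squared Pearson correlation) and Willmott's index of agreement from paired observed and predicted values. The numbers are invented for illustration and are not the plant data from the study.

      # Illustrative only: one common way to compute R^2 (squared Pearson
      # correlation) and Willmott's index of agreement d; the values below are
      # made up and are not the arsenic-removal plant data from the study.
      import numpy as np

      observed  = np.array([0.82, 0.75, 0.64, 0.51, 0.40, 0.28])
      predicted = np.array([0.80, 0.77, 0.62, 0.53, 0.38, 0.30])

      # Squared Pearson correlation between observation and prediction.
      r_squared = np.corrcoef(observed, predicted)[0, 1] ** 2

      # Willmott's index of agreement:
      # d = 1 - sum((P - O)^2) / sum((|P - Obar| + |O - Obar|)^2)
      o_bar = observed.mean()
      d_index = 1.0 - np.sum((predicted - observed) ** 2) / np.sum(
          (np.abs(predicted - o_bar) + np.abs(observed - o_bar)) ** 2
      )

      print(f"R^2 = {r_squared:.3f}, Willmott d = {d_index:.3f}")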

  19. Open source hardware and software platform for robotics and artificial intelligence applications

    NASA Astrophysics Data System (ADS)

    Liang, S. Ng; Tan, K. O.; Lai Clement, T. H.; Ng, S. K.; Mohammed, A. H. Ali; Mailah, Musa; Azhar Yussof, Wan; Hamedon, Zamzuri; Yussof, Zulkifli

    2016-02-01

    Recent developments in open source hardware and software platforms (Android, Arduino, Linux, OpenCV, etc.) have enabled rapid development of previously expensive and sophisticated systems within a lower budget and with flatter learning curves for developers. Using these platforms, we designed and developed a Java-based 3D robotic simulation system with a graph database, which is integrated in online and offline modes with an Android-Arduino based rubbish-picking remote control car. The combination of the open source hardware and software system created a flexible and expandable platform for further developments in the future, both in the software and hardware areas, in particular in combination with a graph database for artificial intelligence, as well as more sophisticated hardware, such as legged or humanoid robots.

  20. Radiation and scattering from printed antennas on cylindrically conformal platforms

    NASA Technical Reports Server (NTRS)

    Kempel, Leo C.; Volakis, John L.; Bindiganavale, Sunil

    1994-01-01

    The goal was to develop suitable methods and software for the analysis of antennas on cylindrical coated and uncoated platforms. Specifically, the finite element boundary integral and finite element ABC methods were employed successfully and associated software were developed for the analysis and design of wraparound and discrete cavity-backed arrays situated on cylindrical platforms. This work led to the successful implementation of analysis software for such antennas. Developments which played a role in this respect are the efficient implementation of the 3D Green's function for a metallic cylinder, the incorporation of the fast Fourier transform in computing the matrix-vector products executed in the solver of the finite element-boundary integral system, and the development of a new absorbing boundary condition for terminating the finite element mesh on cylindrical surfaces.
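
    The FFT acceleration mentioned above exploits the convolutional (circulant/Toeplitz) structure that boundary-integral operators take on uniform grids. As a generic illustration rather than the paper's actual FE-BI solver, the sketch below shows how a circulant matrix-vector product collapses to element-wise multiplication in the Fourier domain, reducing the cost from O(n²) to O(n log n).

      # Generic illustration of an FFT-accelerated matrix-vector product for a
      # circulant matrix, the structural idea behind fast matvecs in iterative
      # boundary-integral solvers; this is not the paper's FE-BI implementation.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 8
      c = rng.standard_normal(n)          # first column of the circulant matrix
      x = rng.standard_normal(n)          # vector to multiply

      # Dense construction and O(n^2) multiply, for reference.
      C = np.array([np.roll(c, k) for k in range(n)]).T
      y_dense = C @ x

      # FFT route: a circulant matrix is diagonalized by the DFT, so
      # C @ x = ifft(fft(c) * fft(x)), costing O(n log n).
      y_fft = np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)).real

      assert np.allclose(y_dense, y_fft)
      print(y_fft)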

  1. A Practical Software Architecture for Virtual Universities

    ERIC Educational Resources Information Center

    Xiang, Peifeng; Shi, Yuanchun; Qin, Weijun

    2006-01-01

    This article introduces a practical software architecture called CUBES, which focuses on system integration and evolvement for online virtual universities. The key of CUBES is a supporting platform that helps to integrate and evolve heterogeneous educational applications developed by different organizations. Both standardized educational…

  2. A Roadmap to Continuous Integration for ATLAS Software Development

    NASA Astrophysics Data System (ADS)

    Elmsheuser, J.; Krasznahorkay, A.; Obreshkov, E.; Undrus, A.; ATLAS Collaboration

    2017-10-01

    The ATLAS software infrastructure facilitates the efforts of more than 1000 developers working on a code base of 2200 packages with 4 million lines of C++ and 1.4 million lines of Python code. The ATLAS offline code management system is a powerful, flexible framework for processing new package version requests, probing code changes in the Nightly Build System, migrating to new platforms and compilers, deploying production releases for worldwide access, and supporting physicists with tools and interfaces for efficient software use. It maintains a multi-stream, parallel development environment with about 70 multi-platform branches of nightly releases and provides vast opportunities for testing new packages, verifying patches to existing software, and migrating to new platforms and compilers. The system's evolution is currently aimed at the adoption of modern continuous integration (CI) practices focused on building nightly releases early and often, with rigorous unit and integration testing. This paper describes the CI incorporation program for the ATLAS software infrastructure. It brings modern open source tools such as Jenkins and GitLab into the ATLAS Nightly System, rationalizes hardware resource allocation and administrative operations, and provides improved feedback and means for developers to fix broken builds promptly. Once adopted, ATLAS CI practices will improve and accelerate innovation cycles and result in increased confidence in new software deployments. The paper reports the status of Jenkins integration with the ATLAS Nightly System as well as short- and long-term plans for the incorporation of CI practices.

  3. A COTS RF Optical Software Defined Radio for the Integrated Radio and Optical Communications Test Bed

    NASA Technical Reports Server (NTRS)

    Nappier, Jennifer M.; Zeleznikar, Daniel J.; Wroblewski, Adam C.; Tokars, Roger P.; Schoenholz, Bryan L.; Lantz, Nicholas C.

    2016-01-01

    The Integrated Radio and Optical Communications (iROC) project at the National Aeronautics and Space Administration (NASA) is investigating the merits of a hybrid radio frequency (RF) and optical communication system for deep space missions. In an effort to demonstrate the feasibility and advantages of a hybrid RF/Optical software defined radio (SDR), a laboratory prototype was assembled from primarily commercial-off-the-shelf (COTS) hardware components. This COTS platform has been used to demonstrate simultaneous transmission of the radio and optical communications waveforms through to the physical layer (telescope and antenna). This paper details the hardware and software used in the platform and various measures of its performance. A laboratory optical receiver platform has also been assembled in order to demonstrate hybrid free space links in combination with the transmitter.

  4. A COTS RF/Optical Software Defined Radio for the Integrated Radio and Optical Communications Test Bed

    NASA Technical Reports Server (NTRS)

    Nappier, Jennifer M.; Zeleznikar, Daniel J.; Wroblewski, Adam C.; Tokars, Roger P.; Schoenholz, Bryan L.; Lantz, Nicholas C.

    2017-01-01

    The Integrated Radio and Optical Communications (iROC) project at the National Aeronautics and Space Administration (NASA) is investigating the merits of a hybrid radio frequency (RF) and optical communication system for deep space missions. In an effort to demonstrate the feasibility and advantages of a hybrid RF/Optical software defined radio (SDR), a laboratory prototype was assembled from primarily commercial-off-the-shelf (COTS) hardware components. This COTS platform has been used to demonstrate simultaneous transmission of the radio and optical communications waveforms through to the physical layer (telescope and antenna). This paper details the hardware and software used in the platform and various measures of its performance. A laboratory optical receiver platform has also been assembled in order to demonstrate hybrid free space links in combination with the transmitter.

  5. Supporting in- and off-Hospital Patient Management Using a Web-based Integrated Software Platform.

    PubMed

    Spyropoulos, Basile; Botsivali, Maria; Tzavaras, Aris; Pierros, Vasileios

    2015-01-01

    In this paper, a Web-based software platform appropriately designed to support the continuity of health care information and management for both in-hospital and out-of-hospital care is presented. The system has additional features, such as the formation of continuity of care records and the transmission of referral letters via a semantically annotated web service. The platform's Web orientation provides significant advantages, allowing for easily accomplished remote access.

  6. Software packager user's guide

    NASA Technical Reports Server (NTRS)

    Callahan, John R.

    1995-01-01

    Software integration is a growing area of concern for many programmers and software managers because the need to build new programs quickly from existing components is greater than ever. This includes building versions of software products for multiple hardware platforms and operating systems, building programs from components written in different languages, and building systems from components that must execute on different machines in a distributed network. The goal of software integration is to make building new programs from existing components more seamless -- programmers should pay minimal attention to the underlying configuration issues involved. Libraries of reusable components and classes are important tools but only partial solutions to software development problems. Even though software components may have compatible interfaces, there may be other reasons, such as differences between execution environments, why they cannot be integrated. Often, components must be adapted or reimplemented to fit into another application because of implementation differences -- they are implemented in different programming languages, dependent on different operating system resources, or must execute on different physical machines. The software packager is a tool that allows programmers to deal with interfaces between software components and ignore complex integration details. The packager takes modular descriptions of the structure of a software system written in the package specification language and produces an integration program in the form of a makefile. If complex integration tools are needed to integrate a set of components, such as remote procedure call stubs, their use is implied by the packager automatically and stub generation tools are invoked in the corresponding makefile. The programmer deals only with the components themselves and not the details of how to build the system on any given platform.
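
    The packager's central move, turning a declarative description of components into build rules, can be mimicked in a few lines. The Python sketch below is a hypothetical miniature, not the actual package specification language or the tool's output format: it takes a component list and emits a makefile that compiles and links the pieces.

      # Hypothetical miniature of the packager idea: turn a declarative component
      # description into a makefile. This is not the actual package specification
      # language or generator described in the report.
      from dataclasses import dataclass, field
      from typing import List

      @dataclass
      class Component:
          name: str
          sources: List[str]
          libs: List[str] = field(default_factory=list)

      def emit_makefile(program: str, components: List[Component]) -> str:
          objects = [src.replace(".c", ".o") for comp in components for src in comp.sources]
          libs = sorted({lib for comp in components for lib in comp.libs})
          lines = [
              f"{program}: {' '.join(objects)}",
              "\t$(CC) -o $@ $^ " + " ".join("-l" + lib for lib in libs),
              "",
              "%.o: %.c",
              "\t$(CC) -c $< -o $@",
          ]
          return "\n".join(lines) + "\n"

      components = [
          Component("sequencer", ["sequencer.c"], libs=["m"]),
          Component("net_stub", ["rpc_stub.c"]),   # e.g. generated RPC stub code
      ]
      print(emit_makefile("mission_tool", components))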

  7. Flexible Software Architecture for Visualization and Seismic Data Analysis

    NASA Astrophysics Data System (ADS)

    Petunin, S.; Pavlov, I.; Mogilenskikh, D.; Podzyuban, D.; Arkhipov, A.; Baturuin, N.; Lisin, A.; Smith, A.; Rivers, W.; Harben, P.

    2007-12-01

    Research in the field of seismology requires software and signal processing utilities for seismogram manipulation and analysis. Seismologists and data analysts often encounter a major problem in the use of any particular software application specific to seismic data analysis: the tuning of commands and windows to the specific waveforms and hot key combinations so as to fit their familiar informational environment. The ability to modify the user's interface independently from the developer requires an adaptive code structure. An adaptive code structure also allows for expansion of software capabilities such as new signal processing modules and implementation of more efficient algorithms. Our approach is to use a flexible "open" architecture for development of geophysical software. This report presents an integrated solution for organizing a logical software architecture based on the Unix version of the Geotool software implemented on the Microsoft .NET 2.0 platform. Selection of this platform greatly expands the variety and number of computers that can implement the software, including laptops that can be utilized in field conditions. It also facilitates implementation of communication functions for seismic data requests from remote databases through the Internet. The main principle of the new architecture for Geotool is that scientists should be able to add new routines for digital waveform analysis via software plug-ins that utilize the basic Geotool display for GUI interaction. The use of plug-ins allows the efficient integration of diverse signal-processing software, including software still in preliminary development, into an organized platform without changing the fundamental structure of that platform itself. An analyst's use of Geotool is tracked via a metadata file so that future studies can reconstruct, and alter, the original signal processing operations. The work has been completed in the framework of a joint Russian-American project.
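
    The plug-in principle described above can be sketched independently of the .NET implementation. The Python fragment below shows a schematic registry through which a host application discovers and invokes waveform-processing routines by name; the routine names are hypothetical, not actual Geotool modules.

      # Schematic plug-in registry in the spirit of the architecture described
      # above; it is not the actual Geotool/.NET implementation, and the
      # waveform-processing routines are hypothetical.
      from typing import Callable, Dict, List

      SignalProcessor = Callable[[List[float]], List[float]]
      REGISTRY: Dict[str, SignalProcessor] = {}

      def plugin(name: str):
          """Decorator that registers a processing routine with the host application."""
          def wrap(func: SignalProcessor) -> SignalProcessor:
              REGISTRY[name] = func
              return func
          return wrap

      @plugin("demean")
      def demean(samples: List[float]) -> List[float]:
          mean = sum(samples) / len(samples)
          return [s - mean for s in samples]

      @plugin("rectify")
      def rectify(samples: List[float]) -> List[float]:
          return [abs(s) for s in samples]

      def run(name: str, samples: List[float]) -> List[float]:
          return REGISTRY[name](samples)   # the host invokes plug-ins by name

      print(run("demean", [1.0, 2.0, 3.0]))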

  8. Future Standardization of Space Telecommunications Radio System with Core Flight System

    NASA Technical Reports Server (NTRS)

    Hickey, Joseph P.; Briones, Janette C.; Roche, Rigoberto; Handler, Louis M.; Hall, Steven

    2016-01-01

    NASA Glenn Research Center (GRC) is integrating the NASA Space Telecommunications Radio System (STRS) Standard with the Core Flight System (cFS). The STRS standard provides a common, consistent framework to develop, qualify, operate and maintain complex, reconfigurable and reprogrammable radio systems. The cFS is a flexible, open architecture that features a plug-and-play software executive called the Core Flight Executive (cFE), a reusable library of software components for flight and space missions and an integrated tool suite. Together, STRS and cFS create a development environment that allows for STRS-compliant applications to reference the STRS APIs through the cFS infrastructure. These APIs are used to standardize the communication protocols on NASA's space SDRs. The cFE-STRS Operating Environment (OE) is a portable cFS library, which adds the ability to run STRS applications on existing cFS platforms. The purpose of this paper is to discuss the cFE-STRS OE prototype, preliminary experimental results performed using the Advanced Space Radio Platform (ASRP), the GRC S-band Ground Station and the SCaN (Space Communication and Navigation) Testbed currently flying onboard the International Space Station. Additionally, this paper presents a demonstration of the Consultative Committee for Space Data Systems (CCSDS) Spacecraft Onboard Interface Services (SOIS) using electronic data sheets inside cFE. This configuration allows for the data sheets to specify binary formats for data exchange between STRS applications. The integration of STRS with cFS leverages mission-proven platform functions and mitigates barriers to integration with future missions. This reduces flight software development time and the costs of software-defined radio (SDR) platforms. Furthermore, the combined benefits of STRS standardization with the flexibility of cFS provide an effective, reliable and modular framework to minimize software development efforts for spaceflight missions.

  9. Boutiques: a flexible framework to integrate command-line applications in computing platforms.

    PubMed

    Glatard, Tristan; Kiar, Gregory; Aumentado-Armstrong, Tristan; Beck, Natacha; Bellec, Pierre; Bernard, Rémi; Bonnet, Axel; Brown, Shawn T; Camarasu-Pop, Sorina; Cervenansky, Frédéric; Das, Samir; Ferreira da Silva, Rafael; Flandin, Guillaume; Girard, Pascal; Gorgolewski, Krzysztof J; Guttmann, Charles R G; Hayot-Sasson, Valérie; Quirion, Pierre-Olivier; Rioux, Pierre; Rousseau, Marc-Étienne; Evans, Alan C

    2018-05-01

    We present Boutiques, a system to automatically publish, integrate, and execute command-line applications across computational platforms. Boutiques applications are installed through software containers described in a rich and flexible JSON language. A set of core tools facilitates the construction, validation, import, execution, and publishing of applications. Boutiques is currently supported by several distinct virtual research platforms, and it has been used to describe dozens of applications in the neuroinformatics domain. We expect Boutiques to improve the quality of application integration in computational platforms, to reduce redundancy of effort, to contribute to computational reproducibility, and to foster Open Science.
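
    The "rich and flexible JSON language" referred to above is the Boutiques descriptor format. The Python sketch below assembles a minimal descriptor-like document and writes it to disk; the field names follow the general shape of Boutiques descriptors but are shown from memory and should be checked against the official schema, and the tool it describes is hypothetical.

      # Minimal Boutiques-style descriptor assembled in Python and written to JSON.
      # Field names follow the general shape of Boutiques descriptors but are shown
      # from memory for illustration; check them against the official schema. The
      # tool, image, and parameters are hypothetical.
      import json

      descriptor = {
          "name": "example_tool",
          "description": "Hypothetical command-line tool wrapped for Boutiques",
          "tool-version": "1.0.0",
          "schema-version": "0.5",
          "command-line": "example_tool [INPUT_FILE] [THRESHOLD]",
          "container-image": {"type": "docker", "image": "example/tool:1.0.0"},
          "inputs": [
              {"id": "input_file", "name": "Input file", "type": "File",
               "value-key": "[INPUT_FILE]"},
              {"id": "threshold", "name": "Threshold", "type": "Number",
               "value-key": "[THRESHOLD]", "optional": True},
          ],
      }

      with open("example_tool.json", "w") as fh:
          json.dump(descriptor, fh, indent=2)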

  10. Programmable Logic Device (PLD) Design Description for the Integrated Power, Avionics, and Software (iPAS) Space Telecommunications Radio System (STRS) Radio

    NASA Technical Reports Server (NTRS)

    Shalkhauser, Mary Jo W.

    2017-01-01

    The Space Telecommunications Radio System (STRS) provides a common, consistent framework for software defined radios (SDRs) to abstract the application software from the radio platform hardware. The STRS standard aims to reduce the cost and risk of using complex, configurable and reprogrammable radio systems across NASA missions. To promote the use of the STRS architecture for future NASA advanced exploration missions, NASA Glenn Research Center (GRC) developed an STRS-compliant SDR on a radio platform used by the Advanced Exploration Systems program at the Johnson Space Center (JSC) in their Integrated Power, Avionics, and Software (iPAS) laboratory. At the conclusion of the development, the software and hardware description language (HDL) code was delivered to JSC for their use in their iPAS test bed to get hands-on experience with the STRS standard, and for development of their own STRS waveforms on the now STRS-compliant platform. The iPAS STRS Radio was implemented on the Reconfigurable, Intelligently-Adaptive Communication System (RIACS) platform, currently being used for radio development at JSC. The platform consists of a Xilinx ML605 Virtex-6 FPGA board, an Analog Devices FMCOMMS1-EBZ RF transceiver board, and an Embedded PC (Axiomtek eBox 620-110-FL) running the Ubuntu 12.04 operating system. Figure 1 shows the RIACS platform hardware. The result of this development is a very low cost STRS-compliant platform that can be used for waveform developments for multiple applications. The purpose of this document is to describe the design of the HDL code for the FPGA portion of the iPAS STRS Radio, particularly the design of the FPGA wrapper and the test waveform.

  11. Fiji: an open-source platform for biological-image analysis.

    PubMed

    Schindelin, Johannes; Arganda-Carreras, Ignacio; Frise, Erwin; Kaynig, Verena; Longair, Mark; Pietzsch, Tobias; Preibisch, Stephan; Rueden, Curtis; Saalfeld, Stephan; Schmid, Benjamin; Tinevez, Jean-Yves; White, Daniel James; Hartenstein, Volker; Eliceiri, Kevin; Tomancak, Pavel; Cardona, Albert

    2012-06-28

    Fiji is a distribution of the popular open-source software ImageJ focused on biological-image analysis. Fiji uses modern software engineering practices to combine powerful software libraries with a broad range of scripting languages to enable rapid prototyping of image-processing algorithms. Fiji facilitates the transformation of new algorithms into ImageJ plugins that can be shared with end users through an integrated update system. We propose Fiji as a platform for productive collaboration between computer science and biology research communities.

  12. Future Standardization of Space Telecommunications Radio System with Core Flight System

    NASA Technical Reports Server (NTRS)

    Briones, Janette C.; Hickey, Joseph P.; Roche, Rigoberto; Handler, Louis M.; Hall, Charles S.

    2016-01-01

    NASA Glenn Research Center (GRC) is integrating the NASA Space Telecommunications Radio System (STRS) Standard with the Core Flight System (cFS), an avionics software operating environment. The STRS standard provides a common, consistent framework to develop, qualify, operate and maintain complex, reconfigurable and reprogrammable radio systems. The cFS is a flexible, open architecture that features a plug-and-play software executive called the Core Flight Executive (cFE), a reusable library of software components for flight and space missions and an integrated tool suite. Together, STRS and cFS create a development environment that allows for STRS-compliant applications to reference the STRS application programmer interfaces (APIs) that use the cFS infrastructure. These APIs are used to standardize the communication protocols on NASA's space SDRs. The cFS-STRS Operating Environment (OE) is a portable cFS library, which adds the ability to run STRS applications on existing cFS platforms. The purpose of this paper is to discuss the cFS-STRS OE prototype, preliminary experimental results performed using the Advanced Space Radio Platform (ASRP), the GRC S-band Ground Station and the SCaN (Space Communication and Navigation) Testbed currently flying onboard the International Space Station (ISS). Additionally, this paper presents a demonstration of the Consultative Committee for Space Data Systems (CCSDS) Spacecraft Onboard Interface Services (SOIS) using electronic data sheets (EDS) inside cFE. This configuration allows for the data sheets to specify binary formats for data exchange between STRS applications. The integration of STRS with cFS leverages mission-proven platform functions and mitigates barriers to integration with future missions. This reduces flight software development time and the costs of software-defined radio (SDR) platforms. Furthermore, the combined benefits of STRS standardization with the flexibility of cFS provide an effective, reliable and modular framework to minimize software development efforts for spaceflight missions.

  13. Develop Direct Geo-referencing System Based on Open Source Software and Hardware Platform

    NASA Astrophysics Data System (ADS)

    Liu, H. S.; Liao, H. M.

    2015-08-01

    A direct geo-referencing system uses remote sensing technology to quickly capture images, GPS tracks, and camera positions. These data allow the construction of large volumes of images with geographic coordinates, so that users can take measurements directly on the images. In order to calculate positioning properly, all the sensor signals must be synchronized. Traditional aerial photography uses a Position and Orientation System (POS) to integrate image, coordinate, and camera position data. However, such systems are very expensive, and users cannot use the results immediately because the position information is not embedded in the images. For reasons of economy and efficiency, this study aims to develop a direct geo-referencing system based on an open source software and hardware platform. After using an Arduino microcontroller board to integrate the signals, we can calculate positioning with the open source software OpenCV. Finally, we use the open source panorama browser Panini and integrate all of these into the open source GIS software Quantum GIS. In this way, a complete data collection and processing system can be constructed.

  14. Developing an Interactive Data Visualization Tool to Assess the Impact of Decision Support on Clinical Operations.

    PubMed

    Huber, Timothy C; Krishnaraj, Arun; Monaghan, Dayna; Gaskin, Cree M

    2018-05-18

    Due to mandates from recent legislation, clinical decision support (CDS) software is being adopted by radiology practices across the country. This software provides imaging study decision support for referring providers at the point of order entry. CDS systems produce a large volume of data, providing opportunities for research and quality improvement. In order to better visualize and analyze trends in this data, an interactive data visualization dashboard was created using a commercially available data visualization platform. Following the integration of a commercially available clinical decision support product into the electronic health record, a dashboard was created using a commercially available data visualization platform (Tableau, Seattle, WA). Data generated by the CDS were exported from the data warehouse, where they were stored, into the platform. This allowed for real-time visualization of the data generated by the decision support software. The creation of the dashboard allowed the output from the CDS platform to be more easily analyzed and facilitated hypothesis generation. Integrating data visualization tools into clinical decision support tools allows for easier data analysis and can streamline research and quality improvement efforts.
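
    A typical preparatory step for such a dashboard is aggregating the raw CDS order-entry events into daily counts before loading them into the visualization tool. The pandas sketch below is purely illustrative; the column names and ratings are hypothetical, not the schema of the commercial CDS product or the authors' data warehouse.

      # Illustrative pre-aggregation of CDS order-entry events before loading them
      # into a visualization tool; the column names and ratings are hypothetical,
      # not the schema of the commercial CDS product or the authors' data warehouse.
      import pandas as pd

      events = pd.DataFrame({
          "order_date":        ["2018-01-03", "2018-01-03", "2018-01-04", "2018-01-04"],
          "ordering_provider": ["prov_a", "prov_b", "prov_a", "prov_c"],
          "study":             ["CT head", "MRI lumbar", "CT head", "CT abdomen"],
          "appropriateness":   ["appropriate", "may be appropriate",
                                "rarely appropriate", "appropriate"],
      })
      events["order_date"] = pd.to_datetime(events["order_date"])

      # Daily counts of orders by appropriateness rating - the kind of trend a
      # dashboard would plot over time.
      daily = (events.groupby([events["order_date"].dt.date, "appropriateness"])
                     .size()
                     .unstack(fill_value=0))
      daily.to_csv("cds_daily_summary.csv")   # extract consumed by the dashboard
      print(daily)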

  15. Bringing your tools to CyVerse Discovery Environment using Docker

    PubMed Central

    Devisetty, Upendra Kumar; Kennedy, Kathleen; Sarando, Paul; Merchant, Nirav; Lyons, Eric

    2016-01-01

    Docker has become a very popular container-based virtualization platform for software distribution that has revolutionized the way in which scientific software and software dependencies (software stacks) can be packaged, distributed, and deployed. Docker makes the complex and time-consuming installation procedures needed for scientific software a one-time process. Because it enables platform-independent installation, versioning of software environments, and easy redeployment and reproducibility, Docker is an ideal candidate for the deployment of identical software stacks on different compute environments such as XSEDE and Amazon AWS. CyVerse’s Discovery Environment also uses Docker for integrating its powerful, community-recommended software tools into CyVerse’s production environment for public use. This paper will help users bring their tools into the CyVerse Discovery Environment (DE), which not only allows users to integrate their tools with relative ease compared to the earlier method of tool deployment in DE, but also helps users share their apps with collaborators and release them for public use. PMID:27803802

  16. Bringing your tools to CyVerse Discovery Environment using Docker.

    PubMed

    Devisetty, Upendra Kumar; Kennedy, Kathleen; Sarando, Paul; Merchant, Nirav; Lyons, Eric

    2016-01-01

    Docker has become a very popular container-based virtualization platform for software distribution that has revolutionized the way in which scientific software and software dependencies (software stacks) can be packaged, distributed, and deployed. Docker makes the complex and time-consuming installation procedures needed for scientific software a one-time process. Because it enables platform-independent installation, versioning of software environments, and easy redeployment and reproducibility, Docker is an ideal candidate for the deployment of identical software stacks on different compute environments such as XSEDE and Amazon AWS. CyVerse's Discovery Environment also uses Docker for integrating its powerful, community-recommended software tools into CyVerse's production environment for public use. This paper will help users bring their tools into the CyVerse Discovery Environment (DE), which not only allows users to integrate their tools with relative ease compared to the earlier method of tool deployment in DE, but also helps users share their apps with collaborators and release them for public use.

  17. [Application of the life sciences platform based on oracle to biomedical informations].

    PubMed

    Zhao, Zhi-Yun; Li, Tai-Huan; Yang, Hong-Qiao

    2008-03-01

    The life sciences platform based on Oracle database technology is introduced in this paper. By providing powerful data access, integrating a variety of data types, and managing vast quantities of data, the software presents a flexible, safe and scalable management platform for biomedical data processing.

  18. MeDICi Software Superglue for Data Analysis Pipelines

    ScienceCinema

    Ian Gorton

    2017-12-09

    The Middleware for Data-Intensive Computing (MeDICi) Integration Framework is an integrated middleware platform developed to meet the data analysis and processing needs of scientists across many domains. MeDICi is scalable, easily modified, and robust to multiple languages, protocols, and hardware platforms, and is in use today by PNNL scientists for bioinformatics, power grid failure analysis, and text analysis.

  19. Boutiques: a flexible framework to integrate command-line applications in computing platforms

    PubMed Central

    Glatard, Tristan; Kiar, Gregory; Aumentado-Armstrong, Tristan; Beck, Natacha; Bellec, Pierre; Bernard, Rémi; Bonnet, Axel; Brown, Shawn T; Camarasu-Pop, Sorina; Cervenansky, Frédéric; Das, Samir; Ferreira da Silva, Rafael; Flandin, Guillaume; Girard, Pascal; Gorgolewski, Krzysztof J; Guttmann, Charles R G; Hayot-Sasson, Valérie; Quirion, Pierre-Olivier; Rioux, Pierre; Rousseau, Marc-Étienne; Evans, Alan C

    2018-01-01

    We present Boutiques, a system to automatically publish, integrate, and execute command-line applications across computational platforms. Boutiques applications are installed through software containers described in a rich and flexible JSON language. A set of core tools facilitates the construction, validation, import, execution, and publishing of applications. Boutiques is currently supported by several distinct virtual research platforms, and it has been used to describe dozens of applications in the neuroinformatics domain. We expect Boutiques to improve the quality of application integration in computational platforms, to reduce redundancy of effort, to contribute to computational reproducibility, and to foster Open Science. PMID:29718199

  20. Distributed Processing of Sentinel-2 Products using the BIGEARTH Platform

    NASA Astrophysics Data System (ADS)

    Bacu, Victor; Stefanut, Teodor; Nandra, Constantin; Mihon, Danut; Gorgan, Dorian

    2017-04-01

    The constellation of observational satellites orbiting around Earth is constantly increasing, providing more data that need to be processed in order to extract meaningful information and knowledge from it. Sentinel-2 satellites, part of the Copernicus Earth Observation program, aim to be used in agriculture, forestry and many other land management applications. ESA's SNAP toolbox can be used to process data gathered by Sentinel-2 satellites but is limited to the resources provided by a stand-alone computer. In this paper we present a cloud based software platform that makes use of this toolbox together with other remote sensing software applications to process Sentinel-2 products. The BIGEARTH software platform [1] offers an integrated solution for processing Earth Observation data coming from different sources (such as satellites or on-site sensors). The flow of processing is defined as a chain of tasks based on the WorDeL description language [2]. Each task could rely on a different software technology (such as Grass GIS and ESA's SNAP) in order to process the input data. One important feature of the BIGEARTH platform comes from this possibility of interconnection and integration, throughout the same flow of processing, of the various well known software technologies. All this integration is transparent from the user perspective. The proposed platform extends the SNAP capabilities by enabling specialists to easily scale the processing over distributed architectures, according to their specific needs and resources. The software platform [3] can be used in multiple configurations. In the basic one the software platform runs as a standalone application inside a virtual machine. Obviously in this case the computational resources are limited but it will give an overview of the functionalities of the software platform, and also the possibility to define the flow of processing and later on to execute it on a more complex infrastructure. The most complex and robust configuration is based on cloud computing and allows the installation on a private or public cloud infrastructure. In this configuration, the processing resources can be dynamically allocated and the execution time can be considerably improved by the available virtual resources and the number of parallelizable sequences in the processing flow. The presentation highlights the benefits and issues of the proposed solution by analyzing some significant experimental use cases. Main references for further information: [1] BigEarth project, http://cgis.utcluj.ro/projects/bigearth [2] Constantin Nandra, Dorian Gorgan: "Defining Earth data batch processing tasks by means of a flexible workflow description language", ISPRS Ann. Photogramm. Remote Sens. Spatial Inf. Sci., III-4, 59-66, (2016). [3] Victor Bacu, Teodor Stefanut, Dorian Gorgan, "Adaptive Processing of Earth Observation Data on Cloud Infrastructures Based on Workflow Description", Proceedings of the Intelligent Computer Communication and Processing (ICCP), IEEE-Press, pp.444-454, (2015).
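
    The flow-of-processing idea, a chain of tasks in which each task may be handled by a different underlying tool, can be pictured with a small sketch. The Python fragment below is schematic only: it is not WorDeL or the BIGEARTH engine, and the steps are placeholders for calls into tools such as SNAP or GRASS GIS.

      # Schematic task chain in the spirit of a workflow description; this is not
      # WorDeL or the BIGEARTH engine, and the processing steps are placeholders
      # for calls into tools such as SNAP or GRASS GIS.
      from typing import Callable, Dict, List

      Task = Callable[[Dict], Dict]

      def atmospheric_correction(product: Dict) -> Dict:
          return {**product, "corrected": True}

      def compute_ndvi(product: Dict) -> Dict:
          return {**product, "ndvi": "ndvi_raster"}      # placeholder band arithmetic

      def export_geotiff(product: Dict) -> Dict:
          return {**product, "output": product["name"] + "_ndvi.tif"}

      def run_chain(product: Dict, chain: List[Task]) -> Dict:
          for task in chain:              # each task could dispatch to a different tool
              product = task(product)
          return product

      result = run_chain({"name": "S2A_tile_T34TCS"},
                         [atmospheric_correction, compute_ndvi, export_geotiff])
      print(result["output"])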

  1. Development of an Integrated Process, Modeling and Simulation Platform for Performance-Based Design of Low-Energy and High IEQ Buildings

    ERIC Educational Resources Information Center

    Chen, Yixing

    2013-01-01

    The objective of this study was to develop a "Virtual Design Studio (VDS)": a software platform for integrated, coordinated and optimized design of green building systems with low energy consumption, high indoor environmental quality (IEQ), and high level of sustainability. The VDS is intended to assist collaborating architects,…

  2. Leveraging Existing Mission Tools in a Re-Usable, Component-Based Software Environment

    NASA Technical Reports Server (NTRS)

    Greene, Kevin; Grenander, Sven; Kurien, James; O'Reilly, Taifun

    2006-01-01

    Emerging methods in component-based software development offer significant advantages but may seem incompatible with existing mission operations applications. In this paper we relate our positive experiences integrating existing mission applications into component-based tools we are delivering to three missions. In most operations environments, a number of software applications have been integrated together to form the mission operations software. In contrast, with component-based software development, chunks of related functionality and data structures, referred to as components, can be individually delivered, integrated and re-used. With the advent of powerful tools for managing component-based development, complex software systems can potentially see significant benefits in ease of integration, testability and reusability from these techniques. These benefits motivate us to ask how component-based development techniques can be relevant in a mission operations environment, where there is significant investment in software tools that are not component-based and may not be written in languages for which component-based tools even exist. Trusted and complex software tools for sequencing, validation, navigation, and other vital functions cannot simply be re-written or abandoned in order to gain the advantages offered by emerging component-based software techniques. Thus some middle ground must be found. We have faced exactly this issue, and have found several solutions. Ensemble is an open platform for the development, integration, and deployment of mission operations software that we are developing. Ensemble itself is an extension of an open-source, component-based software development platform called Eclipse. Due to the advantages of component-based development, we have been able to very rapidly develop mission operations tools for three surface missions by mixing and matching from a common set of mission operations components. We have also had to determine how to integrate existing mission applications for sequence development, sequence validation, high-level activity planning, and other functions into a component-based environment. For each of these, we used a somewhat different technique based upon the structure and usage of the existing application.
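
    A toy Python sketch of the mix-and-match idea behind component-based operations tools: each capability registers itself under a well-known name, and a mission-specific tool is assembled from a chosen subset. The component names are invented for illustration; Ensemble itself builds on Eclipse's plug-in model rather than on a registry like this.

      # Minimal component registry: capabilities register themselves, tools are
      # assembled from whichever subset a particular mission needs.
      REGISTRY = {}

      def component(name):
          """Decorator that registers a capability under a well-known name."""
          def register(cls):
              REGISTRY[name] = cls
              return cls
          return register

      @component("sequencing")
      class SequencingView:
          def contribute(self):
              return "sequence editor"

      @component("activity-planning")
      class PlanningView:
          def contribute(self):
              return "activity timeline"

      @component("downlink-browser")
      class DownlinkView:
          def contribute(self):
              return "telemetry browser"

      def assemble(selection):
          """Build one mission tool out of independently delivered parts."""
          return [REGISTRY[name]().contribute() for name in selection]

      print(assemble(["sequencing", "downlink-browser"]))   # e.g. a surface-ops tool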

  3. SSE software test management STM capability: Using STM in the Ground Systems Development Environment (GSDE)

    NASA Technical Reports Server (NTRS)

    Church, Victor E.; Long, D.; Hartenstein, Ray; Perez-Davila, Alfredo

    1992-01-01

    This report is one of a series discussing configuration management (CM) topics for Space Station ground systems software development. It provides a description of the Software Support Environment (SSE)-developed Software Test Management (STM) capability, and discusses the possible use of this capability for the management of developed software during testing performed on target platforms. It is intended to supplement the formal documentation of STM provided by the SSE Project. How STM can be used to integrate contractor CM and formal CM for software before delivery to operations is described. STM provides a level of control that is flexible enough to support integration and debugging, but sufficiently rigorous to ensure the integrity of the testing process.

  4. ROS-IGTL-Bridge: an open network interface for image-guided therapy using the ROS environment.

    PubMed

    Frank, Tobias; Krieger, Axel; Leonard, Simon; Patel, Niravkumar A; Tokuda, Junichi

    2017-08-01

    With the growing interest in advanced image-guidance for surgical robot systems, rapid integration and testing of robotic devices and medical image computing software are becoming essential in research and development. Maximizing the use of existing engineering resources built on widely accepted platforms in different fields, such as the Robot Operating System (ROS) in robotics and 3D Slicer in medical image computing, could simplify these tasks. We propose a new open network bridge interface integrated in ROS to ensure seamless cross-platform data sharing. A ROS node named ROS-IGTL-Bridge was implemented. It establishes a TCP/IP network connection between the ROS environment and external medical image computing software using the OpenIGTLink protocol. The node exports ROS messages to the external software over the network and vice versa simultaneously, allowing seamless and transparent data sharing between ROS-based devices and medical image computing platforms. Performance tests demonstrated that the bridge could successfully stream transforms, strings, points, and images at 30 fps in both directions. The data transfer latency was <1.2 ms for transforms, strings and points, and 25.2 ms for color VGA images. A separate test also demonstrated that the bridge could achieve 900 fps for transforms. Additionally, the bridge was demonstrated in two representative systems: a mock image-guided surgical robot setup consisting of 3D Slicer and Lego Mindstorms with ROS, as a prototyping and educational platform for IGT research; and the smart tissue autonomous robot surgical setup with 3D Slicer. The study demonstrated that the bridge enabled cross-platform data sharing between ROS and medical image computing software. This will allow rapid and seamless integration of advanced image-based planning/navigation offered by medical image computing software such as 3D Slicer into ROS-based surgical robot systems.
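
    A conceptual Python sketch of the bridging pattern described above: a small relay that forwards length-prefixed JSON messages over a TCP connection in both directions. This is not the OpenIGTLink wire format and not the ROS-IGTL-Bridge implementation; the framing and field names are assumptions made only to illustrate the idea (18944 is the port commonly associated with OpenIGTLink).

      # Minimal bidirectional message relay over TCP with 4-byte length framing.
      import json
      import socket
      import struct

      def recv_exact(sock, n):
          data = b""
          while len(data) < n:
              chunk = sock.recv(n - len(data))
              if not chunk:
                  raise ConnectionError("socket closed")
              data += chunk
          return data

      def send_message(sock, msg):
          payload = json.dumps(msg).encode("utf-8")
          sock.sendall(struct.pack("!I", len(payload)) + payload)

      def recv_message(sock):
          (length,) = struct.unpack("!I", recv_exact(sock, 4))
          return json.loads(recv_exact(sock, length).decode("utf-8"))

      if __name__ == "__main__":
          # Loopback demonstration: forward a transform-like message to a peer.
          server = socket.create_server(("127.0.0.1", 18944))
          client = socket.create_connection(("127.0.0.1", 18944))
          peer, _ = server.accept()
          send_message(client, {"type": "TRANSFORM", "name": "tool_tip",
                                "matrix": [[1, 0, 0, 0], [0, 1, 0, 0],
                                           [0, 0, 1, 0], [0, 0, 0, 1]]})
          print(recv_message(peer))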

  5. The NCI High Performance Computing (HPC) and High Performance Data (HPD) Platform to Support the Analysis of Petascale Environmental Data Collections

    NASA Astrophysics Data System (ADS)

    Evans, B. J. K.; Pugh, T.; Wyborn, L. A.; Porter, D.; Allen, C.; Smillie, J.; Antony, J.; Trenham, C.; Evans, B. J.; Beckett, D.; Erwin, T.; King, E.; Hodge, J.; Woodcock, R.; Fraser, R.; Lescinsky, D. T.

    2014-12-01

    The National Computational Infrastructure (NCI) has co-located a priority set of national data assets within a HPC research platform. This powerful in-situ computational platform has been created to help serve and analyse the massive amounts of data across the spectrum of environmental collections - in particular the climate, observational data and geoscientific domains. This paper examines the infrastructure, innovation and opportunity for this significant research platform. NCI currently manages nationally significant data collections (10+ PB) categorised as 1) earth system sciences, climate and weather model data assets and products, 2) earth and marine observations and products, 3) geosciences, 4) terrestrial ecosystem, 5) water management and hydrology, and 6) astronomy, social science and biosciences. The data is largely sourced from the NCI partners (who include the custodians of many of the national scientific records), major research communities, and collaborating overseas organisations. By co-locating these large, valuable data assets, new opportunities have arisen by harmonising the data collections, making a powerful transdisciplinary research platform. The data is accessible within an integrated HPC-HPD environment - a 1.2 PFlop supercomputer (Raijin), an HPC-class 3000-core OpenStack cloud system and several highly connected, large-scale, high-bandwidth Lustre filesystems. New scientific software, cloud-scale techniques, server-side visualisation and data services have been harnessed and integrated into the platform, so that analysis is performed seamlessly across the traditional boundaries of the underlying data domains. Characterisation of the techniques along with performance profiling ensures scalability of each software component, all of which can either be enhanced or replaced through future improvements. A Development-to-Operations (DevOps) framework has also been implemented to manage the scale of the software complexity alone. This ensures that software is both upgradable and maintainable, can be readily reused within complex integrated systems, and can become part of the growing global trusted community tools for cross-disciplinary research.

  6. Defense AT&L, Volume 42, Number 1

    DTIC Science & Technology

    2013-02-01

    Agnish The U.S. Army late last year began equipping brigade combat teams with its first package of radios, satellite systems, software applications...Army’s first package of radios, satellite systems, software applications, smartphone-like devices, and other network components that provide integrated... satellite communications, intelligence, mission command applications, and the integration of C4ISR equipment onto various vehicle platforms. This

  7. Psynteract: A flexible, cross-platform, open framework for interactive experiments.

    PubMed

    Henninger, Felix; Kieslich, Pascal J; Hilbig, Benjamin E

    2017-10-01

    We introduce a novel platform for interactive studies, that is, any form of study in which participants' experiences depend not only on their own responses, but also on those of other participants who complete the same study in parallel, for example a prisoner's dilemma or an ultimatum game. The software thus especially serves the rapidly growing field of strategic interaction research within psychology and behavioral economics. In contrast to all available software packages, our platform does not handle stimulus display and response collection itself. Instead, we provide a mechanism to extend existing experimental software to incorporate interactive functionality. This approach allows us to draw upon the capabilities already available, such as accuracy of temporal measurement, integration with auxiliary hardware such as eye-trackers or (neuro-)physiological apparatus, and recent advances in experimental software, for example capturing response dynamics through mouse-tracking. Through integration with OpenSesame, an open-source graphical experiment builder, studies can be assembled via a drag-and-drop interface requiring little or no further programming skills. In addition, by using the same communication mechanism across software packages, we also enable interoperability between systems. Our source code, which provides support for all major operating systems and several popular experimental packages, can be freely used and distributed under an open source license. The communication protocols underlying its functionality are also well documented and easily adapted to further platforms. Code and documentation are available at https://github.com/psynteract/ .

  8. From WSN towards WoT: Open API Scheme Based on oneM2M Platforms.

    PubMed

    Kim, Jaeho; Choi, Sung-Chan; Ahn, Il-Yeup; Sung, Nak-Myoung; Yun, Jaeseok

    2016-10-06

    Conventional computing systems have been able to be integrated into daily objects and connected to each other due to advances in computing and network technologies, such as wireless sensor networks (WSNs), forming a global network infrastructure, called the Internet of Things (IoT). To support the interconnection and interoperability between heterogeneous IoT systems, the availability of standardized, open application programming interfaces (APIs) is one of the key features of common software platforms for IoT devices, gateways, and servers. In this paper, we present a standardized way of extending previously-existing WSNs towards IoT systems, building the world of the Web of Things (WoT). Based on the oneM2M software platforms developed in the previous project, we introduce a well-designed open API scheme and device-specific thing adaptation software (TAS) enabling WSN elements, such as a wireless sensor node, to be accessed in a standardized way on a global scale. Three pilot services are implemented (i.e., a WiFi-enabled smart flowerpot, voice-based control for ZigBee-connected home appliances, and WiFi-connected AR.Drone control) to demonstrate the practical usability of the open API scheme and TAS modules. Full details on the method of integrating WSN elements into three example systems are described at the programming code level, which is expected to help future researchers in integrating their WSN systems in IoT platforms, such as oneM2M. We hope that the flexibly-deployable, easily-reusable common open API scheme and TAS-based integration method working with the oneM2M platforms will help the conventional WSNs in diverse industries evolve into the emerging WoT solutions.

  9. From WSN towards WoT: Open API Scheme Based on oneM2M Platforms

    PubMed Central

    Kim, Jaeho; Choi, Sung-Chan; Ahn, Il-Yeup; Sung, Nak-Myoung; Yun, Jaeseok

    2016-01-01

    Conventional computing systems have been able to be integrated into daily objects and connected to each other due to advances in computing and network technologies, such as wireless sensor networks (WSNs), forming a global network infrastructure, called the Internet of Things (IoT). To support the interconnection and interoperability between heterogeneous IoT systems, the availability of standardized, open application programming interfaces (APIs) is one of the key features of common software platforms for IoT devices, gateways, and servers. In this paper, we present a standardized way of extending previously-existing WSNs towards IoT systems, building the world of the Web of Things (WoT). Based on the oneM2M software platforms developed in the previous project, we introduce a well-designed open API scheme and device-specific thing adaptation software (TAS) enabling WSN elements, such as a wireless sensor node, to be accessed in a standardized way on a global scale. Three pilot services are implemented (i.e., a WiFi-enabled smart flowerpot, voice-based control for ZigBee-connected home appliances, and WiFi-connected AR.Drone control) to demonstrate the practical usability of the open API scheme and TAS modules. Full details on the method of integrating WSN elements into three example systems are described at the programming code level, which is expected to help future researchers in integrating their WSN systems in IoT platforms, such as oneM2M. We hope that the flexibly-deployable, easily-reusable common open API scheme and TAS-based integration method working with the oneM2M platforms will help the conventional WSNs in diverse industries evolve into the emerging WoT solutions. PMID:27782058
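
    A hedged Python sketch of what a standardized open-API call to a oneM2M-style server can look like over HTTP: posting a sensor reading as a contentInstance under an application entity and container. The CSE address, resource path and originator credential below are placeholders, not values from the paper; the exact resource tree and access-control setup depend on the deployed platform.

      # Prepare a oneM2M-style HTTP request (printed rather than sent, so the
      # snippet runs without a server; urlopen(request) would submit it).
      import json
      import urllib.request

      CSE_BASE = "http://127.0.0.1:8080/Mobius"        # assumed CSE address
      PATH = "/flowerpot_ae/humidity"                   # assumed AE/container path

      body = json.dumps({"m2m:cin": {"con": "41.5"}}).encode("utf-8")
      request = urllib.request.Request(
          CSE_BASE + PATH,
          data=body,
          method="POST",
          headers={
              "X-M2M-Origin": "SFlowerpot",             # originator (assumed)
              "X-M2M-RI": "req-0001",                   # request identifier
              "Content-Type": "application/json;ty=4",  # ty=4 marks a contentInstance
          },
      )
      print(request.get_method(), request.full_url)
      print(dict(request.header_items()), request.data)
      # urllib.request.urlopen(request) would submit the reading to the CSE.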

  10. Integrated Environment for Development and Assurance

    DTIC Science & Technology

    2015-01-26

    Jan 26, 2015. © 2015 Carnegie Mellon University. We rely on software for safe aircraft operation. Embedded software systems introduce a new class of ... Developer, Compute Platform, Runtime Architecture, Application Software, Embedded SW System Engineer, Data Stream Characteristics. Latency jitter affects ... Why do system-level failures still occur despite fault tolerance techniques being deployed in systems? Embedded software systems as a major source of ...

  11. Interfaces and Integration of Medical Image Analysis Frameworks: Challenges and Opportunities.

    PubMed

    Covington, Kelsie; McCreedy, Evan S; Chen, Min; Carass, Aaron; Aucoin, Nicole; Landman, Bennett A

    2010-05-25

    Clinical research with medical imaging typically involves large-scale data analysis with interdependent software toolsets tied together in a processing workflow. Numerous, complementary platforms are available, but these are not readily compatible in terms of workflows or data formats. Both image scientists and clinical investigators could benefit from using the framework which is the most natural fit to the specific problem at hand, but pragmatic choices often dictate that a compromise platform is used for collaboration. Manual merging of platforms through carefully tuned scripts has been effective, but is exceptionally time consuming and is not feasible for large-scale integration efforts. Hence, the benefits of innovation are constrained by platform dependence. Removing this constraint via integration of algorithms from one framework into another is the focus of this work. We propose and demonstrate a light-weight interface system to expose parameters across platforms and provide seamless integration. In this initial effort, we focus on four platforms: Medical Image Analysis and Visualization (MIPAV), the Java Image Science Toolkit (JIST), command line tools, and 3D Slicer. We explore three case studies: (1) providing a system for MIPAV to expose internal algorithms and utilize these algorithms within JIST, (2) exposing JIST modules through a self-documenting command line interface for inclusion in scripting environments, and (3) detecting and using JIST modules in 3D Slicer. We review the challenges and opportunities for light-weight software integration both within a development language (e.g., Java in MIPAV and JIST) and across languages (e.g., C/C++ in 3D Slicer and shell in command line tools).
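
    A minimal Python sketch of the light-weight interface idea: an algorithm's parameters are described once in a neutral form, and a self-documenting command-line front end is generated from that description. The parameter names and the stand-in algorithm are invented for illustration; the actual MIPAV/JIST interfaces are richer and language-specific.

      # Neutral parameter description that a GUI, a scripting wrapper, or another
      # framework's module loader could all consume.
      import argparse

      PARAMS = [
          {"name": "input",      "type": str,   "help": "path to the input image"},
          {"name": "sigma",      "type": float, "help": "smoothing kernel width", "default": 1.0},
          {"name": "iterations", "type": int,   "help": "number of passes",       "default": 3},
      ]

      def smooth(input, sigma, iterations):
          """Stand-in for the wrapped algorithm."""
          print(f"smoothing {input}: sigma={sigma}, iterations={iterations}")

      def build_cli():
          parser = argparse.ArgumentParser(description=smooth.__doc__)
          for p in PARAMS:
              flag = "--" + p["name"]
              if "default" in p:
                  parser.add_argument(flag, type=p["type"], default=p["default"], help=p["help"])
              else:
                  parser.add_argument(flag, type=p["type"], required=True, help=p["help"])
          return parser

      if __name__ == "__main__":
          args = build_cli().parse_args(["--input", "brain.nii"])
          smooth(args.input, args.sigma, args.iterations)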

  12. An open source platform for multi-scale spatially distributed simulations of microbial ecosystems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Segre, Daniel

    2014-08-14

    The goal of this project was to develop a tool for facilitating simulation, validation and discovery of multiscale dynamical processes in microbial ecosystems. This led to the development of an open-source software platform for Computation Of Microbial Ecosystems in Time and Space (COMETS). COMETS performs spatially distributed time-dependent flux balance based simulations of microbial metabolism. Our plan involved building the software platform itself, calibrating and testing it through comparison with experimental data, and integrating simulations and experiments to address important open questions on the evolution and dynamics of cross-feeding interactions between microbial species.
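
    A toy Python sketch of the kind of spatially distributed simulation described above: biomass on a 2D lattice grows where nutrient is available and spreads to neighbouring cells. The Monod-like growth rule is only a placeholder; COMETS itself obtains growth and exchange rates from dynamic flux balance analysis of genome-scale metabolic models.

      # Toy grid simulation: growth limited by local nutrient, plus diffusion.
      import numpy as np

      GRID = 20
      biomass  = np.zeros((GRID, GRID)); biomass[GRID // 2, GRID // 2] = 1e-6
      nutrient = np.full((GRID, GRID), 5.0)

      def diffuse(field, rate):
          """Simple 4-neighbour diffusion step with reflecting borders."""
          padded = np.pad(field, 1, mode="edge")
          laplacian = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                       padded[1:-1, :-2] + padded[1:-1, 2:] - 4 * field)
          return field + rate * laplacian

      for step in range(200):
          growth = 0.5 * biomass * nutrient / (nutrient + 1.0)   # placeholder rule
          growth = np.minimum(growth, nutrient)                   # cannot exceed supply
          biomass += growth
          nutrient -= growth
          biomass = diffuse(biomass, 0.1)
          nutrient = diffuse(nutrient, 0.05)

      print(f"total biomass after 200 steps: {biomass.sum():.3e}")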

  13. The Mobile Aircraft Maintenance Office Concept from a Wide Area Perspective

    DTIC Science & Technology

    2003-03-01

    significant improvements in wireless network data rates, and enhanced mobile application platforms offers an opportunity to effectively integrate m...hardware, and mobile application platforms housing the necessary middleware software comprise the mobile landscape. The m-business network...devices. Lastly, an investigation into mobile application platforms will reveal the middleware functionality required to successfully extend suitable e

  14. Geneious Basic: An integrated and extendable desktop software platform for the organization and analysis of sequence data

    PubMed Central

    Kearse, Matthew; Moir, Richard; Wilson, Amy; Stones-Havas, Steven; Cheung, Matthew; Sturrock, Shane; Buxton, Simon; Cooper, Alex; Markowitz, Sidney; Duran, Chris; Thierer, Tobias; Ashton, Bruce; Meintjes, Peter; Drummond, Alexei

    2012-01-01

    Summary: The two main functions of bioinformatics are the organization and analysis of biological data using computational resources. Geneious Basic has been designed to be an easy-to-use and flexible desktop software application framework for the organization and analysis of biological data, with a focus on molecular sequences and related data types. It integrates numerous industry-standard discovery analysis tools, with interactive visualizations to generate publication-ready images. One key contribution to researchers in the life sciences is the Geneious public application programming interface (API) that affords the ability to leverage the existing framework of the Geneious Basic software platform for virtually unlimited extension and customization. The result is an increase in the speed and quality of development of computation tools for the life sciences, due to the functionality and graphical user interface available to the developer through the public API. Geneious Basic represents an ideal platform for the bioinformatics community to leverage existing components and to integrate their own specific requirements for the discovery, analysis and visualization of biological data. Availability and implementation: Binaries and public API freely available for download at http://www.geneious.com/basic, implemented in Java and supported on Linux, Apple OSX and MS Windows. The software is also available from the Bio-Linux package repository at http://nebc.nerc.ac.uk/news/geneiousonbl. Contact: peter@biomatters.com PMID:22543367

  15. Geneious Basic: an integrated and extendable desktop software platform for the organization and analysis of sequence data.

    PubMed

    Kearse, Matthew; Moir, Richard; Wilson, Amy; Stones-Havas, Steven; Cheung, Matthew; Sturrock, Shane; Buxton, Simon; Cooper, Alex; Markowitz, Sidney; Duran, Chris; Thierer, Tobias; Ashton, Bruce; Meintjes, Peter; Drummond, Alexei

    2012-06-15

    The two main functions of bioinformatics are the organization and analysis of biological data using computational resources. Geneious Basic has been designed to be an easy-to-use and flexible desktop software application framework for the organization and analysis of biological data, with a focus on molecular sequences and related data types. It integrates numerous industry-standard discovery analysis tools, with interactive visualizations to generate publication-ready images. One key contribution to researchers in the life sciences is the Geneious public application programming interface (API) that affords the ability to leverage the existing framework of the Geneious Basic software platform for virtually unlimited extension and customization. The result is an increase in the speed and quality of development of computation tools for the life sciences, due to the functionality and graphical user interface available to the developer through the public API. Geneious Basic represents an ideal platform for the bioinformatics community to leverage existing components and to integrate their own specific requirements for the discovery, analysis and visualization of biological data. Binaries and public API freely available for download at http://www.geneious.com/basic, implemented in Java and supported on Linux, Apple OSX and MS Windows. The software is also available from the Bio-Linux package repository at http://nebc.nerc.ac.uk/news/geneiousonbl.

  16. Integrated Modular Avionics for Spacecraft: Earth Observation Use Case Demonstrator

    NASA Astrophysics Data System (ADS)

    Deredempt, Marie-Helene; Rossignol, Alain; Hyounet, Philippe

    2013-08-01

    Integrated Modular Avionics (IMA) for Space, a European Space Agency initiative, aims to make the time and space partitioning concepts, and particularly the ARINC 653 standard [1][2], applicable to the space domain. Expected benefits of such an approach are development flexibility, the capability to provide differential V&V for functionalities of different criticality levels, and the ability to integrate late or in-orbit deliveries. This development flexibility could improve software subcontracting, industrial organization and software reuse. The time and space partitioning technique facilitates the integration of software functions as black boxes and the integration of decentralized functions, such as a star tracker, into the On Board Computer, saving mass and power by limiting electronics resources. In the aeronautical domain, the Integrated Modular Avionics architecture is based on a network of LRUs (Line Replaceable Units) interconnected by AFDX (Avionics Full DupleX). The time and space partitioning concept is applied to each LRU and provides independent partitions which communicate with each other using ARINC 653 communication ports. Using End Systems (LRU components), intercommunication between LRUs is managed in the same way as intercommunication between partitions within an LRU. In such an architecture, an application developed using only communication ports can be integrated in one LRU or another without impacting the global architecture. In the space domain, a redundant On Board Computer monitors (ground monitoring, TM) and manages (ground command, TC) the platform in terms of power, solar array deployment, attitude, orbit, thermal control, maintenance, and failure detection, isolation and recovery. In addition, payload units and platform units such as the RIU, PCDU and AOCS units (star trackers, reaction wheels) are considered in this architecture. Interfaces are mainly realized through MIL-STD-1553B buses and SpaceWire, and this can be considered the main constraint for IMA implementation in the space domain. During the first phase of the IMA SP project, the impact of ARINC 653 was analyzed, requirements and an architecture for the space domain were defined [3][4], and System Executive platforms (based on XtratuM, PikeOS, and AIR) were developed with RTEMS as the guest OS. This paper focuses on the demonstrator developed by Astrium as part of the IMA SP project. The demonstrator has the objective of confirming the feasibility of operational software partitioning on top of the XtratuM System Executive Platform with acceptable CPU overhead.

  17. Analysis of lipid experiments (ALEX): a software framework for analysis of high-resolution shotgun lipidomics data.

    PubMed

    Husen, Peter; Tarasov, Kirill; Katafiasz, Maciej; Sokol, Elena; Vogt, Johannes; Baumgart, Jan; Nitsch, Robert; Ekroos, Kim; Ejsing, Christer S

    2013-01-01

    Global lipidomics analysis across large sample sizes produces high-content datasets that require dedicated software tools supporting lipid identification and quantification, efficient data management and lipidome visualization. Here we present a novel software-based platform for streamlined data processing, management and visualization of shotgun lipidomics data acquired using high-resolution Orbitrap mass spectrometry. The platform features the ALEX framework designed for automated identification and export of lipid species intensity directly from proprietary mass spectral data files, and an auxiliary workflow using database exploration tools for integration of sample information, computation of lipid abundance and lipidome visualization. A key feature of the platform is the organization of lipidomics data in "database table format" which provides the user with an unsurpassed flexibility for rapid lipidome navigation using selected features within the dataset. To demonstrate the efficacy of the platform, we present a comparative neurolipidomics study of cerebellum, hippocampus and somatosensory barrel cortex (S1BF) from wild-type and knockout mice devoid of the putative lipid phosphate phosphatase PRG-1 (plasticity related gene-1). The presented framework is generic, extendable to processing and integration of other lipidomic data structures, can be interfaced with post-processing protocols supporting statistical testing and multivariate analysis, and can serve as an avenue for disseminating lipidomics data within the scientific community. The ALEX software is available at www.msLipidomics.info.
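
    A small Python illustration of the "database table format" idea: lipid intensities kept as one long table so that arbitrary slices (per lipid class, tissue or genotype) fall out of simple filtering and pivoting. The column names and values below are invented for the example and are not data from the study.

      # Long-format lipidomics table and one example pivot for lipidome navigation.
      import pandas as pd

      records = pd.DataFrame({
          "sample":    ["wt_cereb", "wt_cereb", "ko_cereb", "ko_cereb"],
          "genotype":  ["wt", "wt", "ko", "ko"],
          "tissue":    ["cerebellum"] * 4,
          "lipid":     ["PC 34:1", "PE 38:4", "PC 34:1", "PE 38:4"],
          "class":     ["PC", "PE", "PC", "PE"],
          "intensity": [1.8e6, 9.2e5, 1.6e6, 1.3e6],
      })

      # Navigate the lipidome by pivoting on any feature of interest.
      per_class = records.pivot_table(index="class", columns="genotype",
                                      values="intensity", aggfunc="sum")
      print(per_class)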

  18. cFE/CFS (Core Flight Executive/Core Flight System)

    NASA Technical Reports Server (NTRS)

    Wildermann, Charles P.

    2008-01-01

    This viewgraph presentation describes in detail the requirements and goals of the Core Flight Executive (cFE) and the Core Flight System (CFS). The Core Flight Software System is a mission-independent, platform-independent Flight Software (FSW) environment integrating a reusable core flight executive (cFE). The CFS goals include: 1) Reduce time to deploy high quality flight software; 2) Reduce project schedule and cost uncertainty; 3) Directly facilitate formalized software reuse; 4) Enable collaboration across organizations; 5) Simplify sustaining engineering (aka FSW maintenance); 6) Scale from small instruments to System of Systems; 7) Provide a platform for advanced concepts and prototyping; and 8) Provide common standards and tools across the branch and NASA-wide.

  19. The SCEC Broadband Platform: A Collaborative Open-Source Software Package for Strong Ground Motion Simulation and Validation

    NASA Astrophysics Data System (ADS)

    Silva, F.; Maechling, P. J.; Goulet, C. A.; Somerville, P.; Jordan, T. H.

    2014-12-01

    The Southern California Earthquake Center (SCEC) Broadband Platform is a collaborative software development project involving geoscientists, earthquake engineers, graduate students, and the SCEC Community Modeling Environment. The SCEC Broadband Platform (BBP) is open-source scientific software that can generate broadband (0-100Hz) ground motions for earthquakes, integrating complex scientific modules that implement rupture generation, low and high-frequency seismogram synthesis, non-linear site effects calculation, and visualization into a software system that supports easy on-demand computation of seismograms. The Broadband Platform operates in two primary modes: validation simulations and scenario simulations. In validation mode, the Platform runs earthquake rupture and wave propagation modeling software to calculate seismograms for a well-observed historical earthquake. Then, the BBP calculates a number of goodness of fit measurements that quantify how well the model-based broadband seismograms match the observed seismograms for a certain event. Based on these results, the Platform can be used to tune and validate different numerical modeling techniques. In scenario mode, the Broadband Platform can run simulations for hypothetical (scenario) earthquakes. In this mode, users input an earthquake description, a list of station names and locations, and a 1D velocity model for their region of interest, and the Broadband Platform software then calculates ground motions for the specified stations. Working in close collaboration with scientists and research engineers, the SCEC software development group continues to add new capabilities to the Broadband Platform and to release new versions as open-source scientific software distributions that can be compiled and run on many Linux computer systems. Our latest release includes 5 simulation methods, 7 simulation regions covering California, Japan, and Eastern North America, the ability to compare simulation results against GMPEs, and several new data products, such as map and distance-based goodness of fit plots. As the number and complexity of scenarios simulated using the Broadband Platform increases, we have added batching utilities to substantially improve support for running large-scale simulations on computing clusters.
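
    A hedged Python sketch of the validation-mode idea: scoring how well a synthetic seismogram matches an observed one. The metric below (mean log amplitude ratio over a frequency band) is a simplified stand-in for the Broadband Platform's goodness-of-fit modules, and both traces are synthetic toys rather than real records.

      # Simplified goodness-of-fit: mean log10 ratio of Fourier amplitude spectra.
      import numpy as np

      def log_amplitude_bias(observed, simulated, dt, fmin=0.1, fmax=10.0):
          """Mean log10 ratio of amplitude spectra within [fmin, fmax] Hz."""
          freqs = np.fft.rfftfreq(len(observed), dt)
          band = (freqs >= fmin) & (freqs <= fmax)
          obs_amp = np.abs(np.fft.rfft(observed))[band]
          sim_amp = np.abs(np.fft.rfft(simulated))[band]
          return float(np.mean(np.log10(sim_amp / obs_amp)))

      dt = 0.01
      t = np.arange(0, 40, dt)
      observed  = np.exp(-0.1 * t) * np.sin(2 * np.pi * 1.0 * t)
      simulated = 0.8 * np.exp(-0.1 * t) * np.sin(2 * np.pi * 1.0 * t + 0.3)

      bias = log_amplitude_bias(observed, simulated, dt)
      print(f"log10 amplitude bias: {bias:+.3f} (0 means no average bias)")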

  20. Auto-Generated Semantic Processing Services

    NASA Technical Reports Server (NTRS)

    Davis, Rodney; Hupf, Greg

    2009-01-01

    Auto-Generated Semantic Processing (AGSP) Services is a suite of software tools for automated generation of other computer programs, denoted cross-platform semantic adapters, that support interoperability of computer-based communication systems that utilize a variety of both new and legacy communication software running in a variety of operating- system/computer-hardware combinations. AGSP has numerous potential uses in military, space-exploration, and other government applications as well as in commercial telecommunications. The cross-platform semantic adapters take advantage of common features of computer- based communication systems to enforce semantics, messaging protocols, and standards of processing of streams of binary data to ensure integrity of data and consistency of meaning among interoperating systems. The auto-generation aspect of AGSP Services reduces development time and effort by emphasizing specification and minimizing implementation: In effect, the design, building, and debugging of software for effecting conversions among complex communication protocols, custom device mappings, and unique data-manipulation algorithms is replaced with metadata specifications that map to an abstract platform-independent communications model. AGSP Services is modular and has been shown to be easily integrable into new and legacy NASA flight and ground communication systems.
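
    A minimal Python illustration of the specification-over-implementation idea: a platform-independent field description is turned into pack/unpack adapters for a binary message instead of hand-writing the conversion code. The message layout is invented for the example and is not an actual AGSP specification.

      # Generate pack/unpack adapters from a declarative field specification.
      import struct

      SPEC = [                       # (field name, struct format code)
          ("message_id", "H"),       # unsigned 16-bit
          ("timestamp",  "d"),       # double
          ("voltage",    "f"),       # float
          ("status",     "B"),       # unsigned 8-bit
      ]
      FORMAT = ">" + "".join(code for _, code in SPEC)    # big-endian wire order

      def generate_adapter(spec):
          """Return (pack, unpack) functions derived from the metadata alone."""
          names = [name for name, _ in spec]

          def pack(message):
              return struct.pack(FORMAT, *(message[n] for n in names))

          def unpack(payload):
              return dict(zip(names, struct.unpack(FORMAT, payload)))

          return pack, unpack

      pack, unpack = generate_adapter(SPEC)
      wire = pack({"message_id": 7, "timestamp": 1.5e9, "voltage": 3.3, "status": 1})
      print(unpack(wire))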

  1. ARTIE: An Integrated Environment for the Development of Affective Robot Tutors

    PubMed Central

    Imbernón Cuadrado, Luis-Eduardo; Manjarrés Riesco, Ángeles; De La Paz López, Félix

    2016-01-01

    Over the last decade robotics has attracted a great deal of interest from teachers and researchers as a valuable educational tool from preschool to high school levels. The implementation of social-support behaviors in robot tutors, in particular in the emotional dimension, can make a significant contribution to learning efficiency. With the aim of contributing to the rising field of affective robot tutors we have developed ARTIE (Affective Robot Tutor Integrated Environment). We offer an architectural pattern which integrates any given educational software for primary school children with a component whose function is to identify the emotional state of the students who are interacting with the software, and with the driver of a robot tutor which provides personalized emotional pedagogical support to the students. In order to support the development of affective robot tutors according to the proposed architecture, we also provide a methodology which incorporates a technique for eliciting pedagogical knowledge from teachers, and a generic development platform. This platform contains a component for identifying emotional states by analysing keyboard and mouse interaction data, and a generic affective pedagogical support component which specifies the affective educational interventions (including facial expressions, body language, tone of voice,…) in terms of BML (a Behavior Model Language for virtual agent specification) files which are translated into actions of a robot tutor. The platform and the methodology are both adapted to primary school students. Finally, we illustrate the use of this platform to build a prototype implementation of the architecture, in which the educational software is instantiated with Scratch and the robot tutor with NAO. We also report on a user experiment we carried out to orient the development of the platform and of the prototype. We conclude from our work that, in the case of primary school students, it is possible to identify, without using intrusive and expensive identification methods, the emotions which most affect the character of educational interventions. Our work also demonstrates the feasibility of a general-purpose architecture of decoupled components, in which a wide range of educational software and robot tutors can be integrated and then used according to different educational criteria. PMID:27536230

  2. ARTIE: An Integrated Environment for the Development of Affective Robot Tutors.

    PubMed

    Imbernón Cuadrado, Luis-Eduardo; Manjarrés Riesco, Ángeles; De La Paz López, Félix

    2016-01-01

    Over the last decade robotics has attracted a great deal of interest from teachers and researchers as a valuable educational tool from preschool to high school levels. The implementation of social-support behaviors in robot tutors, in particular in the emotional dimension, can make a significant contribution to learning efficiency. With the aim of contributing to the rising field of affective robot tutors we have developed ARTIE (Affective Robot Tutor Integrated Environment). We offer an architectural pattern which integrates any given educational software for primary school children with a component whose function is to identify the emotional state of the students who are interacting with the software, and with the driver of a robot tutor which provides personalized emotional pedagogical support to the students. In order to support the development of affective robot tutors according to the proposed architecture, we also provide a methodology which incorporates a technique for eliciting pedagogical knowledge from teachers, and a generic development platform. This platform contains a component for identifying emotional states by analysing keyboard and mouse interaction data, and a generic affective pedagogical support component which specifies the affective educational interventions (including facial expressions, body language, tone of voice,…) in terms of BML (a Behavior Model Language for virtual agent specification) files which are translated into actions of a robot tutor. The platform and the methodology are both adapted to primary school students. Finally, we illustrate the use of this platform to build a prototype implementation of the architecture, in which the educational software is instantiated with Scratch and the robot tutor with NAO. We also report on a user experiment we carried out to orient the development of the platform and of the prototype. We conclude from our work that, in the case of primary school students, it is possible to identify, without using intrusive and expensive identification methods, the emotions which most affect the character of educational interventions. Our work also demonstrates the feasibility of a general-purpose architecture of decoupled components, in which a wide range of educational software and robot tutors can be integrated and then used according to different educational criteria.
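
    A hedged Python sketch of the decision step described above: interaction features derived from keyboard and mouse data are mapped to a coarse emotional state, which then selects an affective intervention for the robot tutor. The feature names, thresholds and BML-like snippets are illustrative assumptions, not ARTIE's actual classifier or behaviour repertoire.

      # Map interaction features to an emotional state, then pick an intervention.
      from dataclasses import dataclass

      @dataclass
      class InteractionFeatures:
          error_rate: float        # fraction of wrong answers in recent exercises
          idle_seconds: float      # time without keyboard/mouse activity
          clicks_per_minute: float

      def estimate_state(f):
          if f.error_rate > 0.5 and f.clicks_per_minute > 40:
              return "frustrated"
          if f.idle_seconds > 60:
              return "distracted"
          return "engaged"

      INTERVENTIONS = {    # BML-like stubs a robot driver could turn into actions
          "frustrated": '<bml><speech>Let us try an easier example.</speech><gesture lexeme="ENCOURAGE"/></bml>',
          "distracted": '<bml><speech>Are you still there?</speech><gaze target="student"/></bml>',
          "engaged":    '<bml><speech>Great, keep going!</speech></bml>',
      }

      state = estimate_state(InteractionFeatures(error_rate=0.6, idle_seconds=5,
                                                 clicks_per_minute=55))
      print(state, "->", INTERVENTIONS[state])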

  3. Final Report. Center for Scalable Application Development Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mellor-Crummey, John

    2014-10-26

    The Center for Scalable Application Development Software (CScADS) was established as a partnership between Rice University, Argonne National Laboratory, University of California Berkeley, University of Tennessee – Knoxville, and University of Wisconsin – Madison. CScADS pursued an integrated set of activities with the aim of increasing the productivity of DOE computational scientists by catalyzing the development of systems software, libraries, compilers, and tools for leadership computing platforms. Principal Center activities were workshops to engage the research community in the challenges of leadership computing, research and development of open-source software, and work with computational scientists to help them develop codes for leadership computing platforms. This final report summarizes CScADS activities at Rice University in these areas.

  4. Integrating SAP to Information Systems Curriculum: Design and Delivery

    ERIC Educational Resources Information Center

    Wang, Ming

    2011-01-01

    Information Systems (IS) education is being transformed from the segmented applications toward the integrated enterprise-wide system software Enterprise Resource Planning (ERP). ERP is a platform that integrates all business functions with its centralized data repository shared by all the business operations in the enterprise. This tremendous…

  5. Design and control of multifunctional sorting and training platform based on PLC control

    NASA Astrophysics Data System (ADS)

    Wan, Hongqiang; Ge, Shuai; Han, Peiying; Li, Fancong; Zhang, Simiao

    2018-05-01

    Electromechanical integration, as a multi-disciplinary subject, has received much attention from universities and is widely used in enterprises' automated production. Aiming at the lack of control practice in enterprises and the lack of training facilities in colleges and universities, this paper presents the design of a multifunctional sorting training platform based on PLC control. First, the structure of the platform is determined and three-dimensional modeling is carried out. Then the platform's pneumatic control and electrical control are designed. Finally, the sorting function of the platform is realized through PLC programming and configuration software development. The training platform can be used to design practical training experiments and is highly relevant to electromechanical integration teaching. At the same time, the platform makes full use of modular design, so that the sorting modules are more flexible. Compared with the traditional training platform, its teaching effect is more significant.

  6. Integrative structure modeling with the Integrative Modeling Platform.

    PubMed

    Webb, Benjamin; Viswanath, Shruthi; Bonomi, Massimiliano; Pellarin, Riccardo; Greenberg, Charles H; Saltzberg, Daniel; Sali, Andrej

    2018-01-01

    Building models of a biological system that are consistent with the myriad data available is one of the key challenges in biology. Modeling the structure and dynamics of macromolecular assemblies, for example, can give insights into how biological systems work, evolved, might be controlled, and even designed. Integrative structure modeling casts the building of structural models as a computational optimization problem, for which information about the assembly is encoded into a scoring function that evaluates candidate models. Here, we describe our open source software suite for integrative structure modeling, Integrative Modeling Platform (https://integrativemodeling.org), and demonstrate its use. © 2017 The Protein Society.
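
    A toy Python illustration of the "scoring function over candidate models" idea: three points in one dimension are fitted against pairwise-distance restraints by random-restart hill climbing. This is only a conceptual sketch, not the Integrative Modeling Platform's API; IMP provides real representations, restraints and samplers for macromolecular assemblies (see https://integrativemodeling.org).

      # Optimize a tiny "model" against distance restraints by hill climbing.
      import random

      RESTRAINTS = [((0, 1), 10.0), ((1, 2), 10.0), ((0, 2), 20.0)]   # (pair, target)

      def score(positions):
          """Sum of squared restraint violations (lower is better)."""
          return sum((abs(positions[i] - positions[j]) - d) ** 2
                     for (i, j), d in RESTRAINTS)

      def optimize(steps=20000, move=0.5):
          random.seed(0)
          positions = [random.uniform(-30, 30) for _ in range(3)]
          best = score(positions)
          for _ in range(steps):
              i = random.randrange(3)
              trial = positions[:]
              trial[i] += random.uniform(-move, move)
              s = score(trial)
              if s < best:
                  positions, best = trial, s
          return positions, best

      model, final_score = optimize()
      print("positions:", [round(x, 2) for x in model], "score:", round(final_score, 4))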

  7. LK Scripting Language

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    The LK scripting language is a simple and fast computer programming language designed for easy integration with existing software to enable automation of tasks. The LK language is used by NREL's System Advisor Model (SAM), the SAM Software Development Kit (SDK), and SolTrace products. LK is easily extensible and adaptable to new software due to its small footprint and is designed to be statically linked into other software. It is written in standard C++, is cross-platform (Windows, Linux, and OSX), and includes optional portions that enable direct integration with graphical user interfaces written in the open source C++ wxWidgets Version 3.0+ toolkit.

  8. Parametric Modelling of As-Built Beam Framed Structure in Bim Environment

    NASA Astrophysics Data System (ADS)

    Yang, X.; Koehl, M.; Grussenmeyer, P.

    2017-02-01

    Complete documentation and conservation of a historic timber roof requires the integration of geometry modelling, attribute and dynamic information management, and the results of structural analysis. The recently developed as-built Building Information Modelling (BIM) technique has the potential to provide a uniform platform, offering the possibility to integrate traditional geometry modelling, parametric element management and structural analysis. The main objective of the project presented in this paper is to develop a parametric modelling tool for a timber roof structure whose elements form a leaning and crossing beam frame. Since Autodesk Revit, as a typical BIM software package, provides the platform for parametric modelling and information management, an API plugin was developed that automatically creates the parametric beam elements and links them together with strict relationships. The plugin under development is introduced in the paper; it can obtain the parametric beam model via the Autodesk Revit API from total station points and terrestrial laser scanning data. The results show the potential of automating parametric modelling through interactive API development in a BIM environment. The approach also integrates the separate data processing steps and different platforms into the uniform Revit software.

  9. Educational process in modern climatology within the web-GIS platform "Climate"

    NASA Astrophysics Data System (ADS)

    Gordova, Yulia; Gorbatenko, Valentina; Gordov, Evgeny; Martynova, Yulia; Okladnikov, Igor; Titov, Alexander; Shulgina, Tamara

    2013-04-01

    These days, the problem of training scientists, common to all scientific fields, is exacerbated in the environmental sciences by the need to develop new computational and information technology skills in distributed multi-disciplinary teams. To address this and other pressing problems of the Earth system sciences, a software infrastructure for the information support of integrated research in the geosciences was created on the basis of modern information and computational technologies, and a software and hardware platform "Climate" (http://climate.scert.ru/) was developed. In addition to the direct analysis of geophysical data archives, the platform is aimed at teaching the basics of the study of changes in regional climate. The educational component of the platform includes a series of lectures on climate, environmental and meteorological modeling, and laboratory work cycles on the basics of analysis of current and potential future regional climate change, using the territory of Siberia as an example. The educational process within the platform is implemented using the distance learning system Moodle (www.moodle.org). This work is partially supported by the Ministry of Education and Science of the Russian Federation (contract #8345), SB RAS project VIII.80.2.1, RFBR grant #11-05-01190a, and integrated project SB RAS #131.

  10. Atomdroid: a computational chemistry tool for mobile platforms.

    PubMed

    Feldt, Jonas; Mata, Ricardo A; Dieterich, Johannes M

    2012-04-23

    We present the implementation of a new molecular mechanics program designed for use in mobile platforms, the first specifically built for these devices. The software is designed to run on Android operating systems and is compatible with several modern tablet-PCs and smartphones available in the market. It includes molecular viewer/builder capabilities with integrated routines for geometry optimizations and Monte Carlo simulations. These functionalities allow it to work as a stand-alone tool. We discuss some particular development aspects, as well as the overall feasibility of using computational chemistry software packages in mobile platforms. Benchmark calculations show that through efficient implementation techniques even hand-held devices can be used to simulate midsized systems using force fields.

  11. A user-friendly LabVIEW software platform for grating based X-ray phase-contrast imaging.

    PubMed

    Wang, Shenghao; Han, Huajie; Gao, Kun; Wang, Zhili; Zhang, Can; Yang, Meng; Wu, Zhao; Wu, Ziyu

    2015-01-01

    X-ray phase-contrast imaging can provide greatly improved contrast over conventional absorption-based imaging for weakly absorbing samples, such as biological soft tissues and fibre composites. In this study, we introduce an easy and fast way to develop a user-friendly software platform dedicated to the new grating-based X-ray phase-contrast imaging setup at the National Synchrotron Radiation Laboratory of the University of Science and Technology of China. The control of 21 motorized stages, of a piezoelectric stage and of an X-ray tube is achieved with this software; it also covers image acquisition with a flat panel detector for automatic phase-stepping scans. Moreover, a data post-processing module for signal retrieval and other custom features is in principle available. With a seamless integration of all the necessary functions in one software package, this platform greatly facilitates users' activities during experimental runs with this grating-based X-ray phase-contrast imaging setup.
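
    A short Python sketch of the signal-retrieval step that follows a phase-stepping scan: per-pixel Fourier analysis of the stepping curve yields the mean intensity, the fringe visibility and the phase of the first harmonic, from which absorption, dark-field and differential phase images are formed. The synthetic stepping data below stand in for detector frames; this is not the laboratory's own LabVIEW code.

      # Fourier retrieval from synthetic phase-stepping curves.
      import numpy as np

      steps, height, width = 8, 4, 4
      k = np.arange(steps)

      # Synthetic stepping curves I(k) = a0 + a1*cos(2*pi*k/steps + phi) per pixel.
      a0  = np.full((height, width), 1000.0)
      a1  = np.full((height, width), 300.0)
      phi = np.linspace(-0.5, 0.5, height * width).reshape(height, width)
      frames = a0 + a1 * np.cos(2 * np.pi * k[:, None, None] / steps + phi)

      def retrieve(sample_frames):
          """Return (mean intensity, visibility, phase of first harmonic)."""
          spectrum = np.fft.fft(sample_frames, axis=0)
          mean = np.abs(spectrum[0]) / steps
          visibility = 2 * np.abs(spectrum[1]) / np.abs(spectrum[0])
          phase = np.angle(spectrum[1])
          return mean, visibility, phase

      mean, visibility, phase = retrieve(frames)
      print("mean:", mean[0, 0], "visibility:", round(visibility[0, 0], 3),
            "phase:", round(phase[0, 0], 3))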

  12. Distributed software framework and continuous integration in hydroinformatics systems

    NASA Astrophysics Data System (ADS)

    Zhou, Jianzhong; Zhang, Wei; Xie, Mengfei; Lu, Chengwei; Chen, Xiao

    2017-08-01

    When encountering multiple and complicated models, multi-source structured and unstructured data, and complex requirements analysis, the platform design and integration of hydroinformatics systems become a challenge. To properly solve these problems, we describe a distributed software framework and its continuous integration process in hydroinformatics systems. This distributed framework mainly consists of a server cluster for models, a distributed database, GIS (Geographic Information System) servers, a master node and clients. Based on it, a GIS-based decision support system for the joint regulation of water quantity and water quality of a group of lakes in Wuhan, China, was established.

  13. Study on the E-commerce platform based on the agent

    NASA Astrophysics Data System (ADS)

    Fu, Ruixue; Qin, Lishuan; Gao, Yinmin

    2011-10-01

    To solve the problem of dynamic integration in e-commerce, a multi-agent architecture for an electronic commerce platform system based on agents and ontology is introduced, which includes three major types of agents, an ontology and a rule collection. In this architecture, service agents and rules are used to realize business process reengineering, the reuse of software components, and the agility of the electronic commerce platform. To illustrate the architecture, a simulation has been carried out, and the results imply that the architecture provides a very efficient method to design and implement a flexible, distributed, open and intelligent electronic commerce platform system that solves the problem of dynamic integration in e-commerce. The objective of this paper is to present the architecture of the electronic commerce platform system and the approach by which agents and ontology support it.

  14. The SCEC Broadband Platform: A Collaborative Open-Source Software Package for Strong Ground Motion Simulation and Validation

    NASA Astrophysics Data System (ADS)

    Silva, F.; Maechling, P. J.; Goulet, C.; Somerville, P.; Jordan, T. H.

    2013-12-01

    The Southern California Earthquake Center (SCEC) Broadband Platform is a collaborative software development project involving SCEC researchers, graduate students, and the SCEC Community Modeling Environment. The SCEC Broadband Platform is open-source scientific software that can generate broadband (0-100Hz) ground motions for earthquakes, integrating complex scientific modules that implement rupture generation, low and high-frequency seismogram synthesis, non-linear site effects calculation, and visualization into a software system that supports easy on-demand computation of seismograms. The Broadband Platform operates in two primary modes: validation simulations and scenario simulations. In validation mode, the Broadband Platform runs earthquake rupture and wave propagation modeling software to calculate seismograms of a historical earthquake for which observed strong ground motion data is available. Also in validation mode, the Broadband Platform calculates a number of goodness of fit measurements that quantify how well the model-based broadband seismograms match the observed seismograms for a certain event. Based on these results, the Platform can be used to tune and validate different numerical modeling techniques. During the past year, we have modified the software to enable the addition of a large number of historical events, and we are now adding validation simulation inputs and observational data for 23 historical events covering the Eastern and Western United States, Japan, Taiwan, Turkey, and Italy. In scenario mode, the Broadband Platform can run simulations for hypothetical (scenario) earthquakes. In this mode, users input an earthquake description, a list of station names and locations, and a 1D velocity model for their region of interest, and the Broadband Platform software then calculates ground motions for the specified stations. By establishing an interface between scientific modules with a common set of input and output files, the Broadband Platform facilitates the addition of new scientific methods, which are written by earth scientists in a number of languages such as C, C++, Fortran, and Python. The Broadband Platform's modular design also supports the reuse of existing software modules as building blocks to create new scientific methods. Additionally, the Platform implements a wrapper around each scientific module, converting input and output files to and from the specific formats required (or produced) by individual scientific codes. Working in close collaboration with scientists and research engineers, the SCEC software development group continues to add new capabilities to the Broadband Platform and to release new versions as open-source scientific software distributions that can be compiled and run on many Linux computer systems. Our latest release includes the addition of 3 new simulation methods and several new data products, such as map and distance-based goodness of fit plots. Finally, as the number and complexity of scenarios simulated using the Broadband Platform increase, we have added batching utilities to substantially improve support for running large-scale simulations on computing clusters.

  15. eSciMart: Web Platform for Scientific Software Marketplace

    NASA Astrophysics Data System (ADS)

    Kryukov, A. P.; Demichev, A. P.

    2016-10-01

    In this paper we suggest a design for a web marketplace in which both the users of scientific application software and databases, presented in the form of web services, and their providers have a simultaneous presence. The model on which the web marketplace is based is close to the customer-to-customer (C2C) model, which has been successfully used, for example, on auction sites such as eBay (ebay.com). Unlike the classical C2C model, the suggested marketplace focuses on application software in the form of web services, and on the standardization of the API through which application software is integrated into the web marketplace. A prototype of such a platform, entitled eSciMart, is currently being developed at SINP MSU.

  16. Team Oriented Robotic Exploration Task on Scorpion and K9 Platforms

    NASA Technical Reports Server (NTRS)

    Kirchner, Frank

    2003-01-01

    This final report describes the achievements made in the project over the complete period of performance. The technical progress highlights the different areas of work: mechatronics, sensor integration, software development, user interfaces, behavior development, and experimental results and system testing. The specific areas covered are: mechatronics; sensor integration; software development; experimental results and basic system testing; behavior development and advanced system testing; and user interface and wireless communication.

  17. A Software Defined Radio Based Airplane Communication Navigation Simulation System

    NASA Astrophysics Data System (ADS)

    He, L.; Zhong, H. T.; Song, D.

    2018-01-01

    Radio communication and navigation systems play an important role in ensuring the safety of civil airplanes in flight. Function and performance should be tested before these systems are installed on board. Conventionally, a separate set of transmitter and receiver equipment is needed for each system, so the equipment occupies a lot of space and is costly. In this paper, software defined radio technology is applied to design a common-hardware communication and navigation ground simulation system, which can host multiple airplane systems with different operating frequencies, such as HF, VHF, VOR, ILS, ADF, etc. We use a broadband analog front-end hardware platform, the universal software radio peripheral (USRP), to transmit and receive signals in different frequency bands. The software is developed in LabVIEW on a computer, which interfaces with the USRP through Ethernet and is responsible for communication and navigation signal processing and system control. An integrated testing system was established to perform functional tests and performance verification of the simulated signals, which demonstrates the feasibility of our design. The system is a low-cost, common-hardware platform for multiple airplane systems and provides a helpful reference for integrated avionics design.
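
    A small Python sketch of the kind of baseband test signal such a simulator streams to the USRP front end: an amplitude-modulated tone standing in for a VHF voice channel. The sample rate, tone frequency and modulation depth are chosen only for illustration; the actual system builds its waveforms in LabVIEW.

      # Generate a short AM baseband burst as complex samples.
      import numpy as np

      sample_rate = 200_000          # complex baseband samples per second
      duration = 0.05                # seconds
      t = np.arange(int(sample_rate * duration)) / sample_rate

      audio = np.sin(2 * np.pi * 1_000 * t)          # 1 kHz test tone
      depth = 0.8                                     # modulation depth
      am = 1 + depth * audio                          # AM envelope at baseband
      iq = am.astype(np.complex64)                    # real envelope, no carrier offset

      print(f"{len(iq)} samples, peak amplitude {np.max(np.abs(iq)):.2f}")
      # In the real setup these samples would be handed to the USRP driver for
      # up-conversion to the assigned VHF channel frequency.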

  18. A case study in open source innovation: developing the Tidepool Platform for interoperability in type 1 diabetes management.

    PubMed

    Neinstein, Aaron; Wong, Jenise; Look, Howard; Arbiter, Brandon; Quirk, Kent; McCanne, Steve; Sun, Yao; Blum, Michael; Adi, Saleh

    2016-03-01

    Develop a device-agnostic cloud platform to host diabetes device data and catalyze an ecosystem of software innovation for type 1 diabetes (T1D) management. An interdisciplinary team decided to establish a nonprofit company, Tidepool, and build open-source software. Through a user-centered design process, the authors created a software platform, the Tidepool Platform, to upload and host T1D device data in an integrated, device-agnostic fashion, as well as an application ("app"), Blip, to visualize the data. Tidepool's software utilizes the principles of modular components, modern web design including REST APIs and JavaScript, cloud computing, agile development methodology, and robust privacy and security. By consolidating the currently scattered and siloed T1D device data ecosystem into one open platform, Tidepool can improve access to the data and enable new possibilities and efficiencies in T1D clinical care and research. The Tidepool Platform decouples diabetes apps from diabetes devices, allowing software developers to build innovative apps without requiring them to design a unique back-end (e.g., database and security) or unique ways of ingesting device data. It allows people with T1D to choose to use any preferred app regardless of which device(s) they use. The authors believe that the Tidepool Platform can solve two current problems in the T1D device landscape: 1) limited access to T1D device data and 2) poor interoperability of data from different devices. If proven effective, Tidepool's open source, cloud model for health data interoperability is applicable to other healthcare use cases. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.

  19. A case study in open source innovation: developing the Tidepool Platform for interoperability in type 1 diabetes management

    PubMed Central

    Wong, Jenise; Look, Howard; Arbiter, Brandon; Quirk, Kent; McCanne, Steve; Sun, Yao; Blum, Michael; Adi, Saleh

    2016-01-01

    Objective Develop a device-agnostic cloud platform to host diabetes device data and catalyze an ecosystem of software innovation for type 1 diabetes (T1D) management. Materials and Methods An interdisciplinary team decided to establish a nonprofit company, Tidepool, and build open-source software. Results Through a user-centered design process, the authors created a software platform, the Tidepool Platform, to upload and host T1D device data in an integrated, device-agnostic fashion, as well as an application (“app”), Blip, to visualize the data. Tidepool’s software utilizes the principles of modular components, modern web design including REST APIs and JavaScript, cloud computing, agile development methodology, and robust privacy and security. Discussion By consolidating the currently scattered and siloed T1D device data ecosystem into one open platform, Tidepool can improve access to the data and enable new possibilities and efficiencies in T1D clinical care and research. The Tidepool Platform decouples diabetes apps from diabetes devices, allowing software developers to build innovative apps without requiring them to design a unique back-end (e.g., database and security) or unique ways of ingesting device data. It allows people with T1D to choose to use any preferred app regardless of which device(s) they use. Conclusion The authors believe that the Tidepool Platform can solve two current problems in the T1D device landscape: 1) limited access to T1D device data and 2) poor interoperability of data from different devices. If proven effective, Tidepool’s open source, cloud model for health data interoperability is applicable to other healthcare use cases. PMID:26338218

  20. STRS Radio Service Software for NASA's SCaN Testbed

    NASA Technical Reports Server (NTRS)

    Mortensen, Dale J.; Bishop, Daniel Wayne; Chelmins, David T.

    2012-01-01

    NASA's Space Communication and Navigation (SCaN) Testbed was launched to the International Space Station in 2012. The objective is to promote new software defined radio technologies and associated software application reuse, enabled by this first flight of NASA's Space Telecommunications Radio System (STRS) architecture standard. Pre-launch testing with the testbed's software defined radios was performed as part of system integration. Radio services for the JPL SDR were developed during system integration to allow the waveform application to operate properly in the space environment, especially considering thermal effects. These services include receiver gain control, frequency offset, IQ modulator balance, and transmit level control. Development, integration, and environmental testing of the radio services will be described. The added software allows the waveform application to operate properly in the space environment, and can be reused by future experimenters testing different waveform applications. Integrating such services with the platform-provided STRS operating environment will attract more users, and these services are candidates for interface standardization via STRS.

  1. STRS Radio Service Software for NASA's SCaN Testbed

    NASA Technical Reports Server (NTRS)

    Mortensen, Dale J.; Bishop, Daniel Wayne; Chelmins, David T.

    2013-01-01

    NASA's Space Communication and Navigation (SCaN) Testbed was launched to the International Space Station in 2012. The objective is to promote new software defined radio technologies and associated software application reuse, enabled by this first flight of NASA's Space Telecommunications Radio System (STRS) architecture standard. Pre-launch testing with the testbed's software defined radios was performed as part of system integration. Radio services for the JPL SDR were developed during system integration to allow the waveform application to operate properly in the space environment, especially considering thermal effects. These services include receiver gain control, frequency offset, IQ modulator balance, and transmit level control. Development, integration, and environmental testing of the radio services will be described. The added software allows the waveform application to operate properly in the space environment, and can be reused by future experimenters testing different waveform applications. Integrating such services with the platform-provided STRS operating environment will attract more users, and these services are candidates for interface standardization via STRS.
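
    One of the radio services named above, IQ modulator balance, can be illustrated with a short sketch. The Python/NumPy example below applies a first-order gain/phase correction to complex baseband samples under an assumed imbalance model; the model and correction values are illustrative, not the flight software's implementation.

        import numpy as np

        def correct_iq(samples, gain_ratio, phase_err_rad):
            """Undo an assumed imbalance model Q_meas = g * (Q*cos(phi) + I*sin(phi))."""
            i = samples.real
            q = samples.imag
            q_corr = (q / gain_ratio - i * np.sin(phase_err_rad)) / np.cos(phase_err_rad)
            return i + 1j * q_corr

        tone = np.exp(2j * np.pi * 0.01 * np.arange(1000))            # clean complex test tone
        balanced = correct_iq(tone, gain_ratio=1.05, phase_err_rad=0.02)
        print(balanced[:3])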

  2. Systems Engineering | Wind | NREL

    Science.gov Websites

    NREL uses a systems engineering platform to leverage its research capabilities toward integrating wind energy engineering and cost models, to achieve a better understanding of how to improve system-level performance and system-level cost. Its research capabilities integrate wind plant engineering performance and cost software modeling.

  3. Evaluation of a Mobile Platform for Proof-of-Concept Autonomous Site Selection and Preparation

    NASA Astrophysics Data System (ADS)

    Gammell, Jonathan

    A mobile robotic platform for Autonomous Site Selection and Preparation (ASSP) was developed for an analogue deployment to Mauna Kea, Hawai`i. A team of rovers performed an autonomous Ground Penetrating Radar (GPR) survey and constructed a level landing pad. They used interchangeable payloads that allowed the GPR and blade to be easily exchanged. Autonomy was accomplished by integrating the individual hardware devices with software based on the ArgoSoft framework previously developed at UTIAS. The rovers were controlled by an on-board netbook. The successes and failures of the devices and software modules are evaluated within. Recommendations are presented to address problems discovered during the deployment and to guide future research on the platform.

  4. The Generation Challenge Programme Platform: Semantic Standards and Workbench for Crop Science

    PubMed Central

    Bruskiewich, Richard; Senger, Martin; Davenport, Guy; Ruiz, Manuel; Rouard, Mathieu; Hazekamp, Tom; Takeya, Masaru; Doi, Koji; Satoh, Kouji; Costa, Marcos; Simon, Reinhard; Balaji, Jayashree; Akintunde, Akinnola; Mauleon, Ramil; Wanchana, Samart; Shah, Trushar; Anacleto, Mylah; Portugal, Arllet; Ulat, Victor Jun; Thongjuea, Supat; Braak, Kyle; Ritter, Sebastian; Dereeper, Alexis; Skofic, Milko; Rojas, Edwin; Martins, Natalia; Pappas, Georgios; Alamban, Ryan; Almodiel, Roque; Barboza, Lord Hendrix; Detras, Jeffrey; Manansala, Kevin; Mendoza, Michael Jonathan; Morales, Jeffrey; Peralta, Barry; Valerio, Rowena; Zhang, Yi; Gregorio, Sergio; Hermocilla, Joseph; Echavez, Michael; Yap, Jan Michael; Farmer, Andrew; Schiltz, Gary; Lee, Jennifer; Casstevens, Terry; Jaiswal, Pankaj; Meintjes, Ayton; Wilkinson, Mark; Good, Benjamin; Wagner, James; Morris, Jane; Marshall, David; Collins, Anthony; Kikuchi, Shoshi; Metz, Thomas; McLaren, Graham; van Hintum, Theo

    2008-01-01

    The Generation Challenge programme (GCP) is a global crop research consortium directed toward crop improvement through the application of comparative biology and genetic resources characterization to plant breeding. A key consortium research activity is the development of a GCP crop bioinformatics platform to support GCP research. This platform includes the following: (i) shared, public platform-independent domain models, ontology, and data formats to enable interoperability of data and analysis flows within the platform; (ii) web service and registry technologies to identify, share, and integrate information across diverse, globally dispersed data sources, as well as to access high-performance computational (HPC) facilities for computationally intensive, high-throughput analyses of project data; (iii) platform-specific middleware reference implementations of the domain model integrating a suite of public (largely open-access/-source) databases and software tools into a workbench to facilitate biodiversity analysis, comparative analysis of crop genomic data, and plant breeding decision making. PMID:18483570

  5. Getting expert systems off the ground: Lessons learned from integrating model-based diagnostics with prototype flight hardware

    NASA Technical Reports Server (NTRS)

    Stephan, Amy; Erikson, Carol A.

    1991-01-01

    As an initial attempt to introduce expert system technology into an onboard environment, a model-based diagnostic system using the TRW MARPLE software tool was integrated with prototype flight hardware and its corresponding control software. Because this experiment was designed primarily to test the effectiveness of the model-based reasoning technique used, the expert system ran on a separate hardware platform, and interactions between the control software and the model-based diagnostics were limited. While this project met its objective of showing that model-based reasoning can effectively isolate failures in flight hardware, it also identified the need for an integrated development path for expert system and control software for onboard applications. To develop expert systems that are ready for flight, developers must evaluate artificial intelligence techniques to determine whether they offer a real advantage onboard, identify which diagnostic functions should be performed by the expert system and which are better left to the procedural software, and work closely with both the hardware and the software developers from the beginning of a project to produce a well-designed and thoroughly integrated application.

  6. A Practical Guide to the Technology and Adoption of Software Process Automation

    DTIC Science & Technology

    1994-03-01

    IDE’s integration of Software through Pictures, CodeCenter, and FrameMaker). However, successful use of integrated tools, as reflected in actual...tool for a specific platform. Thus, when a Work Context calls for a word processor, the weaver.tis file can be set up to call FrameMaker for the Sun4

  7. Get It Together: Integrating Data with XML.

    ERIC Educational Resources Information Center

    Miller, Ron

    2003-01-01

    Discusses the use of XML for data integration to move data across different platforms, including across the Internet, from a variety of sources. Topics include flexibility; standards; organizing databases; unstructured data and the use of meta tags to encode it with XML information; cost effectiveness; and eliminating client software licenses.…
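
    A minimal sketch of the data-integration idea discussed in the record: one system serializes a record (including unstructured text) to XML, and another platform parses the same bytes back into structured fields. Element names are illustrative only.

        import xml.etree.ElementTree as ET

        record = ET.Element("record", attrib={"source": "legacy-db"})
        ET.SubElement(record, "title").text = "Quarterly report"
        ET.SubElement(record, "created").text = "2003-06-30"
        ET.SubElement(record, "body").text = "Unstructured text tagged for exchange."

        xml_bytes = ET.tostring(record, encoding="utf-8")
        print(xml_bytes.decode("utf-8"))

        # A receiving platform parses the same bytes back into structured data.
        parsed = ET.fromstring(xml_bytes)
        print(parsed.findtext("title"))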

  8. Integrated Biological Warfare Technology Platform (IBWTP). Intelligent Software Supporting Situation Awareness, Response, and Operations

    DTIC Science & Technology

    2007-01-01

    [Report table-of-contents excerpt] 4.2.3 Users of Systems for Combating Biological Warfare; 4.3.1 Existing Biosurveillance Systems; 4.3.2 Automatic Integration; 6.4.4 Multi-Agent System Management System (MMS); 6.4.5 Agent Glossary

  9. Reconfigurable, Intelligently-Adaptive, Communication System, an SDR Platform

    NASA Technical Reports Server (NTRS)

    Roche, Rigoberto

    2016-01-01

    The Space Telecommunications Radio System (STRS) provides a common, consistent framework to abstract the application software from the radio platform hardware. STRS aims to reduce the cost and risk of using complex, configurable and reprogrammable radio systems across NASA missions. The Glenn Research Center (GRC) team made a software-defined radio (SDR) platform STRS compliant by adding an STRS operating environment and a field programmable gate array (FPGA) wrapper, capable of implementing each of the platform's interfaces, as well as a test waveform to exercise those interfaces. This effort serves to provide a framework toward waveform development on an STRS compliant platform to support future space communication systems for advanced exploration missions. Validated STRS compliant applications provide tested code with extensive documentation to potentially reduce the risk, cost, and effort of developing space-deployable SDRs. This paper discusses the advantages of STRS, the integration of STRS onto a Reconfigurable, Intelligently-Adaptive, Communication System (RIACS) SDR platform, the sample waveform, and wrapper development efforts. The paper emphasizes the infusion of the STRS Architecture onto the RIACS platform for potential use in next generation SDRs for advanced exploration missions.

  10. Wolfram technologies as an integrated scalable platform for interactive learning

    NASA Astrophysics Data System (ADS)

    Kaurov, Vitaliy

    2012-02-01

    We rely on technology profoundly with the prospect of even greater integration in the future. Well known challenges in education are a technology-inadequate curriculum and many software platforms that are difficult to scale or interconnect. We'll review an integrated technology, much of it free, that addresses these issues for individuals and small schools as well as for universities. Topics include: Mathematica, a programming environment that offers a diverse range of functionality; natural language programming for getting started quickly and accessing data from Wolfram|Alpha; quick and easy construction of interactive courseware and scientific applications; partnering with publishers to create interactive e-textbooks; course assistant apps for mobile platforms; the computable document format (CDF); teacher-student and student-student collaboration on interactive projects and web publishing at the Wolfram Demonstrations site.

  11. Integrated Component-based Data Acquisition Systems for Aerospace Test Facilities

    NASA Technical Reports Server (NTRS)

    Ross, Richard W.

    2001-01-01

    The Multi-Instrument Integrated Data Acquisition System (MIIDAS), developed by the NASA Langley Research Center, uses commercial off the shelf (COTS) products, integrated with custom software, to provide a broad range of capabilities at a low cost throughout the system's entire life cycle. MIIDAS combines data acquisition capabilities with online and post-test data reduction computations. COTS products lower purchase and maintenance costs by reducing the level of effort required to meet system requirements. Object-oriented methods are used to enhance modularity, encourage reusability, and to promote adaptability, reducing software development costs. Using only COTS products and custom software supported on multiple platforms reduces the cost of porting the system to other platforms. The post-test data reduction capabilities of MIIDAS have been installed at four aerospace testing facilities at NASA Langley Research Center. The systems installed at these facilities provide a common user interface, reducing the training time required for personnel that work across multiple facilities. The techniques employed by MIIDAS enable NASA to build a system with a lower initial purchase price and reduced sustaining maintenance costs. With MIIDAS, NASA has built a highly flexible next generation data acquisition and reduction system for aerospace test facilities that meets customer expectations.

  12. Global Software Development with Cloud Platforms

    NASA Astrophysics Data System (ADS)

    Yara, Pavan; Ramachandran, Ramaseshan; Balasubramanian, Gayathri; Muthuswamy, Karthik; Chandrasekar, Divya

    Offshore and outsourced distributed software development models and processes are facing previously unknown challenges with respect to computing capacity, bandwidth, storage, security, complexity, reliability, and business uncertainty. Clouds promise to address these challenges by adopting recent advances in virtualization, parallel and distributed systems, utility computing, and software services. In this paper, we envision a cloud-based platform that addresses some of these core problems. We outline a generic cloud architecture, its design, and our first implementation results for three cloud forms (a compute cloud, a storage cloud, and a cloud-based software service) in the context of global distributed software development (GSD). Our "compute cloud" provides computational services such as continuous code integration and a compile server farm, the "storage cloud" offers storage (block- or file-based) services with an on-line virtual storage service, whereas the on-line virtual labs represent a useful cloud-based software service. We note some of the use cases for clouds in GSD and the lessons learned with our prototypes, and identify challenges that must be conquered before realizing the full business benefits. We believe that in the future, software practitioners will focus more on these cloud computing platforms and see clouds as a means of supporting an ecosystem of clients, developers, and other key stakeholders.

  13. Reconfigurable, Cognitive Software-Defined Radio

    NASA Technical Reports Server (NTRS)

    Bhat, Arvind

    2015-01-01

    Software-defined radio (SDR) technology allows radios to be reconfigured to perform different communication functions without using multiple radios to accomplish each task. Intelligent Automation, Inc., has developed SDR platforms that switch adaptively between different operation modes. The innovation works by modifying both transmit waveforms and receiver signal processing tasks. In Phase I of the project, the company developed SDR cognitive capabilities, including adaptive modulation and coding (AMC), automatic modulation recognition (AMR), and spectrum sensing. In Phase II, these capabilities were integrated into SDR platforms. The reconfigurable transceiver design employs high-speed field-programmable gate arrays, enabling multimode operation and scalable architecture. Designs are based on commercial off-the-shelf (COTS) components and are modular in nature, making it easier to upgrade individual components rather than redesigning the entire SDR platform as technology advances.
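
    The adaptive modulation and coding (AMC) capability mentioned above can be sketched as a threshold lookup: estimate the link SNR and pick the most spectrally efficient modulation/coding pair it can support. The thresholds and mode set in the Python sketch below are assumptions for illustration, not the product's values.

        AMC_TABLE = [
            (18.0, ("64QAM", "3/4")),
            (12.0, ("16QAM", "1/2")),
            (6.0,  ("QPSK",  "1/2")),
        ]

        def select_mode(snr_db):
            """Return the most spectrally efficient (modulation, code rate) the link supports."""
            for threshold, mode in AMC_TABLE:
                if snr_db >= threshold:
                    return mode
            return ("BPSK", "1/2")      # most robust fallback mode

        print(select_mode(14.3))        # -> ('16QAM', '1/2')
        print(select_mode(2.0))         # -> ('BPSK', '1/2')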

  14. Information-computational platform for collaborative multidisciplinary investigations of regional climatic changes and their impacts

    NASA Astrophysics Data System (ADS)

    Gordov, Evgeny; Lykosov, Vasily; Krupchatnikov, Vladimir; Okladnikov, Igor; Titov, Alexander; Shulgina, Tamara

    2013-04-01

    Analysis of the growing volume of climate-change-related data from sensors and model outputs requires collaborative multidisciplinary efforts by researchers. To do this in a timely and reliable way, a modern information-computational infrastructure supporting integrated studies in the field of environmental sciences is needed. The recently developed experimental software and hardware platform Climate (http://climate.scert.ru/) provides the required environment for regional climate change investigations. The platform combines a modern web 2.0 approach, GIS functionality, and capabilities to run climate and meteorological models, process large geophysical datasets, and support relevant analysis. It also supports joint software development by distributed research groups and the organization of thematic education for students and post-graduate students. In particular, the platform software includes dedicated modules for numerical processing of regional and global modeling results for subsequent analysis and visualization. Runs of the integrated WRF and «Planet Simulator» models, along with preprocessing and visualization of modeling results, are also provided. All functions of the platform are accessible to a user through a web portal using a common graphical web browser in the form of an interactive graphical user interface which provides, in particular, capabilities for selecting a geographical region of interest (pan and zoom), manipulating data layers (order, enable/disable, feature extraction), and visualizing results. The platform provides users with capabilities for heterogeneous geophysical data analysis, including high-resolution data, and for discovering tendencies in climatic and ecosystem changes in the framework of different multidisciplinary studies. Using it, even an unskilled user without specific knowledge can perform reliable computational processing and visualization of large meteorological, climatic, and satellite monitoring datasets through the unified graphical web interface. Partial support of RF Ministry of Education and Science grant 8345, SB RAS Program VIII.80.2, Projects 69, 131, and 140, and APN project CBA2012-16NSY is acknowledged.

  15. Software Reliability Analysis of NASA Space Flight Software: A Practical Experience

    PubMed Central

    Sukhwani, Harish; Alonso, Javier; Trivedi, Kishor S.; Mcginnis, Issac

    2017-01-01

    In this paper, we present the software reliability analysis of the flight software of a recently launched space mission. For our analysis, we use the defect reports collected during the flight software development. We find that this software was developed in multiple releases, each release spanning across all software life-cycle phases. We also find that the software releases were developed and tested for four different hardware platforms, spanning from off-the-shelf or emulation hardware to actual flight hardware. For releases that exhibit reliability growth or decay, we fit Software Reliability Growth Models (SRGM); otherwise we fit a distribution function. We find that most releases exhibit reliability growth, with Log-Logistic (NHPP) and S-Shaped (NHPP) as the best-fit SRGMs. For the releases that experience reliability decay, we investigate the causes for the same. We find that such releases were the first software releases to be tested on a new hardware platform, and hence they encountered major hardware integration issues. Also such releases seem to have been developed under time pressure in order to start testing on the new hardware platform sooner. Such releases exhibit poor reliability growth, and hence exhibit high predicted failure rate. Other problems include hardware specification changes and delivery delays from vendors. Thus, our analysis provides critical insights and inputs to the management to improve the software development process. As NASA has moved towards a product line engineering for its flight software development, software for future space missions will be developed in a similar manner and hence the analysis results for this mission can be considered as a baseline for future flight software missions. PMID:29278255

  16. Software Reliability Analysis of NASA Space Flight Software: A Practical Experience.

    PubMed

    Sukhwani, Harish; Alonso, Javier; Trivedi, Kishor S; Mcginnis, Issac

    2016-01-01

    In this paper, we present the software reliability analysis of the flight software of a recently launched space mission. For our analysis, we use the defect reports collected during the flight software development. We find that this software was developed in multiple releases, each release spanning across all software life-cycle phases. We also find that the software releases were developed and tested for four different hardware platforms, spanning from off-the-shelf or emulation hardware to actual flight hardware. For releases that exhibit reliability growth or decay, we fit Software Reliability Growth Models (SRGM); otherwise we fit a distribution function. We find that most releases exhibit reliability growth, with Log-Logistic (NHPP) and S-Shaped (NHPP) as the best-fit SRGMs. For the releases that experience reliability decay, we investigate the causes for the same. We find that such releases were the first software releases to be tested on a new hardware platform, and hence they encountered major hardware integration issues. Also such releases seem to have been developed under time pressure in order to start testing on the new hardware platform sooner. Such releases exhibit poor reliability growth, and hence exhibit high predicted failure rate. Other problems include hardware specification changes and delivery delays from vendors. Thus, our analysis provides critical insights and inputs to the management to improve the software development process. As NASA has moved towards a product line engineering for its flight software development, software for future space missions will be developed in a similar manner and hence the analysis results for this mission can be considered as a baseline for future flight software missions.
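
    As a hedged illustration of the SRGM fitting described above, the Python sketch below fits a delayed S-shaped NHPP mean value function, m(t) = a(1 - (1 + bt)e^(-bt)), to cumulative defect counts; the defect data are invented for the example.

        import numpy as np
        from scipy.optimize import curve_fit

        def s_shaped(t, a, b):
            """Delayed S-shaped NHPP mean value function."""
            return a * (1.0 - (1.0 + b * t) * np.exp(-b * t))

        weeks = np.arange(1, 13, dtype=float)
        cum_defects = np.array([1, 3, 7, 13, 20, 27, 33, 38, 41, 43, 44, 45], dtype=float)

        (a_hat, b_hat), _ = curve_fit(s_shaped, weeks, cum_defects, p0=[50.0, 0.3])
        print(f"estimated total defects a = {a_hat:.1f}, shape b = {b_hat:.2f}")
        print("predicted defects in week 13:",
              round(s_shaped(13, a_hat, b_hat) - s_shaped(12, a_hat, b_hat), 2))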

  17. Energy Systems Integration News | Energy Systems Integration Facility |

    Science.gov Websites

    -the-loop" (HIL) to connect physical devices to software models, EdgePower is drawing on NREL's are putting their controller into a synthetic environment that is called 'controller in-the-loop controller-in-the-loop platform allows us to observe the dynamics of these buildings as they implement the

  18. The Ettention software package.

    PubMed

    Dahmen, Tim; Marsalek, Lukas; Marniok, Nico; Turoňová, Beata; Bogachev, Sviatoslav; Trampert, Patrick; Nickels, Stefan; Slusallek, Philipp

    2016-02-01

    We present a novel software package for the problem "reconstruction from projections" in electron microscopy. The Ettention framework consists of a set of modular building-blocks for tomographic reconstruction algorithms. The well-known block iterative reconstruction method based on Kaczmarz algorithm is implemented using these building-blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphic processing units (GPU) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. Copyright © 2015 Elsevier B.V. All rights reserved.
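
    A toy sketch of the Kaczmarz iteration underlying the block-iterative reconstruction mentioned above: each projection equation corrects the current estimate in turn. The Python example below solves a tiny overdetermined linear system; a real tomographic problem is vastly larger and would use the package's GPU code paths.

        import numpy as np

        def kaczmarz(A, b, iterations=50):
            """Cycle through the rows of A, projecting x onto each hyperplane a_i . x = b_i."""
            x = np.zeros(A.shape[1])
            for _ in range(iterations):
                for a_i, b_i in zip(A, b):
                    x += (b_i - a_i @ x) / (a_i @ a_i) * a_i
            return x

        A = np.array([[2.0, 1.0], [1.0, 3.0], [1.0, -1.0]])
        x_true = np.array([1.0, 2.0])
        b = A @ x_true
        print(kaczmarz(A, b))    # converges towards [1.0, 2.0]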

  19. Ascent/Descent Software

    NASA Technical Reports Server (NTRS)

    Brown, Charles; Andrew, Robert; Roe, Scott; Frye, Ronald; Harvey, Michael; Vu, Tuan; Balachandran, Krishnaiyer; Bly, Ben

    2012-01-01

    The Ascent/Descent Software Suite has been used to support a variety of NASA Shuttle Program mission planning and analysis activities, such as range safety, on the Integrated Planning System (IPS) platform. The Ascent/Descent Software Suite, containing Ascent Flight Design (ASC)/Descent Flight Design (DESC) Configuration Items (CIs), lifecycle documents, and data files used for shuttle ascent and entry modeling analysis and mission design, resides on IPS/Linux workstations. A list of tools in the Navigation (NAV)/Prop Software Suite represents tool versions established during or after the IPS Equipment Rehost-3 project.

  20. A Visual Basic simulation software tool for performance analysis of a membrane-based advanced water treatment plant.

    PubMed

    Pal, P; Kumar, R; Srivastava, N; Chaudhuri, J

    2014-02-01

    A Visual Basic simulation software package (WATTPPA) has been developed to analyse the performance of an advanced wastewater treatment plant. This user-friendly, menu-driven software is based on a dynamic mathematical model for an industrial wastewater treatment scheme that integrates chemical, biological and membrane-based unit operations. The software-predicted results corroborate very well with the experimental findings, as indicated by an overall correlation coefficient on the order of 0.99. The software permits pre-analysis and manipulation of input data, helps in optimization, and exhibits the performance of an integrated plant visually on a graphical platform. It allows quick performance analysis of the whole system as well as of the individual units. The software, the first of its kind in its domain and built in the well-known Microsoft Excel environment, is likely to be very useful in the successful design, optimization and operation of an advanced hybrid treatment plant for hazardous wastewater.

  1. OIPAV: an integrated software system for ophthalmic image processing, analysis and visualization

    NASA Astrophysics Data System (ADS)

    Zhang, Lichun; Xiang, Dehui; Jin, Chao; Shi, Fei; Yu, Kai; Chen, Xinjian

    2018-03-01

    OIPAV (Ophthalmic Images Processing, Analysis and Visualization) is a cross-platform software which is specially oriented to ophthalmic images. It provides a wide range of functionalities including data I/O, image processing, interaction, ophthalmic diseases detection, data analysis and visualization to help researchers and clinicians deal with various ophthalmic images such as optical coherence tomography (OCT) images and color photo of fundus, etc. It enables users to easily access to different ophthalmic image data manufactured from different imaging devices, facilitate workflows of processing ophthalmic images and improve quantitative evaluations. In this paper, we will present the system design and functional modules of the platform and demonstrate various applications. With a satisfying function scalability and expandability, we believe that the software can be widely applied in ophthalmology field.

  2. The Ensemble Canon

    NASA Technical Reports Server (NTRS)

    MIittman, David S

    2011-01-01

    Ensemble is an open architecture for the development, integration, and deployment of mission operations software. Fundamentally, it is an adaptation of the Eclipse Rich Client Platform (RCP), a widespread, stable, and supported framework for component-based application development. By capitalizing on the maturity and availability of the Eclipse RCP, Ensemble offers a low-risk, politically neutral path towards a tighter integration of operations tools. The Ensemble project is a highly successful, ongoing collaboration among NASA Centers. Since 2004, the Ensemble project has supported the development of mission operations software for NASA's Exploration Systems, Science, and Space Operations Directorates.

  3. BIAS: Bioinformatics Integrated Application Software.

    PubMed

    Finak, G; Godin, N; Hallett, M; Pepin, F; Rajabi, Z; Srivastava, V; Tang, Z

    2005-04-15

    We introduce a development platform especially tailored to Bioinformatics research and software development. BIAS (Bioinformatics Integrated Application Software) provides the tools necessary for carrying out integrative Bioinformatics research requiring multiple datasets and analysis tools. It follows an object-relational strategy for providing persistent objects, allows third-party tools to be easily incorporated within the system and supports standards and data-exchange protocols common to Bioinformatics. BIAS is an OpenSource project and is freely available to all interested users at http://www.mcb.mcgill.ca/~bias/. This website also contains a paper containing a more detailed description of BIAS and a sample implementation of a Bayesian network approach for the simultaneous prediction of gene regulation events and of mRNA expression from combinations of gene regulation events. hallett@mcb.mcgill.ca.

  4. Supporting Building Portfolio Investment and Policy Decision Making through an Integrated Building Utility Data Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aziz, Azizan; Lasternas, Bertrand; Alschuler, Elena

    The American Recovery and Reinvestment Act stimulus funding of 2009 for smart grid projects resulted in the tripling of smart meter deployment. In 2012, the Green Button initiative provided utility customers with access to their real-time energy usage. The availability of finely granular data provides an enormous potential for energy data analytics and energy benchmarking. The sheer volume of time-series utility data from a large number of buildings also poses challenges in data collection, quality control, and database management for rigorous and meaningful analyses. In this paper, we will describe a building portfolio-level data analytics tool for operational optimization, business investment, and policy assessment using 15-minute to monthly interval utility data. The analytics tool is developed on top of the U.S. Department of Energy's Standard Energy Efficiency Data (SEED) platform, an open source software application that manages energy performance data of large groups of buildings. To support the significantly large volume of granular interval data, we integrated a parallel time-series database with the existing relational database. The time-series database improves on the current utility data input, focusing on real-time data collection, storage, analytics, and data quality control. The fully integrated data platform supports APIs for utility apps development by third-party software developers. These apps will provide actionable intelligence for building owners and facilities managers. Unlike a commercial system, this platform is an open source platform funded by the U.S. Government, accessible to the public, researchers, and other developers, to support initiatives in reducing building energy consumption.
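
    The granular-to-monthly rollup such a portfolio platform performs on interval meter data can be sketched in a few lines; the pandas example below uses synthetic 15-minute readings and only illustrates the aggregation step, not SEED's implementation.

        import numpy as np
        import pandas as pd

        idx = pd.date_range("2014-01-01", periods=4 * 24 * 60, freq="15min")   # ~60 days of 15-min intervals
        kwh = pd.Series(np.random.default_rng(0).uniform(5, 25, len(idx)), index=idx)

        monthly = kwh.resample("MS").sum()      # roll interval consumption up to monthly totals
        print(monthly.round(1))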

  5. Should biomedical research be like Airbnb?

    PubMed

    Bonazzi, Vivien R; Bourne, Philip E

    2017-04-01

    The thesis presented here is that biomedical research is based on the trusted exchange of services. That exchange would be conducted more efficiently if the trusted software platforms to exchange those services, if they exist, were more integrated. While simpler and narrower in scope than the services governing biomedical research, comparison to existing internet-based platforms, like Airbnb, can be informative. We illustrate how the analogy to internet-based platforms works and does not work and introduce The Commons, under active development at the National Institutes of Health (NIH) and elsewhere, as an example of the move towards platforms for research.

  6. Should biomedical research be like Airbnb?

    PubMed Central

    Bonazzi, Vivien R.

    2017-01-01

    The thesis presented here is that biomedical research is based on the trusted exchange of services. That exchange would be conducted more efficiently if the trusted software platforms to exchange those services, if they exist, were more integrated. While simpler and narrower in scope than the services governing biomedical research, comparison to existing internet-based platforms, like Airbnb, can be informative. We illustrate how the analogy to internet-based platforms works and does not work and introduce The Commons, under active development at the National Institutes of Health (NIH) and elsewhere, as an example of the move towards platforms for research. PMID:28388615

  7. Bringing Legacy Visualization Software to Modern Computing Devices via Application Streaming

    NASA Astrophysics Data System (ADS)

    Fisher, Ward

    2014-05-01

    Planning software compatibility across forthcoming generations of computing platforms is a problem commonly encountered in software engineering and development. While this problem can affect any class of software, data analysis and visualization programs are particularly vulnerable. This is due in part to their inherent dependency on specialized hardware and computing environments. A number of strategies and tools have been designed to aid software engineers with this task. While generally embraced by developers at 'traditional' software companies, these methodologies are often dismissed by the scientific software community as unwieldy, inefficient and unnecessary. As a result, many important and storied scientific software packages can struggle to adapt to a new computing environment; for example, one in which much work is carried out on sub-laptop devices (such as tablets and smartphones). Rewriting these packages for a new platform often requires significant investment in terms of development time and developer expertise. In many cases, porting older software to modern devices is neither practical nor possible. As a result, replacement software must be developed from scratch, wasting resources better spent on other projects. Enabled largely by the rapid rise and adoption of cloud computing platforms, 'Application Streaming' technologies allow legacy visualization and analysis software to be operated wholly from a client device (be it laptop, tablet or smartphone) while retaining full functionality and interactivity. It mitigates much of the developer effort required by other more traditional methods while simultaneously reducing the time it takes to bring the software to a new platform. This work will provide an overview of Application Streaming and how it compares against other technologies which allow scientific visualization software to be executed from a remote computer. We will discuss the functionality and limitations of existing application streaming frameworks and how a developer might prepare their software for application streaming. We will also examine the secondary benefits realized by moving legacy software to the cloud. Finally, we will examine the process by which a legacy Java application, the Integrated Data Viewer (IDV), is to be adapted for tablet computing via Application Streaming.

  8. CONNJUR Workflow Builder: A software integration environment for spectral reconstruction

    PubMed Central

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert; Martyn, Timothy O.; Ellis, Heidi J.C.; Gryk, Michael R.

    2015-01-01

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration, education, parameterization, and non-uniform data sets along with processing integrated with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses. PMID:26066803

  9. CONNJUR Workflow Builder: a software integration environment for spectral reconstruction.

    PubMed

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert; Martyn, Timothy O; Ellis, Heidi J C; Gryk, Michael R

    2015-07-01

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration, education, parameterization, and non-uniform data sets along with processing integrated with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses.
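
    The core time-to-frequency-domain step that such a spectral-reconstruction workflow orchestrates can be sketched directly: the Python example below synthesizes a single decaying resonance (a toy free-induction decay) and Fourier-transforms it into a spectrum; the spectral width, resonance frequency, and decay constant are assumed.

        import numpy as np

        sw = 1000.0                          # spectral width in Hz (assumed)
        n = 1024
        t = np.arange(n) / sw
        fid = np.exp(2j * np.pi * 150.0 * t) * np.exp(-t / 0.05)   # one decaying resonance

        spectrum = np.fft.fftshift(np.fft.fft(fid))
        freqs = np.fft.fftshift(np.fft.fftfreq(n, d=1.0 / sw))
        print("peak near", round(freqs[np.argmax(np.abs(spectrum))], 1), "Hz")   # ~150 Hz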

  10. ICW eHealth Framework.

    PubMed

    Klein, Karsten; Wolff, Astrid C; Ziebold, Oliver; Liebscher, Thomas

    2008-01-01

    The ICW eHealth Framework (eHF) is a powerful infrastructure and platform for the development of service-oriented solutions in the health care business. It is the culmination of many years of experience of ICW in the development and use of in-house health care solutions and represents the foundation of ICW product developments based on the Java Enterprise Edition (Java EE). The ICW eHealth Framework has been leveraged to allow development by external partners - enabling adopters a straightforward integration into ICW solutions. The ICW eHealth Framework consists of reusable software components, development tools, architectural guidelines and conventions defining a full software-development and product lifecycle. From the perspective of a partner, the framework provides services and infrastructure capabilities for integrating applications within an eHF-based solution. This article introduces the ICW eHealth Framework's basic architectural concepts and technologies. It provides an overview of its module and component model, describes the development platform that supports the complete software development lifecycle of health care applications and outlines technological aspects, mainly focusing on application development frameworks and open standards.

  11. Performance analysis and optimization of an advanced pharmaceutical wastewater treatment plant through a visual basic software tool (PWWT.VB).

    PubMed

    Pal, Parimal; Thakura, Ritwik; Chakrabortty, Sankha

    2016-05-01

    A user-friendly, menu-driven simulation software tool has been developed for the first time to optimize and analyze the system performance of an advanced continuous membrane-integrated pharmaceutical wastewater treatment plant. The software allows pre-analysis and manipulation of input data, which helps in optimization, and shows the system performance visually on a graphical platform. Moreover, the software helps the user to "visualize" the effects of the operating parameters through its model-predicted output profiles. The software is based on a dynamic mathematical model, developed for a systematically integrated forward osmosis-nanofiltration process for removal of toxic organic compounds from pharmaceutical wastewater. The model-predicted values have been observed to corroborate well with the extensive experimental investigations and were found to be consistent under varying operating conditions such as operating pressure, operating flow rate, and draw solute concentration. Low values of the relative error (RE = 0.09) and high values of the Willmott d-index (d_will = 0.981) reflected a high degree of accuracy and reliability of the software. This software is likely to be a very efficient tool for system design or simulation of an advanced membrane-integrated treatment plant for hazardous wastewater.
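
    For reference, the two agreement measures quoted above can be computed as in the Python sketch below; the observed and predicted values are made up for illustration.

        import numpy as np

        observed = np.array([10.2, 11.5, 9.8, 12.1, 10.9])
        predicted = np.array([10.0, 11.9, 9.5, 12.4, 11.0])

        rel_err = np.mean(np.abs(predicted - observed) / observed)   # mean relative error

        o_bar = observed.mean()
        d_willmott = 1.0 - np.sum((predicted - observed) ** 2) / np.sum(
            (np.abs(predicted - o_bar) + np.abs(observed - o_bar)) ** 2
        )
        print(f"relative error = {rel_err:.3f}, Willmott d = {d_willmott:.3f}")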

  12. Hardware Evolution of Closed-Loop Controller Designs

    NASA Technical Reports Server (NTRS)

    Gwaltney, David; Ferguson, Ian

    2002-01-01

    Poster presentation will outline on-going efforts at NASA, MSFC to employ various Evolvable Hardware experimental platforms in the evolution of digital and analog circuitry for application to automatic control. Included will be information concerning the application of commercially available hardware and software along with the use of the JPL developed FPTA2 integrated circuit and supporting JPL developed software. Results to date will be presented.

  13. The SCEC Broadband Platform: Open-Source Software for Strong Ground Motion Simulation and Validation

    NASA Astrophysics Data System (ADS)

    Goulet, C.; Silva, F.; Maechling, P. J.; Callaghan, S.; Jordan, T. H.

    2015-12-01

    The Southern California Earthquake Center (SCEC) Broadband Platform (BBP) is a carefully integrated collection of open-source scientific software programs that can simulate broadband (0-100Hz) ground motions for earthquakes at regional scales. The BBP scientific software modules implement kinematic rupture generation, low and high-frequency seismogram synthesis using wave propagation through 1D layered velocity structures, seismogram ground motion amplitude calculations, and goodness of fit measurements. These modules are integrated into a software system that provides user-defined, repeatable, calculation of ground motion seismograms, using multiple alternative ground motion simulation methods, and software utilities that can generate plots, charts, and maps. The BBP has been developed over the last five years in a collaborative scientific, engineering, and software development project involving geoscientists, earthquake engineers, graduate students, and SCEC scientific software developers. The BBP can run earthquake rupture and wave propagation modeling software to simulate ground motions for well-observed historical earthquakes and to quantify how well the simulated broadband seismograms match the observed seismograms. The BBP can also run simulations for hypothetical earthquakes. In this case, users input an earthquake location and magnitude description, a list of station locations, and a 1D velocity model for the region of interest, and the BBP software then calculates ground motions for the specified stations. The SCEC BBP software released in 2015 can be compiled and run on recent Linux systems with GNU compilers. It includes 5 simulation methods, 7 simulation regions covering California, Japan, and Eastern North America, the ability to compare simulation results against GMPEs, updated ground motion simulation methods, and a simplified command line user interface.

  14. Reconfigurable, Intelligently-Adaptive, Communication System, an SDR Platform

    NASA Technical Reports Server (NTRS)

    Roche, Rigoberto J.; Shalkhauser, Mary Jo; Hickey, Joseph P.; Briones, Janette C.

    2016-01-01

    The Space Telecommunications Radio System (STRS) provides a common, consistent framework to abstract the application software from the radio platform hardware. STRS aims to reduce the cost and risk of using complex, configurable and reprogrammable radio systems across NASA missions. The NASA Glenn Research Center (GRC) team made a software defined radio (SDR) platform STRS compliant by adding an STRS operating environment and a field programmable gate array (FPGA) wrapper, capable of implementing each of the platform's interfaces, as well as a test waveform to exercise those interfaces. This effort serves to provide a framework toward waveform development on an STRS compliant platform to support future space communication systems for advanced exploration missions. The use of validated STRS compliant applications provides tested code with extensive documentation to potentially reduce risk, cost and effort in the development of space-deployable SDRs. This paper discusses the advantages of STRS, the integration of STRS onto a Reconfigurable, Intelligently-Adaptive, Communication System (RIACS) SDR platform, and the test waveform and wrapper development efforts. The paper emphasizes the infusion of the STRS Architecture onto the RIACS platform for potential use in next generation flight system SDRs for advanced exploration missions.

  15. FRACOR-software toolbox for deterministic mapping of fracture corridors in oil fields on AutoCAD platform

    NASA Astrophysics Data System (ADS)

    Ozkaya, Sait I.

    2018-03-01

    Fracture corridors are interconnected large fractures in a narrow, sub-vertical, tabular array, which usually traverse the entire reservoir vertically and extend for several hundred meters laterally. Fracture corridors, with their huge conductivities, constitute an important element of many fractured reservoirs. Unlike small diffuse fractures, actual fracture corridors must be mapped deterministically for simulation or field development purposes. Fracture corridors can be identified and quantified definitively with borehole image logs and well testing. However, there are rarely sufficient image logs or well tests, and it is necessary to utilize various fracture corridor indicators with varying degrees of reliability. Integration of data from many different sources, in turn, requires a platform with powerful editing and layering capability. Available commercial reservoir characterization software packages, with layering and editing capabilities, can be cost intensive. CAD packages are far more affordable and may easily acquire the versatility and power of commercial software packages with the addition of a small software toolbox. The objective of this communication is to present FRACOR, a software toolbox which enables deterministic 2D fracture corridor mapping and modeling on the AutoCAD platform. The FRACOR toolbox is written in AutoLISP and contains several independent routines to import and integrate available fracture corridor data from an oil field and export results as text files. The resulting fracture corridor maps consist mainly of fracture corridors with different confidence levels, derived from a combination of static and dynamic data, and exclusion zones where no fracture corridor can exist. The exported text file of fracture corridors from FRACOR can be imported into an upscaling program to generate a fracture grid for dual-porosity simulation or used for field development and well planning.

  16. CGI: Java Software for Mapping and Visualizing Data from Array-based Comparative Genomic Hybridization and Expression Profiling

    PubMed Central

    Gu, Joyce Xiuweu-Xu; Wei, Michael Yang; Rao, Pulivarthi H.; Lau, Ching C.; Behl, Sanjiv; Man, Tsz-Kwong

    2007-01-01

    With the increasing application of various genomic technologies in biomedical research, there is a need to integrate these data to correlate candidate genes/regions that are identified by different genomic platforms. Although there are tools that can analyze data from individual platforms, essential software for integration of genomic data is still lacking. Here, we present a novel Java-based program called CGI (Cytogenetics-Genomics Integrator) that matches the BAC clones from array-based comparative genomic hybridization (aCGH) to genes from RNA expression profiling datasets. The matching is computed via a fast, backend MySQL database containing UCSC Genome Browser annotations. This program also provides an easy-to-use graphical user interface for visualizing and summarizing the correlation of DNA copy number changes and RNA expression patterns from a set of experiments. In addition, CGI uses a Java applet to display the copy number values of a specific BAC clone in aCGH experiments side by side with the expression levels of genes that are mapped back to that BAC clone from the microarray experiments. The CGI program is built on top of extensible, reusable graphic components specifically designed for biologists. It is cross-platform compatible and the source code is freely available under the General Public License. PMID:19936083

  17. CGI: Java software for mapping and visualizing data from array-based comparative genomic hybridization and expression profiling.

    PubMed

    Gu, Joyce Xiuweu-Xu; Wei, Michael Yang; Rao, Pulivarthi H; Lau, Ching C; Behl, Sanjiv; Man, Tsz-Kwong

    2007-10-06

    With the increasing application of various genomic technologies in biomedical research, there is a need to integrate these data to correlate candidate genes/regions that are identified by different genomic platforms. Although there are tools that can analyze data from individual platforms, essential software for integration of genomic data is still lacking. Here, we present a novel Java-based program called CGI (Cytogenetics-Genomics Integrator) that matches the BAC clones from array-based comparative genomic hybridization (aCGH) to genes from RNA expression profiling datasets. The matching is computed via a fast, backend MySQL database containing UCSC Genome Browser annotations. This program also provides an easy-to-use graphical user interface for visualizing and summarizing the correlation of DNA copy number changes and RNA expression patterns from a set of experiments. In addition, CGI uses a Java applet to display the copy number values of a specific BAC clone in aCGH experiments side by side with the expression levels of genes that are mapped back to that BAC clone from the microarray experiments. The CGI program is built on top of extensible, reusable graphic components specifically designed for biologists. It is cross-platform compatible and the source code is freely available under the General Public License.
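
    The matching step the record describes, mapping aCGH BAC clones to expression-array genes by genomic overlap, reduces to an interval-intersection test. The Python sketch below uses invented clone and gene coordinates; the real tool resolves positions through UCSC Genome Browser annotations and a MySQL backend.

        # Toy genomic-overlap matching: (chromosome, start, end) tuples for each feature.
        bac_clones = {"RP11-1A1": ("chr8", 128_700_000, 128_900_000)}
        genes = {"MYC": ("chr8", 128_748_000, 128_753_000),
                 "GATA3": ("chr10", 8_045_000, 8_075_000)}

        def overlaps(a, b):
            """True when two (chrom, start, end) intervals share a chromosome and overlap."""
            return a[0] == b[0] and a[1] < b[2] and b[1] < a[2]

        for clone, c_loc in bac_clones.items():
            hits = [g for g, g_loc in genes.items() if overlaps(c_loc, g_loc)]
            print(clone, "->", hits)          # RP11-1A1 -> ['MYC']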

  18. Software Intensive Systems

    DTIC Science & Technology

    2006-07-01

    Architect, Developer and Platform Evangelism • Microsoft Dynamic Systems Initiative--John Wilson, Architect, Windows Management • Windows Lifecycle...Presentations • Aegis--Reuben Pitts & CDR John Ailes, Program Executive Office, Integrated Warfare Systems • Long Term Mine Reconnaissance (LMRS)--CAPT...Paul Imes • Joint Tactical Radio System (JTRS)--Richard North, JPEO JTRS & Leonard Schiavone, MITRE • Single Integrated Air Picture (SIAP)--CAPT

  19. Final Scientific/Technical Report for "Enabling Exascale Hardware and Software Design through Scalable System Virtualization"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dinda, Peter August

    2015-03-17

    This report describes the activities, findings, and products of the Northwestern University component of the "Enabling Exascale Hardware and Software Design through Scalable System Virtualization" project. The purpose of this project has been to extend the state of the art of systems software for high-end computing (HEC) platforms, and to use systems software to better enable the evaluation of potential future HEC platforms, for example exascale platforms. Such platforms, and their systems software, have the goal of providing scientific computation at new scales, thus enabling new research in the physical sciences and engineering. Over time, the innovations in systems software for such platforms also become applicable to more widely used computing clusters, data centers, and clouds. This was a five-institution project, centered on the Palacios virtual machine monitor (VMM) systems software, a project begun at Northwestern, and originally developed in a previous collaboration between Northwestern University and the University of New Mexico. In this project, Northwestern (including via our subcontract to the University of Pittsburgh) contributed to the continued development of Palacios, along with other team members. We took the leadership role in (1) continued extension of support for emerging Intel and AMD hardware, (2) integration and performance enhancement of overlay networking, (3) connectivity with architectural simulation, (4) binary translation, and (5) support for modern Non-Uniform Memory Access (NUMA) hosts and guests. We also took a supporting role in support for specialized hardware for I/O virtualization, profiling, configurability, and integration with configuration tools. The efforts we led (1-5) were largely successful and executed as expected, with code and papers resulting from them. The project demonstrated the feasibility of a virtualization layer for HEC computing, similar to such layers for cloud or datacenter computing. For effort (3), although a prototype connecting Palacios with the GEM5 architectural simulator was demonstrated, our conclusion was that such a platform was less useful for design space exploration than anticipated due to inherent complexity of the connection between the instruction set architecture level and the microarchitectural level. For effort (4), we found that a code injection approach proved to be more fruitful. The results of our efforts are publicly available in the open source Palacios codebase and published papers, all of which are available from the project web site, v3vee.org. Palacios is currently one of the two codebases (the other being Sandia's Kitten lightweight kernel) that underlies the node operating system for the DOE Hobbes Project, one of two projects tasked with building a systems software prototype for the national exascale computing effort.

  20. A Computational Systems Biology Software Platform for Multiscale Modeling and Simulation: Integrating Whole-Body Physiology, Disease Biology, and Molecular Reaction Networks

    PubMed Central

    Eissing, Thomas; Kuepfer, Lars; Becker, Corina; Block, Michael; Coboeken, Katrin; Gaub, Thomas; Goerlitz, Linus; Jaeger, Juergen; Loosen, Roland; Ludewig, Bernd; Meyer, Michaela; Niederalt, Christoph; Sevestre, Michael; Siegmund, Hans-Ulrich; Solodenko, Juri; Thelen, Kirstin; Telle, Ulrich; Weiss, Wolfgang; Wendl, Thomas; Willmann, Stefan; Lippert, Joerg

    2011-01-01

    Today, in silico studies and trial simulations already complement experimental approaches in pharmaceutical R&D and have become indispensable tools for decision making and communication with regulatory agencies. While biology is multiscale by nature, project work and software tools usually focus on isolated aspects of drug action, such as pharmacokinetics at the organism scale or pharmacodynamic interaction on the molecular level. We present a modeling and simulation software platform consisting of PK-Sim® and MoBi® capable of building and simulating models that integrate across biological scales. A prototypical multiscale model for the progression of a pancreatic tumor and its response to pharmacotherapy is constructed and virtual patients are treated with a prodrug activated by hepatic metabolization. Tumor growth is driven by signal transduction leading to cell cycle transition and proliferation. Free tumor concentrations of the active metabolite inhibit Raf kinase in the signaling cascade and thereby cell cycle progression. In a virtual clinical study, the individual therapeutic outcome of the chemotherapeutic intervention is simulated for a large population with heterogeneous genomic background. Thereby, the platform allows efficient model building and integration of biological knowledge and prior data from all biological scales. Experimental in vitro model systems can be linked with observations in animal experiments and clinical trials. The interplay between patients, diseases, and drugs and topics with high clinical relevance such as the role of pharmacogenomics, drug–drug, or drug–metabolite interactions can be addressed using this mechanistic, insight-driven multiscale modeling approach. PMID:21483730
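
    A deliberately simplified sketch can illustrate the kind of cross-scale coupling described above: a one-compartment prodrug/active-metabolite pharmacokinetic model whose metabolite concentration inhibits a Raf-dependent tumor growth term. This is illustrative only and is not the PK-Sim®/MoBi® implementation; the rate constants, the Emax-type inhibition term, and the function name are all hypothetical.

      # Minimal multiscale PK/PD sketch in Python (hypothetical parameters, not PK-Sim/MoBi):
      # hepatic conversion of a prodrug to its active metabolite, whose free concentration
      # inhibits Raf-driven tumor growth.
      def simulate(dose=100.0, days=28.0, dt=0.01):
          k_met, k_elim = 0.5, 0.3        # 1/day: metabolization, elimination
          k_grow, ic50 = 0.05, 2.0        # tumor growth rate (1/day), inhibition constant
          prodrug, metab, tumor = dose, 0.0, 1.0
          for _ in range(int(days / dt)):
              inhibition = metab / (ic50 + metab)          # fraction of Raf signaling blocked
              d_pro = -k_met * prodrug                     # prodrug converted in the liver
              d_met = k_met * prodrug - k_elim * metab     # active metabolite appears and is cleared
              d_tum = k_grow * tumor * (1.0 - inhibition)  # growth slows as the pathway is blocked
              prodrug += d_pro * dt
              metab += d_met * dt
              tumor += d_tum * dt
          return tumor

      # A "virtual population" then amounts to re-running the model over sampled parameters.
      if __name__ == "__main__":
          print([round(simulate(dose=d), 3) for d in (0.0, 50.0, 100.0, 200.0)])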

  1. HTAPP: High-Throughput Autonomous Proteomic Pipeline

    PubMed Central

    Yu, Kebing; Salomon, Arthur R.

    2011-01-01

    Recent advances in the speed and sensitivity of mass spectrometers and in analytical methods, the exponential acceleration of computer processing speeds, and the availability of genomic databases from an array of species and protein information databases have led to a deluge of proteomic data. The development of a lab-based automated proteomic software platform for the automated collection, processing, storage, and visualization of expansive proteomic datasets is critically important. The high-throughput autonomous proteomic pipeline (HTAPP) described here is designed from the ground up to provide critically important flexibility for diverse proteomic workflows and to streamline the total analysis of a complex proteomic sample. This tool is comprised of software that controls the acquisition of mass spectral data along with automation of post-acquisition tasks such as peptide quantification, clustered MS/MS spectral database searching, statistical validation, and data exploration within a user-configurable lab-based relational database. The software design of HTAPP focuses on accommodating diverse workflows and providing missing software functionality to a wide range of proteomic researchers to accelerate the extraction of biological meaning from immense proteomic data sets. Although individual software modules in our integrated technology platform may have some similarities to existing tools, the true novelty of the approach described here is in the synergistic and flexible combination of these tools to provide an integrated and efficient analysis of proteomic samples. PMID:20336676
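
    A lab-scale pipeline of the kind described here (automated post-acquisition steps whose results land in a relational database) can be sketched briefly in Python. The stage names, table layout, and function below are hypothetical and are not HTAPP's actual schema or code.

      import sqlite3

      # Hypothetical sketch of an automated post-acquisition pipeline that records each
      # stage's outcome in a relational database; not HTAPP's schema or implementation.
      STAGES = ["peptide_quantification", "spectral_db_search", "statistical_validation"]

      def run_pipeline(sample_id, db_path=":memory:"):
          con = sqlite3.connect(db_path)
          con.execute("CREATE TABLE IF NOT EXISTS results (sample TEXT, stage TEXT, status TEXT)")
          for stage in STAGES:
              status = "ok"  # a real stage would launch the corresponding analysis tool here
              con.execute("INSERT INTO results VALUES (?, ?, ?)", (sample_id, stage, status))
          con.commit()
          return con.execute("SELECT stage, status FROM results WHERE sample = ?",
                             (sample_id,)).fetchall()

      if __name__ == "__main__":
          print(run_pipeline("phospho_sample_001"))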

  2. AnyWave: a cross-platform and modular software for visualizing and processing electrophysiological signals.

    PubMed

    Colombet, B; Woodman, M; Badier, J M; Bénar, C G

    2015-03-15

    The importance of digital signal processing in clinical neurophysiology is growing steadily, involving clinical researchers and methodologists. There is a need for crossing the gap between these communities by providing efficient delivery of newly designed algorithms to end users. We have developed such a tool which both visualizes and processes data and, additionally, acts as a software development platform. AnyWave was designed to run on all common operating systems. It provides access to a variety of data formats and it employs high fidelity visualization techniques. It also allows using external tools as plug-ins, which can be developed in languages including C++, MATLAB and Python. In the current version, plug-ins allow computation of connectivity graphs (non-linear correlation h2) and time-frequency representation (Morlet wavelets). The software is freely available under the LGPL3 license. AnyWave is designed as an open, highly extensible solution, with an architecture that permits rapid delivery of new techniques to end users. We have developed AnyWave software as an efficient neurophysiological data visualizer able to integrate state of the art techniques. AnyWave offers an interface well suited to the needs of clinical research and an architecture designed for integrating new tools. We expect this software to strengthen the collaboration between clinical neurophysiologists and researchers in biomedical engineering and signal processing. Copyright © 2015 Elsevier B.V. All rights reserved.
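
    As an illustration of the kind of computation such a plug-in performs, the sketch below computes a Morlet-wavelet time-frequency representation of a single channel in Python. It shows the generic technique only and does not use AnyWave's actual plug-in API; the function name and parameters are assumptions.

      import numpy as np

      def morlet_tfr(signal, fs, freqs, n_cycles=7.0):
          """Time-frequency power via convolution with complex Morlet wavelets (generic sketch)."""
          power = np.empty((len(freqs), len(signal)))
          for i, f in enumerate(freqs):
              sigma_t = n_cycles / (2.0 * np.pi * f)            # temporal width of the wavelet
              t = np.arange(-3 * sigma_t, 3 * sigma_t, 1.0 / fs)
              wavelet = np.exp(2j * np.pi * f * t) * np.exp(-t**2 / (2 * sigma_t**2))
              wavelet /= np.sqrt(np.sum(np.abs(wavelet) ** 2))  # energy normalisation
              analytic = np.convolve(signal, wavelet, mode="same")
              power[i] = np.abs(analytic) ** 2
          return power

      if __name__ == "__main__":
          fs = 256.0
          t = np.arange(0, 2.0, 1.0 / fs)
          x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(len(t))
          print(morlet_tfr(x, fs, freqs=np.arange(5, 40, 1.0)).shape)  # (n_freqs, n_samples)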

  3. The SCEC Broadband Platform: Open-Source Software for Strong Ground Motion Simulation and Validation

    NASA Astrophysics Data System (ADS)

    Silva, F.; Goulet, C. A.; Maechling, P. J.; Callaghan, S.; Jordan, T. H.

    2016-12-01

    The Southern California Earthquake Center (SCEC) Broadband Platform (BBP) is a carefully integrated collection of open-source scientific software programs that can simulate broadband (0-100 Hz) ground motions for earthquakes at regional scales. The BBP can run earthquake rupture and wave propagation modeling software to simulate ground motions for well-observed historical earthquakes and to quantify how well the simulated broadband seismograms match the observed seismograms. The BBP can also run simulations for hypothetical earthquakes. In this case, users input an earthquake location and magnitude description, a list of station locations, and a 1D velocity model for the region of interest, and the BBP software then calculates ground motions for the specified stations. The BBP scientific software modules implement kinematic rupture generation, low- and high-frequency seismogram synthesis using wave propagation through 1D layered velocity structures, several ground motion intensity measure calculations, and various ground motion goodness-of-fit tools. These modules are integrated into a software system that provides user-defined, repeatable calculation of ground-motion seismograms using multiple alternative ground motion simulation methods, along with software utilities to generate tables, plots, and maps. The BBP has been developed over the last five years in a collaborative project involving geoscientists, earthquake engineers, graduate students, and SCEC scientific software developers. The SCEC BBP software released in 2016 can be compiled and run on recent Linux and Mac OS X systems with GNU compilers. It includes five simulation methods, seven simulation regions covering California, Japan, and Eastern North America, and the ability to compare simulation results against empirical ground motion models (also known as GMPEs). The latest version includes updated ground motion simulation methods, a suite of new validation metrics and a simplified command line user interface.
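
    The module chaining described above can be pictured with a small orchestration sketch in Python. The stage names follow the abstract, but the functions, their signatures, and the shared-state convention are hypothetical and are not the actual Broadband Platform code.

      # Hypothetical orchestration sketch of a BBP-style module chain.
      def run_scenario(stages, config):
          state = dict(config)
          for stage in stages:
              state.update(stage(state))   # each stage reads the shared state and adds its outputs
          return state

      def rupture_generator(state):   return {"rupture": f"kinematic rupture for M{state['magnitude']}"}
      def lf_synthesis(state):        return {"lf_seis": "low-frequency seismograms (1D velocity model)"}
      def hf_synthesis(state):        return {"hf_seis": "high-frequency seismograms"}
      def intensity_measures(state):  return {"psa": "pseudo-spectral accelerations per station"}
      def goodness_of_fit(state):     return {"gof": "bias versus observations or empirical models"}

      if __name__ == "__main__":
          result = run_scenario(
              [rupture_generator, lf_synthesis, hf_synthesis, intensity_measures, goodness_of_fit],
              {"magnitude": 6.7, "stations": ["STA1", "STA2"], "velocity_model": "LA basin 1D"},
          )
          print(sorted(result.keys()))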

  4. GABBs: Cyberinfrastructure for Self-Service Geospatial Data Exploration, Computation, and Sharing

    NASA Astrophysics Data System (ADS)

    Song, C. X.; Zhao, L.; Biehl, L. L.; Merwade, V.; Villoria, N.

    2016-12-01

    Geospatial data are present everywhere today with the proliferation of location-aware computing devices. This is especially true in the scientific community where large amounts of data are driving research and education activities in many domains. Collaboration over geospatial data, for example, in modeling, data analysis and visualization, must still overcome the barriers of specialized software and expertise among other challenges. In addressing these needs, the Geospatial data Analysis Building Blocks (GABBs) project aims at building geospatial modeling, data analysis and visualization capabilities in an open source web platform, HUBzero. Funded by NSF's Data Infrastructure Building Blocks initiative, GABBs is creating a geospatial data architecture that integrates spatial data management, mapping and visualization, and interfaces in the HUBzero platform for scientific collaborations. The geo-rendering enabled Rappture toolkit, a generic Python mapping library, geospatial data exploration and publication tools, and an integrated online geospatial data management solution are among the software building blocks from the project. The GABBs software will be available as open source and through Amazon AWS Marketplace VM images. Hosting services are also available to the user community. The outcome of the project will enable researchers and educators to self-manage their scientific data, rapidly create GIS-enabled tools, share geospatial data and tools on the web, and build dynamic workflows connecting data and tools, all without requiring significant software development skills, GIS expertise or IT administrative privileges. This presentation will describe the GABBs architecture, toolkits and libraries, and showcase the scientific use cases that utilize GABBs capabilities, as well as the challenges and solutions for GABBs to interoperate with other cyberinfrastructure platforms.

  5. Advanced Wireless Integrated Navy Network - AWINN

    DTIC Science & Technology

    2005-09-30

    progress report No. 3 on AWINN hardware and software configurations of smart, wideband, multi-function antennas, secure configurable platform, close-in...results to the host PC via a UART soft core. The UART core used is a proprietary Xilinx core which incorporates features described in National...current software uses wheel odometry and visual landmarks to create a map and estimate position on an internal x, y grid. The wheel odometry provides a

  6. Tile-in-ONE: A web platform which integrates Tile Calorimeter data quality and calibration assessment

    NASA Astrophysics Data System (ADS)

    Sivolella, A.; Ferreira, F.; Maidantchik, C.; Solans, C.; Solodkov, A.; Burghgrave, B.; Smirnov, Y.

    2015-12-01

    The ATLAS Tile Calorimeter collaboration assesses the quality of calibration data in order to ensure its proper operation. A number of tasks are then performed by executing several tools and accessing web systems, which were independently developed to meet distinct collaboration requirements and are not necessarily connected with each other. Thus, to meet the collaboration's needs, several programs are usually implemented without a global perspective of the detector, each requiring basic software features. In addition, functionalities may overlap in their objectives and frequently replicate resource-retrieval mechanisms. Tile-in-ONE is a platform designed and implemented to assemble the various web systems used by the calorimeter community into a single framework built on standard technology. It provides an infrastructure to support code implementation, avoiding duplication of work while offering an overall view of the detector status. Database connectors smooth the process of information access, since developers do not need to be aware of where records are placed and how to extract them. Within the environment, a dashboard stands for a particular Tile operation aspect and brings together plug-ins, i.e. software components that add specific features to an existing application. A server contains the platform core, which represents the basic environment to deal with the configuration, manage user settings and load plug-ins at runtime. A web middleware assists users in developing their own plug-ins, performing tests and integrating them into the platform as a whole. Backends are employed to allow any type of application to be interpreted and displayed in a uniform way. This paper describes the Tile-in-ONE web platform.
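
    The plug-in and dashboard architecture described above can be pictured with a minimal registry sketch in Python: plug-ins register themselves against a named dashboard and the platform core loads and renders them at runtime. The decorator, registry, and plug-in names below are hypothetical and are not the actual Tile-in-ONE code.

      # Hypothetical plug-in registry sketch (not the Tile-in-ONE implementation).
      _REGISTRY = {}

      def plugin(dashboard):
          """Decorator that attaches a plug-in function to a named dashboard."""
          def register(func):
              _REGISTRY.setdefault(dashboard, []).append(func)
              return func
          return register

      @plugin("calibration")
      def cesium_scan_summary():
          return "latest calibration constants per module"

      @plugin("data-quality")
      def noisy_channel_report():
          return "channels flagged as noisy in the last run"

      def render_dashboard(dashboard):
          for plug in _REGISTRY.get(dashboard, []):
              print(f"[{dashboard}] {plug.__name__}: {plug()}")

      if __name__ == "__main__":
          render_dashboard("calibration")
          render_dashboard("data-quality")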

  7. A Web-Based Data Collection Platform for Multisite Randomized Behavioral Intervention Trials: Development, Key Software Features, and Results of a User Survey.

    PubMed

    Modi, Riddhi A; Mugavero, Michael J; Amico, Rivet K; Keruly, Jeanne; Quinlivan, Evelyn Byrd; Crane, Heidi M; Guzman, Alfredo; Zinski, Anne; Montue, Solange; Roytburd, Katya; Church, Anna; Willig, James H

    2017-06-16

    Meticulous tracking of study data must begin early in the study recruitment phase and must account for regulatory compliance, minimize missing data, and provide high information integrity and/or reduction of errors. In behavioral intervention trials, participants typically complete several study procedures at different time points. Among HIV-infected patients, behavioral interventions can favorably affect health outcomes. In order to empower newly diagnosed HIV positive individuals to learn skills to enhance retention in HIV care, we developed the behavioral health intervention Integrating ENGagement and Adherence Goals upon Entry (iENGAGE) funded by the National Institute of Allergy and Infectious Diseases (NIAID), where we deployed an in-clinic behavioral health intervention in 4 urban HIV outpatient clinics in the United States. To scale our intervention strategy homogenously across sites, we developed software that would function as a behavioral sciences research platform. This manuscript aimed to: (1) describe the design and implementation of a Web-based software application to facilitate deployment of a multisite behavioral science intervention; and (2) report on results of a survey to capture end-user perspectives of the impact of this platform on the conduct of a behavioral intervention trial. In order to support the implementation of the NIAID-funded trial iENGAGE, we developed software to deploy a 4-site behavioral intervention for new clinic patients with HIV/AIDS. We integrated the study coordinator into the informatics team to participate in the software development process. Here, we report the key software features and the results of the 25-item survey to evaluate user perspectives on research and intervention activities specific to the iENGAGE trial (N=13). The key features addressed are study enrollment, participant randomization, real-time data collection, facilitation of longitudinal workflow, reporting, and reusability. We found 100% user agreement (13/13) that participation in the database design and/or testing phase made it easier to understand user roles and responsibilities and recommended participation of research teams in developing databases for future studies. Users acknowledged ease of use, color flags, longitudinal workflow, and data storage in one location as the most useful features of the software platform, and issues related to saving participant forms, security restrictions, and worklist layout as the least useful features. The successful development of the iENGAGE behavioral science research platform validated an approach of early and continuous involvement of the study team in design development. In addition, we recommend post-hoc collection of data from users, as this led to important insights on how to enhance future software and inform standard clinical practices. Clinicaltrials.gov NCT01900236; https://clinicaltrials.gov/ct2/show/NCT01900236 (Archived by WebCite at http://www.webcitation.org/6qAa8ld7v). ©Riddhi A Modi, Michael J Mugavero, Rivet K Amico, Jeanne Keruly, Evelyn Byrd Quinlivan, Heidi M Crane, Alfredo Guzman, Anne Zinski, Solange Montue, Katya Roytburd, Anna Church, James H Willig. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 16.06.2017.
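
    Participant randomization of the kind mentioned above is commonly implemented as permuted blocks stratified by site. The Python sketch below illustrates that generic technique only; it is not the iENGAGE software, and the arm names, block size, and seeds are placeholders.

      import random

      def permuted_block_randomizer(arms=("intervention", "control"), block_size=4, seed=42):
          """Yield arm assignments in permuted blocks (generic technique, not iENGAGE code)."""
          rng = random.Random(seed)
          assert block_size % len(arms) == 0
          while True:
              block = list(arms) * (block_size // len(arms))
              rng.shuffle(block)
              for arm in block:
                  yield arm

      if __name__ == "__main__":
          # One independent randomization stream per clinic keeps site strata balanced.
          sites = {s: permuted_block_randomizer(seed=i) for i, s in enumerate(["A", "B", "C", "D"])}
          for pid, site in enumerate(["A", "A", "B", "C", "D", "B"]):
              print(pid, site, next(sites[site]))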

  8. SeWeR: a customizable and integrated dynamic HTML interface to bioinformatics services.

    PubMed

    Basu, M K

    2001-06-01

    Sequence analysis using Web Resources (SeWeR) is an integrated, Dynamic HTML (DHTML) interface to commonly used bioinformatics services available on the World Wide Web. It is highly customizable, extendable, platform neutral, completely server-independent and can be hosted as a web page as well as being used as stand-alone software running within a web browser.

  9. Social Sensors (S2ensors): A Kind of Hardware-Software-Integrated Mediators for Social Manufacturing Systems Under Mass Individualization

    NASA Astrophysics Data System (ADS)

    Ding, Kai; Jiang, Ping-Yu

    2017-09-01

    Currently, little work has been devoted to the mediators and tools for multi-role production interactions in the mass individualization environment. This paper proposes a kind of hardware-software-integrated mediators called social sensors (S2ensors) to facilitate the production interactions among customers, manufacturers, and other stakeholders in social manufacturing systems (SMS). The concept, classification, operational logics, and formalization of S2ensors are clarified. S2ensors collect objective data from physical sensors and subjective data from sensory input in mobile Apps, merge them into meaningful information for decision-making, and finally feed the decisions back for reaction and execution. Then, an S2ensors-Cloud platform is discussed to integrate different S2ensors to work for SMSs in an autonomous way. A demonstrative case is studied by developing a prototype system, and the results show that S2ensors and the S2ensors-Cloud platform can assist multi-role stakeholders in interacting and collaborating on production tasks. It reveals the mediator-enabled mechanisms and methods for production interactions among stakeholders in SMS.

  10. OpenSatKit Enables Quick Startup for CubeSat Missions

    NASA Technical Reports Server (NTRS)

    McComas, David; Melton, Ryan

    2017-01-01

    The software required to develop, integrate, and operate a spacecraft is substantial regardless of whether it is a large or small satellite. Even getting started can be a monumental task. To solve this problem, NASA's Core Flight System (cFS), NASA's 42 spacecraft dynamics simulator, and Ball Aerospace's COSMOS ground system have been integrated together into a kit called OpenSatKit that provides a complete and open source software solution for starting a new satellite mission. Users can have a working system with flight software, dynamics simulation, and a ground command and control system up and running within hours. Every satellite mission requires three primary categories of software to function. The first is Flight Software (FSW), which provides the onboard control of the satellite and its payload(s). NASA's cFS provides a great platform for developing this software. Second, while developing a satellite on earth, it is necessary to simulate the satellite's orbit, attitude, and actuators, to ensure that the systems that control these aspects will work correctly in the real environment. NASA's 42 simulator provides these functionalities. Finally, the ground has to be able to communicate with the satellite, monitor its performance and health, and display its data. Additionally, test scripts have to be written to verify the system on the ground. Ball Aerospace's COSMOS command and control system provides this functionality. Once OpenSatKit is up and running, the next step is to customize the platform and get it running on the end target. Starting from a fully working system makes porting the cFS from Linux to a user's platform much easier. An example Raspberry Pi target is included in the kit so users can gain experience working with a low cost hardware target. All users can benefit from OpenSatKit, but the greatest impact and benefits will be for SmallSat missions with constrained budgets and small software teams. This paper describes OpenSatKit's system design, the steps necessary to retarget the system to the Raspberry Pi, and future plans. OpenSatKit is a free, fully functional spacecraft software system that we hope will greatly benefit the SmallSat community.

  11. Serving the enterprise and beyond with informatics for integrating biology and the bedside (i2b2)

    PubMed Central

    Weber, Griffin; Mendis, Michael; Gainer, Vivian; Chueh, Henry C; Churchill, Susanne; Kohane, Isaac

    2010-01-01

    Informatics for Integrating Biology and the Bedside (i2b2) is one of seven projects sponsored by the NIH Roadmap National Centers for Biomedical Computing (http://www.ncbcs.org). Its mission is to provide clinical investigators with the tools necessary to integrate medical record and clinical research data in the genomics age, a software suite to construct and integrate the modern clinical research chart. i2b2 software may be used by an enterprise's research community to find sets of interesting patients from electronic patient medical record data, while preserving patient privacy through a query tool interface. Project-specific mini-databases (“data marts”) can be created from these sets to make highly detailed data available on these specific patients to the investigators on the i2b2 platform, as reviewed and restricted by the Institutional Review Board. The current version of this software has been released into the public domain and is available at the URL: http://www.i2b2.org/software. PMID:20190053

  12. Component-based integration of chemistry and optimization software.

    PubMed

    Kenny, Joseph P; Benson, Steven J; Alexeev, Yuri; Sarich, Jason; Janssen, Curtis L; McInnes, Lois Curfman; Krishnan, Manojkumar; Nieplocha, Jarek; Jurrus, Elizabeth; Fahlstrom, Carl; Windus, Theresa L

    2004-11-15

    Typical scientific software designs make rigid assumptions regarding programming language and data structures, frustrating software interoperability and scientific collaboration. Component-based software engineering is an emerging approach to managing the increasing complexity of scientific software. Component technology facilitates code interoperability and reuse. Through the adoption of methodology and tools developed by the Common Component Architecture Forum, we have developed a component architecture for molecular structure optimization. Using the NWChem and Massively Parallel Quantum Chemistry packages, we have produced chemistry components that provide capacity for energy and energy derivative evaluation. We have constructed geometry optimization applications by integrating the Toolkit for Advanced Optimization, Portable Extensible Toolkit for Scientific Computation, and Global Arrays packages, which provide optimization and linear algebra capabilities. We present a brief overview of the component development process and a description of abstract interfaces for chemical optimizations. The components conforming to these abstract interfaces allow the construction of applications using different chemistry and mathematics packages interchangeably. Initial numerical results for the component software demonstrate good performance, and highlight potential research enabled by this platform.
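
    The abstract interfaces described above can be pictured with a small sketch: a language-neutral "model evaluator" interface that any chemistry package could implement and any optimizer component could consume. The class names, the toy model, and the optimizer below are illustrative assumptions, not the actual Common Component Architecture specifications.

      from abc import ABC, abstractmethod
      import numpy as np

      class EnergyEvaluator(ABC):
          """Illustrative stand-in for a CCA-style chemistry component interface."""
          @abstractmethod
          def energy(self, x): ...
          @abstractmethod
          def gradient(self, x): ...

      class HarmonicToyModel(EnergyEvaluator):
          """Trivial 'chemistry package' so the optimizer component has something to drive."""
          def energy(self, x):
              return float(np.sum((x - 1.0) ** 2))
          def gradient(self, x):
              return 2.0 * (x - 1.0)

      def steepest_descent(model, x0, step=0.1, tol=1e-8, max_iter=1000):
          """Optimizer component: it sees only the abstract interface, so packages are interchangeable."""
          x = np.asarray(x0, dtype=float)
          for _ in range(max_iter):
              g = model.gradient(x)
              if np.linalg.norm(g) < tol:
                  break
              x = x - step * g
          return x, model.energy(x)

      if __name__ == "__main__":
          print(steepest_descent(HarmonicToyModel(), x0=[0.0, 3.0, -2.0]))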

  13. FVMS: A novel SiL approach on the evaluation of controllers for autonomous MAV

    NASA Astrophysics Data System (ADS)

    Sampaio, Rafael C. B.; Becker, Marcelo; Siqueira, Adriano A. G.; Freschi, Leonardo W.; Montanher, Marcelo P.

    This work proposes a novel SiL (Software-in-the-Loop) platform using Microsoft Flight Simulator (MSFS) to assist control design for the stabilization problem found in the AscTec Pelican platform. The Aerial Robots Team (USP/EESC/LabRoM/ART) has developed custom C++/C# software named FVMS (Flight Variables Management System) that interfaces the communication between the virtual Pelican and the control algorithms, allowing the control designer to run full closed-loop algorithms in real time. Emulation of embedded sensors, as well as the possibility of integrating OpenCV optical flow algorithms with a virtual downward-facing camera, makes the SiL even more reliable. More than a strictly numeric analysis, the proposed SiL platform offers a unique experience, providing both dynamic and graphical responses simultaneously. The performance of SiL algorithms is presented and discussed.
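
    The closed SiL loop described above (read simulator state, compute a control command, write it back, every cycle) reduces to a simple read-compute-write cycle. The Python sketch below is purely illustrative: the interface functions stand in for the simulator link (the real FVMS talks to MSFS), and the PD gains are made up.

      import time

      # Placeholder stubs for the simulator interface; names and gains are hypothetical.
      def read_sim_state():
          return {"roll_deg": 3.0, "roll_rate_dps": 0.5}

      def write_sim_command(cmd):
          print("aileron command:", round(cmd, 4))

      def pd_roll_controller(state, kp=0.02, kd=0.01, roll_setpoint_deg=0.0):
          error = roll_setpoint_deg - state["roll_deg"]
          return kp * error - kd * state["roll_rate_dps"]

      def software_in_the_loop(cycles=5, period_s=0.02):
          for _ in range(cycles):            # fixed-rate closed loop, like a flight control task
              state = read_sim_state()
              write_sim_command(pd_roll_controller(state))
              time.sleep(period_s)

      if __name__ == "__main__":
          software_in_the_loop()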

  14. Design distributed simulation platform for vehicle management system

    NASA Astrophysics Data System (ADS)

    Wen, Zhaodong; Wang, Zhanlin; Qiu, Lihua

    2006-11-01

    Next-generation military aircraft require high performance from the airborne management system. General modules, data integration, a high-speed data bus, and so on are needed to share and manage information from the subsystems efficiently. The subsystems include the flight control system, propulsion system, hydraulic power system, environmental control system, fuel management system, electrical power system and so on. The previously separate or mixed architecture is changed to an integrated architecture. That means the whole airborne system is treated as one system to manage. So the physical devices are distributed, but the system information is integrated and shared. The processing functions of each subsystem are integrated (including general processing modules and dynamic reconfiguration); furthermore, the sensors and the signal processing functions are shared. On the other hand, this is a foundation for power sharing. A distributed vehicle management system is established using a 1553B bus and distributed processors, which provides a validation platform for research on airborne system integrated management. This paper establishes the Vehicle Management System (VMS) simulation platform, discusses the software and hardware configuration, and analyzes the communication and fault-tolerance methods.

  15. Software-defined networking control plane for seamless integration of multiple silicon photonic switches in Datacom networks.

    PubMed

    Shen, Yiwen; Hattink, Maarten H N; Samadi, Payman; Cheng, Qixiang; Hu, Ziyiz; Gazman, Alexander; Bergman, Keren

    2018-04-16

    Silicon photonics-based switches offer an effective option for the delivery of dynamic bandwidth for future large-scale Datacom systems while maintaining scalable energy efficiency. The integration of a silicon photonics-based optical switching fabric within electronic Datacom architectures requires novel network topologies and arbitration strategies to effectively manage the active elements in the network. We present a scalable software-defined networking control plane to integrate silicon photonics-based switches with conventional Ethernet or InfiniBand networks. Our software-defined control plane manages both electronic packet switches and multiple silicon photonic switches for simultaneous packet and circuit switching. We built an experimental Dragonfly network testbed with 16 electronic packet switches and 2 silicon photonic switches to evaluate our control plane. Observed latencies for each step of the switching procedure demonstrate a total control plane latency of 344 µs for data-center and high-performance computing platforms.
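
    A control plane that steers traffic either through electronic packet switches or a photonic circuit can be sketched as a simple placement policy: long-lived, high-volume flows are offloaded to a circuit, everything else stays on the packet network. The thresholds, flow record, and function below are hypothetical and are not the authors' controller.

      from dataclasses import dataclass

      @dataclass
      class Flow:
          src: str
          dst: str
          bytes_per_s: float
          expected_duration_s: float

      # Hypothetical placement policy: offload heavy, long-lived flows to an optical circuit
      # (whose setup latency must be amortized) and keep the rest on electronic packet switches.
      CIRCUIT_RATE_THRESHOLD = 1e9      # bytes/s
      CIRCUIT_DURATION_THRESHOLD = 1.0  # seconds

      def place_flow(flow):
          if (flow.bytes_per_s >= CIRCUIT_RATE_THRESHOLD
                  and flow.expected_duration_s >= CIRCUIT_DURATION_THRESHOLD):
              return "photonic-circuit"
          return "electronic-packet"

      if __name__ == "__main__":
          for f in [Flow("n1", "n7", 2e9, 5.0), Flow("n2", "n3", 1e6, 0.02)]:
              print(f, "->", place_flow(f))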

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Akyol, Bora A.; Allwardt, Craig H.; Beech, Zachary W.

    VOLTTRON is a flexible, reliable, and scalable platform for distributed control and sensing. VOLTTRON serves in four primary roles: •A reference platform for researchers to quickly develop control applications for transactive energy. •A reference platform with flexible data store support for energy analytics applications either in academia or in commercial enterprise. •A platform from which commercial enterprise can develop products without license issues and easily integrate into their product line. •An accelerator to drive industry adoption of transactive energy and advanced building energy analytics. Pacific Northwest National Laboratory, with funding from the U.S. Department of Energy’s Building Technologies Office, developed and maintains VOLTTRON as an open-source community project. VOLTTRON source code includes agent execution software; agents that perform critical services that enable and enhance VOLTTRON functionality; and numerous agents that utilize the platform to perform a specific function (fault detection, demand response, etc.). The platform supports energy, operational, and financial transactions between networked entities (equipment, organizations, buildings, grid, etc.) and enhances the control infrastructure of existing buildings through the use of open-source device communication, control protocols, and integrated analytics.

  17. System perspectives for mobile platform design in m-Health

    NASA Astrophysics Data System (ADS)

    Roveda, Janet M.; Fink, Wolfgang

    2016-05-01

    Advances in integrated circuit technologies have led to the integration of medical sensor front ends with data processing circuits, i.e., mobile platform design for wearable sensors. We discuss design methodologies for wearable sensor nodes and their applications in m-Health. From the user perspective, flexibility, comfort, appearance, fashion, ease-of-use, and visibility are key form factors. From the technology development point of view, high accuracy, low power consumption, and high signal to noise ratio are desirable features. From the embedded software design standpoint, real time data analysis algorithms, application and database interfaces are the critical components to create successful wearable sensor-based products.

  18. A CCD experimental platform for large telescope in Antarctica based on FPGA

    NASA Astrophysics Data System (ADS)

    Zhu, Yuhua; Qi, Yongjun

    2014-07-01

    The CCD, as a detector, is one of the important components of astronomical telescopes. For a large telescope in Antarctica, a CCD detector system with large size, high sensitivity and low noise is indispensable. Because of the extremely low temperatures and unattended operation, system maintenance and software and hardware upgrades become hard problems. This paper introduces a general CCD controller experimental platform based on a field-programmable gate array (FPGA), which is, in effect, a large-scale field-reconfigurable array. Taking advantage of how easily the system can be modified, the driving circuit, digital signal processing module, network communication interface, control algorithm validation, and remote reconfiguration module can all be realized. With the concept of integrated hardware and software, the paper discusses the key technologies for building a scientific CCD system suited to the special working environment in Antarctica, focusing on the method of remote reconfiguration of the controller via network, and then offers a feasible hardware and software solution.

  19. Parallel Domain Decomposition Formulation and Software for Large-Scale Sparse Symmetrical/Unsymmetrical Aeroacoustic Applications

    NASA Technical Reports Server (NTRS)

    Nguyen, D. T.; Watson, Willie R. (Technical Monitor)

    2005-01-01

    The overall objectives of this research work are to formulate and validate efficient parallel algorithms, and to efficiently design and implement computer software for solving large-scale acoustic problems arising from the unified frameworks of finite element procedures. The adopted parallel Finite Element (FE) Domain Decomposition (DD) procedures should take full advantage of the multiple processing capabilities offered by most modern high performance computing platforms for efficient parallel computation. To achieve this objective, the formulation needs to integrate efficient sparse (and dense) assembly techniques, hybrid (or mixed) direct and iterative equation solvers, proper preconditioning strategies, unrolling strategies, and effective interprocessor communication schemes. Finally, the numerical performance of the developed parallel finite element procedures will be evaluated by solving a series of structural and acoustic (symmetric and unsymmetric) problems on different computing platforms. Comparisons with existing "commercialized" and/or "public domain" software are also included, whenever possible.

  20. Knowledge Management tools integration within DLR's concurrent engineering facility

    NASA Astrophysics Data System (ADS)

    Lopez, R. P.; Soragavi, G.; Deshmukh, M.; Ludtke, D.

    The complexity of space endeavors has increased the need for Knowledge Management (KM) tools. The concept of KM involves not only the electronic storage of knowledge, but also the process of making this knowledge available, reusable and traceable. Establishing a KM concept within the Concurrent Engineering Facility (CEF) has been a research topic of the German Aerospace Centre (DLR). This paper presents the current KM tools of the CEF: the Software Platform for Organizing and Capturing Knowledge (S.P.O.C.K.), the data model Virtual Satellite (VirSat), and the Simulation Model Library (SimMoLib), and how their usage has improved the Concurrent Engineering (CE) process. This paper also presents the lessons learned from the introduction of KM practices into the CEF and elaborates a roadmap for the further development of KM in CE activities at DLR. The results of the application of the Knowledge Management tools have shown the potential of merging the three software platforms and their functionalities as the next step towards the full integration of KM practices into the CE process. VirSat will stay as the main software platform used within a CE study, and S.P.O.C.K. and SimMoLib will be integrated into VirSat. These tools will support the data model as a reference and documentation source, and as an access point to simulation and calculation models. The use of KM tools in the CEF aims to become a basic practice during the CE process. The establishment of this practice will result in a much broader exchange of knowledge and experience within the Concurrent Engineering environment and, consequently, the outcome of the studies will be of higher quality in the design of space systems.

  1. GIFT-Cloud: A data sharing and collaboration platform for medical imaging research.

    PubMed

    Doel, Tom; Shakir, Dzhoshkun I; Pratt, Rosalind; Aertsen, Michael; Moggridge, James; Bellon, Erwin; David, Anna L; Deprest, Jan; Vercauteren, Tom; Ourselin, Sébastien

    2017-02-01

    Clinical imaging data are essential for developing research software for computer-aided diagnosis, treatment planning and image-guided surgery, yet existing systems are poorly suited for data sharing between healthcare and academia: research systems rarely provide an integrated approach for data exchange with clinicians; hospital systems are focused towards clinical patient care with limited access for external researchers; and safe haven environments are not well suited to algorithm development. We have established GIFT-Cloud, a data and medical image sharing platform, to meet the needs of GIFT-Surg, an international research collaboration that is developing novel imaging methods for fetal surgery. GIFT-Cloud also has general applicability to other areas of imaging research. GIFT-Cloud builds upon well-established cross-platform technologies. The Server provides secure anonymised data storage, direct web-based data access and a REST API for integrating external software. The Uploader provides automated on-site anonymisation, encryption and data upload. Gateways provide a seamless process for uploading medical data from clinical systems to the research server. GIFT-Cloud has been implemented in a multi-centre study for fetal medicine research. We present a case study of placental segmentation for pre-operative surgical planning, showing how GIFT-Cloud underpins the research and integrates with the clinical workflow. GIFT-Cloud simplifies the transfer of imaging data from clinical to research institutions, facilitating the development and validation of medical research software and the sharing of results back to the clinical partners. GIFT-Cloud supports collaboration between multiple healthcare and research institutions while satisfying the demands of patient confidentiality, data security and data ownership. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  2. A Hardware Platform for Tuning of MEMS Devices Using Closed-Loop Frequency Response

    NASA Technical Reports Server (NTRS)

    Ferguson, Michael I.; MacDonald, Eric; Foor, David

    2005-01-01

    We report on the development of a hardware platform for integrated tuning and closed-loop operation of MEMS gyroscopes. The platform was developed and tested for the second generation JPL/Boeing Post-Resonator MEMS gyroscope. The control of this device is implemented through a digital design on a Field Programmable Gate Array (FPGA). A software interface allows the user to configure, calibrate, and tune the bias voltages on the micro-gyro. The interface easily transitions to an embedded solution that allows for the miniaturization of the system to a single chip.

  3. Prima Platform: A Scheme for Managing Equipment-Dependent Onboard Functions and Impacts on the Avionics Software Production Process

    NASA Astrophysics Data System (ADS)

    Candia, Sante; Lisio, Giovanni; Campolo, Giovanni; Pascucci, Dario

    2010-08-01

    The Avionics Software (ASW), in charge of controlling the Low Earth Orbit (LEO) Spacecraft PRIMA Platform (Piattaforma Ri-configurabile Italiana Multi-Applicativa), is evolving towards a highly modular and re-usable architecture based on an architectural framework allowing the effective integration of the software building blocks (SWBBs) providing the on-board control functions. During recent years, the PRIMA ASW design and production processes have been improved to reach the following objectives: (a) at PUS Services level, separation of the mission-independent software mechanisms from the mission-dependent configuration information; (b) at Application level, identification of mission-independent recurrent functions for promoting abstraction and obtaining a more efficient and safe ASW production, with positive implications also on the software validation activities. This paper is dedicated to the characterisation activity performed at Application level for a software component abstracting a set of functions for the generic On-Board Assembly (OBA), a set of hardware units used to deliver an on-board service. Moreover, the ASW production process is described to show how it changes after the introduction of the new design features.

  4. Friendly Extensible Transfer Tool Beta Version

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, William P.; Gutierrez, Kenneth M.; McRee, Susan R.

    2016-04-15

    Often data transfer software is designed to meet specific requirements or apply to specific environments. Frequently, this requires source code integration for added functionality. An extensible data transfer framework is needed to more easily incorporate new capabilities, in modular fashion. Using the FrETT framework, functionality may be incorporated (in many cases without need of source code) to handle new platform capabilities: I/O methods (e.g., platform-specific data access), network transport methods, and data processing (e.g., data compression).

  5. The MR-Base platform supports systematic causal inference across the human phenome

    PubMed Central

    Wade, Kaitlin H; Haberland, Valeriia; Baird, Denis; Laurin, Charles; Burgess, Stephen; Bowden, Jack; Langdon, Ryan; Tan, Vanessa Y; Yarmolinsky, James; Shihab, Hashem A; Timpson, Nicholas J; Evans, David M; Relton, Caroline; Martin, Richard M; Davey Smith, George

    2018-01-01

    Results from genome-wide association studies (GWAS) can be used to infer causal relationships between phenotypes, using a strategy known as 2-sample Mendelian randomization (2SMR) and bypassing the need for individual-level data. However, 2SMR methods are evolving rapidly and GWAS results are often insufficiently curated, undermining efficient implementation of the approach. We therefore developed MR-Base (http://www.mrbase.org): a platform that integrates a curated database of complete GWAS results (no restrictions according to statistical significance) with an application programming interface, web app and R packages that automate 2SMR. The software includes several sensitivity analyses for assessing the impact of horizontal pleiotropy and other violations of assumptions. The database currently comprises 11 billion single nucleotide polymorphism-trait associations from 1673 GWAS and is updated on a regular basis. Integrating data with software ensures more rigorous application of hypothesis-driven analyses and allows millions of potential causal relationships to be efficiently evaluated in phenome-wide association studies. PMID:29846171
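
    The core 2SMR calculation that such a platform automates can be illustrated directly: per-SNP Wald ratios combined by inverse-variance weighting. The Python sketch below shows the generic textbook estimator with made-up numbers; it is not MR-Base's R code, and first-order weights (which ignore the exposure standard errors) are assumed.

      import numpy as np

      def ivw_mr(beta_exposure, beta_outcome, se_outcome):
          """Inverse-variance-weighted 2-sample MR estimate from per-SNP summary statistics."""
          beta_exposure = np.asarray(beta_exposure, float)
          beta_outcome = np.asarray(beta_outcome, float)
          se_outcome = np.asarray(se_outcome, float)
          wald = beta_outcome / beta_exposure            # per-SNP causal effect (Wald ratio)
          weights = (beta_exposure / se_outcome) ** 2    # first-order inverse-variance weights
          estimate = np.sum(weights * wald) / np.sum(weights)
          se = 1.0 / np.sqrt(np.sum(weights))
          return estimate, se

      if __name__ == "__main__":
          est, se = ivw_mr(beta_exposure=[0.10, 0.08, 0.12],
                           beta_outcome=[0.05, 0.03, 0.07],
                           se_outcome=[0.02, 0.02, 0.03])
          print(f"IVW estimate = {est:.3f} (SE {se:.3f})")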

  6. The Digital Slide Archive: A Software Platform for Management, Integration, and Analysis of Histology for Cancer Research.

    PubMed

    Gutman, David A; Khalilia, Mohammed; Lee, Sanghoon; Nalisnik, Michael; Mullen, Zach; Beezley, Jonathan; Chittajallu, Deepak R; Manthey, David; Cooper, Lee A D

    2017-11-01

    Tissue-based cancer studies can generate large amounts of histology data in the form of glass slides. These slides contain important diagnostic, prognostic, and biological information and can be digitized into expansive and high-resolution whole-slide images using slide-scanning devices. Effectively utilizing digital pathology data in cancer research requires the ability to manage, visualize, share, and perform quantitative analysis on these large amounts of image data, tasks that are often complex and difficult for investigators with the current state of commercial digital pathology software. In this article, we describe the Digital Slide Archive (DSA), an open-source web-based platform for digital pathology. DSA allows investigators to manage large collections of histologic images and integrate them with clinical and genomic metadata. The open-source model enables DSA to be extended to provide additional capabilities. Cancer Res; 77(21); e75-78. ©2017 AACR . ©2017 American Association for Cancer Research.

  7. VirtualPlant: A Software Platform to Support Systems Biology Research

    PubMed Central

    Katari, Manpreet S.; Nowicki, Steve D.; Aceituno, Felipe F.; Nero, Damion; Kelfer, Jonathan; Thompson, Lee Parnell; Cabello, Juan M.; Davidson, Rebecca S.; Goldberg, Arthur P.; Shasha, Dennis E.; Coruzzi, Gloria M.; Gutiérrez, Rodrigo A.

    2010-01-01

    Data generation is no longer the limiting factor in advancing biological research. In addition, data integration, analysis, and interpretation have become key bottlenecks and challenges that biologists conducting genomic research face daily. To enable biologists to derive testable hypotheses from the increasing amount of genomic data, we have developed the VirtualPlant software platform. VirtualPlant enables scientists to visualize, integrate, and analyze genomic data from a systems biology perspective. VirtualPlant integrates genome-wide data concerning the known and predicted relationships among genes, proteins, and molecules, as well as genome-scale experimental measurements. VirtualPlant also provides visualization techniques that render multivariate information in visual formats that facilitate the extraction of biological concepts. Importantly, VirtualPlant helps biologists who are not trained in computer science to mine lists of genes, microarray experiments, and gene networks to address questions in plant biology, such as: What are the molecular mechanisms by which internal or external perturbations affect processes controlling growth and development? We illustrate the use of VirtualPlant with three case studies, ranging from querying a gene of interest to the identification of gene networks and regulatory hubs that control seed development. Whereas the VirtualPlant software was developed to mine Arabidopsis (Arabidopsis thaliana) genomic data, its data structures, algorithms, and visualization tools are designed in a species-independent way. VirtualPlant is freely available at www.virtualplant.org. PMID:20007449

  8. Knowledge-based assistance in costing the space station DMS

    NASA Technical Reports Server (NTRS)

    Henson, Troy; Rone, Kyle

    1988-01-01

    The Software Cost Engineering (SCE) methodology developed over the last two decades at IBM Systems Integration Division (SID) in Houston is utilized to cost the NASA Space Station Data Management System (DMS). An ongoing project to capture this methodology, which is built on a foundation of experiences and lessons learned, has resulted in the development of an internal-use-only, PC-based prototype that integrates algorithmic tools with knowledge-based decision support assistants. This prototype Software Cost Engineering Automation Tool (SCEAT) is being employed to assist in the DMS costing exercises. At the same time, DMS costing serves as a forcing function and provides a platform for the continuing, iterative development, calibration, and validation and verification of SCEAT. The data that forms the cost engineering database is derived from more than 15 years of development of NASA Space Shuttle software, ranging from low criticality, low complexity support tools to highly complex and highly critical onboard software.

  9. Large Scale Software Building with CMake in ATLAS

    NASA Astrophysics Data System (ADS)

    Elmsheuser, J.; Krasznahorkay, A.; Obreshkov, E.; Undrus, A.; ATLAS Collaboration

    2017-10-01

    The offline software of the ATLAS experiment at the Large Hadron Collider (LHC) serves as the platform for detector data reconstruction, simulation and analysis. It is also used in the detector’s trigger system to select LHC collision events during data taking. The ATLAS offline software consists of several million lines of C++ and Python code organized in a modular design of more than 2000 specialized packages. Because of different workflows, many stable numbered releases are in parallel production use. To accommodate specific workflow requests, software patches with modified libraries are distributed on top of existing software releases on a daily basis. The different ATLAS software applications also require a flexible build system that strongly supports unit and integration tests. Within the last year this build system was migrated to CMake. A CMake configuration has been developed that allows one to easily set up and build the above mentioned software packages. This also makes it possible to develop and test new and modified packages on top of existing releases. The system also allows one to detect and execute partial rebuilds of the release based on single package changes. The build system makes use of CPack for building RPM packages out of the software releases, and CTest for running unit and integration tests. We report on the migration and integration of the ATLAS software to CMake and show working examples of this large scale project in production.

  10. Parallelization of Rocket Engine System Software (Press)

    NASA Technical Reports Server (NTRS)

    Cezzar, Ruknet

    1996-01-01

    The main goal is to assess parallelization requirements for the Rocket Engine Numeric Simulator (RENS) project, which, aside from gathering information on liquid-propelled rocket engines and setting forth requirements, involves a large FORTRAN-based package at NASA Lewis Research Center and TDK software developed by SUBR/UWF. The ultimate aim is to develop, test, integrate, and suitably deploy a family of software packages on various aspects and facets of rocket engines using liquid propellants. At present, all project efforts by the funding agency, NASA Lewis Research Center, and the HBCU participants are disseminated over the internet using world wide web home pages. Considering the obviously expensive methods of actual field trials, the benefits of software simulators are potentially enormous. When realized, these benefits will be analogous to those provided by numerous CAD/CAM packages and flight-training simulators. According to the overall task assignments, Hampton University's role is to collect all available software, place them in a common format, assess and evaluate, define interfaces, and provide integration. Most importantly, HU's mission is to see to it that real-time performance is assured. This involves source code translations, porting, and distribution. The porting will be done in two phases: first, place all software on a Cray X-MP platform using FORTRAN. After testing and evaluation on the Cray X-MP, the code will be translated to C++ and ported to the parallel nCUBE platform. At present, we are evaluating another option of distributed processing over local area networks using Sun NFS, Ethernet, and TCP/IP. Considering the heterogeneous nature of the present software (e.g., it first started as an expert system using LISP machines) which now involves FORTRAN code, the effort is expected to be quite challenging.

  11. Next generation of decision making software for nanopatterns characterization: application to semiconductor industry

    NASA Astrophysics Data System (ADS)

    Dervilllé, A.; Labrosse, A.; Zimmermann, Y.; Foucher, J.; Gronheid, R.; Boeckx, C.; Singh, A.; Leray, P.; Halder, S.

    2016-03-01

    The dimensional scaling in IC manufacturing strongly drives the demands on CD and defect metrology techniques and their measurement uncertainties. Defect review has become as important as CD metrology, and together they create a new metrology paradigm because they create a completely new need for flexible, robust and scalable metrology software. Current software architectures and metrology algorithms are performant, but they must be pushed to a higher level in order to follow the roadmap's speed and requirements; for example: manage defects and CD in a one-step algorithm, customize algorithms and output features for each R&D team environment, and provide software updates every day or every week so that R&D teams can easily explore various development strategies. The final goal is to avoid spending hours and days manually tuning algorithms to analyze metrology data and to allow R&D teams to stay focused on their expertise. The benefits are drastic cost reduction, more efficient R&D teams and better process quality. In this paper, we propose a new generation of software platform and development infrastructure which can integrate specific metrology business modules. For example, we will show the integration of a chemistry module dedicated to electronics materials like Directed Self-Assembly features. We will show a new generation of image analysis algorithms which are able to manage at the same time defect rates, image classification, CD and roughness measurements with high-throughput performance in order to be compatible with HVM. In a second part, we will assess the reliability, the customization of algorithms, and the software platform's capability to follow new semiconductor metrology software requirements: flexibility, robustness, high throughput and scalability. Finally, we will demonstrate how such an environment has allowed a drastic reduction of the data analysis cycle time.

  12. Real-time computing platform for spiking neurons (RT-spike).

    PubMed

    Ros, Eduardo; Ortigosa, Eva M; Agís, Rodrigo; Carrillo, Richard; Arnold, Michael

    2006-07-01

    A computing platform is described for simulating arbitrary networks of spiking neurons in real time. A hybrid computing scheme is adopted that uses both software and hardware components to manage the tradeoff between flexibility and computational power; the neuron model is implemented in hardware and the network model and the learning are implemented in software. The incremental transition of the software components into hardware is supported. We focus on a spike response model (SRM) for a neuron where the synapses are modeled as input-driven conductances. The temporal dynamics of the synaptic integration process are modeled with a synaptic time constant that results in a gradual injection of charge. This type of model is computationally expensive and is not easily amenable to existing software-based event-driven approaches. As an alternative we have designed an efficient time-based computing architecture in hardware, where the different stages of the neuron model are processed in parallel. Further improvements occur by computing multiple neurons in parallel using multiple processing units. This design is tested using reconfigurable hardware and its scalability and performance evaluated. Our overall goal is to investigate biologically realistic models for the real-time control of robots operating within closed action-perception loops, and so we evaluate the performance of the system on simulating a model of the cerebellum where the emulation of the temporal dynamics of the synaptic integration process is important.
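
    The synaptic integration described above, an input-driven conductance with a synaptic time constant producing gradual charge injection, can be written out in a few lines. The Python sketch below is a generic leaky integrate-and-fire neuron with an exponentially decaying synaptic conductance; parameters are illustrative, and it is not the authors' SRM hardware design.

      import numpy as np

      def simulate_neuron(spike_times_ms, t_max_ms=200.0, dt=0.1, tau_m=20.0, tau_syn=5.0,
                          v_rest=-70.0, v_thresh=-54.0, v_reset=-70.0, e_syn=0.0, g_inc=0.05):
          """Leaky integrate-and-fire neuron with an exponentially decaying synaptic conductance."""
          v, g = v_rest, 0.0
          spikes_in = {int(round(t / dt)) for t in spike_times_ms}
          out_spikes = []
          for step in range(int(t_max_ms / dt)):
              if step in spikes_in:
                  g += g_inc                               # an input spike opens more channels
              g -= (g / tau_syn) * dt                      # conductance decays with tau_syn
              i_syn = g * (e_syn - v)                      # gradual charge injection, not a jump
              v += ((v_rest - v) / tau_m + i_syn) * dt     # leaky membrane integration
              if v >= v_thresh:
                  out_spikes.append(step * dt)
                  v = v_reset
          return out_spikes

      if __name__ == "__main__":
          print(simulate_neuron(spike_times_ms=np.arange(10, 150, 5.0)))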

  13. Ground Operations Autonomous Control and Integrated Health Management

    NASA Technical Reports Server (NTRS)

    Daniels, James

    2014-01-01

    The Ground Operations Autonomous Control and Integrated Health Management system plays a key role in future ground operations at NASA. The software integrated into this system is Gensym's G2 2011. The purpose of this report is to describe the Ground Operations Autonomous Control and Integrated Health Management with the use of the G2 Gensym software and the G2 NASA toolkit for Integrated System Health Management (ISHM), which is a Computer Software Configuration Item (CSCI). The rationale for the use of the G2 platform is to develop a modular capability for ISHM and AC. Toolkit modules include knowledge bases that are generic and can be applied in any application domain module. That way, reusability, maintainability, systematic evolution, portability, and scalability are maximized. Engine modules are generic, while application modules represent the domain model of a specific application. Furthermore, the NASA toolkit, developed since 2006 (a set of modules), makes it possible to create application domain models quickly, using pre-defined objects that include sensor and component libraries for typical fluid, electrical, and mechanical systems.

  14. Energy Systems Integration News - September 2016 | Energy Systems

    Science.gov Websites

    Smarter Grid Solutions demonstrated a new distributed energy resources (DER) software control platform ... utility interconnections require distributed generation (DG) devices to disconnect from the grid during ... OpenFMB distributed applications on the microgrid test site to locally optimize renewable energy resources

  15. Beta Testing of Persistent Passive Acoustic Monitors

    DTIC Science & Technology

    2012-10-01

    three platforms provide the capability to work over a wide range of spatial and temporal scales. Hardware and software integration of the DMONs in...closely with Richard M. Ead (Sensors and Sonar Systems Department, Naval Undersea Warfare Center, NUWC Code 1535), Ted Ioannides (PS 4013) and Dave

  16. Integrated Software Health Management for Aircraft GN and C

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Mengshoel, Ole

    2011-01-01

    Modern aircraft rely heavily on dependable operation of many safety-critical software components. Despite careful design, verification and validation (V&V), on-board software can fail with disastrous consequences if it encounters problematic software/hardware interaction or must operate in an unexpected environment. We are using a Bayesian approach to monitor the software and its behavior during operation and provide up-to-date information about the health of the software and its components. The powerful reasoning mechanism provided by our model-based Bayesian approach makes reliable diagnosis of the root causes possible and minimizes the number of false alarms. Compilation of the Bayesian model into compact arithmetic circuits makes software health management (SWHM) feasible even on platforms with limited CPU power. We show initial results of SWHM on a small simulator of an embedded aircraft software system, where software and sensor faults can be injected.
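
    The Bayesian monitoring idea can be illustrated with a tiny two-cause model: posterior probabilities of a sensor fault versus a software fault given one observed out-of-range signal. The structure, priors, and likelihoods in the Python sketch below are invented for illustration and are far simpler than the compiled arithmetic circuits described above.

      from itertools import product

      # Tiny illustrative Bayesian health model (invented numbers): two hidden causes, one symptom.
      P_SENSOR_FAULT = 0.01
      P_SOFTWARE_FAULT = 0.005

      def p_symptom(sensor_fault, software_fault):
          """P(out-of-range reading | causes), a noisy-OR style likelihood."""
          if not (sensor_fault or software_fault):
              return 0.001                 # small false-alarm rate
          p_miss = 1.0
          if sensor_fault:
              p_miss *= 0.1                # a sensor fault usually produces the symptom
          if software_fault:
              p_miss *= 0.2
          return 1.0 - p_miss

      def posterior_given_symptom():
          joint = {}
          for sen, sw in product([False, True], repeat=2):
              prior = ((P_SENSOR_FAULT if sen else 1 - P_SENSOR_FAULT) *
                       (P_SOFTWARE_FAULT if sw else 1 - P_SOFTWARE_FAULT))
              joint[(sen, sw)] = prior * p_symptom(sen, sw)
          z = sum(joint.values())
          p_sensor = sum(v for (sen, _), v in joint.items() if sen) / z
          p_software = sum(v for (_, sw), v in joint.items() if sw) / z
          return p_sensor, p_software

      if __name__ == "__main__":
          print(posterior_given_symptom())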

  17. Experiences with DCE: the pro7 communication server based on OSF-DCE functionality.

    PubMed

    Schulte, M; Lordieck, W

    1997-01-01

    The pro7-communication server is a new approach to managing communication between different applications on different hardware platforms in a hospital environment. The most important features are the use of OSF/DCE for realising remote procedure calls between different platforms, the use of an SQL-92 compatible relational database, and the design of a new software development tool (called the protocol definition language compiler) for describing the interface of a new application which is to be integrated into a hospital environment.

  18. IVAG: An Integrative Visualization Application for Various Types of Genomic Data Based on R-Shiny and the Docker Platform.

    PubMed

    Lee, Tae-Rim; Ahn, Jin Mo; Kim, Gyuhee; Kim, Sangsoo

    2017-12-01

    Next-generation sequencing (NGS) technology has become a trend in the genomics research area. There are many software programs and automated pipelines to analyze NGS data, which can ease the pain for traditional scientists who are not familiar with computer programming. However, downstream analyses, such as finding differentially expressed genes or visualizing linkage disequilibrium maps and genome-wide association study (GWAS) data, still remain a challenge. Here, we introduce a dockerized web application written in R using the Shiny platform to visualize pre-analyzed RNA sequencing and GWAS data. In addition, we have integrated a genome browser based on the JBrowse platform and an automated intermediate parsing process required for custom track construction, so that users can easily build and navigate their personal genome tracks with in-house datasets. This application will help scientists perform a series of downstream analyses and obtain a more integrative understanding of various types of genomic data by interactively visualizing them with customizable options.

  19. Run Environment and Data Management for Earth System Models

    NASA Astrophysics Data System (ADS)

    Widmann, H.; Lautenschlager, M.; Fast, I.; Legutke, S.

    2009-04-01

    The Integrating Model and Data Infrastructure (IMDI) developed and maintained by the Model and Data Group (M&D) comprises the Standard Compile Environment (SCE) and the Standard Run Environment (SRE). The IMDI software has a modular design, which allows a suite of model components to be combined and coupled, and the individual tasks to be executed independently and on various platforms. Furthermore, the modular structure enables extension to new model combinations and new platforms. The SRE presented here enables the configuration and execution of earth system model experiments from model integration up to storage and visualization of data. We focus on recently implemented tasks such as synchronous database filling, graphical monitoring and automatic generation of metadata in XML form during run time. We also address the capability to run experiments in heterogeneous IT environments with different computing systems for model integration, data processing and storage. These features are demonstrated for model configurations and on platforms used in current or upcoming projects, e.g. MILLENNIUM or IPCC AR5.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trujillo, Angelina Michelle

    Strategy, Planning, Acquiring: very large-scale computing platforms come and go, and planning for immensely scalable machines often precedes actual procurement by 3 years; procurement can take another year or more. Integration: after acquisition, machines must be integrated into the computing environments at LANL, including connection to scalable storage via large-scale storage networking while assuring correct and secure operations. Management and Utilization: ongoing operations, maintenance, and troubleshooting of the hardware and systems software at massive scale are required.

  1. A Problem-Solving Environment for Biological Network Informatics: Bio-Spice

    DTIC Science & Technology

    2007-06-01

    user an environment to access software tools. The Dashboard is built upon the NetBeans Integrated Development Environment (IDE), an open source Java...based integration platform was demonstrated. During the subsequent six month development cycle, the first version of the NetBeans based Bio-SPICE...frameworks (OAA, NetBeans , and Systems Biology Workbench (SBW)[15]), it becomes possible for Bio-SPICE tools to truly interoperate. This interoperation

  2. Status report of the end-to-end ASKAP software system: towards early science operations

    NASA Astrophysics Data System (ADS)

    Guzman, Juan Carlos; Chapman, Jessica; Marquarding, Malte; Whiting, Matthew

    2016-08-01

    The Australian SKA Pathfinder (ASKAP) is a novel centimetre radio synthesis telescope currently in the commissioning phase and located in the midwest region of Western Australia. It comprises 36 x 12 m diameter reflector antennas, each equipped with state-of-the-art, award-winning Phased Array Feed (PAF) technology. The PAFs provide a wide, 30 square degree field-of-view by forming up to 36 separate dual-polarisation beams at once. This results in a high data rate: 70 TB of correlated visibilities in an 8-hour observation, requiring custom-written, high-performance software running in dedicated High Performance Computing (HPC) facilities. The first six antennas equipped with first-generation PAF technology (Mark I), named the Boolardy Engineering Test Array (BETA), have been in use since 2014 as a platform to test PAF calibration and imaging techniques, and along the way it has been producing some great science results. Commissioning of the ASKAP Array Release 1, that is the first six antennas with second-generation PAFs (Mark II), is currently under way. An integral part of the instrument is the Central Processor platform hosted at the Pawsey Supercomputing Centre in Perth, which executes custom-written software pipelines, designed specifically to meet the ASKAP imaging requirements of wide field of view and high dynamic range. There are three key hardware components of the Central Processor: the ingest nodes (16 x node cluster), the fast temporary storage (1 PB Lustre file system) and the processing supercomputer (200 TFlop system). This High-Performance Computing (HPC) platform is managed and supported by the Pawsey support team. Due to the limited amount of data generated by BETA and the first ASKAP Array Release, the Central Processor platform has been running in a more "traditional" or user-interactive mode. But this is about to change: integration and verification of the online ingest pipeline starts in early 2016, which is required to support the full 300 MHz bandwidth for Array Release 1; followed by the deployment of the real-time data processing components. In addition to the Central Processor, the first production release of the CSIRO ASKAP Science Data Archive (CASDA) has also been deployed in one of the Pawsey Supercomputing Centre facilities and it is integrated into the end-to-end ASKAP data flow system. This paper describes the current status of the "end-to-end" data flow software system from preparing observations to data acquisition, processing and archiving; and the challenges of integrating an HPC facility as a key part of the instrument. It also shares some lessons learned since the start of integration activities and the challenges ahead in preparation for the start of the Early Science program.
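
    As a rough sense of scale (assuming decimal units and a uniform rate), the quoted 70 TB per 8-hour observation corresponds to an average ingest rate of about 2.4 GB/s:

        # Back-of-the-envelope average ingest rate for the quoted ASKAP figures
        # (decimal TB/GB assumed; instantaneous rates will differ).
        volume_tb = 70.0
        duration_h = 8.0
        rate_gb_per_s = (volume_tb * 1000.0) / (duration_h * 3600.0)
        print(f"Average ingest rate ~ {rate_gb_per_s:.2f} GB/s")   # ~2.43 GB/s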

  3. Construction of integrated case environments.

    PubMed

    Losavio, Francisca; Matteo, Alfredo; Pérez, María

    2003-01-01

    The main goal of Computer-Aided Software Engineering (CASE) technology is to improve the entire software system development process. The CASE approach is not merely a technology; it involves a fundamental change in the process of software development. The tendency of the CASE approach, technically speaking, is the integration of tools that assist in the application of specific methods. In this sense, the environment architecture, which includes the platform and the system's hardware and software, constitutes the base of the CASE environment. The problem of tool integration has been posed for two decades. Current integration efforts emphasize the interoperability of tools, especially in distributed environments. In this work we use the Brown approach. The environment resulting from the application of this model is called a federative environment, reflecting the fact that this architecture pays special attention to the connections among the components of the environment. This approach is now being used in component-based design. This paper describes a concrete experience in the civil engineering and architecture fields with the construction of an integrated CASE environment. A generic architectural framework based on an intermediary architectural pattern is applied to achieve the integration of the different tools. This intermediary represents the control perspective of the PAC (Presentation-Abstraction-Control) style, which has been implemented as a Mediator pattern and has been used in the interactive systems domain. In addition, a process is given to construct the integrated CASE environment.
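
    The intermediary/Mediator idea can be sketched minimally (hypothetical tool names, not the authors' federative environment): tools never call each other directly; they publish events to the mediator, which routes them to whichever tools have registered interest.

        # Minimal Mediator-style tool integration sketch (illustrative, hypothetical tools).

        class Mediator:
            """Central intermediary: tools register interests and exchange events through it."""
            def __init__(self):
                self._handlers = {}          # event name -> list of callbacks

            def subscribe(self, event, handler):
                self._handlers.setdefault(event, []).append(handler)

            def publish(self, event, payload):
                for handler in self._handlers.get(event, []):
                    handler(payload)

        class CadTool:
            def __init__(self, mediator):
                self.mediator = mediator
            def save_drawing(self, drawing_id):
                # The CAD tool does not know which other tools exist.
                self.mediator.publish("drawing_saved", {"id": drawing_id})

        class CostEstimator:
            def __init__(self, mediator):
                mediator.subscribe("drawing_saved", self.on_drawing_saved)
            def on_drawing_saved(self, payload):
                print(f"Re-estimating costs for drawing {payload['id']}")

        mediator = Mediator()
        cad = CadTool(mediator)
        CostEstimator(mediator)
        cad.save_drawing("bridge-042")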

  4. System Architecture Development for Energy and Water Infrastructure Data Management and Geovisual Analytics

    NASA Astrophysics Data System (ADS)

    Berres, A.; Karthik, R.; Nugent, P.; Sorokine, A.; Myers, A.; Pang, H.

    2017-12-01

    Building an integrated data infrastructure that can meet the needs of sustainable energy-water resource management requires a robust data management and geovisual analytics platform, capable of cross-domain scientific discovery and knowledge generation. Such a platform can facilitate the investigation of diverse, complex research and policy questions for emerging priorities in Energy-Water Nexus (EWN) science areas. Such a multi-layered federated platform, the Energy-Water Nexus Knowledge Discovery Framework (EWN-KDF), is being developed using advanced data analytics, machine learning techniques, multi-dimensional statistical tools, and interactive geovisualization components. This platform utilizes several enterprise-grade software design concepts and standards, such as extensible service-oriented architecture, open standard protocols, an event-driven programming model, an enterprise service bus, and adaptive user interfaces, to provide strategic value to the integrative computational and data infrastructure. EWN-KDF is built on the Compute and Data Environment for Science (CADES) environment at Oak Ridge National Laboratory (ORNL).

  5. GIS Application System Design Applied to Information Monitoring

    NASA Astrophysics Data System (ADS)

    Qun, Zhou; Yujin, Yuan; Yuena, Kang

    A natural environment information management system involves on-line instrument monitoring, data communications, database establishment, information management software development, and so on. Its core lies in collecting effective and reliable environmental information, increasing the utilization and sharing of environmental information through advanced information technology, and providing a timely and scientific foundation for environmental monitoring and management. This thesis adopts C# plug-in application development and uses a complete set of embedded GIS component and tool libraries provided by GIS Engine to build the core of a plug-in GIS application framework, namely the design and implementation of the framework host program and each functional plug-in, as well as the design and implementation of the plug-in GIS application framework platform. The thesis takes advantage of dynamic plug-in loading and configuration, quickly establishes GIS applications through visualized component collaborative modeling, and realizes GIS application integration. The developed platform is applicable to any integration involving GIS applications (ESRI platform) and can serve as a base platform for GIS application development.
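
    The dynamic plug-in loading described above is done in C# on the ESRI platform; as a hedged, generic analogue (hypothetical module names and register() hook), the same idea looks like this in Python:

        # Generic dynamic plug-in loading sketch (Python analogue of the C# approach;
        # module names and the register() hook are hypothetical).
        import importlib

        PLUGIN_MODULES = ["plugins.layer_view", "plugins.monitor_feed"]  # e.g. from a config file

        def load_plugins(host):
            loaded = []
            for name in PLUGIN_MODULES:
                try:
                    module = importlib.import_module(name)
                except ImportError as err:
                    print(f"Skipping plug-in {name}: {err}")
                    continue
                # Each plug-in is expected to expose a register(host) entry point.
                module.register(host)
                loaded.append(module)
            return loaded

        class FrameworkHost:
            """Host program: menus/toolbars that plug-ins extend."""
            def add_menu_item(self, label, action):
                print(f"Menu item added: {label}")

        if __name__ == "__main__":
            load_plugins(FrameworkHost())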

  6. Integrated Design and Implementation of Embedded Control Systems with Scilab

    PubMed Central

    Ma, Longhua; Xia, Feng; Peng, Zhe

    2008-01-01

    Embedded systems are playing an increasingly important role in control engineering. Despite their popularity, embedded systems are generally subject to resource constraints and it is therefore difficult to build complex control systems on embedded platforms. Traditionally, the design and implementation of control systems are often separated, which causes the development of embedded control systems to be highly time-consuming and costly. To address these problems, this paper presents a low-cost, reusable, reconfigurable platform that enables integrated design and implementation of embedded control systems. To minimize the cost, free and open source software packages such as Linux and Scilab are used. Scilab is ported to the embedded ARM-Linux system. The drivers for interfacing Scilab with several communication protocols including serial, Ethernet, and Modbus are developed. Experiments are conducted to test the developed embedded platform. The use of Scilab enables implementation of complex control algorithms on embedded platforms. With the developed platform, it is possible to perform all phases of the development cycle of embedded control systems in a unified environment, thus facilitating the reduction of development time and cost. PMID:27873827

  7. Integrated Design and Implementation of Embedded Control Systems with Scilab.

    PubMed

    Ma, Longhua; Xia, Feng; Peng, Zhe

    2008-09-05

    Embedded systems are playing an increasingly important role in control engineering. Despite their popularity, embedded systems are generally subject to resource constraints and it is therefore difficult to build complex control systems on embedded platforms. Traditionally, the design and implementation of control systems are often separated, which causes the development of embedded control systems to be highly time-consuming and costly. To address these problems, this paper presents a low-cost, reusable, reconfigurable platform that enables integrated design and implementation of embedded control systems. To minimize the cost, free and open source software packages such as Linux and Scilab are used. Scilab is ported to the embedded ARM-Linux system. The drivers for interfacing Scilab with several communication protocols including serial, Ethernet, and Modbus are developed. Experiments are conducted to test the developed embedded platform. The use of Scilab enables implementation of complex control algorithms on embedded platforms. With the developed platform, it is possible to perform all phases of the development cycle of embedded control systems in a unified environment, thus facilitating the reduction of development time and cost.
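
    To make the integrated design-and-implementation idea concrete, the sketch below shows the kind of discrete control loop such a platform ultimately executes (a generic PID loop against a simulated plant; assumed gains, not the paper's Scilab/ARM-Linux code):

        # Illustrative discrete PID loop of the kind run on such embedded platforms
        # (simulated sensor/actuator; not the paper's Scilab/ARM-Linux implementation).
        import time

        KP, KI, KD = 2.0, 0.5, 0.1      # assumed controller gains
        SETPOINT = 50.0                  # target value (e.g., temperature)
        DT = 0.1                         # control period in seconds

        def read_sensor(state):
            return state["value"]

        def write_actuator(state, u):
            # Crude first-order plant model standing in for real hardware I/O.
            state["value"] += (u - 0.2 * state["value"]) * DT

        def control_loop(steps=50):
            state = {"value": 0.0}
            integral, prev_error = 0.0, 0.0
            for _ in range(steps):
                error = SETPOINT - read_sensor(state)
                integral += error * DT
                derivative = (error - prev_error) / DT
                u = KP * error + KI * integral + KD * derivative
                write_actuator(state, u)
                prev_error = error
                time.sleep(0)            # placeholder for real-time scheduling
            return state["value"]

        print(f"Final plant output: {control_loop():.2f}")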

  8. Single software platform used for high speed data transfer implementation in a 65k pixel camera working in single photon counting mode

    NASA Astrophysics Data System (ADS)

    Maj, P.; Kasiński, K.; Gryboś, P.; Szczygieł, R.; Kozioł, A.

    2015-12-01

    Integrated circuits designed for specific applications generally use non-standard communication methods. Hybrid pixel detector readout electronics produces a huge amount of data as a result of the number of frames per second. The data need to be transmitted to a higher-level system without limiting the ASIC's capabilities. Nowadays, the Camera Link interface is still one of the fastest communication methods, allowing transmission speeds up to 800 MB/s. In order to communicate between a higher-level system and the ASIC with a dedicated protocol, an FPGA with dedicated code is required. The configuration data is received from the PC and written to the ASIC. At the same time, the same FPGA should be able to transmit the data from the ASIC to the PC at very high speed. The camera should be an embedded system enabling autonomous operation and self-monitoring. In the presented solution, at least three different hardware platforms are used: an FPGA, a microprocessor with a real-time operating system, and a PC with end-user software. We present the use of a single software platform for high-speed data transfer from a 65k pixel camera to the personal computer.

  9. Integrated platform for optimized solar PV system design and engineering plan set generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adeyemo, Samuel

    2015-12-30

    The Aurora team has developed software that allows users to quickly generate a three-dimensional model for a building, with a corresponding irradiance map, from any two-dimensional image with associated geo-coordinates. The purpose of this project is to build upon that technology by developing and distributing to solar installers a software platform that automatically retrieves engineering, financial and geographic data for a specific site, and quickly generates an optimal customer proposal and corresponding engineering plans for that site. At the end of the project, Aurora’s optimization platform would have been used to make at least one thousand proposals from at least ten unique solar installation companies, two of whom would sign economically viable contracts to use the software. Furthermore, Aurora’s algorithms would be tested to show that in at least seventy percent of cases, Aurora automatically generated a design equivalent to or better than what a human could have done manually. A ‘better’ design is one that generates more energy for the same cost, or that generates a higher return on investment, while complying with all site-specific aesthetic, electrical and spatial requirements.

  10. ScaMo: Realisation of an OO-functional DSL for cross platform mobile applications development

    NASA Astrophysics Data System (ADS)

    Macos, Dragan; Solymosi, Andreas

    2013-10-01

    The software market is changing dynamically: the Internet is going mobile, and software applications are shifting from desktop hardware onto mobile devices. The largest markets are mobile applications for iOS, Android and Windows Phone, for which the typical programming languages are Objective-C, Java and C#. The realization of native applications implies the integration of the developed software into the environments of the mentioned mobile operating systems to enable access to different hardware components of the devices: GPS module, display, GSM module, etc. This paper deals with the definition and possible implementation of an environment for automatic application generation for multiple mobile platforms. It is based on a DSL for mobile application development, which includes the programming language Scala and a DSL defined in Scala. As part of a multi-stage cross-compiling algorithm, this language is translated into the language of the targeted mobile platform. The advantage of our method lies in the expressiveness of the defined language and the transparent source code translation between different languages, which brings, for example, advantages in debugging and development of the generated code.

  11. Experiences integrating autonomous components and legacy systems into tsunami early warning systems

    NASA Astrophysics Data System (ADS)

    Reißland, S.; Herrnkind, S.; Guenther, M.; Babeyko, A.; Comoglu, M.; Hammitzsch, M.

    2012-04-01

    Fostered by and embedded in the general development of Information and Communication Technology (ICT) the evolution of Tsunami Early Warning Systems (TEWS) shows a significant development from seismic-centred to multi-sensor system architectures using additional sensors, e.g. sea level stations for the detection of tsunami waves and GPS stations for the detection of ground displacements. Furthermore, the design and implementation of a robust and scalable service infrastructure supporting the integration and utilisation of existing resources serving near real-time data not only includes sensors but also other components and systems offering services such as the delivery of feasible simulations used for forecasting in an imminent tsunami threat. In the context of the development of the German Indonesian Tsunami Early Warning System (GITEWS) and the project Distant Early Warning System (DEWS) a service platform for both sensor integration and warning dissemination has been newly developed and demonstrated. In particular, standards of the Open Geospatial Consortium (OGC) and the Organization for the Advancement of Structured Information Standards (OASIS) have been successfully incorporated. In the project Collaborative, Complex, and Critical Decision-Support in Evolving Crises (TRIDEC) new developments are used to extend the existing platform to realise a component-based technology framework for building distributed TEWS. This talk will describe experiences made in GITEWS, DEWS and TRIDEC while integrating legacy stand-alone systems and newly developed special-purpose software components into TEWS using different software adapters and communication strategies to make the systems work together in a corporate infrastructure. The talk will also cover task management and data conversion between the different systems. Practical approaches and software solutions for the integration of sensors, e.g. providing seismic and sea level data, and utilisation of special-purpose components, such as simulation systems, in TEWS will be presented.

  12. Engaging Language Learners through Technology Integration: Theory, Applications, and Outcomes

    ERIC Educational Resources Information Center

    Li, Shuai, Ed.; Swanson, Peter, Ed.

    2014-01-01

    Web 2.0 technologies, open source software platforms, and mobile applications have transformed teaching and learning of second and foreign languages. Language teaching has transitioned from a teacher-centered approach to a student-centered approach through the use of Computer-Assisted Language Learning (CALL) and new teaching approaches.…

  13. Metrics of a Paradigm for Intelligent Control

    NASA Technical Reports Server (NTRS)

    Hexmoor, Henry

    1999-01-01

    We present metrics for quantifying organizational structures of complex control systems intended for controlling long-lived robotic or other autonomous applications commonly found in space applications. Such advanced control systems are often called integration platforms or agent architectures. Reported metrics span concerns about time, resources, software engineering, and complexities in the world.

  14. Social and Collaborative Interactions for Educational Content Enrichment in ULEs

    ERIC Educational Resources Information Center

    Araújo, Rafael D.; Brant-Ribeiro, Taffarel; Mendonça, Igor E. S.; Mendes, Miller M.; Dorça, Fabiano A.; Cattelan, Renan G.

    2017-01-01

    This article presents a social and collaborative model for content enrichment in Ubiquitous Learning Environments. Designed as a loosely coupled software architecture, the proposed model was implemented and integrated into the Classroom eXperience, a multimedia capture platform for educational environments. After automatically recording a lecture…

  15. TRACC_PB SOSS Integrated Traffic Simulation for CLT Ramp Operation

    NASA Technical Reports Server (NTRS)

    Okuniek, Nikolai; Zhu, Zhifan

    2017-01-01

    This presentation describes the current task under the NASA-DLR research collaboration on airport surface operations. It presents the effort to adapt the TRACC and SOSS software components to simulate airport (CLT) ramp area traffic management, using TRACC's conflict-free taxi trajectory optimization and SOSS's fast-time simulation platform.

  16. Increasing Flight Software Reuse with OpenSatKit

    NASA Technical Reports Server (NTRS)

    McComas, David C.

    2018-01-01

    In January 2015 the NASA Goddard Space Flight Center (GSFC) released the Core Flight System (cFS) as open source under the NASA Open Source Agreement (NOSA) license. The cFS is based on flight software (FSW) developed for 12 spacecraft spanning nearly two decades of effort and it can provide about a third of the FSW functionality for a low-earth orbiting scientific spacecraft. The cFS is a FSW framework that is portable, configurable, and extendable using a product line deployment model. However, the components are maintained separately so the user must configure, integrate, and deploy them as a cohesive functional system. This can be very challenging, especially for organizations such as universities building CubeSats that have minimal experience developing FSW. Supporting universities was one of the primary motivators for releasing the cFS under NOSA. This paper describes the OpenSatKit that was developed to address the cFS deployment challenges and to serve as a cFS training platform for new users. It provides a fully functional out-of-the-box software system that includes NASA's cFS, Ball Aerospace's command and control system COSMOS, and a NASA dynamic simulator called 42. The kit is freely available since all of the components have been released as open source. The kit runs on a Linux platform, includes 8 cFS applications, several kit-specific applications, and built-in demos illustrating how to use key application features. It also includes the software necessary to port the cFS to a Raspberry Pi and instructions for configuring COSMOS to communicate with the target. All of the demos and test scripts can be rerun unchanged with the cFS running on the Raspberry Pi. The cFS uses a 3-tiered layered architecture including a platform abstraction layer, a Core Flight Executive (cFE) middle layer, and an application layer. Similar to smart phones, the cFS application layer is the key architectural feature for users to extend the FSW functionality to meet their mission-specific requirements. The platform abstraction layer and the cFE layers go a step further than smart phones by providing a platform-agnostic Application Programmer Interface (API) that allows applications to run unchanged on different platforms. OpenSatKit can serve two significant architectural roles that will further help the adoption of the cFS and help create a community of users that can share assets. First, the kit is being enhanced to automate the integration of applications with the goal of creating a virtual cFS "App Store". Second, a platform certification test suite can be developed that would allow users to verify the port of the cFS to a new platform. This paper will describe the current state of these efforts and future plans.

  17. In-situ Testing of the EHT High Gain and Frequency Ultra-Stable Integrators

    NASA Astrophysics Data System (ADS)

    Miller, Kenneth; Ziemba, Timothy; Prager, James; Slobodov, Ilia; Lotz, Dan

    2014-10-01

    Eagle Harbor Technologies (EHT) has developed a long-pulse integrator that exceeds the ITER specification for integration error and pulse duration. During the Phase I program, EHT improved the RPPL short-pulse integrators, added a fast digital reset, and demonstrated that the new integrators exceed the ITER integration error and pulse duration requirements. In Phase II, EHT developed Field Programmable Gate Array (FPGA) software that allows for integrator control and real-time signal digitization and processing. In the second year of Phase II, the EHT integrator will be tested at a validation platform experiment (HIT-SI) and a tokamak (DIII-D). In the Phase IIB program, EHT will continue development of the EHT integrator to reduce the overall cost per channel. EHT will test lower-cost components, move to surface-mount components, and add an onboard Field Programmable Gate Array and data acquisition to produce a stand-alone system with lower cost per channel and increased channel density. EHT will test the Phase IIB integrator at a validation platform experiment (HIT-SI) and a tokamak (DIII-D). Work supported by the DOE under Contract Number DE-SC0006281.

  18. ACQ4: an open-source software platform for data acquisition and analysis in neurophysiology research.

    PubMed

    Campagnola, Luke; Kratz, Megan B; Manis, Paul B

    2014-01-01

    The complexity of modern neurophysiology experiments requires specialized software to coordinate multiple acquisition devices and analyze the collected data. We have developed ACQ4, an open-source software platform for performing data acquisition and analysis in experimental neurophysiology. This software integrates the tasks of acquiring, managing, and analyzing experimental data. ACQ4 has been used primarily for standard patch-clamp electrophysiology, laser scanning photostimulation, multiphoton microscopy, intrinsic imaging, and calcium imaging. The system is highly modular, which facilitates the addition of new devices and functionality. The modules included with ACQ4 provide for rapid construction of acquisition protocols, live video display, and customizable analysis tools. Position-aware data collection allows automated construction of image mosaics and registration of images with 3-dimensional anatomical atlases. ACQ4 uses free and open-source tools including Python, NumPy/SciPy for numerical computation, PyQt for the user interface, and PyQtGraph for scientific graphics. Supported hardware includes cameras, patch clamp amplifiers, scanning mirrors, lasers, shutters, Pockels cells, motorized stages, and more. ACQ4 is available for download at http://www.acq4.org.

  19. Molecular Cloning Designer Simulator (MCDS): All-in-one molecular cloning and genetic engineering design, simulation and management software for complex synthetic biology and metabolic engineering projects.

    PubMed

    Shi, Zhenyu; Vickers, Claudia E

    2016-12-01

    Molecular Cloning Designer Simulator (MCDS) is a powerful new all-in-one cloning and genetic engineering design, simulation and management software platform developed for complex synthetic biology and metabolic engineering projects. In addition to standard functions, it has a number of features that are either unique, or are not found in combination in any one software package: (1) it has a novel interactive flow-chart user interface for complex multi-step processes, allowing an integrated overview of the whole project; (2) it can perform a user-defined workflow of cloning steps in a single execution of the software; (3) it can handle multiple types of genetic recombineering, a technique that is rapidly replacing classical cloning for many applications; (4) it includes experimental information to conveniently guide wet lab work; and (5) it can store results and comments to allow the tracking and management of the whole project in one platform. MCDS is freely available from https://mcds.codeplex.com.

  20. An overview of the Progenika ID CORE XT: an automated genotyping platform based on a fluidic microarray system.

    PubMed

    Goldman, Mindy; Núria, Núria; Castilho, Lilian M

    2015-01-01

    Automated testing platforms facilitate the introduction of red cell genotyping of patients and blood donors. Fluidic microarray systems, such as Luminex XMAP (Austin, TX), are used in many clinical applications, including HLA and HPA typing. The Progenika ID CORE XT (Progenika Biopharma-Grifols, Bizkaia, Spain) uses this platform to analyze 29 polymorphisms determining 37 antigens in 10 blood group systems. Once DNA has been extracted, processing time is approximately 4 hours. The system is highly automated and includes integrated analysis software that produces a file and a report with genotype and predicted phenotype results.

  1. Integration of the virtual model of a Stewart platform with the avatar of a vehicle in a virtual reality

    NASA Astrophysics Data System (ADS)

    Herbuś, K.; Ociepka, P.

    2016-08-01

    The development of methods of computer-aided design and engineering allows virtual tests to be conducted, including motion simulation of technical means. The paper presents a method of integrating an object in the form of a virtual model of a Stewart platform with an avatar of a vehicle moving in a virtual environment. The problem area includes issues related to the fidelity with which the operation of the analyzed technical means is mapped. The main object of investigation is a 3D model of a Stewart platform, which is a subsystem of a simulator intended for driver training for disabled persons. The model of the platform, prepared for motion simulation, was created in the “Motion Simulation” module of the CAD/CAE-class system Siemens PLM NX, whereas the virtual environment, in which the avatar of the passenger car moves, was developed in the VR-class system EON Studio. The element integrating the two software environments is a developed application that reads information from the virtual reality (VR) concerning the current position of the car avatar. Then, based on the adopted algorithm, it sends control signals to the respective joints of the model of the Stewart platform (CAD).
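
    Generating those control signals amounts to solving the Stewart platform's inverse kinematics for each pose received from the VR scene; a minimal sketch (assumed geometry and pose values, not the authors' application) is:

        # Inverse kinematics of a Stewart platform: leg lengths for a given pose
        # (geometry and pose values are assumed for illustration, not taken from the paper).
        import numpy as np

        # Anchor points of the six legs on the base and on the moving platform (metres).
        angles = np.deg2rad([0, 60, 120, 180, 240, 300])
        base_pts = np.stack([0.5 * np.cos(angles), 0.5 * np.sin(angles), np.zeros(6)], axis=1)
        plat_pts = np.stack([0.3 * np.cos(angles), 0.3 * np.sin(angles), np.zeros(6)], axis=1)

        def rotation(roll, pitch, yaw):
            cr, sr = np.cos(roll), np.sin(roll)
            cp, sp = np.cos(pitch), np.sin(pitch)
            cy, sy = np.cos(yaw), np.sin(yaw)
            rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
            ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
            rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
            return rz @ ry @ rx

        def leg_lengths(translation, roll, pitch, yaw):
            """Leg length i = || t + R * p_i - b_i || for the commanded pose."""
            r = rotation(roll, pitch, yaw)
            return np.linalg.norm(translation + plat_pts @ r.T - base_pts, axis=1)

        # Pose received from the vehicle avatar in the VR scene (assumed values).
        print(leg_lengths(np.array([0.0, 0.0, 0.4]), roll=0.05, pitch=-0.02, yaw=0.0))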

  2. Integrating Remote and Social Sensing Data for a Scenario on Secure Societies in Big Data Platform

    NASA Astrophysics Data System (ADS)

    Albani, Sergio; Lazzarini, Michele; Koubarakis, Manolis; Taniskidou, Efi Karra; Papadakis, George; Karkaletsis, Vangelis; Giannakopoulos, George

    2016-08-01

    In the framework of the Horizon 2020 project BigDataEurope (Integrating Big Data, Software & Communities for Addressing Europe's Societal Challenges), a pilot for the Secure Societies Societal Challenge was designed considering the requirements coming from relevant stakeholders. The pilot focuses on the integration, in a Big Data platform, of data coming from remote and social sensing. The information on land changes coming from the Copernicus Sentinel-1A sensor (Change Detection workflow) is integrated with information coming from selected Twitter and news agency accounts (Event Detection workflow) in order to provide the user with multiple sources of information. The Change Detection workflow implements a processing chain in a distributed, parallel manner, exploiting the Big Data capabilities in place; the Event Detection workflow implements parallel and distributed monitoring of social media and news agencies, as well as suitable mechanisms to detect and geo-annotate the related events.

  3. A Reference Software Architecture to Support Unmanned Aircraft Integration in the National Airspace System

    DTIC Science & Technology

    2012-07-01

    and Avoid (SAA) testbed that provides some of the core services. This paper describes the general architecture and a SAA testbed implementation that... that provides data and software services to enable a set of Unmanned Aircraft (UA) platforms to operate in a wide range of air domains which may... implemented by MIT Lincoln Laboratory in the form of a Sense and Avoid (SAA) testbed that provides some of the core services. This paper describes the general

  4. Rapidly Deployable Security System Final Report CRADA No. TC-2030-01

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kohlhepp, V.; Whiteman, B.; McKibben, M. T.

    The ultimate objective of the LEADER and LLNL strategic partnership was to develop and commercialize a security-based system product and platform for use in protecting the substantial physical and economic assets of the government and commerce of the United States. The primary goal of this project was to integrate video surveillance hardware developed by LLNL with a security software backbone developed by LEADER. Upon completion of the project, a prototype hardware/software security system that is highly scalable was to be demonstrated.

  5. Supporting Crop Loss Insurance Policy of Indonesia through Rice Yield Modelling and Forecasting

    NASA Astrophysics Data System (ADS)

    van Verseveld, Willem; Weerts, Albrecht; Trambauer, Patricia; de Vries, Sander; Conijn, Sjaak; van Valkengoed, Eric; Hoekman, Dirk; Grondard, Nicolas; Hengsdijk, Huib; Schrevel, Aart; Vlasbloem, Pieter; Klauser, Dominik

    2017-04-01

    The Government of Indonesia has decided on a crop insurance policy to assist Indonesia's farmers and to boost food security. To support the Indonesian government, the G4INDO project (www.g4indo.org) is developing/constructing an integrated platform implemented in the Delft-FEWS forecasting system (Werner et al., 2013). The integrated platform brings together remote sensed data (both visible and radar) and hydrologic, crop and reservoir modelling and forecasting to improve the modelling and forecasting of rice yield. The hydrological model (wflow_sbm), crop model (wflow_lintul) and reservoir models (RTC-Tools) are coupled on a time-stepping basis in the OpenStreams framework (see https://github.com/openstreams/wflow) and deployed in the integrated platform to support seasonal forecasting of water availability and crop yield. First we will present the general idea of the G4INDO project and the integrated platform (including Sentinel 1 & 2 data), followed by first (reforecast) results of the coupled models for predicting water availability and crop yield in the Brantas catchment in Java, Indonesia. Werner, M., Schellekens, J., Gijsbers, P., Van Dijk, M., Van den Akker, O. and Heynert K, 2013. The Delft-FEWS flow forecasting system, Environmental Modelling & Software; 40:65-77. DOI: 10.1016/j.envsoft.2012.07.010.
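
    The time-stepping coupling can be pictured with a minimal sketch (stub equations and state names are assumptions; the real system couples wflow_sbm, wflow_lintul and RTC-Tools inside the OpenStreams/Delft-FEWS framework):

        # Minimal time-stepping coupler sketch: a hydrological stub feeds soil moisture to
        # a crop-growth stub each step (stub equations, not the wflow/RTC-Tools models).

        def hydrology_step(state, rainfall_mm):
            # Very crude bucket model: storage gains rainfall, loses drainage.
            state["soil_moisture"] = max(0.0, state["soil_moisture"] + rainfall_mm - 2.0)
            return state["soil_moisture"]

        def crop_step(state, soil_moisture):
            # Biomass grows when water is available, otherwise stalls.
            growth = 0.5 if soil_moisture > 10.0 else 0.05
            state["biomass"] += growth
            return state["biomass"]

        hydro = {"soil_moisture": 15.0}
        crop = {"biomass": 0.0}
        rainfall_series = [5.0, 0.0, 12.0, 3.0, 0.0, 0.0, 8.0]   # assumed daily forcing

        for day, rain in enumerate(rainfall_series, start=1):
            sm = hydrology_step(hydro, rain)          # hydrological model advances one step
            biomass = crop_step(crop, sm)             # crop model consumes its output
            print(f"day {day}: soil moisture {sm:5.1f} mm, biomass {biomass:4.2f} t/ha")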

  6. CONRAD—A software framework for cone-beam imaging in radiology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maier, Andreas; Choi, Jang-Hwan; Riess, Christian

    2013-11-15

    Purpose: In the community of x-ray imaging, there is a multitude of tools and applications that are used in scientific practice. Many of these tools are proprietary and can only be used within a certain lab. Often the same algorithm is implemented multiple times by different groups in order to enable comparison. In an effort to tackle this problem, the authors created CONRAD, a software framework that provides many of the tools that are required to simulate basic processes in x-ray imaging and perform image reconstruction with consideration of nonlinear physical effects. Methods: CONRAD is a Java-based state-of-the-art software platform with extensive documentation. It is based on platform-independent technologies. Special libraries offer access to hardware acceleration such as OpenCL. There is an easy-to-use interface for parallel processing. The software package includes different simulation tools that are able to generate up to 4D projection and volume data and respective vector motion fields. Well known reconstruction algorithms such as FBP, DBP, and ART are included. All algorithms in the package are referenced to a scientific source. Results: A total of 13 different phantoms and 30 processing steps have already been integrated into the platform at the time of writing. The platform comprises 74,000 non-blank lines of code, out of which 19% are used for documentation. The software package is available for download at http://conrad.stanford.edu. To demonstrate the use of the package, the authors reconstructed images from two different scanners, a table top system and a clinical C-arm system. Runtimes were evaluated using the RabbitCT platform and demonstrate state-of-the-art runtimes with 2.5 s for the 256 problem size and 12.4 s for the 512 problem size. Conclusions: As a common software framework, CONRAD enables the medical physics community to share algorithms and develop new ideas. In particular this offers new opportunities for scientific collaboration and quantitative performance comparison between the methods of different groups.

  7. Applications and testing of the LSCAD system

    NASA Astrophysics Data System (ADS)

    Althouse, Mark L.; Gross, Robert L.; Ditillo, John T.; Lagna, William M.; Kolodzey, Steve J.; Keiser, Christopher C.; Nasers, Gary D.

    1996-06-01

    The lightweight standoff chemical agent detector (LSCAD) is an infrared Michelson interferometer operating in the 8 - 13 micron band and is designed primarily for military contamination avoidance and early warning applications. The system is designed to be operated autonomously from a vehicle while on the move and provide 360 degree coverage. The first group of prototypes was delivered in 1994 and has undergone integration into several platforms including the HMMWV, the M2 Bradley Fighting Vehicle, the M109 self-propelled Howitzer and the Pioneer and Hurricane unmanned air vehicles (UAVs). Additional vehicles and platforms are planned. To meet the restrictions of military applications, the prototype interferometer subsystem has a weight of about 10 lbs and is approximately 0.20 cu ft in size. The full system size and weight depend upon the particular platform and its operational requirements. LSCAD employs onboard instrument control, data collection, analysis and target detection decision software, all of which are critical to real-time operation. The hardware, software, and test results are discussed.

  8. A WBAN System for Ambulatory Monitoring of Physical Activity and Health Status: Applications and Challenges.

    PubMed

    Jovanov, E; Milenkovic, A; Otto, C; De Groen, P; Johnson, B; Warren, S; Taibi, G

    2005-01-01

    Recent technological advances in sensors, low-power integrated circuits, and wireless communications have enabled the design of low-cost, miniature, lightweight, intelligent physiological sensor platforms that can be seamlessly integrated into a body area network for health monitoring. Wireless body area networks (WBANs) promise unobtrusive ambulatory health monitoring for extended periods of time and near real-time updates of patients' medical records through the Internet. A number of innovative systems for health monitoring have recently been proposed. However, they typically rely on custom communication protocols and hardware designs, lacking generality and flexibility. The lack of standard platforms, system software support, and standards makes these systems expensive. Bulky sensors, high price, and frequent battery changes are all likely to limit user compliance. To address some of these challenges, we prototyped a WBAN utilizing a common off-the-shelf wireless sensor platform with a ZigBee-compliant radio interface and an ultra low-power microcontroller. The standard platform interfaces to custom sensor boards that are equipped with accelerometers for motion monitoring and a bioamplifier for electrocardiogram or electromyogram monitoring. Software modules for on-board processing, communication, and network synchronization have been developed using the TinyOS operating system. Although the initial WBAN prototype targets ambulatory monitoring of user activity, the developed sensors can easily be adapted to monitor other physiological parameters. In this paper, we discuss initial results, implementation challenges, and the need for standardization in this dynamic and promising research field.

  9. Missile signal processing common computer architecture for rapid technology upgrade

    NASA Astrophysics Data System (ADS)

    Rabinkin, Daniel V.; Rutledge, Edward; Monticciolo, Paul

    2004-10-01

    Interceptor missiles process IR images to locate an intended target and guide the interceptor towards it. Signal processing requirements have increased as sensor bandwidth increases and interceptors operate against more sophisticated targets. A typical interceptor signal processing chain is comprised of two parts. Front-end video processing operates on all pixels of the image and performs such operations as non-uniformity correction (NUC), image stabilization, frame integration and detection. Back-end target processing, which tracks and classifies targets detected in the image, performs such algorithms as Kalman tracking, spectral feature extraction and target discrimination. In the past, video processing was implemented using ASIC components or FPGAs because computation requirements exceeded the throughput of general-purpose processors. Target processing was performed using hybrid architectures that included ASICs, DSPs and general-purpose processors. The resulting systems tended to be function-specific, and required custom software development. They were developed using non-integrated toolsets and test equipment was developed along with the processor platform. The lifespan of a system utilizing the signal processing platform often spans decades, while the specialized nature of processor hardware and software makes it difficult and costly to upgrade. As a result, the signal processing systems often run on outdated technology, algorithms are difficult to update, and system effectiveness is impaired by the inability to rapidly respond to new threats. A new design approach is made possible by three developments: Moore's Law-driven improvement in computational throughput; a newly introduced vector computing capability in general purpose processors; and a modern set of open interface software standards. Today's multiprocessor commercial-off-the-shelf (COTS) platforms have sufficient throughput to support interceptor signal processing requirements. This application may be programmed under existing real-time operating systems using parallel processing software libraries, resulting in highly portable code that can be rapidly migrated to new platforms as processor technology evolves. Standardized development tools and third-party software upgrades are enabled, as is rapid upgrade of processing components as improved algorithms are developed. The resulting weapon system will have a superior processing capability over a custom approach at the time of deployment as a result of shorter development cycles and use of newer technology. The signal processing computer may be upgraded over the lifecycle of the weapon system, and can migrate between weapon system variants enabled by modification simplicity. This paper presents a reference design using the new approach that utilizes an Altivec PowerPC parallel COTS platform. It uses a VxWorks-based real-time operating system (RTOS), and application code developed using an efficient parallel vector library (PVL). A quantification of computing requirements and a demonstration of the interceptor algorithm operating on this real-time platform are provided.
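
    The front-end operations named above (non-uniformity correction, frame integration, detection) map naturally onto array operations; the following is a generic illustration on synthetic frames, not the interceptor code:

        # Generic front-end video processing sketch: non-uniformity correction (NUC),
        # frame integration, and threshold detection on synthetic frames.
        import numpy as np

        rng = np.random.default_rng(0)
        H, W, N_FRAMES = 64, 64, 8

        # Per-pixel gain/offset from calibration (assumed values).
        gain = rng.normal(1.0, 0.02, size=(H, W))
        offset = rng.normal(5.0, 0.5, size=(H, W))

        def make_raw_frame():
            scene = np.zeros((H, W))
            scene[32, 40] = 30.0                       # faint point target
            noise = rng.normal(0.0, 2.0, size=(H, W))
            return (scene + noise) / gain + offset     # what the uncorrected sensor reports

        corrected = [gain * (make_raw_frame() - offset) for _ in range(N_FRAMES)]
        integrated = np.mean(corrected, axis=0)        # frame integration suppresses noise
        detections = np.argwhere(integrated > 5.0 * integrated.std())
        print("Detected pixels (row, col):", detections.tolist())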

  10. An MDA Based Ontology Platform: AIR

    NASA Astrophysics Data System (ADS)

    Gaševic, Dragan; Djuric, Dragan; Devedžic, Vladan

    In the past few years, software engineering has witnessed two major shifts: model-driven engineering has entered the mainstream, and some leading development tools have become open and extensible. AI has always been a spring of new ideas that have been adopted in software engineering, but most of its gems have stayed buried in laboratories, available only to a limited number of AI practitioners. Should AI tools be integrated into mainstream tools and could it be done? We think that it is feasible, and that both communities can benefit from this integration. In fact, some efforts in this direction have already been made, both by major industrial standardization bodies such as the OMG, and by academic laboratories.

  11. Integrated Design and Analysis Tools for Software-Based Control Systems

    DTIC Science & Technology

    2005-07-01

    in understanding this space. Jini and Java Spaces To give us a versatile experimental platform in anticipation of the Boeing OCP, we developed a...This demonstration shows the network integration and publish-and- subscribe interactions working with Ptolemy II. Moreover, the demo used Jini , a...develop a better understanding of its role to help determine whether it should be incorporated later, and in what form. Pioneer and Jini Jie Liu and

  12. Software-defined networking control plane for seamless integration of multiple silicon photonic switches in Datacom networks

    DOE PAGES

    Shen, Yiwen; Hattink, Maarten; Samadi, Payman; ...

    2018-04-13

    Silicon photonics based switches offer an effective option for the delivery of dynamic bandwidth for future large-scale Datacom systems while maintaining scalable energy efficiency. The integration of a silicon photonics-based optical switching fabric within electronic Datacom architectures requires novel network topologies and arbitration strategies to effectively manage the active elements in the network. Here, we present a scalable software-defined networking control plane to integrate silicon photonic based switches with conventional Ethernet or InfiniBand networks. Our software-defined control plane manages both electronic packet switches and multiple silicon photonic switches for simultaneous packet and circuit switching. We built an experimental Dragonfly network testbed with 16 electronic packet switches and 2 silicon photonic switches to evaluate our control plane. Observed latencies occupied by each step of the switching procedure demonstrate a total control plane latency of 344 microseconds for data-center and high-performance computing platforms.

  13. Software-defined networking control plane for seamless integration of multiple silicon photonic switches in Datacom networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shen, Yiwen; Hattink, Maarten; Samadi, Payman

    Silicon photonics based switches offer an effective option for the delivery of dynamic bandwidth for future large-scale Datacom systems while maintaining scalable energy efficiency. The integration of a silicon photonics-based optical switching fabric within electronic Datacom architectures requires novel network topologies and arbitration strategies to effectively manage the active elements in the network. Here, we present a scalable software-defined networking control plane to integrate silicon photonic based switches with conventional Ethernet or InfiniBand networks. Our software-defined control plane manages both electronic packet switches and multiple silicon photonic switches for simultaneous packet and circuit switching. We built an experimental Dragonfly network testbed with 16 electronic packet switches and 2 silicon photonic switches to evaluate our control plane. Observed latencies occupied by each step of the switching procedure demonstrate a total control plane latency of 344 microseconds for data-center and high-performance computing platforms.

  14. Development of Integrated Modular Avionics Application Based on Simulink and XtratuM

    NASA Astrophysics Data System (ADS)

    Fons-Albert, Borja; Usach-Molina, Hector; Vila-Carbo, Joan; Crespo-Lorente, Alfons

    2013-08-01

    This paper presents an integral approach for designing avionics applications that meets the software development and execution requirements of this application domain. Software design follows the Model-Based Design process and is performed in Simulink. This approach allows easy and quick testbench development and helps satisfy DO-178B requirements through the use of proper tools. The software execution platform is based on XtratuM, a minimal bare-metal hypervisor designed in our research group. XtratuM provides support for IMA-SP (Integrated Modular Avionics for Space) architectures. This approach allows code generated from a Simulink model to be executed on top of Lithos as a XtratuM partition. Lithos is an ARINC-653-compliant RTOS for XtratuM. The paper concentrates on how to smoothly port Simulink designs to XtratuM, solving problems such as application partitioning, automatic code generation, real-time tasking, interfacing, and others. This process is illustrated with an autopilot design test using a flight simulator.

  15. Build and Execute Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guan, Qiang

    At exascale, the challenge becomes to develop applications that run at scale and use exascale platforms reliably, efficiently, and flexibly. Workflows become much more complex because they must seamlessly integrate simulation and data analytics. They must include down-sampling, post-processing, feature extraction, and visualization. Power and data transfer limitations require these analysis tasks to be run in-situ or in-transit. We expect successful workflows will comprise multiple linked simulations along with tens of analysis routines. Users will have limited development time at scale and, therefore, must have rich tools to develop, debug, test, and deploy applications. At this scale, successful workflows will compose linked computations from an assortment of reliable, well-defined computation elements, ones that can come and go as required, based on the needs of the workflow over time. We propose a novel framework that utilizes both virtual machines (VMs) and software containers to create a workflow system that establishes a uniform build and execution environment (BEE) beyond the capabilities of current systems. In this environment, applications will run reliably and repeatably across heterogeneous hardware and software. Containers, both commercial (Docker and Rocket) and open-source (LXC and LXD), define a runtime that isolates all software dependencies from the machine operating system. Workflows may contain multiple containers that run different operating systems, different software, and even different versions of the same software. We will run containers in open-source virtual machines (KVM) and emulators (QEMU) so that workflows run on any machine entirely in user-space. On this platform of containers and virtual machines, we will deliver workflow software that provides services, including repeatable execution, provenance, checkpointing, and future proofing. We will capture provenance about how containers were launched and how they interact to annotate workflows for repeatable and partial re-execution. We will coordinate the physical snapshots of virtual machines with parallel programming constructs, such as barriers, to automate checkpoint and restart. We will also integrate with HPC-specific container runtimes to gain access to accelerators and other specialized hardware to preserve native performance. Containers will link development to continuous integration. When application developers check code in, it will automatically be tested on a suite of different software and hardware architectures.
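
    The repeatable-execution idea can be illustrated with a small wrapper that runs one workflow step inside a container and records basic provenance (hypothetical image name and command; assumes a local Docker installation and is not the BEE implementation):

        # Illustrative container launch for a repeatable workflow step (hypothetical image
        # and command; assumes Docker is installed -- this is not the BEE implementation).
        import subprocess

        def run_step(image, command, workdir):
            """Run one workflow step in an isolated container and capture provenance."""
            cmd = [
                "docker", "run", "--rm",
                "-v", f"{workdir}:/data",        # mount the step's working directory
                image,
            ] + command
            result = subprocess.run(cmd, capture_output=True, text=True)
            provenance = {
                "image": image,
                "command": command,
                "returncode": result.returncode,
            }
            return provenance, result.stdout

        if __name__ == "__main__":
            prov, out = run_step("example/analysis:1.0", ["python", "/data/extract.py"], "/tmp/run1")
            print(prov)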

  16. Open chemistry: RESTful web APIs, JSON, NWChem and the modern web application.

    PubMed

    Hanwell, Marcus D; de Jong, Wibe A; Harris, Christopher J

    2017-10-30

    An end-to-end platform for chemical science research has been developed that integrates data from computational and experimental approaches through a modern web-based interface. The platform offers an interactive visualization and analytics environment that functions well on mobile, laptop and desktop devices. It offers pragmatic solutions to ensure that large and complex data sets are more accessible. Existing desktop applications/frameworks were extended to integrate with high-performance computing resources, and offer command-line tools to automate interaction, connecting distributed teams to this software platform on their own terms. The platform was developed openly, with all source code hosted on the GitHub platform and automated deployment possible using Ansible coupled with standard Ubuntu-based machine images deployed to cloud machines. The platform is designed to enable teams to reap the benefits of the connected web, going beyond what conventional search and analytics platforms offer in this area. It also has the goal of offering federated instances that can be customized to the sites/research performed. Data are stored using JSON, extending upon previous approaches using XML, building structures that support computational chemistry calculations. These structures were developed to make it easy to process data across different languages and send data to a JavaScript-based web client.
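
    A hedged sketch of how a client might talk to such a RESTful JSON API (the endpoint, field names and workflow are illustrative assumptions, not the platform's documented API):

        # Hypothetical client for a RESTful JSON chemistry-data API (endpoint and field
        # names are illustrative assumptions, not the platform's documented API).
        import json
        import urllib.request

        BASE_URL = "https://chemdata.example.org/api/v1"   # hypothetical instance

        def submit_calculation(molecule_smiles, method="B3LYP"):
            payload = json.dumps({"smiles": molecule_smiles, "method": method}).encode()
            req = urllib.request.Request(
                f"{BASE_URL}/calculations",
                data=payload,
                headers={"Content-Type": "application/json"},
                method="POST",
            )
            with urllib.request.urlopen(req) as resp:
                return json.load(resp)          # e.g. {"id": "...", "status": "queued"}

        def get_calculation(calc_id):
            with urllib.request.urlopen(f"{BASE_URL}/calculations/{calc_id}") as resp:
                return json.load(resp)

        if __name__ == "__main__":
            record = submit_calculation("CCO")
            print(get_calculation(record["id"]))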

  17. OpenFLUID: an open-source software environment for modelling fluxes in landscapes

    NASA Astrophysics Data System (ADS)

    Fabre, Jean-Christophe; Rabotin, Michaël; Crevoisier, David; Libres, Aline; Dagès, Cécile; Moussa, Roger; Lagacherie, Philippe; Raclot, Damien; Voltz, Marc

    2013-04-01

    Integrative landscape functioning has become a common concept in environmental management. Landscapes are complex systems where many processes interact in time and space. In agro-ecosystems, these processes are mainly physical processes, including hydrological processes, biological processes and human activities. Modelling such systems requires an interdisciplinary approach, coupling models coming from different disciplines, developed by different teams. In order to support collaborative work, involving many models coupled in time and space for integrative simulations, an open software modelling platform is a relevant answer. OpenFLUID is an open source software platform for modelling landscape functioning, mainly focused on spatial fluxes. It provides an advanced object-oriented architecture allowing users to i) couple models developed de novo or from existing source code, which are dynamically plugged into the platform, ii) represent landscapes as hierarchical graphs, taking into account multiple scales, spatial heterogeneities and landscape object connectivity, iii) run and explore simulations in many ways: using the OpenFLUID user interfaces (command line interface, graphical user interface), or using external applications such as GNU R through the provided ROpenFLUID package. OpenFLUID is developed in C++ and relies on open source libraries only (Boost, libXML2, GLib/GTK, OGR/GDAL, …). For modelers and developers, OpenFLUID provides a dedicated environment for model development, which is based on an open source toolchain, including the Eclipse editor, the GCC compiler and the CMake build system. OpenFLUID is distributed under the GPLv3 open source license, with a special exception allowing existing models licensed under any license to be plugged in. It is clearly in the spirit of sharing knowledge and favouring collaboration in a community of modelers. OpenFLUID has been involved in many research applications, such as modelling of hydrological network transfer, diagnosis and prediction of water quality taking into account human activities, study of the effect of spatial organization on hydrological fluxes, and modelling of surface-subsurface water exchanges. At the LISAH research unit, OpenFLUID is the supporting development platform of the MHYDAS model, which is a distributed model for agrosystems (Moussa et al., 2002, Hydrological Processes, 16, 393-412). OpenFLUID web site: http://www.openfluid-project.org

  18. BioContainers: an open-source and community-driven framework for software standardization.

    PubMed

    da Veiga Leprevost, Felipe; Grüning, Björn A; Alves Aflitos, Saulo; Röst, Hannes L; Uszkoreit, Julian; Barsnes, Harald; Vaudel, Marc; Moreno, Pablo; Gatto, Laurent; Weber, Jonas; Bai, Mingze; Jimenez, Rafael C; Sachsenberg, Timo; Pfeuffer, Julianus; Vera Alvarez, Roberto; Griss, Johannes; Nesvizhskii, Alexey I; Perez-Riverol, Yasset

    2017-08-15

    BioContainers (biocontainers.pro) is an open-source and community-driven framework that provides platform-independent executable environments for bioinformatics software. BioContainers allows labs of all sizes to easily install bioinformatics software, maintain multiple versions of the same software and combine tools into powerful analysis pipelines. BioContainers is based on the popular open-source Docker and rkt container frameworks, which allow software to be installed and executed in an isolated and controlled environment. It also provides infrastructure and basic guidelines to create, manage and distribute bioinformatics containers, with a special focus on omics technologies. These containers can be integrated into more comprehensive bioinformatics pipelines and different architectures (local desktop, cloud environments or HPC clusters). The software is freely available at github.com/BioContainers/. Contact: yperez@ebi.ac.uk. © The Author(s) 2017. Published by Oxford University Press.
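
    The sketch below shows how a containerized tool might be pulled and run from Python by shelling out to Docker. The image name and tag are placeholders; consult the BioContainers registry for actual images and versions.

      import os
      import subprocess

      # Placeholder image name/tag; check the BioContainers registry for real ones.
      image = "biocontainers/blast:v2.2.31_cv2"
      subprocess.run(["docker", "pull", image], check=True)

      # Run the containerized tool in isolation, mounting the working directory
      # so input/output files are visible inside the container.
      subprocess.run(
          ["docker", "run", "--rm",
           "-v", f"{os.getcwd()}:/data", "-w", "/data",
           image, "blastp", "-version"],
          check=True,
      )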

  19. BioContainers: an open-source and community-driven framework for software standardization

    PubMed Central

    da Veiga Leprevost, Felipe; Grüning, Björn A.; Alves Aflitos, Saulo; Röst, Hannes L.; Uszkoreit, Julian; Barsnes, Harald; Vaudel, Marc; Moreno, Pablo; Gatto, Laurent; Weber, Jonas; Bai, Mingze; Jimenez, Rafael C.; Sachsenberg, Timo; Pfeuffer, Julianus; Vera Alvarez, Roberto; Griss, Johannes; Nesvizhskii, Alexey I.; Perez-Riverol, Yasset

    2017-01-01

    Abstract. Motivation: BioContainers (biocontainers.pro) is an open-source and community-driven framework that provides platform-independent executable environments for bioinformatics software. BioContainers allows labs of all sizes to easily install bioinformatics software, maintain multiple versions of the same software and combine tools into powerful analysis pipelines. BioContainers is based on the popular open-source Docker and rkt container frameworks, which allow software to be installed and executed in an isolated and controlled environment. It also provides infrastructure and basic guidelines to create, manage and distribute bioinformatics containers, with a special focus on omics technologies. These containers can be integrated into more comprehensive bioinformatics pipelines and different architectures (local desktop, cloud environments or HPC clusters). Availability and Implementation: The software is freely available at github.com/BioContainers/. Contact: yperez@ebi.ac.uk PMID:28379341

  20. On-Board Software Reference Architecture for Payloads

    NASA Astrophysics Data System (ADS)

    Bos, Victor; Rugina, Ana; Trcka, Adam

    2016-08-01

    The goal of the On-board Software Reference Architecture for Payloads (OSRA-P) is to identify an architecture for payload software to harmonize the payload domain, to enable more reuse of common/generic payload software across different payloads and missions, and to ease the integration of the payloads with the platform. To investigate the payload domain, recent and current payload instruments of European space missions were analyzed. This led to a Payload Catalogue describing 12 payload instruments as well as a Capability Matrix listing specific characteristics of each payload. In addition, a functional decomposition of payload software was prepared which contains functionalities typically found in payload systems. The definition of OSRA-P was evaluated by case studies and a dedicated OSRA-P workshop to gather feedback from the payload community.

  1. An Embedded Systems Course for Engineering Students Using Open-Source Platforms in Wireless Scenarios

    ERIC Educational Resources Information Center

    Rodriguez-Sanchez, M. C.; Torrado-Carvajal, Angel; Vaquero, Joaquin; Borromeo, Susana; Hernandez-Tamames, Juan A.

    2016-01-01

    This paper presents a case study analyzing the advantages and disadvantages of using project-based learning (PBL) combined with collaborative learning (CL) and industry best practices, integrated with information communication technologies, open-source software, and open-source hardware tools, in a specialized microcontroller and embedded systems…

  2. An Integration Architecture of Virtual Campuses with External e-Learning Tools

    ERIC Educational Resources Information Center

    Navarro, Antonio; Cigarran, Juan; Huertas, Francisco; Rodriguez-Artacho, Miguel; Cogolludo, Alberto

    2014-01-01

    Technology enhanced learning relies on a variety of software architectures and platforms to provide different kinds of management service and enhanced instructional interaction. As e-learning support has become more complex, there is a need for virtual campuses that combine learning management systems with the services demanded by educational…

  3. Ubiquitous Computing for Remote Cardiac Patient Monitoring: A Survey

    PubMed Central

    Kumar, Sunil; Kambhatla, Kashyap; Hu, Fei; Lifson, Mark; Xiao, Yang

    2008-01-01

    New wireless technologies, such as wireless LAN and sensor networks, offer new possibilities for telecardiology: vital parameters can be monitored with wearable biomedical sensors, giving patients the freedom to be mobile while remaining under continuous monitoring, thereby improving the quality of patient care. This paper details the architecture and quality-of-service (QoS) characteristics of integrated wireless telecardiology platforms. It also discusses currently promising hardware/software platforms for wireless cardiac monitoring. The design methodology and challenges are presented for realistic implementation. PMID:18604301

  4. Ubiquitous computing for remote cardiac patient monitoring: a survey.

    PubMed

    Kumar, Sunil; Kambhatla, Kashyap; Hu, Fei; Lifson, Mark; Xiao, Yang

    2008-01-01

    New wireless technologies, such as wireless LAN and sensor networks, offer new possibilities for telecardiology: vital parameters can be monitored with wearable biomedical sensors, giving patients the freedom to be mobile while remaining under continuous monitoring, thereby improving the quality of patient care. This paper details the architecture and quality-of-service (QoS) characteristics of integrated wireless telecardiology platforms. It also discusses currently promising hardware/software platforms for wireless cardiac monitoring. The design methodology and challenges are presented for realistic implementation.

  5. DIMP: an interoperable solution for software integration and product data exchange

    NASA Astrophysics Data System (ADS)

    Wang, Xi Vincent; Xu, Xun William

    2012-08-01

    Today, globalisation has become one of the main trends in manufacturing business, leading to a world-wide decentralisation of resources not only amongst individual departments within one company but also amongst business partners. However, despite the developments and improvements of the last few decades, difficulties in information exchange and sharing still exist in heterogeneous application environments. This article is divided into two parts. In the first part, related research work and integration solutions are reviewed and discussed. The second part introduces a collaborative environment called the distributed interoperable manufacturing platform, which is based on a modular, service-oriented architecture (SOA). In the platform, the STEP-NC data model is used to facilitate data exchange among heterogeneous CAD/CAM/CNC systems.

  6. Performance Evaluation of Block Acquisition and Tracking Algorithms Using an Open Source GPS Receiver Platform

    NASA Technical Reports Server (NTRS)

    Ramachandran, Ganesh K.; Akopian, David; Heckler, Gregory W.; Winternitz, Luke B.

    2011-01-01

    Location technologies have many applications in wireless communications, military and space missions, etc. The US Global Positioning System (GPS) and other existing and emerging Global Navigation Satellite Systems (GNSS) are expected to provide accurate location information to enable such applications. While GNSS systems perform very well in strong signal conditions, their operation in many urban, indoor, and space applications is not robust or even impossible due to weak signals and strong distortions. The search for less costly, faster and more sensitive receivers is still in progress. As the research community addresses more and more complicated phenomena, there is a demand for flexible multimode reference receivers, associated SDKs, and development platforms which may accelerate and facilitate the research. One such concept is the software GPS/GNSS receiver (GPS SDR), which permits easier access to algorithmic libraries and the possibility of integrating more advanced algorithms without hardware changes or essential software updates. The GNU-SDR and GPS-SDR open-source receiver platforms are two popular examples. This paper evaluates the performance of recently proposed block-correlator techniques for acquisition and tracking of GPS signals using the open-source GPS-SDR platform.
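
    For orientation, the NumPy sketch below shows FFT-based parallel code-phase acquisition, a common software-receiver search technique; it is a generic illustration, not the specific block-correlator methods evaluated in the paper.

      import numpy as np

      def acquire(signal, code, doppler_bins, fs):
          """FFT-based parallel code-phase search over a set of Doppler bins.

          signal: complex baseband samples (one code period)
          code:   locally generated PRN code replica sampled at fs
          Returns (best_doppler_Hz, best_code_phase_samples, peak_metric).
          """
          n = len(signal)
          t = np.arange(n) / fs
          code_fft_conj = np.conj(np.fft.fft(code))
          best = (None, None, 0.0)
          for fd in doppler_bins:
              wiped = signal * np.exp(-2j * np.pi * fd * t)   # Doppler wipe-off
              corr = np.abs(np.fft.ifft(np.fft.fft(wiped) * code_fft_conj))
              k = int(np.argmax(corr))
              if corr[k] > best[2]:
                  best = (fd, k, float(corr[k]))
          return best

      # Example usage with synthetic data (illustrative only):
      # peak = acquire(samples, prn_code, np.arange(-5000, 5001, 500), fs=4.0e6)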

  7. Long-term real-time structural health monitoring using wireless smart sensor

    NASA Astrophysics Data System (ADS)

    Jang, Shinae; Mensah-Bonsu, Priscilla O.; Li, Jingcheng; Dahal, Sushil

    2013-04-01

    Improving the safety and security of civil infrastructure has been a critical issue for decades, since civil infrastructure plays a central role in the economics and politics of a modern society. Structural health monitoring of civil infrastructure using wireless smart sensor networks has recently emerged as a promising solution to increase structural reliability, enhance inspection quality, and reduce maintenance costs. Though hardware and software frameworks are well developed for wireless smart sensors, a long-term real-time health monitoring strategy is still not available due to the lack of a systematic interface. In this paper, the Imote2 smart sensor platform is employed, and a graphical user interface for long-term real-time structural health monitoring has been developed in Matlab for the Imote2 platform. This computer-aided engineering platform enables control and visualization of measured data, as well as a safety alarm feature based on fluctuations in modal properties. A new decision-making strategy to assess safety is also developed and integrated in this software. Laboratory validation of the computer-aided engineering platform for the Imote2 on a truss bridge and a building structure has shown the potential of the interface for long-term real-time structural health monitoring.
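
    A safety alarm based on modal property fluctuation can be as simple as comparing identified natural frequencies against a baseline, as in the Python sketch below; the actual decision rule used in the paper is not given in the abstract, so the tolerance and logic here are assumptions.

      import numpy as np

      def modal_alarm(baseline_freqs, current_freqs, tolerance=0.05):
          """Flag an alarm if any identified natural frequency deviates from its
          baseline value by more than the given relative tolerance (e.g. 5%).
          Illustrative decision rule only; not the paper's criterion."""
          baseline = np.asarray(baseline_freqs, dtype=float)
          current = np.asarray(current_freqs, dtype=float)
          deviation = np.abs(current - baseline) / baseline
          return bool(np.any(deviation > tolerance)), deviation

      # Example: baseline modes at 2.1, 6.4, 11.8 Hz; the third mode has dropped
      alarm, dev = modal_alarm([2.1, 6.4, 11.8], [2.0, 6.3, 10.9])
      print(alarm, dev)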

  8. Research on infrared small-target tracking technology under complex background

    NASA Astrophysics Data System (ADS)

    Liu, Lei; Wang, Xin; Chen, Jilu; Pan, Tao

    2012-10-01

    In this paper, some basic principles and the implementation flow charts of a series of target tracking algorithms are described. Building on this foundation, moving target tracking software based on OpenCV is developed with the software development platform MFC. Several tracking algorithms are integrated in this software, including the Kalman filter tracking method and the Camshift tracking method. In order to explain the software clearly, its framework and functions are described in this paper. Finally, the implementation processes and results are analyzed, and the target tracking algorithms are evaluated from both subjective and objective aspects. This work is significant for the application of infrared target tracking technology.
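
    As a point of reference, the sketch below shows a minimal Camshift tracker written against the OpenCV Python bindings; the video file name and initial target window are assumptions, and this is not the MFC application described in the paper.

      import cv2

      # Minimal Camshift tracking sketch with OpenCV (illustrative only).
      cap = cv2.VideoCapture("sequence.avi")      # hypothetical input file
      ok, frame = cap.read()
      x, y, w, h = 200, 150, 40, 40               # assumed initial target window
      roi = frame[y:y + h, x:x + w]
      hsv_roi = cv2.cvtColor(roi, cv2.COLOR_BGR2HSV)
      roi_hist = cv2.calcHist([hsv_roi], [0], None, [180], [0, 180])
      cv2.normalize(roi_hist, roi_hist, 0, 255, cv2.NORM_MINMAX)
      criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)
      window = (x, y, w, h)

      while True:
          ok, frame = cap.read()
          if not ok:
              break
          hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
          back_proj = cv2.calcBackProject([hsv], [0], roi_hist, [0, 180], 1)
          rot_rect, window = cv2.CamShift(back_proj, window, criteria)
          # 'window' now holds the tracked target's bounding box for this frame
      cap.release()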

  9. I3Mote: An Open Development Platform for the Intelligent Industrial Internet

    PubMed Central

    Martinez, Borja; Vilajosana, Xavier; Kim, Il Han; Zhou, Jianwei; Tuset-Peiró, Pere; Xhafa, Ariton; Poissonnier, Dominique; Lu, Xiaolin

    2017-01-01

    In this article we present the Intelligent Industrial Internet (I3) Mote, an open hardware platform targeting industrial connectivity and sensing deployments. The I3Mote features the most advanced low-power components to tackle sensing, on-board computing and wireless/wired connectivity for demanding industrial applications. The platform has been designed to fill the gap in the industrial prototyping and early deployment market with a compact form factor, low cost and robust industrial design. I3Mote is an advanced and compact prototyping system integrating the components required for deployment as a product, reducing the need for adopting industries to build their own tailored solutions. This article describes the platform design, firmware and software ecosystem and characterizes its performance in terms of energy consumption. PMID:28452945

  10. Integration of a mobile-integrated therapy with electronic health records: lessons learned.

    PubMed

    Peeples, Malinda M; Iyer, Anand K; Cohen, Joshua L

    2013-05-01

    Responses to the chronic disease epidemic have predominantly been standardized in their approach to date. Barriers to better health outcomes remain, and effective management requires patient-specific data and disease state knowledge be presented in methods that foster clinical decision-making and patient self-management. Mobile technology provides a new platform for data collection and patient-provider communication. The mobile device represents a personalized platform that is available to the patient on a 24/7 basis. Mobile-integrated therapy (MIT) is the convergence of mobile technology, clinical and behavioral science, and scientifically validated clinical outcomes. In this article, we highlight the lessons learned from functional integration of a Food and Drug Administration-cleared type 2 diabetes MIT into the electronic health record (EHR) of a multiphysician practice within a large, urban, academic medical center. In-depth interviews were conducted with integration stakeholder groups: mobile and EHR software and information technology teams, clinical end users, project managers, and business analysts. Interviews were summarized and categorized into lessons learned using the Architecture for Integrated Mobility® framework. Findings from the diverse stakeholder group of a MIT-EHR integration project indicate that user workflow, software system persistence, environment configuration, device connectivity and security, organizational processes, and data exchange heuristics are key issues that must be addressed. Mobile-integrated therapy that integrates patient self-management data with medical record data provides the opportunity to understand the potential benefits of bidirectional data sharing and reporting that are most valuable in advancing better health and better care in a cost-effective way that is scalable for all chronic diseases. © 2013 Diabetes Technology Society.

  11. Systems biology driven software design for the research enterprise.

    PubMed

    Boyle, John; Cavnor, Christopher; Killcoyne, Sarah; Shmulevich, Ilya

    2008-06-25

    In systems biology, and many other areas of research, there is a need for the interoperability of tools and data sources that were not originally designed to be integrated. Due to the interdisciplinary nature of systems biology, and its association with high-throughput experimental platforms, there is an additional need to continually integrate new technologies. As scientists work in isolated groups, integration with other groups is rarely a consideration when building the required software tools. We illustrate an approach, through the discussion of a purpose-built software architecture, which allows disparate groups to reuse tools and access data sources in a common manner. The architecture allows for: the rapid development of distributed applications; interoperability, so it can be used by a wide variety of developers and computational biologists; development using standard tools, so that it is easy to maintain and does not require a large development effort; extensibility, so that new technologies and data types can be incorporated; and non-intrusive development, insofar as researchers need not adhere to a pre-existing object model. By using a relatively simple integration strategy, based upon a common identity system and dynamically discovered interoperable services, a light-weight software architecture can become the focal point through which scientists can both get access to and analyse the plethora of experimentally derived data.
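
    To make the integration strategy concrete, the sketch below pairs a shared record identifier with a tiny run-time service registry; it illustrates the general idea only, not the authors' architecture, and all names are hypothetical.

      # Conceptual sketch: a common identifier plus a registry through which
      # analysis services can be discovered at run time.
      class ServiceRegistry:
          def __init__(self):
              self._services = {}

          def register(self, data_type, service):
              self._services.setdefault(data_type, []).append(service)

          def discover(self, data_type):
              return self._services.get(data_type, [])

      registry = ServiceRegistry()
      registry.register("microarray", lambda record_id: f"normalised:{record_id}")

      record_id = "EXP-000123"        # common identifier shared by all tools
      for service in registry.discover("microarray"):
          print(service(record_id))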

  12. An integrated development workflow for community-driven FOSS-projects using continuous integration tools

    NASA Astrophysics Data System (ADS)

    Bilke, Lars; Watanabe, Norihiro; Naumov, Dmitri; Kolditz, Olaf

    2016-04-01

    A complex software project with high standards for code quality generally requires automated tools to help developers with repetitive and tedious tasks such as compilation on different platforms and configurations, unit testing as well as end-to-end tests, and generating distributable binaries and documentation. This is known as continuous integration (CI). A community-driven FOSS project within the Earth Sciences benefits even more from CI, as time and resources for software development are often limited. Testing developed code on more than the developer's PC is therefore a task which is often neglected and where CI can be the solution. We developed an integrated workflow based on GitHub, Travis and Jenkins for the community project OpenGeoSys - a coupled multiphysics modeling and simulation package - allowing developers to concentrate on implementing new features in a tight feedback loop. Every interested developer/user can create a pull request containing source code modifications on the online collaboration platform GitHub. The modifications are checked (compilation, compiler warnings, memory leaks, undefined behaviors, unit tests, end-to-end tests, differences in simulation results between changes, etc.) by the CI system, which automatically responds to the pull request or by email on success or failure with detailed reports, if necessary requesting improvements to the modifications. Core team developers review the modifications and merge them into the main development line once they satisfy agreed standards. We aim for efficient data structures and algorithms, self-explaining code, comprehensive documentation and high test code coverage. This workflow keeps entry barriers to getting involved in the project low and permits an agile development process concentrating on feature additions rather than software maintenance procedures.
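
    The Python sketch below mimics the kind of check sequence such a CI job might execute for every pull request; the build commands are generic CMake/CTest calls and are not taken from the OpenGeoSys configuration.

      import subprocess
      import sys

      def run(cmd, **kwargs):
          # Run one CI step; abort the job and report the failing command.
          result = subprocess.run(cmd, **kwargs)
          if result.returncode != 0:
              sys.exit(f"CI step failed: {' '.join(cmd)}")

      run(["cmake", "-S", ".", "-B", "build"])             # configure (CMake >= 3.13 syntax)
      run(["cmake", "--build", "build"])                   # compile, surfacing compiler warnings
      run(["ctest", "--output-on-failure"], cwd="build")   # unit and end-to-end tests
      print("all CI checks passed")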

  13. Modelling and Forecasting of Rice Yield in support of Crop Insurance

    NASA Astrophysics Data System (ADS)

    Weerts, A.; van Verseveld, W.; Trambauer, P.; de Vries, S.; Conijn, S.; van Valkengoed, E.; Hoekman, D.; Hengsdijk, H.; Schrevel, A.

    2016-12-01

    The Government of Indonesia has embarked on a policy to bring crop insurance to all of Indonesia's farmers. To support the Indonesian government, the G4INDO project (www.g4indo.org) is developing an integrated platform for judging and handling insurance claims. The platform brings together remotely sensed data (both visible and radar) with hydrological and crop modelling and forecasting to improve predictions in one forecasting platform (i.e. Delft-FEWS, Werner et al., 2013). The hydrological model and the crop model (LINTUL) are coupled on a time-stepping basis in the OpenStreams framework (see https://github.com/openstreams/wflow) and deployed in a Delft-FEWS forecasting platform to support seasonal forecasting of water availability and crop yield. We first present the general idea of the project and the integrated platform (including Sentinel 1 & 2 data), followed by first (reforecast) results of the coupled models for predicting water availability and crop yield in the Brantas catchment in Java, Indonesia. Werner, M., Schellekens, J., Gijsbers, P., Van Dijk, M., Van den Akker, O. and Heynert, K., 2013. The Delft-FEWS flow forecasting system, Environmental Modelling & Software; 40:65-77. DOI: 10.1016/j.envsoft.2012.07.010.
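
    Coupling two models "on a time-stepping basis" means advancing them together and exchanging state at every step. The Python sketch below illustrates that pattern with toy models; it is not the wflow/LINTUL implementation, and all numbers are placeholders.

      # Conceptual time-stepping coupling of a hydrological and a crop model
      # (illustrative only; not the wflow/LINTUL code).
      class HydrologyModel:
          def step(self, rainfall_mm):
              # toy water balance: a fraction of rainfall becomes plant-available water
              return 0.6 * rainfall_mm

      class CropModel:
          def __init__(self):
              self.biomass = 0.0

          def step(self, available_water_mm):
              # toy growth response limited by water availability
              self.biomass += min(available_water_mm, 10.0) * 0.2
              return self.biomass

      hydro, crop = HydrologyModel(), CropModel()
      rainfall_series = [12.0, 0.0, 5.0, 20.0, 3.0]    # hypothetical daily forcing
      for rain in rainfall_series:
          water = hydro.step(rain)          # hydrology provides water to the crop model
          yield_proxy = crop.step(water)    # crop model advances one day
      print(yield_proxy)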

  14. The Cloud-Based Integrated Data Viewer (IDV)

    NASA Astrophysics Data System (ADS)

    Fisher, Ward

    2015-04-01

    Maintaining software compatibility across new computing environments and the associated underlying hardware is a common problem for software engineers and scientific programmers. While there is a suite of tools and methodologies used in traditional software engineering environments to mitigate this issue, they are typically ignored by developers lacking a background in software engineering. The result is a large body of software which is simultaneously critical and difficult to maintain. Visualization software is particularly vulnerable to this problem, given its inherent dependency on particular graphics hardware and software APIs. The advent of cloud computing has provided a solution to this problem which was not previously practical on a large scale: application streaming. This technology allows a program to run entirely on a remote virtual machine while still allowing for interactivity and dynamic visualizations, with little-to-no re-engineering required. Through application streaming we are able to bring the same visualization to a desktop, a netbook, a smartphone, and the next generation of hardware, whatever it may be. Unidata has been able to harness application streaming to provide a tablet-compatible version of our visualization software, the Integrated Data Viewer (IDV). This work will examine the challenges associated with adapting the IDV to an application streaming platform, and includes a brief discussion of the underlying technologies involved. We will also discuss the differences between local software and software-as-a-service.

  15. Benchmarking of dynamic simulation predictions in two software platforms using an upper limb musculoskeletal model

    PubMed Central

    Saul, Katherine R.; Hu, Xiao; Goehler, Craig M.; Vidt, Meghan E.; Daly, Melissa; Velisar, Anca; Murray, Wendy M.

    2014-01-01

    Several open-source or commercially available software platforms are widely used to develop dynamic simulations of movement. While computational approaches are conceptually similar across platforms, technical differences in implementation may influence output. We present a new upper limb dynamic model as a tool to evaluate potential differences in predictive behavior between platforms. We evaluated to what extent differences in technical implementations in popular simulation software environments result in differences in kinematic predictions for single- and multi-joint movements, using EMG- and optimization-based approaches for deriving control signals. We illustrate the benchmarking comparison using the SIMM-Dynamics Pipeline-SD/Fast and OpenSim platforms. The most substantial divergence results from differences in muscle models and actuator paths. This model is a valuable resource and is available for download by other researchers. The model, data, and simulation results presented here can be used by future researchers to benchmark other software platforms and software upgrades for these two platforms. PMID:24995410

  16. Benchmarking of dynamic simulation predictions in two software platforms using an upper limb musculoskeletal model.

    PubMed

    Saul, Katherine R; Hu, Xiao; Goehler, Craig M; Vidt, Meghan E; Daly, Melissa; Velisar, Anca; Murray, Wendy M

    2015-01-01

    Several open-source or commercially available software platforms are widely used to develop dynamic simulations of movement. While computational approaches are conceptually similar across platforms, technical differences in implementation may influence output. We present a new upper limb dynamic model as a tool to evaluate potential differences in predictive behavior between platforms. We evaluated to what extent differences in technical implementations in popular simulation software environments result in differences in kinematic predictions for single- and multi-joint movements, using EMG- and optimization-based approaches for deriving control signals. We illustrate the benchmarking comparison using the SIMM-Dynamics Pipeline-SD/Fast and OpenSim platforms. The most substantial divergence results from differences in muscle models and actuator paths. This model is a valuable resource and is available for download by other researchers. The model, data, and simulation results presented here can be used by future researchers to benchmark other software platforms and software upgrades for these two platforms.

  17. Increasing Flight Software Reuse with OpenSatKit

    NASA Technical Reports Server (NTRS)

    McComas, David

    2018-01-01

    In January 2015 the NASA Goddard Space Flight Center (GSFC) released the Core Flight System (cFS) as open source under the NASA Open Source Agreement (NOSA) license. The cFS is based on flight software (FSW) developed for 12 spacecraft spanning nearly two decades of effort, and it can provide about a third of the FSW functionality for a low-earth orbiting scientific spacecraft. The cFS is a FSW framework that is portable, configurable, and extendable using a product line deployment model. However, the components are maintained separately, so the user must configure, integrate, and deploy them as a cohesive functional system. This can be very challenging, especially for organizations such as universities building cubesats that have minimal experience developing FSW. Supporting universities was one of the primary motivators for releasing the cFS under NOSA. This paper describes the OpenSatKit that was developed to address the cFS deployment challenges and to serve as a cFS training platform for new users. It provides a fully functional out-of-the-box software system that includes NASA's cFS, Ball Aerospace's command and control system COSMOS, and a NASA dynamic simulator called 42. The kit is freely available since all of the components have been released as open source. The kit runs on a Linux platform, includes 8 cFS applications, several kit-specific applications, and built-in demos illustrating how to use key application features. It also includes the software necessary to port the cFS to a Raspberry Pi and instructions for configuring COSMOS to communicate with the target. All of the demos and test scripts can be rerun unchanged with the cFS running on the Raspberry Pi. The cFS uses a 3-tiered layered architecture including a platform abstraction layer, a Core Flight Executive (cFE) middle layer, and an application layer. Similar to smart phones, the cFS application layer is the key architectural feature for users to extend the FSW functionality to meet their mission-specific requirements. The platform abstraction layer and the cFE layers go a step further than smart phones by providing a platform-agnostic Application Programmer Interface (API) that allows applications to run unchanged on different platforms. OpenSatKit can serve two significant architectural roles that will further help the adoption of the cFS and help create a community of users that can share assets. First, the kit is being enhanced to automate the integration of applications with the goal of creating a virtual cFS 'App Store'. Second, a platform certification test suite can be developed that would allow users to verify the port of the cFS to a new platform. This paper will describe the current state of these efforts and future plans.

  18. Massively Parallel, Molecular Analysis Platform Developed Using a CMOS Integrated Circuit With Biological Nanopores

    PubMed Central

    Roever, Stefan

    2012-01-01

    A massively parallel, low-cost molecular analysis platform will dramatically change the nature of protein, molecular and genomics research, DNA sequencing, and ultimately, molecular diagnostics. An integrated circuit (IC) with 264 sensors was fabricated using standard CMOS semiconductor processing technology. Each of these sensors is individually controlled with precision analog circuitry and is capable of single-molecule measurements. Under electronic and software control, the IC was used to demonstrate the feasibility of creating and detecting lipid bilayers and biological nanopores using wild-type α-hemolysin. The ability to dynamically create bilayers over each of the sensors will greatly accelerate pore development and pore mutation analysis. In addition, the noise performance of the IC was measured to be 30 fA (rms). With this noise performance, single-base detection of DNA was demonstrated using α-hemolysin. The data show that a single-molecule, electrical detection platform using biological nanopores can be operationalized and can ultimately scale to millions of sensors. Such a massively parallel platform will revolutionize molecular analysis and will completely change the field of molecular diagnostics in the future.

  19. Open chemistry: RESTful web APIs, JSON, NWChem and the modern web application

    DOE PAGES

    Hanwell, Marcus D.; de Jong, Wibe A.; Harris, Christopher J.

    2017-10-30

    An end-to-end platform for chemical science research has been developed that integrates data from computational and experimental approaches through a modern web-based interface. The platform offers an interactive visualization and analytics environment that functions well on mobile, laptop and desktop devices. It offers pragmatic solutions to ensure that large and complex data sets are more accessible. Existing desktop applications/frameworks were extended to integrate with high-performance computing resources and to offer command-line tools that automate interaction, connecting distributed teams to this software platform on their own terms. The platform was developed openly, with all source code hosted on the GitHub platform and automated deployment possible using Ansible coupled with standard Ubuntu-based machine images deployed to cloud machines. The platform is designed to enable teams to reap the benefits of the connected web, going beyond what conventional search and analytics platforms offer in this area. It also has the goal of offering federated instances that can be customized to the sites/research performed. Data is stored using JSON, extending upon previous approaches that used XML, with structures that support computational chemistry calculations. These structures were developed to make it easy to process data across different languages and to send data to a JavaScript-based web client.

  20. Open chemistry: RESTful web APIs, JSON, NWChem and the modern web application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanwell, Marcus D.; de Jong, Wibe A.; Harris, Christopher J.

    An end-to-end platform for chemical science research has been developed that integrates data from computational and experimental approaches through a modern web-based interface. The platform offers an interactive visualization and analytics environment that functions well on mobile, laptop and desktop devices. It offers pragmatic solutions to ensure that large and complex data sets are more accessible. Existing desktop applications/frameworks were extended to integrate with high-performance computing resources and to offer command-line tools that automate interaction, connecting distributed teams to this software platform on their own terms. The platform was developed openly, with all source code hosted on the GitHub platform and automated deployment possible using Ansible coupled with standard Ubuntu-based machine images deployed to cloud machines. The platform is designed to enable teams to reap the benefits of the connected web, going beyond what conventional search and analytics platforms offer in this area. It also has the goal of offering federated instances that can be customized to the sites/research performed. Data is stored using JSON, extending upon previous approaches that used XML, with structures that support computational chemistry calculations. These structures were developed to make it easy to process data across different languages and to send data to a JavaScript-based web client.

  1. Global tectonic reconstructions with continuously deforming and evolving rigid plates

    NASA Astrophysics Data System (ADS)

    Gurnis, Michael; Yang, Ting; Cannon, John; Turner, Mark; Williams, Simon; Flament, Nicolas; Müller, R. Dietmar

    2018-07-01

    Traditional plate reconstruction methodologies do not allow for plate deformation to be considered. Here we present software to construct and visualize global tectonic reconstructions with deforming plates within the context of rigid plates. Both deforming and rigid plates are defined by continuously evolving polygons. The deforming regions are tessellated with triangular meshes such that either strain rate or cumulative strain can be followed. The finite strain history, crustal thickness and stretching factor of points within the deformation zones are tracked as Lagrangian points. Integrating these tools within the interactive platform GPlates enables specialized users to build and refine deforming plate models and integrate them with other models in time and space. We demonstrate the integrated platform with regional reconstructions of Cenozoic western North America, the Mesozoic South American Atlantic margin, and Cenozoic southeast Asia, embedded within global reconstructions, using different data and reconstruction strategies.

  2. A Vision-Based Driver Nighttime Assistance and Surveillance System Based on Intelligent Image Sensing Techniques and a Heterogamous Dual-Core Embedded System Architecture

    PubMed Central

    Chen, Yen-Lin; Chiang, Hsin-Han; Chiang, Chuan-Yen; Liu, Chuan-Ming; Yuan, Shyan-Ming; Wang, Jenq-Haur

    2012-01-01

    This study proposes a vision-based intelligent nighttime driver assistance and surveillance system (VIDASS system) implemented by a set of embedded software components and modules, and integrates these modules to accomplish a component-based system framework on an embedded heterogamous dual-core platform. Therefore, this study develops and implements computer vision and sensing techniques of nighttime vehicle detection, collision warning determination, and traffic event recording. The proposed system processes the road-scene frames in front of the host car captured from CCD sensors mounted on the host vehicle. These vision-based sensing and processing technologies are integrated and implemented on an ARM-DSP heterogamous dual-core embedded platform. Peripheral devices, including image grabbing devices, communication modules, and other in-vehicle control devices, are also integrated to form an in-vehicle-embedded vision-based nighttime driver assistance and surveillance system. PMID:22736956

  3. A vision-based driver nighttime assistance and surveillance system based on intelligent image sensing techniques and a heterogamous dual-core embedded system architecture.

    PubMed

    Chen, Yen-Lin; Chiang, Hsin-Han; Chiang, Chuan-Yen; Liu, Chuan-Ming; Yuan, Shyan-Ming; Wang, Jenq-Haur

    2012-01-01

    This study proposes a vision-based intelligent nighttime driver assistance and surveillance system (VIDASS system) implemented by a set of embedded software components and modules, and integrates these modules to accomplish a component-based system framework on an embedded heterogamous dual-core platform. Therefore, this study develops and implements computer vision and sensing techniques of nighttime vehicle detection, collision warning determination, and traffic event recording. The proposed system processes the road-scene frames in front of the host car captured from CCD sensors mounted on the host vehicle. These vision-based sensing and processing technologies are integrated and implemented on an ARM-DSP heterogamous dual-core embedded platform. Peripheral devices, including image grabbing devices, communication modules, and other in-vehicle control devices, are also integrated to form an in-vehicle-embedded vision-based nighttime driver assistance and surveillance system.

  4. PLUS: open-source toolkit for ultrasound-guided intervention systems.

    PubMed

    Lasso, Andras; Heffter, Tamas; Rankin, Adam; Pinter, Csaba; Ungi, Tamas; Fichtinger, Gabor

    2014-10-01

    A variety of advanced image analysis methods have been under development for ultrasound-guided interventions. Unfortunately, the transition from an image analysis algorithm to clinical feasibility trials as part of an intervention system requires integration of many components, such as imaging and tracking devices, data processing algorithms, and visualization software. The objective of our paper is to provide a freely available open-source software platform, PLUS (Public software Library for Ultrasound), to facilitate rapid prototyping of ultrasound-guided intervention systems for translational clinical research. PLUS provides a variety of methods for interventional tool pose and ultrasound image acquisition from a wide range of tracking and imaging devices, spatial and temporal calibration, volume reconstruction, simulated image generation, and recording and live streaming of the acquired data. This paper introduces PLUS, explains its functionality and architecture, and presents typical uses and performance in ultrasound-guided intervention systems. PLUS fulfills the essential requirements for the development of ultrasound-guided intervention systems and it aspires to become a widely used translational research prototyping platform. PLUS is freely available as open-source software under the BSD license and can be downloaded from http://www.plustoolkit.org.

  5. Multidisciplinary Tool for Systems Analysis of Planetary Entry, Descent, and Landing

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.

    2011-01-01

    Systems analysis of planetary entry, descent, and landing (EDL) is inherently a multidisciplinary activity. SAPE, a systems analysis tool for planetary EDL, improves the performance of the systems analysis team by automating and streamlining the process, and this improvement can reduce the errors that stem from manual data transfer among discipline experts. SAPE is a multidisciplinary tool for systems analysis of planetary EDL for Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune, and Titan. It performs EDL systems analysis for any planet, operates cross-platform (i.e., Windows, Mac, and Linux operating systems), uses existing software components and open-source software to avoid software licensing issues, performs low-fidelity systems analysis in one hour on a computer that is comparable to an average laptop, and keeps discipline experts in the analysis loop. SAPE uses Python, a platform-independent, open-source language, for integration and for the user interface. Development has relied heavily on the object-oriented programming capabilities that are available in Python. Modules are provided to interface with commercial and government off-the-shelf software components (e.g., thermal protection systems and finite-element analysis). SAPE currently includes the following analysis modules: geometry, trajectory, aerodynamics, aerothermal, thermal protection system, and an interface for structural sizing.
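
    As a rough illustration of how discipline modules can be chained through a shared data structure in Python, consider the sketch below; the module names, fields, and formulas are hypothetical and do not reflect the actual SAPE interfaces.

      # Illustrative chaining of discipline modules (not the actual SAPE code).
      def geometry(design):
          design["area_m2"] = 3.1416 * design["diameter_m"] ** 2 / 4.0
          return design

      def trajectory(design):
          # toy relation: ballistic coefficient from mass, drag and reference area
          design["ballistic_coeff"] = design["mass_kg"] / (design["drag_coeff"] * design["area_m2"])
          return design

      def thermal_protection(design):
          design["tps_mass_kg"] = 0.1 * design["mass_kg"]   # toy sizing rule
          return design

      design = {"diameter_m": 4.5, "mass_kg": 3000.0, "drag_coeff": 1.6}
      for module in (geometry, trajectory, thermal_protection):
          design = module(design)       # each module reads and extends the shared record
      print(design)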

  6. ACQ4: an open-source software platform for data acquisition and analysis in neurophysiology research

    PubMed Central

    Campagnola, Luke; Kratz, Megan B.; Manis, Paul B.

    2014-01-01

    The complexity of modern neurophysiology experiments requires specialized software to coordinate multiple acquisition devices and analyze the collected data. We have developed ACQ4, an open-source software platform for performing data acquisition and analysis in experimental neurophysiology. This software integrates the tasks of acquiring, managing, and analyzing experimental data. ACQ4 has been used primarily for standard patch-clamp electrophysiology, laser scanning photostimulation, multiphoton microscopy, intrinsic imaging, and calcium imaging. The system is highly modular, which facilitates the addition of new devices and functionality. The modules included with ACQ4 provide for rapid construction of acquisition protocols, live video display, and customizable analysis tools. Position-aware data collection allows automated construction of image mosaics and registration of images with 3-dimensional anatomical atlases. ACQ4 uses free and open-source tools including Python, NumPy/SciPy for numerical computation, PyQt for the user interface, and PyQtGraph for scientific graphics. Supported hardware includes cameras, patch clamp amplifiers, scanning mirrors, lasers, shutters, Pockels cells, motorized stages, and more. ACQ4 is available for download at http://www.acq4.org. PMID:24523692

  7. Enhancing participatory approach in water resources management: development of a survey to evaluate stakeholders needs and priorities related to software capabilities

    NASA Astrophysics Data System (ADS)

    Foglia, L.; Rossetto, R.; Borsi, I.; Josef, S.; Boukalova, Z.; Triana, F.; Ghetta, M.; Sabbatini, T.; Bonari, E.; Cannata, M.; De Filippis, G.

    2016-12-01

    The EU H2020 FREEWAT project (FREE and open source software tools for WATer resource management) aims at simplifying the application of EU water-related Directives by developing an open-source and public-domain, GIS-integrated platform for planning and management of ground- and surface-water resources. The FREEWAT platform is conceived as a canvas where several distributed and physically-based simulation codes are virtually integrated. The choice of such codes was supported by the results of a survey performed by means of questionnaires distributed to 14 case-study FREEWAT project partners and several stakeholders. This was performed in the first phase of the project within WP 6 (Enhanced science and participatory approach evidence-based decision making), Task 6.1 (Definition of a "needs/tools" evaluation grid). About 30% of the invited entities and institutions from several EU and non-EU countries expressed their interest in contributing to the survey. Most of them were research institutions, government and geoenvironmental companies, and river basin authorities. The results of the questionnaire provided a spectrum of needs and priorities of partners/stakeholders, which were addressed during the development phase of the FREEWAT platform. The main needs identified were related to ground- and surface-water quality, sustainable water management, interaction between groundwater/surface-water bodies, and design and management of Managed Aquifer Recharge schemes. Needs and priorities were then connected to the specific EU Directives and Regulations to be addressed. One of the main goals of the questionnaires was to collect information and suggestions regarding the use of existing commercial/open-source software tools to address needs and priorities, and regarding the need to address specific water-related processes/problems.

  8. Virtual Platform for See Robustness Verification of Bootloader Embedded Software on Board Solar Orbiter's Energetic Particle Detector

    NASA Astrophysics Data System (ADS)

    Da Silva, A.; Sánchez Prieto, S.; Polo, O.; Parra Espada, P.

    2013-05-01

    Because of the tough robustness requirements in space software development, it is imperative to carry out verification tasks at a very early development stage to ensure that the implemented exception mechanisms work properly. All this should be done long before the real hardware is available. But even if real hardware is available, the verification of software fault tolerance mechanisms can be difficult, since real faulty situations must be systematically and artificially brought about, which can be impossible on real hardware. To solve this problem the Alcala Space Research Group (SRG) has developed a LEON2 virtual platform (Leon2ViP) with fault injection capabilities. This way it is possible to run the exact same target binary software as runs on the physical system, in a more controlled and deterministic environment, allowing stricter requirements verification. Leon2ViP enables unmanned and tightly focused fault injection campaigns, not possible otherwise, in order to expose and diagnose flaws in the software implementation early. Furthermore, the use of a virtual hardware-in-the-loop approach makes it possible to carry out preliminary integration tests with the spacecraft emulator or the sensors. The use of Leon2ViP has meant a significant improvement, in both time and cost, in the development and verification processes of the Instrument Control Unit boot software on board Solar Orbiter's Energetic Particle Detector.

  9. Evaluating the ecological benefits of wildfire by integrating fire and ecosystem simulation models

    Treesearch

    Robert E. Keane; Eva Karau

    2010-01-01

    Fire managers are now realizing that wildfires can be beneficial because they can reduce hazardous fuels and restore fire-dominated ecosystems. A software tool that assesses potential beneficial and detrimental ecological effects from wildfire would be helpful to fire management. This paper presents a simulation platform called FLEAT (Fire and Landscape Ecology...

  10. The IMS Software Integration Platform

    DTIC Science & Technology

    1993-04-12

    Only fragments of this scanned report are recoverable. The IMS applications share data products including time-series, images, and algorithm-specific parameters, and the report outlines a database programming interface grouped into categories such as session utilities (dbwhoami, dbcancel), transaction management (dbcommit, dbrollback), key counter assignment (dbgetcounter), and string handling (cstr_to_pad, pad_to_cstr).

  11. The EPA CompTox Dashboard and Underpinning Software Architecture – a platform for data integration for environmental chemistry data (ACS Fall Meeting 7 of 12)

    EPA Science Inventory

    The CompTox Dashboard was developed by the Environmental Protection Agency’s National Center for Computational Toxicology. This dashboard has been architected in a manner that allows for the deployment of multiple “applications”, both as publicly available databases, and for dep...

  12. The Virtual Learning Environment ROODA: An Institutional Project of Long Distance Education

    ERIC Educational Resources Information Center

    Behar, Patricia Alejandra; Leite, Silvia Meirelles

    2006-01-01

    This article describes ROODA (http://www.homer.nuted.edu.ufrgs.br), a virtual learning environment and one of the official Long Distance Education platforms that has been in use since 2005 at the Federal University of Rio Grande do Sul (UFRGS), Porto Alegre, Brazil. It is free software that integrates synchronous and asynchronous…

  13. The application of a Web-geographic information system for improving urban water cycle modelling.

    PubMed

    Mair, M; Mikovits, C; Sengthaler, M; Schöpf, M; Kinzel, H; Urich, C; Kleidorfer, M; Sitzenfrei, R; Rauch, W

    2014-01-01

    Research in urban water management has experienced a transition from traditional model applications to modelling water cycles as an integrated part of urban areas. This includes interlinking models from many research areas (e.g. urban development, socio-economy, urban water management). The integration and simulation are realized in newly developed frameworks (e.g. DynaMind and OpenMI) and often assume advanced programming knowledge. This work presents a Web-based urban water management modelling platform which simplifies the setup and usage of complex integrated models. The platform is demonstrated with a small application example on a case study within the Alpine region. The model used is a DynaMind model benchmarking the impact of newly connected catchments on the flooding behaviour of an existing combined sewer system. As a result, the workflow of the user within a Web browser is demonstrated and benchmark results are shown. The presented platform hides implementation-specific aspects behind Web-service-based technologies so that users can focus on their main aim, which is urban water management modelling and benchmarking. Moreover, this platform offers centralized data management, automatic software updates, and access to high-performance computers, accessible from desktop computers and mobile devices.

  14. Publishing Platform for Scientific Software - Lessons Learned

    NASA Astrophysics Data System (ADS)

    Hammitzsch, Martin; Fritzsch, Bernadette; Reusser, Dominik; Brembs, Björn; Deinzer, Gernot; Loewe, Peter; Fenner, Martin; van Edig, Xenia; Bertelmann, Roland; Pampel, Heinz; Klump, Jens; Wächter, Joachim

    2015-04-01

    Scientific software has become an indispensable commodity for the production, processing and analysis of empirical data but also for modelling and simulation of complex processes. Software has a significant influence on the quality of research results. For strengthening the recognition of the academic performance of scientific software development, for increasing its visibility and for promoting the reproducibility of research results, concepts for the publication of scientific software have to be developed, tested, evaluated, and then transferred into operations. For this, the publication and citability of scientific software have to fulfil scientific criteria by means of defined processes and the use of persistent identifiers, similar to data publications. The SciForge project is addressing these challenges. Based on interviews a blueprint for a scientific software publishing platform and a systematic implementation plan has been designed. In addition, the potential of journals, software repositories and persistent identifiers have been evaluated to improve the publication and dissemination of reusable software solutions. It is important that procedures for publishing software as well as methods and tools for software engineering are reflected in the architecture of the platform, in order to improve the quality of the software and the results of research. In addition, it is necessary to work continuously on improving specific conditions that promote the adoption and sustainable utilization of scientific software publications. Among others, this would include policies for the development and publication of scientific software in the institutions but also policies for establishing the necessary competencies and skills of scientists and IT personnel. To implement the concepts developed in SciForge a combined bottom-up / top-down approach is considered that will be implemented in parallel in different scientific domains, e.g. in earth sciences, climate research and the life sciences. Based on the developed blueprints a scientific software publishing platform will be iteratively implemented, tested, and evaluated. Thus the platform should be developed continuously on the basis of gained experiences and results. The platform services will be extended one by one corresponding to the requirements of the communities. Thus the implemented platform for the publication of scientific software can be improved and stabilized incrementally as a tool with software, science, publishing, and user oriented features.

  15. Cloud-enabled microscopy and droplet microfluidic platform for specific detection of Escherichia coli in water.

    PubMed

    Golberg, Alexander; Linshiz, Gregory; Kravets, Ilia; Stawski, Nina; Hillson, Nathan J; Yarmush, Martin L; Marks, Robert S; Konry, Tania

    2014-01-01

    We report an all-in-one platform - ScanDrop - for the rapid and specific capture, detection, and identification of bacteria in drinking water. The ScanDrop platform integrates droplet microfluidics, a portable imaging system, and cloud-based control software and data storage. The cloud-based control software and data storage enables robotic image acquisition, remote image processing, and rapid data sharing. These features form a "cloud" network for water quality monitoring. We have demonstrated the capability of ScanDrop to perform water quality monitoring via the detection of an indicator coliform bacterium, Escherichia coli, in drinking water contaminated with feces. Magnetic beads conjugated with antibodies to E. coli antigen were used to selectively capture and isolate specific bacteria from water samples. The bead-captured bacteria were co-encapsulated in pico-liter droplets with fluorescently-labeled anti-E. coli antibodies, and imaged with an automated custom designed fluorescence microscope. The entire water quality diagnostic process required 8 hours from sample collection to online-accessible results compared with 2-4 days for other currently available standard detection methods.

  16. Cloud-Enabled Microscopy and Droplet Microfluidic Platform for Specific Detection of Escherichia coli in Water

    PubMed Central

    Kravets, Ilia; Stawski, Nina; Hillson, Nathan J.; Yarmush, Martin L.; Marks, Robert S.; Konry, Tania

    2014-01-01

    We report an all-in-one platform – ScanDrop – for the rapid and specific capture, detection, and identification of bacteria in drinking water. The ScanDrop platform integrates droplet microfluidics, a portable imaging system, and cloud-based control software and data storage. The cloud-based control software and data storage enables robotic image acquisition, remote image processing, and rapid data sharing. These features form a “cloud” network for water quality monitoring. We have demonstrated the capability of ScanDrop to perform water quality monitoring via the detection of an indicator coliform bacterium, Escherichia coli, in drinking water contaminated with feces. Magnetic beads conjugated with antibodies to E. coli antigen were used to selectively capture and isolate specific bacteria from water samples. The bead-captured bacteria were co-encapsulated in pico-liter droplets with fluorescently-labeled anti-E. coli antibodies, and imaged with an automated custom designed fluorescence microscope. The entire water quality diagnostic process required 8 hours from sample collection to online-accessible results compared with 2–4 days for other currently available standard detection methods. PMID:24475107

  17. Explicet: graphical user interface software for metadata-driven management, analysis and visualization of microbiome data.

    PubMed

    Robertson, Charles E; Harris, J Kirk; Wagner, Brandie D; Granger, David; Browne, Kathy; Tatem, Beth; Feazel, Leah M; Park, Kristin; Pace, Norman R; Frank, Daniel N

    2013-12-01

    Studies of the human microbiome, and microbial community ecology in general, have blossomed of late and are now a burgeoning source of exciting research findings. Along with the advent of next-generation sequencing platforms, which have dramatically increased the scope of microbiome-related projects, several high-performance sequence analysis pipelines (e.g. QIIME, MOTHUR, VAMPS) are now available to investigators for microbiome analysis. The subject of our manuscript, the graphical user interface-based Explicet software package, fills a previously unmet need for a robust, yet intuitive means of integrating the outputs of the software pipelines with user-specified metadata and then visualizing the combined data.

  18. Radiation Hardening by Software Techniques on FPGAs: Flight Experiment Evaluation and Results

    NASA Technical Reports Server (NTRS)

    Schmidt, Andrew G.; Flatley, Thomas

    2017-01-01

    We present our work on implementing Radiation Hardening by Software (RHBSW) techniques on the PowerPC 440 processors of the Xilinx Virtex5 FPGAs on the SpaceCube 2.0 platform. The techniques have been matured and tested through simulation modeling, fault emulation, laser fault injection, and now in a flight experiment, as part of the Space Test Program-Houston 4-ISS SpaceCube Experiment 2.0 (STP-H4-ISE 2.0). This work leverages concepts such as heartbeat monitoring, control flow assertions, and checkpointing, commonly used in the high-performance computing industry, and adapts them for use in remote sensing embedded systems. These techniques have extremely low overhead (typically <1.3%), enabling a 3.3x gain in processing performance as compared to the equivalent traditionally radiation-hardened processor. The recently concluded STP-H4 flight experiment was an opportunity to upgrade the RHBSW techniques for the Virtex5 FPGA and demonstrate them on board the ISS to achieve TRL 7. This work details the implementation of the RHBSW techniques, previously developed for the Virtex4-based SpaceCube 1.0 platform, on the Virtex5-based SpaceCube 2.0 flight platform. The evaluation spans the development and integration with flight software, remotely uploading the new experiment to the ISS SpaceCube 2.0 platform, and conducting the experiment continuously for 16 days before the platform was decommissioned. The experiment was conducted on two PowerPCs embedded within the Virtex5 FPGA devices; it collected 19,400 checkpoints, processed 253,482 status messages, and incurred 0 faults. These results are highly encouraging, and future work is looking into longer-duration testing as part of the STP-H5 flight experiment.
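
    To illustrate two of the named concepts, heartbeat monitoring and checkpointing, the Python sketch below shows the general pattern; the flight implementation runs on embedded PowerPC processors and is not reproduced here, so everything below is a generic, hypothetical example.

      import pickle
      import time

      # Checkpointing: periodically persist recoverable state so processing can
      # roll back to the last good state after a detected fault.
      def save_checkpoint(state, path="checkpoint.bin"):
          with open(path, "wb") as f:
              pickle.dump(state, f)

      def load_checkpoint(path="checkpoint.bin"):
          with open(path, "rb") as f:
              return pickle.load(f)

      # Heartbeat monitoring: a supervisor declares the task hung if it has not
      # reported progress within the timeout.
      class HeartbeatMonitor:
          def __init__(self, timeout_s):
              self.timeout_s = timeout_s
              self.last_beat = time.monotonic()

          def beat(self):
              self.last_beat = time.monotonic()      # called by the monitored task

          def is_alive(self):
              return (time.monotonic() - self.last_beat) < self.timeout_s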

  19. S-Genius, a universal software platform with versatile inverse problem resolution for scatterometry

    NASA Astrophysics Data System (ADS)

    Fuard, David; Troscompt, Nicolas; El Kalyoubi, Ismael; Soulan, Sébastien; Besacier, Maxime

    2013-05-01

    S-Genius is a new universal scatterometry platform which gathers all the LTM-CNRS know-how regarding rigorous electromagnetic computation and several inverse problem solver solutions. This software platform is built to be a user-friendly, light, swift, accurate, user-oriented scatterometry tool, compatible with any ellipsometric measurements to fit and any type of pattern. It aims to combine a set of inverse problem solver capabilities (via adapted Levenberg-Marquardt optimization, kriging, and neural network solutions) that greatly improve the reliability and the speed of the solution determination. Furthermore, as the model solution is mainly vulnerable to material optical properties, S-Genius may be coupled with an innovative determination of material refractive indices. This paper focuses in more detail on the modified Levenberg-Marquardt optimization, one of the indirect-method solvers built up in parallel with the overall S-Genius software coding by the present author. This modified Levenberg-Marquardt optimization corresponds to a Newton algorithm with a damping parameter adapted to the definition domains of the optimized parameters. Currently, S-Genius is technically ready for scientific collaboration, python-powered, multi-platform (Windows/Linux/macOS), multi-core, ready for 2D (infinite features along the direction perpendicular to the incident plane), conical, and 3D-feature computation, compatible with all kinds of input data from any possible ellipsometer (angle- or wavelength-resolved) or reflectometer, and widely used in our laboratory for resist trimming studies, the characterization of etched features (such as complex stacks), and nano-imprint lithography measurements, for instance. The work on the kriging solver, the neural network solver, and the determination of material refractive indices is done (or nearly so) by other LTM members and is about to be integrated into the S-Genius platform.
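
    To illustrate the kind of damped, domain-bounded update the abstract describes, here is a minimal sketch of a Levenberg-Marquardt-style step with parameter clamping; the toy linear model and all parameter values are assumptions for illustration, not S-Genius code:

      import numpy as np

      def lm_step(params, residual, jacobian, lam):
          """One damped Gauss-Newton (Levenberg-Marquardt) update of the model parameters."""
          r = residual(params)              # measured minus simulated signature
          J = jacobian(params)              # sensitivity of the signature to each parameter
          H = J.T @ J
          step = np.linalg.solve(H + lam * np.diag(np.diag(H)), -J.T @ r)
          return params + step

      def clamp(params, lower, upper):
          """Keep parameters inside their physical definition domains."""
          return np.minimum(np.maximum(params, lower), upper)

      # toy example: fit y = a*x + b to noisy data with bounded parameters
      x = np.linspace(0.0, 1.0, 50)
      y = 2.0 * x + 1.0 + 0.01 * np.random.randn(50)
      residual = lambda p: p[0] * x + p[1] - y
      jacobian = lambda p: np.column_stack([x, np.ones_like(x)])
      p = np.array([0.0, 0.0])
      for _ in range(20):
          p = clamp(lm_step(p, residual, jacobian, lam=1e-3), lower=-10.0, upper=10.0)
      print(p)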

  20. SITHON: An Airborne Fire Detection System Compliant with Operational Tactical Requirements

    PubMed Central

    Kontoes, Charalabos; Keramitsoglou, Iphigenia; Sifakis, Nicolaos; Konstantinidis, Pavlos

    2009-01-01

    In response to the urgent need of fire managers for timely information on fire location and extent, the SITHON system was developed. SITHON is a fully digital thermal imaging system, integrating INS/GPS and a digital camera, designed to provide timely positioned and projected thermal images and video data streams that can be rapidly integrated into the GIS operated by Crisis Control Centres. This article presents in detail the hardware and software components of SITHON and demonstrates the first encouraging results of test flights over the Sithonia Peninsula in Northern Greece. It is envisaged that the SITHON system will soon be operated onboard various airborne platforms, including fire brigade airplanes and helicopters as well as UAV platforms owned and operated by the Greek Air Force. PMID:22399963

  1. [Porting Radiotherapy Software of Varian to Cloud Platform].

    PubMed

    Zou, Lian; Zhang, Weisha; Liu, Xiangxiang; Xie, Zhao; Xie, Yaoqin

    2017-09-30

    To develop a low-cost private cloud platform for radiotherapy software. First, a private cloud platform based on OpenStack and virtual GPU hardware was built. Then, all the Varian radiotherapy software modules were installed on virtual machines on the private cloud platform, and the corresponding functions were configured. Finally, the cloud-hosted software could be accessed through a virtual desktop client. Function tests of the cloud workstation show that a cloud workstation is equivalent to an isolated physical workstation, and any client on the LAN can use the cloud workstation smoothly. The cloud migration described in this study is economical and practical. The project not only improves the utilization of radiotherapy software, but also makes it possible for cloud computing technology to expand its applications into the field of radiation oncology.

  2. Research on digital city geographic information common services platform

    NASA Astrophysics Data System (ADS)

    Chen, Dequan; Wu, Qunyong; Wang, Qinmin

    2008-10-01

    The traditional GIS (Geographic Information System) software development mode exposes many defects that largely slow down the progress of city informatization. There is an urgent need to build a common application infrastructure for informatization projects in order to speed up the development of the digital city. The advent of service-oriented architecture (SOA) has motivated the adoption of GIS functionality portals that can be executed in distributed computing environments. Following the SOA principle, we propose and design a digital city geographic information common services platform, which provides application development service interfaces for domain users that can be further extended into relevant business applications. Finally, a public-oriented Web GIS was developed on the platform to help public users query geographic information in their daily lives. This indicates that our platform can be conveniently integrated with other applications.
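
    As an illustration of how a client application might consume such a service-oriented geographic information platform, the sketch below issues a standard OGC WMS GetMap request; the endpoint URL, layer names, and bounding box are hypothetical and not taken from the platform described above:

      import requests

      # hypothetical endpoint of a map service exposed by the common services platform
      WMS_URL = "http://example-city.gov/geoservices/wms"

      params = {
          "SERVICE": "WMS", "VERSION": "1.1.1", "REQUEST": "GetMap",
          "LAYERS": "roads,buildings", "STYLES": "",
          "SRS": "EPSG:4326", "BBOX": "119.2,26.0,119.4,26.2",
          "WIDTH": 512, "HEIGHT": 512, "FORMAT": "image/png",
      }
      response = requests.get(WMS_URL, params=params, timeout=30)
      with open("map.png", "wb") as f:      # save the rendered map image
          f.write(response.content)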

  3. A Generic Software Architecture For Prognostics

    NASA Technical Reports Server (NTRS)

    Teubert, Christopher; Daigle, Matthew J.; Sankararaman, Shankar; Goebel, Kai; Watkins, Jason

    2017-01-01

    Prognostics is a systems engineering discipline focused on predicting end-of-life of components and systems. As a relatively new and emerging technology, there are few fielded implementations of prognostics, due in part to practitioners perceiving a large hurdle in developing the models, algorithms, architecture, and integration pieces. As a result, no open software frameworks for applying prognostics currently exist. This paper introduces the Generic Software Architecture for Prognostics (GSAP), an open-source, cross-platform, object-oriented software framework and support library for creating prognostics applications. GSAP was designed to make prognostics more accessible and enable faster adoption and implementation by industry, by reducing the effort and investment required to develop, test, and deploy prognostics. This paper describes the requirements, design, and testing of GSAP. Additionally, a detailed case study involving battery prognostics demonstrates its use.

  4. Time and Space Partition Platform for Safe and Secure Flight Software

    NASA Astrophysics Data System (ADS)

    Esquinas, Angel; Zamorano, Juan; de la Puente, Juan A.; Masmano, Miguel; Crespo, Alfons

    2012-08-01

    There are a number of research and development activities that are exploring Time and Space Partitioning (TSP) to implement safe and secure flight software. This approach allows different real-time applications with different levels of criticality to execute on the same computer board. In order to do that, flight applications must be isolated from each other in the temporal and spatial domains. This paper presents the first results of a partitioning platform based on the Open Ravenscar Kernel (ORK+) and the XtratuM hypervisor. ORK+ is a small, reliable real-time kernel supporting the Ada Ravenscar computational model that is central to the ASSERT development process. XtratuM supports multiple virtual machines, i.e. partitions, on a single computer and is being used in the Integrated Modular Avionics for Space study. ORK+ executes in an XtratuM partition, enabling Ada applications to share the computer board with other applications.

  5. Math Machines: Using Actuators in Physics Classes

    NASA Astrophysics Data System (ADS)

    Thomas, Frederick J.; Chaney, Robert A.; Gruesbeck, Marta

    2018-01-01

    Probeware (sensors combined with data-analysis software) is a well-established part of physics education. In engineering and technology, sensors are frequently paired with actuators—motors, heaters, buzzers, valves, color displays, medical dosing systems, and other devices that are activated by electrical signals to produce intentional physical change. This article describes how a 20-year project aimed at better integration of the STEM disciplines (science, technology, engineering and mathematics) uses brief actuator activities in physics instruction. Math Machines "actionware" includes software and hardware that convert virtually any free-form, time-dependent algebraic function into the dynamic actions of a stepper motor, servo motor, or RGB (red, green, blue) color mixer. With wheels and a platform, the stepper motor becomes LACI, a programmable vehicle. Adding a low-power laser module turns the servo motor into a programmable Pointer. Adding a gear and platform can transform the Pointer into an earthquake simulator.
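
    A minimal sketch of the idea of turning a free-form, time-dependent function into stepper-motor actions might look like the following; the step resolution, sample interval, and the example function are assumptions for illustration rather than the Math Machines implementation:

      import math

      STEPS_PER_REV = 200                    # assumed stepper resolution
      DEG_PER_STEP = 360.0 / STEPS_PER_REV

      def angle(t):
          """User-supplied, time-dependent function: target angle (degrees) at time t (s)."""
          return 90.0 * math.sin(2.0 * math.pi * 0.2 * t)

      def step_schedule(duration, dt=0.05):
          """Convert the function into signed step counts to issue in each time slice."""
          steps, position = [], 0
          for i in range(int(duration / dt)):
              target = round(angle(i * dt) / DEG_PER_STEP)
              steps.append(target - position)
              position = target
          return steps

      print(step_schedule(5.0)[:10])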

  6. MathWorks Simulink and C++ integration with the new VLT PLC-based standard development platform for instrument control systems

    NASA Astrophysics Data System (ADS)

    Kiekebusch, Mario J.; Di Lieto, Nicola; Sandrock, Stefan; Popovic, Dan; Chiozzi, Gianluca

    2014-07-01

    ESO is in the process of implementing a new development platform, based on PLCs, for upcoming VLT control systems (new instruments and refurbishing of existing systems to manage obsolescence issues). In this context, we have evaluated the integration and reuse of existing C++ libraries and Simulink models into the real-time environment of BECKHOFF Embedded PCs using the capabilities of the latest version of TwinCAT software and MathWorks Embedded Coder. While doing so the aim was to minimize the impact of the new platform by adopting fully tested solutions implemented in C++. This allows us to reuse the in house expertise, as well as extending the normal capabilities of the traditional PLC programming environments. We present the progress of this work and its application in two concrete cases: 1) field rotation compensation for instrument tracking devices like derotators, 2) the ESO standard axis controller (ESTAC), a generic model-based controller implemented in Simulink and used for the control of telescope main axes.

  7. Parameters that affect parallel processing for computational electromagnetic simulation codes on high performance computing clusters

    NASA Astrophysics Data System (ADS)

    Moon, Hongsik

    What is the impact of multicore and associated advanced technologies on computational software for science? Most researchers and students have multicore laptops or desktops for their research, and they need computing power to run computational software packages. Computing power was initially derived from Central Processing Unit (CPU) clock speed. That changed when increases in clock speed became constrained by power requirements. Chip manufacturers turned to multicore CPU architectures and associated technological advancements to create the CPUs for the future. Most software applications benefited from the increased computing power in the same way that increases in clock speed helped applications run faster. However, for Computational ElectroMagnetics (CEM) software developers, this change was not an obvious benefit - it appeared to be a detriment. Developers were challenged to find a way to correctly utilize the advancements in hardware so that their codes could benefit. The solution was parallelization, and this dissertation details the investigation to address these challenges. Prior to multicore CPUs, advanced computer technologies were compared using benchmark software, and the metric was FLoating-point Operations Per Second (FLOPS), which indicates system performance for scientific applications that make heavy use of floating-point calculations. Is FLOPS an effective metric for parallelized CEM simulation tools on new multicore systems? Parallel CEM software needs to be benchmarked not only by FLOPS but also by the performance of other parameters related to the type and utilization of the hardware, such as CPU, Random Access Memory (RAM), hard disk, network, etc. The codes need to be optimized for more than just FLOPS, and new parameters must be included in benchmarking. In this dissertation, the parallel CEM software named High Order Basis Based Integral Equation Solver (HOBBIES) is introduced. This code was developed to address the needs of changing computer hardware platforms in order to provide fast, accurate and efficient solutions to large, complex electromagnetic problems. The research in this dissertation proves that the performance of parallel code is intimately related to the configuration of the computer hardware and can be maximized for different hardware platforms. To benchmark and optimize the performance of parallel CEM software, a variety of large, complex projects are created and executed on a variety of computer platforms. The computer platforms used in this research are detailed in this dissertation. The projects run as benchmarks are also described in detail and results are presented. The parameters that affect parallel CEM software on High Performance Computing Clusters (HPCC) are investigated. This research demonstrates methods to maximize the performance of parallel CEM software code.
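
    For readers unfamiliar with the FLOPS metric discussed above, the sketch below estimates effective floating-point throughput from a dense matrix multiplication; it is a generic illustration, not the HOBBIES benchmark, and real CEM benchmarking would also track RAM, disk, and network behavior as the dissertation argues:

      import time
      import numpy as np

      n = 2000
      a = np.random.rand(n, n)
      b = np.random.rand(n, n)

      start = time.perf_counter()
      c = a @ b                         # dense matrix product
      elapsed = time.perf_counter() - start

      flops = 2.0 * n ** 3              # approximate multiply-add count of an n x n product
      print(f"{flops / elapsed / 1e9:.1f} GFLOPS effective")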

  8. Adding Value to the Network: Exploring the Software as a Service and Platform as a Service Models for Mobile Operators

    NASA Astrophysics Data System (ADS)

    Gonçalves, Vânia

    The environments of software development and software provision are shifting to Web-based platforms supported by Platform/Software as a Service (PaaS/SaaS) models. This paper will make the case that there is equally an opportunity for mobile operators to identify additional sources of revenue by exposing network functionalities through Web-based service platforms. By elaborating on the concepts, benefits and risks of SaaS and PaaS, several factors that should be taken into consideration in applying these models to the telecom world are delineated.

  9. The Common Data Acquisition Platform in the Helmholtz Association

    NASA Astrophysics Data System (ADS)

    Kaever, P.; Balzer, M.; Kopmann, A.; Zimmer, M.; Rongen, H.

    2017-04-01

    Various centres of the German Helmholtz Association (HGF) started in 2012 to develop a modular data acquisition (DAQ) platform, covering the entire range from detector readout to data transfer into parallel computing environments. This platform integrates generic hardware components like the multi-purpose HGF-Advanced Mezzanine Card or a smart scientific camera framework, adding user value with Linux drivers and board support packages. Technically the scope comprises the DAQ chain from FPGA modules to computing servers, notably frontend-electronics interfaces, microcontrollers and GPUs with their software, plus high-performance data transmission links. The core idea is a generic and component-based approach, enabling the implementation of specific experiment requirements with low effort. This so-called DTS platform will support standards like MTCA.4 in hardware and software to ensure compatibility with commercial components. Its capability to deploy on other crate standards or FPGA boards with PCI Express or Ethernet interfaces remains an essential feature. Competences of the participating centres are coordinated in order to provide a solid technological basis for both research topics in the Helmholtz Programme ``Matter and Technology'': ``Detector Technology and Systems'' and ``Accelerator Research and Development''. The DTS platform aims at reducing costs and development time and will ensure access to the latest technologies for the collaboration. Due to its flexible approach, it has the potential to be applied in other scientific programs.

  10. OpenChrom: a cross-platform open source software for the mass spectrometric analysis of chromatographic data.

    PubMed

    Wenig, Philip; Odermatt, Juergen

    2010-07-30

    Today, data evaluation has become a bottleneck in chromatographic science. Analytical instruments equipped with automated samplers yield large amounts of measurement data, which need to be verified and analyzed. Since nearly every GC/MS instrument vendor offers its own data format and software tools, the consequences are problems with data exchange and a lack of comparability between analytical results. To address this situation, a number of either commercial or non-profit software applications have been developed. These applications provide functionalities to import and analyze several data formats but have shortcomings in terms of the transparency of the implemented analytical algorithms and/or are restricted to a specific computer platform. This work describes a native approach to handling chromatographic data files. The approach can be extended in its functionality, with facilities to detect baselines, to detect, integrate and identify peaks, and to compare mass spectra, as well as the ability to internationalize the application. Additionally, filters can be applied to the chromatographic data to enhance its quality, for example to remove background and noise. Extended operations like do, undo and redo are supported. OpenChrom is a software application to edit and analyze mass spectrometric chromatographic data. It is extensible in many different ways, depending on the demands of the users or the analytical procedures and algorithms. It offers a customizable graphical user interface. The software is independent of the operating system, because the underlying Rich Client Platform is written in Java. OpenChrom is released under the Eclipse Public License 1.0 (EPL). There are no license constraints regarding extensions: they can be published under open source as well as proprietary licenses. OpenChrom is available free of charge at http://www.openchrom.net.

  11. Creating Math Videos: Comparing Platforms and Software

    ERIC Educational Resources Information Center

    Abbasian, Reza O.; Sieben, John T.

    2016-01-01

    In this paper we present a short tutorial on creating mini-videos using two platforms--PCs and tablets such as iPads--and software packages that work with these devices. Specifically, we describe the step-by-step process of creating and editing videos using a Wacom Intuos pen-tablet plus Camtasia software on a PC platform and using the software…

  12. Multiattribute selection of acute stroke imaging software platform for Extending the Time for Thrombolysis in Emergency Neurological Deficits (EXTEND) clinical trial.

    PubMed

    Churilov, Leonid; Liu, Daniel; Ma, Henry; Christensen, Soren; Nagakane, Yoshinari; Campbell, Bruce; Parsons, Mark W; Levi, Christopher R; Davis, Stephen M; Donnan, Geoffrey A

    2013-04-01

    The appropriateness of a software platform for rapid MRI assessment of the amount of salvageable brain tissue after stroke is critical both for the validity of the Extending the Time for Thrombolysis in Emergency Neurological Deficits (EXTEND) clinical trial of stroke thrombolysis beyond 4.5 hours and for stroke patient care outcomes. The objective of this research is to develop and implement a methodology for selecting the acute stroke imaging software platform most appropriate for the setting of a multi-centre clinical trial. A multi-disciplinary decision-making panel formulated the set of preferentially independent evaluation attributes. Alternative Multi-Attribute Value Measurement methods were used to identify the best imaging software platform, followed by sensitivity analysis to ensure the validity and robustness of the proposed solution. Four alternative imaging software platforms were identified. RApid processing of PerfusIon and Diffusion (RAPID) software was selected as the most appropriate for the needs of the EXTEND trial. A theoretically grounded generic multi-attribute selection methodology for imaging software was developed and implemented. The developed methodology assured both a high-quality decision outcome and a rational and transparent decision process. This development contributes to the stroke literature in the area of comprehensive evaluation of MRI clinical software. At the time of evaluation, RAPID software presented the most appropriate imaging software platform for use in the EXTEND clinical trial. The proposed multi-attribute imaging software evaluation methodology is based on sound theoretical foundations of multiple criteria decision analysis and can be successfully used for choosing the most appropriate imaging software while ensuring both robust decision processes and outcomes. © 2012 The Authors. International Journal of Stroke © 2012 World Stroke Organization.
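
    A simple additive multi-attribute value model of the kind used in such selections can be sketched as follows; the attribute names, weights, scores, and platform labels are purely hypothetical and are not the EXTEND panel's actual data:

      # hypothetical weights and 0-1 single-attribute value scores for four candidates
      weights = {"speed": 0.35, "accuracy": 0.35, "usability": 0.15, "integration": 0.15}
      scores = {
          "Platform A": {"speed": 0.9, "accuracy": 0.8, "usability": 0.7, "integration": 0.6},
          "Platform B": {"speed": 0.6, "accuracy": 0.9, "usability": 0.8, "integration": 0.7},
          "Platform C": {"speed": 0.7, "accuracy": 0.7, "usability": 0.9, "integration": 0.8},
          "Platform D": {"speed": 0.8, "accuracy": 0.6, "usability": 0.6, "integration": 0.9},
      }

      def overall_value(attribute_scores):
          """Additive multi-attribute value: weighted sum of single-attribute values."""
          return sum(weights[a] * v for a, v in attribute_scores.items())

      ranking = sorted(scores, key=lambda p: overall_value(scores[p]), reverse=True)
      print(ranking)   # sensitivity analysis would repeat this with perturbed weights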

  13. The multi-modal Australian ScienceS Imaging and Visualization Environment (MASSIVE) high performance computing infrastructure: applications in neuroscience and neuroinformatics research

    PubMed Central

    Goscinski, Wojtek J.; McIntosh, Paul; Felzmann, Ulrich; Maksimenko, Anton; Hall, Christopher J.; Gureyev, Timur; Thompson, Darren; Janke, Andrew; Galloway, Graham; Killeen, Neil E. B.; Raniga, Parnesh; Kaluza, Owen; Ng, Amanda; Poudel, Govinda; Barnes, David G.; Nguyen, Toan; Bonnington, Paul; Egan, Gary F.

    2014-01-01

    The Multi-modal Australian ScienceS Imaging and Visualization Environment (MASSIVE) is a national imaging and visualization facility established by Monash University, the Australian Synchrotron, the Commonwealth Scientific Industrial Research Organization (CSIRO), and the Victorian Partnership for Advanced Computing (VPAC), with funding from the National Computational Infrastructure and the Victorian Government. The MASSIVE facility provides hardware, software, and expertise to drive research in the biomedical sciences, particularly advanced brain imaging research using synchrotron x-ray and infrared imaging, functional and structural magnetic resonance imaging (MRI), x-ray computer tomography (CT), electron microscopy and optical microscopy. The development of MASSIVE has been based on best practice in system integration methodologies, frameworks, and architectures. The facility has: (i) integrated multiple different neuroimaging analysis software components, (ii) enabled cross-platform and cross-modality integration of neuroinformatics tools, and (iii) brought together neuroimaging databases and analysis workflows. MASSIVE is now operational as a nationally distributed and integrated facility for neuroinfomatics and brain imaging research. PMID:24734019

  14. Vocabulary Notebook: A Digital Solution to General and Specific Vocabulary Learning Problems in a CLIL Context

    ERIC Educational Resources Information Center

    Bazo, Plácido; Rodríguez, Romén; Fumero, Dácil

    2016-01-01

    In this paper, we will introduce an innovative software platform that can be especially useful in a Content and Language Integrated Learning (CLIL) context. This tool is called Vocabulary Notebook, and has been developed to solve all the problems that traditional (paper) vocabulary notebooks have. This tool keeps focus on the personalisation of…

  15. English as a Second Language on a Virtual Platform--Tradition and Innovation in a New Medium

    ERIC Educational Resources Information Center

    Hansson, Thomas

    2005-01-01

    A pilot study at a local school explores a virtual world during English lessons. The objective of applying a Vygotskian experimental design to the study is to investigate the potential of software, interaction and integration related to problem-solving defined as text composition in a foreign language. Focus of research and practices is on the…

  16. A computational platform for MALDI-TOF mass spectrometry data: application to serum and plasma samples.

    PubMed

    Mantini, Dante; Petrucci, Francesca; Pieragostino, Damiana; Del Boccio, Piero; Sacchetta, Paolo; Candiano, Giovanni; Ghiggeri, Gian Marco; Lugaresi, Alessandra; Federici, Giorgio; Di Ilio, Carmine; Urbani, Andrea

    2010-01-03

    Mass spectrometry (MS) is becoming the gold standard for biomarker discovery. Several MS-based bioinformatics methods have been proposed for this application, but the divergence of the findings by different research groups on the same MS data suggests that the definition of a reliable method has not been achieved yet. In this work, we propose an integrated software platform, MASCAP, intended for comparative biomarker detection from MALDI-TOF MS data. MASCAP integrates denoising and feature extraction algorithms, which have already shown to provide consistent peaks across mass spectra; furthermore, it relies on statistical analysis and graphical tools to compare the results between groups. The effectiveness in mass spectrum processing is demonstrated using MALDI-TOF data, as well as SELDI-TOF data. The usefulness in detecting potential protein biomarkers is shown comparing MALDI-TOF mass spectra collected from serum and plasma samples belonging to the same clinical population. The analysis approach implemented in MASCAP may simplify biomarker detection, by assisting the recognition of proteomic expression signatures of the disease. A MATLAB implementation of the software and the data used for its validation are available at http://www.unich.it/proteomica/bioinf. (c) 2009 Elsevier B.V. All rights reserved.
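
    To give a flavor of the denoising and feature-extraction steps mentioned above, here is a minimal, generic sketch for peak detection in a mass spectrum; the smoothing window, threshold rule, and placeholder data are assumptions and not MASCAP's actual algorithms:

      import numpy as np
      from scipy.signal import savgol_filter, find_peaks

      # placeholder spectrum: intensities on a common m/z axis
      mz = np.linspace(1000.0, 12000.0, 20000)
      intensity = np.random.rand(mz.size)

      smoothed = savgol_filter(intensity, window_length=21, polyorder=3)    # denoising
      noise_floor = np.percentile(smoothed, 10)                             # crude background level
      peaks, _ = find_peaks(smoothed, height=3 * noise_floor, distance=25)  # candidate peaks
      print(mz[peaks][:10])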

  17. Systems biology driven software design for the research enterprise

    PubMed Central

    Boyle, John; Cavnor, Christopher; Killcoyne, Sarah; Shmulevich, Ilya

    2008-01-01

    Background In systems biology, and many other areas of research, there is a need for the interoperability of tools and data sources that were not originally designed to be integrated. Due to the interdisciplinary nature of systems biology, and its association with high throughput experimental platforms, there is an additional need to continually integrate new technologies. As scientists work in isolated groups, integration with other groups is rarely a consideration when building the required software tools. Results We illustrate an approach, through the discussion of a purpose-built software architecture, which allows disparate groups to reuse tools and access data sources in a common manner. The architecture allows for: the rapid development of distributed applications; interoperability, so it can be used by a wide variety of developers and computational biologists; development using standard tools, so that it is easy to maintain and does not require a large development effort; extensibility, so that new technologies and data types can be incorporated; and non-intrusive development, insofar as researchers need not adhere to a pre-existing object model. Conclusion By using a relatively simple integration strategy, based upon a common identity system and dynamically discovered interoperable services, a light-weight software architecture can become the focal point through which scientists can both get access to and analyse the plethora of experimentally derived data. PMID:18578887

  18. Feasibility of using a knowledge-based system concept for in-flight primary flight display research

    NASA Technical Reports Server (NTRS)

    Ricks, Wendell R.

    1991-01-01

    A study was conducted to determine the feasibility of using knowledge-based systems architectures for inflight research of primary flight display information management issues. The feasibility relied on the ability to integrate knowledge-based systems with existing onboard aircraft systems. And, given the hardware and software platforms available, the feasibility also depended on the ability to use interpreted LISP software with the real time operation of the primary flight display. In addition to evaluating these feasibility issues, the study determined whether the software engineering advantages of knowledge-based systems found for this application in the earlier workstation study extended to the inflight research environment. To study these issues, two integrated knowledge-based systems were designed to control the primary flight display according to pre-existing specifications of an ongoing primary flight display information management research effort. These two systems were implemented to assess the feasibility and software engineering issues listed. Flight test results were successful in showing the feasibility of using knowledge-based systems inflight with actual aircraft data.

  19. JIP: Java image processing on the Internet

    NASA Astrophysics Data System (ADS)

    Wang, Dongyan; Lin, Bo; Zhang, Jun

    1998-12-01

    In this paper, we present JIP - Java Image Processing on the Internet, a new Internet-based application for remote education and software presentation. JIP offers an integrated learning environment on the Internet where remote users not only can share static HTML documents and lecture notes, but also can run and reuse dynamic distributed software components, without having the source code or any extra work of software compilation, installation and configuration. By implementing a platform-independent distributed computational model, local computational resources are consumed instead of the resources on a central server. As an extended Java applet, JIP allows users to select local image files on their computers or specify any image on the Internet using a URL as input. Multimedia lectures such as streaming video/audio and digital images are integrated into JIP and intelligently associated with specific image processing functions. Watching demonstrations and practicing the functions with user-selected input data dramatically encourages learning interest, while promoting the understanding of image processing theory. The JIP framework can be easily applied to other subjects in education or software presentation, such as digital signal processing, business, mathematics, physics, or other areas such as employee training and charged (pay-per-use) software consumption.

  20. Analyzing Cyber-Physical Threats on Robotic Platforms.

    PubMed

    Ahmad Yousef, Khalil M; AlMajali, Anas; Ghalyon, Salah Abu; Dweik, Waleed; Mohd, Bassam J

    2018-05-21

    Robots are increasingly involved in our daily lives. Fundamental to robots are the communication link (or stream) and the applications that connect the robots to their clients or users. Such communication links and applications are usually supported through client/server network connections. This networking system is open to attack and vulnerable to security threats. Ensuring security and privacy for robotic platforms is thus critical, as failures and attacks could have devastating consequences. In this paper, we examine several cyber-physical security threats that are unique to robotic platforms, specifically the communication link and the applications. The threats target the integrity, availability and confidentiality security requirements of robotic platforms that use the MobileEyes/arnlServer client/server applications. A robot attack tool (RAT) was developed to perform specific security attacks. An impact-oriented approach was adopted to analyze the assessment results of the attacks. Tests and experiments of attacks were conducted in a simulation environment and physically on the robot. The simulation environment was based on MobileSim, a software tool for simulating, debugging and experimenting on MobileRobots/ActivMedia platforms and their environments. The robot platform PeopleBot™ was used for physical experiments. The analysis and testing results show that certain attacks were successful at breaching the robot security. Integrity attacks modified commands and manipulated the robot behavior. Availability attacks were able to cause Denial-of-Service (DoS), so that the robot was not responsive to MobileEyes commands. Integrity and availability attacks caused sensitive information on the robot to be hijacked. To mitigate security threats, we provide possible mitigation techniques and suggestions to raise awareness of threats on robotic platforms, especially when the robots are involved in critical missions or applications.

  1. Analyzing Cyber-Physical Threats on Robotic Platforms †

    PubMed Central

    2018-01-01

    Robots are increasingly involved in our daily lives. Fundamental to robots are the communication link (or stream) and the applications that connect the robots to their clients or users. Such communication links and applications are usually supported through client/server network connections. This networking system is open to attack and vulnerable to security threats. Ensuring security and privacy for robotic platforms is thus critical, as failures and attacks could have devastating consequences. In this paper, we examine several cyber-physical security threats that are unique to robotic platforms, specifically the communication link and the applications. The threats target the integrity, availability and confidentiality security requirements of robotic platforms that use the MobileEyes/arnlServer client/server applications. A robot attack tool (RAT) was developed to perform specific security attacks. An impact-oriented approach was adopted to analyze the assessment results of the attacks. Tests and experiments of attacks were conducted in a simulation environment and physically on the robot. The simulation environment was based on MobileSim, a software tool for simulating, debugging and experimenting on MobileRobots/ActivMedia platforms and their environments. The robot platform PeopleBot™ was used for physical experiments. The analysis and testing results show that certain attacks were successful at breaching the robot security. Integrity attacks modified commands and manipulated the robot behavior. Availability attacks were able to cause Denial-of-Service (DoS), so that the robot was not responsive to MobileEyes commands. Integrity and availability attacks caused sensitive information on the robot to be hijacked. To mitigate security threats, we provide possible mitigation techniques and suggestions to raise awareness of threats on robotic platforms, especially when the robots are involved in critical missions or applications. PMID:29883403

  2. ImatraNMR: Novel software for batch integration and analysis of quantitative NMR spectra

    NASA Astrophysics Data System (ADS)

    Mäkelä, A. V.; Heikkilä, O.; Kilpeläinen, I.; Heikkinen, S.

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to traditional quantitative 1D 1H and 13C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain actual usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing single or few spectra, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present a novel software package, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to the capability of analyzing large numbers of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs, such as Matlab. The software is written in Java, and thus it should run on any platform capable of providing Java Runtime Environment version 1.6 or newer; however, it has currently only been tested with Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request.
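
    The batch-integration idea described above can be sketched generically as follows; the region names, ppm windows, and the crude rectangle-rule integration are assumptions for illustration and are not taken from ImatraNMR:

      import csv
      import numpy as np

      # hypothetical integration regions (ppm windows) applied identically to every spectrum
      REGIONS = {"signal_A": (3.60, 3.75), "signal_B": (1.20, 1.35)}

      def integrate(ppm, intensity, low, high):
          """Crude rectangle-rule integral of one ppm window."""
          mask = (ppm >= low) & (ppm <= high)
          return float(np.abs(intensity[mask]).sum())

      def batch_integrate(spectra, out_file="integrals.csv"):
          """spectra: iterable of (name, ppm array, intensity array) tuples."""
          with open(out_file, "w", newline="") as f:
              writer = csv.writer(f)
              writer.writerow(["spectrum"] + list(REGIONS))
              for name, ppm, intensity in spectra:
                  row = [integrate(ppm, intensity, lo, hi) for lo, hi in REGIONS.values()]
                  writer.writerow([name] + row)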

  3. Mercury - A New Software Package for Orbital Integrations

    NASA Astrophysics Data System (ADS)

    Chambers, J. E.; Migliorini, F.

    1997-07-01

    We present Mercury: a new general-purpose software package for carrying out orbital integrations for problems in solar-system dynamics. Suitable applications include studying the long-term stability of the planetary system, investigating the orbital evolution of comets, asteroids or meteoroids, and simulating planetary accretion. Mercury is designed to be versatile and easy to use, accepting initial conditions in either Cartesian coordinates or Keplerian elements in ``cometary'' or ``asteroidal'' format, with different epochs of osculation for different objects. Output from an integration consists of either osculating or averaged (``proper'') elements, written in a machine-independent compressed format, which allows the results of a calculation performed on one platform to be transferred (e.g. via FTP) and decoded on another. Mercury itself is platform independent, and can be run on machines using DEC Unix, Open VMS, HP Unix, Solaris, Linux or DOS. During an integration, Mercury monitors and records details of close encounters, sungrazing events, ejections and collisions between objects. The effects of non-gravitational forces on comets can also be modelled. Additional effects such as Poynting-Robertson drag, post-Newtonian corrections, oblateness of the primary, and the galactic potential will be incorporated in future. The package currently supports integrations using a mixed-variable symplectic routine, the Bulirsch-Stoer method, and a hybrid code for planetary accretion calculations; with Everhart's popular RADAU algorithm and a symmetric multistep routine to be added shortly. Our presentation will include a demonstration of the latest version of Mercury, with the explicit aim of getting feedback from potential users and incorporating these suggestions into a final version that will be made available to everybody.
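
    As a minimal illustration of symplectic orbital integration (not Mercury's mixed-variable symplectic scheme itself), the sketch below advances a single test particle around the Sun with a fixed-step kick-drift-kick integrator; the units and initial conditions are assumptions:

      import numpy as np

      GM = 4.0 * np.pi ** 2            # Sun's gravitational parameter in AU^3 / yr^2

      def accel(r):
          """Point-mass gravitational acceleration on a test particle."""
          return -GM * r / np.linalg.norm(r) ** 3

      def leapfrog(r, v, dt, n_steps):
          """Fixed-step kick-drift-kick (symplectic) integration of one orbit."""
          for _ in range(n_steps):
              v += 0.5 * dt * accel(r)   # half kick
              r += dt * v                # drift
              v += 0.5 * dt * accel(r)   # half kick
          return r, v

      # Earth-like orbit: 1 AU, circular speed 2*pi AU/yr, integrated for one year
      r0, v0 = np.array([1.0, 0.0]), np.array([0.0, 2.0 * np.pi])
      print(leapfrog(r0, v0, dt=0.001, n_steps=1000))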

  4. Unique Challenges Testing SDRs for Space

    NASA Technical Reports Server (NTRS)

    Chelmins, David; Downey, Joseph A.; Johnson, Sandra K.; Nappier, Jennifer M.

    2013-01-01

    This paper describes the approach used by the Space Communication and Navigation (SCaN) Testbed team to qualify three Software Defined Radios (SDR) for operation in space and the characterization of the platform to enable upgrades on-orbit. The three SDRs represent a significant portion of the new technologies being studied on board the SCaN Testbed, which is operating on an external truss on the International Space Station (ISS). The SCaN Testbed provides experimenters an opportunity to develop and demonstrate experimental waveforms and applications for communication, networking, and navigation concepts and advance the understanding of developing and operating SDRs in space. Qualifying a Software Defined Radio for the space environment requires additional consideration versus a hardware radio. Tests that incorporate characterization of the platform to provide information necessary for future waveforms, which might exercise extended capabilities of the hardware, are needed. The development life cycle for the radio follows the software development life cycle, where changes can be incorporated at various stages of development and test. It also enables flexibility to be added with minor additional effort. Although this provides tremendous advantages, managing the complexity inherent in a software implementation requires testing beyond the traditional hardware radio test plan. Due to schedule and resource limitations and parallel development activities, the subsystem testing of the SDRs at the vendor sites was primarily limited to typical fixed transceiver type of testing. NASA's Glenn Research Center (GRC) was responsible for the integration and testing of the SDRs into the SCaN Testbed system and conducting the investigation of the SDR to advance the technology to be accepted by missions. This paper will describe the unique tests that were conducted at both the subsystem and system level, including environmental testing, and present results. For example, test waveforms were developed to measure the gain of the transmit system across the tunable frequency band. These were used during thermal vacuum testing to enable characterization of the integrated system in the wide operational temperature range of space. Receive power indicators were used for Electromagnetic Interference (EMI) tests to understand the platform's susceptibility to external interferers independent of the waveform. Additional approaches and lessons learned during the SCaN Testbed subsystem and system level testing will be discussed that may help future SDR integrators.

  5. Unique Challenges Testing SDRs for Space

    NASA Technical Reports Server (NTRS)

    Johnson, Sandra; Chelmins, David; Downey, Joseph; Nappier, Jennifer

    2013-01-01

    This paper describes the approach used by the Space Communication and Navigation (SCaN) Testbed team to qualify three Software Defined Radios (SDR) for operation in space and the characterization of the platform to enable upgrades on-orbit. The three SDRs represent a significant portion of the new technologies being studied on board the SCaN Testbed, which is operating on an external truss on the International Space Station (ISS). The SCaN Testbed provides experimenters an opportunity to develop and demonstrate experimental waveforms and applications for communication, networking, and navigation concepts and advance the understanding of developing and operating SDRs in space. Qualifying a Software Defined Radio for the space environment requires additional consideration versus a hardware radio. Tests that incorporate characterization of the platform to provide information necessary for future waveforms, which might exercise extended capabilities of the hardware, are needed. The development life cycle for the radio follows the software development life cycle, where changes can be incorporated at various stages of development and test. It also enables flexibility to be added with minor additional effort. Although this provides tremendous advantages, managing the complexity inherent in a software implementation requires testing beyond the traditional hardware radio test plan. Due to schedule and resource limitations and parallel development activities, the subsystem testing of the SDRs at the vendor sites was primarily limited to typical fixed transceiver type of testing. NASA's Glenn Research Center (GRC) was responsible for the integration and testing of the SDRs into the SCaN Testbed system and conducting the investigation of the SDR to advance the technology to be accepted by missions. This paper will describe the unique tests that were conducted at both the subsystem and system level, including environmental testing, and present results. For example, test waveforms were developed to measure the gain of the transmit system across the tunable frequency band. These were used during thermal vacuum testing to enable characterization of the integrated system in the wide operational temperature range of space. Receive power indicators were used for Electromagnetic Interference (EMI) tests to understand the platform's susceptibility to external interferers independent of the waveform. Additional approaches and lessons learned during the SCaN Testbed subsystem and system level testing will be discussed that may help future SDR integrators.

  6. Multifunctional Web Enabled Ocean Sensor Systems for the Monitoring of a Changing Ocean

    NASA Astrophysics Data System (ADS)

    Pearlman, Jay; Castro, Ayoze; Corrandino, Luigi; del Rio, Joaquin; Delory, Eric; Garello, Rene; Heuermann, Rudinger; Martinez, Enoc; Pearlman, Francoise; Rolin, Jean-Francois; Toma, Daniel; Waldmann, Christoph; Zielinski, Oliver

    2016-04-01

    As stated in the 2010 "Ostend Declaration", a major challenge in the coming years is the development of a truly integrated and sustainably funded European Ocean Observing System for supporting major policy initiatives such as the Integrated Maritime Policy and the Marine Strategy Framework Directive. This will be achieved with more long-term measurements of key parameters supported by a new generation of sensors whose costs and reliability will enable broad and consistent observations. Within the NeXOS project, a framework including new sensor capabilities and interface software has been put together that embraces the key technical aspects needed to improve the temporal and spatial coverage, resolution and quality of marine observations. The developments include new, low-cost, compact and integrated sensors with multiple functionalities that will allow measurements useful for a number of objectives, ranging from more precise monitoring and modeling of the marine environment to an improved assessment of fisheries. The project is entering its third year and will be demonstrating initial capabilities of optical and acoustic sensor prototypes that will become available for a number of platforms. For fisheries management, there is also a series of sensors that support an Ecosystem Approach to Fisheries (EAF). The greatest capabilities for comprehensive operations will arise when these sensors can be integrated into a multi-sensor capability on a single platform or on multiple interconnected and coordinated platforms. Within NeXOS, the full processing chain, starting from the sensor signal all the way up to distributing the collected environmental information, will be encapsulated into new, standardized, state-of-the-art Smart Sensor Interface and Web components to provide both improved integration and a flexible interface for scientists to control sensor operation. The use of the OGC SWE (Sensor Web Enablement) set of standards, like OGC PUCK and SensorML, at the instrument-to-platform integration phase will provide standard mechanisms for a truly plug'n'work connection. Through this, NeXOS instruments will maintain within themselves specific information about how a platform (buoy controller, AUV controller, observatory controller) has to configure and communicate with the instrument, without the platform needing previous knowledge about the instrument. This mechanism is now being evaluated on real platforms like a Slocum Glider from Teledyne Webb Research, a SeaExplorer Glider from Alseamar, a Provor Float from NKE, and others, including non-commercial platforms like the OBSEA seafloor cabled observatory. The latest developments in the NeXOS sensors and their integration into an observation system will be discussed, addressing demonstration plans both for a variety of platforms and for scientific objectives supporting marine management.

  7. Recent enhancements to and applications of the SmartBrick structural health monitoring platform

    NASA Astrophysics Data System (ADS)

    Gunasekaran, A.; Cross, S.; Patel, N.; Sedigh, S.

    2012-04-01

    The SmartBrick network is an autonomous and wireless solution for structural health monitoring of civil infrastructure. The base station is currently in its third generation and has been laboratory- and field-tested in the United States and Italy. The second generation of the sensor nodes has been laboratory-tested as of publication. In this paper, we present recent enhancements made to the hardware and software of the SmartBrick platform. Salient improvements described include the development of a new base station with fully integrated long-range GSM (cellular) and short-range ZigBee communication. The major software improvement described in this paper is migration to the ZigBee PRO stack, which was carried out in the interest of interoperability. To broaden the application of the platform to critical environments that require survivability and fault tolerance, we have striven to achieve compliance with military standards in the areas of hardware, software, and communication. We describe these efforts and present a survey of the military standards investigated. Also described is the instrumentation of a three-span experimental bridge in Washington County, Missouri, with the SmartBrick platform. The sensors, whose output is conditioned and multiplexed, include strain gauges, thermocouples, push potentiometers, and three-axis inclinometers. Data collected are stored on site and reported over the cellular network. Real-time alerts are generated if any monitored parameter falls outside its acceptable range. Redundant sensing and communication provide reliability and facilitate corroboration of the data collected. A web interface is used to issue remote configuration commands and to facilitate access to and visualization of the data collected.
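
    The real-time alerting behavior described above can be illustrated with a simple threshold check; the parameter names and acceptable ranges below are hypothetical, not the values used on the instrumented bridge:

      # hypothetical acceptable ranges for monitored parameters
      LIMITS = {"strain_ue": (-150.0, 150.0), "tilt_deg": (-0.5, 0.5), "temp_C": (-30.0, 60.0)}

      def out_of_range(readings):
          """Return the parameters whose latest reading falls outside its acceptable range."""
          alerts = []
          for name, value in readings.items():
              low, high = LIMITS[name]
              if not low <= value <= high:
                  alerts.append((name, value))
          return alerts

      # an alert would be reported over the cellular link for each entry returned
      print(out_of_range({"strain_ue": 180.0, "tilt_deg": 0.1, "temp_C": 22.0}))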

  8. Seqcrawler: biological data indexing and browsing platform.

    PubMed

    Sallou, Olivier; Bretaudeau, Anthony; Roult, Aurelien

    2012-07-24

    Seqcrawler takes its roots in software like SRS or Lucegene. It provides an indexing platform to ease the search of data and metadata in biological banks, and it can scale to handle the current flow of data. While many biological bank search tools are available on the Internet, mainly provided by large organizations to search their data, there is a lack of free and open source solutions to browse one's own set of data with a flexible query system that can scale from a single computer to a cloud system. A personal index platform will help labs and bioinformaticians not only to search their metadata but also to build a larger information system with custom subsets of data. The software is scalable from a single computer to a cloud-based infrastructure. It has been successfully tested in a private cloud with 3 index shards (pieces of the index) hosting ~400 million sequence records (whole GenBank, UniProt, PDB and others) for a total size of 600 GB in a fault-tolerant (high-availability) architecture. It has also been successfully integrated with software to add extra metadata from BLAST results to enhance users' result analysis. Seqcrawler provides a complete open source search and storage solution for labs or platforms needing to manage large amounts of data/metadata with a flexible and customizable web interface. All components (search engine, visualization and data storage), though independent, share a common and coherent data system that can be queried with a simple HTTP interface. The solution scales easily and can also provide a high-availability infrastructure.

  9. A Roadmap for using Agile Development in a Traditional System

    NASA Technical Reports Server (NTRS)

    Streiffert, Barbara; Starbird, Thomas

    2006-01-01

    I. Ensemble Development Group: a) Produces activity planning software for spacecraft; b) Built on Eclipse Rich Client Platform (open source development and runtime software); c) Funded by multiple sources including the Mars Technology Program; d) Incorporated the use of Agile Development. II. Next Generation Uplink Planning System: a) Researches the Activity Planning and Sequencing Subsystem (APSS) for Mars Science Laboratory; b) APSS includes Ensemble, Activity Modeling, Constraint Checking, Command Editing and Sequencing tools plus other uplink generation utilities; c) Funded by the Mars Technology Program; d) Integrates all of the tools for APSS.

  10. An Open Data Platform in the framework of the EGI-LifeWatch Competence Center

    NASA Astrophysics Data System (ADS)

    Aguilar Gómez, Fernando; de Lucas, Jesús Marco; Yaiza Rodríguez Marrero, Ana

    2016-04-01

    The working pilot of an Open Data Platform supporting the full data cycle in research is presented. It aims to preserve knowledge explicitly, starting with the description of the Case Studies, and integrating data and software management and preservation on an equal basis. The uninterrupted support in the chain starts at the data acquisition level and covers up to the support for reuse and publication in an open framework, providing integrity and provenance controls. The LifeWatch Open Science Framework is a pilot web portal developed in collaboration with different commercial companies that tries to enrich and integrate different data lifecycle-related tools in order to address the management of the different steps: data planning, gathering, storing, curation, preservation, sharing, discovering, etc. To achieve this goal, the platform includes the following features: -Data Management Planning. A tool to set up a structure for the data, including what data will be generated, how it will be exploited, re-used, curated, preserved, etc. It has a semantic approach: it includes references to ontologies in order to express what data will be gathered. -Close to instrumentation. The portal includes a distributed storage system that can be used both for storing data from instruments and for output data from analysis. All these data can be shared. -Analysis. Resources from the EGI Federated Cloud are accessible within the portal, so that users can exploit computing resources to perform analysis and other processes, including workflows. -Preservation. Data can be preserved in different systems, and DOIs can be minted not only for datasets but also for software, DMPs, etc. The presentation will show the different components of the framework as well as how it can be extrapolated to other communities.

  11. Application of the GNU Radio platform in the multistatic radar

    NASA Astrophysics Data System (ADS)

    Szlachetko, Boguslaw; Lewandowski, Andrzej

    2009-06-01

    This document presents the application of a Software Defined Radio-based platform in a multistatic radar. The platform consists of a four-sensor linear antenna array, Universal Software Radio Peripheral (USRP) hardware (the radio frequency frontend) and GNU Radio PC software. The paper provides information about the architecture of the digital signal processing performed by the USRP's FPGA (digital down-converting blocks) and the PC host (implementation of multichannel digital beamforming). Preliminary results of signal recordings performed by our experimental platform are presented.
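
    A minimal sketch of narrowband delay-and-sum beamforming for a four-element uniform linear array, of the kind a PC host could apply to multichannel receiver data, is shown below; the element spacing, steering angle, and synthetic snapshots are assumptions and not the authors' implementation:

      import numpy as np

      N_SENSORS = 4
      SPACING = 0.5          # element spacing in wavelengths (assumption)

      def steering_vector(angle_deg):
          """Narrowband phase shifts for a uniform linear array steered to angle_deg."""
          n = np.arange(N_SENSORS)
          return np.exp(-2j * np.pi * SPACING * n * np.sin(np.radians(angle_deg)))

      def beamform(snapshots, angle_deg):
          """snapshots: complex array of shape (N_SENSORS, n_samples), one row per channel."""
          w = steering_vector(angle_deg) / N_SENSORS
          return w.conj() @ snapshots              # single beamformed output stream

      x = np.random.randn(N_SENSORS, 1024) + 1j * np.random.randn(N_SENSORS, 1024)
      y = beamform(x, angle_deg=20.0)
      print(y.shape)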

  12. ZBIT Bioinformatics Toolbox: A Web-Platform for Systems Biology and Expression Data Analysis

    PubMed Central

    Römer, Michael; Eichner, Johannes; Dräger, Andreas; Wrzodek, Clemens; Wrzodek, Finja; Zell, Andreas

    2016-01-01

    Bioinformatics analysis has become an integral part of research in biology. However, installation and use of scientific software can be difficult and often requires technical expert knowledge. Reasons are dependencies on certain operating systems or required third-party libraries, missing graphical user interfaces and documentation, or nonstandard input and output formats. In order to make bioinformatics software easily accessible to researchers, we here present a web-based platform. The Center for Bioinformatics Tuebingen (ZBIT) Bioinformatics Toolbox provides web-based access to a collection of bioinformatics tools developed for systems biology, protein sequence annotation, and expression data analysis. Currently, the collection encompasses software for conversion and processing of community standards SBML and BioPAX, transcription factor analysis, and analysis of microarray data from transcriptomics and proteomics studies. All tools are hosted on a customized Galaxy instance and run on a dedicated computation cluster. Users only need a web browser and an active internet connection in order to benefit from this service. The web platform is designed to facilitate the usage of the bioinformatics tools for researchers without advanced technical background. Users can combine tools for complex analyses or use predefined, customizable workflows. All results are stored persistently and are reproducible. For each tool, we provide documentation, tutorials, and example data to maximize usability. The ZBIT Bioinformatics Toolbox is freely available at https://webservices.cs.uni-tuebingen.de/. PMID:26882475

  13. Software Tools for Development on the Peregrine System | High-Performance Computing | NREL

    Science.gov Websites

    Software tools for building and managing software at the source code level on the Peregrine system. Cross-Platform Make and SCons: the "Cross-Platform Make" (CMake) package is from Kitware, and SCons is a modern software build tool based on Python.

  14. OpenDrop: An Integrated Do-It-Yourself Platform for Personal Use of Biochips

    PubMed Central

    Alistar, Mirela; Gaudenz, Urs

    2017-01-01

    Biochips, or digital labs-on-chip, are developed with the purpose of being used by laboratory technicians or biologists in laboratories or clinics. In this article, we expand this vision with the goal of enabling everyone, regardless of their expertise, to use biochips for their own personal purposes. We developed OpenDrop, an integrated electromicrofluidic platform that allows users to develop and program their own bio-applications. We address the main challenges that users may encounter: accessibility, bio-protocol design and interaction with microfluidics. OpenDrop consists of a do-it-yourself biochip, an automated software tool with visual interface and a detailed technique for at-home operations of microfluidics. We report on two years of use of OpenDrop, released as an open-source platform. Our platform attracted a highly diverse user base with participants originating from maker communities, academia and industry. Our findings show that 47% of attempts to replicate OpenDrop were successful, the main challenge remaining the assembly of the device. In terms of usability, the users managed to operate their platforms at home and are working on designing their own bio-applications. Our work provides a step towards a future in which everyone will be able to create microfluidic devices for their personal applications, thereby democratizing parts of health care. PMID:28952524

  15. The Chandra X-ray Center data system: supporting the mission of the Chandra X-ray Observatory

    NASA Astrophysics Data System (ADS)

    Evans, Janet D.; Cresitello-Dittmar, Mark; Doe, Stephen; Evans, Ian; Fabbiano, Giuseppina; Germain, Gregg; Glotfelty, Kenny; Hall, Diane; Plummer, David; Zografou, Panagoula

    2006-06-01

    The Chandra X-ray Center Data System provides end-to-end scientific software support for Chandra X-ray Observatory mission operations. The data system includes the following components: (1) observers' science proposal planning tools; (2) science mission planning tools; (3) science data processing, monitoring, and trending pipelines and tools; and (4) data archive and database management. A subset of the science data processing component is ported to multiple platforms and distributed to end-users as a portable data analysis package. Web-based user tools are also available for data archive search and retrieval. We describe the overall architecture of the data system and its component pieces, and consider the design choices and their impacts on maintainability. We discuss the many challenges involved in maintaining a large, mission-critical software system with limited resources. These challenges include managing continually changing software requirements and ensuring the integrity of the data system and resulting data products while being highly responsive to the needs of the project. We describe our use of COTS and OTS software at the subsystem and component levels, our methods for managing multiple release builds, and our approach to adapting a large code base to new hardware and software platforms. We review our experiences during the life of the mission so far, and our approaches for keeping a small, but highly talented, development team engaged during the maintenance phase of a mission.

  16. Human Motion Tracking and Glove-Based User Interfaces for Virtual Environments in ANVIL

    NASA Technical Reports Server (NTRS)

    Dumas, Joseph D., II

    2002-01-01

    The Army/NASA Virtual Innovations Laboratory (ANVIL) at Marshall Space Flight Center (MSFC) provides an environment where engineers and other personnel can investigate novel applications of computer simulation and Virtual Reality (VR) technologies. Among the many hardware and software resources in ANVIL are several high-performance Silicon Graphics computer systems and a number of commercial software packages, such as Division MockUp by Parametric Technology Corporation (PTC) and Jack by Unigraphics Solutions, Inc. These hardware and software platforms are used in conjunction with various VR peripheral I/O (input / output) devices, CAD (computer aided design) models, etc. to support the objectives of the MSFC Engineering Systems Department/Systems Engineering Support Group (ED42) by studying engineering designs, chiefly from the standpoint of human factors and ergonomics. One of the more time-consuming tasks facing ANVIL personnel involves the testing and evaluation of peripheral I/O devices and the integration of new devices with existing hardware and software platforms. Another important challenge is the development of innovative user interfaces to allow efficient, intuitive interaction between simulation users and the virtual environments they are investigating. As part of his Summer Faculty Fellowship, the author was tasked with verifying the operation of some recently acquired peripheral interface devices and developing new, easy-to-use interfaces that could be used with existing VR hardware and software to better support ANVIL projects.

  17. Clinical results of HIS, RIS, PACS integration using data integration CASE tools

    NASA Astrophysics Data System (ADS)

    Taira, Ricky K.; Chan, Hing-Ming; Breant, Claudine M.; Huang, Lu J.; Valentino, Daniel J.

    1995-05-01

    Current infrastructure research in PACS is dominated by the development of communication networks (local area networks, teleradiology, ATM networks, etc.), multimedia display workstations, and hierarchical image storage architectures. However, limited work has been performed on developing flexible, expansible, and intelligent information processing architectures for the vast decentralized image and text data repositories prevalent in healthcare environments. Patient information is often distributed among multiple data management systems. Current large-scale efforts to integrate medical information and knowledge sources have been costly with limited retrieval functionality. Software integration strategies to unify distributed data and knowledge sources are still lacking commercially. Systems heterogeneity (i.e., differences in hardware platforms, communication protocols, database management software, nomenclature, etc.) is at the heart of the problem and is unlikely to be standardized in the near future. In this paper, we demonstrate the use of newly available CASE (computer-aided software engineering) tools to rapidly integrate HIS, RIS, and PACS information systems. The advantages of these tools include fast development time (low-level code is generated from graphical specifications) and easy system maintenance (excellent documentation, easy to perform changes, and a centralized code repository in an object-oriented database). The CASE tools are used to develop and manage the 'middleware' in our client-mediator-server architecture for systems integration. Our architecture is scalable and can accommodate heterogeneous database and communication protocols.

  18. Evaluation of DICOM viewer software for workflow integration in clinical trials

    NASA Astrophysics Data System (ADS)

    Haak, Daniel; Page, Charles E.; Kabino, Klaus; Deserno, Thomas M.

    2015-03-01

    The Digital Imaging and Communications in Medicine (DICOM) protocol is nowadays the leading standard for capture, exchange, and storage of image data in medical applications. A broad range of commercial, free, and open source software tools supporting a variety of DICOM functionality exists. However, unlike in hospital-based patient care, DICOM has not yet arrived in electronic data capture systems (EDCS) for clinical trials. Due to this missing integration, even simple visualization of patients' image data in electronic case report forms (eCRFs) is impossible. Four increasing levels of integration of DICOM components into EDCS are conceivable, with each level raising both the functionality and the demands on interfaces. Hence, in this paper, a comprehensive evaluation of 27 DICOM viewer software projects is performed, investigating viewing functionality as well as interfaces for integration. Concerning general, integration, and viewing requirements, the survey involves the criteria (i) license, (ii) support, (iii) platform, (iv) interfaces, (v) two-dimensional (2D) and (vi) three-dimensional (3D) image viewing functionality. Optimal viewers are suggested for applications in clinical trials for 3D imaging, hospital communication, and workflow. Focusing on open source solutions, the viewers ImageJ and MicroView are superior for 3D visualization, whereas GingkoCADx is advantageous for hospital integration. Concerning workflow optimization in multi-centered clinical trials, we suggest the open source viewer Weasis. Covering most use cases, an EDCS and PACS interconnection with Weasis is suggested.
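
    The survey concerns complete viewer applications rather than code, but reading DICOM metadata and pixel data programmatically is often the first step when wiring image data into an EDCS. The snippet below is an illustrative sketch using the pydicom library (not one of the evaluated viewers); the file path is a placeholder.

        # Illustrative sketch: reading a DICOM file with pydicom (file path is a placeholder).
        import pydicom

        ds = pydicom.dcmread("example.dcm")        # parse the DICOM data set

        # A few standard attributes that an eCRF might display alongside the image.
        print(ds.Modality, ds.StudyDate, ds.Rows, ds.Columns)

        # Pixel data as a NumPy array (requires numpy and a suitable pixel data handler).
        pixels = ds.pixel_array
        print(pixels.shape, pixels.dtype)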

  19. KDE Bioscience: platform for bioinformatics analysis workflows.

    PubMed

    Lu, Qiang; Hao, Pei; Curcin, Vasa; He, Weizhong; Li, Yuan-Yuan; Luo, Qing-Ming; Guo, Yi-Ke; Li, Yi-Xue

    2006-08-01

    Bioinformatics is a dynamic research area in which a large number of algorithms and programs have been developed rapidly and independently without much consideration so far of the need for standardization. The lack of such common standards, combined with unfriendly interfaces, makes it difficult for biologists to learn how to use these tools and to translate the data formats from one to another. Consequently, the construction of an integrative bioinformatics platform to facilitate biologists' research is an urgent and challenging task. KDE Bioscience is a Java-based software platform that collects a variety of bioinformatics tools and provides a workflow mechanism to integrate them. Nucleotide and protein sequences from local flat files, web sites, and relational databases can be entered, annotated, and aligned. Several in-house and third-party viewers are built in to provide visualization of annotations or alignments. KDE Bioscience can also be deployed in client-server mode, where simultaneous execution of the same workflow is supported for multiple users. Moreover, workflows can be published as web pages that can be executed from a web browser. The power of KDE Bioscience comes from the integrated algorithms and data sources. With its generic workflow mechanism, other novel calculations and simulations can be integrated to augment the current sequence analysis functions. Because of this flexible and extensible architecture, KDE Bioscience makes an ideal integrated informatics environment for future bioinformatics or systems biology research.

  20. Space-Based Reconfigurable Software Defined Radio Test Bed Aboard International Space Station

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.; Lux, James P.

    2014-01-01

    The National Aeronautics and Space Administration (NASA) recently launched a new software defined radio research test bed to the International Space Station. The test bed, sponsored by the Space Communications and Navigation (SCaN) Office within NASA, is referred to as the SCaN Testbed. The SCaN Testbed is a highly capable communications system, composed of three software defined radios, integrated into a flight system, and mounted to the truss of the International Space Station. Software defined radios offer the future promise of in-flight reconfigurability, autonomy, and eventually cognitive operation. The adoption of software defined radios offers space missions a new way to develop and operate space transceivers for communications and navigation. Reconfigurable or software defined radios with communications and navigation functions implemented in software or VHDL (VHSIC Hardware Description Language) provide the capability to change the functionality of the radio during development or after launch. The ability to change the operating characteristics of a radio through software once deployed to space offers the flexibility to adapt to new science opportunities, recover from anomalies within the science payload or communication system, and potentially reduce development cost and risk by adapting generic space platforms to meet specific mission requirements. The software defined radios on the SCaN Testbed are each compliant with NASA's Space Telecommunications Radio System (STRS) Architecture. The STRS Architecture is an open, non-proprietary architecture that defines interfaces for the connections between radio components. It provides an operating environment to abstract the communication waveform application from the underlying platform-specific hardware, such as digital-to-analog converters, analog-to-digital converters, oscillators, RF attenuators, automatic gain control circuits, FPGAs, general-purpose processors, etc., and from the interconnections among different radio components.

  1. SDO FlatSat Facility

    NASA Technical Reports Server (NTRS)

    Amason, David L.

    2008-01-01

    The goal of the Solar Dynamics Observatory (SDO) is to understand and, ideally, predict the solar variations that influence life and society. Its instruments will measure the properties of the Sun and will take high-definition images of the Sun every few seconds, all day, every day. The FlatSat is a high-fidelity electrical and functional representation of the SDO spacecraft bus. It is a high-fidelity test bed for Integration & Test (I & T), flight software, and flight operations. For I & T purposes, FlatSat will be used to develop and dry-run electrical integration procedures, STOL test procedures, page displays, and the command and telemetry database. FlatSat will also serve as a platform for flight software acceptance and systems testing for the flight software system components, including the spacecraft main processors, power supply electronics, attitude control electronics, gimbal control electronics, and the S-band communications card. FlatSat will also benefit the flight operations team through post-launch development and verification of flight software code and table updates, and through verification of new and updated flight operations products. This document highlights the benefits of FlatSat; describes the building of FlatSat; provides FlatSat facility requirements, access roles, and responsibilities; and discusses FlatSat mechanical and electrical integration and functional testing.

  2. Next generation simulation tools: the Systems Biology Workbench and BioSPICE integration.

    PubMed

    Sauro, Herbert M; Hucka, Michael; Finney, Andrew; Wellock, Cameron; Bolouri, Hamid; Doyle, John; Kitano, Hiroaki

    2003-01-01

    Researchers in quantitative systems biology make use of a large number of different software packages for modelling, analysis, visualization, and general data manipulation. In this paper, we describe the Systems Biology Workbench (SBW), a software framework that allows heterogeneous application components--written in diverse programming languages and running on different platforms--to communicate and use each other's capabilities via a fast binary encoded-message system. Our goal was to create a simple, high-performance, open-source software infrastructure which is easy to implement and understand. SBW enables applications (potentially running on separate, distributed computers) to communicate via a simple network protocol. The interfaces to the system are encapsulated in client-side libraries that we provide for different programming languages. We describe in this paper the SBW architecture, a selection of current modules, including Jarnac, JDesigner, and SBWMeta-tool, and the close integration of SBW into BioSPICE, which enables both frameworks to share tools and to complement and strengthen each other's capabilities.

  3. BanTeC: a software tool for management of corneal transplantation.

    PubMed

    López-Alvarez, P; Caballero, F; Trias, J; Cortés, U; López-Navidad, A

    2005-11-01

    Until recently, all cornea information at our tissue bank was managed manually; no specific database or computer tool had been implemented to provide electronic versions of documents and medical reports. The main objective of the BanTeC project was therefore to create a computerized system to integrate and classify all the information and documents used in the center in order to facilitate management of retrieved and transplanted corneal tissues. We used the Windows platform to develop the project. Microsoft Access and Microsoft Jet Engine were used at the database level, and Data Access Objects was the chosen data access technology. In short, the BanTeC software seeks to computerize the tissue bank. All the initial stages of the development have now been completed, from specification of needs, program design, and implementation of the software components, to the total integration of the final result in the real production environment. BanTeC will allow the generation of statistical reports for analysis to improve our performance.

  4. GeoBoost: accelerating research involving the geospatial metadata of virus GenBank records.

    PubMed

    Tahsin, Tasnia; Weissenbacher, Davy; O'Connor, Karen; Magge, Arjun; Scotch, Matthew; Gonzalez-Hernandez, Graciela

    2018-05-01

    GeoBoost is a command-line software package developed to address sparse or incomplete metadata in GenBank sequence records that relate to the location of the infected host (LOIH) of viruses. Given a set of GenBank accession numbers corresponding to virus GenBank records, GeoBoost extracts, integrates and normalizes geographic information reflecting the LOIH of the viruses using integrated information from GenBank metadata and related full-text publications. In addition, to facilitate probabilistic geospatial modeling, GeoBoost assigns probability scores for each possible LOIH. Binaries and resources required for running GeoBoost are packed into a single zipped file and freely available for download at https://tinyurl.com/geoboost. A video tutorial is included to help users quickly and easily install and run the software. The software is implemented in Java 1.8, and supported on MS Windows and Linux platforms. gragon@upenn.edu. Supplementary data are available at Bioinformatics online.

  5. Accessing microfluidics through feature-based design software for 3D printing.

    PubMed

    Shankles, Peter G; Millet, Larry J; Aufrecht, Jayde A; Retterer, Scott T

    2018-01-01

    Additive manufacturing has been a cornerstone of the product development pipeline for decades, playing an essential role in the creation of both functional and cosmetic prototypes. In recent years, the prospects for distributed and open source manufacturing have grown tremendously. This growth has been enabled by an expanding library of printable materials, low-cost printers, and communities dedicated to platform development. The microfluidics community has embraced this opportunity to integrate 3D printing into the suite of manufacturing strategies used to create novel fluidic architectures. The rapid turnaround time and low cost to implement these strategies in the lab makes 3D printing an attractive alternative to conventional micro- and nanofabrication techniques. In this work, the production of multiple microfluidic architectures using a hybrid 3D printing-soft lithography approach is demonstrated and shown to enable rapid device fabrication with channel dimensions that take advantage of laminar flow characteristics. The fabrication process outlined here is underpinned by the implementation of custom design software with an integrated slicer program that replaces less intuitive computer aided design and slicer software tools. Devices are designed in the program by assembling parameterized microfluidic building blocks. The fabrication process and flow control within 3D printed devices were demonstrated with a gradient generator and two droplet generator designs. Precise control over the printing process allowed 3D microfluidics to be printed in a single step by extruding bridge structures to 'jump-over' channels in the same plane. This strategy was shown to integrate with conventional nanofabrication strategies to simplify the operation of a platform that incorporates both nanoscale features and 3D printed microfluidics.

  6. Accessing microfluidics through feature-based design software for 3D printing

    PubMed Central

    Shankles, Peter G.; Millet, Larry J.; Aufrecht, Jayde A.

    2018-01-01

    Additive manufacturing has been a cornerstone of the product development pipeline for decades, playing an essential role in the creation of both functional and cosmetic prototypes. In recent years, the prospects for distributed and open source manufacturing have grown tremendously. This growth has been enabled by an expanding library of printable materials, low-cost printers, and communities dedicated to platform development. The microfluidics community has embraced this opportunity to integrate 3D printing into the suite of manufacturing strategies used to create novel fluidic architectures. The rapid turnaround time and low cost to implement these strategies in the lab makes 3D printing an attractive alternative to conventional micro- and nanofabrication techniques. In this work, the production of multiple microfluidic architectures using a hybrid 3D printing-soft lithography approach is demonstrated and shown to enable rapid device fabrication with channel dimensions that take advantage of laminar flow characteristics. The fabrication process outlined here is underpinned by the implementation of custom design software with an integrated slicer program that replaces less intuitive computer aided design and slicer software tools. Devices are designed in the program by assembling parameterized microfluidic building blocks. The fabrication process and flow control within 3D printed devices were demonstrated with a gradient generator and two droplet generator designs. Precise control over the printing process allowed 3D microfluidics to be printed in a single step by extruding bridge structures to ‘jump-over’ channels in the same plane. This strategy was shown to integrate with conventional nanofabrication strategies to simplify the operation of a platform that incorporates both nanoscale features and 3D printed microfluidics. PMID:29596418

  7. Intraoperative optical coherence tomography: past, present, and future

    PubMed Central

    Ehlers, J P

    2016-01-01

    To provide an overview of the current state of intraoperative optical coherence tomography (OCT). Literature review of studies pertaining to intraoperative OCT, examining both the technology aspects of the imaging platform and the current evidence for patient care. Over the last several years, there have been significant advances in integrative technology for intraoperative OCT. This has resulted in the development of multiple microscope-integrated systems and a rapidly expanding field of image-guided surgical care. Multiple studies have demonstrated the potential role for intraoperative OCT in facilitating surgeon understanding of the surgical environment, tissue configuration, and overall changes to anatomy. In fact, the PIONEER and DISCOVER studies both demonstrated that intraoperative OCT alters surgical decision-making in a significant percentage of cases in both anterior and posterior segment surgery. Current areas of exploration and development include OCT-compatible instrumentation, automated tracking, intraoperative OCT software platforms, and surgeon feedback/visualization platforms. Intraoperative OCT is an emerging technology that holds promise for enhancing the surgical care of both anterior segment and posterior segment conditions. Hurdles remain for adoption and widespread utilization, including cost, optimized feedback platforms, and more definitive evidence of its value for individualized, image-guided surgical care. PMID:26681147

  8. Singularity: Scientific containers for mobility of compute.

    PubMed

    Kurtzer, Gregory M; Sochat, Vanessa; Bauer, Michael W

    2017-01-01

    Here we present Singularity, software developed to bring containers and reproducibility to scientific computing. Using Singularity containers, developers can work in reproducible environments of their choosing and design, and these complete environments can easily be copied and executed on other platforms. Singularity is an open source initiative that harnesses the expertise of system and software engineers and researchers alike, and integrates seamlessly into common workflows for both of these groups. As its primary use case, Singularity brings mobility of computing to both users and HPC centers, providing a secure means to capture and distribute software and compute environments. This ability to create and deploy reproducible environments across these centers, a previously unmet need, makes Singularity a game changing development for computational science.

  9. Singularity: Scientific containers for mobility of compute

    PubMed Central

    Kurtzer, Gregory M.; Bauer, Michael W.

    2017-01-01

    Here we present Singularity, software developed to bring containers and reproducibility to scientific computing. Using Singularity containers, developers can work in reproducible environments of their choosing and design, and these complete environments can easily be copied and executed on other platforms. Singularity is an open source initiative that harnesses the expertise of system and software engineers and researchers alike, and integrates seamlessly into common workflows for both of these groups. As its primary use case, Singularity brings mobility of computing to both users and HPC centers, providing a secure means to capture and distribute software and compute environments. This ability to create and deploy reproducible environments across these centers, a previously unmet need, makes Singularity a game changing development for computational science. PMID:28494014

  10. Recent Progress in Optical Biosensors Based on Smartphone Platforms

    PubMed Central

    Geng, Zhaoxin; Zhang, Xiong; Fan, Zhiyuan; Lv, Xiaoqing; Su, Yue; Chen, Hongda

    2017-01-01

    With the rapid improvement of smartphone hardware and software, especially complementary metal oxide semiconductor (CMOS) cameras, many optical biosensors based on smartphone platforms have been presented, which has pushed the development of point-of-care testing (POCT). Imaging-based and spectrometry-based detection techniques have been widely explored via different approaches. Combined with the smartphone, imaging-based and spectrometry-based methods are currently used to investigate a wide range of molecular properties in chemical and biological science for biosensing and diagnostics. Imaging techniques based on smartphone-based microscopes are utilized to capture microscale analytes, while spectrometry-based techniques are used to probe reactions or changes of molecules. Here, we critically review the most recent progress in imaging-based and spectrometry-based smartphone-integrated platforms that have been developed for chemical experiments and biological diagnosis. We focus on the analytical performance and the complexity of implementation of the platforms. PMID:29068375

  11. Recent Progress in Optical Biosensors Based on Smartphone Platforms.

    PubMed

    Geng, Zhaoxin; Zhang, Xiong; Fan, Zhiyuan; Lv, Xiaoqing; Su, Yue; Chen, Hongda

    2017-10-25

    With the rapid improvement of smartphone hardware and software, especially complementary metal oxide semiconductor (CMOS) cameras, many optical biosensors based on smartphone platforms have been presented, which has pushed the development of point-of-care testing (POCT). Imaging-based and spectrometry-based detection techniques have been widely explored via different approaches. Combined with the smartphone, imaging-based and spectrometry-based methods are currently used to investigate a wide range of molecular properties in chemical and biological science for biosensing and diagnostics. Imaging techniques based on smartphone-based microscopes are utilized to capture microscale analytes, while spectrometry-based techniques are used to probe reactions or changes of molecules. Here, we critically review the most recent progress in imaging-based and spectrometry-based smartphone-integrated platforms that have been developed for chemical experiments and biological diagnosis. We focus on the analytical performance and the complexity of implementation of the platforms.

  12. Current and future trends in marine image annotation software

    NASA Astrophysics Data System (ADS)

    Gomes-Pereira, Jose Nuno; Auger, Vincent; Beisiegel, Kolja; Benjamin, Robert; Bergmann, Melanie; Bowden, David; Buhl-Mortensen, Pal; De Leo, Fabio C.; Dionísio, Gisela; Durden, Jennifer M.; Edwards, Luke; Friedman, Ariell; Greinert, Jens; Jacobsen-Stout, Nancy; Lerner, Steve; Leslie, Murray; Nattkemper, Tim W.; Sameoto, Jessica A.; Schoening, Timm; Schouten, Ronald; Seager, James; Singh, Hanumant; Soubigou, Olivier; Tojeira, Inês; van den Beld, Inge; Dias, Frederico; Tempera, Fernando; Santos, Ricardo S.

    2016-12-01

    Given the need to describe, analyze, and index large quantities of marine imagery data for exploration and monitoring activities, a range of specialized image annotation tools have been developed worldwide. Image annotation - the process of transposing objects or events represented in a video or still image to the semantic level - may involve human interaction and computer-assisted solutions. Marine image annotation software (MIAS) has enabled over 500 publications to date. We review the functioning, application trends, and developments by comparing general and advanced features of 23 different tools utilized in underwater image analysis. MIAS requiring human input are basically graphical user interfaces, with a video player or image browser that recognizes a specific time code or image code, allowing users to log events in a time-stamped (and/or geo-referenced) manner. MIAS differ from similar software in their capability of integrating data associated with video collection, the simplest being the position coordinates of the video recording platform. MIAS have three main characteristics: annotating events in real time, annotating after acquisition, and interacting with a database. These range from simple annotation interfaces to full onboard data management systems with a variety of toolboxes. Advanced packages allow data from multiple sensors or multiple annotators to be input and displayed via intranet or internet. Post hoc human-mediated annotation often includes tools for data display and image analysis, e.g., length, area, image segmentation, and point count, and in a few cases the possibility of browsing and editing previous dive logs or analyzing the annotations. The interaction with a database allows the automatic integration of annotations from different surveys, repeated annotation and collaborative annotation of shared datasets, and browsing and querying of data. Progress in the field of automated annotation is mostly in post-processing, for stable platforms or still images. Integration into available MIAS is currently limited to semi-automated processes of pixel recognition through computer-vision modules that compile expert-based knowledge. Important topics aiding the choice of a specific software package are outlined, the ideal software is discussed, and future trends are presented.

  13. Rover Attitude and Pointing System Simulation Testbed

    NASA Technical Reports Server (NTRS)

    Vanelli, Charles A.; Grinblat, Jonathan F.; Sirlin, Samuel W.; Pfister, Sam

    2009-01-01

    The MER (Mars Exploration Rover) Attitude and Pointing System Simulation Testbed Environment (RAPSSTER) provides a simulation platform used for the development and test of GNC (guidance, navigation, and control) flight algorithm designs for the Mars rovers. It was specifically tailored to the MERs, but has since been used in the development of rover algorithms for the Mars Science Laboratory (MSL) as well. The software provides an integrated simulation and software testbed environment for the development of Mars rover attitude and pointing flight software. It provides an environment that is able to run the MER GNC flight software directly (as opposed to running an algorithmic model of the MER GNC flight code). This improves simulation fidelity and confidence in the results. Furthermore, the simulation environment allows the user to single-step through its execution, pausing and restarting at will. The system also provides for the introduction of simulated faults specific to Mars rover environments that cannot be replicated in other testbed platforms, to stress-test the GNC flight algorithms under examination. The software provides facilities to do these stress tests in ways that cannot be done in the real-time flight system testbeds, such as time-jumping (both forwards and backwards) and introduction of simulated actuator faults that would be difficult, expensive, and/or destructive to implement in the real-time testbeds. Actual flight-quality code can be incorporated back into the development-test suite of GNC developers, closing the loop between the GNC developers and the flight software developers. The software provides fully automated scripting, allowing multiple tests to be run with varying parameters, without human supervision.

  14. Workflow-Based Software Development Environment

    NASA Technical Reports Server (NTRS)

    Izygon, Michel E.

    2013-01-01

    The Software Developer's Assistant (SDA) helps software teams more efficiently and accurately conduct or execute software processes associated with NASA mission-critical software. SDA is a process enactment platform that guides software teams through project-specific standards, processes, and procedures. Software projects are decomposed into all of their required process steps or tasks, and each task is assigned to project personnel. SDA orchestrates the performance of work required to complete all process tasks in the correct sequence. The software then notifies team members when they may begin work on their assigned tasks and provides the tools, instructions, reference materials, and supportive artifacts that allow users to compliantly perform the work. A combination of technology components captures and enacts any software process used to support the software lifecycle. It creates an adaptive workflow environment that can be modified as needed. SDA achieves software process automation through a Business Process Management (BPM) approach to managing the software lifecycle for mission-critical projects. It contains five main parts: TieFlow (workflow engine), Business Rules (rules to alter process flow), Common Repository (storage for project artifacts, versions, history, schedules, etc.), SOA (interface to allow internal, GFE, or COTS tools integration), and the Web Portal Interface (a collaborative web environment).

  15. Mount control system of the ASTRI SST-2M prototype for the Cherenkov Telescope Array

    NASA Astrophysics Data System (ADS)

    Antolini, Elisa; Tosti, Gino; Tanci, Claudio; Bagaglia, Marco; Canestrari, Rodolfo; Cascone, Enrico; Gambini, Giorgio; Nucciarelli, Giuliano; Pareschi, Giovanni; Scuderi, Salvo; Stringhetti, Luca; Busatta, Andrea; Giacomel, Stefano; Marchiori, Gianpietro; Manfrin, Cristiana; Marcuzzi, Enrico; Di Michele, Daniele; Grigolon, Carlo; Guarise, Paolo

    2016-08-01

    The ASTRI SST-2M telescope is an end-to-end prototype proposed for the Small Size class of Telescopes (SST) of the future Cherenkov Telescope Array (CTA). The prototype is installed in Italy at the INAF observing station located at Serra La Nave on Mount Etna (Sicily) and it was inaugurated in September 2014. This paper presents the software and hardware architecture and development of the system dedicated to the control of the mount, health, safety, and monitoring systems of the ASTRI SST-2M telescope prototype. The mount control system installed on the ASTRI SST-2M telescope prototype makes use of standard and widely deployed industrial hardware and software. State-of-the-art components from the control and automation industries were selected in order to fulfill the mount-related functional and safety requirements with assembly compactness, high reliability, and reduced maintenance. The software package was implemented with the Beckhoff TwinCAT version 3 environment for the software Programmable Logic Controller (PLC), while the control electronics were chosen in order to maximize the homogeneity and the real-time performance of the system. The integration with the high-level controller (Telescope Control System) has been carried out by choosing the Open Platform Communications Unified Architecture (OPC UA) protocol, which supports a rich data model while offering compatibility with the PLC platform. In this contribution we show how the ASTRI approach for the design and implementation of the mount control system has made the ASTRI SST-2M prototype a standalone intelligent machine, able to fulfill requirements and easy to integrate into an array configuration such as the future ASTRI mini-array proposed to be installed at the southern site of the Cherenkov Telescope Array (CTA).
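
    The OPC UA protocol mentioned above gives the high-level Telescope Control System a standard way to read monitoring points exposed by the PLC. The snippet below is an illustrative sketch using the python-opcua client library; the endpoint URL and node identifier are placeholders and do not reflect the actual ASTRI address space.

        # Illustrative sketch: reading a monitoring point over OPC UA with python-opcua.
        # The endpoint URL and node id are placeholders, not the real ASTRI configuration.
        from opcua import Client

        client = Client("opc.tcp://plc.example.org:4840")
        client.connect()
        try:
            # Hypothetical node exposing, e.g., the current azimuth axis position.
            node = client.get_node("ns=2;s=Mount.Azimuth.Position")
            print("azimuth position:", node.get_value())
        finally:
            client.disconnect()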

  16. Integration of LCoS-SLM and LabVIEW based software to simulate fundamental optics, wave optics, and Fourier optics

    NASA Astrophysics Data System (ADS)

    Lyu, Bo-Han; Wang, Chen; Tsai, Chun-Wei

    2017-08-01

    Jasper Display Corp. (JDC) offers a high-reflectivity, high-resolution Liquid Crystal on Silicon Spatial Light Modulator (LCoS-SLM), which includes an associated controller ASIC and LabVIEW-based modulation software. Based on this LCoS-SLM, we provide a training platform, also called the Education Kit (EDK), which offers a series of optical theory lessons and experiments to university students. The EDK not only provides LabVIEW-based operation software to produce Computer Generated Holograms (CGHs) for generating basic diffraction or holographic images, but also provides simulation software to verify the experimental results simultaneously. We believe that a robust LCoS-SLM, operation software, simulation software, training system, and training course can help students study fundamental optics, wave optics, and Fourier optics more easily. Based on this fundamental knowledge, they can develop their own skills and create new innovations in optoelectronic applications in the future.
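
    A computer-generated hologram for an SLM is, at its simplest, a 2-D phase pattern quantized to the modulator's grey levels. The snippet below is a minimal, generic sketch (not JDC's software) that builds a blazed-grating phase pattern for beam steering; the panel resolution and grating period are illustrative assumptions.

        # Minimal, generic sketch of a computer-generated hologram (CGH): a blazed phase
        # grating quantized to 8-bit grey levels. Panel size and period are illustrative,
        # not the specifications of any particular LCoS-SLM.
        import numpy as np

        width, height = 1920, 1080          # assumed panel resolution (pixels)
        period = 16                         # grating period in pixels (sets the steering angle)

        x = np.arange(width)
        phase = (2 * np.pi * x / period) % (2 * np.pi)      # sawtooth phase ramp, 0..2*pi
        row = np.uint8(phase / (2 * np.pi) * 255)           # quantize to 8-bit grey levels

        cgh = np.tile(row, (height, 1))     # replicate the ramp over every row
        print(cgh.shape, cgh.dtype)         # (1080, 1920) uint8, ready to send to the SLM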

  17. Development of a Web-Enabled Learning Platform for Geospatial Laboratories: Improving the Undergraduate Learning Experience

    ERIC Educational Resources Information Center

    Mui, Amy B.; Nelson, Sarah; Huang, Bruce; He, Yuhong; Wilson, Kathi

    2015-01-01

    This paper describes a web-enabled learning platform providing remote access to geospatial software that extends the learning experience outside of the laboratory setting. The platform was piloted in two undergraduate courses, and includes a software server, a data server, and remote student users. The platform was designed to improve the quality…

  18. The Image Data Resource: A Bioimage Data Integration and Publication Platform.

    PubMed

    Williams, Eleanor; Moore, Josh; Li, Simon W; Rustici, Gabriella; Tarkowska, Aleksandra; Chessel, Anatole; Leo, Simone; Antal, Bálint; Ferguson, Richard K; Sarkans, Ugis; Brazma, Alvis; Salas, Rafael E Carazo; Swedlow, Jason R

    2017-08-01

    Access to primary research data is vital for the advancement of science. To extend the data types supported by community repositories, we built a prototype Image Data Resource (IDR) that collects and integrates imaging data acquired across many different imaging modalities. IDR links data from several imaging modalities, including high-content screening, super-resolution and time-lapse microscopy, digital pathology, public genetic or chemical databases, and cell and tissue phenotypes expressed using controlled ontologies. Using this integration, IDR facilitates the analysis of gene networks and reveals functional interactions that are inaccessible to individual studies. To enable re-analysis, we also established a computational resource based on Jupyter notebooks that allows remote access to the entire IDR. IDR is also an open source platform that others can use to publish their own image data. Thus IDR provides both a novel on-line resource and a software infrastructure that promotes and extends publication and re-analysis of scientific image data.

  19. Platform-dependent optimization considerations for mHealth applications

    NASA Astrophysics Data System (ADS)

    Kaghyan, Sahak; Akopian, David; Sarukhanyan, Hakob

    2015-03-01

    Modern mobile devices contain integrated sensors that enable a multitude of applications in fields such as mobile health (mHealth), entertainment, and sports. Human physical activity monitoring is one such emerging application. A range of challenges relates to activity monitoring tasks, particularly finding optimal solutions and architectures for the corresponding mobile software applications. This work addresses mobile computations related to the integrated sensors that can be used for activity monitoring, such as accelerometers, gyroscopes, the integrated global positioning system (GPS), and WLAN-based positioning. Some of these aspects are discussed in this paper. Each sensing data source has its own characteristics, such as specific data formats, data rates, and signal acquisition durations, and these specifications affect energy consumption. Energy consumption varies significantly because sensor data acquisition is followed by data analysis involving various transformations and signal processing algorithms. This paper addresses several aspects of more efficient activity monitoring implementations that exploit the state-of-the-art capabilities of modern platforms.
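
    As a concrete illustration of the kind of per-sample computation an activity monitor performs on accelerometer data (and therefore of the work whose energy cost is being optimized), the sketch below computes the gravity-removed acceleration magnitude and a crude activity flag. The sampling rate, window length, and threshold are arbitrary assumptions, not values from the paper.

        # Generic illustration of accelerometer-based activity detection.
        # Sampling rate, window size, and threshold are arbitrary assumptions.
        import numpy as np

        fs = 50                                            # assumed sampling rate (Hz)
        rng = np.random.default_rng(0)
        accel = rng.normal(0.0, 0.2, size=(fs * 10, 3))    # fake 10 s of x, y, z data (g)
        accel[:, 2] += 1.0                                 # gravity on the z axis

        magnitude = np.linalg.norm(accel, axis=1) - 1.0    # remove the static 1 g component

        # Mean absolute magnitude over 1-second windows; flag "active" above a threshold.
        windows = magnitude.reshape(-1, fs)
        activity = np.abs(windows).mean(axis=1)
        print(["active" if a > 0.15 else "idle" for a in activity])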

  20. LHCb Dockerized Build Environment

    NASA Astrophysics Data System (ADS)

    Clemencic, M.; Belin, M.; Closier, J.; Couturier, B.

    2017-10-01

    Used as lightweight virtual machines or as enhanced chroot environments, Linux containers, and in particular the Docker abstraction over them, are more and more popular in the virtualization communities. The LHCb Core Software team decided to investigate how to use Docker containers to provide stable and reliable build environments for the different supported platforms, including the obsolete ones which cannot be installed on modern hardware, to be used in integration builds, releases and by any developer. We present here the techniques and procedures set up to define and maintain the Docker images and how these images can be used to develop on modern Linux distributions for platforms otherwise not accessible.
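
    The idea of running a legacy build environment inside a container can be illustrated with the Docker SDK for Python: start a container from an image that carries the desired toolchain, run the compiler inside it, and collect the output. The image name, command, and mount path below are generic placeholders, not the LHCb images or procedures.

        # Illustrative sketch: running a build command inside a containerized toolchain
        # using the Docker SDK for Python. Image name, command, and paths are placeholders.
        import docker

        client = docker.from_env()

        # Run the compiler inside a throwaway container based on an (example) image
        # that provides the desired toolchain, mounting a source directory as /src.
        output = client.containers.run(
            image="gcc:12",
            command="gcc --version",
            volumes={"/path/to/source": {"bind": "/src", "mode": "rw"}},
            working_dir="/src",
            remove=True,
        )
        print(output.decode())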

  1. Electronics design of the airborne stabilized platform attitude acquisition module

    NASA Astrophysics Data System (ADS)

    Xu, Jiang; Wei, Guiling; Cheng, Yong; Li, Baolin; Bu, Hongyi; Wang, Hao; Zhang, Zhanwei; Li, Xingni

    2014-02-01

    We present an attitude acquisition module electronics design for an airborne stabilized platform. The design scheme, which is based on the integrated MEMS sensor ADIS16405, comprises the attitude information processing algorithms and the hardware circuit. The hardware circuit, with a small volume of only 44.9 x 43.6 x 24.6 mm3, is lightweight, modular, and digital. The PC software interface combines a plane chart with a track line to receive and display the attitude information. The attitude calculation uses a Kalman filtering algorithm to improve the measurement accuracy of the module in a dynamic environment.
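
    To make the role of the Kalman filter concrete, the sketch below shows a standard two-state (angle and gyro-bias) filter that fuses a gyroscope rate with an accelerometer-derived angle, a common textbook formulation for attitude estimation. It is a generic illustration, not the module's actual algorithm; all noise parameters are arbitrary assumptions.

        # Generic two-state Kalman filter (angle, gyro bias) fusing a gyro rate with an
        # accelerometer-derived angle measurement. Noise parameters are arbitrary; this
        # is a textbook illustration, not the flight algorithm of the module above.
        import numpy as np

        dt = 0.01                      # assumed sample period (s)
        Q = np.diag([1e-5, 1e-7])      # process noise (angle, bias)
        R = np.array([[1e-3]])         # accelerometer angle measurement noise
        F = np.array([[1.0, -dt],      # state transition: angle += (rate - bias) * dt
                      [0.0, 1.0]])
        B = np.array([[dt], [0.0]])    # control input: measured gyro rate
        H = np.array([[1.0, 0.0]])     # we observe the angle only

        x = np.zeros((2, 1))           # state: [angle, gyro bias]
        P = np.eye(2)                  # state covariance

        def kalman_step(gyro_rate, accel_angle):
            """One predict/update cycle; returns the filtered angle estimate (rad)."""
            global x, P
            # Predict
            x = F @ x + B * gyro_rate
            P = F @ P @ F.T + Q
            # Update with the accelerometer-derived angle
            y = np.array([[accel_angle]]) - H @ x          # innovation
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)                 # Kalman gain
            x = x + K @ y
            P = (np.eye(2) - K @ H) @ P
            return float(x[0, 0])

        # Example: a constant 0.1 rad/s rotation observed for one second.
        for k in range(100):
            angle = kalman_step(gyro_rate=0.1, accel_angle=0.1 * (k + 1) * dt)
        print("estimated angle (rad):", angle)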

  2. Adaptive Tunable Laser Spectrometer for Space Applications

    NASA Technical Reports Server (NTRS)

    Flesch, Gregory; Keymeulen, Didier

    2010-01-01

    An architecture and process for the rapid prototyping and subsequent development of an adaptive tunable laser absorption spectrometer (TLS) are described. Our digital hardware/firmware/software platform is both reconfigurable at design time and autonomously adaptive in real time for post-integration and post-launch situations. The design expands the range of viable target environments and enhances tunable laser spectrometer performance in extreme and even unpredictable environments. Through rapid prototyping with a commercial RTOS/FPGA platform, we have implemented a fully operational tunable laser spectrometer (using a highly sensitive second harmonic technique). With this prototype, we have demonstrated autonomous real-time adaptivity in the lab with simulated extreme environments.

  3. Integrated web-based viewing and secure remote access to a clinical data repository and diverse clinical systems.

    PubMed

    Duncan, R G; Saperia, D; Dulbandzhyan, R; Shabot, M M; Polaschek, J X; Jones, D T

    2001-01-01

    The advent of the World-Wide-Web protocols and client-server technology has made it easy to build low-cost, user-friendly, platform-independent graphical user interfaces to health information systems and to integrate the presentation of data from multiple systems. The authors describe a Web interface for a clinical data repository (CDR) that was moved from concept to production status in less than six months using a rapid prototyping approach, multi-disciplinary development team, and off-the-shelf hardware and software. The system has since been expanded to provide an integrated display of clinical data from nearly 20 disparate information systems.

  4. The Unidata Integrated Data Viewer

    NASA Astrophysics Data System (ADS)

    Weber, W. J.; Ho, Y.

    2016-12-01

    The Unidata Integrated Data Viewer (IDV) is a free and open-source virtual-globe software application that enables three-dimensional viewing of earth science data. The Unidata IDV is data agnostic and can display and analyze disparate data in a single view. This capability facilitates cross-discipline research and allows multiple observation platforms to be displayed simultaneously for any given event. The Unidata IDV is a mature application, written in Java, and has been serving the earth science community for over 15 years. This demonstration will focus on near-real-time global satellite observations, the integration of the COSMIC radio occultation data set that profiles the atmosphere, and high-resolution numerical weather prediction.

  5. Waggle: A Framework for Intelligent Attentive Sensing and Actuation

    NASA Astrophysics Data System (ADS)

    Sankaran, R.; Jacob, R. L.; Beckman, P. H.; Catlett, C. E.; Keahey, K.

    2014-12-01

    Advances in sensor-driven computation and computationally steered sensing will greatly enable future research in fields including environmental and atmospheric sciences. We will present "Waggle," an open-source hardware and software infrastructure developed with two goals: (1) reducing the separation and latency between sensing and computing and (2) improving the reliability and longevity of sensing-actuation platforms in challenging and costly deployments. Inspired by "deep-space probe" systems, the Waggle platform design includes features that can support longitudinal studies, deployments with varying communication links, and remote management capabilities. Waggle lowers the barrier for scientists to incorporate real-time data from their sensors into their computations and to manipulate the sensors or provide feedback through actuators. A standardized software and hardware design allows quick addition of new sensors/actuators and associated software in the nodes and enables them to be coupled with computational codes both in situ and on external compute infrastructure. The Waggle framework currently drives the deployment of two observational systems - a portable and self-sufficient weather platform for study of small-scale effects in Chicago's urban core and an open-ended distributed instrument in Chicago that aims to support several research pursuits across a broad range of disciplines including urban planning, microbiology, and computer science. Built around open-source software, hardware, and the Linux OS, the Waggle system comprises two components - the Waggle field-node and the Waggle cloud-computing infrastructure. The Waggle field-node affords a modular, scalable, fault-tolerant, secure, and extensible platform for hosting sensors and actuators in the field. It supports in situ computation and data storage, and integration with cloud-computing infrastructure. The Waggle cloud infrastructure is designed with the goal of scaling to several hundreds of thousands of Waggle nodes. It supports aggregating data from sensors hosted by the nodes, staging computation, relaying feedback to the nodes, and serving data to end-users. We will discuss the Waggle design principles and their applicability to various observational research pursuits, and demonstrate its capabilities.

  6. Revel8or: Model Driven Capacity Planning Tool Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Liming; Liu, Yan; Bui, Ngoc B.

    2007-05-31

    Designing complex multi-tier applications that must meet strict performance requirements is a challenging software engineering problem. Ideally, the application architect could derive accurate performance predictions early in the project life-cycle, leveraging initial application design-level models and a description of the target software and hardware platforms. To this end, we have developed a capacity planning tool suite for component-based applications, called Revel8tor. The tool adheres to the model driven development paradigm and supports benchmarking and performance prediction for J2EE, .Net and Web services platforms. The suite is composed of three different tools: MDAPerf, MDABench and DSLBench. MDAPerf allows annotation of design diagrams and derives performance analysis models. MDABench allows a customized benchmark application to be modeled in the UML 2.0 Testing Profile and automatically generates a deployable application, with measurement automatically conducted. DSLBench allows the same benchmark modeling and generation to be conducted using a simple performance engineering Domain Specific Language (DSL) in Microsoft Visual Studio. DSLBench integrates with Visual Studio and reuses its load testing infrastructure. Together, the tool suite can assist capacity planning across platforms in an automated fashion.

  7. Electron tunneling in proteins program.

    PubMed

    Hagras, Muhammad A; Stuchebrukhov, Alexei A

    2016-06-05

    We developed a unique integrated software package (called Electron Tunneling in Proteins Program or ETP) which provides an environment with different capabilities such as tunneling current calculation, semi-empirical quantum mechanical calculation, and molecular modeling simulation for calculation and analysis of electron transfer reactions in proteins. ETP program is developed as a cross-platform client-server program in which all the different calculations are conducted at the server side while only the client terminal displays the resulting calculation outputs in the different supported representations. ETP program is integrated with a set of well-known computational software packages including Gaussian, BALLVIEW, Dowser, pKip, and APBS. In addition, ETP program supports various visualization methods for the tunneling calculation results that assist in a more comprehensive understanding of the tunneling process. © 2016 Wiley Periodicals, Inc.

  8. Mousetrap: An integrated, open-source mouse-tracking package.

    PubMed

    Kieslich, Pascal J; Henninger, Felix

    2017-10-01

    Mouse-tracking - the analysis of mouse movements in computerized experiments - is becoming increasingly popular in the cognitive sciences. Mouse movements are taken as an indicator of commitment to or conflict between choice options during the decision process. Using mouse-tracking, researchers have gained insight into the temporal development of cognitive processes across a growing number of psychological domains. In the current article, we present software that offers easy and convenient means of recording and analyzing mouse movements in computerized laboratory experiments. In particular, we introduce and demonstrate the mousetrap plugin that adds mouse-tracking to OpenSesame, a popular general-purpose graphical experiment builder. By integrating with this existing experimental software, mousetrap allows for the creation of mouse-tracking studies through a graphical interface, without requiring programming skills. Thus, researchers can benefit from the core features of a validated software package and the many extensions available for it (e.g., the integration with auxiliary hardware such as eye-tracking, or the support of interactive experiments). In addition, the recorded data can be imported directly into the statistical programming language R using the mousetrap package, which greatly facilitates analysis. Mousetrap is cross-platform, open-source and available free of charge from https://github.com/pascalkieslich/mousetrap-os .

  9. Open Marketplace for Simulation Software on the Basis of a Web Platform

    NASA Astrophysics Data System (ADS)

    Kryukov, A. P.; Demichev, A. P.

    2016-02-01

    The focus in the development of a new generation of middleware is shifting from global grid systems to building convenient and efficient web platforms for remote access to individual computing resources. A further line of their development, suggested in this work, relates not only to the quantitative increase in their number and the expansion of the scientific, engineering, and manufacturing areas in which they are used, but also to improved technology for remote deployment of application software on the resources interacting with the web platforms. Currently, services for providers of application software in the context of science-oriented web platforms are not sufficiently developed. The application software marketplace web platforms proposed in this work should have all the features of existing web platforms for submitting jobs to remote resources, plus specific web services for market-based interaction between providers and consumers of application packages. The suggested approach will be validated using the example of simulation applications in the field of nonlinear optics.

  10. The Development of GIS Educational Resources Sharing among Central Taiwan Universities

    NASA Astrophysics Data System (ADS)

    Chou, T.-Y.; Yeh, M.-L.; Lai, Y.-C.

    2011-09-01

    Using GIS in the classroom enhances students' computer skills and broadens their range of knowledge. The paper highlights GIS integration on an e-learning platform and introduces a variety of educational resources. This research project demonstrates tools for an e-learning environment and delivers case studies of learning interaction from Central Taiwan universities. Feng Chia University (FCU) obtained an academic project subsidized by the Ministry of Education and developed an e-learning platform for excellence in teaching/learning programs among Central Taiwan's universities. The aim of the project is to integrate the educational resources of 13 universities in central Taiwan, with FCU serving as the hub university. To overcome the problem of distance, e-platforms have been established to create collaboration-enhanced learning experiences. The e-platforms coordinate web service access among the educational community and deliver GIS educational resources. Most GIS-related courses cover the development of GIS, principles of cartography, spatial data analysis and overlay, terrain analysis, buffer analysis, 3D GIS applications, remote sensing, GPS technology, WebGIS, mobile GIS, and ArcGIS manipulation. In each GIS case study, students are taught to understand the geographic meaning, collect spatial data, and then use ArcGIS software to analyze the spatial data. One of the e-learning platforms provides lesson plans and presentation slides, so students can learn ArcGIS online. As they analyze spatial data, they can connect to the GIS hub to obtain the data they need, including satellite images, aerial photos, and vector data. Moreover, the e-learning platforms provide solutions and resources, and different levels of image scale have been integrated into the systems. Multi-scale spatial development and analyses in Central Taiwan integrate academic research resources among CTTLRC partners, thus establishing a decision-making support mechanism in teaching and learning and accelerating communication, cooperation, and sharing among academic units.

  11. AlgoRun: a Docker-based packaging system for platform-agnostic implemented algorithms.

    PubMed

    Hosny, Abdelrahman; Vera-Licona, Paola; Laubenbacher, Reinhard; Favre, Thibauld

    2016-08-01

    There is a growing need in bioinformatics for easy-to-use software implementations of algorithms that are usable across platforms. At the same time, reproducibility of computational results is critical and often a challenge due to source code changes over time and dependencies. The approach introduced in this paper addresses both of these needs with AlgoRun, a dedicated packaging system for implemented algorithms, using Docker technology. Implemented algorithms, packaged with AlgoRun, can be executed through a user-friendly interface directly from a web browser or via a standardized RESTful web API to allow easy integration into more complex workflows. The packaged algorithm includes the entire software execution environment, thereby eliminating the common problem of software dependencies and the irreproducibility of computations over time. AlgoRun-packaged algorithms can be published on http://algorun.org, a centralized searchable directory to find existing AlgoRun-packaged algorithms. AlgoRun is available at http://algorun.org and the source code under GPL license is available at https://github.com/algorun laubenbacher@uchc.edu Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
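
    Since every AlgoRun-packaged algorithm exposes a RESTful web API, it can be invoked from a script as well as from the browser. The exact routes are not given in the abstract, so the snippet below uses a hypothetical endpoint and payload purely to illustrate the integration pattern.

        # Illustrative pattern for calling an AlgoRun-packaged algorithm over HTTP.
        # The host, port, endpoint path, and payload fields are hypothetical placeholders,
        # not the documented AlgoRun API.
        import requests

        response = requests.post(
            "http://localhost:8765/v1/run",    # hypothetical endpoint of a local container
            json={"input": "ACGTACGT"},        # hypothetical input payload
            timeout=60,
        )
        response.raise_for_status()
        print(response.json())                 # algorithm output; format depends on the package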

  12. The Global Climate Dashboard: a Software Interface to Stream Comprehensive Climate Data

    NASA Astrophysics Data System (ADS)

    Gardiner, N.; Phillips, M.; NOAA Climate Portal Dashboard

    2011-12-01

    The Global Climate Dashboard is an integral component of NOAA's web portal to climate data, services, and value-added content for decision-makers, teachers, and the science-attentive public (www.climate.gov). The dashboard provides a rapid view of observational data that demonstrate climate change and variability, as well as outputs from the Coupled Model Intercomparison Project phase 3 (CMIP3), which was built to support the Intergovernmental Panel on Climate Change fourth assessment. The data shown in the dashboard therefore span a range of climate science disciplines with applications that serve audiences with diverse needs. The dashboard is designed with reusable software components that allow it to be implemented incrementally on a wide range of platforms including desktops, tablet devices, and mobile phones. The underlying software components support live streaming of data and provide a way of encapsulating graph styles and other presentation details into a device-independent standard format that results in a common visual look and feel across all platforms. Here we describe the pedagogical objectives, technical implementation, and deployment of the dashboard through climate.gov and partner web sites, as well as plans to develop a mobile application using the same framework.

  13. ImatraNMR: novel software for batch integration and analysis of quantitative NMR spectra.

    PubMed

    Mäkelä, A V; Heikkilä, O; Kilpeläinen, I; Heikkinen, S

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to traditional quantitative 1D (1)H and (13)C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing single or few spectra, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present a novel software package, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to the capability of analyzing large numbers of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs such as Matlab. The software is written in Java, and thus it should run on any platform capable of providing a Java Runtime Environment version 1.6 or newer; however, it has currently only been tested on Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request. Copyright © 2011 Elsevier Inc. All rights reserved.
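
    To illustrate the kind of downstream analysis that the CSV output enables, the sketch below loads a hypothetical table of integrals into pandas and normalizes each signal against an internal-standard column. The file name and column names are invented for illustration and do not reflect ImatraNMR's actual output layout.

      import pandas as pd

      # Hypothetical layout: one row per spectrum, one column per integrated signal,
      # plus an "internal_standard" column used for normalization.
      integrals = pd.read_csv("batch_integrals.csv", index_col="spectrum")

      # Relative quantification: divide every signal by the internal standard.
      relative = integrals.div(integrals["internal_standard"], axis=0)

      print(relative.describe())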

  14. Flightspeed Integral Image Analysis Toolkit

    NASA Technical Reports Server (NTRS)

    Thompson, David R.

    2009-01-01

    The Flightspeed Integral Image Analysis Toolkit (FIIAT) is a C library that provides image analysis functions in a single, portable package. It provides basic low-level filtering, texture analysis, and subwindow descriptors for applications dealing with image interpretation and object recognition. Designed with spaceflight in mind, it addresses ease of integration (minimal external dependencies); fast, real-time operation using integer arithmetic where possible (useful for platforms lacking a dedicated floating-point processor); implementation entirely in C (easily modified); mostly static memory allocation; and 8-bit image data. The basic goal of the FIIAT library is to compute meaningful numerical descriptors for images or rectangular image regions. These n-vectors can then be used directly for novelty detection or pattern recognition, or as a feature space for higher-level pattern recognition tasks. The library provides routines for leveraging training data to derive descriptors that are most useful for a specific data set. Its runtime algorithms exploit a structure known as the "integral image." This is a caching method that permits fast summation of values within rectangular regions of an image. The integral image facilitates a wide range of fast image-processing functions. The toolkit is applicable to a wide range of autonomous image analysis tasks in the spaceflight domain, including novelty detection, object and scene classification, target detection for autonomous instrument placement, and science analysis of geomorphology. It makes real-time texture and pattern recognition possible for platforms with severe computational constraints. The software provides an order-of-magnitude speed increase over alternative software libraries currently in use by the research community. Commercially, FIIAT can support intelligent video cameras used in surveillance. It is also useful for object recognition by robots or other autonomous vehicles.
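
    The "integral image" idea described above is easy to illustrate. The sketch below, written in Python/NumPy for brevity rather than the toolkit's C, builds a summed-area table and uses four lookups to sum any rectangular region in constant time; it is a generic illustration of the technique, not code from FIIAT.

      import numpy as np

      def integral_image(img: np.ndarray) -> np.ndarray:
          """Summed-area table, zero-padded on the first row/column for easy indexing."""
          ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=np.int64)
          ii[1:, 1:] = np.cumsum(np.cumsum(img, axis=0), axis=1)
          return ii

      def rect_sum(ii: np.ndarray, top: int, left: int, bottom: int, right: int) -> int:
          """Sum of img[top:bottom, left:right] via four integral-image lookups."""
          return int(ii[bottom, right] - ii[top, right] - ii[bottom, left] + ii[top, left])

      if __name__ == "__main__":
          img = np.arange(16, dtype=np.int64).reshape(4, 4)
          ii = integral_image(img)
          assert rect_sum(ii, 1, 1, 3, 3) == img[1:3, 1:3].sum()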

  15. VOLCWORKS: A suite for optimization of hazards mapping

    NASA Astrophysics Data System (ADS)

    Delgado Granados, H.; Ramírez Guzmán, R.; Villareal Benítez, J. L.; García Sánchez, T.

    2012-04-01

    Making hazards maps is a process linking basic science, applied science and engineering for the benefit of society. The methodologies for constructing hazards maps have evolved enormously, together with the tools that allow forecasting of the behavior of the materials produced by different eruptive processes. However, in spite of the development of tools and the evolution of methodologies, the purpose of hazards maps has not changed: the prevention and mitigation of volcanic disasters. Integrating different tools that simulate different processes for a single volcano is a challenge that calls for software combining processing, simulation and visualization techniques with suitable data structures, in order to build a suite that supports the construction process from the integration of geological data through simulations to simplification of the output for designing a hazards/scenario map. Scientific visualization is a powerful tool to explore and gain insight into complex data from instruments and simulations. The workflow from data collection, quality control and preparation for simulations to a visually appropriate presentation is usually disconnected, in most cases relying on a different application for each step, because the required tools were either not built for this specific problem or were developed by research groups to solve particular, isolated tasks. In volcanology, due to its complexity, groups typically examine only one aspect of the phenomenon: ash dispersal, laharic flows, pyroclastic flows, lava flows, or ballistic projectile ejection, among others. However, when studying the hazards associated with the activity of a volcano, it is important to analyze all the processes comprehensively, especially when communicating results to the end users: decision makers and planners. To solve this problem and connect the different parts of the workflow we are developing the suite VOLCWORKS, whose guiding principle is a flexible implementation architecture allowing rapid development of software to the extent specified by the needs, including calculations, routines or algorithms that are either new or redesigned from software already available in the volcanological community, and especially allowing new knowledge, models or software to be incorporated as software modules. The design is a component-oriented platform that allows particular solutions (routines, simulations, etc.) to be incorporated and concatenated to integrate or highlight information. The platform includes a graphical interface capable of working in different visual environments that can be tailored to the particular work of different types of users (researchers, lecturers, students, etc.). The platform aims to integrate the simulation and visualization phases, incorporating proven tools that are currently isolated. VOLCWORKS runs under different operating systems (Windows, Linux and Mac OS) and adapts to the context of use automatically at runtime, both in tasks and their sequence and in the utilization of hardware resources (CPU, GPU, special monitors, etc.). The application can run on a laptop or in a virtual reality room with access to supercomputers.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Apte, A; Veeraraghavan, H; Oh, J

    Purpose: To present an open-source, free platform to facilitate radiomics research: the "Radiomics toolbox" in CERR. Method: There is a scarcity of open-source tools that support end-to-end modeling of image features to predict patient outcomes. The "Radiomics toolbox" strives to fill the need for such a software platform. The platform supports (1) import of various image modalities such as CT, PET, MR, SPECT and US; (2) contouring tools to delineate structures of interest; (3) extraction and storage of image-based features such as first-order statistics, gray-level co-occurrence and zone-size matrix based texture features, and shape features; and (4) statistical analysis. Statistical analysis of the extracted features is supported with basic functionality that includes univariate correlations and Kaplan-Meier curves, and advanced functionality that includes feature reduction and multivariate modeling. The graphical user interface and the data management are implemented in Matlab for ease of development and readability of the code for a wide audience. Open-source software developed in other programming languages is integrated to enhance various components of this toolbox, for example the Java-based DCM4CHE for DICOM import and R for statistical analysis. Results: The Radiomics toolbox will be distributed as open-source, GNU-licensed software. The toolbox was prototyped for modeling an oropharyngeal PET dataset at MSKCC; that analysis will be presented in a separate paper. Conclusion: The Radiomics Toolbox provides an extensible platform for extracting and modeling image features. To emphasize new uses of CERR for radiomics and image-based research, we have changed the name from the "Computational Environment for Radiotherapy Research" to the "Computational Environment for Radiological Research".

  17. A Cloud-based, Open-Source, Command-and-Control Software Paradigm for Space Situational Awareness (SSA)

    NASA Astrophysics Data System (ADS)

    Melton, R.; Thomas, J.

    With the rapid growth in the number of space actors, there has been a marked increase in the complexity and diversity of software systems utilized to support SSA target tracking, indication, warning, and collision avoidance. Historically, most SSA software has been constructed with "closed" proprietary code, which limits interoperability, inhibits the code transparency that some SSA customers need to develop domain expertise, and prevents the rapid injection of innovative concepts into these systems. Open-source aerospace software, a rapidly emerging, alternative trend in code development, is based on open collaboration, which has the potential to bring greater transparency, interoperability, flexibility, and reduced development costs. Open-source software is easily adaptable, geared to rapidly changing mission needs, and can generally be delivered at lower costs to meet mission requirements. This paper outlines Ball's COSMOS C2 system, a fully open-source, web-enabled, command-and-control software architecture which provides several unique capabilities to move the current legacy SSA software paradigm to an open-source model that effectively enables pre- and post-launch asset command and control. Among the unique characteristics of COSMOS is the ease with which it can integrate with diverse hardware. This characteristic enables COSMOS to serve as the command-and-control platform for the full life-cycle development of SSA assets, from board test, to box test, to system integration and test, to on-orbit operations. The use of a modern scripting language, Ruby, also permits automated procedures to provide highly complex decision making for the tasking of SSA assets based on both telemetry data and data received from outside sources. Detailed logging enables quick anomaly detection and resolution. Integrated real-time and offline data graphing renders the visualization of both ground and on-orbit assets simple and straightforward.

  18. AstroCloud, a Cyber-Infrastructure for Astronomy Research: Data Access and Interoperability

    NASA Astrophysics Data System (ADS)

    Fan, D.; He, B.; Xiao, J.; Li, S.; Li, C.; Cui, C.; Yu, C.; Hong, Z.; Yin, S.; Wang, C.; Cao, Z.; Fan, Y.; Mi, L.; Wan, W.; Wang, J.

    2015-09-01

    The data access and interoperability module connects observation proposals, data, virtual machines and software. According to the unique identifier of the PI (principal investigator), an email address or an internal ID, data can be collected by the PI's proposals or through the search interfaces, e.g. cone search. Files associated with the search results can be easily transferred to cloud storage, including the storage attached to virtual machines, or to commercial platforms like Dropbox. Benefiting from the standards of the IVOA (International Virtual Observatory Alliance), VOTable-formatted search results can be sent to various kinds of VO software. Later efforts will integrate more data and connect archives and other astronomical resources.
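
    As a concrete illustration of the cone-search access mentioned above, the sketch below queries an IVOA Simple Cone Search service using the standard RA/DEC/SR parameters and parses the VOTable response with astropy. The service URL is a hypothetical placeholder rather than an actual AstroCloud endpoint.

      import io
      import requests
      from astropy.io.votable import parse_single_table

      # Hypothetical Simple Cone Search endpoint; substitute a real service URL.
      SCS_URL = "https://example.org/scs/search"

      def cone_search(ra_deg: float, dec_deg: float, radius_deg: float):
          """Run an IVOA Simple Cone Search and return the result as an astropy Table."""
          params = {"RA": ra_deg, "DEC": dec_deg, "SR": radius_deg}
          response = requests.get(SCS_URL, params=params, timeout=60)
          response.raise_for_status()
          return parse_single_table(io.BytesIO(response.content)).to_table()

      if __name__ == "__main__":
          table = cone_search(ra_deg=180.0, dec_deg=30.0, radius_deg=0.1)
          print(table.colnames, len(table))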

  19. An intelligent CNC machine control system architecture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, D.J.; Loucks, C.S.

    1996-10-01

    Intelligent, agile manufacturing relies on automated programming of digitally controlled processes. Currently, processes such as Computer Numerically Controlled (CNC) machining are difficult to automate because of highly restrictive controllers and poor software environments. It is also difficult to utilize sensors and process models for adaptive control, or to integrate machining processes with other tasks within a factory floor setting. As part of a Laboratory Directed Research and Development (LDRD) program, a CNC machine control system architecture based on object-oriented design and graphical programming has been developed to address some of these problems and to demonstrate automated agile machining applications using platform-independent software.

  20. Six pitfalls in firewall deployment

    NASA Astrophysics Data System (ADS)

    Wilner, Bruce

    1996-03-01

    This note describes six key pitfalls in the deployment of popular commercial firewalls. The term `deployment' is intended to include the architecture of the firewall software itself, the integration of the firewall with the operating system platform, and the interconnection of the complete hardware/software combination within its target environment. After reviewing the evolution of Internet firewalls against the backdrop of classical trusted systems development, specific flaws and oversights in the familiar commercial deployments are analyzed in some detail. While significantly costlier solutions are available that address some of these problems, the analysis is applicable to the overwhelming majority of firewalls in use at both commercial and Government installations.

  1. KSC-2012-2891

    NASA Image and Video Library

    2011-07-20

    LOUISVILLE, Colo. – During NASA's Commercial Crew Development Round 2 (CCDev2) activities for the Commercial Crew Program (CCP), Sierra Nevada Corp. (SNC) built a Simulator and Avionics Laboratory to help engineers evaluate the Dream Chaser's characteristics during the piloted phases of flight. Located at Sierra Nevada’s Space Systems facility in Louisville, Colo., it consists of a physical cockpit and integrated simulation hardware and software. The simulator is linked to the Vehicle Avionics Integration Laboratory, or VAIL, which serves as a platform for Dream Chaser avionics development, engineering testing and integration. VAIL will also be used for verification and validation of avionics and software. Sierra Nevada is one of seven companies NASA entered into Space Act Agreements (SAAs) with during CCDev2 to aid in the innovation and development of American-led commercial capabilities for crew transportation and rescue services to and from the International Space Station and other low Earth orbit destinations. For information about CCP, visit www.nasa.gov/commercialcrew. Photo credit: Sierra Nevada Corp.

  2. A Platform-Independent Plugin for Navigating Online Radiology Cases.

    PubMed

    Balkman, Jason D; Awan, Omer A

    2016-06-01

    Software methods that enable navigation of radiology cases on various digital platforms differ between handheld devices and desktop computers. This has resulted in poor compatibility of online radiology teaching files across mobile smartphones, tablets, and desktop computers. A standardized, platform-independent, or "agnostic" approach for presenting online radiology content was produced in this work by leveraging modern hypertext markup language (HTML) and JavaScript web software technology. We describe the design and evaluation of this software, demonstrate its use across multiple viewing platforms, and make it publicly available as a model for future development efforts.

  3. Building a virtual ligand screening pipeline using free software: a survey.

    PubMed

    Glaab, Enrico

    2016-03-01

    Virtual screening, the search for bioactive compounds via computational methods, provides a wide range of opportunities to speed up drug development and reduce the associated risks and costs. While virtual screening is already a standard practice in pharmaceutical companies, its applications in preclinical academic research still remain under-exploited, in spite of an increasing availability of dedicated free databases and software tools. In this survey, an overview of recent developments in this field is presented, focusing on free software and data repositories for screening as alternatives to their commercial counterparts, and outlining how available resources can be interlinked into a comprehensive virtual screening pipeline using typical academic computing facilities. Finally, to facilitate the set-up of corresponding pipelines, a downloadable software system is provided, using platform virtualization to integrate pre-installed screening tools and scripts for reproducible application across different operating systems. © The Author 2015. Published by Oxford University Press.
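
    To make the idea of chaining free tools into such a pipeline concrete, the sketch below shows one typical early stage, a Lipinski rule-of-five pre-filter implemented with the open-source RDKit toolkit. This is a generic illustration of a screening pipeline step, not code from the surveyed system, and the input SMILES file name is a placeholder.

      from rdkit import Chem
      from rdkit.Chem import Descriptors, Lipinski

      def passes_rule_of_five(mol) -> bool:
          """Lipinski rule-of-five filter commonly used to pre-screen ligand libraries."""
          return (
              Descriptors.MolWt(mol) <= 500
              and Descriptors.MolLogP(mol) <= 5
              and Lipinski.NumHDonors(mol) <= 5
              and Lipinski.NumHAcceptors(mol) <= 10
          )

      if __name__ == "__main__":
          # "ligands.smi" is a placeholder SMILES file with one compound per line.
          supplier = Chem.SmilesMolSupplier("ligands.smi", titleLine=False)
          kept = [m for m in supplier if m is not None and passes_rule_of_five(m)]
          print(f"{len(kept)} compounds pass the rule-of-five pre-filter")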

  4. Building a virtual ligand screening pipeline using free software: a survey

    PubMed Central

    2016-01-01

    Virtual screening, the search for bioactive compounds via computational methods, provides a wide range of opportunities to speed up drug development and reduce the associated risks and costs. While virtual screening is already a standard practice in pharmaceutical companies, its applications in preclinical academic research still remain under-exploited, in spite of an increasing availability of dedicated free databases and software tools. In this survey, an overview of recent developments in this field is presented, focusing on free software and data repositories for screening as alternatives to their commercial counterparts, and outlining how available resources can be interlinked into a comprehensive virtual screening pipeline using typical academic computing facilities. Finally, to facilitate the set-up of corresponding pipelines, a downloadable software system is provided, using platform virtualization to integrate pre-installed screening tools and scripts for reproducible application across different operating systems. PMID:26094053

  5. Applications of software-defined radio (SDR) technology in hospital environments.

    PubMed

    Chávez-Santiago, Raúl; Mateska, Aleksandra; Chomu, Konstantin; Gavrilovska, Liljana; Balasingham, Ilangko

    2013-01-01

    A software-defined radio (SDR) is a radio communication system where the major part of its functionality is implemented by means of software in a personal computer or embedded system. Such a design paradigm has the major advantage of producing devices that can receive and transmit widely different radio protocols based solely on the software used. This flexibility opens several application opportunities in hospital environments, where a large number of wired and wireless electronic devices must coexist in confined areas like operating rooms and intensive care units. This paper outlines some possible applications in the 2360-2500 MHz frequency band. These applications include the integration of wireless medical devices in a common communication platform for seamless interoperability, and cognitive radio (CR) for body area networks (BANs) and wireless sensor networks (WSNs) for medical environmental surveillance. The description of a proof-of-concept CR prototype is also presented.

  6. Coupling Sensing Hardware with Data Interrogation Software for Structural Health Monitoring

    DOE PAGES

    Farrar, Charles R.; Allen, David W.; Park, Gyuhae; ...

    2006-01-01

    The process of implementing a damage detection strategy for aerospace, civil and mechanical engineering infrastructure is referred to as structural health monitoring (SHM). The authors' approach is to address the SHM problem in the context of a statistical pattern recognition paradigm. In this paradigm, the process can be broken down into four parts: (1) Operational Evaluation, (2) Data Acquisition and Cleansing, (3) Feature Extraction and Data Compression, and (4) Statistical Model Development for Feature Discrimination. These processes must be implemented through hardware or software and, in general, some combination of these two approaches will be used. This paper will discuss each portion of the SHM process with particular emphasis on the coupling of a general purpose data interrogation software package for structural health monitoring with a modular wireless sensing and processing platform. More specifically, this paper will address the need to take an integrated hardware/software approach to developing SHM solutions.
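
    As a generic illustration of steps (3) and (4) of this paradigm, and not of the authors' software, the sketch below extracts simple autoregressive-coefficient features from vibration signals and scores test signals by their Mahalanobis distance from the baseline feature distribution; unusually large distances would be flagged as potential damage.

      import numpy as np

      def ar_features(signal: np.ndarray, order: int = 4) -> np.ndarray:
          """Least-squares autoregressive coefficients used as damage-sensitive features."""
          rows = [signal[i:i + order][::-1] for i in range(len(signal) - order)]
          X, y = np.array(rows), signal[order:]
          coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
          return coeffs

      def mahalanobis_scores(baseline: np.ndarray, test: np.ndarray) -> np.ndarray:
          """Distance of each test feature vector from the baseline distribution."""
          mean = baseline.mean(axis=0)
          cov_inv = np.linalg.inv(np.cov(baseline, rowvar=False))
          diffs = test - mean
          return np.sqrt(np.einsum("ij,jk,ik->i", diffs, cov_inv, diffs))

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          baseline = np.array([ar_features(rng.standard_normal(1024)) for _ in range(50)])
          test = np.array([ar_features(rng.standard_normal(1024)) for _ in range(5)])
          print(mahalanobis_scores(baseline, test))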

  7. Pulseq-Graphical Programming Interface: Open source visual environment for prototyping pulse sequences and integrated magnetic resonance imaging algorithm development.

    PubMed

    Ravi, Keerthi Sravan; Potdar, Sneha; Poojar, Pavan; Reddy, Ashok Kumar; Kroboth, Stefan; Nielsen, Jon-Fredrik; Zaitsev, Maxim; Venkatesan, Ramesh; Geethanath, Sairam

    2018-03-11

    To provide a single open-source platform for comprehensive MR algorithm development inclusive of simulations, pulse sequence design and deployment, reconstruction, and image analysis. We integrated the "Pulseq" platform for vendor-independent pulse programming with Graphical Programming Interface (GPI), a scientific development environment based on Python. Our integrated platform, Pulseq-GPI, permits sequences to be defined visually and exported to the Pulseq file format for execution on an MR scanner. For comparison, Pulseq files using either MATLAB only ("MATLAB-Pulseq") or Python only ("Python-Pulseq") were generated. We demonstrated three fundamental sequences on a 1.5 T scanner. Execution times of the three variants of implementation were compared on two operating systems. In vitro phantom images indicate equivalence with the vendor-supplied implementations and MATLAB-Pulseq. The examples demonstrated in this work illustrate the unifying capability of Pulseq-GPI. The execution times of all three implementations were fast (a few seconds). The software is capable of user-interface-based development and/or command-line programming. The tool demonstrated here, Pulseq-GPI, integrates the open-source simulation, reconstruction and analysis capabilities of GPI Lab with the pulse sequence design and deployment features of Pulseq. Current and future work includes providing an ISMRMRD interface and incorporating Specific Absorption Rate and Peripheral Nerve Stimulation computations. Copyright © 2018 Elsevier Inc. All rights reserved.

  8. Applications integration in a hybrid cloud computing environment: modelling and platform

    NASA Astrophysics Data System (ADS)

    Li, Qing; Wang, Ze-yuan; Li, Wei-hua; Li, Jun; Wang, Cheng; Du, Rui-yang

    2013-08-01

    With the development of application service providers and cloud computing, more and more small- and medium-sized business enterprises use software services and even infrastructure services provided by professional information service companies to replace all or part of their information systems (ISs). These information service companies provide applications, such as data storage, computing processes, document sharing and even management information system services, as public resources to support the business process management of their customers. However, no cloud computing service vendor can satisfy the full functional IS requirements of an enterprise. As a result, enterprises often have to simultaneously use systems distributed in different clouds and their intra-enterprise ISs. Thus, this article presents a framework to integrate applications deployed in public clouds and in intra-enterprise ISs. A run-time platform is developed and a cross-computing environment process modelling technique is also developed to improve the feasibility of ISs under hybrid cloud computing environments.

  9. "Mobile Nurse" platform for ubiquitous medicine.

    PubMed

    Struzik, Z R; Yoshiuchi, K; Sone, M; Ishikawa, T; Kikuchi, H; Kumano, H; Watsuji, T; Natelson, B H; Yamamoto, Y

    2007-01-01

    We introduce "Mobile Nurse" (MN) - an emerging platform for the practice of ubiquitous medicine. By implementing in a dynamic setting of daily life the patient care traditionally provided by the clinical nurses on duty, MN aims at integral data collection and shortening the response time to the patient. MN is also capable of intelligent interaction with the patient and is able to learn from the patient's behavior and disease sign evaluation for improved personalized treatment. In this paper, we outline the most essential concepts around the hardware, software and methodological designs of MN. We provide an example of the implementation, and elaborate on the possible future impact on medical practice and biomedical science research. The main innovation of MN, setting it apart from current tele-medicine systems, is the ability to integrate the patient's signs and symptoms on site, providing medical professionals with powerful tools to elucidate disease mechanisms, to make proper diagnoses and to prescribe treatment.

  10. Efficient visualization of high-throughput targeted proteomics experiments: TAPIR.

    PubMed

    Röst, Hannes L; Rosenberger, George; Aebersold, Ruedi; Malmström, Lars

    2015-07-15

    Targeted mass spectrometry comprises a set of powerful methods to obtain accurate and consistent protein quantification in complex samples. To fully exploit these techniques, a cross-platform and open-source software stack based on standardized data exchange formats is required. We present TAPIR, a fast and efficient Python visualization software for chromatograms and peaks identified in targeted proteomics experiments. The input formats are open, community-driven standardized data formats (mzML for raw data storage and TraML encoding the hierarchical relationships between transitions, peptides and proteins). TAPIR is scalable to proteome-wide targeted proteomics studies (as enabled by SWATH-MS), allowing researchers to visualize high-throughput datasets. The framework integrates well with existing automated analysis pipelines and can be extended beyond targeted proteomics to other types of analyses. TAPIR is available for all computing platforms under the 3-clause BSD license at https://github.com/msproteomicstools/msproteomicstools. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
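
    For readers who want to inspect the raw-data side of such a workflow, the sketch below iterates over the spectra in an mzML file with the open-source pyteomics library and prints basic per-spectrum information. pyteomics is not part of TAPIR and is used here only for illustration; the file name is a placeholder and the dictionary keys follow pyteomics' mzML reader conventions.

      from pyteomics import mzml

      # "run.mzML" is a placeholder path to an mzML raw-data file.
      with mzml.read("run.mzML") as reader:
          for spectrum in reader:
              mz = spectrum["m/z array"]
              intensity = spectrum["intensity array"]
              print(spectrum["id"], len(mz), float(intensity.max()))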

  11. Web-GIS platform for monitoring and forecasting of regional climate and ecological changes

    NASA Astrophysics Data System (ADS)

    Gordov, E. P.; Krupchatnikov, V. N.; Lykosov, V. N.; Okladnikov, I.; Titov, A. G.; Shulgina, T. M.

    2012-12-01

    The growing volume of environmental data from sensors and model outputs makes the development of a software infrastructure, based on modern information and telecommunication technologies, for the information support of integrated scientific research in the Earth sciences an urgent and important task (Gordov et al., 2012; van der Wel, 2005). The inherent heterogeneity of datasets obtained from different sources and institutions not only hampers the interchange of data and analysis results but also complicates their intercomparison, decreasing the reliability of analysis results. However, modern geophysical data processing techniques allow different technological solutions to be combined for organizing such information resources. It is now generally accepted that an information-computational infrastructure should rely on the combined use of web and GIS technologies for creating applied information-computational web systems (Titov et al., 2009; Gordov et al., 2010; Gordov, Okladnikov and Titov, 2011). Using these approaches to develop internet-accessible thematic information-computational systems, and arranging data and knowledge interchange between them, is a very promising way to create a distributed information-computational environment supporting multidisciplinary regional and global research in the Earth sciences, including analysis of climate changes and their impact on the spatial-temporal distribution and state of vegetation. We present an experimental software and hardware platform supporting the operation of a web-oriented production and research center for regional climate change investigations, which combines a modern Web 2.0 approach, GIS functionality and the capabilities of running climate and meteorological models, processing large geophysical datasets, visualization, joint software development by distributed research groups, scientific analysis, and the education of students and post-graduate students. The platform software developed (Shulgina et al., 2012; Okladnikov et al., 2012) includes dedicated modules for numerical processing of regional and global modeling results for subsequent analysis and visualization. Data preprocessing, runs of the WRF and «Planet Simulator» models, and visualization of their results are also integrated into the platform. All functions of the center are accessible through a web portal using a common graphical web browser, in the form of an interactive graphical user interface that provides, in particular, visualization of processing results, selection of a geographical region of interest (pan and zoom), and data-layer manipulation (order, enable/disable, feature extraction). The platform provides users with capabilities for heterogeneous geophysical data analysis, including high-resolution data, and for discovering tendencies in climatic and ecosystem changes within different multidisciplinary research projects (Shulgina et al., 2011). Using it, even an unskilled user without specific knowledge can perform computational processing and visualization of large meteorological, climatological and satellite monitoring datasets through a unified graphical web interface.

  12. ToxPi Graphical User Interface 2.0: Dynamic exploration, visualization, and sharing of integrated data models.

    PubMed

    Marvel, Skylar W; To, Kimberly; Grimm, Fabian A; Wright, Fred A; Rusyn, Ivan; Reif, David M

    2018-03-05

    Drawing integrated conclusions from diverse source data requires synthesis across multiple types of information. The ToxPi (Toxicological Prioritization Index) is an analytical framework that was developed to enable integration of multiple sources of evidence by transforming data into integrated, visual profiles. Methodological improvements have advanced ToxPi and expanded its applicability, necessitating a new, consolidated software platform to provide functionality, while preserving flexibility for future updates. We detail the implementation of a new graphical user interface for ToxPi that provides interactive visualization, analysis, reporting, and portability. The interface is deployed as a stand-alone, platform-independent Java application, with a modular design to accommodate inclusion of future analytics. The new ToxPi interface introduces several features, from flexible data import formats (including legacy formats that permit backward compatibility) to similarity-based clustering to options for high-resolution graphical output. We present the new ToxPi interface for dynamic exploration, visualization, and sharing of integrated data models. The ToxPi interface is freely available as a single compressed download that includes the main Java executable, all libraries, example data files, and a complete user manual from http://toxpi.org.

  13. An application framework for computer-aided patient positioning in radiation therapy.

    PubMed

    Liebler, T; Hub, M; Sanner, C; Schlegel, W

    2003-09-01

    The importance of exact patient positioning in radiation therapy increases with the ongoing improvements in irradiation planning and treatment. Therefore, new ways to overcome precision limitations of current positioning methods in fractionated treatment have to be found. The Department of Medical Physics at the German Cancer Research Centre (DKFZ) follows different video-based approaches to increase repositioning precision. In this context, the modular software framework FIVE (Fast Integrated Video-based Environment) has been designed and implemented. It is both hardware- and platform-independent and supports merging position data by integrating various computer-aided patient positioning methods. A highly precise optical tracking system and several subtraction imaging techniques have been realized as modules to supply basic video-based repositioning techniques. This paper describes the common framework architecture, the main software modules and their interfaces. An object-oriented software engineering process has been applied using the UML, C++ and the Qt library. The significance of the current framework prototype for the application in patient positioning as well as the extension to further application areas will be discussed. Particularly in experimental research, where special system adjustments are often necessary, the open design of the software allows problem-oriented extensions and adaptations.

  14. OASYS (OrAnge SYnchrotron Suite): an open-source graphical environment for x-ray virtual experiments

    NASA Astrophysics Data System (ADS)

    Rebuffi, Luca; Sanchez del Rio, Manuel

    2017-08-01

    The evolution of hardware platforms, the modernization of software tools, the access of a large number of young people to the codes and the popularization of open-source software for scientific applications drove us to design OASYS (OrAnge SYnchrotron Suite), a completely new graphical environment for modelling X-ray experiments. The implemented software architecture yields not only an intuitive and very easy-to-use graphical interface, but also high flexibility and rapid interactive simulation, allowing configuration changes to be made quickly in order to compare multiple beamline configurations. Its purpose is to integrate in a synergetic way the most powerful calculation engines available. OASYS integrates different simulation strategies via the implementation of adequate simulation tools for X-ray optics (e.g. ray tracing and wave optics packages). It provides a language that makes them communicate by sending and receiving encapsulated data. Python has been chosen as the main programming language because of its universality and popularity in scientific computing. The software Orange, developed at the University of Ljubljana (SLO), is the high-level workflow engine that provides the interaction with the user and the communication mechanisms.

  15. An efficient approach to the deployment of complex open source information systems

    PubMed Central

    Cong, Truong Van Chi; Groeneveld, Eildert

    2011-01-01

    Complex open source information systems are usually implemented as component-based software to inherit the available functionality of existing software packages developed by third parties. Consequently, the deployment of these systems not only requires the installation of operating system, application framework and the configuration of services but also needs to resolve the dependencies among components. The problem becomes more challenging when the application must be installed and used on different platforms such as Linux and Windows. To address this, an efficient approach using the virtualization technology is suggested and discussed in this paper. The approach has been applied in our project to deploy a web-based integrated information system in molecular genetics labs. It is a low-cost solution to benefit both software developers and end-users. PMID:22102770

  16. An Open Service Provider Concept for Enterprise Complex Automation

    NASA Astrophysics Data System (ADS)

    Ivaschenko, A. V.; Sitnikov, P. V.; Tanonykhina, M. O.

    2017-01-01

    The paper introduces a solution for IT services representation and management in the integrated information space of distributed enterprises. It is proposed to develop an Open Service Provider as a software platform for interaction between IT services providers and their users. Implementation of the proposed concept and approach is illustrated by an after-sales customer support system for a large manufacturing corporation delivered by SEC “Open Code”.

  17. Multilingual Speech and Language Processing

    DTIC Science & Technology

    2003-04-01

    client software handles the user end of the transaction. Historically, four clients were provided: e-mail, web, FrameMaker, and command line. By ... command-line client and an API. The API allows integration of CyberTrans into a number of processes including word processing packages (FrameMaker ... preservation and logging, and others. The available clients remain e-mail, Web and FrameMaker. Platforms include both Unix and PC for clients, with

  18. Specification and preliminary design of the CARTA system for satellite cartography

    NASA Technical Reports Server (NTRS)

    Machadoesilva, A. J. F. (Principal Investigator); Neto, G. C.; Serra, P. R. M.; Souza, R. C. M.; Mitsuo, Fernando Augusta, II

    1984-01-01

    Digital imagery acquired by satellites has inherent geometric distortion due to sensor characteristics and to platform variations. At INPE, a software system for the geometric correction of LANDSAT MSS imagery is under development. Such corrected imagery will be useful for map generation. Important examples are the generation of LANDSAT image-charts for the Amazon region and the possibility of integrating digital satellite imagery into a Geographic Information System.

  19. Logic-centered architecture for ubiquitous health monitoring.

    PubMed

    Lewandowski, Jacek; Arochena, Hisbel E; Naguib, Raouf N G; Chao, Kuo-Ming; Garcia-Perez, Alexeis

    2014-09-01

    One of the key points to maintain and boost research and development in the area of smart wearable systems (SWS) is the development of integrated architectures for intelligent services, as well as wearable systems and devices for health and wellness management. This paper presents such a generic architecture for multiparametric, intelligent and ubiquitous wireless sensing platforms. It is a transparent, smartphone-based sensing framework with customizable wireless interfaces and plug'n'play capability to easily interconnect third-party sensor devices. It caters to wireless body, personal, and near-me area networks. A pivotal part of the platform is the integrated inference engine/runtime environment that allows the mobile device to serve as a user-adaptable personal health assistant. The novelty of this system lies in a rapid visual development and remote deployment model. The complementary visual Inference Engine Editor that comes with the package enables artificial intelligence specialists, alongside medical experts, to build data processing models by assembling different components and instantly deploying them (remotely) on patient mobile devices. In this paper, the new logic-centered software architecture for ubiquitous health monitoring applications is described, followed by a discussion as to how it helps to shift focus from software and hardware development, to medical and health process-centered design of new SWS applications.

  20. A Re-programmable Platform for Dynamic Burn-in Test of Xilinx Virtex-II 3000 FPGA for Military and Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Roosta, Ramin; Wang, Xinchen; Sadigursky, Michael; Tracton, Phil

    2004-01-01

    Field Programmable Gate Arrays (FPGA) have played increasingly important roles in military and aerospace applications. Xilinx SRAM-based FPGAs have been extensively used in commercial applications. They have been used less frequently in space flight applications due to their susceptibility to single-event upsets. Reliability of these devices in space applications is a concern that has not been addressed. The objective of this project is to design a fully programmable hardware/software platform that allows (but is not limited to) comprehensive static/dynamic burn-in testing of Virtex-II 3000 FPGAs, at-speed testing and SEU testing. Conventional methods test very few discrete AC parameters (primarily switching) of a given integrated circuit. This approach will test any possible configuration of the FPGA and any associated performance parameters. It allows complete or partial re-programming of the FPGA and verification of the program by using readback followed by dynamic testing. Designers have full control over which functional elements of the FPGA to stress. They can completely simulate all possible types of configurations/functions. Another benefit of this platform is that it allows collecting information on elevation of the junction temperature as a function of gate utilization, operating frequency and functionality. A software tool has been implemented to demonstrate the various features of the system. The software consists of three major parts: the parallel interface driver, the main system procedure and a graphical user interface (GUI).

  1. Potential of a suite of robot/computer-assisted motivating systems for personalized, home-based, stroke rehabilitation.

    PubMed

    Johnson, Michelle J; Feng, Xin; Johnson, Laura M; Winters, Jack M

    2007-03-01

    There is a need to improve semi-autonomous stroke therapy in home environments often characterized by low supervision of clinical experts and low extrinsic motivation. Our distributed device approach to this problem consists of an integrated suite of low-cost robotic/computer-assistive technologies driven by a novel universal access software framework called UniTherapy. Our design strategy for personalizing the therapy, providing extrinsic motivation and outcome assessment is presented and evaluated. Three studies were conducted to evaluate the potential of the suite. A conventional force-reflecting joystick, a modified joystick therapy platform (TheraJoy), and a steering wheel platform (TheraDrive) were tested separately with the UniTherapy software. Stroke subjects with hemiparesis and able-bodied subjects completed tracking activities with the devices in different positions. We quantify motor performance across subject groups and across device platforms, and muscle activation across devices at two positions in the arm workspace. Trends in the assessment metrics were consistent across devices, with able-bodied and high-functioning stroke subjects being significantly more accurate and quicker in their motor performance than low-functioning subjects. Muscle activation patterns were different for shoulder and elbow across different devices and locations. The Robot/CAMR suite has potential for stroke rehabilitation. By manipulating hardware and software variables, we can create personalized therapy environments that engage patients, address their therapy needs, and track their progress. A larger longitudinal study is still needed to evaluate these systems in under-supervised environments such as the home.

  2. Dr.LiTHO: a development and research lithography simulator

    NASA Astrophysics Data System (ADS)

    Fühner, Tim; Schnattinger, Thomas; Ardelean, Gheorghe; Erdmann, Andreas

    2007-03-01

    This paper introduces Dr.LiTHO, a research and development oriented lithography simulation environment developed at Fraunhofer IISB to flexibly integrate our simulation models into one coherent platform. We propose a light-weight approach to a lithography simulation environment: the use of a scripting (batch) language as an integration platform. Out of the great variety of different scripting languages, Python proved superior in many ways: it has a gentle learning curve, it is efficient, it is available on virtually any platform, and it provides sophisticated integration mechanisms for existing programs. In this paper, we will describe the steps required to provide Python bindings for existing programs and to finally generate an integrated simulation environment. In addition, we will give a short introduction into selected software design demands associated with the development of such a framework. We will especially focus on testing and (both technical and user-oriented) documentation issues. Dr.LiTHO Python files contain not only all simulation parameter settings but also the simulation flow, providing maximum flexibility. In addition to relatively simple batch jobs, repetitive tasks can be pooled in libraries. And as Python is a full-blown programming language, users can add virtually any functionality, which is especially useful in the scope of simulation studies or optimization tasks that often require large numbers of evaluations. Furthermore, we will give a short overview of the numerous existing Python packages. Several examples demonstrate the feasibility and productiveness of integrating Python packages into custom Dr.LiTHO scripts.
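
    To give a flavor of the script-driven simulation flow described above, the sketch below expresses a small dose/focus parameter sweep directly in Python. The run_lithography_simulation helper is an invented stand-in for a simulator call and does not correspond to Dr.LiTHO's actual API.

      import itertools

      def run_lithography_simulation(dose: float, focus: float) -> float:
          """Invented placeholder for a simulator call; returns a dummy figure of merit."""
          return abs(dose - 30.0) + 10.0 * abs(focus)

      # The batch flow lives in the script itself: sweep dose and focus,
      # collect the results, and report the best setting.
      doses = [28.0, 30.0, 32.0]
      focuses = [-0.05, 0.0, 0.05]
      results = {(d, f): run_lithography_simulation(d, f)
                 for d, f in itertools.product(doses, focuses)}
      best = min(results, key=results.get)
      print("best (dose, focus):", best, "score:", results[best])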

  3. Visualizer: 3D Gridded Data Visualization Software for Geoscience Education and Research

    NASA Astrophysics Data System (ADS)

    Harwood, C.; Billen, M. I.; Kreylos, O.; Jadamec, M.; Sumner, D. Y.; Kellogg, L. H.; Hamann, B.

    2008-12-01

    In both research and education learning is an interactive and iterative process of exploring and analyzing data or model results. However, visualization software often presents challenges on the path to learning because it assumes the user already knows the locations and types of features of interest, instead of enabling flexible and intuitive examination of results. We present examples of research and teaching using the software, Visualizer, specifically designed to create an effective and intuitive environment for interactive, scientific analysis of 3D gridded data. Visualizer runs in a range of 3D virtual reality environments (e.g., GeoWall, ImmersaDesk, or CAVE), but also provides a similar level of real-time interactivity on a desktop computer. When using Visualizer in a 3D-enabled environment, the software allows the user to interact with the data images as real objects, grabbing, rotating or walking around the data to gain insight and perspective. On the desktop, simple features, such as a set of cross-bars marking the plane of the screen, provide extra 3D spatial cues that allow the user to more quickly understand geometric relationships within the data. This platform portability allows the user to more easily integrate research results into classroom demonstrations and exercises, while the interactivity provides an engaging environment for self-directed and inquiry-based learning by students. Visualizer software is freely available for download (www.keckcaves.org) and runs on Mac OSX and Linux platforms.

  4. PIPER: Performance Insight for Programmers and Exascale Runtimes: Guiding the Development of the Exascale Software Stack

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mellor-Crummey, John

    The PIPER project set out to develop methodologies and software for measurement, analysis, attribution, and presentation of performance data for extreme-scale systems. Goals of the project were to support analysis of massive multi-scale parallelism, heterogeneous architectures, multi-faceted performance concerns, and to support both post-mortem performance analysis to identify program features that contribute to problematic performance and on-line performance analysis to drive adaptation. This final report summarizes the research and development activity at Rice University as part of the PIPER project. Producing a complete suite of performance tools for exascale platforms during the course of this project was impossible since both hardware and software for exascale systems are still moving targets. For that reason, the project focused broadly on the development of new techniques for measurement and analysis of performance on modern parallel architectures, enhancements to HPCToolkit’s software infrastructure to support our research goals or use on sophisticated applications, engaging developers of multithreaded runtimes to explore how support for tools should be integrated into their designs, engaging operating system developers with feature requests for enhanced monitoring support, engaging vendors with requests that they add hardware measurement capabilities and software interfaces needed by tools as they design new components of HPC platforms including processors, accelerators and networks, and finally collaborations with partners interested in using HPCToolkit to analyze and tune scalable parallel applications.

  5. The CoreWall Project: An Update for 2007

    NASA Astrophysics Data System (ADS)

    Yu-Chung Chen, J.; Higgins, S.; Hur, H.; Ito, E.; Jenkins, C. J.; Johnson, A.; Leigh, J.; Morin, P.; Lee, J.

    2007-12-01

    The CoreWall Suite is an NSF-supported collaborative development of real-time core description (Corelyzer), stratigraphic correlation (Correlator), and data visualization (CoreNavigator) software to be used by the marine, terrestrial and Antarctic science communities. The overall goal of the CoreWall software development is to bring portable cross-platform tools to the broader drilling and coring communities to expand and enhance data visualization and enhance collaborative integration of multiple datasets. The CoreWall Project is now in its second year and significant progress has been made on all three software components. Corelyzer has undergone two field deployments and testing by the ANDRILL program in 2006 (and again in Fall 2007) and by ICDP's SAFOD project (summer 2007). In addition, the CoreWall group and ICDP are working together so that the core description (DIS) system can expose DIS core data directly into Corelyzer seamlessly and be available to future ICDP and IODP-Mission Specific Platform expeditions. Educators have also taken note of the software's ease of use and strong visualization capabilities to begin exploring curriculum projects with Corelyzer software. To ensure that the software development is integrated with other community IT activities, such as the development of the U.S. IODP-Phase 2 Scientific Ocean Drilling Vessel (SODV), a Steering Committee was constituted. It is composed of key U.S. IODP and related database (e.g., CHRONOS, SedDB) developers and users as well as representatives of other core-based enterprises (e.g., ANDRILL, ICDP, LacCore). Corelyzer (CoreWall's main visual core description tool) software displays digital core images from one or more cores along with discrete data streams (e.g., physical properties, downhole logs) and nested images (e.g., thin sections, fossils) to provide a robust approach to the description of sediment cores. Corelyzer's digital image handling allows the cores to be viewed from micron to km scale determined by the image resolution along a sliding plane, effectively making it a "digital microscope". Detailed features such as lithologic variation, macroscopic grain size variation, bioturbation intensity, chemical composition and micropaleontology are easier to interpret and annotate. Significant new capabilities have been added to allow for importing multiple images and data types, sharing/exporting Corelyzer "work sessions" for multiple users, enhanced annotations, as well as support for other activities like examining clasts, and sample requests. The new Correlator software, the updated version of the Splicer/Sagan software used by ODP for over 10 years, has been ported into a single new analysis tool that will work across multiple platforms and interact seamlessly with JANUS (ODP's relational database), CHRONOS, PetDB, SedDB, dbSEABED and other databases. This functionality will result in a CoreWall Suite module that can be used and distributed anywhere for stratigraphic and age correlation tasks. CoreNavigator, a spatial data discovery tool, has taken on a virtual Globe interface that allows users to enter Corelyzer from a geographic-visual standpoint.

  6. Terra Harvest software architecture

    NASA Astrophysics Data System (ADS)

    Humeniuk, Dave; Klawon, Kevin

    2012-06-01

    Under the Terra Harvest Program, the DIA has the objective of developing a universal Controller for the Unattended Ground Sensor (UGS) community. The mission is to define, implement, and thoroughly document an open architecture that universally supports UGS missions, integrating disparate systems, peripherals, etc. The Controller's inherent interoperability with numerous systems enables the integration of both legacy and future UGS System (UGSS) components, while the design's open architecture supports rapid third-party development to ensure operational readiness. The successful accomplishment of these objectives by the program's Phase 3b contractors is demonstrated via integration of the companies' respective plug-'n'-play contributions that include controllers, various peripherals, such as sensors, cameras, etc., and their associated software drivers. In order to independently validate the Terra Harvest architecture, L-3 Nova Engineering, along with its partner, the University of Dayton Research Institute, is developing the Terra Harvest Open Source Environment (THOSE), a Java Virtual Machine (JVM) running on an embedded Linux Operating System. The Use Cases on which the software is developed support the full range of UGS operational scenarios such as remote sensor triggering, image capture, and data exfiltration. The Team is additionally developing an ARM microprocessor-based evaluation platform that is both energy-efficient and operationally flexible. The paper describes the overall THOSE architecture, as well as the design decisions for some of the key software components. Development process for THOSE is discussed as well.

  7. The Virtual Brain: a simulator of primate brain network dynamics.

    PubMed

    Sanz Leon, Paula; Knock, Stuart A; Woodman, M Marmaduke; Domide, Lia; Mersmann, Jochen; McIntosh, Anthony R; Jirsa, Viktor

    2013-01-01

    We present The Virtual Brain (TVB), a neuroinformatics platform for full brain network simulations using biologically realistic connectivity. This simulation environment enables the model-based inference of neurophysiological mechanisms across different brain scales that underlie the generation of macroscopic neuroimaging signals including functional MRI (fMRI), EEG and MEG. Researchers from different backgrounds can benefit from an integrative software platform including a supporting framework for data management (generation, organization, storage, integration and sharing) and a simulation core written in Python. TVB allows the reproduction and evaluation of personalized configurations of the brain by using individual subject data. This personalization facilitates an exploration of the consequences of pathological changes in the system, permitting investigation of potential ways to counteract such unfavorable processes. The architecture of TVB supports interaction with MATLAB packages, for example, the well-known Brain Connectivity Toolbox. TVB can be used in a client-server configuration, such that it can be remotely accessed through the Internet thanks to its web-based HTML5, JS, and WebGL graphical user interface. TVB is also accessible as a standalone cross-platform Python library and application, and users can interact with the scientific core through the scripting interface IDLE, enabling easy modeling, development and debugging of the scientific kernel. This second interface makes TVB extensible by combining it with other libraries and modules developed by the Python scientific community. In this article, we describe the theoretical background and foundations that led to the development of TVB, the architecture and features of its major software components as well as potential neuroscience applications.
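
    For readers who want to try the scripting interface mentioned above, the sketch below follows the general pattern of TVB's own simulator demos: build a Simulator from a default connectivity, a neural-mass model, an integrator and a monitor, then run it for a short simulated interval. Exact class and argument names vary across TVB releases, so treat this as an approximate sketch rather than version-specific code.

      from tvb.simulator.lab import (connectivity, coupling, integrators,
                                     models, monitors, simulator)

      # Assemble a whole-brain simulation from TVB's default demo connectivity.
      sim = simulator.Simulator(
          model=models.Generic2dOscillator(),
          connectivity=connectivity.Connectivity.from_file(),
          coupling=coupling.Linear(),
          integrator=integrators.HeunDeterministic(dt=0.1),
          monitors=(monitors.TemporalAverage(period=1.0),),
      )
      sim.configure()

      # Run for 1000 ms of simulated time; one (time, data) pair per monitor.
      (time, data), = sim.run(simulation_length=1000.0)
      print(data.shape)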

  8. The Virtual Brain: a simulator of primate brain network dynamics

    PubMed Central

    Sanz Leon, Paula; Knock, Stuart A.; Woodman, M. Marmaduke; Domide, Lia; Mersmann, Jochen; McIntosh, Anthony R.; Jirsa, Viktor

    2013-01-01

    We present The Virtual Brain (TVB), a neuroinformatics platform for full brain network simulations using biologically realistic connectivity. This simulation environment enables the model-based inference of neurophysiological mechanisms across different brain scales that underlie the generation of macroscopic neuroimaging signals including functional MRI (fMRI), EEG and MEG. Researchers from different backgrounds can benefit from an integrative software platform including a supporting framework for data management (generation, organization, storage, integration and sharing) and a simulation core written in Python. TVB allows the reproduction and evaluation of personalized configurations of the brain by using individual subject data. This personalization facilitates an exploration of the consequences of pathological changes in the system, permitting investigation of potential ways to counteract such unfavorable processes. The architecture of TVB supports interaction with MATLAB packages, for example, the well-known Brain Connectivity Toolbox. TVB can be used in a client-server configuration, such that it can be remotely accessed through the Internet thanks to its web-based HTML5, JS, and WebGL graphical user interface. TVB is also accessible as a standalone cross-platform Python library and application, and users can interact with the scientific core through the scripting interface IDLE, enabling easy modeling, development and debugging of the scientific kernel. This second interface makes TVB extensible by combining it with other libraries and modules developed by the Python scientific community. In this article, we describe the theoretical background and foundations that led to the development of TVB, the architecture and features of its major software components as well as potential neuroscience applications. PMID:23781198

  9. Company profile: Complete Genomics Inc.

    PubMed

    Reid, Clifford

    2011-02-01

    Complete Genomics Inc. is a life sciences company that focuses on complete human genome sequencing. It is taking a completely different approach to DNA sequencing than other companies in the industry. Rather than building a general-purpose platform for sequencing all organisms and all applications, it has focused on a single application - complete human genome sequencing. The company's Complete Genomics Analysis Platform (CGA™ Platform) comprises an integrated package of biochemistry, instrumentation and software that sequences human genomes at the highest quality, lowest cost and largest scale available. Complete Genomics offers a turnkey service that enables customers to outsource their human genome sequencing to the company's genome sequencing center in Mountain View, CA, USA. Customers send in their DNA samples, the company does all the library preparation, DNA sequencing, assembly and variant analysis, and customers receive research-ready data that they can use for biological discovery.

  10. MPHASYS: a mouse phenotype analysis system

    PubMed Central

    Calder, R Brent; Beems, Rudolf B; van Steeg, Harry; Mian, I Saira; Lohman, Paul HM; Vijg, Jan

    2007-01-01

    Background Systematic, high-throughput studies of mouse phenotypes have been hampered by the inability to analyze individual animal data from a multitude of sources in an integrated manner. Studies generally make comparisons at the level of genotype or treatment thereby excluding associations that may be subtle or involve compound phenotypes. Additionally, the lack of integrated, standardized ontologies and methodologies for data exchange has inhibited scientific collaboration and discovery. Results Here we introduce a Mouse Phenotype Analysis System (MPHASYS), a platform for integrating data generated by studies of mouse models of human biology and disease such as aging and cancer. This computational platform is designed to provide a standardized methodology for working with animal data; a framework for data entry, analysis and sharing; and ontologies and methodologies for ensuring accurate data capture. We describe the tools that currently comprise MPHASYS, primarily ones related to mouse pathology, and outline its use in a study of individual animal-specific patterns of multiple pathology in mice harboring a specific germline mutation in the DNA repair and transcription-specific gene Xpd. Conclusion MPHASYS is a system for analyzing multiple data types from individual animals. It provides a framework for developing data analysis applications, and tools for collecting and distributing high-quality data. The software is platform independent and freely available under an open-source license [1]. PMID:17553167

  11. Integrated software health management for aerospace guidance, navigation, and control systems: A probabilistic reasoning approach

    NASA Astrophysics Data System (ADS)

    Mbaya, Timmy

    Embedded Aerospace Systems have to perform safety and mission critical operations in a real-time environment where timing and functional correctness are extremely important. Guidance, Navigation, and Control (GN&C) systems substantially rely on complex software interfacing with hardware in real-time; any faults in software or hardware, or their interaction could result in fatal consequences. Integrated Software Health Management (ISWHM) provides an approach for detection and diagnosis of software failures while the software is in operation. The ISWHM approach is based on probabilistic modeling of software and hardware sensors using a Bayesian network. To meet memory and timing constraints of real-time embedded execution, the Bayesian network is compiled into an Arithmetic Circuit, which is used for on-line monitoring. This type of system monitoring, using an ISWHM, provides automated reasoning capabilities that compute diagnoses in a timely manner when failures occur. This reasoning capability enables time-critical mitigating decisions and relieves the human agent from the time-consuming and arduous task of foraging through a multitude of isolated---and often contradictory---diagnosis data. For the purpose of demonstrating the relevance of ISWHM, modeling and reasoning is performed on a simple simulated aerospace system running on a real-time operating system emulator, the OSEK/Trampoline platform. Models for a small satellite and an F-16 fighter jet GN&C (Guidance, Navigation, and Control) system have been implemented. Analysis of the ISWHM is then performed by injecting faults and analyzing the ISWHM's diagnoses.
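
    To make the probabilistic-reasoning step concrete, the toy Python sketch below relates a latent fault to two monitored "software sensors" and computes the posterior fault probability by brute-force enumeration. The structure and all probabilities are invented for illustration; the actual ISWHM compiles a Bayesian network into an arithmetic circuit for real-time, on-line monitoring.

```python
# Toy illustration of the ISWHM idea: a tiny probabilistic model relating a
# latent fault to two monitored "software sensors", with the posterior over
# the fault obtained by brute-force enumeration. All numbers are invented.
p_fault = 0.02                                  # prior P(fault)
p_high_given_fault = {                          # P(sensor flags high | fault state)
    "timing":   {True: 0.90, False: 0.05},
    "residual": {True: 0.80, False: 0.10},
}

def posterior_fault(observed: dict) -> float:
    """Return P(fault | observed sensor flags) by enumerating fault states."""
    joint = {}
    for fault in (True, False):
        p = p_fault if fault else 1.0 - p_fault
        for sensor, is_high in observed.items():
            p_high = p_high_given_fault[sensor][fault]
            p *= p_high if is_high else 1.0 - p_high
        joint[fault] = p
    return joint[True] / (joint[True] + joint[False])

print(posterior_fault({"timing": True, "residual": True}))    # strong fault evidence
print(posterior_fault({"timing": False, "residual": False}))  # nominal behaviour
```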

  12. Platform for Post-Processing Waveform-Based NDE

    NASA Technical Reports Server (NTRS)

    Roth, Don J.

    2010-01-01

    Signal- and image-processing methods are commonly needed to extract information from waveforms, improve image resolution, and highlight defects in an image. Since some similarity exists among all waveform-based nondestructive evaluation (NDE) methods, a common software platform containing multiple signal- and image-processing techniques for processing waveforms and images makes sense where multiple techniques, scientists, engineers, and organizations are involved. NDE Wave & Image Processor Version 2.0 software provides a single, integrated signal- and image-processing and analysis environment for total NDE data processing and analysis. It brings some of the most useful algorithms developed for NDE over the past 20 years into a commercial-grade product. The software can import signal/spectroscopic data, image data, and image series data. It offers the user hundreds of basic and advanced signal- and image-processing capabilities, including esoteric 1D and 2D wavelet-based de-noising, de-trending, and filtering. Batch processing is included for signal- and image-processing capability so that an optimized sequence of processing operations can be applied to entire folders of signals, spectra, and images. Additionally, an extensive interactive model-based curve-fitting facility has been included to allow fitting of spectroscopy data such as that from Raman spectroscopy. An extensive joint time-frequency module is included for analysis of non-stationary or transient data such as that from acoustic emission, vibration, or earthquakes.
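
    As an example of the kind of waveform processing such a platform bundles, the sketch below applies 1-D wavelet soft-threshold de-noising to a synthetic echo using NumPy and PyWavelets. It is an assumption-based illustration, not code from the NDE Wave & Image Processor.

```python
# Sketch of 1-D wavelet de-noising of a waveform (PyWavelets assumed installed).
import numpy as np
import pywt

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 1024)
clean = np.sin(2 * np.pi * 5 * t) * np.exp(-3 * t)    # synthetic ultrasonic-like echo
noisy = clean + 0.3 * rng.standard_normal(t.size)

coeffs = pywt.wavedec(noisy, "db4", level=5)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise estimate from finest scale
thresh = sigma * np.sqrt(2 * np.log(noisy.size))      # universal threshold
coeffs = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, "db4")

print("residual RMS:", np.sqrt(np.mean((denoised[: clean.size] - clean) ** 2)))
```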

  13. Blue guardian: an open architecture for rapid ISR demonstration

    NASA Astrophysics Data System (ADS)

    Barrett, Donald A.; Borntrager, Luke A.; Green, David M.

    2016-05-01

    Throughout the Department of Defense (DoD), acquisition, platform integration, and life cycle costs for weapons systems have continued to rise. Although Open Architecture (OA) interface standards are one of the primary methods being used to reduce these costs, the Air Force Rapid Capabilities Office (AFRCO) has extended the OA concept and chartered the Open Mission System (OMS) initiative with industry to develop and demonstrate a consensus-based, non-proprietary, OA standard for integrating subsystems and services into airborne platforms. The new OMS standard provides the capability to decouple vendor-specific sensors, payloads, and service implementations from platform-specific architectures and is still in the early stages of maturation and demonstration. The Air Force Research Laboratory (AFRL) - Sensors Directorate has developed the Blue Guardian program to demonstrate advanced sensing technology utilizing open architectures in operationally relevant environments. Over the past year, Blue Guardian has developed a platform architecture using the Air Force's OMS reference architecture and conducted a ground and flight test program of multiple payload combinations. Systems tested included a vendor-unique variety of Full Motion Video (FMV) systems, a Wide Area Motion Imagery (WAMI) system, a multi-mode radar system, processing and database functions, multiple decompression algorithms, multiple communications systems, and a suite of software tools. Initial results of the Blue Guardian program show the promise of OA to DoD acquisitions, especially for Intelligence, Surveillance and Reconnaissance (ISR) payload applications. Specifically, the OMS reference architecture was extremely useful in reducing the cost and time required for integrating new systems.

  14. Integration of USB and firewire cameras in machine vision applications

    NASA Astrophysics Data System (ADS)

    Smith, Timothy E.; Britton, Douglas F.; Daley, Wayne D.; Carey, Richard

    1999-08-01

    Digital cameras have been around for many years, but a new breed of consumer-market cameras is hitting the mainstream. By using these devices, system designers and integrators will be well positioned to take advantage of technological advances developed to support multimedia and imaging applications on the PC platform. Having these new cameras on the consumer market means lower cost, but it does not necessarily guarantee ease of integration. There are many issues that need to be accounted for, such as image quality, maintainable frame rates, image size and resolution, supported operating systems, and ease of software integration. This paper briefly describes a couple of the consumer digital standards and then discusses some of the advantages and pitfalls of integrating both USB and FireWire cameras into computer/machine vision applications.
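
    For a modern point of comparison, the snippet below shows the minimal work needed to grab frames from a consumer USB (UVC) camera with OpenCV in Python; FireWire/IIDC cameras typically require a vendor SDK or a library such as libdc1394 instead. The device index and the properties queried are assumptions.

```python
# Minimal sketch of grabbing frames from a consumer USB camera with OpenCV.
import cv2

cap = cv2.VideoCapture(0)                 # first enumerated camera (assumed index)
if not cap.isOpened():
    raise RuntimeError("no camera found on index 0")

print("reported FPS:", cap.get(cv2.CAP_PROP_FPS))
ok, frame = cap.read()                    # one BGR frame as a NumPy array
if ok:
    print("frame size:", frame.shape)     # (height, width, channels)
cap.release()
```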

  15. Combining a Multi-Agent System and Communication Middleware for Smart Home Control: A Universal Control Platform Architecture

    PubMed Central

    Zheng, Song; Zhang, Qi; Zheng, Rong; Huang, Bi-Qin; Song, Yi-Lin; Chen, Xin-Chu

    2017-01-01

    In recent years, the smart home field has gained wide attention for its broad application prospects. However, families using smart home systems must usually adopt various heterogeneous smart devices and sensors, which makes it more difficult to manage and control their home system. How to design a unified control platform to deal with the collaborative control problem of heterogeneous smart devices is one of the greatest challenges in the current smart home field. The main contribution of this paper is to propose a universal smart home control platform architecture (IAPhome) based on a multi-agent system and communication middleware, which shows significant adaptability and advantages in many aspects, including heterogeneous device connectivity, collaborative control, human-computer interaction and user self-management. The communication middleware is an important foundation for designing and implementing this architecture, making it possible to integrate heterogeneous smart devices in a flexible way. A concrete method of applying the multi-agent software technique to solve the integrated control problem of the smart home system is also presented. The proposed platform architecture has been tested in a real smart home environment, and the results indicate the effectiveness of our approach for solving the collaborative control problem of different smart devices. PMID:28926957

  16. Combining a Multi-Agent System and Communication Middleware for Smart Home Control: A Universal Control Platform Architecture.

    PubMed

    Zheng, Song; Zhang, Qi; Zheng, Rong; Huang, Bi-Qin; Song, Yi-Lin; Chen, Xin-Chu

    2017-09-16

    In recent years, the smart home field has gained wide attention for its broad application prospects. However, families using smart home systems must usually adopt various heterogeneous smart devices and sensors, which makes it more difficult to manage and control their home system. How to design a unified control platform to deal with the collaborative control problem of heterogeneous smart devices is one of the greatest challenges in the current smart home field. The main contribution of this paper is to propose a universal smart home control platform architecture (IAPhome) based on a multi-agent system and communication middleware, which shows significant adaptability and advantages in many aspects, including heterogeneous device connectivity, collaborative control, human-computer interaction and user self-management. The communication middleware is an important foundation for designing and implementing this architecture, making it possible to integrate heterogeneous smart devices in a flexible way. A concrete method of applying the multi-agent software technique to solve the integrated control problem of the smart home system is also presented. The proposed platform architecture has been tested in a real smart home environment, and the results indicate the effectiveness of our approach for solving the collaborative control problem of different smart devices.

  17. SU-E-T-36: A GPU-Accelerated Monte-Carlo Dose Calculation Platform and Its Application Toward Validating a ViewRay Beam Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Y; Mazur, T; Green, O

    Purpose: To build a fast, accurate and easily-deployable research platform for Monte-Carlo dose calculations. We port the dose calculation engine PENELOPE to C++, and accelerate calculations using GPU acceleration. Simulations of a Co-60 beam model provided by ViewRay demonstrate the capabilities of the platform. Methods: We built software that incorporates a beam model interface, CT-phantom model, GPU-accelerated PENELOPE engine, and GUI front-end. We rewrote the PENELOPE kernel in C++ (from Fortran) and accelerated the code on a GPU. We seamlessly integrated a Co-60 beam model (obtained from ViewRay) into our platform. Simulations of various field sizes and SSDs using a homogeneous water phantom generated PDDs, dose profiles, and output factors that were compared to experiment data. Results: With GPU acceleration using a dated graphics card (Nvidia Tesla C2050), a highly accurate simulation – including a 100×100×100 grid, 3×3×3 mm³ voxels, <1% uncertainty, and 4.2×4.2 cm² field size – runs 24 times faster (20 minutes versus 8 hours) than when parallelizing on 8 threads across a new CPU (Intel i7-4770). Simulated PDDs, profiles and output ratios for the commercial system agree well with experiment data measured using radiographic film or ionization chamber. Based on our analysis, this beam model is precise enough for general applications. Conclusions: Using a beam model for a Co-60 system provided by ViewRay, we evaluate a dose calculation platform that we developed. Comparison to measurements demonstrates the promise of our software for use as a research platform for dose calculations, with applications including quality assurance and treatment plan verification.
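
    The following toy Monte Carlo sketch, written in NumPy, conveys the kind of computation being accelerated: sampling photon interaction depths in water from an exponential attenuation law and binning the deposited energy into a crude depth-dose curve. It is not the PENELOPE or ViewRay code, and the attenuation coefficient is only a rough textbook value.

```python
# Toy Monte Carlo depth-dose estimate in water (illustrative only).
import numpy as np

rng = np.random.default_rng(42)
mu = 0.0632              # 1/cm, rough attenuation for ~1.25 MeV photons in water
n_photons = 1_000_000
depth_max, n_bins = 30.0, 60

# Sample first-interaction depths and deposit all energy locally (crude model).
depths = rng.exponential(scale=1.0 / mu, size=n_photons)
dose, edges = np.histogram(depths[depths < depth_max], bins=n_bins,
                           range=(0.0, depth_max))
dose = dose / dose.max()                    # normalised "PDD"

for z, d in zip(edges[:-1][::10], dose[::10]):
    print(f"depth {z:5.1f} cm  relative dose {d:5.3f}")
```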

  18. Development and Characteristics of a Mobile, Semi-Autonomous Floating Platform for in situ Lake Measurements

    NASA Astrophysics Data System (ADS)

    Barry, D.; Lemmin, U.; Le Dantec, N.; Zulliger, L.; Rusterholz, M.; Bolay, M.; Rossier, J.; Kangur, K.

    2013-12-01

    In the development of sustainable management strategies of lakes more insight into their physical, chemical and ecological dynamics is needed. Field data obtained from various types of sensors with adequate spatial and temporal sampling rate are essential to understand better the processes that govern fluxes and pathways of water masses and transported compounds, whether for model validation or for monitoring purposes. One advantage of unmanned platforms is that they limit the disturbances typically affecting the quality of data collected on small vessels, including perturbations caused by movements of onboard crew. We have developed a mobile, semi-autonomous floating platform with 8 h power autonomy using a 5 m long by 2.5 m wide catamaran. Our approach focused on modularity and high payload capacity in order to accommodate a large number of sensors both in terms of electronic (power and data) and mechanical constraints of integration. Software architecture and onboard electronics use National Instruments technology to simplify and standardize integration of sensors, actuators and communication. Piecewise-movable deck sections allow optimizing platform stability depending on the payload. The entire system is controlled by a remote computer located on an accompanying vessel and connected via a wireless link with a range of over 1 km. Real-time transmission of GPS-stamped measurements allows immediate modifications in the survey plan if needed. The displacement of the platform is semi-autonomous, with the options of either autopilot mode following a pre-planned course specified by waypoints or remote manual control from the accompanying vessel. Maintenance of permanent control over the platform displacement is required for safety reasons with respect to other users of the lake. Currently, the sensor payload comprises an array of fast temperature probes, a bottom-tracking ADCP and atmospheric sensors including a radiometer. A towed CTD with additional water quality sensors operated from a remotely controlled winch is presently being integrated. Field tests have shown that the platform is reliable, capable of collecting long transects of 2D lake and collocated atmospheric boundary layer data and adaptable to integrate new sensors.

  19. Implementation of a Space Communications Cognitive Engine

    NASA Technical Reports Server (NTRS)

    Hackett, Timothy M.; Bilen, Sven G.; Ferreira, Paulo Victor R.; Wyglinski, Alexander M.; Reinhart, Richard C.

    2017-01-01

    Although communications-based cognitive engines have been proposed, very few have been implemented in a full system, especially in a space communications system. In this paper, we detail the implementation of a multi-objective reinforcement-learning algorithm and deep artificial neural networks for use as a radio-resource-allocation controller. The modular software architecture presented encourages re-use and easy modification for trying different algorithms. Various trade studies involved in the system implementation and integration are discussed. These include the choice of software libraries that provide platform flexibility and promote reusability, choices regarding the deployment of this cognitive engine within a system architecture using the DVB-S2 standard and commercial hardware, and constraints placed on the cognitive engine by real-world radio limitations. The implemented radio-resource-allocation management controller was then integrated with the larger space-ground system developed by NASA Glenn Research Center (GRC).
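
    As a hedged sketch of the control idea (not the NASA GRC implementation), the Python snippet below runs an epsilon-greedy learner that selects among a few example DVB-S2-style configurations to maximise a weighted throughput-versus-power reward; the action set, weights, and reward model are invented for illustration.

```python
# Epsilon-greedy resource-allocation sketch with a weighted multi-objective reward.
import random

actions = ["QPSK-1/2", "QPSK-3/4", "8PSK-2/3", "16APSK-3/4"]   # example mode set
q = {a: 0.0 for a in actions}
counts = {a: 0 for a in actions}
epsilon, w_throughput, w_power = 0.1, 0.7, 0.3

def observe_reward(action: str) -> float:
    # Placeholder for real channel feedback; here a noisy synthetic reward.
    base = {"QPSK-1/2": 0.4, "QPSK-3/4": 0.55, "8PSK-2/3": 0.6, "16APSK-3/4": 0.7}[action]
    throughput = base + random.gauss(0, 0.05)
    power_cost = 0.2 + 0.1 * actions.index(action)
    return w_throughput * throughput - w_power * power_cost

for step in range(500):
    a = random.choice(actions) if random.random() < epsilon else max(q, key=q.get)
    r = observe_reward(a)
    counts[a] += 1
    q[a] += (r - q[a]) / counts[a]          # incremental mean update

print("learned preference:", max(q, key=q.get), q)
```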

  20. EcoliWiki: a wiki-based community resource for Escherichia coli

    PubMed Central

    McIntosh, Brenley K.; Renfro, Daniel P.; Knapp, Gwendowlyn S.; Lairikyengbam, Chanchala R.; Liles, Nathan M.; Niu, Lili; Supak, Amanda M.; Venkatraman, Anand; Zweifel, Adrienne E.; Siegele, Deborah A.; Hu, James C.

    2012-01-01

    EcoliWiki is the community annotation component of the PortEco (http://porteco.org; formerly EcoliHub) project, an online data resource that integrates information on laboratory strains of Escherichia coli, its phages, plasmids and mobile genetic elements. As one of the early adopters of the wiki approach to model organism databases, EcoliWiki was designed to not only facilitate community-driven sharing of biological knowledge about E. coli as a model organism, but also to be interoperable with other data resources. EcoliWiki content currently covers genes from five laboratory E. coli strains, 21 bacteriophage genomes, F plasmid and eight transposons. EcoliWiki integrates the Mediawiki wiki platform with other open-source software tools and in-house software development to extend how wikis can be used for model organism databases. EcoliWiki can be accessed online at http://ecoliwiki.net. PMID:22064863

  1. Evolution of the ATLAS Nightly Build System

    NASA Astrophysics Data System (ADS)

    Undrus, A.

    2012-12-01

    The ATLAS Nightly Build System is a major component in the ATLAS collaborative software organization, validation, and code approval scheme. Over more than 10 years of development it has evolved into a factory for automatic release production and grid distribution. The 50 multi-platform branches of ATLAS releases provide vast opportunities for testing new packages, verification of patches to existing software, and migration to new platforms and compilers for ATLAS code, which currently contains 2200 packages with 4 million lines of C++ and 1.4 million lines of Python scripting written by about 1000 developers. Recent development was focused on the integration of the ATLAS Nightly Build and Installation systems. The nightly releases are distributed and validated, and some are transformed into stable releases used for data processing worldwide. The ATLAS Nightly System is managed by the NICOS control tool on a computing farm with 50 powerful multiprocessor nodes. NICOS provides the fully automated framework for the release builds, testing, and creation of distribution kits. The ATN testing framework of the Nightly System runs unit and integration tests in parallel suites, fully utilizing the resources of multi-core machines, and provides the first results even before compilations complete. The NICOS error detection system is based on several techniques and classifies the compilation and test errors according to their severity. It is periodically tuned to place greater emphasis on certain software defects by highlighting the problems on NICOS web pages and sending automatic e-mail notifications to responsible developers. These and other recent developments will be presented and future plans will be described.

  2. Mobile Agents: A Distributed Voice-Commanded Sensory and Robotic System for Surface EVA Assistance

    NASA Technical Reports Server (NTRS)

    Clancey, William J.; Sierhuis, Maarten; Alena, Rick; Crawford, Sekou; Dowding, John; Graham, Jeff; Kaskiris, Charis; Tyree, Kim S.; vanHoof, Ronnie

    2003-01-01

    A model-based, distributed architecture integrates diverse components in a system designed for lunar and planetary surface operations: spacesuit biosensors, cameras, GPS, and a robotic assistant. The system transmits data and assists communication between the extra-vehicular activity (EVA) astronauts, the crew in a local habitat, and a remote mission support team. Software processes ("agents"), implemented in a system called Brahms, run on multiple, mobile platforms, including the spacesuit backpacks, all-terrain vehicles, and robot. These "mobile agents" interpret and transform available data to help people and robotic systems coordinate their actions to make operations more safe and efficient. Different types of agents relate platforms to each other ("proxy agents"), devices to software ("comm agents"), and people to the system ("personal agents"). A state-of-the-art spoken dialogue interface enables people to communicate with their personal agents, supporting a speech-driven navigation and scheduling tool, field observation record, and rover command system. An important aspect of the engineering methodology involves first simulating the entire hardware and software system in Brahms, and then configuring the agents into a runtime system. Design of mobile agent functionality has been based on ethnographic observation of scientists working in Mars analog settings in the High Canadian Arctic on Devon Island and the southeast Utah desert. The Mobile Agents system is developed iteratively in the context of use, with people doing authentic work. This paper provides a brief introduction to the architecture and emphasizes the method of empirical requirements analysis, through which observation, modeling, design, and testing are integrated in simulated EVA operations.

  3. Performance assessments of Android-powered military applications operating on tactical handheld devices

    NASA Astrophysics Data System (ADS)

    Weiss, Brian A.; Fronczek, Lisa; Morse, Emile; Kootbally, Zeid; Schlenoff, Craig

    2013-05-01

    Transformative Apps (TransApps) is a Defense Advanced Research Projects Agency (DARPA) funded program whose goal is to develop a range of militarily-relevant software applications ("apps") to enhance the operational effectiveness of military personnel on (and off) the battlefield. TransApps is also developing a military apps marketplace to facilitate rapid development and dissemination of applications that address user needs by connecting engaged communities of end-users with development groups. The National Institute of Standards and Technology's (NIST) role in the TransApps program is to design and implement evaluation procedures to assess the performance of: 1) the various software applications, 2) software-hardware interactions, and 3) the supporting online application marketplace. Specifically, NIST is responsible for evaluating 50+ tactically-relevant applications operating on numerous Android™-powered platforms. NIST efforts include functional regression testing and quantitative performance testing. This paper discusses the evaluation methodologies employed to assess the performance of three key program elements: 1) handheld-based applications and their integration with various hardware platforms, 2) client-based applications, and 3) network technologies operating on both the handheld and client systems, along with their integration into the application marketplace. Handheld-based applications are assessed using a combination of utility- and usability-based checklists and quantitative performance tests. Client-based applications are assessed to replicate current overseas disconnected operations (i.e., no network connectivity between handhelds) and to assess connected operations envisioned for later use. Finally, networked applications are assessed on handhelds to establish performance baselines for when connectivity becomes commonplace.

  4. Jungle Computing: Distributed Supercomputing Beyond Clusters, Grids, and Clouds

    NASA Astrophysics Data System (ADS)

    Seinstra, Frank J.; Maassen, Jason; van Nieuwpoort, Rob V.; Drost, Niels; van Kessel, Timo; van Werkhoven, Ben; Urbani, Jacopo; Jacobs, Ceriel; Kielmann, Thilo; Bal, Henri E.

    In recent years, the application of high-performance and distributed computing in scientific practice has become increasingly widespread. Among the most widely available platforms to scientists are clusters, grids, and cloud systems. Such infrastructures are currently undergoing revolutionary change due to the integration of many-core technologies, providing orders-of-magnitude speed improvements for selected compute kernels. With high-performance and distributed computing systems thus becoming more heterogeneous and hierarchical, programming complexity is vastly increased. Further complexities arise because an urgent desire for scalability, together with issues including data distribution, software heterogeneity, and ad hoc hardware availability, commonly forces scientists into simultaneous use of multiple platforms (e.g., clusters, grids, and clouds used concurrently). A true computing jungle.

  5. A Digital Knowledge Preservation Platform for Environmental Sciences

    NASA Astrophysics Data System (ADS)

    Aguilar Gómez, Fernando; de Lucas, Jesús Marco; Pertinez, Esther; Palacio, Aida; Perez, David

    2017-04-01

    The Digital Knowledge Preservation Platform is the evolution of a pilot project for Open Data supporting the full research data life cycle. It is currently being evolved at IFCA (Instituto de Física de Cantabria) as a combination of different open tools that have been extended: DMPTool (https://dmptool.org/) with pilot semantics features (RDF export, parameter definition), a customized version of INVENIO (http://invenio-software.org/) that integrates the entire research data life cycle, and Jupyter (http://jupyter.org/) as a processing tool and reproducibility environment. This complete platform aims to provide an integrated environment for research data management following the FAIR+R principles: Findable, since the Invenio-based web portal provides a search engine and all elements carry metadata to make them easily findable; Accessible, since both data and software are available online with internal PIDs and DOIs (provided by DataCite); Interoperable, since datasets can be combined to perform new analyses and the OAI-PMH standard is integrated; Re-usable, since different license types and embargo periods can be defined; and +Reproducible, since the platform is directly integrated with cloud computing resources. The deployment of the entire system over a cloud framework helps to build a dynamic and scalable solution, not only for managing open datasets but also as a useful tool for the final user, who is able to directly process and analyse the open data. In parallel, the direct use of semantics and metadata is being explored and integrated in the framework. Ontologies, being a knowledge representation, can contribute to defining the elements and relationships of the research data life cycle, including DMPs, datasets, software, etc. The first advantage of developing an ontology of a knowledge domain is that it provides a common vocabulary hierarchy (i.e. a conceptual schema) that can be used and standardized by all the agents interested in the domain (either humans or machines). This way of using ontologies is one of the foundations of the Semantic Web, where ontologies are set to play a key role in establishing a common terminology between agents. To develop the ontology we are using Protégé, a graphical ontology-development tool that supports a rich knowledge model and is open-source and freely available. To process and manage the ontology from the web framework, however, we are using Semantic MediaWiki, an extension of MediaWiki that can process semantic queries and export data in RDF and CSV formats. This system is used as a testbed for the potential use of semantics in a more general environment. The Digital Knowledge Preservation Platform is closely related to the INDIGO-DataCloud project (https://www.indigo-datacloud.eu), since the same data life cycle approach (Planning, Collect, Curate, Analyze, Publish, Preserve) is taken into account. INDIGO-DataCloud solutions will be able to support all the different elements in the system, as we showed at the last Research Data Alliance Plenary. This presentation will show the different elements of the system and how they work, as well as the roadmap for their continuous integration.

  6. Innovative Pressure Sensor Platform and Its Integration with an End-User Application

    PubMed Central

    Flores-Caballero, Antonio; Copaci, Dorin; Blanco, María Dolores; Moreno, Luis; Herrán, Jaime; Fernández, Iván; Ochoteco, Estíbaliz; Cabañero, German; Grande, Hans

    2014-01-01

    This paper describes the full integration of an innovative and low-cost pressure sensor sheet based on a bendable, printed-electronics technology. All integration stages are covered, from the lowest-level functional system (physical analog sensor data acquisition), through embedded data processing, to the end-user interactive visual application. The data acquisition embedded software and hardware were developed using Rapid Control Prototyping (RCP). Finally, after successful testing of the first electronic prototype, tailor-made electronics were developed, reducing the electronics volume to 3.5 cm × 6 cm × 2 cm with a maximum power consumption of 765 mW for both the electronics and the pressure sensor sheet. PMID:24922455

  7. The Infobiotics Workbench: an integrated in silico modelling platform for Systems and Synthetic Biology.

    PubMed

    Blakes, Jonathan; Twycross, Jamie; Romero-Campero, Francisco Jose; Krasnogor, Natalio

    2011-12-01

    The Infobiotics Workbench is an integrated software suite incorporating model specification, simulation, parameter optimization and model checking for Systems and Synthetic Biology. A modular model specification allows for straightforward creation of large-scale models containing many compartments and reactions. Models are simulated either using stochastic simulation or numerical integration, and visualized in time and space. Model parameters and structure can be optimized with evolutionary algorithms, and model properties calculated using probabilistic model checking. Source code and binaries for Linux, Mac and Windows are available at http://www.infobiotics.org/infobiotics-workbench/; released under the GNU General Public License (GPL) version 3. Natalio.Krasnogor@nottingham.ac.uk.

  8. A versatile nondestructive evaluation imaging workstation

    NASA Technical Reports Server (NTRS)

    Chern, E. James; Butler, David W.

    1994-01-01

    Ultrasonic C-scan and eddy current imaging systems are pointwise evaluation systems that rely on a mechanical scanner to physically maneuver a probe relative to the specimen, point by point, in order to acquire data and generate images. Since the ultrasonic C-scan and eddy current imaging systems are based on the same mechanical scanning mechanisms, the two systems can be combined using the same PC platform with a common mechanical manipulation subsystem and integrated data acquisition software. Based on this concept, we have developed an IBM PC-based combined ultrasonic C-scan and eddy current imaging system. The system is modularized and provides capacity for future hardware and software expansions. Advantages associated with the combined system are: (1) eliminated duplication of the computer and mechanical hardware, (2) unified data acquisition, processing and storage software, (3) reduced setup time for repetitious ultrasonic and eddy current scans, and (4) improved system efficiency. The concept can be adapted to many engineering systems by integrating related PC-based instruments into one multipurpose workstation for industrial applications such as dispensing, machining, packaging, and sorting.

  9. A versatile nondestructive evaluation imaging workstation

    NASA Astrophysics Data System (ADS)

    Chern, E. James; Butler, David W.

    1994-02-01

    Ultrasonic C-scan and eddy current imaging systems are pointwise evaluation systems that rely on a mechanical scanner to physically maneuver a probe relative to the specimen, point by point, in order to acquire data and generate images. Since the ultrasonic C-scan and eddy current imaging systems are based on the same mechanical scanning mechanisms, the two systems can be combined using the same PC platform with a common mechanical manipulation subsystem and integrated data acquisition software. Based on this concept, we have developed an IBM PC-based combined ultrasonic C-scan and eddy current imaging system. The system is modularized and provides capacity for future hardware and software expansions. Advantages associated with the combined system are: (1) eliminated duplication of the computer and mechanical hardware, (2) unified data acquisition, processing and storage software, (3) reduced setup time for repetitious ultrasonic and eddy current scans, and (4) improved system efficiency. The concept can be adapted to many engineering systems by integrating related PC-based instruments into one multipurpose workstation for industrial applications such as dispensing, machining, packaging, and sorting.

  10. Software and Human-Machine Interface Development for Environmental Controls Subsystem Support

    NASA Technical Reports Server (NTRS)

    Dobson, Matthew

    2018-01-01

    The Space Launch System (SLS) is the next premier launch vehicle for NASA. It is the next stage of manned space exploration from American soil, and will be the platform in which we push further beyond Earth orbit. In preparation of the SLS maiden voyage on Exploration Mission 1 (EM-1), the existing ground support architecture at Kennedy Space Center required significant overhaul and updating. A comprehensive upgrade of controls systems was necessary, including programmable logic controller software, as well as Launch Control Center (LCC) firing room and local launch pad displays for technician use. Environmental control acts as an integral component in these systems, being the foremost system for conditioning the pad and extremely sensitive launch vehicle until T-0. The Environmental Controls Subsystem (ECS) required testing and modification to meet the requirements of the designed system, as well as the human factors requirements of NASA software for Validation and Verification (V&V). This term saw significant strides in the progress and functionality of the human-machine interfaces used at the launch pad, and improved integration with the controller code.

  11. Integrated computer-aided design using minicomputers

    NASA Technical Reports Server (NTRS)

    Storaasli, O. O.

    1980-01-01

    Computer-Aided Design/Computer-Aided Manufacturing (CAD/CAM), a highly interactive software system, has been implemented on minicomputers at the NASA Langley Research Center. CAD/CAM software integrates many formerly fragmented programs and procedures into one cohesive system; it also includes finite element modeling and analysis, and has been interfaced via a computer network to a relational data base management system and offline plotting devices on mainframe computers. The CAD/CAM software system requires interactive graphics terminals operating at a minimum transfer rate of 4800 bits/sec to a computer. The system is portable and introduces 'interactive graphics', which permits the creation and modification of models interactively. The CAD/CAM system has already produced designs for a large area space platform, a national transonic facility fan blade, and a laminar flow control wind tunnel model. Besides the design/drafting and element analysis capability, CAD/CAM provides options to produce an automatic program tooling code to drive a numerically controlled (N/C) machine. Reductions in time for design, engineering, drawing, finite element modeling, and N/C machining will benefit productivity through reduced costs, fewer errors, and a wider range of configurations.

  12. XML-based scripting of multimodality image presentations in multidisciplinary clinical conferences

    NASA Astrophysics Data System (ADS)

    Ratib, Osman M.; Allada, Vivekanand; Dahlbom, Magdalena; Marcus, Phillip; Fine, Ian; Lapstra, Lorelle

    2002-05-01

    We developed a multi-modality image presentation software for display and analysis of images and related data from different imaging modalities. The software is part of a cardiac image review and presentation platform that supports integration of digital images and data from digital and analog media such as videotapes, analog x-ray films and 35 mm cine films. The software supports standard DICOM image files as well as AVI and PDF data formats. The system is integrated in a digital conferencing room that includes projections of digital and analog sources, remote videoconferencing capabilities, and an electronic whiteboard. The goal of this pilot project is to: 1) develop a new paradigm for image and data management for presentation in a clinically meaningful sequence adapted to case-specific scenarios, 2) design and implement a multi-modality review and conferencing workstation using component technology and customizable 'plug-in' architecture to support complex review and diagnostic tasks applicable to all cardiac imaging modalities and 3) develop an XML-based scripting model of image and data presentation for clinical review and decision making during routine clinical tasks and multidisciplinary clinical conferences.

  13. Open source posturography.

    PubMed

    Rey-Martinez, Jorge; Pérez-Fernández, Nicolás

    2016-12-01

    The proposed validation goal of 0.9 for the intra-class correlation coefficient was reached with the results of this study. On the basis of these results we consider the developed software (RombergLab) to be validated balance assessment software, with the caveat that its reliability depends on the technical specifications of the force platform used. The objective was to develop and validate posturography software and share its source code under open source terms. In a prospective, non-randomized validation study, 20 consecutive adults underwent two balance assessment tests: six-condition posturography was performed using clinically approved software and force platform, and the same conditions were measured using the newly developed open source software with a low-cost force platform. The intra-class correlation index of the sway area, obtained from the center-of-pressure variations in both devices for the six conditions, was the main variable used for validation. Excellent concordance between RombergLab and the clinically approved force platform was obtained (intra-class correlation coefficient = 0.94), and a Bland-Altman concordance plot was also produced. The source code used to develop RombergLab was published under open source terms.
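
    One common sway metric behind such comparisons is the area of the 95% confidence ellipse fitted to the centre-of-pressure trace. The sketch below computes it with NumPy from synthetic COP samples; it mirrors the kind of quantity compared between platforms and is not taken from the RombergLab source.

```python
# Sway area as the 95% confidence ellipse of centre-of-pressure (COP) samples.
import numpy as np

rng = np.random.default_rng(3)
cop = rng.multivariate_normal([0, 0], [[4.0, 1.0], [1.0, 2.5]], size=2000)  # mm

cov = np.cov(cop, rowvar=False)
eigvals = np.linalg.eigvalsh(cov)           # principal variances of the sway
chi2_95 = 5.991                             # chi-square quantile, 2 dof, p = 0.95
area_mm2 = np.pi * chi2_95 * np.sqrt(eigvals[0] * eigvals[1])

print(f"95% confidence-ellipse sway area: {area_mm2:.1f} mm^2")
```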

  14. Automated platform for designing multiple robot work cells

    NASA Astrophysics Data System (ADS)

    Osman, N. S.; Rahman, M. A. A.; Rahman, A. A. Abdul; Kamsani, S. H.; Bali Mohamad, B. M.; Mohamad, E.; Zaini, Z. A.; Rahman, M. F. Ab; Mohamad Hatta, M. N. H.

    2017-06-01

    Designing multiple robot work cells is a very knowledge-intensive, intricate, and time-consuming process. This paper elaborates the development of a computer-aided design program for generating multiple robot work cells that offers a user-friendly interface. The primary purpose of this work is to provide a fast and easy platform that reduces cost and human involvement and minimizes trial-and-error adjustments. The automated platform is constructed based on the variant-shaped configuration concept and its mathematical model. The robot work cell layout, system components, and construction procedure of the automated platform are discussed in this paper; the integration of these items automatically provides the optimum robot work cell design according to the information set by the user. The system is implemented on top of CATIA V5 software and utilises its Part Design, Assembly Design, and Macro tools. The current outcomes of this work provide a basis for future investigation into developing a flexible configuration system for multiple robot work cells.

  15. Integrating Distributed Interactive Simulations With the Project Darkstar Open-Source Massively Multiplayer Online Game (MMOG) Middleware

    DTIC Science & Technology

    2009-09-01

    be complete MMOG solutions such as Multiverse are not within the scope of this thesis, though it is recommended that readers compare this type of... software to the middleware described here (Multiverse, 2009). 1. University of Munster: Real-Time Framework. The Real-Time Framework (RTF) project is... 10, 2009, from http://wiki.secondlife.com/wiki/MMOX Multiverse. (2009). Multiverse platform architecture. Retrieved September 9, 2009, from http

  16. Sirocco Storage Server v. pre-alpha 0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curry, Matthew L.; Danielson, Geoffrey; Ward, H. Lee

    Sirocco is a parallel storage system under development, designed for write-intensive workloads on large-scale HPC platforms. It implements a key-value object store on top of a set of loosely federated storage servers that cooperate to ensure data integrity and performance. It includes support for a range of different types of storage transactions. This software release constitutes a conformant storage server, along with the client-side libraries to access the storage over a network.

  17. Systems Engineering Building Advances Power Grid Research

    ScienceCinema

    Virden, Jud; Huang, Henry; Skare, Paul; Dagle, Jeff; Imhoff, Carl; Stoustrup, Jakob; Melton, Ron; Stiles, Dennis; Pratt, Rob

    2018-01-16

    Researchers and industry are now better equipped to tackle the nation’s most pressing energy challenges through PNNL’s new Systems Engineering Building – including challenges in grid modernization, buildings efficiency and renewable energy integration. This lab links real-time grid data, software platforms, specialized laboratories and advanced computing resources for the design and demonstration of new tools to modernize the grid and increase buildings energy efficiency.

  18. Web accessibility and open source software.

    PubMed

    Obrenović, Zeljko

    2009-07-01

    A Web browser provides a uniform user interface to different types of information. Making this interface universally accessible and more interactive is a long-term goal still far from being achieved. Universally accessible browsers require novel interaction modalities and additional functionalities, for which existing browsers tend to provide only partial solutions. Although functionality for Web accessibility can be found as open source and free software components, their reuse and integration is complex because they were developed in diverse implementation environments, following standards and conventions incompatible with the Web. To address these problems, we have started several activities that aim at exploiting the potential of open-source software for Web accessibility. The first of these activities is the development of Adaptable Multi-Interface COmmunicator (AMICO):WEB, an infrastructure that facilitates efficient reuse and integration of open source software components into the Web environment. The main contribution of AMICO:WEB is in enabling the syntactic and semantic interoperability between Web extension mechanisms and a variety of integration mechanisms used by open source and free software components. Its design is based on our experiences in solving practical problems where we have used open source components to improve accessibility of rich media Web applications. The second of our activities involves improving education, where we have used our platform to teach students how to build advanced accessibility solutions from diverse open-source software. We are also partially involved in the recently started Eclipse projects called Accessibility Tools Framework (ACTF), the aim of which is development of extensible infrastructure, upon which developers can build a variety of utilities that help to evaluate and enhance the accessibility of applications and content for people with disabilities. In this article we briefly report on these activities.

  19. A software platform for phase contrast x-ray breast imaging research.

    PubMed

    Bliznakova, K; Russo, P; Mettivier, G; Requardt, H; Popov, P; Bravin, A; Buliev, I

    2015-06-01

    To present and validate a computer-based simulation platform dedicated to phase contrast x-ray breast imaging research. The software platform, developed at the Technical University of Varna on the basis of a previously validated x-ray imaging software simulator, comprises modules for object creation and for x-ray image formation. These modules were updated to take into account the refractive index for phase contrast imaging as well as to implement the Fresnel-Kirchhoff diffraction theory of the propagating x-ray waves. Projection images are generated in an in-line acquisition geometry. To test and validate the platform, several phantoms differing in their complexity were constructed and imaged at 25 keV and 60 keV at beamline ID17 of the European Synchrotron Radiation Facility. The software platform was used to design computational phantoms that mimic those used in the experimental study and to generate x-ray images in absorption and phase contrast modes. The visual and quantitative results of the validation process showed an overall good correlation between simulated and experimental images and demonstrate the potential of this platform for research in phase contrast x-ray imaging of the breast. The application of the platform is demonstrated in a feasibility study of phase contrast images of complex inhomogeneous and anthropomorphic breast phantoms, compared to x-ray images generated in absorption mode. The improved visibility of mammographic structures suggests further investigation and optimisation of phase contrast x-ray breast imaging, especially when abnormalities are present. The software platform can also be exploited for educational purposes. Copyright © 2015 Elsevier Ltd. All rights reserved.
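
    The core numerical step in such a simulator is free-space propagation of the complex wavefield. The sketch below applies an FFT-based Fresnel transfer function to a weak phase object, in the spirit of the paraxial Fresnel-Kirchhoff treatment mentioned above; the geometry, energy, and object parameters are illustrative assumptions, not those of the validated platform.

```python
# Compact sketch of FFT-based Fresnel propagation of a weak phase object.
import numpy as np

wavelength = 5e-11          # m, roughly 25 keV x-rays
z = 1.0                     # m, sample-to-detector distance
n, pixel = 512, 50e-6       # grid size and pixel pitch (m)

# A weak pure-phase object (e.g., a soft-tissue detail) on a unit-amplitude wave.
x = (np.arange(n) - n / 2) * pixel
X, Y = np.meshgrid(x, x)
phase = 0.5 * np.exp(-(X**2 + Y**2) / (2 * (0.5e-3) ** 2))
field = np.exp(1j * phase)

# Fresnel transfer function H(fx, fy) = exp(-i * pi * lambda * z * (fx^2 + fy^2)).
fx = np.fft.fftfreq(n, d=pixel)
FX, FY = np.meshgrid(fx, fx)
H = np.exp(-1j * np.pi * wavelength * z * (FX**2 + FY**2))

intensity = np.abs(np.fft.ifft2(np.fft.fft2(field) * H)) ** 2
print("edge-enhancement contrast:", intensity.max() - intensity.min())
```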

  20. Development of a Platform to Enable Fully Automated Cross-Titration Experiments.

    PubMed

    Cassaday, Jason; Finley, Michael; Squadroni, Brian; Jezequel-Sur, Sylvie; Rauch, Albert; Gajera, Bharti; Uebele, Victor; Hermes, Jeffrey; Zuck, Paul

    2017-04-01

    In the triage of hits from a high-throughput screening campaign or during the optimization of a lead compound, it is relatively routine to test compounds at multiple concentrations to determine potency and maximal effect. Additional follow-up experiments, such as agonist shift, can be quite valuable in ascertaining compound mechanism of action (MOA). However, these experiments require cross-titration of a test compound with the activating ligand of the receptor, requiring 100-200 data points and severely limiting the number of compounds tested in MOA assays in a screening triage. We describe a process to enhance the throughput of such cross-titration experiments through the integration of Hewlett Packard's D300 digital dispenser onto one of our robotics platforms to enable on-the-fly cross-titration of compounds in a 1536-well plate format. The process handles all the compound management and data tracking, as well as the biological assay. The process relies heavily on in-house-built software and hardware, and uses our proprietary control software for the platform. Using this system, we were able to automate the cross-titration of compounds for both positive and negative allosteric modulators of two different G protein-coupled receptors (GPCRs) using two distinct assay detection formats, IP1 and Ca²⁺ detection, on nearly 100 compounds for each target.

  1. Closed-Loop, Multichannel Experimentation Using the Open-Source NeuroRighter Electrophysiology Platform

    PubMed Central

    Newman, Jonathan P.; Zeller-Townson, Riley; Fong, Ming-Fai; Arcot Desai, Sharanya; Gross, Robert E.; Potter, Steve M.

    2013-01-01

    Single neuron feedback control techniques, such as voltage clamp and dynamic clamp, have enabled numerous advances in our understanding of ion channels, electrochemical signaling, and neural dynamics. Although commercially available multichannel recording and stimulation systems are commonly used for studying neural processing at the network level, they provide little native support for real-time feedback. We developed the open-source NeuroRighter multichannel electrophysiology hardware and software platform for closed-loop multichannel control with a focus on accessibility and low cost. NeuroRighter allows 64 channels of stimulation and recording for around US $10,000, along with the ability to integrate with other software and hardware. Here, we present substantial enhancements to the NeuroRighter platform, including a redesigned desktop application, a new stimulation subsystem allowing arbitrary stimulation patterns, low-latency data servers for accessing data streams, and a new application programming interface (API) for creating closed-loop protocols that can be inserted into NeuroRighter as plugin programs. This greatly simplifies the design of sophisticated real-time experiments without sacrificing the power and speed of a compiled programming language. Here we present a detailed description of NeuroRighter as a stand-alone application, its plugin API, and an extensive set of case studies that highlight the system’s abilities for conducting closed-loop, multichannel interfacing experiments. PMID:23346047

  2. Price Based Local Power Distribution Management System (Local Power Distribution Manager) v1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BROWN, RICHARD E.; CZARNECKI, STEPHEN; SPEARS, MICHAEL

    2016-11-28

    A trans-active energy micro-grid controller is implemented in the VOLTTRON distributed control platform. The system uses the price of electricity as the mechanism for conducting transactions that are used to manage energy use and to balance supply and demand. In order to allow testing and analysis of the control system, the implementation is designed to run completely as a software simulation, while allowing the inclusion of selected hardware that physically manages power. Equipment to be integrated with the micro-grid controller must have an IP (Internet Protocol)-based network connection and a software "driver" must exist to translate data communications between the device and the controller.
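
    A minimal sketch of the price-mediated balancing idea, assuming an invented linear price-responsive load (this is not the VOLTTRON agent code): the controller raises the local price while demand exceeds supply and lowers it otherwise, until the micro-grid clears.

```python
# Toy transactive-energy loop: price rises when demand exceeds supply.
def flexible_demand(price: float) -> float:
    # Hypothetical aggregate load: 10 kW base, shedding 0.8 kW per $/kWh.
    return max(0.0, 10.0 - 0.8 * price)

supply_kw = 7.5           # available local generation
price = 1.0               # $/kWh, starting point
gain = 0.05               # controller step size

for step in range(200):
    demand_kw = flexible_demand(price)
    imbalance = demand_kw - supply_kw
    price += gain * imbalance          # raise price when over-consuming
    if abs(imbalance) < 1e-3:
        break

print(f"cleared price {price:.2f} $/kWh, demand {flexible_demand(price):.2f} kW")
```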

  3. Research on Safety Monitoring System of Tailings Dam Based on Internet of Things

    NASA Astrophysics Data System (ADS)

    Wang, Ligang; Yang, Xiaocong; He, Manchao

    2018-03-01

    This paper designs and implements a safety monitoring system for tailings dams based on the Internet of Things. The hardware and software of the sensor nodes, routing nodes and coordinator node were designed using the ZigBee wireless sensor chip CC2630 and a 3G/4G data transmission module, and a software platform integrated with a geographic information system was developed. The system achieves all-weather, real-time monitoring and data collection of dam deformation, seepage line, water level and rainfall; the stability of the tailings dam is analyzed on the basis of the IoT monitoring data; and intelligent, scientific management of the tailings dam is realized under the guidance of a remote expert system.

  4. Combining clinical and genomics queries using i2b2 – Three methods

    PubMed Central

    Murphy, Shawn N.; Avillach, Paul; Bellazzi, Riccardo; Phillips, Lori; Gabetta, Matteo; Eran, Alal; McDuffie, Michael T.; Kohane, Isaac S.

    2017-01-01

    We are fortunate to be living in an era of twin biomedical data surges: a burgeoning representation of human phenotypes in the medical records of our healthcare systems, and high-throughput sequencing making rapid technological advances. The difficulty representing genomic data and its annotations has almost by itself led to the recognition of a biomedical “Big Data” challenge, and the complexity of healthcare data only compounds the problem to the point that coherent representation of both systems on the same platform seems insuperably difficult. We investigated the capability for complex, integrative genomic and clinical queries to be supported in the Informatics for Integrating Biology and the Bedside (i2b2) translational software package. Three different data integration approaches were developed: The first is based on Sequence Ontology, the second is based on the tranSMART engine, and the third on CouchDB. These novel methods for representing and querying complex genomic and clinical data on the i2b2 platform are available today for advancing precision medicine. PMID:28388645

  5. Bridging the gap between PAT concepts and implementation: An integrated software platform for fermentation.

    PubMed

    Chopda, Viki R; Gomes, James; Rathore, Anurag S

    2016-01-01

    Bioreactor control significantly impacts both the amount and quality of the product being manufactured. The complexity of the control strategy that is implemented increases with reactor size, which may vary from thousands to tens of thousands of litres in commercial manufacturing. The Process Analytical Technology (PAT) initiative has highlighted the need for robust monitoring tools and effective control schemes that are capable of taking real-time information about the critical quality attributes (CQA) and the critical process parameters (CPP) and executing an immediate response as soon as a deviation occurs. However, the limited flexibility that present commercial software packages offer creates a hurdle. Visual programming environments have gradually emerged as potential alternatives to the available text-based languages. This paper showcases the development of an integrated programme using a visual programming environment for a Sartorius BIOSTAT® B Plus 5L bioreactor through which various peripheral devices are interfaced. The proposed programme facilitates real-time access to data and allows for execution of control actions to follow the desired trajectory. Major benefits of such an integrated software system include: (i) improved real-time monitoring and control; (ii) reduced variability; (iii) improved performance; (iv) reduced operator-training time; (v) enhanced knowledge management; and (vi) easier PAT implementation. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
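
    As a rough illustration of the kind of feedback loop such an integrated programme executes, the Python sketch below holds dissolved oxygen at a setpoint with a PI controller acting on stirrer speed. The device-driver functions and tuning constants are hypothetical; the authors' implementation uses a visual programming environment rather than a text-based script.

      import time

      def pi_dissolved_oxygen_loop(read_do, set_stirrer_rpm, setpoint=30.0,
                                   kp=20.0, ki=2.0, dt_s=1.0, steps=3600,
                                   rpm_base=400.0, rpm_min=100.0, rpm_max=1200.0):
          """Hold dissolved oxygen (% saturation) at `setpoint` by adjusting stirrer speed."""
          integral = 0.0
          for _ in range(steps):
              error = setpoint - read_do()               # deviation from the DO setpoint
              integral += error * dt_s
              rpm = rpm_base + kp * error + ki * integral
              rpm = max(rpm_min, min(rpm_max, rpm))      # respect actuator limits
              set_stirrer_rpm(rpm)
              # A real PAT implementation would also log CPPs/CQAs and raise
              # alarms here as soon as a deviation is detected.
              time.sleep(dt_s)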

  6. JAMS - a software platform for modular hydrological modelling

    NASA Astrophysics Data System (ADS)

    Kralisch, Sven; Fischer, Christian

    2015-04-01

    Current challenges of understanding and assessing the impacts of climate and land use changes on environmental systems demand ever-increasing integration of data and process knowledge in corresponding simulation models. Software frameworks that allow for a seamless creation of integrated models based on less complex components (domain models, process simulation routines) have therefore gained increasing attention during the last decade. JAMS is an Open-Source software framework that has been especially designed to cope with the challenges of eco-hydrological modelling. This is reflected by (i) its flexible approach for representing time and space, (ii) a strong separation of process simulation components from the declarative description of more complex models using domain-specific XML, (iii) powerful analysis and visualization functions for spatial and temporal input and output data, and (iv) parameter optimization and uncertainty analysis functions commonly used in environmental modelling. Based on JAMS, different hydrological and nutrient-transport simulation models were implemented and successfully applied in recent years. We will present the JAMS core concepts and give an overview of models, simulation components and support tools available for that framework. Sample applications will be used to underline the advantages of component-based model designs and to show how JAMS can be used to address the challenges of integrated hydrological modelling.

  7. A Unified Algebraic and Logic-Based Framework Towards Safe Routing Implementations

    DTIC Science & Technology

    2015-08-13

    Software-defined Networks (SDN). We developed a declarative platform for implementing SDN protocols using declarative networking ... and debugging several SDN applications. Example-based SDN synthesis. The recent emergence of software-defined networks offers an opportunity to design ...

  8. IntellWheels: modular development platform for intelligent wheelchairs.

    PubMed

    Braga, Rodrigo Antonio Marques; Petry, Marcelo; Reis, Luis Paulo; Moreira, António Paulo

    2011-01-01

    Intelligent wheelchairs (IWs) can become an important solution to the challenge of assisting individuals who have disabilities and are thus unable to perform their daily activities using classic powered wheelchairs. This article describes the concept and design of IntellWheels, a modular platform to facilitate the development of IWs through a multiagent system paradigm. In fact, modularity is achieved not only in the software perspective, but also through a generic hardware framework that was designed to fit, in a straightforward manner, almost any commercial powered wheelchair. Experimental results demonstrate the successful integration of all modules in the platform, providing safe motion to the IW. Furthermore, the results achieved with a prototype running in autonomous mode in simulated and mixed-reality environments also demonstrate the potential of our approach. Although some future research is still necessary to fully accomplish our objectives, preliminary tests have shown that IntellWheels will effectively reduce users' limitations, offering them a much more independent life.

  9. MRMer, an interactive open source and cross-platform system for data extraction and visualization of multiple reaction monitoring experiments.

    PubMed

    Martin, Daniel B; Holzman, Ted; May, Damon; Peterson, Amelia; Eastham, Ashley; Eng, Jimmy; McIntosh, Martin

    2008-11-01

    Multiple reaction monitoring (MRM) mass spectrometry identifies and quantifies specific peptides in a complex mixture with very high sensitivity and speed and thus has promise for the high throughput screening of clinical samples for candidate biomarkers. We have developed an interactive software platform, called MRMer, for managing highly complex MRM-MS experiments, including quantitative analyses using heavy/light isotopic peptide pairs. MRMer parses and extracts information from MS files encoded in the platform-independent mzXML data format. It extracts and infers precursor-product ion transition pairings, computes integrated ion intensities, and permits rapid visual curation for analyses exceeding 1000 precursor-product pairs. Results can be easily output for quantitative comparison of consecutive runs. Additionally, MRMer incorporates features that permit quantitative analysis of experiments using heavy and light isotopic peptide pairs. MRMer is open source and provided under the Apache 2.0 license.
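
    For illustration, the short sketch below (not MRMer's own code) computes an integrated ion intensity for one precursor-product transition by trapezoidal integration of its extracted chromatogram, and then a heavy/light area ratio for an isotopic pair; the chromatograms are synthetic.

      import numpy as np

      def integrated_intensity(times_s, intensities):
          """Area under an extracted ion chromatogram for one MRM transition."""
          return np.trapz(intensities, times_s)

      # Synthetic Gaussian elution profiles for a light/heavy peptide pair.
      t = np.linspace(300.0, 330.0, 61)
      light = 1.0e4 * np.exp(-0.5 * ((t - 315.0) / 3.0) ** 2)
      heavy = 2.0e4 * np.exp(-0.5 * ((t - 315.2) / 3.0) ** 2)

      ratio = integrated_intensity(t, light) / integrated_intensity(t, heavy)
      print(f"light/heavy area ratio = {ratio:.2f}")   # ~0.50 for these peaks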

  10. Reconfigurable intelligent sensors for health monitoring: a case study of pulse oximeter sensor.

    PubMed

    Jovanov, E; Milenkovic, A; Basham, S; Clark, D; Kelley, D

    2004-01-01

    Design of low-cost, miniature, lightweight, ultra low-power, intelligent sensors capable of customization and seamless integration into a body area network for health monitoring applications presents one of the most challenging tasks for system designers. To answer this challenge we propose a reconfigurable intelligent sensor platform featuring a low-power microcontroller, a low-power programmable logic device, a communication interface, and a signal conditioning circuit. The proposed solution promises a cost-effective, flexible platform that allows easy customization, run-time reconfiguration, and energy-efficient computation and communication. The development of a common platform for multiple physical sensors and a repository of both software procedures and soft intellectual property cores for hardware acceleration will increase reuse and alleviate costs of transition to a new generation of sensors. As a case study, we present an implementation of a reconfigurable pulse oximeter sensor.
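
    As an example of the signal processing such a sensor platform performs, the sketch below shows the standard "ratio of ratios" pulse-oximetry computation on red and infrared photoplethysmogram windows. The linear calibration constants are generic textbook values, not the authors' calibration.

      import numpy as np

      def spo2_estimate(red, infrared):
          """Estimate SpO2 (%) from red and infrared PPG sample arrays."""
          ac_red, dc_red = np.ptp(red), np.mean(red)
          ac_ir, dc_ir = np.ptp(infrared), np.mean(infrared)
          r = (ac_red / dc_red) / (ac_ir / dc_ir)     # ratio of ratios
          return 110.0 - 25.0 * r                     # generic empirical calibration

      # Hypothetical one-second PPG windows sampled at 100 Hz.
      t = np.linspace(0.0, 1.0, 100)
      red = 2.0 + 0.02 * np.sin(2 * np.pi * 1.2 * t)
      infrared = 2.5 + 0.05 * np.sin(2 * np.pi * 1.2 * t)
      print(f"SpO2 ~ {spo2_estimate(red, infrared):.1f} %")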

  11. Achieving sustainable ground-water management by using GIS-integrated simulation tools: the EU H2020 FREEWAT platform

    NASA Astrophysics Data System (ADS)

    Rossetto, Rudy; De Filippis, Giovanna; Borsi, Iacopo; Foglia, Laura; Toegl, Anja; Cannata, Massimiliano; Neumann, Jakob; Vazquez-Sune, Enric; Criollo, Rotman

    2017-04-01

    In order to achieve sustainable and participatory ground-water management, innovative software built on the integration of numerical models within GIS software is a perfect candidate to provide a full characterization of quantitative and qualitative aspects of ground- and surface-water resources while maintaining the temporal and spatial dimensions. The EU H2020 FREEWAT project (FREE and open source software tools for WATer resource management; Rossetto et al., 2015) aims at simplifying the application of EU water-related Directives through an open-source and public-domain, GIS-integrated simulation platform for planning and management of ground- and surface-water resources. The FREEWAT platform makes it possible to simulate the whole hydrological cycle, coupling the power of GIS geo-processing and post-processing tools in spatial data analysis with that of process-based simulation models. This results in a modeling environment where large spatial datasets can be stored, managed and visualized and where several simulation codes (mainly belonging to the USGS MODFLOW family) are integrated to simulate multiple hydrological, hydrochemical or economic processes. The FREEWAT platform is currently implemented as a large plugin for the QGIS desktop GIS software and integrates the following capabilities: • the AkvaGIS module produces plots and statistics for the analysis and interpretation of hydrochemical and hydrogeological data; • the Observation Analysis Tool facilitates the import, analysis and visualization of time-series data and the use of these data to support model construction and calibration; • groundwater flow in the saturated and unsaturated zones may be simulated using MODFLOW-2005 (Harbaugh, 2005); • multi-species advective-dispersive transport in the saturated zone can be simulated using MT3DMS (Zheng & Wang, 1999), and viscosity- and density-dependent flows can further be simulated through SEAWAT (Langevin et al., 2007); • sustainable management of the combined use of ground- and surface-water resources in rural environments is accomplished by the Farm Process module embedded in MODFLOW-OWHM (Hanson et al., 2014), which dynamically integrates crop water demand and supply from ground- and surface-water; • UCODE_2014 (Poeter et al., 2014) is implemented to perform sensitivity analysis and parameter estimation to improve the model fit through an inverse regression method based on the evaluation of an objective function. Through creating a common environment among water researchers/professionals, policy makers and implementers, FREEWAT aims at enhancing a science-based and participatory approach and evidence-based decision making in water resource management, hence producing relevant outcomes for policy implementation. Acknowledgements This paper is presented within the framework of the project FREEWAT, which has received funding from the European Union's HORIZON 2020 research and innovation programme under Grant Agreement n. 642224. References Hanson, R.T., Boyce, S.E., Schmid, W., Hughes, J.D., Mehl, S.M., Leake, S.A., Maddock, T., Niswonger, R.G. One-Water Hydrologic Flow Model (MODFLOW-OWHM), U.S. Geological Survey, Techniques and Methods 6-A51, 2014, 134 p. Harbaugh A.W. (2005) - MODFLOW-2005, The U.S. Geological Survey Modular Ground-Water Model - the Ground-Water Flow Process. U.S. Geological Survey, Techniques and Methods 6-A16, 253 p. Langevin C.D., Thorne D.T. Jr., Dausman A.M., Sukop M.C. & Guo Weixing (2007) - SEAWAT Version 4: A Computer Program for Simulation of Multi-Species Solute and Heat Transport. U.S. Geological Survey Techniques and Methods 6-A22, 39 pp. Poeter E.P., Hill M.C., Lu D., Tiedeman C.R. & Mehl S. (2014) - UCODE_2014, with new capabilities to define parameters unique to predictions, calculate weights using simulated values, estimate parameters with SVD, evaluate uncertainty with MCMC, and more. Integrated Groundwater Modeling Center Report Number GWMI 2014-02. Rossetto, R., Borsi, I. & Foglia, L. FREEWAT: FREE and open source software tools for WATer resource management, Rendiconti Online Società Geologica Italiana, 2015, 35, 252-255. Zheng C. & Wang P.P. (1999) - MT3DMS, A modular three-dimensional multi-species transport model for simulation of advection, dispersion and chemical reactions of contaminants in groundwater systems. U.S. Army Engineer Research and Development Center Contract Report SERDP-99-1, Vicksburg, MS, 202 pp.

  12. LabKey Server: an open source platform for scientific data integration, analysis and collaboration.

    PubMed

    Nelson, Elizabeth K; Piehler, Britt; Eckels, Josh; Rauch, Adam; Bellew, Matthew; Hussey, Peter; Ramsay, Sarah; Nathe, Cory; Lum, Karl; Krouse, Kevin; Stearns, David; Connolly, Brian; Skillman, Tom; Igra, Mark

    2011-03-09

    Broad-based collaborations are becoming increasingly common among disease researchers. For example, the Global HIV Enterprise has united cross-disciplinary consortia to speed progress towards HIV vaccines through coordinated research across the boundaries of institutions, continents and specialties. New, end-to-end software tools for data and specimen management are necessary to achieve the ambitious goals of such alliances. These tools must enable researchers to organize and integrate heterogeneous data early in the discovery process, standardize processes, gain new insights into pooled data and collaborate securely. To meet these needs, we enhanced the LabKey Server platform, formerly known as CPAS. This freely available, open source software is maintained by professional engineers who use commercially proven practices for software development and maintenance. Recent enhancements support: (i) Submitting specimens requests across collaborating organizations (ii) Graphically defining new experimental data types, metadata and wizards for data collection (iii) Transitioning experimental results from a multiplicity of spreadsheets to custom tables in a shared database (iv) Securely organizing, integrating, analyzing, visualizing and sharing diverse data types, from clinical records to specimens to complex assays (v) Interacting dynamically with external data sources (vi) Tracking study participants and cohorts over time (vii) Developing custom interfaces using client libraries (viii) Authoring custom visualizations in a built-in R scripting environment. Diverse research organizations have adopted and adapted LabKey Server, including consortia within the Global HIV Enterprise. Atlas is an installation of LabKey Server that has been tailored to serve these consortia. It is in production use and demonstrates the core capabilities of LabKey Server. Atlas now has over 2,800 active user accounts originating from approximately 36 countries and 350 organizations. It tracks roughly 27,000 assay runs, 860,000 specimen vials and 1,300,000 vial transfers. Sharing data, analysis tools and infrastructure can speed the efforts of large research consortia by enhancing efficiency and enabling new insights. The Atlas installation of LabKey Server demonstrates the utility of the LabKey platform for collaborative research. Stable, supported builds of LabKey Server are freely available for download at http://www.labkey.org. Documentation and source code are available under the Apache License 2.0.

  13. LabKey Server: An open source platform for scientific data integration, analysis and collaboration

    PubMed Central

    2011-01-01

    Background Broad-based collaborations are becoming increasingly common among disease researchers. For example, the Global HIV Enterprise has united cross-disciplinary consortia to speed progress towards HIV vaccines through coordinated research across the boundaries of institutions, continents and specialties. New, end-to-end software tools for data and specimen management are necessary to achieve the ambitious goals of such alliances. These tools must enable researchers to organize and integrate heterogeneous data early in the discovery process, standardize processes, gain new insights into pooled data and collaborate securely. Results To meet these needs, we enhanced the LabKey Server platform, formerly known as CPAS. This freely available, open source software is maintained by professional engineers who use commercially proven practices for software development and maintenance. Recent enhancements support: (i) Submitting specimens requests across collaborating organizations (ii) Graphically defining new experimental data types, metadata and wizards for data collection (iii) Transitioning experimental results from a multiplicity of spreadsheets to custom tables in a shared database (iv) Securely organizing, integrating, analyzing, visualizing and sharing diverse data types, from clinical records to specimens to complex assays (v) Interacting dynamically with external data sources (vi) Tracking study participants and cohorts over time (vii) Developing custom interfaces using client libraries (viii) Authoring custom visualizations in a built-in R scripting environment. Diverse research organizations have adopted and adapted LabKey Server, including consortia within the Global HIV Enterprise. Atlas is an installation of LabKey Server that has been tailored to serve these consortia. It is in production use and demonstrates the core capabilities of LabKey Server. Atlas now has over 2,800 active user accounts originating from approximately 36 countries and 350 organizations. It tracks roughly 27,000 assay runs, 860,000 specimen vials and 1,300,000 vial transfers. Conclusions Sharing data, analysis tools and infrastructure can speed the efforts of large research consortia by enhancing efficiency and enabling new insights. The Atlas installation of LabKey Server demonstrates the utility of the LabKey platform for collaborative research. Stable, supported builds of LabKey Server are freely available for download at http://www.labkey.org. Documentation and source code are available under the Apache License 2.0. PMID:21385461

  14. Architecture for the Next Generation System Management Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gallard, Jerome; Lebre, I Adrien; Morin, Christine

    2011-01-01

    To get more results or greater accuracy, computational scientists execute their applications on distributed computing platforms such as Clusters, Grids and Clouds. These platforms are different in terms of hardware and software resources as well as locality: some span across multiple sites and multiple administrative domains whereas others are limited to a single site/domain. As a consequence, in order to scale their applications up the scientists have to manage technical details for each target platform. From our point of view, this complexity should be hidden from the scientists who, in most cases, would prefer to focus on their research rather than spending time dealing with platform configuration concerns. In this article, we advocate for a system management framework that aims to automatically set up the whole run-time environment according to the application's needs. The main difference with regards to usual approaches is that they generally only focus on the software layer whereas we address both the hardware and the software expectations through a unique system. For each application, scientists describe their requirements through the definition of a Virtual Platform (VP) and a Virtual System Environment (VSE). Relying on the VP/VSE definitions, the framework is in charge of: (i) the configuration of the physical infrastructure to satisfy the VP requirements, (ii) the setup of the VP, and (iii) the customization of the execution environment (VSE) upon the former VP. We propose a new formalism that the system can rely upon to successfully perform each of these three steps without burdening the user with the specifics of the configuration for the physical resources, and system management tools. This formalism leverages Goldberg's theory for recursive virtual machines by introducing new concepts based on system virtualization (identity, partitioning, aggregation) and emulation (simple, abstraction). This enables the definition of complex VP/VSE configurations without making assumptions about the hardware and the software resources. For each requirement, the system executes the corresponding operation with the appropriate management tool. As a proof of concept, we implemented a first prototype that currently interacts with several system management tools (e.g., OSCAR, the Grid 5000 toolkit, and XtreemOS) and that can be easily extended to integrate new resource brokers or cloud systems such as Nimbus, OpenNebula or Eucalyptus for instance.

  15. Rapid sample classification using an open port sampling interface coupled with liquid introduction atmospheric pressure ionization mass spectrometry

    DOE PAGES

    Van Berkel, Gary J.; Kertesz, Vilmos

    2016-11-15

    An “Open Access”-like mass spectrometric platform to fully utilize the simplicity of the manual open port sampling interface for rapid characterization of unprocessed samples by liquid introduction atmospheric pressure ionization mass spectrometry has been lacking. The in-house developed integrated software with a simple, small and relatively low-cost mass spectrometry system introduced here fills this void. Software was developed to operate the mass spectrometer, to collect and process mass spectrometric data files, to build a database and to classify samples using such a database. These tasks were accomplished via the vendor-provided software libraries. Sample classification based on spectral comparison utilized the spectral contrast angle method. As a result, using the developed software platform, near real-time sample classification is exemplified using a series of commercially available blue ink rollerball pens and vegetable oils. In the case of the inks, full scan positive and negative ion ESI mass spectra were both used for database generation and sample classification. For the vegetable oils, full scan positive ion mode APCI mass spectra were recorded. The overall accuracy of the employed spectral contrast angle statistical model was 95.3% and 98% in the case of the inks and oils, respectively, using leave-one-out cross-validation. In conclusion, this work illustrates that an open port sampling interface/mass spectrometer combination, with appropriate instrument control and data processing software, is a viable direct liquid extraction sampling and analysis system suitable for the non-expert user and near real-time sample classification via database matching.
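
    The spectral contrast angle used for classification treats each spectrum as a vector of intensities and scores a match by the angle between vectors (smaller angle, better match). The Python sketch below illustrates the general method with made-up spectra; it does not reproduce the vendor-library implementation used by the authors.

      import numpy as np

      def contrast_angle(a, b):
          """Spectral contrast angle (radians) between two intensity vectors."""
          a, b = np.asarray(a, float), np.asarray(b, float)
          cos_theta = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
          return np.arccos(np.clip(cos_theta, -1.0, 1.0))

      def classify(unknown, database):
          """Return the database label whose reference spectrum gives the smallest angle."""
          return min(database, key=lambda label: contrast_angle(unknown, database[label]))

      database = {"ink_A": [0.9, 0.1, 0.3], "ink_B": [0.1, 0.8, 0.4]}   # made-up spectra
      print(classify([0.85, 0.15, 0.25], database))                     # -> ink_A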

  16. Rapid sample classification using an open port sampling interface coupled with liquid introduction atmospheric pressure ionization mass spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Berkel, Gary J.; Kertesz, Vilmos

    An “Open Access”-like mass spectrometric platform to fully utilize the simplicity of the manual open port sampling interface for rapid characterization of unprocessed samples by liquid introduction atmospheric pressure ionization mass spectrometry has been lacking. The in-house developed integrated software with a simple, small and relatively low-cost mass spectrometry system introduced here fills this void. Software was developed to operate the mass spectrometer, to collect and process mass spectrometric data files, to build a database and to classify samples using such a database. These tasks were accomplished via the vendor-provided software libraries. Sample classification based on spectral comparison utilized the spectral contrast angle method. As a result, using the developed software platform, near real-time sample classification is exemplified using a series of commercially available blue ink rollerball pens and vegetable oils. In the case of the inks, full scan positive and negative ion ESI mass spectra were both used for database generation and sample classification. For the vegetable oils, full scan positive ion mode APCI mass spectra were recorded. The overall accuracy of the employed spectral contrast angle statistical model was 95.3% and 98% in the case of the inks and oils, respectively, using leave-one-out cross-validation. In conclusion, this work illustrates that an open port sampling interface/mass spectrometer combination, with appropriate instrument control and data processing software, is a viable direct liquid extraction sampling and analysis system suitable for the non-expert user and near real-time sample classification via database matching.

  17. GFam: a platform for automatic annotation of gene families.

    PubMed

    Sasidharan, Rajkumar; Nepusz, Tamás; Swarbreck, David; Huala, Eva; Paccanaro, Alberto

    2012-10-01

    We have developed GFam, a platform for automatic annotation of gene/protein families. GFam provides a framework for genome initiatives and model organism resources to build domain-based families, derive meaningful functional labels, and offers a seamless approach to propagate functional annotation across periodic genome updates. GFam is a hybrid approach that uses a greedy algorithm to chain component domains from InterPro annotation provided by its 12 member resources, followed by a sequence-based connected component analysis of un-annotated sequence regions to derive consensus domain architecture for each sequence and subsequently generate families based on common architectures. For the Arabidopsis proteome, our integrated approach increases sequence coverage by 7.2 percentage points and residue coverage by 14.6 percentage points relative to the best single-constituent database within InterPro. The true power of GFam lies in maximizing annotation provided by the different InterPro data sources that offer resource-specific coverage for different regions of a sequence. GFam's capability to capture higher sequence and residue coverage can be useful for genome annotation, comparative genomics and functional studies. GFam is general-purpose software and can be used for any collection of protein sequences. The software is open source and can be obtained from http://www.paccanarolab.org/software/gfam/.

  18. OSCAR: A Compact, Powerful and Versatile On Board Computer Based on LEON3 Core

    NASA Astrophysics Data System (ADS)

    Poupat, Jean-Luc; Lefevre, Aurelien; Koebel, Franck

    2011-08-01

    Satellites are controlled via a platform On Board Computer (OBC) that manages different parameters (attitude, orbit, modes, temperatures ...) with respect to its payload mission (telecommunication, earth observation, scientific mission). The platform OBC is connected to the satellite and the ground control via digital links, and executes on-board software. The main functions of a platform OBC are to provide the satellite flight segment with the following features: processing resources for the flight mission software; TM/TC services and interfaces with the RF communication chain; general communication services with the avionics and payload equipment through an on-board communication bus based on the MIL-1553B standard or CAN; time synchronization and distribution; and a failure-tolerant architecture based on the use of redundant reconfiguration units and redundancy implementation. From a hardware point of view, it groups a lot of digital functions usually dispatched on numerous chips (processor, co-processor, digital links IP ...) together. In order to reach an ultimate level of integration, Astrium has designed an ASIC gathering all the required digital functions on a single chip: the SCOC3 ASIC. Astrium has developed an OBC based on this SCOC3 ASIC: OSCAR (Optimized Spacecraft Computer Architecture with Reconfiguration). It is now available off-the-shelf as the new OBC product family of Astrium. This paper presents the major innovations introduced by Astrium with SCOC3 and OSCAR, with the objective of saving cost and mass through a solution compatible with any quality class of project, using a single software development environment for the user.

  19. A Power Hardware-in-the-Loop Platform with Remote Distribution Circuit Cosimulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palmintier, Bryan; Lundstrom, Blake; Chakraborty, Sudipta

    2015-04-01

    This paper demonstrates the use of a novel cosimulation architecture that integrates hardware testing using Power Hardware-in-the-Loop (PHIL) with larger-scale electric grid models using off-the-shelf, non-PHIL software tools. This architecture enables utilities to study the impacts of emerging energy technologies on their system and manufacturers to explore the interactions of new devices with existing and emerging devices on the power system, both without the need to convert existing grid models to a new platform or to conduct in-field trials. The paper describes an implementation of this architecture for testing two residential-scale advanced solar inverters at separate points of common coupling. The same hardware setup is tested with two different distribution feeders (IEEE 123 and 8500 node test systems) modeled using GridLAB-D. In addition to simplifying testing with multiple feeders, the architecture demonstrates additional flexibility with hardware testing in one location linked via the Internet to software modeling in a remote location. In testing, inverter current, real and reactive power, and PCC voltage are well captured by the co-simulation platform. Testing of the inverter advanced control features is currently somewhat limited by the software model time step (1 sec) and tested communication latency (24 msec). Overshoot-induced oscillations are observed with volt/VAR control delays of 0 and 1.5 sec, while 3.4 sec and 5.5 sec delays produced little or no oscillation. These limitations could be overcome using faster modeling and communication within the same co-simulation architecture.

  20. An Open Software Platform for Sharing Water Resource Models, Code and Data

    NASA Astrophysics Data System (ADS)

    Knox, Stephen; Meier, Philipp; Mohamed, Khaled; Korteling, Brett; Matrosov, Evgenii; Huskova, Ivana; Harou, Julien; Rosenberg, David; Tilmant, Amaury; Medellin-Azuara, Josue; Wicks, Jon

    2016-04-01

    The modelling of managed water resource systems requires new approaches in the face of increasing future uncertainty. Water resources management models, even if applied to diverse problem areas, use common approaches such as representing the problem as a network of nodes and links. We propose a data management software platform, called Hydra, that uses this commonality to allow multiple models using a node-link structure to be managed and run using a single software system. Hydra's user interface allows users to manage network topology and associated data. Hydra feeds this data directly into a model, importing from and exporting to different file formats using Apps. An App connects Hydra to a custom model, a modelling system such as GAMS or MATLAB or to different file formats such as MS Excel, CSV and ESRI Shapefiles. Hydra allows users to manage their data in a single, consistent place. Apps can be used to run domain-specific models and allow users to work with their own required file formats. The Hydra App Store offers a collaborative space where model developers can publish, review and comment on Apps, models and data. Example Apps and open-source libraries are available in a variety of languages (Python, Java and .NET). The App Store can act as a hub for water resource modellers to view and share Apps, models and data easily. This encourages an ecosystem of development using a shared platform, resulting in more model integration and potentially greater unity within resource modelling communities. www.hydraplatform.org www.hydraappstore.com
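
    The node-link abstraction at the heart of Hydra can be sketched with a few plain data structures, as below; the class names and attribute keys are hypothetical and are not Hydra's actual schema or client API.

      from dataclasses import dataclass, field

      @dataclass
      class Node:
          name: str
          attributes: dict = field(default_factory=dict)   # e.g. {"storage_Mm3": 120}

      @dataclass
      class Link:
          source: str
          target: str
          attributes: dict = field(default_factory=dict)   # e.g. {"capacity_m3s": 15}

      @dataclass
      class Network:
          nodes: list
          links: list

          def to_model_input(self):
              """Flatten the network into the kind of table an App would export."""
              return [(link.source, link.target, link.attributes) for link in self.links]

      net = Network(nodes=[Node("reservoir"), Node("city")],
                    links=[Link("reservoir", "city", {"capacity_m3s": 15})])
      print(net.to_model_input())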

  1. Potential of a suite of robot/computer-assisted motivating systems for personalized, home-based, stroke rehabilitation

    PubMed Central

    Johnson, Michelle J; Feng, Xin; Johnson, Laura M; Winters, Jack M

    2007-01-01

    Background There is a need to improve semi-autonomous stroke therapy in home environments often characterized by low supervision of clinical experts and low extrinsic motivation. Our distributed device approach to this problem consists of an integrated suite of low-cost robotic/computer-assistive technologies driven by a novel universal access software framework called UniTherapy. Our design strategy for personalizing the therapy, providing extrinsic motivation and outcome assessment is presented and evaluated. Methods Three studies were conducted to evaluate the potential of the suite. A conventional force-reflecting joystick, a modified joystick therapy platform (TheraJoy), and a steering wheel platform (TheraDrive) were tested separately with the UniTherapy software. Stroke subjects with hemiparesis and able-bodied subjects completed tracking activities with the devices in different positions. We quantify motor performance across subject groups and across device platforms and muscle activation across devices at two positions in the arm workspace. Results Trends in the assessment metrics were consistent across devices, with able-bodied and high-functioning stroke subjects being significantly more accurate and quicker in their motor performance than low-functioning subjects. Muscle activation patterns were different for shoulder and elbow across different devices and locations. Conclusion The Robot/CAMR suite has potential for stroke rehabilitation. By manipulating hardware and software variables, we can create personalized therapy environments that engage patients, address their therapy needs, and track their progress. A larger longitudinal study is still needed to evaluate these systems in under-supervised environments such as the home. PMID:17331243

  2. A software platform for the analysis of dermatology images

    NASA Astrophysics Data System (ADS)

    Vlassi, Maria; Mavraganis, Vlasios; Asvestas, Panteleimon

    2017-11-01

    The purpose of this paper is to present a software platform developed in the Python programming environment that can be used for the processing and analysis of dermatology images. The platform provides the capability for reading a file that contains a dermatology image. The platform supports image formats such as Windows bitmaps, JPEG, JPEG2000, portable network graphics, TIFF. Furthermore, it provides suitable tools for selecting, either manually or automatically, a region of interest (ROI) on the image. The automated selection of a ROI includes filtering for smoothing the image and thresholding. The proposed software platform has a friendly and clear graphical user interface and could be a useful second-opinion tool to a dermatologist. Furthermore, it could be used to classify images from other anatomical parts, such as breast or lung, after proper re-training of the classification algorithms.
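
    A minimal version of the automated ROI step (smoothing followed by thresholding) might look like the Python sketch below; the filter width and the simple global threshold are illustrative choices, not necessarily those used by the platform.

      import numpy as np
      from scipy import ndimage

      def auto_roi(image, sigma=2.0):
          """Return a boolean ROI mask for a 2-D grey-scale image array."""
          smoothed = ndimage.gaussian_filter(image.astype(float), sigma=sigma)
          threshold = smoothed.mean()        # simple global threshold; Otsu would also work
          return smoothed < threshold        # lesions are typically darker than skin

      # Synthetic 8-bit image: a dark circular "lesion" on a brighter background.
      yy, xx = np.mgrid[0:128, 0:128]
      image = np.where((yy - 64) ** 2 + (xx - 64) ** 2 < 20 ** 2, 60, 200).astype(np.uint8)
      mask = auto_roi(image)
      print(f"ROI covers {mask.mean() * 100:.1f}% of the image")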

  3. Cots Correlator Platform

    NASA Astrophysics Data System (ADS)

    Schaaf, Kjeld; Overeem, Ruud

    2004-06-01

    Moore’s law is best exploited by using consumer market hardware. In particular, the gaming industry pushes the limit of processor performance, thus reducing the cost per raw flop even faster than Moore’s law predicts. Next to the cost benefits of Commercial-Off-The-Shelf (COTS) processing resources, there is a rapidly growing experience pool in cluster-based processing. The typical Beowulf cluster-of-PCs supercomputer is well known. Multiple examples exist of specialised cluster computers based on more advanced server nodes or even gaming stations. All these cluster machines build upon the same knowledge about cluster software management, scheduling, middleware libraries and mathematical libraries. In this study, we have integrated COTS processing resources and cluster nodes into a very high performance processing platform suitable for streaming data applications, in particular to implement a correlator. The required processing power for the correlator in modern radio telescopes is in the range of the larger supercomputers, which motivates the usage of supercomputer technology. Raw processing power is provided by graphical processors and is combined with an Infiniband host bus adapter with integrated data stream handling logic. With this processing platform a scalable correlator can be built with continuously growing processing power at consumer market prices.
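
    The core computation such a platform must sustain is the "X" step of an FX correlator: for every antenna pair, multiply one channelised voltage spectrum by the complex conjugate of the other and accumulate over time. The numpy sketch below shows that step in its simplest batch form, not the streaming GPU/Infiniband implementation described above.

      import numpy as np

      def correlate(voltages):
          """voltages: complex array of shape (time, antenna, channel) -> visibilities."""
          n_time, n_ant, n_chan = voltages.shape
          vis = np.zeros((n_ant, n_ant, n_chan), dtype=complex)
          for t in range(n_time):
              v = voltages[t]
              # Outer product over antennas, evaluated per frequency channel.
              vis += np.einsum('ic,jc->ijc', v, np.conj(v))
          return vis / n_time

      rng = np.random.default_rng(0)
      x = rng.normal(size=(1000, 4, 64)) + 1j * rng.normal(size=(1000, 4, 64))
      print(correlate(x).shape)   # (4, 4, 64): baseline-by-baseline, per channel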

  4. Precision instrument placement using a 4-DOF robot with integrated fiducials for minimally invasive interventions

    NASA Astrophysics Data System (ADS)

    Stenzel, Roland; Lin, Ralph; Cheng, Peng; Kronreif, Gernot; Kornfeld, Martin; Lindisch, David; Wood, Bradford J.; Viswanathan, Anand; Cleary, Kevin

    2007-03-01

    Minimally invasive procedures are increasingly attractive to patients and medical personnel because they can reduce operative trauma, recovery times, and overall costs. However, during these procedures, the physician has a very limited view of the interventional field and the exact position of surgical instruments. We present an image-guided platform for precision placement of surgical instruments based upon a small four degree-of-freedom robot (B-RobII; ARC Seibersdorf Research GmbH, Vienna, Austria). This platform includes a custom instrument guide with an integrated spiral fiducial pattern as the robot's end-effector, and it uses intra-operative computed tomography (CT) to register the robot to the patient directly before the intervention. The physician can then use a graphical user interface (GUI) to select a path for percutaneous access, and the robot will automatically align the instrument guide along this path. Potential anatomical targets include the liver, kidney, prostate, and spine. This paper describes the robotic platform, workflow, software, and algorithms used by the system. To demonstrate the algorithmic accuracy and suitability of the custom instrument guide, we also present results from experiments as well as estimates of the maximum error between target and instrument tip.
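
    The registration step such a system performs can be illustrated by the standard least-squares rigid fit between fiducial centres located in the intra-operative CT and the same points in robot coordinates. The sketch below uses the usual SVD (Kabsch) solution; it is a generic example, not necessarily the authors' algorithm.

      import numpy as np

      def rigid_register(robot_pts, ct_pts):
          """Least-squares rigid transform (R, t) such that ct ~ R @ robot + t."""
          p, q = np.asarray(robot_pts, float), np.asarray(ct_pts, float)
          pc, qc = p.mean(axis=0), q.mean(axis=0)
          H = (p - pc).T @ (q - qc)
          U, _, Vt = np.linalg.svd(H)
          R = Vt.T @ U.T
          if np.linalg.det(R) < 0:             # guard against a reflection solution
              Vt[-1] *= -1
              R = Vt.T @ U.T
          t = qc - R @ pc
          return R, t

      # Synthetic check: rotate fiducials 90 degrees about z and translate them.
      robot = np.array([[0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10]], float)
      true_R = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], float)
      ct = robot @ true_R.T + np.array([5.0, 2.0, -3.0])
      R, t = rigid_register(robot, ct)
      print(np.allclose(robot @ R.T + t, ct))   # True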

  5. Using Kepler for Tool Integration in Microarray Analysis Workflows.

    PubMed

    Gan, Zhuohui; Stowe, Jennifer C; Altintas, Ilkay; McCulloch, Andrew D; Zambon, Alexander C

    Increasing numbers of genomic technologies are leading to massive amounts of genomic data, all of which requires complex analysis. More and more bioinformatics analysis tools are being developed by scientists to simplify these analyses. However, different pipelines have been developed using different software environments. This makes integration of these diverse bioinformatics tools difficult. Kepler provides an open source environment to integrate these disparate packages. Using Kepler, we integrated several external tools including Bioconductor packages, AltAnalyze (a Python-based open source tool), and an R-based comparison tool to build an automated workflow to meta-analyze both online and local microarray data. The automated workflow connects the integrated tools seamlessly, delivers data flow between the tools smoothly, and hence improves efficiency and accuracy of complex data analyses. Our workflow exemplifies the usage of Kepler as a scientific workflow platform for bioinformatics pipelines.

  6. SU-E-J-04: Integration of Interstitial High Intensity Therapeutic Ultrasound Applicators On a Clinical MRI-Guided High Intensity Focused Ultrasound Treatment Planning Software Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ellens, N; Partanen, A; Ghoshal, G

    Purpose: Interstitial high intensity therapeutic ultrasound (HITU) applicators can be used to ablate tissue percutaneously, allowing for minimally-invasive treatment without ionizing radiation [1,2]. The purpose of this study was to evaluate the feasibility and usability of combining multielement interstitial HITU applicators with a clinical magnetic resonance imaging (MRI)-guided focused ultrasound software platform. Methods: The Sonalleve software platform (Philips Healthcare, Vantaa, Finland) combines anatomical MRI for target selection and multi-planar MRI thermometry to provide real-time temperature information. The MRI-compatible interstitial US applicators (Acoustic MedSystems, Savoy, IL, USA) had 1–4 cylindrical US elements, each 1 cm long with either 180° or 360° of active surface. Each applicator (4 Fr diameter, enclosed within a 13 Fr flexible catheter) was inserted into a tissue-mimicking agar-silica phantom. Degassed water was circulated around the transducers for cooling and coupling. Based on the location of the applicator, a virtual transducer overlay was added to the software to assist targeting and to allow automatic thermometry slice placement. The phantom was sonicated at 7 MHz for 5 minutes with 6–8 W of acoustic power for each element. MR thermometry data were collected during and after sonication. Results: Preliminary testing indicated that the applicator location could be identified in the planning images and the transducer locations predicted within 1 mm accuracy using the overlay. Ablation zones (thermal dose ≥ 240 CEM43) for 2 active, adjacent US elements ranged from 18 mm × 24 mm (width × length) to 25 mm × 25 mm for the 6 W and 8 W sonications, respectively. Conclusion: The combination of interstitial HITU applicators and this software platform holds promise for novel approaches in minimally-invasive MRI-guided therapy, especially when bony structures or air-filled cavities may preclude extracorporeal HIFU. [1] Diederich et al. IEEE UFFFC 46.5 (1999): 1218. [2] Chopra et al. PMB 50.21 (2005): 4957. Funding support was provided by Philips Healthcare and in-kind support from Acoustic MedSystems Inc. Ari Partanen is a paid employee of Philips Healthcare. Goutam Ghoshal and Everette Clif Burdette are paid employees of Acoustic MedSystems Inc.
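
    The ablation criterion above (thermal dose ≥ 240 CEM43) follows the standard Sapareto-Dewey cumulative-equivalent-minutes formulation, sketched below with illustrative sampling; this is the general formula, not the clinical software's code.

      def cem43(temperatures_c, dt_s):
          """Cumulative equivalent minutes at 43 C from a voxel's temperature history."""
          dose_min = 0.0
          for temp in temperatures_c:
              r = 0.5 if temp >= 43.0 else 0.25
              dose_min += (dt_s / 60.0) * r ** (43.0 - temp)
          return dose_min

      # Hypothetical MR-thermometry samples (every 5 s) during a 5-minute sonication.
      history = [37.0] * 6 + [50.0] * 48 + [43.0] * 6
      print(f"thermal dose ~ {cem43(history, dt_s=5.0):.0f} CEM43 (ablated if >= 240)")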

  7. Scoping review and evaluation of SMS/text messaging platforms for mHealth projects or clinical interventions.

    PubMed

    Iribarren, Sarah J; Brown, William; Giguere, Rebecca; Stone, Patricia; Schnall, Rebecca; Staggers, Nancy; Carballo-Diéguez, Alex

    2017-05-01

    Mobile technology supporting text messaging interventions (TMIs) continues to evolve, presenting challenges for researchers and healthcare professionals who need to choose software solutions to best meet their program needs. The objective of this review was to systematically identify and compare text messaging platforms and to summarize their advantages and disadvantages as described in peer-reviewed literature. A scoping review was conducted using four steps: 1) identify currently available platforms through online searches and in mHealth repositories; 2) expand evaluation criteria of an mHealth mobile messaging toolkit and integrate prior user experiences as researchers; 3) evaluate each platform's functions and features based on the expanded criteria and a vendor survey; and 4) assess the documentation of platform use in the peer-review literature. Platforms meeting inclusion criteria were assessed independently by three reviewers and discussed until consensus was reached. The PRISMA guidelines were followed to report findings. Of the 1041 potentially relevant search results, 27 platforms met inclusion criteria. Most were excluded because they were not platforms (e.g., guides, toolkits, reports, or SMS gateways). Of the 27 platforms, only 12 were identified in existing mHealth repositories, 10 from Google searches, while five were found in both. The expanded evaluation criteria included 22 items. Results indicate no uniform presentation of platform features and functions, often making these difficult to discern. Fourteen of the platforms were reported as open source, 10 focused on health care and 16 were tailored to meet needs of low resource settings (not mutually exclusive). Fifteen platforms had do-it-yourself setup (programming not required) while the remainder required coding/programming skills or setups could be built to specification by the vendor. Frequently described features included data security and access to the platform via cloud-based systems. Pay structures and reported targeted end-users varied. Peer-reviewed publications listed only 6 of the 27 platforms across 21 publications. The majority of these articles reported the name of the platform used but did not describe advantages or disadvantages. Searching for and comparing mHealth platforms for TMIs remains a challenge. The results of this review can serve as a resource for researchers and healthcare professionals wanting to integrate TMIs into health interventions. Steps to identify, compare and assess advantages and disadvantages are outlined for consideration. Expanded evaluation criteria can be used by future researchers. Continued and more comprehensive platform tools should be integrated into mHealth repositories. Detailed descriptions of platform advantages and disadvantages are needed when mHealth researchers publish findings to expand the body of research on TMI tools for healthcare. Standardized descriptions and features are recommended for vendor sites. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Linking earth science informatics resources into uninterrupted digital value chains

    NASA Astrophysics Data System (ADS)

    Woodcock, Robert; Angreani, Rini; Cox, Simon; Fraser, Ryan; Golodoniuc, Pavel; Klump, Jens; Rankine, Terry; Robertson, Jess; Vote, Josh

    2015-04-01

    The CSIRO Mineral Resources Flagship was established to tackle medium- to long-term challenges facing the Australian mineral industry across the value chain from exploration and mining through mineral processing within the framework of an economically, environmentally and socially sustainable minerals industry. This broad portfolio demands collaboration and data exchange with a broad range of participants and data providers across government, research and industry. It is an ideal environment to link geoscience informatics platforms to application across the resource extraction industry and to unlock the value of data integration between traditionally discrete parts of the minerals digital value chain. Despite the potential benefits, data integration remains an elusive goal within research and industry. Many projects use only a subset of available data types in an integrated manner, often maintaining the traditional discipline-based data 'silos'. Integrating data across the entire minerals digital value chain is an expensive proposition involving multiple disciplines and, significantly, multiple data sources both internal and external to any single organisation. Differing vocabularies and data formats, along with access regimes to appropriate analysis software and equipment all hamper the sharing and exchange of information. AuScope has addressed the challenge of data exchange across organisations nationally, and established a national geosciences information infrastructure using open standards-based web services. Federated across a wide variety of organisations, the resulting infrastructure contains a wide variety of live and updated data types. The community data standards and infrastructure platforms that underpin AuScope provide important new datasets and multi-agency links independent of software and hardware differences. AuScope has thus created an infrastructure, a platform of technologies and the opportunity for new ways of working with and integrating disparate data at much lower cost. An early example of this approach is the value generated by combining geological and metallurgical data sets as part of the rapidly growing field of geometallurgy. This not only provides a far better understanding of the impact of geological variability on ore processing but also leads to new thinking on the types and characteristics of data sets collected at various stages of the exploration and mining process. The Minerals Resources Flagship is linking its research activities to the AuScope infrastructure, exploiting the technology internally to create a platform for integrated research across the minerals value chain and improved interaction with industry. Referred to as the 'Early Access Virtual Lab', the system will be fully interoperable with AuScope and international infrastructures using open standards like GeosciML. Secured access is provided to allow confidential collaboration with industry when required. This presentation will discuss how the CSIRO Mineral Resources Flagship is building on the AuScope infrastructure to transform the way that data and data products are identified, shared, integrated, and reused, to unlock the benefits of true integration of research efforts across the minerals digital value chain.

  9. Solar Asset Management Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iverson, Aaron; Zviagin, George

    Ra Power Management (RPM) has developed a cloud based software platform that manages the financial and operational functions of third party financed solar projects throughout their lifecycle. RPM’s software streamlines and automates the sales, financing, and management of a portfolio of solar assets. The software helps solar developers automate the most difficult aspects of asset management, leading to increased transparency, efficiency, and reduction in human error. More importantly, our platform will help developers save money by improving their operating margins.

  10. Development and operation of a real-time data acquisition system for the NASA, Langley Research Center Differential Absorption Lidar

    NASA Technical Reports Server (NTRS)

    Butler, C.; Kindle, E. C.

    1984-01-01

    The capabilities of the DIAL data acquisition system (DAS) for the remote measurement of atmospheric trace gas concentrations from ground and aircraft platforms were extended through the purchase and integration of other hardware and the implementation of improved software. An operational manual for the current system is presented. Hardware and peripheral device registers are outlined only as an aid in debugging any DAS problems which may arise.

  11. ROBOCAL: Gamma-ray isotopic hardware/software interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hurd, J.R.; Bonner, C.A.; Ostenak, C.A.

    1989-01-01

    ROBOCAL, presently being developed at the Los Alamos National Laboratory, is a full-scale prototypical robotic system for remotely performing calorimetric and gamma-ray isotopics measurements of nuclear materials. It features a fully automated vertical stacker-retriever for storing and retrieving packaged nuclear materials from a multi-drawer system, and a fully automated, uniquely integrated gantry robot for programmable selection and transfer of nuclear materials to calorimetric and gamma-ray isotopic measurement stations. Since ROBOCAL is intended to require almost no operator intervention, a mechanical control system is required in addition to a totally automated assay system. The assay system must be a completely integrated data acquisition and isotopic analysis package fully capable of performing state-of-the-art homogeneous and heterogeneous analyses on many varied matrices. The TRIFID assay system being discussed at this conference by J. G. Fleissner of the Rocky Flats Plant has been adopted because of its many automated features. These include: MCA/ADC setup and acquisition; spectral storage and analysis utilizing an expert system formalism; report generation with internal measurement control printout; user-friendly screens and menus. The mechanical control portion consists primarily of two detector platforms and a sample platform, each with independent movement. Some minor modifications and additions are needed with TRIFID to interface the assay and mechanical portions with the CimRoc 4000 software controlling the robot. 6 refs., 5 figs., 3 tabs.

  12. IMPACT_S: integrated multiprogram platform to analyze and combine tests of selection.

    PubMed

    Maldonado, Emanuel; Sunagar, Kartik; Almeida, Daniela; Vasconcelos, Vitor; Antunes, Agostinho

    2014-01-01

    Among the major goals of research in evolutionary biology are the identification of genes targeted by natural selection and understanding how various regimes of evolution affect the fitness of an organism. In particular, adaptive evolution enables organisms to adapt to changing ecological factors such as diet, temperature, habitat, predatory pressures and prey abundance. An integrative approach is crucial for the identification of non-synonymous mutations that introduce radical changes in protein biochemistry and thus in turn influence the structure and function of proteins. Performing such analyses manually is often a time-consuming process, due to the large number of statistical files generated from multiple approaches, especially when assessing numerous taxa and/or large datasets. We present IMPACT_S, an easy-to-use Graphical User Interface (GUI) software, which rapidly and effectively integrates, filters and combines results from three widely used programs for assessing the influence of selection: Codeml (PAML package), Datamonkey and TreeSAAP. It enables the identification and tabulation of sites detected by these programs as evolving under the influence of positive, neutral and/or negative selection in protein-coding genes. IMPACT_S further facilitates the automatic mapping of these sites onto the three-dimensional structures of proteins. Other useful tools incorporated in IMPACT_S include Jmol, Archaeopteryx, Gnuplot, PhyML, a built-in Swiss-Model interface and a PDB downloader. The relevance and functionality of IMPACT_S is shown through a case study on the toxicoferan-reptilian Cysteine-rich Secretory Proteins (CRiSPs). IMPACT_S is a platform-independent software released under GPLv3 license, freely available online from http://impact-s.sourceforge.net.
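
    The integration idea, combining per-site results from the three selection-analysis programs, can be pictured as below; the input lists and the two-method support rule are hypothetical simplifications, not IMPACT_S's actual parsers or criteria.

      from collections import Counter

      def combine_sites(codeml_sites, datamonkey_sites, treesaap_sites, min_support=2):
          """Return codon positions flagged by at least `min_support` of the three methods."""
          counts = Counter()
          for sites in (codeml_sites, datamonkey_sites, treesaap_sites):
              counts.update(set(sites))
          return sorted(pos for pos, n in counts.items() if n >= min_support)

      # Hypothetical positively selected sites (1-based codon positions) per program.
      print(combine_sites([12, 45, 77], [12, 45, 90], [45, 77, 90]))   # [12, 45, 77, 90]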

  13. Distributed and Collaborative Software Analysis

    NASA Astrophysics Data System (ADS)

    Ghezzi, Giacomo; Gall, Harald C.

    Throughout the years software engineers have come up with a myriad of specialized tools and techniques that focus on a certain type of software analysis, such as source code analysis, co-change analysis or bug prediction. However, easy and straightforward synergies between these analyses and tools rarely exist because of their stand-alone nature, their platform dependence, their different input and output formats and the variety of data to analyze. As a consequence, distributed and collaborative software analysis scenarios and in particular interoperability are severely limited. We describe a distributed and collaborative software analysis platform that allows for seamless interoperability of software analysis tools across platform, geographical and organizational boundaries. We realize software analysis tools as services that can be accessed and composed over the Internet. These distributed analysis services shall be widely accessible in our incrementally augmented Software Analysis Broker, where organizations and tool providers can register and share their tools. To allow (semi-) automatic use and composition of these tools, they are classified and mapped into a software analysis taxonomy and adhere to specific meta-models and ontologies for their category of analysis.

  14. Using EIGER for Antenna Design and Analysis

    NASA Technical Reports Server (NTRS)

    Champagne, Nathan J.; Khayat, Michael; Kennedy, Timothy F.; Fink, Patrick W.

    2007-01-01

    EIGER (Electromagnetic Interactions GenERalized) is a frequency-domain electromagnetics software package that is built upon a flexible framework, designed using object-oriented techniques. The analysis methods used include moment method solutions of integral equations, finite element solutions of partial differential equations, and combinations thereof. The framework design permits new analysis techniques (boundary conditions, Green's functions, etc.) to be added to the software suite with a sensible effort. The code has been designed to execute (in serial or parallel) on a wide variety of platforms, from Intel-based PCs to Unix-based workstations. Recently, new potential integration schemes that avoid singularity extraction techniques have been added for integral equation analysis. These new integration schemes are required for facilitating the use of higher-order elements and basis functions. Higher-order elements are better able to model geometrical curvature using fewer elements than when using linear elements. Higher-order basis functions are beneficial for simulating structures with rapidly varying fields or currents. Results presented here will demonstrate current and future capabilities of EIGER with respect to analysis of installed antenna system performance in support of NASA's mission of exploration. Examples include antenna coupling within an enclosed environment and antenna analysis on electrically large manned space vehicles.

  15. Monitoring and detection platform to prevent anomalous situations in home care.

    PubMed

    Villarrubia, Gabriel; Bajo, Javier; De Paz, Juan F; Corchado, Juan M

    2014-06-05

    Monitoring and tracking people at home usually requires high-cost hardware installations, which are not affordable in many situations. This paper proposes a monitoring and tracking system for people with medical problems. A virtual organization of agents based on the PANGEA platform, which allows the easy integration of different devices, was created for this study. In this case, a virtual organization was implemented to track and monitor patients carrying a Holter monitor. The system includes the hardware and software required to perform ECG measurements and monitoring through accelerometers and WiFi networks. Furthermore, interactive television is used to mediate interaction with the user. The system makes it possible to merge the information and track patients efficiently at low cost.

  16. Radiology and Enterprise Medical Imaging Extensions (REMIX).

    PubMed

    Erdal, Barbaros S; Prevedello, Luciano M; Qian, Songyue; Demirer, Mutlu; Little, Kevin; Ryu, John; O'Donnell, Thomas; White, Richard D

    2018-02-01

    Radiology and Enterprise Medical Imaging Extensions (REMIX) is a platform originally designed to support both the medical imaging-driven clinical and clinical research operational needs of the Department of Radiology of The Ohio State University Wexner Medical Center. REMIX accommodates the storage and handling of "big imaging data," as needed for large multi-disciplinary cancer-focused programs. The evolving REMIX platform contains an array of integrated tools/software packages for the following: (1) server and storage management; (2) image reconstruction; (3) digital pathology; (4) de-identification; (5) business intelligence; (6) texture analysis; and (7) artificial intelligence. These capabilities, along with documentation and guidance explaining how to interact with commercial systems (e.g., PACS, EHR, commercial databases) that currently exist in clinical environments, are to be made freely available.

  17. Web Platform for Sharing Modeling Software in the Field of Nonlinear Optics

    NASA Astrophysics Data System (ADS)

    Dubenskaya, Julia; Kryukov, Alexander; Demichev, Andrey

    2018-02-01

    We describe the prototype of a Web platform intended for sharing software programs for computer modeling in the rapidly developing field of nonlinear optics. The suggested platform is built on top of the HUBZero open-source middleware. In addition to the basic HUBZero installation, we added to our platform the capability to run Docker containers via an external application server and to send calculation programs to those containers for execution. The presented web platform provides a wide range of features and might be of benefit to nonlinear optics researchers.

  18. Architecture of a high-performance surgical guidance system based on C-arm cone-beam CT: software platform for technical integration and clinical translation

    NASA Astrophysics Data System (ADS)

    Uneri, Ali; Schafer, Sebastian; Mirota, Daniel; Nithiananthan, Sajendra; Otake, Yoshito; Reaungamornrat, Sureerat; Yoo, Jongheun; Stayman, J. Webster; Reh, Douglas; Gallia, Gary L.; Khanna, A. Jay; Hager, Gregory; Taylor, Russell H.; Kleinszig, Gerhard; Siewerdsen, Jeffrey H.

    2011-03-01

    Intraoperative imaging modalities have become more prevalent in recent years, and the need for integration of these modalities with surgical guidance is rising, creating new possibilities as well as challenges. In the context of such emerging technologies and new clinical applications, a software architecture for cone-beam CT (CBCT) guided surgery has been developed with emphasis on binding open-source surgical navigation libraries and integrating intraoperative CBCT with novel, application-specific registration and guidance technologies. The architecture design is focused on accelerating translation of task-specific technical development in a wide range of applications, including orthopaedic, head-and-neck, and thoracic surgeries. The surgical guidance system is interfaced with a prototype mobile C-arm for high-quality CBCT, and through a modular software architecture, integration of different tools and devices consistent with surgical workflow in each of these applications is realized. Specific modules are developed according to the surgical task, such as: 3D-3D rigid or deformable registration of preoperative images, surgical planning data, and up-to-date CBCT images; 3D-2D registration of planning and image data in real-time fluoroscopy and/or digitally reconstructed radiographs (DRRs); compatibility with infrared, electromagnetic, and video-based trackers used individually or in hybrid arrangements; augmented overlay of image and planning data in endoscopic or in-room video; real-time "virtual fluoroscopy" computed from GPU-accelerated DRRs; and multi-modality image display. The platform aims to minimize offline data processing by exposing quantitative tools that analyze and communicate factors of geometric precision. The system was translated to preclinical phantom and cadaver studies for assessment of fiducial (FRE) and target registration error (TRE), showing sub-mm accuracy in targeting and video overlay within intraoperative CBCT. The work culminates in the development of a CBCT guidance system (reported here for the first time) that leverages the technical developments in C-arm CBCT and associated technologies for realizing a high-performance system for translation to clinical studies.
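
    The fiducial and target registration errors mentioned above can be illustrated with a short sketch; the rigid transform and point sets here are placeholders, not data from the study.

    ```python
    # Hedged sketch of the FRE/TRE metrics for a rigid registration; all
    # points and the estimated transform below are synthetic placeholders.
    import numpy as np

    def rms_error(moving, fixed, R, t):
        """RMS distance (mm) between transformed moving points and fixed points."""
        return float(np.sqrt(np.mean(np.sum((moving @ R.T + t - fixed) ** 2, axis=1))))

    # Placeholder rigid transform "estimated" by the registration (identity + shift).
    R, t = np.eye(3), np.array([0.2, -0.1, 0.05])

    fiducials_moving = np.random.rand(6, 3) * 100            # markers used to register
    fiducials_fixed = fiducials_moving + t                    # perfectly consistent here
    targets_moving = np.random.rand(4, 3) * 100               # anatomy of interest
    targets_fixed = targets_moving + np.array([0.25, -0.05, 0.0])

    print("FRE (mm):", rms_error(fiducials_moving, fiducials_fixed, R, t))
    print("TRE (mm):", rms_error(targets_moving, targets_fixed, R, t))
    ```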

  19. Open Software Tools Applied to Jordan's National Multi-Agent Water Management Model

    NASA Astrophysics Data System (ADS)

    Knox, Stephen; Meier, Philipp; Harou, Julien; Yoon, Jim; Selby, Philip; Lachaut, Thibaut; Klassert, Christian; Avisse, Nicolas; Khadem, Majed; Tilmant, Amaury; Gorelick, Steven

    2016-04-01

    Jordan is the fourth most water scarce country in the world, where demand exceeds supply in a politically and demographically unstable context. The Jordan Water Project (JWP) aims to perform policy evaluation by modelling the hydrology, economics, and governance of Jordan's water resource system. The multidisciplinary nature of the project requires a modelling software system capable of integrating submodels from multiple disciplines into a single decision making process and communicating results to stakeholders. This requires a tool for building an integrated model and a system where diverse data sets can be managed and visualised. The integrated Jordan model is built using Pynsim, an open-source multi-agent simulation framework implemented in Python. Pynsim operates on network structures of nodes and links and supports institutional hierarchies, where an institution represents a grouping of nodes, links or other institutions. At each time step, code within each node, link and institution can be executed independently, allowing for their fully autonomous behaviour. Additionally, engines (sub-models) perform actions over the entire network or on a subset of the network, such as taking a decision on a set of nodes. Pynsim is modular in design, allowing distinct modules to be modified easily without affecting others. Data management and visualisation is performed using Hydra (www.hydraplatform.org), an open software platform allowing users to manage network structure and data. The Hydra data manager connects to Pynsim, providing the necessary input parameters for the integrated model. By providing a high-level portal to the model, Hydra removes a barrier between the users of the model (researchers, stakeholders, planners, etc.) and the model itself, allowing them to manage data, run the model and visualise results all through a single user interface. Pynsim's ability to represent institutional hierarchies and inter-network communication, and its separation of node, link and institutional logic from higher-level processes (engines), suit JWP's requirements. The use of Hydra Platform and Pynsim helps make complex customised models such as the JWP model easier to run and manage with international groups of researchers.
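
    The node/institution/engine pattern described above can be illustrated with a conceptual sketch; the class names mirror the description and are not the actual Pynsim API.

    ```python
    # Conceptual sketch of nodes grouped into an institution with an engine
    # acting over the group each timestep; names, demands and the allocation
    # rule are invented for illustration.
    class Node:
        def __init__(self, name, demand):
            self.name, self.demand, self.supplied = name, demand, 0.0
        def setup(self, t):
            self.supplied = 0.0          # each node runs its own logic per timestep

    class Institution:
        """A grouping of nodes (e.g. a water utility) with its own behaviour."""
        def __init__(self, nodes):
            self.nodes = nodes

    class AllocationEngine:
        """An engine acts over the whole network: here, simple pro-rata allocation."""
        def run(self, institution, available):
            total = sum(n.demand for n in institution.nodes) or 1.0
            for n in institution.nodes:
                n.supplied = available * n.demand / total

    nodes = [Node("Amman", 120.0), Node("Irbid", 60.0)]
    utility = Institution(nodes)
    engine = AllocationEngine()
    for t in range(3):                   # simulation timesteps
        for n in nodes:
            n.setup(t)
        engine.run(utility, available=150.0)
        print(t, [(n.name, round(n.supplied, 1)) for n in nodes])
    ```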

  20. Trans-oceanic Remote Power Hardware-in-the-Loop: Multi-site Hardware, Integrated Controller, and Electric Network Co-simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lundstrom, Blake R.; Palmintier, Bryan S.; Rowe, Daniel

    Electric system operators are increasingly concerned with the potential system-wide impacts of the large-scale integration of distributed energy resources (DERs) including voltage control, protection coordination, and equipment wear. This prompts a need for new simulation techniques that can simultaneously capture all the components of these large integrated smart grid systems. This paper describes a novel platform that combines three emerging research areas: power systems co-simulation, power hardware in the loop (PHIL) simulation, and lab-lab links. The platform is distributed, real-time capable, allows for easy internet-based connection from geographically-dispersed participants, and is software platform agnostic. We demonstrate its utility by studying real-time PHIL co-simulation of coordinated solar PV firming control of two inverters connected in multiple electric distribution network models, prototypical of U.S. and Australian systems. Here, the novel trans-pacific closed-loop system simulation was conducted in real-time using a power network simulator and physical PV/battery inverter at power at the National Renewable Energy Laboratory in Golden, CO, USA and a physical PV inverter at power at the Commonwealth Scientific and Industrial Research Organisation's Energy Centre in Newcastle, NSW, Australia. This capability enables smart grid researchers throughout the world to leverage their unique simulation capabilities for multi-site collaborations that can effectively simulate and validate emerging smart grid technology solutions.
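
    A highly simplified sketch of the closed-loop exchange in such a remote PHIL co-simulation is given below; the endpoint, message format and control law are assumptions, not the platform's actual interfaces.

    ```python
    # Toy closed loop: a network simulator sends a voltage setpoint to remote
    # hardware, reads back a measured PV power, and keeps to a real-time step.
    # Host, port, JSON fields and the "firming" rule are all hypothetical.
    import json
    import socket
    import time

    STEP = 0.1   # real-time step in seconds (illustrative)

    def co_simulate(host="phil.example.org", port=5025, steps=10):
        with socket.create_connection((host, port), timeout=2.0) as sock:
            reader = sock.makefile("r")
            v_set = 1.0                                   # per-unit voltage setpoint
            for _ in range(steps):
                t0 = time.monotonic()
                sock.sendall(json.dumps({"v_set": v_set}).encode() + b"\n")
                p_meas = json.loads(reader.readline())["p_meas"]
                v_set = min(1.05, max(0.95, v_set - 0.01 * (p_meas - 0.5)))  # toy firming
                time.sleep(max(0.0, STEP - (time.monotonic() - t0)))         # hold real time
    ```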

  1. Trans-oceanic Remote Power Hardware-in-the-Loop: Multi-site Hardware, Integrated Controller, and Electric Network Co-simulation

    DOE PAGES

    Lundstrom, Blake R.; Palmintier, Bryan S.; Rowe, Daniel; ...

    2017-07-24

    Electric system operators are increasingly concerned with the potential system-wide impacts of the large-scale integration of distributed energy resources (DERs) including voltage control, protection coordination, and equipment wear. This prompts a need for new simulation techniques that can simultaneously capture all the components of these large integrated smart grid systems. This paper describes a novel platform that combines three emerging research areas: power systems co-simulation, power hardware in the loop (PHIL) simulation, and lab-lab links. The platform is distributed, real-time capable, allows for easy internet-based connection from geographically-dispersed participants, and is software platform agnostic. We demonstrate its utility by studying real-time PHIL co-simulation of coordinated solar PV firming control of two inverters connected in multiple electric distribution network models, prototypical of U.S. and Australian systems. Here, the novel trans-pacific closed-loop system simulation was conducted in real-time using a power network simulator and physical PV/battery inverter at power at the National Renewable Energy Laboratory in Golden, CO, USA and a physical PV inverter at power at the Commonwealth Scientific and Industrial Research Organisation's Energy Centre in Newcastle, NSW, Australia. This capability enables smart grid researchers throughout the world to leverage their unique simulation capabilities for multi-site collaborations that can effectively simulate and validate emerging smart grid technology solutions.

  2. Tele-Supervised Adaptive Ocean Sensor Fleet

    NASA Technical Reports Server (NTRS)

    Lefes, Alberto; Podnar, Gregg W.; Dolan, John M.; Hosler, Jeffrey C.; Ames, Troy J.

    2009-01-01

    The Tele-supervised Adaptive Ocean Sensor Fleet (TAOSF) is a multi-robot science exploration architecture and system that uses a group of robotic boats (the Ocean-Atmosphere Sensor Integration System, or OASIS) to enable in-situ study of ocean surface and subsurface characteristics and the dynamics of such ocean phenomena as coastal pollutants, oil spills, hurricanes, or harmful algal blooms (HABs). The OASIS boats are extended-deployment, autonomous ocean surface vehicles. The TAOSF architecture provides an integrated approach to multi-vehicle coordination and sliding human-vehicle autonomy. One feature of TAOSF is the adaptive re-planning of the activities of the OASIS vessels based on sensor input (smart sensing) and sensorial coordination among multiple assets. The architecture also incorporates Web-based communications that permit control of the assets over long distances and the sharing of data with remote experts. Autonomous hazard and assistance detection allows the automatic identification of hazards that require human intervention to ensure the safety and integrity of the robotic vehicles, or of science data that require human interpretation and response. Also, the architecture is designed for science analysis of acquired data in order to perform an initial onboard assessment of the presence of specific science signatures of immediate interest. TAOSF integrates and extends five subsystems developed by the participating institutions: Emergent Space Technologies, Wallops Flight Facility, NASA's Goddard Space Flight Center (GSFC), Carnegie Mellon University, and Jet Propulsion Laboratory (JPL). The OASIS Autonomous Surface Vehicle (ASV) system, which includes the vessels as well as the land-based control and communications infrastructure developed for them, controls the hardware of each platform (sensors, actuators, etc.), and also provides a low-level waypoint navigation capability. The Multi-Platform Simulation Environment from GSFC is a surrogate for the OASIS ASV system and allows for independent development and testing of higher-level software components. The Platform Communicator acts as a proxy for both actual and simulated platforms. It translates platform-independent messages from the higher control systems to the device-dependent communication protocols. This enables the higher-level control systems to interact identically with heterogeneous actual or simulated platforms.
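
    The Platform Communicator's proxy role can be sketched as follows; all class names and message formats are illustrative, not the actual TAOSF interfaces.

    ```python
    # Sketch of a proxy that translates platform-independent commands into
    # device-dependent protocols so higher control layers see real and
    # simulated boats identically; everything here is invented for illustration.
    class AsvBackend:
        def goto(self, lat, lon): ...

    class RealAsv(AsvBackend):
        def goto(self, lat, lon):
            print(f"radio uplink: WPT {lat:.5f},{lon:.5f}")     # device-dependent protocol

    class SimulatedAsv(AsvBackend):
        def goto(self, lat, lon):
            print(f"simulator RPC: move_to({lat}, {lon})")

    class PlatformCommunicator:
        """Accepts platform-independent messages and forwards them to a backend."""
        def __init__(self, backend: AsvBackend):
            self.backend = backend
        def handle(self, message: dict):
            if message["type"] == "waypoint":
                self.backend.goto(message["lat"], message["lon"])

    for backend in (RealAsv(), SimulatedAsv()):
        PlatformCommunicator(backend).handle(
            {"type": "waypoint", "lat": 37.93000, "lon": -75.47000})
    ```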

  3. Software platform for simulation of a prototype proton CT scanner.

    PubMed

    Giacometti, Valentina; Bashkirov, Vladimir A; Piersimoni, Pierluigi; Guatelli, Susanna; Plautz, Tia E; Sadrozinski, Hartmut F-W; Johnson, Robert P; Zatserklyaniy, Andriy; Tessonnier, Thomas; Parodi, Katia; Rosenfeld, Anatoly B; Schulte, Reinhard W

    2017-03-01

    Proton computed tomography (pCT) is a promising imaging technique to substitute or at least complement x-ray CT for more accurate proton therapy treatment planning, as it allows direct calculation of proton relative stopping power from proton energy loss measurements. A proton CT scanner with a silicon-based particle tracking system and a five-stage scintillating energy detector has been completed. In parallel, a modular software platform was developed to characterize the performance of the proposed pCT scanner. The modular pCT software platform consists of (1) a Geant4-based simulation modeling the Loma Linda proton therapy beam line and the prototype proton CT scanner, (2) water equivalent path length (WEPL) calibration of the scintillating energy detector, and (3) an image reconstruction algorithm for the reconstruction of the relative stopping power (RSP) of the scanned object. In this work, each component of the modular pCT software platform is described and validated with respect to experimental data and benchmarked against theoretical predictions. In particular, the RSP reconstruction was validated with experimental scans, water column measurements, and theoretical calculations. The results show that the pCT software platform accurately reproduces the performance of the existing prototype pCT scanner, with an RSP agreement between experimental and simulated values to better than 1.5%. The validated platform is a versatile tool for clinical proton CT performance and application studies in a virtual setting. The platform is flexible and can be modified to simulate not yet existing versions of pCT scanners and higher proton energies than those currently clinically available. © 2017 American Association of Physicists in Medicine.

  4. Development of fast wireless detection system for fixed offshore platform

    NASA Astrophysics Data System (ADS)

    Li, Zhigang; Yu, Yan; Jiao, Dong; Wang, Jie; Li, Zhirui; Ou, Jinping

    2011-04-01

    Offshore platform safety has been a concern since the 1950s and 1960s, and in the early 1980s important specifications and standards were established that provide the technical basis for fixed platform design, construction, installation and evaluation. As more and more platforms remain in service beyond their design life, research on evaluation and detection technology for offshore platforms has become a hotspot, especially underwater detection and assessment methods based on finite element calculation. For fixed platform structure detection, conventional NDT methods such as eddy current, magnetic particle, penetrant, X-ray and ultrasonic testing are generally used. These techniques are mature and intuitive, but underwater detection requires an underwater robot, the necessary supporting auxiliary equipment and a trained professional team, so the resources and cost involved are considerable and installation of test equipment takes a long time. This project presents a new fast wireless detection and damage diagnosis system for fixed offshore platforms using wireless sensor networks: wireless sensor nodes can be placed quickly on the offshore platform, detect the global state of the platform structure via wireless communication, and then make a diagnosis. The system is simple to operate and suitable for rapid assessment of offshore platform integrity. The designed system consists of intelligent acquisition equipment and 8 wireless collection nodes; the whole system has 64 collection channels, i.e., each wireless collection node has eight 16-bit A/D channels. A wireless collection node, integrating a vibration sensing unit, an embedded low-power micro-processing unit, a wireless transceiver unit, a large-capacity power unit and a GPS time synchronization unit, performs vibration data collection, initial analysis, data storage and wireless data transmission. The intelligent acquisition equipment, integrating a high-performance computation unit, a wireless transceiver unit, a mobile power unit and embedded data analysis software, controls the multiple wireless collection nodes, receives and analyzes data, and performs parameter identification. Data are transmitted on a 2.4 GHz wireless communication channel; each sensing data channel responsible for data transmission occupies a stable frequency band, while the control channel responsible for controlling power parameters uses a public frequency band. Initial tests of the designed system show that it has good application prospects and practical value, with fast deployment, high sampling rate, high resolution and low-frequency detection capability.
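
    As an illustration of the kind of low-frequency parameter identification the acquisition equipment performs, the following sketch estimates a dominant natural frequency from one synthetic acceleration channel (the sampling rate and signal are invented).

    ```python
    # Sketch: pick the dominant spectral peak of a sampled acceleration channel.
    # The 0.8 Hz "mode" and 100 Hz sampling rate are placeholders, not field data.
    import numpy as np

    def dominant_frequency(accel, fs):
        """Return the frequency (Hz) of the largest spectral peak above DC."""
        spectrum = np.abs(np.fft.rfft(accel * np.hanning(len(accel))))
        freqs = np.fft.rfftfreq(len(accel), d=1.0 / fs)
        return freqs[1 + np.argmax(spectrum[1:])]

    fs = 100.0                                    # Hz, one 16-bit A/D channel
    t = np.arange(0, 60, 1 / fs)
    accel = np.sin(2 * np.pi * 0.8 * t) + 0.3 * np.random.randn(t.size)
    print(f"estimated natural frequency: {dominant_frequency(accel, fs):.2f} Hz")
    ```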

  5. Open-Source Learning Management System and Web 2.0 Online Social Software Applications as Learning Platforms for an Elementary School in Singapore

    ERIC Educational Resources Information Center

    Tay, Lee Yong; Lim, Cher Ping; Lye, Sze Yee; Ng, Kay Joo; Lim, Siew Khiaw

    2011-01-01

    This paper analyses how an elementary-level future school in Singapore implements and uses various open-source online platforms, which are easily available online and could be implemented with minimal software cost, for the purpose of teaching and learning. Online platforms have the potential to facilitate students' engagement for independent and…

  6. Thermal Analysis for Monitoring Effects of Shock-Induced Physical, Mechanical, and Chemical Changes in Materials

    DTIC Science & Technology

    2015-01-19

    Proteus measurement and analysis software for data acquisition, storage and evaluation on the MS WINDOWS platform, which enables multitasking with simultaneous evaluation and operation.

  7. Space Software Defined Radio Characterization to Enable Reuse

    NASA Technical Reports Server (NTRS)

    Mortensen, Dale J.; Bishop, Daniel W.; Chelmins, David

    2012-01-01

    NASA's Space Communication and Navigation Testbed is beginning operations on the International Space Station this year. The objective is to promote new software defined radio technologies and associated software application reuse, enabled by this first flight of NASA's Space Telecommunications Radio System architecture standard. The Space Station payload has three software defined radios onboard that allow for a wide variety of communications applications; however, each radio was only launched with one waveform application. By design the testbed allows new waveform applications to be uploaded and tested by experimenters in and outside of NASA. During the system integration phase of the testbed special waveform test modes and stand-alone test waveforms were used to characterize the SDR platforms for the future experiments. Characterization of the Testbed's JPL SDR using test waveforms and specialized ground test modes is discussed in this paper. One of the test waveforms, a record and playback application, can be utilized in a variety of ways, including new satellite on-orbit checkout as well as independent on-board testbed experiments.

  8. High Efficiency Traveling-Wave Tube Power Amplifier for Ka-Band Software Defined Radio on International Space Station-A Platform for Communications Technology Development

    NASA Technical Reports Server (NTRS)

    Simons, Rainee N.; Force, Dale A.; Kacpura, Thomas J.

    2013-01-01

    The design, fabrication and RF performance of the output traveling-wave tube amplifier (TWTA) for a space based Ka-band software defined radio (SDR) is presented. The TWTA, the SDR and the supporting avionics are integrated to form a testbed, which is currently located on an exterior truss of the International Space Station (ISS). The SDR in the testbed communicates at Ka-band frequencies through a high-gain antenna directed to NASA's Tracking and Data Relay Satellite System (TDRSS), which communicates to the ground station located at White Sands Complex. The application of the testbed is demonstrating new waveforms and software designed to enhance data delivery from scientific spacecraft, and the waveforms and software can be upgraded and reconfigured from the ground. The construction and the salient features of the Ka-band SDR are discussed. The testbed is currently undergoing on-orbit checkout and commissioning and is expected to operate for 3 to 5 years in space.

  9. Developing an Open Source, Reusable Platform for Distributed Collaborative Information Management in the Early Detection Research Network

    NASA Technical Reports Server (NTRS)

    Hart, Andrew F.; Verma, Rishi; Mattmann, Chris A.; Crichton, Daniel J.; Kelly, Sean; Kincaid, Heather; Hughes, Steven; Ramirez, Paul; Goodale, Cameron; Anton, Kristen

    2012-01-01

    For the past decade, the NASA Jet Propulsion Laboratory, in collaboration with Dartmouth University has served as the center for informatics for the Early Detection Research Network (EDRN). The EDRN is a multi-institution research effort funded by the U.S. National Cancer Institute (NCI) and tasked with identifying and validating biomarkers for the early detection of cancer. As the distributed network has grown, increasingly formal processes have been developed for the acquisition, curation, storage, and dissemination of heterogeneous research information assets, and an informatics infrastructure has emerged. In this paper we discuss the evolution of EDRN informatics, its success as a mechanism for distributed information integration, and the potential sustainability and reuse benefits of emerging efforts to make the platform components themselves open source. We describe our experience transitioning a large closed-source software system to a community driven, open source project at the Apache Software Foundation, and point to lessons learned that will guide our present efforts to promote the reuse of the EDRN informatics infrastructure by a broader community.

  10. Development of a platform-independent receiver control system for SISIFOS

    NASA Astrophysics Data System (ADS)

    Lemke, Roland; Olberg, Michael

    1998-05-01

    Up to now, receiver control software has been a time-consuming development, usually written by receiver engineers who had mainly the hardware in mind. We present a low-cost and very flexible system which uses a minimal interface to the real hardware and which makes it easy to adapt to new receivers. Our system uses Tcl/Tk as a graphical user interface (GUI), SpecTcl as a GUI builder, Pgplot as plotting software, a structured query language (SQL) database for information storage and retrieval, Ethernet socket-to-socket communication and SCPI as a command control language. The complete system is in principle platform independent, but for cost-saving reasons we are currently running it on a PC486 under Linux 2.0.30, which is a copylefted Unix. The only hardware-dependent parts are the digital input/output boards and the analog-to-digital and digital-to-analog converters. In the case of the Linux PC we are using a device driver development kit to integrate the boards fully into the kernel of the operating system, which indeed makes them look like an ordinary device. The advantage of this system is firstly the low price and secondly the clear separation between the different software components, which are available for many operating systems. If it is not possible, due to CPU performance limitations, to run all the software on a single machine, the SQL database or the graphical user interface could be installed on separate computers.
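
    The Ethernet socket plus SCPI control path described above might look roughly like the following sketch; the host, port and command are placeholders, not the SISIFOS command set.

    ```python
    # Minimal sketch of sending one SCPI command over a TCP socket and reading
    # the reply; endpoint and command are hypothetical.
    import socket

    def scpi_query(host, port, command, timeout=2.0):
        """Send one SCPI command and return the instrument's reply line."""
        with socket.create_connection((host, port), timeout=timeout) as sock:
            sock.sendall(command.encode("ascii") + b"\n")
            return sock.makefile().readline().strip()

    print(scpi_query("receiver-control.example", 5025, "*IDN?"))
    ```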

  11. DPOI: Distributed software system development platform for ocean information service

    NASA Astrophysics Data System (ADS)

    Guo, Zhongwen; Hu, Keyong; Jiang, Yongguo; Sun, Zhaosui

    2015-02-01

    Ocean information management is of great importance as it has been employed in many areas of ocean science and technology. However, the development of Ocean Information Systems (OISs) often suffers from low efficiency because of repetitive work and continuous modifications caused by dynamic requirements. In this paper, the basic requirements of OISs are analyzed first, and then a novel platform, DPOI, is proposed to improve development efficiency and enhance the software quality of OISs by providing off-the-shelf resources. In the platform, the OIS is decomposed hierarchically into a set of modules, which can be reused in different system developments. These modules include the acquisition middleware and data loader that collect data from instruments and files respectively, the database that stores data consistently, the components that support fast application generation, the web services that make data from distributed sources syntactically consistent by use of predefined schemas, and the configuration toolkit that enables software customization. With the assistance of the development platform, software development requires no programming and the development procedure is thus greatly accelerated. We have applied the development platform in practical developments and evaluated its efficiency in several development practices and different development approaches. The results show that DPOI significantly improves development efficiency and software quality.

  12. PRANAS: A New Platform for Retinal Analysis and Simulation.

    PubMed

    Cessac, Bruno; Kornprobst, Pierre; Kraria, Selim; Nasser, Hassan; Pamplona, Daniela; Portelli, Geoffrey; Viéville, Thierry

    2017-01-01

    The retina encodes visual scenes by trains of action potentials that are sent to the brain via the optic nerve. In this paper, we describe new, freely accessible end-user software that helps to better understand this coding. It is called PRANAS (https://pranas.inria.fr), standing for Platform for Retinal ANalysis And Simulation. PRANAS targets neuroscientists and modelers by providing a unique set of retina-related tools. PRANAS integrates a retina simulator that allows large-scale simulations while keeping strong biological plausibility, and a toolbox for the analysis of spike train population statistics. The statistical method (entropy maximization under constraints) takes into account both spatial and temporal correlations as constraints, making it possible to analyze the effects of memory on statistics. PRANAS also integrates a tool for computing and representing receptive fields in 3D (time-space). All these tools are accessible through a friendly graphical user interface. The most CPU-costly of them have been implemented to run in parallel.

  13. Coordinating complex decision support activities across distributed applications

    NASA Technical Reports Server (NTRS)

    Adler, Richard M.

    1994-01-01

    Knowledge-based technologies have been applied successfully to automate planning and scheduling in many problem domains. Automation of decision support can be increased further by integrating task-specific applications with supporting database systems, and by coordinating interactions between such tools to facilitate collaborative activities. Unfortunately, the technical obstacles that must be overcome to achieve this vision of transparent, cooperative problem-solving are daunting. Intelligent decision support tools are typically developed for standalone use, rely on incompatible, task-specific representational models and application programming interfaces (APIs), and run on heterogeneous computing platforms. Getting such applications to interact freely calls for platform-independent capabilities for distributed communication, as well as tools for mapping information across disparate representations. Symbiotics is developing a layered set of software tools (called NetWorks!) for integrating and coordinating heterogeneous distributed applications. The top layer of tools consists of an extensible set of generic, programmable coordination services. Developers access these services via high-level APIs to implement the desired interactions between distributed applications.

  14. Design of a terminal solution for integration of in-home health care devices and services towards the Internet-of-Things

    NASA Astrophysics Data System (ADS)

    Pang, Zhibo; Zheng, Lirong; Tian, Junzhe; Kao-Walter, Sharon; Dubrova, Elena; Chen, Qiang

    2015-01-01

    In-home health care services based on the Internet-of-Things are promising to resolve the challenges caused by the ageing of the population. But the existing research is rather scattered and shows a lack of interoperability. In this article, a business-technology co-design methodology is proposed for cross-boundary integration of in-home health care devices and services. In this framework, three key elements of a solution (business model, device and service integration architecture and information system integration architecture) are organically integrated and aligned. In particular, a cooperative Health-IoT ecosystem is formulated, and the information systems of all stakeholders are integrated in a cooperative health cloud as well as extended to patients' homes through the in-home health care station (IHHS). Design principles of the IHHS include the reuse of the 3C platform, certification of the Health Extension, interoperability and extendibility, convenient and trusted software distribution, standardised and secure electronic health care record handling, effective service composition and efficient data fusion. These principles are applied to the design of an IHHS solution called iMedBox. Detailed device and service integration architecture and hardware and software architecture are presented and verified by an implemented prototype. The quantitative performance analysis and field trials have confirmed the feasibility of the proposed design methodology and solution.

  15. relaxGUI: a new software for fast and simple NMR relaxation data analysis and calculation of ps-ns and μs motion of proteins.

    PubMed

    Bieri, Michael; d'Auvergne, Edward J; Gooley, Paul R

    2011-06-01

    Investigation of protein dynamics on the ps-ns and μs-ms timeframes provides detailed insight into the mechanisms of enzymes and the binding properties of proteins. Nuclear magnetic resonance (NMR) is an excellent tool for studying protein dynamics at atomic resolution. Analysis of relaxation data using model-free analysis can be a tedious and time consuming process, which requires good knowledge of scripting procedures. The software relaxGUI was developed for fast and simple model-free analysis and is fully integrated into the software package relax. It is written in Python and uses wxPython to build the graphical user interface (GUI) for maximum performance and multi-platform use. This software allows the analysis of NMR relaxation data with ease and the generation of publication quality graphs as well as color coded images of molecular structures. The interface is designed for simple data analysis and management. The software was tested and validated against the command line version of relax.

  16. How Conoco uses GIS technology to map geology, geography through time

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foley, D.C.; Ghazi, T.Y.

    1995-05-08

    Conoco Inc.'s Advanced Exploration Organization (AEO) is in the business of studying foreign sedimentary basins from a regional perspective to evaluate their potential for petroleum exploration. Recently the company decided to focus some of the AEO's resources on developing a global ranking system for those areas of the world where hydrocarbons might occur. AEO obtained software from the University of Texas, Arlington that rotates continents or portions of continents through time. Using the software, company geoscientists have created a series of maps, known as a PaleoAtlas, that depicts the geography and selected geological features for different periods in Phanerozoic time. In addition, the AEO has developed a software package based on ARC/INFO (ESRI Inc., Redlands, Calif.), a commercial GIS platform, to manage, integrate, and analyze those time-slice maps. Entitled PaleoAtlas Geographic Evaluation system (Pages), this software also sequences portions of the maps in a montage effect that geoscientists can use to study the geological evolution of petroleum source rocks. The paper describes the AEO project and its software.

  17. Optimizing CyberShake Seismic Hazard Workflows for Large HPC Resources

    NASA Astrophysics Data System (ADS)

    Callaghan, S.; Maechling, P. J.; Juve, G.; Vahi, K.; Deelman, E.; Jordan, T. H.

    2014-12-01

    The CyberShake computational platform is a well-integrated collection of scientific software and middleware that calculates 3D simulation-based probabilistic seismic hazard curves and hazard maps for the Los Angeles region. Currently each CyberShake model comprises about 235 million synthetic seismograms from about 415,000 rupture variations computed at 286 sites. CyberShake integrates large-scale parallel and high-throughput serial seismological research codes into a processing framework in which early stages produce files used as inputs by later stages. Scientific workflow tools are used to manage the jobs, data, and metadata. The Southern California Earthquake Center (SCEC) developed the CyberShake platform using USC High Performance Computing and Communications systems and open-science NSF resources. CyberShake calculations were migrated to the NSF Track 1 system NCSA Blue Waters when it became operational in 2013, via an interdisciplinary team approach including domain scientists, computer scientists, and middleware developers. Due to the excellent performance of Blue Waters and CyberShake software optimizations, we reduced the makespan (a measure of wallclock time-to-solution) of a CyberShake study from 1467 to 342 hours. We will describe the technical enhancements behind this improvement, including judicious introduction of new GPU software, improved scientific software components, increased workflow-based automation, and Blue Waters-specific workflow optimizations. Our CyberShake performance improvements highlight the benefits of scientific workflow tools. The CyberShake workflow software stack includes the Pegasus Workflow Management System (Pegasus-WMS, which includes Condor DAGMan), HTCondor, and Globus GRAM, with Pegasus-mpi-cluster managing the high-throughput tasks on the HPC resources. The workflow tools handle data management, automatically transferring about 13 TB back to SCEC storage. We will present performance metrics from the most recent CyberShake study, executed on Blue Waters. We will compare the performance of CPU and GPU versions of our large-scale parallel wave propagation code, AWP-ODC-SGT. Finally, we will discuss how these enhancements have enabled SCEC to move forward with plans to increase the CyberShake simulation frequency to 1.0 Hz.
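
    The basic step of turning simulated ground motions into a probabilistic hazard curve can be sketched as below, assuming each rupture variation has a known annual rate (all values are synthetic, not CyberShake data).

    ```python
    # Sketch: annual probability of exceeding each intensity level, from
    # per-rupture annual rates under a Poisson occurrence model.
    import numpy as np

    def hazard_curve(intensities, annual_rates, levels):
        """P(exceedance in 1 yr) per level."""
        rate_exceed = np.array([(annual_rates[intensities > x]).sum() for x in levels])
        return 1.0 - np.exp(-rate_exceed)

    intensities = np.random.lognormal(mean=-1.5, sigma=0.6, size=415_000)  # e.g. SA (g)
    annual_rates = np.full(intensities.size, 1e-6)                          # per rupture
    levels = np.array([0.1, 0.2, 0.5, 1.0])
    print(dict(zip(levels.tolist(), hazard_curve(intensities, annual_rates, levels))))
    ```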

  18. WIFIP: a web-based user interface for automated synchrotron beamlines.

    PubMed

    Sallaz-Damaz, Yoann; Ferrer, Jean Luc

    2017-09-01

    The beamline control software, through the associated graphical user interface (GUI), is the user access point to the experiment, interacting with synchrotron beamline components and providing automated routines. FIP, the French beamline for the Investigation of Proteins, is a highly automated macromolecular crystallography (MX) beamline at the European Synchrotron Radiation Facility. On such a beamline, a significant number of users choose to control their experiment remotely. This is often performed with limited bandwidth and from a large choice of computers and operating systems. Furthermore, this has to be possible in a rapidly evolving experimental environment, where new developments have to be easily integrated. To face these challenges, lightweight, platform-independent control software and an associated GUI are required. Here, WIFIP, a web-based user interface developed at FIP, is described. Beyond being the present FIP control interface, WIFIP is also a proof of concept for future MX control software.

  19. Faster than Real-Time Dynamic Simulation for Large-Size Power System with Detailed Dynamic Models using High-Performance Computing Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Renke; Jin, Shuangshuang; Chen, Yousu

    This paper presents a faster-than-real-time dynamic simulation software package that is designed for large-size power system dynamic simulation. It was developed on the GridPACK™ high-performance computing (HPC) framework. The key features of the developed software package include (1) faster-than-real-time dynamic simulation for a WECC system (17,000 buses) with different types of detailed generator, controller, and relay dynamic models, (2) a decoupled parallel dynamic simulation algorithm with optimized computation architecture to better leverage HPC resources and technologies, (3) options for HPC-based linear and iterative solvers, (4) hidden HPC details, such as data communication and distribution, to enable development centered on mathematical models and algorithms rather than on computational details for power system researchers, and (5) easy integration of new dynamic models and related algorithms into the software package.

  20. The relational database model and multiple multicenter clinical trials.

    PubMed

    Blumenstein, B A

    1989-12-01

    The Southwest Oncology Group (SWOG) chose to use a relational database management system (RDBMS) for the management of data from multiple clinical trials because of the underlying relational model's inherent flexibility and the natural way multiple entity types (patients, studies, and participants) can be accommodated. The tradeoffs to using the relational model as compared to using the hierarchical model include added computing cycles due to deferred data linkages and added procedural complexity due to the necessity of implementing protections against referential integrity violations. The SWOG uses its RDBMS as a platform on which to build data operations software. This data operations software, which is written in a compiled computer language, allows multiple users to simultaneously update the database and is interactive with respect to the detection of conditions requiring action and the presentation of options for dealing with those conditions. The relational model facilitates the development and maintenance of data operations software.
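
    The referential-integrity protections discussed above can be illustrated with SQLite foreign keys from Python; the schema is illustrative, not the SWOG database.

    ```python
    # Sketch: a study/patient schema where the database itself rejects a
    # patient record that references a non-existent study.
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("PRAGMA foreign_keys = ON")        # enforce referential integrity
    con.execute("CREATE TABLE study (study_id TEXT PRIMARY KEY)")
    con.execute("""CREATE TABLE patient (
        patient_id INTEGER PRIMARY KEY,
        study_id   TEXT NOT NULL REFERENCES study(study_id))""")
    con.execute("INSERT INTO study VALUES ('STUDY-001')")
    con.execute("INSERT INTO patient VALUES (1, 'STUDY-001')")     # valid linkage
    try:
        con.execute("INSERT INTO patient VALUES (2, 'NO-SUCH-STUDY')")
    except sqlite3.IntegrityError as err:
        print("rejected:", err)                    # FOREIGN KEY constraint failed
    ```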

  1. CymeR: cytometry analysis using KNIME, docker and R

    PubMed Central

    Muchmore, B.; Alarcón-Riquelme, M.E.

    2017-01-01

    Abstract Summary: Here we present open-source software for the analysis of high-dimensional cytometry data using state of the art algorithms. Importantly, use of the software requires no programming ability, and output files can either be interrogated directly in CymeR or they can be used downstream with any other cytometric data analysis platform. Also, because we use Docker to integrate the multitude of components that form the basis of CymeR, we have additionally developed a proof-of-concept of how future open-source bioinformatic programs with graphical user interfaces could be developed. Availability and Implementation: CymeR is open-source software that ties several components into a single program that is perhaps best thought of as a self-contained data analysis operating system. Please see https://github.com/bmuchmore/CymeR/wiki for detailed installation instructions. Contact: brian.muchmore@genyo.es or marta.alarcon@genyo.es PMID:27998935

  2. CymeR: cytometry analysis using KNIME, docker and R.

    PubMed

    Muchmore, B; Alarcón-Riquelme, M E

    2017-03-01

    Here we present open-source software for the analysis of high-dimensional cytometry data using state of the art algorithms. Importantly, use of the software requires no programming ability, and output files can either be interrogated directly in CymeR or they can be used downstream with any other cytometric data analysis platform. Also, because we use Docker to integrate the multitude of components that form the basis of CymeR, we have additionally developed a proof-of-concept of how future open-source bioinformatic programs with graphical user interfaces could be developed. CymeR is open-source software that ties several components into a single program that is perhaps best thought of as a self-contained data analysis operating system. Please see https://github.com/bmuchmore/CymeR/wiki for detailed installation instructions. brian.muchmore@genyo.es or marta.alarcon@genyo.es. © The Author 2016. Published by Oxford University Press.

  3. Establishing a Novel Modeling Tool: A Python-Based Interface for a Neuromorphic Hardware System

    PubMed Central

    Brüderle, Daniel; Müller, Eric; Davison, Andrew; Muller, Eilif; Schemmel, Johannes; Meier, Karlheinz

    2008-01-01

    Neuromorphic hardware systems provide new possibilities for the neuroscience modeling community. Due to the intrinsic parallelism of the micro-electronic emulation of neural computation, such models are highly scalable without a loss of speed. However, the communities of software simulator users and neuromorphic engineering in neuroscience are rather disjoint. We present a software concept that provides the possibility to establish such hardware devices as valuable modeling tools. It is based on the integration of the hardware interface into a simulator-independent language which allows for unified experiment descriptions that can be run on various simulation platforms without modification, implying experiment portability and a huge simplification of the quantitative comparison of hardware and simulator results. We introduce an accelerated neuromorphic hardware device and describe the implementation of the proposed concept for this system. An example setup and results acquired by utilizing both the hardware system and a software simulator are demonstrated. PMID:19562085

  4. Establishing a novel modeling tool: a python-based interface for a neuromorphic hardware system.

    PubMed

    Brüderle, Daniel; Müller, Eric; Davison, Andrew; Muller, Eilif; Schemmel, Johannes; Meier, Karlheinz

    2009-01-01

    Neuromorphic hardware systems provide new possibilities for the neuroscience modeling community. Due to the intrinsic parallelism of the micro-electronic emulation of neural computation, such models are highly scalable without a loss of speed. However, the communities of software simulator users and neuromorphic engineering in neuroscience are rather disjoint. We present a software concept that provides the possibility to establish such hardware devices as valuable modeling tools. It is based on the integration of the hardware interface into a simulator-independent language which allows for unified experiment descriptions that can be run on various simulation platforms without modification, implying experiment portability and a huge simplification of the quantitative comparison of hardware and simulator results. We introduce an accelerated neuromorphic hardware device and describe the implementation of the proposed concept for this system. An example setup and results acquired by utilizing both the hardware system and a software simulator are demonstrated.
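
    The simulator-independent description idea can be illustrated with the PyNN API, assumed here as an example of such a language (and assuming PyNN 0.8+ with the NEST backend installed); swapping the import to another backend would run the same model description unchanged.

    ```python
    # Hedged sketch of a backend-independent model description (PyNN-style);
    # the abstract does not name the language, so PyNN is an assumption.
    import pyNN.nest as sim            # another simulator or hardware backend could be used

    sim.setup(timestep=0.1)
    stim = sim.Population(1, sim.SpikeSourcePoisson(rate=50.0))
    cells = sim.Population(10, sim.IF_cond_exp())
    sim.Projection(stim, cells, sim.AllToAllConnector(),
                   sim.StaticSynapse(weight=0.01, delay=1.0))
    cells.record("spikes")
    sim.run(1000.0)                    # ms
    spikes = cells.get_data().segments[0].spiketrains
    sim.end()
    ```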

  5. KiT: a MATLAB package for kinetochore tracking.

    PubMed

    Armond, Jonathan W; Vladimirou, Elina; McAinsh, Andrew D; Burroughs, Nigel J

    2016-06-15

    During mitosis, chromosomes are attached to the mitotic spindle via large protein complexes called kinetochores. The motion of kinetochores throughout mitosis is intricate and automated quantitative tracking of their motion has already revealed many surprising facets of their behaviour. Here, we present 'KiT' (Kinetochore Tracking)-an easy-to-use, open-source software package for tracking kinetochores from live-cell fluorescent movies. KiT supports 2D, 3D and multi-colour movies, quantification of fluorescence, integrated deconvolution, parallel execution and multiple algorithms for particle localization. KiT is free, open-source software implemented in MATLAB and runs on all MATLAB supported platforms. KiT can be downloaded as a package from http://www.mechanochemistry.org/mcainsh/software.php The source repository is available at https://bitbucket.org/jarmond/kit and under continuing development. Supplementary data are available at Bioinformatics online. jonathan.armond@warwick.ac.uk. © The Author 2016. Published by Oxford University Press.

  6. Campus Energy Model for Control and Performance Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2014-09-19

    The core of the modeling platform is an extensible block library for the MATLAB/Simulink software suite. The platform enables true co-simulation (interaction at each simulation time step) with NREL's state-of-the-art modeling tools and other energy modeling software.

  7. Integrating Multiple Autonomous Underwater Vessels, Surface Vessels and Aircraft into Oceanographic Research Vessel Operations

    NASA Astrophysics Data System (ADS)

    McGillivary, P. A.; Borges de Sousa, J.; Martins, R.; Rajan, K.

    2012-12-01

    Autonomous platforms are increasingly used as components of Integrated Ocean Observing Systems and oceanographic research cruises. Systems deployed can include gliders or propeller-driven autonomous underwater vessels (AUVs), autonomous surface vessels (ASVs), and unmanned aircraft systems (UAS). Prior field campaigns have demonstrated successful communication, sensor data fusion and visualization for studies using gliders and AUVs. However, additional requirements exist for incorporating ASVs and UASs into ship operations. Optimally integrating these systems into research vessel data management and operational planning systems involves addressing three key issues: real-time field data availability, platform coordination, and data archiving for later analysis. A fleet of AUVs, ASVs and UAS deployed from a research vessel is best operated as a system integrated with the ship, provided communications among them can be sustained. For this purpose, Disruption Tolerant Networking (DTN) software protocols for operation in communication-challenged environments help ensure reliable high-bandwidth communications. Additionally, system components need to have considerable onboard autonomy, namely adaptive sampling capabilities using their own onboard sensor data stream analysis. We discuss Oceanographic Decision Support System (ODSS) software currently used for situational awareness and planning onshore, and in the near future event detection and response will be coordinated among multiple vehicles. Results from recent field studies from oceanographic research vessels using AUVs, ASVs and UAS, including the Rapid Environmental Picture (REP-12) cruise, are presented describing methods and results for use of multi-vehicle communication and deliberative control networks, adaptive sampling with single and multiple platforms, issues relating to data management and archiving, and finally challenges that remain in addressing these technological issues. Significantly, the use of UAS on oceanographic research vessels is just beginning. We report on several initial field efforts which demonstrated that UAS improve spatial and temporal mapping of ocean features, as well as monitoring of marine mammal populations, ocean color, sea ice, wave fields and air-sea gas exchange. These studies however also confirm the challenges for shipboard computer systems ingesting and archiving UAS high resolution video, SAR and lidar data. We describe the successful inclusion of DTN communications for: 1) passing video data between two UAS or a UAS and ship; 2) inclusion of ASVs as communication nodes for AUVs; and 3) enabling extension of adaptive sampling software from AUVs and ASVs to include UAS. In conclusion, we describe how autonomous sampling systems may be best integrated into shipboard oceanographic vessel research to provide new and more comprehensive time-space ocean and atmospheric data collection that is important not only for scientific study, but also for sustainable ocean management, including emergency response capabilities. The recent examples of such integrated studies highlighted here confirm that ocean and atmospheric studies can be more cost-effectively pursued, and in some cases only accomplished, by combining underwater, surface and aircraft autonomous systems with research vessel operations.

  8. Beyond xMOOCs in healthcare education: study of the feasibility in integrating virtual patient systems and MOOC platforms.

    PubMed

    Stathakarou, Natalia; Zary, Nabil; Kononowicz, Andrzej A

    2014-01-01

    Background. Massive Open Online Courses (MOOCs) are an emerging trend in online learning. However, their technology is not yet completely adjusted to the needs of healthcare education. Integration of Virtual Patients within MOOCs to increase interactivity and foster clinical reasoning skills training, has been discussed in the past, but not verified by a practical implementation. Objective. To investigate the technical feasibility of integrating MOOCs with Virtual Patients for the purpose of enabling further research into the potential pedagogical benefits of this approach. Methods. We selected OpenEdx and Open Labyrinth as representative constituents of a MOOC platform and Virtual Patient system integration. Based upon our prior experience we selected the most fundamental technical requirement to address. Grounded in the available literature we identified an e-learning standard to guide the integration. We attempted to demonstrate the feasibility of the integration by designing a "proof-of-concept" prototype. The resulting pilot implementation was subject of verification by two test cases. Results. A Single Sign-On mechanism connecting Open Labyrinth with OpenEdx and based on the IMS LTI standard was successfully implemented and verified. Conclusion. We investigated the technical perspective of integrating Virtual Patients with MOOCs. By addressing this crucial technical requirement we set a base for future research on the educational benefits of using virtual patients in MOOCs. This provides new opportunities for integrating specialized software in healthcare education at massive scale.

  9. Beyond xMOOCs in healthcare education: study of the feasibility in integrating virtual patient systems and MOOC platforms

    PubMed Central

    Zary, Nabil; Kononowicz, Andrzej A.

    2014-01-01

    Background. Massive Open Online Courses (MOOCs) are an emerging trend in online learning. However, their technology is not yet completely adjusted to the needs of healthcare education. Integration of Virtual Patients within MOOCs to increase interactivity and foster clinical reasoning skills training, has been discussed in the past, but not verified by a practical implementation. Objective. To investigate the technical feasibility of integrating MOOCs with Virtual Patients for the purpose of enabling further research into the potential pedagogical benefits of this approach. Methods. We selected OpenEdx and Open Labyrinth as representative constituents of a MOOC platform and Virtual Patient system integration. Based upon our prior experience we selected the most fundamental technical requirement to address. Grounded in the available literature we identified an e-learning standard to guide the integration. We attempted to demonstrate the feasibility of the integration by designing a “proof-of-concept” prototype. The resulting pilot implementation was subject of verification by two test cases. Results. A Single Sign-On mechanism connecting Open Labyrinth with OpenEdx and based on the IMS LTI standard was successfully implemented and verified. Conclusion. We investigated the technical perspective of integrating Virtual Patients with MOOCs. By addressing this crucial technical requirement we set a base for future research on the educational benefits of using virtual patients in MOOCs. This provides new opportunities for integrating specialized software in healthcare education at massive scale. PMID:25405078

  10. REVEAL: Software Documentation and Platform Migration

    NASA Technical Reports Server (NTRS)

    Wilson, Michael A.; Veibell, Victoir T.; Freudinger, Lawrence C.

    2008-01-01

    The Research Environment for Vehicle Embedded Analysis on Linux (REVEAL) is reconfigurable data acquisition software designed for network-distributed test and measurement applications. In development since 2001, it has been successfully demonstrated in support of a number of actual missions within NASA's Suborbital Science Program. Improvements to software configuration control were needed to properly support both an ongoing transition to operational status and continued evolution of REVEAL capabilities. For this reason the project described in this report targets REVEAL software source documentation and deployment of the software on a small set of hardware platforms different from what is currently used in the baseline system implementation. This report specifically describes the actions taken over a ten-week period by two undergraduate student interns and serves as a final report for that internship. The topics discussed include: the documentation of REVEAL source code; the migration of REVEAL to other platforms; and an end-to-end field test that successfully validates the efforts.

  11. TU-AB-303-08: GPU-Based Software Platform for Efficient Image-Guided Adaptive Radiation Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, S; Robinson, A; McNutt, T

    2015-06-15

    Purpose: In this study, we develop an integrated software platform for adaptive radiation therapy (ART) that combines fast and accurate image registration, segmentation, and dose computation/accumulation methods. Methods: The proposed system consists of three key components: (1) deformable image registration (DIR), (2) automatic segmentation, and (3) dose computation/accumulation. The computationally intensive modules, including DIR and dose computation, have been implemented on a graphics processing unit (GPU). All required patient-specific data including the planning CT (pCT) with contours, daily cone-beam CTs, and the treatment plan are automatically queried and retrieved from their own databases. To improve the accuracy of DIR between pCT and CBCTs, we use the double force demons DIR algorithm in combination with iterative CBCT intensity correction by local intensity histogram matching. Segmentation of the daily CBCT is then obtained by propagating contours from the pCT. The daily dose delivered to the patient is computed on the registered pCT by a GPU-accelerated superposition/convolution algorithm. Finally, computed daily doses are accumulated to show the total delivered dose to date. Results: Since the accuracy of DIR critically affects the quality of the other processes, we first evaluated our DIR method on eight head-and-neck cancer cases and compared its performance. Normalized mutual information (NMI) and normalized cross-correlation (NCC) were computed as similarity measures, and our method produced an overall NMI of 0.663 and NCC of 0.987, outperforming conventional methods by 3.8% and 1.9%, respectively. Experimental results show that our registration method is more consistent and robust than existing algorithms, and also computationally efficient. Computation time at each fraction took around one minute (30–50 seconds for registration and 15–25 seconds for dose computation). Conclusion: We developed an integrated GPU-accelerated software platform that enables accurate and efficient DIR, auto-segmentation, and dose computation, thus supporting an efficient ART workflow. This work was supported by NIH/NCI under grant R42CA137886.
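
    The two similarity measures used for evaluation above can be sketched as follows for NumPy image volumes; note this uses one common NMI normalisation (Studholme's), which may differ from the convention used in the study, and the images are random placeholders.

    ```python
    # Sketch: normalized cross-correlation (NCC) and normalized mutual
    # information (NMI) between two volumes given as NumPy arrays.
    import numpy as np

    def ncc(a, b):
        a, b = a.ravel() - a.mean(), b.ravel() - b.mean()
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    def nmi(a, b, bins=64):
        hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
        p = hist / hist.sum()
        px, py = p.sum(axis=1), p.sum(axis=0)
        hx = -np.sum(px[px > 0] * np.log(px[px > 0]))
        hy = -np.sum(py[py > 0] * np.log(py[py > 0]))
        hxy = -np.sum(p[p > 0] * np.log(p[p > 0]))
        return (hx + hy) / hxy          # Studholme's normalised MI, >= 1

    a = np.random.rand(64, 64, 64)
    b = 0.9 * a + 0.1 * np.random.rand(64, 64, 64)
    print("NCC:", ncc(a, b), "NMI:", nmi(a, b))
    ```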

  12. A LabVIEW®-based software for the control of the AUTORAD platform: a fully automated multisequential flow injection analysis Lab-on-Valve (MSFIA-LOV) system for radiochemical analysis.

    PubMed

    Barbesi, Donato; Vicente Vilas, Víctor; Millet, Sylvain; Sandow, Miguel; Colle, Jean-Yves; Aldave de Las Heras, Laura

    2017-01-01

    A LabVIEW®-based software for the control of the fully automated multi-sequential flow injection analysis Lab-on-Valve (MSFIA-LOV) platform AUTORAD performing radiochemical analysis is described. The analytical platform interfaces an Arduino®-based device triggering multiple detectors, providing a flexible and fit-for-purpose choice of detection systems. The different analytical devices are interfaced to the PC running the LabVIEW® VI software using USB and RS232 interfaces, both for sending commands and receiving confirmation or error responses. The AUTORAD platform has been successfully applied to the chemical separation and determination of Sr, an important fission product pertinent to nuclear waste.
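    The command/confirmation pattern over a serial link described above can be illustrated with a short Python sketch using pyserial. The port name, baud rate, and command strings here are hypothetical placeholders, not the AUTORAD protocol, and the original control software is written in LabVIEW rather than Python.

```python
# Illustrative sketch of a send-command / read-confirmation exchange over RS232,
# using pyserial. Port, baud rate and command syntax are assumptions.
import serial

def send_command(port, command, timeout=2.0):
    """Send one ASCII command to an instrument and return its reply line."""
    with serial.Serial(port, baudrate=9600, timeout=timeout) as link:
        link.write((command + "\r\n").encode("ascii"))
        reply = link.readline().decode("ascii", errors="replace").strip()
    if reply.startswith("ERR"):
        raise RuntimeError(f"Instrument reported an error: {reply}")
    return reply

# Hypothetical usage: trigger a detector connected to the Arduino-based unit.
# print(send_command("/dev/ttyUSB0", "TRIG DET1"))
```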

  13. Upgrades to the Probabilistic NAS Platform Air Traffic Simulation Software

    NASA Technical Reports Server (NTRS)

    Hunter, George; Boisvert, Benjamin

    2013-01-01

    This document is the final report for the project entitled "Upgrades to the Probabilistic NAS Platform Air Traffic Simulation Software." This report consists of 17 sections which document the results of the several subtasks of this effort. The Probabilistic NAS Platform (PNP) is an air operations simulation platform developed and maintained by the Saab Sensis Corporation. The improvements made to the PNP simulation include the following: an airborne distributed separation assurance capability, a required time of arrival assignment and conformance capability, and a tactical and strategic weather avoidance capability.

  14. Note: computer controlled rotation mount for large diameter optics.

    PubMed

    Rakonjac, Ana; Roberts, Kris O; Deb, Amita B; Kjærgaard, Niels

    2013-02-01

    We describe the construction of a motorized optical rotation mount with a 40 mm clear aperture. The device is used to remotely control the power of large diameter laser beams for a magneto-optical trap. A piezo-electric ultrasonic motor on a printed circuit board provides rotation with a precision better than 0.03° and allows for a very compact design. The rotation unit is controlled from a computer via serial communication, making integration into most software control platforms straightforward.

  15. An Interdisciplinary Approach Between Medical Informatics and Social Sciences to Transdisciplinary Requirements Engineering for an Integrated Care Setting.

    PubMed

    Vielhauer, Jan; Böckmann, Britta

    2017-01-01

    Requirements engineering of software products for elderly people faces special challenges in ensuring maximum user acceptance. Within the scope of a research project, a web-based platform and a mobile app are being developed to enable people to live in their own homes as long as possible. This paper describes a method for interdisciplinary requirements engineering developed by a team of social scientists in cooperation with computer scientists.

  16. Integrating CLIPS applications into heterogeneous distributed systems

    NASA Technical Reports Server (NTRS)

    Adler, Richard M.

    1991-01-01

    SOCIAL is an advanced, object-oriented development tool for integrating intelligent and conventional applications across heterogeneous hardware and software platforms. SOCIAL defines a family of 'wrapper' objects called agents, which incorporate predefined capabilities for distributed communication and control. Developers embed applications within agents and establish interactions between distributed agents via non-intrusive message-based interfaces. This paper describes a predefined SOCIAL agent that is specialized for integrating C Language Integrated Production System (CLIPS)-based applications. The agent's high-level Application Programming Interface supports bidirectional flow of data, knowledge, and commands to other agents, enabling CLIPS applications to initiate interactions autonomously, and respond to requests and results from heterogeneous remote systems. The design and operation of CLIPS agents are illustrated with two distributed applications that integrate CLIPS-based expert systems with other intelligent systems for isolating and mapping problems in the Space Shuttle Launch Processing System at the NASA Kennedy Space Center.

  17. Evolution of a Reconfigurable Processing Platform for a Next Generation Space Software Defined Radio

    NASA Technical Reports Server (NTRS)

    Kacpura, Thomas J.; Downey, Joseph A.; Anderson, Keffery R.; Baldwin, Keith

    2014-01-01

    The National Aeronautics and Space Administration (NASA)/Harris Ka-Band Software Defined Radio (SDR) is the first fully reprogrammable, space-qualified SDR operating in the Ka-Band frequency range. Providing substantially higher data communication rates than previously possible, this SDR offers in-orbit reconfiguration, multi-waveform operation, and fast deployment due to its highly modular hardware and software architecture. Currently in operation on the International Space Station (ISS), this new paradigm of reconfigurable technology is enabling experimenters to investigate navigation and networking in the space environment. The modular SDR and the NASA-developed Space Telecommunications Radio System (STRS) architecture standard are the basis for Harris' reusable digital signal processing space platform, trademarked as AppSTAR. As a result, two new space radio products are a synthetic aperture radar payload and an Automatic Dependent Surveillance-Broadcast (ADS-B) receiver. In addition, Harris is currently developing many new products similar to the Ka-Band software defined radio for other applications. For NASA's next generation flight Ka-Band radio development, leveraging these advancements could lead to a more robust and more capable software defined radio. The space environment has special considerations, different from terrestrial applications, that must be accounted for in any system operated in space. Each space mission has unique requirements that can make these systems unique, and such unique requirements can make products expensive and limited in reuse. Space systems put a premium on size, weight and power. A key trade is the amount of reconfigurability in a space system: the more reconfigurable the hardware platform, the easier it is to adapt the platform to the next mission, which reduces non-recurring engineering costs. However, more reconfigurable platforms often use more spacecraft resources. Software has similar considerations to hardware. Having an architecture standard promotes reuse of software and firmware. Space platforms have limited processor capability, which makes the trade on the amount of flexibility paramount.

  18. Integrated Building Management System (IBMS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anita Lewis

    This project provides a combination of software and services that more easily and cost-effectively help to achieve optimized building performance and energy efficiency. Featuring an open-platform, cloud-hosted application suite and an intuitive user experience, this solution simplifies a traditionally very complex process by collecting data from disparate building systems and creating a single, integrated view of building and system performance. The Fault Detection and Diagnostics (FDD) algorithms developed within the IBMS have been designed and tested as an integrated component of the control algorithms running the equipment being monitored. The algorithms identify the normal control behaviors of the equipment without interfering with the equipment control sequences. The algorithms also work without interfering with any cooperative control sequences operating between different pieces of equipment or building systems. In this manner the FDD algorithms create an integrated building management system.
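    A passive, residual-based check is one common way such fault detection can be framed: the detector only observes setpoints and measurements and never alters the control sequence. The sketch below is an illustration of that idea, not the IBMS algorithms; the signal names, threshold, and window length are assumptions.

```python
# Illustrative residual-based fault detection: flag a fault only when the
# measurement persistently fails to track its setpoint. Purely observational.
from collections import deque

class ResidualFaultDetector:
    def __init__(self, threshold=2.0, window=3):
        self.threshold = threshold          # allowed |setpoint - measurement|, e.g. deg C
        self.window = window                # consecutive samples needed to raise a fault
        self.residuals = deque(maxlen=window)

    def update(self, setpoint, measurement):
        """Record one sample; return True if a persistent fault is detected."""
        self.residuals.append(abs(setpoint - measurement))
        return (len(self.residuals) == self.window and
                all(r > self.threshold for r in self.residuals))

detector = ResidualFaultDetector(threshold=2.0, window=3)
for sp, meas in [(22.0, 25.0), (22.0, 25.8), (22.0, 26.1)]:   # hypothetical samples
    if detector.update(sp, meas):
        print("fault: supply temperature not tracking setpoint")
```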

  19. Advantages of Brahms for Specifying and Implementing a Multiagent Human-Robotic Exploration System

    NASA Technical Reports Server (NTRS)

    Clancey, William J.; Sierhuis, Maarten; Kaskiris, Charis; vanHoof, Ron

    2003-01-01

    We have developed a model-based, distributed architecture that integrates diverse components in a system designed for lunar and planetary surface operations: an astronaut's space suit, cameras, all-terrain vehicles, robotic assistant, crew in a local habitat, and mission support team. Software processes ('agents'), implemented in the Brahms language, run on multiple mobile platforms. These mobile agents interpret and transform available data to help people and robotic systems coordinate their actions to make operations safer and more efficient. The Brahms-based mobile agent architecture (MAA) uses a novel combination of agent types so the software agents may understand and facilitate communications between people and between system components. A state-of-the-art spoken dialogue interface is integrated with Brahms models, supporting a speech-driven field observation record and rover command system. An important aspect of the methodology involves first simulating the entire system in Brahms, then configuring the agents into a runtime system. Thus, Brahms provides a language, engine, and system builder's toolkit for specifying and implementing multiagent systems.

  20. Design of underwater robot lines based on a hybrid automatic optimization strategy

    NASA Astrophysics Data System (ADS)

    Lyu, Wenjing; Luo, Weilin

    2014-09-01

    In this paper, a hybrid automatic optimization strategy is proposed for the design of underwater robot lines. Isight is introduced as the integration platform. The platform is constructed from user programming and several commercial software packages, including UG 6.0, GAMBIT 2.4.6 and FLUENT 12.0. An intelligent parameter optimization method, particle swarm optimization, is incorporated into the platform. To verify the proposed strategy, a simulation is conducted on underwater robot model 5470, which originates from the DTRC SUBOFF project. With the automatic optimization platform, the minimal resistance is taken as the optimization goal; the wetted surface area as the constraint condition; and the length of the fore-body, the maximum body radius and the after-body's minimum radius as the design variables. For the CFD calculation, the RANS equations and the standard turbulence model are used. Analysis of the simulation results shows that the platform is efficient and feasible. Through the platform, a variety of schemes for the design of the lines are generated and the optimal solution is achieved. The combination of the intelligent optimization algorithm and the numerical simulation ensures a global optimal solution and improves the efficiency of searching for solutions.
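    The particle swarm optimization step can be illustrated with a few lines of Python. The resistance objective and wetted-area constraint below are stand-in functions (the paper couples the optimizer to CFD through Isight), and the bounds, swarm size, and PSO coefficients are assumptions.

```python
# Minimal particle swarm optimization sketch with a penalty for a constraint.
# Objective, constraint, bounds and coefficients are placeholders, not the paper's.
import numpy as np

rng = np.random.default_rng(0)

def resistance(x):           # placeholder objective: x = [fore length, max radius, aft radius]
    return (x[0] - 1.0)**2 + 2*(x[1] - 0.5)**2 + (x[2] - 0.2)**2

def penalty(x):              # placeholder wetted-surface-area constraint
    area = 3.0*x[0]*x[1] + 5.0*x[1]
    return 1e3 * max(0.0, area - 4.0)**2

lo, hi = np.array([0.5, 0.2, 0.05]), np.array([2.0, 1.0, 0.5])
pos = rng.uniform(lo, hi, size=(30, 3))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([resistance(p) + penalty(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(200):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7*vel + 1.5*r1*(pbest - pos) + 1.5*r2*(gbest - pos)   # inertia + cognitive + social
    pos = np.clip(pos + vel, lo, hi)
    vals = np.array([resistance(p) + penalty(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("best design variables found:", gbest)
```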

  1. Rapid sample classification using an open port sampling interface coupled with liquid introduction atmospheric pressure ionization mass spectrometry.

    PubMed

    Van Berkel, Gary J; Kertesz, Vilmos

    2017-02-15

    An "Open Access"-like mass spectrometric platform to fully utilize the simplicity of the manual open port sampling interface for rapid characterization of unprocessed samples by liquid introduction atmospheric pressure ionization mass spectrometry has been lacking. The in-house developed integrated software with a simple, small and relatively low-cost mass spectrometry system introduced here fills this void. Software was developed to operate the mass spectrometer, to collect and process mass spectrometric data files, to build a database and to classify samples using such a database. These tasks were accomplished via the vendor-provided software libraries. Sample classification based on spectral comparison utilized the spectral contrast angle method. Using the developed software platform, near real-time sample classification is exemplified using a series of commercially available blue ink rollerball pens and vegetable oils. In the case of the inks, full scan positive and negative ion ESI mass spectra were both used for database generation and sample classification. For the vegetable oils, full scan positive ion mode APCI mass spectra were recorded. The overall accuracy of the employed spectral contrast angle statistical model was 95.3% and 98% in the case of the inks and oils, respectively, using leave-one-out cross-validation. This work illustrates that an open port sampling interface/mass spectrometer combination, with appropriate instrument control and data processing software, is a viable direct liquid extraction sampling and analysis system suitable for the non-expert user and near real-time sample classification via database matching. Published in 2016. This article is a U.S. Government work and is in the public domain in the USA.
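    The spectral contrast angle itself is simply the angle between two spectra treated as intensity vectors, theta = arccos(s·r / (|s||r|)); the unknown is assigned to the database entry with the smallest angle. The sketch below illustrates that calculation; the fixed-length binning of spectra and the example data are assumptions, not the authors' code.

```python
# Sketch of spectral-contrast-angle matching for database classification.
# Spectra are represented as intensity vectors on a common m/z grid (assumed).
import numpy as np

def contrast_angle(a, b):
    """Spectral contrast angle (radians) between two intensity vectors."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    cos_theta = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))

def classify(unknown, database):
    """Return the database label whose reference spectrum is closest in angle."""
    return min(database, key=lambda label: contrast_angle(unknown, database[label]))

database = {"ink_A": [0.9, 0.1, 0.3, 0.0], "ink_B": [0.1, 0.8, 0.2, 0.4]}   # toy spectra
print(classify([0.85, 0.15, 0.25, 0.05], database))   # -> ink_A
```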

  2. NeuroMatic: An Integrated Open-Source Software Toolkit for Acquisition, Analysis and Simulation of Electrophysiological Data

    PubMed Central

    Rothman, Jason S.; Silver, R. Angus

    2018-01-01

    Acquisition, analysis and simulation of electrophysiological properties of the nervous system require multiple software packages. This makes it difficult to conserve experimental metadata and track the analysis performed. It also complicates certain experimental approaches such as online analysis. To address this, we developed NeuroMatic, an open-source software toolkit that performs data acquisition (episodic, continuous and triggered recordings), data analysis (spike rasters, spontaneous event detection, curve fitting, stationarity) and simulations (stochastic synaptic transmission, synaptic short-term plasticity, integrate-and-fire and Hodgkin-Huxley-like single-compartment models). The merging of a wide range of tools into a single package facilitates a more integrated style of research, from the development of online analysis functions during data acquisition, to the simulation of synaptic conductance trains during dynamic-clamp experiments. Moreover, NeuroMatic has the advantage of working within Igor Pro, a platform-independent environment that includes an extensive library of built-in functions, a history window for reviewing the user's workflow and the ability to produce publication-quality graphics. Since its original release, NeuroMatic has been used in a wide range of scientific studies and its user base has grown considerably. NeuroMatic version 3.0 can be found at http://www.neuromatic.thinkrandom.com and https://github.com/SilverLabUCL/NeuroMatic. PMID:29670519
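    As an illustration of the kind of single-compartment model mentioned above, the sketch below simulates a leaky integrate-and-fire neuron. NeuroMatic itself runs inside Igor Pro; this generic Python example and its parameter values are illustrative only.

```python
# Generic leaky integrate-and-fire simulation (illustrative parameters).
import numpy as np

def simulate_lif(i_inj, dt=1e-4, tau=20e-3, r_m=1e8, v_rest=-70e-3,
                 v_thresh=-50e-3, v_reset=-65e-3):
    """Integrate dV/dt = (-(V - v_rest) + R_m * I) / tau; return V(t) and spike times."""
    v = np.full(i_inj.size, v_rest)
    spikes = []
    for t in range(1, i_inj.size):
        dv = (-(v[t-1] - v_rest) + r_m * i_inj[t-1]) * dt / tau
        v[t] = v[t-1] + dv
        if v[t] >= v_thresh:
            spikes.append(t * dt)
            v[t] = v_reset
    return v, spikes

current = np.full(int(0.5 / 1e-4), 3e-10)        # 300 pA step for 500 ms
_, spike_times = simulate_lif(current)
print(f"{len(spike_times)} spikes")
```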

  3. DeviceEditor visual biological CAD canvas

    PubMed Central

    2012-01-01

    Background Biological Computer Aided Design (bioCAD) assists the de novo design and selection of existing genetic components to achieve a desired biological activity, as part of an integrated design-build-test cycle. To meet the emerging needs of Synthetic Biology, bioCAD tools must address the increasing prevalence of combinatorial library design, design rule specification, and scar-less multi-part DNA assembly. Results We report the development and deployment of web-based bioCAD software, DeviceEditor, which provides a graphical design environment that mimics the intuitive visual whiteboard design process practiced in biological laboratories. The key innovations of DeviceEditor include visual combinatorial library design, direct integration with scar-less multi-part DNA assembly design automation, and a graphical user interface for the creation and modification of design specification rules. We demonstrate how biological designs are rendered on the DeviceEditor canvas, and we present effective visualizations of genetic component ordering and combinatorial variations within complex designs. Conclusions DeviceEditor liberates researchers from DNA base-pair manipulation, and enables users to create successful prototypes using standardized, functional, and visual abstractions. Open and documented software interfaces support further integration of DeviceEditor with other bioCAD tools and software platforms. DeviceEditor saves researcher time and institutional resources through correct-by-construction design, the automation of tedious tasks, design reuse, and the minimization of DNA assembly costs. PMID:22373390

  4. A New Approach to Integrate Internet-of-Things and Software-as-a-Service Model for Logistic Systems: A Case Study

    PubMed Central

    Chen, Shang-Liang; Chen, Yun-Yao; Hsu, Chiang

    2014-01-01

    Cloud computing is changing the way software is developed and managed in enterprises, and with it the way of doing business, in that dynamically scalable and virtualized resources are regarded as services over the Internet. Traditional manufacturing systems such as supply chain management (SCM), customer relationship management (CRM), and enterprise resource planning (ERP) are often developed case by case. However, researchers have pointed out the need for effective collaboration between different systems, platforms, programming languages, and interfaces. In cloud-computing-based systems, distributed resources are encapsulated into cloud services and centrally managed, which allows high automation, flexibility, fast provision, and ease of integration at low cost. The integration between physical resources and cloud services can be improved by combining Internet of things (IoT) technology and Software-as-a-Service (SaaS) technology. This study proposes a new approach for developing cloud-based manufacturing systems based on a four-layer SaaS model. There are three main contributions of this paper: (1) enterprises can develop their own cloud-based logistic management information systems based on the approach proposed in this paper; (2) a case study based on literature reviews with experimental results is proposed to verify that the system performance is remarkable; (3) challenges encountered and feedback collected from T Company in the case study are discussed in this paper for the purpose of enterprise deployment. PMID:24686728

  5. A new approach to integrate Internet-of-things and software-as-a-service model for logistic systems: a case study.

    PubMed

    Chen, Shang-Liang; Chen, Yun-Yao; Hsu, Chiang

    2014-03-28

    Cloud computing is changing the way software is developed and managed in enterprises, and with it the way of doing business, in that dynamically scalable and virtualized resources are regarded as services over the Internet. Traditional manufacturing systems such as supply chain management (SCM), customer relationship management (CRM), and enterprise resource planning (ERP) are often developed case by case. However, researchers have pointed out the need for effective collaboration between different systems, platforms, programming languages, and interfaces. In cloud-computing-based systems, distributed resources are encapsulated into cloud services and centrally managed, which allows high automation, flexibility, fast provision, and ease of integration at low cost. The integration between physical resources and cloud services can be improved by combining Internet of things (IoT) technology and Software-as-a-Service (SaaS) technology. This study proposes a new approach for developing cloud-based manufacturing systems based on a four-layer SaaS model. There are three main contributions of this paper: (1) enterprises can develop their own cloud-based logistic management information systems based on the approach proposed in this paper; (2) a case study based on literature reviews with experimental results is proposed to verify that the system performance is remarkable; (3) challenges encountered and feedback collected from T Company in the case study are discussed in this paper for the purpose of enterprise deployment.

  6. Design and Simulation of Material-Integrated Distributed Sensor Processing with a Code-Based Agent Platform and Mobile Multi-Agent Systems

    PubMed Central

    Bosse, Stefan

    2015-01-01

    Multi-agent systems (MAS) can be used for decentralized and self-organizing data processing in a distributed system, like a resource-constrained sensor network, enabling distributed information extraction, for example, based on pattern recognition and self-organization, by decomposing complex tasks into simpler cooperative agents. Reliable MAS-based data processing approaches can aid the material-integration of structural-monitoring applications, with agent processing platforms scaled to the microchip level. The agent behavior, based on a dynamic activity-transition graph (ATG) model, is implemented with program code that stores both the control state and the data state of an agent, which is novel. The program code can be modified by the agent itself using code morphing techniques and is capable of migrating in the network between nodes. The program code is a self-contained unit (a container) and embeds the agent data, the initialization instructions and the ATG behavior implementation. The microchip agent processing platform used for the execution of the agent code is a standalone multi-core stack machine with a zero-operand instruction format, leading to a small-sized agent program code, low system complexity and high system performance. The agent processing is token-queue-based, similar to Petri nets. The agent platform can be implemented in software, too, offering compatibility at the operational and code level, supporting agent processing in strongly heterogeneous networks. In this work, the agent platform embedded in a large-scale distributed sensor network is simulated at the architectural level by using agent-based simulation techniques. PMID:25690550

  7. Design and simulation of material-integrated distributed sensor processing with a code-based agent platform and mobile multi-agent systems.

    PubMed

    Bosse, Stefan

    2015-02-16

    Multi-agent systems (MAS) can be used for decentralized and self-organizing data processing in a distributed system, like a resource-constrained sensor network, enabling distributed information extraction, for example, based on pattern recognition and self-organization, by decomposing complex tasks into simpler cooperative agents. Reliable MAS-based data processing approaches can aid the material-integration of structural-monitoring applications, with agent processing platforms scaled to the microchip level. The agent behavior, based on a dynamic activity-transition graph (ATG) model, is implemented with program code that stores both the control state and the data state of an agent, which is novel. The program code can be modified by the agent itself using code morphing techniques and is capable of migrating in the network between nodes. The program code is a self-contained unit (a container) and embeds the agent data, the initialization instructions and the ATG behavior implementation. The microchip agent processing platform used for the execution of the agent code is a standalone multi-core stack machine with a zero-operand instruction format, leading to a small-sized agent program code, low system complexity and high system performance. The agent processing is token-queue-based, similar to Petri nets. The agent platform can be implemented in software, too, offering compatibility at the operational and code level, supporting agent processing in strongly heterogeneous networks. In this work, the agent platform embedded in a large-scale distributed sensor network is simulated at the architectural level by using agent-based simulation techniques.
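    A highly simplified way to picture an activity-transition graph with token-queue processing is as a set of activity functions over the agent's data state plus guarded transitions, with pending activities held in a queue. The sketch below is only that picture; the real platform compiles agent behavior into code for a zero-operand stack machine and supports code morphing and migration, none of which is modelled here. All names are illustrative.

```python
# Toy activity-transition-graph (ATG) agent with a token queue (illustrative only).
from collections import deque

class ATGAgent:
    def __init__(self, activities, transitions, start, data):
        self.activities = activities      # activity name -> callable(data)
        self.transitions = transitions    # activity name -> [(guard(data), next_name), ...]
        self.queue = deque([start])       # token queue of pending activities
        self.data = data

    def step(self):
        """Run one pending activity and enqueue the first transition whose guard holds."""
        if not self.queue:
            return False
        name = self.queue.popleft()
        self.activities[name](self.data)
        for guard, nxt in self.transitions.get(name, []):
            if guard(self.data):
                self.queue.append(nxt)
                break
        return True

agent = ATGAgent(
    activities={"sense": lambda d: d.update(value=d["value"] + 1),
                "report": lambda d: print("reading:", d["value"])},
    transitions={"sense": [(lambda d: d["value"] >= 3, "report"),
                           (lambda d: True, "sense")]},
    start="sense", data={"value": 0})
while agent.step():
    pass
```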

  8. TERRA REF: Advancing phenomics with high resolution, open access sensor and genomics data

    NASA Astrophysics Data System (ADS)

    LeBauer, D.; Kooper, R.; Burnette, M.; Willis, C.

    2017-12-01

    Automated plant measurement has the potential to improve understanding of genetic and environmental controls on plant traits (phenotypes). The application of sensors and software in the automation of high throughput phenotyping reflects a fundamental shift from labor-intensive hand measurements to drone-, tractor-, and robot-mounted sensing platforms. These tools are expected to speed the rate of crop improvement by enabling plant breeders to more accurately select plants with improved yields, resource use efficiency, and stress tolerance. However, there are many challenges facing high throughput phenomics: sensors and platforms are expensive, currently there are few standard methods of data collection and storage, and the analysis of large data sets requires high performance computers and automated, reproducible computing pipelines. To overcome these obstacles and advance the science of high throughput phenomics, the TERRA Phenotyping Reference Platform (TERRA-REF) team is developing an open-access database of high resolution sensor data. TERRA REF is an integrated field and greenhouse phenotyping system that includes: a reference field scanner with fifteen sensors that can generate terabytes of data each day at mm resolution; UAV, tractor, and fixed field sensing platforms; and an automated controlled-environment scanner. These platforms will enable investigation of diverse sensing modalities and of traits under both controlled and field environments. It is the goal of TERRA REF to lower the barrier to entry for academic and industry researchers by providing high-resolution data, open source software, and online computing resources. Our project is unique in that all data will be made fully public in November 2018 and are already available to early adopters through the beta-user program. We will describe the datasets and how to use them as well as the databases and computing pipeline and how these can be reused and remixed in other phenomics pipelines. Finally, we will describe the National Data Service workbench, a cloud computing platform that can access the petabyte scale data while supporting reproducible research.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iverson, Aaron

    Ra Power Management (RPM) has developed a cloud-based software platform that manages the financial and operational functions of third-party financed solar projects throughout their lifecycle. RPM’s software streamlines and automates the sales, financing, and management of a portfolio of solar assets. The software helps solar developers automate the most difficult aspects of asset management, leading to increased transparency, efficiency, and reduction in human error. More importantly, the platform helps developers save money by improving their operating margins.

  10. Monitoring and Detection Platform to Prevent Anomalous Situations in Home Care

    PubMed Central

    Villarrubia, Gabriel; Bajo, Javier; De Paz, Juan F.; Corchado, Juan M.

    2014-01-01

    Monitoring and tracking people at home usually requires high-cost hardware installations, which are not affordable in many situations. This paper proposes a monitoring and tracking system for people with medical problems. A virtual organization of agents based on the PANGEA platform, which allows the easy integration of different devices, was created for this study. In this case, a virtual organization was implemented to track and monitor patients carrying a Holter monitor. The system includes the hardware and software required to perform ECG measurements and monitoring through accelerometers and WiFi networks. Furthermore, the use of interactive television facilitates interaction with the user. The system makes it possible to merge the information and enables efficient, low-cost patient tracking. PMID:24905853

  11. Control and structural optimization for maneuvering large spacecraft

    NASA Technical Reports Server (NTRS)

    Chun, H. M.; Turner, J. D.; Yu, C. C.

    1990-01-01

    Presented here are the results of an advanced control design as well as a discussion of the requirements for automating both the structures and control design efforts for maneuvering a large spacecraft. The advanced control application addresses a general three dimensional slewing problem, and is applied to a large geostationary platform. The platform consists of two flexible antennas attached to the ends of a flexible truss. The control strategy involves an open-loop rigid body control profile which is derived from a nonlinear optimal control problem and provides the main control effort. A perturbation feedback control reduces the response due to the flexibility of the structure. Results are shown which demonstrate the usefulness of the approach. Software issues are considered for developing an integrated structures and control design environment.

  12. Creating an X Window Terminal-Based Information Technology Center.

    ERIC Educational Resources Information Center

    Klassen, Tim W.

    1997-01-01

    The creation of an information technology center at the University of Oregon Science Library is described. Goals included providing access to Internet-based resources and multimedia software, platforms for running science-oriented software, and resources so students can create multimedia materials. A mixed-lab platform was created with Unix-based…

  13. A Facility and Architecture for Autonomy Research

    NASA Technical Reports Server (NTRS)

    Pisanich, Greg; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Autonomy is a key enabling factor in the advancement of remote robotic exploration. There is currently a large gap between autonomy software at the research level and software that is ready for insertion into near-term space missions. The Mission Simulation Facility (MSF) will bridge this gap by providing a simulation framework and suite of simulation tools to support research in autonomy for remote exploration. This system will allow developers of autonomy software to test their models in a high-fidelity simulation and evaluate their system's performance against a set of integrated, standardized simulations. The Mission Simulation Toolkit (MST) uses a distributed architecture with a communication layer that is built on top of the standardized High Level Architecture (HLA). This architecture enables the use of existing high fidelity models, allows mixing simulation components from various computing platforms and enforces the use of a standardized high-level interface among components. The components needed to achieve a realistic simulation can be grouped into four categories: environment generation (terrain, environmental features), robotic platform behavior (robot dynamics), instrument models (camera/spectrometer/etc.), and data analysis. The MST will provide basic components in these areas but allows users to easily plug in any refined model by means of a communication protocol. Finally, a description file defines the robot and environment parameters for easy configuration and ensures that all the simulation models share the same information.

  14. The structural bioinformatics library: modeling in biomolecular science and beyond.

    PubMed

    Cazals, Frédéric; Dreyfus, Tom

    2017-04-01

    Software in structural bioinformatics has mainly been application driven. To favor practitioners seeking off-the-shelf applications, but also developers seeking advanced building blocks to develop novel applications, we undertook the design of the Structural Bioinformatics Library (SBL, http://sbl.inria.fr), a generic C++/Python cross-platform software library targeting complex problems in structural bioinformatics. Its tenet is based on a modular design offering a rich and versatile framework allowing the development of novel applications requiring well-specified complex operations, without compromising robustness and performance. The SBL involves four software components (1-4 below). For end-users, the SBL provides ready-to-use, state-of-the-art (1) applications to handle molecular models defined by unions of balls, to deal with molecular flexibility, and to model macro-molecular assemblies. These applications can also be combined to tackle integrated analysis problems. For developers, the SBL provides a broad C++ toolbox with modular design, involving core (2) algorithms, (3) biophysical models and (4) modules, the latter being especially suited to develop novel applications. The SBL comes with thorough documentation consisting of user and reference manuals, and a bugzilla platform to handle community feedback. The SBL is available from http://sbl.inria.fr. Contact: Frederic.Cazals@inria.fr. Supplementary data are available at Bioinformatics online.

  15. Use of the NetBeans Platform for NASA Robotic Conjunction Assessment Risk Analysis

    NASA Technical Reports Server (NTRS)

    Sabey, Nickolas J.

    2014-01-01

    The latest Java and JavaFX technologies are very attractive software platforms for customers involved in space mission operations such as those of NASA and the US Air Force. For NASA Robotic Conjunction Assessment Risk Analysis (CARA), the NetBeans platform provided an environment in which scalable software solutions could be developed quickly and efficiently. Both Java 8 and the NetBeans platform are in the process of simplifying CARA development in secure environments by providing a significant amount of capability in a single accredited package, where accreditation alone can account for 6-8 months for each library or software application. Capabilities either in use or being investigated by CARA include: 2D and 3D displays with JavaFX, parallelization with the new Streams API, and scalability through the NetBeans plugin architecture.

  16. Pathogen metadata platform: software for accessing and analyzing pathogen strain information.

    PubMed

    Chang, Wenling E; Peterson, Matthew W; Garay, Christopher D; Korves, Tonia

    2016-09-15

    Pathogen metadata includes information about where and when a pathogen was collected and the type of environment it came from. Along with genomic nucleotide sequence data, this metadata is growing rapidly and becoming a valuable resource not only for research but for biosurveillance and public health. However, current freely available tools for analyzing this data are geared towards bioinformaticians and/or do not provide summaries and visualizations needed to readily interpret results. We designed a platform to easily access and summarize data about pathogen samples. The software includes a PostgreSQL database that captures metadata useful for disease outbreak investigations, and scripts for downloading and parsing data from NCBI BioSample and BioProject into the database. The software provides a user interface to query metadata and obtain standardized results in an exportable, tab-delimited format. To visually summarize results, the user interface provides a 2D histogram for user-selected metadata types and mapping of geolocated entries. The software is built on the LabKey data platform, an open-source data management platform, which enables developers to add functionalities. We demonstrate the use of the software in querying for a pathogen serovar and for genome sequence identifiers. This software enables users to create a local database for pathogen metadata, populate it with data from NCBI, easily query the data, and obtain visual summaries. Some of the components, such as the database, are modular and can be incorporated into other data platforms. The source code is freely available for download at https://github.com/wchangmitre/bioattribution .
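    The query-and-summarize idea described above can be sketched in a few lines; SQLite is used here as a lightweight stand-in for the platform's PostgreSQL database, and the table layout, column names, and sample rows are assumptions, not the platform's actual schema.

```python
# Sketch: store pathogen sample metadata and summarize counts by country and year,
# the kind of result the platform renders as a 2D histogram and a map.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE biosample (
                    accession TEXT PRIMARY KEY,
                    organism TEXT, serovar TEXT,
                    collection_year INTEGER, country TEXT)""")
conn.executemany("INSERT INTO biosample VALUES (?, ?, ?, ?, ?)", [
    ("SAMN001", "Salmonella enterica", "Typhimurium", 2014, "USA"),
    ("SAMN002", "Salmonella enterica", "Enteritidis", 2015, "UK"),
    ("SAMN003", "Salmonella enterica", "Typhimurium", 2015, "USA"),
])

rows = conn.execute("""SELECT country, collection_year, COUNT(*)
                       FROM biosample WHERE serovar = ?
                       GROUP BY country, collection_year""", ("Typhimurium",))
for country, year, n in rows:
    print(country, year, n)
```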

  17. ADMS State of the Industry and Gap Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agalgaonkar, Yashodhan P.; Marinovici, Maria C.; Vadari, Subramanian V.

    2016-03-31

    An advanced distribution management system (ADMS) is a platform for optimized distribution system operational management. This platform comprises distribution management system (DMS) applications, supervisory control and data acquisition (SCADA), an outage management system (OMS), and a distributed energy resource management system (DERMS). One of the primary objectives of this work is to study and analyze several ADMS component and auxiliary systems. All the important component and auxiliary systems, SCADA, GISs, DMSs, AMRs/AMIs, OMSs, and DERMS, are discussed in this report. Their current generation technologies are analyzed, and their integration (or evolution) with an ADMS technology is discussed. An ADMS technology state of the art and gap analysis is also presented. Two technical gaps are observed. The integration challenge between the component operational systems is the single largest challenge for ADMS design and deployment. Another significant challenge concerns essential ADMS applications, for instance, fault location, isolation, and service restoration (FLISR), volt-var optimization (VVO), etc. There are relatively few ADMS application developers because the ADMS software platform is not open source. A third gap, while not technical in nature compared with the two above, is still important to consider. The data models currently residing in utility GIS systems are either incomplete, inaccurate, or both. This data is essential for planning and operations because it is typically one of the primary sources from which power system models are created. To achieve the full potential of ADMS, the ability to execute an accurate power flow solution is an important prerequisite. These critical gaps are hindering wider utility adoption of ADMS technology. The development of an open architecture platform can eliminate many of these barriers and also aid seamless integration of distribution utility legacy systems with an ADMS.

  18. Architecture of a platform for hardware-in-the-loop simulation of flying vehicle control systems

    NASA Astrophysics Data System (ADS)

    Belokon', S. A.; Zolotukhin, Yu. N.; Filippov, M. N.

    2017-07-01

    A hardware-software platform is presented, which is designed for the development and hardware-in-the-loop simulation of flying vehicle control systems. This platform ensures the construction of the mathematical model of the plant, development of algorithms and software for onboard radioelectronic equipment and ground control station, and visualization of the three-dimensional model of the vehicle and external environment of the cockpit in the simulator training mode.

  19. Novel Platform for MRI-Guided Convection-Enhanced Delivery of Therapeutics: Preclinical Validation in Nonhuman Primate Brain

    PubMed Central

    Richardson, R. Mark; Kells, Adrian P.; Martin, Alastair J.; Larson, Paul S.; Starr, Philip A.; Piferi, Peter G.; Bates, Geoffrey; Tansey, Lisa; Rosenbluth, Kathryn H.; Bringas, John R.; Berger, Mitchel S.; Bankiewicz, Krystof S.

    2011-01-01

    Background/Aims A skull-mounted aiming device and integrated software platform has been developed for MRI-guided neurological interventions. In anticipation of upcoming gene therapy clinical trials, we adapted this device for real-time convection-enhanced delivery of therapeutics via a custom-designed infusion cannula. The targeting accuracy of this delivery system and the performance of the infusion cannula were validated in nonhuman primates. Methods Infusions of gadoteridol were delivered to multiple brain targets and the targeting error was determined for each cannula placement. Cannula performance was assessed by analyzing gadoteridol distributions and by histological analysis of tissue damage. Results The average targeting error for all targets (n = 11) was 0.8 mm (95% CI = 0.14). For clinically relevant volumes, the distribution volume of gadoteridol increased as a linear function (R2 = 0.97) of the infusion volume (average slope = 3.30, 95% CI = 0.2). No infusions in any target produced occlusion, cannula reflux or leakage from adjacent tracts, and no signs of unexpected tissue damage were observed. Conclusions This integrated delivery platform allows real-time convection-enhanced delivery to be performed with a high level of precision, predictability and safety. This approach may improve the success rate for clinical trials involving intracerebral drug delivery by direct infusion. PMID:21494065

  20. The Osseus platform: a prototype for advanced web-based distributed simulation

    NASA Astrophysics Data System (ADS)

    Franceschini, Derrick; Riecken, Mark

    2016-05-01

    Recent technological advances in web-based distributed computing and database technology have made possible a deeper and more transparent integration of some modeling and simulation applications. Despite these advances towards true integration of capabilities, disparate systems, architectures, and protocols will remain in the inventory for some time to come. These disparities present interoperability challenges for distributed modeling and simulation whether the application is training, experimentation, or analysis. Traditional approaches call for building gateways to bridge between disparate protocols and retaining interoperability specialists. Challenges in reconciling data models also persist. These challenges and their traditional mitigation approaches directly contribute to higher costs, schedule delays, and frustration for the end users. Osseus is a prototype software platform originally funded as a research project by the Defense Modeling & Simulation Coordination Office (DMSCO) to examine interoperability alternatives using modern, web-based technology and taking inspiration from the commercial sector. Osseus provides tools and services for nonexpert users to connect simulations, targeting the time and skillset needed to successfully connect disparate systems. The Osseus platform presents a web services interface to allow simulation applications to exchange data using modern techniques efficiently over Local or Wide Area Networks. Further, it provides Service Oriented Architecture capabilities such that finer granularity components such as individual models can contribute to simulation with minimal effort.

  1. Design and Development of a Technology Platform for DNA-Encoded Library Production and Affinity Selection.

    PubMed

    Castañón, Jesús; Román, José Pablo; Jessop, Theodore C; de Blas, Jesús; Haro, Rubén

    2018-06-01

    DNA-encoded libraries (DELs) have emerged as an efficient and cost-effective drug discovery tool for the exploration and screening of very large chemical space using small-molecule collections of unprecedented size. Herein, we report an integrated automation and informatics system designed to enhance the quality, efficiency, and throughput of the production and affinity selection of these libraries. The platform is governed by software developed according to a database-centric architecture to ensure data consistency, integrity, and availability. Through its versatile protocol management functionalities, this application captures the wide diversity of experimental processes involved with DEL technology, keeps track of working protocols in the database, and uses them to command robotic liquid handlers for the synthesis of libraries. This approach provides full traceability of building-blocks and DNA tags in each split-and-pool cycle. Affinity selection experiments and high-throughput sequencing reads are also captured in the database, and the results are automatically deconvoluted and visualized in customizable representations. Researchers can compare results of different experiments and use machine learning methods to discover patterns in data. As of this writing, the platform has been validated through the generation and affinity selection of various libraries, and it has become the cornerstone of the DEL production effort at Lilly.
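    One way to picture the deconvolution step mentioned above is as a tag lookup and counting exercise: each sequencing read carries the DNA tag appended in each split-and-pool cycle, so reads map back to building-block combinations whose counts indicate enrichment. The sketch below is purely illustrative and is not Lilly's pipeline; the tag sequences, read layout, and example reads are hypothetical.

```python
# Illustrative deconvolution of sequencing reads into building-block combinations.
from collections import Counter

TAGS_CYCLE1 = {"ACGT": "BB1-01", "TTAG": "BB1-02"}   # hypothetical cycle-1 tags
TAGS_CYCLE2 = {"GGCA": "BB2-01", "CATC": "BB2-02"}   # hypothetical cycle-2 tags

def deconvolute(reads):
    """Map each read (cycle-1 tag + cycle-2 tag) to a building-block pair and count."""
    counts = Counter()
    for read in reads:
        bb1 = TAGS_CYCLE1.get(read[0:4])
        bb2 = TAGS_CYCLE2.get(read[4:8])
        if bb1 and bb2:                      # skip reads with unrecognized tags
            counts[(bb1, bb2)] += 1
    return counts

reads = ["ACGTGGCA", "ACGTGGCA", "TTAGCATC", "ACGTCATC", "NNNNGGCA"]
for combo, n in deconvolute(reads).most_common():
    print(combo, n)
```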

  2. Smartphone-Based Food Diagnostic Technologies: A Review.

    PubMed

    Rateni, Giovanni; Dario, Paolo; Cavallo, Filippo

    2017-06-20

    A new generation of mobile sensing approaches offers significant advantages over traditional platforms in terms of test speed, control, low cost, ease-of-operation, and data management, and requires minimal equipment and user involvement. The marriage of novel sensing technologies with cellphones enables the development of powerful lab-on-smartphone platforms for many important applications including medical diagnosis, environmental monitoring, and food safety analysis. This paper reviews the recent advancements and developments in the field of smartphone-based food diagnostic technologies, with an emphasis on custom modules to enhance smartphone sensing capabilities. These devices typically comprise multiple components such as detectors, sample processors, disposable chips, batteries and software, which are integrated with a commercial smartphone. One of the most important aspects of developing these systems is the integration of these components onto a compact and lightweight platform that requires minimal power. To date, researchers have demonstrated several promising approaches employing various sensing techniques and device configurations. We aim to provide a systematic classification according to the detection strategy, providing a critical discussion of strengths and weaknesses. We have also extended the analysis to the food scanning devices that are increasingly populating the Internet of Things (IoT) market, demonstrating how this field is indeed promising, as the research outputs are quickly capitalized on by new start-up companies.

  3. Smartphone-Based Food Diagnostic Technologies: A Review

    PubMed Central

    Rateni, Giovanni; Dario, Paolo; Cavallo, Filippo

    2017-01-01

    A new generation of mobile sensing approaches offers significant advantages over traditional platforms in terms of test speed, control, low cost, ease-of-operation, and data management, and requires minimal equipment and user involvement. The marriage of novel sensing technologies with cellphones enables the development of powerful lab-on-smartphone platforms for many important applications including medical diagnosis, environmental monitoring, and food safety analysis. This paper reviews the recent advancements and developments in the field of smartphone-based food diagnostic technologies, with an emphasis on custom modules to enhance smartphone sensing capabilities. These devices typically comprise multiple components such as detectors, sample processors, disposable chips, batteries and software, which are integrated with a commercial smartphone. One of the most important aspects of developing these systems is the integration of these components onto a compact and lightweight platform that requires minimal power. To date, researchers have demonstrated several promising approaches employing various sensing techniques and device configurations. We aim to provide a systematic classification according to the detection strategy, providing a critical discussion of strengths and weaknesses. We have also extended the analysis to the food scanning devices that are increasingly populating the Internet of Things (IoT) market, demonstrating how this field is indeed promising, as the research outputs are quickly capitalized on by new start-up companies. PMID:28632188

  4. Advances in the TRIDEC Cloud

    NASA Astrophysics Data System (ADS)

    Hammitzsch, Martin; Spazier, Johannes; Reißland, Sven

    2016-04-01

    The TRIDEC Cloud is a platform that merges several complementary cloud-based services for instant tsunami propagation calculations and automated background computation with graphics processing units (GPU), for web-mapping of hazard specific geospatial data, and for serving relevant functionality to handle, share, and communicate threat specific information in a collaborative and distributed environment. The platform offers a modern web-based graphical user interface so that operators in warning centres and stakeholders of other involved parties (e.g. CPAs, ministries) just need a standard web browser to access a full-fledged early warning and information system with unique interactive features such as Cloud Messages and Shared Maps. Furthermore, the TRIDEC Cloud can be accessed in different modes, e.g. the monitoring mode, which provides important functionality required to act in a real event, and the exercise-and-training mode, which enables training and exercises with virtual scenarios re-played by a scenario player. The software system architecture and open interfaces facilitate global coverage so that the system is applicable for any region in the world and allow the integration of different sensor systems as well as the integration of other hazard types and use cases different to tsunami early warning. Current advances of the TRIDEC Cloud platform will be summarized in this presentation.

  5. Using Distributed Data over HBase in Big Data Analytics Platform for Clinical Services

    PubMed Central

    Zamani, Hamid

    2017-01-01

    Big data analytics (BDA) is important to reduce healthcare costs. However, there are many challenges of data aggregation, maintenance, integration, translation, analysis, and security/privacy. The study objective, to establish an interactive BDA platform with simulated patient data using open-source software technologies, was achieved by constructing a platform framework with the Hadoop Distributed File System (HDFS) using HBase (a key-value NoSQL database). Distributed data structures were generated from benchmarked hospital-specific metadata of nine billion patient records. At optimized iteration, HDFS ingestion of HFiles to HBase store files showed sustained availability over hundreds of iterations; however, completing the MapReduce load to HBase required a week (for 10 TB) and a month for three billion (30 TB) indexed patient records, respectively. Inconsistencies found in MapReduce limited the capacity to generate and replicate data efficiently. Apache Spark and Drill showed high performance with high usability for technical support but poor usability for clinical services. Basing the hospital system on patient-centric data was challenging in HBase, as not all data profiles were fully integrated with the complex patient-to-hospital relationships. However, we recommend using HBase to achieve secured patient data while querying entire hospital volumes in a simplified clinical event model across clinical services. PMID:29375652

  6. Using Distributed Data over HBase in Big Data Analytics Platform for Clinical Services.

    PubMed

    Chrimes, Dillon; Zamani, Hamid

    2017-01-01

    Big data analytics (BDA) is important to reduce healthcare costs. However, there are many challenges of data aggregation, maintenance, integration, translation, analysis, and security/privacy. The study objective, to establish an interactive BDA platform with simulated patient data using open-source software technologies, was achieved by constructing a platform framework with the Hadoop Distributed File System (HDFS) using HBase (a key-value NoSQL database). Distributed data structures were generated from benchmarked hospital-specific metadata of nine billion patient records. At optimized iteration, HDFS ingestion of HFiles to HBase store files showed sustained availability over hundreds of iterations; however, completing the MapReduce load to HBase required a week (for 10 TB) and a month for three billion (30 TB) indexed patient records, respectively. Inconsistencies found in MapReduce limited the capacity to generate and replicate data efficiently. Apache Spark and Drill showed high performance with high usability for technical support but poor usability for clinical services. Basing the hospital system on patient-centric data was challenging in HBase, as not all data profiles were fully integrated with the complex patient-to-hospital relationships. However, we recommend using HBase to achieve secured patient data while querying entire hospital volumes in a simplified clinical event model across clinical services.
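    As an illustration of the HBase access pattern behind a simplified clinical event model, the sketch below writes and scans patient event rows with happybase, a Python client for HBase's Thrift interface. The table name, column family, row-key scheme, and the assumption that an HBase Thrift server is running on localhost are ours, not details from the study.

```python
# Sketch: per-patient event rows in HBase via happybase (requires a Thrift server).
import happybase

connection = happybase.Connection("localhost")        # assumed Thrift endpoint
table = connection.table(b"patient_events")           # assumed table name

# Row key = patient id + event date keeps one patient's events contiguous on disk.
table.put(b"patient042#20170301", {
    b"event:type": b"admission",
    b"event:unit": b"cardiology",
})

# Retrieve all events for one patient with a prefix scan on the row key.
for key, data in table.scan(row_prefix=b"patient042#"):
    print(key, data)
```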

  7. Cloud Based Earth Observation Data Exploitation Platforms

    NASA Astrophysics Data System (ADS)

    Romeo, A.; Pinto, S.; Loekken, S.; Marin, A.

    2017-12-01

    In the last few years data produced daily by several private and public Earth Observation (EO) satellites reached the order of tens of Terabytes, representing for scientists and commercial application developers both a big opportunity for their exploitation and a challenge for their management. New IT technologies, such as Big Data and cloud computing, enable the creation of web-accessible data exploitation platforms, which offer to scientists and application developers the means to access and use EO data in a quick and cost effective way. RHEA Group is particularly active in this sector, supporting the European Space Agency (ESA) in the Exploitation Platforms (EP) initiative, developing technology to build multi cloud platforms for the processing and analysis of Earth Observation data, and collaborating with larger European initiatives such as the European Plate Observing System (EPOS) and the European Open Science Cloud (EOSC). An EP is a virtual workspace, providing a user community with access to (i) large volume of data, (ii) algorithm development and integration environment, (iii) processing software and services (e.g. toolboxes, visualization routines), (iv) computing resources, (v) collaboration tools (e.g. forums, wiki, etc.). When an EP is dedicated to a specific Theme, it becomes a Thematic Exploitation Platform (TEP). Currently, ESA has seven TEPs in a pre-operational phase dedicated to geo-hazards monitoring and prevention, costal zones, forestry areas, hydrology, polar regions, urban areas and food security. On the technology development side, solutions like the multi cloud EO data processing platform provides the technology to integrate ICT resources and EO data from different vendors in a single platform. In particular it offers (i) Multi-cloud data discovery, (ii) Multi-cloud data management and access and (iii) Multi-cloud application deployment. This platform has been demonstrated with the EGI Federated Cloud, Innovation Platform Testbed Poland and the Amazon Web Services cloud. This work will present an overview of the TEPs and the multi-cloud EO data processing platform, and discuss their main achievements and their impacts in the context of distributed Research Infrastructures such as EPOS and EOSC.

  8. Methodology for automating software systems. Task 1 of the foundations for automating software systems

    NASA Technical Reports Server (NTRS)

    Moseley, Warren

    1989-01-01

    The early stages of a research program designed to establish an experimental research platform for software engineering are described. Major emphasis is placed on Computer Assisted Software Engineering (CASE). The Poor Man's CASE Tool is based on the Apple Macintosh system, employing available software including Focal Point II, HyperCard, XRefText, and MacProject. These programs are functional in themselves, but through advanced linking are available for operation from within the tool being developed. The research platform is intended to merge software engineering technology with artificial intelligence (AI). In the first prototype of the PMCT, however, the AI sections are not included. CASE tools assist the software engineer in planning goals, routes to those goals, and ways to measure progress. The method described allows software to be synthesized instead of being written or built.

  9. FREEWAT: an HORIZON 2020 project to build open source tools for water management.

    NASA Astrophysics Data System (ADS)

    Rossetto, Rudy; Borsi, Iacopo; Foglia, Laura

    2015-04-01

    FREEWAT is an HORIZON 2020 project financed by the EU Commission under the call WATER INNOVATION: BOOSTING ITS VALUE FOR EUROPE. FREEWAT main result will be an open source and public domain GIS integrated modelling environment for the simulation of water quantity and quality in surface water and groundwater with an integrated water management and planning module. FREEWAT aims at promoting water resource management by simplifying the application of the Water Framework Directive and other EU water related Directives. Specific objectives of the FREEWAT project are: to coordinate previous EU and national funded research to integrate existing software modules for water management in a single environment into the GIS based FREEWAT and to support the FREEWAT application in an innovative participatory approach gathering technical staff and relevant stakeholders (in primis policy and decision makers) in designing scenarios for the proper application of water policies. The open source characteristics of the platform allow to consider this an initiative "ad includendum" (looking for inclusion of other entities), as further research institutions, private developers etc. may contribute to the platform development. The core of the FREEWAT platform will be the SID&GRID framework in its version ported to the QGIS desktop. SID&GRID (GIS integrated physically-based distributed numerical hydrological model based on a modified version of MODFLOW 2005; Rossetto et al. 2013) is an open source and public domain modelling platform firstly developed within the EU-POR FSE 2007-2013 Regione Toscana - Italy and then ported to the QGIS desktop through a dedicated fund by Regione Toscana. SID&GRID will be complemented by June 2015 with solute transport (also density dependent) capabilities in aquifers within the MARSOL (2014) EU FPVII project. Activities will be mainly carried out on two branches: (i) integration of modules, so that the software will fit the end-users requirements, including tools for better producing feasibility and management plans; (ii) a set of activities devoted to fix bugs and to provide a well-integrated interface for the different tools implemented. Further capabilities to be integrated are: - a dedicated module for water management and planning that will help to manage and aggregate all the distributed data coming from the simulation scenarios; - a whole module for calibration, uncertainty and sensitivity analysis; - a module for solute transport in the unsaturated zone; - a module for crop growth and water requirements in agriculture; - tools for dealing with groundwater quality issues; - tools for the analysis, interpretation and visualization of hydrogeological data. Through creating a common environment among water research/professionals, policy makers and implementers, FREEWAT main impact will be on enhancing science- and participatory approach and evidence-based decision making in water resource management, hence producing relevant and appropriate outcomes for policy implementation. The Consortium is constituted by partners from various water sectors from 10 EU countries, plus Turkey and Ukraine. Synergies with the UNESCO HOPE initiative on free and open source software in water management greatly boost the value of the project. Large stakeholders involvement is thought to guarantee results dissemination and exploitation. 
Acknowledgements This paper is presented within the framework of the project FREEWAT, which has received funding from the European Union's Horizon 2020 research and innovation programme under Grant Agreement n. 642224. References MARSOL (2014). Demonstrating Managed Aquifer Recharge as a Solution to Water Scarcity and Drought www.marsol.eu [accessed 4 January 2015] Rossetto, R., Borsi, I., Schifani, C., Bonari, E., Mogorovich P. & Primicerio M. (2013) - SID&GRID: integrating hydrological modeling in GIS environment hydroinformatics system for the management of the water resource. Rendiconti Online Societa Geologica Italiana Volume 24, 282-283
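
    As a purely illustrative sketch of the kind of scripted MODFLOW-2005 modelling that such a platform wraps (FloPy is not part of FREEWAT or SID&GRID, and the grid size, parameter values and mf2005 executable name below are assumptions), a minimal steady-state groundwater model can be assembled with the open source FloPy library:

        # Illustrative only: a minimal MODFLOW-2005 model scripted with FloPy.
        # Not FREEWAT/SID&GRID code; names, grid sizes and parameters are arbitrary.
        import numpy as np
        import flopy

        m = flopy.modflow.Modflow(modelname="demo", exe_name="mf2005")

        # 1 layer, 10 x 10 cells of 100 m; aquifer top at 10 m, bottom at 0 m
        dis = flopy.modflow.ModflowDis(m, nlay=1, nrow=10, ncol=10,
                                       delr=100.0, delc=100.0, top=10.0, botm=0.0)

        ibound = np.ones((1, 10, 10), dtype=int)
        ibound[:, :, 0] = -1                      # constant-head boundary on the west edge
        bas = flopy.modflow.ModflowBas(m, ibound=ibound, strt=10.0)
        lpf = flopy.modflow.ModflowLpf(m, hk=10.0)                   # hydraulic conductivity [m/d]
        wel = flopy.modflow.ModflowWel(m, stress_period_data={0: [[0, 5, 5, -500.0]]})
        pcg = flopy.modflow.ModflowPcg(m)                            # solver
        oc = flopy.modflow.ModflowOc(m)                              # output control

        m.write_input()
        success, _ = m.run_model()                # requires the mf2005 binary on PATH
        print("converged:", success)

    A GIS front end such as the QGIS-integrated environment described above, in essence, builds such model input from spatial layers instead of hand-written scripts.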

  10. An integrated movement capture and control platform applied towards autonomous movements of surgical robots.

    PubMed

    Daluja, Sachin; Golenberg, Lavie; Cao, Alex; Pandya, Abhilash K; Auner, Gregory W; Klein, Michael D

    2009-01-01

    Robotic surgery has gradually gained acceptance due to its numerous advantages such as tremor filtration, increased dexterity and motion scaling. There remains, however, significant scope for improvement, especially in the areas of the surgeon-robot interface and autonomous procedures. Previous studies have attempted to identify factors affecting a surgeon's performance in a master-slave robotic system by tracking hand movements. These studies relied on conventional optical or magnetic tracking systems, making their use impracticable in the operating room. This study concentrated on building an intrinsic movement capture platform using microcontroller-based hardware wired to a surgical robot. Software was developed to enable tracking and analysis of hand movements while surgical tasks were performed. Movement capture was applied towards automated movements of the robotic instruments. By emulating control signals, recorded surgical movements were replayed by the robot's end-effectors. Though this work uses a surgical robot as the platform, the ideas and concepts put forward are applicable to telerobotic systems in general.
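
    The record-and-replay idea is generic and can be sketched independently of the study's hardware. In the outline below, the robot interface functions (read_joint_positions, send_joint_command) are hypothetical placeholders, not the microcontroller interface used by the authors:

        # Minimal record/replay sketch for timestamped joint positions.
        # The robot interface callables are hypothetical placeholders.
        import time

        def record(read_joint_positions, duration_s=10.0, rate_hz=100.0):
            """Sample joint positions at a fixed rate; return (t, positions) pairs."""
            samples, t0, period = [], time.time(), 1.0 / rate_hz
            while time.time() - t0 < duration_s:
                samples.append((time.time() - t0, read_joint_positions()))
                time.sleep(period)
            return samples

        def replay(samples, send_joint_command):
            """Stream recorded positions back, respecting the original timing."""
            t0 = time.time()
            for t, q in samples:
                delay = t - (time.time() - t0)
                if delay > 0:
                    time.sleep(delay)
                send_joint_command(q)   # emulates the originally recorded control signals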

  11. Research on the error model of airborne celestial/inertial integrated navigation system

    NASA Astrophysics Data System (ADS)

    Zheng, Xiaoqiang; Deng, Xiaoguo; Yang, Xiaoxu; Dong, Qiang

    2015-02-01

    The celestial navigation subsystem of an airborne celestial/inertial integrated navigation system periodically corrects the positioning error and heading drift of the inertial navigation system, so that the inertial navigation system can greatly improve the accuracy of long-endurance navigation. The navigation accuracy of the airborne celestial navigation subsystem therefore directly determines the accuracy of the integrated navigation system when it operates for a long time. By building a mathematical model of the airborne celestial navigation system based on the inertial navigation system and using linear coordinate transformations, we establish the error transfer equation for the positioning algorithm of the airborne celestial system, and on this basis we build the positioning error model of the celestial navigation. Based on this positioning error model, we then analyze and simulate, using MATLAB, the positioning errors caused by errors of the star tracking platform. Finally, the positioning error model is verified against star observations obtained from an optical measurement device on a test range and from a device whose location is known. The analysis and simulation results show that the level accuracy and north accuracy of the tracking platform are key factors limiting the positioning accuracy of airborne celestial navigation systems, and that the positioning error has an approximately linear relationship with the level error and north error of the tracking platform. The verification errors are within 1000 m, which shows that the model is correct.
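
    The approximately linear relationship reported above follows from small-angle geometry: tilting the local-level (star tracking) reference by an angle epsilon shifts the deduced position on the Earth's surface by roughly R_E * epsilon, the same rule of thumb by which one arcminute of altitude error corresponds to about one nautical mile. The snippet below illustrates only this back-of-the-envelope relation; it is not the paper's MATLAB error model.

        # Back-of-the-envelope check: position error from a small platform tilt.
        import math

        R_EARTH_M = 6_371_000.0  # mean Earth radius in metres

        def position_error_m(tilt_arcsec: float) -> float:
            """Approximate ground position error from a level/north error of the
            star-tracking platform, assuming small angles."""
            tilt_rad = math.radians(tilt_arcsec / 3600.0)
            return R_EARTH_M * tilt_rad

        for arcsec in (5, 10, 20, 30):
            print(f"{arcsec:>3}\" tilt -> ~{position_error_m(arcsec):7.0f} m")
        # A 30 arcsecond tilt corresponds to roughly 900 m, consistent with
        # verification errors on the order of 1000 m.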

  12. PC-based PLCs and Ethernet-based fieldbus: the new standard platform for future VLT instrument control

    NASA Astrophysics Data System (ADS)

    Kiekebusch, Mario J.; Lucuix, Christian; Erm, Toomas M.; Chiozzi, Gianluca; Zamparelli, Michele; Kern, Lothar; Brast, Roland; Pirani, Werther; Reiss, Roland; Popovic, Dan; Knudstrup, Jens; Duchateau, Michel; Sandrock, Stefan; Di Lieto, Nicola

    2014-07-01

    ESO is currently in the final phase of the standardization process for PC-based Programmable Logic Controllers (PLCs) as the new platform for the development of control systems for future VLT/VLTI instruments. The standard solution used until now consists of a Local Control Unit (LCU), a VME-based system with a CPU and commercial and proprietary boards. This system includes several layers of software and many thousands of lines of code developed and maintained in house. LCUs have been used for several years as the interface to control instrument functions, but they are now being replaced by commercial off-the-shelf (COTS) systems based on BECKHOFF Embedded PCs and the EtherCAT fieldbus. ESO is working on the completion of the software framework that enables seamless integration into the VLT control system, in order to be ready to support upcoming instruments like ESPRESSO and ERIS, which will be the first fully VLT-compliant instruments using the new standard. The technology evaluation and standardization process has been a long, combined effort of various engineering disciplines such as electronics, control and software, working together to define a solution that meets the requirements and minimizes the impact on observatory operations and maintenance. This paper presents the challenges of the standardization process and the steps involved in such a change. It provides a technical overview of how industrial standards such as EtherCAT, OPC-UA, PLCopen MC and TwinCAT can be used to replace LCU features in areas such as software engineering and programming languages, motion control, time synchronization and astronomical tracking.
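
    As an illustration of the OPC-UA access path mentioned above (this is not ESO's VLT software framework), a variable exposed by a Beckhoff/TwinCAT OPC-UA server can be read from Python with the open source python-opcua package; the endpoint URL and node identifier below are placeholders:

        # Illustrative OPC-UA read of a PLC variable (not ESO's VLT framework).
        # Endpoint and node id are placeholders for a TwinCAT OPC-UA server.
        from opcua import Client

        client = Client("opc.tcp://plc.example.org:4840")
        try:
            client.connect()
            node = client.get_node("ns=4;s=MAIN.fAxisPosition")  # hypothetical PLC symbol
            print("axis position:", node.get_value())
        finally:
            client.disconnect()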

  13. PLUME and research software

    NASA Astrophysics Data System (ADS)

    Baudin, Veronique; Gomez-Diaz, Teresa

    2013-04-01

    The PLUME open platform (https://www.projet-plume.org) has as its first goal to share competences and to leverage the knowledge of software experts within the French higher education and research communities. The platform gives access to more than 380 index cards describing useful and economical software for this community, with open access for everybody. The second goal of PLUME is to improve the visibility of software produced by research laboratories within the higher education and research communities. The "development-ESR" index cards briefly describe the main features of the software, including references to the research publications associated with it. The platform counts more than 300 cards describing research software, 89 of which have an English version. In this talk we describe the theme classification and the taxonomy of the index cards, and their evolution as new themes are added to the project. We also focus on the organisation of PLUME as an open project and its interest in promoting free/open source software from and for research, contributing to the creation of a community of shared knowledge.

  14. A wellness software platform with smart wearable devices and the demonstration report for personal wellness management

    NASA Astrophysics Data System (ADS)

    Kang, Won-Seok; Son, Chang-Sik; Lee, Sangho; Choi, Rock-Hyun; Ha, Yeong-Mi

    2017-07-01

    In this paper, we introduce a wellness software platform called WellnessHumanCare. It is a semi-automatic wellness management software platform that provides complex wellness data acquisition (mental, physical and environmental) with smart wearable devices, complex wellness condition analysis, privacy-aware online/offline recommendations, real-time monitoring apps (smartphone-based and Web-based), and so on. To show the efficiency of WellnessHumanCare, we demonstrated a wellness management service over 3 months with 79 participants, an experimental group of 39 people working at H Corp. and a control group of 40 people working at K Corp., Korea.

  15. Raven-II: an open platform for surgical robotics research.

    PubMed

    Hannaford, Blake; Rosen, Jacob; Friedman, Diana W; King, Hawkeye; Roan, Phillip; Cheng, Lei; Glozman, Daniel; Ma, Ji; Kosari, Sina Nia; White, Lee

    2013-04-01

    The Raven-II is a platform for collaborative research on advances in surgical robotics. Seven universities have begun research using this platform. The Raven-II system has two 3-DOF spherical positioning mechanisms capable of attaching interchangeable four-DOF instruments. The Raven-II software is based on open standards such as Linux and ROS to maximally facilitate software development. The mechanism is robust enough for repeated experiments and animal surgery experiments, but is not engineered to sufficient safety standards for human use. Mechanisms in place for interaction among the user community and dissemination of results include an electronic forum, an online software SVN repository, and meetings and workshops at major robotics conferences.
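
    Because the Raven-II software stack is built on ROS, external tools typically interact with it through topics. The sketch below is a generic ROS 1 (rospy) subscriber for joint states; the topic name is an assumption for illustration, not the Raven-II's documented interface:

        # Generic ROS 1 subscriber sketch; '/joint_states' is an assumed topic name,
        # not necessarily the Raven-II's published interface.
        import rospy
        from sensor_msgs.msg import JointState

        def on_joint_state(msg):
            # Log the first few joint positions reported by the robot.
            rospy.loginfo("joints %s -> %s", msg.name[:3], msg.position[:3])

        if __name__ == "__main__":
            rospy.init_node("raven_joint_listener")
            rospy.Subscriber("/joint_states", JointState, on_joint_state)
            rospy.spin()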

  16. An integrated development framework for rapid development of platform-independent and reusable satellite on-board software

    NASA Astrophysics Data System (ADS)

    Ziemke, Claas; Kuwahara, Toshinori; Kossev, Ivan

    2011-09-01

    Even in the field of small satellites, the on-board data handling subsystem has become complex and powerful. With the introduction of powerful CPUs and the availability of considerable amounts of memory on board a small satellite, it has become possible to utilize the flexibility and power of contemporary platform-independent real-time operating systems. The non-commercial sector in particular, such as university institutes and community projects like AMSAT or SSETI, is characterized by an inherent lack of financial as well as manpower resources. The opportunity to utilize such real-time operating systems will contribute significantly to achieving a successful mission. Nevertheless, the on-board software of a satellite is much more than just an operating system. It has to fulfill a multitude of functional requirements, such as telecommand interpretation and execution, execution of control loops, generation of telemetry data and frames, failure detection, isolation and recovery, communication with peripherals, and so on. Most of the aforementioned tasks are of a generic nature and have to be conducted on any satellite with only minor modifications. A general set of functional requirements, as well as a protocol for communication, is defined in the ECSS-E-70-41A standard "Telemetry and telecommand packet utilization". This standard not only defines the communication protocol of the satellite-ground link but also defines a set of so-called services which have to be available on board every compliant satellite and which are of a generic nature. In this paper, a platform-independent and reusable framework is described which implements not only the ECSS-E-70-41A standard but also functionalities for interprocess communication, scheduling and a multitude of tasks commonly performed on board a satellite. By making use of the capabilities of the high-level programming language C/C++, the powerful open source library Boost, the real-time operating system RTEMS, and finally by providing generic functionalities compliant with the ECSS-E-70-41A standard, the proposed framework can provide a great boost in productivity. Together with open source tools such as the GNU toolchain, the Eclipse SDK, the simulation framework OpenSimKit and the emulator QEMU, the proposed on-board software framework forms an integrated development framework. It is possible to design, code and build the on-board software together with the operating system and then run it on a simulated satellite for performance analysis and debugging purposes. This makes it possible to rapidly develop and deploy a full-fledged satellite on-board software with minimal cost and in a limited time frame.
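
    The ECSS-E-70-41A services ride on CCSDS space packets, whose fixed 6-byte primary header can be decoded independently of any on-board framework. The sketch below unpacks that primary header as an illustration; the PUS data-field header that follows it (carrying service type and subtype) is mission-configurable and is deliberately not parsed here:

        # Decode the 6-byte CCSDS space packet primary header (illustration only;
        # the PUS secondary header layout is mission-configurable and not parsed).
        import struct

        def parse_primary_header(packet: bytes) -> dict:
            word0, seq, length = struct.unpack(">HHH", packet[:6])
            return {
                "version":      (word0 >> 13) & 0x7,
                "type":         (word0 >> 12) & 0x1,   # 0 = telemetry, 1 = telecommand
                "sec_hdr_flag": (word0 >> 11) & 0x1,
                "apid":          word0 & 0x7FF,
                "seq_flags":    (seq >> 14) & 0x3,
                "seq_count":     seq & 0x3FFF,
                "data_length":   length,               # data-field octets minus 1
            }

        # Example: telecommand with secondary header, APID 0x12, unsegmented,
        # sequence count 7, 9 octets of packet data.
        hdr = bytes([0x18, 0x12, 0xC0, 0x07, 0x00, 0x08])
        print(parse_primary_header(hdr))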

  17. Utilizing Technology for FCS Education: Selecting Appropriate Interactive Webinar Software

    ERIC Educational Resources Information Center

    Zoumenou, Virginie; Sigman-Grant, Madeleine; Coleman, Gayle; Malekian, Fatemeh; Zee, Julia M. K.; Fountain, Brent J.; Marsh, Akela

    2015-01-01

    The purpose of this research was to identify commonly used interactive webinar software platforms and to conduct a testing session on best practices related to an interactive webinar. The study employed the Adobe Connect and the Maestro Conference platforms. The 15 participants experienced five best practices: pre-work, polling, breakout room,…

  18. The Requirements and Design of the Rapid Prototyping Capabilities System

    NASA Astrophysics Data System (ADS)

    Haupt, T. A.; Moorhead, R.; O'Hara, C.; Anantharaj, V.

    2006-12-01

    The Rapid Prototyping Capabilities (RPC) system will provide the capability to rapidly evaluate innovative methods of linking science observations. To this end, the RPC will provide the capability to integrate the software components and tools needed to evaluate the use of a wide variety of current and future NASA sensors, numerical models, research results, model outputs, and knowledge, collectively referred to as "resources". It is assumed that the resources are geographically distributed, and thus the RPC will provide support for location transparency of the resources. The RPC system must provide support for: (1) discovery, semantic understanding, and secure access and transport mechanisms for data products available from the known data providers; (2) data assimilation and geo-processing tools for all data transformations needed to match given data products to the model input requirements; (3) model management, including catalogs of models and model metadata, and mechanisms for creating environments for model execution; and (4) tools for model output analysis and model benchmarking. The challenge involves developing a cyberinfrastructure: a coordinated aggregate of software, hardware and other technologies necessary to facilitate RPC experiments, as well as the human expertise needed to provide an integrated, "end-to-end" platform supporting the RPC objectives. Such aggregation is to be achieved through a horizontal integration of loosely coupled services. The cyberinfrastructure comprises several software layers. At the bottom, the Grid fabric encompasses network protocols, optical networks, computational resources, storage devices, and sensors. At the top, applications use workload managers to coordinate their access to physical resources. Applications are not tightly bound to a single physical resource. Instead, they bind dynamically to resources (i.e., they are provisioned) via a common grid infrastructure layer. For the RPC system, the cyberinfrastructure must support organizing computations (or "data transformations" in general) into complex workflows with resource discovery, automatic resource allocation, monitoring and provenance preservation, as well as aggregating heterogeneous, distributed data into knowledge databases. Such service orchestration is the responsibility of the "collective services" layer. For the RPC, this layer will be based on the Java Business Integration (JBI, [JSR-208]) specification, a standards-based integration platform that combines messaging, web services, data transformation, and intelligent routing to reliably connect and coordinate the interaction of significant numbers of diverse applications (plug-in components) across organizational boundaries. The JBI concept is a new approach to integration that can provide the underpinnings for a loosely coupled, highly distributed integration network able to scale beyond the limits of currently used hub-and-spoke brokers. This presentation discusses the requirements, design and early prototype of the NASA-sponsored RPC system under development at Mississippi State University, demonstrating the integration of data provisioning mechanisms, data transformation tools and computational models into a single interoperable system enabling rapid execution of RPC experiments.

  19. Development and Validation of an Interactive Internet Platform for Older People: The Healthy Ageing Through Internet Counselling in the Elderly Study.

    PubMed

    Jongstra, Susan; Beishuizen, Cathrien; Andrieu, Sandrine; Barbera, Mariagnese; van Dorp, Matthijs; van de Groep, Bram; Guillemont, Juliette; Mangialasche, Francesca; van Middelaar, Tessa; Moll van Charante, Eric; Soininen, Hilkka; Kivipelto, Miia; Richard, Edo

    2017-02-01

    A myriad of Web-based applications on self-management have been developed, but few focus on older people. In the face of global aging, older people form an important target population for cardiovascular prevention. This article describes the full development of an interactive Internet platform for older people, which was designed for the Healthy Ageing Through Internet Counselling in the Elderly (HATICE) study. We provide recommendations to design senior-friendly Web-based applications for a new approach to multicomponent cardiovascular prevention. The development of the platform followed five phases: (1) conceptual framework; (2) platform concept and functional design; (3) platform building (software and content); (4) testing and pilot study; and (5) final product. We performed a meta-analysis, reviewed guidelines for cardiovascular diseases, and consulted end users, experts, and software developers to create the platform concept and content. The software was built in iterative cycles. In the pilot study, 41 people aged ≥65 years used the platform for 8 weeks. Participants used the interactive features of the platform and appreciated the coach support. During all phases adjustments were made to incorporate all improvements from the previous phases. The final platform is a personal, secured, and interactive platform supported by a coach. When carefully designed, an interactive Internet platform is acceptable and feasible for use by older people with basic computer skills. To improve acceptability by older people, we recommend involving the end users in the process of development, to personalize the platform and to combine the application with human support. The interactive HATICE platform will be tested for efficacy in a multinational randomized controlled trial (ISRCTN48151589).

  20. A collection of open source applications for mass spectrometry data mining.

    PubMed

    Gallardo, Óscar; Ovelleiro, David; Gay, Marina; Carrascal, Montserrat; Abian, Joaquin

    2014-10-01

    We present several bioinformatics applications for the identification and quantification of phosphoproteome components by MS. These applications include a front-end graphical user interface that combines several Thermo RAW to MASCOT™ Generic Format extractors (EasierMgf), two graphical user interfaces for the OMSSA and SEQUEST search engines (OmssaGui and SequestGui), and three further applications: one for the management of databases in FASTA format (FastaTools), one for the integration of search results from up to three search engines (Integrator), and one for the visualization of mass spectra and their corresponding database search results (JsonVisor). These applications were developed to solve some of the common problems found in proteomic and phosphoproteomic data analysis and were integrated into the workflow for data processing and feeding of our LymPHOS database. The applications were designed modularly and can be used standalone. These tools are written in the Perl and Python programming languages and are supported on Windows platforms. They are all released under an Open Source Software license and can be freely downloaded from our software repository hosted at GoogleCode. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
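
    The MASCOT Generic Format produced by such extractors is plain text: each spectrum sits between BEGIN IONS and END IONS lines, with KEY=value metadata followed by m/z-intensity pairs. A minimal reader along those lines (an illustration, not one of the published tools) might look like this:

        # Minimal MGF (MASCOT Generic Format) reader: one dict per spectrum.
        # Illustrative only; not one of the tools described in the abstract.
        def read_mgf(path):
            spectra, current = [], None
            with open(path) as fh:
                for line in fh:
                    line = line.strip()
                    if line == "BEGIN IONS":
                        current = {"params": {}, "peaks": []}
                    elif line == "END IONS":
                        spectra.append(current)
                        current = None
                    elif current is not None and line:
                        if "=" in line and not line[0].isdigit():
                            key, _, value = line.partition("=")
                            current["params"][key] = value        # e.g. TITLE, PEPMASS, CHARGE
                        else:
                            mz, intensity = line.split()[:2]      # "m/z intensity" pair
                            current["peaks"].append((float(mz), float(intensity)))
            return spectra

        # Usage: spectra = read_mgf("sample.mgf"); print(spectra[0]["params"].get("PEPMASS"))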
