Sample records for computing platform acquisition

  1. Real-Time Compressive Sensing MRI Reconstruction Using GPU Computing and Split Bregman Methods

    PubMed Central

    Smith, David S.; Gore, John C.; Yankeelov, Thomas E.; Welch, E. Brian

    2012-01-01

    Compressive sensing (CS) has been shown to enable dramatic acceleration of MRI acquisition in some applications. Being an iterative reconstruction technique, CS MRI reconstructions can be more time-consuming than traditional inverse Fourier reconstruction. We have accelerated our CS MRI reconstruction by factors of up to 27 by using a split Bregman solver combined with a graphics processing unit (GPU) computing platform. The increases in speed we find are similar to those we measure for matrix multiplication on this platform, suggesting that the split Bregman methods parallelize efficiently. We demonstrate that the combination of the rapid convergence of the split Bregman algorithm and the massively parallel strategy of GPU computing can enable real-time CS reconstruction of even acquisition data matrices of dimension 4096² or more, depending on available GPU VRAM. Reconstruction of two-dimensional data matrices of dimension 1024² and smaller took ~0.3 s or less, showing that this platform also provides very fast iterative reconstruction for small-to-moderate size images. PMID:22481908

  2. Real-Time Compressive Sensing MRI Reconstruction Using GPU Computing and Split Bregman Methods.

    PubMed

    Smith, David S; Gore, John C; Yankeelov, Thomas E; Welch, E Brian

    2012-01-01

    Compressive sensing (CS) has been shown to enable dramatic acceleration of MRI acquisition in some applications. Being an iterative reconstruction technique, CS MRI reconstructions can be more time-consuming than traditional inverse Fourier reconstruction. We have accelerated our CS MRI reconstruction by factors of up to 27 by using a split Bregman solver combined with a graphics processing unit (GPU) computing platform. The increases in speed we find are similar to those we measure for matrix multiplication on this platform, suggesting that the split Bregman methods parallelize efficiently. We demonstrate that the combination of the rapid convergence of the split Bregman algorithm and the massively parallel strategy of GPU computing can enable real-time CS reconstruction of even acquisition data matrices of dimension 4096(2) or more, depending on available GPU VRAM. Reconstruction of two-dimensional data matrices of dimension 1024(2) and smaller took ~0.3 s or less, showing that this platform also provides very fast iterative reconstruction for small-to-moderate size images.
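
The split Bregman solver referenced in records 1 and 2 alternates a quadratic update with a cheap shrinkage (soft-thresholding) step, which is why it maps well onto GPUs. Below is a minimal NumPy sketch of the idea for anisotropic total-variation denoising with periodic boundaries; the parameters mu, lam and n_iter are illustrative, and a full CS MRI reconstruction would replace the denoising data term with an undersampled Fourier-domain fidelity term.

```python
import numpy as np

def shrink(x, gamma):
    # Soft-thresholding: closed-form minimizer of gamma*|d| + 0.5*(d - x)^2.
    return np.sign(x) * np.maximum(np.abs(x) - gamma, 0.0)

def split_bregman_tv_denoise(f, mu=10.0, lam=5.0, n_iter=30):
    """Anisotropic TV denoising, split Bregman style (periodic boundaries)."""
    H, W = f.shape
    ky = 2 * np.pi * np.fft.fftfreq(H)[:, None]
    kx = 2 * np.pi * np.fft.fftfreq(W)[None, :]
    # Fourier symbol of mu*I - lam*Laplacian; makes the u-update one FFT solve.
    denom = mu + lam * (4 - 2 * np.cos(kx) - 2 * np.cos(ky))
    grad_x = lambda v: np.roll(v, -1, axis=1) - v
    grad_y = lambda v: np.roll(v, -1, axis=0) - v
    grad_xT = lambda v: np.roll(v, 1, axis=1) - v   # adjoint of grad_x
    grad_yT = lambda v: np.roll(v, 1, axis=0) - v
    u = f.copy()
    dx, dy, bx, by = (np.zeros_like(f) for _ in range(4))
    for _ in range(n_iter):
        rhs = mu * f + lam * (grad_xT(dx - bx) + grad_yT(dy - by))
        u = np.real(np.fft.ifft2(np.fft.fft2(rhs) / denom))
        ux, uy = grad_x(u), grad_y(u)
        dx, dy = shrink(ux + bx, 1.0 / lam), shrink(uy + by, 1.0 / lam)
        bx, by = bx + ux - dx, by + uy - dy
    return u

# Example: denoise a noisy 256x256 test image.
img = np.random.rand(256, 256)
clean = split_bregman_tv_denoise(img)
```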

  3. Implementation of High Speed Distributed Data Acquisition System

    NASA Astrophysics Data System (ADS)

    Raju, Anju P.; Sekhar, Ambika

    2012-09-01

    This paper introduces a high speed distributed data acquisition system based on a field programmable gate array (FPGA). The aim is to develop a "distributed" data acquisition interface. The development of instruments such as personal computers and engineering workstations based on "standard" platforms is the motivation behind this effort. Using standard platforms as the controlling unit allows independence in hardware from a particular vendor and hardware platform. The distributed approach also has advantages from a functional point of view: acquisition resources become available to multiple instruments, and the acquisition front-end can be physically remote from the rest of the instrument. The high speed data acquisition system transmits data quickly to a remote computer system through an Ethernet interface. The data are acquired through 16 analog input channels. The inputs are multiplexed and digitized, and the data are then stored in a 1K buffer for each input channel. The main control unit in this design is a 16-bit processor implemented in the FPGA. This processor is used to set up and initialize the data source and the Ethernet controller, as well as control the flow of data from the memory element to the NIC. Using this processor, the different configuration registers in the Ethernet controller can be initialized and controlled in an easy manner. The data packets are then sent to the remote PC through the Ethernet interface. The main advantages of using an FPGA as the standard platform are its flexibility, low power consumption, short design duration, fast time to market, programmability and high density. The main advantages of using the AX88796 Ethernet controller over others are its non-PCI interface, the embedded SRAM in which the transmit and reception buffers are located, and its high-performance SRAM-like interface. The paper presents the implementation of the distributed data acquisition system on an FPGA in VHDL. The main advantages of this system are high accuracy, high speed, and real-time monitoring.
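
As a host-side illustration of the record above, the sketch below receives channel-tagged sample packets over Ethernet and demultiplexes them into per-channel buffers mirroring the FPGA's 1K-per-channel storage. The UDP transport and the packet layout (little-endian channel id, sample count, then 16-bit samples) are assumptions for illustration; the paper's AX88796-based design does not specify them.

```python
import socket
import struct
from collections import deque

NUM_CHANNELS = 16
BUF_DEPTH = 1024  # mirrors the 1K buffer per input channel in the FPGA design

def run_receiver(port=5000):
    # One ring buffer per analog input channel.
    buffers = [deque(maxlen=BUF_DEPTH) for _ in range(NUM_CHANNELS)]
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))
    while True:
        packet, _addr = sock.recvfrom(2048)
        # Hypothetical layout: uint16 channel id, uint16 sample count, samples.
        channel, count = struct.unpack_from("<HH", packet, 0)
        samples = struct.unpack_from(f"<{count}H", packet, 4)
        buffers[channel % NUM_CHANNELS].extend(samples)

if __name__ == "__main__":
    run_receiver()
```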

  4. Low Cost Electroencephalographic Acquisition Amplifier to serve as Teaching and Research Tool

    PubMed Central

    Jain, Ankit; Kim, Insoo; Gluckman, Bruce J.

    2012-01-01

    We describe the development and testing of a low cost, easily constructed electroencephalographic acquisition amplifier for noninvasive Brain Computer Interface (BCI) education and research. The acquisition amplifier is constructed from newly available off-the-shelf integrated circuit components, and readily sends a 24-bit data stream via USB bus to a computer platform. We demonstrate here the hardware’s use in the analysis of a visually evoked P300 paradigm for a choose one-of-eight task. This clearly shows the applicability of this system as a low cost teaching and research tool. PMID:22254699

  5. Raspberry Pi-powered imaging for plant phenotyping.

    PubMed

    Tovar, Jose C; Hoyer, J Steen; Lin, Andy; Tielking, Allison; Callen, Steven T; Elizabeth Castillo, S; Miller, Michael; Tessman, Monica; Fahlgren, Noah; Carrington, James C; Nusinow, Dmitri A; Gehan, Malia A

    2018-03-01

    Image-based phenomics is a powerful approach to capture and quantify plant diversity. However, commercial platforms that make consistent image acquisition easy are often cost-prohibitive. To make high-throughput phenotyping methods more accessible, low-cost microcomputers and cameras can be used to acquire plant image data. We used low-cost Raspberry Pi computers and cameras to manage and capture plant image data. Detailed here are three different applications of Raspberry Pi-controlled imaging platforms for seed and shoot imaging. Images obtained from each platform were suitable for extracting quantifiable plant traits (e.g., shape, area, height, color) en masse using open-source image processing software such as PlantCV. This protocol describes three low-cost platforms for image acquisition that are useful for quantifying plant diversity. When coupled with open-source image processing tools, these imaging platforms provide viable low-cost solutions for incorporating high-throughput phenomics into a wide range of research programs.
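
A minimal capture loop of the kind such Raspberry Pi imaging platforms typically run is sketched below, using the raspistill command-line tool from the legacy Raspberry Pi OS camera stack; the interval, shot count and output naming are illustrative rather than taken from the record's protocol.

```python
import datetime
import pathlib
import subprocess
import time

def capture_timelapse(out_dir="images", interval_s=600, n_shots=144):
    """Capture one image every interval_s seconds for time-lapse phenotyping."""
    out = pathlib.Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    for _ in range(n_shots):
        stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
        # raspistill ships with the legacy Raspberry Pi OS camera stack;
        # newer systems use libcamera-still with the same -o flag.
        subprocess.run(
            ["raspistill", "-o", str(out / f"plant-{stamp}.jpg")],
            check=True,
        )
        time.sleep(interval_s)

if __name__ == "__main__":
    capture_timelapse()
```

Images collected this way can then be batch-processed with tools such as PlantCV, as the record describes.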

  6. Development and operation of a real-time data acquisition system for the NASA-LaRC differential absorption lidar

    NASA Technical Reports Server (NTRS)

    Butler, C.

    1985-01-01

    Computer hardware and software of the NASA multipurpose differential absorption lidar (DIAL) system were improved. The NASA DIAL system is undergoing development and experimental deployment for remote measurement of atmospheric trace gas concentrations from ground and aircraft platforms. A viable DIAL system was developed with the capability of remotely measuring O3 and H2O concentrations from an aircraft platform. Test flights were successfully performed on board the NASA/Goddard Flight Center Electra aircraft from 1980 to 1984. Improvements on the DIAL data acquisition system (DAS) are described.

  7. 3D Data Acquisition Platform for Human Activity Understanding

    DTIC Science & Technology

    2016-03-02

    address fundamental research problems of representation and invariant description of 3D data, human motion modeling and applications of human activity analysis, and computational optimization of large-scale 3D data.

  9. An integrated compact airborne multispectral imaging system using embedded computer

    NASA Astrophysics Data System (ADS)

    Zhang, Yuedong; Wang, Li; Zhang, Xuguo

    2015-08-01

    An integrated compact airborne multispectral imaging system with an embedded-computer-based control system was developed for small-aircraft multispectral imaging applications. The multispectral imaging system integrates a CMOS camera, a filter wheel with eight filters, a two-axis stabilized platform, a miniature POS (position and orientation system) and an embedded computer. The embedded computer has excellent universality and expansibility, and its volume and weight suit an airborne platform, so it can meet the requirements of the control system of the integrated airborne multispectral imaging system. The embedded computer controls camera parameter settings, filter wheel and stabilized platform operation, and image and POS data acquisition, and stores the image and POS data. Peripheral devices can be connected through the ports of the embedded computer, so system operation and management of the stored image data are easy. This airborne multispectral imaging system has the advantages of small volume, multiple functions, and good expansibility. Imaging experiment results show that the system has potential for multispectral remote sensing in applications such as resource investigation and environmental monitoring.

  10. Theory and operation of the real-time data acquisition system for the NASA-LaRC differential absorption lidar (DIAL)

    NASA Technical Reports Server (NTRS)

    Butler, C.

    1986-01-01

    The improvement of computer hardware and software of the NASA Multipurpose Differential Absorption Lidar (DIAL) system is documented. The NASA DIAL system is undergoing development and experimental deployment at NASA Langley Research Center for the remote measurement of atmospheric trace gas concentrations from ground and aircraft platforms. A viable DIAL system was developed capable of remotely measuring O3 and H2O concentrations from an aircraft platform. Test flights of the DIAL system were successfully performed onboard the NASA Goddard Flight Center Electra aircraft from 1980 to 1985. The DIAL Data Acquisition System has undergone a number of improvements over the past few years. These improvements have now been field tested. The theory behind a real time computer system as it applies to the needs of the DIAL system is discussed. This report is designed to be used as an operational manual for the DIAL DAS.

  11. The Common Data Acquisition Platform in the Helmholtz Association

    NASA Astrophysics Data System (ADS)

    Kaever, P.; Balzer, M.; Kopmann, A.; Zimmer, M.; Rongen, H.

    2017-04-01

    Various centres of the German Helmholtz Association (HGF) started in 2012 to develop a modular data acquisition (DAQ) platform, covering the entire range from detector readout to data transfer into parallel computing environments. This platform integrates generic hardware components like the multi-purpose HGF-Advanced Mezzanine Card or a smart scientific camera framework, adding user value with Linux drivers and board support packages. Technically the scope comprises the DAQ chain from FPGA modules to computing servers, notably front-end electronics interfaces, microcontrollers and GPUs with their software, plus high-performance data transmission links. The core idea is a generic and component-based approach, enabling the implementation of specific experiment requirements with low effort. This so-called DTS platform will support standards like MTCA.4 in hardware and software to ensure compatibility with commercial components. Its capability to deploy on other crate standards or FPGA boards with PCI Express or Ethernet interfaces remains an essential feature. Competences of the participating centres are coordinated in order to provide a solid technological basis for both research topics in the Helmholtz Programme "Matter and Technology": "Detector Technology and Systems" and "Accelerator Research and Development". The DTS platform aims at reducing costs and development time and will ensure access to the latest technologies for the collaboration. Due to its flexible approach, it has the potential to be applied in other scientific programs.

  12. Development of Labview based data acquisition and multichannel analyzer software for radioactive particle tracking system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rahman, Nur Aira Abd, E-mail: nur-aira@nuclearmalaysia.gov.my; Yussup, Nolida; Ibrahim, Maslina Bt. Mohd

    2015-04-29

    A DAQ (data acquisition) software package called RPTv2.0 has been developed for the Radioactive Particle Tracking System at the Malaysian Nuclear Agency. RPTv2.0 features a scanning control GUI, data acquisition from a 12-channel counter via an RS-232 interface, and a multichannel analyzer (MCA). The software is fully developed on the National Instruments LabVIEW 8.6 platform. A Ludlum Model 4612 Counter is used to count the signals from the scintillation detectors, while a host computer is used to send control parameters, acquire and display data, and compute results. Each detector channel has independent high voltage control, threshold or sensitivity value, and window settings. The counter is configured with a host board and twelve slave boards. The host board collects the counts from each slave board and communicates with the computer via the RS-232 data interface.
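
For flavor, a host-side polling loop equivalent to what the LabVIEW application does over RS-232 might look like the pyserial sketch below. The port name, baud rate and the READ command string are placeholders, since the Ludlum 4612 command protocol is not given in the record.

```python
import serial  # pyserial

def poll_counts(port="/dev/ttyUSB0", n_channels=12):
    """Poll per-channel counts from a multi-channel counter over RS-232."""
    with serial.Serial(port, baudrate=9600, timeout=1.0) as ser:
        counts = []
        for channel in range(1, n_channels + 1):
            # Placeholder command; the real Ludlum 4612 protocol differs.
            ser.write(f"READ {channel}\r".encode())
            reply = ser.readline().decode(errors="replace").strip()
            counts.append(int(reply) if reply.isdigit() else None)
    return counts

if __name__ == "__main__":
    print(poll_counts())
```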

  13. TestDose: A nuclear medicine software based on Monte Carlo modeling for generating gamma camera acquisitions and dosimetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garcia, Marie-Paule, E-mail: marie-paule.garcia@univ-brest.fr; Villoing, Daphnée; McKay, Erin

    Purpose: The TestDose platform was developed to generate scintigraphic imaging protocols and associated dosimetry by Monte Carlo modeling. TestDose is part of a broader project (www.dositest.com) whose aim is to identify the biases induced by different clinical dosimetry protocols. Methods: The TestDose software allows handling the whole pipeline from virtual patient generation to resulting planar and SPECT images and dosimetry calculations. The originality of their approach relies on the implementation of functional segmentation for the anthropomorphic model representing a virtual patient. Two anthropomorphic models are currently available: 4D XCAT and ICRP 110. A pharmacokinetic model describes the biodistribution of a given radiopharmaceutical in each defined compartment at various time-points. The Monte Carlo simulation toolkit GATE offers the possibility to accurately simulate scintigraphic images and absorbed doses in volumes of interest. The TestDose platform relies on GATE to reproduce precisely any imaging protocol and to provide reference dosimetry. For image generation, TestDose stores user's imaging requirements and generates automatically command files used as input for GATE. Each compartment is simulated only once and the resulting output is weighted using pharmacokinetic data. Resulting compartment projections are aggregated to obtain the final image. For dosimetry computation, emission data are stored in the platform database and relevant GATE input files are generated for the virtual patient model and associated pharmacokinetics. Results: Two samples of software runs are given to demonstrate the potential of TestDose. A clinical imaging protocol for the Octreoscan™ therapeutical treatment was implemented using the 4D XCAT model. Whole-body "step and shoot" acquisitions at different times postinjection and one SPECT acquisition were generated within reasonable computation times. Based on the same Octreoscan™ kinetics, a dosimetry computation performed on the ICRP 110 model is also presented. Conclusions: The proposed platform offers a generic framework to implement any scintigraphic imaging protocols and voxel/organ-based dosimetry computation. Thanks to the modular nature of TestDose, other imaging modalities could be supported in the future such as positron emission tomography.

  14. TestDose: A nuclear medicine software based on Monte Carlo modeling for generating gamma camera acquisitions and dosimetry.

    PubMed

    Garcia, Marie-Paule; Villoing, Daphnée; McKay, Erin; Ferrer, Ludovic; Cremonesi, Marta; Botta, Francesca; Ferrari, Mahila; Bardiès, Manuel

    2015-12-01

    The TestDose platform was developed to generate scintigraphic imaging protocols and associated dosimetry by Monte Carlo modeling. TestDose is part of a broader project (www.dositest.com) whose aim is to identify the biases induced by different clinical dosimetry protocols. The TestDose software allows handling the whole pipeline from virtual patient generation to resulting planar and SPECT images and dosimetry calculations. The originality of their approach relies on the implementation of functional segmentation for the anthropomorphic model representing a virtual patient. Two anthropomorphic models are currently available: 4D XCAT and ICRP 110. A pharmacokinetic model describes the biodistribution of a given radiopharmaceutical in each defined compartment at various time-points. The Monte Carlo simulation toolkit GATE offers the possibility to accurately simulate scintigraphic images and absorbed doses in volumes of interest. The TestDose platform relies on GATE to reproduce precisely any imaging protocol and to provide reference dosimetry. For image generation, TestDose stores user's imaging requirements and generates automatically command files used as input for GATE. Each compartment is simulated only once and the resulting output is weighted using pharmacokinetic data. Resulting compartment projections are aggregated to obtain the final image. For dosimetry computation, emission data are stored in the platform database and relevant GATE input files are generated for the virtual patient model and associated pharmacokinetics. Two samples of software runs are given to demonstrate the potential of TestDose. A clinical imaging protocol for the Octreoscan™ therapeutical treatment was implemented using the 4D XCAT model. Whole-body "step and shoot" acquisitions at different times postinjection and one SPECT acquisition were generated within reasonable computation times. Based on the same Octreoscan™ kinetics, a dosimetry computation performed on the ICRP 110 model is also presented. The proposed platform offers a generic framework to implement any scintigraphic imaging protocols and voxel/organ-based dosimetry computation. Thanks to the modular nature of TestDose, other imaging modalities could be supported in the future such as positron emission tomography.
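
The "simulate each compartment once, then weight by pharmacokinetics" step that TestDose uses to avoid re-running GATE per time-point reduces, schematically, to a weighted sum of per-compartment projections. The sketch below assumes unit-activity projections and a dictionary of compartment activities; both names are hypothetical, not TestDose's actual data model.

```python
import numpy as np

def combine_projections(unit_projections, activities):
    """Weighted sum of per-compartment projections.

    unit_projections: dict mapping compartment name -> 2D ndarray simulated
        once for unit activity (the expensive Monte Carlo step).
    activities: dict mapping compartment name -> activity at the acquisition
        time-point, from the pharmacokinetic model.
    """
    image = np.zeros_like(next(iter(unit_projections.values())), dtype=float)
    for name, proj in unit_projections.items():
        image += activities[name] * proj
    return image

# Example with two toy compartments.
proj = {"liver": np.ones((64, 64)), "kidney": np.full((64, 64), 0.5)}
act = {"liver": 3.2, "kidney": 1.1}  # illustrative activities
final = combine_projections(proj, act)
```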

  15. A simple encoding method for Sigma-Delta ADC based biopotential acquisition systems.

    PubMed

    Guerrero, Federico N; Spinelli, Enrique M

    2017-10-01

    Sigma-Delta analogue-to-digital converters allow acquiring the full dynamic range of biomedical signals at the electrodes, resulting in less complex hardware and increased measurement robustness. However, the increased data size per sample (typically 24 bits) demands the transmission of extremely large volumes of data across the isolation barrier, thus increasing power consumption on the patient side. This problem is accentuated when a large number of channels is used, as in current 128-256 electrode biopotential acquisition systems, which usually opt for an optic fibre link to the computer. An analogous problem occurs for simpler low-power acquisition platforms that transmit data through a wireless link to a computing platform. In this paper, a low-complexity encoding method is presented to decrease sample data size without losses, while preserving the full DC-coupled signal. The method achieved an average compression ratio of 2.3, evaluated over an ECG and EMG signal bank acquired with equipment based on Sigma-Delta converters. It demands a very low processing load: a C language implementation is presented that resulted in an average execution time of 110 clock cycles on an 8-bit microcontroller.
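
The record describes lossless size reduction of 24-bit Sigma-Delta samples before they cross the isolation barrier. A common low-complexity scheme in this spirit (not necessarily the authors' exact method) is first-differencing plus variable-length byte packing, so slowly varying biopotentials cost about one byte per sample:

```python
def zigzag(n):
    # Map signed deltas to unsigned ints so small magnitudes stay small.
    return (n << 1) ^ (n >> 31)

def encode(samples):
    """Lossless first-difference + LEB128-style varint packing."""
    out = bytearray()
    prev = 0
    for s in samples:
        d = zigzag(s - prev)
        prev = s
        while True:
            byte = d & 0x7F
            d >>= 7
            out.append(byte | (0x80 if d else 0))
            if not d:
                break
    return bytes(out)

# A slowly drifting 24-bit signal compresses to ~1 byte per sample.
sig = [100000 + i // 3 for i in range(1000)]
packed = encode(sig)
print(len(packed) / len(sig))  # average bytes per (3-byte raw) sample
```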

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lingerfelt, Eric J; Endeve, Eirik; Hui, Yawei

    Improvements in scientific instrumentation allow imaging at mesoscopic to atomic length scales, many spectroscopic modes, and now--with the rise of multimodal acquisition systems and the associated processing capability--the era of multidimensional, informationally dense data sets has arrived. Technical issues in these combinatorial scientific fields are exacerbated by computational challenges best summarized as a necessity for drastic improvement in the capability to transfer, store, and analyze large volumes of data. The Bellerophon Environment for Analysis of Materials (BEAM) platform provides material scientists the capability to directly leverage the integrated computational and analytical power of High Performance Computing (HPC) to perform scalable data analysis and simulation and manage uploaded data files via an intuitive, cross-platform client user interface. This framework delivers authenticated, "push-button" execution of complex user workflows that deploy data analysis algorithms and computational simulations utilizing compute-and-data cloud infrastructures and HPC environments like Titan at the Oak Ridge Leadership Computing Facility (OLCF).

  17. Data acquisition and real-time control using spreadsheets: interfacing Excel with external hardware.

    PubMed

    Aliane, Nourdine

    2010-07-01

    Spreadsheets have become a popular computational tool and a powerful platform for performing engineering calculations. Moreover, spreadsheets include a macro language, which permits the inclusion of standard computer code in worksheets, and thereby enables developers to greatly extend spreadsheets' capabilities by designing specific add-ins. This paper describes how to use Excel spreadsheets in conjunction with the Visual Basic for Applications programming language to perform data acquisition and real-time control. Afterwards, the paper presents two Excel applications with interactive user interfaces developed for laboratory demonstrations and experiments in an introductory course in control. 2010 ISA. Published by Elsevier Ltd. All rights reserved.

  18. Distributed MRI reconstruction using Gadgetron-based cloud computing.

    PubMed

    Xue, Hui; Inati, Souheil; Sørensen, Thomas Sangild; Kellman, Peter; Hansen, Michael S

    2015-03-01

    To expand the open source Gadgetron reconstruction framework to support distributed computing and to demonstrate that a multinode version of the Gadgetron can be used to provide nonlinear reconstruction with clinically acceptable latency. The Gadgetron framework was extended with new software components that enable an arbitrary number of Gadgetron instances to collaborate on a reconstruction task. This cloud-enabled version of the Gadgetron was deployed on three different distributed computing platforms ranging from a heterogeneous collection of commodity computers to the commercial Amazon Elastic Compute Cloud. The Gadgetron cloud was used to provide nonlinear, compressed sensing reconstruction on a clinical scanner with low reconstruction latency (e.g., cardiac and neuroimaging applications). The proposed setup was able to handle acquisition and l1-SPIRiT reconstruction of nine high temporal resolution real-time, cardiac short axis cine acquisitions, covering the ventricles for functional evaluation, in under 1 min. A three-dimensional high-resolution brain acquisition with 1 mm(3) isotropic pixel size was acquired and reconstructed with nonlinear reconstruction in less than 5 min. A distributed computing enabled Gadgetron provides a scalable way to improve reconstruction performance using commodity cluster computing. Nonlinear, compressed sensing reconstruction can be deployed clinically with low image reconstruction latency. © 2014 Wiley Periodicals, Inc.

  19. Criteria for Evaluating a Game-Based CALL Platform

    ERIC Educational Resources Information Center

    Ní Chiaráin, Neasa; Ní Chasaide, Ailbhe

    2017-01-01

    Game-based Computer-Assisted Language Learning (CALL) is an area that currently warrants attention, as task-based, interactive, multimodal games increasingly show promise for language learning. This area is inherently multidisciplinary--theories from second language acquisition, games, and psychology must be explored and relevant concepts from…

  20. CloVR: a virtual machine for automated and portable sequence analysis from the desktop using cloud computing.

    PubMed

    Angiuoli, Samuel V; Matalka, Malcolm; Gussman, Aaron; Galens, Kevin; Vangala, Mahesh; Riley, David R; Arze, Cesar; White, James R; White, Owen; Fricke, W Florian

    2011-08-30

    Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom built virtual machines to distribute pre-packaged, pre-configured software. We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole genome and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition CloVR supports use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. The CloVR VM and associated architecture lowers the barrier of entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high throughput data processing.

  1. ACQ4: an open-source software platform for data acquisition and analysis in neurophysiology research.

    PubMed

    Campagnola, Luke; Kratz, Megan B; Manis, Paul B

    2014-01-01

    The complexity of modern neurophysiology experiments requires specialized software to coordinate multiple acquisition devices and analyze the collected data. We have developed ACQ4, an open-source software platform for performing data acquisition and analysis in experimental neurophysiology. This software integrates the tasks of acquiring, managing, and analyzing experimental data. ACQ4 has been used primarily for standard patch-clamp electrophysiology, laser scanning photostimulation, multiphoton microscopy, intrinsic imaging, and calcium imaging. The system is highly modular, which facilitates the addition of new devices and functionality. The modules included with ACQ4 provide for rapid construction of acquisition protocols, live video display, and customizable analysis tools. Position-aware data collection allows automated construction of image mosaics and registration of images with 3-dimensional anatomical atlases. ACQ4 uses free and open-source tools including Python, NumPy/SciPy for numerical computation, PyQt for the user interface, and PyQtGraph for scientific graphics. Supported hardware includes cameras, patch clamp amplifiers, scanning mirrors, lasers, shutters, Pockels cells, motorized stages, and more. ACQ4 is available for download at http://www.acq4.org.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trujillo, Angelina Michelle

    Strategy, Planning, Acquiring: very large scale computing platforms come and go, and planning for immensely scalable machines often precedes actual procurement by 3 years. Procurement can take another year or more. Integration: after acquisition, machines must be integrated into the computing environments at LANL, including connection to scalable storage via large scale storage networking and assurance of correct and secure operations. Management and Utilization: ongoing operations, maintenance, and troubleshooting of the hardware and systems software at massive scale are required.

  3. NiftyPET: a High-throughput Software Platform for High Quantitative Accuracy and Precision PET Imaging and Analysis.

    PubMed

    Markiewicz, Pawel J; Ehrhardt, Matthias J; Erlandsson, Kjell; Noonan, Philip J; Barnes, Anna; Schott, Jonathan M; Atkinson, David; Arridge, Simon R; Hutton, Brian F; Ourselin, Sebastien

    2018-01-01

    We present a standalone, scalable and high-throughput software platform for PET image reconstruction and analysis. We focus on high fidelity modelling of the acquisition processes to provide high accuracy and precision quantitative imaging, especially for large axial field of view scanners. All the core routines are implemented using parallel computing available from within the Python package NiftyPET, enabling easy access, manipulation and visualisation of data at any processing stage. The pipeline of the platform starts from MR and raw PET input data and is divided into the following processing stages: (1) list-mode data processing; (2) accurate attenuation coefficient map generation; (3) detector normalisation; (4) exact forward and back projection between sinogram and image space; (5) estimation of reduced-variance random events; (6) high accuracy fully 3D estimation of scatter events; (7) voxel-based partial volume correction; (8) region- and voxel-level image analysis. We demonstrate the advantages of this platform using an amyloid brain scan where all the processing is executed from a single and uniform computational environment in Python. The high accuracy acquisition modelling is achieved through span-1 (no axial compression) ray tracing for true, random and scatter events. Furthermore, the platform offers uncertainty estimation of any image derived statistic to facilitate robust tracking of subtle physiological changes in longitudinal studies. The platform also supports the development of new reconstruction and analysis algorithms through restricting the axial field of view to any set of rings covering a region of interest and thus performing fully 3D reconstruction and corrections using real data significantly faster. All the software is available as open source with the accompanying wiki-page and test data.

  4. 77 FR 40302 - Department of the Treasury Acquisition Regulation; Internet Payment Platform

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-09

    ... Treasury Acquisition Regulation; Internet Payment Platform AGENCY: Office of the Procurement Executive... Treasury Acquisition Regulation (DTAR) to implement use of the Internet Payment Platform, a centralized... implement the Internet Payment Platform (IPP) no later than the end of fiscal year 2012; with all new...

  5. Theory and operation of the real-time data acquisition system for the NASA-LaRC differential absorption lidar (DIAL)

    NASA Technical Reports Server (NTRS)

    Butler, Carolyn; Spencer, Randall

    1988-01-01

    The improvement of computer hardware and software of the NASA Multipurpose Differential Absorption Lidar (DIAL) system is documented. The NASA DIAL system has undergone development and experimental deployment at NASA Langley Research Center for the remote measurement of atmospheric trace gas concentrations from ground and aircraft platforms. A viable DIAL system was developed capable of remotely measuring O3 and H2O concentrations from an aircraft platform. The DIAL Data Acquisition System (DAS) has also undergone a number of improvements. Due to the participation of DIAL in the Global Tropospheric Experiment, modifications and improvements of the system were tested and used both in the laboratory and in the air. This report therefore serves as an operational manual for the DIAL DAS.

  6. Computer optimization techniques for NASA Langley's CSI evolutionary model's real-time control system

    NASA Technical Reports Server (NTRS)

    Elliott, Kenny B.; Ugoletti, Roberto; Sulla, Jeff

    1992-01-01

    The evolution and optimization of a real-time digital control system is presented. The control system is part of a testbed used to perform focused technology research on the interactions of spacecraft platform and instrument controllers with the flexible-body dynamics of the platform and platform appendages. The control system consists of Computer Automated Measurement and Control (CAMAC) standard data acquisition equipment interfaced to a workstation computer. The goal of this work is to optimize the control system's performance to support controls research using controllers with up to 50 states and frame rates above 200 Hz. The original system could support a 16-state controller operating at a rate of 150 Hz. By using simple yet effective software improvements, Input/Output (I/O) latencies and contention problems are reduced or eliminated in the control system. The final configuration can support a 16-state controller operating at 475 Hz. Effectively the control system's performance was increased by a factor of 3.

  7. A Software Development Platform for Wearable Medical Applications.

    PubMed

    Zhang, Ruikai; Lin, Wei

    2015-10-01

    Wearable medical devices have become a leading trend in the healthcare industry. Microcontrollers are computers on a chip with sufficient processing power and are the preferred embedded computing units in those devices. We have developed a software platform specifically for the design of wearable medical applications with a small code footprint on the microcontrollers. It is supported by the open source real-time operating system FreeRTOS and supplemented with a set of standard APIs for the architecture-specific hardware interfaces on the microcontrollers for data acquisition and wireless communication. We modified the tick counter routine in FreeRTOS to include a real-time soft clock. When combined with the multitasking features of FreeRTOS, the platform offers quick development of wearable applications and easy porting of the application code to different microprocessors. Test results have demonstrated that application software developed using this platform is highly efficient in CPU usage while maintaining a small code footprint to accommodate the limited memory space in microcontrollers.

  8. Voyager backgrounder

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The Voyager spacecraft and experiments are described. The spacecraft description includes the structure and configuration, communications systems, power supplies, computer command subsystems, and the science platform. The experiments discussed are investigations of cosmic rays, low-energy charged particles, magnetic fields, and plasma waves, along with studies in radio astronomy and photopolarimetry. The tracking and data acquisition procedures for the missions are presented.

  9. CloVR: A virtual machine for automated and portable sequence analysis from the desktop using cloud computing

    PubMed Central

    2011-01-01

    Background Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom built virtual machines to distribute pre-packaged, pre-configured software. Results We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole genome and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition CloVR supports use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. Conclusion The CloVR VM and associated architecture lowers the barrier of entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high throughput data processing. PMID:21878105

  10. Data reduction analysis and application technique development for atmospheric trace gas constituents derived from remote sensors on satellite or airborne platforms

    NASA Technical Reports Server (NTRS)

    Casas, J. C.; Campbell, S. A.

    1981-01-01

    The applicability of the gas filter correlation radiometer (GFCR) to the measurement of tropospheric carbon monoxide gas was investigated. An assessment of the GFCR measurement system for a regional measurement program was conducted through extensive aircraft flight testing of several versions of the GFCR. Investigative work in the following areas is described: flight test planning and coordination, acquisition of verifying CO measurements, determination and acquisition of supporting meteorological data requirements, and development of supporting computational software.

  11. Integrated Component-based Data Acquisition Systems for Aerospace Test Facilities

    NASA Technical Reports Server (NTRS)

    Ross, Richard W.

    2001-01-01

    The Multi-Instrument Integrated Data Acquisition System (MIIDAS), developed by the NASA Langley Research Center, uses commercial off the shelf (COTS) products, integrated with custom software, to provide a broad range of capabilities at a low cost throughout the system's entire life cycle. MIIDAS combines data acquisition capabilities with online and post-test data reduction computations. COTS products lower purchase and maintenance costs by reducing the level of effort required to meet system requirements. Object-oriented methods are used to enhance modularity, encourage reusability, and to promote adaptability, reducing software development costs. Using only COTS products and custom software supported on multiple platforms reduces the cost of porting the system to other platforms. The post-test data reduction capabilities of MIIDAS have been installed at four aerospace testing facilities at NASA Langley Research Center. The systems installed at these facilities provide a common user interface, reducing the training time required for personnel that work across multiple facilities. The techniques employed by MIIDAS enable NASA to build a system with a lower initial purchase price and reduced sustaining maintenance costs. With MIIDAS, NASA has built a highly flexible next generation data acquisition and reduction system for aerospace test facilities that meets customer expectations.

  12. ACQ4: an open-source software platform for data acquisition and analysis in neurophysiology research

    PubMed Central

    Campagnola, Luke; Kratz, Megan B.; Manis, Paul B.

    2014-01-01

    The complexity of modern neurophysiology experiments requires specialized software to coordinate multiple acquisition devices and analyze the collected data. We have developed ACQ4, an open-source software platform for performing data acquisition and analysis in experimental neurophysiology. This software integrates the tasks of acquiring, managing, and analyzing experimental data. ACQ4 has been used primarily for standard patch-clamp electrophysiology, laser scanning photostimulation, multiphoton microscopy, intrinsic imaging, and calcium imaging. The system is highly modular, which facilitates the addition of new devices and functionality. The modules included with ACQ4 provide for rapid construction of acquisition protocols, live video display, and customizable analysis tools. Position-aware data collection allows automated construction of image mosaics and registration of images with 3-dimensional anatomical atlases. ACQ4 uses free and open-source tools including Python, NumPy/SciPy for numerical computation, PyQt for the user interface, and PyQtGraph for scientific graphics. Supported hardware includes cameras, patch clamp amplifiers, scanning mirrors, lasers, shutters, Pockels cells, motorized stages, and more. ACQ4 is available for download at http://www.acq4.org. PMID:24523692

  13. Hard real-time closed-loop electrophysiology with the Real-Time eXperiment Interface (RTXI)

    PubMed Central

    George, Ansel; Dorval, Alan D.; Christini, David J.

    2017-01-01

    The ability to experimentally perturb biological systems has traditionally been limited to static pre-programmed or operator-controlled protocols. In contrast, real-time control allows dynamic probing of biological systems with perturbations that are computed on-the-fly during experimentation. Real-time control applications for biological research are available; however, these systems are costly and often restrict the flexibility and customization of experimental protocols. The Real-Time eXperiment Interface (RTXI) is an open source software platform for achieving hard real-time data acquisition and closed-loop control in biological experiments while retaining the flexibility needed for experimental settings. RTXI has enabled users to implement complex custom closed-loop protocols in single cell, cell network, animal, and human electrophysiology studies. RTXI is also used as a free and open source, customizable electrophysiology platform in open-loop studies requiring online data acquisition, processing, and visualization. RTXI is easy to install, can be used with an extensive range of external experimentation and data acquisition hardware, and includes standard modules for implementing common electrophysiology protocols. PMID:28557998

  14. Advances in Spatial Data Infrastructure, Acquisition, Analysis, Archiving and Dissemination

    NASA Technical Reports Server (NTRS)

    Ramapriyan, Hampapuran K.; Rochon, Gilbert L.; Duerr, Ruth; Rank, Robert; Nativi, Stefano; Stocker, Erich Franz

    2010-01-01

    The authors review recent contributions to the state-of-the-science and benign proliferation of satellite remote sensing, spatial data infrastructure, near-real-time data acquisition, analysis on high performance computing platforms, sapient archiving, multi-modal dissemination and utilization for a wide array of scientific applications. The authors also address advances in Geoinformatics and its growing ubiquity, as evidenced by its inclusion as a focus area within the American Geophysical Union (AGU), European Geosciences Union (EGU), as well as by the evolution of the IEEE Geoscience and Remote Sensing Society's (GRSS) Data Archiving and Distribution Technical Committee (DAD TC).

  15. The Center of Attention

    NASA Technical Reports Server (NTRS)

    2000-01-01

    New Hampshire-based Creare, Inc. used a NASA SBIR contract with Dryden to develop "middleware" known commercially as DataTurbine. DataTurbine acts as "glueware" allowing communication between dissimilar computer platforms and analysis, storage and acquisition of shared data. DataTurbine relies on Ring Buffered Network Bus technology, which is a software server providing a buffered network path between suppliers and consumers of information.

  16. Comparison of Uncalibrated Rgbvi with Spectrometer-Based Ndvi Derived from Uav Sensing Systems on Field Scale

    NASA Astrophysics Data System (ADS)

    Bareth, G.; Bolten, A.; Gnyp, M. L.; Reusch, S.; Jasper, J.

    2016-06-01

    The development of UAV-based sensing systems for agronomic applications serves the improvement of crop management. The latter is in the focus of precision agriculture which intends to optimize yield, fertilizer input, and crop protection. Besides, in some cropping systems vehicle-based sensing devices are less suitable because fields cannot be entered from certain growing stages onwards. This is true for rice, maize, sorghum, and many more crops. Consequently, UAV-based sensing approaches fill a niche of very high resolution data acquisition on the field scale in space and time. While mounting RGB digital compact cameras to low-weight UAVs (< 5 kg) is well established, the miniaturization of sensors in the last years also enables hyperspectral data acquisition from those platforms. From both RGB and hyperspectral data, vegetation indices (VIs) are computed to estimate crop growth parameters. In this contribution, we compare two different sensing approaches from a low-weight UAV platform (< 5 kg) for monitoring a nitrogen field experiment of winter wheat and a corresponding farmers' field in Western Germany. (i) A standard digital compact camera was flown to acquire RGB images which are used to compute the RGBVI and (ii) NDVI is computed from a newly modified version of the Yara N-Sensor. The latter is a well-established tractor-based hyperspectral sensor for crop management and has been available on the market for a decade. It was modified for this study to fit the requirements of UAV-based data acquisition. Consequently, we focus on three objectives in this contribution: (1) to evaluate the potential of the uncalibrated RGBVI for monitoring nitrogen status in winter wheat, (2) to investigate the UAV-based performance of the modified Yara N-Sensor, and (3) to compare the results of the two different UAV-based sensing approaches for winter wheat.
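
Both indices in this record have simple closed forms: RGBVI is commonly defined as (G^2 - R*B) / (G^2 + R*B) from the visible bands, and NDVI as (NIR - Red) / (NIR + Red) from the spectrometer bands. A NumPy sketch of both computations (band arrays and scaling are assumed, not taken from the paper's processing chain):

```python
import numpy as np

def rgbvi(r, g, b):
    """RGB vegetation index: (G^2 - R*B) / (G^2 + R*B)."""
    r, g, b = (np.asarray(a, dtype=float) for a in (r, g, b))
    num = g**2 - r * b
    den = g**2 + r * b
    # Avoid division by zero on empty/background pixels.
    return np.divide(num, den, out=np.zeros_like(num), where=den != 0)

def ndvi(nir, red):
    """Normalized difference vegetation index: (NIR - Red) / (NIR + Red)."""
    nir, red = np.asarray(nir, dtype=float), np.asarray(red, dtype=float)
    return np.divide(nir - red, nir + red,
                     out=np.zeros_like(nir), where=(nir + red) != 0)

# Example on random 8-bit band images.
h, w = 100, 100
r, g, b = (np.random.randint(0, 256, (h, w)) for _ in range(3))
print(rgbvi(r, g, b).mean())
```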

  17. Accelerated GPU based SPECT Monte Carlo simulations.

    PubMed

    Garcia, Marie-Paule; Bert, Julien; Benoit, Didier; Bardiès, Manuel; Visvikis, Dimitris

    2016-06-07

    Monte Carlo (MC) modelling is widely used in the field of single photon emission computed tomography (SPECT) as it is a reliable technique to simulate very high quality scans. This technique provides very accurate modelling of the radiation transport and particle interactions in a heterogeneous medium. Various MC codes exist for nuclear medicine imaging simulations. Recently, new strategies exploiting the computing capabilities of graphical processing units (GPU) have been proposed. This work aims at evaluating the accuracy of such GPU implementation strategies in comparison to standard MC codes in the context of SPECT imaging. GATE was considered the reference MC toolkit and used to evaluate the performance of newly developed GPU Geant4-based Monte Carlo simulation (GGEMS) modules for SPECT imaging. Radioisotopes with different photon energies were used with these various CPU and GPU Geant4-based MC codes in order to assess the best strategy for each configuration. Three different isotopes were considered: (99m)Tc, (111)In and (131)I, using a low energy high resolution (LEHR) collimator, a medium energy general purpose (MEGP) collimator and a high energy general purpose (HEGP) collimator respectively. Point source, uniform source, cylindrical phantom and anthropomorphic phantom acquisitions were simulated using a model of the GE Infinia II 3/8" gamma camera. Both simulation platforms yielded a similar system sensitivity and image statistical quality for the various combinations. The overall acceleration factor between GATE and GGEMS platform derived from the same cylindrical phantom acquisition was between 18 and 27 for the different radioisotopes. Besides, a full MC simulation using an anthropomorphic phantom showed the full potential of the GGEMS platform, with a resulting acceleration factor up to 71. The good agreement with reference codes and the acceleration factors obtained support the use of GPU implementation strategies for improving computational efficiency of SPECT imaging simulations.

  18. DataForge: Modular platform for data storage and analysis

    NASA Astrophysics Data System (ADS)

    Nozik, Alexander

    2018-04-01

    DataForge is a framework for automated data acquisition, storage and analysis based on modern achievements of applied programming. The aim of DataForge is to automate standard tasks like parallel data processing, logging, output sorting and distributed computing. The framework also makes extensive use of declarative programming principles via a meta-data concept, which allows a certain degree of meta-programming and improves the reproducibility of results.
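
A toy illustration of the declarative meta-data idea behind such frameworks: the analysis is described as data, and a generic runner interprets it, which is what makes runs reproducible. The keys and actions below are invented for illustration and are not DataForge's actual schema.

```python
# Hypothetical, minimal interpreter for a declarative analysis description.
meta = {
    "input": "run-042.dat",
    "pipeline": [
        {"action": "scale", "factor": 2.0},
        {"action": "offset", "value": -1.0},
    ],
}

ACTIONS = {
    "scale": lambda xs, p: [x * p["factor"] for x in xs],
    "offset": lambda xs, p: [x + p["value"] for x in xs],
}

def run(meta, data):
    # The meta block fully determines the computation, so identical
    # meta + identical input reproduces identical output.
    for step in meta["pipeline"]:
        data = ACTIONS[step["action"]](data, step)
    return data

print(run(meta, [1.0, 2.0, 3.0]))  # -> [1.0, 3.0, 5.0]
```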

  20. Perimeter acquisition radar building room #105, shock-isolated platform for ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    20. Perimeter acquisition radar building room #105, shock-isolated platform for chillers is easily seen on the right - Stanley R. Mickelsen Safeguard Complex, Perimeter Acquisition Radar Building, Limited Access Area, between Limited Access Patrol Road & Service Road A, Nekoma, Cavalier County, ND

  20. Retina-like sensor image coordinates transformation and display

    NASA Astrophysics Data System (ADS)

    Cao, Fengmei; Cao, Nan; Bai, Tingzhu; Song, Shengyu

    2015-03-01

    For a new kind of retina-like sensor camera, image acquisition, coordinate transformation and interpolation need to be realized. Both the coordinate transformation and the interpolation are computed in polar coordinates due to the sensor's particular pixel distribution. The image interpolation is based on sub-pixel interpolation, with the relative weights obtained in polar coordinates. The hardware platform is composed of the retina-like sensor camera, an image grabber and a PC. Combining the MIL and OpenCV libraries, the software was written in VC++ on Visual Studio 2010. Experimental results show that the system realizes real-time image acquisition, coordinate transformation and interpolation.
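
The coordinate transformation plus sub-pixel interpolation this record describes amounts to resampling a (rings x sectors) polar image onto a Cartesian display grid with bilinear weights computed in polar coordinates. A NumPy sketch under that reading (the sensor's actual pixel layout is not specified in the record):

```python
import numpy as np

def polar_to_cartesian(polar, out_h, out_w):
    """Resample a (rings x sectors) polar image onto a Cartesian grid
    using bilinear (sub-pixel) interpolation in polar coordinates."""
    n_rings, n_sectors = polar.shape
    cy, cx = (out_h - 1) / 2.0, (out_w - 1) / 2.0
    y, x = np.mgrid[0:out_h, 0:out_w]
    # Continuous polar coordinates of every output pixel.
    r = np.hypot(y - cy, x - cx) / min(cy, cx) * (n_rings - 1)
    t = (np.arctan2(y - cy, x - cx) % (2 * np.pi)) / (2 * np.pi) * n_sectors
    r0 = np.clip(np.floor(r).astype(int), 0, n_rings - 2)
    fr = np.clip(r - r0, 0.0, 1.0)           # radial sub-pixel weight
    t0 = np.floor(t).astype(int) % n_sectors
    ft = t - np.floor(t)                     # angular sub-pixel weight
    t1 = (t0 + 1) % n_sectors                # wrap around the full circle
    out = ((1 - fr) * (1 - ft) * polar[r0, t0] + (1 - fr) * ft * polar[r0, t1]
           + fr * (1 - ft) * polar[r0 + 1, t0] + fr * ft * polar[r0 + 1, t1])
    out[r > n_rings - 1] = 0.0               # outside the sensor's field
    return out

display = polar_to_cartesian(np.random.rand(64, 128), 256, 256)
```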

  1. [Development of multi-channels cardiac electrophysiological polygraph with LabVIEW as software platform and its clinical application].

    PubMed

    Fan, Shounian; Jiang, Yi; Jiang, Chenxi; Yang, Tianhe; Zhang, Chengyun; Liu, Junshi; Wu, Qiang; Zheng, Yaxi; Liu, Xiaoqiao

    2004-10-01

    The polygraph has become a necessary instrument in interventional cardiology and fundamental medical research. In this study, a LabVIEW development system (developed by NI in the U.S.) was used as the software platform, and a DAQ data acquisition module and a universal computer were used as the hardware platform; these were coupled with our self-made low-noise multi-channel preamplifier to develop a multi-channel electrocardiograph. The device provides functions such as real-time display of physiological signals, digital highpass and lowpass filtering, 50 Hz filtering and gain adjustment, instant storage, random playback and printing, and process-control stimulation. Besides, it is small, economically practical and easy to operate, and it could advance the spread of interventional cardiac treatment in hospitals.

  2. Research and design of portable photoelectric rotary table data-acquisition and analysis system

    NASA Astrophysics Data System (ADS)

    Yang, Dawei; Yang, Xiufang; Han, Junfeng; Yan, Xiaoxu

    2015-02-01

    The photoelectric rotary table is a major test and tracking measurement platform, widely used on shooting ranges and in aerospace fields. To meet the laboratory and field application demands of photoelectric testing instruments and equipment in photoelectric tracking measurement systems, a portable photoelectric rotary table data acquisition and analysis system was researched and designed. The system hardware design is based on a Xilinx Virtex-4 series FPGA device and its peripheral modules, and the host computer software is designed on the VC++ 6.0 programming platform with MFC class libraries. The data acquisition and analysis system integrates data acquisition, display and storage, commissioning control, analysis, laboratory waveform playback, transmission and fault diagnosis into an organic whole, and has the advantages of small volume, embeddability, high speed, portability and simple operation. Using a photoelectric tracking turntable as the experimental object, the system hardware and software were tested. The experimental results show that the system can acquire, analyze and process data from photoelectric tracking equipment and control turntable debugging well, and the measurement results are accurate and reliable, with good maintainability and extensibility. This research and design is of great significance for advancing the debugging, diagnosis, condition monitoring and fault analysis of photoelectric tracking measurement equipment, as well as the standardization and normalization of interfaces and the improvement of equipment maintainability, and it has certain innovative and practical value.

  3. [Application of virtual instrumentation technique in toxicological studies].

    PubMed

    Moczko, Jerzy A

    2005-01-01

    Research investigations frequently require a direct connection of measuring equipment to the computer. The virtual instrumentation technique considerably facilitates the programming of sophisticated acquisition-and-analysis procedures. In the standard approach, these two steps are performed sequentially with separate software tools: the acquired data are transferred with the export/import procedures of one program to another program, which executes the next step of the analysis. This procedure is cumbersome, time consuming and may be a potential source of errors. In 1987, National Instruments Corporation introduced the LabVIEW language, based on the concept of graphical programming. Contrary to conventional textual languages, it allows the researcher to concentrate on the problem being solved and omit all syntactical rules. Programs developed in LabVIEW are called virtual instruments (VIs) and are portable among different computer platforms such as PCs, Macintoshes, Sun SPARCstations, Concurrent PowerMAX stations, and HP PA-RISC workstations. This flexibility warrants that programs prepared for one particular platform are also appropriate for another. In the present paper, the basic principles of connecting research equipment to computer systems are described.

  4. Platform-dependent optimization considerations for mHealth applications

    NASA Astrophysics Data System (ADS)

    Kaghyan, Sahak; Akopian, David; Sarukhanyan, Hakob

    2015-03-01

    Modern mobile devices contain integrated sensors that enable a multitude of applications in fields such as mobile health (mHealth), entertainment, and sports. Human physical activity monitoring is one such emerging application. A range of challenges relates to activity monitoring tasks, particularly in choosing optimal solutions and architectures for the respective mobile software applications. This work addresses mobile computations related to the integrated sensors usable for activity monitoring, such as accelerometers, gyroscopes, integrated global positioning system (GPS) receivers and WLAN-based positioning. Each of the sensing data sources has its own characteristics, such as specific data formats, data rates and signal acquisition durations, and these specifications affect energy consumption. Energy consumption also varies significantly as sensor data acquisition is followed by data analysis, including various transformations and signal processing algorithms. This paper discusses several aspects of more optimal activity monitoring implementations exploiting the state-of-the-art capabilities of modern platforms.

  5. A Technology Analysis to Support Acquisition of UAVs for Gulf Coalition Forces Operations

    DTIC Science & Technology

    2017-06-01

    their selection of the most suitable and cost-effective unmanned aerial vehicles to support detection operations. This study uses Map Aware Non-Uniform Automata... being detected by Gulf Coalition Forces and improved time to detect them, support the use of UAVs in detection missions. Computer experimentations and... aerial vehicles to support detection operations. We use Map Aware Non-Uniform Automata, an agent-based simulation software platform, for the

  6. A software platform for phase contrast x-ray breast imaging research.

    PubMed

    Bliznakova, K; Russo, P; Mettivier, G; Requardt, H; Popov, P; Bravin, A; Buliev, I

    2015-06-01

    To present and validate a computer-based simulation platform dedicated to phase contrast x-ray breast imaging research. The software platform, developed at the Technical University of Varna on the basis of a previously validated x-ray imaging software simulator, comprises modules for object creation and for x-ray image formation. These modules were updated to take into account the refractive index for phase contrast imaging, as well as to implement the Fresnel-Kirchhoff diffraction theory of the propagating x-ray waves. Projection images are generated in an in-line acquisition geometry. To test and validate the platform, several phantoms differing in their complexity were constructed and imaged at 25 keV and 60 keV at the beamline ID17 of the European Synchrotron Radiation Facility. The software platform was used to design computational phantoms that mimic those used in the experimental study and to generate x-ray images in absorption and phase contrast modes. The visual and quantitative results of the validation process showed an overall good correlation between simulated and experimental images and demonstrated the potential of this platform for research in phase contrast x-ray imaging of the breast. The application of the platform is demonstrated in a feasibility study of phase contrast images of complex inhomogeneous and anthropomorphic breast phantoms, compared to x-ray images generated in absorption mode. The improved visibility of mammographic structures suggests further investigation and optimisation of phase contrast x-ray breast imaging, especially when abnormalities are present. The software platform can also be exploited for educational purposes. Copyright © 2015 Elsevier Ltd. All rights reserved.
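
    As a minimal illustration of the kind of wave propagation such a simulator implements (a sketch, not the authors' code), the following propagates a complex x-ray wavefront to the detector plane with an FFT-based Fresnel propagator; all parameter values and array names are illustrative assumptions.

    ```python
    import numpy as np

    def fresnel_propagate(wavefront, wavelength, pixel_size, distance):
        """Propagate a complex 2D wavefront by `distance` using the
        angular-spectrum (paraxial Fresnel) method. Units: metres."""
        ny, nx = wavefront.shape
        fx = np.fft.fftfreq(nx, d=pixel_size)  # spatial frequencies [1/m]
        fy = np.fft.fftfreq(ny, d=pixel_size)
        FX, FY = np.meshgrid(fx, fy)
        # Fresnel transfer function in the frequency domain.
        H = np.exp(-1j * np.pi * wavelength * distance * (FX**2 + FY**2))
        return np.fft.ifft2(np.fft.fft2(wavefront) * H)

    # Illustrative use: a 25 keV beam (wavelength ~0.5 angstrom) leaving a
    # weak phase object, then propagating 1 m to the detector.
    wavelength = 4.96e-11           # 25 keV, in metres (assumption)
    pixel = 1e-6                    # 1 um detector pixel (assumption)
    phase = np.zeros((512, 512))
    phase[200:300, 200:300] = 0.1   # toy phase shift from the object
    field = np.exp(1j * phase)      # unit-amplitude incident wave
    intensity = np.abs(fresnel_propagate(field, wavelength, pixel, 1.0))**2
    ```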

  7. Simultaneous electrical recording of cardiac electrophysiology and contraction on chip

    DOE PAGES

    Qian, Fang; Huang, Chao; Lin, Yi-Dong; ...

    2017-04-18

    Prevailing commercialized cardiac platforms for in vitro drug development utilize planar microelectrode arrays to map action potentials, or impedance sensing to record contraction in real time, but cannot record both functions on the same chip with high spatial resolution. We report a novel cardiac platform that can record cardiac tissue adhesion, electrophysiology, and contractility on the same chip. The platform integrates two independent yet interpenetrating sensor arrays: a microelectrode array for field potential readouts and an interdigitated electrode array for impedance readouts. Together, these arrays provide real-time, non-invasive data acquisition of both cardiac electrophysiology and contractility under physiological conditions and under drug stimuli. Furthermore, we cultured human induced pluripotent stem cell-derived cardiomyocytes as a model system and used it to validate the platform with an excitation–contraction decoupling chemical. Preliminary data using the platform to investigate the effect of the drug norepinephrine are combined with computational efforts. Finally, this platform provides a quantitative and predictive assay system that can potentially be used for comprehensive assessment of cardiac toxicity earlier in the drug discovery process.

  8. Simultaneous electrical recording of cardiac electrophysiology and contraction on chip

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qian, Fang; Huang, Chao; Lin, Yi-Dong

    Prevailing commercialized cardiac platforms for in vitro drug development utilize planar microelectrode arrays to map action potentials, or impedance sensing to record contraction in real time, but cannot record both functions on the same chip with high spatial resolution. We report a novel cardiac platform that can record cardiac tissue adhesion, electrophysiology, and contractility on the same chip. The platform integrates two independent yet interpenetrating sensor arrays: a microelectrode array for field potential readouts and an interdigitated electrode array for impedance readouts. Together, these arrays provide real-time, non-invasive data acquisition of both cardiac electrophysiology and contractility under physiological conditions and under drug stimuli. Furthermore, we cultured human induced pluripotent stem cell-derived cardiomyocytes as a model system and used it to validate the platform with an excitation–contraction decoupling chemical. Preliminary data using the platform to investigate the effect of the drug norepinephrine are combined with computational efforts. Finally, this platform provides a quantitative and predictive assay system that can potentially be used for comprehensive assessment of cardiac toxicity earlier in the drug discovery process.

  9. Display integration for ground combat vehicles

    NASA Astrophysics Data System (ADS)

    Busse, David J.

    1998-09-01

    The United States Army's requirement to employ high resolution target acquisition sensors and information warfare to increase its dominance over enemy forces has led to the need to integrate advanced display devices into ground combat vehicle crew stations. The Army's force structure requires the integration of advanced displays on both existing and emerging ground combat vehicle systems. The fielding of second generation target acquisition sensors, color digital terrain maps, and high volume digital command and control information networks on these platforms defines display performance requirements. The greatest challenge facing the system integrator is the development and integration of advanced displays that meet operational, vehicle, and human computer interface performance requirements for the ground combat vehicle fleet. This paper addresses those challenges: operational and vehicle performance, non-soldier-centric crew station configurations, display performance limitations related to human computer interfaces and vehicle physical environments, display technology limitations, and Department of Defense (DOD) acquisition reform initiatives. How the ground combat vehicle Program Manager and system integrator are addressing these challenges is discussed through the integration of displays on fielded, current, and future close combat vehicle applications.

  10. Performance and scalability of Fourier domain optical coherence tomography acceleration using graphics processing units.

    PubMed

    Li, Jian; Bloch, Pavel; Xu, Jing; Sarunic, Marinko V; Shannon, Lesley

    2011-05-01

    Fourier domain optical coherence tomography (FD-OCT) provides faster line rates, better resolution, and higher sensitivity for noninvasive, in vivo biomedical imaging compared to traditional time domain OCT (TD-OCT). However, because the signal processing for FD-OCT is computationally intensive, real-time FD-OCT applications demand powerful computing platforms to deliver acceptable performance. Graphics processing units (GPUs) have been used as coprocessors to accelerate FD-OCT by leveraging their relatively simple programming model to exploit thread-level parallelism. Unfortunately, GPUs do not "share" memory with their host processors, requiring additional data transfers between the GPU and CPU. In this paper, we implement a complete FD-OCT accelerator on a consumer grade GPU/CPU platform. Our data acquisition system uses spectrometer-based detection and a dual-arm interferometer topology with numerical dispersion compensation for retinal imaging. We demonstrate that the maximum line rate is dictated by the memory transfer time and not the processing time due to the GPU platform's memory model. Finally, we discuss how the performance trends of GPU-based accelerators compare to the expected future requirements of FD-OCT data rates.
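
    A minimal sketch (assuming the CuPy library on a CUDA-capable machine) of how the paper's observation could be reproduced, namely that host-to-device memory transfer rather than computation bounds the achievable line rate; the array sizes are illustrative.

    ```python
    import time
    import numpy as np
    import cupy as cp  # assumes CuPy installed with a working CUDA GPU

    # One "frame" of spectrometer data: 1000 A-lines x 2048 spectral samples.
    frame = np.random.rand(1000, 2048).astype(np.float32)

    # Time the host-to-device transfer.
    t0 = time.perf_counter()
    gpu_frame = cp.asarray(frame)
    cp.cuda.Stream.null.synchronize()
    t_transfer = time.perf_counter() - t0

    # Time the core FD-OCT step: inverse FFT along the spectral axis.
    t0 = time.perf_counter()
    ascans = cp.abs(cp.fft.ifft(gpu_frame, axis=1))
    cp.cuda.Stream.null.synchronize()
    t_compute = time.perf_counter() - t0

    print(f"transfer {t_transfer*1e3:.2f} ms, compute {t_compute*1e3:.2f} ms")
    ```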

  11. A Shared Platform for Studying Second Language Acquisition

    ERIC Educational Resources Information Center

    MacWhinney, Brian

    2017-01-01

    The study of second language acquisition (SLA) can benefit from the same process of datasharing that has proven effective in areas such as first language acquisition and aphasiology. Researchers can work together to construct a shared platform that combines data from spoken and written corpora, online tutors, and Web-based experimentation. Many of…

  12. All solid state mid-infrared dual-comb spectroscopy platform based on QCL technology

    NASA Astrophysics Data System (ADS)

    Hugi, Andreas; Geiser, Markus; Villares, Gustavo; Cappelli, Francesco; Blaser, Stephane; Faist, Jérôme

    2015-01-01

    We develop a spectroscopy platform for industrial applications based on semiconductor quantum cascade laser (QCL) frequency combs. The platform's key features will be an unmatched combination of bandwidth of 100 cm-1, resolution of 100 kHz, speeds of tens to hundreds of μs, as well as size and robustness, opening doors to previously unreachable markets. The sensor can be built extremely compact and robust, since the laser source is an all-electrically-pumped semiconductor optical frequency comb and no mechanical elements are required. However, the parallel acquisition of dual-comb spectrometers comes at the price of enormous data rates. For system scalability, robustness, and optical simplicity we use free-running QCL combs, so no complicated optical locking mechanisms are required. To reach high signal-to-noise ratios, we develop an algorithm based on a combination of coherent and non-coherent averaging. This algorithm is specifically optimized for free-running, small-footprint (and therefore high-repetition-rate) comb sources. As a consequence, our system generates data rates of up to 3.2 GB/sec. These data rates need to be reduced by several orders of magnitude in real time in order to be useful for spectral fitting algorithms. We present the development of a data-treatment solution which reaches a single-channel throughput of 22% using a standard laptop computer. Using a state-of-the-art desktop computer, the throughput is increased to 43%. This is combined with a data-acquisition board into a stand-alone data processing unit, allowing real-time industrial process observation and continuous averaging to achieve the highest signal fidelity.
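
    A minimal sketch of coherent averaging of the kind named above (not the authors' algorithm): successive interferograms are aligned via their cross-correlation peak before summation, so the SNR grows with the number of frames; sizes and names are illustrative.

    ```python
    import numpy as np

    def coherent_average(interferograms):
        """Align each interferogram to the first via its circular
        cross-correlation peak, then average.
        interferograms: (n_frames, n_samples) array."""
        ref = interferograms[0]
        acc = ref.astype(np.float64).copy()
        for ig in interferograms[1:]:
            # FFT-based circular cross-correlation to find the timing offset.
            xc = np.fft.ifft(np.fft.fft(ref) * np.conj(np.fft.fft(ig)))
            shift = int(np.argmax(np.abs(xc)))
            acc += np.roll(ig, shift)
        return acc / len(interferograms)

    # Illustrative use: 1000 jittered, noisy copies of one interferogram.
    t = np.linspace(-1, 1, 4096)
    clean = np.exp(-t**2 / 0.001) * np.cos(2 * np.pi * 200 * t)
    frames = np.array([np.roll(clean, np.random.randint(-5, 6))
                       + 0.5 * np.random.randn(t.size) for _ in range(1000)])
    averaged = coherent_average(frames)  # SNR grows roughly with sqrt(N)
    ```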

  13. Sensory System for Implementing a Human—Computer Interface Based on Electrooculography

    PubMed Central

    Barea, Rafael; Boquete, Luciano; Rodriguez-Ascariz, Jose Manuel; Ortega, Sergio; López, Elena

    2011-01-01

    This paper describes a sensory system for implementing a human–computer interface based on electrooculography. An acquisition system captures electrooculograms and transmits them via the ZigBee protocol. The data acquired are analysed in real time using a microcontroller-based platform running the Linux operating system. The continuous wavelet transform and neural network are used to process and analyse the signals to obtain highly reliable results in real time. To enhance system usability, the graphical interface is projected onto special eyewear, which is also used to position the signal-capturing electrodes. PMID:22346579
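
    A minimal sketch of the continuous-wavelet step in such a pipeline, assuming the PyWavelets package; the sampling rate, scale choice, and detection rule are illustrative assumptions, not the paper's values.

    ```python
    import numpy as np
    import pywt  # PyWavelets

    fs = 250                       # assumed EOG sampling rate [Hz]
    t = np.arange(0, 4, 1 / fs)
    # Synthetic electrooculogram: noise plus a step-like saccade at t = 2 s.
    eog = 0.05 * np.random.randn(t.size)
    eog[t > 2.0] += 1.0

    # Continuous wavelet transform; fine scales respond to the fast edge
    # produced by an eye movement.
    scales = np.arange(1, 64)
    coeffs, freqs = pywt.cwt(eog, scales, 'morl', sampling_period=1 / fs)

    # Crude saccade detector on one fine scale (illustrative rule).
    detail = np.abs(coeffs[4])
    saccade_idx = int(np.argmax(detail))
    print(f"saccade detected near t = {t[saccade_idx]:.2f} s")
    ```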

  14. Experimental characterization and analysis of the BITalino platforms against a reference device.

    PubMed

    Batista, Diana; Silva, Hugo; Fred, Ana

    2017-07-01

    Low-cost hardware platforms for biomedical engineering are becoming increasingly available, which empowers the research community to develop new projects in a wide range of areas related to physiological data acquisition. Building upon previous work by our group, this work compares the quality of the data acquired by two different versions of the multimodal physiological computing platform BITalino with a device that can be considered a reference. We acquired data from 5 sensors, namely accelerometry (ACC), electrocardiography (ECG), electroencephalography (EEG), electrodermal activity (EDA), and electromyography (EMG). Experimental evaluation shows that the ACC, ECG, and EDA data are highly correlated with the reference in terms of the raw waveforms. When compared by means of their commonly used features, the EEG and EMG data are also quite similar across the different devices.
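
    A minimal sketch of the kind of raw-waveform comparison reported above, assuming two time-aligned, equally sampled recordings of the same signal; the names and toy data are illustrative.

    ```python
    import numpy as np

    def compare_waveforms(device_a, device_b):
        """Pearson correlation between two equal-length, time-aligned
        recordings of the same physiological signal."""
        a = (device_a - device_a.mean()) / device_a.std()
        b = (device_b - device_b.mean()) / device_b.std()
        return float(np.mean(a * b))

    # Illustrative use: a reference ECG versus a noisier low-cost recording.
    t = np.linspace(0, 10, 5000)
    reference = np.sin(2 * np.pi * 1.2 * t)            # toy 72 bpm stand-in
    low_cost = reference + 0.2 * np.random.randn(t.size)
    print(f"r = {compare_waveforms(reference, low_cost):.3f}")
    ```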

  15. COLD-SAT dynamic model

    NASA Technical Reports Server (NTRS)

    Adams, Neil S.; Bollenbacher, Gary

    1992-01-01

    This report discusses the development and underlying mathematics of a rigid-body computer model of a proposed cryogenic on-orbit liquid depot storage, acquisition, and transfer spacecraft (COLD-SAT). This model, referred to in this report as the COLD-SAT dynamic model, consists of both a trajectory model and an attitudinal model. All disturbance forces and torques expected to be significant for the actual COLD-SAT spacecraft are modeled to the required degree of accuracy. Control and experimental thrusters are modeled, as well as fluid slosh. The model also computes microgravity disturbance accelerations at any specified point in the spacecraft. The model was developed by using the Boeing EASY5 dynamic analysis package and will run on Apollo, Cray, and other computing platforms.

  16. Research on cloud-based remote measurement and analysis system

    NASA Astrophysics Data System (ADS)

    Gao, Zhiqiang; He, Lingsong; Su, Wei; Wang, Can; Zhang, Changfan

    2015-02-01

    The promising potential of cloud computing and its convergence with technologies such as cloud storage, cloud push, and mobile computing allows for the creation and delivery of new types of cloud service. Based on the idea of cloud computing, this paper presents a cloud-based remote measurement and analysis system. The system consists of three main parts: a signal acquisition client, a web server deployed on the cloud service, and a remote client. The system is a dedicated website developed using ASP.NET and Flex RIA technology, which resolves the trade-off between the B/S and C/S monitoring modes. The platform supplies condition monitoring and data analysis services to customers over the Internet and is deployed on the cloud server. The signal acquisition device is responsible for collecting data (sensor data, audio, video, etc.) and regularly pushes the monitoring data to the cloud storage database. Data acquisition equipment in this system needs only data collection and network capability, as found in smartphones and smart sensors. The system's scale adjusts dynamically according to the number of applications and users, so it does not waste resources. As a representative case study, we developed a prototype system based on the Ali cloud service using a rotor test rig as the research object. Experimental results demonstrate that the proposed system architecture is feasible.
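
    A minimal sketch of an acquisition client of this kind, assuming the third-party requests package and a hypothetical cloud endpoint URL; nothing here is the authors' implementation.

    ```python
    import time
    import json
    import random
    import requests  # third-party HTTP client (assumption)

    CLOUD_ENDPOINT = "https://example.com/api/monitoring/upload"  # hypothetical

    def read_sensor():
        """Stand-in for real signal acquisition (e.g., a vibration sensor)."""
        return {"timestamp": time.time(), "vibration": random.gauss(0.0, 1.0)}

    def push_loop(period_s=1.0):
        """Collect one sample per period and push it to the cloud database."""
        while True:
            sample = read_sensor()
            resp = requests.post(CLOUD_ENDPOINT, data=json.dumps(sample),
                                 headers={"Content-Type": "application/json"},
                                 timeout=5)
            resp.raise_for_status()
            time.sleep(period_s)
    ```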

  17. An agile acquisition decision-support workbench for evaluating ISR effectiveness

    NASA Astrophysics Data System (ADS)

    Stouch, Daniel W.; Champagne, Valerie; Mow, Christopher; Rosenberg, Brad; Serrin, Joshua

    2011-06-01

    The U.S. Air Force is consistently evolving to support current and future operations through the planning and execution of intelligence, surveillance and reconnaissance (ISR) missions. However, it is a challenge to maintain a precise awareness of current and emerging ISR capabilities to properly prepare for future conflicts. We present a decision-support tool for acquisition managers to empirically compare ISR capabilities and approaches to employing them, thereby enabling the DoD to acquire ISR platforms and sensors that provide the greatest return on investment. We have developed an analysis environment to perform modeling and simulation-based experiments to objectively compare alternatives. First, the analyst specifies an operational scenario for an area of operations by providing terrain and threat information; a set of nominated collections; sensor and platform capabilities; and processing, exploitation, and dissemination (PED) capacities. Next, the analyst selects and configures ISR collection strategies to generate collection plans. The analyst then defines customizable measures of effectiveness or performance to compute during the experiment. Finally, the analyst empirically compares the efficacy of each solution and generates concise reports to document the conclusions, providing traceable evidence for acquisition decisions. Our capability demonstrates the utility of a workbench environment in which analysts design and run experiments. Crafting impartial metrics enables the acquisition manager to focus on evaluating solutions based on specific military needs. Finally, the metric and collection plan visualizations provide an intuitive understanding of the suitability of particular solutions. This facilitates a more agile acquisition strategy that handles rapidly changing technology in response to current military needs.

  18. Design and development of C-arm based cone-beam CT for image-guided interventions: initial results

    NASA Astrophysics Data System (ADS)

    Chen, Guang-Hong; Zambelli, Joseph; Nett, Brian E.; Supanich, Mark; Riddell, Cyril; Belanger, Barry; Mistretta, Charles A.

    2006-03-01

    X-ray cone-beam computed tomography (CBCT) is of importance in image-guided intervention (IGI) and image-guided radiation therapy (IGRT). In this paper, we present a cone-beam CT data acquisition system using a GE INNOVA 4100 (GE Healthcare Technologies, Waukesha, Wisconsin) clinical system. This new cone-beam data acquisition mode was developed for research purposes without interfering with any clinical function of the system. It provides us with a basic imaging pipeline for more advanced cone-beam data acquisition methods. It also provides us with a platform to study and overcome limiting factors such as cone-beam artifacts and limited low-contrast resolution in current C-arm based cone-beam CT systems. A geometrical calibration method was developed to experimentally determine the parameters of the scanning geometry and correct the image reconstruction for geometric non-idealities. Extensive phantom studies and some small animal studies have been conducted to evaluate the performance of our cone-beam CT data acquisition system.

  19. [Design and implementation of mobile terminal data acquisition for Chinese materia medica resources survey].

    PubMed

    Qi, Yuan-Hua; Wang, Hui; Zhang, Xiao-Bo; Jin, Yan; Ge, Xiao-Guang; Jing, Zhi-Xian; Wang, Ling; Zhao, Yu-Ping; Guo, Lan-Ping; Huang, Lu-Qi

    2017-11-01

    In this paper, a data acquisition system based on a mobile terminal, combining GPS, offset correction, automatic speech recognition, and database networking technology, was designed and implemented. It can quickly determine latitude, longitude, and elevation, and conveniently capture various types of photos, such as photos of Chinese herbal plants, specimens, and habitats. The mobile system automatically associates records with Chinese materia medica source information; through its voice recognition function it records information on plant characteristics and environmental features, as well as relevant plant specimen information. The data processing platform, based on the survey data reporting client for Chinese medicine resources, effectively assists indoor data processing and exports the mobile terminal data to the computer terminal. The established data acquisition system provides strong technical support for the fourth national survey of the Chinese materia medica resources (CMMR). Copyright© by the Chinese Pharmaceutical Association.

  20. Thermal Analysis for Monitoring Effects of Shock-Induced Physical, Mechanical, and Chemical Changes in Materials

    DTIC Science & Technology

    2015-01-19

    ... Proteus measurement and analysis software for data acquisition, storage and evaluation with MS WINDOWS platform, which enables multitasking with simultaneous evaluation and operation ...

  1. Design of a mobile brain computer interface-based smart multimedia controller.

    PubMed

    Tseng, Kevin C; Lin, Bor-Shing; Wong, Alice May-Kuen; Lin, Bor-Shyh

    2015-03-06

    Music is a way of expressing our feelings and emotions. Suitable music can positively affect people. However, current multimedia control methods, such as manual selection or automatic random mechanisms, which are now applied broadly in MP3 and CD players, cannot adaptively select suitable music according to the user's physiological state. In this study, a brain computer interface-based smart multimedia controller was proposed to select music in different situations according to the user's physiological state. Here, a commercial mobile tablet was used as the multimedia platform, and a wireless multi-channel electroencephalograph (EEG) acquisition module was designed for real-time EEG monitoring. A smart multimedia control program built into the multimedia platform was developed to analyze the user's EEG features and select music according to his/her state. The relationship between the user's state and music sorted by listener preference was also examined in this study. The experimental results show that real-time music biofeedback according to a user's EEG features may positively improve the user's attention state.
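
    A minimal sketch of one common way to derive an attention-related EEG feature, assuming SciPy; the band limits and the selection threshold are illustrative assumptions rather than the authors' method.

    ```python
    import numpy as np
    from scipy.signal import welch

    fs = 256  # assumed EEG sampling rate [Hz]

    def band_power(x, fs, lo, hi):
        """Average power of signal x in the [lo, hi] Hz band (Welch PSD)."""
        f, psd = welch(x, fs=fs, nperseg=fs * 2)
        mask = (f >= lo) & (f <= hi)
        return psd[mask].mean()

    def attention_index(eeg):
        """Beta/(alpha+theta) ratio, a commonly used attention correlate."""
        beta = band_power(eeg, fs, 13, 30)
        alpha = band_power(eeg, fs, 8, 13)
        theta = band_power(eeg, fs, 4, 8)
        return beta / (alpha + theta)

    # Illustrative use: choose music by thresholding the index.
    eeg = np.random.randn(fs * 10)  # stand-in for a 10 s EEG epoch
    playlist = "energetic" if attention_index(eeg) < 0.5 else "focus"
    print(playlist)
    ```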

  2. TomoMiner and TomoMinerCloud: A software platform for large-scale subtomogram structural analysis

    PubMed Central

    Frazier, Zachary; Xu, Min; Alber, Frank

    2017-01-01

    Cryo-electron tomography (cryoET) captures the 3D electron density distribution of macromolecular complexes in a close-to-native state. With the rapid advance of cryoET acquisition technologies, it is possible to generate large numbers (>100,000) of subtomograms, each containing a macromolecular complex. Often, these subtomograms represent a heterogeneous sample, due to variations in the structure and composition of a complex in its in situ form or because the particles are a mixture of different complexes. In this case subtomograms must be classified. However, classification of large numbers of subtomograms is a time-intensive task and often a limiting bottleneck. This paper introduces an open source software platform, TomoMiner, for large-scale subtomogram classification, template matching, subtomogram averaging, and alignment. Its scalable and robust parallel processing allows efficient classification of tens to hundreds of thousands of subtomograms. Additionally, TomoMiner provides a pre-configured TomoMinerCloud computing service permitting users without sufficient computing resources instant access to TomoMiner's high-performance features. PMID:28552576

  3. Data Streaming for Metabolomics: Accelerating Data Processing and Analysis from Days to Minutes

    PubMed Central

    2016-01-01

    The speed and throughput of analytical platforms have been a driving force in recent years in the “omics” technologies, and while great strides have been accomplished in both chromatography and mass spectrometry, data analysis times have not benefited at the same pace. Even though personal computers have become more powerful, data transfer times still represent a bottleneck in data processing because of the increasingly complex data files and studies with a greater number of samples. To meet the demand of analyzing hundreds to thousands of samples within a given experiment, we have developed a data streaming platform, XCMS Stream, which capitalizes on the acquisition time to compress and stream recently acquired data files to data processing servers, mimicking just-in-time production strategies from the manufacturing industry. The utility of this XCMS Online-based technology is demonstrated here in the analysis of T cell metabolism and other large-scale metabolomic studies. A large-scale example on a 1000-sample data set demonstrated a 10 000-fold time savings, reducing data analysis time from days to minutes. Further, XCMS Stream has the capability to increase the efficiency of downstream biochemical dependent data acquisition (BDDA) analysis by initiating data conversion and data processing on subsets of data acquired, expanding its application beyond data transfer to smart preliminary data decision-making prior to full acquisition. PMID:27983788
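
    A minimal sketch of the just-in-time idea described above (not XCMS Stream itself): poll the acquisition directory, compress each newly finished file while the run continues, and stage it for upload; the paths, file pattern, and polling interval are illustrative assumptions.

    ```python
    import gzip
    import shutil
    import time
    from pathlib import Path

    WATCH_DIR = Path("/data/acquisition")  # hypothetical instrument output
    OUT_DIR = Path("/data/outbox")         # compressed files staged for upload

    def stream_new_files(poll_s=5.0):
        """Poll for newly acquired files and gzip them during the run."""
        seen = set()
        while True:
            for raw in WATCH_DIR.glob("*.mzML"):
                if raw.name in seen:
                    continue
                seen.add(raw.name)
                packed = OUT_DIR / (raw.name + ".gz")
                with raw.open("rb") as src, gzip.open(packed, "wb") as dst:
                    shutil.copyfileobj(src, dst)
                # An uploader thread/process would now ship `packed` to the
                # processing server while acquisition continues.
            time.sleep(poll_s)
    ```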

  4. Data streaming for metabolomics: Accelerating data processing and analysis from days to minutes

    DOE PAGES

    Montenegro-Burke, J. Rafael; Aisporna, Aries E.; Benton, H. Paul; ...

    2016-12-16

    The speed and throughput of analytical platforms have been a driving force in recent years in the “omics” technologies, and while great strides have been accomplished in both chromatography and mass spectrometry, data analysis times have not benefited at the same pace. Even though personal computers have become more powerful, data transfer times still represent a bottleneck in data processing because of the increasingly complex data files and studies with a greater number of samples. To meet the demand of analyzing hundreds to thousands of samples within a given experiment, we have developed a data streaming platform, XCMS Stream, which capitalizes on the acquisition time to compress and stream recently acquired data files to data processing servers, mimicking just-in-time production strategies from the manufacturing industry. The utility of this XCMS Online-based technology is demonstrated here in the analysis of T cell metabolism and other large-scale metabolomic studies. A large-scale example on a 1000-sample data set demonstrated a 10 000-fold time savings, reducing data analysis time from days to minutes. Here, XCMS Stream has the capability to increase the efficiency of downstream biochemical dependent data acquisition (BDDA) analysis by initiating data conversion and data processing on subsets of data acquired, expanding its application beyond data transfer to smart preliminary data decision-making prior to full acquisition.

  5. Data Streaming for Metabolomics: Accelerating Data Processing and Analysis from Days to Minutes.

    PubMed

    Montenegro-Burke, J Rafael; Aisporna, Aries E; Benton, H Paul; Rinehart, Duane; Fang, Mingliang; Huan, Tao; Warth, Benedikt; Forsberg, Erica; Abe, Brian T; Ivanisevic, Julijana; Wolan, Dennis W; Teyton, Luc; Lairson, Luke; Siuzdak, Gary

    2017-01-17

    The speed and throughput of analytical platforms have been a driving force in recent years in the "omics" technologies, and while great strides have been accomplished in both chromatography and mass spectrometry, data analysis times have not benefited at the same pace. Even though personal computers have become more powerful, data transfer times still represent a bottleneck in data processing because of the increasingly complex data files and studies with a greater number of samples. To meet the demand of analyzing hundreds to thousands of samples within a given experiment, we have developed a data streaming platform, XCMS Stream, which capitalizes on the acquisition time to compress and stream recently acquired data files to data processing servers, mimicking just-in-time production strategies from the manufacturing industry. The utility of this XCMS Online-based technology is demonstrated here in the analysis of T cell metabolism and other large-scale metabolomic studies. A large-scale example on a 1000-sample data set demonstrated a 10 000-fold time savings, reducing data analysis time from days to minutes. Further, XCMS Stream has the capability to increase the efficiency of downstream biochemical dependent data acquisition (BDDA) analysis by initiating data conversion and data processing on subsets of data acquired, expanding its application beyond data transfer to smart preliminary data decision-making prior to full acquisition.

  6. Design of verification platform for wireless vision sensor networks

    NASA Astrophysics Data System (ADS)

    Ye, Juanjuan; Shang, Fei; Yu, Chuang

    2017-08-01

    At present, the majority of research on wireless vision sensor networks (WVSNs) remains at the software simulation stage, and very few verification platforms for WVSNs are available for use. This situation seriously restricts the transfer of WVSN theory into practical application. Therefore, it is necessary to study the construction of WVSN verification platforms. This paper combines a wireless transceiver module, a visual information acquisition module, and a power acquisition module to design a high-performance wireless vision sensor node built around an ARM11 microprocessor, and selects AODV as the routing protocol to set up a verification platform for WVSNs called AdvanWorks. Experiments show that AdvanWorks successfully achieves image acquisition, coding, and wireless transmission, and obtains effective inter-node distance parameters, which lays a good foundation for follow-up applications of WVSNs.

  7. Design of Smart-Meter data acquisition device based on Cloud Platform

    NASA Astrophysics Data System (ADS)

    Chen, Xiangqun; Huang, Rui; Shen, Liman; chen, Hao; Xiong, Dezhi; Xiao, Xiangqi; Liu, Mouhai; Xu, Renheng

    2018-05-01

    In recent years, the government has attached great importance to the ‘Four-Meter Unified' project. In response to this national policy, State Grid is actively participating in building the ‘Four-Meter Unified' project by making use of its electricity information acquisition system. In this paper, a new type of Smart-Meter data acquisition device based on a cloud platform is designed according to the newest series of standards, Energy Measure and Management System for Electric, Water, Gas and Heat Meter, and the paper introduces the device's general scheme, main hardware design, and main software design.

  8. Mounted Smartphones as Measurement and Control Platforms for Motor-Based Laboratory Test-Beds †

    PubMed Central

    Frank, Jared A.; Brill, Anthony; Kapila, Vikram

    2016-01-01

    Laboratory education in science and engineering often entails the use of test-beds equipped with costly peripherals for sensing, acquisition, storage, processing, and control of physical behavior. However, costly peripherals are no longer necessary to obtain precise measurements and achieve stable feedback control of test-beds. With smartphones performing diverse sensing and processing tasks, this study examines the feasibility of mounting smartphones directly to test-beds to exploit their embedded hardware and software in the measurement and control of the test-beds. This approach is a first step towards replacing laboratory-grade peripherals with more compact and affordable smartphone-based platforms, whose interactive user interfaces can engender wider participation and engagement from learners. Demonstrative cases are presented in which the sensing, computation, control, and user interaction with three motor-based test-beds are handled by a mounted smartphone. Results of experiments and simulations are used to validate the feasibility of mounted smartphones as measurement and feedback control platforms for motor-based laboratory test-beds, report the measurement precision and closed-loop performance achieved with such platforms, and address challenges in the development of platforms to maintain system stability. PMID:27556464

  9. Mounted Smartphones as Measurement and Control Platforms for Motor-Based Laboratory Test-Beds.

    PubMed

    Frank, Jared A; Brill, Anthony; Kapila, Vikram

    2016-08-20

    Laboratory education in science and engineering often entails the use of test-beds equipped with costly peripherals for sensing, acquisition, storage, processing, and control of physical behavior. However, costly peripherals are no longer necessary to obtain precise measurements and achieve stable feedback control of test-beds. With smartphones performing diverse sensing and processing tasks, this study examines the feasibility of mounting smartphones directly to test-beds to exploit their embedded hardware and software in the measurement and control of the test-beds. This approach is a first step towards replacing laboratory-grade peripherals with more compact and affordable smartphone-based platforms, whose interactive user interfaces can engender wider participation and engagement from learners. Demonstrative cases are presented in which the sensing, computation, control, and user interaction with three motor-based test-beds are handled by a mounted smartphone. Results of experiments and simulations are used to validate the feasibility of mounted smartphones as measurement and feedback control platforms for motor-based laboratory test-beds, report the measurement precision and closed-loop performance achieved with such platforms, and address challenges in the development of platforms to maintain system stability.

  10. ChiMS: Open-source instrument control software platform on LabVIEW for imaging/depth profiling mass spectrometers.

    PubMed

    Cui, Yang; Hanley, Luke

    2015-06-01

    ChiMS is an open-source data acquisition and control software program written within LabVIEW for high speed imaging and depth profiling mass spectrometers. ChiMS can also transfer large datasets from a digitizer to computer memory at high repetition rate, save data to hard disk at high throughput, and perform high speed data processing. The data acquisition mode generally simulates a digital oscilloscope, but with peripheral devices integrated for control as well as advanced data sorting and processing capabilities. Customized user-designed experiments can be easily written based on several included templates. ChiMS is additionally well suited to non-laser-based imaging mass spectrometers and various other experiments in laser physics, physical chemistry, and surface science.

  11. ChiMS: Open-source instrument control software platform on LabVIEW for imaging/depth profiling mass spectrometers

    PubMed Central

    Cui, Yang; Hanley, Luke

    2015-01-01

    ChiMS is an open-source data acquisition and control software program written within LabVIEW for high speed imaging and depth profiling mass spectrometers. ChiMS can also transfer large datasets from a digitizer to computer memory at high repetition rate, save data to hard disk at high throughput, and perform high speed data processing. The data acquisition mode generally simulates a digital oscilloscope, but with peripheral devices integrated for control as well as advanced data sorting and processing capabilities. Customized user-designed experiments can be easily written based on several included templates. ChiMS is additionally well suited to non-laser-based imaging mass spectrometers and various other experiments in laser physics, physical chemistry, and surface science. PMID:26133872

  12. ChiMS: Open-source instrument control software platform on LabVIEW for imaging/depth profiling mass spectrometers

    NASA Astrophysics Data System (ADS)

    Cui, Yang; Hanley, Luke

    2015-06-01

    ChiMS is an open-source data acquisition and control software program written within LabVIEW for high speed imaging and depth profiling mass spectrometers. ChiMS can also transfer large datasets from a digitizer to computer memory at high repetition rate, save data to hard disk at high throughput, and perform high speed data processing. The data acquisition mode generally simulates a digital oscilloscope, but with peripheral devices integrated for control as well as advanced data sorting and processing capabilities. Customized user-designed experiments can be easily written based on several included templates. ChiMS is additionally well suited to non-laser-based imaging mass spectrometers and various other experiments in laser physics, physical chemistry, and surface science.

  13. Development of fast wireless detection system for fixed offshore platform

    NASA Astrophysics Data System (ADS)

    Li, Zhigang; Yu, Yan; Jiao, Dong; Wang, Jie; Li, Zhirui; Ou, Jinping

    2011-04-01

    Offshore platform safety has been a concern since the 1950s and 1960s, and in the early 1980s important specifications and standards were established that provide the technical basis for fixed-platform design, construction, installation, and evaluation. With more and more platforms serving beyond their design life, research on evaluation and detection technology for offshore platforms has become a hot topic, especially underwater detection and assessment methods based on finite element calculation. For fixed-platform structure detection, conventional NDT methods such as eddy current, magnetic particle, penetrant, X-ray, and ultrasonic testing are generally used. These techniques are mature and intuitive, but underwater detection requires underwater robots, the necessary supporting auxiliary equipment, and trained professional teams; the resources and costs involved are considerable, and the installation time of the test equipment is long. This project presents a new fast wireless detection and damage diagnosis system for fixed offshore platforms using wireless sensor networks: wireless sensor nodes can be deployed quickly on the offshore platform, the global state of the platform structure is detected via wireless communication, and a diagnosis is then made. The system is simple to operate and suitable for rapid assessment of offshore platform integrity. The designed system consists of an intelligent acquisition unit and 8 wireless collection nodes; the whole system has 64 collection channels, i.e., every wireless collection node has eight 16-bit A/D channels. Each wireless collection node, integrating a vibration sensing unit, an embedded low-power microprocessing unit, a wireless transceiver unit, a large-capacity power unit, and a GPS time synchronization unit, performs vibration data collection, initial analysis, data storage, and wireless data transmission. The intelligent acquisition unit, integrating a high-performance computation unit, a wireless transceiver unit, a mobile power unit, and embedded data analysis software, controls the wireless collection nodes, receives and analyzes data, and performs parameter identification. Data are transmitted in the 2.4 GHz wireless band: each sensing data channel responsible for data transmission occupies a stable frequency band, while a control channel responsible for power parameter control uses a public frequency band. Initial tests of the designed system show that it has good application prospects and practical value, with fast deployment, high sampling rate, high resolution, and low-frequency detection capability.

  14. The SBAS Sentinel-1 Surveillance service for automatic and systematic generation of Earth surface displacement within the GEP platform.

    NASA Astrophysics Data System (ADS)

    Casu, Francesco; De Luca, Claudio; Lanari, Riccardo; Manunta, Michele; Zinno, Ivana

    2017-04-01

    The Geohazards Exploitation Platform (GEP) is an ESA activity of the Earth Observation (EO) ground segment to demonstrate the benefit of new technologies for large-scale processing of EO data. GEP aims at providing both on-demand processing services for scientific users of the geohazards community and an integration platform for new EO data analysis processors dedicated to scientists and other expert users. In the remote sensing scenario, a crucial role is played by the recently launched Sentinel-1 (S1) constellation that, with its global acquisition policy, has literally flooded the scientific community with a huge amount of data acquired over a large part of the Earth on a regular basis (down to 6 days with both Sentinel-1A and 1B passes). Moreover, the S1 data, as part of the European Copernicus program, are openly and freely accessible, thus fostering their use for the development of tools for Earth surface monitoring. In particular, due to their specific SAR Interferometry (InSAR) design, Sentinel-1 satellites can be exploited to build up operational services for the generation of advanced interferometric products that can be very useful within risk management and natural hazard monitoring scenarios. Accordingly, in this work we present the activities carried out for the development, integration, and deployment of the SBAS Sentinel-1 Surveillance service of CNR-IREA within the GEP platform. This service is based on a parallel implementation of the SBAS approach, referred to as P-SBAS, able to run effectively in large distributed computing infrastructures (grid and cloud) and to allow for efficient computation of large SAR data sequences with advanced DInSAR approaches. In particular, the Surveillance service developed on the GEP platform consists of the systematic and automatic processing of Sentinel-1 data over selected Areas of Interest (AoI) to generate updated surface displacement time series via the SBAS-InSAR algorithm. We built up a system that is automatically triggered by every new S1 acquisition over the AoI, once it is available in the S1 catalogue. Then, taking benefit from the SBAS results generated by previous runs of the service, the system processes only the new acquisitions, thus saving storage space and computing time, and finally generates an updated SBAS time series. The same P-SBAS processor underlying the Surveillance service is also available through the GEP as a standard on-demand DInSAR service, thus allowing the scientific community to generate S1 SBAS time series over areas not covered by the Surveillance service itself. It is worth noting that the SBAS Sentinel-1 Surveillance service on GEP represents the core of the EPOSAR service, which will deliver S1 displacement time series of the Earth surface on a regular basis for the European Plate Observing System (EPOS) Research Infrastructure community. In particular, the main goal of EPOSAR is to contribute advanced techniques and methods, which have already demonstrated their effectiveness and relevance, to investigating the physical processes controlling earthquakes, volcanic eruptions and unrest episodes, as well as those driving tectonics and Earth surface dynamics.

  15. Computer aided system engineering for space construction

    NASA Technical Reports Server (NTRS)

    Racheli, Ugo

    1989-01-01

    This viewgraph presentation covers the following topics. Construction activities envisioned for the assembly of large platforms in space (as well as interplanetary spacecraft and bases on extraterrestrial surfaces) require computational tools that exceed the capability of conventional construction management programs. The Center for Space Construction is investigating the requirements for new computational tools and, at the same time, suggesting the expansion of graduate and undergraduate curricula to include proficiency in Computer Aided Engineering (CAE) through design courses and individual or team projects in advanced space systems design. In the center's research, special emphasis is placed on problems of constructability and of the interruptability of planned activity sequences to be carried out by crews operating under hostile environmental conditions. The departure point for the planned work is the acquisition of the MCAE I-DEAS software, developed by the Structural Dynamics Research Corporation (SDRC), and its expansion to the level of capability denoted by the acronym IDEAS**2 currently used for configuration maintenance on Space Station Freedom. In addition to improving proficiency in the use of I-DEAS and IDEAS**2, it is contemplated that new software modules will be developed to expand the architecture of IDEAS**2. Such modules will deal with those analyses that require the integration of a space platform's configuration with a breakdown of planned construction activities and with a failure modes analysis to support computer aided system engineering (CASE) applied to space construction.

  16. The Belle II Pixel Detector Data Acquisition and Background Suppression System

    NASA Astrophysics Data System (ADS)

    Lautenbach, K.; Deschamps, B.; Dingfelder, J.; Getzkow, D.; Geßler, T.; Konorov, I.; Kühn, W.; Lange, S.; Levit, D.; Liu, Z.-A.; Marinas, C.; Münchow, D.; Rabusov, A.; Reiter, S.; Spruck, B.; Wessel, C.; Zhao, J.

    2017-06-01

    The Belle II experiment at the future SuperKEKB collider in Tsukuba, Japan, features a design luminosity of 8 · 1035 cm-2s-1, which is a factor of 40 larger than that of its predecessor Belle. The pixel detector (PXD) with about 8 million pixels is based on the DEPFET technology and will improve the vertex resolution in the beam direction by a factor of 2. With an estimated trigger rate of 30 kHz, the PXD is expected to generate a data rate of 20 GBytes/s, which is about 10 times larger than the amount of data generated by all other Belle II subdetectors. Due to the large beam-related background, the PXD requires a data acquisition system with high-bandwidth data links and real-time background reduction by a factor of 30. To achieve this, the Belle II pixel DAQ uses an FPGA-based computing platform with high speed serial links implemented in the ATCA (Advanced Telecommunications Computing Architecture) standard. The architecture and performance of the data acquisition system and data reduction of the PXD will be presented. In April 2016 and February 2017, a prototype PXD DAQ system operated in test beam campaigns, delivering data through the whole readout chain under realistic high-rate conditions. Final results from the beam test will be presented.
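
    A minimal sketch of region-of-interest filtering of the kind such online reduction performs (the real system runs in FPGA firmware, not Python): keep only pixel hits inside ROIs, e.g., extrapolated from tracks in the outer detectors; names and sizes are illustrative.

    ```python
    import numpy as np

    def reduce_hits(hits, rois):
        """Keep only pixel hits inside any region of interest.

        hits: (N, 2) integer array of (row, col) pixel coordinates.
        rois: list of (row_min, row_max, col_min, col_max) rectangles,
              e.g. extrapolated from tracks found in the outer detectors.
        """
        keep = np.zeros(len(hits), dtype=bool)
        for r0, r1, c0, c1 in rois:
            keep |= ((hits[:, 0] >= r0) & (hits[:, 0] <= r1) &
                     (hits[:, 1] >= c0) & (hits[:, 1] <= c1))
        return hits[keep]

    # Illustrative event: 10000 mostly-background hits, two track ROIs.
    hits = np.random.randint(0, 768, size=(10000, 2))
    rois = [(100, 140, 300, 340), (500, 540, 60, 100)]
    kept = reduce_hits(hits, rois)
    print(f"reduction factor ~ {len(hits) / max(len(kept), 1):.0f}")
    ```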

  17. An exergame system based on force platforms and body key-point detection for balance training.

    PubMed

    Lavarda, Marcos D; de Borba, Pedro A; Oliveira, Matheus R; Borba, Gustavo B; de Souza, Mauren A; Gamba, Humberto R

    2016-08-01

    Postural instability affects a large number of people and can compromise even simple activities of the daily routine. Therapies for balance training can strongly benefit from auxiliary devices specially designed for this purpose. In this paper, we present a system for balance training that uses the metaphor of a game, which contributes to the motivation and engagement of patients during treatment. Such an approach is usually named an exergame, in which input devices for posturographic assessment and a visual output perform the interaction with the subject. The proposed system uses two force platforms, one positioned under the feet and the other under the hip of the subject. The force platforms employ regular load cells and a microcontroller-based signal acquisition module to capture and transmit the samples to a computer. Moreover, a computer vision module performs body key-point detection, based on real-time segmentation of markers attached to the subject. For the validation of the system, we conducted experiments with 20 neurologically intact volunteers in two tests: comparison of the stabilometric parameters obtained from the system with those obtained from a commercial baropodometer, and the practice of several exergames. Results show that the proposed system is completely functional and can be used as a versatile tool for balance training.
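
    A minimal sketch of how a center of pressure (COP) trace, the basis of common stabilometric parameters, can be computed from four corner load cells; plate dimensions, forces, and channel ordering are illustrative assumptions.

    ```python
    import numpy as np

    # Plate half-dimensions (metres): load cells at the four corners
    # (+/-X, +/-Y). Values are illustrative.
    X, Y = 0.20, 0.15

    def center_of_pressure(f_fl, f_fr, f_rl, f_rr):
        """COP from four vertical corner forces:
        front-left, front-right, rear-left, rear-right (newtons)."""
        total = f_fl + f_fr + f_rl + f_rr
        cop_x = X * ((f_fr + f_rr) - (f_fl + f_rl)) / total
        cop_y = Y * ((f_fl + f_fr) - (f_rl + f_rr)) / total
        return cop_x, cop_y

    # Illustrative sway trace: near-balanced stance, 100 Hz samples.
    rng = np.random.default_rng(0)
    forces = 200 + 5 * rng.standard_normal((1000, 4))  # ~80 kg subject
    trace = np.array([center_of_pressure(*f) for f in forces])
    sway_path = np.sum(np.linalg.norm(np.diff(trace, axis=0), axis=1))
    print(f"COP path length: {sway_path:.3f} m")  # common stabilometric metric
    ```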

  18. Development of a UAV system for VNIR-TIR acquisitions in precision agriculture

    NASA Astrophysics Data System (ADS)

    Misopolinos, L.; Zalidis, Ch.; Liakopoulos, V.; Stavridou, D.; Katsigiannis, P.; Alexandridis, T. K.; Zalidis, G.

    2015-06-01

    Adoption of precision agriculture techniques requires the development of specialized tools that provide spatially distributed information. Both flying platforms and airborne sensors are continuously evolving to cover the needs of plant and soil sensing at affordable costs. Due to payload restrictions, flying platforms are usually limited to carrying a single sensor on board. The aim of this work is to present the development of a vertical take-off and landing autonomous unmanned aerial vehicle (VTOL UAV) system for the simultaneous acquisition of high resolution vertical images at visible, near-infrared (VNIR), and thermal infrared (TIR) wavelengths. A system was developed that can trigger two cameras simultaneously in a fully automated process with no pilot intervention. A commercial unmanned hexacopter UAV platform was optimized to increase reliability, ease of operation, and automation. The designed system's communication platform is based on a reduced instruction set computing (RISC) processor running Linux OS with custom-developed drivers, implemented efficiently while keeping cost and weight to a minimum. Special software was also developed for automated image capture, data processing, and on-board data and metadata storage. The system was tested over a kiwifruit field in northern Greece, at flying heights of 70 and 100 m above the ground. The acquired images were mosaicked and geo-corrected. Images from both flying heights were of good quality and revealed unprecedented detail within the field. The normalized difference vegetation index (NDVI) was calculated along with the thermal image in order to provide information on the accurate location of stressors and other parameters related to crop productivity. Compared to other available sources of data, this system can provide low cost, high resolution, and easily repeatable information to cover the requirements of precision agriculture.
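
    A minimal sketch of the NDVI computation mentioned above, assuming co-registered red and near-infrared reflectance bands already loaded as NumPy arrays; the data and threshold are illustrative.

    ```python
    import numpy as np

    def ndvi(nir, red, eps=1e-9):
        """Normalized difference vegetation index: (NIR - RED)/(NIR + RED)."""
        nir = nir.astype(np.float64)
        red = red.astype(np.float64)
        return (nir - red) / (nir + red + eps)

    # Illustrative use with synthetic reflectance mosaics (0..1 range);
    # in practice these come from the co-registered VNIR image mosaic.
    nir_band = np.random.uniform(0.3, 0.6, size=(512, 512))
    red_band = np.random.uniform(0.05, 0.15, size=(512, 512))
    index = ndvi(nir_band, red_band)
    stressed = index < 0.4  # illustrative threshold flagging weak vegetation
    print(f"flagged pixels: {stressed.mean():.1%}")
    ```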

  19. myBrain: a novel EEG embedded system for epilepsy monitoring.

    PubMed

    Pinho, Francisco; Cerqueira, João; Correia, José; Sousa, Nuno; Dias, Nuno

    2017-10-01

    The World Health Organisation has pointed out that successful health care delivery requires effective medical devices as tools for prevention, diagnosis, treatment, and rehabilitation. Several studies have concluded that longer monitoring periods and outpatient settings might increase diagnosis accuracy and the success rate of treatment selection. Long-term monitoring of epileptic patients through electroencephalography (EEG) has been considered a powerful tool to improve the diagnosis, disease classification, and treatment of patients with such a condition. This work presents the development of a wireless, wearable EEG acquisition platform suitable for both long-term and short-term monitoring in inpatient and outpatient settings. The developed platform features 32 passive dry electrodes, analogue-to-digital signal conversion with 24-bit resolution, and a variable sampling frequency from 250 Hz to 1000 Hz per channel, embedded in a stand-alone module. A computer-on-module embedded system runs a Linux® operating system that rules the interface between two software frameworks, which interact to satisfy the real-time constraints of signal acquisition as well as parallel recording, processing, and wireless data transmission. A textile structure was developed to accommodate all components. Platform performance was evaluated in terms of hardware, software, and signal quality. The electrodes were characterised through electrochemical impedance spectroscopy, and the operating system performance was evaluated while running an epileptic discrimination algorithm. Signal quality was thoroughly assessed in two different approaches: playback of EEG reference signals and benchmarking against a clinical-grade EEG system in alpha-wave replacement and steady-state visual evoked potential paradigms. The proposed platform seems to monitor epileptic patients efficiently in both inpatient and outpatient settings and paves the way to new ambulatory clinical regimens as well as non-clinical EEG applications.

  20. Controller and data acquisition system for SIDECAR ASIC driven HAWAII detectors

    NASA Astrophysics Data System (ADS)

    Ramaprakash, Anamparambu; Burse, Mahesh; Chordia, Pravin; Chillal, Kalpesh; Kohok, Abhay; Mestry, Vilas; Punnadi, Sujit; Sinha, Sakya

    2010-07-01

    SIDECAR is an Application Specific Integrated Circuit (ASIC) which can be used for control of and data acquisition from the near-IR HAWAII detectors offered by Teledyne Imaging Sensors (TIS), USA. The standard interfaces provided by Teledyne are a COM API and socket servers running under the MS Windows platform. These interfaces communicate with the ASIC (and the detector) through an intermediate card called the JWST ASIC Drive Electronics (JADE2). As part of an ongoing programme of several years for developing astronomical focal plane array (CCD, CMOS, and hybrid) controllers and data acquisition systems (CDAQs), IUCAA is currently developing the next generation of controllers employing Virtex-5 family FPGA devices. We present here the capabilities built into these new CDAQs for handling HAWAII detectors. In our system, the computer which hosts the application programme, user interface, and device drivers runs on a Linux platform. It communicates through a hot-pluggable USB interface (with an optional optical fibre extender) with the FPGA-based card which replaces the JADE2. The FPGA board, in turn, controls the SIDECAR ASIC and, through it, a HAWAII-2RG detector, both of which are located in a liquid-nitrogen-cooled cryogenic test Dewar. The system can acquire data over 1, 4, or 32 readout channels, with or without binning, at different speeds; it can define sub-regions for readout and offers various readout schemes such as Fowler sampling and up-the-ramp. In this paper, we present the performance results obtained from a prototype system.
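
    A minimal sketch of the up-the-ramp scheme named above: the per-pixel flux is estimated as the least-squares slope through many non-destructive reads; array sizes, cadence, and noise levels are illustrative assumptions.

    ```python
    import numpy as np

    def up_the_ramp_slope(reads, dt):
        """Least-squares slope (signal rate per second) for each pixel from
        a stack of non-destructive reads.
        reads: (n_reads, ny, nx) array; dt: seconds between reads."""
        n = reads.shape[0]
        t = np.arange(n) * dt
        t_centered = t - t.mean()
        # Closed-form linear-regression slope applied per pixel.
        num = np.tensordot(t_centered, reads - reads.mean(axis=0), axes=(0, 0))
        return num / np.sum(t_centered**2)

    # Illustrative ramp: 16 reads of a 64x64 sub-region, 1.5 s apart, with
    # a true rate of 100 e-/s plus read noise.
    rate = 100.0
    reads = np.array([rate * i * 1.5 + 10 * np.random.randn(64, 64)
                      for i in range(16)])
    print(up_the_ramp_slope(reads, 1.5).mean())  # approx. 100
    ```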

  1. Near Theoretical Gigabit Link Efficiency for Distributed Data Acquisition Systems

    PubMed Central

    Abu-Nimeh, Faisal T.; Choong, Woon-Seng

    2017-01-01

    Link efficiency, data integrity, and continuity are crucial for high-throughput and real-time systems. Most of these applications require specialized hardware and operating systems as well as extensive tuning in order to achieve high efficiency. Here, we present an implementation of gigabit Ethernet data streaming which can achieve 99.26% link efficiency while maintaining no packet losses. The design and implementation are built on OpenPET, an open-source data acquisition platform for nuclear medical imaging, where (a) a crate hosting multiple OpenPET detector boards uses a User Datagram Protocol over Internet Protocol (UDP/IP) Ethernet soft-core, capable of understanding PAUSE frames, to stream data out to a computer workstation; (b) the receiving computer uses Netmap to allow the processing software (i.e., user space), which is written in Python, to directly receive and manage the network card’s ring buffers, bypassing the operating system kernel’s networking stack; and (c) a multi-threaded application using synchronized queues is implemented in the processing software (Python) to free up the ring buffers as quickly as possible while preserving data integrity and flow continuity. PMID:28630948
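
    A minimal sketch of the producer/consumer pattern in (c), using only the Python standard library; the port and packet size are assumptions, and this plain-socket version omits the Netmap kernel bypass.

    ```python
    import socket
    import threading
    import queue

    PORT = 5000            # illustrative UDP port
    PACKET_SIZE = 8192     # illustrative datagram size

    packets = queue.Queue(maxsize=10000)  # synchronized hand-off buffer

    def receiver():
        """Drain the socket as fast as possible to avoid drops (producer)."""
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.bind(("", PORT))
        while True:
            data, _ = sock.recvfrom(PACKET_SIZE)
            packets.put(data)  # blocks only if the consumer falls far behind

    def processor():
        """Parse/store packets without stalling the receiver (consumer)."""
        while True:
            data = packets.get()
            # ... decode events, write to disk, etc. ...
            packets.task_done()

    threading.Thread(target=receiver, daemon=True).start()
    threading.Thread(target=processor, daemon=True).start()
    threading.Event().wait()  # keep the main thread alive
    ```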

  2. Near Theoretical Gigabit Link Efficiency for Distributed Data Acquisition Systems.

    PubMed

    Abu-Nimeh, Faisal T; Choong, Woon-Seng

    2017-03-01

    Link efficiency, data integrity, and continuity are crucial for high-throughput and real-time systems. Most of these applications require specialized hardware and operating systems as well as extensive tuning in order to achieve high efficiency. Here, we present an implementation of gigabit Ethernet data streaming which can achieve 99.26% link efficiency while maintaining no packet losses. The design and implementation are built on OpenPET, an open-source data acquisition platform for nuclear medical imaging, where (a) a crate hosting multiple OpenPET detector boards uses a User Datagram Protocol over Internet Protocol (UDP/IP) Ethernet soft-core, capable of understanding PAUSE frames, to stream data out to a computer workstation; (b) the receiving computer uses Netmap to allow the processing software (i.e., user space), which is written in Python, to directly receive and manage the network card's ring buffers, bypassing the operating system kernel's networking stack; and (c) a multi-threaded application using synchronized queues is implemented in the processing software (Python) to free up the ring buffers as quickly as possible while preserving data integrity and flow continuity.

  3. Design of a Mobile Brain Computer Interface-Based Smart Multimedia Controller

    PubMed Central

    Tseng, Kevin C.; Lin, Bor-Shing; Wong, Alice May-Kuen; Lin, Bor-Shyh

    2015-01-01

    Music is a way of expressing our feelings and emotions. Suitable music can positively affect people. However, current multimedia control methods, such as manual selection or automatic random mechanisms, which are now applied broadly in MP3 and CD players, cannot adaptively select suitable music according to the user’s physiological state. In this study, a brain computer interface-based smart multimedia controller was proposed to select music in different situations according to the user’s physiological state. Here, a commercial mobile tablet was used as the multimedia platform, and a wireless multi-channel electroencephalograph (EEG) acquisition module was designed for real-time EEG monitoring. A smart multimedia control program built into the multimedia platform was developed to analyze the user’s EEG features and select music according to his/her state. The relationship between the user’s state and music sorted by listener preference was also examined in this study. The experimental results show that real-time music biofeedback according to a user’s EEG features may positively improve the user’s attention state. PMID:25756862

  4. Teaching smartphone and microcontroller systems using "Android Java"

    NASA Astrophysics Data System (ADS)

    Tigrek, Seyitriza

    Mobile devices are becoming indispensable tools for many students and educators. Mobile technology is starting a new era in computing methodologies in many engineering disciplines and laboratories. Microcontroller extensions that communicate with mobile devices will take the data acquisition and control process to a new level in sensing and communication technology. The purpose of this thesis is to develop a framework to incorporate the new mobile platform, with robust embedded systems, into the engineering curriculum. For this purpose, course material was developed, "Introduction to Programming Java on a Mobile Platform", to teach novice programmers how to create applications, specifically on Android. Combining an introductory-level programming class with the Android platform can appeal to non-programming individuals in multiple disciplines. The proposed course curriculum reduces the learning time and allows senior engineering students to use the new framework for their specific needs in labs, such as mobile data acquisition and control projects. This work provides techniques for instructors with a modest programming background to teach cutting-edge technology, namely smartphone programming. Techniques developed in this work minimize unnecessary information carried into current teaching approaches with hands-on practice. It also helps students with minimal background overcome the barriers that have evolved around computer programming. The motivation of this thesis is to create a tailored introductory programming course to teach Java programming on Android by incorporating selected efficient methods from the extant literature. The mechanism proposed in this thesis is to keep students motivated by an active approach based on student-centered learning with collaborative work. Teamwork through pair programming is adopted in this teaching process. Bloom's taxonomy, along with a knowledge survey, is used as a guide to classify the information and exercise problems. A prototype curriculum is a deliverable of this research that is suitable for novice programmers, such as engineering freshmen. It also contains advanced material that allows senior students to use a mobile phone and a microcontroller system to enhance engineering laboratories.

  5. Generation of Classical DInSAR and PSI Ground Motion Maps on a Cloud Thematic Platform

    NASA Astrophysics Data System (ADS)

    Mora, Oscar; Ordoqui, Patrick; Romero, Laia

    2016-08-01

    This paper presents the experience of ALTAMIRA INFORMATION in deploying InSAR (Synthetic Aperture Radar Interferometry) services on the Geohazard Exploitation Platform (GEP), supported by ESA. Two different processing chains are presented together with ground motion maps obtained from cloud computing: DIAPASON for classical DInSAR and SPN (Stable Point Network) for PSI (Persistent Scatterer Interferometry) processing. The product obtained from DIAPASON is the interferometric phase related to ground motion (phase fringes from a SAR pair). SPN provides motion data (mean velocity and time series) on high-quality pixels from a stack of SAR images. DIAPASON is already implemented, and SPN is under development to be exploited with historical data coming from the ERS-1/2 and ENVISAT satellites, and current acquisitions of SENTINEL-1 in SLC and TOPSAR modes.

  6. Data-acquisition system for environmental monitoring aboard a twin-engined aircraft

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tichler, J.; Bernstein, H.; Brown, R.M.

    A number of experimental platforms have been used in support of the Multistate Atmospheric Power Production Study (MAP3S) and the Coastal Meteorology programs at Brookhaven National Laboratory. These platforms include a twin-engine Britten Norman Islander aircraft, a motorized van, a variety of boats and temporary enclosures set up in the field. Each platform carried a data logger consisting of a multiplexer, an analog to digital (A/D) converter and a four track endless loop magnetic tape for data storage. In recent years it has become increasingly evident that the data loggers in use were no longer adequate. Since the aircraft provided the most constraints on the data acquisition system as well as being the most important research platform, a data system was designed for that platform with the secondary goal that the system would serve as a prototype for systems to be used on other platforms.

  7. 77 FR 10714 - Department of the Treasury Acquisition Regulation; Internet Payment Platform

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-23

    ...The Department of the Treasury is proposing to amend the Department of the Treasury Acquisition Regulation (DTAR) to implement use of the Internet Payment Platform, a centralized electronic invoicing and payment information system, and to change the definition of bureau to reflect the consolidation on July 21, 2011 of the Office of Thrift Supervision with the Office of the Comptroller of the Currency.

  8. An Assessment of Iterative Reconstruction Methods for Sparse Ultrasound Imaging

    PubMed Central

    Valente, Solivan A.; Zibetti, Marcelo V. W.; Pipa, Daniel R.; Maia, Joaquim M.; Schneider, Fabio K.

    2017-01-01

    Ultrasonic image reconstruction using inverse problems has recently appeared as an alternative to enhance ultrasound imaging over beamforming methods. This approach depends on the accuracy of the acquisition model used to represent transducers, reflectivity, and medium physics. Iterative methods, well known in general sparse signal reconstruction, are also suited for imaging. In this paper, a discrete acquisition model is assessed by solving a linear system of equations by an ℓ1-regularized least-squares minimization, where the solution sparsity may be adjusted as desired. The paper surveys 11 variants of four well-known algorithms for sparse reconstruction, and assesses their optimization parameters with the goal of finding the best approach for iterative ultrasound imaging. The strategy for the model evaluation consists of using two distinct datasets. We first generate data from a synthetic phantom that mimics real targets inside a professional ultrasound phantom device. This dataset is contaminated with Gaussian noise with an estimated SNR, and all methods are assessed by their resulting images and performances. The model and methods are then assessed with real data collected by a research ultrasound platform when scanning the same phantom device, and results are compared with beamforming. A distinct real dataset is finally used to further validate the proposed modeling. Although high computational effort is required by iterative methods, results show that the discrete model may lead to images closer to the ground truth than traditional beamforming. However, the computing capabilities of current platforms need to evolve before the frame rates currently delivered by ultrasound equipment are achievable. PMID:28282862
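
    One of the well-known iterative solvers in this family is ISTA (iterative shrinkage-thresholding); the sketch below solves min_x 0.5*||Ax - b||^2 + lam*||x||_1 and is illustrative of the surveyed class of methods, not the paper's exact variants.

      import numpy as np

      def ista(A, b, lam, n_iter=200):
          L = np.linalg.norm(A, 2) ** 2       # Lipschitz constant of the gradient
          x = np.zeros(A.shape[1])
          for _ in range(n_iter):
              z = x - A.T @ (A @ x - b) / L   # gradient step on the data term
              x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
          return x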

  9. A low-cost computer-controlled Arduino-based educational laboratory system for teaching the fundamentals of photovoltaic cells

    NASA Astrophysics Data System (ADS)

    Zachariadou, K.; Yiasemides, K.; Trougkakos, N.

    2012-11-01

    We present a low-cost, fully computer-controlled, Arduino-based, educational laboratory (SolarInsight) to be used in undergraduate university courses concerned with electrical engineering and physics. The major goal of the system is to provide students with the necessary instrumentation, software tools and methodology in order to learn fundamental concepts of semiconductor physics by exploring the process of an experimental physics inquiry. The system runs under the Windows operating system and is composed of a data acquisition/control board, a power supply and processing boards, sensing elements, a graphical user interface and data analysis software. The data acquisition/control board is based on the Arduino open source electronics prototyping platform. The graphical user interface and communication with the Arduino are developed in C# and C++ programming languages respectively, by using IDE Microsoft Visual Studio 2010 Professional, which is freely available to students. Finally, the data analysis is performed by using the open source, object-oriented framework ROOT. Currently the system supports five teaching activities, each one corresponding to an independent tab in the user interface. SolarInsight has been partially developed in the context of a diploma thesis conducted within the Technological Educational Institute of Piraeus under the co-supervision of the Physics and Electronic Computer Systems departments’ academic staff.
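
    The published user interface is written in C#; the sketch below shows an equivalent acquisition loop in Python with pyserial. The port name, baud rate, and the "voltage,current" line format sent by the Arduino firmware are assumptions.

      import serial  # pip install pyserial

      with serial.Serial("/dev/ttyACM0", 115200, timeout=1.0) as port:
          for _ in range(100):                # read 100 I-V samples
              line = port.readline().decode("ascii").strip()
              if not line:
                  continue                    # timeout: no sample this round
              voltage, current = (float(f) for f in line.split(","))
              print(f"V={voltage:.3f} V  I={current:.6f} A")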

  10. Free Global Dsm Assessment on Large Scale Areas Exploiting the Potentialities of the Innovative Google Earth Engine Platform

    NASA Astrophysics Data System (ADS)

    Nascetti, A.; Di Rita, M.; Ravanelli, R.; Amicuzi, M.; Esposito, S.; Crespi, M.

    2017-05-01

    The high-performance cloud-computing platform Google Earth Engine has been developed for global-scale analysis based on Earth observation data. In particular, in this work, the geometric accuracy of the two most used nearly-global free DSMs (SRTM and ASTER) has been evaluated on the territories of four American States (Colorado, Michigan, Nevada, Utah) and one Italian Region (Trentino Alto-Adige, Northern Italy), exploiting the potential of this platform. These are large areas characterized by different terrain morphologies, land covers and slopes. The assessment has been performed using two different reference DSMs: the USGS National Elevation Dataset (NED) and a LiDAR acquisition. The DSM accuracy has been evaluated through computation of standard statistical parameters, both at global scale (considering the whole State/Region) and as a function of the terrain morphology using several slope classes. The geometric accuracy, in terms of standard deviation and NMAD, ranges for SRTM from 2-3 meters in the first slope class to about 45 meters in the last one, whereas for ASTER the values range from 5-6 to 30 meters. In general, the performed analysis shows a better accuracy for SRTM in flat areas, whereas the ASTER GDEM is more reliable in steep areas, where the slopes increase. These preliminary results highlight the potential of GEE to perform DSM assessment on a global scale.
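
    The two accuracy measures quoted above can be reproduced from per-pixel elevation differences; NMAD is 1.4826 times the median absolute deviation from the median. The slope-class edges below are illustrative, not the paper's exact bins.

      import numpy as np

      def nmad(d):
          # normalized median absolute deviation of the elevation differences
          return 1.4826 * np.median(np.abs(d - np.median(d)))

      def accuracy_by_slope(diff, slope, edges=(0, 5, 10, 20, 40, 90)):
          stats = {}
          for lo, hi in zip(edges[:-1], edges[1:]):
              d = diff[(slope >= lo) & (slope < hi)]
              stats[f"{lo}-{hi} deg"] = (d.std(), nmad(d))
          return stats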

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wackerbarth, David

    Sandia National Laboratories has developed a computer program to review, reduce and manipulate waveform data. PlotData is designed for post-acquisition waveform data analysis and provides an advanced interactive data analysis environment. PlotData requires unidirectional waveform data with both uniform and discrete time-series measurements. PlotData operates on a National Instruments' LabVIEW™ software platform. Using PlotData, the user can capture waveform data from digitizing oscilloscopes over GPIB, USB and Ethernet interfaces from Tektronix, LeCroy or Agilent scopes. PlotData can both import and export several types of binary waveform files including, but not limited to, Tektronix .wmf files, LeCroy .trc files and x-y pair ASCII files. Waveform manipulation includes numerous math functions, integration, differentiation, smoothing, truncation, and other specialized data reduction routines such as VISAR, POV, PVDF (Bauer) piezoelectric gauges, and piezoresistive gauges such as carbon manganin pressure gauges.
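
    The generic waveform reductions named above (integration, differentiation, smoothing) are textbook operations on a uniformly sampled trace; the sketch below shows them in Python/NumPy and is not PlotData's internal code.

      import numpy as np

      def integrate(t, v):
          # cumulative trapezoidal integral, same length as the input
          dt = np.diff(t)
          return np.concatenate(([0.0], np.cumsum(0.5 * (v[1:] + v[:-1]) * dt)))

      def differentiate(t, v):
          return np.gradient(v, t)            # central differences

      def smooth(v, width=11):
          # moving-average smoothing with an odd window width
          return np.convolve(v, np.ones(width) / width, mode="same")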

  12. Development of a cloud-based system for remote monitoring of a PVT panel

    NASA Astrophysics Data System (ADS)

    Saraiva, Luis; Alcaso, Adérito; Vieira, Paulo; Ramos, Carlos Figueiredo; Cardoso, Antonio Marques

    2016-10-01

    The paper presents a monitoring system developed for a solar energy conversion system known as a photovoltaic-thermal (PVT) panel. The project was implemented using two embedded microcontroller platforms (Arduino Leonardo and Arduino Yún), wireless transmission systems (Wi-Fi and XBee) and cloud computing (Google Cloud). The main objective of the project is to provide remote access and real-time data monitoring (electrical current, electrical voltage, input fluid temperature, output fluid temperature, backward fluid temperature, upper PV glass temperature, lower PV glass temperature, ambient temperature, solar radiation, wind speed, wind direction and fluid mass flow). This project demonstrates the feasibility of using inexpensive microcontroller platforms and free Internet services on the Web to support the remote study of renewable energy systems, eliminating the need to acquire dedicated systems that are typically more expensive and limited in the kind of processing proposed.
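
    A hedged sketch of the upload step such a system performs: one reading posted as JSON to a cloud endpoint. The URL and payload field names are assumptions, and the requests library stands in for whatever HTTP client the deployed firmware uses.

      import time
      import requests

      ENDPOINT = "https://example.com/pvt/readings"   # hypothetical endpoint

      reading = {
          "t": time.time(),
          "electrical_current_A": 2.41,
          "electrical_voltage_V": 31.8,
          "input_fluid_temp_C": 24.6,
          "solar_radiation_Wm2": 812.0,
      }
      resp = requests.post(ENDPOINT, json=reading, timeout=5.0)
      resp.raise_for_status()                 # fail loudly on upload errors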

  13. The Spacelab Instrument Pointing System (IPS) and its first flight

    NASA Astrophysics Data System (ADS)

    Heusmann, H.; Wolf, P.

    1985-11-01

    The development of the Instrument Pointing System (IPS) as part of Spacelab's experimental apparatus for open Pallet direct space exposure, and its test flight aboard the Shuttle Orbiter are discussed. The IPS is a three-axis-controlled platform with stellar, sun and earth pointing modes, and a better than 1 arcsec pointing ability. The development of an 'inside-out gimbal' configuration with the platform acting like a joint between the unstable Shuttle and the inertially stabilized payload facilitated close to hemispherical pointing and the adaptability for payloads of almost any size. Gimbal axes torquers counteract Orbiter acceleration due to crew movement and thruster firings, and facilitate target acquisition and precision pointing, by command from a crew-engaged computer preprogrammed for all possible control steps. Carrying an experimental solar-physics payload, the IPS correctly performed all intended functions and withstood launch and orbital loads. Several anomalies were detected and successfully corrected in-flight.

  14. Learning tactile skills through curious exploration

    PubMed Central

    Pape, Leo; Oddo, Calogero M.; Controzzi, Marco; Cipriani, Christian; Förster, Alexander; Carrozza, Maria C.; Schmidhuber, Jürgen

    2012-01-01

    We present curiosity-driven, autonomous acquisition of tactile exploratory skills on a biomimetic robot finger equipped with an array of microelectromechanical touch sensors. Instead of building tailored algorithms for solving a specific tactile task, we employ a more general curiosity-driven reinforcement learning approach that autonomously learns a set of motor skills in absence of an explicit teacher signal. In this approach, the acquisition of skills is driven by the information content of the sensory input signals relative to a learner that aims at representing sensory inputs using fewer and fewer computational resources. We show that, from initially random exploration of its environment, the robotic system autonomously develops a small set of basic motor skills that lead to different kinds of tactile input. Next, the system learns how to exploit the learned motor skills to solve supervised texture classification tasks. Our approach demonstrates the feasibility of autonomous acquisition of tactile skills on physical robotic platforms through curiosity-driven reinforcement learning, overcomes typical difficulties of engineered solutions for active tactile exploration and underactuated control, and provides a basis for studying developmental learning through intrinsic motivation in robots. PMID:22837748
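
    A toy illustration of the intrinsic-reward idea described above: each skill's reward is its learning progress, i.e., the decrease in prediction error of a model of its sensory outcome. This greedy bandit over a handful of "skills" is purely illustrative and is not the paper's learner.

      import numpy as np

      rng = np.random.default_rng(0)
      K = 4                               # number of candidate motor skills
      pred = np.zeros(K)                  # predicted sensory outcome per skill
      last_err = np.full(K, np.inf)       # previous prediction error per skill
      progress = np.ones(K)               # intrinsic-reward (curiosity) estimates

      for step in range(1000):
          k = int(np.argmax(progress))            # pick the most promising skill
          outcome = rng.normal(loc=k, scale=0.1)  # stand-in tactile observation
          err = abs(outcome - pred[k])
          if np.isfinite(last_err[k]):
              # learning progress = how much the prediction error dropped
              progress[k] = 0.9 * progress[k] + 0.1 * max(last_err[k] - err, 0.0)
          pred[k] += 0.1 * (outcome - pred[k])    # improve the outcome model
          last_err[k] = err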

  15. Use of application containers and workflows for genomic data analysis.

    PubMed

    Schulz, Wade L; Durant, Thomas J S; Siddon, Alexa J; Torres, Richard

    2016-01-01

    The rapid acquisition of biological data and development of computationally intensive analyses has led to a need for novel approaches to software deployment. In particular, the complexity of common analytic tools for genomics makes them difficult to deploy and decreases the reproducibility of computational experiments. Recent technologies that allow for application virtualization, such as Docker, allow developers and bioinformaticians to isolate these applications and deploy secure, scalable platforms that have the potential to dramatically increase the efficiency of big data processing. While limitations exist, this study demonstrates a successful implementation of a pipeline with several discrete software applications for the analysis of next-generation sequencing (NGS) data. With this approach, we significantly reduced the amount of time needed to perform clonal analysis from NGS data in acute myeloid leukemia.

  16. The Unlock Project: a Python-based framework for practical brain-computer interface communication "app" development.

    PubMed

    Brumberg, Jonathan S; Lorenz, Sean D; Galbraith, Byron V; Guenther, Frank H

    2012-01-01

    In this paper we present a framework for reducing the development time needed for creating applications for use in non-invasive brain-computer interfaces (BCI). Our framework is primarily focused on facilitating rapid software "app" development akin to current efforts in consumer portable computing (e.g. smart phones and tablets). This is accomplished by handling intermodule communication without direct user or developer implementation, instead relying on a core subsystem for communication of standard, internal data formats. We also provide a library of hardware interfaces for common mobile EEG platforms for immediate use in BCI applications. A use-case example is described in which a user with amyotrophic lateral sclerosis participated in an electroencephalography-based BCI protocol developed using the proposed framework. We show that our software environment is capable of running in real-time with updates occurring 50-60 times per second with limited computational overhead (5 ms system lag) while providing accurate data acquisition and signal analysis.
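
    A minimal sketch of the architecture the paper describes: a core subsystem relays standard-format messages between an acquisition source and an "app" whose update method runs on a fixed ~60 Hz cycle. All names are illustrative, not the Unlock Project's actual API.

      import queue
      import time

      bus = queue.Queue()                 # core intermodule message queue

      def acquire_sample():
          # stand-in for a mobile-EEG hardware interface module
          return {"channels": [0.0] * 8, "t": time.time()}

      class SpellerApp:
          def update(self, sample):
              pass                        # decode the sample, redraw the UI, ...

      app = SpellerApp()
      period = 1.0 / 60.0                 # ~50-60 updates per second
      for _ in range(600):                # ten seconds of the main loop
          bus.put(acquire_sample())       # producer side of the core subsystem
          app.update(bus.get())           # consumer side drives the app
          time.sleep(period)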

  17. A Hardware-in-the-Loop Simulation Platform for the Verification and Validation of Safety Control Systems

    NASA Astrophysics Data System (ADS)

    Rankin, Drew J.; Jiang, Jin

    2011-04-01

    Verification and validation (V&V) of safety control system quality and performance is required prior to installing control system hardware within nuclear power plants (NPPs). Thus, the objective of the hardware-in-the-loop (HIL) platform introduced in this paper is to verify the functionality of these safety control systems. The developed platform provides a flexible simulated testing environment which enables synchronized coupling between the real and simulated world. Within the platform, National Instruments (NI) data acquisition (DAQ) hardware provides an interface between a programmable electronic system under test (SUT) and a simulation computer. Further, NI LabVIEW resides on this remote DAQ workstation for signal conversion and routing between Ethernet and standard industrial signals as well as for user interface. The platform is applied to the testing of a simplified implementation of Canadian Deuterium Uranium (CANDU) shutdown system no. 1 (SDS1) which monitors only the steam generator level of the simulated NPP. CANDU NPP simulation is performed on a Darlington NPP desktop training simulator provided by Ontario Power Generation (OPG). Simplified SDS1 logic is implemented on an Invensys Tricon v9 programmable logic controller (PLC) to test the performance of both the safety controller and the implemented logic. Prior to HIL simulation, platform availability of over 95% is achieved for the configuration used during the V&V of the PLC. Comparison of HIL simulation results to benchmark simulations shows good operational performance of the PLC following a postulated initiating event (PIE).

  18. Large Spatial Scale Ground Displacement Mapping through the P-SBAS Processing of Sentinel-1 Data on a Cloud Computing Environment

    NASA Astrophysics Data System (ADS)

    Casu, F.; Bonano, M.; de Luca, C.; Lanari, R.; Manunta, M.; Manzo, M.; Zinno, I.

    2017-12-01

    Since its launch in 2014, the Sentinel-1 (S1) constellation has played a key role in SAR data availability and dissemination all over the world. Indeed, the free and open access data policy adopted by the European Copernicus program, together with the global coverage acquisition strategy, makes the Sentinel constellation a game changer in the Earth Observation scenario. As SAR data become ubiquitous, the technological and scientific challenge is to maximize the exploitation of this huge data flow. In this direction, the use of innovative processing algorithms and distributed computing infrastructures, such as Cloud Computing platforms, can play a crucial role. In this work we present a Cloud Computing solution for the advanced interferometric (DInSAR) processing chain based on the Parallel SBAS (P-SBAS) approach, aimed at processing S1 Interferometric Wide Swath (IWS) data for the generation of large spatial scale deformation time series in an efficient, automatic and systematic way. This DInSAR chain ingests Sentinel-1 SLC images and carries out several processing steps, to finally compute deformation time series and mean deformation velocity maps. Different parallel strategies have been designed ad hoc for each processing step of the P-SBAS S1 chain, encompassing both multi-core and multi-node programming techniques, in order to maximize the computational efficiency achieved within a Cloud Computing environment and cut down the relevant processing times. The presented P-SBAS S1 processing chain has been implemented on the Amazon Web Services platform, and a thorough analysis of the attained parallel performance has been carried out to identify and overcome the major bottlenecks to scalability. The presented approach is used to perform national-scale DInSAR analyses over Italy, involving the processing of more than 3000 S1 IWS images acquired from both ascending and descending orbits. This experiment confirms the great advantage of exploiting the large computational and storage resources of Cloud Computing platforms for large-scale DInSAR analysis. The presented Cloud Computing P-SBAS processing chain can be a precious tool for developing operational services, available to the EO scientific community, related to hazard monitoring and risk prevention and mitigation.
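
    At its coarsest granularity, the multi-core strategy can be illustrated by distributing independent interferometric pairs over a worker pool; process_pair below is a stand-in for a full DInSAR step, and the pair list is illustrative.

      from multiprocessing import Pool

      def process_pair(pair):
          ref, sec = pair
          # ... coregistration, interferogram formation, unwrapping ...
          return f"ifg_{ref}_{sec}"

      if __name__ == "__main__":
          pairs = [(i, i + 1) for i in range(100)]    # illustrative S1 pair list
          with Pool(processes=8) as pool:
              interferograms = pool.map(process_pair, pairs)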

  19. FUX-Sim: Implementation of a fast universal simulation/reconstruction framework for X-ray systems.

    PubMed

    Abella, Monica; Serrano, Estefania; Garcia-Blas, Javier; García, Ines; de Molina, Claudia; Carretero, Jesus; Desco, Manuel

    2017-01-01

    The availability of digital X-ray detectors, together with advances in reconstruction algorithms, creates an opportunity for bringing 3D capabilities to conventional radiology systems. The downside is that reconstruction algorithms for non-standard acquisition protocols are generally based on iterative approaches that involve a high computational burden. The development of new flexible X-ray systems could benefit from computer simulations, which may enable performance to be checked before expensive real systems are implemented. The development of simulation/reconstruction algorithms in this context poses three main difficulties. First, the algorithms deal with large data volumes and are computationally expensive, thus leading to the need for hardware and software optimizations. Second, these optimizations are limited by the high flexibility required to explore new scanning geometries, including fully configurable positioning of source and detector elements. And third, the evolution of the various hardware setups increases the effort required for maintaining and adapting the implementations to current and future programming models. Previous works lack support for completely flexible geometries and/or compatibility with multiple programming models and platforms. In this paper, we present FUX-Sim, a novel X-ray simulation/reconstruction framework that was designed to be flexible and fast. Optimized implementation for different families of GPUs (CUDA and OpenCL) and multi-core CPUs was achieved thanks to a modularized approach based on a layered architecture and parallel implementation of the algorithms for both architectures. A detailed performance evaluation demonstrates that for different system configurations and hardware platforms, FUX-Sim maximizes performance with the CUDA programming model (5 times faster than other state-of-the-art implementations). Furthermore, the CPU and OpenCL programming models allow FUX-Sim to be executed over a wide range of hardware platforms.

  20. Database for the collection and analysis of clinical data and images of neoplasms of the sinonasal tract.

    PubMed

    Trimarchi, Matteo; Lund, Valerie J; Nicolai, Piero; Pini, Massimiliano; Senna, Massimo; Howard, David J

    2004-04-01

    The Neoplasms of the Sinonasal Tract software package (NSNT v 1.0) implements a complete visual database for patients with sinonasal neoplasia, facilitating standardization of data and statistical analysis. The software, which is compatible with the Macintosh and Windows platforms, provides a multiuser application with a dedicated server (on Windows NT or 2000, or Macintosh OS 9 or X, and a network of clients) together with web access, if required. The system hardware consists of an Apple Power Macintosh G4 500 MHz computer with PCI bus, 256 MB of RAM plus a 60 GB hard disk, or any IBM-compatible computer with a Pentium 2 processor. Image acquisition may be performed with different frame-grabber cards for analog or digital video input of different standards (PAL, SECAM, or NTSC) and levels of quality (VHS, S-VHS, Betacam, Mini DV, DV). The visual database is based on 4th Dimension by 4D Inc, and video compression is performed in real-time MPEG format. Six sections have been developed: demographics, symptoms, extent of disease, radiology, treatment, and follow-up. Acquisition of data includes computed tomography and magnetic resonance imaging, histology, and endoscopy images, allowing sequential comparison. Statistical analysis integral to the program provides Kaplan-Meier survival curves. The development of a dedicated, user-friendly database for sinonasal neoplasia facilitates a multicenter network and has obvious clinical and research benefits.

  1. Development of high-availability ATCA/PCIe data acquisition instrumentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Correia, Miguel; Sousa, Jorge; Batista, Antonio J.N.

    2015-07-01

    Latest Fusion energy experiments envision a quasi-continuous operation regime. In consequence, the largest experimental devices, currently in development, specify high-availability (HA) requirements for the whole plant infrastructure. HA features enable the whole facility to perform seamlessly in the case of failure of any of its components, coping with the increasing duration of plasma discharges (steady-state) and assuring safety of equipment, people, environment and investment. IPFN developed a control and data acquisition system, aiming for fast control of advanced Fusion devices, which is thus required to provide such HA features. The system is based on in-house developed Advanced Telecommunication Computing Architecture (ATCA) instrumentation modules - IO blades and data switch blades, establishing a PCIe network on the ATCA shelf's back-plane. The data switch communicates to an external host computer through a PCIe data network. At the hardware management level, the system architecture takes advantage of ATCA native redundancy and hot swap specifications to implement fail-over substitution of IO or data switch blades. A redundant host scheme is also supported by the ATCA/PCIe platform. At the software level, PCIe provides implementation of hot plug services, which translate the hardware changes to the corresponding software/operating system devices. The paper presents how the ATCA and PCIe based system can be set up to perform with the desired degree of HA, thus being suitable for advanced Fusion control and data acquisition systems. (authors)

  2. Use of application containers and workflows for genomic data analysis

    PubMed Central

    Schulz, Wade L.; Durant, Thomas J. S.; Siddon, Alexa J.; Torres, Richard

    2016-01-01

    Background: The rapid acquisition of biological data and development of computationally intensive analyses has led to a need for novel approaches to software deployment. In particular, the complexity of common analytic tools for genomics makes them difficult to deploy and decreases the reproducibility of computational experiments. Methods: Recent technologies that allow for application virtualization, such as Docker, allow developers and bioinformaticians to isolate these applications and deploy secure, scalable platforms that have the potential to dramatically increase the efficiency of big data processing. Results: While limitations exist, this study demonstrates a successful implementation of a pipeline with several discrete software applications for the analysis of next-generation sequencing (NGS) data. Conclusions: With this approach, we significantly reduced the amount of time needed to perform clonal analysis from NGS data in acute myeloid leukemia. PMID:28163975

  3. Sensor network based vehicle classification and license plate identification system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frigo, Janette Rose; Brennan, Sean M; Rosten, Edward J

    Typically, for energy efficiency and scalability purposes, sensor networks have been used in the context of environmental and traffic monitoring applications in which operations at the sensor level are not computationally intensive. But increasingly, sensor network applications require data- and compute-intensive sensors such as video cameras and microphones. In this paper, we describe the design and implementation of two such systems: a vehicle classifier based on acoustic signals and a license plate identification system using a camera. The systems are implemented in an energy-efficient manner to the extent possible using commercially available hardware, the Mica motes and the Stargate platform. Our experience in designing these systems leads us to consider an alternative, more flexible, modular, low-power mote architecture that uses a combination of FPGAs, specialized embedded processing units and sensor data acquisition systems.

  4. uSOP: A Microprocessor-Based Service-Oriented Platform for Control and Monitoring

    NASA Astrophysics Data System (ADS)

    Aloisio, Alberto; Ameli, Fabrizio; Anastasio, Antonio; Branchini, Paolo; Di Capua, Francesco; Giordano, Raffaele; Izzo, Vincenzo; Tortone, Gennaro

    2017-06-01

    uSOP is a general purpose single-board computer designed for deep embedded applications in control and monitoring of detectors, sensors, and complex laboratory equipment. In this paper, we present and discuss the main aspects of the hardware and software designs and the expandable peripheral architecture built around serial busses. We show the tests done with state-of-the-art ΔΣ 24-b ADC acquisition modules, in order to assess the achievable noise floor in a typical application. Finally, we report on the deployment of uSOP in the monitoring system framework of the Belle2 experiment, presently under construction at the KEK Laboratory (Tsukuba, Japan).

  5. A portable ECG monitoring device with Bluetooth and Holter capabilities for telemedicine applications.

    PubMed

    Lucani, Daniel; Cataldo, Giancarlos; Cruz, Julio; Villegas, Guillermo; Wong, Sara

    2006-01-01

    A prototype of a portable ECG-monitoring device has been developed for clinical and non-clinical environments as part of a telemedicine system to provide remote and continuous surveillance of patients. The device can acquire, store and/or transmit ECG signals to computer-based platforms or specially configured access points (AP) with Intranet/Internet capabilities in order to reach remote monitoring stations. Acquired data can be stored in a flash memory card in FAT16 format for later recovery, or transmitted via Bluetooth or USB to a local station or AP. This data acquisition module (DAM) operates in two modes: Holter and on-line transmission.

  6. Structure-from-motion for MAV image sequence analysis with photogrammetric applications

    NASA Astrophysics Data System (ADS)

    Schönberger, J. L.; Fraundorfer, F.; Frahm, J.-M.

    2014-08-01

    MAV systems have found increased attention in the photogrammetric community as an (autonomous) image acquisition platform for accurate 3D reconstruction. For an accurate reconstruction in feasible time, the acquired imagery requires specialized SfM software. Current systems typically use high-resolution sensors in pre-planned flight missions from large distances. We describe and evaluate a new SfM pipeline specifically designed for sequential, close-distance, and low-resolution imagery from mobile cameras with relatively high frame-rate and high overlap. Experiments demonstrate reduced computational complexity by leveraging the temporal consistency, with comparable accuracy and point density with respect to state-of-the-art systems.

  7. HTAPP: High-Throughput Autonomous Proteomic Pipeline

    PubMed Central

    Yu, Kebing; Salomon, Arthur R.

    2011-01-01

    Recent advances in the speed and sensitivity of mass spectrometers and in analytical methods, the exponential acceleration of computer processing speeds, and the availability of genomic databases from an array of species and protein information databases have led to a deluge of proteomic data. The development of a lab-based automated proteomic software platform for the automated collection, processing, storage, and visualization of expansive proteomic datasets is critically important. The high-throughput autonomous proteomic pipeline (HTAPP) described here is designed from the ground up to provide critically important flexibility for diverse proteomic workflows and to streamline the total analysis of a complex proteomic sample. This tool is comprised of software that controls the acquisition of mass spectral data along with automation of post-acquisition tasks such as peptide quantification, clustered MS/MS spectral database searching, statistical validation, and data exploration within a user-configurable lab-based relational database. The software design of HTAPP focuses on accommodating diverse workflows and providing missing software functionality to a wide range of proteomic researchers to accelerate the extraction of biological meaning from immense proteomic data sets. Although individual software modules in our integrated technology platform may have some similarities to existing tools, the true novelty of the approach described here is in the synergistic and flexible combination of these tools to provide an integrated and efficient analysis of proteomic samples. PMID:20336676

  8. Knowledge Acquisition and Management for the NASA Earth Exchange (NEX)

    NASA Astrophysics Data System (ADS)

    Votava, P.; Michaelis, A.; Nemani, R. R.

    2013-12-01

    NASA Earth Exchange (NEX) is a data, computing and knowledge collaboratory that houses NASA satellite, climate and ancillary data, where a focused community can come together to share modeling and analysis codes, scientific results, knowledge and expertise on a centralized platform with access to large supercomputing resources. As more and more projects are executed on NEX, we are increasingly focusing on capturing the knowledge of NEX users and providing mechanisms for sharing it with the community in order to facilitate reuse and accelerate research. There are many possible knowledge contributions to NEX: a wiki entry on the NEX portal contributed by a developer, information extracted from a publication in an automated way, or a workflow captured during code execution on the supercomputing platform. The goal of the NEX knowledge platform is to capture and organize this information and make it easily accessible to the NEX community and beyond. The knowledge acquisition process consists of three main facets - data and metadata, workflows and processes, and web-based information. Once the knowledge is acquired, it is processed in a number of ways, ranging from custom metadata parsers to entity extraction using natural language processing techniques. The processed information is linked with existing taxonomies and aligned with an internal ontology (which heavily reuses a number of external ontologies). This forms a knowledge graph that can then be used to improve users' search query results as well as provide additional analytics capabilities to the NEX system. Such a knowledge graph will be an important building block in creating a dynamic knowledge base for the NEX community where knowledge is both generated and easily shared.

  9. 77 FR 13069 - Department of the Treasury Acquisition Regulation; Internet Payment Platform; Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-05

    ...This document contains corrections to a notice of proposed rulemaking, which was published in the Federal Register on Thursday, February 23, 2012 (77 FR 10714), relating to the Internet Payment Platform.

  10. A Control System and Streaming DAQ Platform with Image-Based Trigger for X-ray Imaging

    NASA Astrophysics Data System (ADS)

    Stevanovic, Uros; Caselle, Michele; Cecilia, Angelica; Chilingaryan, Suren; Farago, Tomas; Gasilov, Sergey; Herth, Armin; Kopmann, Andreas; Vogelgesang, Matthias; Balzer, Matthias; Baumbach, Tilo; Weber, Marc

    2015-06-01

    High-speed X-ray imaging applications play a crucial role in non-destructive investigations of dynamics in material science and biology. On-line data analysis is necessary for quality assurance and data-driven feedback, leading to more efficient use of beam time and increased data quality. In this article we present a smart camera platform with embedded Field Programmable Gate Array (FPGA) processing that is able to stream and process data continuously in real time. The setup consists of a Complementary Metal-Oxide-Semiconductor (CMOS) sensor, an FPGA readout card, and a readout computer. It is seamlessly integrated in a new custom experiment control system called Concert that provides a more efficient way of operating a beamline by integrating device control, experiment process control, and data analysis. The potential of the embedded processing is demonstrated by implementing an image-based trigger. It records the temporal evolution of physical events with increased speed while maintaining the full field of view. The complete data acquisition system, with Concert and the smart camera platform, was successfully integrated and used for fast X-ray imaging experiments at KIT's synchrotron radiation facility ANKA.
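
    A software analogue of such an image-based trigger: fire when the mean absolute difference between consecutive frames exceeds a threshold, keeping the full field of view. The threshold and synthetic frames are illustrative; the real logic runs in FPGA fabric.

      import numpy as np

      THRESHOLD = 5.0                     # assumed trigger level (grey values)

      def triggered(prev, frame):
          diff = frame.astype(np.int32) - prev.astype(np.int32)
          return np.mean(np.abs(diff)) > THRESHOLD

      rng = np.random.default_rng(1)
      prev = np.zeros((1024, 1024), dtype=np.uint16)
      for _ in range(10):                 # stand-in for the camera frame stream
          frame = rng.integers(0, 16, size=prev.shape, dtype=np.uint16)
          if triggered(prev, frame):
              print("trigger: store the surrounding frame sequence")
          prev = frame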

  11. Instrumentino: An Open-Source Software for Scientific Instruments.

    PubMed

    Koenka, Israel Joel; Sáiz, Jorge; Hauser, Peter C

    2015-01-01

    Scientists often need to build dedicated computer-controlled experimental systems. For this purpose, it is becoming common to employ open-source microcontroller platforms, such as the Arduino. These boards and associated integrated software development environments provide affordable yet powerful solutions for the implementation of hardware control of transducers and acquisition of signals from detectors and sensors. It is, however, a challenge to write programs that allow interactive use of such arrangements from a personal computer. This task is particularly complex if some of the included hardware components are connected directly to the computer and not via the microcontroller. A graphical user interface framework, Instrumentino, was therefore developed to allow the creation of control programs for complex systems with minimal programming effort. By writing a single code file, a powerful custom user interface is generated, which enables the automatic running of elaborate operation sequences and observation of acquired experimental data in real time. The framework, which is written in Python, allows extension by users, and is made available as an open source project.

  12. Note: A disposable x-ray camera based on mass produced complementary metal-oxide-semiconductor sensors and single-board computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoidn, Oliver R.; Seidler, Gerald T., E-mail: seidler@uw.edu

    We have integrated mass-produced commercial complementary metal-oxide-semiconductor (CMOS) image sensors and off-the-shelf single-board computers into an x-ray camera platform optimized for acquisition of x-ray spectra and radiographs at energies of 2–6 keV. The CMOS sensor and single-board computer are complemented by custom mounting and interface hardware that can be easily acquired from rapid prototyping services. For single-pixel detection events, i.e., events where the deposited energy from one photon is substantially localized in a single pixel, we establish ∼20% quantum efficiency at 2.6 keV with ∼190 eV resolution and a 100 kHz maximum detection rate. The detector platform's useful intrinsic energy resolution, 5-μm pixel size, ease of use, and obvious potential for parallelization make it a promising candidate for many applications at synchrotron facilities, in laser-heating plasma physics studies, and in laboratory-based x-ray spectrometry.
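
    A sketch of single-pixel event selection as described above: keep pixels above a noise threshold whose eight neighbours all stay below it, then treat their values as deposited energy. The threshold and the implied calibration are assumptions.

      import numpy as np

      def single_pixel_events(frame, thresh=50):
          padded = np.pad(frame, 1, mode="constant")
          h, w = frame.shape
          # maximum over the eight neighbours via shifted copies of the frame
          shifts = [padded[1 + dy:1 + dy + h, 1 + dx:1 + dx + w]
                    for dy in (-1, 0, 1) for dx in (-1, 0, 1) if (dy, dx) != (0, 0)]
          isolated = (frame > thresh) & (np.max(shifts, axis=0) <= thresh)
          return frame[isolated]          # pixel values (ADU) of isolated hits

      # energies_keV = gain * single_pixel_events(frame)  # hypothetical calibration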

  13. Exposure to marijuana smoke impairs memory retrieval in mice.

    PubMed

    Niyuhire, Floride; Varvel, Stephen A; Martin, Billy R; Lichtman, Aron H

    2007-09-01

    Marijuana (Cannabis sativa) and its primary psychoactive component, delta-9-tetrahydrocannabinol (Delta(9)-THC), have long been known to disrupt cognition in humans. Although Delta(9)-THC and other cannabinoids disrupt performance in a wide range of animal models of learning and memory, few studies have investigated the effects of smoked marijuana in these paradigms. Moreover, in preclinical studies, cannabinoids are generally administered before acquisition, and because retention is generally evaluated soon afterward, it is difficult to distinguish between processes related to acquisition and retrieval. In the present study, we investigated the specific effects of marijuana smoke and injected Delta(9)-THC on acquisition versus memory retrieval in a mouse repeated acquisition Morris water-maze task. To distinguish between these processes, subjects were administered Delta(9)-THC or they were exposed to marijuana smoke either 30 min before acquisition or 30 min before the retention test. Inhalation of marijuana smoke or injected Delta(9)-THC impaired the ability of the mice to learn the location of the hidden platform and to recall the platform location once learning had already taken place. In contrast, neither drug impaired performance in a cued task in which the platform was made visible. Finally, the cannabinoid-1 (CB(1)) receptor antagonist N-(piperidin-1-yl)-5-(4-chlorophenyl)-1-(2,4-dichlorophenyl)-4-methyl-1H-pyrazole-3-carboxamide HCl (rimonabant) blocked the memory disruptive effects of both Delta(9)-THC and marijuana. These data represent the first evidence demonstrating that marijuana impairs memory retrieval through a CB(1) receptor mechanism of action and independently of its effects on sensorimotor performance, motivation, and initial acquisition.

  14. The development of data acquisition and processing application system for RF ion source

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaodan; Wang, Xiaoying; Hu, Chundong; Jiang, Caichao; Xie, Yahong; Zhao, Yuanzhe

    2017-07-01

    As the key ion source component of nuclear fusion auxiliary heating devices, the radio frequency (RF) ion source is being developed and gradually applied to provide a source plasma, with the advantages of ease of control and high reliability. In addition, it easily achieves long-pulse steady-state operation. During the development and testing of the RF ion source, a large amount of original experimental data is generated. Therefore, it is necessary to develop a stable and reliable computer data acquisition and processing application system for realizing the functions of data acquisition, storage, access, and real-time monitoring. In this paper, the development of a data acquisition and processing application system for the RF ion source is presented. The hardware platform is based on the PXI system and the software is programmed in the LabVIEW development environment. The key technologies used for the implementation of this software mainly include long-pulse data acquisition, multi-threaded processing, the transmission control communication protocol, and the Lempel-Ziv-Oberhumer data compression algorithm. This design has been tested and applied on the RF ion source, and the test results show that it works reliably and steadily. With the help of this design, stable plasma discharge data of the RF ion source are collected, stored, accessed, and monitored in real time. This design has significant practical value for RF ion source experiments.
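
    The compress-before-store pattern used for the long-pulse data looks like the round trip below. The python-lzo package exposes an equivalent compress/decompress pair; zlib from the standard library is used here as a stand-in so the sketch runs without extra dependencies.

      import zlib
      import numpy as np

      samples = np.sin(np.linspace(0, 1000, 2_000_000)).astype(np.float32)
      raw = samples.tobytes()
      packed = zlib.compress(raw, level=1)      # fast setting, LZO-like trade-off
      print(f"{len(raw)} -> {len(packed)} bytes")
      restored = np.frombuffer(zlib.decompress(packed), dtype=np.float32)
      assert np.array_equal(samples, restored)  # lossless round trip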

  15. [Testing system design and analysis for the execution units of anti-thrombotic device].

    PubMed

    Li, Zhelong; Cui, Haipo; Shang, Kun; Liao, Yuehua; Zhou, Xun

    2015-02-01

    In an anti-thrombotic pressure circulatory device, relays and solenoid valves serve as the core execution units, so the therapeutic efficacy and patient safety of the device directly depend on their performance. A new type of testing system for the relays and solenoid valves used in the anti-thrombotic device has been developed, which can test the action response time and fatigue performance of relays and solenoid valves. A PC, a data acquisition card and a test platform are used in this testing system, which is based on human-computer interaction testing modules. The testing objectives are realized by using virtual instrument technology, high-speed data acquisition technology and reasonable software design. Two sets of relays and solenoid valves were tested with the system. The results proved the universality and reliability of the testing system, so that these relays and solenoid valves can be used with confidence in the anti-thrombotic pressure circulatory equipment. The newly-developed testing system has broad prospects for promotion and application.

  16. A real-time spike sorting method based on the embedded GPU.

    PubMed

    Zelan Yang; Kedi Xu; Xiang Tian; Shaomin Zhang; Xiaoxiang Zheng

    2017-07-01

    Microelectrode arrays with hundreds of channels have been widely used to acquire neuron population signals in neuroscience studies. Online spike sorting is becoming one of the most important challenges for high-throughput neural signal acquisition systems. Graphics processing units (GPUs), with their high parallel computing capability, might provide an alternative solution for meeting the increasing real-time computational demands of spike sorting. This study reports a method for real-time spike sorting using the compute unified device architecture (CUDA), implemented on an embedded GPU (NVIDIA JETSON Tegra K1, TK1). The sorting approach is based on principal component analysis (PCA) and K-means. By analyzing the parallelism of each process, the method was further optimized in the thread memory model of the GPU. Our results showed that the GPU-based classifier on the TK1 is 37.92 times faster than a MATLAB-based classifier on a PC, while achieving the same accuracy. The high-performance computing features of the embedded GPU demonstrated in our studies suggest that embedded GPUs provide a promising platform for real-time neural signal processing.
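
    The sorting pipeline itself (PCA feature extraction followed by K-means clustering) is easy to express on a CPU with scikit-learn, as sketched below on synthetic snippets; the paper's contribution is running these steps in CUDA on the embedded GPU.

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.decomposition import PCA

      def sort_spikes(waveforms, n_components=3, n_units=4):
          feats = PCA(n_components=n_components).fit_transform(waveforms)
          return KMeans(n_clusters=n_units, n_init=10).fit_predict(feats)

      rng = np.random.default_rng(0)
      waveforms = rng.normal(size=(500, 48))   # synthetic spike snippets
      labels = sort_spikes(waveforms)          # one cluster label per spike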

  17. Practical remarks on the heart rate and saturation measurement methodology

    NASA Astrophysics Data System (ADS)

    Kowal, M.; Kubal, S.; Piotrowski, P.; Staniec, K.

    2017-05-01

    A surface reflection-based method for measuring heart rate and saturation has been introduced as one having a significant advantage over legacy methods in that it lends itself to use in special applications such as those where a person's mobility is of prime importance (e.g. during a miner's work), excluding the use of traditional clips. Then, a complete ATmega1281-based microcontroller platform is described for performing the computational tasks of signal processing and wireless transmission. The next section provides remarks regarding the basic signal processing rules, beginning with raw voltage samples of converted optical signals, their acquisition, storage and smoothing. This section ends with practical remarks demonstrating an exponential dependence between the minimum measurable heart rate and the readout resolution at different sampling frequencies for different cases of averaging depth (in bits). The following section is devoted strictly to the heart rate and hemoglobin oxygenation (saturation) measurement with the use of the presented platform, referenced against measurements obtained with a stationary certified pulse oximeter.
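
    A hedged sketch of the two quantities measured here from a reflective PPG trace: heart rate from peak spacing, and saturation from the red/infrared "ratio of ratios" R with the commonly quoted linear calibration SpO2 ≈ 110 - 25R. The coefficients are device-specific and not the paper's.

      import numpy as np
      from scipy.signal import find_peaks

      def heart_rate_bpm(ppg, fs):
          peaks, _ = find_peaks(ppg, distance=int(0.4 * fs))  # peaks >= 0.4 s apart
          return 60.0 * fs / np.mean(np.diff(peaks))

      def spo2(red, ir):
          r = (red.std() / red.mean()) / (ir.std() / ir.mean())  # AC/DC ratio of ratios
          return 110.0 - 25.0 * r         # assumed linear calibration

      fs = 100
      t = np.arange(0.0, 10.0, 1.0 / fs)
      ppg = np.sin(2 * np.pi * 1.2 * t)   # synthetic 72 bpm pulse wave
      print(heart_rate_bpm(ppg, fs))      # ~72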

  18. A Scalable Cyberinfrastructure for Interactive Visualization of Terascale Microscopy Data

    PubMed Central

    Venkat, A.; Christensen, C.; Gyulassy, A.; Summa, B.; Federer, F.; Angelucci, A.; Pascucci, V.

    2017-01-01

    The goal of the recently emerged field of connectomics is to generate a wiring diagram of the brain at different scales. To identify brain circuitry, neuroscientists use specialized microscopes to perform multichannel imaging of labeled neurons at a very high resolution. CLARITY tissue clearing allows imaging labeled circuits through entire tissue blocks, without the need for tissue sectioning and section-to-section alignment. Imaging the large and complex non-human primate brain with sufficient resolution to identify and disambiguate between axons, in particular, produces massive data, creating great computational challenges to the study of neural circuits. Researchers require novel software capabilities for compiling, stitching, and visualizing large imagery. In this work, we detail the image acquisition process and a hierarchical streaming platform, ViSUS, that enables interactive visualization of these massive multi-volume datasets using a standard desktop computer. The ViSUS visualization framework has previously been shown to be suitable for 3D combustion simulation, climate simulation and visualization of large scale panoramic images. The platform is organized around a hierarchical cache oblivious data layout, called the IDX file format, which enables interactive visualization and exploration in ViSUS, scaling to the largest 3D images. In this paper we showcase the VISUS framework used in an interactive setting with the microscopy data. PMID:28638896

  19. A Scalable Cyberinfrastructure for Interactive Visualization of Terascale Microscopy Data.

    PubMed

    Venkat, A; Christensen, C; Gyulassy, A; Summa, B; Federer, F; Angelucci, A; Pascucci, V

    2016-08-01

    The goal of the recently emerged field of connectomics is to generate a wiring diagram of the brain at different scales. To identify brain circuitry, neuroscientists use specialized microscopes to perform multichannel imaging of labeled neurons at a very high resolution. CLARITY tissue clearing allows imaging labeled circuits through entire tissue blocks, without the need for tissue sectioning and section-to-section alignment. Imaging the large and complex non-human primate brain with sufficient resolution to identify and disambiguate between axons, in particular, produces massive data, creating great computational challenges to the study of neural circuits. Researchers require novel software capabilities for compiling, stitching, and visualizing large imagery. In this work, we detail the image acquisition process and a hierarchical streaming platform, ViSUS, that enables interactive visualization of these massive multi-volume datasets using a standard desktop computer. The ViSUS visualization framework has previously been shown to be suitable for 3D combustion simulation, climate simulation and visualization of large scale panoramic images. The platform is organized around a hierarchical cache oblivious data layout, called the IDX file format, which enables interactive visualization and exploration in ViSUS, scaling to the largest 3D images. In this paper we showcase the VISUS framework used in an interactive setting with the microscopy data.

  20. An Interview with Matthew P. Greving, PhD. Interview by Vicki Glaser.

    PubMed

    Greving, Matthew P

    2011-10-01

    Matthew P. Greving is Chief Scientific Officer at Nextval Inc., a company founded in early 2010 that has developed a discovery platform called MassInsight™. He received his PhD in Biochemistry from Arizona State University, and prior to that he spent nearly 7 years working as a software engineer. This experience in solving complex computational problems fueled his interest in developing technologies and algorithms related to the acquisition and analysis of high-dimensional biochemical data. To address the existing problems associated with label-based microarray readouts, he began work on a technique for label-free mass spectrometry (MS) microarray readout compatible with both matrix-assisted laser desorption/ionization (MALDI) and matrix-free nanostructure initiator mass spectrometry (NIMS). This is the core of Nextval's MassInsight technology, which utilizes picoliter noncontact deposition of high-density arrays on mass-readout substrates along with computational algorithms for high-dimensional data processing and reduction.

  1. Computer assisted operations in Petroleum Development Oman (PDO)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Al-Hinai, S.H.; Mutimer, K.

    1995-10-01

    Petroleum Development Oman (PDO) currently produces some 750,000 bopd and 900,000 bwpd from some 74 fields in a large geographical area and diverse operating conditions. A key corporate objective is to reduce operating costs by exploiting productivity gains from proven technology. Automation is seen as a means of managing the rapid growth of the well population and production facilities. The overall objective is to improve field management through continuous monitoring of wells and facilities and dissemination of data throughout the whole organization. A major upgrade of PDO's field Supervisory Control and Data Acquisition (SCADA) system is complete, providing a platform to exploit new initiatives, particularly for production optimization of artificial lift systems and automatic well testing using multi selector valves, Coriolis flow meter measurements and multi-component (oil, gas, water) flowmeters. The paper describes PDO's experience, including the benefits and challenges which have to be managed when developing Computer Assisted Operations (CAO).

  2. Computing Platforms for Big Biological Data Analytics: Perspectives and Challenges.

    PubMed

    Yin, Zekun; Lan, Haidong; Tan, Guangming; Lu, Mian; Vasilakos, Athanasios V; Liu, Weiguo

    2017-01-01

    The last decade has witnessed an explosion in the amount of available biological sequence data, due to the rapid progress of high-throughput sequencing projects. However, the amount of biological data is becoming so great that traditional data analysis platforms and methods can no longer meet the need to rapidly perform data analysis tasks in the life sciences. As a result, both biologists and computer scientists are facing the challenge of gaining a profound insight into the deepest biological functions from big biological data. This in turn requires massive computational resources. Therefore, high performance computing (HPC) platforms are highly needed, as well as efficient and scalable algorithms that can take advantage of these platforms. In this paper, we survey the state-of-the-art HPC platforms for big biological data analytics. We first list the characteristics of big biological data and popular computing platforms. Then we provide a taxonomy of different biological data analysis applications and a survey of the way they have been mapped onto various computing platforms. After that, we present a case study to compare the efficiency of different computing platforms for handling the classical biological sequence alignment problem. Finally, we discuss the open issues in big biological data analytics.
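
    The case study's workload, classical sequence alignment, reduces to a quadratic dynamic-programming kernel; the local-alignment (Smith-Waterman) score below is the computation that the surveyed HPC platforms accelerate, shown here in plain Python for clarity.

      import numpy as np

      def smith_waterman(a, b, match=2, mismatch=-1, gap=-2):
          H = np.zeros((len(a) + 1, len(b) + 1), dtype=np.int32)
          for i in range(1, len(a) + 1):
              for j in range(1, len(b) + 1):
                  diag = H[i - 1, j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
                  H[i, j] = max(0, diag, H[i - 1, j] + gap, H[i, j - 1] + gap)
          return int(H.max())             # best local alignment score

      print(smith_waterman("GATTACA", "GCATGCU"))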

  3. Computational Modeling for Language Acquisition: A Tutorial With Syntactic Islands.

    PubMed

    Pearl, Lisa S; Sprouse, Jon

    2015-06-01

    Given the growing prominence of computational modeling in the acquisition research community, we present a tutorial on how to use computational modeling to investigate learning strategies that underlie the acquisition process. This is useful for understanding both typical and atypical linguistic development. We provide a general overview of why modeling can be a particularly informative tool and some general considerations when creating a computational acquisition model. We then review a concrete example of a computational acquisition model for complex structural knowledge referred to as syntactic islands. This includes an overview of syntactic islands knowledge, a precise definition of the acquisition task being modeled, the modeling results, and how to meaningfully interpret those results in a way that is relevant for questions about knowledge representation and the learning process. Computational modeling is a powerful tool that can be used to understand linguistic development. The general approach presented here can be used to investigate any acquisition task and any learning strategy, provided both are precisely defined.

  4. Integrated instrumentation & computation environment for GRACE

    NASA Astrophysics Data System (ADS)

    Dhekne, P. S.

    2002-03-01

    The project GRACE (Gamma Ray Astrophysics with Coordinated Experiments) aims at setting up a state-of-the-art Gamma Ray Observatory at Mt. Abu, Rajasthan for undertaking comprehensive scientific exploration over a wide spectral window (10's keV - 100's TeV) from a single location through 4 coordinated experiments. The cumulative data collection rate of all the telescopes is expected to be about 1 GB/hr, necessitating innovations in the data management environment. The real-time data acquisition and control as well as the off-line data processing, analysis and visualization environments of these systems are based on the use of cutting-edge and affordable technologies in the fields of computers, communications and the Internet. We propose to provide a single, unified environment by seamless integration of instrumentation and computation, taking advantage of the recent advancements in Web-based technologies. This new environment will allow researchers better access to facilities, improve resource utilization and enhance collaborations by having identical environments for online as well as offline usage of this facility from any location. We present here a proposed implementation strategy for a platform-independent web-based system that supplements automated functions with video-guided interactive and collaborative remote viewing, remote control through a virtual instrumentation console, remote acquisition of telescope data, data analysis, data visualization and an active imaging system. This end-to-end web-based solution will enhance collaboration among researchers at the national and international level for undertaking scientific studies using the telescope systems of the GRACE project.

  5. Satellite data-relay activities in Arizona

    USGS Publications Warehouse

    Boner, F.C.; Blee, J.W.; Shope, W.G.

    1985-01-01

    The U.S. Geological Survey (USGS) Arizona District collects data from automated streamflow stations for a wide variety of uses. Data from these stations are provided to Federal, State, and local agencies that have a responsibility to issue flood warnings; to generate forecasts of water availability; to monitor flow to insure compliance with treaties and other legal mandates; and to manage reservoirs for hydropower, flood abatement, and municipal and irrigation water supply. In the mid-1970's, the escalation of data collection costs and a need for more timely data led the Arizona District to examine alternatives for remote data acquisition. On the basis of successful data communications experiments with NASA's Landsat satellite, an operational system for satellite-data relay was developed in 1976 using the National Oceanic and Atmospheric Administration's (NOAA) Geostationary Operational Environmental Satellite (GOES). A total of 62 data collection platforms (DCPs) was operated in 1983. Satellite telemetry operations are controlled at the remote data-collection stations by small battery-operated data collection platforms. The DCPs periodically collect data from the sensors, store the data in computer memory, and at preset times transmit the data to the GOES satellite. The satellite retransmits the data to Earth where a ground-receive station transmits or transfers the data by land communications to the USGS computer in Reston, Virginia, for processing. The satellite relay transfers the data from sensor to computer in minutes; therefore, the data are available to users on a near real-time basis. (Author's abstract)

  6. Smart Water: Energy-Water Optimization in Drinking Water Systems

    EPA Science Inventory

    This project aims to develop and commercialize a Smart Water Platform – Sensor-based Data-driven Energy-Water Optimization technology in drinking water systems. The key technological advances rely on cross-platform data acquisition and management system, model-based real-time sys...

  7. A miniaturized NQR spectrometer for a multi-channel NQR-based detection device

    NASA Astrophysics Data System (ADS)

    Beguš, Samo; Jazbinšek, Vojko; Pirnat, Janez; Trontelj, Zvonko

    2014-10-01

    A low frequency (0.5-5 MHz) battery operated sensitive pulsed NQR spectrometer with a transmitter power up to 5 W and a total mass of about 3 kg aimed at detecting 14N NQR signals, predominantly of illicit materials, was designed and assembled. This spectrometer uses a standard software defined radio (SDR) platform for the data acquisition unit. Signal processing is done with a LabVIEW virtual instrument on a personal computer. We successfully tested the spectrometer by measuring 14N NQR signals from aminotetrazole monohydrate (ATMH), potassium nitrate (PN), paracetamol (PCM) and trinitrotoluene (TNT). Such a spectrometer is a feasible component of a portable single or multichannel 14N NQR based detection device.

  8. 3D Data Acquisition Platform for Human Activity Understanding

    DTIC Science & Technology

    2016-03-02

    In this project, we incorporated motion capture devices, 3D vision sensors, and EMG sensors to cross-validate multimodality data acquisition, and to address fundamental research problems of representation and invariant description of 3D data, human motion modeling, and understanding of human activity from 3D data. The support for the acquisition of such research instrumentation has significantly facilitated our current and future research and education.

  9. 7. Perimeter acquisition radar power plant room #202, battery equipment ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. Perimeter acquisition radar power plant room #202, battery equipment room; showing battery room (in background) and multiple source power converter (in foreground). The picture offers another look at the shock-isolation system developed for each platform - Stanley R. Mickelsen Safeguard Complex, Perimeter Acquisition Radar Power Plant, In Limited Access Area, Southwest of PARB at end of Service Road B, Nekoma, Cavalier County, ND

  10. 48 CFR 27.405-3 - Commercial computer software.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... software. 27.405-3 Section 27.405-3 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION... Commercial computer software. (a) When contracting other than from GSA's Multiple Award Schedule contracts for the acquisition of commercial computer software, no specific contract clause prescribed in this...

  11. 48 CFR 27.405-3 - Commercial computer software.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... software. 27.405-3 Section 27.405-3 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION... Commercial computer software. (a) When contracting other than from GSA's Multiple Award Schedule contracts for the acquisition of commercial computer software, no specific contract clause prescribed in this...

  12. 48 CFR 27.405-3 - Commercial computer software.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... software. 27.405-3 Section 27.405-3 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION... Commercial computer software. (a) When contracting other than from GSA's Multiple Award Schedule contracts for the acquisition of commercial computer software, no specific contract clause prescribed in this...

  13. 48 CFR 27.405-3 - Commercial computer software.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... software. 27.405-3 Section 27.405-3 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION... Commercial computer software. (a) When contracting other than from GSA's Multiple Award Schedule contracts for the acquisition of commercial computer software, no specific contract clause prescribed in this...

  14. 48 CFR 27.405-3 - Commercial computer software.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... software. 27.405-3 Section 27.405-3 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION... Commercial computer software. (a) When contracting other than from GSA's Multiple Award Schedule contracts for the acquisition of commercial computer software, no specific contract clause prescribed in this...

  15. Design, Results, Evolution and Status of the ATLAS Simulation at Point1 Project

    NASA Astrophysics Data System (ADS)

    Ballestrero, S.; Batraneanu, S. M.; Brasolin, F.; Contescu, C.; Fazio, D.; Di Girolamo, A.; Lee, C. J.; Pozo Astigarraga, M. E.; Scannicchio, D. A.; Sedov, A.; Twomey, M. S.; Wang, F.; Zaytsev, A.

    2015-12-01

    During the LHC Long Shutdown 1 (LS1) period, which started in 2013, the Simulation at Point1 (Sim@P1) project takes advantage, in an opportunistic way, of the TDAQ (Trigger and Data Acquisition) HLT (High-Level Trigger) farm of the ATLAS experiment. This farm provides more than 1300 compute nodes, which are particularly suited for running event generation and Monte Carlo production jobs that are mostly CPU and not I/O bound. It is capable of running up to 2700 Virtual Machines (VMs) each with 8 CPU cores, for a total of up to 22000 parallel jobs. This contribution gives a review of the design, the results, and the evolution of the Sim@P1 project, operating a large scale OpenStack based virtualized platform deployed on top of the ATLAS TDAQ HLT farm computing resources. During LS1, Sim@P1 was one of the most productive ATLAS sites: it delivered more than 33 million CPU-hours and it generated more than 1.1 billion Monte Carlo events. The design aspects are presented: the virtualization platform exploited by Sim@P1 avoids interferences with TDAQ operations and it guarantees the security and the usability of the ATLAS private network. The cloud mechanism allows the separation of the needed support on both infrastructural (hardware, virtualization layer) and logical (Grid site support) levels. This paper focuses on the operational aspects of such a large system during the upcoming LHC Run 2 period: simple, reliable, and efficient tools are needed to quickly switch from Sim@P1 to TDAQ mode and back, to exploit the resources when they are not used for the data acquisition, even for short periods. The evolution of the central OpenStack infrastructure is described, as it was upgraded from the Folsom to the Icehouse release, including the scalability issues addressed.

  16. Automatic and accurate reconstruction of distal humerus contours through B-Spline fitting based on control polygon deformation.

    PubMed

    Mostafavi, Kamal; Tutunea-Fatan, O Remus; Bordatchev, Evgueni V; Johnson, James A

    2014-12-01

    The strong advent of computer-assisted technologies in modern orthopedic surgery prompts the development of computationally efficient techniques built on the broad base of readily available computer-aided engineering tools. However, one of the common challenges faced during the current developmental phase continues to be the lack of reliable frameworks allowing a fast and precise conversion of the anatomical information acquired through computed tomography to a format that is acceptable to computer-aided engineering software. To address this, this study proposes an integrated and automatic framework capable of extracting and then postprocessing the original imaging data into a common planar and closed B-Spline representation. The core of the developed platform relies on the approximation of the discrete computed tomography data by means of an original two-step B-Spline fitting technique based on successive deformations of the control polygon. In addition to its rapidity and robustness, the developed fitting technique was validated to produce accurate representations that do not deviate by more than 0.2 mm with respect to alternate representations of the bone geometry obtained through different contact-based data acquisition or data processing methods. © IMechE 2014.
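
    For readers unfamiliar with the target representation, the sketch below fits a closed planar B-Spline to noisy contour points using SciPy's standard least-squares routines. The paper's two-step control-polygon deformation is a different, original algorithm; this accuracy check mirrors the reported 0.2 mm criterion only in spirit.

      # Fit a closed planar B-Spline to a noisy synthetic contour
      # (standard SciPy least-squares fit, not the paper's algorithm).
      import numpy as np
      from scipy.interpolate import splprep, splev

      t = np.linspace(0, 2 * np.pi, 100, endpoint=False)
      x = np.cos(t) + 0.01 * np.random.randn(t.size)   # synthetic "contour"
      y = np.sin(t) + 0.01 * np.random.randn(t.size)

      tck, u = splprep([x, y], s=0.05, per=True)       # per=True -> closed curve
      xs, ys = splev(np.linspace(0, 1, 400), tck)      # dense evaluation

      # Deviation of the fit from the noiseless unit circle, analogous in
      # spirit to the sub-0.2 mm accuracy check reported in the paper.
      err = np.abs(np.hypot(xs, ys) - 1.0)
      print("max radial deviation:", err.max())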

  17. Acquisition and tracking for underwater optical communications

    NASA Astrophysics Data System (ADS)

    Williams, Andrew J.; Laycock, Leslie L.; Griffith, Michael S.; McCarthy, Andrew G.; Rowe, Duncan P.

    2017-10-01

    There is a growing requirement to transfer large volumes of data between underwater platforms. As seawater is transmissive in the visible band, underwater optical communications is an active area of interest since it offers the potential for power efficient, covert and high bandwidth datalinks at short to medium ranges. Short range systems have been successfully demonstrated using sources with low directionality. To realise higher data rates and/or longer ranges, the use of more efficient directional beams is required; by necessity, these must be sufficiently aligned to achieve the required link margin. For mobile platforms, the acquisition and tracking of each node is therefore critical in order to establish and maintain an optical datalink. This paper describes work undertaken to demonstrate acquisition and tracking in a 3D underwater environment. A range of optical sources, beam steering technologies, and tracking sensors have been assessed for suitability. A novel scanning strategy exploiting variable beam divergence was developed to provide robust acquisition whilst minimising acquisition time. A prototype system was assembled and demonstrated in a large water tank. This utilised custom quadrant detectors based on Silicon PhotoMultiplier (SiPM) arrays for fine tracking, and a Wide Field of View (WFoV) sCMOS camera for link acquisition. Fluidic lenses provided dynamic control of beam divergence, and AC modulation/filtering enabled background rejection. The system successfully demonstrated robust optical acquisition and tracking between two nodes with only nanowatt received optical powers. The acquisition time was shown to be dependent on the initial conditions and the transmitted optical power.
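
    As background on the fine-tracking step, quadrant detectors conventionally derive a pointing error from the four quadrant intensities. The sketch below shows that arithmetic under assumed quadrant labels; it does not model the AC modulation and filtering the authors use for background rejection.

      # Classic quadrant-detector error signals for fine tracking. With
      # quadrant intensities A (top-left), B (top-right), C (bottom-left),
      # D (bottom-right), the normalized spot offsets are
      #   ex = ((B + D) - (A + C)) / total,  ey = ((A + B) - (C + D)) / total.
      def quadrant_error(a, b, c, d):
          total = a + b + c + d
          if total == 0:
              return None                    # no signal acquired yet
          ex = ((b + d) - (a + c)) / total   # horizontal offset, -1..1
          ey = ((a + b) - (c + d)) / total   # vertical offset, -1..1
          return ex, ey

      print(quadrant_error(1.0, 1.2, 0.9, 1.1))  # slight right-of-center spot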

  18. Multispectral remote sensing from unmanned aircraft: image processing workflows and applications for rangeland environments

    USDA-ARS?s Scientific Manuscript database

    Using unmanned aircraft systems (UAS) as remote sensing platforms offers the unique ability for repeated deployment for acquisition of high temporal resolution data at very high spatial resolution. Most image acquisitions from UAS have been in the visible bands, while multispectral remote sensing ap...

  19. ISS Efforts to Fully Utilize its Target Acquisition Capability Serves as an Analog for Future Laser Pointing Communications Systems

    NASA Technical Reports Server (NTRS)

    Jackson, Dan

    2017-01-01

    The ISS is an outstanding platform for developing, testing and refining laser communications systems for future exploration. A recent ISS project which improved ISS communications satellite acquisition performance proves the platform’s utility as a laser communications systems testbed.

  20. 48 CFR 227.7203-2 - Acquisition of noncommercial computer software and computer software documentation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... noncommercial computer software and computer software documentation. 227.7203-2 Section 227.7203-2 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-2 Acquisition of noncommercial computer software and computer software documentation. (a...

  1. 48 CFR 227.7203-2 - Acquisition of noncommercial computer software and computer software documentation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... noncommercial computer software and computer software documentation. 227.7203-2 Section 227.7203-2 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-2 Acquisition of noncommercial computer software and computer software documentation. (a...

  2. 48 CFR 227.7203-2 - Acquisition of noncommercial computer software and computer software documentation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... noncommercial computer software and computer software documentation. 227.7203-2 Section 227.7203-2 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-2 Acquisition of noncommercial computer software and computer software documentation. (a...

  3. 48 CFR 227.7203-2 - Acquisition of noncommercial computer software and computer software documentation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... noncommercial computer software and computer software documentation. 227.7203-2 Section 227.7203-2 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-2 Acquisition of noncommercial computer software and computer software documentation. (a...

  4. 48 CFR 227.7203-2 - Acquisition of noncommercial computer software and computer software documentation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... noncommercial computer software and computer software documentation. 227.7203-2 Section 227.7203-2 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-2 Acquisition of noncommercial computer software and computer software documentation. (a...

  5. Computational Modeling for Language Acquisition: A Tutorial with Syntactic Islands

    ERIC Educational Resources Information Center

    Pearl, Lisa S.; Sprouse, Jon

    2015-01-01

    Purpose: Given the growing prominence of computational modeling in the acquisition research community, we present a tutorial on how to use computational modeling to investigate learning strategies that underlie the acquisition process. This is useful for understanding both typical and atypical linguistic development. Method: We provide a general…

  6. COSBID-M3: a platform for multimodal monitoring, data collection, and research in neurocritical care.

    PubMed

    Wilson, J Adam; Shutter, Lori A; Hartings, Jed A

    2013-01-01

    Neuromonitoring in patients with severe brain trauma and stroke is often limited to intracranial pressure (ICP); advanced neuroscience intensive care units may also monitor brain oxygenation (partial pressure of brain tissue oxygen, PbtO2), electroencephalogram (EEG), cerebral blood flow (CBF), or neurochemistry. For example, cortical spreading depolarizations (CSDs) recorded by electrocorticography (ECoG) are associated with delayed cerebral ischemia after subarachnoid hemorrhage and are an attractive target for novel therapeutic approaches. However, to better understand pathophysiologic relations and realize the potential of multimodal monitoring, a common platform for data collection and integration is needed. We have developed a multimodal system that integrates clinical, research, and imaging data into a single research and development (R&D) platform. Our system is adapted from the widely used BCI2000, a brain-computer interface tool which is written in the C++ language and supports over 20 data acquisition systems. It is optimized for real-time analysis of multimodal data using advanced time and frequency domain analyses and is extensible for research development using a combination of C++, MATLAB, and Python languages. Continuous streams of raw and processed data, including BP (blood pressure), ICP, PbtO2, CBF, ECoG, EEG, and patient video are stored in an open binary data format. Selected events identified in raw (e.g., ICP) or processed (e.g., CSD) measures are displayed graphically, can trigger alarms, or can be sent to researchers or clinicians via text message. For instance, algorithms for automated detection of CSD have been incorporated, and processed ECoG signals are projected onto three-dimensional (3D) brain models based on patient magnetic resonance imaging (MRI) and computed tomographic (CT) scans, allowing real-time correlation of pathoanatomy and cortical function. This platform will provide clinicians and researchers with an advanced tool to investigate pathophysiologic relationships and novel measures of cerebral status, as well as implement treatment algorithms based on such multimodal measures.

  7. SU-C-18C-06: Radiation Dose Reduction in Body Interventional Radiology: Clinical Results Utilizing a New Imaging Acquisition and Processing Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kohlbrenner, R; Kolli, KP; Taylor, A

    2014-06-01

    Purpose: To quantify the patient radiation dose reduction achieved during transarterial chemoembolization (TACE) procedures performed in a body interventional radiology suite equipped with the Philips Allura Clarity imaging acquisition and processing platform, compared to TACE procedures performed in the same suite equipped with the Philips Allura Xper platform. Methods: Total fluoroscopy time, cumulative dose area product, and cumulative air kerma were recorded for the first 25 TACE procedures performed to treat hepatocellular carcinoma (HCC) in a Philips body interventional radiology suite equipped with Philips Allura Clarity. The same data were collected for the prior 85 TACE procedures performed to treat HCC in the same suite equipped with Philips Allura Xper. Mean values from these cohorts were compared using two-tailed t tests. Results: Following installation of the Philips Allura Clarity platform, a 42.8% reduction in mean cumulative dose area product (3033.2 versus 1733.6 mGy·cm^2, p < 0.0001) and a 31.2% reduction in mean cumulative air kerma (1445.4 versus 994.2 mGy, p < 0.001) was achieved compared to similar procedures performed in the same suite equipped with the Philips Allura Xper platform. Mean total fluoroscopy time was not significantly different between the two cohorts (1679.3 versus 1791.3 seconds, p = 0.41). Conclusion: This study demonstrates a significant patient radiation dose reduction during TACE procedures performed to treat HCC after a body interventional radiology suite was converted to the Philips Allura Clarity platform from the Philips Allura Xper platform. Future work will focus on evaluation of patient dose reduction in a larger cohort of patients across a broader range of procedures and in specific populations, including obese patients and pediatric patients, and comparison of image quality between the two platforms. Funding for this study was provided by Philips Healthcare, with 5% salary support provided to authors K. Pallav Kolli and Robert G. Gould for time devoted to the study. Data acquisition and analysis were performed by the authors independent of the funding source.

  8. [Temperature Measurement with Bluetooth under Android Platform].

    PubMed

    Wang, Shuai; Shen, Hao; Luo, Changze

    2015-03-01

    To realize real-time transmission and display of temperature data using a smartphone and Bluetooth. An Arduino Uno R3 acquires temperature data from a DS18B20 digital temperature sensor and transmits them through an HC-05 Bluetooth module to an Android smartphone. Applications written in Java under the Android development environment achieve real-time display and storage of the temperature data and draw graphs of temperature fluctuations. The temperature sensor was experimentally tested to meet the precision and accuracy requirements of body temperature measurement. This work can provide a reference for the development of other smartphone-based mobile medical products.
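
    A host-side stand-in for the described data path can be sketched with pyserial, assuming the HC-05 is bound to a serial port and the firmware streams one Celsius reading per line; both assumptions are illustrative, and the paper's client is an Android/Java application rather than a desktop script.

      # Read DS18B20 temperatures streamed by an Arduino over an HC-05
      # Bluetooth serial link. Port name and line format ("25.43\n") are
      # assumptions about the firmware, not taken from the paper.
      import serial  # pyserial

      with serial.Serial("/dev/rfcomm0", 9600, timeout=2) as port:
          samples = []
          while len(samples) < 10:
              line = port.readline().decode("ascii", errors="ignore").strip()
              if line:
                  samples.append(float(line))   # degrees Celsius
          print("mean temperature: %.2f C" % (sum(samples) / len(samples)))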

  9. Electronics design of the airborne stabilized platform attitude acquisition module

    NASA Astrophysics Data System (ADS)

    Xu, Jiang; Wei, Guiling; Cheng, Yong; Li, Baolin; Bu, Hongyi; Wang, Hao; Zhang, Zhanwei; Li, Xingni

    2014-02-01

    We present an attitude acquisition module electronics design for the airborne stabilized platform. The design scheme, which is based on the integrated MEMS sensor ADIS16405, develops the attitude information processing algorithms and the hardware circuit. The hardware circuit, with a small volume of only 44.9 x 43.6 x 24.6 mm^3, has the characteristics of light weight, modularization and digitalization. The PC software interface combines a plane chart with a track line to receive and display the attitude information. Attitude calculation uses a Kalman filtering algorithm to improve the measurement accuracy of the module in a dynamic environment.

  10. Energy Consumption Management of Virtual Cloud Computing Platform

    NASA Astrophysics Data System (ADS)

    Li, Lin

    2017-11-01

    Research on energy consumption management for virtual cloud computing platforms requires a deeper understanding of the energy consumption of both virtual machines and the underlying cloud platform; only then can the problems it faces be solved. The key problem lies with data centers of high energy consumption, which calls for new scientific techniques. Virtualization technology and cloud computing have become powerful tools in everyday life, work and production because of their strengths and many advantages; both are developing rapidly and achieve very high resource utilization rates, making them essential in the constantly developing information age. This paper summarizes, explains and further analyzes the energy consumption management questions of the virtual cloud computing platform, ultimately providing a clearer understanding of the topic and practical benefits for many aspects of life and work.

  11. Security screening via computational imaging using frequency-diverse metasurface apertures

    NASA Astrophysics Data System (ADS)

    Smith, David R.; Reynolds, Matthew S.; Gollub, Jonah N.; Marks, Daniel L.; Imani, Mohammadreza F.; Yurduseven, Okan; Arnitz, Daniel; Pedross-Engel, Andreas; Sleasman, Timothy; Trofatter, Parker; Boyarsky, Michael; Rose, Alec; Odabasi, Hayrettin; Lipworth, Guy

    2017-05-01

    Computational imaging is a proven strategy for obtaining high-quality images with fast acquisition rates and simpler hardware. Metasurfaces provide exquisite control over electromagnetic fields, enabling the radiated field to be molded into unique patterns. The fusion of these two concepts can bring about revolutionary advances in the design of imaging systems for security screening. In the context of computational imaging, each field pattern serves as a single measurement of a scene; imaging a scene can then be interpreted as estimating the reflectivity distribution of a target from a set of measurements. As with any computational imaging system, the key challenge is to arrive at a minimal set of measurements from which a diffraction-limited image can be resolved. Here, we show that the information content of a frequency-diverse metasurface aperture can be maximized by design, and used to construct a complete millimeter-wave imaging system spanning a 2 m by 2 m area, consisting of 96 metasurfaces, capable of producing diffraction-limited images of human-scale targets. The metasurface-based frequency-diverse system presented in this work represents an inexpensive, but tremendously flexible alternative to traditional hardware paradigms, offering the possibility of low-cost, real-time, and ubiquitous screening platforms.
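
    The estimation step described above can be sketched as regularized linear inversion, y = H f, with each row of H one radiated field pattern; the sizes, noise level, and Tikhonov solver below are illustrative stand-ins for the authors' reconstruction pipeline.

      # Computational imaging as linear inversion: y = H f + noise, where
      # each row of H is one field pattern (measurement mode) and f is the
      # scene reflectivity. Sizes are hypothetical, for illustration.
      import numpy as np

      rng = np.random.default_rng(0)
      n_pixels, n_modes = 256, 128                  # underdetermined on purpose
      H = rng.standard_normal((n_modes, n_pixels))  # stand-in for field patterns
      f_true = np.zeros(n_pixels)
      f_true[rng.choice(n_pixels, 8, replace=False)] = 1.0  # sparse scene

      y = H @ f_true + 0.01 * rng.standard_normal(n_modes)

      # Regularized least-squares estimate (Tikhonov); real systems may use
      # matched-filter or iterative sparse solvers instead.
      lam = 0.1
      f_hat = np.linalg.solve(H.T @ H + lam * np.eye(n_pixels), H.T @ y)
      print("recovered support overlap:",
            np.intersect1d(np.argsort(-f_hat)[:8], np.flatnonzero(f_true)).size)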

  12. Data acquisition system for operational earth observation missions

    NASA Technical Reports Server (NTRS)

    Deerwester, J. M.; Alexander, D.; Arno, R. D.; Edsinger, L. E.; Norman, S. M.; Sinclair, K. F.; Tindle, E. L.; Wood, R. D.

    1972-01-01

    The data acquisition system capabilities expected to be available in the 1980 time period as part of operational Earth observation missions are identified. By data acquisition system is meant the sensor platform (spacecraft or aircraft), the sensors themselves and the communication system. Future capabilities and support requirements are projected for the following sensors: film camera, return beam vidicon, multispectral scanner, infrared scanner, infrared radiometer, microwave scanner, microwave radiometer, coherent side-looking radar, and scatterometer.

  13. Online Graph Completion: Multivariate Signal Recovery in Computer Vision.

    PubMed

    Kim, Won Hwa; Jalal, Mona; Hwang, Seongjae; Johnson, Sterling C; Singh, Vikas

    2017-07-01

    The adoption of "human-in-the-loop" paradigms in computer vision and machine learning is leading to various applications where the actual data acquisition (e.g., human supervision) and the underlying inference algorithms are closely interwined. While classical work in active learning provides effective solutions when the learning module involves classification and regression tasks, many practical issues such as partially observed measurements, financial constraints and even additional distributional or structural aspects of the data typically fall outside the scope of this treatment. For instance, with sequential acquisition of partial measurements of data that manifest as a matrix (or tensor), novel strategies for completion (or collaborative filtering) of the remaining entries have only been studied recently. Motivated by vision problems where we seek to annotate a large dataset of images via a crowdsourced platform or alternatively, complement results from a state-of-the-art object detector using human feedback, we study the "completion" problem defined on graphs, where requests for additional measurements must be made sequentially. We design the optimization model in the Fourier domain of the graph describing how ideas based on adaptive submodularity provide algorithms that work well in practice. On a large set of images collected from Imgur, we see promising results on images that are otherwise difficult to categorize. We also show applications to an experimental design problem in neuroimaging.

  14. FORCEnet Net Centric Architecture - A Standards View

    DTIC Science & Technology

    2006-06-01

    [Extraction residue from an architecture diagram; recoverable layer labels: user-facing services; shared services; networking/communications; storage; computing platform; data interchange/integration; data management; application; service platform; service framework.]

  15. A miniaturized NQR spectrometer for a multi-channel NQR-based detection device.

    PubMed

    Beguš, Samo; Jazbinšek, Vojko; Pirnat, Janez; Trontelj, Zvonko

    2014-10-01

    A low frequency (0.5-5 MHz) battery operated sensitive pulsed NQR spectrometer with a transmitter power up to 5 W and a total mass of about 3 kg aimed at detecting 14N NQR signals, predominantly of illicit materials, was designed and assembled. This spectrometer uses a standard software defined radio (SDR) platform for the data acquisition unit. Signal processing is done with a LabVIEW virtual instrument on a personal computer. We successfully tested the spectrometer by measuring 14N NQR signals from aminotetrazole monohydrate (ATMH), potassium nitrate (PN), paracetamol (PCM) and trinitrotoluene (TNT). Such a spectrometer is a feasible component of a portable single or multichannel 14N NQR based detection device. Copyright © 2014 Elsevier Inc. All rights reserved.

  16. A versatile nondestructive evaluation imaging workstation

    NASA Technical Reports Server (NTRS)

    Chern, E. James; Butler, David W.

    1994-01-01

    Ultrasonic C-scan and eddy current imaging systems are of the pointwise type evaluation systems that rely on a mechanical scanner to physically maneuver a probe relative to the specimen point by point in order to acquire data and generate images. Since the ultrasonic C-scan and eddy current imaging systems are based on the same mechanical scanning mechanisms, the two systems can be combined using the same PC platform with a common mechanical manipulation subsystem and integrated data acquisition software. Based on this concept, we have developed an IBM PC-based combined ultrasonic C-scan and eddy current imaging system. The system is modularized and provides capacity for future hardware and software expansions. Advantages associated with the combined system are: (1) eliminated duplication of the computer and mechanical hardware, (2) unified data acquisition, processing and storage software, (3) reduced setup time for repetitious ultrasonic and eddy current scans, and (4) improved system efficiency. The concept can be adapted to many engineering systems by integrating related PC-based instruments into one multipurpose workstation such as dispensing, machining, packaging, sorting, and other industrial applications.

  17. SignalPlant: an open signal processing software platform.

    PubMed

    Plesinger, F; Jurco, J; Halamek, J; Jurak, P

    2016-07-01

    The growing technical standard of acquisition systems allows the acquisition of large records, often reaching gigabytes or more in size, as is the case with whole-day electroencephalograph (EEG) recordings, for example. Although current 64-bit software for signal processing is able to process (e.g. filter, analyze, etc) such data, visual inspection and labeling will probably suffer from rather long latency during the rendering of large portions of recorded signals. For this reason, we have developed SignalPlant - a stand-alone application for signal inspection, labeling and processing. The main motivation was to supply investigators with a tool allowing fast and interactive work with large multichannel records produced by EEG, electrocardiograph and similar devices. The rendering latency was compared with EEGLAB and proves significantly faster when displaying an image from a large number of samples (e.g. 163 times faster for 75 × 10^6 samples). The presented SignalPlant software is available free and does not depend on any other computation software. Furthermore, it can be extended with plugins by third parties, ensuring its adaptability to future research tasks and new data formats.

  18. A versatile nondestructive evaluation imaging workstation

    NASA Astrophysics Data System (ADS)

    Chern, E. James; Butler, David W.

    1994-02-01

    Ultrasonic C-scan and eddy current imaging systems are of the pointwise type evaluation systems that rely on a mechanical scanner to physically maneuver a probe relative to the specimen point by point in order to acquire data and generate images. Since the ultrasonic C-scan and eddy current imaging systems are based on the same mechanical scanning mechanisms, the two systems can be combined using the same PC platform with a common mechanical manipulation subsystem and integrated data acquisition software. Based on this concept, we have developed an IBM PC-based combined ultrasonic C-scan and eddy current imaging system. The system is modularized and provides capacity for future hardware and software expansions. Advantages associated with the combined system are: (1) eliminated duplication of the computer and mechanical hardware, (2) unified data acquisition, processing and storage software, (3) reduced setup time for repetitious ultrasonic and eddy current scans, and (4) improved system efficiency. The concept can be adapted to many engineering systems by integrating related PC-based instruments into one multipurpose workstation such as dispensing, machining, packaging, sorting, and other industrial applications.

  19. Design and Evaluation of a Scalable and Reconfigurable Multi-Platform System for Acoustic Imaging

    PubMed Central

    Izquierdo, Alberto; Villacorta, Juan José; del Val Puente, Lara; Suárez, Luis

    2016-01-01

    This paper proposes a scalable and multi-platform framework for signal acquisition and processing, which allows for the generation of acoustic images using planar arrays of MEMS (Micro-Electro-Mechanical Systems) microphones with low development and deployment costs. Acoustic characterization of MEMS sensors was performed, and the beam pattern of a module, based on an 8 × 8 planar array and of several clusters of modules, was obtained. A flexible framework, formed by an FPGA, an embedded processor, a computer desktop, and a graphic processing unit, was defined. The processing times of the algorithms used to obtain the acoustic images, including signal processing and wideband beamforming via FFT, were evaluated in each subsystem of the framework. Based on this analysis, three frameworks are proposed, defined by the specific subsystems used and the algorithms shared. Finally, a set of acoustic images obtained from sound reflected from a person are presented as a case study in the field of biometric identification. These results reveal the feasibility of the proposed system. PMID:27727174
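
    For orientation, the core of such acoustic imaging is beamforming. The sketch below implements narrowband delay-and-sum for an 8 x 8 planar array with an assumed 2 cm microphone pitch, a simplification of the wideband FFT beamforming evaluated in the paper.

      # Narrowband delay-and-sum beamforming for an 8 x 8 planar array
      # (simplified; the paper evaluates wideband beamforming via FFT).
      import numpy as np

      c, f = 343.0, 8000.0                      # sound speed (m/s), tone (Hz)
      k = 2 * np.pi * f / c
      pitch = 0.02                              # hypothetical 2 cm mic spacing
      xy = np.stack(np.meshgrid(np.arange(8), np.arange(8)), -1).reshape(-1, 2) * pitch

      def steering(theta, phi):
          # Unit direction vector projected onto the array plane.
          d = np.array([np.sin(theta) * np.cos(phi), np.sin(theta) * np.sin(phi)])
          return np.exp(1j * k * xy @ d)

      # One snapshot from a simulated source at 20 deg elevation, 45 deg azimuth.
      snap = steering(np.radians(20), np.radians(45)) * np.exp(1j * 0.3)

      # Scan beam power over elevation at fixed azimuth; peak marks the source.
      thetas = np.radians(np.linspace(0, 60, 61))
      power = [np.abs(steering(t, np.radians(45)).conj() @ snap) ** 2 for t in thetas]
      print("peak at", np.degrees(thetas[int(np.argmax(power))]), "deg")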

  20. Using e-Learning Platforms for Mastery Learning in Developmental Mathematics Courses

    ERIC Educational Resources Information Center

    Boggs, Stacey; Shore, Mark; Shore, JoAnna

    2004-01-01

    Many colleges and universities have adopted e-learning platforms to utilize computers as an instructional tool in developmental (i.e., beginning and intermediate algebra) mathematics courses. An e-learning platform is a computer program used to enhance course instruction via computers and the Internet. Allegany College of Maryland is currently…

  1. Topographical and Chemical Imaging of a Phase Separated Polymer Using a Combined Atomic Force Microscopy/Infrared Spectroscopy/Mass Spectrometry Platform

    DOE PAGES

    Tai, Tamin; Karácsony, Orsolya; Bocharova, Vera; ...

    2016-02-18

    This article demonstrates the use of a hybrid atomic force microscopy/infrared spectroscopy/mass spectrometry imaging platform for the acquisition and correlation of nanoscale sample surface topography and chemical images based on infrared spectroscopy and mass spectrometry.

  2. Boutiques: a flexible framework to integrate command-line applications in computing platforms.

    PubMed

    Glatard, Tristan; Kiar, Gregory; Aumentado-Armstrong, Tristan; Beck, Natacha; Bellec, Pierre; Bernard, Rémi; Bonnet, Axel; Brown, Shawn T; Camarasu-Pop, Sorina; Cervenansky, Frédéric; Das, Samir; Ferreira da Silva, Rafael; Flandin, Guillaume; Girard, Pascal; Gorgolewski, Krzysztof J; Guttmann, Charles R G; Hayot-Sasson, Valérie; Quirion, Pierre-Olivier; Rioux, Pierre; Rousseau, Marc-Étienne; Evans, Alan C

    2018-05-01

    We present Boutiques, a system to automatically publish, integrate, and execute command-line applications across computational platforms. Boutiques applications are installed through software containers described in a rich and flexible JSON language. A set of core tools facilitates the construction, validation, import, execution, and publishing of applications. Boutiques is currently supported by several distinct virtual research platforms, and it has been used to describe dozens of applications in the neuroinformatics domain. We expect Boutiques to improve the quality of application integration in computational platforms, to reduce redundancy of effort, to contribute to computational reproducibility, and to foster Open Science.
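
    To give a feel for the JSON language mentioned above, the sketch below emits a Boutiques-style descriptor for a hypothetical tool; the field names follow the commonly documented Boutiques schema, but the example is schematic and has not been validated against it.

      # Schematic Boutiques-style descriptor written as JSON (illustrative,
      # not a validated descriptor; the tool itself is hypothetical).
      import json

      descriptor = {
          "name": "example-tool",
          "description": "Thresholds an image (hypothetical tool).",
          "tool-version": "1.0.0",
          "schema-version": "0.5",
          "command-line": "example_tool [INPUT_FILE] [THRESHOLD] [OUTPUT_FILE]",
          "container-image": {"type": "docker", "image": "example/tool:1.0"},
          "inputs": [
              {"id": "input_file", "name": "Input file", "type": "File",
               "value-key": "[INPUT_FILE]"},
              {"id": "threshold", "name": "Threshold", "type": "Number",
               "value-key": "[THRESHOLD]"},
          ],
          "output-files": [
              {"id": "output_file", "name": "Output file",
               "path-template": "[OUTPUT_FILE]"},
          ],
      }
      print(json.dumps(descriptor, indent=2))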

  3. Perfect count: a novel approach for the single platform enumeration of absolute CD4+ T-lymphocytes.

    PubMed

    Storie, Ian; Sawle, Alex; Goodfellow, Karen; Whitby, Liam; Granger, Vivian; Ward, Rosalie Y; Peel, Janet; Smart, Theresa; Reilly, John T; Barnett, David

    2004-01-01

    The derivation of reliable CD4(+) T lymphocyte counts is vital for the monitoring of disease progression and therapeutic effectiveness in HIV(+) individuals. Flow cytometry has emerged as the method of choice for CD4(+) T lymphocyte enumeration, with single-platform technology, coupled with reference counting beads, fast becoming the "gold standard." However, although single-platform, bead-based sample acquisition requires the ratio of beads to cells to remain unchanged, until recently there was no available method to monitor this. Perfect Count beads have been developed to address this issue; they incorporate two bead populations, with different densities, to allow the detection of inadequate mixing. Comparison of the relative proportions of both beads with the manufacturer's defined limits enables an internal QC check during sample acquisition. In this study, we have compared CD4(+) T lymphocyte counts, obtained from 104 HIV(+) patients, using TruCount beads with MultiSet software (defined as the predicate method) and the new Perfect Count beads, incorporating an in-house sequential gating strategy. We have demonstrated an excellent degree of correlation between the predicate method and the Perfect Count system (r(2) = 0.9955; Bland-Altman bias +27 CD4(+) T lymphocytes/microl). The Perfect Count system is a robust method for performing single-platform absolute counts and has the added advantage of having internal QC checks. Such an approach enables the operator to identify potential problems during sample preparation, acquisition and analysis. Copyright 2003 Wiley-Liss, Inc.
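
    The arithmetic behind single-platform counting, plus the dual-population mixing check, fits in a few lines; the event counts, bead concentration, and QC tolerance below are illustrative, not the kit's specified values.

      # Single-platform absolute count: cells/uL =
      #   (cell events / bead events) x (bead concentration in beads/uL).
      def absolute_count(cell_events, bead_events, beads_per_ul):
          return cell_events / bead_events * beads_per_ul

      def mixing_ok(low_density_events, high_density_events,
                    expected_ratio=1.0, tolerance=0.15):
          # Perfect Count-style QC: the two bead populations should keep a
          # fixed ratio; drift indicates inadequate mixing or settling.
          ratio = low_density_events / high_density_events
          return abs(ratio - expected_ratio) / expected_ratio <= tolerance

      cd4 = absolute_count(cell_events=2500, bead_events=5000, beads_per_ul=1000)
      print(cd4, "CD4+ T cells/uL")          # -> 500.0
      print("mixing OK:", mixing_ok(4900, 5100))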

  4. A high performance scientific cloud computing environment for materials simulations

    NASA Astrophysics Data System (ADS)

    Jorissen, K.; Vila, F. D.; Rehr, J. J.

    2012-09-01

    We describe the development of a scientific cloud computing (SCC) platform that offers high performance computation capability. The platform consists of a scientific virtual machine prototype containing a UNIX operating system and several materials science codes, together with essential interface tools (an SCC toolset) that offers functionality comparable to local compute clusters. In particular, our SCC toolset provides automatic creation of virtual clusters for parallel computing, including tools for execution and monitoring performance, as well as efficient I/O utilities that enable seamless connections to and from the cloud. Our SCC platform is optimized for the Amazon Elastic Compute Cloud (EC2). We present benchmarks for prototypical scientific applications and demonstrate performance comparable to local compute clusters. To facilitate code execution and provide user-friendly access, we have also integrated cloud computing capability in a JAVA-based GUI. Our SCC platform may be an alternative to traditional HPC resources for materials science or quantum chemistry applications.
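
    As a hedged illustration of the cluster-creation step, the boto3 sketch below launches a handful of EC2 instances; the AMI, instance type, and key pair are placeholders, and the paper's SCC toolset additionally handles MPI setup, monitoring, and I/O plumbing.

      # Minimal sketch of programmatically creating a small "virtual
      # cluster" on EC2 with boto3 (placeholder AMI, type, and key pair).
      import boto3

      ec2 = boto3.resource("ec2", region_name="us-east-1")
      nodes = ec2.create_instances(
          ImageId="ami-0123456789abcdef0",   # placeholder scientific VM image
          InstanceType="c5.xlarge",
          MinCount=4, MaxCount=4,            # four compute nodes
          KeyName="my-keypair",              # placeholder key pair
      )
      for node in nodes:
          node.wait_until_running()
      print("cluster nodes:", [n.id for n in nodes])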

  5. Dynamic high-speed acquisition system design of transmission error with USB based on LabVIEW and FPGA

    NASA Astrophysics Data System (ADS)

    Zheng, Yong; Chen, Yan

    2013-10-01

    Realizing a dynamic acquisition system for real-time detection of transmission chain error is very important for improving the machining accuracy of machine tools. In this paper, a USB controller and an FPGA are used for the hardware platform design, combined with LabVIEW to design the user applications; NI-VISA is used to develop the USB drivers, ultimately achieving the dynamic acquisition system design for transmission error.

  6. The application of a computer data acquisition system to a new high temperature tribometer

    NASA Technical Reports Server (NTRS)

    Bonham, Charles D.; Dellacorte, Christopher

    1991-01-01

    Two data acquisition computer programs, developed for a high temperature friction and wear test apparatus (a tribometer), are described. The raw data produced by the tribometer and the methods used to sample those data are explained. In addition, the instrumentation and computer hardware and software are presented. Also shown is how computer data acquisition was applied to increase convenience and productivity on a high temperature tribometer.

  7. The application of a computer data acquisition system for a new high temperature tribometer

    NASA Technical Reports Server (NTRS)

    Bonham, Charles D.; Dellacorte, Christopher

    1990-01-01

    Two data acquisition computer programs, developed for a high temperature friction and wear test apparatus (a tribometer), are described. The raw data produced by the tribometer and the methods used to sample those data are explained. In addition, the instrumentation and computer hardware and software are presented. Also shown is how computer data acquisition was applied to increase convenience and productivity on a high temperature tribometer.

  8. Flexible Description Language for HPC based Processing of Remote Sense Data

    NASA Astrophysics Data System (ADS)

    Nandra, Constantin; Gorgan, Dorian; Bacu, Victor

    2016-04-01

    When talking about Big Data, the most challenging aspect lies in processing them in order to gain new insight, find new patterns and extract knowledge from them. This problem is likely most apparent in the case of Earth Observation (EO) data. With ever higher numbers of data sources and increasing data acquisition rates, dealing with EO data is indeed a challenge [1]. Geoscientists should address this challenge by using flexible and efficient tools and platforms. To answer this trend, the BigEarth project [2] aims to combine the advantages of high performance computing solutions with flexible processing description methodologies in order to reduce both task execution times and task definition time and effort. As a component of the BigEarth platform, WorDeL (Workflow Description Language) [3] is intended to offer a flexible, compact and modular approach to the task definition process. WorDeL, unlike other description alternatives such as Python or shell scripts, is oriented towards the description of topologies, using them as abstractions for the processing programs. This feature is intended to make it an attractive alternative for users lacking programming experience. By promoting modular designs, WorDeL not only makes the processing descriptions more user-readable and intuitive, but also helps organize the processing tasks into independent sub-tasks, which can be executed in parallel on multi-processor platforms in order to improve execution times. As a BigEarth platform [4] component, WorDeL represents the means by which the user interacts with the system, describing processing algorithms in terms of existing operators and workflows [5], which are ultimately translated into sets of executable commands. The WorDeL language has been designed to help in the definition of compute-intensive, batch tasks which can be distributed and executed on high-performance, cloud or grid-based architectures in order to improve the processing time. Main references for further information: [1] Gorgan, D., "Flexible and Adaptive Processing of Earth Observation Data over High Performance Computation Architectures", International Conference and Exhibition Satellite 2015, August 17-19, Houston, Texas, USA. [2] Bigearth project - flexible processing of big earth data over high performance computing architectures. http://cgis.utcluj.ro/bigearth, (2014) [3] Nandra, C., Gorgan, D., "Workflow Description Language for Defining Big Earth Data Processing Tasks", Proceedings of the Intelligent Computer Communication and Processing (ICCP), IEEE-Press, pp. 461-468, (2015). [4] Bacu, V., Stefan, T., Gorgan, D., "Adaptive Processing of Earth Observation Data on Cloud Infrastructures Based on Workflow Description", Proceedings of the Intelligent Computer Communication and Processing (ICCP), IEEE-Press, pp. 444-454, (2015). [5] Mihon, D., Bacu, V., Colceriu, V., Gorgan, D., "Modeling of Earth Observation Use Cases through the KEOPS System", Proceedings of the Intelligent Computer Communication and Processing (ICCP), IEEE-Press, pp. 455-460, (2015).

  9. EUROPattern Suite technology for computer-aided immunofluorescence microscopy in autoantibody diagnostics.

    PubMed

    Krause, C; Ens, K; Fechner, K; Voigt, J; Fraune, J; Rohwäder, E; Hahn, M; Danckwardt, M; Feirer, C; Barth, E; Martinetz, T; Stöcker, W

    2015-04-01

    Antinuclear autoantibodies (ANA) are highly informative biomarkers in autoimmune diagnostics. The increasing demand for effective test systems, however, has led to the development of a confusingly large variety of different platforms. One of them, the indirect immunofluorescence (IIF), is regarded as the common gold standard for ANA screening, as described in a position statement by the American College of Rheumatology in 2009. Technological solutions have been developed aimed at standardization and automation of IIF to overcome methodological limitations and subjective bias in IIF interpretation. In this review, we present the EUROPattern Suite, a system for computer-aided immunofluorescence microscopy (CAIFM) including automated acquisition of digital images and evaluation of IIF results. The system was originally designed for ANA diagnostics on human epithelial cells, but its applications have been extended with the latest system update version 1.5 to the analysis of antineutrophil cytoplasmic antibodies (ANCA) and anti-dsDNA antibodies. © The Author(s) 2015 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.

  10. Single-exposure quantitative phase imaging in color-coded LED microscopy.

    PubMed

    Lee, Wonchan; Jung, Daeseong; Ryu, Suho; Joo, Chulmin

    2017-04-03

    We demonstrate single-shot quantitative phase imaging (QPI) in a platform of color-coded LED microscopy (cLEDscope). The light source in a conventional microscope is replaced by a circular LED pattern that is trisected into subregions with equal area, assigned to red, green, and blue colors. Image acquisition with a color image sensor and subsequent computation based on weak object transfer functions allow for the QPI of a transparent specimen. We also provide a correction method for color-leakage, which may be encountered in implementing our method with consumer-grade LEDs and image sensors. Most commercially available LEDs and image sensors do not provide spectrally isolated emissions and pixel responses, generating significant error in phase estimation in our method. We describe the correction scheme for this color-leakage issue, and demonstrate improved phase measurement accuracy. The computational model and single-exposure QPI capability of our method are presented by showing images of calibrated phase samples and cellular specimens.
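
    The color-leakage correction can be viewed as per-pixel spectral unmixing: a calibrated 3 x 3 crosstalk matrix maps LED-channel signals to RGB pixel responses and is inverted to recover clean channels. The matrix values below are hypothetical; in practice they would be calibrated by lighting one LED subregion at a time.

      # Color-leakage correction as linear unmixing: the measured RGB pixel
      # m mixes the three LED channels s through a crosstalk matrix C,
      # m = C s, so s = C^-1 m. Matrix values are hypothetical.
      import numpy as np

      C = np.array([[0.90, 0.08, 0.02],    # red pixel response to R,G,B LEDs
                    [0.10, 0.85, 0.07],    # green pixel response
                    [0.03, 0.12, 0.88]])   # blue pixel response

      C_inv = np.linalg.inv(C)

      measured = np.array([0.42, 0.51, 0.33])   # one RGB pixel
      corrected = C_inv @ measured               # leakage-free channel signals
      print(corrected)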

  11. Airborne optical tracking control system design study

    NASA Astrophysics Data System (ADS)

    1992-09-01

    The Kestrel LOS Tracking Program involves the development of a computer and algorithms for use in passive tracking of airborne targets from a high altitude balloon platform. The computer receives track-error signals from a video tracker connected to one of the imaging sensors. In addition, an on-board IRU (gyro), accelerometers, a magnetometer, and a two-axis inclinometer provide inputs which are used for initial acquisition and for coarse and fine tracking. Signals received by the control processor from the video tracker, IRU, accelerometers, magnetometer, and inclinometer are used to generate drive signals for the payload azimuth drive, the Gimballed Mirror System (GMS), and the Fast Steering Mirror (FSM). The hardware which will be procured under the LOS tracking activity is the Controls Processor (CP), the IRU, and the FSM. The performance specifications for the GMS and the payload canister azimuth drive are established by the LOS tracking design team in an effort to achieve a tracking jitter of less than 3 micro-rad, 1 sigma, for one axis.

  12. Specifications and implementation of the RT MHD control system for the EC launcher of FTU

    NASA Astrophysics Data System (ADS)

    Galperti, C.; Alessi, E.; Boncagni, L.; Bruschi, A.; Granucci, G.; Grosso, A.; Iannone, F.; Marchetto, C.; Nowak, S.; Panella, M.; Sozzi, C.; Tilia, B.

    2012-09-01

    To perform real-time plasma control experiments with EC heating waves using the new fast launcher installed on FTU, a dedicated data acquisition and elaboration system has recently been designed. A prototype version of the acquisition/control system has been developed and will be tested on the FTU machine in its next experimental campaign. The open-source framework MARTe (Multi-threaded Application Real-Time executor) on the Linux/RTAI real-time operating system has been chosen as the software platform for the control system. Standard open-architecture industrial PCs, based on either the VME bus or the CompactPCI bus and equipped with standard input/output cards, are the chosen hardware platform.

  13. 48 CFR 52.253-1 - Computer Generated Forms.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 2 2012-10-01 2012-10-01 false Computer Generated Forms....253-1 Computer Generated Forms. As prescribed in FAR 53.111, insert the following clause: Computer... by the Federal Acquisition Regulation (FAR) may be submitted on a computer generated version of the...

  14. 48 CFR 52.253-1 - Computer Generated Forms.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 2 2013-10-01 2013-10-01 false Computer Generated Forms....253-1 Computer Generated Forms. As prescribed in FAR 53.111, insert the following clause: Computer... by the Federal Acquisition Regulation (FAR) may be submitted on a computer generated version of the...

  15. 48 CFR 52.253-1 - Computer Generated Forms.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 2 2014-10-01 2014-10-01 false Computer Generated Forms....253-1 Computer Generated Forms. As prescribed in FAR 53.111, insert the following clause: Computer... by the Federal Acquisition Regulation (FAR) may be submitted on a computer generated version of the...

  16. 48 CFR 52.253-1 - Computer Generated Forms.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 2 2011-10-01 2011-10-01 false Computer Generated Forms....253-1 Computer Generated Forms. As prescribed in FAR 53.111, insert the following clause: Computer... by the Federal Acquisition Regulation (FAR) may be submitted on a computer generated version of the...

  17. 48 CFR 52.253-1 - Computer Generated Forms.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 2 2010-10-01 2010-10-01 false Computer Generated Forms....253-1 Computer Generated Forms. As prescribed in FAR 53.111, insert the following clause: Computer... by the Federal Acquisition Regulation (FAR) may be submitted on a computer generated version of the...

  18. Vectorized data acquisition and fast triple-correlation integrals for Fluorescence Triple Correlation Spectroscopy

    NASA Astrophysics Data System (ADS)

    Ridgeway, William K.; Millar, David P.; Williamson, James R.

    2013-04-01

    Fluorescence Correlation Spectroscopy (FCS) is widely used to quantify reaction rates and concentrations of molecules in vitro and in vivo. We recently reported Fluorescence Triple Correlation Spectroscopy (F3CS), which correlates three signals together instead of two. F3CS can analyze the stoichiometries of complex mixtures and detect irreversible processes by identifying time-reversal asymmetries. Here we report the computational developments that were required for the realization of F3CS and present the results as the Triple Correlation Toolbox suite of programs. Triple Correlation Toolbox is a complete data analysis pipeline capable of acquiring, correlating and fitting large data sets. Each segment of the pipeline handles error estimates for accurate error-weighted global fitting. Data acquisition was accelerated with a combination of off-the-shelf counter-timer chips and vectorized operations on 128-bit registers. This allows desktop computers with inexpensive data acquisition cards to acquire hours of multiple-channel data with sub-microsecond time resolution. Off-line correlation integrals were implemented as a two delay time multiple-tau scheme that scales efficiently with multiple processors and provides an unprecedented view of linked dynamics. Global fitting routines are provided to fit FCS and F3CS data to models containing up to ten species. Triple Correlation Toolbox is a complete package that enables F3CS to be performed on existing microscopes. Catalogue identifier: AEOP_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEOP_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 50189 No. of bytes in distributed program, including test data, etc.: 6135283 Distribution format: tar.gz Programming language: C/Assembly. Computer: Any with GCC and library support. Operating system: Linux and OS X (data acq. for Linux only due to library availability), not tested on Windows. RAM: ≥512 MB. Classification: 16.4. External routines: NIDAQmx (National Instruments), Gnu Scientific Library, GTK+, PLplot (optional) Nature of problem: Fluorescence Triple Correlation Spectroscopy required three things: data acquisition at faster speeds than were possible without expensive custom hardware, triple-correlation routines that could process 1/2 TB data sets rapidly, and fitting routines capable of handling several to a hundred fit parameters and 14,000+ data points, each with error estimates. Solution method: A novel data acquisition concept mixed signal processing with off-the-shelf hardware and data-parallel processing using 128-bit registers found in desktop CPUs. Correlation algorithms used fractal data structures and multithreading to reduce data analysis times. Global fitting was implemented with robust minimization routines and provides feedback that allows the user to critically inspect initial guesses and fits. Restrictions: Data acquisition only requires a National Instruments data acquisition card (it was tested on Linux using card PCIe-6251) and a simple home-built circuit. Unusual features: Hand-coded x86-64 assembly for data acquisition loops (platform-independent C code also provided). Additional comments: A complete collection of tools to perform Fluorescence Triple Correlation Spectroscopy - from data acquisition to two-tau correlation of large data sets, to model fitting.
Running time: 1-5 h of data analysis per hour of data collected. Varies depending on data-acquisition length, time resolution, data density and number of cores used for correlation integrals.
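
    The package's correlators themselves are hand-tuned C and assembly, but the underlying multiple-tau idea is compact enough to sketch. Below is a minimal single-channel autocorrelation estimator in Python/NumPy, assuming a quasi-logarithmic lag grid with rebinning by two between levels; the toolbox's two-delay-time triple correlator is considerably more involved, so treat this only as an illustration of the lag scheme.

        import numpy as np

        def multi_tau_autocorr(counts, m=16, levels=8):
            # Quasi-logarithmic lag grid: m linearly spaced lags per level,
            # with the trace rebinned by 2x between levels. Illustrative
            # single-channel estimator, not the toolbox's implementation.
            x = np.asarray(counts, dtype=np.float64)
            lags, g = [], []
            dt = 1
            for level in range(levels):
                n = len(x)
                if n < 2 * m:
                    break
                norm = x.mean() ** 2
                start = 1 if level == 0 else m // 2
                for k in range(start, m):
                    lags.append(k * dt)
                    # g(tau) = <I(t) I(t+tau)> / <I>^2
                    g.append((x[: n - k] * x[k:]).mean() / norm)
                x = x[: (n // 2) * 2].reshape(-1, 2).mean(axis=1)  # rebin by 2
                dt *= 2
            return np.array(lags), np.array(g)

        # e.g.: lags, g = multi_tau_autocorr(np.random.poisson(5, 1_000_000))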

  19. Designing a Virtual Social Space for Language Acquisition

    ERIC Educational Resources Information Center

    Woolson, Maria Alessandra

    2012-01-01

    Middleverse de Español (MdE) is an evolving platform for foreign language (FL) study, aligned with the goals of ACTFL's National Standards and the 2007 MLA report. The project simulates an immersive environment in a virtual 3-D space for the acquisition of translingual and transcultural competence in Spanish, meant to support content-based and…

  20. Boutiques: a flexible framework to integrate command-line applications in computing platforms

    PubMed Central

    Glatard, Tristan; Kiar, Gregory; Aumentado-Armstrong, Tristan; Beck, Natacha; Bellec, Pierre; Bernard, Rémi; Bonnet, Axel; Brown, Shawn T; Camarasu-Pop, Sorina; Cervenansky, Frédéric; Das, Samir; Ferreira da Silva, Rafael; Flandin, Guillaume; Girard, Pascal; Gorgolewski, Krzysztof J; Guttmann, Charles R G; Hayot-Sasson, Valérie; Quirion, Pierre-Olivier; Rioux, Pierre; Rousseau, Marc-Étienne; Evans, Alan C

    2018-01-01

    We present Boutiques, a system to automatically publish, integrate, and execute command-line applications across computational platforms. Boutiques applications are installed through software containers described in a rich and flexible JSON language. A set of core tools facilitates the construction, validation, import, execution, and publishing of applications. Boutiques is currently supported by several distinct virtual research platforms, and it has been used to describe dozens of applications in the neuroinformatics domain. We expect Boutiques to improve the quality of application integration in computational platforms, to reduce redundancy of effort, to contribute to computational reproducibility, and to foster Open Science. PMID:29718199
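
    To make the descriptor idea concrete, the sketch below assembles a minimal JSON descriptor for a hypothetical brain_extract container; the tool name, image, and inputs are invented, and the field names follow the published Boutiques JSON schema as commonly documented, so verify them against the current schema before relying on them.

        import json

        # Hypothetical descriptor for an imaginary "brain_extract" tool.
        descriptor = {
            "name": "brain_extract",
            "description": "Toy example: skull-strip a T1-weighted image.",
            "tool-version": "1.0.0",
            "schema-version": "0.5",
            "command-line": "brain_extract [INPUT] [OUTPUT] [FRACTION]",
            "container-image": {"type": "docker",
                                "image": "example/brain_extract:1.0"},
            "inputs": [
                {"id": "infile", "name": "Input image", "type": "File",
                 "value-key": "[INPUT]", "optional": False},
                {"id": "outname", "name": "Output file name", "type": "String",
                 "value-key": "[OUTPUT]", "optional": False},
                {"id": "fraction", "name": "Intensity threshold fraction",
                 "type": "Number", "value-key": "[FRACTION]", "optional": True},
            ],
            "output-files": [
                {"id": "stripped", "name": "Skull-stripped image",
                 "path-template": "[OUTPUT]"},
            ],
        }

        with open("brain_extract.json", "w") as f:
            json.dump(descriptor, f, indent=2)

    A descriptor of this shape is what the core tools would validate and publish; at execution time the platform resolves the value keys against user-supplied inputs to build the command line.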

  1. Impact of MOODLE Platform on the Pedagogy of Students and Staff: Cross-Curricular Comparison

    ERIC Educational Resources Information Center

    Jackson, Emerson Abraham

    2017-01-01

    In the current Information Age, technology is taking the lead in moving teaching and learning beyond what was once viewed as a typically didactic approach to knowledge acquisition. The outcomes of this research paper explore the importance of the MOODLE learning platform and its contribution to promoting flexible teaching, and the need for…

  2. A Lightweight Remote Parallel Visualization Platform for Interactive Massive Time-varying Climate Data Analysis

    NASA Astrophysics Data System (ADS)

    Li, J.; Zhang, T.; Huang, Q.; Liu, Q.

    2014-12-01

    Today's climate datasets are characterized by large volume, a high degree of spatiotemporal complexity, and rapid evolution over time. Because visualizing large, distributed climate datasets is computationally intensive, traditional desktop-based visualization applications cannot handle the load. Recently, scientists have developed remote visualization techniques to address this computational issue. Remote visualization techniques usually leverage server-side parallel computing capabilities to perform visualization tasks and deliver the results to clients over the network. In this research, we aim to build a remote parallel visualization platform for visualizing and analyzing massive climate data. Our visualization platform is built on ParaView, one of the most popular open-source remote visualization and analysis applications. To further enhance the scalability and stability of the platform, we employed cloud computing techniques to support its deployment. In this platform, all climate datasets are regular grid data stored in NetCDF format. Three types of data access are supported: remote datasets served by OpenDAP servers, datasets hosted on the web visualization server, and local datasets. Regardless of the access method, all visualization tasks are completed at the server side to reduce the workload on clients. As a proof of concept, we have implemented a set of scientific visualization methods to show the feasibility of the platform. Preliminary results indicate that the framework can address the computational limitations of desktop-based visualization applications.
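
    For the OpenDAP access path, the client-side request can be as simple as opening a remote URL with the netCDF4 library; the endpoint and variable name below are hypothetical stand-ins rather than the platform's actual servers.

        from netCDF4 import Dataset  # pip install netCDF4

        # Hypothetical THREDDS/OPeNDAP endpoint serving a regular-grid dataset.
        url = "http://example.org/thredds/dodsC/climate/tas_monthly.nc"

        ds = Dataset(url)              # opens lazily over the network
        tas = ds.variables["tas"]      # e.g. surface air temperature
        # Slicing downloads only the requested slab, which is what lets
        # the heavy lifting stay on the server side.
        slab = tas[0, :, :]            # first time step, full grid
        print(slab.shape, float(slab.mean()))
        ds.close()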

  3. 48 CFR 212.212 - Computer software.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 3 2013-10-01 2013-10-01 false Computer software. 212.212... Acquisition of Commercial Items 212.212 Computer software. (1) Departments and agencies shall identify and... technology development), opportunities for the use of commercial computer software and other non...

  4. 48 CFR 212.212 - Computer software.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 3 2011-10-01 2011-10-01 false Computer software. 212.212... Acquisition of Commercial Items 212.212 Computer software. (1) Departments and agencies shall identify and... technology development), opportunities for the use of commercial computer software and other non...

  5. 48 CFR 212.212 - Computer software.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 3 2012-10-01 2012-10-01 false Computer software. 212.212... Acquisition of Commercial Items 212.212 Computer software. (1) Departments and agencies shall identify and... technology development), opportunities for the use of commercial computer software and other non...

  6. 48 CFR 212.212 - Computer software.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Computer software. 212.212... Acquisition of Commercial Items 212.212 Computer software. (1) Departments and agencies shall identify and... technology development), opportunities for the use of commercial computer software and other non...

  7. 48 CFR 212.212 - Computer software.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 3 2014-10-01 2014-10-01 false Computer software. 212.212... Acquisition of Commercial Items 212.212 Computer software. (1) Departments and agencies shall identify and... technology development), opportunities for the use of commercial computer software and other non...

  8. NMRbox: A Resource for Biomolecular NMR Computation.

    PubMed

    Maciejewski, Mark W; Schuyler, Adam D; Gryk, Michael R; Moraru, Ion I; Romero, Pedro R; Ulrich, Eldon L; Eghbalnia, Hamid R; Livny, Miron; Delaglio, Frank; Hoch, Jeffrey C

    2017-04-25

    Advances in computation have been enabling many recent advances in biomolecular applications of NMR. Due to the wide diversity of applications of NMR, the number and variety of software packages for processing and analyzing NMR data are quite large, with labs relying on dozens, if not hundreds, of software packages. Discovering, acquiring, installing, and maintaining all these packages is a burdensome task. Because the majority of software packages originate in academic labs, persistence of the software is compromised when developers graduate, funding ceases, or investigators turn to other projects. To simplify access to and use of biomolecular NMR software, foster persistence, and enhance reproducibility of computational workflows, we have developed NMRbox, a shared resource for NMR software and computation. NMRbox employs virtualization to provide a comprehensive software environment preconfigured with hundreds of software packages, available as a downloadable virtual machine or as a Platform-as-a-Service supported by a dedicated compute cloud. Ongoing development includes a metadata harvester to regularize, annotate, and preserve workflows and facilitate and enhance data depositions to BioMagResBank, and tools for Bayesian inference to enhance the robustness and extensibility of computational analyses. In addition to facilitating use and preservation of the rich and dynamic software environment for biomolecular NMR, NMRbox fosters the development and deployment of a new class of metasoftware packages. NMRbox is freely available to not-for-profit users. Copyright © 2017 Biophysical Society. All rights reserved.

  9. Integrated Microsensors for Autonomous Microrobots

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    ADKINS, DOUGLAS R.; BYRNE, RAYMOND H.; HELLER, EDWIN J.

    2003-02-01

    This report describes the development of a miniature mobile microrobot and several microsystems needed to create a miniature microsensor delivery platform. This work was funded under LDRD No. 10785, entitled ''Integrated Microsensors for Autonomous Microrobots''. The approach adopted in this project was to develop a mobile platform to which wireless RF remote control, data acquisition, and various microsensors would be attached. A modular approach was used to produce a versatile microrobot platform and to reduce power consumption and physical size.

  10. 48 CFR 212.7003 - Technical data and computer software.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... computer software. 212.7003 Section 212.7003 Federal Acquisition Regulations System DEFENSE ACQUISITION... data and computer software. For purposes of establishing delivery requirements and license rights for technical data under 227.7102 and for computer software under 227.7202, there shall be a rebuttable...

  11. 48 CFR 212.7003 - Technical data and computer software.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... computer software. 212.7003 Section 212.7003 Federal Acquisition Regulations System DEFENSE ACQUISITION... data and computer software. For purposes of establishing delivery requirements and license rights for technical data under 227.7102 and for computer software under 227.7202, there shall be a rebuttable...

  12. 48 CFR 212.7003 - Technical data and computer software.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... computer software. 212.7003 Section 212.7003 Federal Acquisition Regulations System DEFENSE ACQUISITION... data and computer software. For purposes of establishing delivery requirements and license rights for technical data under 227.7102 and for computer software under 227.7202, there shall be a rebuttable...

  13. 48 CFR 212.7003 - Technical data and computer software.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... computer software. 212.7003 Section 212.7003 Federal Acquisition Regulations System DEFENSE ACQUISITION... data and computer software. For purposes of establishing delivery requirements and license rights for technical data under 227.7102 and for computer software under 227.7202, there shall be a rebuttable...

  14. 48 CFR 212.7003 - Technical data and computer software.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... computer software. 212.7003 Section 212.7003 Federal Acquisition Regulations System DEFENSE ACQUISITION... data and computer software. For purposes of establishing delivery requirements and license rights for technical data under 227.7102 and for computer software under 227.7202, there shall be a rebuttable...

  15. Low Computational Signal Acquisition for GNSS Receivers Using a Resampling Strategy and Variable Circular Correlation Time

    PubMed Central

    Zhang, Yeqing; Wang, Meiling; Li, Yafeng

    2018-01-01

    To substantially decrease the computational complexity and time consumption of signal acquisition, this paper explores a resampling strategy and a variable circular correlation time strategy specific to broadband multi-frequency GNSS receivers. In broadband GNSS receivers, the resampling strategy works with conventional acquisition algorithms by resampling the main lobe of received broadband signals at a much lower frequency. Variable circular correlation time is designed to adapt to different signal strength conditions and thereby increase the operational flexibility of GNSS signal acquisition. The acquisition threshold is defined as the ratio of the highest and second-highest correlation results in the search space of carrier frequency and code phase. Moreover, the computational complexity of signal acquisition is formulated in terms of the number of multiplication and summation operations in the acquisition process. Comparative experiments and performance analysis are conducted on four sets of real GPS L2C signals with different sampling frequencies. The results indicate that the resampling strategy can decrease computation and time cost by nearly 90–94% with only a slight loss of acquisition sensitivity. With circular correlation time varying from 10 ms to 20 ms, the time cost of signal acquisition increases by about 2.7–5.6% per millisecond, with most satellites acquired successfully. PMID:29495301
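
    The circular correlation at the heart of such acquisition schemes is usually computed in the frequency domain. The sketch below is a generic FFT-based parallel code-phase search in Python/NumPy, not the authors' implementation; it returns the peak-to-second-peak ratio that the paper uses as its acquisition threshold.

        import numpy as np

        def acquire(signal, code, fs, dopplers):
            # One FFT-based circular correlation per Doppler bin; returns
            # (peak ratio, Doppler estimate, code-phase estimate).
            n = len(code)
            t = np.arange(n) / fs
            code_fft = np.conj(np.fft.fft(code))
            grid = np.empty((len(dopplers), n))
            for i, fd in enumerate(dopplers):
                wiped = signal[:n] * np.exp(-2j * np.pi * fd * t)  # carrier wipe-off
                grid[i] = np.abs(np.fft.ifft(np.fft.fft(wiped) * code_fft)) ** 2
            row, col = np.unravel_index(grid.argmax(), grid.shape)
            peak = grid[row, col]
            masked = grid.copy()
            masked[row, max(col - 2, 0):col + 3] = 0.0  # blank main-peak region
            return peak / masked.max(), dopplers[row], col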

  16. Low Computational Signal Acquisition for GNSS Receivers Using a Resampling Strategy and Variable Circular Correlation Time.

    PubMed

    Zhang, Yeqing; Wang, Meiling; Li, Yafeng

    2018-02-24

    To substantially decrease the computational complexity and time consumption of signal acquisition, this paper explores a resampling strategy and a variable circular correlation time strategy specific to broadband multi-frequency GNSS receivers. In broadband GNSS receivers, the resampling strategy works with conventional acquisition algorithms by resampling the main lobe of received broadband signals at a much lower frequency. Variable circular correlation time is designed to adapt to different signal strength conditions and thereby increase the operational flexibility of GNSS signal acquisition. The acquisition threshold is defined as the ratio of the highest and second-highest correlation results in the search space of carrier frequency and code phase. Moreover, the computational complexity of signal acquisition is formulated in terms of the number of multiplication and summation operations in the acquisition process. Comparative experiments and performance analysis are conducted on four sets of real GPS L2C signals with different sampling frequencies. The results indicate that the resampling strategy can decrease computation and time cost by nearly 90-94% with only a slight loss of acquisition sensitivity. With circular correlation time varying from 10 ms to 20 ms, the time cost of signal acquisition increases by about 2.7-5.6% per millisecond, with most satellites acquired successfully.

  17. Design of a Single-Cell Positioning Controller Using Electroosmotic Flow and Image Processing

    PubMed Central

    Ay, Chyung; Young, Chao-Wang; Chen, Jhong-Yin

    2013-01-01

    The objective of this research was not only to provide a fast, automatic positioning platform for single cells, but also to improve biomolecular manipulation techniques. In this study, an automatic platform for cell positioning using electroosmotic flow and image processing technology was designed. The platform was developed using a PCI image acquisition interface card to capture images from a microscope and transfer them to a computer running human-machine interface software. This software was written in LabVIEW (Laboratory Virtual Instrument Engineering Workbench), a graphical language, to find cell positions and display the driving trace, with a fuzzy logic method controlling the voltage or duration of the electric field. In experiments on human leukemic cells (U-937), the cell positioning success rate reached 100% within 5 s when controlling the voltage factor. Greater precision was obtained when controlling the time factor, with the success rate reaching 100% within 28 s. Combining the voltage and time control methods yields both high speed and high precision: the combined method is about 5.18 times faster than the time method, and its control precision is more than five times greater than that of the voltage method. PMID:23698272
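
    The paper's fuzzy-logic controller is not published as code, so the following toy Python sketch only illustrates the general idea: a position error is mapped to a drive voltage through a small rule base with triangular membership functions. All set boundaries and output voltages here are invented for illustration.

        def fuzzy_voltage(error_um):
            # Tiny Mamdani-style rule base: |error| graded as near/mid/far,
            # output as a weighted average of singleton voltages.
            e = abs(error_um)
            near = max(0.0, 1 - e / 20)
            mid = max(0.0, min(e / 20, (60 - e) / 40)) if e < 60 else 0.0
            far = min(1.0, max(0.0, (e - 40) / 40))
            weights, volts = [near, mid, far], [0.5, 2.0, 5.0]
            mag = sum(w * v for w, v in zip(weights, volts)) / max(sum(weights), 1e-9)
            return mag if error_um >= 0 else -mag

        # e.g. a cell 35 um from the target gets a moderate drive voltage:
        print(round(fuzzy_voltage(35.0), 2))  # -> 2.0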

  18. Computational Evaluation of the Traceback Method

    ERIC Educational Resources Information Center

    Kol, Sheli; Nir, Bracha; Wintner, Shuly

    2014-01-01

    Several models of language acquisition have emerged in recent years that rely on computational algorithms for simulation and evaluation. Computational models are formal and precise, and can thus provide mathematically well-motivated insights into the process of language acquisition. Such models are amenable to robust computational evaluation,…

  19. Radar cross calibration investigation TAMU radar polarimeter calibration measurements

    NASA Technical Reports Server (NTRS)

    Blanchard, A. J.; Newton, R. W.; Bong, S.; Kronke, C.; Warren, G. L.; Carey, D.

    1982-01-01

    A short-pulse, 20 MHz bandwidth, three-frequency radar polarimeter system (RPS) operates at center frequencies of 10.003 GHz, 4.75 GHz, and 1.6 GHz and utilizes dual-polarized transmit and receive antennas for each frequency. The basic layout of the RPS differs from other truck-mounted systems in that it uses a pulse compression IF section common to all three RF heads. Separate transmit and receive antennas are used to improve the cross-polarization isolation at each frequency. The receiver is a digitally controlled, gain-modulated subsystem interfaced directly with a microprocessor-based computer for control and data manipulation. Antenna focusing distance, focusing of each antenna pair, RF head stability, and polarization characteristics of the RPS antennas are discussed. Platform and data acquisition procedures are described.

  20. Observing orbital debris using space-based telescopes. I - Mission orbit considerations

    NASA Technical Reports Server (NTRS)

    Reynolds, Robert C.; Talent, David L.; Vilas, Faith

    1989-01-01

    In this paper, mission orbit considerations are addressed for using the Space Shuttle as a telescope platform for observing man-made orbital debris. Computer modeling of various electrooptical systems predicts that such a space-borne system will be able to detect particles as small as 1-mm diameter. The research is meant to support the development of debris-collision warning sensors through the acquisition of spatial distribution and spectral characteristics for debris and testing of detector combinations on a shuttle-borne telescopic experiment. The technique can also be applied to low-earth-orbit-debris environment monitoring systems. It is shown how the choice of mission orbit, season of launch, and time of day of launch may be employed to provide extended periods of favorable observing conditions.

  1. A versatile optical microscope for time-dependent single-molecule and single-particle spectroscopy

    NASA Astrophysics Data System (ADS)

    Li, Hao; Yang, Haw

    2018-03-01

    This work reports the design and implementation of a multi-function optical microscope for time-dependent spectroscopy on single molecules and single nanoparticles. It integrates the now-routine single-object measurements into one standalone platform so that no reconfiguration is needed when switching between different types of sample or spectroscopy modes. The illumination modes include evanescent field through total internal reflection, dark-field illumination, and epi-excitation onto a diffraction-limited spot suitable for confocal detection. The detection modes include spectrally resolved line imaging, wide-field imaging with dual-color capability, and two-color single-element photon-counting detection. The switch between different spectroscopy and data acquisition modes is fully automated and executed through computer programming. The capability of this microscope is demonstrated through selected proof-of-principle experiments.

  2. Automated sample area definition for high-throughput microscopy.

    PubMed

    Zeder, M; Ellrott, A; Amann, R

    2011-04-01

    High-throughput screening platforms based on epifluorescence microscopy are powerful tools in a variety of scientific fields. Although some applications are based on imaging geometrically defined samples such as microtiter plates, multiwell slides, or spotted gene arrays, others need to cope with inhomogeneously located samples on glass slides. The analysis of microbial communities in aquatic systems by sample filtration on membrane filters followed by multiple fluorescent staining, or the investigation of tissue sections, are examples. Therefore, we developed a strategy for flexible and fast definition of sample locations by the acquisition of whole-slide overview images and automated sample recognition by image analysis. Our approach was tested on different microscopes and the computer programs are freely available (http://www.technobiology.ch). Copyright © 2011 International Society for Advancement of Cytometry.
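
    A minimal stand-in for the recognition step might look like the sketch below, which uses scikit-image to Otsu-threshold a grayscale overview image and keep large connected components as candidate sample areas; the freely available programs at technobiology.ch implement the authors' own, more elaborate strategy.

        from skimage import filters, measure  # pip install scikit-image

        def find_sample_regions(overview, min_area=5000):
            # overview: 2-D grayscale array of the whole-slide image.
            mask = overview > filters.threshold_otsu(overview)
            labels = measure.label(mask)
            # Bounding boxes of large components; mapping these to stage
            # coordinates would require a calibration step.
            return [r.bbox for r in measure.regionprops(labels)
                    if r.area >= min_area]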

  3. A versatile optical microscope for time-dependent single-molecule and single-particle spectroscopy.

    PubMed

    Li, Hao; Yang, Haw

    2018-03-28

    This work reports the design and implementation of a multi-function optical microscope for time-dependent spectroscopy on single molecules and single nanoparticles. It integrates the now-routine single-object measurements into one standalone platform so that no reconfiguration is needed when switching between different types of sample or spectroscopy modes. The illumination modes include evanescent field through total internal reflection, dark-field illumination, and epi-excitation onto a diffraction-limited spot suitable for confocal detection. The detection modes include spectrally resolved line imaging, wide-field imaging with dual-color capability, and two-color single-element photon-counting detection. The switch between different spectroscopy and data acquisition modes is fully automated and executed through computer programming. The capability of this microscope is demonstrated through selected proof-of-principle experiments.

  4. Formal design and verification of a reliable computing platform for real-time control. Phase 1: Results

    NASA Technical Reports Server (NTRS)

    Divito, Ben L.; Butler, Ricky W.; Caldwell, James L.

    1990-01-01

    A high-level design is presented for a reliable computing platform for real-time control applications. Design tradeoffs and analyses related to the development of the fault-tolerant computing platform are discussed. The architecture is formalized and shown to satisfy a key correctness property. The reliable computing platform uses replicated processors and majority voting to achieve fault tolerance. Under the assumption of a majority of processors working in each frame, it is shown that the replicated system computes the same results as a single processor system not subject to failures. Sufficient conditions are obtained to establish that the replicated system recovers from transient faults within a bounded amount of time. Three different voting schemes are examined and proved to satisfy the bounded recovery time conditions.
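
    The voting scheme itself is simple to state; a minimal Python sketch of exact-match majority voting over one frame's replicated outputs follows. The formal contribution of the work lies in proving the recovery properties under the fault hypothesis, not in the vote itself.

        from collections import Counter

        def majority_vote(frame_outputs):
            # frame_outputs: one result per replica for a single frame.
            value, n = Counter(frame_outputs).most_common(1)[0]
            if n * 2 > len(frame_outputs):
                return value
            raise RuntimeError("no majority: fault hypothesis violated")

        # One transient-faulty replica out of three is masked:
        assert majority_vote([42, 42, 17]) == 42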

  5. Comparison of Quantitative Mass Spectrometry Platforms for Monitoring Kinase ATP Probe Uptake in Lung Cancer.

    PubMed

    Hoffman, Melissa A; Fang, Bin; Haura, Eric B; Rix, Uwe; Koomen, John M

    2018-01-05

    Recent developments in instrumentation and bioinformatics have led to new quantitative mass spectrometry platforms including LC-MS/MS with data-independent acquisition (DIA) and targeted analysis using parallel reaction monitoring mass spectrometry (LC-PRM), which provide alternatives to well-established methods, such as LC-MS/MS with data-dependent acquisition (DDA) and targeted analysis using multiple reaction monitoring mass spectrometry (LC-MRM). These tools have been used to identify signaling perturbations in lung cancers and other malignancies, supporting the development of effective kinase inhibitors and, more recently, providing insights into therapeutic resistance mechanisms and drug repurposing opportunities. However, detection of kinases in biological matrices can be challenging; therefore, activity-based protein profiling enrichment of ATP-utilizing proteins was selected as a test case for exploring the limits of detection of low-abundance analytes in complex biological samples. To examine the impact of different MS acquisition platforms, quantification of kinase ATP uptake following kinase inhibitor treatment was analyzed by four different methods: LC-MS/MS with DDA and DIA, LC-MRM, and LC-PRM. For discovery data sets, DIA increased the number of identified kinases by 21% and reduced missingness when compared with DDA. In this context, MRM and PRM were most effective at identifying global kinome responses to inhibitor treatment, highlighting the value of a priori target identification and manual evaluation of quantitative proteomics data sets. We compare results for a selected set of desthiobiotinylated peptides from PRM, MRM, and DIA and identify considerations for selecting a quantification method and postprocessing steps that should be used for each data acquisition strategy.

  6. Design method of ARM based embedded iris recognition system

    NASA Astrophysics Data System (ADS)

    Wang, Yuanbo; He, Yuqing; Hou, Yushi; Liu, Ting

    2008-03-01

    With the advantages of non-invasiveness, uniqueness, stability and a low false recognition rate, iris recognition has been successfully applied in many fields. Up to now, most iris recognition systems have been PC-based. However, a PC is not portable and consumes more power. In this paper, we propose an embedded iris recognition system based on ARM. Considering the requirements of iris image acquisition and the recognition algorithm, we analyzed the design of the iris image acquisition module, designed the ARM processing module and its peripherals, studied the Linux platform and the recognition algorithm running on it, and finally realized an ARM-based iris imaging and recognition system. Experimental results show that the ARM platform we used is fast enough to run the iris recognition algorithm, and that the data stream flows smoothly between the camera and the ARM chip on the embedded Linux system. This is an effective way to realize a portable embedded iris recognition system.

  7. Performance Evaluation of Block Acquisition and Tracking Algorithms Using an Open Source GPS Receiver Platform

    NASA Technical Reports Server (NTRS)

    Ramachandran, Ganesh K.; Akopian, David; Heckler, Gregory W.; Winternitz, Luke B.

    2011-01-01

    Location technologies have many applications in wireless communications, military and space missions, etc. The US Global Positioning System (GPS) and other existing and emerging Global Navigation Satellite Systems (GNSS) are expected to provide accurate location information to enable such applications. While GNSS systems perform very well in strong signal conditions, their operation in many urban, indoor, and space applications is not robust, or is even impossible, due to weak signals and strong distortions. The search for less costly, faster and more sensitive receivers is still in progress. As the research community addresses increasingly complicated phenomena, there is demand for flexible multimode reference receivers, associated SDKs, and development platforms that can accelerate and facilitate the research. One such concept is the software GPS/GNSS receiver (GPS SDR), which permits easy access to algorithmic libraries and the possibility of integrating more advanced algorithms without hardware or major software updates. The GNU-SDR and GPS-SDR open-source receiver platforms are two popular examples. This paper evaluates the performance of recently proposed block-correlator techniques for acquisition and tracking of GPS signals using the open-source GPS-SDR platform.

  8. Evaluation of Emerging Energy-Efficient Heterogeneous Computing Platforms for Biomolecular and Cellular Simulation Workloads.

    PubMed

    Stone, John E; Hallock, Michael J; Phillips, James C; Peterson, Joseph R; Luthey-Schulten, Zaida; Schulten, Klaus

    2016-05-01

    Many of the continuing scientific advances achieved through computational biology are predicated on the availability of ongoing increases in computational power required for detailed simulation and analysis of cellular processes on biologically-relevant timescales. A critical challenge facing the development of future exascale supercomputer systems is the development of new computing hardware and associated scientific applications that dramatically improve upon the energy efficiency of existing solutions, while providing increased simulation, analysis, and visualization performance. Mobile computing platforms have recently become powerful enough to support interactive molecular visualization tasks that were previously only possible on laptops and workstations, creating future opportunities for their convenient use for meetings, remote collaboration, and as head mounted displays for immersive stereoscopic viewing. We describe early experiences adapting several biomolecular simulation and analysis applications for emerging heterogeneous computing platforms that combine power-efficient system-on-chip multi-core CPUs with high-performance massively parallel GPUs. We present low-cost power monitoring instrumentation that provides sufficient temporal resolution to evaluate the power consumption of individual CPU algorithms and GPU kernels. We compare the performance and energy efficiency of scientific applications running on emerging platforms with results obtained on traditional platforms, identify hardware and algorithmic performance bottlenecks that affect the usability of these platforms, and describe avenues for improving both the hardware and applications in pursuit of the needs of molecular modeling tasks on mobile devices and future exascale computers.

  9. 75 FR 6185 - Information Collection Requirement; Defense Federal Acquisition Regulation Supplement; Rights in...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-08

    ...; Defense Federal Acquisition Regulation Supplement; Rights in Technical Data and Computer Software (OMB... 227.72, Rights in Computer Software and Computer Software Documentation, and related provisions and... rights in technical data and computer software. DoD needs this information to implement 10 U.S.C. 2320...

  10. 78 FR 30898 - Information Collection Requirement; Defense Federal Acquisition Regulation Supplement; Rights in...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-23

    ... Data and Computer Software AGENCY: Defense Acquisition Regulations System; Department of Defense (DoD... in Technical Data, and Subpart 227.72, Rights in Computer Software and Computer Software... are associated with rights in technical data and computer software. DoD needs this information to...

  11. QuantWorm: a comprehensive software package for Caenorhabditis elegans phenotypic assays.

    PubMed

    Jung, Sang-Kyu; Aleman-Meza, Boanerges; Riepe, Celeste; Zhong, Weiwei

    2014-01-01

    Phenotypic assays are crucial in genetics; however, traditional methods that rely on human observation are unsuitable for quantitative, large-scale experiments. Furthermore, there is an increasing need for comprehensive analyses of multiple phenotypes to provide multidimensional information. Here we developed an automated, high-throughput computer imaging system for quantifying multiple Caenorhabditis elegans phenotypes. Our imaging system is composed of a microscope equipped with a digital camera and a motorized stage connected to a computer running the QuantWorm software package. Currently, the software package contains one data acquisition module and four image analysis programs: WormLifespan, WormLocomotion, WormLength, and WormEgg. The data acquisition module collects images and videos. The WormLifespan software counts the number of moving worms by using two time-lapse images; the WormLocomotion software computes the velocity of moving worms; the WormLength software measures worm body size; and the WormEgg software counts the number of eggs. To evaluate the performance of our software, we compared its results with manual measurements. We then demonstrated the application of the QuantWorm software in a drug assay and a genetic assay. Overall, the QuantWorm software provided accurate measurements at a high speed. Software source code, executable programs, and sample images are available at www.quantworm.org. Our software package has several advantages over current imaging systems for C. elegans. It is an all-in-one package for quantifying multiple phenotypes. The QuantWorm software is written in Java and its source code is freely available, so it does not require use of commercial software or libraries. It can be run on multiple platforms and easily customized to cope with new methods and requirements.

  12. A New Approach for Progressive Dense Reconstruction from Consecutive Images Based on Prior Low-Density 3D Point Clouds

    NASA Astrophysics Data System (ADS)

    Lari, Z.; El-Sheimy, N.

    2017-09-01

    In recent years, the increasing incidence of climate-related disasters has tremendously affected our environment. In order to effectively manage and reduce the dramatic impacts of such events, the development of timely disaster management plans is essential. Since these disasters are spatial phenomena, timely provision of geospatial information is crucial for effective development of response and management plans. Due to inaccessibility of the affected areas and the limited budget of first responders, timely acquisition of the required geospatial data for these applications is usually possible only using low-cost imaging and georeferencing sensors mounted on unmanned platforms. Despite rapid collection of the required data using these systems, available processing techniques are not yet capable of delivering geospatial information to responders and decision makers in a timely manner. To address this issue, this paper introduces a new technique for dense 3D reconstruction of the affected scenes which can deliver and improve the needed geospatial information incrementally. This approach is implemented based on prior 3D knowledge of the scene and employs computationally efficient 2D triangulation, feature descriptor, feature matching and point verification techniques to optimize and speed up the 3D dense scene reconstruction procedure. To verify the feasibility and computational efficiency of the proposed approach, an experiment using a set of consecutive images collected onboard a UAV platform and prior low-density airborne laser scanning over the same area is conducted, and step-by-step results are provided. A comparative analysis of the proposed approach and an available image-based dense reconstruction technique is also conducted to demonstrate the computational efficiency and competence of this technique for delivering geospatial information with pre-specified accuracy.
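
    The matching stage of such a pipeline can be sketched with off-the-shelf tools; the snippet below uses OpenCV ORB features and brute-force Hamming matching between two consecutive frames (the file names are placeholders), standing in for the computationally efficient descriptor and matching techniques chosen in the paper.

        import cv2  # pip install opencv-python

        # Placeholder frame files from a UAV sequence.
        prev = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
        curr = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

        orb = cv2.ORB_create(nfeatures=2000)
        kp1, des1 = orb.detectAndCompute(prev, None)
        kp2, des2 = orb.detectAndCompute(curr, None)

        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
        # Matched pairs would then be verified and triangulated against
        # the prior low-density point cloud.
        print(len(matches), "tentative correspondences")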

  13. Xi-CAM v1.2.3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    PANDOLFI, RONALD; KUMAR, DINESH; VENKATAKRISHNAN, SINGANALLUR

    Xi-CAM aims to provide a community-driven platform for multimodal analysis in synchrotron science. The platform core provides a robust plugin infrastructure for extensibility, allowing continuing development to simply add further functionality. Current modules include tools for characterization with (GI)SAXS, tomography, and XAS. This will continue to serve as a development base as algorithms for multimodal analysis develop. Seamless remote data access, visualization and analysis are key elements of Xi-CAM, and will become critical to synchrotron data infrastructure as expectations for future data volume and acquisition rates rise with continuously increasing throughputs. The highly interactive design elements of Xi-CAM will similarly support a generation of users who depend on immediate data-quality feedback during high-throughput or burst acquisition modes.

  14. MATtrack: A MATLAB-Based Quantitative Image Analysis Platform for Investigating Real-Time Photo-Converted Fluorescent Signals in Live Cells.

    PubMed

    Courtney, Jane; Woods, Elena; Scholz, Dimitri; Hall, William W; Gautier, Virginie W

    2015-01-01

    We introduce here MATtrack, an open source MATLAB-based computational platform developed to process multi-Tiff files produced by a photo-conversion time lapse protocol for live cell fluorescent microscopy. MATtrack automatically performs a series of steps required for image processing, including extraction and import of numerical values from Multi-Tiff files, red/green image classification using gating parameters, noise filtering, background extraction, contrast stretching and temporal smoothing. MATtrack also integrates a series of algorithms for quantitative image analysis enabling the construction of mean and standard deviation images, clustering and classification of subcellular regions and injection point approximation. In addition, MATtrack features a simple user interface, which enables monitoring of Fluorescent Signal Intensity in multiple Regions of Interest, over time. The latter encapsulates a region growing method to automatically delineate the contours of Regions of Interest selected by the user, and performs background and regional Average Fluorescence Tracking, and automatic plotting. Finally, MATtrack computes convenient visualization and exploration tools including a migration map, which provides an overview of the protein intracellular trajectories and accumulation areas. In conclusion, MATtrack is an open source MATLAB-based software package tailored to facilitate the analysis and visualization of large data files derived from real-time live cell fluorescent microscopy using photoconvertible proteins. It is flexible, user friendly, compatible with Windows, Mac, and Linux, and a wide range of data acquisition software. MATtrack is freely available for download at eleceng.dit.ie/courtney/MATtrack.zip.
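
    Two of the listed preprocessing steps are easy to make concrete. The Python/NumPy sketch below shows percentile-based contrast stretching and moving-average temporal smoothing as simplified stand-ins for MATtrack's MATLAB routines; parameter values are illustrative.

        import numpy as np

        def contrast_stretch(img, p_low=1, p_high=99):
            # Map the [p_low, p_high] percentile range onto [0, 1].
            lo, hi = np.percentile(img, [p_low, p_high])
            return np.clip((img - lo) / max(hi - lo, 1e-12), 0.0, 1.0)

        def temporal_smooth(stack, w=3):
            # Moving average along the time axis of a (T, H, W) stack.
            kernel = np.ones(w) / w
            return np.apply_along_axis(
                lambda t: np.convolve(t, kernel, "same"), 0, stack)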

  15. Feasibility of an ultra-low power digital signal processor platform as a basis for a fully implantable brain-computer interface system.

    PubMed

    Wang, Po T; Gandasetiawan, Keulanna; McCrimmon, Colin M; Karimi-Bidhendi, Alireza; Liu, Charles Y; Heydari, Payam; Nenadic, Zoran; Do, An H

    2016-08-01

    A fully implantable brain-computer interface (BCI) can be a practical tool to restore independence to those affected by spinal cord injury. We envision that such a BCI system will invasively acquire brain signals (e.g. electrocorticogram) and translate them into control commands for external prostheses. The feasibility of such a system was tested by implementing its benchtop analogue, centered around a commercial, ultra-low power (ULP) digital signal processor (DSP, TMS320C5517, Texas Instruments). A suite of signal processing and BCI algorithms, including (de)multiplexing, Fast Fourier Transform, power spectral density, principal component analysis, linear discriminant analysis, Bayes rule, and a finite state machine, was implemented and tested in the DSP. The system's signal acquisition fidelity was tested and characterized by acquiring harmonic signals from a function generator. In addition, the BCI decoding performance was tested, first with signals from a function generator, and subsequently using human electroencephalogram (EEG) during an eyes opening and closing task. On average, the system spent 322 ms to process and analyze 2 s of data. Crosstalk (< -65 dB) and harmonic distortion (~1%) were minimal. Timing jitter averaged 49 μs per 1000 ms. The online BCI decoding accuracies were 100% for both function generator and EEG data. These results show that a complex BCI algorithm can be executed on an ULP DSP without compromising performance. This suggests that the proposed hardware platform may be used as a basis for future, fully implantable BCI systems.
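
    Because the eyes-open/eyes-closed task hinges on occipital alpha power, the decoding chain can be caricatured in a few lines. The Python sketch below (windowed FFT periodogram, alpha-band power, threshold rule) is only a stand-in for the DSP's FFT/PSD/PCA/LDA pipeline, with the threshold assumed to have been learned offline.

        import numpy as np

        def alpha_power(eeg, fs, band=(8.0, 12.0)):
            # Mean periodogram power in the alpha band for one channel.
            f = np.fft.rfftfreq(len(eeg), 1.0 / fs)
            psd = np.abs(np.fft.rfft(eeg * np.hanning(len(eeg)))) ** 2
            mask = (f >= band[0]) & (f <= band[1])
            return psd[mask].mean()

        def classify(segment, fs, threshold):
            # threshold would be fit offline (e.g. LDA on log-power).
            return "eyes closed" if alpha_power(segment, fs) > threshold \
                else "eyes open"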

  16. MATtrack: A MATLAB-Based Quantitative Image Analysis Platform for Investigating Real-Time Photo-Converted Fluorescent Signals in Live Cells

    PubMed Central

    Courtney, Jane; Woods, Elena; Scholz, Dimitri; Hall, William W.; Gautier, Virginie W.

    2015-01-01

    We introduce here MATtrack, an open source MATLAB-based computational platform developed to process multi-Tiff files produced by a photo-conversion time lapse protocol for live cell fluorescent microscopy. MATtrack automatically performs a series of steps required for image processing, including extraction and import of numerical values from Multi-Tiff files, red/green image classification using gating parameters, noise filtering, background extraction, contrast stretching and temporal smoothing. MATtrack also integrates a series of algorithms for quantitative image analysis enabling the construction of mean and standard deviation images, clustering and classification of subcellular regions and injection point approximation. In addition, MATtrack features a simple user interface, which enables monitoring of Fluorescent Signal Intensity in multiple Regions of Interest, over time. The latter encapsulates a region growing method to automatically delineate the contours of Regions of Interest selected by the user, and performs background and regional Average Fluorescence Tracking, and automatic plotting. Finally, MATtrack computes convenient visualization and exploration tools including a migration map, which provides an overview of the protein intracellular trajectories and accumulation areas. In conclusion, MATtrack is an open source MATLAB-based software package tailored to facilitate the analysis and visualization of large data files derived from real-time live cell fluorescent microscopy using photoconvertible proteins. It is flexible, user friendly, compatible with Windows, Mac, and Linux, and a wide range of data acquisition software. MATtrack is freely available for download at eleceng.dit.ie/courtney/MATtrack.zip. PMID:26485569

  17. Integrating CAD modules in a PACS environment using a wide computing infrastructure.

    PubMed

    Suárez-Cuenca, Jorge J; Tilve, Amara; López, Ricardo; Ferro, Gonzalo; Quiles, Javier; Souto, Miguel

    2017-04-01

    The aim of this paper is to describe a project designed to achieve a total integration of different CAD algorithms into the PACS environment by using a wide computing infrastructure. The aim is to build a system for the entire region of Galicia, Spain, to make CAD accessible to multiple hospitals by employing different PACSs and clinical workstations. The new CAD model seeks to connect different devices (CAD systems, acquisition modalities, workstations and PACS) by means of networking based on a platform that will offer different CAD services. This paper describes some aspects related to the health services of the region where the project was developed, CAD algorithms that were either employed or selected for inclusion in the project, and several technical aspects and results. We have built a standards-based platform with which users can request a CAD service and receive the results in their local PACS. The process runs through a web interface that allows sending data to the different CAD services. A DICOM SR object is received with the results of the algorithms stored inside the original study in the proper folder with the original images. As a result, a homogeneous service to the different hospitals of the region will be offered. End users will benefit from a homogeneous workflow and a standardised integration model to request and obtain results from CAD systems in any modality, not dependent on commercial integration models. This new solution will foster the deployment of these technologies in the entire region of Galicia.

  18. Toward real-time virtual biopsy of oral lesions using confocal laser endomicroscopy interfaced with embedded computing.

    PubMed

    Thong, Patricia S P; Tandjung, Stephanus S; Movania, Muhammad Mobeen; Chiew, Wei-Ming; Olivo, Malini; Bhuvaneswari, Ramaswamy; Seah, Hock-Soon; Lin, Feng; Qian, Kemao; Soo, Khee-Chee

    2012-05-01

    Oral lesions are conventionally diagnosed using white light endoscopy and histopathology. This can pose a challenge because the lesions may be difficult to visualise under white light illumination. Confocal laser endomicroscopy can be used for confocal fluorescence imaging of surface and subsurface cellular and tissue structures. To move toward real-time "virtual" biopsy of oral lesions, we interfaced an embedded computing system to a confocal laser endomicroscope to achieve a prototype three-dimensional (3-D) fluorescence imaging system. A field-programmable gate array computing platform was programmed to enable synchronization of cross-sectional image grabbing and Z-depth scanning, automate the acquisition of confocal image stacks and perform volume rendering. Fluorescence imaging of the human and murine oral cavities was carried out using the fluorescent dyes fluorescein sodium and hypericin. Volume rendering of cellular and tissue structures from the oral cavity demonstrates the potential of the system for 3-D fluorescence visualization of the oral cavity in real-time. We aim toward achieving a real-time virtual biopsy technique that can complement current diagnostic techniques and aid in targeted biopsy for better clinical outcomes.

  19. CERN openlab: Engaging industry for innovation in the LHC Run 3-4 R&D programme

    NASA Astrophysics Data System (ADS)

    Girone, M.; Purcell, A.; Di Meglio, A.; Rademakers, F.; Gunne, K.; Pachou, M.; Pavlou, S.

    2017-10-01

    LHC Run3 and Run4 represent an unprecedented challenge for HEP computing in terms of both data volume and complexity. New approaches are needed for how data is collected and filtered, processed, moved, stored and analysed if these challenges are to be met with a realistic budget. To develop innovative techniques we are fostering relationships with industry leaders. CERN openlab is a unique resource for public-private partnership between CERN and leading Information and Communication Technology (ICT) companies. Its mission is to accelerate the development of cutting-edge solutions to be used by the worldwide HEP community. In 2015, CERN openlab started its phase V with a strong focus on tackling the upcoming LHC challenges. Several R&D programs are ongoing in the areas of data acquisition, networks and connectivity, data storage architectures, computing provisioning, computing platforms and code optimisation, and data analytics. This paper gives an overview of the various innovative technologies that are currently being explored by CERN openlab V and discusses the long-term strategies that are pursued by the LHC communities with the help of industry in closing the technological gap in processing and storage needs expected in Run3 and Run4.

  20. The Utility of Cognitive Plausibility in Language Acquisition Modeling: Evidence from Word Segmentation

    ERIC Educational Resources Information Center

    Phillips, Lawrence; Pearl, Lisa

    2015-01-01

    The informativity of a computational model of language acquisition is directly related to how closely it approximates the actual acquisition task, sometimes referred to as the model's "cognitive plausibility." We suggest that though every computational model necessarily idealizes the modeled task, an informative language acquisition…

  1. Development of a computer model to predict platform station keeping requirements in the Gulf of Mexico using remote sensing data

    NASA Technical Reports Server (NTRS)

    Barber, Bryan; Kahn, Laura; Wong, David

    1990-01-01

    Offshore operations such as oil drilling and radar monitoring require semisubmersible platforms to remain stationary at specific locations in the Gulf of Mexico. Ocean currents, wind, and waves in the Gulf of Mexico tend to move platforms away from their desired locations. A computer model was created to predict the station keeping requirements of a platform. The computer simulation uses remote sensing data from satellites and buoys as input. A background of the project, alternate approaches to the project, and the details of the simulation are presented.

  2. Computer-Aided Process and Tools for Mobile Software Acquisition

    DTIC Science & Technology

    2013-04-01

    Bonine, Christopher; Shing, Man-Tak; Otani, Thomas W. (Naval Postgraduate School). Published April 1, 2013; approved for public release. Author note: Bonine is a lieutenant in the United States Navy, currently assigned to the Navy Cyber Defense

  3. A Low-Cost Data Acquisition System for Automobile Dynamics Applications

    PubMed Central

    González, Alejandro; Vinolas, Jordi

    2018-01-01

    This project addresses the need for the implementation of low-cost acquisition technology in the field of vehicle engineering: the design, development, manufacture, and verification of a low-cost Arduino-based data acquisition platform to be used in <80 Hz data acquisition in vehicle dynamics, using low-cost accelerometers. In addition to this, a comparative study is carried out of professional vibration acquisition technologies and low-cost systems, obtaining optimum results for low- and medium-frequency operations with an error of 2.19% on road tests. It is therefore concluded that these technologies are applicable to the automobile industry, thereby allowing the project costs to be reduced and thus facilitating access to this kind of research that requires limited resources. PMID:29382039

  4. A Low-Cost Data Acquisition System for Automobile Dynamics Applications.

    PubMed

    González, Alejandro; Olazagoitia, José Luis; Vinolas, Jordi

    2018-01-27

    This project addresses the need for the implementation of low-cost acquisition technology in the field of vehicle engineering: the design, development, manufacture, and verification of a low-cost, Arduino-based data acquisition platform for sub-80 Hz data acquisition in vehicle dynamics, using low-cost accelerometers. In addition, a comparative study of professional vibration acquisition technologies and low-cost systems is carried out, obtaining optimum results for low- and medium-frequency operation with an error of 2.19% in road tests. It is therefore concluded that these technologies are applicable to the automobile industry, allowing project costs to be reduced and thus facilitating access to this kind of research with limited resources.
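
    On the host side, logging from such an Arduino-based platform usually reduces to reading a serial stream. The Python/pySerial sketch below assumes a hypothetical firmware that prints one "t_ms,ax,ay,az" line per sample; the port name, baud rate, and line format are all assumptions.

        import serial  # pip install pyserial

        PORT, BAUD = "/dev/ttyACM0", 115200  # adjust for your setup

        with serial.Serial(PORT, BAUD, timeout=1) as link, \
                open("run.csv", "w") as out:
            out.write("t_ms,ax,ay,az\n")
            for _ in range(8000):  # ~100 s of samples at 80 Hz
                line = link.readline().decode("ascii", errors="ignore").strip()
                if line.count(",") == 3:  # crude integrity check
                    out.write(line + "\n")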

  5. Traffic information computing platform for big data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duan, Zongtao (E-mail: ztduan@chd.edu.cn); Li, Ying; Zheng, Xibin

    The big data environment creates the data conditions needed to improve the quality of traffic information services. The goal of this article is to construct a traffic information computing platform for the big data environment. Through in-depth analysis of the characteristics of big data and of traffic information services, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this type of traffic atomic information computing architecture helps to guarantee traffic safety and efficient operation, and enables more intelligent and personalized traffic information services for users.

  6. Flight Simulator Platform Motion and Air Transport Pilot Training

    NASA Technical Reports Server (NTRS)

    Lee, Alfred T.; Bussolari, Steven R.

    1989-01-01

    The influence of flight simulator platform motion on pilot training and performance was examined in two studies utilizing a B-727-200 aircraft simulator. The simulator, located at Ames Research Center, is certified by the FAA for upgrade and transition training in air carrier operations. Subjective ratings and objective performance of experienced B-727 pilots did not reveal any reliable effects of wide variations in platform motion design. Motion platform variations did, however, affect the acquisition of control skill by pilots with no prior heavy-aircraft flying experience. The effect was limited to pitch attitude control inputs during the early phase of landing training. Implications for the definition of platform motion requirements in air transport pilot training are discussed.

  7. The Utah Educational Technology Initiative Year Two Evaluation: Program Implementation, Computer Acquisition and Placement, and Computer Use.

    ERIC Educational Resources Information Center

    Mergendoller, John R.; And Others

    This evaluation report describes program implementation, computer acquisition and placement, and computer use during the second year (1991-92) of the Utah Educational Technology Initiative (ETI). In addition, it discusses the various ways computers are used in Utah schools and reports the opinions and experiences of ETI coordinators in the 12…

  8. Age-dependent effects of neonatal methamphetamine exposure on spatial learning

    PubMed Central

    Vorhees, Charles V.; Skelton, Matthew R.; Williams, Michael T.

    2009-01-01

    Neonatal rats exposed to (+)-methamphetamine (MA) display spatial learning and reference memory deficits in the Morris water maze. In separate experiments the emergence and permanence of these effects were determined. Twenty litters were used in each experiment, and two male/female pairs/litter received saline or MA (5 mg/kg four times a day) on postnatal days (P) 11–20. In experiment 1, one MA and one saline pair from each litter began testing on either P30 or P40, whereas in experiment 2, testing began on P180 or P360. Animals received trials in a straight swimming channel and then in the Morris maze (acquisition, reversal, and reduced platform phases). In both experiments, MA-treated groups showed impaired learning in the platform trials and impaired reference memory in the probe trials, which were largely independent of age. The P30 and P40 MA impairments were seen on acquisition and reduced platform trials but not on reversal. In the probe trials, MA effects were seen during all phases. The P180 and P360 MA-induced deficits were seen in all phases of the platform trials. In probe trials, deficits were only seen during the reversal and reduced platform phases. The results demonstrate that neonatal MA treatment induces spatial learning and reference memory deficits that emerge early and persist until at least 1 year of age, suggesting permanence. PMID:17762523

  9. Computer Skills Acquisition: A Review and Future Directions for Research.

    ERIC Educational Resources Information Center

    Gattiker, Urs E.

    A review of past research on training employees for computer-mediated work leads to the development of theory and propositions concerning the relationship between different variables, such as: (1) individual factors; (2) task and person-computer interface; (3) characteristics of training design for the acquisition of computer skills; and (4) the…

  10. Open source acceleration of wave optics simulations on energy efficient high-performance computing platforms

    NASA Astrophysics Data System (ADS)

    Beck, Jeffrey; Bos, Jeremy P.

    2017-05-01

    We compare several modifications to the open-source wave optics package WavePy that are intended to improve execution time. Specifically, we compare the relative performance of the Intel MKL, a CPU-based OpenCV distribution, and a GPU-based version. Performance is compared between distributions on the same compute platform and between a fully featured computing workstation and the NVIDIA Jetson TX1 platform. Comparisons are drawn in terms of both execution time and power consumption. We find that substituting in the Fast Fourier Transform operation from OpenCV provides a marked improvement on all platforms. In addition, we show that embedded platforms offer some possibility for extensive improvement in efficiency compared to a fully featured workstation.
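
    The substitution experiment is easy to reproduce in spirit: time the same 2-D FFT under two interchangeable backends. The Python sketch below uses NumPy and SciPy as accessible stand-ins for the MKL, OpenCV, and GPU builds compared in the paper.

        import time
        import numpy as np
        from scipy import fft as sfft

        # A complex field of the kind propagated in wave-optics codes.
        rng = np.random.default_rng(0)
        field = rng.standard_normal((2048, 2048)) \
            + 1j * rng.standard_normal((2048, 2048))

        for name, fft2 in [("numpy", np.fft.fft2), ("scipy", sfft.fft2)]:
            t0 = time.perf_counter()
            fft2(field)
            print(name, round(time.perf_counter() - t0, 3), "s")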

  11. Graphene as a platform for novel nanoelectronic devices

    NASA Astrophysics Data System (ADS)

    Standley, Brian

    Graphene's superlative electrical and mechanical properties, combined with its compatibility with existing planar silicon-based technology, make it an attractive platform for novel nanoelectronic devices. The development of two such devices is reported--a nonvolatile memory element exploiting the nanoscale graphene edge and a field-effect transistor using graphene for both the conducting channel and, in oxidized form, the gate dielectric. These experiments were enabled by custom software written to fully utilize both instrument-based and computer-based data acquisition hardware and provide a simple measurement automation system. Graphene break junctions were studied and found to exhibit switching behavior in response to an electric field. This switching allows the devices to act as nonvolatile memory elements which have demonstrated thousands of writing cycles and long retention times. A model for device operation is proposed based on the formation and breaking of carbon-atom chains that bridge the junctions. Information storage was demonstrated using the concept of rank coding, in which information is stored in the relative conductance of multiple graphene switches in a memory cell. The high mobility and two dimensional nature of graphene make it an attractive material for field-effect transistors. Another ultrathin layered material--graphene's insulating analogue, graphite oxide--was studied as an alternative to bulk gate dielectric materials such as Al2O3 or HfO2. Transistors were fabricated comprising single or bilayer graphene channels, graphite oxide gate insulators, and metal top-gates. Electron transport measurements reveal minimal leakage through the graphite oxide at room temperature. Its breakdown electric field was found to be comparable to SiO2, typically ~1-3 x 10^8 V/m, while its dielectric constant is slightly higher, kappa ≈ 4.3. As nanoelectronics experiments and their associated instrumentation continue to grow in complexity the need for powerful data acquisition software has only increased. This role has traditionally been filled by semiconductor parameter analyzers or desktop computers running LabVIEW. Mezurit 2 represents a hybrid approach, providing basic virtual instruments which can be controlled in concert through a comprehensive scripting interface. Each virtual instrument's model of operation is described and an architectural overview is provided.

  12. Generation of Digital Surface Models from satellite photogrammetry: the DSM-OPT service of the ESA Geohazards Exploitation Platform (GEP)

    NASA Astrophysics Data System (ADS)

    Stumpf, André; Michéa, David; Malet, Jean-Philippe

    2017-04-01

    The continuously increasing fleet of agile stereo-capable very-high resolution (VHR) optical satellites has facilitated the acquisition of multi-view images of the earth surface. Theoretical revisit times have been reduced to less than one day, and the highest commercially available spatial resolution now amounts to 30 cm/pixel. Digital Surface Models (DSM) and point clouds computed from such satellite stereo-acquisitions can provide valuable input for studies in geomorphology, tectonics, glaciology, hydrology and urban remote sensing. The photogrammetric processing, however, still requires significant expertise, computational resources and costly commercial software. To enable a large Earth Science community (researchers and end-users) to process VHR multi-view images easily and rapidly, this work targets the implementation of a fully automatic satellite-photogrammetry pipeline (i.e. DSM-OPT) on the ESA Geohazards Exploitation Platform (GEP). The implemented pipeline is based on the open-source photogrammetry library MicMac [1] and is designed for distributed processing on a cloud-based infrastructure. The service can be employed in pre-defined processing modes (i.e. urban, plain, hilly, and mountainous environments) or in an advanced processing mode (in which expert users have the possibility to adapt the processing parameters to their specific applications). Four representative use cases are presented to illustrate the accuracy of the resulting surface models and ortho-images as well as the overall processing time. These use cases consisted of the construction of surface models from series of Pléiades images for four applications: urban analysis (Strasbourg, France), landslide detection in mountainous environments (South French Alps), co-seismic deformation in mountain environments (Central Italy earthquake sequence of 2016) and fault recognition for paleo-tectonic analysis (North-East India). Comparisons of the satellite-derived topography to airborne LiDAR topography are discussed. [1] Rupnik, E., Pierrot Deseilligny, M., Delorme, A., and Klinger, Y.: Refined satellite image orientation in the free open-source photogrammetric tools APERO/MICMAC, ISPRS Ann. Photogramm. Remote Sens. Spatial Inf. Sci., III-1, 83-90, doi:10.5194/isprs-annals-III-1-83-2016, 2016.

  13. Interactive Computer-Assisted Instruction in Acid-Base Physiology for Mobile Computer Platforms

    ERIC Educational Resources Information Center

    Longmuir, Kenneth J.

    2014-01-01

    In this project, the traditional lecture hall presentation of acid-base physiology in the first-year medical school curriculum was replaced by interactive, computer-assisted instruction designed primarily for the iPad and other mobile computer platforms. Three learning modules were developed, each with ~20 screens of information, on the subjects…

  14. 48 CFR 227.7207 - Contractor data repositories.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... repositories. 227.7207 Section 227.7207 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS... Computer Software and Computer Software Documentation 227.7207 Contractor data repositories. Follow 227.7108 when it is in the Government's interests to have a data repository include computer software or to...

  15. Possibilities and limits of Internet-based registers.

    PubMed

    Wild, Michael; Candrian, Aron; Wenda, Klaus

    2009-03-01

    The Internet is an inexpensive platform for the investigation of medical questions of low prevalence. By accessing www.ao-nailregister.org, every interested participant may take part in the English-language survey of complications specific to the femoral nail. The address data of the participant, the anonymised key data of the patients, and the medical parameters are entered. In real time, these data are checked for plausibility, evaluated, and published on the Internet, where they are immediately and freely accessible. Because of national differences, data acquisition caused considerable difficulties at the beginning, and wrong data were entered because of linguistic or contextual misunderstandings. After the questionnaire was completely reworked, data input simplified, and an automated plausibility check implemented, these difficulties were resolved. In a next step, automatic evaluation of the data was implemented. Only very few data still have to be checked for plausibility manually, to exclude wrong entries that cannot be verified by the computer. The effort required for data acquisition and evaluation of the Internet-based femoral nail register was thus reduced considerably. The possibility of free international participation, as well as the freely accessible presentation of the results, offers transparency.
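
    As an illustration of how such an automated plausibility check might look, the following minimal Python sketch validates a register entry before acceptance; the field names and value ranges are hypothetical, not those of the actual register.

        # Minimal sketch of an automated plausibility check for register entries.
        # Field names and ranges are illustrative, not those of the actual register.
        def check_entry(entry):
            errors = []
            if not 0 <= entry.get("patient_age", -1) <= 120:
                errors.append("implausible patient age")
            if entry.get("nail_diameter_mm", 0) not in (9, 10, 11, 12):
                errors.append("unknown nail diameter")
            if entry.get("surgery_date") and entry.get("complication_date"):
                # ISO date strings compare correctly in lexicographic order
                if entry["complication_date"] < entry["surgery_date"]:
                    errors.append("complication precedes surgery")
            return errors  # an empty list means the entry passes automatic checks

        print(check_entry({"patient_age": 45, "nail_diameter_mm": 10,
                           "surgery_date": "2008-03-01",
                           "complication_date": "2008-05-12"}))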

  16. Decentralized system identification using stochastic subspace identification on wireless smart sensor networks

    NASA Astrophysics Data System (ADS)

    Sim, Sung-Han; Spencer, Billie F., Jr.; Park, Jongwoong; Jung, Hyungjo

    2012-04-01

    Wireless Smart Sensor Networks (WSSNs) facilitate a new paradigm for structural identification and monitoring of civil infrastructure. Conventional monitoring systems based on wired sensors and centralized data acquisition and processing are challenging and costly to deploy because of cabling and expensive equipment and maintenance. WSSNs have emerged as a technology that can overcome such difficulties, making deployment of a dense array of sensors on large civil structures both feasible and economical. However, as opposed to wired sensor networks, in which centralized data acquisition and processing is common practice, WSSNs require decentralized computing algorithms to reduce the data transmission imposed by the limitations of wireless communication. Several system identification methods have accordingly been implemented to process sensor data and extract essential information, including the Natural Excitation Technique with the Eigensystem Realization Algorithm, Frequency Domain Decomposition (FDD), and the Random Decrement Technique (RDT). Stochastic Subspace Identification (SSI), however, has not been fully utilized in WSSNs, despite its strong potential to enhance system identification. This study presents decentralized system identification using SSI in WSSNs. The approach is implemented on MEMSIC's Imote2 sensor platform and experimentally verified using a 5-story shear building model.
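
    For readers unfamiliar with SSI, the following minimal Python sketch outlines the covariance-driven variant: output covariances are stacked into a block-Toeplitz matrix whose SVD yields the observability matrix, from which the system matrices follow. It is a textbook illustration, not the decentralized Imote2 implementation.

        import numpy as np

        def ssi_cov(y, order, i):
            """Covariance-driven stochastic subspace identification (sketch).
            y: (channels, samples) output-only data; order: model order;
            i: number of block rows in the block-Toeplitz matrix (i >= 2)."""
            l, N = y.shape
            # Output covariance matrices R_k = E[y_{t+k} y_t^T] for k = 0..2i-1
            R = [y[:, k:] @ y[:, :N - k].T / (N - k) for k in range(2 * i)]
            # Block-Toeplitz matrix of covariances, entry (p, q) = R_{i+p-q}
            T = np.block([[R[i + p - q] for q in range(i)] for p in range(i)])
            U, s, Vt = np.linalg.svd(T)
            O = U[:, :order] * np.sqrt(s[:order])   # observability matrix
            A = np.linalg.pinv(O[:-l]) @ O[l:]      # shift invariance gives A
            C = O[:l]                               # output matrix
            # Modal frequencies follow from the eigenvalues of A and the
            # sampling period: f = |ln(eig(A))| / (2*pi*dt)
            return A, C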

  17. An Analysis of the Defense Acquisition Strategy for Unmanned Systems

    DTIC Science & Technology

    2013-11-20

    Product Service Code RAA Rapid Acquisition Authority RCS Radar Cross Section REF Rapid Equipping Force RFID Radio Frequency Identification RDT...the radio frequency identification (RFID) chip also provides a useful basis for comparison. WWII served as the proving ground for RFID technology...enabling miniaturized Free Space Optical Communications systems capable of scaling across data rates, distances, and platforms and integrating with radio

  18. Updates in metabolomics tools and resources: 2014-2015.

    PubMed

    Misra, Biswapriya B; van der Hooft, Justin J J

    2016-01-01

    Data processing and interpretation represent the most challenging and time-consuming steps in high-throughput metabolomic experiments, regardless of the analytical platform (MS or NMR spectroscopy based) used for data acquisition. Improved machinery in metabolomics generates increasingly complex datasets that create the need for more and better processing and analysis software and in silico approaches to understand the resulting data. However, a comprehensive source of information describing the utility of the most recently developed and released metabolomics resources--in the form of tools, software, and databases--is currently lacking. Thus, here we provide an overview of freely available, open-source tools, algorithms, and frameworks to make both upcoming and established metabolomics researchers aware of recent developments, in an attempt to advance and facilitate data processing workflows in their metabolomics research. The major topics include tools and resources for data processing, data annotation, and data visualization in MS- and NMR-based metabolomics. Most of the tools described in this review are dedicated to untargeted metabolomics workflows; however, some more specialist tools are described as well. All tools and resources described, including their analytical and computational platform dependencies, are summarized in an overview table. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Research on self-calibration biaxial autocollimator based on ZYNQ

    NASA Astrophysics Data System (ADS)

    Guo, Pan; Liu, Bingguo; Liu, Guodong; Zhong, Yao; Lu, Binghui

    2018-01-01

    Autocollimators on the market are mainly based on computers or internet-connected electronic devices; their precision, measurement range, and resolution are limited, and external displays are needed to show images in real time. Moreover, no autocollimator on the market provides real-time calibration. In this paper, we propose a biaxial autocollimator based on the ZYNQ embedded platform to solve these problems. Firstly, the traditional optical system is improved and a light path is added for real-time calibration. Then, to improve measurement speed, an embedded platform based on ZYNQ is designed that combines a Linux operating system with the autocollimator; image acquisition, image processing, image display, and a Qt-based man-machine interface are implemented on it. Finally, the system realizes two-dimensional small-angle measurement. Experimental results show that the proposed method improves angle measurement accuracy. At close distance (1.5 m) the standard deviation is 0.15" in the horizontal direction of the image and 0.24" in the vertical direction; at long distance (10 m) the repeatability of measurement is improved by 0.12" in the horizontal direction and 0.3" in the vertical direction.
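
    The core angle computation is simple: an autocollimator converts a mirror tilt theta into a focal-plane spot displacement d = 2*f*theta, so theta = d/(2*f). A minimal Python sketch of this step, assuming an intensity-weighted centroid for spot localization, follows; the parameter names are illustrative, not the paper's.

        import numpy as np

        def tilt_angle_arcsec(image, f_mm, pixel_mm, ref_xy):
            """Estimate mirror tilt from reflected-spot displacement (sketch).
            An autocollimator maps a tilt theta to a focal-plane shift
            d = 2*f*theta, so theta = d / (2*f). ref_xy is the calibrated
            zero-tilt spot position in pixels."""
            # Intensity-weighted centroid of the spot
            ys, xs = np.indices(image.shape)
            total = image.sum()
            cx, cy = (xs * image).sum() / total, (ys * image).sum() / total
            dx_mm = (cx - ref_xy[0]) * pixel_mm
            dy_mm = (cy - ref_xy[1]) * pixel_mm
            to_arcsec = 206265.0  # arcseconds per radian
            return (dx_mm / (2 * f_mm) * to_arcsec,
                    dy_mm / (2 * f_mm) * to_arcsec)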

  20. Integrating light-sheet imaging with virtual reality to recapitulate developmental cardiac mechanics.

    PubMed

    Ding, Yichen; Abiri, Arash; Abiri, Parinaz; Li, Shuoran; Chang, Chih-Chiang; Baek, Kyung In; Hsu, Jeffrey J; Sideris, Elias; Li, Yilei; Lee, Juhyun; Segura, Tatiana; Nguyen, Thao P; Bui, Alexander; Sevag Packard, René R; Fei, Peng; Hsiai, Tzung K

    2017-11-16

    Currently, there is a limited ability to interactively study developmental cardiac mechanics and physiology. We therefore combined light-sheet fluorescence microscopy (LSFM) with virtual reality (VR) to provide a hybrid platform for 3D architecture and time-dependent cardiac contractile function characterization. By taking advantage of the rapid acquisition, high axial resolution, low phototoxicity, and high fidelity in 3D and 4D (3D spatial + 1D time or spectra), this VR-LSFM hybrid methodology enables interactive visualization and quantification otherwise not available by conventional methods, such as routine optical microscopes. We hereby demonstrate multiscale applicability of VR-LSFM to (a) interrogate skin fibroblasts interacting with a hyaluronic acid-based hydrogel, (b) navigate through the endocardial trabecular network during zebrafish development, and (c) localize gene therapy-mediated potassium channel expression in adult murine hearts. We further combined our batch intensity normalized segmentation algorithm with deformable image registration to interface a VR environment with imaging computation for the analysis of cardiac contraction. Thus, the VR-LSFM hybrid platform demonstrates an efficient and robust framework for creating a user-directed microenvironment in which we uncovered developmental cardiac mechanics and physiology with high spatiotemporal resolution.

  1. Integrating light-sheet imaging with virtual reality to recapitulate developmental cardiac mechanics

    PubMed Central

    Ding, Yichen; Abiri, Arash; Abiri, Parinaz; Li, Shuoran; Chang, Chih-Chiang; Hsu, Jeffrey J.; Sideris, Elias; Li, Yilei; Lee, Juhyun; Segura, Tatiana; Nguyen, Thao P.; Bui, Alexander; Sevag Packard, René R.; Hsiai, Tzung K.

    2017-01-01

    Currently, there is a limited ability to interactively study developmental cardiac mechanics and physiology. We therefore combined light-sheet fluorescence microscopy (LSFM) with virtual reality (VR) to provide a hybrid platform for 3D architecture and time-dependent cardiac contractile function characterization. By taking advantage of the rapid acquisition, high axial resolution, low phototoxicity, and high fidelity in 3D and 4D (3D spatial + 1D time or spectra), this VR-LSFM hybrid methodology enables interactive visualization and quantification otherwise not available by conventional methods, such as routine optical microscopes. We hereby demonstrate multiscale applicability of VR-LSFM to (a) interrogate skin fibroblasts interacting with a hyaluronic acid–based hydrogel, (b) navigate through the endocardial trabecular network during zebrafish development, and (c) localize gene therapy-mediated potassium channel expression in adult murine hearts. We further combined our batch intensity normalized segmentation algorithm with deformable image registration to interface a VR environment with imaging computation for the analysis of cardiac contraction. Thus, the VR-LSFM hybrid platform demonstrates an efficient and robust framework for creating a user-directed microenvironment in which we uncovered developmental cardiac mechanics and physiology with high spatiotemporal resolution. PMID:29202458

  2. Active disturbance rejection controller of fine tracking system for free space optical communication

    NASA Astrophysics Data System (ADS)

    Cui, Ning; Liu, Yang; Chen, Xinglin; Wang, Yan

    2013-08-01

    Free space optical communication is one of the most promising approaches for future communications. Laser beam acquisition, pointing, and tracking are crucial technologies of free space optical communication. The fine tracking system is an important component of the APT (acquisition, pointing and tracking) system and cooperates with the coarse pointing system in executing the APT mission. Satellite platform vibration and disturbance reduce received optical power, increase the bit error rate, and seriously affect the performance of laser communication. Given the characteristics of the satellite platform, an active disturbance rejection controller was designed to reduce the vibration and disturbance. The paper makes three major contributions. Firstly, the effects of vibration on inter-satellite optical communications were analyzed, and the causes and characteristics of satellite platform vibration were summarized. The amplitude-frequency response of a filter was designed according to the power spectral density of platform vibration of SILEX (Semiconductor Inter-satellite Laser Experiment), and signals of platform vibration were then generated by filtering white Gaussian noise with this filter. Secondly, the fast steering mirror is a key component of the fine tracking system for optical communication. Mechanical design and modal analysis were carried out for the tip/tilt mirror, which is driven by a piezoelectric actuator and guided by a flexure hinge. Transfer functions of the fast steering mirror, camera, and D/A data acquisition card were established, and a theoretical transfer-function model of the overall system was obtained. Finally, an active disturbance rejection control method was developed: multiple parallel extended state observers were designed to estimate unknown dynamics and external disturbances, and the estimated states were used for nonlinear feedback control and compensation to improve system performance. The simulation results show that the designed controller not only accurately estimates and compensates the disturbances but is also robust to errors in the estimation of unknown dynamics. The controller satisfies the fine-tracking accuracy requirement of a free space optical communication system.
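
    As a sketch of the observer at the heart of such a controller, the following Python snippet implements one update of a standard third-order linear extended state observer with bandwidth parameterization. It illustrates the general ADRC technique rather than the paper's specific nonlinear multi-observer design; all parameter names are illustrative.

        import numpy as np

        def eso_step(z, u, y, b0, omega_o, dt):
            """One Euler update of a linear extended state observer (sketch).
            z = [position est., velocity est., total-disturbance est.];
            observer poles are all placed at the bandwidth omega_o."""
            l1, l2, l3 = 3 * omega_o, 3 * omega_o**2, omega_o**3
            e = y - z[0]                      # output estimation error
            dz = np.array([z[1] + l1 * e,
                           z[2] + b0 * u + l2 * e,
                           l3 * e])
            return z + dt * dz

        # A typical control law, u = (kp*(r - z[0]) - kd*z[1] - z[2]) / b0,
        # cancels the estimated total disturbance z[2] -- the core idea of
        # active disturbance rejection control.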

  3. Contributions of Platform Motion to Simulator Training Effectiveness: Study II--Aerobatics. Interim Report for Period March 1976-November 1977.

    ERIC Educational Resources Information Center

    Martin, Elizabeth L.; Waag, Wayne L.

    A transfer-of-training design was used to evaluate the contributions of simulator training with synergistic six-degrees-of-freedom platform motion to aerobatic skills acquisition in the novice pilot. Thirty-six undergraduate pilot trainees were randomly assigned to one of three treatment groups: motion, no-motion, and control. Those in the control…

  4. Space environmental effects on materials

    NASA Technical Reports Server (NTRS)

    Schwinghmaer, R. J.

    1980-01-01

    The design of long life platforms and structures for space is discussed in terms of the space environmental effects on the materials used. Vacuum, ultraviolet radiation, and charged particle radiation are among the factors considered. Research oriented toward the acquisition of long term environmental effects data needed to support the design and development of large low Earth orbit and geosynchronous Earth orbit space platforms and systems is described.

  5. Hardware-Assisted Large-Scale Neuroevolution for Multiagent Learning

    DTIC Science & Technology

    2014-12-30

    This DURIP equipment award was used to purchase, install, and bring on-line two Berkeley Emulation Engines (BEEs) and two mini-BEE machines to establish an FPGA-based high-performance multiagent training platform and its associated software. This acquisition of BEE4-W... Keywords: Platform; Probabilistic Domain Transformation; Hardware-Assisted; FPGA; BEE; Hive Brain; Multiagent.

  6. The revolution in data gathering systems

    NASA Technical Reports Server (NTRS)

    Cambra, J. M.; Trover, W. F.

    1975-01-01

    Data acquisition systems used in NASA's wind tunnels from the 1950s through the present time are summarized as a baseline for assessing the impact of minicomputers and microcomputers on data acquisition and data processing. Emphasis is placed on the cyclic evolution in computer technology that transformed the central computer system and led, finally, to the distributed computer system. Other developments discussed include medium-scale integration, large-scale integration, the combining of data acquisition and control functions, and micro- and minicomputers.

  7. Evaluation of Emerging Energy-Efficient Heterogeneous Computing Platforms for Biomolecular and Cellular Simulation Workloads

    PubMed Central

    Stone, John E.; Hallock, Michael J.; Phillips, James C.; Peterson, Joseph R.; Luthey-Schulten, Zaida; Schulten, Klaus

    2016-01-01

    Many of the continuing scientific advances achieved through computational biology are predicated on the availability of ongoing increases in computational power required for detailed simulation and analysis of cellular processes on biologically-relevant timescales. A critical challenge facing the development of future exascale supercomputer systems is the development of new computing hardware and associated scientific applications that dramatically improve upon the energy efficiency of existing solutions, while providing increased simulation, analysis, and visualization performance. Mobile computing platforms have recently become powerful enough to support interactive molecular visualization tasks that were previously only possible on laptops and workstations, creating future opportunities for their convenient use for meetings, remote collaboration, and as head mounted displays for immersive stereoscopic viewing. We describe early experiences adapting several biomolecular simulation and analysis applications for emerging heterogeneous computing platforms that combine power-efficient system-on-chip multi-core CPUs with high-performance massively parallel GPUs. We present low-cost power monitoring instrumentation that provides sufficient temporal resolution to evaluate the power consumption of individual CPU algorithms and GPU kernels. We compare the performance and energy efficiency of scientific applications running on emerging platforms with results obtained on traditional platforms, identify hardware and algorithmic performance bottlenecks that affect the usability of these platforms, and describe avenues for improving both the hardware and applications in pursuit of the needs of molecular modeling tasks on mobile devices and future exascale computers. PMID:27516922

  8. Investigation of left and right lateral fluid percussion injury in C57BL6/J mice: In vivo functional consequences

    PubMed Central

    Schurman, Lesley D.; Smith, Terry L.; Morales, Anthony J.; Lee, Nancy N.; Reeves, Thomas M.; Phillips, Linda L.; Lichtman, Aron H.

    2017-01-01

    Although rodent models of traumatic brain injury (TBI) reliably produce cognitive and motor disturbances, behavioral characterization resulting from left and right hemisphere injuries remains unexplored. Here we examined the functional consequences of targeting the left versus right parietal cortex in lateral fluid percussion injury, on Morris water maze (MWM) spatial memory tasks (fixed platform and reversal) and neurological motor deficits (neurological severity score and rotarod). In the MWM fixed platform task, right lateral injury produced a small delay in acquisition rate compared to left. However, injury to either hemisphere resulted in probe trial deficits. In the MWM reversal task, left-right performance deficits were not evident, though left lateral injury produced mild acquisition and probe trial deficits compared to sham controls. Additionally, left and right injury produced similar neurological motor task deficits, impaired righting times, and lesion volumes. Injury to either hemisphere also produced robust ipsilateral, and modest contralateral, morphological changes in reactive microglia and astrocytes. In conclusion, left and right lateral TBI impaired MWM performance, with mild fixed platform acquisition rate differences, despite similar motor deficits, histological damage, and glial cell reactivity. Thus, while both left and right lateral TBI produce cognitive deficits, laterality in mouse MWM learning and memory merits consideration in the investigation of TBI-induced cognitive consequences. PMID:28527714

  9. Investigation of left and right lateral fluid percussion injury in C57BL6/J mice: In vivo functional consequences.

    PubMed

    Schurman, Lesley D; Smith, Terry L; Morales, Anthony J; Lee, Nancy N; Reeves, Thomas M; Phillips, Linda L; Lichtman, Aron H

    2017-07-13

    Although rodent models of traumatic brain injury (TBI) reliably produce cognitive and motor disturbances, behavioral characterization resulting from left and right hemisphere injuries remains unexplored. Here we examined the functional consequences of targeting the left versus right parietal cortex in lateral fluid percussion injury, on Morris water maze (MWM) spatial memory tasks (fixed platform and reversal) and neurological motor deficits (neurological severity score and rotarod). In the MWM fixed platform task, right lateral injury produced a small delay in acquisition rate compared to left. However, injury to either hemisphere resulted in probe trial deficits. In the MWM reversal task, left-right performance deficits were not evident, though left lateral injury produced mild acquisition and probe trial deficits compared to sham controls. Additionally, left and right injury produced similar neurological motor task deficits, impaired righting times, and lesion volumes. Injury to either hemisphere also produced robust ipsilateral, and modest contralateral, morphological changes in reactive microglia and astrocytes. In conclusion, left and right lateral TBI impaired MWM performance, with mild fixed platform acquisition rate differences, despite similar motor deficits, histological damage, and glial cell reactivity. Thus, while both left and right lateral TBI produce cognitive deficits, laterality in mouse MWM learning and memory merits consideration in the investigation of TBI-induced cognitive consequences. Copyright © 2017. Published by Elsevier B.V.

  10. Autonomous self-organizing resource manager for multiple networked platforms

    NASA Astrophysics Data System (ADS)

    Smith, James F., III

    2002-08-01

    A fuzzy logic based expert system for resource management has been developed that automatically allocates electronic attack (EA) resources in real time over many dissimilar autonomous naval platforms defending their group against attackers. The platforms can be very general, e.g., ships, planes, robots, land-based facilities, etc. Potential foes the platforms deal with can also be general. This paper provides an overview of the resource manager, including the four fuzzy decision trees that make it up; the fuzzy EA model; genetic-algorithm-based optimization; co-evolutionary data mining through gaming; and mathematical, computational, and hardware-based validation. Methods of automatically designing new multi-platform EA techniques are considered. The expert system runs on each defending platform, rendering it an autonomous system requiring no human intervention. There is no commanding platform. Instead the platforms work cooperatively as a function of battlespace geometry; sensor data such as range, bearing, ID, and uncertainty measures for sensor output; intelligence reports; etc. Computational experiments show the defending networked platforms' ability to self-organize, which is illustrated through the output of the scenario generator, a software package that automates the underlying data mining problem and creates a computer movie of the platforms' interaction for evaluation.
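
    To make the fuzzy-logic machinery concrete, the following minimal Python sketch evaluates one illustrative rule of the kind such a resource manager might use; the membership functions, variables, and thresholds are invented for illustration and are not the published decision trees.

        # Illustrative fuzzy inference step for EA resource allocation (a sketch;
        # the actual resource manager's membership functions are not given here).
        def falling(x, full, zero):
            """Shoulder membership: 1 below `full`, linear ramp to 0 at `zero`."""
            if x <= full:
                return 1.0
            if x >= zero:
                return 0.0
            return (zero - x) / (zero - full)

        def jam_priority(range_km, bearing_uncertainty_deg):
            close = falling(range_km, 20, 60)                  # "target is close"
            certain = falling(bearing_uncertainty_deg, 2, 10)  # "bearing well known"
            return min(close, certain)  # fuzzy AND: both conditions must hold

        print(jam_priority(range_km=15, bearing_uncertainty_deg=1.5))  # -> 1.0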

  11. Uncover the Cloud for Geospatial Sciences and Applications to Adopt Cloud Computing

    NASA Astrophysics Data System (ADS)

    Yang, C.; Huang, Q.; Xia, J.; Liu, K.; Li, J.; Xu, C.; Sun, M.; Bambacus, M.; Xu, Y.; Fay, D.

    2012-12-01

    Cloud computing is emerging as the future infrastructure for providing computing resources to support and enable scientific research, engineering development, and application construction, as well as workforce education. On the other hand, there is considerable doubt about the readiness of cloud computing to support a variety of scientific research, development, and education activities. This research, a project funded by NASA SMD, investigates through holistic studies how ready cloud computing is to support the geosciences. Four applications with different computing characteristics, including data, computing, concurrent, and spatiotemporal intensities, are used to test this readiness. Three popular and representative cloud platforms, Amazon EC2, Microsoft Azure, and NASA Nebula, as well as a traditional cluster, are utilized in the study. Results illustrate that the cloud is ready to some degree, but more research is needed to fully realize the cloud benefits advertised by many vendors and defined by NIST. Specifically: 1) most cloud platforms can stand up a new computing instance, in effect a new computer, in a few minutes, and are therefore ready to support most computing needs in an on-demand fashion; 2) load balance and elasticity, a defining characteristic, is ready on some cloud platforms, such as Amazon EC2, to support bigger jobs that need responses within minutes, while others do not yet support elasticity and load balance well, and all cloud platforms need further research and development to support real-time applications at the sub-minute level; 3) the user interface and functionality of cloud platforms vary widely; some are professional and well supported and documented, such as Amazon EC2, while others need significant improvement before the general public can adopt cloud computing without professional training or knowledge of computing infrastructure; 4) security is a big concern in cloud computing; given the sharing spirit of cloud computing it is very hard to ensure higher-level security unless a private cloud is built for a specific organization without public access, and public cloud platforms do not yet support the FISMA medium level and may never be able to support the FISMA high level; 5) HPC jobs are not well supported in the cloud, with only Amazon EC2 supporting them well. The research is being used by NASA and other agencies in considering cloud computing adoption. We hope the publication of this research will also help the public to adopt cloud computing.

  12. 48 CFR 52.227-19 - Commercial Computer Software License.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Software License. 52.227-19 Section 52.227-19 Federal Acquisition Regulations System FEDERAL ACQUISITION... Clauses 52.227-19 Commercial Computer Software License. As prescribed in 27.409(g), insert the following clause: Commercial Computer Software License (DEC 2007) (a) Notwithstanding any contrary provisions...

  13. 48 CFR 52.227-19 - Commercial Computer Software License.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Software License. 52.227-19 Section 52.227-19 Federal Acquisition Regulations System FEDERAL ACQUISITION... Clauses 52.227-19 Commercial Computer Software License. As prescribed in 27.409(g), insert the following clause: Commercial Computer Software License (DEC 2007) (a) Notwithstanding any contrary provisions...

  14. 48 CFR 52.227-19 - Commercial Computer Software License.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Software License. 52.227-19 Section 52.227-19 Federal Acquisition Regulations System FEDERAL ACQUISITION... Clauses 52.227-19 Commercial Computer Software License. As prescribed in 27.409(g), insert the following clause: Commercial Computer Software License (DEC 2007) (a) Notwithstanding any contrary provisions...

  15. 48 CFR 52.227-19 - Commercial Computer Software License.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Software License. 52.227-19 Section 52.227-19 Federal Acquisition Regulations System FEDERAL ACQUISITION... Clauses 52.227-19 Commercial Computer Software License. As prescribed in 27.409(g), insert the following clause: Commercial Computer Software License (DEC 2007) (a) Notwithstanding any contrary provisions...

  16. 48 CFR 52.227-19 - Commercial Computer Software License.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Software License. 52.227-19 Section 52.227-19 Federal Acquisition Regulations System FEDERAL ACQUISITION... Clauses 52.227-19 Commercial Computer Software License. As prescribed in 27.409(g), insert the following clause: Commercial Computer Software License (DEC 2007) (a) Notwithstanding any contrary provisions...

  17. Rotating Desk for Collaboration by Two Computer Programmers

    NASA Technical Reports Server (NTRS)

    Riley, John Thomas

    2005-01-01

    A special-purpose desk has been designed to facilitate collaboration by two computer programmers sharing one desktop computer or computer terminal. The impetus for the design is a trend toward what is known in the software industry as extreme programming, an approach intended to ensure high quality without sacrificing the quantity of computer code produced. Programmers working in pairs is a major feature of extreme programming. The present desk design minimizes the stress of the collaborative work environment. It supports both quality and work flow by making it unnecessary for programmers to get in each other's way. The desk (see figure) includes a rotating platform that supports a computer video monitor, keyboard, and mouse. The desk enables one programmer to work on the keyboard for any amount of time and then the other programmer to take over without breaking the train of thought. The rotating platform is supported by a turntable bearing that, in turn, is supported by a weighted base. The platform contains weights to improve its balance. The base includes a stand for a computer, and is shaped and dimensioned to provide adequate foot clearance for both users. The platform includes an adjustable stand for the monitor, a surface for the keyboard and mouse, and spaces for work papers, drinks, and snacks. The heights of the monitor, keyboard, and mouse are set to minimize stress. The platform can be rotated through an angle of 40° to give either user a straight-on view of the monitor and full access to the keyboard and mouse. Magnetic latches keep the platform preferentially at either of the two extremes of rotation. To switch between users, one simply grabs the edge of the platform and pulls it around. The magnetic latch is easily released, allowing the platform to rotate freely to the position of the other user.

  18. Micromagnetics on high-performance workstation and mobile computational platforms

    NASA Astrophysics Data System (ADS)

    Fu, S.; Chang, R.; Couture, S.; Menarini, M.; Escobar, M. A.; Kuteifan, M.; Lubarda, M.; Gabay, D.; Lomakin, V.

    2015-05-01

    The feasibility of using high-performance desktop and embedded mobile computational platforms is presented, including multi-core Intel central processing unit, Nvidia desktop graphics processing units, and Nvidia Jetson TK1 Platform. FastMag finite element method-based micromagnetic simulator is used as a testbed, showing high efficiency on all the platforms. Optimization aspects of improving the performance of the mobile systems are discussed. The high performance, low cost, low power consumption, and rapid performance increase of the embedded mobile systems make them a promising candidate for micromagnetic simulations. Such architectures can be used as standalone systems or can be built as low-power computing clusters.

  19. Continuous measurement of breast tumor hormone receptor expression: a comparison of two computational pathology platforms

    PubMed Central

    Ahern, Thomas P.; Beck, Andrew H.; Rosner, Bernard A.; Glass, Ben; Frieling, Gretchen; Collins, Laura C.; Tamimi, Rulla M.

    2017-01-01

    Background: Computational pathology platforms incorporate digital microscopy with sophisticated image analysis to permit rapid, continuous measurement of protein expression. We compared two computational pathology platforms on their measurement of breast tumor estrogen receptor (ER) and progesterone receptor (PR) expression. Methods: Breast tumor microarrays from the Nurses' Health Study were stained for ER (n=592) and PR (n=187). One expert pathologist scored cases as positive if ≥1% of tumor nuclei exhibited stain. ER and PR were then measured with the Definiens Tissue Studio (automated) and Aperio Digital Pathology (user-supervised) platforms. Platform-specific measurements were compared using boxplots, scatter plots, and correlation statistics. Classification of ER and PR positivity by platform-specific measurements was evaluated with areas under receiver operating characteristic curves (AUC) from univariable logistic regression models, using expert pathologist classification as the standard. Results: Both platforms showed considerable overlap in continuous measurements of ER and PR between positive and negative groups classified by the expert pathologist. Platform-specific measurements were strongly and positively correlated with one another (rho ≥ 0.77). The user-supervised Aperio workflow performed slightly better than the automated Definiens workflow at classifying ER positivity (AUC(Aperio) = 0.97; AUC(Definiens) = 0.90; difference = 0.07, 95% CI: 0.05, 0.09) and PR positivity (AUC(Aperio) = 0.94; AUC(Definiens) = 0.87; difference = 0.07, 95% CI: 0.03, 0.12). Conclusion: Paired hormone receptor expression measurements from two different computational pathology platforms agreed well with one another. The user-supervised workflow yielded better classification accuracy than the automated workflow. Appropriately validated computational pathology algorithms enrich molecular epidemiology studies with continuous protein expression data and may accelerate tumor biomarker discovery. PMID:27729430
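
    As an illustration of the classification analysis described (univariable logistic regression scored by AUC against the pathologist's call), the following Python sketch runs the same procedure on synthetic data; the data, effect sizes, and variable names are invented purely for demonstration.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        # Hypothetical data: continuous platform measurement vs. pathologist call.
        rng = np.random.default_rng(0)
        er_positive = rng.integers(0, 2, 200)                      # expert 0/1 labels
        measurement = er_positive * 40 + rng.normal(30, 15, 200)   # % stained nuclei

        X = measurement.reshape(-1, 1)
        model = LogisticRegression().fit(X, er_positive)
        auc = roc_auc_score(er_positive, model.predict_proba(X)[:, 1])
        print(f"AUC = {auc:.2f}")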

  20. WE-FG-207B-08: Dual-Energy CT Iodine Accuracy Across Vendors and Platforms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacobsen, M; Wood, C; Cody, D

    Purpose: Although a major benefit of dual-energy CT is its quantitative capabilities, it is critical to understand how results vary by scanner manufacturer and/or model before making clinical patient management decisions. Each manufacturer utilizes a specific dual-energy CT approach; cross-calibration may be required for facilities with more than one dual-energy CT scanner type. Methods: A solid dual-energy quality control phantom (Gammex, Inc.; Appleton, WI) representing a large body cross-section containing three iodine inserts (2mg/ml, 5mg/ml, 15 mg/ml) was scanned on these CT systems: GE HD-750 (80/140kVp), prototype GE Revolution CT with GSI (80/140kVp), Siemens Flash (80/140kVp and 100/140kVp), and Philips IQon (120kVp and 140kVp). Iodine content was measured in units of concentration (mg/ml) from a single 5mm-thick central image. Three to five acquisitions were performed on each scanner platform in order to compute standard deviation. Scan acquisitions were approximately dose-matched (∼25mGy CTDIvol) and image parameters were as consistent as possible (thickness, kernel, no noise reduction applied). Results: Iodine measurement error ranges were −0.24-0.16 mg/ml for the 2mg/ml insert (−12.0 − 8.0%), −0.28–0.26 mg/ml for the 5mg/ml insert (−5.6 − 5.2%), and −1.16−0.99 mg/ml for the 15mg/ml insert (−7.7 − 6.6%). Standard deviations ranged from 0 to 0.19 mg/ml for the repeated acquisitions from each scanner. The average iodine measurement error and standard deviation across all systems and inserts was −0.21 ± 0.48 mg/ml (−1.5 ± 6.48%). The largest absolute measurement error was found in the 15mg/ml iodine insert. Conclusion: There was generally good agreement in iodine quantification across 3 dual-energy CT manufacturers and 4 scanner models. This was unexpected given the widely different underlying dual-energy CT mechanisms employed. Future work will include additional scanner platforms, independent verification of the iodine insert standard concentrations (especially the 15 mg/ml insert), and how much measurement variability can be clinically tolerated. This research has been supported by funds from Dr. William Murphy, Jr., the John S. Dunn, Sr. Distinguished Chair in Diagnostic Imaging at MD Anderson Cancer Center.

  1. Study on the application of mobile internet cloud computing platform

    NASA Astrophysics Data System (ADS)

    Gong, Songchun; Fu, Songyin; Chen, Zheng

    2012-04-01

    The innovative development of computer technology has promoted the application of the cloud computing platform, which is in essence a resource-service model in which different kinds of resources are adjusted and exchanged to meet users' needs. Cloud computing offers advantages in many respects: it not only reduces the difficulty of operating the system but also makes it easy for users to search, acquire, and process resources. Accordingly, the author takes the management of digital libraries as the research focus of this paper and analyzes the key technologies of the mobile internet cloud computing platform in operation. The popularization and promotion of computer technology have driven the creation of digital library models, whose core idea is to strengthen the management of library resource information through computers and to construct a high-performance inquiry and search platform that allows users to access the necessary information resources at any time. Cloud computing, in turn, distributes computation across a large number of distributed computers and thereby implements a connected service over multiple machines. Digital libraries, as a typical representative of cloud computing applications, can therefore be used to analyze the key technologies of cloud computing.

  2. Performance Evaluation of Cots Uav for Architectural Heritage Documentation. a Test on S.GIULIANO Chapel in Savigliano (cn) - Italy

    NASA Astrophysics Data System (ADS)

    Chiabrando, F.; Teppati Losè, L.

    2017-08-01

    More than ever, the use of UAV platforms is a standard for image or video acquisition from an aerial point of view. In response to the enormous growth in demand, production of COTS (Commercial Off-the-Shelf) platforms and systems has increased to meet market requirements. In recent years, different platforms have been developed and sold at low to medium cost, and the range of interesting systems on offer is now very large. One of the most important companies producing UAVs and other imaging systems is DJI (Dà-Jiāng Innovations Science and Technology Co., Ltd), founded in 2006 and headquartered in Shenzhen, China. The platforms realized by the company range from low-cost systems up to professional equipment tailored for the high-resolution acquisitions needed for film-making purposes. Given the characteristics of the latest low-cost DJI platforms, their onboard sensors, and the performance of modern photogrammetric software based on Structure from Motion (SfM) algorithms, these systems are nowadays employed for performing 3D surveys from the small up to the large scale. The present paper aims to test image quality, flight operations, flight planning, and the accuracy of the final products for three COTS platforms realized by DJI: the Mavic Pro, the Phantom 4, and the Phantom 4 PRO. The test site chosen was the Chapel of San Giuliano in the municipality of Savigliano (Cuneo, Italy), a small church with two aisles dating back to the early eleventh century.

  3. Research on Service Platform of Internet of Things for Smart City

    NASA Astrophysics Data System (ADS)

    Wang, W.; He, Z.; Huang, D.; Zhang, X.

    2014-04-01

    The application of the Internet of Things in the surveying and mapping industry is still at an exploratory stage and has not yet formed a unified standard. The Chongqing Institute of Surveying and Mapping (CQISM) launched the research project "Research on the Technology of Internet of Things for Smart City". The project focuses on the key technologies of information transmission and exchange on the Internet of Things platform. Data standards for the Internet of Things are designed, and real-time acquisition, mass storage, and distributed data services for large numbers of sensors are realized. On this basis, CQISM deploys a prototype Internet of Things platform. A simulation application in connected cars demonstrates that the platform design is sound and practical.

  4. [The Key Technology Study on Cloud Computing Platform for ECG Monitoring Based on Regional Internet of Things].

    PubMed

    Yang, Shu; Qiu, Yuyan; Shi, Bo

    2016-09-01

    This paper explores methods of building a regional ECG-monitoring Internet of Things, focusing on the implementation of an ECG monitoring center based on a cloud computing platform. It analyzes the implementation principles of automatic identification of arrhythmia types. It also studies the system architecture and key techniques of the cloud computing platform, including server load-balancing technology, reliable storage of massive numbers of small files, and the implementation of a quick search function.
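
    As one minimal illustration of server load balancing, the sketch below implements a least-connections policy in Python; it is a generic textbook scheme, not the architecture described in the paper, and the server names are invented.

        import heapq

        # Minimal least-connections balancer (illustrative; not the paper's design).
        class LeastConnections:
            def __init__(self, servers):
                # heap of (active_connection_count, server_name)
                self.heap = [(0, s) for s in servers]
                heapq.heapify(self.heap)

            def assign(self):
                count, server = heapq.heappop(self.heap)
                heapq.heappush(self.heap, (count + 1, server))
                return server  # route the next ECG stream to this server

        lb = LeastConnections(["ecg-node-1", "ecg-node-2", "ecg-node-3"])
        print([lb.assign() for _ in range(5)])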

  5. Photoacoustic and Ultrasonic Image-Guided Needle Biopsy of the Prostate

    DTIC Science & Technology

    2015-10-01

    improved spectral fidelity (i.e., reduced redshift in obtained spectra) using the Verasonics ultrasound platform, is suited for eventual clinical...trials. The Verasonics acquisition is performed in a Delrin holder and coordinated with a LabVIEW data acquisition device to allow for energy... Verasonics Inc., Toronto, Canada), the ratio of PA signal at 850 vs 750 nm was calculated for each oxygenation concentration (Fig. 5) to achieve

  6. Assessment of Techniques for Evaluating Computer Systems for Federal Agency Procurements. Final Report.

    ERIC Educational Resources Information Center

    Letmanyi, Helen

    Developed to identify and qualitatively assess computer system evaluation techniques for use during acquisition of general purpose computer systems, this document presents several criteria for comparison and selection. An introduction discusses the automatic data processing (ADP) acquisition process and the need to plan for uncertainty through…

  7. The Advantages and Disadvantages of Computer Technology in Second Language Acquisition

    ERIC Educational Resources Information Center

    Lai, Cheng-Chieh; Kritsonis, William Allan

    2006-01-01

    The purpose of this article is to discuss the advantages and disadvantages of computer technology and Computer Assisted Language Learning (CALL) programs for current second language learning. According to the National Clearinghouse for English Language Acquisition & Language Instruction Educational Programs' report (2002), more than nine million…

  8. Architecture and Implementation of OpenPET Firmware and Embedded Software

    PubMed Central

    Abu-Nimeh, Faisal T.; Ito, Jennifer; Moses, William W.; Peng, Qiyu; Choong, Woon-Seng

    2016-01-01

    OpenPET is an open source, modular, extendible, and high-performance platform suitable for multi-channel data acquisition and analysis. Due to the flexibility of the hardware, firmware, and software architectures, the platform is capable of interfacing with a wide variety of detector modules not only in medical imaging but also in homeland security applications. Analog signals from radiation detectors share similar characteristics – a pulse whose area is proportional to the deposited energy and whose leading edge is used to extract a timing signal. As a result, a generic design method of the platform is adopted for the hardware, firmware, and software architectures and implementations. The analog front-end is hosted on a module called a Detector Board, where each board can filter, combine, timestamp, and process multiple channels independently. The processed data is formatted and sent through a backplane bus to a module called Support Board, where 1 Support Board can host up to eight Detector Board modules. The data in the Support Board, coming from 8 Detector Board modules, can be aggregated or correlated (if needed) depending on the algorithm implemented or runtime mode selected. It is then sent out to a computer workstation for further processing. The number of channels (detector modules), to be processed, mandates the overall OpenPET System Configuration, which is designed to handle up to 1,024 channels using 16-channel Detector Boards in the Standard System Configuration and 16,384 channels using 32-channel Detector Boards in the Large System Configuration. PMID:27110034
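
    The generic pulse processing described above (pulse area for energy, leading edge for timing) can be sketched in a few lines of Python; the real system performs this in FPGA firmware, and the parameter names and values here are illustrative.

        import numpy as np

        def pulse_features(samples, dt_ns, baseline_n=8, frac=0.2):
            """Extract energy and a leading-edge timestamp from a digitized pulse
            (sketch of the generic processing a detector front-end performs; the
            actual OpenPET firmware implements this in FPGA logic, not Python).
            Assumes the pulse rises after the baseline window."""
            baseline = samples[:baseline_n].mean()
            pulse = samples - baseline
            energy = pulse.sum() * dt_ns              # area ~ deposited energy
            threshold = frac * pulse.max()            # leading-edge fraction
            idx = int(np.argmax(pulse > threshold))   # first sample above threshold
            # Linear interpolation between samples for sub-sample timing
            t = (idx - 1 + (threshold - pulse[idx - 1])
                 / (pulse[idx] - pulse[idx - 1])) * dt_ns
            return energy, t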

  9. Remote canopy hemispherical image collection system

    NASA Astrophysics Data System (ADS)

    Wan, Xuefen; Liu, Bingyu; Yang, Yi; Han, Fang; Cui, Jian

    2016-11-01

    Canopies are the major site of plant photosynthesis and have distinct architectural elements such as tree crowns, whorls, branches, and shoots. By measuring canopy structural parameters, the solar radiation interception, photosynthetic performance, and spatio-temporal distribution of solar radiation under the canopy can be evaluated. Among canopy structure parameters, Leaf Area Index (LAI) is the key one. Leaf area index is a crucial variable in agronomic and environmental studies because of its importance for estimating the amount of radiation intercepted by the canopy and the crop water requirements. LAI can be derived with high accuracy and effectiveness from hemispherical images obtained below the canopy. Existing hemispherical-image LAI measurement techniques, however, are based on a digital SLR camera with a fisheye lens: users must collect the images manually, the setup is not suited for long-term outdoor measurement, and its high cost limits deployment. In recent years, with the development of embedded systems and image processing technology, low-cost remote acquisition of canopy hemispherical images has become possible. In this paper, we present a remote hemispherical canopy image acquisition system with an in-field/host configuration. The in-field node, based on an embedded platform, a low-cost image sensor, and a fisheye lens, is designed to acquire hemispherical images of the plant canopy remotely and at low cost. Solar radiation and temperature/humidity data, which are important for judging image validity, are also collected for eliminating invalid hemispherical images and for node maintenance. The host computer interacts with the in-field node over a 3G network, where hemispherical image calibration and super-resolution are used to improve image quality. Results show that the system can perform low-cost remote canopy image acquisition for LAI measurement effectively, making it a promising candidate for low-cost remote collection of canopy hemispherical images to measure canopy LAI.
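
    For context, LAI is commonly estimated from such images by inverting the Beer-Lambert law: if P is the canopy gap fraction (the share of sky pixels), then LAI = -ln(P)/k for an extinction coefficient k. The following Python sketch shows this single-angle simplification with an arbitrary threshold; real processing works on multiple zenith rings and is not detailed in the abstract.

        import numpy as np

        def lai_from_hemispherical(image, threshold=0.5, k=0.5):
            """Estimate LAI from a hemispherical canopy image (sketch).
            Gap fraction P is the share of sky (bright) pixels; Beer-Lambert
            inversion gives LAI = -ln(P) / k for extinction coefficient k."""
            sky = image > threshold        # classify sky vs. canopy pixels
            P = sky.mean()                 # gap fraction
            return -np.log(P) / k

        img = np.random.rand(512, 512)     # placeholder for a real image
        print(f"LAI ~ {lai_from_hemispherical(img):.2f}")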

  10. Cloud computing for comparative genomics with windows azure platform.

    PubMed

    Kim, Insik; Jung, Jae-Yoon; Deluca, Todd F; Nelson, Tristan H; Wall, Dennis P

    2012-01-01

    Cloud computing services have emerged as a cost-effective alternative for cluster systems as the number of genomes and required computation power to analyze them increased in recent years. Here we introduce the Microsoft Azure platform with detailed execution steps and a cost comparison with Amazon Web Services.

  11. Cloud Computing for Comparative Genomics with Windows Azure Platform

    PubMed Central

    Kim, Insik; Jung, Jae-Yoon; DeLuca, Todd F.; Nelson, Tristan H.; Wall, Dennis P.

    2012-01-01

    Cloud computing services have emerged as a cost-effective alternative for cluster systems as the number of genomes and required computation power to analyze them increased in recent years. Here we introduce the Microsoft Azure platform with detailed execution steps and a cost comparison with Amazon Web Services. PMID:23032609

  12. Analysis of scalability of high-performance 3D image processing platform for virtual colonoscopy

    NASA Astrophysics Data System (ADS)

    Yoshida, Hiroyuki; Wu, Yin; Cai, Wenli

    2014-03-01

    One of the key challenges in three-dimensional (3D) medical imaging is to enable the fast turn-around time that is often required for interactive or real-time response. This inevitably requires not only high computational power but also high memory bandwidth due to the massive amount of data that need to be processed. For this purpose, we previously developed a software platform for high-performance 3D medical image processing, called the HPC 3D-MIP platform, which employs increasingly available and affordable commodity computing systems such as multicore, cluster, and cloud computing systems. To achieve scalable high-performance computing, the platform employed size-adaptive, distributable block volumes as a core data structure for efficient parallelization of a wide range of 3D-MIP algorithms, supported task scheduling for efficient load distribution and balancing, and consisted of layered parallel software libraries that allow image processing applications to share common functionalities. We evaluated the performance of the HPC 3D-MIP platform by applying it to computationally intensive processes in virtual colonoscopy. Experimental results showed a 12-fold performance improvement on a workstation with 12-core CPUs over the original sequential implementation of the processes, indicating the efficiency of the platform. Analysis of performance scalability based on Amdahl's law for symmetric multicore chips showed the potential for high performance scalability of the HPC 3D-MIP platform when a larger number of cores is available.
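
    For reference, the Amdahl's-law bound mentioned here takes a simple closed form: with a parallelizable fraction p of the work and n cores, the speedup is 1/((1-p) + p/n). A minimal Python sketch, with an illustrative (not measured) value of p:

        def amdahl_speedup(p, n):
            """Amdahl's law: speedup on n cores when a fraction p of the
            work is parallelizable and (1 - p) remains serial."""
            return 1.0 / ((1.0 - p) + p / n)

        # E.g., if 95% of a 3D-MIP pipeline were parallelized, 12 cores give:
        print(f"{amdahl_speedup(0.95, 12):.1f}x")  # ~7.7x, bounded by 20x as n grows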

  13. Phenoliner: A New Field Phenotyping Platform for Grapevine Research

    PubMed Central

    Kicherer, Anna; Herzog, Katja; Bendel, Nele; Klück, Hans-Christian; Backhaus, Andreas; Wieland, Markus; Klingbeil, Lasse; Läbe, Thomas; Hohl, Christian; Petry, Willi; Kuhlmann, Heiner; Seiffert, Udo; Töpfer, Reinhard

    2017-01-01

    In grapevine research the acquisition of phenotypic data is largely restricted to the field due to its perennial nature and size. The methodologies used to assess morphological traits and phenology are mainly limited to visual scoring. Some measurements for biotic and abiotic stress, as well as for quality assessments, are done by invasive measures. The newly evolving sensor technologies provide the opportunity to perform non-destructive evaluations of phenotypic traits using different field phenotyping platforms. One of the biggest technical challenges for field phenotyping of grapevines is the varying light conditions and the background. In the present study the Phenoliner is presented, a novel type of robust field phenotyping platform. The vehicle is based on a grape harvester following the concept of a moveable tunnel. The tunnel is equipped with different sensor systems (RGB and NIR camera system, hyperspectral camera, RTK-GPS, orientation sensor) and an artificial broadband light source. It is independent from external light conditions, and in combination with an artificial background, the Phenoliner enables standardised acquisition of high-quality, geo-referenced sensor data. PMID:28708080

  14. Phenoliner: A New Field Phenotyping Platform for Grapevine Research.

    PubMed

    Kicherer, Anna; Herzog, Katja; Bendel, Nele; Klück, Hans-Christian; Backhaus, Andreas; Wieland, Markus; Rose, Johann Christian; Klingbeil, Lasse; Läbe, Thomas; Hohl, Christian; Petry, Willi; Kuhlmann, Heiner; Seiffert, Udo; Töpfer, Reinhard

    2017-07-14

    In grapevine research the acquisition of phenotypic data is largely restricted to the field due to the crop's perennial nature and size. The methodologies used to assess morphological traits and phenology are mainly limited to visual scoring. Some measurements for biotic and abiotic stress, as well as for quality assessment, are done by invasive measures. Newly evolving sensor technologies provide the opportunity to perform non-destructive evaluations of phenotypic traits using different field phenotyping platforms. One of the biggest technical challenges for field phenotyping of grapevines is the varying light conditions and background. In the present study the Phenoliner is presented, a novel type of robust field phenotyping platform. The vehicle is based on a grape harvester following the concept of a moveable tunnel. The tunnel is equipped with different sensor systems (RGB and NIR camera system, hyperspectral camera, RTK-GPS, orientation sensor) and an artificial broadband light source. It is independent of external light conditions, and in combination with an artificial background the Phenoliner enables standardised acquisition of high-quality, geo-referenced sensor data.

  15. Cloud Based Applications and Platforms (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brodt-Giles, D.

    2014-05-15

    Presentation to the Cloud Computing East 2014 Conference, highlighting our cloud computing strategy, describing the platforms on the cloud (including Smartgrid.gov), and defining our process for implementing cloud-based applications.

  16. A cloud computing based platform for sleep behavior and chronic diseases collaborative research.

    PubMed

    Kuo, Mu-Hsing; Borycki, Elizabeth; Kushniruk, Andre; Huang, Yueh-Min; Hung, Shu-Hui

    2014-01-01

    The objective of this study is to propose a Cloud Computing based platform for sleep behavior and chronic disease collaborative research. The platform consists of two main components: (1) a sensing bed sheet with textile sensors to automatically record patients' sleep behaviors and vital signs, and (2) a service-oriented cloud computing architecture (SOCCA) that provides a data repository and allows for sharing and analysis of the collected data. We also describe our systematic approach to implementing the SOCCA. We believe that the new cloud-based platform can provide nurses and other health professional researchers in differing geographic locations with a cost-effective, flexible, secure, and privacy-preserving research environment.

  17. Data Management Standards in Computer-aided Acquisition and Logistic Support (CALS)

    NASA Technical Reports Server (NTRS)

    Jefferson, David K.

    1990-01-01

    Viewgraphs and discussion on data management standards in computer-aided acquisition and logistic support (CALS) are presented. CALS is intended to reduce cost, increase quality, and improve timeliness of weapon system acquisition and support by greatly improving the flow of technical information. The phase 2 (industrial environment) standards are discussed. The information resource dictionary system (IRDS) is described.

  18. Autonomous Sample Acquisition for Planetary and Small Body Explorations

    NASA Technical Reports Server (NTRS)

    Ghavimi, Ali R.; Serricchio, Frederick; Dolgin, Ben; Hadaegh, Fred Y.

    2000-01-01

    Robotic drilling and autonomous sample acquisition are considered key technology requirements for future planetary or small body exploration missions. Core sampling and subsurface drilling operations are envisioned to be performed from rovers or landers. These supporting platforms are inherently flexible and light, and can withstand only a limited amount of reaction forces and torques. This, together with the unknown properties of the sampled materials, makes the sampling operation tedious and quite challenging. This paper highlights recent advancements in sample acquisition control system design and development for the in situ scientific exploration of planetary and small body missions.

  19. Design and development of data acquisition system based on WeChat hardware

    NASA Astrophysics Data System (ADS)

    Wang, Zhitao; Ding, Lei

    2018-06-01

    A data acquisition system based on WeChat hardware offers a practical route to making data acquisition widely accessible. The whole system is built on the WeChat hardware platform: the hardware part is developed on a DA14580 development board and the software part is based on Alibaba Cloud. We designed a service module, a logic processing module, a data processing module, and a database module. Communication between hardware and software uses the AirSync protocol. We tested the system by collecting temperature and humidity data, and the results show that it can acquire temperature and humidity in real time according to the configured settings.
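
    The pipeline described here is sensor hardware pushing readings to cloud-hosted modules. The sketch below shows that flow schematically only: the endpoint URL and payload fields are hypothetical placeholders, and the actual system communicates through WeChat's AirSync protocol rather than the plain HTTP used here.

    ```python
    # Schematic sensor-to-cloud upload. URL and payload schema are
    # hypothetical; the system in the record uses WeChat's AirSync
    # protocol between the DA14580 hardware and the platform.
    import time
    import random
    import requests

    INGEST_URL = "https://cloud.example.com/api/readings"  # placeholder

    def read_sensor():
        """Stand-in for a temperature/humidity read on the device."""
        return {"temperature_c": round(random.uniform(18, 28), 1),
                "humidity_pct": round(random.uniform(30, 70), 1),
                "timestamp": time.time()}

    try:
        # Replace INGEST_URL with a reachable endpoint before running.
        response = requests.post(INGEST_URL, json=read_sensor(), timeout=5)
        response.raise_for_status()
    except requests.exceptions.RequestException as exc:
        print(f"upload failed: {exc}")
    ```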

  20. A novel signal acquisition platform of human cardiovascular information with noninvasive method

    NASA Astrophysics Data System (ADS)

    Chen, Longcong; Cai, Shaoxi; Li, Bo; Jiang, Qifeng; Ke, Ming; Zhao, Yi; Chen, Sijia; Zou, Misha

    2017-05-01

    Cardiovascular diseases (CVDs) are considered the major cause of death worldwide, so researchers are paying increasing attention to the development of non-invasive methods that obtain as much cardiovascular information (CVI) as possible for early screening and diagnosis. It is known that considerable brain information can be probed with a variety of stimuli (such as video, light, and sound). Therefore, it is quite possible that much more CVI could be extracted by giving the human body specially designed, interrelated stimuli. Based on this hypothesis, we designed a novel signal acquisition platform that acquires more CVI under a special stimulus: a gradually decreasing, and then settable constant, pressure applied to six air belts placed on both brachia, wrists, and ankles. During the stimulation, the platform synchronously collects 24 channels of dynamic signals related to CVI. Moreover, to improve the measurement accuracy of signal acquisition, a high precision reference chip and a software correction are adopted in this platform. We also present collection instances and analysis results that demonstrate its reliability. The results suggest that our platform can not only support in-depth study of the relationship between the collected signals and CVDs but can also serve as the basic tool for developing a new noninvasive cardiovascular function detection instrument and system for use both at home and in the hospital.

  1. CBRAIN: a web-based, distributed computing platform for collaborative neuroimaging research

    PubMed Central

    Sherif, Tarek; Rioux, Pierre; Rousseau, Marc-Etienne; Kassis, Nicolas; Beck, Natacha; Adalat, Reza; Das, Samir; Glatard, Tristan; Evans, Alan C.

    2014-01-01

    The Canadian Brain Imaging Research Platform (CBRAIN) is a web-based collaborative research platform developed in response to the challenges raised by data-heavy, compute-intensive neuroimaging research. CBRAIN offers transparent access to remote data sources, distributed computing sites, and an array of processing and visualization tools within a controlled, secure environment. Its web interface is accessible through any modern browser and uses graphical interface idioms to reduce the technical expertise required to perform large-scale computational analyses. CBRAIN's flexible meta-scheduling has allowed the incorporation of a wide range of heterogeneous computing sites, currently including nine national research High Performance Computing (HPC) centers in Canada, one in Korea, one in Germany, and several local research servers. CBRAIN leverages remote computing cycles and facilitates resource interoperability in a transparent manner for the end user. Compared with typical grid solutions available, our architecture was designed to be easily extendable and deployable on existing remote computing sites with no tool modification, administrative intervention, or special software/hardware configuration. As of October 2013, CBRAIN serves over 200 users spread across 53 cities in 17 countries. The platform is built as a generic framework that can accept data and analysis tools from any discipline. However, its current focus is primarily on neuroimaging research and studies of neurological diseases such as Autism, Parkinson's and Alzheimer's diseases, and Multiple Sclerosis, as well as on normal brain structure and development. This technical report presents the CBRAIN Platform, its current deployment and usage, and future directions. PMID:24904400

  2. CBRAIN: a web-based, distributed computing platform for collaborative neuroimaging research.

    PubMed

    Sherif, Tarek; Rioux, Pierre; Rousseau, Marc-Etienne; Kassis, Nicolas; Beck, Natacha; Adalat, Reza; Das, Samir; Glatard, Tristan; Evans, Alan C

    2014-01-01

    The Canadian Brain Imaging Research Platform (CBRAIN) is a web-based collaborative research platform developed in response to the challenges raised by data-heavy, compute-intensive neuroimaging research. CBRAIN offers transparent access to remote data sources, distributed computing sites, and an array of processing and visualization tools within a controlled, secure environment. Its web interface is accessible through any modern browser and uses graphical interface idioms to reduce the technical expertise required to perform large-scale computational analyses. CBRAIN's flexible meta-scheduling has allowed the incorporation of a wide range of heterogeneous computing sites, currently including nine national research High Performance Computing (HPC) centers in Canada, one in Korea, one in Germany, and several local research servers. CBRAIN leverages remote computing cycles and facilitates resource interoperability in a transparent manner for the end user. Compared with typical grid solutions available, our architecture was designed to be easily extendable and deployable on existing remote computing sites with no tool modification, administrative intervention, or special software/hardware configuration. As of October 2013, CBRAIN serves over 200 users spread across 53 cities in 17 countries. The platform is built as a generic framework that can accept data and analysis tools from any discipline. However, its current focus is primarily on neuroimaging research and studies of neurological diseases such as Autism, Parkinson's and Alzheimer's diseases, and Multiple Sclerosis, as well as on normal brain structure and development. This technical report presents the CBRAIN Platform, its current deployment and usage, and future directions.

  3. Acceleration of Cherenkov angle reconstruction with the new Intel Xeon/FPGA compute platform for the particle identification in the LHCb Upgrade

    NASA Astrophysics Data System (ADS)

    Faerber, Christian

    2017-10-01

    The LHCb experiment at the LHC will upgrade its detector by 2018/2019 to a ‘triggerless’ readout scheme in which all the readout electronics and several sub-detector parts will be replaced. The new readout electronics will be able to read out the detector at 40 MHz. This increases the data bandwidth from the detector down to the Event Filter farm to 40 TBit/s, which also has to be processed to select the interesting proton-proton collisions for later storage. The architecture of a computing farm that can process this amount of data as efficiently as possible is a challenging design task, and several compute accelerator technologies are being considered for use inside the new Event Filter farm. In the high performance computing sector more and more FPGA compute accelerators are used to improve the compute performance and reduce the power consumption (e.g. in the Microsoft Catapult project and the Bing search engine). For the LHCb upgrade, the use of an experimental FPGA-accelerated computing platform in the Event Building or in the Event Filter farm is likewise being considered and therefore tested. This platform from Intel hosts a general-purpose CPU and a high performance FPGA linked via a high speed link, which for this platform is a QPI link. An accelerator is implemented on the FPGA. The system used is a two-socket platform from Intel with a Xeon CPU and an FPGA. The FPGA has cache-coherent memory access to the main memory of the server and can collaborate with the CPU. As a first step, a computationally intensive algorithm to reconstruct Cherenkov angles for the LHCb RICH particle identification was successfully ported in Verilog to the Intel Xeon/FPGA platform and accelerated by a factor of 35. The same algorithm was then ported to the Intel Xeon/FPGA platform with OpenCL; the implementation work and the performance are compared. Another FPGA accelerator, the Nallatech 385 PCIe accelerator with the same Stratix V FPGA, was also tested for performance. The results show that the Intel Xeon/FPGA platforms, which are built in general for high performance computing, are also very interesting for the High Energy Physics community.
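
    The ported algorithm reconstructs Cherenkov angles, which follow the standard relation cos θ_c = 1/(nβ). The sketch below shows only this physics in numpy (natural units, an illustrative radiator index), not the Verilog/OpenCL implementations benchmarked in the record:

    ```python
    # Cherenkov angle from cos(theta_c) = 1 / (n * beta), with
    # beta = p / sqrt(p^2 + m^2) in natural units (GeV, c = 1).
    import numpy as np

    def cherenkov_angle(p_gev, mass_gev, n_refr):
        beta = p_gev / np.sqrt(p_gev**2 + mass_gev**2)
        cos_theta = 1.0 / (n_refr * beta)
        # Below threshold (cos_theta > 1) no Cherenkov light is emitted.
        return np.arccos(cos_theta) if cos_theta <= 1.0 else float("nan")

    # Pion vs kaon hypotheses at 10 GeV in a C4F10-like gas (n ~ 1.0014):
    # the angular separation is what drives the particle identification.
    for name, mass in (("pion", 0.1396), ("kaon", 0.4937)):
        theta = cherenkov_angle(10.0, mass, 1.0014)
        print(f"{name}: theta_c = {1e3 * theta:.1f} mrad")
    ```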

  4. Information and psychomotor skills knowledge acquisition: A student-customer-centered and computer-supported approach.

    PubMed

    Nicholson, Anita; Tobin, Mary

    2006-01-01

    This presentation will discuss coupling commercial and customized computer-supported teaching aids to provide BSN nursing students with a friendly customer-centered self-study approach to psychomotor skill acquisition.

  5. The Cyborg Astrobiologist: testing a novelty detection algorithm on two mobile exploration systems at Rivas Vaciamadrid in Spain and at the Mars Desert Research Station in Utah

    NASA Astrophysics Data System (ADS)

    McGuire, P. C.; Gross, C.; Wendt, L.; Bonnici, A.; Souza-Egipsy, V.; Ormö, J.; Díaz-Martínez, E.; Foing, B. H.; Bose, R.; Walter, S.; Oesker, M.; Ontrup, J.; Haschke, R.; Ritter, H.

    2010-01-01

    In previous work, a platform was developed for testing computer-vision algorithms for robotic planetary exploration. This platform consisted of a digital video camera connected to a wearable computer for real-time processing of images at geological and astrobiological field sites. The real-time processing included image segmentation and the generation of interest points based upon uncommonness in the segmentation maps. Also in previous work, this platform for testing computer-vision algorithms has been ported to a more ergonomic alternative platform, consisting of a phone camera connected via the Global System for Mobile Communications (GSM) network to a remote-server computer. The wearable-computer platform has been tested at geological and astrobiological field sites in Spain (Rivas Vaciamadrid and Riba de Santiuste), and the phone camera has been tested at a geological field site in Malta. In this work, we (i) apply a Hopfield neural-network algorithm for novelty detection based upon colour, (ii) integrate a field-capable digital microscope on the wearable computer platform, (iii) test this novelty detection with the digital microscope at Rivas Vaciamadrid, (iv) develop a Bluetooth communication mode for the phone-camera platform, in order to allow access to a mobile processing computer at the field sites, and (v) test the novelty detection on the Bluetooth-enabled phone camera connected to a netbook computer at the Mars Desert Research Station in Utah. This systems engineering and field testing have together allowed us to develop a real-time computer-vision system that is capable, for example, of identifying lichens as novel within a series of images acquired in semi-arid desert environments. We acquired sequences of images of geologic outcrops in Utah and Spain consisting of various rock types and colours to test this algorithm. The algorithm robustly recognized previously observed units by their colour, while requiring only a single image or a few images to learn colours as familiar, demonstrating its fast learning capability.
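
    The novelty detector described here learns colours as familiar after only one or a few exposures. The sketch below captures that behaviour with a deliberately simple prototype-distance rule, a plain stand-in for the Hopfield neural network actually used; the threshold value is an arbitrary illustrative choice.

    ```python
    # Simplified colour-novelty detector: keep mean-colour prototypes of
    # what has been seen; flag a region as novel if its mean colour is far
    # from every stored prototype. A deliberately simple stand-in for the
    # Hopfield-network detector described in the record.
    import numpy as np

    class ColourNoveltyDetector:
        def __init__(self, threshold=40.0):
            self.prototypes = []      # list of RGB mean-colour vectors
            self.threshold = threshold

        def is_novel(self, region_rgb):
            mean_colour = np.asarray(region_rgb, float).reshape(-1, 3).mean(axis=0)
            if not self.prototypes:
                self.prototypes.append(mean_colour)
                return True
            dists = [np.linalg.norm(mean_colour - p) for p in self.prototypes]
            if min(dists) > self.threshold:
                self.prototypes.append(mean_colour)   # learn it as familiar
                return True
            return False

    detector = ColourNoveltyDetector()
    print(detector.is_novel([[120, 110, 90]]))   # True: nothing seen yet
    print(detector.is_novel([[122, 112, 93]]))   # False: close to known colour
    print(detector.is_novel([[40, 200, 60]]))    # True: lichen-green outlier
    ```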

  6. Platform-independent method for computer aided schematic drawings

    DOEpatents

    Vell, Jeffrey L [Slingerlands, NY; Siganporia, Darius M [Clifton Park, NY; Levy, Arthur J [Fort Lauderdale, FL

    2012-02-14

    A CAD/CAM method is disclosed for a computer system to capture and interchange schematic drawing and associated design information. The schematic drawing and design information are stored in an extensible, platform-independent format.

  7. A Big Data Platform for Storing, Accessing, Mining and Learning Geospatial Data

    NASA Astrophysics Data System (ADS)

    Yang, C. P.; Bambacus, M.; Duffy, D.; Little, M. M.

    2017-12-01

    Big Data is becoming the norm in geoscience domains. A platform that can efficiently manage, access, analyze, mine, and learn from big data for new information and knowledge is therefore desired. This paper introduces our latest effort to develop such a platform, based on our past years' experience with cloud and high performance computing, analyzing big data, comparing big data containers, and mining big geospatial data for new information. The platform includes four layers: a) the bottom layer is a computing infrastructure with proper network, computer, and storage systems; b) the second layer is a cloud computing layer, based on virtualization, that provides on-demand computing services for the upper layers; c) the third layer consists of big data containers customized for dealing with different types of data and functionalities; d) the fourth layer is a big data presentation layer that supports the efficient management, access, analysis, mining, and learning of big geospatial data.

  8. SU-D-206-02: Evaluation of Partial Storage of the System Matrix for Cone Beam Computed Tomography Using a GPU Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matenine, D; Cote, G; Mascolo-Fortin, J

    2016-06-15

    Purpose: Iterative reconstruction algorithms in computed tomography (CT) require a fast method for computing the intersections between the photons' trajectories and the object, also called ray-tracing or system matrix computation. This work evaluates different ways to store the system matrix, aiming to reconstruct dense image grids in reasonable time. Methods: We propose an optimized implementation of Siddon's algorithm using graphics processing units (GPUs) with a novel data storage scheme. The algorithm computes a part of the system matrix on demand, typically for one projection angle. The proposed method was enhanced with accelerating options: storage of larger subsets of the system matrix, systematic reuse of data via geometric symmetries, an arithmetic-rich parallel code, and code configuration via machine learning. It was tested on geometries mimicking a cone beam CT acquisition of a human head. To realistically assess the execution time, the ray-tracing routines were integrated into a regularized Poisson-based reconstruction algorithm. The proposed scheme was also compared to a different approach, where the system matrix is fully pre-computed and loaded at reconstruction time. Results: Fast ray-tracing of realistic acquisition geometries, which often lack spatial symmetry properties, was enabled via the proposed method. Ray-tracing interleaved with projection and backprojection operations required significant additional time. In most cases, ray-tracing was shown to use about 66% of the total reconstruction time. In absolute terms, tracing times varied from 3.6 s to 7.5 min, depending on the problem size. The presence of geometrical symmetries allowed for non-negligible ray-tracing and reconstruction time reduction. Arithmetic-rich parallel code and machine learning permitted a modest reconstruction time reduction, on the order of 1%. Conclusion: Partial system matrix storage permitted the reconstruction of higher 3D image grid sizes and larger projection datasets at the cost of additional time, when compared to the fully pre-computed approach. This work was supported in part by the Fonds de recherche du Quebec - Nature et technologies (FRQ-NT). The authors acknowledge partial support by the CREATE Medical Physics Research Training Network grant of the Natural Sciences and Engineering Research Council of Canada (Grant No. 432290).
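
    Siddon's algorithm, named in the Methods, computes per-voxel intersection lengths by collecting the parametric positions where a ray crosses grid planes. A minimal 2D sketch of that idea (unit-spaced pixels, none of the GPU optimizations or symmetry reuse described above) might look like this:

    ```python
    # 2D sketch of Siddon-style ray tracing on a unit-spaced pixel grid:
    # collect the parametric positions where the ray crosses grid lines,
    # then convert consecutive crossings into per-pixel intersection
    # lengths. Illustrates the approach named in the record, not the
    # authors' optimized GPU implementation.
    import numpy as np

    def siddon_2d(p0, p1, nx, ny):
        """Return {(ix, iy): length} for a ray from p0 to p1 through an
        nx-by-ny grid whose pixels span [0, nx] x [0, ny]."""
        p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
        d = p1 - p0
        alphas = [0.0, 1.0]
        for axis, n in ((0, nx), (1, ny)):
            if d[axis] != 0:
                a = (np.arange(n + 1) - p0[axis]) / d[axis]
                alphas.extend(a[(a > 0) & (a < 1)])
        alphas = np.unique(alphas)
        ray_len = np.linalg.norm(d)
        lengths = {}
        for a0, a1 in zip(alphas[:-1], alphas[1:]):
            mid = p0 + 0.5 * (a0 + a1) * d       # midpoint picks the pixel
            ix, iy = int(mid[0]), int(mid[1])
            if 0 <= ix < nx and 0 <= iy < ny:
                lengths[(ix, iy)] = (a1 - a0) * ray_len
        return lengths

    # Diagonal ray across a 2x2 grid: crosses the centre grid lines once.
    print(siddon_2d((0, 0), (2, 2), 2, 2))  # {(0, 0): ~1.414, (1, 1): ~1.414}
    ```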

  9. The Efficacy of the Internet-Based Blackboard Platform in Developmental Writing Classes

    ERIC Educational Resources Information Center

    Shudooh, Yusuf M.

    2016-01-01

    The application of computer-assisted platforms in writing classes is a relatively new paradigm in education. The adoption of computer-assisted writing classes is gaining ground in many western and non-western universities. Numerous issues can be addressed when conducting computer-assisted classes (CAC). However, a few studies conducted to assess…

  10. NASA Tech Briefs, February 2011

    NASA Technical Reports Server (NTRS)

    2011-01-01

    Topics covered include: Multi-Segment Radius Measurement Using an Absolute Distance Meter Through a Null Assembly; Fiber-Optic Magnetic-Field-Strength Measurement System for Lightning Detection; Photocatalytic Active Radiation Measurements and Use; Computer Generated Hologram System for Wavefront Measurement System Calibration; Non-Contact Thermal Properties Measurement with Low-Power Laser and IR Camera System; SpaceCube 2.0: An Advanced Hybrid Onboard Data Processor; CMOS Imager Has Better Cross-Talk and Full-Well Performance; High-Performance Wireless Telemetry; Telemetry-Based Ranging; JWST Wavefront Control Toolbox; Java Image I/O for VICAR, PDS, and ISIS; X-Band Acquisition Aid Software; Antimicrobial-Coated Granules for Disinfecting Water; Range 7 Scanner Integration with PaR Robot Scanning System; Methods of Antimicrobial Coating of Diverse Materials; High-Operating-Temperature Barrier Infrared Detector with Tailorable Cutoff Wavelength; A Model of Reduced Kinetics for Alkane Oxidation Using Constituents and Species for N-Heptane; Thermally Conductive Tape Based on Carbon Nanotube Arrays; Two Catalysts for Selective Oxidation of Contaminant Gases; Nanoscale Metal Oxide Semiconductors for Gas Sensing; Lightweight, Ultra-High-Temperature, CMC-Lined Carbon/Carbon Structures; Sample Acquisition and Handling System from a Remote Platform; Improved Rare-Earth Emitter Hollow Cathode; High-Temperature Smart Structures for Engine Noise Reduction and Performance Enhancement; Cryogenic Scan Mechanism for Fourier Transform Spectrometer; Piezoelectric Rotary Tube Motor; Thermoelectric Energy Conversion Technology for High-Altitude Airships; Combustor Computations for CO2-Neutral Aviation; Use of Dynamic Distortion to Predict and Alleviate Loss of Control; Cycle Time Reduction in Trapped Mercury Ion Atomic Frequency Standards; and A (201)Hg+ Comagnetometer for (199)Hg+ Trapped Ion Space Atomic Clocks.

  11. Task Performance with List-Mode Data

    NASA Astrophysics Data System (ADS)

    Caucci, Luca

    This dissertation investigates the application of list-mode data to detection, estimation, and image reconstruction problems, with an emphasis on emission tomography in medical imaging. We begin by introducing a theoretical framework for list-mode data and we use it to define two observers that operate on list-mode data. These observers are applied to the problem of detecting a signal (known in shape and location) buried in a random lumpy background. We then consider maximum-likelihood methods for the estimation of numerical parameters from list-mode data, and we characterize the performance of these estimators via the so-called Fisher information matrix. Reconstruction from PET list-mode data is then considered. In a process we called "double maximum-likelihood" reconstruction, we consider a simple PET imaging system and we use maximum-likelihood methods to first estimate a parameter vector for each pair of gamma-ray photons that is detected by the hardware. The collection of these parameter vectors forms a list, which is then fed to another maximum-likelihood algorithm for volumetric reconstruction over a grid of voxels. Efficient parallel implementation of the algorithms discussed above is then presented. In this work, we take advantage of two low-cost, mass-produced computing platforms that have recently appeared on the market, and we provide some details on implementing our algorithms on these devices. We conclude this dissertation work by elaborating on a possible application of list-mode data to X-ray digital mammography. We argue that today's CMOS detectors and computing platforms have become fast enough to make X-ray digital mammography list-mode data acquisition and processing feasible.
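
    For the simplest concrete instance of the list-mode estimation described here, consider an assumed toy model: estimating a 1D source position from a list of photon positions blurred by a Gaussian PSF. The ML estimate is then the sample mean, and the Fisher information N/σ² yields the Cramér-Rao lower bound σ²/N on the estimator variance:

    ```python
    # Toy list-mode ML estimation under an assumed 1D Gaussian PSF:
    # each list entry is one detected photon position x_i ~ N(theta, sigma^2).
    # The ML estimate of theta is the sample mean, and the Fisher
    # information I(theta) = N / sigma^2 gives the CRLB sigma^2 / N.
    # Illustrates the concepts, not the record's imaging-system model.
    import numpy as np

    rng = np.random.default_rng(0)
    theta_true, sigma, n_events = 1.5, 0.8, 10_000

    events = rng.normal(theta_true, sigma, size=n_events)  # the "list"
    theta_ml = events.mean()                               # ML estimator
    fisher_info = n_events / sigma**2
    crlb = 1.0 / fisher_info                               # variance bound

    print(f"ML estimate: {theta_ml:.4f} (true {theta_true})")
    print(f"CRLB on variance: {crlb:.2e}; std bound: {np.sqrt(crlb):.4f}")
    ```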

  12. Rasdaman for Big Spatial Raster Data

    NASA Astrophysics Data System (ADS)

    Hu, F.; Huang, Q.; Scheele, C. J.; Yang, C. P.; Yu, M.; Liu, K.

    2015-12-01

    Spatial raster data have grown exponentially over the past decade. Recent advancements in data acquisition technology, such as remote sensing, have allowed us to collect massive observation data at various spatial resolutions and domain coverages. The volume, velocity, and variety of such spatial data, along with the computationally intensive nature of spatial queries, pose grand challenges to the storage technologies required for effective big data management. While high performance computing platforms (e.g., cloud computing) can be used to solve the computing-intensive issues in big data analysis, data have to be managed in a way that is suitable for distributed parallel processing. Recently, rasdaman (raster data manager) has emerged as a scalable and cost-effective database solution to store and retrieve massive multi-dimensional arrays, such as sensor, image, and statistics data. Within this paper, the pros and cons of using rasdaman to manage and query spatial raster data will be examined and compared with other common approaches, including file-based systems, relational databases (e.g., PostgreSQL/PostGIS), and NoSQL databases (e.g., MongoDB and Hive). Earth Observing System (EOS) data collected from NASA's Atmospheric Scientific Data Center (ASDC) will be used and stored in these selected database systems, and a set of spatial and non-spatial queries will be designed to benchmark their performance on retrieving large-scale, multi-dimensional arrays of EOS data. Lessons learnt from using rasdaman will be discussed as well.

  13. Power Efficient Hardware Architecture of SHA-1 Algorithm for Trusted Mobile Computing

    NASA Astrophysics Data System (ADS)

    Kim, Mooseop; Ryou, Jaecheol

    The Trusted Mobile Platform (TMP) is developed and promoted by the Trusted Computing Group (TCG), an industry standards body working to enhance the security of the mobile computing environment. The built-in SHA-1 engine in TMP is one of the most important circuit blocks and contributes to the performance of the whole platform, because it is used as a key primitive supporting platform integrity and command authentication. Mobile platforms have very stringent limitations with respect to available power, physical circuit area, and cost. Therefore, special architecture and design methods for a low power SHA-1 circuit are required. In this paper, we present a novel and efficient hardware architecture of a low power SHA-1 design for TMP. Our low power SHA-1 hardware can compute a 512-bit data block using fewer than 7,000 gates and draws about 1.1 mA on a 0.25 μm CMOS process.
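
    For reference, the primitive this hardware implements can be exercised from Python's standard library: hashing a 64-byte (512-bit) message with hashlib illustrates the function the TMP engine computes in silicon (hashlib applies the standard FIPS padding internally before compressing).

    ```python
    # Software reference for the primitive the low-power hardware
    # implements: SHA-1 over a 64-byte (512-bit) message, via the Python
    # standard library. (SHA-1 appears here because the TMP spec in the
    # record uses it; it is no longer collision-resistant for new designs.)
    import hashlib

    block = bytes(range(64))          # one 512-bit input message
    digest = hashlib.sha1(block).hexdigest()
    print(f"SHA-1({len(block) * 8}-bit message) = {digest}")
    ```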

  14. An Evaluation of Architectural Platforms for Parallel Navier-Stokes Computations

    NASA Technical Reports Server (NTRS)

    Jayasimha, D. N.; Hayder, M. E.; Pillay, S. K.

    1996-01-01

    We study the computational, communication, and scalability characteristics of a computational fluid dynamics application, which solves the time accurate flow field of a jet using the compressible Navier-Stokes equations, on a variety of parallel architecture platforms. The platforms chosen for this study are a cluster of workstations (the LACE experimental testbed at NASA Lewis), a shared memory multiprocessor (the Cray YMP), and distributed memory multiprocessors with different topologies - the IBM SP and the Cray T3D. We investigate the impact of various networks connecting the cluster of workstations on the performance of the application and the overheads induced by popular message passing libraries used for parallelization. The work also highlights the importance of matching the memory bandwidth to the processor speed for good single processor performance. By studying the performance of an application on a variety of architectures, we are able to point out the strengths and weaknesses of each of the example computing platforms.

  15. Parallelizing Navier-Stokes Computations on a Variety of Architectural Platforms

    NASA Technical Reports Server (NTRS)

    Jayasimha, D. N.; Hayder, M. E.; Pillay, S. K.

    1997-01-01

    We study the computational, communication, and scalability characteristics of a Computational Fluid Dynamics application, which solves the time accurate flow field of a jet using the compressible Navier-Stokes equations, on a variety of parallel architectural platforms. The platforms chosen for this study are a cluster of workstations (the LACE experimental testbed at NASA Lewis), a shared memory multiprocessor (the Cray YMP), and distributed memory multiprocessors with different topologies: the IBM SP and the Cray T3D. We investigate the impact of various networks connecting the cluster of workstations on the performance of the application and the overheads induced by popular message passing libraries used for parallelization. The work also highlights the importance of matching the memory bandwidth to the processor speed for good single processor performance. By studying the performance of an application on a variety of architectures, we are able to point out the strengths and weaknesses of each of the example computing platforms.

  16. gProcess and ESIP Platforms for Satellite Imagery Processing over the Grid

    NASA Astrophysics Data System (ADS)

    Bacu, Victor; Gorgan, Dorian; Rodila, Denisa; Pop, Florin; Neagu, Gabriel; Petcu, Dana

    2010-05-01

    The Environment oriented Satellite Data Processing Platform (ESIP) is developed through SEE-GRID-SCI (SEE-GRID eInfrastructure for regional eScience), co-funded by the European Commission through FP7 [1]. The gProcess platform [2] is a set of tools and services supporting the development and execution over the Grid of workflow-based processing, particularly satellite imagery processing. ESIP [3], [4] is built on top of the gProcess platform by adding a set of satellite image processing software modules and meteorological algorithms. Satellite images can reveal and supply important information on earth surface parameters, climate data, pollution levels, and weather conditions that can be used in different research areas. Generally, the processing algorithms for satellite images can be decomposed into a set of modules that forms a graph representation of the processing workflow. Two types of workflows can be defined in the gProcess platform: the abstract workflow (PDG - Process Description Graph), in which the user defines the algorithm conceptually, and the instantiated workflow (iPDG - instantiated PDG), which is the mapping of the PDG pattern onto particular satellite image and meteorological data [5]. The gProcess platform allows the definition of complex workflows by combining data resources, operators, services, and sub-graphs. The gProcess platform is developed for the gLite middleware that is available in the EGEE and SEE-GRID infrastructures [6]. gProcess exposes its functionality through web services [7]. The Editor Web Service retrieves information on the available resources that are used to develop complex workflows (available operators, sub-graphs, services, supported resources, etc.). The Manager Web Service deals with resource management (uploading new resources such as workflows, operators, services, data, etc.) and also retrieves information on workflows. The Executor Web Service manages the execution of the instantiated workflows on the Grid infrastructure; in addition, this web service monitors the execution and generates statistical data that are important for evaluating performance and optimizing execution. The Viewer Web Service allows access to input and output data. To prove and validate the utility of the gProcess and ESIP platforms, the GreenView and GreenLand applications were developed. The GreenView functionality includes the refinement of some meteorological data, such as temperature, and the calibration of satellite images based on field measurements. The GreenLand application performs the classification of satellite images by using a set of vegetation indices. The gProcess and ESIP platforms are also used in the GiSHEO project [8] to support the processing of Earth Observation data over the Grid in eGLE (the GiSHEO eLearning Environment). Performance assessment experiments have revealed that workflow-based execution can improve the execution time of a satellite image processing algorithm [9]. Executing all the workflow nodes on different machines is not always reliable, however: the execution of some nodes is more time consuming than that of others, and these slower nodes delay the total execution. It is therefore important to balance the workflow nodes correctly. Based on an optimization strategy, the workflow nodes can be grouped horizontally, vertically, or in a hybrid approach.
In this way, grouped operators are executed on one machine and the data transfer between workflow nodes is reduced (a toy illustration of such grouping appears after the reference list below). The dynamic nature of the Grid infrastructure makes it more exposed to failures, which can occur at worker nodes, in service availability, at storage elements, etc. Currently, gProcess supports some basic error prevention and error management solutions; more advanced solutions will be integrated into the gProcess platform in the future. References: [1] SEE-GRID-SCI Project, http://www.see-grid-sci.eu/ [2] Bacu V., Stefanut T., Rodila D., Gorgan D., Process Description Graph Composition by gProcess Platform. HiPerGRID - 3rd International Workshop on High Performance Grid Middleware, 28 May, Bucharest. Proceedings of CSCS-17 Conference, Vol. 2, ISSN 2066-4451, pp. 423-430 (2009). [3] ESIP Platform, http://wiki.egee-see.org/index.php/JRA1_Commonalities [4] Gorgan D., Bacu V., Rodila D., Pop Fl., Petcu D., Experiments on ESIP - Environment oriented Satellite Data Processing Platform. SEE-GRID-SCI User Forum, 9-10 Dec 2009, Bogazici University, Istanbul, Turkey, ISBN: 978-975-403-510-0, pp. 157-166 (2009). [5] Radu, A., Bacu, V., Gorgan, D., Diagrammatic Description of Satellite Image Processing Workflow. Workshop on Grid Computing Applications Development (GridCAD) at the SYNASC Symposium, 28 September 2007, Timisoara, IEEE Computer Press, ISBN 0-7695-3078-8, pp. 341-348 (2007). [6] Gorgan D., Bacu V., Stefanut T., Rodila D., Mihon D., Grid based Satellite Image Processing Platform for Earth Observation Applications Development. IDAACS'2009 - IEEE Fifth International Workshop on Intelligent Data Acquisition and Advanced Computing Systems: Technology and Applications, 21-23 September, Cosenza, Italy, IEEE Computer Press, pp. 247-252 (2009). [7] Rodila D., Bacu V., Gorgan D., Integration of Satellite Image Operators as Workflows in the gProcess Application. Proceedings of ICCP2009 - IEEE 5th International Conference on Intelligent Computer Communication and Processing, 27-29 Aug 2009, Cluj-Napoca, ISBN: 978-1-4244-5007-7, pp. 355-358 (2009). [8] GiSHEO consortium, project site, http://gisheo.info.uvt.ro [9] Bacu V., Gorgan D., Graph Based Evaluation of Satellite Imagery Processing over Grid. ISPDC 2008 - 7th International Symposium on Parallel and Distributed Computing, July 1-5, 2008, Krakow, Poland. IEEE Computer Society 2008, ISBN: 978-0-7695-3472-5, pp. 147-154 (2008).
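
    The node grouping mentioned above can be illustrated with a toy scheduler that collapses linear operator chains ("vertical" grouping) so each chain runs on one machine. The workflow graph and grouping rule below are illustrative only, not gProcess's actual scheduling logic.

    ```python
    # Toy illustration of workflow-node grouping: collapse linear chains
    # of operators so each chain runs on one machine, reducing data
    # transfer between workflow nodes. Illustrative, not gProcess's
    # actual scheduler.

    def vertical_groups(dag):
        """Group nodes of a DAG (node -> list of successors) into chains:
        a node absorbs its successor when the link is one-to-one."""
        indegree = {node: 0 for node in dag}
        for succs in dag.values():
            for s in succs:
                indegree[s] += 1
        groups, seen = [], set()
        for node in dag:
            if node in seen:
                continue
            chain, cur = [node], node
            while (len(dag[cur]) == 1 and indegree[dag[cur][0]] == 1
                   and dag[cur][0] not in seen):
                cur = dag[cur][0]
                chain.append(cur)
            seen.update(chain)
            groups.append(chain)
        return groups

    # read -> calibrate -> ndvi feeds classify, which also takes a DEM input.
    workflow = {
        "read": ["calibrate"], "calibrate": ["ndvi"], "ndvi": ["classify"],
        "dem": ["classify"], "classify": [],
    }
    print(vertical_groups(workflow))
    # [['read', 'calibrate', 'ndvi'], ['dem'], ['classify']]
    ```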

  17. A Web Tool for Research in Nonlinear Optics

    NASA Astrophysics Data System (ADS)

    Prikhod'ko, Nikolay V.; Abramovsky, Viktor A.; Abramovskaya, Natalia V.; Demichev, Andrey P.; Kryukov, Alexandr P.; Polyakov, Stanislav P.

    2016-02-01

    This paper presents a project to develop a web platform, called WebNLO, for computer modeling of nonlinear optics phenomena. We discuss the general scheme of the platform and a model for interaction between the platform modules. The platform is built as a set of interacting RESTful web services (the SaaS approach). Users can interact with the platform through a web browser or a command line interface. Such a resource has no analogue in the field of nonlinear optics and is being created for the first time; it will give researchers access to high-performance computing resources and significantly reduce the cost of the research and development process.

  18. Reproducible Earth observation analytics: challenges, ideas, and a study case on containerized land use change detection

    NASA Astrophysics Data System (ADS)

    Appel, Marius; Nüst, Daniel; Pebesma, Edzer

    2017-04-01

    Geoscientific analyses of Earth observation data typically involve a long path from data acquisition to scientific results and conclusions. Before the actual processing starts, scenes must be downloaded from the providers' platforms and the computing infrastructure needs to be prepared. The computing environment often requires specialized software, which in turn might have many dependencies. The software is often highly customized and provided without commercial support, which leads to rather ad-hoc systems and irreproducible results. To let other scientists reproduce the analyses, the full workspace, including data, code, the computing environment, and documentation, must be bundled and shared. Technologies such as virtualization or containerization allow for the creation of identical computing environments with relatively little effort. Challenges, however, arise when the volume of the data is too large, when computations are done in a cluster environment, or when complex software components such as databases are used. We discuss these challenges using the example of scalable land use change detection on Landsat imagery. We present a reproducible implementation that runs R and the scalable data management and analytical system SciDB within a Docker container. Thanks to an explicit container recipe (the Dockerfile), this enables all-in-one reproduction, including the installation of software components, the ingestion of the data, and the execution of the analysis in a well-defined environment. We furthermore discuss possibilities for transferring the implementation to multi-container environments in order to support reproducibility on large cluster environments.

  19. Continuous measurement of breast tumour hormone receptor expression: a comparison of two computational pathology platforms.

    PubMed

    Ahern, Thomas P; Beck, Andrew H; Rosner, Bernard A; Glass, Ben; Frieling, Gretchen; Collins, Laura C; Tamimi, Rulla M

    2017-05-01

    Computational pathology platforms incorporate digital microscopy with sophisticated image analysis to permit rapid, continuous measurement of protein expression. We compared two computational pathology platforms on their measurement of breast tumour oestrogen receptor (ER) and progesterone receptor (PR) expression. Breast tumour microarrays from the Nurses' Health Study were stained for ER (n=592) and PR (n=187). One expert pathologist scored cases as positive if ≥1% of tumour nuclei exhibited stain. ER and PR were then measured with the Definiens Tissue Studio (automated) and Aperio Digital Pathology (user-supervised) platforms. Platform-specific measurements were compared using boxplots, scatter plots and correlation statistics. Classification of ER and PR positivity by platform-specific measurements was evaluated with areas under receiver operating characteristic curves (AUC) from univariable logistic regression models, using expert pathologist classification as the standard. Both platforms showed considerable overlap in continuous measurements of ER and PR between positive and negative groups classified by the expert pathologist. Platform-specific measurements were strongly and positively correlated with one another (r≥0.77). The user-supervised Aperio workflow performed slightly better than the automated Definiens workflow at classifying ER positivity (AUC_Aperio = 0.97; AUC_Definiens = 0.90; difference = 0.07, 95% CI 0.05 to 0.09) and PR positivity (AUC_Aperio = 0.94; AUC_Definiens = 0.87; difference = 0.07, 95% CI 0.03 to 0.12). Paired hormone receptor expression measurements from the two computational pathology platforms agreed well with one another. The user-supervised workflow yielded better classification accuracy than the automated workflow. Appropriately validated computational pathology algorithms enrich molecular epidemiology studies with continuous protein expression data and may accelerate tumour biomarker discovery.
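
    The AUC comparison in this record reduces to scoring a continuous platform measurement against the pathologist's binary call. A minimal sketch of that evaluation with scikit-learn, on synthetic data (scores and labels invented for illustration):

    ```python
    # AUC evaluation of a continuous expression score against a binary
    # expert-pathologist label, as in the record's platform comparison.
    # The scores and labels below are synthetic, for illustration only.
    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(1)
    labels = rng.integers(0, 2, size=500)      # pathologist call: ER+/ER-
    # Synthetic platform scores: higher on average for positive cases.
    scores = rng.normal(loc=labels * 2.0, scale=1.0)

    print(f"AUC = {roc_auc_score(labels, scores):.3f}")
    ```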

  20. Computer task performance by subjects with Duchenne muscular dystrophy.

    PubMed

    Malheiros, Silvia Regina Pinheiro; da Silva, Talita Dias; Favero, Francis Meire; de Abreu, Luiz Carlos; Fregni, Felipe; Ribeiro, Denise Cardoso; de Mello Monteiro, Carlos Bandeira

    2016-01-01

    Two specific objectives were established to quantify computer task performance among people with Duchenne muscular dystrophy (DMD). First, we compared simple computational task performance between subjects with DMD and age-matched typically developing (TD) subjects. Second, we examined correlations between the ability of subjects with DMD to learn the computational task and their motor functionality, age, and initial task performance. The study included 84 individuals (42 with DMD, mean age of 18±5.5 years, and 42 age-matched controls). They executed a computer maze task; all participants performed the acquisition (20 attempts) and retention (five attempts) phases, repeating the same maze. A different maze was used to verify transfer performance (five attempts). The Motor Function Measure Scale was applied, and the results were compared with maze task performance. In the acquisition phase, a significant decrease was found in movement time (MT) between the first and last acquisition block, but only for the DMD group. For the DMD group, MT during transfer was shorter than during the first acquisition block, indicating improvement from the first acquisition block to transfer. In addition, the TD group showed shorter MT than the DMD group across the study. DMD participants improved their performance after practicing a computational task; however, the difference in MT was present in all attempts among DMD and control subjects. Computational task improvement was positively influenced by the initial performance of individuals with DMD. In turn, the initial performance was influenced by their distal functionality but not their age or overall functionality.

  1. The Processing Cost of Reference Set Computation: Acquisition of Stress Shift and Focus

    ERIC Educational Resources Information Center

    Reinhart, Tanya

    2004-01-01

    Reference set computation -- the construction of a (global) comparison set to determine whether a given derivation is appropriate in context -- comes with a processing cost. I argue that this cost is directly visible at the acquisition stage: In those linguistic areas in which it has been independently established that such computation is indeed…

  2. 48 CFR 227.7206 - Contracts for architect-engineer services.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    227.7206 Contracts for architect-engineer services (Section 227.7206, Federal Acquisition Regulations System, DEFENSE ACQUISITION..., Rights in Computer Software and Computer Software Documentation): follow 227.7107 when contracting for architect-engineer services.

  3. 48 CFR 227.7206 - Contracts for architect-engineer services.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    227.7206 Contracts for architect-engineer services (Section 227.7206, Federal Acquisition Regulations System, DEFENSE ACQUISITION..., Rights in Computer Software and Computer Software Documentation): follow 227.7107 when contracting for architect-engineer services.

  4. 48 CFR 227.7206 - Contracts for architect-engineer services.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    227.7206 Contracts for architect-engineer services (Section 227.7206, Federal Acquisition Regulations System, DEFENSE ACQUISITION..., Rights in Computer Software and Computer Software Documentation): follow 227.7107 when contracting for architect-engineer services.

  5. 48 CFR 227.7206 - Contracts for architect-engineer services.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    227.7206 Contracts for architect-engineer services (Section 227.7206, Federal Acquisition Regulations System, DEFENSE ACQUISITION..., Rights in Computer Software and Computer Software Documentation): follow 227.7107 when contracting for architect-engineer services.

  6. 48 CFR 227.7206 - Contracts for architect-engineer services.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    227.7206 Contracts for architect-engineer services (Section 227.7206, Federal Acquisition Regulations System, DEFENSE ACQUISITION..., Rights in Computer Software and Computer Software Documentation): follow 227.7107 when contracting for architect-engineer services.

  7. Design of platform for removing screws from LCD display shields

    NASA Astrophysics Data System (ADS)

    Tu, Zimei; Qin, Qin; Dou, Jianfang; Zhu, Dongdong

    2017-11-01

    Removing the screws on the sides of the shield is a necessary step in disassembling a computer LCD display. To automate this step, a platform has been designed for removing the screws from display shields. The platform uses virtual instrument technology, with LabVIEW as the development environment, to design the mechanical structure together with motion control, human-computer interaction, and target recognition technologies. It removes the screws from the sides of an LCD display shield mechanically, enabling subsequent separation and recycling.

  8. On the performances of computer vision algorithms on mobile platforms

    NASA Astrophysics Data System (ADS)

    Battiato, S.; Farinella, G. M.; Messina, E.; Puglisi, G.; Ravì, D.; Capra, A.; Tomaselli, V.

    2012-01-01

    Computer Vision enables mobile devices to extract the meaning of the observed scene from the information acquired with the onboard sensor cameras. Nowadays, there is a growing interest in Computer Vision algorithms able to work on mobile platforms (e.g., phone cameras, point-and-shoot cameras, etc.). Indeed, bringing Computer Vision capabilities to mobile devices opens new opportunities in different application contexts. The implementation of vision algorithms on mobile devices is still a challenging task, since these devices have poor image sensors and optics as well as limited processing power. In this paper we have considered different algorithms covering classic Computer Vision tasks: keypoint extraction, face detection, and image segmentation. Several tests have been done to compare the performance of the mobile platforms involved: the Nokia N900, LG Optimus One, and Samsung Galaxy SII.
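
    Keypoint extraction, the first of the benchmarked tasks, can be sketched with OpenCV's ORB detector. ORB is chosen here for its low computational cost on constrained devices; the record does not state which specific detectors were benchmarked, so this is purely illustrative.

    ```python
    # Keypoint extraction illustrated with OpenCV's ORB detector, a
    # detector suited to low-power devices. Illustrative only; the record
    # does not specify which detectors were benchmarked.
    import cv2
    import numpy as np

    # Synthetic test image; on a device this would come from the camera.
    image = np.zeros((240, 320), dtype=np.uint8)
    cv2.rectangle(image, (80, 60), (240, 180), color=255, thickness=-1)

    orb = cv2.ORB_create(nfeatures=100)
    keypoints, descriptors = orb.detectAndCompute(image, None)
    print(f"{len(keypoints)} keypoints, descriptor shape: "
          f"{None if descriptors is None else descriptors.shape}")
    ```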

  9. Elastic Cloud Computing Architecture and System for Heterogeneous Spatiotemporal Computing

    NASA Astrophysics Data System (ADS)

    Shi, X.

    2017-10-01

    Spatiotemporal computation implements a variety of different algorithms. When big data are involved, a desktop computer or standalone application may not be able to complete the computation task due to limited memory and computing power. Now that a variety of hardware accelerators and computing platforms are available to improve the performance of geocomputation, different algorithms may behave differently on different computing infrastructures and platforms. Some are perfect for implementation on a cluster of graphics processing units (GPUs), while GPUs may not be useful for certain kinds of spatiotemporal computation. The same is true of utilizing a cluster of Intel many-integrated-core (MIC) processors, or Xeon Phi, as well as Hadoop or Spark platforms, to handle big spatiotemporal data. Furthermore, considering the energy efficiency requirements of general computation, a Field Programmable Gate Array (FPGA) may be the better solution for energy efficiency when its computational performance is similar to or better than that of GPUs and MICs. It is expected that an elastic cloud computing architecture and system that integrates GPUs, MICs, and FPGAs could be developed and deployed to support spatiotemporal computing over heterogeneous data types and computational problems.

  10. Generation of data acquisition requests for the ASTER satellite instrument for monitoring a globally distributed target: glaciers

    USGS Publications Warehouse

    Raup, B.H.; Kieffer, H.H.; Hare, T.M.; Kargel, J.S.

    2000-01-01

    The advanced spaceborne thermal emission and reflection radiometer (ASTER) instrument is scheduled to be launched on the EOS Terra platform in 1999. The Global Land Ice Measurements from Space project has planned to acquire ASTER images of most of the world's land ice annually during the six-year ASTER mission. This article describes the process of creating the data acquisition requests needed to cover approximately 170,000 glacier targets.

  11. Development and operation of a real-time data acquisition system for the NASA, Langley Research Center Differential Absorption Lidar

    NASA Technical Reports Server (NTRS)

    Butler, C.; Kindle, E. C.

    1984-01-01

    The capabilities of the DIAL data acquisition system (DAS) for the remote measurement of atmospheric trace gas concentrations from ground and aircraft platforms were extended through the purchase and integration of other hardware and the implementation of improved software. An operational manual for the current system is presented. Hardware and peripheral device registers are outlined only as an aid in debugging any DAS problems which may arise.

  12. Exploiting Artificial Intelligence for Analysis and Data Selection on-board the Puerto Rico CubeSat

    NASA Astrophysics Data System (ADS)

    Bergman, J. E. S.; Bruhn, F.; Funk, P.; Isham, B.; Rincón-Charris, A. A.; Capo-Lugo, P.; Åhlén, L.

    2015-10-01

    CubeSat missions are constrained by the limited resources provided by the platform. Many payload providers have learned to cope with the low mass and power but the poor telemetry allocation remains a bottleneck. In the end, it is the data delivered to ground which determines the value of the mission. However, transmitting more data does not necessarily guarantee high value, since the value also depends on the data quality. By exploiting fast on-board computing and efficient artificial intelligence (AI) algorithms for analysis and data selection one could optimize the usage of the telemetry link and so increase the value of the mission. In a pilot project, we attempt to do this on the Puerto Rico CubeSat, where science objectives include the acquisition of space weather data to aid better understanding of the Sun to Earth connection.

  13. High-Performance 3D Compressive Sensing MRI Reconstruction Using Many-Core Architectures.

    PubMed

    Kim, Daehyun; Trzasko, Joshua; Smelyanskiy, Mikhail; Haider, Clifton; Dubey, Pradeep; Manduca, Armando

    2011-01-01

    Compressive sensing (CS) describes how sparse signals can be accurately reconstructed from many fewer samples than required by the Nyquist criterion. Since MRI scan duration is proportional to the number of acquired samples, CS has been gaining significant attention in MRI. However, the computationally intensive nature of CS reconstructions has precluded their use in routine clinical practice. In this work, we investigate how different throughput-oriented architectures can benefit one CS algorithm and what levels of acceleration are feasible on different modern platforms. We demonstrate that a CUDA-based code running on an NVIDIA Tesla C2050 GPU can reconstruct a 256 × 160 × 80 volume from an 8-channel acquisition in 19 seconds, which is in itself a significant improvement over the state of the art. We then show that Intel's Knights Ferry can perform the same 3D MRI reconstruction in only 12 seconds, bringing CS methods even closer to clinical viability.
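
    As a minimal sketch of the kind of iterative CS reconstruction being accelerated here, the loop below runs plain ISTA (iterative soft-thresholding) on a small random sensing problem. This is a generic CS solver on synthetic data, not the authors' 3D MRI algorithm.

    ```python
    # Minimal iterative soft-thresholding (ISTA) for compressive sensing:
    # recover a sparse x from y = A @ x with fewer measurements than
    # unknowns. Generic illustration of iterative CS reconstruction, not
    # the specific 3D MRI algorithm benchmarked in the record.
    import numpy as np

    rng = np.random.default_rng(2)
    n, m, k = 200, 80, 5                    # unknowns, measurements, sparsity
    A = rng.normal(size=(m, n)) / np.sqrt(m)
    x_true = np.zeros(n)
    x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k) * 3
    y = A @ x_true

    lam = 0.05
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1 / Lipschitz constant of A^T A
    x = np.zeros(n)
    for _ in range(500):
        grad = A.T @ (A @ x - y)
        z = x - step * grad
        x = np.sign(z) * np.maximum(np.abs(z) - lam * step, 0)  # soft threshold

    rel_err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
    print(f"relative error: {rel_err:.3f}")
    ```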

  14. Assessing UAV platform types and optical sensor specifications

    NASA Astrophysics Data System (ADS)

    Altena, B.; Goedemé, T.

    2014-05-01

    Photogrammetric acquisition with unmanned aerial vehicles (UAVs) has grown extensively over the last couple of years. Such mobile platforms and their processing software have matured, resulting in a market which offers off-the-shelf mapping solutions to surveying companies and geospatial enterprises. Different approaches in platform type and optical instruments exist, though the resulting products have similar specifications. To demonstrate differences in acquisition practice, a case study over an open mine was flown with two different off-the-shelf UAVs (a fixed-wing and a multi-rotor). The resulting imagery is analyzed to clarify the differences in collection quality. We look at image settings, and stress the importance of photographic experience if manual settings are applied; for mapping production it might be safest to set the camera to automatic. Furthermore, we try to estimate whether blur is present due to image motion. A subtle trend seems to be present for the fast flying platform, though its extent is of similar order to that of the slow moving one; this shows both systems operate at their limits. Finally, the lens distortion is assessed, with special attention to chromatic aberration. Here we see that calibration can reveal such aberrations, though detecting this phenomenon directly in the imagery is not straightforward. For such effects a normal lens is sufficient, though a better lens and collimator do give significant improvement.

  15. Discovery and analysis of time delay sources in the USGS personal computer data collection platform (PCDCP) system

    USGS Publications Warehouse

    White, Timothy C.; Sauter, Edward A.; Stewart, Duff C.

    2014-01-01

    Intermagnet is an international oversight group which exists to establish a global network of geomagnetic observatories. This group establishes data standards and standard operating procedures for members and prospective members. Intermagnet has proposed a new One-Second Data Standard for that emerging geomagnetic product. The standard specifies that all data collected must have a time stamp accuracy within ±10 milliseconds of the top-of-the-second Coordinated Universal Time. Therefore, the U.S. Geological Survey Geomagnetism Program has designed and executed several tests on its current data collection system, the Personal Computer Data Collection Platform. The tests are designed to measure the time shifts introduced by individual components within the data collection system, as well as the time shift introduced by the entire Personal Computer Data Collection Platform. Additional testing designed for Intermagnet will be used to further validate such measurements. Current measurements showed a 5.0–19.9 millisecond lag for the vertical channel (Z) of the Personal Computer Data Collection Platform and a 13.0–25.8 millisecond lag for the horizontal channels (H and D) of the collection system. These measurements represent a dynamically changing delay introduced within the U.S. Geological Survey Personal Computer Data Collection Platform.
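
    One common way to measure such channel lags is cross-correlation against a reference signal: the offset of the correlation peak gives the delay. The sketch below demonstrates the idea on synthetic data; it illustrates the measurement concept only, not the USGS test procedure.

    ```python
    # Estimating a channel time lag by cross-correlation: find the shift
    # that best aligns a recorded signal with the reference. Synthetic
    # data; illustrates the measurement concept, not the USGS test setup.
    import numpy as np

    fs = 1000                              # samples per second (1 ms steps)
    t = np.arange(0, 2.0, 1.0 / fs)
    noise = 0.1 * np.random.default_rng(3).normal(size=t.size)
    reference = np.sin(2 * np.pi * 3 * t) + noise

    true_lag_samples = 17                  # simulate a 17 ms channel delay
    recorded = np.roll(reference, true_lag_samples)

    corr = np.correlate(recorded, reference, mode="full")
    lag = np.argmax(corr) - (reference.size - 1)
    print(f"estimated lag: {lag / fs * 1e3:.1f} ms (true: {true_lag_samples} ms)")
    ```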

  16. SU-D-BRD-02: A Web-Based Image Processing and Plan Evaluation Platform (WIPPEP) for Future Cloud-Based Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chai, X; Liu, L; Xing, L

    Purpose: Visualization and processing of medical images and radiation treatment plan evaluation have traditionally been constrained to local workstations with limited computation power and limited ability for data sharing and software updates. We present a web-based image processing and plan evaluation platform (WIPPEP) for radiotherapy applications with high efficiency, ubiquitous web access, and real-time data sharing. Methods: This software platform consists of three parts: web server, image server and computation server. The independent servers communicate with each other through HTTP requests. The web server is the key component: it provides visualization and the user interface through front-end web browsers and relays information to the backend to process user requests. The image server serves as a PACS system. The computation server performs the actual image processing and dose calculation. The web server backend is developed using Java Servlets and the frontend is developed using HTML5, JavaScript, and jQuery. The image server is based on the open-source DCM4CHEE PACS system. The computation server can be written in any programming language as long as it can send/receive HTTP requests. Our computation servers were implemented in Delphi, Python and PHP, which can process data directly or via a C++ DLL. Results: This software platform runs on a 32-core CPU server that virtually hosts the web server, image server, and computation servers separately. Users can visit our internal website with the Chrome browser, select a specific patient, visualize images and RT structures belonging to this patient, and perform image segmentation on the Delphi computation server and Monte Carlo dose calculation on the Python or PHP computation servers. Conclusion: We have developed a web-based image processing and plan evaluation platform prototype for radiotherapy. This system has clearly demonstrated the feasibility of performing image processing and plan evaluation through a web browser and exhibits potential for future cloud-based radiotherapy.
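
    A minimal sketch of the HTTP relay pattern the three-server design describes, using only the Python standard library; the endpoint URL and JSON payload are hypothetical, not the WIPPEP interfaces:

        import json
        import urllib.request

        def relay_to_computation_server(task):
            """Forward a user request from the web-server backend to a
            computation server and return its JSON reply (assumed endpoint)."""
            req = urllib.request.Request(
                "http://compute.internal:8080/segment",      # hypothetical URL
                data=json.dumps(task).encode("utf-8"),
                headers={"Content-Type": "application/json"},
            )
            with urllib.request.urlopen(req) as resp:
                return json.loads(resp.read().decode("utf-8"))

        # e.g. relay_to_computation_server({"patient": "anon-01", "organ": "lung"})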

  17. Validation of the Monte Carlo simulator GATE for indium-111 imaging.

    PubMed

    Assié, K; Gardin, I; Véra, P; Buvat, I

    2005-07-07

    Monte Carlo simulations are useful for optimizing and assessing single photon emission computed tomography (SPECT) protocols, especially when aiming at measuring quantitative parameters from SPECT images. Before Monte Carlo simulated data can be trusted, the simulation model must be validated. The purpose of this work was to validate the use of GATE, a new Monte Carlo simulation platform based on GEANT4, for modelling indium-111 SPECT data, the quantification of which is of foremost importance for dosimetric studies. To that end, acquisitions of (111)In line sources in air and in water and of a cylindrical phantom were performed, together with the corresponding simulations. The simulation model included Monte Carlo modelling of the camera collimator and of a back-compartment accounting for photomultiplier tubes and associated electronics. Energy spectra, spatial resolution, sensitivity values, images and count profiles obtained for experimental and simulated data were compared. An excellent agreement was found between experimental and simulated energy spectra. For source-to-collimator distances varying from 0 to 20 cm, simulated and experimental spatial resolution differed by less than 2% in air, while the simulated sensitivity values were within 4% of the experimental values. The simulation of the cylindrical phantom closely reproduced the experimental data. These results suggest that GATE enables accurate simulation of (111)In SPECT acquisitions.

  18. A Wearable Home BCI system: preliminary results with SSVEP protocol.

    PubMed

    Piccini, Luca; Parini, Sergio; Maggi, Luca; Andreoni, Giuseppe

    2005-01-01

    This paper presents and discusses the realization and performance of a wearable system for EEG-based BCI applications. The system (called Kimera) consists of a two-layer hardware architecture (a wireless acquisition and transmission board based on a Bluetooth ARM chip, and a low-power miniaturized biosignal acquisition analog front end) together with a software suite (called Bellerophonte) for graphical user interface management, protocol execution, data recording, transmission and processing. The implemented BCI system was based on the SSVEP protocol, applied to a two-state selection using a standard display/monitor with a pair of high-efficiency LEDs. The frequency features of the signal were computed and used for intention detection. The BCI algorithm is based on a supervised classifier implemented through a multi-class Canonical Discriminant Analysis (CDA) with continuous real-time feedback based on the Mahalanobis distance. Five healthy subjects participated in the first phase for a preliminary device validation. The obtained results are promising, in line with the most recent performance reported in the literature, with a significant improvement in both system and classification capabilities. The user-friendliness and low cost of the Kimera & Bellerophonte platform make it suitable for the development of home BCI applications.
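
    A sketch of the Mahalanobis-distance scoring such a classifier relies on, with numpy; the class statistics are assumed to come from the supervised training phase, and the data structure is illustrative:

        import numpy as np

        def mahalanobis(x, mean, cov_inv):
            """Mahalanobis distance of feature vector x from a class with
            the given mean and inverse covariance matrix."""
            d = x - mean
            return float(np.sqrt(d @ cov_inv @ d))

        def classify(x, class_stats):
            """Assign x to the class with the smallest Mahalanobis distance;
            the distance itself can drive a continuous feedback signal."""
            return min(class_stats,
                       key=lambda c: mahalanobis(x, c["mean"], c["cov_inv"]))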

  19. A high speed buffer for LV data acquisition

    NASA Technical Reports Server (NTRS)

    Cavone, Angelo A.; Sterlina, Patrick S.; Clemmons, James I., Jr.; Meyers, James F.

    1987-01-01

    The laser velocimeter (autocovariance) buffer interface is a data acquisition subsystem designed specifically for the acquisition of data from a laser velocimeter. The subsystem acquires data from up to six laser velocimeter components in parallel, measures the times between successive data points for each of the components, establishes and maintains a coincident condition between any two or three components, and acquires data from other instrumentation systems simultaneously with the laser velocimeter data points. The subsystem is designed to control the entire data acquisition process based on initial setup parameters obtained from a host computer and to be independent of the computer during the acquisition. On completion of the acquisition cycle, the interface transfers the contents of its memory to the host under direction of the host via a single 16-bit parallel DMA channel.
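
    A sketch of the coincidence condition such a buffer maintains between two components; the two-pointer scan below pairs timestamps that fall within a common window (arrays and window are illustrative, not the interface's logic):

        def coincident_pairs(t_a, t_b, window):
            """Return index pairs (i, j) where component A's event i and
            component B's event j occur within `window` of each other.
            Both timestamp arrays are assumed sorted ascending."""
            pairs = []
            j = 0
            for i, ta in enumerate(t_a):
                while j < len(t_b) and t_b[j] < ta - window:
                    j += 1                      # B event too early: skip past
                k = j
                while k < len(t_b) and t_b[k] <= ta + window:
                    pairs.append((i, k))        # within the coincidence window
                    k += 1
            return pairs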

  20. Peer Review-Based Scripted Collaboration to Support Domain-Specific and Domain-General Knowledge Acquisition in Computer Science

    ERIC Educational Resources Information Center

    Demetriadis, Stavros; Egerter, Tina; Hanisch, Frank; Fischer, Frank

    2011-01-01

    This study investigates the effectiveness of using peer review in the context of scripted collaboration to foster both domain-specific and domain-general knowledge acquisition in the computer science domain. Using a one-factor design with a script and a control condition, students worked in small groups on a series of computer science problems…

  1. Effects of Computer-Based Practice on the Acquisition and Maintenance of Basic Academic Skills for Children with Moderate to Intensive Educational Needs

    ERIC Educational Resources Information Center

    Everhart, Julie M.; Alber-Morgan, Sheila R.; Park, Ju Hee

    2011-01-01

    This study investigated the effects of computer-based practice on the acquisition and maintenance of basic academic skills for two children with moderate to intensive disabilities. The special education teacher created individualized computer games that enabled the participants to independently practice academic skills that corresponded with their…

  2. Defense Acquisitions Acronyms and Terms

    DTIC Science & Technology

    2012-12-01

    CAD Computer-Aided Design; CADD Computer-Aided Design and Drafting; CAE Component Acquisition Executive; Computer-Aided Engineering; CAIV Cost As an... Radiation to Ordnance; HFE Human Factors Engineering; HHA Health Hazard Assessment; HNA Host-Nation Approval; HNS Host-Nation Support; HOL High-Order... Engineering Change Proposal; VHSIC Very High Speed Integrated Circuit; VLSI Very Large Scale Integration; VOC Volatile Organic Compound; W; WAN Wide...

  3. Upgrade to the control system of the reflectometry diagnostic of ASDEX upgrade

    NASA Astrophysics Data System (ADS)

    Graça, S.; Santos, J.; Manso, M. E.

    2004-10-01

    The broadband frequency modulation-continuous wave microwave/millimeter wave reflectometer of the ASDEX Upgrade tokamak (Institut für Plasma Physik (IPP), Garching, Germany), developed by Centro de Fusão Nuclear (Lisboa, Portugal) in collaboration with IPP, is a complex system with 13 channels (O and X modes) and two types of operation modes (swept and fixed frequency). The control system that ensures remote operation of the diagnostic incorporates VME and CAMAC bus based acquisition/timing systems. Microprocessor input/output boards are used to control and monitor the microwave circuitry and associated electronic devices. The implementation of the control system is based on an object-oriented client/server model: a centralized server manages the hardware and receives input from remote clients. Communication is handled through transmission control protocol/internet protocol (TCP/IP) sockets. Here we describe recent upgrades of the control system aiming to: (i) accommodate new channels; (ii) adapt to the heterogeneity of computing platforms and operating systems; and (iii) overcome remote access restrictions. Platform and operating system independence was achieved by redesigning the graphical user interface in Java. As secure shell is the standard remote access protocol adopted in major fusion laboratories, secure shell tunneling was implemented to allow remote operation of the diagnostic through the existing firewalls.
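
    A minimal sketch of the client/server exchange in such a control system, using TCP/IP sockets from the Python standard library; the host, port, and command syntax are assumptions, not the diagnostic's actual protocol:

        import socket

        def send_command(host, port, command):
            """Open a TCP connection to the centralized control server, send
            a text command, and return the one-line reply (hypothetical
            line-oriented protocol)."""
            with socket.create_connection((host, port), timeout=5.0) as sock:
                sock.sendall((command + "\n").encode("ascii"))
                return sock.recv(4096).decode("ascii").strip()

        # e.g. send_command("reflectometry.internal", 5025, "SWEEP O-MODE START")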

  4. Digital imaging of root traits (DIRT): a high-throughput computing and collaboration platform for field-based root phenomics.

    PubMed

    Das, Abhiram; Schneider, Hannah; Burridge, James; Ascanio, Ana Karine Martinez; Wojciechowski, Tobias; Topp, Christopher N; Lynch, Jonathan P; Weitz, Joshua S; Bucksch, Alexander

    2015-01-01

    Plant root systems are key drivers of plant function and yield. They are also under-explored targets to meet global food and energy demands. Many new technologies have been developed to characterize crop root system architecture (CRSA). These technologies have the potential to accelerate the progress in understanding the genetic control and environmental response of CRSA. Putting this potential into practice requires new methods and algorithms to analyze CRSA in digital images. Most prior approaches have solely focused on the estimation of root traits from images, yet no integrated platform exists that allows easy and intuitive access to trait extraction and analysis methods from images combined with storage solutions linked to metadata. Automated high-throughput phenotyping methods are increasingly used in laboratory-based efforts to link plant genotype with phenotype, whereas similar field-based studies remain predominantly manual and low-throughput. Here, we present the open-source phenomics platform "DIRT" as a means to integrate scalable supercomputing architectures into field experiments and analysis pipelines. DIRT is an online platform that enables researchers to store images of plant roots, measure dicot and monocot root traits under field conditions, and share data and results within collaborative teams and the broader community. The DIRT platform seamlessly connects end-users with large-scale compute "commons", enabling the estimation and analysis of root phenotypes from field experiments of unprecedented size. DIRT is an automated high-throughput computing and collaboration platform for field-based crop root phenomics. The platform is accessible at http://www.dirt.iplantcollaborative.org/ and hosted on the iPlant cyber-infrastructure using high-throughput grid computing resources of the Texas Advanced Computing Center (TACC). DIRT is a high-volume central repository and high-throughput RSA trait computation platform for plant scientists working on crop roots. It enables scientists to store, manage and share crop root images with metadata and to compute RSA traits from thousands of images in parallel. It makes high-throughput RSA trait computation available to the community with just a few button clicks, enabling plant scientists to spend more time on science rather than on technology. All stored and computed data are easily accessible to the public and the broader scientific community. We hope that easy data accessibility will attract new tool developers and spur creative data usage that may even be applied to other fields of science.

  5. Design and Construction of Detector and Data Acquisition Elements for Proton Computed Tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fermi Research Alliance; Northern Illinois University

    2015-07-15

    Proton computed tomography (pCT) offers an alternative to x-ray imaging with potential for three-dimensional imaging, reduced radiation exposure, and in-situ imaging. Northern Illinois University (NIU) is developing a second-generation proton computed tomography system with a goal of demonstrating the feasibility of three-dimensional imaging within clinically realistic imaging times. The second-generation pCT system comprises a tracking system, a calorimeter, data acquisition, a computing farm, and software algorithms. The proton beam encounters the upstream tracking detectors, the patient or phantom, the downstream tracking detectors, and a calorimeter. The data acquisition sends the proton scattering information to an offline computing farm. Major innovations of the second-generation pCT project involve an increased data acquisition rate (MHz range) and development of three-dimensional imaging algorithms. The Fermilab Particle Physics Division and the Northern Illinois Center for Accelerator and Detector Development at Northern Illinois University worked together to design and construct the tracking detectors, calorimeter, readout electronics and detector mounting system.

  6. GenomicTools: a computational platform for developing high-throughput analytics in genomics.

    PubMed

    Tsirigos, Aristotelis; Haiminen, Niina; Bilal, Erhan; Utro, Filippo

    2012-01-15

    Recent advances in sequencing technology have resulted in a dramatic increase of sequencing data, which, in turn, requires efficient management of computational resources such as computing time and memory, as well as prototyping of computational pipelines. We present GenomicTools, a flexible computational platform, comprising both a command-line set of tools and a C++ API, for the analysis and manipulation of high-throughput sequencing data such as DNA-seq, RNA-seq, ChIP-seq and MethylC-seq. GenomicTools implements a variety of mathematical operations between sets of genomic regions, thereby enabling the prototyping of computational pipelines that can address a wide spectrum of tasks ranging from pre-processing and quality control to meta-analyses. Additionally, the GenomicTools platform is designed to analyze large datasets of any size by minimizing memory requirements. In practical applications, where comparable, GenomicTools outperforms existing tools in terms of both time and memory usage. The GenomicTools platform (version 2.0.0) was implemented in C++. The source code, documentation, user manual, example datasets and scripts are available online at http://code.google.com/p/ibm-cbc-genomic-tools.
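
    A sketch of the kind of region operation such a toolkit provides, here a sweep intersection of two sorted interval lists; this illustrates the concept only and is not the GenomicTools API:

        def intersect(regions_a, regions_b):
            """Intersect two sorted lists of half-open genomic intervals
            (start, end) on one chromosome, returning their overlaps."""
            out, i, j = [], 0, 0
            while i < len(regions_a) and j < len(regions_b):
                start = max(regions_a[i][0], regions_b[j][0])
                end = min(regions_a[i][1], regions_b[j][1])
                if start < end:
                    out.append((start, end))          # non-empty overlap
                if regions_a[i][1] < regions_b[j][1]:
                    i += 1                            # advance the interval that ends first
                else:
                    j += 1
            return out

        # intersect([(0, 5), (10, 20)], [(3, 12)]) -> [(3, 5), (10, 12)]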

  7. Silhouette-based approach of 3D image reconstruction for automated image acquisition using robotic arm

    NASA Astrophysics Data System (ADS)

    Azhar, N.; Saad, W. H. M.; Manap, N. A.; Saad, N. M.; Syafeeza, A. R.

    2017-06-01

    This study presents an approach to 3D image reconstruction using an autonomous robotic arm for the image acquisition process. A low-cost automated imaging platform is created using a pair of G15 servo motors connected in series to an Arduino UNO as the main microcontroller. Two sets of sequential images were obtained using different projection angles of the camera. The silhouette-based approach is used in this study for 3D reconstruction from the sequential images captured from several different angles of the object. In addition, the effect of the number of sequential images on the accuracy of the 3D model reconstruction was analyzed with a fixed projection angle of the camera. The factors affecting the 3D reconstruction are discussed and the overall result of the analysis is summarized for the imaging platform prototype.
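
    A sketch of the silhouette (visual hull) idea underlying the reconstruction: a voxel survives only if its projection falls inside the object silhouette in every view. The projection functions and masks are schematic assumptions, not the study's code:

        import numpy as np

        def carve(voxels, views):
            """Keep only voxels whose projections land inside every
            silhouette. `voxels` is an (N, 3) float array; each view is a
            (project, mask) pair, where `project` maps 3D points to integer
            pixel coordinates (u, v) and `mask` is a boolean silhouette."""
            keep = np.ones(len(voxels), dtype=bool)
            for project, mask in views:
                u, v = project(voxels)
                inside = (u >= 0) & (u < mask.shape[1]) & \
                         (v >= 0) & (v < mask.shape[0])
                ok = inside.copy()
                ok[inside] = mask[v[inside], u[inside]]   # silhouette test
                keep &= ok                  # out-of-frame or background: carve away
            return voxels[keep]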

  8. Real-time UNIX in HEP data acquisition

    NASA Astrophysics Data System (ADS)

    Buono, S.; Gaponenko, I.; Jones, R.; Mapelli, L.; Mornacchi, G.; Prigent, D.; Sanchez-Corral, E.; Skiadelli, M.; Toppers, A.; Duval, P. Y.; Ferrato, D.; Le Van Suu, A.; Qian, Z.; Rondot, C.; Ambrosini, G.; Fumagalli, G.; Aguer, M.; Huet, M.

    1994-12-01

    Today's experimentation in high energy physics is characterized by an increasing need for sensitivity to rare phenomena and complex physics signatures, which require the use of huge and sophisticated detectors and consequently a high performance readout and data acquisition. Multi-level triggering, hierarchical data collection and an ever-increasing amount of processing power, distributed throughout the data acquisition layers, impose a number of requirements on the software environment, especially the need for a high level of standardization. Real-time UNIX currently seems to be the best solution for the platform independence, operating system interface standards and real-time features necessary for data acquisition in HEP experiments. We present the results of the evaluation, in a realistic application environment, of a real-time UNIX operating system: the EP/LX real-time UNIX system.

  9. The Design and Development of Test Platform for Wheat Precision Seeding Based on Image Processing Techniques

    NASA Astrophysics Data System (ADS)

    Li, Qing; Lin, Haibo; Xiu, Yu-Feng; Wang, Ruixue; Yi, Chuijie

    The test platform for wheat precision seeding based on image processing techniques is designed to develop a wheat precision seed metering device with high efficiency and precision. Using image processing techniques, this platform gathers images of seeds (wheat) falling from the seed metering device onto the conveyer belt. These data are then processed and analyzed to calculate the qualified rate, reseeding rate and missed-sowing rate. This paper introduces the overall structure and design parameters of the platform, the hardware and software of the image acquisition system, and the method of seed identification and seed-spacing measurement based on image thresholding and locating each seed's center. Analysis of the experimental results shows that the measurement error is less than ±1 mm.
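
    A sketch of the threshold-and-centroid step described above, using scipy.ndimage; the threshold value and image source are placeholders, not the platform's software:

        import numpy as np
        from scipy import ndimage

        def seed_centers(gray_image, threshold=128):
            """Binarize the belt image, label connected components (seeds),
            and return each seed's center; seed spacing along the belt can
            then be measured as differences between successive centers."""
            binary = gray_image > threshold
            labels, count = ndimage.label(binary)
            centers = ndimage.center_of_mass(binary, labels, range(1, count + 1))
            return sorted(centers, key=lambda c: c[1])   # order along belt axis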

  10. The performance of low-cost commercial cloud computing as an alternative in computational chemistry.

    PubMed

    Thackston, Russell; Fortenberry, Ryan C

    2015-05-05

    The growth of commercial cloud computing (CCC) as a viable means of computational infrastructure is largely unexplored for the purposes of quantum chemistry. In this work, the PSI4 suite of computational chemistry programs is installed on five different types of Amazon Web Services CCC platforms. The performance for a set of electronically excited state single-point energies is compared between these CCC platforms and typical, "in-house" physical machines. Further considerations are made for the number of cores or virtual CPUs (vCPUs, for the CCC platforms), but no considerations are made for full parallelization of the program (even though parallelization of the BLAS library is implemented), complete high-performance computing cluster utilization, or steal time. Even with this most pessimistic view of the computations, CCC resources are shown to be more cost effective for significant numbers of typical quantum chemistry computations. Large numbers of large computations are still best utilized by more traditional means, but smaller-scale research may be more effectively undertaken through CCC services. © 2015 Wiley Periodicals, Inc.

  11. COMPUTER-AIDED DATA ACQUISITION FOR COMBUSTION EXPERIMENTS

    EPA Science Inventory

    The article describes the use of computer-aided data acquisition techniques to aid the research program of the Combustion Research Branch (CRB) of the U.S. EPA's Air and Energy Engineering Research Laboratory (AEERL) in Research Triangle Park, NC, in particular on CRB's bench-sca...

  12. A Platform-Independent Plugin for Navigating Online Radiology Cases.

    PubMed

    Balkman, Jason D; Awan, Omer A

    2016-06-01

    Software methods that enable navigation of radiology cases on various digital platforms differ between handheld devices and desktop computers. This has resulted in poor compatibility of online radiology teaching files across mobile smartphones, tablets, and desktop computers. A standardized, platform-independent, or "agnostic" approach for presenting online radiology content was produced in this work by leveraging modern hypertext markup language (HTML) and JavaScript web software technology. We describe the design and evaluation of this software, demonstrate its use across multiple viewing platforms, and make it publicly available as a model for future development efforts.

  13. A generic readout system for astrophysical detectors

    NASA Astrophysics Data System (ADS)

    Doumayrou, E.; Lortholary, M.

    2012-09-01

    We have developed a generic digital platform to fulfill the needs of new detector development in astrophysics; it is used in the lab, for ground-based telescope instruments, and in prototype versions for space instrument development. This system is based on an FPGA hardware electronic board (called MISE) together with software on a PC (called BEAR). The MISE board generates the fast clocking which reads the detectors thanks to a programmable digital sequencer, and performs data acquisition, buffering of the digitized pixel outputs and interfacing with other boards. The data are then sent to the PC via a SpaceWire or USB link. The BEAR software sets up the MISE board, performs data acquisition and enables the visualization, processing and storage of data online. These software tools are built with C++ and LabVIEW (NI) on a Linux OS. MISE and BEAR form a generic acquisition architecture onto which dedicated analog boards are plugged to accommodate detector specifics: number of pixels, readout channels and frequency, analog bias and clock interfaces. We have used this concept to build a camera for the P-ARTEMIS project including a 256-pixel sub-millimeter bolometer detector at 10 kpixel/s (SPIE 7741-12 (2010)). For the EUCLID project, a lab camera is now working for the testing of 4-Mpixel CCDs at 4 × 200 kpixel/s. Another is working for the testing of new near-infrared detectors (NIR LFSA for the ESA TRP program), 110 kpixels at 2 × 100 kpixel/s. Other projects are in progress for the space missions PLATO and SPICA.

  14. Quantitative colorectal cancer perfusion measurement using dynamic contrast-enhanced multidetector-row computed tomography: effect of acquisition time and implications for protocols.

    PubMed

    Goh, Vicky; Halligan, Steve; Hugill, Jo-Ann; Gartner, Louise; Bartram, Clive I

    2005-01-01

    To determine the effect of acquisition time on quantitative colorectal cancer perfusion measurement. Dynamic contrast-enhanced computed tomography (CT) was performed prospectively in 10 patients with histologically proven colorectal cancer using 4-detector row CT (Lightspeed Plus; GE Healthcare Technologies, Waukesha, WI). Tumor blood flow, blood volume, mean transit time, and permeability were assessed for 3 acquisition times (45, 65, and 130 seconds). Mean values for all 4 perfusion parameters for each acquisition time were compared using the paired t test. Significant differences in permeability values were noted between acquisitions of 45 seconds and 65 and 130 seconds, respectively (P=0.02, P=0.007). There was no significant difference for values of blood volume, blood flow, and mean transit time between any of the acquisition times. Scan acquisitions of 45 seconds are too short for reliable permeability measurement in the abdomen. Longer acquisition times are required.
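
    A sketch of the paired comparison used in the study, with scipy; the values below are placeholders, not the reported measurements:

        from scipy import stats

        # Permeability estimates from the same tumors at two acquisition
        # times (placeholder values, not the study's data)
        perm_45s = [12.1, 9.8, 15.3, 11.0, 13.7]
        perm_130s = [9.4, 8.1, 12.6, 9.9, 11.2]

        # Paired t-test: each tumor serves as its own control
        t_stat, p_value = stats.ttest_rel(perm_45s, perm_130s)
        print(f"paired t = {t_stat:.2f}, P = {p_value:.3f}")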

  15. A PC-based single-ADC multi-parameter data acquisition system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woodring, M.; Kegel, G.H.R.; Egan, J.J.

    1995-10-01

    A personal computer (PC) based multi-parameter data acquisition system using the Microsoft Windows operating environment has been designed and constructed. An IBM AT-compatible personal computer with an Intel 486DX5 microprocessor was combined with a National Instruments AT-DIO-32 digital I/O card, a single Canberra 8713 ADC with 13-bit resolution and a modified Canberra 8223 8-input analog multiplexer to acquire data from experiments carried out at the UML Van de Graaff accelerator. The accelerator data acquisition (ADAC) computer environment was programmed in Microsoft Visual BASIC for use in Windows. ADAC allows event-mode data acquisition with up to eight parameters (modifiable to 64) and the simultaneous display of parameters during acquisition. Additional features of ADAC include replay of event-mode data and graphical analysis/display of data. The ADAC environment is easy to upgrade or expand, inexpensive to implement, and is specifically designed to meet the needs of nuclear spectroscopy.

  16. 77 FR 76588 - Request for Proposal Platform Pilot

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-28

    The Small Business Administration (SBA) is announcing a pilot where federal agencies will test a new request for proposal (RFP) platform (RFP-EZ) to streamline the process through which the government buys web design and related technology services from small businesses for acquisitions valued at or below the simplified acquisition threshold (SAT). RFP-EZ is one of five projects sponsored by the Office of Science and Technology Policy's Presidential Innovation Fellows Program, which leverages the ingenuity of leading problem solvers from across America together with federal innovators to tackle projects that aim to fuel job creation, save taxpayers money, and significantly improve how the federal government serves the American people. Under the RFP-EZ pilot, which will initially run from December 28, 2012 through May 1, 2013, agencies will identify individual procurements valued at or below the simplified acquisition threshold that can be set aside for small businesses to test a suite of functional tools for: (1) simplifying the development of statements of work; (2) improving agency access to information about small businesses; (3) enabling small businesses to submit quotes, bids or proposals (collectively referred to as proposals) electronically in response to a solicitation posted on Federal Business Opportunities (FedBizOpps); (4) enhancing efficiencies for evaluating proposals; and (5) improving how information (including prices paid by federal agencies) is captured and stored. The pilot will be conducted in accordance with existing laws and regulations. Interested parties are encouraged to review and comment on the functionality of RFP-EZ, as described at www.sba.gov/rfpez and highlighted in this notice. Responses to this notice will be considered for possible refinements to the RFP-EZ platform during the pilot and as part of the evaluation of the benefits and costs of making RFP-EZ a permanent platform fully integrated with FedBizOpps, the System for Award Management and agency contract writing systems.

  17. Unified, Cross-Platform, Open-Source Library Package for High-Performance Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kozacik, Stephen

    Compute power is continually increasing, but this increased performance is largely found in sophisticated computing devices and supercomputer resources that are difficult to use, resulting in under-utilization. We developed a unified set of programming tools that will allow users to take full advantage of the new technology by allowing them to work at a level abstracted away from the platform specifics, encouraging the use of modern computing systems, including government-funded supercomputer facilities.

  18. Architecture and Implementation of OpenPET Firmware and Embedded Software

    DOE PAGES

    Abu-Nimeh, Faisal T.; Ito, Jennifer; Moses, William W.; ...

    2016-01-11

    OpenPET is an open source, modular, extendible, and high-performance platform suitable for multi-channel data acquisition and analysis. Due to the versatility of the hardware, firmware, and software architectures, the platform is capable of interfacing with a wide variety of detector modules, not only in medical imaging but also in homeland security applications. Analog signals from radiation detectors share similar characteristics: a pulse whose area is proportional to the deposited energy and whose leading edge is used to extract a timing signal. As a result, a generic design method is adopted for the hardware, firmware, and software architectures and implementations. The analog front-end is hosted on a module called a Detector Board, where each board can filter, combine, timestamp, and process multiple channels independently. The processed data is formatted and sent through a backplane bus to a module called a Support Board, where one Support Board can host up to eight Detector Board modules. The data in the Support Board, coming from eight Detector Board modules, can be aggregated or correlated (if needed) depending on the algorithm implemented or the runtime mode selected. It is then sent out to a computer workstation for further processing. The number of channels (detector modules) to be processed mandates the overall OpenPET system configuration, which is designed to handle up to 1,024 channels using 16-channel Detector Boards in the Standard System Configuration and 16,384 channels using 32-channel Detector Boards in the Large System Configuration.

  19. Fractal dimension approach in postural control of subjects with Prader-Willi Syndrome

    PubMed Central

    2011-01-01

    Background Static posturography is a user-friendly technique suitable for the study of the centre of pressure (CoP) trajectory. However, the utility of static posturography in clinical practice is somewhat limited and there is a need for reliable approaches to extract physiologically meaningful information from stabilograms. The aim of this study was to quantify the postural strategy of Prader-Willi patients with the fractal dimension technique in addition to the CoP trajectory analysis in the time and frequency domains. Methods Eleven adult patients affected by Prader-Willi Syndrome (PWS) and 20 age-matched individuals (control group: CG) were included in this study. Postural acquisitions were conducted by means of a force platform and the participants were required to stand barefoot on the platform for 30 seconds, with eyes open and heels at a standardized distance and position. Platform data were analysed in the time and frequency domains. The fractal dimension (FD) was also computed. Results The analysis of CoP vs. time showed that in PWS participants all the parameters were statistically different from CG, with greater displacements along both the antero-posterior and medio-lateral directions and longer CoP tracks. As for the frequency analysis, our data showed no significant differences between PWS and CG. FD showed that PWS individuals were characterized by greater values in comparison with CG. Conclusions Our data showed that while the analysis in the frequency domain did not seem to explain the postural deficit in PWS, the FD method appears to provide a more informative description of it and to complement and integrate the time domain analysis. PMID:21854639
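
    A sketch of one common fractal-dimension estimator for a planar CoP trajectory (Katz's method); the abstract does not specify which estimator was used, so this stands in purely as an illustration:

        import numpy as np

        def katz_fd(xy):
            """Katz fractal dimension of a planar trajectory given as an
            (N, 2) array of CoP samples: FD = log(n) / (log(n) + log(d/L)),
            with L the total path length, d the maximal distance from the
            first point, and n the number of steps."""
            steps = np.linalg.norm(np.diff(xy, axis=0), axis=1)
            L = steps.sum()
            d = np.linalg.norm(xy - xy[0], axis=1).max()
            n = len(steps)
            return np.log10(n) / (np.log10(n) + np.log10(d / L))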

  20. The Effect of Computer Simulations on Acquisition of Knowledge and Cognitive Load: A Gender Perspective

    ERIC Educational Resources Information Center

    Kaheru, Sam J.; Kriek, Jeanne

    2016-01-01

    A study on the effect of the use of computer simulations (CS) on the acquisition of knowledge and cognitive load was undertaken with 104 Grade 11 learners in four schools in rural South Africa on the physics topic geometrical optics. Owing to the lack of resources a teacher-centred approach was followed in the use of computer simulations. The…

  1. Middleware for Plug and Play Integration of Heterogeneous Sensor Resources into the Sensor Web

    PubMed Central

    Toma, Daniel M.; Jirka, Simon; Del Río, Joaquín

    2017-01-01

    The study of global phenomena requires the combination of a considerable amount of data coming from different sources, acquired by different observation platforms and managed by institutions working in different scientific fields. Merging this data to provide extensive and complete data sets to monitor the long-term, global changes of our oceans is a major challenge. The data acquisition and data archival procedures usually vary significantly depending on the acquisition platform. This lack of standardization ultimately leads to information silos, preventing the data from being effectively shared across different scientific communities. In the past years, important steps have been taken in order to improve both standardization and interoperability, such as the Open Geospatial Consortium's Sensor Web Enablement (SWE) framework. Within this framework, standardized models and interfaces to archive, access and visualize the data from heterogeneous sensor resources have been proposed. However, due to the wide variety of software and hardware architectures presented by marine sensors and marine observation platforms, there is still a lack of uniform procedures to integrate sensors into existing SWE-based data infrastructures. In this work, a framework aimed at enabling sensor plug and play integration into existing SWE-based data infrastructures is presented. First, the operations required to automatically identify, configure and operate a sensor are analysed. Then, the metadata required for these operations is structured in a standard way. Afterwards, a modular, plug and play, SWE-based acquisition chain is proposed. Finally, different use cases for this framework are presented. PMID:29244732
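
    A sketch of the kind of structured, machine-readable metadata that lets an acquisition chain identify, configure and operate a sensor automatically; the fields below are illustrative stand-ins, not SensorML or the paper's schema:

        import json

        # Hypothetical plug-and-play descriptor: enough for an acquisition
        # chain to identify, configure, and start a sensor without manual steps
        sensor_descriptor = {
            "identifier": "urn:example:sensor:ctd-042",
            "protocol": {"transport": "serial", "baud": 9600},
            "commands": {"start": "RUN", "stop": "STOP"},
            "outputs": [
                {"name": "temperature", "unit": "Cel", "rate_hz": 1.0},
                {"name": "conductivity", "unit": "S/m", "rate_hz": 1.0},
            ],
        }

        print(json.dumps(sensor_descriptor, indent=2))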

  2. SU-G-BRA-01: A Real-Time Tumor Localization and Guidance Platform for Radiotherapy Using US and MRI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bednarz, B; Culberson, W; Bassetti, M

    Purpose: To develop and validate a real-time motion management platform for radiotherapy that directly tracks tumor motion using ultrasound and MRI. This will be a cost-effective and non-invasive real-time platform combining the excellent temporal resolution of ultrasound with the excellent soft-tissue contrast of MRI. Methods: The method uses a 4D planar ultrasound acquisition during the treatment that is coupled to a pre-treatment calibration training image set consisting of a simultaneous 4D ultrasound and 4D MRI acquisition. The image sets will be rapidly matched using advanced image and signal processing algorithms, allowing the display of virtual MR images of the tumor/organ motion in real-time from an ultrasound acquisition. Results: The completion of this work will result in several innovations including: a (2D) patch-like, MR- and LINAC-compatible 4D planar ultrasound transducer that is electronically steerable for hands-free operation to provide real-time virtual MR and ultrasound imaging for motion management during radiation therapy; a multi-modal tumor localization strategy that uses ultrasound and MRI; and fast and accurate image processing algorithms that provide real-time information about the motion and location of the tumor or related soft-tissue structures within the patient. Conclusion: If successful, the proposed approach will provide real-time guidance for radiation therapy without degrading image or treatment plan quality. The approach would be equally suitable for image-guided proton beam or heavy ion-beam therapy. This work is partially funded by NIH grant R01CA190298.

  3. Soft tissue deformation for surgical simulation: a position-based dynamics approach.

    PubMed

    Camara, Mafalda; Mayer, Erik; Darzi, Ara; Pratt, Philip

    2016-06-01

    To assist the rehearsal and planning of robot-assisted partial nephrectomy, a real-time simulation platform is presented that allows surgeons to visualise and interact with rapidly constructed patient-specific biomechanical models of the anatomical regions of interest. Coupled to a framework for volumetric deformation, the platform furthermore simulates intracorporeal 2D ultrasound image acquisition, using preoperative imaging as the data source. This not only facilitates the planning of optimal transducer trajectories and viewpoints, but can also act as a validation context for manually operated freehand 3D acquisitions and reconstructions. The simulation platform was implemented within the GPU-accelerated NVIDIA FleX position-based dynamics framework. In order to validate the model and determine material properties and other simulation parameter values, a porcine kidney with embedded fiducial beads was CT-scanned and segmented. Acquisitions for the rest position and three different levels of probe-induced deformation were collected. Optimal values of the cluster stiffness coefficients were determined for a range of different particle radii, where the objective function comprised the mean distance error between real and simulated fiducial positions over the sequence of deformations. The mean fiducial error at each deformation stage was found to be compatible with the level of ultrasound probe calibration error typically observed in clinical practice. Furthermore, the simulation exhibited unconditional stability on account of its use of clustered shape-matching constraints. A novel position-based dynamics implementation of soft tissue deformation has been shown to facilitate several desirable simulation characteristics: real-time performance, unconditional stability, rapid model construction enabling patient-specific behaviour and accuracy with respect to reference CT images.
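
    A minimal sketch of a position-based dynamics step with pairwise distance constraints, in the spirit of the approach described (not the NVIDIA FleX implementation): predict positions from velocities, iteratively project constraints, then derive the new velocities:

        import numpy as np

        def pbd_step(p, v, pairs, rest, dt, stiffness=1.0, iters=10):
            """One position-based dynamics step for unit-mass particles:
            `p`, `v` are (N, dim) arrays; `pairs` lists constrained index
            pairs with rest lengths `rest`."""
            q = p + dt * v                        # predicted positions
            for _ in range(iters):
                for (i, j), r in zip(pairs, rest):
                    d = q[i] - q[j]
                    dist = np.linalg.norm(d)
                    if dist < 1e-12:
                        continue
                    corr = stiffness * 0.5 * (dist - r) * d / dist
                    q[i] -= corr                  # move endpoints symmetrically
                    q[j] += corr
            v = (q - p) / dt                      # PBD velocity update
            return q, v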

  4. University Students Use of Computers and Mobile Devices for Learning and Their Reading Speed on Different Platforms

    ERIC Educational Resources Information Center

    Mpofu, Bongeka

    2016-01-01

    This research was aimed at the investigation of mobile device and computer use at a higher learning institution. The goal was to determine the current use of computers and mobile devices for learning and the students' reading speed on different platforms. The research was contextualised in a sample of students at the University of South Africa.…

  5. Experimental validation of A-mode ultrasound acquisition system for computer assisted orthopaedic surgery

    NASA Astrophysics Data System (ADS)

    De Lorenzo, Danilo; De Momi, Elena; Beretta, Elisa; Cerveri, Pietro; Perona, Franco; Ferrigno, Giancarlo

    2009-02-01

    Computer Assisted Orthopaedic Surgery (CAOS) systems improve the results and the standardization of surgical interventions. Anatomical landmark and bone surface detection is required both to register the surgical space with the pre-operative imaging space and to compute biomechanical parameters for prosthesis alignment. Surface point acquisition increases the invasiveness of the intervention and can be influenced by the interposition of the soft tissue layer (7-15 mm localization errors). This study is aimed at evaluating the accuracy of a custom-made A-mode ultrasound (US) system for non-invasive detection of anatomical landmarks and surfaces. A-mode solutions eliminate the need for US image segmentation, offer real-time signal processing and require less invasive equipment. The system consists of an optically tracked single-transducer US probe, a pulser/receiver, an FPGA-based board responsible for logic control command generation and for real-time signal processing, and three custom-made boards (signal acquisition, blanking and synchronization). We propose a new calibration method for the US system. The experimental validation was then performed by measuring the length of known-shape polymethylmethacrylate boxes filled with pure water and by acquiring bone surface points on a bovine bone phantom covered with soft-tissue-mimicking materials. Measurement errors were computed through MR and CT image acquisitions of the phantom. Point acquisition on the bone surface with the US system demonstrated lower errors (1.2 mm) than standard pointer acquisition (4.2 mm).
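
    A sketch of the time-of-flight relation an A-mode system relies on: echo depth is half the round-trip time multiplied by the speed of sound (the tissue sound speed below is a typical assumed value, not the study's calibration):

        SPEED_OF_SOUND_TISSUE = 1540.0  # m/s, typical soft-tissue value

        def echo_depth_mm(round_trip_s, c=SPEED_OF_SOUND_TISSUE):
            """Depth of a reflecting interface from the round-trip echo time:
            the pulse travels to the interface and back, hence the factor 1/2."""
            return 1000.0 * c * round_trip_s / 2.0

        # A bone echo arriving 26 microseconds after the pulse -> ~20 mm deep
        print(echo_depth_mm(26e-6))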

  6. Application verification research of cloud computing technology in the field of real time aerospace experiment

    NASA Astrophysics Data System (ADS)

    Wan, Junwei; Chen, Hongyan; Zhao, Jing

    2017-08-01

    To meet the real-time, reliability and safety requirements of aerospace experiments, a single-center cloud computing application verification platform was constructed. At the IaaS level, the feasibility of applying cloud computing technology to the field of aerospace experiments was tested and verified. Based on the analysis of the test results, a preliminary conclusion is obtained: the cloud computing platform can be applied to computation-intensive aerospace experiment workloads, while for I/O-intensive workloads the traditional physical machine is recommended.

  7. Acquisition of Long-Duration, Low-Gravity Slosh Data Utilizing Existing ISS Equipment (SPHERES) for Calibration of CFD Models of Coupled Fluid-Vehicle Behavior

    NASA Technical Reports Server (NTRS)

    Schallhorn, Paul; Roth, Jacob; Marsell, Brandon; Kirk, Daniel; Gutierrez, Hector; Saenz-Otero, Alvar; Dorney, Daniel; Moder, Jeffrey

    2013-01-01

    Accurate prediction of coupled fluid slosh and launch vehicle or spacecraft dynamics (e.g., nutation/precessional movement about various axes, attitude changes, etc.) requires Computational Fluid Dynamics (CFD) models calibrated with low-gravity, long-duration slosh data. Recently completed investigations of reduced gravity slosh behavior have demonstrated the limitations of utilizing parabolic flights on specialized aircraft with respect to the specific objectives of the experiments. Although valuable data was collected, the benefits of longer duration low-gravity environments were clearly established. The proposed research provides the first data set from long duration tests in zero gravity that can be directly used to benchmark CFD models, including the interaction between the sloshing fluid and the tank/vehicle dynamics. To explore the coupling of liquid slosh with the motion of an unconstrained tank in microgravity, NASA's Kennedy Space Center, Launch Services Program has teamed up with the Florida Institute of Technology (FIT), Massachusetts Institute of Technology (MIT) and the NASA Game Changing Development Program (GCD) to perform a series of slosh dynamics experiments on the International Space Station using the SPHERES platform. The Synchronized Position Hold Engage Reorient Experimental Satellites (SPHERES) testbed provides a unique, free-floating instrumented platform on ISS that can be utilized in a manner that would solve many of the limitations of the current knowledge related to propellant slosh dynamics on launch vehicle and spacecraft fuel tanks. The six degree of freedom (6-DOF) motion of the SPHERES free-flyer is controlled by an array of cold-flow CO2 thrusters, supplied from a built-in liquid CO2 tank. These SPHERES can independently navigate and re-orient themselves within the ISS. The intent of this project is to design an externally mounted tank to be driven inside the ISS by a set of two SPHERES devices (Figure 1). The tank geometry simulates a launch vehicle upper stage propellant tank and the maneuvers replicate those of real vehicles. The design includes inertial sensors, data acquisition, image capture and data storage interfaces to the SPHERES VERTIGO computer system on board the flight article assembly. The design also includes mechanical and electronic interfaces to the existing SPHERES hardware, which include self-contained packages that can operate in conjunction with the existing SPHERES electronics.

  8. Acquisition of Long-Duration, Low-Gravity Slosh Data Utilizing Existing ISS Equipment (SPHERES) for Calibration of CFD Models of Coupled Fluid-Vehicle Behavior

    NASA Technical Reports Server (NTRS)

    Schallhorn, Paul; Roth, Jacob; Marsell, Brandon; Kirk, Daniel; Gutierrez, Hector; Saenz-Otero, Alvar; Dorney, Daniel; Moder, Jeffrey

    2012-01-01

    Accurate prediction of coupled fluid slosh and launch vehicle or spacecraft dynamics (e.g., nutation/precessional movement about various axes, attitude changes, etc.) requires Computational Fluid Dynamics (CFD) models calibrated with low-gravity, long-duration slosh data. Recently completed investigations of reduced gravity slosh behavior have demonstrated the limitations of utilizing parabolic flights on specialized aircraft with respect to the specific objectives of the experiments. Although valuable data was collected, the benefits of longer duration low-gravity environments were clearly established. The proposed research provides the first data set from long duration tests in zero gravity that can be directly used to benchmark CFD models, including the interaction between the sloshing fluid and the tank/vehicle dynamics. To explore the coupling of liquid slosh with the motion of an unconstrained tank in microgravity, NASA's Kennedy Space Center, Launch Services Program has teamed up with the Florida Institute of Technology (FIT), Massachusetts Institute of Technology (MIT) and the Office of the Chief Technologist (OCT) to perform a series of slosh dynamics experiments on the International Space Station using the SPHERES platform. The Synchronized Position Hold Engage Reorient Experimental Satellites (SPHERES) testbed provides a unique, free-floating instrumented platform on ISS that can be utilized in a manner that would solve many of the limitations of the current knowledge related to propellant slosh dynamics on launch vehicle and spacecraft fuel tanks. The six degree of freedom (6-DOF) motion of the SPHERES free-flyer is controlled by an array of cold-flow CO2 thrusters, supplied from a built-in liquid CO2 tank. These SPHERES can independently navigate and re-orient themselves within the ISS. The intent of this project is to design an externally mounted tank to be driven inside the ISS by a set of two SPHERES devices (Figure 1). The tank geometry simulates a launch vehicle upper stage propellant tank and the maneuvers replicate those of real vehicles. The design includes inertial sensors, data acquisition, image capture and data storage interfaces to the SPHERES VERTIGO computer system on board the flight article assembly. The design also includes mechanical and electronic interfaces to the existing SPHERES hardware, which include self-contained packages that can operate in conjunction with the existing SPHERES electronics.

  9. GATECloud.net: a platform for large-scale, open-source text processing on the cloud.

    PubMed

    Tablan, Valentin; Roberts, Ian; Cunningham, Hamish; Bontcheva, Kalina

    2013-01-28

    Cloud computing is increasingly being regarded as a key enabler of the 'democratization of science', because on-demand, highly scalable cloud computing facilities enable researchers anywhere to carry out data-intensive experiments. In the context of natural language processing (NLP), algorithms tend to be complex, which makes their parallelization and deployment on cloud platforms a non-trivial task. This study presents a new, unique, cloud-based platform for large-scale NLP research--GATECloud.net. It enables researchers to carry out data-intensive NLP experiments by harnessing the vast, on-demand compute power of the Amazon cloud. Important infrastructural issues are dealt with by the platform, completely transparently for the researcher: load balancing, efficient data upload and storage, deployment on the virtual machines, security and fault tolerance. We also include a cost-benefit analysis and usage evaluation.

  10. Open chemistry: RESTful web APIs, JSON, NWChem and the modern web application.

    PubMed

    Hanwell, Marcus D; de Jong, Wibe A; Harris, Christopher J

    2017-10-30

    An end-to-end platform for chemical science research has been developed that integrates data from computational and experimental approaches through a modern web-based interface. The platform offers an interactive visualization and analytics environment that functions well on mobile, laptop and desktop devices. It offers pragmatic solutions to ensure that large and complex data sets are more accessible. Existing desktop applications/frameworks were extended to integrate with high-performance computing resources, and offer command-line tools to automate interaction, connecting distributed teams to this software platform on their own terms. The platform was developed openly, with all source code hosted on the GitHub platform and automated deployment possible using Ansible coupled with standard Ubuntu-based machine images deployed to cloud machines. The platform is designed to enable teams to reap the benefits of the connected web, going beyond what conventional search and analytics platforms offer in this area. It also has the goal of offering federated instances that can be customized to the sites/research performed. Data is stored using JSON, extending upon previous approaches using XML, building structures that support computational chemistry calculations. These structures were developed to make it easy to process data across different languages, and to send data to a JavaScript-based web client.

  11. Two Demonstrations with a New Data-Acquisition System

    ERIC Educational Resources Information Center

    Kraftmakher, Yaakov

    2014-01-01

    Nowadays, the use of data-acquisition systems in undergraduate laboratories is routine. Many computer-assisted experiments became possible with the PASCO scientific data-acquisition system based on the 750 Interface and DataStudio software. A new data-acquisition system developed by PASCO includes the 850 Universal Interface and Capstone software.…

  12. Automatic real time evaluation of red blood cell elasticity by optical tweezers

    NASA Astrophysics Data System (ADS)

    Moura, Diógenes S.; Silva, Diego C. N.; Williams, Ajoke J.; Bezerra, Marcos A. C.; Fontes, Adriana; de Araujo, Renato E.

    2015-05-01

    Optical tweezers have been used to trap, manipulate, and measure individual cell properties. In this work, we show that the association of a computer-controlled optical tweezers system with image processing techniques allows rapid and reproducible evaluation of cell deformability. In particular, the deformability of red blood cells (RBCs) plays a key role in the transport of oxygen through the blood microcirculation. The automatic measurement process consisted of three steps: acquisition, segmentation of images, and measurement of the elasticity of the cells. An optical tweezers system was set up on an upright microscope equipped with a CCD camera and a motorized XYZ stage, computer-controlled through a LabVIEW platform. On the optical tweezers setup, the deformation of the captured RBC was obtained by moving the motorized stage. The automatic real-time homemade system was evaluated by measuring RBC elasticity from normal donors and patients with sickle cell anemia. Approximately 150 erythrocytes were examined, and the elasticity values obtained using the developed system were compared to the values measured by two experts. With the automatic system, there was a significant time reduction (60×) in the erythrocyte elasticity evaluation. The automated system can help to expand the applications of optical tweezers in hematology and hemotherapy.

  13. Multi-Contrast Imaging and Digital Refocusing on a Mobile Microscope with a Domed LED Array.

    PubMed

    Phillips, Zachary F; D'Ambrosio, Michael V; Tian, Lei; Rulison, Jared J; Patel, Hurshal S; Sadras, Nitin; Gande, Aditya V; Switz, Neil A; Fletcher, Daniel A; Waller, Laura

    2015-01-01

    We demonstrate the design and application of an add-on device for improving the diagnostic and research capabilities of CellScope--a low-cost, smartphone-based point-of-care microscope. We replace the single LED illumination of the original CellScope with a programmable domed LED array. By leveraging recent advances in computational illumination, this new device enables simultaneous multi-contrast imaging with brightfield, darkfield, and phase imaging modes. Further, we scan through illumination angles to capture lightfield datasets, which can be used to recover 3D intensity and phase images without any hardware changes. This digital refocusing procedure can be used for either 3D imaging or software-only focus correction, reducing the need for precise mechanical focusing during field experiments. All acquisition and processing is performed on the mobile phone and controlled through a smartphone application, making the computational microscope compact and portable. Using multiple samples and different objective magnifications, we demonstrate that the performance of our device is comparable to that of a commercial microscope. This unique device platform extends the field imaging capabilities of CellScope, opening up new clinical and research possibilities.

  14. Multi-Contrast Imaging and Digital Refocusing on a Mobile Microscope with a Domed LED Array

    PubMed Central

    Phillips, Zachary F.; D'Ambrosio, Michael V.; Tian, Lei; Rulison, Jared J.; Patel, Hurshal S.; Sadras, Nitin; Gande, Aditya V.; Switz, Neil A.; Fletcher, Daniel A.; Waller, Laura

    2015-01-01

    We demonstrate the design and application of an add-on device for improving the diagnostic and research capabilities of CellScope—a low-cost, smartphone-based point-of-care microscope. We replace the single LED illumination of the original CellScope with a programmable domed LED array. By leveraging recent advances in computational illumination, this new device enables simultaneous multi-contrast imaging with brightfield, darkfield, and phase imaging modes. Further, we scan through illumination angles to capture lightfield datasets, which can be used to recover 3D intensity and phase images without any hardware changes. This digital refocusing procedure can be used for either 3D imaging or software-only focus correction, reducing the need for precise mechanical focusing during field experiments. All acquisition and processing is performed on the mobile phone and controlled through a smartphone application, making the computational microscope compact and portable. Using multiple samples and different objective magnifications, we demonstrate that the performance of our device is comparable to that of a commercial microscope. This unique device platform extends the field imaging capabilities of CellScope, opening up new clinical and research possibilities. PMID:25969980

  15. Embedded-Based Graphics Processing Unit Cluster Platform for Multiple Sequence Alignments

    PubMed Central

    Wei, Jyh-Da; Cheng, Hui-Jun; Lin, Chun-Yuan; Ye, Jin; Yeh, Kuan-Yu

    2017-01-01

    High-end graphics processing units (GPUs), such as the NVIDIA Tesla/Fermi/Kepler series cards with thousands of cores per chip, have been widely applied to high-performance computing fields over the past decade. These desktop GPU cards must be installed in personal computers/servers with desktop CPUs, and the cost and power consumption of constructing a GPU cluster platform are very high. In recent years, NVIDIA released an embedded board, called Jetson Tegra K1 (TK1), which contains 4 ARM Cortex-A15 CPUs and 192 Compute Unified Device Architecture cores (belonging to the Kepler GPU family). Jetson Tegra K1 has several advantages, such as low cost, low power consumption, and high applicability, and it has been applied in several specific applications. In our previous work, a bioinformatics platform with a single TK1 (STK platform) was constructed, which also showed that web and mobile services can be implemented on the STK platform with a good cost-performance ratio by comparing the STK platform with desktop CPUs and GPUs. In this work, an embedded GPU cluster platform is constructed with multiple TK1s (MTK platform). Complex system installation and setup are necessary procedures at first. Then, two job assignment modes are designed for the MTK platform to provide services for users. Finally, ClustalW v2.0.11 and ClustalWtk are ported to the MTK platform. The experimental results showed that the speedup ratios achieved 5.5 and 4.8 times for ClustalW v2.0.11 and ClustalWtk, respectively, when comparing six TK1s with a single TK1. The MTK platform is proven to be useful for multiple sequence alignments. PMID:28835734

  16. Embedded-Based Graphics Processing Unit Cluster Platform for Multiple Sequence Alignments.

    PubMed

    Wei, Jyh-Da; Cheng, Hui-Jun; Lin, Chun-Yuan; Ye, Jin; Yeh, Kuan-Yu

    2017-01-01

    High-end graphics processing units (GPUs), such as NVIDIA Tesla/Fermi/Kepler series cards with thousands of cores per chip, have been widely applied to high-performance computing over the past decade. These desktop GPU cards must be installed in personal computers or servers with desktop CPUs, so the cost and power consumption of constructing a GPU cluster platform are very high. In recent years, NVIDIA released an embedded board, the Jetson Tegra K1 (TK1), which contains 4 ARM Cortex-A15 CPUs and 192 Compute Unified Device Architecture cores (belonging to the Kepler GPU family). The Jetson TK1 offers several advantages, such as low cost, low power consumption, and high applicability, and it has been applied in several specific applications. In our previous work, a bioinformatics platform with a single TK1 (STK platform) was constructed, demonstrating that Web and mobile services can be implemented on the STK platform with a good cost-performance ratio compared with desktop CPU and GPU platforms. In this work, an embedded-based GPU cluster platform is constructed with multiple TK1s (MTK platform). Complex system installation and setup are required first. Then, 2 job assignment modes are designed for the MTK platform to provide services for users. Finally, ClustalW v2.0.11 and ClustalWtk are ported to the MTK platform. The experimental results showed speedup ratios of 5.5 and 4.8 times for ClustalW v2.0.11 and ClustalWtk, respectively, when comparing 6 TK1s with a single TK1. The MTK platform is thus proven useful for multiple sequence alignments.

  17. The Use of Computer-Based Videogames in Knowledge Acquisition and Retention.

    ERIC Educational Resources Information Center

    Ricci, Katrina E.

    1994-01-01

    Research conducted at the Naval Training Systems Center in Orlando, Florida, investigated the acquisition and retention of basic knowledge with subject matter presented in the forms of text, test, and game. Results are discussed in terms of the effectiveness of computer-based games for military training. (Author/AEF)

  18. A Study of Multimedia Application-Based Vocabulary Acquisition

    ERIC Educational Resources Information Center

    Shao, Jing

    2012-01-01

    The development of computer-assisted language learning (CALL) has created the opportunity to explore the effects of multimedia applications on foreign language vocabulary acquisition in recent years. This study provides an overview of computer-assisted language learning (CALL) and details one outgrowth of CALL: multimedia. With the…

  19. GPU-based High-Performance Computing for Radiation Therapy

    PubMed Central

    Jia, Xun; Ziegenhein, Peter; Jiang, Steve B.

    2014-01-01

    Recent developments in radiation therapy demand high computational power to solve challenging problems in a timely fashion in a clinical environment. The graphics processing unit (GPU), as an emerging high-performance computing platform, has been introduced to radiotherapy. It is particularly attractive due to its high computational power, small size, and low cost for facility deployment and maintenance. Over the past few years, GPU-based high-performance computing in radiotherapy has experienced rapid development. A tremendous number of studies have been conducted, in which large acceleration factors compared with the conventional CPU platform have been observed. In this article, we will first give a brief introduction to the GPU hardware structure and programming model. We will then review the current applications of the GPU in major imaging-related and therapy-related problems encountered in radiotherapy. A comparison of the GPU with other platforms will also be presented. PMID:24486639

  20. OpenACC performance for simulating 2D radial dambreak using FVM HLLE flux

    NASA Astrophysics Data System (ADS)

    Gunawan, P. H.; Pahlevi, M. R.

    2018-03-01

    The aim of this paper is to investigate the performance of the OpenACC platform for computing a 2D radial dambreak. Here, the shallow water equations are used to describe and simulate the 2D radial dambreak with a finite volume method (FVM) using the HLLE flux. OpenACC is a parallel computing platform based on GPU cores; in this research it is used to minimize the computational time of the numerical scheme. The results show that using OpenACC reduces the computational time. For the dry and wet radial dambreak simulations using 2048 grids, the parallel computational times are 575.984 s and 584.830 s, respectively. These results demonstrate the effectiveness of OpenACC when compared with the serial times of the dry and wet radial dambreak simulations, which are 28047.500 s and 29269.40 s, respectively.
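
    As a sketch of the HLLE flux at the core of the paper's FVM scheme, here is a one-dimensional shallow water version in NumPy; the paper's solver is two-dimensional and OpenACC-accelerated, and the wave-speed estimates used here are one common variant, so this is illustrative only:

    ```python
    import numpy as np

    g = 9.81  # gravitational acceleration

    def swe_flux(h, hu):
        """Physical flux of the 1D shallow water equations."""
        u = hu / h if h > 0 else 0.0
        return np.array([hu, hu * u + 0.5 * g * h * h])

    def hlle_flux(hL, huL, hR, huR):
        """HLLE numerical flux at one cell interface."""
        uL = huL / hL if hL > 0 else 0.0
        uR = huR / hR if hR > 0 else 0.0
        cL, cR = np.sqrt(g * hL), np.sqrt(g * hR)
        SL = min(uL - cL, uR - cR, 0.0)  # leftmost wave-speed estimate
        SR = max(uL + cL, uR + cR, 0.0)  # rightmost wave-speed estimate
        if SR == SL:
            return 0.5 * (swe_flux(hL, huL) + swe_flux(hR, huR))
        dU = np.array([hR - hL, huR - huL])
        return (SR * swe_flux(hL, huL) - SL * swe_flux(hR, huR)
                + SL * SR * dU) / (SR - SL)

    # Dam-break-like interface: deep water on the left, shallow on the right.
    print(hlle_flux(2.0, 0.0, 1.0, 0.0))
    ```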

  1. MACBenAbim: A Multi-platform Mobile Application for searching keyterms in Computational Biology and Bioinformatics.

    PubMed

    Oluwagbemi, Olugbenga O; Adewumi, Adewole; Esuruoso, Abimbola

    2012-01-01

    Computational biology and bioinformatics are gradually gaining ground in Africa and other developing nations of the world. However, in these countries, some of the challenges of computational biology and bioinformatics education are inadequate infrastructure and a lack of readily available complementary and motivational tools to support learning as well as research. This has lowered the morale of many promising undergraduates, postgraduates, and researchers who might otherwise aspire to undertake future study in these fields. In this paper, we developed and described MACBenAbim (Multi-platform Mobile Application for Computational Biology and Bioinformatics), a flexible, user-friendly tool to search for, define, and describe the meanings of keyterms in computational biology and bioinformatics, thus expanding the frontiers of knowledge of the users. This tool also has the capability of visualizing results in a mobile multi-platform context. MACBenAbim is available from the authors for non-commercial purposes.

  2. Experimental Studies on Dynamic Vibration Absorber using Shape Memory Alloy (NiTi) Springs

    NASA Astrophysics Data System (ADS)

    Kumar, V. Raj; Kumar, M. B. Bharathi Raj; Kumar, M. Senthil

    2011-10-01

    Shape memory alloy (SMA) springs have been used as actuators in many applications, although their use in the vibration control area is very recent. Since shape memory alloys differ from conventional alloy materials in many ways, the traditional design approach for springs is not completely suitable for designing SMA springs. Some vibration control concepts utilizing unique characteristics of SMAs are presented in this paper. A dynamic vibration absorber (DVA) using an SMA actuator is developed for attenuation of vibration in a cantilever beam. The design procedure of the DVA is presented. The system consists of a cantilever beam that is excited by a shaker to generate real-time vibration. An SMA spring is used with a mass attached to its end. The stiffness of the SMA spring is dynamically varied in such a way as to attenuate the vibration. Both simulation and experimentation are carried out using a PID controller. The experiments were carried out by interfacing the experimental setup with a computer running LabVIEW software; data acquisition and control are implemented using a PCI data acquisition card. Standard PID controllers have been used to control the vibration of the beam. Experimental results are used to demonstrate the effectiveness of the controllers designed and the usefulness of the proposed test platform by exciting the structure at resonance. In the experimental setup, an accelerometer is used to measure the vibration, which is fed to the computer, and the SMA spring is correspondingly actuated to change its stiffness and control the vibration. The results obtained illustrate that the developed DVA using an SMA actuator is very effective in reducing structural response and has great potential as an active vibration control medium.
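
    A minimal sketch of the discrete PID law used for this kind of control loop is shown below; the gains, the time step, and the toy plant response are illustrative, not taken from the paper:

    ```python
    class PID:
        """Textbook discrete PID controller; gains here are illustrative."""
        def __init__(self, kp, ki, kd, dt):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral = 0.0
            self.prev_err = 0.0

        def update(self, setpoint, measured):
            err = setpoint - measured
            self.integral += err * self.dt
            deriv = (err - self.prev_err) / self.dt
            self.prev_err = err
            return self.kp * err + self.ki * self.integral + self.kd * deriv

    # Hypothetical loop: drive the beam vibration amplitude measured by an
    # accelerometer toward zero by commanding the SMA spring stiffness.
    pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.001)
    amplitude = 1.0
    for _ in range(5):
        command = pid.update(setpoint=0.0, measured=amplitude)
        amplitude *= 0.8  # stand-in for the plant's response to the command
        print(f"command={command:.3f}, amplitude={amplitude:.3f}")
    ```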

  3. Experimental Studies on Dynamic Vibration Absorber using Shape Memory Alloy (NiTi) Springs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumar, V. Raj; Kumar, M. B. Bharathi Raj; Kumar, M. Senthil

    2011-10-20

    Shape memory alloy (SMA) springs have been used as actuators in many applications, although their use in the vibration control area is very recent. Since shape memory alloys differ from conventional alloy materials in many ways, the traditional design approach for springs is not completely suitable for designing SMA springs. Some vibration control concepts utilizing unique characteristics of SMAs are presented in this paper. A dynamic vibration absorber (DVA) using an SMA actuator is developed for attenuation of vibration in a cantilever beam. The design procedure of the DVA is presented. The system consists of a cantilever beam that is excited by a shaker to generate real-time vibration. An SMA spring is used with a mass attached to its end. The stiffness of the SMA spring is dynamically varied in such a way as to attenuate the vibration. Both simulation and experimentation are carried out using a PID controller. The experiments were carried out by interfacing the experimental setup with a computer running LabVIEW software; data acquisition and control are implemented using a PCI data acquisition card. Standard PID controllers have been used to control the vibration of the beam. Experimental results are used to demonstrate the effectiveness of the controllers designed and the usefulness of the proposed test platform by exciting the structure at resonance. In the experimental setup, an accelerometer is used to measure the vibration, which is fed to the computer, and the SMA spring is correspondingly actuated to change its stiffness and control the vibration. The results obtained illustrate that the developed DVA using an SMA actuator is very effective in reducing structural response and has great potential as an active vibration control medium.

  4. Workload Characterization of CFD Applications Using Partial Differential Equation Solvers

    NASA Technical Reports Server (NTRS)

    Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    Workload characterization is used for modeling and evaluating computing systems at different levels of detail. We present workload characterization for a class of Computational Fluid Dynamics (CFD) applications that solve Partial Differential Equations (PDEs). This workload characterization focuses on three high-performance computing platforms: the SGI Origin2000, the IBM SP-2, and a cluster of Intel Pentium Pro based PCs. We execute extensive measurement-based experiments on these platforms to gather statistics of system resource usage, which result in the workload characterization. Our workload characterization approach yields a coarse-grain resource utilization behavior that is being applied for performance modeling and evaluation of distributed high-performance metacomputing systems. In addition, this study enhances our understanding of interactions between PDE solver workloads and high-performance computing platforms and is useful for tuning these applications.

  5. Dual cameras acquisition and display system of retina-like sensor camera and rectangular sensor camera

    NASA Astrophysics Data System (ADS)

    Cao, Nan; Cao, Fengmei; Lin, Yabin; Bai, Tingzhu; Song, Shengyu

    2015-04-01

    For a new kind of retina-like sensor camera and a traditional rectangular sensor camera, a dual-camera acquisition and display system needs to be built. We introduce the principle and development of the retina-like sensor. Image coordinate transformation and interpolation based on sub-pixel interpolation need to be realized for the retina-like sensor's special pixel distribution. The hardware platform is composed of the retina-like sensor camera, the rectangular sensor camera, an image grabber, and a PC. Combining the MIL and OpenCV libraries, the software is written in VC++ on VS 2010. Experimental results show that the system realizes acquisition and display for both cameras.
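
    The abstract does not give the retina-like pixel layout; a minimal sketch of the coordinate transformation with bilinear (sub-pixel) interpolation, assuming concentric rings and sectors, could look like this:

    ```python
    import numpy as np

    def bilinear(img, x, y):
        """Sample img at fractional (x, y) using bilinear interpolation."""
        x0, y0 = int(np.floor(x)), int(np.floor(y))
        x1 = min(x0 + 1, img.shape[1] - 1)
        y1 = min(y0 + 1, img.shape[0] - 1)
        fx, fy = x - x0, y - y0
        top = (1 - fx) * img[y0, x0] + fx * img[y0, x1]
        bottom = (1 - fx) * img[y1, x0] + fx * img[y1, x1]
        return (1 - fy) * top + fy * bottom

    def resample_to_rings(img, n_rings=64, n_sectors=128):
        """Map a rectangular frame onto assumed ring/sector coordinates."""
        h, w = img.shape
        cx, cy, r_max = w / 2.0, h / 2.0, min(w, h) / 2.0 - 1
        out = np.zeros((n_rings, n_sectors))
        for i in range(n_rings):
            r = r_max * (i + 1) / n_rings  # real sensors may space rings non-linearly
            for j in range(n_sectors):
                theta = 2 * np.pi * j / n_sectors
                out[i, j] = bilinear(img, cx + r * np.cos(theta),
                                     cy + r * np.sin(theta))
        return out

    print(resample_to_rings(np.random.rand(128, 128)).shape)  # (64, 128)
    ```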

  6. Mission specification for three generic mission classes

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Mission specifications for three generic mission classes are generated to provide a baseline for definition and analysis of data acquisition platform system concepts. The mission specifications define compatible groupings of sensors that satisfy specific earth resources and environmental mission objectives. The driving force behind the definition of sensor groupings is mission need; platform and space transportation system constraints are of secondary importance. The three generic mission classes are: (1) low earth orbit sun-synchronous; (2) geosynchronous; and (3) non-sun-synchronous, nongeosynchronous. These missions are chosen to provide a variety of sensor complements and implementation concepts. Each mission specification relates mission categories, mission objectives, measured parameters, and candidate sensors to orbits and coverage, operations compatibility, and platform fleet size.

  7. A wellness software platform with smart wearable devices and the demonstration report for personal wellness management

    NASA Astrophysics Data System (ADS)

    Kang, Won-Seok; Son, Chang-Sik; Lee, Sangho; Choi, Rock-Hyun; Ha, Yeong-Mi

    2017-07-01

    In this paper, we introduce a wellness software platform called WellnessHumanCare, a semi-automatic wellness management software platform that provides complex wellness data acquisition (mental, physical, and environmental) with smart wearable devices, complex wellness condition analysis, privacy-aware online/offline recommendation, real-time monitoring apps (smartphone-based and Web-based), and so on. We demonstrated a wellness management service over 3 months with 79 participants (experimental group: 39, from H Corp.; control group: 40, from K Corp.), Korea, to show the efficiency of WellnessHumanCare.

  8. Application of a Smartphone Metabolomics Platform to the Authentication of Schisandra sinensis.

    PubMed

    Kwon, Hyuk Nam; Phan, Hong-Duc; Xu, Wen Jun; Ko, Yoon-Joo; Park, Sunghyouk

    2016-05-01

    Herbal medicines have been used for a long time all around the world. Since the quality of herbal preparations depends on the source of the herbal materials, there has been a strong need for methods to correctly identify the origin of those materials. Our aim was to develop a smartphone metabolomics platform as a simpler, low-cost alternative for the identification of herbal material sources. Schisandra sinensis extracts from Korea and China were prepared, and the visible spectra of all samples were measured with a smartphone spectrometer platform. This platform has all the measures needed for metabolomics research built in: data acquisition, processing, chemometric analysis, and visualisation of the results. The results of the smartphone metabolomics platform were compared with those of NMR-based metabolomics. The smartphone platform gave similar results to the NMR method, showing good separation between Korean and Chinese materials and correct predictions for all test samples, suggesting the feasibility of the smartphone platform for metabolomics research. With its accuracy and its advantages of affordability, user-friendliness, and portability, the smartphone metabolomics platform could be applied to the authentication of other medicinal plants. Copyright © 2016 John Wiley & Sons, Ltd.
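
    The abstract mentions built-in chemometric analysis; a common choice for separating two origins is principal component analysis, sketched here on synthetic spectra (the data and the use of plain SVD-based PCA are assumptions, not details from the paper):

    ```python
    import numpy as np

    def pca_scores(X, n_components=2):
        """Project mean-centered spectra onto their first principal components."""
        Xc = X - X.mean(axis=0)
        U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
        return Xc @ Vt[:n_components].T

    # Toy stand-in for visible spectra of samples from two origins.
    rng = np.random.default_rng(0)
    korea = rng.normal(1.0, 0.05, size=(10, 200))
    china = rng.normal(1.2, 0.05, size=(10, 200))
    scores = pca_scores(np.vstack([korea, china]))
    # The two groups separate along the first principal component.
    print(scores[:10, 0].mean(), scores[10:, 0].mean())
    ```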

  9. The Design of a High Performance Earth Imagery and Raster Data Management and Processing Platform

    NASA Astrophysics Data System (ADS)

    Xie, Qingyun

    2016-06-01

    This paper summarizes the general requirements and specific characteristics of both a geospatial raster database management system and a raster data processing platform, from a domain-specific perspective as well as from a computing point of view. It also discusses the need for tight integration between the database system and the processing system. These requirements resulted in Oracle Spatial GeoRaster, a global-scale and high-performance earth imagery and raster data management and processing platform. The rationale, design, implementation, and benefits of Oracle Spatial GeoRaster are described. Basically, as a database management system, GeoRaster defines an integrated raster data model, supports image compression, data manipulation, general and spatial indices, content- and context-based queries and updates, versioning, concurrency, security, replication, standby, backup and recovery, multitenancy, and ETL. It provides high scalability using computer and storage clustering. As a raster data processing platform, GeoRaster provides basic operations, image processing, raster analytics, and data distribution featuring high performance computing (HPC). Specifically, HPC features include locality computing, concurrent processing, parallel processing, and in-memory computing. In addition, the APIs and the plug-in architecture are discussed.

  10. Research of aerial imaging spectrometer data acquisition technology based on USB 3.0

    NASA Astrophysics Data System (ADS)

    Huang, Junze; Wang, Yueming; He, Daogang; Yu, Yanan

    2016-11-01

    With the emergence of UAV (unmanned aerial vehicle) platforms for aerial imaging spectrometers, research on aerial imaging spectrometer DAS (data acquisition systems) faces new challenges. Due to platform limitations and other factors, an aerial imaging spectrometer DAS must be small, lightweight, low-cost, and universal. Traditional PCIe-based aerial imaging spectrometer DAS systems are expensive, bulky, non-universal, and do not support plug-and-play, so they cannot meet the needs of promoting and applying aerial imaging spectrometers. To solve these problems, the new data acquisition scheme is based on a USB 3.0 interface. USB 3.0 can guarantee a small, lightweight, low-cost, and universal design thanks to its technological advantages: USB 3.0 transmission is theoretically up to 5 Gbps, and the GPIF programming interface achieves an effective theoretical data bandwidth of 3.2 Gbps, which fully meets the data transmission rate needs of the aerial imaging spectrometer. The scheme uses the slave FIFO asynchronous data transmission mode between the FPGA and the USB3014 interface chip. First, the system collects spectral data from the TLK2711 high-speed serial interface chip. Then the FPGA buffers the data in a DDR2 cache after ping-pong data processing. Finally, the USB3014 interface chip transmits the data via an automatic DMA approach and uploads it to a PC over a USB 3.0 cable. During the manufacture of an aerial imaging spectrometer, the DAS can perform image acquisition, transmission, storage, and display; all of these functions provide the necessary test and detection capabilities for the aerial imaging spectrometer. Tests show that the system performs stably with no data loss; the average transmission speed and the speed of writing to SSD storage stabilize at 1.28 Gbps. Consequently, this data acquisition system meets the application requirements of aerial imaging spectrometers.
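
    A minimal sketch of the ping-pong (double) buffering idea, emulated with two Python threads in place of the FPGA/DDR2 hardware path, is shown below; buffer sizes and frame contents are placeholders:

    ```python
    import queue
    import threading

    # Two fixed buffers alternate roles: while one is being filled, the
    # other is drained -- a thread-based emulation of the ping-pong path.
    free_bufs, full_bufs = queue.Queue(), queue.Queue()
    for buf in (bytearray(1024), bytearray(1024)):
        free_bufs.put(buf)

    def producer(n_frames):
        for i in range(n_frames):
            buf = free_bufs.get()   # wait for an empty buffer
            buf[0] = i % 256        # stand-in for a DMA fill
            full_bufs.put(buf)
        full_bufs.put(None)         # end-of-stream marker

    def consumer():
        while (buf := full_bufs.get()) is not None:
            print("uploaded frame", buf[0])  # stand-in for the USB transfer
            free_bufs.put(buf)      # recycle the buffer

    threading.Thread(target=producer, args=(4,)).start()
    consumer()
    ```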

  11. Acquisition and long-term retention of spatial learning in the human immunodeficiency virus-1 transgenic rat: Effects of repeated nicotine treatment

    PubMed Central

    Vigorito, Michael; Cao, Junran; Li, Ming D.; Chang, Sulie L.

    2013-01-01

    The HIV-1 transgenic (HIV-1Tg) rat shows a deficit in learning to locate a submerged platform in a multiple-trial water maze task compared to transgenic littermate and F344 control rats (Vigorito et al. 2008; Lashomb et al. 2009). Nicotine is known to have neuroprotective effects, possibly by minimizing the cytotoxic effects of glutamate or by modulating a cholinergic anti-inflammatory pathway. Nicotine also improves performance in a variety of learning tasks by enhancing attention and short-term memory (STM). The purpose of this study was to determine whether the learning deficit in HIV-1Tg rats is ameliorated by repeated nicotine treatment independent of its effects on STM. HIV-1Tg and F344 rats were treated subcutaneously with nicotine (0.25 mg/kg/injection) or saline twice daily and tested in a single-trial-per-day procedure, which precludes the impact of STM on the acquisition of the spatial learning task. HIV-1Tg rats showed a deficit in the acquisition of the task and in long-term retention of the platform location in a probe test. Nicotine did not ameliorate the deficit in HIV-1Tg rats and slightly worsened performance during acquisition. Analysis of individual differences in performance during the probe test suggested that nicotine improved performance in some F344 rats but not in HIV-1Tg rats. These results indicate that a deficit in the consolidation of long-term memory (LTM) contributes to the acquisition deficit of HIV-1Tg rats. The results, however, do not provide any evidence of amelioration of the learning deficit observed in this behavioral model, at least at the nicotine dose tested. PMID:23456952

  12. Blue guardian: an open architecture for rapid ISR demonstration

    NASA Astrophysics Data System (ADS)

    Barrett, Donald A.; Borntrager, Luke A.; Green, David M.

    2016-05-01

    Throughout the Department of Defense (DoD), acquisition, platform integration, and life cycle costs for weapons systems have continued to rise. Although Open Architecture (OA) interface standards are one of the primary methods being used to reduce these costs, the Air Force Rapid Capabilities Office (AFRCO) has extended the OA concept and chartered the Open Mission System (OMS) initiative with industry to develop and demonstrate a consensus-based, non-proprietary, OA standard for integrating subsystems and services into airborne platforms. The new OMS standard provides the capability to decouple vendor-specific sensors, payloads, and service implementations from platform-specific architectures and is still in the early stages of maturation and demonstration. The Air Force Research Laboratory (AFRL) - Sensors Directorate has developed the Blue Guardian program to demonstrate advanced sensing technology utilizing open architectures in operationally relevant environments. Over the past year, Blue Guardian has developed a platform architecture using the Air Force's OMS reference architecture and conducted a ground and flight test program of multiple payload combinations. Systems tested included a vendor-unique variety of Full Motion Video (FMV) systems, a Wide Area Motion Imagery (WAMI) system, a multi-mode radar system, processing and database functions, multiple decompression algorithms, multiple communications systems, and a suite of software tools. Initial results of the Blue Guardian program show the promise of OA to DoD acquisitions, especially for Intelligence, Surveillance and Reconnaissance (ISR) payload applications. Specifically, the OMS reference architecture was extremely useful in reducing the cost and time required for integrating new systems.

  13. The Generation Challenge Programme Platform: Semantic Standards and Workbench for Crop Science

    PubMed Central

    Bruskiewich, Richard; Senger, Martin; Davenport, Guy; Ruiz, Manuel; Rouard, Mathieu; Hazekamp, Tom; Takeya, Masaru; Doi, Koji; Satoh, Kouji; Costa, Marcos; Simon, Reinhard; Balaji, Jayashree; Akintunde, Akinnola; Mauleon, Ramil; Wanchana, Samart; Shah, Trushar; Anacleto, Mylah; Portugal, Arllet; Ulat, Victor Jun; Thongjuea, Supat; Braak, Kyle; Ritter, Sebastian; Dereeper, Alexis; Skofic, Milko; Rojas, Edwin; Martins, Natalia; Pappas, Georgios; Alamban, Ryan; Almodiel, Roque; Barboza, Lord Hendrix; Detras, Jeffrey; Manansala, Kevin; Mendoza, Michael Jonathan; Morales, Jeffrey; Peralta, Barry; Valerio, Rowena; Zhang, Yi; Gregorio, Sergio; Hermocilla, Joseph; Echavez, Michael; Yap, Jan Michael; Farmer, Andrew; Schiltz, Gary; Lee, Jennifer; Casstevens, Terry; Jaiswal, Pankaj; Meintjes, Ayton; Wilkinson, Mark; Good, Benjamin; Wagner, James; Morris, Jane; Marshall, David; Collins, Anthony; Kikuchi, Shoshi; Metz, Thomas; McLaren, Graham; van Hintum, Theo

    2008-01-01

    The Generation Challenge programme (GCP) is a global crop research consortium directed toward crop improvement through the application of comparative biology and genetic resources characterization to plant breeding. A key consortium research activity is the development of a GCP crop bioinformatics platform to support GCP research. This platform includes the following: (i) shared, public platform-independent domain models, ontology, and data formats to enable interoperability of data and analysis flows within the platform; (ii) web service and registry technologies to identify, share, and integrate information across diverse, globally dispersed data sources, as well as to access high-performance computational (HPC) facilities for computationally intensive, high-throughput analyses of project data; (iii) platform-specific middleware reference implementations of the domain model integrating a suite of public (largely open-access/-source) databases and software tools into a workbench to facilitate biodiversity analysis, comparative analysis of crop genomic data, and plant breeding decision making. PMID:18483570

  14. Social Computing as Next-Gen Learning Paradigm: A Platform and Applications

    NASA Astrophysics Data System (ADS)

    Margherita, Alessandro; Taurino, Cesare; Del Vecchio, Pasquale

    As a field at the intersection of computer science and human behavior, social computing can contribute significantly to the endeavor of innovating how individuals and groups interact for learning and working purposes. In particular, the generation of Internet applications tagged as web 2.0 provides an opportunity to create new “environments” where people can exchange knowledge and experience, create new knowledge, and learn together. This chapter illustrates the design and application of a prototype platform which embeds tools such as blogs, wikis, folksonomies, and RSS in a single web-based system. This platform has been developed to support a case-based and project-driven learning strategy for the development of business and technology management competencies in undergraduate and graduate education programs. A set of illustrative scenarios is described to show how a learning community can be promoted, created, and sustained through the technological platform.

  15. A generic, cost-effective, and scalable cell lineage analysis platform

    PubMed Central

    Biezuner, Tamir; Spiro, Adam; Raz, Ofir; Amir, Shiran; Milo, Lilach; Adar, Rivka; Chapal-Ilani, Noa; Berman, Veronika; Fried, Yael; Ainbinder, Elena; Cohen, Galit; Barr, Haim M.; Halaban, Ruth; Shapiro, Ehud

    2016-01-01

    Advances in single-cell genomics enable commensurate improvements in methods for uncovering lineage relations among individual cells. Current sequencing-based methods for cell lineage analysis depend on low-resolution bulk analysis or rely on extensive single-cell sequencing, which is not scalable and could be biased by functional dependencies. Here we show an integrated biochemical-computational platform for generic single-cell lineage analysis that is retrospective, cost-effective, and scalable. It consists of a biochemical-computational pipeline that inputs individual cells, produces targeted single-cell sequencing data, and uses it to generate a lineage tree of the input cells. We validated the platform by applying it to cells sampled from an ex vivo grown tree and analyzed its feasibility landscape by computer simulations. We conclude that the platform may serve as a generic tool for lineage analysis and thus pave the way toward large-scale human cell lineage discovery. PMID:27558250

  16. An interactive parallel programming environment applied in atmospheric science

    NASA Technical Reports Server (NTRS)

    vonLaszewski, G.

    1996-01-01

    This article introduces an interactive parallel programming environment (IPPE) that simplifies the generation and execution of parallel programs. One of the tasks of the environment is to generate message-passing parallel programs for homogeneous and heterogeneous computing platforms. The parallel programs are represented by using visual objects. This is accomplished with the help of a graphical programming editor that is implemented in Java and enables portability to a wide variety of computer platforms. In contrast to other graphical programming systems, reusable parts of the programs can be stored in a program library to support rapid prototyping. In addition, runtime performance data on different computing platforms is collected in a database. A selection process determines dynamically the software and the hardware platform to be used to solve the problem in minimal wall-clock time. The environment is currently being tested on a Grand Challenge problem, the NASA four-dimensional data assimilation system.

  17. Ku-Band rendezvous radar performance computer simulation model

    NASA Technical Reports Server (NTRS)

    Magnusson, H. G.; Goff, M. F.

    1984-01-01

    All work performed on the Ku-band rendezvous radar performance computer simulation model program since the release of the preliminary final report is summarized. Developments on the program fall into three distinct categories: (1) modifications to the existing Ku-band radar tracking performance computer model; (2) the addition of a highly accurate, nonrealtime search and acquisition performance computer model to the total software package developed on this program; and (3) development of radar cross section (RCS) computation models for three additional satellites. All changes in the tracking model involved improvements in the automatic gain control (AGC) and the radar signal strength (RSS) computer models. Although the search and acquisition computer models were developed under the auspices of the Hughes Aircraft Company Ku-Band Integrated Radar and Communications Subsystem program office, they have been supplied to NASA as part of the Ku-band radar performance computer model package. Their purpose is to predict Ku-band acquisition performance for specific satellite targets on specific missions. The RCS models were developed for three satellites: the Long Duration Exposure Facility (LDEF) spacecraft, the Solar Maximum Mission (SMM) spacecraft, and the Space Telescope.

  18. Ku-Band rendezvous radar performance computer simulation model

    NASA Astrophysics Data System (ADS)

    Magnusson, H. G.; Goff, M. F.

    1984-06-01

    All work performed on the Ku-band rendezvous radar performance computer simulation model program since the release of the preliminary final report is summarized. Developments on the program fall into three distinct categories: (1) modifications to the existing Ku-band radar tracking performance computer model; (2) the addition of a highly accurate, nonrealtime search and acquisition performance computer model to the total software package developed on this program; and (3) development of radar cross section (RCS) computation models for three additional satellites. All changes in the tracking model involved improvements in the automatic gain control (AGC) and the radar signal strength (RSS) computer models. Although the search and acquisition computer models were developed under the auspices of the Hughes Aircraft Company Ku-Band Integrated Radar and Communications Subsystem program office, they have been supplied to NASA as part of the Ku-band radar performance computer model package. Their purpose is to predict Ku-band acquisition performance for specific satellite targets on specific missions. The RCS models were developed for three satellites: the Long Duration Exposure Facility (LDEF) spacecraft, the Solar Maximum Mission (SMM) spacecraft, and the Space Telescope.

  19. Scalable and massively parallel Monte Carlo photon transport simulations for heterogeneous computing platforms

    NASA Astrophysics Data System (ADS)

    Yu, Leiming; Nina-Paravecino, Fanny; Kaeli, David; Fang, Qianqian

    2018-01-01

    We present a highly scalable Monte Carlo (MC) three-dimensional photon transport simulation platform designed for heterogeneous computing systems. Through the development of a massively parallel MC algorithm using the Open Computing Language framework, this research extends our existing graphics processing unit (GPU)-accelerated MC technique to a highly scalable vendor-independent heterogeneous computing environment, achieving significantly improved performance and software portability. A number of parallel computing techniques are investigated to achieve portable performance over a wide range of computing hardware. Furthermore, multiple thread-level and device-level load-balancing strategies are developed to obtain efficient simulations using multiple central processing units and GPUs.
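
    One simple device-level load-balancing strategy is to split the photon budget in proportion to measured device throughput; the sketch below illustrates only that idea (the paper evaluates several thread- and device-level strategies, and the speed figures here are invented):

    ```python
    def partition_photons(total_photons, device_speeds):
        """Split a photon budget across devices in proportion to speed."""
        total_speed = sum(device_speeds.values())
        shares = {dev: int(total_photons * s / total_speed)
                  for dev, s in device_speeds.items()}
        # Give any rounding remainder to the fastest device.
        fastest = max(device_speeds, key=device_speeds.get)
        shares[fastest] += total_photons - sum(shares.values())
        return shares

    # Hypothetical relative throughputs for one CPU and two GPUs.
    print(partition_photons(10**8, {"cpu": 1.0, "gpu0": 9.5, "gpu1": 9.0}))
    ```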

  20. Cellular computational platform and neurally inspired elements thereof

    DOEpatents

    Okandan, Murat

    2016-11-22

    A cellular computational platform is disclosed that includes a multiplicity of functionally identical, repeating computational hardware units that are interconnected electrically and optically. Each computational hardware unit includes a reprogrammable local memory and has interconnections to other such units that have reconfigurable weights. Each computational hardware unit is configured to transmit signals into the network for broadcast in a protocol-less manner to other such units in the network, and to respond to protocol-less broadcast messages that it receives from the network. Each computational hardware unit is further configured to reprogram the local memory in response to incoming electrical and/or optical signals.

  1. The Surveillance and On-demand Sentinel-1 SBAS Services on the Geohazards Exploitation Platforms

    NASA Astrophysics Data System (ADS)

    Casu, F.; de Luca, C.; Zinno, I.; Manunta, M.; Lanari, R.

    2017-12-01

    The Geohazards Exploitation Platform (GEP) is an ESA R&D activity of the EO ground segment to demonstrate the benefit of new technologies for large-scale processing of EO data. GEP aims at providing on-demand processing services for specific user needs, as well as systematic processing services to address the need of the geohazards community for common information layers and, finally, to integrate newly developed processors for scientists and other expert users. In this context, a crucial role is played by the recently launched Sentinel-1 (S1) constellation that, with its global acquisition policy, has flooded the scientific community with a huge amount of data acquired over a large part of the Earth on a regular basis (down to 6 days with both Sentinel-1A and 1B passes). The Sentinel-1 data, as part of the European Copernicus program, are openly and freely accessible, thus fostering their use for the development of automated and systematic tools for Earth surface monitoring. In particular, due to their specific SAR Interferometry (InSAR) design, Sentinel-1 satellites can be exploited to build up operational services for the easy and rapid generation of advanced InSAR products useful for risk management and natural hazard monitoring. In this work we present the activities carried out for the development, integration, and deployment of two SBAS Sentinel-1 services of CNR-IREA within the GEP framework, namely the Surveillance and On-demand services. The Surveillance service consists of the systematic and automatic processing of Sentinel-1 data over selected Areas of Interest (AoI) to generate updated surface displacement time series via the SBAS-InSAR algorithm. We built a system that is automatically triggered by every new Sentinel-1 acquisition over the AoI, once it is available in the S1 catalogue. The system then processes only the new acquisitions, thus saving storage space and computing time. The processing, which relies on the parallel version of the SBAS (P-SBAS) chain, allows us to effectively perform massive, systematic, and automatic analysis of S1 SAR data. It is worth noting that the SBAS Sentinel-1 services on GEP represent the core of the EPOSAR service, which will deliver S1 displacement time series of the Earth surface for the European Plate Observing System (EPOS) Research Infrastructure community.
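
    The incremental triggering described above (processing only the new acquisitions) can be sketched as keeping a persistent set of already-processed acquisition identifiers; the file name and identifiers below are hypothetical:

    ```python
    import json
    import pathlib

    STATE = pathlib.Path("processed_acquisitions.json")  # hypothetical state file

    def new_acquisitions(catalogue_ids):
        """Return only acquisitions not yet processed and update local state."""
        done = set(json.loads(STATE.read_text())) if STATE.exists() else set()
        fresh = [a for a in catalogue_ids if a not in done]
        STATE.write_text(json.dumps(sorted(done | set(fresh))))
        return fresh

    # First call processes everything; the second sees only the new scene.
    print(new_acquisitions(["S1A_IW_20230101", "S1B_IW_20230107"]))
    print(new_acquisitions(["S1A_IW_20230101", "S1A_IW_20230113"]))
    ```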

  2. Software for Acquiring Image Data for PIV

    NASA Technical Reports Server (NTRS)

    Wernet, Mark P.; Cheung, H. M.; Kressler, Brian

    2003-01-01

    PIV Acquisition (PIVACQ) is a computer program for acquisition of data for particle-image velocimetry (PIV). In the PIV system for which PIVACQ was developed, small particles entrained in a flow are illuminated with a sheet of light from a pulsed laser. The illuminated region is monitored by a charge-coupled-device camera that operates in conjunction with a data-acquisition system that includes a frame grabber and a counter-timer board, both installed in a single computer. The camera operates in "frame-straddle" mode, where a pair of images can be obtained closely spaced in time (on the order of microseconds). The frame grabber acquires image data from the camera and stores the data in the computer memory. The counter/timer board triggers the camera and synchronizes the pulsing of the laser with acquisition of data from the camera. PIVACQ coordinates all of these functions and provides a graphical user interface, through which the user can control the PIV data-acquisition system. PIVACQ enables the user to acquire a sequence of single-exposure images, display the images, process the images, and then save the images to the computer hard drive. PIVACQ works in conjunction with the PIVPROC program which processes the images of particles into the velocity field in the illuminated plane.
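
    Downstream of acquisition, PIV processing typically estimates displacement by cross-correlating the frame-straddled image pair; a minimal FFT-based sketch (not the actual PIVPROC code) follows:

    ```python
    import numpy as np

    def piv_displacement(frame_a, frame_b):
        """Locate the peak of the FFT cross-correlation of an image pair."""
        a = frame_a - frame_a.mean()
        b = frame_b - frame_b.mean()
        corr = np.fft.ifft2(np.fft.fft2(b) * np.conj(np.fft.fft2(a))).real
        dy, dx = np.unravel_index(int(np.argmax(corr)), corr.shape)
        if dy > corr.shape[0] // 2:  # map wrapped indices to signed shifts
            dy -= corr.shape[0]
        if dx > corr.shape[1] // 2:
            dx -= corr.shape[1]
        return dx, dy

    rng = np.random.default_rng(2)
    a = rng.random((64, 64))
    b = np.roll(a, (3, 5), axis=(0, 1))  # "particles" moved 5 px right, 3 px down
    print(piv_displacement(a, b))        # (5, 3)
    ```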

  3. Computer-Aided Process and Tools for Mobile Software Acquisition

    DTIC Science & Technology

    2013-07-30

    Computer-Aided Process and Tools for Mobile Software Acquisition. 30 July 2013. LT Christopher Bonine, USN, Dr...Christopher Bonine is a lieutenant in the United States Navy. He is currently assigned to the Navy Cyber Defense Operations Command in Norfolk, VA. He has...interests are in development and implementation of cyber security policy. Bonine has a master's in computer science from the Naval Postgraduate School

  4. Fast-Acquisition/Weak-Signal-Tracking GPS Receiver for HEO

    NASA Technical Reports Server (NTRS)

    Wintemitz, Luke; Boegner, Greg; Sirotzky, Steve

    2004-01-01

    A report discusses the technical background and design of the Navigator Global Positioning System (GPS) receiver -- a radiation-hardened receiver intended for use aboard spacecraft. Navigator is capable of weak-signal acquisition and tracking as well as much faster acquisition of strong or weak signals with no a priori knowledge or external aiding. Weak-signal acquisition and tracking enables GPS use in high Earth orbits (HEO), and fast acquisition allows the receiver to remain without power until needed in any orbit. Signal acquisition and signal tracking are, respectively, the processes of finding and demodulating a signal. Acquisition is the more computationally difficult process. Previous GPS receivers employ the method of sequentially searching the two-dimensional signal parameter space (code phase and Doppler). Navigator exploits properties of the Fourier transform in a massively parallel search for the GPS signal. This method results in far faster acquisition times [in the lab, 12 GPS satellites have been acquired with no a priori knowledge in a low-Earth-orbit (LEO) scenario in less than one second]. Modeling has shown that Navigator will be capable of acquiring signals down to 25 dB-Hz, appropriate for HEO missions. Navigator is built using the radiation-hardened ColdFire microprocessor, with the most computationally intense functions housed in dedicated field-programmable gate arrays. The high performance of the algorithm and of the receiver as a whole is made possible by optimizing computational efficiency and carefully weighing tradeoffs among the sampling rate, data format, and data-path bit width.
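
    The FFT-based parallel search mentioned above exploits the fact that circular correlation against every code phase is a single FFT/IFFT pair, leaving only a loop over Doppler bins; a minimal sketch with a synthetic signal follows (sampling rate, code, and Doppler grid are illustrative):

    ```python
    import numpy as np

    def acquire(signal, code, doppler_bins, fs):
        """FFT-based parallel code-phase search over a Doppler grid."""
        n = len(code)
        t = np.arange(n) / fs
        code_fft = np.conj(np.fft.fft(code))
        best_power, best_fd, best_phase = 0.0, None, None
        for fd in doppler_bins:
            wiped = signal * np.exp(-2j * np.pi * fd * t)  # Doppler wipe-off
            corr = np.abs(np.fft.ifft(np.fft.fft(wiped) * code_fft))
            k = int(np.argmax(corr))
            if corr[k] > best_power:
                best_power, best_fd, best_phase = corr[k], fd, k
        return best_fd, best_phase

    # Synthetic test: a +/-1 code delayed by 100 samples with 1 kHz Doppler.
    rng = np.random.default_rng(1)
    code = rng.choice([-1.0, 1.0], size=4096)
    fs = 4.096e6
    t = np.arange(4096) / fs
    rx = np.roll(code, 100) * np.exp(2j * np.pi * 1000 * t)
    print(acquire(rx, code, range(-5000, 5001, 500), fs))  # ~ (1000, 100)
    ```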

  5. Is CALL Obsolete? Language Acquisition and Language Learning Revisited in a Digital Age

    ERIC Educational Resources Information Center

    Jarvis, Huw; Krashen, Stephen

    2014-01-01

    In this article, Huw Jarvis and Stephen Krashen ask "Is CALL Obsolete?" When the term CALL (Computer-Assisted Language Learning) was introduced in the 1960s, the language education profession knew only about language learning, not language acquisition, and assumed the computer's primary contribution to second language acquisition…

  6. A Survey of Knowledge Management Skills Acquisition in an Online Team-Based Distributed Computing Course

    ERIC Educational Resources Information Center

    Thomas, Jennifer D. E.

    2007-01-01

    This paper investigates students' perceptions of their acquisition of knowledge management skills, namely thinking and team-building skills, resulting from the integration of various resources and technologies into an entirely team-based, online upper level distributed computing (DC) information systems (IS) course. Results seem to indicate that…

  7. Authenticity and Authorship in the Computer-Mediated Acquisition of L2 Literacy.

    ERIC Educational Resources Information Center

    Kramsch, Claire

    2000-01-01

    Examines two tenets of communicative language teaching--authenticity of the input and authorship of the language user--in an electronic environment. Reviews research in textually-mediated second language acquisition and analyzes two cases of computer-mediated language learning: the construction of a multimedia CD-ROM by American college learners…

  8. An Empirical Generative Framework for Computational Modeling of Language Acquisition

    ERIC Educational Resources Information Center

    Waterfall, Heidi R.; Sandbank, Ben; Onnis, Luca; Edelman, Shimon

    2010-01-01

    This paper reports progress in developing a computer model of language acquisition in the form of (1) a generative grammar that is (2) algorithmically learnable from realistic corpus data, (3) viable in its large-scale quantitative performance and (4) psychologically real. First, we describe new algorithmic methods for unsupervised learning of…

  9. A Parallel Distributed-Memory Particle Method Enables Acquisition-Rate Segmentation of Large Fluorescence Microscopy Images

    PubMed Central

    Afshar, Yaser; Sbalzarini, Ivo F.

    2016-01-01

    Modern fluorescence microscopy modalities, such as light-sheet microscopy, are capable of acquiring large three-dimensional images at high data rate. This creates a bottleneck in computational processing and analysis of the acquired images, as the rate of acquisition outpaces the speed of processing. Moreover, images can be so large that they do not fit the main memory of a single computer. We address both issues by developing a distributed parallel algorithm for segmentation of large fluorescence microscopy images. The method is based on the versatile Discrete Region Competition algorithm, which has previously proven useful in microscopy image segmentation. The present distributed implementation decomposes the input image into smaller sub-images that are distributed across multiple computers. Using network communication, the computers orchestrate the collective solving of the global segmentation problem. This not only enables segmentation of large images (we test images of up to 10^10 pixels), but also accelerates segmentation to match the time scale of image acquisition. Such acquisition-rate image segmentation is a prerequisite for the smart microscopes of the future and enables online data compression and interactive experiments. PMID:27046144
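
    A much-simplified sketch of the decomposition idea, splitting an image into overlapping strips and segmenting them in parallel with a trivial stand-in for Discrete Region Competition, follows; the real method also exchanges halo updates between computers over the network:

    ```python
    import numpy as np
    from multiprocessing import Pool

    HALO = 1  # ghost-layer width shared between neighboring strips

    def split_with_halo(img, n_strips):
        """Cut an image into horizontal strips that overlap by HALO rows."""
        bounds = np.linspace(0, img.shape[0], n_strips + 1, dtype=int)
        return [(lo, hi, img[max(lo - HALO, 0):min(hi + HALO, img.shape[0])])
                for lo, hi in zip(bounds[:-1], bounds[1:])]

    def segment_strip(args):
        lo, hi, strip = args
        return lo, hi, strip > strip.mean()  # trivial stand-in for segmentation

    if __name__ == "__main__":
        img = np.random.rand(1000, 1000)
        with Pool(4) as pool:
            results = pool.map(segment_strip, split_with_halo(img, 4))
        print([(lo, hi, mask.shape) for lo, hi, mask in results])
    ```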

  10. A Parallel Distributed-Memory Particle Method Enables Acquisition-Rate Segmentation of Large Fluorescence Microscopy Images.

    PubMed

    Afshar, Yaser; Sbalzarini, Ivo F

    2016-01-01

    Modern fluorescence microscopy modalities, such as light-sheet microscopy, are capable of acquiring large three-dimensional images at high data rate. This creates a bottleneck in computational processing and analysis of the acquired images, as the rate of acquisition outpaces the speed of processing. Moreover, images can be so large that they do not fit the main memory of a single computer. We address both issues by developing a distributed parallel algorithm for segmentation of large fluorescence microscopy images. The method is based on the versatile Discrete Region Competition algorithm, which has previously proven useful in microscopy image segmentation. The present distributed implementation decomposes the input image into smaller sub-images that are distributed across multiple computers. Using network communication, the computers orchestrate the collective solving of the global segmentation problem. This not only enables segmentation of large images (we test images of up to 10(10) pixels), but also accelerates segmentation to match the time scale of image acquisition. Such acquisition-rate image segmentation is a prerequisite for the smart microscopes of the future and enables online data compression and interactive experiments.

  11. Acquisition of gamma camera and physiological data by computer.

    PubMed

    Hack, S N; Chang, M; Line, B R; Cooper, J A; Robeson, G H

    1986-11-01

    We have designed, implemented, and tested a new Research Data Acquisition System (RDAS) that permits a general purpose digital computer to acquire signals from both gamma camera sources and physiological signal sources concurrently. This system overcomes the limited multi-source, high speed data acquisition capabilities found in most clinically oriented nuclear medicine computers. The RDAS can simultaneously input signals from up to four gamma camera sources with a throughput of 200 kHz per source and from up to eight physiological signal sources with an aggregate throughput of 50 kHz. Rigorous testing has found the RDAS to exhibit acceptable linearity and timing characteristics. In addition, flood images obtained by this system were compared with flood images acquired by a commercial nuclear medicine computer system. National Electrical Manufacturers Association performance standards of the flood images were found to be comparable.

  12. The Effect of In-Service Training of Computer Science Teachers on Scratch Programming Language Skills Using an Electronic Learning Platform on Programming Skills and the Attitudes towards Teaching Programming

    ERIC Educational Resources Information Center

    Alkaria, Ahmed; Alhassan, Riyadh

    2017-01-01

    This study was conducted to examine the effect of in-service training of computer science teachers in the Scratch language, using an electronic learning platform, on acquiring programming skills and on attitudes towards teaching programming. The sample of this study consisted of 40 middle school computer science teachers. They were assigned into two…

  13. SarcOptiM for ImageJ: high-frequency online sarcomere length computing on stimulated cardiomyocytes.

    PubMed

    Pasqualin, Côme; Gannier, François; Yu, Angèle; Malécot, Claire O; Bredeloux, Pierre; Maupoil, Véronique

    2016-08-01

    Accurate measurement of cardiomyocyte contraction is a critical issue for scientists working on cardiac physiology and the physiopathology of diseases involving contraction impairment. Cardiomyocyte contraction can be quantified by measuring sarcomere length, but few tools are available for this, and none is freely distributed. We developed SarcOptiM, a plug-in for the ImageJ/Fiji image analysis platform developed by the National Institutes of Health. SarcOptiM computes sarcomere length via fast Fourier transform analysis of video frames captured or displayed in ImageJ and thus is not tied to a dedicated video camera. It can work in real time or offline, the latter overcoming rotating motion or displacement-related artifacts. SarcOptiM includes a simulator and video generator of cardiomyocyte contraction. Acquisition parameters, such as pixel size and camera frame rate, were tested with both experimental recordings of rat ventricular cardiomyocytes and synthetic videos. It is freely distributed, and its source code is available. It works under Windows, Mac, or Linux operating systems. The camera speed is the limiting factor, since the algorithm can compute online sarcomere shortening at frame rates >10 kHz. In conclusion, SarcOptiM is a free and validated user-friendly tool for studying cardiomyocyte contraction in all species, including human. Copyright © 2016 the American Physiological Society.
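
    The core of the method, locating the dominant spatial frequency of a striation intensity profile and inverting it to a sarcomere length, can be sketched as follows (the pixel size and the synthetic profile are illustrative):

    ```python
    import numpy as np

    def sarcomere_length(profile, pixel_size_um):
        """Estimate sarcomere length from the dominant FFT peak of a profile."""
        n = len(profile)
        spectrum = np.abs(np.fft.rfft(profile - profile.mean()))
        freqs = np.fft.rfftfreq(n, d=pixel_size_um)  # cycles per micrometer
        k = 1 + int(np.argmax(spectrum[1:]))         # skip the DC bin
        return 1.0 / freqs[k]                        # period = sarcomere length

    # Synthetic striation pattern: 1.8 um period sampled at 0.1 um/pixel.
    x = np.arange(512) * 0.1
    profile = 1 + 0.5 * np.sin(2 * np.pi * x / 1.8)
    print(sarcomere_length(profile, 0.1))  # ~1.8
    ```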

  14. SiMon: Simulation Monitor for Computational Astrophysics

    NASA Astrophysics Data System (ADS)

    Xuran Qian, Penny; Cai, Maxwell Xu; Portegies Zwart, Simon; Zhu, Ming

    2017-09-01

    Scientific discovery via numerical simulations is important in modern astrophysics. This relatively new branch of astrophysics has become possible due to the development of reliable numerical algorithms and the high performance of modern computing technologies. These enable the analysis of large collections of observational data and the acquisition of new data via simulations at unprecedented accuracy and resolution. Ideally, simulations run until they reach some pre-determined termination condition, but often other factors cause extensive numerical approaches to break down at an earlier stage, with processes interrupted by unexpected events in the software or the hardware. In those cases, the scientist handles the interrupt manually, which is time-consuming and prone to errors. We present the Simulation Monitor (SiMon) to automate the farming of large and extensive simulation processes. Our method is lightweight, fully automates the entire workflow management, operates concurrently across multiple platforms, and can be installed in user space. Inspired by the process of crop farming, we perceive each simulation as a crop in the field, and running a simulation becomes analogous to growing crops. With the development of SiMon we relax the technical aspects of simulation management. The initial package was developed for extensive parameter searches in numerical simulations, but it turns out to work equally well for automating the computational processing and reduction of observational data.

  15. VMEbus based computer and real-time UNIX as infrastructure of DAQ

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yasu, Y.; Fujii, H.; Nomachi, M.

    1994-12-31

    This paper describes what the authors have constructed as the infrastructure of a data acquisition system (DAQ). The paper reports recent developments concerning an HP VME board computer with LynxOS (HP742rt/HP-RT) and Alpha/OSF1 with a VMEbus adapter. The paper also reports the current status of the development of a Benchmark Suite for Data Acquisition (DAQBENCH) for measuring not only the performance of VME/CAMAC access but also that of context switching, inter-process communications, and so on, for various computers, including workstation-based systems and VME board computers.

  16. Data Acquisition (DAQ) system dedicated for remote sensing applications on Unmanned Aerial Vehicles (UAV)

    NASA Astrophysics Data System (ADS)

    Keleshis, C.; Ioannou, S.; Vrekoussis, M.; Levin, Z.; Lange, M. A.

    2014-08-01

    Continuous advances in unmanned aerial vehicles (UAV) and the increased complexity of their applications raise the demand for improved data acquisition systems (DAQ). These improvements may comprise low power consumption, low volume and weight, robustness, modularity, and the capability to interface with various sensors and peripherals while maintaining high sampling rates and processing speeds. Such a system has been designed and developed and is currently integrated on the Autonomous Flying Platforms for Atmospheric and Earth Surface Observations (APAESO/NEA-YΠOΔOMH/NEKΠ/0308/09); however, it can be easily adapted to any UAV or any other mobile vehicle. The system consists of a single-board computer with a dual-core processor, rugged surface-mount memory and storage devices, analog and digital input-output ports, and many other peripherals that enhance its connectivity with various sensors, imagers, and on-board devices. The system is powered by a high-efficiency power supply board. Additional boards such as frame-grabbers, differential global positioning system (DGPS) satellite receivers, and general packet radio service (3G-4G-GPRS) modems for communication redundancy have been interfaced to the core system and are used whenever there is a mission need. The onboard DAQ system can be preprogrammed for automatic data acquisition or it can be remotely operated during the flight from the ground control station (GCS) using a graphical user interface (GUI) which has been developed and will also be presented in this paper. The unique design of the GUI and the DAQ system enables the synchronized acquisition of a variety of scientific and UAV flight data in a single core location. The new DAQ system and the GUI have been successfully utilized in several scientific UAV missions. In conclusion, the novel DAQ system provides the UAV and the remote-sensing community with a new tool capable of reliably acquiring, processing, storing, and transmitting data from any sensor integrated on a UAV.

  17. Environmental Detectives--The Development of an Augmented Reality Platform for Environmental Simulations

    ERIC Educational Resources Information Center

    Klopfer, Eric; Squire, Kurt

    2008-01-01

    The form factors of handheld computers make them increasingly popular among K-12 educators. Although some compelling examples of educational software for handhelds exist, we believe that the potential of this platform is just being discovered. This paper reviews innovative applications of mobile computing for both education and entertainment…

  18. Beam Dynamics Simulation Platform and Studies of Beam Breakup in Dielectric Wakefield Structures

    NASA Astrophysics Data System (ADS)

    Schoessow, P.; Kanareykin, A.; Jing, C.; Kustov, A.; Altmark, A.; Gai, W.

    2010-11-01

    A particle-Green's function beam dynamics code (BBU-3000) to study beam breakup effects is incorporated into a parallel computing framework based on the BOINC software environment, supporting both task farming on a heterogeneous cluster and local grid computing. User access to the platform is through a web browser.

  19. Assessing the Decision Process towards Bring Your Own Device

    ERIC Educational Resources Information Center

    Koester, Richard F.

    2017-01-01

    Information technology continues to evolve to the point where mobile technologies--such as smart phones, tablets, and ultra-mobile computers--have the embedded flexibility and power to be a ubiquitous platform that fulfills all of a user's computing needs. Mobile technology users view these platforms as adaptable enough to be the single solution for…

  20. Multivariate Gradient Analysis for Evaluating and Visualizing a Learning System Platform for Computer Programming

    ERIC Educational Resources Information Center

    Mather, Richard

    2015-01-01

    This paper explores the application of canonical gradient analysis to evaluate and visualize student performance and acceptance of a learning system platform. The subject of evaluation is a first year BSc module for computer programming. This uses "Ceebot," an animated and immersive game-like development environment. Multivariate…

  1. The community FabLab platform: applications and implications in biomedical engineering.

    PubMed

    Stephenson, Makeda K; Dow, Douglas E

    2014-01-01

    Skill development in science, technology, engineering and math (STEM) education presents one of the most formidable challenges of modern society. The Community FabLab platform presents a viable solution. Each FabLab contains a suite of modern computer numerical control (CNC) equipment, electronics and computing hardware, and design, programming, computer aided design (CAD), and computer aided machining (CAM) software. FabLabs are community and educational resources and are open to the public. Development of STEM-based workforce skills such as digital fabrication and advanced manufacturing can be enhanced using this platform. Particularly notable is the potential of the FabLab platform in STEM education. The active learning environment engages and supports a diversity of learners, while the iterative learning that is supported by the FabLab rapid prototyping platform facilitates depth of understanding, creativity, innovation, and mastery. The product- and project-based learning that occurs in FabLabs develops in the student a personal sense of accomplishment, self-awareness, and command of the material and technology. This helps build the interest and confidence necessary to excel in STEM and throughout life. Finally, the introduction and use of relevant technologies at every stage of the education process ensures technical familiarity and a broad knowledge base needed for work in STEM-based fields. Biomedical engineering education strives to cultivate broad technical adeptness, creativity, interdisciplinary thought, and an ability to form deep conceptual understanding of complex systems. The FabLab platform is well designed to enhance biomedical engineering education.

  2. Comparison of scientific computing platforms for MCNP4A Monte Carlo calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hendricks, J.S.; Brockhoff, R.C.

    1994-04-01

    The performance of seven computer platforms is evaluated with the widely used and internationally available MCNP4A Monte Carlo radiation transport code. All results are reproducible and are presented in such a way as to enable comparison with computer platforms not in the study. The authors observed that the HP/9000-735 workstation runs MCNP 50% faster than the Cray YMP 8/64. Compared with the Cray YMP 8/64, the IBM RS/6000-560 is 68% as fast, the Sun Sparc10 is 66% as fast, the Silicon Graphics ONYX is 90% as fast, the Gateway 2000 model 4DX2-66V personal computer is 27% as fast, and the Sun Sparc2 is 24% as fast. In addition to comparing the timing performance of the seven platforms, the authors observe that changes in compilers and software over the past 2 yr have resulted in only modest performance improvements, hardware improvements have enhanced performance by less than a factor of ~3, timing studies are very problem dependent, and MCNP4A runs about as fast as MCNP4.

  3. Homomorphic encryption experiments on IBM's cloud quantum computing platform

    NASA Astrophysics Data System (ADS)

    Huang, He-Liang; Zhao, You-Wei; Li, Tan; Li, Feng-Guang; Du, Yu-Tao; Fu, Xiang-Qun; Zhang, Shuo; Wang, Xiang; Bao, Wan-Su

    2017-02-01

    Quantum computing has undergone rapid development in recent years. Owing to limitations on scalability, personal quantum computers still seem slightly unrealistic in the near future. The first practical quantum computer for ordinary users is likely to be on the cloud. However, the adoption of cloud computing is possible only if security is ensured. Homomorphic encryption is a cryptographic protocol that allows computation to be performed on encrypted data without decrypting them, so it is well suited to cloud computing. Here, we first applied homomorphic encryption on IBM's cloud quantum computer platform. In our experiments, we successfully implemented a quantum algorithm for linear equations while protecting our privacy. This demonstration opens a feasible path to the next stage of development of cloud quantum information technology.
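
    To illustrate the homomorphic principle in the simplest possible terms, the following is a classical toy in Python: an additive one-time-pad scheme in which a server adds two ciphertexts without ever seeing the plaintexts. This is only a sketch of the general idea of computing on encrypted data; it is not the quantum homomorphic protocol used in the paper, and the modulus and key handling are invented for illustration.

        # Classical toy analogue of homomorphic computation: the server adds
        # ciphertexts; only the client, who holds the keys, can decrypt the sum.
        # NOT the quantum protocol of the paper; scheme invented for illustration.
        import secrets

        N = 2**61 - 1                         # arbitrary working modulus

        def encrypt(m):
            k = secrets.randbelow(N)          # fresh random key per message
            return (m + k) % N, k             # (ciphertext, key kept by client)

        def server_add(c1, c2):
            return (c1 + c2) % N              # server sees ciphertexts only

        m1, m2 = 1234, 5678
        c1, k1 = encrypt(m1)
        c2, k2 = encrypt(m2)
        c_sum = server_add(c1, c2)
        assert (c_sum - k1 - k2) % N == m1 + m2   # client decrypts the sum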

  4. A Parallel Point Matching Algorithm for Landmark Based Image Registration Using Multicore Platform

    PubMed Central

    Yang, Lin; Gong, Leiguang; Zhang, Hong; Nosher, John L.; Foran, David J.

    2013-01-01

    Point matching is crucial for many computer vision applications. Establishing the correspondence between a large number of data points is a computationally intensive process. Some point matching related applications, such as medical image registration, require real time or near real time performance if applied to critical clinical applications like image assisted surgery. In this paper, we report a new multicore platform based parallel algorithm for fast point matching in the context of landmark based medical image registration. We introduced a non-regular data partition algorithm which utilizes the K-means clustering algorithm to group the landmarks based on the number of available processing cores, which optimizes memory usage and data transfer. We have tested our method using the IBM Cell Broadband Engine (Cell/B.E.) platform. The results demonstrated a significant speed up over its sequential implementation. The proposed data partition and parallelization algorithm, though tested only on one multicore platform, is generic by its design. Therefore, the parallel algorithm can be extended to other computing platforms, as well as other point matching related applications. PMID:24308014
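
    To make the partition idea concrete, here is a minimal Python sketch, assuming synthetic landmark data: K-means groups the landmarks into one spatially coherent block per available core, and the blocks are matched in parallel worker processes. This illustrates the strategy only; it is not the authors' Cell/B.E. implementation, and all names and data are invented.

        # Sketch: partition landmarks with K-means so each worker gets a
        # spatially coherent block, then match blocks in parallel.
        import numpy as np
        from multiprocessing import Pool

        def kmeans(points, k, iters=20, seed=0):
            """Plain K-means; returns one cluster label per point."""
            rng = np.random.default_rng(seed)
            centers = points[rng.choice(len(points), k, replace=False)]
            for _ in range(iters):
                labels = ((points[:, None] - centers) ** 2).sum(-1).argmin(1)
                for j in range(k):
                    if np.any(labels == j):
                        centers[j] = points[labels == j].mean(axis=0)
            return labels

        def match_block(args):
            """Toy nearest-neighbour matching inside one cluster."""
            src, dst = args
            return ((src[:, None] - dst[None, :]) ** 2).sum(-1).argmin(1)

        if __name__ == "__main__":
            src = np.random.rand(4000, 2)                 # source landmarks
            dst = src + 0.01 * np.random.randn(4000, 2)   # perturbed targets
            n_cores = 4
            labels = kmeans(src, n_cores)
            blocks = [(src[labels == j], dst) for j in range(n_cores)]
            with Pool(n_cores) as pool:
                matches = pool.map(match_block, blocks)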

  5. Portability and Cross-Platform Performance of an MPI-Based Parallel Polygon Renderer

    NASA Technical Reports Server (NTRS)

    Crockett, Thomas W.

    1999-01-01

    Visualizing the results of computations performed on large-scale parallel computers is a challenging problem, due to the size of the datasets involved. One approach is to perform the visualization and graphics operations in place, exploiting the available parallelism to obtain the necessary rendering performance. Over the past several years, we have been developing algorithms and software to support visualization applications on NASA's parallel supercomputers. Our results have been incorporated into a parallel polygon rendering system called PGL. PGL was initially developed on tightly-coupled distributed-memory message-passing systems, including Intel's iPSC/860 and Paragon, and IBM's SP2. Over the past year, we have ported it to a variety of additional platforms, including the HP Exemplar, SGI Origin2000, Cray T3E, and clusters of Sun workstations. In implementing PGL, we have had two primary goals: cross-platform portability and high performance. Portability is important because (1) our manpower resources are limited, making it difficult to develop and maintain multiple versions of the code, and (2) NASA's complement of parallel computing platforms is diverse and subject to frequent change. Performance is important in delivering adequate rendering rates for complex scenes and ensuring that parallel computing resources are used effectively. Unfortunately, these two goals are often at odds. In this paper we report on our experiences with portability and performance of the PGL polygon renderer across a range of parallel computing platforms.

  6. NeuroPG: open source software for optical pattern generation and data acquisition

    PubMed Central

    Avants, Benjamin W.; Murphy, Daniel B.; Dapello, Joel A.; Robinson, Jacob T.

    2015-01-01

    Patterned illumination using a digital micromirror device (DMD) is a powerful tool for optogenetics. Compared to a scanning laser, DMDs are inexpensive and can easily create complex illumination patterns. Combining these complex spatiotemporal illumination patterns with optogenetics allows DMD-equipped microscopes to probe neural circuits by selectively manipulating the activity of many individual cells or many subcellular regions at the same time. To use DMDs to study neural activity, scientists must develop specialized software to coordinate optical stimulation patterns with the acquisition of electrophysiological and fluorescence data. To meet this growing need we have developed an open source optical pattern generation software for neuroscience—NeuroPG—that combines DMD control, sample visualization, and data acquisition in one application. Built on a MATLAB platform, NeuroPG can also process, analyze, and visualize data. The software is designed specifically for the Mightex Polygon400; however, as an open source package, NeuroPG can be modified to incorporate any data acquisition, imaging, or illumination equipment that is compatible with MATLAB’s Data Acquisition and Image Acquisition toolboxes. PMID:25784873

  7. An integrated information system for the acquisition, management and sharing of environmental data aimed to decision making

    NASA Astrophysics Data System (ADS)

    La Loggia, Goffredo; Arnone, Elisa; Ciraolo, Giuseppe; Maltese, Antonino; Noto, Leonardo; Pernice, Umberto

    2012-09-01

    This paper reports the first results of the Project SESAMO - SistEma informativo integrato per l'acquisizione, geStione e condivisione di dati AMbientali per il supportO alle decisioni (Integrated Information System for the acquisition, management and sharing of environmental data aimed to decision making). The main aim of the project is to design and develop an integrated environmental information platform able to provide monitoring services for decision support, integrating data from different environmental monitoring systems (including WSN). This ICT platform, based on a service-oriented architecture (SOA), will be developed to coordinate a wide variety of data acquisition systems, based on heterogeneous technologies and communication protocols, providing different sorts of environmental monitoring services. The implementation and validation of the SESAMO platform and its services will involve three specific environmental domains: 1) Urban water losses; 2) Early warning system for rainfall-induced landslides; 3) Precision irrigation planning. Services in the first domain are enabled by a low-cost sensor network collecting and transmitting data, in order to allow the pipeline network managers to analyze pressure, velocity and discharge data for reducing water losses in an urban context. This paper outlines the SESAMO functional and technological structure and then gives a concise description of the service design and development process for the second and third domains. Services in the second domain are enabled by a prototypal early warning system able to identify in near-real time high-risk zones of rainfall-induced landslides. Services in the third domain are aimed at optimizing irrigation planning of vineyards depending on plant water stress.

  8. Covering Chemical Diversity of Genetically-Modified Tomatoes Using Metabolomics for Objective Substantial Equivalence Assessment

    PubMed Central

    Hirai, Tadayoshi; Oikawa, Akira; Matsuda, Fumio; Fukushima, Atsushi; Arita, Masanori; Watanabe, Shin; Yano, Megumu; Hiwasa-Tanase, Kyoko; Ezura, Hiroshi; Saito, Kazuki

    2011-01-01

    As metabolomics can provide a biochemical snapshot of an organism's phenotype, it is a promising approach for charting the unintended effects of genetic modification. A critical obstacle for this application is the inherently limited metabolomic coverage of any single analytical platform. We propose using multiple analytical platforms for the direct acquisition of an interpretable data set of estimable chemical diversity. As an example, we report an application of our multi-platform approach that assesses the substantial equivalence of tomatoes over-expressing the taste-modifying protein miraculin. In combination, the chosen platforms detected compounds that represent 86% of the estimated chemical diversity of the metabolites listed in the LycoCyc database. Following a proof-of-safety approach, we show that % had an acceptable range of variation while simultaneously indicating a reproducible transformation-related metabolic signature. We conclude that multi-platform metabolomics is an approach that is both sensitive and robust and that it constitutes a good starting point for characterizing genetically modified organisms. PMID:21359231

  9. A gimbal platform stabilization for topographic applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michele, Mangiameli, E-mail: michele.mangiameli@dica.unict.it; Giuseppe, Mussumeci

    2015-03-10

    The aim of this work is the stabilization of a Gimbal platform for optical sensor acquisitions in topographic applications using mobile vehicles. The stabilization of the line of sight (LOS) consists in tracking the command velocity in presence of nonlinear noise due to the external environment. The hardware architecture is characterized by an Ardupilot platform that allows the control of both the mobile device and the Gimbal. Here we developed a new approach to stabilize the Gimbal platform, which is based on a neural network. For the control system, we considered a plant that represents the transfer function of the servo system control model for an inertial stabilized Gimbal platform. The transductor used in the feed-back control line is characterized by the Rate Gyro transfer function installed onboard the Ardupilot. For the simulation and investigation of the system performance, we used the Simulink tool of Matlab. Results show that the hardware/software approach is efficient, reliable and cheap for direct photogrammetry, as well as for general purpose applications using mobile vehicles.
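
    As a rough illustration of rate-feedback LOS stabilization, the discrete-time Python loop below tracks a commanded rate using the measured rate in the feedback path. The first-order plant, the PI gains, and the noise level are all invented for the sketch; the paper's controller is a neural network designed and simulated in Simulink, not the simple PI law used here.

        # Toy LOS rate-stabilization loop with gyro feedback. Plant constants
        # and PI gains are invented; the paper uses a neural-network controller.
        import numpy as np

        dt, T = 0.001, 2.0                 # time step and duration, s
        tau, K_plant = 0.05, 1.0           # assumed first-order servo dynamics
        Kp, Ki = 8.0, 40.0                 # assumed PI gains

        t = np.arange(0.0, T, dt)
        cmd = np.where(t > 0.1, 1.0, 0.0)  # commanded LOS rate step, rad/s
        rate, integ, history = 0.0, 0.0, []
        for k in range(len(t)):
            err = cmd[k] - rate            # gyro measures the achieved rate
            integ += err * dt
            u = Kp * err + Ki * integ      # PI control effort
            disturbance = 0.02 * np.random.randn()   # stand-in for noise
            rate += dt * (K_plant * u - rate) / tau + disturbance * dt
            history.append(rate)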

  10. Genomics Virtual Laboratory: A Practical Bioinformatics Workbench for the Cloud

    PubMed Central

    Afgan, Enis; Sloggett, Clare; Goonasekera, Nuwan; Makunin, Igor; Benson, Derek; Crowe, Mark; Gladman, Simon; Kowsar, Yousef; Pheasant, Michael; Horst, Ron; Lonie, Andrew

    2015-01-01

    Background Analyzing high throughput genomics data is a complex and compute intensive task, generally requiring numerous software tools and large reference data sets, tied together in successive stages of data transformation and visualisation. A computational platform enabling best practice genomics analysis ideally meets a number of requirements, including: a wide range of analysis and visualisation tools, closely linked to large user and reference data sets; workflow platform(s) enabling accessible, reproducible, portable analyses, through a flexible set of interfaces; highly available, scalable computational resources; and flexibility and versatility in the use of these resources to meet demands and expertise of a variety of users. Access to an appropriate computational platform can be a significant barrier to researchers, as establishing such a platform requires a large upfront investment in hardware, experience, and expertise. Results We designed and implemented the Genomics Virtual Laboratory (GVL) as a middleware layer of machine images, cloud management tools, and online services that enable researchers to build arbitrarily sized compute clusters on demand, pre-populated with fully configured bioinformatics tools, reference datasets and workflow and visualisation options. The platform is flexible in that users can conduct analyses through web-based (Galaxy, RStudio, IPython Notebook) or command-line interfaces, and add/remove compute nodes and data resources as required. Best-practice tutorials and protocols provide a path from introductory training to practice. The GVL is available on the OpenStack-based Australian Research Cloud (http://nectar.org.au) and the Amazon Web Services cloud. The principles, implementation and build process are designed to be cloud-agnostic. Conclusions This paper provides a blueprint for the design and implementation of a cloud-based Genomics Virtual Laboratory. We discuss scope, design considerations and technical and logistical constraints, and explore the value added to the research community through the suite of services and resources provided by our implementation. PMID:26501966

  11. Performance of MCNP4A on seven computing platforms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hendricks, J.S.; Brockhoff, R.C.

    1994-12-31

    The performance of seven computer platforms has been evaluated with the MCNP4A Monte Carlo radiation transport code. For the first time we report timing results using MCNP4A and its new test set and libraries. Comparisons are made on platforms not available to us in previous MCNP timing studies. By using MCNP4A and its 25-problem test set, a widely-used and readily-available physics production code is used; the timing comparison is not limited to a single "typical" problem, demonstrating the problem dependence of timing results; the results are reproducible at the more than 100 installations around the world using MCNP; comparison of performance of other computer platforms to the ones tested in this study is possible because we present raw data rather than normalized results; and a measure of the increase in performance of computer hardware and software over the past two years is possible. The computer platforms reported are the Cray-YMP 8/64, IBM RS/6000-560, Sun Sparc10, Sun Sparc2, HP/9000-735, 4 processor 100 MHz Silicon Graphics ONYX, and Gateway 2000 model 4DX2-66V PC. In 1991 a timing study of MCNP4, the predecessor to MCNP4A, was conducted using ENDF/B-V cross-section libraries, which are export protected. The new study is based upon the new MCNP 25-problem test set which utilizes internationally available data. MCNP4A, its test problems and the test data library are available from the Radiation Shielding and Information Center in Oak Ridge, Tennessee, or from the NEA Data Bank in Saclay, France. Anyone with the same workstation and compiler can get the same test problem sets, the same library files, and the same MCNP4A code from RSIC or NEA and replicate our results. And, because we report raw data, comparison of the performance of other compute platforms and compilers can be made.

  12. Particle Identification on an FPGA Accelerated Compute Platform for the LHCb Upgrade

    NASA Astrophysics Data System (ADS)

    Färber, Christian; Schwemmer, Rainer; Machen, Jonathan; Neufeld, Niko

    2017-07-01

    The current LHCb readout system will be upgraded in 2018 to a “triggerless” readout of the entire detector at the Large Hadron Collider collision rate of 40 MHz. The corresponding bandwidth from the detector down to the foreseen dedicated computing farm (event filter farm), which acts as the trigger, has to be increased by a factor of almost 100 from currently 500 Gb/s up to 40 Tb/s. The event filter farm will preanalyze the data and will select the events on an event-by-event basis. This will reduce the bandwidth down to a manageable size to write the interesting physics data to tape. The design of such a system is a challenging task, which is why different new technologies are considered and have to be investigated for the different parts of the system. For usage in the event building farm or in the event filter farm (trigger), an experimental field programmable gate array (FPGA) accelerated computing platform is considered and therefore tested. FPGA compute accelerators are used more and more in standard servers, for example for Microsoft Bing search or Baidu search. The platform we use hosts a general Intel CPU and a high-performance FPGA linked via the high-speed Intel QuickPath Interconnect. An accelerator is implemented on the FPGA. It is very likely that these platforms, which are built in general for high-performance computing, are also very interesting for the high-energy physics community. First, performance results of smaller test cases performed at the beginning are presented. Afterward, a part of the existing LHCb RICH particle identification code is ported to the experimental FPGA-accelerated platform and tested. We have compared the performance of the LHCb RICH particle identification running on a normal CPU with the performance of the same algorithm running on the Xeon-FPGA compute accelerator platform.

  13. Genomics Virtual Laboratory: A Practical Bioinformatics Workbench for the Cloud.

    PubMed

    Afgan, Enis; Sloggett, Clare; Goonasekera, Nuwan; Makunin, Igor; Benson, Derek; Crowe, Mark; Gladman, Simon; Kowsar, Yousef; Pheasant, Michael; Horst, Ron; Lonie, Andrew

    2015-01-01

    Analyzing high throughput genomics data is a complex and compute intensive task, generally requiring numerous software tools and large reference data sets, tied together in successive stages of data transformation and visualisation. A computational platform enabling best practice genomics analysis ideally meets a number of requirements, including: a wide range of analysis and visualisation tools, closely linked to large user and reference data sets; workflow platform(s) enabling accessible, reproducible, portable analyses, through a flexible set of interfaces; highly available, scalable computational resources; and flexibility and versatility in the use of these resources to meet demands and expertise of a variety of users. Access to an appropriate computational platform can be a significant barrier to researchers, as establishing such a platform requires a large upfront investment in hardware, experience, and expertise. We designed and implemented the Genomics Virtual Laboratory (GVL) as a middleware layer of machine images, cloud management tools, and online services that enable researchers to build arbitrarily sized compute clusters on demand, pre-populated with fully configured bioinformatics tools, reference datasets and workflow and visualisation options. The platform is flexible in that users can conduct analyses through web-based (Galaxy, RStudio, IPython Notebook) or command-line interfaces, and add/remove compute nodes and data resources as required. Best-practice tutorials and protocols provide a path from introductory training to practice. The GVL is available on the OpenStack-based Australian Research Cloud (http://nectar.org.au) and the Amazon Web Services cloud. The principles, implementation and build process are designed to be cloud-agnostic. This paper provides a blueprint for the design and implementation of a cloud-based Genomics Virtual Laboratory. We discuss scope, design considerations and technical and logistical constraints, and explore the value added to the research community through the suite of services and resources provided by our implementation.

  14. radR: an open-source platform for acquiring and analysing data on biological targets observed by surveillance radar.

    PubMed

    Taylor, Philip D; Brzustowski, John M; Matkovich, Carolyn; Peckford, Michael L; Wilson, Dave

    2010-10-26

    Radar has been used for decades to study movement of insects, birds and bats. In spite of this, there are few readily available software tools for the acquisition, storage and processing of such data. Program radR was developed to solve this problem. Program radR is an open source software tool for the acquisition, storage and analysis of data from marine radars operating in surveillance mode. radR takes time series data with a two-dimensional spatial component as input from some source (typically a radar digitizing card) and extracts and retains information of biological relevance (i.e. moving targets). Low-level data processing is implemented in "C" code, but user-defined functions written in the "R" statistical programming language can be called at pre-defined steps in the calculations. Output data formats are designed to allow for future inclusion of additional data items without requiring change to C code. Two brands of radar digitizing card are currently supported as data sources. We also provide an overview of the basic considerations of setting up and running a biological radar study. Program radR provides a convenient, open source platform for the acquisition and analysis of radar data of biological targets.
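
    The design point worth noting, a fast core loop with user-defined functions called at pre-defined steps, can be sketched language-agnostically. The Python below shows an illustrative hook registry in that spirit; the hook names and the toy detector are invented and do not reflect radR's actual R/C API.

        # Illustrative plugin-hook pattern: a fixed processing pipeline calls
        # user-registered functions at pre-defined steps (invented hook names).
        hooks = {"pre_scan": [], "post_blip": []}

        def register(stage, fn):
            hooks[stage].append(fn)

        def process_scan(scan):
            for fn in hooks["pre_scan"]:          # user preprocessing
                scan = fn(scan)
            blips = [v for v in scan if v > 0.8]  # toy stand-in detector
            for fn in hooks["post_blip"]:         # user post-processing
                blips = fn(blips)
            return blips

        register("pre_scan", lambda s: [x / max(s) for x in s])  # normalise
        register("post_blip", lambda b: b[:100])                 # cap count
        targets = process_scan([0.1, 0.9, 0.95, 0.4])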

  15. Signal processing and general purpose data acquisition system for on-line tomographic measurements

    NASA Astrophysics Data System (ADS)

    Murari, A.; Martin, P.; Hemming, O.; Manduchi, G.; Marrelli, L.; Taliercio, C.; Hoffmann, A.

    1997-01-01

    New analog signal conditioning electronics and data acquisition systems have been developed for the soft x-ray and bolometric tomography diagnostic in the reverse field pinch experiment (RFX). For the soft x-ray detectors the analog signal processing includes a fully differential current to voltage conversion, with up to a 200 kHz bandwidth. For the bolometers, a 50 kHz carrier frequency amplifier allows a maximum bandwidth of 10 kHz. In both cases the analog signals are digitized with a 1 MHz sampling rate close to the diagnostic and are transmitted via a transparent asynchronous xmitter/receiver interface (TAXI) link to purpose-built Versa Module Europa (VME) modules which perform data acquisition. A software library has been developed for data preprocessing and tomographic reconstruction. It has been written in C language and is self-contained, i.e., no additional mathematical library is required. The package is therefore platform-free: in particular it can perform online analysis in a real-time application, such as continuous display and feedback, and is portable for long duration fusion or other physical experiments. Due to the modular organization of the library, new preprocessing and analysis modules can be easily integrated in the environment. This software is implemented in RFX over three different platforms: OpenVMS, Digital Unix, and a VME 68040 CPU.

  16. radR: an open-source platform for acquiring and analysing data on biological targets observed by surveillance radar

    PubMed Central

    2010-01-01

    Background Radar has been used for decades to study movement of insects, birds and bats. In spite of this, there are few readily available software tools for the acquisition, storage and processing of such data. Program radR was developed to solve this problem. Results Program radR is an open source software tool for the acquisition, storage and analysis of data from marine radars operating in surveillance mode. radR takes time series data with a two-dimensional spatial component as input from some source (typically a radar digitizing card) and extracts and retains information of biological relevance (i.e. moving targets). Low-level data processing is implemented in "C" code, but user-defined functions written in the "R" statistical programming language can be called at pre-defined steps in the calculations. Output data formats are designed to allow for future inclusion of additional data items without requiring change to C code. Two brands of radar digitizing card are currently supported as data sources. We also provide an overview of the basic considerations of setting up and running a biological radar study. Conclusions Program radR provides a convenient, open source platform for the acquisition and analysis of radar data of biological targets. PMID:20977735

  17. Semi-physical Simulation Platform of a Parafoil Nonlinear Dynamic System

    NASA Astrophysics Data System (ADS)

    Gao, Hai-Tao; Yang, Sheng-Bo; Zhu, Er-Lin; Sun, Qing-Lin; Chen, Zeng-Qiang; Kang, Xiao-Feng

    2013-11-01

    Focusing on the problems in the process of simulation and experiment on a parafoil nonlinear dynamic system, such as limited methods, high cost and low efficiency, we present a semi-physical simulation platform. It is designed by connecting parts of physical objects to a computer, and remedies the defect that a pure computer simulation is completely divorced from the real environment. The main components of the platform and its functions, as well as simulation flows, are introduced. The feasibility and validity are verified through a simulation experiment. The experimental results show that the platform has significance for improving the quality of the parafoil fixed-point airdrop system, shortening the development cycle and saving cost.

  18. Functional design for operational earth resources ground data processing

    NASA Technical Reports Server (NTRS)

    Baldwin, C. J. (Principal Investigator); Bradford, L. H.; Hutson, D. E.; Jugle, D. R.

    1972-01-01

    The author has identified the following significant results. Study emphasis was on developing a unified concept for the required ground system, capable of handling data from all viable acquisition platforms and sensor groupings envisaged as supporting operational earth survey programs. The platforms considered include both manned and unmanned spacecraft in near earth orbit, and continued use of low and high altitude aircraft. The sensor systems include both imaging and nonimaging devices, operated both passively and actively, from the ultraviolet to the microwave regions of the electromagnetic spectrum.

  19. Load monitoring of aerospace structures utilizing micro-electro-mechanical systems for static and quasi-static loading conditions

    NASA Astrophysics Data System (ADS)

    Martinez, M.; Rocha, B.; Li, M.; Shi, G.; Beltempo, A.; Rutledge, R.; Yanishevsky, M.

    2012-11-01

    The National Research Council Canada (NRC) has worked on the development of structural health monitoring (SHM) test platforms for assessing the performance of sensor systems for load monitoring applications. The first SHM platform consists of a 5.5 m cantilever aluminum beam that provides an optimal scenario for evaluating the ability of a load monitoring system to measure bending, torsion and shear loads. The second SHM platform contains an added level of structural complexity, consisting of aluminum skins with bonded/riveted stringers, typical of an aircraft lower wing structure. These two load monitoring platforms are well characterized and documented, providing loading conditions similar to those encountered during service. In this study, a micro-electro-mechanical system (MEMS) for acquiring data from triads of gyroscopes, accelerometers and magnetometers is described. The system was used to compute changes in angles at discrete stations along the platforms. The angles obtained from the MEMS were used to compute a second-, third- or fourth-order polynomial surface from which displacements at every point could be computed. The use of a new Kalman filter was evaluated for angle estimation, from which displacements in the structure were computed. The outputs of the newly developed algorithms were then compared to the displacements obtained from the linear variable displacement transducers connected to the platforms. The displacement curves were subsequently post-processed either analytically, or with the help of a finite element model of the structure, to estimate strains and loads. The estimated strains were compared with baseline strain gauge instrumentation installed on the platforms. This new approach for load monitoring was able to provide accurate estimates of applied strains and shear loads.
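
    A minimal Python sketch of the angle-to-displacement step for the cantilever platform is given below, assuming synthetic station data: slopes measured at discrete stations are fitted with a polynomial and integrated with the clamped-end condition y(0) = 0 to recover the deflection curve. The station positions and slope values are illustrative, not data from the NRC platforms.

        # Fit a polynomial to measured slopes (angles) at discrete stations,
        # then integrate once to estimate the deflection of a cantilever.
        import numpy as np

        x = np.array([0.0, 1.1, 2.2, 3.3, 4.4, 5.5])   # station positions, m
        theta = np.array([0.0, 0.004, 0.012, 0.022, 0.033, 0.045])  # rad

        slope_poly = np.poly1d(np.polyfit(x, theta, deg=3))  # 3rd-order fit
        defl_poly = slope_poly.integ(k=0.0)  # y(0) = 0 at the clamped end

        y = defl_poly(x)                     # displacement at each station, m
        print("estimated tip displacement:", y[-1])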

  20. A Dedicated Computational Platform for Cellular Monte Carlo T-CAD Software Tools

    DTIC Science & Technology

    2015-07-14

    computer that establishes an encrypted Virtual Private Network (OpenVPN [44]) based on the Secure Socket Layer (SSL) paradigm. Each user is given a ... security certificate for each device used to connect to the computing nodes. Stable OpenVPN clients are available for Linux, Microsoft Windows, Apple OSX ... platform is granted by an encrypted connection based on the Secure Socket Layer (SSL) protocol, and implemented in the OpenVPN Virtual Private Network

  1. Motivation in a Computer-Supported Collaborative Learning Scenario and Its Impact on Learning Activities and Knowledge Acquisition

    ERIC Educational Resources Information Center

    Schoor, Cornelia; Bannert, Maria

    2011-01-01

    Addressing a drawback in current research on computer-supported collaborative learning (CSCL), this study investigated the influence of motivation on learning activities and knowledge acquisition during CSCL. Participants' (N = 200 university students) task was to develop a handout for which they first had an individual preparation phase followed by…

  2. MEGA X: Molecular Evolutionary Genetics Analysis across Computing Platforms.

    PubMed

    Kumar, Sudhir; Stecher, Glen; Li, Michael; Knyaz, Christina; Tamura, Koichiro

    2018-06-01

    The Molecular Evolutionary Genetics Analysis (Mega) software implements many analytical methods and tools for phylogenomics and phylomedicine. Here, we report a transformation of Mega to enable cross-platform use on Microsoft Windows and Linux operating systems. Mega X does not require virtualization or emulation software and provides a uniform user experience across platforms. Mega X has additionally been upgraded to use multiple computing cores for many molecular evolutionary analyses. Mega X is available in two interfaces (graphical and command line) and can be downloaded from www.megasoftware.net free of charge.

  3. Wireless sensor platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joshi, Pooran C.; Killough, Stephen M.; Kuruganti, Phani Teja

    A wireless sensor platform and methods of manufacture are provided. The platform involves providing a plurality of wireless sensors, where each of the sensors is fabricated on flexible substrates using printing techniques and low temperature curing. Each of the sensors can include planar sensor elements and planar antennas defined using the printing and curing. Further, each of the sensors can include a communications system configured to encode the data from the sensors into a spread spectrum code sequence that is transmitted to a central computer(s) for use in monitoring an area associated with the sensors.

  4. Development of Low-Cost Microcontroller-Based Interface for Data Acquisition and Control of Microbioreactor Operation.

    PubMed

    Husain, Abdul Rashid; Hadad, Yaser; Zainal Alam, Muhd Nazrul Hisham

    2016-10-01

    This article presents the development of a low-cost microcontroller-based interface for a microbioreactor operation. An Arduino MEGA 2560 board with 54 digital inputs/outputs, including 15 pulse-width-modulation outputs, has been chosen to perform the acquisition and control of the microbioreactor. The microbioreactor (volume = 800 µL) was made of poly(dimethylsiloxane) and poly(methylmethacrylate) polymers. The reactor was built to be equipped with sensors and actuators for the control of reactor temperature and the mixing speed. The article discusses the circuit of the microcontroller-based platform, describes the signal conditioning steps, and evaluates the capacity of the proposed low-cost microcontroller-based interface in terms of control accuracy and system responses. It is demonstrated that the proposed microcontroller-based platform is able to run parallel microbioreactor operations with satisfactory performance. Control accuracy at a deviation of less than 5% of the set-point values and responses in the range of a few seconds have been recorded. © 2015 Society for Laboratory Automation and Screening.

  5. Automated 3D reconstruction of interiors with multiple scan views

    NASA Astrophysics Data System (ADS)

    Sequeira, Vitor; Ng, Kia C.; Wolfart, Erik; Goncalves, Joao G. M.; Hogg, David C.

    1998-12-01

    This paper presents two integrated solutions for realistic 3D model acquisition and reconstruction; an early prototype, in the form of a push trolley, and a later prototype in the form of an autonomous robot. The systems encompass all hardware and software required, from laser and video data acquisition, processing and output of texture-mapped 3D models in VRML format, to batteries for power supply and wireless network communications. The autonomous version is also equipped with a mobile platform and other sensors for the purpose of automatic navigation. The applications for such a system range from real estate and tourism (e.g., showing a 3D computer model of a property to a potential buyer or tenant) to content creation (e.g., creating 3D models of heritage buildings or producing broadcast quality virtual studios). The system can also be used in industrial environments as a reverse engineering tool to update the design of a plant, or as a 3D photo-archive for insurance purposes. The system is Internet compatible: the photo-realistic models can be accessed via the Internet and manipulated interactively in 3D using a common Web browser with a VRML plug-in. Further information and example reconstructed models are available on-line via the RESOLV web page at http://www.scs.leeds.ac.uk/resolv/.

  6. Development of an Unmanned Aerial Vehicle-Borne Crop-Growth Monitoring System.

    PubMed

    Ni, Jun; Yao, Lili; Zhang, Jingchao; Cao, Weixing; Zhu, Yan; Tai, Xiuxiang

    2017-03-03

    In view of the demand for a low-cost, high-throughput method for the continuous acquisition of crop growth information, this study describes a crop-growth monitoring system which uses an unmanned aerial vehicle (UAV) as an operating platform. The system is capable of real-time online acquisition of various major indexes, e.g., the normalized difference vegetation index (NDVI) of the crop canopy, ratio vegetation index (RVI), leaf nitrogen accumulation (LNA), leaf area index (LAI), and leaf dry weight (LDW). By carrying out three-dimensional numerical simulations based on computational fluid dynamics, spatial distributions were obtained for the UAV down-wash flow fields on the surface of the crop canopy. Based on the flow-field characteristics and geometrical dimensions, a UAV-borne crop-growth sensor was designed. Our field experiments show that the monitoring system has good dynamic stability and measurement accuracy over the range of operating altitudes of the sensor. The linear fitting determination coefficients (R²) for the output RVI value with respect to LNA, LAI, and LDW are 0.63, 0.69, and 0.66, respectively, and the root-mean-square errors (RMSEs) are 1.42, 1.02 and 3.09, respectively. The equivalent figures for the output NDVI value are 0.60, 0.65, and 0.62 (LNA, LAI, and LDW, respectively) and the RMSEs are 1.44, 1.01 and 3.01, respectively.
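
    The index computation and the linear fitting reported above are straightforward to reproduce; the Python sketch below computes NDVI and RVI from canopy reflectances and fits an agronomic index against RVI, reporting R² and RMSE. All reflectance and leaf-nitrogen values are synthetic stand-ins, not the paper's field data.

        # NDVI/RVI from reflectance, then a least-squares fit with R^2 and
        # RMSE, mirroring the paper's evaluation. Data are synthetic.
        import numpy as np

        nir = np.random.uniform(0.30, 0.60, 50)   # near-infrared reflectance
        red = np.random.uniform(0.03, 0.10, 50)   # red reflectance
        ndvi = (nir - red) / (nir + red)
        rvi = nir / red

        lna = 8.0 * rvi + np.random.randn(50)     # synthetic leaf N accumulation
        a, b = np.polyfit(rvi, lna, 1)            # linear fit LNA ~ RVI
        pred = a * rvi + b
        r2 = 1.0 - ((lna - pred) ** 2).sum() / ((lna - lna.mean()) ** 2).sum()
        rmse = np.sqrt(((lna - pred) ** 2).mean())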

  7. A filter circuit board for the Earthworm Seismic Data Acquisition System

    USGS Publications Warehouse

    Jensen, Edward Gray

    2000-01-01

    The Earthworm system is a seismic network data acquisition and processing system used by the Northern California Seismic Network as well as many other seismic networks. The input to the system comprises many real-time electronic waveforms fed to a multi-channel digitizer on a PC platform. The digitizer consists of one or more National Instruments Corp. AMUX–64T multiplexer boards attached to an A/D converter board located in the computer. Originally, passive filters were installed on the multiplexers to eliminate electronic noise picked up in cabling. It was later discovered that a small amount of crosstalk occurred between successive channels in the digitizing sequence. Though small, this crosstalk will cause what appear to be small earthquake arrivals at the wrong time on some channels. This can result in erroneous calculation of earthquake arrival times, particularly by automated algorithms. To deal with this problem, an Earthworm filter board was developed to provide the needed filtering while eliminating crosstalk. This report describes the tests performed to find a suitable solution, and the design of the circuit board. Also included are all the details needed to build and install this board in an Earthworm system or any other system using the AMUX–64T board.

  8. Uav Photogrammetry with Oblique Images: First Analysis on Data Acquisition and Processing

    NASA Astrophysics Data System (ADS)

    Aicardi, I.; Chiabrando, F.; Grasso, N.; Lingua, A. M.; Noardo, F.; Spanò, A.

    2016-06-01

    In recent years, many studies revealed the advantages of using airborne oblique images for obtaining improved 3D city models (e.g. including façades and building footprints). Such data have usually been acquired by expensive airborne cameras installed on traditional aerial platforms. The purpose of this paper is to evaluate the possibility of acquiring and using oblique images for the 3D reconstruction of a historical building, obtained by UAV (Unmanned Aerial Vehicle) and traditional COTS (Commercial Off-the-Shelf) digital cameras (more compact and lighter than generally used devices), for the realization of high-level-of-detail architectural surveys. The critical issues of the acquisitions from a common UAV (flight planning strategies, ground control points, check points distribution and measurement, etc.) are described. Another important aspect considered was the possibility of using such systems as low-cost methods for obtaining complete information from an aerial point of view in emergency situations or, as in the present paper, in the cultural heritage application field. The data processing was realized using an SfM-based approach for point cloud generation: different dense image-matching algorithms implemented in some commercial and open source software were tested. The achieved results are analysed and the discrepancies from some reference LiDAR data are computed for a final evaluation. The system was tested on the S. Maria Chapel, a part of the Novalesa Abbey (Italy).

  9. Next Generation Image-Based Phenotyping of Root System Architecture

    NASA Astrophysics Data System (ADS)

    Davis, T. W.; Shaw, N. M.; Cheng, H.; Larson, B. G.; Craft, E. J.; Shaff, J. E.; Schneider, D. J.; Piñeros, M. A.; Kochian, L. V.

    2016-12-01

    The development of the Plant Root Imaging and Data Acquisition (PRIDA) hardware/software system enables researchers to collect digital images, along with all the relevant experimental details, of a range of hydroponically grown agricultural crop roots for 2D and 3D trait analysis. Previous efforts of image-based root phenotyping focused on young cereals, such as rice; however, there is a growing need to measure both older and larger root systems, such as those of maize and sorghum, to improve our understanding of the underlying genetics that control favorable rooting traits for plant breeding programs to combat the agricultural risks presented by climate change. Therefore, a larger imaging apparatus has been prototyped for capturing 3D root architecture with an adaptive control system and innovative plant root growth media that retains three-dimensional root architectural features. New publicly available multi-platform software has been released with considerations for both high throughput (e.g., 3D imaging of a single root system in under ten minutes) and high portability (e.g., support for the Raspberry Pi computer). The software features unified data collection, management, exploration and preservation for continued trait and genetics analysis of root system architecture. The new system makes data acquisition efficient and includes features that address the needs of researchers and technicians, such as reduced imaging time, semi-automated camera calibration with uncertainty characterization, and safe storage of the critical experimental data.

  10. Development of an Unmanned Aerial Vehicle-Borne Crop-Growth Monitoring System

    PubMed Central

    Ni, Jun; Yao, Lili; Zhang, Jingchao; Cao, Weixing; Zhu, Yan; Tai, Xiuxiang

    2017-01-01

    In view of the demand for a low-cost, high-throughput method for the continuous acquisition of crop growth information, this study describes a crop-growth monitoring system which uses an unmanned aerial vehicle (UAV) as an operating platform. The system is capable of real-time online acquisition of various major indexes, e.g., the normalized difference vegetation index (NDVI) of the crop canopy, ratio vegetation index (RVI), leaf nitrogen accumulation (LNA), leaf area index (LAI), and leaf dry weight (LDW). By carrying out three-dimensional numerical simulations based on computational fluid dynamics, spatial distributions were obtained for the UAV down-wash flow fields on the surface of the crop canopy. Based on the flow-field characteristics and geometrical dimensions, a UAV-borne crop-growth sensor was designed. Our field experiments show that the monitoring system has good dynamic stability and measurement accuracy over the range of operating altitudes of the sensor. The linear fitting determination coefficients (R2) for the output RVI value with respect to LNA, LAI, and LDW are 0.63, 0.69, and 0.66, respectively, and the root-mean-square errors (RMSEs) are 1.42, 1.02 and 3.09, respectively. The equivalent figures for the output NDVI value are 0.60, 0.65, and 0.62 (LNA, LAI, and LDW, respectively) and the RMSEs are 1.44, 1.01 and 3.01, respectively. PMID:28273815

  11. Measurement system of correlation functions of microwave single photon source in real time

    NASA Astrophysics Data System (ADS)

    Korenkov, A.; Dmitriev, A.; Astafiev, O.

    2018-02-01

    Several quantum setups, such as quantum key distribution networks [1] and quantum simulators (e.g. boson sampling), by their design rely on single photon sources (SPSs). These quantum setups were demonstrated to operate in the optical frequency domain. However, following the steady advances in circuit quantum electrodynamics, a proposal has been made recently [2] to demonstrate boson sampling with microwave photons. This in turn requires the development of a reliable microwave SPS. Among its most important characteristics are the first-order and second-order correlation functions g1 and g2. The measurement technique for g1 and g2 is significantly different from that in the optical domain [3],[4] because of the current unavailability of microwave single-photon detectors. In particular, due to the high levels of noise present in the system, a substantial amount of statistics needs to be acquired. This work presents a platform for measurement of g1 and g2 that processes the incoming data in real time, maximizing the efficiency of data acquisition. The use of field-programmable gate array (FPGA) electronics, common in similar experiments [3] but complex in programming, is avoided; instead, the calculations are performed on a standard desktop computer. The platform is used to perform the measurements of the first-order and second-order correlation functions of the microwave SPS.
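
    For reference, the normalized second-order correlation can be estimated from an intensity-like record as g2(τ) = ⟨I(t)I(t+τ)⟩ / ⟨I⟩². The Python sketch below is a toy estimator on synthetic data; the actual microwave measurement works on amplified quadratures and requires the noise calibration that is omitted here.

        # Toy estimator of g2(tau) = <I(t) I(t+tau)> / <I>^2 from an
        # intensity record. Synthetic thermal-like data: expect ~2 at lag 0.
        import numpy as np

        def g2(intensity, max_lag):
            i = np.asarray(intensity, dtype=float)
            mean_sq = i.mean() ** 2
            return np.array([
                (i[: len(i) - lag] * i[lag:]).mean() / mean_sq
                for lag in range(max_lag)
            ])

        I = np.random.exponential(1.0, 100_000)
        print(g2(I, 5))   # ~[2, 1, 1, 1, 1] for uncorrelated thermal noise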

  12. Open chemistry: RESTful web APIs, JSON, NWChem and the modern web application

    DOE PAGES

    Hanwell, Marcus D.; de Jong, Wibe A.; Harris, Christopher J.

    2017-10-30

    An end-to-end platform for chemical science research has been developed that integrates data from computational and experimental approaches through a modern web-based interface. The platform offers an interactive visualization and analytics environment that functions well on mobile, laptop and desktop devices. It offers pragmatic solutions to ensure that large and complex data sets are more accessible. Existing desktop applications/frameworks were extended to integrate with high-performance computing resources, and offer command-line tools to automate interaction - connecting distributed teams to this software platform on their own terms. The platform was developed openly, and all source code is hosted on the GitHub platform, with automated deployment possible using Ansible coupled with standard Ubuntu-based machine images deployed to cloud machines. The platform is designed to enable teams to reap the benefits of the connected web - going beyond what conventional search and analytics platforms offer in this area. It also has the goal of offering federated instances that can be customized to the sites/research performed. Data are stored using JSON, extending upon previous approaches using XML, building structures that support computational chemistry calculations. These structures were developed to make it easy to process data across different languages, and send data to a JavaScript-based web client.
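
    As a sketch of what JSON storage for a calculation might look like, the Python below builds and serializes a minimal record. The schema is invented for illustration and is not the project's actual format.

        # Minimal, invented JSON record for a computational chemistry result.
        import json

        record = {
            "molecule": {
                "atoms": ["O", "H", "H"],
                "coords": [[0.00, 0.00, 0.00],
                           [0.96, 0.00, 0.00],
                           [-0.24, 0.93, 0.00]],
            },
            "calculation": {"code": "NWChem", "method": "DFT/B3LYP",
                            "basis": "6-31G*"},
            "results": {"energy_hartree": -76.4089},
        }
        print(json.dumps(record, indent=2))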

  13. Open chemistry: RESTful web APIs, JSON, NWChem and the modern web application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanwell, Marcus D.; de Jong, Wibe A.; Harris, Christopher J.

    An end-to-end platform for chemical science research has been developed that integrates data from computational and experimental approaches through a modern web-based interface. The platform offers an interactive visualization and analytics environment that functions well on mobile, laptop and desktop devices. It offers pragmatic solutions to ensure that large and complex data sets are more accessible. Existing desktop applications/frameworks were extended to integrate with high-performance computing resources, and offer command-line tools to automate interaction - connecting distributed teams to this software platform on their own terms. The platform was developed openly, and all source code is hosted on the GitHub platform, with automated deployment possible using Ansible coupled with standard Ubuntu-based machine images deployed to cloud machines. The platform is designed to enable teams to reap the benefits of the connected web - going beyond what conventional search and analytics platforms offer in this area. It also has the goal of offering federated instances that can be customized to the sites/research performed. Data are stored using JSON, extending upon previous approaches using XML, building structures that support computational chemistry calculations. These structures were developed to make it easy to process data across different languages, and send data to a JavaScript-based web client.

  14. Correlation of quantitative dual-energy computed tomography iodine maps and abdominal computed tomography perfusion measurements: are single-acquisition dual-energy computed tomography iodine maps more than a reduced-dose surrogate of conventional computed tomography perfusion?

    PubMed

    Stiller, Wolfram; Skornitzke, Stephan; Fritz, Franziska; Klauss, Miriam; Hansen, Jens; Pahn, Gregor; Grenacher, Lars; Kauczor, Hans-Ulrich

    2015-10-01

    Study objectives were the quantitative evaluation of whether conventional abdominal computed tomography (CT) perfusion measurements mathematically correlate with quantitative single-acquisition dual-energy CT (DECT) iodine concentration maps, the determination of the optimum time of acquisition for achieving maximum correlation, and the estimation of the potential for radiation exposure reduction when replacing conventional CT perfusion by single-acquisition DECT iodine concentration maps. Dual-energy CT perfusion sequences were dynamically acquired over 51 seconds (34 acquisitions every 1.5 seconds) in 24 patients with histologically verified pancreatic carcinoma using dual-source DECT at tube potentials of 80 kVp and 140 kVp. Using software developed in-house, perfusion maps were calculated from 80-kVp image series using the maximum slope model after deformable motion correction. In addition, quantitative iodine maps were calculated for each of the 34 DECT acquisitions per patient. Within a manual segmentation of the pancreas, voxel-by-voxel correlation between the perfusion map and each of the iodine maps was calculated for each patient to determine the optimum time of acquisition t_opt, defined as the acquisition time of the iodine map with the highest correlation coefficient. Subsequently, regions of interest were placed inside the tumor and inside healthy pancreatic tissue, and correlation between mean perfusion values and mean iodine concentrations within these regions of interest at t_opt was calculated for the patient sample. The mean (SD) t_opt was 31.7 (5.4) seconds after the start of contrast agent injection. The mean (SD) perfusion values for healthy pancreatic and tumor tissues were 67.8 (26.7) mL per 100 mL/min and 43.7 (32.2) mL per 100 mL/min, respectively. At t_opt, the mean (SD) iodine concentrations were 2.07 (0.71) mg/mL in healthy pancreatic and 1.69 (0.98) mg/mL in tumor tissue, respectively. Overall, the correlation between perfusion values and iodine concentrations was high (0.77), with correlation of 0.89 in tumor and of 0.56 in healthy pancreatic tissue at t_opt. Comparing radiation exposure associated with a single DECT acquisition at t_opt (0.18 mSv) to that of an 80 kVp CT perfusion sequence (2.96 mSv) indicates that an average reduction of the effective dose D_eff by 94% could be achieved by replacing conventional CT perfusion with a single-acquisition DECT iodine concentration map. Quantitative iodine concentration maps obtained with DECT correlate well with conventional abdominal CT perfusion measurements, suggesting that quantitative iodine maps calculated from a single DECT acquisition at an organ-specific and patient-specific optimum time of acquisition might be able to replace conventional abdominal CT perfusion measurements if the time of acquisition is carefully calibrated. This could lead to large reductions of radiation exposure to the patients while offering quantitative perfusion data for diagnosis.
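
    The two quantities being correlated can be sketched as follows in Python, assuming synthetic enhancement curves: a maximum-slope perfusion estimate per voxel from a dynamic series, and a Pearson correlation against per-voxel iodine values. Units, motion correction, and the calibration of the real study are omitted.

        # Maximum-slope perfusion per voxel and voxel-wise correlation with
        # iodine values. All curves and scalings are synthetic stand-ins.
        import numpy as np

        t = np.arange(0, 51, 1.5)                        # acquisition times, s
        aif = 300 / (1 + np.exp(-(t - 18) / 3))          # toy arterial input, HU

        amp = np.random.uniform(20, 60, 500)             # 500 toy voxels
        tac = amp[:, None] / (1 + np.exp(-(t - 25) / 4)) # tissue curves, HU

        max_slope = np.gradient(tac, t, axis=1).max(axis=1)   # HU/s per voxel
        perfusion = max_slope / aif.max()                # maximum slope model

        iodine = 2.0 * perfusion / perfusion.mean() + 0.1 * np.random.randn(500)
        r = np.corrcoef(perfusion, iodine)[0, 1]         # voxel-wise correlation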

  15. The Proposed Use of Unmanned Aerial System Surrogate Research Aircraft for National Airspace System Integration Research

    NASA Technical Reports Server (NTRS)

    Howell, Charles T., III

    2011-01-01

    Research is needed to determine what procedures, aircraft sensors and other systems will be required to allow Unmanned Aerial Systems (UAS) to safely operate with manned aircraft in the National Airspace System (NAS). This paper explores the use of Unmanned Aerial System (UAS) Surrogate research aircraft to serve as platforms for UAS systems research, development, and flight testing. These aircraft would be manned with safety pilots and researchers that would allow for flight operations almost anywhere in the NAS without the need for a Federal Aviation Administration (FAA) Certificate of Authorization (COA). With pilot override capability, these UAS Surrogate aircraft would be controlled from ground stations like true UASs. It would be possible to file and fly these UAS Surrogate aircraft in the NAS with normal traffic, and they would be better platforms for real-world UAS research and development than existing vehicles flying in restricted ranges or other sterilized airspace. These UAS Surrogate aircraft could be outfitted with research systems as required, such as computers, state sensors, video recording, data acquisition, data link, telemetry, instrumentation, and Automatic Dependent Surveillance-Broadcast (ADS-B). These surrogate aircraft could also be linked to onboard or ground based simulation facilities to further extend UAS research capabilities. Potential areas for UAS Surrogate research include the development, flight test and evaluation of sensors to aid in the process of air traffic "see-and-avoid". These and other sensors could be evaluated in real-time and compared with onboard human evaluation pilots. This paper examines the feasibility of using UAS Surrogate research aircraft as test platforms for a variety of UAS related research.

  16. A Versatile Embedded Platform for EMG Acquisition and Gesture Recognition.

    PubMed

    Benatti, Simone; Casamassima, Filippo; Milosevic, Bojan; Farella, Elisabetta; Schönle, Philipp; Fateh, Schekeb; Burger, Thomas; Huang, Qiuting; Benini, Luca

    2015-10-01

    Wearable devices offer interesting features, such as low cost and user friendliness, but their use for medical applications is an open research topic, given the limited hardware resources they provide. In this paper, we present an embedded solution for real-time EMG-based hand gesture recognition. The work focuses on the multi-level design of the system, integrating the hardware and software components to develop a wearable device capable of acquiring and processing EMG signals for real-time gesture recognition. The system combines the accuracy of a custom analog front end with the flexibility of a low power and high performance microcontroller for on-board processing. Our system achieves the same accuracy as high-end and more expensive active EMG sensors used in applications with strict requirements on signal quality. At the same time, due to its flexible configuration, it can be compared to the few wearable platforms designed for EMG gesture recognition available on the market. We demonstrate that we reach similar or better performance while embedding the gesture recognition on board, with the benefit of cost reduction. To validate this approach, we collected a dataset of 7 gestures from 4 users, which was used to evaluate the impact of the number of EMG channels, the number of recognized gestures and the data rate on the recognition accuracy and on the computational demand of the classifier. As a result, we implemented an SVM recognition algorithm capable of real-time performance on the proposed wearable platform, achieving a classification rate of 90%, in line with state-of-the-art off-line results, and a power consumption of 29.7 mW, guaranteeing 44 hours of continuous operation with a 400 mAh battery.
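
    The recognition stage can be sketched at a high level in Python, assuming windowed RMS features per channel feeding an SVM; scikit-learn is used for brevity, whereas the paper runs the trained classifier on the wearable's microcontroller. The gesture data here are synthetic.

        # Windowed RMS features per EMG channel, classified with an SVM.
        # Synthetic data; the paper embeds the trained SVM on the device.
        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.svm import SVC

        def rms_features(window):
            """Root-mean-square of each channel over one window."""
            return np.sqrt((window ** 2).mean(axis=0))

        n_ch, n_gestures, wins_per_gesture = 8, 7, 100
        X = np.vstack([
            rms_features(np.random.randn(128, n_ch) * (1.0 + 0.3 * g))
            for g in range(n_gestures) for _ in range(wins_per_gesture)
        ])
        y = np.repeat(np.arange(n_gestures), wins_per_gesture)

        Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3,
                                              random_state=0)
        clf = SVC(kernel="rbf", C=1.0).fit(Xtr, ytr)
        print("accuracy:", clf.score(Xte, yte))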

  17. An Open Data Platform in the framework of the EGI-LifeWatch Competence Center

    NASA Astrophysics Data System (ADS)

    Aguilar Gómez, Fernando; de Lucas, Jesús Marco; Yaiza Rodríguez Marrero, Ana

    2016-04-01

    The working pilot of an Open Data Platform supporting the full data cycle in research is presented. It aims to preserve knowledge explicitly, starting with the description of the case studies, and integrating data and software management and preservation on an equal basis. The uninterrupted support chain starts at the data acquisition level and covers up to the support for reuse and publication in an open framework, providing integrity and provenance controls. The LifeWatch Open Science Framework is a pilot web portal developed in collaboration with different commercial companies that enriches and integrates different data lifecycle-related tools in order to address the management of the different steps: data planning, gathering, storing, curation, preservation, sharing, discovering, etc. To achieve this goal, the platform includes the following features:
    - Data Management Planning: a tool to set up a structure for the data, including what data will be generated and how it will be exploited, re-used, curated, and preserved. It takes a semantic approach, referencing ontologies to express what data will be gathered.
    - Close to instrumentation: the portal includes a distributed storage system that can be used both for storing data from instruments and output data from analysis. All that data can be shared.
    - Analysis: resources from the EGI Federated Cloud are accessible within the portal, so that users can exploit computing resources to perform analyses and other processes, including workflows.
    - Preservation: data can be preserved in different systems, and DOIs can be minted not only for datasets but also for software, DMPs, etc.
    The presentation will show the different components of the framework as well as how it can be extrapolated to other communities.

  18. Technical note: RabbitCT--an open platform for benchmarking 3D cone-beam reconstruction algorithms.

    PubMed

    Rohkohl, C; Keck, B; Hofmann, H G; Hornegger, J

    2009-09-01

    Fast 3D cone beam reconstruction is mandatory for many clinical workflows. For that reason, researchers and industry work hard on hardware-optimized 3D reconstruction. Backprojection is a major component of many reconstruction algorithms; it requires projecting each voxel onto the projection data, including data interpolation, before updating the voxel value. This step is the bottleneck of most reconstruction algorithms and the focus of optimization in recent publications. A crucial limitation of these publications, however, is that the presented results are not comparable to each other. This is mainly due to variations in data acquisition, preprocessing, and chosen geometries, and to the lack of a common publicly available test dataset. The authors provide such a standardized dataset, which allows for a fair comparison of hardware-accelerated backprojection methods. They developed an open platform, RabbitCT (www.rabbitCT.com), for worldwide comparison of backprojection performance and ranking on different architectures, using a specific high-resolution C-arm CT dataset of a rabbit. This includes a sophisticated benchmark interface, a prototype implementation in C++, and image quality measures. At the time of writing, six backprojection implementations are already listed on the website. Optimizations include multithreading using Intel Threading Building Blocks and OpenMP, vectorization using SSE, and computation on the GPU using CUDA 2.0. There is a need for objectively comparing backprojection implementations for reconstruction algorithms. RabbitCT aims to provide a solution to this problem by offering an open platform with fair chances for all participants. The authors are looking forward to a growing community and await feedback regarding future evaluations of novel software- and hardware-based acceleration schemes.
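
    For orientation, the sketch below shows the voxel-driven backprojection loop that RabbitCT benchmarks, in plain NumPy with nearest-neighbour interpolation and a toy projection matrix; the actual benchmark uses bilinear interpolation and a calibrated C-arm geometry, so treat every detail here as an assumption.

        import numpy as np

        def backproject(volume, proj, P):
            # volume: (Z, Y, X) accumulator updated in place
            # proj:   2-D projection image (detector rows v, columns u)
            # P:      3x4 matrix mapping voxel (x, y, z, 1) -> (u*w, v*w, w)
            Z, Y, X = volume.shape
            for z in range(Z):
                for y in range(Y):
                    for x in range(X):
                        w = P[2, 0]*x + P[2, 1]*y + P[2, 2]*z + P[2, 3]
                        u = (P[0, 0]*x + P[0, 1]*y + P[0, 2]*z + P[0, 3]) / w
                        v = (P[1, 0]*x + P[1, 1]*y + P[1, 2]*z + P[1, 3]) / w
                        ui, vi = int(round(u)), int(round(v))  # nearest neighbour
                        if 0 <= vi < proj.shape[0] and 0 <= ui < proj.shape[1]:
                            volume[z, y, x] += proj[vi, ui] / (w * w)
            return volume

        # Toy geometry: orthographic-like projection onto a 32x32 detector
        P = np.array([[1.0, 0, 0, 0], [0, 1.0, 0, 0], [0, 0, 0, 1.0]])
        vol = backproject(np.zeros((16, 32, 32)), np.ones((32, 32)), P)
        print(vol.sum())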

  19. Automated Work Package: Initial Wireless Communication Platform Design, Development, and Evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Al Rashdan, Ahmad Yahya Mohammad; Agarwal, Vivek

    The Department of Energy’s Light Water Reactor Sustainability Program is developing the scientific basis to ensure long-term reliability, productivity, safety, and security of the nuclear power industry in the United States. The Instrumentation, Information, and Control (II&C) pathway of the program aims to increase the role of advanced II&C technologies to achieve this objective. One of the pathway efforts at Idaho National Laboratory (INL) is to improve the work package execution process by replacing the expensive, inefficient, bulky, complex, and error-prone paper-based work orders with automated work packages (AWPs). An AWP is an automated and dynamic presentation of the work package designed to guide the user through the work process. It is loaded on a mobile device, such as a tablet, and is capable of communicating with plant equipment and systems to acquire plant and procedure states. The AWP replaces those functions where a computer is more efficient and reliable than a human. To enable the automatic acquisition of plant data, it is necessary to design and develop a prototype platform for data exchange between the field instruments and the AWP mobile devices. The development of the platform aims to reveal issues and solutions generalizable to large-scale implementation of a similar system. Topics such as bandwidth, robustness, response time, interference, and security are usually associated with wireless communication. These concerns, along with other requirements, are listed in an earlier INL report. Specifically, the targeted issues and performance aspects in this work are relevant to the communication infrastructure from the perspective of promptness, robustness, expandability, and interoperability with different technologies.

  20. The Relationship between Chief Information Officer Transformational Leadership and Computing Platform Operating Systems

    ERIC Educational Resources Information Center

    Anderson, George W.

    2010-01-01

    The purpose of this study was to relate the strength of Chief Information Officer (CIO) transformational leadership behaviors to 1 of 5 computing platform operating systems (OSs) that may be selected for a firm's Enterprise Resource Planning (ERP) business system. Research shows executive leader behaviors may promote innovation through the use of…

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, C.; Yu, G.; Wang, K.

    The physical design of new-concept reactors, which feature complex structures, various materials, and wide neutron energy spectra, has greatly raised the requirements for calculation methods and the corresponding computing hardware. Along with widely used parallel algorithms, heterogeneous platform architectures have been introduced into numerical computations in reactor physics. Because of its naturally parallel characteristics, the CPU-FPGA architecture is often used to accelerate numerical computation. This paper studies the application and features of this kind of heterogeneous platform in numerical calculations of reactor physics through practical examples. A neutron diffusion module designed on the CPU-FPGA architecture achieves an 11.2x speedup, demonstrating that it is feasible to apply this kind of heterogeneous platform to reactor physics. (authors)
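
    As a software point of reference for the accelerated kernel, the sketch below iterates a Jacobi solver for a one-group 2-D neutron diffusion equation; the discretization, coefficients, and boundary handling are illustrative assumptions, not the authors' FPGA design.

        import numpy as np

        def diffusion_jacobi(S, D=1.0, sigma_a=0.1, h=1.0, iters=500):
            # Solve -D * laplacian(phi) + sigma_a * phi = S on a square grid
            # with phi = 0 on the boundary, by Jacobi iteration.
            phi = np.zeros_like(S)
            for _ in range(iters):
                nb = (np.roll(phi, 1, 0) + np.roll(phi, -1, 0) +
                      np.roll(phi, 1, 1) + np.roll(phi, -1, 1))
                phi = (S * h * h + D * nb) / (4.0 * D + sigma_a * h * h)
                # re-impose zero-flux edges (np.roll wraps, so clamp them)
                phi[0, :] = phi[-1, :] = phi[:, 0] = phi[:, -1] = 0.0
            return phi

        src = np.zeros((64, 64)); src[28:36, 28:36] = 1.0  # localized source
        flux = diffusion_jacobi(src)
        print("peak flux:", float(flux.max()))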

  2. Modular HPC I/O characterization with Darshan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snyder, Shane; Carns, Philip; Harms, Kevin

    2016-11-13

    Contemporary high-performance computing (HPC) applications encompass a broad range of distinct I/O strategies and are often executed on a number of different compute platforms in their lifetime. These large-scale HPC platforms employ increasingly complex I/O subsystems to provide a suitable level of I/O performance to applications. Tuning I/O workloads for such a system is nontrivial, and the results generally are not portable to other HPC systems. I/O profiling tools can help to address this challenge, but most existing tools instrument only specific components within the I/O subsystem, providing a limited perspective on I/O performance. The increasing diversity of scientific applications and computing platforms calls for greater flexibility and scope in I/O characterization.
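
    To convey the flavor of I/O characterization, the toy sketch below counts calls and bytes per file through a wrapper class; Darshan itself intercepts I/O at link time in C and records far richer counters, so this is only a conceptual analogy.

        import collections

        class TracedFile:
            """Minimal Darshan-style instrumentation: count calls and bytes per file."""
            stats = collections.Counter()

            def __init__(self, path, mode="r"):
                self.path, self._f = path, open(path, mode)

            def write(self, data):
                TracedFile.stats[(self.path, "write_calls")] += 1
                TracedFile.stats[(self.path, "write_bytes")] += len(data)
                return self._f.write(data)

            def read(self, n=-1):
                data = self._f.read(n)
                TracedFile.stats[(self.path, "read_calls")] += 1
                TracedFile.stats[(self.path, "read_bytes")] += len(data)
                return data

            def close(self):
                self._f.close()

        f = TracedFile("demo.txt", "w"); f.write("hello world"); f.close()
        print(dict(TracedFile.stats))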

  3. Research on private cloud computing based on analysis on typical opensource platform: a case study with Eucalyptus and Wavemaker

    NASA Astrophysics Data System (ADS)

    Yu, Xiaoyuan; Yuan, Jian; Chen, Shi

    2013-03-01

    Cloud computing is one of the most popular topics in the IT industry and has recently been adopted by many companies. It has four deployment models: public cloud, community cloud, hybrid cloud, and private cloud. Unlike the others, a private cloud can be implemented within a private network, delivering some of the benefits of cloud computing while avoiding some of its pitfalls. This paper compares typical open source platforms with which a private cloud can be implemented. After this comparison, we choose Eucalyptus and Wavemaker for a case study on the private cloud. We also estimate the performance of cloud platform services and develop prototype software offered as cloud services.

  4. Using the High-Level Based Program Interface to Facilitate the Large Scale Scientific Computing

    PubMed Central

    Shang, Yizi; Shang, Ling; Gao, Chuanchang; Lu, Guiming; Ye, Yuntao; Jia, Dongdong

    2014-01-01

    This paper extends research on facilitating large-scale scientific computing on grid and desktop grid platforms. The related issues include the programming method, the overhead of middleware based on a high-level program interface, and anticipated data migration. The block-based Gauss-Jordan algorithm, as a real example of large-scale scientific computing, is used to evaluate these issues. The results show that the high-level program interface makes complex scientific applications on large-scale platforms easier to develop, though a little overhead is unavoidable. Also, the anticipated data migration mechanism can improve the efficiency of a platform that needs to process big-data-based scientific applications. PMID:24574931
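
    For reference, the sketch below implements the (unblocked) Gauss-Jordan inversion that the block-based algorithm partitions across grid workers; the blocking and middleware layers of the paper are not reproduced here.

        import numpy as np

        def gauss_jordan_inverse(A):
            """Invert A by Gauss-Jordan elimination with partial pivoting.
            The paper distributes a blocked variant of these row operations
            across grid workers; here the whole matrix is handled locally."""
            n = A.shape[0]
            M = np.hstack([A.astype(float), np.eye(n)])      # augmented [A | I]
            for col in range(n):
                pivot = col + np.argmax(np.abs(M[col:, col]))  # partial pivoting
                M[[col, pivot]] = M[[pivot, col]]
                M[col] /= M[col, col]                          # normalize pivot row
                for row in range(n):
                    if row != col:
                        M[row] -= M[row, col] * M[col]         # eliminate column
            return M[:, n:]

        A = np.array([[4.0, 2.0], [1.0, 3.0]])
        print(np.allclose(gauss_jordan_inverse(A) @ A, np.eye(2)))  # True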

  5. Design Strategy for a Formally Verified Reliable Computing Platform

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Caldwell, James L.; DiVito, Ben L.

    1991-01-01

    This paper presents a high-level design for a reliable computing platform for real-time control applications. The design tradeoffs and analyses related to the development of a formally verified reliable computing platform are discussed. The design strategy advocated in this paper requires the use of techniques that can be completely characterized mathematically as opposed to more powerful or more flexible algorithms whose performance properties can only be analyzed by simulation and testing. The need for accurate reliability models that can be related to the behavior models is also stressed. Tradeoffs between reliability and voting complexity are explored. In particular, the transient recovery properties of the system are found to be fundamental to both the reliability analysis and the "correctness" models.
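
    As a minimal illustration of the voting logic whose complexity the paper trades against reliability, the sketch below implements an exact-match majority voter over redundant channel outputs (an assumed simplification of such a platform's voters).

        from collections import Counter

        def majority_vote(replicas):
            """Exact-match majority voter over redundant channel outputs.
            Returns the majority value, or None if no strict majority exists
            (a detected but uncorrectable fault)."""
            value, count = Counter(replicas).most_common(1)[0]
            return value if count > len(replicas) // 2 else None

        print(majority_vote([42, 42, 17]))  # 42: single faulty channel masked
        print(majority_vote([42, 17, 99]))  # None: no majority, fault detected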

  6. Analysis of outcomes in radiation oncology: An integrated computational platform

    PubMed Central

    Liu, Dezhi; Ajlouni, Munther; Jin, Jian-Yue; Ryu, Samuel; Siddiqui, Farzan; Patel, Anushka; Movsas, Benjamin; Chetty, Indrin J.

    2009-01-01

    Radiotherapy research and outcome analyses are essential for evaluating new methods of radiation delivery and for assessing the benefits of a given technology on locoregional control and overall survival. In this article, a computational platform is presented to facilitate radiotherapy research and outcome studies in radiation oncology. This computational platform consists of (1) an infrastructural database that stores patient diagnosis, IMRT treatment details, and follow-up information, (2) an interface tool that is used to import and export IMRT plans in DICOM RT and AAPM/RTOG formats from a wide range of planning systems to facilitate reproducible research, (3) a graphical data analysis and programming tool that visualizes all aspects of an IMRT plan including dose, contour, and image data to aid the analysis of treatment plans, and (4) a software package that calculates radiobiological models to evaluate IMRT treatment plans. Given the limited number of general-purpose computational environments for radiotherapy research and outcome studies, this computational platform represents a powerful and convenient tool that is well suited for analyzing dose distributions biologically and correlating them with the delivered radiation dose distributions and other patient-related clinical factors. In addition the database is web-based and accessible by multiple users, facilitating its convenient application and use. PMID:19544785

  7. Superconducting Optoelectronic Circuits for Neuromorphic Computing

    NASA Astrophysics Data System (ADS)

    Shainline, Jeffrey M.; Buckley, Sonia M.; Mirin, Richard P.; Nam, Sae Woo

    2017-03-01

    Neural networks have proven effective for solving many difficult computational problems, yet implementing complex neural networks in software is computationally expensive. To explore the limits of information processing, it is necessary to implement new hardware platforms with large numbers of neurons, each with a large number of connections to other neurons. Here we propose a hybrid semiconductor-superconductor hardware platform for the implementation of neural networks and large-scale neuromorphic computing. The platform combines semiconducting few-photon light-emitting diodes with superconducting-nanowire single-photon detectors to behave as spiking neurons. These processing units are connected via a network of optical waveguides, and variable weights of connection can be implemented using several approaches. The use of light as a signaling mechanism overcomes fanout and parasitic constraints on electrical signals while simultaneously introducing physical degrees of freedom which can be employed for computation. The use of supercurrents achieves the low power density (1 mW/cm^2 at a 20 MHz firing rate) necessary to scale to systems with enormous entropy. Estimates comparing the proposed hardware platform to a human brain show that with the same number of neurons (10^11) and 700 independent connections per neuron, the hardware presented here may achieve an order of magnitude improvement in synaptic events per second per watt.

  8. 48 CFR 227.7202 - Commercial computer software and commercial computer software documentation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... software and commercial computer software documentation. 227.7202 Section 227.7202 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202 Commercial computer software and commercial computer software documentation. ...

  9. 48 CFR 227.7203 - Noncommercial computer software and noncommercial computer software documentation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... software and noncommercial computer software documentation. 227.7203 Section 227.7203 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203 Noncommercial computer software and noncommercial computer software documentation. ...

  10. 48 CFR 227.7203 - Noncommercial computer software and noncommercial computer software documentation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... software and noncommercial computer software documentation. 227.7203 Section 227.7203 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203 Noncommercial computer software and noncommercial computer software documentation. ...

  11. 48 CFR 227.7203 - Noncommercial computer software and noncommercial computer software documentation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... software and noncommercial computer software documentation. 227.7203 Section 227.7203 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203 Noncommercial computer software and noncommercial computer software documentation. ...

  12. 48 CFR 227.7202 - Commercial computer software and commercial computer software documentation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... software and commercial computer software documentation. 227.7202 Section 227.7202 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202 Commercial computer software and commercial computer software documentation. ...

  13. 48 CFR 227.7203 - Noncommercial computer software and noncommercial computer software documentation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... software and noncommercial computer software documentation. 227.7203 Section 227.7203 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203 Noncommercial computer software and noncommercial computer software documentation. ...

  14. 48 CFR 227.7202 - Commercial computer software and commercial computer software documentation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... software and commercial computer software documentation. 227.7202 Section 227.7202 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202 Commercial computer software and commercial computer software documentation. ...

  15. 48 CFR 227.7202 - Commercial computer software and commercial computer software documentation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... software and commercial computer software documentation. 227.7202 Section 227.7202 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202 Commercial computer software and commercial computer software documentation. ...

  16. 48 CFR 227.7202 - Commercial computer software and commercial computer software documentation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... software and commercial computer software documentation. 227.7202 Section 227.7202 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7202 Commercial computer software and commercial computer software documentation. ...

  17. 48 CFR 227.7203 - Noncommercial computer software and noncommercial computer software documentation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... software and noncommercial computer software documentation. 227.7203 Section 227.7203 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203 Noncommercial computer software and noncommercial computer software documentation. ...

  18. TDat: An Efficient Platform for Processing Petabyte-Scale Whole-Brain Volumetric Images.

    PubMed

    Li, Yuxin; Gong, Hui; Yang, Xiaoquan; Yuan, Jing; Jiang, Tao; Li, Xiangning; Sun, Qingtao; Zhu, Dan; Wang, Zhenyu; Luo, Qingming; Li, Anan

    2017-01-01

    Three-dimensional imaging of whole mammalian brains at single-neuron resolution has generated terabyte (TB)- and even petabyte (PB)-sized datasets. Due to their size, processing these massive image datasets can be hindered by the computer hardware and software typically found in biological laboratories. To fill this gap, we have developed an efficient platform named TDat, which adopts a novel data reformatting strategy by reading cuboid data and employing parallel computing. In data reformatting, TDat is more efficient than other available software. In data access, we adopted parallelization to fully exploit the data transmission capability of the computer. We applied TDat to large-volume rigid registration and neuron tracing in whole-brain data at single-neuron resolution, which had not been demonstrated in other studies. We also showed its compatibility with various computing platforms, image processing software, and imaging systems.

  19. Microcomputer data acquisition and control.

    PubMed

    East, T D

    1986-01-01

    In medicine and biology there are many tasks that involve routine, well-defined procedures. These tasks are ideal candidates for computerized data acquisition and control. As the performance of microcomputers rapidly increases and cost continues to go down, the temptation to automate the laboratory becomes great. To the novice computer user the choices of hardware and software are overwhelming, and sadly most computer salespeople are not at all familiar with real-time applications. If you want to bill your patients you have hundreds of packaged systems to choose from; however, if you want to do real-time data acquisition the choices are very limited and confusing. The purpose of this chapter is to provide the novice computer user with the basics needed to set up a real-time data acquisition system with common microcomputers. This chapter covers the following issues necessary to establish a real-time data acquisition and control system:
    - Analysis of the research problem: definition of the problem; description of data and sampling requirements; cost/benefit analysis.
    - Choice of microcomputer hardware and software: choice of microprocessor and bus structure; choice of operating system; choice of layered software.
    - Digital data acquisition: parallel data transmission; serial data transmission; hardware and software available.
    - Analog data acquisition: description of the amplitude and frequency characteristics of the input signals; the sampling theorem; specification of the analog-to-digital converter (see the worked sketch below); hardware and software available; interface to the microcomputer.
    - Microcomputer control: analog output; digital output; closed-loop control.
    - Microcomputer data acquisition and control in the 21st century: what is in the future? High-speed digital medical equipment networks; medical decision making and artificial intelligence.
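
    As a worked illustration of the sampling-theorem and ADC-specification items above, the sketch below runs the arithmetic for an assumed biomedical signal; all numbers are hypothetical.

        # Worked example (assumed numbers): specify sampling for an ECG-like signal.
        f_max = 150.0            # highest frequency component of interest, Hz
        nyquist_rate = 2 * f_max
        f_s = 5 * f_max          # practical oversampling margin above Nyquist
        print(f"sample at {f_s:.0f} Hz (Nyquist minimum {nyquist_rate:.0f} Hz)")

        v_range, bits = 10.0, 12       # ADC input span (volts) and resolution
        lsb = v_range / (2 ** bits)    # quantization step per code
        print(f"12-bit ADC over {v_range} V resolves {lsb * 1e3:.2f} mV per code")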

  20. Airborne and Ground-Based Platforms for Data Collection in Small Vineyards: Examples from the UK and Switzerland

    NASA Astrophysics Data System (ADS)

    Green, David R.; Gómez, Cristina; Fahrentrapp, Johannes

    2015-04-01

    This paper presents an overview of some of the low-cost ground and airborne platforms and technologies now becoming available for data collection in small area vineyards. Low-cost UAV or UAS platforms and cameras are now widely available as the means to collect both vertical and oblique aerial still photography and airborne videography in vineyards. Examples of small aerial platforms include the Parrot AR.Drone, the DJI Phantom (1 and 2), and the 3D Robotics IRIS+. Both fixed-wing and rotary-wing platforms offer numerous advantages for aerial image acquisition, including the freedom to obtain high resolution imagery at any time required. Imagery captured can be stored on mobile devices such as an Apple iPad and shared, written directly to a memory stick or card, or saved to the Cloud. The imagery can either be visually interpreted or subjected to semi-automated analysis using digital image processing (DIP) software to extract information about vine status or the vineyard environment. At the ground level, a radio-controlled 'rugged' model 4x4 vehicle can also be used as a mobile platform to carry a number of sensors (e.g. a GoPro camera) around a vineyard, thereby facilitating quick and easy field data collection from both within the vine canopy and rows. For the small vineyard owner/manager with limited financial resources, this technology has a number of distinct advantages to aid in vineyard management practices: it is relatively cheap to purchase; requires a short learning curve to use and to master; can make use of autonomous ground control units for repetitive coverage, enabling reliable monitoring; and information can easily be analysed and integrated within a GIS with minimal expertise. In addition, these platforms make widespread use of familiar, everyday, off-the-shelf technologies such as WiFi, GoPro cameras, Cloud computing, and smartphones or tablets as the control interface, all with a large and well-established end-user support base. Whilst there are still some limitations which constrain their use, including battery power and flight time, data connectivity, and payload capacity, such platforms nevertheless offer quick, low-cost, easy, and repeatable ways to capture valuable contextual data for small vineyards, complementing other sources of data used in Precision Viticulture (PV) and vineyard management. As these technologies continue to evolve very quickly, and more lightweight sensors become available for the smaller ground and airborne platforms, this will offer even more possibilities for a wider range of information to be acquired to aid in the monitoring, mapping, and management of small vineyards. The paper is illustrated with some examples from the UK and Switzerland.

  1. Interfacing HTCondor-CE with OpenStack

    NASA Astrophysics Data System (ADS)

    Bockelman, B.; Caballero Bejar, J.; Hover, J.

    2017-10-01

    Over the past few years, Grid Computing technologies have reached a high level of maturity. One key aspect of this success has been the development and adoption of newer Compute Elements to interface the external Grid users with local batch systems. These new Compute Elements allow for better handling of jobs requirements and a more precise management of diverse local resources. However, despite this level of maturity, the Grid Computing world is lacking diversity in local execution platforms. As Grid Computing technologies have historically been driven by the needs of the High Energy Physics community, most resource providers run the platform (operating system version and architecture) that best suits the needs of their particular users. In parallel, the development of virtualization and cloud technologies has accelerated recently, making available a variety of solutions, both commercial and academic, proprietary and open source. Virtualization facilitates performing computational tasks on platforms not available at most computing sites. This work attempts to join the technologies, allowing users to interact with computing sites through one of the standard Computing Elements, HTCondor-CE, but running their jobs within VMs on a local cloud platform, OpenStack, when needed. The system will re-route, in a transparent way, end user jobs into dynamically-launched VM worker nodes when they have requirements that cannot be satisfied by the static local batch system nodes. Also, once the automated mechanisms are in place, it becomes straightforward to allow an end user to invoke a custom Virtual Machine at the site. This will allow cloud resources to be used without requiring the user to establish a separate account. Both scenarios are described in this work.

  2. Simulation/Gaming and the Acquisition of Communicative Competence in Another Language.

    ERIC Educational Resources Information Center

    Garcia-Carbonell, Amparo; Rising, Beverly; Montero, Begona; Watts, Frances

    2001-01-01

    Discussion of communicative competence in second language acquisition focuses on a theoretical and practical meshing of simulation and gaming methodology with theories of foreign language acquisition, including task-based learning, interaction, and comprehensible input. Describes experiments conducted with computer-assisted simulations in…

  3. Sentinel-1 automatic processing chain for volcanic and seismic areas monitoring within the Geohazards Exploitation Platform (GEP)

    NASA Astrophysics Data System (ADS)

    De Luca, Claudio; Zinno, Ivana; Manunta, Michele; Lanari, Riccardo; Casu, Francesco

    2016-04-01

    The microwave remote sensing scenario is rapidly evolving through the development of new sensor technology for Earth Observation (EO). In particular, Sentinel-1A (S1A) is the first of a constellation of sensors designed to provide a satellite data stream for the Copernicus European program. Sentinel-1A has been specifically designed to provide, over land, Differential Interferometric Synthetic Aperture Radar (DInSAR) products to analyze and investigate Earth's surface displacements. S1A peculiarities include wide ground coverage (250 km swath), C-band operational frequency, and a short revisit time (which will drop from 12 to 6 days when the twin system Sentinel-1B is placed in orbit during 2016). Such characteristics, together with the global coverage acquisition policy, make the Sentinel-1 constellation extremely suitable for studying and monitoring volcanic and seismic areas worldwide, thus allowing the generation of ground displacement information with increasing rapidity together with new geological understanding. The main acquisition mode over land is the so-called Interferometric Wide Swath (IWS) mode, which is based on the Terrain Observation by Progressive Scans (TOPS) technique and guarantees the mentioned S1A large-coverage characteristics at the expense of non-trivial interferometric processing. Moreover, the satellite spatial coverage and the reduced revisit time will lead to an exponential increase of the data archives, which, after the launch of Sentinel-1B, will reach about 3 TB per day. Therefore, the EO scientific community needs, on the one hand, automated and effective DInSAR tools able to address the S1A processing complexity and, on the other hand, the computing and storage capacity to handle the expected large amount of data. It is thus becoming crucial to move processors and tools close to the satellite archives, since the approach of downloading and processing data with in-house computing facilities is no longer efficient. To address these issues, ESA recently funded the development of the Geohazards Exploitation Platform (GEP), a project aimed at bringing together data, processing tools, and results to make them accessible to the EO scientific community, with particular emphasis on the Geohazard Supersites & Natural Laboratories and the CEOS Seismic Hazards and Volcanoes Pilots. In this work we present the integration of the parallel version of a well-known DInSAR algorithm, referred to as the Small BAseline Subset (P-SBAS), within the GEP platform for processing Sentinel-1 data. The integration allowed us to set up an operational on-demand web tool, open to every user, that automatically processes S1A data to generate SBAS displacement time series. The main characteristics, as well as a number of experimental results obtained using the implemented web tool, will also be shown. This work is partially supported by: the RITMARE project of the Italian MIUR, the DPC-CNR agreement, and the ESA GEP project.

  4. The Data Acquisition and Control Systems of the Jet Noise Laboratory at the NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Jansen, B. J., Jr.

    1998-01-01

    The features of the data acquisition and control systems of the NASA Langley Research Center's Jet Noise Laboratory are presented. The Jet Noise Laboratory is a facility that simulates realistic mixed flow turbofan jet engine nozzle exhaust systems in simulated flight. The system is capable of acquiring data for a complete take-off assessment of noise and nozzle performance. This paper describes the development of an integrated system to control and measure the behavior of model jet nozzles featuring dual independent high pressure combusting air streams with wind tunnel flow. The acquisition and control system is capable of simultaneous measurement of forces, moments, static and dynamic model pressures and temperatures, and jet noise. The design concepts for the coordination of the control computers and multiple data acquisition computers and instruments are discussed. The control system design and implementation are explained, describing the features, equipment, and the experiences of using a primarily Personal Computer based system. Areas for future development are examined.

  5. Social Skills Training for Adolescents With Autism Spectrum Disorder Using Facebook (Project Rex Connect): A Survey Study

    PubMed Central

    Morriss, Danielle; Warren, Nancy; Truelove, James; Warthen, Jennifer; Ross, Charles Paul; Mood, George; Snook, Charlotte Anne; Borckardt, Jeffrey

    2017-01-01

    Background Adolescents with autism spectrum disorder (ASD) spend more time using electronic screen media than neurotypical peers; preliminary evidence suggests that computer-assisted or Web-based interventions may be beneficial for social skills acquisition. The current generation of adolescents accesses the Internet through computers or phones almost daily, and Facebook is the most frequently used social media platform among teenagers. This is the first research study to explore the use of Facebook as a therapeutic tool for adolescents with ASD. Objective To study the feasibility and clinical impact of using a Web-based social platform in combination with social skills training for adolescents with ASD. Methods This pilot study enrolled 6 participants (all males; mean age 14.1 years) in an online social skills training group using Facebook. Data was collected on the participants’ social and behavioral functioning at the start and conclusion of the intervention. Outcome measures included the Social Responsiveness Scale-2, the Social Skills Improvement System Rating Scale, and the Project Rex Parent Survey. Participants were surveyed at the conclusion of the intervention regarding their experience. Results No statistically significant differences in measurable outcomes were observed. However, the online addition of Facebook was well received by participants and their parents. The Facebook intervention was able to be executed with a careful privacy protocol in place and at minimal safety risk to participants. Conclusions The utilization of Facebook to facilitate delivery of social skills training for adolescents with ASD appears to be feasible, although the clinical impact of such an addition is still unclear. It is important to note that social difficulties of participants persisted with the addition of the online platform and participants still required assistance to engage with peers in an online environment. A Web-based intervention such as the one utilized in this study has the potential to reach a mass number of patients with ASD and could address disparities in access to in-person treatment services. However, the complexity and evolving nature of Facebook’s website and privacy settings leads to a number of unique online safety concerns that may limit its clinical utility. Issues encountered in our study support the development of an alternative and closed Web-based social platform designed specifically for the target audience with ASD; this platform could be a safer and more easily moderated setting for aiding in social skills development. Despite a small sample size with no statistically significant improvements of target symptoms, the use of electronic screen media as a therapeutic tool for adolescents with ASD is still a promising area of research warranting further investigation. Our study helps inform future obstacles regarding feasibility and safety. PMID:28115297

  6. Cloud Computing with iPlant Atmosphere.

    PubMed

    McKay, Sheldon J; Skidmore, Edwin J; LaRose, Christopher J; Mercer, Andre W; Noutsos, Christos

    2013-10-15

    Cloud Computing refers to distributed computing platforms that use virtualization software to provide easy access to physical computing infrastructure and data storage, typically administered through a Web interface. Cloud-based computing provides access to powerful servers, with specific software and virtual hardware configurations, while eliminating the initial capital cost of expensive computers and reducing the ongoing operating costs of system administration, maintenance contracts, power consumption, and cooling. This eliminates a significant barrier to entry into bioinformatics and high-performance computing for many researchers. This is especially true of free or modestly priced cloud computing services. The iPlant Collaborative offers a free cloud computing service, Atmosphere, which allows users to easily create and use instances on virtual servers preconfigured for their analytical needs. Atmosphere is a self-service, on-demand platform for scientific computing. This unit demonstrates how to set up, access and use cloud computing in Atmosphere. Copyright © 2013 John Wiley & Sons, Inc.

  7. Optical RISC computer

    NASA Astrophysics Data System (ADS)

    Guilfoyle, Peter S.; Stone, Richard V.; Hessenbruch, John M.; Zeise, Frederick F.

    1993-07-01

    A second generation digital optical computer (DOC II) has been developed which utilizes a RISC-based operating system as its host. This 32-bit, high-performance (12.8 GByte/sec) computing platform demonstrates a number of basic principles that are inherent to parallel free-space optical interconnects, such as speed (up to 10^12 bit operations per second) and low power (1.2 fJ per bit). Although DOC II is a general purpose machine, special purpose applications have been developed and are currently being evaluated on the optical platform.

  8. SOI layout decomposition for double patterning lithography on high-performance computer platforms

    NASA Astrophysics Data System (ADS)

    Verstov, Vladimir; Zinchenko, Lyudmila; Makarchuk, Vladimir

    2014-12-01

    In this paper, silicon-on-insulator layout decomposition algorithms for double patterning lithography on high-performance computing platforms are discussed. Our approach is based on the use of a contradiction graph and a modified concurrent breadth-first search algorithm. We evaluate our technique on the 45 nm Nangate Open Cell Library, including non-Manhattan geometry. Experimental results show that our soft computing algorithms decompose layouts successfully and that the minimal distance between polygons in the layout is increased.
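
    As a generic illustration (not the authors' modified concurrent algorithm), the sketch below two-colors a conflict graph by breadth-first search, which is the core decision behind assigning polygons to the two masks.

        from collections import deque

        def two_color(conflicts, n):
            """BFS 2-coloring of a conflict graph: nodes are layout polygons,
            edges join polygons too close to print on one mask. Returns a mask
            assignment, or None when an odd cycle makes decomposition fail."""
            adj = [[] for _ in range(n)]
            for a, b in conflicts:
                adj[a].append(b); adj[b].append(a)
            mask = [None] * n
            for start in range(n):
                if mask[start] is not None:
                    continue
                mask[start] = 0
                queue = deque([start])
                while queue:
                    u = queue.popleft()
                    for v in adj[u]:
                        if mask[v] is None:
                            mask[v] = 1 - mask[u]; queue.append(v)
                        elif mask[v] == mask[u]:
                            return None  # contradiction: needs stitching or redesign
            return mask

        print(two_color([(0, 1), (1, 2), (2, 3)], 4))  # [0, 1, 0, 1]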

  9. Transitioning ISR architecture into the cloud

    NASA Astrophysics Data System (ADS)

    Lash, Thomas D.

    2012-06-01

    Emerging cloud computing platforms offer an ideal opportunity for Intelligence, Surveillance, and Reconnaissance (ISR) intelligence analysis. Cloud computing platforms help overcome challenges and limitations of traditional ISR architectures. Modern ISR architectures can benefit from examining commercial cloud applications, especially as they relate to user experience, usage profiling, and transformational business models. This paper outlines legacy ISR architectures and their limitations, presents an overview of cloud technologies and their applications to the ISR intelligence mission, and presents an idealized ISR architecture implemented with cloud computing.

  10. Banking: financing trends in an acquisitive health care market--focus on long-term care.

    PubMed

    Gordon, L J; Bressler, A

    1998-01-01

    This article reviews the long-term care sector of the health care industry, particularly the factors driving sector consolidation and, through the use of four transactions as a platform, discusses key credit issues and risks facing long-term care companies.

  11. Experiments with Sensor Motes and Java-DSP

    ERIC Educational Resources Information Center

    Kwon, Homin; Berisha, V.; Atti, V.; Spanias, A.

    2009-01-01

    Distributed wireless sensor networks (WSNs) are being proposed for various applications including defense, security, and smart stages. The introduction of hardware wireless sensors in a signal processing education setting can serve as a paradigm for data acquisition, collaborative signal processing, or simply as a platform for obtaining,…

  12. High-speed multiple sequence alignment on a reconfigurable platform.

    PubMed

    Oliver, Tim; Schmidt, Bertil; Maskell, Douglas; Nathan, Darran; Clemens, Ralf

    2006-01-01

    Progressive alignment is a widely used approach to compute multiple sequence alignments (MSAs). However, aligning several hundred sequences with popular progressive alignment tools requires hours on sequential computers. Due to the rapid growth of sequence databases, biologists have to compute MSAs in a far shorter time. In this paper we present a new approach to MSA on reconfigurable hardware platforms to achieve high performance at low cost. We have constructed a linear systolic array to perform pairwise sequence distance computations using dynamic programming. This results in an implementation with significant runtime savings on a standard FPGA.
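
    The recurrence that the systolic array evaluates is ordinary dynamic programming; the sketch below is a plain software reference for pairwise edit distance (the FPGA design streams the same anti-diagonals through its processing elements).

        def edit_distance(s, t):
            """Classic dynamic-programming distance; a linear systolic array
            evaluates the same recurrence one anti-diagonal per clock cycle."""
            prev = list(range(len(t) + 1))
            for i, a in enumerate(s, 1):
                cur = [i]
                for j, b in enumerate(t, 1):
                    cur.append(min(prev[j] + 1,              # deletion
                                   cur[j - 1] + 1,           # insertion
                                   prev[j - 1] + (a != b)))  # substitution
                prev = cur
            return prev[-1]

        seqs = ["GATTACA", "GACTATA", "ATTACCA"]
        dist = [[edit_distance(a, b) for b in seqs] for a in seqs]
        print(dist)  # pairwise distance matrix used to build the MSA guide tree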

  13. Facilitating NASA Earth Science Data Processing Using Nebula Cloud Computing

    NASA Technical Reports Server (NTRS)

    Pham, Long; Chen, Aijun; Kempler, Steven; Lynnes, Christopher; Theobald, Michael; Asghar, Esfandiari; Campino, Jane; Vollmer, Bruce

    2011-01-01

    Cloud Computing has been implemented in several commercial arenas. The NASA Nebula Cloud Computing platform is an Infrastructure as a Service (IaaS) built in 2008 at NASA Ames Research Center and in 2010 at GSFC. Nebula is an open source Cloud platform intended to: a) make NASA realize significant cost savings through efficient resource utilization, reduced energy consumption, and reduced labor costs; b) provide an easier way for NASA scientists and researchers to efficiently explore and share large and complex data sets; and c) allow customers to provision, manage, and decommission computing capabilities on an as-needed basis.

  14. COED Transactions, Vol. XI, No. 2, February 1979. A Student Designed Microcomputer Based Data Acquisition System.

    ERIC Educational Resources Information Center

    Mitchell, Eugene E., Ed.

    In the context of an instrumentation course, four ocean engineering students set out to design and construct a microcomputer-based data acquisition system that would be compatible with the University's CYBER host computer. The project included hardware design in the areas of sampling, analog-to-digital conversion, and timing coordination. It also…

  15. Research and Development in the Computer and Information Sciences. Volume 1, Information Acquisition, Sensing, and Input: A Selective Literature Review.

    ERIC Educational Resources Information Center

    Stevens, Mary Elizabeth

    The series, of which this is the initial report, is intended to give a selective overview of research and development efforts and requirements in the computer and information sciences. The operations of information acquisition, sensing, and input to information processing systems are considered in generalized terms. Specific topics include but are…

  16. The Effect of Emphasizing Mathematical Structure in the Acquisition of Whole Number Computation Skills (Addition and Subtraction) By Seven- and Eight-Year Olds: A Clinical Investigation.

    ERIC Educational Resources Information Center

    Uprichard, A. Edward; Collura, Carolyn

    This investigation sought to determine the effect of emphasizing mathematical structure in the acquisition of computational skills by seven- and eight-year-olds. The meaningful development-of-structure approach emphasized closure, commutativity, associativity, and the identity element of addition; the inverse relationship between addition and…

  17. A new DoD initiative: the Computational Research and Engineering Acquisition Tools and Environments (CREATE) program

    NASA Astrophysics Data System (ADS)

    Arevalo, S.; Atwood, C.; Bell, P.; Blacker, T. D.; Dey, S.; Fisher, D.; Fisher, D. A.; Genalis, P.; Gorski, J.; Harris, A.; Hill, K.; Hurwitz, M.; Kendall, R. P.; Meakin, R. L.; Morton, S.; Moyer, E. T.; Post, D. E.; Strawn, R.; Veldhuizen, D. v.; Votta, L. G.; Wynn, S.; Zelinski, G.

    2008-07-01

    In FY2008, the U.S. Department of Defense (DoD) initiated the Computational Research and Engineering Acquisition Tools and Environments (CREATE) program, a $360M program with a two-year planning phase and a ten-year execution phase. CREATE will develop and deploy three computational engineering tool sets for DoD acquisition programs to use to design aircraft, ships and radio-frequency antennas. The planning and execution of CREATE are based on the 'lessons learned' from case studies of large-scale computational science and engineering projects. The case studies stress the importance of a stable, close-knit development team; a focus on customer needs and requirements; verification and validation; flexible and agile planning, management, and development processes; risk management; realistic schedules and resource levels; balanced short- and long-term goals and deliverables; and stable, long-term support by the program sponsor. Since it began in FY2008, the CREATE program has built a team and project structure, developed requirements and begun validating them, identified candidate products, established initial connections with the acquisition programs, begun detailed project planning and development, and generated the initial collaboration infrastructure necessary for success by its multi-institutional, multidisciplinary teams.

  18. Integration of lyoplate based flow cytometry and computational analysis for standardized immunological biomarker discovery.

    PubMed

    Villanova, Federica; Di Meglio, Paola; Inokuma, Margaret; Aghaeepour, Nima; Perucha, Esperanza; Mollon, Jennifer; Nomura, Laurel; Hernandez-Fuentes, Maria; Cope, Andrew; Prevost, A Toby; Heck, Susanne; Maino, Vernon; Lord, Graham; Brinkman, Ryan R; Nestle, Frank O

    2013-01-01

    Discovery of novel immune biomarkers for monitoring of disease prognosis and response to therapy in immune-mediated inflammatory diseases is an important unmet clinical need. Here, we establish a novel framework for immunological biomarker discovery, comparing a conventional (liquid) flow cytometry platform (CFP) and a unique lyoplate-based flow cytometry platform (LFP) in combination with advanced computational data analysis. We demonstrate that LFP had higher sensitivity compared to CFP, with increased detection of cytokines (IFN-γ and IL-10) and activation markers (Foxp3 and CD25). Fluorescent intensity of cells stained with lyophilized antibodies was increased compared to cells stained with liquid antibodies. LFP, using a plate loader, allowed medium-throughput processing of samples with comparable intra- and inter-assay variability between platforms. Automated computational analysis identified novel immunophenotypes that were not detected with manual analysis. Our results establish a new flow cytometry platform for standardized and rapid immunological biomarker discovery with wide application to immune-mediated diseases.

  19. Integration of Lyoplate Based Flow Cytometry and Computational Analysis for Standardized Immunological Biomarker Discovery

    PubMed Central

    Villanova, Federica; Di Meglio, Paola; Inokuma, Margaret; Aghaeepour, Nima; Perucha, Esperanza; Mollon, Jennifer; Nomura, Laurel; Hernandez-Fuentes, Maria; Cope, Andrew; Prevost, A. Toby; Heck, Susanne; Maino, Vernon; Lord, Graham; Brinkman, Ryan R.; Nestle, Frank O.

    2013-01-01

    Discovery of novel immune biomarkers for monitoring of disease prognosis and response to therapy in immune-mediated inflammatory diseases is an important unmet clinical need. Here, we establish a novel framework for immunological biomarker discovery, comparing a conventional (liquid) flow cytometry platform (CFP) and a unique lyoplate-based flow cytometry platform (LFP) in combination with advanced computational data analysis. We demonstrate that LFP had higher sensitivity compared to CFP, with increased detection of cytokines (IFN-γ and IL-10) and activation markers (Foxp3 and CD25). Fluorescent intensity of cells stained with lyophilized antibodies was increased compared to cells stained with liquid antibodies. LFP, using a plate loader, allowed medium-throughput processing of samples with comparable intra- and inter-assay variability between platforms. Automated computational analysis identified novel immunophenotypes that were not detected with manual analysis. Our results establish a new flow cytometry platform for standardized and rapid immunological biomarker discovery with wide application to immune-mediated diseases. PMID:23843942

  20. LabVIEW: a software system for data acquisition, data analysis, and instrument control.

    PubMed

    Kalkman, C J

    1995-01-01

    Computer-based data acquisition systems play an important role in clinical monitoring and in the development of new monitoring tools. LabVIEW (National Instruments, Austin, TX) is a data acquisition and programming environment that allows flexible acquisition and processing of analog and digital data. The main feature that distinguishes LabVIEW from other data acquisition programs is its highly modular graphical programming language, "G," and a large library of mathematical and statistical functions. The advantage of graphical programming is that the code is flexible, reusable, and self-documenting. Subroutines can be saved in a library and reused without modification in other programs. This dramatically reduces development time and enables researchers to develop or modify their own programs. LabVIEW uses a large amount of processing power and computer memory, thus requiring a powerful computer. A large-screen monitor is desirable when developing larger applications. LabVIEW is excellently suited for testing new monitoring paradigms, analysis algorithms, or user interfaces. The typical LabVIEW user is the researcher who wants to develop a new monitoring technique, a set of new (derived) variables by integrating signals from several existing patient monitors, closed-loop control of a physiological variable, or a physiological simulator.

  1. The Utility of Cognitive Plausibility in Language Acquisition Modeling: Evidence From Word Segmentation.

    PubMed

    Phillips, Lawrence; Pearl, Lisa

    2015-11-01

    The informativity of a computational model of language acquisition is directly related to how closely it approximates the actual acquisition task, sometimes referred to as the model's cognitive plausibility. We suggest that though every computational model necessarily idealizes the modeled task, an informative language acquisition model can aim to be cognitively plausible in multiple ways. We discuss these cognitive plausibility checkpoints generally and then apply them to a case study in word segmentation, investigating a promising Bayesian segmentation strategy. We incorporate cognitive plausibility by using an age-appropriate unit of perceptual representation, evaluating the model output in terms of its utility, and incorporating cognitive constraints into the inference process. Our more cognitively plausible model shows a beneficial effect of cognitive constraints on segmentation performance. One interpretation of this effect is as a synergy between the naive theories of language structure that infants may have and the cognitive constraints that limit the fidelity of their inference processes, where less accurate inference approximations are better when the underlying assumptions about how words are generated are less accurate. More generally, these results highlight the utility of incorporating cognitive plausibility more fully into computational models of language acquisition. Copyright © 2015 Cognitive Science Society, Inc.
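
    For intuition about the segmentation task (not the paper's Bayesian model), the sketch below segments an unsegmented character stream with a toy Viterbi search over a known lexicon; the cost values are arbitrary assumptions.

        def segment(text, lexicon):
            """Viterbi segmentation under a toy unigram model: each known word
            costs 1, unknown single characters cost 5 (illustrative penalties,
            much simpler than the Bayesian model discussed in the paper)."""
            n = len(text)
            best = [(0.0, 0)] + [(float("inf"), 0)] * n  # (cost, split point)
            for i in range(1, n + 1):
                for j in range(max(0, i - 10), i):       # words up to 10 chars
                    piece = text[j:i]
                    cost = best[j][0] + (1.0 if piece in lexicon else
                                         5.0 if len(piece) == 1 else float("inf"))
                    if cost < best[i][0]:
                        best[i] = (cost, j)
            words, i = [], n
            while i > 0:                                 # backtrack the splits
                j = best[i][1]; words.append(text[j:i]); i = j
            return words[::-1]

        print(segment("lookatthedoggie", {"look", "at", "the", "doggie"}))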

  2. Making Spatial Statistics Service Accessible On Cloud Platform

    NASA Astrophysics Data System (ADS)

    Mu, X.; Wu, J.; Li, T.; Zhong, Y.; Gao, X.

    2014-04-01

    Web services can bring together applications running on diverse platforms; users can access and share various data, information, and models more effectively and conveniently from a web service platform. Cloud computing emerges as a paradigm of Internet computing in which dynamic, scalable, and often virtualized resources are provided as services. With the rapid growth of massive data and the restrictions of network capacity, traditional web service platforms face prominent development problems such as computational efficiency, maintenance cost, and data security. In this paper, we offer a spatial statistics service based on the Microsoft cloud. An experiment was carried out to evaluate the availability and efficiency of this service. The results show that this spatial statistics service is conveniently accessible to the public and offers high processing efficiency.

  3. High performance GPU processing for inversion using uniform grid searches

    NASA Astrophysics Data System (ADS)

    Venetis, Ioannis E.; Saltogianni, Vasso; Stiros, Stathis; Gallopoulos, Efstratios

    2017-04-01

    Many geophysical problems are described by redundant, highly non-linear systems of equations with constant terms deriving from measurements and hence representing stochastic variables. Solution (inversion) of such problems is based on numerical optimization methods, on Monte Carlo sampling, or on exhaustive searches in cases of two or even three "free" unknown variables. Recently the TOPological INVersion (TOPINV) algorithm, a grid-search-based technique in the R^n space, has been proposed. TOPINV is not based on the minimization of a certain cost function and involves only forward computations, hence avoiding computational errors. The basic concept is to transform the observation equations into inequalities on the basis of an optimization parameter k and of their standard errors, and, through repeated "scans" of n-dimensional search grids for decreasing values of k, to identify the optimal clusters of grid points which satisfy the observation inequalities and by definition contain the "true" solution. Stochastic optimal solutions and their variance-covariance matrices are then computed as first and second statistical moments. Such exhaustive uniform searches produce an excessive computational load and are extremely time-consuming on common CPU-based computers. An alternative is to use a computing platform based on a GPU, nowadays affordable to the research community, which provides much higher computing performance. Using the CUDA programming language to implement TOPINV allows the investigation of the attained speedup in execution time on such a high-performance platform. Based on synthetic data we compared the execution time required for two typical geophysical problems, modeling magma sources and seismic faults, described with up to 18 unknown variables, on both CPU/FORTRAN and GPU/CUDA platforms. The same problems, for several different sizes of search grids (up to 10^12 grid points) and numbers of unknown variables, were solved on both platforms, and execution time as a function of grid dimension was recorded for each problem. Results indicate an average speedup in calculations by a factor of 100 on the GPU platform; for example, problems with 10^12 grid points require less than two hours instead of several days on conventional desktop computers. Such a speedup encourages the application of TOPINV on high-performance platforms such as GPUs in cases where near-real-time decisions are necessary, for example finite-fault modeling to identify possible tsunami sources.
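
    A minimal serial rendition of the TOPINV-style scan, under assumed toy observations: forward residuals are tested against k-scaled standard errors, and the surviving grid points form the solution cluster whose first moment estimates the unknowns. The grid sizes and the linear model are illustrative only.

        import itertools
        import numpy as np

        def topinv_scan(residual, sigma, grid_axes, k):
            """Keep grid points whose forward residuals satisfy |r_i| <= k * sigma_i
            for every observation (the TOPINV-style inequality test)."""
            keep = []
            for point in itertools.product(*grid_axes):
                r = residual(np.array(point))
                if np.all(np.abs(r) <= k * sigma):
                    keep.append(point)
            return np.array(keep)

        # Toy problem: observations y_i = a * x_i + b with noise, unknowns (a, b).
        x = np.linspace(0, 10, 20)
        y = 2.0 * x + 1.0 + np.random.default_rng(1).normal(0, 0.3, x.size)
        sigma = np.full(x.size, 0.3)
        axes = (np.linspace(0, 4, 200), np.linspace(-2, 4, 200))  # grids for a, b

        for k in (4.0, 3.0, 2.0):  # shrink k, as in repeated TOPINV scans
            cluster = topinv_scan(lambda p: y - (p[0] * x + p[1]), sigma, axes, k)
            print(k, cluster.mean(axis=0) if cluster.size else "empty")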

  4. PRoViScout: a planetary scouting rover demonstrator

    NASA Astrophysics Data System (ADS)

    Paar, Gerhard; Woods, Mark; Gimkiewicz, Christiane; Labrosse, Frédéric; Medina, Alberto; Tyler, Laurence; Barnes, David P.; Fritz, Gerald; Kapellos, Konstantinos

    2012-01-01

    Mobile systems exploring planetary surfaces will in the future require more autonomy than today. The EU FP7-SPACE project PRoViScout (2010-2012) establishes the building blocks of such autonomous exploration systems in terms of robotic vision, through a decision-based combination of navigation and scientific target selection, and integrates them into a framework ready for, and exposed to, field demonstration. The PRoViScout on-board system consists of mission management components such as an Executive, a Mars Mission On-Board Planner and Scheduler, a Science Assessment Module, and Navigation & Vision Processing modules. The platform hardware consists of the rover with its sensors and pointing devices. We report on the major building blocks and their functions and interfaces, with emphasis on the computer vision components such as image acquisition (using a novel zoomed 3D time-of-flight & RGB camera), mapping from 3D-TOF data, panoramic image and stereo reconstruction, hazard and slope maps, visual odometry, and the recognition of potentially scientifically interesting targets.

  5. Demonstrating real-time feedback that enhances the performance of measurement sequence with cat states in a cavity

    NASA Astrophysics Data System (ADS)

    Ofek, N.; Petrenko, A.; Liu, Y.; Vlastakis, B.; Sun, L.; Leghtas, Z.; Heeres, R.; Sliwa, K. M.; Mirrahimi, M.; Jiang, L.; Devoret, M. H.; Schoelkopf, R. J.

    2015-03-01

    Real-time feedback offers not just the convenience of streamlined data acquisition, but is an essential element in any quantum computational architecture that requires branching based on measurement outcomes. State preparation, mitigating the effects of qubit decoherence, and recording the trajectories of quantum systems are just a few of the many potential applications of real-time feedback. Photon number parity measurements of cat states in superconducting resonators are a particularly useful platform for demonstrating the clear advantages of having sophisticated feedback schemes to enhance the performance of a proposed error-correction protocol [Leghtas et al., PRL 2013]. In a cQED architecture, where a transmon qubit is coupled to two superconducting cavities, we present a field-programmable gate array (FPGA) device capable of making decisions and calculations with latency times far shorter than the lifetimes of any of the system's constituents. This level of performance opens the door to realizing many complex, previously unfeasible experiments in superconducting qubit systems.

  6. Virtual Instrument for Emissions Measurement of Internal Combustion Engines

    PubMed Central

    Pérez, Armando; Montero, Gisela; Coronado, Marcos; García, Conrado; Pérez, Rubén

    2016-01-01

    Gas-emission measurement systems for internal combustion engines are rigorous and expensive nowadays. For this reason, a virtual instrument was developed to measure the combustion emissions of an internal combustion diesel engine running on diesel-biodiesel mixtures. The software, called virtual instrument for emissions measurement (VIEM), was developed on the LabVIEW 2010® graphical programming platform. VIEM works with sensors connected to a signal-conditioning system, and a data acquisition system serves as the interface to a computer for measuring and monitoring the emissions of O2, NO, CO, SO2, and CO2 gases in real time. This paper shows the results of the VIEM programming, the integrated-circuit diagrams used for the signal conditioning of the sensors, and the characterization of the O2, NO, CO, SO2, and CO2 sensors. VIEM is a low-cost instrument that is simple and easy to use. It is also scalable, making it flexible and user-definable. PMID:27034893
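
    A minimal sketch of the per-channel computation such a virtual instrument performs, converting a conditioned sensor voltage into a gas concentration; the calibration constants and sample voltages below are placeholders, not values from the paper.

        # gas: (offset_V, volts_per_ppm) -- hypothetical two-point calibrations
        CAL = {"NO": (0.40, 0.0025), "CO": (0.40, 0.0018), "SO2": (0.40, 0.0031)}

        def to_ppm(gas: str, volts: float) -> float:
            """Linear calibration: conditioned voltage -> concentration."""
            offset, slope = CAL[gas]
            return max(0.0, (volts - offset) / slope)

        # One polled DAQ sample per gas (voltages invented for the example)
        for gas, v in (("NO", 0.62), ("CO", 0.55), ("SO2", 0.47)):
            print(f"{gas}: {to_ppm(gas, v):7.1f} ppm")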

  7. High-Performance 3D Compressive Sensing MRI Reconstruction Using Many-Core Architectures

    PubMed Central

    Kim, Daehyun; Trzasko, Joshua; Smelyanskiy, Mikhail; Haider, Clifton; Dubey, Pradeep; Manduca, Armando

    2011-01-01

    Compressive sensing (CS) describes how sparse signals can be accurately reconstructed from many fewer samples than required by the Nyquist criterion. Since MRI scan duration is proportional to the number of acquired samples, CS has been gaining significant attention in MRI. However, the computationally intensive nature of CS reconstructions has precluded their use in routine clinical practice. In this work, we investigate how different throughput-oriented architectures can benefit one CS algorithm and what levels of acceleration are feasible on different modern platforms. We demonstrate that a CUDA-based code running on an NVIDIA Tesla C2050 GPU can reconstruct a 256 × 160 × 80 volume from an 8-channel acquisition in 19 seconds, which is in itself a significant improvement over the state of the art. We then show that Intel's Knights Ferry can perform the same 3D MRI reconstruction in only 12 seconds, bringing CS methods even closer to clinical viability. PMID:21922017
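
    To show why CS reconstruction is iterative and compute-hungry, the sketch below runs a plain iterative soft-thresholding loop on a toy 2D problem; it is not the algorithm benchmarked in the paper, and the phantom, 35% sampling rate, and threshold are invented.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 128
        img = np.zeros((n, n)); img[40:80, 50:90] = 1.0        # toy phantom
        mask = rng.random((n, n)) < 0.35                       # undersampled k-space
        y = mask * np.fft.fft2(img)                            # acquired data

        x = np.zeros_like(img)
        for _ in range(60):
            resid = y - mask * np.fft.fft2(x)                  # data consistency
            x = x + np.real(np.fft.ifft2(mask * resid))        # gradient step
            x = np.sign(x) * np.maximum(np.abs(x) - 0.01, 0.0) # sparsity threshold
        print("relative error:", np.linalg.norm(x - img) / np.linalg.norm(img))

    Every iteration is dominated by FFTs and pointwise operations, exactly the workload that maps well onto GPUs and other many-core architectures.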

  8. FACE-IT. A Science Gateway for Food Security Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Montella, Raffaele; Kelly, David; Xiong, Wei

    Progress in sustainability science is hindered by challenges in creating and managing complex data acquisition, processing, simulation, post-processing, and intercomparison pipelines. To address these challenges, we developed the Framework to Advance Climate, Economic, and Impact Investigations with Information Technology (FACE-IT) for crop and climate impact assessments. This integrated data processing and simulation framework enables data ingest from geospatial archives; data regridding, aggregation, and other processing prior to simulation; large-scale climate impact simulations with agricultural and other models, leveraging high-performance and cloud computing; and post-processing to produce aggregated yields and ensemble variables needed for statistics, for model intercomparison, and to connect biophysical models to global and regional economic models. FACE-IT leverages the capabilities of the Globus Galaxies platform to enable the capture of workflows and outputs in well-defined, reusable, and comparable forms. We describe FACE-IT and applications within the Agricultural Model Intercomparison and Improvement Project and the Center for Robust Decision-making on Climate and Energy Policy.
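
    The sketch below illustrates one representative post-processing step in such a pipeline: area-weighted aggregation of a gridded crop-yield field to regions ahead of intercomparison. The grid, the three latitude-band "regions", and the weights are synthetic stand-ins for the framework's real datasets.

        import numpy as np

        rng = np.random.default_rng(2)
        yield_t_ha = rng.uniform(1.0, 8.0, (90, 180))           # toy global yields
        region_id = (np.arange(90)[:, None] // 30) * np.ones(180, dtype=int)
        cell_area = np.cos(np.radians(np.linspace(-89, 89, 90)))[:, None] * np.ones(180)

        for rid in np.unique(region_id):
            m = region_id == rid
            mean = np.average(yield_t_ha[m], weights=cell_area[m])
            print(f"region {rid}: area-weighted mean yield {mean:.2f} t/ha")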

  9. Mathematical analysis of the impact mechanism of information platform on agro-product supply chain and agro-product competitiveness

    NASA Astrophysics Data System (ADS)

    Jiang, Qi-Jie; Jin, Mao-Zhu; Ren, Pei-Yu

    2017-04-01

    How to optimize the agro-product supply chain so as to improve its operating efficiency and thereby enhance the competitiveness of regional agricultural products has posed a problem to academia, business, and governments at various levels. One way to address this problem, and the focus of this essay, is to introduce an information platform into the supply chain. Firstly, a review of existing research findings concerning agro-product competitiveness, the agro-product supply chain (ASC), and information platforms is given. Secondly, we construct a mathematical model to analyze the impact of an information platform on the bullwhip effect in the ASC. Thirdly, another mathematical model is constructed to compare and analyze the impact of the information platform on information acquisition by members of the ASC. The results show that implanting an information platform can mitigate the bullwhip effect in the ASC, so that members can set order quantities or production levels closer to actual market demand. The information platform can also reduce the time members of the ASC need to obtain information from other members, and it helps the ASC alleviate information asymmetry between upstream and downstream members. Further research on the operating mechanism and pattern, technical features, and running structure of the information platform, along with their impacts on the agro-product supply chain and the competitiveness of agricultural products, is needed.
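
    A hedged numerical sketch of the first result: under an order-up-to policy with a p-period moving-average forecast, the order stream an upstream member sees is noisier than the demand itself, and direct demand visibility through the platform removes that amplification. The demand process, lead time, and forecast window are invented, and the policy is one standard textbook form rather than necessarily the paper's exact model.

        import numpy as np

        rng = np.random.default_rng(3)
        T, L, p = 50_000, 2, 4                 # horizon, lead time, forecast window
        d = 100 + rng.normal(0.0, 10.0, T)     # retail demand (i.i.d. for simplicity)

        # Moving-average forecast and order-up-to level S_t = (L + 1) * mu_t;
        # the supplier WITHOUT the platform sees orders q_t = d_t + S_t - S_{t-1}.
        mu = np.convolve(d, np.ones(p) / p, mode="valid")   # mu[i] = mean(d[i:i+p])
        q = d[p + 1:] + (L + 1) * np.diff(mu)[:-1]

        print(f"Var(orders)/Var(demand) without sharing: {q.var() / d.var():.2f}")
        # With the platform the supplier sees d itself, so the ratio is 1.00:
        # the bullwhip amplification printed above disappears.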

  10. A Set of Free Cross-Platform Authoring Programs for Flexible Web-Based CALL Exercises

    ERIC Educational Resources Information Center

    O'Brien, Myles

    2012-01-01

    The Mango Suite is a set of three freely downloadable cross-platform authoring programs for flexible network-based CALL exercises. They are Adobe Air applications, so they can be used on Windows, Macintosh, or Linux computers, provided the freely available Adobe Air has been installed on the computer. The exercises which the programs generate are…

  11. Development of a Cloud Computing-Based Pier Type Port Structure Stability Evaluation Platform Using Fiber Bragg Grating Sensors.

    PubMed

    Jo, Byung Wan; Jo, Jun Ho; Khan, Rana Muhammad Asad; Kim, Jung Hoon; Lee, Yun Sung

    2018-05-23

    Structural health monitoring is a topic of great interest for port structures owing to the ageing of the structures and the limitations of existing evaluation methods. This paper presents a cloud computing-based stability-evaluation platform for a pier-type port structure using Fiber Bragg Grating (FBG) sensors, in a system consisting of an FBG strain sensor, an FBG displacement gauge, an FBG angle meter, a gateway, and a cloud computing-based web server. The sensors were installed on core components of the structure and measurements were taken to evaluate it. The measured values were transmitted to the web server via the gateway, where all data were analyzed and visualized to evaluate the structure against a safety evaluation index (SEI). The platform enables efficient monitoring of pier-type port structures that can be carried out easily, anytime and anywhere, by converging new technologies such as cloud computing and FBG sensors. The platform has been successfully implemented at “Maryang Harbor”, situated in Maryang-Meyon, Korea, to test its durability.
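
    A sketch of the gateway-side step implied above: convert an FBG Bragg-wavelength shift into strain and package the reading for the cloud server. The endpoint, sensor ID, and wavelengths are hypothetical; the photo-elastic coefficient of about 0.22 is the textbook value for silica fibre.

        import json

        def microstrain(lam_nm: float, lam0_nm: float, p_e: float = 0.22) -> float:
            """Strain from the Bragg shift: dLambda/lambda0 = (1 - p_e) * strain."""
            return (lam_nm - lam0_nm) / lam0_nm / (1.0 - p_e) * 1e6

        reading = {
            "sensor": "pier3-strain-07",                   # hypothetical sensor ID
            "microstrain": round(microstrain(1550.12, 1550.00), 1),
        }
        payload = json.dumps(reading)
        # An HTTP POST of `payload` to the platform's ingest endpoint would go
        # here; the web server then folds readings into the SEI evaluation.
        print(payload)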

  12. 20170312 - Computer Simulation of Developmental ...

    EPA Pesticide Factsheets

    Rationale: Recent progress in systems toxicology and synthetic biology has paved the way to new thinking about in vitro/in silico modeling of developmental processes and toxicities, both for embryological and reproductive impacts. Novel in vitro platforms such as 3D organotypic culture models, engineered microscale tissues, and complex microphysiological systems (MPS), together with computational models and computer simulation of tissue dynamics, lend themselves to integrated testing strategies for predictive toxicology. As these emergent methodologies continue to evolve, they must be integrally tied to maternal/fetal physiology and the toxicity of the developing individual across early lifestage transitions, from fertilization to birth, through puberty and beyond. Scope: This symposium will focus on how these novel technology platforms can help, now and in the future, with in vitro/in silico modeling of complex biological systems for developmental and reproductive toxicity issues, and with translating systems models into integrative testing strategies. The symposium is based on three main organizing principles: (1) that novel in vitro platforms with human cells configured in nascent tissue architectures within native microphysiological environments yield mechanistic understanding of the developmental and reproductive impacts of drug/chemical exposures; (2) that novel in silico platforms with high-throughput screening (HTS) data, biologically-inspired computational models of…

  13. Computer Simulation of Developmental Processes and ...

    EPA Pesticide Factsheets

    Rationale: Recent progress in systems toxicology and synthetic biology has paved the way to new thinking about in vitro/in silico modeling of developmental processes and toxicities, both for embryological and reproductive impacts. Novel in vitro platforms such as 3D organotypic culture models, engineered microscale tissues, and complex microphysiological systems (MPS), together with computational models and computer simulation of tissue dynamics, lend themselves to integrated testing strategies for predictive toxicology. As these emergent methodologies continue to evolve, they must be integrally tied to maternal/fetal physiology and the toxicity of the developing individual across early lifestage transitions, from fertilization to birth, through puberty and beyond. Scope: This symposium will focus on how these novel technology platforms can help, now and in the future, with in vitro/in silico modeling of complex biological systems for developmental and reproductive toxicity issues, and with translating systems models into integrative testing strategies. The symposium is based on three main organizing principles: (1) that novel in vitro platforms with human cells configured in nascent tissue architectures within native microphysiological environments yield mechanistic understanding of the developmental and reproductive impacts of drug/chemical exposures; (2) that novel in silico platforms with high-throughput screening (HTS) data, biologically-inspired computational models of…

  14. Reliable Acquisition of RAM Dumps from Intel-Based Apple Mac Computers over FireWire

    NASA Astrophysics Data System (ADS)

    Gladyshev, Pavel; Almansoori, Afrah

    RAM content acquisition is an important step in live forensic analysis of computer systems. FireWire offers an attractive way to acquire the RAM content of Apple Mac computers equipped with a FireWire connection. However, the existing techniques for doing so require substantial knowledge of the target computer's configuration and cannot be used reliably on a previously unknown computer at a crime scene. This paper proposes a novel method for acquiring the RAM content of Apple Mac computers over FireWire which automatically discovers the necessary information about the target computer and can be used in a crime-scene setting. As an application of the developed method, techniques for recovering AOL Instant Messenger (AIM) conversation fragments from RAM dumps are also discussed.
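
    As a post-acquisition illustration, the sketch below streams a RAM dump and carves byte patterns that could mark conversation fragments; the regular expression and dump filename are hypothetical stand-ins, since the paper itself derives the actual AIM fragment structure.

        import re

        # Hypothetical marker for an HTML-formatted message fragment
        PATTERN = re.compile(rb"<HTML><BODY.{0,400}?</BODY></HTML>",
                             re.DOTALL | re.IGNORECASE)

        def carve(dump_path, chunk=16 * 2**20, overlap=4096):
            """Stream the dump in chunks, overlapping so boundary hits survive
            (a real tool would also de-duplicate hits inside the overlap)."""
            buf = b""
            with open(dump_path, "rb") as f:
                while block := f.read(chunk):
                    buf = buf[-overlap:] + block
                    yield from (m.group(0) for m in PATTERN.finditer(buf))

        for frag in carve("macbook_ram.dump"):   # hypothetical dump file
            print(frag[:80])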

  15. 48 CFR 227.7203-15 - Subcontractor rights in computer software or computer software documentation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... computer software or computer software documentation. 227.7203-15 Section 227.7203-15 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-15 Subcontractor rights in computer software or computer software documentation. (a...

  16. 48 CFR 227.7203-15 - Subcontractor rights in computer software or computer software documentation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... computer software or computer software documentation. 227.7203-15 Section 227.7203-15 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-15 Subcontractor rights in computer software or computer software documentation. (a...

  17. 48 CFR 227.7203-15 - Subcontractor rights in computer software or computer software documentation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... computer software or computer software documentation. 227.7203-15 Section 227.7203-15 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-15 Subcontractor rights in computer software or computer software documentation. (a...

  18. 48 CFR 227.7203-15 - Subcontractor rights in computer software or computer software documentation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... computer software or computer software documentation. 227.7203-15 Section 227.7203-15 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-15 Subcontractor rights in computer software or computer software documentation. (a...

  19. 48 CFR 227.7203-15 - Subcontractor rights in computer software or computer software documentation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... computer software or computer software documentation. 227.7203-15 Section 227.7203-15 Federal Acquisition... REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software Documentation 227.7203-15 Subcontractor rights in computer software or computer software documentation. (a...

  20. Cooperative Mobile Sensing Systems for In Situ Measurements in Hazardous Environments

    NASA Astrophysics Data System (ADS)

    Argrow, B.

    2005-12-01

    Sondes are typically deployed from manned aircraft or carried to altitude by a balloon before they are dropped. Obvious safety and physical limitations dictate where and how sondes are deployed, and these limitations have severely constrained sonde deployment into highly dynamic and dangerous environments. Additionally, conventional parachute dropsondes provide no means of active control. The "smartsonde" idea is to integrate miniature sonde packages into micro air vehicles (MAVs). These MAVs will be ferried into hard-to-reach and hazardous environments to provide in situ measurements in regions that have heretofore been out of reach. Once deployed, the MAV provides some control of the sonde, enabling it to remain aloft and giving a measure of directional control. Preliminary smartsonde communications experiments have been completed, focusing on characterizing the capabilities of the 802.15.4 wireless protocol. Range measurements with 60-mW, 2.4-GHz radios showed a 100% throughput rate over 2.7 km in air-to-ground tests. The experiments also demonstrated the integration of an in-house distributed computing system that provides the interface between the sensors, UAV flight computers, and the telemetry system. The University of Colorado's Research and Engineering Center for Unmanned Vehicles (RECUV) is developing an engineering system that integrates small mobile-sensor attributes into flexible mobile-sensor infrastructures to be deployed for in situ sensing in hazardous environments. There are three focus applications: 1) Wildfire, to address the sensing, communications, situational-awareness, and safety needs of fire-fighting operations and to increase capabilities for dynamic data acquisition for modeling and prediction; 2) Polar, where heterogeneous mixes of platforms and sensors will provide in situ data acquisition from beneath the ocean surface into the troposphere; and 3) Storm, to address the challenges of volumetric in situ data acquisition in the extremely dynamic environments of severe storms. The common thread among these applications is the need for a cooperative mobile sensing system in which sensor packages are integrated into custom platforms that enable targeting of areas of interest through cooperative control, with varying levels of autonomy, of small unmanned vehicles. RECUV has demonstrated mobile ad hoc networks using WiFi (802.11b) radios deployed simultaneously in fixed and mobile ground nodes and unmanned aerial vehicles (UAVs). Recently, an autonomous UAV was deployed with a miniature sensor package that returned real-time temperature, pressure, and humidity data through the ad hoc communications network; the UAV demonstrated the ability to autonomously make flight-path decisions based on sensor data monitored by the flight computer. Current work focuses on integrating the sensor package into a smartsonde to be deployed from a UAV mothership. Benign scenarios for upcoming tests to validate the collaborative mobile sensing paradigm include features similar to those that will be encountered in the hazardous and dynamic environments of the wildfire, polar, and storm applications. These include a fly-through of a dust devil on the plains of eastern Colorado and deployment of a dual-mode smartsonde that transmits at high data rates while airborne and then, upon landing, switches to a quiet, power-saving mode in which in situ data are logged and transmitted only when the sonde package is queried during overflights by a UAV mothership.
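
    The dual-mode behaviour described above reduces to a small state machine, sketched here in Python; the state names, sample contents, and query hook are assumptions, and the radio link is a stand-in print.

        import time
        from collections import deque

        def transmit(sample):                      # stand-in for the radio link
            print("tx:", sample)

        class Smartsonde:
            def __init__(self):
                self.airborne = True
                self.log = deque(maxlen=100_000)   # in situ samples while grounded

            def sample(self):
                return {"t_s": time.time(), "temp_C": 21.5}  # placeholder reading

            def tick(self, mothership_in_range=False):
                s = self.sample()
                if self.airborne:
                    transmit(s)                    # high-rate airborne telemetry
                else:
                    self.log.append(s)             # quiet, power-saving logging
                    if mothership_in_range:        # queried during an overflight
                        while self.log:
                            transmit(self.log.popleft())

        sonde = Smartsonde()
        sonde.tick()                               # airborne: streams immediately
        sonde.airborne = False                     # landed
        sonde.tick()                               # grounded: logs quietly
        sonde.tick(mothership_in_range=True)       # overflight: dump the log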
