Sample records for sample acquisition processing

  1. Coring Sample Acquisition Tool

    NASA Technical Reports Server (NTRS)

    Haddad, Nicolas E.; Murray, Saben D.; Walkemeyer, Phillip E.; Badescu, Mircea; Sherrit, Stewart; Bao, Xiaoqi; Kriechbaum, Kristopher L.; Richardson, Megan; Klein, Kerry J.

    2012-01-01

    A sample acquisition tool (SAT) has been developed that can be used autonomously to drill and capture rock core samples. The tool is designed to accommodate core transfer via a sample tube to the IMSAH (integrated Mars sample acquisition and handling) SHEC (sample handling, encapsulation, and containerization) without ever touching the pristine core sample during the transfer process.

  2. A seamless acquisition digital storage oscilloscope with three-dimensional waveform display

    NASA Astrophysics Data System (ADS)

    Yang, Kuojun; Tian, Shulin; Zeng, Hao; Qiu, Lei; Guo, Lianping

    2014-04-01

    In a traditional digital storage oscilloscope (DSO), sampled data need to be processed after each acquisition. During data processing, the acquisition is stopped and the oscilloscope is blind to the input signal; this duration is called dead time. With the rapid development of modern electronic systems, the effect of infrequent events becomes significant. To capture these occasional events in a shorter time, the dead time in a traditional DSO, which causes the loss of measured signal, needs to be reduced or even eliminated. In this paper, a seamless acquisition oscilloscope without dead time is proposed. In this oscilloscope, a three-dimensional waveform mapping (TWM) technique, which converts sampled data to the displayed waveform, is proposed. With this technique, not only is the processing speed improved, but the probability information of the waveform is also displayed with different brightness, so a three-dimensional waveform is shown to the user. To reduce processing time further, a parallel TWM technique, which processes several sampled points simultaneously, and a dual-port random access memory based pipelining technique, which can process one sampling point per clock period, are proposed. Furthermore, two DDR3 (Double-Data-Rate Three Synchronous Dynamic Random Access Memory) devices are used for storing sampled data alternately, so the acquisition can continue during data processing. Therefore, the dead time of the DSO is eliminated. In addition, a double-pulse test method is adopted to test the waveform capturing rate (WCR) of the oscilloscope, and a combined pulse test method is employed to evaluate the oscilloscope's capture ability comprehensively. The experimental results show that the WCR of the designed oscilloscope is 6 250 000 wfms/s (waveforms per second), the highest value among all existing oscilloscopes. The testing results also prove that there is no dead time in our oscilloscope, thus realizing seamless acquisition.

  3. A seamless acquisition digital storage oscilloscope with three-dimensional waveform display

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Kuojun, E-mail: kuojunyang@gmail.com; Guo, Lianping; School of Electrical and Electronic Engineering, Nanyang Technological University

    In a traditional digital storage oscilloscope (DSO), sampled data need to be processed after each acquisition. During data processing, the acquisition is stopped and the oscilloscope is blind to the input signal; this duration is called dead time. With the rapid development of modern electronic systems, the effect of infrequent events becomes significant. To capture these occasional events in a shorter time, the dead time in a traditional DSO, which causes the loss of measured signal, needs to be reduced or even eliminated. In this paper, a seamless acquisition oscilloscope without dead time is proposed. In this oscilloscope, a three-dimensional waveform mapping (TWM) technique, which converts sampled data to the displayed waveform, is proposed. With this technique, not only is the processing speed improved, but the probability information of the waveform is also displayed with different brightness, so a three-dimensional waveform is shown to the user. To reduce processing time further, a parallel TWM technique, which processes several sampled points simultaneously, and a dual-port random access memory based pipelining technique, which can process one sampling point per clock period, are proposed. Furthermore, two DDR3 (Double-Data-Rate Three Synchronous Dynamic Random Access Memory) devices are used for storing sampled data alternately, so the acquisition can continue during data processing. Therefore, the dead time of the DSO is eliminated. In addition, a double-pulse test method is adopted to test the waveform capturing rate (WCR) of the oscilloscope, and a combined pulse test method is employed to evaluate the oscilloscope's capture ability comprehensively. The experimental results show that the WCR of the designed oscilloscope is 6 250 000 wfms/s (waveforms per second), the highest value among all existing oscilloscopes. The testing results also prove that there is no dead time in our oscilloscope, thus realizing seamless acquisition.
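
    The core of the TWM step can be sketched in a few lines: repeated acquisitions are binned into a (time, amplitude) hit-count map whose values drive display brightness. The following is a minimal Python illustration of that idea, not the authors' FPGA implementation; the segment array and bin count are assumed for the example.

    ```python
    import numpy as np

    def waveform_map(segments, n_levels=256):
        """Accumulate many waveform segments into a (time, amplitude) hit map.

        segments : 2D array, shape (n_segments, n_points), raw ADC codes
        n_levels : number of vertical (amplitude) bins on the display
        Returns a 2D array whose values count how often each display pixel
        was hit -- the 'third dimension' rendered as brightness.
        """
        segments = np.asarray(segments)
        n_segments, n_points = segments.shape
        # Quantize amplitudes onto the display grid.
        lo, hi = segments.min(), segments.max()
        rows = ((segments - lo) / (hi - lo + 1e-12) * (n_levels - 1)).astype(int)
        hit_map = np.zeros((n_levels, n_points), dtype=np.uint32)
        cols = np.broadcast_to(np.arange(n_points), rows.shape)
        # np.add.at accumulates repeated (row, col) hits instead of overwriting.
        np.add.at(hit_map, (rows.ravel(), cols.ravel()), 1)
        return hit_map

    # Example: 10 000 noisy sine segments -> brighter pixels where hits pile up.
    t = np.linspace(0, 2 * np.pi, 500)
    segs = np.sin(t) + 0.05 * np.random.randn(10_000, t.size)
    print(waveform_map(segs).max())
    ```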

  4. Probabilistic Round Trip Contamination Analysis of a Mars Sample Acquisition and Handling Process Using Markovian Decompositions

    NASA Technical Reports Server (NTRS)

    Hudson, Nicolas; Lin, Ying; Barengoltz, Jack

    2010-01-01

    A method for evaluating the probability of a Viable Earth Microorganism (VEM) contaminating a sample during the sample acquisition and handling (SAH) process of a potential future Mars Sample Return mission is developed. A scenario where multiple core samples would be acquired using a rotary percussive coring tool, deployed from an arm on a MER-class rover, is analyzed. The analysis is conducted in a structured way by decomposing the sample acquisition and handling process into a series of discrete time steps and breaking the physical system into a set of relevant components. At each discrete time step, two key functions are defined: the probability of a VEM being released from each component, and the transport matrix, which represents the probability of VEM transport from one component to another. By defining the expected number of VEMs on each component at the start of the sampling process, these decompositions allow the expected number of VEMs on each component at each sampling step to be represented as a Markov chain. This formalism provides a rigorous mathematical framework in which to analyze the probability of a VEM entering the sample chain, and makes the analysis tractable by breaking the process down into small analyzable steps.
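
    A simplified numerical sketch of the decomposition described above: expected VEM counts per component are propagated step by step using a per-component release probability and a transport matrix. The three components, probabilities, and matrix values below are invented for illustration; the paper's actual formulation may differ in detail.

    ```python
    import numpy as np

    def propagate_expected_vems(n0, release_prob, transport, n_steps):
        """Propagate expected viable-microorganism counts over sampling steps.

        n0           : expected VEM count per component at the start (length m)
        release_prob : per-step probability that a VEM on component i is released
        transport    : transport[i, j] = probability a VEM released from i lands on j
                       (rows sum to <= 1; any remainder is treated as lost)
        Returns an array of shape (n_steps + 1, m) of expected counts per step.
        """
        n = np.asarray(n0, dtype=float)
        history = [n.copy()]
        for _ in range(n_steps):
            released = release_prob * n            # expected VEMs leaving each component
            n = n - released + transport.T @ released
            history.append(n.copy())
        return np.array(history)

    # Toy example: 3 components (bit, tool housing, sample tube).
    n0 = [50.0, 10.0, 0.0]
    p = np.array([0.02, 0.01, 0.0])
    T = np.array([[0.0, 0.6, 0.3],
                  [0.0, 0.0, 0.2],
                  [0.0, 0.0, 0.0]])
    counts = propagate_expected_vems(n0, p, T, n_steps=5)
    print(counts[-1])   # expected VEMs per component after 5 steps
    ```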

  5. Time-jittered marine seismic data acquisition via compressed sensing and sparsity-promoting wavefield reconstruction

    NASA Astrophysics Data System (ADS)

    Wason, H.; Herrmann, F. J.; Kumar, R.

    2016-12-01

    Current efforts towards dense shot (or receiver) sampling and full azimuthal coverage to produce high resolution images have led to the deployment of multiple source vessels (or streamers) across marine survey areas. Densely sampled marine seismic data acquisition, however, is expensive, and hence necessitates the adoption of sampling schemes that save acquisition costs and time. Compressed sensing is a sampling paradigm that aims to reconstruct a signal--that is sparse or compressible in some transform domain--from relatively fewer measurements than required by the Nyquist sampling criterion. Leveraging ideas from the field of compressed sensing, we show how marine seismic acquisition can be set up as a compressed sensing problem. A step ahead from multi-source seismic acquisition is simultaneous source acquisition--an emerging technology that is stimulating both geophysical research and commercial efforts--where multiple source arrays/vessels fire shots simultaneously, resulting in better coverage in marine surveys. Following the design principles of compressed sensing, we propose a pragmatic simultaneous time-jittered, time-compressed marine acquisition scheme where single or multiple source vessels sail across an ocean-bottom array firing airguns at jittered times and source locations, resulting in better spatial sampling and a speedup in acquisition. Our acquisition is low cost since our measurements are subsampled. Simultaneous source acquisition generates data with overlapping shot records, which need to be separated for further processing. We can significantly improve the reconstruction quality of conventional seismic data from jittered data and demonstrate successful recovery by sparsity promotion. In contrast to random (sub)sampling, acquisition via jittered (sub)sampling helps in controlling the maximum gap size, which is a practical requirement of wavefield reconstruction with localized sparsifying transforms. We illustrate our results with simulations of simultaneous time-jittered marine acquisition for 2D and 3D ocean-bottom cable surveys.
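
    The practical difference between random and jittered (sub)sampling, namely the bounded maximum gap, can be illustrated with a short sketch. The grid size and subsampling factor below are arbitrary; this is not the authors' acquisition-design code.

    ```python
    import numpy as np

    def jittered_indices(n_fine, subsample_factor, rng=None):
        """Pick one sample per coarse cell, jittered inside the cell.

        n_fine           : number of points on the dense (Nyquist) grid
        subsample_factor : keep roughly 1/subsample_factor of the points
        The coarse-grid structure bounds the largest gap to about
        2*subsample_factor - 1 fine-grid points, unlike purely random subsampling.
        """
        rng = np.random.default_rng(rng)
        cell = subsample_factor
        starts = np.arange(0, n_fine, cell)
        offsets = rng.integers(0, cell, size=starts.size)
        idx = np.minimum(starts + offsets, n_fine - 1)
        return np.unique(idx)

    def max_gap(idx, n_fine):
        full = np.concatenate(([0], np.sort(idx), [n_fine - 1]))
        return int(np.diff(full).max())

    n, factor = 1000, 4
    jit = jittered_indices(n, factor, rng=0)
    rnd = np.sort(np.random.default_rng(0).choice(n, size=jit.size, replace=False))
    print("jittered max gap:", max_gap(jit, n))   # bounded by the cell structure
    print("random   max gap:", max_gap(rnd, n))   # typically larger
    ```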

  6. Mars Science Laboratory Sample Acquisition, Sample Processing and Handling: Subsystem Design and Test Challenges

    NASA Technical Reports Server (NTRS)

    Jandura, Louise

    2010-01-01

    The Sample Acquisition/Sample Processing and Handling (SA/SPaH) subsystem for the Mars Science Laboratory is a highly-mechanized, Rover-based sampling system that acquires powdered rock and regolith samples from the Martian surface, sorts the samples into fine particles through sieving, and delivers small portions of the powder into two science instruments inside the Rover. SA/SPaH utilizes 17 actuated degrees of freedom to perform the functions needed to produce 5 sample pathways in support of the scientific investigation on Mars. Both hardware redundancy and functional redundancy are employed in configuring this sampling system so that some functionality is retained even with the loss of a degree of freedom. Intentional dynamic environments are created to move the sample, while vibration isolators attenuate this environment at the sensitive instruments located near the dynamic sources. In addition to the typical flight hardware qualification test program, two additional types of testing are essential for this kind of sampling system: characterization of the intentionally-created dynamic environment, and testing of the sample acquisition and processing hardware functions using Mars analog materials in a low pressure environment. The overall subsystem design and configuration are discussed along with some of the challenges, tradeoffs, and lessons learned in the areas of fault tolerance, intentional dynamic environments, and special testing.

  7. A Sub-Sampling Approach for Data Acquisition in Gamma Ray Emission Tomography

    NASA Astrophysics Data System (ADS)

    Fysikopoulos, Eleftherios; Kopsinis, Yannis; Georgiou, Maria; Loudos, George

    2016-06-01

    State of the art data acquisition systems for small animal imaging gamma ray detectors often rely on free running Analog to Digital Converters (ADCs) and high density Field Programmable Gate Array (FPGA) devices for digital signal processing. In this work, a sub-sampling acquisition approach is proposed which exploits a priori information regarding the shape of the obtained detector pulses. The output pulse shape depends on the response of the scintillation crystal, the photodetector's properties, and the amplifier/shaper operation. Using these known characteristics of the detector pulses prior to digitization, one can model the voltage pulse derived from the shaper (a low-pass filter, last in the front-end electronics chain) in order to reduce the required sampling rate of the ADCs. Pulse shape estimation is then feasible by fitting with a small number of measurements. In particular, the proposed sub-sampling acquisition approach relies on a bi-exponential model of the pulse shape. We show that the properties of the pulse that are relevant for Single Photon Emission Computed Tomography (SPECT) event detection (i.e., position and energy) can be calculated by collecting just a small fraction of the number of samples usually collected in data acquisition systems used so far. Compared to the standard digitization process, the proposed sub-sampling approach allows the use of free running ADCs with the sampling rate reduced by a factor of 5. Two small detectors consisting of Cerium doped Gadolinium Aluminum Gallium Garnet (Gd3Al2Ga3O12:Ce, or GAGG:Ce) pixelated arrays (array elements: 2 × 2 × 5 mm³ and 1 × 1 × 10 mm³, respectively) coupled to a Position Sensitive Photomultiplier Tube (PSPMT) were used for experimental evaluation. The two detectors were used to obtain raw images and energy histograms under 140 keV and 661.7 keV irradiation, respectively. The sub-sampling acquisition technique (10 MHz sampling rate) was compared with a standard acquisition method (52 MHz sampling rate) in terms of energy resolution and image signal to noise ratio for both gamma ray energies. The Levenberg-Marquardt (LM) non-linear least-squares algorithm was used in post processing to fit the acquired data with the proposed model. The results showed that the analog pulses prior to digitization are estimated with high accuracy after fitting with the bi-exponential model.
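
    A minimal sketch of the fitting step, assuming a simple bi-exponential pulse model and using SciPy's Levenberg-Marquardt solver; the time scales, noise level, and 5x sub-sampling factor below are illustrative stand-ins for the detector described above, not the authors' exact parameters.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def bi_exp_pulse(t, amp, t0, tau_rise, tau_decay):
        """Bi-exponential scintillation pulse model (zero before arrival time t0)."""
        dt = np.clip(t - t0, 0.0, None)
        return amp * (np.exp(-dt / tau_decay) - np.exp(-dt / tau_rise))

    # Simulate a 'fully sampled' pulse, then keep only every 5th sample (sub-sampling).
    t_full = np.arange(0, 2.0, 1 / 52.0)                  # 52 MHz grid, time in us
    true = bi_exp_pulse(t_full, 1.0, 0.3, 0.02, 0.25)
    noisy = true + 0.01 * np.random.default_rng(1).standard_normal(t_full.size)
    t_sub, y_sub = t_full[::5], noisy[::5]                # ~10 MHz effective rate

    # Levenberg-Marquardt least-squares fit on the sub-sampled points.
    p0 = [0.8, 0.25, 0.03, 0.2]
    popt, _ = curve_fit(bi_exp_pulse, t_sub, y_sub, p0=p0, method="lm")
    amp, t0, tau_r, tau_d = popt
    energy_proxy = amp * (tau_d - tau_r)                  # area under the fitted pulse
    print(f"fitted arrival {t0:.3f} us, pulse area {energy_proxy:.4f}")
    ```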

  8. The Mars Science Laboratory Organic Check Material

    NASA Technical Reports Server (NTRS)

    Conrad, Pamela G.; Eigenbrode, J. E.; Mogensen, C. T.; VonderHeydt, M. O.; Glavin, D. P.; Mahaffy, P. M.; Johnson, J. A.

    2011-01-01

    The Organic Check Material (OCM) has been developed for use on the Mars Science Laboratory mission to serve as a sample standard for verification of organic cleanliness and characterization of potential sample alteration as a function of the sample acquisition and portioning process on the Curiosity rover. OCM samples will be acquired using the same procedures for drilling, portioning and delivery as are used to study martian samples with The Sample Analysis at Mars (SAM) instrument suite during MSL surface operations. Because the SAM suite is highly sensitive to organic molecules, the mission can better verify the cleanliness of Curiosity's sample acquisition hardware if a known material can be processed through SAM and compared with the results obtained from martian samples.

  9. A Real-Time Image Acquisition And Processing System For A RISC-Based Microcomputer

    NASA Astrophysics Data System (ADS)

    Luckman, Adrian J.; Allinson, Nigel M.

    1989-03-01

    A low cost image acquisition and processing system has been developed for the Acorn Archimedes microcomputer. Using a Reduced Instruction Set Computer (RISC) architecture, the ARM (Acorn Risc Machine) processor provides instruction speeds suitable for image processing applications. The associated improvement in data transfer rate has allowed real-time video image acquisition without the need for frame-store memory external to the microcomputer. The system is comprised of real-time video digitising hardware which interfaces directly to the Archimedes memory, and software to provide an integrated image acquisition and processing environment. The hardware can digitise a video signal at up to 640 samples per video line with programmable parameters such as sampling rate and gain. Software support includes a work environment for image capture and processing with pixel, neighbourhood and global operators. A friendly user interface is provided with the help of the Archimedes Operating System WIMP (Windows, Icons, Mouse and Pointer) Manager. Windows provide a convenient way of handling images on the screen and program control is directed mostly by pop-up menus.

  10. High frequency signal acquisition and control system based on DSP+FPGA

    NASA Astrophysics Data System (ADS)

    Liu, Xiao-qi; Zhang, Da-zhi; Yin, Ya-dong

    2017-10-01

    This paper introduces the design and implementation of a high frequency signal acquisition and control system based on DSP + FPGA. The system supports internal/external clock and internal/external trigger sampling. It has a maximum sampling rate of 400 MBPS and a 1.4 GHz input bandwidth for the ADC. Data can be collected continuously or periodically and stored in DDR2 memory. The system also supports real-time acquisition: the collected data, after digital frequency conversion and Cascaded Integrator-Comb (CIC) filtering, can be sent to the CPCI bus through the high-speed DSP and assigned to the fiber board for subsequent processing. The system integrates signal acquisition and pre-processing functions, uses mixed high-speed A/D, high-speed DSP, and FPGA technology, and has a wide range of uses in data acquisition and recording. In signal processing, the system can be seamlessly connected to a dedicated processor board. The system has the advantages of multi-selectivity, good scalability, and so on, which satisfies the different requirements of different signals in different projects.
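
    The Cascaded Integrator-Comb stage mentioned above is easy to prototype in software before committing it to the FPGA. The sketch below is a plain, non-optimized Python model with an assumed decimation factor and stage count, not the system's actual HDL.

    ```python
    import numpy as np

    def cic_decimate(x, decimation, n_stages=3):
        """Simple Cascaded Integrator-Comb (CIC) decimator.

        x          : input samples (real here, for simplicity)
        decimation : rate-change factor R
        n_stages   : number of integrator/comb stage pairs N
        Equivalent to N cascaded length-R moving averages followed by decimation;
        the output is not gain-normalized (gain = R**N), as in hardware CICs.
        """
        y = np.asarray(x, dtype=np.int64)        # hardware uses wrap-around registers;
        for _ in range(n_stages):                # int64 is plenty for this toy example
            y = np.cumsum(y)                     # integrator sections at the input rate
        y = y[::decimation]                      # decimate by R
        for _ in range(n_stages):                # comb sections at the output rate
            y = np.diff(y, prepend=0)
        return y

    # Example: decimate a 400-sample tone by 8 with a 3-stage CIC.
    n = 400
    x = (np.cos(2 * np.pi * 0.01 * np.arange(n)) * 1000).astype(int)
    print(cic_decimate(x, decimation=8).shape)
    ```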

  11. Robust Methods for Sensing and Reconstructing Sparse Signals

    ERIC Educational Resources Information Center

    Carrillo, Rafael E.

    2012-01-01

    Compressed sensing (CS) is an emerging signal acquisition framework that goes against the traditional Nyquist sampling paradigm. CS demonstrates that a sparse, or compressible, signal can be acquired using a low rate acquisition process. Since noise is always present in practical data acquisition systems, sensing and reconstruction methods are…

  12. Improving the Acquisition and Management of Sample Curation Data

    NASA Technical Reports Server (NTRS)

    Todd, Nancy S.; Evans, Cindy A.; Labasse, Dan

    2011-01-01

    This paper discusses the current sample documentation processes used during and after a mission, examines the challenges and special considerations needed for designing effective sample curation data systems, and looks at the results of a simulated sample return mission and the lessons learned from this simulation. In addition, it introduces a new data architecture for an integrated sample curation data system being implemented at the NASA Astromaterials Acquisition and Curation department and discusses how it improves on existing data management systems.

  13. Due Diligence Processes for Public Acquisition of Mining-Impacted Landscapes

    NASA Astrophysics Data System (ADS)

    Martin, E.; Monohan, C.; Keeble-Toll, A. K.

    2016-12-01

    The acquisition of public land is critical for achieving conservation and habitat goals in rural regions projected to experience continuously high rates of population growth. To ensure that public funds are utilized responsibly in the purchase of conservation easements, appropriate due diligence processes must be established that limit landowner liability post-acquisition. Traditional methods of characterizing contamination in regions where legacy mining activities were prevalent may not utilize current scientific knowledge and understanding of contaminant fate, transport, and bioavailability, and are therefore likely to produce Type II errors. Agency-prescribed assessment methods utilized under CERCLA in many cases fail to detect contamination that presents liability issues, by failing to require water quality sampling that would reveal the offsite transport potential of contaminants posing human health risks, including mercury. Historical analysis can be used to inform judgmental sampling to identify hotspots and contaminants of concern. Land acquisition projects at two historic mine sites in Nevada County, California, the Champion Mine Complex and the Black Swan Preserve, have established the necessity of re-thinking due diligence processes for mining-impacted landscapes. These pilot projects demonstrate that pre-acquisition assessment in the Gold Country must include judgmental sampling and evaluation of contaminant transport. Best practices using current scientific knowledge must be codified by agencies, consultants, and NGOs in order to ensure responsible use of public funds and to safeguard public health.

  14. Hybrid data acquisition and processing strategies with increased throughput and selectivity: pSMART analysis for global qualitative and quantitative analysis.

    PubMed

    Prakash, Amol; Peterman, Scott; Ahmad, Shadab; Sarracino, David; Frewen, Barbara; Vogelsang, Maryann; Byram, Gregory; Krastins, Bryan; Vadali, Gouri; Lopez, Mary

    2014-12-05

    Data-dependent acquisition (DDA) and data-independent acquisition strategies (DIA) have both resulted in improved understanding of proteomics samples. Both strategies have advantages and disadvantages that are well-published, where DDA is typically applied for deep discovery and DIA may be used to create sample records. In this paper, we present a hybrid data acquisition and processing strategy (pSMART) that combines the strengths of both techniques and provides significant benefits for qualitative and quantitative peptide analysis. The performance of pSMART is compared to published DIA strategies in an experiment that allows the objective assessment of DIA performance with respect to interrogation of previously acquired MS data. The results of this experiment demonstrate that pSMART creates fewer decoy hits than a standard DIA strategy. Moreover, we show that pSMART is more selective, sensitive, and reproducible than either standard DIA or DDA strategies alone.

  15. Ecological Fallacy in Reading Acquisition Research: Masking Constructive Processes of the Learner.

    ERIC Educational Resources Information Center

    Berninger, Virginia W.; Abbott, Robert D.

    A study examined whether conclusions about constructive processes in reading based on analysis of group data were consistent with those based on an analysis of individual data. Subjects, selected from a larger sample of 45 first grade students who had participated in a longitudinal study on acquisition of linguistic procedures for printed words,…

  16. A Description of the Development, Capabilities, and Operational Status of the Test SLATE Data Acquisition System at the National Transonic Facility

    NASA Technical Reports Server (NTRS)

    Cramer, Christopher J.; Wright, James D.; Simmons, Scott A.; Bobbitt, Lynn E.; DeMoss, Joshua A.

    2015-01-01

    The paper will present a brief background of the previous data acquisition system at the National Transonic Facility (NTF) and the reasoning and goals behind the upgrade to the current Test SLATE (Test Software Laboratory and Automated Testing Environments) data acquisition system. The components, performance characteristics, and layout of the Test SLATE system within the NTF control room will be discussed. The development, testing, and integration of Test SLATE within NTF operations will be detailed. The operational capabilities of the system will be outlined including: test setup, instrumentation calibration, automatic test sequencer setup, data recording, communication between data and facility control systems, real time display monitoring, and data reduction. The current operational status of the Test SLATE system and its performance during recent NTF testing will be highlighted including high-speed, frame-by-frame data acquisition with conditional sampling post-processing applied. The paper concludes with current development work on the system including the capability for real-time conditional sampling during data acquisition and further efficiency enhancements to the wind tunnel testing process.

  17. Reducing acquisition times in multidimensional NMR with a time-optimized Fourier encoding algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Zhiyong; Department of Electronic Science, Fujian Provincial Key Laboratory of Plasma and Magnetic Resonance, Xiamen University, Xiamen, Fujian 361005; Smith, Pieter E. S.

    Speeding up the acquisition of multidimensional nuclear magnetic resonance (NMR) spectra is an important topic in contemporary NMR, with central roles in high-throughput investigations and analyses of marginally stable samples. A variety of fast NMR techniques have been developed, including methods based on non-uniform sampling and Hadamard encoding, that overcome the long sampling times inherent to schemes based on fast-Fourier-transform (FFT) methods. Here, we explore the potential of an alternative fast acquisition method that leverages a priori knowledge, to tailor polychromatic pulses and customized time delays for an efficient Fourier encoding of the indirect domain of an NMR experiment. By porting the encoding of the indirect-domain to the excitation process, this strategy avoids potential artifacts associated with non-uniform sampling schemes and uses a minimum number of scans equal to the number of resonances present in the indirect dimension. An added convenience is afforded by the fact that a usual 2D FFT can be used to process the generated data. Acquisitions of 2D heteronuclear correlation NMR spectra on quinine and on the anti-inflammatory drug isobutyl propionic phenolic acid illustrate the new method's performance. This method can be readily automated to deal with complex samples such as those occurring in metabolomics, in in-cell as well as in in vivo NMR applications, where speed and temporal stability are often primary concerns.

  18. Endoscopic ultrasound guided fine needle aspiration and useful ancillary methods

    PubMed Central

    Tadic, Mario; Stoos-Veic, Tajana; Kusec, Rajko

    2014-01-01

    The role of endoscopic ultrasound (EUS) in evaluating pancreatic pathology has been well documented from the beginning of its clinical use. High spatial resolution and the close proximity to the evaluated organs within the mediastinum and abdominal cavity allow detection of small focal lesions and precise tissue acquisition from suspected lesions within the reach of this method. Fine needle aspiration (FNA) is considered of additional value to EUS and is performed to obtain tissue diagnosis. Tissue acquisition from suspected lesions for cytological or histological analysis allows, not only the differentiation between malignant and non-malignant lesions, but, in most cases, also the accurate distinction between the various types of malignant lesions. It is well documented that the best results are achieved only if an adequate sample is obtained for further analysis, if the material is processed in an appropriate way, and if adequate ancillary methods are performed. This is a multi-step process and could be quite a challenge in some cases. In this article, we discuss the technical aspects of tissue acquisition by EUS-guided-FNA (EUS-FNA), as well as the role of an on-site cytopathologist, various means of specimen processing, and the selection of the appropriate ancillary method for providing an accurate tissue diagnosis and maximizing the yield of this method. The main goal of this review is to alert endosonographers, not only to the different possibilities of tissue acquisition, namely EUS-FNA, but also to bring to their attention the importance of proper sample processing in the evaluation of various lesions in the gastrointestinal tract and other accessible organs. All aspects of tissue acquisition (needles, suction, use of stylet, complications, etc.) have been well discussed lately. Adequate tissue samples enable comprehensive diagnoses, which answer the main clinical questions, thus enabling targeted therapy. PMID:25339816

  19. Optimization of LC-Orbitrap-HRMS acquisition and MZmine 2 data processing for nontarget screening of environmental samples using design of experiments.

    PubMed

    Hu, Meng; Krauss, Martin; Brack, Werner; Schulze, Tobias

    2016-11-01

    Liquid chromatography-high resolution mass spectrometry (LC-HRMS) is a well-established technique for nontarget screening of contaminants in complex environmental samples. Automatic peak detection is essential, but its performance has only rarely been assessed and optimized so far. With the aim to fill this gap, we used pristine water extracts spiked with 78 contaminants as a test case to evaluate and optimize chromatogram and spectral data processing. To assess whether data acquisition strategies have a significant impact on peak detection, three values of MS cycle time (CT) of an LTQ Orbitrap instrument were tested. Furthermore, the key parameter settings of the data processing software MZmine 2 were optimized to detect the maximum number of target peaks from the samples by the design of experiments (DoE) approach and compared to a manual evaluation. The results indicate that short CT significantly improves the quality of automatic peak detection, which means that full scan acquisition without additional MS2 experiments is suggested for nontarget screening. MZmine 2 detected 75-100% of the peaks compared to manual peak detection at an intensity level of 10^5 in a validation dataset on both spiked and real water samples under optimal parameter settings. Finally, we provide an optimization workflow of MZmine 2 for LC-HRMS data processing that is applicable for environmental samples for nontarget screening. The results also show that the DoE approach is useful and effort-saving for optimizing data processing parameters.

  20. MSL's Widgets: Adding Robustness to Martian Sample Acquisition, Handling, and Processing

    NASA Technical Reports Server (NTRS)

    Roumeliotis, Chris; Kennedy, Brett; Lin, Justin; DeGrosse, Patrick; Cady, Ian; Onufer, Nicholas; Sigel, Deborah; Jandura, Louise; Anderson, Robert; Katz, Ira

    2013-01-01

    Mars Science Laboratory's (MSL) Sample Acquisition Sample Processing and Handling (SA-SPaH) system is one of the most ambitious terrain interaction and manipulation systems ever built and successfully used outside of planet Earth. Mars has a ruthless environment that has surprised many who have tried to explore there. The robustness widget program was implemented by the MSL project to help ensure the SA-SPaH system would be robust enough to survive the surprises of this ruthless Martian environment. The robustness widget program was carried out under extreme schedule pressure and responsibility, but was accomplished with resounding success. This paper will focus on a behind-the-scenes look at MSL's robustness widgets: the particle fun zone, the wind guards, and the portioner pokers.

  1. Mars Science Laboratory Sample Acquisition, Sample Processing and Handling Subsystem: A Description of the Sampling Functionality

    NASA Astrophysics Data System (ADS)

    Jandura, L.; Burke, K.; Kennedy, B.; Melko, J.; Okon, A.; Sunshine, D.

    2009-12-01

    The Sample Acquisition/Sample Processing and Handling (SA/SPaH) subsystem for the Mars Science Laboratory (MSL) is a rover-based sampling system scheduled to launch in 2011. The SA/SPaH consists of a powdering drill and a scooping, sieving, and portioning device mounted on a turret at the end of a robotic arm. Also on the turret is a dust removal tool for clearing the surface of scientific targets, and two science instruments mounted on vibration isolators. The SA/SPaH can acquire powder from rocks at depths of 20 to 50 mm and can also pick up loose regolith with its scoop. The acquired sample is sieved, portioned, and delivered to one of two instruments inside the rover for analysis. The functionality of the system will be described along with the targets the system can acquire and the sample that can be delivered. (Figure: top view of the SA/SPaH on the rover.)

  2. Differences in Adaptive Competency Acquisition between Traditionally Certified and Alternatively Certified Technology Education Teachers.

    ERIC Educational Resources Information Center

    Coyle-Rogers, Patricia G.; Rogers, George E.

    A study determined whether there are any differences in the adaptive competency acquisition between technology education teachers who have completed a school district add-on alternative certification process and technology education teachers who completed a traditional baccalaureate degree certification program. Non-probability sampling was used…

  3. Signal enhancement for the sensitivity-limited solid state NMR experiments using a continuous, non-uniform acquisition scheme

    NASA Astrophysics Data System (ADS)

    Qiang, Wei

    2011-12-01

    We describe a sampling scheme for two-dimensional (2D) solid state NMR experiments that can be readily applied to sensitivity-limited samples. The sampling scheme utilizes a continuous, non-uniform sampling profile for the indirect dimension, i.e., the acquisition number decreases as a function of the evolution time (t1) in the indirect dimension. For a beta amyloid (Aβ) fibril sample, we observed an overall 40-50% signal enhancement, measured by cross peak volume, while the cross peak linewidths remained comparable to the linewidths obtained with regular sampling and processing strategies. Both linear and Gaussian decay functions for the acquisition numbers result in a similar percentage increase in signal. In addition, we demonstrated that this sampling approach can be applied with different dipolar recoupling approaches such as radiofrequency assisted diffusion (RAD) and finite-pulse radio-frequency-driven recoupling (fpRFDR). This sampling scheme is especially suitable for sensitivity-limited samples which require long signal averaging for each t1 point, for instance biological membrane proteins where only a small fraction of the sample is isotopically labeled.
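
    The acquisition-number schedule described above (full averaging at early t1 points, fewer scans later, following a linear or Gaussian decay) can be sketched as follows. The Gaussian width, the floor of two scans, and the even-number rounding for phase cycling are assumptions made for illustration, not values taken from the paper.

    ```python
    import numpy as np

    def scans_per_increment(n_t1, base_scans, profile="gaussian", min_scans=2):
        """Number of scans to average at each indirect-dimension increment t1.

        Early t1 points (strong signal) get the full 'base_scans'; later points
        get fewer, following a linear or Gaussian decay, never below 'min_scans'
        and rounded up to an even number so a two-step phase cycle still completes.
        """
        i = np.arange(n_t1)
        if profile == "linear":
            w = 1.0 - i / (n_t1 - 1)
        elif profile == "gaussian":
            sigma = n_t1 / 2.5
            w = np.exp(-0.5 * (i / sigma) ** 2)
        else:
            raise ValueError("profile must be 'linear' or 'gaussian'")
        scans = np.maximum(np.round(base_scans * w), min_scans).astype(int)
        return scans + (scans % 2)          # bump odd counts up to even

    sched = scans_per_increment(n_t1=128, base_scans=64)
    print("total scans:", sched.sum(), "vs uniform:", 128 * 64)
    ```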

  4. Multislice spiral CT simulator for dynamic cardiopulmonary studies

    NASA Astrophysics Data System (ADS)

    De Francesco, Silvia; Ferreira da Silva, Augusto M.

    2002-04-01

    We've developed a multi-slice spiral CT simulator modeling the acquisition process of a real tomograph over a 4-dimensional phantom (4D MCAT) of the human thorax. The simulator allows us to visually characterize artifacts due to insufficient temporal sampling and to evaluate a priori the quality of the images obtained in cardio-pulmonary studies (both with single-/multi-slice and ECG-gated acquisition processes). The simulating environment allows for both conventional and spiral scanning modes and includes a model of noise in the acquisition process. In the case of spiral scanning, reconstruction facilities include longitudinal interpolation methods (360LI and 180LI, for both single and multi-slice). The section is then reconstructed through FBP. The reconstructed images/volumes are affected by distortion due to insufficient temporal sampling of the moving object. The developed simulating environment allows us to investigate the nature of this distortion and to characterize it qualitatively and quantitatively (using, for example, Herman's measures). Much of our work is focused on the determination of adequate temporal sampling and sinogram regularization techniques. At the moment, the simulator is limited to the case of a multi-slice tomograph; extension to cone-beam or area detectors is planned as the next development step.

  5. Prevalence of phonological disorders and phonological processes in typical and atypical phonological development.

    PubMed

    Ceron, Marizete Ilha; Gubiani, Marileda Barichello; Oliveira, Camila Rosa de; Gubiani, Marieli Barichello; Keske-Soares, Márcia

    2017-05-08

    To determine the occurrence of phonological disorders by age, gender and school type, and analyze the phonological processes observed in typical and atypical phonological development across different age groups. The sample consisted of 866 children aged between 3:0 and 8:11 years, recruited from public and private schools in the city of Santa Maria/RS. A phonological evaluation was performed to analyze the operative phonological processes. 15.26% (n = 132) of the sample presented atypical phonological acquisition (phonological disorders). Phonological impairments were more frequent in public school students across all age groups. Phonological alterations were most frequent between ages 4 and 6, and more prevalent in males than females in all but the youngest age group. The most common phonological processes in typical phonological acquisition were: cluster reduction; nonlateral liquid deletion in coda; nonlateral liquid substitution in onset; semivocalization of lateral liquids in coda; and unstressed syllable deletion. In children with phonological disorders, the most common phonological processes were: lateral and nonlateral liquid substitution in onset position; nonlateral liquid deletion; fronting of fricatives in onset position; unstressed syllable deletion; semivocalization of nonlateral liquid in coda; and nonlateral liquid deletion in coda position. Phonological processes were highly prevalent in the present sample, and occurred more often in boys than in girls. Information regarding the type and frequency of phonological processes in both typical phonological acquisition and phonological disorders may contribute to early diagnosis and increase the efficiency of treatment planning.

  6. LACIE performance predictor FOC users manual

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The LACIE Performance Predictor (LPP) is a computer simulation of the LACIE process for predicting worldwide wheat production. The simulation provides for the introduction of various errors into the system and provides estimates based on these errors, thus allowing the user to determine the impact of selected error sources. The FOC LPP simulates the acquisition of the sample segment data by the LANDSAT Satellite (DAPTS), the classification of the agricultural area within the sample segment (CAMS), the estimation of the wheat yield (YES), and the production estimation and aggregation (CAS). These elements include data acquisition characteristics, environmental conditions, classification algorithms, the LACIE aggregation and data adjustment procedures. The operational structure for simulating these elements consists of the following key programs: (1) LACIE Utility Maintenance Process, (2) System Error Executive, (3) Ephemeris Generator, (4) Access Generator, (5) Acquisition Selector, (6) LACIE Error Model (LEM), and (7) Post Processor.

  7. Autonomous Surface Sample Acquisition for Planetary and Lunar Exploration

    NASA Astrophysics Data System (ADS)

    Barnes, D. P.

    2007-08-01

    Surface science sample acquisition is a critical activity within any planetary and lunar exploration mission, and our research is focused upon the design, implementation, experimentation and demonstration of an onboard autonomous surface sample acquisition capability for a rover equipped with a robotic arm upon which are mounted appropriate science instruments. Images captured by a rover stereo camera system can be processed using shape-from-stereo methods and a digital elevation model (DEM) generated. We have developed a terrain feature identification algorithm that can determine autonomously from DEM data suitable regions for instrument placement and/or surface sample acquisition. Once identified, surface normal data can be generated autonomously, which are then used to calculate an arm trajectory for instrument placement and sample acquisition. Once an instrument placement and sample acquisition trajectory has been calculated, a collision detection algorithm is required to ensure the safe operation of the arm during sample acquisition. We have developed a novel adaptive 'bounding spheres' approach to this problem. Once potential science targets have been identified, and these are within the reach of the arm and will not cause any undesired collision, then the 'cost' of executing the sample acquisition activity is required. Such information, which includes power expenditure and duration, can be used to select the 'best' target from a set of potential targets. We have developed a science sample acquisition resource requirements calculation that utilises differential inverse kinematics methods to yield a high fidelity result, thus improving upon simple 1st order approximations. To test our algorithms a new Planetary Analogue Terrain (PAT) Laboratory has been created that has a terrain region composed of Mars Soil Simulant-D from DLR Germany, and rocks that have been fully characterised in the laboratory. These have been donated by the UK Planetary Analogue Field Study network, and constitute the science targets for our autonomous sample acquisition work. Our PAT Lab terrain has been designed to support our new rover chassis which is based upon the ExoMars rover Concept-E mechanics which were investigated during the ESA ExoMars Phase A study. The rover has 6 wheel drives, 6 wheels steering, and a 6 wheel walking capability. Mounted on the rover chassis is the UWA robotic arm and mast. We have designed and built a PanCam system complete with a computer controlled pan and tilt mechanism. The UWA PanCam is based upon the ExoMars PanCam (Phase A study) and hence supports two Wide Angle Cameras (WAC - 64 degree FOV) and a High Resolution Camera (HRC - 5 degree FOV). WAC separation is 500 mm. Software has been developed to capture images which form the data input into our on-board autonomous surface sample acquisition algorithms.
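
    The adaptive 'bounding spheres' collision check reduces, at its core, to pairwise sphere-overlap tests between the spheres covering the arm and those covering the terrain or science target. A minimal sketch of that test follows; the geometry below is invented for illustration and is not the authors' implementation.

    ```python
    import numpy as np

    def spheres_collide(centers_a, radii_a, centers_b, radii_b):
        """Conservative collision test between two bodies modelled as sphere sets.

        Each body (e.g. an arm link and a rock/terrain patch) is covered by a set
        of bounding spheres; a potential collision is flagged if any sphere of one
        set overlaps any sphere of the other.
        """
        ca = np.asarray(centers_a)[:, None, :]        # (na, 1, 3)
        cb = np.asarray(centers_b)[None, :, :]        # (1, nb, 3)
        dist = np.linalg.norm(ca - cb, axis=-1)       # pairwise centre distances
        limit = np.asarray(radii_a)[:, None] + np.asarray(radii_b)[None, :]
        return bool((dist < limit).any())

    # Toy check: two spheres along an arm link versus one sphere around a rock.
    arm_centres = [[0.0, 0.0, 0.5], [0.0, 0.0, 0.8]]
    arm_radii = [0.10, 0.10]
    rock_centre, rock_radius = [[0.05, 0.0, 0.85]], [0.08]
    print(spheres_collide(arm_centres, arm_radii, rock_centre, rock_radius))  # True
    ```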

  8. Hardware/Software Issues for Video Guidance Systems: The Coreco Frame Grabber

    NASA Technical Reports Server (NTRS)

    Bales, John W.

    1996-01-01

    The F64 frame grabber is a high performance video image acquisition and processing board utilizing the TMS320C40 and TMS34020 processors. The hardware is designed for the 16-bit ISA bus and supports multiple digital or analog cameras. It has an acquisition rate of 40 million pixels per second, with a variable sampling frequency of 510 kHz to 40 MHz. The board has a 4 MB frame buffer memory, expandable to 32 MB, and has a simultaneous acquisition and processing capability. It supports both VGA and RGB displays, and accepts all analog and digital video input standards.

  9. Imaging system design and image interpolation based on CMOS image sensor

    NASA Astrophysics Data System (ADS)

    Li, Yu-feng; Liang, Fei; Guo, Rui

    2009-11-01

    An image acquisition system is introduced, which consists of a color CMOS image sensor (OV9620), SRAM (CY62148), CPLD (EPM7128AE) and DSP (TMS320VC5509A). The CPLD implements the logic and timing control of the system. SRAM stores the image data, and the DSP controls the image acquisition system through the SCCB (OmniVision Serial Camera Control Bus). The timing sequence of the CMOS image sensor OV9620 is analyzed. The imaging part and the high speed image data memory unit are designed. The hardware and software design of the image acquisition and processing system is given. CMOS digital cameras use color filter arrays to sample different spectral components, such as red, green, and blue. At the location of each pixel only one color sample is taken, and the other colors must be interpolated from neighboring samples. We use an edge-oriented adaptive interpolation algorithm for the edge pixels and a bilinear interpolation algorithm for the non-edge pixels to improve the visual quality of the interpolated images. This method achieves high processing speed, reduces computational complexity, and effectively preserves image edges.
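
    The edge-oriented green interpolation can be sketched as a gradient comparison between the horizontal and vertical green neighbours of a red/blue site, with a bilinear fallback in flat regions. The threshold and the toy Bayer patch below are assumptions for illustration, not the paper's exact classifier.

    ```python
    import numpy as np

    def interp_green_at(raw, r, c, threshold=10):
        """Edge-oriented estimate of the green value at a non-green Bayer site (r, c).

        Compares the horizontal and vertical gradients of the neighbouring green
        samples; interpolates along the direction of the smaller gradient (i.e.
        along the edge), and falls back to a bilinear average in flat regions.
        """
        g_left, g_right = float(raw[r, c - 1]), float(raw[r, c + 1])
        g_up, g_down = float(raw[r - 1, c]), float(raw[r + 1, c])
        grad_h, grad_v = abs(g_left - g_right), abs(g_up - g_down)
        if grad_h + threshold < grad_v:          # edge runs horizontally -> row neighbours
            return 0.5 * (g_left + g_right)
        if grad_v + threshold < grad_h:          # edge runs vertically -> column neighbours
            return 0.5 * (g_up + g_down)
        return 0.25 * (g_left + g_right + g_up + g_down)   # flat area: bilinear

    # Toy 5x5 Bayer patch, estimate green at the centre (non-green) pixel.
    raw = np.array([[80, 82, 81, 83, 80],
                    [81, 79, 150, 80, 82],
                    [82, 148, 200, 152, 81],
                    [80, 81, 149, 79, 83],
                    [83, 80, 82, 81, 80]], dtype=float)
    print(interp_green_at(raw, 2, 2))
    ```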

  10. Data Streaming for Metabolomics: Accelerating Data Processing and Analysis from Days to Minutes

    PubMed Central

    2016-01-01

    The speed and throughput of analytical platforms has been a driving force in recent years in the “omics” technologies and while great strides have been accomplished in both chromatography and mass spectrometry, data analysis times have not benefited at the same pace. Even though personal computers have become more powerful, data transfer times still represent a bottleneck in data processing because of the increasingly complex data files and studies with a greater number of samples. To meet the demand of analyzing hundreds to thousands of samples within a given experiment, we have developed a data streaming platform, XCMS Stream, which capitalizes on the acquisition time to compress and stream recently acquired data files to data processing servers, mimicking just-in-time production strategies from the manufacturing industry. The utility of this XCMS Online-based technology is demonstrated here in the analysis of T cell metabolism and other large-scale metabolomic studies. A large scale example on a 1000 sample data set demonstrated a 10 000-fold time savings, reducing data analysis time from days to minutes. Further, XCMS Stream has the capability to increase the efficiency of downstream biochemical dependent data acquisition (BDDA) analysis by initiating data conversion and data processing on subsets of data acquired, expanding its application beyond data transfer to smart preliminary data decision-making prior to full acquisition. PMID:27983788

  11. Data streaming for metabolomics: Accelerating data processing and analysis from days to minutes

    DOE PAGES

    Montenegro-Burke, J. Rafael; Aisporna, Aries E.; Benton, H. Paul; ...

    2016-12-16

    The speed and throughput of analytical platforms has been a driving force in recent years in the “omics” technologies and while great strides have been accomplished in both chromatography and mass spectrometry, data analysis times have not benefited at the same pace. Even though personal computers have become more powerful, data transfer times still represent a bottleneck in data processing because of the increasingly complex data files and studies with a greater number of samples. To meet the demand of analyzing hundreds to thousands of samples within a given experiment, we have developed a data streaming platform, XCMS Stream, which capitalizes on the acquisition time to compress and stream recently acquired data files to data processing servers, mimicking just-in-time production strategies from the manufacturing industry. The utility of this XCMS Online-based technology is demonstrated here in the analysis of T cell metabolism and other large-scale metabolomic studies. A large scale example on a 1000 sample data set demonstrated a 10 000-fold time savings, reducing data analysis time from days to minutes. Here, XCMS Stream has the capability to increase the efficiency of downstream biochemical dependent data acquisition (BDDA) analysis by initiating data conversion and data processing on subsets of data acquired, expanding its application beyond data transfer to smart preliminary data decision-making prior to full acquisition.

  12. Data Streaming for Metabolomics: Accelerating Data Processing and Analysis from Days to Minutes.

    PubMed

    Montenegro-Burke, J Rafael; Aisporna, Aries E; Benton, H Paul; Rinehart, Duane; Fang, Mingliang; Huan, Tao; Warth, Benedikt; Forsberg, Erica; Abe, Brian T; Ivanisevic, Julijana; Wolan, Dennis W; Teyton, Luc; Lairson, Luke; Siuzdak, Gary

    2017-01-17

    The speed and throughput of analytical platforms has been a driving force in recent years in the "omics" technologies and while great strides have been accomplished in both chromatography and mass spectrometry, data analysis times have not benefited at the same pace. Even though personal computers have become more powerful, data transfer times still represent a bottleneck in data processing because of the increasingly complex data files and studies with a greater number of samples. To meet the demand of analyzing hundreds to thousands of samples within a given experiment, we have developed a data streaming platform, XCMS Stream, which capitalizes on the acquisition time to compress and stream recently acquired data files to data processing servers, mimicking just-in-time production strategies from the manufacturing industry. The utility of this XCMS Online-based technology is demonstrated here in the analysis of T cell metabolism and other large-scale metabolomic studies. A large scale example on a 1000 sample data set demonstrated a 10 000-fold time savings, reducing data analysis time from days to minutes. Further, XCMS Stream has the capability to increase the efficiency of downstream biochemical dependent data acquisition (BDDA) analysis by initiating data conversion and data processing on subsets of data acquired, expanding its application beyond data transfer to smart preliminary data decision-making prior to full acquisition.
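
    The streaming idea, compressing each data file as soon as the instrument finishes writing it and shipping it to the processing side while the run continues, can be sketched as a simple directory watcher. The file pattern, settle time, and local "server" directory below are assumptions; the real XCMS Stream transfers to XCMS Online rather than to a local folder.

    ```python
    import gzip
    import shutil
    import time
    from pathlib import Path

    def stream_finished_files(acq_dir, server_dir, poll_s=5.0, settle_s=30.0):
        """Compress and ship data files to the processing side while acquisition runs.

        A file is considered 'finished' once its modification time is older than
        'settle_s' (the instrument has moved on to the next sample); it is then
        gzip-compressed and copied to 'server_dir', here a local stand-in for a
        remote processing server.
        """
        acq_dir, server_dir = Path(acq_dir), Path(server_dir)
        server_dir.mkdir(parents=True, exist_ok=True)
        shipped = set()
        while True:
            now = time.time()
            for f in sorted(acq_dir.glob("*.mzML")):
                if f in shipped or now - f.stat().st_mtime < settle_s:
                    continue
                gz = server_dir / (f.name + ".gz")
                with open(f, "rb") as src, gzip.open(gz, "wb") as dst:
                    shutil.copyfileobj(src, dst)       # compress while copying
                shipped.add(f)
                print(f"shipped {f.name} ({gz.stat().st_size} bytes compressed)")
            time.sleep(poll_s)

    # stream_finished_files("acquisition", "server_incoming")  # run alongside the LC-MS queue
    ```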

  13. Planning Related to the Curation and Processing of Returned Martian Samples

    NASA Astrophysics Data System (ADS)

    McCubbin, F. M.; Harrington, A. D.

    2018-04-01

    Many of the planning activities in the NASA Astromaterials Acquisition and Curation Office at JSC are centered around Mars Sample Return. The importance of contamination knowledge and the benefits of a mobile/modular receiving facility are discussed.

  14. Skills Acquisition in Plantain Flour Processing Enterprises: A Validation of Training Modules for Senior Secondary Schools

    ERIC Educational Resources Information Center

    Udofia, Nsikak-Abasi; Nlebem, Bernard S.

    2013-01-01

    This study was to validate training modules that can help provide requisite skills for Senior Secondary school students in plantain flour processing enterprises for self-employment and to enable them pass their examination. The study covered Rivers State. Purposive sampling technique was used to select a sample size of 205. Two sets of structured…

  15. A new processing scheme for ultra-high resolution direct infusion mass spectrometry data

    NASA Astrophysics Data System (ADS)

    Zielinski, Arthur T.; Kourtchev, Ivan; Bortolini, Claudio; Fuller, Stephen J.; Giorio, Chiara; Popoola, Olalekan A. M.; Bogialli, Sara; Tapparo, Andrea; Jones, Roderic L.; Kalberer, Markus

    2018-04-01

    High resolution, high accuracy mass spectrometry is widely used to characterise environmental or biological samples with highly complex composition enabling the identification of chemical composition of often unknown compounds. Despite instrumental advancements, the accurate molecular assignment of compounds acquired in high resolution mass spectra remains time consuming and requires automated algorithms, especially for samples covering a wide mass range and large numbers of compounds. A new processing scheme is introduced implementing filtering methods based on element assignment, instrumental error, and blank subtraction. Optional post-processing incorporates common ion selection across replicate measurements and shoulder ion removal. The scheme allows both positive and negative direct infusion electrospray ionisation (ESI) and atmospheric pressure photoionisation (APPI) acquisition with the same programs. An example application to atmospheric organic aerosol samples using an Orbitrap mass spectrometer is reported for both ionisation techniques resulting in final spectra with 0.8% and 8.4% of the peaks retained from the raw spectra for APPI positive and ESI negative acquisition, respectively.
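
    One of the filtering steps mentioned above, blank subtraction with a mass-accuracy window, can be sketched as follows; the ppm tolerance, intensity ratio, and peak lists are illustrative values, not the scheme's published defaults.

    ```python
    import numpy as np

    def filter_peaks(sample_mz, sample_intensity, blank_mz, blank_intensity,
                     ppm_tol=3.0, blank_ratio=10.0):
        """Blank subtraction with a ppm mass tolerance, as one step of such a scheme.

        A sample peak is kept only if no blank peak lies within 'ppm_tol' of its
        m/z, or if its intensity exceeds the matching blank peak by 'blank_ratio'.
        (Element-assignment and instrument-error filters would be applied similarly.)
        """
        keep = np.ones(sample_mz.size, dtype=bool)
        for i, (mz, inten) in enumerate(zip(sample_mz, sample_intensity)):
            match = np.abs(blank_mz - mz) / mz * 1e6 <= ppm_tol
            if match.any() and inten < blank_ratio * blank_intensity[match].max():
                keep[i] = False
        return keep

    smz = np.array([151.0401, 203.0532, 301.1412])
    sint = np.array([2e6, 5e4, 8e5])
    bmz = np.array([203.0533, 301.1410])
    bint = np.array([4e4, 9e5])
    print(filter_peaks(smz, sint, bmz, bint))   # [ True False False]
    ```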

  16. Energy Efficient GNSS Signal Acquisition Using Singular Value Decomposition (SVD).

    PubMed

    Bermúdez Ordoñez, Juan Carlos; Arnaldo Valdés, Rosa María; Gómez Comendador, Fernando

    2018-05-16

    A significant challenge in global navigation satellite system (GNSS) signal processing is the requirement for a very high sampling rate. The recently-emerging compressed sensing (CS) theory makes processing GNSS signals at a low sampling rate possible if the signal has a sparse representation in a certain space. Based on CS and SVD theories, an algorithm for sampling GNSS signals at a rate much lower than the Nyquist rate and reconstructing the compressed signal is proposed in this research; it is validated by showing that the output of that process still supports signal detection using the standard fast Fourier transform (FFT) parallel frequency space search acquisition. The sparse representation of the GNSS signal is the most important precondition for CS; it is achieved by constructing a rectangular Toeplitz matrix (TZ) of the transmitted signal and calculating its left singular vectors using the SVD. Next, by obtaining the M-dimensional observation vectors based on the left singular vectors of the SVD, which are equivalent to the sampler operator in standard compressive sensing theory, the signal can be sampled below the Nyquist rate and can still be reconstructed accurately via ℓ1 minimization using convex optimization. As an added value, there is a GNSS signal acquisition enhancement effect: the useful signal is retained and noise is filtered out by projecting the signal onto the most significant proper orthogonal modes (PODs), which are the optimal distributions of signal power. The algorithm is validated with real recorded signals, and the results show that the proposed method is effective for sampling and reconstructing intermediate frequency (IF) GNSS signals in the time discrete domain.

  17. Energy Efficient GNSS Signal Acquisition Using Singular Value Decomposition (SVD)

    PubMed Central

    Arnaldo Valdés, Rosa María; Gómez Comendador, Fernando

    2018-01-01

    A significant challenge in global navigation satellite system (GNSS) signal processing is the requirement for a very high sampling rate. The recently-emerging compressed sensing (CS) theory makes processing GNSS signals at a low sampling rate possible if the signal has a sparse representation in a certain space. Based on CS and SVD theories, an algorithm for sampling GNSS signals at a rate much lower than the Nyquist rate and reconstructing the compressed signal is proposed in this research; it is validated by showing that the output of that process still supports signal detection using the standard fast Fourier transform (FFT) parallel frequency space search acquisition. The sparse representation of the GNSS signal is the most important precondition for CS; it is achieved by constructing a rectangular Toeplitz matrix (TZ) of the transmitted signal and calculating its left singular vectors using the SVD. Next, by obtaining the M-dimensional observation vectors based on the left singular vectors of the SVD, which are equivalent to the sampler operator in standard compressive sensing theory, the signal can be sampled below the Nyquist rate and can still be reconstructed accurately via ℓ1 minimization using convex optimization. As an added value, there is a GNSS signal acquisition enhancement effect: the useful signal is retained and noise is filtered out by projecting the signal onto the most significant proper orthogonal modes (PODs), which are the optimal distributions of signal power. The algorithm is validated with real recorded signals, and the results show that the proposed method is effective for sampling and reconstructing intermediate frequency (IF) GNSS signals in the time discrete domain. PMID:29772731
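
    A toy numerical sketch of the first two steps: building a rectangular Toeplitz matrix from the transmitted code and taking its left singular vectors as the proper orthogonal modes that both sparsify the signal and define the low-dimensional measurement operator. The code length, shift count, and noise level are invented, and the ℓ1 reconstruction stage of the paper is not reproduced here.

    ```python
    import numpy as np
    from scipy.linalg import svd, toeplitz

    rng = np.random.default_rng(0)

    # Toy stand-in for the transmitted code (a real receiver would use the known
    # PRN sequence sampled at IF); its shifted copies form a rectangular Toeplitz matrix.
    code = np.sign(rng.standard_normal(512))              # +/-1 chip sequence
    n_shifts = 64
    col = np.r_[code, np.zeros(n_shifts - 1)]              # first column, length 575
    row = np.r_[code[0], np.zeros(n_shifts - 1)]           # first row, length 64
    TZ = toeplitz(col, row)                                # shape (575, 64)

    # Left singular vectors of TZ are the proper orthogonal modes (PODs): the basis
    # in which the received signal is energy-compacted and, truncated, the
    # low-dimensional measurement operator.
    U, _, _ = svd(TZ, full_matrices=False)

    # A received signal is, to first order, a delayed replica of the code plus noise.
    delay = 17
    received = np.roll(col, delay) + 0.2 * rng.standard_normal(col.size)

    coeffs = U.T @ received                                # 64 POD coefficients
    captured = np.sum(coeffs**2) / np.sum(received**2)
    print(f"{n_shifts} POD coefficients capture {captured:.1%} "
          f"of the energy of a {col.size}-sample record")
    ```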

  18. Full-field wrist pulse signal acquisition and analysis by 3D Digital Image Correlation

    NASA Astrophysics Data System (ADS)

    Xue, Yuan; Su, Yong; Zhang, Chi; Xu, Xiaohai; Gao, Zeren; Wu, Shangquan; Zhang, Qingchuan; Wu, Xiaoping

    2017-11-01

    Pulse diagnosis is an essential part of the four basic diagnostic methods (inspection, listening, inquiring, and palpation) in traditional Chinese medicine; it depends on long training and rich experience, so computerized pulse acquisition has been proposed and studied to ensure objectivity. To imitate the process of doctors using three fingertips with different pressures to feel fluctuations in certain areas containing three acupoints, we established a five-dimensional pulse signal acquisition system adopting a non-contacting optical metrology method, 3D digital image correlation, to record the full-field displacements of skin fluctuations under different pressures. The system realizes real-time full-field vibration mode observation at 10 FPS. The maximum sample frequency is 472 Hz for detailed post-processing. After acquisition, the signals are analyzed according to amplitude, pressure, and pulse wave velocity. The proposed system provides a novel optical approach for digitalizing pulse diagnosis and for massive pulse signal data acquisition for various types of patients.

  19. Development of Data Acquisition Set-up for Steady-state Experiments

    NASA Astrophysics Data System (ADS)

    Srivastava, Amit K.; Gupta, Arnab D.; Sunil, S.; Khan, Ziauddin

    2017-04-01

    For short duration experiments, the digitized data are generally transferred for processing and storage after the experiment, whereas in a steady-state experiment the data are acquired, processed, displayed, and stored continuously in a pipelined manner. This requires acquiring data through special techniques for storage and viewing the data on the go to display the current trends of various physical parameters. A small data acquisition set-up has been developed for continuously acquiring signals from various physical parameters at different sampling rates for long duration experiments. This includes the hardware set-up for signal digitization, a Field Programmable Gate Array (FPGA) based timing system for clock synchronization and event/trigger distribution, time slicing of data streams for storage of data chunks to enable viewing of data during acquisition, channel profile display through down-sampling, etc. In order to store a long data stream of indefinite/long duration, the data stream is divided into slices/chunks of user-defined time duration. Data chunks avoid the problem of server data being inaccessible until the channel data file is closed at the end of a long duration experiment. A graphical user interface has been developed in the LabVIEW application development environment for configuring the data acquisition hardware and storing data chunks on the local machine as well as at a remote data server through Python for further data access. The data plotting and analysis utilities have been developed with Python software, which provides tools for further data processing. This paper describes the development and implementation of data acquisition for steady-state experiments.
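
    The time-slicing of the data stream into chunks can be sketched as below; the .npy container, chunk length, and simulated digitizer blocks are stand-ins for illustration (the actual set-up streams chunks to a remote server through Python rather than to local files only).

    ```python
    from pathlib import Path

    import numpy as np

    def acquire_in_chunks(read_block, out_dir, chunk_seconds=60.0, sample_rate=1000.0):
        """Write a continuous acquisition as a series of fixed-duration chunk files.

        read_block() must return the next block of samples (a 1-D numpy array)
        or None when the run ends; each completed chunk is flushed to its own
        .npy file, so analysis tools can open earlier chunks while the
        experiment keeps running.
        """
        out_dir = Path(out_dir)
        out_dir.mkdir(parents=True, exist_ok=True)
        samples_per_chunk = int(chunk_seconds * sample_rate)
        buffer, buffered, chunk_idx = [], 0, 0
        while True:
            block = read_block()
            if block is None:                      # acquisition finished
                break
            buffer.append(block)
            buffered += block.size
            if buffered >= samples_per_chunk:
                data = np.concatenate(buffer)
                np.save(out_dir / f"chunk_{chunk_idx:06d}.npy", data[:samples_per_chunk])
                buffer = [data[samples_per_chunk:]]
                buffered = data.size - samples_per_chunk
                chunk_idx += 1
        if buffered:
            np.save(out_dir / f"chunk_{chunk_idx:06d}.npy", np.concatenate(buffer))

    # Example source: 20 blocks of simulated digitizer output, then end-of-run.
    blocks = iter([np.random.randn(3000) for _ in range(20)] + [None])
    acquire_in_chunks(lambda: next(blocks), "chunks", chunk_seconds=5.0, sample_rate=1000.0)
    ```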

  20. High speed CMOS acquisition system based on FPGA embedded image processing for electro-optical measurements

    NASA Astrophysics Data System (ADS)

    Rosu-Hamzescu, Mihnea; Polonschii, Cristina; Oprea, Sergiu; Popescu, Dragos; David, Sorin; Bratu, Dumitru; Gheorghiu, Eugen

    2018-06-01

    Electro-optical measurements, i.e., optical waveguide and plasmonic-based electrochemical impedance spectroscopy (P-EIS), rely on the sensitive dependence of the refractive index of electro-optical sensors on surface charge density, modulated by an AC electrical field applied to the sensor surface. Recently, P-EIS has emerged as a new analytical tool that can resolve local impedance with high, optical spatial resolution, without using microelectrodes. This study describes a high-speed image acquisition and processing system for electro-optical measurements, based on a high-speed complementary metal-oxide semiconductor (CMOS) sensor and a field-programmable gate array (FPGA) board. The FPGA is used to configure CMOS parameters, as well as to receive and locally process the acquired images by performing Fourier analysis for each pixel, deriving the real and imaginary parts of the Fourier coefficients at the AC field frequencies. An AC field generator for single- or multi-sine signals is synchronized with the high-speed acquisition system for phase measurements. The system was successfully used for real-time angle-resolved electro-plasmonic measurements from 30 Hz up to 10 kHz, providing results consistent with those obtained by a conventional electrical impedance approach. The system was able to detect relative amplitude variations of ±1%, even at rather low sampling rates per period (i.e., 8 samples per period). The PC (personal computer) acquisition and control software allows synchronized acquisition from multiple FPGA boards, making it also suitable for simultaneous angle-resolved P-EIS imaging.
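
    The per-pixel Fourier analysis can be illustrated with a short numpy sketch; the frame rate, drive frequency and array sizes below are assumptions, and the real system performs this step on the FPGA rather than in Python.

```python
# Sketch of the per-pixel Fourier analysis step: for an image stack sampled
# synchronously with an AC excitation, project each pixel's time series onto
# cos/sin references at the drive frequency to get real/imaginary coefficients.
import numpy as np

fs = 80.0            # frames per second (assumed)
f_ac = 10.0          # AC drive frequency (assumed), i.e. 8 samples per period
n_frames, h, w = 80, 64, 64

t = np.arange(n_frames) / fs
frames = np.random.standard_normal((n_frames, h, w))   # stand-in for CMOS frames

ref_cos = np.cos(2 * np.pi * f_ac * t)
ref_sin = np.sin(2 * np.pi * f_ac * t)

# Real and imaginary Fourier coefficients at f_ac, computed per pixel
re = np.tensordot(ref_cos, frames, axes=(0, 0)) * 2 / n_frames
im = np.tensordot(ref_sin, frames, axes=(0, 0)) * 2 / n_frames
amplitude = np.hypot(re, im)
phase = np.arctan2(im, re)
```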

  1. Ring artifact reduction in synchrotron x-ray tomography through helical acquisition

    NASA Astrophysics Data System (ADS)

    Pelt, Daniël M.; Parkinson, Dilworth Y.

    2018-03-01

    In synchrotron x-ray tomography, systematic defects in certain detector elements can result in arc-shaped artifacts in the final reconstructed image of the scanned sample. These ring artifacts are commonly found in many applications of synchrotron tomography, and can make it difficult or impossible to use the reconstructed image in further analyses. The severity of ring artifacts is often reduced in practice by applying pre-processing on the acquired data, or post-processing on the reconstructed image. However, such additional processing steps can introduce additional artifacts as well, and rely on specific choices of hyperparameter values. In this paper, a different approach to reducing the severity of ring artifacts is introduced: a helical acquisition mode. By moving the sample parallel to the rotation axis during the experiment, the sample is detected at different detector positions in each projection, reducing the effect of systematic errors in detector elements. Alternatively, helical acquisition can be viewed as a way to transform ring artifacts to helix-like artifacts in the reconstructed volume, reducing their severity. We show that data acquired with the proposed mode can be transformed to data acquired with a virtual circular trajectory, enabling further processing of the data with existing software packages for circular data. Results for both simulated data and experimental data show that the proposed method is able to significantly reduce ring artifacts in practice, even compared with popular existing methods, without introducing additional artifacts.
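
    A rough numpy/scipy sketch of the rebinning idea, under the simplifying assumption that the only difference between helical and circular data is a known axial shift per projection; the per-projection shift dz_px and the array sizes are illustrative values, not parameters from the paper.

```python
# Rough sketch: in helical mode each projection i is acquired with the sample
# translated by i*dz_px along the rotation axis, so the projections can be
# shifted back row-wise to emulate a virtual circular trajectory.
import numpy as np
from scipy.ndimage import shift

n_proj, n_rows, n_cols = 180, 128, 256
dz_px = 0.25                                    # axial translation per projection, in pixels (assumed)
helical = np.random.standard_normal((n_proj, n_rows, n_cols))   # stand-in for projections

virtual_circular = np.empty_like(helical)
for i in range(n_proj):
    # undo the known axial motion of the sample for projection i
    virtual_circular[i] = shift(helical[i], (i * dz_px, 0), order=1, mode="nearest")
```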

  2. RTSPM: real-time Linux control software for scanning probe microscopy.

    PubMed

    Chandrasekhar, V; Mehta, M M

    2013-01-01

    Real time computer control is an essential feature of scanning probe microscopes, which have become important tools for the characterization and investigation of nanometer scale samples. Most commercial (and some open-source) scanning probe data acquisition software uses digital signal processors to handle the real time data processing and control, which adds to the expense and complexity of the control software. We describe here scan control software that uses a single computer and a data acquisition card to acquire scan data. The computer runs an open-source real time Linux kernel, which permits fast acquisition and control while maintaining a responsive graphical user interface. Images from a simulated tuning-fork based microscope as well as a standard topographical sample are also presented, showing some of the capabilities of the software.

  3. Concrete thawing studied by single-point ramped imaging.

    PubMed

    Prado, P J; Balcom, B J; Beyea, S D; Armstrong, R L; Bremner, T W

    1997-12-01

    A series of two-dimensional images of the proton distribution in a hardened concrete sample has been obtained during the thawing process (from -50 degrees C up to 11 degrees C). The SPRITE sequence is optimal for this study given the characteristically short relaxation times of water in this porous medium (T2* < 200 μs and T1 < 3.6 ms). The relaxation parameters of the sample were determined in order to optimize the time efficiency of the sequence, permitting a 4-scan 64 x 64 acquisition in under 3 min. The image acquisition is fast on the time scale of the temperature evolution of the specimen. The frozen water distribution is quantified through a position-based study of the image contrast. A multiple-point acquisition method is presented and the signal sensitivity improvement is discussed.

  4. Thermal imaging measurement of lateral diffusivity and non-invasive material defect detection

    DOEpatents

    Sun, Jiangang; Deemer, Chris

    2003-01-01

    A system and method for determining lateral thermal diffusivity of a material sample using a heat pulse; a sample oriented within an orthogonal coordinate system; an infrared camera; and a computer that has a digital frame grabber, and data acquisition and processing software. The mathematical model used within the data processing software is capable of determining the lateral thermal diffusivity of a sample of finite boundaries. The system and method may also be used as a nondestructive method for detecting and locating cracks within the material sample.

  5. 77 FR 9617 - Defense Federal Acquisition Regulation Supplement; DoD Voucher Processing (DFARS Case 2011-D054)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-17

    ... risk-based sampling methodologies will be reviewed and approved by the contract auditors for... the disbursing office. All interim vouchers are subject to an audit of actual costs incurred after... process currently referenced...

  6. Fast acquisition of multidimensional NMR spectra of solids and mesophases using alternative sampling methods.

    PubMed

    Lesot, Philippe; Kazimierczuk, Krzysztof; Trébosc, Julien; Amoureux, Jean-Paul; Lafon, Olivier

    2015-11-01

    Unique information about the atom-level structure and dynamics of solids and mesophases can be obtained by the use of multidimensional nuclear magnetic resonance (NMR) experiments. Nevertheless, these experiments often require long acquisition times. We review here alternative sampling methods, which have been proposed to circumvent this issue in the case of solids and mesophases. Compared to the spectra of solutions, those of solids and mesophases present some specificities because they usually display lower signal-to-noise ratios, non-Lorentzian line shapes, lower spectral resolution and wider spectral widths. We highlight herein the advantages and limitations of these alternative sampling methods. A first route to accelerate the acquisition of multidimensional NMR spectra consists in the use of sparse sampling schemes, such as truncated, radial or random sampling. These sparsely sampled datasets are generally processed by reconstruction methods differing from the Discrete Fourier Transform (DFT). A host of non-DFT methods have been applied for solids and mesophases, including the G-matrix Fourier transform, linear least-squares procedures, the covariance transform, maximum entropy and compressed sensing. A second class of alternative sampling methods departs from the Jeener paradigm for multidimensional NMR experiments. These non-Jeener methods include Hadamard spectroscopy as well as spatial or orientational encoding of the evolution frequencies. The increasing number of high-field NMR magnets and the development of techniques to enhance NMR sensitivity will contribute to widening the use of these alternative sampling methods for the study of solids and mesophases in the coming years. Copyright © 2015 John Wiley & Sons, Ltd.
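
    As a generic illustration of the sparse (non-uniform) sampling schemes mentioned in the review, and not a scheme taken from it, the following sketch draws an exponentially biased random subset of indirect-dimension increments; the grid size, sampled fraction and decay constant are arbitrary.

```python
# Generate an exponentially biased random sampling schedule for the indirect
# dimension of a 2D experiment, keeping only a fraction of the Nyquist grid.
import numpy as np

rng = np.random.default_rng(1)
n_full = 256          # full Nyquist grid in t1 (assumed)
fraction = 0.25       # keep 25% of the increments (assumed)
decay = 3.0           # bias sampling toward early t1 where the signal is strongest

weights = np.exp(-decay * np.arange(n_full) / n_full)
weights /= weights.sum()
schedule = np.sort(rng.choice(n_full, size=int(fraction * n_full),
                              replace=False, p=weights))
print(schedule)       # indices of the t1 increments to acquire
```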

  7. An Analysis of U.S. Army Health Hazard Assessments During the Acquisition of Military Materiel

    DTIC Science & Technology

    2010-06-03

    protective equipment (PPE) (Milz, Conrad, & Soule, 2003). Engineering controls can eliminate hazards through system design, substitution of hazardous... Engineering control measures can serve to minimize hazards where they cannot be eliminated, with preference for... during the materiel acquisition process, and (c) will evaluate a sample of the database for accuracy by comparing the data entries to original reports

  8. Automated paleomagnetic and rock magnetic data acquisition with an in-line horizontal "2G" system

    NASA Astrophysics Data System (ADS)

    Mullender, Tom A. T.; Frederichs, Thomas; Hilgenfeldt, Christian; de Groot, Lennart V.; Fabian, Karl; Dekkers, Mark J.

    2016-09-01

    Today's paleomagnetic and magnetic proxy studies involve processing large sample collections while simultaneously demanding high data quality and high reproducibility. Here we describe a fully automated interface based on a commercial horizontal pass-through "2G" DC-SQUID magnetometer. The system has been operational at the universities of Bremen (Germany) and Utrecht (Netherlands) since 1998 and 2006, respectively, and a system is currently being built at NGU Trondheim (Norway). The magnetometers are equipped with "in-line" alternating field (AF) demagnetization, a direct-current bias field coil along the coaxial AF demagnetization coil for the acquisition of anhysteretic remanent magnetization (ARM), and a long pulse-field coil for the acquisition of isothermal remanent magnetization (IRM). Samples are contained in dedicated low-magnetization perspex holders that are manipulated by a pneumatic pick-and-place unit. If desired, samples can be measured in several positions, which considerably enhances data quality, in particular for magnetically weak samples. In the Bremen system, the peak of the IRM pulse fields is actively measured, which reduces the discrepancy between the set field and the field that is actually applied. Techniques for quantifying and removing gyroremanent overprints and for measuring the viscosity of IRM further extend the range of applications of the system. Typically, about 300 paleomagnetic samples can be AF demagnetized per week (15 levels) in the three-position protocol. The versatility of the system is illustrated by several examples of paleomagnetic and rock magnetic data processing.

  9. Non-uniformly weighted sampling for faster localized two-dimensional correlated spectroscopy of the brain in vivo

    NASA Astrophysics Data System (ADS)

    Verma, Gaurav; Chawla, Sanjeev; Nagarajan, Rajakumar; Iqbal, Zohaib; Albert Thomas, M.; Poptani, Harish

    2017-04-01

    Two-dimensional localized correlated spectroscopy (2D L-COSY) offers greater spectral dispersion than conventional one-dimensional (1D) MRS techniques, yet long acquisition times and limited post-processing support have slowed its clinical adoption. Improving acquisition efficiency and developing versatile post-processing techniques can bolster the clinical viability of 2D MRS. The purpose of this study was to implement a non-uniformly weighted sampling (NUWS) scheme for faster acquisition of 2D MRS. A NUWS 2D L-COSY sequence was developed for a 7T whole-body MRI. A phantom containing metabolites commonly observed in the brain at physiological concentrations was scanned ten times with both the NUWS scheme of 12:48 duration and a 17:04 constant eight-average sequence using a 32-channel head coil. 2D L-COSY spectra were also acquired from the occipital lobe of four healthy volunteers using both the proposed NUWS and the conventional uniformly-averaged L-COSY sequence. The NUWS 2D L-COSY sequence facilitated a 25% shorter acquisition time while maintaining comparable SNR in human (+0.3%) and phantom (+6.0%) studies compared to uniform averaging. NUWS schemes successfully demonstrated improved efficiency of L-COSY by facilitating a reduction in scan time without affecting signal quality.

  10. 77 FR 2682 - Defense Federal Acquisition Regulation Supplement; DoD Voucher Processing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-19

    ... selected using sampling methodologies will be reviewed and approved by the contract auditors for... Disallowing costs after incurrence. * * * * * (b) * * * (i) The contract auditor is the authorized...

  11. Applying Online Monitoring for Nuclear Power Plant Instrumentation and Control

    NASA Astrophysics Data System (ADS)

    Hashemian, H. M.

    2010-10-01

    This paper presents a practical review of state-of-the-art means for applying OLM data acquisition in nuclear power plant instrumentation and control, qualifying or validating the OLM data, and then analyzing it for static and dynamic performance monitoring applications. Whereas data acquisition for static or steady-state OLM applications may require only one sample every 1 to 10 seconds, or even one sample per minute, dynamic data acquisition requires higher sampling frequencies (e.g., 100 to 1000 Hz) using a dedicated data acquisition system capable of providing isolation, anti-aliasing and removal of extraneous noise, and analog-to-digital (A/D) conversion. Qualifying the data for use with OLM algorithms can involve removing data 'dead' spots (for static data) and calculating, examining, and trending the amplitude probability density, variance, skewness, and kurtosis. For static OLM applications with redundant signals, trending and averaging qualification techniques are used; for single or non-redundant signals, physical and empirical modeling is used. Dynamic OLM analysis is performed in the frequency domain and/or time domain, and is based on the assumption that the sensors' or transmitters' dynamic characteristics are linear and that the input noise signal (i.e., the process fluctuations) has proper spectral characteristics.
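
    A minimal sketch of the data-qualification step, assuming a hypothetical window length and threshold: it computes the variance, skewness and kurtosis of a record and flags constant ('dead') windows.

```python
# Compute basic qualification statistics for an OLM record and flag
# "dead" (near-constant) segments. Window length and tolerance are assumed.
import numpy as np
from scipy.stats import skew, kurtosis

def qualify(signal, win=1000, dead_tol=1e-12):
    stats = {
        "variance": np.var(signal),
        "skewness": skew(signal),
        "kurtosis": kurtosis(signal),
    }
    # a window with (near-)zero variance indicates a data dead spot
    n_win = len(signal) // win
    windows = signal[: n_win * win].reshape(n_win, win)
    stats["dead_windows"] = np.flatnonzero(windows.var(axis=1) < dead_tol)
    return stats

print(qualify(np.random.standard_normal(100_000)))
```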

  12. Pulse-Flow Microencapsulation System

    NASA Technical Reports Server (NTRS)

    Morrison, Dennis R.

    2006-01-01

    The pulse-flow microencapsulation system (PFMS) is an automated system that continuously produces a stream of liquid-filled microcapsules for delivery of therapeutic agents to target tissues. Prior microencapsulation systems have relied on batch processes that involve transfer of batches between different apparatuses for different stages of production followed by sampling for acquisition of quality-control data, including measurements of size. In contrast, the PFMS is a single, microprocessor-controlled system that performs all processing steps, including acquisition of quality-control data. The quality-control data can be used as real-time feedback to ensure the production of large quantities of uniform microcapsules.

  13. Neutron Tomography at the Los Alamos Neutron Science Center

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Myers, William Riley

    Neutron imaging is an incredibly powerful tool for non-destructive sample characterization and materials science. Neutron tomography is one technique that results in a three-dimensional model of the sample, representing the interaction of the neutrons with the sample. This relies both on reliable data acquisition and on image processing after acquisition. Over the course of the project, the focus has changed from the former to the latter, culminating in a large-scale reconstruction of a meter-long fossilized skull. The full reconstruction is not yet complete, though tools have been developed to improve the speed and accuracy of the reconstruction. This project helps to improve the capabilities of LANSCE and LANL with regards to imaging large or unwieldy objects.

  14. Boosting sensitivity and suppressing artifacts via multi-acquisition in direct polarization NMR experiments with small flip-angle pulses.

    PubMed

    Fu, Riqiang; Hernández-Maldonado, Arturo J

    2018-05-24

    Direct polarization with a small flip-angle pulse is the simplest method commonly used to quantify various compositions in many materials applications. This method sacrifices the sensitivity per scan in exchange for rapid repetition of data acquisition for signal accumulation. In addition, the resulting spectrum often suffers from artifacts due to background signals from probe components and/or acoustic ringing, leading to a distorted baseline, especially in low-γ nuclei and wideline NMR. In this work, a multi-acquisition scheme is proposed to boost the sensitivity per scan and at the same time effectively suppress these artifacts. Here, an adiabatic inversion pulse is first applied in order to bring the magnetization from the +z to the -z axis, and then a small flip-angle pulse excitation is used before the data acquisition. Right after the first acquisition, the adiabatic inversion pulse is applied again to flip the magnetization back to the +z axis, and the second data acquisition takes place after another small flip-angle pulse excitation. The difference between the two consecutive acquisitions cancels out any artifacts, while the wanted signals are accumulated. This acquisition process can be repeated many times before going into the next scan; therefore, by acquiring the signals multiple times in a single scan, the sensitivity is improved. A mixture sample of flufenamic acid and 3,5-difluorobenzoic acid and a titanium silicate sample have been used to demonstrate the advantages of this newly proposed method. Copyright © 2018 Elsevier Inc. All rights reserved.
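
    The cancellation idea can be illustrated with a purely synthetic numpy sketch: consecutive acquisitions carry the wanted signal with opposite sign while the background is unchanged, so their difference accumulates signal and suppresses the artifact. The decay constants and amplitudes below are arbitrary, not real NMR data.

```python
# Toy illustration of the multi-acquisition difference scheme: the adiabatic
# inversion flips the sign of the wanted signal between consecutive acquisitions
# while the probe background stays the same, so subtracting them cancels the
# artifact and accumulates the signal.
import numpy as np

n_points, n_pairs = 2048, 4
t = np.arange(n_points)
background = 0.5 * np.exp(-t / 300.0)               # acoustic ring / probe background
fid = 0.05 * np.exp(-t / 800.0) * np.cos(0.2 * t)   # wanted signal (small flip angle)

accumulated = np.zeros(n_points)
for _ in range(n_pairs):
    acq_plus = +fid + background + 0.01 * np.random.standard_normal(n_points)
    acq_minus = -fid + background + 0.01 * np.random.standard_normal(n_points)
    accumulated += acq_plus - acq_minus              # background cancels, signal adds
```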

  15. Synchronized and noise-robust audio recordings during realtime magnetic resonance imaging scans.

    PubMed

    Bresch, Erik; Nielsen, Jon; Nayak, Krishna; Narayanan, Shrikanth

    2006-10-01

    This letter describes a data acquisition setup for recording, and processing, running speech from a person in a magnetic resonance imaging (MRI) scanner. The main focus is on ensuring synchronicity between image and audio acquisition, and in obtaining good signal to noise ratio to facilitate further speech analysis and modeling. A field-programmable gate array based hardware design for synchronizing the scanner image acquisition to other external data such as audio is described. The audio setup itself features two fiber optical microphones and a noise-canceling filter. Two noise cancellation methods are described including a novel approach using a pulse sequence specific model of the gradient noise of the MRI scanner. The setup is useful for scientific speech production studies. Sample results of speech and singing data acquired and processed using the proposed method are given.

  16. Synchronized and noise-robust audio recordings during realtime magnetic resonance imaging scans (L)

    PubMed Central

    Bresch, Erik; Nielsen, Jon; Nayak, Krishna; Narayanan, Shrikanth

    2007-01-01

    This letter describes a data acquisition setup for recording, and processing, running speech from a person in a magnetic resonance imaging (MRI) scanner. The main focus is on ensuring synchronicity between image and audio acquisition, and in obtaining good signal to noise ratio to facilitate further speech analysis and modeling. A field-programmable gate array based hardware design for synchronizing the scanner image acquisition to other external data such as audio is described. The audio setup itself features two fiber optical microphones and a noise-canceling filter. Two noise cancellation methods are described including a novel approach using a pulse sequence specific model of the gradient noise of the MRI scanner. The setup is useful for scientific speech production studies. Sample results of speech and singing data acquired and processed using the proposed method are given. PMID:17069275

  17. Sampling optimization for high-speed weigh-in-motion measurements using in-pavement strain-based sensors

    NASA Astrophysics Data System (ADS)

    Zhang, Zhiming; Huang, Ying; Bridgelall, Raj; Palek, Leonard; Strommen, Robert

    2015-06-01

    Weigh-in-motion (WIM) measurement has been widely used for weight enforcement, pavement design, freight management, and intelligent transportation systems to monitor traffic in real time. However, to use such sensors effectively, vehicles must exit the traffic stream and slow down to match the sensors' current capabilities. Hence, agencies need devices with higher vehicle passing speed capabilities to enable continuous weight measurements at mainline speeds. The current practices for data acquisition at such high speeds are fragmented, and deployment configurations and settings depend mainly on the experience of operating engineers. To assure adequate data, most practitioners use very high frequency measurements that result in redundant samples, thereby diminishing the potential for real-time processing. The larger memory requirements of higher sample rates also increase storage and processing costs. The field lacks a sampling design or standard to guide appropriate data acquisition for high-speed WIM measurements. This study develops the appropriate sample rate requirements as a function of the vehicle speed. Simulations and field experiments validate the methods developed. The results will serve as guidelines for future high-speed WIM measurements using in-pavement strain-based sensors.
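
    As a back-of-the-envelope illustration only (the study derives its own requirements), a simple rule links the minimum sampling rate to vehicle speed through the time a tire footprint dwells on the sensor; the footprint length and the number of points per footprint below are assumed values.

```python
# Illustrative rule of thumb, not the paper's derived guideline: to capture at
# least n_pts samples while a tire of contact length L crosses the strain sensor
# at speed v, the sampling rate must scale linearly with v.
def min_sample_rate(speed_mps, contact_length_m=0.25, n_pts=20):
    """Minimum sampling rate in Hz for n_pts samples over the tire footprint."""
    dwell_time = contact_length_m / speed_mps      # time the tire loads the sensor
    return n_pts / dwell_time

for v_kmh in (30, 60, 90, 120):
    v = v_kmh / 3.6
    print(f"{v_kmh:3d} km/h -> >= {min_sample_rate(v):6.0f} Hz")
```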

  18. Image processing tools dedicated to quantification in 3D fluorescence microscopy

    NASA Astrophysics Data System (ADS)

    Dieterlen, A.; De Meyer, A.; Colicchio, B.; Le Calvez, S.; Haeberlé, O.; Jacquey, S.

    2006-05-01

    3-D optical fluorescence microscopy has now become an efficient tool for the volume investigation of living biological samples. Developments in instrumentation have made it possible to beat the conventional Abbe limit. In any case, the recorded image can be described by the convolution equation between the original object and the point spread function (PSF) of the acquisition system. Because of the finite resolution of the instrument, the original object is recorded with distortions and blurring, and contaminated by noise, so relevant biological information cannot be extracted directly from the raw data stacks. If the goal is 3-D quantitative analysis, system characterization is mandatory in order to assess the optimal performance of the instrument and to ensure reproducibility of the data acquisition. The PSF represents the properties of the image acquisition system; we have proposed the use of statistical tools and Zernike moments to describe a 3-D system PSF and to quantify its variation. This first step toward standardization helps define an acquisition protocol that optimizes the exploitation of the microscope for the biological sample under study. Before geometrical information is extracted and/or intensities are quantified, data restoration is mandatory. Reduction of out-of-focus light is carried out computationally by a deconvolution process, but other phenomena occur during acquisition, such as fluorescence photodegradation ("bleaching"), which alter the information needed for restoration. We have therefore developed a protocol to pre-process the data before applying deconvolution algorithms. A large number of deconvolution methods have been described and are now available in commercial packages. One major difficulty in using this software is the user's choice of the "best" regularization parameters. We have pointed out that automating the choice of the regularization level greatly improves the reliability of the measurements while also facilitating use. Furthermore, pre-filtering the images improves the stability of the deconvolution process and thereby increases the quality and repeatability of quantitative measurements; in the same way, pre-filtering the PSF stabilizes the deconvolution. We have shown that Zernike polynomials can be used to reconstruct the experimental PSF, preserving the system characteristics and removing the noise contained in the PSF.
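
    A one-dimensional numpy sketch of the imaging model and of a Wiener-type regularized inversion, included only to make the convolution equation concrete; real 3-D restoration uses the measured PSF and the dedicated deconvolution packages discussed above, and the regularization level k is an arbitrary choice.

```python
# Minimal model: recorded = object (*) PSF + noise, followed by a Wiener-type
# inverse filter. 1-D and purely synthetic to keep the example short.
import numpy as np

rng = np.random.default_rng(2)
n = 512
x = np.zeros(n); x[200:210] = 1.0; x[300] = 2.0              # synthetic "object"
psf = np.exp(-0.5 * ((np.arange(n) - n // 2) / 4.0) ** 2)    # Gaussian PSF (assumed)
psf /= psf.sum()

H = np.fft.fft(np.fft.ifftshift(psf))                        # transfer function
recorded = np.fft.ifft(np.fft.fft(x) * H).real + 0.01 * rng.standard_normal(n)

k = 1e-3                                                     # regularization level (assumed)
wiener = np.conj(H) / (np.abs(H) ** 2 + k)                   # Wiener-type filter
restored = np.fft.ifft(np.fft.fft(recorded) * wiener).real
```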

  19. SignalPlant: an open signal processing software platform.

    PubMed

    Plesinger, F; Jurco, J; Halamek, J; Jurak, P

    2016-07-01

    The growing technical standard of acquisition systems allows the acquisition of large records, often reaching gigabytes or more in size, as is the case with whole-day electroencephalograph (EEG) recordings, for example. Although current 64-bit software for signal processing is able to process (e.g. filter, analyze, etc.) such data, visual inspection and labeling will probably suffer from rather long latency during the rendering of large portions of recorded signals. For this reason, we have developed SignalPlant, a stand-alone application for signal inspection, labeling and processing. The main motivation was to supply investigators with a tool allowing fast and interactive work with large multichannel records produced by EEG, electrocardiograph and similar devices. The rendering latency was compared with EEGLAB and proved significantly faster when displaying an image from a large number of samples (e.g. 163 times faster for 75 × 10^6 samples). The presented SignalPlant software is available free and does not depend on any other computation software. Furthermore, it can be extended with plugins by third parties, ensuring its adaptability to future research tasks and new data formats.

  20. Fast-Acquisition/Weak-Signal-Tracking GPS Receiver for HEO

    NASA Technical Reports Server (NTRS)

    Wintemitz, Luke; Boegner, Greg; Sirotzky, Steve

    2004-01-01

    A report discusses the technical background and design of the Navigator Global Positioning System (GPS) receiver, a radiation-hardened receiver intended for use aboard spacecraft. Navigator is capable of weak-signal acquisition and tracking, as well as much faster acquisition of strong or weak signals with no a priori knowledge or external aiding. Weak-signal acquisition and tracking enable GPS use in high Earth orbits (HEO), and fast acquisition allows the receiver to remain without power until needed in any orbit. Signal acquisition and signal tracking are, respectively, the processes of finding and demodulating a signal. Acquisition is the more computationally difficult process. Previous GPS receivers employ the method of sequentially searching the two-dimensional signal parameter space (code phase and Doppler). Navigator exploits properties of the Fourier transform in a massively parallel search for the GPS signal. This method results in far faster acquisition times (in the lab, 12 GPS satellites have been acquired with no a priori knowledge in a low-Earth-orbit (LEO) scenario in less than one second). Modeling has shown that Navigator will be capable of acquiring signals down to 25 dB-Hz, appropriate for HEO missions. Navigator is built using the radiation-hardened ColdFire microprocessor, with the most computationally intense functions housed in dedicated field-programmable gate arrays. The high performance of the algorithm and of the receiver as a whole are made possible by optimizing computational efficiency and carefully weighing tradeoffs among the sampling rate, data format, and data-path bit width.
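
    The FFT-based parallel search can be sketched in a few lines of numpy: for each Doppler bin, one pair of FFTs evaluates the correlation over every code phase at once. The sampling rate, the random stand-in for the PRN code and the noise level are assumptions for illustration, not Navigator's actual parameters.

```python
# FFT-based parallel code-phase search: circular correlation over all code
# phases via FFTs, repeated over a grid of Doppler bins. Synthetic signal.
import numpy as np

fs = 4.096e6                          # sampling rate (assumed)
n = 4096                              # samples per code period (assumed)
rng = np.random.default_rng(3)
code = rng.choice([-1.0, 1.0], n)     # stand-in for a resampled PRN code replica

true_shift, true_dopp = 1234, 2500.0
t = np.arange(n) / fs
signal = np.roll(code, true_shift) * np.exp(2j * np.pi * true_dopp * t)
signal += 0.5 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))

code_fft_conj = np.conj(np.fft.fft(code))
best = (0.0, None, None)
for dopp in np.arange(-5000, 5001, 500):                 # Doppler search bins
    wiped = signal * np.exp(-2j * np.pi * dopp * t)      # carrier wipe-off
    corr = np.abs(np.fft.ifft(np.fft.fft(wiped) * code_fft_conj))
    peak = corr.max()
    if peak > best[0]:
        best = (peak, int(corr.argmax()), dopp)

print("code phase, Doppler:", best[1], best[2])          # ~1234 samples, ~2500 Hz
```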

  1. Compressed sensing system considerations for ECG and EMG wireless biosensors.

    PubMed

    Dixon, Anna M R; Allstot, Emily G; Gangopadhyay, Daibashish; Allstot, David J

    2012-04-01

    Compressed sensing (CS) is an emerging signal processing paradigm that enables sub-Nyquist processing of sparse signals such as electrocardiogram (ECG) and electromyogram (EMG) biosignals. Consequently, it can be applied to biosignal acquisition systems to reduce the data rate to realize ultra-low-power performance. CS is compared to conventional and adaptive sampling techniques and several system-level design considerations are presented for CS acquisition systems including sparsity and compression limits, thresholding techniques, encoder bit-precision requirements, and signal recovery algorithms. Simulation studies show that compression factors greater than 16X are achievable for ECG and EMG signals with signal-to-quantization noise ratios greater than 60 dB.

  2. Development of automation software for neutron activation analysis process in Malaysian nuclear agency

    NASA Astrophysics Data System (ADS)

    Yussup, N.; Rahman, N. A. A.; Ibrahim, M. M.; Mokhtar, M.; Salim, N. A. A.; Soh@Shaari, S. C.; Azman, A.

    2017-01-01

    The Neutron Activation Analysis (NAA) process has been established at the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s. Most of the established procedures, especially from sample registration to sample analysis, are performed manually. These manual procedures carried out by the NAA laboratory personnel are time-consuming and inefficient. Hence, software to support system automation was developed to provide an effective method that replaces redundant manual data entries and produces a faster sample analysis and calculation process. This paper describes the design and development of automation software for the NAA process, which consists of three sub-programs: sample registration; hardware control and data acquisition; and sample analysis. The data flow and connections between the sub-programs are explained. The software is developed using the National Instruments LabVIEW development package.

  3. High-throughput hyperpolarized 13C metabolic investigations using a multi-channel acquisition system

    NASA Astrophysics Data System (ADS)

    Lee, Jaehyuk; Ramirez, Marc S.; Walker, Christopher M.; Chen, Yunyun; Yi, Stacey; Sandulache, Vlad C.; Lai, Stephen Y.; Bankson, James A.

    2015-11-01

    Magnetic resonance imaging and spectroscopy of hyperpolarized (HP) compounds such as [1-13C]-pyruvate have shown tremendous potential for offering new insight into disease and response to therapy. New applications of this technology in clinical research and care will require extensive validation in cells and animal models, a process that may be limited by the high cost and modest throughput associated with dynamic nuclear polarization. Relatively wide spectral separation between [1-13C]-pyruvate and its chemical endpoints in vivo are conducive to simultaneous multi-sample measurements, even in the presence of a suboptimal global shim. Multi-channel acquisitions could conserve costs and accelerate experiments by allowing acquisition from multiple independent samples following a single dissolution. Unfortunately, many existing preclinical MRI systems are equipped with only a single channel for broadband acquisitions. In this work, we examine the feasibility of this concept using a broadband multi-channel digital receiver extension and detector arrays that allow concurrent measurement of dynamic spectroscopic data from ex vivo enzyme phantoms, in vitro anaplastic thyroid carcinoma cells, and in vivo in tumor-bearing mice. Throughput and the cost of consumables were improved by up to a factor of four. These preliminary results demonstrate the potential for efficient multi-sample studies employing hyperpolarized agents.

  4. ATS-6 - Preliminary results from the 13/18-GHz COMSAT Propagation Experiment

    NASA Technical Reports Server (NTRS)

    Hyde, G.

    1975-01-01

    The 13/18-GHz COMSAT Propagation Experiment (CPE) is reviewed, the data acquisition and processing are discussed, and samples of preliminary results are presented. The need for measurements of both hydrometeor-induced attenuation statistics and diversity effectiveness is brought out. The facilitation of the experiment - CPE dual frequency and diversity site location, the CPE ground transmit terminals, the CPE transponder on Applications Technology Satellite-6 (ATS-6), and the CPE receive and data acquisition system - is briefly examined. The on-line preprocessing of the received signal is reviewed, followed by a discussion of the off-line processing of this database to remove signal fluctuations not due to hydrometeors. Finally, samples of the results of first-level analysis of the resultant data for the 18-GHz diversity site near Boston, Mass., and for the dual frequency 13/18-GHz site near Detroit, Mich., are presented and discussed.

  5. Acquisition of background and technical information and class trip planning

    NASA Technical Reports Server (NTRS)

    Mackinnon, R. M.; Wake, W. H.

    1981-01-01

    Instructors who are very familiar with a study area, as well as those who are not, find the field trip information acquisition and planning process faster and more effective when it is organized in stages. The stages follow a deductive progression: from the associated context region, to the study area, to the specific sample window sites, and from generalized background information on the study region to specific technical data on the environmental and human-use systems to be interpreted at each site. On the class trip and in the follow-up laboratory, the learning/interpretive processes are at first deductive, applying previously learned information and skills to analysis of the study site, then inductive, reading and interpreting the landscape, imagery, and maps of the site, correlating them with information from other sample sites, and building valid generalizations about the larger study area, its context region, and other (similar and/or contrasting) regions.

  6. Low frequency noise elimination technique for 24-bit Σ-Δ data acquisition systems.

    PubMed

    Qu, Shao-Bo; Robert, Olivier; Lognonné, Philippe; Zhou, Ze-Bing; Yang, Shan-Qing

    2015-03-01

    Low-frequency 1/f noise is one of the key limiting factors of high-precision measurement instruments. In this paper, digital correlated double sampling is implemented to reduce the offset and low-frequency 1/f noise of a data acquisition system with a 24-bit sigma-delta (Σ-Δ) analog-to-digital converter (ADC). The input voltage is modulated by cross-coupled switches, which are synchronized to the sampling clock, and converted into a digital signal by the ADC. By using a proper switching frequency, unwanted parasitic signal frequencies generated by the switches are avoided. The noise elimination is performed using the principle of digital correlated double sampling, which is equivalent to a time-shifted subtraction of the sampled voltage. The low-frequency 1/f noise spectral density of the data acquisition system is reduced to a flat level down to the lower measurement frequency limit, which is about 0.0001 Hz in this paper. The noise spectral density is reduced by more than 60 dB at 0.0001 Hz, with a residual noise floor of (9 ± 2) nV/Hz^(1/2), which is limited by the intrinsic white noise floor of the ADC above its corner frequency.
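
    A toy numpy demonstration of the time-shifted subtraction, with arbitrary voltages and drift: the chopped (sign-modulated) input is recovered while the offset and slow drift cancel.

```python
# Digital correlated double sampling in miniature: the input is sign-modulated
# by the cross-coupled switches, digitized, then demodulated by subtracting
# consecutive samples, which cancels offset and slow 1/f-like drift.
import numpy as np

fs, n = 1000.0, 20000
t = np.arange(n) / fs
v_in = 1.0e-6                                        # wanted DC input voltage (assumed)
drift = 5e-6 * np.sin(2 * np.pi * 0.01 * t)          # slow drift stand-in
offset = 2e-6

chop = np.where(np.arange(n) % 2 == 0, 1.0, -1.0)    # switch state toggles each sample
adc = chop * v_in + drift + offset + 1e-8 * np.random.standard_normal(n)

# time-shifted subtraction: (even sample - following odd sample) / 2
demod = (adc[0:-1:2] - adc[1::2]) / 2.0
print(demod.mean())                                   # ~1e-6 V, offset and drift removed
```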

  7. System and method for controlling depth of imaging in tissues using fluorescence microscopy under ultraviolet excitation following staining with fluorescing agents

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Demos, Stavros; Levenson, Richard

    The present disclosure relates to a method for analyzing tissue specimens. In one implementation the method involves obtaining a tissue sample and exposing the sample to one or more fluorophores as contrast agents to enhance contrast of subcellular compartments of the tissue sample. The tissue sample is illuminated by an ultraviolet (UV) light having a wavelength between about 200 nm to about 400 nm, with the wavelength being selected to result in penetration to only a specified depth below a surface of the tissue sample. Inter-image operations between images acquired under different imaging parameters allow for improvement of the image quality via removal of unwanted image components. A microscope may be used to image the tissue sample and provide the image to an image acquisition system that makes use of a camera. The image acquisition system may create a corresponding image that is transmitted to a display system for processing and display.

  8. Dual-view plane illumination microscopy for rapid and spatially isotropic imaging

    PubMed Central

    Kumar, Abhishek; Wu, Yicong; Christensen, Ryan; Chandris, Panagiotis; Gandler, William; McCreedy, Evan; Bokinsky, Alexandra; Colón-Ramos, Daniel A; Bao, Zhirong; McAuliffe, Matthew; Rondeau, Gary; Shroff, Hari

    2015-01-01

    We describe the construction and use of a compact dual-view inverted selective plane illumination microscope (diSPIM) for time-lapse volumetric (4D) imaging of living samples at subcellular resolution. Our protocol enables a biologist with some prior microscopy experience to assemble a diSPIM from commercially available parts, to align optics and test system performance, to prepare samples, and to control hardware and data processing with our software. Unlike existing light sheet microscopy protocols, our method does not require the sample to be embedded in agarose; instead, samples are prepared conventionally on glass coverslips. Tissue culture cells and Caenorhabditis elegans embryos are used as examples in this protocol; successful implementation of the protocol results in isotropic resolution and acquisition speeds up to several volumes per s on these samples. Assembling and verifying diSPIM performance takes ~6 d, sample preparation and data acquisition take up to 5 d and postprocessing takes 3–8 h, depending on the size of the data. PMID:25299154

  9. In-flight edge response measurements for high-spatial-resolution remote sensing systems

    NASA Astrophysics Data System (ADS)

    Blonski, Slawomir; Pagnutti, Mary A.; Ryan, Robert; Zanoni, Vickie

    2002-09-01

    In-flight measurements of spatial resolution were conducted as part of the NASA Scientific Data Purchase Verification and Validation process. Characterization included remote sensing image products with ground sample distance of 1 meter or less, such as those acquired with the panchromatic imager onboard the IKONOS satellite and the airborne ADAR System 5500 multispectral instrument. Final image products were used to evaluate the effects of both the image acquisition system and image post-processing. Spatial resolution was characterized by full width at half maximum of an edge-response-derived line spread function. The edge responses were analyzed using the tilted-edge technique that overcomes the spatial sampling limitations of the digital imaging systems. As an enhancement to existing algorithms, the slope of the edge response and the orientation of the edge target were determined by a single computational process. Adjacent black and white square panels, either painted on a flat surface or deployed as tarps, formed the ground-based edge targets used in the tests. Orientation of the deployable tarps was optimized beforehand, based on simulations of the imaging system. The effects of such factors as acquisition geometry, temporal variability, Modulation Transfer Function compensation, and ground sample distance on spatial resolution were investigated.
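
    A short numpy/scipy sketch of the edge-response analysis chain (synthetic edge, numerical derivative, FWHM by interpolation); it illustrates the general method rather than the exact tilted-edge re-projection used in the study, and the sample spacing and edge width are assumed values.

```python
# Differentiate an (oversampled) edge spread function to obtain the line spread
# function, then estimate its FWHM at the half-maximum crossings.
import numpy as np
from scipy.special import erf

dx = 0.1                                              # sample spacing after re-projection (assumed)
x = np.arange(-20, 20, dx)
sigma_true = 1.2
esf = 0.5 * (1.0 + erf(x / (sigma_true * np.sqrt(2))))   # synthetic edge response

lsf = np.gradient(esf, dx)                            # line spread function
lsf /= lsf.max()

# FWHM from the half-maximum crossings, with linear interpolation
above = np.flatnonzero(lsf >= 0.5)
i0, i1 = above[0], above[-1]
left = x[i0 - 1] + dx * (0.5 - lsf[i0 - 1]) / (lsf[i0] - lsf[i0 - 1])
right = x[i1] + dx * (lsf[i1] - 0.5) / (lsf[i1] - lsf[i1 + 1])
fwhm = right - left
print(fwhm, 2.3548 * sigma_true)                      # should agree closely
```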

  10. Ultrasonic acoustic levitation for fast frame rate X-ray protein crystallography at room temperature.

    PubMed

    Tsujino, Soichiro; Tomizaki, Takashi

    2016-05-06

    Increasing the data acquisition rate of X-ray diffraction images for macromolecular crystals at room temperature at synchrotrons has the potential to significantly accelerate both structural analysis of biomolecules and structure-based drug developments. Using lysozyme model crystals, we demonstrated the rapid acquisition of X-ray diffraction datasets by combining a high frame rate pixel array detector with ultrasonic acoustic levitation of protein crystals in liquid droplets. The rapid spinning of the crystal within a levitating droplet ensured an efficient sampling of the reciprocal space. The datasets were processed with a program suite developed for serial femtosecond crystallography (SFX). The structure, which was solved by molecular replacement, was found to be identical to the structure obtained by the conventional oscillation method for up to a 1.8-Å resolution limit. In particular, the absence of protein crystal damage resulting from the acoustic levitation was carefully established. These results represent a key step towards a fully automated sample handling and measurement pipeline, which has promising prospects for a high acquisition rate and high sample efficiency for room temperature X-ray crystallography.

  11. Ultrasonic acoustic levitation for fast frame rate X-ray protein crystallography at room temperature

    NASA Astrophysics Data System (ADS)

    Tsujino, Soichiro; Tomizaki, Takashi

    2016-05-01

    Increasing the data acquisition rate of X-ray diffraction images for macromolecular crystals at room temperature at synchrotrons has the potential to significantly accelerate both structural analysis of biomolecules and structure-based drug developments. Using lysozyme model crystals, we demonstrated the rapid acquisition of X-ray diffraction datasets by combining a high frame rate pixel array detector with ultrasonic acoustic levitation of protein crystals in liquid droplets. The rapid spinning of the crystal within a levitating droplet ensured an efficient sampling of the reciprocal space. The datasets were processed with a program suite developed for serial femtosecond crystallography (SFX). The structure, which was solved by molecular replacement, was found to be identical to the structure obtained by the conventional oscillation method for up to a 1.8-Å resolution limit. In particular, the absence of protein crystal damage resulting from the acoustic levitation was carefully established. These results represent a key step towards a fully automated sample handling and measurement pipeline, which has promising prospects for a high acquisition rate and high sample efficiency for room temperature X-ray crystallography.

  12. Ultrasonic acoustic levitation for fast frame rate X-ray protein crystallography at room temperature

    PubMed Central

    Tsujino, Soichiro; Tomizaki, Takashi

    2016-01-01

    Increasing the data acquisition rate of X-ray diffraction images for macromolecular crystals at room temperature at synchrotrons has the potential to significantly accelerate both structural analysis of biomolecules and structure-based drug developments. Using lysozyme model crystals, we demonstrated the rapid acquisition of X-ray diffraction datasets by combining a high frame rate pixel array detector with ultrasonic acoustic levitation of protein crystals in liquid droplets. The rapid spinning of the crystal within a levitating droplet ensured an efficient sampling of the reciprocal space. The datasets were processed with a program suite developed for serial femtosecond crystallography (SFX). The structure, which was solved by molecular replacement, was found to be identical to the structure obtained by the conventional oscillation method for up to a 1.8-Å resolution limit. In particular, the absence of protein crystal damage resulting from the acoustic levitation was carefully established. These results represent a key step towards a fully automated sample handling and measurement pipeline, which has promising prospects for a high acquisition rate and high sample efficiency for room temperature X-ray crystallography. PMID:27150272

  13. The Progress of CDAS

    NASA Technical Reports Server (NTRS)

    Zhu, Renjie; Zhang, Xiuzhong; Wei, Wenren; Xiang, Ying; Li, Bin; Wu, Yajun; Shu, Fengchun; Luo, Jintao; Wang, Jinqing; Xue, Zhuhe; hide

    2010-01-01

    The Chinese Data Acquisition System (CDAS), based on FPGA techniques, has been developed in China for the purpose of replacing the traditional analog baseband converter. CDAS is a high-speed data acquisition and processing system with a 1024 Msps sample rate for 512 MHz bandwidth input and up to 16 channels (both USB and LSB) of output compatible with the VSI interface. The instrument is a flexible environment which can be updated easily. In this paper, the construction, performance, experimental results, and future plans of CDAS are reported.

  14. Crossed hot-wire data acquisition and reduction system

    NASA Technical Reports Server (NTRS)

    Westphal, R. V.; Mehta, R. D.

    1984-01-01

    This report describes a system for rapid computerized calibration, acquisition, and processing of data from a crossed hot-wire anemometer. Advantages of the system are its speed, minimal use of analog electronics, and improved accuracy of the resulting data. The data reduction provides two components of mean velocity and turbulence statistics up to third order. Details of the hardware, calibration procedures, response equations, software, and sample results from measurements in a turbulent plane mixing layer are presented.

  15. [Design and implementation of mobile terminal data acquisition for Chinese materia medica resources survey].

    PubMed

    Qi, Yuan-Hua; Wang, Hui; Zhang, Xiao-Bo; Jin, Yan; Ge, Xiao-Guang; Jing, Zhi-Xian; Wang, Ling; Zhao, Yu-Ping; Guo, Lan-Ping; Huang, Lu-Qi

    2017-11-01

    In this paper, a mobile-terminal data acquisition system combining GPS, offset correction, automatic speech recognition, and database networking technology was designed and implemented. It can quickly locate latitude and elevation information and conveniently capture various types of photos of Chinese herbal plants, samples, habitats, and so on. The mobile system automatically associates records with Chinese medicine resource information; through the voice recognition function it records plant characteristics and environmental characteristics, as well as the relevant plant specimen information. The data processing platform, based on the Chinese medicine resources survey data reporting client, effectively assists in indoor data processing and exports the mobile terminal data to the computer terminal. The established data acquisition system provides strong technical support for the fourth national survey of the Chinese materia medica resources (CMMR). Copyright© by the Chinese Pharmaceutical Association.

  16. Process observation in fiber laser-based selective laser melting

    NASA Astrophysics Data System (ADS)

    Thombansen, Ulrich; Gatej, Alexander; Pereira, Milton

    2015-01-01

    Process observation in selective laser melting (SLM) focuses on observing the interaction point where the powder is processed. To provide process-relevant information, signals have to be acquired that are resolved in both time and space. Especially in high-power SLM, where more than 1 kW of laser power is used, processing speeds of several meters per second are required for high-quality processing results. Therefore, an implementation of a suitable process observation system has to acquire a large amount of spatially resolved data at low sampling speeds, or it has to restrict the acquisition to a predefined area at a high sampling speed. In any case, it is vitally important to synchronously record the laser beam position and the acquired signal. This is a prerequisite that allows the recorded data to become information. Today, most SLM systems employ f-theta lenses to focus the processing laser beam onto the powder bed. This report describes the drawbacks that result for process observation and suggests a variable retro-focus system which solves these issues. The beam quality of fiber lasers delivers the processing laser beam to the powder bed at relevant focus diameters, which is a key prerequisite for this solution to be viable. The optical train we present here couples the processing laser beam and the process observation coaxially, ensuring consistent alignment of the interaction zone and the observed area. With respect to signal processing, we have developed a solution that synchronously acquires signals from a pyrometer and the position of the laser beam by sampling the data with a field programmable gate array. The relevance of the acquired signals has been validated by the scanning of a sample filament. Experiments with grooved samples show a correlation between different powder thicknesses and the acquired signals at relevant processing parameters. This basic work takes a first step toward self-optimization of the manufacturing process in SLM. It enables the addition of cognitive functions to the manufacturing system to the extent that the system could track its own process. The results are based on analyzing and redesigning the optical train, in combination with a real-time signal acquisition system which provides a solution to certain technological barriers.

  17. Real-Time Data Display

    NASA Technical Reports Server (NTRS)

    Pedings, Marc

    2007-01-01

    RT-Display is a MATLAB-based data acquisition environment designed to use a variety of commercial off-the-shelf (COTS) hardware to digitize analog signals to a standard data format usable by other post-acquisition data analysis tools. This software presents the acquired data in real time using a variety of signal-processing algorithms. The acquired data is stored in a standard Operator Interactive Signal Processing Software (OISPS) data-formatted file. RT-Display is primarily configured to use the Agilent VXI (or equivalent) data acquisition boards used in such systems as MIDDAS (Multi-channel Integrated Dynamic Data Acquisition System). The software is generalized and deployable in almost any testing environment, without limitations or proprietary configuration for a specific test program or project. With the Agilent hardware configured and in place, users can start the program and, in one step, immediately begin digitizing multiple channels of data. Once the acquisition is completed, data is converted into a common binary format that also can be translated to specific formats used by external analysis software, such as OISPS and PC-Signal (product of AI Signal Research Inc.). RT-Display at the time of this reporting was certified on Agilent hardware capable of acquisition up to 196,608 samples per second. Data signals are presented to the user on-screen simultaneously for 16 channels. Each channel can be viewed individually, with a maximum capability of 160 signal channels (depending on hardware configuration). Current signal presentations include: time data, fast Fourier transforms (FFT), and power spectral density plots (PSD). Additional processing algorithms can be easily incorporated into this environment.
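
    A minimal Python sketch of the kind of per-channel processing described above (an FFT magnitude spectrum and a Welch power spectral density estimate); the test signal is synthetic, and only the 196,608 samples-per-second rate is taken from the text.

```python
# Windowed FFT spectrum and Welch PSD for one acquired channel.
import numpy as np
from scipy.signal import welch

fs = 196_608                                   # sampling rate quoted in the text
t = np.arange(2 * fs) / fs
x = np.sin(2 * np.pi * 5000 * t) + 0.1 * np.random.standard_normal(t.size)

spectrum = np.abs(np.fft.rfft(x * np.hanning(x.size))) / x.size
freqs = np.fft.rfftfreq(x.size, 1 / fs)

f_psd, psd = welch(x, fs=fs, nperseg=8192)     # power spectral density estimate
```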

  18. On-field measurement trial of 4×128 Gbps PDM-QPSK signals by linear optical sampling

    NASA Astrophysics Data System (ADS)

    Bin Liu; Wu, Zhichao; Fu, Songnian; Feng, Yonghua; Liu, Deming

    2017-02-01

    Linear optical sampling is a promising characterization technique for advanced modulation formats, together with digital signal processing (DSP) and software-synchronized algorithms. We theoretically investigate the acquisition of optical samples when the high-speed signal under test is either periodic or random. In particular, when the profile of the optical sampling pulse is asymmetrical, the repetition frequency of the sampling pulse needs careful adjustment in order to obtain the correct waveform. Then, we demonstrate an on-field measurement trial of commercial four-channel 128 Gbps polarization division multiplexing quadrature phase shift keying (PDM-QPSK) signals with truly random characteristics using self-developed equipment. A passively mode-locked fiber laser (PMFL) with a repetition frequency of 95.984 MHz is used as the optical sampling source, while four balanced photodetectors (BPDs) with 400 MHz bandwidth and a four-channel analog-to-digital converter (ADC) with a 1.25 GS/s sampling rate are used for data acquisition. The performance comparison with a conventional optical modulation analyzer (OMA) verifies that the self-developed equipment has the advantages of low cost, easy implementation, and fast response.

  19. A wireless data acquisition system for acoustic emission testing

    NASA Astrophysics Data System (ADS)

    Zimmerman, A. T.; Lynch, J. P.

    2013-01-01

    As structural health monitoring (SHM) systems have seen increased demand due to lower costs and greater capabilities, wireless technologies have emerged that enable the dense distribution of transducers and the distributed processing of sensor data. In parallel, ultrasonic techniques such as acoustic emission (AE) testing have become increasingly popular in the non-destructive evaluation of materials and structures. These techniques, which involve the analysis of frequency content between 1 kHz and 1 MHz, have proven effective in detecting the onset of cracking and other early-stage failure in active structures such as airplanes in flight. However, these techniques typically involve the use of expensive and bulky monitoring equipment capable of accurately sensing AE signals at sampling rates greater than 1 million samples per second. In this paper, a wireless data acquisition system is presented that is capable of collecting, storing, and processing AE data at rates of up to 20 MHz. Processed results can then be wirelessly transmitted in real-time, creating a system that enables the use of ultrasonic techniques in large-scale SHM systems.

  20. Thomson Scattering Diagnostic Data Acquisition Systems for Modern Fusion Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ivanenko, S.V.; Khilchenko, A.D.; Ovchar, V.K.

    2015-07-01

    A uniquely designed complex data acquisition system for Thomson scattering diagnostics was developed. It allows recording of short-duration (3-5 ns) scattered pulses with a 2 GHz sampling rate and 10-bit total resolution in oscilloscope mode. The system consists of up to 48 photodetector modules with 0-200 MHz bandwidth, 1-48 simultaneously sampling ADC modules, and a synchronization subsystem. The photodetector modules are based on avalanche photodiodes (APD) and ultra-low-noise trans-impedance amplifiers. The ADC modules include fast analog-to-digital converters and digital units based on FPGAs (Field-Programmable Gate Arrays) for data processing and storage. The synchronization subsystem is used to form triggering pulses and to organize the simultaneous sampling mode of ADC module operation. (authors)

  1. Low Computational Signal Acquisition for GNSS Receivers Using a Resampling Strategy and Variable Circular Correlation Time

    PubMed Central

    Zhang, Yeqing; Wang, Meiling; Li, Yafeng

    2018-01-01

    For the objective of essentially decreasing computational complexity and time consumption of signal acquisition, this paper explores a resampling strategy and variable circular correlation time strategy specific to broadband multi-frequency GNSS receivers. In broadband GNSS receivers, the resampling strategy is established to work on conventional acquisition algorithms by resampling the main lobe of received broadband signals with a much lower frequency. Variable circular correlation time is designed to adapt to different signal strength conditions and thereby increase the operation flexibility of GNSS signal acquisition. The acquisition threshold is defined as the ratio of the highest and second highest correlation results in the search space of carrier frequency and code phase. Moreover, computational complexity of signal acquisition is formulated by amounts of multiplication and summation operations in the acquisition process. Comparative experiments and performance analysis are conducted on four sets of real GPS L2C signals with different sampling frequencies. The results indicate that the resampling strategy can effectively decrease computation and time cost by nearly 90–94% with just slight loss of acquisition sensitivity. With circular correlation time varying from 10 ms to 20 ms, the time cost of signal acquisition has increased by about 2.7–5.6% per millisecond, with most satellites acquired successfully. PMID:29495301

  2. Low Computational Signal Acquisition for GNSS Receivers Using a Resampling Strategy and Variable Circular Correlation Time.

    PubMed

    Zhang, Yeqing; Wang, Meiling; Li, Yafeng

    2018-02-24

    For the objective of essentially decreasing computational complexity and time consumption of signal acquisition, this paper explores a resampling strategy and variable circular correlation time strategy specific to broadband multi-frequency GNSS receivers. In broadband GNSS receivers, the resampling strategy is established to work on conventional acquisition algorithms by resampling the main lobe of received broadband signals with a much lower frequency. Variable circular correlation time is designed to adapt to different signal strength conditions and thereby increase the operation flexibility of GNSS signal acquisition. The acquisition threshold is defined as the ratio of the highest and second highest correlation results in the search space of carrier frequency and code phase. Moreover, computational complexity of signal acquisition is formulated by amounts of multiplication and summation operations in the acquisition process. Comparative experiments and performance analysis are conducted on four sets of real GPS L2C signals with different sampling frequencies. The results indicate that the resampling strategy can effectively decrease computation and time cost by nearly 90-94% with just slight loss of acquisition sensitivity. With circular correlation time varying from 10 ms to 20 ms, the time cost of signal acquisition has increased by about 2.7-5.6% per millisecond, with most satellites acquired successfully.

  3. Kerr hysteresis loop tracer with alternate driving magnetic field up to 10 kHz

    NASA Astrophysics Data System (ADS)

    Callegaro, Luca; Fiorini, Carlo; Triggiani, Giacomo; Puppin, Ezio

    1997-07-01

    A magneto-optical Kerr loop tracer for hysteresis loop measurements in thin films with field excitation frequency f0 from 10 mHz to 10 kHz is described. A very high sensitivity is obtained by using an ultrabright light-emitting diode as a low-noise light source and a novel acquisition process. The field is generated with a coil driven by an audio amplifier connected to a free-running oscillator. The conditioned detector output constitutes the magnetization signal (M); the magnetic field (H) is measured with a fast Hall probe. The acquisition electronics are based on a set of sample-and-hold amplifiers which allow the simultaneous sampling of M, H, and dH/dt. Acquisition is driven by a personal computer equipped with a multifunction I/O board. Test results on a 120 nm Fe film on Si substrate are shown. The coercive field of the film increases with frequency and nearly doubles at 10 kHz with respect to dc.
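    Given simultaneously sampled M and H arrays such as this instrument produces, the frequency-dependent coercive field can be estimated from the zero crossings of M. The sketch below is a minimal post-processing illustration, not part of the instrument's acquisition chain.

```python
import numpy as np

def coercive_field(H, M):
    """Estimate the coercive field Hc from one hysteresis loop: the mean
    magnitude of H at the zero crossings of M on the rising and falling
    branches. Assumes H and M were sampled simultaneously, as in the
    sample-and-hold scheme described above."""
    H = np.asarray(H, dtype=float)
    M = np.asarray(M, dtype=float)
    crossings = np.where(np.diff(np.signbit(M)))[0]   # indices where M changes sign
    hc = []
    for i in crossings:
        # linear interpolation of H at the exact M = 0 crossing
        frac = -M[i] / (M[i + 1] - M[i])
        hc.append(abs(H[i] + frac * (H[i + 1] - H[i])))
    return float(np.mean(hc)) if hc else np.nan
```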

  4. Data processing for water monitoring system

    NASA Technical Reports Server (NTRS)

    Monford, L.; Linton, A. T.

    1978-01-01

    Water monitoring data acquisition system is structured about central computer that controls sampling and sensor operation, and analyzes and displays data in real time. Unit is essentially separated into two systems: computer system, and hard wire backup system which may function separately or with computer.

  5. Evaluation of optimized b-value sampling schemas for diffusion kurtosis imaging with an application to stroke patient data

    PubMed Central

    Yan, Xu; Zhou, Minxiong; Ying, Lingfang; Yin, Dazhi; Fan, Mingxia; Yang, Guang; Zhou, Yongdi; Song, Fan; Xu, Dongrong

    2013-01-01

    Diffusion kurtosis imaging (DKI) is a new method of magnetic resonance imaging (MRI) that provides non-Gaussian information that is not available in conventional diffusion tensor imaging (DTI). DKI requires data acquisition at multiple b-values for parameter estimation; this process is usually time-consuming. Therefore, fewer b-values are preferable to expedite acquisition. In this study, we carefully evaluated various acquisition schemas using different numbers and combinations of b-values. Acquisition schemas whose sampled b-values were distributed toward the two ends of the range proved optimal. Compared to conventional schemas using equally spaced b-values (ESB), optimized schemas require fewer b-values to minimize fitting errors in parameter estimation and may thus significantly reduce scanning time. From the ranked list of optimized schemas resulting from the evaluation, we recommend the 3b schema based on its estimation accuracy and time efficiency; it needs data from only 3 b-values, at 0, around 800, and around 2600 s/mm2. Analyses using voxel-based analysis (VBA) and region-of-interest (ROI) analysis with human DKI datasets support the use of the optimized 3b (0, 1000, 2500 s/mm2) DKI schema in practical clinical applications. PMID:23735303
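    With a 3b schema, the per-voxel fit reduces to a closed-form solution of ln(S/S0) = -bD + b²D²K/6 at the two nonzero b-values. The sketch below uses b-values quoted in the abstract; the function, units handling, and synthetic example are illustrative assumptions.

```python
import numpy as np

def fit_dki_3b(S0, S1, S2, b1=1000.0, b2=2500.0):
    """Closed-form DKI fit from a 3b schema (b = 0, b1, b2 in s/mm^2),
    using ln(S/S0) = -b*D + (b^2 * D^2 * K) / 6 for a single voxel and
    direction. A sketch, not the evaluation pipeline of the paper."""
    y = np.log(np.array([S1, S2]) / S0)
    A = np.array([[-b1, b1**2 / 6.0],
                  [-b2, b2**2 / 6.0]])
    x = np.linalg.solve(A, y)          # x = [D, D^2 * K]
    D = x[0]
    K = x[1] / D**2
    return D, K

# synthetic voxel with D = 1e-3 mm^2/s and K = 1.0
D_true, K_true = 1e-3, 1.0
S = lambda b: np.exp(-b * D_true + (b**2 * D_true**2 * K_true) / 6.0)
print(fit_dki_3b(1.0, S(1000.0), S(2500.0)))   # ~ (1e-3, 1.0)
```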

  6. A sub-sampled approach to extremely low-dose STEM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stevens, A.; Luzi, L.; Yang, H.

    The inpainting of randomly sub-sampled images acquired by scanning transmission electron microscopy (STEM) is an attractive method for imaging under low-dose conditions (≤ 1 e⁻/Å²) without changing either the operation of the microscope or the physics of the imaging process. We show that (1) adaptive sub-sampling increases acquisition speed, resolution, and sensitivity; and (2) random (non-adaptive) sub-sampling is equivalent to, but faster than, traditional low-dose techniques. Adaptive sub-sampling opens numerous possibilities for the analysis of beam-sensitive materials and in-situ dynamic processes at the resolution limit of the aberration-corrected microscope, and is demonstrated here for the analysis of the node distribution in metal-organic frameworks (MOFs).

  7. Design of area array CCD image acquisition and display system based on FPGA

    NASA Astrophysics Data System (ADS)

    Li, Lei; Zhang, Ning; Li, Tianting; Pan, Yue; Dai, Yuming

    2014-09-01

    With the development of science and technology, the CCD (charge-coupled device) has been widely applied in various fields and plays an important role in modern sensing systems; therefore, researching a real-time image acquisition and display scheme based on a CCD device has great significance. This paper introduces an image data acquisition and display system for an area array CCD based on an FPGA. Several key technical challenges of the system are analyzed and solutions put forward. The FPGA works as the core processing unit in the system and controls the overall timing sequence. The ICX285AL area array CCD image sensor produced by SONY Corporation is used in the system. The FPGA drives the area array CCD; an analog front end (AFE) then processes the CCD image signal, including amplification, filtering, noise elimination, and correlated double sampling (CDS), and an AD9945 produced by ADI Corporation converts the analog signal to a digital signal. A Camera Link high-speed data transmission circuit was developed, the PC-side image acquisition software was completed, and real-time display of images was realized. Practical testing indicates that the system is stable and reliable in image acquisition and control, and its performance meets the project requirements.

  8. Direct Sequence Spread Spectrum (DSSS) Receiver, User Manual

    DTIC Science & Technology

    2008-01-01

    Sampled data is clocked into correlator data registers and a comparison is made between the code and data register contents, producing a correlation ... symbol (equal to the processing gain Gp) but need not be otherwise synchronised with the spreading codes. This allows a very long and noise-like PRBS ... and Q channels are independently but synchronously sampled. [Block diagram: complex/real ADC, FIR filter, interpolator, acquisition correlators]

  9. Quick acquisition and recognition method for the beacon in deep space optical communications.

    PubMed

    Wang, Qiang; Liu, Yuefei; Ma, Jing; Tan, Liying; Yu, Siyuan; Li, Changjiang

    2016-12-01

    In deep space optical communications, it is very difficult to acquire the beacon given the long communication distance. Acquisition efficiency is essential for establishing and holding the optical communication link. Here we propose a quick acquisition and recognition method for the beacon in deep space optical communications based on the characteristics of the deep space optical link. To identify the beacon against the background light efficiently, we utilized the maximum similarity between the collected image and the reference image for accurate recognition and acquisition of the beacon in the area of uncertainty. First, the collected image and the reference image were processed by a Fourier-Mellin transform. Second, image sampling and image matching were applied for accurate positioning of the beacon. Finally, a field programmable gate array (FPGA)-based system was used to verify and implement this method. The experimental results showed that the acquisition time for the beacon was as short as 8.1 s. Future application of this method in the system design of deep space optical communication will be beneficial.
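    The maximum-similarity search can be illustrated with a plain normalized cross-correlation between the collected frame and the reference template. This sketch covers only that one step; the Fourier-Mellin pre-alignment and the FPGA implementation are not reproduced, and the function name is an assumption.

```python
import numpy as np

def normalized_xcorr(image, template):
    """Slide a reference template over the collected image and return the
    position of maximum normalized cross-correlation -- a stand-in for the
    'maximum similarity' beacon search described above."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    best, best_pos = -np.inf, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            patch = image[r:r + th, c:c + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p ** 2).sum()) * t_norm
            score = (p * t).sum() / denom if denom > 0 else 0.0
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos, best
```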

  10. NeuroPG: open source software for optical pattern generation and data acquisition

    PubMed Central

    Avants, Benjamin W.; Murphy, Daniel B.; Dapello, Joel A.; Robinson, Jacob T.

    2015-01-01

    Patterned illumination using a digital micromirror device (DMD) is a powerful tool for optogenetics. Compared to a scanning laser, DMDs are inexpensive and can easily create complex illumination patterns. Combining these complex spatiotemporal illumination patterns with optogenetics allows DMD-equipped microscopes to probe neural circuits by selectively manipulating the activity of many individual cells or many subcellular regions at the same time. To use DMDs to study neural activity, scientists must develop specialized software to coordinate optical stimulation patterns with the acquisition of electrophysiological and fluorescence data. To meet this growing need we have developed an open source optical pattern generation software package for neuroscience, NeuroPG, that combines DMD control, sample visualization, and data acquisition in one application. Built on a MATLAB platform, NeuroPG can also process, analyze, and visualize data. The software is designed specifically for the Mightex Polygon400; however, as an open source package, NeuroPG can be modified to incorporate any data acquisition, imaging, or illumination equipment that is compatible with MATLAB's Data Acquisition and Image Acquisition toolboxes. PMID:25784873

  11. Indoor and Outdoor Mobile Mapping Systems for Architectural Surveys

    NASA Astrophysics Data System (ADS)

    Campi, M.; di Luggo, A.; Monaco, S.; Siconolfi, M.; Palomba, D.

    2018-05-01

    This paper presents the results of architectural surveys carried out with mobile mapping systems. The data acquired through different instruments for both indoor and outdoor surveying are analyzed and compared. The study sample shows what is required for an acquisition in a dynamic mode indicating the criteria for the creation of a georeferenced network for indoor spaces, as well as the operational processes concerning data capture, processing, and management. The differences between a dynamic and static scan have been evaluated, with a comparison being made with the aerial photogrammetric survey of the same sample.

  12. Metabolic profiling of body fluids and multivariate data analysis.

    PubMed

    Trezzi, Jean-Pierre; Jäger, Christian; Galozzi, Sara; Barkovits, Katalin; Marcus, Katrin; Mollenhauer, Brit; Hiller, Karsten

    2017-01-01

    Metabolome analyses of body fluids are challenging due to pre-analytical variations, such as pre-processing delay and temperature, and constant dynamic changes of biochemical processes within the samples. Therefore, proper sample handling, from the time of collection up to the analysis, is crucial to obtain high-quality samples and reproducible results. A metabolomics analysis is divided into 4 main steps: 1) sample collection, 2) metabolite extraction, 3) data acquisition, and 4) data analysis. Here, we describe a protocol for gas chromatography coupled to mass spectrometry (GC-MS) based metabolic analysis of biological matrices, especially body fluids. This protocol can be applied to blood serum/plasma, saliva and cerebrospinal fluid (CSF) samples of humans and other vertebrates. It covers sample collection, sample pre-processing, metabolite extraction, GC-MS measurement and guidelines for the subsequent data analysis. Advantages of this protocol include: robust and reproducible metabolomics results that take into account pre-analytical variations occurring during the sampling process; small required sample volume; rapid and cost-effective processing of biological samples; and logistic regression based determination of biomarker signatures for in-depth data analysis.

  13. Rapid Processing of Turner Designs Model 10-Au-005 Internally Logged Fluorescence Data

    EPA Science Inventory

    Continuous recording of dye fluorescence using field fluorometers at selected sampling sites facilitates acquisition of real-time dye tracing data. The Turner Designs Model 10-AU-005 field fluorometer allows for frequent fluorescence readings, data logging, and easy downloading t...

  14. FORTRAN PROCESSING OF FLUOROMETRIC DATA LOGGED BY A TURNER DESIGNS FIELD FLUOROMETER

    EPA Science Inventory

    Continuous recording of dye fluorescence using field fluorometers at selected sampling sites facilitates acquisition of real-time dye-tracing data. The Turner Designs Model 10-AU-005 Field Fluorometer allows for frequent fluorescence readings, data logging, and easy downloading t...

  15. Analysis of TDLAS test signal strength under different simulated environments

    NASA Astrophysics Data System (ADS)

    Li, Xin; Zhou, Tao; Jia, Xiaodong

    2014-12-01

    A TDLAS system exploits the wavelength tuning characteristics of a laser diode to detect the absorption spectrum of a gas absorption line, and from it the temperature, pressure, flow rate and concentration of the gas in the measurement volume. In this laboratory study, TDLAS gas detection was applied to a simulated engine-combustion environment containing water vapor and smoke. An optical lens system received the signal, and the acquired signals were analyzed for interference. Water vapor and smoke were simulated as two different interfering environments in the sample cell. In both experiments the gas-absorption optical signal was acquired under environmental interference, the variation of the signal amplitude was analyzed, and the related signal data were recorded. The results provide experimental data for signal acquisition under field conditions during the engine combustion process.

  16. Dynamic autofocus for continuous-scanning time-delay-and-integration image acquisition in automated microscopy.

    PubMed

    Bravo-Zanoguera, Miguel E; Laris, Casey A; Nguyen, Lam K; Oliva, Mike; Price, Jeffrey H

    2007-01-01

    Efficient image cytometry of a conventional microscope slide means rapid acquisition and analysis of 20 gigapixels of image data (at 0.3-µm sampling). The voluminous data motivate increased acquisition speed to enable many biomedical applications. Continuous-motion time-delay-and-integrate (TDI) scanning has the potential to speed image acquisition while retaining sensitivity, but the challenge of implementing high-resolution autofocus operating simultaneously with acquisition has limited its adoption. We develop a dynamic autofocus system for this need using: 1. a "volume camera," consisting of nine fiber optic imaging conduits to charge-coupled device (CCD) sensors, that acquires images in parallel from different focal planes, 2. an array of mixed analog-digital processing circuits that measure the high spatial frequencies of the multiple image streams to create focus indices, and 3. a software system that reads and analyzes the focus data streams and calculates best focus for closed feedback loop control. Our system updates autofocus at 56 Hz (or once every 21 µm of stage travel) to collect sharply focused images sampled at 0.3×0.3 µm²/pixel at a stage speed of 2.3 mm/s. The system, tested by focusing in phase contrast and imaging long fluorescence strips, achieves high-performance closed-loop image-content-based autofocus in continuous scanning for the first time.
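    The focus-index idea (measuring high spatial frequencies per focal plane, then computing best focus for the control loop) can be sketched in software; the measures below are generic choices standing in for the article's analog circuits, and assume roughly equal plane spacing.

```python
import numpy as np

def focus_index(image):
    """High-spatial-frequency focus measure (squared second-neighbor
    difference), a simple software analogue of the focus-index circuits."""
    img = np.asarray(image, dtype=float)
    return float(((img[:, 2:] - img[:, :-2]) ** 2).sum())

def best_focus(stack, z_positions):
    """Given images from the nine focal planes of the 'volume camera',
    return an interpolated best-focus z by fitting a parabola to the
    focus indices around the sharpest plane. Illustrative sketch only."""
    scores = np.array([focus_index(im) for im in stack])
    k = int(np.argmax(scores))
    if 0 < k < len(scores) - 1:
        a, b, c = scores[k - 1], scores[k], scores[k + 1]
        offset = 0.5 * (a - c) / (a - 2 * b + c)   # vertex of the parabola
        return z_positions[k] + offset * (z_positions[k + 1] - z_positions[k])
    return z_positions[k]
```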

  17. Design and DSP implementation of star image acquisition and star point fast acquiring and tracking

    NASA Astrophysics Data System (ADS)

    Zhou, Guohui; Wang, Xiaodong; Hao, Zhihang

    2006-02-01

    A star sensor is a special high-accuracy photoelectric sensor, and attitude acquisition time is an important performance index of a star sensor. In this paper, the design target is a dynamic performance of 10 attitude samples per second. On the basis of analyzing the CCD signal timing and star image processing, a new design with a special parallel architecture for improving star image processing is presented. In the design, the operation of moving the data in expanded windows containing stars to the on-chip memory of the DSP is arranged in the invalid period of the CCD frame signal. While the CCD image is being saved to memory, the DSP processes the data already in on-chip memory. This parallelism greatly improves processing efficiency, and the scheme results in enormous savings of the memory normally required. In the scheme, the DSP HOLD mode and CPLD technology are used to create a memory shared between the CCD and the DSP. The efficiency of the processing is evaluated in numerical tests. The five brightest stars are acquired in only 3.5 ms in the star acquisition stage; in 43 µs, the data in five expanded windows containing stars are moved into the internal memory of the DSP, and in 1.6 ms, five star coordinates are obtained in the star tracking stage.
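    Once a window containing a star has been moved into on-chip memory, the star coordinate is typically obtained as an intensity-weighted centroid. The sketch below illustrates that step; the names and the crude background-removal rule are assumptions, not the paper's DSP code.

```python
import numpy as np

def star_centroid(window, origin):
    """Intensity-weighted centroid of a star inside an expanded window.
    'origin' is the window's top-left pixel position in the full frame."""
    w = np.asarray(window, dtype=float)
    w = np.clip(w - w.mean(), 0, None)          # crude background removal
    total = w.sum()
    if total == 0:
        return (origin[0] + (w.shape[0] - 1) / 2.0,
                origin[1] + (w.shape[1] - 1) / 2.0)
    ys, xs = np.indices(w.shape)
    cy = (ys * w).sum() / total + origin[0]
    cx = (xs * w).sum() / total + origin[1]
    return cy, cx
```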

  18. Lunar Processing Cabinet 2.0: Retrofitting Gloveboxes into the 21st Century

    NASA Technical Reports Server (NTRS)

    Calaway, M. J.

    2015-01-01

    In 2014, the Apollo 16 Lunar Processing Glovebox (cabinet 38) in the Lunar Curation Laboratory at NASA JSC received an upgrade including new technology interfaces. A Jacobs - Technology Innovation Project provided the primary resources to retrofit this glovebox into the 21st century. NASA Astromaterials Acquisition & Curation Office continues the over 40 year heritage of preserving lunar materials for future scientific studies in state-of-the-art facilities. This enhancement has not only modernized the contamination controls, but provides new innovative tools for processing and characterizing lunar samples as well as supports real-time exchange of sample images and information with the scientific community throughout the world.

  19. Post-acquisition data processing for the screening of transformation products of different organic contaminants. Two-year monitoring of river water using LC-ESI-QTOF-MS and GCxGC-EI-TOF-MS.

    PubMed

    López, S Herrera; Ulaszewska, M M; Hernando, M D; Martínez Bueno, M J; Gómez, M J; Fernández-Alba, A R

    2014-11-01

    This study describes a comprehensive strategy for detecting and elucidating the chemical structures of expected and unexpected transformation products (TPs) from chemicals found in river water and effluent wastewater samples, using liquid chromatography coupled to electrospray ionization quadrupole-time-of-flight mass spectrometer (LC-ESI-QTOF-MS), with post-acquisition data processing and an automated search using an in-house database. The efficacy of the mass defect filtering (MDF) approach to screen metabolites from common biotransformation pathways was tested, and it was shown to be sufficiently sensitive and applicable for detecting metabolites in environmental samples. Four omeprazole metabolites and two venlafaxine metabolites were identified in river water samples. This paper reports the analytical results obtained during 2 years of monitoring, carried out at eight sampling points along the Henares River (Spain). Multiresidue monitoring, for targeted analysis, includes a group of 122 chemicals, amongst which are pharmaceuticals, personal care products, pesticides and PAHs. For this purpose, two analytical methods were used based on direct injection with a LC-ESI-QTOF-MS system and stir bar sorptive extraction (SBSE) with bi-dimensional gas chromatography coupled with a time-of-flight spectrometer (GCxGC-EI-TOF-MS).

  20. 42 CFR 37.44 - Approval of radiographic facilities that use digital radiography systems.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... image acquisition, digitization, processing, compression, transmission, display, archiving, and... quality digital chest radiographs by submitting to NIOSH digital radiographic image files of a test object... digital radiographic image files from six or more sample chest radiographs that are of acceptable quality...

  1. 42 CFR 37.44 - Approval of radiographic facilities that use digital radiography systems.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... image acquisition, digitization, processing, compression, transmission, display, archiving, and... quality digital chest radiographs by submitting to NIOSH digital radiographic image files of a test object... digital radiographic image files from six or more sample chest radiographs that are of acceptable quality...

  2. Using Fourier transform IR spectroscopy to analyze biological materials

    PubMed Central

    Baker, Matthew J; Trevisan, Júlio; Bassan, Paul; Bhargava, Rohit; Butler, Holly J; Dorling, Konrad M; Fielden, Peter R; Fogarty, Simon W; Fullwood, Nigel J; Heys, Kelly A; Hughes, Caryn; Lasch, Peter; Martin-Hirsch, Pierre L; Obinaju, Blessing; Sockalingum, Ganesh D; Sulé-Suso, Josep; Strong, Rebecca J; Walsh, Michael J; Wood, Bayden R; Gardner, Peter; Martin, Francis L

    2015-01-01

    IR spectroscopy is an excellent method for biological analyses. It enables the nonperturbative, label-free extraction of biochemical information and images toward diagnosis and the assessment of cell functionality. Although not strictly microscopy in the conventional sense, it allows the construction of images of tissue or cell architecture by the passing of spectral data through a variety of computational algorithms. Because such images are constructed from fingerprint spectra, the notion is that they can be an objective reflection of the underlying health status of the analyzed sample. One of the major difficulties in the field has been determining a consensus on spectral pre-processing and data analysis. This manuscript brings together as coauthors some of the leaders in this field to allow the standardization of methods and procedures for adapting a multistage approach to a methodology that can be applied to a variety of cell biological questions or used within a clinical setting for disease screening or diagnosis. We describe a protocol for collecting IR spectra and images from biological samples (e.g., fixed cytology and tissue sections, live cells or biofluids) that assesses the instrumental options available, appropriate sample preparation, different sampling modes as well as important advances in spectral data acquisition. After acquisition, data processing consists of a sequence of steps including quality control, spectral pre-processing, feature extraction and classification of the supervised or unsupervised type. A typical experiment can be completed and analyzed within hours. Example results are presented on the use of IR spectra combined with multivariate data processing. PMID:24992094

  3. Improving GPR Surveys Productivity by Array Technology and Fully Automated Processing

    NASA Astrophysics Data System (ADS)

    Morello, Marco; Ercoli, Emanuele; Mazzucchelli, Paolo; Cottino, Edoardo

    2016-04-01

    The realization of network infrastructures with lower environmental impact and the tendency to use digging technologies that are less invasive in terms of time and space of road occupation and restoration play a key role in the development of communication networks. However, pre-existing buried utilities must be detected and located in the subsurface to exploit the high productivity of modern digging apparatus. According to SUE quality level B+, both the position and depth of subsurface utilities must be accurately estimated, demanding 3D GPR surveys. In fact, the advantages of 3D GPR acquisitions (obtained either by multiple 2D recordings or by an antenna array) versus 2D acquisitions are well known. Nonetheless, the amount of data acquired in such 3D surveys does not usually allow processing and interpretation to be completed directly in the field and in real time, thus limiting the overall efficiency of the GPR acquisition. As an example, the "low impact mini-trench" technique (addressed in ITU - International Telecommunication Union - recommendation L.83) requires that non-destructive mapping of buried services enhance its productivity to match the improvements of new digging equipment. Nowadays, multi-antenna and multi-pass GPR acquisitions demand new processing techniques that can obtain high-quality subsurface images, taking full advantage of 3D data: the development of a fully automated, real-time 3D GPR processing system plays a key role in overall optical network deployment profitability. Furthermore, currently available computing power suggests the feasibility of processing schemes that incorporate better focusing algorithms. A novel processing scheme, whose goal is the automated processing and detection of buried targets and which can be applied in real time to 3D GPR array systems, has been developed and fruitfully tested with two different GPR arrays (16 antennas, 900 MHz central frequency, and 34 antennas, 600 MHz central frequency). The proposed processing scheme takes advantage of 3D data multiplicity through continuous real-time data focusing. Pre-stack reflection angle gathers G(x, θ; v) are computed at nv different velocities (by means of Kirchhoff depth-migration kernels, which can naturally cope with any acquisition pattern and handle irregular sampling issues). The analysis of pre-stack reflection angle gathers plays a key role in automated detection: targets are identified and the best local propagation velocities are recovered through a correlation estimate computed across all nv reflection angle gathers. Indeed, the data redundancy of 3D GPR acquisitions greatly improves the reliability of the proposed automatic detection. The goal of real-time automated processing has been pursued without the need for specific high-performance processing hardware (a simple laptop is sufficient). Moreover, the automation of the entire surveying process makes it possible to obtain high-quality, repeatable results without the need for skilled interpreters. The proposed acquisition procedure has been extensively tested: more than 100 km of acquired data prove the feasibility of the proposed approach.

  4. High Speed PC Based Data Acquisition and Instrumentation for Measurement of Simulated Low Earth Orbit Thermally Induced Disturbances

    NASA Technical Reports Server (NTRS)

    Sills, Joel W., Jr.; Griffin, Thomas J. (Technical Monitor)

    2001-01-01

    The Hubble Space Telescope (HST) Disturbance Verification Test (DVT) was conducted to characterize the responses of the Observatory's new set of rigid solar arrays (SA3) to thermally induced 'creak' or stiction releases. The data acquired in the DVT were used in verification of the HST Pointing Control System on-orbit performance, post-Servicing Mission 3B (SM3B). The test simulated the on-orbit environment on a deployed SA3 flight wing. Instrumentation for this test required pretest simulations in order to select the correct sensitivities. Vacuum-compatible, highly accurate accelerometers and force gages were used for this test. The complexity of the test, as well as a short planning schedule, required a data acquisition system that was easy to configure, highly flexible, and extremely robust. A PC Windows oriented data acquisition system meets these requirements, allowing the test engineers to minimize the time required to plan and perform complex environmental tests. The SA3 DVT provided a direct, practical, and complex demonstration of the versatility that PC based data acquisition systems provide. Two PC based data acquisition systems were assembled to acquire, process, distribute, and provide real-time processing for several types of transducers used in the SA3 DVT. A high sample rate digital tape recorder was used to archive the sensor signals. The two systems provided multi-channel hardware and software architecture and were selected based on the test requirements. How these systems acquire and process multiple data rates from different transducer types is discussed, along with the system hardware and software architecture.

  5. Seeking Positive Experiences Can Produce Illusory Correlations

    ERIC Educational Resources Information Center

    Denrell, Jerker; Le Mens, Gael

    2011-01-01

    Individuals tend to select again alternatives about which they have positive impressions and to avoid alternatives about which they have negative impressions. Here we show how this sequential sampling feature of the information acquisition process leads to the emergence of an illusory correlation between estimates of the attributes of…

  6. Knowledge of Some Derivational Processes in Two Samples of Bilingual Children

    ERIC Educational Resources Information Center

    Marckworth, M. Lois

    1978-01-01

    A report on a study concerning the bilingual child in a monolingual community. It investigates the acquisition of a set of English derivational morphemes by bilingual children and the effect of external factors, such as school, exposure time, age and home, in the children's language experience. (AMH)

  7. Data Acquisition for Modular Biometric Monitoring System

    NASA Technical Reports Server (NTRS)

    Grodsinsky, Carlos M. (Inventor); Chmiel, Alan J. (Inventor); Humphreys, Bradley T. (Inventor)

    2014-01-01

    A modular system for acquiring biometric data includes a plurality of data acquisition modules configured to sample biometric data from at least one respective input channel at a data acquisition rate. A representation of the sampled biometric data is stored in memory of each of the plurality of data acquisition modules. A central control system is in communication with each of the plurality of data acquisition modules through a bus. The central control system is configured to collect data asynchronously, via the bus, from the memory of the plurality of data acquisition modules according to a relative fullness of the memory of the plurality of data acquisition modules.
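    The collection policy described (reading modules according to the relative fullness of their memories) can be sketched as follows; the module interface (fill_fraction, read) is hypothetical and used only for illustration, not the patented design.

```python
def collect_round(modules):
    """One asynchronous collection pass over the bus: read the module whose
    buffer is relatively fullest first, so no module's memory overflows.
    'modules' is any iterable of objects exposing fill_fraction() and read()
    -- hypothetical method names for this sketch."""
    order = sorted(modules, key=lambda m: m.fill_fraction(), reverse=True)
    return [(m, m.read()) for m in order if m.fill_fraction() > 0.0]
```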

  8. Accurate Sample Time Reconstruction of Inertial FIFO Data.

    PubMed

    Stieber, Sebastian; Dorsch, Rainer; Haubelt, Christian

    2017-12-13

    In the context of modern cyber-physical systems, the accuracy of underlying sensor data plays an increasingly important role in sensor data fusion and feature extraction. The raw events of multiple sensors have to be aligned in time to enable high-quality sensor fusion results. However, the growing number of simultaneously connected sensor devices makes energy-saving data acquisition and processing more and more difficult. Hence, most modern sensors offer a first-in-first-out (FIFO) interface to store multiple data samples and to relax timing constraints when handling multiple sensor devices. However, using the FIFO interface increases the negative influence of individual clock drifts (introduced by fabrication inaccuracies, temperature changes and wear-out effects) on the sample time reconstruction. Furthermore, additional timing offset errors due to communication and software latencies increase with a growing number of sensor devices. In this article, we present an approach for accurate sample time reconstruction, independent of the actual clock drift, with the help of an internal sensor timer. Such timers are already available in modern sensors manufactured in micro-electromechanical systems (MEMS) technology. The presented approach focuses on calculating accurate time stamps using the sensor FIFO interface in a forward-only processing manner as a robust and energy-saving solution. The proposed algorithm is able to lower the overall standard deviation of reconstructed sampling periods below 40 μs, while run-time savings of up to 42% are achieved compared to single-sample acquisition.
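    A simplified, forward-only reconstruction in the spirit of this approach pairs each FIFO read with the internal sensor-timer value and spreads the samples in the batch over the timer interval since the previous read. This is a sketch of the idea under those assumptions, not the published algorithm.

```python
def reconstruct_times(batches, tick_hz):
    """Forward-only per-sample time reconstruction from FIFO batches.
    Each batch is (sensor_timer_ticks_at_read, n_samples); the internal
    sensor timer (tick_hz) is assumed to be latched at the read, so the
    n samples are spread evenly over the interval since the previous read."""
    times = []
    prev_ticks = None
    for ticks, n in batches:
        t_end = ticks / tick_hz
        if prev_ticks is None or n == 0:
            times.extend([t_end] * n)       # no earlier reference yet
            prev_ticks = ticks
            continue
        t_start = prev_ticks / tick_hz
        dt = (t_end - t_start) / n          # drift-corrected effective period
        times.extend(t_start + dt * (i + 1) for i in range(n))
        prev_ticks = ticks
    return times

# e.g. two reads of a nominally 100 Hz sensor whose clock drifts slightly
print(reconstruct_times([(32768, 0), (36112, 10)], tick_hz=32768))
```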

  9. An automated system for whole microscopic image acquisition and analysis.

    PubMed

    Bueno, Gloria; Déniz, Oscar; Fernández-Carrobles, María Del Milagro; Vállez, Noelia; Salido, Jesús

    2014-09-01

    The field of anatomic pathology has experienced major changes over the last decade. Virtual microscopy (VM) systems have allowed experts in pathology and other biomedical areas to work in a safer and more collaborative way. VMs are automated systems capable of digitizing microscopic samples that were traditionally examined one by one. The possibility of having digital copies reduces the risk of damaging original samples, and also makes it easier to distribute copies among other pathologists. This article describes the development of an automated high-resolution whole slide imaging (WSI) system tailored to the needs and problems encountered in digital imaging for pathology, from hardware control to the full digitization of samples. The system has been built with an additional digital monochromatic camera alongside the default color camera and LED transmitted illumination (RGB). Monochrome cameras are the preferred acquisition method for fluorescence microscopy. The system is able to digitize correctly and form large high-resolution microscope images for both brightfield and fluorescence. The quality of the digital images has been quantified using three metrics based on sharpness, contrast and focus. It has been tested on 150 tissue samples of brain autopsies, prostate biopsies and lung cytologies, at five magnifications: 2.5×, 10×, 20×, 40×, and 63×. The article focuses on the hardware set-up and the acquisition software, although results of the implemented image processing techniques included in the software and applied to the different tissue samples are also presented. © 2014 Wiley Periodicals, Inc.
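    The three image-quality figures (sharpness, contrast, focus) can be approximated per tile with standard measures; the formulas below are generic stand-ins, not the exact metrics used in the article.

```python
import numpy as np

def tile_quality(tile):
    """Three simple per-tile quality figures in the spirit of the metrics
    mentioned above. 'tile' is a 2-D grayscale array."""
    t = np.asarray(tile, dtype=float)
    gy, gx = np.gradient(t)
    sharpness = float(np.mean(np.hypot(gx, gy)))       # mean gradient magnitude
    contrast = float(t.std() / (t.mean() + 1e-9))      # coefficient of variation
    lap = (-4 * t[1:-1, 1:-1] + t[:-2, 1:-1] + t[2:, 1:-1]
           + t[1:-1, :-2] + t[1:-1, 2:])
    focus = float(np.var(lap))                         # variance of the Laplacian
    return sharpness, contrast, focus
```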

  10. Highly variable acquisition rates of Ixodes scapularis (Acari: Ixodidae) by birds on an Atlantic barrier island.

    PubMed

    Mitra, S S; Buckley, P A; Buckley, F G; Ginsberg, H S

    2010-11-01

    Acquisition of ticks by bird hosts is a central process in the transmission cycles of many tick-borne zoonoses, but tick recruitment by birds has received little direct study. We documented acquisition of Ixodes scapularis Say on birds at Fire Island, NY, by removing ticks from mist-netted birds, and recording the number of ticks on birds recaptured within 4 d of release. Eight bird species acquired at least 0.8 ticks bird(-1) day(-1) during the seasonal peak for at least one age class of I. scapularis. Gray Catbirds, Eastern Towhees, Common Yellowthroats, and Northern Waterthrushes collectively accounted for 83% of all tick acquisitions; and six individuals apportioned among Black-billed Cuckoo, Gray Catbird, Eastern Towhee, and Common Yellowthroat were simultaneously infested with both larvae and nymphs. Bird species with the highest acquisition rates were generally ground foragers, whereas birds that did not acquire ticks in our samples generally foraged above the ground. Tick acquisition by birds did not differ between deciduous and coniferous forests. Among the 15 bird species with the highest recruitment rates, acquisition of nymphs was not correlated with acquisition of larvae. Tick acquisition rates by individual bird species were not correlated with the reservoir competence of those species for Lyme borreliae. However, birds with high tick acquisition rates can contribute large numbers of infected ticks, and thus help maintain the enzootic cycle, even if their levels of reservoir competence are relatively low.

  11. Hebb learning, verbal short-term memory, and the acquisition of phonological forms in children.

    PubMed

    Mosse, Emma K; Jarrold, Christopher

    2008-04-01

    Recent work using the Hebb effect as a marker for implicit long-term acquisition of serial order has demonstrated a functional equivalence across verbal and visuospatial short-term memory. The current study extends this observation to a sample of five- to six-year-olds using verbal and spatial immediate serial recall and also correlates the magnitude of Hebb learning with explicit measures of word and nonword paired-associate learning. Comparable Hebb effects were observed in both domains, but only nonword learning was significantly related to the magnitude of Hebb learning. Nonword learning was also independently related to individuals' general level of verbal serial recall. This suggests that vocabulary acquisition depends on both a domain-specific short-term memory system and a domain-general process of learning through repetition.

  12. A Compact, Solid-State UV (266 nm) Laser System Capable of Burst-Mode Operation for Laser Ablation Desorption Processing

    NASA Technical Reports Server (NTRS)

    Arevalo, Ricardo, Jr.; Coyle, Barry; Paulios, Demetrios; Stysley, Paul; Feng, Steve; Getty, Stephanie; Binkerhoff, William

    2015-01-01

    Compared to wet chemistry and pyrolysis techniques, in situ laser-based methods of chemical analysis provide an ideal way to characterize precious planetary materials without requiring extensive sample processing. In particular, laser desorption and ablation techniques allow for rapid, reproducible and robust data acquisition over a wide mass range, plus: quantitative, spatially-resolved measurements of elemental and molecular (organic and inorganic) abundances; low analytical blanks and limits of detection (ng g-1); and the destruction of minimal quantities of sample (µg) compared to traditional solution and/or pyrolysis analyses (mg).

  13. The effect of signal acquisition and processing choices on ApEn values: towards a "gold standard" for distinguishing effort levels from isometric force records.

    PubMed

    Forrest, Sarah M; Challis, John H; Winter, Samantha L

    2014-06-01

    Approximate entropy (ApEn) is frequently used to identify changes in the complexity of isometric force records with ageing and disease. Different signal acquisition and processing parameters have been used, making comparison or confirmation of results difficult. This study determined the effect of sampling and parameter choices by examining changes in ApEn values across a range of submaximal isometric contractions of the first dorsal interosseus. Reducing the sample rate by decimation changed both the value and pattern of ApEn values dramatically. The pattern of ApEn values across the range of effort levels was not sensitive to the filter cut-off frequency, or the criterion used to extract the section of data for analysis. The complexity increased with increasing effort levels using a fixed 'r' value (which accounts for measurement noise) but decreased with increasing effort level when 'r' was set to 0.1 of the standard deviation of force. It is recommended that isometric force records are sampled at frequencies >200 Hz, template length ('m') is set to 2, and 'r' is set to measurement system noise or 0.1 SD, depending on the physiological process to be distinguished. It is demonstrated that changes in ApEn across effort levels are related to changes in force gradation strategy. Copyright © 2014 IPEM. Published by Elsevier Ltd. All rights reserved.
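    For reference, a plain NumPy implementation of ApEn with the recommended template length m = 2 and the r = 0.1 SD option discussed above; this is a textbook-style sketch, not the authors' code.

```python
import numpy as np

def approximate_entropy(x, m=2, r=None):
    """Approximate entropy of a 1-D force record.
    Defaults follow the recommendations above: m = 2 and, as one of the
    two options discussed, r = 0.1 * SD of the signal."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.1 * x.std()
    n = len(x)

    def phi(m_):
        # all overlapping templates of length m_
        templ = np.array([x[i:i + m_] for i in range(n - m_ + 1)])
        # Chebyshev distance between every pair of templates
        dist = np.max(np.abs(templ[:, None, :] - templ[None, :, :]), axis=2)
        c = (dist <= r).mean(axis=1)   # fraction within r (self-matches included)
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)
```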

  14. Automatic differential analysis of NMR experiments in complex samples.

    PubMed

    Margueritte, Laure; Markov, Petar; Chiron, Lionel; Starck, Jean-Philippe; Vonthron-Sénécheau, Catherine; Bourjot, Mélanie; Delsuc, Marc-André

    2018-06-01

    Liquid state nuclear magnetic resonance (NMR) is a powerful tool for the analysis of complex mixtures of unknown molecules. This capacity has been used in many analytical approaches: metabolomics, identification of active compounds in natural extracts, and characterization of species; such studies require the acquisition of many diverse NMR measurements on series of samples. Although acquisition can easily be performed automatically, the number of NMR experiments involved in these studies increases very rapidly, and this data avalanche requires resorting to automatic processing and analysis. We present here a program that allows the autonomous, unsupervised processing of a large corpus of 1D, 2D, and diffusion-ordered spectroscopy experiments from a series of samples acquired in different conditions. The program provides all the signal processing steps, as well as peak-picking and bucketing of 1D and 2D spectra; the program and its components are fully available. In an experiment mimicking the search for a bioactive species in a natural extract, we use it for the automatic detection of small amounts of artemisinin added to a series of plant extracts and for the generation of the spectral fingerprint of this molecule. This program, called Plasmodesma, is a novel tool that should be useful for deciphering complex mixtures, particularly in the discovery of biologically active natural products from plant extracts, but it can also be applied in drug discovery or metabolomics studies. Copyright © 2017 John Wiley & Sons, Ltd.

  15. Autonomous Sample Acquisition for Planetary and Small Body Explorations

    NASA Technical Reports Server (NTRS)

    Ghavimi, Ali R.; Serricchio, Frederick; Dolgin, Ben; Hadaegh, Fred Y.

    2000-01-01

    Robotic drilling and autonomous sample acquisition are considered key technology requirements for future planetary and small-body exploration missions. Core sampling and subsurface drilling operations are envisioned to be performed from rovers or landers. These supporting platforms are inherently flexible and light, and can withstand only a limited amount of reaction forces and torques. This, together with the unknown properties of the sampled materials, makes the sampling operation a tedious and quite challenging task. This paper highlights recent advancements in the design and development of the sample acquisition control system for in situ scientific exploration on planetary and small interplanetary missions.

  16. A Research on Second Language Acquisition and College English Teaching

    ERIC Educational Resources Information Center

    Li, Changyu

    2009-01-01

    It was in the 1970s that American linguist S.D. Krashen created the theory of "language acquisition". The theories on second language acquisition were proposed based on the study on the second language acquisition process and its rules. Here, the second language acquisition process refers to the process in which a learner with the…

  17. Post-acquisition data mining techniques for LC-MS/MS-acquired data in drug metabolite identification.

    PubMed

    Dhurjad, Pooja Sukhdev; Marothu, Vamsi Krishna; Rathod, Rajeshwari

    2017-08-01

    Metabolite identification is a crucial part of the drug discovery process. LC-MS/MS-based metabolite identification has gained widespread use, but the data acquired by the LC-MS/MS instrument is complex, and thus the interpretation of data becomes troublesome. Fortunately, advancements in data mining techniques have simplified the process of data interpretation with improved mass accuracy and provide a potentially selective, sensitive, accurate and comprehensive way for metabolite identification. In this review, we have discussed the targeted (extracted ion chromatogram, mass defect filter, product ion filter, neutral loss filter and isotope pattern filter) and untargeted (control sample comparison, background subtraction and metabolomic approaches) post-acquisition data mining techniques, which facilitate the drug metabolite identification. We have also discussed the importance of integrated data mining strategy.
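    As an illustration of the mass defect filter (MDF) listed above, the sketch below keeps candidate ions whose mass defect falls within a tolerance of the parent drug's; the window width and the example m/z values are invented for the demonstration and do not reproduce any vendor software.

```python
def mass_defect_filter(mz_values, parent_mz, window_mmu=50.0):
    """Minimal mass defect filter: keep m/z values whose mass defect
    (fractional part) lies within +/- window_mmu milli-mass-units of the
    parent drug's mass defect."""
    parent_defect = parent_mz - int(parent_mz)
    tol = window_mmu / 1000.0
    kept = []
    for mz in mz_values:
        defect = mz - int(mz)
        # wrap-around so defects of 0.99 and 0.01 count as 20 mmu apart
        diff = min(abs(defect - parent_defect), 1.0 - abs(defect - parent_defect))
        if diff <= tol:
            kept.append(mz)
    return kept

# hypothetical candidate ions screened against a parent at m/z 346.1553
print(mass_defect_filter([362.1502, 360.9000, 390.1816], 346.1553))
```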

  18. Separation and Sealing of a Sample Container Using Brazing

    NASA Technical Reports Server (NTRS)

    Bar-Cohen, Yoseph; Rivellini, Tommaso P.; Wincentsen, James E.; Gershman, Robert

    2007-01-01

    A special double-wall container and a process for utilizing the container are being developed to enable (1) acquisition of a sample of material in a dirty environment that may include a biological and/or chemical hazard; (2) sealing a lid onto the inner part of the container to hermetically enclose the sample; (3) separating the resulting hermetic container from the dirty environment; and (4) bringing that hermetic container, without any biological or chemical contamination of its outer surface, into a clean environment. The process is denoted S³B (separation, seaming, and sealing using brazing) because sealing of the sample into the hermetic container, separating the container from the dirty environment, and bringing the container with a clean outer surface into the clean environment are all accomplished simultaneously with a brazing operation.

  19. The NASA, Marshall Space Flight Center drop tube user's manual

    NASA Technical Reports Server (NTRS)

    Rathz, Thomas J.; Robinson, Michael B.

    1990-01-01

    A comprehensive description of the structural and instrumentation hardware and the experimental capabilities of the 105-meter Marshall Space Flight Center Drop Tube Facility is given. This document is to serve as a guide to the investigator who wishes to perform materials processing experiments in the Drop Tube. Particular attention is given to the Tube's hardware to which an investigator must interface to perform experiments. This hardware consists of the permanent structural hardware (with such items as vacuum flanges), and the experimental hardware (with the furnaces and the sample insertion devices). Two furnaces, an electron-beam and an electromagnetic levitator, are currently used to melt metallic samples in a process environment that can range from 10⁻⁶ Torr to 1 atmosphere. Details of these furnaces, the processing environment gases/vacuum, the electrical power, and data acquisition capabilities are specified to allow an investigator to design his/her experiment to maximize successful results and to reduce experimental setup time on the Tube. Various devices used to catch samples while inflicting minimum damage and to enhance turnaround time between experiments are described. Enough information is provided to allow an investigator who wishes to build his/her own furnace or sample catch devices to easily interface it to the Tube. The experimental instrumentation and data acquisition systems used to perform pre-drop and in-flight measurements of the melting and solidification process are also detailed. Typical experimental results are presented as an indicator of the type of data that is provided by the Drop Tube Facility. A summary bibliography of past Drop Tube experiments is provided, and an appendix explaining the noncontact temperature determination of free-falling drops is provided. This document is to be revised occasionally as improvements to the Facility are made and as the summary bibliography grows.

  20. Networks for image acquisition, processing and display

    NASA Technical Reports Server (NTRS)

    Ahumada, Albert J., Jr.

    1990-01-01

    The human visual system comprises layers of networks which sample, process, and code images. Understanding these networks is a valuable means of understanding human vision and of designing autonomous vision systems based on network processing. Ames Research Center has an ongoing program to develop computational models of such networks. The models predict human performance in detection of targets and in discrimination of displayed information. In addition, the models are artificial vision systems sharing properties with biological vision that has been tuned by evolution for high performance. Properties include variable density sampling, noise immunity, multi-resolution coding, and fault-tolerance. The research stresses analysis of noise in visual networks, including sampling, photon, and processing unit noises. Specific accomplishments include: models of sampling array growth with variable density and irregularity comparable to that of the retinal cone mosaic; noise models of networks with signal-dependent and independent noise; models of network connection development for preserving spatial registration and interpolation; multi-resolution encoding models based on hexagonal arrays (HOP transform); and mathematical procedures for simplifying analysis of large networks.

  1. Nuclear Magnetic Resonance Spectroscopy-Based Identification of Yeast.

    PubMed

    Himmelreich, Uwe; Sorrell, Tania C; Daniel, Heide-Marie

    2017-01-01

    Rapid and robust high-throughput identification of environmental, industrial, or clinical yeast isolates is important whenever relatively large numbers of samples need to be processed in a cost-efficient way. Nuclear magnetic resonance (NMR) spectroscopy generates complex data based on metabolite profiles, chemical composition and possibly on medium consumption, which can not only be used for the assessment of metabolic pathways but also for accurate identification of yeast down to the subspecies level. Initial results on NMR-based yeast identification were comparable with conventional and DNA-based identification. Potential advantages of NMR spectroscopy in mycological laboratories include not only accurate identification but also the potential for automated sample delivery, automated analysis using computer-based methods, rapid turnaround time, high throughput, and low running costs. We describe here the sample preparation, data acquisition and analysis for NMR-based yeast identification. In addition, a roadmap for the development of classification strategies is given that will result in the acquisition of a database and analysis algorithms for yeast identification in different environments.

  2. Modular Biometric Monitoring System

    NASA Technical Reports Server (NTRS)

    Chmiel, Alan J. (Inventor); Humphreys, Bradley T. (Inventor)

    2017-01-01

    A modular system for acquiring biometric data includes a plurality of data acquisition modules configured to sample biometric data from at least one respective input channel at a data acquisition rate. A representation of the sampled biometric data is stored in memory of each of the plurality of data acquisition modules. A central control system is in communication with each of the plurality of data acquisition modules through a bus. The central control system is configured to control communication of data, via the bus, with each of the plurality of data acquisition modules.

  3. 42 CFR 37.44 - Approval of radiographic facilities that use digital radiography systems.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... effective management, safety, and proper performance of chest image acquisition, digitization, processing... digital chest radiographs by submitting to NIOSH digital radiographic image files of a test object (e.g... radiographic image files from six or more sample chest radiographs that are of acceptable quality to one or...

  4. The growth of the central region by acquisition of counterrotating gas in star-forming galaxies

    PubMed Central

    Chen, Yan-Mei; Shi, Yong; Tremonti, Christy A.; Bershady, Matt; Merrifield, Michael; Emsellem, Eric; Jin, Yi-Fei; Huang, Song; Fu, Hai; Wake, David A.; Bundy, Kevin; Stark, David; Lin, Lihwai; Argudo-Fernandez, Maria; Bergmann, Thaisa Storchi; Bizyaev, Dmitry; Brownstein, Joel; Bureau, Martin; Chisholm, John; Drory, Niv; Guo, Qi; Hao, Lei; Hu, Jian; Li, Cheng; Li, Ran; Lopes, Alexandre Roman; Pan, Kai-Ke; Riffel, Rogemar A.; Thomas, Daniel; Wang, Lan; Westfall, Kyle; Yan, Ren-Bin

    2016-01-01

    Galaxies grow through both internal and external processes. In about 10% of nearby red galaxies with little star formation, gas and stars are counter-rotating, demonstrating the importance of external gas acquisition in these galaxies. However, systematic studies of such phenomena in blue, star-forming galaxies are rare, leaving uncertain the role of external gas acquisition in driving evolution of blue galaxies. Here, based on new measurements with integral field spectroscopy of a large representative galaxy sample, we find an appreciable fraction of counter-rotators among blue galaxies (9 out of 489 galaxies). The central regions of blue counter-rotators show younger stellar populations and more intense, ongoing star formation than their outer parts, indicating ongoing growth of the central regions. The result offers observational evidence that the acquisition of external gas in blue galaxies is possible; the interaction with pre-existing gas funnels the gas into nuclear regions (<1 kpc) to form new stars. PMID:27759033

  5. The growth of the central region by acquisition of counterrotating gas in star-forming galaxies.

    PubMed

    Chen, Yan-Mei; Shi, Yong; Tremonti, Christy A; Bershady, Matt; Merrifield, Michael; Emsellem, Eric; Jin, Yi-Fei; Huang, Song; Fu, Hai; Wake, David A; Bundy, Kevin; Stark, David; Lin, Lihwai; Argudo-Fernandez, Maria; Bergmann, Thaisa Storchi; Bizyaev, Dmitry; Brownstein, Joel; Bureau, Martin; Chisholm, John; Drory, Niv; Guo, Qi; Hao, Lei; Hu, Jian; Li, Cheng; Li, Ran; Lopes, Alexandre Roman; Pan, Kai-Ke; Riffel, Rogemar A; Thomas, Daniel; Wang, Lan; Westfall, Kyle; Yan, Ren-Bin

    2016-10-19

    Galaxies grow through both internal and external processes. In about 10% of nearby red galaxies with little star formation, gas and stars are counter-rotating, demonstrating the importance of external gas acquisition in these galaxies. However, systematic studies of such phenomena in blue, star-forming galaxies are rare, leaving uncertain the role of external gas acquisition in driving evolution of blue galaxies. Here, based on new measurements with integral field spectroscopy of a large representative galaxy sample, we find an appreciable fraction of counter-rotators among blue galaxies (9 out of 489 galaxies). The central regions of blue counter-rotators show younger stellar populations and more intense, ongoing star formation than their outer parts, indicating ongoing growth of the central regions. The result offers observational evidence that the acquisition of external gas in blue galaxies is possible; the interaction with pre-existing gas funnels the gas into nuclear regions (<1 kpc) to form new stars.

  6. Smart Vest: wearable multi-parameter remote physiological monitoring system.

    PubMed

    Pandian, P S; Mohanavelu, K; Safeer, K P; Kotresh, T M; Shakunthala, D T; Gopal, Parvati; Padaki, V C

    2008-05-01

    The wearable physiological monitoring system is a washable shirt, which uses an array of sensors connected to a central processing unit with firmware for continuously monitoring physiological signals. The data collected can be correlated to produce an overall picture of the wearer's health. In this paper, we discuss the wearable physiological monitoring system called 'Smart Vest'. The Smart Vest consists of a comfortable-to-wear vest with integrated sensors for monitoring physiological parameters, wearable data acquisition and processing hardware, and a remote monitoring station. The wearable data acquisition system is designed around a microcontroller and interfaced with wireless communication and global positioning system (GPS) modules. The physiological signals monitored are electrocardiogram (ECG), photoplethysmogram (PPG), body temperature, blood pressure, galvanic skin response (GSR) and heart rate. The acquired physiological signals are sampled at 250 samples/s, digitized at 12-bit resolution and transmitted wirelessly to a remote physiological monitoring station along with the geo-location of the wearer. The paper describes a prototype Smart Vest system used for remote monitoring of physiological parameters; the clinical validation of the data is also presented.

  7. Price and convenience: The influence of supermarkets on consumption of ultra-processed foods and beverages in Brazil.

    PubMed

    Machado, Priscila Pereira; Claro, Rafael Moreira; Canella, Daniela Silva; Sarti, Flávia Mori; Levy, Renata Bertazzi

    2017-09-01

    To evaluate the influence of convenience and price of ultra-processed foods and beverages on purchases at supermarkets. The study used data on food and beverage acquisition for household consumption from the Brazilian Household Budget Survey, performed in a random sample of 55,970 households between 2008 and 2009. Foods and beverages were categorized into four groups, according to characteristics of food processing. Retail stores were grouped into supermarkets and other food stores. Proportion of calories from foods and beverages purchased at supermarkets and other food stores, and respective mean prices (R$/1000 kcal), were calculated according to households' geographical and socioeconomic characteristics. Effect of convenience in household purchases at retail stores was expressed by the acquisition of several food items at the same store. The influence of convenience and prices of ultra-processed products on purchases at supermarkets was analyzed using a log-log regression model with estimation of elasticity coefficients. The mean prices of foods and beverages purchased at supermarkets were 37% lower in comparison to other food stores. The share of ultra-processed foods and beverages in purchases made at supermarkets was 25% higher than at other food stores. An increase of 1% in prices of ultra-processed food items led to a 0.59% reduction in calorie acquisition at supermarkets (R² = 0.75; p < 0.001). On the other hand, an increase of 1% in the number of food items purchased at supermarkets resulted in 1.83% increase in calorie acquisition of ultra-processed foods and beverages (p < 0.001). Convenience and lower relative prices of food items purchased at supermarkets, in comparison to other food stores, are relevant to explain higher share of purchases of ultra-processed foods and beverages at supermarkets. Copyright © 2017 Elsevier Ltd. All rights reserved.
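    The elasticity estimates come from a log-log specification; the single-covariate sketch below illustrates that calculation on synthetic data, omitting the study's additional controls for household characteristics.

```python
import numpy as np

def price_elasticity(prices, quantities):
    """Elasticity from a log-log regression ln(q) = a + e*ln(p): the slope e
    is the % change in quantity per 1% change in price."""
    lp, lq = np.log(prices), np.log(quantities)
    e, a = np.polyfit(lp, lq, 1)
    return e

# synthetic example with an elasticity close to the paper's -0.59
p = np.linspace(1.0, 3.0, 50)
q = 100 * p ** -0.59 * np.exp(np.random.normal(0, 0.02, p.size))
print(price_elasticity(p, q))   # ~ -0.59
```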

  8. Image reconstructions from super-sampled data sets with resolution modeling in PET imaging.

    PubMed

    Li, Yusheng; Matej, Samuel; Metzler, Scott D

    2014-12-01

    Spatial resolution in positron emission tomography (PET) is still a limiting factor in many imaging applications. To improve the spatial resolution for an existing scanner with fixed crystal sizes, mechanical movements such as scanner wobbling and object shifting have been considered for PET systems. Multiple acquisitions from different positions can provide complementary information and increased spatial sampling. The objective of this paper is to explore an efficient and useful reconstruction framework to reconstruct super-resolution images from super-sampled low-resolution data sets. The authors introduce a super-sampling data acquisition model based on the physical processes, with tomographic, downsampling, and shifting matrices as its building blocks. Based on the model, the authors extend the MLEM and Landweber algorithms to reconstruct images from super-sampled data sets. The authors also derive a backprojection-filtration-like (BPF-like) method for the super-sampling reconstruction. Furthermore, they explore variant methods for super-sampling reconstructions: the separate super-sampling resolution-modeling reconstruction and the reconstruction without downsampling, to further improve image quality at the cost of more computation. The authors use simulated reconstruction of a resolution phantom to evaluate the three types of algorithms with different super-samplings at different count levels. Contrast recovery coefficient (CRC) versus background variability, as an image-quality metric, is calculated at each iteration for all reconstructions. The authors observe that all three algorithms can significantly and consistently achieve increased CRCs at fixed background variability and reduce background artifacts with super-sampled data sets at the same count levels. For the same super-sampled data sets, the MLEM method achieves better image quality than the Landweber method, which in turn achieves better image quality than the BPF-like method. The authors also demonstrate that reconstructions from super-sampled data sets using a fine system matrix yield improved image quality compared to reconstructions using a coarse system matrix. Super-sampling reconstructions at different count levels showed that greater spatial-resolution improvement can be obtained with higher counts at larger iteration numbers. The authors developed a super-sampling reconstruction framework that can reconstruct super-resolution images using the super-sampled data sets simultaneously with known acquisition motion. Super-sampling PET acquisition using the proposed algorithms provides an effective and economical way to improve image quality for PET imaging, which has important implications for preclinical and clinical region-of-interest PET imaging applications.
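
    The MLEM extension mentioned above operates on a stacked super-sampling forward model. As a hedged illustration, the sketch below implements a generic MLEM update in which the combined system matrix (tomographic, downsampling and shifting blocks stacked over acquisition positions) is treated as a single dense array A; the matrix, data and dimensions are synthetic placeholders, not the authors' implementation.

        import numpy as np

        def mlem(A, y, n_iter=50):
            """MLEM update: x <- x / (A^T 1) * A^T (y / (A x)), applied elementwise."""
            x = np.ones(A.shape[1])
            sens = A.sum(axis=0)                                  # sensitivity image A^T 1
            for _ in range(n_iter):
                proj = A @ x                                      # forward projection
                ratio = np.divide(y, proj, out=np.zeros_like(y), where=proj > 0)
                x *= (A.T @ ratio) / np.maximum(sens, 1e-12)
            return x

        # Tiny synthetic example: nonnegative system matrix and Poisson-distributed data
        rng = np.random.default_rng(0)
        A = rng.random((200, 50))
        x_true = 10.0 * rng.random(50)
        y = rng.poisson(A @ x_true).astype(float)
        x_hat = mlem(A, y)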

  9. Associations of emotional arousal, dissociation and symptom severity with operant conditioning in borderline personality disorder.

    PubMed

    Paret, Christian; Hoesterey, Steffen; Kleindienst, Nikolaus; Schmahl, Christian

    2016-10-30

    Individuals with borderline personality disorder (BPD) display altered evaluations of reward and punishment compared to others. The processing of rewards is fundamental to operant conditioning. However, studies addressing operant conditioning in BPD patients are rare. In the current study, an operant conditioning task combining acquisition and reversal learning was used. BPD patients and matched healthy controls (HCs) were exposed to aversive and neutral stimuli to assess the influence of emotion on learning. Picture content, dissociation, aversive tension and symptom severity were rated. Error rates were measured. Results showed no group-by-condition interactions for aversive versus neutral scenes. The higher the emotional arousal, dissociation and tension, the worse the acquisition, but not reversal, scores were for BPD patients. Scores from the Borderline Symptom List were associated with more errors in the reversal, but not the acquisition, phase. The results are preliminary evidence for impaired acquisition learning due to increased emotional arousal, dissociation and tension in BPD patients. A failure to process punishment in the reversal phase was associated with symptom severity and may be related to neuropsychological dysfunction involving the ventromedial prefrontal cortex. Conclusions are limited due to the correlational study design and the small sample size. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  10. 48 CFR 8.705-3 - Allocation process.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Allocation process. 8.705-3 Section 8.705-3 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION ACQUISITION PLANNING REQUIRED SOURCES OF SUPPLIES AND SERVICES Acquisition From Nonprofit Agencies Employing People Who...

  11. A twin study of the genetics of fear conditioning.

    PubMed

    Hettema, John M; Annas, Peter; Neale, Michael C; Kendler, Kenneth S; Fredrikson, Mats

    2003-07-01

    Fear conditioning is a traditional model for the acquisition of fears and phobias. Studies of the genetic architecture of fear conditioning may inform gene-finding strategies for anxiety disorders. The objective of this study was to determine the genetic and environmental sources of individual differences in fear conditioning by means of a twin sample. Classic fear conditioning data were experimentally obtained from 173 same-sex twin pairs (90 monozygotic and 83 dizygotic). Sequences of evolutionarily fear-relevant (snakes and spiders) and fear-irrelevant (circles and triangles) pictorial stimuli served as conditioned stimuli, paired with a mild electric shock serving as the unconditioned stimulus. The outcome measure was the electrodermal skin conductance response. We applied structural equation modeling methods to the 3 conditioning phases of habituation, acquisition, and extinction to determine the extent to which genetic and environmental factors underlie individual variation in associative and nonassociative learning. All components of the fear conditioning process in humans demonstrated moderate heritability, in the range of 35% to 45%. Best-fitting multivariate models suggest that 2 sets of genes may underlie the trait of fear conditioning: one that most strongly affects nonassociative processes of habituation and is also shared with acquisition and extinction, and a second that appears related to associative fear conditioning processes. In addition, these data provide tentative evidence of differences in heritability based on the fear relevance of the stimuli. Genes represent a significant source of individual variation in the habituation, acquisition, and extinction of fears, and genetic effects specific to fear conditioning are involved.

  12. A Post-Processing Receiver for the Lunar Laser Communications Demonstration Project

    NASA Technical Reports Server (NTRS)

    Srinivasan, Meera; Birnbaum, Kevin; Cheng, Michael; Quirk, Kevin

    2013-01-01

    The Lunar Laser Communications Demonstration Project undertaken by MIT Lincoln Laboratory and NASA's Goddard Space Flight Center will demonstrate high-rate laser communications from lunar orbit to the Earth. NASA's Jet Propulsion Laboratory is developing a backup ground station supporting a data rate of 39 Mbps that is based on a non-real-time software post-processing receiver architecture. This approach entails processing sample-rate-limited data without feedback in the presence of high uncertainty in downlink clock characteristics under low signal flux conditions. In this paper we present a receiver concept that addresses these challenges, with descriptions of the photodetector assembly, sample acquisition and recording platform, and signal processing approach. End-to-end coded simulation and laboratory data analysis results are presented that validate the receiver conceptual design.

  13. Method and apparatus for implementing material thermal property measurement by flash thermal imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Jiangang

    A method and apparatus are provided for implementing measurement of material thermal properties, including measurement of the thermal effusivity of a coating and/or film or of a bulk material of uniform property. The test apparatus includes an infrared camera; a data acquisition and processing computer coupled to the infrared camera for acquiring and processing thermal image data; and a flash lamp that provides an input of heat onto the surface of a two-layer sample, with an enhanced optical filter covering the flash lamp to attenuate the entire infrared wavelength range while a series of thermal images is taken of the surface of the two-layer sample.

  14. Mineralogy and Elemental Composition of Wind Drift Soil at Rocknest, Gale Crater

    NASA Technical Reports Server (NTRS)

    Blake, D. F.; Bish, D. L.; Morris, R. V.; Downs, R. T.; Trieman, A. H.; Morrison, S. M.; Chipera, S. J.; Ming, D. W.; Yen, A. S.; Vaniman, D. T.; hide

    2013-01-01

    The Mars Science Laboratory rover Curiosity has been exploring Mars since August 5, 2012, conducting engineering and first-time activities with its mobility system, arm, sample acquisition and processing system (SA/SPaH-CHIMRA) and science instruments. Curiosity spent 54 sols at a location named "Rocknest," collecting and processing five scoops of loose, unconsolidated materials ("soil") acquired from an aeolian bedform (Fig. 1). The Chemistry and Mineralogy (CheMin) instrument analyzed portions of scoops 3, 4, and 5, to obtain the first quantitative mineralogical analysis of Mars soil, and to provide context for Sample Analysis at Mars (SAM) measurements of volatiles, isotopes and possible organic materials.

  15. ERP responses to lexical-semantic processing in typically developing toddlers, in adults, and in toddlers at risk for language and learning impairment.

    PubMed

    Cantiani, Chiara; Riva, Valentina; Piazza, Caterina; Melesi, Giulia; Mornati, Giulia; Bettoni, Roberta; Marino, Cecilia; Molteni, Massimo

    2017-08-01

    Children begin to establish lexical-semantic representations during their first year of life, resulting in a rapid growth of vocabulary around 18-24 months of age. The neural mechanisms underlying this initial ability to map words onto conceptual representations remain relatively unknown. In the present study, the electrophysiological underpinnings of these mechanisms are explored during the critical phase of lexical acquisition using a picture-word matching paradigm. Event-Related Potentials (ERPs) elicited by words (either congruous or incongruous with the previous picture context) and pseudo-words are investigated in 20-month-old toddlers (N = 20) and compared to those elicited in a sample of adults (N = 20), reflecting the mature and efficient system, and a sample of toddlers at familial risk for language and learning impairment (LLI, N = 15). The results suggest that the architecture underlying spoken word representation and processing is constant throughout development, even if some differences between children and adults emerged. Interestingly, children seem to be faster than adults in processing incongruent words, probably because they rely on a different and more superficial strategy. This early strategy does not seem to be present in children at risk for LLI. In addition, neither group of children showed different and specific electrophysiological responses to real but incongruent words versus unknown words, suggesting that during the critical phase of lexical acquisition any potential word is processed in a similar way. Overall, children at risk for LLI turned out to be sensitive to verbal incongruity of the lexical-semantic context, although some differences from typically developing children emerged, reflecting slower processing and less automatic responses. Taken together, the findings of this study pave the way for further research to investigate these effects in clinical and at-risk populations with the general purpose of disentangling the underlying mechanisms of lexical acquisition, and potentially predicting later language (dis)abilities. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Benchmarking contactless acquisition sensor reproducibility for latent fingerprint trace evidence

    NASA Astrophysics Data System (ADS)

    Hildebrandt, Mario; Dittmann, Jana

    2015-03-01

    Optical, nanometer-range, contactless, non-destructive sensor devices are promising acquisition techniques in crime scene trace forensics, e.g. for digitizing latent fingerprint traces. Before new approaches are introduced in crime investigations, innovations need to be positively tested and quality ensured. In this paper we investigate sensor reproducibility by studying different scans from four sensors: two chromatic white light sensors (CWL600/CWL1mm), one confocal laser scanning microscope, and one NIR/VIS/UV reflection spectrometer. First, we perform intra-sensor reproducibility testing for the CWL600 with a privacy-conform test set of artificial-sweat printed, computer-generated fingerprints. We use 24 different fingerprint patterns as original samples (printing samples/templates) for printing with artificial sweat (physical trace samples) and their acquisition with contactless sensors, resulting in 96 sensor images, called scans or acquired samples. The second test set, for inter-sensor reproducibility assessment, consists of the first three patterns from the first test set, acquired in two consecutive scans using each device. We suggest using a simple feature set in the spatial and frequency domains, known from signal processing, and test its suitability with six different classifiers that classify scan data into small differences (reproducible) and large differences (non-reproducible). Furthermore, we suggest comparing the classification results with biometric verification scores (calculated with NBIS, with a threshold of 40) as a biometric reproducibility score. The Bagging classifier is, in nearly all cases, the most reliable classifier in our experiments, and the results are also confirmed by the biometric matching rates.
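
    As a rough illustration of the classification step described above, the sketch below computes simple spatial- and frequency-domain features of scan difference images and trains a Bagging ensemble to separate small (reproducible) from large (non-reproducible) differences. The features, synthetic 'scans' and labels are hypothetical stand-ins, not the authors' feature set, data or the NBIS-based biometric score.

        import numpy as np
        from sklearn.ensemble import BaggingClassifier

        def diff_features(scan_a, scan_b):
            """Toy spatial- and frequency-domain features of the difference of two scans."""
            d = scan_a - scan_b
            spec = np.abs(np.fft.rfft2(d))
            return np.array([d.mean(), d.std(), spec[:8, :8].sum(), spec[8:, 8:].sum()])

        rng = np.random.default_rng(1)
        X, y = [], []
        for i in range(200):
            base = rng.normal(0.0, 1.0, (64, 64))
            noise = 0.1 if i % 2 == 0 else 1.0                    # class 0: small differences, class 1: large
            X.append(diff_features(base, base + rng.normal(0.0, noise, (64, 64))))
            y.append(i % 2)
        X, y = np.array(X), np.array(y)

        clf = BaggingClassifier(n_estimators=50, random_state=0)  # bagged decision trees by default
        clf.fit(X[:150], y[:150])
        print("held-out accuracy:", clf.score(X[150:], y[150:]))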

  17. Fast imaging of laboratory core floods using 3D compressed sensing RARE MRI.

    PubMed

    Ramskill, N P; Bush, I; Sederman, A J; Mantle, M D; Benning, M; Anger, B C; Appel, M; Gladden, L F

    2016-09-01

    Three-dimensional (3D) imaging of the fluid distributions within the rock is essential to enable the unambiguous interpretation of core flooding data. Magnetic resonance imaging (MRI) has been widely used to image fluid saturation in rock cores; however, conventional acquisition strategies are typically too slow to capture the dynamic nature of the displacement processes that are of interest. Using Compressed Sensing (CS), it is possible to reconstruct a near-perfect image from significantly fewer measurements than was previously thought necessary, and this can result in a significant reduction in image acquisition times. In the present study, a method using the Rapid Acquisition with Relaxation Enhancement (RARE) pulse sequence with CS to provide 3D images of the fluid saturation in rock core samples during laboratory core floods is demonstrated. An objective method using image quality metrics for the determination of the most suitable regularisation functional to be used in the CS reconstructions is reported. It is shown that for the present application, Total Variation outperforms the Haar and Daubechies3 wavelet families in terms of the agreement of their respective CS reconstructions with a fully-sampled reference image. Using the CS-RARE approach, 3D images of the fluid saturation in the rock core have been acquired in 16 min. The CS-RARE technique has been applied to image the residual water saturation in the rock during a water-water displacement core flood. With a flow rate corresponding to an interstitial velocity of vi = 1.89 ± 0.03 ft day(-1), 0.1 pore volumes were injected over the course of each image acquisition, a four-fold reduction when compared to a fully-sampled RARE acquisition. Finally, the 3D CS-RARE technique has been used to image the drainage of dodecane into the water-saturated rock, in which the dynamics of the coalescence of discrete clusters of the non-wetting phase are clearly observed. The enhancement in temporal resolution that has been achieved using the CS-RARE approach enables dynamic transport processes pertinent to laboratory core floods to be investigated in 3D on a time-scale and with a spatial resolution that, until now, has not been possible. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
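
    As an illustrative sketch of the compressed-sensing step described above, the code below reconstructs an image from undersampled Fourier data by gradient descent on a least-squares data term plus a smoothed Total Variation penalty. The operators, boundary handling, step size and regularisation weight are simplified placeholders and do not reproduce the CS-RARE implementation.

        import numpy as np

        def tv_grad(x, eps=1e-6):
            """Approximate gradient of a smoothed isotropic TV penalty on a 2D image."""
            dx = np.diff(x, axis=0, append=x[-1:, :])
            dy = np.diff(x, axis=1, append=x[:, -1:])
            mag = np.sqrt(dx**2 + dy**2 + eps)
            px, py = dx / mag, dy / mag
            div = (px - np.roll(px, 1, axis=0)) + (py - np.roll(py, 1, axis=1))
            return -div

        def cs_recon(y, mask, lam=0.01, step=1.0, n_iter=200):
            """y: undersampled k-space data, mask: boolean sampling mask (same shape)."""
            x = np.real(np.fft.ifft2(y))                          # zero-filled starting image
            for _ in range(n_iter):
                resid = mask * (np.fft.fft2(x) - y)               # data-consistency residual
                grad_data = np.real(np.fft.ifft2(resid))
                x -= step * (grad_data + lam * tv_grad(x))
            return x

        # Example: recover a piecewise-constant phantom from ~30% of k-space samples
        rng = np.random.default_rng(4)
        phantom = np.zeros((64, 64)); phantom[20:44, 20:44] = 1.0
        mask = rng.random((64, 64)) < 0.3
        recon = cs_recon(mask * np.fft.fft2(phantom), mask)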

  18. Microbial Mineral Weathering for Nutrient Acquisition Releases Arsenic

    NASA Astrophysics Data System (ADS)

    Mailloux, B. J.; Alexandrova, E.; Keimowitz, A.; Wovkulich, K.; Freyer, G.; Stolz, J.; Kenna, T.; Pichler, T.; Polizzotto, M.; Dong, H.; Radloff, K. A.; van Geen, A.

    2008-12-01

    Tens of millions of people in Southeast Asia drink groundwater contaminated with naturally occurring arsenic. The process of arsenic release from the sediment to the groundwater remains poorly understood. Experiments were performed to determine if microbial mineral weathering for nutrient acquisition can serve as a potential mechanism for arsenic mobilization. We performed microcosm experiments with Burkholderia fungorum, phosphate-free artificial groundwater, and natural apatite. Controls included incubations with no cells and with killed cells. Additionally, samples were treated with two spikes: an arsenic spike, to show that arsenic release is independent of the initial arsenic concentration, and a phosphate spike, to determine whether release occurs at field-relevant phosphate conditions. We show in laboratory experiments that phosphate-limited cells of Burkholderia fungorum mobilize ancillary arsenic from apatite as a by-product of mineral weathering for nutrient acquisition. The released arsenic does not undergo a redox transformation but appears to be solubilized from the apatite mineral lattice as arsenate during weathering. Apatite has been shown to be commonly present in sediment samples from Bangladesh aquifers. Analysis of apatite purified from the Ganges, Brahmaputra, Meghna drainage basin shows 210 mg/kg of arsenic, which is higher than the average crustal level. Finally, we demonstrate the presence of the microbial phenotype that releases arsenic from apatite in Bangladesh sediments. These results suggest that microbial weathering for nutrient acquisition could be an important mechanism for arsenic mobilization.

  19. Development and Validation of a Qualitative Method for Target Screening of 448 Pesticide Residues in Fruits and Vegetables Using UHPLC/ESI Q-Orbitrap Based on Data-Independent Acquisition and Compound Database.

    PubMed

    Wang, Jian; Chow, Willis; Chang, James; Wong, Jon W

    2017-01-18

    A semiautomated qualitative method for target screening of 448 pesticide residues in fruits and vegetables was developed and validated using ultrahigh-performance liquid chromatography coupled with electrospray ionization quadrupole Orbitrap high-resolution mass spectrometry (UHPLC/ESI Q-Orbitrap). The Q-Orbitrap Full MS/dd-MS2 (data-dependent acquisition) mode was used to acquire product-ion spectra of individual pesticides to build a compound database or MS library, while its Full MS/DIA (data-independent acquisition) mode was utilized for sample data acquisition from fruit and vegetable matrices fortified with pesticides at 10 and 100 μg/kg for target screening purposes. Accurate mass, retention time and response threshold were the three key parameters in the compound database used to detect incurred pesticide residues in samples. The concepts and practical aspects of in-spectrum mass correction or solvent background lock-mass correction, retention time alignment and response threshold adjustment are discussed in the context of building a functional and working compound database for target screening. The validated target screening method is capable of screening at least 94% and 99% of the 448 pesticides at 10 and 100 μg/kg, respectively, in fruits and vegetables without having to evaluate every compound manually during data processing, which significantly reduces the workload in routine practice.
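
    The screening decision described above hinges on three parameters per database entry: accurate mass, retention time and a response threshold. A minimal sketch of that matching logic is shown below; the database entries, tolerances and peak list are illustrative values, not those of the validated method.

        database = [
            # (name, exact m/z, retention time in min, response threshold) -- illustrative entries
            ("carbendazim", 192.0768, 3.2, 1e4),
            ("imidacloprid", 256.0595, 4.1, 1e4),
        ]

        def screen(peaks, db, ppm_tol=5.0, rt_tol=0.2):
            """peaks: list of (m/z, retention time, response); returns names of detected compounds."""
            hits = []
            for name, mz0, rt0, thresh in db:
                for mz, rt, resp in peaks:
                    if (abs(mz - mz0) / mz0 * 1e6 <= ppm_tol
                            and abs(rt - rt0) <= rt_tol
                            and resp >= thresh):
                        hits.append(name)
                        break
            return hits

        print(screen([(192.0770, 3.25, 5e4), (300.1000, 6.0, 2e5)], database))   # ['carbendazim']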

  20. Contractor relationships and inter-organizational strategies in NASA's R and D acquisition process

    NASA Technical Reports Server (NTRS)

    Guiltinan, J.

    1976-01-01

    Interorganizational analysis of NASA's acquisition process for research and development systems is discussed. The importance of understanding the contractor environment, constraints, and motives in selecting an acquisition strategy is demonstrated. By articulating clear project goals, by utilizing information about the contractor and his needs at each stage in the acquisition process, and by thorough analysis of the inter-organizational relationship, improved selection of acquisition strategies and business practices is possible.

  1. Advanced Engine Health Management Applications of the SSME Real-Time Vibration Monitoring System

    NASA Technical Reports Server (NTRS)

    Fiorucci, Tony R.; Lakin, David R., II; Reynolds, Tracy D.; Turner, James E. (Technical Monitor)

    2000-01-01

    The Real Time Vibration Monitoring System (RTVMS) is a 32-channel high-speed vibration data acquisition and processing system developed at Marshall Space Flight Center (MSFC). It delivers sample rates as high as 51,200 samples/second per channel and performs Fast Fourier Transform (FFT) processing via on-board digital signal processing (DSP) chips in a real-time format. Advanced engine health assessment is achieved by utilizing the vibration spectra to provide accurate sensor validation and enhanced engine vibration redlines. Discrete spectral signatures (such as synchronous components) that are indicators of imminent failure can be assessed and utilized to mitigate catastrophic engine failures, a first in rocket engine health assessment. This paper is presented in viewgraph form.
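
    A minimal sketch of the per-channel spectral processing described above is given below: an amplitude spectrum is computed for one vibration channel at the stated 51,200 samples/s rate and the level in a narrow band around an assumed synchronous frequency is compared against an assumed redline. The signal, synchronous frequency and redline value are placeholders.

        import numpy as np

        fs = 51_200                                               # samples/s per channel
        t = np.arange(fs) / fs                                    # one second of simulated vibration data
        x = 0.5 * np.sin(2 * np.pi * 600.0 * t) + 0.1 * np.random.default_rng(0).normal(size=fs)

        spectrum = np.abs(np.fft.rfft(x)) * 2 / fs                # single-sided amplitude spectrum
        freqs = np.fft.rfftfreq(fs, d=1 / fs)

        sync_hz, redline = 600.0, 0.4                             # hypothetical synchronous frequency and redline
        band = (freqs > sync_hz - 5) & (freqs < sync_hz + 5)
        peak = spectrum[band].max()
        print(f"synchronous amplitude {peak:.2f}, redline exceeded: {peak > redline}")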

  2. Current Protocols in Pharmacology

    PubMed Central

    2016-01-01

    Determination of drug or drug metabolite concentrations in biological samples, particularly in serum or plasma, is fundamental to describing the relationships between administered dose, route of administration, and time after dose to the drug concentrations achieved and to the observed effects of the drug. A well-characterized, accurate analytical method is needed, but it must also be established that the analyte concentration in the sample at the time of analysis is the same as the concentration at sample acquisition. Drugs and metabolites may be susceptible to degradation in samples due to metabolism or to physical and chemical processes, resulting in a lower measured concentration than was in the original sample. Careful examination of analyte stability during processing and storage and adjustment of procedures and conditions to maximize that stability are a critical part of method validation for the analysis, and can ensure the accuracy of the measured concentrations. PMID:27960029

  3. Test and Validation of the Mars Science Laboratory Robotic Arm

    NASA Technical Reports Server (NTRS)

    Robinson, M.; Collins, C.; Leger, P.; Kim, W.; Carsten, J.; Tompkins, V.; Trebi-Ollennu, A.; Florow, B.

    2013-01-01

    The Mars Science Laboratory Robotic Arm (RA) is a key component for achieving the primary scientific goals of the mission. The RA supports sample acquisition by precisely positioning a scoop above loose regolith or accurately preloading a percussive drill on Martian rocks or rover-mounted organic check materials. It assists sample processing by orienting a sample processing unit called CHIMRA through a series of gravity-relative orientations and sample delivery by positioning the sample portion door above an instrument inlet or the observation tray. In addition the RA facilitates contact science by accurately positioning the dust removal tool, Alpha Particle X-Ray Spectrometer (APXS) and the Mars Hand Lens Imager (MAHLI) relative to surface targets. In order to fulfill these seemingly disparate science objectives the RA must satisfy a variety of accuracy and performance requirements. This paper describes the necessary arm requirement specification and the test campaign to demonstrate these requirements were satisfied.

  4. 48 CFR 908.7116 - Electronic data processing tape.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Electronic data processing... Electronic data processing tape. (a) Acquisitions of electronic data processing tape by DOE offices shall be in accordance with FPMR 41 CFR 101-26.508. (b) Acquisitions of electronic data processing tape by...

  5. 48 CFR 908.7116 - Electronic data processing tape.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Electronic data processing... Electronic data processing tape. (a) Acquisitions of electronic data processing tape by DOE offices shall be in accordance with FPMR 41 CFR 101-26.508. (b) Acquisitions of electronic data processing tape by...

  6. 48 CFR 908.7116 - Electronic data processing tape.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 5 2013-10-01 2013-10-01 false Electronic data processing... Electronic data processing tape. (a) Acquisitions of electronic data processing tape by DOE offices shall be in accordance with FPMR 41 CFR 101-26.508. (b) Acquisitions of electronic data processing tape by...

  7. 48 CFR 908.7116 - Electronic data processing tape.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Electronic data processing... Electronic data processing tape. (a) Acquisitions of electronic data processing tape by DOE offices shall be in accordance with FPMR 41 CFR 101-26.508. (b) Acquisitions of electronic data processing tape by...

  8. 48 CFR 908.7116 - Electronic data processing tape.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Electronic data processing... Electronic data processing tape. (a) Acquisitions of electronic data processing tape by DOE offices shall be in accordance with FPMR 41 CFR 101-26.508. (b) Acquisitions of electronic data processing tape by...

  9. Performance of a segmented HPGe detector at KRISS.

    PubMed

    Han, Jubong; Lee, K B; Lee, Jong-Man; Lee, S H; Park, Tae Soon; Oh, J S

    2018-04-01

    A 24-segment HPGe coaxial detector was set up with a digitized data acquisition system (DAQ). The DAQ was composed of a digitizer (5 × 10^7 samples/s), a Field-Programmable Gate Array (FPGA), and a real-time operating system. The Full Width at Half Maximum (FWHM), rise time, signal characteristics, and spectra of a 137Cs source were evaluated. The data were processed using an in-house developed gamma-ray tracking system. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. Design of Control Software for a High-Speed Coherent Doppler Lidar System for CO2 Measurement

    NASA Technical Reports Server (NTRS)

    Vanvalkenburg, Randal L.; Beyon, Jeffrey Y.; Koch, Grady J.; Yu, Jirong; Singh, Upendra N.; Kavaya, Michael J.

    2010-01-01

    The design of the software for a 2-micron coherent high-speed Doppler lidar system for CO2 measurement at NASA Langley Research Center is discussed in this paper. The specific strategy and design topology to meet the requirements of the system are reviewed. In order to attain the high-speed digitization of the different types of signals to be sampled on multiple channels, a carefully planned design of the control software is imperative. Samples of digitized data from each channel and their roles in data analysis post-processing are also presented. Several challenges of extremely fast, high-volume data acquisition are discussed. The software must check the validity of each lidar return as well as other monitoring-channel data in real time. For such high-speed data acquisition systems, the software is a key component that enables the entire scope of CO2 measurement studies using commercially available system components.

  11. A functional description of the Buffered Telemetry Demodulator (BTD)

    NASA Technical Reports Server (NTRS)

    Tsou, H.; Shah, B.; Lee, R.; Hinedi, S.

    1993-01-01

    This article gives a functional description of the buffered telemetry demodulator (BTD), which operates on recorded digital samples to extract the symbols from the received signal. The key advantages of the BTD are as follows: (1) its ability to reprocess the signal to reduce acquisition time; (2) its ability to use future information about the signal and to perform smoothing on past samples; and (3) its minimum transmission bandwidth requirement, as each subcarrier harmonic is processed individually. The first application of the BTD would be the Galileo S-band contingency mission, where the signal is so weak that reprocessing to reduce the acquisition time is crucial. Moreover, in the event of employing antenna arraying with full-spectrum combining, only the subcarrier harmonics need to be transmitted between sites, resulting in a significant reduction in data rate transmission requirements. Software implementation of the BTD is described for various general-purpose computers.

  12. Quantitative nanoscopy: Tackling sampling limitations in (S)TEM imaging of polymers and composites.

    PubMed

    Gnanasekaran, Karthikeyan; Snel, Roderick; de With, Gijsbertus; Friedrich, Heiner

    2016-01-01

    Sampling limitations in electron microscopy raise the question of whether the analysis of a bulk material is representative, especially when analyzing hierarchical morphologies that extend over multiple length scales. We tackled this problem by automatically acquiring a large series of partially overlapping (S)TEM images with sufficient resolution, which are subsequently stitched together to generate a large-area map using an in-house developed acquisition toolbox (TU/e Acquisition ToolBox) and stitching module (TU/e Stitcher). In addition, we show that quantitative image analysis of the large-scale maps provides representative information that can be related to the synthesis and process conditions of hierarchical materials, which moves electron microscopy analysis towards becoming a bulk characterization tool. We demonstrate the power of such an analysis by examining two different multi-phase materials that are structured over multiple length scales. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. Cleft Audit Protocol for Speech (CAPS-A): A Comprehensive Training Package for Speech Analysis

    ERIC Educational Resources Information Center

    Sell, D.; John, A.; Harding-Bell, A.; Sweeney, T.; Hegarty, F.; Freeman, J.

    2009-01-01

    Background: The previous literature has largely focused on speech analysis systems and ignored process issues, such as the nature of adequate speech samples, data acquisition, recording and playback. Although there has been recognition of the need for training on tools used in speech analysis associated with cleft palate, little attention has been…

  14. Integration of digital signal processing technologies with pulsed electron paramagnetic resonance imaging

    PubMed Central

    Pursley, Randall H.; Salem, Ghadi; Devasahayam, Nallathamby; Subramanian, Sankaran; Koscielniak, Janusz; Krishna, Murali C.; Pohida, Thomas J.

    2006-01-01

    The integration of modern data acquisition and digital signal processing (DSP) technologies with Fourier transform electron paramagnetic resonance (FT-EPR) imaging at radiofrequencies (RF) is described. The FT-EPR system operates at a Larmor frequency (Lf) of 300 MHz to facilitate in vivo studies. This relatively low frequency Lf, in conjunction with our ~10 MHz signal bandwidth, enables the use of direct free induction decay time-locked subsampling (TLSS). This particular technique provides advantages by eliminating the traditional analog intermediate frequency downconversion stage along with the corresponding noise sources. TLSS also results in manageable sample rates that facilitate the design of DSP-based data acquisition and image processing platforms. More specifically, we utilize a high-speed field programmable gate array (FPGA) and a DSP processor to perform advanced real-time signal and image processing. The migration to a DSP-based configuration offers the benefits of improved EPR system performance, as well as increased adaptability to various EPR system configurations (i.e., software configurable systems instead of hardware reconfigurations). The required modifications to the FT-EPR system design are described, with focus on the addition of DSP technologies including the application-specific hardware, software, and firmware developed for the FPGA and DSP processor. The first results of using real-time DSP technologies in conjunction with direct detection bandpass sampling to implement EPR imaging at RF frequencies are presented. PMID:16243552

  15. CORSAIR (COmet Rendezvous, Sample Acquisition, Investigation, and Return): A New Frontiers Mission Concept to Collect Samples from a Comet and Return Them to Earth for Study

    NASA Astrophysics Data System (ADS)

    Sandford, S. A.; Chabot, N. L.; Dello Russo, N.; Leary, J. C.; Reynolds, E. L.; Weaver, H. A.; Wooden, D. H.

    2017-07-01

    CORSAIR (COmet Rendezvous, Sample Acquisition, Investigation, and Return) is a mission concept submitted in response to NASA's New Frontiers 4 call. CORSAIR's proposed mission is to return comet nucleus samples to Earth for detailed analysis.

  16. Processing and Analysis of Multichannel Extracellular Neuronal Signals: State-of-the-Art and Challenges

    PubMed Central

    Mahmud, Mufti; Vassanelli, Stefano

    2016-01-01

    In recent years multichannel neuronal signal acquisition systems have allowed scientists to focus on research questions which were otherwise impossible. They act as a powerful means to study brain (dys)functions in in-vivo and in-vitro animal models. Typically, each session of electrophysiological experiments with multichannel data acquisition systems generates a large amount of raw data. For example, a 128-channel signal acquisition system with 16-bit A/D conversion and a 20 kHz sampling rate will generate approximately 17 GB of data per hour (uncompressed). This poses an important and challenging problem of inferring conclusions from the large amounts of acquired data. Thus, automated signal processing and analysis tools are becoming a key component in neuroscience research, facilitating extraction of relevant information from neuronal recordings in a reasonable time. The purpose of this review is to introduce the reader to the current state-of-the-art of open-source packages for (semi)automated processing and analysis of multichannel extracellular neuronal signals (i.e., neuronal spikes, local field potentials, electroencephalogram, etc.), and the existing Neuroinformatics infrastructure for tool and data sharing. The review is concluded by pinpointing some major challenges that are being faced, which include the development of novel benchmarking techniques, cloud-based distributed processing and analysis tools, as well as defining novel means to share and standardize data. PMID:27313507
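
    The data-rate figure quoted above follows directly from the stated channel count, sample width and sampling rate; a short check (with 16 bits = 2 bytes per sample) is shown below.

        channels, bytes_per_sample, rate_hz = 128, 2, 20_000      # 128 channels, 16-bit samples, 20 kHz
        gib_per_hour = channels * bytes_per_sample * rate_hz * 3600 / 2**30
        print(f"{gib_per_hour:.1f} GiB per hour, uncompressed")   # ~17.2, matching the figure above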

  17. 48 CFR 736.602-5 - Short selection process for procurements not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Short selection process for procurements not to exceed the simplified acquisition threshold. 736.602-5 Section 736.602-5... selection process for procurements not to exceed the simplified acquisition threshold. References to FAR 36...

  18. An automated atmospheric sampling system operating on 747 airliners

    NASA Technical Reports Server (NTRS)

    Perkins, P. J.; Gustafsson, U. R. C.

    1976-01-01

    An air sampling system that automatically measures the temporal and spatial distribution of particulate and gaseous constituents of the atmosphere is collecting data on commercial air routes covering the world. Measurements are made in the upper troposphere and lower stratosphere (6 to 12 km) of constituents related to aircraft engine emissions and other pollutants. Aircraft operated by different airlines sample air at latitudes from the Arctic to Australia. This unique system includes specialized instrumentation, a special air inlet probe for sampling outside air, a computerized automatic control, and a data acquisition system. Air constituent and related flight data are tape recorded in flight for later computer processing on the ground.

  19. Non-destructive forensic latent fingerprint acquisition with chromatic white light sensors

    NASA Astrophysics Data System (ADS)

    Leich, Marcus; Kiltz, Stefan; Dittmann, Jana; Vielhauer, Claus

    2011-02-01

    Non-destructive latent fingerprint acquisition is an emerging field of research, which, unlike traditional methods, makes latent fingerprints available for additional verification or further analysis like tests for substance abuse or age estimation. In this paper a series of tests is performed to investigate the overall suitability of a high resolution off-the-shelf chromatic white light sensor for the contact-less and non-destructive latent fingerprint acquisition. Our paper focuses on scanning previously determined regions with exemplary acquisition parameter settings. 3D height field and reflection data of five different latent fingerprints on six different types of surfaces (HDD platter, brushed metal, painted car body (metallic and non-metallic finish), blued metal, veneered plywood) are experimentally studied. Pre-processing is performed by removing low-frequency gradients. The quality of the results is assessed subjectively; no automated feature extraction is performed. Additionally, the degradation of the fingerprint during the acquisition period is observed. While the quality of the acquired data is highly dependent on surface structure, the sensor is capable of detecting the fingerprint on all sample surfaces. On blued metal the residual material is detected; however, the ridge line structure dissolves within minutes after fingerprint placement.
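
    The pre-processing step mentioned above (removal of low-frequency gradients) can be illustrated by subtracting a heavily smoothed copy of the height field from the original, i.e. a simple high-pass filter. The filter width and synthetic height field below are placeholders, not the authors' exact procedure.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def remove_low_freq(height_field, sigma=50):
            """Subtract a large-scale Gaussian-smoothed background (simple high-pass filtering)."""
            return height_field - gaussian_filter(height_field, sigma=sigma)

        # Hypothetical height field: a tilted surface plus fine ridge-like structure
        y, x = np.mgrid[0:512, 0:512]
        flattened = remove_low_freq(0.01 * x + 0.005 * y + 0.2 * np.sin(2 * np.pi * x / 8))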

  20. Paleointensity in ignimbrites and other volcaniclastic flows

    NASA Astrophysics Data System (ADS)

    Bowles, J. A.; Gee, J. S.; Jackson, M. J.

    2011-12-01

    Ash flow tuffs (ignimbrites) are common worldwide, frequently contain fine-grained magnetite hosted in the glassy matrix, and often have high-quality 40Ar/39Ar ages. This makes them attractive candidates for paleointensity studies, potentially allowing for a substantial increase in the number of well-dated paleointensity estimates. However, the timing and nature of remanence acquisition in ignimbrites are not sufficiently understood to allow confident interpretation of paleointensity data from ash flows. The remanence acquisition may be a complex function of mineralogy and thermal history. Emplacement conditions and post-emplacement processes vary considerably between and within tuffs and may potentially affect the ability to recover ancient field intensity information. To better understand the relevant magnetic recording assemblage(s) and remanence acquisition processes, we have collected samples from two well-documented historical ignimbrites, the 1980 ash flows at Mt. St. Helens (MSH), Washington, and the 1912 flows from Mt. Katmai in the Valley of Ten Thousand Smokes (VTTS), Alaska. Data from these relatively small, poorly- to non-welded historical flows are compared to the more extensive and more densely welded 0.76 Ma Bishop Tuff. This sample set enables us to better understand the geologic processes that destroy or preserve paleointensity information so that samples from ancient tuffs may be selected with care. Thellier-type paleointensity experiments carried out on pumice blocks sampled from the MSH flows resulted in a paleointensity of 55.8 ± 0.8 μT (1 standard error). This compares favorably with the actual value of 56.0 μT. Excluded specimens of poor technical quality were dominantly from sites that were either emplaced at low temperature (<350°C) or were subject to post-emplacement hydrothermal alteration. The VTTS experienced much more widespread low-temperature hydrothermal activity than did MSH. Pumice-bearing ash matrix samples from this locality are characterized by at least two magnetic phases, one of which appears to carry a chemical remanent magnetization. Paleointensities derived from the second phase give results that vary widely but which may be correlated with the degree of hydrothermal alteration or hydration. Preliminary data from the Bishop Tuff suggest that vapor-phase alteration at high (>600°C) temperatures does not corrupt the paleointensity signal, and additional data will be presented which explore this more fully.

  1. Development and Flight Testing of an Adaptable Vehicle Health-Monitoring Architecture

    NASA Technical Reports Server (NTRS)

    Woodard, Stanley E.; Coffey, Neil C.; Gonzalez, Guillermo A.; Woodman, Keith L.; Weathered, Brenton W.; Rollins, Courtney H.; Taylor, B. Douglas; Brett, Rube R.

    2003-01-01

    Development and testing of an adaptable wireless health-monitoring architecture for a vehicle fleet is presented. It has three operational levels: one or more remote data acquisition units located throughout the vehicle; a command and control unit located within the vehicle; and a terminal collection unit to collect analysis results from all vehicles. Each level is capable of performing autonomous analysis with a trained adaptable expert system. The remote data acquisition unit has an eight-channel programmable digital interface that allows the user discretion in choosing the type of sensors, the number of sensors, the sensor sampling rate, and the sampling duration for each sensor. The architecture provides a framework for a tributary analysis. All measurements at the lowest operational level are reduced to provide analysis results necessary to gauge changes from established baselines. These are then collected at the next level to identify any global trends or common features from the prior level. This process is repeated until the results are reduced at the highest operational level. In this framework, only analysis results are forwarded to the next level, to reduce telemetry congestion. The system's remote data acquisition hardware and non-analysis software have been flight-tested on the NASA Langley B757's main landing gear.

  2. A simple encoding method for Sigma-Delta ADC based biopotential acquisition systems.

    PubMed

    Guerrero, Federico N; Spinelli, Enrique M

    2017-10-01

    Sigma-Delta analogue-to-digital converters allow acquiring the full dynamic range of biomedical signals at the electrodes, resulting in less complex hardware and increased measurement robustness. However, the increased data size per sample (typically 24 bits) demands the transmission of extremely large volumes of data across the isolation barrier, thus increasing power consumption on the patient side. This problem is accentuated when a large number of channels is used, as in current 128-256 electrode biopotential acquisition systems, which usually opt for an optical fibre link to the computer. An analogous problem occurs for simpler low-power acquisition platforms that transmit data through a wireless link to a computing platform. In this paper, a low-complexity encoding method is presented that decreases sample data size without losses while preserving the full DC-coupled signal. The method achieved a 2.3 average compression ratio evaluated over an ECG and EMG signal bank acquired with equipment based on Sigma-Delta converters. It demands a very low processing load: a C language implementation is presented that resulted in an average execution of 110 clock cycles on an 8-bit microcontroller.
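
    The abstract does not specify the encoding scheme itself, so the sketch below shows one generic low-complexity lossless approach of the same flavour (first differences followed by a variable-length byte code), which exploits the slow sample-to-sample variation of DC-coupled biopotential signals. It is illustrative only and is not the authors' method.

        def encode(samples):
            """samples: iterable of signed 24-bit integers; returns a compact byte string."""
            out = bytearray()
            prev = 0
            for s in samples:
                d = s - prev
                prev = s
                if -64 <= d < 64:
                    out.append(d & 0x7F)                          # small difference: 1 byte, top bit clear
                else:
                    out.append(0x80)                              # escape marker, then 4-byte signed difference
                    out += d.to_bytes(4, "big", signed=True)
            return bytes(out)

        print(len(encode([100000, 100003, 100001, 99950, 50000])))   # 13 bytes vs 15 bytes of raw 24-bit samples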

  3. Generalized analog thresholding for spike acquisition at ultralow sampling rates

    PubMed Central

    He, Bryan D.; Wein, Alex; Varshney, Lav R.; Kusuma, Julius; Richardson, Andrew G.

    2015-01-01

    Efficient spike acquisition techniques are needed to bridge the divide from creating large multielectrode arrays (MEA) to achieving whole-cortex electrophysiology. In this paper, we introduce generalized analog thresholding (gAT), which achieves millisecond temporal resolution with sampling rates as low as 10 Hz. Consider the torrent of data from a single 1,000-channel MEA, which would generate more than 3 GB/min using standard 30-kHz Nyquist sampling. Recent neural signal processing methods based on compressive sensing still require Nyquist sampling as a first step and use iterative methods to reconstruct spikes. Analog thresholding (AT) remains the best existing alternative, where spike waveforms are passed through an analog comparator and sampled at 1 kHz, with instant spike reconstruction. By generalizing AT, the new method reduces sampling rates another order of magnitude, detects more than one spike per interval, and reconstructs spike width. Unlike compressive sensing, the new method reveals a simple closed-form solution to achieve instant (noniterative) spike reconstruction. The base method is already robust to hardware nonidealities, including realistic quantization error and integration noise. Because it achieves these considerable specifications using hardware-friendly components like integrators and comparators, generalized AT could translate large-scale MEAs into implantable devices for scientific investigation and medical technology. PMID:25904712

  4. Combined Acquisition/Processing For Data Reduction

    NASA Astrophysics Data System (ADS)

    Kruger, Robert A.

    1982-01-01

    Digital image processing systems necessarily consist of three components: acquisition, storage/retrieval, and processing. The acquisition component requires the greatest data-handling rates. By coupling the acquisition with some online hardwired processing, data rates and capacities for short-term storage can be reduced. Furthermore, long-term storage requirements can be reduced further by appropriate processing and editing of image data contained in short-term memory. The net result could be reduced performance requirements for mass storage, processing and communication systems. Reduced amounts of data also should speed later data analysis and diagnostic decision making.

  5. High-Speed Data Acquisition and Digital Signal Processing System for PET Imaging Techniques Applied to Mammography

    NASA Astrophysics Data System (ADS)

    Martinez, J. D.; Benlloch, J. M.; Cerda, J.; Lerche, Ch. W.; Pavon, N.; Sebastia, A.

    2004-06-01

    This paper is framed within the Positron Emission Mammography (PEM) project, whose aim is to develop an innovative gamma ray sensor for early breast cancer diagnosis. Currently, breast cancer is detected using low-energy X-ray screening. However, functional imaging techniques such as PET/FDG could be employed to detect breast cancer and track disease changes with greater sensitivity. Furthermore, a small and less expensive PET camera can be utilized, minimizing the main problems of whole-body PET. To accomplish these objectives, we are developing a new gamma ray sensor based on a newly released photodetector. However, a dedicated PEM detector requires an adequate data acquisition (DAQ) and processing system. The characterization of gamma events needs a free-running analog-to-digital converter (ADC) with sampling rates of more than 50 Ms/s and must achieve event count rates of up to 10 MHz. Moreover, comprehensive data processing must be carried out to obtain the event parameters necessary for performing the image reconstruction. A new-generation digital signal processor (DSP) has been used to comply with these requirements. This device enables us to manage the DAQ system at up to 80 Ms/s and to execute intensive calculations on the detector signals. This paper describes our DAQ and processing architecture, whose main features are: very high-speed data conversion, multichannel synchronized acquisition with zero dead time, a digital triggering scheme, and high data throughput with extensive optimization of the signal processing algorithms.
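
    As an illustration of the kind of digital triggering and event-parameter extraction referred to above, the sketch below applies a threshold trigger to a digitized pulse and computes a baseline-corrected arrival index and pulse integral (a proxy for energy). The waveform, threshold and window lengths are placeholders, not the PEM system's actual processing.

        import numpy as np

        def extract_event(waveform, threshold=50.0, baseline_len=32, window=64):
            """Return (trigger index, pulse integral) for the first sample above threshold, or None."""
            baseline = waveform[:baseline_len].mean()
            above = np.nonzero(waveform - baseline > threshold)[0]
            if above.size == 0:
                return None
            t0 = int(above[0])
            energy = float((waveform[t0:t0 + window] - baseline).sum())
            return t0, energy

        # Simulated digitized pulse riding on a noisy baseline
        rng = np.random.default_rng(2)
        wf = rng.normal(100.0, 2.0, 256)
        wf[120:180] += 300.0 * np.exp(-np.arange(60) / 15.0)
        print(extract_event(wf))                                  # trigger near sample 120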

  6. Three-year-olds obey the sample size principle of induction: the influence of evidence presentation and sample size disparity on young children's generalizations.

    PubMed

    Lawson, Chris A

    2014-07-01

    Three experiments with 81 3-year-olds (M = 3.62 years) examined the conditions that enable young children to use the sample size principle (SSP) of induction, the inductive rule that facilitates generalizations from large rather than small samples of evidence. In Experiment 1, children exhibited the SSP when exemplars were presented sequentially but not when exemplars were presented simultaneously. Results from Experiment 3 suggest that the advantage of sequential presentation is not due to the additional time to process the available input from the two samples but instead may be linked to better memory for specific individuals in the large sample. In addition, findings from Experiments 1 and 2 suggest that adherence to the SSP is mediated by the disparity between presented samples. Overall, these results reveal that the SSP appears early in development and is guided by basic cognitive processes triggered during the acquisition of input. Copyright © 2013 Elsevier Inc. All rights reserved.

  7. MSL Chemistry and Mineralogy X-Ray Diffraction X-Ray Fluorescence (CheMin) Instrument

    NASA Technical Reports Server (NTRS)

    Zimmerman, Wayne; Blake, Dave; Harris, William; Morookian, John Michael; Randall, Dave; Reder, Leonard J.; Sarrazin, Phillipe

    2013-01-01

    This paper provides an overview of the Mars Science Laboratory (MSL) Chemistry and Mineralogy X-ray Diffraction (XRD) / X-ray Fluorescence (XRF) (CheMin) instrument, an element of the landed Curiosity rover payload, which landed on Mars in August of 2012. The scientific goal of the MSL mission is to explore and quantitatively assess regions in Gale Crater as a potential habitat for life, past or present. The CheMin instrument will receive Martian rock and soil samples from the MSL Sample Acquisition/Sample Processing and Handling (SA/SPaH) system and process them utilizing X-ray spectroscopy methods to determine mineral composition. The CheMin instrument will analyze Martian soil and rocks to enable scientists to investigate geophysical processes occurring on Mars. The CheMin science objectives and proposed surface operations are described, along with the CheMin hardware, with an emphasis on the system engineering challenges associated with developing such a complex instrument.

  8. Blueprint for Acquisition Reform, Version 3.0

    DTIC Science & Technology

    2008-07-01

    Excerpted fragments: the Blueprint "represents a substantial and immediate step forward in establishing the Coast Guard as a model mid-sized federal agency for acquisition processes"; "The Coast Guard must become the model for mid-sized Federal agency acquisition"; the acquisition process follows the DoD 5000 model (CG Major Systems Acquisition Manual); Deepwater Program Executive Officer (PEO): System-of-Systems, performance-based.

  9. All-digital GPS receiver mechanization

    NASA Astrophysics Data System (ADS)

    Ould, P. C.; van Wechel, R. J.

    The paper describes the all-digital baseband correlation processing of GPS signals, which is characterized by (1) a potential for improved antijamming performance, (2) fast acquisition by a digital matched filter, (3) reduction of adjustment, (4) increased system reliability, and (5) provision of a basis for realizing a high degree of VLSI integration in the development of small, economical GPS sets. The basic technical approach consists of a broadband fixed-tuned RF converter followed by a digitizer; a digital-matched-filter acquisition section; phase- and delay-lock tracking via baseband digital correlation; software acquisition logic and loop filter implementation; and all-digital implementation of the feedback numerically controlled oscillators and code generator. Broadband in-phase and quadrature tracking is performed by an arctangent angle detector followed by a phase-unwrapping algorithm that eliminates false locks induced by sampling and data bit transitions, and yields a wide pull-in frequency range approaching one-fourth of the loop iteration frequency.
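
    The tracking described above uses an arctangent angle detector followed by a phase-unwrapping algorithm. The sketch below shows that detector on simulated in-phase/quadrature samples; the signal model is a placeholder, not a GPS waveform or the paper's loop implementation.

        import numpy as np

        # Simulated baseband in-phase/quadrature samples with a slowly drifting carrier phase
        n = 2000
        true_phase = np.cumsum(np.full(n, 0.05))                  # ~0.05 rad/sample drift
        rng = np.random.default_rng(3)
        i_samples = np.cos(true_phase) + 0.05 * rng.normal(size=n)
        q_samples = np.sin(true_phase) + 0.05 * rng.normal(size=n)

        # Arctangent angle detector (wrapped to [-pi, pi]) followed by phase unwrapping,
        # which removes the 2*pi jumps that would otherwise cause false locks
        wrapped = np.arctan2(q_samples, i_samples)
        unwrapped = np.unwrap(wrapped)
        print("final phase estimate vs truth:", unwrapped[-1], true_phase[-1])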

  10. Development of an Arduino-based electrical impedance tomography system with application to dam internal erosion detection

    NASA Astrophysics Data System (ADS)

    Masi, Matteo; Ferdos, Farzad; Losito, Gabriella; Solari, Luca

    2016-04-01

    Electrical Impedance Tomography (EIT) is a technique for the imaging of the electrical properties of conductive materials. In EIT, the spatial distribution of the electrical resistivity or electrical conductivity within a domain is reconstructed using measurements made with electrodes placed at the boundaries of the domain. Data acquisition is typically performed by applying an electrical current to the object under investigation using a set of electrodes and measuring the developed voltage between the other electrodes. The tomographic image is then obtained using an inversion algorithm. This work describes the implementation of a simple and low-cost 3D EIT measurement system suitable for laboratory-scale studies. The system was specifically developed for the time-lapse imaging of soil samples subjected to erosion processes during laboratory tests. The tests reproduce the process of internal erosion of soil particles by water flow within a granular medium; this process is one of the most common causes of failure of earthen levees and embankment dams. The measurements had strict requirements of speed and accuracy due to the varying time scale and magnitude of these processes. The developed EIT system consists of a PC which controls I/O cards (multiplexers) through the Arduino micro-controller, an external current generator, a digital acquisition device (DAQ), a power supply and the electrodes. The ease of programming of the Arduino interface greatly helped the implementation of custom acquisition software, increasing the overall flexibility of the system and enabling the creation of specific acquisition schemes and configurations. The system works with a multi-electrode configuration of up to 48 channels, but it was designed to be upgraded to an arbitrarily large number of electrodes by connecting additional multiplexer cards (> 96 electrodes). The acquisition was optimized for multi-channel measurements so that the overall acquisition time is dramatically reduced compared to single-channel instrumentation. The accuracy and operation were tested under different conditions. The results from preliminary tests show that the system is able to clearly identify objects discriminated by different resistivity. Furthermore, measurements carried out during internal erosion simulations demonstrate that even small variations in the electrical resistivity can be captured and that these changes can be related to the erosion processes.
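
    As a sketch of the kind of multi-electrode measurement schedule such a multiplexer-based system cycles through, the function below generates a standard adjacent (neighbouring) injection/measurement pattern for a ring of electrodes. The protocol choice and electrode count are illustrative assumptions; the paper does not state which injection pattern was used.

        def adjacent_pattern(n_electrodes=48):
            """Yield (current+, current-, voltage+, voltage-) electrode indices for an
            adjacent-drive, adjacent-measurement protocol on a ring of electrodes."""
            for inj in range(n_electrodes):
                a, b = inj, (inj + 1) % n_electrodes
                for meas in range(n_electrodes):
                    m, n = meas, (meas + 1) % n_electrodes
                    if len({a, b, m, n}) == 4:                    # skip pairs touching the injection electrodes
                        yield a, b, m, n

        schedule = list(adjacent_pattern())
        print(len(schedule), "four-terminal measurements per frame")   # 48 * 45 = 2160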

  11. An Acquisition Guide for Executives

    EPA Pesticide Factsheets

    This guide covers the following subjects: What is Acquisition?; Purpose and Primary Functions of the Agency's Acquisition System; Key Organizations in Acquisitions; Legal Framework; Key Players in Acquisitions; Acquisition Process; and Acquisition Thresholds.

  12. Unmanned Maritime Systems Incremental Acquisition Approach

    DTIC Science & Technology

    2016-12-01

    We find that current UMS acquisitions are utilizing previous acquisition reforms, but could benefit from additional contractor peer competition and peer review. Additional cost and schedule benefits could result from contractor competition during build processes in each incremental process.

  13. 48 CFR 36.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 1 2013-10-01 2013-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 36.602-5 Section 36.602-5 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  14. 48 CFR 36.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 1 2014-10-01 2014-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 36.602-5 Section 36.602-5 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  15. 48 CFR 36.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 1 2012-10-01 2012-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 36.602-5 Section 36.602-5 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  16. 48 CFR 36.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 36.602-5 Section 36.602-5 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  17. 48 CFR 36.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 36.602-5 Section 36.602-5 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  18. Second Language Acquisition: Possible Insights from Studies on How Birds Acquire Song.

    ERIC Educational Resources Information Center

    Neapolitan, Denise M.; And Others

    1988-01-01

    Reviews research that demonstrates parallels between general linguistic and cognitive processes in human language acquisition and avian acquisition of song and discusses how such research may provide new insights into the processes of second-language acquisition. (Author/CB)

  19. Mars Science Laboratory CHIMRA: A Device for Processing Powdered Martian Samples

    NASA Technical Reports Server (NTRS)

    Sunshine, Daniel

    2010-01-01

    The CHIMRA is an extraterrestrial sample acquisition and processing device for the Mars Science Laboratory that emphasizes robustness and adaptability through design configuration. This work reviews the guidelines utilized to invent the initial CHIMRA and the strategy employed in advancing the design; these principles will be discussed in relation to both the final CHIMRA design and similar future devices. The computational synthesis necessary to mature a boxed-in impact-generating mechanism will be presented alongside a detailed mechanism description. Results from the development testing required to advance the design for a highly-loaded, long-life and high-speed bearing application will be presented. Lessons learned during the assembly and testing of this subsystem as well as results and lessons from the sample-handling development test program will be reviewed.

  20. Dual-wavelength OR-PAM with compressed sensing for cell tracking in a 3D cell culture system

    NASA Astrophysics Data System (ADS)

    Huang, Rou-Xuan; Fu, Ying; Liu, Wang; Ma, Yu-Ting; Hsieh, Bao-Yu; Chen, Shu-Ching; Sun, Mingjian; Li, Pai-Chi

    2018-02-01

    Monitoring the dynamic interactions of T cells migrating toward a tumor helps in understanding how cancer immunotherapy works. Optical-resolution photoacoustic microscopy (OR-PAM) provides not only high spatial resolution but also deeper penetration than conventional optical microscopy. With the aid of exogenous contrast agents, dual-wavelength OR-PAM can be applied to map the distribution of CD8+ cytotoxic T lymphocytes (CTLs) with gold nanospheres (AuNS) under 523 nm laser irradiation and Hepta1-6 tumor spheres with indocyanine green (ICG) under 800 nm irradiation. However, at a 1 kHz laser pulse repetition frequency, it takes approximately 20 minutes to acquire a full sample volume of 160 × 160 × 150 μm3. To increase the imaging rate, we propose a random non-uniform sparse sampling mechanism to achieve fast sparse photoacoustic data acquisition. The image recovery process is formulated as low-rank matrix recovery (LRMR) based on compressed sensing (CS) theory. We show that the image can be stably recovered from significantly fewer measurements by solving a nuclear-norm minimization problem while maintaining image quality. In this study, we use dual-wavelength OR-PAM with CS to visualize T cell trafficking in a 3D culture system with higher temporal resolution. Data acquisition time is reduced by 40% for this sample volume at a sampling density of 0.5. The imaging system reveals the potential to understand dynamic cellular processes for preclinical screening of anti-cancer drugs.
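
    The low-rank recovery step described above can be illustrated with a generic singular-value-thresholding sketch (NumPy). This is not the authors' solver: the alternation between singular-value shrinkage and data consistency, and every parameter value, are assumptions chosen only to show the idea behind nuclear-norm-based recovery from sparse samples.

      import numpy as np

      def lowrank_inpaint(Y, mask, tau=0.5, n_iter=200):
          """Recover a matrix from the entries selected by `mask` by alternating
          singular-value soft-thresholding (promotes low rank) with data consistency."""
          X = np.zeros_like(Y, dtype=float)
          for _ in range(n_iter):
              U, s, Vt = np.linalg.svd(X, full_matrices=False)
              X = (U * np.maximum(s - tau, 0.0)) @ Vt   # shrink singular values
              X[mask] = Y[mask]                         # keep the sampled entries exact
          return X

      # Toy usage: a rank-2 "image" sampled at 50% density.
      rng = np.random.default_rng(0)
      truth = rng.normal(size=(64, 2)) @ rng.normal(size=(2, 64))
      mask = rng.random(truth.shape) < 0.5
      recon = lowrank_inpaint(truth * mask, mask)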

  1. Cognitive processes in children's reading and attention: the role of working memory, divided attention, and response inhibition.

    PubMed

    Savage, Robert; Cornish, Kim; Manly, Tom; Hollis, Chris

    2006-08-01

    Children experiencing attention difficulties have documented cognitive deficits in working memory (WM), response inhibition and dual tasks. Recent evidence suggests however that these same cognitive processes are also closely associated with reading acquisition. This paper therefore explores whether these variables predicted attention difficulties or reading among 123 children with and without significant attention problems sampled from the school population. Children were screened using current WM and attention task measures. Three factors explained variance in WM and attention tasks. Response inhibition tasks loaded mainly with central executive measures, but a dual processing task loaded with the visual-spatial WM measures. Phonological loop measures loaded independently of attention measures. After controls for age, IQ and attention-group membership, phonological loop and 'central processing' measures both predicted reading ability. A 'visual memory/dual-task' factor predicted attention group membership after controls for age, IQ and reading ability. Results thus suggest that some of the processes previously assumed to be predictive of attention problems may reflect processes involved in reading acquisition. Visual memory and dual-task functioning are, however, purer indices of cognitive difficulty in children experiencing attention problems.

  2. Data processing has major impact on the outcome of quantitative label-free LC-MS analysis.

    PubMed

    Chawade, Aakash; Sandin, Marianne; Teleman, Johan; Malmström, Johan; Levander, Fredrik

    2015-02-06

    High-throughput multiplexed protein quantification using mass spectrometry is steadily increasing in popularity, with the two major techniques being data-dependent acquisition (DDA) and targeted acquisition using selected reaction monitoring (SRM). However, both techniques involve extensive data processing, which can be performed by a multitude of different software solutions. Analysis of quantitative LC-MS/MS data is mainly performed in three major steps: processing of raw data, normalization, and statistical analysis. To evaluate the impact of data processing steps, we developed two new benchmark data sets, one each for DDA and SRM, with samples consisting of a long-range dilution series of synthetic peptides spiked in a total cell protein digest. The generated data were processed by eight different software workflows and three postprocessing steps. The results show that the choice of the raw data processing software and the postprocessing steps play an important role in the final outcome. Also, the linear dynamic range of the DDA data could be extended by an order of magnitude through feature alignment and a charge state merging algorithm proposed here. Furthermore, the benchmark data sets are made publicly available for further benchmarking and software developments.

  3. The Defense Systems Acquisition and Review Council

    DTIC Science & Technology

    1976-09-15

    The Defense Systems Acquisition and Review Council: a study of areas of consideration affecting the functions and process of defense major systems acquisition. The Defense Systems Acquisition Review Council (DSARC) was created to assume ...

  4. Global analysis of microscopic fluorescence lifetime images using spectral segmentation and a digital micromirror spatial illuminator.

    PubMed

    Bednarkiewicz, Artur; Whelan, Maurice P

    2008-01-01

    Fluorescence lifetime imaging (FLIM) is very demanding from a technical and computational perspective, and the output is usually a compromise between acquisition/processing time and data accuracy and precision. We present a new approach to acquisition, analysis, and reconstruction of microscopic FLIM images by employing a digital micromirror device (DMD) as a spatial illuminator. In the first step, the whole field fluorescence image is collected by a color charge-coupled device (CCD) camera. Further qualitative spectral analysis and sample segmentation are performed to spatially distinguish between spectrally different regions on the sample. Next, the fluorescence of the sample is excited segment by segment, and fluorescence lifetimes are acquired with a photon counting technique. FLIM image reconstruction is performed by either raster scanning the sample or by directly accessing specific regions of interest. The unique features of the DMD illuminator allow the rapid on-line measurement of global good initial parameters (GIP), which are supplied to the first iteration of the fitting algorithm. As a consequence, a decrease of the computation time required to obtain a satisfactory quality-of-fit is achieved without compromising the accuracy and precision of the lifetime measurements.

  5. A research of a high precision multichannel data acquisition system

    NASA Astrophysics Data System (ADS)

    Zhong, Ling-na; Tang, Xiao-ping; Yan, Wei

    2013-08-01

    The output signals of the focusing system in lithography are analog. To convert these analog signals into digital signals, which are more flexible and stable to process, a suitable data acquisition system is required. The resolution of data acquisition, to some extent, affects the accuracy of focusing. In this article, we first compared the performance of the various kinds of analog-to-digital converters (ADCs) currently available on the market. Combined with the specific requirements (sampling frequency, conversion accuracy, number of channels, etc.) and the characteristics (polarization, amplitude range, etc.) of the analog signals, the ADC model to be used as the core chip in our hardware design was determined. On this basis, we chose the other chips needed in the hardware circuit to match the ADC, and the overall hardware design was obtained. The data acquisition system was validated through experiments, which demonstrate that the system can effectively realize high-resolution conversion of the multi-channel analog signals and give accurate focusing information in lithography.

  6. Hardware Implementation of Multiple Fan Beam Projection Technique in Optical Fibre Process Tomography

    PubMed Central

    Rahim, Ruzairi Abdul; Fazalul Rahiman, Mohd Hafiz; Leong, Lai Chen; Chan, Kok San; Pang, Jon Fea

    2008-01-01

    The main objective of this project is to implement the multiple fan beam projection technique using optical fibre sensors with the aim of achieving a high data acquisition rate. The multiple fan beam projection technique here is defined as allowing more than one emitter to transmit light at the same time using the switch-mode fan beam method. For the thirty-two pairs of sensors used, the 2-projection and 4-projection techniques are investigated. Sixteen sets of projections complete one frame of light emission for the 2-projection technique, while eight sets of projections complete one frame for the 4-projection technique. To facilitate the data acquisition process, a PIC microcontroller and a sample-and-hold circuit are used. This paper summarizes the hardware configuration and design for this project. PMID:27879885
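
    A minimal Python sketch of the emitter grouping implied by the figures above (sixteen sets per frame for the 2-projection technique, eight for the 4-projection technique, with 32 emitters) follows. The assumption that simultaneously firing emitters are equally spaced around the sensing ring is illustrative, not a detail taken from the paper.

      def projection_sets(n_emitters=32, emitters_per_set=2):
          """Group emitters so that `emitters_per_set` equally spaced emitters fire together."""
          spacing = n_emitters // emitters_per_set
          return [list(range(start, n_emitters, spacing)) for start in range(spacing)]

      two_proj = projection_sets(32, 2)    # 16 sets of 2 emitters -> one frame
      four_proj = projection_sets(32, 4)   # 8 sets of 4 emitters  -> one frame
      print(len(two_proj), len(four_proj))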

  7. Design of extensible meteorological data acquisition system based on FPGA

    NASA Astrophysics Data System (ADS)

    Zhang, Wen; Liu, Yin-hua; Zhang, Hui-jun; Li, Xiao-hui

    2015-02-01

    In order to compensate for the tropospheric refraction error generated in satellite navigation and positioning, temperature, humidity and air pressure have to be used in the relevant models to calculate the value of this error. The FPGA XC6SLX16 is used as the core processor, while the integrated silicon pressure sensor MPX4115A and the digital temperature-humidity sensor SHT75 are used as the basic meteorological parameter detection devices. The core processor controls the real-time sampling of the ADC AD7608 and acquires the serial output data of the SHT75. The data are stored in the BRAM of the XC6SLX16 and used to generate standard meteorological parameters in NMEA format. The whole design is based on the Altium hardware platform and the ISE software platform. The system is described in VHDL and schematic diagrams to realize correct detection of temperature, humidity and air pressure. The 8-channel synchronous sampling of the AD7608 and the programmable external resources of the FPGA lay the foundation for adding further analog or digital meteorological signals. The designed meteorological data acquisition system features low cost, high performance and easy expansion.
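
    The NMEA-format output mentioned above can be illustrated with the standard NMEA 0183 framing, where the checksum is the XOR of the characters between '$' and '*'. The Python sketch below is a software illustration only; the sentence identifier and field order are hypothetical and do not reproduce the format generated by the FPGA design.

      def nmea_sentence(payload: str) -> str:
          """Wrap a payload in NMEA 0183 framing: '$' + payload + '*' + XOR checksum."""
          checksum = 0
          for ch in payload:
              checksum ^= ord(ch)
          return f"${payload}*{checksum:02X}\r\n"

      # Hypothetical meteorological payload (talker/sentence ID and fields are illustrative).
      print(nmea_sentence("WXMET,23.5,C,45.2,H,1013.2,P"))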

  8. Validation of nonlinear interferometric vibrational imaging as a molecular OCT technique by the use of Raman microscopy

    NASA Astrophysics Data System (ADS)

    Benalcazar, Wladimir A.; Jiang, Zhi; Marks, Daniel L.; Geddes, Joseph B.; Boppart, Stephen A.

    2009-02-01

    We validate a molecular imaging technique called Nonlinear Interferometric Vibrational Imaging (NIVI) by comparing vibrational spectra with those acquired from Raman microscopy. This broadband coherent anti-Stokes Raman scattering (CARS) technique uses heterodyne detection and OCT acquisition and design principles to interfere a CARS signal generated by a sample with a local oscillator signal generated separately by a four-wave mixing process. These are mixed and demodulated by spectral interferometry. Its confocal configuration allows the acquisition of 3D images based on endogenous molecular signatures. Images from both phantom and mammary tissues have been acquired by this instrument and its spectrum is compared with its spontaneous Raman signatures.

  9. Authentication of Closely Related Fish and Derived Fish Products Using Tandem Mass Spectrometry and Spectral Library Matching.

    PubMed

    Nessen, Merel A; van der Zwaan, Dennis J; Grevers, Sander; Dalebout, Hans; Staats, Martijn; Kok, Esther; Palmblad, Magnus

    2016-05-11

    Proteomics methodology has seen increased application in food authentication, including tandem mass spectrometry of targeted species-specific peptides in raw, processed, or mixed food products. We have previously described an alternative principle that uses untargeted data acquisition and spectral library matching, essentially spectral counting, to compare and identify samples without the need for genomic sequence information in food species populations. Here, we present an interlaboratory comparison demonstrating how a method based on this principle performs in a realistic context. We also increasingly challenge the method by using data from different types of mass spectrometers, by trying to distinguish closely related and commercially important flatfish, and by analyzing heavily contaminated samples. The method was found to be robust in different laboratories, and 94-97% of the analyzed samples were correctly identified, including all processed and contaminated samples.
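
    A minimal Python sketch of the spectral-counting idea referenced above (match each query spectrum against per-species libraries and count hits) follows. The cosine-similarity measure, the threshold, and the data layout are assumptions made for illustration, not the authors' spectral library matching pipeline.

      import numpy as np

      def cosine(a, b):
          return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

      def identify_species(sample_spectra, libraries, threshold=0.8):
          """Count how many query spectra match each species library above a
          similarity threshold; the species with the most matches is reported."""
          counts = {
              species: sum(1 for s in sample_spectra
                           if max(cosine(s, ref) for ref in refs) >= threshold)
              for species, refs in libraries.items()
          }
          return max(counts, key=counts.get), counts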

  10. Towards the low-dose characterization of beam sensitive nanostructures via implementation of sparse image acquisition in scanning transmission electron microscopy

    NASA Astrophysics Data System (ADS)

    Hwang, Sunghwan; Han, Chang Wan; Venkatakrishnan, Singanallur V.; Bouman, Charles A.; Ortalan, Volkan

    2017-04-01

    Scanning transmission electron microscopy (STEM) has been successfully utilized to investigate the atomic structure and chemistry of materials with atomic resolution. However, STEM's focused electron probe with a high current density causes electron-beam damage, including radiolysis and knock-on damage, when the focused probe is exposed onto electron-beam-sensitive materials. Therefore, it is highly desirable to decrease the electron dose used in STEM for the investigation of biological/organic molecules, soft materials and nanomaterials in general. With the recent emergence of novel sparse signal processing theories, such as compressive sensing and model-based iterative reconstruction, possibilities have opened up for operating STEM under a sparse acquisition scheme to reduce the electron dose. In this paper, we report our recent approach to implementing sparse acquisition in STEM mode, executed by a random sparse scan and a signal processing algorithm called model-based iterative reconstruction (MBIR). In this method, a small portion, such as 5%, of randomly chosen unit sampling areas (i.e., electron probe positions), which correspond to pixels of a STEM image, within the region of interest (ROI) of the specimen is scanned with an electron probe to obtain a sparse image. Sparse images are then reconstructed using the MBIR inpainting algorithm to produce an image of the specimen at the original resolution that is consistent with an image obtained using conventional scanning methods. Experimental results for sampling down to 5% show consistency with the full STEM image acquired by the conventional scanning method. Although practical limitations of conventional STEM instruments, such as internal delays of the STEM control electronics and the continuous electron gun emission, currently hinder achieving the full potential of sparse-acquisition STEM in realizing the low-dose imaging conditions required for the investigation of beam-sensitive materials, the results obtained in our experiments demonstrate that sparse-acquisition STEM imaging is potentially capable of reducing the electron dose by at least 20 times, expanding the frontiers of our characterization capabilities for the investigation of biological/organic molecules, polymers, soft materials and nanostructures in general.
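
    The sparse-scan idea can be illustrated with a short Python sketch: draw a random 5% mask of probe positions and fill in the unmeasured pixels. A simple diffusion-style fill stands in for MBIR, which is a far more sophisticated model-based algorithm; everything in the sketch is an illustrative assumption.

      import numpy as np

      def sparse_mask(shape, fraction=0.05, seed=0):
          rng = np.random.default_rng(seed)
          mask = np.zeros(shape, dtype=bool)
          idx = rng.choice(mask.size, size=int(fraction * mask.size), replace=False)
          mask.flat[idx] = True
          return mask

      def diffusion_inpaint(sparse_img, mask, n_iter=500):
          """Fill unmeasured pixels by repeated neighbour averaging (a crude stand-in
          for model-based iterative reconstruction), keeping measured pixels fixed."""
          img = np.where(mask, sparse_img, sparse_img[mask].mean())
          for _ in range(n_iter):
              smoothed = (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
                          np.roll(img, 1, 1) + np.roll(img, -1, 1)) / 4.0
              img = np.where(mask, sparse_img, smoothed)
          return img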

  11. The Classification of Tongue Colors with Standardized Acquisition and ICC Profile Correction in Traditional Chinese Medicine

    PubMed Central

    Tu, Li-ping; Chen, Jing-bo; Hu, Xiao-juan; Zhang, Zhi-feng

    2016-01-01

    Background and Goal. The application of digital image processing techniques and machine learning methods to tongue image classification in Traditional Chinese Medicine (TCM) has been widely studied. However, the outcomes are difficult to generalize because of a lack of color reproducibility and image standardization. Our study explores tongue color classification with a standardized tongue image acquisition process and color correction. Methods. Three traditional Chinese medical experts were chosen to identify the selected tongue pictures, taken by the TDA-1 tongue imaging device in TIFF format and corrected with an ICC profile. We then compare the mean L*a*b* values of the different tongue colors and evaluate tongue color classification by machine learning methods. Results. The L*a*b* values of the five tongue colors are statistically different. The random forest method performs better than SVM in classification. The SMOTE algorithm can increase classification accuracy by addressing the imbalance of the color samples. Conclusions. Given standardized tongue acquisition and color reproduction, preliminary objectification of tongue color classification in Traditional Chinese Medicine (TCM) is feasible. PMID:28050555
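
    A minimal Python sketch of the classification step (SMOTE oversampling followed by a random forest on mean L*a*b* features) is shown below, assuming scikit-learn and imbalanced-learn are available. The synthetic data and the class labels are placeholders, not the study's measurements.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from imblearn.over_sampling import SMOTE

      # X: N x 3 mean L*a*b* values per tongue image; y: expert-assigned color labels.
      # Synthetic placeholders stand in for the ICC-corrected TIFF features here.
      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 3))
      y = rng.choice(["pale", "red", "dark red", "purple", "pink"],
                     size=200, p=[0.1, 0.3, 0.1, 0.1, 0.4])

      X_res, y_res = SMOTE(random_state=0).fit_resample(X, y)   # balance the color classes
      clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_res, y_res)
      print(clf.predict(X[:5]))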

  12. An operations manual for the digital data system

    NASA Technical Reports Server (NTRS)

    Jones, Michael G.

    1988-01-01

    The Digital Data System (DDS) was designed to incorporate the analog-to-digital conversion process into the initial data acquisition stage and to store the data in a digital format. This conversion is done as part of the acquisition process; consequently, the data are ready to be analyzed as soon as the test is completed. This capability permits the researcher to alter test parameters during the course of the experiment based on the information acquired in a prior portion of the test. The DDS is currently able to acquire up to 10 channels of data simultaneously. The purpose of this document is fourfold: (1) to describe the capabilities of the hardware in sufficient detail to allow the reader to determine whether the DDS is the optimum system for a particular experiment; (2) to present some of the more significant software developed to provide analyses within a short time of the completion of data acquisition; (3) to provide the reader with sample runs of major software routines to demonstrate their convenience and simple usage; and (4) to describe software that uses an FFT-box to provide a means of comparison against which the DDS can be checked.

  13. The Classification of Tongue Colors with Standardized Acquisition and ICC Profile Correction in Traditional Chinese Medicine.

    PubMed

    Qi, Zhen; Tu, Li-Ping; Chen, Jing-Bo; Hu, Xiao-Juan; Xu, Jia-Tuo; Zhang, Zhi-Feng

    2016-01-01

    Background and Goal. The application of digital image processing techniques and machine learning methods in tongue image classification in Traditional Chinese Medicine (TCM) has been widely studied nowadays. However, it is difficult for the outcomes to generalize because of lack of color reproducibility and image standardization. Our study aims at the exploration of tongue colors classification with a standardized tongue image acquisition process and color correction. Methods. Three traditional Chinese medical experts are chosen to identify the selected tongue pictures taken by the TDA-1 tongue imaging device in TIFF format through ICC profile correction. Then we compare the mean value of L*a*b* of different tongue colors and evaluate the effect of the tongue color classification by machine learning methods. Results. The L*a*b* values of the five tongue colors are statistically different. Random forest method has a better performance than SVM in classification. SMOTE algorithm can increase classification accuracy by solving the imbalance of the varied color samples. Conclusions. At the premise of standardized tongue acquisition and color reproduction, preliminary objectification of tongue color classification in Traditional Chinese Medicine (TCM) is feasible.

  14. Collecting cometary soil samples? Development of the ROSETTA sample acquisition system

    NASA Technical Reports Server (NTRS)

    Coste, P. A.; Fenzi, M.; Eiden, Michael

    1993-01-01

    In the reference scenario of the ROSETTA CNRS mission, the Sample Acquisition System is mounted on the Comet Lander. Its tasks are to acquire three kinds of cometary samples and to transfer them to the Earth Return Capsule. Operations are to be performed in vacuum and microgravity, on a probably rough and dusty surface, in a largely unknown material, at temperatures in the order of 100 K. The concept and operation of the Sample Acquisition System are presented. The design of the prototype corer and surface sampling tool, and of the equipment for testing them at cryogenic temperatures in ambient conditions and in vacuum in various materials representing cometary soil, are described. Results of recent preliminary tests performed in low temperature thermal vacuum in a cometary analog ice-dust mixture are provided.

  15. 48 CFR 15.101-1 - Tradeoff process.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Tradeoff process. 15.101-1 Section 15.101-1 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION CONTRACTING METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION Source Selection Processes and Techniques 15.101-1...

  16. Mars sample return: Site selection and sample acquisition study

    NASA Technical Reports Server (NTRS)

    Nickle, N. (Editor)

    1980-01-01

    Various vehicle and mission options were investigated for the continued exploration of Mars; the cost of a minimum sample return mission was estimated; options and concepts were synthesized into program possibilities; and recommendations for the next Mars mission were made to the Planetary Program office. Specific sites and all relevant spacecraft and ground-based data were studied in order to determine: (1) the adequacy of presently available data for identifying landing sites for a sample return mission that would assure the acquisition of material from the most important geologic provinces of Mars; (2) the degree of surface mobility required to assure sample acquisition at these sites; (3) techniques to be used in the selection and drilling of rock samples; and (4) the degree of mobility required at the two Viking sites to acquire these samples.

  17. Acquisition by Processing Theory: A Theory of Everything?

    ERIC Educational Resources Information Center

    Carroll, Susanne E.

    2004-01-01

    Truscott and Sharwood Smith (henceforth T&SS) propose a novel theory of language acquisition, "Acquisition by Processing Theory" (APT), designed to account for both first and second language acquisition, monolingual and bilingual speech perception and parsing, and speech production. This is a tall order. Like any theoretically ambitious…

  18. The Role of Influenza and Parainfluenza Infections in Nasopharyngeal Pneumococcal Acquisition Among Young Children

    PubMed Central

    Grijalva, Carlos G.; Griffin, Marie R.; Edwards, Kathryn M.; Williams, John V.; Gil, Ana I.; Verastegui, Hector; Hartinger, Stella M.; Vidal, Jorge E.; Klugman, Keith P.; Lanata, Claudio F.

    2014-01-01

    Background. Animal models suggest that influenza infection favors nasopharyngeal acquisition of pneumococci. We assessed this relationship with influenza and other respiratory viruses in young children. Methods. A case-control study was nested within a prospective cohort study of acute respiratory illness (ARI) in Andean children <3 years of age (RESPIRA-PERU study). Weekly household visits were made to identify ARI and obtain nasal swabs for viral detection using real-time reverse-transcription polymerase chain reaction. Monthly nasopharyngeal (NP) samples were obtained to assess pneumococcal colonization. We determined whether specific respiratory viral ARI episodes occurring within the interval between NP samples increased the risk of NP acquisition of new pneumococcal serotypes. Results. A total of 729 children contributed 2128 episodes of observation, including 681 pneumococcal acquisition episodes (new serotype, not detected in prior sample), 1029 nonacquisition episodes (no colonization or persistent colonization with the same serotype as the prior sample), and 418 indeterminate episodes. The risk of pneumococcal acquisition increased following influenza-ARI (adjusted odds ratio [AOR], 2.19; 95% confidence interval [CI], 1.02–4.69) and parainfluenza-ARI (AOR, 1.86; 95% CI, 1.15–3.01), when compared with episodes without ARI. Other viral infections (respiratory syncytial virus, human metapneumovirus, human rhinovirus, and adenovirus) were not associated with acquisition. Conclusions. Influenza and parainfluenza ARIs appeared to facilitate pneumococcal acquisition among young children. As acquisition increases the risk of pneumococcal diseases, these observations are pivotal in our attempts to prevent pneumococcal disease. PMID:24621951
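
    For readers unfamiliar with the reported effect sizes, the Python sketch below computes an unadjusted odds ratio and its 95% confidence interval from a 2x2 table using the standard log-odds-ratio formula. The counts are hypothetical; the study's values are adjusted estimates from a regression model and are not reproduced here.

      import math

      def odds_ratio(a, b, c, d):
          """2x2 table: a = exposed & acquired, b = exposed & not acquired,
          c = unexposed & acquired, d = unexposed & not acquired."""
          or_ = (a * d) / (b * c)
          se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
          lo = math.exp(math.log(or_) - 1.96 * se)
          hi = math.exp(math.log(or_) + 1.96 * se)
          return or_, (lo, hi)

      print(odds_ratio(30, 40, 200, 600))  # hypothetical counts, not the study data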

  19. Development and Flight Testing of an Autonomous Landing Gear Health-Monitoring System

    NASA Technical Reports Server (NTRS)

    Woodard, Stanley E.; Coffey, Neil C.; Gonzalez, Guillermo A.; Taylor, B. Douglas; Brett, Rube R.; Woodman, Keith L.; Weathered, Brenton W.; Rollins, Courtney H.

    2003-01-01

    Development and testing of an adaptable vehicle health-monitoring architecture is presented. The architecture is being developed for a fleet of vehicles. It has three operational levels: one or more remote data acquisition units located throughout the vehicle; a command and control unit located within the vehicle; and a terminal collection unit to collect analysis results from all vehicles. Each level is capable of performing autonomous analysis with a trained expert system. Communication between all levels is done with wireless radio frequency interfaces. The remote data acquisition unit has an eight-channel programmable digital interface that gives the user discretion in choosing the type of sensors, the number of sensors, and the sampling rate and sampling duration for each sensor. The architecture provides a framework for tributary analysis. All measurements at the lowest operational level are reduced to provide the analysis results necessary to gauge changes from established baselines. These are then collected at the next level to identify any global trends or common features from the prior level. This process is repeated until the results are reduced at the highest operational level. In this framework, only analysis results are forwarded to the next level, to reduce telemetry congestion. The system's remote data acquisition hardware and non-analysis software have been flight tested on the NASA Langley B757's main landing gear. The flight tests were performed to validate the wireless radio frequency communication capabilities of the system; the hardware design; command and control; software operation; and data acquisition, storage and retrieval.

  20. 29. Perimeter acquisition radar building room #318, data processing system ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    29. Perimeter acquisition radar building room #318, data processing system area; data processor maintenance and operations center, showing data processing consoles - Stanley R. Mickelsen Safeguard Complex, Perimeter Acquisition Radar Building, Limited Access Area, between Limited Access Patrol Road & Service Road A, Nekoma, Cavalier County, ND

  1. 48 CFR 801.602-78 - Processing solicitations and contract documents for legal or technical review-Veterans Health...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Processing solicitations..., Central Office (except Office of Construction and Facilities Management), the National Acquisition Center, and the Denver Acquisition and Logistics Center. 801.602-78 Section 801.602-78 Federal Acquisition...

  2. The acquisition process of musical tonal schema: implications from connectionist modeling.

    PubMed

    Matsunaga, Rie; Hartono, Pitoyo; Abe, Jun-Ichi

    2015-01-01

    Using connectionist modeling, we address fundamental questions concerning the acquisition process of musical tonal schema of listeners. Compared to models of previous studies, our connectionist model (Learning Network for Tonal Schema, LeNTS) was better equipped to fulfill three basic requirements. Specifically, LeNTS was equipped with a learning mechanism, bound by culture-general properties, and trained by sufficient melody materials. When exposed to Western music, LeNTS acquired musical 'scale' sensitivity early and 'harmony' sensitivity later. The order of acquisition of scale and harmony sensitivities shown by LeNTS was consistent with the culture-specific acquisition order shown by musically westernized children. The implications of these results for the acquisition process of a tonal schema of listeners are as follows: (a) the acquisition process may entail small and incremental changes, rather than large and stage-like changes, in corresponding neural circuits; (b) the speed of schema acquisition may mainly depend on musical experiences rather than maturation; and (c) the learning principles of schema acquisition may be culturally invariant while the acquired tonal schemas are varied with exposed culture-specific music.

  3. The acquisition process of musical tonal schema: implications from connectionist modeling

    PubMed Central

    Matsunaga, Rie; Hartono, Pitoyo; Abe, Jun-ichi

    2015-01-01

    Using connectionist modeling, we address fundamental questions concerning the acquisition process of musical tonal schema of listeners. Compared to models of previous studies, our connectionist model (Learning Network for Tonal Schema, LeNTS) was better equipped to fulfill three basic requirements. Specifically, LeNTS was equipped with a learning mechanism, bound by culture-general properties, and trained by sufficient melody materials. When exposed to Western music, LeNTS acquired musical ‘scale’ sensitivity early and ‘harmony’ sensitivity later. The order of acquisition of scale and harmony sensitivities shown by LeNTS was consistent with the culture-specific acquisition order shown by musically westernized children. The implications of these results for the acquisition process of a tonal schema of listeners are as follows: (a) the acquisition process may entail small and incremental changes, rather than large and stage-like changes, in corresponding neural circuits; (b) the speed of schema acquisition may mainly depend on musical experiences rather than maturation; and (c) the learning principles of schema acquisition may be culturally invariant while the acquired tonal schemas are varied with exposed culture-specific music. PMID:26441725

  4. Reengineering the Acquisition/Procurement Process: A Methodology for Requirements Collection

    NASA Technical Reports Server (NTRS)

    Taylor, Randall; Vanek, Thomas

    2011-01-01

    This paper captures the systematic approach taken by JPL's Acquisition Reengineering Project team, the methodology used, challenges faced, and lessons learned. It provides pragmatic "how-to" techniques and tools for collecting requirements and for identifying areas of improvement in an acquisition/procurement process or other core process of interest.

  5. 48 CFR 15.202 - Advisory multi-step process.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Advisory multi-step... Information 15.202 Advisory multi-step process. (a) The agency may publish a presolicitation notice (see 5.204... participate in the acquisition. This process should not be used for multi-step acquisitions where it would...

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Novak, Erik; Trolinger, James D.; Lacey, Ian

    This work reports on the development of a binary pseudo-random test sample optimized to calibrate the MTF of optical microscopes. The sample consists of a number of 1-D and 2-D patterns, with different minimum sizes of spatial artifacts from 300 nm to 2 microns. We describe the mathematical background, fabrication process, and data acquisition and analysis procedure used to return a spatial-frequency-based instrument calibration. We show that the developed samples satisfy the characteristics of a test standard: functionality, ease of specification and fabrication, reproducibility, and low sensitivity to manufacturing error.
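
    Binary pseudo-random test patterns of this kind are commonly derived from maximal-length sequences. The Python sketch below generates one with a classic 16-bit Fibonacci LFSR (taps 16, 14, 13, 11); the actual pattern, length, and feature sizes of the fabricated sample may differ, so this only illustrates the sequence generation.

      def lfsr_bits(seed=0xACE1, n=None):
          """16-bit Fibonacci LFSR (taps 16, 14, 13, 11), maximal period 2**16 - 1."""
          assert seed != 0
          state, out = seed, []
          n = n if n is not None else (1 << 16) - 1
          for _ in range(n):
              bit = ((state >> 0) ^ (state >> 2) ^ (state >> 3) ^ (state >> 5)) & 1
              state = (state >> 1) | (bit << 15)
              out.append(state & 1)
          return out

      print(lfsr_bits(n=16))  # first 16 pseudo-random bits of the sequence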

  7. Hardware Timestamping for an Image Acquisition System Based on FlexRIO and IEEE 1588 v2 Standard

    NASA Astrophysics Data System (ADS)

    Esquembri, S.; Sanz, D.; Barrera, E.; Ruiz, M.; Bustos, A.; Vega, J.; Castro, R.

    2016-02-01

    Current fusion devices usually implement distributed acquisition systems for the multiple diagnostics of their experiments. However, each diagnostic is composed of hundreds or even thousands of signals, including images from the vessel interior. These signals and images must be correctly timestamped, because all the information will be analyzed to identify plasma behavior using temporal correlations. For acquisition devices without synchronization mechanisms, the timestamp is given by another device with timing capabilities when signaled by the first device; each data item must then be associated with its timestamp, usually via software. This critical step is unfeasible for software applications when sampling rates are high. In order to solve this problem, this paper presents the implementation of an image acquisition system with a real-time hardware timestamping mechanism, synchronized with a master clock using the IEEE 1588 v2 Precision Time Protocol (PTP). Synchronization, image acquisition and processing, and timestamping mechanisms are implemented using a Field Programmable Gate Array (FPGA) and a PTP v2-synchronized timing card. The system has been validated using a camera simulator streaming videos from fusion databases. The developed architecture is fully compatible with ITER Fast Controllers and has been integrated with EPICS to control and monitor the whole system.
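
    The IEEE 1588 synchronization relies on the standard two-step timestamp exchange; the Python sketch below shows the usual offset and path-delay computation from the four timestamps. It is a software illustration of the protocol arithmetic, not the FPGA implementation described in the paper.

      def ptp_offset_delay(t1, t2, t3, t4):
          """IEEE 1588 exchange: t1 = master Sync send, t2 = slave Sync receive,
          t3 = slave Delay_Req send, t4 = master Delay_Req receive."""
          offset = ((t2 - t1) - (t4 - t3)) / 2.0   # slave clock offset from master
          delay = ((t2 - t1) + (t4 - t3)) / 2.0    # mean one-way path delay
          return offset, delay

      print(ptp_offset_delay(100.000, 100.012, 100.020, 100.026))  # illustrative times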

  8. A Dual-Channel Acquisition Method Based on Extended Replica Folding Algorithm for Long Pseudo-Noise Code in Inter-Satellite Links.

    PubMed

    Zhao, Hongbo; Chen, Yuying; Feng, Wenquan; Zhuang, Chen

    2018-05-25

    Inter-satellite links are an important component of the new generation of satellite navigation systems, characterized by low signal-to-noise ratio (SNR), complex electromagnetic interference and the short time slot assigned to each satellite, which brings difficulties to the acquisition stage. The inter-satellite links in both the Global Positioning System (GPS) and the BeiDou Navigation Satellite System (BDS) adopt a long-code spread-spectrum scheme. However, long code acquisition is a difficult and time-consuming task due to the long code period. Traditional folding methods such as the extended replica folding acquisition search technique (XFAST) and direct averaging are largely restricted because of code Doppler and the additional SNR loss caused by replica folding. The dual folding method (DF-XFAST) and the dual-channel method have been proposed to achieve long code acquisition in low-SNR and high-dynamic situations, respectively, but the former is easily affected by code Doppler and the latter is not fast enough. Considering the environment of inter-satellite links and the problems of existing algorithms, this paper proposes a new long code acquisition algorithm named the dual-channel acquisition method based on the extended replica folding algorithm (DC-XFAST). This method employs dual channels for verification. Each channel contains an incoming signal block. Local code samples are folded and zero-padded to the length of the incoming signal block. After a circular FFT operation, the correlation results contain two peaks of the same magnitude and specified relative position. The detection process is eased by finding the two largest values. The verification takes all the full and partial peaks into account. Numerical results reveal that the DC-XFAST method can improve acquisition performance while acquisition speed is guaranteed. The method has a significantly higher acquisition probability than the folding methods XFAST and DF-XFAST. Moreover, with the advantage of higher detection probability and lower false alarm probability, it has a lower mean acquisition time than traditional XFAST, DF-XFAST and zero-padding.
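
    The folding-plus-circular-FFT step can be sketched in a few lines of NumPy: sum the long local code over segments, zero-pad the folded replica to the block length, and correlate via FFT. This illustrates the XFAST-style folding referenced above; the dual-channel verification logic of DC-XFAST and all signal lengths are assumed or omitted.

      import numpy as np

      def folded_circular_correlation(incoming_block, local_code, n_folds):
          """Fold the local code into `n_folds` summed segments, zero-pad to the
          incoming block length, and compute the circular correlation via FFT."""
          L = len(incoming_block)
          seg = len(local_code) // n_folds
          folded = local_code[:seg * n_folds].reshape(n_folds, seg).sum(axis=0)
          replica = np.zeros(L, dtype=complex)
          replica[:seg] = folded
          corr = np.fft.ifft(np.fft.fft(incoming_block) * np.conj(np.fft.fft(replica)))
          return np.abs(corr)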

  9. Order of stimulus presentation influences children's acquisition in receptive identification tasks.

    PubMed

    Petursdottir, Anna Ingeborg; Aguilar, Gabriella

    2016-03-01

    Receptive identification is usually taught in matching-to-sample format, which entails the presentation of an auditory sample stimulus and several visual comparison stimuli in each trial. Conflicting recommendations exist regarding the order of stimulus presentation in matching-to-sample trials. The purpose of this study was to compare acquisition in receptive identification tasks under 2 conditions: when the sample was presented before the comparisons (sample first) and when the comparisons were presented before the sample (comparison first). Participants included 4 typically developing kindergarten-age boys. Stimuli, which included birds and flags, were presented on a computer screen. Acquisition in the 2 conditions was compared in an adapted alternating-treatments design combined with a multiple baseline design across stimulus sets. All participants took fewer trials to meet the mastery criterion in the sample-first condition than in the comparison-first condition. © 2015 Society for the Experimental Analysis of Behavior.

  10. A 2D MTF approach to evaluate and guide dynamic imaging developments.

    PubMed

    Chao, Tzu-Cheng; Chung, Hsiao-Wen; Hoge, W Scott; Madore, Bruno

    2010-02-01

    As the number and complexity of partially sampled dynamic imaging methods continue to increase, reliable strategies to evaluate performance may prove most useful. In the present work, an analytical framework to evaluate given reconstruction methods is presented. A perturbation algorithm allows the proposed evaluation scheme to perform robustly without requiring knowledge about the inner workings of the method being evaluated. A main output of the evaluation process consists of a two-dimensional modulation transfer function, an easy-to-interpret visual rendering of a method's ability to capture all combinations of spatial and temporal frequencies. Approaches to evaluate noise properties and artifact content at all spatial and temporal frequencies are also proposed. One fully sampled phantom and three fully sampled cardiac cine datasets were subsampled (R = 4 and 8) and reconstructed with the different methods tested here. A hybrid method, which combines the main advantageous features observed in our assessments, was proposed and tested in a cardiac cine application, with acceleration factors of 3.5 and 6.3 (skip factors of 4 and 8, respectively). This approach combines features from methods such as k-t sensitivity encoding, unaliasing by Fourier encoding the overlaps in the temporal dimension-sensitivity encoding, generalized autocalibrating partially parallel acquisition, sensitivity profiles from an array of coils for encoding and reconstruction in parallel, self, hybrid referencing with unaliasing by Fourier encoding the overlaps in the temporal dimension and generalized autocalibrating partially parallel acquisition, and generalized autocalibrating partially parallel acquisition-enhanced sensitivity maps for sensitivity encoding reconstructions.

  11. Nonuniform sampling and non-Fourier signal processing methods in multidimensional NMR

    PubMed Central

    Mobli, Mehdi; Hoch, Jeffrey C.

    2017-01-01

    Beginning with the introduction of Fourier Transform NMR by Ernst and Anderson in 1966, time domain measurement of the impulse response (the free induction decay, FID) consisted of sampling the signal at a series of discrete intervals. For compatibility with the discrete Fourier transform (DFT), the intervals are kept uniform, and the Nyquist theorem dictates the largest value of the interval sufficient to avoid aliasing. With the proposal by Jeener of parametric sampling along an indirect time dimension, extension to multidimensional experiments employed the same sampling techniques used in one dimension, similarly subject to the Nyquist condition and suitable for processing via the discrete Fourier transform. The challenges of obtaining high-resolution spectral estimates from short data records using the DFT were already well understood, however. Despite techniques such as linear prediction extrapolation, the achievable resolution in the indirect dimensions is limited by practical constraints on measuring time. The advent of non-Fourier methods of spectrum analysis capable of processing nonuniformly sampled data has led to an explosion in the development of novel sampling strategies that avoid the limits on resolution and measurement time imposed by uniform sampling. The first part of this review discusses the many approaches to data sampling in multidimensional NMR, the second part highlights commonly used methods for signal processing of such data, and the review concludes with a discussion of other approaches to speeding up data acquisition in NMR. PMID:25456315
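
    A common nonuniform sampling strategy in the indirect dimension is to select a subset of increments with a bias toward early, high-signal points; the Python sketch below generates such a schedule. The exponential weighting and the coverage fraction are illustrative assumptions, not a prescription from this review.

      import numpy as np

      def nus_schedule(n_max=256, fraction=0.25, bias=2.0, seed=0):
          """Pick ~fraction of the first n_max indirect-dimension increments,
          weighting early increments more heavily (roughly matched to signal decay)."""
          rng = np.random.default_rng(seed)
          weights = np.exp(-bias * np.arange(n_max) / n_max)
          picks = rng.choice(n_max, size=int(fraction * n_max), replace=False,
                             p=weights / weights.sum())
          return np.sort(picks)

      print(nus_schedule()[:10])  # first few sampled increments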

  12. Full-field transient vibrometry of the human tympanic membrane by local phase correlation and high-speed holography

    NASA Astrophysics Data System (ADS)

    Dobrev, Ivo; Furlong, Cosme; Cheng, Jeffrey T.; Rosowski, John J.

    2014-09-01

    Understanding the human hearing process would be helped by quantification of the transient mechanical response of the human ear, including the human tympanic membrane (TM or eardrum). We propose a new hybrid high-speed holographic system (HHS) for acquisition and quantification of the full-field nanometer transient (i.e., >10 kHz) displacement of the human TM. We have optimized and implemented a 2+1 frame local correlation (LC) based phase sampling method in combination with a high-speed (i.e., >40 K fps) camera acquisition system. To our knowledge, there is currently no existing system that provides such capabilities for the study of the human TM. The LC sampling method has a displacement difference of <11 nm relative to measurements obtained by a four-phase step algorithm. Comparisons between our high-speed acquisition system and a laser Doppler vibrometer indicate differences of <10 μs. The high temporal (i.e., >40 kHz) and spatial (i.e., >100 k data points) resolution of our HHS enables parallel measurements of all points on the surface of the TM, which allows quantification of spatially dependent motion parameters, such as modal frequencies and acoustic delays. Such capabilities could allow inferring local material properties across the surface of the TM.

  13. Optimization of image quality and acquisition time for lab-based X-ray microtomography using an iterative reconstruction algorithm

    NASA Astrophysics Data System (ADS)

    Lin, Qingyang; Andrew, Matthew; Thompson, William; Blunt, Martin J.; Bijeljic, Branko

    2018-05-01

    Non-invasive laboratory-based X-ray microtomography has been widely applied in many industrial and research disciplines. However, the main barrier to the use of laboratory systems compared to a synchrotron beamline is its much longer image acquisition time (hours per scan compared to seconds to minutes at a synchrotron), which results in limited application for dynamic in situ processes. Therefore, the majority of existing laboratory X-ray microtomography is limited to static imaging; relatively fast imaging (tens of minutes per scan) can only be achieved by sacrificing imaging quality, e.g. reducing exposure time or number of projections. To alleviate this barrier, we introduce an optimized implementation of a well-known iterative reconstruction algorithm that allows users to reconstruct tomographic images with reasonable image quality, but requires lower X-ray signal counts and fewer projections than conventional methods. Quantitative analysis and comparison between the iterative and the conventional filtered back-projection reconstruction algorithm was performed using a sandstone rock sample with and without liquid phases in the pore space. Overall, by implementing the iterative reconstruction algorithm, the required image acquisition time for samples such as this, with sparse object structure, can be reduced by a factor of up to 4 without measurable loss of sharpness or signal to noise ratio.
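
    As a generic example of the iterative-reconstruction family discussed above, the Python sketch below implements a basic SIRT update for a given system matrix and projection data. The authors' optimized algorithm is not reproduced here; the row- and column-sum scalings are the textbook SIRT choices.

      import numpy as np

      def sirt(A, b, n_iter=50):
          """Simple SIRT: x <- x + C A^T R (b - A x), with R and C the inverse
          row- and column-sum scalings of the system matrix A."""
          row_sums = np.asarray(A.sum(axis=1)).ravel()
          col_sums = np.asarray(A.sum(axis=0)).ravel()
          row_sums[row_sums == 0] = 1.0
          col_sums[col_sums == 0] = 1.0
          x = np.zeros(A.shape[1])
          for _ in range(n_iter):
              residual = b - A @ x
              x = x + (A.T @ (residual / row_sums)) / col_sums
          return x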

  14. Gigabit Digital Filter Bank: Digital Backend Subsystem in the VERA Data-Acquisition System

    NASA Astrophysics Data System (ADS)

    Iguchi, Satoru; Kurayama, Tomoharu; Kawaguchi, Noriyuki; Kawakami, Kazuyuki

    2005-02-01

    The VERA terminal is a new data-acquisition system developed for the VERA project, a project to construct a new Japanese VLBI array dedicated to making a 3-D map of our Milky Way Galaxy through high-precision astrometry. A new technology, a gigabit digital filter, was introduced in the development. The importance and advantages of a digital filter for radio astronomy have been studied as follows: (1) the digital filter can realize a variety of observation modes and maintain compatibility with different data-acquisition systems (Kiuchi et al. 1997; Iguchi et al. 2000a); (2) the folding noise occurring in the sampling process can be reduced in combination with a higher-order sampling technique (Iguchi, Kawaguchi 2002); (3) an ideal sharp cut-off band edge and flat amplitude/phase responses are approached by using the large number of taps made available by LSIs with a large number of logic cells (Iguchi et al. 2000a). We developed custom Finite Impulse Response (FIR) filter chips and manufactured the Gigabit Digital Filter Banks (GDFBs) as the digital backend subsystem in the VERA terminal. In this paper, the design and development of the GDFB are presented in detail, and the performance of the developed GDFB is demonstrated.
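
    The band-selection role of an FIR filter bank can be illustrated in software with a windowed-sinc design. The NumPy sketch below is only a conceptual stand-in for the custom gigabit FIR chips; the tap count and cutoff are chosen arbitrarily.

      import numpy as np

      def windowed_sinc_lowpass(num_taps=129, cutoff=0.25):
          """Windowed-sinc lowpass FIR; `cutoff` is a fraction of the sampling rate."""
          n = np.arange(num_taps) - (num_taps - 1) / 2
          h = 2 * cutoff * np.sinc(2 * cutoff * n)   # ideal lowpass impulse response
          h *= np.hamming(num_taps)                  # taper to control sidelobes
          return h / h.sum()

      def fir_filter(x, h):
          return np.convolve(x, h, mode="same")

      # Usage: keep the low-frequency component of a two-tone test signal.
      t = np.arange(4096)
      x = np.sin(0.05 * np.pi * t) + 0.5 * np.sin(0.8 * np.pi * t)
      y = fir_filter(x, windowed_sinc_lowpass(cutoff=0.1))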

  15. 48 CFR 1336.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 1336.602-5 Section 1336.602-5 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  16. 48 CFR 836.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 836.602-5 Section 836.602-5 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  17. 48 CFR 636.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 4 2013-10-01 2013-10-01 false Short selection processes for contracts not to exceed the simplified acquisition threshold. 636.602-5 Section 636.602-5 Federal Acquisition Regulations System DEPARTMENT OF STATE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  18. 48 CFR 436.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 4 2013-10-01 2013-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 436.602-5 Section 436.602-5 Federal Acquisition Regulations System DEPARTMENT OF AGRICULTURE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  19. 48 CFR 1436.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Short selection processes for contracts not to exceed the simplified acquisition threshold. 1436.602-5 Section 1436.602-5 Federal Acquisition Regulations System DEPARTMENT OF THE INTERIOR SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  20. 48 CFR 736.602-5 - Short selection process for procurements not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Short selection process for procurements not to exceed the simplified acquisition threshold. 736.602-5 Section 736.602-5 Federal Acquisition Regulations System AGENCY FOR INTERNATIONAL DEVELOPMENT SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACT...

  1. 48 CFR 1336.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 1336.602-5 Section 1336.602-5 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  2. 48 CFR 636.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Short selection processes for contracts not to exceed the simplified acquisition threshold. 636.602-5 Section 636.602-5 Federal Acquisition Regulations System DEPARTMENT OF STATE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  3. 48 CFR 1036.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 1036.602-5 Section 1036.602-5 Federal Acquisition Regulations System DEPARTMENT OF THE TREASURY SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  4. 48 CFR 436.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 4 2014-10-01 2014-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 436.602-5 Section 436.602-5 Federal Acquisition Regulations System DEPARTMENT OF AGRICULTURE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  5. 48 CFR 436.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 4 2011-10-01 2011-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 436.602-5 Section 436.602-5 Federal Acquisition Regulations System DEPARTMENT OF AGRICULTURE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  6. 48 CFR 1436.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Short selection processes for contracts not to exceed the simplified acquisition threshold. 1436.602-5 Section 1436.602-5 Federal Acquisition Regulations System DEPARTMENT OF THE INTERIOR SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  7. 48 CFR 736.602-5 - Short selection process for procurements not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Short selection process for procurements not to exceed the simplified acquisition threshold. 736.602-5 Section 736.602-5 Federal Acquisition Regulations System AGENCY FOR INTERNATIONAL DEVELOPMENT SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACT...

  8. 48 CFR 436.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 436.602-5 Section 436.602-5 Federal Acquisition Regulations System DEPARTMENT OF AGRICULTURE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  9. 48 CFR 836.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 836.602-5 Section 836.602-5 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  10. 48 CFR 1336.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 5 2013-10-01 2013-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 1336.602-5 Section 1336.602-5 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  11. 48 CFR 836.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 836.602-5 Section 836.602-5 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  12. 48 CFR 1336.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 1336.602-5 Section 1336.602-5 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  13. 48 CFR 1436.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Short selection processes for contracts not to exceed the simplified acquisition threshold. 1436.602-5 Section 1436.602-5 Federal Acquisition Regulations System DEPARTMENT OF THE INTERIOR SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  14. 48 CFR 636.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 4 2011-10-01 2011-10-01 false Short selection processes for contracts not to exceed the simplified acquisition threshold. 636.602-5 Section 636.602-5 Federal Acquisition Regulations System DEPARTMENT OF STATE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  15. 48 CFR 636.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... for contracts not to exceed the simplified acquisition threshold. 636.602-5 Section 636.602-5 Federal... not to exceed the simplified acquisition threshold. The short selection process described in FAR 36.602-5 is authorized for use for contracts not expected to exceed the simplified acquisition threshold...

  16. 48 CFR 736.602-5 - Short selection process for procurements not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Short selection process for procurements not to exceed the simplified acquisition threshold. 736.602-5 Section 736.602-5 Federal Acquisition Regulations System AGENCY FOR INTERNATIONAL DEVELOPMENT SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACT...

  17. 48 CFR 1036.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 5 2013-10-01 2013-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 1036.602-5 Section 1036.602-5 Federal Acquisition Regulations System DEPARTMENT OF THE TREASURY SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  18. 48 CFR 436.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 4 2012-10-01 2012-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 436.602-5 Section 436.602-5 Federal... to exceed the simplified acquisition threshold. The HCA may include either or both procedures in FAR...

  19. 48 CFR 1336.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... for contracts not to exceed the simplified acquisition threshold. 1336.602-5 Section 1336.602-5... for contracts not to exceed the simplified acquisition threshold. (a) In contracts not expected to exceed the simplified acquisition threshold, either or both of the short selection processes set out at...

  20. 48 CFR 1036.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 1036.602-5 Section 1036.602-5 Federal Acquisition Regulations System DEPARTMENT OF THE TREASURY SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  1. 48 CFR 1036.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Short selection process for contracts not to exceed the simplified acquisition threshold. 1036.602-5 Section 1036.602-5 Federal Acquisition Regulations System DEPARTMENT OF THE TREASURY SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  2. 48 CFR 1436.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 5 2013-10-01 2013-10-01 false Short selection processes for contracts not to exceed the simplified acquisition threshold. 1436.602-5 Section 1436.602-5 Federal Acquisition Regulations System DEPARTMENT OF THE INTERIOR SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  3. 48 CFR 636.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 4 2014-10-01 2014-10-01 false Short selection processes for contracts not to exceed the simplified acquisition threshold. 636.602-5 Section 636.602-5 Federal Acquisition Regulations System DEPARTMENT OF STATE SPECIAL CATEGORIES OF CONTRACTING CONSTRUCTION AND ARCHITECT-ENGINEER CONTRACTS...

  4. Mobile field data acquisition in geosciences

    NASA Astrophysics Data System (ADS)

    Golodoniuc, Pavel; Klump, Jens; Reid, Nathan; Gray, David

    2016-04-01

    The Discovering Australia's Mineral Resources Program of CSIRO is conducting a study to develop novel methods and techniques to reliably define distal footprints of mineral systems under regolith cover in the Capricorn Orogen, the area that lies between the two well-known metallogenic provinces of the Pilbara and Yilgarn Cratons in Western Australia. The multidisciplinary study goes beyond the boundaries of any single discipline and aims to develop new methods for integrating heterogeneous datasets to gain insight into the key indicators of mineralisation. The study relies on large regional datasets obtained from previous hydrogeochemical, regolith, and resistate mineral studies around known deposits, as well as new data obtained from recent field sampling campaigns around areas of interest. The thousands of water, vegetation, rock and soil samples collected over the past years prompted us to look at ways to standardise field sampling procedures and to review the data acquisition process. This process has evolved over the years (Golodoniuc et al., 2015; Klump et al., 2015) and has now reached the phase where fast and reliable collection of scientific data in remote areas is possible. The approach is backed by a unified, discipline-agnostic platform: the Federated Archaeological Information Management System (FAIMS). FAIMS is an open source framework for mobile field data acquisition, developed at the University of New South Wales for archaeological field data collection. The FAIMS framework can easily be adapted to a diverse range of scenarios and to different kinds of samples, each with its own peculiarities; it integrates with GPS and can associate photographs taken with the device's embedded camera with the captured data. Three different modules have been developed so far, dedicated to geochemical water, plant and rock sampling. All modules feature automatic date and position recording and reproduce the established data recording workflows. The rock sampling module also features an interactive GIS component that allows field observations to be entered as annotations on a map. The open communication protocols and file formats used by FAIMS modules allow easy integration with existing spatial data infrastructures and third-party applications, such as ArcGIS. The remoteness of the focus areas in the Capricorn region required reliable mechanisms for data replication and an added level of redundancy. This was achieved through the use of the FAIMS Server without a tightly coupled dependency on it: the mobile devices can continue to work independently if the server fails. To support collaborative fieldwork, "FAIMS on a Truck" offers networked collaboration within a field team using mobile applications as asynchronous rich clients. The framework runs on compatible Android devices (e.g., tablets, smart phones) with the network infrastructure supported by a FAIMS Server. The server component is installed in a field vehicle to provide data synchronisation between multiple mobile devices, backup and data transfer. The data entry process was streamlined and followed the workflow that field crews were accustomed to, with added data validation capabilities. The use of a common platform allowed us to adopt the framework within multiple disciplines, improve data acquisition times, and reduce human-introduced errors. We continue to work with other research groups and to explore adopting the technology in other applications, e.g., agriculture.

  5. Sequential time interleaved random equivalent sampling for repetitive signal.

    PubMed

    Zhao, Yijiu; Liu, Jingjing

    2016-12-01

    Compressed sensing (CS) based sampling techniques exhibit many advantages over other existing approaches for sparse signal spectrum sensing; they have also been incorporated into non-uniform sampling signal reconstruction, such as random equivalent sampling (RES), to improve its efficiency. However, in CS-based RES, only one sample from each acquisition is considered in the signal reconstruction stage, which results in more acquisition runs and a longer sampling time. In this paper, a sampling sequence is taken in each RES acquisition run, and the corresponding block measurement matrix is constructed using the Whittaker-Shannon interpolation formula. All the block matrices are combined into an equivalent measurement matrix covering all sampling sequences. We implemented the proposed approach with a multi-core analog-to-digital converter (ADC) whose cores are time interleaved. A prototype of this proposed CS-based sequential random equivalent sampling method has been developed. It is able to capture an analog waveform at an equivalent sampling rate of 40 GHz while physically sampling at 1 GHz. Experiments indicate that, for a sparse signal, the proposed CS-based sequential random equivalent sampling exhibits high efficiency.
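
    The block measurement matrix is the central construction here. The abstract does not give its exact form, so the following Python sketch is only an illustration under assumed parameters (equivalent-time grid length, number of runs, samples per run): each physical sample time is related to an assumed equivalent-time grid through the Whittaker-Shannon (sinc) interpolation formula, and the per-run blocks are stacked into one equivalent measurement matrix.

        import numpy as np

        def res_measurement_matrix(n_grid, runs, samples_per_run, f_phys, f_eq, rng):
            # Stack per-run blocks built from the Whittaker-Shannon (sinc) formula,
            # relating each physical sample time to an assumed equivalent-time grid.
            t_eq = np.arange(n_grid) / f_eq                 # equivalent-time grid
            blocks = []
            for _ in range(runs):
                # each run starts at a random offset relative to the trigger and then
                # samples uniformly at the physical rate (the "sampling sequence")
                t0 = rng.uniform(0.0, 1.0 / f_phys)
                t_run = t0 + np.arange(samples_per_run) / f_phys
                blocks.append(np.sinc((t_run[:, None] - t_eq[None, :]) * f_eq))
            return np.vstack(blocks)                        # equivalent measurement matrix

        # toy usage: a 40 GHz equivalent-time grid observed through 1 GHz sequences
        rng = np.random.default_rng(0)
        Phi = res_measurement_matrix(n_grid=400, runs=8, samples_per_run=32,
                                     f_phys=1e9, f_eq=40e9, rng=rng)
        print(Phi.shape)                                    # (8 * 32, 400)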

  6. Selecting Senior Acquisition Officials: Assessing the Current Processes and Practices for Recruiting, Confirming, and Retaining Senior Officials in the Acquisition Workforce

    DTIC Science & Technology

    2016-04-21

    Selecting Senior Acquisition Officials Assessing the Current Processes and Practices for Recruiting, Confirming, and Retaining Senior Officials...Task Group 2 Terms of Reference (TOR)  Selection of Senior Officials in the Acquisition Workforce – Consider ethics rules, congressional committee... Senior Acquisition positions – Re-validate the conflicts of interest and risk mitigation rules “[T]he committee directs the Chair of the Defense Business

  7. TestSTORM: Simulator for optimizing sample labeling and image acquisition in localization based super-resolution microscopy

    PubMed Central

    Sinkó, József; Kákonyi, Róbert; Rees, Eric; Metcalf, Daniel; Knight, Alex E.; Kaminski, Clemens F.; Szabó, Gábor; Erdélyi, Miklós

    2014-01-01

    Localization-based super-resolution microscopy image quality depends on several factors such as dye choice and labeling strategy, microscope quality and user-defined parameters such as frame rate and number as well as the image processing algorithm. Experimental optimization of these parameters can be time-consuming and expensive so we present TestSTORM, a simulator that can be used to optimize these steps. TestSTORM users can select from among four different structures with specific patterns, dye and acquisition parameters. Example results are shown and the results of the vesicle pattern are compared with experimental data. Moreover, image stacks can be generated for further evaluation using localization algorithms, offering a tool for further software developments. PMID:24688813

  8. Air Traffic Control: Immature Software Acquisition Processes Increase FAA System Acquisition Risks

    DOT National Transportation Integrated Search

    1997-03-01

    The General Accounting Office (GAO), at the request of Congress, reviewed (1) the maturity of the Federal Aviation Administration's (FAA's) Air Traffic Control (ATC) modernization software acquisition processes, and (2) the steps/actions FAA has unde...

  9. Progressive compressive imager

    NASA Astrophysics Data System (ADS)

    Evladov, Sergei; Levi, Ofer; Stern, Adrian

    2012-06-01

    We have designed and built a working automatic progressive-sampling imaging system based on the vector sensor concept, which utilizes a unique sampling scheme of Radon projections. This sampling scheme makes it possible to progressively add information, resulting in a tradeoff between compression and the quality of reconstruction. The uniqueness of our sampling is that at any moment of the acquisition process the reconstruction can produce a reasonable version of the image. The advantage of gradually adding samples is seen when the sparsity rate of the object, and thus the number of needed measurements, is unknown. We have developed the iterative algorithm OSO (Ordered Sets Optimization), which employs our sampling scheme to create nearly uniformly distributed sets of samples and allows the reconstruction of megapixel images. We present good-quality reconstructions at compression ratios of 1:20.
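
    The progressive idea, that a usable image can be reconstructed at any moment while samples keep arriving, can be illustrated with a toy sketch. This is not the authors' OSO algorithm or their vector-sensor optics; it simply adds Radon projection angles in stages (an assumed interleaved schedule) and reconstructs with the standard filtered back-projection routines from scikit-image.

        import numpy as np
        from skimage.data import shepp_logan_phantom
        from skimage.transform import radon, iradon, rescale

        image = rescale(shepp_logan_phantom(), 0.25)        # small phantom, fast example

        # assumed progressive schedule: each stage refines the previous one while
        # staying nearly uniformly spread over [0, 180) degrees
        all_angles = np.linspace(0.0, 180.0, 64, endpoint=False)
        stages = [all_angles[::8], all_angles[::4], all_angles[::2], all_angles]

        acquired = {}
        for angles in stages:
            for a in angles:                                # only new projections are "measured"
                if a not in acquired:
                    acquired[a] = radon(image, theta=[a]).ravel()
            theta = sorted(acquired)
            sinogram = np.column_stack([acquired[a] for a in theta])
            recon = iradon(sinogram, theta=theta, filter_name='ramp')
            err = np.sqrt(np.mean((recon - image) ** 2))
            print(f"{len(theta):3d} projections -> RMSE {err:.4f}")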

  10. Is Children's Acquisition of the Passive a Staged Process? Evidence from Six- and Nine-Year-Olds' Production of Passives

    ERIC Educational Resources Information Center

    Messenger, Katherine; Branigan, Holly P.; McLean, Janet F.

    2012-01-01

    We report a syntactic priming experiment that examined whether children's acquisition of the passive is a staged process, with acquisition of constituent structure preceding acquisition of thematic role mappings. Six-year-olds and nine-year-olds described transitive actions after hearing active and passive prime descriptions involving the same or…

  11. A novel 3D Cartesian random sampling strategy for Compressive Sensing Magnetic Resonance Imaging.

    PubMed

    Valvano, Giuseppe; Martini, Nicola; Santarelli, Maria Filomena; Chiappino, Dante; Landini, Luigi

    2015-01-01

    In this work we propose a novel acquisition strategy for accelerated 3D Compressive Sensing Magnetic Resonance Imaging (CS-MRI). The strategy is based on 3D Cartesian sampling in which the frequency-encoding direction is randomly switched with other k-space directions. Two 3D sampling strategies are presented. In the first, the frequency-encoding direction is randomly switched with one of the two phase-encoding directions. In the second, the frequency-encoding direction is randomly chosen among all the directions of k-space. These strategies lower the coherence of the acquisition, reducing aliasing artifacts and achieving better image quality after Compressive Sensing (CS) reconstruction. Furthermore, the proposed strategies can reduce the smoothing typical of CS due to the limited sampling of high-frequency locations. We demonstrated by means of simulations that the proposed acquisition strategies outperform the standard Compressive Sensing acquisition, resulting in better quality of the reconstructed images and a greater achievable acceleration.
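
    A minimal sketch of the second strategy described above (the frequency-encoding direction is chosen at random among all three k-space axes for each readout line); the matrix size and the number of readout lines are assumptions, and no CS reconstruction is included.

        import numpy as np

        def random_axis_switching_mask(shape, n_lines, rng):
            # Boolean 3D k-space mask: each readout fully samples one line along a
            # randomly chosen axis (the frequency-encoding direction for that line),
            # at random positions along the two remaining phase-encoding axes.
            mask = np.zeros(shape, dtype=bool)
            for _ in range(n_lines):
                axis = rng.integers(3)
                idx = [rng.integers(s) for s in shape]
                idx[axis] = slice(None)                     # full line along the readout axis
                mask[tuple(idx)] = True
            return mask

        rng = np.random.default_rng(1)
        mask = random_axis_switching_mask((128, 128, 64), n_lines=2000, rng=rng)
        print("sampled k-space fraction:", mask.mean())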

  12. A low-rank matrix recovery approach for energy efficient EEG acquisition for a wireless body area network.

    PubMed

    Majumdar, Angshul; Gogna, Anupriya; Ward, Rabab

    2014-08-25

    We address the problem of acquiring and transmitting EEG signals in Wireless Body Area Networks (WBAN) in an energy-efficient fashion. In WBANs, energy is consumed by three operations: sensing (sampling), processing and transmission. Previous studies addressed only the problem of reducing the transmission energy. In this work, we propose for the first time a technique that also reduces the sensing and processing energy: this is achieved by randomly under-sampling the EEG signal. We depart from previous Compressed Sensing based approaches and formulate signal recovery (from under-sampled measurements) as a matrix completion problem. A new algorithm to solve the matrix completion problem is derived here. We test our proposed method and find that its reconstruction accuracy is significantly better than that of state-of-the-art techniques, and we achieve this while saving sensing, processing and transmission energy. A simple power analysis shows that our proposed methodology consumes considerably less power than previous CS-based techniques.
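
    The abstract casts recovery from randomly under-sampled EEG as matrix completion but does not give its new algorithm, so the sketch below uses the well-known singular value thresholding (SVT) iteration as a generic stand-in, applied to a synthetic low-rank channels-by-samples matrix with an assumed 40% sampling ratio.

        import numpy as np

        def svt_complete(m_obs, mask, tau, step, iters=300):
            # Singular value thresholding (Cai, Candes and Shen): shrink the singular
            # values of a running estimate and re-impose the observed entries.
            y = np.zeros_like(m_obs)
            x = np.zeros_like(m_obs)
            for _ in range(iters):
                u, s, vt = np.linalg.svd(y, full_matrices=False)
                x = (u * np.maximum(s - tau, 0.0)) @ vt     # singular-value shrinkage
                y += step * mask * (m_obs - x)              # gradient step on observed entries
            return x

        rng = np.random.default_rng(0)
        truth = rng.standard_normal((32, 5)) @ rng.standard_normal((5, 256))  # low-rank "EEG" block
        mask = rng.random(truth.shape) < 0.4                # keep 40% of the samples at random
        recovered = svt_complete(mask * truth, mask,
                                 tau=5 * np.sqrt(truth.size),   # heuristics from the SVT paper
                                 step=1.2 / 0.4)
        print("relative error:", np.linalg.norm(recovered - truth) / np.linalg.norm(truth))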

  13. Data-Driven Sampling Matrix Boolean Optimization for Energy-Efficient Biomedical Signal Acquisition by Compressive Sensing.

    PubMed

    Wang, Yuhao; Li, Xin; Xu, Kai; Ren, Fengbo; Yu, Hao

    2017-04-01

    Compressive sensing is widely used in biomedical applications, and the sampling matrix plays a critical role in both the quality and the power consumption of signal acquisition. It projects a high-dimensional vector of data into a low-dimensional subspace by matrix-vector multiplication. An optimal sampling matrix can ensure accurate data reconstruction and/or a high compression ratio. Most existing optimization methods can only produce real-valued embedding matrices, which result in large energy consumption during data acquisition. In this paper, we propose an efficient method that finds an optimal Boolean sampling matrix in order to reduce the energy consumption. Compared to random Boolean embedding, our data-driven Boolean sampling matrix can improve the image recovery quality by 9 dB. Moreover, in terms of sampling hardware complexity, it reduces the energy consumption by 4.6× and the silicon area by 1.9× relative to the data-driven real-valued embedding.
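
    A small sketch of the hardware argument being made: with a real-valued sampling matrix each measurement needs multiply-accumulate operations, whereas a Boolean {0, 1} matrix only selects and sums samples. The dimensions and sparsity are assumptions, and the data-driven optimization of the Boolean matrix itself is not reproduced.

        import numpy as np

        rng = np.random.default_rng(0)
        n, m, k = 256, 64, 8                                # signal length, measurements, sparsity

        x = np.zeros(n)                                     # k-sparse test signal
        x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

        # real-valued embedding: every measurement needs n multiply-accumulates
        phi_real = rng.standard_normal((m, n)) / np.sqrt(m)
        y_real = phi_real @ x

        # random Boolean embedding: entries in {0, 1}, so the front end only needs to
        # select and sum samples, which is where the hardware energy saving comes from
        phi_bool = rng.random((m, n)) < 0.5
        y_bool = phi_bool.astype(float) @ x

        print(y_real[:4])
        print(y_bool[:4])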

  14. A self-teaching image processing and voice-recognition-based, intelligent and interactive system to educate visually impaired children

    NASA Astrophysics Data System (ADS)

    Iqbal, Asim; Farooq, Umar; Mahmood, Hassan; Asad, Muhammad Usman; Khan, Akrama; Atiq, Hafiz Muhammad

    2010-02-01

    A self-teaching image-processing and voice-recognition based system is developed to educate visually impaired children, chiefly in their primary education. The system comprises a computer, a vision camera, an ear speaker and a microphone. The camera, attached to the computer, is mounted on the ceiling opposite (at the required angle) the desk on which the book is placed. Sample images and voices, in the form of instructions and commands for the English and Urdu alphabets, numeric digits, operators and shapes, are stored in the database. A blind child first reads the embossed character (object) with the help of the fingers and then speaks the answer, the name of the character, its shape, etc. into the microphone. When the child's voice command is received by the microphone, an image is taken by the camera and processed by a MATLAB® program developed with the Image Acquisition and Image Processing toolboxes; the program generates a response or the required set of instructions to the child via the ear speaker, resulting in the self-education of a visually impaired child. A speech-recognition program is also developed in MATLAB® with the Data Acquisition and Signal Processing toolboxes; it records and processes the commands of the blind child.

  15. Autonomous Metabolomics for Rapid Metabolite Identification in Global Profiling

    DOE PAGES

    Benton, H. Paul; Ivanisevic, Julijana; Mahieu, Nathaniel G.; ...

    2014-12-12

    An autonomous metabolomic workflow combining mass spectrometry analysis with tandem mass spectrometry data acquisition was designed to allow for simultaneous data processing and metabolite characterization. Although tandem mass spectrometry data have previously been generated on the fly, the experiments described herein combine this technology with the bioinformatic resources of XCMS and METLIN. As a result of this unique integration, we can analyze large profiling datasets and simultaneously obtain structural identifications. Furthermore, validation of the workflow on bacterial samples allowed the profiling of on the order of a thousand metabolite features with simultaneous tandem mass spectra data acquisition. The tandem mass spectrometry data acquisition enabled automatic search and matching against the METLIN tandem mass spectrometry database, shortening the current workflow from days to hours. Overall, the autonomous approach to untargeted metabolomics provides an efficient means of metabolomic profiling, and will ultimately allow the more rapid integration of comparative analyses, metabolite identification, and data analysis at a systems biology level.

  16. Signal acquisition and scale calibration for beam power density distribution of electron beam welding

    NASA Astrophysics Data System (ADS)

    Peng, Yong; Li, Hongqiang; Shen, Chunlong; Guo, Shun; Zhou, Qi; Wang, Kehong

    2017-06-01

    The power density distribution of electron beam welding (EBW) is a key factor reflecting the beam quality. A beam quality test system was designed to measure the actual beam power density distribution of high-voltage EBW. After analyzing the characteristics and the phase relationship between the deflection control signal and the acquisition signal, a post-trigger mode was adopted for signal acquisition, with the control signal and the sampling clock sharing the same external clock source. The power density distribution of the beam cross-section was reconstructed from the one-dimensional signal after median filtering, two rounds of signal segmentation and spatial scale calibration. The diameter of the beam cross-section was defined by an amplitude method and by an integral method. The measured diameter under the integral definition is larger than that under the amplitude definition, whereas for the ideal distribution the former is smaller than the latter. The measured distribution is asymmetric and less concentrated than a Gaussian distribution.
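
    The abstract does not spell out the two diameter definitions, so the sketch below uses common conventions as stand-ins: the amplitude method takes the width of the profile above a fixed fraction of its peak (1/e² assumed), and the integral method takes the width of the central interval that encloses a fixed fraction of the total power (95.4% assumed, which matches the 1/e² width for an ideal one-dimensional Gaussian profile).

        import numpy as np

        def diameter_amplitude(x, p, level=np.exp(-2)):
            # width of the region where the profile exceeds `level` times its peak
            above = x[p >= level * p.max()]
            return above.max() - above.min()

        def diameter_integral(x, p, fraction=0.954):
            # width of the smallest interval around the power centroid that encloses
            # `fraction` of the total power
            total = np.trapz(p, x)
            c = np.trapz(p * x, x) / total
            for w in np.linspace(0.0, (x.max() - x.min()) / 2, 2000):
                sel = (x >= c - w) & (x <= c + w)
                if np.trapz(p[sel], x[sel]) >= fraction * total:
                    return 2 * w
            return x.max() - x.min()

        # ideal Gaussian profile with w0 = 1.0 mm: both definitions give about 2.0 mm
        x = np.linspace(-5.0, 5.0, 4001)                    # mm
        p = np.exp(-2.0 * x ** 2)
        print(diameter_amplitude(x, p), diameter_integral(x, p))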

  17. Target-locking acquisition with real-time confocal (TARC) microscopy.

    PubMed

    Lu, Peter J; Sims, Peter A; Oki, Hidekazu; Macarthur, James B; Weitz, David A

    2007-07-09

    We present a real-time target-locking confocal microscope that follows an object moving along an arbitrary path, even as it simultaneously changes its shape, size and orientation. This Target-locking Acquisition with Real-time Confocal (TARC) microscopy system integrates fast image processing and rapid image acquisition using a Nipkow spinning-disk confocal microscope. The system acquires a 3D stack of images, performs a full structural analysis to locate a feature of interest, moves the sample in response, and then collects the next 3D image stack. In this way, data collection is dynamically adjusted to keep a moving object centered in the field of view. We demonstrate the system's capabilities by target-locking freely diffusing clusters of attractive colloidal particles, and actively transported quantum dots (QDs) endocytosed into live cells free to move in three dimensions, for several hours. During this time, both the colloidal clusters and the live cells move distances several times the length of the imaging volume.

  18. A multi-center study benchmarks software tools for label-free proteome quantification

    PubMed Central

    Gillet, Ludovic C; Bernhardt, Oliver M.; MacLean, Brendan; Röst, Hannes L.; Tate, Stephen A.; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I.; Aebersold, Ruedi; Tenzer, Stefan

    2016-01-01

    The consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from SWATH-MS (sequential window acquisition of all theoretical fragment ion spectra), a method that uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test datasets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation we developed LFQbench, an R package to calculate metrics of precision and accuracy in label-free quantitative MS, and report the identification performance, robustness and specificity of each software tool. Our reference datasets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics. PMID:27701404

  19. A multicenter study benchmarks software tools for label-free proteome quantification.

    PubMed

    Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan

    2016-11-01

    Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.

  20. Accelerated high-resolution photoacoustic tomography via compressed sensing

    NASA Astrophysics Data System (ADS)

    Arridge, Simon; Beard, Paul; Betcke, Marta; Cox, Ben; Huynh, Nam; Lucka, Felix; Ogunlade, Olumide; Zhang, Edward

    2016-12-01

    Current 3D photoacoustic tomography (PAT) systems offer either high image quality or high frame rates but are not able to deliver high spatial and temporal resolution simultaneously, which limits their ability to image dynamic processes in living tissue (4D PAT). A particular example is the planar Fabry-Pérot (FP) photoacoustic scanner, which yields high-resolution 3D images but takes several minutes to sequentially map the incident photoacoustic field on the 2D sensor plane, point-by-point. However, as the spatio-temporal complexity of many absorbing tissue structures is rather low, the data recorded in such a conventional, regularly sampled fashion is often highly redundant. We demonstrate that combining model-based, variational image reconstruction methods using spatial sparsity constraints with the development of novel PAT acquisition systems capable of sub-sampling the acoustic wave field can dramatically increase the acquisition speed while maintaining a good spatial resolution: first, we describe and model two general spatial sub-sampling schemes. Then, we discuss how to implement them using the FP interferometer and demonstrate the potential of these novel compressed sensing PAT devices through simulated data from a realistic numerical phantom and through measured data from a dynamic experimental phantom as well as from in vivo experiments. Our results show that images with good spatial resolution and contrast can be obtained from highly sub-sampled PAT data if variational image reconstruction techniques that describe the tissue structures with suitable sparsity constraints are used. In particular, we examine the use of total variation (TV) regularization enhanced by Bregman iterations. These novel reconstruction strategies offer new opportunities to dramatically increase the acquisition speed of photoacoustic scanners that employ point-by-point sequential scanning, as well as reducing the channel count of parallelized schemes that use detector arrays.

  1. 48 CFR 242.1203 - Processing agreements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Processing agreements. 242.1203 Section 242.1203 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE CONTRACT MANAGEMENT CONTRACT ADMINISTRATION AND AUDIT SERVICES Novation and Change-of...

  2. 25 CFR 700.115 - Preliminary acquisition notice.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Acquisition and Disposal of Habitations and/or Improvements § 700.115 Preliminary acquisition notice. As soon as feasible in the acquisition process, the Commission shall issue a preliminary acquisition notice.../her habitations and/or improvements. (b) Explain that such preliminary acquisition notice is not a...

  3. Living in a digital world: features and applications of FPGA in photon detection

    NASA Astrophysics Data System (ADS)

    Arnesano, Cosimo

    Optical spectroscopy and imaging outcomes rely upon many factors; one of the most critical is the photon acquisition and processing method employed. For some types of measurements it may be crucial to acquire every single photon quickly with temporal resolution, but in other cases it is important to acquire as many photons as possible, regardless of the time information about each of them. Fluorescence Lifetime Imaging Microscopy belongs to the first case, where the information on the time of arrival of every single photon in every single pixel is fundamental to obtaining the desired information. Spectral tissue imaging belongs to the second case, where high photon density is needed in order to calculate the optical parameters necessary to build the spectral image. In both cases, the current instrumentation suffers from limitations in terms of acquisition time, duty cycle, cost, and radio-frequency interference and emission. We developed the Digital Frequency-Domain (FD) approach for photon acquisition and processing using new digital technology. This approach is based on the use of photon detectors in photon-counting mode and the digital heterodyning method to acquire data, which are analyzed in the frequency domain to provide the time of arrival of the photons. In conjunction with the use of pulsed laser sources, this method allows the determination of the time of arrival of the photons using the harmonic content of the frequency-domain analysis. The parallel digital FD design is a powerful approach that offers the possibility of implementing a variety of different applications in fluorescence spectroscopy and microscopy. It can be applied to fluorometry, Fluorescence Lifetime Imaging (FLIM), and Fluorescence Correlation Spectroscopy (FCS), as well as multi-frequency and multi-wavelength tissue imaging in compact portable medical devices. It dramatically reduces the acquisition time from the scale of several minutes to seconds, performs signal processing in a digital fashion avoiding RF emission, and is extremely inexpensive. This development is the result of a systematic study carried out on a previous design known as the FLIMBox, developed as part of the thesis of another graduate student. The extensive work done in maximizing the performance of the original FLIMBox led us to develop a new hardware solution with exciting and promising results, and with potential that was not possible in the previous hardware realization, where the signal harmonic content was limited by the FPGA technology. The new design permits acquisition of a much larger harmonic content of the sample response when it is excited with a pulsed light source, in one single measurement, using the digital mixing principle developed in the original design. Furthermore, we used the parallel digital FD principle to perform tissue imaging through Diffuse Optical Spectroscopy (DOS) measurements. We integrated the FLIMBox in a new system that uses a supercontinuum white laser with high brightness as a single light source and photomultipliers with a large detection area, both allowing a high penetration depth with extremely low power at the sample. The parallel acquisition, achieved by using the FLIMBox, decreases the time required by standard serial systems that scan through all modulation frequencies. Furthermore, the all-digital acquisition avoids analog noise, removes the analog mixer of the conventional frequency-domain approach, and does not generate the radio frequencies normally present in current analog systems. We are able to obtain a very sensitive acquisition due to the high signal-to-noise ratio (S/N). The successful results obtained by utilizing digital technology in photon acquisition and processing prompted us to extend the use of FPGAs to other applications, such as phosphorescence detection. Using the FPGA concept we proposed possible solutions to outstanding problems with the current technology. In this thesis I discuss new possible scenarios where new FPGA chips are applied to spectral tissue imaging.

  4. System safety management lessons learned from the US Army acquisition process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piatt, J.A.

    1989-05-01

    The Assistant Secretary of the Army for Research, Development and Acquisition directed the Army Safety Center to provide an audit of the causes of accidents and safety-of-use restrictions on recently fielded systems by tracking residual hazards back through the acquisition process. The objective was to develop "lessons learned" that could be applied to the acquisition process to minimize mishaps in fielded systems. System safety management lessons learned are defined as Army practices or policies, derived from past successes and failures, that are expected to be effective in eliminating or reducing specific systemic causes of residual hazards. They are broadly applicable and supportive of the Army structure and acquisition objectives. Pacific Northwest Laboratory (PNL) was given the task of conducting an independent, objective appraisal of the Army's system safety program in the context of the Army materiel acquisition process by focusing on four fielded systems which are products of that process. These systems included the Apache helicopter, the Bradley Fighting Vehicle (BFV), the Tube-Launched, Optically Tracked, Wire-Guided (TOW) Missile and the High Mobility Multipurpose Wheeled Vehicle (HMMWV). The objective of this study was to develop system safety management lessons learned associated with the acquisition process. The first step was to identify residual hazards associated with the selected systems. Since it was impossible to track all residual hazards through the acquisition process, certain well-known, high-visibility hazards were selected for detailed tracking. These residual hazards illustrate a variety of systemic problems. Systemic or process causes were identified for each residual hazard and analyzed to determine why they exist. System safety management lessons learned were developed to address related systemic causal factors. 29 refs., 5 figs.

  5. A novel PMT test system based on waveform sampling

    NASA Astrophysics Data System (ADS)

    Yin, S.; Ma, L.; Ning, Z.; Qian, S.; Wang, Y.; Jiang, X.; Wang, Z.; Yu, B.; Gao, F.; Zhu, Y.; Wang, Z.

    2018-01-01

    Compared with a traditional test system based on a QDC, a TDC and a scaler, a test system based on waveform sampling is constructed for sampling the signals of the 8" R5912 and the 20" R12860 Hamamatsu PMTs at light levels ranging from single to multiple photoelectrons. In order to achieve high throughput and to reduce the dead time in data processing, the LabVIEW-based data acquisition software is developed to run with a parallel mechanism. The analysis algorithm is realized in LabVIEW, and the spectra of charge, amplitude, signal width and rise time are analyzed offline. The results from the charge-to-digital converter, the time-to-digital converter and waveform sampling are compared in detail.
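
    A sketch of the kind of offline pulse analysis mentioned above (charge, amplitude, signal width and rise time from a sampled waveform). The sampling rate, load impedance, pulse polarity and baseline window are assumptions, and the actual LabVIEW analysis is not reproduced.

        import numpy as np

        def pulse_parameters(v, dt, n_baseline=50, r_load=50.0):
            # charge, amplitude, FWHM width and 10-90% rise time of a single
            # negative-going PMT pulse (polarity and 50 Ohm load are assumptions)
            s = v[:n_baseline].mean() - v                   # baseline-subtracted, positive pulse
            amplitude = s.max()
            charge = s.sum() * dt / r_load                  # integral of I = V/R over the record
            above_half = np.nonzero(s >= 0.5 * amplitude)[0]
            width = (above_half[-1] - above_half[0]) * dt   # full width at half maximum
            peak = s.argmax()
            t10 = np.argmax(s[:peak + 1] >= 0.1 * amplitude)  # first 10% crossing
            t90 = np.argmax(s[:peak + 1] >= 0.9 * amplitude)  # first 90% crossing
            return charge, amplitude, width, (t90 - t10) * dt

        # toy waveform: 1 GS/s record of a roughly 5 ns wide, 50 mV pulse on a flat baseline
        dt = 1e-9
        t = np.arange(400) * dt
        v = -0.05 * np.exp(-0.5 * ((t - 200e-9) / 2e-9) ** 2)
        v += 1e-4 * np.random.default_rng(3).standard_normal(t.size)
        print(pulse_parameters(v, dt))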

  6. Advanced Curation Activities at NASA: Preparing to Receive, Process, and Distribute Samples Returned from Future Missions

    NASA Technical Reports Server (NTRS)

    McCubbin, Francis M.; Zeigler, Ryan A.

    2017-01-01

    The Astromaterials Acquisition and Curation Office (henceforth referred to herein as the NASA Curation Office) at NASA Johnson Space Center (JSC) is responsible for curating all of NASA's extraterrestrial samples. Under the governing document, NASA Policy Directive (NPD) 7100.10F, JSC is charged with the curation of all extraterrestrial material under NASA control, including material from future NASA missions. The Directive goes on to define curation as including the documentation, preservation, preparation, and distribution of samples for research, education, and public outreach. Here we briefly describe NASA's astromaterials collections and our ongoing efforts to enhance the utility of our current collections, as well as our efforts to prepare for future sample return missions. We collectively refer to these efforts as advanced curation.

  7. Optimization of sampling pattern and the design of Fourier ptychographic illuminator.

    PubMed

    Guo, Kaikai; Dong, Siyuan; Nanda, Pariksheet; Zheng, Guoan

    2015-03-09

    Fourier ptychography (FP) is a recently developed imaging approach that facilitates high-resolution imaging beyond the cutoff frequency of the employed optics. In the original FP approach, a periodic LED array is used for sample illumination, and therefore the scanning pattern is a uniform grid in Fourier space. Such a uniform sampling scheme leads to three major problems for FP, namely: 1) it requires a large number of raw images, 2) it introduces raster grid artefacts in the reconstruction process, and 3) it requires a high-dynamic-range detector. Here, we investigate scanning sequences and sampling patterns to optimize the FP approach. For most biological samples, signal energy is concentrated in the low-frequency region, and as such, we can perform non-uniform Fourier sampling in FP by considering the signal structure. In contrast, conventional ptychography performs uniform sampling over the entire real space. To implement the non-uniform Fourier sampling scheme in FP, we have designed and built an illuminator using LEDs mounted on a 3D-printed plastic case. The advantages of this illuminator are threefold: 1) it reduces the number of image acquisitions by at least 50% (68 raw images versus 137 in the original FP setup), 2) it departs from the translational symmetry of sampling to solve the raster grid artifact problem, and 3) it reduces the dynamic range of the captured images six-fold. The results reported in this paper significantly shorten the acquisition time and improve the quality of FP reconstructions. They may provide new insights for developing Fourier ptychographic imaging platforms and find important applications in digital pathology.
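
    A minimal sketch of a non-uniform Fourier sampling pattern that concentrates illumination positions at low spatial frequencies, where the abstract argues most biological signal energy lies. The LED count matches the 68 acquisitions quoted above, but the radial density law and frequency cutoff are assumptions, not the geometry of the authors' 3D-printed illuminator.

        import numpy as np

        def lowpass_weighted_led_positions(n_leds, k_max, rng, decay=3.0):
            # place scan positions in the Fourier plane with a radial density that
            # decays away from DC (exponential decay is an assumption), instead of
            # the uniform grid of the original FP scheme
            k = np.empty((n_leds, 2))
            i = 0
            while i < n_leds:
                r = rng.uniform(0.0, k_max)
                if rng.random() < np.exp(-decay * r / k_max):   # rejection sampling
                    theta = rng.uniform(0.0, 2.0 * np.pi)
                    k[i] = r * np.cos(theta), r * np.sin(theta)
                    i += 1
            return k

        rng = np.random.default_rng(2)
        k_positions = lowpass_weighted_led_positions(n_leds=68, k_max=1.0, rng=rng)
        print(k_positions[:5])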

  8. Icy Soil Acquisition Device for the 2007 Phoenix Mars Lander

    NASA Technical Reports Server (NTRS)

    Chu, Philip; Wilson, Jack; Davis, Kiel; Shiraishi, Lori; Burke, Kevin

    2008-01-01

    The Icy Soil Acquisition Device is a first of its kind mechanism that is designed to acquire ice-bearing soil from the surface of the Martian polar region and transfer the samples to analytical instruments, playing a critical role in the potential discovery of existing water on Mars. The device incorporates a number of novel features that further the state of the art in spacecraft design for harsh environments, sample acquisition and handling, and high-speed low torque mechanism design.

  9. Sparse-sampling with time-encoded (TICO) stimulated Raman scattering for fast image acquisition

    NASA Astrophysics Data System (ADS)

    Hakert, Hubertus; Eibl, Matthias; Karpf, Sebastian; Huber, Robert

    2017-07-01

    Modern biomedical imaging modalities aim to provide researchers with multimodal contrast for a deeper insight into the specimen under investigation. A very promising technique is stimulated Raman scattering (SRS) microscopy, which can unveil the chemical composition of a sample with very high specificity. Although the signal intensities are enhanced manifold, allowing faster image acquisition compared to standard Raman microscopy, there is a trade-off between specificity and acquisition speed. Commonly used SRS concepts either probe only a very few Raman transitions, because tuning the applied laser sources is complicated, or record whole spectra with a spectrometer-based setup. While the first approach is fast, it reduces the specificity; the spectrometer approach records whole spectra, including energy differences where no Raman information is present, which limits the acquisition speed. Therefore, we present a new approach based on the TICO-Raman concept, which we call sparse-sampling. The TICO sparse-sampling setup is fully electronically controllable and allows probing of only the characteristic peaks of a Raman spectrum instead of always acquiring a whole spectrum. By reducing the spectral points to the relevant peaks, the acquisition time can be greatly reduced compared to a uniformly, equidistantly sampled Raman spectrum, while the specificity and the signal-to-noise ratio (SNR) are maintained. Furthermore, all laser sources are completely fiber based. The synchronized detection enables a full resolution of the Raman signal, whereas the analogue and digital balancing allows shot-noise-limited detection. First imaging results with polystyrene (PS) and polymethylmethacrylate (PMMA) beads confirm the advantages of TICO sparse-sampling. We achieved a pixel dwell time as low as 35 μs for an image differentiating both species. The mechanical properties of the voice coil stage used for scanning the sample currently limit even faster acquisition.

  10. Signal processing and general purpose data acquisition system for on-line tomographic measurements

    NASA Astrophysics Data System (ADS)

    Murari, A.; Martin, P.; Hemming, O.; Manduchi, G.; Marrelli, L.; Taliercio, C.; Hoffmann, A.

    1997-01-01

    New analog signal conditioning electronics and data acquisition systems have been developed for the soft x-ray and bolometric tomography diagnostic in the reverse field pinch experiment (RFX). For the soft x-ray detectors, the analog signal processing includes a fully differential current-to-voltage conversion with up to a 200 kHz bandwidth. For the bolometers, a 50 kHz carrier-frequency amplifier allows a maximum bandwidth of 10 kHz. In both cases the analog signals are digitized with a 1 MHz sampling rate close to the diagnostic and are transmitted via a transparent asynchronous xmitter/receiver interface (TAXI) link to purpose-built Versa Module Europa (VME) modules which perform data acquisition. A software library has been developed for data preprocessing and tomographic reconstruction. It is written in C and is self-contained, i.e., no additional mathematical library is required. The package is therefore platform-independent: in particular, it can perform online analysis in a real-time application, such as continuous display and feedback, and is portable for long-duration fusion or other physical experiments. Due to the modular organization of the library, new preprocessing and analysis modules can be easily integrated into the environment. This software is implemented in RFX over three different platforms: OpenVMS, Digital Unix, and a VME 68040 CPU.

  11. Acoustically levitated droplets: a contactless sampling method for fluorescence studies.

    PubMed

    Leiterer, Jork; Grabolle, Markus; Rurack, Knut; Resch-Genger, Ute; Ziegler, Jan; Nann, Thomas; Panne, Ulrich

    2008-01-01

    Acoustic levitation is used as a new tool to study concentration-dependent processes in fluorescence spectroscopy. With this technique, small amounts of liquid and solid samples can be measured without the need for sample supports or containers, which often limits signal acquisition and can even alter sample properties due to interactions with the support material. We demonstrate that, because of the small sample volume, fluorescence measurements at high concentrations of an organic dye are possible without the limitation of inner-filter effects, which hamper such experiments in conventional, cuvette-based measurements. Furthermore, we show that acoustic levitation of liquid samples provides an experimentally simple way to study distance-dependent fluorescence modulations in semiconductor nanocrystals. The evaporation of the solvent during levitation leads to a continuous increase of solute concentration and can easily be monitored by laser-induced fluorescence.

  12. Light sheet microscopy.

    PubMed

    Weber, Michael; Mickoleit, Michaela; Huisken, Jan

    2014-01-01

    This chapter introduces the concept of light sheet microscopy along with practical advice on how to design and build such an instrument. Selective plane illumination microscopy is presented as an alternative to confocal microscopy due to several superior features such as high-speed full-frame acquisition, minimal phototoxicity, and multiview sample rotation. Based on our experience over the last 10 years, we summarize the key concepts in light sheet microscopy, typical implementations, and successful applications. In particular, sample mounting for long time-lapse imaging and the resulting challenges in data processing are discussed in detail. © 2014 Elsevier Inc. All rights reserved.

  13. Specific methodology for capacitance imaging by atomic force microscopy: A breakthrough towards an elimination of parasitic effects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Estevez, Ivan; Concept Scientific Instruments, ZA de Courtaboeuf, 2 rue de la Terre de Feu, 91940 Les Ulis; Chrétien, Pascal

    2014-02-24

    On the basis of a home-made nanoscale impedance measurement device associated with a commercial atomic force microscope, a specific operating process is proposed in order to improve absolute (in the sense of “nonrelative”) capacitance imaging by drastically reducing the parasitic effects due to stray capacitance, surface topography, and sample tilt. The method, combining a two-pass image acquisition with the exploitation of approach curves, has been validated on sets of calibration samples consisting of square parallel-plate capacitors for which theoretical capacitance values were numerically calculated.

  14. 48 CFR 50.103-5 - Processing cases.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Processing cases. 50.103-5 Section 50.103-5 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION CONTRACT MANAGEMENT... knowledge of individuals when documentary evidence is lacking, and audits if considered necessary to...

  15. On Temperature Rise Within the Shear Bands in Bulk Metallic Glasses

    NASA Astrophysics Data System (ADS)

    Bazlov, A. I.; Churyumov, A. Yu.; Buchet, M.; Louzguine-Luzgin, D. V.

    2018-05-01

    The room-temperature deformation process in a bulk metallic glassy sample was studied using a hydraulic thermomechanical simulator. The temperature rise during each separate shear band propagation event was measured with a high data acquisition frequency by a thermocouple welded to the sample. Calculations showed that when propagation of the well-developed shear bands takes place along the entire sample, the temperature inside the shear band should be close to the glass-transition temperature. It was also possible to resolve the temporal stress distribution, and a double-stage character of the stress drops was observed. The obtained results are compared with literature data obtained by infrared camera measurements and with the results of finite element modeling.

  16. On Temperature Rise Within the Shear Bands in Bulk Metallic Glasses

    NASA Astrophysics Data System (ADS)

    Bazlov, A. I.; Churyumov, A. Yu.; Buchet, M.; Louzguine-Luzgin, D. V.

    2018-03-01

    The room-temperature deformation process in a bulk metallic glassy sample was studied using a hydraulic thermomechanical simulator. The temperature rise during each separate shear band propagation event was measured with a high data acquisition frequency by a thermocouple welded to the sample. Calculations showed that when propagation of the well-developed shear bands takes place along the entire sample, the temperature inside the shear band should be close to the glass-transition temperature. It was also possible to resolve the temporal stress distribution, and a double-stage character of the stress drops was observed. The obtained results are compared with literature data obtained by infrared camera measurements and with the results of finite element modeling.

  17. Applications of nonlocal means algorithm in low-dose X-ray CT image processing and reconstruction: a review

    PubMed Central

    Zhang, Hao; Zeng, Dong; Zhang, Hua; Wang, Jing; Liang, Zhengrong

    2017-01-01

    Low-dose X-ray computed tomography (LDCT) imaging is highly recommended for use in the clinic because of growing concerns over excessive radiation exposure. However, the CT images reconstructed by the conventional filtered back-projection (FBP) method from low-dose acquisitions may be severely degraded with noise and streak artifacts due to excessive X-ray quantum noise, or with view-aliasing artifacts due to insufficient angular sampling. In 2005, the nonlocal means (NLM) algorithm was introduced as a non-iterative edge-preserving filter to denoise natural images corrupted by additive Gaussian noise, and it showed superior performance. It has since been adapted and applied to many other image types and various inverse problems. This paper specifically reviews the applications of the NLM algorithm in LDCT image processing and reconstruction, and explicitly demonstrates its benefits in improving the quality of CT images reconstructed from low-dose acquisitions. The effectiveness of these applications on LDCT and their relative performance are described in detail. PMID:28303644
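
    For readers unfamiliar with the NLM filter itself, a minimal denoising example using scikit-image's implementation is sketched below; the noise model and filter parameters are illustrative, and the CT-specific adaptations reviewed in the paper are not reproduced.

        import numpy as np
        from skimage.data import camera
        from skimage.restoration import denoise_nl_means, estimate_sigma
        from skimage.util import img_as_float

        image = img_as_float(camera())
        noisy = image + 0.08 * np.random.default_rng(0).standard_normal(image.shape)

        sigma = float(np.mean(estimate_sigma(noisy)))       # rough noise-level estimate
        denoised = denoise_nl_means(noisy,
                                    h=0.8 * sigma,          # filtering strength tied to noise level
                                    sigma=sigma,
                                    patch_size=5,           # size of patches being compared
                                    patch_distance=6,       # search window radius
                                    fast_mode=True)

        print("noisy RMSE:   ", np.sqrt(np.mean((noisy - image) ** 2)))
        print("denoised RMSE:", np.sqrt(np.mean((denoised - image) ** 2)))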

  18. Single input state, single-mode fiber-based polarization sensitive optical frequency domain imaging by eigenpolarization referencing

    PubMed Central

    Lippok, Norman; Villiger, Martin; Jun, Chang-Su; Bouma, Brett E.

    2015-01-01

    Fiber-based polarization sensitive OFDI is more challenging than free-space implementations. Using multiple input states, fiber-based systems provide sample birefringence information with the benefit of a flexible sample arm but come at the cost of increased system and acquisition complexity, and either reduce acquisition speed or require increased acquisition bandwidth. Here we show that with the calibration of a single polarization state, fiber-based configurations can approach the conceptual simplicity of traditional free-space configurations. We remotely control the polarization state of the light incident at the sample using the eigenpolarization states of a wave plate as a reference, and determine the Jones matrix of the output fiber. We demonstrate this method for polarization sensitive imaging of biological samples. PMID:25927775

  19. Erosion Modeling in Central China - Soil Data Acquisition by Conditioned Latin Hypercube Sampling and Incorporation of Legacy Data

    NASA Astrophysics Data System (ADS)

    Stumpf, Felix; Schönbrodt-Stitt, Sarah; Schmidt, Karsten; Behrens, Thorsten; Scholten, Thomas

    2013-04-01

    The Three Gorges Dam at the Yangtze River in Central China is a prominent example of human-induced environmental impacts. Over the course of a year, the water level at the main river fluctuates by about 30 m due to impoundment and drainage activities. The dynamic water level gives rise to a range of georisks such as soil erosion, mass movements, sediment transport and diffuse matter inputs into the reservoir. Within the framework of the joint Sino-German project YANGTZE GEO, the subproject "Soil Erosion" deals with soil erosion risks and sediment transport pathways into the reservoir. The study site is a small catchment (4.8 km²) in Badong, approximately 100 km upstream of the dam. It is characterized by scattered plots of agricultural land use and resettlements in a largely wooded, steeply sloping and mountainous area. Our research is focused on data acquisition and processing to develop a process-oriented erosion model. Area-covering knowledge of specific soil properties in the catchment is an essential input parameter. It is acquired by means of digital soil mapping (DSM), in which soil properties are estimated from covariates using functions calibrated with soil property samples. The DSM approach is based on an appropriate sample design that reflects the heterogeneity of the catchment with regard to the covariates influencing the relevant soil properties. In this approach the covariates, derived from a digital terrain analysis, are slope, altitude, profile curvature, plan curvature, and aspect. For the development of the sample design, we chose the Conditioned Latin Hypercube Sampling (cLHS) procedure (Minasny and McBratney, 2006). It provides an efficient method of sampling variables from their multivariate distribution: a sample of size n is drawn from multiple variables such that, for each variable, the sample is marginally maximally stratified. The method ensures maximal stratification by two features: first, the number of strata equals the sample size n, and second, the probability of falling in each stratum is 1/n (McKay et al., 1979). We extended the classical cLHS-with-extremes approach (Schmidt et al., 2012) by incorporating legacy data from previous field campaigns. Instead of identifying precise sample locations by cLHS, we demarcate the multivariate attribute space of the samples based on the histogram borders of each stratum. This widens the spatial scope of the actual cLHS sample locations and allows the incorporation of legacy data lying within that scope. Furthermore, this approach provides greater flexibility regarding the accessibility of sample sites in the field.
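
    A toy sketch of the cLHS selection step: given a covariate table for candidate locations (random numbers standing in for the terrain attributes), quantile strata are built per covariate and candidate subsets are scored by how evenly they fill those strata, following the objective of Minasny and McBratney (2006). The random-search optimizer is a simplification of their annealing scheme, and all data here are synthetic.

        import numpy as np

        def clhs_objective(subset, covariates, edges):
            # deviation of per-stratum counts from the ideal of one sample per stratum,
            # summed over all covariates (the core cLHS criterion)
            score = 0.0
            for j in range(covariates.shape[1]):
                counts, _ = np.histogram(covariates[subset, j], bins=edges[j])
                score += np.abs(counts - 1).sum()
            return score

        rng = np.random.default_rng(0)
        candidates = rng.random((5000, 5))                  # stand-in for slope, altitude, curvatures, aspect
        n_samples = 30

        # quantile strata: as many strata per covariate as samples to be drawn
        edges = [np.quantile(candidates[:, j], np.linspace(0.0, 1.0, n_samples + 1))
                 for j in range(candidates.shape[1])]

        # plain random search over candidate subsets instead of simulated annealing
        best, best_score = None, np.inf
        for _ in range(2000):
            subset = rng.choice(len(candidates), n_samples, replace=False)
            score = clhs_objective(subset, candidates, edges)
            if score < best_score:
                best, best_score = subset, score
        print("objective:", best_score, "selected locations:", np.sort(best)[:10], "...")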

  20. Computer-Aided Process and Tools for Mobile Software Acquisition

    DTIC Science & Technology

    2013-04-01

    Software Acquisition Christopher Bonine, Man-Tak Shing, and Thomas W. Otani Naval Postgraduate School Published April 1, 2013 Approved for public...ManTech International Corporation Computer-Aided Process and Tools for Mobile Software Acquisition Christopher Bonine, Man-Tak Shing, and Thomas W. Otani...Mobile Software Acquisition Christopher Bonine — Bonine is a lieutenant in the United States Navy. He is currently assigned to the Navy Cyber Defense

  1. Nonuniform sampling and non-Fourier signal processing methods in multidimensional NMR.

    PubMed

    Mobli, Mehdi; Hoch, Jeffrey C

    2014-11-01

    Beginning with the introduction of Fourier Transform NMR by Ernst and Anderson in 1966, time domain measurement of the impulse response (the free induction decay, FID) consisted of sampling the signal at a series of discrete intervals. For compatibility with the discrete Fourier transform (DFT), the intervals are kept uniform, and the Nyquist theorem dictates the largest value of the interval sufficient to avoid aliasing. With the proposal by Jeener of parametric sampling along an indirect time dimension, extension to multidimensional experiments employed the same sampling techniques used in one dimension, similarly subject to the Nyquist condition and suitable for processing via the discrete Fourier transform. The challenges of obtaining high-resolution spectral estimates from short data records using the DFT were already well understood, however. Despite techniques such as linear prediction extrapolation, the achievable resolution in the indirect dimensions is limited by practical constraints on measuring time. The advent of non-Fourier methods of spectrum analysis capable of processing nonuniformly sampled data has led to an explosion in the development of novel sampling strategies that avoid the limits on resolution and measurement time imposed by uniform sampling. The first part of this review discusses the many approaches to data sampling in multidimensional NMR, the second part highlights commonly used methods for signal processing of such data, and the review concludes with a discussion of other approaches to speeding up data acquisition in NMR. Copyright © 2014 Elsevier B.V. All rights reserved.
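
    A common way to realize nonuniform sampling of an indirect dimension is to draw increments with an exponentially biased probability, so that early, high-signal evolution times are measured more densely. The snippet below is a minimal sketch of such a schedule generator; the grid size, sampling fraction, and decay constant are illustrative and not taken from the review.

```python
import numpy as np

def nus_schedule(n_grid, n_sampled, decay=0.3, seed=1):
    """Draw an exponentially biased nonuniform sampling schedule.

    n_grid    : number of points on the underlying uniform Nyquist grid
    n_sampled : number of increments actually measured
    decay     : decay constant in units of the full evolution window
    """
    rng = np.random.default_rng(seed)
    t = np.arange(n_grid) / n_grid
    weights = np.exp(-t / decay)
    weights /= weights.sum()
    picks = rng.choice(n_grid, size=n_sampled, replace=False, p=weights)
    return np.sort(picks)

# e.g. measure 64 of 256 increments (25% sampling density)
print(nus_schedule(256, 64)[:12])
```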

  2. New Tools and Methods for Assessing Risk-Management Strategies

    DTIC Science & Technology

    2004-03-01

    Theories to evaluate the risks and benefits of various acquisition alternatives and allowed researchers to monitor the process students used to make a...revealed distinct risk-management strategies. 15. SUBJECT TERMS risk management, acquisition process, expected value theory, multi-attribute utility theory ...Utility Theories to evaluate the risks and benefits of various acquisition alternatives, and allowed us to monitor the process subjects used to arrive at

  3. Three Big Ideas for Reforming Acquisition: Evidence-Based Propositions for Transformation

    DTIC Science & Technology

    2015-04-30

    specific ideas for improving key aspects of defense acquisition reforming the process for managing capabilities, addressing technology insertion, and...offers three specific ideas for improving key aspects of defense acquisition: reforming the process for managing capabilities, addressing technology...and process changes need to be made for any significant change to be seen. This paper offers reform ideas in three specific areas: achieving the

  4. Waterfall notch-filtering for restoration of acoustic backscatter records from Admiralty Bay, Antarctica

    NASA Astrophysics Data System (ADS)

    Fonseca, Luciano; Hung, Edson Mintsu; Neto, Arthur Ayres; Magrani, Fábio José Guedes

    2018-06-01

    A series of multibeam sonar surveys were conducted from 2009 to 2013 around Admiralty Bay, Shetland Islands, Antarctica. These surveys provided a detailed bathymetric model that helped understand and characterize the bottom geology of this remote area. Unfortunately, the acoustic backscatter records registered during these bathymetric surveys were heavily contaminated with noise and motion artifacts. These artifacts persisted in the backscatter records despite the fact that the proper acquisition geometry and the necessary offsets and delays were applied during the survey and in post-processing. These noisy backscatter records were very difficult to interpret and to correlate with gravity-core samples acquired in the same area. In order to address this issue, a directional notch-filter was applied to the backscatter waterfall in the along-track direction. The proposed filter provided better estimates for the backscatter strength of each sample by considerably reducing residual motion artifacts. The restoration of individual samples was possible since the waterfall frame of reference preserves the acquisition geometry. Then, a remote seafloor characterization procedure based on an acoustic model inversion was applied to the restored backscatter samples, generating remote estimates of acoustic impedance. These remote estimates were compared to Multi Sensor Core Logger measurements of acoustic impedance obtained from gravity core samples. The remote estimates and the Core Logger measurements of acoustic impedance were comparable when the shallow seafloor was homogeneous. The proposed waterfall notch-filtering approach can be applied to any sonar record, provided that we know the system ping-rate and sampling frequency.
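
    The paper does not spell out the filter coefficients, but the idea of a directional notch filter on the backscatter waterfall can be sketched as follows: transform along the ping (along-track) axis only and zero a narrow band of ping-to-ping frequencies associated with residual motion. The notch position and width below are placeholders, not values from the survey.

```python
import numpy as np

def alongtrack_notch(waterfall, notch_center, notch_halfwidth):
    """Suppress a narrow band of along-track frequencies in a backscatter waterfall.

    waterfall       : 2D array (n_pings, n_samples) of backscatter values
    notch_center    : normalized ping-to-ping frequency to remove (0..0.5 cycles/ping)
    notch_halfwidth : half-width of the notch in cycles/ping
    """
    spec = np.fft.fft(waterfall, axis=0)           # FFT along the ping axis only
    freqs = np.fft.fftfreq(waterfall.shape[0])     # cycles per ping
    notch = np.abs(np.abs(freqs) - notch_center) < notch_halfwidth
    spec[notch, :] = 0.0                           # remove the motion-artifact band
    return np.real(np.fft.ifft(spec, axis=0))

# Example: remove a residual roll artifact near 0.1 cycles/ping
waterfall = np.random.randn(512, 256)
clean = alongtrack_notch(waterfall, notch_center=0.1, notch_halfwidth=0.01)
print(clean.shape)
```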

  5. Development of induction current acquisition device based on ARM

    NASA Astrophysics Data System (ADS)

    Ji, Yanju; Liu, Xiyang; Huang, Wanyu; Yao, Jiang; Yuan, Guiyang; Hui, Luan; Guan, Shanshan

    2018-03-01

    We designed an induction current acquisition device based on an ARM processor in order to achieve high-resolution, high-sampling-rate acquisition of the induction current in a wire loop. Because the signal decays quickly and has a small amplitude, we use multi-path fusion for noise suppression. The design is described from three aspects: analog circuit and device selection, an independent power supply structure, and suppression of high-frequency electromagnetic interference. DMA transfers combined with a ping-pong buffer, as the data transmission scheme, solve the real-time storage problem for the large data volumes involved. The performance parameters of the ARM acquisition device were tested, and a comparison with a cRIO acquisition device was performed at different time constants. The results show a 120 dB dynamic range, 47 kHz bandwidth, 96 kHz sampling rate, and a minimum resolution of 5 μV, with an average error of no more than 4%, demonstrating the high accuracy and stability of the device.
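
    The ping-pong (double) buffering scheme mentioned above can be illustrated with a small producer/consumer simulation: one buffer is filled while the other is drained to storage, so acquisition and storage overlap. This is only a conceptual Python sketch; the real device does this with DMA in hardware, and the buffer size and block count below are arbitrary.

```python
import threading, queue
import numpy as np

BUF_LEN = 4096                       # samples per buffer (arbitrary for the sketch)
free_bufs = queue.Queue()            # buffers ready to be filled
full_bufs = queue.Queue()            # buffers waiting to be written out
for _ in range(2):                   # the "ping" and the "pong" buffer
    free_bufs.put(np.empty(BUF_LEN))

def acquire(n_blocks):
    """Producer: fill whichever buffer is currently free (stand-in for the DMA engine)."""
    for _ in range(n_blocks):
        buf = free_bufs.get()
        buf[:] = np.random.randn(BUF_LEN)   # stand-in for an ADC/DMA block transfer
        full_bufs.put(buf)
    full_bufs.put(None)                     # sentinel: acquisition finished

def store():
    """Consumer: write out the full buffer while the other one is being filled."""
    while True:
        buf = full_bufs.get()
        if buf is None:
            break
        _ = buf.mean()                      # stand-in for writing to disk / DDR memory
        free_bufs.put(buf)                  # hand the buffer back to the producer

t1 = threading.Thread(target=acquire, args=(100,))
t2 = threading.Thread(target=store)
t1.start(); t2.start(); t1.join(); t2.join()
print("100 blocks acquired with two alternating buffers")
```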

  6. Implicit and Explicit Cognitive Processes in Incidental Vocabulary Acquisition

    ERIC Educational Resources Information Center

    Ender, Andrea

    2016-01-01

    Studies on vocabulary acquisition in second language learning have revealed that a large amount of vocabulary is learned without an overt intention, in other words, incidentally. This article investigates the relevance of different lexical processing strategies for vocabulary acquisition when reading a text for comprehension among 24 advanced…

  7. Automated data acquisition technology development: Automated modeling and control development

    NASA Technical Reports Server (NTRS)

    Romine, Peter L.

    1995-01-01

    This report documents the completion of, and improvements made to, the software developed for automated data acquisition and automated modeling and control development on the Texas Micro rack-mounted PCs. This research was initiated because the Metal Processing Branch of NASA Marshall Space Flight Center identified a need for a mobile data acquisition and data analysis system customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument strictly for data acquisition and data analysis. In addition to the data acquisition functions described here, the WMS also supports many functions associated with process control. The hardware and software requirements for an automated acquisition system for welding process parameters, welding equipment checkout, and welding process modeling were determined in 1992. From these recommendations, NASA purchased the necessary hardware and software. The new welding acquisition system is designed to collect welding parameter data and perform analysis to determine the voltage-versus-current arc-length relationship for VPPA welding. Once the results of this analysis are obtained, they can be used to develop a RAIL function to control welding startup and shutdown without crashing the torch.
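
    In its simplest form, the arc-length analysis described above reduces to fitting a calibration curve from logged welding data. The sketch below fits a linear voltage-versus-arc-length relation at fixed current with NumPy and inverts it for use as feedback; the numbers are hypothetical and do not come from the WMS.

```python
import numpy as np

# Hypothetical logged VPPA welding data: arc voltage (V) at several arc lengths (mm),
# taken at constant current.
arc_length_mm = np.array([2.0, 3.0, 4.0, 5.0, 6.0])
arc_voltage_v = np.array([24.1, 26.0, 27.8, 29.9, 31.7])

# Least-squares linear fit: V = a * L + b
a, b = np.polyfit(arc_length_mm, arc_voltage_v, 1)
print(f"V = {a:.2f} * L + {b:.2f}")

def arc_length_from_voltage(v):
    """Invert the fitted relation to estimate arc length from a measured voltage."""
    return (v - b) / a

# A startup/shutdown controller could use such an estimate as feedback.
print(f"estimated arc length at 28.5 V: {arc_length_from_voltage(28.5):.2f} mm")
```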

  8. Off-line real-time FTIR analysis of a process step in imipenem production

    NASA Astrophysics Data System (ADS)

    Boaz, Jhansi R.; Thomas, Scott M.; Meyerhoffer, Steven M.; Staskiewicz, Steven J.; Lynch, Joseph E.; Egan, Richard S.; Ellison, Dean K.

    1992-08-01

    We have developed an FT-IR method, using a Spectra-Tech Monit-IR 400 system, to monitor the completion of a reaction off-line in real time. The reaction is moisture-sensitive, and analysis by more conventional methods (normal-phase HPLC) is difficult to reproduce. The FT-IR method is based on the shift of a diazo band when a conjugated beta-diketone is transformed into a silyl enol ether during the reaction. The reaction mixture is examined directly by IR and does not require sample workup. Data acquisition time is less than one minute. The method has been validated for specificity, precision, and accuracy. The results obtained by the FT-IR method for known mixtures and in-process samples compare favorably with those from a normal-phase HPLC method.

  9. Spectral Dynamics Inc., ships hybrid, 316-channel data acquisition system to Sandia Labs.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwartz, Douglas

    2003-09-01

    Spectral Dynamics announced the shipment of a 316-channel data acquisition system. The system was custom designed for the Light Initiated High Explosive (LIHE) facility at Sandia Labs in Albuquerque, New Mexico by the Spectral Dynamics Advanced Research Products Group. This Spectral Dynamics data acquisition system was tailored to meet the unique LIHE environmental and testing requirements utilizing Spectral Dynamics commercial off-the-shelf (COTS) Jaguar and VIDAS products supplemented by SD Alliance partners' (COTS) products. 'This system is just the beginning of our cutting edge merged technology solutions,' stated Mark Remelman, Manager for the Spectral Dynamics Advanced Research Products Group. 'This Hybrid system has 316 channels of data acquisition capability, comprised of 102.4 kHz direct-to-disk acquisition and 2.5 MHz, 200 MHz and 500 MHz RAM-based capabilities. In addition it incorporates the advanced bridge conditioning and dynamic configuration capabilities offered by Spectral Dynamics new Smart Interface Panel System (SIPS{trademark}).' After acceptance testing, Tony King, the Instrumentation Engineer facilitating the project for the Sandia LIHE group, commented: 'The LIHE staff was very impressed with the design, construction, attention to detail and overall performance of the instrumentation system.' This system combines VIDAS, a leading edge fourth generation SD-VXI hardware and field-proven software system from SD's Advanced Research Products Group, with SD's Jaguar, a multiple Acquisition Control Peripheral (ACP) system that allows expansion to hundreds of channels without sacrificing signal processing performance. Jaguar incorporates dedicated throughput disks for each ACP, providing time streaming to disk at up to the maximum sample rate. Spectral Dynamics, Inc. is a leading worldwide supplier of systems and software for advanced computer-automated data acquisition, vibration testing, structural dynamics, explosive shock, high-speed transient capture, acoustic analysis, monitoring, measurement, control and backup. Spectral Dynamics products are used for research, design verification, product testing and process improvement by manufacturers of all types of electrical, electronic and mechanical products, as well as by universities and government-funded agencies. The Advanced Research Products Group is the newest addition to the Spectral Dynamics family. Their newest VXI data acquisition hardware pushes the envelope on capabilities and embodies the same rock solid design methodologies, which have always differentiated Spectral Dynamics from its competition.

  10. Full-field transient vibrometry of the human tympanic membrane by local phase correlation and high-speed holography

    PubMed Central

    Dobrev, Ivo; Furlong, Cosme; Cheng, Jeffrey T.; Rosowski, John J.

    2014-01-01

    Understanding the human hearing process would be helped by quantification of the transient mechanical response of the human ear, including the human tympanic membrane (TM or eardrum). We propose a new hybrid high-speed holographic system (HHS) for acquisition and quantification of the full-field nanometer transient (i.e., >10 kHz) displacement of the human TM. We have optimized and implemented a 2+1 frame local correlation (LC) based phase sampling method in combination with a high-speed (i.e., >40 K fps) camera acquisition system. To our knowledge, there is currently no existing system that provides such capabilities for the study of the human TM. The LC sampling method has a displacement difference of <11 nm relative to measurements obtained by a four-phase step algorithm. Comparisons between our high-speed acquisition system and a laser Doppler vibrometer indicate differences of <10 μs. The high temporal (i.e., >40 kHz) and spatial (i.e., >100 k data points) resolution of our HHS enables parallel measurements of all points on the surface of the TM, which allows quantification of spatially dependent motion parameters, such as modal frequencies and acoustic delays. Such capabilities could allow inferring local material properties across the surface of the TM. PMID:25191832

  11. Characterization of Decision Making Behaviors Associated with Human Systems Integration (HSI) Design Tradeoffs: Subject Matter Expert Interviews

    DTIC Science & Technology

    2014-11-18

    this research was to characterize the naturalistic decision making process used in Naval Aviation acquisition to assess cost, schedule and...Naval Aviation acquisitions can be identified, which can support the future development of new processes and tools for training and decision making...part of Department of Defense acquisition processes, HSI ensures that operator, maintainer and sustainer considerations are incorporated into

  12. An automated atmospheric sampling system operating on 747 airliners

    NASA Technical Reports Server (NTRS)

    Perkins, P.; Gustafsson, U. R. C.

    1975-01-01

    An air sampling system that automatically measures the temporal and spatial distribution of selected particulate and gaseous constituents of the atmosphere has been installed on a number of commercial airliners and is collecting data on commercial air routes covering the world. Measurements of constituents related to aircraft engine emissions and other pollutants are made in the upper troposphere and lower stratosphere (6 to 12 km) in support of the Global Air Sampling Program (GASP). Aircraft operated by different airlines sample air at latitudes from the Arctic to Australia. This system includes specialized instrumentation for measuring carbon monoxide, ozone, water vapor, and particulates, a special air inlet probe for sampling outside air, a computerized automatic control, and a data acquisition system. Air constituents and related flight data are tape recorded in flight for later computer processing on the ground.

  13. Pore water sampling in acid sulfate soils: a new peeper method.

    PubMed

    Johnston, Scott G; Burton, Edward D; Keene, Annabelle F; Bush, Richard T; Sullivan, Leigh A; Isaacson, Lloyd

    2009-01-01

    This study describes the design, deployment, and application of a modified equilibration dialysis device (peeper) optimized for sampling pore waters in acid sulfate soils (ASS). The modified design overcomes the limitations of traditional-style peepers when sampling firm ASS materials over relatively large depth intervals. The new peeper device uses removable, individual cells of 25 mL volume housed in a 1.5 m long rigid, high-density polyethylene rod. The rigid housing structure allows the device to be inserted directly into relatively firm soils without requiring a supporting frame. The use of removable cells eliminates the need for a large glove-box after peeper retrieval, thus simplifying physical handling. Removable cells are easily maintained in an inert atmosphere during sample processing and the 25-mL sample volume is sufficient for undertaking multiple analyses. A field evaluation of equilibration times indicates that 32 to 38 d of deployment was necessary. Overall, the modified method is simple and effective and well suited to acquisition and processing of redox-sensitive pore water profiles >1 m deep in acid sulfate soil or any other firm wetland soils.

  14. Studying Dynamic Processes of Nano-sized Objects in Liquid using Scanning Transmission Electron Microscopy.

    PubMed

    Hermannsdörfer, Justus; de Jonge, Niels

    2017-02-05

    Samples fully embedded in liquid can be studied at a nanoscale spatial resolution with Scanning Transmission Electron Microscopy (STEM) using a microfluidic chamber assembled in the specimen holder for Transmission Electron Microscopy (TEM) and STEM. The microfluidic system consists of two silicon microchips supporting thin Silicon Nitride (SiN) membrane windows. This article describes the basic steps of sample loading and data acquisition. Most important of all is to ensure that the liquid compartment is correctly assembled, thus providing a thin liquid layer and a vacuum seal. This protocol also includes a number of tests necessary to perform during sample loading in order to ensure correct assembly. Once the sample is loaded in the electron microscope, the liquid thickness needs to be measured. Incorrect assembly may result in a too-thick liquid, while a too-thin liquid may indicate the absence of liquid, such as when a bubble is formed. Finally, the protocol explains how images are taken and how dynamic processes can be studied. A sample containing AuNPs is imaged both in pure water and in saline.

  15. Studying Dynamic Processes of Nano-sized Objects in Liquid using Scanning Transmission Electron Microscopy

    PubMed Central

    Hermannsdörfer, Justus; de Jonge, Niels

    2017-01-01

    Samples fully embedded in liquid can be studied at a nanoscale spatial resolution with Scanning Transmission Electron Microscopy (STEM) using a microfluidic chamber assembled in the specimen holder for Transmission Electron Microscopy (TEM) and STEM. The microfluidic system consists of two silicon microchips supporting thin Silicon Nitride (SiN) membrane windows. This article describes the basic steps of sample loading and data acquisition. Most important of all is to ensure that the liquid compartment is correctly assembled, thus providing a thin liquid layer and a vacuum seal. This protocol also includes a number of tests necessary to perform during sample loading in order to ensure correct assembly. Once the sample is loaded in the electron microscope, the liquid thickness needs to be measured. Incorrect assembly may result in a too-thick liquid, while a too-thin liquid may indicate the absence of liquid, such as when a bubble is formed. Finally, the protocol explains how images are taken and how dynamic processes can be studied. A sample containing AuNPs is imaged both in pure water and in saline. PMID:28190028

  16. An Open-Source Storage Solution for Cryo-Electron Microscopy Samples.

    PubMed

    Ultee, Eveline; Schenkel, Fred; Yang, Wen; Brenzinger, Susanne; Depelteau, Jamie S; Briegel, Ariane

    2018-02-01

    Cryo-electron microscopy (cryo-EM) enables the study of biological structures in situ in great detail and the solution of protein structures at Ångstrom-level resolution. Due to recent advances in instrumentation and data processing, the field of cryo-EM is growing rapidly. Access to facilities and national centers that house the state-of-the-art microscopes is limited by ever-rising demand, resulting in long wait times between sample preparation and data acquisition. To improve sample storage, we have developed a cryo-storage system with an efficient, high storage capacity that keeps samples organized. This system is simple to use, cost-effective and easily adaptable for any type of grid storage box and dewar and any size of cryo-EM laboratory.

  17. Laser-induced photo emission detection: data acquisition based on light intensity counting

    NASA Astrophysics Data System (ADS)

    Yulianto, N.; Yudasari, N.; Putri, K. Y.

    2017-04-01

    Laser-induced breakdown detection (LIBD) is a quantification technique for colloids. There are two modes of detection in LIBD: optical and acoustic. LIBD is based on the detection of plasma emission due to the interaction between a particle and the laser beam. In this work, the change in light intensity during plasma formation was detected by a photodiode sensor. A photo emission data acquisition system was built to collect these signals and transform them into digital counts. The real-time system used a National Instruments DAQ 6009 data acquisition device and LabVIEW software. The system was tested on distilled water and tap water samples. The results showed 99.8% agreement of the counting technique with acoustic detection at a sample rate of 10 Hz, so the acquisition system can be applied as an alternative to the existing LIBD acquisition approach.
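
    The light-intensity counting step amounts to detecting discrete breakdown flashes in the photodiode record and counting them per unit time. The following is a minimal sketch on a synthetic trace; the threshold and pulse shape are illustrative rather than values from the paper.

```python
import numpy as np

def count_breakdown_events(signal, threshold):
    """Count rising-edge threshold crossings in a photodiode record."""
    above = signal > threshold
    rising = above[1:] & ~above[:-1]     # True where the signal crosses upward
    return int(rising.sum())

# Synthetic photodiode trace: baseline noise plus three plasma flashes
rng = np.random.default_rng(0)
trace = 0.02 * rng.standard_normal(10_000)
for i in (1200, 4600, 7800):
    trace[i:i + 20] += 1.0               # simulated breakdown events

print(count_breakdown_events(trace, threshold=0.5))   # -> 3
```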

  18. Enhancement of the Acquisition Process for a Combat System-A Case Study to Model the Workflow Processes for an Air Defense System Acquisition

    DTIC Science & Technology

    2009-12-01

    Business Process Modeling BPMN Business Process Modeling Notation SoA Service-oriented Architecture UML Unified Modeling Language CSP...system developers. Supporting technologies include Business Process Modeling Notation (BPMN), Unified Modeling Language (UML), model-driven architecture

  19. Phonological Acquisition of Korean Consonants in Conversational Speech Produced by Young Korean Children

    ERIC Educational Resources Information Center

    Kim, Minjung; Kim, Soo-Jin; Stoel-Gammon, Carol

    2017-01-01

    This study investigates the phonological acquisition of Korean consonants using conversational speech samples collected from sixty monolingual typically developing Korean children aged two, three, and four years. Phonemic acquisition was examined for syllable-initial and syllable-final consonants. Results showed that Korean children acquired stops…

  20. Possible Detection of Perchlorates by Evolved Gas Analysis of Rocknest Soils: Global Implication

    NASA Technical Reports Server (NTRS)

    Archer, P. D., Jr.; Sutter, B.; Ming, D. W.; McKay, C. P.; Navarro-Gonzalez, R.; Franz, H. B.; McAdam, A.; Mahaffy, P. R.

    2013-01-01

    The Sample Analysis at Mars (SAM) instrument suite on board the Mars Science Laboratory (MSL) recently ran four samples from an aeolian bedform named Rocknest. Rocknest was selected as the source of the first samples analyzed because it is representative of both windblown material in Gale crater and the globally distributed dust. The four samples analyzed by SAM were portioned from the fifth scoop at this location. The material delivered to SAM passed through a 150 μm sieve and should have been well mixed during the sample acquisition/preparation/handoff process. Rocknest samples were heated to 835 °C at a 35 °C/min ramp rate with a He carrier gas flow rate of 1.5 standard cubic centimeters per minute and an oven pressure of 30 mbar. Evolved gases were detected by a quadrupole mass spectrometer (QMS).

  1. A system design of data acquisition and processing for side-scatter lidar

    NASA Astrophysics Data System (ADS)

    Zhang, ZhanYe; Xie, ChenBo; Wang, ZhenZhu; Kuang, ZhiQiang; Deng, Qian; Tao, ZongMing; Liu, Dong; Wang, Yingjian

    2018-03-01

    A system for collecting data from a side-scatter lidar based on a charge-coupled device (CCD) is designed and implemented. The data acquisition system is built on the Microsoft .NET framework, and C# is used to call the CCD dynamic link library (DLL) for real-time data acquisition and processing. The software stores the data as text files for subsequent analysis. The system can operate the CCD device all day in automatic, continuous, high-frequency acquisition and processing mode, capturing 24-hour records of the atmospheric scattered light intensity and retrieving the spatial and temporal properties of aerosol particles. The experimental results show that the system is convenient for observing aerosol optical characteristics near the surface.
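
    The acquisition loop itself is conceptually simple: grab a CCD exposure through the vendor library, timestamp it, append it to a text file, and repeat at a fixed cadence. The real system does this in C# against the camera DLL; the Python sketch below only mirrors that structure, with the frame read replaced by a placeholder.

```python
import time
import numpy as np

def read_ccd_frame():
    """Placeholder for the vendor library call that returns one CCD exposure."""
    return np.random.poisson(lam=100, size=2048).astype(np.uint16)

def acquire(n_frames, period_s, out_path="sidescatter_log.txt"):
    """Continuous acquisition: timestamp each frame and append it to a text file."""
    with open(out_path, "a") as f:
        for _ in range(n_frames):
            t0 = time.time()
            frame = read_ccd_frame()
            f.write(f"{t0:.3f}\t" + "\t".join(str(v) for v in frame) + "\n")
            # keep a fixed acquisition cadence
            time.sleep(max(0.0, period_s - (time.time() - t0)))

acquire(n_frames=10, period_s=1.0)   # e.g. ten frames at 1 Hz
```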

  2. Study of sample drilling techniques for Mars sample return missions

    NASA Technical Reports Server (NTRS)

    Mitchell, D. C.; Harris, P. T.

    1980-01-01

    To demonstrate the feasibility of acquiring various surface samples for a Mars sample return mission, the following tasks were performed: (1) design of a Mars rover-mounted drill system capable of acquiring crystalline rock cores, prediction of performance, mass, and power requirements for various size systems, and generation of engineering drawings; (2) performance of simulated permafrost coring tests using a residual Apollo lunar surface drill; (3) design of a rock breaker system which can be used to produce small samples of rock chips from rocks which are too large to return to Earth but too small to be cored with the rover-mounted drill; (4) design of sample containers for the selected regolith cores, rock cores, and small particulate or rock samples; and (5) design of sample handling and transfer techniques which will be required through all phases of sample acquisition, processing, and stowage on board the Earth return vehicle. A preliminary design of a lightweight rover-mounted sampling scoop was also developed.

  3. Evolution Of The Operational Energy Strategy And Its Consideration In The Defense Acquisition Process

    DTIC Science & Technology

    2016-09-01

    OPERATIONAL ENERGY STRATEGY AND ITS CONSIDERATION IN THE DEFENSE ACQUISITION PROCESS by Richard J. Kendig Ashley D. Seaton Robert J. Rodgers...project 4. TITLE AND SUBTITLE EVOLUTION OF THE OPERATIONAL ENERGY STRATEGY AND ITS CONSIDERATION IN THE DEFENSE ACQUISITION PROCESS 5. FUNDING...looked at the DOD Operational Energy Strategy evolution and how it applies to new and modified weapon systems, considering the three-legged table of the

  4. Development and Flight Testing of an Adaptive Vehicle Health-Monitoring Architecture

    NASA Technical Reports Server (NTRS)

    Woodard, Stanley E.; Coffey, Neil C.; Gonzalez, Guillermo A.; Taylor, B. Douglas; Brett, Rube R.; Woodman, Keith L.; Weathered, Brenton W.; Rollins, Courtney H.

    2002-01-01

    Ongoing development and testing of an adaptable vehicle health-monitoring architecture is presented. The architecture is being developed for a fleet of vehicles. It has three operational levels: one or more remote data acquisition units located throughout the vehicle; a command and control unit located within the vehicle; and a terminal collection unit to collect analysis results from all vehicles. Each level is capable of performing autonomous analysis with a trained expert system. The expert system is parameterized, which makes it adaptable to be trained both to a user's subjective reasoning and to existing quantitative analytic tools. Communication between all levels is done with wireless radio-frequency interfaces. The remote data acquisition unit has an eight-channel programmable digital interface that gives the user discretion in choosing the type of sensors, the number of sensors, the sensor sampling rate, and the sampling duration for each sensor. The architecture provides a framework for a tributary analysis. All measurements at the lowest operational level are reduced to provide the analysis results necessary to gauge changes from established baselines. These are then collected at the next level to identify any global trends or common features from the prior level. This process is repeated until the results are reduced at the highest operational level. In this framework, only analysis results are forwarded to the next level, to reduce telemetry congestion. The system's remote data acquisition hardware and non-analysis software have been flight tested on the main landing gear of NASA Langley's B757. The flight tests were performed to validate the wireless radio-frequency communication capabilities of the system; the hardware design; command and control; software operation; and data acquisition, storage, and retrieval.

  5. Design of remote laser-induced fluorescence system's acquisition circuit

    NASA Astrophysics Data System (ADS)

    Wang, Guoqing; Lou, Yue; Wang, Ran; Yan, Debao; Li, Xin; Zhao, Xin; Chen, Dong; Zhao, Qi

    2017-10-01

    A laser-induced fluorescence system (LIFS) can identify one kind of substance from another by its properties, even in trace amounts, and has found significant application in many fields. Many previous works have reported theoretical analyses, designs, and uses of LIFS. However, the usual LIFS is constructed in the laboratory to probe matter at close range, because the system uses a low-power laser as the excitation source and a charge-coupled device (CCD) as the detector. Improving the detection range of LIFS is therefore of considerable interest for widening its application. Here, we use a high-energy, narrow-pulse laser instead of the commonly used continuous-wave laser to excite the sample, so that strong fluorescence is obtained. In addition, a highly sensitive photomultiplier tube (PMT) is adopted in our system to detect the extremely weak fluorescence after its long flight from the sample to the detector. As a further advantage, because the collected fluorescence is dispersed by a spectrometer, multiple wavelengths of light can be converted to corresponding electrical signals with a linear-array multichannel PMT. Therefore, at the cost of a high-power excitation source and a high-sensitivity detector, a remote LIFS is obtained. To run this system, the light signal must be converted to a digital signal that can be processed by a computer. The fluorescence pulse width is closely tied to that of the excitation laser and is at the nanosecond (ns) level, which places high demands on the acquisition circuit. We designed an acquisition circuit comprising an I/V conversion stage, an amplifying stage, and a peak-holding stage. Circuit simulations show that the peak-holding stage is an effective way to reduce the difficulty of the acquisition task.

  6. The Acquisition and Transfer of Botanical Classification by Elementary Science Methods Students.

    ERIC Educational Resources Information Center

    Knapp, Clifford Edward

    Investigated were two questions related to the acquisition and transfer of botanical classification skill by elementary science methods students. Data were collected from a sample of 89 students enrolled in methods courses. Sixty-two students served as the experimental sample, and 27 served as the control for the transfer portion of the research.…

  7. A data-independent acquisition workflow for qualitative screening of new psychoactive substances in biological samples.

    PubMed

    Kinyua, Juliet; Negreira, Noelia; Ibáñez, María; Bijlsma, Lubertus; Hernández, Félix; Covaci, Adrian; van Nuijs, Alexander L N

    2015-11-01

    Identification of new psychoactive substances (NPS) is challenging. Developing targeted methods for their analysis can be difficult and costly due to their impermanence on the drug scene. Accurate-mass mass spectrometry (AMMS) using a quadrupole time-of-flight (QTOF) analyzer can be useful for wide-scope screening since it provides sensitive, full-spectrum MS data. Our article presents a qualitative screening workflow based on data-independent acquisition mode (all-ions MS/MS) on liquid chromatography (LC) coupled to QTOFMS for the detection and identification of NPS in biological matrices. The workflow combines and structures fundamentals of target and suspect screening data processing techniques in a structured algorithm. This allows the detection and tentative identification of NPS and their metabolites. We have applied the workflow to two actual case studies involving drug intoxications where we detected and confirmed the parent compounds ketamine, 25B-NBOMe, 25C-NBOMe, and several predicted phase I and II metabolites not previously reported in urine and serum samples. The screening workflow demonstrates the added value for the detection and identification of NPS in biological matrices.
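
    At the core of such a suspect-screening workflow is matching accurate masses against the expected protonated masses of suspects within a tight ppm tolerance. The sketch below shows only that matching step; the masses and peak list are illustrative values, not data from the case studies, and a real workflow would additionally score predicted fragments and retention behaviour.

```python
PROTON = 1.007276  # proton mass, Da

def ppm_error(measured, theoretical):
    return 1e6 * (measured - theoretical) / theoretical

def screen(peak_mzs, suspect_neutral_masses, tol_ppm=5.0):
    """Match measured m/z values against [M+H]+ of suspect compounds."""
    hits = []
    for name, neutral in suspect_neutral_masses.items():
        mh = neutral + PROTON
        for mz in peak_mzs:
            err = ppm_error(mz, mh)
            if abs(err) <= tol_ppm:
                hits.append((name, mz, round(err, 2)))
    return hits

# Illustrative monoisotopic neutral masses (Da); a real workflow would read these
# from a curated suspect database.
suspects = {"ketamine": 237.0920, "25B-NBOMe": 379.0783, "25C-NBOMe": 335.1288}
peaks = [238.0991, 380.0861, 336.1358, 150.0585]
print(screen(peaks, suspects))
```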

  8. Mobile phone based SCADA for industrial automation.

    PubMed

    Ozdemir, Engin; Karacor, Mevlut

    2006-01-01

    SCADA is the acronym for "Supervisory Control And Data Acquisition." SCADA systems are widely used in industry for supervisory control and data acquisition of industrial processes. Conventional SCADA systems use PC, notebook, thin client, and PDA as a client. In this paper, a Java-enabled mobile phone has been used as a client in a sample SCADA application in order to display and supervise the position of a sample prototype crane. The paper presents an actual implementation of the on-line controlling of the prototype crane via mobile phone. The wireless communication between the mobile phone and the SCADA server is performed by means of a base station via general packet radio service (GPRS) and wireless application protocol (WAP). Test results have indicated that the mobile phone based SCADA integration using the GPRS or WAP transfer scheme could enhance the performance of the crane in a day without causing an increase in the response times of SCADA functions. The operator can visualize and modify the plant parameters using his mobile phone, without reaching the site. In this way maintenance costs are reduced and productivity is increased.

  9. An Automated Platform for High-Resolution Tissue Imaging Using Nanospray Desorption Electrospray Ionization Mass Spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lanekoff, Ingela T.; Heath, Brandi S.; Liyu, Andrey V.

    2012-10-02

    An automated platform has been developed for acquisition and visualization of mass spectrometry imaging (MSI) data using nanospray desorption electrospray ionization (nano-DESI). The new system enables robust operation of the nano-DESI imaging source over many hours. This is achieved by controlling the distance between the sample and the probe by mounting the sample holder onto an automated XYZ stage and defining the tilt of the sample plane. This approach is useful for imaging of relatively flat samples such as thin tissue sections. Custom software called MSI QuickView was developed for visualization of large data sets generated in imaging experiments. MSI QuickView enables fast visualization of the imaging data during data acquisition and detailed processing after the entire image is acquired. The performance of the system is demonstrated by imaging rat brain tissue sections. High resolution mass analysis combined with MS/MS experiments enabled identification of lipids and metabolites in the tissue section. In addition, the high dynamic range and sensitivity of the technique allowed us to generate ion images of low-abundance isobaric lipids. A high-spatial-resolution image acquired over a small region of the tissue section revealed the spatial distribution of an abundant brain metabolite, creatine, in the white and gray matter that is consistent with the literature data obtained using magnetic resonance spectroscopy.
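
    An ion image is assembled by summing, for every pixel's spectrum, the intensity inside a narrow m/z window around the target ion (for instance protonated creatine). The sketch below shows that reduction step on placeholder spectra; it is not the MSI QuickView implementation, and the window width and grid size are assumptions.

```python
import numpy as np

def ion_image(scans, target_mz, tol_mz=0.01, n_rows=None, n_cols=None):
    """Build a 2D ion image from a row-ordered list of (mz, intensity) spectra."""
    values = []
    for mz, inten in scans:
        window = np.abs(mz - target_mz) <= tol_mz
        values.append(inten[window].sum())
    return np.asarray(values).reshape(n_rows, n_cols)

# Placeholder data: 20 x 30 pixels of synthetic spectra (structure only, no real signal)
rng = np.random.default_rng(3)
scans = [(np.sort(rng.uniform(100, 900, 500)), rng.exponential(1.0, 500))
         for _ in range(20 * 30)]

img = ion_image(scans, target_mz=132.077, tol_mz=0.01, n_rows=20, n_cols=30)
print(img.shape)
```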

  10. Raman spectroscopic analysis of real samples: Brazilian bauxite mineralogy

    NASA Astrophysics Data System (ADS)

    Faulstich, Fabiano Richard Leite; Castro, Harlem V.; de Oliveira, Luiz Fernando Cappa; Neumann, Reiner

    2011-10-01

    In this investigation, Raman spectroscopy with 1064 and 632.8 nm excitation was used to investigate real mineral samples of bauxite ore from mines of Northern Brazil, together with Raman mapping and X-ray diffraction. The results show clearly that micro-Raman spectroscopy is a powerful tool for the identification of all the minerals usually found in bauxites: gibbsite, kaolinite, goethite, hematite, anatase and quartz. Bulk samples can also be analysed, and FT-Raman is more adequate due to its better signal-to-noise ratio and representativity, although it is not efficient for kaolinite. The identification of fingerprint vibrations for all the minerals allows the acquisition of Raman-based chemical maps, potentially powerful tools for process mineralogy applied to bauxite ores.

  11. Accelerated Optical Projection Tomography Applied to In Vivo Imaging of Zebrafish

    PubMed Central

    Correia, Teresa; Yin, Jun; Ramel, Marie-Christine; Andrews, Natalie; Katan, Matilda; Bugeon, Laurence; Dallman, Margaret J.; McGinty, James; Frankel, Paul; French, Paul M. W.; Arridge, Simon

    2015-01-01

    Optical projection tomography (OPT) provides a non-invasive 3-D imaging modality that can be applied to longitudinal studies of live disease models, including in zebrafish. Current limitations include the requirement of a minimum number of angular projections for reconstruction of reasonable OPT images using filtered back projection (FBP), which is typically several hundred, leading to acquisition times of several minutes. It is highly desirable to decrease the number of required angular projections to decrease both the total acquisition time and the light dose to the sample. This is particularly important to enable longitudinal studies, which involve measurements of the same fish at different time points. In this work, we demonstrate that the use of an iterative algorithm to reconstruct sparsely sampled OPT data sets can provide useful 3-D images with 50 or fewer projections, thereby significantly decreasing the minimum acquisition time and light dose while maintaining image quality. A transgenic zebrafish embryo with fluorescent labelling of the vasculature was imaged to acquire densely sampled (800 projections) and under-sampled data sets of transmitted and fluorescence projection images. The under-sampled OPT data sets were reconstructed using an iterative total variation-based image reconstruction algorithm and compared against FBP reconstructions of the densely sampled data sets. To illustrate the potential for quantitative analysis following rapid OPT data acquisition, a Hessian-based method was applied to automatically segment the reconstructed images to select the vasculature network. Results showed that 3-D images of the zebrafish embryo and its vasculature of sufficient visual quality for quantitative analysis can be reconstructed using the iterative algorithm from only 32 projections—achieving up to 28 times improvement in imaging speed and leading to total acquisition times of a few seconds. PMID:26308086
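
    The benefit of iterative reconstruction on sparsely sampled projections can be reproduced on a toy object with scikit-image (assuming a recent release, ≥0.19, for the `filter_name` argument). SART is used here purely as a stand-in for the paper's total-variation-regularized algorithm; the phantom, the 32 angles, and the iteration count are illustrative.

```python
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, iradon_sart, resize

# Small test object standing in for an OPT slice
image = resize(shepp_logan_phantom(), (128, 128))

# Only 32 projection angles instead of several hundred
theta = np.linspace(0.0, 180.0, 32, endpoint=False)
sinogram = radon(image, theta=theta)

# Filtered back projection is streaky with this few angles
fbp = iradon(sinogram, theta=theta, filter_name="ramp")

# A few SART passes cope much better with sparse angular sampling
recon = iradon_sart(sinogram, theta=theta)
for _ in range(2):
    recon = iradon_sart(sinogram, theta=theta, image=recon)

print("FBP RMSE:      ", np.sqrt(np.mean((fbp - image) ** 2)))
print("iterative RMSE:", np.sqrt(np.mean((recon - image) ** 2)))
```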

  12. The Impact of Maintenance Free Operating Period Approach to Acquisition Approaches, System Sustainment, and Costs

    DTIC Science & Technology

    2013-01-07

    of MFOP principles on processes, procedures, and costs in acquisition planning. It investigates MFOP and reviews the results of a 2005 submarine pilot...approach be a game changer? This paper evaluates the potential impact of MFOP principles on processes, procedures, and costs in acquisition planning. It...procedures, and costs in acquisition planning. The scope of the research was to

  13. Automating Acquisitions: The Planning Process.

    ERIC Educational Resources Information Center

    Bryant, Bonita

    1984-01-01

    Account of process followed at large academic library in preparing for automation of acquisition and fund accounting functions highlights planning criteria, local goals, planning process elements (selecting participants, assigning tasks, devising timetable, providing foundations, evaluating systems, determining costs, formulating recommendations).…

  14. 48 CFR 801.602-78 - Processing solicitations and contract documents for legal or technical review-Veterans Health...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ..., Central Office (except Office of Construction and Facilities Management), the National Acquisition Center, and the Denver Acquisition and Logistics Center. 801.602-78 Section 801.602-78 Federal Acquisition... Acquisition Center, and the Denver Acquisition and Logistics Center. (a) If legal or technical review is...

  15. 48 CFR 801.602-78 - Processing solicitations and contract documents for legal or technical review-Veterans Health...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ..., Central Office (except Office of Construction and Facilities Management), the National Acquisition Center, and the Denver Acquisition and Logistics Center. 801.602-78 Section 801.602-78 Federal Acquisition... Acquisition Center, and the Denver Acquisition and Logistics Center. (a) If legal or technical review is...

  16. 48 CFR 801.602-78 - Processing solicitations and contract documents for legal or technical review-Veterans Health...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ..., Central Office (except Office of Construction and Facilities Management), the National Acquisition Center, and the Denver Acquisition and Logistics Center. 801.602-78 Section 801.602-78 Federal Acquisition... Acquisition Center, and the Denver Acquisition and Logistics Center. (a) If legal or technical review is...

  17. 48 CFR 852.246-72 - Frozen processed foods.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Frozen processed foods. 852.246-72 Section 852.246-72 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS... Frozen processed foods. As prescribed in 846.302-72, insert the following clause: Frozen Processed Foods...

  18. 48 CFR 852.246-72 - Frozen processed foods.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Frozen processed foods. 852.246-72 Section 852.246-72 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS... Frozen processed foods. As prescribed in 846.302-72, insert the following clause: Frozen Processed Foods...

  19. 48 CFR 852.246-72 - Frozen processed foods.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Frozen processed foods. 852.246-72 Section 852.246-72 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS... Frozen processed foods. As prescribed in 846.302-72, insert the following clause: Frozen Processed Foods...

  20. 48 CFR 852.246-72 - Frozen processed foods.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Frozen processed foods. 852.246-72 Section 852.246-72 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS... Frozen processed foods. As prescribed in 846.302-72, insert the following clause: Frozen Processed Foods...

  1. 48 CFR 852.246-72 - Frozen processed foods.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 5 2013-10-01 2013-10-01 false Frozen processed foods. 852.246-72 Section 852.246-72 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS... Frozen processed foods. As prescribed in 846.302-72, insert the following clause: Frozen Processed Foods...

  2. CO2 Acquisition Membrane (CAM) Project

    NASA Technical Reports Server (NTRS)

    Mason, Larry W.

    2003-01-01

    The CO2 Acquisition Membrane (CAM) project was performed to develop, test, and analyze thin film membrane materials for separation and purification of carbon dioxide (CO2) from mixtures of gases, such as those found in the Martian atmosphere. The membranes developed in this project are targeted toward In Situ Resource Utilization (ISRU) applications, such as In Situ Propellant Production (ISPP) and In Situ Consumables Production (ISCP). These membrane materials may be used in a variety of ISRU systems, for example as the atmospheric inlet filter for an ISPP process to enhance the concentration of CO2 for use as a reactant gas, to passively separate argon and nitrogen trace gases from CO2 for habitat pressurization, to provide a system for removal of CO2 from breathing gases in a closed environment, or within a process stream to selectively separate CO2 from other gaseous components. The membranes identified and developed for CAM were evaluated for use in candidate ISRU processes and other gas separation applications, and will help to lay the foundation for future unmanned sample return and human space missions. CAM is a cooperative project split among three institutions: Lockheed Martin Astronautics (LMA), the Colorado School of Mines (CSM), and Marshall Space Flight Center (MSFC).

  3. Use of near-infrared spectroscopy and multipoint measurements for quality control of pharmaceutical drug products.

    PubMed

    Boiret, Mathieu; Chauchard, Fabien

    2017-01-01

    Near-infrared (NIR) spectroscopy is a non-destructive analytical technique that enables better understanding and optimization of pharmaceutical processes and final drug products. Its use in-line is often limited by acquisition speed and sampling area. This work focuses on performing a multipoint measurement at high acquisition speed at the end of the manufacturing process, on a conveyor belt system, to control both the distribution and the content of the active pharmaceutical ingredient within final drug products, i.e., tablets. A specially designed probe with several collection fibers was developed for this study. By measuring spectral and spatial information, it provides physical and chemical knowledge of the final drug product. The NIR probe was installed on a conveyor belt system that enables the analysis of a large number of tablets. The use of these NIR multipoint measurement probes on a conveyor belt system provides an innovative method that has the potential to serve as a new paradigm for ensuring drug product quality at the end of the manufacturing process and as a new analytical method for a real-time release control strategy.

  4. Research on width control of Metal Fused-coating Additive Manufacturing based on active control

    NASA Astrophysics Data System (ADS)

    Ren, Chuan qi; Wei, Zheng ying; Wang, Xin; Du, Jun; Zhang, Shan; Zhang, Zhitong; Bai, Hao

    2017-12-01

    Because the stability of the forming layer's shape is one of the key factors that affect the final morphology of the sample, studying the forming process and the control of layer morphology is important for achieving efficient and stable metal fused-coating additive manufacturing (MFCAM). To improve the quality and precision of single-layer, single-pass samples, this paper establishes a morphology control method based on active control. Images were acquired in real time by a CCD camera, and morphological characteristics of the forming process were extracted simultaneously. By analyzing the track width during the process, the relationship between the frame-to-frame width difference and the moving speed was obtained. A large number of experiments were used to verify the response speed and accuracy of the system. The results show that the active control system improves the sample morphology and the smoothness of the single-track width, increasing the width uniformity by 55.16%.
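
    The feedback idea can be made concrete with a small sketch: estimate the track width from each binarized CCD frame, compare it with the previous frame and with the target width, and derive a speed correction. The threshold, synthetic frames, and proportional gain below are placeholders, not the controller actually used in the paper.

```python
import numpy as np

def track_width_px(frame, threshold):
    """Median number of bright pixels per row, with the track running vertically."""
    mask = frame > threshold
    widths = mask.sum(axis=1)
    return float(np.median(widths[widths > 0]))

def speed_correction(width_now, width_prev, width_target, gain=0.5):
    """Proportional rule: too wide -> speed up, too narrow -> slow down."""
    relative_diff = (width_now - width_prev) / max(width_prev, 1.0)
    return gain * (width_now - width_target), relative_diff

def synth_frame(width, shape=(100, 200), center=100):
    """Synthetic CCD frame with a bright stripe of the given width."""
    img = np.zeros(shape)
    img[:, center - width // 2: center + width // 2] = 255
    return img

w_prev = track_width_px(synth_frame(30), threshold=128)
w_now = track_width_px(synth_frame(34), threshold=128)
print(speed_correction(w_now, w_prev, width_target=30))
```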

  5. Smart Networked Elements in Support of ISHM

    NASA Technical Reports Server (NTRS)

    Oostdyk, Rebecca; Mata, Carlos; Perotti, Jose M.

    2008-01-01

    At the core of ISHM is the ability to extract information and knowledge from raw data. Conventional data acquisition systems sample and convert physical measurements to engineering units, which higher-level systems use to derive health and information about processes and systems. Although health management is essential at the top level, there are considerable advantages to implementing health-related functions at the sensor level. The distribution of processing to lower levels reduces bandwidth requirements, enhances data fusion, and improves the resolution for detection and isolation of failures in a system, subsystem, component, or process. The Smart Networked Element (SNE) has been developed to implement intelligent functions and algorithms at the sensor level in support of ISHM.

  6. Degree of food processing of household acquisition patterns in a Brazilian urban area is related to food buying preferences and perceived food environment.

    PubMed

    Vedovato, G M; Trude, A C B; Kharmats, A Y; Martins, P A

    2015-04-01

    This cross-sectional study examined the association between local food environment and consumers' acquisition of ultra-processed food. Households were randomly selected from 36 census tracts in Santos City, Brazil. Mothers, of varying economic status, who had children ages 10 or younger (n = 538) were interviewed concerning: their household food acquisition of 31 groups of food and beverages, perceptions of local food environment, food sources destinations, means of transportation used, and socioeconomic status. Food acquisition patterns were classified based on the degree of industrial food processing. Logistic regression models were fitted to assess the association between consumer behaviors and acquisition patterns. The large variety of fresh produce available in supermarkets was significantly related to lower odds of ultra-processed food purchases. After adjusting for sociodemographic characteristics, higher odds for minimally-processed food acquisition were associated with: frequent use of specialized markets to purchase fruits and vegetables (OR 1.89, 95% CI 1.01-2.34), the habit of walking to buy food (OR 1.58, 95% CI 1.08-2.30), and perceived availability of fresh produce in participants' neighborhood (OR 1.58, 95% CI 1.08-2.30). Acquisition of ultra-processed food was positively associated with the use of taxis as principal means of transportation to food sources (OR 2.35, 95% CI 1.08-5.13), and negatively associated with perceived availability of a variety of fruits and vegetables in the neighborhood (OR 0.57, 95% CI 0.37-0.88). The results suggest that interventions aiming to promote acquisition of less processed food in settings similar to Santos, may be most effective if they focus on increasing the number of specialized fresh food markets in local neighborhood areas, improve residents' awareness of these markets' availability, and provide appropriate transportation. Copyright © 2014 Elsevier Ltd. All rights reserved.

  7. Observations of flat-spectrum radio sources at λ850 μm from the James Clerk Maxwell Telescope - I. 1997 April to 2000 April

    NASA Astrophysics Data System (ADS)

    Robson, E. I.; Stevens, J. A.; Jenness, T.

    2001-11-01

    Calibrated data for 65 flat-spectrum extragalactic radio sources are presented at a wavelength of 850μm, covering a three-year period from 1997 April. The data, obtained from the James Clerk Maxwell Telescope using the SCUBA camera in pointing mode, were analysed using an automated pipeline process based on the Observatory Reduction and Acquisition Control-Data Reduction (orac-dr) system. This paper describes the techniques used to analyse and calibrate the data, and presents the data base of results along with a representative sample of the better-sampled light curves.

  8. Standardizing the experimental conditions for using urine in NMR-based metabolomic studies with a particular focus on diagnostic studies: a review.

    PubMed

    Emwas, Abdul-Hamid; Luchinat, Claudio; Turano, Paola; Tenori, Leonardo; Roy, Raja; Salek, Reza M; Ryan, Danielle; Merzaban, Jasmeen S; Kaddurah-Daouk, Rima; Zeri, Ana Carolina; Nagana Gowda, G A; Raftery, Daniel; Wang, Yulan; Brennan, Lorraine; Wishart, David S

    The metabolic composition of human biofluids can provide important diagnostic and prognostic information. Among the biofluids most commonly analyzed in metabolomic studies, urine appears to be particularly useful. It is abundant, readily available, easily stored and can be collected by simple, noninvasive techniques. Moreover, given its chemical complexity, urine is particularly rich in potential disease biomarkers. This makes it an ideal biofluid for detecting or monitoring disease processes. Among the metabolomic tools available for urine analysis, NMR spectroscopy has proven to be particularly well-suited, because the technique is highly reproducible and requires minimal sample handling. As it permits the identification and quantification of a wide range of compounds, independent of their chemical properties, NMR spectroscopy has been frequently used to detect or discover disease fingerprints and biomarkers in urine. Although protocols for NMR data acquisition and processing have been standardized, no consensus on protocols for urine sample selection, collection, storage and preparation in NMR-based metabolomic studies have been developed. This lack of consensus may be leading to spurious biomarkers being reported and may account for a general lack of reproducibility between laboratories. Here, we review a large number of published studies on NMR-based urine metabolic profiling with the aim of identifying key variables that may affect the results of metabolomics studies. From this survey, we identify a number of issues that require either standardization or careful accounting in experimental design and provide some recommendations for urine collection, sample preparation and data acquisition.

  9. A simple and versatile data acquisition system for software coincidence and pulse-height discrimination in 4πβ-γ coincidence experiments.

    PubMed

    Kawada, Y; Yamada, T; Unno, Y; Yunoki, A; Sato, Y; Hino, Y

    2012-09-01

    A simple but versatile data acquisition system for software coincidence experiments is described, in which no time stamping or live-time controller is provided. Signals from the β- and γ-channels are fed separately to two fast ADCs (16 bits, 25 MHz clock maximum) via variable delay circuits and pulse-height stretchers, and also to pulse-height discriminators. The discriminator level was set just above the electronic noise. The two ADCs were controlled with a common clock signal and triggered simultaneously by the logic OR of the pulses from both discriminators. The paired digital values for each sampling were sent to buffer memories connected to the main PC through a FIFO (first-in, first-out) pipe via USB. After data acquisition in list mode, various processing steps including pulse-height analyses were performed using MS-Excel (version 2007 and later). The usefulness of this system was demonstrated for 4πβ(PS)-4πγ coincidence measurements of (60)Co, (134)Cs and (152)Eu. Possibilities for other extended applications are also touched upon. Copyright © 2012 Elsevier Ltd. All rights reserved.
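
    Because both ADCs fire on the common OR trigger, every acquisition yields a (β, γ) pulse-height pair, and software coincidence reduces to checking both heights against their discriminator levels. The sketch below shows that classification and the standard Nβ·Nγ/Nc estimate on placeholder list-mode data; the thresholds and rates are arbitrary.

```python
import numpy as np

def sort_events(beta_ph, gamma_ph, beta_thr, gamma_thr):
    """Classify paired pulse heights recorded on the common OR trigger."""
    b = beta_ph > beta_thr
    g = gamma_ph > gamma_thr
    return {"N_beta": int(b.sum()),
            "N_gamma": int(g.sum()),
            "N_coinc": int((b & g).sum())}

# Placeholder list-mode data: 100 000 paired samples with arbitrary thresholds
rng = np.random.default_rng(7)
beta = rng.exponential(200.0, 100_000)
gamma = rng.exponential(150.0, 100_000)
counts = sort_events(beta, gamma, beta_thr=50.0, gamma_thr=40.0)
print(counts)

# Conventional coincidence-counting estimate (per unit live time), ignoring
# background and dead time; meaningless for this random data and shown only
# to illustrate where the sorted counts go.
print(counts["N_beta"] * counts["N_gamma"] / counts["N_coinc"])
```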

  10. Wireless, battery-operated data acquisition system for mobile spectrometry applications and (potentially) for the Internet of things

    NASA Astrophysics Data System (ADS)

    Fitzgerald, Ryan; Karanassios, Vassili

    2017-05-01

    There are many applications requiring chemical analysis in the field and analytical results in (near) real time, for example when accidental spills occur. In other cases, collecting samples in the field followed by analysis in a lab increases costs and introduces time delays. In such cases, bringing part of the lab to the sample would be ideal. Toward this ideal (and to further reduce size and weight), we developed a relatively inexpensive, battery-operated, wireless data acquisition hardware system around an Arduino Nano microcontroller and a 16-bit ADC (analog-to-digital converter) with a maximum sampling rate of 860 samples/s. The hardware communicates the acquired data using low-power Bluetooth. Software for data acquisition and data display was written in Python. Potential ways of making the hardware-software approach described here a part of the Internet of Things (IoT) are presented.
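
    Since the acquisition and display software was written in Python, a receiving loop on the host side might look like the sketch below, assuming pyserial and a microcontroller that streams one ASCII ADC reading per line over the Bluetooth serial link. The port name, baud rate, and line format are assumptions, not taken from the paper.

```python
import serial  # pyserial

PORT = "/dev/rfcomm0"   # assumed Bluetooth serial port ("COM5" or similar on Windows)
BAUD = 115200

def acquire(n_samples):
    """Read n_samples ADC readings, one ASCII integer per line."""
    readings = []
    with serial.Serial(PORT, BAUD, timeout=2) as link:
        while len(readings) < n_samples:
            line = link.readline().decode("ascii", errors="ignore").strip()
            if line.isdigit():
                readings.append(int(line))
    return readings

if __name__ == "__main__":
    data = acquire(860)   # roughly one second at the ADC's maximum rate
    print(f"acquired {len(data)} samples, mean code = {sum(data) / len(data):.1f}")
```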

  11. Acquisition of German Pluralization Rules in Monolingual and Multilingual Children

    ERIC Educational Resources Information Center

    Zaretsky, Eugen; Lange, Benjamin P.; Euler, Harald A.; Neumann, Katrin

    2013-01-01

    Existing studies on plural acquisition in German have relied on small samples and thus hardly deliver generalizable and differentiated results. Here, overgeneralizations of certain plural allomorphs and other tendencies in the acquisition of German plural markers are described on the basis of test data from 7,394 3- to 5-year-old monolingual…

  12. Multi-mode acquisition (MMA): An MS/MS acquisition strategy for maximizing selectivity, specificity and sensitivity of DIA product ion spectra.

    PubMed

    Williams, Brad J; Ciavarini, Steve J; Devlin, Curt; Cohn, Steven M; Xie, Rong; Vissers, Johannes P C; Martin, LeRoy B; Caswell, Allen; Langridge, James I; Geromanos, Scott J

    2016-08-01

    In proteomics studies, it is generally accepted that depth of coverage and dynamic range are limited in data-directed acquisitions. The serial nature of the method limits both sensitivity and the number of precursor ions that can be sampled. To that end, a number of data-independent acquisition (DIA) strategies have been introduced; these methods are, for the most part, immune to the sampling issue, although some have other limitations with respect to sensitivity. The major limitation of DIA approaches is interference, i.e., MS/MS spectra are highly chimeric and often incapable of being identified using conventional database search engines. Utilizing each available dimension of separation prior to ion detection, we present a new multi-mode acquisition (MMA) strategy multiplexing both narrowband and wideband DIA acquisitions in a single analytical workflow. The iterative nature of the MMA workflow limits the adverse effects of interference with minimal loss in sensitivity. Qualitative identification can be performed by selected ion chromatograms or conventional database search strategies. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Development of a software tool to support chemical and biological terrorism intelligence analysis

    NASA Astrophysics Data System (ADS)

    Hunt, Allen R.; Foreman, William

    1997-01-01

    AKELA has developed a software tool which uses a systems-analytic approach to model the critical processes that support the acquisition of biological and chemical weapons by terrorist organizations. This tool has four major components. The first is a procedural expert system which describes the weapon acquisition process. It shows the relationship between the stages a group goes through to acquire and use a weapon, and the activities in each stage required to be successful. It applies to both state-sponsored and small-group acquisition. An important part of this expert system is an analysis of the acquisition process which is embodied in a list of observables of weapon acquisition activity. These observables are cues for intelligence collection. The second component is a detailed glossary of technical terms which helps analysts with a non-technical background understand the potential relevance of collected information. The third component is a linking capability which shows where technical terms apply to the parts of the acquisition process. The final component is a simple, intuitive user interface which shows a picture of the entire process at a glance and lets the user move quickly to get more detailed information. This paper explains each of these components.

  14. Advances in optical information processing IV; Proceedings of the Meeting, Orlando, FL, Apr. 18-20, 1990

    NASA Astrophysics Data System (ADS)

    Pape, Dennis R.

    1990-09-01

    The present conference discusses topics in optical image processing, optical signal processing, acoustooptic spectrum analyzer systems and components, and optical computing. Attention is given to tradeoffs in nonlinearly recorded matched filters, miniature spatial light modulators, detection and classification using higher-order statistics of optical matched filters, rapid traversal of an image data base using binary synthetic discriminant filters, wideband signal processing for emitter location, an acoustooptic processor for autonomous SAR guidance, and sampling of Fresnel transforms. Also discussed are an acoustooptic RF signal-acquisition system, scanning acoustooptic spectrum analyzers, the effects of aberrations on acoustooptic systems, fast optical digital arithmetic processors, information utilization in analog and digital processing, optical processors for smart structures, and a self-organizing neural network for unsupervised learning.

  15. Performance of Adsorption - Based CO2 Acquisition Hardware for Mars ISRU

    NASA Technical Reports Server (NTRS)

    Finn, John E.; Mulloth, Lila M.; Borchers, Bruce A.; Luna, Bernadette (Technical Monitor)

    2000-01-01

    Chemical processing of the dusty, low-pressure Martian atmosphere typically requires conditioning and compression of the gases as first steps. A temperature-swing adsorption process can perform these tasks using nearly solid-state hardware and with relatively low power consumption compared to alternative processes. In addition, the process can separate the atmospheric constituents, producing both pressurized CO2 and a buffer gas mixture of nitrogen and argon. To date we have developed and tested adsorption compressors at scales appropriate for the near-term robotic missions that will lead the way to ISRU-based human exploration missions. In this talk we describe the characteristics, testing, and performance of these devices. We also discuss scale-up issues associated with meeting the processing demands of sample return and human missions.

  16. A neuromathematical model of human information processing and its application to science content acquisition

    NASA Astrophysics Data System (ADS)

    Anderson, O. Roger

    The rate of information processing during science learning and the efficiency of the learner in mobilizing relevant information in long-term memory as an aid in transmitting newly acquired information to stable storage in long-term memory are fundamental aspects of science content acquisition. These cognitive processes, moreover, may be substantially related in tempo and quality of organization to the efficiency of higher thought processes such as divergent thinking and problem-solving ability that characterize scientific thought. As a contribution to our quantitative understanding of these fundamental information processes, a mathematical model of information acquisition is presented and empirically evaluated in comparison to evidence obtained from experimental studies of science content acquisition. Computer-based models are used to simulate variations in learning parameters and to generate the theoretical predictions to be empirically tested. The initial tests of the predictive accuracy of the model show close agreement between predicted and actual mean recall scores in short-term learning tasks. Implications of the model for human information acquisition and possible future research are discussed in the context of the unique theoretical framework of the model.

  17. HTAPP: High-Throughput Autonomous Proteomic Pipeline

    PubMed Central

    Yu, Kebing; Salomon, Arthur R.

    2011-01-01

    Recent advances in the speed and sensitivity of mass spectrometers and in analytical methods, the exponential acceleration of computer processing speeds, and the availability of genomic databases from an array of species and protein information databases have led to a deluge of proteomic data. The development of a lab-based automated proteomic software platform for the automated collection, processing, storage, and visualization of expansive proteomic datasets is critically important. The high-throughput autonomous proteomic pipeline (HTAPP) described here is designed from the ground up to provide critically important flexibility for diverse proteomic workflows and to streamline the total analysis of a complex proteomic sample. This tool is comprised of software that controls the acquisition of mass spectral data along with automation of post-acquisition tasks such as peptide quantification, clustered MS/MS spectral database searching, statistical validation, and data exploration within a user-configurable lab-based relational database. The software design of HTAPP focuses on accommodating diverse workflows and providing missing software functionality to a wide range of proteomic researchers to accelerate the extraction of biological meaning from immense proteomic data sets. Although individual software modules in our integrated technology platform may have some similarities to existing tools, the true novelty of the approach described here is in the synergistic and flexible combination of these tools to provide an integrated and efficient analysis of proteomic samples. PMID:20336676

  18. Graphical user interface for image acquisition and processing

    DOEpatents

    Goldberg, Kenneth A.

    2002-01-01

    An event-driven GUI-based image acquisition interface for the IDL programming environment, designed for CCD camera control and image acquisition directly into the IDL environment where image manipulation and data analysis can be performed, together with a toolbox of real-time analysis applications. Running the image acquisition hardware directly from IDL removes the necessity of first saving images in one program and then importing the data into IDL for analysis in a second step. Bringing the data directly into IDL creates an opportunity for the implementation of IDL image processing and display functions in real time. The program allows control over the available charge-coupled device (CCD) detector parameters, data acquisition, file saving and loading, and image manipulation and processing, all from within IDL. The program is built using IDL's widget libraries to control the on-screen display and user interface.

  19. 48 CFR 15.101-2 - Lowest price technically acceptable source selection process.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Lowest price technically acceptable source selection process. 15.101-2 Section 15.101-2 Federal Acquisition Regulations System FEDERAL... Processes and Techniques 15.101-2 Lowest price technically acceptable source selection process. (a) The...

  20. Simulation of the Beating Heart Based on Physically Modeling aDeformable Balloon

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rohmer, Damien; Sitek, Arkadiusz; Gullberg, Grant T.

    2006-07-18

    The motion of the beating heart is complex and creates artifacts in SPECT and x-ray CT images. Phantoms such as the Jaszczak Dynamic Cardiac Phantom are used to simulate cardiac motion for evaluation of acquisition and data processing protocols used for cardiac imaging. Two concentric elastic membranes filled with water are connected to tubing and a pump apparatus for creating fluid flow in and out of the inner volume to simulate motion of the heart. In the present report, the movement of two concentric balloons is solved numerically in order to create a computer simulation of the motion of the moving membranes in the Jaszczak Dynamic Cardiac Phantom. A system of differential equations, based on the physical properties, determines the motion. Two methods are tested for solving the system of differential equations. The results of both methods are similar, providing a final shape that does not converge to a trivial circular profile. Finally, a tomographic imaging simulation is performed by acquiring static projections of the moving shape and reconstructing the result to observe motion artifacts. Two cases are taken into account: in one case each projection angle is sampled for a short time interval, and in the other case it is sampled for a longer time interval. The longer sampling acquisition shows a clear improvement in decreasing the tomographic streaking artifacts.
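
    The abstract notes that two numerical methods gave similar solutions of the governing differential equations. The toy sketch below illustrates that kind of cross-check on a stand-in ODE system (a driven, damped oscillator, not the actual membrane model) using two SciPy integrators.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Stand-in system of ODEs used only to illustrate solving the same system
# with two integrators and comparing the results; it is not the membrane
# model of the paper.
def rhs(t, y, omega=2.0, zeta=0.1):
    x, v = y
    drive = 0.5 * np.sin(1.5 * t)            # stand-in for the pump cycle
    return [v, drive - 2 * zeta * omega * v - omega**2 * x]

t_span, y0 = (0.0, 20.0), [1.0, 0.0]
t_eval = np.linspace(*t_span, 2000)

sol_rk = solve_ivp(rhs, t_span, y0, method="RK45", t_eval=t_eval, rtol=1e-8)
sol_bdf = solve_ivp(rhs, t_span, y0, method="BDF", t_eval=t_eval, rtol=1e-8)

# Agreement between the two methods, analogous to the paper's comparison.
max_diff = np.max(np.abs(sol_rk.y[0] - sol_bdf.y[0]))
print(f"maximum difference between RK45 and BDF solutions: {max_diff:.2e}")
```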

  1. Space science technology: In-situ science. Sample Acquisition, Analysis, and Preservation Project summary

    NASA Technical Reports Server (NTRS)

    Aaron, Kim

    1991-01-01

    The Sample Acquisition, Analysis, and Preservation Project is summarized in outline and graphic form. The objective of the project is to develop component and system level technology to enable the unmanned collection, analysis and preservation of physical, chemical and mineralogical data from the surface of planetary bodies. Technology needs and challenges are identified and specific objectives are described.

  2. Robotics-assisted mass spectrometry assay platform enabled by open-source electronics.

    PubMed

    Chiu, Shih-Hao; Urban, Pawel L

    2015-02-15

    Mass spectrometry (MS) is an important analytical technique with numerous applications in clinical analysis, biochemistry, environmental analysis, geology and physics. Its success builds on the ability of MS to determine the molecular weights of analytes and elucidate their structures. However, sample handling prior to MS requires a lot of attention and labor. In this work we aimed to automate sample processing for MS so that analyses could be conducted without much supervision by experienced analysts. The goal of this study was to develop a robotics- and information-technology-oriented platform that could control the whole analysis process, including sample delivery, reaction-based assay, data acquisition, and interaction with the analyst. The proposed platform incorporates a robotic arm for handling sample vials delivered to the laboratory, and several auxiliary devices which facilitate and secure the analysis process. They include: a multi-relay board, infrared sensors, photo-interrupters, gyroscopes, force sensors, a fingerprint scanner, a barcode scanner, a touch-screen panel, and an internet interface. The control of all the building blocks is achieved through open-source electronics (Arduino) and enabled by custom-written programs in the C language. The advantages of the proposed system include low cost, simplicity, small size, and facile automation of sample delivery and processing without the intervention of the analyst. It is envisaged that this simple robotic system may be the forerunner of automated laboratories dedicated to mass spectrometric analysis of biological samples. Copyright © 2014 Elsevier B.V. All rights reserved.

  3. Data Acquisition and Processing System for Airborne Wind Profiling with a Pulsed, 2-Micron, Coherent-Detection, Doppler Lidar System

    NASA Technical Reports Server (NTRS)

    Beyon, J. Y.; Koch, G. J.; Kavaya, M. J.

    2010-01-01

    A data acquisition and signal processing system is being developed for a 2-micron airborne wind-profiling coherent Doppler lidar system. This lidar, called the Doppler Aerosol Wind Lidar (DAWN), is based on a Ho:Tm:LuLiF laser transmitter and a 15-cm diameter telescope. It is being packaged for flights onboard the NASA DC-8, with the first flights in the summer of 2010 in support of the NASA Genesis and Rapid Intensification Processes (GRIP) campaign for the study of hurricanes. The data acquisition and processing system is housed in a compact PCI chassis and consists of four components: a digitizer, a digital signal processing (DSP) module, a video controller, and a serial port controller. The data acquisition and processing software (DAPS) is also being developed to control the system, including real-time data analysis and display. The system detects an external 10 Hz trigger pulse, initiates data acquisition and processing, and displays selected wind profile parameters such as Doppler shift, power distribution, wind direction and velocity. The Doppler shift created by aircraft motion is measured by an inertial navigation/GPS sensor and fed to the signal processing system for real-time removal of aircraft effects from the wind measurements. A general overview of the system and the DAPS, as well as the coherent Doppler lidar system, is presented in this paper.
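
    For orientation, the sketch below shows the standard coherent-Doppler relation v = λf/2 and the subtraction of the aircraft's along-beam velocity, which is the kind of correction described; the beam geometry, velocities and Doppler shift are invented for the example and do not come from DAWN.

```python
import numpy as np

WAVELENGTH = 2.05e-6                      # m, 2-micron class transmitter

def los_wind(doppler_shift_hz, beam_unit, aircraft_velocity):
    """Line-of-sight wind speed with the aircraft's own motion removed.

    doppler_shift_hz  : measured Doppler shift of the return (Hz)
    beam_unit         : unit vector along the beam, in the same frame as
                        the aircraft velocity (e.g. local NED)
    aircraft_velocity : aircraft velocity from the INS/GPS sensor (m/s)
    """
    measured_radial = 0.5 * WAVELENGTH * doppler_shift_hz   # v = lambda*f/2
    aircraft_radial = np.dot(aircraft_velocity, beam_unit)  # platform part
    return measured_radial - aircraft_radial

# Hypothetical numbers, for illustration only.
beam = np.array([0.5, 0.5, -np.sqrt(0.5)])        # unit-length pointing vector
v_ac = np.array([230.0, 5.0, -1.0])               # m/s, from INS/GPS
print(los_wind(doppler_shift_hz=9.2e6, beam_unit=beam, aircraft_velocity=v_ac))
```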

  4. The Earth Microbiome Project: Meeting report of the "1 EMP meeting on sample selection and acquisition" at Argonne National Laboratory October 6 2010.

    PubMed

    Gilbert, Jack A; Meyer, Folker; Jansson, Janet; Gordon, Jeff; Pace, Norman; Tiedje, James; Ley, Ruth; Fierer, Noah; Field, Dawn; Kyrpides, Nikos; Glöckner, Frank-Oliver; Klenk, Hans-Peter; Wommack, K Eric; Glass, Elizabeth; Docherty, Kathryn; Gallery, Rachel; Stevens, Rick; Knight, Rob

    2010-12-25

    This report details the outcome of the first meeting of the Earth Microbiome Project to discuss sample selection and acquisition. The meeting, held at the Argonne National Laboratory on Wednesday, October 6, 2010, focused on how to prioritize environmental samples for sequencing and metagenomic analysis as part of the global effort of the EMP to systematically determine the functional and phylogenetic diversity of microbial communities across the world.

  5. The Making of a Government LSI - From Warfare Capability to Operational System

    DTIC Science & Technology

    2015-04-30

    continues to evolve and implement Lead System Integrator (LSI) acquisition strategies, they have started to define numerous program initiatives that ... employ more integrated engineering and management processes and techniques. These initiatives are developing varying acquisition approaches that define (1) ... government LSI transformation. Navy Systems Commands have begun adding a higher level of integration into their acquisition process with the ...

  6. Signal existence verification (SEV) for GPS low received power signal detection using the time-frequency approach.

    PubMed

    Jan, Shau-Shiun; Sun, Chih-Cheng

    2010-01-01

    The detection of low received power global positioning system (GPS) signals in the signal acquisition process is an important issue for GPS applications. Improving the miss-detection problem for low received power signals is crucial, especially in urban or indoor environments. This paper proposes a signal existence verification (SEV) process to detect and subsequently verify low received power GPS signals. The SEV process is based on the time-frequency representation of the GPS signal, and it can capture the characteristics of the GPS signal in the time-frequency plane to enhance GPS signal acquisition performance. Several simulations and experiments are conducted to show the effectiveness of the proposed method for low received power signal detection. The contribution of this work is that the SEV process is an additional scheme that assists the GPS signal acquisition process in low received power signal detection, without changing the original signal acquisition or tracking algorithms.
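
    As a rough illustration of the time-frequency idea (not the SEV algorithm itself), the sketch below uses a spectrogram to check whether a weak tone persists across time slices, which is what distinguishes a real signal from isolated noise peaks; the signal parameters are synthetic.

```python
import numpy as np
from scipy.signal import spectrogram

# Toy signal: a weak continuous tone (standing in for a low-power correlator
# output) buried in noise.  Illustration only, not the SEV algorithm.
fs = 10_000.0                              # Hz
t = np.arange(0, 1.0, 1 / fs)
tone = 0.1 * np.sin(2 * np.pi * 1_250 * t)
noise = np.random.normal(scale=0.3, size=t.size)
x = tone + noise

f, tt, Sxx = spectrogram(x, fs=fs, nperseg=512, noverlap=256)

# A real signal shows energy that persists in time at one frequency bin,
# whereas noise peaks come and go; check persistence at the strongest bin.
bin_idx = np.argmax(Sxx.mean(axis=1))
persistence = np.mean(Sxx[bin_idx] > np.median(Sxx))
print(f"candidate frequency: {f[bin_idx]:.0f} Hz, "
      f"fraction of time slices above median power: {persistence:.2f}")
```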

  7. Full data acquisition in Kelvin Probe Force Microscopy: Mapping dynamic electric phenomena in real space

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balke, Nina; Kalinin, Sergei V.; Jesse, Stephen

    Kelvin probe force microscopy (KPFM) has provided deep insights into the role local electronic, ionic and electrochemical processes play on the global functionality of materials and devices, even down to the atomic scale. Conventional KPFM utilizes heterodyne detection and bias feedback to measure the contact potential difference (CPD) between tip and sample. This measurement paradigm, however, permits only partial recovery of the information encoded in bias- and time-dependent electrostatic interactions between the tip and sample and effectively down-samples the cantilever response to a single measurement of CPD per pixel. This level of detail is insufficient for electroactive materials, devices, or solid-liquid interfaces, where non-linear dielectrics are present or spurious electrostatic events are possible. Here, we simulate and experimentally validate a novel approach for spatially resolved KPFM capable of a full information transfer of the dynamic electric processes occurring between tip and sample. General acquisition mode, or G-Mode, adopts a big data approach utilising high speed detection, compression, and storage of the raw cantilever deflection signal in its entirety at high sampling rates (> 4 MHz), providing a permanent record of the tip trajectory. We develop a range of methodologies for analysing the resultant large multidimensional datasets involving classical, physics-based and information-based approaches. Physics-based analysis of G-Mode KPFM data recovers the parabolic bias dependence of the electrostatic force for each cycle of the excitation voltage, leading to a multidimensional dataset containing spatial and temporal dependence of the CPD and capacitance channels. We use multivariate statistical methods to reduce data volume and separate the complex multidimensional data sets into statistically significant components that can then be mapped onto separate physical mechanisms. Overall, G-Mode KPFM offers a new paradigm to study dynamic electric phenomena in electroactive interfaces, as well as offering a promising approach to extend KPFM to solid-liquid interfaces.

  8. Full data acquisition in Kelvin Probe Force Microscopy: Mapping dynamic electric phenomena in real space

    DOE PAGES

    Balke, Nina; Kalinin, Sergei V.; Jesse, Stephen; ...

    2016-08-12

    Kelvin probe force microscopy (KPFM) has provided deep insights into the role local electronic, ionic and electrochemical processes play on the global functionality of materials and devices, even down to the atomic scale. Conventional KPFM utilizes heterodyne detection and bias feedback to measure the contact potential difference (CPD) between tip and sample. This measurement paradigm, however, permits only partial recovery of the information encoded in bias- and time-dependent electrostatic interactions between the tip and sample and effectively down-samples the cantilever response to a single measurement of CPD per pixel. This level of detail is insufficient for electroactive materials, devices, or solid-liquid interfaces, where non-linear dielectrics are present or spurious electrostatic events are possible. Here, we simulate and experimentally validate a novel approach for spatially resolved KPFM capable of a full information transfer of the dynamic electric processes occurring between tip and sample. General acquisition mode, or G-Mode, adopts a big data approach utilising high speed detection, compression, and storage of the raw cantilever deflection signal in its entirety at high sampling rates (> 4 MHz), providing a permanent record of the tip trajectory. We develop a range of methodologies for analysing the resultant large multidimensional datasets involving classical, physics-based and information-based approaches. Physics-based analysis of G-Mode KPFM data recovers the parabolic bias dependence of the electrostatic force for each cycle of the excitation voltage, leading to a multidimensional dataset containing spatial and temporal dependence of the CPD and capacitance channels. We use multivariate statistical methods to reduce data volume and separate the complex multidimensional data sets into statistically significant components that can then be mapped onto separate physical mechanisms. Overall, G-Mode KPFM offers a new paradigm to study dynamic electric phenomena in electroactive interfaces, as well as offering a promising approach to extend KPFM to solid-liquid interfaces.
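
    A minimal sketch of the physics-based step described above: fitting the parabolic bias dependence F ≈ 0.5·(dC/dz)·(V - V_CPD)^2 for one excitation cycle, reading the CPD off the parabola's vertex and the capacitance gradient off its curvature. The data here are synthetic and the code is only illustrative, not the G-Mode analysis pipeline.

```python
import numpy as np

def cpd_from_force_curve(bias, force):
    """Fit the parabolic bias dependence of the electrostatic force,
    F ~ 0.5*(dC/dz)*(V - V_cpd)**2, for one cycle of the excitation voltage.

    Returns the contact potential difference (vertex of the parabola) and
    the curvature term, which is proportional to the capacitance gradient.
    """
    a, b, c = np.polyfit(bias, force, 2)
    v_cpd = -b / (2 * a)
    capacitance_gradient = 2 * a           # since a = 0.5*dC/dz
    return v_cpd, capacitance_gradient

# Synthetic single-cycle data, for illustration only.
v = np.linspace(-3, 3, 200)                          # bias sweep (V)
true_cpd, dCdz = 0.45, -2.0e-9
f = 0.5 * dCdz * (v - true_cpd) ** 2
f += np.random.normal(scale=1e-10, size=v.size)      # measurement noise
print(cpd_from_force_curve(v, f))                    # ~ (0.45, -2.0e-9)
```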

  9. Cleft audit protocol for speech (CAPS-A): a comprehensive training package for speech analysis.

    PubMed

    Sell, D; John, A; Harding-Bell, A; Sweeney, T; Hegarty, F; Freeman, J

    2009-01-01

    The previous literature has largely focused on speech analysis systems and ignored process issues, such as the nature of adequate speech samples, data acquisition, recording and playback. Although there has been recognition of the need for training on tools used in speech analysis associated with cleft palate, little attention has been paid to this issue. To design, execute, and evaluate a training programme for speech and language therapists on the systematic and reliable use of the Cleft Audit Protocol for Speech-Augmented (CAPS-A), addressing issues of standardized speech samples, data acquisition, recording, playback, and listening guidelines. Thirty-six specialist speech and language therapists undertook the training programme over four days. This consisted of two days' training on the CAPS-A tool followed by a third day, making independent ratings and transcriptions on ten new cases which had been previously recorded during routine audit data collection. This task was repeated on day 4, a minimum of one month later. Ratings were made using the CAPS-A record form with the CAPS-A definition table. An analysis was made of the speech and language therapists' CAPS-A ratings at occasion 1 and occasion 2 and the intra- and inter-rater reliability calculated. Trained therapists showed consistency in individual judgements on specific sections of the tool. Intraclass correlation coefficients were calculated for each section with good agreement on eight of 13 sections. There were only fair levels of agreement on anterior oral cleft speech characteristics, non-cleft errors/immaturities and voice. This was explained, at least in part, by their low prevalence which affects the calculation of the intraclass correlation coefficient statistic. Speech and language therapists benefited from training on the CAPS-A, focusing on specific aspects of speech using definitions of parameters and scalar points, in order to apply the tool systematically and reliably. Ratings are enhanced by ensuring a high degree of attention to the nature of the data, standardizing the speech sample, data acquisition, the listening process together with the use of high-quality recording and playback equipment. In addition, a method is proposed for maintaining listening skills following training as part of an individual's continuing education.
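
    For readers unfamiliar with the statistic, the sketch below computes a textbook two-way random-effects, single-rater ICC(2,1) from a subjects-by-raters matrix; it is a generic illustration of the intraclass correlation used in the study, not the authors' analysis, and the example scores are made up.

```python
import numpy as np

def icc_2_1(ratings):
    """Two-way random, single-rater, absolute-agreement ICC(2,1).

    ratings: array of shape (n_subjects, k_raters) with no missing values.
    Generic Shrout & Fleiss formulation, for illustration only.
    """
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    row_means = x.mean(axis=1)             # per-subject means
    col_means = x.mean(axis=0)             # per-rater means

    ss_rows = k * np.sum((row_means - grand) ** 2)
    ss_cols = n * np.sum((col_means - grand) ** 2)
    ss_err = np.sum((x - row_means[:, None] - col_means[None, :] + grand) ** 2)

    ms_rows = ss_rows / (n - 1)            # between-subjects mean square
    ms_cols = ss_cols / (k - 1)            # between-raters mean square
    ms_err = ss_err / ((n - 1) * (k - 1))  # residual mean square

    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n
    )

# Example: 10 cases rated by 4 listeners on one scalar section (made-up data).
rng = np.random.default_rng(0)
true_severity = rng.integers(0, 4, size=10)
scores = np.clip(true_severity[:, None] + rng.integers(-1, 2, size=(10, 4)), 0, 3)
print(f"ICC(2,1) = {icc_2_1(scores):.2f}")
```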

  10. First Language Acquisition and Teaching

    ERIC Educational Resources Information Center

    Cruz-Ferreira, Madalena

    2011-01-01

    "First language acquisition" commonly means the acquisition of a single language in childhood, regardless of the number of languages in a child's natural environment. Language acquisition is variously viewed as predetermined, wondrous, a source of concern, and as developing through formal processes. "First language teaching" concerns schooling in…

  11. 48 CFR 836.602-5 - Short selection process for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... for contracts not to exceed the simplified acquisition threshold. 836.602-5 Section 836.602-5 Federal... contracts not to exceed the simplified acquisition threshold. Either of the procedures provided in FAR 36... simplified acquisition threshold. ...

  12. MMX-I: A data-processing software for multi-modal X-ray imaging and tomography

    NASA Astrophysics Data System (ADS)

    Bergamaschi, A.; Medjoubi, K.; Messaoudi, C.; Marco, S.; Somogyi, A.

    2017-06-01

    Scanning hard X-ray imaging allows simultaneous acquisition of multimodal information, including X-ray fluorescence, absorption, phase and dark-field contrasts, providing structural and chemical details of the samples. Combining these scanning techniques with the infrastructure developed for fast data acquisition at Synchrotron Soleil makes it possible to perform multimodal imaging and tomography during routine user experiments at the Nanoscopium beamline. A main challenge of such imaging techniques is the online processing and analysis of the very large (several hundred gigabyte) multimodal datasets that are generated. This is especially important for the wide user community foreseen at the user-oriented Nanoscopium beamline (e.g., from the fields of biology, life sciences, geology and geobiology), much of which has no experience in such data handling. MMX-I is a new multi-platform open-source freeware for the processing and reconstruction of scanning multi-technique X-ray imaging and tomographic datasets. The MMX-I project aims to offer both expert users and beginners the possibility of processing and analysing raw data, either on-site or off-site. Therefore we have developed a multi-platform (Mac, Windows and Linux 64-bit) data processing tool which is easy to install, comprehensive, intuitive, extendable and user-friendly. MMX-I is now routinely used by the Nanoscopium user community and has demonstrated its performance in treating big data.

  13. Induced Environment Contamination Monitor (IECM), air sampler - Results from the Space Transport System (STS-2) flight

    NASA Technical Reports Server (NTRS)

    Peters, P. N.; Hester, H. B.; Bertsch, W.; Mayfield, H.; Zatko, D.

    1983-01-01

    An investigation involving sampling the rapidly changing environment of the Shuttle cargo bay is considered. Four time-integrated samples and one rapid acquisition sample were collected to determine the types and quantities of contaminants present during ascent and descent of the Shuttle. The sampling times for the various bottles were controlled by valves operated by the Data Acquisition and Control System (DACS) of the IECM. Many of the observed species were found to be common solvents used in cleaning surfaces. When the actual volume sampled is taken into account, the relative mass of organics sampled during descent is about 20 percent less than during ascent.

  14. Pyrolysis process for the treatment of scrap tyres: preliminary experimental results.

    PubMed

    Galvagno, S; Casu, S; Casabianca, T; Calabrese, A; Cornacchia, G

    2002-01-01

    The aim of this work is the evaluation, on a pilot scale, of scrap tyre pyrolysis process performance and the characteristics of the products under different process parameters, such as temperature, residence time, pressure, etc. In this frame, a series of tests were carried out at varying process temperatures between 550 and 680 degrees C, other parameters being equal. Pyrolysis plant process data are collected by an acquisition system; scrap tyre samples used for the treatment, solid and liquid by-products and produced syngas were analysed through both on-line monitoring (for gas) and laboratory analyses. Results show that process temperature, in the explored range, does not seem to seriously influence the volatilisation reaction yield, at least from a quantitative point of view, while it observably influences the distribution of the volatile fraction (liquid and gas) and by-products characteristics.

  15. Experiment Automation with a Robot Arm using the Liquids Reflectometer Instrument at the Spallation Neutron Source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zolnierczuk, Piotr A; Vacaliuc, Bogdan; Sundaram, Madhan

    The Liquids Reflectometer instrument installed at the Spallation Neutron Source (SNS) enables observations of chemical kinetics, solid-state reactions and phase-transitions of thin film materials at both solid and liquid surfaces. Effective measurement of these behaviors requires each sample to be calibrated dynamically using the neutron beam and the data acquisition system in a feedback loop. Since the SNS is an intense neutron source, the time needed to perform the measurement can be the same as the alignment process, leading to a labor-intensive operation that is exhausting to users. An update to the instrument control system, completed in March 2013, implemented the key features of automated sample alignment and robot-driven sample management, allowing for unattended operation over extended periods, lasting as long as 20 hours. We present a case study of the effort, detailing the mechanical, electrical and software modifications that were made as well as the lessons learned during the integration, verification and testing process.

  16. AV-8B Remanufacture Program as Part of the Audit of the Defense Acquisition Board Review Process - FY 1994

    DTIC Science & Technology

    1994-06-03

    Office of the Inspector General: AV-8B Remanufacture Program as Part of the Audit of the Defense Acquisition Board Review Process, FY 1994. Report downloaded from the Internet: 03/23/99. ... Navy for Research, Development and Acquisition. Subject: Audit Report on the AV-8B Remanufacture Program as Part of the Audit of the Defense Acquisition Board Review Process.

  17. A Review and Annotated Bibliography of Armor Gunnery Training Device Effectiveness Literature

    DTIC Science & Technology

    1993-11-01

    The review covers armor gunnery training devices (standalone, tank-appended, subcaliber, and laser) and four areas of training effectiveness (skill acquisition, skill retention, performance prediction, and transfer of training), as well as research limitations (e.g., sample size).

  18. A computer controlled signal preprocessor for laser fringe anemometer applications

    NASA Technical Reports Server (NTRS)

    Oberle, Lawrence G.

    1987-01-01

    The operation of most commercially available laser fringe anemometer (LFA) counter-processors assumes that adjustments are made to the signal processing independently of the computer used to reduce the acquired data. Not only does the researcher desire a record of these parameters attached to the data acquired, but changes in flow conditions generally require that these settings be changed to improve data quality. Because of this limitation, on-line modification of the data acquisition parameters can be difficult and time consuming. A computer-controlled signal preprocessor has been developed which makes this optimization of the photomultiplier signal possible as a normal part of the data acquisition process. It allows computer control of the filter selection, signal gain, and photomultiplier voltage. The raw signal from the photomultiplier tube is input to the preprocessor which, under the control of a digital computer, filters the signal and amplifies it to an acceptable level. The counter-processor used at Lewis Research Center generates the particle interarrival times, as well as the time-of-flight of the particle through the probe volume. The signal preprocessor allows computer control of the acquisition of these data. Through the preprocessor, the computer can also control the handshaking signals for the interface between itself and the counter-processor. Finally, the signal preprocessor splits the pedestal from the signal before filtering, monitors the photomultiplier dc current, sends a signal proportional to this current to the computer through an analog-to-digital converter, and provides an alarm if the current exceeds a predefined maximum. Complete drawings and explanations are provided in the text, as well as a sample interface program for use with the data acquisition software.

  19. On-line 3-dimensional confocal imaging in vivo.

    PubMed

    Li, J; Jester, J V; Cavanagh, H D; Black, T D; Petroll, W M

    2000-09-01

    In vivo confocal microscopy through focusing (CMTF) can provide a 3-D stack of high-resolution corneal images and allows objective measurements of corneal sublayer thickness and backscattering. However, current systems require time-consuming off-line image processing and analysis on multiple software platforms. Furthermore, there is a trade-off between CMTF speed and measurement precision. The purpose of this study was to develop a novel on-line system for in vivo corneal imaging and analysis that overcomes these limitations. A tandem scanning confocal microscope (TSCM) was used for corneal imaging. The TSCM video camera was interfaced directly to a PC image acquisition board to implement real-time digitization. Software was developed to allow in vivo 2-D imaging, CMTF image acquisition, interactive 3-D reconstruction, and analysis of CMTF data to be performed on-line in a single user-friendly environment. A procedure was also incorporated to separate the odd/even video fields, thereby doubling the CMTF sampling rate and theoretically improving the precision of CMTF thickness measurements by a factor of two. In vivo corneal examinations of a normal human and a photorefractive keratectomy patient are presented to demonstrate the capabilities of the new system. Improvements in the convenience, speed, and functionality of in vivo CMTF image acquisition, display, and analysis are demonstrated. This is the first full-featured software package designed for in vivo TSCM imaging of the cornea, which performs both 2-D and 3-D image acquisition, display, and processing as well as CMTF analysis. The use of a PC platform and incorporation of easy-to-use, on-line, and interactive features should help to improve the clinical utility of this technology.
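
    The field-separation step is simple enough to sketch: an interlaced frame is split into its odd and even line fields, each exposed at a different instant, which doubles the effective CMTF sampling rate. The snippet below is a generic illustration in Python with a synthetic frame, not the original system's code.

```python
import numpy as np

def split_fields(frame):
    """Split one interlaced video frame into its even and odd fields.

    Each field was exposed at a different instant, so treating the two
    fields as separate images doubles the effective sampling rate of a
    confocal microscopy through-focusing (CMTF) scan.
    """
    even_field = frame[0::2, :]            # rows 0, 2, 4, ...
    odd_field = frame[1::2, :]             # rows 1, 3, 5, ...
    return even_field, odd_field

# Illustrative use on a synthetic 480-line frame.
frame = np.random.randint(0, 256, size=(480, 640), dtype=np.uint8)
even, odd = split_fields(frame)
print(even.shape, odd.shape)               # (240, 640) (240, 640)
```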

  20. Learning (Not) to Predict: Grammatical Gender Processing in Second Language Acquisition

    ERIC Educational Resources Information Center

    Hopp, Holger

    2016-01-01

    In two experiments, this article investigates the predictive processing of gender agreement in adult second language (L2) acquisition. We test (1) whether instruction on lexical gender can lead to target predictive agreement processing and (2) how variability in lexical gender representations moderates L2 gender agreement processing. In a…

  1. Assessment of Navy Contract Management Processes

    DTIC Science & Technology

    2016-02-22

    Assessment of Navy Contract Management Processes, 22 February 2016. Dr. Rene G. Rendon, Associate Professor, Graduate School of Business ... Know) for each survey item in each contract management process area ... management process. Figure 1. U.S. Navy CMMM Maturity Levels.

  2. Virtual k-Space Modulation Optical Microscopy

    NASA Astrophysics Data System (ADS)

    Kuang, Cuifang; Ma, Ye; Zhou, Renjie; Zheng, Guoan; Fang, Yue; Xu, Yingke; Liu, Xu; So, Peter T. C.

    2016-07-01

    We report a novel superresolution microscopy approach for imaging fluorescence samples. The reported approach, termed virtual k-space modulation optical microscopy (VIKMOM), is able to improve the lateral resolution by a factor of 2, reduce the background level, improve the optical sectioning effect and correct for unknown optical aberrations. In the acquisition process of VIKMOM, we used a scanning confocal microscope setup with a 2D detector array to capture sample information at each scanned x-y position. In the recovery process of VIKMOM, we first modulated the captured data by virtual k-space coding and then employed a ptychography-inspired procedure to recover the sample information and correct for unknown optical aberrations. We demonstrated the performance of the reported approach by imaging fluorescent beads, fixed bovine pulmonary artery endothelial (BPAE) cells, and living human astrocytes (HA). As the VIKMOM approach is fully compatible with conventional confocal microscope setups, it may provide a turn-key solution for imaging biological samples with ~100 nm lateral resolution, in two or three dimensions, with improved optical sectioning capabilities and aberration correction.

  3. Real-time oil-saturation monitoring in rock cores with low-field NMR.

    PubMed

    Mitchell, J; Howe, A M; Clarke, A

    2015-07-01

    Nuclear magnetic resonance (NMR) provides a powerful suite of tools for studying oil in reservoir core plugs at the laboratory scale. Low-field magnets are preferred for well-log calibration and to minimize magnetic-susceptibility-induced internal gradients in the porous medium. We demonstrate that careful data processing, combined with prior knowledge of the sample properties, enables real-time acquisition and interpretation of saturation state (relative amount of oil and water in the pores of a rock). Robust discrimination of oil and brine is achieved with diffusion weighting. We use this real-time analysis to monitor the forced displacement of oil from porous materials (sintered glass beads and sandstones) and to generate capillary desaturation curves. The real-time output enables in situ modification of the flood protocol and accurate control of the saturation state prior to the acquisition of standard NMR core analysis data, such as diffusion-relaxation correlations. Although applications to oil recovery and core analysis are demonstrated, the implementation highlights the general practicality of low-field NMR as an inline sensor for real-time industrial process control. Copyright © 2015 Elsevier Inc. All rights reserved.

  4. Regridding reconstruction algorithm for real-time tomographic imaging

    PubMed Central

    Marone, F.; Stampanoni, M.

    2012-01-01

    Sub-second temporal-resolution tomographic microscopy is becoming a reality at third-generation synchrotron sources. Efficient data handling and post-processing is, however, difficult when the data rates are close to 10 GB s⁻¹. This bottleneck still hinders exploitation of the full potential inherent in the ultrafast acquisition speed. In this paper the fast reconstruction algorithm gridrec, highly optimized for conventional CPU technology, is presented. It is shown that gridrec is a valuable alternative to standard filtered back-projection routines, despite being based on the Fourier transform method. In fact, the regridding procedure used for resampling the Fourier space from polar to Cartesian coordinates couples excellent performance with negligible accuracy degradation. The stronger dependence of the observed signal-to-noise ratio for gridrec reconstructions on the number of angular views makes the presented algorithm even superior to filtered back-projection when the tomographic problem is well sampled. Gridrec not only guarantees high-quality results but it provides up to 20-fold performance increase, making real-time monitoring of the sub-second acquisition process a reality. PMID:23093766
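
    To make the regridding idea concrete, the toy sketch below implements a bare-bones direct Fourier reconstruction: each projection is Fourier transformed, the resulting polar samples are interpolated onto a Cartesian frequency grid, and an inverse 2-D FFT yields the image. It is only a conceptual illustration of the Fourier-transform approach, not the gridrec algorithm (which uses a carefully designed gridding kernel), and the phantom and forward projection are crude.

```python
import numpy as np
from scipy.interpolate import griddata
from scipy.ndimage import rotate

def fourier_regrid_reconstruct(sinogram, angles):
    """Toy direct-Fourier reconstruction via the projection-slice theorem.

    sinogram : (n_angles, n_det) array of parallel-beam projections
    angles   : projection angles in radians
    """
    n_angles, n_det = sinogram.shape

    # 1D FFT of every projection: one radial line of the 2D Fourier plane.
    proj_ft = np.fft.fftshift(
        np.fft.fft(np.fft.ifftshift(sinogram, axes=1), axis=1), axes=1)
    k = np.fft.fftshift(np.fft.fftfreq(n_det))           # radial frequencies

    # Polar coordinates of the known Fourier samples.
    kk, th = np.meshgrid(k, angles)
    kx, ky = kk * np.cos(th), kk * np.sin(th)

    # Regrid onto a Cartesian frequency grid (corners outside the sampled
    # disk are filled with zeros), then invert.
    gx, gy = np.meshgrid(k, k)
    f_cart = griddata((kx.ravel(), ky.ravel()), proj_ft.ravel(),
                      (gx, gy), method="linear", fill_value=0.0)
    image = np.fft.fftshift(np.fft.ifft2(np.fft.ifftshift(f_cart)))
    return np.real(image)

# Minimal check with a square phantom and a naive rotate-and-sum projector;
# depending on conventions the result may appear rotated or flipped.
phantom = np.zeros((128, 128))
phantom[48:80, 48:80] = 1.0
angles = np.linspace(0, np.pi, 180, endpoint=False)
sino = np.stack([rotate(phantom, np.degrees(a), reshape=False).sum(axis=0)
                 for a in angles])
recon = fourier_regrid_reconstruct(sino, angles)
```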

  5. Real-time digital data-acquisition system for determining load characteristics. Volume 2: Operating, programming and maintenance instructions

    NASA Astrophysics Data System (ADS)

    Podesto, B.; Lapointe, A.; Larose, G.; Robichaud, Y.; Vaillancourt, C.

    1981-03-01

    The design and construction of a Real-Time Digital Data Acquisition System (RTDDAS) to be used in substations for on-site recording and preprocessing of load response data are described. The gathered data can be partially processed on site to compute the apparent, active and reactive powers, voltage and current rms values, and instantaneous values of phase voltages and currents. On-site processing capability is provided for rapid monitoring of the field data to ensure that the test setup is suitable. Production analysis of field data is accomplished off-line on a central computer from data recorded on a dual-density (800/1600) magnetic tape which is IBM-compatible. Parallel channels of data can be recorded at a variable rate from 480 to 9000 samples per second per channel. The RTDDAS is housed in a 9.1 m (30-ft) trailer which is shielded from electromagnetic interference and protected by isolators from switching surges. The test must sometimes be performed. Information pertaining to the installation, software operation, and maintenance is presented.
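
    The on-site quantities mentioned (rms values and apparent, active and reactive power) follow directly from the sampled waveforms; a hedged sketch with invented 60 Hz test signals is shown below.

```python
import numpy as np

def power_quantities(v, i):
    """Rms values plus active, apparent and reactive power from one window
    of simultaneously sampled phase voltage v(t) and current i(t)."""
    v_rms = np.sqrt(np.mean(v ** 2))
    i_rms = np.sqrt(np.mean(i ** 2))
    p_active = np.mean(v * i)                      # W
    s_apparent = v_rms * i_rms                     # VA
    q_reactive = np.sqrt(max(s_apparent ** 2 - p_active ** 2, 0.0))  # var
    return v_rms, i_rms, p_active, s_apparent, q_reactive

# Illustrative waveforms sampled at 9000 samples per second per channel,
# with the current lagging the voltage by 30 degrees (values are made up).
fs, f0 = 9000.0, 60.0
t = np.arange(0, 0.5, 1 / fs)
v = 170.0 * np.sin(2 * np.pi * f0 * t)
i = 10.0 * np.sin(2 * np.pi * f0 * t - np.pi / 6)
print(power_quantities(v, i))
```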

  6. DDS-Suite - A Dynamic Data Acquisition, Processing, and Analysis System for Wind Tunnel Testing

    NASA Technical Reports Server (NTRS)

    Burnside, Jathan J.

    2012-01-01

    Wind tunnels have optimized their steady-state data systems for acquisition and analysis and have even implemented large dynamic-data acquisition systems; however, development of near real-time processing and analysis tools for dynamic data has lagged. DDS-Suite is a set of tools used to acquire, process, and analyze large amounts of dynamic data. Each phase of the testing process (acquisition, processing, and analysis) is handled by a separate component, so that bottlenecks in one phase do not affect the others, leading to a robust system. DDS-Suite is capable of acquiring 672 channels of dynamic data at a rate of 275 MB/s. More than 300 channels of the system use 24-bit analog-to-digital cards and are capable of producing data with less than 0.01 of phase difference at 1 kHz. System architecture, design philosophy, and examples of use during NASA Constellation and Fundamental Aerodynamic tests are discussed.

  7. Acquisition of Scientific Literature in Developing Countries. 2: Malaysia.

    ERIC Educational Resources Information Center

    Taib, Rosna

    1989-01-01

    Describes the acquisition of scientific literature by academic libraries in Malaysia. The discussion covers the impact of government policies, library acquisition policies, the selection process, acquisition of special materials, the role of gifts and exchanges, and problems with customs clearance and censorship. Progress in cooperative…

  8. TH-E-17A-07: Improved Cine Four-Dimensional Computed Tomography (4D CT) Acquisition and Processing Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Castillo, S; Castillo, R; Castillo, E

    2014-06-15

    Purpose: Artifacts arising from the 4D CT acquisition and post-processing methods add systematic uncertainty to the treatment planning process. We propose an alternate cine 4D CT acquisition and post-processing method to consistently reduce artifacts, and explore patient parameters indicative of image quality. Methods: In an IRB-approved protocol, 18 patients with primary thoracic malignancies received a standard cine 4D CT acquisition followed by an oversampling 4D CT that doubled the number of images acquired. A second cohort of 10 patients received the clinical 4D CT plus 3 oversampling scans for intra-fraction reproducibility. The clinical acquisitions were processed by the standard phase sorting method. The oversampling acquisitions were processed using Dijkstra's algorithm to optimize an artifact metric over the available image data. Image quality was evaluated with a one-way mixed ANOVA model using a correlation-based artifact metric calculated from the final 4D CT image sets. Spearman correlations and a linear mixed model tested the association between breathing parameters, patient characteristics, and image quality. Results: The oversampling 4D CT scans reduced artifact presence significantly, by 27% and 28% for the first and second cohorts, respectively. From cohort 2, the inter-replicate deviation for the oversampling method was within approximately 13% of the cross-scan average at the 0.05 significance level. Artifact presence for both clinical and oversampling methods was significantly correlated with breathing period (ρ=0.407, p-value<0.032 clinical; ρ=0.296, p-value<0.041 oversampling). Artifact presence in the oversampling method was significantly correlated with the amount of data acquired (ρ=-0.335, p-value<0.02), indicating decreased artifact presence with increased breathing cycles per scan location. Conclusion: The 4D CT oversampling acquisition with optimized sorting reduced artifact presence significantly and reproducibly compared to the phase-sorted clinical acquisition.
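
    A conceptual sketch of optimized sorting of this general kind is shown below: one candidate image is chosen per couch position so that the summed dissimilarity (1 minus correlation) between adjacent selections is minimal, found with Dijkstra's algorithm over a layered graph. The metric, graph construction and data are simplified stand-ins, not the authors' implementation.

```python
import heapq
import numpy as np

def dissimilarity(img_a, img_b):
    """Edge weight: 1 minus normalized correlation (a stand-in for the
    paper's correlation-based artifact metric)."""
    a, b = img_a.ravel(), img_b.ravel()
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return 1.0 - float(np.mean(a * b))

def select_images(candidates):
    """Pick one candidate image per couch position so that the summed
    dissimilarity between adjacent selections is minimal, using Dijkstra's
    algorithm on a layered graph.  `candidates[p]` is a list of 2-D arrays
    for couch position p."""
    n_pos = len(candidates)
    # Heap entries: (cost so far, position index, candidate index, path).
    heap = [(0.0, 0, c, [c]) for c in range(len(candidates[0]))]
    heapq.heapify(heap)
    settled = set()
    while heap:
        cost, pos, cand, path = heapq.heappop(heap)
        if (pos, cand) in settled:
            continue
        settled.add((pos, cand))
        if pos == n_pos - 1:               # first final-layer pop is optimal
            return path, cost
        for nxt in range(len(candidates[pos + 1])):
            w = dissimilarity(candidates[pos][cand], candidates[pos + 1][nxt])
            heapq.heappush(heap, (cost + w, pos + 1, nxt, path + [nxt]))
    raise ValueError("no path found")

# Tiny synthetic example: 4 couch positions, 3 candidate images each.
rng = np.random.default_rng(0)
cands = [[rng.normal(size=(8, 8)) for _ in range(3)] for _ in range(4)]
print(select_images(cands))
```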

  9. Bottleneck Analysis on the DoD Pre-Milestone B Acquisition Processes

    DTIC Science & Technology

    2013-04-01

    Danielle Worger and Teresa Wu, Arizona State University; Eugene Rex Jalao, Arizona State University and University of the Philippines; Christopher ..., Air Force Institute of Technology. The RITE Approach to Agile Acquisition: Timothy Boyce, Iva Sherman, and Nicholas Roussel, Space and Naval Warfare ...

  10. Establishing and Maintaining an Extensive Library of Patient-Derived Xenograft Models.

    PubMed

    Mattar, Marissa; McCarthy, Craig R; Kulick, Amanda R; Qeriqi, Besnik; Guzman, Sean; de Stanchina, Elisa

    2018-01-01

    Patient-derived xenograft (PDX) models have recently emerged as a highly desirable platform in oncology and are expected to substantially broaden the way in vivo studies are designed and executed and to reshape drug discovery programs. However, acquisition of patient-derived samples, and propagation, annotation and distribution of PDXs are complex processes that require a high degree of coordination among clinic, surgery and laboratory personnel, and are fraught with challenges that are administrative, procedural and technical. Here, we examine in detail the major aspects of this complex process and relate our experience in establishing a PDX Core Laboratory within a large academic institution.

  11. Longitudinal Comparison of the Microbiota During Klebsiella pneumoniae Carbapenemase-Producing Klebsiella pneumoniae (KPC-Kp) Acquisition in Long-Term Acute Care Hospital (LTACH) patients

    PubMed Central

    Seekatz, Anna; Bassis, Christine M; Lolans, Karen; Yelin, Rachel D; Moore, Nicholas M; Okamoto, Koh; Rhee, Yoona; Bell, Pamela; Dangana, Thelma; Sidimirova, Galina; Weinstein, Robert A; Fogg, Louis; Lin, Michael Y; Young, Vincent B; Hayden, Mary K

    2017-01-01

    Background: Colonization with KPC-Kp precedes infection and represents a potential target for intervention. To identify microbial signatures associated with KPC-Kp acquisition, we conducted a prospective, longitudinal study of the fecal microbiota in LTACH patients at risk of acquiring KPC-Kp. Methods: We collected admission and weekly rectal swab samples from patients admitted to one LTACH from May 2015 to May 2016. Patients were screened for KPC-Kp by PCR at each sampling time. KPC acquisition was confirmed by culture of KPC-Kp. To assess changes in the microbiota related to acquisition, we sequenced the 16S rRNA gene (V4 region) from collected rectal swabs. Diversity, intra-individual changes, and the relative abundance of the operational taxonomic unit (OTU) that contains KPC-Kp were compared in patients who were KPC-Kp negative upon admission and who had at least one additional swab sample collected. Results: 318 patients (1,247 samples) were eligible for analysis; 3.7 samples (mean) were collected per patient. Sixty-two patients (19.5%) acquired KPC-Kp (cases) and 256 patients remained negative for all carbapenem-resistant Enterobacteriaceae throughout their stay (controls). Median length of stay before KPC-Kp detection was 14.5 days. At the time of KPC-Kp acquisition, levels of an Enterobacteriaceae OTU increased significantly compared with pre-acquisition samples and with samples from control patients (Wilcoxon test, P < 0.0001). Similarly, we observed a decrease in total diversity of the fecal microbiota at the time of acquisition in cases (P < 0.01). Compared with controls, cases exhibited decreased intra-individual fecal microbiota similarity immediately prior to acquisition of KPC-Kp (P < 0.01). Comparison of microbial features at the time of admission using random forest revealed a higher abundance of Enterococcus and Escherichia OTUs in controls vs. cases. Conclusion: We observed intra-individual changes in the fecal microbiota of case patients prior to acquisition of KPC-Kp. Compared with patients who did not acquire KPC-Kp, cases exhibited significant changes in microbiota diversity and increased abundance of potential KPC-Kp at acquisition. Our results suggest that shifts in the microbiota may precede colonization by KPC-Kp.

  12. Auditory Processing Disorder and Foreign Language Acquisition

    ERIC Educational Resources Information Center

    Veselovska, Ganna

    2015-01-01

    This article aims at exploring various strategies for coping with the auditory processing disorder in the light of foreign language acquisition. The techniques relevant to dealing with the auditory processing disorder can be attributed to environmental and compensatory approaches. The environmental one involves actions directed at creating a…

  13. Software Acquisition: Evolution, Total Quality Management, and Applications to the Army Tactical Missile System

    DTIC Science & Technology

    1992-06-01

    presents the concept of software Total Quality Management (TQM), which focuses on the entire process of software acquisition, as a partial solution to ... software TQM can be applied to software acquisition. Keywords: software development, software acquisition, Total Quality Management (TQM), Army Tactical Missile System.

  14. 32 CFR 700.326 - The Assistant Secretary of the Navy (Research, Development and Acquisition).

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., development and acquisition, except for military requirements and operational test and evaluation; (b) Direct management of acquisition programs; (c) All aspects of the acquisition process within the Department of the..., procurement, competition, contracts and business management, logistics, product integrity, and education and...

  15. 48 CFR 1436.602-5 - Short selection processes for contracts not to exceed the simplified acquisition threshold.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... for contracts not to exceed the simplified acquisition threshold. 1436.602-5 Section 1436.602-5... for contracts not to exceed the simplified acquisition threshold. At each occurrence, CO approval...-engineer contracts not expected to exceed the simplified acquisition threshold. ...

  16. Into the Curriculum. Art: A Path to Monet--Following in Linnea's Footsteps [and] Reading/Language Arts: Legends about Humanity's Acquisition of Fire [and] Reading/Language Arts: Analytical Book Reviews [and] Science/Art: Build a Beautiful Butterfly [and] Science: Life Processes of Plants [and] Social Studies: Community Helpers: Fire Fighters.

    ERIC Educational Resources Information Center

    Schultis, Cathy; Troisi, Andrea; Vidor, Constance; Rostek, Andrea; Linsky, Melissa Carruthers

    1998-01-01

    Presents six curriculum guides for art, language arts, reading, science, and social studies. Each activity identifies library media skills objectives, curriculum objectives, grade levels, resources, librarian and teacher instructional roles, activity and procedures for completion, activity samples, guidelines for evaluating finished activities,…

  17. ASPRS Digital Imagery Guideline Image Gallery Discussion

    NASA Technical Reports Server (NTRS)

    Ryan, Robert

    2002-01-01

    The objectives of the image gallery are to 1) give users and providers a simple means of identifying appropriate imagery for a given application/feature extraction; and 2) define imagery sufficiently to be described in engineering and acquisition terms. This viewgraph presentation includes a discussion of edge response and aliasing for image processing, and a series of images illustrating the effects of signal to noise ratio (SNR) on images. Another series of images illustrates how images are affected by varying the ground sample distances (GSD).

  18. Optimisation of wavelength modulated Raman spectroscopy: towards high throughput cell screening.

    PubMed

    Praveen, Bavishna B; Mazilu, Michael; Marchington, Robert F; Herrington, C Simon; Riches, Andrew; Dholakia, Kishan

    2013-01-01

    In the field of biomedicine, Raman spectroscopy is a powerful technique to discriminate between normal and cancerous cells. However, the strong background signal from the sample and the instrumentation affects the efficiency of this discrimination technique. Wavelength modulated Raman spectroscopy (WMRS) may suppress the background from the Raman spectra. In this study we demonstrate a systematic approach for optimizing the various parameters of WMRS to achieve a reduction in the acquisition time for potential applications such as higher throughput cell screening. The signal-to-noise ratio (SNR) of the Raman bands depends on the modulation amplitude, time constant and total acquisition time. It was observed that the sampling rate does not influence the SNR of the Raman bands if three or more wavelengths are sampled. With these optimised WMRS parameters, we increased the throughput in the binary classification of normal human urothelial cells and bladder cancer cells by reducing the total acquisition time to 6 s, which is significantly lower than the acquisition times previously required for the discrimination between similar cell types.
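
    A minimal numerical sketch of the wavelength-modulation idea described above (not the authors' code): a Raman band shifts with the excitation wavelength while the fluorescence background stays fixed, so removing the modulation-invariant component suppresses the background. Band position, widths, amplitudes, noise level and the number of modulation steps are illustrative assumptions.

    ```python
    import numpy as np

    # Wavenumber axis (cm^-1) and a broad, static fluorescence background (assumed shapes).
    wavenumber = np.linspace(600, 1800, 1200)
    background = 50.0 * np.exp(-((wavenumber - 1200.0) / 900.0) ** 2)

    def raman_band(center, width=8.0):
        """Narrow Gaussian Raman band, small compared to the background."""
        return np.exp(-0.5 * ((wavenumber - center) / width) ** 2)

    # Acquire spectra at several excitation wavelengths: the Raman band shifts
    # by a few cm^-1 per modulation step, the fluorescence background does not.
    n_steps = 5                      # >= 3 wavelengths, as noted in the abstract
    shifts = np.linspace(-4, 4, n_steps)
    rng = np.random.default_rng(0)
    frames = np.array([
        background + raman_band(1450.0 + s) + rng.normal(0, 0.05, wavenumber.size)
        for s in shifts
    ])

    # Background suppression: remove the modulation-invariant component.
    differential = frames - frames.mean(axis=0)

    # The residual carries only the modulated (Raman) information; a simple
    # readout of the band position uses the two extreme modulation frames.
    d = differential[-1] - differential[0]
    center = 0.5 * (wavenumber[np.argmax(d)] + wavenumber[np.argmin(d)])
    print(f"recovered band position ~ {center:.0f} cm^-1 (true 1450)")
    ```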

  19. Field programmable gate array processing of eye-safe all-fiber coherent wind Doppler lidar return signals

    NASA Astrophysics Data System (ADS)

    Abdelazim, S.; Santoro, D.; Arend, M.; Moshary, F.; Ahmed, S.

    2011-11-01

    A field-deployable, all-fiber, eye-safe coherent Doppler lidar is being developed at the Optical Remote Sensing Lab at the City College of New York (CCNY) and is designed to monitor wind fields autonomously and continuously in urban settings. Data acquisition is accomplished by sampling lidar return signals at 400 MHz and performing onboard processing using field programmable gate arrays (FPGAs). The FPGA is programmed to accumulate signal information that is used to calculate the power spectrum of the atmospherically backscattered signal. The advantage of using an FPGA is that signal processing is performed at the hardware level, reducing the load on the host computer and allowing 100% of the return signal to be processed. An experimental setup measured wind speeds at ranges of up to 3 km.
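
    The onboard accumulation described above can be illustrated with a short host-side sketch (a simplified software analog of the FPGA processing, not the CCNY implementation): per-pulse return segments for one range gate are Fourier transformed, their power spectra are accumulated, and the Doppler line is read from the accumulated spectrum. Samples per gate, pulse count and signal parameters are illustrative assumptions; only the 400 MHz sampling rate comes from the abstract.

    ```python
    import numpy as np

    fs = 400e6          # sampling rate from the abstract (400 MHz)
    n_gate = 512        # samples per range gate (assumed)
    n_pulses = 2000     # pulses accumulated (assumed)
    f_doppler = 30e6    # intermediate-frequency Doppler line (assumed)

    rng = np.random.default_rng(1)
    t = np.arange(n_gate) / fs
    accumulated = np.zeros(n_gate // 2 + 1)

    for _ in range(n_pulses):
        # Weak coherent return buried in noise, random phase per pulse.
        phase = rng.uniform(0, 2 * np.pi)
        signal = 0.1 * np.cos(2 * np.pi * f_doppler * t + phase)
        noise = rng.normal(0, 1.0, n_gate)
        # Accumulate the per-pulse power spectrum (periodogram averaging).
        accumulated += np.abs(np.fft.rfft(signal + noise)) ** 2

    freqs = np.fft.rfftfreq(n_gate, d=1 / fs)
    peak = freqs[np.argmax(accumulated[1:]) + 1]   # skip the DC bin
    print(f"estimated Doppler line: {peak / 1e6:.1f} MHz")
    ```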

  20. 48 CFR 1401.7001-4 - Acquisition performance measurement systems.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...-pronged approach that includes self assessment, statistical data for validation and flexible quality... regulations governing the acquisition process; and (3) Identify and implement changes necessary to improve the... through the review and oversight process. ...

  1. 48 CFR 1401.7001-4 - Acquisition performance measurement systems.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...-pronged approach that includes self assessment, statistical data for validation and flexible quality... regulations governing the acquisition process; and (3) Identify and implement changes necessary to improve the... through the review and oversight process. ...

  2. 48 CFR 1401.7001-4 - Acquisition performance measurement systems.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...-pronged approach that includes self assessment, statistical data for validation and flexible quality... regulations governing the acquisition process; and (3) Identify and implement changes necessary to improve the... through the review and oversight process. ...

  3. 48 CFR 1401.7001-4 - Acquisition performance measurement systems.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...-pronged approach that includes self assessment, statistical data for validation and flexible quality... regulations governing the acquisition process; and (3) Identify and implement changes necessary to improve the... through the review and oversight process. ...

  4. Research on control law accelerator of digital signal process chip TMS320F28035 for real-time data acquisition and processing

    NASA Astrophysics Data System (ADS)

    Zhao, Shuangle; Zhang, Xueyi; Sun, Shengli; Wang, Xudong

    2017-08-01

    TI C2000 series digital signal processing (DSP) chips have been widely used in electrical engineering, measurement and control, communications and other professional fields, and the TMS320F28035 is one of the most representative devices. A DSP program typically requires both data acquisition and data processing; if it is written in ordinary C or assembly language, execution is sequential, the analogue-to-digital (AD) converter cannot acquire in real time, and many samples are lost. The control law accelerator (CLA) coprocessor can run in parallel with the main central processing unit (CPU), its clock frequency is the same as that of the main CPU, and it supports floating-point operations. Therefore, the CLA coprocessor is used in the program: the CLA core is responsible for data processing while the main CPU is responsible for AD conversion. The advantage of this method is that it reduces the data-processing time and realizes real-time data acquisition.
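
    The division of labor described above can be sketched conceptually (two Python threads standing in for the main CPU and the CLA; this is not CLA code, which is written with TI's toolchain, and the buffer sizes, rates and "processing" are illustrative assumptions): one context keeps acquiring into buffers while the other processes them in parallel, so acquisition is not blocked by processing.

    ```python
    import queue
    import random
    import threading
    import time

    BUFFER_SIZE = 64
    buffers = queue.Queue(maxsize=4)   # hand-off between the "CPU" and the "CLA"

    def acquire(n_buffers):
        """Stand-in for the ADC loop on the main CPU: fill buffers continuously."""
        for _ in range(n_buffers):
            samples = [random.gauss(0.0, 1.0) for _ in range(BUFFER_SIZE)]
            buffers.put(samples)       # blocks only if the consumer falls behind
        buffers.put(None)              # sentinel: acquisition finished

    def process():
        """Stand-in for the CLA task: consume buffers while acquisition continues."""
        while True:
            samples = buffers.get()
            if samples is None:
                break
            mean = sum(samples) / len(samples)   # placeholder computation
            time.sleep(0.001)                    # pretend the processing takes a while

    producer = threading.Thread(target=acquire, args=(100,))
    consumer = threading.Thread(target=process)
    producer.start()
    consumer.start()
    producer.join()
    consumer.join()
    print("processed 100 buffers concurrently with acquisition")
    ```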

  5. Contributions of the Study of Japanese as a Second language to our General Understanding of Second Language Acquisition and the Definition of Second Language Acquisition Research.

    ERIC Educational Resources Information Center

    Wakabayashi, Shigenori

    2003-01-01

    Reviews three books on the acquisition of Japanese as a second language: "Second Language Acquisition Process in the Classroom" by A. S. Ohta; "The Acquisition of Grammar by Learners of Japanese" (English translation of title) by H. Noda, K. Sakoda, K. Shibuya, and N. Kobayashi; and "The Acquisition of Japanese as a Second Language," by B. K. Kanno,…

  6. High-throughput microcoil NMR of compound libraries using zero-dispersion segmented flow analysis.

    PubMed

    Kautz, Roger A; Goetzinger, Wolfgang K; Karger, Barry L

    2005-01-01

    An automated system for loading samples into a microcoil NMR probe has been developed using segmented flow analysis. This approach enhanced the throughput of the published direct injection and flow injection methods 2-fold, improved sample utilization 3-fold, and was applicable to high-field NMR facilities with long transfer lines between the sample handler and NMR magnet. Sample volumes of 2 microL (10-30 mM, approximately 10 microg) were drawn from a 96-well microtiter plate by a sample handler, then pumped to a 0.5-microL microcoil NMR probe as a queue of closely spaced "plugs" separated by an immiscible fluorocarbon fluid. Individual sample plugs were detected by their NMR signal and automatically positioned for stopped-flow data acquisition. The sample in the NMR coil could be changed within 35 s by advancing the queue. The fluorocarbon liquid wetted the wall of the Teflon transfer line, preventing the DMSO samples from contacting the capillary wall and thus reducing sample losses to below 5% after passage through the 3-m transfer line. With a wash plug of solvent between samples, sample-to-sample carryover was <1%. Significantly, the samples did not disperse into the carrier liquid during loading or during acquisitions of several days for trace analysis. For automated high-throughput analysis using a 16-second acquisition time, spectra were recorded at a rate of 1.5 min/sample and total deuterated solvent consumption was <0.5 mL (1 US dollar) per 96-well plate.

  7. Introduction to Defense Acquisition Management

    DTIC Science & Technology

    1989-03-01

    Most new systems follow the same predictable life cycle, and fit the model... the system after... formatted and its usefulness in the weapon inventory... ...natives for system concept development... Statement (MNS) setting forth requirements needed to meet the... An acquisition strategy is developed to guide... BUSINESS, FINANCIAL AND TECHNICAL ASPECTS OF SYSTEMS ACQUISITION: Management of the systems acquisition process; The acquisition planning phase of the

  8. An Experimental Study of Interventions for the Acquisition and Retention of Motivational Interviewing Skills among Probation Officers

    ERIC Educational Resources Information Center

    Asteris, Mark M., Jr.

    2012-01-01

    This study was designed to investigate the differences in Motivational Interviewing (MI) skill acquisition and retention among probation officers. This study had a randomized, experimental, pretest-posttest control group design using the MITI 3.1.1 and the VASE-R to measure MI skill acquisition and retention. A random sample (n = 24) of probation…

  9. Technology Readiness Assessment (TRA) Deskbook

    DTIC Science & Technology

    2003-09-01

    be certified as being compliant with the FMEA by the Under Secretary of Defense (Comptroller) (USD(C)). B.3 A COMMENT ON THE TRA PROCESS: The Interim... 1.4 Acquisition Process Overview... 2.2.3 Processing the TRA Results... 2.3 Component Acquisition Executive (CAE

  10. Line drawing of STS-34 middeck experiment Polymer Morphology (PM)

    NASA Technical Reports Server (NTRS)

    1989-01-01

    STS-34 middeck experiment Polymer Morphology (PM) and its apparatus is illustrated in this line drawing. Apparatus for the experiment, developed by 3M, includes a Fourier transform infrared (FTIR) spectrometer, an automatic sample manipulating system and a process control and data acquisition computer known as the Generic Electronics Module (GEM). STS-34 mission specialists will interface with the PM experiment through a small, NASA-supplied laptop computer that is used as an input and output device for the main PM computer. PM experiment is an organic materials processing experiment designed to explore the effects of microgravity on polymeric materials as they are processed in space and is being conducted by 3M's Space Research and Applications Laboratory.

  11. Precision of computer vision systems for real-time inspection of contact wire wear in railways

    NASA Astrophysics Data System (ADS)

    Borromeo, Susana; Aparicio, Jose L.

    2005-02-01

    This paper studies techniques to improve the precision of systems for measuring contact wire wear in railways. The problem of wear measurement, characterized by important determining factors such as sampling rate and auscultation conditions, is studied in detail, and the different solutions for addressing it successfully are examined. Issues related to image acquisition and image processing are discussed, including the type of illumination and sensors employed, the image processing hardware and the image processing algorithms. After analysing each factor that influences the precision of the measurement system, a set of solutions is proposed that optimizes the conditions under which the inspection can be carried out.

  12. A Psychometric Study of Reading Processes in L2 Acquisition: Deploying Deep Processing to Push Learners' Discourse Towards Syntactic Processing-Based Constructions

    ERIC Educational Resources Information Center

    Manuel, Carlos J.

    2009-01-01

    This study assesses reading processes and/or strategies needed to deploy deep processing that could push learners towards syntactic-based constructions in L2 classrooms. Research has found L2 acquisition to present varying degrees of success and/or fossilization (Bley-Vroman 1989, Birdsong 1992 and Sharwood Smith 1994). For example, learners have…

  13. Agglutination by anti-capsular polysaccharide antibody is associated with protection against experimental human pneumococcal carriage

    PubMed Central

    Reiné, J; Zangari, T; Owugha, JT; Pennington, SH; Gritzfeld, JF; Wright, AD; Collins, AM; van Selm, S; de Jonge, MI; Gordon, SB; Weiser, JN; Ferreira, DM

    2016-01-01

    The ability of pneumococcal conjugate vaccine (PCV) to decrease transmission by blocking the acquisition of colonization has been attributed to herd immunity. We describe the role of mucosal IgG to capsular polysaccharide (CPS) in mediating protection from carriage, translating our findings from a murine model to humans. We used a flow-cytometric assay to quantify antibody-mediated agglutination, demonstrating that hyperimmune sera generated against an unencapsulated mutant were poorly agglutinating. Passive immunization with this antiserum was ineffective at blocking acquisition of colonization compared to agglutinating antisera raised against the encapsulated parent strain. In the human challenge model, samples were collected from PCV- and control-vaccinated adults. In PCV-vaccinated subjects, IgG levels to CPS were increased in serum and nasal wash (NW). IgG to the inoculated strain's CPS dropped in NW samples after inoculation, suggesting its sequestration by colonizing pneumococci. In post-vaccination NW samples, pneumococci were heavily agglutinated compared to pre-vaccination samples in subjects protected against carriage. Our results indicate that pneumococcal agglutination mediated by CPS-specific antibodies is a key mechanism of protection against acquisition of carriage. Capsule may be the only vaccine target that can elicit strong agglutinating antibody responses, leading to protection against carriage acquisition and generation of herd immunity. PMID:27579859

  14. Design of Multishell Sampling Schemes with Uniform Coverage in Diffusion MRI

    PubMed Central

    Caruyer, Emmanuel; Lenglet, Christophe; Sapiro, Guillermo; Deriche, Rachid

    2017-01-01

    Purpose: In diffusion MRI, a technique known as diffusion spectrum imaging reconstructs the propagator with a discrete Fourier transform, from a Cartesian sampling of the diffusion signal. Alternatively, it is possible to directly reconstruct the orientation distribution function in q-ball imaging, providing so-called high angular resolution diffusion imaging. In between these two techniques, acquisitions on several spheres in q-space offer an interesting trade-off between the angular resolution and the radial information gathered in diffusion MRI. A careful design is central to the success of multishell acquisition and reconstruction techniques. Methods: The design of multishell acquisitions, however, is still an open and active field of research. In this work, we provide a general method to design multishell acquisitions with uniform angular coverage. This method is based on a generalization of electrostatic repulsion to multishell. Results: We evaluate the impact of our method using simulations, on the angular resolution in one- and two-fiber-bundle configurations. Compared to more commonly used radial sampling, we show that our method improves the angular resolution, as well as fiber crossing discrimination. Discussion: We propose a novel method to design sampling schemes with optimal angular coverage and show the positive impact on angular resolution in diffusion MRI. PMID:23625329
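
    The electrostatic-repulsion idea that the method above generalizes can be sketched in a few lines (a single-shell toy version, not the authors' multishell algorithm; point count, step size and iteration count are illustrative assumptions): points on the unit sphere are pushed apart by a Coulomb-like force and renormalized onto the sphere.

    ```python
    import numpy as np

    def repulsion_sphere(n_points, n_iter=2000, step=0.01, seed=0):
        """Spread n_points on the unit sphere by relaxing a Coulomb-like energy."""
        rng = np.random.default_rng(seed)
        x = rng.normal(size=(n_points, 3))
        x /= np.linalg.norm(x, axis=1, keepdims=True)
        for _ in range(n_iter):
            diff = x[:, None, :] - x[None, :, :]                 # pairwise differences
            dist = np.linalg.norm(diff, axis=-1)
            np.fill_diagonal(dist, np.inf)                       # ignore self-interaction
            force = (diff / dist[..., None] ** 3).sum(axis=1)    # repulsive force on each point
            force /= np.maximum(1.0, np.linalg.norm(force, axis=1, keepdims=True))
            x += step * force                                    # bounded relaxation step
            x /= np.linalg.norm(x, axis=1, keepdims=True)        # project back onto the sphere
        return x

    points = repulsion_sphere(30)
    cosines = np.clip(points @ points.T, -1.0, 1.0)
    np.fill_diagonal(cosines, -1.0)
    print(f"minimum angular separation: {np.degrees(np.arccos(cosines.max())):.1f} degrees")
    ```

    Diffusion sampling schemes usually also treat antipodal directions as equivalent and couple the shells, which is exactly the part the full multishell generalization addresses.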

  15. Texture-adaptive hyperspectral video acquisition system with a spatial light modulator

    NASA Astrophysics Data System (ADS)

    Fang, Xiaojing; Feng, Jiao; Wang, Yongjin

    2014-10-01

    We present a new hybrid camera system based on a spatial light modulator (SLM) to capture texture-adaptive high-resolution hyperspectral video. The hybrid camera system records a hyperspectral video with low spatial resolution using a gray camera and a high-spatial-resolution video using an RGB camera. The hyperspectral video is subsampled by the SLM. The subsampled points can be adaptively selected according to the texture characteristics of the scene by combining digital image analysis and computational processing. In this paper, we propose an adaptive sampling method utilizing texture segmentation and the wavelet transform (WT). We also demonstrate the effectiveness of the sampling pattern generated on the SLM with the proposed method.

  16. Quantitative myocardial perfusion from static cardiac and dynamic arterial CT

    NASA Astrophysics Data System (ADS)

    Bindschadler, Michael; Branch, Kelley R.; Alessio, Adam M.

    2018-05-01

    Quantitative myocardial blood flow (MBF) estimation by dynamic contrast enhanced cardiac computed tomography (CT) requires multi-frame acquisition of contrast transit through the blood pool and myocardium to inform the arterial input and tissue response functions. Both the input and the tissue response functions for the entire myocardium are sampled with each acquisition. However, the long breath holds and frequent sampling can result in significant motion artifacts and relatively high radiation dose. To address these limitations, we propose and evaluate a new static cardiac and dynamic arterial (SCDA) quantitative MBF approach where (1) the input function is well sampled using either prediction from pre-scan timing bolus data or measurement from dynamic thin-slice 'bolus tracking' acquisitions, and (2) the whole-heart tissue response data is limited to one contrast enhanced CT acquisition. A perfusion model uses the dynamic arterial input function to generate a family of possible myocardial contrast enhancement curves corresponding to a range of MBF values. Combined with the timing of the single whole-heart acquisition, these curves generate a lookup table relating myocardial contrast enhancement to quantitative MBF. We tested the SCDA approach in 28 patients that underwent a full dynamic CT protocol both at rest and vasodilator stress conditions. Using the measured input function plus a single (enhanced CT only) or a double (enhanced and contrast-free baseline CTs) myocardial acquisition yielded MBF estimates with root mean square (RMS) errors of 1.2 ml/min/g and 0.35 ml/min/g, and radiation dose reductions of 90% and 83%, respectively. The prediction of the input function based on timing bolus data and the static acquisition had an RMS error of 26.0% compared to the measured input function, which led to MBF estimation errors more than three times higher than those obtained using the measured input function. SCDA presents a new, simplified approach for quantitative perfusion imaging with an acquisition strategy offering substantial radiation dose and computational complexity savings over dynamic CT.
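
    The lookup-table idea in the SCDA approach can be illustrated with a toy one-compartment model (an assumption for this sketch; the paper's perfusion model, parameter values and units may differ): the arterial input function drives simulated tissue enhancement curves over a grid of MBF values, and the enhancement at the single whole-heart acquisition time is inverted back to MBF by interpolation.

    ```python
    import numpy as np

    # Time axis (s) and a synthetic arterial input function (gamma-variate shape, assumed).
    t = np.arange(0, 60, 0.5)
    aif = 300.0 * (t / 12.0) ** 3 * np.exp(3 * (1 - t / 12.0))

    def tissue_curve(mbf_ml_min_g, lam=1.0):
        """One-compartment model dCt/dt = F*Ca(t) - (F/lam)*Ct(t), with F in mL/s/g."""
        f = mbf_ml_min_g / 60.0
        ct = np.zeros_like(t)
        dt = t[1] - t[0]
        for i in range(1, t.size):
            ct[i] = ct[i - 1] + dt * (f * aif[i - 1] - (f / lam) * ct[i - 1])
        return ct

    # Lookup table: enhancement at the single static acquisition time as a function
    # of MBF (monotonic over this early-phase window for this toy model).
    t_acq = 15.0                                  # s after contrast arrival, assumed
    i_acq = np.argmin(np.abs(t - t_acq))
    mbf_grid = np.linspace(0.3, 4.0, 100)         # ml/min/g
    enh_grid = np.array([tissue_curve(f)[i_acq] for f in mbf_grid])

    # Invert a "measured" myocardial enhancement value back to quantitative MBF.
    measured_enh = tissue_curve(1.8)[i_acq]       # simulate a voxel with MBF = 1.8
    mbf_estimate = np.interp(measured_enh, enh_grid, mbf_grid)
    print(f"estimated MBF: {mbf_estimate:.2f} ml/min/g (true 1.8)")
    ```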

  17. Low rank magnetic resonance fingerprinting.

    PubMed

    Mazor, Gal; Weizman, Lior; Tal, Assaf; Eldar, Yonina C

    2016-08-01

    Magnetic Resonance Fingerprinting (MRF) is a relatively new approach that provides quantitative MRI using randomized acquisition. Extraction of physical quantitative tissue values is performed off-line, based on acquisition with varying parameters and a dictionary generated according to the Bloch equations. MRF uses hundreds of radio frequency (RF) excitation pulses for acquisition, and therefore a high under-sampling ratio in the sampling domain (k-space) is required. This under-sampling causes spatial artifacts that hamper the ability to accurately estimate the quantitative tissue values. In this work, we introduce a new approach for quantitative MRI using MRF, called Low Rank MRF. We exploit the low rank property of the temporal domain, on top of the well-known sparsity of the MRF signal in the generated dictionary domain. We present an iterative scheme that consists of a gradient step followed by a low rank projection using the singular value decomposition. Experiments on real MRI data demonstrate superior results compared to conventional implementation of compressed sensing for MRF at 15% sampling ratio.
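
    A compact sketch of the iteration described above, in a simplified matrix-completion setting rather than the full k-space MRF forward model (an assumption for illustration; matrix sizes, rank and sampling ratio are made up): a gradient step enforces consistency with the sampled entries, and a truncated SVD projects onto the low-rank set.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic low-rank "temporal" matrix: voxels x time points.
    n_vox, n_t, rank = 200, 100, 3
    X_true = rng.normal(size=(n_vox, rank)) @ rng.normal(size=(rank, n_t))

    # Undersampling: keep 15% of the entries (stand-in for the 15% sampling ratio).
    mask = rng.random(X_true.shape) < 0.15
    Y = X_true * mask

    def low_rank_project(X, r):
        """Keep only the r largest singular values (SVD truncation)."""
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        s[r:] = 0
        return (U * s) @ Vt

    X = np.zeros_like(Y)
    step = 1.0
    for _ in range(200):
        grad = mask * (X - Y)                     # gradient of 0.5*||mask*(X - Y)||^2
        X = low_rank_project(X - step * grad, rank)

    rel_err = np.linalg.norm(X - X_true) / np.linalg.norm(X_true)
    print(f"relative reconstruction error: {rel_err:.3f}")
    ```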

  18. Analysis of the times involved in processing and communication in a lower limb simulation system controlled by SEMG

    NASA Astrophysics Data System (ADS)

    Profumieri, A.; Bonell, C.; Catalfamo, P.; Cherniz, A.

    2016-04-01

    Virtual reality has been proposed for different applications, including the evaluation of new control strategies and training protocols for upper limb prostheses and the study of new rehabilitation programs. In this study, a lower limb simulation environment commanded by surface electromyography signals is evaluated. The time delays generated by the acquisition and processing stages for the signals that would command the knee joint were measured, and different acquisition windows were analysed. The subjective perception of the quality of simulation was also evaluated when extra delays were added to the process. The results showed that the acquisition window is responsible for the longest delay. Also, the basic processes implemented allowed for the acquisition of three signal channels for commanding the simulation. Finally, the communication between different applications is arguably efficient, although it depends on the amount of data to be sent.

  19. Effective use of metadata in the integration and analysis of multi-dimensional optical data

    NASA Astrophysics Data System (ADS)

    Pastorello, G. Z.; Gamon, J. A.

    2012-12-01

    Data discovery and integration relies on adequate metadata. However, creating and maintaining metadata is time consuming and often poorly addressed or avoided altogether, leading to problems in later data analysis and exchange. This is particularly true for research fields in which metadata standards do not yet exist or are under development, or within smaller research groups without enough resources. Vegetation monitoring using in-situ and remote optical sensing is an example of such a domain. In this area, data are inherently multi-dimensional, with spatial, temporal and spectral dimensions usually being well characterized. Other equally important aspects, however, might be inadequately translated into metadata. Examples include equipment specifications and calibrations, field/lab notes and field/lab protocols (e.g., sampling regimen, spectral calibration, atmospheric correction, sensor view angle, illumination angle), data processing choices (e.g., methods for gap filling, filtering and aggregation of data), quality assurance, and documentation of data sources, ownership and licensing. Each of these aspects can be important as metadata for search and discovery, but they can also be used as key data fields in their own right. If each of these aspects is also understood as an "extra dimension," it is possible to take advantage of them to simplify the data acquisition, integration, analysis, visualization and exchange cycle. Simple examples include selecting data sets of interest early in the integration process (e.g., only data collected according to a specific field sampling protocol) or applying appropriate data processing operations to different parts of a data set (e.g., adaptive processing for data collected under different sky conditions). More interesting scenarios involve guided navigation and visualization of data sets based on these extra dimensions, as well as partitioning data sets to highlight relevant subsets to be made available for exchange. The DAX (Data Acquisition to eXchange) Web-based tool uses a flexible metadata representation model and takes advantage of multi-dimensional data structures to translate metadata types into data dimensions, effectively reshaping data sets according to available metadata. With that, metadata is tightly integrated into the acquisition-to-exchange cycle, allowing for more focused exploration of data sets while also increasing the value of, and incentives for, keeping good metadata. The tool is being developed and tested with optical data collected in different settings, including laboratory, field, airborne, and satellite platforms.

  20. Minimalism and Beyond: Second Language Acquisition for the Twenty-First Century.

    ERIC Educational Resources Information Center

    Balcom, Patricia A.

    2001-01-01

    Provides a general overview of two books--"The Second Time Around: Minimalism and Second Language Acquisition" and "Second Language Syntax: A Generative Introduction"--and shows how they respond to key issues in second language acquisition, including the process of second language acquisition, access to universal grammar, the role of…

  1. Defense Acquisition Workforce: The Air Force Needs to Evaluate Changes in Funding for Civilians Engaged in Space Acquisition

    DTIC Science & Technology

    2013-07-01

    (1) revitalize the acquisition workforce; (2) improve the requirements generation process; (3) instill budget and financial DOD Acquisition...with our four recommended actions (see app. I). In concurring with our recommendations, DOD stated that the Air Force will evaluate the pilot program

  2. The skills related to the early reading acquisition in Spain and Peru

    PubMed Central

    Ávila, Vicenta; Martínez, Tomás; Ysla, Liz

    2018-01-01

    This paper deals with the skills related to early reading acquisition in two countries that share a language. Research on reading readiness has traditionally shown great interest in finding out which factors affect early reading ability, as distinct from other academic skills that affect general school learning. Furthermore, it is of interest how pre-reading variables influence the development of reading in two countries with the same language. Several studies have examined which skills are related to reading readiness (phonological awareness, alphabetic awareness, naming speed, linguistic skills, metalinguistic knowledge and basic cognitive processes), but there are no studies showing whether country can also influence the development of these skills. Our main objective in this study was to establish whether there were differences in the degree of acquisition of these skills between Spanish (119 children) and Peruvian (128 children) five-year-old children, assessed in their own countries and after controlling for Economic, Social and Cultural Status (ESCS). The results show that there are significant differences in the degree of acquisition of these skills between the two samples. It is especially relevant that the main predictor in a regression study was the country of origin, which explained a higher percentage of variance than other variables such as age difference in months or gender. These findings corroborate the results obtained in other studies with migrant populations. PMID:29505592

  3. Single transmission line interrogated multiple channel data acquisition system

    DOEpatents

    Fasching, George E.; Keech, Jr., Thomas W.

    1980-01-01

    A single transmission line interrogated multiple channel data acquisition system is provided in which a plurality of remote station/sensor circuits each monitors a specific process variable and each transmits measurement values over a single transmission line to a master interrogating station when addressed by said master interrogating station. Typically, as many as 330 remote stations may be parallel connected to the transmission line which may exceed 7,000 feet. The interrogation rate is typically 330 stations/second. The master interrogating station samples each station according to a shared, charging transmit-receive cycle. All remote station address signals, all data signals from the remote stations/sensors and all power for all of the remote station/sensors are transmitted via a single continuous terminated coaxial cable. A means is provided for periodically and remotely calibrating all remote sensors for zero and span. A provision is available to remotely disconnect any selected sensor station from the main transmission line.

  4. Free-decay time-domain modal identification for large space structures

    NASA Technical Reports Server (NTRS)

    Kim, Hyoung M.; Vanhorn, David A.; Doiron, Harold H.

    1992-01-01

    Concept definition studies for the Modal Identification Experiment (MIE), a proposed space flight experiment for the Space Station Freedom (SSF), have demonstrated advantages and compatibility of free-decay time-domain modal identification techniques with the on-orbit operational constraints of large space structures. Since practical experience with modal identification using actual free-decay responses of large space structures is very limited, several numerical and test data reduction studies were conducted. Major issues and solutions were addressed, including closely-spaced modes, wide frequency range of interest, data acquisition errors, sampling delay, excitation limitations, nonlinearities, and unknown disturbances during free-decay data acquisition. The data processing strategies developed in these studies were applied to numerical simulations of the MIE, test data from a deployable truss, and launch vehicle flight data. Results of these studies indicate free-decay time-domain modal identification methods can provide accurate modal parameters necessary to characterize the structural dynamics of large space structures.

  5. Computational Modeling for Language Acquisition: A Tutorial With Syntactic Islands.

    PubMed

    Pearl, Lisa S; Sprouse, Jon

    2015-06-01

    Given the growing prominence of computational modeling in the acquisition research community, we present a tutorial on how to use computational modeling to investigate learning strategies that underlie the acquisition process. This is useful for understanding both typical and atypical linguistic development. We provide a general overview of why modeling can be a particularly informative tool and some general considerations when creating a computational acquisition model. We then review a concrete example of a computational acquisition model for complex structural knowledge referred to as syntactic islands. This includes an overview of syntactic islands knowledge, a precise definition of the acquisition task being modeled, the modeling results, and how to meaningfully interpret those results in a way that is relevant for questions about knowledge representation and the learning process. Computational modeling is a powerful tool that can be used to understand linguistic development. The general approach presented here can be used to investigate any acquisition task and any learning strategy, provided both are precisely defined.

  6. Earthquake recordings from the 2002 Seattle Seismic Hazard Investigation of Puget Sound (SHIPS), Washington State

    USGS Publications Warehouse

    Pratt, Thomas L.; Meagher, Karen L.; Brocher, Thomas M.; Yelin, Thomas; Norris, Robert; Hultgrien, Lynn; Barnett, Elizabeth; Weaver, Craig S.

    2003-01-01

    This report describes seismic data obtained during the fourth Seismic Hazard Investigation of Puget Sound (SHIPS) experiment, termed Seattle SHIPS. The experiment was designed to study the influence of the Seattle sedimentary basin on ground shaking during earthquakes. To accomplish this, we deployed seismometers over the basin to record local earthquakes, quarry blasts, and teleseisms during the period of January 26 to May 27, 2002. We plan to analyze the recordings to compute spectral amplitudes at each site, to determine the variability of ground motions over the basin. During the Seattle SHIPS experiment, seismometers were deployed at 87 sites in a 110-km-long east-west line, three north-south lines, and a grid throughout the Seattle urban area (Figure 1). At each of these sites, an L-22, 2-Hz velocity transducer was installed and connected to a REF TEK Digital Acquisition System (DAS), both provided by the Program for Array Seismic Studies of the Continental Lithosphere (PASSCAL) of the Incorporated Research Institutes for Seismology (IRIS). The instruments were installed on January 26 and 27, and were retrieved gradually between April 18 and May 27. All instruments continuously sampled all three components of motion (velocity) at a sample rate of 50 samples/sec. To ensure accurate computations of amplitude, we calibrated the geophones in situ to obtain the instrument responses. In this report, we discuss the acquisition of these data, we describe the processing and merging of these data into 1-hour long traces and into windowed events, we discuss the geophone calibration process and its results, and we display some of the earthquake recordings.

  7. Mosaic construction, processing, and review of very large electron micrograph composites

    NASA Astrophysics Data System (ADS)

    Vogt, Robert C., III; Trenkle, John M.; Harmon, Laurel A.

    1996-11-01

    A system of programs is described for acquisition, mosaicking, cueing and interactive review of large-scale transmission electron micrograph composite images. This work was carried out as part of a final-phase clinical analysis study of a drug for the treatment of diabetic peripheral neuropathy. More than 500 nerve biopsy samples were prepared, digitally imaged, processed, and reviewed. For a given sample, typically 1000 or more 1.5 megabyte frames were acquired, for a total of between 1 and 2 gigabytes of data per sample. These frames were then automatically registered and mosaicked together into a single virtual image composite, which was subsequently used to perform automatic cueing of axons and axon clusters, as well as review and marking by qualified neuroanatomists. Statistics derived from the review process were used to evaluate the efficacy of the drug in promoting regeneration of myelinated nerve fibers. This effort demonstrates a new, entirely digital capability for doing large-scale electron micrograph studies, in which all of the relevant specimen data can be included at high magnification, as opposed to simply taking a random sample of discrete locations. It opens up the possibility of a new era in electron microscopy--one which broadens the scope of questions that this imaging modality can be used to answer.
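
    A minimal sketch of one registration step such a mosaicking pipeline needs (phase correlation between two overlapping frames; purely illustrative, not the system described above, and the frame sizes and offset are made up): the normalized cross-power spectrum peaks at the translation between the frames.

    ```python
    import numpy as np

    def phase_correlation(frame_a, frame_b):
        """Integer (row, col) offset of frame_b's field of view relative to frame_a."""
        fa, fb = np.fft.fft2(frame_a), np.fft.fft2(frame_b)
        cross = fa * np.conj(fb)
        cross /= np.maximum(np.abs(cross), 1e-12)        # keep phase only
        corr = np.fft.ifft2(cross).real
        peak = np.unravel_index(np.argmax(corr), corr.shape)
        # Indices above N/2 correspond to negative offsets (circular shift).
        return tuple(int(p) - s if p > s // 2 else int(p) for p, s in zip(peak, corr.shape))

    # Two synthetic overlapping frames cut from the same "specimen".
    rng = np.random.default_rng(2)
    scene = rng.normal(size=(300, 300))
    frame_a = scene[50:178, 50:178]              # 128 x 128 tile
    frame_b = scene[58:186, 47:175]              # same size, offset by (+8, -3)
    print(phase_correlation(frame_a, frame_b))   # expected: (8, -3)
    ```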

  8. A digital acquisition and elaboration system for nuclear fast pulse detection

    NASA Astrophysics Data System (ADS)

    Esposito, B.; Riva, M.; Marocco, D.; Kaschuck, Y.

    2007-03-01

    A new digital acquisition and elaboration system has been developed and assembled at ENEA-Frascati for the direct sampling of fast pulses from nuclear detectors such as scintillators and diamond detectors. The system is capable of performing digital sampling of the pulses (200 MSamples/s, 14-bit) and simultaneous (compressed) data transfer for further storage and software elaboration. The FPGA-based design is oriented to real-time applications and has been developed to allow acquisition with no loss of pulses and data storage over long time intervals (tens of seconds at MHz pulse count rates) without the need for large on-board memory. A dedicated pulse analysis software package, written in LabVIEW™, performs the treatment of the acquired pulses, including pulse recognition, pile-up rejection, baseline removal, pulse shape particle separation and pulse height spectra analysis. The acquisition and pre-elaboration programs have been fully integrated with the analysis software.
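
    The offline pulse treatment listed above (baseline removal, pulse recognition, pile-up rejection, pulse-height spectrum) can be sketched on a synthetic trace; the pulse shape, thresholds and pile-up criterion are illustrative assumptions, not the LabVIEW implementation.

    ```python
    import numpy as np
    from scipy.signal import find_peaks

    rng = np.random.default_rng(3)

    # Synthetic digitizer trace in ADC counts (values and rates are assumed).
    n = 200_000
    trace = 100.0 + rng.normal(0, 2.0, n)                 # noisy baseline near 100 counts

    def pulse(amp, tau=40):
        """Single exponential pulse shape, 8*tau samples long."""
        return amp * np.exp(-np.arange(8 * tau) / tau)

    for start in rng.integers(0, n - 400, size=100):      # 100 pulses at random times
        amp = rng.uniform(50, 800)
        trace[start:start + pulse(amp).size] += pulse(amp)

    # 1) Baseline removal (simple global estimate from the trace median).
    signal = trace - np.median(trace)

    # 2) Pulse recognition: threshold well above the noise level.
    peaks, props = find_peaks(signal, height=30.0, distance=20)

    # 3) Pile-up rejection: drop any pulse arriving within one pulse length
    #    of the previous one (a simple one-sided criterion).
    keep = np.diff(peaks, prepend=-10_000) > 8 * 40
    clean_heights = props["peak_heights"][keep]

    # 4) Pulse-height spectrum of the accepted events.
    spectrum, edges = np.histogram(clean_heights, bins=64, range=(0, 900))
    print(f"accepted {keep.sum()} of {peaks.size} detected pulses")
    ```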

  9. The Influence of Working Memory and Phonological Processing on English Language Learner Children's Bilingual Reading and Language Acquisition

    ERIC Educational Resources Information Center

    Swanson, H. Lee; Orosco, Michael J.; Lussier, Cathy M.; Gerber, Michael M.; Guzman-Orth, Danielle A.

    2011-01-01

    In this study, we explored whether the contribution of working memory (WM) to children's (N = 471) 2nd language (L2) reading and language acquisition was best accounted for by processing efficiency at a phonological level and/or by executive processes independent of phonological processing. Elementary school children (Grades 1, 2, & 3) whose…

  10. Palladium-based Mass-Tag Cell Barcoding with a Doublet-Filtering Scheme and Single Cell Deconvolution Algorithm

    PubMed Central

    Zunder, Eli R.; Finck, Rachel; Behbehani, Gregory K.; Amir, El-ad D.; Krishnaswamy, Smita; Gonzalez, Veronica D.; Lorang, Cynthia G.; Bjornson, Zach; Spitzer, Matthew H.; Bodenmiller, Bernd; Fantl, Wendy J.; Pe’er, Dana; Nolan, Garry P.

    2015-01-01

    Mass-tag cell barcoding (MCB) labels individual cell samples with unique combinatorial barcodes, after which they are pooled for processing and measurement as a single multiplexed sample. The MCB method eliminates variability between samples in antibody staining and instrument sensitivity, reduces antibody consumption, and shortens instrument measurement time. Here, we present an optimized MCB protocol with several improvements over previously described methods. The use of palladium-based labeling reagents expands the number of measurement channels available for mass cytometry and reduces interference with lanthanide-based antibody measurement. An error-detecting combinatorial barcoding scheme allows cell doublets to be identified and removed from the analysis. A debarcoding algorithm that is single cell-based rather than population-based improves the accuracy and efficiency of sample deconvolution. This debarcoding algorithm has been packaged into software that allows rapid and unbiased sample deconvolution. The MCB procedure takes 3–4 h, not including sample acquisition time of ~1 h per million cells. PMID:25612231
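
    The error-detecting, doublet-filtering barcode scheme can be illustrated with a small combinatorial sketch (the channel count and the k-of-n choice are illustrative, not the published reagent panel): each sample receives a code with exactly k of n palladium channels positive, so any event carrying a different number of positive channels, including doublets formed from two distinct samples, is rejected.

    ```python
    from itertools import combinations

    N_CHANNELS = 6     # palladium mass channels (assumed count)
    K_POSITIVE = 3     # each valid barcode has exactly k positive channels

    # All valid k-of-n barcodes, as sorted tuples of positive channel indices.
    valid_codes = list(combinations(range(N_CHANNELS), K_POSITIVE))
    print(f"{len(valid_codes)} samples can be multiplexed")        # C(6,3) = 20

    def classify(positive_channels):
        """Assign a measured cell to a sample, or reject it (doublet / labeling error)."""
        code = tuple(sorted(positive_channels))
        if len(code) != K_POSITIVE or code not in valid_codes:
            return None                        # wrong channel count -> filtered out
        return valid_codes.index(code)         # sample identity

    # A singlet from sample 5 is recovered; a doublet of two different samples
    # carries the union of their channels (more than k positives) and is rejected.
    sample_a, sample_b = valid_codes[5], valid_codes[11]
    print(classify(sample_a))                             # -> 5
    print(classify(set(sample_a) | set(sample_b)))        # -> None (doublet)
    ```

    Doublets formed from two cells of the same sample share a barcode and are not caught by this count check alone; the published method pairs the scheme with single-cell debarcoding to handle such cases.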

  11. In-Situ Operations and Planning for the Mars Science Laboratory Robotic Arm: The First 200 Sols

    NASA Technical Reports Server (NTRS)

    Robinson, M.; Collins, C.; Leger, P.; Carsten, J.; Tompkins, V.; Hartman, F.; Yen, J.

    2013-01-01

    The Robotic Arm (RA) has operated for more than 200 Martian solar days (or sols) since the Mars Science Laboratory rover touched down in Gale Crater on August 5, 2012. During the first seven months on Mars the robotic arm has performed multiple contact science sols including the positioning of the Alpha Particle X-Ray Spectrometer (APXS) and/or Mars Hand Lens Imager (MAHLI) with respect to rocks or loose regolith targets. The RA has supported sample acquisition using both the scoop and drill, sample processing with CHIMRA (Collection and Handling for In- Situ Martian Rock Analysis), and delivery of sample portions to the observation tray, and the SAM (Sample Analysis at Mars) and CHEMIN (Chemistry and Mineralogy) science instruments. This paper describes the planning and execution of robotic arm activities during surface operations, and reviews robotic arm performance results from Mars to date.

  12. A developmental perspective on reading dysfunction: accuracy and rate criteria in the subtyping of dyslexic children.

    PubMed

    Lovett, M W

    1984-05-01

    Children referred with specific reading dysfunction were subtyped as accuracy disabled or rate disabled according to criteria developed from an information processing model of reading skill. Multiple measures of oral and written language development were compared for two subtyped samples matched on age, sex, and IQ. The two samples were comparable in reading fluency, reading comprehension, word knowledge, and word retrieval functions. Accuracy disabled readers demonstrated inferior decoding and spelling skills. The accuracy disabled sample proved deficient in their understanding of oral language structure and in their ability to associate unfamiliar pseudowords and novel symbols in a task designed to simulate some of the learning involved in initial reading acquisition. It was suggested that these two samples of disabled readers may be best described with respect to their relative standing along a theoretical continuum of normal reading development.

  13. Scanning transmission electron microscopy through-focal tilt-series on biological specimens.

    PubMed

    Trepout, Sylvain; Messaoudi, Cédric; Perrot, Sylvie; Bastin, Philippe; Marco, Sergio

    2015-10-01

    Since scanning transmission electron microscopy can produce high signal-to-noise ratio bright-field images of thick (≥500 nm) specimens, this tool is emerging as the method of choice to study thick biological samples via tomographic approaches. However, in a convergent-beam configuration, the depth of field is limited because only a thin portion of the specimen (from a few nanometres to tens of nanometres depending on the convergence angle) can be imaged in focus. A method known as through-focal imaging enables recovery of the full depth of information by combining images acquired at different levels of focus. In this work, we compare tomographic reconstruction with the through-focal tilt-series approach (a multifocal series of images per tilt angle) with reconstruction with the classic tilt-series acquisition scheme (one single-focus image per tilt angle). We visualised the base of the flagellum in the protist Trypanosoma brucei via an acquisition and image-processing method tailored to obtain quantitative and qualitative descriptors of reconstruction volumes. Reconstructions using through-focal imaging contained more contrast and more details for thick (≥500 nm) biological samples. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Non-destructive sampling of a comet

    NASA Astrophysics Data System (ADS)

    Jessberger, H. L.; Kotthaus, M.

    1991-04-01

    Various conditions which must be met for the development of a nondestructive sampling and acquisition system are outlined, and the development of a new robotic sampling system suited for use on a cometary surface is briefly discussed. The Rosetta mission of ESA will take samples of a comet nucleus and return both core and volatile samples to Earth. Various considerations which must be taken into account for such a project are examined, including the identification of design parameters for sample quality; the identification of the most probable site conditions; the development of a sample acquisition system with respect to these conditions; the production of model materials and model conditions; and the investigation of the relevant material properties. An adequate sampling system should also be designed and built, including various tools, and the system should be tested under simulated cometary conditions.

  15. 48 CFR 750.7110 - Processing cases.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Processing cases. 750.7110 Section 750.7110 Federal Acquisition Regulations System AGENCY FOR INTERNATIONAL DEVELOPMENT CONTRACT MANAGEMENT EXTRAORDINARY CONTRACTUAL ACTIONS Extraordinary Contractual Actions To Protect Foreign Policy...

  16. 48 CFR 3009.570-2 - Policy.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... ACQUISITION REGULATION (HSAR) ACQUISITION PLANNING CONTRACTOR QUALIFICATIONS Organizational and Consultant... took appropriate steps to prevent any organizational conflict of interest in the selection process; or... process over which the entity exercised no control. (c) CONSTRUCTION—Nothing in this section 3009.570...

  17. A Five-Year Plan for Meeting the Automatic Data Processing and Telecommunications Needs of the Federal Government. Volume 2: Major Information Technology Systems Acquisition Plans of Federal Executive Agencies, 1984-1989.

    ERIC Educational Resources Information Center

    Department of Commerce, Washington, DC.

    This volume, the second of two, presents and analyzes the information technology acquisition plans of the Federal Government by agency and component. A brief description covers the outlays planned for major information technology acquisitions of general purpose data processing and telecommunications systems, facilities, and related services for 6…

  18. Possible Overlapping Time Frames of Acquisition and Consolidation Phases in Object Memory Processes: A Pharmacological Approach

    ERIC Educational Resources Information Center

    Akkerman, Sven; Blokland, Arjan; Prickaerts, Jos

    2016-01-01

    In previous studies, we have shown that acetylcholinesterase inhibitors and phosphodiesterase inhibitors (PDE-Is) are able to improve object memory by enhancing acquisition processes. On the other hand, only PDE-Is improve consolidation processes. Here we show that the cholinesterase inhibitor donepezil also improves memory performance when…

  19. The Acquisition of Integrated Science Process Skills in a Web-Based Learning Environment

    ERIC Educational Resources Information Center

    Saat, Rohaida Mohd

    2004-01-01

    Web-based learning is becoming prevalent in science learning. Some use specially designed programs, while others use materials available on the Internet. This qualitative case study examined the process of acquisition of integrated science process skills, particularly the skill of controlling variables, in a web-based learning environment among…

  20. Word Order Processing in the Bilingual Brain

    ERIC Educational Resources Information Center

    Saur, Dorothee; Baumgaertner, Annette; Moehring, Anja; Buchel, Christian; Bonnesen, Matthias; Rose, Michael; Musso, Mariachristina; Meisel, Jurgen M.

    2009-01-01

    One of the issues debated in the field of bilingualism is the question of a "critical period" for second language acquisition. Recent studies suggest an influence of age of onset of acquisition (AOA) particularly on syntactic processing; however, the processing of word order in a sentence context has not yet been examined specifically. We used…

  1. Digital Curation of Earth Science Samples Starts in the Field

    NASA Astrophysics Data System (ADS)

    Lehnert, K. A.; Hsu, L.; Song, L.; Carter, M. R.

    2014-12-01

    Collection of physical samples in the field is an essential part of research in the Earth Sciences. Samples provide a basis for progress across many disciplines, from the study of global climate change now and over the Earth's history, to present and past biogeochemical cycles, to magmatic processes and mantle dynamics. The types of samples, methods of collection, and scope and scale of sampling campaigns are highly diverse, ranging from large-scale programs to drill rock and sediment cores on land, in lakes, and in the ocean, to environmental observation networks with continuous sampling, to single investigator or small team expeditions to remote areas around the globe or trips to local outcrops. Cyberinfrastructure for sample-related fieldwork needs to cater to the different needs of these diverse sampling activities, aligning with specific workflows, regional constraints such as connectivity or climate, and processing of samples. In general, digital tools should assist with capture and management of metadata about the sampling process (location, time, method) and the sample itself (type, dimension, context, images, etc.), management of the physical objects (e.g., sample labels with QR codes), and the seamless transfer of sample metadata to data systems and software relevant to the post-sampling data acquisition, data processing, and sample curation. In order to optimize CI capabilities for samples, tools and workflows need to adopt community-based standards and best practices for sample metadata, classification, identification and registration. This presentation will provide an overview and updates of several ongoing efforts that are relevant to the development of standards for digital sample management: the ODM2 project that has generated an information model for spatially-discrete, feature-based earth observations resulting from in-situ sensors and environmental samples, aligned with OGC's Observation & Measurements model (Horsburgh et al, AGU FM 2014); implementation of the IGSN (International Geo Sample Number) as a globally unique sample identifier via a distributed system of allocating agents and a central registry; and the EarthCube Research Coordination Network iSamplES (Internet of Samples in the Earth Sciences) that aims to improve sharing and curation of samples through the use of CI.

  2. Real time data acquisition of a countrywide commercial microwave link network

    NASA Astrophysics Data System (ADS)

    Chwala, Christian; Keis, Felix; Kunstmann, Harald

    2015-04-01

    Research in recent years has shown that data from commercial microwave link networks can provide very valuable precipitation information. Since these networks comprise the backbone of the cell phone network, they provide countrywide coverage. However, acquiring the necessary data from the network operators is still difficult. Data is usually made available to researchers with a large time delay and often on an irregular basis. This of course hinders the exploitation of commercial microwave link data in operational applications like QPE forecasts running at national meteorological services. To overcome this, we have developed custom software in joint cooperation with our industry partner Ericsson. The software is installed on a dedicated server at Ericsson and is capable of acquiring data from the countrywide microwave link network in Germany. In its current first operational testing phase, data from several hundred microwave links in southern Germany is recorded. All data is instantaneously sent to our server, where it is stored and organized in an emerging database. Time resolution for the Ericsson data is one minute. The custom acquisition software, however, is capable of processing higher sampling rates. Additionally we acquire and manage 1 Hz data from four microwave links operated by the skiing resort in Garmisch-Partenkirchen. We will present the concept of the data acquisition and show details of the custom-built software. Additionally we will showcase the accessibility and basic processing of real time microwave link data via our database web frontend.

  3. Advancements of labelled radio-pharmaceutics imaging with the PIM-MPGD

    NASA Astrophysics Data System (ADS)

    Donnard, J.; Arlicot, N.; Berny, R.; Carduner, H.; Leray, P.; Morteau, E.; Servagent, N.; Thers, D.

    2009-11-01

    Beta autoradiography is widely used in pharmacology and in biological fields to study the response of an organism to a certain kind of molecule. The image of the distribution is processed by studying the concentration of the radioactivity in different organs. We report on the development of an integrated apparatus based on a PIM device (Parallel Ionization Multiplier) able to process the image of 10 microscope slides at the same time over an area of 18 × 18 cm². Thanks to a vacuum pump and a gas regulation circuit, 5 minutes is sufficient to begin an acquisition. All the electronics and the gas distribution are included in the structure, leading to a transportable device. Special software has been developed to process data in real time with image visualization. Biological samples can be labelled with low-energy β emitters like 3H/14C or Auger electrons of 125I/99mTc. The measured spatial resolution is 30 μm in 3H, and the trigger and charge rates are constant over more than 6 days of acquisition, showing good stability of the device. Moreover, a collaboration with doctors and biologists of INSERM (National Institute for Medical Research in France) has started in order to demonstrate that MPGDs can easily be used outside a physics laboratory.

  4. Beamline Electrostatic Levitator (BESL) for in-situ High Energy X-Ray Diffraction Studies of Levitated Solids and Liquids at High Temperature

    NASA Technical Reports Server (NTRS)

    Gangopadhyay, A. K.; Lee, G. W.; Kelton, K. F.; Rogers, J. R.; Goldman, A. I.; Robinson, D. S.; Rathz, T. J.; Hyers, R. W.

    2005-01-01

    Determinations of the phase formation sequence, the crystal structures and the thermodynamic properties of materials at high temperatures are difficult because of contamination from the sample container and environment. Containerless processing techniques, such as electrostatic (ESL), electromagnetic (EML), aerodynamic, and acoustic levitation, are most suitable for these studies. An adaptation of ESL for in-situ structural studies of a wide range of materials, including metals, semiconductors, and insulators, using high energy (125 keV) synchrotron x-rays is described here. This beamline ESL (BESL) allows the in-situ determination of the atomic structures of equilibrium solid and liquid phases, including undercooled liquids, as well as real-time studies of solid-solid and liquid-solid phase transformations. The use of image plate (MAR345) or GE-Angio detectors enables fast (30 ms - 1 s) acquisition of complete diffraction patterns over a wide q-range (4 - 140/mm). The wide temperature range (300 - 2500 K), containerless processing under high vacuum (10^-7 - 10^-8 torr), and fast data acquisition make BESL particularly suitable for phase diagram studies of high temperature materials. An additional, critically important feature of BESL is the ability to also make simultaneous measurements of a host of thermo-physical properties, including the specific heat, enthalpy of transformation, solidus and liquidus temperatures, density, viscosity, and surface tension, all on the same sample and simultaneously with the structural measurements.

  5. Calorimetric study of mutant human lysozymes with partially introduced Ca2+ binding sites and its efficient refolding system from inclusion bodies.

    PubMed

    Koshiba, T; Tsumoto, K; Masaki, K; Kawano, K; Nitta, K; Kumagai, I

    1998-08-01

    During the process of evolution, ancestral lysozymes evolved into calcium-binding lysozymes by acquiring three critical aspartate residues at positions 86, 91 and 92. To investigate the process of the acquisition of calcium-binding ability, two of the aspartates were partially introduced into human lysozyme at positions 86, 91 and 92. These mutants (HLQ86D, HLA92D and HLQ86D/D91Q/A92D), having two critical aspartates in calcium-binding sites, were expressed in Escherichia coli as non-active inclusion bodies. For the preparation of lysozyme samples, a refolding system using thioredoxin was established. This system allowed for effective refolding of wild-type and mutant lysozymes, and 100% of activity was recovered within 4 days. The calcium ion dependence of the melting temperature (Tm) of wild-type and mutant lysozymes was investigated by differential scanning calorimetry at pH 4.5. The Tm values of wild-type, HLQ86D and HLA92D mutants were not dependent on calcium ion concentration. However, the Tm of HLQ86D/D91Q/A92D was 4 degrees higher in the presence of 50 mM CaCl2 than in its absence, and the calcium-binding constant of this mutant was estimated to be 2.25 (±0.25) × 10² M⁻¹ at pH 4.5. Moreover, the calcium-binding ability of this mutant was confirmed by the result using Sephadex G-25 gel chromatography. These results indicate that it is indispensable to have at least two aspartates at positions 86 and 92 for acquisition of calcium-binding ability. The process of the acquisition of calcium-binding site during evolution of calcium-binding lysozyme is discussed.

  6. Optimized Design and Analysis of Sparse-Sampling fMRI Experiments

    PubMed Central

    Perrachione, Tyler K.; Ghosh, Satrajit S.

    2013-01-01

    Sparse-sampling is an important methodological advance in functional magnetic resonance imaging (fMRI), in which silent delays are introduced between MR volume acquisitions, allowing for the presentation of auditory stimuli without contamination by acoustic scanner noise and for overt vocal responses without motion-induced artifacts in the functional time series. As such, the sparse-sampling technique has become a mainstay of principled fMRI research into the cognitive and systems neuroscience of speech, language, hearing, and music. Despite being in use for over a decade, there has been little systematic investigation of the acquisition parameters, experimental design considerations, and statistical analysis approaches that bear on the results and interpretation of sparse-sampling fMRI experiments. In this report, we examined how design and analysis choices related to the duration of repetition time (TR) delay (an acquisition parameter), stimulation rate (an experimental design parameter), and model basis function (an analysis parameter) act independently and interactively to affect the neural activation profiles observed in fMRI. First, we conducted a series of computational simulations to explore the parameter space of sparse design and analysis with respect to these variables; second, we validated the results of these simulations in a series of sparse-sampling fMRI experiments. Overall, these experiments suggest the employment of three methodological approaches that can, in many situations, substantially improve the detection of neurophysiological response in sparse fMRI: (1) Sparse analyses should utilize a physiologically informed model that incorporates hemodynamic response convolution to reduce model error. (2) The design of sparse fMRI experiments should maintain a high rate of stimulus presentation to maximize effect size. (3) TR delays of short to intermediate length can be used between acquisitions of sparse-sampled functional image volumes to increase the number of samples and improve statistical power. PMID:23616742

  7. Optimized design and analysis of sparse-sampling FMRI experiments.

    PubMed

    Perrachione, Tyler K; Ghosh, Satrajit S

    2013-01-01

    Sparse-sampling is an important methodological advance in functional magnetic resonance imaging (fMRI), in which silent delays are introduced between MR volume acquisitions, allowing for the presentation of auditory stimuli without contamination by acoustic scanner noise and for overt vocal responses without motion-induced artifacts in the functional time series. As such, the sparse-sampling technique has become a mainstay of principled fMRI research into the cognitive and systems neuroscience of speech, language, hearing, and music. Despite being in use for over a decade, there has been little systematic investigation of the acquisition parameters, experimental design considerations, and statistical analysis approaches that bear on the results and interpretation of sparse-sampling fMRI experiments. In this report, we examined how design and analysis choices related to the duration of repetition time (TR) delay (an acquisition parameter), stimulation rate (an experimental design parameter), and model basis function (an analysis parameter) act independently and interactively to affect the neural activation profiles observed in fMRI. First, we conducted a series of computational simulations to explore the parameter space of sparse design and analysis with respect to these variables; second, we validated the results of these simulations in a series of sparse-sampling fMRI experiments. Overall, these experiments suggest the employment of three methodological approaches that can, in many situations, substantially improve the detection of neurophysiological response in sparse fMRI: (1) Sparse analyses should utilize a physiologically informed model that incorporates hemodynamic response convolution to reduce model error. (2) The design of sparse fMRI experiments should maintain a high rate of stimulus presentation to maximize effect size. (3) TR delays of short to intermediate length can be used between acquisitions of sparse-sampled functional image volumes to increase the number of samples and improve statistical power.

  8. Low-Pressure Testing of the Mars Science Laboratory’s Solid Sampling System: Test Methods and Preliminary Results

    NASA Astrophysics Data System (ADS)

    Mukherjee, S.; von der Heydt, M.; Hanson, C.; Jandura, L.

    2009-12-01

    The Mars Science Laboratory mission is scheduled to launch in 2011 with an extensive suite of in situ science instruments. Acquiring, processing, and delivering appropriate samples of rock and martian regolith to the instruments is a critical component in realizing the science capability of these payload elements. However, there are a number of challenges in validating the design of these systems. In particular, differences in the environment (atmospheric pressure and composition, temperature, gravity), target materials (variation in rock and soil properties), and state of the hardware (electrical potential, particulate coatings) may affect sampling performance. To better understand the end-to-end system and allow development of mitigation strategies if necessary, early testing of high-fidelity engineering models of the hardware in the solid sample chain is being conducted. The components of the sample acquisition, processing and delivery chain that will be tested are the drill, scoop, sieves, portioners, and instrument inlet funnels. An evaluation of the environmental parameter space was conducted to identify a subset that may have significant effects on sampling performance and cannot be well bounded by analysis. Accordingly, support equipment to enable testing at Mars surface pressures (5-10 Torr) with carbon dioxide was designed and built. A description of the testing set-up, investigations, and preliminary results will be presented.

  9. Machine Learning and Computer Vision System for Phenotype Data Acquisition and Analysis in Plants.

    PubMed

    Navarro, Pedro J; Pérez, Fernando; Weiss, Julia; Egea-Cortines, Marcos

    2016-05-05

    Phenomics is a technology-driven approach with a promising future for obtaining unbiased data on biological systems. Image acquisition is relatively simple. However, data handling and analysis are not as well developed as the sampling capacities. We present a system based on machine learning (ML) algorithms and computer vision intended to solve automatic phenotype data analysis in plant material. We developed a growth chamber able to accommodate species of various sizes. Night image acquisition requires near-infrared lighting. For the ML process, we tested three different algorithms: k-nearest neighbour (kNN), Naive Bayes Classifier (NBC), and Support Vector Machine (SVM). Each ML algorithm was executed with different kernel functions, and they were trained with raw data and two types of data normalisation. Different metrics were computed to determine the optimal configuration of the machine learning algorithms. We obtained a performance of 99.31% with kNN for RGB images and 99.34% with SVM for NIR images. Our results show that ML techniques can speed up phenomic data analysis. Furthermore, both RGB and NIR images can be segmented successfully but may require different ML algorithms for segmentation.
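
    As an illustration of the classification step, the sketch below trains kNN and SVM classifiers on pixel features with scikit-learn; the placeholder feature array, labels, and normalisation are assumptions, not the study's dataset or configuration.

      # Sketch of pixel-level classification with kNN and SVM (X would hold
      # per-pixel RGB or NIR values; the arrays here are placeholders).
      import numpy as np
      from sklearn.neighbors import KNeighborsClassifier
      from sklearn.svm import SVC
      from sklearn.preprocessing import StandardScaler
      from sklearn.pipeline import make_pipeline
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      X = rng.random((5000, 3))                     # placeholder pixel features
      y = (X[:, 1] > X[:, 0]).astype(int)           # placeholder plant/background labels

      knn = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))
      svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", gamma="scale"))

      for name, model in [("kNN", knn), ("SVM", svm)]:
          accuracy = cross_val_score(model, X, y, cv=5).mean()
          print(f"{name}: {accuracy:.3f}")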

  10. DoD Acquisition Workforce Education: An SBA Education Case Study

    ERIC Educational Resources Information Center

    Davenport, Richard W.

    2009-01-01

    A Department of Defense (DoD) M&S education task force is in the process of studying the Modeling and Simulation (M&S) education of the acquisition workforce. Historically, DoD acquisition workforce education is not referred to as education, but rather what the Defense Acquisition University (DAU) refers to as "practitioner training, career…

  11. Language acquisition is model-based rather than model-free.

    PubMed

    Wang, Felix Hao; Mintz, Toben H

    2016-01-01

    Christiansen & Chater (C&C) propose that learning language is learning to process language. However, we believe that the general-purpose prediction mechanism they propose is insufficient to account for many phenomena in language acquisition. We argue from theoretical considerations and empirical evidence that many acquisition tasks are model-based, and that different acquisition tasks require different, specialized models.

  12. Missile Defense: European Phased Adaptive Approach Acquisitions Face Synchronization, Transparency, and Accountability Challenges

    DTIC Science & Technology

    2010-12-21

    However, we found that DOD has not fully implemented a management process that synchronizes EPAA acquisition activities and ensures transparency and accountability.

  13. Coordinating Council. Seventh Meeting: Acquisitions

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The theme for this NASA Scientific and Technical Information Program Coordinating Council meeting was Acquisitions. In addition to NASA and the NASA Center for AeroSpace Information (CASI) presentations, the report contains fairly lengthy visuals about acquisitions at the Defense Technical Information Center. CASI's acquisitions program and CASI's proactive acquisitions activity were described. There was a presentation on the document evaluation process at CASI. A talk about open literature scope and coverage at the American Institute of Aeronautics and Astronautics was also given. An overview of the STI Program's Acquisitions Experts Committee was given next. Finally, acquisitions initiatives of the NASA STI program were presented.

  14. Label-Free Biomedical Imaging Using High-Speed Lock-In Pixel Sensor for Stimulated Raman Scattering

    PubMed Central

    Mars, Kamel; Kawahito, Shoji; Yasutomi, Keita; Kagawa, Keiichiro; Yamada, Takahiro

    2017-01-01

    Raman imaging eliminates the need for staining procedures, providing label-free imaging to study biological samples. Recent developments in stimulated Raman scattering (SRS) have achieved fast acquisition speed and hyperspectral imaging. However, there has been a lack of detectors suitable for MHz-modulation-rate parallel detection, i.e., detecting multiple small SRS signals while eliminating the extremely strong offset due to direct laser light. In this paper, we present a complementary metal-oxide semiconductor (CMOS) image sensor using high-speed lock-in pixels for stimulated Raman scattering that is capable of obtaining the difference of the Stokes-on and Stokes-off signals at a modulation frequency of 20 MHz in the pixel, before readout. The generated small SRS signal is extracted and amplified in a pixel using a high-speed, large-area lateral electric field charge modulator (LEFM) employing two-step ion implantation, together with an in-pixel pair of low-pass filters, a sample-and-hold circuit and a switched-capacitor integrator using a fully differential amplifier. A prototype chip was fabricated using a 0.11 μm CMOS image sensor technology process. SRS spectra and images of stearic acid and 3T3-L1 samples are successfully obtained. The outcomes suggest that hyperspectral and multi-focus SRS imaging at video rate is viable after slight modifications to the pixel architecture and the acquisition system. PMID:29120358
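
    A software analogue of the in-pixel lock-in operation is sketched below: a large constant offset with a tiny signal present only during the Stokes-on half-periods is accumulated separately for the on and off phases, and their difference cancels the offset while retaining the SRS component. The waveform parameters and noise level are assumptions for illustration only.

      # Software analogue of in-pixel lock-in detection (assumed waveform values).
      import numpy as np

      fs = 200e6                 # sample rate of the simulated photocurrent (Hz)
      fmod = 20e6                # Stokes-beam modulation frequency (Hz)
      t = np.arange(0, 50e-6, 1 / fs)

      offset = 1.0                                  # large unmodulated laser background
      stokes_on = np.sin(2 * np.pi * fmod * t) > 0
      srs = 1e-3 * stokes_on                        # tiny SRS signal, present only when Stokes is on
      noise = 1e-3 * np.random.default_rng(0).standard_normal(t.size)
      photocurrent = offset + srs + noise

      # Differencing the Stokes-on and Stokes-off accumulations cancels the offset
      # and leaves the SRS component, as the lock-in pixel does before readout.
      diff = photocurrent[stokes_on].mean() - photocurrent[~stokes_on].mean()
      print(f"recovered SRS amplitude = {diff:.2e}")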

  15. Improved detection and mapping of deepwater hydrocarbon seeps: optimizing multibeam echosounder seafloor backscatter acquisition and processing techniques

    NASA Astrophysics Data System (ADS)

    Mitchell, Garrett A.; Orange, Daniel L.; Gharib, Jamshid J.; Kennedy, Paul

    2018-06-01

    Marine seep hunting surveys are a current focus of hydrocarbon exploration surveys due to recent advances in offshore geophysical surveying, geochemical sampling, and analytical technologies. Hydrocarbon seeps are ephemeral, small, and discrete, and are therefore difficult to sample on the deep seafloor. Multibeam echosounders are an efficient seafloor exploration tool for remotely locating and mapping seep features. Geophysical signatures from hydrocarbon seeps are acoustically evident in bathymetric, seafloor backscatter, and midwater backscatter datasets. Interpretation of these signatures in backscatter datasets is a fundamental component of commercial seep hunting campaigns. Degradation of backscatter datasets resulting from environmental, geometric, and system noise can interfere with the detection and delineation of seeps. We present a relative backscatter intensity normalization method and an oversampling acquisition technique that can improve the geological resolvability of hydrocarbon seeps. To analyze these techniques, we used Green Canyon (GC) Block 600 in the Northern Gulf of Mexico as a seep calibration site for a Kongsberg EM302 30 kHz MBES prior to the start of the Gigante seep hunting program. At GC600, we evaluate the results of a backscatter intensity normalization, assess the effectiveness of 2X seafloor coverage in resolving seep-related features in backscatter data, and determine the off-nadir detection limits of bubble plumes using the EM302. Incorporating these techniques into seep hunting surveys can improve the detectability and sampling of seafloor seeps.
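
    One simple form of relative backscatter intensity normalization is sketched below: the mean angular response is removed per incidence-angle bin so that local anomalies such as seep-related hard grounds stand out. The array names and bin width are assumptions; this is not the survey's actual processing chain.

      # Sketch of a relative backscatter normalization (assumed per-sounding
      # arrays: backscatter_db and incidence_deg).
      import numpy as np

      def normalize_backscatter(backscatter_db, incidence_deg, bin_width=1.0):
          """Subtract the mean angular response within each incidence-angle bin."""
          bins = np.floor(incidence_deg / bin_width).astype(int)
          normalized = np.empty_like(backscatter_db)
          for b in np.unique(bins):
              mask = bins == b
              normalized[mask] = backscatter_db[mask] - backscatter_db[mask].mean()
          return normalized     # relative intensity (dB) about the local angular mean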

  16. Label-Free Biomedical Imaging Using High-Speed Lock-In Pixel Sensor for Stimulated Raman Scattering.

    PubMed

    Mars, Kamel; Lioe, De Xing; Kawahito, Shoji; Yasutomi, Keita; Kagawa, Keiichiro; Yamada, Takahiro; Hashimoto, Mamoru

    2017-11-09

    Raman imaging eliminates the need for staining procedures, providing label-free imaging to study biological samples. Recent developments in stimulated Raman scattering (SRS) have achieved fast acquisition speed and hyperspectral imaging. However, there has been a lack of detectors suitable for MHz-modulation-rate parallel detection, i.e., detecting multiple small SRS signals while eliminating the extremely strong offset due to direct laser light. In this paper, we present a complementary metal-oxide semiconductor (CMOS) image sensor using high-speed lock-in pixels for stimulated Raman scattering that is capable of obtaining the difference of the Stokes-on and Stokes-off signals at a modulation frequency of 20 MHz in the pixel, before readout. The generated small SRS signal is extracted and amplified in a pixel using a high-speed, large-area lateral electric field charge modulator (LEFM) employing two-step ion implantation, together with an in-pixel pair of low-pass filters, a sample-and-hold circuit and a switched-capacitor integrator using a fully differential amplifier. A prototype chip was fabricated using a 0.11 μm CMOS image sensor technology process. SRS spectra and images of stearic acid and 3T3-L1 samples are successfully obtained. The outcomes suggest that hyperspectral and multi-focus SRS imaging at video rate is viable after slight modifications to the pixel architecture and the acquisition system.

  17. Spatial working memory in Wistar rats: brain sex differences in metabolic activity.

    PubMed

    Méndez-López, Magdalena; Méndez, Marta; López, Laudino; Arias, Jorge L

    2009-05-29

    Several works have shown that males and females differ in the ability to learn spatial locations in mazes. In this study, we used the Morris water maze to assess the acquisition of a spatial working memory (WM) task in adult male and female Wistar rats. The task consisted of a paired-sample procedure made up of two daily identical trials, sample and retention. To study the oxidative metabolic activity of some brain limbic system regions after the WM task, we applied cytochrome oxidase (COx) histochemistry. In addition to the experimental groups, free-swimming control groups and untrained naïve groups were added to explore COx changes not specific to the learning process. Spatial performance was similar between the sexes, as males needed only one additional sample and retention trial to reduce escape latencies significantly. Males showed decreased COx activity compared to control groups in the medial prefrontal cortex (prelimbic and infralimbic regions) as well as in the lateral septum and dentate gyrus. In females, an increase in COx activity was found in the nucleus accumbens, ventral tegmental area and supramammillary region relative to control groups. Overall, these findings suggest that the acquisition of the spatial WM task is mediated by different subsystems in a sex-dependent manner, pointing to the hippocampus as the central structure in males whereas other structures would be central in females.

  18. Strategic and Nonstrategic Information Acquisition.

    ERIC Educational Resources Information Center

    Berger, Charles R.

    2002-01-01

    Uses dual-process theories and research concerned with automaticity and the role conceptual short-term memory plays in visual information processing to illustrate both the ubiquity of nonstrategic information acquisition during interpersonal communication and its potential consequences on judgments and behavior. Discusses theoretical and…

  19. Sustaining Equipment and the Rapid Acquisition Process: The Forgotten Phase

    DTIC Science & Technology

    2012-02-24


  20. The influence of the microscope lamp filament colour temperature on the process of digital images of histological slides acquisition standardization.

    PubMed

    Korzynska, Anna; Roszkowiak, Lukasz; Pijanowska, Dorota; Kozlowski, Wojciech; Markiewicz, Tomasz

    2014-01-01

    The aim of this study is to compare digital images of tissue biopsies captured with an optical microscope in bright-field mode under various light conditions. The range of colour variation in tissue samples immunohistochemically stained with 3,3'-diaminobenzidine and haematoxylin is immense and comes from various sources. One of them is an inadequate setting of the camera's white balance with respect to the microscope's light colour temperature. Although this type of error can easily be handled at the image acquisition stage, it can also be eliminated afterwards with colour adjustment algorithms. The dependence of colour variation on the microscope's light temperature and the camera settings was examined as introductory research for automatic colour standardization. Six fields of view with empty space among the tissue samples were selected for analysis. Each field of view was acquired 225 times with various microscope light temperatures and camera white balance settings. Fourteen randomly chosen images were corrected and compared with the reference image by the following methods: Mean Square Error, Structural SIMilarity, and visual assessment by a viewer. For two types of backgrounds and two types of objects, the statistical image descriptors (range, median, mean and its standard deviation of chromaticity on the a and b channels from the CIELab colour space, and luminance L) and the local colour variability for objects' specific areas were calculated. The results were averaged over the 6 images acquired with the same light conditions and camera settings for each sample. The analysis leads to the following conclusions: (1) images collected with the white balance setting adjusted to the light colour temperature cluster in a certain area of chromatic space; (2) white balance correction of images collected with camera settings not matched to the light temperature moves the image descriptors into the proper chromatic space but simultaneously changes the luminance. Thus the unification of images in the sense of colour fidelity can be carried out in a separate introductory stage before automatic image analysis.
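
    The sketch below computes the CIELab descriptors used above (range, median, mean, and standard deviation of L, a, and b) and applies a simple gray-world white-balance correction as one possible colour-adjustment step; the gray-world rule is an illustrative stand-in, not the correction algorithm evaluated in the study.

      # CIELab descriptors and a gray-world white-balance correction (illustrative).
      import numpy as np
      from skimage import color

      def lab_descriptors(rgb):
          """rgb: float image in [0, 1], shape (H, W, 3)."""
          lab = color.rgb2lab(rgb)
          stats = {}
          for name, chan in zip("Lab", np.moveaxis(lab, -1, 0)):
              stats[name] = dict(range=float(np.ptp(chan)), median=float(np.median(chan)),
                                 mean=float(chan.mean()), std=float(chan.std()))
          return stats

      def gray_world(rgb):
          """Scale each channel so its mean matches the global mean (gray-world assumption)."""
          means = rgb.reshape(-1, 3).mean(axis=0)
          return np.clip(rgb * (means.mean() / means), 0.0, 1.0)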

  1. A multi-threshold sampling method for TOF-PET signal processing

    NASA Astrophysics Data System (ADS)

    Kim, H.; Kao, C. M.; Xie, Q.; Chen, C. T.; Zhou, L.; Tang, F.; Frisch, H.; Moses, W. W.; Choong, W. S.

    2009-04-01

    As an approach to realizing all-digital data acquisition for positron emission tomography (PET), we have previously proposed and studied a multi-threshold sampling method to generate samples of a PET event waveform with respect to a few user-defined amplitudes. In this sampling scheme, one can extract both the energy and timing information for an event. In this paper, we report our prototype implementation of this sampling method and the performance results obtained with this prototype. The prototype consists of two multi-threshold discriminator boards and a time-to-digital converter (TDC) board. Each of the multi-threshold discriminator boards takes one input and provides up to eight threshold levels, which can be defined by users, for sampling the input signal. The TDC board employs the CERN HPTDC chip that determines the digitized times of the leading and falling edges of the discriminator output pulses. We connect our prototype electronics to the outputs of two Hamamatsu R9800 photomultiplier tubes (PMTs) that are individually coupled to a 6.25×6.25×25 mm³ LSO crystal. By analyzing waveform samples generated by using four thresholds, we obtain a coincidence timing resolution of about 340 ps and an ~18% energy resolution at 511 keV. We are also able to estimate the decay-time constant from the resulting samples and obtain a mean value of 44 ns with an ~9 ns FWHM. In comparison, using digitized waveforms obtained at a 20 GSps sampling rate for the same LSO/PMT modules we obtain ~300 ps coincidence timing resolution, ~14% energy resolution at 511 keV, and ~5 ns FWHM for the estimated decay-time constant. Details of the results on the timing and energy resolutions by using the multi-threshold method indicate that it is a promising approach for implementing digital PET data acquisition.
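
    A minimal sketch of how timing, energy, and the decay-time constant can be recovered from a few threshold crossings is given below, assuming a pulse with a roughly linear leading edge and an exponential tail; the threshold levels and example crossing times are assumptions, not the prototype's actual values.

      # Timing/energy/decay-time estimation from multi-threshold samples
      # (assumed pulse model: linear leading edge, exponential tail).
      import numpy as np

      thresholds = np.array([0.05, 0.10, 0.20, 0.40])    # user-defined levels (a.u.)

      def estimate_event(lead_times, fall_times, thresholds):
          # Timing: extrapolate the leading-edge crossings back to zero amplitude.
          slope, intercept = np.polyfit(lead_times, thresholds, 1)
          t_event = -intercept / slope
          # Tail: ln(V) is linear in time, so a fit yields the decay time and amplitude.
          p = np.polyfit(fall_times, np.log(thresholds), 1)
          tau = -1.0 / p[0]
          amplitude = np.exp(np.polyval(p, t_event))
          return t_event, amplitude * tau, tau           # time, energy proxy, decay time

      # Example: amplitude 1.0, tau = 44 ns, leading edge starting at t = 1 ns
      lead = np.array([2.0, 3.0, 5.0, 9.0])              # ns
      fall = 1.0 + 44.0 * np.log(1.0 / thresholds)       # ns
      print(estimate_event(lead, fall, thresholds))      # approximately (1.0, 44.0, 44.0)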

  2. Evaluation of area strain response of dielectric elastomer actuator using image processing technique

    NASA Astrophysics Data System (ADS)

    Sahu, Raj K.; Sudarshan, Koyya; Patra, Karali; Bhaumik, Shovan

    2014-03-01

    Dielectric elastomer actuators (DEAs) are a kind of soft actuator that can produce significantly large electric-field-induced actuation strain and may be a basic unit of artificial muscles and robotic elements. Understanding strain development in a pre-stretched sample at different regimes of electric field is essential for potential applications. In this paper, we report ongoing work on the determination of area strain using a digital camera and image processing techniques. The setup, developed in-house, consists of a low-cost digital camera, data acquisition, and an image processing algorithm. Samples were prepared from biaxially stretched acrylic tape supported between two cardboard frames. Carbon grease was applied to both sides of the sample to serve as electrodes compliant with the large electric-field-induced deformation. Images were grabbed before and after the application of high voltage. From the incremental image area, strain was calculated as a function of applied voltage on a pre-stretched dielectric elastomer (DE) sample. Area strain was plotted against applied voltage for different pre-stretched samples. Our study shows that the area strain exhibits a nonlinear relationship with applied voltage. For the same voltage, higher area strain was generated in samples with higher pre-stretch. Our characterization also matches well with previously published results obtained with a costly video extensometer. The study may help designers fabricate biaxially pre-stretched planar actuators from similar materials.
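
    The image-processing step can be sketched as follows: segment the carbon-grease electrode in images taken before and after the voltage is applied, then compute the area strain from the pixel counts. The Otsu threshold and the file names are illustrative assumptions, not the setup's actual algorithm.

      # Area-strain estimation from before/after images (assumes grayscale images
      # in which the carbon-grease electrode is darker than the surrounding frame).
      import numpy as np
      from skimage import io, filters

      def electrode_area(path):
          img = io.imread(path, as_gray=True)
          mask = img < filters.threshold_otsu(img)    # electrode = dark region
          return mask.sum()                           # area in pixels

      a0 = electrode_area("before_voltage.png")       # hypothetical file names
      a1 = electrode_area("after_voltage.png")
      area_strain = (a1 - a0) / a0
      print(f"area strain = {100 * area_strain:.1f} %")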

  3. UK Defence Acquisition Process for NEC: Transaction Governance within an Integrated Project Team

    DTIC Science & Technology

    2009-04-22

    Using a 3-tier framework for a study of the acquisition of an Advance Military Vehicle (AMV), we explore the shaping of the buyer-supplier relationship in the context of the UK defence acquisition process, the position of the buyer, the MoD, and how this impacts its suppliers in the defence industrial base. An historical review of defence industrial relations is…

  4. The Requirement for Acquisition and Logistics Integration: An Examination of Reliability Management Within the Marine Corps Acquisition Process

    DTIC Science & Technology

    2002-12-01

    The analysis is limited to an assessment of reliability management issues for the HMMWV family of vehicles, the LVS family of vehicles, and the M198 Howitzer.

  5. A high speed data acquisition and analysis system for transonic velocity, density, and total temperature fluctuations

    NASA Technical Reports Server (NTRS)

    Clukey, Steven J.

    1988-01-01

    The high-speed Dynamic Data Acquisition System (DDAS) is described, which provides the capability for the simultaneous measurement of velocity, density, and total temperature fluctuations. The system of hardware and software is described in the context of the wind tunnel environment. The DDAS replaces both a recording mechanism and a separate data processing system; the data acquisition and data reduction process has been combined within DDAS. DDAS receives input from hot wires and anemometers, amplifies and filters the signals with computer-controlled modules, and converts the analog signals to digital with real-time simultaneous digitization followed by digital recording on disk or tape. Automatic acquisition (either from a computer link to an existing wind tunnel acquisition system, or from data acquisition facilities within DDAS) collects the necessary calibration and environment data. The generation of hot wire sensitivities is done in DDAS, as is the application of sensitivities to the hot wire data to generate turbulence quantities. The presentation of the raw and processed data, in terms of root mean square values of velocity, density and temperature, and the processing of the spectral data are accomplished on demand in near-real-time with DDAS. A comprehensive description of the interface to the DDAS and of the internal mechanisms will be presented. A summary of operations relevant to the use of the DDAS will be provided.

  6. 48 CFR 434.003 - Responsibilities.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...) individually or as a group will participate in this decision making process. (b) The Chief Information Officer (CIO) is the Major Information Technology Systems Executive. For acquisitions of information technology... information technology system acquisition, designating an acquisition to be a major information technology...

  7. 48 CFR 434.003 - Responsibilities.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...) individually or as a group will participate in this decision making process. (b) The Chief Information Officer (CIO) is the Major Information Technology Systems Executive. For acquisitions of information technology... information technology system acquisition, designating an acquisition to be a major information technology...

  8. 48 CFR 434.003 - Responsibilities.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...) individually or as a group will participate in this decision making process. (b) The Chief Information Officer (CIO) is the Major Information Technology Systems Executive. For acquisitions of information technology... information technology system acquisition, designating an acquisition to be a major information technology...

  9. 48 CFR 434.003 - Responsibilities.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...) individually or as a group will participate in this decision making process. (b) The Chief Information Officer (CIO) is the Major Information Technology Systems Executive. For acquisitions of information technology... information technology system acquisition, designating an acquisition to be a major information technology...

  10. 48 CFR 15.102 - Oral presentations.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Oral presentations. 15.102 Section 15.102 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION CONTRACTING METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION Source Selection Processes and Techniques 15.102 Oral...

  11. Health Hazard Assessment and Toxicity Clearances in the Army Acquisition Process

    NASA Technical Reports Server (NTRS)

    Macko, Joseph A., Jr.

    2000-01-01

    The United States Army Materiel Command, Army Acquisition Pollution Prevention Support Office (AAPPSO) is responsible for creating and managing the U.S. Army-wide Acquisition Pollution Prevention Program. They have established Integrated Process Teams (IPTs) within each of the Major Subordinate Commands of the Army Materiel Command. AAPPSO provides centralized integration, coordination, and oversight of the Army Acquisition Pollution Prevention Program (AAPPP), and the IPTs provide the decentralized execution of the AAPPSO program. AAPPSO issues policy and guidance, provides resources, and prioritizes P2 efforts. It is the policy of the AAPPP to require United States Army Surgeon General approval of all materials or substances that will be used as an alternative to existing hazardous materials, toxic materials and substances, and ozone-depleting substances. The Army has a formal process established to address this effort. Army Regulation 40-10 requires a Health Hazard Assessment (HHA) during the acquisition milestones of a new Army system. Army Regulation 40-5 addresses the Toxicity Clearance (TC) process to evaluate new chemicals and materials prior to acceptance as an alternative. The U.S. Army Center for Health Promotion and Preventive Medicine is the Army's matrixed medical health organization that performs the HHA and TC mission.

  12. Data Acquisition with GPUs: The DAQ for the Muon $g$-$2$ Experiment at Fermilab

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gohn, W.

    Graphical Processing Units (GPUs) have recently become a valuable computing tool for the acquisition of data at high rates and for a relatively low cost. The devices work by parallelizing the code into thousands of threads, each executing a simple process, such as identifying pulses from a waveform digitizer. The CUDA programming library can be used to effectively write code to parallelize such tasks on Nvidia GPUs, providing a significant upgrade in performance over CPU-based acquisition systems. The muon $g$-$2$ experiment at Fermilab is relying heavily on GPUs to process its data. The data acquisition system for this experiment must have the ability to create deadtime-free records from 700 μs muon spills at a raw data rate of 18 GB per second. Data will be collected using 1296 channels of μTCA-based 800 MSPS, 12-bit waveform digitizers and processed in a layered array of networked commodity processors with 24 GPUs working in parallel to perform a fast recording of the muon decays during the spill. The described data acquisition system is currently being constructed, and will be fully operational before the start of the experiment in 2017.
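
    The kind of per-channel pulse finding that the GPUs parallelize can be sketched as follows; NumPy vectorization stands in here for the CUDA kernels, and the array sizes, threshold, and injected pulse are illustrative assumptions.

      # Stand-in for the GPU pulse-finding step: locate threshold crossings in each
      # digitizer waveform of a spill (NumPy here; the experiment uses CUDA kernels).
      import numpy as np

      def find_pulses(waveforms, threshold):
          """waveforms: (n_channels, n_samples) array; returns (channel, index, amplitude)."""
          below = waveforms[:, :-1] < threshold
          above = waveforms[:, 1:] >= threshold
          ch, idx = np.nonzero(below & above)           # rising-edge crossings
          return list(zip(ch, idx + 1, waveforms[ch, idx + 1]))

      rng = np.random.default_rng(1)
      wf = (100 + 5 * rng.standard_normal((16, 100_000))).astype(np.int16)
      wf[7, 1000:1010] += 400                           # inject one pulse
      print(find_pulses(wf, threshold=300))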

  13. An Operationally Responsive Space Architecture for 2025

    DTIC Science & Technology

    2008-06-22

    The solutions, material and non-material, were organized around five pillars: Organizational Relationships, Asset Loss Mitigation, Availability, Flexibility, and Streamlined Acquisition Processes. Analysis was further supported by a performance versus cost process which provided a final test of solution feasibility.

  14. Microprogrammable Integrated Data Acquisition System-Fatigue Life Data Application

    DTIC Science & Technology

    1976-03-01

    "Microprogrammable Integrated Data Acquisition System - Fatigue Life Data Application" (Midas FLD) is a microprocessor-based data acquisition system. It incorporates a Pro-Log… Lt. James W. Sturges successfully applied the Midas general system [Sturges, 1975] to the fatigue life data monitoring problem. The Midas FLD system computer program generates the required signals in the proper sequence for effectively sampling the 8-channel…

  15. 48 CFR 803.806 - Processing suspected violations.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Processing suspected violations. 803.806 Section 803.806 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS GENERAL IMPROPER BUSINESS PRACTICES AND PERSONAL CONFLICTS OF INTEREST Limitation on the Payment of Funds...

  16. 48 CFR 842.1203 - Processing agreements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Processing agreements. 842.1203 Section 842.1203 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS CONTRACT... submit all supporting agreements and documentation to the OGC for review as to legal sufficiency. ...

  17. 48 CFR 30.604 - Processing changes to disclosed or established cost accounting practices.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... disclosed or established cost accounting practices. 30.604 Section 30.604 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL CONTRACTING REQUIREMENTS COST ACCOUNTING STANDARDS ADMINISTRATION CAS Administration 30.604 Processing changes to disclosed or established cost accounting practices...

  18. 48 CFR 30.604 - Processing changes to disclosed or established cost accounting practices.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... disclosed or established cost accounting practices. 30.604 Section 30.604 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL CONTRACTING REQUIREMENTS COST ACCOUNTING STANDARDS ADMINISTRATION CAS Administration 30.604 Processing changes to disclosed or established cost accounting practices...

  19. 32 CFR 989.1 - Purpose.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Acquisition Programs and Major Automated Information System Acquisition Programs. 1 To comply with NEPA and... ANALYSIS PROCESS (EIAP) § 989.1 Purpose. (a) This part implements the Air Force Environmental Impact Analysis Process (EIAP) and provides procedures for environmental impact analysis both within the United...

  20. Color and Vector Flow Imaging in Parallel Ultrasound With Sub-Nyquist Sampling.

    PubMed

    Madiena, Craig; Faurie, Julia; Poree, Jonathan; Garcia, Damien

    2018-05-01

    RF acquisition with a high-performance multichannel ultrasound system generates massive data sets in short periods of time, especially in "ultrafast" ultrasound when digital receive beamforming is required. Sampling at a rate four times the carrier frequency is the standard procedure, since this rule complies with the Nyquist-Shannon sampling theorem and simplifies quadrature sampling. Bandpass sampling (or undersampling) outputs a bandpass signal at a rate lower than the maximal frequency without harmful aliasing. Advantages over Nyquist sampling are reduced storage volumes and data workflow, and simplified digital signal processing tasks. We used RF undersampling in color flow imaging (CFI) and vector flow imaging (VFI) to decrease data volume significantly (by a factor of 3 to 13 in our configurations). CFI and VFI with Nyquist and sub-Nyquist sampling were compared in vitro and in vivo. The estimation errors due to undersampling were small or marginal, which illustrates that Doppler and vector Doppler images can be correctly computed from a drastically reduced number of RF samples. Undersampling can be a method of choice in CFI and VFI to avoid information overload and reduce data transfer and storage.
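
    The sketch below illustrates bandpass (sub-Nyquist) sampling of a narrowband RF line followed by quadrature demodulation at the alias frequency. The carrier, undersampling rate, and Doppler shift are assumed values chosen so that the carrier aliases without spectral inversion; this is not the authors' processing chain.

      # Bandpass sampling and IQ demodulation of a narrowband RF signal (assumed values).
      import numpy as np

      f0 = 5e6                      # RF carrier (Hz)
      fs_sub = 4e6                  # undersampling rate, well below the conventional 4*f0 = 20 MHz
      t = np.arange(0, 200e-6, 1 / fs_sub)

      doppler_phase = 2 * np.pi * 500 * t             # slow flow-induced phase shift (assumed)
      rf = np.cos(2 * np.pi * f0 * t + doppler_phase)

      f_alias = f0 % fs_sub                           # the carrier aliases to 1 MHz
      iq = rf * np.exp(-2j * np.pi * f_alias * t)     # complex demodulation at the alias
      baseband = np.convolve(iq, np.ones(16) / 16, mode="same")   # crude low-pass filter
      # np.unwrap(np.angle(baseband)) now tracks doppler_phase (up to a constant),
      # which is the quantity colour and vector flow estimators operate on.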

  1. Low-dose x-ray tomography through a deep convolutional neural network

    DOE PAGES

    Yang, Xiaogang; De Andrade, Vincent; Scullin, William; ...

    2018-02-07

    Synchrotron-based X-ray tomography offers the potential of rapid large-scale reconstructions of the interiors of materials and biological tissue at fine resolution. However, for radiation-sensitive samples, there remain fundamental trade-offs between damaging samples during longer acquisition times and reducing signals with shorter acquisition times. We present a deep convolutional neural network (CNN) method that increases the acquired X-ray tomographic signal by at least a factor of 10 during low-dose fast acquisition by improving the quality of recorded projections. Short-exposure-time projections enhanced with the CNN show signal-to-noise ratios similar to long-exposure-time projections, and much lower noise and more structural information than low-dose fast acquisition without the CNN. We optimized this approach using simulated samples and further validated it on experimental nano-computed tomography data of radiation-sensitive mouse brains acquired with a transmission X-ray microscope. We demonstrate that automated algorithms can reliably trace brain structures in low-dose datasets enhanced with the CNN. As a result, this method can be applied to other tomographic or scanning-based X-ray imaging techniques and has great potential for studying faster dynamics in specimens.
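
    A minimal convolutional projection denoiser in the spirit of this approach is sketched below in PyTorch; the architecture, residual formulation, and training snippet are assumptions for illustration, not the network described in the paper.

      # Toy convolutional denoiser for low-dose projections (architecture assumed).
      import torch
      import torch.nn as nn

      class ProjectionDenoiser(nn.Module):
          def __init__(self, channels=32):
              super().__init__()
              self.net = nn.Sequential(
                  nn.Conv2d(1, channels, 3, padding=1), nn.ReLU(),
                  nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(),
                  nn.Conv2d(channels, 1, 3, padding=1),
              )

          def forward(self, x):              # x: (batch, 1, H, W) short-exposure projection
              return x + self.net(x)         # residual learning: predict a correction term

      model = ProjectionDenoiser()
      optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
      low_dose = torch.randn(4, 1, 64, 64)           # placeholder short-exposure projections
      long_exposure = torch.randn(4, 1, 64, 64)      # placeholder long-exposure targets
      loss = nn.functional.mse_loss(model(low_dose), long_exposure)
      loss.backward()
      optimizer.step()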

  2. Low-dose x-ray tomography through a deep convolutional neural network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Xiaogang; De Andrade, Vincent; Scullin, William

    Synchrotron-based X-ray tomography offers the potential of rapid large-scale reconstructions of the interiors of materials and biological tissue at fine resolution. However, for radiation-sensitive samples, there remain fundamental trade-offs between damaging samples during longer acquisition times and reducing signals with shorter acquisition times. We present a deep convolutional neural network (CNN) method that increases the acquired X-ray tomographic signal by at least a factor of 10 during low-dose fast acquisition by improving the quality of recorded projections. Short-exposure-time projections enhanced with the CNN show signal-to-noise ratios similar to long-exposure-time projections, and much lower noise and more structural information than low-dose fast acquisition without the CNN. We optimized this approach using simulated samples and further validated it on experimental nano-computed tomography data of radiation-sensitive mouse brains acquired with a transmission X-ray microscope. We demonstrate that automated algorithms can reliably trace brain structures in low-dose datasets enhanced with the CNN. As a result, this method can be applied to other tomographic or scanning-based X-ray imaging techniques and has great potential for studying faster dynamics in specimens.

  3. Model-based frequency response characterization of a digital-image analysis system for epifluorescence microscopy

    NASA Technical Reports Server (NTRS)

    Hazra, Rajeeb; Viles, Charles L.; Park, Stephen K.; Reichenbach, Stephen E.; Sieracki, Michael E.

    1992-01-01

    Consideration is given to a model-based method for estimating the spatial frequency response of a digital-imaging system (e.g., a CCD camera) that is modeled as a linear, shift-invariant image acquisition subsystem that is cascaded with a linear, shift-variant sampling subsystem. The method characterizes the 2D frequency response of the image acquisition subsystem to beyond the Nyquist frequency by accounting explicitly for insufficient sampling and the sample-scene phase. Results for simulated systems and a real CCD-based epifluorescence microscopy system are presented to demonstrate the accuracy of the method.
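
    A toy version of such a cascaded model is sketched below: an assumed optics OTF is multiplied by a detector-aperture response and evaluated beyond the Nyquist frequency, where insufficient sampling folds it back into the measured band. The model form and parameters are assumptions, not those of the cited method.

      # Cascaded response model with aliasing fold-back (illustrative parameters).
      import numpy as np

      pitch = 1.0                              # sample spacing (arbitrary units)
      f_nyq = 0.5 / pitch
      f = np.linspace(0.0, 2 * f_nyq, 401)     # evaluate out to twice the Nyquist frequency

      otf_optics = np.exp(-(f / (1.2 * f_nyq)) ** 2)   # assumed Gaussian optics OTF
      aperture = np.abs(np.sinc(f * pitch))            # square-detector aperture response
      presample = otf_optics * aperture                # acquisition-subsystem response

      fb = f[f <= f_nyq]                                    # in-band frequencies
      direct = np.interp(fb, f, presample)                  # response measured directly
      folded = np.interp(2 * f_nyq - fb, f, presample)      # first alias folded into the band
      # How the folded term combines with the direct term depends on the
      # sample-scene phase, which is why the sampling subsystem is shift-variant.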

  4. Sample acquisition and instrument deployment

    NASA Technical Reports Server (NTRS)

    Boyd, Robert C.

    1995-01-01

    Progress is reported in developing the Sample Acquisition and Instrument Deployment (SAID) system, a robotic system for deploying science instruments and acquiring samples for analysis. The system is a conventional four-degree-of-freedom manipulator 2 meters in length. A baseline design has been achieved through analysis and trade studies. The design considers environmental operating conditions on the surface of Mars, as well as volume constraints on proposed Mars landers. Control issues have also been studied, and simulations of joint and tip movements have been performed. The systems have been fabricated and tested in environmental chambers, as well as in soil tests and robotic control tests.

  5. Psychometric assessment of the processes of change scale for sun protection.

    PubMed

    Sillice, Marie A; Babbin, Steven F; Redding, Colleen A; Rossi, Joseph S; Paiva, Andrea L; Velicer, Wayne F

    2018-01-01

    The fourteen-factor Processes of Change Scale for Sun Protection assesses behavioral and experiential strategies that underlie the process of sun protection acquisition and maintenance. Variations of this measure have been used effectively in several randomized sun protection trials, both for evaluation and as a basis for intervention. However, there are no published studies, to date, that evaluate the psychometric properties of the scale. The present study evaluated factorial invariance and scale reliability at baseline in a national sample (N = 1360) of adults involved in a Transtheoretical Model tailored intervention for exercise and sun protection. Invariance testing ranged from least to most restrictive: Configural Invariance (constrains only the factor structure and zero loadings); Pattern Identity Invariance (equal factor loadings across target groups); and Strong Factorial Invariance (equal factor loadings and measurement errors). Multi-sample structural equation modeling tested the invariance of the measurement model across seven subgroups: age, education, ethnicity, gender, race, skin tone, and Stage of Change for Sun Protection. Strong factorial invariance was found across all subgroups. Internal consistency coefficient alpha and factor rho reliability, respectively, were .83 and .80 for the behavioral processes, .91 and .89 for the experiential processes, and .93 and .91 for the global scale. These results provide strong empirical evidence that the scale is consistent, has internal validity and can be used in research interventions with population-based adult samples.

  6. AgRISTARS: Foreign commodity production forecasting. Minutes of the annual formal project manager's review, including preliminary technical review reports of FY80 experiments. [wheat/barley and corn/soybean experiments

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The U.S./Canada wheat/barley exploratory experiment is discussed with emphasis on labeling, machine processing using P1A, and the crop calendar. Classification and the simulated aggregation test used in the U.S. corn/soybean exploratory experiment are also considered. Topics covered regarding the foreign commodity production forecasting project include the acquisition, handling, and processing of both U.S. and foreign agricultural data, as well as meteorological data. The accuracy assessment methodology, multicrop sampling and aggregation technology development, frame development, the yield project interface, and classification for area estimation are also examined.

  7. Heuristic-based information acquisition and decision making among pilots.

    PubMed

    Wiggins, Mark W; Bollwerk, Sandra

    2006-01-01

    This research was designed to examine the impact of heuristic-based approaches to the acquisition of task-related information on the selection of an optimal alternative during simulated in-flight decision making. The work integrated features of naturalistic and normative decision making and strategies of information acquisition within a computer-based, decision support framework. The study comprised two phases, the first of which involved familiarizing pilots with three different heuristic-based strategies of information acquisition: frequency, elimination by aspects, and majority of confirming decisions. The second stage enabled participants to choose one of the three strategies of information acquisition to resolve a fourth (choice) scenario. The results indicated that task-oriented experience, rather than the information acquisition strategies, predicted the selection of the optimal alternative. It was also evident that of the three strategies available, the elimination by aspects information acquisition strategy was preferred by most participants. It was concluded that task-oriented experience, rather than the process of information acquisition, predicted task accuracy during the decision-making task. It was also concluded that pilots have a preference for one particular approach to information acquisition. Applications of outcomes of this research include the development of decision support systems that adapt to the information-processing capabilities and preferences of users.
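
    Of the three strategies, elimination by aspects is the most mechanical to express: attributes are considered in order of importance and alternatives that fail a cutoff are dropped. The sketch below is an illustrative toy with invented options, attribute order, and cutoffs; it is not drawn from the study's decision support framework.

      # Toy elimination-by-aspects choice over in-flight alternatives
      # (attribute order, cutoffs, and option values are illustrative assumptions).
      def eliminate_by_aspects(options, aspects):
          """options: {name: {attribute: value}}; aspects: ordered (attribute, cutoff) pairs."""
          remaining = dict(options)
          for attribute, cutoff in aspects:
              remaining = {name: attrs for name, attrs in remaining.items()
                           if attrs.get(attribute, 0) >= cutoff}
              if len(remaining) <= 1:
                  break
          return list(remaining)

      options = {
          "divert_A": {"weather": 0.9, "runway_length": 0.6, "fuel_margin": 0.7},
          "divert_B": {"weather": 0.5, "runway_length": 0.9, "fuel_margin": 0.9},
          "continue": {"weather": 0.8, "runway_length": 0.8, "fuel_margin": 0.4},
      }
      print(eliminate_by_aspects(options, [("weather", 0.7), ("fuel_margin", 0.6)]))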

  8. Acquisition of Derivational Lexical Rules: A Case Study of the Acquisition of French Agent Noun Forms by L2 Learners

    ERIC Educational Resources Information Center

    Redouane, Rabia

    2007-01-01

    This study investigates L2 learners' use of French derivational processes and their strategies as they form agent nouns. It also attempts to find out which of the acquisitional principles (conventionality, semantic transparency, formal simplicity, and productivity) advanced by Clark (1993, 2003) for various L1s acquisition of word formation…

  9. Stress Modulates Instrumental Learning Performances in Horses (Equus caballus) in Interaction with Temperament

    PubMed Central

    Valenchon, Mathilde; Lévy, Frédéric; Prunier, Armelle; Moussu, Chantal; Calandreau, Ludovic; Lansade, Léa

    2013-01-01

    The present study investigates how the temperament of the animal affects the influence of acute stress on the acquisition and reacquisition processes of a learning task. After temperament was assessed, horses were subjected to a stressor before or after the acquisition session of an instrumental task. Eight days later, horses were subjected to a reacquisition session without any stressor. Stress before acquisition tended to enhance the number of successes at the beginning of the acquisition session. Eight days later, during the reacquisition session, contrary to non-stressed animals, horses stressed after acquisition, and, to a lesser extent, horses stressed before acquisition, did not improve their performance between acquisition and reacquisition sessions. Temperament influenced learning performances in stressed horses only. Particularly, locomotor activity improved performances whereas fearfulness impaired them under stressful conditions. Results suggest that direct exposure to a stressor tended to increase acquisition performances, whereas a state of stress induced by the memory of a stressor, because it has been previously associated with the learning context, impaired reacquisition performances. The negative effect of a state of stress on reacquisition performances appeared to be stronger when exposure to the stressor occurred after rather than before the acquisition session. Temperament had an impact on both acquisition and reacquisition processes, but under stressful conditions only. These results suggest that stress is necessary to reveal the influence of temperament on cognitive performances. PMID:23626801

  10. Data acquisition channel apparatus

    NASA Astrophysics Data System (ADS)

    Higgins, C. H.; Skipper, J. D.

    1985-10-01

    Discussed is a hybrid integrated circuit data acquisition channel apparatus employing an operational amplifier fed by a low-current differential bipolar transistor preamplifier having separate feedback-gain and signal-gain determining elements and providing an amplified signal output to sample-and-hold and analog-to-digital converter circuits. The disclosed apparatus operates with low energy and small space requirements and is capable of operation without the sample-and-hold circuit where the nature of the applied input signal permits.

  11. Microcomputer data acquisition and control.

    PubMed

    East, T D

    1986-01-01

    In medicine and biology there are many tasks that involve routine, well-defined procedures. These tasks are ideal candidates for computerized data acquisition and control. As the performance of microcomputers rapidly increases and cost continues to go down, the temptation to automate the laboratory becomes great. To the novice computer user the choices of hardware and software are overwhelming, and sadly most computer salespersons are not at all familiar with real-time applications. If you want to bill your patients you have hundreds of packaged systems to choose from; however, if you want to do real-time data acquisition the choices are very limited and confusing. The purpose of this chapter is to provide the novice computer user with the basics needed to set up a real-time data acquisition system with common microcomputers. This chapter covers the following issues necessary to establish a real-time data acquisition and control system: Analysis of the research problem: definition of the problem; description of data and sampling requirements; cost/benefit analysis. Choice of microcomputer hardware and software: choice of microprocessor and bus structure; choice of operating system; choice of layered software. Digital data acquisition: parallel data transmission; serial data transmission; hardware and software available. Analog data acquisition: description of amplitude and frequency characteristics of the input signals; sampling theorem; specification of the analog-to-digital converter; hardware and software available; interface to the microcomputer. Microcomputer control: analog output; digital output; closed-loop control. Microcomputer data acquisition and control in the 21st century: what is in the future? High-speed digital medical equipment networks; medical decision making and artificial intelligence.
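
    For the analog data acquisition issues listed above, the back-of-the-envelope sizing can be sketched as follows; the example bandwidth, full scale, and resolution are assumed values, not recommendations from the chapter.

      # Back-of-the-envelope analog acquisition sizing (example numbers are assumptions).
      import math

      def sampling_rate(bandwidth_hz, margin=5):
          """Nyquist requires > 2x bandwidth; a practical margin eases anti-alias filtering."""
          return margin * 2 * bandwidth_hz

      def adc_bits(full_scale, resolution):
          """Bits needed so one LSB is no larger than the required resolution."""
          return math.ceil(math.log2(full_scale / resolution))

      # e.g. an ECG front end: ~150 Hz bandwidth, 10 mV full scale, 5 uV resolution
      print(sampling_rate(150))          # -> 1500 samples/s
      print(adc_bits(10e-3, 5e-6))       # -> 11 bits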

  12. 48 CFR 15.100 - Scope of subpart.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Scope of subpart. 15.100 Section 15.100 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION CONTRACTING METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION Source Selection Processes and Techniques 15.100 Scope...

  13. 48 CFR 1034.004 - Acquisition strategy.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Order, Task Order, or Interagency Agreement) to the overall investment requirements and management... investment; (3) A description of the effort, by acquisition, and the plans to include required clauses in the... requirements to manage the acquisition processes through the investment lifecycle; (7) Consideration of optimal...

  14. Federal Library Programs for Acquisition of Foreign Materials.

    ERIC Educational Resources Information Center

    Cylke, Frank Kurt

    Sixteen libraries representing those agencies holding membership on the Federal Library Committee were surveyed to determine library foreign language or imprint holdings, acquisitions techniques, procedures and/or problems. Specific questions, relating to holdings, staff, budget and the acquisition, processing, reference and translation services…

  15. 48 CFR 15.101 - Best value continuum.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ....101 Section 15.101 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION CONTRACTING METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION Source Selection Processes and Techniques 15.101... cost or price may vary. For example, in acquisitions where the requirement is clearly definable and the...

  16. Data acquisition and processing system for the HT-6M tokamak fusion experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shu, Y.T.; Liu, G.C.; Pang, J.Q.

    1987-08-01

    This paper describes a high-speed data acquisition and processing system which has been successfully operated on the HT-6M tokamak fusion experimental device. The system collects, archives and analyzes up to 512 kilobytes of data from each shot of the experiment. A shot lasts 50-150 milliseconds and occurs every 5-10 minutes. The system consists of two PDP-11/24 computer systems. One PDP-11/24 is used for real-time data taking and on-line data analysis. It is based upon five CAMAC crates organized into a parallel branch. Another PDP-11/24 is used for off-line data processing. Both data acquisition software RSX-DAS and data processing software RSX-DAP have modular, multi-tasking and concurrent processing features.

  17. A thermal emission spectral library of rock-forming minerals

    NASA Astrophysics Data System (ADS)

    Christensen, Philip R.; Bandfield, Joshua L.; Hamilton, Victoria E.; Howard, Douglas A.; Lane, Melissa D.; Piatek, Jennifer L.; Ruff, Steven W.; Stefanov, William L.

    2000-04-01

    A library of thermal infrared spectra of silicate, carbonate, sulfate, phosphate, halide, and oxide minerals has been prepared for comparison to spectra obtained from planetary and Earth-orbiting spacecraft, airborne instruments, and laboratory measurements. The emphasis in developing this library has been to obtain pure samples of specific minerals. All samples were hand processed and analyzed for composition and purity. The majority are 710-1000 μm particle size fractions, chosen to minimize particle size effects. Spectral acquisition follows a method described previously, and emissivity is determined to within 2% in most cases. Each mineral spectrum is accompanied by descriptive information in database form including compositional information, sample quality, and a comments field to describe special circumstances and unique conditions. More than 150 samples were selected to include the common rock-forming minerals with an emphasis on igneous and sedimentary minerals. This library is available in digital form and will be expanded as new, well-characterized samples are acquired.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anheier, Norman C.; Cannon, Bret D.; Martinez, Alonzo

    The International Atomic Energy Agency’s (IAEA’s) long-term research and development plan calls for more cost-effective and efficient safeguard methods to detect and deter misuse of gaseous centrifuge enrichment plants (GCEPs). The IAEA’s current safeguards approaches at GCEPs are based on a combination of routine and random inspections that include environmental sampling and destructive assay (DA) sample collection from UF6 in-process material and selected cylinders. Samples are then shipped offsite for subsequent laboratory analysis. In this paper, a new DA sample collection and onsite analysis approach that could help to meet challenges in transportation and chain of custody for UF6 DA samples is introduced. This approach uses a handheld sampler concept and a Laser Ablation, Laser Absorbance Spectrometry (LAARS) analysis instrument, both currently under development at the Pacific Northwest National Laboratory. A LAARS analysis instrument could be temporarily or permanently deployed in the IAEA control room of the facility, in the IAEA data acquisition cabinet, for example. The handheld PNNL DA sampler design collects and stabilizes a much smaller DA sample mass compared to current sampling methods. The significantly lower uranium mass reduces the sample radioactivity and the stabilization approach diminishes the risk of uranium and hydrogen fluoride release. These attributes enable safe sample handling needed during onsite LAARS assay and may help ease shipping challenges for samples to be processed at the IAEA’s offsite laboratory. The LAARS and DA sampler implementation concepts will be described and preliminary technical viability results presented.

  19. Report: CSB Needs to Improve Its Acquisition Approvals and Other Processes to Ensure Best Value for Taxpayers

    EPA Pesticide Factsheets

    Report #15-P-0245, July 31, 2015. CSB's acquisition process is at risk and may have ineffective operations without a strategy to implement controls. Further, CSB has limited evidence it contracted at the best value.

  20. Geo-referenced digital data acquisition and processing system using LiDAR technology : executive summary report.

    DOT National Transportation Integrated Search

    2006-02-01

    Problem: State-of-the-art airborne mapping is in major transition, which affects both the data acquisition and data processing technologies. The IT age has brought powerful sensors and revolutionary new techniques to acquire spatial data in l...

  1. Mars Science Laboratory CHIMRA/IC/DRT Flight Software for Sample Acquisition and Processing

    NASA Technical Reports Server (NTRS)

    Kim, Won S.; Leger, Chris; Carsten, Joseph; Helmick, Daniel; Kuhn, Stephen; Redick, Richard; Trujillo, Diana

    2013-01-01

    The design methodologies of using sequence diagrams, multi-process functional flow diagrams, and hierarchical state machines were successfully applied in designing three MSL (Mars Science Laboratory) flight software modules responsible for handling actuator motions of the CHIMRA (Collection and Handling for In Situ Martian Rock Analysis), IC (Inlet Covers), and DRT (Dust Removal Tool) mechanisms. The methodologies were essential to specify complex interactions with other modules, support concurrent foreground and background motions, and handle various fault protections. Studying task scenarios with multi-process functional flow diagrams yielded great insight into overall design perspectives. Since the three modules require three different levels of background motion support, the methodologies presented in this paper provide an excellent comparison. All three modules are fully operational in flight.
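
    A toy hierarchical state machine with a fault-protection superstate is sketched below to illustrate the pattern; the states and events are invented for illustration and are not the MSL flight software design.

      # Toy hierarchical state machine for an actuator-motion module (states and
      # events are illustrative; this is not the MSL flight code).
      class MotionModule:
          def __init__(self):
              self.superstate = "OPERATIONAL"     # OPERATIONAL or FAULT
              self.state = "IDLE"                 # substate of OPERATIONAL

          def dispatch(self, event):
              if event == "fault_detected":       # handled at the superstate level
                  self.superstate, self.state = "FAULT", "SAFING"
              elif self.superstate == "OPERATIONAL":
                  transitions = {("IDLE", "start_motion"): "MOVING",
                                 ("MOVING", "motion_done"): "IDLE"}
                  self.state = transitions.get((self.state, event), self.state)
              elif self.superstate == "FAULT" and event == "fault_cleared":
                  self.superstate, self.state = "OPERATIONAL", "IDLE"
              return self.superstate, self.state

      m = MotionModule()
      print(m.dispatch("start_motion"))    # ('OPERATIONAL', 'MOVING')
      print(m.dispatch("fault_detected"))  # ('FAULT', 'SAFING')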

  2. Data acquisition for a real time fault monitoring and diagnosis knowledge-based system for space power system

    NASA Technical Reports Server (NTRS)

    Wilhite, Larry D.; Lee, S. C.; Lollar, Louis F.

    1989-01-01

    The design and implementation of the real-time data acquisition and processing system employed in the AMPERES project is described, including effective data structures for efficient storage and flexible manipulation of the data by the knowledge-based system (KBS), the interprocess communication mechanism required between the data acquisition system and the KBS, and the appropriate data acquisition protocols for collecting data from the sensors. Sensor data are categorized as critical or noncritical data on the basis of the inherent frequencies of the signals and the diagnostic requirements reflected in their values. The critical data set contains 30 analog values and 42 digital values and is collected every 10 ms. The noncritical data set contains 240 analog values and is collected every second. The collected critical and noncritical data are stored in separate circular buffers. Buffers are created in shared memory to enable other processes, i.e., the fault monitoring and diagnosis process and the user interface process, to freely access the data sets.
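
    As a rough illustration of the dual circular-buffer scheme described above (a fast buffer for critical samples, a slower one for noncritical samples, both readable by other processes), the hedged Python sketch below implements a minimal ring buffer. The slot counts and value counts follow the abstract, but the class itself is hypothetical; in the real system the buffers live in shared memory so the fault monitoring/diagnosis and user interface processes can read them.

```python
# Hypothetical sketch of the dual circular-buffer idea: one buffer filled
# every 10 ms with critical samples, one filled every second with
# noncritical samples. Slots not yet written simply hold zeros.
import numpy as np

class RingBuffer:
    def __init__(self, slots, values_per_slot):
        self.data = np.zeros((slots, values_per_slot))
        self.slots = slots
        self.head = 0                    # index of the next slot to overwrite

    def push(self, sample):
        self.data[self.head % self.slots] = sample
        self.head += 1

    def latest(self, n):
        """Return the n most recently written slots, oldest first."""
        idx = [(self.head - n + i) % self.slots for i in range(n)]
        return self.data[idx]

# Critical set: 30 analog + 42 digital values every 10 ms (per the abstract).
critical = RingBuffer(slots=1000, values_per_slot=72)
# Noncritical set: 240 analog values every second.
noncritical = RingBuffer(slots=60, values_per_slot=240)

critical.push(np.random.rand(72))
noncritical.push(np.random.rand(240))
```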

  3. Biomedical signal acquisition, processing and transmission using smartphone

    NASA Astrophysics Data System (ADS)

    Roncagliolo, Pablo; Arredondo, Luis; González, Agustín

    2007-11-01

    This article describes the technical aspects involved in programming a system for the acquisition, processing, and transmission of biomedical signals using mobile devices. The work is aligned with the ongoing development of new technologies for diagnosis and treatment, based on the feasibility of continuously measuring variables such as electrocardiographic signals, blood pressure, oxygen concentration, pulse, or simply temperature. The contribution of this technology lies in its portability and low cost, which allow its widespread use. Specifically, this work analyzes the feasibility of acquiring and processing signals on a standard smartphone. The results show that these devices now have enough processing capacity to run signal acquisition systems. Such systems, together with external servers, make it possible to envision a near future in which continuous measurement of biomedical variables is no longer restricted to hospitals but becomes increasingly common in daily life and at home.

  4. Coast Guard Acquisitions: Enhanced Oversight of Testing Could Benefit National Security Cutter Program and Future DHS Acquisitions

    DTIC Science & Technology

    2016-02-03

    Response Cutter program and plans for its upcoming Offshore Patrol Cutter program, that the Coast Guard has matured its acquisition process. The process to...Cutter and Offshore Patrol Cutter programs. GAO reviewed these two programs in June 2014 (GAO-14-450) and April 2015 (GAO-15-171SP) and also...Government Accountability Office areas of competition and the schedule for initial testing. Furthermore, as the $12 billion Offshore Patrol Cutter program

  5. Antarctic Meteorite Classification and Petrographic Database Enhancements

    NASA Technical Reports Server (NTRS)

    Todd, N. S.; Satterwhite, C. E.; Righter, K.

    2012-01-01

    The Antarctic Meteorite collection, which comprises over 18,700 meteorites, is one of the largest collections of meteorites in the world. These meteorites have been collected since the late 1970s as part of a three-agency agreement between NASA, the National Science Foundation, and the Smithsonian Institution [1]. Samples collected each season are analyzed at NASA's Meteorite Lab and the Smithsonian Institution, and results are published twice a year in the Antarctic Meteorite Newsletter, which has been in publication since 1978. Each newsletter lists the samples collected and processed and provides more in-depth details on selected samples of importance to the scientific community. Data about these meteorites are also published on the NASA Curation website [2] and made available through the Meteorite Classification Database, allowing scientists to search by a variety of parameters. This paper describes enhancements that have been made to the database and to the data and photo acquisition process to provide the meteorite community with faster access to meteorite data concurrent with the publication of the Antarctic Meteorite Newsletter twice a year.

  6. Phase retrieval and 3D imaging in gold nanoparticles based fluorescence microscopy (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Ilovitsh, Tali; Ilovitsh, Asaf; Weiss, Aryeh M.; Meir, Rinat; Zalevsky, Zeev

    2017-02-01

    Optical sectioning microscopy can provide highly detailed three-dimensional (3D) images of biological samples. However, it requires the acquisition of many images per volume and is therefore time consuming, and it may not be suitable for live-cell 3D imaging. We propose the use of a modified Gerchberg-Saxton phase retrieval algorithm to enable full 3D imaging of a gold-nanoparticle-tagged sample using only two images. The reconstructed field is free-space propagated to all other focal planes in post-processing, and the 2D z-stack is merged to create a 3D image of the sample with high fidelity. Because the phase retrieval is applied to nanoparticles, the ambiguities typical of the Gerchberg-Saxton algorithm are eliminated. The proposed concept is further extended to the tracking of single fluorescent particles within a 3D cellular environment, based on image processing algorithms that significantly increase the localization accuracy of the 3D point spread function with respect to regular Gaussian fitting. All proposed concepts are validated both on simulated data and experimentally.
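
    For orientation, the sketch below shows the classical (unmodified) Gerchberg-Saxton iteration that the abstract builds on: alternately enforcing measured amplitudes in two planes related by a Fourier transform. The paper's modified variant and its nanoparticle-specific constraints are not reproduced here; this is a generic, hedged illustration in Python.

```python
# Classical Gerchberg-Saxton phase retrieval between an object plane and its
# Fourier plane, given the measured amplitudes (square roots of intensities).
import numpy as np

def gerchberg_saxton(amp_object, amp_fourier, iterations=200):
    """amp_object, amp_fourier: 2D arrays of measured amplitudes in the
    object plane and its Fourier plane. Returns the recovered object phase."""
    phase = np.zeros_like(amp_object)                 # initial phase guess
    field = amp_object * np.exp(1j * phase)
    for _ in range(iterations):
        F = np.fft.fft2(field)
        F = amp_fourier * np.exp(1j * np.angle(F))    # enforce Fourier amplitude
        field = np.fft.ifft2(F)
        field = amp_object * np.exp(1j * np.angle(field))  # enforce object amplitude
    return np.angle(field)
```

    Once the complex field is recovered, it can be numerically propagated to other focal planes (the free-space propagation step mentioned in the abstract), for example with an angular-spectrum propagator.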

  7. Combined optimization of image-gathering and image-processing systems for scene feature detection

    NASA Technical Reports Server (NTRS)

    Halyo, Nesim; Arduini, Robert F.; Samms, Richard W.

    1987-01-01

    The relationship between the image gathering and image processing systems for minimum mean squared error estimation of scene characteristics is investigated. A stochastic optimization problem is formulated where the objective is to determine a spatial characteristic of the scene rather than a feature of the already blurred, sampled, and noisy image data. An analytical solution for the optimal characteristic image processor is developed. The Wiener filter for the sampled image case is obtained as a special case, where the desired characteristic is scene restoration. Optimal edge detection is investigated using the Laplacian-of-Gaussian operator ∇²G as the desired characteristic, where G is a two-dimensional Gaussian distribution function. It is shown that the optimal edge detector compensates for the blurring introduced by the image gathering optics, and notably, that it is not circularly symmetric. The lack of circular symmetry is largely due to the geometric effects of the sampling lattice used in image acquisition. The optimal image gathering optical transfer function is also investigated and the results of a sensitivity analysis are shown.
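
    For reference, the Laplacian-of-Gaussian characteristic mentioned above has the standard closed form below (standard normalization assumed; the paper's exact scaling may differ):

```latex
% Laplacian-of-Gaussian (LoG) characteristic, standard form
G(x,y) = \frac{1}{2\pi\sigma^2}\, e^{-\frac{x^2+y^2}{2\sigma^2}},
\qquad
\nabla^2 G(x,y)
= \frac{\partial^2 G}{\partial x^2} + \frac{\partial^2 G}{\partial y^2}
= \frac{1}{\pi\sigma^4}\left(\frac{x^2+y^2}{2\sigma^2} - 1\right)
  e^{-\frac{x^2+y^2}{2\sigma^2}}.
```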

  8. Quasi-Uniform High Speed Foam Crush Testing Using a Guided Drop Mass Impact

    NASA Technical Reports Server (NTRS)

    Jones, Lisa E. (Technical Monitor); Kellas, Sotiris

    2004-01-01

    A relatively simple method for measuring the dynamic crush response of foam materials at various loading rates is described. The method utilizes a drop mass impact configuration with mass and impact velocity selected such that the crush speed remains approximately uniform during the entire sample crushing event. Instrumentation, data acquisition, and data processing techniques are presented, and limitations of the test method are discussed. The objective of the test method is to produce input data for dynamic finite element modeling involving crash and energy absorption characteristics of foam materials.
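
    As a hedged back-of-the-envelope illustration of the quasi-uniform crush-speed condition (the impact kinetic energy must be much larger than the energy absorbed by the sample), the Python sketch below estimates the fractional speed loss during crushing. The numbers are illustrative only and are not taken from the report.

```python
# Back-of-the-envelope check: with a heavy drop mass and modest crush energy,
# the crush speed stays nearly constant over the sample stroke.
import math

def velocity_after_crush(mass_kg, v_impact, crush_energy_J, stroke_m, g=9.81):
    """Velocity remaining after the foam absorbs crush_energy_J, crediting
    the work done by gravity over the crush stroke."""
    ke = 0.5 * mass_kg * v_impact**2 + mass_kg * g * stroke_m - crush_energy_J
    return math.sqrt(max(ke, 0.0) * 2.0 / mass_kg)

m, v0 = 25.0, 6.0      # drop mass (kg) and impact velocity (m/s), illustrative
E_c, s = 40.0, 0.03    # energy absorbed by the sample (J) over a 30 mm stroke
v1 = velocity_after_crush(m, v0, E_c, s)
print(f"speed drop during crush: {100 * (1 - v1 / v0):.1f} %")  # ~4 %
```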

  9. An educational video game for nutrition of young people: Theory and design

    PubMed Central

    Ledoux, Tracey; Griffith, Melissa; Thompson, Debbe; Nguyen, Nga; Watson, Kathy; Baranowski, Janice; Buday, Richard; Abdelsamad, Dina; Baranowski, Tom

    2016-01-01

    Background Playing Escape from DIAB (DIAB) and Nanoswarm (NANO), epic video game adventures, increased fruit and vegetable consumption among a multi-ethnic sample of 10–12 year old children during pilot testing. Key elements of both games were educational mini-games embedded in the overall game that promoted knowledge acquisition regarding diet, physical activity and energy balance. 95–100% of participants demonstrated mastery of these mini-games suggesting knowledge acquisition. Aim This article describes the process of designing and developing the educational mini-games. A secondary purpose was to explore the experience of children while playing the games. Method The educational games were based on Social Cognitive and Mastery Learning Theories. A multidisciplinary team of behavioral nutrition, PA, and video game experts designed, developed, and tested the mini-games. Results Alpha testing revealed children generally liked the mini-games and found them to be reasonably challenging. Process evaluation data from pilot testing revealed almost all participants completed nearly all educational mini-games in a reasonable amount of time suggesting feasibility of this approach. Conclusions Future research should continue to explore the use of video games in educating children to achieve healthy behavior changes. PMID:27547019

  10. Instrumentation for optical ocean remote sensing

    NASA Technical Reports Server (NTRS)

    Esaias, W. E.

    1991-01-01

    Instruments used in ocean color remote sensing algorithm development, validation, and data acquisition which have the potential for further commercial development and marketing are discussed. The Ocean Data Acquisition System (ODAS) is an aircraft-borne radiometer system suitable for light aircraft, which has applications for rapid measurement of chlorophyll pigment concentrations along the flight line. The instrument package includes a three-channel radiometer system for upwelling radiance, an infrared temperature sensor, a three-channel downwelling irradiance sensor, and Loran-C navigation. Data are stored on a PC and processed to transects or interpolated 'images' on the ground. The instrument has been in operational use for two and one half years. The accuracy of pigment concentrations from the instrument is quite good, even in complex Chesapeake Bay waters. To help meet the requirement for validation of future satellite missions, a prototype air-deployable drifting buoy for measurement of near-surface upwelled radiance in multiple channels is undergoing test deployment. The optical drifter burst-samples radiance, stores and processes the data, and uses the Argos system as a data link. Studies are underway to explore the limits to useful lifetime with respect to power and fouling.

  11. An educational video game for nutrition of young people: Theory and design.

    PubMed

    Ledoux, Tracey; Griffith, Melissa; Thompson, Debbe; Nguyen, Nga; Watson, Kathy; Baranowski, Janice; Buday, Richard; Abdelsamad, Dina; Baranowski, Tom

    2016-08-01

    Playing Escape from DIAB (DIAB) and Nanoswarm (NANO), epic video game adventures, increased fruit and vegetable consumption among a multi-ethnic sample of 10-12 year old children during pilot testing. Key elements of both games were educational mini-games embedded in the overall game that promoted knowledge acquisition regarding diet, physical activity and energy balance. 95-100% of participants demonstrated mastery of these mini-games, suggesting knowledge acquisition. This article describes the process of designing and developing the educational mini-games. A secondary purpose was to explore the experience of children while playing the games. The educational games were based on Social Cognitive and Mastery Learning Theories. A multidisciplinary team of behavioral nutrition, PA, and video game experts designed, developed, and tested the mini-games. Alpha testing revealed children generally liked the mini-games and found them to be reasonably challenging. Process evaluation data from pilot testing revealed almost all participants completed nearly all educational mini-games in a reasonable amount of time, suggesting feasibility of this approach. Future research should continue to explore the use of video games in educating children to achieve healthy behavior changes.

  12. Identification and restoration in 3D fluorescence microscopy

    NASA Astrophysics Data System (ADS)

    Dieterlen, Alain; Xu, Chengqi; Haeberle, Olivier; Hueber, Nicolas; Malfara, R.; Colicchio, B.; Jacquey, Serge

    2004-06-01

    3-D optical fluorescence microscopy has become an efficient tool for volumetric investigation of living biological samples. The 3-D data can be acquired by optical sectioning microscopy, which is performed by axially stepping the object relative to the objective. For any instrument, each recorded image can be described by a convolution equation between the original object and the point spread function (PSF) of the acquisition system. To assess performance and ensure data reproducibility, as for any 3-D quantitative analysis, system identification is mandatory. The PSF describes the properties of the image acquisition system; it can be computed or acquired experimentally. Statistical tools and Zernike moments are shown to be appropriate and complementary for describing a 3-D system PSF and for quantifying the variation of the PSF as a function of the optical parameters. Some critical experimental parameters can be identified with these tools, which helps biologists define an acquisition protocol that optimizes the use of the system. Reducing out-of-focus light is the task of 3-D microscopy; it is carried out computationally by a deconvolution process. Pre-filtering the images improves the stability of the deconvolution results, making them less dependent on the regularization parameter; this helps biologists use the restoration process.
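
    The convolution relation referred to above is commonly written as follows (standard incoherent 3-D imaging model; the noise term and notation are added here for completeness and are not taken from the paper):

```latex
% Recorded 3-D image stack i as a convolution of the object o with the PSF h
i(x,y,z) = (o * h)(x,y,z) + n(x,y,z)
         = \iiint o(x',y',z')\, h(x-x',\,y-y',\,z-z')\,
           \mathrm{d}x'\,\mathrm{d}y'\,\mathrm{d}z' \;+\; n(x,y,z),
```

    where i is the recorded image stack, o the original object, h the system PSF, and n the acquisition noise; deconvolution estimates o given i and h.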

  13. ACQUISITION OF REPRESENTATIVE GROUND WATER QUALITY SAMPLES FOR METALS

    EPA Science Inventory

    R.S. Kerr Environmental Research Laboratory (RSKERL) personnel have evaluated sampling procedures for the collection of representative, accurate, and reproducible ground water quality samples for metals for the past four years. Intensive sampling research at three different field...

  14. The influence of secondary conditions on job acquisition and retention in adults with spinal cord injury.

    PubMed

    Meade, Michelle A; Forchheimer, Martin B; Krause, James S; Charlifue, Susan

    2011-03-01

    To examine the associations of job acquisition and job retention with secondary conditions, hospitalizations, and nursing home stays for adults with spinal cord injury (SCI). Retrospective analysis of longitudinal data from a multicenter study. Community setting. Two samples of adults participating in the SCI Model Systems; the first sample consisted of persons who reported being unemployed at follow-up (n=9,501); the second sample consisted of those who reported working at follow-up (n=5,150). Not applicable. Job acquisition (change from not working at one anniversary of injury to working at the following data collection) and job retention (maintenance of work between two assessment periods). Discrete-time hazard modeling was used to assess how secondary conditions affect job acquisition. After controlling for the effects of demographic and injury characteristics, hospitalizations within the last 12 months were associated with a decreased chance of having obtained employment. Hierarchical logistic regression analyses were used to examine job retention. Hospitalizations and the presence of pressure ulcers (PUs) were associated with lower odds of job retention once demographic and injury characteristics were controlled. Secondary conditions from the previous assessment period were not significantly related to either job acquisition or job retention after the variance from demographic and injury characteristics and current secondary conditions was controlled. Hospitalization, as well as a limited number of secondary conditions, was associated with reduced odds of both job acquisition and job retention among adults with SCI. Interventions that can prevent secondary conditions and reduce the need for hospitalizations may be beneficial in improving employment for this population. Copyright © 2011 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  15. EFM data mapped into 2D images of tip-sample contact potential difference and capacitance second derivative.

    PubMed

    Lilliu, S; Maragliano, C; Hampton, M; Elliott, M; Stefancich, M; Chiesa, M; Dahlem, M S; Macdonald, J E

    2013-11-27

    We report a simple technique for mapping Electrostatic Force Microscopy (EFM) bias sweep data into 2D images. The method allows simultaneous probing, in the same scanning area, of the contact potential difference and the second derivative of the capacitance between tip and sample, along with the height information. The only required equipment is a microscope with lift-mode EFM capable of phase shift detection. We designate this approach as Scanning Probe Potential Electrostatic Force Microscopy (SPP-EFM). An open-source MATLAB Graphical User Interface (GUI) for image acquisition, processing, and analysis has been developed. The technique is tested with Indium Tin Oxide (ITO) and with poly(3-hexylthiophene) (P3HT) nanowires for organic transistor applications.
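
    As a hedged illustration of the standard EFM relation underlying such maps, in lift mode the phase shift is approximately quadratic in tip bias, Δφ(V) ≈ -(Q/k)·(1/2)·C''(z)·(V - V_CPD)², so fitting a parabola to the phase-versus-bias sweep at each pixel yields the contact potential difference (vertex) and a quantity proportional to the capacitance second derivative (curvature). The Python sketch below applies this generic relation; it is not the authors' SPP-EFM MATLAB code, and all names are hypothetical.

```python
# Per-pixel parabola fit of EFM phase-vs-bias sweeps to extract maps of the
# contact potential difference (V_CPD) and a curvature proportional to d2C/dz2.
import numpy as np

def fit_sweep(bias, phase):
    """Fit phase = a*V^2 + b*V + c and return (V_CPD, curvature)."""
    a, b, c = np.polyfit(bias, phase, 2)
    v_cpd = -b / (2.0 * a)        # parabola vertex -> contact potential difference
    curvature = 2.0 * a           # proportional to d2C/dz2 (sign set by Q/k)
    return v_cpd, curvature

def map_sweep_stack(bias, phase_stack):
    """phase_stack: (n_bias, rows, cols) phase images, one per bias value.
    Returns 2D maps of V_CPD and curvature."""
    n, rows, cols = phase_stack.shape
    v_cpd = np.zeros((rows, cols))
    curv = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            v_cpd[r, c], curv[r, c] = fit_sweep(bias, phase_stack[:, r, c])
    return v_cpd, curv
```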

  16. Comparison of two data acquisition and processing systems of Moller polarimeter in Hall A of Jefferson Lab

    DOE PAGES

    Vereshchaka, Vadym V.; Glamazdin, Oleksandr V.; Pomatsalyuk, Roman I.

    2014-07-01

    Two data acquisition and processing systems are used simultaneously to measure electron beam polarization with the Moller polarimeter in Hall A of Jefferson Lab (Newport News, VA, USA). The old system (in use since 1997) is fully functional but is not repairable in case of malfunction, since its modules are no longer manufactured. The new system (in use since 2010), based on a flash ADC, is more accurate but currently requires more detailed adjustment and further improvement. Descriptions and specifications of the two data acquisition and processing systems are given. The results of polarization measurements during experiments conducted in Hall A from 2010 to 2012 are compared.

  17. Development of the Data Acquisition and Processing System for a Pulsed 2-Micron Coherent Doppler Lidar System

    NASA Technical Reports Server (NTRS)

    Beyon, Jeffrey Y.; Koch, Grady J.; Kavaya, Michael J.

    2010-01-01

    A general overview of the development of a data acquisition and processing system is presented for a pulsed, 2-micron coherent Doppler Lidar system located at NASA Langley Research Center in Hampton, Virginia, USA. It is a comprehensive system that performs high-speed data acquisition, analysis, and data display both in real time and offline. The first flight missions are scheduled for the summer of 2010 as part of the NASA Genesis and Rapid Intensification Processes (GRIP) campaign for the study of hurricanes. The system, as well as the control software, is reviewed, and its requirements and unique features are discussed.

  18. 48 CFR 312.101 - Policy.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... PLANNING ACQUISITION OF COMMERCIAL ITEMS Acquisition of Commercial Items-General 312.101 Policy. (a) It is.... Accordingly, HHS has implemented a Strategic Sourcing Program through which it awards BPAs or other contract vehicles to achieve savings for commercial items and services across HHS and make the acquisition process...

  19. 23 CFR 710.309 - Acquisition.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 23 Highways, FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION, RIGHT-OF-WAY AND ENVIRONMENT, RIGHT-OF-WAY AND REAL ESTATE, Project Development, § 710.309 Acquisition. The process of acquiring real property includes...

  20. 48 CFR 1407.102 - Policy.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... PLANNING ACQUISITION PLANNING Acquisition Plans 1407.102 Policy. DOI has implemented its acquisition planning system in 404 DM. This system meets the criteria prescribed in FAR Subpart 7.1, 375 DM, OCIO Program Management, and 376 DM, Automated Data Processing. Each of these addresses strategic planning for...
